Products based on artificial intelligence should be regulated like drugs, and those that are deemed unsafe should be removed from the market, according to a new report from the Academy of Medical Royal Colleges, commissioned by NHS Digital.
The report noted that AI-based products are likely to fall under the remit of the UK’s regulator, the Medicines and Healthcare products Regulatory Agency (MHRA). However, these devices also have implications for other organisations, such as the General Medical Council, which will need to give clear guidelines on appropriate use of AI.
Medical defence organisations may also need to prepare for the growing number of patients receiving AI-generated decisions and recommendations.
The report highlighted that the Care Quality Commission, the regulator for hospitals and clinics, will have to consider how AI systems affect the quality of care, with NHS Digital taking a role in risk management.
However, it was careful to emphasise that AI is unlikely to replace clinicians for the foreseeable future, and that doctors should instead be trained in data science as well as medicine.
The report made seven recommendations. The first of these was that politicians and policymakers should avoid assuming that AI will solve all the problems facing the UK health system.
In addition, it stated that patient safety must be regulated and that the doctors of the future will need to be well versed in using AI technology.
The report also suggested that data should be shared more widely among those who meet information governance standards, with the government deciding how widely it is shared.
It further called for joined-up regulation, proposing that providers of AI services be regulated like any other healthcare product.
“As with the pharmaceutical industry, licensing and post-market surveillance are critical and methods should be developed to remove unsafe systems,” the authors said.
Above all, AI should be used to reduce, rather than increase, health inequality, the report noted.