Nine in 10 life scientists surveyed don’t understand US or European AI rules
25 Feb 2024
Just 9% of life science professionals polled say they understand either EU or US laws on the use of artificial intelligence in their sector, according to a new survey.
The not-for-profit Pistoia Alliance, which works to lower barriers to R&D innovation, also found that more than a third (35%) of respondents admitted their knowledge was not merely deficient but non-existent.
Chief portfolio officer Dr Christian Baber said the finding concerned the alliance, given that its previous Lab of the Future Report had placed AI at the top of members’ agendas.
“Our new research highlights legislation is a major barrier to adopting AI successfully. We must bridge the gap between life sciences, technology companies, vendors, and legislators to harness AI in a secure and compliant way to accelerate vital health research,” warned Baber.
The 125 international respondents, surveyed during a webinar this month, gave three main reasons why adopting AI was challenging. Complexity and ambiguity of regulations topped the list for 37%, followed by varying regulations across regions (23%), and lack of collaboration between industry and regulatory bodies (20%).
Baber said the Pistoia Alliance was well placed to overcome the problem, thanks to mechanisms such as its Artificial Intelligence and Machine Learning community of experts, which could mediate between specialists in data science, pharma, regulatory bodies and government.
“We now encourage more technology, compliance, and pharma experts to come forward to join the community, and express interest in our new AI initiatives,” he added.
To date, more than 30 countries have passed AI legislation, with a further 13 debating the subject. The United States has announced new measures in an executive order, while the European Union’s AI Act will be one of the most stringent in the world.
The alliance says the latter will particularly impact pharma, owing to the EU’s large market size, its tradition of setting global precedents, and the fact that the new rules are graded by AI’s potential risk and level of impact on consumers. High-risk applications (including medical devices, drug manufacturing and diagnostic AI) will henceforth require conformity assessment, while applications such as chatbots must be clearly labelled as AI tools.
“AI is new territory for both legislators and pharma companies that we must navigate together. Our members have raised a number of concerns regarding emerging legislation, from ambiguities surrounding risk categories to challenges around data and AI governance, and the use of synthetic data to train future algorithms,” said Dr Vladimir Makarov, project manager of the Pistoia Alliance AI and ML community of experts.
“The Pistoia Alliance panel of experts will discuss how legal changes may affect research and allow pharma companies to get involved at the regulatory level. This research gives us a strong baseline to understand members’ current concerns, share back to the regulators, and shape our future discussions and projects.”
Pic: Wikimedia Creative Commons