To prepare for the laboratory of the future, we must ask how it will look decades from now. The Pistoia Alliance's Dr Becky Upton answers with her own forecast and pinpoints the key agents of change.
One way to begin to imagine the lab of 2050 – 28 years in the future – is by looking back on the life sciences landscape 28 years ago. In 1995, the internet opened up to mainstream consumer use. Around the same time, Dolly the Sheep became the first mammal to be successfully cloned, and the Human Genome Project was working towards the first sequence of a human genome. Three decades before these events, the idea of cloning a mammal, or of the internet coming to dominate our lives, would have seemed far-fetched to many. As we look ahead to the lab of 2050, here are four areas we expect to shape its development:
Natural language processing
Natural language processing (NLP) combines computational linguistics with statistical, machine-learning (ML) and deep-learning models to process and analyse natural language data. It is a driving force behind applied AI and ML, with the capability to contextualise data. In the future, NLP-driven AI and ML will augment human researchers in the lab, enabling them to uncover new relationships between data and to automate large analyses that accelerate research.
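As a concrete illustration of how NLP contextualises raw text, the minimal sketch below uses the open-source spaCy library to extract named entities from a sentence. The small general-purpose English model and the example sentence are placeholders chosen for this sketch; a real lab pipeline would more likely use a domain-specific biomedical model.

```python
import spacy

# Minimal NLP sketch: extract named entities from free text.
# Assumes the small English model has been installed with:
#   python -m spacy download en_core_web_sm
# A production lab pipeline would use a biomedical model instead.
nlp = spacy.load("en_core_web_sm")

text = ("In 1996, researchers at the Roslin Institute cloned "
        "Dolly the Sheep from an adult somatic cell.")
doc = nlp(text)

# Each entity carries a label (e.g. DATE, ORG) that adds context
# to the raw string, turning unstructured text into structured data.
for ent in doc.ents:
    print(ent.text, "->", ent.label_)
```

Entity extraction like this is one of the building blocks behind the larger relationship-mining and analysis-automation capabilities described above.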
As NLP-driven AI and ML develop, it's crucial we ensure they are equitable (e.g. able to recognise connections in data from diverse groups of patients) as well as ethical. Ultimately, NLP is only as powerful as the data it 'feeds' on; it must be underpinned by accurate data, with integrity and fairness assured. The Pistoia Alliance is working to 'future-proof' AI, ML and NLP by defining best practices in life sciences R&D through its AI/ML community and NLP Use Case Database project.
Data management and semantic enrichment
Today, labs are producing and gathering huge volumes of data. These datasets have the potential to give researchers much deeper insight into their research questions and to help them make new connections. But much of this data is currently siloed between individuals, laboratories and organisations, and is not stored according to best-practice FAIR principles (findable, accessible, interoperable and reusable).
These problems persist even within laboratories belonging to the same organisation. Data is often saved in different electronic lab notebooks tied to different vendors, or in different applications confined to a single researcher's desktop. To prevent these issues from holding back the lab of the future, we should strive to make data FAIR from birth. We can also make data more easily accessible through semantic enrichment, which involves adding a layer of machine-readable metadata, for example by linking keywords to standard ontology terms, as sketched below. FAIR, semantically enriched data will result in higher-quality experiments, reduced rework and better decision-making.
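To make this tangible, here is a minimal sketch of semantic enrichment: a raw experimental record whose free-text fields are annotated with persistent identifiers from public ontologies (UBERON for anatomy, OBI for assays, NCBITaxon for organisms). The record fields and field-to-term mapping are illustrative assumptions, not any specific lab's schema.

```python
# A raw, free-text record as it might be typed into a lab notebook.
record = {
    "sample": "liver biopsy",
    "assay": "RNA-seq",
    "organism": "human",
}

# Illustrative mapping of keywords to standard ontology terms; the IDs
# shown are real public identifiers (UBERON, OBI, NCBITaxon), but the
# mapping itself would normally come from a curated vocabulary service.
enrichment = {
    "sample": {"label": "liver", "term": "UBERON:0002107"},
    "assay": {"label": "RNA-seq assay", "term": "OBI:0001271"},
    "organism": {"label": "Homo sapiens", "term": "NCBITaxon:9606"},
}

# Attach the metadata layer to each field, keeping the raw value so the
# enriched record stays both human- and machine-readable.
enriched = {
    field: {"raw_value": value, **enrichment.get(field, {})}
    for field, value in record.items()
}
print(enriched)
```

Because every field now carries a globally unique term, records written by different researchers, in different notebooks, can be found and joined on the same identifiers: the "interoperable" and "reusable" halves of FAIR.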
Virtual reality and the meta-lab
Many labs have already begun to explore the use of automation, robotics, and virtual (VR) and augmented reality (AR). The Covid-19 pandemic encouraged labs to embrace VR/AR and remotely controlled robotics so that experiments could continue to be conducted from a distance. This year, we are likely to see the emergence of labs that are driven entirely by AI-powered robotics. Alongside these developments, the metaverse is becoming a huge topic across all sectors. We can't yet be sure how the metaverse will develop, and the technologies we'd need to construct a 'meta-lab' at scale haven't yet been perfected.
What we can predict more confidently is that even greater automation, along with technologies like virtual and augmented reality, will have a place in the lab of the future. To pave the way for these developments, the industry will need to agree on how such technologies should be regulated, and on how we would ensure patient centricity and safety in these new paradigms.
Diversity that drives better research
Outside of tangible developments in technology, we are going to see growing diversity in the lab environment as the life sciences sector addresses longstanding issues around equal opportunities.
This is critically important for innovation. We know that greater diversity in research teams correlates positively with citation counts, suggesting that diversity encourages better quality research.
Some advances have been made: the number of women working in the physical sciences has grown. But in the UK alone, Black and minority ethnic men are 28% less likely to work in STEM than White men, and 29% of LGBTQ+ people surveyed would not consider a career in STEM for fear of discrimination. With new approaches to education, career mentoring and networking (like the Pistoia Alliance's Diversity and Inclusion in STEM programme), we expect to see this shift accelerate in the coming decade.
Delivering the future requires collaboration
Technology has the potential to transform the laboratory for the better, but only if our industry is prepared to evolve alongside it. Other challenges must also be overcome on the journey to the lab of the future, from addressing the life sciences' poor record on sustainability and the green agenda to bridging the skills gap that is holding back transformation. Collaboration is vital.
Breakthroughs – whether cloning a mammal or creating the internet – happen at the boundaries between disciplines. As president of the Pistoia Alliance, I am keen to hear from organisations that want to collaborate to build these bridges (email projectinquiry@pistoiaalliance.org). By working together, we can deliver on the promise of the lab of the future.
Dr Becky Upton is President of the Pistoia Alliance