The past, present and future: 40 years of genomics
20 Oct 2011 by Evoluted New Media
One of the most remarkable advances of the last 40 years of science has been our insight into genomics. Since the 1970s and 1980s, when Frederick Sanger established the first techniques for genomic sequencing and mapping, the global genomics industry has developed considerably and rapidly. Here, scientists from LGC take us on a trip down memory lane, looking back at genomics over the last 40 years, and give us their predictions for the future.
From early SNP detection through to digital PCR
Carole Foy, Principal Scientist
In the late 1980s, I was trying to locate a gene variant causing a form of hereditary deafness. This was before the days of PCR, so our approach relied on Southern blot analyses. I recall the Heath-Robinson-esque mountain of paper towels, whilst the 32P-labelled probes sent the Geiger counters screaming! The results took days to develop and the pattern of bands was interpreted by eye. All this to look at one locus in 20 samples! Miraculously, after several years of effort, we still managed to map the gene to a particular part of chromosome 2.
In more recent years, digital PCR has really come into its own. Individual PCR assays can now be run in nanolitre volumes, with over 30,000 single-molecule amplifications performed on a single microfluidic chip. Counting the number of positive amplifications provides a highly sensitive and accurate method for detecting rare single nucleotide polymorphism (SNP) markers. At LGC we are now working to develop standards for such measurements to ensure confidence and comparability.
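The counting arithmetic behind digital PCR is simple Poisson statistics: because two or more molecules can land in the same partition, the raw positive count is corrected with λ = −ln(1 − p), where p is the fraction of positive partitions. The sketch below is a generic textbook illustration rather than an LGC method, and the partition volume is an assumed figure.

```python
import math

def digital_pcr_quant(n_positive, n_partitions, partition_volume_nl=0.85):
    """Estimate target copies on a digital PCR chip.

    Assumes molecules distribute randomly into partitions, so the
    number of copies per partition follows a Poisson distribution.
    The 0.85 nl partition volume is illustrative only.
    """
    p = n_positive / n_partitions          # fraction of positive partitions
    lam = -math.log(1.0 - p)               # mean copies per partition (Poisson)
    total_copies = lam * n_partitions      # estimated copies loaded on the chip
    copies_per_ul = lam / (partition_volume_nl * 1e-3)   # convert nl to ul
    return total_copies, copies_per_ul

# Example: 3,000 positive reactions out of 30,000 single-molecule wells
copies, conc = digital_pcr_quant(3_000, 30_000)
print(f"~{copies:.0f} copies on chip, ~{conc:.0f} copies/ul")
```

With 10% of partitions positive, the Poisson correction adds about 5% to the naive count, and the correction grows rapidly as the chip approaches saturation.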
From metabolic markers to stratified medicines
David McDowell, Genomics Account Manager
In the early 1990s, I recall Professor Robert Smith sharing his personal experience of the drug debrisoquine, describing how he had taken it to treat high blood pressure and had collapsed as a result. It turned out he had two non-functional variants of the P450 liver enzyme gene CYP2D6: essentially, Bob could not break down the drug, so his blood pressure continued to fall to potentially dangerous levels. Our understanding of how gene variants affect both the risk and the effectiveness of a number of drugs has progressed significantly since those times.
Since then, colleagues and I have been involved in developing personalised medicine assays using LGC’s HyBeacon genotyping technology. These are now being used in clinics to determine the utility of on-site genetic testing to inform the starting dose of anticoagulant drugs such as warfarin.
We are living in a new era of genomics research and understanding, in which matching medication and dose to a patient’s genome is increasingly possible at the point of care through stratified medicine.
From phenol/chloroform to automated magnetic bead DNA extraction
Dr Frank Schubert, Business Unit Manager Nucleic Acid Preparation
In the early days of nucleic acid extraction, when everything was done by hand, the environmental and safety implications of using harsh chemicals such as phenol/chloroform were not questioned; indeed, such methods were considered essential to prepare ‘gold standard’ DNA.
Over the past decade, the need for quick methods that allow multiple samples to be extracted in parallel has driven the introduction of automation in the lab. From a technical point of view, solid phase extraction, in which nucleic acids are adsorbed onto solid carrier materials, has proven to be the method of choice. Adsorbents with magnetic properties particularly facilitate automation, because magnetic fields can be used to manipulate the carriers. I have found it very interesting to develop such systems, including LGC Genomics’ beadex, particularly for challenging materials such as plant tissues.
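As a rough illustration of what such an automated system actually does, here is the generic bind-wash-elute sequence behind magnetic bead extraction; the step descriptions are textbook chemistry, not the beadex protocol.

```python
# Generic magnetic-bead nucleic acid extraction workflow (illustrative
# steps only; reagents and order follow the common bind-wash-elute
# pattern, not any specific commercial kit).
MAGNETIC_BEAD_PROTOCOL = [
    ("lyse",    "add lysis buffer and incubate to release nucleic acids"),
    ("bind",    "add magnetic beads; DNA adsorbs to the bead surface"),
    ("capture", "apply magnet; beads (with bound DNA) pellet on the tube wall"),
    ("wash",    "aspirate the supernatant and wash beads to remove contaminants"),
    ("elute",   "remove magnet and resuspend beads in elution buffer"),
    ("recover", "apply magnet again and pipette off the purified DNA"),
]

for step, action in MAGNETIC_BEAD_PROTOCOL:
    print(f"{step:>8}: {action}")
```

Every step reduces to dispensing liquid or switching a magnet, which is exactly why the chemistry maps so cleanly onto robotic liquid handlers.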
From validated analytical methods to international quality standards
Helen Parkes, Principal Consultant International Biomeasurement
I remember, back in the early 1980s, moving sample tubes from water bath, to ice, and back to water bath – the original version of PCR, before thermal cyclers were invented! In those days very few methods were validated and I was extremely grateful for a positive amplification. Sometimes it worked and sometimes it didn’t (depending on the phase of the moon!). Like most PhD students at the time, I’d eventually get the result I wanted and move on with my studies.
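For anyone who has only ever pressed ‘start’ on a thermal cycler, the schedule being automated looks something like the sketch below; the temperatures and times are generic three-step PCR values, not a protocol from this article.

```python
# A generic three-step PCR programme (illustrative temperatures and
# times, not a validated protocol): denature the DNA duplex, anneal
# the primers, extend with a thermostable polymerase.
PCR_PROGRAMME = [
    ("denature", 95, 30),   # step name, temperature (deg C), hold (s)
    ("anneal",   55, 30),
    ("extend",   72, 60),
]

def run_programme(cycles=30):
    """Print the temperature schedule a thermal cycler automates."""
    for cycle in range(1, cycles + 1):
        for step, temp_c, seconds in PCR_PROGRAMME:
            # In the early 1980s this meant hand-carrying tubes between
            # water baths at each temperature, cycle after cycle.
            print(f"cycle {cycle:2d}: {step:8s} {temp_c} degC for {seconds}s")

run_programme(cycles=2)   # two cycles are enough to show the pattern
```

Thirty cycles of that schedule, by hand, is roughly 180 tube transfers per experiment, which makes the gratitude for any positive amplification easy to understand.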
These days, we take PCR and all the other “black box” genomic technologies somewhat for granted. That said, how confident can we be in the reproducibility and comparability of results from tools that are critical in underpinning diagnostic decisions and therapeutic medicines? It is rewarding that, in our designated role as the UK’s National Measurement Institute for chemical and bioanalytical measurement, LGC actively leads and participates in a number of international initiatives aimed at molecular standardisation. Most recently, a key focus has been the development of genomic DNA standards for next generation (high-throughput) sequencing.
From short sequences to whole genomes
Dr Wolfgang Zimmermann, Business Unit Manager Sequencing
The first DNA sequencing reactions I undertook, in 1990, looked at sequence variations in oncogenes using 35S-labelled Sanger sequencing. Laborious as the work was, it delivered only a few hundred bases of sequence information. The development of fluorescent labels was a great advantage and, with the ABI 377, allowed read lengths of up to 600 bases from 64 samples in a single run. This technology was a prerequisite for the Human Genome Project, and I was happy to be part of a team from Japan and Germany that sequenced the complete human chromosome 21, using shotgun libraries of hundreds of bacterial artificial chromosome (BAC) clones. It took more than three years to complete the entire sequencing programme. Very few people could then have imagined techniques that would allow a human genome to be sequenced within days, yet the introduction of next generation sequencing instruments brings this ever closer to reality. We are now at the beginning of an evolution which will enable true personalised medicine at an economically acceptable price.
From early GMO identification to standardised surveillance
Malcolm Burns, Science Leader – Food Analysis
When genetically modified organisms (GMOs) were first engineered, around 25 years ago, strategies for their traceability were unstructured and non-standardised. Over 10 years ago, when I started in this field, PCR methods often lacked specificity for particular GM varieties. The situation was further complicated because DNA extraction was not standardised, and I would visualise the PCR products on an agarose gel, often using fluorescent staining dyes with carcinogenic properties. This added subjectivity (and potential safety issues) to the analysis when estimating the size and intensity of DNA bands on a gel.
Since then I have witnessed better standardisation and efficiency in DNA extraction, with commercially available kits, internationally validated protocols and automated DNA extraction instruments all contributing. Currently, the method of choice for GMO detection is real-time PCR. Along with other National Reference Laboratories, we work with the European Reference Laboratory for GMOs to establish robust approaches for GMO identification. The challenge ahead is correctly identifying the ever-increasing number of GM varieties and evaluating high-throughput analytical strategies to cope with this.
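In practice, GM content is typically reported as the ratio of an event-specific target to a taxon-specific reference gene, calibrated against a certified reference material of known GM content. A minimal sketch of the standard delta-delta-Cq arithmetic follows; the Cq values are invented for illustration and the calculation assumes roughly 100% amplification efficiency in both assays.

```python
def gm_content_percent(cq_event, cq_ref, cq_event_cal, cq_ref_cal, gm_cal_percent):
    """Delta-delta-Cq estimate of GM content by real-time PCR.

    Assumes both assays run at ~100% efficiency, so a difference of one
    quantification cycle (Cq) corresponds to a two-fold difference in
    starting copies. The sample's event/reference ratio is normalised
    to a certified reference material (the calibrator) of known GM content.
    """
    ddcq = (cq_event - cq_ref) - (cq_event_cal - cq_ref_cal)
    return gm_cal_percent * 2.0 ** -ddcq

# Illustrative Cq values only: a sample measured against a 1% GM
# certified reference material.
estimate = gm_content_percent(cq_event=31.0, cq_ref=23.0,
                              cq_event_cal=30.0, cq_ref_cal=23.4,
                              gm_cal_percent=1.0)
print(f"estimated GM content: {estimate:.2f}%")   # ~0.38%
```

Normalising to a taxon-specific reference gene is what makes the result robust to variation in how much DNA each extraction recovers.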
Macroarrays to microarrays – the numbers game
Ramnath Elaswarapu, Science Leader – Functional Genomics
I was fortunate to work at the Human Genome Mapping Project Resource Centre, where high throughput analysis of genes was carried out by immobilising DNA molecules on the surface of nylon membranes. I still remember when we produced an E. coli ‘macroarray’ on a nylon membrane with 1,536 colonies; this caused huge excitement in the centre.
Developments soon followed with simplified robotic devices capable of spotting thousands of DNA features onto glass microscope slides. Then, in 1995, came Affymetrix’s first whole genome GeneChip, manufactured using semiconductor techniques and comprising 65,000 features. This was the turning point at which ‘microarrays’ took over from ‘macroarrays’ and the numbers game began. The maximum number of features is now counted in the low millions. Several groups, including LGC, are developing controls for gene expression and single nucleotide polymorphism (SNP) applications, with a view to employing them as performance indicators and for cross-platform comparisons. The exploitation of microarray technology continues to extend to other applications, including microRNA profiling, epigenomic analysis, biomarker discovery and whole genome copy number variation.
PCR: the legend and the new usurpers
Rebecca Howard, Genetic Researcher
When I attended my first undergraduate lecture in 1999, PCR was not only a well-established technique but a thing of legend. I still recall hearing tales of PCR being carried out by hand with water baths and polymerases so thermally unstable they couldn’t last a full amplification. Fortunately, by the time I had the opportunity to try PCR for myself, things had advanced. Any primer could be made to order, the thermostable polymerase arrived in the post and there wasn’t a water bath in sight. Even so, I remember loading the products onto a gel with trepidation, unable to quite believe that by adding just a few ingredients into a tiny tube I could achieve the miracle of DNA amplification.
With the drive now for rapid, simple and inexpensive DNA amplification, PCR is no longer the only contender for the job. The past ten years have seen numerous isothermal amplification methods rising up to take its place, all promising results in under ten minutes and removing the need for expensive thermal cycling instrumentation. Will we one day see PCR confined to history?
Technologies and economies
Gavin Nixon, Researcher – Molecular and Cell Biology
Molecular biology has changed greatly since my days as a student in the early 1990s, when I ran simple PCR on the laboratory bench and visualised the results (or lack thereof) on an agarose gel. During my time at LGC I have witnessed the transition from classical PCR to cutting-edge, high throughput real-time PCR instrumentation capable of monitoring the production of amplification products in thousands of microfluidic chambers and accurately detecting rare DNA molecules.
However, the arrival of high throughput genomics has necessitated costly, high-tech equipment and specialised reagent configurations. It is refreshing to see companies such as KBioscience focussing their efforts on developing cost-effective high throughput genotyping instrumentation and services.
Looking to the short-term future, economic strategies for molecular analysis may be of greater importance than ever before. Combining technological advances with ongoing cost reductions is vital if the global genomics industry is to meet the challenge of developing a sub-US$1,000 genome analysis method.
From DNA fingerprinting in weeks to rapid profiles in an hour!
Paul Debenham, Director Innovation and Development
In 1987, I recall the tremendous exhilaration when I developed my first DNA fingerprinting results. The ideal analysis, of about 5 µg of DNA, was a process that took several days, from DNA extraction through restriction digestion and Southern blotting to radioactive probing, to achieve the first autoradiograph results; and that was only for the first probe of potentially four to six probe analyses. In forensic casework one ideally had a blood stain the size of a 50p piece, but we would try with the smallest of samples (perhaps a stain about the size of a 5p piece), and it might take weeks, if not months, to detect the faintest results.
In the past 25 years, there have been numerous method enhancements, so that fully evidential DNA analyses are reported in just days and only from a few cells. Whilst instant ID analysis may be science fiction, LGC is developing yet another step-change in the sampling and processing of crime samples by working to develop portable rapid profiling technology with results in an hour.
LOOKING FORWARD
Advancing the understanding of our genetic makeup, and that of the plants and animals around us, is vitally important: in the fight against disease, in improving crop production and in investigating biofuel possibilities, for example. Scientists have differing views on how this can be achieved: on the one hand, through the relentless industrialisation of instrumentation to achieve faster, higher-throughput whole genome analysis; on the other, through simplification of the technology, for example to bring the science closer to patients and clinical practitioners. The past 40 years have seen some amazing developments and we look forward to seeing the next 40 years’ advancements – hopefully all on the pages of Laboratory News!