RNA – misunderstood molecules with a big impact
28 Sep 2010 by Evoluted New Media
Low copy number RNA molecules are becoming a key research tool for quantitative gene expression studies. Here we explore the significance of enforcing RNA quality control measures and the reliability of the ScreenTape Degradation Value
Figure: NMR structure of the central region of the human GluR-B R/G pre-mRNA: bases (light green) and backbone (sky blue).
Ribonucleic acid (RNA) plays a well understood, critical role in protein synthesis, and has more recently been identified as a key player in the regulation of gene expression. In its regulatory role, its abundance can be monitored and manipulated for a range of applications. However, these applications are highly sensitive, and to obtain reliable and reproducible data it is critical that RNA is assessed for degradation prior to any analysis. No single RNA degradation metric has been adopted universally, although the RNA Integrity Number (RIN – a software tool from Agilent Technologies) is commonly used. As a newly developed measure of RNA degradation, the ScreenTape Degradation Value (SDV) has been evaluated against the RIN for its robustness and reproducibility in discriminating between different levels of degradation.
A ubiquitous molecule, RNA is a biologically important nucleic acid which is predominantly found within the nucleus. As the product of transcribed DNA, the classical role of RNA is as a template for protein synthesis. The 1959 Nobel Prize in Medicine was awarded for the discovery of the mechanisms of RNA synthesis, and subsequent investigations have provided a detailed understanding of coding RNA1. However, the regulatory role of RNA was not discovered until 1990, when studies found that the introduction of genes into a plant resulted in the silencing of similar genes2, 3. Non-coding RNA molecules include the functionally important transfer RNA (tRNA) and ribosomal RNA (rRNA), as well as various small RNAs (microRNA, siRNA). Many of these have a regulatory role in the up- or down-regulation of gene expression, for example through the inhibition or activation of upstream promoter regions. This extensive functionality makes RNA ideal for use in gene expression studies, as well as in the study and diagnosis of genetic disease.
Complementary DNA (cDNA) is produced by the reverse transcription of an isolated mRNA sequence. Since the mRNA contains only coding sequence (exons), the resulting cDNA does not need to undergo any intron splicing, making it ideal for expression studies. Two processes which utilise cDNA are DNA microarrays and quantitative reverse transcriptase polymerase chain reaction (qRT-PCR), both of which rely on the integrity of the initial mRNA to ensure that reliable data is obtained.
Large-scale gene expression analysis using microarray technology offers researchers the unique opportunity to study various interactions occurring among genes. The ability to simultaneously monitor the expression of thousands of genes in a biological sample using microarrays has enabled researchers to gain an in-depth insight into complex biological processes. Such gene expression data has facilitated the understanding of various processes, such as cellular responses to stimuli, the aetiology and pathology of disease and the response to various drugs. However, in monitoring such vast numbers of genes, the potential for random or systematic errors is high4. As an extremely sensitive and costly procedure, it is therefore essential that all quality control measures are implemented before the start of an assay to ensure that experimental integrity is maintained.
The exponential amplification of low copy number RNA molecules by qRT-PCR is a highly sensitive technique and has become a key research tool for quantitative gene expression studies. In addition, qRT-PCR is often employed in the investigation of viral genomes, since many (e.g. influenza A and HIV) are composed entirely of RNA, and these studies can be used in the development of lead compounds for therapeutic use. qRT-PCR can also be used in the diagnosis of genetic disorders such as Huntington’s disease, polycystic kidney disease, sickle-cell anaemia, cystic fibrosis and haemophilia. However, working with low quality RNA may strongly compromise the results of these downstream applications, which are often labour-intensive as well as time-consuming5.
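To make the sensitivity argument concrete, the short Python sketch below shows the idealised exponential amplification behind qRT-PCR and the widely used 2^-ΔΔCt method for relative quantification. It is illustrative only: the function names, efficiency value and Ct numbers are assumptions, not figures taken from the article.

# Minimal sketch of idealised qRT-PCR behaviour (illustrative values only).
# Assumes perfect doubling per cycle; real amplification efficiencies are
# lower and must be determined for each assay.

def copies_after_cycles(start_copies: float, cycles: int, efficiency: float = 1.0) -> float:
    """Expected copy number after n cycles: N = N0 * (1 + E)^n."""
    return start_copies * (1.0 + efficiency) ** cycles

def fold_change_ddct(ct_target_sample: float, ct_ref_sample: float,
                     ct_target_control: float, ct_ref_control: float) -> float:
    """Relative expression by the 2^-ddCt method."""
    delta_sample = ct_target_sample - ct_ref_sample
    delta_control = ct_target_control - ct_ref_control
    return 2.0 ** -(delta_sample - delta_control)

# Ten starting transcripts become roughly ten million copies after 20 cycles,
# which is why even low copy number RNA can be detected and quantified.
print(f"{copies_after_cycles(10, 20):.2e}")

# A target crossing threshold two cycles earlier than in the control sample,
# relative to an unchanged reference gene, is ~4-fold up-regulated.
print(fold_change_ddct(22.0, 18.0, 24.0, 18.0))

Degraded template shifts these threshold cycles unpredictably, which is why the integrity checks discussed below matter so much for quantitative work.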
Both microarray analysis and qRT-PCR require an initial assessment of RNA integrity to ensure that meaningful data is obtained5. Since the mechanisms that initiate RNA degradation are not fully understood, any RNA sample must be routinely analysed before commencing experimental protocols. As with any degraded product, the RNA will have lost a degree of its integrity and therefore will not necessarily function as expected. In highly sensitive applications it is especially important that robust and reliable quality control measures are implemented before the experimental protocol begins, to ensure that only high quality RNA is used.
RNA integrity has commonly been determined by one of several different techniques, using gel or capillary electrophoresis, ribosomal peak ratios and microfluidic-based platforms that include the RIN measurement for RNA degradation.

Figure 1: SDV and RIN chromatograms. Graphical overlay of SDV chromatograms for the analysis of intact (A) and degraded (B) RNA. RIN chromatograms for intact (C) and degraded (D) RNA are shown for comparison. Image reproduced from the 2010 paper by Wilkes et al (7).
Starting with an electrophoretic separation on microfabricated chips, RNA samples are separated according to size and detected via fluorescence. Methods that subsequently rely on the simple visualisation of band patterns are subject to human error, especially when banding is blurred. Researchers have therefore used the ribosomal peak ratio (a measure of the eukaryotic 28S to 18S rRNA peak intensity) as a qualitative measure of degradation. However, this method is unable to provide any quantitative data and can often result in an inaccurate and irreproducible assessment of RNA integrity. Depending on the tissue type, RNA degradation can be a rapid process, particularly if the sample is contaminated with RNase. As degradation proceeds, the 28S to 18S ratio decreases, since the 28S rRNA is less stable and degrades faster than the 18S. The RIN system measures this decrease alongside the increase in baseline signal, assigning a value between 1 and 10, where 1 is the most degraded and 10 the least6. More recent research has led to the introduction of another quantitative metric as an alternative to the RIN.
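As a rough illustration of why the peak ratio is only a qualitative check, the Python sketch below computes a 28S/18S ratio from integrated peak areas and applies coarse thresholds. The peak areas and cut-offs are invented for illustration; this is not the proprietary RIN algorithm, which also weighs the baseline signal and other features of the trace.

# Illustrative only: a crude 28S/18S ratio check from integrated peak areas.
# All numbers are invented; real cut-offs are assay- and tissue-dependent.

def ribosomal_ratio(area_28s: float, area_18s: float) -> float:
    """28S:18S peak-area ratio; ~2.0 is traditionally taken to indicate intact RNA."""
    return area_28s / area_18s

def crude_quality_call(ratio: float) -> str:
    # Coarse, hypothetical thresholds for illustration only.
    if ratio >= 1.8:
        return "likely intact"
    if ratio >= 1.0:
        return "partially degraded"
    return "degraded"

for area_28s, area_18s in [(200.0, 100.0), (120.0, 100.0), (40.0, 100.0)]:
    r = ribosomal_ratio(area_28s, area_18s)
    print(f"28S/18S = {r:.2f} -> {crude_quality_call(r)}")

The same trace can yield different ratios depending on how the peaks are integrated, which is one reason the ratio alone is an unreliable and irreproducible measure.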
The ScreenTape Degradation Value (SDV) has been developed as part of Lab901’s novel electrophoretic ScreenTape platform, which employs precast multi-lane gels and microfluidics to enable automated benchtop operation and reduced assay times, delivering a reliable and reproducible measure of RNA degradation.
The SDV is an automatically generated, objective quality metric, delivered before any highly sensitive protocol is initiated. Since eukaryotic total RNA produces a characteristic chromatogram showing clearly defined 18S and 28S rRNA peaks as well as a small RNA peak, the SDV provides a measure of the breakdown of these subunits. As total RNA degrades, the 18S and 28S peaks become less distinct and eventually disappear, while peaks from the degraded material emerge between the 18S and small RNA peaks. Derived from a mathematical model that calculates a quantitative measurement of RNA degradation, the SDV represents the ratio of the average degradation peak signal to the 18S peak signal. Thus, a higher SDV corresponds to a greater level of RNA degradation. As a result, RNA profiles can be analysed quickly and easily, providing reproducible and reliable data.
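The article describes the SDV as the ratio of the average degradation peak signal to the 18S peak signal; the fragment below is a minimal Python sketch of that stated ratio only. Lab901’s underlying model is not published here, so the signal values and the function name are illustrative assumptions.

from statistics import mean

# Illustrative sketch of the ratio described above: average signal in the
# degradation region (between the small RNA and 18S peaks) divided by the
# 18S peak signal. Not Lab901's actual algorithm; all values are invented.

def sdv_like_ratio(degradation_region_signal, peak_18s_signal: float) -> float:
    """Higher value indicates more degradation: degradation signal grows as the 18S peak collapses."""
    return mean(degradation_region_signal) / peak_18s_signal

intact_like = sdv_like_ratio([2.0, 3.0, 2.5], peak_18s_signal=100.0)      # low ratio
degraded_like = sdv_like_ratio([30.0, 45.0, 40.0], peak_18s_signal=20.0)  # high ratio
print(f"intact-like sample:   {intact_like:.3f}")
print(f"degraded-like sample: {degraded_like:.3f}")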
A recently published paper from LGC7 assessed the integrity of various RNA samples with increasing levels of degradation, using both SDV and RIN values. Replicated samples were run on three separate chips or tapes, depending on the platform, and across triplicate lanes. As seen in figure 1, both methods produce easily-read electropherograms, displaying the RNA trace and indicating the levels of degradation.
The robust nature of the SDV metric qualifies it as a reliable new method for RNA sample quality control, and a useful predictor of downstream microarray performance. The SDV data obtained correlated with the RIN values, and provided better classification performance in this study. SDV can also accurately discriminate between highly degraded samples, which are often obtained when using biopsies, laser micro-dissected samples or formalin-fixed paraffin-embedded (FFPE) tissue as the starting material.
All RNA must be assessed for quality before use in assay analysis, since good quality RNA is the foundation of all subsequent work. This essential QC step ensures the validity of the resulting data, which is especially important in gene expression studies linked to the development of therapeutics or the assessment of drug efficacy. As the starting point for sensitive applications such as DNA microarrays and qRT-PCR, the mRNA used as a template for the production of cDNA must be viable, with as few signs of degradation as possible. Degradation is a progressive process: once it is observed, the quality of the RNA will only continue to decline. To ensure that researchers obtain high quality, reproducible data every time, it is vital that quality control measures are strictly adhered to; this avoids time-consuming repeat experimentation as well as the waste of costly samples. With published data showing that the SDV is comparable with RIN values in providing a reliable and reproducible method of assessing RNA integrity, the SDV delivers a rapid and easily interpreted data set and can clearly differentiate varying levels of degradation.