Posts Tagged ‘real time quantitative PCR’

Video Tutorial: Program and Setup of Bio-Rad CFX manager software

 :: Posted by American Biotechnologist on 05-16-2011

Sean Taylor, Field Application Specialist at Bio-Rad Laboratories, gives detailed step-by-step instructions on how to set up Bio-Rad’s CFX Manager software for a qPCR run. Click the “expanded view” button (bottom right of the video) for easier viewing (the video is blurry for the first 20 seconds but is very clear after that).

BMC Announces New Topical Series on qPCR

 :: Posted by American Biotechnologist on 09-13-2010

One of our favorite techniques to discuss on the American Biotechnologist is quantitative real-time PCR (qPCR), a topic we have covered in several previous posts.

    That is why we are excited to tell you about BioMed Central’s (BMC) topical series on quantitative real-time PCR normalization and optimization, edited by Joshua S. Yuan. According to BMC, topical series bring together manuscripts published in BMC Research Notes that are associated with individual topics. Through these series, BMC Research Notes aims to highlight exciting topical areas of research and provide a home for concise articles that raise awareness and encourage discussion of the subjects covered.

    I’ve scanned through some of the articles posted on BMC’s website and several of them (such as RNA pre-amplification enables large-scale RT-qPCR gene-expression studies on limiting sample amounts) have piqued my interest. I hope to review some of these articles in this forum over the coming weeks and I welcome your comments and feedback.

    A Practical Approach to MIQE for the Bench Scientist

     :: Posted by American Biotechnologist on 08-11-2010

    In a groundbreaking review published in February 2009, Bustin et al. bemoaned the lack of standardization in quantitative real-time PCR (qPCR) experimentation and data analysis. In their critique, the authors cite the use of diverse reagents, protocols, analysis methods and reporting formats, which has negatively impacted the acceptance of qPCR as a robust quantitative molecular tool. The most serious technical deficiencies include:

    • sample storage
    • sample preparation
    • sample quality
    • choice of primers and probes
    • inappropriate data and statistical analysis

    In an attempt to correct these problems and instill confidence in the reliability of qPCR, the authors proposed a new set of guidelines to help standardize the qPCR technique and encourage better experimental practice and interpretation of results. Since its publication, the Minimum Information for Publication of Quantitative Real-Time PCR Experiments (MIQE) has become an accepted standard in the scientific community, and its use is becoming standard practice among molecular biology labs everywhere.

    The MIQE checklist consists of 42 points that cover experimental design, sample quality and preparation, nucleic acid extraction, reverse transcription and qPCR target information. An easy-to-view checklist can be downloaded from the Real-time PCR Data Markup Language website.

    Sean Taylor et al. subsequently published A Practical Approach to RT-qPCR-Publishing Data That Conforms to the MIQE Guidelines, which serves as a great practical guide (as its title implies) for bench scientists. Ironically, the title of the paper does not conform to Bustin’s guidelines (Bustin suggested replacing the term RT-qPCR with the abbreviation qPCR to prevent confusion between “real-time” and “reverse transcription,” which both carry the abbreviation “RT”), but the title should not detract from the paper’s valuable content.

    Taylor expands upon Bustin’s organizational approach and provides concrete suggestions for tackling experimental design, RNA extraction, RNA quality control, reverse transcription, primer and amplicon design, qPCR validation, reference gene choice and experimental reproducibility. His suggestions are brief and to the point, and the paper is definitely worth reading in its original form. Nonetheless, I will try to sum up the salient points below.

    1. Experimental procedures, control groups, type and number of replicates, experimental conditions and sample handling methods should be well defined in advance in order to minimize variability.
    2. Handling time should be minimized during the RNA extraction procedure, which should include DNase I treatment.
    3. RNA integrity should be assessed for both purity and quality. A pure RNA sample with minimal phenol and protein contamination will have an OD 260/280 ratio of 1.8 to 2.0 (measured spectrophotometrically). Intact (undegraded) RNA will show two sharp bands when run on a formaldehyde agarose gel, with the intensity of the top band about twice that of the smaller band. Degraded and impure RNA samples should not be used for qPCR analysis.
    4. Reverse transcription should be performed immediately following the RNA quality control assessment and the same amount of total RNA should be used for each sample. Reverse transcription reaction times should also remain consistent across samples.
    5. Primers should be designed to generate amplicons that are 75-150 bp long with no secondary structure and 50-60% GC content. Primers should not have long G or C stretches, should have a G or C cap, and should have melting temperatures between 55-65°C.
    6. qPCR validation should include: determining the optimal annealing temperature, checking the specificity of the reaction via melt curve analysis, running a sample for each primer pair on a gel to confirm that the amplicon is the expected size, and confirming that the qPCR reaction efficiency is between 90 and 110% by running a standard curve.
    7. Reference genes (for relative qPCR experiments) should be chosen so that they do not exhibit changes in expression between samples from various experimental conditions or time points.
    8. Biological and technical variability should be mitigated by running at least 3 biological replicates and 2 technical replicates per biological sample for each experiment.
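Point 6 above asks for a reaction efficiency of 90-110%, which is derived from the slope of a standard curve of Ct versus log10 of template amount. Here is a minimal sketch of that calculation in Python; the dilution series and Ct values are illustrative, not taken from Taylor's paper:

```python
# PCR efficiency from a standard curve: E = 10^(-1/slope) - 1.
# Taylor's point 6 asks for E between 90% and 110%.

# Ct values measured for a 10-fold dilution series (illustrative numbers)
log10_quantity = [5, 4, 3, 2, 1]          # log10 of template copies
ct = [15.1, 18.4, 21.8, 25.2, 28.5]       # observed Ct per dilution

# Ordinary least-squares slope of Ct vs. log10(quantity)
n = len(ct)
mean_x = sum(log10_quantity) / n
mean_y = sum(ct) / n
slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(log10_quantity, ct)) \
        / sum((x - mean_x) ** 2 for x in log10_quantity)

efficiency = 10 ** (-1 / slope) - 1       # fraction; 1.0 == 100%
print(f"slope = {slope:.2f}, efficiency = {efficiency * 100:.0f}%")  # ≈ -3.36, 98%
```

A slope of about -3.32 corresponds to exactly 100% efficiency (a perfect doubling each cycle), so the illustrative assay above would pass validation.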

    In addition to being published in the journal Methods, Taylor’s guidelines are also available as a Bio-Rad technical bulletin and can be downloaded here as well.

    Educational Webinar: High-Resolution Melt Analysis

     :: Posted by American Biotechnologist on 08-02-2010

    High-resolution melt (HRM) analysis is rapidly gaining in popularity as a cost-effective and faster alternative to traditional post-PCR genotyping methods such as single-stranded conformation polymorphism, denaturing high-performance liquid chromatography, and restriction fragment length polymorphism.

    In this webinar you will gain an overview of the fundamentals of HRM and learn techniques for success through appropriate experimental design, assay optimization, and data analysis. You will also learn about specific applications from scientists using HRM technology for basic microbiological genotyping research of pathogens as well as in clinical studies, detecting receptor gene mutants linked to cancer and identifying epigenetic differences in double-stranded DNA.

    The webinar will take place Wednesday August 11 at 1pm EST.

    For more information and to register, see the official announcement on Genetic Engineering & Biotechnology News.

    Panelists include:

    * Kim De Leener, Ph.D., Center for Medical Genetics, University of Ghent, Belgium
    * Jonas Winchell, Ph.D., Chief, Response and Surveillance Laboratory, International Emerging Infections Program, Centers for Disease Control and Prevention
    * Adam McCoy, Ph.D., Senior Scientist, Gene Expression Division, Life Science Group, Bio-Rad Laboratories

    qPCR Analysis: It’s What’s Inside That Counts

     :: Posted by American Biotechnologist on 07-29-2010

    If you watched the video on real-time quantitative PCR data analysis, you should have a good understanding of real-time quantitative PCR basics and the associated data analysis techniques. Classical quantification techniques such as the Livak, delta Ct and Pfaffl methods rely on linear regression analysis and are currently the most widely accepted methodologies for quantitative PCR.

    In a paper published recently in PLoS ONE, Jensen et al. discuss several drawbacks of the conventionally accepted methodologies and propose an alternate technique for conducting relative real-time qPCR data analysis. If you recall from the video, relative quantification can be done either by normalizing samples against a unit of mass or against a reference gene. When normalizing against a unit of mass, a calibrator sample is needed, usually chosen from a control sample. The calibrator’s unit of mass, such as its cell number or amount of nucleic acid, is then accurately measured through empirical techniques. While using a unit of mass as a calibrator for relative quantification is fairly simple, its drawbacks include the need to precisely quantify the amount of starting material for each sample, a PCR reaction efficiency that is close to 100%, and few changes in gene expression between the experimental samples and control groups.

    When using a reference gene as a calibrator you don’t need to accurately quantify the amount of starting material in each sample, but you do need a known reference gene whose expression levels remain constant under treatment conditions. Furthermore, the Livak method (delta-delta Ct method) requires that the PCR efficiencies for both the gene of interest and the reference gene are within 5% of each other and close to 100%.
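For concreteness, the Livak (delta-delta Ct) calculation can be sketched in a few lines of Python. The Ct values below are made up for illustration, and the sketch assumes both assays run at ~100% efficiency, as the method requires:

```python
# Illustrative Ct values: gene of interest (GOI) and a reference gene
# measured in a treated sample and an untreated control (calibrator).
ct_goi_treated, ct_ref_treated = 22.0, 18.0
ct_goi_control, ct_ref_control = 25.0, 18.2

# Livak assumes ~100% efficiency (a doubling each cycle),
# so relative expression = 2^(-delta-delta-Ct).
delta_ct_treated = ct_goi_treated - ct_ref_treated    # 4.0
delta_ct_control = ct_goi_control - ct_ref_control    # ≈ 6.8
delta_delta_ct = delta_ct_treated - delta_ct_control  # ≈ -2.8

fold_change = 2 ** (-delta_delta_ct)
print(f"fold change = {fold_change:.1f}")  # ≈ 7.0-fold up-regulation
```

A negative delta-delta Ct means the gene of interest crosses threshold relatively earlier in the treated sample, i.e. it is up-regulated relative to the control.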

    Jensen et al. point out that there are several other drawbacks to the conventionally accepted methodologies:

    1. “the proportionality factor between fluorescence and sequence numbers differ widely between targets”

    What this means is that the dyes used for fluorescent detection may bind differently to different targets. For example, SYBR Green dye binds to the minor groove of double-stranded DNA, and its signal depends on the length and sequence of the amplicon as well as the salt concentration of the reaction buffer. As such, the level of SYBR Green signal may differ from one DNA sequence to another despite there being an equal number of molecules in the PCR sample.

    2. Low capacity PCR machines are limited by the number of samples that can be run and therefore cannot accommodate a full set of standard curves in each PCR run. “This introduces a run-to-run variability that inevitably contributes to the error of (PCR) efficiency.” Furthermore, “even tiny errors of the efficiency estimate are critical and induce disproportionally large errors because efficiency constitutes the base of the exponential PCR function.”

    Jensen concludes that “in this light, errors associated with run-to-run variability of PCR unknowns are highly undesirable.”
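Jensen's point about exponential sensitivity is easy to illustrate numerically. In this sketch (the numbers are illustrative, not from the paper), a 5-point mis-estimate of efficiency compounds over 25 cycles into a nearly two-fold quantification error:

```python
# The inferred quantity scales as (1 + E)^Ct, so any error in the
# efficiency estimate E is raised to the power of Ct.
true_eff, est_eff = 1.00, 0.95   # true vs. mis-estimated efficiency
ct = 25                          # cycles to threshold

# Ratio between the quantity inferred with the wrong efficiency and the truth
fold_error = ((1 + true_eff) / (1 + est_eff)) ** ct
print(f"{fold_error:.1f}-fold quantification error")  # ≈ 1.9-fold
```

This is exactly why the run-to-run variability of efficiency estimates from external standard curves is, in Jensen's words, "highly undesirable."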

    In order to rectify these inaccuracies, Jensen et al. propose several solutions:

    1. Construct a fusion-PCR product in which both the reference gene and the gene of interest are cloned together into the same plasmid. This way, you are assured that both genes are present in equal concentrations, which allows for a more accurate linear regression estimate.

    2. Use Run-Internal Mini Standard curves (RIMS), which consist of internal data points at two different concentrations, leading to more accurate intercept and slope estimates in the linear regression analysis.

    The results of Jensen’s experiments indicate that internal standard curves based on fewer samples are preferable to larger, external standard curves. Furthermore, when using only two samples to construct a standard curve, it is desirable to choose standard concentrations that are far from each other (termed “extreme concentrations” by Jensen).
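A two-point standard curve of this kind can be sketched as follows. Two points define the regression line exactly, which is why widely separated ("extreme") concentrations stabilize the slope estimate; the concentrations and Ct values here are illustrative, not taken from Jensen et al.:

```python
# Two run-internal standards at widely separated concentrations
# (illustrative numbers).
log10_q1, ct1 = 6.0, 14.8   # high-concentration standard
log10_q2, ct2 = 2.0, 28.2   # low-concentration standard

# Two points define the line Ct = slope * log10(quantity) + intercept exactly
slope = (ct2 - ct1) / (log10_q2 - log10_q1)   # -3.35
intercept = ct1 - slope * log10_q1            # 34.9

# Quantify an unknown run in the same plate by inverting the line
ct_unknown = 21.5
log10_quantity = (ct_unknown - intercept) / slope
print(f"unknown ≈ 10^{log10_quantity:.2f} copies")  # → unknown ≈ 10^4.00 copies
```

Because the standards are run on the same plate as the unknowns, the slope and intercept are estimated per run, which is what removes the run-to-run variability Jensen criticizes in external curves.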

    Finally, Jensen demonstrates that the RIMS-based approach, which utilizes an internal standard curve generated with only two reference points, decreases run-to-run variability and is more accurate than Livak’s delta-delta Ct method calculated using an external standard curve.

    A significant advantage of the RIMS-based approach is that it renders the calibrator sample superfluous, since reference samples are run together with experimental samples. As such, run-to-run variability is eliminated.

    As a side note, an alternative method of avoiding run-to-run variability is to use a bigger PCR machine (such as Bio-Rad’s CFX-384) which will allow you to run a full standard curve with every PCR experiment. This way, sample quantity can be calculated using a standard curve that was run in parallel with your experimental samples.

    The math involved in this paper is quite complicated and I won’t get involved in it here. Nonetheless, I definitely welcome comments from members of the American Biotechnologist community who can help “dumb down” the equations for the general molecular biology populace.

    Bernth Jensen JM, Petersen MS, Stegger M, Ostergaard LJ, & Møller BK (2010). Real-Time Relative qPCR without Reference to Control Samples and Estimation of Run-Specific PCR Parameters from Run-Internal Mini-Standard Curves. PLoS ONE, 5(7). PMID: 20661435