Glossary

Welcome to the BioMS Glossary where we explain many of the common terms used in the BioMS Facility. Let us know of any terms we’ve missed by emailing us and we’ll add them in!

A | B | C | D | E | F | G | H | I | J | K | L | M | N | O | P | Q | R | S | T | U | V | W | X | Y | Z

A

Absolute quantification: {Term} Measuring the concentration (amount of analyte per amount of sample) of an analyte within a sample. This is the most challenging form of quantification as you need to establish the link between signal intensity and concentration. The quality of the results will be determined by the level of validation performed to prove the robustness, precision and accuracy of your results. Absolute quantification is normally performed by including quality control samples of known concentration within your assay and comparing the calculated value to the theoretical value. This process is relatively well defined for small molecules such as drugs but is much more challenging for less well characterised biological molecules. The key to the process is the analytical standard that unknowns are compared to. This standard should be identical to the analyte to be measured and have a known concentration. Complex molecules such as proteins can have multiple isoforms and so your standard may not be the same as your analyte. Furthermore, the standard may be impure, difficult to solubilise or unstable, making the production and storage of a solution of known concentration very difficult. Realistically, absolute quantification may be more a process that gives you reproducibility in your assay than accuracy.

Accuracy: {Term} Accuracy is a measure of how close your results are to the true value. This is particularly important in absolute quantification.
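
As a minimal sketch, the accuracy of a quality control sample can be expressed as the back-calculated value relative to the theoretical one (the values below are purely hypothetical):

```python
# Sketch: percent accuracy of a QC sample (hypothetical numbers).
def percent_accuracy(measured, true_value):
    """Accuracy expressed as the measured value relative to the true value."""
    return 100.0 * measured / true_value

# A QC sample prepared at 50 ng/mL that back-calculates to 47.5 ng/mL:
acc = percent_accuracy(47.5, 50.0)
print(f"{acc:.1f}% accuracy")  # 95.0% accuracy
```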

Analyte: {Term} The species that is the focus of an analytical experiment. In biological mass spectrometry this will normally be a carbon based molecule such as a small molecule, a peptide, a protein or a drug.

Artefact: {Term} A change that exists as a result of the analytical process rather than within a biological context. Identifying or avoiding artefacts is an important aspect of experimental design.

Auto-Lysis: {Term} When a protease cleaves itself. This is not desirable as it reduces the activity of the protease and can produce large amounts of peptides that disrupt the analysis.

Average molecular weight: {Term} The molecular weight of a molecule including the average numbers of naturally occurring isotopes of the atoms involved. This value is often used for large molecules such as intact proteins.

< Back to top

B

Base Peak Chromatogram: {Term} A plot of the base peak intensity against time. Used to look at how the chromatography is performing and is complementary to the Total Ion Chromatogram (TIC) for looking at how the LC-MS system is performing as part of a data assessment.

Base Peak Intensity: {Term} The signal intensity of the most intense peak in a spectrum.

BPC: {Abbreviation} Base Peak Chromatogram

BPI: {Abbreviation} Base Peak Intensity

< Back to top

C

Calibration curve: {Term} A calibration curve is a series of samples created by dilution of standards to demonstrate the relationship between signal intensity and concentration. Ideally the relationship should display linearity. The number of dilution levels and the number of replicates per level defines the effectiveness of the calibration curve, with the balance being between the number of levels and the number of replicates. There are no fixed rules about how calibration curves are constructed but in general the greater the number of samples in the calibration curve the better it is. For the highest confidence at low concentrations the calibration levels should be spaced more closely at the low end of the curve. There are a number of ways to calculate the relationship between signal intensity and concentration; linear regression is normally used to produce a y=mx+c relationship and a measure of correlation. For studies looking at low level samples you should consider using weighted least squares linear regression.
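
The fitting described above can be sketched with a weighted least squares fit. This is a minimal illustration using synthetic data; note that in `np.polyfit` the weights multiply the residuals before squaring, so `1/sqrt(conc)` corresponds to classic 1/x weighting that favours the low end of the curve:

```python
import numpy as np

# Sketch: fitting a calibration curve with weighted least squares.
# Concentrations and signals below are synthetic, for illustration only.
conc = np.array([1.0, 2.0, 5.0, 10.0, 50.0, 100.0])      # e.g. ng/mL
signal = np.array([105.0, 198.0, 510.0, 1010.0, 4900.0, 10100.0])

# np.polyfit minimises sum((w * residual)**2), so w = 1/sqrt(conc)
# implements 1/x weighting of the squared residuals.
m, c = np.polyfit(conc, signal, 1, w=1.0 / np.sqrt(conc))

def back_calculate(s):
    """Concentration of an unknown back-calculated via y = m*x + c."""
    return (s - c) / m

print(f"slope = {m:.1f}, intercept = {c:.1f}")
```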

Carry over: {Term} When molecules from previous samples are observed in the runs after them. Usually a consequence of overloading the system and typically caused by small voids in the plumbing.

Charge state: {Term} The charge state of a gaseous ion is the number of charges on the ion. This can be determined by looking at the spacing between the peaks due to naturally occurring atomic isotopes, normally due to the presence of the carbon 13 isotope which is present at a level of about 1.1%. The gaps between these isotopes should be 1 Da but, as mass spectrometers measure mass to charge ratio (m/z), the spacing will be measured as 1 divided by the charge state (units are Thomsons). Using this form of charge recognition allows the calculation of the monoisotopic mass. As the mass of the analyte increases so does the potential to accept charge, so large molecules such as proteins can have many charges. As charge increases the m/z decreases until it reaches a point where the mass spectrometer cannot distinguish the differences between carbon 13 isotopes; the point at which this occurs depends upon the resolution of the mass spectrometer. In these cases the average molecular weight can be determined by looking at the patterns of peaks representing increases in the charge state of the analyte. This analysis is an iterative process requiring specialist software and it relies upon a relatively simple sample composition in order to reduce the complexity of the analysis.
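
The charge-state arithmetic above can be sketched in a few lines (a minimal illustration; the peptide values are hypothetical, the proton mass is the standard 1.00728 Da):

```python
# Sketch: deduce the charge state from isotope peak spacing and compute
# the neutral monoisotopic mass of a [M + zH]z+ ion.
PROTON = 1.00728  # Da

def charge_from_spacing(spacing_mz):
    """Isotope peaks are 1 Da apart, so the observed spacing is 1/z."""
    return round(1.0 / spacing_mz)

def monoisotopic_mass(mz, z):
    """Neutral monoisotopic mass: remove z protons and multiply out z."""
    return z * mz - z * PROTON

# A peptide ion at m/z 785.84 whose isotope peaks are 0.5 Th apart is 2+:
z = charge_from_spacing(0.5)
mass = monoisotopic_mass(785.84, z)
print(z, round(mass, 2))  # 2 1569.67
```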

Chromatography: {Technique} The separation of molecules using a combination of a moving ‘mobile phase’ and a non-moving ‘stationary phase’. In liquid chromatography (LC) the mobile phase is a liquid and the stationary phase is some form of solid, either based on silica or a polymer support. Gas chromatography has a gas mobile phase and a liquid stationary phase bound to the internal surface of a silica tube. Gradient based chromatography involves using a stationary phase that binds all of the analytes and then gradually releases them as the composition of the mobile phase changes in a ‘gradient’. Developments in chromatographic separations have led to high performance liquid chromatography (HPLC) and ultra-high performance liquid chromatography (uHPLC or UPLC).

Column: {Equipment} The container for the stationary phase used in liquid chromatography. This is normally a steel, plastic or silica tube with high pressure connections at each end but new formats are being created such as etched plates.

Condition: {Term} Part of the grouping of samples in the experiment design. The condition might be a chemical treatment, a mutation or a different source of biological sample.

Contaminant: {Term} Anything within a sample that is not what you want to analyse. This can be molecules that occur naturally within the sample, such as phospholipids, or things that are introduced to the sample, like polymeric contamination, detergents and biological material such as skin. These are particularly problematic when they interfere with the analysis.

Control: {Term} Controls are predominantly used in quasi quantification and relative quantification and allow the determination of whether analytes have increased or decreased between conditions. The choice of control is critical for experimental success. In a comparative experiment significance is determined by how different a change is to those that stay the same and therefore many things have to be the same. In a typical ‘omics experiment the goal is for about 80% to 90% of the sample to be the same and the statistics used in data analysis will often depend on that being true. Ideally the only differences between the control and your sample should be the biology that you are interested in.

< Back to top

D

Dalton (Da): {Unit} Main unit of mass used in describing molecules, effectively the same as the atomic mass unit (amu), where 1 Da = 1/12 of the mass of carbon 12. Large molecules such as proteins often have their mass defined in kDa (1000 Da). This term is often used interchangeably with Thomson (Th) when looking at mass to charge (m/z) data from a mass spectrometer.

Data acquisition: {Term} The use of computer software to store the results of an analytical process. The data files created are generally called the raw data.

Data analysis: {Term} The process of using raw data to answer specific questions such as ‘what is in my sample’ and ‘how much is in my sample’. This will often be done in two stages, data processing and data interpretation.

Data assessment: {Term} Examination of the quality of an analysis by the global assessment of the data acquired. This is often done as part of a validation process.

Data directed analysis: {Approach} Data directed analysis (DDA) is a mass spectrometry methodology where the fragmentation process is directed by an assessment of the precursors present at any one time. This is the most common approach used in metabolomics and proteomics and provides the most specific fragmentation patterns for identification, and for quantification if using isobaric tagging approaches. The use of this type of data for identification is more straightforward but it is limited in the number of analytes that are selected for fragmentation. This means that DDA-produced datasets may be more prone to missing values and are less reproducible than targeted analysis or data independent analysis (DIA) methodologies, though they are often preferred as they incorporate more analytes than targeted analyses and are more selective than DIA.

Data independent analysis: {Approach} Data independent analysis (DIA) is a mass spectrometry methodology where the fragmentation process is set in advance and should ideally cover all precursors present within a specific m/z range. It is generally performed by either stepping through the mass range in a series of overlapping m/z windows or by scanning the mass range, both using a quadrupole for selection. These approaches acquire fragmentation data on more analytes than data directed analysis (DDA) approaches; however the fragmentation data acquired is mixed with that of many other species. The configuration of the analytical method is important in DIA: the method of acquiring the fragmentation data controls the cycle time of the acquisition, which in turn controls the number of data points across a chromatographic peak, which in turn controls the quality of a quantification. The processing of this type of data can be more complicated and care has to be applied to the quality of the identification of the analytes being measured. These approaches are increasing in popularity and the technologies and methodologies to perform them are in rapid development.
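
The chain from window count to cycle time to points per peak can be sketched as follows (all numbers are hypothetical, for illustration only):

```python
# Sketch: how the number of DIA windows drives the cycle time, which in
# turn sets the number of data points sampled across a chromatographic peak.
def points_per_peak(n_windows, scan_time_s, peak_width_s):
    """One data point per cycle; cycle time = windows * scan time."""
    cycle_time = n_windows * scan_time_s
    return peak_width_s / cycle_time

# 40 windows at 25 ms each give a 1 s cycle; a 10 s wide peak is then
# sampled 10 times, enough points to define its shape for quantification.
print(points_per_peak(40, 0.025, 10.0))
```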

Data interpretation: {Term} The higher-level interpretation of processed data, which may include both identification and quantification values, to assign biological inference.

Data processing: {Term} In mass spectrometry this is the first stage of data analysis and generally involves the identification, and optionally the quantification, of molecules within a sample by examination and transformation of raw data. This produces processed data and is performed before data interpretation.

DDA: {Abbreviation} Data directed analysis.

Detergent: {Reagent} A detergent is a reagent that mimics lipids in order to aid solubilisation of complex molecules such as proteins. There are many different types of detergents but they all have a common structure of a hydrophobic ‘tail’ and a hydrophilic ‘head’. When the concentration of a detergent reaches its critical micelle concentration (CMC) it will self-assemble into structures called micelles where the hydrophobic tails are within the structure and the hydrophilic heads are outside the structure. This provides a hydrophobic region that can support the hydrophobic regions of proteins. Detergents are generally bad for mass spectrometry systems as they change the active surface of reversed phase chromatography columns and they can disrupt spray stability and contaminate the mass spectrometry system. Any detergents added to a sample should be removed again in the sample preparation.

DIA: {Abbreviation} Data independent analysis

Digestion: {Term} A process that breaks chemical bonds of larger molecules to produce smaller pieces. This is commonly done chemically or enzymatically. In biological mass spectrometry the term is generally used to describe the use of a protease to enzymatically produce peptides from proteins as part of sample preparation.

Discovery analysis: {Term} Discovery analysis means that your experiment is not focussed on specific analytes and includes the identification of unknowns. This is the typical approach of ‘omics methodologies such as metabolomics and proteomics and is used for the generation of new hypotheses.

Drop out: {Term} A period where no signal is seen in a chromatogram. This normally indicates an interruption in the generation of ions in the source and is particularly prevalent in low flow rate nanoESI applications. This is frequently associated with overloading and the presence of contaminants.

Drug: {Term} Generally an artificial small molecule that has a specific biological function. Drugs are normally very well characterised for solubility and storage. Drugs are normally targets for absolute quantification.

< Back to top

E

Elastase: {Reagent} A protease with low specificity that produces overlapping peptides which are very useful in the characterisation of proteins and their post translational modifications. Note: The variable nature of the digestion means that the data is suitable for characterisation but not quantification.

Enzyme: {Term} In biological mass spectrometry this is often used to refer to a protease used to convert an intact protein into peptides before analysis.

Experimental design: {Term} How an experiment is structured to ensure that it is capable of answering the desired question with the level of confidence required. This process is critical to a successful experiment and will involve the selection of conditions, replicates and controls. It is highly recommended to talk with facility staff about experimental design before starting your experiments.

< Back to top

F

Fractionation: {Process} The division of a sample into multiple sub-samples called fractions. This is mostly performed with protein and peptide samples. For protein samples the most common form of fractionation is polyacrylamide gel electrophoresis (PAGE) and for peptide samples it is commonly by liquid chromatography using either reversed phase chromatography or ion exchange chromatography. Using a fractionation approach can increase the difficulties of data analysis as you need to reassemble the data set from the fractions. Protein separations can be relatively easy to reassemble as all the peptides from the same protein are in the same sample and therefore the data can be searched independently for each fraction and reassembled afterwards. This is commonly done when using SILAC approaches. Reassembling data from peptide fractions is more challenging as peptides from the same proteins are spread across multiple runs. This means that data must be combined during data analysis, making the datasets larger and more difficult to manage. This approach is commonly performed with isobaric tagging samples.

Fragment (mass spectrometry): {Term} A part of an ion generated by the breakage of a covalent bond during fragmentation.

Fragmentation: {Term} The process of breaking an ion into smaller pieces within a mass spectrometer; it is the fundamental process in tandem mass spectrometry (MS/MS). This involves giving the ion energy that converts into molecular motion that then results in the breaking of covalent bonds. The transfer of energy is generally done by collision with an inert gas molecule such as helium or nitrogen but other methods are also available such as electron impact (EI) or electron transfer dissociation (ETD). The fragmentation process occurs in a population of ions that will fragment in different places and these signals will be summed within a single spectrum. The pattern of fragmentation can be hard to predict but is generally consistent within a specific instrument or instrument type and can be used for identification.

Fragmentation pattern: {Term} Used to describe the masses of the fragments produced by fragmentation in a mass spectrometer and the relative signal intensities. This pattern is considered distinctive for a particular analyte and is used for identification. The pattern produced can depend on the instrument type used to generate the pattern and can even depend on the individual instrument itself and can be difficult to predict from the molecular structure of the analyte. It can also be further complicated by the presence of multiple compounds in the fragmentation process producing mixed spectra. The relative levels of the fragments present can be particularly difficult to predict and are generally only used to provide extra confidence in targeted analysis approaches where a chemical standard is available to determine the fragment ratios on the particular analytical system and sample type used.

< Back to top

G

Global analysis: {Term} Another way of describing unbiased discovery analysis for hypothesis generation.

Gradient: {Term} A gradient describes how a liquid chromatography system changes the composition and flow rate of the mobile phase during an analysis. The gradient controls how the analytes are separated and must be specified in any reporting or publication to ensure the work performed can be replicated.

< Back to top

H

High performance liquid chromatography: {Technique} High performance liquid chromatography (HPLC) is a development of liquid chromatography where small beads are used in the column. In HPLC the bead sizes are generally around 3 to 5 µm. The decrease in bead size increases the separation capability, demonstrated in the resolution and the peak capacity. Decreasing the bead size also increases the back pressure of the system, which leads to the common misnaming of the technology as high pressure liquid chromatography. Further developments have dropped bead sizes even further and this is called ultra high performance liquid chromatography (uHPLC/UPLC).

High performance liquid chromatography system: {Instrument} A high performance liquid chromatography (HPLC) system is a type of liquid chromatography system that is capable of using high performance liquid chromatography columns. This is generally defined by the pressures the system is capable of generating from the pump and that can be tolerated by the other parts of the system such as the injection valves. An HPLC system can generally work at pressures of up to 200 bar. Systems that work at significantly higher pressures, e.g. up to 1000 bar, are called UPLC systems.

High pH fractionation: {Technique} High pH fractionation is a form of liquid chromatography based fractionation that uses reversed phase chromatography with high pH buffers instead of the low pH buffers used in the subsequent LC-MS separation. This provides a different selectivity to the low pH separation although the mechanism is similar. The similarity means that to get the most benefit from the LC-MS system you need to pool samples from the start, middle and end of the fractionation to ensure that the whole separation area is used.
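
The pooling strategy described above is commonly done by concatenation, so each pooled sample draws fractions from the start, middle and end of the separation. A minimal sketch (fraction and pool counts are hypothetical):

```python
# Sketch: concatenated pooling of high pH fractions. Fraction i goes to
# pool i % n_pools, so every pool samples the whole separation range.
def concatenate_fractions(n_fractions, n_pools):
    pools = [[] for _ in range(n_pools)]
    for i in range(n_fractions):
        pools[i % n_pools].append(i + 1)  # 1-based fraction numbers
    return pools

# 12 collected fractions concatenated into 4 pooled samples:
for pool in concatenate_fractions(12, 4):
    print(pool)
# [1, 5, 9]
# [2, 6, 10]
# [3, 7, 11]
# [4, 8, 12]
```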

HPLC: {Abbreviation} High performance liquid chromatography or high performance liquid chromatography system

< Back to top

I

Identification: {Term} The identification of an analyte is generally done by comparing experimentally acquired data to a reference that has been recorded into a library or generated theoretically. The specificity of the match depends on the specificity of the data used for the match. Examples of the types of data used for matching include the fragmentation pattern, the mass of the precursor and the retention time of the analyte. Identification generally has a score or confidence measure associated with it, so a cut-off will need to be applied, with the balance being between excluding true positives and including false positives. We highly recommend you speak with an informatician if you need to adjust any parameters or thresholds in your data.

In-gel digestion: {Technique} A form of sample preparation that can be used on proteins separated by polyacrylamide gel electrophoresis (PAGE). It involves excising bands or sections from the gel, washing them, digesting the proteins within the gel using proteases and extracting the resultant peptides. Success is often defined by the concentration of the protein in the gel section and the accessibility of the protein to the protease, so the amount of gel excised should be kept to a minimum and the sections should be cut into smaller pieces to provide a greater surface area and a smaller cross section.

Instrument analysis: {Term} Used to describe the analysis of a sample using an instrument based analytical technique such as liquid chromatography-mass spectrometry. It describes the data generation section of the overall sample analysis procedure, which comprises sample preparation, instrument analysis and data analysis.

Internal standard: {Term} An internal standard is a compound that is added to a sample to correct for errors generated in sample preparation and instrument analysis. It should be as closely related to the analyte of interest as possible, preferably a stable isotope labelled version of the analyte itself, differing only by its molecular weight. An internal standard of defined concentration can be used in absolute quantification to determine the concentration of the analyte by comparison of their signal intensities. This should be used with caution without the use of a calibration curve as the linearity of the signal intensity has not been demonstrated, and so the signal intensity of the internal standard needs to be as close to that of the analyte as possible, potentially requiring multiple analyses of the sample with different levels of the internal standard. In situations where there are many analytes a class based internal standard may be used, where a labelled representative of a class of molecules is used across the class. This is generally considered a method of converting relative quantification to absolute quantification but should be treated with caution unless the behaviour of all members of the class has been demonstrated to be similar.
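
Single-point quantification against an internal standard can be sketched as follows (this assumes a linear response, which, as noted above, should not be taken for granted; all values are hypothetical):

```python
# Sketch: single-point quantification against a stable isotope labelled
# internal standard (IS) spiked at a known concentration. Assumes the
# analyte and IS respond linearly and identically.
def concentration_from_is(analyte_signal, is_signal, is_concentration):
    """Analyte concentration from the analyte/IS signal ratio."""
    return (analyte_signal / is_signal) * is_concentration

# IS spiked at 10 ng/mL; the analyte peak is 1.5x the IS peak:
print(concentration_from_is(15000.0, 10000.0, 10.0))  # 15.0 (ng/mL)
```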

Ion: {Term} An ion is a molecule that carries a charge that enables it to be manipulated by electric fields. Positive charges are provided by the addition of a charged species; this is normally a proton (hydrogen ion) but can also be a metal ion such as sodium or magnesium. An ion created by the addition of anything other than a proton is called an adduct. Negatively charged ions are generally created by the loss of a proton but can also be created by the addition of a negatively charged species or the addition of an electron.

Ion exchange chromatography: {Technique} Ion exchange chromatography (IEC) is a form of liquid chromatography. In this instance the sample is loaded on to a column that contains beads with either positive (anion exchange) or negative (cation exchange) charges in low salt. The molecules that carry the counter charge to that of the beads are bound to the beads by electrostatic forces. A gradient of increasing ionic strength is then applied to the column and the molecules are eluted in order of their increasing electrostatic attraction to the bead surface. IEC has a very different separation mechanism to reversed phase chromatography and so is an excellent fractionation method before LC-MS; however not all molecules bind to any particular IEC column and so some losses do occur. High pH fractionation is an alternative that incurs lower losses but shares a similar separation mechanism.

Isobaric tagging: {Technique} Isobaric tagging is a method of relative quantification that uses special reagents that have the same molecular weight (isobaric) but are specifically synthesised using atomic isotopes so that they produce distinctive fragments (“channels”) after fragmentation. The reagents are used to chemically modify peptides produced by protease digestion before combining the labelled digests into a single sample. The two main isobaric tagging products are TMT from Thermo and iTRAQ. These reagents are sold in kits that allow up to 16 samples to be compared within the same experiment. As the ‘plex’ of the kits increases so do the demands on the mass spectrometry system, and the new 16 plex kit requires high resolution MS/MS capabilities. To get the best from the isobaric tagging approach you should consider sample fractionation. The data analysis can be complicated, requiring specialist software and statistics, and some recommend repeating experiments using different channels to ensure that there is no bias. These experiments can be expensive and time consuming and are generally applicable when a large amount of data is required from a relatively low number of samples and specific funding has been obtained.
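
As a simple sketch, the relative contribution of each reporter channel for a single peptide spectrum can be computed as its share of the summed reporter signal (the intensities below are hypothetical):

```python
# Sketch: relative quantification from isobaric reporter channel
# intensities for one peptide spectrum (hypothetical values).
def channel_fractions(intensities):
    """Each channel's share of the summed reporter signal."""
    total = sum(intensities)
    return [i / total for i in intensities]

# Four channels (e.g. four conditions) for a single peptide:
reporters = [12000.0, 11500.0, 24000.0, 12500.0]
print([round(f, 3) for f in channel_fractions(reporters)])
# [0.2, 0.192, 0.4, 0.208]  -> channel 3 is roughly doubled
```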

Isoform (protein): {Term} Protein isoforms are proteins that originate from the same gene but have different final structures either due to splice variants or post-translational modifications. Isoforms are central to the concept of protein speciation and make the quantification of proteins challenging.

< Back to top

L

Label free quantification: {Technique} Label free quantification (LFQ) is a form of relative quantification distinct from metabolic labelling or isobaric tagging approaches. It involves using the signal intensity of the precursor ions for quantification in the form of peak heights, areas or volumes, depending on the software used. This form of quantification requires the highest level of control over the source of the sample, sample preparation and instrument analysis as there are no inbuilt controls within the sample. The benefits of this approach are that there is minimal modification applied to the samples, it can be applied to any type of sample and to any batch size. It is the default quantification technique used in BioMS for these reasons. This technique tends not to be used with fractionation as the results can be difficult to reconstruct from fractionated data, and so the datasets can be smaller than those achieved using fractionation in metabolic labelling or isobaric tagging techniques. Software that supports LFQ tends to use an approach called ‘match between runs’ (MBR) to maximise the size of the datasets produced and to reduce the level of missing data. MBR allows the identification of signals in one run to be matched across other runs where that identification did not take place. The data analysis of any quantification dataset can be complicated and you should seek informatics support when necessary.

LC: {Abbreviation} Can stand for Liquid chromatography as a method or a liquid chromatography system as an instrument.

LC-MS: {Abbreviation} Can stand for Liquid chromatography – mass spectrometry as a method or a liquid chromatography-mass spectrometry system as combined instrument

LFQ: {Abbreviation} Label free quantification

Linearity: {Term} Linearity is when the relationship between two factors can be described by a straight line, i.e. as one factor increases the other also does by the same proportion. In mass spectrometry the two factors are generally signal intensity and analyte concentration and this relationship can be determined by a calibration curve. A linear relationship is particularly critical if changes in concentration are to be determined from comparison to a single sample, such as ‘this peak in this run is twice as big as in that run so there is twice as much present’. Unfortunately relationships are frequently non-linear in LC-MS. At low concentrations there can be concentration dependent binding to surfaces, particularly the vial in which the sample is stored, and this results in lower signals than would be expected. At higher concentrations the system can be prone to signal suppression where signals are lower than expected and can even drop as concentration increases. This means that all quantification work should be performed with caution and you should seek expert advice before starting the project.
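
A toy comparison shows why the ‘twice as big means twice as much’ reasoning fails once the response is no longer linear (the saturation model and its parameters are purely illustrative):

```python
# Sketch: signal ratios only track concentration ratios when the
# response is linear. The saturating model below is a toy illustration.
def linear_response(conc):
    """Ideal detector: signal proportional to concentration."""
    return 100.0 * conc

def saturating_response(conc, smax=10000.0, k=50.0):
    """Toy saturating detector: signal approaches smax at high conc."""
    return smax * conc / (k + conc)

# Doubling the concentration from 50 to 100:
print(linear_response(100.0) / linear_response(50.0))          # 2.0
print(saturating_response(100.0) / saturating_response(50.0))  # ~1.33, not 2
```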

Liquid chromatography: {Technique} Liquid chromatography (LC) is a form of chromatography that uses a liquid mobile phase and a solid stationary phase. This technology is commonly matched with mass spectrometry to produce a ‘hyphenated’ technology, liquid chromatography-mass spectrometry (LC-MS). There are different types of liquid chromatography possible, including reversed phase chromatography and ion exchange chromatography, which are defined by the chemistry of the column and mobile phases applied to the liquid chromatography system. The level of separation (resolution) possible on the system is predominantly controlled by the size of the particles in the column. Separations using beads of about 3 to 5 µm are termed high performance liquid chromatography (HPLC) and those with beads less than 2 µm are termed ultra high performance liquid chromatography (UPLC/uHPLC).

Liquid chromatography-mass spectrometry: {Technique} An example of a hyphenated approach that links two different types of technologies together to produce a combined instrument with more capabilities than the two instruments separately. The liquid chromatography system concentrates the sample and gradually elutes the components of the sample into the mass spectrometer ensuring that the complexity of the mixture presented to the mass spectrometer is as simple as possible and provides extra time for the analysis. This is the main technology used in the BioMS facility.

Liquid chromatography-mass spectrometry system: {Instrument} This is the physical instrument set used to perform liquid chromatography-mass spectrometry. It consists of a liquid chromatography system, usually an HPLC or UPLC, which is connected to a mass spectrometer via an electrospray or nano electrospray source.

Liquid chromatography system: {Instrument} This is the physical instrument used to perform liquid chromatography. It is normally configurable and made up of multiple functional units that can either be packaged in a single box or exist as multiple linked modules. The basic configuration required is a pump that can push the mobile phase through the column at a defined flow rate and an injector system that allows the sample to be introduced to the column. More complicated systems will provide either a second pump and a mixer after the pump (high pressure gradient) or a proportioning and mixing device before the pump (low pressure gradient) in order that a gradient can be created for applications such as reversed phase chromatography or ion exchange chromatography. The injection system can also be more sophisticated using robotic automation and sample cooling. Temperature has an effect on the separations within the column and so a temperature control device is frequently used to fix the column temperature and this is generally called a column oven or column thermostat. Finally a detector can be added to look at the separated analytes. In a standalone system this will often be an ultraviolet/visible wavelength (UV/Vis) spectrometric detector or a fluorescent spectrometric detector or these can be completely replaced with a mass spectrometry system in a liquid chromatography-mass spectrometry system.

< Back to top

M

m/z: {Acronym} Mass to charge ratio

MALDI: {Acronym} Matrix assisted laser desorption/ionisation

Mass spectrometer: {Instrument} An analytical instrument that can be used to determine the molecular weight (either the monoisotopic mass or average molecular weight) of gas phase ions introduced into the mass spectrometer via the source. Some mass spectrometers also have the ability to select a single group of ions and perform fragmentation on them providing data that can be used for identification or characterisation of the molecule. They also generate a signal intensity that can be correlated with the amount of ions present and this signal can be used for quasi quantification, relative quantification or absolute quantification. Note: The acronym MS is also used for the process of mass spectrometry.

Mass spectrometry: {Technique} The field of using a mass spectrometer to analyse molecules.

Mass to charge ratio: {Term} This is the main characteristic measurement acquired by a mass spectrometry system. The behaviour of a gaseous ion in an electric field is controlled by its charge and its mass. The greater the charge, the greater the effect of the electric field; the greater the mass, the less the effect. This relationship means that the mass spectrometer sees the mass to charge ratio (m/z), and a determination of the charge state is required to calculate the mass of the species measured.
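
The charge-state arithmetic described above can be sketched in a few lines of Python (a minimal sketch: positive-mode, protonated ions are assumed and the example m/z value is hypothetical):

```python
PROTON_MASS = 1.00728  # mass of a proton in Da

def neutral_mass(mz: float, charge: int) -> float:
    """Neutral mass from an observed m/z and charge state,
    assuming each charge comes from an added proton."""
    return charge * (mz - PROTON_MASS)

# e.g. a doubly charged peptide ion observed at m/z 785.84
print(round(neutral_mass(785.84, 2), 3))  # 1569.665
```

The same relationship run in reverse (mass plus protons, divided by charge) predicts where an ion of known mass will appear in the spectrum.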

Matrix assisted laser desorption/ionisation: {Technique} Matrix assisted laser desorption/ionisation (MALDI) is a type of source used to generate gaseous ions in mass spectrometry. A liquid sample is mixed with a concentrated solution of a ‘matrix’ and a droplet is left to dry on a special steel plate called a target. As the droplet dries the matrix forms crystals in which the analytes are trapped, in what is called a ‘spot’. The steel target with the dried spots is loaded into a MALDI source; this is generally under high vacuum but can also be performed at atmospheric pressure. The ions are generated by hitting the spot with a high-powered laser that is absorbed by the matrix, generating enough energy that the analytes are released as a gas from the spot and provided with a charge. The matrix also generates gaseous ions with low masses and so this approach is normally only used with larger molecules such as peptides, glycans and proteins. MALDI approaches generally provide a single snapshot of a sample but multiple acquisitions can be made across the sample, making this technique particularly useful for MS Imaging.

Matrix (MALDI): {Reagent} The MALDI matrix is generally an organic acid that forms crystals when dried from solution that absorb laser light leading to vaporisation of the crystal and whatever it contains. This is an essential part of the matrix assisted laser desorption/ionisation technique as a source of gaseous ions for mass spectrometry.

Matrix (Sample): {Term} The sample matrix is the composition of a sample that is not the analyte/analytes of interest. It is what directs the choice of sample preparation used and is the source of many contaminants in the sample.

Metabolite: {Term} A metabolite is an intermediate or product of metabolism. It is normally used as a classification of small molecules that occur naturally within a biological system and is the focus of metabolomics and metabolite screening.

Metabolite screening: {Technique} Metabolite screening is a form of targeted analysis that looks for a selection of metabolites and other small molecules. The technique usually requires an experimentally derived library of metabolites that have been analysed using a specific separation for data analysis. It can be used for relative quantification or absolute quantification. It uses external standards or class based internal standards and therefore the accuracy will be lower than those using dedicated internal standards. Metabolite screening can be performed at the same time as targeted analysis by using data independent acquisition (DIA), providing absolute quantification on the analytes of interest and relative quantification of other metabolites that are within the library for that particular separation.

Metabolomics: {Term} Metabolomics is a form of ‘omics analysis that looks at the composition of metabolites within a sample in an unbiased manner. It is practically challenging as metabolomics covers a much wider distribution of molecular classes than the other ‘omics approaches. This means that no single analytical system is capable of covering all molecular classes and so multiple techniques must be used or it has to be accepted that some of the molecular classes are not covered. A further challenge for metabolomics is that the diagnostic information provided for each analyte is relatively low and many metabolites are isomers of other metabolites and so identification is very challenging. To address the identification challenge many labs use a chemical library of metabolites to confirm the identity of those metabolites in the data. Specialist software can be used to guess the identities of metabolites that are not in the library in a discovery analysis or the assay can be focussed on only those metabolites in the library in a metabolite screening approach.

Monoisotopic mass: {Term} The molecular weight of a molecule without the inclusion of any naturally occurring isotopes of the atoms involved. This will be lower than the average molecular weight. The monoisotopic mass is normally used for smaller species such as peptides and small molecules.

MS: {Abbreviation} Mass spectrometer or mass spectrometry

MS/MS: {Abbreviation} Tandem mass spectrometry.

< Back to top

N

Nano chromatography: {Technique} A form of liquid chromatography that uses very low flow rates (<1 µl/min) on narrow diameter columns (<0.1mm). It provides the highest sensitivity for liquid chromatography – mass spectrometry (LC-MS) applications, however it is also the least stable and most prone to signal saturation. For these reasons it is mostly used in research type applications for qualitative analysis, quasi quantification and relative quantification and is frequently avoided for absolute quantification.

Nano electrospray ionisation: {Technique} A form of electrospray ionisation that couples with nano chromatography. Nano electrospray (nESI) is more sensitive than standard electrospray but is less robust and more prone to signal suppression. Most systems use a glass spray tip that can be contaminated or damaged relatively easily and is the major cause of drop outs and spray instability. Samples using nano electrospray ionisation need to be as clean as possible, making good sample preparation important for a successful analysis.

nESI: {Acronym} Nano electrospray ionisation

nLC: {Acronym} Nano liquid chromatography, normally shortened to nano chromatography

Normalisation: {Process} Normalisation is a process that adjusts samples or data so that they are consistent with one another to maximise the similarities and thereby make differences easier to determine. It can be applied to the sample or the data acquired from it. Normalisation can have a major effect on your experiment and you should seek expert advice if in any doubt.

Normalisation (sample): {Process} Sample normalisation is performed on the sample before instrument analysis. It can be performed at an early stage, such as measuring out equal amounts of tissue, or at later stages, such as adjusting the concentration of the sample so that an equal amount is injected onto the analytical system. Caution should be applied when using sample normalisation as it can introduce greater differences than it corrects if the method of assessing each sample is inadequate e.g. most small scale, quick protein quantification assays suffer from high variability and so can introduce errors if used for sample normalisation. In BioMS we recommend that sample normalisation is only used for correction between conditions, with all replicates within a condition being treated in identical ways, allowing data normalisation to even out subtle changes between replicates.

Normalisation (Data): {Process} Data normalisation uses mathematical models to determine a correction to be applied to each sample. This can be as simple as ensuring that the median value in any distribution is the same or using a complex algorithm. It is important to know whether normalisation has been performed on your data and what assumptions it makes. For example, many normalisation processes assume that the majority of the sample is the same and this is not the case if large changes have occurred between conditions. Seek expert advice if in any doubt as this is a common way of incorrectly performing data processing.
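
The simplest case described above, matching sample medians, can be sketched as follows (a minimal illustration with hypothetical intensity values; it carries the same assumption noted above, that most analytes are unchanged between samples):

```python
from statistics import median

def median_normalise(intensities: dict[str, list[float]]) -> dict[str, list[float]]:
    """Scale every sample so that all samples share the same median intensity."""
    medians = {name: median(vals) for name, vals in intensities.items()}
    target = median(medians.values())  # overall median used as the common level
    return {name: [v * target / medians[name] for v in vals]
            for name, vals in intensities.items()}

# hypothetical raw intensities for two samples
data = {"sample_A": [10.0, 20.0, 30.0], "sample_B": [20.0, 40.0, 60.0]}
out = median_normalise(data)
# after normalisation both samples share the same median intensity
```

Real pipelines use more sophisticated algorithms, but the principle, a per-sample correction factor derived from the data itself, is the same.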

< Back to top

O

‘Omics: {Term} ‘Omics is the family of techniques that look at a wide number of analytes within a biological system such as genomics (DNA), transcriptomics (RNA), proteomics (proteins) and metabolomics (metabolites). Initially these terms were used to describe the analysis of the ‘total expression’ of that class of molecules within a cell or tissue, providing an unbiased snapshot of the entire system that would be described as an ‘ome e.g. genome, transcriptome, proteome and metabolome. However, it became apparent that these snapshots were either incomplete or lost other information critical to understanding the biology involved, such as spatial organisation, and so more refined experiments were required that looked at a specific biological area in an unbiased way. This is particularly the case for proteomics where protein function is highly defined by its location, modifications and partners. These selective but unbiased experiments also became termed as ‘omes such as the secretome (secreted proteins) and the matrisome (proteins in the extracellular matrix). Use of these terms is very popular but caution should be taken as they may be poorly defined and are highly controlled by the experimental conditions used.

Overloading: {Term} Overloading occurs when the amount of material exceeds the capacity of an analytical system. There is generally an ideal loading range for any system whereby amounts lower than the range will produce lower amounts of data and loading more than the upper range has deleterious effects on the analysis. From a facility point of view overloading is a much greater disruption than under-loading. With a liquid chromatography – mass spectrometry system (LC-MS) mild overloading will lead to poorer chromatography and increased spray instability that will then affect the ability to perform data analysis. Severe overload can have catastrophic effects such as column blocking, mass contamination, drop outs and contamination of the mass spectrometry system requiring major corrective work before the system is suitable to be used again. Overloading should be avoided and samples diluted to be within the ideal loading range. Seek expert advice on how to assess samples before analysis.

< Back to top

P

PAGE: {Acronym} Polyacrylamide gel electrophoresis

Peak capacity: {Term} This is a measure of a separation method’s ability to separate analytes from each other and is generally defined as the number of peaks that could be seen in a separation range where the signal drops to baseline between them. It can be calculated by dividing the separation range (the retention time difference between the first and last possible peaks) by the peak width at the baseline. It is a measure of the separation power of a system and is complementary to resolution. In general, the bigger the peak capacity, the better the separation.
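
The calculation described above is simple enough to show directly (the separation window and peak width are hypothetical values):

```python
def peak_capacity(separation_range_min: float, peak_width_min: float) -> float:
    """Peak capacity as defined above: separation range divided by
    the peak width at baseline (both in the same time units)."""
    return separation_range_min / peak_width_min

# e.g. a 60 min separation window with peaks 0.5 min wide at baseline
print(peak_capacity(60.0, 0.5))  # 120.0
```

Narrower peaks or a longer separation window both raise the peak capacity, which is why gradient length and column performance are tuned together.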

Peptide: {Term} Organic compounds consisting of a short amino acid chain formed using peptide bonds. Peptides can either be artificially generated by the addition of a protease during sample preparation or naturally occurring within a biological system as the product of a small gene or due to the activity of a specific protease. If the peptide is deliberately formed to have a specific biological activity then it is often termed a fragment whereas if it is randomly generated it will normally be termed a degradation product. There are no globally accepted definitions of how long an amino acid chain should be to be called a peptide. In BioMS we would class a chain of up to 3 amino acids as a metabolite or small molecule, chains of 5 to 30 as peptides, 50 to 100 as small proteins and 150 and above as proteins. There are obviously large areas of ambiguity between these regions but the term used is effectively irrelevant as the experiment will be designed according to the molecules involved and the question to be answered.

Polyacrylamide gel electrophoresis: {Technique} A method of separating proteins using electric fields to move proteins through a polyacrylamide gel lattice. The variant of this technique that uses sodium dodecyl sulphate (SDS) buffers (SDS-PAGE) is the most established and reliable way to separate intact proteins. It works on the principle that proteins bind a consistent amount of SDS related to their molecular weight and so their migration through the polyacrylamide gel due to the electric field is proportional to their molecular weight. Note: SDS-PAGE provides only an estimation of molecular weight. The highest resolution is available when using reducing buffers as variable disulphide bonding can affect the migration of proteins leading to ‘fuzzy’ bands. Other post-translational modifications such as glycosylation can also lead to ‘fuzzy’ bands. Proteins separated by SDS-PAGE can be prepared for LC-MS analysis using in-gel digestion techniques. SDS-PAGE can be used as a method for protein fractionation where a complex protein mixture is separated and the whole lane is sliced into small sections for analysis. This process is often termed ‘gel walking’, ‘gel pixellation’ or ‘geLC-MS’ and is commonly used in methods such as SILAC.

Polymeric contamination: {Term} A contaminant that is represented by a series of peaks within a spectrum that differ by a fixed amount. This can be due to a wide family of different molecules that are synthesised from repeating units such as polyethylene glycol (PEG). These molecule types are commonly used as wetting agents or mild detergents and behave similarly to peptides in LC-MS separations. They are a frequent form of contamination and have a serious impact on the analysis due to signal suppression and contamination of the column. Polymeric contamination of the sample should be avoided wherever possible by careful selection of non-contaminated buffers or containers or by the selection of non-polymeric buffers. If addition cannot be avoided then it should be removed in sample preparation.

Post translational modifications (ptm): {Term} The changing of a protein’s molecular structure after it has been created by translation. This may include the addition of a small molecule such as phosphorylation or methylation, the addition of a small protein such as ubiquitination or sumoylation or the removal of a section of the protein by specific protease activity. Post translational modifications are of particular interest when they result in changes in protein activity, and the analytical challenge is distinguishing these modifications from those that are not associated with the change in function or that are generated as an artefact. This is an important aspect of experimental design.

Precision: {Term} The measurement of how similar things are to each other. This is often measured by the coefficient of variation (CV), otherwise known as the relative standard deviation (RSD). This value is relatively easy to calculate and should be distinguished from accuracy, which is how close an answer is to the correct value.
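
The coefficient of variation mentioned above can be computed with the Python standard library (the replicate signal values are hypothetical):

```python
from statistics import mean, stdev

def cv_percent(values: list[float]) -> float:
    """Coefficient of variation (relative standard deviation) in percent:
    sample standard deviation divided by the mean."""
    return 100.0 * stdev(values) / mean(values)

# hypothetical signal intensities from three replicate injections
replicate_signals = [98.0, 102.0, 100.0]
print(round(cv_percent(replicate_signals), 1))  # 2.0
```

A CV of a few percent indicates good technical precision; note that a low CV says nothing about accuracy, which must be assessed separately against a known standard.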

Precursor: {Term} In mass spectrometry the precursor represents the ion form of an analyte before it is subjected to fragmentation to produce a fragment. Precursor signals in the mass spectrum can be used to calculate the molecular weight of an analyte or their signal intensity can be used for label free quantification.

Processed data: {Term} The product of data processing.

Protease: {Term} An enzyme that cleaves proteins into smaller sections. Proteases will have defined specificities that may be narrow and well defined, such as trypsin, or broad and poorly defined such as elastase and pepsin. The choice of protease for protein analysis is critical. The real world performance of an enzyme may be different to that advertised and it is important to get knowledgeable advice before purchasing and using a new protease.

Protein: {Term} Large amino acid chains generally 10kDa and more. The definition of what constitutes a specific protein can be ambiguous. Commonly a protein is defined by the gene that it was created from, but one gene can create multiple splice variants that in turn create proteins that can be variably post translationally modified to create protein isoforms with different functions. This means that molecular characterisation of a ‘protein’ can produce complicated results as the data from multiple isoforms can overlay each other and create conflicting results. The complexity of protein isoforms can be termed speciation.

Proteomics: {Technique} A form of ‘omics analysis that looks at the protein composition of a sample. It is predominantly performed using mass spectrometry at the moment but other techniques are being developed that may also be used for this purpose such as antibody arrays and NanoPore sequencing. It is important to understand the concept of speciation when looking at proteomics and understand what type of questions each technology can help to answer.

PTM: {Abbreviation} Post translational modification

< Back to top

Q

Qualitative: {Term} A type of analysis or data that describes what a sample is without requiring comparison to another. Examples of qualitative analysis include analyte identification or characterisation. To be distinguished from quantitative analysis.

Quantification: {Process} The process of looking at the amount of an analyte present, generally in comparison with another sample. When the other sample is a standard of known concentration or a calibration curve then it is classed as absolute quantification and if it is another unknown sample then it is classed as relative quantification if good precision can be achieved or quasi quantification if it cannot.

Quantitative: {Term} A type of analysis or data that looks at the amount of an analyte present, generally in comparison with another sample, in a process called quantification. To be distinguished from qualitative analysis.

Quasi quantification: {Process} The crude comparison of the relative levels of an analyte when the measurements have poor precision and are only suitable for distinguishing gross differences. There is poor definition between this term and relative quantification but it is generally applied to datasets with large errors. Quasi quantification methodologies are common in biology, for example Western blotting, and can be particularly applicable when considering speciation. It is important to know the quality of your quantitative data so that it is used correctly and so expert advice should be sought if there is any doubt.

< Back to top

R

Raw data: {Term} The earliest form of data generated by data acquisition, acting as the source of data for data processing. Ideally the data should not have been transformed in any way and can be considered the ‘ultimate point of truth’. This data is predominantly in a proprietary, instrument-specific format that requires specialist software to review. Note: The definition of raw data is complicated as there is frequently some form of data processing or transformation present during data acquisition. This requires a level of pragmatism and clarity when defining what the raw data is in your project.

Recovery: {Term} A measure of the proportion of analyte that remains after sample preparation. This should be as high as possible to ensure good precision as minor changes in recovery have less effect when recovery is high than when it is low. Factors that affect recovery can include an incomplete understanding of the analyte or human error in following the sample preparation protocol. Examples of incomplete understanding of an analyte include knowledge of the conditions required to solubilise the analyte, the formulation required to maintain it in solution and the conditions in which the analyte is chemically modified and thereby converted into a different molecule. Identifying factors that are affecting recovery is central to troubleshooting sample preparation and expert advice should be sought if in any doubt.

Relative quantification: {Process} The comparison of the amounts of analyte present when compared between samples without reference to a standard of known concentration. The general purpose of these experiments is to distinguish between things that have gone up, gone down or stayed the same. The thresholds between these states are defined by the data itself and this type of data generally requires the majority of analytes to have no change and to demonstrate this with high precision. There will be choice involved in the selection of the method of calculating thresholds and this should be matched to the question being asked and the subsequent usage of the results. Seek expert advice if in any doubt. Relative quantification experiments that produce data that have poor precision or where the majority of analytes do not remain the same may still have use for quasi quantification but caution should be applied in utilising these results and the experimental question may need to be refined.

Replicates: {Term} Replicates are a series of similar samples that are used to assess error within an analysis. In biological research they are generally classed as either biological replicates or technical replicates. Biological replicates are samples that come from independent biological sources (e.g. from different donors) and technical replicates come from the same biological source but are processed separately. Technical replicates assess the error in sample preparation and instrument analysis. Biological replicates assess the error in sample preparation and instrument analysis as well as the variability between biological sources. As the variability is often far greater between biological sources than technical error it is generally considered that biological replicates are more important than technical ones. The number of biological replicates required depends on the level of variance between samples and can be calculated using a power calculation. In simple, well controlled systems a power calculation will not normally be required and a simple rule of a minimum of 3 replicates can be applied. In BioMS we recommend a minimum of 3 replicates but ideally 5. Note: It can be difficult to distinguish between biological and technical replicates. For example, multiple cultures of cells obtained from a single source will often be classed as technical replicates rather than biological ones.
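
As a rough illustration of the power calculation mentioned above, the standard normal-approximation formula estimates the replicates needed per group to detect a given difference. This is a simplified sketch only (it uses the conventional z-values for a two-sided α = 0.05 and 80% power, and a proper calculation should use the t distribution and expert statistical advice):

```python
import math

def replicates_per_group(sigma: float, delta: float,
                         z_alpha: float = 1.96, z_beta: float = 0.8416) -> int:
    """Sample size per group for a two-group comparison:
    n = 2 * ((z_alpha + z_beta) * sigma / delta) ** 2, rounded up.
    sigma = expected standard deviation, delta = difference to detect."""
    n = 2 * ((z_alpha + z_beta) * sigma / delta) ** 2
    return math.ceil(n)

# to detect a difference equal to one standard deviation
print(replicates_per_group(sigma=1.0, delta=1.0))  # 16
```

The result makes the trade-off concrete: the smaller the effect relative to the biological variability, the more replicates are required, and the growth is quadratic.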

Reproducibility: {Term} The ability to produce the same, or very similar, results if the same methodology is applied to the same samples on different occasions. This principle is central to published science and requires specific effort to achieve including identifying and setting the variables that can affect the outcome of an experiment. These variables should be defined within a comprehensive protocol that is followed explicitly and can be clearly communicated to others. It can be challenging to know what the key variables/experimental factors are and so expert advice should be sought if in doubt.

Resolution (MS): {Term} Mass spectrometric resolution is a measure of how well the mass spectrometer can distinguish between things of similar mass. There are a number of ways of calculating this value but they are generally based upon how wide a mass spectrum peak is at a specified m/z. The larger the value, the higher the instrument resolution. Low resolution is generally in the range <15,000 and high resolution in the range >70,000. High resolution is important for label free quantification (LFQ) and for isobaric tagging applications. Low resolution is generally sufficient for fragmentation spectra.
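
One common way of calculating the value described above is m/Δm, where Δm is the peak width at half its maximum height (FWHM). A minimal sketch with hypothetical peak values (real instruments report resolution in vendor-specific ways):

```python
def resolution(mz: float, fwhm: float) -> float:
    """Resolving power m/delta-m, with delta-m taken as the
    full width of the peak at half its maximum height."""
    return mz / fwhm

# a peak at m/z 500 that is 0.01 wide at half height
print(round(resolution(500.0, 0.01)))  # 50000
```

Because the width appears in the denominator, the same instrument setting gives different numerical resolution at different m/z, which is why resolution is always quoted at a specified m/z.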

Retention time: {Term} The time that the maximum signal intensity of an analyte is observed in a chromatographic separation. This value is related to the specific interaction between the analyte, the column and the gradient and is one of the factors that can be used to match data between runs, along with the accurate mass and the fragmentation pattern obtained after fragmentation. Retention time is the factor most liable to change and can be affected by changes in the chemical composition of the column due to contamination and any changes in the performance of the liquid chromatography system including changes in flow rate or column temperature. Data analysis methodologies that rely upon retention time consistency, such as label free quantification, need to be matched with highly effective sample preparation methods and a high degree of control of the analytical system.

Reversed phase chromatography: {Technique} A form of liquid chromatography that uses a non-polar stationary phase and a polar mobile phase. This is a form of two-phase separation where analytes partition between two immiscible liquids, e.g. oil and water. In this instance the non-polar phase, typically an aliphatic carbon chain, is chemically bonded on to the surface of a bead held within the column. The most popular modification used is a chain of 18 carbons and is termed C18. Reversed phase separations can have the highest resolution, reproducibility and robustness and also use buffers that are compatible with mass spectrometry. It is therefore the most common separation system used in liquid chromatography-mass spectrometry applications.

Robustness: {Term} The ability of a process to produce the same outcomes despite variations in the way it is carried out. Real world experimental conditions are unlikely to be ideal and minor variations are likely to occur. The key to robustness is ensuring where possible that variables are not set close to the threshold where they become ineffective, for example reagent levels should be in excess of the amount required rather than calculated to be at the exact level required. However, caution is required to ensure that the levels used are not so much in excess that they become disruptive for the rest of the protocol. Robustness should be a major factor in the design of any new protocol and expert advice should be sought if in any doubt.

RPC: {Acronym} Reversed phase chromatography

< Back to top

S

Sample preparation: {Term} The generic term for the treatment of a sample before analysis. The goals of sample preparation are to remove contamination that has been added to or naturally occurs within the sample and to retain as much of the analytes of interest as possible, whilst possibly removing any other components that might interfere with the analysis. This process should demonstrate robustness, reproducibility and precision. When developing sample preparation methodologies there is a balance to strike: extra steps increase the specificity of the process but decrease robustness, as there is more scope for things to go wrong, and decrease recovery, as every step has an associated sample loss. In general the number of steps within a sample preparation protocol should be minimised and each step should be designed to be compatible with the next. Using parallelised processing, for example plate based methodologies, and automation is key to experiments that require a high degree of precision and include large sample numbers such as those in clinical research.

Selectivity: {Term} Used to describe the ability of an analytical approach to distinguish between the analyte and the compounds that are similar to it i.e. measure a correct species. Combined with specificity it achieves the quantification goals of measuring one thing that is the correct thing.

Sensitivity: {Term} A measure of the level of analyte required for identification or quantification. The more sensitive the process the less material is required for success. This is often defined by the lower limit of quantification (LLOQ) for quantification and lower limit of detection (LLOD) for identification. It is important to note that analytes can often be identified at lower levels than that required for quantification as the signal quality requirements are different. This is central to the key concept that identification and quantification are NOT the same.

Signal intensity: {Term} This is the quantitative value created by mass spectrometry. It is often a measure of the electrical signal produced when an ion hits a detector. It is important to recognise that the relationship between signal intensity and amount is specific to the molecules being analysed and similar sized signals for different molecules may represent two very different amounts. This is a very important issue in quantification.

Signal suppression: {Term} Describes the effect of one set of compounds reducing the signal intensity of another. This is a particularly important concept in mass spectrometry based applications as the signal intensity is dependent on the conversion of the analyte to gaseous ions. The charge applied to the ion is frequently a proton of which there is a limited supply, especially in nano electrospray ionisation (nESI), leading to competition for them. An excess of a molecule that readily accepts charge, such as polymeric contamination, can lead to insufficient charge available for the analyte of interest producing a lower signal intensity than would be otherwise expected. To reduce signal suppression abundant contamination should be removed during sample preparation and the amounts loaded onto the analytical system should be controlled to avoid overloading.

Significance: {Term} A statistical measure of whether a specific result occurred by chance or not. The methods of calculating significance can be complicated and have a raft of associated assumptions that are not always well understood. Once significance is calculated, the threshold set for it is a choice that should be determined by the data itself, the analytical question being asked and the subsequent use of the results. The application of these statistics in science is a current area of contention and it is critical to obtain expert advice from a statistician or informatician if in any doubt.

Solid phase extraction: {Technique} Solid phase extraction (SPE) is a generic term for the use of crude liquid chromatography in sample preparation. The beads used are larger than those used in high performance liquid chromatography and are generally contained in large, single use columns or used as a slurry in a filter device. The approach typically doesn’t use a liquid chromatography system but uses centrifugation, vacuum systems or positive pressure systems to pass the mobile phase over the solid phase. The separations achieved have low resolution and either produce a single enriched pool or can be used for a small number of fractions for fractionation. SPE is very popular in sample preparation as it can be quick and robust, providing high recoveries and significant depletion of contamination.

Small molecule: {Term} A classification of analytes that includes metabolites and drugs and is distinguished from large molecules. It can generally be applied to all carbon based biological molecules that are not polymers of defined sub units such as proteins, oligosaccharides, RNA and DNA; however, it can include short chains, for example di- and tri-peptides are commonly considered metabolites and are therefore classed as small molecules. The exact boundaries of the definition are arbitrary and are practically defined by the analytical methodologies used, for example the term small molecule can be used as a ‘catch-all’ for analytes that are too small to be compatible with the generic methodologies involved in standard proteomics, transcriptomics and genomics pipelines.

Source: {Term} The part of the mass spectrometer that produces gaseous ions. This can either be at atmospheric pressure e.g. electrospray ionisation (ESI) or under vacuum e.g. matrix assisted laser desorption/ionisation (MALDI). The type of source needs to be matched to the type of sample to be analysed. Some sources can be used in a number of ways such as standard electrospray, microflow electrospray and nano electrospray.

SPE: {Acronym} Solid phase extraction

Speciation (protein): {Term} Protein speciation is a recognition of the fact that a single gene can produce multiple protein products and that these products can be modified to perform different functions using post translational modifications. This can produce challenges when converting from a genetic view of biology to a biochemical one, with the most common issue being attempting to obtain a single quantitative value to represent the products from a gene. This may be reflected in obtaining different results from different antibodies whose epitopes are modified in different versions of the protein or in obtaining different ratios for different peptides from the same protein in a proteomics experiment. This means that caution must be applied to any quantitative protein analysis with specific effort given to understanding where the quantitative values originate from and what particular forms of the protein they represent. For example a proteomics experiment that uses a median based approach to summarising the results from multiple peptides from the same protein can be classed as representing the changes in the protein isoforms that contain the regions that are least prone to changes. Investigating the many important areas of biology that lie outside of these areas is the ongoing challenge for analytical biochemistry.

Specificity: {Term} The ability of an analytical approach to isolate the signal intensity of the analyte, i.e. to measure a single species. Combined with selectivity, it achieves the twin goals of quantification: measuring one thing, and the correct thing.

Spectrum (plural spectra): {Term} A plot of signal intensity vs mass to charge ratio (m/z).

Spray instability: {Term} When the spray in an ESI source is unstable, either intermittent (spitting) or ‘wobbly’, with the most extreme cases leading to signal drop outs. This can have a particularly detrimental effect on quantification and is often due to overloading or contamination, either from the sample being analysed or the ones preceding it. Validation is an important process for avoiding these issues.

Standard: {Term} A term used to describe a compound of known identity, purity and concentration, or a sample that contains it. Standards can be used for identification or quantification. In identification, a standard can be used to establish the retention time and fragmentation pattern produced by the compound in an analytical system, such as a liquid chromatography – mass spectrometry system, and thereby confirm the presence of a specific analyte in an unknown sample, often via a library that collates the characteristics of a range of standards. In quantification, standards of known concentration can be used to correlate signal intensity with concentration. Multiple concentrations of the standard should be used, in the form of a calibration curve, to allow for any issues with the linearity of the relationship between signal intensity and concentration. Standards can also be used to allow for losses in recovery during sample preparation by adding them to the sample itself; these are called internal standards. Standards used for this purpose but analysed as separate samples are called external standards, and they are not considered as effective as internal standards.
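
The calibration-curve idea above can be sketched in a few lines of code. This is a minimal illustration, not a validated quantification workflow: the concentrations and signal intensities below are invented for the example, and a real assay would also check linearity, quality control samples and recovery.

```python
# Hypothetical sketch: fit a calibration line from standards of known
# concentration, then convert an unknown's signal back to a concentration.
import numpy as np

def fit_calibration(concentrations, intensities):
    """Least-squares linear fit: intensity = slope * concentration + intercept."""
    slope, intercept = np.polyfit(concentrations, intensities, deg=1)
    return slope, intercept

def quantify(intensity, slope, intercept):
    """Invert the calibration line to estimate concentration from a signal."""
    return (intensity - intercept) / slope

# Standards at known concentrations (e.g. ng/mL) and their measured signals
conc = [1.0, 2.0, 5.0, 10.0, 20.0]
signal = [110.0, 205.0, 515.0, 1010.0, 1990.0]

slope, intercept = fit_calibration(conc, signal)
unknown = quantify(760.0, slope, intercept)  # roughly 7.5 on this invented data
```

Using several calibration points, as here, is what allows any non-ideal linearity between signal and concentration to be detected rather than assumed away.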

< Back to top

T

Tandem mass spectrometry: {Term} Analysis method whereby an additional level of mass spectrometry is used. Charged ions (e.g. peptides) are selected according to their mass-to-charge ratio (m/z) and then subjected to fragmentation. Precursor and fragment ion m/z values are recorded. Individual fragmentation spectra can be used for identification. The signal intensities of fragments across multiple fragmentation spectra can be used to create a chromatogram, which enables a highly specific form of quantification in methods such as multiple reaction monitoring (MRM)/selected reaction monitoring (SRM), parallel reaction monitoring (PRM) or data independent acquisition (DIA).
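
The fragment-chromatogram idea can be illustrated with a short sketch. This is not a vendor API: the scan layout (retention time, precursor m/z, fragment peak list) and the tolerance value are assumptions made for the example.

```python
# Illustrative sketch of extracting an MRM/SRM-style transition chromatogram
# (one precursor -> one fragment) from tandem MS scans.
TOLERANCE = 0.02  # m/z matching window; an assumed value

def transition_intensity(fragment_peaks, fragment_mz, tol=TOLERANCE):
    """Sum the fragment intensities falling within the m/z tolerance window."""
    return sum(i for mz, i in fragment_peaks if abs(mz - fragment_mz) <= tol)

def extract_transition(scans, precursor_mz, fragment_mz, tol=TOLERANCE):
    """Build a (time, intensity) chromatogram for one transition."""
    return [
        (rt, transition_intensity(peaks, fragment_mz, tol))
        for rt, prec, peaks in scans
        if abs(prec - precursor_mz) <= tol
    ]

scans = [
    (1.0, 500.25, [(300.10, 40.0), (401.20, 10.0)]),
    (1.1, 500.25, [(300.11, 90.0)]),
    (1.2, 622.30, [(300.10, 500.0)]),  # different precursor: excluded
]
chromatogram = extract_transition(scans, 500.25, 300.10)
# [(1.0, 40.0), (1.1, 90.0)]
```

The specificity comes from the double filter: a scan must match the precursor m/z, and within it only fragments matching the fragment m/z contribute to the signal.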

Targeted analysis: {Technique} A form of analysis in which there is a defined set of analytes of interest, as distinguished from global analysis. Although this approach can be used in a qualitative manner for identification, it is generally used for quantification. When used with mass spectrometry it involves setting the system to look for specific signals related to the analytes of interest. Standards can be used to refine the settings used in the analysis, or to trigger an action to increase the sensitivity of the system. Targeted analyses are generally more sensitive than non-targeted methods, with greater specificity, and can facilitate faster analysis times, at the cost of lowering the number of analytes that can be quantified within a run. New semi targeted approaches that can be applied to a larger number of analytes are becoming available, such as data independent acquisition (DIA).

Thomson: {Unit} A proposed unit for m/z (symbol Th), though it is rarely used.

TIC: {Acronym} Total ion count or total ion chromatogram

Total ion chromatogram: {Term} A plot of the changes in the total ion count over time. It is particularly useful in data assessment to look for spray instability.

Total ion count: {Term} The sum of all the ions within a spectrum.
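
The two definitions above are simple enough to state as code. This is a minimal sketch: the spectrum layout (a retention time plus a list of (m/z, intensity) pairs) and all the numbers are assumptions made for illustration.

```python
# Minimal sketch: total ion count per spectrum, and a total ion chromatogram
# (TIC) assembled by plotting that count against retention time.
spectra = [
    (0.10, [(150.1, 200.0), (420.3, 800.0)]),  # (retention time in min, peaks)
    (0.12, [(150.1, 250.0), (420.3, 900.0), (512.2, 50.0)]),
    (0.14, [(150.1, 180.0)]),
]

def total_ion_count(peaks):
    """Sum of all ion intensities within one spectrum."""
    return sum(intensity for _mz, intensity in peaks)

# The TIC is that per-spectrum sum over time
tic = [(rt, total_ion_count(peaks)) for rt, peaks in spectra]
# [(0.10, 1000.0), (0.12, 1200.0), (0.14, 180.0)]
```

Because the TIC collapses each spectrum to a single number, sudden dips in it are an easy way to spot the signal drop outs described under spray instability.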

Trypsin: {Reagent} The most commonly used protease in protein analysis by mass spectrometry. It cleaves on the C-terminal side of arginine and lysine residues, has high activity and reasonably high specificity. Even though trypsin is well behaved, there can still be issues, specifically partial/missed cleavage and autolysis.

< Back to top

U

uHPLC: {Acronym} Ultra high performance liquid chromatography or ultra high performance liquid chromatography system

Ultra high performance liquid chromatography: {Technology} A form of high performance liquid chromatography that utilises smaller beads (<2 µm), which increases the resolution of the system and, as a side effect, the back pressure. It also changes the relationship between flow rate and resolution: increasing the flow rate does not have as negative an impact on resolution as it does in HPLC, so very fast runs (<5 min total runtime) are possible for simple samples.

Ultra high performance liquid chromatography system: {Instrument} A form of liquid chromatography system that is capable of performing ultra high performance liquid chromatography. This includes having pumps, valves and connections that can operate at higher pressures, generally up to 1000 bar.

UPLC: {Acronym} Ultra high performance liquid chromatography

< Back to top

V

Validation: {Term} The process of acquiring evidence that your analysis is correct and suitable for the purpose you are using it for. This should particularly be done before any quantification experiment is performed and should be monitored regularly using data assessments.