Friday, July 6, 2012

DNA as a Data Storage Device

In this day and age, we are all surrounded by technology, with gadgets and gizmos such as CDs, iPods, phones, computers and USB drives – all driving the ongoing quest for new and better ways to store information. Over the past few years, scientists have been investigating every possibility, ranging from semiconductors to carbon “nanoballs” to our very own DNA!
Deoxyribonucleic acid, or DNA, possesses many of the ideal characteristics of a data storage device for the future. DNA is present in all living organisms, and a key feature is its capacity to store very large amounts of information in its nucleotide sequences. Each nucleotide consists of a phosphate group and a deoxyribose sugar – which together form the sugar-phosphate backbone of the DNA strand – attached to one of four nitrogenous bases: adenine, thymine, cytosine or guanine.
Figure 1: The structure of a nucleotide, consisting of a phosphate group, deoxyribose (sugar) and a nitrogenous base.
Using artificial DNA synthesis, these nucleotides can be joined to form synthetic oligonucleotide sequences, with data stored in the specific ordering of the nitrogenous bases.
In a recent study conducted by Yachie et al. (2007) at Keio University, the practicality of using bacterial DNA for long-term, large-volume data storage was investigated. The researchers were able to store a short alphanumeric message at multiple loci of the Bacillus subtilis genome and retrieve it successfully. To do so, their chosen message “E=mc2” was first translated into dinucleotides using a 4-bit binary encryption key.
Figure 2: Encryption keys used in the Yachie et al. (2007) study at Keio University, Japan.

These dinucleotides were joined to form longer sequences, which were then inserted into Bacillus subtilis cells. After an overnight incubation period, the data was recovered.
Figure 3: The 4-bit binary codes translate into dinucleotides which make up synthetic oligonucleotide sequences.

The most common data recovery method for DNA storage is based on the polymerase chain reaction (PCR), which uses primers to amplify the coded regions of DNA. The encryption keys are then employed to decode each dinucleotide into its corresponding bit code and, if necessary, into alphanumeric characters for convenient use or interpretation.
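To make the encode/decode cycle concrete, here is a minimal Python sketch of a round trip of the sort described: text to 4-bit chunks, chunks to dinucleotides, and back again. The keymap below is invented purely for illustration – the study's actual encryption keys are those shown in Figure 2 – so the output sequence is hypothetical.

# A minimal sketch of the encode/decode cycle described above, using an
# invented 4-bit -> dinucleotide keymap (not the keys from Yachie et al. 2007).

BASES = "ATCG"
# Map every 4-bit pattern ("0000".."1111") to one of the 16 dinucleotides.
BITS_TO_DINUC = {format(i, "04b"): BASES[i // 4] + BASES[i % 4] for i in range(16)}
DINUC_TO_BITS = {dinuc: bits for bits, dinuc in BITS_TO_DINUC.items()}

def encode(message: str) -> str:
    """Text -> bit stream -> dinucleotides -> oligonucleotide sequence."""
    bits = "".join(format(byte, "08b") for byte in message.encode("ascii"))
    return "".join(BITS_TO_DINUC[bits[i:i + 4]] for i in range(0, len(bits), 4))

def decode(sequence: str) -> str:
    """Oligonucleotide sequence -> bit stream -> text."""
    bits = "".join(DINUC_TO_BITS[sequence[i:i + 2]] for i in range(0, len(sequence), 2))
    return bytes(int(bits[i:i + 8], 2) for i in range(0, len(bits), 8)).decode("ascii")

oligo = encode("E=mc2")   # a 20-base sequence: 4 bases per 8-bit character
assert decode(oligo) == "E=mc2"
print(oligo)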
Figure 4: Bacillus subtilis under a microscope.

Not only can DNA store significantly more bytes than our currently existing mechanisms, but it is also praised for its extreme durability in long-term data storage. DNA is naturally passed down from generation to generation of living organisms, and because of this, scientists postulate that any data inserted into an organism’s genome could last as long as the line of the host organism – potentially hundreds of thousands of years.
However, if the organism undergoes genetic evolution or adaptation, several problems may occur, including data mutation or loss. Various methods have been suggested to mitigate the effects of such mutations, such as selecting a robust host organism that can survive in harsh environments. In addition, the study by Yachie et al. (2007) suggests an “alignment-based” storage method, in which several back-up copies of the data are inserted alongside the original information to increase the stability of the stored data and reduce the chances of data deletion.
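As a rough illustration of the idea (a sketch of the general principle, not the authors' actual algorithm), the snippet below recovers data from several retrieved copies by a per-position majority vote. It assumes the copies are already aligned and of equal length; a real pipeline would also have to handle insertions and deletions.

# A minimal sketch of alignment-based redundancy: keep several copies of the
# encoded sequence and take a per-position majority vote, so a point mutation
# in one copy is outvoted by the intact back-ups.
from collections import Counter

def consensus(copies: list[str]) -> str:
    """Return the majority base at each position across the aligned copies."""
    return "".join(Counter(column).most_common(1)[0][0] for column in zip(*copies))

# Three retrieved copies; the second carries a point mutation (C -> G).
copies = ["TACTAG", "TAGTAG", "TACTAG"]
print(consensus(copies))  # "TACTAG" - the mutated base is corrected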
Using genomic DNA to archive information is considered a significant advancement in genetics. According to recent studies, the natural characteristics of DNA – its compactness, heritability and durability – make it an ideal data storage device, one that may ultimately blur the line between nature and technology forever.

Can't stop eating? Blame it on your genes

Obesity is a medical condition in which excess body fat accumulates to the extent that it has adverse effects on health, potentially reducing life expectancy and increasing health problems. It raises the likelihood of many diseases, especially heart disease, cancer and type 2 diabetes. Obesity also tends to run in families: studies of adults reveal that their weights are closer to those of their biological parents than of the families who raised them.
A recent study by Guey-Ying Liao and colleagues (2012) suggests that human obesity may be caused by mutations affecting the Bdnf gene, which produces transcripts with either short or long 3’ untranslated regions (3’ UTRs). Until now, the precise role of brain-derived neurotrophic factor (BDNF) in the regulation of energy balance has been unknown. The study demonstrates a relationship between Bdnf mRNA carrying a long 3’ UTR (long 3’ UTR Bdnf mRNA), leptin-induced neural activation and body weight. Long 3’ UTR Bdnf mRNA was found to be enriched in the dendrites of hypothalamic neurons, and insulin and leptin may stimulate its translation there.
Furthermore, mice harboring a truncated long Bdnf 3’ UTR developed severe hyperphagic obesity, which was completely reversed by viral expression of long 3’ UTR Bdnf mRNA in the hypothalamus. In these mice, the ability of leptin to activate hypothalamic neurons and inhibit food intake was compromised, despite the presence of leptin receptors. The results reveal a novel mechanism linking leptin action to BDNF expression in hypothalamic-mediated body weight regulation, and also implicate dendritic protein synthesis in this process.
The researchers claim to have found that a single mutant gene is to blame for the brain’s inability to tell obese people when to stop eating. In mice, faulty brain-derived neurotrophic factor slows or stops the passage of leptin and insulin signals through the brain. In humans, these hormones are released somewhere around the time one can see the bottom of the Colonel’s sixteen-piece bucket: it is not usually guilt that tells us to stop, but the brain dictating that satiety has been reached. Where these signals fail to reach the regions of the brain concerned with signalling satiety, eating continues unchecked.
This discovery may open up novel strategies to help the brain control body weight. Bdnf does not only influence body weight: notably, when the Bdnf gene fails to work properly, there is a flow-on effect producing deficits in learning and memory in mice. With a faulty Bdnf gene, neurons struggle to talk to each other, and leptin and insulin signals pass through without modifying appetite. This faulty transmission line might be repaired by producing the missing BDNF using virus-based gene therapy, despite the difficulty of delivering it across the blood-brain barrier.
 

The lack of a properly functioning single gene has thus been found to cause obesity. Leptin itself also appears linked to human disease, with several childhood disorders having been associated with mutations in leptin genes. Leptin clearly plays a major role in the human body. As an issue of concern in human science, research findings in this area should be carefully applied so that the risks of obesity associated with gene mutations are effectively curbed.

Epigenetics in the Ice Age


Recent research highlighted by the New Scientist article “Fossil DNA has clues to surviving rapid climate change” suggests that epigenetics played a significant role in the adaptations animals made during the last ice age. The article focusses on research by a team from the University of Adelaide and the University of New South Wales, headed by Alan Cooper and Catherine Suter, who made the discovery after investigating genetic sequences found in specimens of extinct bison (Holmes 2012).


Epigenetic inheritance involves the inheritance of characteristics from one generation to another by processes that do not involve the nucleotide sequence of DNA (Reece et al, 2011, p. 364). Cooper and Suter’s team therefore looked for characteristics that animals developed when exposed to the changing environment, which may have been passed down to later generations. This was done by finding the bones of a bison that lived around 26,000 years ago in the Canadian arctic permafrost (Figure 1), then extracting the DNA contained within these bones (Holmes 2012).

Tests were then performed using the bisulfite sequencing technique, searching for DNA methylation (Holmes 2012). DNA methylation occurs when a methyl group joins to a base of DNA, commonly cytosine (Reece et al, 2011, p. 364) (Figure 2). According to the website Sigma-Aldrich, “DNA methylation is an epigenetic modification that changes the appearance and structure of DNA without altering its sequence” (2008).

Further work by the team showed that some of the DNA methylations they found in the bison occur in the same places in modern cow DNA (Holmes 2012). This discovery is very important, as it shows that some form of epigenetic modification was occurring during the period in which the fossilised animals lived. As for the matching methylations discovered in modern cows, Holmes suggests that this “is strong evidence that the ancient methylations were not the product of chemical damage occurring after the bison’s death” (2012). Further tests on five other bison specimens gave the scientists no results (Holmes 2012), which illustrates how difficult this area of research is. Given that scientists currently have limited knowledge of epigenetic signals (Holmes 2012), advances in this field are hard to make. There are a number of techniques for detecting epigenetic modification, such as bisulfite sequencing, fluorescent in situ hybridisation and DNA adenine methyltransferase identification (Medindia, n.d.). Despite this, far more research is required to understand what that information is saying, and to get anything useful out of it.
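For readers unfamiliar with the bisulfite technique, the core logic can be sketched in a few lines of Python: bisulfite converts unmethylated cytosine to uracil (read as thymine after amplification), while methylated cytosine is protected, so cytosines that survive in the converted read mark likely methylation sites. The sequences below are invented for illustration; a real analysis must also handle alignment, strands and sequencing error.

# A toy illustration of the bisulfite sequencing logic described above:
# unmethylated C is converted (and later read as T), methylated C stays C,
# so comparing a converted read to the reference reveals methylated sites.

def call_methylation(reference: str, bisulfite_read: str) -> list[int]:
    """Return indices of reference cytosines that remained C in the
    bisulfite-converted read (i.e. were likely methylated)."""
    return [i for i, (ref, obs) in enumerate(zip(reference, bisulfite_read))
            if ref == "C" and obs == "C"]

reference      = "ACGTCCGA"
bisulfite_read = "ATGTTCGA"  # the Cs at indices 1 and 4 converted to T; index 5 kept
print(call_methylation(reference, bisulfite_read))  # [5]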

 There is still a lot of work to be done on this before any major conclusions can be drawn. But this research has found that at the start of the last ice age, animals may have undergone epigenetic change to adapt to the changing environment. With the current climate change situation facing humans, this find in the remains of a 26,000-year-old bison could be important in the future.

Eye Colour in Humans – Not Just a One Gene Affair

Contrary to popular belief, eye colour in humans is not controlled by just one gene in our DNA. High school biology teaches us about Gregor Mendel, his theories of inheritance patterns and how they relate to human eye colour. Prior to recent studies into the genetics of eye colour, it was thought to be a strictly Mendelian trait (White and Rabago-Smith); however, it is now known to be the product of multiple genes. This explains why eye colour does not comply with Mendelian patterns of inheritance. For example, blue-eyed parents are able to have brown-eyed children, which should not be possible in a Mendelian model where brown is dominant over blue: blue eyes could only occur in parents homozygous for the recessive allele, who would then have no dominant allele to pass on to their offspring (the short sketch below makes this enumeration concrete). A single-gene model is also unable to explain the spectrum of eye colours, or the fact that eye colour in humans shows both incomplete dominance and epistasis (University of Queensland). This suggests that more than one gene controls eye colour.
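As a toy illustration only – assuming the textbook single-gene model, with invented genotype strings rather than real genetics – the following Python sketch enumerates the possible children of two blue-eyed parents under that model:

# Enumerating the single-gene (Mendelian) model, with brown (B) dominant
# over blue (b): two blue-eyed parents must both be bb, so no child can
# ever receive a B allele under this model.
from itertools import product

def phenotype(genotype: str) -> str:
    return "brown" if "B" in genotype else "blue"

def offspring_phenotypes(parent1: str, parent2: str) -> set[str]:
    """All phenotypes possible among children of the two parental genotypes."""
    return {phenotype(a + b) for a, b in product(parent1, parent2)}

print(offspring_phenotypes("bb", "bb"))  # {'blue'} - yet brown-eyed children
                                         # of blue-eyed parents are observed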


The colour of an individual’s eyes is determined by the ratio of two pigments in the iris. These pigments, both forms of melanin, are called eumelanin (the brown-black pigment) and pheomelanin (the yellow-red pigment) (White and Rabago-Smith), and they are produced in the melanocytes of the eye (Ibid). Blue eyes arise from low levels of melanin, and increasing levels produce the rest of the eye colour spectrum (Stanford University). The more melanin present in the iris, the darker the apparent colour of the eye, as light entering the eye is largely absorbed rather than reflected back as colour (White and Rabago-Smith); people with lighter eyes therefore have less melanin than people with darker shades. Some individuals have red or violet eyes; this is due to a condition called ocular albinism and is caused by mutations in their gene sequence (Ibid).

Studies conducted by various institutions, including the Institute for Molecular Bioscience at the University of Queensland, have shown that there are 16 genes which affect eye colour (Ibid). Most of these genes have only a small effect; the two major genes are HERC2 and OCA2 (Ibid). Because of its position in the DNA, HERC2 affects the way in which OCA2 is expressed (Ibid), so changes in the sequence of these genes have large impacts on an individual’s eye colour. Changes in the OCA2 gene have been shown to account for around 74% of the variation in eye colour (Duffy, Montgomery and Chen; White and Rabago-Smith), and if both copies of the OCA2 gene are missing, ocular albinism results (White and Rabago-Smith). Other genes which affect eye colour include agouti signalling protein, tyrosinase, membrane associated transporter protein, P protein (oculocutaneous albinism II) and melanocortin 1 receptor (Ibid).



It is clear from the research conducted in this area that the Mendelian model of inheritance is unable to explain the expression of eye colour in humans. Multiple genes are responsible for melanin production in the eye, the main two being HERC2 and OCA2 (Tyler).

Genetic Testing for Newborn Hearing Loss


As modern technology allows rapid progress in the field of genetics, genetic testing for various disorders is becoming increasingly common. One recent development in this area is genetic testing for a form of congenital hearing loss.

Currently, newborn hearing screening programs are in place in many countries to test all infants for hearing abnormalities. However, this in itself does not produce a diagnosis for newborns who fail the tests, and indeed most of them have to wait up to three months before any diagnostic evaluation is started. This is not in the best interests of the child, as evidence shows that “identification and habilitation of deaf infants before six months of age improves language outcomes.” (Schimmenti, et al., 2011)


Almost half of all infants with congenital hearing loss have underlying genetic causes for their condition. It has recently been identified that the most prevalent of these are mutations of the Gap Junction Beta-2 gene (GJB2). (Schimmenti, et al., 2011) GJB2 is responsible for directing the synthesis of Connexin 26, a protein that helps to create gap junctions in the cochlea through which potassium ions can flow, thus having an important role in the homeostatic regulation of potassium in this area. This process is essential for maintaining appropriate levels of potassium in the inner ear and thus preventing the malfunction and damage of cells vital for hearing. Connexin 26 may also play an important role in the maturation of certain cochlear cells. More than 90 mutations of the GJB2 gene have been identified thus far that produce a non-functional Connexin 26 protein and result in congenital hearing loss. (Palmer & Boudreault)


All of the mutations studied to date are inherited in an autosomal recessive manner; however, autosomal dominant mutations are also known to exist. The majority of the aforementioned mutations exert their effects by deleting base pairs. This changes the downstream sequence of amino acids produced and leads to the manufacture of a misshapen, unstable Connexin 26 protein.

Fortunately, a genetic test for the most common of these base pair deletion mutations has recently been developed. A blood sample can be taken from a newborn, and the appropriate segment of DNA isolated, amplified through PCR and then sequenced to determine whether the mutation is present. This test has been experimentally shown to detect the majority of infants with GJB2-related hearing loss amongst those who fail hearing screening tests. (Schimmenti, et al., 2011)
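The article does not spell out the sequence-analysis step, so the following Python sketch shows only the general idea, using an invented reference fragment and read: compare the sequenced segment against a reference and look for a single missing base. (For context, the single-base deletion known as 35delG is widely reported as among the most common GJB2 mutations, though the test described may target others.)

# A toy sketch of checking a sequenced PCR product for a single-base
# deletion relative to a reference fragment. Both strings are invented;
# a real pipeline would align full reads against the annotated GJB2 gene.

def find_deletion(reference: str, read: str) -> int | None:
    """Return the index of a single deleted base in `read` relative to
    `reference`, or None if no single-base deletion explains the read."""
    for i in range(len(reference)):
        # Deleting reference[i] should make the remainder line up exactly.
        if reference[:i] + reference[i + 1:] == read:
            return i
    return None

reference = "TGGGGTGTG"  # invented fragment around a deletion site
read      = "TGGGTGTG"   # one G missing from the sequenced segment
print(find_deletion(reference, read))  # index of the deleted base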

Schimmenti et al. (2011) believe that these genetic testing results could be available before traditional diagnostic testing begins, leading to an earlier diagnosis and therefore better speech and learning outcomes for many individuals. It is also anticipated that genetic tests for many other hearing loss-associated mutations of this gene will become available in the near future. Considering that GJB2-related hearing loss accounts for the majority of genetic hearing loss (and genetic conditions cause half of all deafness), undertaking genetic testing for mutations in this gene may be very worthwhile. The process would involve taking a small blood sample from newborns who fail the hearing screening tests and analysing it for the presence of a mutated GJB2 gene.