Friday, July 6, 2012

Alcoholism - Is it a Genetic Mutation?


Alcohol dependence, also known as alcoholism, is medically considered a disease. Its symptoms, as listed by the American Association for Clinical Chemistry (2010), include increased tolerance, cravings, loss of control and physical dependence. For decades, sufferers of the disease have experienced not only its harsh physical and psychological effects, but also discrimination and stereotypes created by society. In recent years, however, scientific research has revealed that the likelihood of developing alcoholism is increased by the possession of variations in certain genes (Arbor 2011). Variations in two specific genes, unc-79 and GABRA2, are thought to influence alcohol sensitivity (O'Brien 2010) and impulsive behaviours (Arbor 2011).



 Gene mutations are permanent alterations to DNA sequences within the chromosomes (U.S. National Library of Medicine 2012). When mutations occur in genes, they can affect a cell's or organism's ability to function normally, and in some cases can predispose humans to alcoholism.

The gene unc-79 in mice, like its human counterpart, is poorly understood and is thought to interact with an ion channel called NALCN (O'Brien 2010). In studies, mice with mutated unc-79 genes voluntarily chose alcohol over water when offered both. The mutant mice were also far more sensitive to alcohol: when injected with pure ethanol, they blacked out for much longer than the non-mutant mice. These observations are thought to arise from the unc-79 mutation, dubbed Lightweight, altering the neuronal responses to alcohol governed by NALCN (O'Brien 2010).



The GABRA2 gene is responsible for the functioning of receptors in a part of the mammalian brain called the insula (Arbor 2011). In a recent study, those carrying the variant GABRA2 gene demonstrated higher levels of impulsiveness when under distress, with high activation in the insula. This links to the idea that humans, particularly females, turn to alcohol to relieve distress and anxiety (Arbor 2011).

The unc-79 and GABRA2 variants are just two of the genes that contribute to the symptoms of alcoholism; they do not directly cause it. However, as alcoholics, their families and researchers attempt to discover its medical foundations, the discovery of gene mutations as contributors to alcoholism is highly significant for the prevention, treatment and understanding of alcohol dependence.

DNA as a Data Storage Device


In this day and age, we are all surrounded by technology, with gadgets and gizmos such as CDs, iPods, phones, computers and USB drives all driving the ongoing quest for new and better ways to store information. Over the past few years, scientists have been investigating every possibility, ranging from semiconductors to carbon "nanoballs" to even our very own DNA!
Deoxyribonucleic acid, or DNA, possesses many ideal characteristics of a data storage device for the future. Present in all living organisms, a key feature of DNA is its capacity to store remarkably large amounts of information in its nucleotide sequences. Each nucleotide consists of a sugar-phosphate backbone attached to one of the four nitrogenous bases – Adenine, Thymine, Cytosine and Guanine.
Figure 1: The structure of a nucleotide, consisting of a phosphate group, deoxyribose (sugar) and a nitrogenous base.





Using DNA synthesis, these nucleotides can be joined to form synthetic oligonucleotide sequences containing data stored in the form of specifically ordered nitrogenous bases.
In a study conducted by Yachie et al. (2007) at Keio University, the practicality of using bacterial DNA for long-term, large-volume data storage was investigated. The researchers were able to store a short alphanumeric message at multiple loci of the Bacillus subtilis genome and retrieve it successfully. To do so, their chosen message, "E=mc2", was first translated into dinucleotides using a 4-bit binary code encryption key.
Figure 2: Encryption keys used in the Yachie et al. (2007) study at Keio University, Japan.

These dinucleotides were then used to form long sequences, which were inserted into the Bacillus subtilis cells. After an overnight incubation period, the data was recovered.
Figure 3: The 4-bit binary codes translate into dinucleotides which make up synthetic oligonucleotide sequences.

 The most common data storage and recovery method for DNA is based on the polymerase chain reaction (PCR), which uses primers to amplify the coded regions of DNA. Encryption keys are then employed to decode each dinucleotide into its corresponding bit code and, if necessary, into alphanumeric code for convenient use or interpretation.
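The encode-and-decode scheme described above can be sketched in a few lines of code. Note that the 4-bit-to-dinucleotide key below is a hypothetical example chosen for illustration, not the actual key used in the Yachie et al. (2007) study; the point is simply that four bases give 16 dinucleotides, exactly enough to represent every 4-bit value.

```python
BASES = "ATCG"
# 16 possible dinucleotides, one for each 4-bit value (0-15).
DINUCS = [a + b for a in BASES for b in BASES]

def encode(message: str) -> str:
    """Translate text into a synthetic oligonucleotide sequence."""
    seq = []
    for ch in message:
        byte = ord(ch)                  # 8 bits per character
        hi, lo = byte >> 4, byte & 0xF  # split into two 4-bit halves
        seq.append(DINUCS[hi])          # each half maps to one dinucleotide
        seq.append(DINUCS[lo])
    return "".join(seq)

def decode(seq: str) -> str:
    """Recover the text from the dinucleotide sequence."""
    lookup = {d: i for i, d in enumerate(DINUCS)}
    chars = []
    for i in range(0, len(seq), 4):     # 4 bases = 2 dinucleotides = 1 byte
        hi = lookup[seq[i:i + 2]]
        lo = lookup[seq[i + 2:i + 4]]
        chars.append(chr((hi << 4) | lo))
    return "".join(chars)

print(decode(encode("E=mc2")))  # round-trips the message: E=mc2
```

Under this scheme each character costs four bases, so the five-character message becomes a 20-base oligonucleotide.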
Figure 4: Bacillus subtilis under a microscope.

Not only can DNA store significantly more bytes than our currently existing mechanisms, it is also praised for its extreme durability in long-term data storage. DNA is naturally passed down from generation to generation of living organisms, and because of this, scientists postulate that any data inserted into an organism's genome will last as long as the host organism's line, which is often hundreds of thousands of years.
However, if the organism undergoes genetic evolution or adaptation, several problems may occur, including data mutation or loss. Several methods have been suggested to reduce the effect of these mutation rates, such as selecting a robust host organism that can survive in harsh environments. In addition, the study by Yachie et al. (2007) suggests storing the data in an "alignment-based" method, where several back-ups of the data are inserted alongside the original information to increase the stability of the DNA data and reduce the chance of data deletion.
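The back-up idea can be illustrated with a minimal sketch: if several aligned copies of the sequence are stored, the data can be recovered by a per-position majority vote, so an isolated mutation in any single copy is outvoted by the intact copies. This is a simplified illustration of the redundancy principle, not the alignment algorithm actually used in the study.

```python
from collections import Counter

def majority_consensus(copies):
    """Recover a consensus sequence from aligned back-up copies
    by taking the most common base at each position."""
    consensus = []
    for position in zip(*copies):            # walk column by column
        base, _count = Counter(position).most_common(1)[0]
        consensus.append(base)
    return "".join(consensus)

original = "ATCGGTCA"
copies = [
    "ATCGGTCA",   # intact copy
    "ATAGGTCA",   # point mutation at position 2
    "ATCGGTGA",   # point mutation at position 6
]
print(majority_consensus(copies))  # recovers "ATCGGTCA"
```

With three copies, any single point mutation per position is corrected; more copies tolerate proportionally more damage.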
Using genomic DNA to archive information is considered a significant advancement in genetics. According to recent studies, the natural characteristics of DNA, such as compactness, heritability and durability, make it an ideal data storage device – one that may ultimately blur the line between nature and technology forever.

Can't stop eating? Blame it on your genes





Obesity is a medical condition in which excess body fat accumulates to the extent of causing adverse effects on health, potentially reducing life expectancy and increasing health problems. It raises the likelihood of various diseases, especially heart disease, cancer and type 2 diabetes. Obesity tends to run in families: studies of adults reveal that their weights are closer to their biological parents' weights.
A recent study by Guey-Ying Liao and colleagues (2012) suggests that human obesity may be caused by mutations in the Bdnf gene, which produces transcripts with either short or long 3' untranslated regions (3' UTRs). The precise role of brain-derived neurotrophic factor (BDNF) in regulating energy balance, however, is unknown. The study examined the relationship between Bdnf mRNA with a long 3' UTR (long 3' UTR Bdnf mRNA), leptin-induced neural activation and body weight. Long 3' UTR Bdnf mRNA was found to be enriched in the dendrites of hypothalamic neurons, and insulin and leptin may stimulate its translation in those dendrites.
Mice harbouring a truncated long Bdnf 3' UTR developed severe hyperphagic obesity, which was completely reversed by viral expression of long 3' UTR Bdnf mRNA in the hypothalamus. In these mice, leptin's ability to activate hypothalamic neurons and inhibit food intake was compromised despite the presence of leptin receptors. The results revealed a novel mechanism linking leptin action to BDNF expression during hypothalamic-mediated body weight regulation, and also implicated dendritic protein synthesis in the process.
Researchers claim to have found a single mutant gene to blame for the brain's failure to tell obese people when to stop eating. In mice, a faulty Bdnf gene either stops or slows the passage of leptin and insulin signals through the brain. In humans, these hormones are released at about the time one can see the bottom of the Colonel's sixteen-piece bucket. It is not usually guilt that tells one to stop; the brain dictates when satiety is reached. Where these signals fail to reach the relevant locations in the brain's satiety centres, eating continues unchecked.
This discovery may open up novel strategies to help the brain control body weight. Bdnf does not only control body weight: failure of the Bdnf gene to develop properly has a flow-on effect, resulting in deficits in learning and memory in mice. When there is a problem with the Bdnf gene, neurons rarely talk to each other, and the leptin and insulin signals arrive without modifying appetite. The faulty transmission line could be repaired by producing the missing BDNF using virus-based gene therapy, despite the difficulty of delivering it across the blood-brain barrier.
 

The lack of a single gene has thus been found to cause obesity. Leptin also appears linked to human disease, with several childhood diseases having been associated with mutations in leptin genes. Leptin plays a major role in the human body. As an issue of concern in human science, research findings should be carefully applied to ensure that the risks of obesity associated with gene mutations are effectively curbed.

Epigenetics in the Ice Age



Recent research highlighted by the New Scientist article "Fossil DNA has clues to surviving rapid climate change" suggests that epigenetics played a significant role in the adaptations animals made during the last ice age. It focused on research by a team from the University of Adelaide and the University of New South Wales, headed by Alan Cooper and Catherine Suter, who made the discovery after investigating the genetic sequence found in some specimens of extinct bison (Holmes 2012).


Epigenetic inheritance involves the inheritance of characteristics from one generation to another by processes that do not involve the nucleotide sequence of DNA (Reece et al, 2011, p. 364). Cooper and Suter's team therefore looked at the characteristics that animals developed, and may have passed down to future generations, when exposed to a changing environment. This was done by finding the bones of a bison that lived around 26,000 years ago in the Canadian arctic permafrost (Figure 1), then extracting the DNA contained within those bones (Holmes 2012). Tests were then performed using the bisulfite sequencing technique, searching for DNA methylation (Holmes 2012). DNA methylation occurs when a methyl group joins to a base of DNA, commonly cytosine (Reece et al, 2011, p. 364) (Figure 2). According to the website Sigma-Aldrich, "DNA methylation is an epigenetic modification that changes the appearance and structure of DNA without altering its sequence" (2008).

Further research by the team showed that some of the DNA methylations found in the bison were in the same places as in modern cow DNA (Holmes 2012). This discovery is important, as it shows that some form of epigenetic modification was occurring during the period in which the fossilised animals lived. As for the similar methylations discovered in modern cows, Holmes suggests this "is strong evidence that the ancient methylations were not the product of chemical damage occurring after the bison's death" (2012). Tests on five other bison specimens gave the scientists no results (Holmes 2012), which shows how difficult this area of research is. Given that scientists currently have a limited understanding of epigenetic signals (Holmes 2012), advances in this field are hard to make.
There are a number of techniques scientists can use to tell whether epigenetic modification has occurred, such as bisulfite sequencing, fluorescent in situ hybridisation and DNA adenine methyltransferase identification (Medindia, n.d.). Despite this, far more research is required to understand what that information is saying, and to get anything out of it.

 There is still a lot of work to be done before any major conclusions can be drawn. But this research has found that at the start of the last ice age, animals may have undergone epigenetic change to adapt to the changing environment. With the current climate change situation facing humans, this find in the remains of a 26,000-year-old bison could be important in the future.

Eye Colour in Humans – Not Just a One Gene Affair


Contrary to popular belief, eye colour in humans is not controlled by just one gene in our DNA. High school biology teaches us about Gregor Mendel, his theories of inheritance patterns and how they relate to human eye colour. Prior to recent studies into the genetics of eye colour, it was thought to be a strictly Mendelian trait (White and Rabago-Smith). However, it is now known to be the product of multiple genes, which explains why eye colour does not comply with Mendelian patterns of inheritance. For example, blue-eyed parents can have brown-eyed children, which should not be possible under a single-gene model where brown is dominant over blue: blue eyes would require a homozygous recessive genotype, so neither parent could pass a dominant allele to their offspring. A single-gene model is also unable to explain the spectrum of eye colours, or the fact that eye colour in humans shows both incomplete dominance and epistasis (University of Queensland). This suggests that more than one gene controls eye colour.
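The contradiction above can be checked mechanically. The sketch below enumerates every child genotype under a hypothetical single-gene model (B = brown, dominant; b = blue, recessive) and confirms that two blue-eyed (bb) parents could only ever produce blue-eyed children, so the observed brown-eyed children falsify the one-gene model.

```python
from itertools import product

# Hypothetical single-gene model: brown (B) dominant over blue (b).
def phenotype(genotype):
    return "brown" if "B" in genotype else "blue"

def possible_children(parent1, parent2):
    """All child genotypes, taking one allele from each parent."""
    return {"".join(sorted(a + b)) for a, b in product(parent1, parent2)}

# Two blue-eyed parents must both be homozygous recessive (bb).
children = possible_children("bb", "bb")
print({phenotype(g) for g in children})  # only {'blue'} is possible
```

A brown-eyed (Bb) parent crossed with a blue-eyed (bb) parent does yield both phenotypes under this model; it is specifically the blue-by-blue cross that the single-gene model cannot reconcile with real families.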


The colour of an individual's eye is determined by the ratio of two pigments in the iris: eumelanin (the brown-black pigment) and pheomelanin (the yellow-red pigment) (White and Rabago-Smith). Both are forms of melanin and are produced in the melanocytes of the eye (Ibid). Blue eyes arise from low levels of melanin, and increasing levels produce the rest of the eye colour spectrum (Stanford University). The more melanin present in the iris, the darker the apparent colour of the eye, as light entering the eye is largely absorbed rather than reflected back as colour (White and Rabago-Smith). People with lighter eyes therefore have less melanin than people with darker shades. Some individuals have red or violet eyes; this is due to a condition called ocular albinism, caused by mutations in their gene sequence (Ibid).

 Studies conducted by various institutions, including the Institute for Molecular Bioscience at the University of Queensland, have shown that there are 16 genes which affect eye colour (Ibid). Most of these genes have only a small effect; the two major genes are HERC2 and OCA2 (Ibid). Because of its position in the DNA, HERC2 affects the way the code of OCA2 is expressed (Ibid). Changes in the sequence of these genes have large impacts on an individual's eye colour: changes in the OCA2 gene have been shown to account for around 74% of the variation in eye colour (Duffy, Montgomery and Chen; White and Rabago-Smith). If both copies of the OCA2 gene are missing, ocular albinism results (White and Rabago-Smith). Other genes which affect eye colour include agouti signalling protein, tyrosinase, membrane-associated transporter protein, P protein (oculocutaneous albinism II) and melanocortin 1 receptor (Ibid).



 It is clear from the research conducted in this area that the Mendelian model of inheritance cannot explain the expression of eye colour in humans. Multiple genes are responsible for melanin production in the eye, the main two being HERC2 and OCA2 (Tyler).