
Archive - Jul 22, 2015

Kiwi Genome Sequenced; Much Evidence for Adaptation to Nocturnal Behavior Noted: Loss of Color Vision, Heightened Sense of Smell, Low Metabolic Rate

The kiwi bird's unique nocturnal behavior is linked to some altered genes that eliminate color vision and others that modify its sense of smell, according to the first kiwi genome published in the open-access journal Genome Biology. Kiwis are endangered, ground-dwelling birds endemic to New Zealand. They are the smallest and only nocturnal representatives of the ratites - a group of flightless birds that includes the ostrich and the emu. Kiwis are also unusual in that they have a highly developed sense of smell, low metabolic rate, and enormous eggs in relation to body size. However, the genetic adaptations that underpin their unique traits have so far not been well understood. A team of researchers sequenced the genomes of two North Island brown kiwi. Not only was the kiwi genome found to be one of the largest bird genomes sequenced to date, but the team also identified evolutionary changes in its genome that help explain the bird's unique adaptations to nocturnality - a behavior found in under 3% of all bird species. Lead author Dr. Diana Le Duc from the University of Leipzig and the Max Planck Institute for Evolutionary Anthropology, Germany, said: "We've seen for the first time that kiwi lack color vision, and that their olfactory receptors can probably detect a larger range of odors which may be essential for their night-time foraging. These adaptations seem to have happened around 35 million years ago, soon after their arrival in New Zealand, probably as a consequence of their nocturnal lifestyle." The kiwi gene coding for the protein responsible for black-and-white vision, rhodopsin, was found to be similar to the corresponding gene in other vertebrates. However, the team identified mutations in the green-and-blue vision receptor genes, which could render blue and green color vision absent in the kiwi.

Antisense Morpholino Drug May Be Effective Against Deadly Marburg Virus; Current Human Fatality Rate Is 90%

An experimental drug that protected monkeys from the deadly Marburg virus appears to have potential for treating people who have been exposed to the virus, according to a study published in the July 23, 2015 issue of The New England Journal of Medicine. The article is titled “AVI-7288 for Marburg Virus in Nonhuman Primates and Humans.” Marburg virus is closely related to Ebola virus and, like Ebola, causes a severe hemorrhagic fever. The research was jointly conducted by the U.S. Army Medical Research Institute of Infectious Diseases (USAMRIID) and the biotechnology firm Sarepta Therapeutics, Inc., using a compound known as AVI-7288. Taken together, the results of efficacy testing conducted in nonhuman primates and safety testing performed in a Phase I clinical trial suggest that AVI-7288 has the potential to be used to treat Marburg virus infection in humans when administered post-exposure, according to the authors. Case fatality rates associated with Marburg virus have been reported to be nearly 90 percent, and the virus is deemed a potential "Category A" bioterrorism agent by the Centers for Disease Control and Prevention. No licensed vaccine or therapy is currently available for Marburg virus infection. For over a decade, USAMRIID and Sarepta have been collaborating to develop and test a class of antisense compounds known as phosphorodiamidate morpholino oligomers (PMOs), according to senior author and USAMRIID Science Director Sina Bavari, Ph.D. Antisense drugs are designed to enter cells and eliminate viruses by preventing their replication, Dr. Bavari explained. The drugs act by blocking the translation of critical viral genetic sequences, preventing a key viral protein from being made and giving the infected host time to mount an immune response and clear the virus.

More Birds Develop Darker Plumage on Smaller Islands; Genetic Link Between Melanism and Aggression Has Previously Been Shown in Mammals and Fish; Darker Plumage May Give Birds Aggression-Based Mating Advantage in Smaller Territories

Animal populations on islands tend to develop unusual traits over time, becoming big (like Galapagos tortoises) or small (like extinct dwarf elephants), or losing the ability to fly (like the flightless parrots of New Zealand). One less-studied pattern of evolution on islands is the tendency for animal populations to develop "melanism"--that is, dark or black coloration. Dr. J. Albert Uy and Dr. Luis Vargas-Castro of the University of Miami found an ideal species in which to study this phenomenon: the chestnut-bellied monarch (Monarcha castaneiventris) flycatcher, a bird found in the Solomon Islands. Most of these birds have the chestnut belly suggested by their name, but in the subspecies found in the Russell Islands, a few all-black birds coexist with the chestnut-bellied majority. After visiting 13 islands of varying sizes to survey their chestnut-bellied monarch populations, Dr. Uy and Dr. Vargas-Castro confirmed, in an open-access paper published online on July 22, 2015 in The Auk: Ornithological Advances, that island size predicts the frequency of melanic birds, with populations on smaller islands including more dark individuals. The article is titled “Island Size Predicts the Frequency of Melanic Birds in the Color-Polymorphic Flycatcher Monarcha castaneiventris of the Solomon Islands.” Because the pattern is repeated on island after island, it is very unlikely to have developed through random chance; instead, dark coloration most likely provides some sort of benefit to birds on small islands. Studies in mammals and fish have found a genetic link between melanism and aggressive behavior, and Dr. Uy and Dr. Vargas-Castro speculate that the limited space available on smaller islands makes competition for breeding territories more intense, giving an advantage to the most aggressive individuals.

High Levels of Fumarate May Play Prominent Role in Diabetic Kidney Disease; As Measurable Metabolite in Urine, Compound Could Prove Useful New Biomarker

Tapping the potential of metabolomics, an emerging field focused on the chemical processes of metabolism, researchers at the University of California (UC), San Diego School of Medicine have identified a new and pivotal player in diabetic kidney disease. The study, published online on July 22, 2015 in the Journal of the American Society of Nephrology, also clarifies a central mechanism of action in diabetic kidney disease that is generating considerable excitement among researchers and the biopharmaceutical community. The mechanism, involving the NADPH (nicotinamide adenine dinucleotide phosphate) oxidase (NOX) proteins NOX1 and NOX4, is now the subject of a phase II clinical trial for the treatment of diabetic kidney disease. "Our study further illuminates the role of NOX proteins, in particular NOX4, in mediating diabetes-associated kidney dysfunction and identifies fumarate, a product of the TCA (tricarboxylic acid) and urea cycles, as a key link in the metabolic pathways underlying diabetic kidney disease," said senior author Kumar Sharma, M.D., a Professor of Medicine and Director of the Center for Renal Translational Medicine at UC San Diego School of Medicine. By pinpointing fumarate, Dr. Sharma added, the research team has also discovered a new, and potentially better, biomarker for diagnosing and monitoring chronic diabetic kidney disease. Young-Hyun You, Ph.D., a project scientist in the Center for Renal Translational Medicine, was first author on the study. The article is titled “Metabolomics Reveals a Key Role for Fumarate in Mediating the Effects of NADPH Oxidase 4 in Diabetic Kidney Disease.” Diabetic kidney disease is the leading cause of end-stage kidney disease, the eighth leading cause of death in the United States, and a major risk factor for cardiovascular disease.

100-Year-Old Theory for Origin of Mysterious “Hair Ice” Is Confirmed; Specific Fungus Is Key; Physics, Chemistry, and Biology Converge to Solve Riddle of Evanescent Frozen Beauty

You may have never seen or heard of it, but hair ice - a type of ice that has the shape of fine, silky hairs and resembles white candy floss - is remarkable. It grows on the rotten branches of certain trees when the weather conditions are just right, usually during humid winter nights when the air temperature drops slightly below 0°C. Now, a team of scientists in Germany and Switzerland has identified the missing ingredient that gives hair ice its peculiar shape: namely, the fungus Exidiopsis effusa. The research was published online today (July 22, 2015) in Biogeosciences, an open-access journal of the European Geosciences Union (EGU). The article is titled “Evidence for Biological Shaping of Hair Ice.” "When we saw hair ice for the first time on a forest walk, we were surprised by its beauty," says Dr. Christian Mätzler from the Institute of Applied Physics at the University of Bern in Switzerland. "Sparked by curiosity, we started investigating this phenomenon, at first using simple tests, such as letting pieces of hair ice melt completely in our hands." Then Dr. Mätzler, a physicist, joined forces with a chemist (Dr. Diana Hofmann) and a biologist (Dr. Gisela Preuß) in Germany. Inspired by earlier work, and by photographs of hair ice sent in from various countries, the team performed a set of experiments to figure out just what conditions are needed to grow this type of ice and what its properties are. In the process, they confirmed a 100-year-old theory for the origin of hair ice. Dr. Alfred Wegener, of tectonic-plate fame and originator of the Continental Drift theory, was the first to study hair ice. In 1918, he noticed a whitish cobwebby coating on the surface of hair-ice-bearing wood, which his assistant identified as fungus mycelium - the mass of thin threads from which mushrooms grow. Dr.

Ultra-Brief-Pulse Electroshock Effective for Severe Depression with Far Fewer Cognitive Side Effects Than Standard Electroshock

Electroconvulsive therapy (ECT) remains one of the most effective treatments for severe depression, but new University of New South Wales (UNSW) research shows ultra-brief pulse stimulation is almost as effective as standard ECT, with far fewer cognitive side effects. The study, published online in the Journal of Clinical Psychiatry, is the first systematic review to compare the effectiveness and cognitive effects of the standard ECT treatment, brief-pulse stimulation, with the newer treatment, known as ultra-brief-pulse right unilateral (RUL) ECT. It comes after previous trials had shown conflicting results. The latest study reviewed six international ECT studies comprising 689 patients with a median age of 50 years. The study found that, while standard ECT was slightly more effective for treating depression and required one fewer treatment, this came at the cost of significantly more cognitive side effects. "This new treatment, which is slowly coming into clinical practice in Australia, is one of the most significant developments in the clinical treatment of severe depression in the past two decades," according to UNSW Professor of Psychiatry Dr. Colleen Loo. "Our analysis of the existing trial data showed that ultra-brief stimulation significantly lessened the potential for the destruction of memories formed prior to ECT, reduced the difficulty of recalling and learning new information after ECT, and was almost as effective as the standard ECT treatment," Professor Loo said. The article is titled "A Systematic Review and Meta-Analysis of Brief Versus Ultrabrief Right Unilateral Electroconvulsive Therapy for Depression." ECT delivers a finely controlled electric current to the brain's prefrontal cortex, an area that is underactive in people with depression.

Levels of Two Blood Biomarkers (ADMA and Hcy) Predict Pre-Eclampsia a Month Ahead of Disease Onset

Levels of biomarkers in the blood of pregnant women can be used to predict which women are at risk of pre-eclampsia, according to the results of a study published online on July 22, 2015 in BJOG: An International Journal of Obstetrics and Gynaecology. ADMA (asymmetric dimethylarginine) and Hcy (homocysteine), both known to be raised in women with pre-eclampsia, are present in the blood in higher than normal concentrations a month before the onset of the condition. The article is titled “Serial Determinations of Asymmetric Dimethylarginine and Homocysteine during Pregnancy to Predict Pre-Eclampsia: A Longitudinal Study.” Pre-eclampsia is a combination of raised blood pressure (hypertension) and protein in the urine (proteinuria). It is quite common, usually occurring after 20 weeks of pregnancy and affecting between 2 and 8 in 100 women during pregnancy. In most cases it is mild and has little effect on the pregnancy. However, for 1 in 200 women, the effects are more serious. Severe pre-eclampsia can affect the mother by damaging the kidneys, liver, and other organs and, in really severe cases, cause seizures and coma. There is often less fluid than normal around the baby, and the placenta can be affected, restricting blood flow and nutrients necessary for the baby's growth. The exact cause of pre-eclampsia is not understood, but it is more common in first pregnancies. Once identified, mothers can be monitored and treated. Often the baby will be delivered early by being induced or by caesarean section. The current study addresses the need to find a reliable way of screening women to find those at risk of pre-eclampsia. The research team led by Dr.