Parallel altitudinal clines reveal trends in adaptive evolution of genome size in domesticated maize (Zea mays)

GENOME SIZE varies by several orders of magnitude across species, due both to changes in ploidy (the number of sets of homologous chromosomes that make up the genome of a cell or organism) and to changes in haploid DNA content (the amount of DNA in a single chromosome set, such as that carried by an egg or sperm). Early hypotheses suggested that genome size might be linked to the complexity of the organism, because more complex organisms “should require a larger number of genes”. Empirical studies, however, revealed instead that most variation in genome size is due to noncoding repetitive sequence (which makes up ~45% of an organism’s genome) and that gene content is relatively constant (i.e. 1.0 to 1.2% of the genome).

Although this discovery resolved “the lack of correlation between genome size and complexity”, we still know relatively little about the make-up of many eukaryote genomes (eukaryotes are organisms whose cells contain a nucleus), the impact of genome size on phenotype (i.e. the traits an organism expresses), or the processes that govern variation in repetitive DNA and genome size among taxa (groups of populations that taxonomists recognize as a unit).

Many theories have been offered to explain variation in genome size among taxa. Across deep evolutionary time, genome size appears to correlate with estimates of effective population size –– leading to suggestions that genetic drift permits maladaptive expansion or contraction of genomes across species (i.e. changes in genome size that do not reflect adjustment to environmental signals or pressures). Other models propose that variation may be due to differences in rates of insertions and deletions (indels), or a consequence of changes in modes of reproduction. While each of these models finds limited empirical support, counter-examples are common.

In addition to these neutral models, many authors have proposed adaptive explanations for genome-size variation. Numerous correlations between genome size and physiologically or ecologically relevant phenotypes have been observed –– including nucleus size, plant-cell size, seed size, body size, and growth rate. Adaptive models of genome-size evolution suggest that positive selection drives genome size towards an optimum due to selection on these, or other, traits, and that stabilizing selection prevents expansions and contractions away from that optimum. In most of these models, however, the mechanistic link between genome size and phenotype remains obscure.

Authors [see attached] investigated parallel changes in intraspecific genome size and DNA-repeat content of domesticated maize (Zea mays) landraces versus their wild relative, teosinte, across altitudinal gradients in Mesoamerica and South America. They combined genotyping, low-coverage whole-genome-sequence data, and flow cytometry to test for evidence of selection on genome size and on the abundance of individual DNA repeats. They found that population structure alone cannot explain the observed variation, implying that clinal patterns of genome size are maintained by natural selection. To better understand the phenotypes driving selection on genome size, the authors conducted a growth-chamber experiment using a population of highland teosinte exhibiting extensive variation in genome size. They found weak support for a positive correlation between genome size and cell size, but stronger support for a negative correlation between genome size and rate of cell production.

Reanalyzing published data on cell counts in maize shoot apical meristems, the authors then identified a negative correlation between cell-production rate and flowering time. Together, these findings suggest a model in which variation in genome size is driven by natural selection on flowering time across altitudinal clines –– connecting intraspecific variation in repetitive-DNA sequence to phenotypic differences important for adaptation to signals such as altitude.


PLoS Genet May 2018; 14: e1007162

Posted in Gene environment interactions | Comments Off on Parallel altitudinal clines reveal trends in adaptive evolution of genome size in domesticated maize (Zea mays)

Triggers of tree mortality under drought conditions

This is a gene-environment interactions topic. The “environmental signal” in this case is DROUGHT. Which genes in the tree’s genome will respond to the drought signal in order for the tree to survive? Forests provide a wide array of ecosystem services and are vital for the maintenance of biodiversity. While forests continue to face pressure from expanding human populations –– which lead to changes in land use and deforestation –– the threat posed by harsh changes in weather is less easily quantified. Evidence from a wide range of sources suggests that rising atmospheric CO2 concentrations have benefited forests, with CO2 fertilization enabling an increased leaf-area index, enhanced water-use efficiency, and greater uptake of carbon globally.

However, extreme weather events, such as heat waves, droughts, fires and storms, have the potential to offset these benefits –– causing widespread tree mortality and a net loss of CO2 into the atmosphere. Although forests are vulnerable to a wide range of extreme weather events, drought and associated disturbances have the greatest effect globally. In the [attached] Review, authors examine the physiological response of trees to drought, focusing on new insights provided by rapid advances in our understanding of the hydraulic function of plants.

Land plants require an efficient long-distance transport pathway to lift water from the soil to their leaves at a rate that satisfies transpiration.

In trees, the xylem tissue (wood) supplies water for all aspects of plant function, including photosynthesis, growth, and reproduction. During long droughts, damage to this hydraulic supply network as a consequence of severe water stress has been identified as a key mechanism involved in tree mortality. Recent experimental work has quantitatively linked hydraulic-failure thresholds to plant mortality, and field studies have shown that hydraulic failure is a primary pathway to extensive canopy death or plant mortality during natural drought events. The authors focus on the current understanding of tree hydraulic performance under drought conditions, how to identify the physiological thresholds that precipitate mortality, and the mechanisms of recovery after drought ends.


Nature 28 Jun 2018; 559: 531–539 [a Review]

Posted in Gene environment interactions | Comments Off on Triggers of tree mortality under drought conditions

Diarrhea from taking laxatives can alter the microbiome

The findings of this study come as no big surprise to many of us. During the past 10+ years, clinicians and scientists have begun to appreciate the importance of The Microbiome –– all the bacteria that live in our intestine. In fact, if our entire body were ground up and its DNA measured, at least 90-92% of all the DNA would be derived from gut bacteria! The “brain-gut-microbiome axis” is now recognized as an important interplay between gut bacteria and the central nervous system, with many signals that also affect our immune system. Alterations in our intestinal bacteria can alter everything from our mood, to our sensitivity to drugs, to our risk of various diseases.

Many forms of adversity can affect microbiota colonization and replacement –– yet little is known about how to predict a bacterial community’s response to perturbations, or how to control reprogramming. Perturbations such as fever or diarrhea are perceived as transient. However, temperature (e.g. fever or hypothermia) and osmolality (a solution’s concentration, generally expressed as the total number of solute particles –– ions, chemicals, proteins, lipids, carbohydrates –– per kilogram) can induce rapid and drastic changes in microbial physiology. Osmolality is a fundamental property affecting bacterial growth, with the steady-state growth rate of most bacteria decreasing as environmental osmolality increases, independent of osmolyte identity. Within a mammalian host (including, of course, a patient), osmolality is tightly regulated in blood, even though it varies in the intestine because of absorption and secretion of the intestine’s contents and water by the epithelial cells lining it.

Osmotic diarrhea is a common medical condition that can arise in a variety of situations (including lactose intolerance and celiac and pancreatic disease). In addition to ‘natural causes’, osmotic diarrhea can also be induced: osmotic laxatives exploit the inability of the epithelium to absorb either specific compounds (such as polyethylene glycol, PEG) or excessive amounts of solutes (such as salts). These unabsorbed solutes osmotically draw water from the intestinal epithelium into the lumen, leading to increased intestinal motility and decreased stool consistency. Over-the-counter osmotic laxatives are prevalent in the industrialized world; Miralax (PEG with an average molecular weight of 3,350 g/mol) is the second-leading digestive remedy in the U.S. Despite this widespread usage, current understanding of the impact of osmotic laxatives and osmotic diarrhea on the gut microbiota is limited, hindering physicians’ ability to manage such problems.
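To make the “particles per kilogram” definition of osmolality concrete, here is a minimal sketch of the bookkeeping. The 17-g dose and 250 mL of water are illustrative assumptions (not figures from the study); only the 3,350 g/mol molar mass of PEG comes from the text above.

```python
# Osmolality ~ total dissolved particles per kilogram of water (mOsm/kg).
# Illustrative assumption: a 17-g dose dissolved in 250 mL (0.25 kg) of water.

def osmolality_mosm_per_kg(mass_g, molar_mass_g_per_mol, particles_per_molecule, kg_water):
    """Contribution of one solute to osmolality, in mOsm/kg."""
    moles = mass_g / molar_mass_g_per_mol
    osmoles = moles * particles_per_molecule   # every dissolved particle counts
    return osmoles / kg_water * 1000.0         # osmol/kg -> mOsm/kg

# PEG 3350 stays intact in solution -> 1 particle per molecule
peg = osmolality_mosm_per_kg(17.0, 3350.0, 1, 0.25)

# The same mass of table salt dissociates into 2 ions per formula unit
nacl = osmolality_mosm_per_kg(17.0, 58.44, 2, 0.25)

print(f"PEG 3350: {peg:.0f} mOsm/kg   NaCl: {nacl:.0f} mOsm/kg")
```

The contrast (roughly 20 versus over 2,000 mOsm/kg for the same mass) illustrates the point that osmolality depends on particle count rather than osmolyte identity; PEG acts as a laxative not because each gram carries many particles, but because the epithelium cannot absorb it at all.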

Authors [see attached] assessed the resilience of the gut ecosystem to osmotic perturbation, using mice as a model system. Osmotic stress caused reproducible extinction of highly abundant taxa (groups of populations of organisms –– in this case, bacteria –– that taxonomists recognize as a unit) and expansion of less prevalent members in the mouse (as well as the human) microbiome. The authors found destruction of the mucus barrier during osmotic perturbation, followed by recovery once the osmotic stress was stopped. The immune system exhibited temporary changes in cytokine levels and a lasting immunoglobulin G (IgG) response against commensal bacteria (i.e. part of the normal gut flora). Environmental availability of microbiota members alleviated these extinction events, demonstrating how species reintroduction can affect the resilience of the gut bacterial community. These data [a] demonstrate that even mild osmotic diarrhea can cause lasting changes to the microbiota and host, and [b] lay the foundation for interventions that might help increase system-wide resilience.


Cell 2018; 173: 1742–1754

Posted in Center for Environmental Genetics | Comments Off on Diarrhea from taking laxatives can alter the microbiome

Questions from a Korean magazine and answers (by PROFESSOR RICHARD LINDZEN)

Questions from a Korean magazine and answers (by Richard Lindzen) to these questions:

1. A large part of the world including Europe and Northeast Asia suffered a long and severe heat wave this summer. What was the problem?

The problem was weather. Weather is produced by waves that alternately provide warming and cooling, as the wind shifts from northerly to southerly. Such waves generally travel from west to east, but sometimes stall in what is referred to as blocking. Relatively long and severe heat waves in summer that affect disparate regions are not unprecedented.

2. What portion of such climate change is man-made?

Please don’t confuse weather and climate. The small change in global mean temperature over the past century is unlikely to have had much influence on weather. However, ‘global warming’ has always been a mostly political issue, and it has been found that people don’t take predictions of warming in the distant future very seriously, so there has been an intense effort to make people think that the impacts are present already.

3. Is there anything unique about the climate changes over the Korean Peninsula and Japan (such as the impact of China’s rapid industrialization)?

Most places on earth have unique features. In the case of the Korean Peninsula, the configuration of the Yellow Sea, the South China Sea, and the Sea of Japan does provide some unique features that can affect weather and even climate. However, the industrialization of China is more likely to have an impact on pollution, rather than climate.

4. How about Europe and North America?

As usual, summer will be warmer than winter. Summers will commonly have seemingly long episodes of warmer than average weather – which we refer to as heat waves. This year July was particularly warm in North America and much of Europe. August in much of southern Europe was pleasantly cool, but Scandinavia remained warm. August has been cool and rainy in much of the northeast of North America. However, warm weather is now returning (at least briefly). Given the wave nature of weather, none of this is unprecedented.

5. Will the type of long and hot summer be a new normal? What should be done to reverse the change, or at least to slow down the pace of it?

I don’t know what you mean by ‘new normal’. Such summers have already occurred many times, and they are commonly followed by very different summers in the following year. The same is true for winters. I doubt that there is much we can do about this. Ancient literature like the Bible and the Icelandic sagas already describe such events.

6. If you think it’s a new normal (for the question No. 5), how would such change reshape global economy/business in general (including farming and fishing)?

Of course, I don’t think we are dealing with anything new. What history shows is that farmers are capable of significant adaptation, fishermen have other problems (such as overfishing), and, more generally, rich societies adapt better than poor societies to changes in climate. The best policy for any society is to become wealthier. However, in the case of farming, there is the obvious benefit of increased CO2, since increased CO2 enhances plant growth and reduces plants’ need for water.

7. How would the long and hot summer weather affect the overall winter weather to come?

Such an influence is possible, but does not seem to be large.

8. How much have the weather-predicting technologies been advanced in recent years and how would they be further advanced in the foreseeable future?

There has been notable improvement in numerical weather-prediction models, and predictions for 2-3 days are now quite good. The model at the European Centre for Medium-Range Weather Forecasts is notably good. Predictions for 6-7 days or longer remain poor, and are likely to remain so, because small and unavoidable observational errors become important at these time scales. The situation with respect to observations is mixed. The best data come from sources like balloon soundings, and the number of such stations has actually gone down. This is especially serious when it comes to weather ships, which have almost disappeared. A weather ship or two in the Yellow Sea would probably help Korean forecasts quite a lot. Satellite data do enable one to identify new weather systems over poorly observed regions. However, the data that satellites provide for models are of limited utility: although satellites provide temperature data with good horizontal resolution, the vertical resolution is too poor to actually improve model forecasts –– at least in the Northern Hemisphere.
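The claim that “small and unavoidable observational errors become important at these time scales” is the hallmark of chaos. A minimal sketch, using the classic Lorenz (1963) toy convection system rather than any real forecast model: two runs that start one part in a million apart end up completely different.

```python
# Euler integration of the Lorenz (1963) system -- a toy model used here only
# to illustrate sensitive dependence on initial conditions, not real weather.

def lorenz_step(state, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    x, y, z = state
    return (x + dt * sigma * (y - x),
            y + dt * (x * (rho - z) - y),
            z + dt * (x * y - beta * z))

a = (1.0, 1.0, 20.0)          # the "true" initial state
b = (1.0 + 1e-6, 1.0, 20.0)   # the same state with a tiny observational error
for _ in range(2000):          # integrate ~20 model time units
    a, b = lorenz_step(a), lorenz_step(b)

separation = sum((u - v) ** 2 for u, v in zip(a, b)) ** 0.5
print(f"final separation: {separation:.3f}")  # many orders of magnitude above 1e-6
```

Improving the forecast model cannot defeat this exponential error growth; only better initial observations (hence the weather-ship remark above) push the predictability horizon out, and only by a few days.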

9. What are some variables or obstacles that make precise weather prediction difficult? (and how can they be overcome?)

See answer to question 8.

10. Are there many private weather information companies in the United States? How can they cooperate with the government and companies to sharpen the weather predictions and generate profit?

Private weather services in the US generally use the forecasts from the government and add locally relevant interpretations.

11. The Korean government’s weather predictions are notoriously inaccurate. What should be done to improve their accuracy? (By contrast, I found that weather prediction in the United Kingdom was very accurate while I was studying there. Could this be a technological difference?)

My recommendation would be for Korea to import the model from the European Centre for Medium-Range Weather Forecasts (in Reading, UK) and use a comparably powerful computer. Also, add a weather ship or two in the Yellow Sea.

12. Please feel free to add your opinion on climate change and its impact on humans, technological advancement to increase accuracy in weather prediction, etc.

The real problem facing the developed world is the fact that the vast majority of people (including the vast majority of the population with higher education, as well as the vast majority of political leaders) are essentially scientifically illiterate. When one realizes that government is the primary funder of science, this can lead to obvious misbehavior by both government and science. Your previous questions offer a good opportunity to explain the problem in greater detail.

Your questions freely confuse weather and climate. Thus, global warming refers to the warming of about 1.0 °C since the end of the Little Ice Age, about 200 years ago. On the other hand, your examples involve temperature changes on the order of 20.0 °C. Such large changes, which characterize weather, have a profoundly different origin from that of global warming. The large changes, crudely speaking, result from winds carrying air from distant regions that are very warm or very cold. These winds take the form of the waves I mentioned earlier. The strength of these waves depends on the temperature difference between the tropics and the Arctic (with larger differences leading to stronger waves). Now, the models used to project global warming all predict that this temperature difference will decrease rather than increase. Thus, an increase in temperature extremes would best support global cooling rather than global warming.

However, scientifically illiterate people seem incapable of distinguishing global warming of climate from temperature extremes due to weather. In fact, there doesn’t really seem to be any discernible trend in weather extremes. There is only the greater attention paid by the media to weather, and the exploitation of this ‘news’ coverage by various parties. Moreover, the small change in global mean temperature (more precisely, the rate of temperature increase) is much smaller than what models used by the IPCC have predicted. Even if all this change were due to human activity, it would be most consistent with low sensitivity to added CO2, and the IPCC only claims that most (not all) of the warming over the past 60 years is due to human activities. Thus, the issue of man-made climate change does not appear to be a serious problem. The unwarranted concern over this issue is leading to irrational policies, as well as the diversion of monetary resources needed for the numerous real problems facing humanity.

Posted in Center for Environmental Genetics | Comments Off on Questions from a Korean magazine and answers (by PROFESSOR RICHARD LINDZEN)

Genome-wide association studies identify genes associated with “intelligence” and with “neuroticism”

As we have often presented in these GEITP pages, we report here two HEROIC (large) genome-wide association studies (GWAS), which represent huge populations (cohorts) examined for DNA-sequence differences (virtually anywhere in the genome) that appear to be associated with a multifactorial trait (i.e. a phenotype that reflects the contribution of hundreds, possibly thousands, of genes, plus epigenetic effects, plus environmental factors). In this case, the intriguing/controversial “traits” being studied are INTELLIGENCE and NEUROTICISM.

Intelligence is known to be highly heritable, as well as being a major determinant of human health and well-being. Five recent GWAS meta-analyses had identified 24 genomic loci (loci = locations in the DNA sequence) linked to variation in intelligence –– but much more of the underlying genetic contribution remains to be uncovered. Authors [see first attached file] present a large-scale GWAS of intelligence (N = 269,867), identifying 205 associated genomic loci (190 new) and 1,016 genes (939 new) via positional mapping, expression quantitative-trait-locus (eQTL) mapping, chromatin-interaction mapping, and gene-based association analysis.

The associated genes were strongly expressed in the brain –– specifically in striatal medium spiny neurons and hippocampal pyramidal neurons. Gene-set analyses implicate pathways related to central nervous system (CNS) development and synaptic structure. These exciting data confirm previously reported strong genetic correlations with multiple health-related outcomes. Mendelian-randomization results suggest protective effects of intelligence against Alzheimer disease and attention-deficit/hyperactivity disorder (ADHD), and bidirectional causation with pleiotropic effects (i.e. when one gene influences two or more seemingly unrelated traits) for schizophrenia. These results represent an important step forward in understanding the neurobiology of cognitive function (i.e. the psychological processes involved in acquisition and understanding of knowledge, formation of beliefs and attitudes, and rational decision-making and problem-solving –– traits that seem to be lacking in today’s politics, worldwide), as well as of genetically related neurological and psychiatric disorders.

Neuroticism is an important risk factor for psychiatric traits –– including major depressive disorder (MDD), anxiety, and schizophrenia. At the time of analysis, six previous GWAS had reported 16 genomic loci associated with neuroticism. Authors [see second attached file] conducted a huge GWAS meta-analysis (N = 449,484) of neuroticism and identified 136 independent genome-wide-significant loci (124 new), which implicate 599 genes. Functional follow-up analyses showed enrichment in several brain regions and involvement of specific cell types, including dopamine-producing neuroblasts, medium spiny neurons, and serotonin-producing neurons.

Gene-set analyses implicated three specific pathways: neurogenesis, behavioral response-to-cocaine processes, and the axon (in vertebrates, the long slender projection of a nerve cell, or neuron, that typically conducts the electrical impulses known as ‘action potentials’ away from the nerve-cell body). The authors show that neuroticism’s genetic signal originates, in part, in two genetically distinguishable subclusters (‘depressed affect’ and ‘worry’) –– suggesting distinct causal mechanisms for subtypes of individuals. Mendelian-randomization analysis showed unidirectional and bidirectional effects between neuroticism and multiple psychiatric traits. These exciting findings enhance our neurobiological understanding of neuroticism and provide specific leads for functional follow-up experiments. 🙂


Nature Genetics Jul 2018; 50: 912–919 & 920–927

Posted in Center for Environmental Genetics | Comments Off on Genome-wide association studies identify genes associated with “intelligence” and with “neuroticism”

Discovery of genome of an offspring of a Neanderthal mother and a Denisovan father ….!!!!

As these GEITP pages have often described, modern humans (Homo sapiens) migrated out of southeast Africa during the past 1-2 million years. Neanderthals (Homo neanderthalensis), Denisovans (Homo denisova), modern humans, and one other (still a mystery) hominin subline most likely evolved from Homo erectus, an ancestor that left Africa ~1.8 million years ago (MYA). Increasing data have suggested that interbreeding between various hominin species occurred (i.e. there were no “clean splits” of one subline from another). Numerous “molecular clock” genetic studies place the divergence time of the Neanderthal and Denisovan lineages between 800,000 and 400,000 years ago, with modern humans diverging from the Neanderthal/Denisovan ancestor before those two sublines became distinct. The genome of modern humans is known to contain bits and pieces (2% to 6%) of both the Neanderthal and Denisovan genomes.
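The “molecular clock” estimates quoted above rest on a simple relationship: after two lineages split, each accumulates substitutions independently, so pairwise divergence d grows as roughly 2μT, giving T ≈ d / (2μ). A minimal sketch of that arithmetic; the per-year rate below (~0.5 × 10⁻⁹ substitutions per site) and the divergence values are illustrative assumptions, not numbers from the attached paper.

```python
# Back-of-envelope molecular clock: d ~ 2 * mu * T  =>  T ~ d / (2 * mu).

MU = 0.5e-9  # assumed substitutions per site per year (illustrative)

def split_time_years(pairwise_divergence, mu=MU):
    """Years since two lineages diverged, given per-site divergence d."""
    return pairwise_divergence / (2.0 * mu)

# Divergence values chosen to bracket the 400,000-800,000-year range in the text
for d in (4e-4, 8e-4):
    print(f"divergence {d:.0e}: split ~{split_time_years(d):,.0f} years ago")
```

Real analyses are far more involved (rate calibration, lineage sorting, branch-specific rates), which is partly why the published estimates span a two-fold range.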

The exciting publication that has just appeared [see attached] describes a female (who lived “only” ~90,000 years ago) who was half-Neanderthal and half-Denisovan, according to genome analysis of a single bone fragment recovered from Denisova Cave in the Altai Mountains of Russia. This is the first time that scientists have identified an ancient individual who was clearly “a hybrid” (i.e. her parents belonged to two distinct human sublines)!! Denisova Cave lends its name to the Denisovans: DNA sequenced in 2008 from a bone found there first identified that hominin subline. The Altai region, and the cave specifically, were also home to Neanderthals.

Given the patterns of genetic variation in ancient and modern humans, scientists already knew that Denisovans and Neanderthals must have bred with each other –– and with Homo sapiens (see ‘Tangled tree’, p 418 of the attached editorial). Until now, however, no one had found a first-generation offspring of such pairings –– although one study [Nature 2015; 524: 216] found the DNA of a Homo sapiens specimen who had a Neanderthal ancestor within the previous four to six generations.

Authors [see attached] show that the father was a Denisovan –– from a population related to a later Denisovan found in the cave –– whose genome also bears traces of Neanderthal ancestry. The mother came from a population more closely related to Neanderthals who lived later in Europe than to an earlier Neanderthal found in Denisova Cave. These findings suggest that migrations of Neanderthals between eastern and western Eurasia occurred more recently than 120,000 years ago. Neanderthals and Denisovans inhabited Eurasia until they were replaced by modern humans ~30,000 years ago. Neanderthal remains have been found across western Eurasia, whereas, thus far, physical remains of Denisovans have been found only in Denisova Cave, where Neanderthal remains have also been recovered.

Although little is known about the morphology of Denisovans, their molar teeth lack the derived traits typical of Neanderthals. It has also been shown that Neanderthals mixed with ancestors of present-day non-Africans ~60,000 years ago, and possibly with earlier ancestors of modern humans. Furthermore, it is known that Denisovans interbred with the ancestors of present-day Oceanians and Asians. [In conclusion, back in those days, there seemed to have been a lot of ‘messing-around’ going on.] Finally, Denisovans appear to have received genes from at least one additional archaic hominin (i.e. one other subline that remains a mystery) –– which diverged more than a million years ago from the modern human, Denisovan, and Neanderthal sublines.


Nature 23 Aug 2018; 560: and Editorial pp 417–418

Posted in Center for Environmental Genetics | Comments Off on Discovery of genome of an offspring of a Neanderthal mother and a Denisovan father ….!!!!

The world of fake journals and non-conferences

From time to time, these GEITP pages have covered this RISING MENACE of “predatory open-access online” journals. During the past decade, this corrupt field of publishers has expanded from virtually none to well over 15,000 “fake journals” today. And there are no signs of slowing down, because it’s such an easy way to make large amounts of money.

Every day, most of us receive at least 10 to 20 –– often more than 50(!!) –– emails requesting that we “publish here with us; write anything, even one or two pages is enough; we need one more article to fill the present issue,” or “please be a speaker at this meeting and we’ll publish the proceedings,” etc. For the “papers,” one must pay exorbitant “page charges”; for meetings, one must pay high registration fees, plus room, board, and travel, and then pay “page charges” for the “proceedings” to be published “in our prestigious journal.” Below is the latest, more detailed SCAM story about predatory journals and invitations to meetings/symposia/congresses.


The world of fake journals and non-conferences

A small team of journalists went undercover to investigate a massive underground network of fake science journals and conferences. Their probe found millions of dollars exchanging hands, largely from unsuspecting scientists –– earning the operators of predatory platforms a tidy sum at researchers’ expense.


By Daniel Oberhaus


Aug 14 2018
Hundreds of Researchers from Harvard, Yale and Stanford Were Published in Fake Academic Journals
How the World Academy of Science, Engineering and Technology has become a multi-million-dollar organization –– promoting bullshit science through fake conferences and journals.


In the so-called “post-truth era,” science seems like one of the last bastions of objective knowledge, but what if science itself were to succumb to fake news? Over the past year, German journalist Svea Eckert and a small team of journalists went undercover to investigate a massive underground network of fake science journals and conferences.

In the course of the investigation, which was chronicled in the documentary film “Inside the Fake Science Factory,” the team analyzed over 175,000 articles published in predatory journals and found hundreds of papers from academics at leading institutions, as well as substantial amounts of research pushed by pharmaceutical corporations, tobacco companies, and others. Last year, one fake science institution run by a Turkish family was estimated to have earned more than $4 million in revenue through conferences and journals.

The story begins with Chris Sumner, a co-founder of the nonprofit Online Privacy Foundation, who unwittingly attended a conference organized by the World Academy of Science, Engineering and Technology (WASET) last October. At first glance, WASET seemed to be a legitimate organization. Its website lists thousands of conferences around the world in pretty much every conceivable academic discipline, with dates scheduled all the way out to 2031. It has also published over ten thousand papers in an “open science, peer reviewed, interdisciplinary, monthly and fully referred [sic] international research journal” that covers everything from aerospace engineering to nutrition. To any scientist familiar with the peer review process, however, WASET’s site has a number of red flags, such as spelling errors and the sheer scope of the disciplines it publishes.

Sumner attended the WASET conference to get feedback on his research, but after attending, it soon became obvious that the conference was a scam. After digging into WASET’s background, Sumner partnered with Eckert and her colleague Till Krause, who adopted fictitious academic personas and began submitting papers to WASET’s journal. The first paper to be accepted was titled “Highly-Available, Collaborative, Trainable Communication –– a policy neutral approach,” which claims to describe a type of cryptanalysis based on “unified scalable theory.” The paper was accepted by the WASET journal with minimal notes and praise for the authors’ contribution to this field of research.

There was just one problem: the paper was pure nonsense, generated by a joke software program designed by MIT students to algorithmically produce computer-science papers. It was, in a word, total bullshit.

Posted in Center for Environmental Genetics | Comments Off on The world of fake journals and non-conferences

Decades of Mismanagement Turned US Forests into ‘Slow-Motion Time Bombs’



Bob Zybach [PhD in Forestry, Oregon State University Environmental Sciences Program] feels like a broken record. Decades ago, he warned government officials that allowing Oregon’s forests to grow unchecked by proper management would result in catastrophic wildfires. While some blame global warming for the uptick in catastrophic wildfires, Zybach said a change in forest-management policies is the main reason Americans are seeing a return to more intense fires, particularly in the Pacific Northwest and California, where millions of acres of protected forests stand.

“We knew exactly what would happen if we just walked away,” Zybach told The Daily Caller News Foundation (DCNF). Zybach spent two decades as a reforestation contractor before heading to graduate school in the 1990s. In 1994, the Clinton administration introduced its plan to protect old-growth trees and spotted owls by strictly limiting logging. Less logging also meant government foresters weren’t doing as much active management of forests –– thinnings, prescribed “back-burns,” and other activities to reduce wildfire risk. Zybach told Evergreen magazine that year that the Clinton administration’s plan for “naturally functioning ecosystems” free of human interference ignored history and would fuel “wildfires reminiscent of the Tillamook burn, the 1910 fires and the Yellowstone fire.”

Between 1952 and 1987, western Oregon experienced only one major fire above 10,000 acres. The region’s relatively fire-free streak ended with the Silver Complex Fire of 1987 that burned more than 100,000 acres in the Kalmiopsis Wilderness area, torching rare plants and trees that the federal government had set aside to protect from human activities. The area has burned several more times since the 1980s.

“Mostly fuels were removed through logging, active management — which they stopped — and grazing,” Zybach said in an interview. “You take away logging, grazing, and maintenance, and you get firebombs.”

This week, Oregonians are dealing with 13 wildfires engulfing 185,000 acres. California is battling nine fires scorching more than 577,000 acres, mostly in the northern forested parts of the state managed by federal agencies. The Mendocino Complex Fire quickly spread to become the largest wildfire in California since the 1930s, engulfing more than 283,000 acres. The previous wildfire record was set by 2017’s Thomas Fire that scorched 281,893 acres in Southern California.

While bad fires still happen on state and private lands, most of the massive blazes happen on, or around, lands managed by the U.S. Forest Service and other federal agencies, Zybach said. Poor management has turned western forests into “slow-motion time bombs,” he said. “If we can’t manage our forests, what the hell?” Zybach told The DCNF.

Is It Global Warming?

The rash of massive fires has reignited the debate over how best to handle wildfires. Experts agree that a century of fire suppression caused forests to become overgrown and filled with dead wood and debris that easily ignites in the dry summer heat. There is disagreement, however, over whether global warming has exacerbated western wildfires. Some scientists –– those often quoted in the news –– link global warming to a longer wildfire season and more intense heat.

California Democratic Gov. Jerry Brown said bigger wildfires were part of a “new normal” because of global warming. People will just have to adapt to it, the governor said while touring the destruction left by the Carr Fire. But attributing wildfires to man-made warming is trickier than many scientists let on — given the myriad of factors that determine the intensity of fires.

“Global warming, if it is real, may contribute slightly, but the key factors are mismanaged forests, years of fire suppression, increased population, people living where they should not, invasive flammable species, and the fact that California has always had fires,” the University of Washington climate scientist Cliff Mass told The DCNF. Mass also noted there hasn’t been much warming in the Pacific Northwest these past two decades, adding that natural weather patterns in California prime the state for wildfires every year –– no matter what. “Many of the media and some politicians have been pushing a false narrative: that the fires are mainly about global warming. They are not,” Mass said in an email. Mass also criticized politicians and the media for trying to make last year’s wildfire season about global warming.

Zybach also doesn’t buy that global warming is exacerbating fires. Through his research, Zybach analyzed thousands of official documents, reports, and first-hand accounts of wildfire activity going back hundreds of years. His conclusion: wildfire season hasn’t changed much. “To say there’s been another change, other than management, is just grasping at straws,” Zybach said.

What has changed is land management. For example, declines in timber production on federal lands, particularly in the Northwest, not only meant the death of a once vibrant industry but also an end to thinning, controlled burns, and other activities meant to keep forest growth in check. Wildfire experts have also increasingly been pointing to the fact that more people and infrastructure are located in wildfire-prone areas than in the past, increasing the risk of wildfires impacting livelihoods.

A recent study found the number of homes at risk of wildfires in the western U.S. has increased 1,000 percent, from about 607,000 in 1940 to 6.7 million today. Since most fires are ignited by humans, the more people in fire-prone areas, the higher the risk. “This is a people problem,” said U.S. Geological Survey fire expert Jon Keeley. “What’s changing is not the fires themselves, but the fact that we have more and more people at risk.”

Ticking Forest Fire Bombs

The Klondike Fire is one of several fires raging in southern Oregon, igniting more than 30,000 acres of protected forest and covering nearby towns with smoke and ash. The nearby Taylor Fire has engulfed more than 41,000 acres, including protected woods. Officials are worried the two fires could combine. The Klondike and Taylor fires are the fourth and fifth major blazes to burn through the Rogue River-Siskiyou National Forest since 1987 when the Silver Complex Fire burnt up more than 100,000 acres.

The Siskiyou National Forest encompasses 1.8 million acres stretching from northern California through southwestern Oregon. The forest also contains the Kalmiopsis Wilderness, which Congress created in 1964 to protect the rare plant life of the region. Before the Silver fire, however, you have to go back to the 1930s to find a comparable blaze, according to Forest Service figures. (Fires are labeled “complex” when two or more combine.)

Oregon, like much of the western U.S., was ravaged by massive wildfires in the 1930s during the Dust Bowl drought. Logging and policies of active forest management largely kept mega-fires contained for decades, but that ended in the 1980s over concerns about cutting too many old-growth trees and harming the northern spotted owl. Since then, larger fires have become an increasing trend: lawsuits from environmental groups hamstrung logging, and government planners cut back on thinning trees and road maintenance.

Zybach said Native Americans used controlled burns to manage the landscape in Oregon, Washington and northern California for thousands of years. Tribes would burn up to 1 million acres a year on the west coast to prime the land for hunting and grazing, Zybach’s research has shown. “The Indians had lots of big fires, but they were controlled,” Zybach said. “It’s the lack of Indian burning, the lack of grazing,” and other active management techniques that caused fires to become more destructive in the 19th and early 20th centuries before logging operations and forest management techniques got fires under control in the mid-20th Century.

What Will Trump Do?

In a recent tweet, Interior Secretary Ryan Zinke said the “overload of dead and diseased timber in the forests makes the fires worse and more deadly.” The secretary called for more active management of forests to reduce fuel loads. President Donald Trump signed legislation in March to prevent federal agencies from using forest-management funding to pay for fire suppression, but that’s only a partial fix. Federal agencies need to be given more flexibility to clear forests of debris and lessen fuel loads.

The Forest Service planned in 2018 to treat and clear more vegetation and harvest more timber than in 2017 — to prevent fuel build-up. But that’s only happening on a small fraction of the 193 million acres managed by federal foresters. Zybach said more needs to be done to get fuel loads down and reduce wildfire risks. “It would make our forests safe and beautiful again, it would save lives and property, and it would create more jobs,” Zybach said.

Posted in Gene environment interactions | Comments Off on Decades of Mismanagement Turned US Forests into ‘Slow-Motion Time Bombs’

High-resolution comparative analysis of the Great Ape genomes: structural variation and brain-expression differences

In these GEITP pages, we have often chatted about primate evolution –– because evolution is a representation of the genes, in the genomes of each species, interacting with and responding to the environment over extended time periods (hundreds, thousands, even hundreds of thousands of years). Evolutionary scientists have long been interested in the functional genetic differences that distinguish humans from other ape species. Human and chimpanzee protein-coding changes, structural differences in regulatory DNA, and changes in copy numbers within gene families have all been implicated in the adaptation of the human-ape ancestor to the changing environment in which hominin species began developing. For example, several potentially high-impact regulatory changes and human-specific genes important in (brain nerve) synapse density, neuronal cell count, and other morphological differences in the brain are all well known. Most of these genetic differences, however, were not initially recognized during early comparisons of human and ape genomes, because these genetic changes mapped to regions of rapid genomic structural change that had not yet been resolved in draft genome assemblies.

Authors [see attached article] combined “long-read” sequence assembly and full-length complementary DNA (cDNA; DNA that has been reverse-transcribed from the messenger RNA of coding genes) sequencing with a multi-platform scaffolding approach, in order to produce, from scratch, high-quality chimpanzee and orangutan genome assemblies. By comparing these two ape genome assemblies with two long-read de novo human genome assemblies and a gorilla genome assembly, authors characterized lineage-specific versus shared great ape genetic variation, ranging from single–base pair variants to mega–base pair (Mbp)–sized variants (“mega-base” = 1 million bases).

Authors identified ~17,000 fixed human-specific structural gene variants, which helped identify genic and putative regulatory changes that have emerged in humans since their divergence from non-human apes. Intriguingly, these DNA variants were found to be enriched near genes that are down-regulated in human cerebral organoids, compared with chimpanzee organoids, particularly in cells analogous to radial glial neural progenitors (such precursors of glial cells in the brain include oligodendrocytes, astrocytes, ependymal cells, Schwann cells, microglia, and satellite cells).


Science 8 June 2018; 360: 1085 + whole article

Posted in Center for Environmental Genetics | Comments Off on High-resolution comparative analysis of the Great Ape genomes: structural variation and brain-expression differences

Selfish genetic element confers non-Mendelian inheritance in rice

As these GEITP pages have previously discussed, only ~1% of the ~3-billion base-pair (bp) human genome actually codes for functional proteins (this portion, ~30 million bases, is called the “exome”). [Therefore, whole-genome sequencing (WGS; usually performed by next-generation sequencing, NGS) reads virtually all of the ~3 billion bp, whereas whole-exome sequencing (WES) examines only those ~30 million bp.] About half of the remaining ~99% of the DNA consists of transposable elements (called “selfish DNA”), which are being discovered to have important functions, whereas the other half has no known function (so far) and is called “junk DNA.” Richard Dawkins first described “the selfish gene” in his 1976 book of that name, and Leslie Orgel and Francis Crick described “selfish DNA” in a 1980 Nature article. More or less similar proportions (of genic, selfish, and junk DNA) are known to exist in other animals –– as well as plants.
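The proportions above are round approximations, but the arithmetic is worth seeing in one place; this minimal Python sketch simply restates the text’s approximate figures (not exact measured values) and derives the fractions from them:

```python
GENOME_BP = 3_000_000_000   # ~3 billion bp human genome (approximate)
EXOME_BP = 30_000_000       # ~30 million bp protein-coding exome (approximate)

exome_fraction = EXOME_BP / GENOME_BP      # ~0.01, i.e. ~1% of the genome
noncoding_bp = GENOME_BP - EXOME_BP        # the remaining ~99%
transposon_bp = noncoding_bp // 2          # ~half: transposable ("selfish") DNA
junk_bp = noncoding_bp - transposon_bp     # other ~half: "junk DNA" (so far)

print(exome_fraction, transposon_bp, junk_bp)
```

So WES targets roughly one-hundredth of the sequence that WGS does, while transposable elements alone account for nearly 1.5 billion bp.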

The attached report deals with selfish genetic elements (SGEs) in rice. Accumulating evidence suggests that SGEs –– DNA sequences that gain a transmission advantage relative to the rest of the genome –– could drive genome evolution by causing hybrid incompatibilities and segregation distortion (i.e. deviation from the expected Mendelian ratio of half from one parent, half from the other) in different organisms; however, the role of SGEs in genome evolution, and their underlying molecular mechanisms, remain controversial. Authors herein demonstrate that qHMS7 –– a major quantitative trait locus for hybrid male sterility between wild rice (Oryza meridionalis) and Asian cultivated rice (Oryza sativa) –– contains two tightly linked genes [called Open Reading Frame 2 (ORF2) and ORF3].

ORF2 encodes a toxic genetic elephant that aborts pollen in a sporophytic manner (in the life cycle of plants, such as rice, that have alternating generations, the sporophyte is the asexual and usually diploid [paired chromosomes] phase, which produces spores –– from which the gametophyte arises), whereas ORF3 encodes an antidote that protects pollen in a gametophytic manner (the gametophyte is the gamete-producing and usually haploid [single chromosomes] phase, whose gametes fuse to form the zygote –– from which the sporophyte arises). Pollen grains lacking ORF3 are selectively eliminated, leading to segregation distortion in the progeny. Evolutionary analysis of the genetic sequence suggests that ORF3 arose first, followed by gradual functionalization of ORF2. Furthermore, this toxin-antidote system (ORF2/ORF3) may have promoted differentiation, and/or may have maintained genome stability, in both wild rice and cultivated rice.
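The segregation-distortion logic described above is simple enough to simulate. The toy Python model below (our own illustration, not the authors’ analysis; haplotype labels “T” and “t” are hypothetical) shows why selfing a heterozygote that carries the linked toxin-antidote pair yields roughly 1:1 carrier-homozygote:heterozygote progeny, and no antidote-free homozygotes, instead of the Mendelian 1:2:1:

```python
import random

def pollen_pool(parent, n=10_000):
    """Male gametes of a plant. 'T' = haplotype carrying the linked
    ORF2 (toxin) + ORF3 (antidote) pair; 't' = haplotype lacking both."""
    grains = [random.choice(parent) for _ in range(n)]
    if "T" in parent:
        # ORF2 acts sporophytically: a carrier plant poisons ALL of its pollen.
        # ORF3 acts gametophytically: only grains that themselves carry the
        # antidote (i.e. the 'T' haplotype) survive.
        grains = [g for g in grains if g == "T"]
    return grains

def self_cross(parent=("T", "t"), n=10_000):
    pollen = pollen_pool(parent, n)
    ovules = [random.choice(parent) for _ in range(n)]  # female gametes unaffected
    return ["".join(sorted((random.choice(pollen), o))) for o in ovules]

random.seed(0)
progeny = self_cross()
counts = {g: progeny.count(g) for g in ("TT", "Tt", "tt")}
print(counts)  # ~1 TT : 1 Tt, and no tt, instead of Mendelian 1:2:1
```

Every surviving pollen grain carries the antidote, so the “t” haplotype is transmitted only through the ovules; over generations this one-sided transmission is exactly the advantage that defines a selfish genetic element.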

Science 8 June 2018; 360: 1130–1132

ORF2 encodes a toxic genetic elephant
That’s funny.

Thank you for detecting this (deliberate) typo. You are the Winner this time. 😉 I was just checking to see if anyone was reading this (rather dry, complicated) plant genetics email carefully. Usually C. V. is the one who first detects my typos. Of course, it is meant to read “toxic genetic elements.”

An afterthought: I should have added a comment at the end of this quick review –– that we anticipate more clinical phenotypes (disorders) will soon be found to be regulated/influenced by “selfish DNA.” These transposons do crazy, unexpected things.

COMMENT: ha ha… i thought it was a deliberate joke, you know like a “white elephant”. still cute.

Posted in Center for Environmental Genetics | Comments Off on Selfish genetic element confers non-Mendelian inheritance in rice