Excellent summary from several different scientific sources on global climate change

This is an EXCELLENT summary from several different scientific sources — appearing in https://finance.townhall.com — that should (or could) clarify matters for non-scientists who do not understand Climatology, and how complex it is. Anyone with an open mind, or curious mind, is invited to enjoy this. DWN

Amidst Global Warming Hysteria, NASA Expects Global Cooling – Mike Shedlock

Jan 30, 2019 1:03 PM

The new data are coming from NASA’s Sounding of the Atmosphere using Broadband Emission Radiometry or SABER instrument, which is onboard the space agency’s Thermosphere Ionosphere Mesosphere Energetics and Dynamics (TIMED) satellite. SABER monitors infrared radiation from carbon dioxide (CO2) and nitric oxide (NO), two substances that play a vital role in the energy output of our thermosphere, i.e. the very top level of our atmosphere.

“The thermosphere always cools off during Solar Minimum. It’s one of the most important ways the solar cycle affects our planet,” said Martin Mlynczak, the associate principal investigator for SABER.

The new NASA findings are in line with studies released by UC-San Diego and Northumbria University in Great Britain last year, both of which predict a Grand Solar Minimum in coming decades due to low sunspot activity. Both studies predicted solar activity similar to the Maunder Minimum of the mid-17th to early 18th centuries, which coincided with a time known as the Little Ice Age, during which temperatures were much lower than those of today.

If all of this seems as if NASA is contradicting itself, you’re right — sort of. After all, NASA also reported last week that Arctic sea ice was at its sixth lowest level since measuring began. Isn’t that a sure sign of global warming?

All any of this “proves” is that — we have, at best, a cursory understanding of Earth’s incredibly complex climate system. So when mainstream media and carbon-credit salesman Al Gore breathlessly warn you that we must do something about climate change, it’s all right to step back, take a deep breath, and realize that we don’t have the knowledge, skill or resources to have much effect on the Earth’s climate.

Incredibly Complex Systems

See the problem? Alarmists take one variable, CO2 — which is only a tiny part of extremely long climate cycles — and make projections far into the future based on it.

When I was in grade school (early 1970s), the alarmists were worried about global cooling. Amusingly, I recall discussing in science class the need to put soot on the arctic ice to melt it to stop the advance of glaciers.

Now, the latest Intergovernmental Panel on Climate Change (IPCC) Report said we have only 12 years left to save the planet. It triggered the usual frantic and ridiculous reactions.

NBC News offered this gem: “A last-ditch global warming fix? A man-made ‘volcanic’ eruption” to cool the planet. Its article proclaimed, “Scientists and some environmentalists believe nations might have to mimic volcanic gases as a last-ditch effort to protect Earth from extreme warming.”

Geo-engineering: Ignoring the Consequences

From 1940 to almost 1980, the average global temperature went down. Political concerns and the alleged scientific consensus focused on global cooling. Alarmists said it could be the end of agriculture and civilization. Journalist Lowell Ponte chronicled the alarm in his 1976 book, The Cooling.

The problem then was — and still is now — that people are educated in the false philosophy of uniformitarianism: the misguided belief that conditions always were, and always will be, as they are now, rather than changing naturally over long periods of time.

Consequently, most people did not understand that the cooling was part of the natural cycle of climate variability, or that changes are often huge and sudden. Just 18,000 years ago we were at the peak of an Ice Age. Then, most of the ice melted and sea levels rose 150 meters (490 feet), because it has been warmer for almost all of the last 10,000 years than it is today.

During the cooling “danger,” geo-engineering proposals included:

* building a dam across the Bering Straits to block cold Arctic water, to warm the North Pacific and the middle latitudes of the Northern Hemisphere;

* dumping black soot on the Arctic ice cap to promote melting;

* adding carbon dioxide (CO2) to the atmosphere to raise global temperatures.

“Taking carbon dioxide out of the atmosphere,” as advocated by the IPCC in its October 8 news conference, is also foolish. Historic records show that, at about 410 parts per million (ppm), the level of CO2 supposedly in the atmosphere now, we are near the lowest in the last 280 million years. As plants evolved over that time, the average level was 1200 ppm. That is why commercial greenhouses boost CO2 to that level to increase plant growth and yields by a factor of four.

The IPCC has been wrong in every prediction it’s made since 1990. It would be a grave error to use its latest forecasts as the excuse to engage in geo-engineering experiments with the only planet we have.

Global Warming Errs Badly

Next, please consider the essay “Extreme weather not proof of global warming, NASA on global cooling.”

To understand the great confusion about global warming or climate change, my most lucid guide has been Dr. Richard Lindzen — a former Alfred P. Sloan professor of meteorology at MIT and member of the US National Academy of Sciences — and his now famous lecture for the Global Warming Policy Foundation last October 8.

In just a few segments of his lecture, Dr. Lindzen crystallized for me why the church of global warming errs so badly in its dogma.

Global warming promoters fostered the popular public perception that the science of climate change is quite simple: there is one phenomenon to be explained (“global average temperature,” or GAT — which, says Lindzen, is a thoroughly unscientific concept), and there is one explanation for it: the amount of CO2 in the atmosphere.

GAT is only one of many important phenomena to measure in the climate system, and CO2 is only one of many factors that influence both GAT and all the other phenomena.

CO2’s role in controlling GAT is at most perhaps 2 percent, yet climate alarmists think of it as the “control knob.”

Most people readily confuse weather (short-term, local-scale temperature, humidity, precipitation, wind, cloudiness, and more) with climate (long-term, large-scale averages of the same), and think weather phenomena are driven by climate phenomena; they aren’t.

Consequently, as Lindzen says, the currently popular narrative concerning this system is this: The climate, a complex multifactor system, can be summarized in just one variable, the globally averaged temperature change, and is primarily controlled by the 1 to 2 percent perturbation in the energy budget due to a single variable — carbon dioxide — among many variables of comparable importance.
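Where does a figure like “1 to 2 percent” come from? As a back-of-envelope illustration of my own (not Lindzen's arithmetic), compare the commonly cited ~3.7 W/m² of radiative forcing from a doubling of CO2 with the ~240 W/m² of solar energy the Earth system absorbs on average:

$$\frac{\Delta F}{F} \approx \frac{3.7\ \mathrm{W\,m^{-2}}}{240\ \mathrm{W\,m^{-2}}} \approx 1.5\%$$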

Big Chill

Did You Know the Greatest Two-Year Global Cooling Event Just Took Place?

Would it surprise you to learn the greatest global two-year cooling event of this century just occurred? From February 2016 to February 2018 (the latest month available), global average temperatures dropped 0.56°C. You have to go back to 1982-84 for the next biggest two-year drop, 0.47°C—also during the global warming era. All the data in this essay come from GISTEMP Team, 2018: GISS Surface Temperature Analysis (GISTEMP). NASA Goddard Institute for Space Studies (dataset accessed 2018-04-11 at https://data.giss.nasa.gov/gistemp/). This is the standard source used in most journalistic reporting of global average temperatures.

The 2016-18 Big Chill was composed of two Little Chills: the biggest five-month drop on record (February to June 2016) and the fourth biggest (February to June 2017). A similar event from February to June 2018 would bring global average temperatures below the 1980s average. February 2018 was colder than February 1998. If someone is tempted to argue that the reason for recent record cooling periods is that global temperatures are getting more volatile, that’s not true. The volatility of monthly global average temperatures since 2000 is only two-thirds what it was from 1880 to 1999.
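For readers who want to check these numbers themselves, here is a minimal Python sketch (my own, not the essayist's) of how the two-year drops and the volatility comparison could be computed from the public GISTEMP file cited above. It assumes the Year/Jan..Dec column layout that data.giss.nasa.gov currently serves, so verify the file format before running.

```python
# Minimal sketch: find the largest 24-month temperature drops in GISTEMP
# and compare month-to-month volatility before vs after 2000.
# Assumes GLB.Ts+dSST.csv from https://data.giss.nasa.gov/gistemp/
# (one title line, then a header row: Year, Jan..Dec, seasonal columns).
import pandas as pd

MONTHS = ["Jan", "Feb", "Mar", "Apr", "May", "Jun",
          "Jul", "Aug", "Sep", "Oct", "Nov", "Dec"]

df = pd.read_csv("GLB.Ts+dSST.csv", skiprows=1, na_values="***")

# Flatten to one monthly time series of temperature anomalies (deg C).
monthly = (df.melt(id_vars="Year", value_vars=MONTHS,
                   var_name="Month", value_name="Anomaly")
             .assign(Month=lambda d: d["Month"].map(
                 {m: i + 1 for i, m in enumerate(MONTHS)}))
             .sort_values(["Year", "Month"])
             .reset_index(drop=True))

# Change over a 24-month window, e.g. Feb 2016 -> Feb 2018.
monthly["two_year_change"] = monthly["Anomaly"].diff(24)
print(monthly.nsmallest(5, "two_year_change")[["Year", "Month", "two_year_change"]])

# Volatility comparison: std of month-to-month changes, before/after 2000.
step = monthly["Anomaly"].diff(1)
print("1880-1999 volatility:", step[monthly["Year"] < 2000].std())
print("2000-     volatility:", step[monthly["Year"] >= 2000].std())
```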

None of this argues against global warming. The 1950s was the last decade cooler than the decade before it; each of the next five decades was warmer on average than the one preceding. Two-year cooling cycles, even if they set records, are statistical noise when compared with the long-term trend.

My point is that statistical cooling outliers garner no media attention. The global average temperature numbers come out monthly. If they show a new hottest year on record, that’s a big story. If they show a big increase over the previous month, or the same month in the previous year, that’s a story. If they represent a sequence of warming months or years, that’s a story. When they show cooling of any sort—and there have been more cooling months than warming months since anthropogenic warming began—there’s no story.

Bombarded with Garbage

Of course, you would not know this, unless you follow NASA, Real Clear Markets, or Watts Up With That.

Meanwhile, everyone is constantly bombarded with total garbage like Al Gore’s claim that Migrant Caravans are Victims of Global Warming.

And of course, the media is fawning all over AOC’s “Green New Deal” hype — as she, too, is a believer that the world will end in 12 years if we don’t address climate change.

The Guardian and The Intercept are both happy to promote this nonsense — as is, of course, the entirety of the mainstream media.

Alarm Bells

When I was in grade school we had major alarm bells over global cooling. In high school it was population growth. Then came food shortages followed by peak oil.

Now the crisis du jour is global warming.

It’s always about something!

CO2 Derangement Syndrome

Watts Up With That accurately labels global warming hysteria as the CO2 Derangement Syndrome.

This is an excellent synopsis of the current state of affairs — so please give it a good look.

Finally, even if you still believe man-made global warming is a threat — please ponder the notion that governments will not do anything sensible about it.

Posted in Center for Environmental Genetics | Comments Off on Excellent summary from several different scientific sources on global climate change

Key metabolic gene for recurrent freshwater colonization and radiation in fish

This story is a great example of gene-environment interactions. When organisms evolve — so that they might occupy a new environment — what adaptations in the genome are required for this transition? Authors [see attached article & editorial] have determined the precise mechanism explaining how a single adaptive genetic innovation has repeatedly been used to allow saltwater fish to colonize fresh water and diversify there. This is a creative study (combining ecology, physiology, and genetics) of a dietary adaptation. Authors used gene-insertion technology to demonstrate that increasing the number of copies of a single gene [the fatty-acid desaturase-2 gene (Fads2)] in a marine-adapted lineage of threespine stickleback enables the fish to survive on a fresh-water diet.

Fads2 encodes an enzyme that is crucial for fatty-acid synthesis; therefore, increasing the number of Fads2 copies in the fish genome compensates for the dietary lack of fatty acids — such as docosahexaenoic acid (DHA) — in fresh water. (Such fatty acids are apparently more abundant in salt water.) Stickleback fish harboring only one copy of Fads2 require a DHA-enriched diet to survive in fresh water. In contrast, the genomes of some lineages have evolved two copies of Fads2, thereby producing more fatty acid endogenously (i.e. no need for dietary supplementation), and these fish survive better on DHA-restricted diets. Authors engineered extra copies of Fads2 into single-copy stickleback, and they showed that this was sufficient to fulfill the nutritional requirement for freshwater survival.

All fresh-water stickleback populations — surveyed across three continents — appear to have been derived from an ancestor having at least one duplicated Fads2 gene. Moreover, some of these duplicated Fads2 genes encode different protein sequences (which could alter the adaptive function of the enzyme — in addition to increasing the number of copies produced). In addition to the stickleback, authors examined 48 other fish species having full genome sequences available. Even after controlling for evolutionary history, authors found that across all ray-finned fish, species with fresh-water populations have substantially more copies of the Fads2 gene than species having no fresh-water populations. This finding suggests that Fads2 gene duplications have played an important role in evolutionary transitions to freshwater diets — not just for multiple stickleback lineages but, more generally, for ray-finned fish.
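As an illustration only (hypothetical copy numbers, and a plain two-group test rather than the paper's phylogenetically corrected analysis), the cross-species comparison boils down to asking whether species with freshwater populations carry more Fads2 copies than strictly marine species:

```python
# Illustrative sketch only: invented Fads2 copy numbers, and a plain
# two-group test. The published analysis additionally controls for shared
# evolutionary history (phylogeny), which this naive comparison does not.
from scipy.stats import mannwhitneyu

freshwater_species = [2, 3, 2, 2, 3, 1, 2]   # species with freshwater populations
marine_only_species = [1, 1, 2, 1, 1, 1]     # species with no freshwater populations

stat, p = mannwhitneyu(freshwater_species, marine_only_species,
                       alternative="greater")  # H1: freshwater group has more copies
print(f"Mann-Whitney U = {stat}, one-sided p = {p:.3f}")
```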

Authors dated the timing of the original Fads2 duplication in present-day freshwater stickleback to 800,000 years ago; however, fossil evidence shows that stickleback had evolved to live in freshwater well before this time. Thus, it appears that the Fads2 story may be only the most recent chapter in a long history of fish transitions — both to, and from, fresh water. Authors also identified a mechanism underlying the adaptive copying of a pivotal genetic innovation such as Fads2.

Key genetic variation can be acquired through hybridization, or by gene duplication. Intriguingly, this article shows a very specific mechanism by which gene duplications can occur. Transposons (or “jumping genes”) are repetitive sequences that can insert themselves (and any DNA in between them) into other parts of the genome. Authors discovered that transposons are responsible for the multiple independent duplications of Fads2 in different fresh-water stickleback populations. This article is unusual in pinpointing an adaptive role for transposons that directly increase the number of copies of a key metabolic gene in a vertebrate (animal with a spine). No fish surveyed by the authors had more than three copies of the Fads2 gene. Although all fish originated in salt water, there are currently more species of ray-finned fish in fresh-water than in marine environments, and the vast majority of marine ray-finned fish species have freshwater ancestors that migrated back to saltwater. 😊

DwN

Science 31 May 2019; 364: 886-889 & editorial pp 831-832

Posted in Center for Environmental Genetics | Comments Off on Key metabolic gene for recurrent freshwater colonization and radiation in fish

Vision using multiple distinct rod opsins in deep-sea fishes

Why is this topic chosen for today’s GEITP email? Well, “light” is an environmental signal, and “genes” within the genome that respond to this signal are responsible for the “vision” phenotype (trait). During evolution, “vision” was of course extremely important in [a] finding food, [b] avoiding predators, and [c] finding mates for survival of the species. In fact, we know that “eyes” have evolved independently somewhere between 40 and 65 times! The simplest “eyes” (e.g. those in microorganisms) simply detect whether the surroundings are light or dark. In higher organisms, there are two fundamental “designs”: one in protostomes [insects, mollusks, segmented worms, spiders]; the other in deuterostomes [starfish, sea urchins, and all vertebrates (i.e. animals having a spine)].

When ancestors of cave fish, and certain crickets, moved into pitch-black caves, their eyes virtually disappeared over generations of disuse. However, fish — at depths greater than sunlight can penetrate — have developed a type of vision that is highly sensitive to the faint glow produced by other deep-sea organisms. This “super-vision” is now known to reflect an extraordinary increase in the number of genes for rod opsins (retinal proteins that detect dim light). Those extra genes have evolutionarily diversified to produce proteins capable of capturing every possible photon at multiple wavelengths — which could mean that, despite the darkness, these deep-sea fish can actually see in color.

At a depth of 1000 meters, the last trace of sunlight is gone. But now researchers realize there exists a faint bioluminescence from flashing shrimp, octopus, bacteria, and even fish. Authors [see attached article & editorial] have studied deep-sea fishes’ opsin proteins and found that variation in the opsins’ amino-acid sequences can change the wavelength of light detected; hence, multiple opsins make color vision possible. One opsin, RH1, works well in low light. Found in the eye’s rod cells, RH1 enables humans to see in the dark — but only in black and white.

By inspecting 101 fish genomes, authors [see attached article & editorial] found that three deep-sea teleost (all ray-finned fishes — except primitive bichirs, sturgeons, paddlefishes, freshwater garfishes, and bowfins) lineages have independently expanded their RH1 gene repertoires. Among these, the silver spiny-fin has the most opsin genes of any vertebrate (two cone opsins and 38 rod opsins). Spiny-fins express as many as 14 RH1 genes (including the most blue-shifted rod photopigments known) — which cover the (required) range of residual daylight, as well as the deep-sea bioluminescence spectrum. These data reveal molecular and functional evidence for recurrent evolution of multiple rod opsin–based vision in vertebrates. 😊

DwN

Science 10 May 2019; 364: 588-592 and pp 520-521 [editorial]

Posted in Center for Environmental Genetics | Comments Off on Vision using multiple distinct rod opsins in deep-sea fishes

Oligogenic inheritance of a human heart disease involving a genetic modifier

Noll, I stand corrected. 😊

Obviously — I am misusing the term “compound heterozygosity” (and I wondered about that, as I wrote it). Perhaps “tri-allelic recessive heterozygosity” is a better term. Or, as the authors have in the title of their publication, simply “oligogenic inheritance.”


DwN
Nebert, Daniel (nebertdw), Wed 6/19, 1:09 PM

Don, I don’t blame you; this might be considered as genetics at its most complex level. 😊

Let me try to condense this information into one crisp kernel —

The father has two variants (in his MRTFB and MYH7 genes) and is “affected but asymptomatic” (i.e. does not have the full-blown disorder). This trait, left ventricular noncompaction (LVNC), is expressed as a gradient, and the father does not show the serious disease (congestive heart failure) that one sees during infancy, when the trait is 100% penetrant.

The mother has one variant (in her NKX2-5 gene) and is unaffected (i.e. has a normal heart). The affected child was unfortunate to inherit all three of these variants from the two parents, which, when combined, resulted in the full-blown disorder of LVNC — manifested as severe congestive heart failure during the first two months of life.

This is an excellent example (in medical genetics) of “compound heterozygosity” (the condition of having two or more heterozygous recessive alleles, at different chromosomal loci, that can cause the genetic disease in a heterozygous state, when all necessary alleles are inherited in the same unlucky offspring). I don’t know why, but these are the kinds of things that medical geneticists get all excited about. 😉

DwN

Posted in Center for Environmental Genetics | Comments Off on Oligogenic inheritance of a human heart disease involving a genetic modifier

Study of multiethnic genomes identifies 27 genetic variants associated with disease

This article is a semi-lay summary of a publication soon to appear in Nature. As most of us would expect — the genomes of different ethnic groups exhibit ethnicity-specific alleles (variants in one or both copies of a gene) that help explain the prevalence, or absence, of a particular human complex disease or Mendelian disease in a specific ethnic group.
DwN

June 19, 2019

Study of multiethnic genomes identifies 27 genetic variants associated with disease

NIH-funded research highlights need for diversity in study populations, creates a comprehensive genomic toolkit for scientists.

In a study published in the journal Nature, researchers identified 27 new genomic variants associated with conditions such as blood pressure, type 2 diabetes, cigarette use and chronic kidney disease in diverse populations. The team collected data from 49,839 African-American, Hispanic/Latino, Asian, Native Hawaiian and Native American participants, as well as people who self-identified as other groups not defined by those categories. The study aimed to better understand how genomic variants influence the risk of developing certain diseases in people of different ethnic groups. The work was funded by the National Human Genome Research Institute (NHGRI) and the National Institute on Minority Health and Health Disparities, both parts of the National Institutes of Health.

In this study, researchers specifically looked for genomic variants in DNA that were associated with measures of health and disease. Everyone’s DNA sequence consists of the chemical bases A, C, G and T. Genomic variants occur at positions where one of those bases is replaced with another in some individuals. The team found that some genomic variants are specific to certain groups. Others, such as some related to the function of hemoglobin (a protein in the blood that carries oxygen), are found in multiple groups.
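As a toy illustration with invented genotype counts (not data from the paper), here is how a variant's alternate-allele frequency would be tallied per group; this is the basic quantity behind statements like "specific to certain groups":

```python
# Toy illustration (invented numbers): a genomic variant is a position
# where the base differs across individuals, and its alternate-allele
# frequency can differ sharply between groups.
genotype_counts = {
    # group: (hom_ref, het, hom_alt) counts at one hypothetical variant site
    "African-American":  (900, 95, 5),
    "Hispanic/Latino":   (980, 20, 0),
    "Native Hawaiian":   (700, 250, 50),
}

for group, (aa, ab, bb) in genotype_counts.items():
    n_alleles = 2 * (aa + ab + bb)        # each person carries two alleles
    alt_freq = (ab + 2 * bb) / n_alleles  # alternate-allele count / total
    print(f"{group:18s} alt-allele frequency = {alt_freq:.3f}")
```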

“There are scientific benefits to including people from different ethnic groups in research studies. This paper gives us a glimpse of how ethnic diversity can be harnessed to better understand disease biology and clinical implications,” said Lucia Hindorff, Ph.D., program director in the Division of Genomic Medicine at NHGRI and a co-author of the paper. “This paper represents an important comprehensive effort to incorporate diversity into large-scale studies, from study design to data analysis.”

Apart from finding new genomic variants, the study assessed whether 8,979 established variant-disease associations, discovered in European-ancestry populations, could also be detected in African-American, Hispanic/Latino, Asian, Native Hawaiian, and Native American populations.

Their findings show that the frequency of genomic variants associated with certain diseases can differ from one group to another. For example, a strong association was found between a new genomic variant and daily cigarette usage among Native Hawaiian smokers. However, this association was absent or rare in most other populations. Not finding the variant in all groups, despite large numbers of participants in each group, strengthens the argument that findings from one population cannot always be generalized to others.

A variant in the hemoglobin gene, a gene known for its role in sickle cell anemia, is associated with a greater amount of blood glucose attached to hemoglobin in African-Americans. The paper in Nature is the first to confirm this association in Hispanics/Latinos, whose shared ancestry is a mixture of European, African, and Native American ancestry.

Such an effort is vital because the vast majority of human genomics research uses data based mostly on populations of white European ancestry. For example, a separate study showed that, among 2,500 recently published human genomics papers, only 19% of the individuals studied were non-European participants.

Inclusion of non-European populations in studies is important because ethnicity may partly explain differences in vulnerability to diseases and in treatment effects. There may be genomic variants present in other ethnic populations that increase risk for diseases, but they would not be found if studies were done only on white European populations. Using genomic data from white Europeans to extrapolate to other populations may not accurately predict the disease burden carried by such groups.

The study is part of the Population Architecture using Genomics and Epidemiology (PAGE) consortium, which was formed in 2008, comprising researchers at NHGRI and centers across the United States. The paper in Nature on the study, led by researchers at the Icahn School of Medicine at Mount Sinai, the Fred Hutchinson Cancer Research Center, and other academic centers, is the result of work undertaken by the consortium within the last five years.

This is a benchmark study that addresses the need for new methods and tools for collecting and disseminating large and varied amounts of genomic data, in order to make the results clinically useful. “Ultimately, the PAGE study underscores the value of studying diverse populations, because only with a full understanding of genomic variations across populations can researchers comprehend the full potential of the human genome,” said Dr. Hindorff.

Through PAGE and subsequent studies, researchers will be able not only to distinguish genomic variants that are associated with diseases from those that are not, but also to understand how such associations differ across race and ethnicity. In turn, this improved understanding can be used to target and tailor new treatments to maximize benefit across multiple populations.

The National Human Genome Research Institute (NHGRI) is one of the 27 institutes and centers at the NIH, an agency of the Department of Health and Human Services. The NHGRI Division of Intramural Research develops and implements technology to understand, diagnose and treat genomic and genetic diseases. Additional information about NHGRI can be found at: www.genome.gov.

The National Institute of Minority Health and Disparities (NIMHD) is one of NIH’s 27 Institutes and Centers. It leads scientific research to improve minority health and eliminate health disparities by: conducting and supporting research; planning, reviewing, coordinating, and evaluating all minority health and health disparities research at NIH; promoting and supporting the training of a diverse research workforce; translating and disseminating research information; and fostering collaborations and partnerships. For more information about NIMHD, visit www.nimhd.nih.gov.

About the National Institutes of Health (NIH): NIH, the nation’s medical research agency, includes 27 Institutes and Centers and is a component of the U.S. Department of Health and Human Services. NIH is the primary federal agency conducting and supporting basic, clinical, and translational medical research, and is investigating the causes, treatments, and cures for both common and rare diseases. For more information about NIH and its programs, visit www.nih.gov.

NIH…Turning Discovery into Health®

Reference

Genetic analyses of diverse populations improve discovery for complex traits. Nature DOI: https://doi.org/10.1038/s41586-019-1310-4

Posted in Center for Environmental Genetics | Comments Off on Study of multiethnic genomes identifies 27 genetic variants associated with disease

Some thoughts on the utilization of polygenic risk scores (PRSs) for therapeutic targeting

These GEITP pages have often discussed genome-wide association studies (GWAS), in which a phenotype (i.e. trait — such as height or cancer) is selected by a research team, and then the genomes of hundreds or many thousands of subjects are searched — to see if any DNA loci can be identified as statistically significantly associated with that trait (rather than being a chance occurrence, i.e. a random event leading to a false-positive association). Note the word “searched,” because this is what GWAS generally are: a fishing expedition, hoping to find some unexpected gene (especially one associated with a complex disease such as type-2 diabetes, cancer or dementia).

Still, much clarification of studies involving genetic risk assessment is needed, if one wishes one day to implement personalized medicine. Genetic risk predictions are largely made with the intention of “positive prediction” of a disease state [i.e. the goal is to ascertain which individuals (among any ‘at-risk’ population) have the highest likelihood of developing a condition, or of progressing to a more severe state]. Thus, genome-wide polygenic risk scores (PRSs) have been generated — for coronary artery disease, atrial fibrillation, Crohn disease, type-2 diabetes, and breast cancer — in each case identifying a threshold above which a small percentage (i.e. a subset) of the population has a disease risk at least 3-fold higher than that of the general population.
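For readers who want the mechanics: a PRS is simply a weighted sum of a person's risk-allele counts, thresholded at some percentile. A minimal sketch, assuming simulated dosages and placeholder weights rather than real GWAS values:

```python
# Minimal sketch of how a genome-wide polygenic risk score is computed and
# thresholded. Weights and dosages here are simulated placeholders, not
# real per-variant effect sizes from any published GWAS.
import numpy as np

rng = np.random.default_rng(0)

n_people, n_variants = 10_000, 500
weights = rng.normal(0, 0.05, n_variants)             # per-allele log-odds weights
dosages = rng.integers(0, 3, (n_people, n_variants))  # 0/1/2 risk-allele copies

prs = dosages @ weights                               # weighted sum per person

# Flag, e.g., the top 2% of scores as the "high-risk" subset -- the kind of
# threshold above which studies report at least 3-fold disease risk.
cutoff = np.quantile(prs, 0.98)
high_risk = prs >= cutoff
print(f"{high_risk.sum()} of {n_people} people flagged as high-PRS")
```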

Because single-gene mutations with such a magnitude-of-effect are sometimes regarded as clinically actionable — yet affect a much smaller proportion of people — it has been argued that PRSs are now at the point at which it might be appropriate to integrate them into clinical care. At the very least, this might lead to encouraging high-risk individuals to meet with an appropriate medical specialist or (personally) to initiate behavioral change. More commonly, however, PRSs might encourage a course of preventive medication. As the costs of healthcare continue to increase, the impact on both the patient and the healthcare system comes into focus. The percentage of incidents prevented by pre-emptive treatment will be a function of the proportion of the population who are treated, and of the rate of favorable response to treatment — which itself may vary, possibly as a function of disease risk.

It has previously been argued that, because negative prediction is almost always more accurate than positive prediction — owing to the low ratio of cases to controls — the potential for using PRSs to identify low-risk individuals should be given more attention than it has received to date. This is because, if only the highest-risk individuals are treated, then most cases are not prevented; yet treating everyone is both prohibitively expensive and (given the possibility of adverse drug reactions) potentially harmful. Both relative and absolute risk can be used to assess efficacy of medications: relative risk focuses on the proportional reduction in the rate of incidents (e.g. from 5% to 4%, a 20% relative reduction); absolute risk focuses on the number needed to treat (NNT; e.g. a drop from 50 to 20 people who must be treated to prevent one incident). Both of these numbers ought to be considered when approaching the question of how to utilize PRSs in a manner that effectively focuses medical attention on the largest population with a high likelihood of effective response to therapy.
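A small worked example of that relative-vs-absolute distinction, using the same 5%-to-4% numbers as above:

```python
# Worked example of relative vs absolute risk; the 5% -> 4% rates echo the
# example in the text, and are illustrative only.
def risk_summary(rate_untreated: float, rate_treated: float) -> None:
    arr = rate_untreated - rate_treated  # absolute risk reduction
    rrr = arr / rate_untreated           # relative risk reduction
    nnt = 1 / arr                        # people treated per incident prevented
    print(f"ARR = {arr:.3f}, RRR = {rrr:.1%}, NNT = {nnt:.0f}")

risk_summary(0.05, 0.04)   # prints: ARR = 0.010, RRR = 20.0%, NNT = 100
```

Note how a 20% relative reduction can coexist with an NNT of 100, which is why quoting only one of the two numbers can mislead.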

Four variables are critical in making this assessment: the prevalence of the condition; the risk in each PRS-positive target group; the proportion of the population in that group; and the therapeutic response rate. The author [see attached article] reviews five settings — across a broad spectrum of chronic disease (opioid pain medication, hypertension, type-2 diabetes, major depressive disorder, and osteoporotic bone fracture) — considering, in each case, how genetic prediction might be used to target drug prescription. This leads to a call for more research designed to evaluate the genetic likelihood of response to therapy, and a call for evaluation of PRSs not just in terms of sensitivity and specificity — but also with respect to potential clinical efficacy.
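And a sketch of how those four variables combine. The numbers are hypothetical, and the calculation simplifies by treating everyone outside the target group as having baseline risk:

```python
# Sketch combining the four variables named above -- prevalence (baseline
# risk), relative risk in the PRS-defined group, the fraction of the
# population in that group, and the therapeutic response rate -- into the
# expected fraction of all incidents prevented. Hypothetical numbers.
def incidents_prevented(prevalence: float, relative_risk: float,
                        group_fraction: float, response_rate: float) -> float:
    """Fraction of all population incidents prevented by treating the group."""
    group_incidents = prevalence * relative_risk * group_fraction
    other_incidents = prevalence * (1 - group_fraction)  # simplification: RR = 1 outside group
    prevented = group_incidents * response_rate
    return prevented / (group_incidents + other_incidents)

# Treat the top 2% (3-fold risk) with a drug that works in half of them:
print(f"{incidents_prevented(0.10, 3.0, 0.02, 0.5):.1%} of incidents prevented")
```

Running this gives roughly 3% of incidents prevented, which illustrates the author's point that treating only the highest-risk subset leaves most cases unprevented.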

DwN

PLoS Genet Apr 2019; 15: e1008060

Posted in Center for Environmental Genetics | Comments Off on Some thoughts on the utilization of polygenic risk scores (PRSs) for therapeutic targeting

Tracking Humans and Their Microbiomes

As these GEITP pages have continued to emphasize: Multifactorial phenotypes — including complex diseases (e.g. type-2 diabetes, major depressive disorder), quantitative traits (e.g. body mass index, height), drug efficacy and adverse drug reactions, and responses to environmental toxicants (e.g. dioxin, cigarette smoke, arsenic) — reflect contributions of genetics, epigenetic effects, environmental factors, endogenous influences, and each patient’s microbiome. The latest breakthroughs in microbiome research are the topics of this GEITP email and the next one [see three articles & one editorial attached; another article, plus editorial, on the vaginal microbiome will follow].

Studies from the US National Institutes of Health Integrative Human Microbiome Project (iHMP) provide a resource of microbial- and human-derived data — tracking the progression of two diseases (ulcerative colitis, type-2 diabetes) and pregnancy that should help us understand host-associated microorganisms and their interactions with their human host [see 2nd article & 3rd attachment editorial]. The Human Microbiome Project has now been carried out over a 10-year period (encompassing TWO phases) to provide resources, methods, and discoveries that link interactions between humans and their microbiomes to health-related outcomes. The recently completed second phase, the iHMP, comprised studies of dynamic changes in the microbiome and host under three conditions: pregnancy and preterm birth; inflammatory bowel diseases; and stressors that affect individuals with prediabetes. These results begin to: [a] elucidate mechanisms of host–microbiome interactions under these conditions, [b] provide unique data resources (at the HMP Data Coordination Center), and [c] represent a paradigm for future multi-omic studies of the human microbiome.

Inflammatory bowel diseases (IBDs) — which include Crohn disease and ulcerative colitis — are complex diseases that are heterogeneous at the clinical, immunological, molecular, genetic, and microbial levels. As part of the iHMP, authors [see attached 1st article] followed 132 subjects for one year each to generate integrated longitudinal molecular profiles of host and microbial activity during disease flare-ups (as many as 24 time-points each; in total 2,965 stool, biopsy, and blood specimens). Authors provide a comprehensive view of functional dysbiosis (a microbial imbalance, or maladaptation — on, or inside, the body) in the gut microbiome during inflammatory bowel disease activity. Authors demonstrate a characteristic increase in facultative anaerobes [organisms that can live by aerobic respiration if O2 is present, but are also able to switch to fermentation (or anaerobic respiration) if O2 is lacking] at the expense of obligate anaerobes (organisms that die in the presence of O2), as well as molecular disruptions in microbial transcription (DNA → RNA; e.g. among clostridia bacteria), metabolite pools (e.g. acylcarnitines, bile acids, and short-chain fatty acids), and levels of antibodies in host serum. Periods of disease activity were also marked by increases in temporal variability, with characteristic taxonomic (different bacterial species), functional, and biochemical shifts. Finally, integrative analysis identified microbial, biochemical, and host factors central to this dysregulation. The study’s infrastructure resources, results, and data, which are available through the Inflammatory Bowel Disease Multi’omics Database (http://ibdmdb.org), provide the most comprehensive description, to date, of host and microbial activities in inflammatory bowel diseases.

To better understand the earliest stages of type-2 diabetes mellitus (T2D), authors [see 4th attachment] obtained samples from 106 individuals — healthy or prediabetic — over approximately four years and performed deep profiling of transcriptomes, metabolomes, cytokines (substances secreted by certain cells of the immune system, which have an effect on other cells), and proteomes, as well as changes in the microbiome. This rich longitudinal data set revealed many insights. First, healthy profiles are distinct among individuals — while exhibiting diverse patterns of intra- and/or inter-personal variability. Second, extensive host and microbial changes occur during respiratory viral infections and immunization; however, immunization triggers potentially protective responses that are distinct from responses to respiratory viral infections. Moreover, during respiratory viral infections, insulin-resistant participants respond differently than insulin-sensitive participants. Third, global co-association metabolomics analyses among the thousands of profiled molecules reveal specific host–microbe interactions that differ between insulin-resistant and insulin-sensitive individuals. Fourth, authors identified early personal molecular signatures in one individual that preceded the onset of T2D — including the inflammation markers interleukin-1 receptor antagonist (IL1RA) and high-sensitivity C-reactive protein (CRP), paired with xenobiotic-induced immune signaling. This study reveals insights into pathways and responses that differ between healthy vs glucose-dysregulated individuals during health and disease. These data provide an open-access resource to enable further research into healthy, prediabetic, and T2D states. Everything attached is there — for your bedtime reading pleasure. 😊

DwN

Nature 30 May 2019; 569: 641-648, 655-662 & 663-671

Posted in Center for Environmental Genetics | Comments Off on Tracking Humans and Their Microbiomes

The vaginal microbiome and preterm birth

As these GEITP pages have continued to emphasize: Multifactorial phenotypes — including complex diseases (e.g. type-2 diabetes, cancer, major depressive disorder), quantitative traits (e.g. height, body mass index), drug efficacy and adverse drug reactions, and responses to environmental toxicants (e.g. dioxin, cigarette smoke, arsenic) — reflect contributions of genetics, epigenetic effects, environmental factors, endogenous influences, and each patient’s microbiome. Some recent breakthroughs in intestinal microbiome research were the topics of yesterday’s GEITP email, whereas today’s GEITP topic concerns the vaginal microbiome (see attached article & editorial).

Microbiomes are located in many areas of the human body (ear, nose, mouth, umbilicus, skin, and vagina, as well as our intestine; various regions of skin even have distinctly different microbiota). The vagina houses one of the least diverse microbiomes in the human body. It is usually colonized by Lactobacillus species during the reproductive years; the acidic metabolites of these species maintain a low vaginal pH and low microbiota diversity, impeding colonization by acid-sensitive bacteria — including both gut aerobes (growing in O2) and anaerobes (growing without O2). As vaginal pH becomes more alkaline (e.g. due to infection), the vaginal environment becomes more permissive to colonization by more diverse and undesirable microbial communities; this gives way to bacterial vaginosis (BV). BV leads to lowered resistance to colonization by pathogens, including HIV. BV is also associated with pelvic inflammation, which during pregnancy can increase the risk of premature birth.

There also are individual (genetic) and ethnic variations in the threshold of harmful diversity. It is known, however, that microbiome diversity decreases during pregnancy; vaginal changes during pregnancy result in greater Lactobacillus dominance and decreased species diversity. Authors [see attached article] examined single time-points from 1,969 non-pregnant and 613 pregnant women and followed a cohort of 90 pregnant women. They found that early-pregnancy differences in the vaginal microbiome — linked to ethnicity — are altered by the gestational dynamics that occur in early pregnancy, with all pregnant women converging toward a Lactobacillus-dominated vaginal microbiome profile, with unevenly distributed species (fewer taxa dominating) and a simpler metabolic gene profile.

These data suggest that — in the case of a high-diversity vaginal microbiome during pregnancy — the microbiome composition could be manipulated to reduce adverse risks of pregnancy. Premature-birth-associated taxa were correlated with unwelcome pro-inflammatory cytokines (chemicals secreted by immune cells that evoke responses in other cells) in vaginal fluid. These findings highlight new opportunities for assessing the risk of preterm birth.

DwN

Nat Med June 2019; 25: 1012–1021 & News’n’Views, pp 882-883

Posted in Center for Environmental Genetics | Comments Off on The vaginal microbiome and preterm birth

Zeroing in on what actually motivates — learning and motivation

Dopamine is a neurotransmitter molecule that influences brain pathways involved in motivation, movement, reasoning/perception, and reward-driven learning. How dopamine contributes to such seemingly unrelated behaviors is the topic of this GEITP email. In fact, understanding the biochemical and genetic pathways by which learning, memory and motivation actually “work” is one of the biggest outstanding challenges for future research. Environmental signals that lead to activation of genetic pathways (responsible for motivation, movement, reasoning/perception, and reward-driven learning) fall within the purview of gene-environment interactions (at least, in the mind of these GEITP pages). 😉

Authors [see attached article & editorial] elucidate how dopamine release is regulated (in rat brain) to accomplish these different functions. Dopamine is produced by neurons located in the midbrain — in regions known as the ventral tegmental area (VTA) and substantia nigra pars compacta. The long efferent (outgoing) axons of these neurons extend to other parts of the brain — including the nucleus accumbens, dorsal striatum, and prefrontal cortex. Within these target sites, the axons branch extensively, like a “tree,” to form a structure known as an “arbor.”

The textbook description of dopamine-signaling suggests that activation of dopamine-producing neurons in the midbrain generates electrical signals that travel along these axons to their target regions, where they cause dopamine release — which is then transmitted throughout the regions covered by the axonal arbors. This concept is fundamental to current ideas as to how reward-based learning occurs: an unexpected reward leads to increased activity of dopamine neurons that is assumed to transmit a dopamine signal throughout the target regions to facilitate learning.

However, dopamine release in the target regions is more complicated than the textbook description (e.g. dopamine release can be regulated locally by neurotransmitters and other molecules). Furthermore, studies of dopamine neuron activity in animals (using an imaging approach to monitor dopamine neuron activity, or a microelectrode method to assess dopamine release) indicate that an unexpected reward can cause the predicted increased activity of the axonal arbor, as well as dopamine release in the nucleus accumbens.

Dopamine is famously associated with “reward” — but how, exactly? Authors compared spiking of VTA dopamine cells with nucleus accumbens dopamine release, during the same decision-making task. Cues — which would indicate an upcoming reward — were found to increase both the spiking and the release. However, nucleus accumbens core dopamine release also co-varied with dynamically evolving reward expectations, without corresponding changes in VTA dopamine cell-spiking. These intriguing data suggest that there is a fundamental difference in how dopamine release is regulated to achieve these two distinct functions: transmitted burst signals promote learning, whereas local control signals drive motivation. 😊
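To make that dissociation concrete, here is an illustrative sketch with synthetic signals (my own construction, not the authors' data or code): a cue-locked transient appears in both spiking and release, whereas a slow motivational ramp appears only in release, so the two measures correlate in one task epoch and decouple in the other:

```python
# Illustrative sketch with synthetic data: cue-evoked transients appear in
# both VTA spiking and accumbens dopamine release, whereas a motivational
# "ramp" appears in release only -- so correlation differs by task epoch.
import numpy as np

t = np.linspace(0, 10, 1000)                 # seconds within a trial
cue = np.exp(-((t - 2.0) ** 2) / 0.05)       # transient burst at the cue

spiking = cue + 0.05 * np.random.default_rng(1).normal(size=t.size)
ramp = np.clip((t - 4.0) / 6.0, 0, None)     # slowly rising reward expectation
release = cue + ramp + 0.05 * np.random.default_rng(2).normal(size=t.size)

early, late = t < 3.0, t > 4.0
print("cue epoch corr: ", np.corrcoef(spiking[early], release[early])[0, 1])
print("ramp epoch corr:", np.corrcoef(spiking[late], release[late])[0, 1])
```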

DwN

Nature 6 June 2019; 570: 65-70 & News’N’Views pp 40-42

Posted in Center for Environmental Genetics | Comments Off on Zeroing in on what actually motivates — learning and motivation

Let’s Teach Factual Climate Science In Schools

Published June 22, 2019 | By Daniel W. Nebert

Twenty-one Oregon teenagers have spent four years urging our federal government to take action on “climate change.” They found themselves back in court this past week, arguing their unprecedented lawsuit should move forward. To anyone who understands climate science, this lawsuit is nonsense.

“Climatology” is complicated. Almost all “climate scientists” are specialists in one area (physics, physical chemistry, mathematics, computer-modeling, geology, meteorology, oceanography). Like the “Seven Blind Men and the Elephant” parable — each focuses on one small part of the climate puzzle. The field of climatology is quite new; consequently, new discoveries continuously challenge prevailing wisdom.

Climate science is not being taught accurately in school. Teachers and parents should be instructing children as follows:

Ice-core data in Greenland and Antarctica over the past 800,000 years show that climate is cyclical; there are cycles within cycles. Earth undergoes “warming periods,” interspersed with “cooling periods.” The major Glacial-Interglacial Cycle — every ~110,000 years — reflects changes in Earth’s orbit around the Sun involving precession, axial tilt, and eccentricity.

Today we are in an Interglacial Period. The Last Glacial Period occurred from ~115,000 to ~12,000 years ago. From peak temperatures in the Holocene Warm Period ~6,000 years ago, ice-core data show a fairly steady temperature decline, punctuated by brief but notable warm periods — each lasting several centuries — that go by names reflecting the civilizations they supported: the “Minoan/Greek,” “Roman,” “Medieval,” and “Modern” Warm Periods [see Figure]. During the Medieval Warm Period (~950 to 1250 A.D.), Vikings colonized southwestern Greenland; grape-growing and wine-making existed in England, and even in Stockholm.

Do we see any “pattern” here, children? This is called “climate change.” “Climate” is measured over centuries. “Weather” is what TV and newspapers describe over days, weeks, months and years.

What happens when ocean waters warm? That’s right! Carbon dioxide (CO2), sequestered/dissolved in cold liquid, dissipates into the gas phase (i.e. our atmosphere), which accounts, in part, for rising global atmospheric CO2 levels. All animals take in oxygen (O2) and give off CO2; all plants take up CO2 and excrete O2. This is Earth’s cycle that maintains life.

In 1850, global atmospheric CO2 levels were ~285 ppm; today they have reached ~412 ppm. This is good news for Earth’s cycle of life, children, because plants are substantially “starved” at CO2 levels of 150-200 ppm; optimal growth occurs above 2,000 ppm. In fact, since the beginning of weather-satellite measurements (1979), we see that Earth has become ~15% “greener” — i.e. green plants are replenishing desert regions that had been lacking vegetation. This leads to more arable land for growing crops to feed Earth’s expanding human population.

For unknown reasons, atmospheric CO2 levels were as high as ~4,000 ppm during the Cambrian Period (541-485 million years ago), when animal and plant life left an abundant fossil record for the first time; levels reached ~5,000 ppm around 215 million years ago. These CO2 levels were harmless to animals at those times. Inside nuclear-powered submarines, CO2 levels are not to exceed 5,000 ppm — with no harm to sailors.

Much of the increase in atmospheric CO2 levels — from the pre-industrial ~285 ppm to today’s ~412 ppm — is likely caused by the burning of fossil fuels. Yet CO2 is beneficial, not the “pollutant” one might surmise it to be from the hysteria in the news media about the “evils” of CO2 and “carbon footprints.” Even more ridiculous is the fact that Oregon wishes to impose “carbon taxes” to “lower CO2 levels and save the planet.” Concerning that climate-change lawsuit — political indoctrination, and instilling false information and unnecessary fear in children, are serious forms of child abuse.

The Scientific Method involves generating a hypothesis and testing it against robust empirical data. The hypothesis of the United Nations’ Intergovernmental Panel on Climate Change (IPCC) is that “dangerous global warming results from humans’ greenhouse-gas emissions.” If changes in warming-cooling, CO2 levels, polar ice, sea levels, ocean acidity, and various weather indices all reflect natural variability over the last 12,000 years — then the IPCC’s hypothesis is disproved by the facts detailed herein.

Daniel W Nebert is professor emeritus at the University of Cincinnati and Cincinnati Children’s Hospital Research Center. He is semi-retired and lives in Clackamas County with his wife — to be nearer to their children and grandchildren.

Posted in Center for Environmental Genetics | Comments Off on Let’s Teach Factual Climate Science In Schools