HGNC NewsLetter, Summer 2023

Some of you will be interested in this Summer 2023 HUGO Gene Nomenclature Committee (HGNC) NewsLetter. As you may be aware, these update issues come out four times a year.
—DwN

Summer newsletter 2023

HGNC, VGNC, Newsletters · August 2023

HGNC is on the move…

Later in the year, the HGNC team will be moving from our current office within EMBL-EBI in Hinxton to the University of Cambridge Haematology Department on the Cambridge Biomedical Campus. Several members of the HGNC team are already employees of Cambridge University; those who are currently employed by EMBL-EBI will initially be granted visitor status and will transition to being employed by the university in due course. The HGNC and VGNC compute will remain at EMBL-EBI, meaning that our website users will not notice any change and the project will continue to be a collaboration between Cambridge University and EMBL-EBI. Watch this space for photos of our new office in future newsletters!
Update on genes with the ‘stable’ tag

We now have 3262 approved gene symbols marked with our stable tag, meaning we regard these symbols as unlikely to change in future. Genes that have recently had this tag added include the protein-coding genes ABRAXAS1 (abraxas 1, BRCA1 A complex subunit); FA2H (fatty acid 2-hydroxylase); HAAO (3-hydroxyanthranilate 3,4-dioxygenase); HNMT (histamine N-methyltransferase); HNRNPA0 (heterogeneous nuclear ribonucleoprotein A0); MLPH (melanophilin); and PRODH (proline dehydrogenase 1); plus the non-coding RNA gene RMRP (RNA component of mitochondrial RNA processing endoribonuclease).

Of the 141 genes that have had the stable tag added over the last three months, just 7 had their gene names changed as part of the pre-stabilisation review process. The names of BNC1 and BNC2 were updated from “basonuclin 1” and “basonuclin 2” to the more informative “basonuclin zinc finger protein 1” and “basonuclin zinc finger protein 2”; SHOX and SHOX2 were updated to replace the potentially pejorative “short stature homeobox” with the more neutral “SHOX homeobox”; and TUBGCP2, TUBGCP4 and TUBGCP6 were updated to replace “tubulin gamma complex associated protein” with “tubulin gamma complex component” to reflect more recent functional data on the encoded proteins. No gene symbols were changed.
Updates to placeholder symbols

The symbol and name of C4orf48 was updated to NICOL1 for “NELL2 interacting cell ontogeny regulator 1” after fruitful discussions with groups working on this gene. A 2023 Nature Communications paper published the symbol “NICOL” for “NELL2-interacting cofactor for lumicrine signalling”; the addition of the 1 makes a unique symbol for searching and the term “cofactor for lumicrine signalling” has been replaced with “cell ontogeny regulator” to reflect the wider function of the protein encoded by this gene.

The nomenclature of the FAM104 family was updated from FAM104A and FAM104B to VCF1, “VCP nuclear cofactor family member 1” and VCF2, “VCP nuclear cofactor family member 2”. The symbol VCF1 was used in a recent preprint with the name “VCP/p97 Cofactor FAM104 1”. We explained to the researchers that there is no need to reference FAM104 in the new gene names, as FAM104A is recorded as a previous symbol and is therefore still fully searchable. After discussions with two groups, the symbols VCF1 and VCF2 were agreed upon for both paralogs, along with the approved names “VCP nuclear cofactor family members 1 and 2”.
New gene groups

We continue to add new gene groups, with recent additions of the following families:

Thrombospondin family
Isthmin family
Methionine adenosyltransferase family

Gene Symbols in the News

In virus-related news, the gene product of BTN3A3 has been shown to provide defence against most avian influenza viruses by blocking viral replication in the nose, throat and lungs. All pandemic-causing influenza viruses studied thus far, and most human seasonal influenza viruses, are resistant to BTN3A3 — so this may provide a predictor of which zoonotic viruses are more likely to have the potential to cross over into humans in future.

In COVID news, people carrying two copies of the HLA-B variant known as HLA-B*15:01 are more than eight times more likely than the average person to experience asymptomatic COVID-19. This protection appears to be independent of other risk factors for COVID-19 severity. A different study has found an association between a SNP near the FOXP4 locus and the probability that an individual will develop long COVID. The FOXP4 gene is expressed in the lung and in immune-related cells.

A study that recruited people between the ages of 20 and 40 and subjected them to the “Cooper 12-minute run test” over the course of eight weeks has shown that a number of SNPs, including one at the ACTN3 gene, are associated with increased cardiorespiratory fitness as a result of endurance training.

GDNF gene therapy, already in trials for treating Parkinson disease, could one day be used to treat severe alcohol addiction. A study found that treating macaques addicted to alcohol with GDNF gene therapy resulted in increased dopamine expression and reduced the monkeys’ alcohol consumption by up to 90%. Because it requires brain surgery, GDNF gene therapy would be plausible only for the most severe human cases, if it were ever approved for this usage.

In further animal gene therapy news, feral female cats can be effectively sterilised by a single injection of the AMH gene into a muscle. The muscle then produces the AMH hormone at high levels, far higher than that naturally produced by feline ovaries. This treatment would prevent the need to perform stressful and expensive spaying operations on feral populations.

Posted in Center for Environmental Genetics | Comments Off on HGNC NewsLetter, Summer 2023

The Story of Our Universe May Be Starting to Unravel [Sept 2023]

This topic is well beyond our usual topics — but it bears thinking about. Perhaps something more than The Big Bang is involved in the beginning of our universe. Or, more mind-boggling, perhaps our universe is just one of an infinite number of universes. The problem is that the James Webb Space Telescope (JWST) is providing TOO much information for our tiny human brains to conceptualize. 😉

DwN

The Story of Our Universe May Be Starting to Unravel

Sept. 2, 2023

By Adam Frank and Marcelo Gleiser

Dr. Frank is an astrophysicist at the University of Rochester. Dr. Gleiser is a theoretical physicist at Dartmouth College.

Not long after the James Webb Space Telescope began beaming back its stunning images of planets and nebulae from outer space last year, astronomers, though dazzled, had to admit that something was amiss. Eight months later, based in part on what the telescope has revealed, it’s beginning to look as if we may need to rethink key features of the origin and development of the universe.

Launched at the end of 2021 as a joint project of NASA, the European Space Agency and the Canadian Space Agency, the Webb, a tool with unmatched powers of observation, is on an exciting mission to look back in time, in effect, at the first stars and galaxies. But one of the Webb’s first major findings was exciting in an uncomfortable sense: It discovered the existence of fully formed galaxies far earlier than should have been possible according to the so-called standard model of cosmology.

According to the standard model, which is the basis for essentially all research in the field, there is a fixed and precise sequence of events that followed the Big Bang: First, the force of gravity pulled together denser regions in the cooling cosmic gas, which grew to become stars and black holes; then, the force of gravity pulled together the stars into galaxies.

The Webb data, though, revealed that some very large galaxies formed really fast, in too short a time, at least according to the standard model. This was no minor discrepancy. The finding is akin to parents and their children appearing in a story when the grandparents are still children themselves.

It was not, unfortunately, an isolated incident. There have been other recent occasions in which the evidence behind science’s basic understanding of the universe has been found to be alarmingly inconsistent.

Take the matter of how fast the universe is expanding. This is a foundational fact in cosmological science — the so-called Hubble constant — yet scientists have not been able to settle on a number. There are two main ways to calculate it: One involves measurements of the early universe (such as the sort that the Webb is providing); the other involves measurements of nearby stars in the modern universe. Despite decades of effort, these two methods continue to yield different answers.
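For concreteness (the values below are the widely reported figures from the Planck CMB analysis and the Cepheid-calibrated supernova program, not numbers quoted in this essay), Hubble’s law and the two discrepant measurements can be written as:

```latex
v = H_0\,d \qquad \text{(recession velocity $v$ of a galaxy at distance $d$)}
```

```latex
H_0^{\text{early (CMB)}} \approx 67.4 \pm 0.5 \ \mathrm{km\,s^{-1}\,Mpc^{-1}}
\qquad \text{vs.} \qquad
H_0^{\text{local (Cepheids)}} \approx 73.0 \pm 1.0 \ \mathrm{km\,s^{-1}\,Mpc^{-1}}
```

That gap of roughly 8 percent is several times larger than either method’s stated uncertainty.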

At first, scientists expected this discrepancy to resolve as the data got better. But the problem has stubbornly persisted even as the data have gotten far more precise. And now new data from the Webb have exacerbated the problem. This trend suggests a flaw in the model, not in the data.

Two serious issues with the standard model of cosmology would be concerning enough. But the model has already been patched up numerous times over the past half century to better conform with the best available data — alterations that may well be necessary and correct, but which, in light of the problems we are now confronting, could strike a skeptic as a bit too convenient.

Physicists and astronomers are starting to get the sense that something may be really wrong. It’s not just that some of us believe we might have to rethink the standard model of cosmology; we might also have to change the way we think about some of the most basic features of our universe — a conceptual revolution that would have implications far beyond the world of science.

A potent mix of hard-won data and rarefied abstract mathematical physics, the standard model of cosmology is rightfully understood as a triumph of human ingenuity. It has its origins in Edwin Hubble’s discovery in the 1920s that the universe was expanding — the first piece of evidence for the Big Bang. Then, in 1964, radio astronomers discovered the so-called Cosmic Microwave Background, the “fossil” radiation reaching us from shortly after the universe began expanding. That finding told us that the early universe was a hot, dense soup of subatomic particles that has been continually cooling and becoming less dense ever since.

Over the past 60 years, cosmology has become ever more precise in its ability to account for the best available data about the universe. But along the way, to gain such a high degree of precision, astrophysicists have had to postulate the existence of components of the universe for which we have no direct evidence. The standard model today holds that “normal” matter — the material that makes up people and planets and everything else we can see — constitutes only about 4 percent of the universe. The rest is invisible stuff called dark matter (~27 percent) and dark energy (~68 percent).

Cosmic inflation is an example of yet another exotic adjustment made to the standard model. Devised in 1981 to resolve paradoxes arising from an older version of the Big Bang, the theory holds that the early universe expanded exponentially fast for a fraction of a second after the Big Bang. This theory solves certain problems but creates others. Notably, according to most versions of the theory, rather than there being one universe, ours is just one universe in a multiverse — an infinite number of universes, the others of which may be forever unobservable to us not just in practice but also in principle.

There is nothing inherently fishy about these features of the standard model. Scientists often discover good indirect evidence for things that we cannot see, such as the hyperdense singularities inside a black hole. But in the wake of the Webb’s confounding data about galaxy formation, and the worsening problem with the Hubble constant, you can’t be blamed for starting to wonder if the model is out of joint.

A familiar narrative about how science works is often trotted out at this point to assuage anxieties. It goes like this: Researchers think they have a successful theory, but new data show it is flawed. Courageously rolling up their sleeves, the scientists go back to their blackboards and come up with new ideas that allow them to improve their theory by better matching the evidence.

It’s a story of both humility and triumph, and we scientists love to tell it. And it may be what happens in this case, too. Perhaps the solution to the problems the Webb is forcing us to confront will require only that cosmologists come up with a new “dark” something or other that will allow our picture of the universe to continue to match the best cosmological data.

There is, however, another possibility. We may be at a point where we need a radical departure from the standard model, one that may even require us to change how we think of the elemental components of the universe, possibly even the nature of space and time.

Cosmology is not like other sciences. It’s not like studying mice in a maze or watching chemicals boil in a beaker in a lab. The universe is everything there is; there’s only one and we can’t look at it from the outside. You can’t put it in a box on a table and run controlled experiments on it. Because it is all-encompassing, cosmology forces scientists to tackle questions about the very environment in which science operates: the nature of time, the nature of space, the nature of lawlike regularity, the role of the observers doing the observations.

These rarefied issues don’t come up in most “regular” science (though one encounters similarly shadowy issues in the science of consciousness and in quantum physics). Working so close to the boundary between science and philosophy, cosmologists are continually haunted by the ghosts of basic assumptions hiding unseen in the tools we use — such as the assumption that scientific laws don’t change over time.

But that’s precisely the sort of assumption we might have to start questioning in order to figure out what’s wrong with the standard model. One possibility, raised by the physicist Lee Smolin and the philosopher Roberto Mangabeira Unger, is that the laws of physics can evolve and change over time. Different laws might even compete for effectiveness. An even more radical possibility, discussed by the physicist John Wheeler, is that every act of observation influences the future and even the past history of the universe. (Dr. Wheeler, working to understand the paradoxes of quantum mechanics, conceived of a “participatory universe” in which every act of observation was in some sense a new act of creation.)

It is not obvious, to say the least, how such revolutionary reconsiderations of our science might help us better understand the cosmological data that is baffling us. (Part of the difficulty is that the data themselves are shaped by the theoretical assumptions of those who collect them.) It would necessarily be a leap of faith to step back and rethink such fundamentals about our science.

But a revolution may end up being the best path to progress. That has certainly been the case in the past with scientific breakthroughs like Copernicus’s heliocentrism, Darwin’s theory of evolution, and Einstein’s laws of relativity. All three of those theories also ended up having enormous cultural influence — threatening our sense of our special place in the cosmos, challenging our intuition that we are fundamentally different from other animals, upending our faith in common-sense ideas about the flow of time. Any scientific revolution of the sort we’re imagining would presumably have comparable reverberations in our understanding of ourselves.

The philosopher Robert Crease has written that philosophy is what’s required when “doing more science may not answer a scientific question.” It’s not clear yet if that’s what’s needed to overcome the crisis in cosmology. But if more tweaks and adjustments don’t do the trick, we may need not just a new story of the universe but also a new way to tell stories about it.

Adam Frank (@AdamFrank4) is a professor of astrophysics at the University of Rochester and the author of the forthcoming book “The Little Book of Aliens.” Marcelo Gleiser (@MGleiser) is a professor of physics and astronomy at Dartmouth College and the author of “The Dawn of a Mindful Universe: A Manifesto for Humanity’s Future.”

Posted in Center for Environmental Genetics | Comments Off on The Story of Our Universe May Be Starting to Unravel [Sept 2023]

FIVE LATEST PAPERS ON THE LNT INVESTIGATION by ED CALABRESE

Thanks, Ed. I was “close” to, but one step removed from, knowing the extent of dishonesty and personality clashes that had transpired “before my time.” Professor Ernst Caspari was my Genetics advisor in college (at the end of the 1950s), and I was a member of one of Jim Crow’s NRC committees in the late 1970s with James V. Neel, MD, PhD. The air was sometimes thick with “something mysterious” that had happened, but no one ever volunteered any details — until your many articles on the Muller investigation. Now, all the pieces of the puzzle “have come together.” ☹ 😊

DwN

From: Edward Calabrese
Sent: Monday, September 4, 2023 4:39 AM

Dear Dan:

Here is another recent “LNT-Muller” paper [see attached]. In his Nobel Prize Lecture of December 12, 1946, Hermann J. Muller argued that the dose-response for ionizing radiation-induced germ cell mutations was linear and that there was “no escape from the conclusion that there is no threshold.”

However, a newly discovered commentary by Robert L. Brent (2015) indicated that Curt Stern, after reading a draft of part of Muller’s Nobel Prize Lecture, telephoned Muller, strongly advising him to remove the reference to the flawed, LNT-supportive Ray-Chaudhuri findings and encouraging him instead to be guided by the threshold-supportive data of Ernst Caspari. Brent indicated that Stern recounted this experience during a genetics class at the University of Rochester.

Brent wrote that Muller refused to follow Stern’s advice — thereby proclaiming support for the LNT dose–response in his Nobel Prize Lecture while withholding contrary evidence. This finding is of historical importance, since Muller’s Nobel Prize Lecture gained considerable international attention and was a turning point in the acceptance of the linearity model for hereditary and carcinogenic risk assessment of radiation and chemicals.

— Ed Calabrese

Over the past decade, GEITP has covered most of Ed Calabrese’s articles — as he has meticulously unraveled the entire fraudulent story of how the Linear No-Threshold (LNT) Model was arrived at in the mid-1950s, how it was based on erroneously interpreted Drosophila (fruit fly) studies by Hermann Joseph Muller (and others) in the 1930s and 1940s, and how the Nobel Prize in Physiology or Medicine in 1946 was awarded (nevertheless) to Muller “for the discovery of the production of mutations by means of X-ray irradiation” — when the “mutations” were, in fact, irradiation-induced DNA damage (breaks) and not mutations at all. ☹

This email summarizes Ed’s last five articles [see attached]; so, let’s call this email blog a Calabrese Festival. This catches me up on his constant barrage of new findings these past 2+ years. 😉

Paper 1 = “Ultra-low doses and biological amplification: Approaching Avogadro’s number” This paper describes evidence establishing that ultra-low doses of diverse chemical agents at concentrations from 10⁻¹⁸ M to 10⁻²⁴ M (i.e., approaching and/or less than 1 atom or molecule of a substance per cell, based on Avogadro’s constant, 6.022 × 10²³/mole) are capable of engaging receptor- and intracellular-signaling systems to elicit reproducible effects in a variety of species — from unicellular organisms to humans. Multiple experimental studies have shown that only one, or very few, molecules are needed to activate a cell and/or entire organism via cascade(s) of amplification mechanisms and processes.
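To make the “approaching Avogadro’s number” arithmetic concrete, here is a minimal back-of-the-envelope sketch; the ~2-pL cell volume is an assumed round number for a mammalian cell, not a figure from the paper:

```python
# How many molecules does one cell contain at a given molar concentration?
N_A = 6.022e23         # Avogadro's constant (molecules per mole)
CELL_VOLUME_L = 2e-12  # assumed ~2-picolitre mammalian cell (illustrative)

def molecules_per_cell(conc_molar: float) -> float:
    """Expected number of molecules inside one cell at the given concentration."""
    return conc_molar * N_A * CELL_VOLUME_L

for conc in (1e-12, 1e-18, 1e-24):
    print(f"{conc:.0e} M  ->  {molecules_per_cell(conc):.1e} molecules per cell")
# ~1e-12 M -> about 1 molecule per cell
# ~1e-18 M -> about 1 molecule per MILLION cells
# Any reproducible effect at such doses therefore implies enormous amplification.
```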

For example, ultra-low dose ligand exposure was able to activate both an individual cell, and ~3,000 to 25,000 neighboring cells on average, by about 50%. Such activation of cells and whole organisms typically displayed hormetic-biphasic dose responses*. These findings indicate that numerous diverse phylogenetic systems have evolved highly sensitive detection and signaling mechanisms to enhance survival functions, such as defense against infectious agents, responses to diverse types of pheromone communications (e.g., alarm, sexual attraction), and development of several types of cellular protection/resilience/survival processes. These data suggest that ultra-low dose effects may be far more common than had been previously recognized. Authors posit that such findings have important implications for evolutionary theory and ecological and systems biology, as well as clinical medicine.

*A hormetic-biphasic dose response means that a low dose of an environmental or endogenous chemical agent may trigger, in a given organism, the opposite response to that produced by a very high dose.
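As a rough illustration of the shape of such a curve, here is a minimal sketch; the two-term model and all parameter values below are hypothetical, not taken from the paper:

```python
import numpy as np

def hormetic_response(dose, stim_max=1.3, ec50_stim=1e-9, ic50=1e-6):
    """Hypothetical biphasic (hormetic) curve: a low-dose stimulatory Hill term
    multiplied by a high-dose inhibitory term; 1.0 = untreated-control response."""
    stimulation = 1 + (stim_max - 1) * dose / (dose + ec50_stim)
    inhibition = 1 / (1 + dose / ic50)
    return stimulation * inhibition

for dose in np.logspace(-12, -3, 10):  # molar
    print(f"{dose:.0e} M -> {hormetic_response(dose):.2f} x control")
# The output rises above 1.0 at low doses (stimulation), then falls well below
# 1.0 at high doses (inhibition) -- the classic inverted-U hormetic shape.
```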

Paper 2 = “How did Hermann Muller publish a paper, absent any data, in the journal Science? Ethical questions and implications of Muller’s Nobel Prize” This Letter-to-the-Editor reports the discovery of a 26 October 1927 letter of Hermann J. Muller concerning the owner and editor of the journal Science, which suggests an agreement that could have led to Muller’s publication in Science — absent any data — a publication that would have contributed both to his professional reputation and, perhaps, to his being considered for (and awarded) a Nobel Prize.

Paper 3 = “The Gofman-Tamplin Cancer Risk Controversy and Its Impact on the Creation of BEIR I and the Acceptance of LNT” The major public dispute between John Gofman and his colleague Arthur Tamplin, on one side, and the United States Atomic Energy Commission (US AEC), on the other — at the end of the 1960s and during the early 1970s — significantly impacted the course of cancer risk assessment in the US and worldwide. The challenging and provocative testimony of Gofman to the US Senate in early 1970 led to formation of the US National Academy of Sciences (NAS) Biological Effects of Ionizing Radiation I (BEIR I) Committee, in order to evaluate the accuracy of claims by Gofman and Tamplin that emissions from nuclear power plants would significantly increase the occurrence of genetic defects and cancers (we now know this is not true).

BEIR I recommended adoption of the linear non-threshold (LNT) dose-response model for the assessment of cancer risks from radiation exposures. The US EPA adopted this recommendation and generalized it to incorporate putative chemical carcinogens — thereby shaping cancer risk assessment over the next five decades. Despite the scientific limitations and ideological framing of their perspectives, Gofman and Tamplin are of considerable historical importance, because they had essential roles in the adoption of LNT by regulatory agencies.

Paper 4 = “Thresholds for radiation-induced mutation? The Muller-Evans debate: A turning point for cancer risk assessment” In 1949 Robley Evans published a paper in Science supporting a threshold dose response for ionizing radiation-induced mutation, contradicting comments of Hermann Muller during his 1946 Nobel Prize Lecture and subsequent presentations. Evans sent a final draft prior to publication to more than 50 leading geneticists/radiologists, including Muller; subsequent correspondence was generally extremely supportive — including letters from the radiation geneticists Curt Stern, James Neel and Donald Charles. Of interest is that Muller engaged in a dispute with Evans, with Evans dismissing Muller’s comments as containing “a few points of scientific interest, and many matters pertaining to personalities and prejudices.”

A cornerstone of the Evans threshold position was the study by Ernst Caspari, which was done under the direction of Curt Stern at the University of Rochester/Manhattan Project, and for which Muller was a paid consultant, thereby having insider knowledge of the research team, results, and internal debates. Muller published a series of articles after the Evans Science publication that tried to debunk the Caspari findings — claiming that Caspari’s “control group was aberrantly high,” which (he argued) made the threshold conclusion incorrect. [However, internal correspondence in 1947 between Muller and Stern reveals that Muller supported the use of Caspari’s “control group,” based on consistency with his own lab data.]

This correspondence shows that Muller reversed his position 3 years later, soon after the Evans publication. In that same 1947 correspondence with Stern, Muller also claimed that the mutational findings of Delta Uphoff, who was replicating the Caspari study, could not be supported because of “aberrantly low control group values,” only to reverse himself later to support the LNT model. The present paper links Muller’s threshold-rejecting/LNT-supporting actions to the timing of the debate with Evans concerning Evans’ use of the Caspari data to support the threshold model.

It is of historical significance that the duplicitous actions of Muller were rewarded, with his newly expressed reversed views becoming generally accepted (while his previously documented contrary views were hidden/remained private). At the same time, the marginalizing of the Caspari findings greatly impacted recommendations to support LNT by major advisory committees.

Paper 5 = “Muller mistakes: The linear no-threshold (LNT) dose response and US EPA’s cancer risk assessment policies and practices” This paper identifies six major conceptual scientific errors* of Hermann Muller and describes how these errors led to the creation of the linear no-threshold (LNT) dose response — which continues to be used worldwide for cancer risk assessments of chemical carcinogens and ionizing radiation.

This paper demonstrates the significant role that Muller played in the environmental movement, affecting risk assessment policies and practices that remain in force, even now, a half century after his death. This paper lends support to contemporary research that shows significant limitations of the LNT model for cancer risk assessment — thereby wasting billions of dollars of taxpayer money, because “false scientific conclusions” have been drawn from “government-policy rules” rather than from accurate cutting-edge science. ☹

*MISTAKE # 1: Major misunderstanding of evolution: Muller’s Mistake proved disastrous for risk assessment and society

MISTAKE # 2: Muller failed to induce gene mutation and did not deserve the Nobel Prize

MISTAKE # 3: Background radiation is an important cause of evolution

MISTAKE # 4: The creation of the LNT single-hit model

MISTAKE # 5: Total dose (piggy bank theory) — not dose rate (repair model) — predicts risk

MISTAKE # 6: Genetic load can be an important factor in the risk of species extinction

DwN

Pharmacol Res 2021; 170: 105738

Chem-Biol Interact 2022; 368: 110204

Medicina del Lavoro 2023; 114(1): e2023007

Chem-Biol Interact 2023; 382: 110614

Chem-Biol Interact 2023; 383: 110653

Arch Toxicol 2023 Sep 4; doi: 10.1007/s00204-023-03566-5. PMID: 37665363

Posted in Center for Environmental Genetics | Comments Off on FIVE LATEST PAPERS ON THE LNT INVESTIGATION by ED CALABRESE


WASF3 protein — that disrupts cells’ energy centers — might be a cause of chronic fatigue syndrome (CFS)

“How is this topic related to gene-environment interactions?” — GEITPers such as George Leikauf might ask. 😉 Well, viruses (or viral infections) are considered environmental signals, or stressors, to the patient or lab animal. Subsequently, after the viral infection, the phenotype (or disease) ME/CFS appears and remains in the host for long periods of time — or forever. This disorder appears to be closely related to the symptomatology that we see in Long COVID.

One suspected causal protein (WASF3) has now been identified, which is apparently activated to elicit the clinical symptoms. Alternatively, WASF3 may act downstream in the pathway (which includes activation of ER stress), with one or more upstream steps still remaining to be uncovered. 😊

DwN

WASF3 protein — that disrupts cells’ energy centers — might be a cause of chronic fatigue syndrome (CFS)

People living with myalgic encephalomyelitis/chronic fatigue syndrome (ME/CFS) struggle not only with extreme exhaustion and the cognitive problems known as brain fog, but with a profound lack of information about what causes their symptoms and how to treat them. Scientists have yet to pin down the biology underlying the condition, which affects up to 2.5 million people in the United States alone, according to some estimates.

Now, researchers have identified a protein that’s present at unusually high levels in the muscles of people with ME/CFS and that disrupts cells’ ability to generate energy. The findings, reported today in the Proceedings of the National Academy of Sciences, could point to new therapeutics for this condition and for illnesses that share similar characteristics, such as Long COVID.

Akiko Iwasaki, an immunobiologist at Yale School of Medicine who was not involved in the work, praises the research as “very well done” but cautions that the suspect protein is likely “a piece of the puzzle, as opposed to explaining the whole disease.” The findings suggest it could act as one of several “middlemen” between whatever sparks the illness and symptoms such as fatigue, she says.

Paul Hwang, a physician-scientist at the National Heart, Lung, and Blood Institute (NHLBI), and his colleagues initially set out to study a 38-year-old woman with a cancer-promoting mutation in a gene called TP53. Unlike her brother and her father, who shared this mutation, the woman (referred to as S1 in the study) was experiencing extreme long-term fatigue, although she hadn’t received a formal ME/CFS diagnosis.

Hwang’s team examined tissue samples from her muscle, looking for abnormalities in biochemical pathways related to TP53. That search discovered high levels of a protein called WASF3 [encoded by the WASF3 gene, member 3 of the WASP (Wiskott-Aldrich syndrome protein) family, on Chr 13q12.13]. WASF3 is known to play a role in a cell’s ability to migrate, Hwang says, but the team found a little-cited 2011 study of gene activity in ME/CFS patients that had also predicted it might contribute to that condition.

The NHLBI researchers wondered whether WASF3 was interacting with mitochondria, cellular compartments responsible for energy generation that have been suggested to malfunction in people with ME/CFS and Long COVID. Sure enough, by changing levels of WASF3 inside cultured cells from S1 as well as in other human and mouse cells, the team found the WASF3 protein could disrupt mitochondrial function. Specifically, high levels of WASF3 interfered with the assembly of mitochondrial proteins into molecular complexes that support normal energy production.

Hwang’s group next created genetically engineered mice to produce elevated amounts of WASF3. These animals also had defects in their mitochondrial function and were only able to run about half as far on a treadmill, compared with wild-type mice.

Curious as to whether these results might be relevant to people formally diagnosed with ME/CFS, the researchers compared muscle samples from 14 people living with the illness with those of 10 healthy individuals. They found higher average levels of WASF3—and lower levels of the associated mitochondrial protein complexes—in people with the condition.

“It’s extremely encouraging” to see this kind of detailed molecular approach applied to an understudied illness like ME/CFS, says Mady Hornig, a physician-scientist studying the condition at the Columbia University Mailman School of Public Health. Although the NHLBI researchers didn’t study Long COVID directly, their findings “stand to address a common set of health issues that are very tightly tied to disability in [both] Long COVID and ME/CFS,” she says.

Hornig (who has personally had Long COVID since 2020) adds that further work could try to address whether WASF3 also affects brain function. Deficits in brain energy metabolism may explain the cognitive fatigue that many ME/CFS patients find most debilitating, she says.

It’s not clear what causes high WASF3 levels in the first place. Hwang suggests a role for endoplasmic reticulum (ER) stress—a dysfunction of membranes that help the cell fold its proteins. Viruses can trigger ER stress, perhaps explaining why ME/CFS and related conditions often arise after infection. (S1 told Hwang her fatigue started after she had caught mononucleosis as a teenager.)

Several of the lab’s experiments support Hwang’s proposal: Both S1 and other people with ME/CFS had biochemical signatures of ER stress in their muscles, and treating S1’s cells in a dish with a drug that blocks ER stress lowered WASF3 levels and restored mitochondrial function. On the flip side, using toxicants to artificially induce ER stress in cultured cells, or in mice, caused a rise in WASF3 levels, Hwang says.

But more work is needed to understand this link, says Pere Puigserver, a cell biologist at Harvard Medical School. ER stress can itself be prompted by mitochondrial dysfunction, making it hard to pin down the order of events leading to fatigue, he says. WASF3’s multiple cellular roles mean it might have other effects in people with ME/CFS, too, he adds.

Hwang acknowledges there are likely to be other pathways causing fatigue in ME/CFS and Long COVID, and that the drivers of illness might be different for different people. His group is now looking at drugs that could put the brakes on ER stress or decrease WASF3’s effects on mitochondria, with an eye toward designing a clinical study.

doi: 10.1126/science.adk3119

COMMENTS:
Excellent points, George. Clearly, you are a good candidate for “taking over” this email blog when I’m gone. 😉
I could have added (before sending the original email to all GEITP-ers) that:

[a] after a viral exposure (such as RSV, or the Epstein-Barr virus that causes infectious mononucleosis), there is a huge gradient in response — from basically asymptomatic, to severe symptoms (due to underlying genetic differences);

[b] after the viral infection has been resolved, why does only a very small subset of patients develop chronic fatigue syndrome (CFS), while almost everyone else does not develop CFS or myalgic encephalomyelitis (ME)? It could be activation of WASF3, or a particular variant allele of the WASF3 gene, or (quite likely) one or more upstream genes in the network that ultimately leads to WASF3 activation.

[c] finally, can WASF3 be activated — without seeing the downstream effect of ER stress response?

This breakthrough study raises far more questions than it provides final answers. 😊
DwN
From: Leikauf, George
Sent: Tuesday, August 15, 2023 2:50 PM

Hey DwN,

ME/CFS, long COVID-19 and even other forms of chronic fatigue are clearly related to gene-environment interactions by my definition. This is because — given the same environmental exposure — individuals always vary in their response. The variation in response is very likely to be due to genetic differences (as well as co-morbidities).

That said, we need to know the reproducibility of this study, and why family members with the same mutation lack the phenotype; this suggests modifier genes.
Lastly, since we now know that the phenotype “height” is associated with 2,500+ gene variants, I suggest caution about jumping in and saying “a gene is causative,” unless it has strong penetrance. This leaves me questioning whether any disease is a “single-gene” disease.
Nonetheless, WASF3 might be a biomarker, if it shows sensitivity and specificity.
Thanks for calling attention to this protein.

Best wishes,
George, the purist.

Posted in Center for Environmental Genetics | Comments Off on WASF3 protein — that disrupts cells’ energy centers — might be a cause of chronic fatigue syndrome (CFS)

Life could have originated in Martian mud

Note the hexagonal shapes. Sometimes, when I have soapy water in the sink, the soap film on top forms these same hexagonal shapes. There is a thermodynamic explanation for these hexagons (lowest Gibbs free energy)… Allow viral particles to crystallize in a dish of saline water; they’ll do the same thing… 😊😊

Life could have originated in Martian mud

A cracked Martian surface

Hexagonal cracks discovered on Mars by NASA’s Curiosity rover could only have formed during long cycles of wet and dry conditions.

David Bowie once asked, “Is there life on Mars?” The short answer is probably not — the Red Planet we know today is an uninhabitable wasteland periodically wracked with dust storms. But the Martian landscape looked quite different billions of years ago, when liquid water cascaded through its rivers and lakes.

Now, NASA’s Curiosity rover has discovered patterns of hexagon-shaped cracks in an ancient, dried-out lake basin on Mars. They resemble patterns found on Earth in places like Death Valley, where they form after years of alternating wet and dry conditions that cause the ground to expand and contract. The discovery, reported this week in Nature, suggests that prehistoric Mars had an Earth-like climate, with long periods of wetting and drying.

This type of climate could possibly have allowed the chemistry of life to emerge on Mars, scientists say. Biological reactions use long chains of molecules called polymers, which require water to form. Too much water, however, will dilute the chemical “soup,” preventing molecular components from sticking together. Wet-dry cycling, which strikes a balance between the two conditions, “could possibly be key to the origin of life,” says lead author William Rapin, a planetary scientist at the Research Institute in Astrophysics and Planetology in France.

So perhaps Bowie should have been asking, “Was there life on Mars? And if there was, where did it go?”

Posted in Center for Environmental Genetics | Comments Off on Life could have originated in Martian mud

Deconvoluting gene and environment interactions to develop an “epigenetic score meter” of disease

The title of this paper (“Deconvoluting gene and environment interactions to develop an “epigenetic score meter” of disease”) seemed “so relevant” to our GEITP email blog that one of our devoted GEITP-ers suggested that we consider it for discussion. As you can see, most of the coauthors are “Chairs” of one or another department or division, and “Chairs” notoriously write essays in a flowery manner — with a lot of handwaving, smoke and mirrors (e.g., phrases such as “deconvoluting, decoding, unraveling, reduction of dimensionality” are tossed around with impunity). 😉

One general criticism that we in GEITP have is the emphasis (in many publications lately, throughout the world) on methylated DNA markers as THE epigenome. It is best to keep in mind that “epigenetic processes” include not only DNA methylation, but also RNA interference (RNAi), histone modifications, and chromatin remodeling. [Each of these can affect gene expression, and genetic differences can influence each of these epigenetic processes.]

Human health is determined by both genetics (G) and the environment (E). This has been shown time-and-time again — in these GEITP email blogs — in which subsets of individuals, exposed to the same dose of an environmental toxicant, exhibit widely varying responses. Authors [see attached] propose a quantitative measure of the gene–environment interactions (GEI) effect — which, of course, has not yet been developed. [Many of us would say the GEI effect (currently) is too complex to “reduce it” to any simple equation. Maybe next year, maybe another decade — and we might change our minds.]

Examples that the authors chose to discuss [see attached] include the following.

[a] “How often does cancer occur randomly” versus “How often does cancer occur due to the patient’s lifestyle?” [Many of us would also add a 3rd category of “cancer-prone syndromes”.]

[b] Epigenetics is highly likely to be involved in asbestos toxicity and cancer. An intriguing example is BRCA1-associated protein-1 (BAP1), which has emerged as a suspect in the context of asbestos-related diseases. More than 200 families worldwide, carrying germline mutations of the BAP1 gene, have been found to develop a condition known as the “BAP1 cancer predisposition syndrome.”

[c] Liver toxicity disorders are associated with (among many other factors) reactive oxygen species (ROS) formation, aldehyde production, and genetic differences in aldehyde dehydrogenase expression. Additional factors include effects on genome integrity, and epigenetic control of aldehyde production.

[d] It is likely that differences (large variability) in genetics, epigenetics, and environmental exposure play a large role in the cause of autoimmune disorders.

[e] It is highly likely that epigenetics is a very substantial contributor to the etiology of autism spectrum disorder (ASD). The surge in the frequency of ASD — from one in 10,000 in 1970, to one in 36 in 2023 — cannot be blamed solely on DNA mutations (genetics), or on changes in diagnostic practices, or on increased awareness. Recently, one intriguing susceptibility locus implicated in ASD risk is the CHD8 gene (chromodomain helicase DNA-binding protein 8), which encodes a subunit of the SNF2H-like ATP-dependent chromatin-remodeling factor CHD. Alterations in CHD8 expression have been shown to directly influence epigenetic regulation and the transcription of genes involved in neuronal development, and are associated with risk of ASD…!!

The authors’ idea assumes that all the relevant influences of genetics and environment are contained in the epigenome, and that this information will someday be “decodified” and correlated with a quantitative measure of disease risk — which they suggest calling the “epigenetic score meter.” Although this proposal appears “off-the-wall” at present, there is precedent for “the reproducible and quantitative decoding of extremely complex information” (e.g., the fast Fourier transform in the 1960s, which enabled monitoring of nuclear weapons tests and is now an important basis of the science of earthquakes). The authors’ hypothesis suggests that future studies should be aimed at the “deconvolution ☹ of GxE Interactions,” which will lead to the “quantification of all epigenetic effects.” 😊
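For a sense of what such a quantitative score could look like in practice, note that existing DNA-methylation “clocks” already compute weighted sums of CpG beta values; below is a minimal, purely hypothetical sketch of that pattern (the CpG IDs, weights, and intercept are invented placeholders, not values from any published score):

```python
# Hypothetical sketch of a weighted-CpG "epigenetic score," loosely patterned
# on how DNA-methylation clocks work. All CpG IDs and weights are invented.
WEIGHTS = {"cg0000001": 0.8, "cg0000002": -1.2, "cg0000003": 0.4}
INTERCEPT = 0.1

def epigenetic_score(betas: dict[str, float]) -> float:
    """Linear score over CpG methylation beta values (each between 0 and 1)."""
    return INTERCEPT + sum(w * betas[cpg] for cpg, w in WEIGHTS.items())

sample = {"cg0000001": 0.65, "cg0000002": 0.20, "cg0000003": 0.90}
print(f"epigenetic score = {epigenetic_score(sample):.2f}")  # 0.74
```

A real “epigenetic score meter” would, of course, need to incorporate far more than DNA methylation (RNAi, histone marks, chromatin state) and be rigorously validated against disease outcomes.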

DwN

EMBO Mol Med (2023) e18208

Posted in Center for Environmental Genetics | Comments Off on Deconvoluting gene and environment interactions to develop an “epigenetic score meter” of disease

Scientists need not necessarily increase overall sample size by default when including both sexes in in vivo studies

Prior to about 2010, there had always been a strong bias toward using a single sex in lab-animal research (one obvious reason for preferring males among adult laboratory animals is the added variability of females due to their estrous cycles). Although there is variation between subdisciplines, this strategy has tended to produce a heavy bias toward males. For example, in 2009, authors found that only 26% of studies used both sexes and that, among the remainder, there was a male bias in 80% of studies.

The negative consequences of these shortcomings on scientific originality are beginning to be better understood — as evidence emerges that our current fundamental biological knowledge base may be biased. For example, a recent report concluded that the fundamental molecular basis of pain is highly sex-dimorphic, yet much of our knowledge in this area has been derived from studies solely using male animals. This situation risks generating a knowledge imbalance that might persist through the research pipeline — ultimately manifesting in the clinic.

To improve the translation of results from animals to humans, there has been a push to include both male and female animals in studies. In fact, numerous funding bodies — including the NIH in the United States and the MRC in the United Kingdom — now have inclusion mandates. These policies do not require scientists to study differences between males and females per se, but rather aim to improve the generalizability of studies by calculating an average effect estimated from both sexes.

If, however, there is a large, meaningful sex difference in the treatment (or response) effect, studies should be designed in such a way that the visualization and analysis detect it. The NIH policy even introduced the term “Sex as a Biological Variable” (SABV). Authors [see attached] use the term to represent a sex-inclusive research philosophy that emphasizes the importance of automatic inclusion, with a focus on treatment- or response-effect estimates.

Any of a wide range of factors (including animal strain, age, and health status) could also be the focus of a movement to improve research generalizability. However, sex is a particularly pressing and timely target for improved representation because, clinically, females account for more than 50% of almost any population of interest but are currently largely overlooked.

Authors [see attached] conducted an in-depth examination of the consequences of including both sexes on statistical power. They performed simulations by constructing artificial datasets that encompass a range of outcomes that may occur in studies examining a treatment effect in both sexes; this included both baseline sex differences and situations in which the size of the treatment effect depended on sex, in either the same or opposite directions. The data were then analyzed using either a factorial analysis approach (which is appropriate for the design) or a t-test approach following pooling or disaggregation of the data (strategies that are common but erroneous).

Authors’ results demonstrated that, in most scenarios, there is no loss of statistical power to detect treatment effects when splitting the sample size across sexes, provided that the data are analyzed using an appropriate factorial analysis method (e.g., two-way ANOVA). In the rare situations where power is lost, the benefit of understanding the role of sex outweighs the power considerations. In addition, use of inappropriate analysis pipelines results in a loss of statistical power. Therefore, authors recommend, as a standard strategy, splitting the sample size across male and female mice and analyzing the data collected from both sexes using factorial analysis. 😊
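The published simulations are far more extensive, but a minimal sketch of the core idea (my sketch with illustrative effect sizes, not the authors’ code) might look like this:

```python
import numpy as np
import pandas as pd
from scipy import stats
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

rng = np.random.default_rng(0)

def simulate(n_per_group=10, treat_effect=1.0, sex_offset=2.0, sd=1.0):
    """One synthetic dataset: 2 sexes x 2 treatments, n_per_group animals per cell.
    sex_offset models a baseline sex difference; all effect sizes are illustrative."""
    rows = []
    for sex, s_off in (("F", 0.0), ("M", sex_offset)):
        for treat, t_off in (("ctrl", 0.0), ("drug", treat_effect)):
            for y in rng.normal(s_off + t_off, sd, n_per_group):
                rows.append({"sex": sex, "treatment": treat, "y": y})
    return pd.DataFrame(rows)

def p_factorial(df):
    """p-value for the treatment main effect from a two-way factorial ANOVA."""
    fit = smf.ols("y ~ C(sex) * C(treatment)", data=df).fit()
    return anova_lm(fit, typ=2).loc["C(treatment)", "PR(>F)"]

def p_pooled_t(df):
    """p-value from a t-test that ignores sex (a common but flawed strategy)."""
    groups = [df.loc[df.treatment == g, "y"] for g in ("ctrl", "drug")]
    return stats.ttest_ind(*groups).pvalue

n_sim, alpha = 300, 0.05
power = {"factorial ANOVA": 0, "pooled t-test": 0}
for _ in range(n_sim):
    df = simulate()
    power["factorial ANOVA"] += p_factorial(df) < alpha
    power["pooled t-test"] += p_pooled_t(df) < alpha
for method, hits in power.items():
    print(f"{method}: empirical power ~ {hits / n_sim:.2f}")
# With a large baseline sex difference, pooling inflates the error variance;
# the factorial model absorbs the sex effect and retains power.
```

The design point this illustrates: the factorial model “pays” almost nothing for including both sexes, because the sex term soaks up the baseline difference that would otherwise inflate the residual error.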

DwN

PLoS Biol June 2023; 21: e3002129

Posted in Center for Environmental Genetics | Comments Off on Scientists need not necessarily increase overall sample size by default when including both sexes in in vivo studies

Where Is This “Climate Crisis” That Activists Keep Talking About?

This is excellent and to the point — a simple analysis of the past four decades. However, the problem with “eco-anxiety” (in children, teens, and people in their 20s and 30s) is that it can cause increased anxiety, depression, post-traumatic stress disorder, lower birth rates, higher rates of suicide, substance abuse, social disruption (including increased violence), and a distressing sense of loss… ☹

Where Is This “Climate Crisis” That Activists Keep Talking About?


BY TYLER DURDEN

TUESDAY, FEB 14, 2023

Climate change hysteria has been an ongoing point of social contention since at least the 1980s. For the past 40 years, western countries have been relentlessly bombarded with global warming propaganda and predictions of an environmental cataclysm. Many people spent their formative childhoods and school years being indoctrinated with tales of oblivion: A world in which the oceans rise hundreds of feet and land masses are swallowed by the waves. A world in which exponentially rising temperatures create havoc with the weather as millions die from hurricanes, tornadoes, flooding and drought.

As many of us now know, all of these claims ended up being false. The glaciers and polar ice caps never melted. The land is not covered by the seas. The only famine today is the result of economic disaster, not climate disaster. And most endangered species have not disappeared from the planet. But climate scientists chasing billions of dollars in funding from governments and globalist think tanks still say the weather Apocalypse is coming; they were wrong for 40 years, but we should trust them now. The "debate is over," they say, and we must defer to the "experts."

But where is the evidence of this climate crisis that these well-funded scientists and activists keep talking about? Where are the weather effects? One can see the very tangible results of our ongoing economic crisis: inflation and high prices, floundering consumers relying on credit cards, mass layoffs in the tech industry spreading to other sectors, etc. People are experiencing the downturn, and they can witness the consequences for themselves. If the climate cult wants people to take them seriously, they will have to show some kind of visible proof that global warming is real and a legitimate threat.

The problem is, they have no proof, and so they are forced to dishonestly connect every single bad weather event to “climate change” as a means to frighten the public. Let’s look at the real weather data and see if supposedly dangerous man-made carbon emissions are somehow contributing to weather calamity.

The US is often cited as a primary carbon polluter (even though nations like China produce about 30% of global carbon emissions, while the US produces only 14%). Let's look at the track record of US weather data and see if we can find signs of impending disaster. If the problem is global, then it should be just as visible in US weather as in that of any other country.

How about hurricanes? Every time a hurricane hits the Gulf Coast, the mainstream media rants about climate change as the cause. But has there been a significant increase in hurricanes in the US? No, there has not, according to long-term data. Storms are forming at a rate consistent with the historic record.

What about major flooding events? Have there been more downpours and raging rivers? No, there have not. Flooding events are not happening at a greater frequency or severity today than in past decades. Even climate scientists are forced to admit that US and global flood damage has been in decline for decades; data on damage as a proportion of GDP show this.

Does this mean we are facing increasing drought conditions? Surely, global warming is causing significant damage through loss of rainfall? Nope, that’s not happening either. The worst droughts in recent US history occurred in the 1930s and 1950s.

Maybe we can see a noticeable shift in tornadoes and severe weather inland? Are there more deadly tornadoes today than in years past? No, there are not. In fact, dangerous tornadoes have been declining in frequency.

Climate change hysteria often relies on the theory of temperature "tipping points" as the basis for its arguments. Official temperature data go back only to the 1880s, giving us a tiny window through which to view climate and compare today's data with those of the past. According to NOAA, global temperatures have risen less than 1°C in 100 years. Proponents assert that it takes only a 1.5°C increase to trigger a "tipping point" event that could destroy the Earth as we know it. There is no evidence to support tipping-point theory, nor is there historic precedent. Certainly, there is no evidence in the weather, and skeptics are having a difficult time finding any indications that a catastrophe is on the horizon.

If anything, the data prove that man-made carbon emissions have no effect on weather events. So, if we are on the verge of global warming annihilation, it’s not because human industry caused it.

The truth is, climate change has become a religious ideology, an extension of Earth worship based on faith rather than facts. And like every religion, the climate cult needs an Apocalypse mythology, an end-of-the-world image to keep the flock in line. Every decade, they conjure up new tales of inevitable destruction unless we follow their rules and bow to their whims. It is a sad attempt to co-opt science as a tool for zealotry.


Over 300 COVID-19 Papers Withdrawn for Not Meeting Standards of Scientific Soundness

This latest article sums it up nicely. I had the feeling (by mid-2020) that "things got too frenetic during the 2020-2022 period" (i.e., science was superseded by hysteria and social-media buzz), which is why I declined invitations to coauthor any further COVID-19 publications.

—D

Over 300 COVID-19 Papers Withdrawn for Not Meeting Standards of Scientific Soundness

Jessie Zhang
May 26, 2023

Research journals have withdrawn well over 300 articles on COVID-19 due to compromised ethical standards and concerns about the publications’ scientific validity.

Retraction Watch has provided a running list of withdrawn papers on COVID-19 ranging from “Acute kidney injury associated with COVID-19” to “Can Your AI Differentiate Cats from COVID-19?”

To date, a total of 330 research papers have been retracted.

During the pandemic, some researchers compromised on ethical standards, trying either to get more publications approved or to take shortcuts around ethics, says Gunnveig Grødeland, a senior researcher at the Institute of Immunology at the University of Oslo, after going through the list of withdrawn articles and the reasons given for some of them.

While it is quite natural for some articles to be updated, or revised and published in a different form, some have been retracted because the researchers did not obtain informed consent during the research.

“It will, of course, be withdrawn when it is found that ethical guidelines have been breached,” Grødeland told Khrono, a Norwegian higher education and research newspaper.

She pointed out that other articles were withdrawn after editors noticed that the strategies the papers described were giving the media the wrong impression that they were recommended as actual treatments for, or prevention of, COVID-19.

She said these sorts of articles had to be withdrawn as they claimed things that neither the authors of the articles nor their institutions could vouch for.

In addition, some studies did not include a large enough sample size.

When more subjects were included, the researchers could no longer maintain their earlier conclusions about the effects of the drugs.

“A Little Out of Hand”

Grødeland said that part of the reason this happened during the pandemic was that many more people suddenly started conducting research on a topic they knew relatively little about.

Even prestigious journals such as The Lancet were publishing such articles.

One of The Lancet's studies even caused both the World Health Organization (WHO) and national governments to halt comprehensive testing of hydroxychloroquine's effectiveness against COVID-19.

The extensive Lancet study, allegedly based on fraudulent research, claimed that the drug increased the risk of cardiac arrhythmia and mortality in COVID-19 patients.

A screenshot of thelancet.com, taken Dec. 24, 2020, shows the retracted study that prompted some countries to ban the use of hydroxychloroquine to treat COVID-19.

However, most of the retracted papers were published in smaller journals, the vaccine researcher points out.

“When you look at the articles that have been retracted, the vast majority were published in the less interesting journals. It is they who are mainly affected by withdrawals,” Grødeland said.

But there were a number of groups that do not normally carry out research, which suddenly started producing studies after receiving funding from local hospitals.

“It may have caused things to get a little out of hand in some places,” she said.

Hearing Loss?

In a recent case, researchers from the University of Manchester backtracked on an earlier study that claimed COVID-19 was associated with hearing loss, tinnitus (ringing in the ears), and vertigo.

In the 2021 study, the researchers said they had identified about 60 studies reporting audio-vestibular problems in people with confirmed COVID-19.

"Our analysis of the pooled data, published in the International Journal of Audiology, reveals that seven to 15 percent of adults diagnosed with COVID-19 report audio-vestibular symptoms," said Kevin Munro, professor of audiology at the University of Manchester.

“The most common symptom is tinnitus followed by hearing difficulties and vertigo.”

Two years later, after the virus had been blamed for a range of health problems, including auditory disorders, the same university published a new study concluding that hearing loss is unlikely to be caused by COVID-19.

Hearing loss is unlikely to be caused by COVID-19, scientists now conclude.

Lead author and audiologist Anisa Visram explained their reasoning.

“We know that viruses such as measles, mumps, and meningitis can damage the auditory system,” Visram said in a release.

“It is also well known that COVID-19 can affect our sense of smell and taste, so it was reasonable to assume it might also affect our sense of hearing.”

Visram maintained that their current study is well designed and executed, and is the most thorough assessment of hearing conducted in people with COVID-19.

Munro also acknowledged that their earlier work may have been rushed.

“There was an urgent need for this carefully conducted clinical and diagnostic study to investigate the long-term effects of COVID-19 on the auditory system,” he said.

“Many previous studies were published rapidly during the pandemic but lacked good scientific rigour.”

"It hasn't been clear if these are incidental findings or if COVID-19 is damaging the hearing system," added Professor Richard Ramsden, a trustee of the Dowager Countess Eleanor Peel Trust.

“While the study cannot rule out infrequent hearing loss as a result of COVID-19, we now know that for most people, there is nothing to be concerned about.”

Jessie Zhang is a reporter based in Sydney, Australia, covering news on health and science.
