A THB Follow-Up: Climate Research Fails a Science Integrity Test
Something is rotten in academic publishing
rogerpielkejr.substack.com
EXCERPT: Today, I heard back from PNAS that the dataset and paper are without problems and the matter is now closed. I reproduce the full PNAS email to me below. Some highlights: Remarkably, PNAS did not actually examine the dataset in question...
[...] Don’t take my word for it — look at the evidence and the data and come to your own conclusions. For me, this episode is yet another example of how some (not all) climate science has a tendency to go off the rails. This is a particularly egregious example.
- - - - - - - - - - - - - - -
In Texas, ‘Junk Science Law’ Is Not Keeping Up With Science
The 2013 law allows for new trials in cases with flawed evidence. But the court has rejected most of those challenges.
undark.org
EXCERPTS: When the state legislature passed the junk science law in 2013 after two previous failed attempts, the measure was hailed across the nation as the first of its kind, with several states following suit and passing their own versions.
The law creates a procedural pathway for convicted people to obtain new trials if they can show that the underlying forensic evidence in their case was flawed, and that without that flawed evidence they likely would not have been convicted. Evidence that has formed the linchpin of junk science claims includes the now-debunked technique of bite mark comparison, new DNA evidence, and faulty cause-of-death determinations.
“The intention was to accommodate evolving science,” said Bob Wicoff, chief of the wrongful convictions division in the Harris County Public Defender’s Office. “Whereas the criminal justice system demands finality — they want cases to come to a close, and they want a judgment entered and pronounced and finality to be enacted — science keeps evolving.”
[...] “The law isn’t working as people believed it would,” said Estelle Hebron-Jones, director of special projects at the Texas Defender Service. “I don’t think it’s something that we can really get a gold star for, even though on paper that’s how it’s been presented.” (MORE - details)
- - - - - - - - - - - - - - -
Looking our limitations in the eye: A call for more thorough and honest reporting of study limitations
ABSTRACT: The replication crisis and subsequent credibility revolution in psychology have highlighted many suboptimal research practices, such as p-hacking, overgeneralizing, and a lack of transparency. These practices may have been employed reflexively, but upon reflection they are hard to defend.
We suggest that current practices for reporting and discussing study limitations are another area with much room for improvement. In this article, we call for more rigorous reporting of study limitations in social and personality psychology articles, and we offer advice for how to do this.
We recommend that authors consider the best argument against their conclusions (which we call the “steel-person principle”). We treat limitations as threats to construct, internal, external, and statistical conclusion validity (Shadish et al., 2002), and offer examples of better-practice reporting of common study limitations.
Our advice has its own limitations — both our representation of current practices and our recommendations are largely based on our own metaresearch and opinions. Nevertheless, we hope that we can prompt researchers to write more deeply and clearly about the limitations of their research, and to hold each other to higher standards when reviewing each other's work.
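
Editor's note: the abstract names p-hacking without unpacking it. As a purely illustrative sketch (not from the paper), the following Python simulation shows one common form, optional stopping: testing a null effect, peeking at the p-value after every batch of new observations, and stopping data collection as soon as p < .05. All parameter values are assumptions chosen for illustration.

import numpy as np
from scipy import stats

# Simulate "optional stopping": two groups drawn from the SAME population,
# tested repeatedly as data accumulate; we stop at the first p < .05.
rng = np.random.default_rng(0)
n_sims, max_n, peek_every, alpha = 2000, 100, 10, 0.05

false_positives = 0
for _ in range(n_sims):
    a = rng.normal(size=max_n)  # group 1: null effect by construction
    b = rng.normal(size=max_n)  # group 2: identical distribution
    for n in range(peek_every, max_n + 1, peek_every):
        if stats.ttest_ind(a[:n], b[:n]).pvalue < alpha:
            false_positives += 1  # a "significant" result that is spurious
            break

print(f"nominal alpha: {alpha}")
print(f"false-positive rate under peeking: {false_positives / n_sims:.2f}")

With ten peeks per simulated study, the realized false-positive rate lands well above the nominal 5 percent, which is why undisclosed flexibility in deciding when to stop collecting data undermines reported p-values.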