1/13/2008

Infamous Lancet Study Funded by George Soros

Right before the 2006 general election, The Lancet came out with a study claiming that 650,000 Iraqis had died in the war and its aftermath. The study got worldwide attention, much of it unquestioning. Well, it turns out that almost half of the study was "funded by the antiwar billionaire George Soros." My problem isn't the funding; my problem is that the source of the funding wasn't revealed when the study came out immediately before the election. What is even worse is that this fact is not considered newsworthy by most of the media now. A Google news search on the term "Lancet Soros study 650,000" got only six news hits (The Times of London, The Spectator in the UK, The New York Post, Fox News, Wired News, and one other minor source). Trying "Soros Lancet Iraq" got only 15 news hits, but some of those were columnists and not news stories.


12/20/2007

Steroids for Academics?

So if this is wrong for athletes, why isn't it wrong for academics? Will we soon have congressional hearings on this?

While caffeine reigns as the supreme drug of the professoriate, some university faculty members have started popping "smart" pills to enhance their mental energy and ability to work long hours, according to two University of Cambridge scientists who polled some of their colleagues about their use of cognitive-enhancing drugs.

In a commentary published in Nature on Thursday, Barbara Sahakian and Sharon Morein-Zamir revealed the results of an informal survey they conducted of a handful of colleagues who are all involved in studying drugs that help people perform better mentally.

Ms. Morein-Zamir said they asked "fewer than 10" colleagues in different fields who have done research on cognitive-enhancing drugs, such as Provigil, which is approved in the United States to treat narcolepsy and other severe sleep disorders. "We know that some people—academics—they could be philosophers or ethicists or people who do neuroscience, they chose to take some of these drugs," said Ms. Morein-Zamir.

The notion raises hackles in some parts of academe. "It smells to me a lot like taking steroids for physical prowess," said Barbara Prudhomme White, an associate professor of occupational therapy at the University of New Hampshire, who has studied the abuse of Ritalin by college students. Revelations about the use of performance-enhancing drugs in professional baseball have stirred public interest recently, and she sees parallels between athletes and assistant professors. "You're expected to publish and teach, and the stakes are high. So young professors have to work their tails off to get that golden nugget of tenure." . . .

Worries About Side Effects

. . . . For example, she notes, cheating the body of sleep suppresses the immune system and impairs brain functions. "There's no reason to believe that modafinil is protecting you from these really bad effects of long-term sleep deprivation," she said.

In fact, although cognitive-enhancing drugs have been on the market for decades (The Chronicle, June 25, 2004), it sometimes takes that long for side effects to become apparent. A major study published in August by the Journal of the American Academy of Child & Adolescent Psychiatry showed that children with ADHD who had taken stimulants grew less than did children with ADHD who did not take the drugs. . . . .

Unfair Advantage?

Even with such warnings, the allure of chemicals that confer an advantage may be hard to resist for academics, given the pressures they face. If there were a cognitive-enhancing drug that did not have side effects, said Ms. Prudhomme White, "would I be tempted? Damn right I would. ... Who wouldn't be?"

In their Nature commentary, Ms. Sahakian and Ms. Morein-Zamir asked people to consider whether and when cognitive-enhancing drugs are acceptable. While many people might agree that students should not be allowed to use such compounds during, say, a college-entrance exam, society might decide that it was worthwhile for surgeons or air-traffic controllers to use them. . . . .


12/14/2007

Dumb Academic Study: "Students Who Pull All-Nighters Have Lower GPAs"

ALBANY, N.Y. — Students who rely on all-nighters to bring up their grades might want to sleep on that strategy: a new survey says those who never study all night have slightly higher GPAs than those who do.

A survey of 120 students at St. Lawrence University, a small liberal arts college in northern New York, found that students who have never pulled an all-nighter have average GPAs of 3.1, compared to 2.9 for those who have. The study, by assistant professor of psychology Pamela Thacher, is to be included in the January issue of Behavioral Sleep Medicine.


Here is the problem: presumably those who are pulling the all-nighters are way behind in their studies, while those who are caught up don't have to be up all night. The real question is not how the grades of those who stay up all night compare with the grades of those who don't, but what grades those who stayed up all night would have gotten had they not done so. Those are two different questions.
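To make the selection problem concrete, here is a minimal Python sketch with entirely made-up numbers (not the St. Lawrence data): students who are further behind both have lower GPAs and are more likely to pull all-nighters, so the all-nighter group averages a lower GPA even though, in this toy setup, an all-nighter actually raises each crammer's own grade.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 120  # roughly the size of the survey discussed above

# Hypothetical setup: students who are further behind in their studies
# have lower baseline GPAs and are the ones who resort to all-nighters.
behind = rng.uniform(0, 1, n)                 # how far behind each student is
baseline_gpa = 3.4 - 0.8 * behind + rng.normal(0, 0.2, n)
pulls_all_nighter = behind > 0.5              # the struggling half crams

# Suppose an all-nighter actually HELPS the student who pulls one,
# raising that student's GPA by 0.1 relative to the counterfactual.
observed_gpa = baseline_gpa + 0.1 * pulls_all_nighter

print("never pulled one:", round(observed_gpa[~pulls_all_nighter].mean(), 2))   # about 3.2
print("pulled all-nighters:", round(observed_gpa[pulls_all_nighter].mean(), 2)) # about 2.9
```

The simple group comparison recovers the survey's 3.1-versus-2.9 pattern even though the all-nighter helped the students who pulled one, which is exactly why the two questions above are different.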


12/02/2007

A Question of Causality: Political Beliefs and Mental Health

Boy, what I would give for data that follows individuals over time. Here is the question: Does being dependent upon someone else make you more depressed, or does being depressed make you want to depend on the government more? Surely both of these claims could be true.

PRINCETON, NJ -- Republicans are significantly more likely than Democrats or independents to rate their mental health as excellent, according to data from the last four November Gallup Health and Healthcare polls. Fifty-eight percent of Republicans report having excellent mental health, compared to 43% of independents and 38% of Democrats. This relationship between party identification and reports of excellent mental health persists even within categories of income, age, gender, church attendance, and education. . . . .

But an analysis of the relationship between party identification and self-reported excellent mental health within various categories of age, gender, church attendance, income, education, and other variables shows that the basic pattern persists regardless of these characteristics. In other words, party identification appears to have an independent effect on mental health even when each of these is controlled for. . . . .(emphasis added)


11/07/2007

For everyone who has been avoiding the Sun to protect their skin: Sorry, following our advice means that you will die sooner

Obesity 'fuels cancer in women': Wrongly Scaring People

"Fuels cancer" sounds pretty bad. Surely the claim that just being "overweight" increases the risk of cancer is pretty scary. Back in graduate school one of the many points that I learned from Ed Leamer was there is an important difference between something that is statistically significant and something that is economically significant. The BBC claims that:

About 6,000 middle-aged or older women in the UK develop cancer each year because they are obese or overweight, a Cancer Research UK-funded study says. . . . .


Ironically, another study that just came out claims that simply being "overweight" does not raise one's chance of dying from cancer:

Being 25 pounds overweight doesn’t appear to raise your risk of dying from cancer or heart disease, says a new government study that seems to vindicate Grandma’s claim that a few extra pounds won’t kill you. . . . .


Well, let's just assume for a minute that the first scary claim is correct and also that the cancer is not the result of something that is also causing the women to be fat (e.g., depression may cause both obesity and cancer). According to the BBC article, 4.5 million women in the UK are between 50 and 64, and 57 percent of those are obese or overweight. That comes to 2,565,000 women. Of those women, if we believe these claims, 6,000 will get cancer of some type each year because of their weight (it would have been nice to know how many of them die from it, but that is another issue). That means that 0.23 percent of these overweight or obese women get cancer each year because of their weight. I wouldn't want to be in that less-than-a-quarter-of-one-percent group, but this doesn't sound like something to panic over. While there is no breakdown for the cancers "caused" by being overweight, about one-third of cancers generally are said to result in death, so if that holds for these cancers, it means that 0.078 percent of these women die from them each year, less than one-tenth of one percent.
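For anyone who wants to check the arithmetic, here is the same back-of-the-envelope calculation spelled out (the one-third fatality share is the rough assumption used above, not a figure from the BBC study):

```python
women_50_to_64 = 4_500_000            # BBC figure for UK women aged 50 to 64
share_overweight_or_obese = 0.57      # BBC figure
attributed_cancers_per_year = 6_000   # cancers the study attributes to excess weight

at_risk = women_50_to_64 * share_overweight_or_obese        # 2,565,000 women
annual_cancer_risk = attributed_cancers_per_year / at_risk  # about 0.0023
annual_death_risk = annual_cancer_risk / 3                  # assuming roughly 1 in 3 cancers is fatal

print(f"{annual_cancer_risk:.2%}")  # 0.23% per year
print(f"{annual_death_risk:.3%}")   # 0.078% per year
```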


11/06/2007

Russell Roberts Unable to Respond to the Answer that I gave him?

In a previous post Russell Roberts asked for a single example of a useful instrument in empirical work.

So again, my question to my better-read colleagues in the profession--give me an example of a statistical analysis that relies on instrumental variables, for example, that is done well enough, that is so iron-clad, that it can reverse the prior beliefs of a skeptic. . . . .


I wrote him back:

1) I don't often put a huge amount of weight on instrumental variables, but let me give one example from my own work on giving women the right to vote. The instrument there is whether states gave women the right to vote voluntarily or were forced to do so. We found that both types of states experienced a similar increase in government growth after women were given the right to vote. If it was simply increased liberalism by men that caused both suffrage for women and government growth, you should see the growth in states that reached a critical mass to voluntarily give women the right to vote, but not in the states that were forced to give them the vote.


After several responses back and forth I wrote Russell that: "The best that I can see in your response to what I wrote is 'So let me try again. Very few econometric analyses persuade skeptics.' You have a similar response on your blog. It does not appear to me that this is a helpful response, nor is it very direct. Explain why my responses, particularly the instrument regarding women's voting, don't meet your concerns. You asked for an example and I gave you one that used an instrument. I have yet to see one paper challenge the instrument that we used on women's voting, and you provide no logical objection."

(Cut)

Russell asks that I not print his reply.


11/01/2007

Roberts and Hanson Rounds II and III

Unfortunately, Russell Roberts seems unwilling to respond to my previous comments on his postings on empirical work. If he wants to talk about the weaknesses of empirical work, he should be specific. If he wants to talk about empirical work free from political biases, he should have the guts to point to specific examples where this bias exists and say how he would do it differently. What is this work missing? He cites Ed Leamer (a former professor of mine), but he ignores that the work he does discuss follows Leamer's recommendations. So what, then, would Roberts do differently?

Instead, Roberts tries to impose the obligation on those who might know the empirical literature better.

Robin is taking my observation about pragmatism and applying it to handguns. I didn't mean to. I brought up pragmatism in order to highlight the general dangers of excessive faith in reason. Assuming that econometric analysis always trumps an anecdote is an example of the potential dangers of econometric analysis. Yes, relying on anecdotes is lousy science. But lousy econometrics is lousy science, too. What my podcast with Ayres made me realize is that lousy econometrics may be the norm rather than the exception.

Such cynicism can come cheap. It also seems to leave us with anecdotes. Well, there's also common sense, intuition and general lessons gleaned from experience and empirical work that is less prone to manipulation.

So again, my question to my better-read colleagues in the profession--give me an example of a statistical analysis that relies on instrumental variables, for example, that is done well enough, that is so iron-clad, that it can reverse the prior beliefs of a skeptic.


OK, let me give Russell a response. These points and other similar ones are in my book Freedomnomics. The second point below is also in More Guns, Less Crime. I realize that Russell hasn't had time to read either book, but before he comments more on these types of empirical work, he might benefit from reading them.

1) I don't often put a huge amount of weight on instrumental variables, but let me give one example from my own work on giving women the right to vote. The instrument there is whether states gave women the right to vote voluntarily or were forced to do so. We found that both types of states experienced a similar increase in government growth after women were given the right to vote. If it was simply increased liberalism by men that caused both suffrage for women and government growth, you should see the growth in states that reached a critical mass to voluntarily give women the right to vote, but not in the states that were forced to give them the vote. (A minimal illustrative sketch of this comparison appears below.)

2) Regarding correlation and causation, that is precisely why some researchers try to provide many qualitatively different empirical tests. For example, with right-to-carry laws: 1) violent crime falls, 2) the size of the drop increases the longer the law is in effect because more permits are issued over time, 3) there are differences between violent crimes, where a criminal comes in contact with a victim who might be able to defend herself, and property crimes, where there is no contact (violent crimes fall relative to property crimes), 4) there are differences between different types of violent crimes (e.g., between murders generally and multiple victim public shootings, because the probability that someone present will be able to defend themselves is much higher in a multiple victim public shooting than when only one or two victims are present), 5) a comparison between adjacent counties on opposite sides of a state border, and 6) differential benefits across different types of victims.

Russell, try to come up with an alternative explanation for these different findings.
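For readers who want the logic of point 1 in concrete form, here is a minimal sketch on simulated data (purely illustrative, not the actual state panel from the suffrage paper): the test is simply whether government spending jumps after suffrage both in the states that adopted it voluntarily and in those forced to adopt it.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(1)

# Made-up panel: 48 states observed for 20 years, with suffrage arriving in year 10.
rows = []
for state in range(48):
    voluntary = rng.random() < 0.5   # adopted suffrage on its own vs. forced by amendment
    trend = rng.normal(0.02, 0.005)  # baseline growth rate of per-capita spending
    level = rng.normal(1.0, 0.1)
    for year in range(20):
        post = year >= 10            # suffrage in effect
        # In this toy world, suffrage raises spending about 15% in BOTH groups.
        spending = level * (1 + trend) ** year * (1.15 if post else 1.0)
        rows.append({"voluntary": voluntary, "post_suffrage": post,
                     "log_spending": np.log(spending)})
panel = pd.DataFrame(rows)

# Post-minus-pre jump in log spending, separately for voluntary and forced states.
means = panel.groupby(["voluntary", "post_suffrage"])["log_spending"].mean().unstack()
print(means[True] - means[False])
# If both groups show a similar jump, that undercuts the story that rising male
# liberalism, rather than women's votes, drove the growth in government.
```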

As a general comment, I am disappointed how vague Roberts' discussion is and how filled it is with platitudes.


Finally, Robin Hanson summarizes where Russell might be coming out on all this: "Russ finally answers as I'd originally expected: he relies on simpler clearer data and theory." I think that there is a lot of regression and empirical work that uses very simple approaches (see the above reference to women voting, or, I would argue, many issues involving concealed handguns, such as permit holders being extremely law-abiding and not posing a risk by themselves). So Russell, what do you have to say to that? In addition, Russell, I don't think that Ayres conceded anything to you on Friedman's Monetary History.


10/29/2007

Russell Roberts and Robin Hanson on the value of Empirical work

I don't know how one figures out what is right without looking at data. Introspection only gets one so far, and introspection is itself built, to some extent, on our relationship with data (at least broadly defined) at some point in our lives. I agree that a healthy sense of skepticism is necessary regarding empirical work, but Russell Roberts's recent discussions went beyond that.

Russell Roberts makes a pretty disappointing claim here that empirical data really only demonstrates the researcher's prior beliefs. Roberts then cites work on concealed handguns and LoJack devices as evidence for his claim. Even more interesting, he references Ed Leamer (a former professor of mine), whose work has largely been set up to take some of researchers' biases out of the research that they present. Roberts fails to note that in More Guns, Less Crime I used Leamer's approach for both the sensitivity of specifications as well as bounding measurement error. As an aside, I also think that it is important that people share their data as a check on the quality of research, even if others do not behave similarly.

Unfortunately, Russell isn't familiar with much of the debate over concealed handgun laws. I think that the refereed academic research on concealed handgun laws by economists has had an impact. For example, how many refereed academic papers by economists or criminologists claim that right-to-carry laws have statistically significantly raised accidental shootings, suicides, or violent crime rates? I know of none. If Roberts can point to one paper, he should do so. Even most of the relatively few papers that claim to find no statistically significant effects have most of their results showing a benefit from right-to-carry laws. For example, Black and Nagin's 1998 piece in the JLS shows that even after they eliminate about 87 percent of the sample (dropping counties with fewer than 100,000 people as well as Florida), there are still two violent crime categories (as well as overall violent crime, which they don't report) that show a statistically significant drop after right-to-carry laws are adopted. Mark Duggan's paper in the JPE is a similar example. Once the admitted typos in his Table 11 are fixed, most of the estimates show a statistically significant drop in violent crime. All but one of the results for murder show a statistically significant drop. Only one of the 30 coefficient estimates shows a statistically significant increase, and even that is because he is looking at just the before and after averages and not trends.

My question is this: before this research, how many academics would have believed that at least some refereed research would show that right to carry laws did not increase accidents, suicides, or violent crime rates? I think that most would believe that some would find these results.

Robin Hanson gets to the central question: What would Roberts use to make decisions if he isn't going to use empirical work?

Try saying this out loud: "Neither the data nor theory I've come across much explain why I believe this conclusion, relative to my random whim, inherited personality, and early culture and indoctrination, and I have no good reasons to think these are much correlated with truth."


Russell tries to respond here. In particular he states:

Where does that leave us? Economists should do empirical work, empirical work that is insulated as much as possible from confirmation bias, empirical work that isn’t subject to the malfeasance of running thousands of regressions until the data screams Uncle. And empirical work where it’s reasonable to assume that all the relevant variables have been controlled for. And let’s not pretend we’re doing science when we’re not.


Anyone can make such broad claims, and many frequently have. Be specific. You have gotten into a debate over particular laws. What is it that you think should have been done regarding right-to-carry laws? What should have been controlled for that wasn't? What combination of control variables should have been analyzed that wasn't?

Footnote: Regarding Duggan's main estimates in the paper discussed above, it is interesting that Guns & Ammo, the magazine he relies on in much of his paper, is the only one that shows the effect he claims, and that is because the magazine itself bought many copies of its issues in areas with increasing crime (places where it thought there was a demand for defensive guns) when sales failed to meet promises made to advertisers.


9/27/2007

Felix Oberholzer-Gee and Koleman Strumpf: It is time that real pressure be put on academics who refuse to share their data

I second Craig Newmark's remarks:

the authors of a massive--42 pages--lead article in one of the economics profession's top two journals--Journal of Political Economy--whose findings have been cited in the "New York Times, Wall Street Journal, Washington Post, Chicago Tribune, Boston Globe, USA Today, Financial Times, Rolling Stone, ABC Nightline, ABC World News Tonight, CNBC, BBC News, MTV, NPR, and Bloomberg Radio" have an extraordinary amount of responsibility. They should accept the burden of addressing careful, thoughtful criticism. They should, after a reasonable amount of time, freely share their data with other researchers so that their results can be studied, tested, and if need be, questioned.

Unfortunately, Felix Oberholzer-Gee and Koleman Strumpf, authors of "The Effect of File Sharing on Record Sales: An Empirical Analysis", don't seem to agree. Stan Liebowitz has sharply but carefully and thoughtfully attacked parts of the paper. So far he has had almost no response. And he would like to further examine the paper's main empirical results, but he has not been able, so far, to obtain the data. . . . .


Of course, this adds to others, such as Steve Levitt, Ian Ayres, and John Donohue, who have either been slow to share their data (measured in years after their research gets national attention) or have never shared it at all.

The irony is that Ian Ayres has written a book extolling the value of empirical work, yet he has done well-publicized work where he and his co-authors won't share their own data.


8/10/2007

Data error in recorded world temperatures exaggerated global warming

This is yet another case of government-funded data not being shared with other researchers. The warmest year in the last 100 is 1934, not 1998.

These graphs were created by NASA's Reto Ruedy and James Hansen (who shot to fame when he accused the administration of trying to censor his views on climate change). Hansen refused to provide McIntyre with the algorithm used to generate graph data, so McIntyre reverse-engineered it. The result appeared to be a Y2K bug in the handling of the raw data.

McIntyre notified the pair of the bug; Ruedy replied and acknowledged the problem as an "oversight" that would be fixed in the next data refresh.

NASA has now silently released corrected figures, and the changes are truly astounding. The warmest year on record is now 1934. 1998 (long trumpeted by the media as record-breaking) moves to second place. 1921 takes third. In fact, 5 of the 10 warmest years on record now all occur before World War II. Anthony Watts has put the new data in chart form, along with a more detailed summary of the events.

The effect of the correction on global temperatures is minor (some 1-2% less warming than originally thought), but the effect on the U.S. global warming propaganda machine could be huge. . . . .


Thanks very much to John Lazar for pointing this out to me.


7/06/2007

Among College Students, Men and Women talk about the same amount

6/10/2007

Something for those not yet wary of following the latest medical research

4/20/2007

Research paper on Multiple Victim Public Shootings

I have been getting a lot of requests for a copy of my research on multiple victim public shootings. You can download the paper here.

UPDATE: A debate on that work can be found here. Thanks to John Fund and Carl Bialik for sending me this link.

Two comments and responses:

1) "Prof. Lott wrote in an email that he counted less-severe incidents to get enough data for statistically significant results. He justifies his exclusion of gang murders because gun usage by chronic criminals "would not be directly affected by the passage of right-to-carry laws."

That seems to be precisely the reason to include them for a full picture of the effect of these laws. Of course, the complete picture frequently goes missing in this debate."

-- The point is that we tried and reported it many different ways (including the measure used by the New York Times, two or more murders, three or more murders, four or more murders, various combinations of those with injuries as well as injuries separately, and the number of attacks). If someone had a reasonable suggestion for a measure, we would have used it. Out of all those ways, one was not statistically significant, and for what I think is a very obvious reason: the basic model had almost as many control variables as events. My question is: why is it better to try only one measure and find that it is statistically insignificant, as the other paper you cite does? Besides saying it is too much work, what is their reason for trying only one measure? They also include crimes that, for reasons we lay out theoretically, will move the coefficient estimates toward zero.

2) "Grant Duwe, a researcher on the later study, said the news-archive approach was likely incomplete, because the media don't always give publicity to multiple shootings."

-- Duwe and co-authors had a choice between doing the costly and time-consuming news-archive search that we did or using government data that includes cases that should not have been included. An important question to ask in evaluating Duwe's answer above is: Does one really believe that there are many multiple victim public shootings involving non-gang members, in which four or more people are killed, that get no news coverage at all? These authors were sent our data before they did their paper and did not provide any evidence that our method missed the types of cases that we were concerned about. From having checked our data against the FBI reports (which we had done years before their work), I would argue that the only additional cases they picked up were shootings of one gang by another, something that can be interesting but is not particularly relevant for our estimates of the impact of right-to-carry laws.


4/10/2007

Illinois State Police rely on incorrect statistics regarding gun control

The Illinois State Police gives some amazingly bad advice that doesn't seem to be based on anything:

Use of a firearm to protect yourself or property is not recommended.

-Guns stolen from residences are a primary way of getting guns into the hands of criminals.
-Half of all the women that fire a gun trying to protect themselves shoot someone they do not want to, i.e. friend, neighbors, relatives, etc.


But for quite different information, see page 9 of this BJS report, which finds that 9.9 percent of criminals' guns came from theft or burglary. Guns that were purchased, obtained from family or friends, or rented or borrowed were all much more common sources.

I have called up and asked about both claims. At least on the second claim they had a source, but the source that they pointed to says that it did not provide that information. The state police were then called back to figure out where the number came from, but so far they have produced no source.

The website also advises women to fight their attackers:
Concentrate on these areas only when combating an assailant.

groin
eyes
ears
nose
throat


For information on the safest course of action see this link.

I was sent this Illinois State Police link a few days ago; I apologize for forgetting who sent it to me, but I did appreciate it.


3/26/2007

Trying to set the Police Executive Research Forum Straight

3/06/2007

The Lancet estimate of 650,000 Iraqis Dying a Fraudulent Claim?

The statistics made headlines all over the world when they were published in The Lancet in October last year. More than 650,000 Iraqis – one in 40 of the population – had died as a result of the American-led invasion in 2003. The vast majority of these “excess” deaths (deaths over and above what would have been expected in the absence of the occupation) were violent. The victims, both civilians and combatants, had fallen prey to airstrikes, car bombs and gunfire.

Body counts in conflict zones are assumed to be ballpark – hospitals, record offices and mortuaries rarely operate smoothly in war – but this was ten times any other estimate. Iraq Body Count, an antiwar web-based charity that monitors news sources, put the civilian death toll for the same period at just under 50,000, broadly similar to that estimated by the United Nations Development Agency.

The implication of the Lancet study, which involved Iraqi doctors knocking on doors and asking residents about recent deaths in the household, was that Iraqis were being killed on an horrific scale. The controversy has deepened rather than evaporated. Several academics have tried to find out how the Lancet study was conducted; none regards their queries as having been addressed satisfactorily. Researchers contacted by The Times talk of unreturned e-mails or phone calls, or of being sent information that raises fresh doubts. . . . .


2/11/2007

Lots of Problems with Wikipedia

2/07/2007

When Global Warming Meets Academic Freedom

Yet another reason that government should stay out of science debates: it can't keep politics out of the discussion.

Taylor has held the title of "state climatologist" since 1991, when the legislature created a state climate office at OSU. The university created the job title, not the state.

His opinions conflict not only with many other scientists, but with the state of Oregon's policies.

So the governor wants to take that title from Taylor and make it a position that he would appoint.

In an exclusive interview with KGW-TV, Governor Ted Kulongoski confirmed he wants to take that title from Taylor. The governor said Taylor's contradictions interfere with the state's stated goals to reduce greenhouse gases, the accepted cause of global warming in the eyes of a vast majority of scientists. . . .


I guess that I do object to the claim that Taylor's views are outside the mainstream of climatologists. Among them I think that he is pretty mainstream.

Thanks to Tom C. for sending this to me.


2/03/2007

Weather Forecasting still has a ways to go

Weather forecasters had been predicting a very wet, warm winter in Southern California. Instead, the region has had a record cold and dry winter. Just a thought, but people have a much greater incentive to get the current weather forecast right than a forecast for 50 or 100 years from now, simply because no one will remember what was said 100 years from now.

The National Oceanic and Atmospheric Administration on Thursday dramatically downgraded its forecast for a winter of warm El Niño rains . . .

Federal weather officials had been saying for months that the region would have a wet winter, but the Southland hasn't recorded significant rain since May. . . .

Some forecasters now believe the region is in for a record dry spell.

California was hit by a record heat wave that killed more than 100 people in the summer, and is just now emerging from a near-record cold snap that destroyed at least $800 million worth of crops and brought a dusting of snow to unexpected places, such as Westwood.


4/14/2006

"Attempts To Intimidate Scientists About Global Warming"

Clayton Cramer has a nice link to a discussion of the political pressure in the global warming debate.


3/21/2006

Hemenway and Co-authors Refuse to Provide Data Set From 1999

Previously, I complained that David Hemenway (Harvard) and co-authors would not give out the data for a recent study that they did on road rage, despite the fact that they had already published a paper in a journal and gone public talking to the media. Recently, however, I have asked for data from two other surveys, from 1996 and 1999, that Hemenway also conducted. The 1996 data is available at the ICPSR, but the data from the 1999 survey has not been released, and Hemenway is not responding to requests for information on even when the data will be released (I last asked on March 9th). It seems as though seven years, long after their study results have been published, is excessively long. The strategy that Hemenway seems to be following is to delay providing the data for so long that no one is able to critically comment on his research simply because the data is so old. Any concerns that anyone might have would be easy to resolve if the data were provided in a timely manner.


8/02/2005

Academics drag feet on giving out data

On June 23, Rep. Barton, chairman of the House Committee on Energy and Commerce, sent letters to the climate researchers responsible for developing the notorious “hockey stick” graph, which purports to show a dramatic rise in global temperatures during the 20th century after a millennium of supposedly little change in global temperature.

The hockey stick graph has been a key weapon in the arsenal of the global warming alarmists in their efforts to scare the U.S. into signing the Kyoto Protocol and clamping down on greenhouse gas emissions and energy use.

The graph has been criticized for many reasons, including its reliance on dubious estimates of historic temperatures based on the size of tree rings. Not only is temperature merely one factor that contributes to tree growth (as evidenced by the ring size), but a 15th century portion of the hockey stick graph is based on tree ring measurements from a single tree.

Noting that “sharing data and research results is a basic tenet of open scientific inquiry” and that the hockey stick research was paid for with public funds, Chairman Barton asked Dr. Michael Mann of the University of Virginia for the computer code used to generate the hockey stick graph. Dr. Mann had previously refused to provide his computer code to other climate researchers who had requested it.

Dr. Mann apparently decided that he cannot withhold his data and computer code any longer from the public and agreed in a letter to post his data and computer code on the Internet -- but not without squealing about it first. Before Dr. Mann turned over his data, virtually the entire spectrum of global warming alarmists attacked Chairman Barton for requesting access to the data and code.

The American Association for the Advancement of Science, long a proponent of global warming alarmism, chided Chairman Barton in a July 13 letter that Dr. Mann’s hockey stick had already been accepted by the United Nations’ global warming organization and that Congress ought not interfere with that process.
