This is L. Hamilton's Typepad Profile.
L. Hamilton
Recent Activity
Thanks for this post, Neven. The photo captures Drew's sense of fun and enthusiasm for his work. He brought new perspectives and energy when he came into the room, and of course outdoors as well. It was a sad shock to learn that we've lost him.
Commented Sep 22, 2016 on In memoriam: Andrew Slater at Arctic Sea Ice
Regarding Peter Wadhams, the 2015 Sea Ice Outlook (where he offered the same low prediction in the June, July and August cycles) illustrates his standing as an outlier among sea ice researchers. Of course, being an outlier does not mean one is wrong, but in that year Dr. Wadhams's prediction (0.98 million km2) was far off the mark.
Commented Sep 3, 2016 on 2016 Mega-Dipole at Arctic Sea Ice
D: "Some statistical models may be subject to improvement but how do you improve an heuristic model that fails?" Heuristic models are a catch-all category, of course, but in principle some of them can learn and improve. An interesting example of a heuristic prediction is the poll of 35 scientists at a polar prediction workshop that I mentioned upthread. The "collective wisdom" of workshop participants presumably integrated, informally, their experiences from past years. The mean of their informal predictions (made in May) was 4.14, which still looks reasonable even as some more formal predictions have been overtaken. Of course, replication is the real test. I'll be trying this again as opportunities arise, perhaps as soon as this month (looking toward 2017).
Commented Sep 3, 2016 on 2016 Mega-Dipole at Arctic Sea Ice
Wayne, the 400-odd predictions use a wide range of different input data and methods, including some from blog contributors here. Our paper classifies these into 5 broad types:
- heuristic
- statistical
- mixed statistical-modeling
- ice-ocean modeling
- coupled ice-ocean-atmosphere modeling
The statistical and coupled models performed somewhat better overall, but the differences between groups are not large. Basically our conclusion echoes something Don Perovich remarked last year at a conference (I'm paraphrasing): "Ice used to be easy to predict, there was always a lot. In the future it will be easy again, there won't be very much. Right now we're in between and it's hard."
Commented Sep 1, 2016 on 2016 Mega-Dipole at Arctic Sea Ice
Our forthcoming paper concludes: Thinning ice that is sensitive to summer weather, complicating prediction, reflects our transitional era between a past Arctic cool enough to retain much thick, resistant multiyear ice; and a warmed future Arctic where little ice remains at summer’s end. The 2016 season, which was not part of our analysis, seems to show that dramatically.
Commented Sep 1, 2016 on 2016 Mega-Dipole at Arctic Sea Ice
Rob Dekker: "I noticed before that simple (linear or quadratic or Gompertz-curve) projections tend to be no less accurate (measured in standard deviation) than the most complex GCM based projections, and I'm simply not sure what to make of that." Julienne Stroeve and I have a paper recently accepted for Polar Geography, which could hit the street later this fall. The paper, with the catchy title "400 Predictions," includes a comparison between the median SIPN Sea Ice Outlook predictions over 2008-2015 and three naive models: linear extrapolation, quadratic extrapolation, and "persistence" (guessing that this year will be the same as last). The median SIO predictions outperform all three naive methods, although sometimes not by large margins. This and other analyses in the new paper confirm what we concluded a few years before: "Sea ice prediction has easy and difficult years." For an un-paywalled, public-friendly version of the argument, including unique additional data from a contest to win ice cream, see: An abstract for our more formal Geophysical Research Letters paper is here: What all three papers report is that in years when September ice extent is near its long-term downward trend (climate), many different prediction methods look good. In years with abrupt excursions above or below that trend (weather), most prediction methods look bad. And looking good (or bad) in one particular year does not necessarily forecast how well your method will perform in the next.
Commented Sep 1, 2016 on 2016 Mega-Dipole at Arctic Sea Ice
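For concreteness, the three naive baselines mentioned above can be sketched in a few lines of Python. The extent series here is hypothetical (a made-up linear decline in million km2), purely to show how each baseline is computed, not the NSIDC record or the paper's actual data.

```python
# Sketch of three naive September-extent baselines: persistence,
# linear extrapolation, and quadratic extrapolation.
# The extent values below are hypothetical, for illustration only.
import numpy as np

def naive_predictions(years, extents, target_year):
    """Return (persistence, linear, quadratic) forecasts for target_year."""
    t = years - years[0]  # shift the time axis so polynomial fits stay well conditioned
    persistence = extents[-1]  # "this year will be the same as last"
    lin = np.polyval(np.polyfit(t, extents, 1), target_year - years[0])
    quad = np.polyval(np.polyfit(t, extents, 2), target_year - years[0])
    return persistence, lin, quad

years = np.arange(2000, 2016)
extents = 7.0 - 0.15 * (years - 2000)  # hypothetical declining trend, million km2
p, l, q = naive_predictions(years, extents, 2016)
# On this perfectly linear series, persistence gives 4.75 (last year's value),
# while linear and quadratic extrapolation both give 4.60 for 2016.
```

Because the toy data have no curvature, the quadratic fit collapses to the linear one; on real extent data the two can diverge noticeably, which is part of why "easy" and "difficult" years exist.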
At a Polar Prediction Workshop held at Lamont-Doherty last May, well ahead of the melt season, I ran an informal poll asking 35 attendees for their guess about the September mean extent. The average among those guesses, 4.14, turned out to be strikingly close to a simple quadratic projection summarizing the downward trend to date, 4.15. This result became a "heuristic" contribution to the SIPN Sea Ice Outlook this year. That graph seems worth reposting as food for thought, now at the start of September.
Commented Aug 31, 2016 on 2016 Mega-Dipole at Arctic Sea Ice
Following an almost-stable N and a 96k drop in S, global SIA on 1/31 is down to 14.48 million km2, now just 90k short of a record.
Sent a clean copy of the cycle plot to Neven, so hopefully that's forthcoming on the long-term graphs page. If anyone else wants the file, send me a note. Larry
Commented Jan 6, 2016 on A difference in nonsense at Arctic Sea Ice
"I'm looking at all 12 months, not just September." By coincidence I updated the 12-month cycle plot yesterday, following NSIDC publication of December 2015 monthly data. This shows the visible decline in both area and extent for every month of the year, 1979 to 2015.
Commented Jan 5, 2016 on A difference in nonsense at Arctic Sea Ice
If anyone is curious to know more about the history and ideas behind these ongoing Arctic-perception surveys, there's a brief sketch in the ARCUS newsletter Witness the Arctic from spring of this year (free): What's next? While those surveys will continue, and experiment with new questions next year, we have two non-survey Arctic projects involving different research teams this fall. One will update our 2014 retrospective analysis of the Sea Ice Outlook. This update will analyze data on more than 400 individual predictions over the course of 8 years. The second project looks at demographics and net migration in 43 Arctic Alaska towns and villages, including some that are said to be "on the front line of climate change" due to severe erosion problems. That project extends earlier work, including a paper on "Visualizing population dynamics of Alaska's Arctic communities" (free):
"I'm looking for a factual question that is 'politically charged', the answer of which would likely be at odds with the global warming enthusiasts mindset." I know you are, you can't let that go, but as I keep saying, I'm not. Or the reverse.
"If all the questions you asked were designed to highlight the ignorance of one subset of people" They were not; that is your projection. As the "Polar facts" abstract notes, "Analysis indicates that these facts subjectively fall into two categories: those that are or are not directly connected to beliefs about climate change." Having made this observation, the paper goes on to focus on facts in the second category, those that are *not* directly connected to beliefs about climate change. "Any questions like that in your study?" Yes, that's what the rest of the abstract describes.
"Taking this comment at face value (i.e. accepting that liberals do reject GMOs, vaccines and nuclear power - which I do not think is true, at least for sure liberals accept vaccines)" Ff, my remark was not a comment that should be taken at face value. It's a hypothesis that we tested and found false, as described in two un-paywalled papers. For a quick view it's worth just clicking on this graph, and the paper itself, which is reasonably public-friendly, is here:
[We've been asking the Arctic sea ice question since June of 2011] "So beating a dead horse?" No, watching to see whether things change. "And on the other hand I could hit you with a study that shows skeptics are more knowledgeable about science." Is this the study you're thinking of? There is also this study, which found different results on the point you mention, and includes a brief discussion of why their findings diverge.
"As this is a poll, where are the error bars?" Do-it-yourself approximation for the 95% confidence interval of a percentage: +/-100/sqrt(n), where n is the number of observations. So if you have 60% in a survey of 400 people, the confidence interval should be about +/-100/sqrt(400) = +/-5 points. Confidence intervals (calculated more precisely) are drawn for all data points in Figure 1 above. They would make Figure 2 unreadable, but if you wanted to work the overall values out just for fun, we have D 5%, I 7%, R 13%, T 24%, based on 3,795 interviews (design-based F test p < 0.001). Figure 3 shows just the most recent poll, 705 interviews, so you can work out confidence intervals from that. Again, design-based F test p < 0.001. "This looks to me like a poll where someone/group with a bias goes looking for confirmation." Look again. We've been asking the Arctic sea ice question since June of 2011, and had no reason initially to expect there would be strong political divisions on such a basic physical fact. If those political divisions vanished on the next survey, I'd be delighted. As for the Trump/Hillary supporters comparison in Figure 3, as I said, that came about by coincidence -- WMUR/CNN commissioned the political question on a survey that also happened to carry my long-running ice question. I was curious to see what would happen if you put those together, and the results were striking.
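For anyone who wants to check the rule of thumb, here is a small Python sketch comparing the +/-100/sqrt(n) shortcut against the standard normal-approximation 95% interval, using the 60%-of-400 example from the comment above:

```python
# Compare the quick +/-100/sqrt(n) rule of thumb with the usual
# normal-approximation 95% margin of error for a percentage.
from math import sqrt

def moe_rule_of_thumb(n):
    """Approximate 95% margin of error (percentage points) for sample size n."""
    return 100 / sqrt(n)

def moe_normal(pct, n):
    """Normal-approximation 95% margin for an observed percentage pct."""
    p = pct / 100
    return 1.96 * sqrt(p * (1 - p) / n) * 100

# 60% in a survey of 400 people:
rough = moe_rule_of_thumb(400)  # 5.0 points
exact = moe_normal(60, 400)     # about 4.8 points
```

The shortcut equals the normal-approximation margin at p = 50% (the worst case), so it is slightly conservative for any other observed percentage, which is why the two numbers nearly agree here.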
"Are you serious? Somehow asking a question about the Arctic Sea Ice is a question about 'basic relevant facts' (your words) yet a question about Antarctic Sea Ice is the definition of a 'gotcha' question?" No, I said that a question "crafted to expose the ignorance of one group of people rather than another" (your words) is the definition of gotcha. And you gave those words as your rationale for the Antarctic question. "I think I understand where you are coming from now....." Don't think so. Did you read the abstract?
"Democrats' delusions about GMOs or vaccines or whatever." Somewhat off topic, but since this point has come up several times in the comments -- as Magma and Joshua noted above, we've done several recent surveys testing whether conservative rejection of science on climate change and evolution has a mirror image in liberal rejection of science on GMOs, vaccines or nuclear power. The evidence so far has been that conservatives are less inclined to trust scientists on all of these topics; there's no sign of the liberal mirror image. For the short version, see this graphic: More about where that came from is in this not-paywalled report: Those are New Hampshire surveys. Another survey from Oregon (also not paywalled):
"asking whether or not ice extent in Antarctica is increasing/decreasing/staying the same is by no means a 'gotcha' question. it is just crafted to expose the ignorance of one group of people rather than another." Actually, a question "crafted to expose the ignorance of one group of people" is pretty much the definition of a "gotcha" question, and yes, you could slant those in any direction. I don't. While learning that some major factual questions (e.g., Arctic ice, CO2) elicit strong partisan responses, we've also identified other similarly major questions (e.g., is the North Pole on land or sea ice? The South Pole?) that get lots of wrong answers in a non-partisan way, and used those to construct a very simple, politically neutral polar knowledge scale. How that scale behaved was a key finding of the Polar Geography paper.
As for the 2014 blip in Arctic ice "recovery," that was briefly the meme in some circles, although based on a 2-year instead of 30-year time frame.
Thanks, Joshua, I had not seen those Tamino graphs. I've definitely seen the same pattern in our own data (as has Tony Leiserowitz in his, IIRC). From "A four-party view of US environmental concern": Research on US public concern about environmental issues finds ideology or political party are the most consistent background predictors. Party is commonly defined by three groups: Democrats, Republicans, and Independents. Here, using statewide New Hampshire survey data, we elaborate this approach to distinguish a fourth group: respondents who say they support the Tea Party movement. On 8 out of 12 science- or environment-related questions, Tea Party supporters differ significantly from non–Tea Party Republicans. Tea Party supporters are less likely than non–Tea Party Republicans to trust scientists for information about environmental issues, accept human evolution, believe either the physical reality or the scientific consensus on anthropogenic climate change, or recognise trends in Arctic ice, glaciers, or CO2. Despite factual gaps, Tea Party supporters express greater confidence in their own understanding of climate change. Independents, on the other hand, differ less from non–Tea Party Republicans on most of these questions—although Independents do more often accept the scientific consensus on climate change. On many science and environmental questions, Republicans and Tea Party supporters stand farther apart than Republicans and Independents.
"I'm sure you could have crafted any number of questions about other hot political topics - and depending on how you crafted them - you would have gotten equally wrong answers on both sides of the aisle." Of course you can craft "gotcha" questions slanted in any direction, but we're aiming for basic relevant facts, not tricky polling. An even more basic question on our most recent surveys asked whether atmospheric CO2 is increasing, with results similar to the Arctic ice one above. There is more detailed analysis of which polar questions have political predictors in our Polar Geography paper mentioned in the post. Many drivers of polar-region change originate in mid-latitude industrial societies, so public perceptions there matter. Building on earlier surveys of US public knowledge and concern, a series of New Hampshire state surveys over 2011–2015 tracked public knowledge of some basic polar facts. Analysis indicates that these facts subjectively fall into two categories: those that are or are not directly connected to beliefs about climate change. Responses to climate-linked factual questions, such as whether Arctic sea ice area has declined compared with 30 years ago, are politicized as if we were asking for climate-change opinions. Political divisions are less apparent with factual questions that do not suggest climate change, such as whether the North Pole is on land or sea ice. Only 38% of respondents could answer that question correctly, and even fewer (30%) knew or guessed correctly that melting of Greenland and Antarctic land ice, rather than Arctic sea ice, could potentially do the most to raise sea levels. At odds with the low levels of factual knowledge, most respondents say they have a moderate amount or a great deal of understanding about climate change. A combination of low knowledge with high self-assessed understanding characterizes almost half our sample and correlates with political views.
The low knowledge/high understanding combination is most prevalent among Tea Party supporters, where it reaches 61%. It also occurs often (60%) among people who do not believe climate is changing. These results emphasize that diverse approaches are needed to communicate about science with people having different configurations of certainty and knowledge.
As longtime ASIB readers may know, my colleagues and I have been tracking US public perceptions of Arctic change. This started with analysis of questions written by others for the nationwide General Social Survey in 2006 and 2010, then shifted to our own questions placed on another nationwide survey in...
Posted Oct 4, 2015 at Arctic Sea Ice
@Jim Hunt "Do you think I should now be a bit rueful about my 'projection NOT prediction' argument?" Well, no; a similar discussion has gone on within SIPN. The initial idea for the SIO was that all of these contributions should be explicitly viewed as "projections," meaning if...then statements that trace the implications of specific inputs and modeling methods. So they aren't claimed as "predictions," meaning what we think is going to happen. Two things work against that scientific distinction, however: 1) Many of the contributions are statistical rather than model-based, and in statistics "prediction" has a more general meaning, basically the calculated values on the left-hand side of your equation, as in a regression prediction; 2) Scientifically declaring these are "projections not predictions" will probably get lost in general discussion, because as one SIO founder put it, "If it walks like a duck...."
@Chris Reynolds "But I find it hard to believe Dr Wadhams is seriously suggesting 1 million this year." Yes, Wadhams's 0.98 projection is hard to figure out. As the June SIO report by SIPN observes: "Given current central basin sea ice conditions and the lack of projected atmospheric activity in the central basin (such as a high sea level pressure dipole pattern seen in previous years that is conducive to sea ice loss), there is no a priori information that would support record sea ice loss for summer 2015. Furthermore, for the September sea ice extent to fall below one million square kilometers would require over 100,000 square kilometers of loss each day from June 16th (when the information for this report was collected) until mid-September. Based on actual loss rates from the index of daily sea ice extent for 2007-2014, the likelihood that the next 90 days could sustain such a high rate of loss is less than 1 in 10 million."
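The report's "over 100,000 square kilometers per day" figure is easy to sanity-check. The mid-June extent used below (about 11 million km2, a typical value for that date) is an assumption for illustration, not a number taken from the report:

```python
# Back-of-envelope check of the SIO report's required loss rate.
# The starting extent is an assumed typical mid-June value, not from the report.
mid_june_extent = 11.0e6  # km2, assumed extent around June 16
target = 1.0e6            # km2, the "below one million" threshold
days = 90                 # roughly June 16 to mid-September

required_daily_loss = (mid_june_extent - target) / days
# about 111,000 km2 per day -- consistent with the report's
# "over 100,000 square kilometers of loss each day"
```

Even granting a somewhat lower starting extent, the required daily loss stays near or above 100k km2/day, which is why the report rates a sub-million September as so unlikely.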