Glacier’s Sperry Chalet slowly rising from the ashes
By JUSTIN FRANZ
Sunday, October 14
KALISPELL, Mont. (AP) — More than a year after it was gutted by a wildfire, Glacier National Park’s beloved Sperry Chalet is slowly rising from the ashes. This month, contractors are putting the finishing touches on the first and second floors of the dormitory building and the roof before wrapping up their work for the season.
If everything goes according to plan, the chalet will be completed in 2019 and will once again be welcoming guests soon after.
It’s an impressive resurrection following the devastating wildfire that gutted the century-old chalet built by the Great Northern Railway. On Aug. 31, 2017, the Sprague Fire made a sudden run toward the chalet, which had been evacuated weeks earlier. Despite the best efforts of five firefighters, an ember got inside the building and burned it from the inside out. What had taken more than a year to build in 1913 and 1914 took less than an hour to destroy.
Soon after the fire, Secretary of Interior Ryan Zinke announced that he wanted the historic wilderness chalet to be rebuilt. Last fall, Glacier employees conducted emergency stabilization work to ensure that the remaining stone walls would survive the winter. The National Park Service streamlined the permitting process for rebuilding the chalet and in June 2018 announced that it would spend $12 million reconstructing Sperry over the next two years.
Starting in July, laborers with Dick Anderson Construction of Great Falls began additional stabilization work at the chalet and the installation of new concrete footings. Once those projects were out of the way, they began to construct the first and second floors and a roof.
While crews hiked into the remote work site, materials were brought in by helicopter. By August, two rotating work crews of seven to eight people would hike in and work for eight days before hiking back out for six days off. Travis Neil, project manager for Dick Anderson Construction, said the Sperry project posed a number of challenges. Some of those were predictable, like the lack of easy access to additional supplies or communication with the outside world, but others were a surprise, Neil said.
“Those little marmots are a pain in the rear,” Neil told the Flathead Beacon.
Marmots, like other animals in the alpine region of Glacier National Park, like salt, so it was common for the animals to run off with sweat-stained shirts or rummage through the laborers’ tents. To combat the curious critters, small electric wires were placed around the tents to shock the marmots and keep them away. Glacier Park spokeswoman Lauren Alley said this is not the first time the park has had to resort to zapping critters to keep them from bothering people. Hot wires have been used to keep bears away from a sewage lagoon that was being pumped in Many Glacier and to keep swallows from nesting in the eaves of the St. Mary Visitor Center. The park also electrified a backpack at Avalanche Lake a few years back to teach a noisy bruin that it should stay out of visitors’ belongings.
Rob Terrio was a superintendent for one of the crews working on the chalet this summer. He said that while the marmots caused problems at the beginning of the season, they quickly lost interest in what the laborers were doing. He said that his crews never got tired of seeing wildlife, especially the goats.
“Everyone has that one job that they’ll remember for the rest of their life and in 27 years of doing construction I can’t think of a job as cool as this one,” he said. “I doubt I’ll be able to top it in the eight or so years before I retire.”
Plenty of other people were also at the Sperry Chalet site this summer. The National Park Service had a number of employees working on trails and Belton Chalets Inc., the concessioner for both Sperry and Granite Park Chalets, provided food services for the work crews. While the dormitory was nearly destroyed in last year’s fire, the old kitchen building escaped unscathed. Concessioner Kevin Warrington said it was a privilege to play a small part in rebuilding the chalet and to feed the workers. He said his cooks prepared everything from chicken stir-fry to pizza for the crews. Neil joked that his crews ate better in the middle of the woods than they do at home.
Crews are currently working to protect the recently installed roof from snow. Once it is protected, they will return home for the season. Alley said that design work on the chalet continues and that the park expects to put phase two of the project out for bid in spring 2019 in order to continue work in July of next year. When it is complete, the chalet will have one ADA-accessible room and feature a number of other modern improvements. Alley notes, however, that the chalet will still look and feel like it did before the fire.
Neil said his crews are optimistic that Dick Anderson Construction will win the contract to finish the chalet. Next year, proper roof shingling will be installed, along with windows, doors and interior walls.
Neil said it’s been the privilege of a lifetime to work in such a beautiful place.
“If you were ever having a bad day at that work site all you had to do was look around at the mountains and it would cheer you right up,” he said.
Terrio, who lives in Helena, said that while he had heard of the Sperry Chalet before it burned, he had no idea how important it was to the Glacier National Park community until this summer.
“It’s really humbling to be working on this and giving it back to the community,” he said. “This is a once-in-a-lifetime job.”
Information from: Flathead Beacon, http://www.flatheadbeacon.com
Dispatches from the morgue: Toxicology tests don’t tell the whole story of the opioid epidemic
October 15, 2018
Justin Wade Hubbard
Doctoral Candidate, Medical History, Vanderbilt University
Justin Wade Hubbard does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.
Vanderbilt University provides funding as a founding partner of The Conversation US.
“Drug overdoses killed more Tennesseans than ever last year, fentanyl deaths up 70 percent,” a recent headline from my hometown newspaper, The Tennessean, proclaimed.
Variations of this headline have become routine across the U.S. In June 2017, a reporter at The New York Times revealed that opioid overdose deaths in 2016 in the U.S. surpassed the peak number of car deaths, a record that had stood since 1972. Vox, an internet media outlet, announced that “in one year, drug overdoses killed more Americans than the entire Vietnam War did,” while CBS News claimed that “drug overdoses now kill more Americans than guns.”
These and similar dispatches from America’s morgues sound like an alarm bell. But what do all these dead opiate users actually tell us about the opioid crisis? Having studied the history of drug screens, I’d say not as much as we’d hoped, it turns out.
The world the screens make
Drug screens serve a number of clinical purposes. For clinicians in methadone programs, drug screens are an incomparable, albeit contentious, resource to monitor patient compliance. For pathologists and medical examiners, screens identify chemicals present in a corpse. However, clinical care is only one fraction of why these screens matter.
Epidemiologists, scientists who study populations of people to learn about disease and injury patterns, aggregate machine-assisted, post-mortem diagnoses into the data of public health. Policymakers weigh these stats in forming governmental interventions. Screens, then, form a foundation on which decisions about medical care and governmental responsibility rest.
But, where did drug screens come from, how do they work and how reliable are they in helping us address the opioid crisis?
Measuring drug addiction
The first narcotics screens emerged in the mid-1950s. My own unpublished research has turned up two tests that made up most drug screening: the Nalorphine Test and chromatography.
The Nalorphine Test, also called the Nalline Test, comprised two steps. First, subjects received an injection of an opiate antagonist, N-allylnormorphine.
Opiate antagonists are chemicals that sit on opioid receptors without activating them, essentially working the opposite of opiates. In the human body, antagonists induce withdrawal symptoms, including pupil dilation. After administering the antagonist, a clinician measured the pupil size against standardized circles – a ruler called the pupillometer.
Jailers and physicians were especially keen on this method. One physician remarked that “the test was designed to be and has been used as a club over the head of the addict whom no one should believe.”
Critics reaffirmed that the test was a club, describing the procedure’s painful induced withdrawals and its supposedly inexact methods. Accuracy was not paramount to the Nalorphine test. Its utility was forcing patients and prisoners alike to fear discovery.
A gold standard emerges
Chromatography involves separating a specimen – urine, blood, hair, even organs! – into its constituent chemicals.
Two types of chromatography exist and serve distinct goals. Thin-layer chromatography identifies the component chemicals in a specimen, while gas-liquid chromatography combined with a mass spectrometer (GLC-MS) both identifies each substance and measures its quantity.
Chromatography, unlike the Nalorphine test, found an early audience among toxicologists and chemists. The benefit of chromatography is its ability to quantify, and, supposedly, to render objective diagnoses.
Eventually, chromatography won out. GLC-MS remains the gold standard in drug testing. Insofar as GLC-MS measures the quantities of a given chemical, these screens work great. However, I remain skeptical of marshaling its results to understand the opioid crisis.
The pitfalls of a toxicological imagination
Drug screens aren’t just a means of diagnosing overdoses. They constitute a distinct mode of making and interpreting biological data using specialized laboratory measuring devices, a perspective I call the “toxicological imagination.” That perspective imports pitfalls into individual, and, by extension, aggregate cases alike.
First, GLC can never prove conclusively that this or that drug is responsible for an individual death. GLC belches out results in milligrams/milliliter, but the significance of these numbers is relative. And there is no universal lethal dosage. GLC-MS can’t account for individual tolerance levels, which affect the dose at which a drug becomes lethal.
Screens have to be juxtaposed against other data: patient history, anatomical and histological observation, and social setting of the death. Synthesis of all this data reinjects the human, and all of its subjectivity, into diagnosis.
Second, screens overemphasize misleading concerns, especially drug potency levels. Remember when we thought crack was going to kill us all because it was supposedly so much stronger than cocaine? Fentanyl currently sits on crack’s vacated throne in this regard.
When we evaluate the opioid crisis by confirmed overdose deaths, we advance the kinds of interpretations that colored reactions to, for example, crack.
An alternative to the toxicological imagination?
Instead, I think we need to discern the medical landscapes that turn an overdose into a mortality. What is the availability of Narcan, an opiate antagonist that reverses an overdose? Where is the nearest ER? How easily can drug users access in-patient rehab?
I choose these questions specifically to raise the point that when we see individual and aggregate deaths, or observe the potency of x, y or z drug, we miss out on the distal causes that produce an overdose death. Using overdose deaths or drug potency as a basis to address the opioid crisis is akin to responding to Hurricane Katrina knowing only its wind speed or inches of rain.
Let me be plain: I’m trying to say that drug screens, regardless of their sensitivity, can never reconstruct the social relations that underwrite individual mortalities.
Trump says climate change not a hoax, not sure of its source
Monday, October 15
WASHINGTON (AP) — President Donald Trump is backing off his claim that climate change is a hoax but says he doesn’t know if it’s man-made and suggests that the climate will “change back again.”
In an interview with CBS’ “60 Minutes” that aired Sunday night, Trump said he doesn’t want to put the U.S. at a disadvantage in responding to climate change.
“I think something’s happening. Something’s changing and it’ll change back again,” he said. “I don’t think it’s a hoax. I think there’s probably a difference. But I don’t know that it’s man-made. I will say this: I don’t want to give trillions and trillions of dollars. I don’t want to lose millions and millions of jobs.”
Trump called climate change a hoax in November 2012 when he sent a tweet stating, “The concept of global warming was created by and for the Chinese in order to make U.S. manufacturing non-competitive.” He later said he was joking about the Chinese connection, but in years since has continued to call global warming a hoax.
“I’m not denying climate change,” he said in the interview. “But it could very well go back. You know, we’re talking about over a … millions of years.”
As far as the climate “changing back,” temperature records kept by NASA and the National Oceanic and Atmospheric Administration show that the world hasn’t had a cooler-than-average year since 1976 or a cooler-than-normal month since the end of 1985.
Trump, who is scheduled on Monday to visit areas of Georgia and Florida damaged by Hurricane Michael, also expressed doubt over scientists’ findings linking the changing climate to more powerful hurricanes.
“They say that we had hurricanes that were far worse than what we just had with Michael,” said Trump, who identified “they” as “people” after being pressed by “60 Minutes” correspondent Lesley Stahl. When she asked, “What about the scientists who say it’s worse than ever?” the president replied, “You’d have to show me the scientists because they have a very big political agenda.”
Trump’s comments came just days after the Nobel Prize-winning Intergovernmental Panel on Climate Change issued a warning that global warming would increase climate-related risks to health, livelihoods, food security, water supply, human security and economic growth. The report detailed how Earth’s weather, health and ecosystems would be in better shape if the world’s leaders could somehow limit future human-caused warming.
Citing concerns about the pact’s economic impact, Trump said in 2017 that the U.S. will leave the Paris climate accord. The agreement set voluntary greenhouse gas emission targets in an effort to lessen the impact of fossil fuels.
On a different topic, Trump told “60 Minutes” that he’s been surprised by Washington being a tough, deceptive and divisive place, though some accuse the real estate mogul elected president of those same tactics.
“So I always used to say the toughest people are Manhattan real estate guys and blah, blah,” he said. “Now I say they’re babies.”
He said the political people in Washington have changed his thinking.
“This is the most deceptive, vicious world. It is vicious, it’s full of lies, deceit and deception,” he said. “You make a deal with somebody and it’s like making a deal with — that table.”
Taxing carbon may sound like a good idea but does it work?
October 15, 2018
Paul Griffin
Distinguished Professor of Management, University of California, Davis
Paul Griffin does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.
University of California provides funding as a founding partner of The Conversation US.
Exxon Mobil is backing a proposal to tax oil, gas and coal companies for the carbon they emit and redistribute the money raised that way to all Americans. It’s also giving US$1 million to a group urging Washington to enact a tax on carbon to advocate for this policy.
The carbon dividends plan, named after the former U.S. officials who conceived it – James Baker and George Shultz – reflects the research of Yale economist William Nordhaus, one of the two winners of the 2018 Nobel Memorial Prize in Economic Sciences.
Based on my research regarding how stock prices and greenhouse gas emissions are connected, I find it very encouraging to see an economist become a Nobel laureate for his climate change work. Even so, I am skeptical of the Baker-Shultz proposal.
In particular, I question whether it would either prompt Exxon Mobil and other big energy corporations to change their business priorities enough or force them to pay for their contribution to the steep costs of dealing with climate change.
On the one hand, economists argue that in theory taxing the companies that produce fossil fuels or the consumers who buy their products, or perhaps both, should curb the supply of and demand for oil, gas and coal. Presto. The carbon tax reduces emissions.
Depending on the model, the government either uses this revenue for a specific purpose, such as investing in renewable energy technologies, or distributes that money to the public to offset any hardship the tax may cause consumers.
However, economists have two hands. They also need to look at the details of any proposal and the accumulated evidence thus far so as not to repeat the mistakes of the past. Unfortunately, the findings and outlook for carbon taxes alone as a way to reduce emissions are not promising.
Carbon taxes are most prevalent in Europe, especially Scandinavia. Finland became the first country to adopt one in 1990, followed within a few years by Sweden, Norway, the Netherlands and Denmark and later by other European nations. More recently, governments in the Americas and Asia have followed suit, including some local ones in California and Colorado.
Studies, however, indicate that greenhouse gas emission reductions from carbon taxes have been mostly underwhelming.
Researchers generally use two approaches to draw this conclusion, by either building a “counterfactual” model of what the past experience would have looked like with no carbon taxes or by comparing emissions before and after the introduction of a tax with controls for reasons for emissions changes other than a carbon tax.
For example, a 2016 paper examining several studies of emission reductions in 16 countries and two Canadian provinces found an average reduction in carbon emission intensity and energy use of less than 1 percent per year. British Columbia, though, was at the upper end of the emission reduction scale, with emissions per capita falling by as much as 9 percent.
Perhaps the biggest challenge to make these plans work better is raising the per-ton tax to reflect new and higher forecasts for the future costs of climate change. These estimates will likely skyrocket within 25 years into hundreds of dollars per ton of carbon if the world is to keep the increase in global temperatures to less than 2 degrees centigrade compared to pre-industrial times, and an effective tax would need to be even higher for maximum warming of 1.5 degrees.
That is far higher than the current average of about $20 per ton.
I have sought in my own research to estimate the toll taken on stock prices for every ton of carbon. My findings suggest that in 2012 capital markets were pricing the cost of carbon at close to $80 per ton. This penalty imposed by the financial marketplace, a guide to what a carbon tax should be, would be higher today if adjusted for inflation.
Given that about half of Americans don’t see addressing climate change as an urgent priority, I believe U.S. voters would find taxes based on carbon costs that high unacceptable, making a potentially effective tax policy politically difficult to implement.
To their credit, the proposal from Baker and Shultz does have some sensible safeguards. For example, it would tax imports from countries without carbon taxes, and it would raise the carbon tax it proposes from an initial $40 per ton commensurate with increases in the damage from higher temperatures and sea levels.
My most serious concern with their plan, though, is its apparent quid pro quo. It would shield energy companies from some existing regulations and from being held liable at the federal or state level for environmental damage from decades of earlier fossil fuel production.
This is not a hypothetical concern. Several states and local governments are already suing Exxon Mobil and other oil and gas corporations over damage from climate change.
Looking closely at the carbon tax proposal, if it were to become law, the fossil fuel industries would likely pay a small carbon tax bill that they could easily pass on to consumers in the form of higher gasoline prices. At the same time, Exxon Mobil and its peers would be absolving themselves of what someday could amount to trillions of dollars in liability due to climate change lawsuits.
Exxon Mobil’s support for this carbon tax, in other words, does not signal any generous altruism on its part.
What’s more, even without the tangled web of a national carbon tax, renewable energy is getting cheaper through innovation, some of it subsidized by existing incentives, and economies of scale due to the swift growth of the solar and wind industries.
Climate risk disclosure
Also missing from the Baker-Shultz plan is the clear role that better information for investors and consumers on companies’ climate change impacts can play in guiding markets to accurately and promptly price and allocate carbon risk.
I find that market forces generally are better ways to obtain signals about and establish prices of future states of uncertainty, which is particularly important because climate impacts can evolve over long horizons. Often present in economists’ theoretical views of climate policy, however, is the assumption that high-quality information is available at no cost as a basis for sound decision-making. This may not be the case.
Specifically, economists like me want to know at least two things that are highly relevant for investors and creditors. First, the size of a company’s carbon footprint. Second, the policies that company would be following to avoid an increase of global temperatures, limits on global sea level rise, or both.
Climate scientists, however, are slowly generating better data to trace the links between carbon production and product use and their impacts on people and biodiversity.
In my view, more and better information from carbon emitters is critically needed to establish effective climate change policies. That’s why I am urging the SEC to encourage companies to disclose their carbon risks and carbon footprints voluntarily.
Under my plan, the SEC would provide guidance and apply its enforcement powers to any laggards that might choose to under-disclose or not disclose at all.
I believe this voluntary approach has worked well under the Foreign Corrupt Practices Act, an anti-bribery measure enacted in 1977. I see no reason why it would not also work well as a way to reduce climate risk.