Ford begins probe into whether gas mileage was overstated
By TOM KRISHER
AP Auto Writer
Friday, February 22
DETROIT (AP) — Ford Motor Co. has launched an investigation into whether it overstated gas mileage and understated emissions from a wide range of vehicles.
The company said Thursday that in September, a group of employees reported possible problems with a mathematical model used to calculate pollution and mileage, prompting the company to hire an outside firm to run tests. Testing will start with the 2019 Ford Ranger small pickup truck, and if problems are found, the company will start looking at models dating to 2017.
Ford said it has no evidence yet that mileage or pollution numbers are wrong, but the investigation has just started. The company says it’s too early to tell how many and which models might be involved.
Depending on what is found, Ford could be required to restate the mileage on EPA-approved window stickers as well as reimburse owners for the mileage difference. The company could also face penalties from government agencies.
“At Ford, we believe that trust in our brand is earned by acting with integrity and transparency,” Kim Pittel, the company’s vice president for environment and safety engineering, said Thursday. “We have a process for looking at how we perform and behave in our broad and complex company.”
The U.S. Environmental Protection Agency and the California Air Resources Board, which monitor emissions and mileage, have been informed of the probe, according to the company.
The EPA said in a statement that Ford disclosed the issues on Tuesday. “The investigation is ongoing and the information is too incomplete for EPA to reach any conclusions,” the agency said. “We take the potential issues seriously and are following up with the company to fully understand the circumstances behind this disclosure.”
The problems do not involve “defeat device” software that activates pollution controls for emissions tests and turns them off on the road, according to Ford. For years, Volkswagen used a defeat device to cheat on diesel emissions tests until it was caught by university researchers and a nonprofit organization. Ford said the investigation is focused on vehicles with gasoline engines.
In an interview, Pittel said it’s too early to tell how widespread the problem is or if it goes beyond 2017 models. “We’re just going to go where the investigation takes us,” she said. “We will be very, very thorough.”
Most emissions and mileage tests are done by automakers and spot-checked for accuracy by the EPA and California. The tests are done on a dynamometer, which is a treadmill-like device, and the mathematical model makes calculations for “road load,” which is drag from wind, tire rolling resistance, drag from engine-driven devices and other factors.
“We have identified potential concerns with how we calculate road load,” Pittel said.
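Road load is commonly modeled as a quadratic function of vehicle speed, with coefficients derived from coastdown testing. The sketch below illustrates the general shape of such a calculation; the coefficient values are hypothetical placeholders, not Ford's actual figures.

```python
# Illustrative sketch of a road-load calculation of the kind described above.
# The a/b/c "coastdown" coefficients below are hypothetical, not Ford's.

def road_load_force(speed_mph, a=25.0, b=0.38, c=0.019):
    """Road-load force (lbf) as a quadratic in vehicle speed (mph).

    a: constant term (mostly tire rolling resistance), lbf
    b: linear term (driveline and mechanical drag), lbf per mph
    c: quadratic term (aerodynamic drag), lbf per mph^2
    """
    return a + b * speed_mph + c * speed_mph ** 2

# A dynamometer is programmed to apply this resistance at each speed,
# so an error in the coefficients skews both mileage and emissions results.
for v in (20, 50, 70):
    print(f"{v} mph -> {road_load_force(v):.1f} lbf")
```

Because the quadratic term dominates at highway speeds, even a small error in the aerodynamic coefficient can meaningfully shift fuel-economy results.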
If problems are found, it won’t be the first time Ford has had to restate mileage. In 2014, the Dearborn, Michigan, company had to cut the window-sticker mileage on six models, and had to pay thousands of dollars to more than 200,000 owners after errors were discovered in how the mileage was calculated. The company said the latest issues are unrelated to the 2014 problems. Ford also had to restate mileage on the C-Max gas-electric hybrid in 2013.
The Ford cases came after a 2012 EPA investigation found inflated mileage on 13 Hyundai and Kia models, affecting 900,000 vehicles.
Because Ford found out about the problem in September, it has had time to make calculations and probably knows it has problems with the tests, said Navigant Research analyst Sam Abuelsamid, who closely follows auto engines and transmissions.
“I don’t think that they would be making a statement like this at this point if they didn’t already know there was a problem,” Abuelsamid said. “It definitely implies that there’s an issue there.”
The EPA, Abuelsamid said, changed the testing guidelines for emissions and fuel economy in 2017, which could be why Ford said it will look at models from that year.
Ellen Knickmeyer contributed to this report from Washington, D.C.
Utilities are starting to invest in big batteries instead of building new power plants
February 22, 2019
Authors: Jeremiah Johnson, Associate Professor of Environmental Engineering, North Carolina State University
Joseph F. DeCarolis, Associate Professor of Environmental Engineering, North Carolina State University
Disclosure statement: Jeremiah Johnson receives funding from the U.S. National Science Foundation, the U.S. Department of Energy, and the NC Policy Collaboratory. Joseph F. DeCarolis receives funding from the National Science Foundation and the NC Policy Collaboratory to conduct related research on energy systems.
Partners: North Carolina State University provides funding as a member of The Conversation US.
Due to their decreasing costs, lithium-ion batteries now dominate a range of applications including electric vehicles, computers and consumer electronics.
You might only think about energy storage when your laptop or cellphone is running out of juice, but utilities can plug bigger versions into the electric grid. And thanks to rapidly declining lithium-ion battery prices, utilities are increasingly using energy storage to stretch electricity generation capacity.
Based on our research on energy storage costs and performance in North Carolina, and our analysis of the potential role energy storage could play within the coming years, we believe that utilities should prepare for the advent of cheap grid-scale batteries and develop flexible, long-term plans that will save consumers money.
Peak demand is pricey
The amount of electricity consumers use varies according to the time of day and between weekdays and weekends, as well as seasonally and annually as everyone goes about their business.
Those variations can be huge.
For example, in many regions peak electricity demand is nearly double the average amount of power consumers typically use. Utilities often meet peak demand by building power plants that run on natural gas, due to their lower construction costs and ability to operate when they are needed.
However, it’s expensive and inefficient to build these power plants just to meet demand in those peak hours. It’s like purchasing a large van that you will only use for the three days a year when your brother and his three kids visit.
The grid requires power supplied right when it is needed, and usage varies considerably throughout the day. When grid-connected batteries help supply enough electricity to meet demand, utilities don’t have to build as many power plants and transmission lines.
Given how long this infrastructure lasts and how rapidly battery costs are dropping, utilities now face new long-term planning challenges.
About half of the new generation capacity built in the U.S. annually since 2014 has come from solar, wind or other renewable sources. Natural gas plants make up much of the rest, but in the future that industry may need to compete with energy storage for market share.
In practice, we can see how the pace of natural gas-fired power plant construction might slow down in response to this new alternative.
So far, utilities have only installed the equivalent of one or two traditional power plants in grid-scale lithium-ion battery projects, all since 2015. But across California, Texas, the Midwest and New England, these devices are benefiting the overall grid by improving operations and bridging gaps when consumers need more power than usual.
Based on our own experience tracking lithium-ion battery costs, we see the potential for these batteries to be deployed at a far larger scale and disrupt the energy business.
When we were given approximately one year to conduct a study on the benefits and costs of energy storage in North Carolina, keeping up with the pace of technological advances and increasing affordability was a struggle.
Projected battery costs changed so significantly from the beginning to the end of our project that we found ourselves rushing at the end to update our analysis.
Once utilities can easily take advantage of these huge batteries, they will not need as much new power-generation capacity to meet peak demand.
Even before batteries could be used for large-scale energy storage, it was hard for utilities to make long-term plans due to uncertainty about what to expect in the future.
For example, most energy experts did not anticipate the dramatic decline in natural gas prices due to the spread of hydraulic fracturing, or fracking, starting about a decade ago – or the incentive that it would provide utilities to phase out coal-fired power plants.
In recent years, solar energy and wind power costs have dropped far faster than expected, also displacing coal – and in some cases natural gas – as a source of energy for electricity generation.
Something we learned during our storage study is illustrative.
We found that lithium ion batteries at 2019 prices were a bit too expensive in North Carolina to compete with natural gas peaker plants – the natural gas plants used occasionally when electricity demand spikes. However, when we modeled projected 2030 battery prices, energy storage proved to be the more cost-effective option.
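A comparison of this kind can be framed by annualizing each technology's capital cost and adding fixed operating costs. The sketch below illustrates the structure of such a calculation only; every dollar figure is a hypothetical placeholder, not a value from our North Carolina study.

```python
# Hypothetical annualized-cost comparison between a gas peaker plant and a
# battery of equivalent peak capacity. All numbers are illustrative
# placeholders, not the figures from the North Carolina study.

def annualized_cost(capex_per_kw, lifetime_years, fixed_om_per_kw_yr, rate=0.07):
    """Annualize capital cost with a capital recovery factor, add fixed O&M."""
    crf = rate * (1 + rate) ** lifetime_years / ((1 + rate) ** lifetime_years - 1)
    return capex_per_kw * crf + fixed_om_per_kw_yr

# Illustrative inputs (dollars per kW of peak capacity)
peaker_2019  = annualized_cost(capex_per_kw=700,  lifetime_years=30, fixed_om_per_kw_yr=15)
battery_2019 = annualized_cost(capex_per_kw=1400, lifetime_years=15, fixed_om_per_kw_yr=25)
battery_2030 = annualized_cost(capex_per_kw=400,  lifetime_years=15, fixed_om_per_kw_yr=25)

print(f"peaker 2019:  ${peaker_2019:6.0f}/kW-yr")
print(f"battery 2019: ${battery_2019:6.0f}/kW-yr")
print(f"battery 2030: ${battery_2030:6.0f}/kW-yr")
```

With these placeholder inputs the battery is the pricier option at 2019 costs but the cheaper one at projected 2030 costs, mirroring the qualitative pattern our study found.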
Federal, state and even some local policies are another wild card. For example, Democratic lawmakers have outlined the Green New Deal, an ambitious plan that could rapidly address climate change and income inequality at the same time.
And no matter what happens in Congress, the increasingly frequent bouts of extreme weather hitting the U.S. are also expensive for utilities. Droughts reduce hydropower output and heatwaves make electricity usage spike.
Several utilities are already investing in energy storage.
California utility Pacific Gas & Electric, for example, got permission from regulators to build a massive 567.5 megawatt energy-storage battery system near San Francisco, although the utility’s bankruptcy could complicate the project.
Hawaiian Electric Company is seeking approval for projects that would establish several hundred megawatts of energy storage across the islands. And Arizona Public Service and Puerto Rico Electric Power Authority are looking into storage options as well.
We believe these and other decisions will reverberate for decades to come. If utilities miscalculate and spend billions on power plants it turns out they won’t need instead of investing in energy storage, their customers could pay more than they should to keep the lights on through the middle of this century.
Basic income: world’s first national experiment in Finland shows only modest benefits
February 21, 2019
Author: Luke Martinelli, Research Associate, Institute for Policy Research, University of Bath
Disclosure statement: Luke Martinelli receives funding from Google Deepmind. This was provided as an unconditional donation to his employer, the Institute for Policy Research at the University of Bath.
Partners: University of Bath provides funding as a member of The Conversation UK.
The preliminary findings from Finland’s basic income experiment are out and they show mixed results. Both advocates and critics of the idea of a universal basic income will find cause for consternation and celebration. Though widely anticipated by basic income enthusiasts, the Finnish experiment will only fuel further debate on whether or not the idea works.
The experiment ran for two years from January 2017 and was implemented by a centre-right coalition government. It was motivated by a distaste for costly welfare bureaucracy and a desire to eliminate work disincentives that arise when means-tested benefits are withdrawn as recipients increase their earned income.
The idea of basic income is hotly contested around the world. Advocates point to a number of other pluses – from the economic benefits to enhancing psychological well-being. Opponents say it is economically unfeasible and will discourage people from doing much-needed work.
Yet, despite the intensity of this debate, there was little empirical evidence on the policy’s effects – until now. Although the idea has experienced waves of support going back decades, basic income has never been implemented at the national level.
The US state of Alaska has had a basic income of sorts since 1982, with a small annual payment to each resident funded as a dividend from the state’s oil revenue. There were also other policy experiments in North America in the 1960s and 1970s, and more recently in other countries such as Namibia and India. But, for a number of reasons, the existing empirical evidence has limited value in assessing the effects of basic income “proper” as a fundamental welfare state reform.
The resulting uncertainty and lack of any concrete evidence means that critics of a basic income are able to exploit both the public’s risk aversion and the favourite bogeyman of the right-wing press: the fabled work-shy benefit scrounger. So the results of the Finnish experiment were highly anticipated, because they might provide a reality check.
The Finnish experiment paid 2,000 randomly-selected unemployed people a basic income of €560 per month, equivalent to the lower-tier unemployment benefit which it replaced. Payment was guaranteed to continue, no strings attached, for the full two years of the experiment – regardless of whether the individual engaged in job search activities or received income from other sources. Labour market outcomes were analysed, as well as broader indicators of well-being, and were compared with a “control group” of unemployed people on the existing benefits system.
Does cash in hand stop you from wanting to work?
The results show that those pessimistic predictions of a labour market exodus did not transpire. Unfortunately for basic income’s proponents, neither did the more optimistic accounts. Overall, the number of days in employment, and total labour market earnings, were no higher for those receiving the basic income than for those in the control group.
This doesn’t mean that it had no effects on the labour market. It might be that some people were more likely to find employment and others less likely, with the effects balancing out. From the results presented, we simply do not know.
Recipients of the basic income also reported positive effects on their sense of well-being and feelings of trust in other people and the government. But, given that this was self-reported, it may simply reflect a vested interest in stressing the advantages of the policy.
Nevertheless, these effects, plus anecdotal evidence of the wider benefits of the unconditional payment, strengthen the case for basic income. Indeed, advocates have always maintained that their argument does not rest on labour market effects and reduced bureaucratic costs. Rather it rests on more fundamental ideas of social justice, freedom and economic security.
But what is clear is that the findings of Finland’s experiment are unlikely to settle the question of whether basic income is desirable. It is likely that both advocates and opponents will seize on the results as supporting evidence for their positions, as was the case with the North American experiments.
In the end this impasse around basic income reflects the irreconcilable beliefs of supporters and critics regarding their views of “fairness” and the primacy of work in social organisation. The realities of how the labour market is actually affected play second fiddle to these concerns. So it would be surprising indeed if these fairly modest findings were to move anyone from a deeply held position.
Another set of reasons that the results will not resolve the controversy relates to limitations in experimental design. The study only focuses on the unemployed, so we remain in the dark about how basic income might affect the desire of other groups to work, or the broader societal benefits associated with a basic income being universal.
Another core feature was the low level of payment, which meant many recipients still had to apply for additional unemployment benefits – if they were entitled to them – just as they did before. Thus, the majority of people in the treatment group did not benefit from the lower bureaucracy and freedom that basic income is meant to bring.
A final limitation is that the experiment did not try to model the effects of the tax changes that would be required to finance a universal basic income. Yet, tax rises are a constraint that most advocates accept as a practical political reality.
It should be emphasised that these are only preliminary findings, with further analysis to follow later in the spring and again next year. But, it seems likely that insights generated by the Finnish experiment will inevitably fall short of the expectations – and hopes – of many following the debate.
Lessons from IBM for Google, Amazon and Facebook
February 25, 2019
Author: James Cortada, Senior Research Fellow, Charles Babbage Institute, University of Minnesota
Disclosure statement: James Cortada does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.
It’s impressive when companies last for decades – or even more than a century – and especially so when they’re in a fast-changing industry like computer technology. IBM, which traces its roots to the 1880s, grew from three small firms to a multi-billion-dollar information technology services company today. Its ups and downs along the way offer some insights into the global technology industry, and may contain some instructive lessons for up-and-coming digital giants like Google, Amazon and Facebook – all of which are far younger than IBM.
In my new book, “IBM: The Rise and Fall and Reinvention of a Global Icon,” I explore the company’s history of creating and selling data processing equipment and software. As a former IBM employee and a historian, the most important lesson I found is that many people confuse incremental changes in technology with more fundamental ones that actually shape the course of a company’s destiny.
There is a difference between individual products – successive models of PCs or typewriters – and the underlying technologies that make them work. Over 130 years, IBM released well over 3,600 hardware products and a similar number of software products. But all those items and services were based on just a handful of real technological advances, such as shifting from mechanical machines to those that relied on computer chips and software, and later to networks like the internet. The transitions between those advances took place far more slowly than the steady stream of new products might suggest.
These transitions from the mechanical, to the digital, and now to the networked reflected an ever-growing ability to collect and use greater amounts of information easily and quickly. IBM moved from manipulating statistical data to using technologies that teach themselves what people want and are interested in seeing.
A focus that can adapt
Between 1914 and 1918, IBM management decided that the business the company would be in was data processing. In more modern terms, that business has become “big data” and analytics. But it’s still collecting and organizing data, and performing calculations and computations on it.
Since the early 1920s, IBM has taken a disciplined approach to product development and research, focusing on developing the underpinning technologies for its data processing products. Nothing seemed to be done by accident.
In its first half-century, IBM’s basic technology platform from which many products emerged was the punch-card, yielding tabulators, card sorters, card readers and the famous IBM Card. In its second half-century, the basic technology platform was the computer, including mainframes, minicomputers, PCs and laptops. In its most recent 30 years, computer sales have brought in a declining share of the company’s total revenue, as IBM transitions to providing more internet-based services, including software and technical and managerial consulting.
The rise of each succeeding technology happened during the maturity and decline of its predecessor. IBM first started selling computers in the 1950s, but kept selling tabulating equipment that still used punch cards until the early 1960s. As recently as the early 1990s, over 90 percent of IBM’s revenues came from selling computers, though it was introducing new services like management and process consulting, information technology management and software sales.
It wasn’t until the end of 2018 that IBM announced that 50 percent of its business now came from services and software, most of which were new offerings developed in the previous decade.
The news media – and even IBM employees – may have perceived that IBM was transforming itself quickly and frequently. In fact the company had planted seeds for growth early and carefully tended new technologies until they bore fruit – fortunately, around the same time as earlier systems were ending their useful lives.
This strategic approach is not uncommon – Apple has been selling personal computers for more than 40 years. Its management, of course, talks much more about its role in the smartphone business, which is already beginning to level off. Apple may soon need – or already be working on – a new technological focus to remain relevant.
The future of the giants
Microsoft, like Apple, evolved away from selling just computer software and operating systems. It started internet-based projects like its Bing search engine and OneDrive cloud storage – as well as providing cloud-based computing services for businesses.
Companies that started on the internet may also face similar transitions. Amazon, Google and Facebook at times claim to have transformed themselves, but haven’t yet fully left their original businesses.
Amazon still makes most of its money selling physical items online, though its internet-based cloud services division is growing rapidly. Amazon has also invested in a wide range of other businesses that might grow in the future, such as health care and entertainment content.
Google and Facebook still make most of their money selling information about how users behave to advertisers and groups that want to attract people to a particular point of view. Both are exploring other avenues, whether it’s Google’s self-driving cars or Facebook’s experiments with virtual reality.
But at their core, all three internet giants are still finding new ways to capitalize on the vast quantities of information they accumulate about customers’ activities and interests – just as decades earlier IBM found new ways to use tabulating equipment and computers. If they’re to last decades or centuries into the future, the companies will need to probe, experiment and innovate to find new ways to profit as technologies change.
john ranta: This is an overly rosy view. In truth IBM never figured out what to do about PCs, the Internet, smart phones or cloud computing. They clung to their mainframe strategy for far too long, while others passed them by. Their revenues have been declining for years, and their annual revenues now are lower than they were 20 years ago. With the exception of Watson, IBM hasn’t had one innovative idea in decades, and they face continued decline as they chase Amazon, Google and other leaders. Ten years from now IBM will be a smaller, and even less relevant company.