The Apple Watch is inching toward becoming a medical device
By MICHAEL LIEDTKE
AP Technology Writer
Friday, September 14
CUPERTINO, Calif. (AP) — Apple is trying to turn its smartwatch from a niche gadget into a lifeline to better health by slowly evolving it into a medical device.
In its fourth incarnation, called Series 4 and due out later this month, the Apple Watch will add features that allow it to take high-quality heart readings and detect falls. It’s part of Apple’s long-in-the-making strategy to give people a distinct reason to buy a wrist gadget that largely does things smartphones already do.
Since the Apple Watch launched in April 2015, most people haven’t figured out why they need to buy one. Apple doesn’t release sales figures, but estimates from two analysts suggest the company shipped roughly 18 million watches in 2017. Apple sold almost 12 times as many iPhones — 216 million — last year.
Worldwide, about 48 million smartwatches are expected to be sold this year, compared with nearly 1.9 billion phones, according to the research firm Gartner.
Apple CEO Tim Cook has long emphasized the watch’s health and fitness-tracking capabilities. The original version featured a heart-rate sensor that fed data into fitness and workout apps so they could suggest new goals and offer digital “rewards” for fitness accomplishments.
Two years later, Apple called its watch “the ultimate device for a healthy life,” emphasizing water resistance for swimmers and built-in GPS for tracking runs or cycling workouts. In February, the company announced that the watch would track skiing and snowboarding runs, including data on speed and vertical descent.
The latest version, unveiled Wednesday, is pushing the health envelope even further — in particular by taking electrocardiograms, or EKGs, a feature given clearance by the U.S. Food and Drug Administration, Apple said. The watch will also monitor for irregular heartbeats and can detect when the wearer has fallen, the company said.
EKGs are important tests of heart health and typically require a visit to the doctor. The feature gained an onstage endorsement from Ivor Benjamin, a cardiologist who is president of the American Heart Association. He said such real-time data would change the way doctors work.
Gartner analyst Tuong Nguyen said the feature could turn smartwatches “from something people buy for prestige into something they buy for more practical reasons.”
It could also lead some health insurance plans to subsidize the cost of an Apple Watch, Nguyen said. That would help defray the $400 starting price for a device that still requires a companion iPhone, which can now cost more than $1,000.
Apple’s watch will use new sensors on the back and on the digital crown. A new app will say whether each reading is normal or shows signs of atrial fibrillation, an irregular heart rate that increases the risk of heart complications, such as stroke and heart failure.
Apple says the heart data can be shared with doctors through a PDF file, though it’s not yet clear how ready doctors are to receive a possible flood of new EKG data from patients — nor how useful they will find the electronic files.
Eric Topol, a cardiologist and director of the Scripps Research Translational Institute, warned that the EKG feature could spur more tests than necessary, result in unnecessary prescriptions for blood thinners and overwhelm doctors with calls from patients who probably don’t need treatment.
He said that while the feature will probably save some lives and prevent strokes with early detection of heart trouble, “the ratio between the benefits and the costs remains a big unknown.”
Apple said the EKG feature will be available to U.S. customers later this year, an indication that it may not be ready when the new watch goes on sale.
Fall detection could also be significant, especially for elderly users. The new Apple Watch claims to be able to tell the difference between a trip and a fall — and when the latter occurs, it will suggest calling 911 (or the equivalent outside the U.S.). If it receives no response within a minute, the watch will automatically place an emergency call and message friends and family designated as emergency contacts.
Only certain Apple Watch models support cellular calls, but those that don’t can still make emergency calls when near a paired iPhone or Wi-Fi service.
Apple says it monitored some 2,500 people — measuring how they fell off ladders, missed a step while walking or got their legs caught in their pants while getting dressed. It used that data to separate real falls from other heavy wrist movements, such as clapping and hammering.
The feature is available immediately worldwide and will turn on automatically for users 65 and over. Younger people can activate it in the settings.
“I can see kids buying one for their parents and grandparents,” analyst Patrick Moorhead of Moor Insights said.
But the Apple Watch still lacks one feature found in rival wrist gadgets: the ability to analyze sleep quality. Battery life in the new watch remains at 18 hours, meaning it needs a nightly recharge.
Delaware County Office of Homeland Security and Emergency Management
Nationwide Wireless Emergency Alert Test September 20th
Citizens to Receive “Presidential Alert” Test Message on Wireless Devices
DELAWARE COUNTY – The Federal Emergency Management Agency (FEMA) in coordination with the Federal Communications Commission (FCC) will conduct a nationwide test of the Emergency Alert System (EAS) and Wireless Emergency Alert (WEA) on Thursday, September 20, 2018 beginning at 2:18 p.m. The test will evaluate the operational readiness of the infrastructure for distribution of a national message and determine whether technological improvements are needed.
The WEA system is used to warn the public about dangerous weather, missing children, and other critical situations through alerts on cell phones.
It allows customers whose wireless provider participates in WEA and who own a WEA compatible wireless phone to receive geo-targeted alerts of imminent threats to safety in their area through unique tones and vibration.
The national WEA test will use the same tone and vibration. The test message will display “Presidential Alert” and read “THIS IS A TEST of the National Wireless Emergency Alert System. No action is needed.”
The WEA test will be sent through the Integrated Public Alert & Warning System (IPAWS), as part of the nation’s modern alert and warning infrastructure that automatically authenticates alerts. The test is intended to ensure public safety officials have the methods and systems that will deliver urgent alerts and warnings to the public in times of an emergency or disaster.
“We want residents of and visitors to Delaware County to be aware that this is just a test to ensure that a very important national warning system is working appropriately. Those with compatible cell phones will be familiar with this system as the means through which AMBER Alerts have been recently issued.
Specific to Delaware County, the test will coincide with the Little Brown Jug Harness Race date. We do not want fairgoers to be concerned when numerous cell phones alarm as part of this test,” said Sean Miller, Director of Delaware County Homeland Security & Emergency Management.
The EAS test will take place at 2:20 p.m. and is only available to EAS participants (e.g., radio and television broadcasters, cable systems, satellite radio and television providers, and wireline video providers). The test message will last approximately one minute and be similar to the regular monthly EAS test messages with which the public is familiar.
FEMA has set October 3, 2018 as a secondary test date, if needed.
New data paint an unpleasant picture of poverty in the US
September 12, 2018
Steven Pressman
Professor of Economics, Colorado State University
Steven Pressman does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.
Colorado State University provides funding as a member of The Conversation US.
On Sept. 12, the U.S. Census Bureau released national poverty data for 2017.
The headline was that 39.7 million people were poor in 2017. This works out to 12.3 percent of the population or one in eight Americans. The good news is that the U.S. poverty rate has fallen since 2010, when it hit 15.1 percent, and is now where it was before the Great Recession.
The bad news is that poverty still exceeds the 11.3 percent rate of 2000 and far too many people are poor in a country that is so rich. Another bit of bad news is that things look even worse if we use what many scholars like myself believe is a better poverty measure.
Who is poor?
In 2017, women had higher poverty rates than men and minorities had higher poverty rates than non-Hispanic whites, mainly because women earn less than men and minorities receive lower wages on average than whites. For similar reasons, adults with lower education levels are more likely to be poor.
What’s more, having an additional adult able to earn money gives married-couple families much lower poverty rates than households headed by a single woman.
Poverty also varies by age. For those 65 and over, the poverty rate fell from the 1960s until the 1990s, mainly due to more generous Social Security benefits. Since then, it has remained at around 10 percent. The poverty rate for prime-age adults fell until around 1980. After 1980, it fluctuated around 10 percent, rising during recessions and falling during economic expansions.
Child poverty, however, has been relatively high in the U.S. since the late 1970s; it now stands at 17.5 percent. For children in a female-headed household, the poverty rate is near 50 percent.
Problems with measuring poverty
These data all come from American households, using methodology developed in the early 1960s by Mollie Orshansky of the Social Security Administration.
Taking Agriculture Department data on minimum food requirements, Orshansky calculated the annual cost of a subsistence food budget for families of different sizes and types. Household budget studies from the 1950s showed that families spent one-third of their income on food. So, Orshansky multiplied the cost of a minimum food budget for each family type by three to arrive at their poverty threshold. Thresholds rise annually based on inflation over the past year.
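The threshold arithmetic described above is simple enough to sketch. A minimal illustration in Python, assuming hypothetical dollar figures (only the multiplier of three and the annual inflation adjustment come from Orshansky's method; the food-budget amounts are made up):

```python
def poverty_threshold(min_food_budget_annual, cpi_inflation=0.0):
    """Orshansky-style threshold: three times the annual minimum food
    budget, optionally adjusted upward for one year of inflation."""
    return min_food_budget_annual * 3 * (1 + cpi_inflation)

# Hypothetical figures, for illustration only
base = poverty_threshold(4000)           # $4,000 food budget -> $12,000 threshold
updated = poverty_threshold(4000, 0.02)  # same budget after 2% inflation -> $12,240
print(base, updated)
```

A family counts as poor if its annual cash income falls below the threshold for its size and type.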
Being poor means having insufficient income during the year to purchase bare necessities. The poverty rate is the percentage of the population in this situation.
The Orshansky poverty measure has been subject to substantial criticism. Clearly, poverty thresholds are not very high. A single individual making US$1,060 a month would not be considered poor. Yet, in most areas in the U.S., it’s hard to rent a place for less than $500 a month.
Even if that’s possible, this leaves only $20 a day for transportation, clothing, phone, food and other expenses. Orshansky’s minimal food budget assumed that people shop wisely, never eat out and never give their children treats. She actually preferred a more generous food budget to get multiplied by three; but she was overruled by senior government officials.
Another problem is that the U.S. poverty measure ignores income and payroll taxes. In the early 1960s, the poor paid minimal taxes. Starting in the late 1970s, low-income families faced a more formidable tax burden, leaving them less money to purchase basic necessities. Conversely, in the late 1990s, tax credits began to lower the tax burden on the poor.
Finally, standards concerning what is required to be a respectable member of society vary over time and place. For example, cellphones did not exist until recently. Childcare was not necessary for many in the 1950s or 1960s; but when all adults in a family work, it’s essential.
More bad news
To deal with this last problem, many scholars prefer a relative measure of poverty. The Luxembourg Income Study, a research organization that analyzes income distribution, considers households to be poor if their income, adjusted for household size, falls below 50 percent of the median income of their country for the particular year.
Unlike the U.S. Census Bureau, the Luxembourg Income Study subtracts taxes from income when measuring poverty. It also adds government benefits, and makes data as comparable as possible across nations. The result is a poverty rate that is typically two to four percentage points above the official U.S. measure.
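The relative measure can be sketched as follows. This is a simplified illustration: it assumes the common square-root equivalence scale for adjusting income by household size (the study's exact adjustment may differ), and the sample incomes are hypothetical.

```python
import statistics

def relative_poverty_rate(households, threshold_share=0.5):
    """Share of households whose equivalized (size-adjusted) disposable
    income falls below half the median equivalized income.
    `households` is a list of (disposable_income, household_size) pairs."""
    equivalized = [income / size ** 0.5 for income, size in households]
    cutoff = threshold_share * statistics.median(equivalized)
    poor = sum(1 for e in equivalized if e < cutoff)
    return poor / len(equivalized)

# Hypothetical sample: (post-tax income, household size)
sample = [(60000, 2), (15000, 1), (90000, 4), (8000, 2), (40000, 3)]
print(relative_poverty_rate(sample))  # -> 0.2
```

Because the cutoff moves with the median, this measure tracks how far households fall behind typical living standards in their own country and year, which is why it is preferred for cross-national comparison.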
From an international perspective, the U.S. clearly does poorly. According to the Luxembourg Income Study, the U.S. poverty rate was 17.2 percent in the mid-2010s – much higher than in other developed countries, such as Canada and the U.K.
Things are even worse when it comes to child poverty. In the U.S., child poverty rates have surpassed 20 percent for several decades, making it an outlier among developed nations.
My research has identified two important policies responsible for this last result: child allowances and paid parental leave. Child allowances are fixed monthly payments to parents made for each child. Paid leave provides income to parents around the birth or adoption of a new child. Both policies are available in developed nations throughout the world – except the U.S. The more generous these national benefits are, the lower the child poverty rate.
Considerable research shows that growing up poor adversely affects children’s health, as well as their intellectual and social development. It lowers earnings in adulthood, and reduces future tax revenues for the government while increasing government social spending.
The annual cost of child poverty comes to around $1 trillion. Meanwhile, every dollar spent reducing child poverty is estimated to yield $7 in the future. This exceeds the return on most private investments.
For centuries, anonymous insider accounts have chipped away at ruling regimes – and sometimes toppled them
September 13, 2018
Rachel Carnell
Professor of English, Cleveland State University
Rachel Carnell does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.
Bob Woodward’s new book, “Fear: Trump in the White House,” seems to contain scant new information.
Like Michael Wolff’s “Fire and Fury: Inside the Trump White House,” it portrays President Donald Trump as an “emotionally overwrought, mercurial and unpredictable leader,” whose senior staff struggle to contain his most dangerous impulses.
This same view of Trump was reiterated in a Sept. 5 anonymous New York Times op-ed, which, as Nebraska Sen. Ben Sasse observed, is “just so similar to what so many of us hear from senior people around the White House, you know, three times a week.”
But whether “Fear” tells us something new matters less than the fact that the book is yet another broadside against Trump’s image. It adds more fuel to the suspicions many have about the president’s behind-the-scenes behavior.
In fact, Woodward’s “Fear” – together with Wolff’s “Fire and Fury,” Omarosa Manigault’s “Unhinged” and the anonymous op-ed – is part of a long tradition of political “secret histories,” a genre that recounts salacious and scandalous details about the dealings, relationships and temperaments of those in power. It’s a practice that goes back centuries, and it’s one that my co-editor and I explore in our book “The Secret History in Literature, 1660-1820.”
Secret histories tend to take two forms. There is the plain-spoken, just-the-facts approach, similar to Woodward’s “Fear.” Then there are novelistic accounts with major figures depicted using pseudonyms, as in “Primary Colors,” a lightly fictionalized dramatization of the Clinton White House.
But the secrets unveiled in these works usually don’t come out of nowhere. Instead, they contain anecdotes that have long been whispered or suspected. The goal of secret histories is to emphasize embarrassing stories about a ruler or government – to propel the drumbeat of negative coverage in order to strengthen the opposition and, in some instances, to even topple governments.
Justinian was the subject of a secret history circulated by the military historian Procopius.
Secret histories date back at least to the sixth century, when the military historian Procopius wrote down sordid anecdotes about Byzantine Emperor Justinian and his wife, Theodora, in a work that became known as “Anekdota,” which translates to “unpublishable things.” Ten centuries later, it appeared in Latin as “Historia Arcana,” or “Secret History.”
As a military historian, Procopius had helped create the myth of Justinian’s greatness in his eight-book treatise “The Wars of Justinian.” But in his “Anekdota,” Procopius finally told the ugly backstory of Justinian’s reign: his lust, his seizure of others’ property, his petty vengefulness and his persecution of non-Christians. The work was almost certainly circulated in manuscript scroll among Justinian’s enemies. While it probably damaged his standing, Justinian was nonetheless able to retain his grip on power.
After French and English translations of Procopius’ “Anekdota” appeared in 1669 and 1674, secret histories in the same style began to appear about King Charles II of England.
These tended to focus on his mistresses – particularly the infamous Duchess of Cleveland, who manipulated Charles for over a decade, persuading him to grant her land and money and bestow titles of nobility on their illegitimate children.
These reports, which read like tabloid-style gossip, were never just about sex.
Readers of one account, titled “The Amours of the King of Tamaran,” likely realized that if the king could be duped and controlled by his powerful mistress, he was also susceptible to being influenced by England’s adversaries.
Indeed, he was: Another secret history, Andrew Marvell’s “Account of the Growth of Popery and Arbitrary Government in England,” described the backstory of the Secret Treaty of Dover, in which Charles II accepted large sums of money from the French king in exchange for promising to return England to Catholicism.
These publications didn’t bring down the politically skilled Charles II, who was glad to take Louis XIV’s money but savvy enough to decide against changing his country’s religion.
They did, however, sow suspicion towards Charles II and his family. After Charles II’s death, his openly Catholic younger brother, James, ascended the throne in 1685, instilling fear that England would return to Catholicism. Seven Englishmen wrote to Prince William of Orange – who was a Protestant – pleading that he invade England. In the Glorious Revolution that ensued, James II fled to France, and Parliament declared William and his wife, Mary, joint monarchs of England.
The Glorious Revolution of 1688 helped inspire American colonists to rebel against another British monarch, with the not-so-secret history of George III’s “repeated injuries and usurpations” enshrined in the Declaration of Independence.
Some might disparage Woodward’s book as “anonymously-sourced gossip.”
But gossip has always been important to humankind. As Israeli historian Yuval Noah Harari notes in “Sapiens,” his best-selling account of early human history:
“It is not enough for individual men and women to know the whereabouts of lions and bison. It’s much more important for them to know who in their band hates whom, who is sleeping with whom, who is honest, and who is a cheat.”
Those who dismiss Woodward’s book underestimate the power that gossip and behind-the-scenes revelations wield over politics – and the way they have shaped the course of human history.
Lessons from White House disinformation a century ago: ‘It’s dangerous to believe your own propaganda’
September 13, 2018
John Maxwell Hamilton
Global Scholar at Woodrow Wilson International Center for Scholars, Washington, DC and Hopkins P. Breazeale Professor, Manship School of Mass Communication, Louisiana State University
Meghan Menard McCune
Ph.D. candidate, Manship School of Mass Communication, Louisiana State University
The authors do not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and have disclosed no relevant affiliations beyond their academic appointment.
One hundred years ago, the U.S. government published documents that fueled the mounting Red Scare, helped justify the American military invasion of Russia and poisoned American-Russian relations for years to come.
Newspapers across the United States began to publish the fake papers on Sept. 15, 1918.
Unbeknownst to the government, the documents were forgeries. They were created by Russian political interests whose party affiliation remains obscure, but whose objectives were clear. The documents were part of a Russian disinformation campaign – a common propaganda tactic during World War I – to discredit the Bolsheviks, who had just seized power in Russia after leading a Marxist revolution.
The publication of the documents was a classic case of the American government accepting bogus information because it confirmed its preconceptions and justified actions it wished to take.
We are scholars who work at the intersection of media and politics. We believe the incident illustrates that the power of disinformation lies not in technology, which many blame for the rise of counterfeit information today, but in weaknesses of human nature. The desire to confirm their beliefs about the Bolsheviks led top U.S. leaders to ignore credible and persistent warnings of the documents’ inaccuracy and to aggressively assert government authority to discredit the few who questioned them.
The Sisson documents
By 1918, World War I had raged on for nearly four years, and the Russians, who fought on the side of the U.S. and others against Germany, had experienced two recent revolutions.
The first revolution ousted the Czarist regime. The second, led by Bolshevik leaders Vladimir Lenin and Leon Trotsky, ousted the provisional government.
Despite the chaos, the U.S. and Allies desperately needed Russia to continue fighting. But the Bolsheviks were intent on taking an exhausted Russian military out of the war and promised to begin peace talks with Germany once in power.
The Committee on Public Information, which operated as an American propaganda ministry during the war, sent Edgar Sisson, a former muckraking journalist, to Petrograd in November 1917, before the Bolsheviks seized power. He was to use publicity tools, which included press releases, films and speeches, to urge Russians to remain in the war.
By the time he arrived, the Bolsheviks were in power. A month later, they began peace negotiations with Germany.
President Woodrow Wilson wanted to build a case against the Bolsheviks.
The peace talks played into widespread rumors in Russia and elsewhere that Lenin and Trotsky were paid German agents. Before its ouster, the Russian provisional government tried to use the rumors to discredit the Bolsheviks and hasten their demise. But the rumors had never been proven.
This all seemed to change, however, in February 1918.
Raymond Robins, head of the American Red Cross Commission in Russia, gave Sisson confidential documents that implied Germans financed and directed the Bolsheviks. Sisson deemed the documents valid despite Robins’ doubts.
President Woodrow Wilson and the State Department encouraged Sisson to collect evidence that Lenin and his comrades were German pawns, which would support the administration’s anti-Bolshevik policies.
By the time he left Russia in March, Sisson had collected 68 documents, mostly with the help of a shadowy figure, Evgeni Petrovich Semenov. Semenov, a former secret service agent in the provisional government, told Sisson he lifted the papers from Bolshevik headquarters.
Documents make news in the US
American press coverage of Sisson’s story was sensational. Most stories appeared on the front page and were published in installments during the week. They accepted the government view of Bolshevik treachery.
The government’s timing of the release was politically strategic. Wilson had by this time agreed to an Allied military intervention in Russia for the purpose of protecting stockpiles of Allied war material and ensuring the safe travel of anti-German forces through Siberia to the Eastern Front.
In August and September, U.S. troops were arriving in Russia. This move constituted a threat to the Bolshevik government and violated Wilson’s promise of self-determination.
But the fake documents legitimized intervention by suggesting that the Bolshevik stooges were not representative of the Russian people.
The documents presented a problem that often arises in national security reporting.
With little time or expertise to properly determine an account’s authenticity, journalists often rely solely on the government’s word. When a majority of the public is mentally prepared to accept the government’s account because it conforms to their preconceptions, the government can easily beat back doubts by calling doubters disloyal and un-American.
In a reasoned analysis of the documents published in a pamphlet by the Liberator, left-wing journalist John Reed showed they were probably forgeries and said they falsely justified military intervention in Russia.
George Creel, the head of the Committee on Public Information, sought to discredit Reed by labeling him the “center of the Bolsheviki movement in this country.”
Mostly, though, Creel invoked government authority as the chief basis for accepting the validity of the documents when they were questioned. This was the case with the lone establishment newspaper that challenged the authenticity of the documents, the New York Evening Post.
In letters found at the National Archives, Creel said the Post gave “aid and comfort to the enemies of the United States.” He expressed surprise that the New York Evening Post refused to accept evidence put forth by the government. And he told the paper’s owner that his editor had acted as an advocate of Lenin and Trotsky. He said the paper behaved as if it, too, had taken German money.
The New York Evening Post, he complained bitterly, was demanding “the Government should take the witness stand.”
The credulous acceptance of the documents by the press can be traced to the effectiveness of extensive wartime propaganda by the Committee on Public Information, which we have been studying for the past several years. One of the committee’s chief propaganda messages was widespread German spying and treachery in the United States and abroad.
It was an easy step, when nudged by the government, to believe the Germans enlisted godless Bolsheviks in their cause.
As the Literary Digest magazine observed, editors found “great satisfaction in adding legal proof to their moral certainty, and when the Government guarantees the authenticity of the documents proving that Lenin and Trotzky are German agents, it gives them an opportunity to speak their minds without hesitation and without reserve.”
Congress, too, was willing to endorse this view. The Democratic majority in the Senate used a special subcommittee to emphasize Bolshevik-German ties. They found witnesses who testified the Bolsheviki movement was a branch of the German government.
Power in plausibility
George Kennan and other historians have concluded the Sisson documents are “unquestionable forgeries,” but this does not mean they were devoid of truth.
As with all effective disinformation, their power lay in their plausibility. The documents’ authors enhanced their forgeries with facts. Germans did help the Bolsheviks, funneling millions of German marks to them during the war.
But, as one diplomat noted, the Bolsheviks would have accepted money from anyone. More important, the Bolsheviks sought to foment a communist revolution in Germany as soon as they could.
For those who spread disinformation, then, it is often not a matter of being tricked into believing the information they have spread. Creel, Sisson and others recklessly ignored warnings the documents were false. They wanted to believe the conspiracy, so they did.
Today, we call this confirmation bias.
The United States’ path to war with Iraq in 2003 eerily recalled that element of the Sisson documents. To make the case for an invasion, the George W. Bush administration relied heavily on Ahmad Chalabi, an exiled Iraqi politician, and fellow Iraqi dissidents who wanted Saddam Hussein ousted.
Chalabi lined up a parade of Iraq defectors to provide compelling – and inaccurate – stories of Hussein’s terrorist connections and his stockpile of weapons of mass destruction. In addition to selling the invasion to the public, the campaign solidified the administration’s conviction that it was right to do what it wanted to do.
“It’s dangerous,” Chalabi was known to say from time to time, “if you believe your own propaganda.”