Reparations considered by candidate


News & Views

Staff & Wire Reports



From left, Greenville, Miss., Mayor Errick Simmons, 2020 Democratic presidential candidate Sen. Elizabeth Warren and Mable Starks, former CEO of Mississippi Action for Community Education, tour Central Avenue from Poplar Street in Greenville, Miss., Monday, March 18, 2019. The group discussed dilapidated and affordable housing in the rural communities. (Bill Johnson/The Democrat-Times via AP)


Warren backs congressional plan for reparations study

Tuesday, March 19

JACKSON, Miss. (AP) — Democratic presidential candidate Elizabeth Warren on Monday embraced a congressional proposal to study a framework for reparations to African-Americans hurt by the legacy of slavery as the best way to begin a “national, full-blown conversation” on the issue.

Warren first voiced support for reparations last month, becoming one of three 2020 Democratic candidates to do so. But her comments about a study on reparations, made during a CNN town hall broadcast from Mississippi, mark a keener focus from the Massachusetts senator on her preferred route to tackle the thorny question of how best to deal with systemic racial inequality.

The Democratic field’s ongoing debate over reparations comes as African-American voters are poised to exert significant influence over the selection of the party’s nominee to take on President Donald Trump.

Warren offered in-depth answers to several other questions that touched on issues important to African-American communities, winning cheers for a call for Mississippi to replace its state flag — the only one in the nation that depicts a Confederate image. Warren, 69, has made racial justice a centerpiece of her case for the Democratic nomination, even as she doubles down on her long-running emphasis on economic inequity.

Warren also came out in favor of eliminating the electoral college, the most pointed instance of her opposition to the polarizing mechanism the nation uses to elect its presidents.

She has been critical of the electoral college in the past, saying last year that Trump’s 2016 victory — despite Democrat Hillary Clinton’s winning 3 million more total votes — is “not exactly the sign of a healthy democracy.” But Warren’s comments on Monday were her most straightforward endorsement of an end to the electoral college system.

“I think everybody ought to have to come and ask for your vote,” Warren said.

She also faced a tough question about her past claims to Native American identity, a political liability for her presidential run as she attempts to move past a DNA analysis she released last year that showed “significant evidence” of a distant tribal ancestor.

Warren told the audience that, growing up in Oklahoma, “I learned about my family from my family,” adding, “That’s just kind of who I am, and I do the best I can with it.” She added that, based on her experiences traveling to nearly a dozen states so far in her campaign, Americans are more inclined to ask her about issues that affect their everyday lives.

Opinion: A Town Acknowledges Its History, Looks to the Future

By James Huffman

InsideSources.com

Greenville, South Carolina, is all the rage these days. Businesses are flocking to the Upcountry, drawing new residents from all over the United States. Visitors and locals stroll on the city’s tree-lined Main Street and along the paths in Falls Park on the Reedy River. The restaurant scene has become a draw for foodies from near and far. They even have a Red Sox farm team with a scaled-down version of the Green Monster in left field.

On a recent visit my wife and I were surprised to come upon 89 “unknown” soldier graves just off Main Street — each with an undisturbed, small Confederate flag standing next to it. Nearby, on Main Street, stands a tall pedestal supporting a larger-than-life statue of a Confederate soldier. A few feet away is a small memorial to Robert E. Lee. Near the stadium there is also a statue of Shoeless Joe Jackson, another bad actor, though I suspect most of the locals believe he was innocent.

Earlier in the day we passed a handsome monument on Main Street celebrating the students of Sterling High School who, in the 1950s and 1960s, demanded an end to segregation of Greenville County’s public schools. Elsewhere in Greenville is the A.J. Whittenberg Elementary School commemorating the leader of a legal effort to desegregate the county schools. Whittenberg’s daughter Elaine was one of a handful of students who demanded admission to the previously all-white Greenville Junior High School in 1964.

A dozen decades earlier, Greenville was a Unionist stronghold, but by 1860 it had joined the secessionists. Although the town saw little action in the Civil War, its young men paid the price, as the graves off Main Street testify. The inscriptions on the commanding Confederate memorial speak to their personal sacrifices in defense of the only home and way of life they ever knew.

Greenville is also the home of the Upcountry History Museum in which the history of South Carolina’s western region is documented. Like any place in the South, it is a history of settlement, westward expansion, expropriation of Indian lands, slavery, war, reconstruction, segregation and now a half-century of slow but steady progress toward full democratic equality. The museum presents that story objectively, with neither celebration nor condemnation. Exactly what a history museum should do.

Altogether our little tour of Greenville left us feeling like the people of this beautiful southern town have got it just about right in this age of growing racial tension. The Confederacy is not celebrated nor its defeat lamented, but the young boys who died are honored, albeit in the anonymity imposed by war. The young black students who were denied equal access to education and opportunity are recognized for their determination and sacrifices. The good and bad of Greenville’s history are laid bare for visitors and those who now make their lives in this lovely town.

This is not to say that there has not been controversy. In the spirit of protesters across the country, Fighting Injustice Together held a 2017 rally demanding removal of the Confederate monument. The city took refuge in a state law that requires a two-thirds vote of the state’s General Assembly to remove a monument memorializing a historical figure. The protesters’ claims of emotional hurt from the presence of such monuments were made less convincing when one of the group’s leaders, Travis Greene, acknowledged that until recently he didn’t know there was a Confederate memorial in the city.

Also on Greenville’s Main Street is a statue of former mayor Max Heller who is credited with leading the revival of this old mill town. He was, we are told, a forward-looking man. So it seems is Greenville. Its history is what it is. Its future is bright.

ABOUT THE WRITER

James Huffman is dean emeritus at Lewis & Clark Law School in Portland, Ore. He wrote this for InsideSources.com.

OtherWords

Silicon Valley’s Next Target: America’s Farmers

They’re building robots to siphon farm profits out of local communities and into the pockets of rich investors.

By Jim Hightower | March 13, 2019

How’re you gonna keep ’em down on the farm after they’ve seen… Angus? Not the cattle breed, but the 1,000-pound “farmer of the future.”

Angus is a robot, toiling away on an indoor hydroponic farm that’s soilless as well as soulless. Programmed by a multimillion-dollar Silicon Valley start-up named Iron Ox, Angus’ homestead is an 8,000-square-foot concrete warehouse in a San Francisco suburb.

The farm bot is more of a heavy lifter than a heavy thinker, wheeling around the warehouse to lift, move, and hand off large pallets of produce to another robot that, so far, hasn’t earned a name. The human overseers of this robotic animal farm don’t wear John Deere caps, but clean-room hair nets, apparently to prevent anything organic from contaminating the edibles or the bots.

Started by a Google engineer, Iron Ox hopes to install duplicates of its faux farm in metro areas across the country. “If we can feed people using robots,” its founder says, “what could be more impactful than that?”

How about this: Reconnecting our food system to nature, a democratic economy, and humans?

The roboticists brag that local warehouses can provide fresher lettuce than the mega farms ship from thousands of miles away. But local farmers markets already do that, and the consumer dollars stay in the community, rather than being siphoned off to Iron Ox and the Wall Street financiers of Angus robots.

The robotic indoor farm hucksters quietly concede that their real business plan depends on “sidestepping” the cost of human labor and local farm owners. Instead of democratizing our food economy, their scheme concentrates food profits in a handful of absentee syndicators, rich investors, and technology giants.

Deep in his digital brain, even Angus must know that this is stupid.

OtherWords columnist Jim Hightower is a radio commentator, writer, and public speaker. Distributed by OtherWords.org.

Barrick Gold drops takeover bid for Newmont Mining

Monday, March 11

ELKO, Nev. (AP) — Barrick Gold is dropping its takeover bid for Newmont Mining, as the gold companies instead form a joint venture to combine their Nevada mining operations.

Last month Barrick Gold Corp. offered to acquire Newmont Mining Corp. for about $18 billion in stock.

The joint venture will include the companies’ assets and reserves in Nevada. It doesn’t include Barrick’s Fourmile project and Newmont’s Fiberline and Mike deposits, pending the determination of their commercial feasibility.

The companies estimate they’ll achieve $500 million in average pretax savings a year in the first five full years of the combination.

The venture, which is expected to be completed in the coming months, still needs regulatory approval.

Shares of Barrick Gold rose more than 2 percent in Monday premarket trading, while Newmont’s stock was flat.

The Conversation

Rise and fall of the landline: 143 years of telephones becoming more accessible – and smart

March 14, 2019

Author: Jay L. Zagorsky, Senior lecturer, Boston University

Disclosure statement: Jay L. Zagorsky does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

Partners: Boston University provides funding as a founding partner of The Conversation US.

The global economy has changed dramatically over the past century and a half.

When I lecture my Boston University business students on this topic, I use one of the world’s most transformative inventions to illustrate my point: the telephone.

Before the telephone was invented, it was impossible to communicate by voice with anyone beyond shouting distance. The landline’s debut in 1876, along with the telegraph a few decades earlier, revolutionized communications, leading leap by leap to the powerful computers tucked snugly in our pockets and purses today. And in the process, living standards exploded, with inflation-adjusted global GDP surging from about US$1,200 per person in 1870 to more than $10,000 today.

What follows are a few facts I like to share with my students, as well as several others that you might not be aware of about how the phone has reshaped our lives – and continues to do so.

‘Watson – I want to see you!’

One of the reasons I use the telephone in my lectures is because inventor Alexander Graham Bell actually created his phone and made the first call while a professor at Boston University, where I teach economics.

The first telephone call happened on March 10, 1876, a few days after the Scottish-born inventor received a patent for the device. After he accidentally spilled battery acid on himself, Bell called for his assistant with the famous phrase “Mr. Watson, come here – I want to see you!”

But that’s not the end of the story. Controversy continues over who actually invented the phone first. While Bell won the series of court battles over the first patent, some historians still give credit to Elisha Gray or Antonio Meucci, both of whom had been working on similar devices.

In fact, in 2002, the U.S. Congress acknowledged Meucci’s role in the invention of the telephone – though it didn’t give him sole credit.

Number of connected telephones

Phones started out as novelty items shown just to kings and queens.

Today, they are something almost everyone carries with them, even the homeless.

In 1914, at the start of World War I, there were 10 people for every working telephone in the U.S. By the end of World War II in 1945, there were five people for every working phone.

The technology passed a key milestone in 1998, when there was one phone for every man, woman and child in the U.S.

As of 2017, there were 455 million telephone numbers for the United States’ 325 million residents, or 1.4 per person. About three-quarters of those numbers were tied to mobile phones, a little over 10 percent were for old-fashioned landlines, and the rest were for internet-enabled phones.

People used to rent their phones

It may sound odd today, but until the early 1980s many consumers had to rent their phones from AT&T.

Until then, the company had a monopoly over most of the U.S. phone system. And in many states, AT&T would only rent phones to customers. In the early 1980s, the rental fee was $1.50 to about $5 per month depending on the type of phone.

That changed in 1983, when the U.S. government ended AT&T’s monopoly. Consumers in all parts of the country suddenly had the option to buy their own phone. At the time, the price for the most basic black rotary dial phone was $19.95, or a bit over $50 in today’s dollars.

The fanciest Trimline phone, with push-buttons instead of a rotary dial, sold for about $55, which is just under $150 today.

Plummeting costs

One reason phones have become so indispensable for communicating is that the cost of making calls keeps dropping.

Making a coast-to-coast phone call a century ago was very expensive. Back in 1915, a three-minute daytime phone call from New York City to San Francisco cost $20.70. Adjusted for inflation, that means the rather abrupt call cost more than $500 in today’s money.

Over the next half-century, prices fell drastically, though long-distance calling remained rather pricey. In 1968, the same three-minute call cost $1.70 – or about $12 today. That’s why, when I was dating the woman who became my wife, we primarily spoke at night – when phone calls were much cheaper – to save a little money.
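
For readers who want to check the arithmetic, the short sketch below shows the kind of inflation adjustment behind those figures. It is only illustrative: the CPI levels are approximate annual averages supplied here for illustration, not numbers taken from the article.

# Approximate CPI-U annual averages (1982-84 = 100); assumed for illustration.
CPI = {1915: 10.1, 1968: 34.8, 2018: 251.1}

def in_todays_dollars(price, year, today=2018):
    # Scale a historical price by the ratio of the two price levels.
    return price * CPI[today] / CPI[year]

print(f"1915 call: ${in_todays_dollars(20.70, 1915):.0f}")  # roughly $500+
print(f"1968 call: ${in_todays_dollars(1.70, 1968):.0f}")   # roughly $12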

Today, almost no one thinks about the price of a single cross-country call or tries to keep conversations short to save money. Phone call prices plummeted after the breakup of the U.S. telephone monopoly in the 1980s. And the invention of technologies like “voice over IP” – popularized by Skype – pushed prices down even further.

Prices have gotten so low that the Federal Communications Commission stopped tracking the cost of long-distance calls in 2006. After decades of recording phone call costs, it reported that the average long-distance call in 2006 cost just 6 cents per minute. Since most people don’t pay by the minute anymore, an extra minute of talking on the phone today is effectively free.

There’s a dark side to cheap calls, however. Robocalls are now constantly spamming Americans. The same reduction in price makes it easy for con artists to ring millions of phone numbers looking for someone gullible enough to believe their pitches.

Phone demographics

It gets a bit more interesting when you look at what types of phones households still use. There has been a dramatic shift in the last few years from landlines to cellphones, with a surprising connection to our well-being.

In 2018, a government survey found that almost 55 percent of households use cellphones exclusively, up from less than 10 percent in 2005. Another 36 percent have both a mobile phone and a working landline. Just over 5 percent of those surveyed said they relied entirely on a landline, compared with over a third of households in 2005. The remaining 3 percent said they didn’t have a phone.

So who are those people who still only use landlines?

Since it’s the Centers for Disease Control and Prevention that actually conducts this survey, we know a little more about those 5 percent. As you might expect, they are primarily elderly people – and they tend to own their homes. In contrast, households that have only mobile phones are more likely to be made up of young people who are renting. They’re also more likely to be poor and live in the Northeast.

In terms of well-being, the CDC notes that adults in wireless-only homes are more likely to be in good health and to get plenty of exercise than those with only landlines. Conversely, they are also substantially more likely to have had at least one “heavy drinking day” in the past year and more apt to be current smokers.

Phones have reshaped our lives. The next time you pull out your phone, spend a minute pondering what your life and the world would be like if the phone hadn’t been created.

The Conversation

Why your brain never runs out of problems to find

June 28, 2018

Author: David Levari, Postdoctoral Researcher in Psychology, Harvard University

Disclosure statement: This research was funded by the National Science Foundation.

Why do many problems in life seem to stubbornly stick around, no matter how hard people work to fix them? It turns out that a quirk in the way human brains process information means that when something becomes rare, we sometimes see it in more places than ever.

Think of a “neighborhood watch” made up of volunteers who call the police when they see anything suspicious. Imagine a new volunteer who joins the watch to help lower crime in the area. When they first start volunteering, they raise the alarm when they see signs of serious crimes, like assault or burglary.

Let’s assume these efforts help and, over time, assaults and burglaries become rarer in the neighborhood. What would the volunteer do next? One possibility is that they would relax and stop calling the police. After all, the serious crimes they used to worry about are a thing of the past.

But you may share the intuition my research group had – that many volunteers in this situation wouldn’t relax just because crime went down. Instead, they’d start calling things “suspicious” that they would never have cared about back when crime was high, like jaywalking or loitering at night.

You can probably think of many similar situations in which problems never seem to go away, because people keep changing how they define them. This is sometimes called “concept creep,” or “moving the goalposts,” and it can be a frustrating experience. How can you know if you’re making progress solving a problem, when you keep redefining what it means to solve it? My colleagues and I wanted to understand when this kind of behavior happens, why, and if it can be prevented.

Looking for trouble

To study how concepts change when they become less common, we brought volunteers into our laboratory and gave them a simple task – to look at a series of computer-generated faces and decide which ones seem “threatening.” The faces had been carefully designed by researchers to range from very intimidating to very harmless.

As we showed people fewer and fewer threatening faces over time, we found that they expanded their definition of “threatening” to include a wider range of faces. In other words, when they ran out of threatening faces to find, they started calling faces threatening that they used to call harmless. Rather than being a consistent category, what people considered “threats” depended on how many threats they had seen lately.

This kind of inconsistency isn’t limited to judgments about threat. In another experiment, we asked people to make an even simpler decision: whether colored dots on a screen were blue or purple.

As blue dots became rare, people started calling slightly purple dots blue. They even did this when we told them blue dots were going to become rare, or offered them cash prizes to stay consistent over time. These results suggest that this behavior isn’t entirely under conscious control – otherwise, people would have been able to be consistent to earn a cash prize.

Expanding what counts as immoral

After looking at the results of our experiments on facial threat and color judgments, our research group wondered if maybe this was just a funny property of the visual system. Would this kind of concept change also happen with non-visual judgments?

To test this, we ran a final experiment in which we asked volunteers to read about different scientific studies and decide which were ethical and which were unethical. We were skeptical that we would find the same inconsistencies in these kinds of judgments that we did with colors and threat.

Why? Because moral judgments, we suspected, would be more consistent across time than other kinds of judgments. After all, if you think violence is wrong today, you should still think it is wrong tomorrow, regardless of how much or how little violence you see that day.

But surprisingly, we found the same pattern. As we showed people fewer and fewer unethical studies over time, they started calling a wider range of studies unethical. In other words, just because they were reading about fewer unethical studies, they became harsher judges of what counted as ethical.

The brain likes to make comparisons

Why can’t people help but expand what they call threatening when threats become rare? Research from cognitive psychology and neuroscience suggests that this kind of behavior is a consequence of the basic way that our brains process information – we are constantly comparing what is in front of us to its recent context.

Instead of carefully deciding how threatening a face is compared to all other faces, the brain can just store how threatening it is compared to other faces it has seen recently, or compare it to some average of recently seen faces, or the most and least threatening faces it has seen. This kind of comparison could lead directly to the pattern my research group saw in our experiments, because when threatening faces are rare, new faces would be judged relative to mostly harmless faces. In a sea of mild faces, even slightly threatening faces might seem scary.

It turns out that for your brain, relative comparisons often use less energy than absolute measurements. To get a sense for why this is, just think about how it’s easier to remember which of your cousins is the tallest than exactly how tall each cousin is. Human brains have likely evolved to use relative comparisons in many situations, because these comparisons often provide enough information to safely navigate our environments and make decisions, all while expending as little effort as possible.
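
To make that idea of relative comparison concrete, here is a small toy simulation – not taken from the study itself – in which a judge flags a stimulus as “threatening” whenever it scores above the average of the stimuli seen most recently. The threat scores, window size and starting baseline are all assumptions made for illustration.

import random

random.seed(0)

def count_flagged(stimuli, window=50):
    # Flag a stimulus as "threatening" whenever it exceeds the average of
    # the most recently seen stimuli - a purely relative, context-bound rule.
    recent, flagged = [], 0
    for s in stimuli:
        baseline = sum(recent) / len(recent) if recent else 0.5
        if s > baseline:
            flagged += 1
        recent = (recent + [s])[-window:]
    return flagged

# Threat scores run from 0 (harmless) to 1 (very threatening).
threats_common = [random.uniform(0, 1) for _ in range(500)]
threats_rare = [random.uniform(0, 0.4) for _ in range(500)]  # real threats nearly gone

print(count_flagged(threats_common), count_flagged(threats_rare))
# Both runs flag roughly half the stimuli, even though the second stream
# contains almost nothing a fixed standard would call threatening.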

Being consistent when it counts

Sometimes, relative judgments work just fine. If you are looking for a fancy restaurant, what you count as “fancy” in Paris, Texas, should be different than in Paris, France.

But a neighborhood watcher who makes relative judgments will keep expanding their concept of “crime” to include milder and milder transgressions, long after serious crimes have become rare. As a result, they may never fully appreciate their success in helping to reduce the problem they are worried about. From medical diagnoses to financial investments, modern humans have to make many complicated judgments where being consistent matters.

How can people make more consistent decisions when necessary? My research group is currently doing follow-up research in the lab to develop more effective interventions to help counter the strange consequences of relative judgment.

One potential strategy: When you’re making decisions where consistency is important, define your categories as clearly as you can. So if you do join a neighborhood watch, think about writing down a list of what kinds of transgressions to worry about when you start. Otherwise, before you know it, you may find yourself calling the cops on dogs being walked without leashes.
