UK’s May says she’ll still have her job after Brexit vote
By JILL LAWLESS
Monday, December 3
LONDON (AP) — British Prime Minister Theresa May brushed aside questions Monday about whether she will resign if her Brexit deal is rejected by Parliament next week, saying she’s confident she’ll still have a job after the crucial vote.
May is battling to persuade lawmakers to support the divorce agreement between Britain and the European Union in a Dec. 11 House of Commons vote. Opposition parties say they will vote against it, as do dozens of lawmakers from May’s Conservatives.
Defeat would leave the U.K. facing a messy, economically damaging “no-deal” Brexit on March 29 and could topple the Conservative prime minister, her government, or both.
May said Monday, “I will still have a job in two weeks’ time.”
“My job is making sure that we do what the public asked us to: We leave the EU but we do it in a way that is good for them,” she told broadcaster ITV.
May has consistently refused to say what she plans to do if — as widely predicted — the British Parliament rejects the deal she reached with the EU.
“I’m focusing on … getting that vote and getting the vote over the line,” she said.
Politicians on both sides of Britain’s EU membership debate oppose the agreement that May has struck with the bloc — Brexiteers because it keeps Britain bound closely to the EU, and pro-EU politicians because it erects barriers between the U.K. and its biggest trading partner.
May’s opponents on both sides argue that Britain can renegotiate the deal for better terms.
But the British government and the EU insist that the agreement, which took a year and a half to negotiate, is the only one on the table, and rejecting it means leaving the bloc without a deal.
Dutch Prime Minister Mark Rutte said Monday that “there is no Plan B.”
Rutte cited “red lines” drawn by both sides during the negotiations, including the U.K.’s refusal to accept the free movement of people between Britain and the EU, and the need to keep an open border between the U.K.’s Northern Ireland and EU member Ireland.
“When you take all these red lines into account, it’s simply impossible to come up with something different than we have currently, the deal on the table,” he told The Associated Press on the sidelines of the global climate conference in Katowice, Poland.
Rutte said the choice was “this, or a hard Brexit, or no Brexit at all.”
Frank Jordans in Katowice, Poland contributed to this story.
PROTECTING THE MOST VULNERABLE FROM GENOCIDE
By J.P. Linstroth
Ghosts of European colonialism still haunt us today, even in the 21st century. This was evident in the untimely death of American John Allen Chau, 26, from Washington state, on November 16 at the hands of the Sentinelese-Jarawa people of the Andaman Islands.
The last days of Mr. Chau are reminiscent of the novel At Play in the Fields of the Lord (1965) by the late naturalist Peter Matthiessen, in which everything goes wrong for the Christian missionaries, as it did for Chau. This young American was naively determined to eradicate what he called “Satan’s last stronghold” in the world and paid for it with his life.
There are only about 100 Sentinelese-Jarawa people left. They are fiercely independent, rejecting contact with outsiders and remaining culturally intact. The Jarawa are small-statured and ebony-complexioned, and are thought to be remnants of people who migrated thousands of years ago from Africa and settled in the Andaman Islands, east of India and west of Myanmar. For the most part, the Indian government has been successful in preventing interlopers from reaching their island. Chau paid some local fishermen to take him to these remote people.
Because these tribal people have remained isolated, they are also highly susceptible to viral contagion, even from the common cold and flu.
As an anthropologist who writes about indigenous issues, I am aware that European colonialism remains an ever-present issue in the minds and memories of many indigenous peoples because of the horrifying genocide wreaked upon them.
Indeed, very few “uncontacted” indigenous peoples remain on our globalized planet. Most uncontacted natives are found in the Amazon region, in the borderlands of Bolivia, Brazil, and Peru. Anthropologists continually worry about the safety of such vulnerable people because they lack immunity to Western-borne illnesses and face threats from outsiders such as tourists, illegal loggers, and gold miners encroaching upon their territories.
In assessing the situation of isolated peoples like the Sentinelese-Jarawa of the Andaman Islands and the isolated Amerindians of the Amazon, we need to return to the history of Western thought and Western civilization to explain the problems associated with contact between indigenous peoples and Europeans.
We may begin with the Abrahamic religions’ origin story of Adam and Eve, who bit into the forbidden fruit of knowledge in the biblical Old Testament and were ejected from Eden by Yahweh. The symbolism is fairly clear. The Garden of Eden represents idyllic nature: the millennia humankind spent hunting and gathering until the Neolithic Revolution, about 12,000 years ago, brought the domestication of plants and animals. This was the so-called transition to civilization, eventually producing states, writing, hierarchies and class systems, organized religion, slavery, mass warfare, science, astronomy, mathematics, and genocide. It was a knowledge that supposed humankind was somehow separate from nature and that humans were superior to it, at least to Western minds in the Judeo-Christian tradition.
What is more, in the “Age of Exploration,” Europeans believed they had to “civilize” non-Western people, bring them God’s word, and convert natives to European ways. As such, along with killing and torturing natives, Europeans made a practice of saving indigenes from their supposed state of nature, ignorance, and savagery.
When the Portuguese first landed in Brazil in 1500, Pêro Vaz de Caminha wrote to King Manoel I: “They seem to be such innocent people that, if we could understand their speech and they ours, they would immediately become Christians. For it appears that they do not have or understand any faith…” Salvation of natives indeed became one of the primary projects of the conquest.
Such religious proselytizing supposes, as Mr. Chau did, that native peoples have no minds and no wills of their own and are devoid of proper religious thought. They are instead mere objects, vessels to be filled with the true faith brought to them by force if necessary; why would God give the means of force to the colonizers unless He wanted them to convert the natives?
Witness accounts from the Spanish conquest such as those of Bartolomé de las Casas describe how devastating the violence against Indians was, such as during the invasion of Peru: “…I affirm that with my own eyes I saw Spaniards cut off the nose, hands, and ears of Indians, male and female, without provocation, merely because it pleased them to do it, and this they did in so many places that it would take a long time to recount.”
Similarly, here is an eyewitness account by Captain Nicholas Hodt of an 1861 massacre of Navajo (Diné) in present-day Arizona: “The Navahos, squaws, and children ran in all directions and were shot and bayoneted. I succeeded in forming about twenty men…I then marched out to the east side of the post; there I saw a soldier murdering two little children and a woman. I hallooed immediately to the soldier to stop. He looked up, but did not obey my order…”
The arc of this protracted genocide, from massacres like “Wounded Knee” against the Sioux in 1890 to, more recently, the effects of development in the Brazilian Amazon during the 1960s, is long and never forgotten by descendants of the victims. In every case, aboriginal peoples are subsequently plagued by the loss of their identities, homelands, ways of life, and cultures, often resulting in alcoholism, domestic violence, drug abuse, and suicide.
Unbeknownst to most Americans, Hitler partially based his ideas of the concentration camp and extermination of European Jews on Native American reservations and our racial policies in the United States. Americans systematically conquered original peoples and justified the ensuing genocide with the idea of “Manifest Destiny,” a term coined by journalist John O’Sullivan, meaning the divine right to settle the continental United States from “sea to shining sea.” The Natives were just part of the natural landscape and in the way.
Genocide continues now in Brazil with the persecution of the Guarani-Kaiowá on ranch lands in Mato Grosso do Sul. Even more worrying, newly elected Brazilian President Jair Bolsonaro promises to persecute Brazilian Amerindians for their lands.
Like the Sentinelese-Jarawa who voluntarily choose to be isolated, the Mashco-Piro people of Western Peru live by choice without contact. As anthropologist Glenn Shepard, of the Museu Emilio Goeldi of Brazil, explains: “…Isolated Amazonian groups have not remained stuck in the Stone Age since time immemorial. Rather, they have resorted to ‘voluntary isolation’ in modern times in order to survive.”
“Civilization” is an ambivalent outcome at best for indigenous peoples. At this late date, we can likely learn much more from them than the reverse. After all, their lifeways sustained them for untold millennia, whereas climate chaos, nuclear annihilation, and resource depletion and contamination, the products of conquering empires, are supplanting genocide with societal suicide. It is time to listen instead of preach, time to slow down, consume much less, and respect all peoples and our planet.
J. P. Linstroth earned his PhD from the University of Oxford in Social and Cultural Anthropology and is a former Fulbright Scholar to Brazil. He is author of Marching Against Gender Practice: Political Imaginings in the Basqueland.
Taking a second look at the learn-to-code craze
Updated November 30, 2018
Are computers in the classroom more helpful to students – or the companies that sell the machines?
Author: Kate M. Miltner, Ph.D. Candidate in Communication, University of Southern California, Annenberg School for Communication and Journalism
Disclosure statement: Kate M. Miltner does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.
Partners: University of Southern California, Annenberg School for Communication and Journalism provides funding as a member of The Conversation US.
Over the past several years, the idea that computer programming – or “coding” – is the key to the future for both children and adults alike has become received wisdom in the United States. The aim of making computer science a “new basic” skill for all Americans has driven the formation of dozens of nonprofit organizations, coding schools and policy programs.
As this year’s annual Computer Science Education Week begins, it is worth taking a closer look at this recent coding craze. The Obama administration’s “Computer Science For All” initiative and the Trump administration’s effort are both based on the idea that computer programming is not only a fun and exciting activity, but a necessary skill for the jobs of the future.
However, the American history of these education initiatives shows that their primary beneficiaries aren’t necessarily students or workers, but rather the influential tech companies that promote the programs in the first place. The current campaign to teach American kids to code may be the latest example of tech companies using concerns about education to achieve their own goals. This raises some important questions about who stands to gain the most from the recent computer science push.
Old rhetoric about a ‘new economy’
One of the earliest corporate efforts to get computers into schools was Apple’s “Kids Can’t Wait” program in 1982. Apple co-founder Steve Jobs personally lobbied Congress to pass the Computer Equipment Contribution Act, which would have allowed companies that donated computers to schools, libraries and museums to deduct the equipment’s value from their corporate income tax bills. While his efforts in Washington failed, he succeeded in his home state of California, where companies could claim a tax credit for 25 percent of the value of computer donations.
The bill was clearly a corporate tax break, but it was framed in terms of educational gaps: According to a California legislative analysis, the bill’s supporters felt that “computer literacy for children is becoming a necessity in today’s world” and that the bill would help in “placing needed ‘hardware’ in schools unable to afford computers in any other way.”
Kids Can’t Wait took advantage of Reagan-era concerns that Americans were “falling behind” global competitors in the “new economy.” In 1983, a U.S. Department of Education report titled “A Nation at Risk” warned that the country’s “once unchallenged preeminence in commerce, industry, science, and technological innovation is being overtaken by competitors throughout the world.” The report’s authors blamed the American education system for turning out graduates who were underprepared for a fast-changing, technology-infused workplace.
Over the past 30 years, the same rhetoric has appeared again and again. In 1998, Bill Clinton proclaimed that “access to new technology means … access to the new economy.” In 2016, U.S. Chief Technology Officer Megan Smith described the Obama administration’s coding initiative as an “ambitious, all-hands-on-deck effort to get every student in America an early start with the skills they’ll need to be part of the new economy.”
While technology is often framed as the solution for success in a globalized labor market, the evidence is less clear. In his 2001 book “Oversold and Underused: Computers in the Classroom,” education researcher Larry Cuban warned that technology on its own would not solve “education’s age-old problems,” such as inequitable funding, inadequate facilities and overworked teachers.
Cuban found that some educational technology initiatives from the 1990s did help students get access to computers and learn basic skills. But that didn’t necessarily translate into higher-wage jobs when those students entered the workforce. However, the equipment and software needed to teach them brought large windfalls for tech companies – in 1995 the industry was worth US $4 billion.
If computers in schools didn’t work as promised two decades ago, then what’s behind the current coding push? Cuban points out that few school boards and administrators can resist pressure from business leaders, public officials and parents. The CS For All Consortium, for example, has a large membership of education companies that are taking advantage of funding from state legislatures.
A huge boost comes from the tech giants, too. Amazon, Facebook, Google, Microsoft and others are collectively contributing $300 million to the Trump administration’s new federal initiative – no doubt seeing, as The New York Times observed, the potential to “market their own devices and software in schools as coding classes spread.”
This isn’t always the best deal for students. In 2013, the Los Angeles Unified School District planned to give Apple iPads to every student in every school – at a cost of $1.3 billion. The program was a fiasco: The iPads had technical problems and incomplete software that made them essentially useless. The fallout included investigations by the FBI and the U.S. Securities and Exchange Commission, and a legal settlement in which Apple and its partners repaid the school district $6.4 million.
However, tech companies are framing their efforts in more noble terms. In June 2017, Microsoft president Brad Smith compared the efforts of tech industry nonprofit Code.org to previous efforts to improve science and technology training in the United States. Recalling the focus on scientific research that drove the Space Race, Smith said, “We think computer science is to the 21st century what physics was to the 20th century.”
Indeed, tech companies are having a very hard time hiring and retaining software engineers. With new concerns about restrictions on visas for skilled immigrant workers, the industry could definitely benefit from a workforce trained with public dollars.
For some tech companies, this is an explicit goal. In 2016, Oracle and Micron Technology helped write a state education bill in Idaho which read, “It is essential that efforts to increase computer science instruction, kindergarten through career, be driven by the needs of industry and be developed in partnership with industry.” While two lawmakers objected to the corporate influence on the bill, it passed with an overwhelming majority.
Some critics argue that the goal of the coding push is to massively increase the number of programmers on the market, depressing wages and bolstering tech companies’ profit margins. Though there is no concrete evidence for this claim, the fact remains that only half of college students who majored in science, technology, engineering or math-related subjects get jobs in their fields after graduation. That casts doubt on the idea that there is a “skills gap” between workers’ abilities and employers’ needs. Yet concerns about such a gap have helped justify investment in tech education over the past 20 years.
As millions of dollars flow to technology companies in the name of education, they often bypass other major needs of U.S. schools. Technology in the classroom can’t solve the problems that budget cuts, large class sizes and low teacher salaries create. Worse still, new research is finding that contemporary tech-driven educational reforms may end up intensifying the problems they were trying to fix.
Who will benefit most from this new computer science push? History tells us that it may not be students.
Editor’s notes: This is an updated version of an article originally published Dec. 4, 2017.