2-for-1: Total lunar eclipse comes with supermoon bonus
By MARCIA DUNN
AP Aerospace Writer
Monday, January 21
CAPE CANAVERAL, Fla. (AP) — The only total lunar eclipse this year and next came with a supermoon bonus.
On Sunday night, the moon, Earth and sun lined up to create the eclipse, which was visible throughout North and South America, where skies were clear. There won’t be another until the year 2021.
It was also the year’s first supermoon, when a full moon appears a little bigger and brighter thanks to its slightly closer position.
The entire eclipse took more than three hours. Totality — when the moon is completely bathed in Earth’s shadow — lasted an hour. During a total lunar eclipse, the eclipsed, or blood, moon turns red from sunlight refracted through Earth’s atmosphere.
In addition to the Americas, the entire lunar extravaganza could be observed, weather permitting, all the way across the Atlantic to parts of Europe.
Why paper maps still matter in the digital age
January 22, 2019
Author: Meredith Broussard, Assistant Professor of Journalism, New York University
Disclosure statement: MIT Press provides funding as a member of The Conversation US.
Ted Florence is ready for his family trip to Botswana. He has looked up his hotel on Google Maps and downloaded a digital map of the country to his phone. He has also packed a large paper map. “I travel all over the world,” says Florence, the president of the international board of the International Map Industry Association and Avenza, a digital map software company. “Everywhere I go, my routine is the same: I get a paper map, and I keep it in my back pocket.”
With the proliferation of smartphones, it’s easy to assume that the era of the paper map is over. That attitude, that digital is better than print, is what I call “technochauvinism.” In my book, “Artificial Unintelligence: How Computers Misunderstand the World,” I look at how technochauvinism has been used to create an unnecessary, occasionally harmful bias for digital over print or any other kind of interface. A glance at the research reveals that the paper map still thrives in the digital era, and there are distinct advantages to using print maps.
Your brain on maps
Cognitive researchers generally make a distinction between surface knowledge and deep knowledge. Experts have deep knowledge of a subject or a geography; amateurs have surface knowledge.
Digital interfaces are good for acquiring surface knowledge. Answering the question, “How do I get from the airport to my hotel in a new-to-me city?” is a pragmatic problem that requires only shallow information to answer. If you’re traveling to a city for only 24 hours for a business meeting, there’s usually no need to learn much about a city’s layout.
When you live in a place, or you want to travel meaningfully, deep knowledge of the geography will help you to navigate it and to understand its culture and history. Print maps help you acquire deep knowledge faster and more efficiently. In experiments, people who read on paper consistently demonstrate better reading comprehension than people who read the same material on a screen. A 2013 study showed that, as a person’s geographic skill increases, so does their preference for paper maps.
For me, the difference between deep knowledge and surface knowledge is the difference between what I know about New York City, where I have lived for years, and San Francisco, which I have visited only a handful of times. In New York, I can tell you where all the neighborhoods are and which train lines to take and speculate about whether the prevalence of Manhattan schist in the geological substrate influenced the heights of the buildings that are in Greenwich Village versus Midtown. I’ve invested a lot of time in looking at both paper and digital maps of New York. In San Francisco, I’ve only ever used digital maps to navigate from point to point. I’ll be the first to admit that I don’t know where anything is in the Bay Area.
Our brains encode knowledge as what scientists call a cognitive map. In psychology-speak, I lack a cognitive map of San Francisco.
“When the human brain gathers visual information about an object, it also gathers information about its surroundings, and associates the two,” wrote communication researchers Jinghui Hou, Justin Rashid and Kwan Min Lee in a 2017 study. “In a similar manner to how people construct a mental map of a physical environment (e.g., a desk in the center of an office facing the door), readers form a ‘cognitive map’ of the physical location of a text and its spatial relationship to the text as a whole.”
Reading in print makes it easier for the brain to encode knowledge and to remember things. Sensory cues, like unfolding the complicated folds of a paper map, help create that cognitive map in the brain and help the brain to retain the knowledge.
The same is true for a simple practice like tracing out a hiking route on a paper map with your finger. The physical act of moving your arm and feeling the paper under your finger gives your brain haptic and sensorimotor cues that contribute to the formation and retention of the cognitive map.
Another factor in the paper versus digital debate is accuracy. Obviously, a good digital map is better than a bad paper map, just like a good paper map is better than a bad digital map.
Technochauvinists may believe that all digital maps are good, but just as in the paper world, the accuracy of digital maps depends entirely on the level of detail and fact-checking invested by the company making the map.
For example, a 2012 survey by the crowdsourcing company CrowdFlower found that Google Maps accurately located 89 percent of businesses, while Apple Maps correctly found 74 percent. This isn’t surprising, as Google invests millions in sending people around the world to map terrain for Google Street View. Google Maps is good because the company invests time, money and human effort in making its maps good – not because digital maps are inherently better.
Fanatical attention to detail is necessary to keep digital maps up to date, as conditions in the real world change constantly. Companies like Google are constantly updating their maps, and will have to do so regularly for as long as they continue to publish. The maintenance required for digital content is substantial – a cost that technochauvinists often ignore.
In my view, it’s easier to forgive the errors in a paper map. Physical maps usually display their publication date prominently, so users can tell how current the information is. (When was the last time you noticed the date of the last update on your car navigation system?) When you are passively following spoken GPS directions and there is, say, an unmarked exit, the navigation system gets confused and causes chaos among the people in the car. (Especially the backseat drivers.)
The best map for the job
Some of the deeper flaws of digital maps are not readily apparent to the public. Digital systems, including cartographic ones, are more interconnected than most people realize. Mistakes, which are inevitable, can go viral and create more trouble than anyone anticipates.
For example: Reporter Kashmir Hill has written about a Kansas farm in the geographic center of the U.S. that has been plagued by legal trouble and physical harassment, because a digital cartography database mistakenly uses the farm’s location as a default every time the database can’t identify the real answer.
“As a result, for the last 14 years, every time MaxMind’s database has been queried about the location of an IP address in the U.S. it can’t identify, it has spit out the default location of a spot two hours away from the geographic center of the country,” Hill wrote. “This happens a lot: 5,000 companies rely on MaxMind’s IP mapping information, and in all, there are now over 600 million IP addresses associated with that default coordinate.”
A technochauvinist mindset assumes everything in the future will be digital. But what happens if a major company like Google stops offering its maps? What happens when a government shutdown means that satellite data powering smartphone GPS systems isn’t transmitted? Right now, ambulances and fire trucks can keep a road atlas in the front seat in case electronic navigation fails. If society doesn’t maintain physical maps, first responders won’t be able to get to addresses when there is a fire or someone is critically ill.
Interrupting a country’s GPS signals is also a realistic cyberwarfare tactic. The U.S. Navy has resumed training new recruits in celestial navigation, a technique that dates back to ancient Greece, as a safeguard in case the digital grid is hacked.
Ultimately, I don’t think it should be a competition between physical and digital. In the future, people will continue to need both kinds of maps. Instead of arguing whether paper or digital is a better map interface, people should consider what map is the right tool for the task.
Meredith Broussard is the author of “Artificial Unintelligence: How Computers Misunderstand the World.” MIT Press provides funding as a member of The Conversation US.
Ira More: Another argument for paper over digital maps, and the strongest one from my perspective, is the pure joy of mapping out a route on paper. I am a private pilot and long-distance motorcycle rider. While I have GPS for both my aircraft and my motorcycle, I use it only as a last resort.
Kremlin ‘optimistic’ ahead of WADA doping ruling
By JAMES ELLINGWORTH
AP Sports Writer
Tuesday, January 22
MOSCOW (AP) — Russia is “optimistic” ahead of a World Anti-Doping Agency ruling on whether the country’s authorities met demands to turn over lab data, Vladimir Putin’s spokesman said Tuesday.
The WADA executive committee reinstated Russia’s anti-doping agency in September — despite protests from many Western athletes and officials — on condition the country turned over data from a Moscow laboratory. The data could help WADA pursue doping cases against many top Russian athletes for past offenses.
WADA representatives left Moscow with the data last week but only after Russia missed a Dec. 31 deadline.
“Our sports authorities have clearly made the maximum effort to arrange the work of the WADA representatives in Moscow, to arrange all the necessary procedures and contacts,” said Putin’s spokesman, Dmitry Peskov. “So in Moscow everyone is optimistic.”
The U.S. Anti-Doping Agency had called for the Russian agency, known as RUSADA, to be suspended for missing the deadline, though WADA said it preferred to wait for the data.
Russian law enforcement sealed off the lab and its data after the former director, Grigory Rodchenkov, testified to WADA that he covered up doping for several years and swapped doped Russian athletes’ samples for clean urine during the 2014 Winter Olympics in Sochi.
WADA president Craig Reedie on Thursday hailed the recovery of the data as “a major breakthrough for clean sport,” but said WADA couldn’t yet be sure the information was genuine.
More AP sports: https://apnews.com/apf-sports and https://twitter.com/AP_Sports
Are microbes causing your milk allergy?
January 22, 2019
Author: Cathryn Nagler, Bunning Food Allergy Professor, University of Chicago
Disclosure statement: Cathryn Nagler is the President and Co-Founder of ClostraBio, Inc.
In the past 30 years, food allergies have become increasingly common in the United States. Changes to human genetics can’t explain the sudden rise, because it takes many generations for genetic changes to spread that widely within a population. Perhaps the explanation lies in changes to our environment, particularly our internal environment. Shifting lifestyle practices over the last half-century – increasing antibiotic and antimicrobial use, surface sterilization, air filtration and changes to diet – have changed our internal environment and wiped out important bacteria with beneficial health effects.
For many years, my research group at the University of Chicago has been exploring the role that intestinal bacteria play in preventing allergic responses to food. Bacteria, together with viruses, fungi and other small organisms living in and on our bodies, collectively make up the microbiome and play a critical supporting role in health.
The microbiome is our internal environment. Humans and microbes have “grown up” together: As humans evolved, so did their microbes. We tend to think of health practices as changing slowly, but from the perspective of the bacteria in our guts, changes in their composition and function have happened more quickly – and the results are dramatic.
Intestinal bacteria and allergies
Several years ago, my research group, together with a collaborator in Italy, Roberto Berni Canani, was comparing the bacteria present in infants with a diagnosed cow’s milk allergy to those in infants without the allergy. We found some remarkable differences between the two groups. This led us to wonder whether the different bacteria present in each of the two groups were sufficient to protect against allergy. And if so, could we figure out why?
To test this idea, we transplanted the entire microbiome of the two different groups – the healthy infants and those allergic to cow’s milk – into special laboratory mice that were bred in a completely sterile environment, with no bacteria of their own. The idea was simple: If we transplant the different groups of bacteria into mice, will the mouse become allergic to cow’s milk or not?
When we did this, we were stunned by the results: The bacteria from a healthy infant could protect the mouse from developing an anaphylactic response to a cow’s milk protein, while the bacteria from an infant allergic to cow’s milk did not.
A new diagnostic?
When we cataloged the bacteria present in the mice colonized with microbes from healthy infants and those colonized with microbes from milk-allergic infants, we were able to calculate a ratio of protective to nonprotective bacterial groups. This ratio accurately predicted whether or not the infants had an allergy. We also learned that the two different groups of bacteria activate different genes in the mouse gut.
These genes influence a variety of processes in the intestine, such as metabolism and permeability. We identified one bacterial species in particular, Anaerostipes caccae, as the key. When we put only this species into a germ-free mouse, the mouse was protected from food allergy.
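The ratio-based prediction described above can be illustrated with a minimal sketch. The taxa names, abundance counts and cutoff below are invented for illustration; the study’s actual bacterial groups and threshold are not reproduced here.

```python
# Hypothetical sketch of a ratio-based allergy prediction.
# Taxa, counts and the threshold are invented, not the study's values.

def protective_ratio(abundances, protective_taxa):
    """Ratio of counts from protective taxa to counts from all other taxa."""
    protective = sum(n for taxon, n in abundances.items() if taxon in protective_taxa)
    other = sum(n for taxon, n in abundances.items() if taxon not in protective_taxa)
    return protective / other if other else float("inf")

def predict_healthy(abundances, protective_taxa, threshold=1.0):
    """Predict 'healthy' when protective taxa dominate (hypothetical cutoff)."""
    return protective_ratio(abundances, protective_taxa) >= threshold

# Invented sequencing counts for two infants.
protective = {"Anaerostipes", "Clostridia_other"}
healthy_infant = {"Anaerostipes": 300, "Clostridia_other": 250, "Enterobacteria": 120}
allergic_infant = {"Anaerostipes": 20, "Clostridia_other": 60, "Enterobacteria": 400}

print(predict_healthy(healthy_infant, protective))   # protective taxa dominate
print(predict_healthy(allergic_infant, protective))  # protective taxa depleted
```

The design choice mirrors the article’s idea: a single summary statistic over two bacterial groups, compared against a cutoff, rather than a complex model.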
These studies show a health-promoting role for the microbiome in food allergy. It’s clear that the internal environment of the intestine is very different in infants with and without food allergy, and that this internal environment changes the biochemistry of the intestine.
Our study also suggests a way forward to harness these protective bacteria, and the molecules that they produce, as therapies to prevent and to treat food allergy. They could also work well as a diagnostic tool for predicting allergies and allergy risk. Therapies based on this idea remain 5 to 10 years away, but I am excited about their prospects. Such therapies may provide relief for children, parents, caregivers and patients living with food allergy.
TSA screener sick-outs hit 10 percent over holiday weekend
By DAVID KOENIG
AP Airlines Writer
Tuesday, January 22
The percentage of TSA airport screeners missing work has hit 10 percent as the partial government shutdown stretches into its fifth week.
The Transportation Security Administration said Monday that Sunday’s 10 percent absence rate compared with 3.1 percent on the comparable Sunday a year ago.
The workers who screen passengers and their bags face missing another paycheck if the shutdown doesn’t end early this week. According to TSA, many of them say the financial hardship is preventing them from reporting to work.
TSA says the national average waiting time in airport checkpoint lines is within the normal limit of 30 minutes, but there are longer lines at some airports.
The agency has dispatched extra screeners to airports in Atlanta, LaGuardia Airport in New York, and Newark, New Jersey. A TSA spokesman said other airports might also be getting additional help.
Sunday’s 10 percent absence rate, which topped the previous high of 8 percent set on Saturday, indicates that more than 3,000 airport screeners missed work. TSA has 51,000 screeners, and a spokesman said that about 33,000 work on any given day.
With fewer screeners, TSA closed one of its security checkpoints at Baltimore/Washington airport over the weekend, reopened it, but closed it again Monday afternoon, according to an airport spokeswoman.
A checkpoint at Houston’s Bush Intercontinental Airport remained closed. An airport spokesman said lines were relatively short at the other six checkpoints.
TSA appeared to be managing the high sick-out rate as well as could be expected. The agency said that on Sunday it screened 1.78 million passengers, and only 6.9 percent — roughly 120,000 people — had to wait 15 minutes or longer to get through security.
No figures were yet available for Monday, but websites or spokespeople for several major airports including Dallas-Fort Worth and Chicago’s O’Hare reported normal security lines and few problems. Hartsfield-Jackson Atlanta International Airport, which had some of the longest lines in the country last week, reported waits of 15 to 30 minutes at domestic-travel checkpoints Monday. Los Angeles International Airport showed most lines under 20 minutes.
TSA got a break from bad weather: Storms in the Midwest and Northeast led airlines to cancel more than 4,400 flights over the three-day weekend, which reduced the number of passengers to screen.
A few airports — San Francisco’s being the largest — conduct screening with government-approved private contractors, not TSA. A long government shutdown and more TSA sick-outs could lead other airports to consider going private, although that hasn’t happened yet.
The holiday honoring Martin Luther King Jr. is not as busy for travel as many other three-day weekends. However, inconvenience could become a crisis for the travel industry the longer the shutdown lasts — and there are few signs of movement by President Donald Trump or congressional Democrats to break the stalemate over border-wall spending that is causing the shutdown.
“Presidents’ Day weekend is much bigger, and then spring break and Easter — those are really important,” said Savanthi Syth, an airline analyst for Raymond James. Presidents’ Day is Feb. 18, and Syth said if the shutdown drags into next month it could cause some passengers to cancel travel plans.
David Koenig can be reached at http://twitter.com/airlinewriter
Lessons from ‘Spider-Man’: How video games could change college science education
January 22, 2019
Author: Aaron Harrison, Teaching and Research Fellow, Chapman University
Disclosure statement: Aaron Harrison does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.
Like many people over the holidays, I spent some time – maybe too much – playing one of the most popular and best reviewed video games of 2018: “Spider-Man.”
While I thought I’d be taking a break from chemistry research, I found myself web-swinging through virtual research missions all over New York City. I collected samples of polycyclic aromatic hydrocarbons in Hell’s Kitchen, studied vehicle emissions in Chinatown and determined the chemical composition of atmospheric particulate matter in Midtown.
“Spider-Man” has many of these eco-friendly research missions. But what I found most encouraging is that the game also includes tools that can potentially teach advanced concepts in chemistry and physics. These tools include adjusting the wavelength and amplitude of radio waves, rewiring circuits to meet target voltages, and what will be examined here, using absorption spectroscopy to identify unknown chemicals.
Believe it or not, the millions of people playing “Spider-Man” have been unwittingly introduced to principles of quantum mechanics. There is a lot of veiled science to this aspect of the video game. Perhaps more importantly – as a chemistry researcher and university lecturer – I believe the game represents an interesting opportunity to teach science in a fun and engaging way in higher education.
Spectroscopy and ‘Spider-Man’
To better understand the scientific technique that players simulate in “Spider-Man,” it helps to have a short primer on what absorption spectroscopy is.
The interaction of light with matter is the most powerful means scientists have to understand what matter is made of. When matter does not interact with light, we are quite literally left in the dark. This problem is made obvious by the still-unknown composition of dark matter, which constitutes the vast majority of matter in the universe.
Using light to study ordinary matter like atoms and molecules is a broad field of science known as spectroscopy. It is an important part of university courses in chemistry and physics. There are currently many different types of spectroscopy. However, the underlying concepts are almost entirely the same as the original version that began in the 17th century when Isaac Newton first dispersed sunlight with a prism.
As famously illustrated on Pink Floyd’s “Dark Side of the Moon” album cover, dispersing the white light of the sun with a prism reveals its continuous color spectrum extending from violet (higher energy, shorter wavelength) to red (lower energy, longer wavelength). However, if this is done carefully, you would find that this continuous spectrum is patterned with intermittent dark bands.
While the origin of these dark bands was not fully understood until the 20th century, scientists now know that they are due to absorption of specific wavelengths of light by atoms and molecules present in the sun. In fact, this kind of spectroscopy led to the discovery of helium in the solar spectrum before the element was identified on Earth, which is why helium derives its name from the Greek “helios,” meaning sun.
So what causes this phenomenon? Atoms and molecules have a set of energy levels that depend on how their electrons are arranged. The absorption of light – which, remember, is energy – can cause the electrons to rearrange into these different levels. The catch is that the energy – or wavelength – of light must exactly match the energy difference between two electron arrangements in an atom or molecule for absorption to occur. This set of energies is unique for each chemical and leads to a distinct absorption spectrum, much like a fingerprint, from which it can be identified.
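The energy-matching rule above can be made concrete with a short sketch. It uses the standard relation between photon energy and wavelength, E = hc/λ; the three energy levels in the example are invented for illustration, not the levels of any real molecule.

```python
# Sketch of the energy-matching rule behind absorption spectroscopy.
# A photon is absorbed only when its energy equals a gap between two levels.

H = 6.626e-34   # Planck's constant, J*s
C = 2.998e8     # speed of light, m/s
EV_TO_J = 1.602e-19  # electron volts to joules

def absorption_wavelengths_nm(levels_ev):
    """Return the photon wavelengths (nm) that a system with the given
    electronic energy levels (in eV) can absorb, one per level pair."""
    wavelengths = []
    for i, lo in enumerate(levels_ev):
        for hi in levels_ev[i + 1:]:
            delta_e = (hi - lo) * EV_TO_J      # energy gap in joules
            wavelengths.append(H * C / delta_e * 1e9)  # lambda = hc/E, in nm
    return sorted(wavelengths)

# Hypothetical three-level system: its absorption lines form a unique
# "fingerprint," which is what the game's spectrum-matching puzzle mimics.
print(absorption_wavelengths_nm([0.0, 2.0, 3.1]))
```

A larger gap means a higher-energy, shorter-wavelength absorption line, which is why each chemical’s set of levels yields a distinctive pattern of dark bands.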
In “Spider-Man,” the player identifies unknown substances using simplified versions of these spectra.
Spectrum of Unknown Molecule from Research Mission.
The goal is to match the pattern in the spectrum using the fragment inventory provided to give the absorption spectrum of the unknown substance. Unfortunately for chemists everywhere, determining the chemical structure of an unknown molecule is much more complicated.
Still, there is a significant amount of science conveyed in the video game version of what a spectroscopist would call assigning this spectrum. Only slight modifications and additional explanation could make these parts of the game an excellent way to teach these concepts to undergraduate science students. But are video games ever used in higher education?
Video games in higher education
Video games for teaching more elementary skills like arithmetic or spelling are common. Similarly, colleges and universities are increasingly infusing video games into their coursework.
In a recent publication in the journal Nature Chemistry, researchers presented a modified version of the video game “Minecraft” called “PolyCraft World.” In this game, the player learns polymer chemistry by crafting materials in the game. Preliminary results showed that students learned real chemistry through the game even though they weren’t doing it for grades or getting regular classroom instruction.
In the popular game “Kerbal Space Program,” the player builds their own space program by successfully launching rockets into orbit. The game was not originally intended for educational purposes but implements rigorous orbital mechanics in its physics calculations. It is so accurate that NASA joined the game’s developers to create new missions, and it now has a teaching-ready standalone game that could be used directly in university physics courses.
A unique approach has been taken with the biochemistry-based game “FoldIt,” which serves as both an educational and a citizen science platform. In the game, players manipulate the structures of real proteins to search for the “best,” or lowest-energy, structures. Results published in the journal Nature showed that players’ search methods can be successfully combined with computer-based algorithms to solve actual scientific problems.
The use of video games in higher education is a real possibility, and the format has clear advantages for delivering educational content: remote access, personalized student progress and immediate feedback. However, creating an engaging video game from scratch is challenging, costly and time-consuming. As indicated by the creators of “PolyCraft World,” finding existing games to modify for educational purposes – like the research missions in “Spider-Man” – could be the best way forward.