Science Says: A big space crash likely made Uranus lopsided
By SETH BORENSTEIN
AP Science Writer
Friday, December 21
WASHINGTON (AP) — Uranus is a lopsided oddity, the only planet to spin on its side. Scientists now think they know how it got that way: It was pushed over by a rock at least twice as big as Earth.
Detailed computer simulations show that an enormous rock crashed into the seventh planet from the sun, said Durham University astronomy researcher Jacob Kegerreis, who presented his analysis at a large earth and space science conference this month.
Uranus is unique in the solar system. The massive planet tilts about 90 degrees on its side, as do its five largest moons. Its magnetic field is also lopsided and doesn’t go out the poles like ours does, said NASA chief scientist Jim Green. It is also the only planet whose interior heat does not escape from its core. It has rings like Saturn, albeit faint ones.
“It’s very strange,” said Carnegie Institution planetary scientist Scott Sheppard, who wasn’t part of the research.
The computer simulations show that the collision and reshaping of Uranus — maybe enveloping some or all of the rock that hit it — happened in a matter of hours, Kegerreis said. He produced an animation showing the violent crash and its aftermath.
It’s also possible that the big object that knocked over Uranus is still lurking in the solar system, too far away for us to see, said Green. That would help explain some odd orbits in the outer solar system and would fit with a theory that a missing Planet X is circling the sun well beyond Pluto, he said.
Green said it’s possible that a lot of smaller space rocks — the size of Pluto — pushed Uranus over, but Kegerreis’ research and Sheppard point to a single huge unknown suspect. Green said a single impact “is the right thinking.”
The collision happened 3 billion to 4 billion years ago, likely before the larger moons of Uranus formed. Instead there was a disk of stuff that would eventually come together to form moons. And when that happened, Uranus’ odd tilt acted like a gravity tidal force pushing those five large moons to the same tilt, Kegerreis said.
It also would have created an icy shell that kept Uranus’ inner heat locked in, Kegerreis said. (Uranus’ surface is minus 357 degrees Fahrenheit, or minus 216 Celsius.)
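Those two temperature figures agree with the standard Fahrenheit-to-Celsius conversion, which can be checked in a few lines (the function name here is just illustrative):

```python
def f_to_c(deg_f):
    """Standard Fahrenheit-to-Celsius conversion: C = (F - 32) * 5/9."""
    return (deg_f - 32.0) * 5.0 / 9.0

# Uranus' surface temperature as reported above:
print(round(f_to_c(-357)))  # -216
```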
Ice is key with Uranus and its neighbor Neptune. A little more than a decade ago, NASA reclassified those two planets as “ice giants,” no longer lumping them with the other large planets of the solar system, the gas giants Saturn and Jupiter.
Pluto, which is tiny, farther from the sun and not even officially a planet anymore, has been explored more than Uranus and Neptune. They only got brief flybys by Voyager 2, the space probe that entered interstellar space last month.
Uranus and Neptune “are definitely the least understood planets,” Sheppard said.
But that may change. A robotic probe to one or both of those planets was high up on the last wish-list from top planetary scientists and likely will be at or near the top of the next list.
Uranus was named for the Greek god of the sky. Its name often generates juvenile humor when it is wrongly pronounced like a body part. (It’s correctly pronounced YUR’-uh-nus.)
“No one laughs when I say Uranus,” NASA’s Green said. “They have to mispronounce it to get the chuckles.”
Follow Seth Borenstein on Twitter: @borenbears.
This Associated Press series was produced in partnership with the Howard Hughes Medical Institute’s Department of Science Education. The AP is solely responsible for all content.
Nightlights for stream dwellers? No, thanks
Ohio State News
Dec. 19, 2018
Study: Artificial light harms ecosystem health
COLUMBUS, Ohio – Artificial light at night isn’t just a health problem for those of us sitting in bed scrolling through Instagram instead of hitting the sack — it hurts entire outdoor ecosystems.
When the critters that live in and around streams and wetlands are settling into their nighttime routines, streetlights and other sources of illumination filter down through the trees and into their habitat, monkeying with the normal state of affairs, according to new research from The Ohio State University.
“This is among the first studies to show that light at night has detrimental effects not just on individual organisms in the environment, but also on communities and ecosystems,” said Mažeika Sullivan, lead author of the study, which appears today (Dec. 19, 2018) in the journal Ecological Applications.
“Nighttime light is having profound impacts that extend to the entire ecosystem,” said Sullivan, director of Ohio State’s Schiermeier Olentangy River Wetland Research Park and associate professor of environment and natural resources.
Though many people might not consider it, artificial light is a pollutant, changing the natural course of life for people, animals and plants, he said, adding that urbanization is rapidly increasing both in the United States and around the globe.
“We are experiencing this pollution that we don’t think about, but it’s all around us and it’s chronic and it’s happening everywhere – from newly lit villages in rural Africa to streams alongside the highway in Columbus, Ohio,” he said. “It’s also unprecedented in Earth’s history.”
The new study explored the role of light on streams and wetlands in and around Columbus. Moonlight under a clear sky can give an illuminance of 0.1 to 0.3 lux, Sullivan said. The streams in the study were headwater streams draining into the Scioto and Olentangy rivers, with light ranging from 0.01 to 4.0 lux. Wetlands of the Olentangy River Wetland Research Park had lighting from 0 to 20 lux.
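To put those numbers in perspective, the brightest study sites can be compared against the upper bound for clear-sky moonlight. The lux values below are the ones reported above; the comparison itself is mine:

```python
# Full-moon illuminance tops out around 0.3 lux, per the article.
MOON_MAX_LUX = 0.3

# (min, max) illuminance in lux measured at each kind of site.
site_ranges = {
    "headwater streams": (0.01, 4.0),
    "wetlands": (0.0, 20.0),
}

for site, (low, high) in site_ranges.items():
    times_moon = high / MOON_MAX_LUX
    print(f"{site}: up to {high} lux, about {times_moon:.0f}x full moonlight")
```

The brightest wetland sites were lit dozens of times more intensely than anything those communities would experience under a natural night sky.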
The research team examined the effect of existing artificial light in streams, and they manipulated the light in wetlands. In all cases, there was a canopy of trees and other vegetation overhead, buffering the light.
From those areas, they collected a variety of ubiquitous water-dwelling and land-dwelling invertebrate species, including mayflies, water bugs, ants and spiders.
They found that species composition changed with increases in light intensity.
They also discovered that the food chain length of the invertebrate communities – a measure that tells researchers about the complexity of a food web – shortened with more light.
“Decreases in food chain length are a pretty big deal, as it reflects not just changes in the architecture of an ecosystem – the numbers of various species – but also shifts in ecosystem stability and nutrient flows,” Sullivan said.
“Artificial light decreased food chain length in this study, which means the ecosystem is less complex.”
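Food chain length, the measure that shortened under artificial light, is essentially the longest chain of feeding links from a basal resource up to a top predator. A toy sketch of that idea on a made-up food web (the species links below are illustrative, not the study’s data, which came from isotope analysis):

```python
# Toy food web: each predator maps to the prey it consumes.
web = {
    "spider": ["mayfly", "midge"],
    "water_bug": ["mayfly"],
    "mayfly": ["algae"],
    "midge": ["algae"],
}

def chain_length(species, web):
    """Longest chain of feeding links below a species (0 for basal resources)."""
    prey = web.get(species, [])
    if not prey:  # e.g. algae: a basal resource, nothing below it
        return 0
    return 1 + max(chain_length(p, web) for p in prey)

# Food chain length of the whole web: longest chain from any consumer.
print(max(chain_length(s, web) for s in web))  # 2
```

If light pressure removed the spiders from this toy web, the remaining chains would be shorter — the kind of simplification the study measured.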
The researchers also saw detrimental changes in energy flow – how nutrients are cycled between aquatic and nearby ecosystems. In particular, invertebrates became less reliant on food sources that originate in the water when they were exposed to moderate light levels.
The reasons for the changes are complex, and could include such factors as an increase in predators attracted to a lit area at night, which decreases the population of their prey, or even shifts in plant growth, Sullivan said.
Previous studies have clearly shown that individual species are impacted by artificial light.
“The classic example is hatchling sea turtles that became disoriented and instead of going toward the moonlit ocean at night, they were headed inland, toward coastal lighting,” Sullivan said. “In that case, light-management tactics have helped address the problem.”
But until now, scientists haven’t fully detailed the broader implications – how light might affect species interactions, communities and important ecosystem functions.
“One of the neatest parts about this research is that we scaled up and said, ‘What does it mean for the entire community and ecosystem?’” Sullivan said.
Going forward, Sullivan would like to explore larger ecosystem-level questions, such as how light’s effects contribute to fish and bird populations and to water quality.
“I think we’re going to start to find that light has cascading consequences that are linked to other environmental problems we’re seeing, possibly including harmful algal blooms.”
Work in this area will continue with a new $600,000 grant from the Ohio Department of Transportation, which is looking for ways to better understand how roadway lighting interacts with the environment, Sullivan said.
A particular focus will be the role of LED lights, which are replacing many halogen lights in the interest of energy conservation.
Interventions such as carefully directing light, using motion sensors to activate lights only when they’re needed and dimming lighting during times when human activity is minimal could all have the potential to buffer the effects of lighting near wildlife, Sullivan said. Thinking about what types of lights are being used could also be important, he said.
Lars Meyer of Ohio State and Katie Hossler, now at Wright State University, also worked on the study.
David vs. Goliath: What a tiny electron can tell us about the structure of the universe
December 20, 2018
Author: Alexey Petrov, Professor of Physics, Wayne State University
Disclosure statement: Alexey Petrov receives funding from US Department of Energy.
Partners: Wayne State University provides funding as a member of The Conversation US.
What is the shape of an electron? If you recall pictures from your high school science books, the answer seems quite clear: an electron is a small ball of negative charge that is smaller than an atom. This, however, is quite far from the truth.
The electron is commonly known as one of the main components of atoms making up the world around us. It is the electrons surrounding the nucleus of every atom that determine how chemical reactions proceed. Their uses in industry are abundant: from electronics and welding to imaging and advanced particle accelerators. Recently, however, a physics experiment called Advanced Cold Molecule Electron EDM (ACME) put an electron on the center stage of scientific inquiry. The question that the ACME collaboration tried to address was deceptively simple: What is the shape of an electron?
Classical and quantum shapes?
As far as physicists currently know, electrons have no internal structure – and thus no shape in the classical meaning of this word. In the modern language of particle physics, which tackles the behavior of objects smaller than an atomic nucleus, the fundamental blocks of matter are continuous fluid-like substances known as “quantum fields” that permeate the whole space around us. In this language, an electron is perceived as a quantum, or a particle, of the “electron field.” Knowing this, does it even make sense to talk about an electron’s shape if we cannot see it directly in a microscope – or any other optical device for that matter?
To answer this question we must adapt our definition of shape so it can be used at incredibly small distances, or in other words, in the realm of quantum physics. Seeing different shapes in our macroscopic world really means detecting, with our eyes, the rays of light bouncing off different objects around us.
Simply put, we define shapes by seeing how objects react when we shine light onto them. While this might be a weird way to think about the shapes, it becomes very useful in the subatomic world of quantum particles. It gives us a way to define an electron’s properties such that they mimic how we describe shapes in the classical world.
What replaces the concept of shape in the micro world? Since light is nothing but a combination of oscillating electric and magnetic fields, it would be useful to define quantum properties of an electron that carry information about how it responds to applied electric and magnetic fields. Let’s do that.
Electrons in electric and magnetic fields
As an example, consider the simplest property of an electron: its electric charge. It describes the force – and ultimately, the acceleration the electron would experience – if placed in some external electric field. A similar reaction would be expected from a negatively charged marble – hence the “charged ball” analogy of an electron that is in elementary physics books. This property of an electron – its charge – survives in the quantum world.
Likewise, another “surviving” property of an electron is called the magnetic dipole moment. It tells us how an electron would react to a magnetic field. In this respect, an electron behaves just like a tiny bar magnet, trying to orient itself along the direction of the magnetic field. While it is important to remember not to take those analogies too far, they do help us see why physicists are interested in measuring those quantum properties as accurately as possible.
What quantum property describes the electron’s shape? There are, in fact, several of them. The simplest – and the most useful for physicists – is the one called the electric dipole moment, or EDM.
In classical physics, EDM arises when there is a spatial separation of charges. An electrically charged sphere, which has no separation of charges, has an EDM of zero. But imagine a dumbbell whose weights are oppositely charged, with one side positive and the other negative. In the macroscopic world, this dumbbell would have a non-zero electric dipole moment. If the shape of an object reflects the distribution of its electric charge, it would also imply that the object’s shape would have to be different from spherical. Thus, naively, the EDM would quantify the “dumbbellness” of a macroscopic object.
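In the classical picture, the dipole moment is just charge times separation, p = qd, which is why the sphere (no separation) has zero EDM and the charged dumbbell does not. A minimal sketch, with illustrative values:

```python
# Classical electric dipole moment of two opposite point charges +q and -q
# separated by a distance d: magnitude p = q * d, in coulomb-meters.
ELEMENTARY_CHARGE = 1.602e-19  # coulombs

def dipole_moment(q, d):
    """EDM magnitude for charges +q and -q separated by d meters."""
    return q * d

# The charged sphere: no charge separation, so d = 0 and the EDM vanishes.
print(dipole_moment(ELEMENTARY_CHARGE, 0.0))  # 0.0

# The dumbbell: elementary charges one angstrom (1e-10 m) apart,
# giving roughly 1.6e-29 coulomb-meters.
print(dipole_moment(ELEMENTARY_CHARGE, 1e-10))
```

The quantum question ACME asked is whether the electron itself carries any such separation — and at what vanishingly small scale.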
Electric dipole moment in the quantum world
The story of EDM, however, is very different in the quantum world. There the vacuum around an electron is not empty and still. Rather it is populated by various subatomic particles zapping into virtual existence for short periods of time.
The Standard Model of particle physics has correctly predicted all of these particles. If the ACME experiment discovered that the electron had an EDM, it would suggest there were other particles that had not yet been discovered.
These virtual particles form a “cloud” around an electron. If we shine light onto the electron, some of the light could bounce off the virtual particles in the cloud instead of the electron itself.
This would change the numerical values of the electron’s charge and magnetic and electric dipole moments. Performing very accurate measurements of those quantum properties would tell us how these elusive virtual particles behave when they interact with the electron and if they alter the electron’s EDM.
Most intriguing, among those virtual particles there could be new, unknown species of particles that we have not yet encountered. To see their effect on the electron’s electric dipole moment, we need to compare the result of the measurement to theoretical predictions of the size of the EDM calculated in the currently accepted theory of the Universe, the Standard Model.
So far, the Standard Model accurately described all laboratory measurements that have ever been performed. Yet, it is unable to address many of the most fundamental questions, such as why matter dominates over antimatter throughout the universe. The Standard Model makes a prediction for the electron’s EDM too: it requires it to be so small that ACME would have had no chance of measuring it. But what would have happened if ACME actually detected a non-zero value for the electric dipole moment of the electron?
Patching the holes in the Standard Model
Theoretical models have been proposed that fix shortcomings of the Standard Model, predicting the existence of new heavy particles. These models may fill in the gaps in our understanding of the universe. To verify such models we need to prove the existence of those new heavy particles. This could be done through large experiments, such as those at the international Large Hadron Collider (LHC) by directly producing new particles in high-energy collisions.
Alternatively, we could see how those new particles alter the charge distribution in the “cloud” and thereby the electron’s EDM. Thus, an unambiguous observation of the electron’s dipole moment in the ACME experiment would prove that new particles are in fact present. Detecting it was the goal of the experiment.
This is the reason why a recent article in Nature about the electron caught my attention. Theorists like myself use measurements of the electron’s EDM – along with measurements of the properties of other elementary particles – to help identify new particles and predict how they can be better studied. This is done to clarify the role of such particles in our current understanding of the universe.
What should be done to measure the electric dipole moment? We need to find a source of very strong electric field to test an electron’s reaction. One possible source of such fields can be found inside molecules such as thorium monoxide. This is the molecule that ACME used in their experiment. Shining carefully tuned lasers at these molecules, a reading of an electron’s electric dipole moment could be obtained, provided it is not too small.
However, as it turned out, it is. Physicists of the ACME collaboration did not observe the electric dipole moment of an electron – which suggests that its value is too small for their experimental apparatus to detect. This fact has important implications for our understanding of what we could expect from the Large Hadron Collider experiments in the future.
Interestingly, the fact that the ACME collaboration did not observe an EDM actually rules out the existence of heavy new particles that could have been easiest to detect at the LHC. This is a remarkable result for a tabletop-sized experiment that affects both how we would plan direct searches for new particles at the giant Large Hadron Collider, and how we construct theories that describe nature. It is quite amazing that studying something as small as an electron could tell us a lot about the universe.
What are you looking at? How attention affects decision-making
Eye-trackers show that what you gaze at influences your choices
COLUMBUS, Ohio — Scientists using eye-tracking technology have found that what we look at helps guide our decisions when faced with two visible choices, such as snack food options.
But it is not as easy as saying we simply choose what we look at the most, the research found. Instead, our gaze amplifies our desire for choices we already like.
“We don’t necessarily choose something just because we look at it more, as some researchers have suggested. If we look at something we feel neutral about, our attention will have little effect,” said Ian Krajbich, co-author of the study and assistant professor of psychology and economics at The Ohio State University.
“But if we look at something we already like, our attention makes us like it even more in that moment.”
Say you’re looking at two candy bars in a vending machine. You like both of them, but prefer the one with peanut butter slightly more than the one with just chocolate. You’ll usually choose the one with peanut butter, but not always.
“We can use eye tracking to predict when people are going to go against their usual preference,” Krajbich said. “When someone spends more time looking at their less-valued but still liked item, it amplifies how appealing it is.”
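The "gaze amplifies liking" idea can be formalized as a gaze-weighted evidence accumulator, in the spirit of the attentional drift-diffusion models Krajbich's group is known for. The sketch below is illustrative — the parameters and simulation details are mine, not the study's:

```python
import random

def simulate_choice(v_left, v_right, theta=0.3, p_look_left=0.5,
                    step=0.01, noise=0.01, threshold=1.0, seed=0):
    """Toy gaze-weighted choice model: noisy evidence accumulates until a
    bound is hit. The item currently looked at contributes its full value;
    the unattended item's value is discounted by theta (0 < theta < 1)."""
    rng = random.Random(seed)
    evidence = 0.0  # positive evidence favors the left item
    while abs(evidence) < threshold:
        if rng.random() < p_look_left:   # gaze on the left item
            drift = v_left - theta * v_right
        else:                            # gaze on the right item
            drift = theta * v_left - v_right
        evidence += step * drift + rng.gauss(0.0, noise)
    return "left" if evidence > 0 else "right"

# Two equally liked items, but gaze rests on the left one 90% of the time:
wins = sum(simulate_choice(1.0, 1.0, p_look_left=0.9, seed=s) == "left"
           for s in range(200))
print(wins)
```

Because the discount only matters when the values themselves are positive, the same mechanism predicts the study's other findings: gaze has little pull between neutral items, and extra looking time at a liked-but-lower-rated item can tip the choice against the usual preference.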
Krajbich conducted the study with Stephanie Smith, a graduate student in psychology at Ohio State. Their results appear online in the journal Psychological Science and will appear in a future print edition.
Another interesting finding was that people tended to make their decisions more quickly when they liked both of their two choices, Krajbich said.
“That is surprising to some scientists. The thought is that the quick decision should come when you’re choosing between two items you feel neutral about, because why would you care?” he said.
Instead, people struggle more with the decisions about neutral items and choose quickly between two liked items.
“When both items are good, your attention plays a larger role in your decision and you choose more quickly.”
The researchers used data from six eye-tracking studies involving a total of 228 people, some from their lab and some from other researchers.
In the Krajbich lab, most of the eye-tracking studies involved food choices. Participants start these studies by rating how much they would like to eat each of more than 100 different foods. Then the researchers present participants with pairs of the foods on a computer screen and ask them which one they prefer at that moment. Eye-trackers measure what they look at before they make their choice. At the end of the study, participants actually receive one of the foods they chose.
“What we find is that how long they look at items is not correlated with their ratings – so it is not the case that they are simply looking more at items they rated higher,” Krajbich said.
“But we found that the amount of attention participants give does predict which food they are going to choose, above and beyond their ratings.”
These results suggest that product marketing will have the biggest effect on items you already like, he said. If you’re looking at two brands of an item you like at a store, the package that grabs and holds your attention will probably have an edge when you’re deciding which to buy.
Marketing can actually backfire when you’re forced to choose among products you don’t like.
“An older paper shows that if you have to choose from two disliked items, the item that gets more attention is less likely to be chosen,” Krajbich said.
Overall, this new study shows that the link between attention and choice is more complex than previously believed, he said.
“More attention doesn’t always translate into us choosing a particular item.”
The study was supported by the National Science Foundation.