University head recommends covering controversial murals
By MARY HUDETZ
Tuesday, October 9
ALBUQUERQUE, N.M. (AP) — A blond-haired, blue-eyed man gazes forward from the decades-old, controversial mural, his arms stretched out holding the hand of a faceless Hispanic man on one side and a faceless Native American on the other.
For decades, the Great Depression-era mural on the University of New Mexico campus in Albuquerque and the white man’s place at the center of it have irked numerous faculty members and students, who say the scene it depicts marginalizes the state’s two largest minority groups. Painted in 1939, it is one of four “Three Peoples” murals commissioned under the federal Public Works of Art Project for the school’s Zimmerman Library.
Now, the university’s president and provost are recommending that curtains cover all four murals as a campus dialogue continues on them, said Alex Lubin, who is the school’s associate provost for faculty development. The proposal from President Garnett Stokes and Provost Richard Wood must next go before the Regents Historic Preservation Committee.
If approved, it would mark the latest change made by the university in response to concerns raised by Native American faculty and student groups over symbolism, art and celebrations on campus.
“This is a process that’s ongoing,” Lubin said. “They understand that tending to these issues are important today — maybe more important than ever.”
Lubin was recently tasked with convening a task force on how to address decades-old concerns over the murals, which were painted by Kenneth Adams, an artist who lived in Taos. Student groups — including the Kiva Club, which represents Native American students — have also voiced opposition to the murals, following decades of protests against them.
“It causes some psychological distress,” said Robin Starr Minthorn, a Native American studies professor who is Kiowa. “You’re always having to walk by there, or you’re sitting in front of it, and you don’t see people representing you who have any facial expressions.”
The students also had pushed for several years for the school to recognize Indigenous Peoples’ Day instead of Columbus Day, which happened on campus for the first time Monday, Minthorn said.
Last year, administrators suspended use of an official University of New Mexico seal that had been etched with the profiles of a frontiersman and conquistador — two groups viewed as having a part in the historical mistreatment of Native Americans in the state.
On campus, views on the murals among students outside the organized groups pushing for change can be mixed. Student Zachary Martinez told KRQE-TV last week that he saw beauty in the artwork. “I think it helps represent New Mexico well,” he said.
“This is a very debated topic, but I do think that we gotta keep it here at UNM,” he added. “Because I think this will teach us about our culture and our past and will help us prevent these things again.”
The first scene in the set of murals shows five Native Americans with a weaving loom at the center. One woman is at work on a basket. Another wears silver and turquoise jewelry. The imagery was intended to serve as a nod toward the traditional arts and crafts of tribes.
In the next mural, two Hispanic women are at work on an adobe building, while a man labors in a field. The scene is meant to pay tribute to Hispanic contributions in agriculture and architecture.
The third mural centers on science, with a man and woman — both blond — hovering over desks with a microscope and other machinery. A white doctor holds a baby.
The mural with the white man flanked by Hispanic and Native American men is last in the series.
“The murals themselves present a very racialized perspective of who is in authority,” said Glenabah Martinez, a Native American professor and associate dean in the university’s Language, Literacy and Sociocultural Studies department. “It was about the portrayal of how the racial hierarchy works in New Mexico.”
Information from: KRQE-TV, http://www.krqe.com
Statistics and data science degrees: Overhyped or the real deal?
October 9, 2018
P. Richard Hahn
Associate Professor of Statistics, Arizona State University
P. Richard Hahn does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.
Arizona State University provides funding as a member of The Conversation US.
“Data science” is hot right now. The number of undergraduate degrees in statistics has tripled in the past decade, and as a statistics professor, I can tell you that it isn’t because freshmen love statistics.
Way back in 2009, economist Hal Varian of Google dubbed statistician the “next sexy job.” Since then, statistician, data scientist and actuary have topped various “best jobs” lists. Not to mention the enthusiastic press coverage of industry applications: Machine learning! Big data! AI! Deep learning!
But is it good advice? I’m going to voice an unpopular opinion for the sake of starting a conversation. Stats is indeed useful, but not in the way that the popular media – and all those online data science degree programs – seem to suggest.
While all the press tends to go to the sensationalist applications – computers that watch cat videos, anyone? – the data science boom reflects a broad increase in demand for data literacy, as a baseline requirement for modern jobs.
The “big data era” doesn’t just mean large amounts of data; it also means increased ease and ability to collect data of all types, in all walks of life. Although the big five tech companies – Google, Apple, Amazon, Facebook and Microsoft – represent about 10 percent of the U.S. market cap and dominate the public imagination, they employ only one-half of one percent of all employees.
Therefore, to be a true revolution, data science will need to infiltrate nontech industries. And it is. The U.S. has seen data science’s impact on political campaigns. I myself have consulted in the medical devices sector. A few years back, Walmart held a data analysis competition as a recruiting tool. The need for people who can dig into the data and parse it is everywhere.
In a speech at the National Academy of Sciences in 2015, Steven “Freakonomics” Levitt related his insights about the need for data-savvy workers, based on his experience as a sought-after consultant in fields ranging from the airline industry to fast food. He concluded that the next-generation super-employee is someone with a bit of business sense, a bit of computing know-how and a bit of statistics under his or her belt.
Data is increasingly being called on to inform all our decisions. But this broad utility means that it isn’t sexy. The sexy jobs – working on self-driving cars or Go-playing computers – are going to require more than an undergrad major in statistics or a week-long bootcamp on prediction using Python. In fact, I was once told by an industry colleague that the term “data scientist” was coined to placate Ph.D. physicists who were tasked with running linear regressions all day long.
So, the way I see it, there will be egghead types off at the edge of the field, and there will be some folks doing the necessary drudge work, and there will be a lot of people in between, looking carefully at the data and trying to glean useful insights. But – and this is the big point – everyone had better know how to make basic graphs and poke around a database.
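That baseline really is modest. As an illustration of the kind of “poking around a database” I have in mind, here is a minimal sketch in Python using only the standard library — the table and the year label are hypothetical, though the graduate counts are the rough figures cited later in this piece:

```python
import sqlite3

# Hypothetical example: a tiny in-memory table of degrees awarded.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE degrees (year INTEGER, major TEXT, graduates INTEGER)")
conn.executemany(
    "INSERT INTO degrees VALUES (?, ?, ?)",
    [
        (2017, "statistics", 3000),
        (2017, "business", 370000),
        (2017, "psychology", 117000),
    ],
)

# Basic data literacy: ask the database a simple comparative question.
rows = conn.execute(
    "SELECT major, graduates FROM degrees WHERE year = 2017 "
    "ORDER BY graduates DESC"
).fetchall()

for major, n in rows:
    print(f"{major:>12}: {n:,}")
```

Nothing here requires a degree, which is exactly the point: this level of fluency is the floor, not the ceiling.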
So where do I sign up?
Five years ago there was no such thing as a data science degree, and now the list runs for pages and pages. And that’s not counting the traditional statistics programs, or programs in related subjects like computer science or operations research. LinkedIn’s sidebar strongly feels I should consider an online master’s degree in data analytics, from several different places.
The proliferation of these programs speaks to the inadequacy of many people’s undergraduate educations in terms of statistics and data competency. Although stats majors have tripled, there were only 3,000 last year, compared to 370,000 business degrees and 117,000 psych degrees. More of these students should certainly give statistics (or one of the newer data science degrees) a hard look, given that a bachelor’s degree is borderline compulsory these days.
But I worry that the premise behind the appeal of these degrees – especially at the master’s level – is the idea that the technology alone can solve problems. Nothing could be farther from the truth. Statistics is a tool for understanding data, but cannot by itself understand anything. Probably the biggest mistake people make when applying statistical or machine learning methods is not recognizing that the data being analyzed is insufficient to answer the relevant question. A degree that teaches you only about the hottest predictive analytics technology, like deep learning, is a bit like learning how to drive without knowing the first thing about how to navigate.
Setting realistic expectations for the added value of a statistics education is important to me because I’m a true believer. I feel that more people should learn statistics and how to analyze data because it is a powerful way to understand modern life. In addition to boosting one’s job prospects, a statistics education can teach you when to ignore your doctor’s bad advice, help you understand important financial ideas and, in general, help you be wrong less often. These real virtues are undermined by big data hype.
So yes, lots more folks are studying statistics at the college level than in the past and, absolutely, even more people should be. But I think focusing on the surge in data science specialists is misinterpreting the nature of the demand. Everyone should have more of these skills, even if it isn’t their primary job title.
Why I’m not surprised Nobel Laureate Donna Strickland isn’t a full professor
Updated October 5, 2018
Michelle Stack
Associate Professor, University of British Columbia
Michelle Stack does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.
University of British Columbia provides funding as a founding partner of The Conversation CA.
Donna Strickland, an associate professor at the University of Waterloo in Canada, was awarded the 2018 Nobel Prize in physics. The third woman to have ever been awarded this prize in 117 years, she shares it with Arthur Ashkin and Gérard Mourou.
Strickland explained that she’s been treated as an equal by her male peers, which does bring a ray of hope in a rather bleak time for women. Her experience is also different from that of many.
Women make up 27 per cent of full professors in the academy as a whole, and in science, technology, engineering and math, that number is much lower. For racialized and Indigenous women in all fields, the numbers go down even further.
The rank of full professor offers more pay, more prestige and more opportunities to be selected for senior leadership roles within a university.
As Kirsty Duncan, Canada’s minister of science, explained in an opinion piece last summer, sexism was a problem for her and still is a problem for women in universities.
A 2018 report from the Canadian Association of University Teachers also concluded that, despite talk by universities and colleges of a commitment to inclusive institutions, progress on equity has been exceptionally slow.
The twins of sexism and racism
The reasons why so few women, and even fewer racialized or Indigenous women, win Nobel Prizes and achieve full professorship are interconnected.
It’s the twins of sexism and racism. As political scientist Malinda Smith shows, there are a number of factors — she calls them the “dirty dozen” — that normalize gender and racial biases. This results in a tiny demographic (white, male) as the predictable winners in a rigged game.
Examples of the “dirty dozen” include white male students receiving more opportunities to network. Then there are the reference letters. While female students might have better grades, it is more likely in letters of reference that their professors will talk about them as having potential or being hard workers, compared to white men who are cast as brilliant.
On top of that, workload concerns leave women less time for research. For example, women, particularly if racialized or Indigenous, are more likely to get sessional work, with significantly lower pay scales, higher teaching loads and little time for research.
Finally, tenure-track women are less likely to get competitive research funding, and when they do, they often earn less money than men.
To become a full professor, one needs to apply and be evaluated by committees repeatedly — first for a tenure-track position, then for tenure and promotion — usually with an impressive portfolio of research funding and peer-reviewed publications. This is a portfolio that requires extensive research time, collaborations and support to achieve.
Women scientists are victims of a systemic inequity that impacts us all.
A reward system biased towards men
I have just completed a research project on the role of the Nobel Prize in university rankings and the impact on equity. The study found that influential university rankings judge institutions based on the number of articles published by faculty and staff in top-ranked journals.
It found that this reward system is biased towards men.
Men are more likely to publish other men in top-ranked health and science journals. The role of sexism in terms of who gets published and what gets published isn’t considered when deciding who and what is ranked as world-class.
This can impact our health. For example, an abundance of studies demonstrate the bias against including women in health research, and the harm to women’s health when they are not included in all stages of research studies.
The majority of decision-makers who create or accept the metrics used to decide who and what is world-class are white and male — including ranking advisories, university leaders, top journal editors and adjudication committees for major awards. Sexism and racism are reinforced and normalized through these feedback loops.
This Boys Club impacts women when they go up for promotion at research-intensive universities, because how they are deemed worthy or unworthy is largely based on how many publications they have in top-ranked journals, awards and, depending on the field, the research funding that they bring in.
Science as the heroic man
All the talk of equity over the last 30 years has really been a distraction from talking about how little progress has actually been made. Not because women, racialized and Indigenous scholars are less productive or doing less innovative work, but because of sexism and racism.
As education scholar Annette Henry reminds us, it’s also important to understand how issues intersect — white women might get in but we still need to look at white privilege so racialized women have opportunities as well.
Since 2003, 95 per cent of Nobel Prize winners with university appointments have been male. Yet the Nobel Prize and Fields Medal in Math count for 30 per cent of how the influential ARWU Rankings determine which universities are “world class.”
By promoting and accepting this ranking as legitimate, universities reinforce a sexist and racist metric as the way to determine the quality of a university and what counts in the wider academic systems.
The Nobel as an indicator of world-class research maintains the illusion that science is conducted by the heroic man and — very, very rarely — a woman. Men are represented as toiling away to make great discoveries.
What is left out is the reality of science as a collaborative effort — with women most likely not receiving credit for their work. What is left out is that the Nobel is decided by a few men.
White men decide who is world-class
In the case of the Nobel, a few (mainly Swedish and Norwegian) white men ultimately decide who is best in the fields of physics, medicine, chemistry, literature and the pursuit of world peace.
Incidentally, this year the Nobel Prize for literature was cancelled after the Swedish Academy announced it was investigating allegations of sexual misconduct and other improprieties by the husband of a key member of the committee that awards the literature prize.
The Nobel adjudication committees mirror society. Predominately white men decide on who and what is world class, and based on these decisions, who to invite into the club.
Once in a while someone who isn’t part of the demographic gets in, but the status quo remains intact.
What Strickland achieved is impressive. But it isn’t a sign that the patriarchy is being smashed.
This is a corrected version of a story originally published Oct. 4, 2018. The earlier story said the Nobel Prize for literature was cancelled in 2018 because of allegations of rape against a former committee chair. The allegations were against the husband of a committee member, not a member of the committee.
Why we can’t reverse climate change with ‘negative emissions’ technologies
October 9, 2018
Without rapid and dramatic changes, the world will face a higher risk of extreme weather and other effects of climate change.
Howard J. Herzog
Senior Research Engineer, Massachusetts Institute of Technology
Howard J. Herzog receives funding from Energy Futures Initiative, Exxon-Mobil, QRI, Total.
In a much-anticipated report, the Intergovernmental Panel on Climate Change (IPCC) said the world will need to take dramatic and drastic steps to avoid the catastrophic effects of climate change.
Featured prominently in the report is a discussion of a range of techniques for removing carbon dioxide from the air, called Carbon Dioxide Removal (CDR) technologies or negative emissions technologies (NETs). The IPCC said the world would need to rely significantly on these techniques to avoid increasing Earth’s temperatures above 1.5 degrees Celsius, or 2.7 degrees Fahrenheit, compared to pre-industrial levels.
Given that the level of greenhouse gases continues to rise and the world’s efforts at lowering emissions are falling way short of targets climate scientists recommend, what contribution we can expect from NETs is becoming a critical question. Can they actually work at a big enough scale?
What are negative emissions technologies?
There is a wide range of opinion on how big an impact these techniques can have in addressing climate change. I became involved in the debate because two of the most prominent negative emissions technologies involve CO2 capture and storage (CCS), a technology that I have been researching for almost 30 years.
Many NETs remove the CO2 from the atmosphere biologically through photosynthesis – the simplest example being afforestation, or planting more trees. Depending on the specific technique, the carbon removed from the atmosphere may end up in soils, vegetation, the ocean, deep geological formations, or even in rocks.
NETs vary in their cost, scale (how many tons they can potentially remove from the atmosphere), technological readiness, environmental impacts and effectiveness. Afforestation/reforestation is the only NET to have been deployed commercially, though others have been tested at smaller scales. For example, there are a number of efforts to produce biochar, a charcoal made with plant matter that has a net negative carbon balance.
A recent academic paper discusses the “costs, potentials, and side-effects” of the various NETs. Afforestation/reforestation is one of the least expensive options, with a cost on the order of tens of dollars per ton of CO2, but the scope for carbon removal is small compared to other NETs.
On the other extreme is direct air capture, which covers a range of engineered systems meant to remove CO2 from the air. The costs of direct air capture, which has been tested at small scales, are on the order of hundreds of dollars or more per ton of CO2, but is on the high end in terms of the potential amount of CO2 that can be removed.
In a 2014 IPCC report, a technology called bio-energy with carbon capture and storage (BECCS) received the most attention. This entails burning plant matter, or biomass, for energy and then collecting the CO2 emissions and pumping the gases underground. Its cost is high, but not excessive, in the range of US$100-200 per ton of CO2 removed.
The biggest constraint on the size of its deployment relates to the availability of “low-carbon” biomass. There are carbon emissions associated with the growing, harvesting, and transporting of biomass, as well as potential carbon emissions due to land-use changes – for example, if forests are cut down in favor of other forms of biomass. These emissions must all be kept to a minimum for biomass to be “low-carbon” and for the overall scheme to result in negative emissions. Potential “low-carbon” biomass includes switchgrass or loblolly pine, as opposed to say corn, which is currently turned into liquid fuels and acknowledged to have a high carbon footprint.
Some of the proposed NETs are highly speculative. For example, ocean fertilization is generally not considered a realistic option because its environmental impact on the ocean is probably unacceptable. Also, there are questions about how effective it would be in removing CO2.
A 2017 study at the University of Michigan did a literature review of NETs. On the one hand, it showed that the literature was very bullish on NETs, concluding that these techniques could capture the equivalent of 37 gigatons (billion tons) of CO2 per year at a cost below $70 per metric ton. For comparison, the world currently emits about 38 gigatons of CO2 a year.
However, I think this result should be taken with a large grain of salt, as they rated only one NET as established (afforestation/reforestation), three others as demonstrated (BECCS, biochar and modified agricultural practices), and the rest as speculative. In other words, these technologies have potential, but they have yet to be proven effective.
Other studies have a much harsher view of NETs. A study in Nature Climate Change from 2015 states, “There is no NET (or combination of NETs) currently available that could be implemented to meet the <2°C target without significant impact on either land, energy, water, nutrient, albedo or cost, and so ‘plan A’ must be to immediately and aggressively reduce GHG emissions.” In another study from 2016, researchers Kevin Anderson and Glen Peters concluded “Negative-emission technologies are not an insurance policy, but rather an unjust and high-stakes gamble. There is a real risk they will be unable to deliver on the scale of their promise.”
The bottom line is that NETs must be shown to work on a gigaton scale, at an affordable cost, and without serious environmental impacts. That has not happened yet. As seen from above, there is a wide range of opinion on whether this will ever happen.
A critical question is what role NETs can play, both from a policy and economic point of view, as we struggle to stabilize the mean global temperature at an acceptable level.
One potential role for NETs is as an offset. This means that the amount of CO2 removed from the atmosphere generates credits that offset emissions elsewhere. Using negative emissions this way can be a powerful policy or economic lever.
For example, with airline travel the best approach to net zero emissions may be to let that industry continue to emit CO2, but offset those emissions using credits from NETs. Essentially those negative emissions are a way to compensate for the emissions from flying, which is expected to rely on fossil fuels for many years.
About 25 percent of our current carbon emissions can be classified as hard to mitigate. This offset model makes economic sense when the cost of negative emissions is less than the cost to cut emissions from the source itself. So if we can produce negative emissions from say BECCS at about $150 per ton of CO2, they can economically be used to offset emissions from aircraft that would cost several hundred dollars per ton CO2 to mitigate by changing how planes are fueled.
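The arithmetic behind that offset logic is simple enough to sketch. The figures below are illustrative only, loosely matching the numbers discussed above (roughly $150 per ton for BECCS removal versus several hundred dollars per ton to decarbonize aviation directly, with a hypothetical emissions volume):

```python
# Illustrative, hypothetical figures (dollars per ton of CO2).
becc_offset_cost = 150.0        # cost to remove a ton via BECCS
aviation_direct_cost = 400.0    # cost to cut a ton at the source (aircraft)
aviation_tons = 1_000_000       # annual emissions of a hypothetical airline

def cheaper_path_to_net_zero(offset_cost, direct_cost, tons):
    """Compare offsetting emissions with NETs against cutting them at the source."""
    offset_total = offset_cost * tons
    direct_total = direct_cost * tons
    if offset_total < direct_total:
        return ("offset with NETs", offset_total)
    return ("mitigate at source", direct_total)

label, total = cheaper_path_to_net_zero(
    becc_offset_cost, aviation_direct_cost, aviation_tons
)
print(f"{label}: ${total:,.0f} per year")
# prints: offset with NETs: $150,000,000 per year
```

Whenever the offset price sits below the source’s own abatement cost, the offset wins; flip the two prices and the comparison reverses, which is exactly why offsets only make sense for the hard-to-mitigate slice of emissions.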
The economics of using NETs to correct an “overshoot” are very different.
We as a society seem unwilling to undertake sufficient efforts to reduce carbon emissions today at costs of tens of dollars per ton CO2 in order to keep enough CO2 out of the atmosphere to meet stabilization targets of 1.5 or 2 degrees Celsius. However, correcting an “overshoot” means we expect future generations to clean up our mess by removing CO2 from the atmosphere at costs of hundreds of dollars or more per ton CO2, which is what the future deployment of NETs may cost.
This makes no sense, economic or otherwise. If we are unwilling to use the relatively cheap mitigation technologies to lower carbon emissions available today, such as improved efficiency, increased renewables, or switching from coal to natural gas, what makes anyone think that future generations will use NETs, which are much, much more expensive?
That’s why I see the role of NETs as an offset being very sound, with some deployment already happening today and increased deployment expected in the future. By contrast, treating NETs as a way to compensate for breaking the carbon budget and overshooting stabilization targets is more hope than reality. The technical, economic and environmental barriers of NETs are very real. In formulating climate policy, I believe we cannot count on the future use of NETs to compensate for our failure to do enough mitigation today.
Meet the trillions of viruses that make up your virome
October 9, 2018
Every surface of our body – inside and out – is covered in microorganisms: bacteria, viruses, fungi and many other microscopic life forms.
Associate Director of Microbiology, University of California San Diego
Visiting Scientist, The Rockefeller University
The authors do not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and have disclosed no relevant affiliations beyond their academic appointment.
University of California provides funding as a founding partner of The Conversation US.
If you think you don’t have viruses, think again.
It may be hard to fathom, but the human body is occupied by large collections of microorganisms, commonly referred to as our microbiome, that have evolved with us since the early days of man. Scientists have only recently begun to quantify the microbiome, and discovered it is inhabited by at least 38 trillion bacteria. More intriguing, perhaps, is that bacteria are not the most abundant microbes that live in and on our bodies. That award goes to viruses.
It has been estimated that there are over 380 trillion viruses inhabiting us, a community collectively known as the human virome. But these viruses are not the dangerous ones you commonly hear about, like those that cause the flu or the common cold, or more sinister infections like Ebola or dengue. Many of these viruses instead infect the bacteria that live inside you, and are known as bacteriophages, or phages for short. The human body is a breeding ground for phages, and despite their abundance, we have very little insight into what they, or any of the other viruses in the body, are doing.
I am a physician-scientist studying the human microbiome by focusing on viruses, because I believe that harnessing the power of bacteria’s ultimate natural predators will teach us how to prevent and combat bacterial infections. One might rightly assume that if viruses are the most abundant microbes in the body, they would be the target of the majority of human microbiome studies. But that assumption would be horribly wrong. The study of the human virome lags so far behind the study of bacteria that we are only just now uncovering some of its most basic features. This lag exists because it took scientists much longer to recognize the presence of a human virome, and because they lack the standardized and sophisticated tools needed to decipher what’s actually in your virome.
The 411 on the virome
Here are a few of the things we have learned thus far. Bacteria in the human body are not in love with the many phages that live in and around them. In fact, they developed CRISPR-Cas systems – which humans have now co-opted for editing genes – to rid themselves of phages or to prevent phage infections altogether. Why? Because phages kill bacteria. They take over the bacteria’s machinery and force them to make more phages rather than more bacteria. When they are done, they burst out of the bacterium, destroying it. Finally, phages sit on our body surfaces just waiting to cross paths with vulnerable bacteria. They are basically bacteria stalkers.
It’s clear that there’s a war being fought on our body surfaces every minute of every day, and we haven’t a clue who’s winning or what the consequences of this war might be.
Viruses may inhabit all surfaces both inside and outside of the body. Everywhere researchers have looked in the human body, viruses have been found. Viruses in the blood? Check. Viruses on the skin? Check. Viruses in the lungs? Check. Viruses in the urine? Check. And so on. To put it simply, when it comes to where viruses live in the human body, figuring out where they don’t live is a far better question than asking where they do.
Viruses are contagious. But we often don’t think about bacterial viruses as being easily shared. Researchers have shown that simply living with someone leads to rapid sharing of the viruses in your body. We don’t yet know the consequences of the constant battle between bacteria and viruses within a single body; the picture gets exponentially more complicated once your bacteria and their viruses are shared with everyone around you, including your spouse, your roommate and even your dog.
Viruses keeping us healthy?
Ultimately, we need to know what all these viruses in the human body are doing, and figure out whether we can take advantage of our virome to promote our health. But it’s probably not clear at this point why anyone would believe that our virome may be helpful.
It may seem counterintuitive, but harming our bacteria can be harmful to our health. For example, when our healthy bacterial communities are disturbed by antibiotic use, other microbial bad guys, also called pathogens, take advantage of the opportunity to invade our body and make us sick. Thus, in a number of human conditions, our healthy bacteria play important roles in preventing pathogen intrusion. Here’s where viruses come in. They’ve already figured out how to kill bacteria. It’s all they live for.
So the race is on to find those viruses in our viromes that have already figured out how to protect us from the bad guys, while leaving the good bacteria intact. Indeed, there are recent anecdotal examples utilizing phages successfully to treat life-threatening infections from bacteria resistant to most if not all available antibiotics – a treatment known as phage therapy. Unfortunately, these treatments are and will continue to be hampered by inadequate information on how phages behave in the human body and the unforeseen consequences their introduction may have on the human host. Thus, phage therapy remains heavily regulated. At the current pace of research, it may be many years before phages are used routinely as anti-infective treatments. But make no mistake about it; the viruses that have evolved with us for so many years are not only part of our past, but will play a significant role in the future of human health.
Trapping toxic compounds with ‘molecular baskets’
New research shows promise with simulated nerve agents
COLUMBUS, Ohio — Researchers have developed designer molecules that may one day be able to seek out and trap deadly nerve agents and other toxic compounds in the environment – and possibly in humans.
The scientists, led by organic chemists from The Ohio State University, call these new particles “molecular baskets.” As the name implies, these molecules are shaped like baskets and research in the lab has shown they can find simulated nerve agents, swallow them in their cavities and trap them for safe removal.
In a new study published in Chemistry – A European Journal, the researchers took the first step in creating versions that could have potential for use in medicine.
“Our goal is to develop nanoparticles that can trap toxic compounds not only in the environment, but also from the human body,” said Jovica Badjić, leader of the project and professor of chemistry and biochemistry at Ohio State.
The research focuses on nerve agents, sometimes called nerve gas, which are deadly chemical poisons that have been used in warfare.
In a study published last year in the Journal of the American Chemical Society, Badjić and his colleagues created molecular baskets with amino acids around the rims. These amino acids helped find simulated nerve agents in a liquid environment and direct them into the basket.
The researchers then started a chemical reaction by shining a light with a particular wavelength on the baskets. The light caused the amino acids to shed a carbon dioxide molecule, which effectively trapped the nerve agents inside the baskets. The new molecule complex, no longer soluble in water, precipitates (or separates) from the liquid and becomes a solid.
“We can then very easily filter out the molecular baskets containing the nerve agent and be left with purified water,” Badjić said.
The researchers have since created a variety of molecular baskets with different shapes and sizes, and different amino acid groups around the rim.
“We should be able to develop baskets that will target a variety of different toxins,” he said. “It is not going to be a magic bullet – it won’t work with everything, but we can apply it to different targets.”
While this early research showed the promise of molecular baskets in the environment, the scientists wanted to see if they could develop similar structures that could clear nerve agents or other toxins from humans.
In this case, you wouldn’t want the baskets with the nerve agents to separate from the blood, Badjić said, because there would be no easy way to remove them from the body.
In their new paper, Badjić and his colleagues developed a molecular basket with a particular type of amino acid – glutamic acid – around its rim. But here they experimented with the ejection of multiple carbon dioxide molecules when they exposed the molecular baskets to light.
In this case, they found that the molecular baskets could trap the simulated nerve agents as they did in the previous research, but they did not precipitate from the liquid. Instead, the molecules assembled into masses.
“We found that they aggregated into nanoparticles – tiny spheres consisting of a mass of baskets with nerve agents trapped inside,” he said.
“But they stayed in solution, which means they could be cleared from the body.”
Of course, you can’t use light inside the body. Badjić said the light could be used to create nanoparticles outside the body before they are put into medicines.
But Badjić noted that this research is still basic science done in a lab and is not ready for use in real life.
“I’m excited about the concept, but there’s still a lot of work to do,” he said.
Co-authors on the Chemistry – A European Journal paper were Sarah Border, Radoslav Pavolić, Lei Zhiquan and Michael Gunther from Ohio State; and Han Wang and Honggang Cui from The Johns Hopkins University.
The study was supported by a grant from the National Science Foundation.
URL: http://news.osu.edu/trapping-toxic-compounds-with-molecular-baskets/