
Sunday 3 April 2016

Programming language for novel biological circuits

MIT biological engineers have created a programming language that allows them to rapidly design complex, DNA-encoded circuits that give new functions to living cells.
Using this language, anyone can write a program for the function they want, such as detecting and responding to certain environmental conditions. They can then generate a DNA sequence that will achieve it.
"It is literally a programming language for bacteria," says Christopher Voigt, an MIT professor of biological engineering. "You use a text-based language, just like you're programming a computer. Then you take that text and you compile it and it turns it into a DNA sequence that you put into the cell, and the circuit runs inside the cell."
Voigt and colleagues at Boston University and the National Institute of Standards and Technology have used this language, which they describe in the April 1 issue of Science, to build circuits that can detect up to three inputs and respond in different ways. Future applications for this kind of programming include designing bacterial cells that can produce a cancer drug when they detect a tumor, or creating yeast cells that can halt their own fermentation process if too many toxic byproducts build up.
The researchers plan to make the user design interface available on the Web.
No experience needed
Over the past 15 years, biologists and engineers have designed many genetic parts, such as sensors, memory switches, and biological clocks, that can be combined to modify existing cell functions and add new ones.
However, designing each circuit is a laborious process that requires great expertise and often a lot of trial and error. "You have to have this really intimate knowledge of how those pieces are going to work and how they're going to come together," Voigt says.
Users of the new programming language, however, need no special knowledge of genetic engineering.
"You could be completely naive as to how any of it works. That's what's really different about this," Voigt says. "You could be a student in high school and go onto the Web-based server and type out the program you want, and it spits back the DNA sequence."
The language is based on Verilog, which is commonly used to program computer chips. To create a version of the language that would work for cells, the researchers designed computing elements such as logic gates and sensors that can be encoded in a bacterial cell's DNA. The sensors can detect different compounds, such as oxygen or glucose, as well as light, temperature, acidity, and other environmental conditions. Users can also add their own sensors. "It's very customizable," Voigt says.
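As a rough illustration of the kind of three-input logic such a circuit encodes, here is a minimal Python sketch. The real system is programmed in Verilog text, as described above, and the sensor names used here (low_oxygen, high_glucose, acidic) are purely hypothetical stand-ins, not sensors from the paper.

```python
# Illustrative only: the kind of three-input logic function a designed
# circuit might compute. The real tool takes Verilog text, not Python,
# and these sensor names are hypothetical.

def circuit_output(low_oxygen: bool, high_glucose: bool, acidic: bool) -> bool:
    """Respond only when oxygen is low AND glucose is high AND the
    environment is not acidic -- an AND/NOT combination of three inputs."""
    return low_oxygen and high_glucose and not acidic

# Enumerate the full truth table, as a designer would when specifying
# the desired behavior before compiling it to a DNA sequence.
for o2 in (False, True):
    for glu in (False, True):
        for acid in (False, True):
            print(o2, glu, acid, "->", circuit_output(o2, glu, acid))
```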
The biggest challenge, he says, was designing the 14 logic gates used in the circuits so that they wouldn't interfere with each other once placed in the complex environment of a living cell.
In the current version of the programming language, these genetic parts are optimized for E. coli, but the researchers are working on expanding the language for other strains of bacteria, including Bacteroides, commonly found in the human gut, and Pseudomonas, which often lives in plant roots, as well as the yeast Saccharomyces cerevisiae. This would allow users to write a single program and then compile it for different organisms to get the right DNA sequence for each one.
Biological circuits
Using this language, the researchers programmed 60 circuits with different functions, and 45 of them worked correctly the first time they were tested. Many of the circuits were designed to measure one or more environmental conditions, such as oxygen level or glucose concentration, and respond accordingly. Another circuit was designed to rank three different inputs and then respond based on the priority of each one.
One of the new circuits is the largest biological circuit ever built, containing seven logic gates and about 12,000 base pairs of DNA.
Another advantage of this technique is its speed. Until now, "it would take years to build these types of circuits. Now you just hit the button and immediately get a DNA sequence to test," Voigt says.
His team plans to work on several different applications using this approach: bacteria that can be swallowed to aid in digestion of lactose; bacteria that can live on plant roots and produce insecticide if they sense the plant is under attack; and yeast that can be engineered to shut off when they are producing too many toxic byproducts in a fermentation reactor.
The lead author of the Science paper is MIT graduate student Alec Nielsen. Other authors are former MIT postdoc Bryan Der, MIT postdoc Jonghyeon Shin, Boston University graduate student Prashant Vaidyanathan, Boston University associate professor Douglas Densmore, and National Institute of Standards and Technology researchers Vanya Paralanov, Elizabeth Strychalski, and David Ross.

Saturday 2 April 2016

History of Vikings Unearthed

With the use of pioneering satellite imagery analysis, excavation and investigation of archaeological evidence, a team led by University of Alabama at Birmingham archaeologist Sarah Parcak has uncovered what could be the first new Norse site to be discovered in North America in over 50 years. If confirmed by further research, the site at Point Rosee in Newfoundland will show that the Vikings traveled much farther into North America than previously known, pushing the boundary of their explorations over 300 miles to the southwest.
To date, scientists have known of only one other Viking site, found on the very northern tip of Newfoundland in Canada, at L'Anse Aux Meadows. In the 1960s, archaeologists uncovered the foundations of 1,000-year-old Viking buildings, signs of metalworking, iron nails and artifacts. The site appeared to pre-date Columbus' voyages to the New World by some 500 years, confirming that Norse explorers had reached North America as suggested in the Vinland sagas. For more than 50 years, scientists have searched for another Norse site.
Using infrared images from 400 miles in space, Parcak and her team looked at tens of thousands of square kilometers along the Eastern Seaboard of the U.S. and Canada. Images of Point Rosee revealed possible human-made shapes under discolored vegetation. This intriguing evidence suggests the Vikings traveled farther south than previously known. The Newfoundland project was co-directed by Gregory Mumford, Ph.D., Parcak's husband and professor in the UAB College of Arts and Sciences Department of Anthropology. Preliminary excavations took place over a period of two and a half weeks in June 2015.
Parcak recently made international headlines when she was named winner of the 2016 TED Prize. She is an expert in satellite remote sensing for archaeology and wrote the first textbook in the field. Her methods have helped locate 17 potential pyramids in Egypt, in addition to 3,100 forgotten settlements and 1,000 lost tombs. She has also made major discoveries throughout the Roman Empire. She is a National Geographic Senior Fellow, TED Senior Fellow and a professor of archaeology at UAB. Parcak is also the founder and director of the UAB Laboratory for Global Observation.
The discovery is the subject of a two-hour special called "Vikings Unearthed." The program will first stream online at pbs.org/nova Monday, April 4, at 2:30 p.m. CT to coincide with the premiere of a 90-minute version of the film in the U.K. on BBC One. A two-hour U.S. broadcast will follow Wednesday, April 6, at 8 p.m. CT on PBS.

Story Source:
The above post is reprinted from materials provided by the University of Alabama at Birmingham. The original item was written by Tiffany Westry. Note: Materials may be edited for content and length.

Thursday 31 March 2016

Study: Your season of birth is stamped on your DNA and can affect your risk of allergies

Gabrielle A Lockett, University of Southampton and John W Holloway, University of Southampton

People born in autumn or winter are more likely to suffer from allergies than people born in spring or summer. Nobody is certain why this is, but there are several theories. These include seasonal variations in sunlight (which could affect vitamin D levels), levels of allergens such as pollen and house dust mite (which vary by season), the timing of the baby’s first chest infection (colds tend to be more common in winter), and maternal diet (price and availability of fruit and vegetables vary by season).
But no matter which of these exposures causes changes to the risk of developing an allergy, until now nobody knew how these early environmental influences were so long lasting.
Our study tested whether epigenetic marks on a person’s DNA could be a mechanism behind these birth season effects. Of course, your genome doesn’t change depending on which season you’re born in, but there are epigenetic marks attached to your DNA that can influence gene expression – the process where specific genes are activated to produce a certain protein. This may result in different responses to immune triggers and hence different susceptibility to diseases.
Unlike DNA, which is inherited from your parents, epigenetic marks can change in response to the environment and allow gene expression to respond to environmental exposures. And they can also be very long-lasting.

Epigenetic imprint

We scanned DNA methylation (one type of epigenetic mark) profiles of 367 people from the Isle of Wight and found, for the first time, that the season in which a person is born leaves an epigenetic print on the genome that is still visible at the age of 18. This discovery means that these marks on the genome could be how season of birth is able to influence the risk of having allergies later in life.
We went on to test whether these DNA methylation differences that varied by season of birth were also associated with allergic disease. We found that two of them appeared to be influencing the risk of allergy in the participants. As well as allergies, other studies have shown that season of birth is associated with a number of things such as height, lifespan, reproductive performance, and the risks of diseases including heart conditions and schizophrenia. It is possible that the birth season-associated DNA methylation that we discovered might also influence these other outcomes but this will need further investigation.
The marks that we found in the DNA samples collected from the 18-year-olds were mostly similar to the epigenetic marks found in a group of Dutch eight-year-olds that we used to validate our findings. But when we looked at another cohort – a group of newborn babies – the marks were not there. This suggests that these DNA methylation changes occur after birth, not during pregnancy.


There’s something about the seasons

We are not advising women to change the timing of their pregnancy, but if we understood exactly what it was about birth season that causes these effects, this could potentially be changed to reduce the risk of allergy in children. For example, if the birth season effect on allergies was found to be driven by sunlight levels experienced by the mother during pregnancy or breastfeeding, then the increased risk of allergies among babies born in autumn and winter might be lessened by giving the expectant or breastfeeding mother vitamin D supplements. You wouldn’t need to time births with the seasons to get the benefits.
Our study reports the first discovery of a mechanism through which birth season could influence disease risk, though we still don’t know exactly which seasonal stimuli cause these effects. Future studies are needed to pinpoint these, as well as to investigate the relationship between DNA methylation and allergic disease, and what other environmental exposures have an effect.
With the considerable burden allergic disease places not only on individual sufferers but also on society, any step towards reducing allergy is a step in the right direction.
The Conversation
Gabrielle A Lockett, Postdoctoral research associate, University of Southampton and John W Holloway, University of Southampton
This article was originally published on The Conversation. Read the original article.

Hobbit may have gone extinct earlier than previously thought

An artist's drawing on display at the Australian Museum in Sydney, October 28, 2004, of a newly discovered species of hobbit-sized humans that adds another piece to the complex puzzle of human evolution. (REUTERS/HO/Peter Schouten-National Geographic Society)


The extinct human species dubbed the "Hobbit" vanished from its home on the Indonesian island of Flores far earlier than previously thought, according to scientists who suspect our species may have had a hand in these diminutive people's demise.
Researchers on Wednesday said they recalculated the age of bones of the species, named Homo floresiensis, found inside a Flores cave, and determined it disappeared about 50,000 years ago rather than 12,000 years ago as previously estimated.
The Hobbit's discovery in 2003 created a scientific sensation. Homo floresiensis stood 3-1/2 feet tall (106 cm), possessed a small, chimpanzee-sized brain, used stone tools and may have hunted pygmy elephants.
The researchers said there is not yet direct evidence the Hobbit people encountered Homo sapiens but noted that our species was already on other islands in the region at around that time and had reached Australia by about 50,000 years ago.
Geochronologist Bert Roberts of Australia's University of Wollongong said it was possible Homo sapiens played a role in the Hobbit's extinction and the issue would be a major focus of further research.
"To me, the question is, 'Would the Hobbits have become extinct if humans had never made landfall on Flores?' And the answer is 'no.' We were likely the decisive factor in their demise, but we still need to find hard evidence to back up this hunch," Roberts said.
Numerous animals disappeared on Flores at the same time, said paleoanthropologist Matt Tocheri of Canada's Lakehead University and the Smithsonian Institution's Human Origins Program. These included small elephants, giant marabou storks, vultures and large Komodo dragon lizards.
After fresh excavations from 2007 to 2014 improved the understanding of the cave site, the scientists re-evaluated the ages of sediment containing Homo floresiensis remains and the actual bones.
The Hobbits' skeletal remains were 60,000 to 100,000 years old while their stone tools were 50,000 to 190,000 years old, said archaeologist Thomas Sutikna of the University of Wollongong and Indonesia's National Research Centre for Archaeology.
Homo sapiens first appeared in Africa about 200,000 years ago and later trekked to other parts of the world, encountering other human species like Neanderthals who went extinct not long afterward.
The previous assessment that the Hobbits had lived as recently as 12,000 years ago indicated they had survived for perhaps 40,000 years after our species reached the region. The new results show this was not the case.

Sunday 27 March 2016

The scientific reason why women need more sleep than men


Jaleesa Baulkman For Medical Daily
It’s tough being a woman in this world. Women are twice as likely as men to experience major depression and more likely to suffer from heart disease. This could be because they’re often overworked and underpaid, or because on top of it all, they have to deal with ridiculous standards of beauty. Unfortunately, that’s not even the half of it. All this goes to say that women have more than enough reasons to hit the snooze button and sleep in every morning — and science agrees.

Women Need More Sleep

After collecting data from more than 200 middle-aged men and women, researchers from Loughborough University’s Sleep Research Center in Leicestershire, England say women need 20 more minutes of shut-eye than men to function properly because their brains are both more complex and used more, Woman’s Day reported.
According to the National Sleep Foundation, people aged 18 and older should get seven to nine hours of sleep.
"Women's brains are wired differently from men's and are more complex, so their sleep needs will be slightly greater," said Jim Horne, director of Loughborough University's Sleep Research Center, according to The Tab. “The average is 20 minutes more, but some women may need slightly more or less than this.”
The idea that men’s and women’s brains differ significantly is not new. A 2013 study from the University of Pennsylvania suggested the same thing, finding that women outdo men when it comes to multitasking. In fact, the current study cites women’s ability to handle more than one task at the same time as one of the reasons they require more sleep than men.

In addition to finding that women expend more mental energy than men, the researchers found that women need more sleep because they experience more consequences from lack of sleep than men do.
"Women tend to multitask — they do lots at once and are flexible — and so they use more of their actual brain than men do,” Horne previously told The Australian in 2013. "Because of that, their sleep need is greater."
"For women, poor sleep is strongly associated with high levels of psychological distress and greater feelings of hostility, depression, and anger," Horne said. "In contrast, these feelings were not associated with the same degree of sleep disruption in men."

Health Effects of Sleep Deprivation

Horne and his colleagues’ study builds on previous research that has found women are more affected by lack of sleep, or sleep deprivation, compared to men.
A 2013 study from Duke University found that women — who are already more susceptible to depression than men — had more depression, anger, and hostility in the morning when they didn’t get an adequate amount of rest.
"One of the major functions of sleep is to allow the brain to recover and repair itself. During deep sleep, the cortex — the part of the brain responsible for thought, memory, language, and so on — disengages from the senses and goes into recovery mode,” Horne told The Australian. "The more of your brain you use during the day, the more of it that needs to recover and, consequently, the more sleep you need.”
Poor sleep has also been linked to weight gain, increased suicide risk, and chronic, life-threatening illnesses like type 2 diabetes and heart disease.

Saturday 26 March 2016

Gravitational waves discovered: top scientists respond

Keith Riles, University of Michigan; Alan Duffy, Swinburne University of Technology; Amanda Weltman, University of Cape Town; Daniel Kennefick, University of Arkansas; David Parkinson, The University of Queensland; Maria Womack, University of South Florida; Stephen Smartt, Queen's University Belfast; Tamara Davis, The University of Queensland, and Tara Murphy, University of Sydney
One hundred years ago, Albert Einstein published his general theory of relativity, which described how gravity warps and distorts space-time.
While this theory triggered a revolution in our understanding of the universe, it made one prediction that even Einstein doubted could be confirmed: the existence of gravitational waves.
Today, a century later, we have that confirmation, with the detection of gravitational waves by the Advanced Laser Interferometer Gravitational-Wave Observatory (aLIGO) detectors.
Here we collect reactions and analysis from some of the leading astronomers and astrophysicists from around the world.

Keith Riles, University of Michigan


Keith Riles explains gravitational waves.

Einstein was skeptical that gravitational waves would ever be detected because the predicted waves were so weak. Einstein was right to wonder – the signal detected on September 14, 2015 by the aLIGO interferometers caused each arm of each L-shaped detector to change by only 2 billionths of a billionth of a meter, about 400 times smaller than the radius of a proton.
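To get a feel for those numbers, here is a quick back-of-the-envelope check in Python, assuming aLIGO's published 4 km arm length and an approximate proton charge radius (both assumed reference values, not figures stated in this article):

```python
# Sanity-check the scale of the September 14 signal.
arm_length_m = 4.0e3        # aLIGO arm length (assumed, published value)
delta_L_m = 2.0e-18         # "2 billionths of a billionth of a meter"
proton_radius_m = 0.84e-15  # approximate proton charge radius (assumed)

strain = delta_L_m / arm_length_m
print(f"fractional strain h = dL/L ~ {strain:.0e}")               # ~5e-22
print(f"proton radius / dL ~ {proton_radius_m / delta_L_m:.0f}")  # ~400
```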
It may seem inconceivable to measure such tiny changes, especially with a giant apparatus like aLIGO. But the secret lies in the lasers (the real “L” in LIGO) that are projected down each arm.
Fittingly, Einstein himself indirectly helped make those lasers happen, first by explaining the photoelectric effect in terms of photons (for which he earned the Nobel Prize), and second, by creating (along with Bose) the theoretical foundation of lasers, which create coherent beams of photons, all with the same frequency and direction.
In the aLIGO arms there are nearly a trillion trillion photons per second impinging on the mirrors, all sensing the precise positions of the interferometer mirrors. It is this collective, coherent sensing that makes it possible to determine that one mirror has moved in one direction, while a mirror in the other arm has moved in a different direction. This distinctive, differential motion is what characterizes a gravitational wave, a momentary differential warp of space itself.
By normally operating aLIGO in a mode of nearly perfect cancellation of the light returning from the two arms (destructive interference), scientists can therefore detect the passage of a gravitational wave by looking for a momentary brightening of the output beam.
The particular pattern of brightening observed on September 14 agrees remarkably well with what Einstein’s General Theory of Relativity predicts for two massive black holes in the final moments of a death spiral. Fittingly, Einstein’s theory of photons has helped to verify Einstein’s theory of gravity, a century after its creation.

Amanda Weltman, University of Cape Town

The results are in and they are breathtaking. Almost exactly 100 years ago Einstein published “Die Feldgleichungen der Gravitation” in which he laid out a new theory of gravity, his General Theory of Relativity. Einstein not only improved on his predecessor, Newton, by explaining the unexpected orbit of the planet Mercury, but he went beyond and laid out a set of predictions that have shaken the very foundations of our understanding of the universe and our place in it. These predictions include the bending of light leading to lensed objects in the sky, the existence of black holes from which no light can escape as well as the entire framework for our modern understanding of cosmology.

NASA’s Hubble Space Telescope captured gravitational lensing of light, as predicted by Einstein. NASA, ESA, K. Sharon (Tel Aviv University) and E. Ofek (Caltech), CC BY

Einstein’s predictions have so far all proven true, and today, the final prediction has been directly detected, that of gravitational waves, the tiniest ripples through space; the energy radiated away by two massive heavenly bodies spiralling into each other. This is the discovery of the century, and it is perhaps poetic that one of the places it is being announced is Pisa, the very place where, according to legend, 500 years ago, Galileo dropped two massive objects to test how matter reacts to gravity.
As we bathe in the glory of this moment it is appropriate to ask, what is next for astronomy and physics and who will bring about the next revolution? Today’s discovery will become tomorrow’s history. Advanced LIGO brings a new way of testing gravity, of explaining the universe, but it also brings about the end of an era of sorts. It is time for the next frontier, with the Square Kilometre Array project finally afoot across Africa and Australia, the global South and indeed Africa itself is poised to provide the next pulse of gravity research.

Stephen Smartt, Queen’s University Belfast

Not only is this remarkable discovery of gravitational waves an extraordinary breakthrough in physics, it is a very surprising glimpse of a massive black hole binary system, meaning two black holes that are merging together.
Black holes are dark objects with a mass beyond what is possible for neutron stars, which are a type of very compact star – about 10 km across and weighing up to two solar masses. To imagine this kind of density, think of the entire human population squeezed onto a teaspoon. Black holes are even more extreme than that. We’ve known about binary neutron stars for years, and the first detection of gravitational waves was expected to be two neutron stars colliding.
What we know about black hole pairs so far comes from the study of the stars orbiting around them. These binary systems typically have black holes with masses five to 20 times that of the sun. But LIGO has seen two black holes with about 30 times the mass of the sun in a binary system that has finally merged. This is remarkable for several reasons. It is the first detection of two merging black holes, it is at a much greater distance than LIGO expected to find sources, and the total mass in the system is also much larger than expected.
This raises interesting questions about the stars that could have produced this system. We know massive stars die in supernovae, and most of these supernovae (probably at least 60%) produce neutron stars. The more massive stars have very large cores that collapse and are too massive to be stable neutron stars so they collapse all the way to black holes.
But a binary system with two black holes of around 30 solar masses is puzzling. We know of massive binary star systems in our own and nearby galaxies, and they have initial masses well in excess of 100 suns. But we see them losing mass through enormous radiation pressure and they are predicted, and often observed, to end their lives with masses much smaller – typically about ten times the sun.
If the LIGO object is a pair of 30 solar mass black holes, then the stars that formed it must have been at least as massive. Astronomers will be asking – how can massive stars end their lives so big and how can they create black holes so massive? As well as the gravitational wave discovery, this remarkable result will affect the rest of astronomy for some time.

Alan Duffy, Swinburne University

The detection of gravitational waves is the confirmation of Albert Einstein’s final prediction and ends a century-long search for something that even he believed would remain forever untested.
This discovery marks not the end, but rather the beginning, of an era in which we explore the universe around us with a fundamentally new sense. Touch, smell, sight and sound all use ripples in an electromagnetic field, which we call light, but now we can make use of ripples in the background field of space-time itself to “see” our surroundings. That is why this discovery is so exciting.
The Advanced Laser Interferometer Gravitational-Wave Observatory (aLIGO) measured the tiny stretching of space-time by distant colliding black holes, giving us a unique view into the most extreme objects in general relativity.
The exact “ringing” of space-time as the ripples pass through the detector tests this theory and our understanding of gravity in ways no other experiment can.
We can even probe the way galaxies grow and collide by trying to measure the gravitational waves from the even larger collisions of supermassive black holes as the galaxies they are contained in smash together.
Australia in particular is a leading nation in this search, using distant pulsars as the ruler at the Parkes telescope.

The LIGO detectors are sensitive to the minute ripples in space-time caused by the merging of two black holes. University of Birmingham Gravitational Waves Group, Christopher Berry


Tara Murphy, University of Sydney

In addition to binary black holes, aLIGO will detect gravitational waves from other events such as the collision of neutron stars, the dense remnants left over when massive stars collapse.
Astronomers think that two neutron stars colliding may trigger a gamma-ray burst, which we can detect with “regular” telescopes.

Simulation of neutron stars colliding. Credit: NASA

In Australia, we have been using the Murchison Widefield Array and the Australian Square Kilometre Array Pathfinder to follow up aLIGO candidates.
aLIGO is an incredibly sensitive instrument but it has very poor ability to determine where the gravitational waves are coming from. Our radio telescopes can scan large areas of sky extremely quickly, so can play a critical part in identifying the event.
This project has been like no other one I have worked on. When aLIGO identifies a candidate, it sends out a private alert to an international network of astronomers. We respond as quickly as possible with our telescopes, scanning the region the event is thought to have occurred in, to see if we can detect any electromagnetic radiation.
Everything is kept top secret – even the other people using our telescopes are not allowed to know where we are pointing them.
To make sure their complex processing pipeline was working correctly, someone in the aLIGO team inserted fake events into the process. Nobody on the team, or those of us doing follow-up, had any idea whether what we were responding to was real or one of these fake events.
We are truly in an era of big science. This incredible result has been the work of not only hundreds of aLIGO researchers and engineers, but hundreds more astronomers collaborating around the globe. We are eagerly awaiting the next aLIGO observing run, to see what else we can find.

Tamara Davis, University of Queensland

Rarely has a discovery been so eagerly anticipated.
When I was a university undergraduate, almost 20 years ago, I remember a physics lecturer telling us about the experiments trying to detect gravitational waves. It felt like the discovery was imminent, and it was one of the most exciting discoveries that could be made in physics.
Mass and energy warping the fabric of space is one of the pieces of general relativity that most captures the imagination. However, while it has enormous explanatory power, the reality of that curvature is hard to grasp or confirm.
For the last few months I’ve had to sit quietly and watch as colleagues followed up the potential gravitational wave signal. This is the one and only time in my scientific career that I wasn’t allowed to talk about a scientific discovery in progress.
But that’s because it is such a big discovery that we had to be absolutely sure about it before announcing it, lest we risk “crying wolf”.
Every last check had to be done, and of course, we didn’t know whether it was a real signal, or a signal injected by the experimenters to keep us on our toes, test the analysis and follow-up.
I work with a project called the Dark Energy Survey, and with our massive, wide-field, half-billion pixel camera on a four metre telescope in Chile, my colleagues took images trying to find the source of the gravitational waves.
The wide-field is important, because the gravitational wave detectors aren’t very good at pinpointing the exact location of the source.
Unfortunately if it was a black hole merger, we wouldn’t expect to see any visible light.
Now that we’re in the era of detecting gravitational waves, though, we’ll be able to try again with the next one.

Maria Womack, University of South Florida

This is a momentous change for astronomy. Gravitational-wave astronomy can now truly begin, opening a new window to the universe. Normal telescopes collect light at different wavelengths, such as X-ray, ultraviolet, visible, infrared and radio, collectively referred to as electromagnetic radiation (EM). Gravitational waves are emitted by accelerating mass, analogous to the way electromagnetic waves are emitted by accelerating charge.
The most massive objects with the highest accelerations will be the first events detected. For example, Advanced LIGO, funded by the U.S. National Science Foundation, can detect binary black holes in tight, fast orbits. GWs carry away energy from the orbiting pair, which in turn causes the black holes to shrink their orbit and accelerate even more, until they merge in a violent event, which is now detectable on Earth as a whistling “chirp.”

An example signal from an inspiral gravitational wave source. A. Stuver/LIGO, CC BY-ND
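To see where the "chirp" comes from, here is a toy Python calculation using the leading-order (Newtonian) inspiral formula; the 28-solar-mass chirp mass is an illustrative assumption, roughly the value reported for the September 14 event. The wave frequency sweeps upward from tens of Hz past 100 Hz as the merger approaches — the chirp.

```python
import math

# Toy Newtonian "chirp": gravitational-wave frequency of an inspiralling
# binary as a function of time before merger. Illustrative numbers only.
G = 6.674e-11          # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8            # speed of light, m/s
M_sun = 1.989e30       # solar mass, kg
M_chirp = 28 * M_sun   # assumed "chirp mass" of the binary

def gw_frequency(tau):
    """Leading-order GW frequency (Hz) a time tau (s) before merger."""
    return (1 / math.pi) * (5 / (256 * tau))**0.375 * (G * M_chirp / c**3)**(-0.625)

for tau in (1.0, 0.5, 0.2, 0.1, 0.05, 0.01):
    print(f"{tau:5.2f} s before merger: f ~ {gw_frequency(tau):6.1f} Hz")
```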

The gravitational-wave sky is completely uncharted, and new maps will be drawn that will change how we think of the universe. GWs might be detected coming from cosmic strings, hypothetical defects in the curvature of space-time. They will also be used to study what makes some massive stars explode into supernovae, and how fast the universe is expanding. Moreover, GW and traditional telescopic observing techniques can be combined to explore important questions, such as whether the graviton, the presumed particle that transmits gravity, actually has mass. If massless, gravitons will arrive at the same time as photons from a strong event; if they have even a small mass, they will arrive second.
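One way to picture that test: under the standard dispersion relation for a relativistic massive particle, a graviton of rest energy mc² and quantum energy E = hf trails light by roughly (D/c)(mc²/E)²/2 over a distance D. A Python sketch with purely illustrative numbers — the distance, frequency and graviton mass below are assumptions, not measurements:

```python
# Arrival delay of a hypothetical massive graviton relative to photons,
# using v/c ~ 1 - (m c^2 / E)^2 / 2 for E >> m c^2. Numbers are assumed.
h_planck = 4.136e-15   # Planck constant, eV*s
c = 2.998e8            # speed of light, m/s
ly = 9.461e15          # meters per light year

D = 1.3e9 * ly         # assumed source distance: 1.3 billion light years
f_gw = 100.0           # assumed wave frequency in the detector band, Hz
E = h_planck * f_gw    # quantum energy of a 100 Hz graviton, eV
m_c2 = 1e-22           # assumed graviton rest energy, eV

delay_s = (D / c) * (m_c2 / E)**2 / 2
print(f"arrival delay ~ {delay_s:.2e} s")  # of order a millisecond
```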

Daniel Kennefick, University of Arkansas

Almost 100 years ago, in February 1916, Einstein first mentioned gravitational waves in writing. Ironically it was to say that he thought they did not exist! Within a few months he changed his mind and by 1918 had published the basis of our modern theory of gravitational waves, adequate to describe them as they pass by the Earth. However his calculation does not apply to strongly gravitating systems like a binary black hole.

Albert Einstein was the original theorist who started the hunt for gravitational waves.

It was not until 1936 that Einstein returned to the problem, eventually publishing one of the earliest exact solutions describing gravitational waves. But his original sceptical attitude was carried forward by some of his former assistants into the postwar rebirth of General Relativity. In the 1950s, doubts were expressed as to whether gravitational waves could carry energy and whether binary star systems could even generate them.
One way to settle these disputes was to carry out painstaking calculations showing how the emission of gravitational waves affected the motion of the binary system. This proved a daunting challenge. Not only were the calculations long and tedious, but theorists found they needed a much more sophisticated understanding of the structure of space-time itself. Major breakthroughs included the detailed picture of the asymptotic structure of space-time, and the introduction of the concept of matched asymptotic expansions. Prior to breakthroughs such as these, many calculations got contradictory results. Some theorists even concluded that the binary system should gain, not lose, energy as a result of emitting gravitational waves!
While the work of the 1960s convinced theorists that binary star systems did emit gravitational waves, debate persisted as to whether Einstein’s 1918 formula, known as the quadrupole formula, correctly predicted the amount of energy they would radiate. This controversy lasted into the early 1980s and coincided with the discovery of the binary pulsar, a real-life system whose orbit was decaying in line with the predictions of Einstein’s formula.
In the 1990s, with the beginnings of LIGO, theorists' focus shifted to providing even more detailed corrections to formulas such as these. Researchers use descriptions of the expected signal as templates which facilitate the extraction of the signal from LIGO’s noisy data. Since no gravitational wave signals had ever been seen before, theorists found themselves unusually relevant to the detection project – only they could provide such data analysis templates.

David Parkinson, University of Queensland

Gravitational waves can be used to provide a direct probe of the very early universe. The further away we look, the further back in time we can see. But there is a limit to how far back we can see, as the universe was initially an opaque plasma, and remained so even as late as 300,000 years after the Big Bang.
This surface, from which the cosmic microwave background is emitted, represents the furthest back any measurement of electromagnetic radiation can directly investigate.
But this plasma is no impediment for gravitational waves, which will not be absorbed by any intervening matter, but come to us directly. Gravitational waves are predicted to be generated by a number of different mechanisms in the early universe.
For example, the theory of cosmic inflation, which suggests a period of accelerated expansion moments after the Big Bang, goes on to predict not just the creation of all structure that we see in the universe, but also a spectrum of primordial gravitational waves.
It is these primordial gravitational waves that the BICEP2 experiment believed it had detected in March 2014.
BICEP2 measured the polarisation pattern of the cosmic microwave background, and reported a strong detection of the imprint of primordial gravitational waves. These results turned out in fact to be contamination by galactic dust, and not primordial gravitational waves.
But there is every reason to believe that future experiments may be able to detect these primordial gravitational waves, either directly or indirectly, and so provide a new and complementary way to understand the physics of the Big Bang.
The Conversation

Keith Riles, Professor of Physics, University of Michigan; Alan Duffy, Research Fellow, Swinburne University of Technology; Amanda Weltman, SARChI in Physical Cosmology, Department of Mathematics and Applied Mathematics, University of Cape Town; Daniel Kennefick, Associate Professor of Physics, University of Arkansas; David Parkinson, Researcher in astrophysics, The University of Queensland; Maria Womack, Research Professor of Physics, University of South Florida; Stephen Smartt, Professor of Physics and Mathematics, Queen's University Belfast; Tamara Davis, Professor, The University of Queensland, and Tara Murphy, Associate Professor and ARC Future Fellow, University of Sydney
This article was originally published on The Conversation. Read the original article.

Thursday 24 March 2016

From Stonehenge to Nefertiti: how high-tech archaeology is transforming our view of history


Kristian Strutt, University of Southampton

A recent discovery could radically change our views of one of the world’s most famous archaeological sites, Tutankhamun’s tomb. Scans of the complex in Egypt’s Valley of the Kings revealed it may still include undiscovered chambers – perhaps even the resting place of Queen Nefertiti – even though we have been studying the tomb for almost 100 years.
It’s common to get excited about high-profile archaeological discoveries, but it’s the slower, ongoing research that shows the real potential of new technology to change our understanding of history.
The latest findings touch on the mystery and conjecture around the tomb of the Egyptian queen consort Nefertiti, who died around 1330 BC. Some scholars believe that she was buried in a chamber in her stepson Tutankhamun’s tomb (known as KV62), although others have urged caution over this hypothesis.
Nefertiti is a pivotal figure in Egyptology. She and her husband Pharaoh Akhenaten helped bring about a religious revolution in ancient Egypt, and she may have even briefly ruled the country after his death. But we have little solid information about her life or death and her remains have never been found.
So the discovery of her tomb could be instrumental in revealing more about this critical period in history, and even change our views on how powerful and important she was. Nicholas Reeves, the director of the research, believes that the size and layout of KV62 means that it may have originally been designed for a queen. He has also used a ground-penetrating radar (GPR) survey to look for possible hidden antechambers that may contain Nefertiti’s remains after reassessment of the relationship between Nefertiti and Tutankhamun led to renewed interest in the tomb.


Ground penetrating radar. University of Southampton, Author provided

Underground archaeology

The geophysical survey techniques used to study the tomb have been applied in archaeology since the 1970s. GPR involves emitting electromagnetic radar waves through a structure and measuring how long it takes for them to be reflected by the different objects and elements that comprise it. Radar waves travel through different materials at different velocities, so it’s possible to use this timing information to build a 3D map of the structure. For KV62, the map suggests there are spaces beyond the standing walls of the tomb, which could be undiscovered antechambers.
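In rough terms, a reflector's depth follows from the echo's two-way travel time and the radar velocity in the ground. A minimal Python sketch of that timing principle, with an assumed permittivity and echo time (both illustrative, not survey values):

```python
import math

# Depth of a buried reflector from GPR two-way travel time.
c = 2.998e8                # speed of light in vacuum, m/s
eps_r = 9.0                # assumed relative permittivity of the ground
v = c / math.sqrt(eps_r)   # radar velocity in the material

two_way_time_ns = 20.0     # assumed measured echo time, nanoseconds
depth_m = v * (two_way_time_ns * 1e-9) / 2   # divide by 2: down and back
print(f"reflector depth ~ {depth_m:.2f} m")  # ~1 m for these numbers
```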
The problem with such surveys is that the high hopes of the initial conclusions released to the public may not match the reality of later findings. The data can often be interpreted in different ways. For example, natural breaks and fissures in the rock may produce responses similar to undiscovered chambers. Scanning the relatively small area of the walls of an individual chamber can make it difficult to place the results in a broader context.
By gathering a wider range of data, we can slowly build up a clearer picture of the history of a site. While not as dramatic as uncovering a forgotten tomb, the process of using technology to gradually study a site can, directly and indirectly, significantly change our view of it or the people associated with it.
Other geophysical techniques tend to be used to study more open sites or landscapes. Magnetometry measures the variations in the Earth’s magnetic field that are caused by many forms of buried archaeological material, from fired material such as kilns to building material and filled ditches. Earth resistance measures how easily electrical current passes through the ground. Features such as walls, paving and rubble have a high resistance to current, while filled ditches and pits tend to have a low resistance.


Hidden landscape. Stonehenge Hidden Landscape Project LBIArchPRO

Uncovering the real Stonehenge

Putting such techniques to use at Stonehenge, for example, has completely transformed the way we think about how the landscape was used, and the forms of worship used by Neolithic society. Prior to the survey only a handful of ritual monuments were known around the impressive remains of Stonehenge, meaning that archaeologists could not easily evaluate the way in which the landscape was used.
The geophysical survey revealed hundreds of archaeological features, including 17 major ritual monuments. For the first time archaeologists were able to map every single possible buried monument in the landscape, including henges, pits, barrows and ditches. This means we can start to fully appreciate the way in which the ritual landscape was organised. For example, the new monuments reveal astronomical alignments that were previously unknown or only partly recognised.
Similar geophysical survey work at Ostia Antica in Italy has completely altered our theories about the layout of the city and its harbour. A magnetometer survey conducted across the area between Portus and Ostia between 2008 and 2011 discovered the presence of buried warehouses and associated structures. These were enclosed by the line of a defensive wall, showing that the extent of the ancient city included both banks of the river Tiber. This crucial fact changes the potential size of the city and alters our plan of its harbour area. This suggests much more of the city was used for storage, perhaps making it even more important as a port for nearby Rome than previously thought.


Sarum revealed. University of Southampton, Author provided

An ongoing survey at Old Sarum in Wiltshire in the UK has been studying the area surrounding the remains of the Iron Age hillfort and medieval town. Using GPR, magnetometry and earth resistance together, researchers have uncovered an unprecedented number of Roman and medieval structures, courtyards and other remains. This indicates that there was a much more substantial and complex settlement at Old Sarum much earlier than previously thought. Further work in 2016 may even prove claims of a late Saxon settlement and mint at the site.
These kinds of discoveries show that geophysical technology has a huge role to play in archaeology, both through investigation of sites and landscapes, and also of smaller monuments such as buildings and tombs. But we need to look beyond the more sensational aspects of such research and understand the role it plays in the bigger picture of uncovering the past.
The Conversation

Kristian Strutt, Experimental Officer and Geophysical Researcher, University of Southampton
This article was originally published on The Conversation. Read the original article.

Wednesday 23 March 2016

New Atheism, Meet Existential Risk Studies

 
By Phil Torres

 While the New Atheist movement isn’t, and has never been, a monolithic phenomenon, its primary motivating idea can be reduced to a single statement, namely that religion is not merely wrong, but dangerous. In fact, religion is dangerous precisely because it’s wrong: it commands believers to act according to “moral” precepts and guidelines that are ultimately based on private revelations had by ancient prophets claiming special access to the supernatural. Put differently, religion is our very best instance of institutionalized bad epistemology, and this is what makes it unreasonable to accept. And when its doctrinal systems are put into practice, they often compromise our well-being and prosperity.
Copious evidence substantiates this contention. On the one hand, history is overflowing with bloody conflicts driven by antagonistic religious dogmas held by fanatics who cared more about the otherworldly than the worldly. And, as the 2014 Global Terrorism Index affirms, religious extremism constitutes the primary driver of terrorism around the world today. Even more, numerous empirical studies have shown that, to quote the sociologist Phil Zuckerman, secular people are “markedly less nationalistic, less prejudiced, less anti-Semitic, less racist, less dogmatic, less ethnocentric, less close-minded, and less authoritarian” than religious people. And the most secularized countries tend to be the happiest, the most peaceable (according to the Global Peace Index), and, as reported by the Economist’s think tank several years ago, the “best places to be born.” While Christopher Hitchens’ declaration that “religion poisons everything” might be somewhat exaggerated, religious belief is consistently associated with diminished levels of human flourishing.
But I believe that the New Atheist’s position is even more compelling than the New Atheists themselves have previously realized. Concomitant with the rise of the New Atheist movement about a decade ago, another field took shape in some of the top universities around the world, most notably Oxford and Cambridge. This field, called existential risk studies (or existential riskology), grew out of the innovative work of thinkers like John Leslie, Sir Martin Rees, Richard Posner, and Nick Bostrom. Its focus is a special kind of tragedy known as an existential risk, or a catastrophe resulting in either our extinction or a state of permanent and severe deprivation.
While humanity has always been haunted by a small number of improbable threats to our survival, such as asteroid/comet impacts, supervolcanoes, and pandemics (call these our “cosmic risk background”), advanced technologies are introducing a constellation of brand-new existential risks that humanity has never before encountered—and therefore has no track record of surviving. These risks stem largely from technologies like nuclear weapons, biotechnology, synthetic biology, nanotechnology, and even artificial superintelligence, which a growing number of scholars identify as the greatest (known) threat to the long-term survival of humanity. Add to this the ongoing slow-motion catastrophes of climate change and biodiversity loss that threaten our planetary spaceship with environmental ruination. While these two risks could genuinely bring about our extinction, they’re probably best described as “conflict multipliers” that will nontrivially raise the probability of other risk scenarios being realized, as state and non-state actors compete for land and dwindling resources.
Taking all of this into account, many riskologists believe that the probability of an existential catastrophe occurring in the foreseeable future is unsettlingly high. For example, last year the Bulletin of the Atomic Scientists moved the minute hand of its Doomsday Clock (a metaphorical clock according to which midnight represents doom) from five minutes before midnight to a mere three minutes. And in January, Bulletin board members (including 2015 Humanist of the Year Lawrence Krauss) held a press conference to announce that despite the Paris climate agreement and the Iran nuclear deal, the hands of the clock would not move from their perilous position in 2016. As a point of reference, the furthest away from midnight that we’ve been since the Doomsday Clock was created in 1947 is seventeen minutes, at the close of the Cold War. And only once before, at the height of the Cold War, has the hand been closer than it currently is today.


Source: http://thehumanist.com/magazine/march-april-2016/features/new-atheism-meet-existential-risk-studies

Brain holds more than one road to fear

In a pair of twin sisters, a rare disease had damaged the brain’s structures believed necessary to feel fear. But an injection of a drug could nevertheless make them anxious.
The results of that experiment, described in the March 23 Journal of Neuroscience, add to evidence that the amygdalae, small, almond-shaped structures tucked deep in the brain, aren’t the only parts of the brain that make a person feel afraid. “Overall, this suggests multiple different routes in the brain to a common endpoint of the experience of fear,” says cognitive neuroscientist Stephan Hamann of Emory University in Atlanta.
The twins, called B.G. and A.M., have Urbach-Wiethe disease, a genetic disorder that destroyed most of their amygdalae in late childhood. Despite this, an earlier study showed that the twins felt fear after inhaling air laden with extra carbon dioxide, an experience that can create the sensation of suffocating (SN: 3/23/13, p. 12). Because carbon dioxide affects a wide swath of the body and brain, scientists turned to a more specific cause of fear that stems from inside the body: a drug called isoproterenol, which can set the heart racing and make breathing hard. Sensing these bodily changes provoked by the drug can cause anxiety.
“If you know what adrenaline feels like, you know what isoproterenol feels like,” says study coauthor Sahib Khalsa, a psychiatrist and neuroscientist at the Laureate Institute for Brain Research in Tulsa, Okla.
After injections of isoproterenol, both twins felt shaky and anxious. B.G. experienced a full-blown panic attack, a result of the drug that afflicts about a quarter of people who receive it, says Khalsa. In a second experiment, researchers tested the women’s ability to judge their bodies’ responses to the drug. While receiving escalating doses, the women rated the intensity of their heartbeats and breathing. A.M., the woman who didn’t have a panic attack, was less accurate at sensing the drug’s effects on her body than both her sister and healthy people, researchers found.
It’s not clear why the twins responded differently, Khalsa says. Further experiments using brain scans may help pinpoint neural differences that could be behind the different reactions.
The results suggest that the amygdala isn’t the only part of the brain involved in fear and anxiety, but there’s more work to do before scientists understand how the brain creates these emotions, Khalsa says. “It’s definitely a complicated question and a debate that’s unresolved,” he says.

A new way to determine the age of stars?


University of Rochester

Researchers have developed a new conceptual framework for understanding how stars similar to our Sun evolve. Their framework helps explain how the rotation of stars, their emission of x-rays, and the intensity of their stellar winds vary with time. According to first author Eric Blackman, professor of physics and astronomy at the University of Rochester, the work could also "ultimately help to determine the age of stars more precisely than is currently possible."
In a paper published today in Monthly Notices of the Royal Astronomical Society, the researchers describe how they have corroborated known, observable data for the activity of Sun-like stars with fundamental astrophysics theory. By looking at the physics behind the speeding up or slowing down of a star's rotation, its x-ray activity, and magnetic field generation, Blackman says the research is a "first attempt to build a comprehensive model for the activity evolution of these stars".
Using our Sun as the calibration point, the model most accurately describes the likely behavior of the Sun in the past, and how it would be expected to behave in the future. But Blackman adds that there are many stars of similar mass and radius, and so the model is a good starting point for predictions for these stars.
"Our model shows that stars younger than our Sun can vary quite significantly in the intensity of their x-ray emission and mass loss," said Blackman. "But there is a convergence in the activity of the stars after a certain age, so you could say that our Sun is very typical for stars of its mass, radius, and its age. They get more predictable as they age."
"We're not yet at the point where we can accurately predict a star's precise age, because there are simplifying assumptions that go into the model," said Blackman. "But in principle, by extending the work to relax some of these assumptions we could predict the age of for a wide range of stars based on their x-ray luminosity."
At the moment, empirically determining the age of stars is most easily accomplished if a star is among a cluster of stars, from whose mutual properties astronomers can estimate the age. Blackman explains that its age can then be estimated "to an accuracy not better than a factor of 25% of its actual age, which is typically billions of years." The problem is worse for "field stars," alone in space such that the cluster method of dating cannot be used. For these stars, astronomers have turned to "gyrochronology" and "activity" aging - empirically aging the stars based on the fact that older stars of known age rotate more slowly and have lower x-ray luminosities than younger stars.
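The idea behind gyrochronology can be sketched with the classic Skumanich spin-down law, under which rotation period grows roughly as the square root of age. This simple scaling is an assumption for illustration only, not the coupled model developed by Blackman and Owen:

```python
# Toy gyrochronology: estimate a Sun-like star's age from its rotation
# period, assuming the Skumanich law P ~ t**0.5, calibrated on the Sun.
P_sun_days = 25.4     # solar rotation period (calibration point)
age_sun_gyr = 4.6     # solar age in billions of years (calibration point)

def gyro_age_gyr(period_days):
    """P proportional to sqrt(t)  =>  t = t_sun * (P / P_sun)**2."""
    return age_sun_gyr * (period_days / P_sun_days) ** 2

for P in (10.0, 18.0, 25.4, 30.0):
    print(f"P = {P:5.1f} d  ->  age ~ {gyro_age_gyr(P):.1f} Gyr")
```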
"Over the past few decades astronomers have been able to empirically measure these trends in rotation and magnetic activity for stars like the Sun, but Eric and his collaborators are trying to devise a comprehensive theoretical interpretation," said Eric Mamajek, professor of physics and astronomy at the University of Rochester and one of the astronomers leading the development of empirical methods for determining a star's age. "Ultimately this should lead to improved constraints on the evolution of rotation and activity in Sun-like stars, and better constraints on how the magnetic properties of our Sun have changed over the course of its main sequence life."
And this is where the model developed by Blackman and his coauthor James E. Owen is important: it provides a physics explanation for how stellar rotation, activity, magnetic field, and mass loss all mutually evolve with age.
"Only by tackling the entire problem of how stellar rotation, x-ray activity, magnetic field and mass-loss mutually affect each other could we build a complete picture," said Owen, a NASA Hubble fellow at the Institute for Advanced Study, Princeton. "We find these processes to be strongly intertwined and the majority of previous approaches had only considered the evolution of one or two processes together, not the complete problem."
Blackman carried out part of this work while he was on sabbatical as an IBM-Einstein Fellow/Simons Fellow at the Institute for Advanced Study, Princeton. The authors would also like to acknowledge NSF and NASA for their grant support.
Reference:
"Minimalist coupled evolution model for stellar x-ray activity, rotation, mass loss and magnetic field," MNRAS, 2016 March 23. Paper: http://mnras.oxfordjournals.org/lookup/doi/10.1093/mnras/stw369.

People With Fewer Friends Tend To Be Intelligent: Study


There is an archetype or trope in fiction and mass media called Intelligence Equals Isolation: a person or character is "very smart" but often suffers for it, unable to relate to the worries and personalities of friends, relatives or anyone else.
Now, a new real-world study may back up this fictional archetype. Evolutionary psychologists from Singapore and London have found that highly intelligent people report lower life satisfaction when they socialize frequently, even with close friends.
Measuring Happiness
Satoshi Kanazawa of the London School of Economics and Political Science, and Norman Li of Singapore Management University originally dug into the question: what makes a life well-lived?
Kanazawa and Li hypothesize that the lifestyle of our hunter-gatherer ancestors forms the foundation of what makes modern humans happy now.
They applied a concept called "the savanna theory of happiness" to explain their findings from a large survey that involved 15,000 people, who were 18 to 28 years old.
The pair found that people living in densely populated areas reported lower satisfaction with their lives: the greater the population density, the less happy the respondents said they were.
Researchers also found that the more interaction the respondents had with their close friends, the greater their self-reported happiness was.
There was, however, a huge exception: for intelligent people, the correlations were reversed or diminished.
More Alone Time, Please
The team measured intelligence through each respondent's intelligence quotient (IQ). Although the exact IQ levels of the respondents were not disclosed, 100 is considered the baseline, while 140 is considered genius level.
Kanazawa and Li found that the effect of population density on life satisfaction was more than twice as large for individuals with low IQ than for individuals with high IQ.
In fact, more intelligent individuals were less satisfied with their lives when they socialized with their friends more frequently.
In other words, intelligent people tended to need more alone time: the more time they spent with friends, the less satisfied with life they reported being.
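For statistically minded readers, a moderation effect like this one, where intelligence diminishes or reverses the density-satisfaction link, is typically tested with an interaction term in a regression. The sketch below is a toy simulation, not the study's data or model; every variable name and coefficient is made up purely for illustration, and it assumes numpy and statsmodels are installed.

```python
# Toy illustration (NOT the study's data or model) of testing a
# density x intelligence interaction with ordinary least squares.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 15_000
iq = rng.normal(100, 15, n)        # simulated intelligence scores
density = rng.uniform(0, 1, n)     # normalized population density

# Simulate the reported pattern: density lowers satisfaction overall,
# but the penalty shrinks as IQ rises (a positive interaction term).
satisfaction = (5.0 - 2.0 * density
                + 0.015 * (iq - 100) * density
                + rng.normal(0, 1, n))

X = sm.add_constant(np.column_stack([density, iq, density * iq]))
model = sm.OLS(satisfaction, X).fit()
print(model.params)  # positive density*iq coefficient = diminished effect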
Carol Graham of the Brookings Institution, an expert who studies the economics of happiness, offers an explanation.
"The findings in here suggest - and it is no surprise - that those with more intelligence and the capacity to use it ... are less likely to spend so much time socializing because they are focused on some other longer term objective," Graham says.
It could be that such a person prefers to spend more time treating cancer as a doctor, writing their next book as a novelist, or working to protect vulnerable people in society as a human rights lawyer. Frequent social interaction may distract them from pursuing these goals, lowering their life satisfaction.
The Link To Our Prehistoric Ancestors
However, Kanazawa and Li's savanna theory of happiness explains it differently.
It begins with the premise that the human brain evolved to meet the demands of the ancestral environment on the African savanna, where population density was similar to rural Alaska's, with less than one person per square kilometer. A brain adapted to that environment, placed in modern Manhattan, would experience evolutionary friction.
Our prehistoric hunter-gatherer ancestors also lived in small bands of about 150 individuals.
"In such settings, having frequent contact with lifelong friends and allies was likely necessary for survival and reproduction for both sexes," researchers said.
Kanazawa and Li found a twist: intelligent people may be better equipped to deal with evolutionarily new situations, so living in a densely populated area may have a smaller effect on their overall disposition and well-being.
Meanwhile, the study has a caveat: it defines happiness in terms of self-reported life satisfaction and does not consider experienced well-being, such as how recently a person laughed or how often they were angry in the past week.
Still, Kanazawa and Li said the distinction does not matter for their savanna theory.
"Even though our empirical analyses ... used a measure of global life satisfaction, the savanna theory of happiness is not committed to any particular definition and is compatible with any reasonable conception of happiness, subjective well-being, and life satisfaction," researchers said.

The study is featured in the British Journal of Psychology.

Tuesday 22 March 2016

Will the end of breeding orcas at SeaWorld change much for animals in captivity?


When SeaWorld announced it would stop breeding orcas and begin to phase out “theatrical performances” using the animals, the news appeared to mark a significant change in ideas about animals and captivity.
Wayne Pacelle, president of the Humane Society of the United States (HSUS), and Joel Manby, CEO of SeaWorld, promoted their new partnership in interviews. After a long history of mutual recrimination, the two organizations say they’ll work together to provide needed support for wild marine creatures in distress and to improve the circumstances of currently captive orcas in the U.S. As SeaWorld’s Manby put it:
It’s clear to me that society is shifting. People’s view to have these beautiful, majestic animals under human care – people are more and more uncomfortable with that. And no matter what side you are on this issue, it’s clear that that’s shifting, and we need to shift with that.
If there is indeed a shift going on, it seems to be more in the rhetoric of the animal exhibition industries than in public comfort (or discomfort) with seeing large animals in captivity.

Changing with the times…

For anyone interested in the history of exhibiting exotic animals, the news that people’s expectations have changed and that zoological gardens, aquariums and circuses are responsive to those changes can’t help but elicit a little cynicism.
The SeaWorld/HSUS announcement echoes news from last year that Ringling Bros. and Barnum & Bailey Circus decided to phase out elephant performances and retire the animals to a state-of-the-art sanctuary. In both cases, the companies were clearly facing growing public criticism damaging their bottom lines. They appear to have made business decisions to protect their brands and refocus the public’s attention on what they describe as more critical core missions.
At the same time, both announcements were framed as having resulted from the recognition that the times have changed – “that society is shifting” – and that change is making circumstances better for animals in captivity. This claim reaches far beyond charismatic whales and elephants and is deployed for all kinds of new policies and exhibits.

Zoological Society of London’s advertisement for ‘Land of the Lions.’

Later this month, for example, the London Zoo will open its “breath-taking” newest exhibit, “Land of the Lions,” featuring “thrilling, immersive Indian-themed areas to explore – including a train station, crumbling temple clearing, high street and guard hut.” The exhibit is described as an “interactive adventure,” through which visitors will “get closer than ever before to mighty Asiatic lions.”

Queen Elizabeth opens ‘Land of the Lions.’

As remarkable as this exhibit sounds, a video of the queen officially opening the exhibit shows a fairly unsurprising couple of female lions “activated” by having food dispersed in a relatively small exhibit with wire fencing.

But the times have been changing for a while

I’m not sure whether the queen felt transported to India in visiting this exhibit. What is clear, though, is that the zoo wants us to believe that this exhibit is something entirely novel. This sort of claim is very old, indeed.
Even in 1869, almost 150 years ago, an editorial appeared in the Daily News of London describing a proposed new lion house for this same zoo. Pointing to a history of “dismal menagerie cages,” the article heralded a new vision of “displaying lions and tigers, in what may be called by comparison a state of nature,” promising that the public could look forward to seeing “lions at play, free as their own jungle home; tigers crouching, springing, gamboling, with as little restraint as the low plains of their native India.”

A late 19th-century vision of a zoological park of the future. From Nigel Rothfels' Savages and Beasts: The Birth of the Modern Zoo

Ever since public zoos began to be built in the 19th century, there’s been a consistent rhetorical pattern behind any proposed new zoo or aquarium or exhibit.
The argument typically runs something like this: whereas in the past our exhibits have been disappointing, uninspiring and small, our new exhibit will finally make it seem like the animals are not in captivity. As importantly, the animals themselves will also finally be happy.
Unfortunately, almost all of these new exhibits turn out to be somehow less than was envisioned, less than was hoped…simply less.
This is not to say that exhibits haven’t in fact gotten better. Exhibited animals are in general better cared for and healthier in all ways than they used to be.
Each generation of exhibits does tend to improve on what came before. Elephant exhibits being built at the more ambitious zoos of today, like the Oregon Zoo’s “Elephant Lands,” have radically improved conditions for the animals, the keepers and the visiting public. And these changes have been pushed by public concern, along with the ambitions of designers and directors to provide better circumstances for the animals.
But all that doesn’t alter the fact of captivity. And that fact will, as best as I can tell, continue to undermine whatever rhetorical gestures may be made declaring a new day for animals and people.
The Conversation
Nigel Rothfels, Director of the Office of Undergraduate Research, University of Wisconsin-Milwaukee
This article was originally published on The Conversation. Read the original article.

Radiation combined with immune-stimulating drugs could pack a powerful punch against cancer cells

Charlie Garnett Benson, Georgia State University

In his final State of the Union address, President Obama tasked Vice President Joe Biden with leading a new National Cancer Moonshot initiative. The hope is that this will put America on course to be “the country that cures cancer once and for all.” Listed among the cutting-edge research areas of the initiative is a class of treatments called cancer immunotherapy and combination therapy.
Cancer immunotherapies are treatments that stimulate the immune system to target and attack cancer. Researchers now believe that combining immunotherapy with traditional therapies could open up new possibilities for cancer treatment.
For instance, radiation is one of the oldest and most commonly used forms of cancer treatment out there. But there are limits to how much radiation a person can receive, and it can’t kill every cancer cell. However, in combination therapy, radiation could be paired with immunotherapy to pack a one-two punch against cancer cells.

A patient is prepared for radiation treatment at Walter Reed National Military Medical Center. Airman Magazine/U.S. Air Force photo/Staff Sgt. Russ Scalf/Flickr, CC BY-NC

How does radiation kill cancer?

Most cancer patients receive some combination of surgery, chemotherapy and radiation during the course of their care. Radiation is used in about 50 percent of cancer patients. And unlike chemotherapy, which includes hundreds of different drugs that target cancer cells in different ways, ionizing radiation is simply high-energy waves. Regardless of how it is delivered, cells generally experience equal doses of radiation in similar ways.
Radiation can kill cancer cells directly by damaging their DNA, which then triggers various forms of cell death, including cell suicide (apoptosis). Because the high energy waves can also hit healthy cells around the targeted cancer cells, there’s a limit to how high a dose of radiation a person can receive without causing damage to healthy tissue.
Over the years, radiation delivery has improved, allowing more focused delivery to tumors and less damage to surrounding normal cells. Today patients are often treated with smaller doses, separated over time, called fractions. This allows for a higher overall dose to the tumor, but with less of the acute toxic side effects.
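The trade-off behind fractionation can be quantified with the standard linear-quadratic “biologically effective dose” (BED) formula from radiobiology, BED = n·d·(1 + d/(α/β)), for n fractions of d Gy delivered to tissue with sensitivity ratio α/β. The article itself doesn’t invoke this formula; the sketch below is a textbook calculation using typical values of α/β ≈ 10 Gy for tumors and ≈ 3 Gy for late-responding normal tissue.

```python
# Minimal sketch of the standard linear-quadratic BED formula from
# radiobiology (not from the article): BED = n * d * (1 + d / (alpha/beta)).
# Typical textbook values: alpha/beta ~ 10 Gy for tumors, ~ 3 Gy for
# late-responding normal tissue.

def bed(n_fractions: int, dose_per_fraction_gy: float,
        alpha_beta_gy: float) -> float:
    """Biologically effective dose in Gy for n fractions of d Gy each."""
    d = dose_per_fraction_gy
    return n_fractions * d * (1 + d / alpha_beta_gy)

# Same 60 Gy physical dose, delivered in one shot vs. 30 fractions:
for n, d in ((1, 60.0), (30, 2.0)):
    print(f"{n:>2} x {d:>4.1f} Gy -> tumor BED {bed(n, d, 10.0):6.1f} Gy, "
          f"normal-tissue BED {bed(n, d, 3.0):6.1f} Gy")
```

The numbers show why fractionation helps: splitting the same physical dose into many small fractions lowers the effective dose to late-responding normal tissue proportionally more than it lowers the effective dose to the tumor, which is what lets clinicians raise the total dose with fewer acute side effects.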
But even with these advancements there are still many patients whose cancer isn’t cured by radiation alone. Cell suicide, for instance, requires the activity of cellular proteins that trigger the apoptotic process.
Cancer cells can develop mutations in these genes that render them resistant to death from radiation, and these cancer cells escape elimination. Other cancer cells may survive because they receive a sublethal dose due to their location within the tumor.
In some cases, radiation offers no hope of a cure at all. It can still be given, however, to alleviate pain or cause some tumor shrinkage before other treatments are given.
However, the cancer cells that survive radiation treatment are not left unaffected. More recently, researchers have realized that zapping cancer cells with radiation can make them better targets for the immune system’s own response, and in turn for immunotherapies.

A cancer patient receives an intravenous dose of Pembrolizumab during a promising cancer treatment clinical trial at UCLA Medical Center in Los Angeles, California, August 19, 2013. David McNew/Reuters

How does immunotherapy fight cancer?

In the early days of immunotherapy, people thought that therapies intended to kill multiplying cells (like radiation and chemotherapy) would never be able to work with an immune-based therapy that is meant to multiply an army of immune cells.
My work and that of others has shown that radiation can make tumor cells express genes that increase the activity of immune cells. This is exciting since many cancer cells evade detection by decreasing the expression of genes that would allow the immune system to recognize and attack the cell. Radiation can reverse this and make cancer cells more noticeable.

A diagram of Immunogenic Modulation (IM) of Tumor Cells by Ionizing Radiation. Tumors have been reported to be modulated in several ways, which could directly alter the function, activity or recruitment of CD8+ killer T cells, as well as the function of other immune cells. Charlie Benson, Author provided

For instance, radiation can increase the expression of proteins on the surface of colorectal and prostate tumor cells that increase the survival and killing activity of T cells.
Radiation can also cause cancer cells to release molecules that recruit T cells to the tumor, or stimulate the activity of other cancer-killing cells called natural killer cells.
Radiation can modulate many other immune-stimulating genes in a variety of cancer cell types, a process called immunogenic modulation. So the old view that radiation is wholly immunosuppressive, and can’t be used with new immune-based therapies, isn’t true after all. From this immunologist’s point of view, radiation is still being greatly underutilized.
The addition of radiation to these therapies makes cancer cells better targets for the T cells produced by immunotherapy treatments. And that isn’t the only potential benefit of using radiation and immunotherapy together.

Attacking cancer in other parts of the body

Radiation can’t target every tumor or every cancer cell in the body. It isn’t feasible to deliver radiation to each and every place tumor cells have migrated once the disease metastasizes throughout the body.
That is where a phenomenon called the “abscopal effect” comes in. Abscopal means “away from target,” and is a radiation biology term that describes a fascinating phenomenon: sometimes treating a tumor with radiation in one part of the body causes the elimination and cure of a nontreated tumor at a different location.
Many scientists attribute this effect to the activity of immune cells, triggered by radiation, mounting an effective attack against untreated tumors. This can be shown experimentally in mouse models of cancer. However, it has also been observed in cancer patients in the clinic.
In the past few years, there have been several high-profile reports of abscopal responses in patients with lung, melanoma and other cancers.
This abscopal response occurred even in some melanoma patients receiving radiation just to treat pain. These patients all received some form of cancer immunotherapy in addition to the radiation therapy.
The bad news is that we still aren’t sure exactly how to make abscopal responses happen consistently. The challenge is to figure out what exactly is responsible for the abscopal effect so that it can be reproduced reliably in more patients treated with combination therapy.
Questions remain about what the best radiation dose is to cause this effect, the optimal timing to give radiation relative to the cancer immunotherapy, what specific types of cancer are most likely to respond this way and which immunotherapy (from the ever-growing list) is the best for causing this effect in combination strategies.
The overall good news is that radiation has new tricks up its sleeve and can make tumor cells tickle T cells into action. Thus, cancer immunotherapies may help repurpose the use of one of the oldest cancer treatments in new ways.
The Conversation

Charlie Garnett Benson, Assistant Professor, Georgia State University
This article was originally published on The Conversation. Read the original article.
