
Saturday, December 29, 2012

Chinese Medicine Yields Secrets: Atomic Mechanism of Two-Headed Molecule Derived from Chang Shan, a Traditional Chinese Herb


The mysterious inner workings of Chang Shan -- a Chinese herbal medicine used for thousands of years to treat fevers associated with malaria -- have been uncovered thanks to a high-resolution structure solved at The Scripps Research Institute (TSRI).

Scripps Research Institute scientists have determined a molecular structure that helps explain how the Chinese herbal medicine Chang Shan works. (Credit: Image courtesy of the Schimmel lab.)
Described in the journal Nature this week, the structure shows in atomic detail how a two-headed compound derived from the active ingredient in Chang Shan works. Scientists have known that this compound, called halofuginone (a derivative of febrifugine), can suppress parts of the immune system -- but nobody knew exactly how.

The new structure shows that, like a wrench in the works, halofuginone jams the gears of a molecular machine that carries out "aminoacylation," a crucial biological process that allows organisms to synthesize the proteins they need to live. Chang Shan, also known as Dichroa febrifuga Lour, probably helps with malarial fevers because traces of a halofuginone-like chemical in the herb interfere with this same process in malaria parasites, killing them in an infected person's bloodstream.

"Our new results solved a mystery that has puzzled people about the mechanism of action of a medicine that has been used to treat fever from a malaria infection going back probably 2,000 years or more," said Paul Schimmel, PhD, the Ernest and Jean Hahn Professor and Chair of Molecular Biology and Chemistry and member of The Skaggs Institute for Chemical Biology at TSRI. Schimmel led the research with TSRI postdoctoral fellow Huihao Zhou, PhD.

Halofuginone has been in clinical trials for cancer, but the high-resolution picture of the molecule suggests it has a modularity that would make it useful as a template to create new drugs for numerous other diseases.

The Process of Aminoacylation and its Importance to Life

Aminoacylation is a crucial step in the synthesis of proteins, the end products of gene expression. When genes are expressed, their DNA sequence is first read and transcribed into RNA, a similar molecule. The RNA is then translated into proteins, which are chemically very different from DNA and RNA but are composed of chains of amino acid molecules strung together in the order called for in the DNA.

Necessary for this translation process are a set of molecules known as transfer RNAs (tRNAs), which shuttle amino acids to the growing protein chain, where they are added like pearls on a string. But before the tRNAs can move the pearls into place, they must first grab hold of them.

Aminoacylation is the biological process whereby these amino acid "pearls" are attached to the tRNA shuttles. A class of enzymes known as aminoacyl-tRNA synthetases is responsible for attaching the amino acids to the tRNAs, and Schimmel and his colleagues have been examining the molecular details of this process for years. Their work has given scientists insight into everything from early evolution to possible targets for future drug development.

The picture of this process that has emerged over time involves three molecular players: a tRNA, an amino acid and the aminoacyl-tRNA synthetase enzyme that brings them together. A fourth molecule, ATP, is a microscopic form of fuel that gets consumed in the process.
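The interplay of the three players, the ATP fuel and the inhibitor can be sketched as a toy model. This is purely illustrative: the function name and the all-or-nothing binding logic are simplifying assumptions, not the enzyme's actual kinetics.

```python
# Toy model of aminoacylation and its inhibition by halofuginone.
# Illustrative sketch only -- names and binding logic are assumptions.

def aminoacylate(trna, amino_acid, atp_available, halofuginone=False):
    """Return a 'charged' tRNA pair, or None if the reaction is blocked.

    Per the study, halofuginone occupies both the amino-acid side and
    the tRNA side of the prolyl-tRNA synthetase active site, and it can
    only bind when ATP is present.
    """
    if not atp_available:
        return None  # ATP is consumed as fuel; no ATP, no reaction
    if halofuginone:
        # ATP captures the inhibitor, forming the unusual
        # enzyme-substrate-inhibitor complex Schimmel describes
        return None
    return (trna, amino_acid)  # the tRNA now carries its amino acid "pearl"

print(aminoacylate("tRNA-Pro", "proline", atp_available=True))
print(aminoacylate("tRNA-Pro", "proline", atp_available=True, halofuginone=True))
```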

The new work shows that halofuginone gets its potency by interfering with the tRNA synthetase enzyme that attaches the amino acid proline to the appropriate tRNA. It does this by blocking the active site of the enzyme where both the tRNA and the amino acid come together, with each half of the halofuginone blocking one side or the other.

Interestingly, said Schimmel, ATP is also needed for the halofuginone to bind. Nothing like that has ever been seen in biochemistry before.

"This is a remarkable example where a substrate of an enzyme (ATP) captures an inhibitor of the same enzyme, so that you have an enzyme-substrate-inhibitor complex," said Schimmel.

The article, "ATP-Directed Capture of Bioactive Herbal-Based Medicine on Human tRNA Synthetase," by Huihao Zhou, Litao Sun, Xiang-Lei Yang and Paul Schimmel was published in the journal Nature on Dec. 23, 2012.

This work was supported by the National Institutes of Health through grants #GM15539, #23562 and #88278 and by a fellowship from the National Foundation for Cancer Research.

Friday, December 28, 2012

Strange Behavior: New Study Exposes Living Cells to Synthetic Protein


One approach to understanding components in living organisms is to attempt to create them artificially, using principles of chemistry, engineering and genetics. A suite of powerful techniques -- collectively referred to as synthetic biology -- have been used to produce self-replicating molecules, artificial pathways in living systems and organisms bearing synthetic genomes.

The depletion of ATP in cells of the bacterium Escherichia coli causes them to transition to a filamentous state and form dense lipid structures known as endoliposomes. The structures can be clearly observed in these transmission electron micrographs of increasing magnification. (Credit: Image courtesy of Arizona State University)

In a new twist, John Chaput, a researcher at Arizona State University's Biodesign Institute, and colleagues at the Department of Pharmacology, Midwestern University, Glendale, AZ, have fabricated an artificial protein in the laboratory and examined the surprising ways living cells respond to it.

"If you take a protein that was created in a test tube and put it inside a cell, does it still function?" Chaput asks. "Does the cell recognize it? Does the cell just chew it up and spit it out?" This unexplored area represents a new domain for synthetic biology and may ultimately lead to the development of novel therapeutic agents.

The research results, reported in the advanced online edition of the journal ACS Chemical Biology, describe a peculiar set of adaptations exhibited by Escherichia coli bacterial cells exposed to a synthetic protein, dubbed DX. Inside the cell, DX proteins bind with molecules of ATP, the energy source required by all biological entities.

"ATP is the energy currency of life," Chaput says. The phosphoanhydride bonds of ATP contain the energy necessary to drive reactions in living systems, giving up their stored energy when these bonds are chemically cleaved. The depletion of available intracellular ATP by DX binding disrupts normal metabolic activity in the cells, preventing them from dividing (though they continue to grow).

After exposure to DX, the normally rod-shaped E. coli bacteria develop into elongated filaments. Within the filamentous bacteria, dense intracellular lipid structures act to partition the cell at regular intervals along its length. These unusual structures, which the authors call endoliposomes, are an unprecedented phenomenon in such cells.

"Somewhere along the line of this filamentation, other processes begin to happen that we haven't fully understood at the genetic level, but we can see the results phenotypically," Chaput says. "These dense lipid structures are forming at very regular regions along the filamented cell and it looks like it could be a defense mechanism, allowing the cell to compartmentalize itself." This peculiar adaptation has never been observed in bacterial cells and appears unique for a single-celled organism.

Producing a synthetic protein like DX, which can mimic the elaborate folding characteristics of naturally occurring proteins and bind a key metabolite like ATP, is no easy task. As Chaput explains, a clever strategy known as mRNA display was used to produce, fine-tune and amplify synthetic proteins capable of binding ATP with high affinity and specificity, much as a naturally occurring ATP-binding protein would.

First, large libraries of random-sequence DNA strands are formed from the four nucleotides that make up DNA, with each strand measuring around 80 nucleotides in length. These sequences are then transcribed into RNA with the help of an enzyme, RNA polymerase. If a natural ribosome is then introduced, it attaches to the strand and reads the random-sequence RNA as though it were a naturally occurring RNA, generating a synthetic protein as it migrates along the strand. In this way, synthetic proteins based on random RNA sequences can be generated.

ATP-binding proteins can then be selected by exposing the batch of synthetic proteins to the target molecule and extracting those that bind. But as Chaput explains, there's a problem: "The big question is how do you recover that genetic information? You can't reverse transcribe a protein back into DNA. You can't PCR amplify a protein. So we have to do all these molecular biology tricks."

The main trick involves an earlier step in the process. A molecular linker is chemically attached to the RNA templates, such that each RNA strand forms a bond with its newly translated protein. The mRNA-protein hybrids are exposed to selection targets (like ATP) over consecutive rounds of increasing stringency. After each round of selection, those library members that remain bound to the target are reverse-transcribed into cDNA (using their conveniently attached RNA messages), and then PCR amplified.
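The selection cycle described above can be sketched in a few lines of code. The affinities and stringency thresholds are made-up numbers for illustration, and list duplication stands in for the reverse transcription and PCR amplification the real protocol uses.

```python
import random

# Illustrative sketch of iterative mRNA-display selection.
# Numbers are invented; this is not the published protocol.

random.seed(0)

# Each library member is an (mRNA sequence, binding affinity) pair; the
# chemical linker in the real protocol keeps sequence and protein joined,
# which is what lets the genetic information be recovered after selection.
library = [("seq%04d" % i, random.random()) for i in range(10_000)]

def selection_round(pool, stringency):
    """Keep members whose affinity exceeds the stringency threshold,
    then 'amplify' the survivors (reverse transcription plus PCR in the
    lab; here we simply duplicate each surviving member)."""
    survivors = [m for m in pool if m[1] > stringency]
    return survivors * 2  # stand-in for PCR amplification

pool = library
for stringency in (0.5, 0.8, 0.95, 0.99):  # rounds of increasing stringency
    pool = selection_round(pool, stringency)

# After several rounds, only the highest-affinity binders dominate the pool.
print(len(pool), min(m[1] for m in pool))
```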

In the current study, E. coli cells exposed to DX transitioned into a filamentous form, which can occur naturally when such cells are subject to conditions of stress. The cells display low metabolic activity and limited cell division, presumably owing to their ATP-starved condition.

The study also examined the ability of E. coli to recover following DX exposure. The cells were found to enter a quiescent state known as viable but non-culturable (VBNC), meaning that they survived ATP sequestration and returned to their non-filamentous state after 48 hours, but lost their reproductive capacity. Further, this condition was difficult to reverse and seems to involve a fundamental reprogramming of the cell.

In an additional response to DX, the filamentous cells form previously undocumented structures, which the authors refer to as endoliposomes. These dense lipid concentrations, spanning the full width of the filamented E. coli, segment the cells into distinct compartments, giving the cells a stringbean-like appearance under the microscope.

The authors speculate that this adaptation may be an effort to maintain homeostasis in regions of the filamentous cell, which have essentially been walled off from the intrusion of ATP-depleting DX. They liken endoliposomes to the series of water-tight compartments found in submarines which are used to isolate damaged sections of the ship and speculate that DX-exposed cells are partitioning their genetic information into regions where it can be safely quarantined. Such self-compartmentalization is known to occur in some eukaryotic cells, but has not been previously observed in prokaryotes like E. coli.

The research indicates that there is still a great deal to learn about bacterial behavior and the repertoire of responses available when such cells encounter novel situations, such as an unfamiliar, synthetic protein. The study also notes that many infectious agents rely on a dormant state (similar to the VBNC condition observed in the DX-exposed E. coli) to evade antibiotics. A better understanding of the mechanisms driving this behavior could provide a new approach to targeting such pathogens.

The relative safety of E. coli as a model organism for study may provide a fruitful tool for more in-depth investigation of VBNC states in pathogenic organisms. Further, given ATP's central importance for living organisms, its suppression may provide another avenue for combating disease. One example would be an engineered bacteriophage capable of delivering DX genes to pathogenic organisms.

Human Evolution Driven By Changing Environment


A series of rapid environmental changes in East Africa roughly 2 million years ago may be responsible for driving human evolution, according to researchers at Penn State and Rutgers University.

The researchers examined lake sediments from Olduvai Gorge in northern Tanzania, looking for biomarkers -- fossil molecules -- from ancient trees and grasses. (Credit: Gail Ashley)

"The landscape early humans were inhabiting transitioned rapidly back and forth between a closed woodland and an open grassland about five to six times during a period of 200,000 years," said Clayton Magill, graduate student in geosciences at Penn State. "These changes happened very abruptly, with each transition occurring over hundreds to just a few thousand years."

According to Katherine Freeman, professor of geosciences, Penn State, the current leading hypothesis suggests that evolutionary changes among humans during the period the team investigated were related to a long, steady environmental change or even one big change in climate.

"There is a view this time in Africa was the 'Great Drying,' when the environment slowly dried out over 3 million years," she said. "But our data show that it was not a grand progression towards dry; the environment was highly variable."

According to Magill, many anthropologists believe that variability of experience can trigger cognitive development.

"Early humans went from having trees available to having only grasses available in just 10 to 100 generations, and their diets would have had to change in response," he said. "Changes in food availability, food type, or the way you get food can trigger evolutionary mechanisms to deal with those changes. The result can be increased brain size and cognition, changes in locomotion and even social changes -- how you interact with others in a group. Our data are consistent with these hypotheses. We show that the environment changed dramatically over a short time, and this variability coincides with an important period in our human evolution when the genus Homo was first established and when there was first evidence of tool use."

The researchers -- including Gail Ashley, professor of earth and planetary sciences, Rutgers University -- examined lake sediments from Olduvai Gorge in northern Tanzania. From the sediments, they extracted the organic matter that had washed or blown into the lake 2 million years ago from the surrounding vegetation, microbes and other organisms. In particular, they looked at biomarkers -- fossil molecules from ancient organisms -- from the waxy coating on plant leaves.

"We looked at leaf waxes because they're tough, they survive well in the sediment," said Freeman.

The team used gas chromatography and mass spectrometry to determine the relative abundances of different leaf waxes and the abundance of carbon isotopes for different leaf waxes. The data enabled them to reconstruct the types of vegetation present in the Olduvai Gorge area at very specific time intervals.
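As an illustration of how carbon-isotope abundances can map to vegetation types, here is a standard two-endmember mixing model. This is a common approach in the field, assumed here for illustration; the endmember values are textbook approximations, not numbers from this study.

```python
# Two-endmember carbon-isotope mixing model: trees and shrubs (C3
# photosynthesis) and tropical grasses (C4) leave distinct d13C
# signatures in leaf waxes, so a measured value can be read as a
# mixture of the two. Endmember values below are textbook
# approximations, not data from the Olduvai Gorge study.

DELTA_C3 = -27.0   # typical d13C of C3 (woody) plants, per mil (assumed)
DELTA_C4 = -12.0   # typical d13C of C4 (grass) plants, per mil (assumed)

def c4_fraction(delta_sample):
    """Fraction of C4 (grass) vegetation implied by a measured d13C."""
    f = (delta_sample - DELTA_C3) / (DELTA_C4 - DELTA_C3)
    return min(max(f, 0.0), 1.0)  # clamp to the physical range [0, 1]

# A sediment layer with leaf-wax d13C of -18 per mil would imply a
# landscape that is mostly open grassland:
print(round(c4_fraction(-18.0), 2))  # -> 0.6
```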

The results showed that the environment transitioned rapidly back and forth between a closed woodland and an open grassland.

To find out what caused this rapid transitioning, the researchers used statistical and mathematical models to correlate the changes they saw in the environment with other things that may have been happening at the time, including changes in the Earth's movement and changes in sea-surface temperatures.

"The orbit of the Earth around the sun slowly changes with time," said Freeman. "These changes were tied to the local climate at Olduvai Gorge through changes in the monsoon system in Africa. Slight changes in the amount of sunshine changed the intensity of atmospheric circulation and the supply of water. The rain patterns that drive the plant patterns follow this monsoon circulation. We found a correlation between changes in the environment and planetary movement."

The team also found a correlation between changes in the environment and sea-surface temperature in the tropics.

"We find complementary forcing mechanisms: one is the way Earth orbits, and the other is variation in ocean temperatures surrounding Africa," Freeman said. The researchers recently published their results in the Proceedings of the National Academy of Sciences along with another paper in the same issue that builds on these findings. The second paper shows that rainfall was greater when there were trees around and less when there was a grassland.

"The research points to the importance of water in an arid landscape like Africa," said Magill. "The plants are so intimately tied to the water that if you have water shortages, they usually lead to food insecurity.

"Together, these two papers shine light on human evolution because we now have an adaptive perspective. We understand, at least to a first approximation, what kinds of conditions were prevalent in that area and we show that changes in food and water were linked to major evolutionary changes."

The National Science Foundation funded this research.

Sunday, December 23, 2012

Sound Beam Could One Day Be Invisible Scalpel


A carbon-nanotube-coated lens that converts light to sound can focus high-pressure sound waves to finer points than ever before. The University of Michigan engineering researchers who developed the new therapeutic ultrasound approach say it could lead to an invisible knife for noninvasive surgery.

With a new technique that uses tightly-focused sound waves for micro-surgery, University of Michigan engineering researchers drilled a 150-micrometer hole in a confetti-sized artificial kidney stone. (Credit: Hyoung Won Baac)

Today's ultrasound technology enables far more than glimpses into the womb. Doctors routinely use focused sound waves to blast apart kidney stones and prostate tumors, for example. The tools work primarily by focusing sound waves tightly enough to generate heat, says Jay Guo, a professor of electrical engineering and computer science, mechanical engineering, and macromolecular science and engineering. Guo is a co-author of a paper on the new technique published in the current issue of Nature's journal Scientific Reports.

The beams that today's technology produces can be unwieldy, says Hyoung Won Baac, a research fellow at Harvard Medical School who worked on this project as a doctoral student in Guo's lab.

"A major drawback of current strongly focused ultrasound technology is a bulky focal spot, which is on the order of several millimeters," Baac said. "A few centimeters is typical. Therefore, it can be difficult to treat tissue objects in a high-precision manner, for targeting delicate vasculature, thin tissue layer and cellular texture. We can enhance the focal accuracy 100-fold."

The team was able to concentrate high-amplitude sound waves to a speck just 75 by 400 micrometers (a micrometer is one-thousandth of a millimeter). Their beam can blast and cut with pressure, rather than heat. Guo speculates that it might be able to operate painlessly because its beam is so finely focused it could avoid nerve fibers. The device hasn't been tested in animals or humans yet, though.

"We believe this could be used as an invisible knife for noninvasive surgery," Guo said. "Nothing pokes into your body, just the ultrasound beam. And it is so tightly focused, you can disrupt individual cells."

To achieve this superfine beam, Guo's team took an optoacoustic approach that converts light from a pulsed laser to high-amplitude sound waves through a specially designed lens. The general technique has been around since Thomas Edison's time. It has advanced over the years, but for medical applications today the process doesn't normally generate a sound signal strong enough to be useful.

The U-M researchers' system is unique because it performs three functions: it converts the light to sound, focuses it to a tiny spot and amplifies the sound waves. To achieve the amplification, the researchers coated their lens with a layer of carbon nanotubes and a layer of a rubbery material called polydimethylsiloxane. The carbon nanotube layer absorbs the light and generates heat from it. Then the rubbery layer, which expands when exposed to heat, drastically boosts the signal by the rapid thermal expansion.

The resulting sound waves are 10,000 times higher frequency than humans can hear. They work in tissues by creating shockwaves and microbubbles that exert pressure toward the target, which Guo envisions could be tiny cancerous tumors, artery-clogging plaques or single cells to deliver drugs. The technique might also have applications in cosmetic surgery.

In experiments, the researchers demonstrated micro ultrasonic surgery, accurately detaching a single ovarian cancer cell and blasting a hole less than 150 micrometers in an artificial kidney stone in less than a minute.

"This is just the beginning," Guo said. "This work opens a way to probe cells or tissues in much smaller scale."

The researchers will present the work at the SPIE Photonics West meeting in San Francisco. The research was funded by the National Science Foundation and the National Institutes of Health.

Friday, December 21, 2012

Maya Scholar Debunks World-Ending Myth


As we hurtle toward the end of 2012, the conversation about a certain date with roots in an ancient Maya calendar has reached a fever pitch.

David Stuart discusses the new inscriptions with colleagues from Tulane University and Universidad del Valle de Guatemala. Seated left to right: Marcello Canuto (Tulane), Stuart, Tomás Barrientos (UVG), Jocelyn Ponce (UVG). (Credit: Image courtesy of University of Texas at Austin)

Dec. 21, 2012, has taken over popular culture this year: It's been the subject of movies, books and news shows. The date and its supposed prophecy that the world will come to an end have been the subject of water cooler conversations and international media attention.

But the truth regarding the date, according to renowned Maya scholar and University of Texas at Austin art history professor David Stuart, is that the day is indeed meaningful -- but not in the way you might think.

"The Maya never actually predicted the end of times," says Stuart, who recently won a UNESCO medal for his lifetime contributions to the study of ancient Maya culture and archaeological sites, including UNESCO World Heritage Sites. "In the Maya scheme of time, the approaching date was thought to be the turn of an important cycle, or as they put it, the end of 13 bak'tuns. The thing is, there are many more bak'tuns still to come."

Earlier this year, Stuart was working with colleagues at the ruins of La Corona in the Guatemalan jungle, where they excavated many inscribed stones that had been part of a staircase. As the world's leading epigrapher of Maya script, Stuart was brought in to decipher the 56 glyphs carved into the stones. He discovered 200 years of political history and, to his surprise, the second known reference in Maya culture to the so-called end date of Dec. 21, 2012.

But despite the popular misconception, the date doesn't predict the end of times. Rather, it was intended to promote continuity during a time of crisis.

"The hieroglyphs emphasized seventh century history and politics, linking the reign of an ancient king to the turn of the 13th bak'tun many centuries later," Stuart explains. "The point was to associate the divine king's time on the throne to time on a cosmic scale.

"The monument commemorated a royal visit to La Corona in AD 696 by the most powerful Maya ruler of that time, a few months after his defeat by a longstanding rival in AD 695," said Stuart. "This ruler was visiting allies and allaying their fears after his defeat. It was a time of great political turmoil in the Maya region, and this king felt compelled to allude to a larger cycle of time that happens to end in 2012."

Rather than prophesy, the 2012 reference served to place this king's troubled reign and accomplishments into a larger cosmological framework. In times of crisis, the ancient Maya used their calendar to promote continuity and stability.

Assuming 21st century soothsayers are incorrect about the impending end of the world, Stuart's research will continue in 2013, starting in January with the Maya Meetings, an international conference held, alternately, in Austin and Antigua, Guatemala, each year. Stuart has served as director of the event since 2004, and this year it is a family affair. Stuart's father, George E. Stuart, will be the keynote speaker at this year's meeting, which will be in Austin.

The elder Stuart was hired as a cartographer for the National Geographic Society and remained on staff for nearly 40 years working in a variety of capacities, including as editor for archaeology of National Geographic Magazine and chairman of the Committee for Research and Exploration. He founded the Center for Maya Research in 1984.

Scientists Create Nanoscale Window to Biological World


If the key to winning battles is knowing both your enemy and yourself, then scientists are now well on their way toward becoming the Sun Tzus of medicine by taking a giant step toward a priceless advantage -- the ability to see the soldiers in action on the battlefield.

A novel microfluidics platform allowed viewing of structural details of rotavirus double-layered particles; the 3-D graphic of the virus, in purple, was reconstructed from data gathered by the new technique. (Credit: Virginia Tech)

Investigators at the Virginia Tech Carilion Research Institute have invented a way to directly image biological structures at their most fundamental level and in their natural habitats. The technique is a major advancement toward the ultimate goal of imaging biological processes in action at the atomic level.

"It's sort of like the difference between seeing Han Solo frozen in carbonite and watching him walk around blasting stormtroopers," said Deborah Kelly, an assistant professor at the VTC Research Institute and a lead author on the paper describing the first successful test of the new technique. "Seeing viruses, for example, in action in their natural environment is invaluable."

The technique involves taking two silicon-nitride microchips with windows etched in their centers and pressing them together until only a 150-nanometer space between them remains. The researchers then fill this pocket with a liquid resembling the natural environment of the biological structure to be imaged, creating a microfluidic chamber.

Then, because free-floating structures yield images with poor resolution, the researchers coat the microchip's interior surface with a layer of natural biological tethers, such as antibodies, which naturally grab onto a virus and hold it in place.

In a recent study in Lab on a Chip, Kelly joined Sarah McDonald, also an assistant professor at the VTC Research Institute, to prove that the technique works. McDonald provided a pure sample of rotavirus double-layered particles for the study.

"What's missing in the field of structural biology right now is dynamics -- how things move in time," said McDonald. "Debbie is developing technologies to bridge that gap, because that's clearly the next big breakthrough that structural biology needs."

Rotavirus is the most common cause of severe diarrhea among infants and children. By the age of 5, nearly every child in the world has been infected at least once. And although the disease tends to be easily managed in the developed world, in developing countries rotavirus kills more than 450,000 children a year.

At the second step in the pathogen's life cycle, rotavirus sheds its outer layer, which allows it to enter a cell, and becomes what is called a double-layered particle. Once its second layer is exposed, the virus is ready to begin using the cell's own infrastructure to produce more viruses. It was the viral structure at this stage that the researchers imaged in the new study.

Kelly and McDonald coated the interior window of the microchip with antibodies to the virus. The antibodies, in turn, latched onto the rotaviruses that were injected into the microfluidic chamber and held them in place. The researchers then used a transmission electron microscope to image the prepared slide.

The technique worked perfectly.

The experiment gave results that resembled those achieved using traditional freezing methods to prepare rotavirus for electron microscopy, proving that the new technique can deliver accurate results.

"It's the first time scientists have imaged anything on this scale in liquid," said Kelly.

The next step is to continue to develop the technique with an eye toward imaging biological structures dynamically in action. Specifically, McDonald is looking to understand how rotavirus assembles, so as to better know and develop tools to combat this particular enemy of children's health.

The researchers said their ongoing collaboration is an example of the cross-disciplinary work that is becoming a hallmark of the VTC Research Institute.

"It's an ideal collaboration because Sarah provides a phenomenal model system by which we can develop new technologies to move the field of microstructural biology forward," said Kelly.

"It's very win-win," McDonald added. "While the virus is a great tool for Debbie to develop her techniques, her technology is critical for allowing me to understand how this deadly virus assembles and changes dynamically over time."

The paper "Visualizing viral assemblies in a nanoscale biosphere" was published online and will appear in a 2013 edition of Lab on a Chip.

The authors are Brian Gilmore, a research associate at the VTC Research Institute; Shannon Showalter, a research assistant at the VTC Research Institute; Madeline Dukes, an applications scientist at Protochips; Justin Tanner, a postdoctoral associate at the VTC Research Institute; Andrew Demmert, a student at the Virginia Tech Carilion School of Medicine; McDonald, who, in addition to her position at the VTC Research Institute, is an assistant professor of biomedical sciences and pathobiology in the Virginia-Maryland Regional College of Veterinary Medicine; and Kelly, who, in addition to her position at the VTC Research Institute, is an assistant professor of biological sciences in Virginia Tech's College of Science.

Monday, October 15, 2012

Highest Freefall From Edge Of Space


Felix Baumgartner Successfully Lands After Highest Freefall from Edge of Space

Austria's Felix Baumgartner earned his place in the history books on Sunday (Oct. 14, 2012) after overcoming a problem with the power supply for his visor heater that impaired his vision and nearly jeopardized the mission. Baumgartner reached an estimated speed of 1,342.8 km/h (Mach 1.24) jumping from the stratosphere, which, when certified, will make him the first man to break the speed of sound in freefall and set several other records while delivering valuable data for future space exploration.
Screens at mission control show pilot Felix Baumgartner of Austria during the final manned flight for Red Bull Stratos in Roswell, New Mexico, USA, on October 14, 2012. (Credit: Jörg Mitter/Red Bull Content Pool)
After flying to an altitude of 39,045 meters (128,100 feet) in a helium-filled balloon, Felix Baumgartner completed a record-breaking jump for the ages from the edge of space on Sunday morning, exactly 65 years after Chuck Yeager first broke the sound barrier flying in an experimental rocket-powered airplane. The 43-year-old Austrian skydiving expert also broke two other world records (highest freefall, highest manned balloon flight), leaving the one for the longest freefall to project mentor Col. Joe Kittinger.

Baumgartner landed safely with his parachute in the desert of New Mexico after jumping out of his space capsule at 39,045 meters and plunging back towards Earth, reaching a maximum speed of 1,342.8 km/h through the near vacuum of the stratosphere before being slowed by the atmosphere later during his freefall, which lasted 4 minutes and 20 seconds. Countless millions of people around the world watched his ascent and jump live on television broadcasts and live streams on the Internet. At one point during his freefall Baumgartner appeared to spin rapidly, but he quickly regained control, and moments later he opened his parachute as members of the ground crew cheered and viewers around the world heaved a sigh of relief.
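As a back-of-the-envelope cross-check (not part of the mission data), the reported top speed and Mach number together imply a local speed of sound of about 301 m/s, which under the ideal-gas relation for dry air corresponds to an air temperature near -48 °C, consistent with the cold stratosphere:

```python
v_kmh = 1342.8             # reported top speed, km/h
mach = 1.24                # reported Mach number
GAMMA, R = 1.4, 287.05     # heat-capacity ratio and specific gas constant of dry air

v = v_kmh / 3.6            # top speed in m/s (~373 m/s)
a = v / mach               # implied local speed of sound, m/s (~301 m/s)
T = a ** 2 / (GAMMA * R)   # temperature from a = sqrt(GAMMA * R * T), in kelvin

print(f"speed {v:.0f} m/s, sound {a:.0f} m/s, air temp {T - 273.15:.0f} deg C")
```

This is why a skydiver can go "supersonic" at a seemingly modest 373 m/s: the speed of sound drops with temperature, and the stratosphere is far colder than sea-level air.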

"It was an incredible up and down today, just like it's been with the whole project," a relieved Baumgartner said. "First we got off with a beautiful launch and then we had a bit of drama with a power supply issue to my visor. The exit was perfect but then I started spinning slowly. I thought I'd just spin a few times and that would be that, but then I started to speed up. It was really brutal at times. I thought for a few seconds that I'd lose consciousness. I didn't feel a sonic boom because I was so busy just trying to stabilize myself. We'll have to wait and see if we really broke the sound barrier. It was really a lot harder than I thought it was going to be."

Baumgartner and his team spent five years training and preparing for the mission that is designed to improve our scientific understanding of how the body copes with the extreme conditions at the edge of space.

Baumgartner had endured several weather-related delays before finally lifting off under bright blue skies and calm winds on Sunday morning. The Red Bull Stratos crew watching from Mission Control broke out into spontaneous applause when the balloon lifted off.

* The data on the records set by the jump are preliminary pending confirmation from the authorized governing bodies.

Sunday, October 14, 2012

Complex Logic Circuit from Bacterial Genes


By force of habit we tend to assume computers are made of silicon, but there is actually no necessary connection between the machine and the material. All that an engineer needs to do to make a computer is to find a way to build logic gates -- the elementary building blocks of digital computers -- in whatever material is handy.
Just as electronic circuits are made from resistors, capacitors and transistors, biological circuits can be made from genes and regulatory proteins. Engineer Tae Seok Moon’s dream is to design modular “genetic parts” that can be used to build logic controllers inside microbes that will program them to make fuel, clean up pollutants, or kill infectious bacteria or cancerous cells. (Credit: © madarakis / Fotolia)
So logic gates could theoretically be made of pipes of water, channels for billiard balls or even mazes for soldier crabs.

By comparison, Tae Seok Moon's ambition, which is to build logic gates out of genes, seems eminently practical. As a postdoctoral fellow in the lab of Christopher Voigt, PhD, a synthetic biologist at the Massachusetts Institute of Technology, he recently made the largest genetic circuit yet reported.

Moon, PhD, now an assistant professor of energy, environmental and chemical engineering in the School of Engineering & Applied Science at Washington University in St. Louis, is the lead author of an article describing the project in the Oct. 7 issue of Nature. Voigt is the senior author.

The tiny circuits constructed from these gene gates and others like them may one day be components of engineered cells that will monitor and respond to their environments.

The number of tasks they could undertake is limited only by evolution and human ingenuity. Janitor bacteria might clean up pollutants, chemical-engineer bacteria might pump out biofuels, and miniature infection-control bacteria might bustle about killing pathogens.

How to make an AND gate out of genes

The basis of modern computers is the logic gate, a device that makes simple comparisons between the bits, the 1s and 0s, in which computers encode information. Each logic gate has multiple inputs and one output. The output of the gate depends on the inputs and the operation the gate performs.

An AND gate, for example, turns on only if all of its inputs are on. An OR gate turns on if any of its inputs are on.

Suggestively, genes are turned on or off when a transcription factor binds to a region of DNA adjacent to the gene called a promoter.

To make an AND gate out of genes, however, Moon had to find a gene whose activation is controlled by at least two molecules, not one. So only if both molecule 1 AND molecule 2 are present will the gene be turned on and translated into protein.

Such a genetic circuit had been identified in Salmonella typhimurium, the bacterium that causes food poisoning. In this circuit, the transcription factor can bind to the promoter of a gene only if a molecule called a chaperone is present. This meant the genetic circuit could form the basis of a two-input AND gate.

The circuit Moon eventually built consisted of four sensors for four different molecules that fed into three two-input AND gates. If all four molecules were present, all three AND gates turned on and the last one produced a reporter protein that fluoresced red, so that the operation of the circuit could be easily monitored.
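The logic of that layered circuit can be sketched in a few lines of code. This is purely an illustration of the gate wiring described above (four sensor inputs feeding three two-input AND gates), not a biochemical simulation:

```python
from itertools import product

def and_gate(a: bool, b: bool) -> bool:
    """A two-input AND gate: output is on only if both inputs are on."""
    return a and b

def circuit(m1: bool, m2: bool, m3: bool, m4: bool) -> bool:
    """Three two-input AND gates wired in two layers, as in Moon's circuit."""
    layer1 = and_gate(m1, m2)        # first-layer gate
    layer2 = and_gate(m3, m4)        # second first-layer gate
    return and_gate(layer1, layer2)  # final gate drives the red reporter

# Enumerate all 16 input combinations; only one turns the reporter on.
on_states = [inputs for inputs in product([False, True], repeat=4)
             if circuit(*inputs)]
print(on_states)  # [(True, True, True, True)]
```

Of the sixteen possible input combinations, the reporter fluoresces only in the single case where all four molecules are present, which is exactly the behavior the fluorescent readout made easy to monitor.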

In the future, Moon says, a synthetic bacterium with this circuit might sense four different cancer indicators and, in the presence of all four, release a tumor-killing factor.

Crosstalk and timing faults

There are huge differences, of course, between the floppy molecules that embody biological logic gates and the diodes and transistors that embody electronic ones.

Engineers designing biological circuits worry a great deal about crosstalk, or interference. If a circuit is to work properly, the molecules that make up one gate cannot bind to molecules that are part of another gate.

This is much more of a problem in a biological circuit than in an electronic circuit because the interior of a cell is a kind of soup where molecules mingle freely.

To ensure that there wouldn't be crosstalk among his AND gates, Moon mined parts for his gates from three different species of bacteria: Shigella flexneri and Pseudomonas aeruginosa, as well as Salmonella.

Although the parts from the three species were already quite dissimilar, he made them even more so by subjecting them to error-prone copying cycles and screening the copies for ones that were even less prone to crosstalk (but still functional).

Another problem Moon faced is that biological circuits, unlike electronic ones, don't have internal clocks that keep the bits moving through the logic gates in lockstep. If signals progress through layers of gates at different speeds, the output of the entire circuit may be wrong, a problem called a timing fault.

Experiments designed to detect such faults in the synthetic circuit showed that they didn't occur, probably because the chaperones for one layer of logic gates degrade before the transcription factors for the next layer are generated, and this forces a kind of rhythm on the circuit.

Hijacking a bacterium's controller

"We're not trying to build a computer out of biological logic gates," Moon says. "You can't build a computer this way. Instead we're trying to make controllers that will allow us to access all the things biological organisms do in simple, programmable ways."

"I see the cell as a system that consists of a sensor, a controller (the logic circuit), and an actuator," he says. "This paper covers work on the controller, but eventually the controller's output will drive an actuator, something that will do work on the cell's surroundings. "

A synthetic bacterium designed by a friend of Moon's at Nanyang Technological University in Singapore senses signaling molecules released by the pathogen Pseudomonas aeruginosa. When the molecules reach a high enough concentration, the bacterium generates a toxin and a protein that causes it to burst, releasing the toxin and killing nearby P. aeruginosa.

"Silicon cannot do that," Moon says.

'Invisibility': Key to Better Electronics?


Visual 'Cloaking' Technology Enables More Efficient Transfer of Electrons

 A new approach that allows objects to become "invisible" has now been applied to an entirely different area: letting particles "hide" from passing electrons, which could lead to more efficient thermoelectric devices and new kinds of electronics.
Diagram shows the 'probability flux' of electrons, a representation of the paths of electrons as they pass through an 'invisible' nanoparticle. While the paths are bent as they enter the particle, they are subsequently bent back so that they re-emerge from the other side on the same trajectory they started with -- just as if the particle wasn't there. (Credit: Image courtesy Bolin Liao et al.)

The concept -- developed by MIT graduate student Bolin Liao, former postdoc Mona Zebarjadi (now an assistant professor at Rutgers University), research scientist Keivan Esfarjani, and mechanical engineering professor Gang Chen -- is described in a paper in the journal Physical Review Letters.

Normally, electrons travel through a material in a way that is similar to the motion of electromagnetic waves, including light; their behavior can be described by wave equations. That led the MIT researchers to the idea of harnessing the cloaking mechanisms developed to shield objects from view -- but applying it to the movement of electrons, which is key to electronic and thermoelectric devices.

Previous work on cloaking objects from view has relied on so-called metamaterials made of artificial materials with unusual properties. The composite structures used for cloaking cause light beams to bend around an object and then meet on the other side, resuming their original path -- making the object appear invisible.

"We were inspired by this idea," says Chen, the Carl Richard Soderberg Professor of Power Engineering at MIT, who decided to study how it might apply to electrons instead of light. But in the new electron-cloaking material developed by Chen and his colleagues, the process is slightly different.

The MIT researchers modeled nanoparticles with a core of one material and a shell of another. But in this case, rather than bending around the object, the electrons do actually pass through the particles: Their paths are bent first one way, then back again, so they return to the same trajectory they began with.

In computer simulations, the concept appears to work, Liao says. Now, the team will try to build actual devices to see whether they perform as expected. "This was a first step, a theoretical proposal," Liao says. "We want to carry on further research on how to make some real devices out of this strategy."

While the initial concept was developed using particles embedded in a normal semiconductor substrate, the MIT researchers would like to see if the results can be replicated with other materials, such as two-dimensional sheets of graphene, which might offer interesting additional properties.

The MIT researchers' initial impetus was to optimize the materials used in thermoelectric devices, which produce an electrical current from a temperature gradient. Such devices require a combination of characteristics that are hard to obtain: high electrical conductivity (so the generated current can flow freely), but low thermal conductivity (to maintain a temperature gradient). The trouble is that the two types of conductivity tend to go hand in hand, so few materials offer this contradictory combination. The team's simulations show this electron-cloaking material could meet these requirements unusually well.

The simulations used particles a few nanometers in size, matching the wavelength of flowing electrons and improving the flow of electrons at particular energy levels by orders of magnitude compared to traditional doping strategies. This might lead to more efficient filters or sensors, the researchers say. As the components on computer chips get smaller, Chen says, "we have to come up with strategies to control electron transport," and this might be one useful approach.
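The "few nanometers" figure can be sanity-checked with the de Broglie relation for an electron's wavelength. The 0.1 eV energy used below is an assumed, illustrative value (a typical scale for conduction electrons), not a number from the paper:

```python
import math

# de Broglie wavelength of an electron: lambda = h / sqrt(2 * m_e * E)
h = 6.626e-34          # Planck constant, J s
m_e = 9.109e-31        # electron mass, kg
E = 0.1 * 1.602e-19    # assumed electron energy of 0.1 eV, in joules

wavelength = h / math.sqrt(2 * m_e * E)
print(f"{wavelength * 1e9:.1f} nm")  # a few nanometers
```

An electron at that energy has a wavelength of roughly 4 nm, which is why nanometer-scale particles are the right size to interact with, and be made "transparent" to, flowing electrons.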

The concept could also lead to a new kind of switches for electronic devices, Chen says. The switch could operate by toggling between transparent and opaque to electrons, thus turning a flow of them on and off. "We're really just at the beginning," he says. "We're not sure how far this is going to go yet, but there is some potential" for significant applications.

Xiang Zhang, a professor of mechanical engineering at the University of California at Berkeley who was not involved in this research, says "this is very exciting work" that expands the concept of cloaking to the domain of electrons. The authors, he says, "uncovered a very interesting approach that may be very useful to thermoelectric applications."

This research was funded by the U.S. Department of Energy (DOE) through MIT's Solid-State Solar-Thermal Energy Conversion center, a DOE Energy Frontier Research Center.

Saturday, October 13, 2012

Gravity Lenses: When Galaxies Eat Galaxies


Using gravitational "lenses" in space, University of Utah astronomers discovered that the centers of the biggest galaxies are growing denser -- evidence of repeated collisions and mergers by massive galaxies with 100 billion stars.
This image, taken by the Hubble Space Telescope, shows a ring of light from a distant galaxy created when a closer galaxy in the foreground – not shown in this processed image – acts as a “gravitational lens” to bend the light from the more distant galaxy into the ring of light, known as an Einstein ring. In a new study, University of Utah astronomer Adam Bolton and colleagues measured these Einstein rings to determine the mass of 79 lens galaxies that are massive elliptical galaxies, the largest kind of galaxy with 100 billion stars. The study found the centers of these big galaxies are getting denser over time, evidence of repeated collisions between massive galaxies. (Credit: Joel Brownstein, University of Utah, for NASA/ESA and the Sloan Digital Sky Survey)
"We found that during the last 6 billion years, the matter that makes up massive elliptical galaxies is getting more concentrated toward the centers of those galaxies. This is evidence that big galaxies are crashing into other big galaxies to make even bigger galaxies," says astronomer Adam Bolton, principal author of the new study.

"Most recent studies have indicated that these massive galaxies primarily grow by eating lots of smaller galaxies," he adds. "We're suggesting that major collisions between massive galaxies are just as important as those many small snacks."

The new study -- published recently in The Astrophysical Journal -- was conducted by Bolton's team from the Sloan Digital Sky Survey-III using the survey's 2.5-meter optical telescope at Apache Point, N.M., and the Earth-orbiting Hubble Space Telescope.

The telescopes were used to observe and analyze 79 "gravitational lenses," which are galaxies between Earth and more distant galaxies. A lens galaxy's gravity bends light from a more distant galaxy, creating a ring or partial ring of light around the lens galaxy.

The size of the ring was used to determine the mass of each lens galaxy, and the speed of stars was used to calculate the concentration of mass in each lens galaxy.
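The ring-size-to-mass step uses the standard Einstein-ring relation. The sketch below is illustrative: the 1-arcsecond ring radius and the gigaparsec-scale distances are assumed round numbers, not values from the study:

```python
import math

# Einstein ring relation: theta_E^2 = (4 G M / c^2) * D_ls / (D_l * D_s),
# so a measured ring radius gives the lens mass:
#   M = theta_E^2 * c^2 * D_l * D_s / (4 * G * D_ls)
G = 6.674e-11     # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8       # speed of light, m/s
M_SUN = 1.989e30  # solar mass, kg
GPC = 3.086e25    # one gigaparsec, in meters

theta_E = math.radians(1.0 / 3600)                # assumed 1-arcsecond ring radius
D_l, D_s, D_ls = 1.0 * GPC, 2.0 * GPC, 1.2 * GPC  # assumed angular-diameter distances
# (D_l: to the lens, D_s: to the source, D_ls: lens to source)

M = theta_E ** 2 * c ** 2 * D_l * D_s / (4 * G * D_ls)
print(f"lens mass ~ {M / M_SUN:.1e} solar masses")
```

With these round numbers the enclosed mass comes out on the order of 10^11 solar masses -- the right ballpark for the massive elliptical galaxies in the study.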

Bolton conducted the study with three other University of Utah astronomers -- postdoctoral researcher Joel Brownstein, graduate student Yiping Shu and undergraduate Ryan Arneson -- and with these members of the Sloan Digital Sky Survey: Christopher Kochanek, Ohio State University; David Schlegel, Lawrence Berkeley National Laboratory; Daniel Eisenstein, Harvard-Smithsonian Center for Astrophysics; David Wake, Yale University; Natalia Connolly, Hamilton College, Clinton, N.Y.; Claudia Maraston, University of Portsmouth, U.K.; and Benjamin Weaver, New York University.

Big Meals and Snacks for Massive Elliptical Galaxies

The new study deals with the biggest, most massive kind of galaxies, known as massive elliptical galaxies, which each contain about 100 billion stars. Counting unseen "dark matter," they contain the mass of 1 trillion stars like our sun.

"They are the end products of all the collisions and mergers of previous generations of galaxies," perhaps hundreds of collisions," Bolton says.

Despite recent evidence from other studies that massive elliptical galaxies grow by eating much smaller galaxies, Bolton's previous computer simulations showed that collisions between large galaxies are the only galaxy mergers that lead, over time, to increased mass density on the center of massive elliptical galaxies.

When a small galaxy merges with a larger one, the pattern is different. The smaller galaxy is ripped apart by gravity from the larger galaxy. Stars from the smaller galaxy remain near the outskirts -- not the center -- of the larger galaxy.

"But if you have two roughly comparable galaxies and they are on a collision course, each one penetrates more toward the center of the other, so more mass ends up in the center," Bolton says.

Other recent studies indicate stars are spread more widely within galaxies over time, supporting the idea that massive galaxies snack on much smaller ones.

"We're finding galaxies are getting more concentrated in their mass over time even though they are getting less concentrated in the light they emit," Bolton says.

He believes large galaxy collisions explain the growing mass concentration, while galaxies gobbling smaller galaxies explain more starlight away from galactic centers.

"Both processes are important to explain the overall picture," Bolton says. "The way the starlight evolves cannot be explained by the big collisions, so we really need both kinds of collisions, major and minor -- a few big ones and a lot of small ones."

The new study also suggests the collisions between large galaxies are "dry collisions" -- meaning the colliding galaxies lack large amounts of gas because most of the gas already has congealed to form stars -- and that the colliding galaxies hit each other "off axis" or with what Bolton calls "glancing blows" rather than head-on.

Sloan Meets Hubble: How the Study Was Conducted

The University of Utah joined the third phase of the Sloan Digital Sky Survey, known as SDSS-III, in 2008. It involves about 20 research institutions around the world. The project, which continues until 2014, is a major international effort to map the heavens as a way to search for giant planets in other solar systems, study the origin of galaxies and expansion of the universe, and probe the mysterious dark matter and dark energy that make up most of the universe.

Bolton says his new study was "almost gravy" that accompanied an SDSS-III project named BOSS, for Baryon Oscillation Spectroscopic Survey. BOSS is measuring the history of the universe's expansion with unprecedented precision. That allows scientists to study the dark energy that accelerates expansion of the universe. The universe is believed to be made of only 4 percent regular matter, 24 percent unseen "dark matter" and 72 percent yet-unexplained dark energy.

During BOSS' study of galaxies, computer analysis of light spectra emitted by galaxies revealed dozens of gravitational lenses, which were discovered because the signatures of two different galaxies are lined up.

Bolton's new study involved 79 gravitational lenses observed by two surveys:

- The Sloan Survey and the Hubble Space Telescope collected images and emitted-light color spectra from relatively nearby, older galaxies -- including 57 gravitational lenses -- 1 billion to 3 billion years back into the cosmic past.

- Another survey identified 22 lenses among more distant, younger galaxies from 4 billion to 6 billion years in the past.

The rings of light around gravitational-lens galaxies are named "Einstein rings" because Albert Einstein predicted the effect, although he wasn't the first to do so.

"The more distant galaxy sends out diverging light rays, but those that pass near the closer galaxy get bent into converging light rays that appear to us as of a ring of light around the closer galaxy," says Bolton.

The greater the amount of matter in a lens galaxy, the bigger the ring. That seems counterintuitive, but the larger mass pulls with enough gravity to make the distant galaxy's light bend so much that lines of light cross as seen by the observer, creating a bigger ring.

If there is more matter concentrated near the center of a galaxy, the faster stars will be seen moving toward or being slung away from the galactic center, Bolton says.

Alternative Theories

Bolton and colleagues acknowledge their observations might be explained by theories other than the idea that galaxies are getting denser in their centers over time:

- Gas that is collapsing to form stars can increase the concentration of mass in a galaxy. Bolton argues the stars in these galaxies are too old for that explanation to work.

- Gravity from the largest massive galaxies strips neighboring "satellite" galaxies of their outskirts, leaving more mass concentrated in the centers of the satellite galaxies. Bolton contends that this process is unlikely either to produce the concentration of mass observed in the new study or to explain how the extent of that central mass increases over time.

- The researchers merely detected the boundary in each galaxy between the star-dominated inner regions and the outer regions, which are dominated by unseen dark matter. Under this hypothesis, the appearance of growing galaxy mass concentration over time is due to a coincidence in researchers' measurement method, namely that they are measuring younger galaxies farther from their centers and measuring older galaxies closer to their centers, giving an illusion of growing mass concentration in galactic centers over time. Bolton says this measurement difference is too minor to explain the observed pattern of matter density within the lens galaxies.

Thursday, October 11, 2012

Nobel Prize in Chemistry 2012: Smart Receptors On Cell Surfaces


The Royal Swedish Academy of Sciences has decided to award the Nobel Prize in Chemistry for 2012 to Robert J. Lefkowitz (Howard Hughes Medical Institute and Duke University Medical Center) and Brian K. Kobilka (Stanford University School of Medicine) "for studies of G-protein-coupled receptors."
The seven-transmembrane α-helix structure of a G-protein-coupled receptor. (Credit: By Bensaccount at en.wikipedia [Public domain], from Wikimedia Commons)
Your body is a fine-tuned system of interactions between billions of cells. Each cell has tiny receptors that enable it to sense its environment, so it can adapt to new situations. Robert Lefkowitz and Brian Kobilka are awarded the 2012 Nobel Prize in Chemistry for groundbreaking discoveries that reveal the inner workings of an important family of such receptors: G-protein-coupled receptors.

For a long time, it remained a mystery how cells could sense their environment. Scientists knew that hormones such as adrenalin had powerful effects: increasing blood pressure and making the heart beat faster. They suspected that cell surfaces contained some kind of recipient for hormones. But what these receptors actually consisted of and how they worked remained obscure for most of the 20th century.

Lefkowitz started to use radioactivity in 1968 in order to trace cells' receptors. He attached an iodine isotope to various hormones and, thanks to the radiation, managed to unveil several receptors, among them a receptor for adrenalin: the β-adrenergic receptor. His team of researchers extracted the receptor from its hiding place in the cell wall and gained an initial understanding of how it works.

The team achieved its next big step during the 1980s. The newly recruited Kobilka accepted the challenge to isolate the gene that codes for the β-adrenergic receptor from the gigantic human genome. His creative approach allowed him to attain his goal. When the researchers analyzed the gene, they discovered that the receptor was similar to one in the eye that captures light. They realized that there is a whole family of receptors that look alike and function in the same manner.

Today this family is referred to as G-protein-coupled receptors. About a thousand genes code for such receptors, for example, for light, flavour, odour, adrenalin, histamine, dopamine and serotonin. About half of all medications achieve their effect through G-protein-coupled receptors.

The studies by Lefkowitz and Kobilka are crucial for understanding how G-protein-coupled receptors function. Furthermore, in 2011, Kobilka achieved another break-through; he and his research team captured an image of the β-adrenergic receptor at the exact moment that it is activated by a hormone and sends a signal into the cell. This image is a molecular masterpiece -- the result of decades of research.

Monday, July 23, 2012

Coursera makes top college courses free online


Daphne Koller and Andrew Ng share a vision in which anyone, no matter how destitute, can expand their minds and prospects with lessons from the world's top universities.

Coursera co-founders Andrew Ng and Daphne Koller
That dream was joined this week by a dozen vaunted academic institutions including Duke University, the Ecole Polytechnique Federale de Lausanne (EPFL) in Switzerland and the University of Edinburgh in Scotland.

The schools will add online versions of classes to Coursera.org, a website launched by Stanford University professors Koller and Ng early this year with debut offerings from Princeton, Stanford and two other US universities.

"We have a vision where students everywhere around the world, regardless of country, family circumstances or financial circle have access to top quality education whether to expand their minds or learn valuable skills," Koller said.

"Where education becomes a right, not a privilege."

Academic institutions are increasingly turning to the Internet as an educational platform. A Khan Academy website created by Massachusetts Institute of Technology (MIT) graduate Salman Khan provides thousands of video lectures.

The nonprofit behind prestigious TED gatherings recently launched a TED-Ed channel at YouTube that teams accomplished teachers with talented animators to make videos that captivate while they educate.

In May, Harvard University and MIT announced that they were teaming up to expand their online education programs -- and invited other institutions to jump on board.

Called edX, the $60 million joint venture builds on MIT's existing MITx platform that enables video lesson segments, embedded quizzes, immediate feedback, online laboratories and student-paced learning.

"Universities have come to realize that online is not a fad," Koller said. "The question is not whether to engage in this area but how to do it."

Coursera classes are free, and completion certificates are issued that people can use to win jobs or improve careers.

"If a student takes a Stanford computer class and a Princeton business class, it shows they are motivated and have skills," Koller said. "We know it has helped employees get better jobs."

Coursera is distinguishing itself with essentially virtual versions of real classes.

"A lot of what is out there is basically video with, perhaps, some static content like lecture notes," Koller said.

"We are providing an actual course exchange were people register and there is weekly homework that is graded with feedback about how they are doing."

Coursera classes launched in February with most of the courses slated to begin in the coming months, but the site has already attracted students in 190 countries, according to Koller.

Coursera uses crowd-sourcing to translate material into various languages and hopes to connect with French-speaking populations around the world with EPFL classes.

Beyond spreading knowledge around the world, Coursera is a way to inspire faculty to try new methods of teaching and to find ways that Internet Age tools can enhance on-campus courses, according to Duke provost Peter Lange.

"Our faculty is incredibly excited by the idea of trying it out and seeing if we can learn from it," Lange said.

"I love the idealism of it; the potential to reach people who might never get the chance to attend the university."

Duke designs its online courses to get students involved, complete with social networking tools for collaborating outside of classes.

"This is a great experiment in innovation and learning," Lange said.

As of Friday, Coursera boasted about 740,000 students and that number is expected to soar as word spreads and class offerings expand.

Coursera plans to keep classes free but perhaps one day make money for operations by charging for course completion certificates or matching employers with qualified workers.

"Current ethos in Silicon Valley is that if you build a website that people keep coming back to and is changing the lives of millions, you can eventually make money," Koller said.

"If and when we develop revenue, universities will share in it."

Paying the bills is not a worry at Coursera due to generous backing that includes a $3.7 million combined investment by the University of Pennsylvania and the California Institute of Technology, as well as funding from venture capital powerhouse Kleiner Perkins Caufield & Byers.

'Minority Report' software hits the real world


The software behind the film "Minority Report" -- where Tom Cruise speeds through video on a large screen using only hand gestures -- is making its way into the real world.


The interface developed by scientist John Underkoffler has been commercialized by the Los Angeles firm Oblong Industries as a way to sift through massive amounts of video and other data.

And yes, the software can be used by law enforcement and intelligence services. But no, it is not the "pre-crime" detection program illustrated in the 2002 Steven Spielberg sci-fi film.

Kwin Kramer, chief executive of Oblong, said the software can help in searching through "big data" for information. It can also create souped-up video-conference capabilities where participants share data from multiple devices like smartphones and tablets, integrated into a large video display.

"We think the future of computing is multiuser, multiscreen, multidevice," Kramer told AFP.

"This system helps with big workflow problems."

A key part of the system is the gesture interface, which the company calls the "g-speak" spatial operating environment.

That grew out of a project by Underkoffler -- then a scientist at the prestigious Massachusetts Institute of Technology -- for "Minority Report," before he became chief scientist at startup Oblong.

"We have demo versions of this kind of software which show exactly the 'Minority Report' user experience, allowing you to move back and forth in time, or to zoom in to look at details," Kramer said.

He said the same software can help businesses to "allow better collaboration, visualization and analysis of large amounts of data.

"You can have a lot of data but it's hard to make use of that," Kramer said.

"It can be on different machines and hard to access. This allows multiple people to look at that."
An employee demonstrates the use of a data glove to navigate a map on a computer screen at Los Angeles-based software company Oblong Industries' offices in Washington in June 2012.

Gestural interfaces have been developed by other firms, including Microsoft with its Kinect, but Oblong says its systems are far more sophisticated and can incorporate Kinect and more.

Some highly sensitive systems use a data glove which can be more precise than ordinary hand movements.

Oblong has contracts with firms such as Boeing, General Electric and Saudi Aramco to help in analyzing large amounts of data. It is also developing a gestural interface for onboard computers with automaker Audi.

It has raised an unspecified amount of venture capital from investors including Foundry Group, Energy Technology Ventures and Morgan Stanley Alternative Investment Partners.

Brad Feld, managing director at Foundry Group, said Oblong offers "a path to fundamentally change the way we interact with computers."

Yet the question Oblong often gets is how users can get the "Minority Report" software.

David Schwartz, the company's vice president for sales, said, "We get calls from people in the military who say, 'I want the 'Minority Report' interface.'"

He said the systems could be used for a realistic version of high-tech software interfaces on TV shows like "CSI."

"They would like to get it for free," he added.

What makes the real-life version of the software different from the one seen on film is that Oblong does not supply the analytics of the futuristic "pre-crime" division.

That does not prevent a company or law enforcement agency from using the software and adding its own analytics.

"We think law enforcement and intelligence are big data users and we think our technology is the leader," Kramer said.

He said Oblong currently has no government customers in the United States or abroad but offers itself as "a core technology provider."

Still, Oblong leverages its role in the movies to get in the door, even if the software is not quite the same.

"I think most people look at those 'Minority Report' interfaces and imagine how they could use that flexible system in their own office or designs studio," Kramer said.

"It isn't science fiction, it's real."

Saturday, July 21, 2012

Entire Genetic Sequence of Individual Human Sperm Determined


The entire genomes of 91 human sperm from one man have been sequenced by Stanford University researchers. The results provide a fascinating glimpse into naturally occurring genetic variation in one individual, and are the first to report the whole-genome sequence of a human gamete -- the only cells that become a child and through which parents pass on physical traits.

Entire Genetic Sequence of Individual Human Sperm Determined
Every sperm cell looks essentially the same, with that characteristic tadpole appearance. But inside, sperm cells carry differences within their genes -- even cells from the same man. Now, researchers provide a detailed picture of how the cell's DNA varies in a new study published in the July 20, 2012 issue of the Cell Press journal Cell. The techniques used could be helpful for understanding male reproductive disorders or, when applied to other areas of research, for characterizing normal and diseased cells in the body. (Credit: iStockphoto/Alexandr Mitiuc)

"This represents the culmination of nearly a decade of work in my lab," said Stephen Quake, PhD, the Lee Otterson Professor in the School of Engineering and professor of bioengineering and of applied physics. "We now have devices that will allow us to routinely amplify and sequence to a high degree of accuracy the entire genomes of single cells, which has far-ranging implications for the study of cancer, infertility and many other disorders."

Quake is the senior author of the research, published July 20 in Cell. Graduate student Jianbin Wang and former graduate student H. Christina Fan, PhD, now a senior scientist at ImmuMetrix, share first authorship of the paper.

Sequencing sperm cells is particularly interesting because of a natural process called recombination that ensures that a baby is a blend of DNA from all four of his or her grandparents. Until now, scientists had to rely on genetic studies of populations to estimate how frequently recombination had occurred in individual sperm and egg cells, and how much genetic mixing that entailed.

"Single-sperm sequencing will allow us to chart and understand how recombination differs between individuals at the finest scales. This is an important proof of principle that will allow us to study both fundamental dynamics of recombination in humans and whether it is involved in issues relating to male infertility," said Gilean McVean, PhD, professor of statistical genetics at the Wellcome Trust Centre for Human Genetics. McVean was not involved in the research.

The Stanford study showed that the previous, population-based estimates were, for the most part, surprisingly accurate: on average, the sperm in the sample had each undergone about 23 recombinations, or mixing events. However, individual sperm varied greatly in the degree of genetic mixing and in the number and severity of spontaneously arising genetic mutations. Two sperm were missing entire chromosomes. The study has far-reaching implications for infertility doctors and researchers.

"For the first time, we were able to generate an individual recombination map and mutation rate for each of several sperm from one person," said study co-author Barry Behr, PhD, HCLD, professor of obstetrics and gynecology and director of Stanford's in vitro fertilization laboratory. "Now we can look at a particular individual, make some calls about what they would likely contribute genetically to an embryo and perhaps even diagnose or detect potential problems."

Most cells in the human body have two copies of each of 23 chromosomes, and are known as "diploid" cells. Recombination occurs during a process called meiosis, which partitions a single copy of each chromosome into a sperm (in a man) or egg (in a woman) cell. When a sperm and an egg join, the resulting fertilized egg again has a full complement of DNA.

To ensure an orderly distribution during recombination, pairs of chromosomes are lined up in tight formation along the midsection of the cell. During this snug embrace, portions of matching chromosomes are sometimes randomly swapped. The process generates much more genetic variation in a potential offspring than would be possible if only intact chromosomes were segregated into the reproductive cells.

"The exact sites, frequency and degree of this genetic mixing process is unique for each sperm and egg cell," said Quake, "and we've never before been able to see it with this level of detail. It's very interesting that what happens in one person's body mirrors the population average."

Major problems with the recombination process can generate sperm missing portions or even whole chromosomes, making them incapable of or unlikely to fertilize an egg. But it can be difficult for fertility researchers to identify potential problems.

"Most of the techniques we currently use to assess sperm viability are fairly crude," said Quake.

To conduct the research, Wang, Quake and Behr first isolated and sequenced nearly 100 sperm cells from the study subject, a 40-year-old man. The man has healthy offspring, and the semen sample appeared normal. His whole-genome sequence (obtained from diploid cells) has been previously sequenced to a high level of accuracy.

They then compared the sequence of the sperm with that of the study subject's diploid genome. They could see, by comparing the sequences of the chromosomes in the diploid cells with those in the haploid sperm cells, where each recombination event took place. The researchers also identified 25 to 36 new single nucleotide mutations in each sperm cell that were not present in the subject's diploid genome. Such random mutations are another way to generate genetic variation, but if they occur at particular points in the genome they can have deleterious effects.
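The comparison described above can be pictured as a haplotype-matching exercise: at each position where the man's two chromosome copies carry different alleles, a sperm's allele reveals which parental haplotype it inherited, and a switch from one haplotype to the other marks a recombination event. The toy sketch below illustrates this idea; it is a simplified illustration under idealized assumptions (perfectly phased haplotypes, no sequencing errors), not the actual Stanford analysis pipeline, and the function and data are hypothetical.

```python
# Toy sketch: locate crossover points in a haploid sperm sequence by
# tracking which of the two parental haplotypes each allele matches.
# Hypothetical inputs: equal-length lists of alleles at SNP positions.

def find_crossovers(hap1, hap2, sperm):
    """Return indices where the sperm switches from matching one
    parental haplotype to matching the other (i.e. crossovers)."""
    crossovers = []
    current = None  # haplotype the sperm currently matches: 1 or 2
    for i, allele in enumerate(sperm):
        if hap1[i] == hap2[i]:
            continue  # uninformative site: both haplotypes agree here
        match = 1 if allele == hap1[i] else 2
        if current is not None and match != current:
            crossovers.append(i)  # haplotype switch => recombination
        current = match
    return crossovers

# Hypothetical example: a sperm inherits haplotype 1, then switches
# to haplotype 2 partway along the chromosome.
h1 = list("AAAAAA")
h2 = list("GGGGGG")
sperm = list("AAAGGG")
print(find_crossovers(h1, h2, sperm))  # -> [3], one crossover
```

In the real study the same logic operates over millions of heterozygous sites per chromosome, with statistical handling of amplification dropouts and sequencing errors, but the underlying signal is this pattern of haplotype switches.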

It's important to note that individual sperm cells are destroyed by the sequencing process, meaning that they couldn't go on to be used for fertilization. However, the single-cell sequencing described in the paper could potentially be used to diagnose male reproductive disorders and help infertile couples assess their options. It could also be used to learn more about how male fertility and sperm quality change with increasing age.

"This could serve as a new kind of early detection system for men who may have reproductive problems," said Behr, who also co-directs Stanford's reproductive endocrinology and infertility program. "It's also possible that we could one day use other, correlating features to harmlessly identify healthy sperm for use in IVF. In the end, the DNA is the raw material that ultimately defines a sperm's potential. If we can learn more about this process, we can better understand human fertility."

The research was supported by the National Institutes of Health, the China Scholarship Council and the Siebel Foundation.