Tuesday, August 31, 2010

Scientists Succeed in Filming Organs and Joints in Real Time Using Magnetic Resonance Imaging


"Please hold absolutely still": This instruction is crucial for patients being examined by magnetic resonance imaging (MRI). It is the only way to obtain clear images for diagnosis. Up to now, it was therefore almost impossible to image moving organs using MRI.
Real-time MRI of the heart with a measurement time of 33 milliseconds per image and 30 images per second. The spatial resolution is 1.5 millimetres in the image plane (section thickness 8 millimetres). The eight successive images show the movement of the heart muscle of a healthy subject over a period of 0.264 seconds during a single heartbeat, ranging from the systolic phase (arrow, top left: contraction of the heart muscle) to the diastolic phase (arrow, bottom right: relaxation and expansion). The bright signal in the heart chambers is the blood. (Credit: Jens Frahm)

Max Planck researchers from Göttingen have now succeeded in significantly reducing the time required for recording images -- to just one fiftieth of a second. With this breakthrough, the dynamics of organs and joints can be filmed "live" for the first time: movements of the eye and jaw as well as the bending knee and the beating heart. The new MRI method promises to add important information about diseases of the joints and the heart. In many cases MRI examinations may become easier and more comfortable for patients.

A process that required several minutes until well into the 1980s now takes only a matter of seconds: the recording of cross-sectional images of our body by magnetic resonance imaging (MRI). This was enabled by the FLASH (fast low angle shot) method developed by Göttingen scientists Jens Frahm and Axel Haase at the Max Planck Institute for Biophysical Chemistry. FLASH revolutionised MRI and was largely responsible for its establishment as one of the most important modalities in diagnostic imaging. MRI is completely painless and, moreover, extremely safe. Because the technique works with magnetic fields and radio waves, patients are not subjected to the radiation exposure associated with X-rays. At present, however, the procedure is still too slow for the examination of rapidly moving organs and joints. For example, to trace the movement of the heart, the measurements must be synchronised with the electrocardiogram (ECG) while the patient holds their breath. Afterwards, the data from different heartbeats have to be combined into a film.

Future prospect: extended diagnostics for diseases

The researchers working with Jens Frahm, Head of the non-profit "Biomedizinische NMR Forschungs GmbH," have now succeeded in further accelerating the image acquisition process. The new MRI method developed by Jens Frahm, Martin Uecker and Shuo Zhang reduces the image acquisition time to one fiftieth of a second (20 milliseconds), making it possible to obtain "live recordings" of moving joints and organs at a previously inaccessible temporal resolution and without artefacts. Filming the dynamics of the jaw during opening and closing of the mouth is just as easy as filming the movements involved in speech production or the rapid beating of the heart. "A real-time film of the heart enables us to directly monitor the pumping of the heart muscle and the resulting blood flow -- heartbeat by heartbeat and without the patient having to hold their breath," explains Frahm.

The scientists believe that the new method could help to improve the diagnosis of conditions such as coronary heart disease and myocardial insufficiency. Another application involves minimally invasive interventions which, thanks to this discovery, could be carried out in future using MRI instead of X-rays. "However, as was the case with FLASH, we must first learn how to use the real-time MRI possibilities for medical purposes," says Frahm. "New challenges therefore also arise for doctors. The technical progress will have to be 'translated' into clinical protocols that provide optimum responses to the relevant medical questions."

Less is more: acceleration through better image reconstruction

To achieve the breakthrough to MRI measurement times of only small fractions of a second, several developments had to be successfully combined. Whilst still relying on the FLASH technique, the scientists used a radial encoding of the spatial information, which renders the images insensitive to movements. Mathematics was then required to further reduce the acquisition times. "Considerably fewer data are recorded than are usually necessary for the calculation of an image. We developed a new mathematical reconstruction technique which enables us to calculate a meaningful image from data which are, in fact, incomplete," explains Frahm. In the most extreme case it is possible to calculate an image of comparable quality from just five percent of the data required for a normal image -- which corresponds to a reduction of the measurement time by a factor of 20. As a result, the Göttingen scientists have accelerated MRI by a factor of 10,000 compared with the mid-1980s.
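To see the principle at work, here is a minimal toy sketch in Python (not the authors' algorithm; the sampling pattern, the smoothness prior and all parameter values are our own illustrative choices) of recovering a signal from incomplete frequency-domain data by adding prior knowledge:

    import numpy as np

    # Toy 1D analogue of undersampled MRI reconstruction. Illustrative only:
    # the real method uses radial k-space sampling and a nonlinear inverse
    # reconstruction, not this simple smoothness prior.
    rng = np.random.default_rng(0)
    n = 200
    x_true = np.zeros(n)
    x_true[60:90] = 1.0          # a piecewise-constant "image"
    x_true[120:140] = 0.5

    F = np.fft.fft(np.eye(n)) / np.sqrt(n)   # unitary DFT ("k-space" encoding)
    # Keep only 10% of the samples, always including the zero frequency
    # so the mean of the signal stays determined.
    keep = np.r_[0, rng.choice(np.arange(1, n), n // 10 - 1, replace=False)]
    A = F[keep, :]                           # undersampled encoding operator
    y = A @ x_true                           # the measured, incomplete data

    # Regularised least squares: minimise ||A x - y||^2 + lam * ||D x||^2,
    # where D penalises jumps between neighbouring pixels -- the prior that
    # makes an underdetermined problem solvable.
    D = np.eye(n) - np.roll(np.eye(n), 1, axis=1)
    lam = 0.05
    x_rec = np.linalg.solve(A.conj().T @ A + lam * D.T @ D, A.conj().T @ y).real

    print("relative error:", np.linalg.norm(x_rec - x_true) / np.linalg.norm(x_true))

With only a tenth of the samples the plain inverse problem is underdetermined; it is the regularisation term that makes a meaningful reconstruction possible -- the same idea, in miniature, behind computing an image from five percent of the usual data.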

Although these fast MRI measurements can be easily implemented on today's MRI devices, something of a bottleneck exists when it comes to the availability of sufficiently powerful computers for image reconstruction. Physicist Martin Uecker explains: "The computational effort required is gigantic. For example, if we examine the heart for only a minute in real time, between 2000 and 3000 images arise from a data volume of two gigabytes." Uecker consequently designed the mathematical process in such a way that it is divided into steps that can be calculated in parallel. These complex calculations are carried out using fast graphical processing units that were originally developed for computer games and three-dimensional visualization. "Our computer system requires about 30 minutes at present to process one minute's worth of film," says Uecker. Therefore, it will take a while until MRI systems are equipped with computers that will enable the immediate calculation and live presentation of the images during the scan.
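As a rough sanity check on those figures (the per-frame arithmetic below is ours; the quoted values come from the article):

    # Figures quoted in the article: 20 ms per image, one minute of scanning,
    # about 2 GB of raw data, and roughly 30 minutes of computation per
    # minute of film.
    frame_time_s = 0.020
    scan_time_s = 60
    frames = scan_time_s / frame_time_s            # 3000 images per minute
    raw_bytes = 2e9
    print(f"{frames:.0f} frames, {raw_bytes / frames / 1e6:.2f} MB raw data each")
    print(f"compute: {30 * scan_time_s / 60:.0f} min per minute of film")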

In order to minimise the time their innovation will take to reach practical application, the Göttingen researchers are working in close cooperation with the company Siemens Healthcare.

Monday, August 30, 2010

New View of Tectonic Plates: Computer Modeling of Earth's Mantle Flow, Plate Motions, and Fault Zones


Computational scientists and geophysicists at the University of Texas at Austin and the California Institute of Technology (Caltech) have developed new computer algorithms that for the first time allow for the simultaneous modeling of Earth's mantle flow, large-scale tectonic plate motions, and the behavior of individual fault zones, to produce an unprecedented view of plate tectonics and the forces that drive it.
Plate boundaries, which can be seen as narrow red lines, are resolved using an adaptively refined mesh with 1 km local resolution. Shown are the Pacific and the Australian tectonic plates and the New Hebrides and Tonga microplates. (Credit: Georg Stadler, Institute for Computational Engineering & Sciences, UT Austin)

A paper describing the whole-earth model and its underlying algorithms will be published in the August 27 issue of the journal Science and also featured on the cover.

The work "illustrates the interplay between making important advances in science and pushing the envelope of computational science," says Michael Gurnis, the John E. and Hazel S. Smits Professor of Geophysics, director of the Caltech Seismological Laboratory, and a coauthor of the Science paper.

To create the new model, computational scientists at Texas's Institute for Computational Engineering and Sciences (ICES) -- a team that included Omar Ghattas, the John A. and Katherine G. Jackson Chair in Computational Geosciences and professor of geological sciences and mechanical engineering, and research associates Georg Stadler and Carsten Burstedde -- pushed the envelope of a computational technique known as Adaptive Mesh Refinement (AMR).

Partial differential equations such as those describing mantle flow are solved by subdividing the region of interest (such as the mantle) into a computational grid. Ordinarily, the resolution is kept the same throughout the grid. However, many problems feature small-scale dynamics that are found only in limited regions. "AMR methods adaptively create finer resolution only where it's needed," explains Ghattas. "This leads to huge reductions in the number of grid points, making possible simulations that were previously out of reach."
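As an illustration of the core idea (a toy sketch, not the paper's parallel octree algorithms), the following Python fragment refines a one-dimensional mesh only where an error indicator flags small-scale structure:

    # Minimal 1D adaptive-mesh-refinement sketch: recursively split any cell
    # whose local error indicator exceeds a tolerance. Illustrative only; the
    # production algorithms work on distributed octree meshes in 3D.
    def refine(cells, indicator, tol, max_level=10):
        out = []
        for a, b, level in cells:
            if level < max_level and indicator(a, b) > tol:
                mid = 0.5 * (a + b)
                out += refine([(a, mid, level + 1), (mid, b, level + 1)],
                              indicator, tol, max_level)
            else:
                out.append((a, b, level))
        return out

    # A sharp feature near x = 0.3 stands in for a plate boundary: the
    # indicator is the cell width times the local magnitude of a spiky function.
    def indicator(a, b):
        x = 0.5 * (a + b)
        return (b - a) / (0.01 + abs(x - 0.3))

    mesh = refine([(0.0, 1.0, 0)], indicator, tol=0.05)
    print(len(mesh), "cells; finest width:", min(b - a for a, b, _ in mesh))

The same refine-where-needed logic, generalised to three dimensions and distributed across many processors, is what had to be made scalable.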

"The complexity of managing adaptivity among thousands of processors, however, has meant that current AMR algorithms have not scaled well on modern petascale supercomputers," he adds. Petascale computers are capable of one million billion operations per second. To overcome this long-standing problem, the group developed new algorithms that, Burstedde says, "allows for adaptivity in a way that scales to the hundreds of thousands of processor cores of the largest supercomputers available today."

With the new algorithms, the scientists were able to simulate global mantle flow and how it manifests as plate tectonics and the motion of individual faults. According to Stadler, the AMR algorithms reduced the size of the simulations by a factor of 5,000, permitting them to fit on fewer than 10,000 processors and run overnight on the Ranger supercomputer at the National Science Foundation (NSF)-supported Texas Advanced Computing Center.

A key to the model was the incorporation of data on a multitude of scales. "Many natural processes display a multitude of phenomena on a wide range of scales, from small to large," Gurnis explains. For example, at the largest scale -- that of the whole earth -- the movement of the surface tectonic plates is a manifestation of a giant heat engine, driven by the convection of the mantle below. The boundaries between the plates, however, are composed of many hundreds to thousands of individual faults, which together constitute active fault zones. "The individual fault zones play a critical role in how the whole planet works," he says, "and if you can't simulate the fault zones, you can't simulate plate movement" -- and, in turn, you can't simulate the dynamics of the whole planet.

In the new model, the researchers were able to resolve the largest fault zones, creating a mesh with a resolution of about one kilometer near the plate boundaries. Included in the simulation were seismological data as well as data pertaining to the temperature of the rocks, their density, and their viscosity -- or how strong or weak the rocks are, which affects how easily they deform. That deformation is nonlinear -- with simple changes producing unexpected and complex effects.

"Normally, when you hit a baseball with a bat, the properties of the bat don't change -- it won't turn to Silly Putty. In the earth, the properties do change, which creates an exciting computational problem," says Gurnis. "If the system is too nonlinear, the earth becomes too mushy; if it's not nonlinear enough, plates won't move. We need to hit the 'sweet spot.'"

After crunching through the data for 100,000 hours of processing time per run, the model returned an estimate of the motion of both large tectonic plates and smaller microplates -- including their speed and direction. The results were remarkably close to observed plate movements.

In fact, the investigators discovered that anomalous rapid motion of microplates emerged from the global simulations. "In the western Pacific," Gurnis says, "we have some of the most rapid tectonic motions seen anywhere on Earth, in a process called 'trench rollback.' For the first time, we found that these small-scale tectonic motions emerged from the global models, opening a new frontier in geophysics."

One surprising result from the model relates to the energy released from plates in earthquake zones. "It had been thought that the majority of energy associated with plate tectonics is released when plates bend, but it turns out that's much less important than previously thought," Gurnis says. "Instead, we found that much of the energy dissipation occurs in the earth's deep interior. We never saw this when we looked on smaller scales."

The Pentagon Approves a Flying Car!

The Pentagon wants a flying car, and one seriously out-there military concept has been given the go-ahead. Here is a look at the Pentagon's next armored, armed, airborne Humvee.


The race to build the world's first flying military jeep just moved a step closer to the finish line. The Pentagon's Defense Advanced Research Projects Agency (DARPA) has selected two companies to proceed with the next stage of its Transformer, known as TX—a fully automated four-person vehicle that can drive like a car and then take off and fly like an aircraft to avoid roadside bombs. Lockheed Martin and AAI Corp., a unit of Textron Systems, are currently in negotiations with DARPA for the first stage of the Transformer project, several industry sources told Popular Mechanics at a robotics conference in Denver. DARPA has not announced the official winners yet.

It's unclear how many companies competed for the DARPA project, but the competition brought together an unusual mix of large defense companies with smaller aviation firms vying to build the vertical takeoff and landing craft. Perhaps most surprising—and for some competitors galling—is that DARPA selected a rotor-based aircraft for one of the two winning submissions. At an industry day held earlier this year, DARPA officials had initially said they weren't interested in a traditional rotary-wing aircraft, though they might consider a vehicle if the rotor was shrouded.

AAI's winning concept does not have a shrouded rotor, but it is also far from being a traditional helicopter. The company, which produces the Shadow unmanned aerial vehicle, is basing its proposal in part on the slowed-rotor/compound concept, a technology that uses rotor blades heavily weighted at the tips to give them high inertia. The rotor provides lift on takeoff; then, as the aircraft gains speed, the rotor slows down and the wings provide lift.

Lockheed Martin has declined to detail any aspect of its submission, but those familiar with the Skunk Works project speculated that it might combine aspects of the company's Joint Light Tactical Vehicle, a follow-on to the Humvee, with a ducted-fan propulsion system that it would use to fly.

The two companies are still a ways away from building flying Humvees; the first stage of the DARPA project is merely working on conceptual designs. The total funding available for Transformer is about $40 million.

Officials from both companies declined to comment on the record about the negotiations, and DARPA did not respond to a request for comment.

Sunday, August 29, 2010

Tiny Logo Demonstrates Advanced Display Technology Using Nano-Thin Metal Sheets


In a step toward more efficient, smaller and higher-definition display screens, a University of Michigan professor has developed a new type of color filter made of nano-thin sheets of metal with precisely spaced gratings.
An optical microscopy image of a 12-by-9-micron U-M logo produced with this new color filter process. (Credit: Jay Guo)

The gratings, sliced into metal-dielectric-metal stacks, act as resonators. They trap and transmit light of a particular color, or wavelength, said Jay Guo, an associate professor in the Department of Electrical Engineering and Computer Science. A dielectric is a material that does not conduct electricity.

"Simply by changing the space between the slits, we can generate different colors," Guo said. "Through nanostructuring, we can render white light any color."

A paper on the research is published Aug. 24 in Nature Communications.

His team used this technique to make what they believe is the smallest color U-M logo. At about 12-by-9 microns, it's about 1/6 the width of a human hair.

Conventional LCDs, or liquid crystal displays, are inefficient and manufacturing-intensive to produce. Only about 5 percent of their backlight travels through them and reaches our eyes, Guo said. They contain two layers of polarizers, a color filter sheet, and two layers of electrode-laced glass in addition to the liquid crystal layer. Chemical colorants for red, green and blue pixel components must be patterned in different regions on the screen in separate steps.

Guo's color filter acts as a polarizer simultaneously, eliminating the need for additional polarizer layers. In Guo's displays, reflected light could be recycled to save much of the light that would otherwise be wasted.

Because these new displays contain fewer layers, they would be simpler to manufacture, Guo said. The new color filters contain just three layers: two metal sheets sandwiching a dielectric. Red, green and blue pixel components could be made in one step by cutting arrays of slits in the stack. This structure is also more robust and can endure higher-powered light.

Red light emanates from slits set around 360 nanometers apart; green from those about 270 nanometers apart; and blue from those approximately 225 nanometers apart. The differently spaced gratings essentially catch different wavelengths of light and resonantly transmit them through the stacks.
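As a rough consistency check on those numbers, the transmitted wavelength should scale with the grating period times an effective refractive index of the metal-dielectric stack; the value of n_eff below is a guessed illustrative number, not one taken from the paper:

    # Grating periods quoted in the article, in nanometres.
    period_nm = {"red": 360, "green": 270, "blue": 225}

    # Transmitted wavelength ~ period * n_eff for a resonator of this kind;
    # n_eff = 1.8 is a guess used purely for illustration.
    n_eff = 1.8
    for color, period in period_nm.items():
        print(f"{color:5s} slits {period} nm apart -> ~{period * n_eff:.0f} nm light")

The scaling lands each period in roughly the right part of the visible spectrum (about 650 nm for red down to about 400 nm for blue).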

"Amazingly, we found that even a few slits can already produce well-defined color, which shows its potential for extremely high-resolution display and spectral imaging," Guo said.

The pixels in Guo's displays are about an order of magnitude smaller than those on a typical computer screen. They're about eight times smaller than the pixels on the iPhone 4, which are about 78 microns. He envisions that this pixel size could make this technology useful in projection displays, as well as wearable, bendable or extremely compact displays.

The paper is called "Plasmonic nano-resonators for high resolution color filtering and spectral imaging."

Guo is also an associate professor in the Department of Macromolecular Science and Engineering. This research is supported in part by the Air Force Office of Scientific Research and the Defense Advanced Research Projects Agency. The university is pursuing patent protection for the intellectual property and is seeking commercialization partners to help bring the technology to market.

Saturday, August 28, 2010

El Niños Are Growing Stronger, NASA/NOAA Study Finds


A relatively new type of El Niño, which has its warmest waters in the central-equatorial Pacific Ocean, rather than in the eastern-equatorial Pacific, is becoming more common and progressively stronger, according to a new study by NASA and NOAA. The research may improve our understanding of the relationship between El Niños and climate change, and has potentially significant implications for long-term weather forecasting.
Deviations from normal sea surface temperatures (left) and sea surface heights (right) at the peak of the 2009-2010 central Pacific El Niño, as measured by NOAA polar orbiting satellites and NASA's Jason-1 spacecraft, respectively. The warmest temperatures and highest sea levels were located in the central equatorial Pacific. (Credit: NASA/JPL-NOAA)

Lead author Tong Lee of NASA's Jet Propulsion Laboratory, Pasadena, Calif., and Michael McPhaden of NOAA's Pacific Marine Environmental Laboratory, Seattle, measured changes in El Niño intensity since 1982. They analyzed NOAA satellite observations of sea surface temperature, checked against and blended with directly measured ocean temperature data. The strength of each El Niño was gauged by how much its sea surface temperatures deviated from the average. They found the intensity of El Niños in the central Pacific has nearly doubled, with the most intense event occurring in 2009-10.
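The kind of index involved can be sketched in a few lines (a hypothetical illustration; the study's actual processing of blended satellite and in-situ data is far more involved). The intensity measure is essentially a mean sea surface temperature anomaly over a central-Pacific box:

    import numpy as np

    # Hypothetical sketch: an El Nino intensity index as the mean sea surface
    # temperature anomaly (observed minus climatology) over a central-Pacific
    # box, roughly the "Nino 4" region. Grid, box and values are illustrative.
    def central_pacific_index(sst, clim, lat, lon, box=(-5, 5, 160, 210)):
        la0, la1, lo0, lo1 = box
        mask = ((lat >= la0) & (lat <= la1))[:, None] & \
               ((lon >= lo0) & (lon <= lo1))[None, :]
        return float(np.mean((sst - clim)[mask]))

    lat = np.arange(-30.0, 31.0)            # 1-degree toy grid
    lon = np.arange(120.0, 291.0)
    clim = np.full((lat.size, lon.size), 27.0)
    sst = clim.copy()
    warm = (np.abs(lat) <= 5)[:, None] & ((lon >= 160) & (lon <= 210))[None, :]
    sst[warm] += 2.0                        # a +2 C central-Pacific warm event
    print("index:", central_pacific_index(sst, clim, lat, lon), "C")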

The scientists say the stronger El Niños help explain a steady rise in central Pacific sea surface temperatures observed over the past few decades in previous studies -- a trend attributed by some to the effects of global warming. While Lee and McPhaden observed a rise in sea surface temperatures during El Niño years, no significant temperature increases were seen in years when ocean conditions were neutral, or when El Niño's cool water counterpart, La Niña, was present.

"Our study concludes the long-term warming trend seen in the central Pacific is primarily due to more intense El NiƱos, rather than a general rise of background temperatures," said Lee.

"These results suggest climate change may already be affecting El NiƱo by shifting the center of action from the eastern to the central Pacific," said McPhaden. "El NiƱo's impact on global weather patterns is different if ocean warming occurs primarily in the central Pacific, instead of the eastern Pacific.

"If the trend we observe continues," McPhaden added, "it could throw a monkey wrench into long-range weather forecasting, which is largely based on our understanding of El NiƱos from the latter half of the 20th century."

El Niño, Spanish for "the little boy," is the oceanic component of a climate pattern called the El Niño-Southern Oscillation, which appears in the tropical Pacific Ocean on average every three to five years. The most dominant year-to-year fluctuating pattern in Earth's climate system, El Niños have a powerful impact on the ocean and atmosphere, as well as important socioeconomic consequences. They can influence global weather patterns and the occurrence and frequency of hurricanes, droughts and floods; and can even raise or lower global temperatures by as much as 0.2 degrees Celsius (0.4 degrees Fahrenheit).

During a "classic" El NiƱo episode, the normally strong easterly trade winds in the tropical eastern Pacific weaken. That weakening suppresses the normal upward movement of cold subsurface waters and allows warm surface water from the central Pacific to shift toward the Americas. In these situations, unusually warm surface water occupies much of the tropical Pacific, with the maximum ocean warming remaining in the eastern-equatorial Pacific.

Since the early 1990s, however, scientists have noted a new type of El Niño that has been occurring with greater frequency. Known variously as "central-Pacific El Niño," "warm-pool El Niño," "dateline El Niño" or "El Niño Modoki" (Japanese for "similar but different"), the maximum ocean warming from such El Niños is found in the central-equatorial, rather than eastern, Pacific. Such central Pacific El Niño events were observed in 1991-92, 1994-95, 2002-03, 2004-05 and 2009-10. A recent study found many climate models predict such events will become much more frequent under projected global warming scenarios.

Lee said further research is needed to evaluate the impacts of these increasingly intense El Niños and determine why these changes are occurring. "It is important to know if the increasing intensity and frequency of these central Pacific El Niños are due to natural variations in climate or to climate change caused by human-produced greenhouse gas emissions," he said.

Results of the study were published recently in Geophysical Research Letters.

For more information on El NiƱo, visit: http://sealevel.jpl.nasa.gov/.

Friday, August 27, 2010

Dry Water Could Make a Big Splash Commercially, Help Fight Global Warming


An unusual substance known as "dry water," which resembles powdered sugar, could provide a new way to absorb and store carbon dioxide, the major greenhouse gas that contributes to global warming, scientists reported at the 240th National Meeting of the American Chemical Society.
Powdered material called "dry water" could provide a new way to store carbon dioxide in an effort to fight global warming. (Credit: Ben Carter)

The powder shows bright promise for a number of other uses, they said. It may, for instance, be a greener, more energy-efficient way of jumpstarting the chemical reactions used to make hundreds of consumer products. Dry water also could provide a safer way to store and transport potentially harmful industrial materials.

"There's nothing else quite like it," said Ben Carter, Ph.D., researcher for study leader Professor Andrew Cooper. "Hopefully, we may see 'dry water' making waves in the future."

Carter explained that the substance became known as "dry water" because it consists of 95 percent water and yet is a dry powder. Each powder particle contains a water droplet surrounded by modified silica, the stuff that makes up ordinary beach sand. The silica coating prevents the water droplets from combining and turning back into a liquid. The result is a fine powder that can slurp up gases, which chemically combine with the water molecules to form what chemists term a hydrate.

Dry water was discovered in 1968 and got attention for its potential use in cosmetics. Scientists at the University of Hull, U.K. rediscovered it in 2006 in order to study its structure, and Cooper's group at the University of Liverpool has since expanded its range of potential applications.

One of the most recent involves using dry water as a storage material for gases, including carbon dioxide. In laboratory-scale research, Cooper and co-workers found that dry water absorbed over three times as much carbon dioxide as ordinary, uncombined water and silica in the same space of time. This ability to absorb large amounts of carbon dioxide gas as a hydrate could make it useful in helping to reduce global warming, the scientists suggested.

Cooper and colleagues demonstrated in previous studies that dry water is also useful for storing methane, a component of natural gas, and may help expand its use as a future energy source. In particular, they hope that engineers can use the powder to collect and transport stranded deposits of natural gas. Natural gas also exists on the ocean floor in the form of gas hydrates, a form of frozen methane also known as the "ice that burns." The powder could also provide a safer, more convenient way to store methane fuel for use in vehicles powered by natural gas. "A great deal of work remains to be done before we could reach that stage," Carter added.

In another potential new application, the scientists also showed that dry water is a promising means to speed up catalyzed reactions between hydrogen gas and maleic acid to produce succinic acid, a feedstock or raw material widely used to make drugs, food ingredients, and other consumer products. Manufacturers usually have to stir these substances together to get them to react. By developing dry water particles that contain maleic acid, Cooper and colleagues showed that they could speed up the acid's reaction with hydrogen without any stirring, resulting in a greener, more energy-efficient process.

"If you can remove the need to stir your reactions, then potentially you're making considerable energy savings," Carter said.

Prof. Cooper's team describes an additional new application in which dry water technology shows promise for storing liquids, particularly emulsions. Emulsions are mixtures of two or more unblendable liquids, such as the oil and water mixture in mayonnaise. The scientists showed that they could transform a simple emulsion into a dry powder that is similar to dry water. The resulting powder could make it safer and easier for manufacturers to store and transport potentially harmful liquids.

Carter noted that he and his colleagues are seeking commercial or academic collaboration to further develop the dry water technology. The U.K. Engineering and Physical Sciences Research Council (EPSRC) and the Center for Materials Discovery provided funding and technical support for this study.

Ultralow-power memory uses orders of magnitude less power than other devices


As RFID tags are becoming more widespread for tracking and identifying almost anything, researchers are continuing to develop cheap, ultralow-power memory devices for these applications. In a recent study, scientists from Cambridge have taken another step forward in this area by developing a write-once-read-many-times (WORM) memory device that requires just a fraction of the power needed by previous devices. In principle, the low-power memory can be used in any organic electronic circuit where the operation power is low.
The device structure and energy level diagram of the WORM memory, which can be programmed at power densities that are orders of magnitude lower than previously reported ultralow-power WORM devices. (Credit: Wang, et al. ©2010 American Institute of Physics)

Thursday, August 26, 2010

Biosynthetic Corneas Restore Vision in Humans


A new study from researchers in Canada and Sweden has shown that biosynthetic corneas can help regenerate and repair damaged eye tissue and improve vision in humans. The results, from an early phase clinical trial with 10 patients, are published in the August 25th, 2010 issue of Science Translational Medicine.
Dr. May Griffith displays a biosynthetic cornea that can be implanted into the eye to repair damage and restore sight. (Credit: Photo courtesy of the Ottawa Hospital Research Institute)

"This study is important because it is the first to show that an artificially fabricated cornea can integrate with the human eye and stimulate regeneration," said senior author Dr. May Griffith of the Ottawa Hospital Research Institute, the University of Ottawa and Linkƶping University. "With further research, this approach could help restore sight to millions of people who are waiting for a donated human cornea for transplantation."

The cornea is a thin transparent layer of collagen and cells that acts as a window into the eyeball. It must be completely transparent to allow light to enter, and it also helps the eye to focus. Globally, diseases that lead to clouding of the cornea represent the most common cause of blindness. More than a decade ago, Dr. Griffith and her colleagues began developing biosynthetic corneas in Ottawa, Canada, using collagen produced in the laboratory and moulded into the shape of a cornea. After extensive laboratory testing, Dr. Griffith began collaborating with Dr. Per Fagerholm, an eye surgeon at Linköping University in Sweden, to provide the first-in-human experience with biosynthetic cornea implantation.

Together, they initiated a clinical trial in 10 Swedish patients with advanced keratoconus or central corneal scarring. Each patient underwent surgery on one eye to remove damaged corneal tissue and replace it with the biosynthetic cornea, made from synthetically cross-linked recombinant human collagen. Over two years of follow-up, the researchers observed that cells and nerves from the patients' own corneas had grown into the implant, resulting in a "regenerated" cornea that resembled normal, healthy tissue. Patients did not experience any rejection reaction or require long-term immune suppression, which are serious side effects associated with the use of human donor tissue. The biosynthetic corneas also became sensitive to touch and began producing normal tears to keep the eye oxygenated. Vision improved in six of the ten patients, and after contact lens fitting, vision was comparable to conventional corneal transplantation with human donor tissue.

"We are very encouraged by these results and by the great potential of biosynthetic corneas," said Dr. Fagerholm. "Further biomaterial enhancements and modifications to the surgical technique are ongoing, and new studies are being planned that will extend the use of the biosynthetic cornea to a wider range of sight-threatening conditions requiring transplantation."

This research was supported by grants from the Canadian Stem Cell Network, the Swedish Research Council and County of Östergötland and a European Union Marie Curie International Fellowship. Initial work in developing the biosynthetic corneas was supported by the Natural Sciences and Engineering Research Council of Canada and the Canadian Institutes of Health Research. Recombinant human collagen type III used in formulating the biosynthetic corneas for the clinical study was provided by FibroGen, Inc., San Francisco, CA, U.S.A.

Dr. May Griffith is a Senior Scientist at the Ottawa Hospital Research Institute, Professor at the University of Ottawa (Faculty of Medicine) and Professor of Regenerative Medicine and Director of the Integrative Regenerative Medicine Centre at Linköping University. Dr. Per Fagerholm is a Professor of Ophthalmology at Linköping University. Dr. Neil Lagali is a senior lecturer at Linköping University. Other authors are listed in the paper.

Electricity collected from the air could become the newest alternative energy source


Imagine devices that capture electricity from the air ― much like solar cells capture sunlight ― and use them to light a house or recharge an electric car. Imagine using similar panels on the rooftops of buildings to prevent lightning before it forms. Strange as it may sound, scientists already are in the early stages of developing such devices, according to a report presented today at the 240th National Meeting of the American Chemical Society (ACS).
Powering homes with electricity collected from the air may be possible after scientists report solving a centuries-old riddle about how moisture in the atmosphere becomes electrically charged. (Credit: Martin Fischer)

"Our research could pave the way for turning electricity from the atmosphere into an alternative energy source for the future," said study leader Fernando Galembeck, Ph.D. His research may help explain a 200-year-old scientific riddle about how electricity is produced and discharged in the atmosphere. "Just as solar energy could free some households from paying electric bills, this promising new energy source could have a similar effect," he maintained.

"If we know how electricity builds up and spreads in the atmosphere, we can also prevent death and damage caused by lightning strikes," Galembeck said, noting that lightning causes thousands of deaths and injuries worldwide and millions of dollars in property damage.

The notion of harnessing the power of electricity formed naturally has tantalized scientists for centuries. They noticed that sparks of static electricity formed as steam escaped from boilers, and workers who touched the steam even got painful electrical shocks. Famed inventor Nikola Tesla, for example, was among those who dreamed of capturing and using electricity from the air ― the electricity formed, for instance, when water vapor collects on microscopic particles of dust and other material in the air. But until now, scientists lacked adequate knowledge about the processes involved in the formation and release of electricity from water in the atmosphere, said Galembeck, who is with the University of Campinas in Campinas, SP, Brazil.

Scientists once believed that water droplets in the atmosphere were electrically neutral, and remained so even after coming into contact with the electrical charges on dust particles and droplets of other liquids. But new evidence suggested that water in the atmosphere really does pick up an electrical charge.

Galembeck and colleagues confirmed that idea, using laboratory experiments that simulated water's contact with dust particles in the air. They used tiny particles of silica and aluminum phosphate, both common airborne substances, showing that silica became more negatively charged in the presence of high humidity and aluminum phosphate became more positively charged. High humidity means high levels of water vapor in the air ― the vapor that condenses and becomes visible as "fog" on windows of air-conditioned cars and buildings on steamy summer days.

"This was clear evidence that water in the atmosphere can accumulate electrical charges and transfer them to other materials it comes into contact with," Galembeck explained. "We are calling this 'hygroelectricity,' meaning 'humidity electricity'."

In the future, he added, it may be possible to develop collectors, similar to the solar cells that collect the sunlight to produce electricity, to capture hygroelectricity and route it to homes and businesses. Just as solar cells work best in sunny areas of the world, hygroelectrical panels would work more efficiently in areas with high humidity, such as the northeastern and southeastern United States and the humid tropics.

Galembeck said that a similar approach might help prevent lightning from forming and striking. He envisioned placing hygroelectrical panels on top of buildings in regions that experience frequent thunderstorms. The panels would drain electricity out of the air and prevent the buildup of the electrical charge that is released in lightning. His research group already is testing metals to identify those with the greatest potential for use in capturing atmospheric electricity and preventing lightning strikes.

"These are fascinating ideas that new studies by ourselves and by other scientific teams suggest are now possible," Galembeck said. "We certainly have a long way to go. But the benefits in the long range of harnessing hygroelectricity could be substantial."

Wednesday, August 25, 2010

'Spintronics' Breakthrough Holds Promise for Next-Generation Computers


Using powerful lasers, Hui Zhao, assistant professor of physics and astronomy at the University of Kansas, and graduate student Lalani Werake have discovered a new way to recognize currents of spinning electrons within a semiconductor.
Hui Zhao, assistant professor of physics and astronomy. (Credit: Image courtesy of University of Kansas)

Their findings could lead the way to development of superior computers and electronics. Results from their work in KU's Ultrafast Laser Lab will be published in the September issue of the journal Nature Physics and were posted online in early August.

Zhao and Werake research spin-based electronics, dubbed "spintronics."

"The goal is to replace everything -- from computers to memory devices -- to have higher performance and less energy consumption," said Zhao.

The KU investigator said that future advancements to microchips would require a different approach for transmitting the sequences of ones and zeros that make up digital information.

"We have been using the charge of the electron for several decades," said Zhao. "But right now the size of each device is just 30 to 50 nanometers, and you don't have many atoms remaining on that tiny scale. We can't continue that way anymore because we're hitting a fundamental limit."

Instead of using the presence or absence of electronic charges, spintronics relies on the direction of an electron's rotation to convey data.

"Roughly speaking, an electron can be viewed as a tiny ball that spins like a baseball," said Zhao. "The difference is that a baseball can spin at any speed, but an electron can only spin at a certain speed -- either counterclockwise or clockwise. Therefore, we can use one spin state to represent 'zero' and another to represent 'one.' Because a single electron can carry this information, this takes much less time and much less energy."

However, one major hurdle for spintronics researchers has been the difficulty in detecting the flow of spinning electrons in real time.

"We haven't been able to monitor the velocity of those spinning electrons, but velocity is associated with the spin current," Zhao said. "So there's been no way to directly detect the spin current so far."

The discovery by Zhao and Werake changes that.

The KU researchers have discovered that shining a laser beam on a piece of semiconductor generates light of a different color if a current of spinning electrons is flowing, and the brightness of the new light is related to the strength of the spin current.

The optical effect, known as "second-harmonic generation," can monitor spin-current in real time without altering the current itself. Zhao compares his new method with a police officer's radar gun, which tracks a car's speed as it passes.

This vastly improves upon spin-current analysis now in use, which the KU researcher says is akin to analyzing still photographs to determine a car's speed, long after the car has sped away.

"Spintronics is still in the research phase, and we hope that this new technology can be used in labs to look at problems that interest researchers," said Zhao. "As spintronics become industrialized, we expect this could become a routine technique to check the quality of devices, for example."

A five-year CAREER award from the National Science Foundation funded the work by Zhao and Werake.

Richest Planetary System Discovered


Astronomers using ESO's world-leading HARPS instrument have discovered a planetary system containing at least five planets, orbiting the Sun-like star HD 10180. The researchers also have tantalising evidence that two other planets may be present, one of which would have the lowest mass ever found. This would make the system similar to our Solar System in terms of the number of planets (seven as compared to the Solar System's eight planets). Furthermore, the team also found evidence that the distances of the planets from their star follow a regular pattern, as also seen in our Solar System.
The planetary system around the Sun-like star HD 10180 (artist's impression). (Credit: ESO/L. Calçada)

"We have found what is most likely the system with the most planets yet discovered," says Christophe Lovis, lead author of the paper reporting the result. "This remarkable discovery also highlights the fact that we are now entering a new era in exoplanet research: the study of complex planetary systems and not just of individual planets. Studies of planetary motions in the new system reveal complex gravitational interactions between the planets and give us insights into the long-term evolution of the system."

The team of astronomers used the HARPS spectrograph, attached to ESO's 3.6-metre telescope at La Silla, Chile, for a six-year-long study of the Sun-like star HD 10180, located 127 light-years away in the southern constellation of Hydrus (the Male Water Snake). HARPS is an instrument with unrivalled measurement stability and great precision and is the world's most successful exoplanet hunter.

Thanks to the 190 individual HARPS measurements, the astronomers detected the tiny back-and-forth motions of the star caused by the complex gravitational attractions of five or more planets. The five strongest signals correspond to planets with Neptune-like masses -- between 13 and 25 Earth masses [1] -- which orbit the star with periods ranging from about 6 to 600 days. These planets are located between 0.06 and 1.4 times the Earth-Sun distance from their central star.

"We also have good reasons to believe that two other planets are present," says Lovis. One would be a Saturn-like planet (with a minimum mass of 65 Earth masses) orbiting in 2200 days. The other would be the least massive exoplanet ever discovered, with a mass of about 1.4 times that of the Earth. It is very close to its host star, at just 2 percent of the Earth-Sun distance. One "year" on this planet would last only 1.18 Earth-days.

"This object causes a wobble of its star of only about 3 km/hour -- slower than walking speed -- and this motion is very hard to measure," says team member Damien SĆ©gransan. If confirmed, this object would be another example of a hot rocky planet, similar to Corot-7b (eso0933).

The newly discovered system of planets around HD 10180 is unique in several respects. First of all, with at least five Neptune-like planets lying within a distance equivalent to the orbit of Mars, this system is more populated than our Solar System in its inner region, and has many more massive planets there [2]. Furthermore, the system probably has no Jupiter-like gas giant. In addition, all the planets seem to have almost circular orbits.

So far, astronomers know of fifteen systems with at least three planets. The previous record-holder was 55 Cancri, which contains five planets, two of them being giant planets. "Systems of low-mass planets like the one around HD 10180 appear to be quite common, but their formation history remains a puzzle," says Lovis.

Using the new discovery as well as data for other planetary systems, the astronomers found an equivalent of the Titius-Bode law that exists in our Solar System: the distances of the planets from their star seem to follow a regular pattern [3]. "This could be a signature of the formation process of these planetary systems," says team member Michel Mayor.

Another important result found by the astronomers while studying these systems is that there is a relationship between the mass of a planetary system and the mass and chemical content of its host star. All very massive planetary systems are found around massive and metal-rich stars, while the four lowest-mass systems are found around lower-mass and metal-poor stars [4]. Such properties confirm current theoretical models.

The discovery was announced Aug. 24 at the international colloquium "Detection and dynamics of transiting exoplanets," at the Observatoire de Haute-Provence, France.

Notes

[1] Using the radial velocity method, astronomers can only estimate a minimum mass for a planet as the mass estimate also depends on the tilt of the orbital plane relative to the line of sight, which is unknown. From a statistical point of view, this minimum mass is however often close to the real mass of the planet.

[2] On average the planets in the inner region of the HD 10180 system have 20 times the mass of the Earth, whereas the inner planets in our own Solar System (Mercury, Venus, Earth and Mars) have an average mass of half that of the Earth.

[3] The Titius-Bode law states that the distances of the planets from the Sun follow a simple pattern. For the outer planets, each planet is predicted to be roughly twice as far away from the Sun as the previous object. The hypothesis correctly predicted the orbits of Ceres and Uranus, but failed as a predictor of Neptune's orbit.
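For reference, the classical rule fits in a few lines of Python (the regular spacing found for HD 10180 is an analogous pattern, not this exact law):

    # Classical Titius-Bode rule: a = 0.4 + 0.3 * 2**n astronomical units,
    # with n = -infinity (taken as 0.4 AU) for Mercury, then n = 0, 1, 2, ...
    def titius_bode(n):
        return 0.4 if n is None else 0.4 + 0.3 * 2 ** n

    bodies = [("Mercury", None), ("Venus", 0), ("Earth", 1), ("Mars", 2),
              ("Ceres", 3), ("Jupiter", 4), ("Saturn", 5), ("Uranus", 6)]
    for name, n in bodies:
        print(f"{name:8s} predicted at {titius_bode(n):5.1f} AU")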

[4] According to the definition used in astronomy, "metals" are all the elements other than hydrogen and helium. Such metals, except for a very few minor light chemical elements, have all been created by the various generations of stars. Rocky planets are made of "metals."

More information

This research was presented in a paper submitted to Astronomy and Astrophysics ("The HARPS search for southern extra-solar planets. XXVII. Up to seven planets orbiting HD 10180: probing the architecture of low-mass planetary systems" by C. Lovis et al.).

The team is composed of C. Lovis, D. Ségransan, M. Mayor, S. Udry, F. Pepe, and D. Queloz (Observatoire de Genève, Université de Genève, Switzerland), W. Benz (Universität Bern, Switzerland), F. Bouchy (Institut d'Astrophysique de Paris, France), C. Mordasini (Max-Planck-Institut für Astronomie, Heidelberg, Germany), N. C. Santos (Universidade do Porto, Portugal), J. Laskar (Observatoire de Paris, France), A. Correia (Universidade de Aveiro, Portugal), J.-L. Bertaux (Université Versailles Saint-Quentin, France) and G. Lo Curto (ESO).

Berries May Activate the Brain's Natural Housekeeper


Scientists have reported the first evidence that eating blueberries, strawberries, and acai berries may help the aging brain stay healthy in a crucial but previously unrecognized way. Their study, presented at the 240th National Meeting of the American Chemical Society (ACS), concluded that berries, and possibly walnuts, activate the brain's natural "housekeeper" mechanism, which cleans up and recycles toxic proteins linked to age-related memory loss and other mental decline.
Many berries could help protect the brain against aging, new research suggests. (Credit: iStockphoto/Raychel Deppe)

Shibu Poulose, Ph.D., who presented the report, said previous research suggested that one factor involved in aging is a steady decline in the body's ability to protect itself against inflammation and oxidative damage. This leaves people vulnerable to degenerative brain diseases, heart disease, cancer, and other age-related disorders.

"The good news is that natural compounds called polyphenolics found in fruits, vegetables and nuts have an antioxidant and anti-inflammatory effect that may protect against age-associated decline," said Poulose, who is with the U. S. Department of Agriculture-Agricultural Research Service (USDA-ARS) Human Nutrition Research Center on Aging in Boston. Poulose did the research with James Joseph, Ph.D., who died June 1. Joseph, who headed the laboratory, pioneered research on the role of antioxidants in fruits and nuts in preventing age-related cognitive decline.

Their past studies, for instance, showed that old laboratory rats fed for two months on diets containing 2 percent high-antioxidant strawberry, blueberry, or blackberry extract showed a reversal of age-related deficits in nerve function and behavior that involves learning and remembering.

In the new research, Poulose and Joseph focused on another reason why nerve function declines with aging. It involves a reduction in the brain's natural house-cleaning process. Cells called microglia are the housekeepers. In a process called autophagy, they remove and recycle biochemical debris that otherwise would interfere with brain function.

"But in aging, microglia fail to do their work, and debris builds up," Poulose explained. "In addition, the microglia become over-activated and actually begin to damage healthy cells in the brain. Our research suggests that the polyphenolics in berries have a rescuing effect. They seem to restore the normal housekeeping function. These findings are the first to show these effects of berries."

The findings emerged from research in which Joseph and Poulose have tried to detail factors involved in the aging brain's loss of normal housekeeping activity. Using cultures of mouse brain cells, they found that extracts of berries inhibited the action of a protein that shuts down the autophagy process.

Poulose said the study provides further evidence to eat foods rich in polyphenolics. Although berries and walnuts are rich sources, many other fruits and vegetables contain these chemicals ― especially those with deep red, orange, or blue colors. Those colors come from pigments termed anthocyanins that are good antioxidants. He emphasized the importance of consuming the whole fruit, which contains the full range of hundreds of healthful chemicals. Frozen berries, which are available year round, also are excellent sources of polyphenolics, he added.