Friday, April 30, 2010

Nude-Colored Hospital Gowns Could Help Doctors Better Detect Hard-to-See Symptoms


Changing the hue of hospital gowns and bed sheets to match a patient's skin color could greatly enhance a physician's ability to detect cyanosis and other health-related skin color changes, according to a new study from Rensselaer Polytechnic Institute.

A new study from Rensselaer Professor Mark Changizi suggests that changing the color of hospital gowns and bed sheets to match a patient's skin color could greatly enhance the ability of a doctor or nurse to detect cyanosis and other health-related skin color changes. Perceived color on skin crucially depends on the background color. For example, the five small squares are identical in each of the above boxes. The change and gradation of the small squares, however, are much easier to identify in the box in the lower right-hand corner than in the other hospital gown-colored boxes. (Credit: Mark Changizi / Rensselaer)

"If a doctor sees a patient, and then sees the patient again later, the doctor will have little or no idea whether the patient's skin has changed color," said neurobiologist and study leader Mark Changizi, assistant professor in the Department of Cognitive Science at Rensselaer. "Small shifts in skin color can have tremendous medical implications, and we have proposed a few simple tools -- skin-colored gowns, sheets, and adhesive tabs -- that could better arm physicians to make more accurate diagnoses."

Human eyes evolved to see in color largely for the purpose of detecting skin color changes such as when other people blush, Changizi said. These emotive skin color changes are extremely apparent because humans are hard-wired to notice them, and because the background skin color remains unchanged. The contrast against the nearby "baseline" skin color is what makes blushes so noticeable, he said.

Human skin also changes color as a result of hundreds of different medical conditions.

Pale skin, yellow skin, and cyanosis -- a potentially serious condition of bluish discoloration of the skin, lips, nails, and mucous membranes due to lack of oxygen in the blood -- are common symptoms. These color changes frequently go unnoticed, however, because they often involve a fairly uniform shift in skin color, Changizi said. The observer in most instances will simply assume the patient's current skin color is the baseline color. The challenge is that there is no color contrast against the baseline for the observer to pick up on, as the baseline skin color has changed altogether.

(To hear Changizi address the age-old question of why human veins look blue, see: http://blogger.rpi.edu/approach/2010/04/26/so-why-do-our-veins-look-blue/)

One potential solution, Changizi said, is for hospitals to outfit patients with gowns and sheets that are nude-colored and closely match their skin tone. Another solution is to develop adhesive tabs in a large palette of skin-toned colors. Physicians could then choose the tabs that most closely resemble the patient's skin tone, and place the tabs at several places on the skin of the patient. Both techniques should afford doctors and clinicians an easy and effective tool to record the skin tone of a patient, and see if it deviates -- even very slightly -- from its "baseline" color over time.

"If a patient's skin color shifts a small amount, the change will often be imperceptible to doctors and nurses," Changizi said. "If that patient is wearing a skin-colored gown or adhesive tab, however, and their skin uniformly changes slightly more blue, the initially 'invisible' gown or tab will appear bright and yellow to the observer."
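The baseline-contrast effect Changizi describes can be sketched numerically. All RGB values and the crude "blue shift" model below are hypothetical, chosen only to illustrate why a color-matched tab makes a uniform skin change visible:

```python
# Illustrative sketch (not from the study): a uniform skin color shift is
# invisible without a reference, but shows up against a color-matched tab.
baseline_skin = (224, 172, 105)   # hypothetical RGB skin tone
tab_color = baseline_skin          # adhesive tab matched to the baseline

def shift_toward_blue(rgb, amount):
    """Uniformly shift a color toward blue (a crude model of cyanosis)."""
    r, g, b = rgb
    return (max(r - amount, 0), g, min(b + amount, 255))

shifted_skin = shift_toward_blue(baseline_skin, 10)

# The contrast an observer perceives between skin and tab: without the tab
# there is no reference, and a uniform shift of this size goes unnoticed.
contrast = tuple(s - t for s, t in zip(shifted_skin, tab_color))
print(contrast)  # (-10, 0, 10): the skin now reads bluer than the tab
```

Against the unchanged tab, the shift appears as a relative difference; against an all-blue background (skin and gown shifting together in the observer's adaptation), it would not.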

While there are devices for specifically measuring the oxygen content of blood to help detect the onset of cyanosis, Changizi said the color recognition offered by the color-matched adhesive tabs and hospital gowns would be another tool to tip off the clinician that there is even a need to measure blood oxygen content. The color-matched tabs and gowns would also benefit many hospital departments, as well as international hospitals, which lack equipment to measure blood oxygen content, he said.

Changizi's findings are detailed in the paper "Harnessing color vision for visual oximetry in central cyanosis," published in the journal Medical Hypotheses. The complete paper may be viewed online at Changizi's Web site at: http://www.changizi.com/colorclinical.pdf.

Last year, Changizi's eye-opening book, The Vision Revolution: How the Latest Research Overturns Everything We Thought We Knew About Human Vision, hit store shelves. Published by BenBella Books, The Vision Revolution investigates why vision has evolved as it has over millions of years, and challenges theories that have dominated the scientific literature for decades.

Wednesday, April 28, 2010

Personality Impacts Brain Shrinkage in Aging?


Psychologists at Washington University in St. Louis have uncovered an intriguing possibility: personality and brain aging during the golden years may be linked.

Studying MRI images of 79 volunteers between the ages of 44 and 88 -- who also had provided personality and demographic data -- the researchers found lower volumes of gray matter in the frontal and medial temporal brain regions of volunteers who ranked high in neuroticism traits, compared with higher volumes of gray matter in those who ranked high in conscientious traits.
Top: The amygdala, which is part of the medial temporal region and involved in emotion processing, was larger in conscientious individuals but smaller in neurotic individuals. Bottom: The orbitofrontal cortex, which is part of the prefrontal region and involved in social/emotional processing, showed similar associations with personality. (Credit: Image courtesy of Washington University in St. Louis)

"This is a first step in seeing how personality might affect brain aging," says Denise Head, PhD, assistant professor of psychology in Arts & Sciences at Washington University. "Our data clearly show an association between personality and brain volume, particularly in brain regions associated with emotional and social processing. This could be interpreted that personality may influence the rate of brain aging."

She notes also that the results could be seen as "the tail wagging the dog" -- that is, it may actually be brain changes during aging that influence personality.

"Right now, we can't disentangle those two, but we plan to in the future by conducting ongoing studies of the volunteers over time to note future structural changes," Head says.

Head's graduate student Jonathan Jackson, first author of a recently published paper on the research in Neurobiology of Aging, says that he and co-authors Head and David A. Balota, PhD, professor of psychology, tested the hypotheses that aging individuals high in neuroticism would show lower brain volume, while those high in either conscientiousness or extroversion would have larger brain volume. The extroversion results were not clear, but the data validated the other two hypotheses.

"There are lots of nonhuman animal studies that suggest that chronic stress is associated with deleterious effects on the brain, and this helped us form the hypothesis that we'd see similar effects in older adults," Jackson says.

"We assumed that neuroticism would be negatively related to structural volume," Jackson says. "We really focused on the prefrontal and medial temporal regions because they are the regions where you see the greatest age changes, and they are also seats of attention, emotion and memory. We found that more neurotic individuals had smaller volumes in certain prefrontal and medial temporal parts of the brain than those who were less neurotic, and the opposite pattern was found with conscientiousness."

"A unique thing that we've done is to reliably measure personality differences and associate them with age-related effects on brain structures in healthy middle-aged and older adults," Head says. "Specifically, we found that neuroticism was associated with greater age-related decline in brain volume, whereas conscientiousness was associated with less age-related decline."

The researchers were interested in healthy aging brains because, down the road, the findings might serve as a useful marker for later diagnosis of dementia. The volunteers they studied are normal control participants at Washington University's Alzheimer's Disease Research Center (ADRC), led by John C. Morris, MD, the Friedman Distinguished Professor of Neurology and director of the ADRC.

One of the first changes in Alzheimer's disease may be in personality. Accumulating research from the ADRC and other institutions suggests that people tend to become more neurotic and less conscientious in early-stage Alzheimer's.

"It might be that changes in personality track onto those people more likely to develop Alzheimer's," Jackson says. "It's why we looked at older healthy adults because it's important to track these relationships in healthy populations before you look at pathological ones.

"We know that there are degenerative processes going on before the diagnosis of Alzheimer's. We want to be able to see if the subtle personality changes might be particular to an early clinical picture and possibly see if one can predict who will become demented based on personality changes," Jackson says.

Another way of looking at the findings, Head says, is that neuroticism might add an increasing vulnerability to the pathological processes that go on in aging, particularly in Alzheimer's.

"We will continue to pursue the relationship between personality and brain structure as one of the earlier processes in Alzheimer's and hence a possible risk factor," Head says.

Monday, April 26, 2010

How We Can Sense Temperatures: Discovery Could Lead to Novel Therapies for Acute and Chronic Pain


Scientists at The Scripps Research Institute and the Genomics Institute of the Novartis Research Foundation (GNF) have shed new light on the molecular mechanism that enables us to sense temperature, such as the heat from a sizzling stove. In addition to contributing to our knowledge of basic biology, the findings could one day lead to new therapies for conditions such as acute or chronic inflammatory pain.
New research sheds light on the molecular mechanism that enables us to sense temperature, such as the heat from a sizzling stove. The discovery could one day lead to new therapies for conditions such as acute or chronic inflammatory pain. (Credit: iStockphoto/Mark Evans)

The study, which was led by Scripps Research and GNF Professor Ardem Patapoutian, was published in an advance online edition of the journal Nature Neuroscience on April 22, 2010.

To better understand temperature sensation, the team focused on a protein called TRPV1, which is a member of a small family of proteins known to enable temperature sensation, and is involved in inflammation and the communication of pain to the brain. After producing thousands of mutants of this protein, the scientists were able to identify a region of the protein that enabled temperature sensitivity and to detail some of the molecular mechanisms at work in the molecule.

"Ever since the discovery of these proteins, it has been an outstanding question how they can be activated by temperatures," said Research Associate Jörg Grandl, a member of the Patapoutian lab and first author of the paper. "The new study addresses this question."

"Because our ability to sense temperature is closely linked to our ability to sense pain, some of these ion channels are considered targets to treat chronic inflammatory and neuropathic pain indications," said Patapoutian. "Understanding these proteins could be crucial in designing future drugs that can either activate or block them."

Hot and Cold

Humans and other vertebrate animals use specialized sensory neurons to detect temperature, pressure, and other physical stimuli on the skin. These neurons are located in the spinal column and are connected to the skin and organs through long extensions known as axons.

On the surface of these axons are ion channel (pore-forming) proteins, which span the axon's membrane, connecting the inside with the outside. Some of these ion channels act like temperature receptors or "molecular thermometers" by opening and closing according to the temperature. At a particular temperature, the receptors open. This allows an influx of ions into the neuronal processes, and this electrical signal is relayed through the neuron to the brain.

The existence of specialized hot- and cold-neurons had been known for years, but the molecules that actually sense the temperatures and signal back to the neuron through the axon were a mystery. That changed in 1997 when a group cloned TRPV1, which is a type of transient receptor potential (TRP) channel. TRPV1, an ion channel, opens when it senses hot temperatures -- above 42° C (108° F).

That discovery opened the floodgates for identifying other temperature-detecting proteins. Within a few years, several laboratories -- including Patapoutian's -- had identified additional temperature-detecting proteins and confirmed that mammals used them to detect temperature.

But how the proteins achieved their temperature-sensing ability remained a mystery. While scientists in the field knew in great detail how ion channels were activated by chemicals or voltage signals, the molecular structures required for temperature activation remained unknown.

Two competing theories were advanced to explain the activation of ion channels in response to temperature. Drawing on the proteins' similarity to voltage-gated potassium channels, the first theory posited that TRP channels generate temperature sensitivity because the energies required for voltage activation are very finely tuned. In contrast, the second theory proposed that these channels have a modular structure and therefore possess a specific domain that enables them to be activated by temperature -- postulating the existence of a 'temperature-sensor domain'.

Point by Point
To gain insight into how these ion channels achieve their temperature sensitivity, in the new study the scientists conducted studies of TRPV1, which was not only the first TRP channel to be discovered but is also the best understood. A previous study in the lab had focused on a related, warm-activated ion channel, TRPV3, but since the biophysics of this molecule is complicated, the team was unable to tease apart its mechanisms.

Using mutagenesis techniques, the scientists first generated some 8,500 mutants of TRPV1 for the new study. Then, working with the high-throughput equipment available at GNF, the team performed an unbiased screen of these mutants to identify mutations of interest.

"We were looking for mutations in these proteins that would only change the temperature sensitivity of these channels, but would not affect any of the other activation mechanisms," said Grandl. "We were looking for single-point mutations [changes of a single amino acid] where the channel still functioned normally in response to capsaicin (the active ingredient in chili peppers) or pH, but not to temperature."
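The selection rule Grandl describes amounts to a simple filter over per-mutant response data. The records, field names, and thresholds below are invented for illustration; the actual screen used high-throughput instrumentation at GNF:

```python
# Hypothetical sketch of the screening logic: keep mutants that still respond
# to capsaicin and pH but have lost their response to heat.
mutants = [
    {"id": "M001", "capsaicin": 0.95, "pH": 0.90, "heat": 0.88},  # normal channel
    {"id": "M002", "capsaicin": 0.92, "pH": 0.89, "heat": 0.05},  # candidate
    {"id": "M003", "capsaicin": 0.10, "pH": 0.12, "heat": 0.02},  # dead channel
]

FUNCTIONAL = 0.8  # normalized response above this counts as "responds normally"
SILENT = 0.2      # response below this counts as "no response"

candidates = [
    m["id"] for m in mutants
    if m["capsaicin"] > FUNCTIONAL and m["pH"] > FUNCTIONAL and m["heat"] < SILENT
]
print(candidates)  # ['M002']
```

Filtering on all three cues at once is what makes the screen specific to temperature sensing: a mutation that simply kills the channel fails the capsaicin and pH checks.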

Indeed, the team found a number of these mutations that affected the molecule's sensitivity to temperature, but not to other cues. Interestingly, the mutations were clustered in one area of the protein, the outer pore region, which lends further support to the existence of the predicted 'temperature-sensor domain'.

Next, with these mutant versions of TRPV1 in hand, the scientists examined what had changed in the molecule to disrupt temperature sensitivity.

In findings new to the field, the team discovered that TRPV1 has two ways of opening its channel -- for a brief time, opening for only a millisecond before returning to its closed resting state, and for a relatively long time, opening for about 10 milliseconds. The team found that the mutations disrupting temperature sensitivity interfered with the long channel openings, but not the short ones.
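In single-channel terms, that analysis amounts to sorting open events into two dwell-time populations. The recordings and the cutoff below are hypothetical, purely to illustrate the brief (~1 ms) versus long (~10 ms) distinction:

```python
# Illustrative sketch: classify channel openings into the two dwell-time
# populations described above. Data and threshold are invented.
dwell_times_ms = [0.8, 1.2, 9.5, 0.9, 11.0, 10.2, 1.1]

THRESHOLD_MS = 3.0  # hypothetical cutoff between the two populations
brief = [t for t in dwell_times_ms if t < THRESHOLD_MS]
long_open = [t for t in dwell_times_ms if t >= THRESHOLD_MS]

print(len(brief), len(long_open))  # 4 3
```

A mutant that disrupts temperature sensing would, on this picture, show a near-empty `long_open` population while the `brief` population is unchanged.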

"This study suggests a potential molecular mechanism that generates extreme temperature sensitivity from two mildly temperature sensitive steps," said Grandl. The team postulates that by stabilizing the open state, the pore domain contributes to thermosensitivity of TRPV1. "We now have a novel working model of how Nature could have evolved such exquisite temperature sensitivity, a hypothesis that can be tested in future work."

In addition to Grandl and Patapoutian, the paper was authored by Sung Eun Kim and Valerie Uzzell of Scripps Research, and Badry Bursulaya, Matt Petrus, and Michael Bandell of the Genomics Institute of the Novartis Research Foundation.

The research was supported by the U.S. National Institutes of Health, the Novartis Research Foundation, and fellowships to Grandl from the American Heart Association and the U.S. National Institutes of Health.

Sunday, April 25, 2010

Particulate Matter from Fires in the Amazon Affects Lightning Patterns


Native Americans used smoke signals to indicate danger, and a white plume is sent up by the Vatican when a new Pope is chosen. Now, a new research project by Tel Aviv University researchers and their colleagues shows that where there's "smoke" there may be significant consequences for local weather patterns, rainfall and thunderstorms.
Lightning. Scientists researched data on lightning patterns in the Amazon to show how clouds are affected by particulate matter emitted by the fires used for slash-and-burn foresting practices. Researchers found that while low levels of particulate matter actually help the development of thunderstorms, the reverse is true once a certain concentration is reached -- the particles then inhibit the formation of clouds and thunderstorms. (Credit: iStockphoto/Chee Ming Wong)

In a new study, Prof. Colin Price, head of Tel Aviv University's Department of Geophysics and Planetary Science, researched data on lightning patterns in the Amazon to show how clouds are affected by particulate matter emitted by the fires used for slash-and-burn foresting practices. His findings, recently published in the journal Geophysical Research Letters, could be used by climate change researchers trying to understand the impact of pollution on global weather patterns.

Along with colleagues at the Weizmann Institute and the Open University in Israel, Prof. Price demonstrated how pollution's effects on cloud development could negatively impact our environment. While low levels of particulate matter actually help the development of thunderstorms, the reverse is true once a certain concentration is reached -- the particles then inhibit the formation of clouds and thunderstorms.

"The clouds just dry up," he says.

Lightning strikes to the center of the issue

Scientists have known for some time that man-made aerosols affect cloud formation, but specific scientific findings have been inconclusive. How clouds and storms change in response to air pollution is central to the debate about climate change and global warming, since clouds have a general cooling effect on the Earth's climate.

But how man-made pollution impacts clouds, rainfall and weather patterns remains poorly understood, and natural particulates, such as those generated by Iceland's recent volcanic eruptions, may add to this effect. The thick volcanic ash cloud absorbs solar radiation, heating the upper atmosphere, much like forest fire smoke, and can hence also impact the development of clouds and rainfall, Price said.

While studying the climatology of the Amazon forest during its annual dry season, the researchers noticed how thousands of man-made forest fires injected smoke into the atmosphere. Since thunderstorms still occur during the dry season, it was the perfect opportunity for studying the effects of these particulates on thundercloud development.

Cloud droplets form on small particles called "cloud condensation nuclei" (CCN). As the number of CCN increased due to the fire activity, lightning activity increased in the storms ingesting the smoke. More CCN means more small droplets that can be carried aloft into the upper parts of the cloud, where lightning is generated. Increased lightning activity generally also implies increased rainfall over the Amazon. But when particulate matter became too dense, the researchers observed, clouds didn't form, and lightning activity in thunderstorms diminished dramatically.

Seeking answers to vital questions

These results may have significant implications for polluted regions of the world that rely on rainfall for agriculture and human consumption. "One of the most debated topics related to future climate change is what will happen to clouds, and rainfall, if the earth warms up," says Prof. Price, "and how will clouds react to more air pollution in the atmosphere?"

Clouds deflect the sun's rays, cooling the Earth's climate. If we change the duration of cloud cover, the aerial coverage of clouds, or the brightness of clouds, we can significantly impact the climate, Prof. Price and his colleagues explain. And too many aerosols may have disastrous impacts on rainfall patterns as well.

Air pollution from car exhausts and smokestacks at power plants and factories contribute to increasing particulate matter in our atmosphere. This is the first study of its kind that uses lightning as a quantitative way to measure the impact of air pollution on cloud development over a large area, and across a number of years.

"Lightning is a sensitive index to the inner workings of polluted clouds over the Amazon Basin," concludes Prof. Price.

4G Wireless: It's Not Just for Phones Anymore


Verizon says its next wireless network technology could link up cars, home appliances, and more.
Credit: Technology Review

Verizon is gearing up to launch its next wireless network technology, called Long Term Evolution (LTE), by the end of this year. While Verizon will, of course, still sell phones for this fourth generation (4G) network, it is also pushing to have it built into many other types of devices.

LTE will run on the spectrum formerly used to send television signals, which Verizon licensed from the U.S. government in 2008. The company expects to be able to support about 100 million users by the end of the year. But the saturation of the cell-phone market means that Verizon is also hoping to see the wireless technology used for many other kinds of devices. "We want to get to 500 to 600 percent penetration," says executive vice president and CTO Richard Lynch. This would mean an average of five or six wireless devices per person.

LTE promises better speed and lower latency than existing networks. Lynch says that users can expect uniform, reliable performance at five to 12 megabits per second--significantly faster than many wired connections today. He expects data to travel round-trip in 25 to 30 milliseconds, a fifth of the latency on the current network.
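As a back-of-envelope check on those figures (a sketch based only on the numbers quoted above, not Verizon data):

```python
# The article quotes 5-12 Mbps throughput and 25-30 ms round-trip latency,
# said to be a fifth of the current network's latency.
lte_rtt_ms = (25, 30)
current_rtt_ms = tuple(5 * t for t in lte_rtt_ms)  # "a fifth of the latency"
print(current_rtt_ms)  # (125, 150) ms on the existing network

# Time to move a 10 MB file at the low and high ends of the quoted range:
file_mb = 10
for mbps in (5, 12):
    seconds = file_mb * 8 / mbps  # megabytes -> megabits, divided by rate
    print(f"{mbps} Mbps: {seconds:.1f} s")
```

So even at the low end of the range, a 10 MB transfer finishes in about 16 seconds, and the latency improvement is what matters most for the interactive, always-connected devices Lynch describes.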

Lynch envisions people using Verizon's 4G wireless network for cars, computers, TVs, and other home appliances, as well as regular cell phones. Among other devices, the company has tested wall sockets and power strips that include 4G wireless capabilities. This could enable new forms of home-monitoring and energy management.

One potential problem is that not all of these devices will be under Verizon's control. When the company purchased the 700-megahertz spectrum, it had to agree to open its network to devices made by other companies. These devices must still be tested and certified to ensure they run safely on the network, but third-party developers will have much more latitude.

Last week, the company broke ground on a new lab in Waltham, MA, where it plans to let third-party developers develop and test devices for the LTE network under simulated real-world conditions.

Another key difference is that LTE runs on the Internet Protocol (IP). Lynch says that voice will be treated as an application over the new network. (Verizon will use VoIP to deliver voice calls.) An all-IP system will also allow for the use of secure protocols. And getting rid of non-IP components should make device compatibility easier.

Jeff Kagan, an independent wireless and telecom industry analyst, expects users to embrace new 4G wireless devices. Since Apple created a market for mobile data devices with the iPhone, he says, users have come to expect mobile data connections. The growing success of 3G-enabled e-readers is just one example, he says. When users can get data faster and more reliably, Kagan believes, they'll want ever-broader classes of devices to be connected to the Internet.

Arogyaswami Paulraj, a Stanford professor who has worked on LTE and a competing 4G technology called WiMax, says it makes sense to focus on data applications, since revenue from wireless data exceeds that for voice in many parts of the world. However, Paulraj thinks that Verizon shouldn't be too cavalier about its bandwidth needs. A big hit like the iPhone could leave Verizon scrambling, he says, adding that "LTE is a good technology, but challenged by the lack of spectrum."

Lynch acknowledges that a plethora of new wireless devices could eat up whatever bandwidth is gained by going to 4G. "There will never be enough bandwidth for my vision of what wireless can do," he says. But he believes that much of the new wireless traffic will come from simple IP-based devices sending relatively small amounts of data over the network--they'll need to be connected, but they won't be bandwidth hogs.

Saturday, April 24, 2010

Novel Negative-Index Metamaterial Bends Light 'Wrong' Direction


A group of scientists led by researchers from the California Institute of Technology (Caltech) has engineered a type of artificial optical material -- a metamaterial -- with a particular three-dimensional structure such that light exhibits a negative index of refraction upon entering the material. In other words, this material bends light in the "wrong" direction from what normally would be expected, irrespective of the angle of the approaching light.
Arrays of coupled plasmonic coaxial waveguides offer a new approach by which to realize negative-index metamaterials that are remarkably insensitive to angle of incidence and polarization in the visible range. (Credit: Caltech/Stanley Burgos)

This new type of negative-index metamaterial (NIM), described in an advance online publication in the journal Nature Materials, is simpler than previous NIMs -- requiring only a single functional layer -- and yet more versatile, in that it can handle light with any polarization over a broad range of incident angles. And it can do all of this in the blue part of the visible spectrum, making it "the first negative index metamaterial to operate at visible frequencies," says graduate student Stanley Burgos, a researcher at the Light-Material Interactions in Energy Conversion Energy Frontier Research Center at Caltech and the paper's first author.
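For context, the "wrong direction" bending can be illustrated with Snell's law, n1·sin(θ1) = n2·sin(θ2): a negative n2 places the refracted ray on the opposite side of the surface normal. The index value −1.5 below is hypothetical, not the paper's measured value:

```python
import math

# Worked illustration of refraction into a negative-index medium.
def refraction_angle_deg(n1, n2, incident_deg):
    """Return the refraction angle via Snell's law; a negative result means
    the ray bends to the 'wrong' side of the normal."""
    s = n1 * math.sin(math.radians(incident_deg)) / n2
    return math.degrees(math.asin(s))

print(round(refraction_angle_deg(1.0, 1.5, 30.0), 1))   # ordinary glass: 19.5
print(round(refraction_angle_deg(1.0, -1.5, 30.0), 1))  # NIM: -19.5
```

Same magnitude of bending, opposite side of the normal, which is exactly the negative-refraction signature the Caltech material exhibits at visible frequencies.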

"By engineering a metamaterial with such properties, we are opening the door to such unusual -- but potentially useful -- phenomena as superlensing (high-resolution imaging past the diffraction limit), invisibility cloaking, and the synthesis of materials index-matched to air, for potential enhancement of light collection in solar cells," says Harry Atwater, Howard Hughes Professor and professor of applied physics and materials science, director of Caltech's Resnick Institute, founding member of the Kavli Nanoscience Institute, and leader of the research team.

What makes this NIM unique, says Burgos, is its engineering. "The source of the negative-index response is fundamentally different from that of previous NIM designs," he explains. Those previous efforts used multiple layers of "resonant elements" to refract the light in this unusual way, while this version is composed of a single layer of silver permeated with "coupled plasmonic waveguide elements."

Surface plasmons are light waves coupled to waves of electrons at the interface between a metal and a dielectric (a non-conducting material like air). Plasmonic waveguide elements route these coupled waves through the material. Not only is this material more feasible to fabricate than those previously used, Burgos says, it also allows for simple "tuning" of the negative-index response; by changing the materials used, or the geometry of the waveguide, the NIM can be tuned to respond to a different wavelength of light coming from nearly any angle with any polarization. "By carefully engineering the coupling between such waveguide elements, it was possible to develop a material with a nearly isotropic refractive index tuned to operate at visible frequencies."

This sort of functional flexibility is critical if the material is to be used in a wide variety of ways, says Atwater. "For practical applications, it is very important for a material's response to be insensitive to both incidence angle and polarization," he says. "Take eyeglasses, for example. In order for them to properly focus light reflected off an object on the back of your eye, they must be able to accept and focus light coming from a broad range of angles, independent of polarization. Said another way, their response must be nearly isotropic. Our metamaterial has the same capabilities in terms of its response to incident light."

This means the new metamaterial is particularly well suited to use in solar cells, Atwater adds. "The fact that our NIM design is tunable means we could potentially tune its index response to better match the solar spectrum, allowing for the development of broadband wide-angle metamaterials that could enhance light collection in solar cells," he explains. "And the fact that the metamaterial has a wide-angle response is important because it means that it can 'accept' light from a broad range of angles. In the case of solar cells, this means more light collection and less reflected or 'wasted' light."

"This work stands out because, through careful engineering, greater simplicity has been achieved," says Ares Rosakis, chair of the Division of Engineering and Applied Science at Caltech and Theodore von Kármán Professor of Aeronautics and Mechanical Engineering.

Their work was supported by the Energy Frontier Research Centers program of the Office of Science of the Department of Energy, the National Science Foundation, the Nederlandse Organisatie voor Wetenschappelijk Onderzoek, and "NanoNed," a nanotechnology program funded by the Dutch Ministry of Economic Affairs.

Friday, April 23, 2010

Researchers Create 'Sound Bullets': Highly Focused Acoustic Pulses Could Be Used as Sonic Scalpels and More


Taking inspiration from a popular executive toy ("Newton's cradle"), researchers at the California Institute of Technology (Caltech) have built a device -- called a nonlinear acoustic lens -- that produces highly focused, high-amplitude acoustic signals dubbed "sound bullets."
Potential employment of a nonlinear acoustic lens to generate a sound bullet for hyperthermia procedures. The colored spheres depict nonlinear acoustic waves traveling within sphere chains. The curvature of the wavefront is induced by precompressing each row, and is used to generate appropriate time delays to focus acoustic energy at a desired focal point. The stylized image depicts a sound bullet superposed onto a brain MR image provided by Mike Tyszka of the Caltech Brain Imaging Center. (Credit: Spadoni & Daraio/Caltech)

The acoustic lens and its sound bullets (which can exist in fluids -- like air and water -- as well as in solids) have "the potential to revolutionize applications from medical imaging and therapy to the nondestructive evaluation of materials and engineering systems," says Chiara Daraio, assistant professor of aeronautics and applied physics at Caltech and corresponding author of a recent paper in the Proceedings of the National Academy of Sciences (PNAS) describing the development.

Daraio and postdoctoral scholar Alessandro Spadoni, first author of the paper, crafted their acoustic lens by assembling 21 parallel chains of stainless steel spheres into an array. Each chain was strung with 21 spheres, each 9.5 millimeters wide. (Daraio says particles composed of other elastic materials and/or with different shapes also could be used.)

The device is akin to the Newton's cradle toy, which consists of a line of identical balls suspended from a frame by wires in such a way that they only move in one plane, and just barely touch one another. When one of the end balls is pulled back and released, it strikes the next ball in line and the ball at the opposite end of the cradle flies out; the balls in the middle appear to remain stationary (but really are not, because of the nonlinear dynamics triggered in the system).

The chains of particles in Daraio's and Spadoni's acoustic lens are like a longer version of a Newton's cradle. In the lens, a pulse is excited at one end by an impact with a striker, and nonlinear waves are generated within each chain. These chains, Daraio says, "are the simplest representation of highly nonlinear acoustic waveguides, which exploit the properties of particle contacts to tune the shapes of the traveling acoustic signals and their speed of propagation, creating compact acoustic pulses known as solitary waves." Solitary waves -- unlike the rippling waves produced by dropping a pebble into a pond -- can exist in isolation, neither preceded nor followed by other waves.

"The solitary waves always maintain the same spatial wavelength in a given system," she adds, "and can have very high amplitude without undergoing any distortion within the lens, unlike the signals produced by currently available technology."

The chains are squeezed closer together -- or "precompressed" -- using fishing line. By changing the amount of precompression, Daraio and Spadoni were able to vary the speed of the solitary wave. When a series of those waves exit the array, they coalesce at a particular location -- a focal point -- in a target material (which can be a gas, like air; a liquid; or a solid). This superposition of solitary waves at the focal point forms the sound bullet -- a highly compact, large-amplitude acoustic wave. Varying the parameters of the system can also produce a rapid-fire barrage of sound bullets, all trained on the same spot.
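The focusing step works like a phased array: pulses from the different chains must arrive at the focal point simultaneously, so chains farther from the focus need a head start. A minimal sketch of that timing calculation, with invented geometry and wave speed (the chain spacing, focal distance, and speed below are assumptions for illustration, not values from the paper):

```python
import math

# Illustrative sketch (not the authors' model): treat each chain as a
# delay line and compute, for a hypothetical geometry, how pulse
# arrival times must be staggered so all 21 pulses coincide at a
# chosen focal point.  In the real lens the stagger is achieved by
# precompression, which tunes each chain's solitary-wave speed.

def travel_times(chain_xs, focal_point, wave_speed):
    """Time for a pulse leaving each chain's exit to reach the focus."""
    fx, fy = focal_point
    return [math.hypot(fx - x, fy) / wave_speed for x in chain_xs]

chain_xs = [i * 9.5e-3 for i in range(21)]   # chains spaced one sphere apart
focus = (10 * 9.5e-3, 50e-3)                 # on-axis point 50 mm ahead
c = 500.0                                    # assumed wave speed, m/s

t = travel_times(chain_xs, focus, c)
t_max = max(t)
# A chain whose pulse would arrive early must launch later (or travel
# slower, which the lens arranges via precompression) by this amount:
launch_offsets = [t_max - ti for ti in t]
print([round(dt * 1e6, 2) for dt in launch_offsets])  # in microseconds
```

The outer chains need no delay (they travel farthest), while the center chain, closest to the focus, gets the largest offset.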

In the current design, the spheres are assembled in a two-dimensional arrangement, with each row independent of its neighbors. "Three-dimensional arrangements will be just as easy to create and will allow 3-D control of the sound bullets' appearance and travel path," Spadoni says.

"Our lens introduces the ability to generate compact, high-amplitude signals in a linear medium, and also allows us to dynamically control the location of the focal point," Daraio says. That means it isn't necessary to change any of the geometric components of the lens to change the location of the focal point.

"All we do is adjust the precompression for each chain of spheres," she says.

This simple adjustment should make the sound bullets easy to adapt to a variety of applications. "Anybody who has had an ultrasound exam has noted that the operator switches the probes according to the characteristics and location within the body of what is being imaged," Daraio says. "The acoustic lens we propose would not require replacement of its components, but rather simple adjustments of the precompression on each chain."

The acoustic lens created by Daraio and Spadoni was intended to be a proof of concept, and is probably many years away from being used in commercial applications. "For practical uses," Daraio says, "an improved design for controlling the application of static precompression on each chain would be required -- based, for example, on electronics rather than on mechanical impacts as is currently done in our lab."

Still, the instrument has the potential to surpass the clarity and safety of conventional medical ultrasound imaging. The pulses produced by the acoustic lens -- which are an order of magnitude more focused and have amplitudes that are orders of magnitude greater than can be created with conventional acoustic devices -- "reduce the detrimental effects of noise, producing a clearer image of the target." They also "can travel farther" -- deeper within the body -- "than low-amplitude pulses," Daraio says.

More intriguingly, the device could enable the development of a non-invasive scalpel that could home in on and destroy cancerous tissues located deep within the body.

"Medical procedures such as hyperthermia therapy seek to act on human tissues by locally increasing the temperature. This is often done by focusing high-energy acoustic signals onto a small area, requiring significant control of the focal region" so that healthy tissue is not also heated and damaged, Daraio explains. "Our lens produces a very compact focal region which could aid further development of hyperthermia techniques."

Furthermore, sound bullets could offer a nondestructive way to probe and analyze the interior of nontransparent objects like bridges, ship hulls, and airplane wings, looking for cracks or other defects.

"Today the performance of acoustic devices is decreased by their linear operational range, which limits the accuracy of the focusing and the amplitude achievable at the focal point," Daraio says. "The new nonlinear acoustic lens proposed with this work leverages nonlinear effects to generate compact acoustic pulses with energies much higher than are currently achievable, with the added benefit of providing great control of the focal position."

Thursday, April 22, 2010

Bizarre Matter Could Find Use in Quantum Computers


There are enticing new findings in the search for materials that support fault-tolerant quantum computing. New results from Rice University and Princeton University indicate that a bizarre state of matter that acts like a particle with one-quarter electron charge also has a "quantum registry" that is immune to information loss from external perturbations.
From left, Rice physicist Rui-Rui Du, graduate students Chi Zhang and Yanhua Dai, and former postdoctoral researcher Tauno Knuuttila (not pictured) have found that odd groupings of ultracold electrons could be useful in making fault-tolerant quantum computers. (Credit: Jeff Fitlow/Rice University)

The research appeared online April 21 in Physical Review Letters. The team of physicists found that ultracold mixes of electrons caught in magnetic traps could have the necessary properties for constructing fault-tolerant quantum computers -- future computers that could be far more powerful than today's computers. The mixes of electrons are dubbed "5/2 quantum Hall liquids" in reference to the unusual quantum properties that describe their makeup.

"The big goal, the whole driving force, besides deep academic curiosity, is to build a quantum computer out of this," said the study's lead author Rui-Rui Du, professor of physics at Rice. "The key for that is whether these 5/2 liquids have 'topological' properties that would render them immune to the sorts of quantum perturbations that could cause information degradation in a quantum computer."

Du said the team's results indicate the 5/2 liquids have the desired properties. In the parlance of condensed-matter physics, they are said to represent a "non-Abelian" state of matter.

Non-Abelian is a mathematical term for a system with "noncommutative" properties. In math, commutative operations, like addition, are those that have the same outcome regardless of the order in which they are carried out. So, one plus two equals three, just as two plus one equals three. In daily life, commutative and noncommutative tasks are commonplace. For example, when doing the laundry, it doesn't matter if the detergent is added before the water or the water before the detergent, but it does matter if the clothes are washed before they're placed in the dryer.
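A short computation makes the contrast concrete: addition of numbers commutes, while multiplication of matrices, the textbook example of a noncommutative (non-Abelian) operation, does not. This toy snippet is a generic illustration and has nothing to do with the 5/2 state itself:

```python
# Commutative vs. noncommutative operations, illustrated with numbers
# (commutative under addition) and 2x2 matrices (noncommutative under
# multiplication).

def matmul2(a, b):
    """Multiply two 2x2 matrices given as nested lists."""
    return [[sum(a[i][k] * b[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

# Addition: order doesn't matter.
assert 1 + 2 == 2 + 1

# Matrix multiplication: order matters.
A = [[0, 1], [1, 0]]   # swap the two components
B = [[1, 0], [0, -1]]  # flip the sign of the second component
print(matmul2(A, B))   # [[0, -1], [1, 0]]
print(matmul2(B, A))   # [[0, 1], [-1, 0]]
assert matmul2(A, B) != matmul2(B, A)
```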

"It will take a while to fully understand the complete implications of our results, but it is clear that we have nailed down the evidence for 'spin polarization,' which is one of the two necessary conditions that must be proved to show that the 5/2 liquids are non-Abelian," Du said. "Other research teams have been tackling the second condition, the one-quarter charge, in previous experiments."

The importance of the noncommutative quantum properties is best understood within the context of fault-tolerant quantum computers, a fundamentally new type of computer that hasn't been built yet.

Computers today are binary. Their electrical circuits, which can be open or closed, represent the ones and zeros in binary bits of information. In quantum computers, scientists hope to use "quantum bits," or qubits. Unlike binary ones and zeros, the qubits can be thought of as little arrows that represent the position of a bit of quantum matter. The arrow might represent a one if it points straight up or a zero if it points straight down, but it could also represent any number in between. In physics parlance, these arrows are called quantum "states." And for certain complex calculations, being able to represent information in many different states would present a great advantage over binary computing.
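That "arrow" picture can be sketched in a few lines. In the standard formalism a qubit is a pair of complex amplitudes (alpha, beta) with |alpha|^2 + |beta|^2 = 1, and the arrow's tilt sets the probabilities of reading out a zero or a one. A generic illustration, not specific to the 5/2 liquids:

```python
import math, cmath

# Sketch: a qubit state |psi> = alpha|0> + beta|1> as two complex
# amplitudes with |alpha|^2 + |beta|^2 = 1.  Measurement yields 0
# with probability |alpha|^2 and 1 with probability |beta|^2.

def qubit(theta, phi=0.0):
    """Point on the Bloch sphere: theta=0 is |0>, theta=pi is |1>."""
    alpha = math.cos(theta / 2)
    beta = cmath.exp(1j * phi) * math.sin(theta / 2)
    return alpha, beta

def probabilities(state):
    alpha, beta = state
    return abs(alpha) ** 2, abs(beta) ** 2

print(probabilities(qubit(0)))        # arrow straight up: certainly 0
print(probabilities(qubit(math.pi)))  # arrow straight down: certainly 1
p0, p1 = probabilities(qubit(math.pi / 2))
print(round(p0, 3), round(p1, 3))     # arrow halfway: 50/50
```

The continuum of angles in between is exactly the "any number in between" the article describes, and it is what binary bits cannot represent.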

The upshot of the 5/2 liquids being non-Abelian is that they have a sort of "quantum registry," where information doesn't change due to external quantum perturbations.

"In a way, they have internal memory of their previous state," Du said.

The conditions needed to create the 5/2 liquids are extreme. At Rice, Tauno Knuuttila, a former postdoctoral research scientist in Du's group, spent several years building the "demagnetization refrigerator" needed to cool 5-millimeter squares of ultrapure semiconductors to within one-10,000th of a degree of absolute zero. It took a week for Knuuttila to simply cool the nearly one-ton instrument to the necessary temperature for the Rice experiments.

The gallium arsenide semiconductors used in the tests are the purest on the planet. They were created by Loren Pfeiffer, Du's longtime collaborator at Princeton and Bell Labs. Rice graduate student Chi Zhang conducted additional tests at the National High Magnetic Field Laboratory in Tallahassee, Fla., to verify that the 5/2 liquid was spin-polarized.

Study co-authors include Zhang, Knuuttila, Pfeiffer, Princeton's Ken West and Rice's Yanhua Dai. The research is supported by the Department of Energy, the National Science Foundation and the Keck Foundation.

Saturday, April 17, 2010

Electronic 'Nose' Can Predict Pleasantness of Novel Odors


Weizmann Institute scientists have 'trained' an electronic system to be able to predict the pleasantness of novel odors, just like a human would perceive them -- turning the popular notion that smell is completely personal and culture-specific on its head. In research published in PLoS Computational Biology, the scientists argue that the perception of an odor's pleasantness is innately hard-wired to its molecular structure, and it is only within specific contexts that personal or cultural differences are made apparent.
Electronic Nose
Scientists have 'trained' an electronic system to
be able to predict the pleasantness of novel odors, 
just like a human would perceive them. 
(Credit: iStockphoto/Sharon Dominick)

These findings have important implications for automated environmental toxicity and malodor monitoring and for fast odor screening in the perfume industry, and they provide a critical building block for the Holy Grail of sense technology: transmitting scent digitally.

Over the last decade, electronic devices, commonly known as electronic noses or 'eNoses,' have been developed to be able to detect and recognize odors. The main component of an eNose is an array of chemical sensors. As an odor passes through the eNose, its molecular features stimulate the sensors in such a way as to produce a unique electrical pattern -- an 'odor fingerprint' -- that characterizes that specific odor. Like a sniffer dog, an eNose first needs to be trained with odor samples so as to build a database of reference. Then the instrument can recognize new samples of those odors by comparing the odor's fingerprint to those contained in its database.
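In code, that recognition step is essentially nearest-neighbor matching of sensor-response vectors. A minimal sketch with an invented four-sensor array (the odor names and numbers are purely illustrative, not data from the study):

```python
import math

# Sketch of the "odor fingerprint" idea: each odor produces a vector
# of sensor responses, and a new sample is recognized by finding the
# closest fingerprint in the reference database.

def distance(a, b):
    """Euclidean distance between two sensor-response vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def recognize(sample, database):
    """Return the name of the stored fingerprint nearest to `sample`."""
    return min(database, key=lambda name: distance(sample, database[name]))

# Toy 4-sensor reference database (invented values).
database = {
    "coffee": [0.9, 0.1, 0.4, 0.2],
    "lemon":  [0.2, 0.8, 0.1, 0.5],
    "smoke":  [0.7, 0.3, 0.9, 0.1],
}

print(recognize([0.85, 0.15, 0.35, 0.25], database))  # coffee
```

The limitation the article goes on to describe falls out of this scheme directly: a genuinely novel odor is still forced onto the nearest stored label, however poor the match.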

But unlike humans, if eNoses are presented with a novel odor whose fingerprint has not already been recorded in their database, they are unable to classify or recognize it.

So a team of Weizmann scientists decided to approach the issue from a different perspective. The team was led by Dr. Rafi Haddad, then a graduate student of Prof. Noam Sobel of the Neurobiology Department and of co-supervisor Prof. David Harel of the Computer Science and Applied Mathematics Department, and included their colleague Abebe Medhanie of the Neurobiology Department and Dr. Yehudah Roth of the Edith Wolfson Medical Center, Holon. Rather than train an eNose to recognize a particular odor, they trained it to estimate where an odor falls along a particular perceptual axis. The axis they chose was odorant pleasantness: they trained their eNose to predict whether an odor would be perceived as pleasant or unpleasant, or anywhere in between.

To achieve this, the scientists first asked a group of native Israelis to rate the pleasantness of a selection of odors on a 30-point scale ranging from 'very pleasant' to 'very unpleasant.' From this dataset, they developed an 'odor pleasantness' algorithm, which they then programmed into the eNose. The scientists then had the eNose predict the pleasantness of a completely new set of odors not contained in its database, and compared its predictions against the ratings provided by a completely different group of native Israelis. The eNose was able to generalize and rate the pleasantness of novel odors it had never smelled before, and its ratings were about 80% similar to those of naive human raters who had not participated in the training phase. Moreover, if the odors were simply categorized as either 'pleasant' or 'unpleasant,' rather than rated on a scale, the eNose achieved an accuracy of 99%.
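The shift from recognizing odors to rating them can be viewed as a regression problem: the eNose learns a mapping from sensor responses to the human pleasantness scale, and that mapping generalizes to odors outside the training set. The sketch below uses invented data and a plain least-squares fit; it illustrates the idea, not the authors' actual algorithm:

```python
# Toy sketch: regress human pleasantness ratings (a 30-point scale)
# onto eNose sensor responses, then predict a rating for an odor the
# system has never seen.  Data and model are invented for illustration.

def predict(w, x):
    """Linear model: weighted sum of sensor responses."""
    return sum(wi * xi for wi, xi in zip(w, x))

def fit(X, y, lr=0.05, steps=5000):
    """Least-squares fit by simple gradient descent."""
    w = [0.0] * len(X[0])
    for _ in range(steps):
        for i in range(len(w)):
            grad = sum((predict(w, x) - t) * x[i] for x, t in zip(X, y))
            w[i] -= lr * grad / len(X)
    return w

# Training odors: [sensor1, sensor2, bias] -> rating
# (1 = very unpleasant ... 30 = very pleasant).  Invented numbers.
X = [[0.9, 0.1, 1], [0.8, 0.2, 1], [0.2, 0.9, 1], [0.1, 0.8, 1]]
y = [27, 24, 5, 4]

w = fit(X, y)
novel = [0.7, 0.3, 1]                  # an odor outside the database
rating = predict(w, novel)
print(round(rating, 1))
print("pleasant" if rating > 15 else "unpleasant")  # binary category
```

Collapsing the continuous rating to a binary label, as in the last line, mirrors the article's 99%-accuracy categorization result: a coarser target is easier to predict.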

But these findings still don't determine whether olfactory perception is culture-specific or not. With this in mind, the scientists tested the eNose's predictions against a group of recent immigrants to Israel from Ethiopia. The eNose's ability to predict the pleasantness of novel odors against the native Ethiopians' ratings was just as good, even though it had been 'tuned' to the pleasantness of odors as perceived by native Israelis. In other words, even though different odors have different meanings across cultures, the eNose performed equally well across these populations. This suggests a fundamental cross-cultural similarity in odorant pleasantness.

Sobel comments: "Being able to predict whether a person we have never tested before would like a specific odorant, no matter their cultural background, provides evidence that odor pleasantness is a fundamental biological property, and that certain aspects of molecular structure are what determine whether an odor is pleasant or not." So how are cultural differences accounted for? "We believe that culture influences the perception of olfactory pleasantness mostly in particular contexts. To stress this point, many may wonder how the French can like the smell of their cheese, when most others find the smell quite repulsive. We believe it is not that the French think the smell is pleasant per se; they merely think it is a sign of good cheese. However, if the smell were presented out of context in a jar, the French would probably rate the odor just as unpleasant as anyone else would."

The scientists' findings that odor perception is hard-wired to molecular structure and their design of an eNose that is able to classify new odors could provide new methods for odor screening and environmental monitoring, and may, in the future, allow for the digital transmission of smell to scent-enable movies, games and music to provide a more immersive and captivating experience.

Prof. Noam Sobel's research is supported by the Nella and Leon Benoziyo Center for Neurosciences; the J&R Foundation; and Regina Wachter, New York. This research was funded by an FP7 grant from the European Research Council awarded to Noam Sobel.

Wednesday, April 14, 2010

First Direct Recording Made of Mirror Neurons in Human Brain


Mirror neurons, many say, are what make us human. They are the cells in the brain that fire not only when we perform a particular action but also when we watch someone else perform that same action.
Mirror neurons, many say, are what make us human. 
(Credit: Image courtesy of University of California
- Los Angeles)

Neuroscientists believe this "mirroring" is the mechanism by which we can "read" the minds of others and empathize with them. It's how we "feel" someone's pain, how we discern a grimace from a grin, a smirk from a smile.

Problem was, there was no proof that mirror neurons existed -- only suspicion and indirect evidence. Now, reporting in the April edition of the journal Current Biology, Dr. Itzhak Fried, a UCLA professor of neurosurgery and of psychiatry and biobehavioral sciences, Roy Mukamel, a postdoctoral fellow in Fried's lab, and their colleagues have for the first time made a direct recording of mirror neurons in the human brain.

The researchers recorded both single cells and multiple-cell activity, not only in motor regions of the brain where mirror neurons were thought to exist but also in regions involved in vision and in memory.

Further, they showed that specific subsets of mirror cells increased their activity during the execution of an action but decreased their activity when an action was only being observed.

"We hypothesize that the decreased activity from the cells when observing an action may be to inhibit the observer from automatically performing that same action," said Mukamel, the study's lead author. "Furthermore, this subset of mirror neurons may help us distinguish the actions of other people from our own actions."

The researchers drew their data directly from the brains of 21 patients who were being treated at Ronald Reagan UCLA Medical Center for intractable epilepsy. The patients had been implanted with intracranial depth electrodes to identify seizure foci for potential surgical treatment. Electrode location was based solely on clinical criteria; the researchers, with the patients' consent, used the same electrodes to "piggyback" their research.

The experiment included three parts: facial expressions, grasping and a control experiment. Activity from a total of 1,177 neurons in the 21 patients was recorded as the patients both observed and performed grasping actions and facial gestures. In the observation phase, the patients observed various actions presented on a laptop computer. In the activity phase, the subjects were asked to perform an action based on a visually presented word. In the control task, the same words were presented and the patients were instructed not to execute the action.

The researchers found that the neurons fired or showed their greatest activity both when the individual performed a task and when they observed a task. The mirror neurons making the responses were located in the medial frontal cortex and medial temporal cortex, two neural systems where mirroring responses at the single-cell level had not been previously recorded, not even in monkeys.

This new finding demonstrates that mirror neurons are located in more areas of the human brain than previously thought. Given that different brain areas implement different functions -- in this case, the medial frontal cortex for movement selection and the medial temporal cortex for memory -- the finding also suggests that mirror neurons provide a complex and rich mirroring of the actions of other people.

Because mirror neurons fire both when an individual performs an action and when one watches another individual perform that same action, it's thought this "mirroring" is the neural mechanism by which the actions, intentions and emotions of other people can be automatically understood.

"The study suggests that the distribution of these unique cells linking the activity of the self with that of others is wider than previously believed," said Fried, the study's senior author and director of the UCLA Epilepsy Surgery Program.

"It's also suspected that dysfunction of these mirror cells might be involved in disorders such as autism, where the clinical signs can include difficulties with verbal and nonverbal communication, imitation and having empathy for others," Mukamel said. "So gaining a better understanding of the mirror neuron system might help devise strategies for treatment of this disorder."

Other authors on the study included Arne D. Ekstrom, Jonas Kaplan and Marco Iacoboni, all of UCLA. The project was supported by the National Center for Research Resources, a component of the National Institutes of Health (NIH). The authors report no conflict of interest.

Monday, April 12, 2010

Exotic Quantum Spin-Liquid Simulated: A Starting Point for Superconductivity?


An exotic state of matter that physicists call a "quantum spin-liquid" can be realized by electrons in a honeycomb crystal structure.
flat honeycomb structure
The simulation of the quantum spin-liquid was performed 
on a flat honeycomb structure, where the electrons show
a dynamical phase lacking any order. 
(Credit: Image courtesy of University of Stuttgart)

This is shown by scientists from the Universities of Stuttgart and Würzburg, Germany in a new study published in the journal Nature.

Electrons inside a crystal can exist in different states. In many cases it is the crystal structure that decides whether the material is a metal with a finite electric conductivity or an insulator that does not carry an electric current. But there are also insulating materials whose crystal structures suggest they should behave like metals. Such materials are called "Mott insulators": it is the repulsion between the electrons that suppresses metallic behaviour, locking the electrons to the atoms.

Such localized electrons tend to order as the temperature is lowered, for example into magnetic structures. A "quantum spin-liquid," however, is a non-magnetic Mott insulator that is stabilized purely by quantum mechanical effects. The electrons inside a quantum spin-liquid resist ordering down to the lowest temperatures, all the way to absolute zero at minus 273 degrees Celsius. The tendency to order is suppressed by dynamical fluctuations of the electrons that persist even at absolute zero (quantum fluctuations). For this to happen, the quantum fluctuations must be sufficiently large, which is rarely the case in nature and also hard to realize in realistic models.

Now theorists from Stuttgart University, Zi Yang Meng, Priv.-Doz. Stefan Wessel and Prof. Alejandro Muramatsu, together with their colleagues Thomas Lang and Prof. Fakher Assaad from Würzburg University, have shown that such a quantum spin-liquid exists in a realistic model of interacting electrons. For their study, they used large-scale computer simulations in order to account for both the interactions between the electrons and their quantum fluctuations. Their unexpected findings have been published in the journal Nature.

The quantum spin-liquid found by Meng et al. occurs in materials where the atoms form a two-dimensional, periodic array of hexagons, thus realizing a honeycomb lattice. Such a crystal structure is found, for example, in graphene, a two-dimensional carbon allotrope that was only recently synthesized and has since been the focus of intensive research. If the electronic interactions could be enhanced in such a material, the highly interesting quantum spin-liquid state could be realized. It appears unlikely that this can be achieved in graphene itself, for example by expansion. The physicists from Stuttgart and Würzburg therefore suggest exploring honeycomb-like structures formed from other group IV elements that show enhanced electronic interactions. A first step in this direction may already have been taken, since chemists have previously succeeded in synthesizing graphene-like structures of silicon atoms.

Furthermore, the quantum spin-liquid should also be realizable using ultra-cold atoms. In fact, the mathematical model studied by the physicists describes both interacting electrons in solid state systems as well as interacting ultra-cold atoms in an optical lattice. The impressive progress that has been achieved in this research field opens up the possibility to realize the quantum spin-liquid with ultra-cold atoms.

Another fascinating aspect of the quantum spin-liquid is that it can also be viewed as a starting point for superconductivity. Electric currents would then flow without resistance through the material. This has potential for many applications, such as ultra fast computers or the dissipation free transport of electricity.

In their fundamental research, the two theory groups in Stuttgart and Würzburg analyse complex phases of strongly interacting quantum many-body systems. They discovered the quantum spin-liquid phase while studying possible transitions between metallic and insulating phases in a model for graphene. In the vicinity of such transitions, the quantum fluctuations become significantly enhanced and destroy any magnetic order. Through an extensive analysis, the scientists could also exclude other types of electronic order. Such a study was only possible with the help of modern supercomputers; for their calculations, the theorists benefited from the highly efficient supercomputing centers in Jülich, München and Stuttgart. In the future, they hope to apply simulations of strongly interacting electrons to the design of novel materials that realize exotic states of matter, including the quantum spin-liquid.

The research described above is embedded within the general research activities of the two universities. At the University of Stuttgart, the DFG research unit SFB/TRR 21, "Control of Quantum Correlations in Tailored Matter," focuses on the realization of tailored quantum systems; its spokesperson is Prof. Tilmann Pfau of the University of Stuttgart. At the University of Würzburg, a recently established research group, "Electron Correlation-Induced Phenomena in Surfaces and Interfaces with Tuneable Interactions," focuses on complex electronic states; its spokesperson is Prof. Ralph Claessen of Würzburg University.