
Monday, July 23, 2012

Coursera makes top college courses free online


Daphne Koller and Andrew Ng share a vision in which anyone, no matter how destitute, can expand their minds and prospects with lessons from the world's top universities.

Coursera co-founders Andrew Ng and Daphne Koller
That vision won the backing this week of a dozen vaunted academic institutions, including Duke University, the Ecole Polytechnique Federale de Lausanne (EPFL) in Switzerland and the University of Edinburgh in Scotland.

The schools will add online versions of classes to Coursera.org, a website launched by Stanford University professors Koller and Ng early this year with debut offerings from Princeton, Stanford and two other US universities.

"We have a vision where students everywhere around the world, regardless of country, family circumstances or financial circle have access to top quality education whether to expand their minds or learn valuable skills," Koller said.

"Where education becomes a right, not a privilege."

Academic institutions are increasingly turning to the Internet as an educational platform. The Khan Academy website, created by Massachusetts Institute of Technology (MIT) graduate Salman Khan, provides thousands of video lectures.

The nonprofit behind prestigious TED gatherings recently launched a TED-Ed channel at YouTube that teams accomplished teachers with talented animators to make videos that captivate while they educate.

In May, Harvard University and MIT announced that they were teaming up to expand their online education programs -- and invited other institutions to jump on board.

Called edX, the $60 million joint venture builds on MIT's existing MITx platform that enables video lesson segments, embedded quizzes, immediate feedback, online laboratories and student-paced learning.

"Universities have come to realize that online is not a fad," Koller said. "The question is not whether to engage in this area but how to do it."

Coursera classes are free, and completion certificates are issued that people can use to win jobs or improve careers.

"If a student takes a Stanford computer class and a Princeton business class, it shows they are motivated and have skills," Koller said. "We know it has helped employees get better jobs."

Coursera is distinguishing itself with essentially virtual versions of real classes.

"A lot of what is out there is basically video with, perhaps, some static content like lecture notes," Koller said.

"We are providing an actual course exchange were people register and there is weekly homework that is graded with feedback about how they are doing."

Coursera classes launched in February, and although most of the courses are slated to begin in the coming months, the site has already attracted students in 190 countries, according to Koller.

Coursera uses crowd-sourcing to translate material into various languages and hopes to connect with French-speaking populations around the world with EPFL classes.

Beyond spreading knowledge around the world, Coursera is a way to inspire faculty to try new teaching methods and to find ways that Internet Age tools can enhance on-campus courses, according to Duke provost Peter Lange.

"Our faculty is incredibly excited by the idea of trying it out and seeing if we can learn from it," Lange said.

"I love the idealism of it; the potential to reach people who might never get the chance to attend the university."

Duke designs its online courses to get students involved, complete with social networking tools for collaborating outside of classes.

"This is a great experiment in innovation and learning," Lange said.

As of Friday, Coursera boasted about 740,000 students and that number is expected to soar as word spreads and class offerings expand.

Coursera plans to keep classes free but perhaps one day make money for operations by charging for course completion certificates or matching employers with qualified workers.

"Current ethos in Silicon Valley is that if you build a website that people keep coming back to and is changing the lives of millions, you can eventually make money," Koller said.

"If and when we develop revenue, universities will share in it."

Paying the bills is not a worry at Coursera due to generous backing that includes a $3.7 million combined investment by the University of Pennsylvania and the California Institute of Technology, as well as funding from venture capital powerhouse Kleiner Perkins Caufield & Byers.

Tuesday, December 20, 2011

Big Ecosystem Shifts from Climate Change




By 2100, global climate change will modify plant communities covering almost half of Earth's land surface and will drive the conversion of nearly 40 percent of land-based ecosystems from one major ecological community type -- such as forest, grassland or tundra -- toward another, according to a new NASA and university computer modeling study.

Predicted percentage of ecological landscape being driven toward changes in plant species as a result of projected human-induced climate change by 2100. (Credit: NASA/JPL-Caltech)

Researchers from NASA's Jet Propulsion Laboratory and the California Institute of Technology in Pasadena, Calif., investigated how Earth's plant life is likely to react over the next three centuries as Earth's climate changes in response to rising levels of human-produced greenhouse gases. Study results are published in the journal Climatic Change.

The model projections paint a portrait of increasing ecological change and stress in Earth's biosphere, with many plant and animal species facing increasing competition for survival, as well as significant species turnover, as some species invade areas occupied by other species. Most of Earth's land that is not covered by ice or desert is projected to undergo at least a 30 percent change in plant cover -- changes that will require humans and animals to adapt and often relocate.

In addition to altering plant communities, the study predicts climate change will disrupt the ecological balance between interdependent and often endangered plant and animal species, reduce biodiversity and adversely affect Earth's water, energy, carbon and other element cycles.

"For more than 25 years, scientists have warned of the dangers of human-induced climate change," said Jon Bergengren, a scientist who led the study while a postdoctoral scholar at Caltech. "Our study introduces a new view of climate change, exploring the ecological implications of a few degrees of global warming. While warnings of melting glaciers, rising sea levels and other environmental changes are illustrative and important, ultimately, it's the ecological consequences that matter most."

When faced with climate change, plant species often must "migrate" over multiple generations, as they can only survive, compete and reproduce within the range of climates to which they are evolutionarily and physiologically adapted. While Earth's plants and animals have evolved to migrate in response to seasonal environmental changes and to even larger transitions, such as the end of the last ice age, they often are not equipped to keep up with the rapidity of modern climate changes that are currently taking place. Human activities, such as agriculture and urbanization, are increasingly destroying Earth's natural habitats, and frequently block plants and animals from successfully migrating.

To study the sensitivity of Earth's ecological systems to climate change, the scientists used a computer model that predicts the type of plant community that is uniquely adapted to any climate on Earth. This model was used to simulate the future state of Earth's natural vegetation in harmony with climate projections from 10 different global climate simulations. These simulations are based on the intermediate greenhouse gas scenario in the United Nations' Intergovernmental Panel on Climate Change Fourth Assessment Report. That scenario assumes greenhouse gas levels will double by 2100 and then level off. The U.N. report's climate simulations predict a warmer and wetter Earth, with global temperature increases of 3.6 to 7.2 degrees Fahrenheit (2 to 4 degrees Celsius) by 2100, about the same warming that occurred following the Last Glacial Maximum almost 20,000 years ago, except about 100 times faster. Under the scenario, some regions become wetter because of enhanced evaporation, while others become drier due to changes in atmospheric circulation.
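For readers who want to check the "about 100 times faster" comparison, here is a rough back-of-the-envelope sketch. The post-glacial numbers below are illustrative assumptions of my own (a few degrees of warming spread over roughly ten thousand years), not figures taken from the study.

```python
# Back-of-the-envelope check of the "about 100 times faster" comparison.
# The post-LGM numbers are illustrative assumptions, not from the article.

projected_warming_c = 3.0      # midpoint of the 2-4 degC range quoted for 2100
projected_interval_yr = 100    # roughly the 21st century

lgm_warming_c = 4.0            # assumed deglacial warming of similar magnitude
lgm_interval_yr = 10_000       # assumed duration of post-glacial warming

rate_now = projected_warming_c / projected_interval_yr   # degC per year
rate_lgm = lgm_warming_c / lgm_interval_yr                # degC per year

print(f"Projected rate: {rate_now:.4f} degC/yr")
print(f"Post-LGM rate:  {rate_lgm:.4f} degC/yr")
print(f"Ratio: ~{rate_now / rate_lgm:.0f}x faster")       # on the order of 100
```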

The researchers found a shift of biomes, or major ecological community types, toward Earth's poles -- most dramatically in temperate grasslands and boreal forests -- and toward higher elevations. Ecologically sensitive "hotspots" -- areas projected to undergo the greatest degree of species turnover -- that were identified by the study include regions in the Himalayas and the Tibetan Plateau, eastern equatorial Africa, Madagascar, the Mediterranean region, southern South America, and North America's Great Lakes and Great Plains areas. The largest areas of ecological sensitivity and biome changes predicted for this century are, not surprisingly, found in areas with the most dramatic climate change: in the Northern Hemisphere high latitudes, particularly along the northern and southern boundaries of boreal forests.

"Our study developed a simple, consistent and quantitative way to characterize the impacts of climate change on ecosystems, while assessing and comparing the implications of climate model projections," said JPL co-author Duane Waliser. "This new tool enables scientists to explore and understand interrelationships between Earth's ecosystems and climate and to identify regions projected to have the greatest degree of ecological sensitivity."

"In this study, we have developed and applied two new ecological sensitivity metrics -- analogs of climate sensitivity -- to investigate the potential degree of plant community changes over the next three centuries," said Bergengren. "The surprising degree of ecological sensitivity of Earth's ecosystems predicted by our research highlights the global imperative to accelerate progress toward preserving biodiversity by stabilizing Earth's climate."

JPL is managed for NASA by the California Institute of Technology in Pasadena.

Saturday, August 6, 2011

Engineers Solve Longstanding Problem in Photonic Chip Technology: Findings Help Pave Way for Next Generation of Computer Chips


Stretching for thousands of miles beneath oceans, optical fibers now connect every continent except for Antarctica. With less data loss and higher bandwidth, optical-fiber technology allows information to zip around the world, bringing pictures, video, and other data from every corner of the globe to your computer in a split second. But although optical fibers are increasingly replacing copper wires, carrying information via photons instead of electrons, today's computer technology still relies on electronic chips.
Caltech engineers have developed a new way to isolate light on a photonic chip, allowing light to travel in only one direction. This finding can lead to the next generation of computer-chip technology: photonic chips that allow for faster computers and less data loss. (Credit: Caltech/Liang Feng)

Now, researchers led by engineers at the California Institute of Technology (Caltech) are paving the way for the next generation of computer-chip technology: photonic chips. With integrated circuits that use light instead of electricity, photonic chips will allow for faster computers and less data loss when connected to the global fiber-optic network.

"We want to take everything on an electronic chip and reproduce it on a photonic chip," says Liang Feng, a postdoctoral scholar in electrical engineering and the lead author on a paper to be published in the August 5 issue of the journal Science. Feng is part of Caltech's nanofabrication group, led by Axel Scherer, Bernard A. Neches Professor of Electrical Engineering, Applied Physics, and Physics, and co-director of the Kavli Nanoscience Institute at Caltech.

In that paper, the researchers describe a new technique to isolate light signals on a silicon chip, solving a longstanding problem in engineering photonic chips.

An isolated light signal can only travel in one direction. If light weren't isolated, signals sent and received between different components on a photonic circuit could interfere with one another, causing the chip to become unstable. In an electrical circuit, a device called a diode isolates electrical signals by allowing current to travel in one direction but not the other. The goal, then, is to create the photonic analog of a diode, a device called an optical isolator. "This is something scientists have been pursuing for 20 years," Feng says.

Normally, a light beam has exactly the same properties when it moves forward as when it's reflected backward. "If you can see me, then I can see you," he says. In order to isolate light, its properties need to somehow change when going in the opposite direction. An optical isolator can then block light that has these changed properties, which allows light signals to travel only in one direction between devices on a chip.

"We want to build something where you can see me, but I can't see you," Feng explains. "That means there's no signal from your side to me. The device on my side is isolated; it won't be affected by my surroundings, so the functionality of my device will be stable."

To isolate light, Feng and his colleagues designed a new type of optical waveguide, a 0.8-micron-wide silicon device that channels light. The waveguide allows light to go in one direction but changes the mode of the light when it travels in the opposite direction.

A light wave's mode corresponds to the pattern of the electromagnetic field lines that make up the wave. In the researchers' new waveguide, the light travels in a symmetric mode in one direction, but changes to an asymmetric mode in the other. Because different light modes can't interact with one another, the two beams of light thus pass through each other.
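The "different modes can't interact" point can be illustrated with a toy calculation: the overlap integral between a symmetric and an antisymmetric field profile across the waveguide is zero, so the two beams exchange no power. The cosine and sine profiles below are illustrative stand-ins, not the actual device modes from the Science paper.

```python
import numpy as np

# Illustrative sketch (not the actual device modes from the paper): a symmetric
# and an antisymmetric transverse field profile across a waveguide of width w
# have zero overlap, so counter-propagating beams in these modes don't couple.

w = 0.8e-6                                      # waveguide width, ~0.8 microns
x = np.linspace(-w / 2, w / 2, 2001)

symmetric_mode = np.cos(np.pi * x / w)          # even about the waveguide center
antisymmetric_mode = np.sin(2 * np.pi * x / w)  # odd about the waveguide center

overlap = np.trapz(symmetric_mode * antisymmetric_mode, x)
norm = np.sqrt(np.trapz(symmetric_mode**2, x) * np.trapz(antisymmetric_mode**2, x))

print(f"Normalized overlap: {overlap / norm:.2e}")  # ~0: the modes can't couple
```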



Previously, there were two main ways to achieve this kind of optical isolation. The first way -- developed almost a century ago -- is to use a magnetic field. The magnetic field changes the polarization of light -- the orientation of the light's electric-field lines -- when it travels in the opposite direction, so that the light going one way can't interfere with the light going the other way. "The problem is, you can't put a large magnetic field next to a computer," Feng says. "It's not healthy."

The second conventional method requires so-called nonlinear optical materials, which change light's frequency rather than its polarization. This technique was developed about 50 years ago, but is problematic because silicon, the material that's the basis for the integrated circuit, is a linear material. If computers were to use optical isolators made out of nonlinear materials, silicon would have to be replaced, which would require revamping all of computer technology. But with their new silicon waveguides, the researchers have become the first to isolate light with a linear material.

Although this work is just a proof-of-principle experiment, the researchers are already building an optical isolator that can be integrated onto a silicon chip. An optical isolator is essential for building the integrated, nanoscale photonic devices and components that will enable future integrated information systems on a chip. Current, state-of-the-art photonic chips operate at 10 gigabits per second (Gbps) -- hundreds of times the data-transfer rates of today's personal computers -- with the next generation expected to soon hit 40 Gbps. But without built-in optical isolators, those chips are much simpler than their electronic counterparts and are not yet ready for the market. Optical isolators like those based on the researchers' designs will therefore be crucial for commercially viable photonic chips.

In addition to Feng and Scherer, the other authors on the Science paper, "Non-reciprocal light propagation in a silicon photonic circuit," are Jingqing Huang, a Caltech graduate student; Maurice Ayache of UC San Diego and Yeshaiahu Fainman, Cymer Professor in Advanced Optical Technologies at UC San Diego; and Ye-Long Xu, Ming-Hui Lu, and Yan-Feng Chen of the Nanjing National Laboratory of Microstructures in China. This research was done as part of the Center for Integrated Access Networks (CIAN), one of the National Science Foundation's Engineering Research Centers. Fainman is also the deputy director of CIAN. Funding was provided by the National Science Foundation, and the Defense Advanced Research Projects Agency.

Friday, May 20, 2011

Japan's 9.0 Tohoku-Oki Earthquake: Surprising Findings About Energy Distribution Over Fault Slip and Stress Accumulation



When the magnitude 9.0 Tohoku-Oki earthquake and resulting tsunami struck off the northeast coast of Japan on March 11, they caused widespread destruction and death. Using observations from a dense regional geodetic network (allowing measurements of earth movement to be gathered from GPS satellite data), globally distributed broadband seismographic networks, and open-ocean tsunami data, researchers have begun to construct numerous models that describe how the earth moved that day.
The image represents an overhead model of the estimated fault slip due to the 9.0 Tohoku-Oki earthquake. The fault responsible for this earthquake dips under Japan, starting at the Japan Trench (indicated by the barbed line), which is the point of contact between the subducting Pacific Plate and the overriding Okhotsk Plate. The magnitude of fault slip is indicated both by the color and the contours, which are at 8-meter intervals. The question mark indicates the general region where researchers currently lack information about future seismic potential. (Credit: Mark Simons/Caltech Seismological Laboratory)

Now, a study led by researchers at the California Institute of Technology (Caltech), published online in the May 19 issue of Science Express, explains the first large set of observational data from this rare megathrust event.

"This event is the best recorded great earthquake ever," says Mark Simons, professor of geophysics at Caltech's Seismological Laboratory and lead author of the study. For scientists working to improve infrastructure and prevent loss of life through better application of seismological data, observations from the event will help inform future research priorities.

Simons says one of the most interesting findings of the data analysis was the spatial compactness of the event. The megathrust earthquake occurred at a subduction zone where the Pacific Plate dips below Japan. The length of fault that experienced significant slip during the Tohoku-Oki earthquake was about 250 kilometers, about half of what would be conventionally expected for an event of this magnitude.

Furthermore, the area where the fault slipped the most -- 30 meters or more -- happened within a 50- to 100-kilometer-long segment. "This is not something we have documented before," says Simons. "I'm sure it has happened in the past, but technology has advanced only in the past 10 to 15 years to the point where we can measure these slips much more accurately through GPS and other data."

For Jean Paul Ampuero, assistant professor of seismology at Caltech's Seismological Laboratory who studies earthquake dynamics, the most significant finding was that high- and low-frequency seismic waves can come from different areas of a fault. "The high-frequency seismic waves in the Tohoku earthquake were generated much closer to the coast, away from the area of the slip where we saw low-frequency waves," he says.

Simons says two factors control this behavior. The first is that the largest amount of stress (which is what generates the highest-frequency waves) was found at the edges of the slip, not near the center of where the fault began to break. He compares the finding to what happens when you rip a piece of paper in half. "The highest amounts of stress aren't found where the paper has just ripped, but rather right where the paper has not yet been torn," he explains. "We had previously thought high-frequency energy was an indicator of fault slippage, but it didn't correlate in our models of this event." Equally important is how the fault reacts to these stress concentrations; it appears that only the deeper segments of the fault respond to these stresses by producing high-frequency energy.
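The paper-ripping picture has a standard textbook counterpart in linear-elastic fracture mechanics, sketched below. This scaling relation is a general result, not an equation taken from the Tohoku-Oki study.

```latex
% Standard linear-elastic crack-tip scaling (textbook fracture mechanics,
% not a result from the Tohoku-Oki study): stress ahead of the unbroken
% edge grows as the distance r to the edge shrinks,
\[
  \sigma(r) \;\approx\; \frac{K}{\sqrt{2\pi r}}, \qquad r \to 0^{+},
\]
% while the already-slipped (or already-torn) region behind the edge is
% essentially traction-free. Hence the largest stresses, and in this event
% the high-frequency radiation, concentrate at the edges of the slip patch.
```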

Ampuero says the implications of these observations of the mechanical properties of tectonic faults need to be further explored and integrated in physical models of earthquakes, which will help scientists better quantify earthquake hazards.

"We learn from each significant earthquake, especially if the earthquake is large and recorded by many sensors," says Ampuero. "The Tohoku earthquake was recorded by upwards of 10 times more sensors at near-fault distances than any other earthquake. This will provide a sharper and more robust view of earthquake rupture processes and their effects."

For seismologist Hiroo Kanamori, Caltech's Smits Professor of Geophysics, Emeritus, who was in Japan at the time of the earthquake and has been studying the region for many years, the most significant finding was that a large slip occurred near the Japan Trench. While smaller earthquakes have happened in the area, it was believed that the relatively soft material of the seafloor would not support a large amount of stress. "The amount of strain associated with this large displacement is nearly five to 10 times larger than we normally see in large megathrust earthquakes," he notes. "It has been generally thought that rocks near the Japan Trench could not accommodate such a large elastic strain."

The researchers are still unsure why such a large strain was able to accumulate in this area. One possibility is that either the subducting seafloor or the upper plate (or both) have some unusual structures -- such as regions that were formerly underwater mountain ranges on the Pacific Plate -- that have now been consumed by the subduction zone and cause the plates to get stuck and build up stress.

"Because of this local strengthening -- whatever its cause -- the Pacific Plate and the Okhotsk Plate had been pinned together for a long time, probably 500 to 1000 years, and finally failed in this magnitude 9.0 event," says Kanamori. "Hopefully, detailed geophysical studies of seafloor structures will eventually clarify the mechanism of local strengthening in this area."

Simons says researchers knew very little about the area where the earthquake occurred because of limited historical data.

"Instead of saying a large earthquake probably wouldn't happen there, we should have said that we didn't know," he says. Similarly, he says the area just south of where the fault slipped is in a similar position; researchers don't yet know what it might do in the future.

"It is important to note that we are not predicting an earthquake here," emphasizes Simons. "However, we do not have data on the area, and therefore should focus attention there, given its proximity to Tokyo."

He says that the relatively new Japanese seafloor observation systems will prove very useful in scientists' attempts to learn more about the area.

"Our study is only the first foray into what is an enormous quantity of available data," says Simons. "There will be a lot more information coming out of this event, all of which will help us learn more in order to help inform infrastructure and safety procedures."

The work was funded by the Gordon and Betty Moore Foundation, National Science Foundation grants, the Southern California Earthquake Center, and NASA's internal Research and Technology Development program.

Tuesday, March 8, 2011

'Elephant Trunks' in Space: WISE Captures Image of Star-Forming Cloud of Dust and Gas


NASA's Wide-field Infrared Survey Explorer, or WISE, captured this image of a star-forming cloud of dust and gas, called Sh2-284, located in the constellation of Monoceros. Lining up along the edges of a cosmic hole are several "elephant trunks" -- or monstrous pillars of dense gas and dust.

NASA's Wide-field Infrared Survey Explorer, or WISE, captured this image of a star-forming cloud of dust and gas located in the constellation of Monoceros. (Credit: NASA/JPL-Caltech/UCLA)


 

The most famous examples of elephant trunks are the "Pillars of Creation" found in an iconic image of the Eagle nebula from NASA's Hubble Space Telescope. In this WISE image, the trunks are seen as small columns of gas stretching toward the center of the void in Sh2-284. The most notable one can be seen on the right side at about the 3 o'clock position. It appears as a closed hand with a finger pointing toward the center of the void. That elephant trunk is about 7 light-years long.

Deep inside Sh2-284 resides an open star cluster, called Dolidze 25, which is emitting vast amounts of radiation in all directions, along with stellar winds. These stellar winds and radiation are clearing out a cavern inside the surrounding gas and dust, creating the void seen in the center. The bright green wall surrounding the cavern shows how far out the gas has been eroded. However, some sections of the original gas cloud were much denser than others, and they were able to resist the erosive power of the radiation and stellar winds. These pockets of dense gas remained and protected the gas "downwind" from them, leaving behind the elephant trunks.

Sh2-284 is relatively isolated at the very end of an outer spiral arm of our Milky Way galaxy. In the night sky, it's located in the opposite direction from the center of the Milky Way.

NASA's Jet Propulsion Laboratory, Pasadena, Calif., manages and operates the Wide-field Infrared Survey Explorer for NASA's Science Mission Directorate, Washington. The principal investigator, Edward Wright, is at UCLA. The mission was competitively selected under NASA's Explorers Program managed by the Goddard Space Flight Center, Greenbelt, Md. The science instrument was built by the Space Dynamics Laboratory, Logan, Utah, and the spacecraft was built by Ball Aerospace & Technologies Corp., Boulder, Colo. Science operations and data processing take place at the Infrared Processing and Analysis Center at the California Institute of Technology in Pasadena. Caltech manages JPL for NASA.

More information is online at http://www.nasa.gov/wise and http://wise.astro.ucla.edu and http://jpl.nasa.gov/wise

Monday, February 7, 2011

Brain's Electrical Fields: Neural Communication



The brain -- awake and sleeping -- is awash in electrical activity, and not just from the individual pings of single neurons communicating with each other. In fact, the brain is enveloped in countless overlapping electric fields, generated by the neural circuits of scores of communicating neurons. The fields were once thought to be an "epiphenomenon, a 'bug' of sorts, occurring during neural communication," says neuroscientist Costas Anastassiou, a postdoctoral scholar in biology at the California Institute of Technology (Caltech).
Ephaptic coupling leads to coordinated spiking of nearby neurons, as measured using a 12-pipette electrophysiology setup developed in the laboratory of coauthor Henry Markram. (Credit: Image from Figure 4 in Anastassiou et al., Nature Neuroscience, 2011)


New work by Anastassiou and his colleagues, however, suggests that the fields do much more -- and that they may, in fact, represent an additional form of neural communication.

"In other words," says Anastassiou, the lead author of a paper about the work appearing in the journal Nature Neuroscience, "while active neurons give rise to extracellular fields, the same fields feed back to the neurons and alter their behavior," even though the neurons are not physically connected -- a phenomenon known as ephaptic coupling. "So far, neural communication has been thought to occur at localized machines, termed synapses. Our work suggests an additional means of neural communication through the extracellular space independent of synapses."

Extracellular electric fields exist throughout the living brain, though they are particularly strong and robustly repetitive in specific brain regions such as the hippocampus, which is involved in memory formation, and the neocortex, the area where long-term memories are held. "The perpetual fluctuations of these extracellular fields are the hallmark of the living and behaving brain in all organisms, and their absence is a strong indicator of a deeply comatose, or even dead, brain," Anastassiou explains.

Previously, neurobiologists assumed that the fields were capable of affecting -- and even controlling -- neural activity only during severe pathological conditions such as epileptic seizures, which induce very strong fields. Few studies, however, had actually assessed the impact of far weaker -- but very common -- non-epileptic fields. "The reason is simple," Anastassiou says. "It is very hard to conduct an in vivo experiment in the absence of extracellular fields," to observe what changes when the fields are not around.

To tease out those effects, Anastassiou and his colleagues, including Caltech neuroscientist Christof Koch, the Lois and Victor Troendle Professor of Cognitive and Behavioral Biology and professor of computation and neural systems, focused on strong but slowly oscillating fields, called local field potentials (LFP), that arise from neural circuits composed of just a few rat brain cells. Measuring those fields and their effects required positioning a cluster of tiny electrodes within a volume equivalent to that of a single cell body -- and at distances of less than 50 millionths of a meter from one another.

"Because it had been so hard to position that many electrodes within such a small volume of brain tissue, the findings of our research are truly novel," Anastassiou says. Previously, he explains, "nobody had been able to attain this level of spatial and temporal resolution."

An "unexpected and surprising finding was how already very weak extracellular fields can alter neural activity," he says. "For example, we observed that fields as weak as one millivolt per millimeter robustly alter the firing of individual neurons, and increase the so-called "spike-field coherence" -- the synchronicity with which neurons fire with relationship to the field."In the mammalian brain, we know that extracellular fields may easily exceed two to three millivolts per millimeter. Our findings suggest that under such conditions, this effect becomes significant."

What does that mean for brain computation? "Neuroscientists have long speculated about this," Anastassiou says. "Increased spike-field coherency may substantially enhance the amount of information transmitted between neurons as well as increase its reliability. Moreover, it has been long known that brain activity patterns related to memory and navigation give rise to a robust LFP and enhanced spike-field coherency. We believe ephaptic coupling does not have one major effect, but instead contributes on many levels during intense brain processing."

Can external electric fields have similar effects on the brain? "This is an interesting question," Anastassiou says. "Indeed, physics dictates that any external field will impact the neural membrane. Importantly, though, the effect of externally imposed fields will also depend on the brain state. One could think of the brain as a distributed computer -- not all brain areas show the same level of activation at all times.

"Whether an externally imposed field will impact the brain also depends on which brain area is targeted. During epileptic seizures, pathological fields can be as strong as 100 millivolts per millimeter¬ -- such fields strongly entrain neural firing and give rise to super-synchronized states." And that, he adds, suggests that electric field activity -- even from external fields -- in certain brain areas, during specific brain states, may have strong cognitive and behavioral effects.

Ultimately, Anastassiou, Koch, and their colleagues would like to test whether ephaptic coupling affects human cognitive processing, and under which circumstances. "I firmly believe that understanding the origin and functionality of endogenous brain fields will lead to several revelations regarding information processing at the circuit level, which, in my opinion, is the level at which percepts and concepts arise," Anastassiou says. "This, in turn, will lead us to address how biophysics gives rise to cognition in a mechanistic manner -- and that, I think, is the holy grail of neuroscience."

The work in the paper was supported by the Engineering and Physical Sciences Research Council, the Sloan-Swartz Foundation, the Swiss National Science Foundation, EU Synapse, the National Science Foundation, the Mathers Foundation, and the National Research Foundation of Korea.



Thursday, January 20, 2011

New Reactor to Make Fuel from Sunlight


Using a common metal most famously found in self-cleaning ovens, Sossina Haile hopes to change our energy future. The metal is cerium oxide -- or ceria -- and it is the centerpiece of a promising new technology developed by Haile and her colleagues that concentrates solar energy and uses it to efficiently convert carbon dioxide and water into fuels.
Sossina Haile and William Chueh stand next to the benchtop thermochemical reactor used to screen materials for implementation on the solar reactor. (Credit: Courtesy of Caltech)

Solar energy has long been touted as the solution to our energy woes, but while it is plentiful and free, it can't be bottled up and transported from sunny locations to the drearier -- but more energy-hungry -- parts of the world. The process developed by Haile -- a professor of materials science and chemical engineering at the California Institute of Technology (Caltech) -- and her colleagues could make that possible.

The researchers designed and built a two-foot-tall prototype reactor that has a quartz window and a cavity that absorbs concentrated sunlight. The concentrator works "like the magnifying glass you used as a kid" to focus the sun's rays, says Haile.

At the heart of the reactor is a cylindrical lining of ceria. Ceria -- a metal oxide that is commonly embedded in the walls of self-cleaning ovens, where it catalyzes reactions that decompose food and other stuck-on gunk -- propels the solar-driven reactions. The reactor takes advantage of ceria's ability to "exhale" oxygen from its crystalline framework at very high temperatures and then "inhale" oxygen back in at lower temperatures.

"What is special about the material is that it doesn't release all of the oxygen. That helps to leave the framework of the material intact as oxygen leaves," Haile explains. "When we cool it back down, the material's thermodynamically preferred state is to pull oxygen back into the structure."

Specifically, the inhaled oxygen is stripped off of carbon dioxide (CO2) and/or water (H2O) gas molecules that are pumped into the reactor, producing carbon monoxide (CO) and/or hydrogen gas (H2). H2 can be used to fuel hydrogen fuel cells; CO, combined with H2, can be used to create synthetic gas, or "syngas," which is the precursor to liquid hydrocarbon fuels. Adding other catalysts to the gas mixture, meanwhile, produces methane. And once the ceria is oxygenated to full capacity, it can be heated back up again, and the cycle can begin anew.
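The two-step cycle described above can be written out in the usual non-stoichiometric ceria notation, where delta is the small amount of oxygen the lattice gives up. This is the standard formulation consistent with the article's description, not equations reproduced from the paper itself.

```latex
% Two-step solar thermochemical cycle on ceria (standard formulation,
% consistent with the description in the article). Requires amsmath.

% 1) High-temperature solar step: ceria "exhales" oxygen.
\[
  \mathrm{CeO_2} \;\xrightarrow{\;\sim 1650\,^{\circ}\mathrm{C}\;}\;
  \mathrm{CeO_{2-\delta}} + \tfrac{\delta}{2}\,\mathrm{O_2}
\]

% 2) Lower-temperature step: reduced ceria "inhales" oxygen from CO2
%    and/or H2O, releasing CO and/or H2 (the components of syngas).
\[
  \mathrm{CeO_{2-\delta}} + \delta\,\mathrm{CO_2} \;\longrightarrow\;
  \mathrm{CeO_2} + \delta\,\mathrm{CO},
  \qquad
  \mathrm{CeO_{2-\delta}} + \delta\,\mathrm{H_2O} \;\longrightarrow\;
  \mathrm{CeO_2} + \delta\,\mathrm{H_2}
\]
```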

For all of this to work, the temperatures in the reactor have to be very high -- nearly 3,000 degrees Fahrenheit. At Caltech, Haile and her students achieved such temperatures using electrical furnaces. But for a real-world test, she says, "we needed to use photons, so we went to Switzerland." At the Paul Scherrer Institute's High-Flux Solar Simulator, the researchers and their collaborators -- led by Aldo Steinfeld of the institute's Solar Technology Laboratory -- installed the reactor on a large solar simulator capable of delivering the heat of 1,500 suns.

In experiments conducted last spring, Haile and her colleagues achieved the highest rates of CO2 dissociation yet recorded, better "by orders of magnitude," she says. The efficiency of the reactor was uncommonly high for CO2 splitting, in part, she says, "because we're using the whole solar spectrum, and not just particular wavelengths." And unlike in electrolysis, the rate is not limited by the low solubility of CO2 in water. Furthermore, Haile says, the high operating temperatures of the reactor mean that fast catalysis is possible, without the need for expensive and rare metal catalysts (cerium, in fact, is the most common of the rare earth metals -- about as abundant as copper).

In the short term, Haile and her colleagues plan to tinker with the ceria formulation so that the reaction temperature can be lowered, and to re-engineer the reactor, to improve its efficiency. Currently, the system harnesses less than 1% of the solar energy it receives, with most of the energy lost as heat through the reactor's walls or by re-radiation through the quartz window. "When we designed the reactor, we didn't do much to control these losses," says Haile. Thermodynamic modeling by lead author and former Caltech graduate student William Chueh suggests that efficiencies of 15% or higher are possible.

Ultimately, Haile says, the process could be adopted in large-scale energy plants, allowing solar-derived power to be reliably available during the day and night. The CO2 emitted by vehicles could be collected and converted to fuel, "but that is difficult," she says. A more realistic scenario might be to take the CO2 emissions from coal-powered electric plants and convert them to transportation fuels. "You'd effectively be using the carbon twice," Haile explains. Alternatively, she says, the reactor could be used in a "zero CO2 emissions" cycle: H2O and CO2 would be converted to methane, which would then fuel electricity-producing power plants that generate more CO2 and H2O, keeping the process going.

The work was funded by the National Science Foundation, the State of Minnesota Initiative for Renewable Energy and the Environment, and the Swiss National Science Foundation.

Tuesday, July 20, 2010

Of Bugs and Brains: Gut Bacteria Affect Multiple Sclerosis


Biologists at the California Institute of Technology (Caltech) have demonstrated a connection between multiple sclerosis (MS) -- an autoimmune disorder that affects the brain and spinal cord -- and gut bacteria.
In the absence of bacteria in the intestines, pro-inflammatory Th17 cells do not develop in either the gut or the central nervous system, and animals do not develop disease (top panel). When animals are colonized with symbiotic segmented filamentous bacteria, Th17 cell differentiation is induced in the gut. Th17 cells promote experimental autoimmune encephalomyelitis, an animal model for multiple sclerosis. In this way, non-pathogenic bacteria of the microbiota promote disease by shaping the immune response in both the gut and the brain (bottom panel). (Credit: Lee, Mazmanian/Caltech; modified from Savidge TC et al. Laboratory Investigation 2007)

The work -- led by Sarkis K. Mazmanian, an assistant professor of biology at Caltech, and postdoctoral scholar Yun Kyung Lee -- appears online the week of July 19-23 in the Proceedings of the National Academy of Sciences.

Multiple sclerosis results from the progressive deterioration of the protective fatty myelin sheath surrounding nerve cells. The loss of myelin hinders nerve cells from communicating with one another, leading to a host of neurological symptoms including loss of sensation, muscle spasms and weakness, fatigue, and pain. Multiple sclerosis is estimated to affect about half a million people in the United States alone, with rates of diagnosis rapidly increasing. There is currently no cure for MS.

Although the cause of MS is unknown, microorganisms seem to play some sort of role. "In the literature from clinical studies, there are papers showing that microbes affect MS," Mazmanian says. "For example, the disease gets worse after viral infections, and bacterial infections cause an increase in MS symptoms."

On the other hand, he concedes, "it seems counterintuitive that a microbe would be involved in a disease of the central nervous system, because these are sterile tissues."

And yet, as Mazmanian found when he began examining the multiple sclerosis literature, the suggestion of a link between bacteria and the disease is more than anecdotal. Notably, back in 1993, Caltech biochemist Leroy Hood -- who was then at the University of Washington -- published a paper describing a genetically engineered strain of mouse that developed a lab-induced form of multiple sclerosis known as experimental autoimmune encephalomyelitis, or EAE.

When Hood's animals were housed at Caltech, they developed the disease. But, oddly, when the mice were shipped to a cleaner biotech facility -- where their resident gut bacterial populations were reduced -- they didn't get sick. The question was, why? At the time, Mazmanian says, "the authors speculated that some environmental component was modulating MS in these animals." Just what that environmental component was, however, remained a mystery for almost two decades.

But Mazmanian -- whose laboratory examines the relationships between gut microbes, both harmful and helpful, and the immune systems of their mammalian hosts -- had a hunch that intestinal bacteria were the key. "As we gained an appreciation for how profoundly the gut microbiota can affect the immune system, we decided to ask if symbiotic bacteria are the missing variable in these mice with MS," he says.

To find out, Mazmanian and his colleagues tried to induce MS in animals that were completely devoid of the microbes that normally inhabit the digestive system. "Lo and behold, these sterile animals did not get sick," he says.

Then the researchers decided to see what would happen if bacteria were reintroduced to the germ-free mice. But not just any bacteria. They inoculated mice with one specific organism, an unculturable bug from a group known as segmented filamentous bacteria. In prior studies, these bacteria had been shown to lead to intestinal inflammation and, more intriguingly, to induce in the gut the appearance of a particular immune-system cell known as Th17. Th17 cells are a type of T helper cell -- cells that help activate and direct other immune system cells. Furthermore, Th17 cells induce the inflammatory cascade that leads to multiple sclerosis in animals.

"The question was, if this organism is inducing Th17 cells in the gut, will it be able to do so in the brain and central nervous system?" Mazmanian says. "Furthermore, with that one organism, can we restore to sterile animals the entire inflammatory response normally seen in animals with hundreds of species of gut bacteria?"

The answer? Yes on all counts. Giving the formerly germ-free mice a dose of one species of segmented filamentous bacteria induced Th17 not only in the gut but in the central nervous system and brain -- and caused the formerly healthy mice to become ill with MS-like symptoms.

"It definitely shows that gut microbes have a strong role in MS, because the genetics of the animals were the same. In fact, everything was the same except for the presence of those otherwise benign bacteria, which are clearly playing a role in shaping the immune system," Mazmanian says. "This study shows for the first time that specific intestinal bacteria have a significant role in affecting the nervous system during MS -- and they do so from the gut, an anatomical location very, very far from the brain."

Mazmanian and his colleagues don't, however, suggest that gut bacteria are the direct cause of multiple sclerosis, which is known to be genetically linked. Rather, the bacteria may be helping to shape the immune system's inflammatory response, thus creating conditions that could allow the disease to develop. Indeed, multiple sclerosis also has a strong environmental component; identical twins, who possess the same genome and share all of their genes, only have a 25 percent chance of sharing the disease. "We would like to suggest that gut bacteria may be the missing environmental component," he says.

For their part, Th17 cells are needed for the immune system to properly combat infection. Problems only arise when the cells are activated in the absence of infection -- just as disease can arise, Mazmanian and others suspect, when the species composition of gut bacteria becomes imbalanced, say, by changes in diet, because of improved hygiene (which kills off the beneficial bacteria as well as the dangerous ones), or because of stress or antibiotic use. One impact of the dysregulation of normal gut bacterial populations -- a phenomenon dubbed "dysbiosis" -- may be the rising rate of multiple sclerosis seen in recent years in more hygienic societies.

"As we live cleaner, we're not just changing our exposure to infectious agents, but we're changing our relationship with the entire microbial world, both around and inside us, and we may be altering the balance between pro- and anti-inflammatory bacteria," leading to diseases like MS, Mazmanian says. "Perhaps treatments for diseases such as multiple sclerosis may someday include probiotic bacteria that can restore normal immune function in the gut… and the brain."

The work was supported by funding from the California Institute of Technology, the Weston Havens Foundation, and the Edward Mallinckrodt, Jr. Foundation.

Monday, July 5, 2010

Coolest Stars Come out of the Dark: Spitzer Spies Frigid Brown Dwarfs


Astronomers have uncovered what appear to be 14 of the coldest stars known in our universe. These failed stars, called brown dwarfs, are so cold and faint that they'd be impossible to see with current visible-light telescopes. Spitzer's infrared vision was able to pick out their feeble glow, much as a firefighter uses infrared goggles to find hot spots buried underneath a dark forest floor.

This artist's concept shows simulated data predicting the hundreds of failed stars, or brown dwarfs, that NASA's Wide-field Infrared Survey Explorer (WISE) is expected to add to the population of known stars in our solar neighborhood. (Credit: AMNH/UCB/NASA/JPL-Caltech)


The brown dwarfs join only a handful of similar objects previously discovered. The new objects have temperatures between about 450 and 600 Kelvin (350 to 620 degrees Fahrenheit). As far as stars go, this is bitter cold -- as cold, in some cases, as planets around other stars.
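As a quick sanity check of the temperature figures quoted in this article (and the T-dwarf threshold mentioned further down), the Kelvin-to-Fahrenheit conversion works out as follows:

```python
# Quick check of the Kelvin-to-Fahrenheit figures quoted in this article.
def kelvin_to_fahrenheit(k: float) -> float:
    return k * 9.0 / 5.0 - 459.67

for k in (450, 600, 1500):
    print(f"{k} K = {kelvin_to_fahrenheit(k):.0f} F")
# 450 K ~ 350 F and 600 K ~ 620 F (the newly found brown dwarfs);
# 1500 K ~ 2240 F (the upper bound quoted for the T-dwarf class below)
```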

These cool orbs have remained elusive for years, but will soon start coming out of the dark in droves. NASA's Wide-field Infrared Survey Explorer (WISE) mission, which is now scanning the entire sky at infrared wavelengths, is expected to find hundreds of objects of a similarly chilly disposition, if not even colder. WISE is searching a volume of space 40 times larger than that sampled in the recent Spitzer study, which concentrated on a region in the constellation Boötes. The Spitzer mission is designed to look at targeted patches of sky in detail, while WISE is combing the whole sky.

"WISE is looking everywhere, so the coolest brown dwarfs are going to pop up all around us," said Peter Eisenhardt, the WISE project scientist at NASA's Jet Propulsion Laboratory, Pasadena, Calif., and lead author of a recent paper in the Astronomical Journal on the Spitzer discoveries. "We might even find a cool brown dwarf that is closer to us than Proxima Centauri, the closest known star."

Brown dwarfs form like stars out of collapsing balls of gas and dust, but they are puny in comparison, never collecting enough mass to ignite nuclear fusion and shine with starlight. The smallest known brown dwarfs are about 5 to 10 times the mass of our planet Jupiter -- that's as massive as some known gas-giant planets around other stars. Brown dwarfs start out with a bit of internal heat left over from their formation, but with age, they cool down. The first confirmed brown dwarf was announced in 1995.

"Brown dwarfs are like planets in some ways, but they are in isolation," said astronomer Daniel Stern, co-author of the Spitzer paper at JPL. "This makes them exciting for astronomers -- they are the perfect laboratories to study bodies with planetary masses."

Most of the new brown dwarfs found by Spitzer are thought to belong to the coolest known class of brown dwarfs, called T dwarfs, which are defined as being less than about 1,500 Kelvin (2,240 degrees Fahrenheit). One of the objects appears to be so cold that it may even be a long-sought Y dwarf -- a proposed class of even colder stars. The T and Y classes are part of a larger system categorizing all stars; for example, the hottest, most massive stars are O stars; our sun is a G star.

"Models indicate there may be an entirely new class of stars out there, the Y dwarfs, that we haven't found yet," said co-author Davy Kirkpatrick, a co-author of the study and a member of the WISE science team at the California Institute of Technology, Pasadena, Calif. "If these elusive objects do exist, WISE will find them." Kirkpatrick is a world expert in brown dwarfs -- he came up with L, T and Y classifications for the cooler stars.

Kirkpatrick says that it's possible that WISE could find an icy, Neptune-sized or bigger object in the far reaches of our solar system -- thousands of times farther from the sun than Earth. There is some speculation amongst scientists that such a cool body, if it exists, could be a brown dwarf companion to our sun. This hypothetical object has been nicknamed "Nemesis."

"We are now calling the hypothetical brown dwarf Tyche instead, after the benevolent counterpart to Nemesis," said Kirkpatrick. "Although there is only limited evidence to suggest a large body in a wide, stable orbit around the sun, WISE should be able to find it, or rule it out altogether."

The 14 objects found by Spitzer are hundreds of light-years away -- too far away and faint for ground-based telescopes to see and confirm with a method called spectroscopy. But their presence implies that there are a hundred or more within only 25 light-years of our sun. Because WISE is looking everywhere, it will find these missing orbs, which will be close enough to confirm with spectroscopy. It's possible that WISE will even find more brown dwarfs within 25 light-years of the sun than the number of stars known to exist in this space.

"WISE is going to transform our view of the solar neighborhood," said Eisenhardt. We'll be studying these new neighbors in minute detail -- they may contain the nearest planetary system to our own."

Other authors of the Spitzer paper are Roger Griffith and Amy Mainzer of JPL; Ned Wright, A.M. Ghez and Quinn Konopacky of UCLA; Matthew Ashby and Mark Brodwin of the Harvard-Smithsonian Center for Astrophysics, Cambridge, Mass.; Michael Brown of Monash University, Australia; R.S. Bussmann of the University of Arizona, Tucson; Arjun Dey of the National Optical Astronomy Observatory, Tucson, Ariz.; Eilat Glikman of Caltech; Anthony Gonzalez and David Vollbach of the University of Florida, Gainesville; and Shelley Wright of the University of California, Berkeley.

NASA's Jet Propulsion Laboratory, Pasadena, Calif., manages the Spitzer Space Telescope mission for NASA's Science Mission Directorate, Washington. Science operations are conducted at the Spitzer Science Center at the California Institute of Technology in Pasadena. Caltech manages JPL for NASA.

JPL manages the Wide-field Infrared Survey Explorer for NASA's Science Mission Directorate, Washington. The principal investigator, Edward Wright, is at UCLA. The mission was competitively selected under NASA's Explorers Program managed by the Goddard Space Flight Center, Greenbelt, Md. The science instrument was built by the Space Dynamics Laboratory, Logan, Utah, and the spacecraft was built by Ball Aerospace & Technologies Corp., Boulder, Colo. Science operations and data processing take place at the Infrared Processing and Analysis Center at the California Institute of Technology in Pasadena. Caltech manages JPL for NASA.

For more information about Spitzer, visit http://spitzer.caltech.edu/ and http://www.nasa.gov/spitzer. More information about WISE is online at http://wise.astro.ucla.edu and http://www.nasa.gov/wise.

Monday, November 30, 2009

Scientists Explain Puzzling Lake Asymmetry on Saturn's Moon Titan


Researchers at the California Institute of Technology (Caltech) suggest that the eccentricity of Saturn's orbit around the sun may be responsible for the unusually uneven distribution of methane and ethane lakes over the northern and southern polar regions of the planet's largest moon, Titan. On Earth, similar "astronomical forcing" of climate drives ice-age cycles.

This image shows the northern and southern hemispheres of Titan, highlighting the disparity between the abundance of lakes in the north and their paucity in the south. The hypothesis presented favors a long-term flux of volatile hydrocarbons, predominantly methane, from hemisphere to hemisphere. Recently the direction of transport has been from south to north, but the effect would have reversed tens of thousands of years ago. (Credit: the mosaic includes Cassini SAR, ISS, and VIS images; NASA/JPL/Caltech/University of Arizona/Cassini Imaging Team)