Wednesday, March 31, 2010

Large Hadron Collider: Beams Colliding at Record Energies Mark Start of Research Program


Beams collided at 7 trillion (10¹²) electron volts (or 7 tera electron volts -- TeV) in the Large Hadron Collider on March 30 at 13:06 Central European Summer Time (CEST) at CERN, marking the start of the LHC research programme. Particle physicists around the world are looking forward to a potentially rich harvest of new physics as the LHC begins its first long run at an energy three and a half times higher than previously achieved at a particle accelerator.
Large Hadron Collider
An event at the Large Hadron Collider beauty experiment (LHCb), which physicists are carrying out for precise measurements of CP violation and rare decays. (Credit: Copyright CERN)
"It's a great day to be a particle physicist," said CERN Director General Rolf Heuer. "A lot of people have waited a long time for this moment, but their patience and dedication is starting to pay dividends."

"With these record-shattering collision energies, the LHC experiments are propelled into a vast region to explore, and the hunt begins for dark matter, new forces, new dimensions and the Higgs boson," said ATLAS collaboration spokesperson, Fabiola Gianotti. "The fact that the experiments have published papers already on the basis of last year's data bodes very well for this first physics run."

"We've all been impressed with the way the LHC has performed so far," said Guido Tonelli, spokesperson of the CMS experiment, "and it's particularly gratifying to see how well our particle detectors are working while our physics teams worldwide are already analysing data. We'll address soon some of the major puzzles of modern physics like the origin of mass, the grand unification of forces and the presence of abundant dark matter in the universe. I expect very exciting times in front of us."

"This is the moment we have been waiting and preparing for," said ALICE spokesperson Jürgen Schukraft. "We're very much looking forward to the results from proton collisions, and later this year from lead-ion collisions, to give us new insights into the nature of the strong interaction and the evolution of matter in the early Universe."

"LHCb is ready for physics," said the experiment's spokesperson Andrei Golutvin, "we have a great research programme ahead of us exploring the nature of matter-antimatter asymmetry more profoundly than has ever been done before."

CERN will run the LHC for 18-24 months with the objective of delivering enough data to the experiments to make significant advances across a wide range of physics channels. As soon as they have "re-discovered" the known Standard Model particles, a necessary precursor to looking for new physics, the LHC experiments will start the systematic search for the Higgs boson. With the amount of data expected, called one inverse femtobarn by physicists, the combined analysis of ATLAS and CMS will be able to explore a wide mass range, and there's even a chance of discovery if the Higgs has a mass near 160 GeV. If it's much lighter or very heavy, it will be harder to find in this first LHC run.
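To put the "one inverse femtobarn" figure in context, the expected number of events for any process is simply its production cross-section multiplied by the integrated luminosity. A minimal sketch, using a purely illustrative cross-section rather than any number from CERN:

    # Back-of-envelope event count: N = cross-section x integrated luminosity.
    # The 10 pb cross-section below is an assumed, illustrative value only.
    def expected_events(cross_section_pb, integrated_luminosity_inv_fb):
        # 1 inverse femtobarn of data corresponds to 1000 inverse picobarns
        return cross_section_pb * integrated_luminosity_inv_fb * 1000.0

    # With 1 fb^-1 of data, a 10 pb process would yield roughly 10,000 produced
    # events, before detector acceptance and selection efficiencies are applied.
    print(expected_events(10.0, 1.0))  # -> 10000.0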

For supersymmetry, ATLAS and CMS will each have enough data to double today's sensitivity to certain new discoveries. Experiments today are sensitive to some supersymmetric particles with masses up to 400 GeV. An inverse femtobarn at the LHC pushes the discovery range up to 800 GeV.

"The LHC has a real chance over the next two years of discovering supersymmetric particles," explained Heuer, "and possibly giving insights into the composition of about a quarter of the Universe."

Even at the more exotic end of the LHC's potential discovery spectrum, this LHC run will extend the current reach by a factor of two. LHC experiments will be sensitive to new massive particles indicating the presence of extra dimensions up to masses of 2 TeV, where today's reach is around 1 TeV.

"Over 2000 graduate students are eagerly awaiting data from the LHC experiments," said Heuer. "They're a privileged bunch, set to produce the first theses at the new high-energy frontier."

Following this run, the LHC will shut down for routine maintenance, and to complete the repairs and consolidation work needed to reach the LHC's design energy of 14 TeV following the incident of 19 September 2008. Traditionally, CERN has operated its accelerators on an annual cycle, running for seven to eight months with a four to five month shutdown each year. Being a cryogenic machine operating at very low temperature, the LHC takes about a month to bring up to room temperature and another month to cool down. A four-month shutdown as part of an annual cycle no longer makes sense for such a machine, so CERN has decided to move to a longer cycle with longer periods of operation accompanied by longer shutdown periods when needed.

"Two years of continuous running is a tall order both for the LHC operators and the experiments, but it will be well worth the effort," said Heuer. "By starting with a long run and concentrating preparations for 14 TeV collisions into a single shutdown, we're increasing the overall running time over the next three years, making up for lost time and giving the experiments the chance to make their mark."

Tuesday, March 30, 2010

Moral Judgments Can Be Altered: Neuroscientists Influence People’s Moral Judgments by Disrupting Specific Brain Region


MIT neuroscientists have shown they can influence people's moral judgments by disrupting a specific brain region -- a finding that helps reveal how the brain constructs morality.

Moral Judgments
In a new study, researchers disrupted activity in the right temporo-parietal junction by inducing a current in the brain using a magnetic field applied to the scalp. They found that the subjects' ability to make moral judgments that require an understanding of other people's intentions -- for example, a failed murder attempt -- was impaired. (Credit: Graphic by Christine Daniloff)


To make moral judgments about other people, we often need to infer their intentions -- an ability known as "theory of mind." For example, if a hunter shoots his friend while on a hunting trip, we need to know what the hunter was thinking: Was he secretly jealous, or did he mistake his friend for a duck?

Previous studies have shown that a brain region known as the right temporo-parietal junction (TPJ) is highly active when we think about other people's intentions, thoughts and beliefs. In the new study, the researchers disrupted activity in the right TPJ by inducing a current in the brain using a magnetic field applied to the scalp. They found that the subjects' ability to make moral judgments that require an understanding of other people's intentions -- for example, a failed murder attempt -- was impaired.

The researchers, led by Rebecca Saxe, MIT assistant professor of brain and cognitive sciences, report their findings in the Proceedings of the National Academy of Sciences.

The study offers "striking evidence" that the right TPJ, located at the brain's surface above and behind the right ear, is critical for making moral judgments, says Liane Young, lead author of the paper. It's also startling, since under normal circumstances people are very confident and consistent in these kinds of moral judgments, says Young, a postdoctoral associate in MIT's Department of Brain and Cognitive Sciences.

"You think of morality as being a really high-level behavior," she says. "To be able to apply (a magnetic field) to a specific brain region and change people's moral judgments is really astonishing."

How they did it: The researchers used a non-invasive technique known as transcranial magnetic stimulation (TMS) to selectively interfere with brain activity in the right TPJ. A magnetic field applied to a small area of the skull creates weak electric currents that impede nearby brain cells' ability to fire normally, but the effect is only temporary.

In one experiment, volunteers were exposed to TMS for 25 minutes before taking a test in which they read a series of scenarios and made moral judgments of characters' actions on a scale of 1 (absolutely forbidden) to 7 (absolutely permissible).

In a second experiment, TMS was applied in 500-millisecond bursts at the moment when the subject was asked to make a moral judgment. For example, subjects were asked to judge how permissible it is for someone to let his girlfriend walk across a bridge he knows to be unsafe, even if she ends up making it across safely. In such cases, a judgment based solely on the outcome would hold the perpetrator morally blameless, even though it appears he intended to do harm.

In both experiments, the researchers found that when the right TPJ was disrupted, subjects were more likely to judge failed attempts to harm as morally permissible. Therefore, the researchers believe that TMS interfered with subjects' ability to interpret others' intentions, forcing them to rely more on outcome information to make their judgments.
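As a rough illustration of how that shift would show up on the 1-to-7 permissibility scale described above, here is a toy sketch with entirely invented ratings, not the study's data:

    # Invented example ratings for "failed attempt to harm" scenarios
    # (1 = absolutely forbidden, 7 = absolutely permissible).
    ratings_without_tms = [2, 1, 2, 3, 2, 1, 2]  # intention-based judgments
    ratings_with_tms = [4, 5, 4, 3, 5, 4, 4]     # judgments leaning on the harmless outcome

    def mean(values):
        return sum(values) / len(values)

    # Disrupting the right TPJ would appear as an upward drift toward "permissible".
    print(f"without TMS: {mean(ratings_without_tms):.1f}, with TMS: {mean(ratings_with_tms):.1f}")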

Next steps: Young is now doing a study on the role of the right TPJ in judgments of people who are morally lucky or unlucky. For example, a drunk driver who hits and kills a pedestrian is unlucky, compared to an equally drunk driver who makes it home safely, but the unlucky homicidal driver tends to be judged more morally blameworthy.

Contact Lenses Loaded With Vitamin E May Treat Glaucoma


The popular dietary supplement vitamin E, loaded into special medicated contact lenses, can keep glaucoma medicine near the eye -- where it can treat that common disease -- almost 100 times longer than possible with current commercial lenses, scientists report.

Contact lenses containing vitamin E like the one above can treat glaucoma, the second leading cause of blindness, and other eye conditions, scientists are reporting. (Credit: Anuj Chauhan, Ph.D.)
In a presentation at the 239th National Meeting of the American Chemical Society (ACS) in San Francisco, they described use of vitamin E to develop contact lenses that may deliver more medication for glaucoma and perhaps other diseases to the eye.

Anuj Chauhan, Ph.D., who headed the research team, explained that glaucoma is second only to cataracts as the leading cause of vision loss and blindness in the world. It affects almost 67 million people. Eye drops that relieve the abnormal build-up of pressure inside the eye that occurs in glaucoma are a mainstay treatment.

"The problem is within about two to five minutes of putting drops in the eye, tears carry the drug away and it doesn't reach the targeted tissue," said Chauhan, who is with the University of Florida in Gainesville. "Much of the medicine gets absorbed into the bloodstream, which carries it throughout the body where it could cause side effects. Only about one to five percent of drugs in eye drops actually reach the cornea of the eye."

Chauhan and colleagues have developed a new extended-release delivery approach incorporating vitamin E into contact lenses. The invisible clusters, or aggregates, of vitamin E molecules form what Chauhan describes as "transport barriers" that slow the elution of the glaucoma medication from the lens into the eye. The drug released from the lens into the eye stays in the tears far longer than the 2-5 minutes achieved with eye drops, leading to more effective therapy.

"These vitamin structures are like 'nano-bricks'," Chauhan said. "The drug molecules can't go through the vitamin E. They must go around it. Because the nanobricks are so much bigger than the drug molecules -- we believe about a few hundred times bigger -- the molecules get diverted and must travel a longer path. This increases the duration of the drug release from the lenses."

In research with laboratory animals, the lenses containing vitamin E nanobricks administered drugs up to 100 times longer than most commercial lenses. The lenses could be designed for continuous wear for up to a month, Chauhan said. In addition to treating glaucoma, the contacts could help other eye conditions, such as cataract and dry eye. Cataract is a clouding of the lens of the eye, and dry eye involves decreased production of tears. It affects about 2 in 10 people and can lead to more severe eye problems.
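Chauhan's "nano-brick" picture maps onto a simple diffusion argument: in diffusion-controlled release, the characteristic release time grows roughly with the square of the path the drug must travel. The sketch below is only an illustration of that scaling, with assumed thickness, diffusivity and tortuosity values, not the authors' model:

    # Minimal diffusion-scaling sketch: t ~ (effective path length)^2 / D.
    # Thickness, diffusivity and the tortuosity factor are illustrative assumptions.
    def release_time_hours(lens_thickness_um, diffusivity_um2_per_s, tortuosity):
        effective_path_um = tortuosity * lens_thickness_um
        return (effective_path_um ** 2 / diffusivity_um2_per_s) / 3600.0

    baseline = release_time_hours(100.0, 1.0, tortuosity=1.0)
    with_nanobricks = release_time_hours(100.0, 1.0, tortuosity=10.0)
    print(with_nanobricks / baseline)  # -> 100.0, the same order as the reported ~100x slowdown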

"Vitamin E is a proven nutraceutical that in small amounts is good for the eye because of its anti-oxidant properties. Also, vitamin E in the contact lenses blocks UV radiation, providing increased protection against UV light. Our research has shown that the vitamin can be loaded into the lenses without any reduction in transparency. We believe it could be helpful in disease treatment and in prevention as well," he said.

Chauhan said that clinical trials of the new lenses could begin within one to two years.

Here is an excerpt from Chauhan's ACS presentation:

"We have developed a novel approach of extending the duration of drug release from contact lenses by including nanosized aggregates of Vitamin E in the lenses. The Vitamin E nano-aggregates force the drug molecules to travel in a tortuous path leading to increased drug release durations. Another benefit of Vitamin E incorporation is that Vitamin E is known to be an anti-oxidant, whose slow release from lenses could also help in prevention of ophthalmic diseases like cataract and glaucoma. Furthermore, Vitamin E blocks UV radiation, leading to reduced ocular damage from the UV light. Our research has shown that Vitamin E can be loaded into the lenses without any reduction in transparency. The drug release durations from Vitamin E loaded lenses are about 100 times longer than from commercial lenses for several ophthalmic drugs including the glaucoma drug timolol, the anti-inflammatory drug dexamethasone, and the anti-fungal drug fluconazole. Thus, Vitamin E loaded lenses could be highly effective in synergistic prevention and treatment of ophthalmic diseases through extended delivery of the desired drugs and the nutraceutical Vitamin E. Animal studies in beagle dogs are ongoing to explore glaucoma treatment through Vitamin E laden contact lenses."

Monday, March 29, 2010

Individual Light Atoms, Such as Carbon and Oxygen, Identified With New Microscope


Using the latest in aberration-corrected electron microscopy, researchers at the Department of Energy's Oak Ridge National Laboratory and their colleagues have obtained the first images that distinguish individual light atoms such as boron, carbon, nitrogen and oxygen.

Z-contrast scanning electron transmission microscope image
Individual boron and nitrogen atoms are clearly distinguished by their intensity in this Z-contrast scanning electron transmission microscope image from Oak Ridge National Laboratory. Each single hexagonal ring of the boron-nitrogen structure, for instance the one marked by the green circle in the figure a, consists of three brighter nitrogen atoms and three darker boron atoms. The lower (b) image is corrected for distortion. (Credit: Department of Energy, Oak Ridge National Laboratory)
The ORNL images were obtained with a Z-contrast scanning transmission electron microscope (STEM). Individual atoms of carbon, boron, nitrogen and oxygen--all of which have low atomic numbers--were resolved on a single-layer boron nitride sample.

"This research marks the first instance in which every atom in a significant part of a non-periodic material has been imaged and chemically identified," said Materials Science and Technology Division researcher Stephen Pennycook. "It represents another accomplishment of the combined technologies of Z-contrast STEM and aberration correction."

Pennycook and ORNL colleague Matthew Chisholm were joined by a team that includes Sokrates Pantelides, Mark Oxley and Timothy Pennycook of Vanderbilt University and ORNL; Valeria Nicolosi at the United Kingdom's Oxford University; and Ondrej Krivanek, George Corbin, Niklas Dellby, Matt Murfitt, Chris Own and Zoltan Szilagyi of Nion Company, which designed and built the microscope. The team's Z-contrast STEM analysis is described in an article published March 25 in the journal Nature.

The new high-resolution imaging technique enables materials researchers to analyze, atom by atom, the molecular structure of experimental materials and discern structural defects in those materials. Defects introduced into a material--for example, the placement of an impurity atom or molecule in the material's structure--are often responsible for the material's properties.

The group analyzed a monolayer hexagonal boron nitride sample prepared at Oxford University and was able to find and identify three types of atomic substitutions--carbon atoms substituting for boron, carbon substituting for nitrogen and oxygen substituting for nitrogen. Boron, carbon, nitrogen and oxygen have atomic numbers--or Z values-- of five, six, seven and eight, respectively.
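The reason these light elements can be told apart is that the annular dark-field signal brightens steeply with atomic number. A rough sketch follows; the Z^1.7 exponent is a commonly quoted approximation assumed here for illustration, not a value taken from the paper:

    # Relative annular dark-field intensity, assuming it scales roughly as Z**1.7.
    EXPONENT = 1.7
    for element, z in [("boron", 5), ("carbon", 6), ("nitrogen", 7), ("oxygen", 8)]:
        relative_intensity = (z / 5) ** EXPONENT  # normalised to boron
        print(f"{element:8s} Z={z}  relative intensity ~ {relative_intensity:.2f}")
    # Each unit step in Z gives roughly a 25-36% jump in brightness, enough to
    # identify the species of each individual atom in the image.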

The annular dark field analysis experiments were performed on a 100-kilovolt Nion UltraSTEM microscope optimized for low-voltage operation at 60 kilovolts.

Aberration correction, in which distortions and artifacts caused by lens imperfections and environmental effects are computationally filtered and corrected, was conceived decades ago but only relatively recently made possible by advances in computing. Aided by the technology, ORNL's Electron Microscopy group set a resolution record in 2004 with the laboratory's 300-kilovolt STEM.

The recent advance comes at a much lower voltage, for a reason.

"Operating at 60 kilovolts allows us to avoid atom-displacement damage to the sample, which is encountered with low Z-value atoms above about 80 kilovolts," Pennycook said. "You could not perform this experiment with a 300-kilovolt STEM."

Armed with the high-resolution images, materials, chemical and nanoscience researchers and theorists can design more accurate computational simulations to predict the behavior of advanced materials, which are key to meeting research challenges that include energy storage and energy efficient technologies.

The research was funded by the DOE Office of Science.

Sunday, March 28, 2010

HTC EVO 4G: Better Than the Nexus One?


Sprint's new HTC EVO 4G smartphone is being hailed as the new ruler of the Android empire. But has the crown really been passed?

The HTC EVO 4G, unveiled at the CTIA Wireless exhibition this week, sure has a feature-list fit for a king. The phone boasts a 4.3-inch capacitive touchscreen with HDMI output, dual front- and back-facing cameras, and a superspeedy 1GHz Snapdragon processor. Oh yeah -- and there's that whole 4G thing, too.


HTC EVO 4G vs. Nexus One: The Display

It's hard to miss all the gushing over the HTC EVO 4G's display, and there's a reason for the excitement: The phone has one sweet screen, and you don't have to be an Android fanboy to see that. The EVO 4G's 4.3-inch display beats the Nexus One's 3.7-inch offering (which beat practically everything else back when it debuted). Both devices feature the same WVGA resolution: 800-by-480.
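Since both screens pack the same 800-by-480 pixels, the larger panel is actually slightly less sharp per inch. A quick check using the standard pixels-per-inch formula; these densities are computed here, not figures quoted by either manufacturer:

    import math

    # Pixels per inch from resolution and diagonal size: ppi = sqrt(w^2 + h^2) / diagonal.
    def ppi(width_px, height_px, diagonal_inches):
        return math.hypot(width_px, height_px) / diagonal_inches

    print(f"HTC EVO 4G: {ppi(800, 480, 4.3):.0f} ppi")  # ~217 ppi
    print(f"Nexus One:  {ppi(800, 480, 3.7):.0f} ppi")  # ~252 ppi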

HTC EVO 4G vs. Nexus One: The Data Network

Sprint's biggest selling point with the HTC EVO 4G is all about those final two characters. A 4G data connection, according to Sprint, brings you download speeds as much as 10 times faster than what you'd get on a flimsy old 3G alternative.

But -- and this is a big but (you're welcome, Sir Mix-a-Lot) -- you won't be able to get those tasty 4G connections in much of the country. So far, Sprint's 4G network is available only in 27 U.S. cities. The carrier has plans to expand to a handful of other major markets later this year, but that still leaves everyone else with that aforementioned flimsy old 3G.

Plus, the EVO 4G will be available only on Sprint -- so if you're in an area where network coverage is spotty, you'll be out of luck. The Nexus One, on the other hand, will soon be available on all major carriers, giving you greater choice in the data-providing department.

Which phone wins this category, then, truly depends on where you are and how the carriers' coverage compares for your specific area.

HTC EVO 4G vs. Nexus One: The Hardware

The HTC EVO 4G is powered by the same chip as the Nexus -- that snazzy-sounding 1GHz Snapdragon processor -- so there's a virtual tie in that department.

When it comes to cameras, the HTC EVO 4G is victorious: Its back has an 8-megapixel camera and its front features a 1.3-megapixel one. The Nexus One, in comparison, has a single 5-megapixel photo-snapper.

HTC EVO 4G vs. Nexus One: The Body

The HTC EVO 4G is slightly larger than its Google-endorsed cousin (4.8-by-2.6-by-0.5 inches, compared to 4.69-by-2.35-by-0.45 inches). It's about 1.4 ounces heavier, too.

A deal-breaker? Unless you're Thumbelina, probably not. 


HTC EVO 4G vs. Nexus One: The OS

Both the HTC EVO 4G and the Nexus One are running Android 2.1, the latest version of Google's mobile operating system. Despite the matching versions, however, the user experience will be quite different on the two phones.

The reason is that the HTC EVO 4G runs HTC's Sense user interface, while the Nexus One uses the stock Android interface. The Sense interface gives Android an entirely different look, with specialized home screen widgets and custom navigation tools. As far as which is better, it's really just a matter of personal preference.

One area where the Nexus One's setup will have a distinct advantage, though, is in future Android upgrades: Given the fact that the phone is running the stock Android interface, updating it to a new OS version will be a simple and likely delay-free process (the fact that the Nexus One is Google's baby probably won't hurt, either). Custom interfaces such as HTC's Sense tend to take more time to update, as the manufacturer has to rebuild the interface around the revised platform.

HTC EVO 4G vs. Nexus One: The Data Perks

Sprint is billing the HTC EVO 4G as a mobile hotspot, meaning you can connect up to eight Wi-Fi-enabled devices to the phone and use its data connection to get them on the Internet.

It's not difficult to set up tethering on any Android phone (even if some carriers may discourage it). Still, this built-in multidevice functionality is certainly a perk worth considering.

HTC EVO 4G vs. Nexus One: The Final Judgment

Ultimately, the truth is that there'll never be an end-all Android phone; it really comes down to what's right for you. Given the nature of the platform's open ecosystem, a new contender will always be right around the corner, and hyperbole-loving bloggers will always be chomping at the bit to label it the "killer" of everything else.

That, my friends, is the one thing you can count on.

Friday, March 26, 2010

Self-Healing Nuclear Reactors?


Self-repairing materials within nuclear reactors may one day become a reality as a result of research by Los Alamos National Laboratory scientists.

Diagram of mechanism. (Credit: Image courtesy of DOE/Los Alamos National Laboratory)

In a paper appearing March 26 in the journal Science, Los Alamos researchers report a surprising mechanism that allows nanocrystalline materials to heal themselves after suffering radiation-induced damage. Nanocrystalline materials are those created from nanosized particles, in this case copper particles. A single nanosized particle -- called a grain -- is the size of a virus or even smaller. Nanocrystalline materials consist of a mixture of grains and the interface between those grains, called grain boundaries.

When designing nuclear reactors or the materials that go into them, one of the key challenges is finding materials that can withstand an outrageously extreme environment. In addition to constant bombardment by radiation, reactor materials may be subjected to extremes in temperature, physical stress, and corrosive conditions. Exposure to high radiation alone produces significant damage at the nanoscale.

Radiation can cause individual atoms or groups of atoms to be jarred out of place. Each vagrant atom becomes known as an interstitial. The empty space left behind by the displaced atom is known as a vacancy. Consequently, every interstitial created also creates one vacancy. As these defects -- the interstitials and vacancies -- build up over time in a material, effects such as swelling, hardening or embrittlement can manifest in the material and lead to catastrophic failure.

Therefore, designing materials that can withstand radiation-induced damage is very important for improving the reliability, safety and lifespan of nuclear energy systems.

Because nanocrystalline materials contain a large fraction of grain boundaries -- which are thought to act as sinks that absorb and remove defects -- scientists have expected that these materials should be more radiation tolerant than their larger-grain counterparts. Nevertheless, the ability to predict the performance of nanocrystalline materials in extreme environments has been severely lacking because specific details of what occurs within solids are very complex and difficult to visualize.

Recent computer simulations by the Los Alamos researchers help explain some of those details.

In the Science paper, the researchers describe the never-before-observed phenomenon of a "loading-unloading" effect at grain boundaries in nanocrystalline materials. This loading-unloading effect allows for effective self-healing of radiation-induced defects. Using three different computer simulation methods, the researchers looked at the interaction between defects and grain boundaries on time scales ranging from picoseconds to microseconds (one-trillionth of a second to one-millionth of a second).

On the shorter timescales, radiation-damaged materials underwent a "loading" process at the grain boundaries, in which interstitial atoms became trapped -- or loaded -- into the grain boundary. Under these conditions, vacancies accumulated in the bulk in far greater numbers than they would have in a material without such a boundary. After trapping interstitials, the grain boundary later "unloaded" them back toward the nearby vacancies, where interstitial and vacancy recombined. In so doing, the process annihilates both types of defects -- healing the material.
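A toy bookkeeping model, offered here only as an illustration of the loading-unloading idea and not as a stand-in for the Los Alamos simulations, shows how fast trapping followed by slower re-emission keeps the surviving defect count low:

    # Each radiation event creates a Frenkel pair: one interstitial and one vacancy.
    vacancies = 0
    interstitials_bulk = 0
    interstitials_gb = 0  # interstitials held ("loaded") at the grain boundary

    for step in range(1000):
        vacancies += 1
        interstitials_bulk += 1
        # fast loading: bulk interstitials migrate to the grain boundary
        interstitials_gb += interstitials_bulk
        interstitials_bulk = 0
        # slower unloading: some interstitials are re-emitted and annihilate vacancies
        emitted = min(interstitials_gb // 2, vacancies)
        interstitials_gb -= emitted
        vacancies -= emitted

    # Only a handful of defects survive out of the 1000 pairs created.
    print(vacancies, interstitials_gb)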

This unloading process was totally unexpected because grain boundaries traditionally have been regarded as places that accumulate interstitials, but not as places that release them. Although researchers found that some energy is required for this newly-discovered recombination method to operate, the amount of energy was much lower than the energies required to operate conventional mechanisms -- providing an explanation and mechanism for enhanced self-healing of radiation-induced damage.

Modeling of the "loading-unloading" role of grain boundaries helps explain previously observed counterintuitive behavior of irradiated nanocrystalline materials compared to their larger-grained counterparts. The insight provided by this work provides new avenues for further examination of the role of grain boundaries and engineered material interfaces in self-healing of radiation-induced defects. Such efforts could eventually assist or accelerate the design of highly radiation-tolerant materials for the next generation of nuclear energy applications.

The Los Alamos National Laboratory research team includes: Xian-Ming Bai, Richard G. Hoagland and Blas P. Uberuaga of the Materials Science and Technology Division; Arthur F. Voter, of the Theoretical Division; and Michael Nastasi of the Materials Physics and Applications Division.

The work was primarily sponsored by the Los Alamos Laboratory-Directed Research and Development (LDRD) program, which, at the discretion of the Laboratory Director, invests a small percentage of the Laboratory's budget in high-risk, potentially high-payoff projects to help position the Laboratory to anticipate and prepare for emerging national security challenges. The research also received specific funding through the Center for Materials under Irradiation and Mechanical Extremes, an Energy Frontier Research Center funded by the U.S. Department of Energy Office of Science, Office of Basic Energy Sciences.

Harmful Intent: Emotions Key to Judging Others


A new study from MIT neuroscientists suggests that our ability to respond appropriately to intended harms -- that is, with outrage toward the perpetrator -- is seated in a brain region associated with regulating emotions.

Brain
New research suggests that our ability to respond appropriately to intended harms -- that is, with outrage toward the perpetrator -- is seated in a brain region associated with regulating emotions. (Credit: iStockphoto/Mark Evans)
Patients with damage to this brain area, known as the ventromedial prefrontal cortex (VMPC), are unable to conjure a normal emotional response to hypothetical situations in which a person tries, but fails, to kill another person. Therefore, they judge the situation based only on the outcome, and do not hold the attempted murderer morally responsible.

The finding offers a new piece to the puzzle of how the human brain constructs morality, says Liane Young, a postdoctoral associate in MIT's Department of Brain and Cognitive Sciences and lead author of a paper describing the findings in the March 25 issue of the journal Neuron.

"We're slowly chipping away at the structure of morality," says Young. "We're not the first to show that emotions matter for morality, but this is a more precise look at how emotions matter."

How they did it: Working with researchers at the University of Southern California, led by Antonio Damasio, Young studied a group of nine patients with damage (caused by aneurysms or tumors) to the VMPC, a plum-sized area located behind and above the eyes.

Such patients have difficulty processing social emotions such as empathy or embarrassment, but "they have a perfectly intact capacity for reasoning and other cognitive functions," says Young.

The researchers gave the subjects a series of 24 hypothetical scenarios and asked for their reactions. The scenarios of most interest to the researchers were ones featuring a mismatch between the person's intention and the outcome -- either failed attempts to harm or accidental harms.

When confronted with failed attempts to harm, the patients had no problems understanding the perpetrator's intentions, but they failed to hold them morally responsible. The patients even judged attempted harms as more permissible than accidental harms (such as accidentally poisoning someone) -- a reversal of the pattern seen in normal adults.

"They can process what people are thinking and their intentions, but they just don't respond emotionally to that information," says Young. "They can read about a murder attempt and judge it as morally permissible because no harm was done."

This supports the idea that making moral judgments requires at least two processes -- a logical assessment of the intention, and an emotional reaction to it. The study also supports the theory that the emotional component is seated in the VMPC.

Next steps: Young hopes to study patients who incurred damage to the VMPC when they were younger, to see if they have the same impaired judgment. She also plans to study patient reactions to situations where the harmful attempts may be directed at the patient and therefore are more personal.

Astronomers Confirm Einstein's Theory of Relativity


A group of astronomers [1], led by Tim Schrabback of the Leiden Observatory, conducted an intensive study of over 446 000 galaxies within the COSMOS field, the result of the largest survey ever conducted with Hubble. In making the COSMOS survey, Hubble photographed 575 slightly overlapping views of the same part of the Universe using the Advanced Camera for Surveys (ACS) onboard Hubble. It took nearly 1000 hours of observations.

This image shows a smoothed reconstruction of the total (mostly dark) matter distribution in the COSMOS field, created from data taken by the NASA/ESA Hubble Space Telescope and ground-based telescopes. It was inferred from the weak gravitational lensing distortions that are imprinted onto the shapes of background galaxies. The color coding indicates the distance of the foreground mass concentrations as gathered from the weak lensing effect. Structures shown in white, cyan and green are typically closer to us than those indicated in orange and red. To improve the resolution of the map, data from galaxies both with and without redshift information were used. The new study presents the most comprehensive analysis of data from the COSMOS survey. The researchers have, for the first time ever, used Hubble and the natural "weak lenses" in space to characterise the accelerated expansion of the universe. (Credit: NASA, ESA, P. Simon (University of Bonn) and T. Schrabback (Leiden Observatory))
In addition to the Hubble data, researchers used redshift [2] data from ground-based telescopes to assign distances to 194 000 of the galaxies surveyed (out to a redshift of 5). "The sheer number of galaxies included in this type of analysis is unprecedented, but more important is the wealth of information we could obtain about the invisible structures in the Universe from this exceptional dataset," says co-author Patrick Simon from Edinburgh University.
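To give a sense of what "a redshift of 5" means, the wavelength of light from such a galaxy arrives stretched by a factor of 1 + z. A small worked example; the Lyman-alpha line at 121.6 nm is just an illustrative choice:

    # Redshift stretches wavelengths: lambda_observed = lambda_emitted * (1 + z).
    def observed_wavelength_nm(emitted_nm, z):
        return emitted_nm * (1.0 + z)

    z = 5.0  # the most distant galaxies with redshifts in this study
    print(observed_wavelength_nm(121.6, z))  # -> 729.6 nm: far-ultraviolet light shifted to the edge of the visible
    # Equivalently, the Universe was 1 / (1 + z) = one sixth of its present size
    # when that light set out.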

In particular, the astronomers could "weigh" the large-scale matter distribution in space over large distances. To do this, they made use of the fact that this information is encoded in the distorted shapes of distant galaxies, a phenomenon referred to as weak gravitational lensing [3]. Using complex algorithms, the team led by Schrabback has improved the standard method and obtained galaxy shape measurements to an unprecedented precision. The results of the study will be published in an upcoming issue of Astronomy and Astrophysics.

The meticulousness and scale of this study enables an independent confirmation that the expansion of the Universe is accelerated by an additional, mysterious component named dark energy. A handful of other such independent confirmations exist. Scientists need to know how the formation of clumps of matter evolved in the history of the Universe to determine how the gravitational force, which holds matter together, and dark energy, which pulls it apart by accelerating the expansion of the Universe, have affected them. "Dark energy affects our measurements for two reasons. First, when it is present, galaxy clusters grow more slowly, and secondly, it changes the way the Universe expands, leading to more distant -- and more efficiently lensed -- galaxies. Our analysis is sensitive to both effects," says co-author Benjamin Joachimi from the University of Bonn. "Our study also provides an additional confirmation for Einstein's theory of general relativity, which predicts how the lensing signal depends on redshift," adds co-investigator Martin Kilbinger from the Institut d'Astrophysique de Paris and the Excellence Cluster Universe.

The large number of galaxies included in this study, along with information on their redshifts, is leading to a clearer map of how, exactly, part of the Universe is laid out; it helps us see its galactic inhabitants and how they are distributed. "With more accurate information about the distances to the galaxies, we can measure the distribution of the matter between them and us more accurately," notes co-investigator Jan Hartlap from the University of Bonn. "Before, most of the studies were done in 2D, like taking a chest X-ray. Our study is more like a 3D reconstruction of the skeleton from a CT scan. On top of that, we are able to watch the skeleton of dark matter mature from the Universe's youth to the present," comments William High from Harvard University, another co-author.

The astronomers specifically chose the COSMOS survey because it is thought to be a representative sample of the Universe. With thorough studies such as the one led by Schrabback, astronomers will one day be able to apply their technique to wider areas of the sky, forming a clearer picture of what is truly out there.

Notes:

The Hubble Space Telescope is a project of international cooperation between ESA and NASA.

[1] The international team of astronomers in this study was led by Tim Schrabback of the Leiden University. Other collaborators included: J. Hartlap (University of Bonn), B. Joachimi (University of Bonn), M. Kilbinger (IAP), P. Simon (University of Edinburgh), K. Benabed (IAP), M. Bradac (UC Davis), T. Eifler (University of Bonn), T. Erben (University of Bonn), C. Fassnacht (University of California, Davis), F. W. High (Harvard), S. Hilbert (MPA), H. Hildebrandt (Leiden Observatory), H. Hoekstra (Leiden Observatory), K. Kuijken (Leiden Observatory), P. Marshall (KIPAC), Y. Mellier (IAP), E. Morganson (KIPAC), P. Schneider (University of Bonn), E. Semboloni (University of Bonn), L. Van Waerbeke (UBC) and M. Velander (Leiden Observatory).

[2] In astronomy, the redshift denotes the fraction by which the lines in the spectrum of an object are shifted towards longer wavelengths due to the expansion of the Universe. The observed redshift of a remote galaxy provides an estimate of its distance. In this study the researchers used redshift information computed by the COSMOS team (http://ukads.nottingham.ac.uk/abs/2009ApJ...690.1236I) using data from the SUBARU, CFHT, UKIRT, Spitzer, GALEX, NOAO, VLT, and Keck telescopes.

[3] Weak gravitational lensing: The phenomenon of gravitational lensing is the warping of spacetime by the gravitational field of a concentration of matter, such as a galaxy cluster. When light rays from distant background galaxies pass this matter concentration, their path is bent and the galaxy images are distorted. In the case of weak lensing, these distortions are small, and must be measured statistically. This analysis provides a direct estimate for the strength of the gravitational field, and therefore the mass of the matter concentration. When determining precise shapes of galaxies, astronomers have to deal with three main factors: the intrinsic shape of the galaxy (which is unknown), the gravitational lensing effect they want to measure, and systematic effects caused by the telescope and camera, as well as the atmosphere, in case of ground-based observations.

Wednesday, March 24, 2010

TalkTalk - the Search Engine of the Future


After years of hush-hush, the much longed-for search engine TalkTalk was presented to the press this week. One day of talking to it basically left me speechless; the future of finding information has never looked brighter.

TalkTalk will open to the public next week, and this is a service you will use more than you can imagine. For the first time you can not only talk to a search engine, you can discuss with it what you are looking for.

If you want to know more about the oil price, TalkTalk asks whether you want to know the current oil price, the development of the oil price, or news related to the oil price. You say that you want to read news about it, and TalkTalk asks if you prefer a certain source (information that is stored for you if you wish). TalkTalk then directs you to your source, or gives you the latest news related to the oil price, ordered from the most respected sources.

If you are looking for a certain person, you say his name, and TalkTalk asks what you know about him: is he alive, where does he work, is he publicly known, and so on. It then asks what you want to know and guides you to a website where the information can be found. This makes it possible to search for a person named Gary Smith, something that has been impossible through previous search services.

Compared to other search services that use a fixed algorithm to return results, the artificial intelligence behind TalkTalk is said to easily spot when a certain source is trying to deceive the searcher. TalkTalk also evaluates and stores every reply and discussion, learning to give even more precise answers. How well this will work in the long run remains to be seen, but thousands of people have been challenging TalkTalk to tune it before the launch, and the quality is remarkably good.
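To make the interaction style concrete, here is a toy sketch of the clarify-then-answer flow described above; the prompts, options and answers are invented for illustration and do not reflect TalkTalk's actual interface:

    # Invented dialogue flow mimicking the oil-price example above.
    OIL_PRICE_ANSWERS = {
        "current price": "the latest quoted oil price",
        "development": "a historical chart of the oil price",
        "news": "recent oil-price news, most respected sources first",
    }

    def talktalk(query, clarification=None):
        """Return a clarifying question first, then route the refined request."""
        if "oil price" in query.lower():
            if clarification is None:
                return "Do you want the current price, its development, or related news?"
            return OIL_PRICE_ANSWERS.get(clarification, "Sorry, could you rephrase that?")
        return "Tell me a little more about what you are looking for."

    print(talktalk("I want to know more about the oil price"))
    print(talktalk("I want to know more about the oil price", clarification="news"))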

The first talking search engine saw the light of day more than 30 years ago and was called Speegle. It could read you the results of a written search on the Internet, and was aimed mainly at the visually impaired. TalkTalk is there for you 24/7, just a phone call away and on the Internet.

So far, TalkTalk cannot read the information from a given source to you by phone if it is not in the public domain or freely available. Negotiations are under way to find an arrangement for this, but it will most likely be difficult because of copyright and the need to secure an income for the publisher.

TalkTalk is also set to answer questions directly where there is a definite answer. So I called the phone service and it replied "TalkTalk, how may I help you?" I said "Which is the most populous nation in the world?" and before I was ready to take down the answer, it replied "India...anything else?"

Several new features are in the works, licensed from the same artificial intelligence technology. They include more specialized services, such as guidance on repairing your car, a joker with thousands of jokes to cheer you up, and the giant project to give everyone access to a therapist with the same knowledge of the human mind as any experienced human therapist.

TalkTalk is accessible over the Internet and also by phone in all major territories, even though it only speaks English. There are no plans to add other languages in the near future, most likely because of the giant investments needed. When you are tired of asking TalkTalk all your questions, just ask, "Where is TalkTalk?" and you will get an answer that will make you leave it with a smile on your lips.

Argument: Artificial intelligence will develop during 2020-2030, and by then computers will be at the same level as the human brain. Search engines are already used frequently all over the world, and combined with artificial intelligence you will have a friend to talk to that can either answer all the questions you have or direct you to the answers.

Questions: What other services can use artificial intelligence? How will education change in the future if basically all knowledge is just a phone call away?

This news is published in the future: Year 2035

Men and Women Respond Differently to Stress


Age and gender play a major role in how people respond to stress, according to a new study on 20-to-64-year-olds. Published in the journal Psychophysiology, the investigation was led by scientists from the Université de Montréal and the Montreal Heart Institute in collaboration with colleagues from the Université du Québec à Montréal and McGill University.


"Our findings suggest that women who are more defensive are at increased cardiovascular risk, whereas low defensiveness appears to damage the health of older men," says Bianca D'Antono, a professor at the Université de Montréal Department of Psychiatry and a Montreal Heart Institute researcher.

Defensiveness is a trait characterized by avoidance, denial or repression of information perceived as threatening. In women, a strong defensive reaction to judgment from others or a threat to self-esteem will result in high blood pressure and heart rate. Conversely, older men with low defensive reactions have higher cardiovascular rates.

The study was conducted on 81 healthy working men and 118 women. According to Dr. Jean-Claude Tardif, a Université de Montréal professor and Montreal Heart Institute researcher, the physiological response to stress in women and older men is linked to the desire to maintain self-esteem and secure social bonds.

"The sense of belonging is a basic human need," says D'Antono. "Our findings suggest that socialization is innate and that belonging to a group contributed to the survival of our ancestors. Today, it is possible that most people view social exclusion as a threat to their existence. A strong defensive reaction is useful to maintain one's self-esteem faced with this potential threat."

As part of the experiment, participants completed four tasks of varying stress levels. The first task involved reading a neutral text on Antarctica's geography before a person of the same sex. The second and third tasks involved role-playing in which participants followed a script where they were sometimes agreeable and sometimes aggressive. The final task involved a non-scripted debate on abortion.

Heart rate and blood pressure were measured during each of these tasks as was the level of cortisol in saliva. Results showed that women and older men had elevated cardiovascular, autonomic and endocrine responses to stress -- all potentially damaging to their health. The research team cautions, however, that more studies are needed to evaluate the long-term effects of defensiveness and its association to stress response patterns in disease development.

This study was supported by the Canadian Institutes of Health Research and the Fonds de la recherche en santé du Québec.



Tuesday, March 23, 2010

High-Fructose Corn Syrup Prompts Considerably More Weight Gain, Researchers Find


A Princeton University research team has demonstrated that all sweeteners are not equal when it comes to weight gain: Rats with access to high-fructose corn syrup gained significantly more weight than those with access to table sugar, even when their overall caloric intake was the same.
A Princeton University research team, including (from left) undergraduate Elyse Powell, psychology professor Bart Hoebel, visiting research associate Nicole Avena and graduate student Miriam Bocarsly, has demonstrated that rats with access to high-fructose corn syrup -- a sweetener found in many popular sodas -- gain significantly more weight than those with access to water sweetened with table sugar, even when they consume the same number of calories. The work may have important implications for understanding obesity trends in the United States. (Credit: Princeton University, Office of Communications, Denise Applewhite)
In addition to causing significant weight gain in lab animals, long-term consumption of high-fructose corn syrup also led to abnormal increases in body fat, especially in the abdomen, and a rise in circulating blood fats called triglycerides. The researchers say the work sheds light on the factors contributing to obesity trends in the United States.

"Some people have claimed that high-fructose corn syrup is no different than other sweeteners when it comes to weight gain and obesity, but our results make it clear that this just isn't true, at least under the conditions of our tests," said psychology professor Bart Hoebel, who specializes in the neuroscience of appetite, weight and sugar addiction. "When rats are drinking high-fructose corn syrup at levels well below those in soda pop, they're becoming obese -- every single one, across the board. Even when rats are fed a high-fat diet, you don't see this; they don't all gain extra weight."

In results published online March 18 by the journal Pharmacology, Biochemistry and Behavior, the researchers from the Department of Psychology and the Princeton Neuroscience Institute reported on two experiments investigating the link between the consumption of high-fructose corn syrup and obesity.

The first study showed that male rats given water sweetened with high-fructose corn syrup in addition to a standard diet of rat chow gained much more weight than male rats that received water sweetened with table sugar, or sucrose, in conjunction with the standard diet. The concentration of sugar in the sucrose solution was the same as is found in some commercial soft drinks, while the high-fructose corn syrup solution was half as concentrated as most sodas.

The second experiment -- the first long-term study of the effects of high-fructose corn syrup consumption on obesity in lab animals -- monitored weight gain, body fat and triglyceride levels in rats with access to high-fructose corn syrup over a period of six months. Compared to animals eating only rat chow, rats on a diet rich in high-fructose corn syrup showed characteristic signs of a dangerous condition known in humans as the metabolic syndrome, including abnormal weight gain, significant increases in circulating triglycerides and augmented fat deposition, especially visceral fat around the belly. Male rats in particular ballooned in size: Animals with access to high-fructose corn syrup gained 48 percent more weight than those eating a normal diet. In humans, this would be equivalent to a 200-pound man gaining 96 pounds.

"These rats aren't just getting fat; they're demonstrating characteristics of obesity, including substantial increases in abdominal fat and circulating triglycerides," said Princeton graduate student Miriam Bocarsly. "In humans, these same characteristics are known risk factors for high blood pressure, coronary artery disease, cancer and diabetes." In addition to Hoebel and Bocarsly, the research team included Princeton undergraduate Elyse Powell and visiting research associate Nicole Avena, who was affiliated with Rockefeller University during the study and is now on the faculty at the University of Florida. The Princeton researchers note that they do not know yet why high-fructose corn syrup fed to rats in their study generated more triglycerides, and more body fat that resulted in obesity.

High-fructose corn syrup and sucrose are both compounds that contain the simple sugars fructose and glucose, but there are at least two clear differences between them. First, sucrose is composed of equal amounts of the two simple sugars -- it is 50 percent fructose and 50 percent glucose -- while the typical high-fructose corn syrup used in this study has a slightly imbalanced ratio, containing 55 percent fructose and 42 percent glucose. Larger sugar molecules called higher saccharides make up the remaining 3 percent of the sweetener. Second, as a result of the manufacturing process for high-fructose corn syrup, the fructose molecules in the sweetener are free and unbound, ready for absorption and utilization. In contrast, every fructose molecule in sucrose that comes from cane sugar or beet sugar is bound to a corresponding glucose molecule and must go through an extra metabolic step before it can be utilized.
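The compositional difference by itself is modest, as a quick calculation shows; the 40-gram serving below is an assumed illustrative figure, not a number from the study:

    # Fructose content per serving for the two sweeteners described above.
    def fructose_grams(sweetener_grams, fructose_fraction):
        return sweetener_grams * fructose_fraction

    serving_g = 40.0  # assumed grams of sweetener in a hypothetical soft-drink serving
    print(f"sucrose (50% fructose): {fructose_grams(serving_g, 0.50):.0f} g, bound to glucose")
    print(f"HFCS-55 (55% fructose): {fructose_grams(serving_g, 0.55):.0f} g, free and unbound")
    # 20 g versus 22 g: the raw fructose gap is small, which is why the authors point
    # to the free, unbound form of the fructose as the other key difference.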

This creates a fascinating puzzle. The rats in the Princeton study became obese by drinking high-fructose corn syrup, but not by drinking sucrose. The critical differences in appetite, metabolism and gene expression that underlie this phenomenon are yet to be discovered, but may relate to the fact that excess fructose is being metabolized to produce fat, while glucose is largely being processed for energy or stored as a carbohydrate, called glycogen, in the liver and muscles.

In the 40 years since the introduction of high-fructose corn syrup as a cost-effective sweetener in the American diet, rates of obesity in the U.S. have skyrocketed, according to the Centers for Disease Control and Prevention. In 1970, around 15 percent of the U.S. population met the definition for obesity; today, roughly one-third of American adults are considered obese, the CDC reported. High-fructose corn syrup is found in a wide range of foods and beverages, including fruit juice, soda, cereal, bread, yogurt, ketchup and mayonnaise. On average, Americans consume 60 pounds of the sweetener per person every year.

"Our findings lend support to the theory that the excessive consumption of high-fructose corn syrup found in many beverages may be an important factor in the obesity epidemic," Avena said.

The new research complements previous work led by Hoebel and Avena demonstrating that sucrose can be addictive, having effects on the brain similar to some drugs of abuse.

In the future, the team intends to explore how the animals respond to the consumption of high-fructose corn syrup in conjunction with a high-fat diet -- the equivalent of a typical fast-food meal containing a hamburger, fries and soda -- and whether excessive high-fructose corn syrup consumption contributes to the diseases associated with obesity. Another step will be to study how fructose affects brain function in the control of appetite.

The research was supported by the U.S. Public Health Service.

Editor's Note: In response to the above-mentioned study, The Corn Refiners Association issued a statement titled "Gross Errors in Princeton Animal Study on Obesity and High Fructose Corn Syrup: Research in Humans Discredits Princeton Study" (http://www.corn.org/princeton-hfcs-study-errors.html).

Monday, March 22, 2010

'Cold Fusion' Moves Closer to Mainstream Acceptance


A potential new energy source so controversial that people once regarded it as junk science is moving closer to acceptance by the mainstream scientific community. That's the conclusion of the organizer of one of the largest scientific sessions on the topic -- "cold fusion" -- being held here for the next two days in the Moscone Center during the 239th National Meeting of the American Chemical Society (ACS).
A new "calorimeter," shown immersed in this water bath, provides the first inexpensive means of identifying the hallmark of cold fusion reactions: the production of excess heat. (Credit: Melvin Miles)
 
"Years ago, many scientists were afraid to speak about 'cold fusion' to a mainstream audience," said Jan Marwan, Ph.D., the internationally known expert who organized the symposium. Marwan heads the research firm, Dr. Marwan Chemie in Berlin, Germany. Entitled "New Energy Technology," the symposium will include nearly 50 presentations describing the latest discoveries on the topic.

The presentations describe invention of an inexpensive new measuring device that could enable more labs to begin cold fusion research; indications that cold fusion may occur naturally in certain bacteria; progress toward a battery based on cold fusion; and a range of other topics. Marwan noted that many of the presentations suggest that cold fusion is real, with a potential to contribute to energy supplies in the 21st Century.
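The inexpensive calorimeter mentioned above is, at heart, an energy-balance instrument: it compares the heat leaving the cell with the electrical power going in. A minimal sketch of that bookkeeping, deliberately simplified and not based on any specific design presented at the symposium:

    # Excess power = heat flowing out of the cell - electrical power flowing in.
    def excess_power_watts(input_volts, input_amps,
                           heat_transfer_coeff_w_per_k, cell_temp_c, bath_temp_c):
        electrical_in = input_volts * input_amps
        heat_out = heat_transfer_coeff_w_per_k * (cell_temp_c - bath_temp_c)
        return heat_out - electrical_in

    # Illustrative numbers only: a cell drawing 1.0 W while shedding 1.3 W to the
    # water bath would be reporting about 0.3 W of unexplained ("excess") heat.
    print(excess_power_watts(5.0, 0.2, 0.65, 22.0, 20.0))  # -> ~0.3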

"Now most of the scientists are no longer afraid and most of the cold fusion researchers are attracted to the ACS meeting," Marwan said. "I've also noticed that the field is gaining new researchers from universities that had previously not pursued cold fusion research. More and more people are becoming interested in it. There's still some resistance to this field. But we just have to keep on as we have done so far, exploring cold fusion step by step, and that will make it a successful alternative energy source. With time and patience, I'm really optimistic we can do this!"

The term "cold fusion" originated in 1989 when Martin Fleischmann and Stanley Pons claimed achieving nuclear fusion at room temperature with a simple, inexpensive tabletop device. That claim fomented an international sensation because nuclear fusion holds potential for providing the world with a virtually limitless new source of energy. Fuel for fusion comes from ordinary seawater, and estimates indicate that 1 gallon of seawater packs the energy equivalent of 16 gallons of gasoline at 100 percent efficiency for energy production. The claim also ignited scepticism, because conventional wisdom said that achieving fusion required multi-billion-dollar fusion reactors that operate at tens of millions of degrees Fahrenheit.

When other scientists could not reproduce the Pons-Fleischmann results, research on cold fusion fell into disrepute. Humiliated by the scientific establishment, their reputations ruined, Pons and Fleischmann closed their labs, fled the country, and dropped out of sight. The handful of scientists who continued research avoided the term "cold fusion." Instead, they used the term "low energy nuclear reactions (LENR)." Research papers at the ACS symposium openly refer to "cold fusion" and some describe cold fusion as the "Fleischmann-Pons Effect" in honor of the pioneers, Marwan noted.

"The field is now experiencing a rebirth in research efforts and interest, with evidence suggesting that cold fusion may be a reality," Marwan said. He noted, for instance, that the number of presentations on the topic at ACS National Meetings has quadrupled since 2007.

Among the reports scheduled for the symposium are:

  • Michael McKubre, Ph.D., of SRI International in Menlo Park, Calif., provides an overview of cold fusion research. McKubre will discuss current knowledge in the field and explain why some doubts exist in the broader scientific community. He will also discuss recent experimental work performed at SRI. McKubre will focus on fusion, heat production and nuclear products. [3pm, Monday March 22, Cyril Magnin ]

  • George Miley, Ph.D., reports on progress toward a new type of battery that works through a new cold fusion process and has a longer life than conventional batteries. The battery consists of a special type of electrolytic cell that operates at low temperature. The process involves purposely creating defects in the metal electrode of the cell. Miley is a professor at the University of Illinois in Urbana and director of its Fusion Studies Lab. [11am, Sunday March 21, Cyril Magnin I]

  • Melvin Miles, Ph.D., describes development of the first inexpensive instrument for reliably identifying the hallmark of cold fusion reactions: Production of excess heat from tabletop fusion devices now in use. Current "calorimeters," devices that measure excess heat, tend to be too complicated and inefficient for reliable use. The new calorimeter could boost the quality of research and open the field to scores of new scientists in university, government, and private labs, Miles suggests. He is with Dixie State College in St. George, Utah. [2.30pm, Sunday March 21, Cyril Magnin I]

  • Vladimir Vysotskii, Ph.D., presents surprising experimental evidence that bacteria can undergo a type of cold fusion process and could be used to dispose of nuclear waste. He will describe studies of nuclear transmutation -- the transformation of one element into another -- of stable and radioactive isotopes in biological systems. Vysotskii is a scientist with Kiev National Shevchenko University in Kiev, Ukraine. [11.20am, Monday March 22, Cyril Magnin I].

  • Tadahiko Mizuno, Ph.D., discusses an unconventional cold fusion device that uses phenanthrene, a substance found in coal and oil, as a reactant. He reports on excess heat production and gamma radiation production from the device. "Overall heat production exceeded any conceivable chemical reaction by two orders of magnitude," Mizuno noted. He is with Hokkaido University in Japan, and wrote the book Nuclear Transmutation: The Reality of Cold Fusion. [3pm, Sunday March 21, Cyril Magnin I]

  • Peter Hagelstein, Ph.D., describes new theoretical models to help explain excess heat production in cold fusion, one of the most controversial aspects of the field. He notes that in a nuclear reaction one would expect the energy produced to appear as kinetic energy of the products, but in the Fleischmann-Pons experiment energetic particles do not appear in amounts consistent with the energy observed. His simple models help explain the observed energy changes, including the type and quantity of energy produced. Hagelstein is with the Massachusetts Institute of Technology. [10.20am, Sunday March 21, Cyril Magnin I].

  • Xing Zhong Li, Ph.D., presents research demonstrating that cold fusion can occur without the production of strong nuclear radiation. He is developing a cold fusion reactor that demonstrates this principle. Li is a scientist with Tsinghua University in Beijing, China. [9.10am, Sunday March 21, Cyril Magnin I].