
Wednesday, July 7, 2010

Thermal-Powered, Insect-Like Robot Crawls Into Microrobot Contenders' Ring


Robotic cars attracted attention last decade when they raced driverless for 100 miles across the desert, competing for a $1 million prize put up by the U.S. government.
Tiny, four-sided cilia, pulsating structures that mimic the hairs that line the human windpipe, are arranged in rows along the underside of the robot. (Credit: John Suh, Stanford University)

The past few years have given rise to a growing number of microrobots, miniaturized mobile machines designed to perform specific tasks. And though spectators might need magnifying glasses to see the action, some think the time has come for a microrobotics challenge.

"I'd like to see a similar competition at the small scale, where we dump these microrobots from a plane and have them go off and run for days and just do what they've been told," said Karl Böhringer, a University of Washington professor of electrical engineering. "That would require quite an effort at this point, but I think it would be a great thing."

Researchers at the UW and Stanford University have developed what might one day be a pint-sized contender. Böhringer is lead author of a paper in the June issue of the Journal of Microelectromechanical Systems introducing an insect-like robot with hundreds of tiny legs.

Compared to other such robots, the UW model excels in its ability to carry heavy loads -- more than seven times its own weight -- and move in any direction.

Someday, tiny mobile devices could crawl through cracks to explore collapsed structures, collect environmental samples or do other tasks where small size is a benefit. The UW's robot weighs half a gram (roughly one-hundredth of an ounce), measures about 1 inch long by a third of an inch wide, and is about the thickness of a fingernail.

Technically it is a centipede, with 512 feet arranged in 128 sets of four. Each foot consists of an electrical wire sandwiched between two materials, one of which expands more under heat than the other. A current traveling through the wire heats the two materials; one side expands more, making the foot curl. Rows of feet shuffle along in this way 20 to 30 times each second.

"The response time is an interesting point about these tiny devices," Böhringer said. "On your stove, it might take minutes or even tens of minutes to heat something up. But on the small scale it happens much, much faster."

The legs' surface area is so large compared to their volume that they can heat up or cool down in just 20 milliseconds.
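The scaling Böhringer is pointing at can be sketched with a back-of-the-envelope lumped-capacitance estimate: the thermal time constant is the stored heat divided by the rate of heat loss, so it shrinks with the volume-to-surface ratio. Below is a minimal Python sketch; every dimension and material constant in it is an illustrative assumption, not a figure from the paper.

```python
# Back-of-the-envelope lumped-capacitance model of thermal response.
# tau = (rho * c * V) / (h * A): stored heat over the rate of heat loss.
# Every number below is an assumed, illustrative value -- none are
# taken from the UW/Stanford paper.

def thermal_time_constant(rho, c, volume, h, area):
    """Lumped-capacitance time constant in seconds."""
    return (rho * c * volume) / (h * area)

# A thin leg, roughly 1 mm x 0.1 mm x 10 um (assumed dimensions).
leg_volume = 1e-3 * 1e-4 * 1e-5      # m^3
leg_area = 2 * (1e-3 * 1e-4)         # top and bottom faces, m^2
rho, c = 1500.0, 1100.0              # density (kg/m^3), specific heat (J/kg*K), assumed
h = 100.0                            # convective coefficient (W/m^2*K), assumed

tau = thermal_time_constant(rho, c, leg_volume, h, leg_area)
print(f"time constant ~ {tau * 1e3:.0f} ms")  # tens of milliseconds at this scale
```

Halving the leg thickness halves the volume while leaving the surface area nearly unchanged, which is why millimetre-scale legs can heat, cool and curl tens of times per second while a pot on a stove takes minutes.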

"It's one of the strongest actuators that you can get at the small scale, and it has one of the largest ranges of motion," Böhringer said. "That's difficult to achieve at the small scale."

The microchip that forms the robot's body and feet was first built in the mid-1990s at Stanford University as a prototype part for a paper-thin scanner or printer. A few years later the researchers modified it as a docking system for space satellites. Now they have flipped it over so the structures that acted like moving cilia are on the bottom, turning the chip into an insect-like robot.

"There were questions about the strength of the actuators. Will they be able to support the weight of the device?" Böhringer said. "We were surprised how strong they were. For these things that look fragile, it's quite amazing."

The tiny legs can move more than just the device. Researchers were able to pile paper clips onto the robot's back until it was carrying more than seven times its own weight. This means that the robot could carry a battery and a circuit board, which would make it fully independent. (It now attaches to nine threadlike wires that transmit power and instructions.)

Limbs pointing in four directions allow the robot flexibility of movement.

"If you drive a car and you want to be able to park it in a tight spot, you think, 'Wouldn't it be nice if I could drive in sideways,'" Böhringer said. "Our robot can do that -- there's no preferred direction."

Maneuverability is important for a robot intended to go into tight spaces.

The chip was not designed to be a microrobot, so little effort was made to minimize its weight or energy consumption. Modifications could probably take off 90 percent of the robot's weight, Böhringer said, and eliminate a significant fraction of its power needs.

As with other devices of this type, he added, a major challenge is the power supply. A battery would let the robot run for only about 10 minutes, while researchers would like it to go for days.

Another challenge is speed. Right now the UW robot moves at about 3 feet per hour -- and it's far from the slowest in the microrobot pack.

Co-authors are former UW graduate students Yegan Erdem, Yu-Ming Chen and Matthew Mohebbi; UW electrical engineering professor Robert Darling; John Suh at General Motors; and Gregory Kovacs at Stanford.

Research funding was provided by the U.S. Defense Advanced Research Projects Agency, the National Science Foundation and General Motors Co.

Tuesday, September 8, 2009

A robot that can take decisions


Robots that can make their own decisions have so far been confined to science fiction movies, but a child-sized figure with big eyes and a white face is trying hard to turn fiction into reality.

Its name is iCub and scientists are hoping it will learn how to adapt its behavior to changing circumstances, offering new insights into the development of human consciousness.

Six versions of the iCub exist in laboratories across Europe, where scientists are painstakingly tweaking its electronic brain to make it capable of learning, just like a human child.

"Our goal is to really understand something that is very human - the ability to cooperate, to understand what somebody else wants us to do, to be able to get aligned with them and work together," said research director Peter Ford Dominey.

iCub is about 1 meter high, with an articulated trunk, arms and legs made up of intricate electronic circuits. It has a white face with the hint of a nose and big round eyes that can see and follow moving objects.

"Shall we play the old game or play a new one?" iCub asked Dominey during a recent experiment at a laboratory in Lyon, in southeastern France. Its voice was robotic, unsurprisingly, though it did have the intonation of a person asking a question. The "game" consisted of one person picking up a box, revealing a toy that was placed underneath. Then another person picked up the toy, before putting it down again. Finally, the first person put the box back down, on top of the toy.

Having watched two humans perform this action, iCub was able to join in the fun.

"The robot is demonstrating that it can change roles. It can play the role of either the first person in the interaction or the second," said Dominey, who receives European Union funding for his work with iCub.

"These robots will be a huge tool for analytical philosophy and philosophy of mind," said Dominey, whose background is in computational neuroscience - in layman's terms, building computer models for different brain functions.

Dominey said after years of research he had understood that such models needed to be "unleashed into the world" and given vision and motor control in order to interact with humans.

"Is perception consciousness? The ability to understand that somebody has a goal, is that consciousness?" he asked. "These kinds of questions, we will be able to ask with much more precision because we can have a test bed, this robot, or zombie, that we can use to implement things," he said, describing working with iCub as "an outstanding pleasure."

In the short term, iCub could be used in hospitals to help patients in need of physiotherapy by playing games with them. In the longer term, iCub could gain enough autonomy to help around the house, making its own assessments of needs.

"People have their habits, loading their dishwasher, putting away their dishes. The goal is that the robot can become like a helper, just like a polite apprentice visitor would come into your house and begin to help you," said Dominey.

Anyone looking to cut down on their household chores will need to be patient, however. "It won't be for tomorrow. It's maybe in the next decade we will begin to see this kind of thing," said the scientist.


Saturday, April 4, 2009

I think, therefore I, robot


Human researchers have developed their mechanical counterparts: ‘robo-scientists’ that can think independently.

Two separate teams of researchers, reporting on Thursday in the journal Science, said that they had created machines that could reason, formulate theories and discover scientific knowledge on their own, marking a major advance in the field of artificial intelligence.

Such “robo-scientists” could be put to work unravelling complex biological systems, designing new drugs, modelling the world’s climate or understanding the cosmos.

For the moment, though, they are performing more humble tasks...

Meet Adam: The first robot scientist to make an independent discovery

A robot developed by UK scientists, which can think up scientific theories and test them with almost no human help, has ushered in a new era in artificial intelligence.

In tests, the machine – named Adam – was able to identify previously unknown genetic processes in baker’s yeast.

It produced hypotheses about how certain genes should work and devised experiments to test them.
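The loop being described is: propose a hypothesis, design an experiment that would test it, run the experiment, record the outcome, and move on. A toy Python sketch of that cycle follows; the gene names, the enzyme, and the simulated assay are invented placeholders, not Adam's actual pipeline.

```python
# A toy sketch of a hypothesize-and-test loop. The gene names, the
# experiment, and the simulated assay result are placeholders.
import random

random.seed(1)

def generate_hypothesis(enzyme, candidates):
    """Propose that one candidate gene encodes the target enzyme."""
    return {"enzyme": enzyme, "gene": random.choice(candidates)}

def design_experiment(hypothesis):
    """A knockout growth assay: delete the gene, grow the mutant strain."""
    return f"delete {hypothesis['gene']} and measure growth"

def run_experiment(experiment):
    """Stand-in for the robot's automated lab work."""
    return random.random() < 0.3      # True = predicted growth defect seen

candidates = ["gene_A", "gene_B", "gene_C"]
supported = []
while candidates:
    hyp = generate_hypothesis("orphan enzyme X", candidates)
    if run_experiment(design_experiment(hyp)):
        supported.append(hyp["gene"])
    candidates.remove(hyp["gene"])

print("hypotheses supported by experiment:", supported)
```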

Professor Ross King of Aberystwyth University, Wales – who helped create Adam – said: “This is the first time we believe such a system has discovered novel scientific knowledge. We are very excited about it.”

The robot takes up 15 square metres of space at the university and is equipped with an arm and a range of devices, including an automated freezer and an incubator.

“It is not the management and analysis of complex data that is the big deal about Adam. What’s amazing is the ability of the machine to reason with those data and make proposals about how a living thing works,” said Stephen Oliver, who co-authored the study on the project.

A second robot, called Eve, will work alongside it to help find new medicines for diseases such as malaria.

Professor Douglas Kell of the Biotechnology and Biological Sciences Research Council (BBSRC), which funded the research, said: “Computers play a fundamental role in the scientific process, which is becoming increasingly automated, for instance, in drug design and DNA sequencing.”

“Ultimately, we hope to have teams of human and robot scientists working together in labs,” King said.

Although Adam’s discoveries were simple, experts believe future models may one day rival Albert Einstein for genius.

King said: “I wouldn’t rule out the possibility, but it probably wouldn’t be in my lifetime.”




Sunday, March 15, 2009

Computer vision


Computer vision is the science and technology of machines that see.

[Image: relation between computer vision and various other fields. Via Wikipedia]


As a scientific discipline, computer vision is concerned with the theory and technology for building artificial systems that obtain information from images or multi-dimensional data. A significant part of artificial intelligence deals with planning or deliberation for systems that can perform mechanical actions, such as moving a robot through some environment. This type of processing typically needs input data provided by a computer vision system, acting as a vision sensor and providing high-level information about the environment and the robot. Other fields that are sometimes described as belonging to artificial intelligence and that are used in relation to computer vision are pattern recognition and learning techniques. For more information about the topic Computer vision, read the full article at Wikipedia.org.
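To make the sensor-for-a-planner relationship concrete, here is a toy Python/NumPy sketch, not drawn from any real system: the "vision" stage reduces an image to a target position, and the "planning" stage turns that position into a motion command.

```python
# A toy vision-as-a-sensor pipeline: find a bright target in an image
# and hand its position to a trivial planner. The threshold detector
# and the planner are deliberately simplistic illustrations.
import numpy as np

def locate_target(image, threshold=200):
    """Return the (row, col) centroid of bright pixels, or None."""
    ys, xs = np.nonzero(image > threshold)
    if len(ys) == 0:
        return None
    return ys.mean(), xs.mean()

def plan_motion(target, frame_center):
    """Toy 'deliberation': steer toward the detected target."""
    if target is None:
        return "search"
    dx = target[1] - frame_center[1]
    return "forward" if abs(dx) < 10 else ("left" if dx < 0 else "right")

frame = np.zeros((120, 160), dtype=np.uint8)
frame[40:50, 120:130] = 255               # synthetic bright target
print(plan_motion(locate_target(frame), (60, 80)))   # -> "right"
```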


Friday, March 13, 2009

Follow me?


A researcher waves to a new robot that detects non-verbal commands and can follow its ‘master’, thanks to a new depth-imaging camera (inset) and advanced software

Imagine a day when you turn to your own personal robot, give it a task and then sit down and relax, confident that your robot is doing exactly what you wanted it to do. A team of US-based engineers is working to bring this futuristic scenario closer to reality, with a new robot that can follow a person – indoors and outdoors – and even understand non-verbal commands through gestures.

“We have created a novel system where the robot will follow you at a precise distance, where you don’t need to wear special clothing; you don’t need to be in a special environment; and you don’t need to look backward to track it,” said team leader Chad Jenkins, assistant professor of computer science at Brown University.

A paper on the research was presented at the 4th ACM/IEEE International Conference on Human-Robot Interaction (HRI 2009) in San Diego on Wednesday.

The team started with a PackBot, a mechanised platform that has been used widely by the US military for bomb disposal, among other tasks.

The researchers outfitted their robot with a commercial depth-imaging camera, which makes it look like “the head on the robot in the film Wall-E”.

They also attached a laptop with novel computer programs that enabled the bot to recognise human gestures, decipher them and respond to them.

In a demonstration, graduate student Sonia Chernova used a variety of hand-arm signals to instruct the automaton, including “follow”, “halt”, “wait” and “door breach”.

She walked with her back to it, turning corners in narrow hallways and walking briskly in an outdoor parking lot. Throughout, the bot followed dutifully, maintaining an approximate three-foot distance – and even backed up when Chernova turned around and approached it.

The team also successfully instructed the machine to turn around (a full 180-degree pivot), and to freeze when the student disappeared from view – essentially idling until the instructor reappeared and gave a nonverbal or verbal command.

HOW IT WORKS

To build the robot, the researchers had to address two key issues. The first involved what scientists call visual recognition, which helps robots orient themselves with respect to the objects in a room.

“Robots can see things, but recognition remains a challenge,” Jenkins explained. The scientists overcame this obstacle by creating a computer program whereby the robot recognised a human by extracting a silhouette, as if the person were a virtual cut-out.

This let it “home in” on the human and receive commands without being distracted by other objects in the space.
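The article does not give implementation details, but the simplest form of silhouette extraction is background subtraction: compare each frame with a model of the empty scene and keep the pixels that changed. A toy NumPy sketch under that assumption (the Brown team's actual method may differ):

```python
# A rough sketch of silhouette extraction by background subtraction,
# a simple stand-in for the technique described above. Synthetic
# frames; a real system would work on live camera images.
import numpy as np

def extract_silhouette(frame, background, threshold=30):
    """Mark pixels that differ enough from the empty-scene background."""
    diff = np.abs(frame.astype(int) - background.astype(int))
    return diff > threshold          # boolean mask: the 'cut-out'

background = np.full((120, 160), 100, dtype=np.uint8)
frame = background.copy()
frame[30:110, 60:100] = 180          # a synthetic person-shaped blob

mask = extract_silhouette(frame, background)
ys, xs = np.nonzero(mask)
print(f"silhouette covers {mask.sum()} px, centred at col {xs.mean():.0f}")
```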

The second advance involved the depth-imaging camera, which uses infrared light to detect objects and to establish their distance from the camera.

This enabled the Brown robot to stay locked in on the human controller, which was essential to maintaining a set distance while following the person.
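Once the robot has a reliable depth reading on the person, keeping a fixed gap reduces to feedback control. The sketch below uses a plain proportional law; the gain and set point are illustrative assumptions, not details from the paper.

```python
# A hedged sketch of follow-at-a-set-distance as proportional control
# on the depth reading. Gain and set point are assumed values.

SET_DISTANCE_M = 0.9     # roughly the three-foot gap in the demo
GAIN = 0.8               # proportional gain (assumed)

def follow_speed(measured_distance_m):
    """Positive = drive forward, negative = back up, zero = hold."""
    error = measured_distance_m - SET_DISTANCE_M
    return GAIN * error

for d in (2.0, 0.9, 0.5):    # person far, at the set point, too close
    print(f"distance {d} m -> speed command {follow_speed(d):+.2f}")
```

A negative command when the person steps closer is exactly the backing-up behaviour seen in the demonstration.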

The result is a robot that doesn’t require remote control or constant vigilance, which is a key step in developing autonomous devices, Jenkins said.

“Advances in enabling intuitive human-robot interaction, such as through speech or gestures, go a long way into making the robot more of a valuable sidekick and less of a machine you have to constantly command,” added Chris Jones, the principal investigator on the project.

The team is now working to add more non-verbal and verbal commands for the robot and to increase the three-foot working distance between it and the commander.


Sunday, March 8, 2009

Soon, a portable unit of surgical robots to replace army medics on battlefields


The Trauma Pod unit in action

Researchers in the US are working on a project that could replace army medics on a battlefield with robotic surgeons and nurses in the next 10 years.

The ‘Trauma Pod’ – being developed by the US Defense Advanced Research Projects Agency (DARPA) – is currently undergoing trials.

Brendan Visser, a surgeon at Stanford University in California who helped develop the Trauma Pod, described the scene: “Three separate robots dance over the top of the patient with their powerful arms moving very quickly, yet they don’t crash and they’re able to deliver very small items from one arm to another.”

The purpose of the Trauma Pod is to provide a quick “temporary fix” to wounded soldiers before they are taken to hospital.

“The system will focus on damage control surgery, which is the minimum necessary to stabilise someone. It could provide airway control, relieve life-threatening injuries such as a collapsed lung, or stop bleeding temporarily,” Pablo Garcia – of non-profit lab SRI International, which leads the project – told New Scientist magazine.

HOW IT WORKS

The Trauma Pod unit comprises one three-armed surgeon robot, assisted by 12 other robotic systems.

Remotely controlled by a human, the surgeon bot communicates with and instructs the other robots. One of its three arms holds an endoscope to allow the human controller to see inside the patient, while the other two grip surgical tools.

Garcia added that the robot could be allowed to carry out some simple tasks without human help, such as placing stitches or tying knots.

The bed itself monitors vital signs, administers fluids and oxygen, and may eventually administer anaesthesia.

A voice-activated robotic arm, “Hot Lips” – named after a nurse in the TV series M*A*S*H – passes fresh tools and supplies to the surgeon bot. A third robot, the “circulating nurse”, dispenses the right tools.

The Trauma Pod unit recently passed the first phase of a feasibility trial, where robots treated a mannequin with bullet injuries by inserting a plastic tube into a damaged blood vessel and operating to close a perforated bowel.

The team hopes to eventually shrink the Trauma Pod to a collapsible unit encased in a portable shell that can be carried on the back of a vehicle.

Robot teacher comes to Japanese school...



A team of Japanese scientists has developed the world’s first robot teacher, which can take attendance and even get angry, apart from teaching students.

Previously employed as a secretary, the humanoid robot, named Saya, is being trialled at a primary school in Tokyo.

According to the scientists, the automaton can speak different languages, carry out roll calls and set tasks, the leading British newspaper The Daily Telegraph reported.

Eighteen motors hidden behind its latex face allow it to adopt several expressions, including anger.

The humanoid was originally developed to replace a variety of workers, including secretaries, in a bid to allow firms to cut costs while still retaining some kind of human interaction.

Its creator, professor Hiroshi Kobayashi of the Tokyo University of Science, has been working on robots for 15 years. Saya is the latest in a long line of robots that are spreading to every aspect of life in Japan.

They already guide traffic and approach students to sign up for courses, and one is now being built to keep Alzheimer’s sufferers company.