
Tuesday, June 18, 2013

A Robot That Runs Like a Cat


Thanks to its legs, whose design faithfully reproduces feline morphology, EPFL's four-legged "cheetah-cub robot" has the same advantages as its model: it is small, light and fast. Still in its experimental stage, the robot will serve as a platform for research in locomotion and biomechanics.

This is cheetah-cub, a compliant quadruped robot. (Credit: © EPFL)
Even though it doesn't have a head, you can still tell what kind of animal it is: the robot is clearly modeled on a cat. Developed by EPFL's Biorobotics Laboratory (Biorob), the "cheetah-cub robot," a small quadruped prototype, is described in an article appearing today in the International Journal of Robotics Research. The purpose of the platform is to encourage research in biomechanics; what sets it apart is the design of its legs, which makes it very fast and stable. Robots developed from this concept could eventually be used in search-and-rescue missions or for exploration.

This robot is the fastest in its category, namely in normalized speed for small quadruped robots under 30 kg. During tests, it demonstrated its ability to run nearly seven times its body length in one second. Although not as agile as a real cat, it still has excellent auto-stabilization characteristics when running at full speed or over a course that includes disturbances such as small steps. In addition, the robot is extremely light, compact and robust, and can be easily assembled from materials that are inexpensive and readily available.
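
For readers curious what "normalized speed" means in practice, the sketch below computes the two measures most often used to compare runners of different sizes: body lengths per second and the dimensionless Froude number. The speed, body length and leg length in it are hypothetical placeholder values, not the figures reported for the cheetah-cub.

```python
# Illustrative sketch of size-normalized running speed. All numbers are
# hypothetical placeholders, not values from the cheetah-cub paper.

G = 9.81  # gravitational acceleration, m/s^2

def body_lengths_per_second(speed_m_s: float, body_length_m: float) -> float:
    """Speed normalized by body length (the 'seven body lengths per second' metric)."""
    return speed_m_s / body_length_m

def froude_number(speed_m_s: float, leg_length_m: float) -> float:
    """Dimensionless Froude number, another common size-independent speed measure."""
    return speed_m_s ** 2 / (G * leg_length_m)

if __name__ == "__main__":
    speed = 1.4          # m/s -- hypothetical running speed
    body_length = 0.20   # m   -- hypothetical trunk length
    leg_length = 0.15    # m   -- hypothetical hip height
    print(f"{body_lengths_per_second(speed, body_length):.1f} body lengths per second")
    print(f"Froude number: {froude_number(speed, leg_length):.2f}")
```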

Faithful reproduction

The machine's strengths all reside in the design of its legs. The researchers developed a new model with this robot, one that is based on the meticulous observation and faithful reproduction of the feline leg. The number of segments -- three on each leg -- and their proportions are the same as they are on a cat. Springs are used to reproduce tendons, and actuators -- small motors that convert energy into movement -- are used to replace the muscles.

"This morphology gives the robot the mechanical properties from which cats benefit, that's to say a marked running ability and elasticity in the right spots, to ensure stability," explains Alexander Sprowitz, a Biorob scientist. "The robot is thus naturally more autonomous."

Sized for a search


According to Biorob director Auke Ijspeert, this invention is the logical follow-up of research the lab has done into locomotion that included a salamander robot and a lamprey robot. "It's still in the experimental stages, but the long-term goal of the cheetah-cub robot is to be able to develop fast, agile, ground-hugging machines for use in exploration, for example for search and rescue in natural disaster situations. Studying and using the principles of the animal kingdom to develop new solutions for use in robots is the essence of our research."

Tuesday, August 10, 2010

Robots Created That Develop Emotions in Interaction With Humans


Researchers have finalised the first prototype robots capable of developing emotions as they interact with their human caregivers and of expressing a whole range of emotions.
Dr Cañamero with a sad robot. (Credit: Image courtesy of University of Hertfordshire)

Led by Dr. Lola Cañamero at the University of Hertfordshire, and in collaboration with a consortium of universities and robotic companies across Europe, these robots differ from others in the way that they form attachments, interact and express emotion through bodily expression.

Developed as part of the interdisciplinary project FEELIX GROWING (Feel, Interact, eXpress: a Global approach to development with Interdisciplinary Grounding), funded by the European Commission and coordinated by Dr. Cañamero, the robots have been designed to learn to interact with and respond to humans in much the same way that children do, using the same types of expressive and behavioural cues that babies use to learn to interact socially and emotionally with others.

The robots have been created through modelling the early attachment process that human and chimpanzee infants undergo with their caregivers when they develop a preference for a primary caregiver.

They are programmed to learn to adapt to the actions and mood of their human caregivers, and to become particularly attached to an individual who interacts with the robot in a way that is especially suited to its personality profile and learning needs. The more they interact, and the more appropriate the feedback and level of engagement from the human caregiver, the stronger the bond developed and the greater the amount learned.
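
To make the idea of a gradually strengthening bond concrete, here is a toy update rule: attachment grows when an interaction comes with good engagement and appropriate feedback, and slowly decays otherwise. It is a hypothetical sketch, not the FEELIX GROWING architecture.

```python
# Toy illustration of attachment that strengthens with well-matched
# interactions and decays without them. NOT the project's actual model;
# the update rule and constants are hypothetical.

def update_bond(bond: float, engagement: float, feedback_quality: float,
                learn_rate: float = 0.1, decay: float = 0.02) -> float:
    """Return the new bond strength in [0, 1].

    engagement and feedback_quality are values in [0, 1] describing one interaction.
    """
    reinforcement = learn_rate * engagement * feedback_quality
    bond = bond + reinforcement * (1.0 - bond) - decay * bond
    return max(0.0, min(1.0, bond))

if __name__ == "__main__":
    bond = 0.0
    for _ in range(20):
        bond = update_bond(bond, engagement=0.9, feedback_quality=0.8)
    print(f"bond after 20 well-matched interactions: {bond:.2f}")
```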

The robots are capable of expressing anger, fear, sadness, happiness, excitement and pride, and will demonstrate very visible distress if the caregiver fails to comfort them when they are confronted by a stressful situation they cannot cope with, or fails to interact with them when they need it.

"This behaviour is modelled on what a young child does," said Dr Cañamero. "This is also very similar to the way chimpanzees and other non-human primates develop affective bonds with their caregivers."

This is the first time that early attachment models of human and non-human primates have been used to program robots that develop emotions in interaction with humans.

"We are working on non-verbal cues and the emotions are revealed through physical postures, gestures and movements of the body rather than facial or verbal expression," Dr Cañamero added.

The researchers led by Dr. Cañamero at the University of Hertfordshire are now extending the prototype further and adapting it as part of the EU project ALIZ-E, which will develop robots that learn to act as carers and companions for diabetic children in hospital settings.

Within this project, coordinated by Dr Tony Belpaeme of the University of Plymouth, the Hertfordshire group will lead research related to the emotions and non-linguistic behaviour of the robots. The future robot companions will combine non-linguistic and linguistic communication to interact with the children and become increasingly adapted to their individual profiles, in order to support both the therapeutic aspects of their treatment and their social and emotional wellbeing.

The FEELIX GROWING project has been funded by the Sixth Framework Programme of the European Commission. The other partners in the project are: Centre National de la Recherche Scientifique (France), Université de Cergy Pontoise (France), Ecole Polytechnique Fédérale de Lausanne (Switzerland), University of Portsmouth (U.K.), Institute of Communication and Computer Systems (Greece), Entertainment Robotics (Denmark), and Aldebaran Robotics (France).

Sunday, August 8, 2010

Artificial Bee Eye Gives Insight Into Insects’ Visual World


Despite their tiny brains, bees have remarkable navigation capabilities based on their vision. Now scientists have recreated a light-weight imaging system mimicking a honeybee's field of view, which could change the way we build mobile robots and small flying vehicles.

Bee eye view. (Credit: Image courtesy of Institute of Physics)

New research published Aug. 6 in IOP Publishing's Bioinspiration & Biomimetics describes how researchers from the Center of Excellence 'Cognitive Interaction Technology' at Bielefeld University, Germany, have built an artificial bee eye, complete with a fully functional camera, to shed light on the insects' complex sensing, processing and navigational skills.

Consisting of a light-weight mirror-lens combination attached to a USB video camera, the artificial eye manages to achieve a field of vision comparable to that of a bee. In combining a curved reflective surface that is built into acrylic glass with lenses covering the frontal field, the bee eye camera has allowed the researchers to take unique images showing the world from an insect's viewpoint.

In the future, the researchers hope to include UV to fully reflect a bee's colour vision, which is important to honeybees for flower recognition and discrimination and also polarisation vision, which bees use for orientation. They also hope to incorporate models of the subsequent neural processing stages.

As the researchers write, "Despite the discussed limitations of our model of the spatial resolution of the honeybee's compound eyes, we are confident that it is useful for many purposes, e.g. for the simulation of bee-like agents in virtual environments and, in combination with the presented imaging system, for testing bee-inspired visual navigation strategies on mobile robots."

Wednesday, July 7, 2010

Thermal-Powered, Insect-Like Robot Crawls Into Microrobot Contenders' Ring


Robotic cars attracted attention last decade with a 100-mile driverless race across the desert, competing for a $1 million prize put up by the U.S. government.
Tiny, four-sided cilia, pulsating structures that mimic the hairs that line the human windpipe, are arranged in rows along the underside of the robot. (Credit: John Suh, Stanford University)

The past few years have given rise to a growing number of microrobots, miniaturized mobile machines designed to perform specific tasks. And though spectators might need magnifying glasses to see the action, some think the time has come for a microrobotics challenge.

"I'd like to see a similar competition at the small scale, where we dump these microrobots from a plane and have them go off and run for days and just do what they've been told," said Karl Böhringer, a University of Washington professor of electrical engineering. "That would require quite an effort at this point, but I think it would be a great thing."

Researchers at the UW and Stanford University have developed what might one day be a pint-sized contender. Böhringer is lead author of a paper in the June issue of the Journal of Microelectromechanical Systems introducing an insectlike robot with hundreds of tiny legs.

Compared to other such robots, the UW model excels in its ability to carry heavy loads -- more than seven times its own weight -- and move in any direction.

Someday, tiny mobile devices could crawl through cracks to explore collapsed structures, collect environmental samples or do other tasks where small size is a benefit. The UW's robot weighs half a gram (roughly two-hundredths of an ounce), measures about 1 inch long by a third of an inch wide, and is about the thickness of a fingernail.

Technically it is a centipede, with 512 feet arranged in 128 sets of four. Each foot consists of an electrical wire sandwiched between two different materials, one of which expands under heat more than the other. A current traveling through the wire heats the two materials and one side expands, making the foot curl. Rows of feet shuffle along in this way at 20 to 30 times each second.
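
The feet are, in effect, thermal bimorph actuators. A common first-order approximation for such a two-layer strip with equal layer thickness and stiffness gives a curvature of about 3·Δα·ΔT/(2h). The sketch below applies that textbook approximation with purely illustrative numbers, not values taken from the UW/Stanford paper.

```python
# Sketch of the thermal bimorph principle: two bonded layers with different
# thermal expansion curl when heated. Uses the classic equal-thickness,
# equal-stiffness approximation; every number below is illustrative.

def bimorph_curvature(d_alpha: float, d_temp: float, total_thickness: float) -> float:
    """Curvature (1/m) of a two-layer strip: kappa ~= 3 * d_alpha * d_temp / (2 * h)."""
    return 1.5 * d_alpha * d_temp / total_thickness

def tip_deflection(curvature: float, length: float) -> float:
    """Small-deflection tip displacement of a cantilever with uniform curvature."""
    return 0.5 * curvature * length ** 2

if __name__ == "__main__":
    d_alpha = 20e-6   # 1/K -- hypothetical expansion mismatch between the layers
    d_temp = 100.0    # K   -- hypothetical temperature rise from the heater wire
    h = 5e-6          # m   -- hypothetical total foot thickness (5 micrometres)
    L = 500e-6        # m   -- hypothetical foot length (0.5 mm)
    kappa = bimorph_curvature(d_alpha, d_temp, h)
    print(f"curvature: {kappa:.0f} 1/m, tip deflection: {tip_deflection(kappa, L)*1e6:.0f} um")
```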

"The response time is an interesting point about these tiny devices," Böhringer said. "On your stove, it might take minutes or even tens of minutes to heat something up. But on the small scale it happens much, much faster."

The legs' surface area is so large compared to their volume that they can heat up or cool down in just 20 milliseconds.

"It's one of the strongest actuators that you can get at the small scale, and it has one of the largest ranges of motion," Böhringer said. "That's difficult to achieve at the small scale."

The microchip, the robot's body and feet, was first built in the mid 1990s at Stanford University as a prototype part for a paper-thin scanner or printer. A few years later the researchers modified it as a docking system for space satellites. Now they have flipped it over so the structures that acted like moving cilia are on the bottom, turning the chip into an insectlike robot.

"There were questions about the strength of the actuators. Will they be able to support the weight of the device?" Böhringer said. "We were surprised how strong they were. For these things that look fragile, it's quite amazing."

The tiny legs can move more than just the device. Researchers were able to pile paper clips onto the robot's back until it was carrying more than seven times its own weight. This means that the robot could carry a battery and a circuit board, which would make it fully independent. (It now attaches to nine threadlike wires that transmit power and instructions.)

Limbs pointing in four directions allow the robot flexibility of movement.

"If you drive a car and you want to be able to park it in a tight spot, you think, 'Wouldn't it be nice if I could drive in sideways,'" Böhringer said. "Our robot can do that -- there's no preferred direction."

Maneuverability is important for a robot intended to go into tight spaces.

The chip was not designed to be a microrobot, so little effort was made to minimize its weight or energy consumption. Modifications could probably take off 90 percent of the robot's weight, Böhringer said, and eliminate a significant fraction of its power needs.

As with other devices of this type, he added, a major challenge is the power supply. A battery would only let the robot run for 10 minutes, while researchers would like it to go for days.

Another is speed. Right now the UW robot moves at about 3 feet per hour -- and it's far from the slowest in the microrobot pack.
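
Combining the two figures quoted in this article, roughly 3 feet per hour and 20 to 30 leg cycles per second, gives a feel for how tiny each shuffle step is. The snippet below is nothing more than that unit conversion.

```python
# Back-of-the-envelope check combining two figures quoted in the article:
# ~3 feet per hour of forward speed and 20-30 leg-shuffle cycles per second.
# This only converts units; it says nothing about the actual gait mechanics.

FEET_TO_M = 0.3048

speed_m_per_s = 3 * FEET_TO_M / 3600   # ~3 ft/hour expressed in m/s
for cycles_per_s in (20, 30):
    step_per_cycle_um = speed_m_per_s / cycles_per_s * 1e6
    print(f"{cycles_per_s} Hz -> about {step_per_cycle_um:.1f} micrometres of progress per cycle")
```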

Co-authors are former UW graduate students Yegan Erdem, Yu-Ming Chen and Matthew Mohebbi; UW electrical engineering professor Robert Darling; John Suh at General Motors; and Gregory Kovacs at Stanford.

Research funding was provided by the U.S. Defense Advanced Research Projects Agency, the National Science Foundation and General Motors Co.

Tuesday, September 8, 2009

A robot that can take decisions


Robots that can make their own decisions have so far been confined to science fiction movies, but a child-sized figure with big eyes and a white face is trying hard to turn fiction into reality.

Its name is iCub and scientists are hoping it will learn how to adapt its behavior to changing circumstances, offering new insights into the development of human consciousness.

Six versions of the iCub exist in laboratories across Europe, where scientists are painstakingly tweaking its electronic brain to make it capable of learning, just like a human child.

"Our goal is to really understand something that is very human - the ability to cooperate, to understand what somebody else wants us to do, to be able to get aligned with them and work together," said research director Peter Ford Dominey.

iCub is about 1 meter high, with an articulated trunk, arms and legs made up of intricate electronic circuits. It has a white face with the hint of a nose and big round eyes that can see and follow moving objects.

"Shall we play the old game or play a new one?" iCub asked Dominey during a recent experiment at a laboratory in Lyon, in southeastern France. Its voice was robotic, unsurprisingly, though it did have the intonation of a person asking a question. The "game" consisted of one person picking up a box, revealing a toy that was placed underneath. Then another person picked up the toy, before putting it down again. Finally, the first person put the box back down, on top of the toy.

Having watched two humans perform this action, iCub was able to join in the fun.

"The robot is demonstrating that it can change roles. It can play the role of either the first person in the interaction or the second," said Dominey, who receives European Union funding for his work with iCub.

"These robots will be a huge tool for analytical philosophy and philosophy of mind," said Dominey, whose background is in computational neuroscience - in layman's terms, building computer models for different brain functions.

Dominey said after years of research he had understood that such models needed to be "unleashed into the world" and given vision and motor control in order to interact with humans. "Is perception consciousness? The ability to understand that somebody has a goal, is that consciousness?" he asked. "These kinds of questions, we will be able to ask with much more precision because we can have a test bed, this robot, or zombie, that we can use to implement things," he said, describing working with iCub as "an outstanding pleasure."

In the short term, iCub could be used in hospitals to help patients in need of physiotherapy by playing games with them. In the longer term, iCub could gain enough autonomy to help around the house, making its own assessments of needs.

"People have their habits, loading their dishwasher, putting away their dishes. The goal is that the robot can become like a helper, just like a polite apprentice visitor would come into your house and begin to help you," said Dominey.

Anyone looking to cut down on their household chores will need to be patient, however. "It won't be for tomorrow. It's maybe in the next decade we will begin to see this kind of thing," said the scientist.


Friday, August 28, 2009

'Plasmobot': Scientists To Design First Robot Using Mould


Scientists at the University of the West of England are to design the first ever biological robot using mould.


Plasmodium used in the research. (Credit: Image courtesy of University of the West of England)

Researchers have received a Leverhulme Trust grant worth £228,000 to develop the amorphous non-silicon biological robot, plasmobot, using plasmodium, the vegetative stage of the slime mould Physarum polycephalum, a commonly occurring mould which lives in forests, gardens and most damp places in the UK. The Leverhulme Trust-funded research project aims to design the first ever fully biological (no silicon components), amorphous, massively parallel robot.


This project is at the forefront of research into unconventional computing. Professor Andy Adamatzky, who is leading the project, says their previous research has already demonstrated that the mould has computational abilities.


Professor Adamatzky explains, “Most people’s idea of a computer is a piece of hardware with software designed to carry out specific tasks. This mould, or plasmodium, is a naturally occurring substance with its own embedded intelligence. It propagates and searches for sources of nutrients and when it finds such sources it branches out in a series of veins of protoplasm. The plasmodium is capable of solving complex computational tasks, such as the shortest path between points and other logical calculations. Through previous experiments we have already demonstrated the ability of this mould to transport objects. By feeding it oat flakes, it grows tubes which oscillate and make it move in a certain direction carrying objects with it. We can also use light or chemical stimuli to make it grow in a certain direction.


“This new plasmodium robot, called plasmobot, will sense objects, span them in the shortest and best way possible, and transport tiny objects along pre-programmed directions. The robots will have parallel inputs and outputs, a network of sensors and the number crunching power of super computers. The plasmobot will be controlled by spatial gradients of light, electro-magnetic fields and the characteristics of the substrate on which it is placed. It will be a fully controllable and programmable amorphous intelligent robot with an embedded massively parallel computer.”


This research will lay the groundwork for further investigations into the ways in which this mould can be harnessed for its powerful computational abilities.


Professor Adamatzky says that there are long term potential benefits from harnessing this power, “We are at the very early stages of our understanding of how the potential of the plasmodium can be applied, but in years to come we may be able to use the ability of the mould for example to deliver a small quantity of a chemical substance to a target, using light to help to propel it, or the movement could be used to help assemble micro-components of machines. In the very distant future we may be able to harness the power of plasmodia within the human body, for example to enable drugs to be delivered to certain parts of the human body. It might also be possible for thousands of tiny computers made of plasmodia to live on our skin and carry out routine tasks freeing up our brain for other things. Many scientists see this as a potential development of amorphous computing, but it is purely theoretical at the moment.”


Professor Adamatzky recently edited ‘Artificial Life Models in Hardware’, published by Springer and aimed at students and researchers of robotics. The book focuses on the design and real-world implementation of artificial life robotic devices, and covers a range of hopping, climbing and swimming robots, as well as neural networks, slime mould and chemical brains.




Saturday, March 21, 2009

Robot fish to detect pollution in waters


The robotic fish, equipped with sensors that detect hazardous elements, can operate underwater for over eight hours at a time

Scientists in the UK have developed new robotic fish to detect water pollution in rivers, lakes and seas.

The robots – costing around $29,000 each – are being built by Professor Huosheng Hu and his robotics team at the School of Computer Science and Electronic Engineering, University of Essex.

The life-like creatures, which will mimic the undulating movement of real fish, will be equipped with tiny chemical sensors to find the source of potentially hazardous pollutants in the water, such as leaks from vessels in the port or underwater pipelines.

The fish will then transmit their data through Wi-Fi technology when they dock to charge their batteries, which last around eight hours.

Rory Doyle, senior research scientist at technology consultancy BMT Group, has described the project as a “world first”.

“In using robotic fish, we are building on a design created by hundreds of millions of years’ worth of evolution which is incredibly energy efficient. This efficiency is something we need to ensure that our pollution detection sensors can navigate in the underwater environment for hours on end,” he said.

“We will produce a system that allows the fish to search underwater, meaning that we will be able to analyse not only chemicals on the surface of the water, but also those that are dissolved in the water,” he added.

Doyle and Hu hope to release five of the bots into the water by the end of next year.


Friday, March 13, 2009

Follow me?


A researcher waves to a new robot that detects non-verbal commands and can follow its ‘master’, thanks to a new depth-imaging camera (inset) and advanced software

Imagine a day when you turn to your own personal robot, give it a task and then sit down and relax, confident that your robot is doing exactly what you wanted it to do. A team of US-based engineers is working to bring this futuristic scenario closer to reality, with a new robot that can follow a person – indoors and outdoors – and even understand non-verbal commands through gestures.

“We have created a novel system where the robot will follow you at a precise distance, where you don’t need to wear special clothing; you don’t need to be in a special environment; and you don’t need to look backward to track it,” said team leader Chad Jenkins, assistant professor of computer science at Brown University.

A paper on the research was presented at the 4th ACM/IEEE International Conference on Human-Robot Interaction (HRI 2009) in San Diego on Wednesday.

The team started with a PackBot, a mechanised platform that has been used widely by the US military for bomb disposal, among other tasks.

The researchers outfitted their robot with a commercial depth-imaging camera, which makes it look like “the head on the robot in the film Wall-E”.

They also attached a laptop with novel computer programs that enabled the bot to recognise human gestures, decipher them and respond to them.

In a demonstration, graduate student Sonia Chernova used a variety of hand-arm signals to instruct the automaton, including “follow”, “halt”, “wait” and “door breach”.

She walked with her back to it, turning corners in narrow hallways and walking briskly in an outdoor parking lot. Throughout, the bot followed dutifully, maintaining an approximate three-foot distance – and even backed up when Chernova turned around and approached it.

The team also successfully instructed the machine to turn around (a full 180-degree pivot), and to freeze when the student disappeared from view – essentially idling until the instructor reappeared and gave a nonverbal or verbal command.

HOW IT WORKS

To build the robot, the researchers had to address two key issues. The first involved what scientists call visual recognition, which helps robots orient themselves with respect to the objects in a room.

“Robots can see things, but recognition remains a challenge,” Jenkins explained. The scientists overcame this obstacle by creating a computer program, whereby the robot recognised a human by extracting a silhouette, as if a person were a virtual cut-out.

This let it “home in” on the human and receive commands without being distracted by other objects in the space.

The second advance involved the depth-imaging camera, which uses infrared light to detect objects and to establish their distance from the camera.

This enabled the Brown robot to stay locked in on the human controller, which was essential to maintaining a set distance while following the person.
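
A minimal way to picture the distance-keeping behaviour is a proportional controller that turns the depth reading into a forward (or backward) speed command around the three-foot set point. The sketch below is a generic illustration under that assumption, not the Brown team's actual software.

```python
# Minimal distance-keeping sketch: read the person's range from a depth
# camera and command a speed that holds roughly a three-foot (~0.9 m) gap.
# Generic proportional controller for illustration only; the gains and
# limits are hypothetical.

TARGET_DISTANCE_M = 0.9   # ~3 feet
GAIN = 0.8                # proportional gain (hypothetical)
MAX_SPEED_M_S = 0.5       # speed limit (hypothetical)

def follow_speed(measured_distance_m: float) -> float:
    """Positive = drive forward (person too far), negative = back up (person too close)."""
    error = measured_distance_m - TARGET_DISTANCE_M
    speed = GAIN * error
    return max(-MAX_SPEED_M_S, min(MAX_SPEED_M_S, speed))

if __name__ == "__main__":
    for d in (1.8, 0.9, 0.5):   # example depth readings in metres
        print(f"person at {d:.1f} m -> command {follow_speed(d):+.2f} m/s")
```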

The result is a robot that doesn’t require remote control or constant vigilance, which is a key step in developing autonomous devices, Jenkins said.

“Advances in enabling intuitive human-robot interaction, such as through speech or gestures, go a long way into making the robot more of a valuable sidekick and less of a machine you have to constantly command,” added Chris Jones, the principal investigator on the project.

The team is now working to add more non-verbal and verbal commands for the robot and to increase the three-foot working distance between it and the commander.


Sunday, March 8, 2009

Soon, a portable unit of surgical robots to replace army medics on battlefields


The Trauma Pod unit in action

Researchers in the US are working on a project that could replace army medics on a battlefield with robotic surgeons and nurses in the next 10 years.

The ‘Trauma Pod’ – being developed by the US Defense Advanced Research Projects Agency (DARPA) – is currently undergoing trials.

Brendan Visser, a surgeon at Stanford University in California who helped develop the Trauma Pod, described it this way: “Three separate robots dance over the top of the patient with their powerful arms moving very quickly, yet they don’t crash and they’re able to deliver very small items from one arm to another.”

The purpose of the Trauma Pod is to provide a quick “temporary fix” to wounded soldiers before they are taken to hospital.

“The system will focus on damage control surgery, which is the minimum necessary to stabilise someone. It could provide airway control, relieve life-threatening injuries such as a collapsed lung, or stop bleeding temporarily,” Pablo Garcia – of non-profit lab SRI International, which leads the project – told New Scientist magazine.

HOW IT WORKS

The Trauma Pod unit comprises one three-armed surgeon robot, assisted by 12 other robotic systems.

Remotely controlled by a human, the surgeon bot communicates with and instructs the other robots. One of its three arms holds an endoscope to allow the human controller to see inside the patient, while the other two grip surgical tools.

Garcia added that the robot could be allowed to carry out some simple tasks without human help, such as placing stitches or tying knots.

The bed itself monitors vital signs, administers fluids and oxygen, and may eventually administer anaesthesia.

A voice-activated robotic arm, “Hot Lips” – its name derived from the nickname of a nurse in the TV series M*A*S*H – passes fresh tools and supplies to the surgeon bot. A third, “circulating nurse” robot gives out the right tools.

The Trauma Pod unit recently passed the first phase of a feasibility trial, where robots treated a mannequin with bullet injuries by inserting a plastic tube into a damaged blood vessel and operating to close a perforated bowel.

The team hopes to eventually shrink the Trauma Pod to a collapsible unit encased in a portable shell that can be carried on the back of a vehicle.

Robot teacher comes to Japanese school...



A team of Japanese scientists has developed the world’s first robot teacher, which can take attendance and even get angry, apart from teaching students.

Previously employed as a secretary, the humanoid robot, named Saya, is being trialled at a primary school in Tokyo.

According to the scientists, the automaton can speak different languages, carry out roll calls and set tasks, the leading British newspaper The Daily Telegraph reported.

Eighteen motors hidden behind its latex face allow it to adopt several expressions, including anger.

The humanoid was originally developed to replace a variety of workers, including secretaries, in a bid to allow firms to cut costs while still retaining some kind of human interaction.

Its creator, Professor Hiroshi Kobayashi of the University of Tokyo, has been working on robots for 15 years. Saya is the latest in a long line of robots that are spreading to every aspect of life in Japan.

They already guide traffic and approach students to sign up for courses, and one is now being built to keep Alzheimer’s sufferers company.