Thursday, May 22, 2008

Strokable robot rabbit talks with touch


Source:
A pet robot that communicates with humans only by touch is being used to probe the way the oft-neglected sense bolsters our emotional relationships. The findings could be used to make humans' relationships with robots and other pieces of technology more emotionally rewarding.
Steve Yohanan at the University of British Columbia in Vancouver, Canada, says that robotics researchers too often neglect haptics – touch – as a form of communication, concentrating instead on vision and sound. But leaving out touch has a detrimental effect on the quality of the interaction, he says. "I'm trying to provide a deeper experience by adding touch," says Yohanan.
"I had a cat for many years, and what I miss most about interacting with her is touch," he says. "For example, the cat would sit in my lap while I worked at the computer – I would scratch the top of her head and feel her purr."

Purring robot:
Yohanan's new robot, dubbed the Haptic Creature, is designed to recreate that touch-based communication between pet and owner to inject an element of emotion into human-robot interactions. Working out how to do that could have applications ranging from toys to domestic robot servants. The creature is around 35 cm long and has shorter fur on its belly and the back of its two "ears".
Using pressure sensors, the Haptic Creature can detect the way it is touched or stroked. It can only respond with breathing movements of its body, inaudible purring vibrations, or by moving its ears.
But even those simple responses to touch can elicit a range of emotions in humans, says Yohanan. "Our preliminary investigation showed participants could identify most of the emotional responses [across a scale from negative to positive]," he says.
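The sense-then-respond loop described above can be sketched in a few lines. This is purely illustrative (the function names, thresholds and actuator settings are invented for this sketch, not the Haptic Creature's actual software): pressure readings are classified into a simple emotional state, which then selects breathing, purring and ear behaviour.

```python
# Illustrative sketch, not the Haptic Creature's real code: map pressure
# readings (0.0-1.0) to a simple emotional state, then to actuator settings.

def classify_touch(pressures, stroke_speed):
    """Classify a touch from recent pressure samples and stroke speed."""
    peak = max(pressures)
    mean = sum(pressures) / len(pressures)
    if peak > 0.8:
        return "startled"        # hard poke or squeeze
    if mean > 0.2 and stroke_speed == "slow":
        return "content"         # gentle, sustained stroking
    return "neutral"

def respond(emotion):
    """Pick settings for breathing rate, purr intensity and ear stiffness."""
    table = {
        "startled": {"breath_hz": 1.5, "purr": 0.0, "ears": "stiff"},
        "content":  {"breath_hz": 0.4, "purr": 0.8, "ears": "relaxed"},
        "neutral":  {"breath_hz": 0.6, "purr": 0.2, "ears": "neutral"},
    }
    return table[emotion]
```

Even a coarse mapping like this is enough to span the negative-to-positive scale the participants were asked to judge.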
Sommer Gentry, an applied mathematician at the United States Naval Academy in Annapolis, Maryland, US, says that the importance of haptic interaction to the way people use technology has long been neglected.

Dancing arm:
"I am not sure whether it is the technical challenges of human-robot haptic interaction, or under-appreciation of the potential for these technologies that make this a relatively immature area," she says.
In 2003, Gentry programmed a robotic arm to perform a random sequence of hand movements associated with swing-dance moves.
By isolating the movements in this way, she found that a human swing dancer could tell the sequence of moves using touch alone, without needing to observe the movements of the arm or of a dancer.
Steve Yohanan presented the Haptic Creature at the Artificial Intelligence and Simulation of Behaviour (AISB) 2008 Convention in Aberdeen, Scotland in April.
Fausto Intilla - www.oloscience.com

New Robot Walks Like A Human


Source:
ScienceDaily (May 22, 2008) — Researcher Daan Hobbelen of TU Delft (The Netherlands) has developed a new, highly-advanced walking robot: Flame. This type of research, for which Hobbelen will receive his PhD on Friday 30 May, is important as it provides insight into how people walk. This can in turn help people with walking difficulties through improved diagnoses, training and rehabilitation equipment.
If you try to teach a robot to walk, you will discover just how complex an activity it is. Walking robots have been around since the seventies. The applied strategies can roughly be divided into two types. The first derives from the world of industrial robots, in which everything is fixed in routines, as is the case with factory robots. This approach can, where sufficient time and money are invested, produce excellent results, but there are major restrictions with regard to cost, energy consumption and flexibility.
Human
TU Delft is a pioneer of the other method used for constructing walking robots, which examines the way humans walk. This is really very similar to falling forward in a controlled fashion. Adopting this method replaces the cautious, rigid way in which robots walk with the more fluid, energy-efficient movement used by humans.
PhD student Daan Hobbelen has demonstrated for the first time that a robot can be both energy-efficient and highly stable. His breakthrough was to devise the first suitable method for measuring the stability of the way people walk. This is remarkable, as ‘falling forward’ is traditionally viewed as an unstable movement.
Next he built a new robot with which he was able to demonstrate the improved performance: Flame. Flame contains seven motors, an organ of balance and various algorithms which ensure its high level of stability.
For instance, the robot can apply the information provided by its organ of balance to place its feet slightly further apart in order to prevent a potential fall. According to Hobbelen, Flame is the most advanced walking robot in the world, at least in the category of robots which apply the human method of walking as a starting principle.
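The foot-placement trick described above, stepping slightly wider when the torso sways, is the textbook "capture point" rule for a pendulum-like walker. The sketch below illustrates that general idea only; it is not Flame's actual controller, and the leg length is an assumed example value.

```python
import math

# Illustrative "capture point" foot-placement rule for a pendulum-like
# walker: step under the point where the swaying body would come to rest.
# This shows the general idea, not Flame's actual algorithm.

G = 9.81  # gravitational acceleration, m/s^2

def lateral_foot_offset(y, y_dot, leg_length=0.6):
    """Sideways foot offset (m) given torso sway y (m) and sway rate y_dot (m/s)."""
    omega = math.sqrt(G / leg_length)   # natural frequency of the stance "pendulum"
    return y + y_dot / omega            # place the foot under the capture point
```

The larger the sway or the faster it grows, the wider the commanded step, which is exactly the corrective behaviour the article attributes to Flame's organ of balance.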
Rehabilitation
Modelling the walking process allows researchers to construct two-legged robots which walk more naturally. More insight into the walking process can in turn help people with walking difficulties, for example through improved diagnoses, training and rehabilitation equipment. TU Delft is working on this together with motion scientists at VU University Amsterdam.
Hobbelen cites ankles as an example. These joints are a type of spring which can be used to define the best level of elasticity. Research conducted by Hobbelen into Flame’s ankles has provided motion scientists with more insight into this topic.
Football-playing robots
Over the next few years, TU Delft intends to take major steps forward in research into walking robots. These include developing walking robots which can ‘learn’, see and run.
One very special part of the robot research concerns football-playing robots. On Thursday 29 May, together with the University of Twente, TU Eindhoven and Philips, TU Delft will present the Dutch RoboCup team which is to participate in the 2008 RoboCup Soccer in China this summer.
This presentation will take place at TU Delft during the international Dynamic Walking 2008 conference held from 26-29 May. Biomechanics experts, motion scientists and robot experts will come together at this event to exchange expertise on the walking process.
Fausto Intilla - www.oloscience.com

Monday, May 19, 2008

Fuel Cells: New Material Increases Power Output By More Than 50 Percent


Source:
ScienceDaily (May 19, 2008) — MIT engineers have improved the power output of one type of fuel cell by more than 50 percent through technology that could help these environmentally friendly energy storage devices find a much broader market, particularly in portable electronics.
The new material at the heart of the work is also considerably less expensive than its conventional industrial counterpart, among other advantages.
"Our goal is to replace traditional fuel-cell membranes with these cost-effective, highly tunable and better-performing materials," said Paula T. Hammond, Bayer Professor of Chemical Engineering and leader of the research team. She noted that the new material also has potential for use in other electrochemical systems such as batteries.
The work was reported in a recent issue of Advanced Materials by Hammond, Avni A. Argun and J. Nathan Ashcraft. Argun is a postdoctoral associate in chemical engineering; Ashcraft is a graduate student in the same department.
Like a battery, a fuel cell has three principal parts: two electrodes (a cathode and anode) separated by an electrolyte. Chemical reactions at the electrodes produce an electronic current that can be made to flow through an appliance connected to the battery or fuel cell. The principal difference between the two? Fuel cells get their energy from an external source of hydrogen fuel, while conventional batteries draw from a finite source in a contained system.
The MIT team focused on direct methanol fuel cells (DMFCs), in which methanol is used directly as the fuel, so reforming the alcohol into hydrogen is not required. Such a fuel cell is attractive because the only waste products are water and carbon dioxide (the latter produced in small quantities). Also, because methanol is a liquid, it is easier to store and transport than hydrogen gas, and is safer (it won't explode). Methanol also has a high energy density: a little goes a long way, making it especially interesting for portable devices.
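For reference, the claim that water and carbon dioxide are the only waste products follows from the standard DMFC half-reactions; the protons produced at the anode are what the membrane between the electrodes must conduct:

```latex
% Anode (fuel side):
\mathrm{CH_3OH + H_2O \rightarrow CO_2 + 6\,H^+ + 6\,e^-}
% Cathode (air side):
\mathrm{\tfrac{3}{2}\,O_2 + 6\,H^+ + 6\,e^- \rightarrow 3\,H_2O}
% Overall:
\mathrm{CH_3OH + \tfrac{3}{2}\,O_2 \rightarrow CO_2 + 2\,H_2O}
```

This is also why methanol crossover matters: any methanol that seeps through the membrane reaches the cathode without donating its electrons to the external circuit.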
The DMFCs currently on the market, however, have limitations. For example, the material currently used for the electrolyte sandwiched between the electrodes is expensive. Even more important: that material, known as Nafion, is permeable to methanol, allowing some of the fuel to seep across the center of the fuel cell. Among other disadvantages, this wastes fuel and lowers the efficiency of the cell, because the seeping fuel isn't available for the reactions that generate electricity.
Using a relatively new technique known as layer-by-layer assembly, the MIT researchers created an alternative to Nafion. "We were able to tune the structure of [our] film a few nanometers at a time," Hammond said, getting around some of the problems associated with other approaches. The result is a thin film that is two orders of magnitude less permeable to methanol but compares favorably to Nafion in proton conductivity.
To test their creation, the engineers coated a Nafion membrane with the new film and incorporated the whole into a direct methanol fuel cell. The result was an increase in power output of more than 50 percent.
The team is now exploring whether the new film could be used by itself, completely replacing Nafion. To that end, they have been generating thin films that stand alone, with a consistency much like plastic wrap.
This work was supported by the DuPont-MIT Alliance through 2007. It is currently supported by the National Science Foundation.
In addition, Hammond and colleagues have begun exploring the new material's potential use in photovoltaics. That work is funded by the MIT Energy Initiative.
Fausto Intilla - www.oloscience.com

Sunday, May 18, 2008

Weather, Waves And Wireless: Super Strength Signalling

Source:

ScienceDaily (May 16, 2008) — A new study from the University of Leicester has discovered a particular window of time when mobile signals and radio waves are 'super strength' -- allowing them to be clearer and travel greater distances, potentially interfering with other systems.
The research, examining the signal strength of radio waves travelling over the sea, identified late afternoons and early evenings in spring and summer as a time when enhanced signals occur.
The research by Salil Gunashekar formed part of his doctoral studies at the University of Leicester's Department of Engineering, and its results have implications for the design of cellular telephone networks operating in marine and coastal regions.
Dr Gunashekar, who is now a Post-Doctoral Research Associate in the Radio Systems Research Group, said: "In today's world, radio waves are an indispensable means of communicating information 'without wires' from one place to another, be it for radio broadcasts or cell phones, television transmissions or airport radars.
"When radio waves travel for long distances over the sea their strength can be affected by the weather. The constantly changing weather conditions over the sea mean that marine and coastal environments, in particular, are prone to unusual atmospheric phenomena that enable radio waves to travel longer distances and have higher strengths than expected."
On Wednesday 4th June, in the fourth of the series of Doctoral Inaugural Lectures, Dr Gunashekar will present the key findings of his Ph.D. research in which he conducted a detailed theoretical and experimental investigation of the propagation characteristics of over-sea radio communications.
Specifically, between August 2003 and August 2005, three long-range radio paths operating at a frequency in the ultra high frequency band (UHF: specifically 2 Gigahertz) were established in the British Channel Islands. This frequency is of particular importance since it is used by many mobile phones. The relationship between specific over-sea propagation mechanisms and signal strength distribution patterns in a temperate region such as the English Channel have been examined, modelled and correlated with meteorological parameters.
Dr Gunashekar said: "Interestingly, signal strength enhancements have been observed on all three radio paths, predominantly in the late afternoon and evening periods, in the spring and summer months. During these periods, which occur only approximately 5-10% of the time, the influence of higher-altitude radio wave 'trapping' structures has been verified."
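The 'trapping' structures Dr Gunashekar mentions are ducting layers, which can be spotted in weather data using the standard modified-refractivity formula M(h) = N(h) + 0.157 h (h in metres): a duct exists wherever M decreases with height. The sketch below applies that textbook test; the profile values in the test are made up for illustration and are not the study's measurements.

```python
# Spotting a radio duct from a refractivity profile, using the standard
# modified-refractivity formula. Illustrative only; not the Leicester
# group's analysis code.

def modified_refractivity(N, h_m):
    """M-units from refractivity N (N-units) and height h in metres."""
    return N + 0.157 * h_m

def find_ducts(profile):
    """profile: list of (height_m, N) pairs, lowest first.
    Return (h_lower, h_upper) ranges where M decreases with height."""
    M = [(h, modified_refractivity(N, h)) for h, N in profile]
    ducts = []
    for (h1, m1), (h2, m2) in zip(M, M[1:]):
        if m2 < m1:                 # M falling with height => trapping layer
            ducts.append((h1, h2))
    return ducts
```

A layer in which N falls with height faster than 0.157 N-units per metre bends the waves back down, letting the 2 GHz signals hug the sea surface for unusually long distances.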
The research conducted in this investigation is expected to have implications for the design of cellular telephone networks operating in marine and coastal regions, as well as for other maritime communication systems such as those used in commercial shipping and sea-rescue operations. It is all the more applicable to the United Kingdom because of its extensive coastline.

Fausto Intilla - www.oloscience.com

Tuesday, May 13, 2008

Designing Bug Perception Into Robots


Source:
ScienceDaily (May 13, 2008) — Insects have provided the inspiration for a team of European researchers seeking to improve the functionality of robots and robotic tools.
The research furthers the development of more intelligent robots, which can then be used by industry, and by emergency and security services, among others. Smarter robots would be better able to find humans buried beneath the rubble of a collapsed building, for example.
The EU-funded SPARK project set out to develop a new robot control architecture for roving robots inspired by the principles governing the behaviour of living systems and based on the concept of self-organisation.
Basing their work on the basic functions of the insect brain, the team developed a new architecture for artificial cognitive systems that could significantly increase the ability of robots to react to changing environmental conditions and to ‘learn’ behaviour in response to external stimuli.
The research team calls their new software architecture a spatial-temporal array computer-based structure (SPARC).
Robots are complex systems that rely on software, hardware and mechanical systems all working together. One of the challenges facing researchers is to develop robots, or moving artefacts, that are capable of several different behaviours, that are able to sense or perceive external signals and, most importantly, are able to ‘learn’ and react appropriately to changing conditions.
For example, a robot travelling over unknown terrain may need to adapt its way of moving depending on whether it is navigating flat, rocky or wet ground. Or it may need to modify its course to reach a defined target.
The objective is to enable a robot to do this without human intervention, based on its own powers of perception and ability to adapt.
Powers of perception
Within the SPARC software architecture, the robot’s powers of perception are enhanced by its ability to use information derived from visual, audio and tactile sensors to form a dynamically evolving pattern. The pattern is in turn used to determine the movements of the device.
The researchers’ technical objective was to produce a moving artefact able to actively interact with its environment to carry out a set task.
The research so far has already provided a new theoretical framework, or paradigm, for active robot perception. The paradigm is based on principles borrowed from psychology, synergetics, artificial intelligence and non-linear dynamical systems theory.
Learning as you go
One of the researchers’ central objectives was to develop a machine with the ability to build knowledge independent of human control. Researchers based the proposed architecture for artificial cognitive systems on the basic building blocks of the insect brain.
“The SPARC architecture is a starting step toward emulating the essential perception-action architecture of living beings, where some basic behaviours are inherited, like escaping or feeding, while others are incrementally learned, leading to the emergence of higher cognitive abilities,” notes Paolo Arena, the project coordinator.
The cognitive system allows the device to autonomously ‘learn’ based on a combination of basic reflexive behaviours and feedback from external environmental data.
Once the robot is assigned a mission, compatible with its structural and mechanical capabilities – for example ‘find people alive’ – it is able to work out how best to do this itself in a particular external context.
“The robot will initially behave by using primarily the basic inherited behaviours,” says Arena. “Higher knowledge will be incrementally formed in the higher layer of the architecture, which is a neuron lattice based on the Reaction-Diffusion Cellular Non-linear Network (RD-CNN) paradigm, able to generate self-organising dynamic patterns.”
Basic behaviours incorporated in the demonstrations so far include, for example, the ability of a robot to direct itself towards a specific sound source, while an optomotor reflex allows the robot to maintain heading and avoid obstacles.
During the course of the demonstration, the robot ‘learns’ how to safely reach the sound source. It does this while properly modulating its basic behaviours, so that it does not become trapped in the deadlock situations typical of complex, dynamically changing environments.
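The layering described above, inherited reflexes whose influence is modulated by learning, can be sketched as a simple arbiter. All names and numbers below are hypothetical illustrations of the idea, not SPARK's architecture or API:

```python
# Illustrative reflex arbitration: inherited behaviours each propose an
# action with an urgency; learned weights modulate which one wins.
# Hypothetical sketch, not SPARK code.

def obstacle_reflex(sensors):
    return ("turn_away", 1.0) if sensors["obstacle_near"] else ("none", 0.0)

def phonotaxis_reflex(sensors):
    # Steer toward the louder side.
    bias = sensors["sound_right"] - sensors["sound_left"]
    return ("turn_right" if bias > 0 else "turn_left", abs(bias))

class Arbiter:
    def __init__(self):
        # Weights start equal; a learning layer would adapt them over time.
        self.weights = {"obstacle": 1.0, "sound": 1.0}

    def act(self, sensors):
        candidates = {
            "obstacle": obstacle_reflex(sensors),
            "sound": phonotaxis_reflex(sensors),
        }
        best = max(candidates, key=lambda k: candidates[k][1] * self.weights[k])
        return candidates[best][0]
```

In SPARK the modulation is done by a far richer mechanism (the RD-CNN neuron lattice), but the principle is the same: basic behaviours always run, and higher layers shift the balance between them.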
Next steps
The project’s experimental robots used some of the partners’ technologies, such as the real-time visual processing features of the Eye-RIS vision system, one of the lead products of Spain-based Innovaciones Microelectrónicas (Anafocus).
The project also attracted the interest of other commercial enterprises, including STMicroelectronics, which provided components and boards for Rover II, one of the robots developed by SPARK.
Altera, another company, supplied field-programmable gate array (FPGA) devices for the development and implementation of perceptual algorithms.
The advances made have led to a number of software and hardware innovations for the improvement of machine perception. The project’s industrial partners are continuing to work on the innovations.
The cognitive visual algorithms designed and improved by the project’s researchers have, for example, already been integrated into products produced by some of the project’s partners.
Hungary-based Analogic Computers, a partner in the project, has launched its InstantVision software package based on some of the research. The package has become one of the company’s lead products.
The work of the SPARK project is continuing with the SPARK II project, which will look more deeply into the details of insect brain neurobiology to refine, assess and generalise the SPARK cognitive architecture.
Further down the line, the research is expected to lead to the introduction of powerful and flexible machines suitable for use in dynamically changing environments where conditions are unstable or unpredictable, such as war zones or disaster areas.
The project has introduced a new model for action-oriented perception. Ongoing work will focus on assessing this model and on expanding it to a larger family of moving machines.
The SPARK project received funding from the EU's Sixth Framework Programme for research.
Fausto Intilla - www.oloscience.com

Alternative To Silicon Chip Invented By Student


Source:
ScienceDaily (May 13, 2008) — Even before Weixiao Huang received his doctorate from Rensselaer Polytechnic Institute, his new transistor captured the attention of some of the biggest American and Japanese automobile companies. The 2008 graduate's invention could replace one of the most common pieces of technology in the world: the silicon transistor for high-power and high-temperature electronics.
Huang, who comes from humble roots as the son of farmers in rural China, has invented a new transistor that uses a compound material known as gallium nitride (GaN), which has remarkable material properties. The new GaN transistor could reduce the power consumption and improve the efficiency of power electronics systems in everything from motor drives and hybrid vehicles to house appliances and defense equipment.
"Silicon has been the workhorse in the semiconductor industry for last two decades," Huang said. "But as power electronics get more sophisticated and require higher performing transistors, engineers have been seeking an alternative like gallium nitride-based transistors that can perform better than silicon and in extreme conditions."
Each household likely contains dozens of silicon-based electronics. An important component of each of those devices is usually a silicon-based transistor known as a silicon metal-oxide-semiconductor field-effect transistor (silicon MOSFET). To convert electric energy to other forms as required, the transistor acts as a switch, allowing or disallowing the flow of current through the device.
Huang first developed a new process that produces an excellent GaN MOS (metal/oxide/GaN) interface. Engineers have long known that GaN and other gallium-based materials have some extremely good electrical properties, much better than silicon's. Until now, however, no useful GaN MOS transistor had been developed. Huang's innovation, the first GaN MOSFET of its kind in the world, has already shown world-record performance, according to Huang.
In addition, Huang has shown that his innovation can integrate several important electronic functions onto one chip like never before. "This will significantly simplify entire electronic systems," Huang said. He has also designed and experimentally demonstrated several novel high-voltage MOS-gated FETs, which have shown superior performance compared to silicon MOSFETs in terms of lower power consumption, smaller chip size and higher power density.
The new transistors can greatly reduce energy loss, making energy conversion more efficient. "If these new GaN transistors replaced many existing silicon MOSFETs in power electronics systems, there would be global reduction in fossil fuel consumption and pollution," Huang said.
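The energy-loss argument above comes down to simple arithmetic: a transistor used as a switch dissipates roughly P = I²·R_on while conducting, so halving the on-resistance halves the wasted power. The resistance values in this sketch are made-up examples for illustration, not measurements of Huang's devices:

```python
# Back-of-the-envelope conduction loss in a power switch: P = I^2 * R_on.
# The R_on values below are illustrative examples, not measured data.

def conduction_loss_w(current_a, r_on_ohm):
    """Power (W) dissipated in a closed switch carrying current_a amps."""
    return current_a ** 2 * r_on_ohm

silicon_loss = conduction_loss_w(10.0, 0.10)  # 10 A through a 100 mOhm switch
gan_loss     = conduction_loss_w(10.0, 0.02)  # same current, a 20 mOhm switch
```

Multiplied across the millions of switches in motor drives, vehicles and appliances, even modest per-device savings add up to the global reductions Huang describes.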
The new GaN transistors can also allow the electronics system to operate in extremely hot, harsh, and high-power environments and even those that produce radiation. "Because it is so resilient, the device could open up the field of electronic engineering in ways that were not previously possible due to the limitations imposed by less tolerant silicon transistors," he said.
Huang has published more than 15 papers during his time as a doctoral student in the Department of Electrical, Computer, and Systems Engineering at Rensselaer. Despite obvious difficulties, his parents worked tirelessly to give Huang the best possible educational opportunities, he says. And when school wasn't enough, Huang's father woke him up early every morning to practice mathematical calculations without a calculator, instilling in Huang a lifelong appreciation for basic, theoretical mathematics and sciences.
He received a bachelor's in electronics from Peking University in Beijing in 2001 and a master's in physics from Rensselaer in 2003. He will receive his doctorate from Rensselaer on May 17, 2008 and plans to work as a device engineer in the semiconductor industry.
Fausto Intilla - www.oloscience.com

Wednesday, May 7, 2008

Smart Miniature Pump Could Deliver Medicine


Source:

ScienceDaily (May 7, 2008) — An innovative micro-pump makes it possible for tiny quantities of liquid – such as medicines – to be dosed accurately and flexibly. Active composites and an electronic control mechanism ensure that the low-maintenance pump works accurately – both forwards and backwards.
Medicines sometimes have to be administered in extremely small quantities. Just a few tenths of a milliliter may be sufficient to give the patient the ideal treatment. Micro-pumps greatly facilitate the dosage of minute quantities. Pumps like these have been built and constantly optimized for over 25 years. They find application in numerous areas – from medical engineering to microproduction technology – wherever tiny volumes have to be variably dosed with extreme accuracy.
However, these micro-pump systems are usually not as flexible as desired: They often work in only one direction, bubbles in the liquid impair their operation, they do not tolerate bothersome particles, they have a fixed pump output and they contain expendable parts such as valves or cogwheels. Together with partners from research institutes and industry, researchers at the Fraunhofer Institute for Mechanics of Materials IWM in Freiburg have developed an innovative pump system that solves all these problems: a controllable peristaltic micro-pump.
“The peristaltic pump is a highly complex system,” explains IWM project manager Dr. Bärbel Thielicke. “It contracts in waves in a similar way to the human esophagus, and thus propels the liquid along – it changes shape of its own accord. To achieve this, we had to use a whole range of different materials and special material composites.” The researchers use lead-zirconate-titanate (PZT) films that are joined in a suitable way with bending elements made of carbon-fiber-reinforced plastic and a flexible tube. “PZT materials change their shape as soon as you apply an electric field to them. This makes it possible to control the pump system electronically,” says Thielicke. Special adhesives additionally hold the various components of the pump system together. Thanks to the special control electronics, tiny quantities can be pumped accurately through the system.
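A travelling-wave squeeze like the one Thielicke describes is commonly driven as a repeating phase sequence: with three actuators along the tube, stepping through the classic six-phase pattern pushes liquid one way, and stepping through it in reverse pumps backwards. The sketch below illustrates that generic sequencing idea, not the Fraunhofer controller:

```python
# Generic three-actuator peristaltic sequence (1 = tube squeezed shut,
# 0 = open). Stepping forward through the pattern produces a travelling
# wave in one direction; reversing the pattern reverses the flow.
# Illustrative only; not the IWM pump's control electronics.

WAVE = [(1, 0, 0), (1, 1, 0), (0, 1, 0), (0, 1, 1), (0, 0, 1), (1, 0, 1)]

def actuator_states(step, reverse=False):
    """State of the three actuators at a given step of the pump cycle."""
    seq = WAVE[::-1] if reverse else WAVE
    return seq[step % len(seq)]
```

Because every phase keeps at least one actuator closed, the tube is never fully open end-to-end, which is what makes a peristaltic pump valveless and tolerant of bubbles and particles.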
The peristaltic pump system has already passed its first functional tests. Now the researchers are working to adapt the peristaltic micro-pump to the various different applications. “We work with special simulation models to do this,” says Thielicke. “We calculate in advance how the structure of the pump needs to be modified in order to administer other dosages or other liquids. This helps us save time and money during the development phase.”
Adapted from materials provided by Fraunhofer-Gesellschaft.

Fausto Intilla - www.oloscience.com

First Steps Toward Autonomous Robot Surgeries


Source:
ScienceDaily (May 7, 2008) — The day may be getting a little closer when robots will perform surgery on patients in dangerous situations or in remote locations, such as on the battlefield or in space, with minimal human guidance.
Engineers at Duke University believe that the results of feasibility studies conducted in their laboratory represent the first concrete steps toward achieving this space age vision of the future. Also, on a more immediate level, the technology developed by the engineers could make certain contemporary medical procedures safer for patients, they said.
For their experiments, the engineers started with a rudimentary tabletop robot whose "eyes" used a novel 3-D ultrasound technology developed in the Duke laboratories. An artificial intelligence program served as the robot's "brain" by taking real-time 3-D information, processing it, and giving the robot specific commands to perform.
"In a number of tasks, the computer was able to direct the robot's actions," said Stephen Smith, director of the Duke University Ultrasound Transducer Group and senior member of the research team. "We believe that this is the first proof-of-concept for this approach. Given that we achieved these early results with a rudimentary robot and a basic artificial intelligence program, the technology will advance to the point where robots -- without the guidance of the doctor -- can someday operate on people."
The results of a series of experiments in which the robot system directed catheters inside synthetic blood vessels were published online in the journal IEEE Transactions on Ultrasonics, Ferroelectrics and Frequency Control. A second study, published in April in the journal Ultrasonic Imaging, demonstrated that the autonomous robot system could successfully perform a simulated needle biopsy.
Advances in ultrasound technology have made these latest experiments possible, the researchers said, by generating detailed, 3-D moving images in real-time.
The Duke laboratory has a long track record of modifying traditional 2-D ultrasound -- like that used to image babies in utero -- into the more advanced 3-D scans. After inventing the technique in 1991, the team also has shown its utility in developing specialized catheters and endoscopes for real-time imaging of blood vessels in the heart and brain.
In the latest experiment, the robot successfully performed its main task: directing a needle on the end of the robotic arm to touch the tip of another needle within a blood vessel graft. The robot's needle was guided by a tiny 3-D ultrasound transducer, the "wand" that collects the 3-D images, attached to a catheter commonly used in angioplasty procedures.
"The robot was able to accurately direct needle probes to target needles based on the information sent by the catheter transducer," said John Whitman, a senior engineering student in Smith's laboratory and first author on both papers. "The ability of the robot to guide a probe within a vascular graft is a first step toward further testing the system in animal models."
While the research will continue to refine the ability of robots to perform independent procedures, the new technology could also have more direct and immediate applications.
"Currently, cardiologists doing catheter-based procedures use fluoroscopy, which employs radiation, to guide their actions," Smith said. "Putting a 3-D ultrasound transducer on the end of the catheter could provide clearer images to the physician and greatly reduce the need for patients to be exposed to radiation."
In the earlier experiments, the tabletop robot arm successfully touched a needle on the arm to another needle in a water bath. Then it performed a simulated biopsy of a cyst, fashioned out of a liquid-filled balloon in a medium designed to simulate tissue.
"These experiments demonstrated the feasibility of autonomous robots accomplishing simulated tasks under the guidance of 3-D ultrasound, and we believe that it warrants additional study," Whitman said.
The researchers said that adding this 3-D capability to more powerful and sophisticated surgical robots already in use at many hospitals could hasten the development of autonomous robots that could perform complex procedures on humans.
The research in Smith's lab is supported by the National Institutes of Health. Other Duke members of the team were Matthew Fronheiser and Nikolas Ivancevich.
Adapted from materials provided by Duke University.
Fausto Intilla - www.oloscience.com

New Cell-based Sensors Sniff Out Danger Like Bloodhounds


Source:
ScienceDaily (May 7, 2008) — A small, unmanned vehicle makes its way down the road ahead of a military convoy. Suddenly it stops and relays a warning to the convoy commander. The presence of a deadly improvised explosive device, or IED, has been detected by sophisticated new sensor technology incorporating living olfactory cells on microchips mounted on the unmanned vehicle. The IED is safely dismantled and lives are saved.
This scenario may become a reality, thanks to the work of three faculty researchers in the University of Maryland's A. James Clark School of Engineering who are collaborating across engineering disciplines to make advanced "cell-based sensors-on-a-chip" technology possible. Pamela Abshire, of electrical and computer engineering (ECE) and the Institute for Systems Research (ISR); Benjamin Shapiro, of aerospace engineering and ISR; and Elisabeth Smela, of mechanical engineering and ECE, are working on new sensors that take advantage of the sensory capabilities of biological cells.
These tiny sensors, only a few millimeters in size, could speed up and improve the detection of everything from explosive materials to biological pathogens to spoiled food or impure water.
Today's biochemical detectors are slow and produce an unacceptable number of false readings.
They are easily fooled because they often cannot distinguish subtle differences between deadly pathogens and harmless substances, and cannot fully monitor or interpret the different ways these substances interact with biological systems. To solve this problem, the Clark School researchers are learning how to incorporate real cells into tiny micro-systems to detect chemical and biological pathogens.
Different cells can be grown on these microchips, depending on the task at hand. Like a bloodhound hot on the trail of a scent, a chip containing a collection of olfactory cells plus sensing circuits that can interpret their behavior could detect the presence of explosives.
The researchers plan to use other specialized cells much like a canary in a coal mine. The cells would show stress or die when exposed to certain pathogens, and the sensing circuits monitoring them would trigger a warning -- more quickly and accurately than in present systems.
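The "canary in a coal mine" idea can be illustrated with a minimal monitoring sketch. The threshold values and signal names below are illustrative assumptions, not the Clark School team's actual design:

```python
# Minimal sketch of the "canary" monitoring idea: circuits watch a
# stress signal from each cell and raise an alarm when too many cells
# show stress at once. Thresholds here are illustrative assumptions.

STRESS_THRESHOLD = 0.7   # per-cell stress level that counts as "stressed"
ALARM_FRACTION = 0.5     # alarm if at least half the cells are stressed

def check_cells(stress_levels):
    """Return True if enough cells look stressed to warrant a warning."""
    stressed = sum(1 for s in stress_levels if s >= STRESS_THRESHOLD)
    return stressed / len(stress_levels) >= ALARM_FRACTION

print(check_cells([0.2, 0.9, 0.8, 0.1]))  # True: 2 of 4 cells stressed
print(check_cells([0.1, 0.2, 0.3, 0.9]))  # False: only 1 of 4 stressed
```

In a real chip the "stress level" would be some measurable property of the cell reported by Abshire's sensing circuits; the point of the sketch is only the alarm logic.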
The researchers are tackling the many challenges that must be met for such chips to become a reality. Abshire, for example, is building circuits that can interact with the cells and transmit alerts about their condition. Shapiro and Smela are working on micro-fluidics technology to get the cells where they need to be on the chip, and to keep them alive and healthy once they're in position. Smela is also developing packages that incorporate the kind of wet, life-sustaining environments the biological components need, while keeping the sensitive electronic parts of the sensor dry.
Current research funding for the cell-based sensor technology comes from the National Science Foundation, the Department of Homeland Security and the Defense Intelligence Agency. Potential applications for their use extend well beyond national security, however.
For example, cell-based sensors could detect the presence of harmful bacteria in ground beef or spinach, or detect the local origin of specialty foods like cheeses or wines. In the pharmaceutical industry they could identify the most promising medicines in advance of animal and human trials, increasing cost-effectiveness and speed in developing new drugs. And they could speed up research in basic science. Imagine tiny biology labs, each one on a chip, in an array of thousands of chips that could fit in the palm of your hand.
Such arrays could advance biologists' fundamental understanding about the sense of smell or help doctors better see how the immune system works. They could be placed on fish as they swim in the ocean to monitor water quality, or set on a skyscraper's roof to evaluate air pollution.
"We bring the capability to monitor many different cells in parallel on these chips," explains Abshire. "You could say we're applying Moore's Law of exponentially increasing computer processing capability to cell biology."
The research won the University of Maryland's 2004 Invention of the Year Award in the physical science category. A patent application is on file with the U.S. Patent and Trademark Office.
Adapted from materials provided by University of Maryland, via EurekAlert!, a service of AAAS.

Tuesday, May 6, 2008

Piecing Together The Next Generation Of Cognitive Robots


Source:
ScienceDaily (May 6, 2008) — Building robots with anything akin to human intelligence remains a far off vision, but European researchers are making progress on piecing together a new generation of machines that are more aware of their environment and better able to interact with humans.
Making robots more responsive would allow them to be used in a greater variety of sophisticated tasks in the manufacturing and service sectors. Such robots could be used as home helpers and caregivers, for example.
As research into artificial cognitive systems (ACS) has progressed in recent years, it has grown into a highly fragmented field. Some researchers and teams have concentrated on machine vision, others on spatial cognition or human-robot interaction, among many other disciplines.
All have made progress, but, as the EU-funded project CoSy (Cognitive Systems for Cognitive Assistants) has shown, by working together the researchers can make even more advances in the field.
“We have brought together one of the broadest and most varied teams of researchers in this field,” says Geert-Jan Kruijff, the CoSy project manager at the German Research Centre for Artificial Intelligence. “This has resulted in an ACS architecture that integrates multiple cognitive functions to create robots that are more self-aware, understand their environment and can better interact with humans.”
The CoSy ACS is indeed greater than the sum of its parts. It incorporates a range of technologies from a design for cognitive architecture, spatial cognition, human-robot interaction and situated dialogue processing, to developmental models of visual processing.
“We have learnt how to put the pieces of ACS together, rather than just studying them separately,” adds Jeremy Wyatt, one of the project managers at the UK’s University of Birmingham.
The researchers have made the ACS architecture toolkit they developed available under an open-source licence to encourage further research. The toolkit has already sparked several spin-off initiatives.
Overcoming the integration challenge
“The integration of different components in an ACS is one of the greatest challenges in robotics,” Kruijff says. “Getting robots to understand their environment from visual inputs and to interact with humans from spoken commands and relate what is said to their environment is enormously complex.”
Because of this complexity, most robots developed to date have tended to be reactive. They simply react to their environment rather than act in it autonomously. Like a beetle that scuttles away when prodded, many mobile robots back off when they collide with an object, but have little self-awareness or understanding of the space around them and what they can do there.
In comparison, a demonstrator called the Explorer developed by the CoSy team has a more human-like understanding of its environment. Explorer can even talk about its surroundings with a human.
Instead of using just geometric data to create a map of its surroundings, the Explorer also incorporates qualitative topographical information. Through interaction with humans it can then learn to recognise objects, spaces and their uses. For example, if it sees a coffee machine it may reason that it is in a kitchen. If it sees a sofa it may conclude it is in a living room.
“The robot sees a room much as humans see it because it has a conceptual understanding of space,” Kruijff notes.
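The Explorer's object-to-room reasoning can be sketched as a toy inference, in which observed objects vote for the room category they suggest. The object-room associations and function names below are illustrative assumptions, not CoSy's actual architecture:

```python
# Toy sketch of conceptual room inference, as described for the Explorer:
# each recognised object votes for the room it is typically found in.
# The associations below are illustrative assumptions.
from collections import Counter

OBJECT_TO_ROOM = {
    "coffee machine": "kitchen",
    "oven": "kitchen",
    "sofa": "living room",
    "television": "living room",
}

def infer_room(observed_objects):
    """Guess the room category from the objects seen in it."""
    votes = Counter(
        OBJECT_TO_ROOM[obj] for obj in observed_objects if obj in OBJECT_TO_ROOM
    )
    if not votes:
        return "unknown"
    return votes.most_common(1)[0][0]

print(infer_room(["coffee machine", "oven"]))  # kitchen
print(infer_room(["sofa", "television"]))      # living room
```

The real system combines this kind of conceptual knowledge with geometric mapping; the sketch shows only the qualitative half of that combination.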
Another demonstrator, called the PlayMate, applied machine vision and spatial recognition in a different context. PlayMate uses a robotic arm to manipulate objects in response to human instructions.
In Wyatt’s view the development of machine vision and its integration with other ACS components is still a big obstacle to creating more advanced robots, especially if the goal is to replicate human sight and awareness.
“Don’t underestimate how sophisticated we are…,” he says. “We don’t realise how agile our brains are at interpreting what we see. You can pick out colours from a scene, look at a bottle of water, a packet of cornflakes, or a coffee mug and know what activities each of them allows. You recognise them, see where to grasp them, and how to manipulate them, and you do it all seamlessly. We are still so very, very far from doing that with robots.”
Robotic ‘gofers’
Fortunately, replicating human-like intelligence and awareness, if it is indeed possible, is not necessary when creating robots that are useful to humans.
Kruijff foresees robots akin to those developed in the CoSy project becoming an everyday sight over the coming years in what he describes as ‘gofer scenarios’. Already some robots with a lower level of intelligence are being used to bring medicines to patients in hospitals and could be used to transport documents around office buildings.
Robotic vacuum cleaners are becoming increasingly popular in homes, as too are toys that incorporate artificial intelligence. And the creation of robots that are able to interact with people opens the door to robotic home helpers and caregivers.
“In the future people may all be waited on by robots in their old age,” Wyatt says.
Adapted from materials provided by ICT Results.

Sunday, May 4, 2008

The rise of the emotional robot


Duke, resplendent in the dark blue and white colours of Duke University in Durham, North Carolina, is careering noisily across a living room floor. He's no student but a disc-shaped robotic vacuum cleaner called the Roomba. Not only have his owners dressed him up, they have also given him a name and gender.
Duke is not alone. Such behaviour is common, and takes myriad forms according to a survey of almost 400 Roomba owners, conducted late last year by Ja-Young Sung and Rebecca Grinter, who research human-computer interaction at the Georgia Institute of Technology in Atlanta.
"Dressing up Roomba happens in many ways," Sung says. People also often gave their robots a name and gender, according to the survey (see Diagram) which Sung presented at the Human-Robot Interaction conference earlier this month in Amsterdam, the Netherlands.
Kathy Morgan, an engineer based in Atlanta, said that her robot wore a sticker saying "Our Baby", indicating that she viewed it almost as part of the family. "We just love it. It frees up our lives from so much cleaning drudgery," she says.
Sung believes that the notion of humans relating to their robots almost as if they were family members or friends is more than just a curiosity. "People want their Roomba to look unique because it has evolved into something that's much more than a gadget," she says. Understanding these responses could be the key to figuring out the sort of relationships people are willing to have with robots.
Until now, robots have been designed for what the robotics industry dubs "dull, dirty and dangerous" jobs, like welding cars, defusing bombs or mowing lawns. Even the name robot comes from robota, the Czech word for drudgery. But Sung's observations suggest that we have moved on. "I have not seen a single family who treats Roomba like a machine if they clothe it," she says. "With skins or costumes on, people tend to treat Roomba with more respect."
The Roomba, which is made by iRobot in Burlington, Massachusetts, isn't the only robot that people seem to bond with. US soldiers serving in Iraq and interviewed last year by The Washington Post developed strong emotional attachments to Packbots and Talon robots, which dispose of bombs and locate landmines, and admitted feeling deep sadness when their robots were destroyed in explosions. Some ensured the robots were reconstructed from spare parts when they were damaged and even took them fishing, using the robot arm's gripper to hold their rod.

Figuring out just how far humans are willing to go in shifting the boundaries towards accepting robots as partners rather than mere machines will help designers decide what tasks and functions are appropriate for robots. Meanwhile, working out whether it's the robot or the person who determines the boundary shift might mean designers can deliberately create robots that elicit more feeling from humans. "Engineers will need to identify the positive robot design factors that yield good emotions and not bad ones - and try to design robots that promote them," says Sung.
To work out which kinds of robots are more likely to coax social responses from humans, researchers led by Frank Heger at Bielefeld University in Germany are scanning the brains of people as they interact with robots. The team starts by getting humans to "meet" four different "opponents": a computer program running on a laptop, a pair of robotic lego arms that tap the keys of a laptop, a robot with a human-shaped body and rubbery human-like head, which also taps at a laptop, and a human. Then the volunteers don video goggles and enter an MRI machine. While inside the machine, a picture of the opponent they must play against flashes up inside their goggles.
The game, a modified version of the prisoner's dilemma, asks volunteers to choose between cooperating with their opponent or betraying them. As they can't tell what their opponent will do, it requires them to predict what their opponent is thinking. The volunteers indicate their choice from inside the scanner using a handset that controls their video display. The team carried out the experiment on 32 volunteers, who each played all four opponents. Then they compared the brain scans for each opponent, paying particular attention to the parts of the brain associated with assessing someone else's mental state, known as theory of mind. This ability is considered a vital part of successful social interactions.
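The tension in the prisoner's dilemma can be made concrete with the standard payoff matrix. The payoff values below are the textbook ones for illustration; the article does not give the exact payoffs of the modified version the study used:

```python
# Standard prisoner's dilemma payoffs (player A, player B).
# Values are the conventional textbook ones, used here for illustration.
PAYOFFS = {
    ("cooperate", "cooperate"): (3, 3),
    ("cooperate", "defect"):    (0, 5),
    ("defect", "cooperate"):    (5, 0),
    ("defect", "defect"):       (1, 1),
}

def play(choice_a, choice_b):
    """Return the (A, B) payoffs for one round."""
    return PAYOFFS[(choice_a, choice_b)]

# Mutual cooperation beats mutual defection, yet each player is
# individually tempted to defect -- which is why predicting what the
# opponent will do (theory of mind) matters in this game.
print(play("cooperate", "cooperate"))  # (3, 3)
print(play("defect", "cooperate"))     # (5, 0)
```

Because the best move depends entirely on what the opponent is expected to do, the game is a natural probe for the theory-of-mind brain activity the researchers measured.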
Unsurprisingly, the team found that neurons associated with having a theory of mind were active to some extent when playing all opponents. However, they were more active the more human-like their opponent was, with the human triggering the most activity in this region, followed by the robot with the human-like body and head. The team says this shows that the way a robot looks affects the sophistication of an interaction.
Though there are similarities between the way people view robots and other human beings, there are also differences. Daniel Levin and colleagues at Vanderbilt University in Nashville, Tennessee, showed people videos of robots in action and then interviewed them. He says that people are unwilling to attribute intentions to robots, no matter how sophisticated they appear to be.
Further complicating the matter, researchers have also shown that the degree to which someone socialises with and trusts a robot depends on their gender and nationality (See "Enter the gender-specific robot").
These uncertainties haven't stopped some researchers from forming strong opinions. Herbert Clark, a psychologist at Stanford University in California, is sceptical about humans ever having sophisticated relationships with robots. "Roboticists should admit that robots will never approach human-like interaction levels - and the sooner they do the sooner we'll get a realistic idea of what people can expect from robots." He says that robots' lack of desire and free will is always going to limit the way humans view them.
But Hiroshi Ishiguro of Osaka University in Japan thinks that the sophistication of our interactions with robots will have few constraints. He has built a remote-controlled doppelgänger, which fidgets, blinks, breathes, talks, moves its eyes and looks eerily like him (New Scientist, 12 October 2006, p 42). Recently he has used it to hold classes at his university while he controls it remotely. He says that people's reactions to his doppelgänger suggest that they are engaging with the robot emotionally. "People treat my copy completely naturally and say hello to it as they walk past," he says. "Robots can be people's partners and they will be."


Enter the gender-specific robot
How people view robots may inform what future robots can do, but it seems that gender and nationality feed into our reaction, too.
Cognitive scientist Paul Schermerhorn and colleagues at Indiana University in Bloomington asked 24 men and 23 women to cooperate with a machine-like robot on solving a mathematical problem and filling in a survey form. The robot consisted of a base with metre-high posts either side supporting a head with two cameras that looked like eyes. A voice synthesiser allowed it to speak. The team found that men thought of the robot as "more human-like" than women and engaged well with it at a social level, while women felt socially aloof and described it as "more machine-like".
However, the researchers say the difference in perception may be due to the way this particular robot interacted with the women - perhaps for some reason that robot appealed to men. They say that robots might need to acquire gender-specific behaviours to engage with humans. "People might prefer to interact with robots that exhibit characteristics of their gender, or of the opposite gender," says Schermerhorn. "This could lead to tailoring of the robot's characteristics to the [gender of the] human in future interactions."
Meanwhile, Vanessa Evers of the University of Amsterdam, the Netherlands, together with researchers at Stanford University in California, has found that US volunteers of European descent perceive robots differently from volunteers raised in China who had lived outside China for less than six years. They asked their volunteers how they would react in a hypothetical space emergency when a robot was on hand that might save them. It turned out that the US participants were more willing to trust the robot's decisions and were happier giving it control of the spacecraft than the Chinese participants. "This confirms that people from different national cultures may respond differently to robots," Evers says.


Robotic bugs set to invade the battlefield

Source:

A swarm of robotic insects is being developed for the military to hunt down enemy fighters in buildings and caves, carry mini bombs and identify chemical, nuclear or biological weapons.

Watch the video at:
http://link.brightcove.com/services/link/bcpid1488655367/bctid1536203797

They look as though they have crawled from the set of a science fiction film, but the bugs are based on the design and size of real insects, including spiders and dragonflies.
They are to be fitted with cameras, as well as sensors to identify different types of weapon, and can be kitted out with a small payload of explosives.
The spider model is similar to that featured in the 2002 sci-fi film, Minority Report, starring Tom Cruise, in which robot insects are sent into a building by police to search for a suspect.
The robots are being developed for use by the American military and its allies, including the British Army, by BAE Systems.
Prototypes small enough to sit on a fingertip have already been created, including a fly that weighs less than an ounce and has a wingspan of 1.18in.
Lightweight carbon joints allow the robot to mimic precisely the movements of a real fly, with wings that beat 110 times a second.
Steve Scalera, programme manager for the project, said: "We’re trying to harness nature’s designs. Evolution has done a fabulous job of producing extremely efficient and capable systems.
"We’re building a collection of miniature robots that can explore complex terrain we wouldn’t normally be able to approach because it is too dangerous.
"This might mean exploring buildings or caves looking for people inside, searching for dangerous items like munitions, chemical, biological or nuclear substances that might be there."
The battery-powered insects will not be remotely-controlled by soldiers, but will be fitted with "artificial intelligence" software that lets them operate autonomously, and in teams.
Mr Scalera added: "We don't want to overburden soldiers on the battlefield. These devices can find their own way and work together in teams, much like groups of ants or bees do. But they work for the soldiers, feeding them information.
"At the soldier level, on the battlefield, we envisage these pieces of equipment to be ubiquitous. We want to actually put them in the hands of soldiers, who may have a pocketful of them.
"They can then use them at a moment’s notice, to provide additional awareness and to extend the soldier’s senses and reach, perhaps to look over a wall or search a building, before breaching it. They will enable us to do things that we currently just can’t do. They will save lives.”
The creators also envisage civilian uses for the insects, such as search-and-rescue operations, following building or mine collapses.
The Micro Autonomous Systems and Technology (Mast) project is being led by BAE Systems and involves scientists at universities across America.
It has been funded by a £19 million grant to BAE by the US Army Research Laboratory for use by America and Britain. Dr Joseph Mait, of the laboratory, said: "Robotic platforms provide operational capabilities that would otherwise be costly, impossible, or deadly to achieve."
Prof Ismet Gursul, who has been studying insect flight for use in robotics at the University of Bath, said: "This might seem like science fiction, but it is a process of natural evolution for robots. Engineers are making robots smaller and smaller, because it saves on costs and allows you to make more."


Prepping Robots to Perform Surgery


Source:
By BARNABY J. FEDER
Published: May 4, 2008
WHAT do you call a surgeon who operates without scalpels, stitching tools or a powerful headlamp to light the patient’s insides? A better doctor, according to a growing number of surgeons who prefer to hand over much of the blood-and-guts portion of their work to medical robots controlled from computer consoles.
Many urologists performing prostate surgery view the precise, tremor-free movements of a robot as the best way to spare nerves crucial to bladder control and sexual potency. A robot’s ability to deftly handle small tools may lead to a less invasive procedure and faster recovery for a patient. Robots also can protect surgeons from physical stress and exposure to X-rays that may force them into premature retirement.
A generation ago, the debate in medicine was whether robotics would ever play a role. Today, robots are a fast-growing, diversifying $1 billion segment of the medical device industry. And Wall Street has just two questions for the industry: How far is this going, and how fast?
There are no simple answers, of course, but it is remarkable how often Frederic H. Moll comes up in any discussion.
Dr. Moll, 56, is a soft-spoken man who can look uncomfortable on stage. Yet his role in founding Intuitive Surgical, the company that now dominates the field, and his current involvement with three other robotics companies, has kept him in the sights of investors, health care providers and fellow entrepreneurs.
He’s now best known as chief executive of Hansen Medical, a publicly traded robotics company focused on minimally invasive cardiac care. But he’s also an investor in and a board member of Mako Surgical, an orthopedics robotics company that recently went public, and he is a co-founder and chairman of Restoration Robotics, a start-up company focused on cosmetic surgery.
“Anyone who meets Fred will remember him,” says Maurice R. Ferré, the chief executive of Mako, which makes a drill that shuts off if a knee surgeon starts removing too much bone. “He will cut you off to ask technical questions and drives right to what’s important. A lot of people are looking at the Mako story because Fred’s involved.”
Despite Wall Street’s growing fondness for medical robotics companies, plenty of health care providers and insurers are cautious. They’re looking for more evidence that robotics improves outcomes for patients at a cost hospitals can absorb. Many still wonder whether it is more about marketing than medical progress.
Winifred Hayes, chief executive of Hayes Inc., a health care technology consulting firm in Lansdale, Pa., says that most clinical data doesn’t support contentions that patients fare better with robotic surgery. Most hospitals and clinics are losing money or making poor returns on their robots, she says.
“The real story is that this is a technology that has been disseminated fairly widely prematurely,” she says.
Even so, interest in robotics remains strong, and the arc of Dr. Moll’s own career has landed him at the intersection of tussles between business and medicine.
His parents were both pediatricians, and he sailed through medical school. But during his surgical residency at the Virginia Mason Medical Center in Seattle in the early 1980s, he found the ailments of patients less compelling than the shortcomings of the tools that surgeons used to treat them.
“I was struck by the size of the incision and injury created just to get inside the body,” Dr. Moll says. “It felt antiquated.”
So he obtained a leave of absence to study whether the long slender cutting tools he had seen gynecologists use in sterilization surgery on women could be adapted to gall bladder removal.
“We saved the spot for 10 years, but he never came back,” said Dr. John A. Ryan Jr., then head of the surgical training at Virginia Mason.
Indeed, Dr. Moll had left Seattle for Silicon Valley, where he spent the next decade creating and selling two medical equipment businesses while getting a graduate degree in management at Stanford. He walked away from the two deals with about $7.5 million. That was modest by the standards of, say, Paul Allen and Bill Gates, the Microsoft founders who were his schoolmates at the exclusive Lakeside School in Seattle in the early 1970s, but Dr. Moll had found his calling.
He says his immersion in the entrepreneurial life cost him his marriage; he remembers once telling his wife he was so busy he couldn’t talk to her for a month. But it also set him on a course to become a pioneer in the emerging field of medical robotics.
ROBOTS revolutionized manufacturing during the 1980s, on the back of advances in computing, motion controls and software design.
Visionaries like Dr. Richard M. Satava, who oversaw federally funded medical robotics research at the time, predicted that robots would eventually be able to operate as precisely as the world’s greatest surgeons and far more tirelessly, perhaps even in remote locations, through satellite links.
A project that Dr. Satava’s group financed to build a remotely controlled medical robot for the battlefield caught Dr. Moll’s eye in 1994.
Dr. Moll saw scant commercial potential for long-distance surgery, but he became convinced that the technology, being developed by SRI International, a nonprofit contract research firm in Palo Alto, Calif., could be adapted to make routine surgery much less invasive in the hands of civilian surgeons.
He took the idea to his employer, Guidant, a medical device company. Guidant decided that robotic surgery was too futuristic and too risky, so Dr. Moll rounded up backers, resigned, and in 1995, founded Intuitive Surgical.
A competitor, Computer Motion, had a head start using technology developed for the space program. But Intuitive Surgical had an experienced management team headed by Lonnie M. Smith. Mr. Smith was recruited from Hillenbrand Industries, where he oversaw health care companies, to become chief executive in 1997, leaving Dr. Moll to concentrate on strategic development.
Intuitive went public in 2000 at $9 a share. (Dr. Moll’s stake at the time was worth roughly $13.5 million, and he still owns a significant number of shares.) In 2003, it acquired Computer Motion, eliminating both patent wars and the competing design. Since then, soaring sales and profits have laid to rest any Wall Street doubts that robots could be commercially successful.
The company earned $144.5 million last year on sales of $600.8 million. Based on first-quarter results that were better than expected, Intuitive forecasts that sales will grow 42 percent this year, to $853.2 million. Its stock, which traded at $42.42 three years ago, closed Friday at $290.03 a share.
The company prospered by proving that robots could deftly handle rigid surgical tools like scalpels and sewing needles through small incisions in a patient’s skin. In prostate surgery, it is rapidly becoming unusual for a urologist to operate without using one of Intuitive’s da Vinci robots, which sell for $1.3 million, on average. Each also generates hundreds of thousands of dollars more in annual revenue from service contracts and attachments that must be replaced after each procedure. Intuitive is now marketing the da Vinci to other specialists, including gynecologists and heart surgeons.
Intuitive’s success has not put to rest questions about how many hospitals and clinics can afford robots. The da Vinci and the CyberKnife, a precision radiation robot from Accuray to treat tumors, are featured in hospital ads to attract patients, but it is hard for hospitals to get extra reimbursement from insurers for using them.
However, hospitals that have been leaders in adopting robotic technology say they are content to just break even for now, because the investment is partly about attracting surgeons who want to be leaders in research and training.
“If you are looking at the future, it’s hard to envision a hospital not offering robotics,” said Robert Glenning, chief financial officer at the Hackensack University Medical Center in New Jersey, which has bought five da Vincis and has a sixth on loan from Intuitive Surgical that is used to train visiting doctors.
DR. MOLL left Intuitive in 2002 to pursue a more ambitious concept at Hansen Medical: robots that manipulate the tips of thin, flexible catheters that doctors insert deep in the heart. If he succeeds, the Sensei robotic systems from Hansen, costing about $675,000, may become the go-to tools for treating many circulatory problems.
Relations between the two companies were rocky in the first year because of disagreements over the breadth of Intuitive’s patents. Eventually, the two signed an intellectual property agreement that gives Intuitive a 3 percent royalty on Hansen sales. With Intuitive expanding into cardiac care, the two may eventually collide in some procedures.
Doctors who use catheters generally gain access to the circulatory system through a small incision in the major veins that run through the thigh or arm. Both the makers of rigid tools and the catheter companies are competing in another fast-developing field of “scarless” therapy involving operations performed through the urinary tract and other natural openings.
Dr. Moll is betting that flexible tools like those that work with the Sensei will dominate as this movement matures. He took a team of four Hansen employees to India last summer for a series of surgeries testing whether kidney stones could be removed by using a robotic catheter. Dr. Inderbir S. Gill, a urologist from the Cleveland Clinic who led the research, said that Dr. Moll had followed every case for four days.
“He was at the console like a mother hen even though he wasn’t allowed to touch it,” said Dr. Gill, who received stock in Hansen for work on the research and is planning a clinical trial.
Like Intuitive in its early days, Hansen faces a competitor that got an earlier start. Stereotaxis, based in St. Louis, makes the Niobe, a robot that generates magnetic fields around the patient. By manipulating the magnetic field from Niobe’s computer, doctors can manage the movements inside the patient of its customized magnetic catheters.
The Sensei manipulates a Hansen catheter called Artisan, a hollow sheath through which doctors can deploy smaller catheters. Sensei and Artisan were approved by federal regulators last May for use with catheters that map electrical activity in the heart. While mapping is currently the only job for which Hansen can actively market the Sensei, the robot’s real focus is to combine mapping with minimally invasive treatments to halt electrical short circuits in the heart that cause it to beat abnormally.
Fans include Dr. Davendra Mehta, chief arrhythmia specialist at Mount Sinai Medical Center, who last fall became the first doctor in New York City to order a Sensei. “This is like power steering versus conventional steering,” said Dr. Mehta during a recent procedure.
Using the robot also lets Dr. Mehta avoid spending up to five hours a day wearing a lead vest to limit his exposure to the X-rays when monitoring the catheter’s location in a patient.
THE potential appeal of the Sensei may be obvious. But with just 23 systems installed at the end of March, the competition from Stereotaxis, and doubts among many health care providers about whether robots are worth the expense, Dr. Moll has plenty of obstacles ahead.
Still, he and his team members took Hansen public in November 2006, and received approval from regulators in Europe and the United States to market the Sensei. In April, Hansen raised $39.4 million in a secondary stock offering despite Wall Street’s gloomy outlook on the economy. Hansen also has an agreement with St. Jude Medical, the heart device company that is a leader in 3-D heart mapping systems, for co-marketing of technologies.
Dr. Moll said Hansen, based in Mountain View, Calif., should become profitable by the end of next year, two and a half years sooner than Intuitive crossed that threshold. Hansen’s volatile stock, which hit a peak of $39.32 in October before tumbling to $13.48 in March, now trades at $18.54 a share after the company reported better-than-expected first-quarter results on Thursday. Hansen sold eight new robots in the quarter, producing revenue of $6.2 million, and operating losses narrowed.
Even while juggling all of this, Dr. Moll is serving as chairman of Restoration Robotics, a start-up he has financed that aims to apply robotics to hair replacement surgeries for bald men.
Dr. Moll says robotics will ultimately advance on still other fronts, largely because it can help doctors of varying ability perform at the level of the world’s top surgeons.
“The public has no idea of the extent of difference between top surgeons and bad ones,” he said. “Robots are good at going where they are supposed to, remembering where they are and stopping when required.”
Fausto Intilla - www.oloscience.com

Nanotechnology Produces New Electronic Materials

Source:
Written by Philip Buonpastore
Friday, 02 May 2008

STONY BROOK, NY - The nanotechnology of engineering atomic-layer interfaces to produce desired properties, in this case “improper ferroelectricity”, reportedly holds promise for a technological revolution that may compare to the development of modern electronics.
According to an article in the April 10th issue of Nature, a new artificial material has been created that may mark the beginning of a revolution in the development of materials for electronic applications.
The new material, called a superlattice, has a layered structure composed of alternating, atomically thin layers of two different oxides (PbTiO3 and SrTiO3), and it takes on properties radically different from those of either material by itself. According to the article, these properties are a direct consequence of the artificially layered atomic structure and of the interactions at the atomic-level interface between the layers.
As stated in the article, ferroelectrics are useful functional materials, with applications ranging from non-volatile computer memories to micro-electromechanical machines and infrared detectors. “Improper ferroelectricity” is a property that occurs only rarely in natural materials, with effects that are typically too small to be useful. The new superlattice shows improper ferroelectricity (a property found in neither of the original oxides) at a magnitude around 100 times greater than any naturally occurring improper ferroelectric, opening the door to many more real-world applications.
According to one of the material’s researchers, Dr. Matthew Dawber, “Besides the immediate applications that could be generated by this nanomaterial, this discovery opens a completely new field of investigation and the possibility of new functional materials based on…interface engineering on the atomic scale.”
Transition metal oxides are a class of materials that attract great interest because of the wide range of functional properties they can present (dielectric, ferroelectric, piezoelectric, magnetic or superconducting) and the possibilities for integrating them into numerous devices. Most of these oxides share a similar structure (referred to as ‘perovskite’), and researchers have recently developed the ability to build these materials atomic layer by atomic layer in an attempt to produce new materials with exceptional properties.

Ready for the Robot Revolution


Source:
By Pam Baker TechNewsWorld
Part of the ECT News Network 05/03/08 4:00 AM PT

Compared with the agile, intelligent robots envisioned in science fiction, today's real-life robots may seem relatively unimpressive. But advances in robotics are indeed being made, and the results don't necessarily manifest themselves in humanoid automatons that can dance and shake hands. Often, the technology finds other practical applications.
Despite some impressive showings in robotics lately, the accolades are slow to come from industry outsiders. We, the general public, watch Honda's Asimo slowly make its way down a few steps, for example, and unfairly compare it to the glib and golden C-3PO of science fiction, and thus blind ourselves to the miracle before us.
But it's not just Asimo that suffers from this prejudice; it's all of them walking, scrambling or rolling on the planet today.
In that misguided and erroneous judgment of all real robots, there is "Danger, Will Robinson," as the Model B-9 Environmental Control Robot in "Lost in Space" would say. For ignorance and diminished enthusiasm lead to lack of funding, derailed projects, slowed progress, and fewer minds focused on bringing sci-fi technology to life.
Fortunately, the industry has not been deterred. Once the blinders come off, you see robotics at work everywhere.
Don't Miss the Revolution

"If you are looking for robots, you might miss the robotic revolution," Matt Mason, director of the Robotics Institute at Carnegie Mellon University, told TechNewsWorld.
He says a robot is a bunch of technologies all packed together in one human- or animal-sized bundle. But in most cases there is no reason to cram the technologies into such a tight bundle. So even though you don't see it, robotics technologies are having a significant impact already, and that impact is growing.
For example, computer vision technology is being deployed in cameras for improved auto-focus and red-eye correction. It's also deployed on the Web to assist in image database searches. Robotics technologies are used in computer games, to produce computer-generated animation for movies, and even to provide real-time enhancement of televised sporting events.

Serendipitous Design

Where did these robotic enhancements come from? "These are useful by-products we discover along the way," Oussama Khatib, professor of Computer Science at Stanford University, told TechNewsWorld. Khatib heads a research group at the Stanford Artificial Intelligence Laboratory and is part of Honda's Humanoid Robot Project. He is also the father of the famous earlier robots: Romeo and Juliet.
"We are making progress on building a humanoid robot, but we don't need to completely build one before we can apply what we have learned to other disciplines," explained Khatib.
From enhanced medical tools and procedures to energy-saving appliances and smart security systems, robotics is pushing advancements overall at a speedier pace.
"There is still a long way to go before we see robots that can perform at a level that Hollywood presents, but things are moving ahead enough that there is value emerging," Tandy Trower, general manager of Microsoft (Nasdaq: MSFT) Robotics Group, told TechNewsWorld.
"Techniques like vision recognition that 10 years ago were black arts are popping up all over the place. What would likely cost you thousands of dollars in the past might even be free today," he added.

Better Living Through Robots

Beyond the invisible but useful robotics that power much of our mundane world, the quest for a perfect robot to aid in an uncertain future is under way in earnest.
In Japan, for example, a think tank is predicting that robots will start to fill the jobs of humans as the population there ages and shrinks. The U.S. is facing a similar population problem, as are a number of other well-developed countries.
"In the U.S. alone, 40 million people are over the age of 65, and over 90 percent of them wish to remain living independently. However, we all know that as we age, our physical and cognitive capabilities tend to decline. Robots could be an important way to deal with this by acting as active agents that help make up for any diminished abilities we have," said Trower.
When industry insiders speak of robots filling jobs, they are not referring to the industrial robots we have now that deliver repetitive and automated services. They are referring to robots that can replace humans in far more complex tasks.
"Robots can remind us to take our medication. Robots can carry things for us. Robots can keep us better connected with our families and caregivers. Robots can even entertain us," said Trower. "And the cost of the robots to do this will be far lower than that of human-assisted care."
The Robots of Japan
Japan in particular has a reason to rush robot development, which is why we see so many efforts coming from there. "With declines in birth rates, overall the average age is increasing at an unprecedented rate, and it is proceeding most rapidly in Japan," said Trower.
"There is great concern about how to deal with the loss out of the workforce as well as how to care for an ever increasing population of senior citizens. Robots are seen as a way to bridge the gap," he explained.
The elderly stand to gain a great deal in quality of life issues. "Intelligent homes can provide support to the elderly, allowing us to lead independent lives much longer than we do now," explained Mason.
That support can range from carrying the groceries and putting them away to cleaning the house, aiding in dressing, providing immediate CPR and summoning medical assistance if needed. The possibilities are endless, making the robotic creations in the movie "I, Robot" a lot less far-fetched.

Humanoid, Insectoid, Robotoid

However, the ultimate design of these synthetic helpers is still a bit up for grabs.
Many have noted that the Japanese tend to design robots that look human while Westerners tend to build robots that look more like insects or animals. What's up with that?
"This may be partly cultural. The Japanese have always considered robots as positive human assistants. Astro Boy has been one of the earliest, most popular characterizations," explained Trower. "Perhaps it also has something to do with Asian culture being more attuned with nature, whereas in Western cultures we have been dominated by the scientific method, which tends to separate the rational from the emotional or social aspects of perception."
Others in the industry think the East vs. West perception of robots is totally off the mark from the start.
"Most of our robots are neither humanoid nor insectoid," laughed Mason. "In fact, they don't look like robots at all. Perhaps we should say they are not even robotoid."
He cites, for example, Boss, the Carnegie Mellon robot that recently won the Urban Challenge and looks like a car. "Probably when we have developed homes with embedded robotic technology, they will still look like homes," he said.
Challenges Ahead
The challenges ahead are difficult, but multi-disciplined teams are chiseling away at them every day.
"The current robots are platforms to study, but they are not yet safe for human interaction," explained Khatib. "We are also making progress in sensing and perceiving the environment, but it is difficult. The environment in industry is structured, but the environment around humans is messy. There is still a ways to go."
Khatib said the major challenges are in integration, decreasing the robot's weight, making a more sensitive skin for better environment perception, and in solving the human safety issue.
"A major stumbling block is to build systems that are safe and soft enough to interact with people, and also cheap enough for people to afford," agreed Mason.
In the end, the perfect robot will become commonplace and as unappreciated as the desktop PC.
"Robots are a natural evolution of PC technology, just enabled to interact and support us in a greater diversity of ways," said Trower. "They will help us live safer and more comfortable lives and will come in many forms, from smarter cars to smarter appliances.
"We may not even think of all of them as robots at all," he said. At least that much will stay the same.

Sunday, April 27, 2008

Next Step In Robot Development Is Child's Play


Source:
ScienceDaily (Apr. 26, 2008) — Teaching robots to understand enough about the real world to allow them to act independently has proved to be much more difficult than first thought.
The team behind the iCub robot believes it, like children, will learn best from its own experiences.
The technologies developed on the iCub platform – such as grasping, locomotion, interaction, and even language-action association – are of great relevance to further advances in the field of industrial service robotics.
The EU-funded RobotCub project, which designed the iCub, will send one each to six European research labs. Each of the labs proposed winning projects to help train the robots to learn about their surroundings – just as a child would.
The six projects include one from Imperial College London that will explore how ‘mirror neurons’ found in the human brain can be translated into a digital application. ‘Mirror neurons’, discovered in the early 1990s, trigger memories of previous experiences when humans are trying to understand the physical actions of others. A separate team at UPF Barcelona will also work on iCub’s ‘cognitive architecture’.
At the same time, a team headquartered at UPMC in Paris will explore the dynamics needed to achieve full body control for iCub. Meanwhile, researchers at TUM Munich will work on the development of iCub’s manipulation skills. A project team from the University of Lyons will explore internal simulation techniques – something our brains do when planning actions or trying to understand the actions of others.
Over in Turkey, a team based at METU in Ankara will focus almost exclusively on language acquisition and the iCub’s ability to link objects with verbal utterances.
“The six winners had to show they could really use and maintain the robot, and secondly the project had to exploit the capabilities of the robot,” says Giorgio Metta. “Looking at the proposals from the winners, it was clear that if we gave them a robot we would get something in return.”
The iCub robots are about the size of three-year-old children, with highly dexterous hands and fully articulated heads and eyes. They have hearing and touch capabilities and are designed to be able to crawl on all fours and to sit up.
Humans develop their abilities to understand and interact with the world around them through their experiences. As small children, we learn by doing and we understand the actions of others by comparing their actions to our previous experience.
The developers of iCub want to develop their robots’ cognitive capabilities by mimicking that process. Researchers from the EU-funded RobotCub project designed the iCub’s hardware and software using a modular system. The design increases the efficiency of the robot and also allows researchers to update individual components more easily. The modular design also allows large numbers of researchers to work independently on separate aspects of the robot.
iCub’s source code, along with its technical drawings, is free to anyone who wishes to download and use it.
“We really like the idea of being open as it is a way to build a community of many people working towards a common objective,” says Giorgio Metta, one of the developers of iCub. “We need a critical mass working on these types of problems. If you get 50 researchers, they can really layer knowledge and build a more complex system. Joining forces really makes economic sense for the European Commission that is funding these projects and it makes scientific sense.”
Built-in learning skills
While the iCub’s hardware and mechanical parts are not expected to change much over the next 18 months, researchers expect to develop the software further. To enable iCub to learn by doing, the RobotCub research team is trying to pre-fit it with certain innate skills.
These include the ability to track objects visually or by sound – with some element of prediction of where the tracked object will move next. iCub should also be able to navigate based on landmarks and a sense of its own position.
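That kind of tracking-with-prediction can be illustrated with a minimal constant-velocity sketch. This is purely an illustration of the general idea, not the iCub's actual software; the function name and the sample data are invented for the example:

```python
def predict_next(positions):
    """Predict the next 2-D position of a tracked object by
    assuming it keeps moving at its most recent velocity."""
    (x1, y1), (x2, y2) = positions[-2], positions[-1]
    vx, vy = x2 - x1, y2 - y1      # velocity per time step
    return (x2 + vx, y2 + vy)      # constant-velocity guess

# An object observed moving right and slightly upward:
observed = [(0, 0), (1, 0.5), (2, 1.0)]
print(predict_next(observed))      # -> (3, 1.5)
```

Real trackers refine this basic idea with filtering (for example, a Kalman filter) to cope with noisy observations, but the principle of extrapolating from recent motion is the same.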
But the first and key skill iCub needs for learning by doing is the ability to reach towards a fixed point. By October this year, the iCub developers plan to enable the robot to analyse the information it receives via its vision and touch ‘senses’. The robot will then be able to use this information to perform at least some crude grasping behaviour – reaching outwards and closing its fingers around an object.
“Grasping is the first step in developing cognition as it is required to learn how to use tools and to understand that if you interact with an object it has consequences,” says Giorgio Metta. “From there the robot can develop more complex behaviours as it learns that particular objects are best manipulated in certain ways.”
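The "reach outwards, then close the fingers" behaviour described above can be caricatured in a few lines of Python. This is an illustrative sketch of a simple proportional reach loop, not RobotCub code; every name, step size and threshold here is invented:

```python
def reach_and_grasp(hand_pos, target_pos, step=0.2, tol=0.05):
    """Toy reach-and-grasp loop: move the hand a fixed fraction of
    the remaining distance toward the target on each control cycle,
    then close the fingers once the hand is close enough."""
    x, y, z = hand_pos
    tx, ty, tz = target_pos
    for _ in range(100):                      # bounded number of cycles
        dx, dy, dz = tx - x, ty - y, tz - z
        dist = (dx * dx + dy * dy + dz * dz) ** 0.5
        if dist < tol:                        # within grasping range
            return "fingers closed", (x, y, z)
        x += step * dx                        # proportional step toward target
        y += step * dy
        z += step * dz
    return "target not reached", (x, y, z)

state, final = reach_and_grasp((0, 0, 0), (0.5, 0.2, 0.1))
print(state)   # -> fingers closed
```

A real robot closes this loop through joint-angle control and updates the target from live vision, but the structure (estimate the error, move to shrink it, trigger the grasp near zero) is the essence of the behaviour Metta describes.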
Once the assembly of the six robots for the research projects is completed, the developers plan to build more iCubs, bringing the total in use around Europe to between 15 and 20.
Adapted from materials provided by ICT Results.

New Prosthetic Hand Has Grip Function Almost Like A Natural Hand: Each Finger Moves Separately


Source:
ScienceDaily (Apr. 25, 2008) — It can hold a credit card, use a keyboard with the index finger, and lift a bag weighing up to 20 kg – the world’s first commercially available prosthetic hand that can move each finger separately and has an astounding range of grip configurations. For the first time worldwide, a patient at the Orthopedic University Hospital in Heidelberg has tested the “i-LIMB” hand in comparison with another innovative prosthesis, the so-called “Fluidhand”. Eighteen-year-old Sören Wolf, who was born with only one hand, is enthusiastic about its capabilities.
The new prosthetic hand developed and distributed by the Scottish company “Touch Bionics” certainly has advantages over previous models. For example, a comparable standard product from another manufacturer allows only a pinch grip using thumb, index, and middle finger, and not a grip using all five fingers. This does not allow a full-wrap grip of an object.
Myoelectric signals from the stump of the arm control the prosthesis
Complex electronics and five motors contained in the fingers enable every digit of the i-LIMB to be powered individually. A passive positioning of the thumb enables various grip configurations to be activated. The myoelectric signals from the stump control the prosthetic hand; muscle signals are picked up by electrodes on the skin and transferred to the control electronics in the prosthetic hand. Batteries provide the necessary power.
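The signal path described above (electrodes pick up muscle activity, control electronics map it to a hand command) can be caricatured with a simple threshold rule. This is a sketch of the general myoelectric principle with invented numbers, not the i-LIMB's actual control firmware:

```python
def classify_emg(samples, threshold=0.6):
    """Very simplified myoelectric control: rectify and average the
    raw electrode samples, then map the resulting amplitude to a
    hand command. Real prostheses use far more elaborate filtering
    and pattern recognition."""
    amplitude = sum(abs(s) for s in samples) / len(samples)
    return "close hand" if amplitude >= threshold else "open hand"

strong_contraction = [0.9, -0.8, 1.1, -0.7]   # large muscle activity
at_rest = [0.1, -0.05, 0.08, -0.1]             # little muscle activity
print(classify_emg(strong_contraction))        # -> close hand
print(classify_emg(at_rest))                   # -> open hand
```

Multi-finger hands like the i-LIMB go well beyond a single threshold, but the core loop (measure the stump's muscle signals, extract an amplitude or pattern, drive the motors accordingly) is the same.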
The “Fluidhand” from Karlsruhe, thus far developed only as a prototype that is also being tested in the Orthopedic University Hospital in Heidelberg, is based on a somewhat different principle. Unlike its predecessors, the new hand can close around objects, even those with irregular surfaces. A large contact surface and soft, passive form elements greatly reduce the gripping power required to hold onto such an object. The hand also feels softer, more elastic, and more natural than conventional hard prosthetic devices.
"Fluidhand" prosthetic device offers better finishing and better grip function
The flexible drives are located directly in the movable finger joints and operate on the biological principle of the spider leg – to flex the joints, elastic chambers are pumped up by miniature hydraulics. In this way, index finger, middle finger and thumb can be moved independently. The prosthetic hand gives the stump feedback, enabling the amputee to sense the strength of the grip.
Thus far, Sören has been the only patient in Heidelberg who has tested both models. “This experience is very important for us,” says Simon Steffen, Director of the Department of Upper Extremities at the Orthopedic University Hospital in Heidelberg. The two new models were the best of those tested, with a slight advantage for Fluidhand because of its better finishing, the programmed grip configurations, power feedback, and the more easily adjustable controls. However, this prosthetic device is not in serial production. “First the developers have to find a company to produce it,” says Alfons Fuchs, Director of Orthopedics Engineering at the Orthopedic University Hospital in Heidelberg, as the costs of manufacturing it are comparatively high. However, it is possible to produce an individual model. Thus far, only one patient in the world has received a Fluidhand for every-day use. A second patient will soon be fitted with this innovative prosthesis in Heidelberg.
Adapted from materials provided by University Hospital Heidelberg.