Electronics & Robotics, News & Press - A Blog by F.Intilla (WWW.OLOSCIENCE.COM)

Finding Explosives With Laser Beams.<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhVl4Jis34r1Z-4kHrnCLAqBX32tLX8ToQ9sZo7vmfKUB66UXjCp13SBXivA-2_V_fktPlAO4LLuTC67p5ajKAOYHuvlx0wMahhdfNotA3hpa0A0Awdm6pRYqIfPq48mWu1adjSg8ke_x71/s1600/lser.jpg" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhVl4Jis34r1Z-4kHrnCLAqBX32tLX8ToQ9sZo7vmfKUB66UXjCp13SBXivA-2_V_fktPlAO4LLuTC67p5ajKAOYHuvlx0wMahhdfNotA3hpa0A0Awdm6pRYqIfPq48mWu1adjSg8ke_x71/s1600/lser.jpg" /></a></div>
<div style="text-align: center;">
<em><span style="font-size: xx-small;">The Raman spectroscope emits laser light, which is scattered at the sample and then collected by the telescope (left). (Credit: Vienna University of Technology)</span></em></div>
<div style="text-align: center;">
Source: <a href="http://www.sciencedaily.com/releases/2012/02/120227093952.htm"><span style="color: yellow;">Science Daily</span></a></div>
<div style="text-align: center;">
---------------------------------</div>
<div style="text-align: left;">
<span class="date">ScienceDaily (Feb. 27, 2012)</span> — People like to keep a safe distance from explosive substances, but in order to analyze them, close contact is usually inevitable. At the Vienna University of Technology, a new method has now been developed to detect chemicals inside a container over a distance of more than a hundred meters. Laser light is scattered in a very specific way by different substances. Using this light, the contents of a nontransparent container can be analyzed without opening it.</div>
<strong>Scattered Light as a "Chemical Fingerprint":</strong><br />
"The method we are using is Raman-spectroscopy," says Professor Bernhard Lendl (TU Vienna). The sample is irradiated with a laser beam. When the light is scattered by the molecules of the sample, it can change its energy. For example, the photons can transfer energy to the molecules by exciting molecular vibrations. This changes the wavelength of the light -- and thus its colour. Analyzing the colour spectrum of the scattered light, scientists can determine by what kind of molecules it must have been scattered.<br />
<strong>Measuring over Great Distances -- with Highest Precision:</strong><br />
"Until now, the sample had to be placed very close to the laser and the light detector for this kind of Raman-spectroscopy," says Bernard Zachhuber. Due to his technological advancements, measurements can now be made over long distances. "Among hundreds of millions of photons, only a few trigger a Raman-scattering process in the sample," says Bernhard Zachhuber. These scattered particles of light are scattered uniformly in all directions. Only a tiny fraction travel back to the light detector. From this very weak signal, as much information as possible has to be extracted. This can be done using a highly efficient telescope and extremely sensitive light detectors.<br />
In this project (funded by the EU), the researchers at TU Vienna collaborated with private companies and with partners in public safety, including the Spanish Guardia Civil, which is extremely interested in the new technology. The Austrian military was also involved: on its testing grounds, the researchers from TU Vienna could put their method to the test with frequently used explosives such as TNT, ANFO and RDX. The tests were highly successful: "Even at a distance of more than a hundred meters, the substances could be detected reliably," says Engelene Chrysostom (TU Vienna).<br />
<strong>Seeing Through Walls:</strong><br />
Raman spectroscopy over long distances even works if the sample is hidden in a nontransparent container. The laser beam is scattered by the container wall, but a small portion of the beam penetrates the box. There, in the sample, it can still excite Raman-scattering processes. "The challenge is to distinguish the container's light signal from the sample signal," says Bernhard Lendl. This can be done using a simple geometric trick: The laser beam hits the container on a small, well-defined spot. Therefore, the light signal emitted by the container stems from a very small region. The light which enters the container, on the other hand, is scattered into a much larger region. If the detector telescope is not exactly aimed at the point at which the laser hits the container but at a region just a few centimeters away, the characteristic light signal of the contents can be measured instead of the signal coming from the container.<br />
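The offset trick can be illustrated with a toy one-dimensional model: the container's Raman return comes from the narrow laser spot, while light that penetrates spreads out inside the sample, so its return is much broader. All widths and amplitudes below are made-up illustrative values, not measurements from the experiment:

```python
# Toy model of the offset-collection geometry described above.
# Container signal: narrow peak at the laser spot.
# Sample signal: broad distribution from light diffusing inside the box.
# sigma/amplitude values are invented for illustration.
import math

def gaussian(x: float, sigma: float, amplitude: float) -> float:
    return amplitude * math.exp(-x**2 / (2 * sigma**2))

def signals(offset_cm: float) -> tuple:
    container = gaussian(offset_cm, sigma=0.2, amplitude=100.0)  # narrow, bright
    sample    = gaussian(offset_cm, sigma=3.0, amplitude=5.0)    # broad, weak
    return container, sample

for offset in (0.0, 2.0):
    c, s = signals(offset)
    print(f"offset {offset} cm: container {c:.2f}, sample {s:.2f}")
```

At the laser spot the container signal swamps the sample's; a few centimetres away the container term has collapsed while the broad sample term barely changes, which is exactly why aiming the telescope slightly off the spot isolates the contents.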
The new method could make security checks at the airport a lot easier -- but the area of application is much wider. The method could be used wherever it is hard to get close to the subject of investigation. It could be just as useful for studying icebergs as for geological analysis on a Mars mission. In the chemical industry, a broad range of possible applications could be opened up.

'Ultrawideband' Could Be Future of Medical Monitoring<div align="center"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgBKKUoV8RV4DgLWx1RkgFa5w_wEJtfPxcd38T232upYX-nFOLp2vzJrZvoSef5f1pZws8V4NeHUJOPBzurAPKN8rjhaX1QSDUvM2NDrD6EilUY9zBOJHKt5GzPZIUhHrZbpNBImMyNwRVM/s1600/110616193735.jpg"><img style="TEXT-ALIGN: center; MARGIN: 0px auto 10px; WIDTH: 233px; DISPLAY: block; HEIGHT: 320px; CURSOR: hand" id="BLOGGER_PHOTO_ID_5619573835528972930" border="0" alt="" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgBKKUoV8RV4DgLWx1RkgFa5w_wEJtfPxcd38T232upYX-nFOLp2vzJrZvoSef5f1pZws8V4NeHUJOPBzurAPKN8rjhaX1QSDUvM2NDrD6EilUY9zBOJHKt5GzPZIUhHrZbpNBImMyNwRVM/s320/110616193735.jpg" /></a><strong> Source: </strong><a href="http://www.sciencedaily.com/releases/2011/06/110616193735.htm"><strong><span style="color:#ffff66;">ScienceDaily</span></strong></a></div><br /><div align="left"><strong>ScienceDaily (June 16, 2011) — New research by electrical engineers at Oregon State University has confirmed that an electronic technology called "ultrawideband" could hold part of the solution to an ambitious goal in the future of medicine -- health monitoring with sophisticated "body-area networks." 
Such networks would offer continuous, real-time health diagnosis, experts say, to reduce the onset of degenerative diseases, save lives and cut health care costs.<br />Some remote health monitoring is already available, but the perfection of such systems is still elusive.<br />The ideal device would be very small, worn on the body and perhaps draw its energy from something as minor as body heat. But it would be able to transmit vast amounts of health information in real time, greatly improve medical care, reduce costs and help to prevent or treat disease.<br />Sounds great in theory, but it's not easy. If it were, the X Prize Foundation wouldn't be trying to develop a Tricorder X Prize -- inspired by the remarkable instrument of Star Trek fame -- that would give $10 million to whoever can create a mobile wireless sensor that would give billions of people around the world better access to low-cost, reliable medical monitoring and diagnostics.<br />The new findings at OSU are a step towards that goal.<br />"This type of sensing would scale a monitor down to something about the size of a bandage that you could wear around with you," said Patrick Chiang, an expert in wireless medical electronics and assistant professor in the OSU School of Electrical Engineering and Computer Science.<br />"The sensor might provide and transmit data on some important things, like heart health, bone density, blood pressure or insulin status," Chiang said. "Ideally, you could not only monitor health issues but also help prevent problems before they happen. Maybe detect arrhythmias, for instance, and anticipate heart attacks. 
And it needs to be non-invasive, cheap and able to provide huge amounts of data."<br />Several startup companies such as Corventis and iRhythm have already entered the cardiac monitoring market.<br />According to the new analysis by OSU researchers, which was published in the EURASIP Journal on Wireless Communications and Networking, one of the key obstacles is the need to transmit large amounts of data while consuming very little energy.<br />They determined that a type of technology called "ultrawideband" might have that capability if the receiver getting the data were within a "line of sight," and not interrupted by passing through a human body. But even non-line-of-sight transmission might be possible using ultrawideband if lower transmission rates were acceptable, they found. Collaborating on the research was Huaping Liu, an associate professor in the School of Electrical Engineering and Computer Science.<br />"The challenges are quite complex, but the potential benefit is huge, and of increasing importance with an aging population," Chiang said. "This is definitely possible. I could see some of the first systems being commercialized within five years." 
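As a rough illustration of why line of sight matters for such a low-power link, here is a standard free-space path-loss sketch in Python. The frequency, distance, and the very idea of applying the plain Friis formula to a body-area link are simplifying assumptions for illustration, not figures from the OSU analysis:

```python
# Free-space path loss (Friis): FSPL[dB] = 20*log10(4*pi*d*f / c).
# 6.85 GHz is roughly the midpoint of the 3.1-10.6 GHz UWB band;
# the 2 m distance is an assumed body-area-network scale. Real
# through-body links lose far more than this free-space figure,
# which is why non-line-of-sight UWB needs lower data rates.
import math

def fspl_db(distance_m: float, freq_hz: float) -> float:
    """Free-space path loss in dB for a line-of-sight link."""
    c = 3e8  # speed of light, m/s
    return 20 * math.log10(4 * math.pi * distance_m * freq_hz / c)

loss = fspl_db(2.0, 6.85e9)
print(f"path loss over 2 m at 6.85 GHz: {loss:.1f} dB")
```

Even this best-case line-of-sight loss has to be covered by a transmitter budget of microwatts, which is the core tension the OSU analysis examines.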
Story Source:<br />The above story is reprinted (with editorial adaptations by ScienceDaily staff) from materials provided by </strong><a class="blue" href="http://www.orst.edu/" rel="nofollow" target="_blank"><strong>Oregon State University</strong></a><strong>.<br /></strong></div><br /><div align="center"></div>

Team Reports Scalable Fabrication of Self-Aligned Graphene Transistors, Circuits<div align="center"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjyOjE5ufYFNkqdUSkUciiyNn67LP8_WIDhJ5Ko8DY2FLiXsBbbiXLm4x6gIEu1zpcqgPXD_ud_xk37mkviaPJ_uLpI9WWVmDWqwjm1D8TKRN2dhvMpyeQhfN1WAxAinS6MGlWClPTQeiRG/s1600/110617110710.jpg"><img style="TEXT-ALIGN: center; MARGIN: 0px auto 10px; WIDTH: 300px; DISPLAY: block; HEIGHT: 235px; CURSOR: hand" id="BLOGGER_PHOTO_ID_5619572892843095074" border="0" alt="" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjyOjE5ufYFNkqdUSkUciiyNn67LP8_WIDhJ5Ko8DY2FLiXsBbbiXLm4x6gIEu1zpcqgPXD_ud_xk37mkviaPJ_uLpI9WWVmDWqwjm1D8TKRN2dhvMpyeQhfN1WAxAinS6MGlWClPTQeiRG/s320/110617110710.jpg" /></a> <strong>Source: </strong><a href="http://www.sciencedaily.com/releases/2011/06/110617110710.htm"><strong><span style="color:#ffff66;">ScienceDaily</span></strong></a></div><br /><div align="left"><strong>ScienceDaily (June 16, 2011) — Graphene, a one-atom-thick layer of graphitic carbon, has the potential to make consumer electronic devices faster and smaller. But its unique properties, and the shrinking scale of electronics, also make graphene difficult to fabricate and to produce on a large scale. In September 2010, a UCLA research team reported that they had overcome some of these difficulties and were able to fabricate graphene transistors with unparalleled speed. 
These transistors used a nanowire as the self-aligned gate -- the element that switches the transistor between various states. But the scalability of this approach remained an open question.<br />Now the researchers, using equipment from the Nanoelectronics Research Facility and the Center for High Frequency Electronics at UCLA, report that they have developed a scalable approach to fabricating these high-speed graphene transistors.<br />The team used a dielectrophoresis assembly approach to precisely place nanowire gate arrays on large-area chemical vapor deposition-grown graphene -- as opposed to mechanically peeled graphene flakes -- to enable the rational fabrication of high-speed transistor arrays. They were able to do this on a glass substrate, minimizing parasitic delay and enabling graphene transistors with extrinsic cut-off frequencies exceeding 50 GHz. Typical high-speed graphene transistors are fabricated on silicon or semi-insulating silicon carbide substrates that tend to bleed off electric charge, leading to extrinsic cut-off frequencies of around 10 GHz or less.<br />Taking an additional step, the UCLA team was able to use these graphene transistors to construct radio-frequency circuits functioning up to 10 GHz, a substantial improvement from previous reports of 20 MHz.<br />The research opens a rational pathway to scalable fabrication of high-speed, self-aligned graphene transistors and functional circuits, and it demonstrates for the first time a graphene transistor with a practical (extrinsic) cutoff frequency beyond 50 GHz.<br />This represents a significant advance toward graphene-based, radio-frequency circuits that could be used in a variety of devices, including radios, computers and mobile phones. 
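The role of parasitic capacitance can be illustrated with the textbook cut-off frequency relation f_T = g_m / (2&#960;C_g): the same transistor looks much slower when the substrate adds capacitive load. The transconductance and capacitance values below are invented for illustration and are not UCLA's measurements:

```python
# Back-of-the-envelope sketch of cut-off frequency f_T = gm / (2*pi*Cg),
# showing why reducing parasitic capacitance (e.g. a glass substrate
# instead of a lossy one) raises f_T. gm and Cg values are assumed
# illustrative numbers, not the devices reported in the article.
import math

def cutoff_ghz(gm_ms: float, cg_ff: float) -> float:
    """f_T in GHz for transconductance gm (in mS) and total gate capacitance (in fF)."""
    return gm_ms * 1e-3 / (2 * math.pi * cg_ff * 1e-15) / 1e9

print(round(cutoff_ghz(1.0, 3.0), 1))   # low-parasitic device
print(round(cutoff_ghz(1.0, 15.0), 1))  # same gm, 5x the capacitive load
```

With identical transconductance, quintupling the capacitive load divides f_T by five, which mirrors the >50 GHz vs. ~10 GHz contrast the article describes.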
The technology might also be used in wireless communication, imaging and radar technologies.<br />The UCLA research team included Xiangfeng Duan, professor of chemistry and biochemistry; Yu Huang, assistant professor of materials science and engineering at the Henry Samueli School of Engineering and Applied Science; Lei Liao; Jingwei Bai; Rui Cheng; Hailong Zhou; Lixin Liu; and Yuan Liu.<br />Duan and Huang are also researchers at the California NanoSystems Institute at UCLA.<br />The work was funded by grants from the National Science Foundation and the National Institutes of Health.<br />The research was recently published in the peer-reviewed journal Nano Letters. Story Source:<br />The above story is reprinted (with editorial adaptations by ScienceDaily staff) from materials provided by </strong><a class="blue" href="http://www.ucla.edu/" rel="nofollow" target="_blank"><strong>University of California - Los Angeles</strong></a><strong>. The original article was written by Mike Rodewald.<br /></strong></div><br /><div align="center"></div>

A quick way to learn how solar power works.<div align="center"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEj0pfX6oS8V7Wj-zDh8Ky46OVaAAF3vGqfpDpq5_aOcUAfNUHoERkEjipp8zuja50I9fyovz9R9-Mp-0YdHwNS42w-8AHOm5V20K4hitAnjeWp2h09Dh5ssiahieVn8KTu2e93-JcfxC7M/s1600/imagesCAEZ1YOY.jpg"><img style="TEXT-ALIGN: center; MARGIN: 0px auto 10px; WIDTH: 224px; DISPLAY: block; HEIGHT: 150px; CURSOR: hand" id="BLOGGER_PHOTO_ID_5483647405794247330" border="0" alt="" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEj0pfX6oS8V7Wj-zDh8Ky46OVaAAF3vGqfpDpq5_aOcUAfNUHoERkEjipp8zuja50I9fyovz9R9-Mp-0YdHwNS42w-8AHOm5V20K4hitAnjeWp2h09Dh5ssiahieVn8KTu2e93-JcfxC7M/s320/imagesCAEZ1YOY.jpg" /></a> <a 
href="http://www.12voltsolarpanels.net/Quick_Way_to_Learn_How_Solar_Power_Works_066.DOC"><strong><span style="color:#ffff66;">Source</span></strong></a></div><div align="center"><strong>------------------</strong></div><div align="center"><strong></strong></div><div align="left"><strong>What exactly is solar energy ?<br />Solar energy is radiant energy which is produced by the sun. Every day the sun radiates, or sends out, an enormous volume of energy. The sun radiates more energy in a single second than people have used since the beginning of time!<br />The energy of the Sun comes from within the sun itself. Like other stars, the sun is really a big ball of gases––mostly hydrogen and helium atoms.<br />The hydrogen atoms in the sun’s core combine to form helium and generate energy in a process called nuclear fusion.<br /><br />During nuclear fusion, the sun’s extremely high pressure and temperature cause hydrogen atoms to come apart and their nuclei (the central cores of the atoms) to fuse or combine. Four hydrogen nuclei fuse to become one helium atom. However the helium atom contains less mass compared to four hydrogen atoms that fused. Some matter is lost during nuclear fusion. The lost matter is emitted into space as radiant energy.<br />It takes millions of years for the energy in the sun’s core to make its way to the solar surface, and somewhat over eight minutes to travel the 93 million miles to earth. The solar energy travels to the earth at a speed of 186,000 miles per second, the speed of light.<br />Only a small part of the energy radiated from the sun into space strikes the earth, one part in two billion. Yet this volume of energy is enormous. On a daily basis enough energy strikes the united states to supply the nation’s energy needs for one and a half years!<br /><br />Where does all this energy go? </strong></div><div align="left"><strong>About 15 percent of the sun’s energy that hits our planet is reflected back into space. 
Another 30 percent is used to evaporate water, which, lifted into the atmosphere, produces rainfall. Solar power is also absorbed by plants, the land, and the oceans. The remainder could be used to supply our energy needs.<br />Who invented solar technology?<br />Humans have harnessed solar energy for hundreds of years. As early as the 7th century B.C., people used simple magnifying glasses to concentrate the light of the sun into beams so hot they would cause wood to catch fire. Over a century ago in France, a scientist used heat from a solar collector to create steam to drive a steam engine. In the beginning of the twentieth century, scientists and engineers began researching ways to use solar technology in earnest. One important development was a remarkably efficient solar boiler invented by Charles Greeley Abbott, an American astrophysicist, in 1936.<br /><br />The solar water heater gained popularity at this time in Florida, California, and the Southwest. The industry started in the early 1920s and was in full swing just before World War II. This growth lasted until the mid-1950s, when low-cost natural gas became the primary fuel for heating American homes.<br />People and world governments remained largely indifferent to the possibilities of solar technology until the oil shortages of the 1970s. Today, people use solar technology to heat buildings and water and also to generate electricity.<br />How do we use solar power today?<br />Solar energy is employed in a variety of ways, of course. There are two standard forms of solar power:<br /><br />* Solar thermal energy collects the sun's warmth through one of two means: in water or in an anti-freeze (glycol) mixture.<br /><br />* Solar photovoltaic energy converts the sun's radiation to usable electricity.<br /><br />Listed below are the five most practical and popular ways that solar energy is used:<br /><br />1. Small portable solar photovoltaic systems. We see these used everywhere, from calculators to solar garden tools. 
Portable units can be used to power RV appliances, while single-panel systems are used for traffic signs and remote monitoring stations.<br /><br />2. Solar pool heating. Running water through a solar collector in a direct circulation system is an extremely practical way to heat water for your pool or hot spa.<br /><br />3. Thermal glycol energy to heat water. In this method (indirect circulation), glycol is heated by sunshine and the heat is then transferred to water in a hot water tank. This process of collecting the sun's energy is more practical now than ever before. In areas as far north as Edmonton, Alberta, solar thermal water heating is economically sound. It can pay for itself in three years or less. </strong></div><div align="left"><strong><br />4. Integrating solar photovoltaic energy into your home or office power. In most parts of the planet, solar photovoltaics is an economically feasible way to supplement the power of your home. In Japan, photovoltaics are competitive with other forms of power. In the USA, new incentive programs make this form of solar technology ever more viable in many states. An increasingly popular and practical way of integrating solar energy into the power of your home or business is through the use of building-integrated solar photovoltaics.<br /><br />5. Large independent photovoltaic systems. If you have enough sun power at your site, you may be able to go off grid. It is also possible to integrate or hybridize your solar energy system with wind power or other forms of renewable energy to stay 'off the grid.'<br /><br />How do photovoltaic panels work?<br />Silicon is mounted beneath non-reflective glass to create photovoltaic panels. These panels collect photons from the sun, converting them into DC electrical energy. The energy created then flows into an inverter. 
The inverter converts that DC power into standard-voltage AC electrical power.<br />Photovoltaic cells are made with special materials called semiconductors, such as silicon, which is presently the most widely used. When light hits the photovoltaic cell, a certain share of it is absorbed in the semiconductor material. This means that the energy of the absorbed light is transferred to the semiconductor.<br /><br />This energy knocks electrons loose, allowing them to flow freely. Photovoltaic cells also have one or more electric fields that force the electrons freed by light absorption to flow in a specific direction. This flow of electrons is a current, and by placing metal contacts on the top and bottom of the photovoltaic cell, the current can be drawn off for external use.<br />What are the benefits and drawbacks of solar power?<br /><br />Solar Pro Arguments:<br /><br />- Heating our homes with oil or natural gas, or using electricity from power plants running on coal and oil, contributes to climate change and climate disruption. Solar power, on the other hand, is clean and environmentally friendly.<br /><br />- Solar hot-water heaters require little maintenance, and their initial investment can be recovered within a relatively short time.<br /><br />- Solar hot-water heaters can work in almost any climate, even very cold ones. You just need to choose the right system for your climate: drainback, thermosyphon, batch-ICS, etc.<br /><br />- Maintenance costs of solar-powered systems are minimal, and the warranties are long.<br /><br />- Financial incentives (USA, Canada, European states…) can reduce the cost of the initial investment in solar technologies. The U.S. 
government, for example, offers tax credits for solar systems certified by the SRCC (Solar Rating and Certification Corporation), which amount to 30 percent of the investment (2009-2016 period).<br /><br />Solar Cons Arguments:<br /><br />- The initial investment in solar hot-water heaters or in solar PV electric systems is greater than that required by conventional electric and gas heating systems.<br /><br />- The payback period of solar PV electric systems is long, as is that of solar space heating and solar cooling (only the payback of solar water heating is short or relatively short).<br /><br />- Solar water heating does not work directly in conjunction with radiators (including baseboard ones).<br /><br />- Solar space heating and solar cooling systems are costly and rather untested technologies: solar air conditioning is not, so far, a really economical option.<br /><br />- The efficiency of solar-powered systems depends heavily on available sunlight. In colder climates, where heating and electricity needs are higher, the efficiency is lower.<br /><br />About me - Barbara Young writes on </strong><a href="http://www.12voltsolarpanels.net/rv-solar-panels-101-ultimate-guide-12-volt-battery-charging"><strong>motorhome solar power</strong></a><strong> in her personal hobby blog 12voltsolarpanels.net. Her work is centered on helping people save energy using solar power to reduce CO2 emissions and energy dependency. 
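The photovoltaic conversion chain described in this post (sunlight to DC in the panel, then DC to AC in the inverter) can be sketched numerically. The panel area, cell efficiency and inverter efficiency below are typical illustrative values, not recommendations or measured data:

```python
# Minimal PV sizing sketch: DC output = irradiance * area * cell efficiency,
# then the inverter converts DC to AC with some loss. All numbers are
# assumed illustrative values (a ~1.6 m^2 panel with 15% cells and a 95%
# inverter, under "full sun" of 1000 W/m^2).

def pv_ac_watts(irradiance_w_m2: float, area_m2: float,
                cell_eff: float, inverter_eff: float) -> float:
    """AC power delivered by a panel + inverter under given irradiance."""
    dc_watts = irradiance_w_m2 * area_m2 * cell_eff
    return dc_watts * inverter_eff

print(round(pv_ac_watts(1000, 1.6, 0.15, 0.95)))  # watts of usable AC power
```

The same arithmetic, scaled by panel count and hours of sun, is how off-grid systems like those in point 5 are sized.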
</div></strong>

Hexapod robot moves in the right direction by controlling chaos.<p><embed src="http://c.brightcove.com/services/viewer/federated_f8/1399191810" bgcolor="#FFFFFF" flashvars="videoId=61905823001&linkBaseURL=http%3A%2F%2Fwww.scientificamerican.com%2Fblog%2Fpost.cfm%3Fid%3Dhexapod-robot-moves-in-the-right-di-2010-01-17&playerId=1399191810&viewerSecureGatewayURL=https://console.brightcove.com/services/amfgateway&servicesURL=http://services.brightcove.com/services&cdnURL=http://admin.brightcove.com&domain=embed&autoStart=false&" base="http://admin.brightcove.com" name="flashObj" width="510" height="550" seamlesstabbing="false" type="application/x-shockwave-flash" swliveconnect="true" pluginspage="http://www.macromedia.com/shockwave/download/index.cgi?P1_Prod_Version=ShockwaveFlash"></embed></p><p align="center"><strong>Source: </strong><a href="http://www.scientificamerican.com/blog/post.cfm?id=hexapod-robot-moves-in-the-right-di-2010-01-17"><strong><span style="color:#ffff66;">Scientific American</span></strong></a></p><p align="left"><strong>Given that robots generally lack muscles, they can't rely on muscle memory (the trick that allows our bodies to become familiar over time with movements such as walking or breathing) to help them more easily complete repetitive tasks. For autonomous robots, this can be a bit of a problem, since they may have to accommodate changing terrain in real time or risk getting stuck or losing their balance.</strong><strong><br /></strong><strong>One way around this is to create a robot that can process information from a variety of sensors positioned near its "legs" and identify different patterns as it moves, a team of researchers report Sunday in Nature Physics. 
(Scientific American is part of Nature Publishing Group.)</strong><strong><br /></strong><strong>Some scientists rely on small neural circuits called "central pattern generators" (CPG) to create walking robots that are aware of their surroundings. One of the challenges is that the robot typically needs a separate CPG for each leg in order to sense obstacles and take the appropriate action (such as stepping around a chair leg or over a rock).</strong><strong><br /></strong><strong>Bernstein Center for Computational Neuroscience researcher Poramate Manoonpong and Max Planck Institute for Dynamics and Self-Organization researcher Marc Timme are leading a project that has created a six-legged robot with one CPG that can switch gaits depending upon the obstacles it encounters. The robot does this by manipulating the sensor inputs into periodic patterns (rather than chaotic ones) that determine its gait. In the future, the robot will also be equipped with a memory device that will enable it to complete movements even after the sensory input ceases.</strong><strong><br /></strong><strong>© Poramate Manoonpong and Marc Timme, University of Goettingen and Max Planck Institute for Dynamics and Self-Organization</strong><br /></p>

Fleet of High-Tech Robot 'Gliders' to Explore Oceans.<div align="center"><a href="http://www.sciencedaily.com/images/2010/01/100114162345.jpg"><img style="TEXT-ALIGN: center; MARGIN: 0px auto 10px; WIDTH: 300px; DISPLAY: block; HEIGHT: 225px; CURSOR: hand" border="0" alt="" src="http://www.sciencedaily.com/images/2010/01/100114162345.jpg" /></a> <strong><em><span style="font-size:85%;">Glider under water. (Credit: Holger v. 
Neuhoff, IFM-GEOMAR) </span></em></strong></div><div align="center"><strong>Source:<span style="color:#ffff66;"> </span></strong><a href="http://www.sciencedaily.com/releases/2010/01/100114162345.htm"><strong><span style="color:#ffff66;">ScienceDaily</span></strong></a></div><div align="center"><strong>--------------------------</strong></div><div align="left"><strong>ScienceDaily (Jan. 14, 2010) — The Leibniz Institute of Marine Sciences (IFM-GEOMAR) in Kiel, Germany, recently obtained the biggest fleet of so-called gliders in Europe. These instruments can explore the oceans like sailplanes down to a depth of 1000 metres, and in doing so they consume only as much energy as a bike light. In the coming years, up to ten of these high-tech instruments will take measurements to better understand many processes in the oceans. Currently, scientists and technicians are preparing the devices for their first mission as a 'swarm' in the tropical Atlantic. </strong></div><div align="left"><strong>They may look like mini-torpedoes, yet they serve exclusively peaceful purposes. The payload of the two-metre-long yellow diving robots consists of modern electronics, sensors and high-performance batteries. With these devices the marine scientists can collect selective measurements from the ocean interior while staying ashore themselves. Moreover, the gliders not only transmit the data in real time, but they can also be reached by the scientists via satellite telephone and programmed with new mission parameters.<br />As such, the new robots represent an important supplement to previous marine sensor platforms.<br />"Ten years ago we started to explore the ocean systematically with profiling drifters. Today more than 3000 of these devices constantly provide data from the ocean interior," explains Professor Torsten Kanzow, oceanographer at IFM-GEOMAR. This highly successful programme has one major disadvantage: the pathways of the drifters cannot be controlled.<br />"The new gliders have no direct motor, either. 
But with their small wings they move forward like sailplanes under water," says Dr. Gerd Krahmann, a colleague of Professor Kanzow. In a zigzag movement, the glider cycles between a maximum depth of 1000 metres and the sea surface.<br />"By telephone we can 'talk' to the glider and upload a new course every time it comes up," explains Krahmann. A glider can carry out autonomous missions for weeks or even months. Every glider is equipped with instruments to measure temperature, salinity, oxygen and chlorophyll content as well as the turbidity of the sea water.<br />IFM-GEOMAR was the first institute in Europe to commit to the new technology. "We tested different devices and we had to learn the hard way, too," oceanographer Dr. Johannes Karstensen says. "This way we have been able to contribute to the glider development, and now we have gathered the knowledge required for successful glider operations," he adds.<br />Thanks to a special investment, IFM-GEOMAR was able to obtain six new gliders, bringing its fleet to nine altogether, the biggest of its kind in Europe. The IFM-GEOMAR gliders are manufactured by Teledyne Webb Research Inc. in the USA.<br />A very successful mission using a single glider took place between August and October 2009 in the Atlantic Ocean, south of the Cape Verde Islands. The robot autonomously carried out measurements along a track more than 1000 kilometres long before it was recovered by the German research vessel METEOR. The data collected are accessible online at </strong><a title="http://gliderweb.ifm-geomar.de/html/ifm03_depl05_frame.html" href="http://gliderweb.ifm-geomar.de/html/ifm03_depl05_frame.html" target="_blank"><strong>http://gliderweb.ifm-geomar.de/html/ifm03_depl05_frame.html</strong></a><strong>.<br />Now, for the first time, the scientists in Kiel are preparing a whole fleet of gliders for a concerted mission. 
After final tests the robots will be released in mid-March 2010 about 60 nautical miles north-east of the Cape Verde Island of Sao Vicente. For two months they will investigate physical and biogeochemical quantities of the Atlantic Ocean around the oceanographic long-term observatory TENATSO.<br />The goals of the experiment, led jointly by Prof. Torsten Kanzow, Prof. Julie LaRoche (marine biology) and Prof. Arne Körtzinger (marine chemistry), are to gain new insights into water circulation and stratification as well as their impact on chemical and biological processes. With the glider swarm the scientists can sample a complete "sea-volume" and not just a single point or a single cross-section in the ocean. The gliders will be remotely controlled from a control centre at the IFM-GEOMAR in Kiel.<br />"This technology enables us to observe the upper layers of the ocean much more effectively, and thus much less expensively, than before," says Prof. Dr. Martin Visbeck, Deputy Director of the IFM-GEOMAR and Head of the research division Ocean Circulation and Climate Dynamics. </strong></div><div align="left"><strong>Story Source:<br />Adapted from materials provided by </strong><a class="blue" href="http://www.ifm-geomar.de/" rel="nofollow" target="_blank"><strong>Leibniz Institute of Marine Sciences (IFM-GEOMAR)</strong></a><strong>, via </strong><a href="http://www.alphagalileo.org/" rel="nofollow" target="_blank"><strong>AlphaGalileo</strong></a><strong>. 
</strong></div>olosciencehttp://www.blogger.com/profile/07007258673266741468noreply@blogger.com1tag:blogger.com,1999:blog-2542563728620514982.post-80062991393838226292010-01-14T07:02:00.001-08:002010-01-14T07:04:55.289-08:00Modified Mobile Phone Runs on Coca-Cola.<div align="center"><a href="http://cdn.physorg.com/newman/gfx/news/nokiaphone1.jpg"><img style="TEXT-ALIGN: center; MARGIN: 0px auto 10px; WIDTH: 280px; DISPLAY: block; HEIGHT: 187px; CURSOR: hand" border="0" alt="" src="http://cdn.physorg.com/newman/gfx/news/nokiaphone1.jpg" /></a><strong> <em><span style="font-size:85%;">A modified Nokia cell phone that runs on Coca-Cola could run up to four times longer than a phone with a lithium ion battery. Image credit: Daizi Zheng</span></em>.</strong></div><div align="center"><strong>Source: </strong><a href="http://www.physorg.com/news182632665.html"><strong><span style="color:#ffff66;">Physorg.com</span></strong></a></div><div align="center"><strong>---------------------------</strong></div><div align="left"><strong>Daizi Zheng, a Chinese developer who is currently based in London, has modified a Nokia cell phone to run on Coca-Cola or any other sugary solution. </strong></div><div align="left"><strong>Zheng says the modified phone can run three or four times longer on a single charge than a phone using a conventional </strong><a class="textTag" href="http://www.physorg.com/tags/lithium+ion+battery/" rel="tag"><strong>lithium ion battery</strong></a><strong>, and can also be fully biodegradable.</strong></div><div align="left"><strong>As Zheng explains, a sugar-powered phone could potentially offer a much more environmentally friendly power source than lithium ion batteries. 
The new phone's bio battery, which basically acts as a </strong><a class="textTag" href="http://www.physorg.com/tags/fuel+cell/" rel="tag"><strong>fuel cell</strong></a><strong>, uses enzymes as the </strong><a class="textTag" href="http://www.physorg.com/tags/catalyst/" rel="tag"><strong>catalyst</strong></a><strong> to generate </strong><a class="textTag" href="http://www.physorg.com/tags/electricity/" rel="tag"><strong>electricity</strong></a><strong> from carbohydrates.<br />The phone can run for several hours, and produces water and carbon dioxide as the battery runs down. The phone can then be emptied out and refilled with more Coca-Cola.<br />Zheng designed the phone as a client project for Nokia, but there's no word on whether the company plans to incorporate the concept into future products.<br />"It brings a whole new perception to batteries and afternoon tea," Zheng wrote on her project's website.</strong></div><div align="left"><strong>More information: </strong><a href="http://www.daizizheng.com/projects.htm" rel="nofollow" target="_blank"><strong>www.daizizheng.com</strong></a></div>olosciencehttp://www.blogger.com/profile/07007258673266741468noreply@blogger.com0tag:blogger.com,1999:blog-2542563728620514982.post-11255412503112189742010-01-14T06:59:00.001-08:002010-01-14T07:01:45.604-08:00Samsung's new flash chips for mobile devices.<div align="center"><a href="http://cdn.physorg.com/newman/gfx/news/32GB_Samsung_microSD_CU.jpg"><img style="TEXT-ALIGN: center; MARGIN: 0px auto 10px; WIDTH: 260px; DISPLAY: block; HEIGHT: 214px; CURSOR: hand" border="0" alt="" src="http://cdn.physorg.com/newman/gfx/news/32GB_Samsung_microSD_CU.jpg" /></a> <strong><em><span style="font-size:85%;">Samsung 32GB microSD memory card</span></em></strong></div><div align="center"><strong>Source: </strong><a href="http://www.physorg.com/news182669590.html"><strong><span style="color:#ffff66;">Physorg.com</span></strong></a></div><div 
align="center"><strong>------------------------</strong></div><div align="left"><strong>Samsung Electronics has announced two new flash chip storage devices for mobiles: a removable 32-Gbyte micro SD (secure digital) card and a 64-Gbyte moviNAND flash memory module. Both are based on Samsung's own 30 nanometer class 32-Gbyte NAND flash memory chips, which use lithography technology that allows much more storage in a smaller unit. </strong></div><div align="left"><br /><strong>The removable SD flash card is only 1 mm thick and 0.7 mm high and will come into production in February. The card comprises a card controller and eight 30-micron thick stacked chips. Samsung says it is the highest capacity microSD ready for production. Users will be able to insert the 32-Gbyte micro SD card into their phone or other device via the built-in micro SD slot.<br />The 64-Gbyte flash chip is 1.4 mm thick and consists of sixteen stacked chips and a storage controller. This moviNAND embedded memory module has been in commercial production since December last year and will be the first to reach the marketplace. It doubles the memory of current memory modules such as that in the latest Apple </strong><a class="textTag" href="http://www.physorg.com/tags/iphone/" rel="tag"><strong>iPhone</strong></a><strong>.<br />Higher capacity devices such as Samsung's new offerings will allow mobile devices such as smartphones and media players to have increased memory, and demand for more memory is expected to increase as the market for mobile devices and the applications they run continues to grow. 
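The capacity figures above only line up if the stacked dies are 32-Gbit (4-GByte) parts rather than the "32-Gbyte chips" the text mentions, a common Gbit/Gbyte mix-up in press reports. A quick sketch of the arithmetic, under that assumption:

```python
# Back-of-envelope check of the stacking arithmetic described above.
# Assumption (not stated explicitly in the article): each stacked NAND die
# is a 32-Gbit (4-GByte) part -- the only way eight dies yield a 32-GByte
# microSD card and sixteen dies a 64-GByte moviNAND module.

GBIT_PER_DIE = 32
GBYTE_PER_DIE = GBIT_PER_DIE / 8           # 8 bits per byte -> 4 GByte per die

microsd_capacity = 8 * GBYTE_PER_DIE        # eight stacked dies in the card
movinand_capacity = 16 * GBYTE_PER_DIE      # sixteen stacked dies in the module

print(f"microSD:  {microsd_capacity:.0f} GByte")   # 32 GByte
print(f"moviNAND: {movinand_capacity:.0f} GByte")  # 64 GByte
```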
Executive President of Memory Marketing for Samsung, Dong-Soo Jun, said the new memory solutions will bring the </strong><a class="textTag" href="http://www.physorg.com/tags/storage+capacity/" rel="tag"><strong>storage capacity</strong></a><strong> of computers to mobile devices.<br />The expected cost of the two new high-density storage devices has not been released.</strong></div>olosciencehttp://www.blogger.com/profile/07007258673266741468noreply@blogger.com0tag:blogger.com,1999:blog-2542563728620514982.post-85725956197955132992010-01-14T06:55:00.000-08:002010-01-14T06:58:50.655-08:00Self-assembling solar panels a step closer.<div align="center"><a href="http://cdn.physorg.com/newman/gfx/news/self_assembly.jpg"><img style="TEXT-ALIGN: center; MARGIN: 0px auto 10px; WIDTH: 260px; DISPLAY: block; HEIGHT: 149px; CURSOR: hand" border="0" alt="" src="http://cdn.physorg.com/newman/gfx/news/self_assembly.jpg" /></a> <strong><em><span style="font-size:85%;">The self assembly process made a device of 64,000 parts in 3 minutes. Image: Heiko O. Jacobs</span></em></strong></div><div align="center"><strong>Source: </strong><a href="http://www.physorg.com/news182671345.html"><strong><span style="color:#ffff66;">Physorg.com</span></strong></a><br /></div><div align="center"><strong>---------------------------</strong></div><div align="left"><strong>Scientists Robert J. Knuesel and Heiko O. Jacobs of the University of Minnesota have developed a way to make tiny solar cells self-assemble. </strong><br /><strong>The researchers had previously been unsuccessful in their attempts to make self-assembling electronic components. 
In large systems gravity can be used to drive </strong><a class="textTag" href="http://www.physorg.com/tags/self+assembly/" rel="tag"><strong>self-assembly</strong></a><strong>, and in nanoscale systems chemical processes can be used, but between the two scales, in the micrometer range, it is much more difficult.<br />To overcome the difficulties, Knuesel and Jacobs designed a flexible substrate of a thin layer of copper covered with polyethylene terephthalate (PET). Regular depressions the same size as the "chiplets" were etched into the PET layer and then the sheet was dipped into a bath of molten solder, which coated the exposed copper in the etched depressions. Each chiplet consisted of a 20-60 µm silicon cube with one gold face. The silicon sides had a coating of hydrophobic (water-repelling) molecules, while the gold side had a hydrophilic (water-attracting) coating.<br />When the elements were placed in a vessel containing oil and water, they neatly arranged themselves in a sheet at the boundary between the liquids, with the gold side pointing down toward the water layer. The substrate was then pulled slowly up through the boundary like a conveyor belt, and the elements neatly dropped into place in the depressions as the solder attracted the </strong><a class="textTag" href="http://www.physorg.com/tags/gold/" rel="tag"><strong>gold</strong></a><strong> side. Accuracy was 98%. The assembly was covered with epoxy to keep the chiplets in place, and then a conducting </strong><a class="textTag" href="http://www.physorg.com/tags/electrode/" rel="tag"><strong>electrode</strong></a><strong> layer was added.<br />The device was able to assemble 62,000 elements, each of them thinner than a human hair, in only three minutes. The elimination of a dependency on gravity and sedimentation meant the chiplets could be reduced to below 100 micrometers in size. 
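The throughput and yield implied by those figures are worth spelling out. A minimal sketch, using only the numbers quoted above (62,000 elements, three minutes, 98% placement accuracy):

```python
# Throughput and yield implied by the figures quoted above.
elements = 62_000
seconds = 3 * 60            # three minutes
accuracy = 0.98             # 98% placement accuracy

rate = elements / seconds               # ~344 chiplets placed per second
misplaced = elements * (1 - accuracy)   # ~1,240 chiplets expected out of place

print(f"{rate:.0f} chiplets per second, ~{misplaced:.0f} misplaced per run")
```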
It was important to limit the assembly time to avoid oxidation of the surfaces, which would reduce surface energies and interfere with self-assembly. The water layer had to be acidic, at pH 2.0, and the temperature had to be kept at 95 degrees Celsius to keep the solder molten.</strong></div><div align="left"><strong>The researchers think they can adapt their method to smaller components and larger assembled devices, and it could be used to cheaply and quickly assemble all kinds of high-quality electronic components on a wide range of flexible or rigid substrates including plastics, semiconductors and metals. The assemblages could find uses in numerous applications such as </strong><a class="textTag" href="http://www.physorg.com/tags/solar+cells/" rel="tag"><strong>solar cells</strong></a><strong>, video displays and tiny semiconductors. </strong></div><div align="left"><strong>The use of this method in solar cell production would reduce the cost considerably since less silicon is needed, and it should also be possible to assemble solar chiplets into transparent, flexible materials, which would extend their range of uses.<br />The paper is published in the Proceedings of the National Academy of Sciences (PNAS).<br />More information: Self-assembly of microscopic chiplets at a liquid-liquid-solid interface forming a flexible segmented monocrystalline solar cell, Robert J. Knuesel and Heiko O. 
Jacobs, PNAS, </strong><a href="http://dx.doi.org/10.1073/pnas.0909482107" target="_blank"><strong>DOI:10.1073/pnas.0909482107</strong></a></div>olosciencehttp://www.blogger.com/profile/07007258673266741468noreply@blogger.com0tag:blogger.com,1999:blog-2542563728620514982.post-40373945564845662582010-01-14T01:21:00.001-08:002010-01-14T01:23:48.066-08:00No-Sweat Pressure Sensors.<div align="center"><a href="http://www.sciencedaily.com/images/2010/01/100113104249.jpg"><img style="TEXT-ALIGN: center; MARGIN: 0px auto 10px; WIDTH: 300px; DISPLAY: block; HEIGHT: 250px; CURSOR: hand" border="0" alt="" src="http://www.sciencedaily.com/images/2010/01/100113104249.jpg" /></a><strong> <em><span style="font-size:85%;">The new pressure sensor works at temperatures of up to 250 degrees Celsius. (Credit: Copyright Fraunhofer IMS)</span></em></strong></div><div align="center"><strong>Source: </strong><a href="http://www.sciencedaily.com/releases/2010/01/100113104249.htm"><strong><span style="color:#ffff66;">ScienceDaily</span></strong></a></div><div align="center"><strong>------------------------</strong></div><div align="left"><strong>ScienceDaily (Jan. 13, 2010) — Microelectronic chips used to take pressure readings are very delicate. A new technology has been developed that makes pressure sensors more robust, enabling them to continue operating normally at temperatures up to 250 degrees Celsius. </strong></div><div align="left"><strong>The drill bit gradually burrows deeper into the earth, working its way through the rock. Meanwhile, dozens of sensors are busily engaged in tasks such as taking pressure readings and evaluating porosity. The conditions they face are extreme, with the sensors being required to withstand high temperatures and pressures as well as shocks and vibrations. 
The sensors send the data to the surface to help geologists with work such as searching for oil deposits.<br />Yet there is one major hurdle: on average, the pressure sensors can only withstand temperatures of between 80 and 125 degrees Celsius -- but at great depths the temperature is often significantly higher. The Fraunhofer Institute for Microelectronic Circuits and Systems IMS in Duisburg has come to the rescue, its researchers having developed a pressure sensor system that continues to function normally even at 250 degrees Celsius.<br />"The pressure sensors consist of two components that are located on a microelectronic chip or wafer," explains Dr. Hoc Khiem Trieu, department head at IMS. "The first component is the sensor itself, and the other component is the EEPROM." This is the element that stores all the readings together with the data required for calibration.<br />To enable the pressure sensor to function properly even at extremely high temperatures, the developers modified the wafer. While normal wafers tend to be made of monocrystalline silicon alone, the researchers chose wafers with an additional silicon oxide layer for this application. "The additional oxide layer provides better electrical insulation," Trieu continues. "It prevents the leakage current that typically occurs at very high temperatures, which is the principal reason that conventional sensors fail when they reach a certain temperature."<br />The oxide layer enabled the researchers to improve the insulation of the memory component by three to four orders of magnitude. In theory, this should enable the pressure sensors to withstand temperatures of up to 350 degrees Celsius -- the researchers have provided practical proof of stability up to 250 degrees and are planning to conduct further studies at higher temperatures. 
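To see why "three to four orders of magnitude" is the right amount of headroom, a rough rule of thumb helps: reverse-bias leakage in silicon roughly doubles for every 10 degrees Celsius of temperature rise. The sketch below applies that rule of thumb; it is an illustrative model, not Fraunhofer's measurement data:

```python
import math

# Illustrative only: junction leakage in silicon roughly doubles every
# ~10 degrees C (a common rule of thumb, not Fraunhofer's data).
DOUBLING_STEP_C = 10.0

def relative_leakage(t_celsius, t_ref=125.0):
    """Leakage at t_celsius relative to leakage at the usual 125 C limit."""
    return 2.0 ** ((t_celsius - t_ref) / DOUBLING_STEP_C)

# Going from the conventional 125 C limit to 250 C multiplies leakage by
# ~2^12.5 -- i.e. three to four orders of magnitude, the same factor by
# which the oxide layer is said to improve the insulation.
factor = relative_leakage(250.0)
print(f"~{factor:.0f}x more leakage (~10^{math.log10(factor):.1f})")
```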
In addition, the researchers are analyzing the prototypes of the pressure sensors in endurance tests.<br />There is a broad range of potential applications, with engineers hoping to use the high-temperature pressure sensors not only in the petrochemical environment, but also in automobile engines and geothermal applications. </strong></div><div align="left"><strong>Story Source:<br />Adapted from materials provided by </strong><a class="blue" href="http://www.fraunhofer.de/" rel="nofollow" target="_blank"><strong>Fraunhofer-Gesellschaft</strong></a><strong>. </strong></div>olosciencehttp://www.blogger.com/profile/07007258673266741468noreply@blogger.com0tag:blogger.com,1999:blog-2542563728620514982.post-36941112695105092772010-01-13T08:55:00.000-08:002010-01-13T09:02:00.297-08:00RCA's Airenergy charger converts WiFi energy to electricity (VIDEO)<p align="center"><object width="320" height="265"><param name="movie" value="http://www.youtube.com/v/IMMbihbeIls&hl=en_US&fs=1&"><param name="allowFullScreen" value="true"><param name="allowscriptaccess" value="always"><embed src="http://www.youtube.com/v/IMMbihbeIls&hl=en_US&fs=1&" type="application/x-shockwave-flash" allowscriptaccess="always" allowfullscreen="true" width="320" height="265"></embed></object></p><p align="center"><strong>Source: </strong><a href="http://www.physorg.com/news182595455.html"><strong><span style="color:#ffff66;">Physorg.com </span></strong></a></p><p align="center"><strong><em>Airenergy is a gadget that can harvest free electricity from WiFi signals such as those from a wireless Internet connection, apparently with enough efficiency to make it practical for recharging devices such as mobile phones. 
</em></strong></p><p align="left"><strong>At the </strong><a class="textTag" href="http://www.physorg.com/tags/consumer+electronics+show/" rel="tag"><strong>Consumer Electronics Show</strong></a><strong> (CES) in Las Vegas this week a RCA spokesman said they had been able to charge a BlackBerry from 30% charge to fully charged in around 90 minutes using only ambient WiFi signals as the power source, although it was unclear on whether the Airenergy </strong><a class="textTag" href="http://www.physorg.com/tags/battery/" rel="tag"><strong>battery</strong></a><strong> was recharged in that time. The Airenergy recharging time depends on the proximity to the WiFi signal and the number of WiFi sources in the vicinity.<br />The RCA Airenergy unit converts the WiFi antenna signal to DC power to recharge its own internal lithium battery, so it automatically recharges itself whenever the device is anywhere near a WiFi </strong><a class="textTag" href="http://www.physorg.com/tags/hotspot/" rel="tag"><strong>hotspot</strong></a><strong>. If you have a wireless network at home the Airenergy would recharge overnight virtually anywhere in your home. When you need to recharge your phone or other device you plug the Airenergy battery into the phone via USB to transfer the charge.</strong></p><p align="left"><strong>Harvesting electricity from signals in the air is not new, as anyone who ever built a crystal radio running only on the radio signals it received can testify, but until now no device has been able to harvest enough electricity to make it of practical use. 
In most modern cities WiFi signal hotspots abound, which might make the Airenergy device a viable option, although in rural areas WiFi sources are less widespread.<br />A USB charger costing around $40, and about the size of a phone, is expected to be released later this year, with a WiFi-harvesting battery around the same size and price as an OEM battery available shortly after.</strong></p>olosciencehttp://www.blogger.com/profile/07007258673266741468noreply@blogger.com0tag:blogger.com,1999:blog-2542563728620514982.post-6746022155503092392010-01-11T12:20:00.001-08:002010-01-11T12:26:19.909-08:00Introducing the Light Touch interactive projector.<div align="center"><a href="http://cdn.physorg.com/newman/gfx/news/2-introducingt.jpg"><img style="TEXT-ALIGN: center; MARGIN: 0px auto 10px; WIDTH: 292px; DISPLAY: block; HEIGHT: 185px; CURSOR: hand" border="0" alt="" src="http://cdn.physorg.com/newman/gfx/news/2-introducingt.jpg" 
/></a><strong> Source: </strong><a href="http://www.physorg.com/news182413124.html"><strong><span style="color:#ffff66;">Physorg.com</span></strong></a></div><div align="center"><strong>---------------------------</strong></div><div align="left"><em><strong>UK-based company Light Blue Optics has introduced an extremely compact projector that converts any flat surface into an interactive touch video screen. </strong></em></div><div align="left"></div><div align="left"><strong>The Light Touch interactive </strong><a class="textTag" href="http://www.physorg.com/tags/projector/" rel="tag"><strong>projector</strong></a><strong> is basically a hand-held computer that runs Windows CE and uses a proprietary holographic laser projection (HLP) system to project a virtual touch screen onto any flat surface. The projected image is only 15 lumens, but remains in focus even at long distances.<br /></strong><strong>Holographic refers to the method Light Blue Optics uses to create two-dimensional images by transforming original images into sets of diffraction patterns that are shown on a micro-display and then illuminated by laser light. Diffraction patterns are used because of their high efficiency, since they direct light to where it is needed rather than indiscriminately.<br />Multiple diffraction patterns are calculated, with each producing a rough version of the image. The viewer's eye then separates the images from the noise and sees them as a clear video image that is always in focus, even when projected onto a curved surface. The lasers produce vivid and bright image colors, and the wide throw angle produces large (10-inch) images even close to the tiny projector.</strong></div><div align="left"><strong>The Light Touch system detects interactivity via an integrated infrared system that in effect transforms any surface into a touch screen, and this allows users to run the projector and control applications by touching the image. 
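The idea of turning an image into a diffraction pattern whose illumination reproduces a rough version of that image can be sketched with a standard phase-only Fourier hologram. This is the textbook principle only, not Light Blue Optics' proprietary HLP algorithm, and the array sizes and target image are arbitrary:

```python
import numpy as np

# Minimal sketch of the principle behind diffractive projection: keep only
# the *phase* of the image's inverse Fourier transform as the "diffraction
# pattern", then "illuminate" it (forward FFT) to recover a rough version
# of the image. Illustrative only -- not Light Blue Optics' HLP algorithm.
rng = np.random.default_rng(0)

n = 64
target = np.zeros((n, n))
target[24:40, 24:40] = 1.0                                   # simple bright square

diffused = target * np.exp(2j * np.pi * rng.random((n, n)))  # random phase diffuser
hologram_phase = np.angle(np.fft.ifft2(diffused))            # phase-only pattern

reconstruction = np.abs(np.fft.fft2(np.exp(1j * hologram_phase)))
# 'reconstruction' is a noisy but recognisable version of 'target'; averaging
# several such patterns in quick succession, as described above, lets the eye
# separate the image from the noise.
```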
WiFi connectivity and Bluetooth technology are built in to allow users to connect to the Internet for multimedia sharing and social networking via the projector.<br />The laser used in the Light Touch projector is a class 1, which means it is completely eye safe, and the projected images are WVGA (Wide Video Graphics Array), which produces a crisp, auto-focused image. The standard flash memory is 2 GB, but there is a micro SD card slot to upgrade to 32 GB storage. The battery is rated for two hours of use. </strong></div><div align="left"><strong>The </strong><a class="textTag" href="http://www.physorg.com/tags/light+touch/" rel="tag"><strong>Light Touch</strong></a><strong> projector, the first released product of Light Blue Optics, was demonstrated for the first time on 7 January at the 2010 Consumer Electronics Show (CES) in Las Vegas.</strong></div><div align="left"><strong>More information: Light Blue Optics website: </strong><a href="http://lightblueoptics.com/products/light-touch/" target="_blank"><strong>http://lightblueoptics.com/products/light-touch/</strong></a><br /><strong>© 2009 PhysOrg.com</strong></div><br /><object width="425" height="344"><param name="movie" value="http://www.youtube.com/v/wXFOAiKjsHo&color1=0xb1b1b1&color2=0xcfcfcf&hl=en_US&feature=player_embedded&fs=1"><param name="allowFullScreen" value="true"><param name="allowScriptAccess" value="always"><embed src="http://www.youtube.com/v/wXFOAiKjsHo&color1=0xb1b1b1&color2=0xcfcfcf&hl=en_US&feature=player_embedded&fs=1" type="application/x-shockwave-flash" allowfullscreen="true" allowscriptaccess="always" width="425" height="344"></embed></object><br /><object width="425" height="344"><param name="movie" value="http://www.youtube.com/v/6BuNyUlZuH4&color1=0xb1b1b1&color2=0xcfcfcf&hl=en_US&feature=player_embedded&fs=1"><param name="allowFullScreen" value="true"><param name="allowScriptAccess" value="always"><embed 
src="http://www.youtube.com/v/6BuNyUlZuH4&color1=0xb1b1b1&color2=0xcfcfcf&hl=en_US&feature=player_embedded&fs=1" type="application/x-shockwave-flash" allowfullscreen="true" allowscriptaccess="always" width="425" height="344"></embed></object>olosciencehttp://www.blogger.com/profile/07007258673266741468noreply@blogger.com0tag:blogger.com,1999:blog-2542563728620514982.post-10444198075298544952009-10-07T05:36:00.000-07:002009-10-07T05:40:55.926-07:00Communication Through Power Of Thought now possible, with the help of electrodes, a PC and Internet connection.<div align="center"><a href="http://www.sciencedaily.com/images/2009/10/091006102637.jpg"><img style="TEXT-ALIGN: center; MARGIN: 0px auto 10px; WIDTH: 300px; DISPLAY: block; HEIGHT: 171px; CURSOR: hand" border="0" alt="" src="http://www.sciencedaily.com/images/2009/10/091006102637.jpg" /></a><strong> </strong><a href="http://www.sciencedaily.com/releases/2009/10/091006102637.htm"><strong><span style="color:#ffff66;">SOURCE</span></strong></a></div><div align="center"><strong></strong> </div><div align="left"><strong>ScienceDaily (Oct. 6, 2009) — New research from the University of Southampton has demonstrated that person-to-person communication through the power of thought is possible -- with the help of electrodes, a computer and Internet connection.</strong></div><div align="left"><strong>Brain-Computer Interfacing (BCI) can be used for capturing brain signals and translating them into commands that allow humans to control (just by thinking) devices such as computers, robots, rehabilitation technology and virtual reality environments.<br />This experiment goes a step further and was conducted by Dr Christopher James from the University's Institute of Sound and Vibration Research. 
The aim was to expand the current limits of this technology and show that brain-to-brain (B2B) communication is possible.<br />Dr James comments: "Whilst BCI is no longer a new thing and person to person communication via the nervous system was shown previously in work by Professor Kevin Warwick from the University of Reading, here we show, for the first time, true brain to brain interfacing. We have yet to grasp the full implications of this but there are various scenarios where B2B could be of benefit such as helping people with severe debilitating muscle wasting diseases, or with the so-called 'locked-in' syndrome, to communicate and it also has applications for gaming."<br />His experiment had one person using BCI to transmit thoughts, translated as a series of binary digits, over the internet to another person whose computer receives the digits and transmits them to the second user's brain by flashing an LED lamp.<br />While attached to an EEG amplifier, the first person would generate and transmit a series of binary digits, imagining moving their left arm for zero and their right arm for one. The second person was also attached to an EEG amplifier and their PC would pick up the stream of binary digits and flash an LED lamp at two different frequencies, one for zero and the other for one. The pattern of the flashing LEDs is too subtle to be noticed by the second person, but it is picked up by electrodes monitoring the visual cortex of the recipient.<br />The encoded information is then extracted from the brain activity of the second user and the PC can decipher whether a zero or a one was transmitted. 
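The decoding step described above amounts to deciding which of two flicker frequencies dominates a noisy recording. A minimal sketch of that decision, using a single-bin DFT (Goertzel-style) power comparison; the frequencies, sample rate, and the toy signal generator are illustrative assumptions, not the parameters of Dr James's actual experiment:

```python
import math
import random

# Sketch of the receiving step described above: decide which of two LED
# flicker frequencies dominates a noisy signal, as the EEG decoder must.
# Frequencies and sample rate are illustrative, not those of the experiment.
FS = 256.0                  # samples per second
F_ZERO, F_ONE = 10.0, 15.0  # flicker frequency for bit 0 and bit 1

def bin_power(signal, freq):
    """Power of `signal` at `freq` via a single-bin DFT (Goertzel-style)."""
    re = sum(s * math.cos(2 * math.pi * freq * i / FS) for i, s in enumerate(signal))
    im = sum(s * math.sin(2 * math.pi * freq * i / FS) for i, s in enumerate(signal))
    return re * re + im * im

def decode_bit(signal):
    return 0 if bin_power(signal, F_ZERO) > bin_power(signal, F_ONE) else 1

def simulate_response(bit, n=512, noise=0.5, seed=42):
    """Toy stand-in for visual-cortex activity entrained to the flashing LED."""
    rng = random.Random(seed)
    f = F_ONE if bit else F_ZERO
    return [math.sin(2 * math.pi * f * i / FS) + rng.gauss(0, noise) for i in range(n)]

print(decode_bit(simulate_response(0)))  # 0
print(decode_bit(simulate_response(1)))  # 1
```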
This shows true brain-to-brain activity.<br />You can watch Dr James' BCI experiment at: </strong><a href="http://www.youtube.com/watch?v=93p7oDkA5WA&feature=email" rel="nofollow" target="_blank"><strong>http://www.youtube.com/watch?v=93p7oDkA5WA&feature=email</strong></a><br /><strong>Dr James is part of the University of Southampton's Brain-Computer Interfacing Research Programme, which brings together biomedical engineering and the clinical sciences and provides a cohesive scientific basis for rehabilitation research and management. Projects are driven by clinical problems, using cutting-edge signal processing research to produce an investigative tool for advancing knowledge of neurophysiological mechanisms, as well as providing a practical therapeutic system to be used outside a specialised BCI laboratory.<br />Dr James also appeared on BBC2's 'James May's Big Ideas' last year, talking about thought controlled wheelchairs and introducing the field of BCI. You can view the segment here: </strong><a href="http://www.youtube.com/watch?v=Uyrd0uOuyms&feature=related" rel="nofollow" target="_blank"><strong>http://www.youtube.com/watch?v=Uyrd0uOuyms&feature=related</strong></a><br /><strong>Adapted from materials provided by </strong><a class="blue" href="http://www.soton.ac.uk/" rel="nofollow" target="_blank"><strong>University of Southampton</strong></a><strong>, via </strong><a href="http://www.eurekalert.org/" rel="nofollow" target="_blank"><strong>EurekAlert!</strong></a><strong>, a service of AAAS. 
</strong></div>olosciencehttp://www.blogger.com/profile/07007258673266741468noreply@blogger.com2tag:blogger.com,1999:blog-2542563728620514982.post-24589693613757436262009-10-05T06:41:00.000-07:002009-10-05T06:43:25.505-07:00New Multi-use Device Can Shed Light On Oxygen Intake.<div align="center"><a href="http://www.sciencedaily.com/images/2009/09/090922095812.jpg"><img style="TEXT-ALIGN: center; MARGIN: 0px auto 10px; WIDTH: 300px; DISPLAY: block; HEIGHT: 212px; CURSOR: hand" border="0" alt="" src="http://www.sciencedaily.com/images/2009/09/090922095812.jpg" /></a> <a href="http://www.sciencedaily.com/releases/2009/09/090922095812.htm"><strong><span style="color:#ffff66;">SOURCE</span></strong></a></div><div align="center"><strong></strong> </div><div align="left"><strong>ScienceDaily (Oct. 5, 2009) — A fiber-optic sensor created by a team of Purdue University researchers, capable of measuring oxygen intake rates, could have broad applications, ranging from plant root development to assessing the effectiveness of chemotherapy drugs. </strong></div><div align="left"><strong>The self-referencing optrode, developed in the lab of Marshall Porterfield, an associate professor of agricultural and biological engineering, is non-invasive, can deliver real-time data, holds a calibration for the sensor's lifetime and, unlike traditional sensors, doesn't consume oxygen and compete with the sample being measured. A paper on the device was published in the early online version of the journal The Analyst this week.<br />"It's very sensitive in terms of the biological specimens we can monitor," Porterfield said. "We don't only measure oxygen concentration, we measure the flux. That's what's important for biologists."<br />Mohammad Rameez Chatni, a doctoral student in Porterfield's lab, said the sensor could be used broadly across disciplines. 
Testing included tumor cells, fish eggs, spinal cord material and plant roots.<br />Cancerous cells typically take in oxygen at higher rates than healthy cells, Chatni said. Measuring how a chemotherapy drug affects oxygen intake in both kinds of cells would tell a researcher whether the treatment was effective in killing tumors while leaving healthy cells unaffected.<br />Plant biologists might be interested in the sensor to measure oxygen intake of a genetically engineered plant's roots to determine its ability to survive in different types of soil.<br />"This tool could have applications in biomedical science, agriculture, material science. It's going across all disciplines," Chatni said.<br />The sensor is created by heating an optical fiber and pulling it apart to create two pointed optrodes about 15 microns in diameter, about one-tenth the size of a human hair. A membrane containing a fluorescent dye is placed on the tip of an optrode.<br />Oxygen binds to the fluorescent dye. When blue light is passed through the optrode, the dye emits red light back. The complex analysis of that red light reveals the concentration of oxygen present at the tip of the optrode.<br />The optrode is oscillated between two points, one just above the surface of the sample and another a short distance away. Based on the difference in oxygen concentration between the two points, the flux, and thus the amount of oxygen being taken in by the sample, is calculated.<br />It's the intake, or oxygen transportation, that Porterfield said is important to understand.<br />"Just knowing the oxygen concentration in or around a sample will not necessarily correlate to the underlying biological processes going on," he said.<br />Porterfield said future work will focus on altering the device to measure things such as sodium and potassium intake as well. 
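The two-point measurement described above can be turned into a flux with Fick's first law of diffusion, J = -D · ΔC/Δx. A minimal sketch; the diffusion coefficient, concentrations, and oscillation distance below are illustrative values, not figures from the Purdue paper:

```python
# Turning the two-point optrode measurement into a flux with Fick's first
# law, J = -D * dC/dx. All numbers here are illustrative, not values from
# the Purdue paper.
D_O2 = 2.1e-5            # diffusion coefficient of O2 in water, cm^2/s (approx.)

def oxygen_flux(c_near, c_far, dx_cm):
    """Flux in mol/cm^2/s, with x increasing away from the sample.
    c_near / c_far: O2 concentration (mol/cm^3) at the optrode's two positions."""
    return -D_O2 * (c_far - c_near) / dx_cm

# A respiring sample depletes oxygen near its surface, so c_near < c_far and
# the computed flux is negative: net oxygen transport toward the sample.
flux = oxygen_flux(c_near=2.0e-7, c_far=2.5e-7, dx_cm=1.0e-3)
print(f"{flux:.2e} mol cm^-2 s^-1")
```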
The National Science Foundation funded the research.<br />Adapted from materials provided by </strong><a class="blue" href="http://www.purdue-edu.com/" rel="nofollow" target="_blank"><strong>Purdue University</strong></a><strong>. Original article written by Brian Wallheimer. </strong></div>olosciencehttp://www.blogger.com/profile/07007258673266741468noreply@blogger.com0tag:blogger.com,1999:blog-2542563728620514982.post-15389212186799916742009-10-01T11:17:00.000-07:002009-10-01T11:19:13.839-07:00'Visual Walkman' Offers Augmented Reality.<div align="center"><a href="http://www.sciencedaily.com/images/2009/09/090930102926.jpg"><img style="TEXT-ALIGN: center; MARGIN: 0px auto 10px; WIDTH: 300px; DISPLAY: block; HEIGHT: 245px; CURSOR: hand" border="0" alt="" src="http://www.sciencedaily.com/images/2009/09/090930102926.jpg" /></a><strong><span style="color:#ffff66;"> </span></strong><a href="http://www.sciencedaily.com/releases/2009/09/090930102926.htm"><strong><span style="color:#ffff66;">SOURCE</span></strong></a></div><div align="center"><strong></strong> </div><div align="left"><strong>ScienceDaily (Sep. 30, 2009) — "Augmented reality" involves mixing the real world with computer-generated images. The result is a kind of visual Walkman. Jurjen Caarls developed a prototype, which is the subject of a doctoral dissertation that he recently defended at Delft University of Technology (The Netherlands). </strong></div><div align="left"><strong>One example of augmented reality is a special helmet, in which images are projected into the wearer’s eyes, thereby creating the illusion that these images are part of reality. It is as if extra elements are being added to reality.<br />Football advertising<br />A simpler form of real-time augmented reality is already being used during televised football matches. This technology is used to create virtual billboards on either side of the goals, as an additional option for advertisers. 
Whatever the camera angle, these virtual billboards seem to be perfectly aligned with real on-screen objects. This is made possible by adjusting the projection of these images using information on the current ‘state’ of the live camera.<br />Sensors<br />However, things get considerably more complicated when people start moving around within an augmented reality environment. In these situations, of course, the only way to achieve acceptable results is to have accurate, moment-by-moment information on the position and orientation of the individual in question (and especially that of their eyes) relative to the real space around them. This information is fed into the system by various sensors. The equipment built into the augmented reality helmet includes a camera, angular velocity sensors, and accelerometers.<br />Prototype<br />Jurjen Caarls’ main focus was on achieving accurate, real-time measurements of position and orientation. To this end, he has developed specific image processing techniques, as well as methods for mixing and filtering the information from various sensors. In partnership with the Royal Academy of Arts in The Hague, he has successfully created a working prototype. Those using the system can simply observe the real world, or they can supplement reality with virtual objects. This effect is achieved using two small screens and two semi-transparent mirrors, which are built into the helmet.<br />Walkman<br />Caarls feels that the further development of augmented reality could lead to an entirely novel interaction between man and computer. "I can imagine a future in which people experience augmented reality by wearing glasses with integrated displays that project images on their retinas. These images will seem to be just another part of reality. Think of it as a visual Walkman," he said.<br />In the future, augmented reality applications could have a wide range of uses, in museums and games for example. 
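The article leaves Caarls' filtering methods unspecified, but the idea of "mixing and filtering the information from various sensors" can be illustrated with a textbook complementary filter, which blends a gyroscope (fast to respond but prone to drift) with an accelerometer-derived tilt (drift-free but noisy) into a single orientation estimate. Everything below is a generic sketch, not the prototype's actual algorithm:

```python
# Toy sensor-fusion sketch: the helmet described above carries a camera,
# angular velocity sensors and accelerometers.  A complementary filter is
# one standard (generic, not Caarls') way to blend two such sources.

def complementary_filter(pitch_prev, gyro_rate, accel_pitch, dt, alpha=0.98):
    """One update step.  Angles in degrees, gyro_rate in deg/s.
    alpha weights the integrated gyro; (1 - alpha) weights the
    accelerometer tilt, which slowly corrects gyro drift."""
    gyro_estimate = pitch_prev + gyro_rate * dt  # integrate angular velocity
    return alpha * gyro_estimate + (1 - alpha) * accel_pitch

pitch = 0.0
for _ in range(100):  # one second of samples at 100 Hz
    pitch = complementary_filter(pitch, gyro_rate=0.0, accel_pitch=10.0, dt=0.01)
# with a silent gyro, the estimate steadily converges toward the
# accelerometer's reading of 10 degrees
```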
They could also be a valuable tool for architects and industrial maintenance workers.<br />Adapted from materials provided by </strong><a class="blue" href="http://www.tudelft.nl/" rel="nofollow" target="_blank"><strong>Delft University of Technology</strong></a><strong>. </strong></div>olosciencehttp://www.blogger.com/profile/07007258673266741468noreply@blogger.com0tag:blogger.com,1999:blog-2542563728620514982.post-71985039435176019792009-09-28T12:33:00.001-07:002009-09-28T12:34:36.292-07:00Swimming Robot Makes Waves At Bath.<div align="center"><a href="http://www.sciencedaily.com/images/2009/09/090921091835.jpg"><img style="TEXT-ALIGN: center; MARGIN: 0px auto 10px; WIDTH: 300px; DISPLAY: block; HEIGHT: 360px; CURSOR: hand" border="0" alt="" src="http://www.sciencedaily.com/images/2009/09/090921091835.jpg" /></a><strong><span style="color:#ffff66;"> </span></strong><a href="http://www.sciencedaily.com/releases/2009/09/090921091835.htm"><strong><span style="color:#ffff66;">SOURCE</span></strong></a></div><div align="center"><strong></strong> </div><div align="left"><strong>ScienceDaily (Sep. 25, 2009) — Researchers at the University of Bath have used nature for inspiration in designing a new type of swimming robot which could bring a breakthrough in submersible technology. 
</strong></div><div align="left"><strong>Conventional submarine robots are powered by propellers that are heavy, inefficient and can get tangled in weeds.<br />In contrast ‘Gymnobot', created by researchers from the Ocean Technologies Lab in the University's Department of Mechanical Engineering, is powered by a fin that runs the length of the underside of its rigid body; this undulates to make a wave in the water which propels the robot forwards.<br />The design, inspired by the Amazonian knifefish, is thought to be more energy efficient than conventional propellers and allows the robot to navigate shallow water near the sea shore.<br />Gymnobot could be used to film and study the diverse marine life near the seashore, where conventional submersible robots would have difficulty manoeuvring due to the shallow water with its complex rocky environment and plants that can tangle a propeller.<br />Dr William Megill, Lecturer in Biomimetics at the University of Bath, explained: "The knifefish has a ventral fin that runs the length of its body and makes a wave in the water that enables it to easily swim backwards or forwards in the water.<br />"Gymnobot mimics this fin and creates a wave in the water that drives it forwards. This form of propulsion is potentially much more efficient than a conventional propeller and is easier to control in shallow water near the shore."<br />Keri Collins, a postgraduate student who developed the Gymnobot as part of her PhD, added: "We hope to observe how the water flows around the fin in later stages of the project. In particular we want to look at the creation and development of vortices around the fin.<br />"Some fish create vortices when flicking their tails one way but then destroy them when their tails flick back the other way. By destroying the vortex they are effectively re-using the energy in that swirling bit of water. 
The less energy left in the wake when the fish has passed, the less energy is wasted.<br />"It will be particularly interesting to see how thrust is affected by changing the wave of the fin from a constant amplitude to one that is tapered at one end."<br />The lab was recently awarded a grant to work with six other European institutions to create a similar robot that reacts to water flow and is able to swim against currents.<br />In addition to studying biodiversity near the shore and in fast-flowing rivers, robots like Gymnobot could also be used for detecting pollution in the environment or for inspecting structures such as oil rigs.<br />The project was funded by BMT Defence Services and the Engineering & Physical Sciences Research Council.<br />Adapted from materials provided by </strong><a class="blue" href="http://www.bath.ac.uk/" rel="nofollow" target="_blank"><strong>University of Bath</strong></a><strong>, via </strong><a href="http://www.alphagalileo.org/" rel="nofollow" target="_blank"><strong>AlphaGalileo</strong></a><strong>. </strong></div>olosciencehttp://www.blogger.com/profile/07007258673266741468noreply@blogger.com0tag:blogger.com,1999:blog-2542563728620514982.post-81215742682666704092009-09-20T01:30:00.000-07:002009-09-20T01:32:35.806-07:00The Interoperable Telesurgical Protocol.<div align="center"><a href="http://www.sciencedaily.com/releases/2009/09/090917144144.htm"><img style="TEXT-ALIGN: center; MARGIN: 0px auto 10px; WIDTH: 300px; DISPLAY: block; HEIGHT: 200px; CURSOR: hand" border="0" alt="" src="http://www.sciencedaily.com/images/2009/09/090917144144.jpg" /> <strong><span style="color:#ffff66;">SOURCE</span></strong></a></div><div align="center"><strong><br /></strong></div><div align="left"><strong>ScienceDaily (Sep. 
18, 2009) — Using a new software protocol called the Interoperable Telesurgical Protocol, nine research teams from universities and research institutes around the world recently collaborated on the first successful demonstration of multiple biomedical robots operated from different locations in the U.S., Europe, and Asia. SRI International operated its M7 surgical robot for this demonstration. </strong></div><div align="left"><strong>In a 24-hour period, each participating group connected over the Internet and controlled robots at different locations. The tests performed demonstrated how a wide variety of robot and controller designs can seamlessly interoperate, allowing researchers to work together easily and more efficiently. In addition, the demonstration evaluated the feasibility of robotic manipulation from multiple sites, and was conducted to measure time and performance for evaluating laparoscopic surgical skills.<br />New Interoperable Telesurgical Protocol<br />The new protocol was cooperatively developed by the University of Washington and SRI International to standardize the way remotely operated robots are managed over the Internet.<br />"Although many telemanipulation systems have common features, there is currently no accepted protocol for connecting these systems," said SRI's Tom Low. "We hope this new protocol serves as a starting point for the discussion and development of a robust and practical Internet-type standard that supports the interoperability of future robotic systems."<br />The protocol will allow engineers and designers who usually develop technologies independently to work collaboratively, determine which designs work best, encourage widespread adoption of the new communications protocol, and help robotics research to evolve more rapidly. 
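The article does not spell out the protocol's wire format, so the following is only a hypothetical sketch of the kind of fixed-format command message a teleoperation protocol must standardize: who is sending, a sequence number for a lossy network, and the commanded tool pose. Every field name and size here is invented for illustration, not taken from the actual Interoperable Telesurgical Protocol:

```python
# Hypothetical packet layout for a teleoperation command message.  The real
# protocol's format is not described in the article; this only illustrates
# why a fixed, agreed-upon encoding is what makes different robots and
# controllers interoperable.
import struct

# network byte order: uint32 sequence, uint8 tool id, six float64 pose values
PACKET_FORMAT = "!IB6d"

def pack_command(seq, tool_id, pose):
    x, y, z, roll, pitch, yaw = pose
    return struct.pack(PACKET_FORMAT, seq, tool_id, x, y, z, roll, pitch, yaw)

def unpack_command(payload):
    seq, tool_id, *pose = struct.unpack(PACKET_FORMAT, payload)
    return seq, tool_id, tuple(pose)

msg = pack_command(42, 1, (0.10, -0.02, 0.35, 0.0, 1.57, 0.0))
assert unpack_command(msg) == (42, 1, (0.10, -0.02, 0.35, 0.0, 1.57, 0.0))
```

Because every station agrees on the byte layout, any controller can drive any robot, which is the interoperability the demonstration set out to prove.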
Early adoption of this protocol internationally will encourage robotic systems to be developed with interoperability in mind, and avoid future incompatibilities.<br />"We're very pleased with the success of the event in which almost all of the possible connections between operator stations and remote robots were successful. We were particularly excited that novel elements such as a simulated robot and an exoskeleton controller worked smoothly with the other remote manipulation systems," said Professor Blake Hannaford of the University of Washington.<br />The demonstration included the following organizations:<br />SRI International, Menlo Park, Calif., USA<br />University of Washington Biorobotics Lab (BRL), Seattle, Washington, USA<br />University of California at Santa Cruz (UCSC), Bionics Lab, Santa Cruz, Calif., USA<br />iMedSim, Interactive Medical Simulation Laboratory, Rensselaer Polytechnic Institute, Troy, New York, USA<br />Korea University of Technology (KUT) BioRobotics Lab, Cheonan, South Chungcheong, South Korea<br />Imperial College London (ICL), London, England<br />Johns Hopkins University (JHU), Baltimore, Maryland, USA<br />Technische Universität München (TUM), Munich, Germany<br />Tokyo Institute of Technology (TOK), Tokyo, Japan<br />For more information regarding availability of the Interoperable Telesurgical Protocol, please visit: </strong><a href="http://brl.ee.washington.edu/Research_Active/Interoperability/index.php/Main_Page" rel="nofollow" target="_blank"><strong>http://brl.ee.washington.edu/Research_Active/Interoperability/index.php/Main_Page</strong></a><br /><strong>Adapted from materials provided by </strong><a class="blue" href="http://www.sri.com/" rel="nofollow" target="_blank"><strong>SRI International</strong></a><strong>. 
</strong></div>olosciencehttp://www.blogger.com/profile/07007258673266741468noreply@blogger.com0tag:blogger.com,1999:blog-2542563728620514982.post-23652384260079286842009-07-25T00:43:00.001-07:002009-07-25T00:45:09.547-07:00Silicon With Afterburners: New Process Could Be Boon To Electronics Manufacturer<div align="center"><a href="http://www.sciencedaily.com/releases/2009/07/090723113700.htm"><img style="TEXT-ALIGN: center; MARGIN: 0px auto 10px; WIDTH: 300px; DISPLAY: block; HEIGHT: 290px; CURSOR: hand" border="0" alt="" src="http://www.sciencedaily.com/images/2009/07/090723113700.jpg" /><strong> <span style="color:#ffff66;">SOURCE</span></strong></a><br /><br /><div align="left"><strong>ScienceDaily (July 24, 2009) — Scientists at Rice University and North Carolina State University have found a method of attaching molecules to semiconducting silicon that may help manufacturers reach beyond the current limits of Moore's Law as they make microprocessors both smaller and more powerful. </strong></div><div align="left"><strong>Moore's Law, suggested by Intel co-founder Gordon Moore in 1965, said the number of transistors that can be placed on an integrated circuit doubles about every two years. 
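Moore's observation reduces to simple arithmetic: a count that doubles every two years is n(t) = n0 * 2^((t - t0) / 2). As a quick sketch (the 1971 starting point of 2,300 transistors is the often-cited Intel 4004; the projection is illustrative, not a claim about any particular chip):

```python
# Moore's law as arithmetic: transistor counts double roughly every two
# years.  Starting values below are illustrative.

def transistors(n0, t0, t, doubling_period_years=2.0):
    """Project a transistor count n0 at year t0 forward to year t."""
    return n0 * 2 ** ((t - t0) / doubling_period_years)

# 2,300 transistors in 1971 projected 38 years forward: 19 doublings,
# a 2**19 = 524,288-fold increase, on the order of a billion transistors.
projected = transistors(2300, 1971, 2009)
```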
But even Moore has said the law cannot be sustained indefinitely.<br />The challenge is to get past the limits of doping, a process that has been essential to creating the silicon substrate that is at the heart of all modern integrated circuits, said James Tour, Rice's Chao Professor of Chemistry and professor of mechanical engineering and materials science and of computer science.<br />Doping introduces impurities into pure crystalline silicon as a way of tuning microscopic circuits to a particular need, and it's been effective so far even in concentrations as small as one atom of boron, arsenic or phosphorus per 100 million of silicon.<br />But as manufacturers pack more transistors onto integrated circuits by making the circuits ever smaller, doping gets problematic.<br />"When silicon gets really small, down to the nanoscale, you get structures that essentially have very little volume," Tour said. "You have to put dopant atoms in silicon for it to work as a semiconductor, but now, devices are so small you get inhomogeneities. You may have a few more dopant atoms in this device than in that one, so the irregularities between them become profound."<br />Manufacturers who put billions of devices on a single chip need them all to work the same way, but that becomes more difficult with the size of a state-of-the-art circuit at 45 nanometers wide -- a human hair is about 100,000 nanometers wide -- and smaller ones on the way.<br />The paper suggests that monolayer molecular grafting -- basically, attaching molecules to the surface of the silicon rather than mixing them in -- essentially serves the same function as doping, but works better at the nanometer scale. "We call it silicon with afterburners," Tour said. "We're putting an even layer of molecules on the surface. 
These are not doping in the same way traditional dopants do, but they're effectively doing the same thing."<br />Tour said years of research into molecular computing with an eye toward replacing silicon has yielded little fruit. "It's hard to compete with something that has trillions of dollars and millions of person-years invested into it. So we decided it would be good to complement silicon, rather than try to supplant it."<br />He anticipates wide industry interest in the process, in which carbon molecules could be bonded with silicon either through a chemical bath or evaporation. "This is a nice entry point for molecules into the silicon industry. We can go to a manufacturer and say, 'Let us make your fabrication line work for you longer. Let us complement what you have.'<br />"This gives the Intels and the Microns and the Samsungs of the world another tool to try, and I guarantee you they'll be trying this."<br />Journal reference:<br />He et al. Controllable Molecular Modulation of Conductivity in Silicon-Based Devices. Journal of the American Chemical Society, 2009; 131 (29): 10023 DOI: </strong><a href="http://dx.doi.org/10.1021/ja9002537" rel="nofollow" target="_blank"><strong>10.1021/ja9002537</strong></a><br /><strong>Adapted from materials provided by </strong><a class="blue" href="http://www.rice.edu/" rel="nofollow" target="_blank"><strong>Rice University</strong></a><strong>. 
</strong></div></div>olosciencehttp://www.blogger.com/profile/07007258673266741468noreply@blogger.com0tag:blogger.com,1999:blog-2542563728620514982.post-33843287428669379872009-07-23T00:24:00.000-07:002009-07-23T00:26:19.163-07:00Music Is The Engine Of New Lab-on-a-chip Device<div align="center"><a href="http://www.sciencedaily.com/images/2009/07/090722120835.jpg"><img style="TEXT-ALIGN: center; MARGIN: 0px auto 10px; WIDTH: 300px; DISPLAY: block; HEIGHT: 220px; CURSOR: hand" border="0" alt="" src="http://www.sciencedaily.com/images/2009/07/090722120835.jpg" /></a><span style="color:#ffff66;"> </span><a href="http://www.sciencedaily.com/releases/2009/07/090722120835.htm"><strong><span style="color:#ffff66;">SOURCE</span></strong></a></div><div align="center"><br /></div><div align="left">ScienceDaily (July 23, 2009) — Music, rather than electromechanical valves, can drive experimental samples through a lab-on-a-chip in a new system developed at the University of Michigan. This development could significantly simplify the process of conducting experiments in microfluidic devices. </div><div align="left">A paper on the research will be published online in the Proceedings of the National Academy of Sciences the week of July 20.<br />A lab-on-a-chip, or microfluidic device, integrates multiple laboratory functions onto one chip just millimeters or centimeters in size. The devices allow researchers to experiment on tiny sample sizes, and also to simultaneously perform multiple experiments on the same material. 
There is hope that they could lead to instant home tests for illnesses, food contaminants and toxic gases, among other advances.<br />To do an experiment in a microfluidic device today, researchers often use dozens of air hoses, valves and electrical connections between the chip and a computer to move, mix and split pin-prick drops of fluid in the device's microscopic channels and divots.<br />"You quickly lose the advantage of a small microfluidic system," said Mark Burns, professor and chair of the Department of Chemical Engineering and a professor in the Department of Biomedical Engineering.<br />"You'd really like to see something the size of an iPhone that you could sneeze onto and it would tell you if you have the flu. What hasn't been developed for such a small system is the pneumatics—the mechanisms for moving chemicals and samples around on the device."<br />The U-M researchers use sound waves to drive a unique pneumatic system that does not require electromechanical valves. Instead, musical notes produce the air pressure to control droplets in the device. The U-M system requires only one "off-chip" connection.<br />"This system is a lot like fiberoptics, or cable television. Nobody's dragging 200 separate wires all over your house to power all those channels," Burns said. "There's one cable signal that gets decoded."<br />The system developed by Burns, chemical engineering doctoral student Sean Langelier, and their collaborators replaces these air hoses, valves and electrical connections with what are called resonance cavities. The resonance cavities are tubes of specific lengths that amplify particular musical notes.<br />These cavities are connected on one end to channels in the microfluidic device, and on the other end to a speaker, which is connected to a computer. The computer generates the notes, or chords. The resonance cavities amplify those notes and the sound waves push air through a hole in the resonance cavity to their assigned channel. 
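The article does not give the cavities' exact geometry, but the mapping from musical note to tube length can be sketched with the standard quarter-wave model: a tube closed at one end resonates at f = v / (4L), so each target note corresponds to a length L = v / (4f). This is a rough acoustic model, not the actual U-M cavity design:

```python
# Rough model of the resonance cavities described above: a tube closed at
# one end has fundamental frequency f = v / (4 * L), so each note of a
# chord maps to its own cavity length.  Generic physics, not U-M's design.

SPEED_OF_SOUND = 343.0  # m/s in air at about 20 C

def cavity_length_for(freq_hz):
    """Quarter-wave resonator length (meters) that amplifies a given note."""
    return SPEED_OF_SOUND / (4.0 * freq_hz)

# one cavity per droplet channel, each tuned to a different note of a chord
chord = {"A4": 440.0, "C#5": 554.37, "E5": 659.25}
lengths = {note: cavity_length_for(f) for note, f in chord.items()}
# A4 (440 Hz) calls for a tube roughly 19.5 cm long
```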
The air then nudges the droplets in the microfluidic device along.<br />"Each resonance cavity on the device is designed to amplify a specific tone and turn it into a useful pressure," Langelier said. "If I play one note, one droplet moves. If I play a three-note chord, three move, and so on. And because the cavities don't communicate with each other, I can vary the strength of the individual notes within the chords to move a given drop faster or slower."<br />Burns describes the set-up as the reverse of a bell choir. Rather than ringing a bell to create sound waves in the air, which are heard as music, this system uses music to create sound waves in the device, which in turn, move the experimental droplets.<br />"I think this is a very clever system," Burns said. "It's a way to make the connections between the microfluidic world and the real world much simpler."<br />The new system is still external to the chip, but the researchers are working to make it smaller and incorporate it on a microfluidic device. That would be a step closer to a smartphone-sized home flu test.<br />The paper is called, "Acoustically-driven programmable liquid motion using resonance cavities." Other authors are U-M chemical engineering graduate students Dustin Chang and Ramsey Zeitoun. The research is funded by the National Institutes of Health and the National Science Foundation. The University is pursuing patent protection for the intellectual property.<br />Adapted from materials provided by <a class="blue" href="http://www.umich-edu.com/" rel="nofollow" target="_blank">University of Michigan</a>. 
</div>olosciencehttp://www.blogger.com/profile/07007258673266741468noreply@blogger.com2tag:blogger.com,1999:blog-2542563728620514982.post-4776401513788924962009-07-22T08:35:00.001-07:002009-07-22T08:36:36.285-07:00Cell Phones Turned Into Fluorescent Microscopes<div align="center"><a href="http://www.sciencedaily.com/releases/2009/07/090721214625.htm"><img style="TEXT-ALIGN: center; MARGIN: 0px auto 10px; WIDTH: 300px; DISPLAY: block; HEIGHT: 208px; CURSOR: hand" border="0" alt="" src="http://www.sciencedaily.com/images/2009/07/090721214625.jpg" /> <strong><span style="color:#ffff66;">SOURCE</span></strong></a></div><div align="center"> </div><div align="left">ScienceDaily (July 22, 2009) — Researchers at the University of California, Berkeley, are proving that a camera phone can capture far more than photos of people or pets at play. They have now developed a cell phone microscope, or CellScope, that not only takes color images of malaria parasites, but of tuberculosis bacteria labeled with fluorescent markers. </div><div align="left">The prototype CellScope, described in the journal PLoS One, moves a major step forward in taking clinical microscopy out of specialized laboratories and into field settings for disease screening and diagnoses.<br />"The same regions of the world that lack access to adequate health facilities are, paradoxically, well-served by mobile phone networks," said Dan Fletcher, UC Berkeley associate professor of bioengineering and head of the research team developing the CellScope. "We can take advantage of these mobile networks to bring low-cost, easy-to-use lab equipment out to more remote settings."<br />The engineers attached compact microscope lenses to a holder fitted to a cell phone. Using samples of infected blood and sputum, the researchers were able to use the camera phone to capture bright field images of Plasmodium falciparum, the parasite that causes malaria in humans, and sickle-shaped red blood cells. 
They were also able to take fluorescent images of Mycobacterium tuberculosis, the bacterial culprit that causes TB in humans. Moreover, the researchers showed that the TB bacteria could be automatically counted using image analysis software.<br />"The images can either be analyzed on site or wirelessly transmitted to clinical centers for remote diagnosis," said David Breslauer, co-lead author of the study and a graduate student in the UC San Francisco/UC Berkeley Bioengineering Graduate Group. "The system could be used to help provide early warning of outbreaks by shortening the time needed to screen, diagnose and treat infectious diseases."<br />The engineers had previously shown that a portable microscope mounted on a mobile phone could be used for bright field microscopy, which uses simple white light - such as from a bulb or sunlight - to illuminate samples. The latest development adds to the repertoire fluorescent microscopy, in which a special dye emits a specific fluorescent wavelength to tag a target - such as a parasite, bacteria or cell - in the sample.<br />"Fluorescence microscopy requires more equipment - such as filters and special lighting - than a standard light microscope, which makes them more expensive," said Fletcher. "In this paper we've shown that the whole fluorescence system can be constructed on a cell phone using the existing camera and relatively inexpensive components."<br />The researchers used filters to block out background light and to restrict the light source, a simple light-emitting diode (LED), to the 460 nanometer wavelength necessary to excite the green fluorescent dye in the TB-infected blood. Using an off-the-shelf phone with a 3.2 megapixel camera, they were able to achieve a spatial resolution of 1.2 micrometers. 
In comparison, a human red blood cell is about 7 micrometers in diameter.<br />"LEDs are dramatically more powerful now than they were just a few years ago, and they are only getting better and cheaper," said Fletcher. "We had to disabuse ourselves of the notion that we needed to spend many thousands on a mercury arc lamp and high-sensitivity camera to get a meaningful image. We found that a high-powered LED - which retails for just a few dollars - coupled with a typical camera phone could produce a clinical quality image sufficient for our goal of detecting in a field setting some of the most common diseases in the developing world."<br />The researchers pointed out that while fluorescent microscopes include additional parts, less training is needed to interpret fluorescent images. Instead of sorting out pathogens from normal cells in the images from standard light microscopes, health workers simply need to look for something the right size and shape to light up on the screen.<br />"Viewing fluorescent images is a bit like looking at stars at night," said Breslauer. "The bright green fluorescent light stands out clearly from the dark background. It's this contrast in fluorescent imaging that allowed us to use standard computer algorithms to analyze the sample containing TB bacteria."<br />Breslauer added that these software programs can be easily installed onto a typical cell phone, turning the mobile phone into a self-contained field lab and a "good platform for epidemiological monitoring."<br />While the CellScope is particularly valuable in resource-poor countries, Fletcher noted that it may have a place in this country's health care system, famously plagued with cost overruns.<br />"A CellScope device with fluorescence could potentially be used by patients undergoing chemotherapy who need to get regular blood counts," said Fletcher. 
"The patient could transmit from home the image or analyzed data to a health care professional, reducing the number of clinic visits necessary."<br />The CellScope developers have even been approached by experts in agriculture interested in using it to help diagnose diseases in crops. Instead of sending in a leaf sample to a lab for diagnosis, farmers could upload an image of the diseased leaf for analysis.<br />The researchers are currently developing more robust prototypes of the CellScope in preparation for further field testing.<br />Other researchers on the team include Robi Maamari, a UC Berkeley research associate in bioengineering and co-lead author of the study; Neil Switz, a graduate student in UC Berkeley's Biophysics Graduate Group; and Wilbur Lam, a UC Berkeley post-doctoral fellow in bioengineering and a UCSF pediatric hematologist.<br />Funding for the CellScope project comes from the Center for Information Technology Research in the Interest of Society (CITRIS) and the Blum Center for Developing Economies, both at UC Berkeley, and from Microsoft Research, Intel and the Vodafone Americas Foundation.<br />Journal reference:<br />David N. Breslauer et al. Mobile Phone Based Clinical Microscopy for Global Health Applications. 
PLoS One, July 22, 2009 DOI: <a href="http://dx.doi.org/10.1371/journal.pone.0006320" rel="nofollow" target="_blank">10.1371/journal.pone.0006320</a><br />Adapted from materials provided by <a class="blue" href="http://www.berkeley.edu/" rel="nofollow" target="_blank">University of California - Berkeley</a>.<br /></div><div align="center"></div>olosciencehttp://www.blogger.com/profile/07007258673266741468noreply@blogger.com1tag:blogger.com,1999:blog-2542563728620514982.post-15797881916269804862009-07-22T08:17:00.001-07:002009-07-22T08:19:06.719-07:00Electronic Nose Created To Detect Skin Vapors<div align="center"><a href="http://www.sciencedaily.com/images/2009/07/090721091839.jpg"><img style="TEXT-ALIGN: center; MARGIN: 0px auto 10px; WIDTH: 300px; DISPLAY: block; HEIGHT: 225px; CURSOR: hand" border="0" alt="" src="http://www.sciencedaily.com/images/2009/07/090721091839.jpg" /></a> <a href="http://www.sciencedaily.com/releases/2009/07/090721091839.htm"><strong><span style="color:#ffff66;">SOURCE</span></strong></a></div><div align="center"><br /></div><div align="left">ScienceDaily (July 21, 2009) — A team of researchers from Yale University (United States) and a Spanish company have developed a system to detect the vapours emitted by human skin in real time. The scientists think that these substances, essentially made up of fatty acids, are what attract mosquitoes and enable dogs to identify their owners.</div><div align="left">"The spectrum of the vapours emitted by human skin is dominated by fatty acids. 
These substances are not very volatile, but we have developed an 'electronic nose' able to detect them," says Juan Fernández de la Mora, of the Department of Mechanical Engineering at Yale University (United States) and co-author of a study recently published in the Journal of the American Society for Mass Spectrometry.<br />The system, created at the Boecillo Technology Park in Valladolid, works by ionising the vapours with an electrospray (a cloud of electrically-charged drops), and later analysing these using mass spectrometry. This technique can be used to identify many of the vapour compounds emitted by a hand, for example.<br />"The great novelty of this study is that, despite the almost non-existent volatility of fatty acids, which have chains of up to 18 carbon atoms, the electronic nose is so sensitive that it can detect them instantaneously," says Fernández de la Mora. The results show that the volatile compounds given off by the skin are primarily fatty acids, although there are also others such as lactic acid and pyruvic acid.<br />The researcher stresses that the great chemical wealth of fatty acids, made up of hundreds of different molecules, "is well known, and seems to prove the hypothesis that these are the key substances that enable dogs to identify people". The enormous range of vapours emitted by human skin and breath may not only enable dogs to recognise their owners, but also help mosquitoes to locate their hosts, according to several studies.<br />World record for detecting explosives<br />Aside from identifying people from their skin vapours, another of the important applications of the new system is that it is able to detect tiny amounts of explosives. The system can "smell" levels below a few parts per trillion, and has been able to set a world sensitivity record at "2×10<sup>-14</sup> atmospheres of partial pressure of TNT (the explosive trinitrotoluene)".<br />The "father" of electrospray ionisation for mass spectrometry is Professor John B. 
Fenn, who is currently a researcher at Virginia Commonwealth University (United States) and won the 2002 Nobel Prize in Chemistry for using this technique in the analysis of proteins.<br />Journal references:<br />Pablo Martínez Lozano and Juan Fernández de la Mora. Online Detection of Human Skin Vapors. Journal of the American Society for Mass Spectrometry, 2009; 20 (6): 1060-1063<br />Martínez-Lozano et al. Secondary Electrospray Ionization (SESI) of Ambient Vapors for Explosive Detection at Concentrations Below Parts Per Trillion. Journal of the American Society for Mass Spectrometry, 2009; 20 (2): 287. DOI: <a href="http://dx.doi.org/10.1016/j.jasms.2008.10.006" rel="nofollow" target="_blank">10.1016/j.jasms.2008.10.006</a><br />Adapted from materials provided by <a class="blue" href="http://www.fecyt.es/fecyt/home.do" rel="nofollow" target="_blank">FECYT - Spanish Foundation for Science and Technology</a>, via <a href="http://www.eurekalert.org/" rel="nofollow" target="_blank">EurekAlert!</a>, a service of AAAS. </div>olosciencehttp://www.blogger.com/profile/07007258673266741468noreply@blogger.com1tag:blogger.com,1999:blog-2542563728620514982.post-45546025996042579132009-07-17T07:49:00.001-07:002009-07-17T07:51:05.719-07:00Human-like Vision Lets Robots Navigate Naturally<div align="center"><a href="http://www.sciencedaily.com/releases/2009/06/090630075616.htm"><img style="TEXT-ALIGN: center; MARGIN: 0px auto 10px; WIDTH: 300px; DISPLAY: block; HEIGHT: 210px; CURSOR: hand" border="0" alt="" src="http://www.sciencedaily.com/images/2009/06/090630075616.jpg" /><strong> <span style="color:#ffff66;">SOURCE</span></strong></a></div><div align="center"> </div><div align="left">ScienceDaily (July 17, 2009) — A robotic vision system that mimics key visual functions of the human brain promises to let robots manoeuvre quickly and safely through cluttered environments, and to help guide the visually impaired. 
</div><div align="left">It’s something any toddler can do – cross a cluttered room to find a toy.<br />It's also one of those seemingly trivial skills that have proved to be extremely hard for computers to master. Analysing shifting and often-ambiguous visual data to detect objects and separate their movement from one’s own has turned out to be an intensely challenging artificial intelligence problem.<br />Three years ago, researchers at the European-funded research consortium Decisions in Motion (<a href="http://www.decisionsinmotion.org/" rel="nofollow" target="_blank">http://www.decisionsinmotion.org/</a>) decided to look to nature for insights into this challenge.<br />In a rare collaboration, neuro- and cognitive scientists studied how the visual systems of advanced mammals, primates and people work, while computer scientists and roboticists incorporated their findings into neural networks and mobile robots.<br />The approach paid off. Decisions in Motion has already built and demonstrated a robot that can zip across a crowded room guided only by what it “sees” through its twin video cameras, and is hard at work on a head-mounted system to help visually impaired people get around.<br />“Until now, the algorithms that have been used are quite slow and their decisions are not reliable enough to be useful,” says project coordinator Mark Greenlee. 
“Our approach allowed us to build algorithms that can do this on the fly, that can make all these decisions within a few milliseconds using conventional hardware.”<br />How do we see movement?<br />The Decisions in Motion researchers used a wide variety of techniques to learn more about how the brain processes visual information, especially information about movement.<br />These included recording individual neurons and groups of neurons firing in response to movement signals, functional magnetic resonance imaging to track the moment-by-moment interactions between different brain areas as people performed visual tasks, and neuropsychological studies of people with visual processing problems.<br />The researchers hoped to learn more about how the visual system scans the environment, detects objects, discerns movement, distinguishes between the independent movement of objects and the organism’s own movements, and plans and controls motion towards a goal.<br />One of their most interesting discoveries was that the primate brain does not just detect and track a moving object; it actually predicts where the object will go.<br />“When an object moves through a scene, you get a wave of activity as the brain anticipates its trajectory,” says Greenlee. “It’s like feedback signals flowing from the higher areas in the visual cortex back to neurons in the primary visual cortex to give them a sense of what’s coming.”<br />Greenlee compares what an individual visual neuron sees to looking at the world through a peephole. Researchers have known for a long time that high-level processing is needed to build a coherent picture out of a myriad of those tiny glimpses. What's new is the importance of strong anticipatory feedback for perceiving and processing motion.<br />“This proved to be quite critical for the Decisions in Motion project,” Greenlee says. 
“It solves what is called the ‘aperture problem’, the problem of the neurons in the primary visual cortex looking through those little peepholes.”<br />Building a better robotic brain<br />Armed with a better understanding of how the human brain deals with movement, the project’s computer scientists and roboticists went to work. Using off-the-shelf hardware, they built a neural network with three levels mimicking the brain’s primary, mid-level, and higher-level visual subsystems.<br />They used what they had learned about the flow of information between brain regions to control the flow of information within the robotic “brain”.<br />“It’s basically a neural network with certain biological characteristics,” says Greenlee. “The connectivity is dictated by the numbers we have from our physiological studies.”<br />The computerised brain controls the behaviour of a wheeled robotic platform supporting a moveable head and eyes, in real time. It directs the head and eyes where to look, tracks its own movement, identifies objects, determines if they are moving independently, and directs the platform to speed up, slow down and turn left or right.<br />Greenlee and his colleagues were intrigued when the robot found its way to its first target – a teddy bear – just like a person would, speeding by objects that were at a safe distance, but passing nearby obstacles at a slower pace.<br />“That was very exciting,” Greenlee says. “We didn’t program it in – it popped out of the algorithm.”<br />In addition to improved guidance systems for robots, the consortium envisions a lightweight system that could be worn like eyeglasses by visually or cognitively impaired people to boost their mobility. One of the consortium partners, Cambridge Research Systems, is developing a commercial version of this, called VisGuide.<br />Decisions in Motion received funding from the ICT strand of the EU’s Sixth Framework Programme for research. 
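<br />The three-level architecture described above, with feed-forward sweeps plus the anticipatory top-down feedback that solves the aperture problem, can be sketched in miniature. All layer sizes, weights, and the update rule below are illustrative assumptions for the sake of a runnable example, not the project's actual network:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes for primary, mid-level and higher-level visual
# stages (assumptions, not the project's real dimensions).
N_V1, N_MT, N_HI = 64, 32, 16

# Feed-forward weights going up the hierarchy, feedback weights coming down.
W_ff1 = rng.normal(0, 0.1, (N_MT, N_V1))
W_ff2 = rng.normal(0, 0.1, (N_HI, N_MT))
W_fb2 = rng.normal(0, 0.1, (N_MT, N_HI))
W_fb1 = rng.normal(0, 0.1, (N_V1, N_MT))

def step(stimulus, v1, mt, hi, alpha=0.5):
    """One update: feed-forward sweep plus top-down feedback.

    The feedback terms let higher stages bias lower ones toward the
    anticipated trajectory -- the "wave of activity" Greenlee describes.
    """
    v1 = np.tanh(stimulus + alpha * (W_fb1 @ mt))
    mt = np.tanh(W_ff1 @ v1 + alpha * (W_fb2 @ hi))
    hi = np.tanh(W_ff2 @ mt)
    return v1, mt, hi

v1, mt, hi = np.zeros(N_V1), np.zeros(N_MT), np.zeros(N_HI)
for t in range(10):
    # A bright spot drifting across the "retina" as a toy moving stimulus.
    stimulus = np.zeros(N_V1)
    stimulus[t % N_V1] = 1.0
    v1, mt, hi = step(stimulus, v1, mt, hi)

print(v1.shape, mt.shape, hi.shape)
```

Because each low-level unit only sees its own "peephole" of input, the feedback connections are what give it context about the global motion, which is the design point the project took from physiology.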
The project’s work was featured in a video by the New Scientist in February this year.<br />Adapted from materials provided by <a class="blue" href="http://cordis.europa.eu/ictresults" rel="nofollow" target="_blank">ICT Results</a>. </div>olosciencehttp://www.blogger.com/profile/07007258673266741468noreply@blogger.com0tag:blogger.com,1999:blog-2542563728620514982.post-22342187502455633722009-07-17T01:29:00.001-07:002009-07-17T01:29:52.209-07:00Solar Power: New SunCatcher Power System Ready For Commercial Production In 2010<div align="center"><a href="http://www.sciencedaily.com/releases/2009/07/090709205950.htm"><img style="TEXT-ALIGN: center; MARGIN: 0px auto 10px; WIDTH: 300px; DISPLAY: block; HEIGHT: 203px; CURSOR: hand" border="0" alt="" src="http://www.sciencedaily.com/images/2009/07/090709205950.jpg" /><strong><span style="color:#ffff66;"> SOURCE</span></strong></a></div><div align="center"><br /></div><div align="left">ScienceDaily (July 17, 2009) — Stirling Energy Systems (SES) and Tessera Solar recently unveiled four newly designed solar power collection dishes at Sandia National Laboratories’ National Solar Thermal Test Facility (NSTTF). Called SunCatchers™, the new dishes have a refined design that will be used in commercial-scale deployments of the units beginning in 2010. </div><div align="left">“The four new dishes are the next-generation model of the original SunCatcher system. Six first-generation SunCatchers built over the past several years at the NSTTF have been producing up to 150 kW [kilowatts] of grid-ready electrical power during the day,” says Chuck Andraka, the lead Sandia project engineer. 
“Every part of the new system has been upgraded to allow for a high rate of production and cost reduction.”<br />Sandia’s concentrating solar-thermal power (CSP) team has been working closely with SES over the past five years to improve the system design and operation.<br />The modular CSP SunCatcher uses precision mirrors attached to a parabolic dish to focus the sun’s rays onto a receiver, which transmits the heat to a Stirling engine. The engine is a sealed system filled with hydrogen. As the gas heats and cools, its pressure rises and falls. The change in pressure drives the piston inside the engine, producing mechanical power, which in turn drives a generator and makes electricity.<br />The new SunCatcher is about 5,000 pounds lighter than the original, is round instead of rectangular to allow for more efficient use of steel, has improved optics, and has 60 percent fewer engine parts. The revised design also has fewer mirrors — 40 instead of 80. The reflective mirrors are formed into a parabolic shape from stamped sheet metal, using automobile manufacturing techniques much like those used for car hoods. The improvements will result in high-volume production, cost reductions, and easier maintenance.<br />Among Sandia’s contributions to the new design was a tool that determines how well the mirrors work in less than 10 seconds, a measurement that took one hour with the earlier design.<br />“The new design of the SunCatcher represents more than a decade of innovative engineering and validation testing, making it ready for commercialization,” says Steve Cowman, Stirling Energy Systems CEO. “By utilizing the automotive supply chain to manufacture the SunCatcher, we’re leveraging the talents of an industry that has refined high-volume production through an assembly line process. 
More than 90 percent of the SunCatcher components will be manufactured in North America.”<br />In addition to improved manufacturability and easy maintenance, the new SunCatcher minimizes both cost and land use and has numerous environmental advantages, Andraka says.<br />“They have the lowest water use of any thermal electric generating technology, require minimal grading and trenching, require no excavation for foundations, and will not produce greenhouse gas emissions while converting sunlight into electricity,” he says.<br />Tessera Solar, the developer and operator of large-scale solar projects using the SunCatcher technology and sister company of SES, is building a 60-unit plant generating 1.5 MW (megawatts) by the end of the year in either Arizona or California. One megawatt powers about 800 homes. The proprietary solar dish technology will then be deployed to develop two of the world’s largest solar generating plants in Southern California with San Diego Gas & Electric in the Imperial Valley and Southern California Edison in the Mojave Desert, in addition to the recently announced project with CPS Energy in West Texas. The projects are expected to produce 1,000 MW by the end of 2012.<br />Last year one of the original SunCatchers set a new solar-to-grid system conversion efficiency record by achieving a 31.25 percent net efficiency rate, toppling the old 1984 record of 29.4 percent.<br />Adapted from materials provided by <a class="blue" href="http://www.sandia.gov/" rel="nofollow" target="_blank">Sandia National Laboratories</a>. 
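<br />The plant figures quoted in this article can be cross-checked with quick arithmetic. The irradiance and dish aperture values below are rough illustrative assumptions, not Sandia's published numbers; only the 1.5 MW, 60-unit, and 800-homes-per-megawatt figures come from the article:

```python
# Figures from the article.
PLANT_MW = 1.5          # planned 60-unit plant output
UNITS = 60
HOMES_PER_MW = 800

kw_per_dish = PLANT_MW * 1000 / UNITS   # per-SunCatcher electrical output
homes = PLANT_MW * HOMES_PER_MW         # households served by the plant

# Solar-to-grid efficiency = grid power out / sunlight power on the dish.
# Both values below are assumptions chosen only to illustrate the ratio.
DNI_W_PER_M2 = 1000.0   # assumed strong-sun direct normal irradiance
APERTURE_M2 = 87.7      # assumed effective mirror aperture area
efficiency = kw_per_dish * 1000 / (DNI_W_PER_M2 * APERTURE_M2)

print(f"{kw_per_dish:.0f} kW per dish, {homes:.0f} homes, "
      f"{efficiency:.1%} solar-to-grid")
```

Under these assumed conditions each dish delivers 25 kW and the ratio lands in the high-20s percent range, consistent in magnitude with the 31.25 percent record mentioned above.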
</div>olosciencehttp://www.blogger.com/profile/07007258673266741468noreply@blogger.com0tag:blogger.com,1999:blog-2542563728620514982.post-8513999298538561252009-07-08T23:39:00.001-07:002009-07-08T23:41:06.769-07:00Robot Learns To Smile And Frown<div align="center"><a href="http://www.sciencedaily.com/images/2009/07/090708181206.jpg"><img style="TEXT-ALIGN: center; MARGIN: 0px auto 10px; WIDTH: 300px; DISPLAY: block; HEIGHT: 390px; CURSOR: hand" border="0" alt="" src="http://www.sciencedaily.com/images/2009/07/090708181206.jpg" /></a> <a href="http://www.sciencedaily.com/releases/2009/07/090708181206.htm"><strong><span style="color:#ffff66;">SOURCE</span></strong></a></div><div align="center"><br /></div><div align="left">ScienceDaily (July 8, 2009) — A hyper-realistic Einstein robot at the University of California, San Diego has learned to smile and make facial expressions through a process of self-guided learning. The UC San Diego researchers used machine learning to “empower” their robot to learn to make realistic facial expressions. </div><div align="left">“As far as we know, no other research group has used machine learning to teach a robot to make realistic facial expressions,” said Tingfan Wu, the computer science Ph.D. student from the UC San Diego Jacobs School of Engineering who presented this advance on June 6 at the IEEE International Conference on Development and Learning.<br />The faces of robots are increasingly realistic and the number of artificial muscles that control them is rising. In light of this trend, UC San Diego researchers from the Machine Perception Laboratory are studying the face and head of their robotic Einstein in order to find ways to automate the process of teaching robots to make lifelike facial expressions.<br />This Einstein robot head has about 30 facial muscles, each moved by a tiny servo motor connected to the muscle by a string. 
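<br />One way to picture how machine learning can link those 30 servos to observed expressions is a small simulation: issue random exploratory commands, record what an expression-recognition system reports, and fit a mapping. The feature count, the linear model, and the simulated face below are all illustrative assumptions (the real coupled muscle-and-skin mechanics are far more complex, as the researchers themselves note):

```python
import numpy as np

rng = np.random.default_rng(1)

N_SERVOS, N_FEATURES = 30, 12   # ~30 muscles per the article; 12 is assumed

# A hidden "ground truth" linking servo positions to expression features,
# standing in for the robot's real face mechanics in this toy example.
true_map = rng.normal(0, 1, (N_FEATURES, N_SERVOS))

def observe(servos):
    """Simulated readout from expression-recognition software (CERT in
    the article) for a given set of servo positions, plus sensor noise."""
    return true_map @ servos + rng.normal(0, 0.01, N_FEATURES)

# Exploratory "babbling": random servo commands and the expressions seen.
babble = rng.uniform(-1, 1, (500, N_SERVOS))
features = np.array([observe(s) for s in babble])

# Learn the servo -> expression mapping by least squares.
solution, *_ = np.linalg.lstsq(babble, features, rcond=None)
learned_map = solution.T        # shape (N_FEATURES, N_SERVOS)

# Invert the learned mapping to find servo commands that should produce
# a target expression the robot has never been shown.
target = rng.normal(0, 1, N_FEATURES)
servos, *_ = np.linalg.lstsq(learned_map, target, rcond=None)
print(np.allclose(learned_map @ servos, target, atol=0.1))  # True
```

The inversion step is what lets a learned forward model generalise to unseen expressions; it also hints at the fault tolerance reported later in the article, since dropping one servo leaves many other command combinations that reach the same feature vector.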
Today, a highly trained person must manually set up these kinds of realistic robots so that the servos pull in the right combinations to make specific facial expressions. In order to begin to automate this process, the UCSD researchers looked to both developmental psychology and machine learning.<br />Developmental psychologists speculate that infants learn to control their bodies through systematic exploratory movements, including babbling to learn to speak. Initially, these movements appear to be executed in a random manner as infants learn to control their bodies and reach for objects.<br />“We applied this same idea to the problem of a robot learning to make realistic facial expressions,” said Javier Movellan, the senior author on the paper presented at ICDL 2009 and the director of UCSD’s Machine Perception Laboratory, housed in Calit2, the California Institute for Telecommunications and Information Technology.<br />Although their preliminary results are promising, the researchers note that some of the learned facial expressions are still awkward. One potential explanation is that their model may be too simple to describe the coupled interactions between facial muscles and skin.<br />To begin the learning process, the UC San Diego researchers directed the Einstein robot head (Hanson Robotics’ Einstein Head) to twist and turn its face in all directions, a process called “body babbling.” During this period the robot could see itself in a mirror and analyze its own expression using facial expression detection software created at UC San Diego called CERT (Computer Expression Recognition Toolbox). 
This provided the data necessary for machine learning algorithms to learn a mapping between facial expressions and the movements of the muscle motors.<br />Once the robot learned the relationship between facial expressions and the muscle movements required to make them, it could make facial expressions it had never encountered.<br />For example, the robot learned eyebrow narrowing, which requires the inner eyebrows to move together and the upper eyelids to close a bit to narrow the eye aperture.<br />“During the experiment, one of the servos burned out due to misconfiguration. We therefore ran the experiment without that servo. We discovered that the model learned to automatically compensate for the missing servo by activating a combination of nearby servos,” the authors wrote in the paper presented at the 2009 IEEE International Conference on Development and Learning.<br />“Currently, we are working on a more accurate facial expression generation model as well as a systematic way to explore the model space efficiently,” said Wu, the computer science PhD student. Wu also noted that the “body babbling” approach he and his colleagues described in their paper may not be the most efficient way to explore the model of the face.<br />While the primary goal of this work was to solve the engineering problem of how to approximate the appearance of human facial muscle movements with motors, the researchers say this kind of work could also lead to insights into how humans learn and develop facial expressions.<br />“<a href="http://mplab.ucsd.edu/wp-content/uploads/wu_icdl20091.pdf" rel="nofollow" target="_blank">Learning to Make Facial Expressions</a>,” by Tingfan Wu, Nicholas J. Butko, Paul Ruvulo, Marian S. Bartlett, Javier R. Movellan from Machine Perception Laboratory, University of California San Diego. 
Presented on June 6 at the 2009 IEEE 8th International Conference On Development And Learning.<br />Adapted from materials provided by <a class="blue" href="http://www.ucsd.edu/" rel="nofollow" target="_blank">University of California - San Diego</a>. </div>olosciencehttp://www.blogger.com/profile/07007258673266741468noreply@blogger.com0