About us:
Our company name is KNK DECORATION. We specialize in sticker services, for example: sticker cutting, digital printing, screen printing, neon boxes, sign making and other promotional materials.
We handle cutting sticker work, indoor/outdoor printing, neon boxes, the manufacture of traffic signs (DLLAJR standard), and more.

Monday, April 28, 2008

Beating The Codebreakers With Quantum Cryptography

ScienceDaily (Apr. 28, 2008) — Quantum cryptography may be essentially solved, but getting the funky physics to work on disciplined computer networks is a whole new headache.

Cryptography is an arms race, but the finish line may be fast approaching. Up to now, each time the codemakers made a better mousetrap, the codebreakers bred a better mouse. But quantum cryptography theoretically could outpace the codebreakers and win the race. Forever.

Already, today's state-of-the-art classical encryption, such as RSA, can in principle be cracked with enough raw, brute-force computing power of the kind available to organisations like the US National Security Agency. And the advent of quantum computing will make it even simpler. The gold standard for secret communication will be truly dead.

Quantum cryptography promises to solve the problem, and it overcomes the remaining stumbling block, distributing the secret key to the right person, by using quantum key distribution (QKD).

Modern cryptography relies on the use of digital ‘keys’ to encrypt data before sending it over a network, and to decrypt it at the other end. The receiver must have a version of the key code used by the sender so as to be able to decrypt and access the data.

QKD offers a theoretically uncrackable code, one that is easily distributed and works in a transparent manner. Even better, the nature of quantum mechanics means that if any eavesdropper – called Eve in the argot of cryptographers – tries to snoop on a message, the sender and receiver will both know.

That ability is due to the use of the Heisenberg Uncertainty Principle, which sits at the heart of quantum mechanics. The principle rests on the theory that the act of measuring a quantum state changes that state. It is like children with a guilty secret. As soon as you look at them their faces morph plausibly into ‘Who, me?’

The practical upshot for cryptography is that the sender and receiver can verify the security of the transmission. They will know if the state of the quanta has changed, whether the key has been read en route. If so, they can abandon the key they are using and generate a new one.
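
That detection step can be illustrated with a toy simulation of BB84, the best-known QKD protocol. The sketch below is purely illustrative (the function and its parameters are our own, not the SECOQC implementation): when an eavesdropper measures the photons in randomly guessed bases, roughly a quarter of the sifted key bits no longer match, and that mismatch is what tells the sender and receiver the key was read en route.

    import random

    def bb84_error_rate(n=20000, eavesdrop=False):
        """Toy BB84 run: the error rate Alice and Bob observe on their sifted key."""
        sifted = errors = 0
        for _ in range(n):
            bit = random.randint(0, 1)           # Alice's raw key bit
            a_basis = random.randint(0, 1)       # Alice's preparation basis
            b_basis = random.randint(0, 1)       # Bob's measurement basis

            photon_bit, photon_basis = bit, a_basis
            if eavesdrop:
                e_basis = random.randint(0, 1)   # Eve guesses a basis to measure in
                if e_basis != photon_basis:      # a wrong guess disturbs the state:
                    photon_bit = random.randint(0, 1)
                photon_basis = e_basis           # the photon is re-sent in Eve's basis

            # Bob reads the bit correctly if his basis matches the photon's basis;
            # otherwise his outcome is random.
            received = photon_bit if b_basis == photon_basis else random.randint(0, 1)

            # Sifting: Alice and Bob keep only positions where their bases agree.
            if a_basis == b_basis:
                sifted += 1
                errors += received != bit

        return errors / sifted

    print(f"no eavesdropper:   {bb84_error_rate():.1%}")                 # ~0%
    print(f"with eavesdropper: {bb84_error_rate(eavesdrop=True):.1%}")   # ~25%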

QKD made its real-world debut in the canton of Geneva for use in the electronic voting system used in the Swiss general election last year. The system guaranteed that the poll was secure. But, more importantly perhaps, it also ensured that no vote was lost in transmission, because the uncertainty principle established that there was no change to the transmitted data.

The end of the beginning

The canton election was a demonstration of the work done by researchers for the SECOQC project, an EU-funded effort to develop an international network for secure communication based on QKD.

The test of the technology demonstrated that QKD worked for point-to-point communications between two parties. But the demonstration was just the beginning of SECOQC’s overall goal.

“We want to establish network-wide quantum encryption, because it will mean it works over much longer distances,” explains Christian Monyk, co-ordinator of the SECOQC project and head of the quantum-technologies unit at the Austrian Research Centres. “Network quantum encryption and QKD mean that many parties can communicate securely, not just two. Finally, it also means quantum encryption could be deployed on a very large scale, for the insurance and banking sectors, for example.”

Moving the system from point-to-point communications to a network is an order of magnitude more difficult.

“The quantum science for cryptography and key distribution is essentially solved, and it is a great result,” Monyk says. “But getting that system to work across a network is much more difficult. You have to deal with different protocols and network architectures, develop new nodes and new interfaces with the quantum devices to get it to a large-scale, long distance, real-world application.”

Working at a distance

Getting the system to work over long distances is also a challenge because QKD requires hi-fidelity data transmission over high-quality physical networks like non-zero dispersion shifted fibre optics.

“It was not one big problem, it was many, many small computing science and engineering problems,” says Monyk. “We had to work with a large number of technologies. And we have to certify it to experts.”

But SECOQC’s researchers believe they have solved the network issue. The researchers are currently putting the final touches to a demonstration of the technology to be held this October in Vienna, Austria. Industry has shown great interest in the technology. Still, the technology is not quite ready for prime time.

“From a technical point of view, the technology will be ready in one or two years,” says Monyk.

And that means that the race will be won, finally, by the codemakers.

Adapted from materials provided by ICT Results.

Aeronautics Engineers Design Silent, Eco-friendly Plane

ScienceDaily (Nov. 7, 2006) — MIT and Cambridge University researchers unveiled the conceptual design for a silent, environmentally friendly passenger plane at a press conference Monday, Nov. 6, at the Royal Aeronautical Society in London.

"Public concern about noise is a major constraint on expansion of aircraft operations. The 'silent aircraft' can help address this concern and thus aid in meeting the increasing passenger demand for air transport," said Edward M. Greitzer, the H.N. Slater Professor of Aeronautics and Astronautics at MIT.

Greitzer and Professor Ann P. Dowling of Cambridge University are the lead principal investigators on the Silent Aircraft Initiative. This collaboration of 40 researchers from MIT and Cambridge, plus many others from more than 30 companies, was launched three years ago "to develop a conceptual design for an aircraft whose noise was almost imperceptible outside the perimeter of an airfield in an urban environment."

While originally conceived to make a huge reduction in airplane noise, the team's ultimate design also has the potential to be more fuel-efficient. In a typical flight, the proposed plane, which is designed to carry 215 passengers, is predicted to achieve 124 passenger-miles per gallon, almost 25 percent more than current aircraft, according to Greitzer. (For a down-to-earth comparison, the Toyota Prius hybrid car carrying two passengers achieves 120 passenger-miles per gallon.)
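
Passenger-miles per gallon is simply fuel economy multiplied by the number of passengers aboard. A quick sketch of the comparison above (the Prius's roughly 60 mpg figure is inferred from the numbers quoted in the article, not stated in it):

    def passenger_mpg(vehicle_mpg, passengers):
        """Fuel economy per occupied seat: vehicle miles-per-gallon times passengers carried."""
        return vehicle_mpg * passengers

    print(passenger_mpg(60, 2))   # Prius with two aboard -> 120 passenger-miles per gallon
    print(124 / 215)              # 124 passenger-mpg with 215 seats filled -> ~0.58 mpg for the whole aircraft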

The project aims to have such an aircraft ready to fly by 2030.

The conceptual design addresses both the engines and the structure, or airframe, of a plane. Half of the noise from a landing plane comes from the airframe.

Other key features of the design include:

* An overall shape that integrates body and wings into a "single" flying wing. As a result, both the body and wings provide lift, allowing a slower approach and takeoff, which would reduce noise. The shape also improves fuel efficiency.
* The elimination of the flaps, or hinged rear sections on each wing. These are a major source of airframe noise when a plane is taking off and landing.
* Engines embedded in the aircraft with air intakes on top of the plane rather than underneath each wing. This screens much of the noise from the ground.
* A variable-size jet nozzle that allows slower jet propulsion during takeoff and landing but efficient cruising at higher speeds.

What will it take to turn the design into a plane by 2030?

"One major technical challenge is the integration of the propulsion system with the aircraft," Greitzer said. "The propulsion system, with engines embedded in the fuselage, is different than for traditional civil aircraft, in which the engines are located in nacelles below the wing. This presents a different set of issues to the designer."

Zoltan S. Spakovszky, C. S. Draper Associate Professor in MIT's Department of Aeronautics and Astronautics, also cited the integration of the propulsion system as a key challenge. Spakovszky and James I. Hileman, a research engineer in the department, are the chief engineers, or day-to-day managers, for the project.

He explained that in today's airplanes, with engines hanging below the wings, air flows unimpeded into the engine. In the new design, however, air traveling into the air intakes on top of the plane will behave differently. This is because the air particles flowing close to the plane's body experience friction. As a result, "the particles flow at a lower velocity near the surface of the plane than in the free (air) stream," Spakovszky said. The new engine must be designed to operate in these strongly nonuniform airflows.

A second important technical challenge involves the craft's unconventional airframe, Spakovszky said. "The structural integrity of a pressure vessel allowing this single wing-like shape needs to be ensured and poses a major challenge."

Greitzer emphasized that the collaboration between MIT, Cambridge University and their industrial partners was key to the end result.

"Collaboration and teaming occurred in essentially all aspects of the project. The Silent Aircraft Initiative has been very much an enterprise in which the whole is greater than the sum of the separate parts," he said.

Spakovszky referred to the overall team effort as the best part of the project. "Technical expectations were taken for granted, but working well across the Atlantic was not a given," he said. "It was a very, very neat experience."

The Silent Aircraft Initiative is funded by the Cambridge-MIT Institute, which has supported a wide range of research and educational collaborations between the two universities. The Knowledge Integration Community (or KIC) that created the conceptual design included academic staff and students from both institutions and participants from a wide range of industrial collaborators including Boeing and Rolls Royce.

Adapted from materials provided by Massachusetts Institute Of Technology.

Next-generation Sensors May Help Avert Airline Disasters

ScienceDaily (Mar. 28, 2008) — Everyone on board a Scandinavian Airlines System (SAS) plane died when it collided with a light aircraft and exploded in a baggage hangar in Milan in 2001. The smaller plane had taxied along the wrong route and ended up on the runway where the SAS aircraft was taking off.

The following year, two planes collided in mid-air over Überlingen in the south of Germany on the edge of Lake Constance. One was a Russian passenger flight from Moscow to Barcelona, while the other was a cargo plane heading for Belgium from the Persian Gulf. Seventy-one persons died.

Safety needs to be improved

As air transport grows, take-offs become more tightly spaced and more and more planes circle airports waiting for permission to land, so the potential for disasters increases.

Last year alone, international air traffic grew by 5.9 percent. In parallel with this increase, the minimum distance between aircraft in European airspace has decreased. The minimum vertical separation has been halved from 600 metres to 300 metres for planes flying above 29,000 feet. The idea has been to increase airspace capacity by 20 percent. Routes are shortened, and airlines expect to save the huge sum of NOK 30 billion a year in fuel costs alone.

But what about safety up there? In the wake of a number of air disasters in 2001 and 2002, the EU took up the problem and resolved that certain aspects of the industry should be studied in detail and evaluated in terms of safety. Several projects were launched under its 6th Framework Programme. One of these was the HASTEC project, which was to develop the next generation of pressure sensors for better aircraft altitude measurement.
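
Altitude is inferred from static air pressure through a standard-atmosphere model, which is why the accuracy and long-term stability of the pressure sensor translate directly into altitude accuracy. Below is a rough sketch of the usual troposphere conversion using International Standard Atmosphere constants; it is illustrative only and not HASTEC's actual calibration:

    def pressure_to_altitude_m(p_pa, p0_pa=101325.0):
        """Convert static pressure to altitude (ISA troposphere model, valid below ~11 km)."""
        # h = (T0 / L) * (1 - (p / p0) ** (L * R / (g * M))), with standard-atmosphere constants
        T0, L = 288.15, 0.0065      # sea-level temperature [K] and lapse rate [K/m]
        exponent = 0.190263         # L * R / (g * M) for dry air
        return (T0 / L) * (1.0 - (p_pa / p0_pa) ** exponent)

    print(pressure_to_altitude_m(31400))          # ~8,857 m, close to the 29,000 ft (8,839 m) boundary
    print(pressure_to_altitude_m(31400 * 1.001))  # a 0.1% pressure error shifts the reading by roughly 7 m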

Next-generation sensors

“There is a need for aircraft sensors that are more accurate than current models, which are large and reliable, but expensive systems,” says Sigurd Moe of SINTEF ICT. “Among other things, they need to be more stable throughout their life-cycle. The problem with current sensors is that they need to be checked and calibrated regularly, and this is an expensive process since the aircraft needs to be grounded.”

Challenges

The problem is that mechanical stresses can develop in the connection to the sensor package itself. The scientists therefore had to produce a silicon-based sensor structure in which such stresses would not propagate into the chip itself. The solution was a spiral silicon element whose pressure-sensitive part is unaffected even if the mounting stretches and drags the element.

SINTEF produces silicon wafers with hundreds of chips on each; several wafers are stacked on top of each other and glued together before being sawn into individual chips. Selected chips are then integrated into a sensor package developed by Memscap. The company produces, assembles and tests the sensor package itself.

The first prototype has now been delivered to Memscap by the scientists for further testing and mounting. During the first six months of 2008 these new-technology sensors will be flight tested.

Adapted from materials provided by SINTEF, via AlphaGalileo.

Next Step In Robot Development Is Child's Play

ScienceDaily (Apr. 26, 2008) — Teaching robots to understand enough about the real world to allow them to act independently has proved to be much more difficult than first thought.

The team behind the iCub robot believes it, like children, will learn best from its own experiences.

The technologies developed on the iCub platform – such as grasping, locomotion, interaction, and even language-action association – are of great relevance to further advances in the field of industrial service robotics.

The EU-funded RobotCub project, which designed the iCub, will send one robot to each of six European research labs. Each of the labs proposed a winning project to help train the robots to learn about their surroundings – just as a child would.

The six projects include one from Imperial College London that will explore how ‘mirror neurons’ found in the human brain can be translated into a digital application. ‘Mirror neurons’, discovered in the early 1990s, trigger memories of previous experiences when humans are trying to understand the physical actions of others. A separate team at UPF Barcelona will also work on iCub’s ‘cognitive architecture’.

At the same time, a team headquartered at UPMC in Paris will explore the dynamics needed to achieve full body control for iCub. Meanwhile, researchers at TUM Munich will work on the development of iCub’s manipulation skills. A project team from the University of Lyons will explore internal simulation techniques – something our brains do when planning actions or trying to understand the actions of others.

Over in Turkey, a team based at METU in Ankara will focus almost exclusively on language acquisition and the iCub’s ability to link objects with verbal utterances.

“The six winners had to show they could really use and maintain the robot, and secondly the project had to exploit the capabilities of the robot,” says Giorgio Metta. “Looking at the proposals from the winners, it was clear that if we gave them a robot we would get something in return.”

The iCub robots are about the size of three-year-old children, with highly dexterous hands and fully articulated heads and eyes. They have hearing and touch capabilities and are designed to be able to crawl on all fours and to sit up.

Humans develop their abilities to understand and interact with the world around them through their experiences. As small children, we learn by doing and we understand the actions of others by comparing their actions to our previous experience.

The developers of iCub want to develop their robots’ cognitive capabilities by mimicking that process. Researchers from the EU-funded RobotCub project designed the iCub’s hardware and software as a modular system. The design increases the efficiency of the robot and allows researchers to update individual components more easily. The modular design also allows large numbers of researchers to work independently on separate aspects of the robot.

iCub’s software code, along with its technical drawings, is free to anyone who wishes to download and use it.

“We really like the idea of being open as it is a way to build a community of many people working towards a common objective,” says Giorgio Metta, one of the developers of iCub. “We need a critical mass working on these types of problems. If you get 50 researchers, they can really layer knowledge and build a more complex system. Joining forces really makes economic sense for the European Commission that is funding these projects and it makes scientific sense.”

Built-in learning skills

While the iCub’s hardware and mechanical parts are not expected to change much over the next 18 months, researchers expect to develop the software further. To enable iCub to learn by doing, the Robotcub research team is trying to pre-fit it with certain innate skills.

These include the ability to track objects visually or by sound – with some element of prediction of where the tracked object will move next. iCub should also be able to navigate based on landmarks and a sense of its own position.
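
That "element of prediction" can be as simple as extrapolating the object's recent motion forward in time. A minimal constant-velocity predictor is sketched below; it is purely illustrative and not the iCub's actual tracking code:

    def predict_next(positions, steps_ahead=1):
        """Predict the next position of a tracked object from its last two observations,
        assuming it keeps moving at roughly constant velocity."""
        (x0, y0), (x1, y1) = positions[-2], positions[-1]
        vx, vy = x1 - x0, y1 - y0                  # displacement per time step
        return x1 + steps_ahead * vx, y1 + steps_ahead * vy

    track = [(0.0, 0.0), (1.0, 0.5), (2.1, 1.0)]   # observed positions of a moving object
    print(predict_next(track))                     # -> approximately (3.2, 1.5)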

But the first and key skill iCub needs for learning by doing is an ability to reach towards a fixed point. By October this year, the iCub developers plan to develop the robot so it is able to analyse the information it receives via its vision and feel ‘senses’. The robot will then be able to use this information to perform at least some crude grasping behaviour – reaching outwards and closing its fingers around an object.

“Grasping is the first step in developing cognition as it is required to learn how to use tools and to understand that if you interact with an object it has consequences,” says Giorgio Metta. “From there the robot can develop more complex behaviours as it learns that particular objects are best manipulated in certain ways.”

Once the assembly of the six robots for the research projects is completed, the developers plan to build more iCubs, creating between 15 and 20 in use around Europe.

Adapted from materials provided by ICT Results.

Tuesday, April 22, 2008

Data Transfer In The Brain: Newfound Mechanism Enables Reliable Transmission Of Neuronal Information

ScienceDaily (Apr. 22, 2008) — The receptors of neurotransmitters move very rapidly. This mobility plays an essential, and hitherto unsuspected, role in the passage of nerve impulses from one neuron to another, thus controlling the reliability of data transfer. This has recently been demonstrated by scientists in the "Physiologie cellulaire de la synapse" Laboratory (CNRS/Université Bordeaux 2) coordinated by Daniel Choquet, senior researcher at CNRS.

By enabling a clearer understanding of the mechanisms involved in neuronal transmission, this work opens the way to new therapeutic targets for the neurological and psychiatric disorders that stem from poor neuronal communication (Parkinson's disease, Alzheimer's disease, OCD, etc.). The fruit of a collaboration with physicists at the Centre de physique moléculaire optique et hertzienne (CPMOH, CNRS/Université Bordeaux 1) and with German and American research teams(1), these findings were published in Science on April 11, 2008.

The processing of information by the brain is mainly based on the coding of data by variations in the frequency of neuronal activity. "Good" communication thus implies the reliable transmission of this "code" by the connections between neurons, or synapses. Under normal circumstances, this junction comprises a pre-synaptic element from which the information arises, and a post-synaptic element which receives it.

It is at this point that neuronal communication occurs. Once the pre-synaptic neuron has been stimulated by an electrical signal with a precise frequency, it releases chemical messengers into the synapse: neurotransmitters. And the response is rapid! These neurotransmitters bind to specific receptors, thus provoking a change to the electrical activity of the post-synaptic neuron and hence the birth of a new signal.

The mobility of receptors controls the reliability of neuronal transmission. Working at the interface between physics and biology, the teams in Bordeaux led by Choquet, CNRS senior researcher in the "Physiologie cellulaire de la synapse"(2) laboratory, in close collaboration with the group led by Brahim Lounis at the Centre de physique moléculaire optique et hertzienne(3), have been studying synaptic transmission and, more particularly, the role of certain receptors of glutamate, a neurotransmitter present in 80% of neurons in the brain.

Focusing on the dynamics of these receptors, the researchers have revealed that a minor modification to their mobility has a major impact on high-frequency synaptic transmission, i.e. at frequencies between 50 and 100 Hz (those involved in memorization, learning and sensory stimulation). More specifically, they have established that this mobility enables desensitized receptors in the synapse to be replaced within a few milliseconds by "naïve" receptors. This phenomenon reduces synaptic depression(4) and allows the neurons to transmit information at a higher frequency. By contrast, if the receptors are immobilized, this depression is markedly enhanced, preventing transmission of the nerve impulse in the synapses above around ten hertz.
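
This effect can be captured in a toy model of short-term synaptic depression: each presynaptic spike consumes a fraction of the available (non-desensitized) receptor pool, which then recovers with some time constant. If mobile receptors are swapped in within a few milliseconds, recovery is fast and high-frequency trains get through; if the receptors are immobilized, recovery is slow and the response collapses above roughly ten hertz. All numbers below are illustrative assumptions, not values fitted to the study's data:

    import math

    def steady_state_response(freq_hz, recovery_ms, use_fraction=0.5):
        """Relative response per spike of a depressing synapse driven by a long spike train.

        Each spike consumes `use_fraction` of the available receptor pool; between
        spikes the pool relaxes back toward 1 with time constant `recovery_ms`.
        Returns the steady-state availability just before a spike (1 = no depression).
        """
        interval_ms = 1000.0 / freq_hz
        recovered = 1.0 - math.exp(-interval_ms / recovery_ms)
        pool = 1.0
        for _ in range(200):                      # iterate spike/recovery cycles to steady state
            pool *= (1.0 - use_fraction)          # a spike consumes part of the pool
            pool += (1.0 - pool) * recovered      # partial recovery before the next spike
        return pool

    for f in (10, 50, 100):
        mobile = steady_state_response(f, recovery_ms=5)     # fast replacement by mobile receptors
        fixed = steady_state_response(f, recovery_ms=200)    # receptors immobilized
        print(f"{f:3d} Hz   mobile {mobile:.2f}   immobilized {fixed:.2f}")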

More profoundly, the scientists have demonstrated that prolonged series of high frequency stimulations, which induce an increase in calcium levels in the synapses, cause the immobilization of receptors. They have also proved that these series of stimulations diminish the ability of neurons to transmit an activity at high frequency. Receptor mobility is thus correlated with the frequency of synaptic transmission and consequently, the reliability of this transmission.

A real advance for research

When the brain is functioning under normal conditions, we can suppose that the immobilization of receptors following a series of high frequency stimulations constitutes a safety mechanism. It will prevent subsequent series from overexciting the post-synaptic neuron. A reliable transmission of information between two neurons is obviously crucial to satisfactory functioning of the brain.

These results, of prime importance, suggest that some dysfunctions of neuronal transmission are due to a defect in receptor stabilization. Interestingly, high-frequency electrical stimulation of certain regions of the brain is already used to treat Parkinson's disease and obsessive-compulsive disorders (OCD). Its mechanism of action, still poorly understood, may therefore involve receptor mobility. This work has thus made it possible to identify new therapeutic targets and could augur well for potential drugs to treat neurological and psychiatric disorders, which often result from poor communication between neurons.

Notes

1. Teams at the Leibniz Institute, Magdeburg and Johns Hopkins University School of Medicine, Baltimore, USA.
2. CNRS/Université Bordeaux 2.
3. CPMOH, CNRS/Université Bordeaux 1.
4. When a pre-synaptic neuron is stimulated at very frequent intervals (high frequencies of around 50-100 Hertz), the post-synaptic response generally diminishes over time: this is called synaptic depression. The higher the stimulation frequency, the more this depression increases.

Journal reference: Surface Mobility of Post-synaptic AMPARs Tunes Synaptic Transmission. Martin Heine, Laurent Groc, Renato Frischknecht, Jean-Claude Béïque, Brahim Lounis, Gavin Rumbaugh, Richard L. Huganir, Laurent Cognet and Daniel Choquet. Science. 11 April 2008.

Adapted from materials provided by CNRS.

Breastfeeding While Taking Seizure Medicine Does Not Appear To Harm Children, Study Suggests

ScienceDaily (Apr. 22, 2008) — A first-of-its-kind study finds breastfeeding while taking certain seizure medications does not appear to harm a child's cognitive development.

"Our early findings show breastfeeding during anti-epilepsy drug treatment doesn't appear to have a negative impact on a child's cognitive abilities," said study author Kimford Meador, MD, with the University of Florida at Gainesville and a Fellow of the American Academy of Neurology. "However, more research is needed to confirm our findings, and women should use caution due to the limitations of our study."

Researchers tested the cognitive development of 187 two-year-old children whose mothers were taking the epilepsy drugs lamotrigine, carbamazepine, phenytoin, or valproate. Forty-one percent of the children were breastfed.

The study found breastfed children had higher cognitive test scores than children who were not breastfed, and this trend was consistent for each anti-epilepsy drug. The children who were breastfed received an average test score of 98.1, compared to a score of 89.5 for the children not breastfed. However, the results were not significant after adjusting for the mother's IQ. Thus, it appears that the higher scores in children who were breastfed were due to the fact that their mothers had higher IQs.

Meador says animal studies have shown that some anti-epilepsy drugs, but not all, can cause cells to die in immature brains, but this effect can be blocked by the protective effects of beta estradiol, which is the mother's sex hormone. "Since the potential protective effects of beta estradiol in utero are absent after birth, concern was raised that breastfeeding by women taking anti-epilepsy drugs may increase the risk of anti-epilepsy drug-induced cell death and result in reduced cognitive outcomes in children."

Meador says additional research on the effects of breastfeeding should be extended to other anti-epilepsy drugs and mothers who use more than one anti-epilepsy medication.

The study is part of an ongoing study of the long-term effects of in utero anti-epilepsy drug exposure on children's cognition. Women with epilepsy who were taking anti-epilepsy drugs were enrolled in the study during pregnancy. Ultimately, the study will examine the effects of in utero anti-epilepsy drug exposure on children at six years old.

This research was presented at the American Academy of Neurology's 60th Anniversary Annual Meeting in Chicago on April 17, 2008.

Adapted from materials provided by American Academy of Neurology.

Chemotherapy's Damage To The Brain Detailed

ScienceDaily (Apr. 22, 2008) — A commonly used chemotherapy drug causes healthy brain cells to die off long after treatment has ended and may be one of the underlying biological causes of the cognitive side effects -- or "chemo brain" -- that many cancer patients experience. That is the conclusion of a study published today in the Journal of Biology.

A team of researchers at the University of Rochester Medical Center (URMC) and Harvard Medical School has linked the widely used chemotherapy drug 5-fluorouracil (5-FU) to a progressive collapse of populations of stem cells and their progeny in the central nervous system.

"This study is the first model of a delayed degeneration syndrome that involves a global disruption of the myelin-forming cells that are essential for normal neuronal function," said Mark Noble, Ph.D., director of the University of Rochester Stem Cell and Regenerative Medicine Institute and senior author of the study. "Because of our growing knowledge of stem cells and their biology, we can now begin to understand and define the molecular mechanisms behind the cognitive difficulties that linger and worsen in a significant number of cancer patients."

Cancer patients have long complained of neurological side effects such as short-term memory loss and, in extreme cases, seizures, vision loss, and even dementia. Until very recently, these cognitive side effects were often dismissed as the byproduct of fatigue, depression, and anxiety related to cancer diagnosis and treatment. Now a growing body of evidence has documented the scope of these conditions, collectively referred to as chemo brain. And while it is increasingly acknowledged by the scientific community that many chemotherapy agents may have a negative impact on brain function in a subset of cancer patients, the precise mechanisms that underlie this dysfunction have not been identified.

Virtually all cancer survivors experience short-term memory loss and difficulty concentrating during and shortly after treatment. A study two years ago by researchers with the James P. Wilmot Cancer Center at the University of Rochester showed that upwards of 82% of breast cancer patients reported that they suffer from some form of cognitive impairment.

While these effects tend to wear off over time, a subset of patients, particularly those who have been administered high doses of chemotherapy, begin to experience these cognitive side effects months or longer after treatment has ceased and the drugs have long since departed their systems. For example, a recent study estimates that somewhere between 15 and 20 percent of the nation's 2.4 million female breast cancer survivors have lingering cognitive problems years after treatment. Another study showed that 50 percent of women had not recovered their previous level of cognitive function one year after treatment.

Two years ago, Noble and his team showed that three common chemotherapy drugs used to treat a wide range of cancers were more toxic to healthy brain cells than the cancer cells they were intended to treat. While these experiments were among the first to establish a biological basis for the acute onset of chemo brain, they did not explain the lingering impact that many patients experience.

The scientists conducted a similar series of experiments in which they exposed both individual cell populations and mice to doses of 5-fluorouracil (5-FU) in amounts comparable to those used in cancer patients. 5-FU is among a class of drugs called antimetabolites that block cell division and has been used in cancer treatment for more than 40 years. The drug, which is often administered in a "cocktail" with other chemotherapy drugs, is currently used to treat breast, ovarian, stomach, colon, pancreatic and other forms of cancer.

The researchers discovered that months after exposure, specific populations of cells in the central nervous system -- oligodendrocytes and the dividing precursor cells from which they are generated -- underwent such extensive damage that, after six months, these cells had all but disappeared in the mice.

Oligodendrocytes play an important role in the central nervous system and are responsible for producing myelin, the fatty substance that, like insulation on electrical wires, coats nerve cells and enables signals between cells to be transmitted rapidly and efficiently. The myelin membranes are constantly being turned over, and without a healthy population of oligodendrocytes, the membranes cannot be renewed and eventually break down, resulting in a disruption of normal impulse transmission between nerve cells.

These findings parallel observations in studies of cancer survivors with cognitive difficulties. MRI scans of these patients' brains revealed a condition similar to leukoencephalopathy. This demyelination -- or the loss of white matter -- can be associated with multiple neurological problems.

"It is clear that, in some patients, chemotherapy appears to trigger a degenerative condition in the central nervous system," said Noble. "Because these treatments will clearly remain the standard of care for many years to come, it is critical that we understand their precise impact on the central nervous system, and then use this knowledge as the basis for discovering means of preventing such side effects."

Noble points out that not all cancer patients experience these cognitive difficulties, and determining why some patients are more vulnerable may be an important step in developing new ways to prevent these side effects. Because of this study, researchers now have a model which, for the first time, allows scientists to begin to examine this condition in a systematic manner.

Other investigators participating in the study include Ruolan Han, Ph.D., Yin M. Yang, M.D., Anne Luebke, Ph.D., Margot Mayer-Proschel, Ph.D., all with URMC, and Joerg Dietrich, M.D., Ph.D., formerly with URMC and now with Harvard Medical School. The study was funded by the National Institute of Neurological Disorders and Stroke, the Komen Foundation for the Cure, and the Wilmot Cancer Center.

Adapted from materials provided by University of Rochester Medical Center.

Contact Lenses With Circuits, Lights A Possible Platform For Superhuman Vision

ScienceDaily (Jan. 17, 2008) — Movie characters from the Terminator to the Bionic Woman use bionic eyes to zoom in on far-off scenes, have useful facts pop into their field of view, or create virtual crosshairs. Off the screen, virtual displays have been proposed for more practical purposes -- visual aids to help vision-impaired people, holographic driving control panels and even as a way to surf the Web on the go.

The device to make this happen may be familiar. Engineers at the University of Washington have for the first time used manufacturing techniques at microscopic scales to combine a flexible, biologically safe contact lens with an imprinted electronic circuit and lights.

"Looking through a completed lens, you would see what the display is generating superimposed on the world outside," said Babak Parviz, a UW assistant professor of electrical engineering. "This is a very small step toward that goal, but I think it's extremely promising." The results were presented today at the Institute of Electrical and Electronics Engineers' international conference on Micro Electro Mechanical Systems by Harvey Ho, a former graduate student of Parviz's now working at Sandia National Laboratories in Livermore, Calif. Other co-authors are Ehsan Saeedi and Samuel Kim in the UW's electrical engineering department and Tueng Shen in the UW Medical Center's ophthalmology department.

There are many possible uses for virtual displays. Drivers or pilots could see a vehicle's speed projected onto the windshield. Video-game companies could use the contact lenses to completely immerse players in a virtual world without restricting their range of motion. And for communications, people on the go could surf the Internet on a midair virtual display screen that only they would be able to see.

"People may find all sorts of applications for it that we have not thought about. Our goal is to demonstrate the basic technology and make sure it works and that it's safe," said Parviz, who heads a multi-disciplinary UW group that is developing electronics for contact lenses.

The prototype device contains an electric circuit as well as red light-emitting diodes for a display, though it does not yet light up. The lenses were tested on rabbits for up to 20 minutes and the animals showed no adverse effects.

Ideally, installing or removing the bionic eye would be as easy as popping a contact lens in or out, and once installed the wearer would barely know the gadget was there, Parviz said.

Building the lenses was a challenge because materials that are safe for use in the body, such as the flexible organic materials used in contact lenses, are delicate. Manufacturing electrical circuits, however, involves inorganic materials, scorching temperatures and toxic chemicals. Researchers built the circuits from layers of metal only a few nanometers thick, about one thousandth the width of a human hair, and constructed light-emitting diodes one third of a millimeter across.

They then sprinkled the grayish powder of electrical components onto a sheet of flexible plastic. The shape of each tiny component dictates which piece it can attach to, a microfabrication technique known as self-assembly. Capillary forces -- the same type of forces that make water move up a plant's roots, and that cause the edge of a glass of water to curve upward -- pull the pieces into position.

The prototype contact lens does not correct the wearer's vision, but the technique could be used on a corrective lens, Parviz said. And all the gadgetry won't obstruct a person's view.

"There is a large area outside of the transparent part of the eye that we can use for placing instrumentation," Parviz said. Future improvements will add wireless communication to and from the lens. The researchers hope to power the whole system using a combination of radio-frequency power and solar cells placed on the lens, Parviz said.

A full-fledged display won't be available for a while, but a version that has a basic display with just a few pixels could be operational "fairly quickly," according to Parviz.

The research was funded by the National Science Foundation and a Technology Gap Innovation Fund from the University of Washington.

Adapted from materials provided by University of Washington.

Monday, April 21, 2008

3-D Images -- Cordless And Any Time

ScienceDaily (Apr. 21, 2008) — Securing evidence at the scene of a crime, measuring faces for medical applications, taking samples during production – three-dimensional images are in demand everywhere. A handy cordless device now enables such images to be prepared rapidly anywhere.

The car tires have left deep tracks in the muddy forest floor at the scene of the crime. The forensic experts make a plaster cast of the print, so that it can later be compared with the tire profiles of suspects’ cars. There will soon be an easier way of doing it: The police officers will only need to pick up a 3-D sensor, press a button as on a camera, and a few seconds later they will see a three-dimensional image of the tire track on their laptop computer.

The sensor is no larger than a shoebox and weighs only about a kilogram – which means it is easy to handle even on outdoor missions such as in the forest. No cable drums are needed: The sensor radios the data to the computer via WLAN, and draws its power from batteries.

The sensor was developed at the Fraunhofer Institute for Applied Optics and Precision Engineering IOF in Jena. “It consists of two cameras with a projector in the center,” says IOF head of department Dr. Gunther Notni. “The two cameras provide a three-dimensional view, rather like two eyes. The projector casts a pattern of stripes on the objects. The geometry of the measured object can be deduced from the deformation of the stripes.”
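
The "deformation of the stripes" encodes depth by triangulation: the projector casts a stripe at a known angle, the camera sees where the stripe lands on the object, and the offset from where it would land on a flat reference plane gives the height at that point. A minimal sketch of the geometry follows; it is illustrative only and not the Kolibri CORDLESS algorithm:

    def height_from_stripe_shift(shift_mm, baseline_mm, standoff_mm):
        """Very simplified stripe-projection triangulation.

        A stripe cast from a projector offset by `baseline_mm` from the camera appears
        shifted by `shift_mm` on the object relative to its position on a flat reference
        plane `standoff_mm` away; similar triangles give the object height at that point.
        """
        return shift_mm * standoff_mm / (baseline_mm + shift_mm)

    # Example: 100 mm camera-projector baseline, reference plane 500 mm away;
    # a stripe observed 2 mm away from its reference position implies ~9.8 mm of height.
    print(height_from_stripe_shift(2.0, 100.0, 500.0))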

This type of stripe projection is already an established method. What is new about the measuring device, named ‘Kolibri CORDLESS’, is its measuring speed, size, weight and cordless operation. For comparison, conventional devices of this type weigh about four or five times as much and are more than twice the size, or roughly 50 centimeters long. “The reason it can be so much smaller is because of the projector, which produces light with light-emitting diodes instead of the usual halogen lamps,” says Notni. This poses an additional challenge, as the LEDs shine in all directions. To ensure that the image is nevertheless bright enough, the light has to be collected with special micro-optics in such a way that it falls on the lens.

There are multiple applications: “Patients who snore often need a breathing mask when they sleep. To ensure that the mask is not too tight, it has to be specially made for each patient. Our system enables the doctor to scan the patient’s face in just a few seconds and have the breathing mask made to match these data,” says the researcher. Notni believes that the most important application is for quality assurance in production processes. The portable device also makes it possible to measure installed components and zones that are difficult to access, such as the position of foot pedals inside a car. The researchers will be presenting their development at the Control trade fair in Stuttgart on April 21 through 25 (Hall 1, Stand 1520).

Adapted from materials provided by Fraunhofer-Gesellschaft.

Micro Sensor And Micro Fridge Make Cool Pair

ScienceDaily (Apr. 17, 2008) — Researchers at the National Institute of Standards and Technology (NIST) have combined two tiny but powerful NIST inventions on a single microchip, a cryogenic sensor and a microrefrigerator. The combination offers the possibility of cheaper, simpler and faster precision analysis of materials such as semiconductors and stardust.

As described in an upcoming issue of Applied Physics Letters,* the NIST team combined a transition-edge sensor (TES), a superconducting thin film that identifies X-ray signatures far more precisely than any other device, with a solid-state refrigerator based on a sandwich of a normal metal, an insulator and a superconductor. The combo chip, a square about a quarter inch on a side, achieved the first cooling of a fully functional detector (or any useful device) with a microrefrigerator.

The paper also reports the greatest temperature reduction in a separate object by microrefrigerators: a temperature drop of 110 millikelvins (mK), or about a tenth of a degree Celsius.

TES sensors are most sensitive at about 100 mK (a tenth of a degree Celsius above absolute zero). However, these ultralow temperatures are usually reached only by bulky, complex refrigerators. Because the NIST chip can provide some of its own cooling, it can be combined easily with a much simpler refrigerator that starts at room temperature and cools down to about 300 mK, says lead scientist Joel Ullom. In this setup, the chip would provide the second stage of cooling from 300mK down to the operating temperature (100 mK).

One promising application is cheaper, simpler semiconductor defect analysis using X-rays. A small company is already commercializing an earlier version of TES technology for this purpose. In another application, astronomical telescopes are increasingly using TES arrays to take pictures of the early universe at millimeter wavelengths. Use of the NIST chips would lower the temperature and increase the speed at which these images could be made, Ullom says.

The work was supported in part by the National Aeronautics and Space Administration.

* N.A. Miller, G.C. O'Neil, J.A. Beall, G.C. Hilton, K.D. Irwin, D.R. Schmidt, L.R. Vale and J.N. Ullom. High resolution X-ray transition-edge sensor cooled by tunnel junction refrigerators. Forthcoming in Applied Physics Letters.

Adapted from materials provided by National Institute of Standards and Technology.

Saturday, April 19, 2008

Innovative Composite Opens Terahertz Frequencies To Many Applications

ScienceDaily (Apr. 16, 2008) — A frequency-agile metamaterial that for the first time can be tuned over a range of frequencies in the so-called "terahertz gap" has been engineered by a team of researchers from Boston College, Los Alamos National Laboratory and Boston University.

The team incorporated semiconducting materials in critical regions of tiny elements -- in this case metallic split-ring resonators -- that interact with light in order to tune metamaterials beyond their fixed point on the electromagnetic spectrum, an advance that opens these novel devices to a broader array of uses, according to findings published in the online version of the journal Nature Photonics.

"Metamaterials no longer need to be constructed only out of metallic components," said Boston College physicist Willie J. Padilla, the project leader. "What we've shown is that one can take the exotic properties of metamaterials and combine them with the unique properties of natural materials to form a hybrid that yields superior performance."

Padilla and BC graduate student David Shrekenhamer, along with Hou-Tong Chen, John F. O'Hara, Abul K. Azad and Antoinette J. Taylor of Los Alamos National Laboratory, and Boston University's Richard D. Averitt formed a single layer of metamaterial and semiconductor that allowed the team to tune terahertz resonance across a range of frequencies in the far-infrared spectrum.

The team's first-generation device achieved 20 percent tuning of the terahertz resonance to lower frequencies -- those in the far-infrared region -- addressing the critical issue of narrow-band response typical of all metamaterial designs to date.

Constructed on the micron-scale, metamaterials are composites that use unique metallic contours in order to produce responses to light waves, giving each metamaterial its own unique properties beyond the elements of the actual materials in use.

Within the past decade, researchers have sought ways to significantly expand the range of material responses to waves of electromagnetic radiation -- classified by increasing frequency as radio waves, microwaves, terahertz radiation, infrared radiation, visible light, ultraviolet radiation, X-rays and gamma rays. Numerous novel effects have been demonstrated that defy accepted principles.

"Metamaterials demonstrated negative refractive index and up until that point the commonly held belief was that only a positive index was possible," said Padilla. "Metamaterials gave us access to new regimes of electromagnetic response that you could not get from normal materials."

Prior research has shown that because they rely on light-driven resonance, metamaterials experience frequency dispersion and narrow bandwidth operation where the centre frequency is fixed based on the geometry and dimensions of the elements comprising the metamaterial composite. The team believes that the creation of a material that addresses the narrow bandwidth limitations can advance the use of metamaterials.
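
To first order, a split-ring resonator behaves like a tiny LC circuit: the ring supplies the inductance and the gap the capacitance, so the fixed resonance sits at f = 1 / (2*pi*sqrt(L*C)). Placing a semiconductor in the gap effectively changes the capacitance as charge carriers are injected or depleted, which shifts the resonance. The values below are illustrative of the scaling only, not the device reported in the paper:

    import math

    def resonant_freq_thz(L_pH, C_fF):
        """LC resonance frequency f = 1 / (2*pi*sqrt(L*C)), returned in terahertz."""
        L, C = L_pH * 1e-12, C_fF * 1e-15
        return 1.0 / (2.0 * math.pi * math.sqrt(L * C)) / 1e12

    f0 = resonant_freq_thz(L_pH=25, C_fF=1.6)              # illustrative geometry -> ~0.8 THz
    f_tuned = resonant_freq_thz(L_pH=25, C_fF=1.6 * 1.56)  # carriers raise the effective capacitance
    print(f"{f0:.2f} THz -> {f_tuned:.2f} THz ({1 - f_tuned / f0:.0%} lower)")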

Enormous efforts have focused on the search for materials that could respond to terahertz radiation, a scientific quest to find the building blocks for devices that could take advantage of the frequency for imaging and other applications.

Potential applications could lie in medical imaging or security screening, said Padilla. Materials undetectable through x-ray scans -- such as chemicals, biological agents, and certain explosives -- can provide a unique "fingerprint" when struck by radiation in the far-infrared spectrum. Metamaterials like the one developed by the research team will facilitate future devices operating at the terahertz frequency of the electromagnetic spectrum.

In addition to imaging and screening, researchers and high-tech companies are probing the use of terahertz in switches, modulators, lenses, detectors, high bit-rate communications, secure communications, the detection of chemical and biological agents and characterization of explosives, according to Los Alamos National Laboratory.

Adapted from materials provided by Boston College, via EurekAlert!, a service of AAAS.

New Nanotube Sensor Can Continuously Monitor Minute Amounts Of Insulin

ScienceDaily (Apr. 17, 2008) — A new method that uses nanotechnology to rapidly measure minute amounts of insulin is a major step toward developing the ability to assess the health of the body's insulin-producing cells in real time.

Among other potential applications, this method could be used to improve the efficacy of a new procedure for treating Type 1 (juvenile) diabetes that has demonstrated the ability to free diabetics from insulin injections for several years. It works by transplanting insulin-producing cells into the pancreas of diabetics to replace the cells that the disease has disabled or destroyed.

The new insulin detection method was developed by a team of Vanderbilt researchers headed by Associate Professor of Chemistry David Cliffel and is reported in the February 18 issue of the journal Analytica Chimica Acta.

To gain this capability, the researchers developed a new electrode for a device called a microphysiometer. The microphysiometer assesses the condition of living cells by submerging them in a saline solution, confining them in a very small chamber and then measuring variations in their metabolism. The volume of the chamber is only three microliters — about 1/20th the size of an ordinary raindrop — allowing the electrode to detect the minute amounts of insulin produced by special pancreatic cells called Islets of Langerhans.

The new electrode is built from multiwalled carbon nanotubes, which are like several flat sheets of carbon atoms stacked and rolled into very small tubes. Provided by William Hofmeister at the University of Tennessee Space Institute, the nanotubes are electrically conductive, so the concentration of insulin in the chamber can be directly related to the current at the electrode, and they operate reliably at the pH levels characteristic of living cells.

Current detection methods measure insulin production at intervals by periodically collecting small samples and measuring their insulin levels. The new sensor detects insulin levels continuously by measuring the transfer of electrons produced when insulin molecules oxidize in the presence of glucose. When the cells produce more insulin molecules, the current in the sensor increases and vice versa, allowing the researchers to monitor insulin concentrations in real time. It is similar to a device developed by another group of researchers that operated at acidity levels well beyond those where living cells can function.
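
Because the oxidation current scales with the amount of insulin present, turning a measured current into a concentration is essentially a calibration problem: record the current at a few known concentrations, fit a straight line, then invert it for unknown samples. The sketch below uses made-up calibration numbers and is not the paper's data:

    # Minimal amperometric calibration: the electrode current is assumed proportional
    # to insulin concentration, so a straight-line fit converts current to concentration.
    known_conc_uM = [0.0, 0.5, 1.0, 2.0]      # calibration standards (made-up, micromolar)
    measured_nA = [0.1, 2.6, 5.1, 10.2]       # illustrative electrode currents (nanoamps)

    n = len(known_conc_uM)
    mean_x = sum(known_conc_uM) / n
    mean_y = sum(measured_nA) / n
    slope = sum((x - mean_x) * (y - mean_y)
                for x, y in zip(known_conc_uM, measured_nA))
    slope /= sum((x - mean_x) ** 2 for x in known_conc_uM)
    intercept = mean_y - slope * mean_x

    def current_to_concentration(current_nA):
        """Invert the calibration line to estimate insulin concentration."""
        return (current_nA - intercept) / slope

    print(current_to_concentration(7.0))      # an unknown sample -> roughly 1.4 micromolar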

Previous tests had shown that nanotube detectors are more sensitive at measuring insulin than conventional methods. However, the researchers had to overcome a major obstacle to adapt them to work in the microphysiometer.

In the small chamber, they found that the fluid moves across the electrode surface rather than pushing against it. These micro-currents tended to sweep the nanotubes aside rather than pinning them to the electrode surface where their electrical activity can be measured. The researchers solved this problem by coating the electrode with a chemical called dihydropyran, a small molecule that forms chains that trap the insulin molecules on the electrode surface.

"One of the key advances of this project was finding how to keep nanotubes active on the surface without being washed away by microfluidic flows," Cliffel says.

Now that the microphysiometer has demonstrated the ability to rapidly detect the small quantities of insulin produced by individual cells, the researchers hope to use it to determine the health of the islet cells used for transplantation.

Researchers at the University of Alberta have shown that islet cells can be transplanted into a Type 1 diabetic and can greatly relieve insulin dependence for several years. Unfortunately these transplants require large doses of immunosuppressive drugs, and scientists don't yet know how these drugs affect the health of the islet cells.

One of the next steps is to use the microphysiometer to measure insulin, lactate and oxygen levels simultaneously. This will allow researchers to study how the islet cells react to the drugs and help identify the best way to deal with transplant rejection. It will also allow them to verify the health of the islet cells before they are transplanted into patients.

The research was funded in part by the Vanderbilt Institute of Integrative Biosystems Research and Education and by a pilot project grant from Vanderbilt Diabetes Research and Training Center, supported by the National Institutes of Health. It was performed with islet cells isolated from mice at the Vanderbilt Diabetes Research and Training Center.

Other authors of the study include graduate student Rachel M. Snider, research assistant professor Madalina Ciobanu, and former undergraduate researcher Amy E. Rue.

Adapted from materials provided by Vanderbilt University.

World's Newest and Fastest Survey Telescope Receives New Mirror

ScienceDaily (Apr. 17, 2008) — A 4.1-metre diameter primary mirror, a vital part of the world's newest and fastest survey telescope, VISTA (the Visible and Infrared Survey Telescope for Astronomy), has been delivered to its new mountaintop home at Cerro Paranal, Chile. The mirror will now be coupled with a small camera for initial testing prior to installing the main camera in June. Full scientific operations are due to start early next year. VISTA will form part of ESO's Very Large Telescope facility.

The mirror arrived over the Easter weekend at the Paranal Observatory where the telescope is being assembled at an altitude of 2518m, in Chile's Atacama Desert.

VISTA Project Manager Alistair McPherson from STFC's UK Astronomy Technology Centre (UK ATC) accompanied the mirror on its journey to Chile: "This is a major milestone for the VISTA project. The precious mirror was loaded on to a plane in a special cradle that used tennis balls to cushion it from impact for its arduous journey across three continents."

"The mirror had a difficult four-day journey, by air and by road. It arrived in perfect condition and now that it has been coated, we will install the mirror in the telescope with a small test camera for about four weeks testing. We plan to install the main camera in June," said the Project Scientist on VISTA, Will Sutherland of Queen Mary, University of London, UK.

The VISTA 4.1-metre diameter primary mirror is the most strongly curved large mirror ever polished to such a precise and exacting surface accuracy – deviations from a perfect surface are less than 1/3000th of the thickness of a human hair. On arrival at Cerro Paranal it was safely craned into the telescope dome, where it was washed and coated with a thin layer of protected silver in the facility's coating plant. Silver is the best metal for the purpose since it reflects over 98% of near-infrared light, better than the more commonly used aluminium. To date, the reflectivity produced by the silver coating – a relatively new venture – is well above that specified and exceeds that of all other telescopes.

VISTA will survey large areas of the southern sky at near infrared wavelengths (2 to 4 times the wavelength of visible light) to study objects that are not seen easily in optical light either because they are too cool (such as brown dwarfs), or are surrounded by interstellar dust which infrared light penetrates much better than optical, or whose optical light is redshifted into the near infrared by the expansion of the Universe.

Amongst other things, VISTA's surveys will help our understanding of the nature, distribution and origin of known types of stars and galaxies, map the 3-D structure of our galaxy, and help determine the relation between the 3-D structure of the universe and the mysterious 'dark energy' and 'dark matter'. Samples of objects will be followed up in detail with further observations by other telescopes and instruments such as the nearby Very Large Telescope.

"The delivery of the last component of VISTA is a significant milestone and we are delighted with the progress made since the mirror arrived. Now astronomers can really look forward to being able to perform unparalleled observing of our Southern skies," said Richard Wade, President of the ESO Council and STFC Chief Operating Officer.

VISTA is a 55 million euro project, funded by grants from the UK DTI's Joint Infrastructure Fund and the STFC to Queen Mary, University of London, the lead institute of the VISTA Consortium. VISTA forms part of the UK's subscription to ESO and will be an ESO telescope.

Adapted from materials provided by ESO.

Comet Collides With Solar Hurricane

ScienceDaily (Oct. 1, 2007) — A NASA satellite has captured the first images of a collision between a comet and a solar hurricane. It is the first time scientists have witnessed such an event on another cosmic body. One of NASA's pair of Solar Terrestrial Relations Observatory satellites, known as STEREO, recorded the event April 20.

The phenomenon was caused by a coronal mass ejection, a large cloud of magnetized gas cast into space by the sun. The collision resulted in the complete detachment of the plasma tail of Encke's comet. Observations of the comet reveal the brightening of its tail as the coronal mass ejection swept by and the tail's subsequent separation as it was carried away by the front of the ejection.

"We were awestruck when we saw these images," says Angelos Vourlidas, lead author and researcher at the Naval Research Laboratory, Washington. "This is the first time we've witnessed a collision between a coronal mass ejection and a comet and the surprise of seeing the disconnection of the tail was the icing on the cake."

Encke's comet was traveling within the orbit of Mercury when a coronal mass ejection first crunched the tail then ripped it completely away. The comet is only the second repeating, or periodic, comet ever identified. Halley's comet was the first.

Scientists at the Naval Research Laboratory made the observations using the Heliospheric Imager in its Sun Earth Connection Coronal and Heliospheric Investigation telescope suite aboard the STEREO-A spacecraft. Coronal mass ejections are violent eruptions with masses greater than a few billion tons. They travel from 60 to more than 2,000 miles per second.

They have been compared to hurricanes because of the widespread disruption they can cause when directed at Earth. These solar hurricanes cause geomagnetic storms that can present hazards for satellites, radio communications and power systems. However, coronal mass ejections are spread over a large volume of space, which dilutes their mass and power so that their impact is softer than a baby's breath.

Scientists have known for some time that a comet's entire plasma tail can become disconnected, but the conditions that lead to these events remained a mystery. It was suspected that coronal mass ejections could be responsible for some of these disconnection events, but the interaction between a coronal mass ejection and a comet had never been observed.

Preliminary analysis suggests the disconnection likely is triggered by what is known as magnetic reconnection, in which the oppositely directed magnetic fields around the comet are crunched together by the magnetic fields in the coronal mass ejection. The comet fields suddenly link together, reconnecting, to release a burst of energy that detaches the comet's tail. A similar process takes place in Earth's magnetosphere during geomagnetic storms, powering the aurora borealis and other phenomena.

Comets are icy leftovers from the solar system's formation billions of years ago. They usually reside in the cold, distant regions of the solar system. Occasionally, the gravitational tug from a planet, another comet or a nearby star sends a comet into the inner solar system, where the sun's heat and radiation vaporizes gas and dust from the comet to form its tail. Comets typically have two tails: one of dust and a fainter one of electrically conducting gas called plasma.

"Even though STEREO is primarily designed to study coronal mass ejections, particularly their impact on Earth, we hope this impact will provide many insights to scientists studying comets," said Michael Kaiser, STEREO project scientist at NASA's Goddard Space Flight Center, Greenbelt, Md.

The results will be published in the Oct. 10 issue of the Astrophysical Journal Letters.

View animation at: http://www.nasa.gov/mpg/191092main_encke_visualization.mpg

Adapted from materials provided by NASA, Goddard Space Flight Center.

Platinum Nanocube Makes Hydrogen Fuel Cells Cheaper And More Efficient

ScienceDaily (Apr. 18, 2008) — Two great obstacles to hydrogen-powered vehicles lie with fuel cells. Fuel cells, which like batteries produce electrical power through chemical reactions, have been plagued by relatively low efficiency and high production costs. Scientists have tested a wide assortment of metals and materials to overcome these twin challenges.

Now a team led by Shouheng Sun, professor of chemistry at Brown, has mastered a Rubik’s Cube-like dilemma for dealing with platinum, a precious metal coveted for its ability to boost a chemical reaction in fuel cells. The team shows that shaping platinum into a cube greatly enhances its efficiency in a phase of the fuel cell’s operation known as oxygen reduction reaction. Sun’s results have been published online in the journal Angewandte Chemie. The paper was selected as a Very Important Paper, a distinction reserved for less than 5 percent of manuscripts submitted to the peer-reviewed journal.

Platinum helps reduce the energy barrier – the amount of energy needed to start a reaction – in the oxidation phase of a fuel cell. It is also seen as beneficial on the other end of the fuel cell, known as the cathode. There, platinum has been shown to assist in oxygen reduction, a process in which electrons peeled from hydrogen atoms join with oxygen atoms to create electrical energy. The reaction also is important because it only produces water. This byproduct – rather than the global warming gas carbon dioxide – is a big reason why hydrogen fuel cells are a tantalizing area of research from automakers in Detroit to policy-makers in Washington.

Scientists, however, have had trouble maximizing platinum’s potential in the oxygen reduction reaction. The barriers chiefly revolve around shape and surface area – geometry and geography, so to speak. What Sun has learned is that molding platinum into a cube on the nanoscale enhances its catalysis – that is, it boosts the rate of a chemical reaction.

“For the first time, we can control the morphology of the particle to make it more like a cube,” Sun said. “People have had very limited control over this process before. Now we have shown it can be done uniformly and consistently.”

During his experiments, Sun, along with Brown graduate engineering student Chao Wang and engineers from the Japanese firm Hitachi Maxell, Ltd., created polyhedron- and cube-shaped particles of different sizes by adding platinum acetylacetonate (Pt(acac)2) and a trace amount of iron pentacarbonyl (Fe(CO)5) at specific temperature ranges. The team found that the cubes were more efficient catalysts, owing largely to their surface structure and their resistance to sulfate adsorption from the fuel cell solution.

“For this reaction, the shape is more important than the size,” Sun said.

The next step, Sun added, is to build a polymer electrolyte membrane fuel cell and test the platinum nanocubes as catalysts in it. The team expects the experiments will yield fuel cells with a higher electrical output than previous versions.

“It’s like science fiction, but we’re a step closer now to the reality of developing a very efficient platinum catalyst for hydrogen cars that produce only water as exhaust,” Sun said.

Hitachi Maxell chemical engineers Hideo Daimon, Taigo Ondera and Tetsunori Koda, a visiting engineer at Brown, contributed to the research.

The research was funded by the National Science Foundation and by the Office of the Vice President for Research at Brown University through its Research Seed Fund.

Adapted from materials provided by Brown University.

Silent Tiny Cooling Systems Made For Laptop Computers, Other Devices

ScienceDaily (Mar. 20, 2008) — Engineers harnessing the same physical property that drives silent household air purifiers have created a miniaturized device that is now ready for testing as a silent, ultra-thin, low-power and low maintenance cooling system for laptop computers and other electronic devices.

The compact, solid-state fan, developed with support from NSF's Small Business Innovation Research program, is the most powerful and energy efficient fan of its size. It produces three times the flow rate of a typical small mechanical fan and is one-fourth the size.

Dan Schlitz and Vishal Singhal of Thorrn Micro Technologies, Inc., of Marietta, Ga., will present their RSD5 solid-state fan at the 24th Annual Semiconductor Thermal Measurement, Modeling and Management Symposium (Semi-Therm) in San Jose, Calif., on March 17, 2008. The device is the culmination of six years of research that began while the researchers were NSF-supported graduate students at Purdue University.

"The RSD5 is one of the most significant advancements in electronics cooling since heat pipes. It could change the cooling paradigm for mobile electronics," said Singhal.

The RSD5 incorporates a series of live wires that generate a micro-scale plasma (an ion-rich gas that has free electrons that conduct electricity). The wires lie within uncharged conducting plates that are contoured into a half-cylindrical shape to partially envelop the wires.

Within the intense electric field that results, ions push neutral air molecules from the wire to the plate, generating a wind. The phenomenon is called corona wind.

"The technology is a breakthrough in the design and development of semiconductors as it brings an elegant and cost effective solution to the heating problems that have plagued the industry," said Juan Figueroa, the NSF SBIR program officer who oversaw the research.

With the breakthrough of the contoured surface, the researchers were able to control the micro-scale discharge to produce maximum airflow without risk of sparks or electrical arcing. As a result, the new device yields a breeze as swift as 2.4 meters per second, as compared to airflows of 0.7 to 1.7 meters per second from larger, mechanical fans.

The contoured platform is a part of the device heat sink, a trick that enabled Schlitz and Singhal to both eliminate some of the device's bulk and increase the effectiveness of the airflow.

"The technology has the power to cool a 25-watt chip with a device smaller than 1 cubic-cm and can someday be integrated into silicon to make self-cooling chips," said Schlitz.

This device is also more dust-tolerant than its predecessors. While dust attraction is ideal for living-room-scale fans that provide both air flow and filtration, debris can be a devastating obstacle when the goal is to cool an electrical component.

Adapted from materials provided by National Science Foundation.

New Robots Can Provide Elder Care For Aging Baby Boomers

ScienceDaily (Apr. 16, 2008) — Baby boomers are set to retire, and robots are ready to help, providing elder care and improving the quality of life for those in need. Researchers at the University of Massachusetts Amherst have developed a robotic assistant that can dial 911 in case of emergencies, remind clients to take their medication, help with grocery shopping and allow a client to talk to loved ones and health care providers.

Concerned family members can access the unit and visit their elderly parents from any Internet connection, including navigating around the home and looking for Mom or Dad, who may not hear the ringing phone or may be in need of assistance. Doctors can perform virtual house calls, reducing the need for travel.

“For the first time, robots are safe enough and inexpensive enough to do meaningful work in a residential environment,” says computer scientist Rod Grupen, director of UMass Amherst’s Laboratory for Perceptual Robotics, who developed project ASSIST with computer scientists Allen Hanson and Edward Riseman.

The robot, called the uBOT-5, could allow elders to live independently, and provide relief for caregivers, the medical system and community services, which are expected to be severely stressed by the retirement of over 77 million Americans in the next 30 years.

There is no mistaking the uBot-5 for a person, but its design was inspired by human anatomy. An array of sensors acts as the robot's eyes and ears, allowing it to recognize human activities, such as walking or sitting. It can also recognize an abnormal visual event, such as a fall, and notify a remote medical caregiver. Through an interface, the remote service provider may ask the client to speak, smile or raise both arms, movements that the robot can demonstrate. If the person is unresponsive, the robot can call 911, alert family and apply a digital stethoscope to the patient, conveying information to an emergency medical technician who is en route.
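The check-in-and-escalate sequence described above can be summarized in a short sketch. Everything in it (the class, function names, prompts and placeholder actions) is a hypothetical illustration of the behaviour the article describes, not the actual uBOT-5 software:

# A minimal, self-contained sketch of the escalation logic described above.
# All names and actions are hypothetical placeholders, not the uBOT-5 code.
from dataclasses import dataclass

@dataclass
class CheckInResult:
    responsive: bool
    escalated: bool

def check_in(person_responds: bool) -> CheckInResult:
    """Run the prompt-then-escalate sequence for one abnormal event (e.g. a fall)."""
    notify_remote_caregiver("abnormal event detected")

    # Ask the person to perform simple movements the robot can demonstrate.
    for prompt in ("please speak", "please smile", "please raise both arms"):
        demonstrate(prompt)
        if person_responds:
            return CheckInResult(responsive=True, escalated=False)

    # No response: escalate as the article describes.
    call_911()
    alert_family()
    apply_digital_stethoscope()   # convey data to the EMT en route
    return CheckInResult(responsive=False, escalated=True)

# Placeholder actions; on a real robot these would drive hardware and telephony.
def notify_remote_caregiver(msg): print("caregiver notified:", msg)
def demonstrate(prompt): print("robot demonstrates:", prompt)
def call_911(): print("calling 911")
def alert_family(): print("alerting family")
def apply_digital_stethoscope(): print("applying digital stethoscope")

if __name__ == "__main__":
    print(check_in(person_responds=False))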

The system also tracks what isn’t human. If a delivery person leaves a package in a hallway, the sensor array is trained to notice when a path is blocked, and the robot can move the obstruction out of the way. It can also raise its outstretched arms, carry a load of about 2.2 pounds and has the potential to perform household tasks that require a fair amount of dexterity, including cleaning and grocery shopping.

The uBOT-5 carries a Web cam, a microphone, and a touch-sensitive LCD display that acts as an interface for communication with the outside world. “Grandma can take the robot’s hand, lead it out into the garden and have a virtual visit with a grandchild who is living on the opposite coast,” says Grupen, who notes that isolation can lead to depression in the elderly.

Grupen studied developmental neurology in his quest to create a robot that could do a variety of tasks in different environments. The uBot-5’s arm motors are analogous to the muscles and joints in our own arms, and it can push itself up to a vertical position if it falls over. It has a “spinal cord” and the equivalent of an inner ear to keep it balanced on its Segway-like wheels.

The researchers wanted to create a personal robot that could provide many services, such as a medical alert system, or the means to talk to loved ones, all in one human-like package, according to Grupen. To evaluate the effectiveness of potential technologies, the research team worked with social workers, members of the medical community and family members of those in elder care.

The collaborative effort, dubbed project ASSIST, involved researchers from the Smith College School for Social Work, the Veteran’s Administration (Connecticut Health Care System, West Haven campus) and elder care community centers in western Massachusetts. Through focus groups, the researchers learned about the preferences of potential users.

Graduate students Patrick Deegan, Emily Horrell, Shichao Ou, Sharaj Sen, Brian Thibodeau, Adam Williams and Dan Xie are also collaborators on project ASSIST.

Adapted from materials provided by University of Massachusetts Amherst.

How Nanocluster Contaminants Increase Risk Of Spreading Through Groundwater

ScienceDaily (Apr. 17, 2008) — For almost half a century, scientists have struggled with plutonium contamination spreading further in groundwater than expected, increasing the risk of sickness in humans and animals.
It was known that nanometer-sized clusters of plutonium oxide were the culprit, but no one had been able to study their structure or find a way to separate them from the groundwater.

Scientists at the U.S. Department of Energy's Argonne National Laboratory, in collaboration with researchers from the University of Notre Dame, were able to use high-energy X-rays from the Argonne Advanced Photon Source to finally discover and study the structure of plutonium nanoclusters.

"When plutonium forms into the clusters, its chemistry is completely different and no one has really been able to assess what it is, how to model it or how to separate it Argonne senior chemist Lynda Soderholm said. "People have known about and tried to understand the nanoclusters, but it was the modern analytical techniques and the APS that allowed us understand what it is."

The nanoclusters are made up of exactly 38 plutonium atoms and carry almost no charge. Unlike stray plutonium ions, which carry a positive charge, they are not attracted to the electrons in plant life, minerals and other materials that stop the ions' progression through the groundwater.

Transport models have been based on free plutonium ions, creating discrepancies between what is expected and what is observed. Soderholm said that with knowledge of the structure, scientists can now create better models that account not only for free-roaming plutonium ions but also for the nanoclusters.

The clusters also are a problem for plutonium remediation. The free ions are relatively easy to separate out from groundwater, but the clusters are difficult to remove.

"As we learn more, we will be able to model the nanoclusters and figure out how to break them apart," Soderholm said. "Once they are formed, they are very hard to get rid of."

Soderholm said other experiments have shown some clusters with different numbers of plutonium atoms, and she plans to examine -- together with her collaborators S. Skanthakumar, Richard Wilson and Peter Burns of Argonne's Chemical Sciences and Engineering Division -- the unique electric and magnetic properties of the clusters.

Funding for the research was provided by the U.S. Department of Energy, Office of Science, Office of Basic Energy Sciences.

Adapted from materials provided by DOE/Argonne National Laboratory, via EurekAlert!, a service of AAAS.

Friday, April 18, 2008

Tiny Sensor Developed To Detect Homemade Bombs

ScienceDaily (Mar. 19, 2008) — A team of chemists and physicists at the University of California, San Diego has developed a tiny, inexpensive sensor chip capable of detecting trace amounts of hydrogen peroxide, a chemical used in the most common form of homemade explosives.
The invention and operation of this penny-sized electronic sensor, capable of sniffing out hydrogen peroxide vapor in the parts-per-billion range from peroxide-based explosives, such as those used in the 2005 bombing of the London transit system, are detailed in a new article.

In addition to detecting explosives, UC San Diego scientists say the sensor could have widespread applications in improving the health of industrial workers by providing a new tool to inexpensively monitor the toxic hydrogen peroxide vapors from bleached pulp and other products to which factory workers are exposed.

“The detection capability of this tiny electronic sensor is comparable to current instruments, which are large, bulky and cost thousands of dollars each,” said William Trogler, a professor of chemistry and biochemistry at UCSD and one of its inventors. “If this device were mass produced, it’s not inconceivable that it could be made for less than a dollar.”

The device was invented by a team led by Trogler; Andrew Kummel, a professor of chemistry and biochemistry; and Ivan Schuller, a professor of physics. Much of the work was done by UCSD chemistry and physics graduate students Forest Bohrer, Corneliu Colesniuc and Jeongwon Park.

The sensor works by monitoring changes in electrical conductivity through thin films of “metal phthalocyanines.” When exposed to most oxidizing agents, such as chlorine, these metal films show an increase in electrical current, while reducing agents have the opposite effect—a decrease in electrical current.

But when exposed to hydrogen peroxide, an oxidant, the metal phthalocyanine films behave differently depending on the type of metal used. Films made of cobalt phthalocyanine show decreases in current, while those made from copper or nickel show increases in current.

The UCSD team used this unusual trait to build its sensor, which is composed of thin films of both cobalt phthalocyanine and copper phthalocyanine and displays a unique signature whenever tiny amounts of hydrogen peroxide are present.
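As a rough illustration of the two-film signature: most oxidants push the current in both films in the same direction, while hydrogen peroxide pushes cobalt phthalocyanine down and copper phthalocyanine up. The short Python sketch below captures that decision rule; the noise threshold and the example readings are invented for illustration and are not taken from the UCSD device.

# Hypothetical sketch of the two-film signature logic described above.
# Thresholds and readings are invented; a real device would be calibrated
# against known vapor concentrations.

def classify(delta_cobalt: float, delta_copper: float, noise: float = 0.01) -> str:
    """Classify a vapor from fractional current changes in the two films.

    delta_cobalt, delta_copper: (I_exposed - I_baseline) / I_baseline
    """
    if delta_cobalt < -noise and delta_copper > noise:
        # Opposite responses: the signature attributed to hydrogen peroxide.
        return "hydrogen peroxide"
    if delta_cobalt > noise and delta_copper > noise:
        return "generic oxidant (e.g. chlorine)"
    if delta_cobalt < -noise and delta_copper < -noise:
        return "reducing agent"
    return "no significant response"

# Example: cobalt film current drops 5%, copper film current rises 4%.
print(classify(-0.05, 0.04))   # -> "hydrogen peroxide"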

Bombs constructed with hydrogen peroxide killed more than 50 people and injured 700 more on three London subway trains and a transit bus during rush hour on July 7, 2005. More than 1,500 pounds of a hydrogen peroxide-based mixture was discovered after an alleged bomb plot in Germany that resulted in the widely publicized arrest last September of three people.

Trogler said that because the team’s sensor is so little affected by water vapor, it can be used in industrial and other “real-life applications.” The university has applied for a patent on the invention, which has not yet been licensed.

The article, "Selective Detection of Vapor Phase Hydrogen Peroxide with Phthalocyanine Chemiresistors," is published in the Journal of the American Chemical Society.

Funding for the research study was provided by the Air Force Office of Scientific Research.

Adapted from materials provided by University of California - San Diego.

Graphene Used To Create World's Smallest Transistor

ScienceDaily (Apr. 18, 2008) — Researchers have used the world's thinnest material to create the world's smallest transistor, one atom thick and ten atoms wide.
Reporting their peer-reviewed findings in the journal Science, Dr Kostya Novoselov and Professor Andre Geim from The School of Physics and Astronomy at The University of Manchester show that graphene can be carved into tiny electronic circuits with individual transistors having a size not much larger than that of a molecule.

The smaller the size of their transistors, the better they perform, say the Manchester researchers.

In recent decades, manufacturers have crammed more and more components onto integrated circuits. As a result, the number of transistors and the power of these circuits have roughly doubled every two years. This has become known as Moore's Law.

But the speed of this cramming is now noticeably decreasing, and further miniaturisation of electronics is expected to face its most fundamental challenge in the next 10 to 20 years, according to the semiconductor industry roadmap.

At the heart of the problem is the poor stability of materials if shaped in elements smaller than 10 nanometres* in size. At this spatial scale, all semiconductors -- including silicon -- oxidise, decompose and uncontrollably migrate along surfaces like water droplets on a hot plate.

Four years ago, Geim and his colleagues discovered graphene, the first known one-atom-thick material which can be viewed as a plane of atoms pulled out from graphite. Graphene has rapidly become the hottest topic in physics and materials science.

Now the Manchester team has shown that it is possible to carve out nanometre-scale transistors from a single graphene crystal. Unlike all other known materials, graphene remains highly stable and conductive even when it is cut into devices one nanometre wide.

Graphene transistors start showing advantages and good performance at sizes below 10 nanometres - the miniaturization limit at which silicon technology is predicted to fail.

"Previously, researchers tried to use large molecules as individual transistors to create a new kind of electronic circuits. It is like a bit of chemistry added to computer engineering", says Novoselov. "Now one can think of designer molecules acting as transistors connected into designer computer architecture on the basis of the same material (graphene), and use the same fabrication approach that is currently used by semiconductor industry".

"It is too early to promise graphene supercomputers," adds Geim. "In our work, we relied on chance when making such small transistors. Unfortunately, no existing technology allows the cutting materials with true nanometre precision. But this is exactly the same challenge that all post-silicon electronics has to face. At least we now have a material that can meet such a challenge."

"Graphene is an exciting new material with unusual properties that are promising for nanoelectronics", comments Bob Westervelt, professor at Harvard University. "The future should be very interesting".

*One nanometre is one-millionth of a millimetre and a single human hair is around 100,000 nanometres in width.

A paper entitled "Chaotic Dirac Billiard in Graphene Quantum Dots" is published in the April 17 issue of Science. It is accompanied by a Perspective article entitled "Graphene Nanoelectronics" by Westervelt.

Adapted from materials provided by University of Manchester.

Aerodynamic Truck Trailer Cuts Fuel And Emissions By Up To 15 Percent

ScienceDaily (Apr. 17, 2008) — Creating an improved aerodynamic shape for truck trailers by mounting sideskirts can cut fuel consumption and emissions by as much as 15%. Earlier promising predictions, based on mathematical models and wind tunnel tests by TU Delft, have been confirmed during road tests with an adapted trailer. This means that PART (Platform for Aerodynamic Road Transport), the public-private partnership platform, has produced an application which can immediately be put into production.
It is expected that the cost of fitting aerodynamically-shaped sideskirts will be recouped within two years. Furthermore, the sideskirts can be fitted to approximately half the trucks currently in use in the Netherlands as the skirts can also be retrofitted.

Carbon Dioxide reduction

Prof. Michel van Tooren of TU Delft’s Aerospace Engineering faculty: 'In 2005, 10,000 new trailers were taken into use in the Netherlands. With an average fuel consumption of 30 litres per 100 kilometres, that translates into 750 million litres of diesel consumption in the Netherlands each year. We can cut fuel consumption by 5% or more for 50% of those trailers. That means a reduction of 50 million kilograms of CO2 emissions a year. This research can therefore result in a substantial, structural contribution to cutting fuel consumption and an annual saving of tens of millions of Euros, on top of that cut in CO2 emissions by the road transport sector.'
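Taking the figures in the quote at face value, the savings arithmetic can be reproduced in a few lines of Python. The diesel emission factor of about 2.6 kg of CO2 per litre is an assumption, not a figure from the article; with it, the stated 5% saving on half the fleet works out to roughly 50 million kilograms of CO2 a year.

# Reproducing the savings arithmetic from the quote above, using the article's
# figures plus one assumption: an emission factor of ~2.6 kg CO2 per litre of diesel.

annual_diesel_litres = 750e6      # diesel burned per year by these trailers (article figure)
share_of_trailers    = 0.50       # fraction of trailers that can be retrofitted
fuel_saving_fraction = 0.05       # "5% or more" fuel-consumption cut
co2_kg_per_litre     = 2.6        # assumed diesel emission factor (not from the article)

litres_saved = annual_diesel_litres * share_of_trailers * fuel_saving_fraction
co2_saved_kg = litres_saved * co2_kg_per_litre

print(f"diesel saved: {litres_saved/1e6:.1f} million litres/year")   # ~18.8 million litres
print(f"CO2 avoided: {co2_saved_kg/1e6:.0f} million kg/year")        # ~49 million kg, i.e. ~50,000 tonnes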

He continues: 'Together with this sector we have created a practical platform for further research and development, but we still need active government participation. Just obtaining permits for all the road tests has involved a huge amount of time, energy and frustration. The next step is realizing a practical partnership between the government and industry in order to put the solutions into practice.'

About sideskirts

Sideskirts are plates mounted on the sides of trailers, primarily for underrun protection. The new aerodynamic design of the sideskirts substantially reduces the air currents alongside and under the trailer, and thereby also the air resistance. Initial driving tests with a trailer equipped with the aerodynamic sideskirts over a straight stretch of public road revealed a cut in fuel consumption of between 5% and 15%. Subsequent research, comprising long-term operational tests by TNT, showed a fuel reduction of 10%. These results confirm the calculations and findings from the wind tunnel tests, which had already established that the observed 14 - 18% reduction in air resistance led to 7 - 9% less fuel consumption. In practice, the figures are in fact even better.
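The gap between the quoted drag reduction (14 - 18%) and the fuel reduction (7 - 9%) reflects the fact that only part of a truck's fuel at highway speed goes into overcoming aerodynamic drag. A back-of-the-envelope check, assuming that share is about one half (an assumption for illustration, not a figure from the article):

# Back-of-the-envelope check of the drag-to-fuel relationship quoted above.
# The 50% aerodynamic share of highway fuel use is an assumption for illustration.

aero_share_of_fuel = 0.5             # assumed fraction of fuel spent on aerodynamic drag

for drag_reduction in (0.14, 0.18):  # wind-tunnel figures from the article
    fuel_reduction = aero_share_of_fuel * drag_reduction
    print(f"{drag_reduction:.0%} less drag -> about {fuel_reduction:.0%} less fuel")
# prints roughly 7% and 9%, matching the quoted 7 - 9% range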

About the boat tail

Road tests have also been initiated on what are known as boat tails. These constructions on the rear of a trailer reduce the wake: the vacuum and air currents that arise when the trailer is moving. In theory, a boat tail could cut air resistance by 30%, with a fuel reduction of 10 - 15%. These road tests should also confirm the earlier, highly positive results from the wind tunnel.

Incidentally, this study does not aim to produce boat tails for commercial use. Limitations in their practical use, in particular during loading and unloading, safety aspects and problems with exceeding maximum vehicle sizes prevent their use on many types of vehicles. This research focuses on gaining knowledge and developing different practicable solutions; the second development phase will concentrate on these aims.

About PART

The objective of PART, a partnership between TU Delft, TNT, Scania Beers BV, FOCWA Carrosseriebouw, Ephicas, Kees Mulder Carrosserieën, Van Eck Carrosseriebouw, Syntens, Squarell Technology, Emons Group and NEA transport research and training, is to develop and test aerodynamic applications for trailers. In contrast to research into the aerodynamic properties of trucks, comparable research into trailers is still relatively new. Applications such as the Ephicas sideskirts or boat tails could lead to reductions in air resistance of up to 30%, which translates into a reduction in fuel consumption and emissions of as much as 15%. Moreover, it contributes to increasing profits in the highly competitive world of road transport.

Adapted from materials provided by Delft University of Technology.

Nuclear Power: Most Successful Fuel Performance Ever For US Advanced Gas Reactor Fuel

ScienceDaily (Apr. 15, 2008) — Advanced gas reactors offer more efficient operation, reduced waste disposal and other benefits over the water-cooled reactor designs used in U.S. nuclear power plants.

But creating fuel that burns efficiently and reliably in the higher temperatures of advanced gas reactors has been a challenge -- until now.

Fuel fabricated at Oak Ridge National Laboratory, in cooperation with Idaho National Laboratory and the Babcock & Wilcox Company, has demonstrated the most successful performance ever for U.S. advanced gas reactor fuel.

In recent tests at the Advanced Test Reactor at INL, the ORNL fuel achieved 9 percent burn-up, a significant milestone on its way to a target of 16-18 percent.

Higher burn-up allows for more efficient use of uranium and less waste compared to the 3-4 percent rate of standard fuel at U.S. power plants.

This experiment is the first of eight planned to qualify fuel as part of the Department of Energy's Next Generation Nuclear Power Plant project.

The fuel elements are built from thousands of tiny uranium-containing spheres coated with carbon and silicon carbide to contain the radioactive fission products. The coated particles are compacted by a special process into fuel sticks and loaded into a graphite form.

The fuel work for this first test was conducted in ORNL's Materials Science and Technology Division and funded by DOE's Office of Nuclear Energy.

Adapted from materials provided by DOE/Oak Ridge National Laboratory.

Water Needed To Produce Various Types Of Energy

ScienceDaily (Apr. 17, 2008) — It is easy to overlook that most of the energy we consume daily, such as electricity or natural gas, is produced with the help of a dwindling resource – fresh water. Virginia Tech professor Tamim Younos and undergraduate student Rachelle Hill are researching the water-efficiency of some of the most common energy sources and power generating methods.
Younos, associate director of the Virginia Water Resources Research Center based at Virginia Tech and research professor of water resources in the College of Natural Resources, and Hill, an undergraduate researcher from Round Hill, Va., majoring in environmental science with an aquatic resources concentration in the College of Agriculture and Life Sciences, have analyzed 11 types of energy sources, including coal, fuel ethanol, natural gas and oil, and five power-generating methods, including hydroelectric, fossil fuel thermoelectric and nuclear.

Younos said they based their calculations on available governmental reports by using a standard measurement unit, which makes this study unique. “Our unit is gallons of water per British Thermal Unit (BTU),” explained Younos. “We selected BTU as a standard unit because it indicates pure energy as heat and is applicable to all energy production and power generation methods.”

According to the study, the most water-efficient energy sources are natural gas and synthetic fuels produced by coal gasification. The least water-efficient energy sources are fuel ethanol and biodiesel.

In terms of power generation, Younos and Hill have found that geothermal and hydroelectric energy types use the least amount of water, while nuclear plants use the most.

Hill took the study one step further and calculated how many gallons of water are required to burn one 60-watt incandescent light bulb for 12 hours a day, over the course of one year. She found that the bulb would consume between 3,000 and 6,000 gallons of water, depending on how water-efficient the power plant that supplies the electricity is.

Hill added that the results are estimates of the water consumption based on energy produced by fossil fuel thermoelectric plants, which produce most of the United States’ power – about 53 percent. “The numbers are even more staggering if you multiply the water consumed by the same light bulb by the approximately 111 million U.S. homes,” said Hill. “The water usage then gets as high as 655 billion gallons of water a year.”

By contrast, burning a compact fluorescent bulb for the same amount of time would save about 2,000 to 4,000 gallons of water per year.
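Hill's estimates can be reproduced from the article's own numbers. In the sketch below, the gallons-per-BTU water intensity is back-calculated from the stated 3,000 - 6,000 gallon result, and the 15-watt rating for an equivalent compact fluorescent bulb is an assumption, so the figures are illustrative only.

# Reproducing the light-bulb estimate from the article's figures.
# The gallons-per-BTU water intensity is back-calculated from the stated
# 3,000-6,000 gallon result, and the 15 W CFL wattage is an assumption.

KWH_TO_BTU = 3412                       # energy conversion factor

def annual_btu(watts: float, hours_per_day: float = 12) -> float:
    kwh = watts * hours_per_day * 365 / 1000
    return kwh * KWH_TO_BTU

incandescent_btu = annual_btu(60)       # ~897,000 BTU per year

# Water intensity range implied by the article's 3,000-6,000 gallon estimate
# for fossil-fuel thermoelectric generation (illustrative, back-calculated).
intensity_low  = 3000 / incandescent_btu    # gallons per BTU
intensity_high = 6000 / incandescent_btu

cfl_btu = annual_btu(15)                # assumed 15 W CFL giving similar light
for intensity in (intensity_low, intensity_high):
    saved = (incandescent_btu - cfl_btu) * intensity
    print(f"water saved by switching to a CFL: ~{saved:,.0f} gallons/year")
# roughly 2,250 and 4,500 gallons, in line with the article's 2,000-4,000 figure

homes = 111e6                           # approximate number of U.S. homes (article figure)
print(f"national scale, one bulb per home: up to ~{6000 * homes / 1e9:,.0f} billion gallons/year")
# ~666 billion gallons, of the same order as the 655 billion quoted above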

Younos noted that the results of this analysis should be interpreted with a grain of salt. “There are several variables such as geography and climate, technology type and efficiency, and accuracy of measurements that come into play. However, by standardizing the measurement unit, we have been able to obtain a unique snapshot of the water used to produce different kinds of energy.”

Adapted from materials provided by Virginia Tech, via Newswise.