about us:
Our company is KNK DECORATION. We work in sticker services, for example: sticker cutting, digital printing, screen printing, neon boxes, sign marking and other promotional materials.
We handle sticker cutting, indoor/outdoor printing, neon boxes, traffic sign fabrication (DLLAJR standard), and more.

Wednesday, November 28, 2007

High Performance Field-effect Transistors With Thin Films Of Carbon 60 Produced

ScienceDaily (Nov. 27, 2007) — Using room-temperature processing, researchers at the Georgia Institute of Technology have fabricated high-performance field-effect transistors with thin films of Carbon 60, also known as fullerene. The ability to produce devices of such performance with an organic semiconductor represents another milestone toward practical applications for large-area, low-cost electronic circuits on flexible organic substrates.
The new devices -- which have electron-mobility values higher than amorphous silicon, low threshold voltages, large on-off ratios and high operational stability -- could encourage more designers to begin working on such circuitry for displays, active electronic billboards, RFID tags and other applications that use flexible substrates.

"If you open a textbook and look at what a thin-film transistor should do, we are pretty close now," said Bernard Kippelen, a professor in Georgia Tech's School of Electrical and Computer Engineering and the Center for Organic Photonics and Electronics. "Now that we have shown very nice single transistors, we want to demonstrate functional devices that are combinations of multiple components. We have everything ready to do that."

Fabrication of the C60 transistors was reported August 27th in the journal Applied Physics Letters. The research was supported by the U.S. National Science Foundation through the STC program MDITR, and the U.S. Office of Naval Research.

Researchers have been interested in making field-effect transistors and other devices from organic semiconductors that can be processed onto various substrates, including flexible plastic materials. As an organic semiconductor material, C60 is attractive because it can provide high electron mobility -- a measure of how fast current can flow. Previous reports have shown that C60 can yield mobility values as high as six square centimeters per volt-second (6 cm²/V·s). However, that record was achieved using a hot-wall epitaxy process requiring processing temperatures of 250 degrees Celsius -- too hot for most flexible plastic substrates.

Though the transistors produced by Kippelen's research team display slightly lower electron mobility -- 2.7 to 5 cm²/V·s -- they can be produced at room temperature.

"If you want to deposit transistors on a plastic substrate, you really can't have any process at a temperature of more than 150 degrees Celsius," Kippelen said. "With room temperature deposition, you can be compatible with many different substrates. For low-cost, large area electronics, that is an essential component."

Because they are sensitive to contact with oxygen, the C60 transistors must operate under a nitrogen atmosphere. Kippelen expects to address that limitation by using other fullerene molecules -- and properly packaging the devices.

The new transistors were fabricated on silicon for convenience. While Kippelen isn't underestimating the potential difficulty of moving to an organic substrate, he says that challenge can be overcome.

Though their performance is impressive, the C60 transistors won't threaten conventional CMOS chips based on silicon. That's because the applications Kippelen has in mind don't require high performance.

"There are a lot of applications where you don't necessarily need millions of fast transistors," he said. "The performance we need is by far much lower than what you can get in a CMOS chip. But whereas CMOS is extremely powerful and can be relatively low in cost because you can make a lot of circuits on a wafer, for large area applications CMOS is not economical."

A different set of goals drives electronic components for use with low-cost organic displays, active billboards and similar applications.

"If you look at a video display, which has a refresh rate of 60 Hz, than means you have to refresh the screen every 16 milliseconds," he noted. "That is a fairly low speed compared to a Pentium processor in your computer. There is no point in trying to use organic materials for high-speed processing because silicon is already very advanced and has much higher carrier mobility."

Now that they have demonstrated attractive field-effect C60 transistors, Kippelen and collaborators Xiao-Hong Zhang and Benoit Domercq plan to produce other electronic components such as inverters, ring oscillators, logic gates, and drivers for active matrix displays and imaging devices. Assembling these more complex systems will showcase the advantages of the C60 devices.

"The goal is to increase the complexity of the circuits to see how that high mobility can be used to make more complex structures with unprecedented performance," Kippelen said.

The researchers fabricated the transistors by depositing C60 molecules from the vapor phase into a thin film atop a silicon substrate onto which a gate electrode and gate dielectric had already been fabricated. The source and drain electrodes were then deposited on top of the C60 films through a shadow mask.

Kippelen's team has been working with C60 for nearly ten years, and is also using the material in photovoltaic cells. Beyond the technical advance, Kippelen believes this new work demonstrates the growing maturity of organic electronics.

"This progress may trigger interest among more conventional electronic engineers," he said. "Most engineers would like to work with the latest technology platform, but they would like to see a level of performance showing they could actually implement these circuits. If you can demonstrate -- as we have -- that you can get transistors with good reproducibility, good stability, near-zero threshold voltages, large on-off current ratios and performance levels higher than amorphous silicon, that may convince designers to consider this technology."

Adapted from materials provided by Georgia Institute of Technology.

Micro Microwave Does Pinpoint Cooking For Miniaturized Labs

ScienceDaily (Nov. 15, 2007) — Researchers at the National Institute of Standards and Technology (NIST) and George Mason University have demonstrated what is probably the world's smallest microwave oven, a tiny mechanism that can heat a pinhead-sized drop of liquid inside a container slightly shorter than an ant and half as wide as a single hair. The micro microwave is intended for lab-on-a-chip devices that perform rapid, complex chemical analyses on tiny samples.
In a paper in the Journal of Micromechanics and Microengineering*, the research team led by NIST engineer Michael Gaitan describes for the first time how a tiny dielectric microwave heater can be successfully integrated with a microfluidic channel to control selectively and precisely the temperature of fluid volumes ranging from a few microliters (millionths of a liter) to sub-nanoliters (less than a billionth of a liter). Sample heating is an essential step in a wide range of analytic techniques that could be built into microfluidic devices, including the high-efficiency polymerase chain reaction (PCR) process that rapidly amplifies tiny samples of DNA for forensic work, and methods to break cells open to release their contents for study.

The team embedded a thin-film microwave transmission line between a glass substrate and a polymer block to create its micro microwave oven. A trapezoid-shaped cut in the polymer block only 7 micrometers across at its narrowest -- the diameter of a red blood cell -- and nearly 4 millimeters long (approximately the length of an ant) serves as the chamber for the fluid to be heated.

Based on classical theory of how microwave energy is absorbed by fluids, the research team developed a model to explain how their miniature oven would work. They predicted that electromagnetic fields localized in the gap would directly heat the fluid in a selected portion of the micro channel while leaving the surrounding area unaffected. Measurements of the microwaves produced by the system and their effect on the fluid temperature in the micro channel validated the model by showing that the increase in temperature of the fluid was predominantly due to the absorbed microwave power.
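
The underlying energy balance is simple to sketch. The toy calculation below assumes a water-like sample and made-up power and volume figures (the paper's actual operating values are not reproduced here), and it ignores heat losses, which the real model accounts for:

```python
# Rough energy balance for microwave heating of a tiny water volume:
# temperature rise = absorbed energy / heat capacity of the sample.
# Illustrative assumptions only; conduction losses are ignored.

RHO_WATER = 1000.0   # kg/m^3
CP_WATER = 4186.0    # J/(kg*K)

def temp_rise_kelvin(power_w: float, seconds: float, volume_nl: float) -> float:
    volume_m3 = volume_nl * 1e-12        # 1 nanoliter = 1e-12 m^3
    mass_kg = RHO_WATER * volume_m3
    return (power_w * seconds) / (mass_kg * CP_WATER)

# e.g. 10 mW absorbed for 0.1 s by a 1 nL sample
print(temp_rise_kelvin(0.010, 0.1, 1.0))  # ~239 K: tiny volumes heat very fast
```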

Once the new technology is more refined, the researchers hope to use it to design a microfluidic microwave heater that can cycle temperatures rapidly and efficiently for a host of applications.

The work is supported by the Office of Science and Technology at the Department of Justice's National Institute of Justice.

* J.J. Shah, S.G. Sundaresan, J. Geist, D.R. Reyes, J.C. Booth, M.V. Rao and M. Gaitan. Microwave dielectric heating of fluids in an integrated microfluidic device. Journal of Micromechanics and Microengineering, 17: 2224-2230 (2007)

Adapted from materials provided by National Institute of Standards and Technology.

Friday, November 23, 2007

PCBs May Threaten Killer Whale Populations For 30-60 Years

ScienceDaily (Sep. 10, 2007) — Orcas or killer whales may continue to suffer the effects of contamination with polychlorinated biphenyls (PCBs) for the next 30 to 60 years, despite 1970s-era regulations that have reduced overall PCB concentrations in the environment, researchers in Canada report. The study calls for better standards to protect these rare marine mammals.
In the study, Brendan Hickie and Peter S. Ross and colleagues point out that orcas face a daunting array of threats to survival, including ship traffic, reduced abundance of prey and environmental contamination. Orcas, which reach lengths exceeding 25 feet and weights of 4 to 5 tons, already are the most PCB-contaminated creatures on Earth. Scientists are trying to determine how current declines in PCBs in the environment may affect orcas throughout an exceptionally long life expectancy, which ranges up to 90 years for females and 50 years for males.

The new study used mathematical models and measurements of PCBs in salmon (orcas' favorite food) and ocean floor cores to recreate a PCB exposure history to estimate PCB concentrations in killer whales over time. It concluded that the "threatened" northern population of 230 animals will likely face health risks until at least 2030, while the endangered southern population of 85 orcas may face such risks until at least 2063. PCBs make whales more vulnerable to infectious disease, impair reproduction, and impede normal growth and development, the researchers say.
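
To see the shape of such a projection, here is a deliberately simplified toy model that assumes a plain first-order decline with an invented half-life and threshold; the study's actual model couples sediment cores, food-web transfer and whale physiology and is far more detailed:

```python
import math

# Toy first-order decline of PCB concentration in a population.
# The half-life and the risk threshold below are illustrative assumptions,
# not values from the study.

def years_until_safe(c0_ppm: float, threshold_ppm: float, half_life_years: float) -> float:
    """Years until concentration decays below threshold: c(t) = c0 * 2**(-t/half_life)."""
    return half_life_years * math.log2(c0_ppm / threshold_ppm)

# e.g. starting at 8x a toxicity threshold with a 15-year effective half-life
print(years_until_safe(80.0, 10.0, 15.0))   # 45 years -- in the 30-60 year range
```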

"The findings provide conservationists, regulators, and managers with benchmarks against which the effectiveness of mitigative steps can be measured and tissue residue guidelines can be evaluated," the study reported. "The results of our study on PCBs may paint an ominous picture for risks associated with emerging chemicals, as the concentrations of structurally-related PBDEs are doubling every 4 years in marine mammals," researchers added.

"Killer Whales (Orcinus orca) Face Protracted Health Risks Associated with Lifetime Exposure to PCBs" Environmental Science & Technology, September 15, 2007

Adapted from materials provided by American Chemical Society.

Connecting Wind Farms Can Make A More Reliable And Cheaper Power Source

ScienceDaily (Nov. 21, 2007) — Wind power, long considered to be as fickle as wind itself, can be groomed to become a steady, dependable source of electricity and delivered at a lower cost than at present, according to scientists at Stanford University.
The key is connecting wind farms throughout a given geographic area with transmission lines, thus combining the electric outputs of the farms into one powerful energy source. The findings are published in the November issue of the American Meteorological Society's Journal of Applied Meteorology and Climatology.

Wind is the world's fastest growing electric energy source, according to the study's authors, Cristina Archer and Mark Jacobson. However, because wind is intermittent, it is not used to supply baseload electric power today. Baseload power is the amount of steady and reliable electric power that is constantly being produced, typically by power plants, regardless of the electricity demand. But interconnecting wind farms with a transmission grid reduces the power swings caused by wind variability and makes a significant portion of it just as consistent a power source as a coal power plant.

"This study implies that, if interconnected wind is used on a large scale, a third or more of its energy can be used for reliable electric power, and the remaining intermittent portion can be used for transportation, allowing wind to solve energy, climate and air pollution problems simultaneously," said Archer, the study's lead author and a consulting assistant professor in Stanford's Department of Civil and Environmental Engineering and research associate in the Department of Global Ecology of the Carnegie Institution.

It's a bit like having a bunch of hamsters generating your power, each in a separate cage with a treadmill. At any given time, some hamsters will be sleeping or eating and some will be running on their treadmill. If you have only one hamster, the treadmill is either turning or it isn't, so the power's either on or off. With two hamsters, the odds are better that one will be on a treadmill at any given point in time and your chances of running, say, your blender, go up. Get enough hamsters together and the odds are pretty good that at least a few will always be on the treadmill, cranking out the kilowatts.

The combined output of all the hamsters will vary, depending on how many are on treadmills at any one time, but there will be a certain level of power that is always being generated, even as different hamsters hop on or off their individual treadmills. That's the reliable baseload power.
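
That intuition is just the statistics of pooling independent sources. A short Monte Carlo sketch, with a hypothetical 70 percent availability per source rather than real wind statistics, shows how the dependable floor rises as sources are added:

```python
import random

# Monte Carlo sketch: each "hamster" (or wind farm) independently produces
# power some fraction of the time. Pooling more sources raises the output
# level you can count on nearly all of the time. The 70% availability per
# source is an illustrative assumption, not a measured wind statistic.

def reliable_floor(n_sources: int, p_running: float = 0.7,
                   trials: int = 20_000, reliability: float = 0.99) -> int:
    """Number of running sources exceeded in `reliability` of trials."""
    outputs = sorted(sum(random.random() < p_running for _ in range(n_sources))
                     for _ in range(trials))
    return outputs[int((1.0 - reliability) * trials)]

for n in (1, 2, 10, 50):
    print(n, reliable_floor(n))  # floor: 0 of 1, 0 of 2, ~3 of 10, ~27 of 50
```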

The connected wind farms would operate the same way.

"The idea is that, while wind speed could be calm at a given location, it could be gusty at others. By linking these locations together we can smooth out the differences and substantially improve the overall performance," Archer said.

As one might expect, not all locations make sense for wind farms. Only locations with strong winds are economically competitive. In their study, Archer and Jacobson, a professor of civil and environmental engineering at Stanford, evaluated 19 sites in the Midwestern United States, with annual average wind speeds greater than 6.9 meters per second at a height of 80 meters above ground, the hub height of modern wind turbines. Modern turbines are 80-100 meters high, approximately the height of a 30-story building, and their blades are 70 meters long or more.
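
The emphasis on strong-wind sites follows from the physics: the power carried by the wind grows with the cube of its speed, so modest differences in average speed translate into large differences in yield. A sketch of that standard formula, with an assumed rotor diameter for illustration:

```python
import math

AIR_DENSITY = 1.225  # kg/m^3 at sea level

def wind_power_kw(speed_ms: float, rotor_diameter_m: float = 80.0) -> float:
    """Kinetic power in the wind through the rotor disc: P = 0.5 * rho * A * v^3.
    (A real turbine captures at most ~59% of this, the Betz limit.)"""
    area = math.pi * (rotor_diameter_m / 2) ** 2
    return 0.5 * AIR_DENSITY * area * speed_ms ** 3 / 1000.0

print(wind_power_kw(6.9))   # ~1011 kW through the disc at the study's threshold speed
print(wind_power_kw(5.5))   # ~512 kW -- a 20% slower wind carries half the power
```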

The researchers used hourly wind data, collected and quality-controlled by the National Weather Service, for the entire year of 2000 from the 19 sites in the Midwestern United States. They found that an average of 33 percent and a maximum of 47 percent of yearly-averaged wind power from interconnected farms can be used as reliable, baseload electric power. These percentages would hold true for any array of 10 or more wind farms, provided it met the minimum wind speed and turbine height criteria used in the study.

Another benefit of connecting multiple wind farms is reducing the total distance that all the power has to travel from the multiple points of origin to the destination point. Interconnecting multiple wind farms to a common point and then connecting that point to a far-away city reduces the cost of transmission.

It's the same as having lots of streams and creeks join together to form a river that flows out to sea, rather than having each creek flow all the way to the coast by carving out its own little channel.

Another type of cost saving also results when the power combines to flow in a single transmission line. Explains Archer: Suppose a power company wanted to bring power from several independent farms--each with a maximum capacity of, say, 1,500 kilowatts (kW) --from the Midwest to California. Each farm would need a short transmission line of 1,500 kW brought to a common point in the Midwest. Then they would need a larger transmission line between the common point and California--typically with a total capacity of 1,500 kW multiplied by the number of independent farms connected.

However, with geographically dispersed farms, it is unlikely that they would simultaneously be experiencing strong enough winds to each produce their 1,500 kW maximum output at the same time. Thus, the capacity of the long-distance transmission line could be reduced significantly with only a small loss in overall delivered power.

The more wind farms connected to the common point in the Midwest, the greater the reduction in long-distance transmission capacity that is possible.

"Due to the high cost of long-distance transmission, a 20 percent reduction in transmission capacity with little delivered power loss would notably reduce the cost of wind energy," added Archer, who calculated the decrease in delivered power to be only about 1.6 percent.

With only one farm, a 20 percent reduction in long-distance transmission capacity would decrease delivered power by 9.8 percent--not a 20 percent reduction, because the farm is not producing its maximum possible output all the time.
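
The same pooling statistics explain that gap. The sketch below caps a shared line at 80 percent of total capacity and measures the clipped energy; it uses uniform random outputs as a stand-in distribution, so the exact percentages differ from the study's wind data, but the single-farm versus many-farm contrast is the point:

```python
import random

# How much delivered energy is lost if the long-distance line is sized at
# only 80% of the summed farm capacity? Farm outputs are independent and
# uniform on [0, 1] here -- a stand-in distribution, not real wind data.

def loss_fraction(n_farms: int, line_cap_fraction: float = 0.8,
                  trials: int = 50_000) -> float:
    produced = delivered = 0.0
    cap = line_cap_fraction * n_farms
    for _ in range(trials):
        total = sum(random.random() for _ in range(n_farms))
        produced += total
        delivered += min(total, cap)
    return 1.0 - delivered / produced

print(loss_fraction(1))    # ~4% of energy clipped for a single farm
print(loss_fraction(19))   # ~0% -- pooled output almost never hits the cap
```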

Archer said that if the United States and other countries each started to organize the siting and interconnection of new wind farms based on a master plan, the power supply could be smoothed out and transmission requirements could be reduced, decreasing the cost of wind energy. This could result in the large-scale market penetration of wind energy--already the most inexpensive clean renewable electric power source--which could contribute significantly to an eventual solution to global warming, as well as reducing deaths from urban air pollution.

Adapted from materials provided by American Meteorological Society.

New Technology Illuminates Protein Interactions In Living Cells

ScienceDaily (Nov. 15, 2007) — While fluorescence has long been used to tag biological molecules, a new technology developed at Yale allows researchers to use tiny fluorescent probes to rapidly detect and identify protein interactions within living cells while avoiding the biological disruption of existing methods, according to a report in Nature Chemical Biology.
Proteins are commonly tagged using variants of the "green fluorescent protein" (GFP), but these proteins are very large and are often toxic to live cells. They also tend to aggregate, making them difficult to work with and monitor. This new methodology uses the fluorescence emitted by a small molecule, rather than a large protein. It gives researchers a less disruptive way to capture images of the intricate contacts between folded regions of an individual protein or the partnerships between proteins in a live cell.

"Our approach bypasses many of the problems associated with fluorescent proteins, so that we can image protein interactions in living cells," said senior author Alanna Schepartz, the Milton Harris Professor of Chemistry, and Howard Hughes Medical Institute Professor at Yale. "Using these molecules we can differentiate alternative or misfolded proteins from those that are folded correctly and also detect protein partnerships in live cells."

Each protein is a three-dimensional structure created by "folding" its linear chain of amino acids. Usually only one shape "works" for each protein. The particular shape a protein takes depends on its amino acids and on other processes within the cell.

Schepartz and her team devised their new tagging system using small molecules called "profluorescent" biarsenical dyes. These molecules easily enter cells and become fluorescent when they bind to a specific amino acid tag sequence within a protein. While these compounds have been used for about a decade to bind single proteins, this is the first time they have been used to identify interactions between proteins.

The researchers' strategy was to split the amino acid tag for the dye into two pieces, locating each piece of the tag far apart in the chain of a protein they genetically engineered and expressed in the cells. Then they monitored cells exposed to the dye. Where the protein folded correctly, the two parts of the tag came together and the fluorescent compound bound and lit up. There was no signal unless the protein folded normally.

"This method of detection can provide important insights into how proteins choose their partners within the cell -- choices that may be very different from those made in a test tube," said Schepartz. She emphasizes that this technology does not monitor the process of protein folding -- but, rather "sees" the protein conformations that exist at a given time.

"In theory, our technique could be used to target and selectively inactivate specific protein complexes in the cell, as therapy, or to visualize conformations at very high resolution for diagnostic purposes," said Schepartz. She speculates that the technology could be applied to detection strategies that identify protein misfolding in neurodegenerative diseases like Alzheimer's or Parkinson's.

Other authors on the paper are Nathan W. Luedtke, Rachel J. Dexter and Daniel B. Fried from the Schepartz lab at Yale. Funding from the Howard Hughes Medical Institute and the National Institutes of Health supported the research.

Journal citation: Nature Chemical Biology: (early online) 04 November 2007 | doi:10.1038/nchembio.2007.49

Adapted from materials provided by Yale University.

'Wiring Up' Enzymes For Producing Hydrogen In Fuel Cells

ScienceDaily (Nov. 21, 2007) — Researchers in Colorado are reporting the first successful "wiring up" of hydrogenase enzymes. Those much-heralded proteins are envisioned as stars in a future hydrogen economy where they may serve as catalysts for hydrogen production and oxidation in fuel cells.
Their report describes a successful electrical connection between a carbon nanotube and hydrogenase.

In the new study, Michael J. Heben, Paul W. King, and colleagues explain that bacterial enzymes called hydrogenases show promise as powerful catalysts for using hydrogen in fuel cells, which can produce electricity with virtually no pollution for motor vehicles, portable electronics, and other devices.

However, scientists report difficulty incorporating these enzymes into electrical devices because the enzymes do not form good electrical connections with fuel cell components. Currently, precious metals, such as platinum, are typically needed to perform this catalysis.

The researchers combined hydrogenase enzymes with carbon nanotubes, submicroscopic strands of pure carbon that are excellent electrical conductors. In laboratory studies, the researchers used photoluminescence spectroscopy measurements to demonstrate that a good electrical connection was established.

These new "biohybrid" conjugates could reduce the cost of fuel cells by reducing or eliminating the need for platinum and other costly metal components, they say.

The journal article, "Wiring-Up Hydrogenase with Single-Walled Carbon Nanotubes" is scheduled for the Nov. issue of ACS' Nano Letters.

Adapted from materials provided by American Chemical Society.

Wednesday, November 21, 2007

Cold Shot: Blasting Frozen Soil Sample With Ultraviolet Laser Reveals Uranium

ScienceDaily (Sep. 20, 2006) — If you want to ferret out uranium's hiding place in contaminated soil, freeze the dirt and zap it with a black light, an environmental scientist reported Tuesday at the American Chemical Society national meeting.
Scientists have long known that uranium salts under ultraviolet light will glow an eerie greenish-yellow in the dark. This phenomenon sent Henri Becquerel down the path that led to his discovery of radioactivity a century ago.

Others since noted a peculiar feature about the UV glow, or fluorescence spectra, of uranium salts: The resolution of the spectral fingerprint becomes sharper as the temperature falls.

Zheming Wang, a staff scientist at the Department of Energy's Pacific Northwest National Laboratory in Richland, Wash., has now dusted the frost off the files, applying a technique called cryogenic fluorescence spectroscopy to uranium in contaminated soil at a former nuclear fuel manufacturing site.

By cooling the sediments to minus 267 degrees Celsius, near the temperature of liquid helium, Wang and colleagues at the PNNL-based W.R. Wiley Environmental Molecular Sciences Laboratory hit a contaminated sample with a UV laser to coax a uranium fluorescence intensity of more than five times that at room temperature.

What is more, other spectra that were absent at room temperature popped out when frozen, enabling Wang and colleagues to distinguish different forms of uranium from one another, including uranium-carbonate that moves readily underground and is a threat to water supplies.

Adapted from materials provided by Pacific Northwest National Laboratory.

Chemists Create Novel Uranium Molecule

ScienceDaily (Nov. 19, 2007) — Chemists at the University of Virginia have prepared the first uranium methylidyne molecule ever reported, despite the reactivity of uranium atoms with other molecules. The new molecule joins a hydrocarbon fragment to uranium through a uranium-carbon triple bond.
Their finding, which contributes to chemists’ fundamental understanding of uranium chemistry, is reported in the Proceedings of the National Academy of Sciences.

“This is the first example of a triple bond between uranium and carbon in a hydrocarbon,” said Lester Andrews, the lead scientist and a professor of chemistry at the University of Virginia.

Andrews and members of his U.Va. laboratory have been working on uranium chemistry for 15 years with dozens of different molecules. For this finding they used a focused pulsed laser to evaporate depleted uranium in a vacuum chamber, reacted the vapor with fluoroform molecules, then trapped the new molecule in argon frozen at 8 K, near absolute zero.

“The uranium atom went into a C-F bond and rearranged the other fluorines to make the new molecule, HC≡UF3, which is uranium trifluoride methylidyne,” Andrews said. It is this exotic triple bond between uranium and carbon that the researchers have characterized.

“After we did the infrared spectroscopy of the new molecule, we performed calculations to predict the structure and bonding properties of the molecule and compared the predicted vibrational spectrum with the one we observed,” Andrews said. “The agreement was good enough for us to conclude we had in fact made the molecule that we set out to make.”

Uranium exists in natural abundance in the ground in the form of ores. The material used by Andrews in his experiments is a relatively stable, long-lived isotope, mostly U-238 with very little U-235, the “hot” uranium isotope used for fuel and weapons.

“I think it’s imperative for people to know more about uranium chemistry, particularly our policy makers,” Andrews said. “People need to realize that you can’t just dig up a shovelful of uranium ore and make a bomb from it. One has to go through a considerable amount of chemical process to win uranium from its ore. You have to refine the ore into metal and enrich the material in the hot isotope before it has uses as a nuclear material. This is highly complicated chemistry.”

Andrews’s colleagues include Jonathan T. Lyon, a recent Ph.D. graduate from U.Va. who performed experiments and calculations, and Han-Shi Hu and Jun Li, chemists at Tsinghua University in Beijing who performed additional theoretical calculations to describe this new molecule.

Adapted from materials provided by University of Virginia.

New Metal Alloys Boost High-temperature Heat Treatment Of Jet Engine Components

ScienceDaily (Jul. 27, 2007) — Measurement scientists at the National Physical Laboratory have reduced the uncertainty of thermocouple temperature sensors at high temperatures to within a degree. This may allow manufacturers to improve efficiency and reduce wastage in the quest for more efficient jet engines and lower aircraft emissions.
Aircraft engines are more efficient at higher temperatures, but this requires thermal treatment of engine components at very specific high temperatures in excess of 1300 °C. If the heat treatment temperature deviates too much from the optimal temperature, the treatment may be inadequate.

Thermocouples are calibrated using materials with known melting points (fixed points), but the available reference materials in the region of the very high temperatures required to treat jet engine components have a large uncertainty compared with the lower temperature fixed points.
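
In outline, fixed-point calibration pins the sensor's reading-to-temperature curve at known melting points and corrects readings in between by interpolation. A minimal sketch, with invented calibration numbers purely to show the mechanics:

```python
# Sketch of fixed-point calibration: known melting points anchor the
# thermocouple's reading-to-temperature curve, and readings in between
# are corrected by interpolation. All numbers are invented placeholders.

# (indicated_temperature_C, true_fixed_point_C) pairs from calibration
FIXED_POINTS = [(1064.0, 1064.18), (1328.0, 1330.0), (1488.0, 1492.0)]

def corrected(indicated_c: float) -> float:
    """Piecewise-linear correction between the nearest two fixed points."""
    pts = sorted(FIXED_POINTS)
    for (x0, y0), (x1, y1) in zip(pts, pts[1:]):
        if x0 <= indicated_c <= x1:
            t = (indicated_c - x0) / (x1 - x0)
            return y0 + t * (y1 - y0)
    raise ValueError("reading outside calibrated range")

print(corrected(1400.0))  # interpolated true temperature, ~1402.9 C
```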

Using a new type of metal alloy, National Physical Laboratory scientists have identified a range of reference points for thermocouples beyond 1100 °C. With this added confidence in thermal sensors, component manufacturers are expected to start improving hotter thermal treatments and reducing wastage during production of parts for engines which can run at higher temperatures.

Adapted from materials provided by National Physical Laboratory.

Wireless Sensors To Monitor Bearings In Jet Engines Developed

ScienceDaily (Nov. 5, 2007) — Researchers at Purdue University, working with the U.S. Air Force, have developed tiny wireless sensors resilient enough to survive the harsh conditions inside jet engines to detect when critical bearings are close to failing and prevent breakdowns.

The devices are an example of an emerging technology known as "micro electromechanical systems," or MEMS, which are machines that combine electronic and mechanical components on a microscopic scale.

"The MEMS technology is critical because it needs to be small enough that it doesn't interfere with the performance of the bearing itself," said Farshid Sadeghi, a professor of mechanical engineering. "And the other issue is that it needs to be able to withstand extreme heat."

The engine bearings must function amid temperatures of about 300 degrees Celsius, or 572 degrees Fahrenheit.

The researchers have shown that the new sensors can detect impending temperature-induced bearing failure significantly earlier than conventional sensors.

"This kind of advance warning is critical so that you can shut down the engine before it fails," said Dimitrios Peroulis, an assistant professor of electrical and computer engineering.

Findings will be detailed in a research paper to be presented on Tuesday (Oct. 30) during the IEEE Sensors 2007 conference in Atlanta, sponsored by the Institute of Electrical and Electronics Engineers. The paper was written by electrical and computer engineering graduate student Andrew Kovacs, Peroulis and Sadeghi.

The sensors could be in use in a few years in military aircraft such as fighter jets and helicopters. The technology also has potential applications in commercial products, including aircraft and cars.

"Anything that has an engine could benefit through MEMS sensors by keeping track of vital bearings," Peroulis said. "This is going to be the first time that a MEMS component will be made to work in such a harsh environment. It is high temperature, messy, oil is everywhere, and you have high rotational speeds, which subject hardware to extreme stresses."

The work is an extension of Sadeghi's previous research aimed at developing electronic sensors to measure the temperature inside critical bearings in communications satellites.

"This is a major issue for aerospace applications, including bearings in satellite attitude control wheels to keep the satellites in position," Sadeghi said.

The wheels are supported by two bearings. If mission controllers knew the bearings were going bad on a specific unit, they could turn it off and switch to a backup.

"What happens, however, is that you don't get any indication of a bearing's imminent failure, and all of a sudden the gyro stops, causing the satellite to shoot out of orbit," Sadeghi said. "It can take a lot of effort and fuel to try to bring it back to the proper orbit, and many times these efforts fail."

The Purdue researchers received a grant from the U.S. Air Force in 2006 to extend the work for high-temperature applications in jet engines.

"Current sensor technology can withstand temperatures of up to about 210 degrees Celsius, and the military wants to extend that to about 300 degrees Celsius," Sadeghi said. "At the same time, we will need to further miniaturize the size."

The new MEMS sensors provide early detection of impending failure by directly monitoring the temperature of engine bearings, whereas conventional sensors work indirectly by monitoring the temperature of engine oil, yielding less specific data.

The MEMS devices will not require batteries and will transmit temperature data wirelessly.

"This type of system uses a method we call telemetry because the devices transmit signals without wires, and we power the circuitry remotely, eliminating the need for batteries, which do not perform well in high temperatures," Peroulis said.

Power will be provided using a technique called inductive coupling, which uses coils of wire to generate current.

"The major innovation will be the miniaturization and design of the MEMS device, allowing us to install it without disturbing the bearing itself," Peroulis said.

Data from the onboard devices will not only indicate whether a bearing is about to fail but also how long it is likely to last before it fails, Peroulis said.

The research is based at the Birck Nanotechnology Center in Purdue's Discovery Park and at Sadeghi's mechanical engineering laboratory.

Adapted from materials provided by Purdue University.

Could Nuclear Power Be The Answer To Fresh Water?

ScienceDaily (Nov. 20, 2007) — Scientists are working on new solutions to the ancient problem of maintaining a fresh water supply. With predictions that more than 3.5 billion people will live in areas facing severe water shortages by the year 2025, the challenge is to find an environmentally benign way to remove salt from seawater.
Global climate change, desertification, and over-population are already taking their toll on fresh water supplies. In coming years, fresh water could become a rare and expensive commodity. Research results presented at the Trombay Symposium on Desalination and Water Reuse offer a new perspective on desalination and describe alternatives to the current expensive and inefficient methods.

Pradip Tewari of the Desalination Division at Bhabha Atomic Research Centre in Mumbai, India, discusses the increasing demand for water in India, driven not only by a growing population but also by rapid agricultural and industrial expansion. He suggests that a holistic approach is needed to cope with freshwater needs, which primarily include seawater desalination in coastal areas and brackish water desalination, as well as rainwater harvesting, particularly during the monsoon season. "The contribution of seawater and brackish water desalination would play an important role in augmenting the freshwater needs of the country."

Meenakshi Jain of CDM & Environmental Services and Positive Climate Care Pvt Ltd in Jaipur highlights the energy problem facing regions with little fresh water. "Desalination is an energy-intensive process. Over the long term, desalination with fossil energy sources would not be compatible with sustainable development; fossil fuel reserves are finite and must be conserved for other essential uses, whereas demands for desalted water would continue to increase."

Jain emphasizes that a sustainable, non-polluting solution to water shortages is essential. Renewable energy sources, such as wind, solar, and wave power, may be used in conjunction to generate electricity and to carry out desalination, which could have a significant impact on reducing potential increased greenhouse gas emissions. "Nuclear energy seawater desalination has a tremendous potential for the production of freshwater," Jain adds.

The development of a floating nuclear plant is one of the more surprising solutions to the desalination problem. S.S. Verma of the Department of Physics at SLIET in Punjab points out that small floating nuclear power plants (FNPPs) represent a way to produce electrical energy with minimal environmental pollution and greenhouse gas emissions. Such plants could be sited offshore anywhere there is a dense coastal population, and could not only provide cheap electricity but also power a desalination plant with their excess heat. "Companies are already in the process of developing a special desalination platform for attachment to FNPPs, helping the reactor to desalinate seawater," Verma points out.

A. Raha and colleagues at the Desalination Division of the Bhabha Atomic Research Centre, in Trombay, point out that Low-Temperature Evaporation (LTE) desalination technology, which utilizes low-quality waste heat in the form of hot water (as low as 50 degrees Celsius) or low-pressure steam from a nuclear power plant, has been developed to produce high-purity water directly from seawater. Safety, reliability and viable economics have already been demonstrated. BARC itself has recently commissioned a 50-tons-per-day low-temperature desalination plant.

Co-editor of the journal*, B.M. Misra, formerly head of BARC, suggests that solar, wind, and wave power, while seemingly cost-effective approaches to desalination, are not viable for the kind of large-scale fresh water production that an increasingly industrial and growing population needs.

India already has plans for the rapid expansion of its nuclear power industry. Misra suggests that large-scale desalination plants could readily be incorporated into those plans. "The development of advanced reactors providing heat for hydrogen production and large amounts of waste heat will catalyze large-scale seawater desalination for the economic production of fresh water," he says.

*This research is published in the International Journal of Nuclear Desalination.

Adapted from materials provided by Inderscience Publishers.

Nokia N95 - Sophisticated handsets with specialized capabilities


Nokia N Series mobile phones are a series of sophisticated handsets with specialized capabilities in different spheres. Great looks, innovative features and ease of use characterise all the phones from the Nokia N Series. The Nokia N95 is the most recent mobile phone from the series; with its stunning looks and wide spectrum of user-friendly features, it does not disappoint.

Nokia N95 is a third generation (3G) smart phone that impresses one and all. The light weight and small profile of the handset ensures that users do not have any problem in carrying the handset from one place to another. The Nokia N95 weighs only 120 gm and measures 99x53x21 mm.

The multimedia options of the Nokia N95 are equally impressive. An integrated 5 megapixel digital camera can be used to capture shots with impressive image quality. An FM radio feature means that users can listen to programs and music from their favorite stations. An MP3 player is built into the design of the Nokia N95, perfect for listening to music on the move. One could further personalise the music experience by using a set of headphones. Internet access is possible with the Nokia N95; the incorporation of EDGE and WAP technology ensures that users can browse the Internet as well as check their e-mails and messages. Moreover, the 160 MB of internal memory can be expanded up to 2 GB.

Users who are interested in acquiring the Nokia N95 mobile phone can now easily do so. There are a number of innovative deals and offers in the market from leading service providers and network operators in different parts of the world. In the UK, for instance, a person can take advantage of contract mobile phone deals on the Nokia N95 and get to own this sophisticated handset at a reduced cost.
About the Author

Adam Caitlin is an expert author on the telecommunications industry.


The Nokia 6131 - Top Class Handset!

The Nokia 6131 mobile phone, which comes in an attractive and stunning casing, also has facilities like a digital camera and Bluetooth technology that are of great benefit to the user. It is also possible for the user to browse the internet with the handset.

The latest Nokia 6131 handset is a comfortable, slim mobile phone with a fold-opening mechanism that makes it all the more attractive; the flip can be opened with one touch, a very smooth experience. The stylish, compact handset is available in four colours: black, white, red and sand. With a soft-touch paint finish, it weighs 112 grams, making it lightweight and easy to hold. At 92 x 48 x 20 mm, it is comfortable to hold in the hand and to slip into a pocket. The attractive features of the phone are the dual colour screens, the internal one a beautiful TFT QVGA 16.7-million-colour display (320 x 240 pixels).

The Nokia 6131 has a built-in 1.3-megapixel digital camera with 8x digital zoom and a quick-access camera button that assures the user of perfect pictures with excellent quality. The phone has 32 MB of memory, which lets the user save many photos.

The handset's quad-band support provides network coverage on up to five continents, and the phone can switch automatically between network bands. Bluetooth wireless technology gives the user a wireless connection to Bluetooth-compatible headsets, laptops, computers and PDAs. The Nokia 6131 also has built-in EDGE technology that helps the user enjoy a fast and effective experience when browsing the Internet and downloading. The phone comes with great messaging options, including email, text messaging (SMS), Xpress audio messaging, instant messaging and multimedia messaging (MMS).


Biodiesel Production Facility

There are many companies around the world involved in the production of biodiesel. The privately owned Biodiesel Company is one of them, with many years of experience in sales and distribution. Based in Toronto, Canada, The Biodiesel Company collects waste cooking oil, non-vegetable oils and related feedstocks, renews them, and processes them into biofuel.

When they are finished, the product is marketed. The company is always looking for new joint ventures and wants to expand beyond the Toronto area. It is currently working with researchers on new biodiesel production technology so that only the best quality may be available for sale on the market.

Grease Brothers make their biodiesel from vegetable oil, converting it into fuel. The Ultimate Biodiesel Guide is a company that sells instructions on how to make biodiesel so that you can heat your home. Tree Hugger is a new biodiesel company that produces biodiesel from algae, which has been found to be a more efficient way to make fuel: algae can produce 30 times more oil per acre than the crops companies currently use.

Other biodiesel companies include Agra Biofuels, which produces over 3 million gallons of biodiesel every year, and Bently Biofuels, which produces biodiesel from seed oils and restaurant grease. Biodiesel production companies are a new answer to rising fuel prices. The fuel can be used in any diesel engine without having to convert it, which makes it quick to integrate.

The majority of biodiesel production companies use canola oil. Most companies ensure that their biodiesel meets regulatory specifications so that it can be certified for use in engines. Although there are many companies, most use the same materials to produce biodiesel. These companies are a boon to the environment.
About the Author

If you want more information on biodiesel production companies, please visit our website: http://biodieselcorner.com

Saturday, November 17, 2007

How to Choose the Right Alarm Monitoring for Home Security System

There are many benefits to be gained from having a home security system that is monitored 24/7. Below we will take a look at just what alarm monitoring for a home security system is and how it can benefit you as a home owner.

Once you start looking into alarm monitoring services you will be amazed at the number of different security companies that now offer this facility to homeowners. Companies such as ADT are probably one of the most well recognized of all home monitoring services now available. But just how does such a system work?

First off, the alarm monitoring company will place within your security system a transmitter that sends a signal back to the center where all monitoring is done. Once such a signal is received, meaning the alarm system has been triggered, one of the monitors will call your home to ensure that an emergency is really happening. If they get no answer at the house, they will alert the police or sheriff to take the proper course of action. If the service you chose has this option, the monitoring service will also call the numbers on an emergency contact list.
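
In outline, that escalation sequence looks something like the sketch below; this is a generic illustration, not any particular provider's dispatch procedure:

```python
# Generic sketch of a monitoring center's escalation flow when an alarm
# signal arrives -- illustrative only, not any provider's actual procedure.

def handle_alarm(homeowner_confirms_ok, dispatch_authorities, emergency_contacts):
    """Each argument is a callable supplied by the monitoring system."""
    if homeowner_confirms_ok():     # call the house first
        return "false alarm - logged"
    dispatch_authorities()          # no answer: alert police/sheriff
    for call_contact in emergency_contacts:
        call_contact()              # then work through the contact list
    return "authorities dispatched"

# Example wiring with stand-in callables:
result = handle_alarm(lambda: False,
                      lambda: print("alerting police/sheriff"),
                      [lambda: print("calling emergency contact")])
print(result)
```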

In addition to basic alarm monitoring for your home security system, you can add other systems to your monitoring service. These could include a smoke detection system and a medical or panic alarm system. If these are activated, the monitoring service company will also contact the proper authorities for that emergency.

When it comes to you finding the right kind of alarm monitoring service for your home, you should carry out as much research as possible on each one that you are considering. Below we provide you with some tips which will help you to decide which one may be the best option for you.

Firstly you should inquire as to how long the company has been in business. Certainly the less time they have been in business the less likely they will be able to provide you with the kind of service you want. It is better to go with companies that are well established and have been providing such a service for a number of years.

Additionally, make sure the home security system monitoring company you pick will be providing service every day of the year. Ask them what backup systems they have in place if there is a power outage or something else should happen. You will find the reputable companies have generators as backup or some other type of UPS system.

Lastly check to see if the alarm monitoring for home security system service has insurance in place. Check to see if they are using Underwriters Laboratory (UL) certified equipment in the burglar alarm system or other security system they install. And see if they are UL certified themselves.


For many articles on burglar alarms for your family's safety and details on alarm monitoring for home security systems, feel free to check out my site at www.burglaralarmnotes.com

Looking For Cheap Plasma TV?

If you're looking for a great television to buy, you should check out what you can get from plasma televisions. It's a good idea to get something that is worth your money and time. This is something you may want to look into.

With so many different brands and models of plasma televisions out there in the market, it is not easy to determine which one is best for you. You need to consider several factors before you can decide on which brand and model is the one you want to buy.

You'll definitely need to shop around for the best deals available. With so many different options available to you, you can afford to be fussy about which one you want to choose.

Imagine the kind of things you want your plasma television to have. Look at all the different features that you can choose from that's available in the market. You'll have a great time arranging your options as you look at the many creative options available to you.

It is most important to get the best deal available for your money. Be careful when you're looking for a cheap plasma TV, just like shopping for any other things. Look for great deals that will work within whatever budget you may have.

Be sure that you're doing whatever you can to carefully check out what's available to you in the market. Find a product that you're comfortable with that meets your budget. You must get something that's worth the money you're going to spend. You must ensure that you'll spend the time necessary to look at all your available options. Something that looks good on paper may not necessarily be that great after all.

Be a little cautious when you're selecting the television set you're going to buy. Check out the brand to see if any current problems have been reported or any recalls have been issued that you may need to know about. Different televisions offer somewhat different features. You may want to get all the newest features that are offered.

Set your budget and then take your time searching around for the right type of television for you. You can look online and in the different stores that are out there. You will want to take the time to see what is on clearance. You may never know what you will get out there on sale. You may just get the deal of a lifetime if you shop around at the right places.


Author Hovan Newton is a Flat Panel TV enthusiast. Grab a free report on Flat Panel TV Set Up Tips from his website. www.lcd-plasmatv.com

Ethanol, schmethanol

Sep 27th 2007 | EMERYVILLE, REDWOOD CITY AND SAN CARLOS, CALIFORNIA
From The Economist print edition


Everyone seems to think that ethanol is a good way to make cars greener. Everyone is wrong



SOMETIMES you do things simply because you know how to. People have known how to make ethanol since the dawn of civilisation, if not before. Take some sugary liquid. Add yeast. Wait. They have also known for a thousand years how to get that ethanol out of the formerly sugary liquid and into a more or less pure form. You heat it up, catch the vapour that emanates, and cool that vapour down until it liquefies.

The result burns. And when Henry Ford was experimenting with car engines a century ago, he tried ethanol out as a fuel. But he rejected it—and for good reason. The amount of heat you get from burning a litre of ethanol is a third less than that from a litre of petrol. What is more, it absorbs water from the atmosphere. Unless it is mixed with some other fuel, such as petrol, the result is corrosion that can wreck an engine's seals in a couple of years. So why is ethanol suddenly back in fashion? That is the question many biotechnologists in America have recently asked themselves.

The obvious answer is that, being derived from plants, ethanol is “green”. The carbon dioxide produced by burning it was recently in the atmosphere. Putting that CO2 back into the air can therefore have no adverse effect on the climate. But although that is true, the real reason ethanol has become the preferred green substitute for petrol is that people know how to make it—that, and the subsidies now available to America's maize farmers to produce the necessary feedstock. Yet such things do not stop ethanol from being a lousy fuel. To solve that, the biotechnologists argue, you need to make a better fuel that is equally green. Which is what they are trying to do.
Designer petrol

The first step on the road has been butanol. This is also a type of alcohol that can be made by fermenting sugar (though the fermentation is done by a species of bacterium rather than by yeast), and it has some advantages over ethanol. It has more carbon atoms in its molecules (four, instead of two), which means more energy per litre—though it is still only 85% as rich as petrol. It also has a lower tendency to absorb water from the atmosphere.
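
In rough numbers (approximate volumetric energy densities from the general literature; exact figures vary by blend and measurement convention):

```python
# Approximate lower heating values per litre (MJ/L); rough literature
# values that vary with fuel blend and measurement convention.
ENERGY_MJ_PER_L = {"petrol": 32.0, "butanol": 27.0, "ethanol": 21.0}

for fuel, e in ENERGY_MJ_PER_L.items():
    print(f"{fuel}: {e / ENERGY_MJ_PER_L['petrol']:.0%} of petrol's energy per litre")
# petrol: 100%, butanol: ~84%, ethanol: ~66% -- i.e. "a third less"
```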

A joint venture between DuPont, a large American chemical company, and BP, a British energy firm, has worked out how to industrialise the process of making biobutanol, as the chemical is commonly known when it is the product of fermentation. Although BP plans to start selling the stuff in the next few weeks (mixed with petrol, to start with), the truth is that butanol is not all that much better than ethanol. The interesting activity is elsewhere.

One route might be to go for yet-larger (and thus energy-richer) alcohol molecules. Any simple alcohol is composed of a number of carbon and hydrogen atoms (like a hydrocarbon such as petrol) together with a single oxygen atom. In practice, this game of topping up the carbon content to make a better fuel stops with octanol (eight carbon atoms) as anything bigger tends to freeze at temperatures that might be encountered in winter. But living things are familiar with alcohols. Their enzymes are geared up to cope with them. This makes the biotechnologists' task that much easier.

The idea of engineering enzymes to make octanol was what first brought Codexis, a small biotechnology firm based in Redwood City, California, into the field. Codexis's technology works with pharmaceutical precision—indeed, one of its main commercial products is the enzyme system for making the chemical precursor to Lipitor, a cholesterol-lowering drug that is marketed by Pfizer. Codexis controls most of the important patents for what is known as molecular evolution. This designs enzymes in the way that normal evolution designs organisms. It creates lots of variations on a theme, throws away the ones it does not want, and shuffles the rest in a process akin to sex. It then repeats the process on the survivors until something useful emerges—though, unlike natural evolution, there is a bit of intelligent design in the process, too. The result, according to Codexis's boss, Alan Shaw, is enzymes that can perform chemical transformations unknown in nature.
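
In outline, that loop is the classic generate-select-recombine cycle. The toy string-based version below is only meant to make the loop concrete; real molecular evolution screens physical enzyme libraries, with a laboratory assay in place of the fitness function coded here:

```python
import random

# Toy directed-evolution loop: mutate, recombine ("shuffle"), select the best.
# The string "fitness" is a stand-in for a laboratory assay of enzyme activity.

TARGET = "ACTIVESITE"
ALPHABET = "ABCDEFGHIJKLMNOPQRSTUVWXYZ"

def fitness(seq):                     # higher = closer to the desired enzyme
    return sum(a == b for a, b in zip(seq, TARGET))

def mutate(seq, rate=0.1):
    return "".join(random.choice(ALPHABET) if random.random() < rate else c
                   for c in seq)

def recombine(a, b):                  # crude analogue of DNA shuffling
    cut = random.randrange(len(a))
    return a[:cut] + b[cut:]

pop = ["".join(random.choice(ALPHABET) for _ in TARGET) for _ in range(50)]
for generation in range(100):
    pop.sort(key=fitness, reverse=True)
    survivors = pop[:10]              # throw away the variants you don't want
    pop = survivors + [mutate(recombine(*random.sample(survivors, 2)))
                       for _ in range(40)]
print(max(pop, key=fitness))          # converges toward TARGET over the rounds
```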

Dr Shaw, however, is no longer so interested in octanol as a biofuel. Like two other, nearby firms, he is now focusing Codexis's attention on molecules even more chemically similar to petrol. The twist that Codexis brings is that unlike petrol, of which each batch from the refinery is chemically different from the others (because the crude oil from which it is derived is an arbitrary mixture of hydrocarbon molecules), biopetrol could be turned out exactly the same, again and again, and thus designed to have the optimal mixture of properties required of a motor fuel.

Exactly which molecules Codexis is most interested in these days, Dr Shaw is not yet willing to say. But Amyris Biotechnologies, which is also based in California, in Emeryville, and which also started by dabbling in drugs (in its case an antimalarial medicine called artemisinin), is slightly more forthcoming. Under the guidance of its founder Jay Keasling, it has been working on a type of isoprenoid (a class of chemicals that includes rubber).

Unlike Codexis, which deals in purified enzymes, Amyris employs a technique called synthetic biology, which turns living organisms into chemical reactors by assembling novel biochemical pathways within them. Dr Keasling and his colleagues scour the world for suitable enzymes, tweak them to make them work better, then sew the genes for the tweaked enzymes into a bacterium that thus turns out the desired product. That was how they produced artemisinin, which is also an isoprenoid.

Isoprenoids have the advantage that, like alcohols, they are part of the natural biochemistry of many organisms. Enzymes to handle them are thus easy to come by. They have the additional advantage that some are pure hydrocarbons, like petrol. With a little judicious searching, Amyris thinks it has come up with isoprenoids that have the right characteristics to substitute for petrol.

The third Californian firm in the business, LS9 of San Carlos, is cutting to the chase. If petrol is what is wanted, petrol is what will be delivered. And diesel, too, although in this case the product is actually biodiesel, which is in some ways superior to the petroleum-based stuff.

LS9 also uses synthetic biology, but it has concentrated on controlling the pathways that make fatty acids. Like alcohols, fatty acids are molecules that have lots of hydrogen and carbon atoms, and a small amount of oxygen (in their case two oxygen atoms, rather than one). Plant oils consist of fatty acids combined with glycerol—and these fatty acids (for example, those from palm oil) are the main raw material for the biodiesel already sold today.

LS9 has used its technology to turn microbes into factories for fatty acids containing between eight and 20 carbon atoms—the optimal number for biodiesel. But it also plans to make what it calls “biocrude”. In this case the fatty acids would have 18-30 carbon atoms, and the final stage of the synthetic pathway would clip off the oxygen atoms to create pure hydrocarbons. This biocrude could be fed directly into existing oil refineries, without any need to modify them.

These firms, however, have one other competitor. His name is Craig Venter. Dr Venter, a veteran of biotechnological scraps ranging from gene patenting to the private human-genome project, has been interested in bioenergy for a long time. To start with, it was hydrogen that caught his eye, then methane—both of which are natural bacterial products. But now that eye is shifting towards liquid fuels. His company, modestly named Synthetic Genomics (and based, unlike the others, on the East Coast of America, in Rockville, Maryland), is reluctant to discuss details, but Dr Venter, too, is taken with the pharmaceutical analogy. Indeed, he goes as far as to posit the idea of clinical trials for biofuels—presumably pitting one against another, perhaps with petroleum-based products acting as the control, and without the drivers knowing which was which.

Whether biofuels will ever be competitive with fossil fuels remains to be seen. That will depend on a mixture of economics and politics. But the political rush to back ethanol, just because it is green and people have heard of it, is a mistake. Let a thousand flowers bloom, and see which one wins Dr Venter's Grand Prix.

Home Solar Power Kits Save Money


When you want to install a solar power kit at your home, you should take several parameters into consideration. First decide what benefit you want the system to deliver: powering your home or your camp house, serving as an additional power source to reduce your utility bills, or acting as a backup source when the local grid goes down. Whatever the goal, pick the quality solar power kit that best meets those needs while being easy to install and simple to use.

Whichever solar power kit you choose should contain a photovoltaic collector unit, also known as a PV solar panel. The photovoltaic solar panel collects sunlight and turns it into electric power. The panel should be set so that it faces the sun as directly as possible, to harvest the maximum sunlight per day. The kit will also contain mounting hardware for the panels; make sure the mounts can be adjusted so the panel sits at the right angle toward the sun.

A home solar power kit also includes a solar charge controller and a battery that stores the electricity coming from the solar panels. The charge controller is important for protecting your batteries: batteries have a limited storage capacity, while the panels produce energy for as long as the sun shines, so you must make sure the batteries will not be overcharged. Select a solar charge controller that matches the voltage of your batteries.

Every solar power kit states how many watts it can produce. Work out how many watt-hours per day you actually need, and it will be much easier to pick the kit that matches best.
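As a rough guide to that matching step, here is a minimal sizing sketch in Python; the daily consumption, sun hours, and loss figures are assumptions for illustration, not values from any particular kit.

```python
# Rough solar-kit sizing sketch (all numbers are illustrative assumptions).
daily_need_wh = 1500      # assumed daily consumption, watt-hours
peak_sun_hours = 4.5      # assumed full-sun-equivalent hours per day
system_losses = 0.75      # assumed derating for wiring, charging, inverter

required_panel_watts = daily_need_wh / (peak_sun_hours * system_losses)
print(f"Panel capacity needed: about {required_panel_watts:.0f} W")
# -> about 444 W for this example household
```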

Most solar power kits are simple to install and arrive with step-by-step instructions that are very easy to follow. By installing the solar power kit yourself, you can save literally thousands of dollars on an installation contractor, so the return on investment comes much sooner; it can be as fast as three years.
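A simple payback sketch, again with assumed numbers, shows where a figure like three years can come from.

```python
# Simple payback sketch (hardware cost and savings are assumptions).
kit_cost = 3000.0          # assumed cost of a self-installed kit, dollars
annual_savings = 1000.0    # assumed yearly reduction in utility bills

print(f"Simple payback: {kit_cost / annual_savings:.1f} years")  # 3.0 years
```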

Renewable & Alternative Energy Resources: http://www.alternativeenergybase.com

Tuesday, November 13, 2007

New Method Converts Organic Matter To Hydrogen Fuel Easily And Efficiently

ScienceDaily (Nov. 13, 2007) — Hydrogen as an everyday, environmentally friendly fuel source may be closer than we think, according to Penn State researchers.

"The energy focus is currently on ethanol as a fuel, but economical ethanol from cellulose is 10 years down the road," says Bruce E. Logan, the Kappe professor of environmental engineering. "First you need to break cellulose down to sugars and then bacteria can convert them to ethanol."

Logan and Shaoan Cheng, research associate, suggest a method based on microbial fuel cells to convert cellulose and other biodegradable organic materials directly into hydrogen.

The researchers used naturally occurring bacteria in a microbial electrolysis cell with acetic acid -- the acid found in vinegar. Acetic acid is also the predominant acid produced by fermentation of glucose or cellulose. The anode was granulated graphite, the cathode was carbon with a platinum catalyst, and they used an off-the-shelf anion exchange membrane. The bacteria consume the acetic acid and release electrons and protons creating up to 0.3 volts. When more than 0.2 volts are added from an outside source, hydrogen gas bubbles up from the liquid.

"This process produces 288 percent more energy in hydrogen than the electrical energy that is added to the process," says Logan.

Water electrolysis, a standard method for producing hydrogen, is only 50 to 70 percent efficient. Even if the microbial electrolysis cell process is set up to bleed off some of the hydrogen to produce the added energy boost needed to sustain hydrogen production, the process still creates 144 percent more available energy than the electrical energy used to produce it.
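To make the bookkeeping concrete, here is a minimal sketch of one way to read those ratios; the 50 percent hydrogen-to-electricity conversion figure is an assumption for illustration, not a number from the article.

```python
# Energy bookkeeping sketch for the ratios quoted above (illustrative).
electrical_in = 1.0              # normalize the added electrical energy
h2_out = 2.88 * electrical_in    # reading "288 percent" as hydrogen energy
                                 # ~2.88x the electricity added; the balance
                                 # comes from the acetic acid itself

electrolysis_out = 0.6 * electrical_in   # plain electrolysis at 50-70%
print(h2_out / electrolysis_out)         # ~4.8x more hydrogen energy per
                                         # unit of electricity

# If that hydrogen were converted back to electricity at an assumed ~50%
# efficiency to sustain the process, the net return would be about
# 2.88 * 0.5 = 1.44 -- consistent with the 144 percent figure quoted.
print(h2_out * 0.5)              # 1.44
```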

For those who think that a hydrogen economy is far in the future, Logan suggests that hydrogen produced from cellulose and other renewable organic materials could be blended with natural gas for use in natural gas vehicles.

"We drive a lot of vehicles on natural gas already. Natural gas is essentially methane," says Logan. "Methane burns fairly cleanly, but if we add hydrogen, it burns even more cleanly and works fine in existing natural gas combustion vehicles."

Measured against the electrical energy added and the chemical energy in a variety of organic substances, hydrogen-production efficiencies range between 63 and 82 percent. Both lactic acid and acetic acid achieve 82 percent, while unpretreated cellulose is 63 percent efficient. Glucose is 64 percent efficient.

Another potential use for microbial-electrolysis-cell produced hydrogen is in fertilizer manufacture. Currently fertilizer is produced in large factories and trucked to farms. With microbial electrolysis cells, very large farms or farm cooperatives could produce hydrogen from wood chips and then, through a common industrial process, use the nitrogen in the air to produce ammonia or nitric acid. Both can be used directly as fertilizer, or the ammonia could be used to make ammonium nitrate, sulfate or phosphate.

This research is published in the Nov. 12 issue of the Proceedings of the National Academy of Sciences online.

The researchers have filed for a patent on this work. Air Products and Chemicals, Inc. and the National Science Foundation supported this work.

Adapted from materials provided by Penn State.

Microbes Plus Sugars Equals Hydrogen Fuel?

ScienceDaily (Nov. 3, 2007) — Wanted: Bacterium that can eat sugar or sludge; must be a team player and electrochemically active; ability to survive without oxygen, a plus. Thus might read the bacterial "job description" posted by Agricultural Research Service (ARS) and Washington University (WU) scientists, who are collaborating on ways to make microbial fuel cells more efficient and practical.

According to Mike Cotta, who leads the ARS Fermentation Biotechnology Research Unit, Peoria, Ill., the project with WU arose from a mutual interest in developing sustainable methods of producing energy that could diminish U.S. reliance on crude oil.

Cotta's team specializes in using bacteria, yeasts or other microorganisms inside bioreactors to do work, such as ferment grain sugars into fuel ethanol. At WU in St. Louis, Mo., assistant professor Lars Angenent is investigating fuel cell systems that use mixtures of bacteria to treat organic wastewater and catalyze the release of electrons and protons, which then can be used to produce electricity or hydrogen fuel.

In September 2006, the researchers pooled their labs' resources and expertise to undertake a three-year cooperative project. One resource they'll share is the ARS Peoria-based Microbial Culture Collection, which houses about 87,000 accessions of freeze-dried microbes from around the world.

Using the collection's database information, the team is searching for microbes that "eat" biomass sugars (e.g., glucose and xylose from corn stover) and are electrochemically active, meaning they can transfer the electrons they extract from sugars in the fuel cell without help from costly chemicals called mediators. The electrons, after traveling a circuit, combine with protons in a cathode chamber, forming hydrogen, which can be burned or converted into electricity.

Bacteroides and Shewanella are among bacteria species used to start the process.

Hydrogen's appeal stems from its natural abundance and capacity to store and release energy in a nonpolluting manner. The challenge is commercially producing it from sources other than fossil fuels, which are in limited supply and nonrenewable. About 95 percent of U.S. hydrogen comes from petroleum or natural gas via a process called steam reforming.

Adapted from materials provided by US Department of Agriculture.

Line Between Quantum And Classical Worlds Is At Scale Of Hydrogen Molecule

ScienceDaily (Nov. 12, 2007) — The big world of classical physics mostly seems sensible: waves are waves and particles are particles, and the moon rises whether anyone watches or not. The tiny quantum world is different: particles are waves (and vice versa), and quantum systems remain in a state of multiple possibilities until they are measured — which amounts to an intrusion by an observer from the big world — and forced to choose: the exact position or momentum of an electron, say.

On what scale do the quantum world and the classical world begin to cross into each other? How big does an "observer" have to be? It's a long-argued question of fundamental scientific interest and practical importance as well, with significant implications for attempts to build solid-state quantum computers.

Researchers at the Department of Energy's Lawrence Berkeley National Laboratory and their collaborators at the University of Frankfurt, Germany; Kansas State University; and Auburn University have now established that quantum particles start behaving in a classical way on a scale as small as a single hydrogen molecule. They reached this conclusion after performing what they call the world's simplest — and certainly its smallest — double slit experiment, using as their two "slits" the two proton nuclei of a hydrogen molecule, only 1.4 atomic units apart (less than a ten-billionth of a meter). Their results appear in the November 9, 2007 issue of Science.

Double slit experiment

"One of the most powerful ways to explore the quantum world is the double slit experiment," says Ali Belkacem of Berkeley Lab's Chemical Sciences Division, one of the research leaders. In its familiar form, the double slit experiment uses a single light source shining through two slits, side by side in an opaque screen; the light that passes through falls on a screen.

If either of the two slits is closed, the light going through the other slit forms a bright bar on the screen, striking the screen like a stream of BBs or Ping-Pong balls or other solid particles. But if both slits are open, the beams overlap to form interference fringes, just as waves in water do, with bright bands where the wavecrests reinforce one another and dark bands where they cancel.

So is light particles or waves? The ambiguous results of early double slit experiments (the first on record was in 1801) were not resolved until well into the 20th century, when it became clear from both experiment and the theory of quantum mechanics that light is both waves and particles — moreover, that particles, including electrons, also have a wave nature.

"It's the wave nature of electrons that allows them to act in a correlated way in a hydrogen molecule," says Thorsten Weber of the Chemical Sciences Division, another of the experiment's leading researchers. "When two particles are part of the same quantum system, their interactions are not restricted to electromagnetism, for example, or gravity. They also possess quantum coherence — they share information about their states nonlocally, even when separated by arbitrary distances."

Correlation between its two electrons is actually what makes double photoionization possible with a hydrogen molecule. Photoionization means that an energetic photon, in this case an x-ray, knocks an electron out of an atom or molecule, leaving the system with net charge (ionized); in double photoionization a single photon triggers the emission of two electrons.

"The photon hits only one electron, but because they are correlated, because they cohere in the quantum sense, the electron that's hit flies off in one direction with a certain momentum, and the other electron also flies off at a specific angle to it with a different momentum," Weber explains.

The experimental set-up used by Belkacem and Weber and their colleagues, being movable, was employed on both beamlines 4.0 and 11.0 of Berkeley Lab's Advanced Light Source (ALS). In the apparatus a stream of hydrogen gas is sent through an interaction region, where some of the molecules are struck by an x-ray beam from the ALS. When the two negatively charged electrons are knocked out of a molecule, the two positively charged protons (the nuclei of the hydrogen atoms) blow themselves apart by mutual repulsion. An electric field in the experiment's interaction region separates the positively and negatively charged particles, sending the protons to one detector and the electrons to a detector in the opposite direction.

"It's what's called a kinematically complete experiment," Belkacem says, "one in which every particle is accounted for. We can determine the momentum of all the particles, the initial orientation and distance between the protons, and the momentum of the electrons."

What the simplest double slit experiment reveals

"At the high photon energies we used for photoionization, most of the time we observed one fast electron and one slow electron," says Weber. "What we were interested in was the interference patterns."

Considered as particles, the electrons fly off at an angle to one another that depends on their energy and how they scatter from the two hydrogen nuclei (the "double slit"). Considered as waves, an electron makes an interference pattern that can be seen by calculating the probability that the electron will be found at a given position relative to the orientation of the two nuclei.
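To get a feel for the scales involved, the textbook two-slit pattern can be evaluated with the de Broglie wavelength of a roughly 185 eV electron and the 1.4-atomic-unit proton separation. This is a schematic sketch of the analogy, not the molecular-frame analysis the paper actually performs.

```python
import numpy as np

# Textbook two-slit sketch: the two protons act as the "slits" for an
# electron wave with the de Broglie wavelength of a ~185 eV photoelectron.
h = 6.626e-34      # Planck constant, J*s
m_e = 9.109e-31    # electron mass, kg
eV = 1.602e-19     # joules per electron volt

E = 185 * eV                            # fast-electron kinetic energy
wavelength = h / np.sqrt(2 * m_e * E)   # de Broglie wavelength, ~0.9e-10 m
d = 1.4 * 0.529e-10                     # proton separation, ~0.74e-10 m

theta = np.linspace(-np.pi / 2, np.pi / 2, 181)   # emission angle
fringes = np.cos(np.pi * d * np.sin(theta) / wavelength) ** 2

print(f"lambda = {wavelength:.2e} m, lambda/d = {wavelength / d:.2f}")
# lambda/d ~ 1.2, so the fringes are broad, spanning wide emission angles
```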

The wave nature of the electron means that in a double slit experiment even a single electron is capable of interfering with itself. Double slit experiments with photoionized hydrogen molecules at first showed only the self-interference patterns of the fast electrons, their waves bouncing off both protons, with little action from the slow electrons.

"From these patterns, it might look like the slow electron is not important, that double photoionization is pretty unspectacular," says Weber. The fast electrons' energies were 185 to 190 eV (electron volts), while the slow electrons had energies of 5 eV or less. But what happens if the slow electron is given just a bit more energy, say somewhere between 5 and 25 eV? As Weber puts it, "What if we make the slow electron a little more active? What if we turn it into an 'observer?'"

As long as both electrons are isolated from their surroundings, quantum coherence prevails, as revealed by the fast electron's wavelike interference pattern. But this interference pattern disappears when the slow electron is made into an observer of the fast one, a stand-in for the larger environment: the quantum system of the fast electron now interacts with the wider world (e.g., its next neighboring particle, the slow electron) and begins to decohere. The system has entered the realm of classical physics.

Not completely, however. And here is what Belkacem calls "the meat of the experiment:" "Even when the interference pattern has disappeared, we can see that coherence is still there, hidden in the entanglement between the two electrons."

Although one electron has become entangled with its environment, the two electrons are still entangled with each other in a way that allows interference between them to be reconstructed, simply by graphing their correlated momenta from the angles at which the electrons were ejected. Two waveforms appear in the graph, either of which can be projected to show an interference pattern. But the two waveforms are out of phase with each other: viewed simultaneously, interference vanishes.
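Schematically, in textbook two-slit form (an illustrative idealization, not the paper's exact expressions), one projection shows fringes, the complementary projection shows the half-period-shifted set, and the unconditioned sum is featureless:

```latex
I_{+}(\theta) \propto \cos^{2}\!\left(\frac{\pi d \sin\theta}{\lambda}\right),
\qquad
I_{-}(\theta) \propto \sin^{2}\!\left(\frac{\pi d \sin\theta}{\lambda}\right),
\qquad
I_{+}(\theta) + I_{-}(\theta) = \text{const}.
```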

If the two-electron system is split into its subsystems and one (the "observer") is thought of as the environment of the other, it becomes evident that classical properties such as loss of coherence can emerge even when only four particles (two electrons, two protons) are involved. Yet because the two electron subsystems are entangled in a tractable way, their quantum coherence can be reconstructed. What Weber calls "the which-way information exchanged between the particles" persists.

Says Belkacem, "For researchers who are trying to build solid-state quantum computers this is both good news and bad news. The bad news is that decoherence and loss of information occur on the very tiny scale of a single hydrogen molecule. The good news is that, theoretically, the information isn't necessarily lost — or at least not completely."

"The Simplest Double Slit: Interference and Entanglement in Double Photoionization of H2," by D. Akoury, K. Kreidi, T. Jahnke, Th. Weber, A. Staudte, M. Schöffler, N. Neumann, J. Titze, L. Ph. H. Schmidt, A. Czasch, O. Jagutzki, R. A. Costa Fraga, R. E. Grisenti, R. Díez Muiño, N. A. Cherepkov, S. K. Semenov, P. Ranitovic, C. L. Cocke, T. Osipov, H. Adaniya, J. C. Thompson, M. H. Prior, A. Belkacem, A. L. Landers, H. Schmidt-Böcking, and R. Dörner, appears in the 9 November issue of Science.

Adapted from materials provided by DOE/Lawrence Berkeley National Laboratory.

Saturday, November 10, 2007

Using Biology To Create Electronics: DNA Used To Create Self-assembling Nano Transistor

ScienceDaily (Nov. 21, 2003) — NEW YORK, N.Y., and HAIFA, Israel, November 17, 2003 - Scientists at the Technion–Israel Institute of Technology have harnessed the power of DNA to create a self-assembling nanoscale transistor, the building block of electronics. The research, published in the Nov. 21, 2003 issue of Science, is a crucial step in the development of nanoscale devices.

Erez Braun, lead scientist on the project and associate professor in the Faculty of Physics at the Technion, says scientists have long been intrigued by the idea of using biology to build electronic transistors that assemble without human manipulation. However, until now, demonstrating it in the lab has remained elusive. "This paper shows you can start with DNA, proteins and molecular biology and construct an electronic device," he said.

"Erez Braun and his colleague Uri Sivan are some of the few pioneers in this field," said Horst Stormer, professor in Columbia University's Departments of Physics and Applied Physics and scientific director of the Nano Science and Engineering Centers. "This is outstanding research in the area that matters most in nano technology: self-assembly."

To get the transistors to self-assemble, the Technion research team attached a carbon nanotube -- known for its extraordinary electronic properties -- onto a specific site on a DNA strand, and then made metal nanowires out of DNA molecules at each end of the nanotube. The device is a transistor that can be switched on and off by applying voltage to it.

The carbon nanotubes used in the experiment are only one nanometer, or a billionth of a meter, across. In computing technology, as scientists reach the limits of working with silicon, carbon nanotubes are widely recognized as the next step in squeezing an increasing number of transistors onto a chip, vastly increasing computer speed and memory. Braun emphasized that computers are only one application; these transistors may, for example, enable the creation of any number of devices in future applications, such as tiny sensors to perform diagnostic tests in healthcare.

Though transistors made from carbon nanotubes have already been built, those required labor-intensive fabrication. The goal is to have these nanocircuits self-assemble, enabling large-scale manufacturing of nanoscale electronics.

DNA, according to Braun, is a natural place to look for a tool to create these circuits. "But while DNA by itself is a very good self-assembling building block, it doesn't conduct electrical current," he noted.

To overcome these challenges, the researchers manipulated strands of DNA, attaching a bacterial protein to one segment. They then added certain protein molecules to the test tube, along with protein-coated carbon nanotubes. These proteins naturally bond together, causing the carbon nanotube to bind to the DNA strand at the site of the bacterial protein.

Finally, they created tiny metal nanowires by coating the DNA molecules with gold. In this step, the bacterial protein served another purpose: it prevented the metal from coating the protein-covered segment of DNA, so the gold nanowires formed only at the ends of the DNA strand.

The goal, Braun explained, was to create a circuit. However, "at this point, the carbon nanotube is located on a segment of DNA, with metal nanowires at either end. Theoretically, one challenge here would be to encourage the nanotube to line up parallel to the DNA strand, meet the nanowires at either end, and thus make a circuit.

"There are some points where nature smiles upon you, and this was one of those points," Braun continued. "Carbon nanotubes are naturally rigid structures, and the protein coating makes the DNA strand rigid as well. The two rigid rods will align parallel to each other, thus making an ideal DNA-nanotube construct."

"In a nutshell, what this does is create a self-assembling carbon nanotube circuit," he concluded.

Scientists controlled the creation of transistors by regulating voltage to the substrate. Out of 45 nanoscale devices created in three batches, almost a third emerged as self-assembled transistors.

Braun added, however, that while this research demonstrates the feasibility of harnessing biology as a framework to construct electronics, creating working electronics from self-assembling carbon nanotube transistors is still in the future.

Braun conducted the research with colleagues Kinneret Keren, Rotem S. Berman, Evgeny Buchstab, and Uri Sivan.

Adapted from materials provided by American Society For Technion - Israel Institute Of Technology.

The Sensitive Side Of Carbon Nanotubes: Creating Powerful Pressure Sensors

ScienceDaily (Oct. 26, 2007) — Blocks of carbon nanotubes can be used to create effective and powerful pressure sensors, according to a new study by researchers at Rensselaer Polytechnic Institute.

Taking advantage of the material’s unique electrical and mechanical properties, researchers repeatedly squeezed a 3-millimeter nanotube block and discovered it was highly suitable for potential applications as a pressure sensor. No matter how many times or how hard they squeezed the block, it exhibited a constant, linear relationship between how much force was applied and electrical resistance.

“Because of the linear relationship between load and stress, it can be a very good pressure sensor,” said Subbalakshmi Sreekala, a postdoctoral researcher at Rensselaer and author of the study.

A sensor incorporating the carbon nanotube block would be able to detect very slight weight changes and would be beneficial in any number of practical and industrial applications, Sreekala said. Two potential applications are a pressure gauge to check the air pressure of automobile tires, and a microelectromechanical pressure sensor that could be used in semiconductor manufacturing equipment.

Despite extensive research over the past decade into the mechanical properties of carbon nanotube structures, this study is the first to explore and document the material’s strain-resistance relationship. The paper, titled “Effects of compressive strains on electrical conductivities of a macroscale carbon nanotube block,” was published in a recent issue of Applied Physics Letters.

Over the course of the experiment, the researchers placed the carbon nanotube block in a vice-like machine and applied different levels of stress. They took note of the stress applied and measured the corresponding strain put on the nanotube block. As it was being squeezed, the researchers also sent an electrical charge through the block and measured its resistance, or how easily the charge moved from one end of the block to the other.

The research team discovered that the strain they applied to the block had a linear relationship with the block's electrical resistance. The more they squeezed the block, the more its resistance decreased. On a graph, the relationship is represented by a neat, straight line. This means that every time the block is exposed to a load of X, its resistance can reliably be expected to decrease by Y.

The reliability and predictability of this relationship make the carbon nanotube block an ideal material for creating a highly sensitive pressure sensor, Sreekala said.

The pressure sensor would function similarly to a typical weight scale. By placing an object with an unknown weight onto the carbon nanotube block, the block would be squeezed down and its electrical resistance would decrease. The sensor would then send an electrical charge through the nanotube block, and register the resistance. The exact weight of the object could then be easily calculated, thanks to the linear, unchanging relationship between the block’s strain and resistance.
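A minimal sketch of that weight-scale idea follows; the calibration constants here are hypothetical placeholders, since the study's actual values are not given in this article.

```python
# Weight-from-resistance sketch using a linear calibration (hypothetical
# constants; resistance falls as load rises, per the linear relationship
# reported in the study).
R0 = 120.0   # ohms, assumed resistance of the unloaded block
k = 0.35     # ohms per newton, assumed calibration slope

def load_from_resistance(r_measured: float) -> float:
    """Invert the linear calibration R = R0 - k * load."""
    return (R0 - r_measured) / k

print(load_from_resistance(113.0))   # (120 - 113) / 0.35 = 20.0 newtons
```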

A study published earlier this year, written by Rensselaer senior research specialist Victor Pushparaj, who is also an author of the pressure sensor paper, showed that carbon nanotubes are able to withstand repeated stress yet retain their structural and mechanical integrity. Electrical resistance decreases as the block is squeezed because the charged electrons have more pathways to move from one end of the block to the other.

In the new study, Sreekala and the research team found that the nanotube block’s linear strain-resistance relationship holds true until the block is squeezed to 65 percent of its original height. Beyond that, the block’s mechanical properties begin to fail and the linear relationship breaks down.

The team is currently thinking of ways to boost the nanotubes’ strength by mixing them with polymer composites, to make a new material with a longer-lived strain-resistance relationship.

“The challenge will be to choose the correct polymer so we don’t lose efficiency, but retain the same response in all directions,” Sreekala said.

In addition to Pushparaj and Sreekala, authors of the paper include Pulickel M. Ajayan, professor of materials science and engineering; Omkaram Nalamasu, professor of chemistry with a joint appointment in materials science and engineering; and Daniel Gall, assistant professor of materials science and engineering. Rensselaer research specialists Lijie Ci, Ashavani Kumar, and doctoral student Sai Kesapragada are also listed as authors.

Funding for the project was provided by the Focus Center New York for Interconnects.

Adapted from materials provided by Rensselaer Polytechnic Institute.

Student Hopes To Break Human Land Speed Record Using Bullet Shaped Bicycle

ScienceDaily (Aug. 30, 2007) — This October, Jerrod Bouchard will attempt to become the fastest college student to be propelled by his or her own power.

Bouchard, a senior in mechanical engineering at the University of Missouri-Rolla, will try to break the collegiate human-powered land speed record of 61.5 mph Oct. 1-6 in Battle Mountain, Nev.

Seated in a bullet-shaped bicycle, Bouchard will be pedaling down a remote highway in Battle Mountain that is said to be one of the straightest, fastest and smoothest surfaces in the world.

Like a NASCAR driver, Bouchard is working with a talented crew to make sure his vehicle is sound. Members of the team include aerodynamics designer Andrew Sourk, a senior in aerospace engineering from St. Joseph, Mo.; team leader Craig George, a senior in electrical engineering from St. Joseph; and composite specialist Matt Brown, a senior in mechanical engineering from Rolla. Bouchard, who is from Camdenton, Mo., is the chief engineer.

Bouchard, Sourk, George and Brown are all members of UMR’s Human-Powered Vehicle Team, which won East Coast and West Coast championships in collegiate human-powered racing last spring. The Battle Mountain endeavor is a separate challenge that was born out of the larger team’s success.

Human-powered vehicles are recumbent bicycles with aerodynamic shells. All summer, the four-man UMR team has been designing and building a new vehicle for the record-breaking attempt. Recently, Bouchard and his crew took the new bike to the Massachusetts Institute of Technology, where they tested it in a wind tunnel. They are also planning to test it at Gateway International Raceway in St. Louis.

Battle Mountain has been the site of many record-breaking performances by professional, collegiate and amateur riders. The records are sanctioned by the International Human-Powered Vehicle Racing Association.

“Our forecasted performance is looking extremely optimistic,” Bouchard says, “and we are confident that we will break the current record.”

Adapted from materials provided by University of Missouri-Rolla.