ScienceDaily (Aug. 5, 2008) — Metabolic changes responsible for the evolution of our unique cognitive abilities indicate that the brain may have been pushed to the limit of its capabilities. Research published today in BioMed Central's open access journal Genome Biology adds weight to the theory that schizophrenia is a costly by-product of human brain evolution.
Philipp Khaitovich, from the Max-Planck-Institute for Evolutionary Anthropology and the Shanghai branch of the Chinese Academy of Sciences, led a collaboration of researchers from Cambridge, Leipzig and Shanghai who investigated brains from healthy and schizophrenic humans and compared them with chimpanzee and rhesus macaque brains. The researchers looked for differences in gene expression and metabolite concentrations and, as Khaitovich explains, "identified molecular mechanisms involved in the evolution of human cognitive abilities by combining biological data from two research directions: evolutionary and medical".
The idea that certain neurological diseases are by-products of increases in metabolic capacity and brain size that occurred during human evolution has been suggested before, but in this new work the authors used new technical approaches to really put the theory to the test.
They identified the molecular changes that took place over the course of human evolution and considered those molecular changes observed in schizophrenia, a psychiatric disorder believed to affect cognitive functions such as the capacities for language and complex social relationships. They found that expression levels of many genes and metabolites that are altered in schizophrenia, especially those related to energy metabolism, also changed rapidly during evolution. According to Khaitovich, "Our new research suggests that schizophrenia is a by-product of the increased metabolic demands brought about during human brain evolution".
The authors conclude that this work paves the way for a much more detailed investigation. "Our brains are unique among all species in their enormous metabolic demand. If we can explain how our brains sustain such a tremendous metabolic flow, we will have a much better chance to understand how the brain works and why it sometimes breaks", said Khaitovich.
Journal reference:
1. Khaitovich et al. Metabolic changes in schizophrenia and human brain evolution. Genome Biology, 2008 (in press).
Adapted from materials provided by BioMed Central, via EurekAlert!, a service of AAAS.
Monday, August 25, 2008
Converting Sunlight To Cheaper Energy
ScienceDaily (Aug. 25, 2008) — Research scientists at South Dakota State University are working with new materials that could make devices for converting sunlight to electricity cheaper and more efficient.
Assistant professor Qiquan Qiao in SDSU’s Department of Electrical Engineering and Computer Science said so-called organic photovoltaics, or OPVs, are less expensive to produce than traditional devices for harvesting solar energy.
Qiao and his SDSU colleagues also are working on organic light-emitting diodes, or OLEDs.
The new technology is sometimes referred to as “molecular electronics” or “organic electronics” — organic because it relies on carbon-based polymers and molecules as semiconductors rather than inorganic semiconductors such as silicon.
“Right now the challenge for photovoltaics is to make the technology less expensive,” Qiao said.
“Therefore, the objective is to find new materials and novel device structures for cost-effective photovoltaic devices.
“The beauty of organic photovoltaics and organic LEDs is low cost and flexibility,” the researcher continued. “These devices can be fabricated by inexpensive, solution-based processing techniques similar to painting or printing.”
“The ease of production brings costs down, while the mechanical flexibility of the materials opens up a wide range of applications,” Qiao concluded.
Organic photovoltaics and organic LEDs are made up of thin films of semiconducting organic compounds that can absorb photons of solar energy. Typically an organic polymer, or a long, flexible chain of carbon-based material, is used as a substrate on which semiconducting materials are applied as a solution using a technique similar to inkjet printing.
“The research at SDSU is focused on new materials with variable band gaps,” Qiao said.
“The band gap determines how much solar energy the photovoltaic device can absorb and convert into electricity.”
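A quick way to see what the band gap does (an illustrative calculation, not taken from the SDSU work): a photon is absorbed only if its energy exceeds the gap, so the gap sets a cutoff wavelength of roughly 1240 nm·eV divided by the gap energy. The gap values below are hypothetical.

```python
# Hypothetical illustration: converting a semiconductor band gap (eV) to the
# longest wavelength of light it can absorb, via lambda = h*c / E.
H_C_EV_NM = 1240.0  # h*c expressed in eV*nm

def cutoff_wavelength_nm(band_gap_ev: float) -> float:
    """Photons with wavelengths longer than this pass through unabsorbed."""
    return H_C_EV_NM / band_gap_ev

# Example gap values (illustrative, not SDSU's actual materials):
for gap in (1.9, 1.4, 0.9):
    print(f"{gap:.1f} eV gap -> absorbs below ~{cutoff_wavelength_nm(gap):.0f} nm")
```

A wide gap (1.9 eV) captures only visible light and shorter, while a narrow gap (0.9 eV) reaches well into the infrared, which is why a mix of gaps is needed to cover the whole spectrum.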
Qiao explained that visible sunlight contains only about 50 percent of the total solar energy. That means the sun is giving off just as much non-visible energy as visible energy.
“We’re working on synthesizing novel polymers with variable band gaps, including high, medium and low-band gap varieties, to absorb the full spectrum of sunlight. By this we can double the light harvesting or absorption,” Qiao said.
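The roughly 50 percent figure can be sanity-checked by treating the sun as a blackbody near 5,778 K and integrating Planck's law over the visible band, as in this rough sketch (a back-of-the-envelope check, not the researchers' calculation):

```python
# Rough check: what fraction of a 5778 K blackbody's output is visible light?
import numpy as np

h = 6.626e-34   # Planck constant (J s)
c = 2.998e8     # speed of light (m/s)
k = 1.381e-23   # Boltzmann constant (J/K)
T = 5778.0      # approximate solar surface temperature (K)

def planck(wl):
    """Blackbody spectral radiance per unit wavelength at temperature T."""
    return (2 * h * c**2 / wl**5) / np.expm1(h * c / (wl * k * T))

wl = np.linspace(100e-9, 10e-6, 200_000)   # 100 nm to 10 um grid
dwl = wl[1] - wl[0]
spectrum = planck(wl)
total = spectrum.sum() * dwl               # simple Riemann-sum integration
vis = spectrum[(wl >= 380e-9) & (wl <= 750e-9)].sum() * dwl
print(f"Visible fraction: {vis / total:.0%}")   # prints roughly 45-50%
```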
SDSU’s scientists plan to use the variable band gap polymers to build multi-junction polymer solar cells or photovoltaics.
These devices use multiple layers of polymer/fullerene films that are tuned to absorb different spectral regions of solar energy.
Ideally, photons that are not absorbed by the first film layer pass through to be absorbed by the following layers.
The devices can harvest photons from ultraviolet to visible to infrared in order to efficiently convert the full spectrum of solar energy to electricity.
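The stacking logic can be made concrete with a toy model (the band gaps below are hypothetical, not SDSU's materials): light enters through the widest-gap layer, and each photon is captured by the first layer whose gap it has enough energy to overcome.

```python
# Toy model of photon harvesting in a multi-junction stack: light hits the
# widest-gap layer first; photons it cannot absorb pass down to narrower gaps.
STACK_EV = [1.9, 1.4, 0.9]  # hypothetical band gaps, top layer first

def absorbing_layer(photon_ev: float) -> int | None:
    """Return the index of the layer that captures the photon, or None."""
    for i, gap in enumerate(STACK_EV):
        if photon_ev >= gap:
            return i
    return None  # too far into the infrared for every layer; transmitted

for e in (3.1, 1.6, 1.0, 0.7):  # UV-violet, red, near-IR, deeper-IR photons
    layer = absorbing_layer(e)
    where = f"layer {layer}" if layer is not None else "not absorbed"
    print(f"{e:.1f} eV photon -> {where}")
```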
SDSU scientists also work with organic light-emitting diodes, focusing on developing novel materials and devices for full-color displays.
“We are working to develop these new light-emitting and efficient, charge-transporting materials to improve the light-emitting efficiency of full color displays,” Qiao said.
Currently, LED technology is used mainly for signage displays. But in the future, as OLEDs become less expensive and more efficient, they may be used for residential lighting, for example.
The new technology will make it easy to insert lights into walls or ceilings. But instead of light bulbs, the lighting apparatus of the future may look more like a poster, Qiao said.
Qiao and his colleagues are funded in part by SDSU’s electrical engineering Ph.D. program, the National Science Foundation, and South Dakota EPSCoR, the Experimental Program to Stimulate Competitive Research.
In addition, Qiao is one of about 40 faculty members from SDSU, the South Dakota School of Mines and Technology and the University of South Dakota who have come together to form Photo Active Nanoscale Systems (PANS).
The primary purpose is developing photovoltaics, or devices that will directly convert light to electricity.
Adapted from materials provided by South Dakota State University, via Newswise.
Thursday, August 21, 2008
New 'Nano-positioners' May Have Atomic-scale Precision
ScienceDaily (Aug. 21, 2008) — Engineers have created a tiny motorized positioning device that has twice the dexterity of similar devices being developed for applications that include biological sensors and more compact, powerful computer hard drives.
The device, called a monolithic comb drive, might be used as a "nanoscale manipulator" that precisely moves or senses movement and forces. The devices also can be used in watery environments for probing biological molecules, said Jason Vaughn Clark, an assistant professor of electrical and computer engineering and mechanical engineering, who created the design.
The monolithic comb drives could make it possible to improve a class of probe-based sensors that detect viruses and biological molecules. These sensors currently need two separate components: a probe that is moved and, at the same time, a platform holding the specimen that is positioned beneath it. The new technology would replace both components with a single one - the monolithic comb drive.
The innovation could allow sensors to work faster and at higher resolution and would be small enough to fit on a microchip. The higher resolution might be used to design future computer hard drives capable of high-density data storage and retrieval. Another possible use might be to fabricate or assemble miniature micro and nanoscale machines.
Research findings were detailed in a technical paper presented in July during the University Government Industry Micro/Nano Symposium in Louisville. The work is based at the Birck Nanotechnology Center at Purdue's Discovery Park.
Conventional comb drives have a pair of comblike sections with "interdigitated fingers," meaning they mesh together. These meshing fingers are drawn toward each other when a voltage is applied. The applied voltage causes the fingers on one comb to become positively charged and the fingers on the other comb to become negatively charged, inducing an attraction between the oppositely charged fingers. If the voltage is removed, the spring-loaded comb sections return to their original position.
By comparison, the new monolithic device has a single structure with two perpendicular comb drives.
Clark calls the device monolithic because it contains comb drive components that are not mechanically and electrically separate. Conventional comb drives are structurally "decoupled" to keep opposite charges separated.
"Comb drives represent an advantage over other technologies," Clark said. "In contrast to piezoelectric actuators that typically deflect, or move, a fraction of a micrometer, comb drives can deflect tens to hundreds of micrometers. And unlike conventional comb drives, which only move in one direction, our new device can move in two directions - left to right, forward and backward - an advance that could really open up the door for many applications."
Clark also has invented a way to determine the precise deflection and force of such microdevices while reducing heat-induced vibrations that could interfere with measurements.
Current probe-based biological sensors have a resolution of about 20 nanometers.
"Twenty nanometers is about the size of 200 atoms, so if you are scanning for a particular molecule, it may be hard to find," Clark said. "With our design, the higher atomic-scale resolution should make it easier to find."
Properly using such devices requires engineers to know precisely how much force is being applied to comb drive sensors and how far they are moving. The new design is based on a technology created by Clark called electro micro metrology, which enables engineers to determine the precise displacement and force that's being applied to, or by, a comb drive. The Purdue researcher is able to measure this force by comparing changes in electrical properties such as capacitance or voltage.
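The release does not give the electro micro metrology equations themselves, but the general idea of reading displacement from a capacitance change can be illustrated with the standard comb model C ≈ 2·n·ε0·t·x/g (all parameters below are assumptions carried over from the previous sketch):

```python
# Sketch of capacitance-to-displacement conversion, in the spirit of measuring
# motion electronically (the model and parameters are assumptions, not the
# published electro micro metrology equations).
EPS0 = 8.854e-12  # permittivity of free space (F/m)

def displacement_from_dC(delta_c_farads, n_fingers, thickness_m, gap_m):
    """Invert C ~ 2*n*eps0*t*x/g: a change dC implies dx = dC*g/(2*n*eps0*t)."""
    return delta_c_farads * gap_m / (2 * n_fingers * EPS0 * thickness_m)

# A 10 fF capacitance change on the hypothetical comb from the sketch above:
dx = displacement_from_dC(10e-15, 200, 25e-6, 2e-6)
print(f"10 fF change -> ~{dx * 1e9:.0f} nm of motion")
```

Femtofarad-level capacitance readout thus translates into displacement sensing at the hundreds-of-nanometers scale and below, which is the regime such probes need.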
Clark used computational methods called nodal analysis and finite element analysis to design, model and simulate the monolithic comb drives.
The research paper describes how the monolithic comb drive works when voltage is applied. The results show independent left-right and forward-backward movement as functions of applied voltage in color-coded graphics.
The findings are an extension of research to create an ultra-precise measuring system for devices having features on the size scale of nanometers, or billionths of a meter. Clark has led research to create devices that "self-calibrate," meaning they are able to precisely measure themselves. Such measuring methods and standards are needed to better understand and exploit nanometer-scale devices.
The size of the entire device is less than one millimeter, or a thousandth of a meter. The smallest feature size is about three micrometers, roughly one-thirtieth as wide as a human hair.
"You can make them smaller, though," Clark said. "This is a proof of concept. The technology I'm developing should allow researchers to practically and efficiently extract dozens of geometric and material properties of their microdevices just by electronically probing changes in capacitance or voltage."
In addition to finite element analysis, Clark used a simulation tool that he developed called Sugar.
"Sugar is fast and allows me to easily try out many design ideas," he said. "After I narrow down to a particular design, I then use finite element analysis for fine-tuning. Finite element analysis is slow, but it is able to model subtle physical phenomena that Sugar doesn't do as well."
Clark's research team is installing Sugar on the nanoHub this summer, making the tool available to other researchers. The nanoHub is operated by the Network for Computational Nanotechnology, funded by the National Science Foundation and housed at Purdue's Birck Nanotechnology Center.
The researchers also are in the process of fabricating the devices at the Birck Nanotechnology Center.
Adapted from materials provided by Purdue University.
Energy Storage For Hybrid Vehicles
ScienceDaily (Aug. 18, 2008) — Hybrid technology combines the advantages of combustion engines and electric motors. Scientists are developing high-performance energy storage units, a prerequisite for effective hybrid motors.
The vehicle is powered by gasoline on the freeway and by electricity in town, thus using considerably less energy. A hybrid propulsion system switches over to generator operation when the brakes are applied, producing electric current that is temporarily stored in a battery. The electric motor draws on this current when starting up. This yields tremendous savings, particularly in urban traffic.
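As a rough worked example (the vehicle mass, speed and recovery efficiency below are assumed values, not figures from the project), the energy available to recapture during a stop is the vehicle's kinetic energy, E = ½mv²:

```python
# Illustrative arithmetic: kinetic energy recoverable when a hybrid brakes
# (vehicle mass and efficiency are assumed values, not from the project).
def braking_energy_wh(mass_kg: float, speed_kmh: float, recovery_eff: float) -> float:
    v = speed_kmh / 3.6                      # km/h -> m/s
    e_joules = 0.5 * mass_kg * v**2          # kinetic energy E = 1/2 m v^2
    return e_joules * recovery_eff / 3600.0  # J -> Wh, after losses

# A 1500 kg car stopping from 50 km/h, assuming ~60% round-trip recovery:
print(f"~{braking_energy_wh(1500, 50, 0.6):.0f} Wh per stop")
```

A couple of dozen watt-hours per stop, multiplied over the many stops of urban driving, is the flow the storage module has to absorb and return quickly.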
But up to now, hybrid technology has always had a storage problem. Scientists from three Fraunhofer Institutes are developing new storage modules in a project called “Electromobility Fleet Test”.
The pilot project was launched by Volkswagen and Germany’s Federal Ministry for the Environment BMU together with seven other partners. The Fraunhofer Institutes for Silicon Technology ISIT in Itzehoe, Integrated Circuits IIS in Nuremberg, and Integrated Systems and Device Technology IISB in Erlangen will be pooling their expertise for the next three years. The researchers are developing an energy storage module based on lithium-polymer accumulator technology that is suitable for use in vehicles.
“This module has to be able to withstand the harsh environmental conditions it will encounter in a hybrid vehicle, and above all it must guarantee high operational reliability and a long service life,” states ISIT scientist Dr. Gerold Neumann, who coordinates the Fraunhofer activities. The researchers hope to reach this goal with new electrode materials that are kinder to the environment.
A specially developed battery management system makes the energy storage device more durable and reliable. The experts are also researching new concepts that will enable large amounts of energy to be stored in a small space. To do this, they integrate mechanical and electrical components in a single module, devising systems for temperature control, performance data registration and high-voltage safety.
The tasks involved are distributed between the three Fraunhofer Institutes according to their skills: The ISIT experts, who have long experience in developing and manufacturing lithium accumulators, are manufacturing the cells. Their colleagues at IIS are responsible for battery management and monitoring. The scientists from IISB are contributing their know-how on power electronics components to configure the accumulator modules. The development and configuration of the new energy storage module is expected to be finished by mid-2010. Volkswagen AG – the industrial partner in this project – will then carry out field trials to test the modules’ suitability for everyday use in the vehicles.
Adapted from materials provided by Fraunhofer-Gesellschaft.
Large Hadron Collider Set To Unveil A New World Of Particle Physics
ScienceDaily (Aug. 21, 2008) — The field of particle physics is poised to enter unknown territory with the startup of a massive new accelerator--the Large Hadron Collider (LHC)--in Europe this summer. On September 10, LHC scientists will attempt to send the first beam of protons speeding around the accelerator.
The LHC will put hotly debated theories to the test as it generates a bonanza of new experimental data in the coming years. Potential breakthroughs include an explanation of what gives mass to fundamental particles and identification of the mysterious dark matter that makes up most of the mass in the universe. More exotic possibilities include evidence for new forces of nature or hidden extra dimensions of space and time.
"The LHC is a discovery machine. We don't know what we'll find," said Abraham Seiden, professor of physics and director of the Santa Cruz Institute for Particle Physics (SCIPP) at the University of California, Santa Cruz.
SCIPP was among the initial group of institutions that spearheaded U.S. participation in the LHC. About half of the entire U.S. experimental particle-physics community has focused its energy on the ATLAS and CMS detectors, the largest of four detectors where experiments will be performed at the LHC. SCIPP researchers have been working on the ATLAS project since 1994. It is one of many international physics and astrophysics projects that have drawn on SCIPP's 20 years of experience developing sophisticated technology for tracking high-energy subatomic particles.
The scale of the LHC is gigantic in every respect--its physical size, the energies attained, the amount of data it will generate, and the size of the international collaboration involved in its planning, construction, and operation. In September, high-energy beams of protons will begin circulating around the LHC's 27-kilometer (16.8-mile) accelerator ring located 100 meters (328 feet) underground at CERN, the European particle physics lab based in Geneva, Switzerland. After a period of testing, the beams will cross paths inside the detectors and the first collisions will take place.
Even before the machine is ramped up to its maximum energy early next year, it will smash protons together with more energy than any previous collider. The debris from those collisions--showers of subatomic particles that the detectors will track and record--will yield results that could radically change our understanding of the physical world.
In a talk at the American Physical Society meeting earlier this year, Seiden gave an overview of the LHC research program, including a rough timeline for reaching certain milestones. One of the most highly anticipated milestones, for example, is detection of the Higgs boson, a hypothetical particle that would fill a major gap in the standard model of particle physics by endowing fundamental particles with mass. Detection of the Higgs boson is most likely to occur in 2010, Seiden said.
But there's no guarantee that the particle actually exists; nature may have found another way to create mass. "I'm actually hoping we find something unexpected that does the job of the Higgs," Seiden said.
Technically, the Higgs boson was postulated to explain a feature of particle interactions known as the breaking of electroweak symmetry, and the LHC is virtually guaranteed to explain that phenomenon, according to theoretical physicist Howard Haber.
"We've been debating this for 30 years, and one way or another, the LHC will definitively tell us how electroweak symmetry breaking occurs. That's a fundamental advance," said Haber, a professor of physics at UCSC.
Haber and other theorists have spent years imagining possible versions of nature, studying their consequences, and describing in detail what the evidence would look like in the experimental data from a particle accelerator such as the LHC. The Higgs boson won't be easy to find, he said. The LHC should produce the particles in abundance (if they exist), but most of them will not result in a very distinctive signal in the detectors.
"It's a tough game. You can only do it by statistical analysis, since there are other known processes that produce events that can mimic a Higgs boson signal," Haber said.
Evidence to support another important theory--supersymmetry--could show up sooner. In many ways, supersymmetry is a more exciting possibility than the Higgs boson, according to theorist Michael Dine, also a professor of physics at UCSC.
"By itself, the Higgs is a very puzzling particle, so there have been a lot of conjectures about some kind of new physics beyond the standard model. Supersymmetry has the easiest time fitting in with what we know," Dine said.
Adding to its appeal, supersymmetry predicts the existence of particles that are good candidates to account for dark matter. Astronomers have detected dark matter through its gravitational effects on stars and galaxies, but they don't yet know what it is. Particles predicted by supersymmetry that could account for dark matter may be identified at the LHC as early as next year, Seiden said.
"Initially, we'll be looking for things that are known standards to make sure that everything is working properly. In 2009, we could start really looking for new things like supersymmetry," he said.
The massive ATLAS detector--45 meters (148 feet) long and 25 meters (82 feet) high--has involved more than 2,000 physicists at 166 institutions. Seiden's team at SCIPP has been responsible for developing the silicon sensors and electronics for the detector's inner tracker, which measures the trajectories of charged particles as they first emerge from the site of the collisions.
Seiden is now leading the U.S. effort to develop a major upgrade of ATLAS. The current detector is designed to last for 10 years, and the upgrade will coincide with a planned increase in the luminosity of the proton beams at the LHC (which will then become the "Super LHC").
"These large projects take such a long time, we have to start early," Seiden said.
Meanwhile, operation and testing of the current ATLAS detector is already under way at CERN, said Alexander Grillo, a SCIPP research physicist who has been working on the project from the start.
"We've been operating it and looking at cosmic ray particles," he said. "Nature gives us these cosmic rays for free, and they're the same kinds of particles we'll see when the machine turns on, so it enables us to check out certain aspects of the detector. But we're very excited to start seeing collisions from the machine."
ATLAS and the other LHC detectors are designed with "trigger" systems that ignore most of the signals and record only those events likely to yield interesting results. Out of the hundreds of millions of collisions happening every second inside the detector, only 100 of the most promising events will be selected and recorded in the LHC's central computer system.
"We'll be throwing away a lot of data, so we have to make sure the triggers are working correctly," Seiden said.
Grillo noted that the ATLAS project has been a great opportunity for UCSC students. Both graduate students and undergraduates have been involved in the development of the detector, along with postdoctoral researchers, research physicists, and senior faculty.
"The graduate students and postdocs get to go to Geneva, but even the undergraduates get a chance to work in a real physics lab and be part of a major international experiment," Grillo said.
SCIPP's prominent role in the LHC is also a boon for theoretical physicists at UCSC who are not directly involved in the collaboration, such as Dine, Haber, Thomas Banks, and others.
"There is a high level of interaction and camaraderie between theorists and experimentalists at UCSC, which is not the case at other leading institutions," Dine said. "For me, it's valuable just in terms of being aware of what's happening on the experimental side."
According to Haber, the LHC is certain to generate a lot of excitement in the world of physics.
"If nothing were found beyond what we know today, that would be so radical, because it would be in violation of a lot of extremely fundamental principles," he said.
More information about the LHC and U.S. involvement in the project is available at http://www.uslhc.us. Information about SCIPP is available at http://scipp.ucsc.edu.
Adapted from materials provided by University of California - Santa Cruz.