Thursday, May 19, 2011

Neutrons Provide First Sub-Nanoscale Snapshots of Huntington's Disease Protein

Huntington's disease is caused by a renegade protein, "huntingtin," that destroys neurons in areas of the brain concerned with the emotions, intellect and movement. All humans have the normal huntingtin protein, which is known to be essential to human life, although its true biological functions remain unclear.

Christopher Stanley, a Shull Fellow in the Neutron Scattering Science Division at ORNL, and Valerie Berthelier, a UT Graduate School of Medicine researcher who studies protein folding and misfolding in Huntington's, have used a small-angle neutron scattering instrument, called Bio-SANS, at ORNL's High Flux Isotope Reactor to explore the earliest aggregate species of the protein that are believed to be the most toxic.

Stanley and Berthelier, in research published in Biophysical Journal, were able to determine the size and mass of the mutant protein structures, from the earliest small, spherical precursor species composed of two (dimers) and three (trimers) peptides, along the aggregation pathway to the development of the resulting, later-stage fibrils. They were also able to see inside the later-stage fibrils and determine their internal structure, which provides additional insight into how the peptides aggregate.

"Bio-SANS is a great instrument for taking time-resolved snapshots. You can look at how this stuff changes as a function of time and be able to catch the structures at the earliest of times," Stanley said. "When you study several of these types of systems with different glutamines or different conditions, you begin to learn more and more about the nature of these aggregates and how they begin forming."

Normal huntingtin contains a region of 10 to 20 glutamine amino acids in succession. However, the DNA of Huntington's disease patients encodes for 37 or more glutamines, causing instability in huntingtin fragments that contain this abnormally long glutamine repeat. Consequently, the mutant protein fragment cannot be degraded normally and instead forms deposits of fibrils in neurons.

Those deposits, or clumps, were originally seen as the cause of the devastation that ensues in the brain. More recently, researchers have come to think the clumping may actually be a kind of biological housecleaning, an attempt by the brain cells to clear these toxic proteins from places where they are destructive. Stanley and Berthelier set out to learn through neutron scattering what the toxic proteins were and when and where they occurred.

At the HFIR Bio-SANS instrument, the neutron beam passes through a series of mirrors that focus it on the sample. The neutrons interact with the sample, providing data on its atomic structure, and then scatter, to be picked up by a detector. From the scattering pattern the detector records, researchers can deduce, at a scale of less than a billionth of a meter, the size and shape of the diseased, aggregating protein at each time-step along its growth pathway.
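The size information comes from the low-angle part of the scattering curve. As an illustrative sketch (not the authors' actual analysis), a standard Guinier fit extracts a radius of gyration Rg from the slope of ln I versus q², using made-up synthetic data:

```python
import numpy as np

def guinier_fit(q, intensity):
    """Estimate radius of gyration Rg from the low-q region of a
    small-angle scattering curve using the Guinier approximation
    I(q) = I0 * exp(-q^2 * Rg^2 / 3). Fitting ln(I) against q^2
    gives a slope of -Rg^2 / 3."""
    slope, intercept = np.polyfit(q**2, np.log(intensity), 1)
    rg = np.sqrt(-3.0 * slope)
    i0 = np.exp(intercept)
    return rg, i0

# Synthetic curve for a particle with Rg = 2.5 nm, restricted to the
# Guinier-valid region (q * Rg < ~1.3). Values are illustrative only.
rg_true = 2.5  # nm
q = np.linspace(0.05, 1.3 / rg_true, 50)          # 1/nm
intensity = 100.0 * np.exp(-(q * rg_true)**2 / 3.0)

rg_est, i0_est = guinier_fit(q, intensity)
print(f"Rg = {rg_est:.2f} nm, I(0) = {i0_est:.1f}")  # Rg = 2.50 nm, I(0) = 100.0
```

Tracking how the fitted Rg grows over time is one way such time-resolved snapshots reveal the aggregation pathway.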

SANS was able to distinguish the small peptide aggregates in the sample solution from the rapidly forming and growing larger aggregates that are simultaneously present. In separate experiments, the researchers were able to monitor the disappearance of the single peptides, as well as the formation of the mature fibrils.

Now that they know the structures, the hope is to develop drugs that can counteract the toxic properties in the early stages, or dissuade them from taking the path to toxicity. "The next step would be, let's take drug molecules and see how they can interact and affect these structures," Stanley said.

For now, the researchers believe Bio-SANS will be useful in the further study of Huntington's disease aggregates and applicable to the study of other protein aggregation processes, such as those involved in Alzheimer's and Parkinson's diseases.

"That is the future hope. Right now, we feel like we are making a positive contribution towards that goal," Stanley said.

The research was supported by the National Institutes of Health. HFIR and Bio-SANS are supported by the DOE Office of Science.


Source

Saturday, May 14, 2011

Controlling Robotic Arms Is Child's Play

"The input device contains various movement sensors, also called inertial sensors," says Bernhard Kleiner of the Fraunhofer Institute for Manufacturing Engineering and Automation IPA in Stuttgart, who leads the project. The individual micro-electromechanical systems themselves are not expensive. What the scientists have spent time developing is how these sensors interact. "We have developed special algorithms that fuse the data of individual sensors and identify a pattern of movement. That means we can detect movements in free space," summarizes Kleiner.
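The article does not disclose Fraunhofer's actual fusion algorithms. One common, minimal pattern for fusing inertial sensors is a complementary filter, which blends a gyroscope (accurate short-term, but drifting) with an accelerometer (noisy, but drift-free); the sketch below uses invented numbers purely for illustration:

```python
def complementary_filter(gyro_rates, accel_angles, dt, alpha=0.98):
    """Fuse gyroscope angular rates (deg/s) with accelerometer tilt
    estimates (deg) into one smooth angle track. alpha weights the
    integrated gyro path; (1 - alpha) lets the accelerometer slowly
    correct the gyro's drift."""
    angle = accel_angles[0]
    track = [angle]
    for rate, accel in zip(gyro_rates[1:], accel_angles[1:]):
        angle = alpha * (angle + rate * dt) + (1.0 - alpha) * accel
        track.append(angle)
    return track

# A sensor held still at ~10 degrees: zero rotation rate, accelerometer
# readings jittering by +/-2 degrees (simulated, arbitrary values).
dt = 0.01
gyro = [0.0] * 200
accel = [10.0 + (2.0 if i % 2 else -2.0) for i in range(200)]
angles = complementary_filter(gyro, accel, dt)
# The fused estimate settles near 10 degrees despite the accel noise.
```

Real systems fuse three gyro axes, three accelerometer axes and often a magnetometer, but the principle of weighting complementary error characteristics is the same.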

What may at first appear to be a trade show gimmick is in fact a technology that offers numerous advantages in industrial production and logistical processes. The system could be used to simplify the programming of industrial robots, for example. To date, this has been done with the aid of laser tracking systems: An employee demonstrates the desired motion with a hand-held baton that features a white marker point. The system records this motion by analyzing the light reflected from a laser beam aimed at the marker. Configuring and calibrating the system takes a lot of time. The new input device should eliminate the need for these steps in the future -- instead, employees need only pick up the device and show the robot what it is supposed to do.

The system has numerous applications in medicine, as well. Take, for example, gait analysis. Until now, cameras have made precise recordings of patients as they walk back and forth along a specified path. The films reveal to the physician such things as how the joints behave while walking, or whether incorrect posture in the knees has been improved by physical therapy. Installing the cameras, however, is complex and costly, and patients are restricted to a predetermined path. The new sensor system can simplify this procedure: Attached to the patient's upper thigh, it measures the sequences and patterns of movement -- without limiting the patient's motion in any way.

"With the inertial sensor system, gait analysis can be performed without a frame of reference and with no need for a complex camera system," explains Kleiner. In another project, scientists are already working on comparisons of patients' gait patterns with those patterns appearing in connection with such diseases as Parkinson's.

Another medical application for the new technology is the control of active prostheses containing numerous small actuators. Whenever the patient moves, the prosthesis in turn also moves; this makes it possible for a leg prosthesis to roll the foot while walking. Here, too, the sensor could be attached to the patient's upper thigh and could analyze the movement, helping to regulate the motors of the prosthesis. Research scientists are currently working on combining the inertial sensor system with an electromyographic (EMG) sensor. Electromyography is based on the principle that when a muscle tenses, it produces an electrical voltage which a sensor can then measure by way of an electrode. If the sensor is placed, for example, on the muscle responsible for lifting the patient's foot, the sensor registers when the patient tenses this muscle -- and the prosthetic foot lifts itself. EMG sensors like this are already available but are difficult to position.
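The foot-lift trigger described above can be sketched as threshold detection on a smoothed, rectified EMG signal. This is a rough illustration of the logic only; the threshold, window size and signal values are invented, not taken from the article:

```python
def emg_trigger(samples, threshold=0.5, window=5):
    """Return the index at which the rectified, moving-average EMG
    envelope first crosses the trigger threshold, or None if it never
    does. When the crossing fires, a prosthesis controller would lift
    the foot. All parameters here are illustrative."""
    for i in range(len(samples)):
        lo = max(0, i - window + 1)
        chunk = [abs(s) for s in samples[lo:i + 1]]
        envelope = sum(chunk) / len(chunk)
        if envelope >= threshold:
            return i
    return None

# Quiet muscle, then a burst of activity (simulated raw EMG, arbitrary units).
signal = [0.05, -0.04, 0.06, -0.05, 0.8, -0.9, 0.85, -0.8, 0.9]
print(emg_trigger(signal))  # 6: the envelope crosses 0.5 at sample 6
```

Rectification and averaging matter because raw EMG oscillates around zero; the envelope, not the raw voltage, reflects muscle tension.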

"While standard EMG sensors consist of individual electrodes that have to be positioned precisely on the muscle, our system is made up of many small electrodes that attach to a surface area. This enables us to sense muscle movements much more reliably," says Kleiner.


Source

Tuesday, May 10, 2011

Scientists Achieve Guiding of Electrons by Purely Electric Fields

The research is published online in Physical Review Letters.

This new technique of electron guiding -- which resembles the guiding of light waves in optical fibres -- promises a variety of applications, from guided matter-wave experiments to non-invasive electron microscopy.

Electrons were the first elementary particles to reveal their wave-like properties and were therefore of great importance in the development of the theory of quantum mechanics. Even now, the observation of electrons leads to new insights into the fundamental laws of physics. Measurements involving confined electrons have so far mainly been performed in so-called Penning traps, which combine a static magnetic field with an oscillating electric field.

For a number of experiments with propagating electrons, like interferometry with slow electrons, it would be advantageous to confine the electrons by a purely electric field. This can be done in an alternating quadrupole potential similar to the standard technique that is used for ion trapping. These so-called Paul traps are based on four electrodes to which a radiofrequency voltage is applied. The resulting field evokes a driving force which keeps the particle in the centre of the trap. Wolfgang Paul received the Nobel Prize in physics for the invention of these traps in 1989.

For several years now, scientists have been realizing Paul traps with microstructured electrodes on planar substrates, using standard microelectronic chip technology. Dr. Hommelhoff and his group have now applied this method for the first time to electrons. Since the mass of these point-like particles is only about a ten-thousandth of the mass of an ion, electrons react much faster to electric fields than the comparatively heavy ions. Hence, in order to guide electrons, the frequency of the alternating voltage applied to the electrodes has to be much higher than for the confinement of ions, and lies in the microwave range, at around 1 GHz.
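The ~1 GHz figure follows from how a Paul trap's Mathieu stability parameter scales with mass: for fixed geometry and drive voltage, q ∝ 1/(mΩ²), so keeping q constant requires the drive frequency Ω to grow as 1/√m. A back-of-the-envelope check, assuming (purely for illustration) an ion trap for 40Ca+ driven at 4 MHz:

```python
import math

M_U = 1.66054e-27   # atomic mass unit, kg
M_E = 9.10938e-31   # electron mass, kg

def electron_drive_freq(ion_mass_u, ion_drive_hz):
    """Scale a Paul-trap drive frequency from an ion to an electron,
    holding the Mathieu stability parameter q ~ 1/(m * Omega^2) fixed
    for the same electrode geometry and voltage: Omega ~ 1/sqrt(m)."""
    return ion_drive_hz * math.sqrt(ion_mass_u * M_U / M_E)

# Illustrative comparison: a 40Ca+ trap at 4 MHz (assumed values).
f_e = electron_drive_freq(40.0, 4.0e6)
print(f"{f_e / 1e9:.2f} GHz")  # about 1 GHz, in the range the article quotes
```

The mass ratio of roughly 73,000 gives a frequency scale-up of about 270, which is why microwave-range drives are needed for electrons.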

In the experiment, electrons are generated in a thermal source (in which a tungsten wire is heated as in a light bulb) and the emitted electrons are collimated to a parallel beam of a few electron volts. From there the electrons are injected into the "wave-guide," which is generated by five electrodes on a planar substrate to which an alternating voltage with a frequency of about 1 GHz is applied. This produces an oscillating quadrupole field at a distance of half a millimetre above the electrodes, which confines the electrons in the radial direction. In the longitudinal direction there is no force acting on the particles, so they are free to travel along the "guide tube." As the confinement in the radial direction is very strong, the electrons are forced to follow even small directional changes of the electrodes.

In order to make this effect more visible, the 37 mm long electrodes are bent into a curve with a 30-degree opening angle and a bending radius of 40 mm. At the end of the structure the guided electrons are ejected and registered by a detector. A bright spot caused by guided electrons appears on the detector right at the exit of the guide tube, which is situated in the left part of the picture. When the alternating field is switched off, a more diffusely illuminated area shows up on the right side. It is caused by electrons spreading out from the source and propagating on straight trajectories over the substrate.

"With this fundamental experiment we were able to show that electrons can be efficiently guided by purely electric fields," says Dr. Hommelhoff. "However, as our electron source yields a rather poorly collimated electron beam, we still lose many electrons." In the future the researchers plan to combine the new microwave guide with an electron source based on field emission from an atomically sharp metal tip. These devices deliver electron beams with such strong collimation that their transverse component is limited only by the Heisenberg uncertainty principle.

Under these conditions it should be feasible to investigate the individual quantum mechanical oscillations of the electrons in the radial potential of the guide. "The strong confinement of electrons observed in our experiment means that a 'jump' from one quantum state to the neighbouring higher state requires a lot of energy and is therefore not very likely to happen," explains Johannes Hoffrogge, a doctoral student on the experiment. "Once a single quantum state is populated, it will remain so for an extended period of time and can be used for quantum experiments." This would make it possible to conduct quantum physics experiments such as interferometry with guided slow electrons. Here the wave function of an electron is first split up; later on, its two components are brought together again, whereby characteristic superpositions of quantum states of the electron can be generated. The new method could also be applied to a new form of electron microscopy.


Source

Friday, May 6, 2011

Antibody-Based Biosensor Can Guide Environmental Clean-Ups, Provide Early Warning System for Spills

Testing of the biosensor in the Elizabeth River and Yorktown Creek, which both drain into lower Chesapeake Bay, shows that the instrument can process samples in less than 10 minutes, detect pollutants at levels as low as just a few parts per billion, and do so at a cost of just pennies per sample. Current technology requires hours of lab work, with a per-sample cost of up to $1,000.

"Our biosensor combines the power of the immune system with the sensitivity of cutting-edge electronics," says Dr. Mike Unger of VIMS. "It holds great promise for real-time detection and monitoring of oil spills and other releases of contaminants into the marine environment."

The biosensor was developed and tested by Unger, fellow VIMS professor Steve Kaattari, and their doctoral student Candace Spier, with assistance from marine scientist George Vadas. The team's report of field tests with the sensor appears in this month's issue of Environmental Toxicology and Chemistry.

The instrument was developed in conjunction with Sapidyne Instruments, Inc., with funding from the state of Virginia, the Office of Naval Research, and the Cooperative Institute for Coastal and Estuarine Environmental Technology, a partnership between NOAA and the University of New Hampshire.

The tests in the Elizabeth River took place during cleanup of a site contaminated by polycyclic aromatic hydrocarbons (PAHs), byproducts of decades of industrial use of creosote to treat marine pilings. The U.S. Environmental Protection Agency considers PAHs highly toxic and lists 17 as suspected carcinogens.

The biosensor allowed the researchers to quantify PAH concentrations while the Elizabeth River remediation was taking place, gaining on-site knowledge about water quality surrounding the remediation site. Spier says the test was "the first use of an antibody-based biosensor to guide sampling efforts through near real-time evaluation of environmental contamination."

In the Yorktown Creek study, the researchers used the biosensor to track the runoff of PAHs from roadways and soils during a rainstorm.

Biosensor development

Kaattari says, "Our basic idea was to fuse two different kinds of technologies -- monoclonal antibodies and electronic sensors -- in order to detect contaminants."

Antibodies are proteins produced by the immune system of humans and other mammals. They are particularly well suited for detecting contaminants because they have, as Kaattari puts it, "an almost infinite power to recognize the 3-dimensional shape of any molecule."

Mammals produce antibodies that recognize and bind with large organic molecules such as proteins or with viruses. The VIMS team took this process one step further, linking proteins to PAHs and other contaminants, then exposing mice to these paired compounds in a manner very similar to a regular vaccination.

"Just like you get vaccinated against the flu, we in essence are vaccinating our mice against contaminants," says Kaattari. "The mouse's lymphatic system then produces antibodies to PAHs, TNT, tributyltin [TBT, the active ingredient in anti-fouling paints for boats], or other compounds."

Once a mouse has produced an antibody to a particular contaminant, the VIMS team applies standard clinical techniques to produce "monoclonal antibodies" in sufficiently large quantities for use in a biosensor.

"This technology allows you to immortalize a lymphocyte that produces only a very specific antibody," says Kaattari. "You grow the lymphocytes in culture and can produce large quantities of antibodies within a couple of weeks. You can preserve the antibody-producing lymphocyte forever, which means you don't have to go to a new animal every time you need to produce new antibodies."

From antibody to electrical signal

The team's next step was to develop a sensor that can recognize when an antibody binds with a contaminant and translate that recognition into an electrical signal. The Sapidyne® sensor used by the VIMS team works via what Kaattari calls a "fluorescence-inhibitory, spectroscopic kind of assay."

In the sensor used on the Elizabeth River and Yorktown Creek, antibodies designed to recognize a specific class of PAHs were joined with a dye that glows when exposed to fluorescent light. The intensity of that light is in turn recorded as a voltage. The sensor also houses tiny plastic beads that are coated with what Spier calls a "PAH surrogate" -- a PAH derivative that retains the shape that the antibody recognizes as a PAH molecule.

When water samples with low PAH levels are added to the sensor chamber (which is already flooded with a solution of anti-PAH antibodies), the antibodies have little to bind with and are thus free to attach to the surrogate-coated beads, providing a strong fluorescent glow and electric signal. In water samples with high PAH concentrations, on the other hand, a large fraction of the antibodies bind with the environmental contaminants. That leaves fewer to attach to the surrogate-coated beads, which consequently provides a fainter glow and a weaker electric signal.
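This inverse signal-to-concentration relationship is the classic shape of a competitive immunoassay. A minimal model of the calibration (the IC50 value and signal scale are illustrative, not the paper's actual calibration) shows how a measured glow maps back to a PAH concentration:

```python
def assay_signal(conc_ppb, s_max=1.0, ic50_ppb=1.5):
    """Competitive-assay response: the more PAH in the sample, the
    fewer free antibodies reach the surrogate-coated beads, so the
    fluorescence signal falls. ic50_ppb is the concentration that
    halves the signal (illustrative value)."""
    return s_max / (1.0 + conc_ppb / ic50_ppb)

def assay_concentration(sig, s_max=1.0, ic50_ppb=1.5):
    """Invert the calibration curve: read a PAH concentration off a
    measured fluorescence signal."""
    return ic50_ppb * (s_max / sig - 1.0)

# Higher PAH concentration -> weaker glow, as the text describes.
assert assay_signal(3.2) < assay_signal(0.3)
# Round-trip: a measured signal maps back to the concentration behind it.
print(round(assay_concentration(assay_signal(0.3)), 3))  # 0.3
```

Field instruments typically fit such a curve to known standards first, then invert it for each unknown sample.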

During the Elizabeth River study, the biosensor measured PAH concentrations that ranged from 0.3 to 3.2 parts per billion, with higher PAH levels closer to the dredge site. In Yorktown Creek, the biosensor showed that PAH levels in runoff peaked 1 to 2 hours after the rain started, with peak concentration of 4.4 parts per billion.

Comparison of the biosensor's field readings with later readings from a mass spectrometer at VIMS showed that the biosensor is just as accurate as the more expensive, slower, and laboratory-bound machine.

A valuable field tool

Spier says, "Using the biosensor allowed us to quickly survey an area of almost 900 acres around the Elizabeth River dredge, and to provide information about the size and intensity of the contaminant plume to engineers monitoring the dredging from shore. If our results had shown elevated concentrations, they could have halted dredging and put remedial actions in place."

Unger adds, "Measuring data in real-time also allowed us to guide the collection of large-volume water samples right from the boat. We used these samples for later analysis of specific PAH compounds in the lab. This saved time, effort, and money by keeping us from having to analyze samples that might contain PAHs at levels below our detection limit."

"Biosensors have their constraints and optimal operating conditions," says Kaattari, "but their promise far outweighs any limitations. The primary advantages of our biosensor are its sensitivity, speed, and portability. These instruments are sure to have a myriad of uses in future environmental monitoring and management."

One promising use of the biosensor is for early detection and tracking of oil spills. "If biosensors were placed near an oil facility and there was a spill, we would know immediately," says Kaattari. "And because we could see concentrations increasing or decreasing in a certain pattern, we could also monitor the dispersal in real time."


Source

Thursday, April 21, 2011

Far-Sighted Space Technology Finds Practical Uses on Earth

Ken Wood is presenting the project at the RAS National Astronomy Meeting in Llandudno, Wales.

The part of the electromagnetic spectrum spanning the far infrared and microwave is also called 'terahertz' radiation. Astronomers use this kind of radiation to study the Cosmic Microwave Background and the huge dust clouds where stars are born. The sensitive detectors they use will only operate at temperatures very close to absolute zero (minus 273°C). In terahertz cameras like KIDCAM, those low temperatures are reached in compact and less expensive ways using relatively new cooler technology. KIDCAM therefore has many potential day-to-day applications.

"We are all familiar with optical images of the surface of objects and X-ray images which penetrate through soft tissue to reveal bone structure. Terahertz observations give us something in between the two. For example, most clothing and packaging materials are transparent to terahertz radiation, whereas skin, water, metal and a host of other interesting materials are not. This gives rise to some important day-to-day applications: detecting weapons concealed under clothing or inside parcels; distinguishing skin and breast cancer tissue; quality control of manufactured items and processes in factories. Our KIDCAM detectors are also very sensitive, and so we can look at the natural radiation emitted by the target. This means there are no safety issues like those associated with other imaging techniques which shine radiation, including X-rays, at the target," said Mr Wood.

Until recently, there have been many practical obstacles to using terahertz detectors. Terahertz sources have only become available to the non-specialist in the last 10 years and cooling the detectors to very low temperatures using liquid cryogens is costly and complicated.

"The instruments aboard the Herschel and Planck satellites need to be cooled to temperatures close to absolute zero so that emissions from the spacecraft don't drown out the faint signals that come from the very edge of the observable Universe," said Ken Wood.

"For KIDCAM, we have developed a kind of detector that can be operated in electrical coolers and therefore without the use of liquefied gases. KIDCAM can be tuned to specific frequencies for specific applications, for instance to enhance the contrast between skin and plastic explosive for airport security scanners. Unwanted frequencies can be blocked to increase the camera's sensitivity. The experience that we gained working on astronomical missions has been invaluable in helping us do this. The race is now on around the world to produce devices that will realise the enormous potential of terahertz science, and thanks to the ingenuity of UK astronomers we have made a great start."


Source

Tuesday, April 19, 2011

New Biosensor Microchip Could Speed Up Drug Development, Researchers Say

A single centimeter-sized array of the nanosensors can simultaneously and continuously monitor thousands of times more protein-binding events than any existing sensor. The new sensor is also able to detect interactions with greater sensitivity and deliver the results significantly faster than the present "gold standard" method.

"You can fit thousands, even tens of thousands, of different proteins of interest on the same chip and run the protein-binding experiments in one shot," said Shan Wang, a professor of materials science and engineering, and of electrical engineering, who led the research effort.

"In theory, in one test, you could look at a drug's affinity for every protein in the human body," said Richard Gaster, MD/PhD candidate in bioengineering and medicine, who is the first author of a paper describing the research that is in the current issue of Nature Nanotechnology, available online now.

The power of the nanosensor array lies in two advances. First, the use of magnetic nanotags attached to the protein being studied -- such as a medication -- greatly increases the sensitivity of the monitoring.

Second, an analytical model the researchers developed enables them to accurately predict the final outcome of an interaction based on only a few minutes of monitoring data. Current techniques typically monitor no more than four simultaneous interactions and the process can take hours.
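The paper's analytical model is not spelled out in this article, but the general idea of extrapolating a binding curve from early data can be sketched. For pseudo-first-order kinetics, y(t) = y_eq * (1 - exp(-k*t)), three equally spaced early readings determine the equilibrium value in closed form (exact only for noiseless data; this is an illustration, not the authors' method):

```python
def predict_equilibrium(y1, y2, y3):
    """Extrapolate the final value of a pseudo-first-order binding
    curve y(t) = y_eq * (1 - exp(-k*t)) from readings at t = dt,
    2*dt, 3*dt. Successive increments shrink geometrically by
    r = exp(-k*dt), so r and then y_eq follow directly. Noisy real
    data would instead need a least-squares fit."""
    r = (y3 - y2) / (y2 - y1)
    return y1 / (1.0 - r)

# Synthetic binding curve: true equilibrium signal 0.8, r = 0.7 per interval.
y_eq_true, r = 0.8, 0.7
samples = [y_eq_true * (1 - r**n) for n in (1, 2, 3)]
print(round(predict_equilibrium(*samples), 3))  # 0.8
```

This is why a few minutes of monitoring can stand in for hours of waiting: the early curvature already encodes the endpoint.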

"I think their technology has the potential to revolutionize how we do bioassays," said P.J. Utz, associate professor of medicine (immunology and rheumatology) at Stanford University Medical Center, who was not involved in the research.

A microchip with a nanosensor array (orange squares) is shown with a different protein (various colors) attached to each sensor. Four proteins of a potential medication (blue Y-shapes), with magnetic nanotags attached (grey spheres), have been added. One medication protein is shown binding with a protein on a nanosensor.

Members of Wang's research group developed the magnetic nanosensor technology several years ago and demonstrated its sensitivity in experiments in which they showed that it could detect a cancer-associated protein biomarker in mouse blood at a thousandth of the concentration that commercially available techniques could detect. That research was described in a 2009 paper in Nature Medicine.

The researchers tailor the nanotags to attach to the particular protein being studied. When a nanotag-equipped protein binds with another protein that is attached to a nanosensor, the magnetic nanotag alters the ambient magnetic field around the nanosensor in a small but distinct way that is sensed by the detector.

"Let's say we are looking at a breast cancer drug," Gaster said. "The goal of the drug is to bind to the target protein on the breast cancer cells as strongly as possible. But we also want to know: How strongly does that drug aberrantly bind to other proteins in the body?"

To determine that, the researchers would put breast cancer proteins on the nanosensor array, along with proteins from the liver, lungs, kidneys and any other kind of tissue that they are concerned about. Then they would add the medication with its magnetic nanotags attached and see which proteins the drug binds with -- and how strongly.

"We can see how strongly the drug binds to breast cancer cells and then also how strongly it binds to any other cells in the human body such as your liver, kidneys and brain," Gaster said. "So we can start to predict the adverse effects of this drug without ever putting it in a human patient."

It is the increased detection sensitivity that comes with the magnetic nanotags that enables Gaster and Wang to determine not only when a bond forms, but also its strength.

"The rate at which a protein binds and releases tells how strong the bond is," Gaster said. That can be an important factor with numerous medications.

"I am surprised at the sensitivity they achieved," Utz said. "They are detecting on the order of between 10 and 1,000 molecules, and that to me is quite surprising."

The nanosensor is based on the same type of sensor used in computer hard drives, Wang said.

"Because our chip is completely based on existing microelectronics technology and procedures, the number of sensors per area is highly scalable with very little cost," he said.

Although the chips used in the work described in the Nature Nanotechnology paper had a little more than 1,000 sensors per square centimeter, Wang said it should be no problem to put tens of thousands of sensors on the same footprint.

"It can be scaled to over 100,000 sensors per square centimeter, without even pushing the technology limits of the microelectronics industry," he said.

Wang said he sees a bright future for increasingly powerful nanosensor arrays, as the technology infrastructure for making such nanosensor arrays is in place today.

"The next step is to marry this technology to a specific drug that is under development," Wang said. "That will be the really killer application of this technology."

Other Stanford researchers who participated in the research and are coauthors of the Nature Nanotechnology paper are Liang Xu and Shu-Jen Han, both of whom were graduate students in materials science and engineering at the time the research was done; Robert Wilson, senior scientist in materials science and engineering; and Drew Hall, graduate student in electrical engineering. Other coauthors are Drs. Sebastian Osterfeld and Heng Yu of MagArray Inc. in Sunnyvale; both are alumni of the Wang Group.

Funding for the research came from the National Cancer Institute, the National Science Foundation, the Defense Advanced Research Projects Agency, the Gates Foundation and National Semiconductor Corporation.


Source

Friday, April 15, 2011

Search for Dark Matter Moves One Step Closer to Detecting Elusive Particle

Their new results, announced April 14 at the Gran Sasso National Laboratory in Italy, where the XENON experiment is housed deep beneath a mountain 70 miles west of Rome, represent the highest-sensitivity search for dark matter yet, with background noise 100 times lower than competing efforts.

Dark matter is widely thought to be a kind of massive elementary particle that interacts weakly with ordinary matter. Physicists refer to these particles as WIMPs, for weakly interacting massive particles. The XENON researchers used a dark-matter detector known as XENON100 -- an instrumented vat filled with over 100 pounds of liquid xenon -- as a target for these WIMPs, which are thought to be streaming constantly through the solar system and Earth.

And while the XENON100 experiment found no dark matter signal in 100 days of testing, the researchers' newly calculated upper limits on the mass of WIMPs and the probability of their interacting with other particles are the best in the world, said UCLA physics professor Katsushi Arisaka, a member of the international collaboration.

XENON100 looks for a primary flash of light that occurs when a particle bounces off a xenon atom inside the detector and a secondary flash when an electron knocked free from a xenon atom by a collision is accelerated toward the top of the device by an electric field, said UCLA physics researcher Hanguo Wang, who works closely with Arisaka. With this configuration, a WIMP will generate a signal fundamentally different from that of cosmic radiation or emission from the equipment itself, making it possible to identify background readings that could be mistaken for a positive detection, he said.
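A toy version of that discrimination can make the idea concrete. Nuclear recoils from WIMP-like scatters produce less ionization (the secondary S2 flash) per unit of scintillation (the primary S1 flash) than electronic recoils from gamma backgrounds, so the S2/S1 ratio separates the two populations; the cut value and event numbers below are purely illustrative, not XENON100's actual calibration:

```python
def classify(s1, s2, ratio_cut=50.0):
    """Toy event discrimination by the S2/S1 ratio. Nuclear recoils
    (WIMP-like) sit at low S2/S1; electronic recoils (gamma/beta
    background) sit higher. The cut is an invented illustrative value."""
    if s2 / s1 < ratio_cut:
        return "nuclear-recoil candidate"
    return "electronic-recoil background"

# Two made-up events: (S1, S2) signal sizes in photoelectrons.
events = [(10.0, 300.0), (10.0, 900.0)]
for s1, s2 in events:
    print(classify(s1, s2))
```

Real analyses use energy-dependent bands calibrated with neutron and gamma sources rather than a single fixed ratio, but the underlying observable is the same.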

Even though the experiment did not detect a WIMP, the progress sets the stage for an ambitious next-generation project called XENON1T, which will use a much larger, one-ton liquid xenon instrument with highly specialized light-detectors developed at UCLA that make it 100 times more sensitive than XENON100, said David Cline, a UCLA professor of physics and founder of UCLA's dark matter group.

The search for dark matter

Ordinary matter, which makes up the stars, planets, gas and dust in our galaxy, emits or reflects light that can be observed using telescopes on Earth or in space. However, the effect of dark matter, according to several theories, can be observed only indirectly by the gravitational force exerted on the more visible portions of the galaxy around us, Cline said.

Despite the differences between ordinary and dark matter, cosmologists believe the two have been linked since the beginning of the universe, with dark matter playing a key role in the coalescing of particles into stars, galaxies and other large-scale structures after the Big Bang.

Though dark matter exerts a tangible force on the galaxy as a whole, individual WIMPs have proved far more difficult to detect. Because these particles interact only very weakly with normal matter, the small signal that might come from a WIMP detection above ground would be drowned out by the cosmic radiation that constantly bombards Earth's surface, Cline said.

To eliminate the majority of this background noise, the XENON100 experiment is buried beneath almost one mile of rock in the Gran Sasso lab, the largest underground facility of its kind in the world. While dark matter particles can travel easily through the vast expanse of stone and pass through the detector, only the most energetic particles from space are able to follow, Arisaka said.

Next steps

Because the XENON100 experiment is shielded by large amounts of rock, as well as by several tons of copper, lead and water, the largest source of background detections is actually the radiation coming from the instrument itself, Arisaka said.

In an effort to address this issue, Arisaka and Wang, working in collaboration with Hamamatsu Photonics in Japan, have developed the Quartz Photon Intensifying Detector (QUPID), a new light-detector technology that emits no radiation. The XENON group hopes to incorporate this breakthrough technology into the future XENON1T experiment.

"We have developed a detector to be used in future experiments based on new photon-detector technology," Wang said. "We invented, tested and demonstrated its operation in liquid xenon in our laboratory at UCLA."

In addition to Arisaka, Cline and Wang, UCLA's XENON group includes postdoctoral scholars Emilija Pantic and Paolo Beltrame and graduate students Artin Teymourian and Kevin Lung. Two students, Ethan Brown and Michael Lam, received doctorates last year through this experiment.

Elena Aprile, a professor of physics at Columbia University, is the XENON collaboration's principal investigator and spokesperson.

The XENON collaboration consists of 60 scientists from 14 institutions in the U.S. (UCLA, Columbia University, Rice University); China (Shanghai Jiao Tong University); France (Subatech Nantes); Germany (Max-Planck-Institut Heidelberg, Johannes Gutenberg University Mainz, Wilhelms-Universität Münster); Israel (Weizmann Institute of Science); Italy (Laboratori Nazionali del Gran Sasso, INFN e Università di Bologna); the Netherlands (Nikhef Amsterdam); Portugal (Universidade de Coimbra); and Switzerland (Universität Zürich).

XENON100 is supported by its collaborating institutions and federally funded by the National Science Foundation and the U.S. Department of Energy, as well as by the Swiss National Foundation; France's Institut national de physique des particules et de physique nucléaire and La Région des Pays de la Loire; Germany's Max-Planck-Society and Deutsche Forschungsgemeinschaft; Israel's German-Israeli Minerva Gesellschaft and GIF; the Netherlands' FOM; Portugal's Fundação para a Ciência e Tecnologia; Italy's Istituto Nazionale di Fisica Nucleare; and China's STCSM.


Source

Wednesday, April 13, 2011

Better Lasers for Optical Communications

"All indications are that this technology could be useful at both industrial and scientific levels," explains Eli Kapon, head of EPFL's Laboratory of Physics of Nanostructures. More than fifteen years of research were required to arrive at this result, work that "has caused many headaches and demanded significant investment."

To obtain the right wavelength, the EPFL researchers adapted the lasers' size. In parallel, the EMPA scientists designed a nanometer-scale grating for the emitter in order to control the light's polarization. They were able to achieve this feat by vaporizing long molecules containing gold atoms with a straw-like tool operating above the lasers. Using an electron microscope, they were able to arrange and attach gold particles to the surface of each laser with extreme precision. Thus deposited, the grating serves as a filter for polarizing the light, much like the lenses of sunglasses are used to polarize sunlight.
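The sunglasses comparison can be made quantitative: an ideal linear polarizer obeys Malus's law, transmitting a fraction cos²θ of the incident intensity, where θ is the angle between the light's polarization and the polarizer's transmission axis. This is a textbook idealization, not a model of the gold grating itself:

```python
import math

def transmitted_intensity(i0, theta_deg):
    """Malus's law for an ideal linear polarizer: I = I0 * cos^2(theta),
    where theta is the angle between the incoming light's polarization
    axis and the polarizer's transmission axis."""
    theta = math.radians(theta_deg)
    return i0 * math.cos(theta) ** 2

print(transmitted_intensity(1.0, 0))    # aligned axes: full transmission
print(transmitted_intensity(1.0, 45))   # 45 degrees: half the intensity
print(transmitted_intensity(1.0, 90))   # crossed axes: essentially zero
```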

Industrial and scientific advantages

This technique, developed in collaboration with EMPA, has many advantages. It allows a high-speed throughput of several gigabits a second with reduced transmission errors. The lasers involved are energy-efficient, consuming up to ten times less than their traditional counterparts, thanks to their small size. The technique is very precise and efficient, due to the use of the electron microscope.

"This progress is very satisfying," adds Kapon, who also outlines some possible applications. "These kinds of lasers are also useful for studying and detecting gases using spectroscopic methods. We will thus make gains in precision by improving detector sensitivity."


Source

Thursday, April 7, 2011

Battery-Less Chemical Detector Developed

The device overcomes the power requirement of traditional sensors and is simple, highly sensitive and can detect various molecules quickly. Its development could be the first step in making an easily deployable chemical sensor for the battlefield.

The Lab's Yinmin "Morris" Wang and colleagues Daniel Aberg, Paul Erhart, Nipun Misra, Aleksandr Noy and Alex Hamza, along with collaborators from the University of Shanghai for Science and Technology, have fabricated the first-generation battery-less detectors that use one-dimensional semiconductor nanowires.

The nanosensors take advantage of a unique interaction between chemical species and semiconductor nanowire surfaces that stimulate an electrical charge between the two ends of nanowires or between the exposed and unexposed nanowires.

The group tested the battery-less sensors with different types of platforms -- zinc-oxide and silicon -- using ethanol solvent as a testing agent.

In the zinc-oxide sensor the team found there was a change in the electric voltage between the two ends of nanowires when a small amount of ethanol was placed on the detector.

"The rise of the electric signal is almost instantaneous and decays slowly as the ethanol evaporates," Wang said.

However, when the team placed a small amount of a hexane solvent on the device, little electric voltage was seen, "indicating that the nanosensor selectively responds to different types of solvent molecules," Wang said.

The team used more than 15 different types of organic solvents and saw different voltages for each solvent. "This trait makes it possible for our nanosensors to detect different types of chemical species and their concentration levels," Wang said.

The response to different solvents was somewhat similar when the team tested the silicon nanosensors. However, the voltage decay as the solvent evaporated was drastically different from the zinc-oxide sensors. "The results indicate that it is possible to extend the battery-less sensing platform to randomly aligned semiconductor nanowire systems," Wang said.

The team's next step is to test the sensors with more complex molecules such as those from explosives and biological systems.

The research appears on the inside front cover of the Jan. 4 issue of Advanced Materials.


Source

Wednesday, April 6, 2011

Invisibility Cloaks and More: Force of Acoustical Waves Tapped for Metamaterials

Metamaterials are artificial materials that are engineered to have properties not found in nature. These materials usually gain their unusual properties -- such as negative refraction that enables subwavelength focusing, negative bulk modulus, and band gaps -- from structure rather than composition.

By creating an inexpensive bench-top technique, as described in the American Institute of Physics' journal Review of Scientific Instruments, Los Alamos National Lab (LANL) researchers are making these highly desirable metamaterials more accessible.

Their technique harnesses an acoustical wave force, which causes nano-sized particles to cluster in periodic patterns in a host fluid that is later solidified, explains Farid Mitri, a Director's Fellow and member of the Sensors & Electrochemical Devices, Acoustics & Sensors Technology Team at LANL.

"The periodicity of the pattern formed is tunable and almost any kind of particle material can be used, including: metal, insulator, semiconductor, piezoelectric, hollow or gas-filled sphere, nanotubes and nanowires," he elaborates.

The entire process of structure formation is very fast, taking anywhere from 10 seconds to 5 minutes. Mitri and colleagues believe this technique can be easily adapted for large-scale manufacturing and holds the potential to become a platform technology for the creation of a new class of materials with extensive flexibility in terms of periodicity (mm to nm) and the variety of materials that can be used.
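The tunable periodicity follows directly from standing-wave geometry: particles collect at the pressure nodes of the acoustic field, which sit half a wavelength apart, so the pattern period is set by the sound speed in the host fluid and the drive frequency. A sketch under the simplifying assumption of a one-dimensional standing wave (the sound speed and frequencies below are illustrative, not from the article):

```python
def node_spacing(sound_speed_m_s, frequency_hz):
    """Spacing between adjacent pressure nodes of a 1-D acoustic standing
    wave: half the wavelength, lambda / 2 = c / (2 * f)."""
    return sound_speed_m_s / (2.0 * frequency_hz)

# Water-like host fluid (c ~ 1480 m/s): raising the drive frequency
# shrinks the pattern period from sub-millimetre toward sub-micron.
print(node_spacing(1480.0, 1e6))   # 1 MHz -> 7.4e-4 m
print(node_spacing(1480.0, 1e9))   # 1 GHz -> 7.4e-7 m
```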

"This new class of acoustically engineered materials can lead to the discovery of many emergent phenomena, understanding novel mechanisms for the control of material properties, and hybrid metamaterials," says Mitri.

Applications of the technology, to name only a few, include: invisibility cloaks to hide objects from radar and sonar detection, sub-wavelength focusing for production of high-resolution lenses for microscopes and medical ultrasound/optical imaging probes, miniature directional antennas, development of novel anisotropic semiconducting metamaterials for the construction of effective electromagnetic devices, biological scaffolding for tissue engineering, light guides, and a variety of sensors.


Source

Friday, April 1, 2011

New Nanomaterial Can Detect and Neutralize Explosives

"This stuff is going to be used anywhere terrorist explosives are used, including battlefields, airports, and subways," said study leader Allen Apblett, Ph.D. "It's going to save lives."

The material is a type of ink made of tiny metallic oxide nanoparticles -- so small that 50,000 could fit inside the diameter of a single human hair. The ink changes color, from dark blue to pale yellow or clear, in the presence of explosives. It also changes from a metallic conductor to a non-conducting material, making electronic sensing also possible.

This color-change feature allows the material to work as a sensor for quickly detecting the presence of vapors produced by explosives, Apblett said. Soldiers or firefighters could wear the sensors as badges on their uniforms or use them as paper-based test strips. Airports, subways and other facilities could use the sensors as part of stationary monitoring devices. The sensors could even be engineered into jewelry and cell phones, the scientist added.

The same color-changing material can also serve as an explosives neutralizer. Firefighters and bomb squad technicians could spray the ink onto bombs or suspicious packages until the color change indicates that the devices are no longer a threat, Apblett said. Technicians could also dump the explosives into vats containing the ink to neutralize them.

Apblett notes that authorities are concerned about peroxide-based explosives, made from hydrogen peroxide, which are easy to make and set off. These explosives first drew public attention in 2001, when thwarted "shoe bomber" Richard Reid tried to use one such substance as the detonator onboard a commercial airliner. In particular, they are concerned about a substance called triacetone triperoxide, or TATP, sometimes used in suicide vests and improvised explosive devices that have claimed such a toll among troops and civilians. However, current methods of detecting this explosive are ineffective, allowing the material to easily escape detection at airports and other locations.

The new ink provides a quick way to detect and test these explosives, which might be hidden in clothing, food, and beverages. The ink contains nanoparticles of a compound of molybdenum, a metal used in a wide variety of applications including missile and aircraft parts. The dark blue ink reacts with the peroxide explosives and turns yellow or clear.

When used as an electronic sensor, the highly sensitive material is capable of detecting TATP vapors at levels as low as 50 parts per million, equivalent to a few drops of the vapor in a small room, within 30 seconds. The same chemical reaction allows the material to serve as an explosives neutralizer. In lab studies, the scientists showed that they could add the material to TATP or HMTD and make them nonexplosive.

"This does a really good job of neutralizing terrorists' explosives," said Apblett, a chemist at Oklahoma State University in Stillwater, Okla. "I'm excited to see it moving from the lab to the real world."

The material can also improve safety at laboratories that use explosive chemicals. Recently, Apblett developed pellets containing the ink that can be added to laboratory solvents to prevent the build-up of dangerous levels of peroxides, which can cause accidental explosions. The color-changing feature lets the users of the solvents know that they are safe.

Apblett and colleagues founded a company called Xplosafe to develop and market the material. They hope to see the explosive detecting ink used in airports in as little as a year.

The scientists acknowledge funding from the Memorial Institute for the Prevention of Terrorism, the National Science Foundation, the Oklahoma Center for the Advancement of Science and Technology, Xplosafe, and Oklahoma State University.


Source

Monday, March 21, 2011

Engineers Make Breakthrough in Ultra-Sensitive Sensor Technology

The sensor, which is the most sensitive of its kind to date, relies on a completely new architecture and fabrication technique developed by the Princeton researchers. The device boosts faint signals generated by the scattering of laser light from a material placed on it, allowing the identification of various substances based on the color of light they reflect. The sample could be as small as a single molecule.

The technology is a major advance in a decades-long search to identify materials using Raman scattering, a phenomenon discovered in the 1920s by the Indian physicist Chandrasekhara Raman, in which light reflecting off an object carries a signature of its molecular composition and structure.

"Raman scattering has enormous potential in biological and chemical sensing, and could have many applications in industry, medicine, the military and other fields," said Stephen Y. Chou, the professor of electrical engineering who led the research team. "But current Raman sensors are so weak that their use has been very limited outside of research. We've developed a way to significantly enhance the signal over the entire sensor and that could change the landscape of how Raman scattering can be used."

Chou and his collaborators, electrical engineering graduate students Wen-Di Li and Fei Ding and postdoctoral fellow Jonathan Hu, published a paper on their innovation in February in the journal Optics Express. The research was funded by the Defense Advanced Research Projects Agency.

In Raman scattering, a beam of pure one-color light is focused on a target, but the reflected light from the object contains two extra colors of light. The frequencies of these extra colors are unique to the molecular make-up of the substance, providing a potentially powerful method to determine the identity of the substance, analogous to the way a fingerprint or DNA signature helps identify a person.
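Those frequency offsets are conventionally reported as a Raman shift in wavenumbers: the difference between the reciprocal wavelengths of the incident and scattered light. A minimal computation (the wavelengths below are illustrative, not from the article):

```python
def raman_shift_cm1(laser_nm, scattered_nm):
    """Raman shift in wavenumbers (cm^-1):
    delta_nu = 1/lambda_laser - 1/lambda_scattered,
    with both wavelengths converted from nm to cm (1 nm = 1e-7 cm)."""
    return 1.0 / (laser_nm * 1e-7) - 1.0 / (scattered_nm * 1e-7)

# A 532 nm laser with Stokes-scattered light at 563 nm corresponds to a
# shift of roughly 1035 cm^-1, a typical molecular vibration energy.
print(round(raman_shift_cm1(532.0, 563.0)))
```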

Since Raman first discovered the phenomenon -- a breakthrough that earned him the Nobel Prize -- engineers have dreamed of using it in everyday devices to identify the molecular composition and structures of substances, but for many materials the strength of the extra colors of reflected light was too weak to be seen even with the most sophisticated laboratory equipment.

Researchers discovered in the 1970s that the Raman signals were much stronger if the substance to be identified is placed on a rough metal surface or tiny particles of gold or silver. The technique, known as surface enhanced Raman scattering (SERS), showed great promise, but even after four decades of research has proven difficult to put to practical use. The strong signals appeared only at a few random points on the sensor surface, making it difficult to predict where to measure the signal and resulting in a weak overall signal for such a sensor.

Abandoning the previous methods for designing and manufacturing the sensors, Chou and his colleagues developed a completely new SERS architecture: a chip studded with uniform rows of tiny pillars made of metals and insulators.

One secret of the Chou team's design is that their pillar arrays are fundamentally different from those explored by other researchers. Their structure has two key components: a cavity formed by metal on the top and at the base of each pillar; and metal particles of about 20 nanometers in diameter, known as plasmonic nanodots, on the pillar wall, with small gaps of about 2 nanometers between the metal components.

The small particles and gaps significantly boost the Raman signal. The cavities serve as antennae, trapping light from the laser so it passes the plasmonic nanodots multiple times to generate the Raman signal rather than only once. The cavities also enhance the outgoing Raman signal.

Chou's team named their new sensor "disk-coupled dots-on-pillar antenna-array," or D2PA for short.

So far, the chip is a billion times (10⁹) more sensitive than was possible without SERS boosting of Raman signals, and the sensor is uniformly sensitive, making it more reliable for use in sensing devices. Such sensitivity is several orders of magnitude higher than previously reported.

Already, researchers at the U.S. Naval Research Laboratory are experimenting with a less sensitive chip to explore whether the military could use the technology pioneered at Princeton for detecting chemicals, biological agents and explosives.

In addition to being far more sensitive than its predecessors, the Princeton chip can be manufactured inexpensively at large sizes and in large quantities. This is due to the easy-to-build nature of the sensor and a new combination of two powerful nanofabrication technologies: nanoimprint, a method that allows tiny structures to be produced in cookie-cutter fashion; and self-assembly, a technique where tiny particles form on their own. Chou's team has produced these sensors on 4-inch wafers (the basis of electronic chips) and can scale the fabrication to much larger wafer size.

"This is a very powerful method to identify molecules," Chou said."The combination of a sensor that enhances signals far beyond what was previously possible, that's uniform in its sensitivity and that's easy to mass produce could change the landscape of sensor technology and what's possible with sensing."


Source

Thursday, March 3, 2011

New Developments in Quantum Computing

At the Association for Computing Machinery's 43rd Symposium on Theory of Computing in June, associate professor of computer science Scott Aaronson and his graduate student Alex Arkhipov will present a paper describing an experiment that, if it worked, would offer strong evidence that quantum computers can do things that classical computers can't. Although building the experimental apparatus would be difficult, it shouldn't be as difficult as building a fully functional quantum computer.

Aaronson and Arkhipov's proposal is a variation on an experiment conducted by physicists at the University of Rochester in 1987, which relied on a beam splitter, a device that takes an incoming beam of light and splits it into two beams traveling in different directions. The Rochester researchers demonstrated that if two identical light particles -- photons -- reach the beam splitter at exactly the same time, they will both go either right or left; they won't take different paths. It's another quantum behavior of fundamental particles that defies our physical intuitions.

The MIT researchers' experiment would use a larger number of photons, which would pass through a network of beam splitters and eventually strike photon detectors. The number of detectors would be somewhere in the vicinity of the square of the number of photons -- about 36 detectors for six photons, 100 detectors for 10 photons.

For any run of the MIT experiment, it would be impossible to predict how many photons would strike any given detector. But over successive runs, statistical patterns would begin to build up. In the six-photon version of the experiment, for instance, it could turn out that there's an 8 percent chance that photons will strike detectors 1, 3, 5, 7, 9 and 11, a 4 percent chance that they'll strike detectors 2, 4, 6, 8, 10 and 12, and so on, for any conceivable combination of detectors.

Calculating that distribution -- the likelihood of photons striking a given combination of detectors -- is a hard problem. The researchers' experiment doesn't solve it outright, but every successful execution of the experiment does take a sample from the solution set. One of the key findings in Aaronson and Arkhipov's paper is that, not only is calculating the distribution a hard problem, but so is simulating the sampling of it. For an experiment with more than, say, 100 photons, it would probably be beyond the computational capacity of all the computers in the world.

The question, then, is whether the experiment can be successfully executed. The Rochester researchers performed it with two photons, but getting multiple photons to arrive at a whole sequence of beam splitters at exactly the right time is more complicated. Barry Sanders, director of the University of Calgary's Institute for Quantum Information Science, points out that in 1987, when the Rochester researchers performed their initial experiment, they were using lasers mounted on lab tables and getting photons to arrive at the beam splitter simultaneously by sending them down fiber-optic cables of different lengths. But recent years have seen the advent of optical chips, in which all the optical components are etched into a silicon substrate, which makes it much easier to control the photons' trajectories.

The biggest problem, Sanders believes, is generating individual photons at predictable enough intervals to synchronize their arrival at the beam splitters. "People have been working on it for a decade, making great things," Sanders says. "But getting a train of single photons is still a challenge."

Sanders points out that even if the problem of getting single photons onto the chip is solved, photon detectors still have inefficiencies that could make their measurements inexact: in engineering parlance, there would be noise in the system. But Aaronson says that he and Arkhipov explicitly consider the question of whether simulating even a noisy version of their optical experiment would be an intractably hard problem. Although they were unable to prove that it was, Aaronson says that "most of our paper is devoted to giving evidence that the answer to that is yes." He's hopeful that a proof is forthcoming, whether from his research group or others'.


Source

Monday, February 28, 2011

Key to Safer Remote Detection of Dangerous Materials

Clough, a student in the Department of Electrical, Computer, and Systems Engineering at Rensselaer, is one of three finalists for the 2011 $30,000 Lemelson-MIT Rensselaer Student Prize. Clough's project is titled "Terahertz Enhanced Acoustics," and his faculty adviser is Xi-Cheng Zhang, the J. Erik Jonsson Professor of Science at Rensselaer and director of the university's Center for Terahertz Research.

The Rensselaer Center for Terahertz Research is one of the most active groups worldwide in applying terahertz wave technology to security and defense applications. Sensors using terahertz waves can penetrate packaging materials or clothing and identify the unique terahertz "fingerprints" of many hidden materials. Terahertz waves occupy a large segment of the electromagnetic spectrum between the infrared and microwave bands. Unlike X-rays and microwaves, terahertz radiation is very low energy and poses no known health threat to humans.

A key practical limitation of terahertz technology, however, is that it only works over short distances. Naturally occurring moisture in air absorbs terahertz waves, weakening the signal and sensing capabilities. This distance limitation is not ideal for applications in bomb or hazardous material detection, where the human operator wants to be as far away as possible from the potential threat.
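This distance limit is the familiar exponential attenuation law: intensity falls off as I = I0·exp(−αd), and for terahertz waves in humid air the absorption coefficient α is large. A sketch with an illustrative, not measured, coefficient:

```python
import math

def transmitted_fraction(alpha_per_m, distance_m):
    """Beer-Lambert attenuation: the fraction of intensity surviving a
    path of length d through a medium with absorption coefficient alpha,
    I / I0 = exp(-alpha * d)."""
    return math.exp(-alpha_per_m * distance_m)

# With an illustrative alpha of 0.5 per metre (humid air absorbs strongly
# at many terahertz frequencies), little signal survives a 10 m path.
print(transmitted_fraction(0.5, 1.0))    # ~0.61 remains after 1 m
print(transmitted_fraction(0.5, 10.0))   # ~0.0067 remains after 10 m
```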

Clough's patent-pending solution to this problem is a new method for using sound waves to remotely "listen" to terahertz signals from a distance. Focusing two laser beams into air creates small bursts of plasma, which in turn create terahertz pulses. Another pair of lasers is aimed near the target of interest to create a second plasma for detecting the terahertz pulses after they have interacted with the material. This detection plasma produces acoustic waves as it ionizes the air. Clough discovered that by using a sensitive microphone to "listen" to the plasma, he could detect terahertz wave information embedded in these sound waves. This audio information can then be converted into digital data and instantly checked against a library of known terahertz fingerprints, to determine the chemical composition of the mystery material.
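The final library-lookup step can be sketched as a nearest-match search. The article does not specify the matching algorithm, so the similarity score (a normalized dot product) and the fingerprint entries below are invented purely for illustration:

```python
import math

def cosine_similarity(a, b):
    """Normalized dot product of two spectra sampled on the same grid."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def identify(measured, library):
    """Return the library entry whose fingerprint best matches the measurement."""
    return max(library, key=lambda name: cosine_similarity(measured, library[name]))

# Hypothetical 5-point absorption fingerprints (arbitrary units).
library = {
    "substance A": [0.1, 0.9, 0.2, 0.1, 0.1],
    "substance B": [0.1, 0.1, 0.2, 0.9, 0.1],
}
print(identify([0.1, 0.8, 0.3, 0.1, 0.2], library))  # closest to substance A
```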

So far, Clough has successfully demonstrated the ability to use acoustics to identify the terahertz fingerprints from several meters away. He has separately demonstrated plasma acoustic detection from 11 meters, limited only by available lab space. Along with the increased distance from the potentially hazardous material, an additional advantage is that his system does not require a direct line of sight to collect signals, as the microphone can still capture the audio information. Potential applications of Clough's invention, which circumvents the fundamental limitations of remote terahertz spectroscopy, include environmental monitoring of atmospheric conditions, monitoring smokestack emissions, inspecting suspicious packages, or even detecting land mines -- all from a safe distance.

Clough has presented his findings at several international conferences, and the details of his work have been published in Optics Letters and Physical Review E. His new method for terahertz sensing has created the possibility of obtaining terahertz spectroscopic information from a distance, bypassing a key limitation of high terahertz absorption by water vapor in air.

A National Science Foundation Integrative Graduate Education and Research Traineeship (IGERT) fellow, Clough is deeply committed to his research activities.


Source

Monday, February 21, 2011

Advanced NASA Instrument Gets Close-Up on Mars Rocks

The Alpha Particle X-Ray Spectrometer (APXS) instrument, designed by physics professor Ralf Gellert of the University of Guelph in Ontario, Canada, uses alpha particles, or helium nuclei, and X-rays to bombard a target, causing the target to give off its own characteristic alpha particles and X-ray radiation. This radiation is "read by" an X-ray detector inside the sensor head, which reveals which elements are in the rock or soil and how much of each is present.
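The "reading" step amounts to matching detected X-ray peak energies against the characteristic emission lines of each element. A toy lookup under the assumption of well-resolved peaks; the line energies are approximate K-alpha values and the matching routine is illustrative, not the APXS pipeline:

```python
# Approximate K-alpha X-ray line energies in keV (illustrative subset
# covering the lighter and heavier elements named in the article).
K_ALPHA_KEV = {
    "Na": 1.04, "Mg": 1.25, "Al": 1.49, "Si": 1.74,
    "Fe": 6.40, "Ni": 7.48, "Zn": 8.64,
}

def identify_element(peak_kev, tolerance=0.05):
    """Match a detected peak energy to the nearest tabulated line,
    returning None if no line lies within the tolerance window."""
    best = min(K_ALPHA_KEV, key=lambda el: abs(K_ALPHA_KEV[el] - peak_kev))
    if abs(K_ALPHA_KEV[best] - peak_kev) <= tolerance:
        return best
    return None

print(identify_element(6.41))  # matches the Fe line at ~6.40 keV
print(identify_element(3.00))  # no tabulated line nearby -> None
```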

Identifying the elemental composition of lighter elements such as sodium, magnesium or aluminum, as well as heavier elements like iron, nickel or zinc, will help scientists identify the building blocks of the Martian crust. By comparing these findings with those of previous Mars rover findings, scientists can determine if any weathering has taken place since the rock formed ages ago.

All NASA Mars rovers have carried a similar instrument -- Pathfinder's rover Sojourner, Spirit and Opportunity, and now Curiosity, too. Improvements have been made with each generation, but the basic design of the instrument has remained the same.

"APXS was modified for Mars Science Laboratory to be faster so it could make quicker measurements. On the Mars Exploration Rovers [Spirit and Opportunity] it took us five to 10 hours to get information that we will now collect in two to three hours," said Gellert, the instrument's principal investigator. "We hope this will help us to investigate more samples."

Another significant change to the next-generation APXS is the cooling system on the X-ray detector chip. The instruments used on Spirit and Opportunity were able to take measurements only at night. But the new cooling system will allow the instrument on Curiosity to take measurements during the day, too.

The main electronics portion of the tissue-box-sized instrument lives in the rover's body, while the sensor head, the size of a soft drink can, is mounted on the robotic arm. With the help of Curiosity's remote sensing instruments -- the Chemistry and Camera (ChemCam) instrument and the Mastcam -- the rover team will decide where to drive Curiosity for a closer look with the instruments, including APXS. Measurements are taken with the APXS by deploying the sensor head to make direct contact with the desired sample.

The rover's brush will be used to remove dust from rocks to prepare them for inspection by APXS and by MAHLI, the rover's arm-mounted, close-up camera. Whenever promising samples are found, the rover will then use its drill to extract a few grains and feed them into the rover's analytical instruments, SAM and CheMin, which will then make very detailed mineralogical and other investigations.

Scientists will use information from APXS and the other instruments to find the interesting spots and to figure out the present and past environmental conditions that are preserved in the rocks and soils.

"The rovers have answered a lot of questions, but they've also opened up new questions," said Gellert. "Curiosity was designed to pick up where Spirit and Opportunity left off."

JPL, a division of the California Institute of Technology in Pasadena, manages the Mars Science Laboratory mission for the NASA Science Mission Directorate, Washington.

For more information about the mission, visit http://mars.jpl.nasa.gov/msl/. To watch the spacecraft being assembled and tested, visit http://www.ustream.tv/nasajpl.


Source

Wednesday, February 16, 2011

Sentries in the Garden Shed: Plants That Can Detect Environmental Contaminants, Explosives

The stuff of science fiction you say? Not so, says a Colorado State University biologist whose research is funded in part by Homeland Security's Science and Technology Directorate (DHS S&T), as well as by the Defense Advanced Research Projects Agency (DARPA), the Office of Naval Research (ONR), and others.

Dr. June Medford and her team in the Department of Biology at Colorado State have shown that plants can serve as highly specific sentries for environmental pollutants and explosives. She's enabled a computer-designed detection trait to work in plants. How? By rewiring the plant's natural signaling process so that a detection of the bad stuff results in the loss of green color.

Based on research so far, Medford says the detection abilities of some plants (tobacco is an example) are similar to, or even better than, those of a dog's snout, long the hallmark of a good detector. Best of all, the training time is nothing compared to that of a dog.

"The idea comes directly from nature," Medford said. "Plants can't run or hide from threats, so they've developed sophisticated systems to detect and respond to their environment. We've 'taught' plants how to detect things we're interested in and respond in a way anyone can see, to tell us there is something nasty around, by modifying the way the plant's proteins process chlorophyll. Our system, with improvements, may allow plants to serve as a simple and inexpensive means to monitor human surroundings for substances such as pollutants, explosives, or chemical agents."

The detection traits could be used in any plant and could detect multiple pollutants at once -- changes that can also be detected by satellite. While visible change in the plant is apparent after a day, the reaction can be remotely sensed within a couple of hours. A spectral imaging system designed specifically for the detection of de-greening biosensors would provide the fastest indication of a threat detected by the plants.
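Detecting de-greening from imagery is the kind of task a vegetation index handles. The article does not name the index its spectral imaging system would use; NDVI, computed from near-infrared and red reflectance, is a standard illustrative choice that drops as chlorophyll is lost:

```python
def ndvi(nir, red):
    """Normalized Difference Vegetation Index: (NIR - red) / (NIR + red).
    Healthy, chlorophyll-rich foliage reflects strongly in the
    near-infrared and absorbs red light, giving values near 1;
    de-greened or stressed plants score noticeably lower."""
    return (nir - red) / (nir + red)

# Illustrative reflectance values (fractions of incident light).
print(ndvi(nir=0.50, red=0.08))  # healthy plant: ~0.72
print(ndvi(nir=0.40, red=0.30))  # de-greened plant: ~0.14
```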

Computational design of the detection trait was initially done in collaboration with Professor Homme Hellinga at Duke University, and more recently with Professor David Baker at the University of Washington. The Baker and Hellinga laboratories used a computer program to redesign naturally occurring proteins called receptors. These redesigned receptors specifically recognize a pollutant or explosive. Medford's lab then modifies these computer-redesigned receptors to function in plants, and targets them to the plant cell wall, where they can recognize pollutants or explosives in the air or soil near the plant. Once the substance is detected, an internal signal causes the plant to turn white.

Medford and her team want to speed up detection time. The initial or first-generation plants respond to an explosive in hours, but improvements are underway to reduce the response time to just a few minutes. A faster response time increases the likelihood of identifying the threat and preventing an attack.

"At this point in the research, it takes hours to achieve a visible change in the foliage," says Doug Bauer, DHS S&T's program manager on the research. "Ideally, we'd want the reaction to be considerably faster." In addition to faster response times, Bauer says, in the next generation of the research the indicator may appear in a non-visible part of the spectrum, such as infrared, by using color-changing methods other than the suppression of chlorophyll. That way, law enforcement equipped with the appropriate sensors would be alerted, but a terrorist would not be tipped off.

A decentralized, ubiquitous detection capability could allow the early detection of bomb-manufacturing sites, instead of waiting for a potential bomber to show up at a transportation hub or other target zone.

There are still many, many years of research to go before any possible deployment of plant sentinels. Once the research achieves a point where it may be possible to deploy, there are other considerations that will have to be taken into account and additional studies to be conducted. For example, USDA regulations stipulate that genetically-altered plants must go through a rigorous study on their impact to and interaction with the environment before they can be cultivated or planted in the United States.

This work could eventually be used for a wide range of applications such as security in airports or monitoring for pollutants such as radon, a carcinogenic gas that can be found in basements. Harnessing plants as bio-sensors allows for distributed sensing without the need for a power supply. "One day, plants may assist law enforcement officers in detecting meth labs or help emergency responders determine where hazardous chemicals are leaking," Bauer says. "The fact that DoD, DHS and a variety of other agencies contributed to funding this research is an indicator of the breadth of possibilities."

Financial support for this research was provided by the Defense Advanced Research Projects Agency (DARPA), the Office of Naval Research (ONR), the Bioscience Discovery Evaluation Grant Program through the Colorado Office of Economic Development and International Trade, the National Science Foundation (NSF), the Department of Homeland Security Science and Technology Directorate (DHS S&T), and Gitam Technologies. Most recently, Medford and her team received a three-year, $7.9 million grant from the DoD's Defense Threat Reduction Agency.

The research from Medford's team appeared in the peer-reviewed journal PLoS ONE.


Source

Friday, February 11, 2011

Scientists Elevate Warfighter Readiness Against Invisible Threats

A research team, led by Drs. Joshua Caldwell and Orest Glembocki, scientists at the U.S. Naval Research Laboratory, Electronic Science and Technology Division, has overcome the sensitivity limitations of conventional Raman scattering with surface enhanced Raman scattering (SERS) using optically stimulated plasmon oscillations in nanostructured substrates.

Shown to provide strong enhancements of the Raman signal, large-area gold (Au) coated silicon (Si) nanopillar arrays are over 100 million times (10⁸) more sensitive than Raman scattering sensing alone, while maintaining a very uniform response with less than 30 percent variability across the sensor area.

"These arrays are over an order-of-magnitude more sensitive than the best reported SERS sensors in the literature and the current state-of-the-art large-area commercial SERS sensors," said Caldwell. "These arrays can be a key component of fully integrated, autonomously operating chemical sensors that detect, identify and report the presence of a threat at trace levels of exposure."

Raman devices use laser light to excite molecular vibrations, which in turn causes a shift in the energy of the scattered laser photons, up or down, creating a unique visual pattern. In the case of trace amounts of molecules in gases or liquids, detection through ordinary Raman scattering is virtually impossible. However, the Raman signal can be enhanced via the SERS effect using metal nanoparticles.
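That energy shift is conventionally reported in wavenumbers (cm⁻¹): the shift is the difference between the reciprocal incident and scattered wavelengths. The sketch below illustrates the conversion; the 532 nm laser and 575 nm scattered wavelength are example values chosen for illustration, not figures from the NRL work.

```python
# Illustrative Raman shift calculation in wavenumbers (cm^-1).
# Shift = 1/lambda_incident - 1/lambda_scattered, with wavelengths in cm.
# The wavelengths below are hypothetical example values.

def raman_shift_cm1(laser_nm, scattered_nm):
    """Raman (Stokes) shift in cm^-1 for given wavelengths in nanometres."""
    return 1.0 / (laser_nm * 1e-7) - 1.0 / (scattered_nm * 1e-7)

shift = raman_shift_cm1(532.0, 575.0)
print(f"Raman shift: {shift:.0f} cm^-1")
```

A table of such shifts, one per vibrational mode, is the "fingerprint" that identifies the molecule.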

Although surface-enhanced Raman scattering was first observed in the late 1970s, efforts to produce reproducible SERS-based chemical sensors have been hindered by the inability to make large-area devices with a uniform SERS response. The ability to reproducibly pattern nanometer-sized particles in periodic arrays has finally allowed this requirement to be met.

"While many tools are currently available to detect trace amounts of chemical warfare and biological agents and explosive compounds, a device using SERS can be used to identify these minute quantities of the chemicals of interest by providing a 'fingerprint' of the material, which all but eliminates the prevalence of false alarms," says Glembocki.

SERS offers several potential advantages over other spectroscopic techniques because of its measurement speed, high sensitivity, portability, and simple maneuverability. SERS can additionally be used to enhance existing Raman technologies, such as the hand held and standoff units that are already in use in field applications.


Source

Thursday, February 3, 2011

'Air Laser' May Sniff Bombs, Pollutants from a Distance

"We are able to send a laser pulse out and get another pulse back from the air itself," said Richard Miles, a professor of mechanical and aerospace engineering at Princeton, the research group leader and co-author on the paper. "The returning beam interacts with the molecules in the air and carries their fingerprints."

The new technique differs from previous remote laser-sensing methods in that the returning beam of light is not just a reflection or scattering of the outgoing beam. It is an entirely new laser beam generated by oxygen atoms whose electrons have been "excited" to high energy levels. This "air laser" is a much more powerful tool than previously existed for remote measurements of trace amounts of chemicals in the air.

The researchers, whose work is funded by the Office of Naval Research's basic research program on Sciences Addressing Asymmetric Explosive Threats, published their new method Jan. 28 in the journal Science.

Miles collaborated with three other researchers: Arthur Dogariu, the lead author on the paper; James Michael of Princeton; and Marlan Scully, a professor with joint appointments at Princeton and Texas A&M University.

The new laser sensing method uses an ultraviolet laser pulse that is focused on a tiny patch of air, similar to the way a magnifying glass focuses sunlight into a hot spot. Within this hot spot -- a cylinder-shaped region just 1 millimeter long -- oxygen atoms become "excited" as their electrons get pumped up to high energy levels. When the pulse ends, the electrons fall back down and emit infrared light. Some of this light travels along the length of the excited cylinder region and, as it does so, it stimulates more electrons to fall, amplifying and organizing the light into a coherent laser beam aimed right back at the original laser.

Researchers plan to use a sensor to receive the returning beam and determine what contaminants it encountered on the way back.

"In general, when you want to determine if there are contaminants in the air you need to collect a sample of that air and test it," Miles said. "But with remote sensing you don't need to do that. If there's a bomb buried on the road ahead of you, you'd like to detect it by sampling the surrounding air, much like bomb-sniffing dogs can do, except from far away. That way you're out of the blast zone if it explodes. It's the same thing with hazardous gases -- you don't want to be there yourself. Greenhouse gases and pollutants are up in the atmosphere, so sampling is difficult."

The most commonly used remote laser-sensing method, LIDAR -- short for light detection and ranging -- measures the scattering of a beam of light as it reflects off a distant object and returns back to a sensor. It is commonly used for measuring the density of clouds and pollution in the air, but can't determine the actual identity of the particles or gases. Variants of this approach can identify contaminants, but are not sensitive enough to detect trace amounts and cannot determine the location of the gases with much accuracy.

The returning beam is thousands of times stronger in the method developed by the Princeton researchers, which should allow them to determine not just how many contaminants are in the air but also the identity and location of those contaminants.

The stronger signal should also allow for detection of much smaller concentrations of airborne contaminants, a particular concern when trying to detect trace amounts of explosive vapors. Any chemical explosive emits various gases depending on its ingredients, but for many explosives the amount of gas is miniscule.

While the researchers are developing the underlying methods rather than deployable detectors, they envision a device that is small enough to be mounted on, for example, a tank and used to scan a roadway for bombs.

So far, the researchers have demonstrated the process in the laboratory over a distance of about a foot and a half. In the future they plan to increase the distance over which the beams travel, which they note is a straightforward matter of focusing the beam farther away. They also plan to fine-tune the sensitivity of the technique to identify small amounts of airborne contaminants.

In addition, the research group is developing other approaches to remote detection involving a combination of lasers and radar.

"We'd like to be able to detect contaminants that are below a few parts per billion of the air molecules," Miles said. "That's an incredibly small number of molecules to find among the huge number of benign air molecules."
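To put "a few parts per billion" in perspective, a quick ideal-gas estimate shows how many molecules that mixing ratio still represents in a single cubic centimetre of air. The temperature and pressure below are standard room conditions, chosen for illustration.

```python
# Ideal-gas estimate: how many molecules does "1 part per billion" mean
# in one cubic centimetre of air at 25 C and 1 atm?

k_B = 1.380649e-23   # Boltzmann constant, J/K
T = 298.15           # temperature, K (25 C)
P = 101325.0         # pressure, Pa (1 atm)

n_per_m3 = P / (k_B * T)          # total number density, molecules per m^3
n_per_cm3 = n_per_m3 * 1e-6       # per cubic centimetre
trace_per_cm3 = n_per_cm3 * 1e-9  # contaminant molecules at 1 ppb

print(f"air molecules per cm^3: {n_per_cm3:.2e}")
print(f"1 ppb contaminant per cm^3: {trace_per_cm3:.2e}")
```

So even at 1 ppb there are tens of billions of target molecules per cubic centimetre; the difficulty Miles describes is picking their signal out of the roughly 10⁹ times more abundant background.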


Source

Wednesday, February 2, 2011

Sensors to Detect Explosives, Monitor Food Being Developed

"There are many dangerous substances, pollutants and infectious bacteria we are constantly exposed to," said Rigoberto Advincula, a highly cited materials scientist at UH. "Our work is poised to assist in such efforts as rapidly detecting explosives or banned substances in airports for homeland security, as well as monitoring commercial products like milk and pet food for substandard additive products. There is a need to measure this quantitatively and in a rapid manner."

In a two-stage effort on which a provisional patent has been filed, Advincula's team fabricated the polymer materials and then built a device that was used as a sensor. The work is based on what he calls "the artificial receptor concept." This is akin to an enzyme functioning as a biochemical catalyst within a cell, or an antibody binding with specific molecules to produce a specific effect in the cell. The elements in Advincula's work, however, deal with metals and plastics and are called molecularly imprinted polymers (MIPs), a concept also used for making plastic antibodies. These polymers show a certain chemical affinity for the original molecule and can be used to fabricate sensors.

Based on electrochemistry, the films were prepared by electrodeposition, a process similar to the electroplating used for metals in the automotive and metal industries. The key innovation was to carry out a process called electropolymerization directly on a gold surface attached to a digital readout. The group's next step is to put this film on portable devices so they can act as sensors.

"Our materials and methods open up these applications toward portable devices and miniaturization. Our device will allow, in principle, the development of hand-held scanners for bomb detection or nerve agent detection in airports," Advincula said. "This means accurate answers in a rapid manner without loss of time or use of complicated instruments. We can achieve very high sensitivity and selectivity in sensing. The design of our molecules and their fabrication methods have been developed in a simple, yet effective, manner."

The culmination of a year's work, the research being published simultaneously in three journals is a record for Advincula's group. These publications -- Macromolecules, Applied Materials & Interfaces, and Biosensors & Bioelectronics -- are among some of the most highly cited in this area of study.

Macromolecules is the most-cited journal on polymer science, and Applied Materials & Interfaces is an up-and-coming international forum for applied materials science and engineering. Both are put out by the American Chemical Society (ACS), which provides a comprehensive collection of well-cited, peer-reviewed journals in the chemical sciences. Biosensors & Bioelectronics is the principal international journal devoted to research, design, development and application of such devices and is published by Elsevier, one of the world's leading publishers of science and health information.

In the coming year, the researchers hope to expand the work to many other types of dangerous chemicals and also to proteins given off by pathogens. Ultimately, they plan to create portable hand-held devices for detection that will be made commercially available to the general public, as well as being of interest to the military. Advincula plans to seek additional funding and collaborators to reach these goals. Advincula currently has funding from the National Science Foundation and also works in collaboration with some companies.

Student success is another key element that Advincula emphasizes. In addition to becoming an ACS fellow last year, he received UH's undergraduate research mentoring award and is particularly committed to student success in a materials discovery environment. He says there are few labs like his that have the capability to develop all the chemistry in concert with developing the device and doing the surface analysis all in one location. The set-up provides students a unique environment for discovery.

The two students working on this project who are being trained in his lab are Roderick Pernites and Ramakrishna Ponnapati. Pernites, who is finishing his Ph.D. this semester, recently received the ACS best poster award in the colloids division and currently has eight publications in Advincula's group. Ponnapati, who studied under Advincula and received his Ph.D. in 2009, is now a postdoctoral researcher in UH's department of chemical and biomolecular engineering and previously received a best student award from the Society of Plastics Engineers chapter in Houston.


Source

Tuesday, February 1, 2011

Hunt for Dark Matter Closes in at Large Hadron Collider

The scientists have now carried out the first full run of experiments that smash protons together at almost the speed of light. When these sub-atomic particles collide at the heart of the CMS detector, the resultant energies and densities are similar to those that were present in the first instants of the Universe, immediately after the Big Bang some 13.7 billion years ago. The unique conditions created by these collisions can lead to the production of new particles that would have existed in those early instants and have since disappeared.

The researchers say they are well on their way to being able to either confirm or rule out one of the primary theories that could solve many of the outstanding questions of particle physics, known as Supersymmetry (SUSY). Many hope it could be a valid extension for the Standard Model of particle physics, which describes the interactions of known subatomic particles with astonishing precision but fails to incorporate general relativity, dark matter and dark energy.

Dark matter is an invisible substance that we cannot detect directly but whose presence is inferred from the rotation of galaxies. Physicists believe that it makes up about a quarter of the mass of the Universe whilst the ordinary and visible matter only makes up about 5% of the mass of the Universe. Its composition is a mystery, leading to intriguing possibilities of hitherto undiscovered physics.

Professor Geoff Hall from the Department of Physics at Imperial College London, who works on the CMS experiment, said: "We have made an important step forward in the hunt for dark matter, although no discovery has yet been made. These results have come faster than we expected because the LHC and CMS ran better last year than we dared hope and we are now very optimistic about the prospects of pinning down Supersymmetry in the next few years."

The energy released in proton-proton collisions in CMS manifests itself as particles that fly away in all directions. Most collisions produce known particles but, on rare occasions, new ones may be produced, including those predicted by SUSY -- known as supersymmetric particles, or 'sparticles'. The lightest sparticle is a natural candidate for dark matter as it is stable and CMS would only 'see' these objects through an absence of their signal in the detector, leading to an imbalance of energy and momentum.

In order to search for sparticles, CMS looks for collisions that produce two or more high-energy 'jets' (bunches of particles travelling in approximately the same direction) and significant missing energy.

Dr Oliver Buchmueller, also from the Department of Physics at Imperial College London, but who is based at CERN, explained: "We need a good understanding of the ordinary collisions so that we can recognise the unusual ones when they happen. Such collisions are rare but can be produced by known physics. We examined some 3 trillion proton-proton collisions and found 13 'SUSY-like' ones, around the number that we expected. Although no evidence for sparticles was found, this measurement narrows down the area for the search for dark matter significantly."

The physicists are now looking forward to the 2011 run of the LHC and CMS, which is expected to bring in data that could confirm Supersymmetry as an explanation for dark matter.

The CMS experiment is one of two general purpose experiments designed to collect data from the LHC, along with ATLAS (A Toroidal LHC ApparatuS). Imperial's High Energy Physics Group has played a major role in the design and construction of CMS and now many of the members are working on the mission to find new particles, including the elusive Higgs boson particle (if it exists), and solve some of the mysteries of nature, such as where mass comes from, why there is no anti-matter in our Universe and whether there are more than three spatial dimensions.


Source

Friday, January 21, 2011

Single Photon Management for Quantum Computers

In principle, quantum computers can perform calculations that are impossible or impractical using conventional computers by taking advantage of the peculiar rules of quantum mechanics. To do this, they need to operate on things that can be manipulated into specific quantum states. Photons are among the leading contenders.

The new NIST papers address one of the many challenges to a practical quantum computer: the need for a device that produces photons in ready quantities, but only one at a time, and only when the computer's processor is ready to receive them. Just as garbled data will confuse a standard computer, an information-bearing photon that enters a quantum processor together with other particles -- or when the processor is not expecting it -- can ruin a calculation.

The single-photon source has been elusive for nearly two decades, in part because no method of producing these particles individually is ideal. "It's a bit like playing a game of whack-a-mole, where solving one problem creates others," says Alan Migdall of NIST's Optical Technology Division. "The best you can do is keep all the issues under control somewhat. You can never get rid of them."

The team's first paper addresses the need to be certain that a photon is indeed coming when the processor is expecting it, and that none show up unexpectedly. Many kinds of single-photon sources create a pair of photons and send one of them to a detector, which tips off the processor to the fact that the second, information-bearing photon is on its way. But since detectors are not completely accurate, sometimes they miss the "herald" photon -- and its twin zips into the processor, gumming up the works.

The team effort, in collaboration with researchers from the Italian metrology laboratory L'Istituto Nazionale di Ricerca Metrologica (INRIM), handled the issue by building a simple gate into the source. When a herald photon reaches the detector, the gate opens, allowing the second photon past. "You get a photon when you expect one, and you don't get one when you don't," Migdall says. "It was an obvious solution; others proposed it long ago, but we were just the first ones to build it. It makes the single photon source better."

In a second paper, the NIST team describes a photon source to address two other requirements. Quantum computers will need many such sources working in parallel, so sources must be able to be built in large numbers and operate reliably; and so that the computer can tell the photons apart, the sources must create multiple individual photons, but all at different wavelengths. The team outlines a way to create just such a source out of silicon, which has been well-understood by the electronics industry for decades as the material from which standard computer chips are built.

"Ordinarily a particular material can produce only pairs in a specific pair of wavelengths, but our design allows production of photons at a number of regular and distinct wavelengths simultaneously, all from one source," Migdall says. "Because the design is compatible with microfabrication techniques, this accomplishment is the first step in the process of creating sources that are part of integrated circuits, not just prototype computers that work in the hothouse of the lab."


Source

Wednesday, January 19, 2011

Better Than the Human Eye: Tiny Camera With Adjustable Zoom Could Aid Endoscopic Imaging, Robotics, Night Vision

The "eyeball camera" has a 3.5x optical zoom, takes sharp images, is inexpensive to make and is only the size of a nickel. (A higher zoom is possible with the technology.)

While the camera won't be appearing at Best Buy any time soon, the tunable camera -- once optimized -- should be useful in many applications, including night-vision surveillance, robotic vision, endoscopic imaging and consumer electronics.

"We were inspired by the human eye, but we wanted to go beyond the human eye," said Yonggang Huang, Joseph Cummings Professor of Civil and Environmental Engineering and Mechanical Engineering at Northwestern's McCormick School of Engineering and Applied Science. "Our goal was to develop something simple that can zoom and capture good images, and we've achieved that."

The tiny camera combines the best of both the human eye and an expensive single-lens reflex (SLR) camera with a zoom lens. It has the simple lens of the human eye, allowing the device to be small, and the zoom capability of the SLR camera without the bulk and weight of a complex lens. The key is that both the simple lens and photodetectors are on flexible substrates, and a hydraulic system can change the shape of the substrates appropriately, enabling a variable zoom.

The research is being published the week of Jan. 17 by the Proceedings of the National Academy of Sciences (PNAS).

Huang, co-corresponding author of the PNAS paper, led the theory and design work at Northwestern. His colleague John Rogers, the Lee J. Flory Founder Chair in Engineering and professor of materials science and engineering at the University of Illinois, led the design, experimental and fabrication work. Rogers is a co-corresponding author of the paper.

Earlier eyeball camera designs are incompatible with variable zoom because these cameras have rigid detectors. The detector must change shape as the in-focus image changes shape with magnification. Huang and Rogers and their team use an array of interconnected and flexible silicon photodetectors on a thin, elastic membrane, which can easily change shape. This flexibility opens up the field of possible uses for such a system. (The array builds on their work in stretchable electronics.)

The camera system also has an integrated lens constructed by putting a thin, elastic membrane on a water chamber, with a clear glass window underneath.

Initially both detector and lens are flat. Beneath both the membranes of the detector and the simple lens are chambers filled with water. By extracting water from the detector's chamber, the detector surface becomes a concave hemisphere. (Injecting water back returns the detector to a flat surface.) Injecting water into the chamber of the lens makes the thin membrane become a convex hemisphere.

To achieve an in-focus and magnified image, the researchers actuate the hydraulics to change the curvatures of the lens and detector in a coordinated manner. The shape of the detector must match the varying curvature of the image surface to accommodate continuously adjustable zoom, and this is easily done with this new hemispherical eye camera.
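The link between pumped water and focus can be sketched with thin-lens optics: a liquid lens bulging into a plano-convex shape of radius R has focal length f = R / (n − 1), so reducing R (more water, tighter curvature) shortens the focal length. The radii and the refractive index of water (n ≈ 1.33) below are illustrative values, not parameters of the actual device.

```python
# Thin-lens sketch of hydraulic focus tuning: a plano-convex liquid lens
# of curvature radius R has focal length f = R / (n - 1).
# All numbers are hypothetical, chosen only to illustrate the trend.

def plano_convex_focal_length(radius_mm, n=1.33):
    """Focal length (mm) of a thin plano-convex lens of refractive index n."""
    return radius_mm / (n - 1.0)

# Injecting water tightens the membrane's curvature (smaller R), shortening f.
for R in (20.0, 10.0, 5.0):
    f = plano_convex_focal_length(R)
    print(f"R = {R:5.1f} mm  ->  f = {f:6.1f} mm")
```

The actual device also deforms the detector, so the full zoom behavior depends on both curvatures changing together, as the researchers describe.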

In addition to Huang and Rogers, other authors of the paper are Chaofeng Lu and Ming Li, from Northwestern; Inhwa Jung, Jianliang Xiao, Viktor Malyarchuk and Jongseung Yoon, from the University of Illinois; and Zhuangjian Liu, from the Institute of High Performance Computing, Singapore.


Source

Thursday, January 13, 2011

Fastest Movie in the World Recorded: Method to Film Nanostructures Developed

A "molecular movie" that shows how a molecule behaves at the crucial moment of a chemical reaction would help us better understand fundamental processes in the natural sciences. Such processes are often only a few femtoseconds long. A femtosecond is a millionth of a billionth of a second.

While it is possible to record a single femtosecond picture using an ultra-short flash of light, it has never been possible to take a sequence of pictures in such rapid succession. On a detector that captures the image, the pictures would overlap and "wash out." An attempt to swap or refresh the detector between two images would simply take too long, even if it could be done at the speed of light.

In spite of these difficulties, members of the joint research group "Functional Nanomaterials" of HZB and the Technische Universität Berlin have now managed to take ultrafast image sequences of objects mere micrometres in size using pulses from the X-ray laser FLASH in Hamburg, Germany. Furthermore, they chart a path for scaling their approach to nanometer resolution in the future. Together with colleagues from DESY and the University of Münster, they have published their results in the journal Nature Photonics.

The researchers came up with an elegant way to descramble the information superimposed by the two subsequent X-ray pulses. They encoded both images simultaneously in a single X-ray hologram. It takes several steps to obtain the final image sequence: First, the scientists split the X-ray laser beam into two separate beams. Using multiple mirrors, they force one beam to take a short detour, which causes the two pulses to reach the object under study at ever so slightly different times -- the two pulses arrive only 50 femtoseconds (0.00000000000005 seconds) apart. Due to a specific geometric arrangement of the sample, the pulses generate a "double hologram." This hologram encodes the structure of the object at the two times at which the X-ray pulses hit. Using a mathematical reconstruction procedure, the researchers can then simply associate the images with the respective X-ray pulses and thus determine the image sequence in correct temporal order.

Using their method, the scientists recorded two pictures of a micro-model of the Brandenburg Gate, separated by only 50 femtoseconds. "In this short time interval, even a ray of light travels no further than the width of a human hair," says PhD student Christian Günther, the first author of the publication. The short-wavelength X-rays used make it possible to resolve extremely small details, since the shorter the wavelength of light you use, the smaller the objects you can resolve.
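Günther's comparison is easy to verify: multiplying the speed of light by 50 femtoseconds gives a distance of about 15 micrometres, comfortably less than the width of a typical human hair (roughly 20 to 180 micrometres).

```python
# Sanity check: how far does light travel in 50 femtoseconds?

c = 299_792_458.0   # speed of light in vacuum, m/s
dt = 50e-15         # 50 femtoseconds, in seconds

distance_m = c * dt
print(f"distance travelled in 50 fs: {distance_m * 1e6:.1f} micrometres")
```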

"The long-term goal is to be able to follow the movements of molecules and nanostructures in real time," says project head Prof. Dr. Stefan Eisebitt. The extremely high temporal resolution in conjunction with the possibility to see the tiniest objects was the motivation to develop the new technique. A picture may be worth a thousand words, but a movie made up of several pictures can tell you about an object's dynamics.


Source