Friday, February 24, 2012 INFOTECH

REVOLUTIONARY EPIDEMIOLOGICAL INVENTION

Dhaka’s cholera outbreak can be predicted 11 months ahead!
Mercedes Pascual

A new University of Michigan (UM) computer model of disease transmission in space and time can predict cholera outbreaks in Bangladesh up to 11 months in advance, providing an early warning system that could help public health officials.

UM theoretical ecologists Mercedes Pascual and Aaron King, along with former UM postdoctoral researcher Robert Reiner and other colleagues, found evidence for a climate-sensitive urban core in Dhaka that acts to propagate cholera risk to the rest of the city. By including those findings in their model, the researchers were able to increase its accuracy and extend its forecasting ability far beyond previous disease models for the city. This was reported in a bulletin of Banglawire.com on January 24.
 
Model applies to Dhaka city
The new forecast model applies specifically to the capital city of Dhaka and incorporates data on both year-to-year climate variability and the spatial location of cholera cases at the district level. This allowed the researchers to study both local variation in disease transmission and response to climate factors within the megacity of 14 million people.
Earlier models had prediction lead times of a month or less — too short to be of use in an early warning system. The longer lead time of the new model will help inform decisions about treatment preparedness, vaccination and other disease-prevention strategies.
“What is new here is that we have analyzed the data in space and time by considering the cholera cases at the level of districts within the city,” said Pascual, the Rosemary Grant Collegiate Professor of Ecology and Evolutionary Biology. “Previous analyses here and in other places have aggregated the cases at the level of the whole city. This enables us to provide early warnings that are useful because they can help hospitals prepare for the effective treatment of large numbers of people.”
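
The article does not give the model's equations, so the following is only a minimal, purely illustrative Python sketch of the general idea it describes: a district-level transmission model in which one climate-sensitive core district propagates risk to the rest of the city. Every name, equation and number below is an invented placeholder, not the authors' actual model.

# Purely illustrative sketch (not the authors' actual model): a toy
# district-level transmission model in which only a "core" district
# responds to a climate index, and risk spills over to the other districts.
# Every name and number below is an invented placeholder.
import numpy as np

rng = np.random.default_rng(0)

n_districts = 10       # hypothetical number of districts in the city
core = 0               # index of the hypothetical climate-sensitive core
months = 120           # ten years of monthly time steps

pop = np.full(n_districts, 50_000.0)   # people per district (invented)
beta0 = 0.15                           # baseline transmission rate
climate_gain = 0.6                     # how strongly the core reacts to climate
coupling = 0.05                        # spillover of risk from the core
gamma = 0.5                            # monthly recovery rate

# A made-up climate index with seasonal and year-to-year variability.
climate = 0.8 * np.sin(2 * np.pi * np.arange(months) / 12) + rng.normal(0, 0.3, months)

S = pop * 0.9                  # susceptibles per district
I = pop * 0.001                # infectives per district
cases = np.zeros((months, n_districts))

for t in range(months):
    beta = np.full(n_districts, beta0)
    beta[core] *= np.exp(climate_gain * climate[t])          # climate forcing in the core only
    foi = beta * I / pop + coupling * I[core] / pop[core]    # local transmission plus core spillover
    new_cases = foi * S
    S = S - new_cases
    I = I + new_cases - gamma * I
    cases[t] = new_cases

print(cases[-1].round(1))      # simulated new cases per district in the final month

In a scheme like this, today's climate index and the state of the core district constrain the whole city's risk months ahead, which is the intuition behind the long forecast lead time; the published model is, of course, far more sophisticated.
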
The Department of Ecology and Evolutionary Biology, UM, embraces education and research on all aspects of biodiversity, including the history of life on earth, evolutionary mechanisms that generate diversity, the ecological context in which all life has evolved, and consequences of interactions among organisms, including humans. Faculty expertise ranges from the tropics to the tundra, from the theoretical to the practical.
Mercedes Pascual
Pascual received her Ph.D. in 1995 from the Joint Program of the Woods Hole Oceanographic Institution and the Massachusetts Institute of Technology. She was awarded a US Department of Energy Alexander Hollaender Distinguished Postdoctoral Fellowship for studies at Princeton and, more recently, a Centennial Fellowship in Global and Complex Systems from the James S. McDonnell Foundation. She is currently affiliated with the Center for the Study of Complex Systems at UM and with the Santa Fe Institute as an external faculty member.
Aaron King received his Ph.D. in Applied Mathematics in 1999. He has been Associate Professor of Ecology and Evolutionary Biology and of Mathematics at the University of Michigan, Ann Arbor, since 2008.
 
Evolution in Biology
“Nothing in biology makes sense except in the light of evolution.” — Theodosius Dobzhansky. Evolution has been called the cornerstone of biology, and for good reason. It is possible to do research in biology with little or no knowledge of evolution, and most biologists do. But without evolution, biology becomes a disparate set of fields. Evolutionary explanations pervade all fields of biology and bring them together under one theoretical umbrella.
We know from microevolutionary theory that natural selection should optimize the existing genetic variation in a population to maximize reproductive success. This provides a framework for interpreting a variety of biological traits and their relative importance. For example, a signal intended to attract a mate could also be intercepted by predators, so natural selection produces a trade-off between attracting mates and getting preyed upon. If you assume that something other than reproductive success is being optimized, many things in biology make little sense. Without the theory of evolution, life history strategies would be poorly understood.
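
As a purely illustrative aside (not part of the original essay), the trade-off argument can be made concrete with a toy calculation in Python: if a louder signal attracts more mates but also more predators, expected reproductive success peaks at an intermediate signal intensity. The functions and numbers below are invented.

# Toy illustration of a mating-signal/predation trade-off (invented numbers).
import numpy as np

signal = np.linspace(0, 5, 501)       # hypothetical signal intensity
mating = 1 - np.exp(-signal)          # louder signals attract more mates...
survival = np.exp(-0.4 * signal)      # ...but also attract more predators
fitness = mating * survival           # expected reproductive success

best = signal[np.argmax(fitness)]
print(f"fitness peaks at an intermediate signal intensity of about {best:.2f}")
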
 
Aaron King
Macroevolutionary supposition
Macroevolutionary theory also helps explain many things about how living things work. Organisms are modified over time by cumulative natural selection. The numerous examples of jury-rigged design in nature are a direct result of this. The distribution of genetically based traits across groups is explained by splitting of lineages and the continued production of new traits by mutation. The traits are restricted to the lineages they arise in.
Details of the past also hold explanatory power in biology. Plants obtain their carbon by joining carbon dioxide gas to an organic molecule within their cells, a process called carbon fixation. The enzyme that fixes carbon, RuBP carboxylase, works well in the absence of oxygen but wastes much of the carbon it fixes when oxygen is present. This is because photosynthesis evolved when there was little gaseous oxygen present. Later, when oxygen became more abundant, the efficiency of photosynthesis decreased. Photosynthetic organisms compensated by making more of the enzyme. RuBP carboxylase is the most abundant protein on the planet partially because it is one of the least efficient.
Ecosystems, species, organisms and their genes all have long histories. A complete explanation of any biological trait must have two components. First, a proximate explanation — how does it work? And second, an ultimate explanation — what was it modified from? For centuries humans have asked, “Why are we here?” The answer to that question lies outside the realm of science. Biologists, however, can provide an elegant answer to the question, “How did we get here?”
—Internet

 

END OF MOORE’S LAW

Quantum computing begins

 

A 2.8-pound Samsung ‘Sandy Bridge’ laptop, built on Intel’s chips, costs over $624.

The smallest transistor ever built — in fact, the smallest transistor that can be built — has been created using a single phosphorus atom by an international team of researchers at the University of New South Wales, Purdue University and the University of Melbourne. 

The term “Moore’s law” was coined around 1970 by Caltech professor, very-large-scale integration (VLSI) pioneer and entrepreneur Carver Mead, in reference to a statement by Gordon E. Moore. Gordon Earle Moore is the co-founder and Chairman Emeritus of Intel Corporation and the author of Moore’s Law.
Simulations of the atomic transistor to model its behavior were conducted at Purdue using nanoHUB technology, an online community resource site for researchers in computational nanotechnology.
Gerhard Klimeck, who directed the Purdue group that ran the simulations, says this is an important development because it shows how small electronic components can be engineered.
“To me, this is the physical limit of Moore’s Law,” Klimeck says. “We can’t make it smaller than this.”
Michelle Simmons, group leader and director of the ARC Centre for Quantum Computation and Communication at the University of New South Wales, says the development is less about improving current technology than building future tech.
“This is a beautiful demonstration of controlling matter at the atomic scale to make a real device,” Simmons says. “Fifty years ago when the first transistor was developed, no one could have predicted the role that computers would play in our society today. As we transition to atomic-scale devices, we are now entering a new paradigm. It is the promise of this future technology that makes this present development so exciting.”
 
One atom tall
The same research team announced in January that it had developed a wire of phosphorus and silicon — just one atom tall and four atoms wide — that behaved like copper wire.
Although definitions can vary, simply stated, Moore’s Law holds that the number of transistors that can be placed on a processor will double approximately every 18 months. The latest Intel chip, the “Sandy Bridge,” is built with a 32-nanometer manufacturing process and packs 2.3 billion transistors. A single phosphorus atom, by comparison, is just 0.1 nanometers across, so a technique built around single atoms could significantly shrink processors, although it may be many years before single-atom processors are actually manufactured.
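
As a rough aside (not from the article), the figures above can be combined with a little arithmetic in Python: the 18-month doubling rule projects transistor counts forward from Sandy Bridge’s 2.3 billion, and comparing the 32-nanometer process scale with a 0.1-nanometer atom shows how much room is left for shrinking features.

# Back-of-the-envelope arithmetic for the figures quoted above (illustrative only).
transistors_2011 = 2.3e9    # Sandy Bridge transistor count cited in the article
doubling_months = 18        # the common 18-month paraphrase of Moore's Law

def projected_transistors(years, start=transistors_2011):
    # Transistor count after `years` if the 18-month doubling rule kept holding.
    return start * 2 ** (years * 12 / doubling_months)

for years in (3, 6, 9):
    print(f"after {years} years: about {projected_transistors(years):.2e} transistors")

# Linear feature-size comparison quoted in the article:
print(f"32 nm process feature vs a 0.1 nm atom: a factor of {32 / 0.1:.0f} in linear size")
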
The single-atom transistor does have one serious limitation: it must be kept very cold, at least as cold as liquid nitrogen, or minus 321 degrees Fahrenheit (minus 196 Celsius).
“The atom sits in a well or channel, and for it to operate as a transistor the electrons must stay in that channel,” Klimeck says. “At higher temperatures, the electrons move more and go outside of the channel. For this atom to act like a metal you have to contain the electrons to the channel.
“If someone develops a technique to contain the electrons, this technique could be used to build a computer that would work at room temperature. But this is a fundamental question for this technology.”
Although single atoms serving as transistors have been observed before, this is the first time a single-atom transistor has been controllably engineered with atomic precision. The structure even has markers that allow researchers to attach contacts and apply a voltage, says Martin Fuechsle, a researcher at the University of New South Wales and lead author on the journal paper.
