Study Shows Metabolic Strategy Of Stressed Cell
Science Daily — Investigators at St. Jude Children's Research Hospital have mapped out many of the dynamic genetic and biochemical changes that make up a cell's response to a shortage of a molecule called Coenzyme A (CoA), a key player in metabolism. The results provide the most detailed look ever obtained of the complex metabolic changes in a cell triggered by a potentially fatal stress.
Metabolism is the sum of all biochemical reactions involved in maintaining the health of the cell, including breaking down and synthesizing various molecules to produce energy and build substances the cell needs to operate normally. CoA plays key roles in the cell's metabolism by participating in biochemical reactions in specific areas throughout the cell.
The St. Jude study is a significant contribution to the growing field of metabolomics--the study of the molecules involved in metabolism. Coupled with genetic studies of the cell, metabolomics is giving scientists a more detailed picture of how the body maintains its health in both normal environments and during times of stress, such as starvation or disease.
The researchers studied the response to decreased CoA in a mouse model by blocking CoA production with hopantenate (HoPan). HoPan is a chemical that interferes with pantothenate kinase (PanK), the enzyme that triggers the first step of CoA production. Following the shutdown of CoA production, the cells quickly recycled CoA from other jobs so they could concentrate their efforts on a single task: extracting life-supporting energy from nutrients in the mitochondria. Mitochondria are the powerhouses of the cell, so-called because these bags of enzymes host a series of complex biochemical pathways that produce the energy-rich molecule ATP--the cell's "currency" with which it "buys" chemical reactions that consume energy.
"The cell's response to reduced CoA levels is like the driver of a car that is low on gas," said Charles Rock, Ph.D., a member of the St. Jude Infectious Diseases department and co-author of the paper. "The driver might try to save what little gas is left by turning off the air conditioner and driving slower," he said. "Likewise, by shutting down or limiting the other biochemical pathways that use CoA, the cell can concentrate it in the mitochondria where it's needed most."
"The metabolic changes we observed freed up the CoA to make ATP," said Suzanne Jackowski, Ph.D., a member of the St. Jude Infectious Diseases department and the paper's senior author. "Our study provides the first detailed look at how the cell shifts genetic gears to respond to a significant change in its ability to carry on its daily metabolic chores."
The St. Jude study also showed that PanK controls the concentration of CoA in the cell depending on how much is needed and where it is needed. Previous studies at St. Jude showed that four different forms of PanK exist in different places in the cell and each one can be inhibited by rising levels of CoA. This allows the cell to increase or decrease CoA levels in specific locations, depending on the amount of CoA needed.
These findings give researchers a detailed look at how the cell responds to a significant reduction in the concentration of a critical molecule. The alterations in the activity of certain genes and enzymes also serve as a model for the milder disruption of CoA levels that may underlie a brain disorder called pantothenate-kinase-associated neurodegeneration (PKAN). PKAN is a hereditary disorder caused by mutations in PanK that may lead to a deficiency of CoA in brain mitochondria. Previously, this group of St. Jude researchers showed how specific mutations in one form of PanK disable this enzyme, which in turn would reduce CoA production and cause PKAN.
In the present study, the St. Jude team showed that low levels of CoA trigger the activation of genes that block other biochemical pathways that ordinarily use this molecule. Instead, the cell shifts most of the available CoA activity to producing glucose in the liver. Other organs then break down glucose into a molecule called pyruvate, which enters structures called mitochondria. In the mitochondria, CoA molecules perform another job: feeding pyruvate into a complex series of chemical reactions that produces molecules of ATP.
"Our results identify the re-arrangements that the cell's metabolism undergoes in order to ensure that the liver keeps CoA levels high enough to produce glucose and the cells of the body maintain enough free CoA for the mitochondria to keep producing ATP," said Yong-Mei Zhang, Ph.D., of the St. Jude Infectious Diseases department and first author of the report.
The investigators demonstrated many of the metabolic changes caused by a shortage of CoA by treating mice with HoPan. The resulting decrease in CoA triggered severe hypoglycemia--a low level of glucose in the blood. Prior to the hypoglycemia, the liver cells adjusted their metabolism in an effort to maintain the glucose output. This study identified several key steps, including a substantial increase in the amount of enzymes that free CoA from molecules called acyl groups, as well as increases in the amount of acylcarnitine, a molecule that grabs those acyl groups, ensuring that CoA remains free and available for energy production.
Other authors of this paper include Shigeru Chohnan (St. Jude), Kristopher G. Virga and Richard E. Lee (University of Tennessee Health Science Center, Memphis, Tenn.); Robert D. Stevens, Olga R. Ilkayeva, Brett R. Wenner, James R. Bain and Christopher B. Newgard (Duke University Medical Center, Durham, N.C.).
A report on this work appears in the March issue of "Chemistry & Biology."
Chimpanzee Facial Expressions Are Helping Researchers Understand Human Communication
Science Daily — Behavioral researchers led by Lisa Parr, PhD, director of the Yerkes National Primate Research Center Cognitive Testing Facility and Chimpanzee Core, have found that understanding chimpanzee facial expressions requires more attention to detail than researchers initially thought. Correctly interpreting the subtleties within chimpanzees' facial expressions may be key to understanding the evolution of human emotional communication.
Using the Chimpanzee Facial Action Coding System (Chimp FACS), the chimpanzees in the study observed anatomically correct 3D animations of chimpanzee facial expressions and then were asked to match the similar ones. (Credit: Yerkes National Primate Research Center)
According to Parr, "This discovery is an important step to help researchers recognize facial movements and understand why they are important. While some expressions, such as a playful look, can be identified using a single feature, other expressions, such as when a chimp bares his teeth, require looking at numerous characteristics within the face, including the eyes and lips." This is similar to what researchers see in human emotional expressions. "Sometimes it's easy to read what people are feeling, but at other times, we have to look at multiple places on their faces. Ultimately, we want to better understand what people are feeling and expressing emotionally because it helps us empathize with one another," Parr continued.
To facilitate her studies, Parr developed the Chimpanzee Facial Action Coding System (Chimp FACS) to directly compare documented expressions of humans and chimpanzees. Using Chimp FACS, the chimpanzees in the study observed anatomically correct 3D animations of chimpanzee facial expressions and then were asked to match the similar ones. "After the chimpanzees matched similar images, we separated individual features of the original animated expression, such as a raised brow, by frame and pieced the frames back together to create a variation of the original expression. The chimpanzees then were asked to match the new expression to the original one. This is how we determined when the chimpanzees were using a single feature or if they needed more than one feature to match the similar expressions," said Sheila Sterk, a senior animal behavior management specialist on Parr's team.
Parr will present this new data at the upcoming "Mind of the Chimpanzee" conference, an international multidisciplinary conference on chimpanzee cognition being held March 22-25 at the Lincoln Park Zoo in Chicago.
March 22, 2007—It endured a rocky ride—literally—but this ancient "sea monster" from Asia has found a place in the United States to call home.
The fossil remains of a crocodile-like reptile called Thalattosuchia have been discovered in rocks in the Blue Mountains of eastern Oregon—about 5,000 miles (8,050 kilometers) from where it most likely died, researchers announced on Monday. So far about 50 percent of the animal, including an upper leg bone and rib fragments, has been unearthed.
"This creature lived in Jurassic times, so it's 150 to 180 million years old," retired University of Oregon geologist William Orr said in a press release. Orr provided expert advice to the excavation team.
"It probably lived in an area from Japan to East Timor, somewhere in the western Pacific in a tropical estuarine environment."
The reptile, the oldest ever found in Oregon, is a rare discovery in North America. But similar fossils have been found throughout Southeast Asia, so experts believe that the remains were carried to the U.S. by plate tectonics. As the section of Earth's crust containing the fossils moved eastward, the Pacific plate collided with the North American plate, pushing the bones into the mountains.
The 6- to 8-foot-long (1.8- to 2.4-meter-long) creature, shown in an artist's conception, is part of a group that scientists think represents an evolutionary transition for this line of crocodilians. Features from related fossils suggest that the animals were evolving from being semiaquatic to entirely ocean dwelling.
The newfound fossils will go to the University of Iowa for further study before going on display at an Oregon museum.
Yellowstone Grizzlies Lose "Endangered" Status; Critics Growling
Richard A. Lovett
for National Geographic News
March 23, 2007
Citing the recovery of grizzly bear populations in and around Yellowstone National Park, the U.S. government has declared that the bears no longer need protection under the Endangered Species Act.
But conservationists see the move as premature and say it helps clear the way for logging, drilling, and other activities in forests surrounding the park.

"We think it's a big mistake," Carl Pope, executive director of the Sierra Club, told National Geographic News. "We have made a lot of progress in the Greater Yellowstone ecosystem, but the species isn't established in a sufficiently wide range to be delisted."
The decision to delist the grizzlies was announced yesterday by Deputy Interior Secretary Lynn Scarlett. Yellowstone-area bears had been listed as "threatened" since 1975, when population estimates ranged from 136 to 312. Today more than 500 bears roam the area, she said. "I believe all Americans should be proud that, as a nation, we had the will and the ability to protect and restore this symbol of the wild," Scarlett said in a press statement.
Bears Recovered?
Environmentalists are pleased by the comeback but not by the delisting. Yellowstone's grizzlies should be considered recovered only when they are no longer isolated from other grizzly populations, said Neil Darlow, a program manager for the Yellowstone to Yukon Conservation Initiative. The nonprofit group advocates the creation of a continuous chain of protected habitat stretching from Yellowstone to Canada's Yukon Territory. Isolated populations aren't viable in the long run, Darlow said, because they are more susceptible to inbreeding and local extinction from natural or human causes.

Israel denies reported Iran attack plan
TEL AVIV, Israel, Feb. 24 (UPI) -- A top Israeli official Saturday denied a published report that Israel has a plan to attack Iran's nuclear facilities. Deputy Defense Minister MK Ephraim Sneh said Israel is not laying the groundwork to take out Iran's nuclear facilities, despite a report saying so in the British Daily Telegraph, the Jerusalem Post reported.
The Telegraph said Israel has asked the United States for permission to fly over Iraq as part of an attack on Iran. The Telegraph quoted an Israeli source as saying, "We are planning for every eventuality, and sorting out issues such as these is crucially important. The only way to do this is to fly through U.S.-controlled air space. If we don't sort these issues out now we could have a situation where American and Israeli war planes start shooting at each other." The source was described by the Telegraph as a senior defense official.
Iran missed a United Nations deadline Wednesday to halt its production of nuclear fuel that the United States, Israel and other nations suspect is for nuclear weapons. Iran has said it will not stop enriching uranium and that its nuclear program is for producing energy.
Doctors Test Effort That Helps People Understand Health Risk Information
Science Daily — In a study published in the Feb. 20 issue of the Annals of Internal Medicine, researchers with Dartmouth Medical School and the Veterans Affairs Outcomes Group at the White River Junction (Vt.) VA Medical Center have tested whether a primer, which the researchers also wrote, helped people better understand information about health risks and interventions meant to reduce those risks.
In Presence of Fragrant Cleaning Products, Air Purifiers That Emit Ozone Can Dirty the Air
Air purifiers that emit ozone can actually make indoor air dirtier when used alongside fragrant household cleaning products, scientists at UC Irvine have discovered.
Ozone emitted by purifiers reacts in the air with unsaturated volatile organic compounds such as limonene – a chemical added to cleaning supplies that gives them a lemon fragrance – to create additional microscopic particles, the scientists found. Certain ionic purifiers emit ozone as a byproduct of the ionization used for charging airborne particles and electrostatically attracting them to metal electrodes. Ozonolysis purifiers emit ozone at higher levels on purpose, with the ostensible goal of oxidizing volatile organic compounds in the air. This research appeared online this morning in Environmental Science & Technology.
“The public needs to be aware that every air purification approach has its limitation, and ionization air purifiers are no exception,” said Sergey Nizkorodov, assistant professor of chemistry at UCI and co-author of the study. “These air purifiers can not only elevate the level of ozone, a formidable air pollutant in itself, but also increase the amount of harmful particulate matter in indoor air.”
High levels of airborne particles can aggravate asthma and cardiovascular problems and have been linked to higher death and lung cancer rates. Excess ozone can damage the lungs, causing chest pain, coughing, shortness of breath and throat irritation.
Nizkorodov and students Ahmad Alshawa and Ashley Russell conducted their experiment in a sparsely furnished office with a floor area of about 11 square meters. They placed an ozone-emitting air purifier in the middle of the room along with a large fan to better mix the air. At timed intervals, limonene vapor was injected into the room. Samples of the air were taken about one meter from the purifier and analyzed for ozone and particulate matter levels.
The researchers tested two types of air purifiers – a commercial ionic purifier that emits about 2 milligrams of ozone per hour, and an ozonolysis purifier that emits approximately 100 milligrams of ozone per hour. Continuous operation of the ionic purifier without limonene resulted in a slight reduction in the average particle concentration, while operation of the ozonolysis purifier had no detectable effect on the particle level. When limonene was added to the room, the particle concentration shot up in both cases, on some occasions up to 100 times the original level. Adding limonene to the room when a purifier was not operating produced little change in the overall particle level.
The scientists also developed a mathematical model that precisely matched their experimental observations. This model can be used to predict whether a given air purifier will make the air dirtier in a given indoor environment. Scientific data on indoor air purifiers will be important as officials begin the process of regulating air purifiers that emit ozone. In September 2006, California Gov. Arnold Schwarzenegger signed into law Assembly Bill 2276, requiring the California Air Resources Board to develop regulations that will set emission standards and procedures for certifying and labeling the devices.
“State regulators should set a strict limit on the amount of ozone produced by air purifiers to protect the public from exposure to unhealthy ozone and particulate matter levels,” Nizkorodov said.
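The article refers to a mathematical model of this chemistry but does not give it. As a rough illustration only, here is a minimal well-mixed "box model" sketch in Python; it is not the authors' published model, and every parameter value is an assumption chosen simply to show the qualitative behaviour the study describes (particle levels rise only when an ozone source and limonene are present together).

```python
# Minimal box-model sketch (assumed parameters, NOT the UCI study's model):
# ozone from a purifier reacts with limonene; part of the reacted mass shows
# up as particulate matter. Concentrations in micrograms per cubic meter.

V = 27.0            # m^3, room volume (assumed: ~11 m^2 floor, 2.5 m ceiling)
lam = 0.5           # 1/h, air exchange with outdoors (assumed)
dep = 4.0           # 1/h, ozone loss to room surfaces (assumed)
k = 0.002           # (ug/m^3)^-1 h^-1, effective O3 + limonene rate (assumed)
yield_pm = 0.2      # fraction of reacted mass forming particles (assumed)
S_o3 = 100_000.0 / V    # ug m^-3 h^-1: 100 mg/h ozonolysis purifier (article)
S_lim = 20_000.0 / V    # ug m^-3 h^-1: limonene release rate (assumed)

o3 = lim = pm = 0.0
dt = 0.01                                   # time step, hours
for _ in range(int(6 / dt)):                # simulate six hours
    rxn = k * o3 * lim                      # mass consumed by the reaction
    o3 += dt * (S_o3 - (lam + dep) * o3 - rxn)
    lim += dt * (S_lim - lam * lim - rxn)
    pm += dt * (yield_pm * rxn - lam * pm)

print(round(o3), round(lim), round(pm))
# Setting S_lim = 0.0 keeps pm at exactly zero: the extra particles appear
# only when ozone and limonene are present together, as the study reports.
```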
A new high-tech glove enables the translation of sign language into written text, facilitating communication for the hearing or speech impaired. The glove senses movements of the hand and fingers, and a computer turns those signals into letters and words. Future versions will also translate sign language to speech.
WHEATON, Md.--There are more than 28 million deaf and hard-of-hearing Americans, yet there are still communication barriers between the deaf and the hearing world. Now, a new technology is breaking sound barriers and lending a helping hand to the hearing impaired.
In a hearing world, many of us take everyday sounds for granted, but for the deaf, living in a silent world is hard. Corinne Vinopol, an educator and president of Institute for Disabilities Research and Training in Wheaton, says, "The biggest challenge that deaf people face on a day-to-day basis is communication."
Communication is a challenge that electrical engineers are now helping the hearing impaired overcome, with an electronic glove that turns American Sign Language gestures into text.
"What it does is detects the position of the fingers and the position of the hand so it translates positions of fingers into letters," says Josý Hernandez-Rebollar, an electrical engineer at George Washington University in Washington.
The glove, called the AcceleGlove, has sensors that send signals from movements of the hand and fingers to a computer. The computer finds the correct word or letter associated with the hand movement and displays the text on the screen. Vinopol says: "It is important to have technology because it is an equalizer. It allows deaf people to function at their maximum within society."
Researchers hope the high-tech glove will bridge the communication gap between the deaf and hearing -- a sure sign of the times.
The glove is expected to be available to the public within a year. Researchers are also developing the glove to translate sign language into speech.
Children Should Not Be Left Unsupervised With Dogs, Say Experts
Science Daily — Children should not be left unsupervised to play with a dog, say experts in this week's British Medical Journal. Their advice is part of a review aimed at doctors who deal with dog bites.
Dog bites and maulings are a worldwide problem, particularly in children, write Marina Morgan and John Palmer. Every year 250,000 people who have been bitten by dogs attend minor injuries and emergency units in the United Kingdom, and half of all children are reportedly bitten by dogs at some time, boys more than girls. Accurate death figures are difficult to obtain, but in the past five years, two to three cases a year have made news headlines.
Based on the latest medical evidence, they advise doctors how to examine and treat a patient presenting with a dog bite. They discuss the risk of infection and when to refer to specialist care. For travelers bitten abroad, they suggest assessing the risk of rabies.
In terms of prevention, they suggest that children should be taught to treat dogs with respect, avoid direct eye contact, and not tease them. They should be taught not to approach an unfamiliar dog; play with any dog unless under close supervision; run or scream in the presence of a dog; pet a dog without first letting it sniff you; or disturb a dog that is eating, sleeping, or caring for puppies.
Dog owners also need to change their behaviour, writes Rachel Besser, a children's doctor and lifetime dog owner, in an accompanying article. It is clear that not all dog owners appreciate that children should not be left unsupervised with a dog, she says. Just as some parents are obliged to take parenting classes, she would like to see equivalent mandatory classes for expectant dog owners to teach them about the responsibilities of dog ownership. Educational programmes are also needed to instill precautionary behaviour around dogs in children. Finally, she would like to see vets advising dog owners about bite prevention, and doctors promoting bite prevention when treating patients who have been bitten by dogs.
Older Adults May Be Unreliable Eyewitnesses, Study Shows
Science Daily — A University of Virginia study suggests that older adults are not only more inclined than younger adults to make errors in recollecting details that have been suggested to them, but are also more likely than younger people to have a very high level of confidence in their recollections, even when wrong.
"There are potentially significant practical implications to these results as confident but mistaken eyewitness testimony may be the largest cause of wrongful convictions in the United States," said Chad Dodson, the study's lead researcher and an assistant professor of psychology at the University of Virginia. (Credit: Photo by Dan Addison)
The Mysterious Case of Columbus's Silver Ore
Science Daily — Silver-bearing ore found at the settlement founded by Christopher Columbus's second expedition was not mined in the Americas, new research reveals.

Samples of galena, a silver-bearing lead ore, and worked pieces of lead recovered from the archaeological dig at La Isabela. (Credit: James Quine, Florida Museum of Natural History, University of Florida)
Sunday, February 25, 2007, 2:55 p.m.
CHICAGO, Feb. 23 (UPI) -- Circumcision significantly reduces the risk of acquiring HIV in young African men, a study from the University of Chicago found.
Researchers followed 2,784 young men from Kisumu, Kenya, circumcising half of them. Forty-seven of the 1,391 uncircumcised men contracted HIV, compared to 22 of the 1,393 circumcised men. "Our study shows that circumcised men had 53 percent fewer HIV infections than uncircumcised men," said Robert Bailey, an epidemiology professor. "We now have very concrete evidence that a relatively simple surgical procedure can have a very large impact on HIV." Bailey cautioned that circumcised men might engage in risky behavior, feeling that they are protected from HIV. "Circumcision is by no means a natural condom," said Bailey. "We do know that some circumcised men become infected with HIV. However, we did find that the circumcised men in our study did not increase their risk behaviors after circumcision. In fact, all men in the trial increased their condom use and reduced their number of sexual partners." The study appears in the Feb. 24 issue of The Lancet.
LONDON, Feb. 25 (UPI) -- The British government has given in to increased public pressure and bestowed a $491,000 grant toward maintaining a historic hut in Antarctica.
After government officials rejected previous attempts to save the famed hut used by explorer Robert Falcon Scott, they have now assigned funds to help preserve the Antarctic site, the Sunday Telegraph reported. The hut used by Scott on his race to the South Pole has also gained an ally in New Zealand Prime Minister Helen Clark. Clark reportedly urged British Prime Minister Tony Blair to revisit the funding measure by detailing the importance of preserving one's national heritage, the newspaper said. The modest wooden hut gained notoriety for its use by Scott and his party during their attempt to become only the second group to reach the South Pole; the party succeeded but died while returning to base.
Japan launches 4th spy satellite
KAGOSHIMA, Japan, Feb. 24 (UPI) -- Japan Saturday successfully launched its fourth intelligence-gathering satellite into Earth orbit aboard an H-2A rocket. After the rocket launched from the Tanegashima Space Center in Kagoshima Prefecture on Saturday, the radar-equipped spy satellite was released and successfully began orbiting, the Japan Broadcasting Corp. reported. With its camera and radar, the satellite will enable Japan to accurately recognize objects on the ground from an altitude of more than 310 miles. With four such satellites now in orbit, Japan can take photographs of anywhere in the world on a daily basis. The Japan Broadcasting Corp. said the increased surveillance efforts were prompted by a North Korean missile test over Japan in 1998.
New supercomputer to be unveiled
VANCOUVER, British Columbia, Feb. 12 (UPI) -- A Canadian firm is claiming to have taken a quantum leap in technology by producing a computer that can perform 64,000 calculations at once. D-Wave Systems, Inc., based near Vancouver, says it will unveil its new quantum supercomputer Tuesday, ABC News reports. Though most engineers thought quantum computers were decades away, D-Wave says the digital "bits" that race through the circuits of its computer are able to stand for 0 or 1 at the same time, allowing the machine to eventually do work that is far more complex than that of digital computers. "There are certain classes of problems that can't be solved with digital computers," says Herb Martin, D-Wave's chief executive officer. "Digital computers are good at running programs; quantum computers are good at handling massive sets of variables." Don't expect to see quantum computers in your local stores anytime soon. Martin says the prototype is as big as a good-size freezer and a lot colder.
Why Do Humans And Primates Get More Stress-related Diseases Than Other Animals?
Science Daily — Why do humans and their primate cousins get more stress-related diseases than any other member of the animal kingdom? The answer, says Stanford University neuroscientist Robert Sapolsky, is that people, apes and monkeys are highly intelligent, social creatures with far too much spare time on their hands.
"Primates are super smart and organized just enough to devote their free time to being miserable to each other and stressing each other out," he said. "But if you get chronically, psychosocially stressed, you're going to compromise your health. So, essentially, we've evolved to be smart enough to make ourselves sick."
A professor of biological sciences and of neurology and neurological sciences, Sapolsky has spent more than three decades studying the physiological effects of stress on health. His pioneering work includes ongoing studies of laboratory rats and wild baboons in the African wilderness. He discussed the biological and sociological implications of stress Feb. 17 in a lecture titled "Stress, Health and Coping" at the annual meeting of the American Association for the Advancement of Science in San Francisco. Stress response All vertebrates respond to stressful situations by releasing hormones, such as adrenalin and glucocorticoids, which instantaneously increase the animal's heart rate and energy level. "The stress response is incredibly ancient evolutionarily," Sapolsky said. "Fish, birds and reptiles secrete the same stress hormones we do, yet their metabolism doesn't get messed up the way it does in people and other primates." To understand why, he said, "just look at the dichotomy between what your body does during real stress--for example, something is intent on eating you and you're running for your life--versus what your body does when you're turning on the same stress response for months on end for purely psychosocial reasons." In the short term, he explained, stress hormones are "brilliantly adapted" to help you survive an unexpected threat. "You mobilize energy in your thigh muscles, you increase your blood pressure and you turn off everything that's not essential to surviving, such as digestion, growth and reproduction," he said. "You think more clearly, and certain aspects of learning and memory are enhanced. All of that is spectacularly adapted if you're dealing with an acute physical stressor--a real one." But non-life-threatening stressors, such as constantly worrying about money or pleasing your boss, also trigger the release of adrenalin and other stress hormones, which, over time, can have devastating consequences to your health, he said: "If you turn on the stress response chronically for purely psychological reasons, you increase your risk of adult onset diabetes and high blood pressure. If you're chronically shutting down the digestive system, there's a bunch of gastrointestinal disorders you're more at risk for as well." In children, the continual release of glucocorticoids can suppress the secretion of normal growth hormones. "There's actually a syndrome called stress dwarfism in kids who are so psychologically stressed that growth is markedly impaired," Sapolsky said. Studies show that long-term stress also suppresses the immune system, making you more susceptible to infectious diseases, and can even shut down reproduction by causing erectile dysfunction and disrupting menstrual cycles. "Furthermore, if you're chronically stressed, all sorts of aspects of brain function are impaired, including, at an extreme, making it harder for some neurons to survive neurological insults," Sapolsky added. "Also, neurons in the parts of the brain relating to learning, memory and judgment don't function as well under stress. That particular piece is what my lab has spent the last 20 years on." The bottom line, according to Sapolsky: "If you plan to get stressed like a normal mammal, you had better turn on the stress response or else you're dead. But if you get chronically, psychosocially stressed, like a Westernized human, then you are more at risk for heart disease and some of the other leading causes of death in Westernized life." 
Baboon studies In addition to numerous scientific papers about stress, Sapolsky has written four popular books on the subject--Why Zebras Don't Get Ulcers, The Trouble with Testosterone, A Primate's Memoir and Monkeyluv. Many of his insights are based on his 30-year field study of wild African baboons, highly social primates that are close relatives of Homo sapiens. Each year, he and his assistants follow troops of baboons in Kenya to gather behavioral and physiological data on individual members, including blood samples, tissue biopsies and electrocardiograms. "We've found that baboons have diseases that other social mammals generally don't have," Sapolsky said. "If you're a gazelle, you don't have a very complex emotional life, despite being a social species. But primates are just smart enough that they can think their bodies into working differently. It's not until you get to primates that you get things that look like depression." The same may be true for elephants, whales and other highly intelligent mammals that have complex emotional lives, he added. "The reason baboons are such good models is, like us, they don't have real stressors," he said. "If you live in a baboon troop in the Serengeti, you only have to work three hours a day for your calories, and predators don't mess with you much. What that means is you've got nine hours of free time every day to devote to generating psychological stress toward other animals in your troop. So the baboon is a wonderful model for living well enough and long enough to pay the price for all the social-stressor nonsense that they create for each other. They're just like us: They're not getting done in by predators and famines, they're getting done in by each other." It turns out that unhealthy baboons, like unhealthy people, often have elevated resting levels of stress hormones. "Their reproductive system doesn't work as well, their wounds heal more slowly, they have elevated blood pressure and the anti-anxiety chemicals in their brain, which have a structural similarity to Valium, work differently," Sapolsky said. "So they're not in great shape." Among the most susceptible to stress are low-ranking baboons and type A individuals. "Type A baboons are the ones who see stressors that other animals don't," Sapolsky said. "For example, having your worst rival taking a nap 100 yards away gets you agitated." But when it comes to stress-related diseases, social isolation may play an even more significant role than social rank or personality. "Up until 15 years ago, the most striking thing we found was that, if you're a baboon, you don't want to be low ranking, because your health is going to be lousy," he explained. "But what has become far clearer, and probably took a decade's worth of data, is the recognition that protection from stress-related disease is most powerfully grounded in social connectedness, and that's far more important than rank." Coping with stress What can baboons teach humans about coping with all the stress-inducing psychosocial nonsense we encounter in our daily lives? "Ideally, we have a lot more behavioral flexibility than the baboon," Sapolsky said, adding that, unlike baboons, humans can overcome their low social status and isolation by belonging to multiple hierarchies. "We are capable of social supports that no other primate can even dream of," he said. "For example, I might say, 'This job, where I'm a lowly mailroom clerk, really doesn't matter. 
What really matters is that I'm the captain of my softball team or deacon of my church'--that sort of thing. It's not just somebody sitting here, grooming you with their own hands. We can actually feel comfort from the discovery that somebody on the other side of the planet is going through the same experience we are and feel, I'm not alone. We can even take comfort reading about a fictional character, and there's no primate out there that can feel better in life just by listening to Beethoven. So the range of supports that we're capable of is extraordinary." But many of the qualities that make us human also can induce stress, he noted. "We can be pained or empathetic about somebody in Darfur," he said. "We can be pained by some movie character that something terrible happens to that doesn't even exist. We could be made to feel inadequate by seeing Bill Gates on the news at night, and we've never even been in the same village as him or seen our goats next to his. So the realm of space and time that we can extend our emotions means that there are a whole lot more abstract things that can make us feel stressed." Pursuit of happiness The Founding Fathers probably weren't thinking about health when they declared the pursuit of happiness to be an inalienable right, but when it comes to understanding the importance of a stress-free life, they may have been ahead of their time. "When you get to Westernized humans, it's only in the last century or two that our health problems have become ones of chronic lifestyle issues," Sapolsky said. "It's only 10,000 years or so that most humans have been living in high-density settlements--a world of strangers jostling and psychologically stressing each other. But being able to live long enough to get heart disease, that's a fairly new world." According to Sapolsky, happiness and self-esteem are important factors in reducing stress. Yet the definition of "happiness" has less to do with material comfort than Westerners might assume, he noted: "An extraordinary finding that's been replicated over and over is that once you get past the 25 percent or so poorest countries on Earth, where the only question is survival and subsistence, there is no relationship between gross national product, per capita income, any of those things, and levels of happiness." Surveys show that in Greece, for example, one of Western Europe's poorest countries, people are much happier than in the United States, the world's richest nation. And while Greece is ranked number 30 in life expectancy, the United States--with the biggest per capita expenditure on medical care--is only slighter higher, coming in at 29. "The United States has the biggest discrepancy in health and longevity between our wealthiest and our poorest of any country on Earth," Sapolsky noted. "We're also ranked way up in stress-related diseases." Japan is number one in life expectancy, largely because of its extremely supportive social network, according to Sapolsky. He cited similar findings in the United States. "Two of the healthiest states are Vermont and Utah, while two of the unhealthiest are Nevada and New Hampshire," he noted. "Vermont is a much more left-leaning state in terms of its social support systems, while its neighbor New Hampshire prides itself on no income tax and go it alone. In Utah, the Mormon church provides extended social support, explanations for why things are and structure. You can't ask for more than that. And next door is Nevada, where people are keeling over dead from all of their excesses. 
It's very interesting." Typically, observant Mormons and other religious people are less likely to smoke and drink, he noted. "But once you control for that, religiosity in and of itself is good for your health in some ways, although less than some of its advocates would have you believe," Sapolsky said. "It infuriates me, because I'm an atheist, so it makes me absolutely crazy, but it makes perfect sense. If you have come up with a system that not only tells you why things are but is capped off with certain knowledge that some thing or things respond preferentially to you, you're filling a whole lot of pieces there--gaining some predictability, attribution, social support and control over the scariest realms of our lives."
2,000 Influenza Virus Genomes Now Completed And Publicly Accessible
Science Daily — The Influenza Genome Sequencing Project, funded by the National Institute of Allergy and Infectious Diseases (NIAID), one of the National Institutes of Health (NIH), announced today that it has achieved a major milestone. The entire genetic blueprints of more than 2,000 human and avian influenza viruses taken from samples around the world have been completed and the sequence data made available in a public database. "This information will help scientists understand how influenza viruses evolve and spread," says NIH Director Elias A. Zerhouni, M.D., "and it will aid in the development of new flu vaccines, therapies and diagnostics." "Scientists around the world can use the sequence data to compare different strains of the virus, identify the genetic factors that determine their virulence, and look for new therapeutic, vaccine and diagnostic targets," says NIAID Director Anthony S. Fauci, M.D. The Influenza Genome Sequencing Project, initiated in 2004, has been carried out at the NIAID-funded Microbial Sequencing Center managed by The Institute for Genomic Research (TIGR) of Rockville, Maryland. The project is currently directed by David Spiro, Ph.D., and Claire Fraser, Ph.D., at TIGR and Elodie Ghedin, Ph.D., at the University of Pittsburgh School of Medicine. Recently, growing sequencing capacity has enabled the production rate to increase to more than 200 viral genomes per month. Having eclipsed today's milestone of 2,000 genomes, the microbial sequencing center will continue to rapidly sequence more influenza strains and isolates and will make all the sequence data freely available to the scientific community and the public through GenBank, an Internet-accessible database of genetic sequences maintained by the National Center for Biotechnology Information (NCBI) at NIH's National Library of Medicine, another major contributor to the project. Seasonal influenza is a major public health concern in the United States, accounting for approximately 36,000 deaths and 200,000 hospitalizations each year. Globally, influenza results in an estimated 250,000 to half a million deaths annually. Seasonal flu shots are updated every year to target the latest strains in circulation. Developing such vaccines is challenging, however, because the influenza virus is prone to high mutation rates when it replicates, and these mutations can alter the virus enough that vaccines against one strain may not protect against another strain.
The Answer To Childhood Obesity: 15 Minutes Of Football?
Science Daily — Everyone knows children are getting fatter and that both a poor diet and a lack of exercise are to blame. But what researchers have been unable to discover until now is exactly how major a role activity plays in the battle to keep obesity at bay.
The Actigraph monitor is the size of a matchbox and is worn around the waist. The device counts up-and-down movement as children walk, run or engage in other physical activity. As well as measuring overall levels of physical activity, it also allows researchers to measure activity intensity -- by giving a figure for the number of movements it records over the course of a minute. (Credit: ALSPAC)
Now a new report published in the journal PLoS Medicine, offers new hope for parents concerned about the growing obesity epidemic. It suggests that making even small increases to your daily exercise routine, such as walking your child to school each day instead of taking the car, could have dramatic long-term results.
Using the latest cutting-edge techniques, researchers discovered that doing 15 minutes a day of moderate exercise lowered a child's chances of being obese by almost 50 per cent. As long as the activity was at least of the level of a brisk walk -- enough to make your child a little out of breath -- it seemed to be of benefit.
What makes the results particularly startling is both the large number of UK children studied and the use of high-tech equipment, providing the most accurate measures of both fat and activity levels ever achieved for a study of this type.
Researchers monitored 5,500 12-year-olds from the Children of the 90s research project (also known as ALSPAC, the Avon Longitudinal Study of Parents and Children) based at the University of Bristol, measuring their activity levels for 10 hours a day.
Each child wore a special 'Actigraph activity monitor', which sits on a belt around the waist and records every move they made. Most wore the movement-sensitive monitor for a week but all used the Actigraph for at least three days.
They also had their body fat measured using an X-ray emission scanner, which differentiates between muscle and fat deposits in the body. This is far more precise than the usual BMI (Body Mass Index) system often used to estimate fat levels.
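To make the two measurements described above concrete, here is a small illustrative Python sketch. The counts-per-minute cut-point for "moderate" activity is an assumption for illustration (the study's own thresholds are not given in this article), shown alongside the BMI formula the article contrasts with X-ray scanning.

```python
# Hedged sketch: turning raw per-minute Actigraph counts into minutes of
# moderate-or-vigorous activity, plus BMI for comparison. The 3600 counts/min
# cut-point is an assumed value, not one taken from the ALSPAC study.

def moderate_minutes(counts_per_min, cut_point=3600):
    """Count minutes at or above an assumed 'brisk walk' intensity."""
    return sum(1 for c in counts_per_min if c >= cut_point)

def bmi(weight_kg, height_m):
    """Body Mass Index: weight in kilograms divided by height in meters squared."""
    return weight_kg / height_m ** 2

day = [120, 4100, 3900, 50, 0, 5200]    # toy per-minute activity counts
print(moderate_minutes(day))             # -> 3 moderate-to-vigorous minutes
print(round(bmi(45.0, 1.55), 1))         # -> 18.7
```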
Heading up the research is Professor Chris Riddoch from Bath University together with Children of the 90s' co-director Professor Andy Ness and his team at Bristol.
Professor Riddoch explained the significance of their results: "This study provides some of the first robust evidence on the link between physical activity and obesity in children.
"We know that diet is important, but what this research tells us is that we mustn't forget about activity. It's been really surprising to us how even small amounts of exercise appear to have dramatic results."
Professor Ness added: "The association between physical activity and obesity we observed was strong. These associations suggest that modest increases in physical activity could lead to important reductions in childhood obesity."
He also stressed that doing 15 minutes of moderate exercise a day should be regarded as a starting point, but one that most people would be able to fit into their lifestyle.
The team will now be taking their research further - looking to see if specific patterns of exercise can help achieve even better results.
ALSPAC (The Avon Longitudinal Study of Parents and Children, also known as Children of the 90s) is a unique ongoing research project based in the University of Bristol. It enrolled 14,000 mothers during pregnancy in 1991-2 and has followed most of the children and parents in minute detail ever since.
This individual physical activity research element of the study was funded directly by the U.S. National Institutes of Health. The research team consisted of collaborators from the University of Bristol, the University of Bath, the MRC Epidemiology Unit Cambridge, UCL, the University of Glasgow and the University of South Carolina.
The ALSPAC study was undertaken with financial support of the Medical Research Council, the Wellcome Trust, and the University of Bristol among many others.
An even greater concern is the potential for an influenza pandemic caused by the emergence of a new, highly lethal virus strain that is easily transmitted from person to person. Influenza pandemics have occurred three times in the last century, the most lethal of which was the pandemic of 1918, which caused an estimated 40 to 50 million deaths worldwide. "A few years ago, only limited genetic information on influenza viruses existed in the public domain, and much of the sequence data was incomplete," says Maria Y. Giovanni, Ph.D., who oversees the NIAID Microbial Sequencing Centers. "The Influenza Genome Sequencing Project has filled that gap by vastly increasing the amount of influenza sequence data and rapidly making it available to the entire scientific community. Subsequently, there has been a marked increase in the number of scientists worldwide depositing influenza genome sequence data into the public domain including scientists at St. Jude Children's Research Hospital and the Centers for Disease Control and Prevention." Along with NIAID, TIGR and NCBI, other collaborators on the project include the Wadsworth Center of the New York State Department of Health in Albany, NY; the Centers for Disease Control and Prevention in Atlanta; St. Jude Children's Research Hospital in Memphis, TN; the World Organization for Animal Health / Food and Agriculture Organization of the United Nations (OIE/FAO) Reference Laboratory for Newcastle Disease and Avian Influenza in Padova, Italy; The Ohio State University in Columbus, OH; Children's Hospital Boston; Baylor College of Medicine in Houston; and Canterbury Health Laboratories in Christchurch, New Zealand.
To help analyze and interpret the large quantity of sequence data generated by the Project, NIAID has funded the Bio Health Base Bioinformatics Resource Center, which is being developed by researchers at the University of Texas Southwestern Medical Center at Dallas and developers at Northrop Grumman Information Technology's Life Sciences division in Rockville, Maryland. This Center provides the scientific community with bioinformatics and software tools and a robust point-of-entry for accessing influenza genomic and related data in a user-friendly format. Bio Health Base has recently established a collaboration with the Influenza Sequence Database at Los Alamos National Laboratory to provide influenza researchers with computational data management and analysis resources to assist in interpreting the genetic data. Data from the Influenza Genome Sequencing Project, as well as all other publicly available influenza sequence data, are also available through NCBI's Influenza Virus Resource, which includes a host of analysis tools, such as sequence alignment and building "trees" that show evolutionary relationships.
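As a small illustration of how the publicly deposited data described above can be retrieved from GenBank, here is a hedged Python sketch using Biopython's Entrez interface. It is not part of the project's own tooling, and the accession number is a placeholder assumption.

```python
# Hedged sketch: fetch one influenza sequence record from GenBank via NCBI
# E-utilities using Biopython. Replace the placeholder accession with a real
# influenza segment accession of interest.

from Bio import Entrez, SeqIO

Entrez.email = "you@example.org"   # NCBI asks for a contact address
ACCESSION = "CY000001"             # placeholder accession (assumption)

handle = Entrez.efetch(db="nucleotide", id=ACCESSION,
                       rettype="gb", retmode="text")
record = SeqIO.read(handle, "genbank")
handle.close()

print(record.id, record.description)
print(len(record.seq), "bases")
```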
Human Factors Analysis Reveals Ways To Reduce Friendly Fire Incidents
Science Daily — One of the most tragic consequences of war is "friendly fire": casualties that result when warfighters mistakenly fire on their own or allied troops. The causes and possible mitigators of friendly fire are being studied by a group of human factors/ergonomics researchers at the University of Central Florida and at the Air Force Research Laboratory. Their findings, to be published in the April issue of Human Factors: The Journal of the Human Factors and Ergonomics Society, include a taxonomy by which troop teamwork could be strengthened to reduce confusion and error.
Although precise figures are hard to pin down, it has been estimated that 17% of deaths during Operation Desert Storm were attributable to friendly fire. "Whatever the absolute numbers," the researchers say, "friendly fire incidents occur and will continue to occur." But they believe that human factors research and analysis can provide information and methods that can reduce the likelihood of such incidents.
They began their investigation by examining the literature on friendly fire, then used a human-centered approach to understand errors and create the taxonomy. It contains a list of questions that draw attention to ways in which teamwork breakdown could occur. They concluded that in the absence of adequate shared cognition -- the sharing of information about the dynamic, ambiguous, and time-stressed battlefield environment -- warfighters can have problems interpreting cues, making decisions, and taking correct action. "The battlefield is one of the most difficult operational environments within which to perform cognitive tasks," the researchers state. "Therefore, breakdowns in shared cognition are inevitable."
Although technological solutions, such as combat identification systems (e.g., Blue Force Tracker), have been implemented to enhance shared cognition and prevent friendly fire incidents, other factors, such as sleep deprivation and visual misidentification, can still lead to human error. In addition, these technologies can fail or simply be unavailable. What's needed, the researchers say, is a better understanding of specific failures of teamwork, including information transmission, team behavior, and team attitude. "Human solutions cannot be ignored, such as better teamwork...and training to improve communication, coordination, and cooperation."
The researchers believe that this taxonomy can help in the analysis of trends in friendly fire incidents or near misses, leading to recommendations and solutions to improve teamwork in order to minimize risk.
The Human Factors and Ergonomics Society, which celebrates its 50th anniversary in September 2007, is a multidisciplinary professional association of more than 4,700 persons in the United States and throughout the world. Its members include psychologists and other scientists, designers, and engineers, all of whom have a common interest in designing systems and equipment to be safe and effective for the people who operate and maintain them.
Himalayan Glacier Melting Observed From Space
Science Daily — The Himalaya, the "Roof of the World" and the source of the seven largest rivers of Asia, is, like other mountain chains, suffering the effects of global warming. To assess the extent of melting of its 33,000 km2 of glaciers, scientists have been using a process they have been pioneering for some years.

DGPS positioning of an ablation marker on the tongue of Chhota Shigri Glacier at 4400 m. (Credit: Copyright IRD/Yves Arnaud)
Satellite-imagery derived glacier surface topographies obtained at intervals of a few years were adjusted and compared. Calculations indicated that 915 km2 of Himalayan glaciers of the test region, Spiti/Lahaul (Himachal Pradesh, India) thinned by an annual average of 0.85 m between 1999 and 2004. The technique is still experimental, but it has been validated in the Alps and could prove highly effective for watching over all the Himalayan glacier systems. However, the procedure for achieving a reliable estimate must overcome a number of sources of error and approximation inherent in satellite-based observations.
The researchers started by retrieving satellite data for two periods, 2000 and 2004. A digital field model was extracted for each of them, representing the topography of the ground in digital form and therefore usable in computerized processing. The earliest topography of the area studied was provided by NASA, which observed 80% of the Earth's surface during the Shuttle Radar Topography Mission of February 2000. Then, in November 2004, two 2.5 m resolution images of the same area, taken at two different angles, were acquired especially by the French satellite Spot5 in the framework of an ISIS (CNES) project.
Comparison of these two images helped build a field model, a Digital Elevation Model (DEM), by stereoscopic photogrammetric techniques. Comparing this DEM with the earlier topography reveals that the NASA radar data underestimate elevations at high altitudes and overestimate them at lower altitudes. The SPOT satellite also introduces an uncertainty of +/- 25 m in the horizontal positioning of images.
Moreover, as the authorities of the major Himalayan countries (India, Pakistan, China) do not permit public access to detailed topographic maps or aerial photographs of these sensitive cross-border regions, no reference is available for satellite observation error assessment and correction. It is therefore by comparing the SRTM and SPOT5 topographies over stable non-glaciated areas around the glaciers that researchers have been able to adjust for the deviations and superimpose the two digital field models. These comparisons provided the basis for a map of glacier elevation (and hence thickness) variations over 100 m altitude intervals for the period 2000-2004, as sketched below.
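A minimal numpy sketch of those two steps (removing the bias seen over stable terrain, then averaging elevation change by 100 m altitude band) might look like the following. The array names, the toy data, and the single-offset bias model are simplifying assumptions, not the authors' exact adjustment procedure.

```python
# Hedged sketch of DEM differencing for glacier elevation change.
import numpy as np

def elevation_change_by_band(dem_2000, dem_2004, glacier_mask, band=100.0):
    """Return {altitude_band_floor: mean elevation change 2000->2004, in m}."""
    diff = dem_2004 - dem_2000                   # raw elevation change
    stable = ~glacier_mask                       # ice-free, assumed-stable terrain
    diff = diff - np.nanmean(diff[stable])       # remove the bias seen off-glacier
    bands = np.floor(dem_2000 / band) * band     # 100 m altitude intervals
    out = {}
    for b in np.unique(bands[glacier_mask]):
        sel = glacier_mask & (bands == b)
        out[float(b)] = float(np.nanmean(diff[sel]))
    return out

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    z0 = rng.uniform(3500, 5500, size=(100, 100))   # toy "2000" DEM
    mask = z0 < 4500                                 # toy glacier mask
    z1 = z0 - np.where(mask, 2.0, 0.0)               # pretend 2 m of thinning
    print(elevation_change_by_band(z0, z1, mask))    # ~ -2.0 m in every band
```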
The results show a clear regression of the large glaciers whose terminal tongues reach the lowest levels (about 4000 m), with a thinning of 8 to 10 m below 4400 m. The loss is 4 to 7 m between 4400 and 5000 m, falling to 2 m above 5000 m. The satellite image evaluation yields an average mass balance of -0.7 to -0.85 m/a water equivalent for the 915 km2 of glaciers surveyed, a total mass loss of 3.9 km3 of water in 5 years. In order to check these results and validate the procedure, the satellite-derived results were compared with the mass balance for the small glacier Chhota Shigri (15 km2) determined from field measurements and surveys performed between 2002 and 2004 by the Great Ice research unit and its Indian partners. The mass balance determined from these field data and that calculated from satellite data agree. For both evaluation methods, Chhota Shigri glacier appears to have lost an average of a little over 1 m of ice per year.
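For readers who want to see how the headline figures fit together, a quick back-of-the-envelope check (illustration only, using the upper end of the quoted mass-balance range):

```python
# Consistency check of the quoted numbers: area x water-equivalent thinning x years.
area_km2 = 915.0
mass_balance_m_we_per_yr = -0.85        # upper end of the -0.7 to -0.85 m/a range
years = 5                               # roughly 1999/2000 to 2004
water_loss_km3 = area_km2 * abs(mass_balance_m_we_per_yr) / 1000.0 * years
print(round(water_loss_km3, 1))         # -> 3.9 km^3, matching the article
```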
These results are in line with global estimates of glacier change made for the period between 2001 and 2004. The approach is therefore being extended to other areas of the Himalaya in order to gain more information on the still poorly known changes taking place in the region's glaciers, which are a water resource on which tens of millions of people depend.
Reference: Berthier Etienne, Arnaud Yves, Kumar Rajesh, Ahmad Sarfaraz, Wagnon Patrick, Chevallier Pierre. Remote sensing estimates of glacier mass balances in the Himachal Pradesh (Western Himalaya, India). Remote Sensing of Environment. DOI: 10.1016/j.rse.2006.11.017
New Engine Helps Satellites Blast Off With Less Fuel
Science Daily — Georgia Tech researchers have developed a new prototype engine that allows satellites to take off with less fuel, opening the door for deep space missions, lower launch costs and more payload in orbit.
Dr. Mitchell Walker, an assistant professor in the Daniel Guggenheim School of Aerospace Engineering, tests an engine. (Credit: Image courtesy of Georgia Institute of Technology)
The efficient satellite engine uses up to 40 percent less fuel by running on solar power while in space and by fine-tuning exhaust velocity. Satellites using the Georgia Tech engine to blast off can carry more payload thanks to the mass freed up by the smaller amount of fuel needed for the trip into orbit. Or, if engineers wanted to use the reduced fuel load another way, the satellite could be launched more cheaply by using a smaller launch vehicle. The fuel-efficiency improvements could also give satellites expanded capabilities, such as more maneuverability once in orbit or the ability to serve as a refueling or towing vehicle.
The Georgia Tech project, led by Dr. Mitchell Walker, an assistant professor in the Daniel Guggenheim School of Aerospace Engineering, was funded by a grant from the U.S. Air Force. The project team made significant experimental modifications to one of five donated satellite engines from aircraft engine manufacturer Pratt & Whitney to create the final prototype.
The key to the engine improvements, said Walker, is the ability to optimize the use of available power, very similar to the transmission in a car. A traditional chemical rocket engine (attached to a satellite ready for launch) runs at maximum exhaust velocity until it reaches orbit, i.e. first gear. The new Georgia Tech engine allows ground control units to adjust the engine’s operating gear based on the immediate propulsive need of the satellite. The engine operates in first gear to maximize acceleration during orbit transfers and then shifts to fifth gear once in the desired orbit. This allows the engine to burn at full capacity only during key moments and conserve fuel. “You can really tailor the exhaust velocity to what you need from the ground,” Walker said.
The Georgia Tech engine operates with an efficient ion propulsion system. Xenon (a noble gas) atoms are injected into the discharge chamber. The atoms are ionized (electrons are stripped from their outer shell), which forms xenon ions. The light electrons are constrained by the magnetic field while the heavy ions are accelerated out into space by an electric field, propelling the satellite to high speeds. Tech’s significant improvement to existing xenon propulsion systems is a new electric and magnetic field design that helps better control the exhaust particles, Walker said. Ground control units can then exercise this control remotely to conserve fuel.
The satellite engine is almost ready for military applications, but may be several years away from commercial use, Walker added.
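The fuel saving from a higher, tunable exhaust velocity follows from the standard rocket equation. The short Python illustration below uses assumed numbers (dry mass, delta-v budget, exhaust speeds) and shows the general principle; it is not Georgia Tech's own analysis of this engine.

```python
# Hedged illustration of the Tsiolkovsky rocket equation:
# propellant mass = dry mass * (exp(delta_v / v_exhaust) - 1).
from math import exp

def propellant_mass(m_dry_kg, delta_v_ms, v_exhaust_ms):
    return m_dry_kg * (exp(delta_v_ms / v_exhaust_ms) - 1.0)

m_dry = 1000.0            # kg, satellite without propellant (assumed)
dv = 4000.0               # m/s, orbit-raising budget (assumed)

chemical = propellant_mass(m_dry, dv, 3000.0)    # typical chemical exhaust speed
ion      = propellant_mass(m_dry, dv, 20000.0)   # typical xenon ion exhaust speed

print(round(chemical), "kg vs", round(ion), "kg of propellant")
# -> roughly 2794 kg vs 221 kg: faster exhaust means far less fuel carried
```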
Future Car Receives Fuel Cell
Science Daily — LUBBOCK -- Texas Tech University's Future Car Research is receiving an energy boost from Energy Partners, Inc. of West Palm Beach, Fla. The company is donating a hydrogen-powered fuel cell that Texas Tech will install in a Chevrolet Lumina when the cell arrives the first week of December.
The fuel cell is designed to replace the car's internal combustion system with hopes of producing fewer objectionable emissions that pollute the environment. Texas Tech also hopes to design a super fuel-efficient vehicle capable of doubling existing fuel economy. Research in this Future Car may someday find itself in production automobiles. Funded by the Department of Energy, Texas Tech and Virginia Tech University are the only two universities in the United States to create a hybrid electric car that uses a fuel cell as an alternative to the internal combustion engine. In June, Texas Tech was one of 13 universities from across the country to compete in the 10-day Future Car Competition in Detroit, Mich.
Study revises theory of how brain works
BONN, Germany, Feb. 28 (UPI) -- German researchers said they've determined the human brain appears to process information more chaotically than has long been thought. University of Bonn scientists said they found the passing of information from neuron to neuron doesn't occur exclusively at the synapses. They said it seems neurons release their chemical messengers along the entire length of their extensions and, in this way, excite the neighboring cells. The finding is said to be of huge significance since it explodes fundamental notions about the way our brain works and might lead to the development of new drugs. The study appears online in the journal Nature Neuroscience and will soon be published in the print edition.
Video: Smart Meters Save $$$
Mechanical Engineers' Device Helps Electricity Conservation
Science Daily — Smart meters are small computers that provide real-time information on how much electricity is being used by each customer and when, and can relay information back to the head office over the very same power lines they feed from. Smart meters can bill different rates depending on overall grid usage, to encourage conservation. Users can program them to control appliances, for example running the dishwasher during off-peak hours, when rates are lower.
SAN FRANCISCO -- During the winter, we crank up the heat. During the summer, we turn up the air. And all the time we're eating up electricity. Now a new smart meter may help to save energy and save money. Like most parents, finding ways to save is a priority for Trina Camping. But she thought saving on her electric bill was a lost cause. That is, until she saw the light and started using a SmartMeter. "This is what makes a SmartMeter smart," Mechanical Engineer Tim Vahlstrom tells DBIS. "It's mounted onto a typical electric meter." Engineers at Pacific Gas and Electric in San Francisco are working on SmartMeters. They're mini computers that provide real-time information on how much electricity is being used by each customer and when. "This utilizes a technology called 'Powerline Carrier.' So it puts the signal back on the lines that actually feed the Meter, back through the power lines, to the transformer, back all the way to the head office," Vahlstrom says. Everything is done remotely. You can turn your air conditioning, heating and lights on at home -- even when you're on vacation. Vahlstrom says, "I want my thermostat to raise the temperature by five degrees or four degrees if the price of electricity gets above this level. Then we can automatically send a command to the thermostat to do that." Power outages can be detected immediately, and within seconds, power is back on. "It actually does these things remotely without a person on-site," Vahlstrom says. Peak power hours are from 2 p.m. to 7 p.m. By running her dishwasher and other appliances later at night, the power company gives Camping and her family a discount. "We've saved about 0 to 0," Camping says. It's the first step to saving electricity and saving money. Minnesota, Arizona, Pennsylvania and Florida are already using the Smart Meters. Not all of the functions are available right now in every state, but will be soon. BACKGROUND: Pacific Gas and Electric in California is installing Smart Meters in millions of homes: small computers that communicate with a utility's central data center, providing real-time information on how much electricity a customer is using, and when it is being used. These remotely-read meters can be linked to a variety of pricing and other options, and should help improve service and lower costs for consumers. Similar systems have been introduced in Minnesota, Pennsylvania and Florida. SAVING MONEY: While there will be an initial small hike in electricity rates to pay for the billion Smart Meter program, in the long term, potential savings could be considerable. The new "voluntary pricing plans" charge customers more for power used during peak periods (such as weekday afternoons) when supply is tight and prices are higher, and charge less at night and on weekends, when demand and prices are lower. Users can plan their cost and energy usage accordingly. Power outages can be detected right away, and since everything is done remotely, there is no need for meter readings, or on-site connection or disconnection of service. ON THE GRID: The nation's power grid boasts more than 6,000 inter-connected power generation stations. Power is sent around the country via half a million miles of bulk transmission lines carrying high voltage charges of electricity. From these lines, power is sent to regional and neighborhood substations, where the electricity is then stepped down from high voltage to a current suitable for use in homes and offices.
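The savings come from time-of-use pricing: the same kilowatt-hour costs more during the 2 p.m. to 7 p.m. peak than at night. A minimal sketch of that billing logic is below; the rates and the household usage profile are illustrative assumptions, not PG&E's actual tariffs.

```python
# Time-of-use billing sketch, in the spirit of the "voluntary pricing plans"
# described above. Rates and usage numbers are assumed, not real tariffs.

PEAK_HOURS = range(14, 19)          # 2 p.m. to 7 p.m., as described in the article
PEAK_RATE = 0.30                    # $/kWh (assumed)
OFF_PEAK_RATE = 0.10                # $/kWh (assumed)

def daily_cost(hourly_kwh):
    """hourly_kwh: list of 24 numbers, energy used in each hour of the day."""
    return sum(kwh * (PEAK_RATE if hour in PEAK_HOURS else OFF_PEAK_RATE)
               for hour, kwh in enumerate(hourly_kwh))

# Running the dishwasher (1.5 kWh) at 4 p.m. versus 10 p.m.:
base = [0.5] * 24                   # constant background load
afternoon = base.copy(); afternoon[16] += 1.5
night = base.copy();     night[22] += 1.5
print(f"dishwasher at 4 p.m.:  ${daily_cost(afternoon):.2f}")
print(f"dishwasher at 10 p.m.: ${daily_cost(night):.2f}")
```

The meter itself only measures and reports; it is the time-varying price schedule, plus the customer shifting load in response, that produces the savings the article describes.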
The system has its advantages: distant stations can provide electricity to cities and towns that may have lost power. But unusually high or unbalanced demands for power -- especially those that develop suddenly -- can upset the smooth flow of electricity. This can cause a blackout in one section of a grid, or ripple through the entire grid, shutting down one section after another, making it difficult to restore power from neighboring stations. AC/DC: There are two different kinds of electrical current: alternating current (AC) and direct current (DC). In direct current a steady stream of electrons flows continuously in only one direction, for example, from the negative to the positive terminal of a battery. Alternating current changes direction 50 or 60 times per second, oscillating up and down. Almost all of the electricity used in homes and businesses is alternating current. That's because it's easier to send AC over long distances without losing too much to leakage. Leakage is the result of resistance as electricity travels along a wire over distance; some voltage loss inevitably occurs. AC can be converted much more easily to higher voltages, which are better able to overcome line resistance.
From Farm Waste To Fuel Tanks: Record-breaking Methane Storage System Derived From Corncobs
Science Daily — Using corncob waste as a starting material, researchers have created carbon briquettes with complex nanopores capable of storing natural gas at an unprecedented density of 180 times their own volume and at one seventh the pressure of conventional natural gas tanks.  Researchers at the University of Missouri-Columbia and the Midwest Research Institute in Kansas City have developed a method to convert corncob waste into a carbon "sponge" with nanoscale pores. The new material can store large quantities of natural gas and can be formed into a variety of shapes, ideal characteristics for next-generation gas storage tanks on methane-powered automobiles. (Credit: Nicolle Rager Fuller, National Science Foundation)
The breakthrough, announced today in Kansas City, Mo., is a significant step forward in the nationwide effort to fit more automobiles to run on methane, an abundant fuel that is domestically produced and cleaner burning than gasoline. Supported by the National Science Foundation (NSF) Partnership for Innovation program, researchers at the University of Missouri-Columbia (MU) and Midwest Research Institute (MRI) in Kansas City developed the technology. The technology has been incorporated into a test bed installed on a pickup truck used regularly by the Kansas City Office of Environmental Quality. The briquettes are the first technology to meet the 180-to-1 storage-to-volume target set by the U.S. Department of Energy in 2000, a long-term goal of principal project leader Peter Pfeifer of MU. "We are very excited about this breakthrough because it may lead to a flat and compact tank that would fit under the floor of a passenger car, similar to current gasoline tanks," said Pfeifer. "Such a technology would make natural gas a widely attractive alternative fuel for everyone." According to Pfeifer, the absence of such a flatbed tank has been the principal reason why natural gas, which costs significantly less than gasoline and diesel and burns more cleanly, is not yet widely used as a fuel for vehicles. Standard natural gas storage systems use high-pressure natural gas that has been compressed to a pressure of 3600 pounds per square inch and bulky tanks that can take up the space of an entire car trunk. The carbon briquettes contain networks of pores and channels that can hold methane at a high density without the cost of extreme compression, ultimately storing the fuel at a pressure of only 500 pounds per square inch, the pressure found in natural gas pipelines. The low pressure of 500 pounds per square inch is central for crafting the tank into any desired shape, so ultimately, fuel storage tanks could be thin-walled, slim, rectangular structures affixed to the underside of the car, not taking up room in the vehicle. Pfeifer and his colleagues at MU and MRI discovered that fractal pore spaces (spaces created by repetition of similar patterns at different scales) are remarkably efficient at storing natural gas. "Our project is the first time a carbon storage material has been made from corncobs, an abundantly available waste product in the Midwest," said Pfeifer. "The carbon briquettes are made from the cobs that remain after the kernels have been harvested. The state of Missouri alone could supply the raw material for more than 10 million cars per year. It would be a unique opportunity to bring corn to the market for alternative fuels--corn kernels for ethanol production, and corncob for natural gas tanks." The test pickup truck, part of a fleet of more than 200 natural gas vehicles operated by Kansas City, has been in use since mid-October and the researchers are monitoring the technology's performance, from mileage data to measurements of the stability of the briquettes. In addition to efforts to commercialize the technology, the researchers are now focusing on the next-generation briquette, one that will store more natural gas and cost less to produce. Pfeifer believes this next generation of briquette might even hold promise for storing hydrogen.
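To see why the 180-to-1 figure matters, a rough ideal-gas comparison is sketched below: at 500 pounds per square inch an empty tank holds only a few dozen volumes of methane, while the adsorbent briquettes hold 180 volumes, approaching what conventional 3600-psi compression achieves. Only the pressures and the 180 v/v figure come from the article; the tank size and the ideal-gas treatment are illustrative assumptions.

```python
# Rough comparison of how much methane (measured at atmospheric pressure) fits
# in a tank. Only the 180 v/v and pressure figures come from the article; the
# tank volume and the ideal-gas approximation are illustrative assumptions.

ATM_PSI = 14.7
tank_liters = 100.0                       # assumed tank size

def ideal_gas_vv(pressure_psi):
    """Volumes of gas (at 1 atm) stored per volume of tank, ideal-gas estimate."""
    return pressure_psi / ATM_PSI

cng_3600 = ideal_gas_vv(3600) * tank_liters       # conventional compressed natural gas
plain_500 = ideal_gas_vv(500) * tank_liters       # 500 psi with no adsorbent
briquette_500 = 180 * tank_liters                 # 180 v/v reported for the corncob carbon

print(f"3600 psi, empty tank:         ~{cng_3600:.0f} L of methane")
print(f" 500 psi, empty tank:         ~{plain_500:.0f} L of methane")
print(f" 500 psi, carbon briquettes:   {briquette_500:.0f} L of methane")
```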
3D computer graphics are works of graphic art created with the aid of digital computers and 3D software. The term may also refer to the process of creating such graphics, or the field of study of computer graphic techniques and related technology. 3D computer graphics are different from 2D computer graphics in that a three-dimensional representation of geometric data is stored in the computer for the purposes of performing calculations and rendering 2D images. Such images may be for later display or for real-time viewing. 3D modeling is the process of preparing geometric data for 3D computer graphics, and is similar to sculpting, whereas the art of 2D graphics is analogous to photography. Despite these differences, 3D computer graphics rely on many of the same algorithms as 2D computer graphics. In computer graphics software, the distinction between 2D and 3D is occasionally blurred; 2D applications may use 3D techniques to achieve effects such as lighting, and primarily 3D applications may use 2D rendering techniques.
Technology
OpenGL and Direct3D are two popular APIs for generation of real-time imagery. Real-time means that image generation occurs in 'real time', or 'on the fly', and may be highly user-interactive. Many modern graphics cards provide some degree of hardware acceleration based on these APIs, frequently enabling display of complex 3D graphics in real-time.
Creation of 3D computer graphics
(Images: an architectural rendering, a composite of modeling and lighting finalized by the rendering process; a 3D model of a suspension bridge spanning an unusually placid body of water.)
The process of creating 3D computer graphics can be sequentially divided into three basic phases: content creation (3D modeling, texturing, animation), scene layout setup, and rendering.
Modeling
The modeling stage consists of shaping individual objects that are later used in the scene. There are a number of modeling techniques, including constructive solid geometry, NURBS modeling, polygonal modeling, subdivision surfaces, and implicit surfaces. Modeling processes may also include editing object surface or material properties (e.g., color, luminosity, diffuse and specular shading components — more commonly called roughness and shininess, reflection characteristics, transparency or opacity, or index of refraction), adding textures, bump maps and other features. Modeling may also include various activities related to preparing a 3D model for animation (although in a complex character model this will become a stage of its own, known as rigging). Objects may be fitted with a skeleton, a central framework of an object with the capability of affecting the shape or movements of that object. This aids in the process of animation, in that the movement of the skeleton will automatically affect the corresponding portions of the model. See also forward kinematic animation and inverse kinematic animation. At the rigging stage, the model can also be given specific controls to make animation easier and more intuitive, such as facial expression controls and mouth shapes (phonemes) for lip syncing. Modeling can be performed by means of a dedicated program (e.g., Cinema 4D, Lightwave Modeler, Rhinoceros 3D, Moray), an application component (Shaper, Lofter in 3D Studio) or some scene description language (as in POV-Ray). In some cases, there is no strict distinction between these phases; in such cases modeling is just part of the scene creation process (this is the case, for example, with Caligari trueSpace and Realsoft 3D). Particle systems are masses of 3D coordinates that have points, polygons, splats, or sprites assigned to them; they act as a volume to represent a shape.
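In polygonal modeling, an object is typically stored as a list of vertex positions plus faces that index into that list; textures, bump maps and skeletons are attached on top of this structure. A minimal, hypothetical sketch of such a mesh (a unit cube, not taken from any particular package) is below.

```python
# A minimal illustration of how a polygonal model is often stored: a list of
# vertex positions plus faces that index into it. The cube is purely illustrative.

cube_vertices = [
    (-1, -1, -1), (1, -1, -1), (1, 1, -1), (-1, 1, -1),   # back-face corners
    (-1, -1,  1), (1, -1,  1), (1, 1,  1), (-1, 1,  1),   # front-face corners
]

# Each face is a tuple of indices into cube_vertices (quads here; many renderers
# triangulate these during tessellation).
cube_faces = [
    (0, 1, 2, 3), (4, 5, 6, 7),   # back, front
    (0, 1, 5, 4), (2, 3, 7, 6),   # bottom, top
    (0, 3, 7, 4), (1, 2, 6, 5),   # left, right
]

def face_centroid(face):
    """Average of a face's corner positions -- a simple per-face attribute."""
    xs, ys, zs = zip(*(cube_vertices[i] for i in face))
    n = len(face)
    return (sum(xs) / n, sum(ys) / n, sum(zs) / n)

print(face_centroid(cube_faces[0]))   # (0.0, 0.0, -1.0)
```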
Process
(Image: a 3D scene of 8 red glass balls.)
Scene layout setup
Scene setup involves arranging virtual objects, lights, cameras and other entities on a scene which will later be used to produce a still image or an animation. If used for animation, this phase usually makes use of a technique called "keyframing", which facilitates creation of complicated movement in the scene. With the aid of keyframing, instead of having to fix an object's position, rotation, or scaling for each frame in an animation, one needs only to set up some key frames between which states in every frame are interpolated. Lighting is an important aspect of scene setup. As is the case in real-world scene arrangement, lighting is a significant contributing factor to the resulting aesthetic and visual quality of the finished work. As such, it can be a difficult art to master. Lighting effects can contribute greatly to the mood and emotional response effected by a scene, a fact which is well-known to photographers and theatrical lighting technicians.
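A minimal sketch of keyframe interpolation is below: values are fixed only at a handful of key frames, and the in-between frames are filled in automatically. Linear interpolation is used here for simplicity, whereas real animation packages usually interpolate with splines and easing curves; the frame numbers and positions are illustrative.

```python
# Keyframing sketch: the animator fixes values only at key frames, and values
# for in-between frames are interpolated (linearly, in this simplified example).

keyframes = {0: 0.0, 24: 10.0, 48: 10.0, 72: -5.0}   # frame -> x position of an object

def value_at(frame, keys):
    """Linearly interpolate between the two keyframes surrounding `frame`."""
    frames = sorted(keys)
    if frame <= frames[0]:
        return keys[frames[0]]
    if frame >= frames[-1]:
        return keys[frames[-1]]
    for lo, hi in zip(frames, frames[1:]):
        if lo <= frame <= hi:
            t = (frame - lo) / (hi - lo)
            return keys[lo] * (1 - t) + keys[hi] * t

for f in (0, 12, 24, 36, 60):
    print(f"frame {f}: x = {value_at(f, keyframes):.2f}")
```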
Tessellation and meshes
The process of transforming representations of objects, such as the middle point coordinate of a sphere and a point on its circumference, into a polygon representation of a sphere is called tessellation. This step is used in polygon-based rendering, where objects are broken down from abstract representations ("primitives") such as spheres, cones, etc., to so-called meshes, which are nets of interconnected triangles. Meshes of triangles (instead of e.g. squares) are popular as they have proven to be easy to render using scanline rendering. Polygon representations are not used in all rendering techniques, and in these cases the tessellation step is not included in the transition from abstract representation to rendered scene.
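The sketch below tessellates an abstract sphere (a center and a radius) into a triangle mesh by sampling it in latitude/longitude bands. It is a simplified illustration of the idea, not the algorithm any particular renderer uses, and the resolution values are arbitrary.

```python
import math

# Turn an abstract sphere (center + radius) into a mesh of triangles by sampling
# it in latitude/longitude bands.

def tessellate_sphere(center, radius, n_lat=8, n_lon=16):
    cx, cy, cz = center
    def point(i, j):
        theta = math.pi * i / n_lat          # 0..pi from pole to pole
        phi = 2 * math.pi * j / n_lon        # 0..2pi around the equator
        return (cx + radius * math.sin(theta) * math.cos(phi),
                cy + radius * math.sin(theta) * math.sin(phi),
                cz + radius * math.cos(theta))
    triangles = []
    for i in range(n_lat):
        for j in range(n_lon):
            p00, p01 = point(i, j), point(i, j + 1)
            p10, p11 = point(i + 1, j), point(i + 1, j + 1)
            # each lat/lon quad becomes two triangles (degenerate ones at the poles)
            triangles.append((p00, p10, p11))
            triangles.append((p00, p11, p01))
    return triangles

mesh = tessellate_sphere((0.0, 0.0, 0.0), 1.0)
print(len(mesh), "triangles")   # 2 * n_lat * n_lon = 256
```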
Rendering
Rendering is the final process of creating the actual 2D image or animation from the prepared scene. This can be compared to taking a photo or filming the scene after the setup is finished in real life. Rendering for interactive media, such as games and simulations, is calculated and displayed in real time, at rates of approximately 20 to 120 frames per second. Animations for non-interactive media, such as feature films and video, are rendered much more slowly. Non-real-time rendering enables the leveraging of limited processing power in order to obtain higher image quality. Rendering times for individual frames may vary from a few seconds to several days for complex scenes. Rendered frames are stored on a hard disk, then can be transferred to other media such as motion picture film or optical disk. These frames are then displayed sequentially at high frame rates, typically 24, 25, or 30 frames per second, to achieve the illusion of movement. Several different, and often specialized, rendering methods have been developed. These range from the distinctly non-realistic wireframe rendering through polygon-based rendering, to more advanced techniques such as scanline rendering, ray tracing, or radiosity. In general, different methods are better suited for either photo-realistic rendering or real-time rendering. In real-time rendering, the goal is to show as much information as the eye can process in a 30th of a second (or one frame, in the case of 30-frame-per-second animation). The goal here is primarily speed and not photo-realism. In fact, real-time rendering exploits the way the eye 'perceives' the world, so the final image presented is not necessarily that of the real world, but one which the eye can closely associate with it. This is the basic method employed in games, interactive worlds, and VRML. The rapid increase in computer processing power has allowed a progressively higher degree of realism even for real-time rendering, including techniques such as HDR rendering. Real-time rendering is often polygonal and aided by the computer's GPU.
(Image: an example of a ray-traced image, which typically takes seconds or minutes to render; the photo-realism is apparent.)
When the goal is photo-realism, techniques are employed such as ray tracing or radiosity. Rendering often takes on the order of seconds or sometimes even days (for a single image/frame). This is the basic method employed in digital media and artistic works, etc. Rendering software may simulate such visual effects as lens flares, depth of field or motion blur. These are attempts to simulate visual phenomena resulting from the optical characteristics of cameras and of the human eye. These effects can lend an element of realism to a scene, even if the effect is merely a simulated artifact of a camera. Techniques have been developed for the purpose of simulating other naturally-occurring effects, such as the interaction of light with various forms of matter. Examples of such techniques include particle systems (which can simulate rain, smoke, or fire), volumetric sampling (to simulate fog, dust and other spatial atmospheric effects), caustics (to simulate light focusing by uneven light-refracting surfaces, such as the light ripples seen on the bottom of a swimming pool), and subsurface scattering (to simulate light reflecting inside the volumes of solid objects such as human skin). The rendering process is computationally expensive, given the complex variety of physical processes being simulated. Computer processing power has increased rapidly over the years, allowing for a progressively higher degree of realistic rendering. Film studios that produce computer-generated animations typically make use of a render farm to generate images in a timely manner. However, falling hardware costs mean that it is entirely possible to create small amounts of 3D animation on a home computer system. The output of the renderer is often used as only one small part of a completed motion-picture scene. Many layers of material may be rendered separately and integrated into the final shot using compositing software.
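At the heart of ray tracing is a geometric test: does a ray fired from the camera (or bounced off a surface) hit an object, and how far away? The sketch below shows that test for a sphere; it is a simplified, self-contained illustration rather than code from any particular renderer.

```python
import math

# Ray-sphere intersection, the basic building block a ray tracer evaluates for
# every ray it shoots into the scene.

def ray_sphere_hit(origin, direction, center, radius):
    """Return the distance along the ray to the nearest intersection, or None."""
    ox, oy, oz = origin; dx, dy, dz = direction; cx, cy, cz = center
    lx, ly, lz = ox - cx, oy - cy, oz - cz
    a = dx * dx + dy * dy + dz * dz
    b = 2 * (dx * lx + dy * ly + dz * lz)
    c = lx * lx + ly * ly + lz * lz - radius * radius
    disc = b * b - 4 * a * c
    if disc < 0:
        return None                       # ray misses the sphere
    t = (-b - math.sqrt(disc)) / (2 * a)  # nearer of the two roots
    return t if t > 0 else None

# Ray from the origin straight down the z axis toward a sphere centered at z = 5:
print(ray_sphere_hit((0, 0, 0), (0, 0, 1), (0, 0, 5), 1.0))   # 4.0
```

A full ray tracer shoots one or more such rays per pixel, shades the nearest hit, and recursively spawns reflection, refraction and shadow rays, which is why a single frame can take seconds to days.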
Renderers
Although renderers are usually included in 3D software packages, many renderers are offered as plugins, including AccuRender, Brazil R/S, Bunkspeed, finalRender, Indigo Renderer, Kerkythea, Maxwell, mental ray, POV-Ray, Realsoft 3D, Pixar RenderMan, V-Ray, and YafRay.
Projection
(Image: perspective projection.)
The mathematical model represented inside the computer must be transformed back so that the human eye can correlate the image to a realistic one. But the fact that the display device - namely a monitor - can display only two dimensions means that this mathematical model must be transferred to a two-dimensional image. Often this is done using projection; mostly using perspective projection. The basic idea behind the perspective projection, which unsurprisingly is the way the human eye works, is that objects that are further away are smaller in relation to those that are closer to the eye. Thus, to collapse the third dimension onto a screen, a corresponding operation is carried out to remove it - in this case, a division operation. Orthographic projection is used mainly in CAD or CAM applications where scientific modelling requires precise measurements and preservation of the third dimension.
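The division operation mentioned above is the perspective divide: a camera-space point's x and y coordinates are divided by its depth, so more distant objects land closer to the image center and appear smaller. A minimal sketch, with an assumed focal length of 1, is below.

```python
# Perspective divide: collapse a 3D camera-space point onto a 2D image plane by
# dividing x and y by the depth z. The focal length is an illustrative value.

def project(point, focal_length=1.0):
    """Project a camera-space point (x, y, z) onto the image plane at z = focal_length."""
    x, y, z = point
    if z <= 0:
        raise ValueError("point is behind the camera")
    return (focal_length * x / z, focal_length * y / z)

# Two objects of the same size, one twice as far away, appear half as large:
print(project((1.0, 1.0, 2.0)))   # (0.5, 0.5)
print(project((1.0, 1.0, 4.0)))   # (0.25, 0.25)
```

Orthographic projection, by contrast, simply drops the z coordinate without dividing, which is why it preserves measurements for CAD and CAM work.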
Reflection and shading models
Modern 3D computer graphics rely heavily on a simplified reflection model called the Phong reflection model (not to be confused with Phong shading). In refraction of light, an important concept is the refractive index. In most 3D programming implementations, the term for this value is "index of refraction," usually abbreviated "IOR."
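In the Phong reflection model, the brightness seen at a surface point is the sum of an ambient term, a diffuse term that depends on the angle between the surface normal and the light, and a specular highlight that depends on how closely the reflected light direction lines up with the viewer. The sketch below evaluates that sum for a single light and a single channel; all coefficients are illustrative assumptions.

```python
import math

# Single-light, single-channel Phong reflection: ambient + diffuse + specular.
# All material coefficients below are illustrative.

def normalize(v):
    n = math.sqrt(sum(c * c for c in v))
    return tuple(c / n for c in v)

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def phong(normal, to_light, to_viewer,
          ka=0.1, kd=0.7, ks=0.4, shininess=32, light=1.0):
    n, l, v = normalize(normal), normalize(to_light), normalize(to_viewer)
    diffuse = max(dot(n, l), 0.0)
    # reflect the light direction about the normal: r = 2(n.l)n - l
    r = tuple(2 * dot(n, l) * nc - lc for nc, lc in zip(n, l))
    specular = max(dot(r, v), 0.0) ** shininess if diffuse > 0 else 0.0
    return ka * light + kd * light * diffuse + ks * light * specular

print(phong(normal=(0, 0, 1), to_light=(0, 1, 1), to_viewer=(0, 0, 1)))
```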
3D graphics APIs
3D graphics have become so popular, particularly in computer games, that specialized APIs (application programming interfaces) have been created to ease the processes in all stages of computer graphics generation. These APIs have also proved vital to computer graphics hardware manufacturers, as they provide a way for programmers to access the hardware in an abstract way, while still taking advantage of the special hardware of this-or-that graphics card.
These APIs for 3D computer graphics are particularly popular:
OpenGL and the OpenGL Shading Language
OpenGL ES 3D API for embedded devices
Direct3D (a subset of DirectX)
RenderMan
RenderWare
Glide API
TruDimension LC Glasses and 3D monitor API
There are also higher-level 3D scene-graph APIs that provide additional functionality on top of the lower-level rendering API. Such libraries under active development include:
QSDK
Quesa
Java 3D
JSR 184 (M3G)
Vega Prime by MultiGen-Paradigm
NVIDIA Scene Graph
Open Scene Graph
OpenSG
OGRE
JMonkey Engine
Irrlicht Engine
Hoops3D
UGS DirectModel (aka JT)
See also: 3D model, 3D modeler, 3D projection, Ambient occlusion, Anaglyph image (stereo pictures viewed with red-blue glasses so that a 3D image is perceived by the human eye), Animation, Graphics, History of 3D Graphics (major milestones, influential people, hardware and software developments), Panda3D, Polarized glasses (a method to view 3D images), Reflection (computer graphics), Rendering (computer graphics), VRML, X3D, 3D motion controller
_____________________________________________________________________________
Researchers Wake Up Viruses Inside Tumors To Image And Then Destroy Cancers
Science Daily — Researchers have found a way to activate Epstein-Barr viruses inside tumors as a way to identify patients whose infection can then be manipulated to destroy their tumors. They say this strategy could offer a novel way of treating many cancers associated with Epstein-Barr, including at least four different types of lymphoma and nasopharyngeal and gastric cancers. In the March 1 issue of Clinical Cancer Research, a team of radiologists and oncologists from Johns Hopkins Medical Institutions describe how they used two agents already on the market, one of which is the multiple myeloma drug Velcade, to light up tumor viruses on a gamma camera. The technique is the first in the new field of in vivo molecular-genetic imaging that doesn't require transfecting tumors with a "reporter" gene, the scientists say. "The beauty of this is that you don't have to introduce any reporter genes into the tumor because they are already there," says radiologist Martin G. Pomper, M.D., Ph.D. "This is the only example we know of where it is possible to image activated endogenous gene expression without having to transfect cells." A variety of blood and solid cancers are more likely to occur in people who have been infected with the Epstein-Barr virus (EBV), but not everyone with these cancers has such infections. For those who do, researchers, such as Hopkins oncologist and co-author Richard F. Ambinder, M.D., Ph.D., have been working on ways to activate the reproductive, or "lytic," cycle within the virus to make it replicate within the tumor cell. When enough viral particles are produced, the tumor will burst, releasing the virus. In animal experiments, this experimental therapy, called lytic induction therapy, results in tumor death. As the first step in this study, researchers screened a wide variety of drugs to see if any of them could reawaken the virus. They were fortunate in that one of the genes that is expressed upon viral lytic induction is EBV's thymidine kinase (EBV-TK), an enzyme that helps the virus begin to reproduce. This kinase is of interest because researchers know its "sister" kinase, the one produced by the herpes simplex virus, can be imaged by an injected radiolabeled chemical (FIAU), which can then be imaged using a gamma camera. "To perform molecular-genetic imaging, we have always had to infect cells with active herpes simplex virus so that they can replicate, express TK, and only then could we use the FIAU tracer to make the cells light up," Pomper says. "So we were hoping to find a way to turn latent Epstein-Barr virus on in these cancers, and use the thymidine kinase it then produces to enable us to see the virus-associated tumors with radiolabeled FIAU." The researchers screened 2,700 agents until they hit upon Velcade, a targeted chemotherapy drug already approved for use in multiple myeloma. "We were both surprised and lucky," he says. "Velcade is a proteasome inhibitor, but it also induces the lytic cycle thereby activating the TK in the Epstein-Barr virus. Once the TK is activated, we can image the tumors." To test their findings, the researchers used mice carrying human Burkitt's lymphoma, a cancer often associated with Epstein-Barr viral infection. Tumors glowed in mice given Velcade followed by an injection of FIAU, but not in mice that weren't given Velcade. Mice whose Burkitt's lymphoma did not contain Epstein-Barr virus also did not respond to either Velcade or FIAU, according to researchers. "Velcade woke up the virus in the tumors, which increased viral load by 12-fold, all the while cranking out TK," Pomper says. "An injection of FIAU made it easy to image the tumors with virus in them." The method is highly sensitive, he says: as few as five percent of the cells within the tumor mass needed to be induced into the lytic cycle in order to be detected. Not only can FIAU light up the tumors, it can also potentially kill them, Pomper says. For imaging purposes, FIAU can carry a radionuclide that emits a low energy gamma photon, but it can be engineered to carry therapeutic radionuclides, which are lethal to cells in which TK is activated. Results of this study suggest that this strategy could be applied to other viruses associated with tumors, and that other drugs may potentially be used to activate these viruses, Pomper says. "Velcade is only one of an array of new, as well as older agents, that can induce lytic infection, and a particular agent could be tailored for use in a specific patient through imaging," he says.
Supercomputer Simulations May Pinpoint Causes Of Parkinson's, Alzheimer's Diseases
Science Daily — Using the massive computer-simulation power of the San Diego Supercomputer Center (SDSC) at UC San Diego, researchers are zeroing in on the causes of Parkinson's disease, Alzheimer's disease, rheumatoid arthritis and other diseases. A study published in this week's Federation of European Biochemical Societies (FEBS) Journal offers – for the first time – a model for the complex process of aggregation of a protein known as alpha-synuclein, which in turn leads to harmful ring-like or pore-like structures in human membranes, the kind of damage found in Parkinson's and Alzheimer's patients. The researchers at SDSC and UC San Diego also found that the destructive properties of alpha-synuclein can be blocked by beta-synuclein – a finding that could lead to treatments for many debilitating diseases. "This is one of the first studies to use supercomputers to model how alpha-synuclein complexes damage the cells, and how that could be blocked," said Eliezer Masliah, professor of neurosciences and pathology at UC San Diego. "We believe that these ring- or pore-like structures might be deleterious to the cells, and we have a unique opportunity to better understand how alpha-synuclein is involved in the pathogenesis of Parkinson's disease, and how to reverse this process." Igor Tsigelny, project scientist in chemistry and biochemistry at UC San Diego and a researcher at SDSC, said that the team's research helped confirm what researchers had suspected. "The present study – using molecular modeling and molecular dynamics simulations in combination with biochemical and ultrastructural analysis – shows that alpha-synuclein can lead to the formation of pore-like structures on membranes." In contrast, he said, "beta-synuclein appears to block the propagation of alpha-synucleins into harmful structures." The complex calculations for the study were performed on Blue Gene supercomputers at SDSC and the Argonne National Labs. Tsigelny worked in collaboration with Pazit Bar-On, Department of Neurosciences; Yuriy Sharikov of SDSC; Leslie Crews of the Department of Pathology; Makoto Hashimoto of Neurosciences; Mark A. Miller of SDSC; Steve H. Keller in Medicine; Oleksandr Platoshyn and Jason X.J. Yuan, both in Medicine; and Masliah, all at UC San Diego. The research was supported by funding from the National Institutes of Health, a Department of Energy INCITE Grant, the Argonne National Laboratory, and the SDSC/IBM Institute for Innovation in Biomedical Simulations and Visualization.
Satellite tracks Indonesia wildfire plume
BOULDER, Colo., March 2 (UPI) -- U.S. scientists have linked the 2006 El Nino weather disturbance to the widespread wildfires in Indonesia. "Droughts over Indonesia are often brought on by a shift in the atmospheric circulation over the tropical Pacific associated with El Nino conditions," said David Edwards, of the National Center for Atmospheric Research in Boulder, Colo. "Although the current El Nino is rather weak compared to that of 1997-98, we have found dramatic increases in wildfire activity and corresponding pollution." Edwards and his colleagues used NASA satellite and rainfall data for their research. When rainfall sharply decreased during the last quarter of 2006 in Indonesia's tropical rainforests, the exceptionally dry conditions allowed wildfires to spread, and large amounts of soot and dust delivered unhealthy pollution levels to the area, he said. The Measurements of Pollution in the Troposphere (MOPITT) instrument aboard NASA's Terra satellite tracked wildfire pollution plumes spreading from Indonesia to the Indian Ocean and measured increases in carbon monoxide levels, NASA reported. "Even though fires in South America and southern Africa typically produce the greatest amount of carbon monoxide, the pollution from Indonesian fires is likely responsible for most of the year-to-year variation in pollution levels throughout the Southern Hemisphere," Edwards said.
NASA's Robotic Sub Readies For Dive Into Earth's Deepest Sinkhole
Science Daily — An underwater robot, shaped like a flattened orange, maneuvered untethered and autonomously within a 115-meter-deep sinkhole during tests this month in Mexico, a prelude to its mission to probe the mysterious nether reaches of the world's deepest sinkhole.  Bill Stone, leader of the NASA-funded Deep Phreatic Thermal Explorer (DEPTHX) mission, said the 2.5-meter-diameter vehicle performed "phenomenally well" during early February tests in the geothermal sinkhole, or cenote, known as La Pilita. Carnegie Mellon University researchers developed the software that guided the DEPTHX craft. (Credit: Image courtesy of Carnegie Mellon University)
"The fact that it ran untethered in a complicated, unexplored three-dimensional space is very impressive," said Stone, an engineer and expert cave diver who heads Stone Aerospace Inc. of Austin, Texas. That's a fundamentally new capability never before demonstrated in autonomous underwater vehicles (AUVs), he added. The autonomous navigation and mapping software that enabled DEPTHX to safely and precisely operate in the close confines of cenote La Pilita was developed by a team of Carnegie Mellon researchers led by David Wettergreen, associate research professor in the Robotics Institute. "These experiments give us confidence that DEPTHX will be able to meet the challenge of its ultimate goal, the cenote El Zacatón," Wettergreen said. Like La Pilita, Zacatón is in the Mexican state of Tamaulipas and was formed by the collapse of a limestone chamber dissolved by warm, acidic groundwater that originated in a nearby volcanic region. The current theory is that the cenote formed under a vast travertine bed like that of Mammoth Hot Springs in Yellowstone National Park. But no one knows how deep Zacatón goes. Human divers, descending far below safe depths, have made it to 282 meters without reaching bottom. Sonar doesn't work over long distances in the confines of the cenote, and current measurements peter out at around 270 meters. NASA has funded the mission to develop and test technologies that might someday be used to explore the oceans hidden under the icy crust of Europa, one of Jupiter's moons. Team members, including scientists from the Southwest Research Institute, the University of Texas and the Colorado School of Mines, also want to learn more about the cenote, including its physical dimensions, the geothermal vents that feed it and whatever life exists at various depths. Because a tether could become tangled or snagged at great depths, DEPTHX is designed to operate autonomously -- independent of human control once it is under way. In areas that are well-mapped, the AUV can operate by dead reckoning, using depth, velocity and inertial guidance sensors to estimate its position. It is also equipped with an array of 56 sonar sensors -- 32 with a range of 100 meters and 24 with a range of 200 meters -- that send sonar beams out in all directions. DEPTHX uses the sonar data to detect obstacles and locate itself on a map, providing precise navigation in the pitch-black deep water. In unexplored areas, the sonar is combined with a software technology known as simultaneous localization and mapping (SLAM) to produce maps of the cenote. In previous applications, Carnegie Mellon roboticists have used a two-dimensional version of SLAM to map environments such as building hallways and mine corridors. Because DEPTHX won't just maneuver side to side, but also up and down, one of Wettergreen's Ph.D. students, Nathaniel Fairfield, has adapted SLAM for more complicated 3-D operation. The DEPTHX researchers, including Project Scientist George Kantor and Senior Research Programmer Dominic Jonak of Carnegie Mellon, traveled five hours south of the border town of Brownsville, Texas, to reach La Pilita.
From the surface, the cenote looks like an ordinary duck pond, perhaps 30 meters across. "You can swim across it in a minute and it is warm, too," Wettergreen said. "But the top is like the neck of a vase. As you get deeper, the cenote widens until it's more than 100 meters across. It is a duck pond that is 115 meters deep." The overhang created by the narrowing at the top of La Pilita made the cenote a challenging environment. Researchers made certain that, in the event of an emergency, DEPTHX could find its way to the surface, even if it had been operating underneath the overhang. Kantor has developed vehicle models and sensor filters for highly accurate dead reckoning. Repeated tests showed that DEPTHX could estimate its position even after hours of underwater operation, determining its location within one meter using dead reckoning and within 15 centimeters using sonar localization. Wettergreen said the use of SLAM was limited during this initial test, but Stone noted that the 3-D mapping capability was demonstrated Feb. 5 when DEPTHX made a descent to the cenote floor. DEPTHX spun slowly as it descended, helping the sonar beams cover as much of the walls as possible. The resulting map revealed a tunnel on the western wall and a bulging of the northwest wall at a depth of about 30 meters. DEPTHX was able to maneuver close to the walls so it could extend a mechanical arm with a coring mechanism and obtain wall samples. The AUV also conducted water-sampling experiments, though the full science instrumentation package won't be tested until the second week of March, when the researchers will return to La Pilita for a science investigation and complete rehearsal of their mission to Zacatón. The researchers plan to begin their exploration of Zacatón in May.
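Dead reckoning means integrating the vehicle's own speed and heading over time to estimate where it is, without any external reference; small errors accumulate, which is why DEPTHX also corrects its estimate against sonar maps (SLAM). A minimal two-dimensional sketch of the idea is below; the trajectory and sample values are illustrative, not DEPTHX telemetry.

```python
import math

# 2D dead-reckoning sketch: integrate speed and heading over time to estimate
# position. DEPTHX does this in 3D and fuses it with sonar localization.

def dead_reckon(start, samples):
    """samples: list of (dt_seconds, speed_m_per_s, heading_radians)."""
    x, y = start
    for dt, speed, heading in samples:
        x += speed * math.cos(heading) * dt
        y += speed * math.sin(heading) * dt
    return x, y

# Drive east for 60 s at 0.5 m/s, then north for 30 s at 0.5 m/s:
path = [(60, 0.5, 0.0), (30, 0.5, math.pi / 2)]
print(dead_reckon((0.0, 0.0), path))   # roughly (30.0, 15.0)
```

Because each step inherits the error of the last, position drifts over time; matching the 56-beam sonar returns against a stored map is what lets the vehicle tighten its estimate from about a meter down to roughly 15 centimeters.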
Smoking mother can hurt fetus
ORLANDO, Fla., March 2 (UPI) -- Women who smoke during pregnancy can put their children at risk for strokes and heart attacks later in life, a study presented in Florida said. "This is the first report to demonstrate this association," said Dr. Cuno S. Uiterwaal of the University Medical Center Utrecht in the Netherlands. "This is a preventable risk factor. Women need to stop smoking, especially in pregnancy, not only for their own health, but for their unborn child." Uiterwaal presented his study at the American Heart Association's 47th Annual Conference on Cardiovascular Disease Epidemiology and Prevention. It found that maternal smoking could cause permanent vascular damage in unborn children. "There is the possibility that the compounds in tobacco smoke go through the placenta and directly damage the cardiovascular system of the fetus," Uiterwaal said. "The damage appears to be permanent and stays with the children."
New Mechanism For Producing Cosmic Gamma Rays From Starlight Is Proposed
Science Daily — In 2002, when astronomers first detected cosmic gamma rays — the most energetic form of light known — coming from the constellation Cygnus, they were surprised and perplexed. The region lacked the extreme electromagnetic fields that they thought were required to produce such energetic rays. But now a team of theoretical physicists propose a mechanism that can explain this mystery and may also help account for another type of cosmic ray, the high-energy nuclei that rain down on Earth in the billions.  The starburst area in Cygnus OB2 is dominated by young, bright, hot stars and has been identified as a source of cosmic gamma rays. This is an infrared image taken of the area by the Infrared Astronomical Satellite. (Credit: Image courtesy of J. Knoedlseder)
The new mechanism is described in a Physical Review Letters paper published online on March 20. The theoretical study was headed by Thomas Weiler, professor of physics at Vanderbilt, working with Luis Anchordoqui at the University of Wisconsin-Milwaukee; John Beacom at Ohio State University; Haim Goldberg at Northeastern University; and Sergio Palomares-Ruiz at the University of Durham. Existing methods for producing cosmic gamma rays require the ultra-strong electromagnetic fields found only in some of the most extreme conditions in the universe, such as stellar explosions and regions surrounding the massive black holes found at the core of many galaxies. So they couldn't explain how a "starburst" region in Cygnus dominated by young, hot, bright stars could produce such energetic rays. The newly proposed mechanism, however, shows how two constituents present in such an area — fast-moving nuclei found in stellar winds and ultraviolet light — can interact to produce cosmic gamma rays. Cosmic rays provide an invisible but important link between the Earth and the rest of the universe. They have a number of subtle effects on everyday life. They cause chemical changes in soil and rock and trigger lightning strikes, and some scientists have suggested that they may affect the climate by influencing the process of cloud formation. The circuitry in computer chips is now so small that individual cosmic rays can cause non-reproducible computer errors, and cosmic rays increase the risk of cancer among frequent airline passengers. There is also speculation that waves of cosmic rays streaming down the spiral arms of the galaxy could have contributed to past episodes of mass extinction on Earth. Since cosmic rays were discovered in 1912 in balloon experiments, scientists have marveled at the tremendous amount of energy that they carry and have speculated about their origins. Originally, about all researchers knew about them was that they came from outer space. Today, scientists know that cosmic rays consist of a variety of different objects, including gamma rays, protons, electrons and the nuclei of a wide variety of different elements. They also know more about where cosmic rays come from. Most low-energy cosmic rays are produced by the sun. However, high-energy cosmic rays come from distant parts of the universe. Despite the years of study, cosmic rays have managed to keep a number of secrets. For example, the most energetic proton cosmic rays — nicknamed "Oh-my-God-particles" — pack a punch equivalent to that of a fast-pitch baseball. In the baseball, billions upon billions of nuclear particles share this energy. These energetic cosmic rays demonstrate that there are ways to pack the same amount of energy into a single particle, but, despite their continuing efforts, scientists have not yet found an acceptable mechanism for doing so. Another outstanding question is the origin of the most energetic gamma rays. They carry a trillion times more energy than photons in the visible range, making them the most energetic form of light known. (Atomic particles like protons and electrons gain and lose energy by speeding up and slowing down. Light particles, called photons, always travel at the same speed and gain energy by oscillating faster at shorter wavelengths and lose energy by oscillating more slowly at longer wavelengths.) Physicists measure the energy in photons in electron-volts (eV): the amount of energy a single electron gains when it passes through a potential difference of one volt.
The energies of photons in visible light range from 1.5 to 3.0 eV. Cosmic gamma rays contain tens of trillions of electron volts. Such TeV gamma rays are relatively rare: one falls on a square kilometer of Earth's atmosphere every second on average. Virtually all of them collide with air molecules and produce a cascade of energetic particles in the upper atmosphere. Scientists have come up with several mechanisms that can explain how photons can gain so much energy. They do a good job of explaining how TeV gamma rays can be created by the ultra-strong electrical and magnetic fields that occur when stars explode and that are associated with the super-massive black holes found in many galaxies. One of the generally accepted mechanisms begins with electrons that have been accelerated to extremely high energies. When such an electron runs head on into a microwave photon, it can transfer much of its energy into the photon by a process called Compton back scattering. In the process, the microwave photon is transformed into a TeV gamma ray. A variation on the theme involves the interaction of a fast-moving electron with an extremely strong magnetic field. The magnetic field throws the electron into a curve. If the curve is sharp enough, the electron will lose energy by emitting high-energy gamma rays: a phenomenon called bremsstrahlung. The second mechanism involves collisions between highly accelerated protons and a photon. In this case, the proton first absorbs the photon. This makes the proton unstable, so it decays into a short-lived subatomic particle called a pion, which, in turn, decays into a pair of cosmic gamma rays. "There is a region in Cygnus, called Cygnus OB2, where there have been unexplained observations of TeV gamma rays: That is where we jumped in," says Weiler. The new mechanism he and his colleagues have worked out uses the strong ultraviolet light produced by young, hot stars and the nuclei of iron and silicon, which should be present in the stellar winds in starburst regions. Both nuclei carry strong positive electric charges, so they can be accelerated to extremely high velocities by moderate electromagnetic fields. The scientists calculate that when one of these nuclei collides with a photon of ultraviolet starlight, it will frequently disintegrate into a nuclear fragment and some TeV gamma rays. "Each of these three mechanisms — electron versus proton versus nucleus as accelerated beam — has a characteristic signature in the gamma ray spectrum. Our nuclear mechanism fits the observations from Cygnus OB2 much better than the others," says Weiler. The heavy nuclei required in this process are produced in supernovas and there are no known exploded stars in the region. So the model assumes that these nuclei, which are spread throughout space, are sometimes trapped by starburst regions. "This is one of the weakest parts of our model, so I don't want to push this aspect," says Weiler. However, if the model is correct then these regions may be an important source of the nuclei fraction of the cosmic rays that fall on Earth. The nuclei that produce the cosmic gamma rays should stream out into the galaxy and some should reach Earth eventually as cosmic rays. Unlike gamma rays, which can be tracked back to their sources, the paths of electrically charged nuclei are altered by the magnetic fields that they pass through so it is not possible to determine their origins directly.
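The scale of these energies can be checked with the relation E = hc/λ: a photon of visible light carries a couple of electron volts, so a 1 TeV gamma ray is indeed roughly a trillion times more energetic. The wavelength used below (green light at about 550 nm) is an illustrative choice.

```python
# Rough illustration of the energy scales quoted above, using E = h*c / wavelength.
# The "visible light" wavelength is an assumption chosen for the example.

H = 6.626e-34        # Planck constant, J*s
C = 2.998e8          # speed of light, m/s
EV = 1.602e-19       # joules per electron volt

def photon_energy_ev(wavelength_m):
    return H * C / wavelength_m / EV

visible = photon_energy_ev(550e-9)     # green light, ~550 nm
print(f"visible photon: ~{visible:.1f} eV")
print(f"1 TeV gamma ray is ~{1e12 / visible:.1e} times more energetic")
```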
The effort was funded by grants from the National Science Foundation, the National Aeronautics and Space Administration, the Department of Energy and a Vanderbilt University Discovery Award.
Storm worm blog virus on rise
SEOUL, March 2 (UPI) -- A new variant of the troublesome Storm worm computer virus attacks weblogs, bulletin boards and Web mail, British and Korean computer security experts say. The worm, which disguises itself as a news message about Europe's February storms, appears safe, but inserts a link to a malicious Web site if users post onto a blog journal or bulletin board, senior researcher Kang Eun-sung of AhnLab, South Korea's foremost antivirus software developer, told The Korea Times. The users' text, including through Web mail, will contain their own content along with the link and a note to lure readers to check out a Web site with "fun" videos or an e-card. Storm worm made up 50.3 percent of all malware tracked by Britain's Sophos, making it the number one threat seen by the security company. Malware is malicious computer software that interferes with normal computer functions or sends personal data about the user to unauthorized parties over the Internet.
Scientists Create RNA Computer
Science Daily — PRINCETON, N.J. -- Princeton University researchers have developed a kind of computer that uses the biological molecule RNA to solve complex problems. The achievement marks a significant advance in molecular computing, an emerging field in which scientists are harnessing molecules such as DNA and RNA to solve certain problems more efficiently than could be done by conventional computing.
In work to be published in the Proceedings of the National Academy of Sciences, the Princeton scientists used a test tube containing 1,024 different strands of RNA to solve a simple version of the "knight problem," a chess puzzle that is representative of a class of problems that requires brute-force computing. The knight problem asks how many knights one can place on a chessboard, and where, so that they cannot attack each other. For the purposes of their experiment, the researchers restricted the board to just nine squares, so there were 512 possible combinations. Of these, the RNA computer correctly identified 43 solutions. It also produced one incorrect response, highlighting the need to develop error-checking techniques in chemical computing. This test-tube computer does not have any immediate applications, and it will probably never completely replace silicon technology. But it does have attractive aspects, said assistant professor of ecology and evolutionary biology Laura Landweber, who led the research project in collaboration with professor of computer science Richard Lipton, postdoctoral fellow Dirk Faulhammer and a student, Anthony Cukras. "It begs the question, What is a computer?" said Landweber. "A computer can be an abacus, it can be many types of devices. This is really an abstraction of a computer." One advantage, said Landweber, is that the genetic molecules DNA and RNA, which encode all the instructions for creating and running life, can store much more data in a given space than conventional memory chips. Another benefit is that, with vast numbers of genetic fragments floating in a test tube, a biomolecular computer could perform thousands or millions of calculations at the same time. It is an extreme example of parallel computing, which is a rapidly growing area of computer technology. For example, in the knight problem, each strand of RNA represented a possible solution, but the researchers did not need to sort through each one individually; in a series of five steps, a specially targeted enzyme slashed away all the strands that did not match the requirements of a correct solution. Researchers believe that such techniques could be valuable for problems that need to be solved by trial and error, where it is cumbersome to test possible solutions one at a time. DNA computing has attracted considerable attention from researchers since 1994, when Leonard Adleman of the University of Southern California used DNA to solve a version of an archetypal problem called the traveling salesman problem. The idea is that words written in the letters of DNA, referred to as A, T, C and G, could represent the ones and zeroes used in computer logic. Computing is accomplished by eliminating molecules whose sequences appear to be poor solutions and retaining ones that seem more promising. The output of final molecules can be read like the holes punched in an old-fashioned computer tape. Landweber found that substituting RNA for DNA gave her more flexibility in developing a computing system. With DNA, there is a limited set of restriction enzymes - a kind of molecular scissors - so scientists may not be able to cut the molecule where they want. With RNA, Landweber's group could use just one universal enzyme that targets any part of the molecule. This aspect streamlines their approach and makes it inherently 'scalable' to larger problems.
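For scale, the same 3x3 search space the RNA strands encoded can be enumerated exhaustively in silicon. The sketch below checks every one of the 512 occupancy patterns against the knight-attack pairs of a 3x3 board; it is an illustration of the brute-force problem only, not a simulation of the RNA chemistry (the article reports that 43 correct solutions were read out of the tube).

```python
from itertools import product

# Brute-force version of the 3x3 "knight problem": enumerate all 2^9 = 512
# occupancy patterns and keep those where no two knights attack each other.
# The RNA experiment evaluated candidates chemically, in parallel, rather than
# one at a time like this loop.

# Squares are numbered 0..8, row by row. Pairs of squares a knight's move apart:
ATTACKS = [(0, 5), (0, 7), (1, 6), (1, 8), (2, 3), (2, 7), (3, 8), (5, 6)]

def is_valid(board):
    """board: tuple of 9 zeros/ones, 1 = knight on that square."""
    return all(not (board[a] and board[b]) for a, b in ATTACKS)

solutions = [b for b in product((0, 1), repeat=9) if is_valid(b)]
print(f"{len(solutions)} attack-free placements out of {2**9} candidate boards")
```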
Computer software is so called in contrast to computer hardware, which encompasses the physical interconnections and devices required to store and execute (or run) the software. In computers, software is loaded into RAM and executed in the central processing unit. At the lowest level, software consists of a machine language specific to an individual processor. A machine language consists of groups of binary values signifying processor instructions (object code), which change the state of the computer from its preceding state. Software is an ordered sequence of instructions for changing the state of the computer hardware in a particular sequence. It is usually written in high-level programming languages that are easier and more efficient for humans to use (closer to natural language) than machine language. High-level languages are compiled or interpreted into machine language object code. Software may also be written in an assembly language, essentially, a mnemonic representation of a machine language using a natural language alphabet. Assembly language must be assembled into object code via an assembler.
The term "software" was first used in this sense by John W. Tukey in 1957. In computer science and software engineering, computer software is all computer programs. The concept of reading different sequences of instructions into the memory of a device to control computations was invented by Charles Babbage as part of his difference engine. The theory that is the basis for most modern software was first proposed by Alan Turing in his 1935 essay Computable numbers with an application to the Entscheidungsproblem.
Types
Practical computer systems divide software into three major classes: system software, programming software and application software, although the distinction is arbitrary and often blurred. System software helps run the computer hardware and computer system. It includes operating systems, device drivers, diagnostic tools, servers, windowing systems, utilities and more. The purpose of systems software is to insulate the applications programmer as much as possible from the details of the particular computer complex being used, especially memory and other hardware features, and such accessory devices as communications, printers, readers, displays, keyboards, etc. Programming software usually provides tools to assist a programmer in writing computer programs and software using different programming languages in a more convenient way. The tools include text editors, compilers, interpreters, linkers, debuggers, and so on. An integrated development environment (IDE) merges those tools into a software bundle, and a programmer may not need to type multiple commands for compiling, interpreting, debugging and tracing, because the IDE usually has an advanced graphical user interface, or GUI. Application software allows end users to accomplish one or more specific (non-computer related) tasks. Typical applications include industrial automation, business software, educational software, medical software, databases, and computer games. Businesses are probably the biggest users of application software, but almost every field of human activity now uses some form of application software. It is used to automate all sorts of functions.
Program and library A program may not be sufficiently complete for execution by a computer. In particular, it may require additional software from a software library in order to be complete. Such a library may include software components used by stand-alone programs, but which cannot work on their own. Thus, programs may include standard routines that are common to many programs, extracted from these libraries. Libraries may also include 'stand-alone' programs which are activated by some computer event and/or perform some function (e.g., of computer 'housekeeping') but do not return data to their calling program. Programs may be called by one to many other programs; programs may call zero to many other programs.
Three layers
(Image caption: starting in the 1980s, application software has been sold in mass-produced packages through retailers.)
Users often see things differently than programmers. People who use modern general purpose computers (as opposed to embedded systems, analog computers, supercomputers, etc.) usually see three layers of software performing a variety of tasks: platform, application, and user software.
Platform software
Platform includes the basic input-output system (often described as firmware rather than software), device drivers, an operating system, and typically a graphical user interface which, in total, allow a user to interact with the computer and its peripherals (associated equipment). Platform software often comes bundled with the computer, and users may not realize that it exists or that they have a choice to use different platform software.
Application software
Application software, or applications, are what most people think of when they think of software. Typical examples include office suites and video games. Application software is often purchased separately from computer hardware. Sometimes applications are bundled with the computer, but that does not change the fact that they run as independent applications. Applications are almost always independent programs from the operating system, though they are often tailored for specific platforms. Most users think of compilers, databases, and other "system software" as applications.
User-written software
User software tailors systems to meet the users' specific needs. User software includes spreadsheet templates, word processor macros, scientific simulations, and graphics and animation scripts. Even email filters are a kind of user software. Users create this software themselves and often overlook how important it is. Depending on how competently the user-written software has been integrated into purchased application packages, many users may not be aware of the distinction between the purchased packages and what has been added by fellow co-workers.