Thursday, June 28, 2007

Howstuffworks "How Women Work"



How Women Work
by Tracy V. Wilson

Introduction to How Women Work
If you believe what you see on TV, women are inscrutable, conniving, hysterical and apt to change their minds without reason or warning. Some women's magazines perpetuate these stereotypes by offering advice on how to entrap men or keep them guessing. And some of the basic differences between men and women can seem a little confusing, depending on your point of view. So it's not surprising that one of the most requested articles in the history of HowStuffWorks is "How Women Work."

The irony is that from conception until the eighth week of gestation, men and women are almost exactly the same. The only difference is at the chromosomal level, deep inside the embryos' cells. Inside every cell of a person's body, DNA is tightly wound into pairs of structures called chromosomes. One pair of chromosomes determines whether the person is male or female. Except in the case of extremely rare abnormalities, a person with two X chromosomes is female, and a person with one X chromosome and one Y chromosome is male. For a few weeks, these chromosomes are all that differentiates male embryos from female embryos.

X and Y chromosomes
Image courtesy National Human Genome Research Institute
A karyotype, or chromosome "map," for a normal
human male, showing X and Y chromosomes

Of course, by the time an embryo has grown into an adult woman, many attributes make her different from a man. On average, women are shorter and smaller than men, though they have a higher percentage of body fat. Women have reproductive organs that can support a developing baby and nourish it after its birth. Their blood pressure is lower, and their heart beats faster, even when they're asleep [Source: FDA]. Women also have faster blood flow to their brains and lose less brain tissue as they age than men do [Source: Psychology Today].

And then, of course, there are hormones, which a lot of people view as a huge difference between men and women. But every person's body, whether it's male or female, uses hormones to regulate and control a wide range of processes. Hormones are the products of the endocrine system, which includes numerous glands located in various parts of the body. For example, two well-known hormones are adrenaline, which comes from the adrenal gland, and insulin, which comes from the pancreas. These and other hormones are vital to the lives and health of both men and women.

Sex hormones, on the other hand, work a little differently in men's and women's bodies. In men, the testes produce the hormone testosterone, which regulates sperm production and causes masculine secondary sex characteristics. In women, the ovaries produce hormones like estrogen and progesterone, which regulate reproductive processes. Men's bodies convert a little testosterone into estrogen, and women's bodies make small amounts of testosterone, so neither hormone is exclusive to one sex or the other.

estrogen molecule

A man's testosterone levels can fluctuate throughout the day as his body regulates its production of sperm. But a woman's sex hormone levels fluctuate as part of her reproductive cycle, which takes about a month to complete. During a woman's childbearing years, the recurring changes in her hormone levels can cause symptoms like irritability and moodiness, known as premenstrual syndrome (PMS). When a woman reaches perimenopause, her body slows down its production of sex hormones. During the process, her levels of estrogen and progesterone can vary significantly, causing symptoms like hot flashes and trouble sleeping.

Sex hormones can affect a woman's emotions and physiology throughout most of her life. But contrary to some people's perceptions, they're not responsible for every facet of her behavior. In this article, we'll look at some other common perceptions and stereotypes about women as we examine how they work.

Boy Babies and Girl Babies
Research suggests that testosterone in a woman's body may contribute to the sex of her children. It may encourage eggs to allow fertilization by sperm carrying a Y chromosome, resulting in the birth of a boy [Source: Psychology Today].

Women and Emotions
A 2001 Gallup poll asked American adults whether a series of qualities applied more to men or to women. Ninety percent of those surveyed said that the characteristic "emotional" applied more to women. The survey didn't ask about particular emotions or specify positive or negative connotations for the word "emotion." But it seems likely from the results that most Americans view women as either able to experience or prone to experiencing a wider, more intense range of emotions than men do.
Confusing on Purpose?
One common stereotype is that women give mixed signals, especially when it comes to romantic involvement. A study at the University of Texas at Austin suggests that the sexes misunderstand one another and that there are evolutionary forces involved. The confusion may come from early human men trying to have more descendants while early human women tried to protect themselves from deception [Source: Psychology Today].

Are women more emotional than men are? Do they cry more?
The perception that women cry more than men is pretty widespread. But as babies and children, boys and girls cry about the same amount on average. Only during puberty do girls begin to cry more than boys do. According to a 2005 New York Times article, by age 18, women cry four times as much as men.

A possible explanation for this is the hormone prolactin, which contributes to how much people cry. Prolactin is present in blood and tears, and it's more prevalent in women than in men. Women's tear ducts are also shaped a little differently from men's, which could be either a cause or an effect of increased crying [Source: New York Times]. In addition, people who are depressed may cry four times as much as people who are not, and two-thirds of people diagnosed with depression are women [Source: Psychology Today].

Of course, another common explanation is that some societies encourage women to cry while discouraging men from crying. In the United States, an exception to this standard seems to be the business world. In some businesses, crying is discouraged -- a woman who cries in the office may be viewed as weak or ineffectual [Source: New York Times].

Are women more stressed out than men are?
Women sometimes have a reputation for being worriers. According to a 2005 Gallup poll, women are more worried about a range of social issues than men are. Significantly more women than men answered that they worried "a great deal" about seven of the 12 issues in the survey.

Studies show that, in addition to worrying more often, women may be physiologically prone to experiencing more stress. For example, the amygdala of the brain processes emotions like fear and anxiety. In men, the amygdala communicates with organs that take in and process visual information, like the visual cortex. In women, though, it communicates with parts of the brain that regulate hormones and digestion. This may mean that stress responses are more likely to cause physical symptoms in women than in men [Source: Live Science].

Brain with amygdala highlighted

In addition, women's bodies produce more stress hormones than men's bodies do. Once a stressful event is over, women's bodies also take longer to stop producing the hormones. This may be a cause or an effect of women's tendency to replay stressful events in their minds and think about upsetting situations [Source: Psychology Today].

Are women more jealous than men are?
In some people's minds, women are more jealous and possessive than men are, especially in the context of romantic relationships. But research suggests that women aren't more jealous than men -- they're just jealous about different situations.

In one German study, researchers showed participants images of several scenarios. The participants used a computer to describe which of the scenarios would be more upsetting. The results suggest that, across cultures, women find emotional infidelity more upsetting than sexual infidelity. Men's responses varied across cultures, but in general they found sexual infidelity more upsetting [Source: Human Nature].

On the other hand, a study at the University of California at San Diego measured participants' blood pressure and heart rate rather than asking them to describe their responses. Men had greater physical reactions to physical infidelity, while women reacted with about the same intensity to both scenarios. Women who were in committed relationships were more upset by physical infidelity than those who were not. However, 80 percent of the women in the study thought emotional infidelity would be more upsetting to them than physical infidelity [Source: Psychology Today].

Next, we'll look at some common perceptions of how women learn and communicate.



Powered by ScribeFire.

Wednesday, June 27, 2007

Paris Lap Dances

Brain scans show meditation changes minds, increases attention




For hundreds of years, Tibetan monks and other religious people have used meditation to calm the mind and improve concentration. This week, a new study shows exactly how one common type of meditation affects the brain. Using a scanner that reveals which parts of the brain are active at any given moment, the researchers found that meditation increased activity in the brain regions used for paying attention and making decisions.

The changes were associated with the practice of concentration meditation, says study leader Richard Davidson, professor of psychology and psychiatry at the University of Wisconsin School of Medicine and Public Health and the Waisman Center. Practitioners were instructed to focus attention intently on a stimulus, and when the attention wandered off, to simply bring the attention back to the object, explains Davidson. "In one sense, concentration meditation is ridiculously simple, but in another, it's extraordinarily difficult," adds Davidson. "If you try it for two minutes, you will see that it's not so easy. Minds have a propensity to wander."

In collaboration with colleagues Julie Brefczynski-Lewis and Antoine Lutz of the UW-Madison W.M. Keck Laboratory for Functional Brain Imaging and Behavior, Davidson compared newly trained meditators to people with up to 54,000 hours of meditation experience. The study is being published this week in the online edition of the Proceedings of the National Academy of Sciences.

After the novices were taught to meditate, all subjects underwent a magnetic resonance imaging scan of the brain while they were meditating. Among all experienced meditators, the MRI scan found greater activity in brain circuits involved in paying attention. "We found that regions of the brain that are intimately involved in the control and regulation of attention, such as the prefrontal cortex, were more activated in the long-term practitioners," Davidson says.
A different picture emerged, however, from looking only at the most experienced meditators with at least 40,000 hours of experience. "There was a brief increase in activity as they started meditating, and then it came down to baseline, as if they were able to concentrate in an effortless way," says Davidson. Effortless concentration is described in classic meditation texts, adds Davidson. "And we think this may be a neural reflection of that. These results illustrate one mechanism by which meditation may act in the brain."

While the subjects meditated inside the MRI, the researchers periodically blasted them with disturbing noises. Among the experienced meditators, the noise had less effect on the brain areas involved in emotion and decision-making than among novice meditators. Among meditators with more than 40,000 hours of lifetime practice, these areas were hardly affected at all. "Most people, if they heard a baby screaming, would have some emotional response," Davidson says, but not the highly experienced meditators. "They do hear the sound, we can detect that in the auditory cortex, but they don't have the emotional reaction."

As Davidson notes, any comparison of average middle-aged Americans to people who have meditated daily for decades must try to associate the differences with meditation, and not lifestyle factors such as isolation or religious faith. "This was a highly unusual group of people. Two-thirds of the experienced meditators were Tibetan monks, recruited with the help of the Dalai Lama, and they all had an extremely long history of formal practice." For 15 years, Davidson has had a scientific relationship with the Dalai Lama, spiritual leader of Tibetan Buddhists, to investigate the effects of meditation. Still, the correlation between more meditation experience and greater brain changes does suggest that the changes were caused by meditation.
"If it were simply lifestyle, we would not expect a very strong correlation with hours of practice," Davidson says.

Other evidence for the neurological benefits of meditation came from a study Davidson reported in May, which showed that three months of meditation training improved the ability to detect a brief visual signal that most people cannot detect. "That was a more definitive kind of evidence, because we were able to track the same people over time," he says.

Psychologists have long considered an adult's capacity to pay attention as relatively fixed, but Davidson says: "Attention can be trained, and in a way that is not fundamentally different than how physical exercise changes the body." The attention circuits affected by meditation are also involved in attention deficit hyperactivity disorder, which Davidson describes as the most prevalent psychiatric diagnosis among children in our country. "Our findings suggest that it may, and I stress may, be possible to train attention in children with methods derived from these practices," he says.

Davidson says scientific studies of meditation are proving traditional beliefs about the mental benefits of meditation. Yet although meditation is often associated with monks living a life of simplicity, poverty, and prayer, "There is nothing fundamentally mysterious about these practices; they can be understood in hard-nosed western scientific terms." And, he adds, a growing body of "hard-nosed neuroscience research" is attracting attention to the profound effects of meditation. "This deserves serious scientific attention," he says. "It also explains why people spend time sitting on the meditation cushion, because of the effects on day-to-day life."

Davidson compares mental practice to physical exercise. "We all know that if an individual works out on a regular basis, that can change cardiovascular health," he says. "In the same way, these data suggest that certain basic mechanisms of the mind, like attention, can also be trained and improved through systematic practice."

Source: UW-Madison

This news is brought to you by PhysOrg.com

Tuesday, June 26, 2007

Sundance...online... ...cool

This is not boring

High notes







Published online: 24 June 2007 | doi:10.1038/news070618-18
Kerri Smith

The way that people talk about 'high' and 'low' notes makes it sound as though musical pitch has something to do with physical location. Now it seems there may be a reason for this: the same bit of our brain could control both our understanding of pitch and spatial orientation.

The result comes from a study of tone-deaf people — also known as 'amusics' — which shows that they have poorer spatial skills than those who have no problem distinguishing between two musical notes. Amusics are unable to tell whether a particular musical note is higher or lower than another. The condition has puzzled neuroscientists, because the way in which the brains of amusics process auditory information seems to be no different from normal.

Researchers from the University of Otago in New Zealand were keen to investigate. David Bilkey and his student Katie Douglas (who, as a member of the New Zealand Youth Choir, is particularly interested in how the brain processes music) had noticed that music is often described using spatial references, such as 'high' and 'low' notes — with higher notes literally sitting higher on a stave. The same is true in many different languages. So they decided to test the spatial skills of amusic people. "The question was whether the relationship was just a metaphor or something more than that," says Bilkey.

He and Douglas asked volunteers to mentally rotate an object, and click on a picture of how it would look when rotated. Amusic subjects made more than twice as many errors as either of the two control groups — one made up of musicians, the other a group with little musical training. The results are reported in Nature Neuroscience. "We were really surprised. The hypothesis that spatial processing was the underlying problem was a long shot," Bilkey says.
Most studies of amusia have focused on pitch processing as the fundamental deficit, says Tim Griffiths, a neurologist at Newcastle University in the UK.

In chorus

The researchers went on to see if their volunteers could perform both tasks — pitch discrimination and object rotation — at the same time. The control groups found this hard, and took much longer to mentally rotate objects when they also had to discriminate between two notes. This is presumably because the tasks interfered with each other. "One possibility is that pitch is encoded in parts of the brain that also encode spatial information," suggests Bilkey. This would increase the workload for these brain regions in normal people, slowing them down. But amusic subjects were much less affected by having to do these tasks simultaneously. Because they were pretty much unable to tell the musical notes apart, their brain was free to work on the spatial task. One brain region that might be doing the work is an area in the parietal lobe called the intraparietal sulcus (IPS), says Bilkey, which is known to be involved in processing music, spatial information and numbers.

Space training

Given the relationship between amusia and spatial skill, does this mean that improving one might boost the other? The researchers don't yet know. It has been previously shown that people with many years of musical training are better at spatial tasks, Bilkey says. But it's not clear how this relationship works, or what causes what. So it's unknown whether wannabe musicians would benefit from rotating shapes in their heads, or whether amusic people would benefit from spatial skills training. Griffiths has met many amusics, and is sceptical. "I'm not sure if auditory training would help people, let alone spatial training," he says.





Beer music







What's the backscatter of your beer?

An acoustic technology developed at Pacific Northwest National Laboratory eliminates the need for laborious and costly sampling of slurries in large containers. Fermentation-based industries, such as beer and pharmaceuticals, could benefit from the technology’s non-invasive, continuous and objective “listening” technique in tracking microbial growth through the different process phases.

A team of researchers at PNNL can track the size and concentration of particles within opaque slurries by attaching an acoustic-based technology to the outside of a large tank or vat, much like those used to make beer and medicinal drugs.

The lab's patented technique is novel in its fusion of information extracted from both acoustic backscatter and transit measurements, including velocity, amplitude and frequency data.

“The beauty of acoustics is that it can tell you what’s going on within a mixture without having to disrupt the process by physically drawing a sample and analyzing it,” said Dick Pappas, senior research scientist. “And because we can measure how fast sound travels across a vat, for instance, and the change in the signal’s frequency and strength, we can also tell when a mixture has changed from what it should be, possibly heading off a negative situation. Similarly, we can tell when a mixture is brewed to perfection.”

Conceptually, this acoustic technology is relatively simple. It consists of either a single transducer or paired transducers – devices that resemble ear phones and that transform electric signals into sound energy – placed on opposite sides of a container. Both the backscattered acoustic signals and the acoustic signals that transit the vessel contain useful information about the slurry. The signals from the transducers are digitized and analyzed so that an operator can immediately detect changes in the fermentation process. The technology can be automated, runs continuously unattended and can be configured to trigger process controls such as valves and switches.
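The monitoring described above amounts to comparing each new acoustic reading against a known-good baseline and triggering a control action when the signal drifts. Here is a minimal sketch of that idea in Python; this is not PNNL's implementation, and the field names, baseline values, and 5 percent tolerance are all illustrative assumptions:

```python
# Illustrative sketch of drift detection on acoustic slurry measurements.
# Transit time tracks the speed of sound across the vessel; backscatter
# amplitude tracks echo strength from suspended particles.
from dataclasses import dataclass


@dataclass
class AcousticReading:
    transit_time_us: float        # time for the pulse to cross the vessel, microseconds
    backscatter_amplitude: float  # relative echo strength from particles


def deviates(reading: AcousticReading, baseline: AcousticReading,
             tolerance: float = 0.05) -> bool:
    """Return True if either measurement drifts more than `tolerance`
    (as a fraction) from the baseline recorded for a known-good batch."""
    dt = abs(reading.transit_time_us - baseline.transit_time_us) / baseline.transit_time_us
    da = abs(reading.backscatter_amplitude - baseline.backscatter_amplitude) / baseline.backscatter_amplitude
    return dt > tolerance or da > tolerance


# Hypothetical numbers for a known-good batch and the current scan.
baseline = AcousticReading(transit_time_us=680.0, backscatter_amplitude=1.00)
current = AcousticReading(transit_time_us=655.0, backscatter_amplitude=1.12)

if deviates(current, baseline):
    print("mixture has changed -- trigger valve/alarm")
```

In a real deployment the baseline would be learned from known-good batches, and the digitized transducer signals would yield far richer frequency and velocity data than this two-number comparison.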

The ultrasound technology is also useful for measuring cell or organism growth and population in fermentations. A typical method for characterizing fermentation slurries involves diluting and visually counting a sample at periodic intervals. But with acoustics, researchers can quickly and continuously analyze the size and population of organisms throughout the fermentation process, often helping to identify specific fermentation phases.

In addition to biological processes, this backscatter-transit acoustics methodology has been used in lab testing to characterize industrial processes and products such as paints, micro-milling, asphalt-based commercial products and the sterilization of packaged liquid food.

Source: Pacific Northwest National Laboratory











Saturday, June 23, 2007

Scientific American: Parkinson's Gene Therapy Breakthrough May Enter Clinical Trials by Year-End






ScientificAmerican.com

June 22, 2007

Parkinson's Gene Therapy Breakthrough May Enter Clinical Trials by Year-End

Promising results delivered in the first human clinical trial testing the procedure against the neurodegenerative disorder

An innocuous gene-bearing virus injected into the midbrains of a dozen patients suffering from Parkinson's disease improved the subjects' motor function while causing no adverse effects, says a new study.

This is the first time gene therapy has been tested to fight Parkinson's, which affects an estimated 500,000 Americans. The promising findings, published this week in The Lancet, open the door to a new treatment option for the neurodegenerative disease.

"The safety and effectiveness clearly indicate that this is something worth pursuing," says lead study author Michael Kaplitt, a neurological surgeon and director of movement disorders at New York-Presbyterian Hospital/Weill Cornell Medical Center. "We're not finished, clearly; we still need to do a larger, more definitive study to prove this for sure."

Parkinson's disease, a disorder that typically strikes people in their 60s, is characterized by tremors, stiffness, loss of speech and difficulty with motor function. Neuroscientists have tracked its biological cause to the loss of neurons, or nerve cells, in a midbrain region called the substantia nigra, which produces the neurotransmitter dopamine, a chemical that helps maintain proper movement control. When dopamine levels are low, the subthalamic nucleus, a sliver of neurons just above the substantia nigra, overproduces glutamate, the brain's primary excitatory chemical messenger. This excess glutamate overstimulates downstream neurons, triggering a strong inhibitory response that results in disrupted movement.

Kaplitt and senior study author Matthew During, a senior research associate at Weill Cornell, focused on trying to calm down the overactive subthalamic nucleus. They used a harmless virus called an adeno-associated virus to transport a gene that codes for the enzyme glutamic acid decarboxylase (GAD) into the neurons of the subthalamic nucleus. The gene prompted these subthalamic cells to produce gamma-aminobutyric acid (GABA), the brain's primary inhibitory neurotransmitter, which made them settle down and restored normal motor function.

Because of federal regulations, the team could only inject the virus into one hemisphere of each person's brain. "This allowed us to compare the two sides of the brain," Kaplitt says, which enabled researchers to judge the effectiveness of their treatment.

Researchers monitored the 12 subjects over the next year and found that motor function improved by 25 to 65 percent. They also found that the treated sides of the brain showed normalized activity in key regions downstream from the subthalamic nucleus: the thalamus, also implicated in motor function, and parts of the cerebral cortex involved in movement. Most encouraging to the scientists was that the improvement persisted even when the patients were on their Parkinson's medications, meaning that, as Kaplitt describes, "the therapy was causing additional improvement to the medicines."

Kaplitt and During had a number of safety concerns at the study's inception: The viral packages could damage target cells by provoking the immune system; there was a chance of overinhibition of the neurons in the subthalamic nucleus; and there were worries about unknown side effects that the researchers had not anticipated. But Kaplitt reports their fears were unfounded—there were no incidents of infection, immune response or toxicity.

In an editorial accompanying The Lancet article, Jon Stoessel, a professor of neurology at Pacific Parkinson's Research Center in Vancouver, calls the new work a "provocative approach to the treatment of neurodegenerative disease." He questions, however, whether this is a better treatment option than deep-brain stimulation, the most effective current therapy, which involves implanting a brain "pacemaker" to electrically stimulate either the thalamus or subthalamic nucleus.

Kaplitt, who hopes to have a full-scale clinical trial for the gene therapy approach underway by year-end, says there are several "inherent advantages" to this new option. Most notably, there are no wires or batteries in the body that could cause infection, it avoids the risks of putting a new electrical source in the brain, and the pacemaker has to be tuned frequently, whereas the gene, according to animal studies, should remain effective for several years.
"If I had these two therapies side by side," Kaplitt says, "I and most of my patients would choose the gene therapy approach."





Saturday, June 16, 2007

Monday, June 11, 2007

Weird Converter from Make Magazine



Go: WEIGHT
How many NASCAR Winston Cup Tires in an African Elephant?
How many kegs of beer in an Airbus A380?

Go: LENGTH/HEIGHT
How many Shaquille O'Neals in the Great Wall of China?
How many giraffe necks in the Wienermobile?


http://www.weirdconverter.com/index.php
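Under the hood, each of these conversions is just a ratio of two weights (or lengths). A toy sketch of the idea in Python; the weights below are rough illustrative figures chosen for the example, not the site's actual data:

```python
# Toy "weird converter": how many of object A fit, by weight, into object B?
# All weights are rough, illustrative figures.
WEIGHTS_KG = {
    "African elephant": 6000.0,       # large adult, approximate
    "NASCAR racing tire": 11.0,       # approximate
    "keg of beer (full)": 72.0,       # approximate full half-barrel
    "Airbus A380 (empty)": 277000.0,  # approximate empty operating weight
}


def how_many(small: str, big: str) -> float:
    """How many `small` objects weigh as much as one `big` object."""
    return WEIGHTS_KG[big] / WEIGHTS_KG[small]


print(f"Tires per elephant: {how_many('NASCAR racing tire', 'African elephant'):.0f}")
print(f"Kegs per A380: {how_many('keg of beer (full)', 'Airbus A380 (empty)'):.0f}")
```

The length/height conversions work the same way, with a table of lengths instead of weights.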


Posted to Make by Phillip Torrone | Jun 11, 2007 03:00 AM

Thursday, June 7, 2007

A Look at the Internets... tubes... er...



Akamai Gives Free Peek at Internet

(AP) -- What does the Internet look like? A free new Web service from Akamai Technologies Inc. offers a peek, providing a sort of Internet weather report on global traffic tie-ups, cyberattacks and spikes in activity.

Akamai, which says it delivers 15 to 20 percent of Internet traffic on any given day, hopes its new Web site helps not only the techies it counts as clients, but also the general public.

If your Internet connection is slower than usual, Akamai's tool can show whether traffic is clogged overall in your city. (If not, your Internet service provider might be to blame.) Or you might just want a way to visualize the global ebb and flow of the Internet.

"We originally built this feature as a tool for our customers, but once it was built it seemed like a fun thing to put out there to the public," said Tom Leighton, Akamai's chief scientist.

The service - check out http://tinyurl.com/yooz96 - reveals some of the data that engineers at Akamai's Cambridge headquarters rely on to monitor and troubleshoot global server networks and ensure information flows over the most efficient paths.

The service features a real-time monitor measuring Internet traffic globally and by region. The tool shows the 10 cities with the slowest Web connections at a given moment, and ranks the regions facing the most network attacks. Other sections measure traffic on digital music, retail and news Web sites.

Still crave more insight into the world's data streams? Check out some other sites like http://www.internettrafficreport.com and http://www.internethealthreport.com .



Now it's official



According to the latest research, as far as women are concerned, the size of a man's penis is immaterial.

Women are far more interested in a man's personality and looks than the size of his penis, but men nevertheless often become anxious about whether their penis is big enough.

Penis anxiety apparently abounds in Britain, where it was found that while men often have a better body image, a better genital image and more sexual confidence if they have a large penis, women don't necessarily feel that bigger is better.

Dr. Kevan Wylie, of the Royal Hallamshire Hospital, in Sheffield, and Ian Eardley, of St. James' Hospital in Leeds, examined more than 50 international research projects on penile size and small penis syndrome completed since 1942.

The researchers found from 12 studies that measured the penises of 11,531 men, that on average, erect penises ranged from 5.5 to 6.2 inches in length and 4.7 to 5.1 inches in girth.

A survey then conducted of over 50,000 heterosexual men and women found that 66 percent of men said their penis was average-sized, 22 percent said large and 12 percent said small. And while 85 percent of women were satisfied with their partner's penile size, only 55 percent of men were satisfied with their own.



The urologists say having a micropenis is a totally different condition from having penis anxiety. They urge doctors not to laugh at these very real worries over an imaginary defect, and say the review normalizes the situation and provides accurate information.

The researchers warn against plastic surgery techniques which promise to make a man's flaccid or erect penis larger as they are unproven and serious complications may ensue.

Wylie and Eardley say in such cases if education and counseling do not resolve the anxiety issue, psychotherapy is advisable for men whose obsession over penis size is interfering with their lives.

The review is published in the current issue of the urology journal BJU International.

Wednesday, June 6, 2007

Thanks

Monday, June 4, 2007

Old memory traces in brain may trigger chronic pain

Why do so many people continue to suffer from life-altering, chronic pain long after their injuries have actually healed?

The definitive answer -- and an effective treatment -- have long eluded scientists. Traditional analgesic drugs, such as aspirin and morphine derivatives, haven't worked very well.

A Northwestern University researcher has found that a key source of chronic pain appears to be an old memory trace that essentially gets stuck in the prefrontal cortex, the site of emotion and learning. The brain seems to remember the injury as if it were fresh and can't forget it.

With new understanding of the pain source, Vania Apkarian, professor of physiology, and of anesthesiology, at Northwestern’s Feinberg School of Medicine, has identified a drug that controls persistent nerve pain by targeting the part of the brain that experiences the emotional suffering of pain. The drug is D-Cycloserine, which has been used to treat phobic behavior over the past decade.

In animal studies, D-Cycloserine appeared to significantly diminish the emotional suffering from pain as well as reduce the sensitivity of the formerly injured site. It also controlled nerve pain resulting from chemotherapy, noted Apkarian, who is a member of the Robert H. Lurie Comprehensive Cancer Center at Northwestern University.

The drug has long-term benefits: animals appeared to be pain-free 30 days after the last dose of a 30-day regimen of D-Cycloserine.

The study, funded by the National Institutes of Health, will be published in the journal Pain this fall. (It has been published on-line.)

“In some ways, you can think of chronic pain as the inability to turn off the memory of the pain,” Apkarian said. “What’s exciting is that we now may be relieving what has clinically been the most difficult to treat—the suffering or the emotional component of pain.”

Scientists have always tried to understand pain from the viewpoint of sensation, Apkarian said. "To control it, they tried to stop the sensory input to the brain. We are saying there's a cognitive memory and emotional component in the brain that seems abnormal. Easing that may have a bigger effect on suffering."

Chronic pain is not caused by a single mechanism, Apkarian noted. Sensory abnormalities in people with chronic pain probably drive this memory abnormality.

About 10 percent of the United States population suffers from chronic pain, the majority of it back pain.


One of Apkarian’s studies with rats tried to separately measure their emotional suffering and their physical pain after treatment with the drug. (The rats had chronic pain from a healed limb injury.) The results indicated the animals’ emotional suffering decreased much more than their physical pain: while the physical pain appeared to be reduced by 30 percent, their emotional suffering disappeared completely.

Rats are nocturnal animals that prefer to be in the dark and are averse to bright light. Researchers placed the rats in a two-compartment chamber -- one side light, one dark. When the rats were in their preferred dark side, scientists mechanically stimulated their sensitive limbs. The rats didn’t like that and bolted into the bright chamber, where they remained. Next, scientists treated the same rats with D-Cycloserine and again stimulated their sensitive limbs. This time, however, the rats remained in the dark chamber.

“Their aversive reaction to the stimulation disappeared,” Apkarian said.

Based on the animal results, the next step will be to test the drug in clinical trials, Apkarian said.

“When we do this in a clinical trial, we expect people to say, ‘I still have the pain, but it’s not bothering me anymore,’” Apkarian said. “We think they will have a physical awareness of the pain, but its emotional consequences will have decreased.” He said the drug potentially may lower the amount of standard analgesics people have to use.

In Apkarian’s previous study, published in late 2006, he revealed that chronic back pain appears in a different part of the brain than the discomfort of burning your finger, for example. With a functional MRI, he found that chronic back pain shows up in the prefrontal cortex. By contrast, the acute sensory pain of the burned finger appears in the sensory part of the thalamus.

Apkarian also found that the longer a person has been suffering from chronic pain, the more activity there is in the prefrontal cortex. From the MRI alone, he was able to predict how many years a patient had been suffering.

“It’s cumulative memory,” he explained. “I can predict with 90 percent accuracy how many years they have been living in that pain without even asking them the question.”

Source: Northwestern University




This news is brought to you by PhysOrg.com


Aquacat


Vallejo, California, May 30, 2007—Cats swimming may seem about as likely as pigs flying, but Odin the tiger is proving the exception to the rule at a U.S. theme park.

Raised by humans at Six Flags Discovery Kingdom outside San Francisco, the white Bengal tiger has been diving for dinner—happily, according to his trainer—since he was a cub.

Tricks of light revisited

NIST atom interferometry displays new quantum tricks

Atoms interfering with themselves. After ultracold atoms are maneuvered into superpositions -- each one located in two places simultaneously -- they are released to allow interference of each atom's two "selves." They are then illuminated with light, which casts a shadow, revealing a characteristic interference pattern, with red representing higher atom density. The variations in density are caused by the alternating constructive and destructive interference between the two "parts" of each atom, magnified by thousands of atoms acting in unison.

Physicists at the National Institute of Standards and Technology (NIST) have demonstrated a novel way of making atoms interfere with each other, recreating a famous experiment originally done with light while also making the atoms do things that light just won't do. Their experiments showcase some of the extraordinary behavior taken for granted in the quantum world—atoms acting like waves and appearing in two places at once, for starters—and demonstrate a new technique that could be useful in quantum computing with neutral atoms and further studies of atomic hijinks.

The NIST experiments, described in Physical Review Letters[1], recreate the historic "double-slit" experiment in which light is directed through two separate openings and the two resulting beams interfere with each other, creating a striped pattern. That experiment is a classic demonstration of light behaving like a wave, and the general technique, called interferometry, is used as a measurement tool in many fields. The NIST team used atoms, which, like light, can behave like particles or waves, and made the wave patterns interfere, or, in one curious situation, not.
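The striped pattern in the optical version of the experiment follows a simple two-beam interference formula. As a rough illustration (this is the idealized textbook formula for light, not NIST's atom-interference analysis), the relative intensity on the screen can be computed like this:

```python
import math

def double_slit_intensity(x, slit_separation, wavelength, screen_distance):
    """Relative intensity of an ideal two-slit interference pattern at
    position x on the screen (small-angle approximation). Bright fringes
    occur where the path difference d*x/L equals a whole number of
    wavelengths; dark fringes fall exactly halfway between them."""
    phase = math.pi * slit_separation * x / (wavelength * screen_distance)
    return math.cos(phase) ** 2  # 1.0 at a bright fringe, 0.0 at a dark one

# Example: 500 nm light, slits 0.1 mm apart, screen 1 m away.
# Fringe spacing = wavelength * L / d = 5 mm.
d, lam, L = 1e-4, 500e-9, 1.0
print(double_slit_intensity(0.0, d, lam, L))     # central bright fringe -> 1.0
print(double_slit_intensity(0.0025, d, lam, L))  # halfway to the next fringe -> ~0.0
```

The same cos² fringe structure is what shows up in the atom-density images, with atom waves playing the role of the two light beams.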

Atom interferometers have been made before, but the NIST technique introduces some new twists. The researchers trap about 20,000 ultracold rubidium atoms with optical lattices, a lacework of light formed by three pairs of infrared laser beams that sets up an array of energy "wells," shaped like an egg carton, that trap the atoms. The lasers are arranged to create two horizontal lattices overlapping like two mesh screens, one twice as fine as the other in one dimension. If one atom is placed in each site of the wider lattice, and those lasers are turned off while the finer lattice is activated, then each site is split into two wells, about 400 nanometers apart. Under the rules of the quantum world, the atom doesn't choose between the two sites but rather assumes a "superposition," located in both places simultaneously. Images reveal a characteristic pattern as the two parts of the single superpositioned atom interfere with each other. (The effect is strong enough to image because this is happening to thousands of atoms simultaneously—see image.)

Everything changes when two atoms are placed in each site of the wider lattice, and those sites are split in two. The original atom pair is now in a superposition of three possible arrangements: both atoms on one site, both on the other, and one on each. In the two cases when both atoms are on a single site, they interact with each other, altering the interference pattern—an effect that does not occur with light. The imbalance among the three arrangements creates a strobe-like effect. Depending on how long the atoms are held in the lattice before being released to interfere, the interference pattern flickers on (with stripes) and off (no stripes). A similar "collapse and revival" of an interference pattern was seen in similar experiments done earlier in Germany, but that work did not confine a pair of atoms to a single pair of sites. The NIST experiments allowed researchers to measure the degree to which they had exactly one or exactly two atoms in a single site, and to controllably make exactly two atoms interact. These are important capabilities for making a quantum computer that stores information in individual neutral atoms.

[1] J. Sebby-Strabley, B.L. Brown, M. Anderlini, P.J. Lee, W.D. Phillips, J.V. Porto and P.R. Johnson. 2007. Preparing and probing atomic number states with an atom interferometer. Physical Review Letters 98, 200405 (2007).

Source: National Institute of Standards and Technology (NIST)


Supersize me

May 24, 2007 - ST. LOUIS - It's research that may have you thinking twice before upgrading to the large size at your favorite fast food joint. Saint Louis University research presented this week in Washington, D.C., shows the dangers of high-fat food combined with high fructose corn syrup and a sedentary lifestyle - in other words, what may be becoming commonplace among Americans.

Brent Tetri, M.D., associate professor of internal medicine at Saint Louis University Liver Center, and colleagues studied the effects of a diet that was 40 percent fat and replete with high fructose corn syrup, a sweetener common in soda and some fruit juices. The research is being presented at the Digestive Diseases Week meeting.

"We wanted to mirror the kind of diet many Americans subsist on, so the high fat content is about the same you'd find in a typical McDonald's meal, and the high fructose corn syrup translates to about eight cans of soda a day in a human diet, which is not far off with what some people consume," says Tetri, a leading researcher in nonalcoholic fatty liver disease, which can lead to cirrhosis and, ultimately, death. "But we were also keeping the mice sedentary, with a very limited amount of activity."

The study, which lasted for 16 weeks, had some curious results, says Tetri.

"We had a feeling we'd see evidence of fatty liver disease by the end of the study," he says. "But we were surprised to find how severe the damage was and how quickly it occurred. It took only four weeks for liver enzymes to increase and for glucose intolerance - the beginning of type II diabetes - to begin."

And unlike other studies, the mice were not forced to eat; rather, they were able to eat whenever they wanted - and eat they did. Tetri says there's evidence suggesting fructose actually suppresses the feeling of fullness, unlike fiber-rich foods, which make you feel full quickly.

The take-home message for humans is obvious, he says.

"A high-fat and sugar-sweetened diet compounded by a sedentary lifestyle will have severe repercussions for your liver and other vital organs," he says. "Fatty liver disease now affects about one of every eight children in this country. The good news is that it is somewhat reversible - but for some it will take major changes in diet and lifestyle."

Saint Louis University

genius pill

June 04, 2007 - A drug described by some people as a "genius pill" for enhancing cognitive function provided relief to a small group of Rochester breast cancer survivors who were coping with a side effect known as "chemo-brain," according to a University of Rochester Medical Center study.

Sixty-eight women who had completed treatment for breast cancer participated in an eight-week clinical trial testing the effects of modafinil (Provigil). All of the women took the drug for the first four weeks. During the next month, half continued to receive the drug while the other half took an identical-looking placebo pill. The women who took modafinil for all eight weeks reported major improvements in memory, concentration and learning.

"I am very enthusiastic about the potential we've demonstrated," said Sadhna Kohli, Ph.D., M.P.H., lead author of the study and a research assistant professor at the University of Rochester's James P. Wilmot Cancer Center. "This is a novel drug and after completing the trial, many of the women wanted to know how they could continue to get modafinil."

Kohli presents the research -- which is believed to be the first to examine the drug's use in breast cancer patients -- on June 3 at the American Society of Clinical Oncology meeting in Chicago. ASCO is honoring Kohli with a Merit Award.

Originally licensed to treat narcolepsy, modafinil promotes wakefulness and seems to boost brainpower without causing the jittery, restless feelings induced by amphetamines. Modafinil is part of a class of drugs called eugeroics, which stimulate the brain only when it is required. The effects of modafinil disappear in about 12 hours. For this reason, sleep-deprived college students, athletes, soldiers or others who want to gain an edge in a competitive environment sometimes seek out the drug, calling it a "genius pill."

The application for cancer care is unique and entirely appropriate, Kohli said. Although some in the scientific community doubt the existence of "chemo-brain," many cancer patients insist they are suffering from an impairment of brain function after chemotherapy and desire some form of relief. In a separate study last year, Kohli found that 82 percent of 595 cancer patients reported problems with memory and concentration. Importantly, the deficits can lead to job loss or social dysfunction.

Scientists have not yet discovered the precise cause of "chemo-brain." A separate research group at the University of Rochester is investigating the toxicity of cancer-killing drugs on healthy brain cells. In 2006 they showed that chemotherapy disrupts cell division in the hippocampus, the brain region essential for learning and memory.

Until now, cancer patients had few options. Some studies have tested Ritalin, which stimulates the central nervous system. But Ritalin has unwanted side effects like headaches, irritability and addiction, Kohli said. Since modafinil does not linger in the body, side effects are minimal, according to recent studies.

Initially, researchers looked at whether modafinil might help alleviate fatigue, another persistent side effect, given the drug's use in narcoleptic patients. Once researchers saw the drug's positive effect on fatigue, they conducted secondary analyses to assess whether modafinil could improve the breast cancer patients' memory and attention skills.

Results showed that after four weeks on the drug, the women had faster memory recall and could more accurately recognize words and pictures. The ability to focus attention, however, did not change at first. But after taking modafinil for another four weeks, attention deficits improved and memory gains were even greater. Larger studies are still needed to confirm the results.

University of Rochester Medical Center

Plug me in

Scientists develop molecular implantable biocomputers
KurzweilAI.net, May 22, 2007

Researchers at Harvard and Princeton universities have taken a crucial step toward building biological computers: tiny implantable devices that can monitor the activities and characteristics of human cells. The information provided by these "molecular doctors," constructed entirely of DNA, RNA, and proteins, could eventually revolutionize medicine by directing therapies only to diseased cells or tissues.

Evaluating Boolean logic equations inside cells, these molecular automata will detect anything from the presence of a mutated gene to the activity of genes within the cell. The biocomputers' "input" is RNA; the "output" molecules, indicating the presence of the telltale signals, are easily discernible with basic laboratory equipment.

"Currently, we have no tools for reading cellular signals," says Harvard's Yaakov 'Kobi' Benenson, a Bauer Fellow in the Faculty of Arts and Sciences' Center for Systems Biology. "These biocomputers can translate complex cellular signatures, such as activities of multiple genes, into a readily observed output. They can even be programmed to automatically translate that output into a concrete action, meaning they could either be used to label a cell for a clinician to treat or they could trigger therapeutic action themselves."

Molecular automata could allow doctors to specifically target only cancerous or diseased cells via a sophisticated integration of intracellular disease signals, leaving healthy cells completely unaffected.

MRI detects cancers missed by mammography in breast cancer patients

June 04, 2007 - CHICAGO -- A unique examination of one treatment center's use of magnetic resonance imaging (MRI) in new breast cancer patients has found MRI to be superior to mammography in finding additional tumors in a breast in which cancer has already been diagnosed, and in detecting new tumors in a patient's supposedly healthy breast.

Two studies conducted by oncologists from Mayo Clinic (http://cancercenter.mayo.edu/) in Jacksonville, Fla., and presented June 2, at the annual meeting of the American Society of Clinical Oncology (http://www.asco.org/portal/site/ASCO) (ASCO), also revealed distinct patterns between MRI-detected ipsilateral tumors (cancer in the affected breast) and contralateral tumors (cancer in the unaffected breast) that could guide use of the screening technology, researchers say.

"These results prove that MRI can detect tumors missed by traditional exams, and can be vital in helping women choose the right course of treatment for their breast cancer," says Mayo radiation oncologist Laura Vallow, M.D. She and other radiation oncologists at the Multidisciplinary Breast Clinic (http://www.mayoclinic.org/breastcancerprogram-jax/) in Jacksonville scan both breasts of all newly diagnosed breast cancer patients with mammograms and MRI and so have been able to compare results from both screening techniques in hundreds of women.

In their study comparing MRI and mammography in detection of ipsilateral breast cancer -- the first such study of its kind in the nation -- the researchers found that MRI detected tumors missed by mammography in 16 percent of 390 patients. Women with MRI-detected ipsilateral tumors tended to be younger or to have a primary breast tumor at least 1 centimeter in size. The primary tumors also belonged to many different breast cancer subtypes, and newly diagnosed tumors differed from the primary tumor in 29 percent of patients.

"This is an important finding because breast cancer tends to be more aggressive when diagnosed in younger women," says Dr. Vallow, the study's lead investigator. "This suggests that younger women who want breast-conserving surgery to treat newly diagnosed cancer may benefit from an MRI scan of the entire breast."

The second study, which compared MRI versus mammography in detection of contralateral breast cancer, found that MRI detected tumors missed by mammograms in 3.2 percent of 401 women. Some women participated in both studies, and in a few cases, MRI found tumors in both breasts that were not found by mammography.

Patients with MRI-detected tumors were more likely to be postmenopausal, and to have a common type of tumor, one classified as estrogen receptor (ER)-positive, says the contralateral breast study's lead investigator, oncology resident Johnny Bernard Jr., M.D., who works with Dr. Vallow. This result mirrors findings of a 969-patient study reported in April 2007 in the New England Journal of Medicine (http://content.nejm.org/) (NEJM) which concluded that MRI detected contralateral breast tumors in 3.1 percent of patients that had been missed by mammography. Data on 65 Mayo patients were part of the NEJM study.

Detecting contralateral breast cancer before a newly diagnosed patient is treated is important, says Dr. Bernard. "Patients can have treatment for both breast cancers simultaneously rather than the physical and psychological trauma of having a contralateral cancer detected years later," he says. "Furthermore, treatment recommendations by the physician may be affected by the finding of a contralateral cancer, as well as patient preferences for one treatment versus another, such as for mastectomy versus breast-conserving surgery.

"Patients with a diagnosis of breast cancer are considered high risk and therefore we contend that these patients should have a screening MRI to further evaluate the extent of ipsilateral disease and also to evaluate the contralateral breast," says Dr. Bernard.

"The combined results of these studies validate what we have already seen in our clinic, and that is that MRI breast cancer screening is quickly becoming an indispensable tool," says Dr. Vallow. "Studies like these are needed to pave the way toward greater use of MRI breast cancer screening in newly diagnosed patients."

However, Dr. Vallow acknowledges that MRI scans need to be "faster, cheaper, more accessible, and uniform across institutions" before they can be widely used, and cautions that cancer centers that use MRI also need to be able to perform MRI-guided biopsy of the suspected tumor, if necessary. "There is huge variation in how clinics conduct breast MRI screenings, because there is no quality control at the moment," she says. "Patients at centers that don't offer MRI-guided biopsy can end up having mastectomies because physicians there have no way to investigate the findings seen on the scan."

New guidelines, issued in April by the American Cancer Society (http://www.cancer.org/docroot/home/index.asp), recommend that women who are at high risk of developing breast cancer receive both annual MRI and mammography scans starting at age 30.

Big Pig


Near Delta, Alabama, May 3, 2007—Hogzilla may be headed for horror-movie heaven, but the massive swine that became an Internet sensation in 2004 may have been bested, size wise, by this reportedly wild pig killed May 3 by Jamison Stone, 11, and reported by the Associated Press on Wednesday.

From tip to tail, the newfound hog—dubbed "Monster Pig"—measures 9 feet, 4 inches (284 centimeters) and weighs in at 1,051 pounds (477 kilograms), according to Stone's father.

anti-noise





Open-plan offices are social, collaborative environments. They can also be noisy, filled with a cacophony of workers' chatter. But a new piece of software might help turn down the volume.

Cambridge Sound Management of Massachusetts has developed the Open Office Privacy Calculator, which lets architects plan an office's acoustics. The software calculates how the materials used to construct an office, as well as structural factors such as desk partitions and ceiling height, affect how sound travels.

Each company can then set a desired volume for its offices - choosing to embrace loud-mouthed colleagues or muffle them, for instance - and the software suggests which design tweaks and noise cancellation systems best achieve it.
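The core of such a calculation is predicting how much a talker's voice decays before it reaches a listener. As a toy sketch (a hypothetical model for illustration, not Cambridge Sound Management's actual algorithm), a talker modeled as a point source loses 6 dB per doubling of distance, plus whatever extra attenuation absorptive materials add along the path:

```python
import math

def sound_level_at_distance(source_level_db, distance_m, absorption_db_per_m=0.0):
    """Estimate the sound pressure level heard at a listener, modeling the
    talker as a point source (inverse-square law: -6 dB per doubling of
    distance) plus linear attenuation from absorptive materials in the path.
    source_level_db is the level measured at 1 m from the talker."""
    spreading_loss = 20 * math.log10(distance_m / 1.0)
    return source_level_db - spreading_loss - absorption_db_per_m * distance_m

# A 60 dB talker heard 4 m away across absorptive partitions (~1 dB/m):
print(round(sound_level_at_distance(60, 4, absorption_db_per_m=1.0), 1))  # 44.0 dB
```

Comparing that predicted level against background noise is what tells the designer whether speech will remain intelligible (and distracting) at the listener's desk.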

NewScientistTech






hint, hint, hint

People gauge how responsive their partners are primarily by how they themselves respond to their partners—not the other way around, according to a series of Yale studies in the Journal of Personality and Social Psychology.

“We have examined this in different ways,” said Margaret Clark, faculty author and psychology professor. “In studies of marriage we’ve found that what people report they do for their partners is a better predictor of what they think their spouse does for them than are the spouse’s own reports of what was done.”

“Most surprisingly,” she said, “when Edward Lemay, a senior Yale graduate student, brought people into the lab and asked leading questions to make them feel supportive or non-supportive of their partner, the first group reported that their partner is more supportive toward them than did the second group.”

Responsiveness in this instance means anything a person does that promotes the partner’s welfare, such as helping with tasks, providing comfort and information, encouraging a person to strive toward goals, including a partner in desirable joint activities, and providing symbolic support, such as words of affection, hugs, and sending greeting cards.

Clark and co-authors Lemay and Brooke Feeney, a Carnegie Mellon University professor, report findings from three studies, all of which suggest that only a small fraction of how people gauge their partners’ responsiveness to their needs is based on what the partners do. Most of it is based on what they themselves do and feel.

“We are calling this projection of responsiveness,” Clark said, “which means seeing your relationship partner as behaving in the same manner toward you as you do toward that partner. That is, you see your partner as about as responsive to your welfare as you are to your partner’s welfare, regardless of the partner’s true behavior.”

The researchers said they conducted the studies because an essential feature of the health and well-being of a mutual communal relationship is believing that one’s partner cares about one’s welfare and will attend and respond to one’s desires, needs, and goals. Not only do people who care about their partners perceive that their partners in turn care about them, they also become more satisfied with their relationship over time.

“Sadly, the flip side is true too,” Clark said. “Those who are uncaring believe their apathy is reciprocated, which undermines their satisfaction.”

Citation: Journal of Personality and Social Psychology 92: 834-853 (May 2007)

Source: Yale University

nervous mice



New insights into the neural basis of anxiety

People who suffer from anxiety tend to interpret ambiguous situations, situations that could potentially be dangerous but not necessarily so, as threatening. Researchers from the Mouse Biology Unit of the European Molecular Biology Laboratory (EMBL) in Italy have now uncovered the neural basis for such anxiety behaviour in mice. In the current issue of Nature Neuroscience they report that a receptor for the messenger serotonin and a neural circuit involving a brain region called the hippocampus play crucial roles in mediating fear responses in ambiguous situations.

A mouse that has learned that a certain cue, for example a tone, is always followed by an electrical shock comes to associate the two and freezes with fear whenever it hears the tone even if the shock is not delivered. But in real life the situation is not always so clear; a stimulus will only sometimes be followed by a threat while other times nothing might happen. Normal mice show less fear towards such ambiguous cues than to clearly threatening stimuli.

A team of researchers led by Cornelius Gross at the EMBL Mouse Biology Unit now discovered that this response to ambiguous stimuli requires a specific receptor molecule for serotonin, a signal many brain cells use to communicate. Mice that lack the serotonin receptor 1A have problems processing ambiguous stimuli and react to them with full-fledged fear responses. The cause is wrongly connected cells in their brains. Serotonin signalling is very important for brain development and if the receptor 1A is missing, defects arise in the wiring of the brain that affect the behaviour of mice later on in life.

"In humans serotonin signalling has been implicated in disorders including depression and anxiety and like our mice patients suffering from these conditions also overreact to ambiguous situations," Gross says. "The next step was to identify the brain regions that are responsible for such complex fear behaviour and the processing of ambiguous cues."

Using a new technique to switch off neural activity in selective brain cells in living mice, Gross and his colleagues discovered that a specific part of the hippocampus is required for correct processing of ambiguous stimuli.

"Shutting down a specific circuit in the hippocampus abolished fear reactions only to ambiguous cues," says Theodoros Tsetsenis who carried out the research in Gross' lab. "The pathway must be involved in processing and assessing the value of stimuli. It seems to bias mice to interpret situations as threatening."

The hippocampus is mainly known as a region important for learning and memory, but the results reveal a more general role in evaluating information and assessing contingencies.

Neural circuits that govern fundamental behaviours like fear are often conserved between species, and patient studies suggest a role for the hippocampus in anxiety in humans as well.

The new insights gained into serotonin signalling via the receptor 1A and the role of the hippocampus in fear behaviour in mice promise to shed light on the neural basis of anxiety disorders and open up new avenues for therapies.

Source: European Molecular Biology Laboratory








connecting the spectrum of energy




University of Utah physicist Orest Symko holds a match to a small heat engine that produces a high-pitched tone by converting heat into sound. Symko's research team is combining such heat engines with existing technology that turns sound into electricity, resulting in devices that can harness solar energy in a new way, cool computers and other electronics. Credit: University of Utah



University of Utah physicists developed small devices that turn heat into sound and then into electricity. The technology holds promise for changing waste heat into electricity, harnessing solar energy and cooling computers and radars.



"We are converting waste heat to electricity in an efficient, simple way by using sound," says Orest Symko, a University of Utah physics professor who leads the effort. "It is a new source of renewable energy from waste heat."



Five of Symko’s doctoral students recently devised methods to improve the efficiency of acoustic heat-engine devices to turn heat into electricity. They will present their findings on Friday, June 8 during the annual meeting of the Acoustical Society of America at the Hilton Salt Lake City Center hotel.



Symko plans to test the devices within a year to produce electricity from waste heat at a military radar facility and at the university’s hot-water-generating plant.



The research is funded by the U.S. Army, which is interested in "taking care of waste heat from radar, and also producing a portable source of electrical energy which you can use in the battlefield to run electronics," he says.



Symko expects the devices could be used within two years as an alternative to photovoltaic cells for converting sunlight into electricity. The heat engines also could be used to cool laptop and other computers that generate more heat as their electronics grow more complex. And Symko foresees using the devices to generate electricity from heat that now is released from nuclear power plant cooling towers.



How to Get Power from Heat and Sound



Symko’s work on converting heat into electricity via sound stems from his ongoing research to develop tiny thermoacoustic refrigerators for cooling electronics.



In 2005, he began a five-year heat-sound-electricity conversion research project named Thermal Acoustic Piezo Energy Conversion (TAPEC). Symko works with collaborators at Washington State University and the University of Mississippi.



The project has received $2 million in funding during the past two years, and Symko hopes it will grow as small heat-sound-electricity devices shrink further so they can be incorporated in micromachines (known as microelectromechanical systems, or MEMS) for use in cooling computers and other electronic devices such as amplifiers.



Using sound to convert heat into electricity has two key steps. Symko and colleagues developed various new heat engines (technically called "thermoacoustic prime movers") to accomplish the first step: convert heat into sound.



Then they convert the sound into electricity using existing technology: "piezoelectric" devices that are squeezed in response to pressure, including sound waves, and change that pressure into electrical current. "Piezo" means pressure or squeezing.
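The two-step chain described above means the overall conversion efficiency is the product of the two stage efficiencies. A minimal sketch, using illustrative placeholder numbers rather than figures from the article:

```python
# Hypothetical two-stage conversion sketch. The stage efficiencies below
# are illustrative placeholders, not measured values from Symko's lab.

def overall_efficiency(heat_to_sound: float, sound_to_electricity: float) -> float:
    """Chain the two stages: heat -> sound -> electricity."""
    return heat_to_sound * sound_to_electricity

# e.g. if the prime mover converts 20% of heat to sound and the
# piezoelectric element converts 50% of that sound to electricity:
eff = overall_efficiency(0.20, 0.50)
print(f"overall efficiency: {eff:.0%}")
```

The multiplication makes clear why the students' work on each stage matters: improving either stage raises the end-to-end figure proportionally.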



Most of the heat-to-electricity acoustic devices built in Symko’s laboratory are housed in cylinder-shaped "resonators" that fit in the palm of your hand. Each cylinder, or resonator, contains a "stack" of material with a large surface area – such as metal or plastic plates, or fibers made of glass, cotton or steel wool – placed between a cold heat exchanger and a hot heat exchanger.



When heat is applied – with matches, a blowtorch or a heating element – the heat builds to a threshold. Then the hot, moving air produces sound at a single frequency, similar to air blown into a flute.



"You have heat, which is so disorderly and chaotic, and all of a sudden you have sound coming out at one frequency," Symko says.



Then the sound waves squeeze the piezoelectric device, producing an electrical voltage. Symko says it’s similar to what happens if you hit a nerve in your elbow, producing a painful electrical nerve impulse.



Longer resonator cylinders produce lower tones, while shorter tubes produce higher-pitched tones.
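The length-to-pitch relationship follows the standard half-wavelength resonance approximation for a tube with reflecting ends (a textbook model, not a formula from the article): the fundamental frequency is f = v / (2L), so halving the length doubles the pitch.

```python
# Idealized half-wave resonator model: fundamental frequency f = v / (2 * L).
# This is a standard acoustics approximation, not Symko's actual design data.

SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 degrees C

def fundamental_hz(length_m: float) -> float:
    """Fundamental resonance of an idealized tube of length L with reflecting ends."""
    return SPEED_OF_SOUND / (2.0 * length_m)

long_tube = fundamental_hz(0.0381)        # 1.5 inches, like the lab cylinders
short_tube = fundamental_hz(0.0381 / 2)   # half the length
print(f"{long_tube:.0f} Hz vs {short_tube:.0f} Hz")  # the shorter tube is an octave higher
```

For the palm-sized 1.5-inch cylinders described earlier, this simple model predicts a fundamental in the few-kilohertz range, which is consistent with the article's note that still smaller devices push into ultrasound.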



Devices that convert heat to sound and then to electricity have no moving parts, so they should require little maintenance and last a long time. They do not need to be built as precisely as, say, pistons in an engine, which loses efficiency as the pistons wear.



Symko says the devices won’t create noise pollution. First, as smaller devices are developed, they will convert heat to ultrasonic frequencies people cannot hear. Second, sound volume goes down as it is converted to electricity. Finally, "it’s easy to contain the noise by putting a sound absorber around the device," he says.



Studies Improve Efficiency of Acoustic Conversion of Heat to Electricity



Here are summaries of the studies by Symko’s doctoral students:



-- Student Bonnie McLaughlin showed it was possible to double the efficiency of converting heat into sound by optimizing the geometry and insulation of the acoustic resonator and by injecting heat directly into the hot heat exchanger.



She built cylindrical devices 1.5 inches long and a half-inch wide, and worked to improve how much heat was converted to sound rather than escaping. As little as a 90-degree Fahrenheit temperature difference between hot and cold heat exchangers produced sound. Some devices produced sound at 135 decibels – as loud as a jackhammer.
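To put 135 decibels in physical terms, the standard sound-pressure-level definition (reference pressure 20 micropascals) can be inverted to recover the pressure amplitude; the formula is standard acoustics, only the 135 dB figure comes from the article.

```python
# Convert a sound pressure level (dB SPL) back to an RMS pressure in pascals,
# using the standard reference pressure of 20 micropascals.

P_REF = 20e-6  # Pa, standard SPL reference pressure in air

def spl_to_pressure(db: float) -> float:
    """RMS acoustic pressure corresponding to a given sound pressure level."""
    return P_REF * 10 ** (db / 20.0)

print(f"{spl_to_pressure(135):.0f} Pa")  # ~112 Pa, versus ~101,325 Pa atmospheric
```

Roughly 112 pascals of acoustic pressure is loud enough to damage hearing quickly, but still only about a thousandth of atmospheric pressure, which is why containing the sound with an absorber, as Symko notes later, is straightforward.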



-- Student Nick Webb showed that pressurizing the air in a similar-sized resonator allowed it to produce more sound, and thus more electricity.



He also showed that by increasing air pressure, a smaller temperature difference between heat exchangers is needed for heat to begin converting into sound. That makes it practical to use the acoustic devices to cool laptop computers and other electronics that emit relatively small amounts of waste heat, Symko says.



-- Numerous heat-to-sound-to-electricity devices will be needed to harness solar power or to cool large, industrial sources of waste heat. Student Brenna Gillman learned how to get the devices – mounted together to form an array – to work together.



For an array to efficiently convert heat to sound and electricity, its individual devices must be "coupled" to produce the same frequency of sound and vibrate in sync.



Gillman used various metals to build supports to hold five of the devices at once. She found the devices could be synchronized if a support was made of a less dense metal such as aluminum and, more important, if the ratio of the support’s weight to the array’s total weight fell within a specific range. The devices could be synchronized even better if they were "coupled" when their sound waves interacted in an air cavity in the support.



-- Student Ivan Rodriguez used a different approach in building an acoustic device to convert heat to electricity. Instead of a cylinder, he built a resonator from a quarter-inch-diameter hollow steel tube bent to form a ring about 1.3 inches across.



In cylinder-shaped resonators, sound waves bounce against the ends of the cylinder. But when heat is applied to Rodriguez’s ring-shaped resonator, sound waves keep circling through the device with nothing to reflect them.



Symko says the ring-shaped device is twice as efficient as cylindrical devices in converting heat into sound and electricity. That is because the pressure and speed of air in the ring-shaped device are always in sync, unlike in cylinder-shaped devices.



-- Student Myra Flitcroft designed a cylinder-shaped heat engine one-third the size of the other devices. It is less than half as wide as a penny, producing a much higher pitch than the other resonators. When heated, the device generated sound at 120 decibels – the level produced by a siren or a rock concert.



"It’s an extremely small thermoacoustic device – one of the smallest built – and it opens the way for producing them in an array," Symko says.



Source: University of Utah







eau de Eiffel


Paris, France, May 31, 2007—It was "eau de Eiffel Tower" for this scuba diver, who swam in a tank installed under the famous landmark to support tourism, ocean conservation, and the sport of scuba.

The square pool—measuring 52 feet (16 meters) to a side—opened on June 1, 2007 to anyone hoping to get their feet wet in scuba diving, a sport not well known among city-dwellers.

Instructors will provide wetsuits, flippers, masks, and air tanks to interested visitors, who can each take a ten-minute swim in the 4.5-foot-deep (1.5-meter-deep) pool.

The organizers expect to welcome about a hundred people each day.

Sunday, June 3, 2007

more good news

Richard Black
Environment correspondent, BBC News website, Anchorage, Alaska

The blue whale, possibly the largest animal ever to live on Earth, is making a comeback, scientists have said.

They have collated data showing that the number of blue whales in the Southern Hemisphere has increased from a few hundred to a few thousand.

Before the commercial hunting era, there would have been hundreds of thousands in the oceans.

The findings were presented at the International Whaling Commission's (IWC) annual meeting.

Numbers of other large species such as fin whales and humpbacks are also rising in many parts of the world.

"The most recent data is really encouraging," said the IWC's head of science, Greg Donovan.

"Blue whales have now been increasing by about 7-8% per year for the last 10 years at least, for which we have good data.



"The abundance is still very low; it's about 2,300 for the whole Southern Hemisphere so it's a tiny fraction of what it used to be, but it's good news they're increasing."

There is less data available for the Northern Hemisphere, but off the Icelandic coast a recovery has also been noted.

The IWC's "guesstimate" is that globally, numbers are currently about 4,500.

Top target

Blue whales are true leviathans, growing up to 30m in length and weighing up to 190 tonnes.

Before industrial-scale commercial hunting began in earnest about a century ago, there were thought to be 150,000-200,000 in the oceans.
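Taking the article's numbers at face value, a back-of-envelope compound-growth calculation shows how long a recovery to pre-whaling levels would take. Assuming a constant 7.5% annual growth rate is a simplification for illustration, not an IWC projection:

```python
import math

# Rough projection from the figures in the article: ~2,300 Southern
# Hemisphere blue whales today, growing at roughly 7-8% per year, against
# a pre-whaling population of 150,000-200,000. Constant exponential
# growth is an illustrative assumption only.

def years_to_reach(current: float, target: float, annual_rate: float) -> float:
    """Years of compound growth needed to go from current to target."""
    return math.log(target / current) / math.log(1.0 + annual_rate)

print(f"{years_to_reach(2300, 150000, 0.075):.0f} years")  # roughly 58 years at 7.5%/yr
```

Even at the encouraging growth rate Donovan cites, restoring the low end of the historical population would take over half a century, which underlines his point that the abundance is still a tiny fraction of what it used to be.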

As factory ships and efficient harpoons multiplied, the blue's size made it the favoured species, as vast quantities of oil could be extracted from its blubber.

The 1930-1931 season alone saw about 30,000 prised from the oceans.

Protection arrived in the 1960s; but with numbers so low, it was doubtful whether the species could survive.

For now, it has survived: its extraordinary capacity to communicate acoustically over distances of thousands of kilometres means it can find mates even when so few remain.

Renewed threat?

Global protection from hunting for the second largest species, the fin, and the whale-watcher's favourite, the humpback, has also led to populations rising in several parts of the oceans.

But that can bring mixed benefits. There have been rumours over the last six months that scientists advising the organisation which decides threat categories for animals, the World Conservation Union (IUCN), are contemplating moving humpbacks from their current Vulnerable status to a less threatened category.

As numbers grow, so does the likelihood that a species will be hunted.

For the moment, catches of these giant creatures remain low. Over the next 12 months, hunters from Japan, Greenland, Iceland and the Caribbean will target 78 fins and 54 humpbacks, and nobody is suggesting catching blue whales again.

Dr Donovan sees climate change as potentially the biggest long-term threat.

"We don't know whether climate change is going to be a big problem for whales, but it could be," he said.

"Blues feed very close to the ice edge, and if there's less ice then it could affect them. But it could be that the opposite will happen, we really don't know at the moment."

Story from BBC NEWS:
http://news.bbc.co.uk/go/pr/fr/-/1/hi/sci/tech/6710051.stm

delightfully self-referential



James Watson's genome sequenced
Discoverer of the double helix blazes trail for personal genomics.

Erika Check
Nobel laureate James D. Watson peered deep into his genome yesterday. And soon, anyone else interested in his genetic makeup will be able to do the same.

Scientists in Houston presented Watson with a DVD of his genome sequence, which they said was the "first individual genome to be sequenced for less than $1 million". The carefully worded claim may be an acknowledgement that another personal genome project has already been completed: J. Craig Venter has deposited his genome sequence into the public GenBank database, he told Nature two weeks ago.

Such personal genomes are for now largely symbolic, because it's difficult to draw concrete information about a person's health from his or her genome sequence.

And genetic self-knowledge does not necessarily help a person: the only deliberate omission from Watson's sequence is that of a gene linked to Alzheimer's disease, which Watson, who is now 79, asked not to know about because it is incurable and claimed one of his grandmothers.

Scientists said yesterday that Watson's genes showed some predisposition to cancer. Watson — who, working with Francis Crick, deduced DNA's structure in 1953 — has had skin cancer, and a sister had breast cancer, he said yesterday. But it's unlikely that reading Watson's genome would have allowed doctors to predict what type of cancer he might have suffered before it was diagnosed.

Price cut

That could change in the near future, when hundreds or thousands of individual genomes are sequenced and scientists learn to correlate the sequence with health outcomes.

The US National Human Genome Research Institute is planning to sequence hundreds of individual genomes, and the private Archon X Prize in Genomics has announced a $10 million cash reward for the first team to sequence 100 genomes in 10 days.

Sequencing technology is still too expensive for most people — and some geneticists are concerned that sequencing prominent scientists makes genomics look like a gimmick for the rich and powerful (see 'Celebrity genomes alarm researchers').

But the price is dropping quickly. The Human Genome Project, which was declared complete four years ago, cost $3 billion over 13 years. Sequencing Watson's genome took two months and cost less than $1 million.

The Human Genome Project, which Watson helped initiate in 1988, assembled a composite 'reference' genome of DNA from many individuals, in contrast to the sequence given to Watson yesterday.

Watson's genome is expected to be described in a scientific publication soon, and was submitted to the GenBank yesterday. The project was initiated by scientists at the sequencing technology company 454 Life Sciences in Branford, Connecticut, and at Baylor College of Medicine in Houston, Texas.




Superfluid (and I don't mean beer)

Attempting to unlock the secrets of superfluidity

Ever since superfluidity was discovered in liquid helium, scientists have been searching for its causes, and exploring the different phases of matter in which superflow might exist (gases, liquids and solids).

“The property most closely associated with superfluidity is Bose-Einstein condensation,” Henry Glyde tells PhysOrg.com. Superfluidity in solid helium was reported in 2004. “BEC has been observed in liquids and gases, and the question now is: Does it exist in solids? We wanted to look for BEC in this third phase of matter.”

Glyde is a scientist at the University of Delaware, but the team on the project to discover Bose-Einstein condensation (BEC) in solid helium-4 consists of scientists from the National Physical Laboratory in Teddington and ISIS at the Rutherford Appleton Laboratory in the United Kingdom, as well as the NIST Center for Neutron Research in Gaithersburg, Maryland, and the University of Maryland in College Park. The U.S. Department of Energy also plays a major role in providing funding for this research. Diallo, Pearce, Azuah, Kirichek, Taylor and Glyde had their findings published in a Physical Review Letters piece titled “Bose-Einstein Condensation in Solid 4He.”

“BEC is a phenomena in which a large fraction of bosons in a system condense in a single quantum state,” explains Glyde. “With most BEC, you get superfluidity. This is when there is flow without resistance. We know this exists in gases and liquids. This superfluidity is very useful in many applications, including superconducting. Finding BEC in a solid could answer questions about how superfluidity works, and help us learn more about BEC.”

“This work could potentially have impacts on other areas of science,” Glyde explains. “If we can understand the origin of the BEC phenomenon, and of superfluidity, we can help clarify the whole connection between superfluidity, superconductivity and BEC. This is especially important in superconductivity, where we have a number of high temperature superconductors, and there is no consensus on how it arises.” Glyde pauses before continuing: “Clearly, BEC is playing a role, and when we know what that is, there is the potential for very wide application.”

Applications of BEC itself include highly coherent beams of atoms that could be used to build precise atomic circuitry. Understanding the role BEC plays in superfluidity could, in turn, lead to electrical wires that carry current without losing energy to resistance, and to very powerful magnets. “Once we understand the mechanism, how it works, then we can begin to tailor it to applications and materials to get the properties we want,” enthuses Glyde.

Glyde says that in 2004, Kim and Chan found that superflow does indeed exist in solid helium. “The next step was to measure it for Bose-Einstein condensation,” Glyde says. “We wanted to see if BEC was also present in solid helium.” The experiment used neutron scattering to measure the solid helium and then Glyde and his coauthors searched the measurement for BEC clues.

And the results? “We did not see any BEC in solid helium,” Glyde admits. “But this experiment did set a sort of limit. We didn’t use the highest resolution possible.” Glyde explains that when higher resolution is used in neutron scattering, fewer counting statistics are available. “We didn’t know what we would see in this first experiment. But we did get a good signal, and we set a precision level.”

This means that further experiments are planned for solid helium. “We plan to increase the resolution, as well as try the experiment at a lower temperature and use a larger solid,” Glyde explains. He also says that BEC might be connected with a defect in the material. Therefore, a future experiment with solid helium will also include a way to enhance defects in the solid.

“We’ve seen BEC in liquid helium using this instrument and made the connection with superfluidity,” Glyde insists. “Now we’re doing it with solid helium. This technique should work on this last phase of matter. And at the increased level, if there’s something there to see, we’ll see it.”

Saturday, June 2, 2007

Free courses, lectures on itunes

MIT OpenCourseWare (among others) now available (free) on iTunes
http://www.apple.com/education/itunesu/