A Bridge Too Far? Stark Warning From History Over Plans For 'Inhabited' London Bridge  

Posted by Zaib


On July 11, public celebrations will mark the 800th anniversary of the completion of London Bridge. Now, a new study at the University of Leicester has uncovered a tale of corruption, mismanagement, financial crisis and a property crash that resulted in the downfall of the Old London Bridge -- the capital’s last ‘living bridge’.


The research, which is due to be published in the London Journal, provides a stark warning from history as plans are discussed for a new ‘inhabited’ London Bridge – between Waterloo and Blackfriars – with luxury flats, shops and restaurants. London Mayor Boris Johnson has revived plans for the £80m scheme.

But doctoral research conducted by Mark Latham at the University of Leicester’s Centre for Urban History discovered that the houses built on Old London Bridge to attract the gentry didn’t have the expected pulling power. This, combined with an economic slump and other factors, ensured that the grand vision of an inhabited bridge across the Thames was not sustainable.

“Old London Bridge is familiar to many of us in the form of the nursery rhyme ‘London Bridge is fallen down’,” said Mr Latham, “but what is not generally known is why commerce and housing on the bridge actually collapsed.

“It was previously assumed by historians that the removal of the structures from the Bridge was part of a more general movement within the Corporation of London to “improve” the City via a series of infrastructure projects. However, it is clear from my research that a far more complex and intriguing set of factors was at play.”

The removal of the houses and shops from Old London Bridge occurred in 1756. Mr Latham’s study has examined why.

He said: “I am fascinated by the question of why the houses were removed from the Bridge as in medieval times they were viewed as one of London’s great attractions, and the rental income from the houses on the Bridge, alongside others within the City of London, financed the maintenance of the structure.

“What I discovered was that the organisation that managed the bridge at that time was plagued with incompetent management and corruption. Both workmen and their managers charged inflated prices for materials and labour, the management left rents uncollected, and on several occasions the workmen were found to have deliberately and almost fatally damaged the Bridge in order to charge for its repair.

“Furthermore, managers often paid for improvements to their own houses out of the coffers of the Trust running the Bridge.”

Problems were compounded by a “highly risky, costly and poorly timed project” undertaken in the teeth of a credit crisis to construct a series of gentrified houses on the Bridge in the belief that such houses would prove attractive to middle class Londoners and increase the organisation’s rental income. However, the authorities had grossly miscalculated the demand for such properties and the houses attracted only a handful of tenants.

A London wide property crash ensued and soon the Trust running the bridge was haemorrhaging income, the maintenance budget for the Bridge itself was being squeezed and so the vacant houses on the Bridge began to rapidly fall into a state of dangerous disrepair. London Bridge was indeed close to being “fallen down”, said Mr Latham.

“At this point reality dawned on the members of the Trust, and they faced up to the fact that it was no longer financially viable to maintain structures on the Bridge; by early 1755 they had begun to petition Parliament in a desperate plea for the money to fund their demolition.”

One further interesting insight from the study is that the removal of the houses and businesses from the Bridge marks a break from London’s medieval past.

Said Mr Latham: “The renovation of the Bridge in the mid-eighteenth century was such an important event in the history of London because, in many ways, the demolition of these characterful medieval houses and the subsequent transformation of the Bridge into a bland, utilitarian structure - very similar to the London Bridge we see today - represent a rupture with London’s medieval past and can be taken as symbolic of London’s emergent modernity.”


Single Molecules As Electric Conductors  

Posted by Zaib

Researchers from Graz University of Technology, Humboldt University in Berlin, M.I.T., Montan University in Leoben and Georgia Institute of Technology report an important advance in the understanding of electrical conduction through single molecules.


Minimum size, maximum efficiency: The use of molecules as elements in electronic circuits shows great potential. One of the central challenges until now has been that most molecules only start to conduct once a large voltage has been applied. An international research team involving Graz University of Technology has shown that molecules containing an odd number of electrons are much more conductive at low bias voltages. These fundamental findings in the highly dynamic research field of nanotechnology open up a diverse array of possible applications: more efficient microchips and components with considerably increased storage densities are conceivable.

One electron instead of two: Most stable molecules have a closed-shell configuration with an even number of electrons. Molecules with an odd number of electrons tend to be harder for chemists to synthesize, but they conduct much better at low bias voltages. Although using an odd rather than an even number of electrons may seem simple, it is a fundamental realization in the field of nanotechnology – because, as a result, metal elements in molecular electronic circuits can now be replaced by single molecules. “This brings us a considerable step closer to the ultimate miniaturization of electronic components”, explains Egbert Zojer from the Institute for Solid State Physics of the Graz University of Technology.
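To make the low-bias effect concrete, here is a minimal sketch (Python) of the textbook single-level Breit-Wigner transmission picture; the level positions and broadening are assumed illustrative values, not figures from the Graz study. A radical's singly occupied orbital sits close to the electrode Fermi energy, so its low-bias conductance comes out far higher than that of a closed-shell molecule whose frontier orbital lies an electronvolt away.

    G0 = 7.748e-5  # conductance quantum 2e^2/h, in siemens

    def transmission(E, eps, gamma):
        # Breit-Wigner transmission through a single molecular orbital
        # centred at energy eps with electrode coupling (broadening) gamma
        return gamma**2 / ((E - eps)**2 + gamma**2)

    # Assumed, illustrative level positions relative to the Fermi energy (eV):
    # a closed-shell molecule's frontier orbital lies ~1 eV away, while a
    # radical's singly occupied orbital is pinned near E_F.
    for label, eps in [("even-electron (closed shell)", 1.0),
                       ("odd-electron (radical)", 0.05)]:
        G = G0 * transmission(0.0, eps, gamma=0.05)
        print(f"{label}: G ~ {G:.2e} S")

Under these assumed numbers the odd-electron molecule conducts roughly two orders of magnitude better at zero bias, which is the qualitative point the researchers make.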

Molecules instead of metal

The motivation for this basic research is the vision of circuits that only consist of a few molecules. “If it is possible to get molecular components to completely assume the functions of a circuit’s various elements, this would open up a wide array of possible applications, the full potential of which will only become apparent over time. In our work we show a path to realizing the highly electrically conductive elements”, Zojer excitedly reports the momentous consequences of the discovery.

Specific new perspectives are opened up in the fields of molecular electronics, sensor technology and the development of bio-compatible interfaces between inorganic and organic materials: the latter refers to contact with biological systems, such as human cells, which can be connected to electronic circuits in a bio-compatible fashion via the conductive molecules.


Apollo 11 Moon Rocks Still Crucial 40 Years Later, Say Researchers  

Posted by Zaib

A lunar geochemist at Washington University in St. Louis says that there are still many answers to be gleaned from the moon rocks collected by the Apollo 11 astronauts on their historic moonwalk 40 years ago July 20.


And he credits another WUSTL professor for the fact that the astronauts even collected the moon rocks in the first place.

Randy L. Korotev, Ph.D., a research professor in the Department of Earth and Planetary Sciences in Arts & Sciences, has studied lunar samples and their chemical compositions since he was an undergraduate at the University of Wisconsin and "was in the right place at the right time" in 1969 to be a part of a team to study some of the first lunar samples.

"We know even more now and can ask smarter questions as we research these samples," says Korotev, who is mainly interested in studying the impact history of the moon, how the moon's surface has been affected by meteorite impacts and the nature of the early lunar crust.

"There are still some answers, we believe, in the Apollo 11 mission.

"We went to the moon and collected samples before we knew much about the moon. We didn't totally understand the big concept of what the moon was like until early 2000 as a result of missions that orbited the moon collecting mineralogical and compositional data.

"It's only been fairly recently that we decided that we should look closer at these Apollo 11 samples."

Korotev credits the late Robert M. Walker, Ph.D., Washington University's McDonnell Professor of Physics in Arts & Sciences, and a handful of other scientists for the fact that there are even moon samples to study.

"Bringing samples back from the moon wasn't the point of the mission," says Korotev. "It was really about politics. It took scientists like Bob Walker to bring these samples back — to show the value of them for research.

"Bob convinced them to build a receiving lab for the samples and advised them on the handling and storage of them.

"We didn't' go to the moon to collect rocks, so we scientists are really lucky that we have this collection."

Korotev points out that by the last Apollo mission — Apollo 17 — one of the astronauts onboard was a geologist, Harrison H. Schmitt.

WUSTL's moon history

Walker was recruited to serve on the scientific team that advised NASA on the handling and distribution of moon rocks and soil samples from the first Apollo missions. That team distributed Apollo 11 samples to some 150 laboratories worldwide, including WUSTL.

Walker also briefed those early astronauts about what to expect on the rocky, dusty moon surface.

In an interview some months after the first moon samples arrived in WUSTL's space sciences lab, Walker recalled the excitement of that momentous day in 1969: "We felt just like a bunch of kids who were suddenly given a brand new toy store ... there was so much to do, we hardly knew where to begin."

Ghislaine Crozaz, Ph.D., professor of earth and planetary sciences emerita in Arts & Sciences at Washington University and a member of Walker's space sciences group that was one of those selected to study the first lunar samples, says the event is "as vivid in my mind as if it had happened yesterday."

Crozaz says that the team studied the cosmic rays and radiation history of the lunar samples mainly using nuclear particle tracks, which were revealed by techniques invented by Walker.

"After we received the samples in early September, we worked like hell until the First Lunar Science Conference in early January 1970 in Houston, where we arrived with our Science paper after having worked 'incommunicado' for 4 months."

In their study of the lunar materials, Walker's laboratory led the way in deciphering their record of lunar, solar system and galactic evolution. Of special importance was the information they gave on the history of solar radiation and cosmic rays.

Crozaz, who later became Walker's wife, says the lunar samples provided insights into the history of the solar system that couldn't be achieved at the time by looking at meteorites found on Earth. The intense heat encountered during their passage through the atmosphere would have erased much of the record of radiation the meteorites carried.

The Apollo 11 samples — and samples from almost every Apollo mission until the last one in December 1972 — have been securely housed on the 4th floor of the physics department's Compton Laboratory and used by numerous WUSTL researchers, including many members of the McDonnell Center for the Space Sciences. The McDonnell Center was established in 1974, with Walker as its inaugural director.

Today, the remaining lunar samples in Compton Hall that arrived in 1969 from the Apollo 11 mission and from subsequent Apollo missions in the 1970s are being painstakingly prepared for a return trip to NASA's moon rock repository, the Lunar Sample Building at the Lyndon B. Johnson Space Center in Houston, Texas.

"The samples have been exhaustively analyzed and numerous papers have been published showing interesting research results," says Ernst K. Zinner, Ph.D., research professor of physics and of earth and planetary sciences, who joined Walker's lab in 1972 studying Apollo mission samples before focusing on analysis of stellar dust grains found in primitive meteorites.

"We have finished analyzing these particular samples and we're focusing on other extraterrestrial samples. In a sense, our lab in Compton has moved from the moon to the stars in our research interests.

"It is a great and serious responsibility to hold and guard these samples, which are absolutely irreplaceable."

In the meantime, in the Earth and Planetary Sciences Building, next door to Compton Hall, Korotev, who received his Apollo 11 samples from NASA much later — not until 2005 — still has much work to do with his samples, which have been chemically analyzed and are sealed in tubes and securely stored away for now.

"You can look at the moon and know that the moon has been hit a lot by very large meteorites," says Korotev. "We know this occurred some 3.9 billion years ago.

"We don't know, however, the history of large meteorites hitting the Earth — we can't see those impacts because they would have been erased by Earth's active geology.

"We want to see if meteorite bombardment on the moon coincided with what was happening on Earth, and, in turn, with life starting on Earth," says Korotev, who as a 20-year-old chemistry major in 1969, decided his career path after working with the Apollo 11 rocks.

"The whole experience decided my career. I went to graduate school in 1971 to study lunar geochemistry so that I'd know how to interpret the chemical data we obtained in terms of lunar geology. That's what I'm still doing!"


First Look At The Apollo Landing Sites  

Posted by Zaib

The imaging system on board NASA's Lunar Reconnaissance Orbiter (LRO) recently had its first of many opportunities to photograph the Apollo landing sites. The Lunar Reconnaissance Orbiter Camera (LROC) imaged five of the six Apollo sites with the narrow angle cameras (NACs) between July 11 and 15, within days of the 40th anniversary of the Apollo 11 mission.


The early images obtained by LROC, operated by Arizona State University Professor Mark Robinson, show the lunar module descent stages left behind by the departing astronauts. Their locations are made evident by their long shadows, which result from a low sun angle at the time of collection.

"In a three-day period we were able to image five of the six Apollo sites – the LROC team anxiously awaited each image," says LROC Principal Investigator Mark Robinson, professor in the School of Earth and Space Exploration in ASU's College of Liberal Arts and Sciences. "Of course we were very interested to get our first peek at the lunar module descent stages just for the thrill – and to see how well the cameras had come into focus."

The orbiter's current elliptical orbit resulted in image resolutions from the NACs that were slightly different for each site but were all about four feet per pixel. Since the deck of the descent stage is about 14 feet in diameter, the Apollo relics themselves fill about four pixels. However, because the Sun was low to the horizon when the images were acquired, even subtle variations in topography create long shadows. Standing just over ten feet above the surface, each Apollo descent stage creates a distinct shadow that fills roughly 20 pixels.
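The pixel arithmetic is easy to reproduce. The short sketch below (Python) uses the figures quoted above; the sun elevation is an assumed value, chosen because roughly 7 degrees reproduces the reported ~20-pixel shadows.

    import math

    ground_scale_ft = 4.0    # approximate NAC ground scale quoted above
    deck_diameter_ft = 14.0  # descent stage deck diameter
    stage_height_ft = 10.0   # stage stands just over ten feet tall
    sun_elevation_deg = 7.0  # assumed; the article only says the sun was low

    # Pixels spanned by the hardware itself
    deck_pixels = deck_diameter_ft / ground_scale_ft  # ~3.5, "about four pixels"

    # A low sun stretches shadows: length = height / tan(elevation)
    shadow_ft = stage_height_ft / math.tan(math.radians(sun_elevation_deg))
    shadow_pixels = shadow_ft / ground_scale_ft       # ~20 pixels

    print(f"deck: ~{deck_pixels:.1f} px, shadow: ~{shadow_pixels:.0f} px")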

"For the five landing site images photographed by LROC, the biggest variables are spacecraft altitude (ground scale) and time of day, which translates into signal strength," explains Robinson. "In the current collection of images the best discrimination of features is in the Apollo 14 scene even though the highest resolution picture covers the Apollo 16 site."

Compared to the other landing site images, the image of the Apollo 14 site revealed additional details. The Apollo Lunar Surface Experiment Package (ALSEP), a set of scientific instruments placed by the astronauts at the landing site, is discernable, as are the faint trails between the descent stage and ALSEP left by the astronauts' footprints.

Though it had been expected that LRO would be able to resolve the remnants of the Apollo missions, these first images came prior to the spacecraft reaching its final mapping orbit. As the orbit of LRO is lowered, LROC will receive many more opportunities to image the landing sites in the weeks to come. The resolution of future LROC images of these sites will improve by two to three times.

The timing of these images is notable: they were captured only days before the 40th anniversary of NASA's Apollo 11 mission that first put humans on the moon. Though these pictures provide a reminder of one of humankind's greatest technological achievements, LRO's primary focus is paving the way for future exploration. By returning detailed lunar data, the LRO mission will help NASA identify safe and compelling landing sites for future explorers, locate potential resources, describe the moon's radiation environment and demonstrate new technologies.


Critically Endangered Chinese Alligators Multiplying In The Wild After Reintroduction  

Posted by Zaib

The Wildlife Conservation Society announced today that critically endangered alligators in China have a new chance for survival. The WCS's Bronx Zoo, in partnership with two other North American parks and the Department of Wildlife Conservation and Management of the State Forestry Administration of China, has successfully reintroduced alligators into the wild that are now multiplying on their own.


The alligator hatchlings—15 in number—are the offspring of a group of alligators that includes animals from the Wildlife Conservation Society's Bronx Zoo. The baby alligators represent a milestone for the 10-year effort to reintroduce the Chinese alligator on Chongming Island, located at the mouth of China's Yangtze River.

The announcement was made at the International Congress for Conservation Biology, convened by the Society for Conservation Biology in Beijing, China (July 11-16).

"We are grateful to our Chinese partners for their commitment to reintroduce Chinese alligators back into the wild," said Dr. Steven E. Sanderson, President and CEO of the Wildlife Conservation Society. "WCS has championed careful wildlife reintroductions for more than a century. The reintroduction of Chinese alligators is a great example of how WCS partners with governments and local communities around the world to save wildlife and wild places."

"This is fantastic news," said WCS researcher Dr. John Thorbjarnarson, one of the world's foremost experts on crocodilians and a participant in the project. "The success of this small population suggests that there's hope for bringing the Chinese alligator back to some parts of its former distribution."

Plans to reintroduce Chinese alligators started in 1999 with a survey conducted by WCS, the Anhui Forestry Bureau, and the East China Normal University in Anhui Province, the only remaining location where the reptiles are still found in the wild and a small fraction of the alligator's former range. The results of the survey were dire, with an estimate of fewer than 130 animals in a declining population.

An international workshop on the species was held in 2001, followed by recommendations for the reintroduction of captive bred alligators. The first three animals released in Hongxing Reserve of Xuancheng County in Anhui in 2003 were from the Anhui Research Center of Chinese Alligator Reproduction (ARCCAR).

To ensure the maximum genetic diversity for the effort, project participants imported 12 more animals to Changxing Yinjiabian Chinese Alligator Nature Reserve from North America, including four from the Bronx Zoo. From this group, three animals from the U.S. were released in 2007 along with three more alligators from Changxing. The alligators were given health examinations by veterinary professionals from WCS's Global Health Program and the Shanghai Wildlife Zoo and fitted with radio transmitters for remote monitoring before being released.

Experts reported that the reintroduced alligators successfully hibernated, and then in 2008, bred in the wild.

With a former range that covered a wide watershed area of East China, the Chinese alligator—or "tu long," which means "muddy dragon"—is now listed as "Critically Endangered" on IUCN's Red List of Threatened Species and is the most threatened of the 23 species of crocodilians in the world today. It is one of only two alligator species in existence (the other is the better known, and much better off, American alligator).

The Yangtze River, where the reintroduction of these alligators took place, is the third longest river in the world (after the Amazon and the Nile) and is China's most economically important waterway. The world's largest hydro-electric dam—the Three Gorges Dam—is also located on the river. The high levels of development along the river have become a challenge for native wildlife; in 2006, a comprehensive search for the Yangtze River dolphin, or baiji, didn't find any, although one isolated sighting of a dolphin was made in 2007.

Other participants in the project include the East China Normal University, Shanghai Forestry Bureau, Changxing Yinjiabian Chinese Alligator Nature Reserve, and Wetland Park of Shanghai Industrial Investment (Holdings) Co. Ltd.

The project is being supported by the Ocean Park Conservation Foundation, Hong Kong.


King Crabs Go Deep To Avoid Hot Water  

Posted by Zaib

Researchers from the University of Southampton have drawn together 200 years' worth of oceanographic knowledge to investigate the distribution of a notorious deep-sea giant - the king crab. The results, published this week in the Journal of Biogeography, reveal temperature as a driving force behind the divergence of a major seafloor predator; globally, and over tens of millions of years of Earth's history.


In deep seas all over the world, around 100 species of king crabs live largely undiscovered. The fraction that has been found includes some weird and wonderful examples - Paralomis seagrantii has its eight walking legs and claws entirely covered in long fur-like setae, while the related Lithodes megacanthus grows to lengths of 1.5 metres and has 15-20-cm-long defensive spines covering its body. At temperatures of around 1-4ºC, these crabs thrive in some of the colder waters on Earth, living and growing very slowly, probably to very old ages. Only in the cooler water towards the poles are king crabs found near the water surface - though temperatures found around some parts of the Antarctic (below 1ºC) are too extreme for their survival.

A paper published in Nature 15 years ago is thought to show that king crabs evolved from shell-bound hermit crabs - similar to the familiar shoreline animals. Soft-bodied but shell-free intermediate forms are found only in the shallow waters off Japan, Alaska, and Western Canada.

By looking at 200 years' worth of records from scientific cruises and museum collections, Sally Hall and Dr Sven Thatje from the University of Southampton's School of Ocean and Earth Science at the National Oceanography Centre, Southampton, discovered that the soft-bodied forms can live at temperatures about ten degrees higher than the hard-bodied forms, but that both groups can only reproduce at temperatures between 1ºC and 13-15ºC.

"It seems that most shallow-water representatives of this family are trapped in the coastal regions of the North Pacific because the higher sea surface temperatures further south prevent them from reproducing successfully and spreading," said Dr Thatje.

In order to leave this geographic bottleneck and spread around the world, the shallow-water ancestors of current deep-sea groups had to go deep and adapt to the challenges of life in the deep sea. The process of adaptation to the constant low temperatures (1-4ºC) prevailing in the deep sea seems to have narrowed the temperature tolerance range of the crabs that have re-emerged into surface waters in the Southern Hemisphere. With differences of only a couple of degrees in temperature affecting the distribution of the king crab, it is difficult to predict the consequences of range expansion in the warming waters around the Antarctic Peninsula region.

King crabs are of great commercial value, and fisheries are established in high latitude regions of both hemispheres. "Understanding their evolutionary history and ecology is key to supporting sustainable fisheries of these creatures," said research student Sally Hall. She adds: "Recent range extensions of king crabs into Antarctica, as well as that of the red king crab Paralithodes camtschaticus in the Barents Sea and along the coast off Norway, emphasise the responsiveness of this group to rapid climate change."

This study reveals temperature as a driving force behind the speciation and radiation of a major seafloor predator globally and over tens of millions of years of Earth's history.

The study has been supported by the Natural Environment Research Council (UK) through a PhD studentship to Sally Hall, and a Research Grant from the Royal Society awarded to Dr Thatje.


Ecologist Brings Century-old Eggs To Life To Study Evolution  

Posted by Zaib

Suspending a life in time is a theme that normally finds itself in the pages of science fiction, but now such ideas have become a reality in the annals of science.


Cornell ecologist Nelson Hairston Jr. is a pioneer in a field known loosely as "resurrection ecology," in which researchers study the eggs of such creatures as zooplankton -- tiny, free-floating water animals -- that get buried in lake sediments and can remain viable for decades or even centuries. By hatching these eggs, Hairston and others can compare time-suspended hatchlings with their more contemporary counterparts to better understand how a species may have evolved in the meantime.

The researchers take sediment cores from lake floors to extract the eggs; the deeper the egg lies in the core, the older it is. They then place the eggs in optimal hatching conditions, such as those found in spring in a temperate lake, and let nature take its course.
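As a toy illustration of the depth-to-age step (Python): assuming a constant sedimentation rate (real cores are dated with radiometric markers such as lead-210, and the rate below is an assumed value for illustration), an egg's depth in the core converts directly to an approximate age.

    # Assumed constant sedimentation rate; real studies date core layers
    # radiometrically rather than relying on a fixed rate.
    sedimentation_cm_per_yr = 0.3

    def egg_age_years(depth_cm):
        """Approximate age of an egg found depth_cm down the core."""
        return depth_cm / sedimentation_cm_per_yr

    print(egg_age_years(12.0))  # an egg 12 cm down -> roughly 40 years old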

"We can resurrect them and discover what life was like in the past," said Hairston, who came to Cornell in 1985 and is a professor and chair of Cornell's Department of Ecology and Evolutionary Biology. "Paleo-ecologists study microfossils, but you can't understand much physiologically or behaviorally" with that approach, he said.

Hairston first became interested in the possibilities of studying dormant eggs in the late 1970s, when he was an assistant professor of zoology at the University of Rhode Island. There, he noticed that the little red crustaceans -- known as copepods -- in the pristine lake behind his Rhode Island home disappeared in the summer, only to return as larvae in the fall.

The observation prompted him to study why they disappear, research that revealed the copepods stay active under the ice in the winter, but they die out as their eggs lie dormant on the lake floor through the summer when the lake's fish are most active. When the fish become less active in the fall, larvae hatch from the eggs, and the copepods continue their life cycle.

This time suspension, where zooplankton pause their life cycles to avoid heavy predation or harsh seasonal and environmental conditions, also increases a species' local gene pool, with up to a century's worth of genetic material stored in a lake bed, Hairston said. When insects, nesting fish and boat anchors stir the mud, they can release old eggs that hatch and offer a wider variety of genetic material to the contemporary population.

In 1999 Hairston and colleagues published a paper in Nature that described how 40-year-old resurrected eggs could answer whether tiny crustaceans called Daphnia in central Europe's Lake Constance had evolved to survive rising levels of toxic cyanobacteria, known as blue-green algae. In the 1970s, phosphorus levels from pollution rose in the lake, increasing the numbers of cyanobacteria. The researchers hatched eggs from the 1960s and found they could not survive the toxic lake conditions, but Daphnia from the 1970s had adapted and survived.

Hairston and colleagues have organized a resurrection ecology symposium in September 2009, in Herzberg, Switzerland, to bring together researchers in this growing new field.


Moles And Melanoma: Genetic Links To Skin Cancer Found  

Posted by Zaib

New research has shown why people with the greatest number of moles are at increased risk of the most dangerous form of skin cancer.


The study, led by Professors Julia Newton Bishop and Tim Bishop of the Melanoma Genetics Consortium (GenoMEL) at the University of Leeds, looked at more than 10,000 people, comparing those diagnosed with melanoma to those without the disease.

Researchers across Europe and in Australia looked at 300,000 variations in their research subjects' genetic make-up to pinpoint which genes were most significant in developing melanoma – a disease which causes the overwhelming majority of skin cancer-related deaths. Their findings are published in the journal Nature Genetics.

Across the large sample, a number of clear genetic patterns emerged.

It is already well known that red-haired people, those with fair skin and those who sunburn easily are most at risk of melanoma, and the people who had been diagnosed with melanoma were found to be much more likely to be carrying the genes most closely associated with red hair and freckles. "This is what we expected to find," said Professor Bishop of the Leeds Institute of Molecular Medicine and the Cancer Research UK Centre at Leeds. "But the links seemed to be much stronger than we anticipated."

"We had known for some time that people with many moles are at increased risk of melanoma. In this study we found a clear link between some genes on chromosomes 9 and 22 and increased risk of melanoma. These genes were not associated with skin colour," he added.

"Instead, in joint research with colleagues at King's College London and in Brisbane who counted the number of moles on volunteer twins, we showed that these genes actually influenced the number of moles a person has."

Around 48,000 people worldwide die of melanoma each year. It is more common in males and those with pale skin – and is on the increase. It is widely believed that the increase in melanomas is largely due to social and behavioural activities, such as increased exposure to the sun, partly caused by the availability of cheaper foreign holidays. Sunny holidays increase the risk because it is intermittent sun exposure which causes melanoma rather than daily exposure over longer periods of time.

Even so, the process by which sunlight and genetics combine to cause cancer in some people is still poorly understood, as Professor Bishop explained: "If you take the people who have the greatest exposure to sunlight – those who work outside for example – and compare them to those with the least exposure, their risks of getting skin cancer are actually quite similar. Statistically, the differences are quite negligible.

"What we do know is that the combination of particular genes and a lifestyle of significant sun exposure is putting people at greatest risk."

The research shows that there are at least five genes which influence the risk of melanoma. A person carrying all the variants associated with an increased risk is around eight times more likely to develop melanoma than those carrying none, though the majority of people carry at least one of these variants.

Sara Hiom, Cancer Research UK's director of health information, said: "The more we can understand malignant melanoma through research like this the closer we should get to controlling what is an often fatal cancer. This study confirms Cancer Research UK's advice in its SunSmart campaign that people with lots of moles – as well as those with red hair and fair skin – are more at risk of the most dangerous form of skin cancer and should take extra care in the sun.

"The research goes further and identifies the actual genes associated with this increased risk."


Researchers Uncover Genetic Variants Linked To Blood Pressure In African-Americans  

Posted by Zaib

A team led by researchers from the National Institutes of Health today reported the discovery of five genetic variants related to blood pressure in African-Americans, findings that may provide new clues to treating and preventing hypertension. The effort marks the first time that a relatively new research approach, called a genome-wide association study, has focused on blood pressure and hypertension in an African-American population.


Hypertension, or chronic high blood pressure, underlies an array of life-threatening conditions, including heart disease, stroke and kidney disease. Diet, physical activity and obesity all contribute to risk of hypertension, but researchers also think genetics plays an important role.

About one-third of U.S. adults suffer from hypertension. The burden is considerably greater in the African-American community, in which the condition affects 39 percent of men and 43 percent of women.

"This work underscores the value of using genomic tools to untangle the complex genetic factors that influence the risk for hypertension and other common diseases," said Eric Green, M.D., Ph.D., scientific director for the National Human Genome Research Institute (NHGRI), part of NIH. "We hope these findings eventually will translate into better ways of helping the millions of African-Americans at risk for hypertension, as well as improved treatment options for other populations."

In addition to NHGRI researchers, scientists from the Coriell Institute for Medical Research in Camden, N.J.; Boston University; and Howard University, in Washington, D.C., collaborated on the study, which was published in the July 17 online issue of PLoS Genetics.

To produce their findings, researchers analyzed DNA samples from 1,017 participants in the Howard University Family Study, a multigenerational study of families from the Washington, D.C., metropolitan area who identified themselves as African-American. Half of the volunteers had hypertension and half did not. To see if there were any genetic differences between the two groups, researchers scanned the volunteers' DNA, or genomes, analyzing more than 800,000 genetic markers called single-nucleotide polymorphisms (SNPs).

The researchers found five genetic variants significantly more often in people with hypertension than in those without the condition. The variants were associated with high systolic blood pressure, but not with diastolic blood pressure or combined systolic/diastolic blood pressure.

Blood pressure is measured in millimeters of mercury (mm Hg), and expressed with two numbers; for example, 120/80 mm Hg. The first number (systolic pressure) is the pressure when the heart beats while pumping blood. The second number (diastolic pressure) is the pressure in large arteries when the heart is at rest between beats.

"This is the first genome-wide association study for hypertension and blood pressure solely focused on a population with majority African ancestry," said the study's senior author, Charles Rotimi, Ph.D., NHGRI senior investigator and director of the trans-NIH Center for Research on Genomics and Global Health (CRGGH). "Although the effect of each individual genetic variant was modest, our findings extend the scope of what is known generally about the genetics of human hypertension."

In a genome-wide association study, researchers identify strategically selected markers of genetic variation. If disease status differs for individuals with certain genetic variants, this indicates that something in that chromosomal neighborhood likely influences the disease. Variants detected using this approach can accurately point to the region of the genome involved, but may not themselves directly influence the trait.
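A minimal sketch of the per-marker test at the heart of such a study (Python, with hypothetical counts rather than data from the Howard University Family Study): compare allele counts between cases and controls at one SNP. Because the same test is repeated across roughly 800,000 markers, a hit must clear a multiple-testing threshold far stricter than the usual p < 0.05 (conventionally around p < 5e-8).

    from scipy.stats import chi2_contingency

    # Hypothetical allele counts at one SNP (illustration only):
    #            risk allele, other allele
    cases    = [330, 670]   # alleles from hypertensive volunteers
    controls = [270, 730]   # alleles from normotensive volunteers

    chi2, p, dof, expected = chi2_contingency([cases, controls])
    print(f"chi2 = {chi2:.2f}, p = {p:.2e}")

    # A GWAS repeats this test at every marker, which is why
    # genome-wide significance thresholds are so stringent.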

In May, two major international studies used the genome-wide association approach to identify 13 genetic variants associated with blood pressure and hypertension in people with primarily European and South Asian ancestry. While each variant was associated with only a slight increase in blood pressure, that work found that the more variants an individual had, the greater his or her risk of hypertension. Two genes identified by one of those studies were also associated with blood pressure in the new study.

In their pioneering study of African-Americans, Dr. Rotimi and his colleagues found that all of the five genetic variants associated with blood pressure were located in or near genes that code for proteins thought to be biologically important in hypertension and blood pressure. Previous research had implicated two of those genes in blood pressure regulation, and additional analyses by Dr. Rotimi's group revealed that all of the variants are likely involved in biological pathways and networks related to blood pressure and hypertension.

An existing class of anti-hypertension drugs, called calcium channel blockers, already targets one of the genes, CACNA1H. However, the additional genes may point to new avenues for treatment and prevention.

To follow up and expand upon their findings in African-Americans, the researchers scanned DNA from 980 West Africans with and without hypertension. The work confirmed that some of the genetic variants detected in African-Americans were also associated with blood pressure in West Africans. "The Western African population is of particular significance since it is the ancestral population of many African-Americans," said lead author Adebowale Adeyemo, M.D., CRGGH staff scientist.

This study was supported by the NHGRI, CRGGH, and the National Institute of General Medical Sciences, all part of NIH; and by a W.W. Smith Foundation grant to the Coriell Institute. The Howard University General Clinical Research Center carried out the enrollment of study participants.

For more information about hypertension, visit http://www.nhlbi.nih.gov/health/dci/Diseases/Hbp/HBP_WhatIs.html.

To learn more about the genome-wide association approach, visit http://www.genome.gov/20019523.


Why Winning Athletes Are Getting Bigger  

Posted by Zaib

While watching swimmers line up during the 2008 Olympic Games in Beijing, former Olympic swimmer and NBC Sports commentator Rowdy Gaines quipped that swimmers keep getting bigger, with the shortest one in the current race towering over the average spectator.


What may have been seen as an off-hand remark turns out to illustrate a trend in human development -- elite athletes are getting bigger and bigger.

What Gaines did not know was that a new analysis by Duke University engineers has indeed shown that not only have Olympic swimmers and sprinters gotten bigger and faster over the past 100 years, but they have grown at a much faster rate than the normal population.

Furthermore, the researchers said, this pattern of growth can be predicted by the constructal theory, a Duke-inspired theory of design in nature that explains such diverse phenomena as river basin formation and the capillary structure of tree branches and roots.

In a new analysis, Jordan Charles, an engineering student who graduated this spring, collected the heights and weights of world-record holders in the 100-meter swim and the 100-meter sprint since 1900. He then correlated the size growth of these athletes with their winning times.

"The trends revealed by our analysis suggest that speed records will continue to be dominated by heavier and taller athletes," said Charles, who worked with senior author Adrian Bejan, engineering professor who came up with the constructal theory 13 years ago. The results of their analysis were published online in the Journal of Experimental Biology. "We believe that this is due to the constructal rules of animal locomotion and not the contemporary increase in the average size of humans."

Specifically, while the average human has gained about 1.9 inches in height since 1900, Charles' research showed that the fastest swimmers have grown 4.5 inches and the swiftest runners have grown 6.4 inches.

The theoretical rules of animal locomotion generally state that larger animals should move faster than smaller animals. In his constructal theory, Bejan linked all three forms of animal locomotion -- running, swimming and flying. Bejan argues that the three forms of locomotion involve two basic forces: lifting weight vertically and overcoming drag horizontally. Therefore, they can be described by the same mathematical formulas.

Using these insights, the researchers can predict running speeds during the Greek or Roman empires, for example. In those days, obviously, no race times were recorded.

"In antiquity, body weights were roughly 70 percent less than they are today," Charles said. "Using our theory, a 100-meter dash that is won in 13 seconds would have taken about 14 seconds back then."

Charles, a varsity breaststroke swimmer during his time at Duke, said this new way of looking at locomotion and size validates a particular practice in swim training, though for a different reason. Swimmers are urged by their coaches to raise their body as far as they can out of the water with each stroke as a means of increasing their speed.

"It was thought that the swimmer would experience less friction drag in the air than in the water," Charles said. "However, when the body is higher above the water, it falls faster and more forward when it hits the water. The larger wave that occurs is faster and propels the body forward. A larger swimmer would get a heightened effect. Right advice, wrong reason."

In an almost whimsical corollary, the authors suggest that if athletes of all sizes are to compete in these kinds of events, weight classes might be needed.

"In the future, the fastest athletes can be predicted to be heavier and taller," Bejan said. "If the winners' podium is to include athletes of all sizes, then speed competitions might have to be divided into weight categories. Larger athletes lift, push and punch harder than smaller athletes, and this led to the establishment of weight classes in certain sports, like boxing, wrestling or weight-lifting.


Toxin Detection As Close As An Inkjet Printer  

Posted by Zaib

If that office inkjet printer has become just another fixture, it's time to take a fresh look at it. Similar technology may soon be used to develop paper-based biosensors that can detect certain harmful toxins that can cause food poisoning or be used as bioterrorism agents.


In a paper published in the July issue of Analytical Chemistry, John Brennan and his research team at McMaster University, working with the Sentinel Bioactive Paper Network, describe a method for printing a toxin-detecting biosensor on paper using a FujiFilm Dimatix Materials Printer.

The researchers demonstrated the concept on the detection of acetylcholinesterase (AChE) inhibitors such as paraoxon and aflatoxin B1 on paper using a "lateral flow" sensing approach similar to that used in a home pregnancy test strip.

The process involves formulating an ink like the one found in computer printer cartridges but with special additives to make the ink biocompatible. An ink composed of biocompatible silica nanoparticles is first deposited on paper, followed by a second ink containing the enzyme; the resulting bio-ink forms a thin film of enzyme entrapped in the silica on the paper. When the enzyme is exposed to a toxin, reporter molecules in the ink change colour in a manner that depends on the concentration of the toxin in the sample.
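The colour-to-concentration step works like any enzyme-inhibition assay. A minimal sketch (Python, with hypothetical absorbance numbers rather than the McMaster calibration): the fractional loss of colour signal relative to an uninhibited strip gives percent inhibition, which a calibration curve then maps to an approximate toxin concentration.

    # Hypothetical readings, not the McMaster calibration:
    baseline_signal = 0.80  # colour signal with fully active enzyme
    sample_signal = 0.36    # colour signal after exposure to the sample

    inhibition_pct = 100 * (1 - sample_signal / baseline_signal)

    # A calibration curve (inhibition vs. known toxin concentrations)
    # converts this percentage into an approximate concentration.
    print(f"enzyme inhibition: {inhibition_pct:.0f}%")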

This simple and cost-effective method of adhering biochemical reagents to paper is expected to bring the concept of bioactive paper a significant step closer to commercialization. The goal for bioactive paper is to provide a rapid, portable, disposable and inexpensive way of detecting harmful substances, including toxins, pathogens and viruses, without the need for sophisticated instrumentation. The research showed that the printed enzyme retains full activity for at least two months when stored properly, suggesting that such sensor strips should have a good shelf life.

Portable bio-sensing papers are expected to be extremely useful in monitoring environmental and food-based toxins, as well as in remote settings in less industrialized countries where simple bioassays are essential for the first stages of detecting disease.

Applications for bioactive paper also include clinical applications in neuroscience, drug assessment, and pharmaceutical development.


Human-like Vision Lets Robots Navigate Naturally  

Posted by Zaib

A robotic vision system that mimics key visual functions of the human brain promises to let robots manoeuvre quickly and safely through cluttered environments, and to help guide the visually impaired.


It’s something any toddler can do – cross a cluttered room to find a toy.

It's also one of those seemingly trivial skills that have proved to be extremely hard for computers to master. Analysing shifting and often-ambiguous visual data to detect objects and separate their movement from one’s own has turned out to be an intensely challenging artificial intelligence problem.

Three years ago, researchers at the European-funded research consortium Decisions in Motion (http://www.decisionsinmotion.org/) decided to look to nature for insights into this challenge.

In a rare collaboration, neuro- and cognitive scientists studied how the visual systems of advanced mammals, primates and people work, while computer scientists and roboticists incorporated their findings into neural networks and mobile robots.

The approach paid off. Decisions in Motion has already built and demonstrated a robot that can zip across a crowded room guided only by what it “sees” through its twin video cameras, and is hard at work on a head-mounted system to help visually impaired people get around.

“Until now, the algorithms that have been used are quite slow and their decisions are not reliable enough to be useful,” says project coordinator Mark Greenlee. “Our approach allowed us to build algorithms that can do this on the fly, that can make all these decisions within a few milliseconds using conventional hardware.”

How do we see movement?

The Decisions in Motion researchers used a wide variety of techniques to learn more about how the brain processes visual information, especially information about movement.

These included recording individual neurons and groups of neurons firing in response to movement signals, functional magnetic resonance imaging to track the moment-by-moment interactions between different brain areas as people performed visual tasks, and neuropsychological studies of people with visual processing problems.

The researchers hoped to learn more about how the visual system scans the environment, detects objects, discerns movement, distinguishes between the independent movement of objects and the organism’s own movements, and plans and controls motion towards a goal.

One of their most interesting discoveries was that the primate brain does not just detect and track a moving object; it actually predicts where the object will go.

“When an object moves through a scene, you get a wave of activity as the brain anticipates its trajectory,” says Greenlee. “It’s like feedback signals flowing from the higher areas in the visual cortex back to neurons in the primary visual cortex to give them a sense of what’s coming.”

Greenlee compares what an individual visual neuron sees to looking at the world through a peephole. Researchers have known for a long time that high-level processing is needed to build a coherent picture out of a myriad of those tiny glimpses. What's new is the importance of strong anticipatory feedback for perceiving and processing motion.

“This proved to be quite critical for the Decisions in Motion project,” Greenlee says. “It solves what is called the ‘aperture problem’, the problem of the neurons in the primary visual cortex looking through those little peepholes.”

Building a better robotic brain

Armed with a better understanding of how the human brain deals with movement, the project’s computer scientists and roboticists went to work. Using off-the-shelf hardware, they built a neural network with three levels mimicking the brain’s primary, mid-level, and higher-level visual subsystems.

They used what they had learned about the flow of information between brain regions to control the flow of information within the robotic “brain”.

“It’s basically a neural network with certain biological characteristics,” says Greenlee. “The connectivity is dictated by the numbers we have from our physiological studies.”

The computerised brain controls the behaviour of a wheeled robotic platform supporting a moveable head and eyes, in real time. It directs the head and eyes where to look, tracks its own movement, identifies objects, determines if they are moving independently, and directs the platform to speed up, slow down and turn left or right.
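For intuition, here is a toy version (Python) of the classic bio-inspired "balance the flow" steering heuristic; it is an assumed simplification for illustration, not the project's actual multi-level network. Nearby obstacles generate large optical flow, so the robot turns away from the side with more flow and slows down as total flow rises.

    import numpy as np

    def steer(flow, gain=1.0, base_speed=1.0):
        """flow: (H, W) array of optical-flow magnitudes from the camera."""
        half = flow.shape[1] // 2
        left, right = flow[:, :half].mean(), flow[:, half:].mean()
        turn = gain * (left - right)             # turn away from high flow
        speed = base_speed / (1 + left + right)  # slow near obstacles
        return turn, speed

    # Strong flow on the left (a nearby obstacle) -> steer right, slow down
    flow = np.hstack([np.full((4, 4), 2.0), np.full((4, 4), 0.5)])
    print(steer(flow))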

Greenlee and his colleagues were intrigued when the robot found its way to its first target – a teddy bear – just like a person would, speeding by objects that were at a safe distance, but passing nearby obstacles at a slower pace.

“That was very exciting,” Greenlee says. “We didn’t program it in – it popped out of the algorithm.”

In addition to improved guidance systems for robots, the consortium envisions a lightweight system that could be worn like eyeglasses by visually or cognitively impaired people to boost their mobility. One of the consortium partners, Cambridge Research Systems, is developing a commercial version of this, called VisGuide.

Decisions in Motion received funding from the ICT strand of the EU’s Sixth Framework Programme for research. The project’s work was featured in a video by the New Scientist in February this year.


Arctic Climate Under Greenhouse Conditions In The Late Cretaceous  

Posted by Zaib

New evidence for ice-free summers with intermittent winter sea ice in the Arctic Ocean during the Late Cretaceous – a period of greenhouse conditions - gives a glimpse of how the Arctic is likely to respond to future global warming.


Records of past environmental change in the Arctic should help predict its future behaviour. The Late Cretaceous, the period between 100 and 65 million years ago leading up to the extinction of the dinosaurs, is crucial in this regard because levels of carbon dioxide (CO2) were high, driving greenhouse conditions. But scientists have disagreed about the climate at this time, with some arguing for low Arctic late Cretaceous winter temperatures (when sunlight is absent during the Polar night) as against more recent suggestions of a somewhat milder 15°C mean annual temperature.

Writing in Nature, Dr Andrew Davies and Professor Alan Kemp of the University of Southampton's School of Ocean and Earth Science based at the National Oceanography Centre, Southampton, along with Dr Jennifer Pike of Cardiff University take this debate a step forward by presenting the first seasonally resolved Cretaceous sedimentary record from the Alpha Ridge of the Arctic Ocean.

The scientists analysed the remains of diatoms – tiny free-floating plant-like organisms - preserved in late Cretaceous marine sediments. In modern oceans, diatoms play a dominant role in the 'biological carbon pump' by which carbon dioxide is drawn down from the atmosphere through photosynthesis and a proportion of it exported to the deep ocean. Unfortunately, the role of diatoms in the Cretaceous oceans has until now been unclear, in part because they are often poorly preserved in sediments.

But the researchers struck lucky. "With remarkable serendipity," they explain, "successive US and Canadian expeditions that occupied floating ice islands above the Alpha Ridge of the Arctic Ocean recovered cores containing shallow-buried upper Cretaceous diatom ooze with superbly preserved diatoms." This has allowed them to conduct a detailed study of the diatom fossils using sophisticated electron microscopy techniques. In the modern ocean, scientists use floating sediment traps to collect and study settling material. These electron microscope techniques that have been pioneered by Professor Kemp's group at Southampton have unlocked a 'palaeo-sediment trap' to reveal information about Late Cretaceous environmental conditions.

They find that the most informative sediment core samples display a regular alternation of microscopically thin layers composed of two distinctly different diatom assemblages, reflecting seasonal changes. Their analysis clearly demonstrates that seasonal blooming of diatoms was not related to the upwelling of nutrients, as has been previously suggested. Rather, production occurred within a stratified water column, indicative of ice-free summers. These summer blooms comprised specially adapted species resembling those of the modern North Pacific Subtropical Gyre, or preserved in relatively recent organically rich Mediterranean sediments called 'sapropels'.

The sheer number of diatoms found in the Late Cretaceous sediment cores indicates exceptional abundances equalling modern values for the most productive areas of the Southern Ocean. "This Cretaceous production, dominated by diatoms adapted to stratified conditions of the polar summer may also be a pointer to future trends in the modern ocean," say the researchers: "With increasing CO2 levels and global warming giving rise to increased ocean stratification, this style of (marine biological) production may become of increasing importance."

However, thin accumulations of land-derived sediment within the diatom ooze are consistent with the presence of intermittent sea ice in the winter, a finding that supports "a wide body of evidence for low Arctic late Cretaceous winter temperatures rather than recent suggestions of a 15°C mean annual temperature at this time." The size distribution of clay and sand grains in the sediment points to the formation of sea ice in shallow coastal seas during autumn storms, while the absence of larger drop-stones suggests that the winters, although cold, were not cold enough to support thick glacial ice or large areas of anchored ice.

Commenting on the findings, Professor Kemp said: "Although seasonally-resolved records are rarely preserved, our research shows that they can provide a unique window into past Earth system behaviour on timescales immediately comparable and relevant to those of modern concern."

Davies, A., Kemp, A. S. & Pike, J. Late Cretaceous seasonal ocean variability from the Arctic. Nature 460, 254-258 (9 July 2009).

http://www.nature.com/nature/journal/v460/n7252/full/nature08141.html

The research was supported by the Natural Environment Research Council.


Solar Power: New SunCatcher Power System Ready For Commercial Production In 2010  

Posted by Zaib

Stirling Energy Systems (SES) and Tessera Solar recently unveiled four newly designed solar power collection dishes at Sandia National Laboratories’ National Solar Thermal Test Facility (NSTTF). Called SunCatchers™, the new dishes have a refined design that will be used in commercial-scale deployments of the units beginning in 2010.


“The four new dishes are the next-generation model of the original SunCatcher system. Six first-generation SunCatchers built over the past several years at the NSTTF have been producing up to 150 kW [kilowatts] of grid-ready electrical power during the day,” says Chuck Andraka, the lead Sandia project engineer. “Every part of the new system has been upgraded to allow for a high rate of production and cost reduction.”

Sandia’s concentrating solar-thermal power (CSP) team has been working closely with SES over the past five years to improve the system design and operation.

The modular CSP SunCatcher uses precision mirrors attached to a parabolic dish to focus the sun’s rays onto a receiver, which transmits the heat to a Stirling engine. The engine is a sealed system filled with hydrogen. As the gas heats and cools, its pressure rises and falls. The change in pressure drives the piston inside the engine, producing mechanical power, which in turn drives a generator and makes electricity.
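For a feel for the thermodynamics just described, here is an idealized Stirling-cycle calculation (Python); all parameters are assumptions for illustration, not SunCatcher specifications. With perfect regeneration, an ideal Stirling cycle reaches the Carnot efficiency between its hot and cold temperatures.

    import math

    n = 1.0                       # moles of hydrogen working gas (assumed)
    R = 8.314                     # gas constant, J/(mol*K)
    T_hot, T_cold = 990.0, 330.0  # receiver / rejection temperatures, K (assumed)
    volume_ratio = 2.0            # V_max / V_min (assumed)

    # Net work per cycle of an ideal, perfectly regenerated Stirling cycle:
    # W = n * R * (T_hot - T_cold) * ln(V_max / V_min)
    work_J = n * R * (T_hot - T_cold) * math.log(volume_ratio)

    # Ideal efficiency equals the Carnot limit between the two temperatures
    efficiency = 1 - T_cold / T_hot

    print(f"work/cycle ~ {work_J/1000:.1f} kJ, ideal efficiency ~ {efficiency:.0%}")

Real engines fall well short of this ideal, which is why the record system efficiency reported below, 31.25 percent from sunlight to grid power, is considered remarkable.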

The new SunCatcher is about 5,000 pounds lighter than the original, is round instead of rectangular to allow for more efficient use of steel, has improved optics, and consists of 60 percent fewer engine parts. The revised design also has fewer mirrors — 40 instead of 80. The reflective mirrors are formed into a parabolic shape using stamped sheet metal similar to the hood of a car. The mirrors are made by using automobile manufacturing techniques. The improvements will result in high-volume production, cost reductions, and easier maintenance.

Among Sandia’s contributions to the new design was a tool that can verify the mirrors’ performance in less than 10 seconds, a measurement that took an hour on the earlier design.

“The new design of the SunCatcher represents more than a decade of innovative engineering and validation testing, making it ready for commercialization,” says Steve Cowman, Stirling Energy Systems CEO. “By utilizing the automotive supply chain to manufacture the SunCatcher, we’re leveraging the talents of an industry that has refined high-volume production through an assembly line process. More than 90 percent of the SunCatcher components will be manufactured in North America.”

In addition to improved manufacturability and easy maintenance, the new SunCatcher minimizes both cost and land use and has numerous environmental advantages, Andraka says.

“They have the lowest water use of any thermal electric generating technology, require minimal grading and trenching, require no excavation for foundations, and will not produce greenhouse gas emissions while converting sunlight into electricity,” he says.

Tessera Solar, the developer and operator of large-scale solar projects using the SunCatcher technology and sister company of SES, is building a 60-unit plant in either Arizona or California that will generate 1.5 MW (megawatts) by the end of the year. One megawatt powers about 800 homes. The proprietary solar dish technology will then be deployed in two of the world’s largest solar generating plants in Southern California – with San Diego Gas & Electric in the Imperial Valley and Southern California Edison in the Mojave Desert – in addition to the recently announced project with CPS Energy in West Texas. The projects are expected to produce 1,000 MW by the end of 2012.
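
The deployment figures quoted above allow a quick sanity check, sketched below in Python; all numbers come straight from this article, including its rule of thumb that one megawatt powers about 800 homes.

    # Sanity-check arithmetic from the figures quoted in this article.
    units = 60
    plant_mw = 1.5
    homes_per_mw = 800                 # the article's rule of thumb

    kw_per_dish = plant_mw * 1000 / units
    print(f"Per-dish output: {kw_per_dish:.0f} kW")  # 25 kW, consistent with
                                                     # six dishes making 150 kW
    print(f"A 1.5 MW plant serves about {plant_mw * homes_per_mw:.0f} homes")
    print(f"1,000 MW by 2012 would serve about {1000 * homes_per_mw:,} homes")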

Last year one of the original SunCatchers set a new solar-to-grid system conversion efficiency record by achieving a 31.25 percent net efficiency rate, toppling the old 1984 record of 29.4 percent.


Hydrogen Technology Steams Ahead  

Posted by Zaib in

Could the cars and laptops of the future be fuelled by old chip fat? Engineers at the University of Leeds believe so, and are developing an energy efficient, environmentally-friendly hydrogen production system. The system enables hydrogen to be extracted from waste materials, such as vegetable oil and the glycerol by-product of bio-diesel. The aim is to create the high purity hydrogen-based fuel necessary not only for large-scale power production, but also for smaller portable fuel cells.


Dr Valerie Dupont from the School of Process, Environmental and Materials Engineering (SPEME) says: “I can foresee a time when the processes we are investigating could help ensure that hydrogen is a mainstream fuel.

“We are investigating the feasibility of creating a uniquely energy efficient method of hydrogen production which uses air rather than burners to heat the raw product. Our current research will improve the sustainability of this process and reduce its carbon emissions.”

The Engineering and Physical Sciences Research Council (EPSRC) has awarded the University a grant of over £400k as part of a 12-institution consortium known as SUPERGEN Sustainable Hydrogen Delivery.

Hydrogen is widely considered to be a potential replacement for fossil fuels, but it is costly to extract. There are also often high levels of greenhouse gases emitted during conventional methods of production.

The system being developed at Leeds – known as Unmixed and Sorption-Enhanced Steam Reforming – mixes waste products with steam to release hydrogen, and is potentially cheaper, cleaner and more energy efficient than conventional methods.

A hydrocarbon-based fuel from plant or waste sources is mixed with steam in a catalytic reactor, generating hydrogen and carbon dioxide along with excess water. The water is then easily condensed by cooling and the carbon dioxide is removed in-situ by a solid sorbent material.
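
To make the chemistry concrete, here is a small Python sketch of the ideal, complete-conversion stoichiometry for steam reforming of glycerol, the bio-diesel by-product mentioned above. The overall reaction and the choice of calcium oxide as the CO2 sorbent are textbook assumptions for this class of process, not details taken from the Leeds design.

    # Ideal glycerol steam reforming (assuming complete conversion):
    #     C3H8O3 + 3 H2O -> 3 CO2 + 7 H2
    # In sorption-enhanced reforming a solid sorbent (classically CaO)
    # captures the CO2 in situ (CaO + CO2 -> CaCO3), pulling the
    # equilibrium toward more hydrogen.
    M_GLYCEROL = 92.09   # molar mass, g/mol
    M_H2 = 2.016         # molar mass, g/mol
    H2_PER_MOL = 7       # moles of H2 per mole of glycerol (ideal)

    yield_by_mass = H2_PER_MOL * M_H2 / M_GLYCEROL
    print(f"Ideal H2 yield: {yield_by_mass:.1%} of feed mass")        # ~15.3%
    print(f"1 kg of glycerol -> {yield_by_mass * 1000:.0f} g of H2")  # ~153 g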

Dr Dupont says: “It’s becoming increasingly necessary for scientists devising new technologies to limit the amount of carbon dioxide they release. This project takes us one step closer to these goals – once we have technologies that enable us to produce hydrogen sustainably, the infrastructure to support its use will grow.”

“We firmly believe that these advanced steam reforming processes have great potential for helping to build the hydrogen economy. Our primary focus now is to ensure the materials we rely on - both to catalyse the desired reaction and to capture the carbon dioxide – can be used over and over again without losing their efficacy.”


By Manipulating Oxygen, Scientists Coax Bacteria Into Never-Before-Seen Solitary Wave  

Posted by Zaib in

Bacteria know that they are too small to make an impact individually. So they wait, they multiply, and then they engage in behaviors that are only successful when all cells participate in unison. There are hundreds of behaviors that bacteria carry out in such communities. Now researchers at Rockefeller University have discovered one that has never been observed or described before in a living system.


In research published in the May 12 issue of Physical Review Letters, Albert J. Libchaber, head of the Laboratory of Experimental Condensed Matter Physics, and his colleagues, including first author Carine Douarche, a postdoctoral associate in the lab, show that when oxygen penetrates a sample of oxygen-deprived Escherichia coli bacteria, they do something that no living community had been seen to do before: The bacteria accumulate and form a solitary propagating wave that moves with constant velocity and without changing shape. But while the front is moving, each bacterium in it isn’t moving at all.

“It’s like a soliton,” says Douarche. “A self-reinforcing solitary wave.”

Unlike the undulating pattern of an ocean wave, which flattens or topples over as it approaches the shore, a soliton is a solitary, self-sustaining wave that behaves like a particle. For example, when two solitons collide, they merge into one and then separate into two with the same shape and velocity as before the collision. The first soliton was observed in 1834 at a canal in Scotland by John Scott Russell, a scientist who was so fascinated with what he saw that he followed it on horseback for miles and then set up a 30-foot water tank in his yard where he successfully simulated it, sparking considerable controversy.

The work began when Libchaber, Douarche and their colleagues placed E. coli bacteria in a sealed square chamber and measured the oxygen concentration and the density of bacteria every two hours until the bacteria consumed all the oxygen. (Bacteria, unlike humans, don’t die when starved for oxygen, but switch to a nonmotile state from which they can be revived.) The researchers then cracked the seals of the chamber, allowing oxygen to flow in.

The result: The motionless bacteria, which had spread out uniformly, began to move; first those around the perimeter, nearest to the seals, and then those further away. A few hours later, the bacteria began to spatially segregate into two domains of moving and nonmoving bacteria and pile up into a ring at the border of low-oxygen and no-oxygen. There they formed a solitary wave that propagated slowly but steadily toward the center of the chamber without changing its shape.

The effect, which lasted for more than 15 hours and covered a considerable distance (for bacteria), could not be explained by the expression of new proteins or by the addition of energy to the system. Instead, the creation of the front depends on the dispersion of the active bacteria and on the time it takes oxygen-starved bacteria to stop moving completely: 15 minutes. The former allows the bacteria to propagate at a constant velocity, while the latter keeps the front from changing shape.

However, a propagating front of bacteria wasn’t all that was created. “To me, the biggest surprise was that the bacteria control the flow of oxygen in the regime,” says Libchaber. “There’s a propagating front of bacteria, but there is a propagating front of oxygen, too. And the bacteria, by absorbing the oxygen, control it very precisely.”

Oxygen, Libchaber explains, is one of the fastest-diffusing molecules, moving from regions of high concentration to regions of low concentration. Pure diffusion, however, advances quickly at first and then slows, covering distance in proportion to the square root of elapsed time. That is not what they observed. Rather, oxygen penetrated the chamber very slowly and in a linear manner. Equal time, equal distance. “This pattern is not due to biology,” says Libchaber. “It has to do with the laws of physics. And it is organized in such an elegant way that the only thing it tells us is that we have a lot to learn from bacteria.”
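
To see why a constant-speed oxygen front is surprising, compare it with the square-root law of pure diffusion. The short Python sketch below uses a textbook diffusion coefficient for oxygen in water; the times are arbitrary illustrations, not the experiment's measurements.

    import math

    # Pure diffusion covers distance ~ sqrt(2*D*t): doubling the distance
    # takes four times as long. A front advancing at constant speed, as
    # observed in the bacterial chamber, breaks this pattern.
    D = 2e-9   # diffusion coefficient of O2 in water, m^2/s (textbook value)

    for t in (60, 3600, 15 * 3600):   # one minute, one hour, a 15-hour run
        x_mm = math.sqrt(2 * D * t) * 1000
        print(f"t = {t:6d} s  ->  typical diffusion distance = {x_mm:5.2f} mm")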


Water Webs: Connecting Spiders, Residents In The Southwest  

Posted by Zaib in

If you are a cricket on the San Pedro River in Arizona during a dry season, your nighttime ramblings to eat leaves are more likely to end in an ambush by thirsty wolf spiders – or so suggests a June 19 study published in the journal Ecology and featured as an editor's choice in the journal Science.


A potential horror story for any cricket. However, it is also a tale of water limitation that looks beyond the way most ecosystem studies are framed. Much current work on the relationships between predators and prey is based on nutrient or energy limitation – via a food web.

The research, performed by graduate student Kevin McCluney and associate professor John Sabo in the School of Life Sciences at Arizona State University, demonstrates that under restricted water conditions, crickets consume more moist green leaves and wolf spiders consume more crickets. This distinct increase is driven by water limitation and the connectivity between organisms based on water – a water web.

With water the key ingredient for life, especially in the desert, why the focus on crickets, spiders and water webs? Studies of insects and riparian ecosystems such as these lend specific insights into how arid and semi-arid environments and their flora and fauna may be affected by global climate change.

The authors note: "Water seems to be the ecological currency governing consumption behavior at multiple trophic levels, which indicates a role for water in understanding effects of global change on animal communities."

This article coincides with the June 18 release of the national report "Global Climate Change Impacts in the United States," funded by the National Science and Technology Council and authored by members of the U.S. Global Change Research Program, including ASU professor Nancy Grimm. The report contains a special section on the Southwest. Major changes in soil moisture and precipitation are expected as a result of climate change. McCluney and Sabo's study highlights one way ecological communities may be affected.

"Kevin's experiments suggest that by understanding water webs, we can find clues about how biodiversity may change as our region experiences drier climates under climate change," adds Sabo.

In that way, this study of crickets and spiders offers a looking glass into a future that extends much farther than the banks of one of the last undammed perennial rivers in the Southwest and the vibrant riparian community it supports.

"Drylands constitute more than one third of the land mass on Earth," McCluney notes. "While further testing is needed, our study may have implications for other ecosystems in light of recent reports of droughts and rivers drying up globally."

In addition to examining the water ties that bind inhabitants of terrestrial systems, Sabo and his students also examine aquatic ecosystems and the effects of human activity and water policy in the Southwest. In 2008, with funding from the National Science Foundation, Sabo launched a series of workshops, held at the National Center for Ecological Analysis and Synthesis at the University of California, Santa Barbara, to examine the impacts of dams on waterways in the United States. Participants are working to define the ecological footprint of dams: their effects on water quantity and quality, on the number of native and non-native species in rivers, on the salinity of soils in some of the most productive agricultural areas, and on the demand for irrigated water by the 100 largest cities in the United States. Along with studies by his ASU colleagues in the Global Institute of Sustainability and the College of Liberal Arts and Sciences, such as Juliet Stromberg, author of "The Ecology and Conservation of the San Pedro," Sabo seeks to illuminate the complex web of relationships behind sustainable management of water resources for both human and biodiversity needs.


Conversing Helps Language Development More Than Reading Alone  

Posted by Zaib in

Adult-child conversations have a more significant impact on language development than exposing children to language through one-on-one reading alone, according to a new study in the July issue of Pediatrics, the journal of the American Academy of Pediatrics.


"Pediatricians and others have encouraged parents to provide language input through reading, storytelling and simple narration of daily events," explains study's lead author, Dr. Frederick J. Zimmerman, associate professor in the Department of Health Services in the UCLA School of Public Health. "Although sound advice, this form of input may not place enough emphasis on children's role in language-based exchanges and the importance of getting children to speak as much as possible."

The study of 275 families of children ages 0-4 was designed to test factors that contribute to the language development of infants and toddlers. Participants' exposure to adult speech, child speech and television was measured using a small digital language recorder/processor known as the LENA System. This technology allowed researchers to hear what was actually going on in a child's language environment, giving them access to valuable new insights.

The study found that back-and-forth conversation was strongly associated with future improvements in the child's language score. By contrast, adult monologuing, such as monologic reading, was more weakly associated with language development. TV viewing had no effect on language development, positive or negative.

Zimmerman adds, "What's new here is the finding that the effect of adult-child conversations was roughly six times as potent at fostering good language development as adult speech input alone."

Each day, children hear an average of about 13,000 words spoken to them by adults and participate in about 400 conversational turns with adults. More conversations mean more opportunities for mistakes and therefore more opportunities for valuable corrections. They also give children a chance to practice new vocabulary.

Parents should be encouraged not only to provide language input to their children through reading or storytelling but also to engage their children in two-sided conversations, the study concludes.

"Talk is powerful, but what's even more powerful is engaging a child in meaningful interactions — the 'give and take' that is so important to the social, emotional and cognitive development of infants and toddlers," says Dr. Jill Gilkerson, language research director at LENA Foundation and a study co-author.

"It is not enough to speak to children," Zimmerman adds. "Parents should also engage them in conversation. Kids love to hear you speak, but they thrive on trying speech out for themselves. Give them a chance to say what's on their minds, even if it's 'goo goo gah.'"


Delinquent Behavior Among Boys 'Contagious,' Study Finds  

Posted by Zaib in

Impulsive boys who receive inadequate supervision, grow up in poor families and have deviant friends are more likely to commit criminal acts that land them in juvenile court, according to a new study published in the Journal of Child Psychology and Psychiatry. The most surprising finding of the 20-year study, conducted by researchers from the Université de Montréal and the University of Genoa, was that help provided by the juvenile justice system substantially increased the risk of the boys engaging in criminal activities during early adulthood.


"For boys who had been through the juvenile justice system, compared to boys with similar histories without judicial involvement, the odds of adult judicial interventions increased almost seven-fold," says study co-author Richard E. Tremblay, a professor of psychology, pediatrics and psychiatry at the Université de Montréal and a researcher at the Sainte-Justine University Hospital Research Center.

The research team recruited kindergarten boys at risk for delinquent behavior from 53 schools in the poorest neighbourhoods of Montreal. Some 779 participants were interviewed annually from age 10 to 17. By their mid-20s, some 17.6 percent of participants had acquired adult criminal records for infractions that included homicide (17.9 percent), arson (31.2 percent), prostitution (25.5 percent), drug possession (16.4 percent) and impaired driving (8.8 percent).

"The more intense the help given by the juvenile justice system, the greater was its negative impact," Dr. Tremblay stresses. "Our findings take on even greater importance given that the juvenile justice system in the province of Quebec has the reputation of being among the best. Most countries spend considerable financial resources to fund programs and institutions that group deviant youths together in order to help them. The problem is that delinquent behavior is contagious, especially among adolescents. Putting deviant adolescents together creates a culture of deviance, which increases the likelihood of continued criminal behavior."

"Two solutions exist for this problem," adds Dr Tremblay. "The first is to implement prevention programs before adolescence when problem children are more responsive. The second is to minimize the concentration of problem youths in juvenile justice programs, thereby reducing the risk of peer contagion."

This study was funded by the Canadian Institutes of Health Research, the Fonds de la recherche en santé du Québec, the Fonds de recherche sur la société et la culture, and the Social Sciences and Humanities Research Council.


Handle With Care: Telomeres Resemble DNA Fragile Sites  

Posted by Zaib in

Telomeres, the repetitive sequences of DNA at the ends of linear chromosomes, have an important function: They protect vulnerable chromosome ends from molecular attack. Researchers at Rockefeller University now show that telomeres have their own weakness. They resemble unstable parts of the genome called fragile sites where DNA replication can stall and go awry. But what keeps our fragile telomeres from falling apart is a protein that ensures the smooth progression of DNA replication to the end of a chromosome.


The research, led by Titia de Lange, head of the Laboratory of Cell Biology and Genetics, and first author Agnel Sfeir, a postdoctoral associate in the lab, suggests a striking similarity between telomeres and common fragile sites, parts of the genome where breaks tend to occur, albeit infrequently. (Humans have 80 common fragile sites, many of which have been linked to cancer.) De Lange and Sfeir found that these newly discovered fragile sites make it difficult for DNA replication to proceed, a discovery that unveils a new replication problem posed by telomeres.

At the center of the discovery is a protein known as TRF1, which de Lange, in an effort to understand how telomeres protect chromosome ends, discovered in 1995. Using a conditional mouse knockout, de Lange and Sfeir have now revealed that TRF1, which is part of a six-protein complex called shelterin, enables DNA replication to drive smoothly through telomeres with the aid of two other proteins.

“Telomeric DNA has a repetitive sequence that can form unusual DNA structures when the DNA is unwound during DNA replication,” says de Lange. “Our data suggest that TRF1 brings in two proteins that can take out these structures in the telomeric DNA. In other words, TRF1 and its helpers remove the bumps in the road so that the replication fork can drive through.”

The work, published in the July 10 issue of Cell, began when Sfeir deleted TRF1 and saw that the telomeres resembled common fragile sites, suggesting that TRF1 protects telomeres from becoming fragile. Instead of a continuous string of DNA, the telomeres were broken into fragments of twos and threes. To see if the replication fork stalls at telomeres, de Lange and Sfeir joined forces with Carl L. Schildkraut, a researcher at Albert Einstein College of Medicine in New York City. Using a technique called SMARD, the researchers observed the dynamics of replication across individual DNA molecules — the first time this technique has been used to study telomeres. In the absence of TRF1, the fork often stalled for a considerable amount of time.

The only other known replication problem posed by telomeres was solved in 1985, when it was shown that the enzyme telomerase elongates telomeres, which shorten during every cell division. The second problem posed by telomeres, the so-called end-protection problem, was solved by de Lange and her colleagues when they found that shelterin protects the ends of linear chromosomes, which look like damaged DNA, from unnecessary repair. Working with TRF1, the very first shelterin protein to be identified, de Lange and Sfeir have not only unveiled a completely unanticipated replication problem at telomeres but also shown how it is solved.

The research lays new groundwork for the study of common fragile sites throughout the genome, explains de Lange. “Fragile sites have always been hard to study because no specific DNA sequence precedes or follows them,” she says. “In contrast, telomeres represent fragile sites with a known sequence, which may help us understand how common fragile sites break throughout the genome — and why.”

