Immune Molecules Prune Neural Links, Science
Summary: (1) In work described on page 2155, a team of neuroscientists suggests that class I major histocompatibility complex proteins, previously known for their role in controlling immune responses, also play a role in the nervous system. They are necessary for the formation of normal neuronal connections in a visual area of the brain during development, and later in life they are called into play in the hippocampus, a brain area involved in memory and learning. The work shows a completely unexpected function for these immune system molecules.
(2) Brain-Building MHCs
In some ways, the nervous system and the immune system solve
similar problems: They both have to distinguish and respond to an
extremely large array of input from the external world, and both
are exceedingly complex. Huh et al. (p. 2155;
see the news
story by Helmuth) show that class I major
histocompatibility complex (MHC) molecules, used by the immune
system to respond to antigens, are also necessary for accurate
assembly of the brain.
In mice genetically deficient for class I MHC molecules, the
neural connections between the retina and their targets in the
central nervous system are abnormal. Long-term potentiation, a
form of cellular learning, is enhanced, and another form,
long-term depression, is eliminated. The diversity and specificity of class I MHC molecules make them attractive candidates for a role in establishing neural connections.
Immune Molecules Prune Neural Links, Laura Helmuth, Science, 290(5499): 12/15/00, p. 2051
Requirement for Class I MHC in CNS Development and Plasticity, Gene S. Huh, Lisa M. Boulanger, Hongping Du, Patricio A. Riquelme, Tilmann M. Brotz, and Carla J. Shatz, Science, 290(5499): 12/15/00, p. 2155-2159
Synaptic Efficacy and the Transmission of Complex Firing Patterns Between Neurons, J. Neurophysiol.
Abstract: In central neurons, the summation of
inputs from presynaptic cells combined with the unreliability of
synaptic transmission produces incessant variations of the
membrane potential termed synaptic noise (SN). These fluctuations,
which depend on both the unpredictable timing of afferent
activities and quantal variations of postsynaptic potentials, have
defied conventional analysis. We show here that, when applied to
SN recorded from the Mauthner (M) cell of teleosts, a simple
method of nonlinear analysis reveals previously undetected
features of this signal including hidden periodic components. The
phase relationship between these components is compatible with the
notion that the temporal organization of events comprising this
noise is deterministic rather than random and that it is generated
by presynaptic interneurons behaving as coupled periodic
oscillators. Furthermore, a model of the presynaptic network shows
how SN is shaped both by activities in incoming inputs and by the
distribution of their synaptic weights expressed as mean quantal
contents of the activated synapses. In confirmation we found
experimentally that long-term tetanic potentiation (LTP), which
selectively increases some of these synaptic weights, permits
oscillating temporal patterns to be transmitted more effectively
to the postsynaptic cell. Thus the probabilistic nature of transmitter release, which governs the strength of synapses, may be critical for the transfer of complex timing information within neuronal networks.
Maze Navigation by Honeybees: Learning Path Regularity, Learn. Mem.
Abstract: We investigated the ability of
honeybees to learn mazes of four types: constant-turn mazes, in
which the appropriate turn is always in the same direction in each
decision chamber; zig-zag mazes, in which the appropriate turn is
alternately left and right in successive decision chambers;
irregular mazes, in which there is no readily apparent pattern to
the turns; and variable irregular mazes, in which the bees were
trained to learn several irregular mazes simultaneously. The bees
were able to learn to navigate all four types of maze. Performance
was best in the constant-turn mazes, somewhat poorer in the
zig-zag mazes, poorer still in the irregular mazes, and poorest in
the variable irregular mazes. These results demonstrate that bees
do not navigate such mazes simply by memorizing the entire
sequence of appropriate turns. Rather, performance in the various
configurations depends on the existence of regularity in the
structure of the maze and on the ease with which this regularity
is recognized and learned.
Gene Mutation Extends Lifespan In "I'm Not Dead Yet" Fruitflies, Natl. Inst. Aging/Science Daily
Excerpt: Mutating a single gene can double the
lifespan of fruitflies from 37 days to between 69 and 71 days,
while maintaining a high level of functioning and fertility. This
finding of a research team led by Stephen L. Helfand was supported
in part by the National
Institute on Aging (NIA), part of the National
Institutes of Health. Their study is reported in the December 14
issue of Science. The gene complex was named Indy as a joking
reference to the tag line from Monty Python and the Holy
Grail, "I'm not dead yet." This is the third
mutation in the fruitfly genome that is reported to extend
lifespan. According to Helfand, the Indy gene is associated with
the way that the body stores and uses energy.
In the film, the line is uttered by a supposed plague victim being hauled off for burial while still alive.
The researchers speculate that the way the Indy gene
mutation works to extend life and health may be via changes in the
normal metabolism of food. This link between altered metabolism
and life-span extension became the focus of Helfand's studies when
other laboratories showed that research animals receiving full
nutrition but lowered calorie intake, or caloric restriction,
lived longer. Although the mechanism by which caloric restriction
benefits longevity is not understood, Dr. Helfand suggests that it
is likely to involve changes in energy utilization. The Indy
fruitfly differs from other long-lived fruitflies by the direct,
rather than indirect, action of the altered gene on metabolism and
the use of food energy.
"What is interesting about this line of research is the
recurrence of the link between metabolism, caloric restriction and
longevity. This study points to the possibility that if you
genetically alter metabolism, you can alter lifespan," said Dr.
David Finkelstein, research director for metabolic regulation
research at the National Institute on Aging.
"While there is an 80 percent homology between the fruitfly
and human genomes, we are still many steps away from understanding
how caloric restriction may affect human lifespan," Finkelstein
Mutation Extends Lifespan In "I'm Not Dead Yet"
Institute on Aging (NIA),
Science Daily, 12/15/00
Flies May Hold Secrets Of
Pennisi, Science, 290(5499): 12/15/00, p.
Life-Span Conferred By Cotransporter Gene Mutations In
Rogina, Robert A. Reenan, Steven P. Nilsen, Stephen L.
Helfand, Science, 290(5499): 12/15/00, p.
The problem of how to arrive at a legitimate method for determining the democratic decision of the people of a modern society has been painfully demonstrated not only in the pitiful processes related to the counting of votes. The idealistic "one person, one vote" principle has not been implemented, and not only in the US, where, due to the antiquated electoral system, a vote for president in Alaska counts three times as much as a vote in Massachusetts. Nor is the US alone in using thresholds that introduce non-linearities into the vote-counting process. The same problem causes trouble in Europe, where European matters are decided by votes assigned to countries rather than to people. That raises the problem of how to take the different population sizes of the member countries into account.
Excerpt: As for the twin aims of efficiency and legitimacy,
the complex solutions proposed at the summit seem more likely to
hinder than help. With all the variations of voting weights for
big and small, increased thresholds necessary to reach a qualified
majority decision on any issue, and failure to extend
significantly the subjects on which majority voting can be used,
future decision-making is likely to be more difficult, not less.
By opting for greater complexity in order to reconcile their
differences, the EU leaders will make their system less
transparent, and less open to democratic control, whether by the
European parliament, or by national parliaments. Simplification of
the rules should have been the order of the day. (…)
Have We Overdone Deregulation and Privatization?, HBS Working Knowledge.
Excerpt: During the Jimmy Carter administration,
Congress enacted legislation that had become known as "the Federal
Express bill." It was designed to test the idea of deregulation by
allowing air freight carriers to fly planes of any size on any
routes, without federal price controls.
The bill accommodated the persistent lobbying activities of
one Fred Smith, the young CEO of Federal Express, who had been
required under previous regulation to use small, inefficient
aircraft to transport freight or else submit to stringent
government regulation. It was regarded by Congress as an
experiment carried out in a small, obscure industry which, if
unsuccessful, would have little economic impact.
Little did Congress realize that true believers in
deregulation, like Alfred Kahn, a Cornell economics professor whom
Carter had appointed as chairman of the Civil Aeronautics Board,
would champion the extension of the idea to the entire airline
industry, and then into other areas such as brokerage fees. (…)
In recent years, the deregulation movement has spread to
industries with which consumers interact daily, such as electric
power and telephone service, whose dependability and equitable
pricing were generally left to government and not thought much about.
Unchained Value: The New Logic of Digital Business, HBS Working Knowledge.
Excerpt: In Unchained Value, Internet expert Mary
Cronin introduces a radically new strategic model for organization
that she calls the "digital value system." It is focused not on
static, internally focused "chains" but on dynamic, external webs
of relationships that take full advantage of the power,
flexibility, and opportunity of the digital arena.
One of the keys to the new model, she writes, is an
understanding that the old strategy of hoarding information to
maximize its value is no longer appropriate. "The Internet
undermines competitive strategy based on information scarcity,"
she writes, "by making every Web site into a potential channel for
free distribution of anything and everything that can be digitized."
Common Knowledge: How Companies Thrive by Sharing What They Know, HBS Working Knowledge
Excerpt: Pervading the idea of knowledge sharing
are three myths. Perhaps myth is the wrong term—maybe they
are just assumptions that seem reasonable at first glance, but
when acted on send organizations to a dead end. Many of the
organizations I studied started with one or more of these
assumptions and then had to make corrections to get back on track.
The three myths are (1) build it and they will come, (2)
technology can replace face-to-face, and (3) first you have to
create a learning culture.
Managers who want to make the knowledge in their
organizations more available often have a mental image of a large
warehouse that contains all of that knowledge. They envision those
who are looking for knowledge going to the warehouse and taking
out what they need. The idea has a lot of intuitive appeal.
Knowledge seems so amorphous that the notion of its being
documented and located in a central place offers a comforting
sense of control and manageability.
The Causes of 20th Century Warming, Science
Summary: Global air surface temperatures
increased by about 0.6°C during the 20th century,
but as Zwiers and Weaver discuss in their Perspective, the warming
was not continuous. Two distinct periods of warming, from 1910 to
1945 and since 1976, were separated by a period of very gradual
cooling. The authors highlight the work by Stott et al., who have
performed the most comprehensive simulation of 20th century
climate to date. The agreement between observed and simulated
temperature variations strongly suggests that forcing from
anthropogenic activities, moderated by variations in solar and
volcanic forcing, has been the main driver of climate change
during the past century.
Global annual mean near-surface air temperature increased
during the 20th century in two major steps, the first between
roughly 1910 and 1940 and the second (which is still continuing)
after about 1975. It has been difficult to understand the causes
of this overall rise, partly because anthropogenic forcing by
fossil fuel combustion has grown steadily during that interval and
partly because it was not as important a forcing factor in the
first half of the century as in the second. Stott et al. (p. 2133;
see the Perspective by Zwiers and Weaver) have used a
state-of-the-art climate model, HadCM3, to examine the reasons for
this increase. An ensemble of four simulations of the last 140
years indicates that a combination of natural climate variations
and human-induced variability can explain the observed temperature
rise, and that most of the multidecadal-scale global variations are not due to internal variability of Earth's climate system, but are externally forced.
The Causes Of 20th Century Warming, Francis W. Zwiers and Andrew J. Weaver, Science, 290(5499): 12/15/00, p. 2081
External Control Of 20th Century Temperature by Natural and Anthropogenic Forcings, Peter A. Stott, S. F. B. Tett, G. S. Jones, M. R. Allen, J. F. B. Mitchell, G. J. Jenkins, Science, 290(5499): 12/15/00, p. 2133
Clouds And Cosmic Rays, Science
Summary: If the study of global climate change
were a card game, one of the wild cards would be the role of
clouds. Clouds are a primary influence on the energy budget of
Earth's surface and atmosphere because of their effects on the
reflection and absorption of solar radiation and their trapping of
outgoing long-wave radiation. Clouds differ in their radiative
properties, however, and the complexity of cloud formation outstrips our understanding of the factors that control their distribution and composition. Cosmic rays may
influence global cloud cover because they can ionize atmospheric
particles and thus create condensation nuclei for cloud droplet
formation. The terrestrial cosmic ray flux depends on solar output
and is modulated by Earth's magnetic field; both of these
quantities are known to vary.
Marsh and Svensmark have measured global average monthly
cloud anomalies for lower, middle, and upper troposphere, and
correlated them with changes in the cosmic ray flux. They found,
surprisingly, that cloud cover at altitudes of less than 3.2
kilometers covaries with cosmic ray fluxes from 1980 to 1995, but
no correlation was seen for higher altitude clouds. If this
relation is systematic, cosmic ray variability could have a
significant effect on the evolution of climate.
Clouds And Cosmic Rays, H. Jesse Smith, Science, 290(5499): 12/15/00
- Marsh and Svensmark, Phys. Rev. Lett. 85, 5004 (2000)
Models Of Division Of Labor In Social Insects, Annu. Rev. Entomol.
Abstract: Division of labor is one of the most
basic and widely studied aspects of colony behavior in social
insects. Studies of division of labor are concerned with the
integration of individual worker behavior into colony level task
organization and with the question of how regulation of division
of labor may contribute to colony efficiency.
Here we describe and critique the current models concerned
with the proximate causes of division of labor in social insects.
The models have identified various proximate mechanisms to explain
division of labor, based on both internal and external factors. On
the basis of these factors, we suggest a classification of the
models. We first describe the different types of models and then
review the empirical evidence supporting them.
The models to date may be considered preliminary and
exploratory; they have advanced our understanding by suggesting
possible mechanisms for division of labor and by revealing how
individual and colony-level behavior may be related. They suggest
specific hypotheses that can be tested by experiment and so may
lead to the development of more powerful and integrative models.
Earth's Continental Land Masses Created In Short, Fast Bursts, Science Daily
Excerpts: Scientists believe they have unraveled
one of geology's most enduring mysteries about how the Earth's
continental crust was built, and they say it happened in a
relative blink of an eye.
According to Alexander Cruden, associate professor of
geology at the University of Toronto and second author of the
paper to appear in the Dec. 6 issue of Nature, the way that
granite forms - a rock that makes up about 70 to 80 per cent of
the Earth's continental crust - is not the sluggish, multi-million
year process that scientists previously believed. In fact, Cruden
and his co-authors argue that the process occurs in rapid, dynamic
and possibly catastrophic events that take between 1,000 and
100,000 years, depending on the size of the granite intrusion. And
that's changing how scientists look at the formation of the
Earth's continents. (…)
The researchers used experimental studies that involved
melting rock samples to understand how granite magma initially
forms in the upper mantle and lower crust and how fast it can
move. That data was then applied to theoretical models to
determine its method and rate of ascent. New models for the
emplacement stage - where the granite is intruded into older rock
in the upper crust - are based on a combination of theoretical
studies and fieldwork in areas such as the Canadian Shield,
Sweden, the Sierra Nevada of California, Greenland and the Andes
of South America. A unique aspect of the research is that the
three main stages of granite formation - generation, ascent and
emplacement - are regarded together as a system. Historically,
these processes have been studied by different geological
specialists in isolation from each other.
Cruden likens the granite formation process to subterranean
volcanic eruptions. Like Lego blocks built on top of one another,
large parts of the Earth's continental land masses were created by
tens of thousands of quick eruptions or bursts of molten magma
that were transferred rapidly from the mantle and lower-most crust
and then injected as large horizontal sheets into the upper crust.
These sheets then cooled and crystallized to form the large
granite intrusions that we see exposed at the surface of all
continents today, he says.
The Earth's continents began forming approximately four
billion years ago, Cruden explains. "This research has important
implications for how we understand the basic physics and chemistry
of crust formation processes as well as the formation of economic
ore deposits - gold and copper, for example - many of which are
associated with granite intrusions."
Diet And The Evolution Of The Earliest Human Ancestors, PNAS
Abstract: Over the past decade, discussions of
the evolution of the earliest human ancestors have focused on the
locomotion of the australopithecines. Recent discoveries in a
broad range of disciplines have raised important questions about
the influence of ecological factors in early human evolution. Here
we trace the cranial and dental traits of the early
australopithecines through time, to show that between 4.4 million
and 2.3 million years ago, the dietary capabilities of the
earliest hominids changed dramatically, leaving them well suited
for life in a variety of habitats and able to cope with
significant changes in resource availability associated with
long-term and short-term climatic fluctuations.
Testing A Biosynthetic Theory Of The Genetic Code: Fact Or Artifact?, PNAS
Abstract: It has long been conjectured that the
canonical genetic code evolved from a simpler primordial form that
encoded fewer amino acids [e.g., Crick, F. H. C. (1968) J.
Mol. Biol. 38, 367-379]. The most influential form of this
idea, "code coevolution" [Wong, J. T.-F. (1975) Proc. Natl.
Acad. Sci. USA 72, 1909-1912], proposes that the genetic code
coevolved with the invention of biosynthetic pathways for new
amino acids. It further proposes that a comparison of modern codon
assignments with the conserved metabolic pathways of amino acid
biosynthesis can inform us about this history of code expansion.
Here we re-examine the biochemical basis of this theory to test
the validity of its statistical support. We show that the theory's
definition of "precursor-product" amino acid pairs is unjustified
biochemically because it requires the energetically unfavorable
reversal of steps in extant metabolic pathways to achieve desired
relationships. In addition, the theory neglects important
biochemical constraints when calculating the probability that
chance could assign precursor-product amino acids to contiguous
codons. A conservative correction for these errors reveals a
surprisingly high 23% probability that apparent patterns within
the code are caused purely by chance. Finally, even this figure
rests on post hoc assumptions about primordial codon assignments,
without which the probability rises to 62% that chance alone could
explain the precursor-product pairings found within the code. Thus
we conclude that coevolution theory cannot adequately explain the
structure of the genetic code.
RNA Editing Process Plays Essential Role In Embryo Development, Science Daily
Excerpt: In a new study, scientists at The Wistar
Institute report the first direct evidence that RNA editing is
essential to mammalian embryo development. RNA editing is a normal
but not yet fully understood process in which small nucleotide
changes occur after DNA has been transcribed into RNA. The process
makes it possible for one gene to be translated into multiple
proteins with different structures or functions.
The researchers repeatedly attempted to delete, or knock
out, in mice a gene known to be involved in RNA editing called
ADAR1 in order to study its function. Certain target genes in the
brain are known to be subjected to RNA editing by the ADAR1
enzyme, including glutamate receptor ion channels, critical for
memory formation, and serotonin receptors, which regulate
emotional behaviors. The investigators expected that deletion of
the ADAR1 gene would therefore lead to significant changes in brain function.
Unexpectedly, however, they found that the knockout mouse
embryos died midterm due to an inability to make mature red blood
cells. At the least, the results suggest that ADAR1 and RNA
editing are critical to the development of mature red blood cells,
an essential step in mammalian embryo development. The new
findings were published in the December 1 issue of Science.
"The inability of mice with a defective RNA editing system
to make mature red blood cells is likely just the tip of the
iceberg," says Wistar professor Kazuko Nishikura, Ph.D., senior
author on the study. "The ADAR1 gene is expressed in many tissues
throughout the body in addition to the brain and is probably
involved in the RNA editing of a number of target genes that have
not yet been identified."
As scientists prepare to enter the post-genomic era, the
role of RNA editing in determining protein structure and function
may become an increasingly important consideration in genetic
research. Investigations such as Nishikura's indicate that RNA
editing is fundamental to key biological processes and point to
the complexity of predicting protein structures and functions from
gene sequences alone.
Before the discovery of RNA editing in mammals in the 1990s,
it was believed that the path from DNA to protein was fairly
straightforward: DNA in a cell's nucleus is transcribed to RNA,
and then sometimes shortened to splice out noncoding sections to
form mature messenger RNA. The mature messenger RNA is transported
to the cell's cytoplasm, where translation to protein occurs.
But researchers learned that mammalian protein production
can be more complicated; some RNA is edited prior to translation.
Single or multiple nucleotides may change before the mature
messenger RNA moves into the cytoplasm, leading ultimately to the
production of a protein that does not fully reflect the original
genetic instructions in the DNA. RNA editing is, in a sense, an
economical system, enabling one gene to produce a number of
proteins with different structures or functions. Scientists
believe that the known target genes represent only a fraction of
those that are subjected to RNA editing.
Nishikura and her co-investigators aimed to produce a mouse
line lacking the gene ADAR1, which is part of a small gene family
that produces enzymes involved in the RNA editing of a number of
target genes. Midway through development, however, the mouse
embryos lacking the ADAR1 gene died, which surprised the
researchers; often, gene knockout mice are born alive because the
mother supplies necessary biological functions to its embryos.
Further study revealed that the embryos died due to an inability
to produce mature red blood cells.
High-Resolution Inkjet Printing of All-Polymer Transistor Circuits, Science
Abstract: Although inkjet printing has been used
to fabricate low-cost organic light emitting diodes and displays,
its application to more complex thin film transistors (TFTs) has
been hampered by its resolution limit (~50 micrometers). In order
to maintain sufficient drive-current and switching times and
reasonable turn-on voltages in polymer TFTs, channel lengths
less than 10 micrometers are needed. Sirringhaus et al. (p. 2123)
introduce a method using a prepatterned substrate in which the
hydrophobic properties of polyimide are used to define the
critical features. They demonstrate the ability to fabricate TFTs
with 5-micrometer channel lengths and on-off ratios in excess of
10^5 at operating voltages of 10 volts.
Synchronization of Randomly Multiplexed Chaotic Systems with Application to Communication, Phys.Rev.Lett.
Abstract: Synchronized chaotic systems have recently
been applied to the area of secure communications in a variety of
ways. At the same time, there have also been significant advances
in deciphering messages masked by chaotic signals. It is
important, therefore, to explore more secure approaches to using
chaos in communication. We show that multiple chaotic systems can
be synchronized through a scalar coupling which carries a
stochastic signal generated by random multiplexing of the source
systems. This approach, which is a variant of the active-passive
decomposition method, promises enhanced security in chaos-based communication.
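To see why a single scalar coupling can synchronize chaotic systems at all, a minimal sketch helps. The code below implements the classic Pecora-Carroll drive-response configuration for two Lorenz systems (a standard textbook setup, not the paper's randomly multiplexed scheme): the response copy receives only the drive's scalar signal x(t), and its (y, z) subsystem contracts onto the drive's trajectory. The time step and initial conditions are arbitrary choices.

```python
# Drive-response (Pecora-Carroll) synchronization of two Lorenz systems.
# A minimal illustration, NOT the randomly multiplexed scheme of the paper.

SIGMA, RHO, BETA = 10.0, 28.0, 8.0 / 3.0   # standard Lorenz parameters
DT = 0.001                                  # Euler time step (assumption)

def run(steps=200000):
    x, y, z = 1.0, 1.0, 1.0     # drive system
    yr, zr = -7.0, 15.0         # response, deliberately mismatched start
    for _ in range(steps):
        # Response (y, z) subsystem slaved to the drive's current scalar x:
        yr, zr = (yr + DT * (x * (RHO - zr) - yr),
                  zr + DT * (x * yr - BETA * zr))
        # One Euler step of the full drive system:
        x, y, z = (x + DT * SIGMA * (y - x),
                   y + DT * (x * (RHO - z) - y),
                   z + DT * (x * y - BETA * z))
    return abs(y - yr) + abs(z - zr)   # synchronization error

print(run())   # error collapses toward zero despite chaotic dynamics
```

The reason this works: the symmetric part of the (y, z) subsystem's Jacobian is negative definite, so the error between drive and response contracts at all times regardless of the chaotic signal driving both copies. Schemes like the one in the paper layer multiplexing on top of this basic contraction property.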
Network Robustness and Fragility: Percolation on Random Graphs, Phys.Rev.Lett.
Abstract: Recent work on the Internet, social
networks, and the power grid has addressed the resilience of these
networks to either random or targeted deletion of network nodes or
links. Such deletions include, for example, the failure of
Internet routers or power transmission lines. Percolation models
on random graphs provide a simple representation of this process
but have typically been limited to graphs with Poisson degree
distribution at their vertices. Such graphs are quite unlike
real-world networks, which often possess power-law or other highly
skewed degree distributions. In this paper we study percolation on
graphs with completely general degree distribution, giving exact
solutions for a variety of cases, including site percolation, bond
percolation, and models in which occupation probabilities depend
on vertex degree. We discuss the application of our theory to the
understanding of network resilience.
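The setup described above can be illustrated numerically (a simulation sketch, not the paper's exact generating-function solutions): build a configuration-model random graph with a power-law degree distribution, delete nodes at random (site percolation), and track the largest connected component. The exponent, degree cutoff, and graph size below are arbitrary illustrative choices.

```python
import random
from collections import defaultdict

def power_law_degrees(n, exponent=2.5, kmin=1, kmax=100):
    # Sample n degrees from P(k) ~ k^-exponent on a truncated range.
    ks = list(range(kmin, kmax + 1))
    weights = [k ** (-exponent) for k in ks]
    degrees = random.choices(ks, weights=weights, k=n)
    if sum(degrees) % 2:          # pairing stubs needs an even total
        degrees[0] += 1
    return degrees

def configuration_model(degrees):
    # Pair edge "stubs" uniformly at random; drop self-loops,
    # collapse multi-edges (harmless for large sparse graphs).
    stubs = [v for v, d in enumerate(degrees) for _ in range(d)]
    random.shuffle(stubs)
    adj = defaultdict(set)
    for u, v in zip(stubs[::2], stubs[1::2]):
        if u != v:
            adj[u].add(v)
            adj[v].add(u)
    return adj

def largest_component(adj, alive):
    # BFS restricted to occupied (non-deleted) nodes.
    seen, best = set(), 0
    for s in alive:
        if s in seen:
            continue
        stack, comp = [s], 0
        seen.add(s)
        while stack:
            u = stack.pop()
            comp += 1
            for w in adj[u]:
                if w in alive and w not in seen:
                    seen.add(w)
                    stack.append(w)
        best = max(best, comp)
    return best

random.seed(1)
n = 20000
adj = configuration_model(power_law_degrees(n))
for phi in (1.0, 0.5, 0.1):   # site-occupation probability
    alive = {v for v in range(n) if random.random() < phi}
    print(phi, largest_component(adj, alive) / n)
```

Because the skewed degree distribution makes the second moment of the degrees very large, the percolation threshold is tiny and a giant component survives even heavy random dilution, which is the "robustness" half of the paper's title; targeted removal of the hubs would tell the opposite, "fragility" story.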
Theoretical Analysis of a Dripping Faucet, Phys.Rev.Lett.
Rob Shaw's classic experiment on chaotic systems from the early 1980s has been revisited in great detail:
Abstract: While previous studies of continuous emission
of drops from a faucet have shown the richness of the system's
nonlinear response, a theory of dripping has heretofore been
lacking. Long-time behavior of dripping is simulated
computationally by tracking the formation of up to several hundred
drops in a sequence, rather than the usual single drop, at a given
flow rate Q and verified by experiments. As Q increases, the
system evolves from a period-1 system through a number of period
doubling (halving) bifurcations as dripping ultimately gives way
to jetting. That hysteresis can occur is also demonstrated.
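The period-doubling cascade described here is the generic route to chaos, and it is easy to reproduce numerically with the standard minimal model, the logistic map (an illustration of the bifurcation structure, not of the authors' hydrodynamic drop model):

```python
def attractor_period(r, x0=0.5, transient=2000, max_period=64, tol=1e-9):
    # Iterate the logistic map x -> r*x*(1-x) past its transient,
    # then report the smallest period at which the orbit repeats.
    x = x0
    for _ in range(transient):
        x = r * x * (1 - x)
    orbit = []
    for _ in range(max_period):
        orbit.append(x)
        x = r * x * (1 - x)
    for p in range(1, max_period // 2 + 1):
        if all(abs(orbit[i] - orbit[i + p]) < tol
               for i in range(max_period - p)):
            return p
    return None   # no short period: chaotic (or period > max_period)

for r in (2.8, 3.2, 3.5, 3.55, 3.9):
    print(r, attractor_period(r))
```

With these parameter values, r = 2.8, 3.2, 3.5, and 3.55 land in the period-1, 2, 4, and 8 windows respectively, while r = 3.9 yields no short period (chaos), mirroring the sequence the dripping faucet traverses as the flow rate Q increases.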