Science Feature Articles

The Science of the Slow Slip

English
Dr. Eran Bouchbinder

We all know about the big earthquakes: The edges of two tectonic plates in the earth’s crust slip against one another, causing shaking and damage as the slip releases huge amounts of pent-up energy within mere seconds. The earth, however, literally moves under our feet much more often than we’re aware. In the last few years researchers have come to realize that there is another kind of earthquake – called a “slow” or “silent” earthquake – which does not entail a strong ground-shaking energy release. Rather, the slip takes place many orders of magnitude more slowly – over hours or even days. Although the ultimate energy release might be similar, the slow pace means such quakes are not felt.

Slow quakes, says Dr. Eran Bouchbinder of the Chemical Physics Department (Faculty of Chemistry), appear to be a common occurrence. It is only the recent development of novel geophysical and laboratory measurement techniques that have enabled scientists to uncover their existence and prevalence. Researchers are eager to understand the physical principles underlying the slow quakes: Are they governed by the same physics as their fast counterparts? What is actually determining their speed? There is some evidence that in certain cases, slow quakes may be precursors of the fast ones, though no one can say, as yet, how to identify those that might presage a major event.
 
 
In research recently published in Geophysical Research Letters, Bouchbinder and his team proposed a mathematical model suggesting that slow quakes might be controlled by somewhat different physics than the fast ones, accounting for their leisurely rate of movement.

Both types of earthquake involve frictional interfaces – the surfaces where the plates meet. Such interfaces are present in a great number of physical systems and play a central role in phenomena occurring on widely disparate scales, ranging from nano-machines and tiny biological motors to the geophysical scale of earthquakes. The strength and stability of these interfaces is an issue of fundamental importance; when frictional interfaces fail, it could spell trouble.  
 
A rupture in frictional interfaces involves the interplay between stored “elastic” energy and frictional resistance. Two bodies – whether tectonic plates in the earth’s crust or car tires and the road – are typically brought into friction-causing contact by normal forces that press them together. Parallel to the interface, a shear force acts to slide the two bodies against each other. In other words, one force pushes the bodies to move; the other resists movement. As long as the frictional resistance balances out the shear force, the interface remains pinned in place. But when the shear force overcomes the frictional resistance, sliding occurs, and ruptures propagate along the interface. How fast these ruptures progress determines whether the elastic energy – often built up over long periods as the shear force strains against friction – will be released abruptly or a little at a time. In fast, shaking quakes, this propagation is limited by the speed of sound – the speed at which waves can move within a material, on the order of kilometers a second through ordinary solids.
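The pinned-versus-sliding threshold can be sketched in a few lines of Python. This is a minimal Coulomb-friction illustration with made-up numbers, not the researchers’ model:

```python
# Minimal Coulomb-friction sketch (made-up numbers, not the researchers'
# model): the interface stays pinned while frictional resistance balances
# the shear force, and starts sliding once the shear force exceeds it.

def interface_state(shear_force, normal_force, mu_static=0.6):
    """Classify a frictional interface as 'pinned' or 'sliding'."""
    resistance = mu_static * normal_force   # frictional resistance to sliding
    return "sliding" if shear_force > resistance else "pinned"

# Two bodies pressed together by a 100 N normal force, shear force ramping up:
for shear in (30.0, 55.0, 70.0):
    print(f"shear = {shear:5.1f} N -> {interface_state(shear, 100.0)}")
```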
 
Earthquake damage occurs when ruptures propagate at the speed of sound

For Bouchbinder, the question was: What makes the slow quakes slow? Is their speed simply a small fraction of the speed of sound or is it really a different physical quantity?
 
To investigate this, he and his team, research student Yohai Bar Sinai and Dr. Efim Brener, a visiting professor from Peter Grünberg Institut, Forschungszentrum Jülich, Germany, looked at geophysical measurements as well as experimental data from the group of Prof. Jay Fineberg of the Hebrew University of Jerusalem in which the factors involved in various types of earthquake are tested on a lab-sized scale. They then developed a model that offers a new explanation for the dynamics of the slow quakes.

According to conventional models, frictional resistance decreases as the sliding speed rises: The faster something slides the easier the sliding becomes. This is why an interfacial rupture tends to accelerate until it approaches the limit – the speed of sound. But the new model Bouchbinder and his team proposed allows for a different scenario: Frictional resistance first acts as in the standard model, decreasing with increasing sliding speed; but at some point the effect is reversed, and friction begins to progressively rise again. The slow quake phenomenon, says Bouchbinder, should occur around that point in the model at which friction levels reach their minimum and begin to swing upwards. The speed of the rupture, in this case, is determined by the frictional properties of the interface alone, rather than by the speed of sound, and thus can be orders of magnitude slower.
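A toy non-monotonic friction law makes the idea concrete. The coefficients below are illustrative, not the published model: friction first weakens with sliding speed, reaches a minimum, and then strengthens again, and the scenario places slow ruptures near that minimum.

```python
import numpy as np

# Toy non-monotonic friction law (illustrative coefficients, not the
# published one): in log-velocity x = ln(v / v0), friction is
# f = f0 - a*x + b*x**2, so it first weakens with sliding speed and
# then strengthens again; the minimum sits at x = a / (2 * b).

def friction(v, f0=0.6, a=0.02, b=0.01, v0=1e-6):
    x = np.log(v / v0)
    return f0 - a * x + b * x**2

v = np.logspace(-6, 0, 1000)       # candidate sliding speeds, m/s
v_min = v[np.argmin(friction(v))]  # speed at which friction is weakest
print(f"friction is weakest near v = {v_min:.2e} m/s")
```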

Bouchbinder and his team now plan to continue testing their mathematical model with more complicated calculations and to compare their predictions to experimental measurements, in hopes of providing a useful tool for better understanding the failure of frictional interfaces, including those that result in earthquakes.
 
 
Satellite images from 1992 and 1997 reveal gradual slippage of 2-3 cm along the Hayward fault (thin red line) in California, evidence of a slow earthquake. Image: NASA
Chemistry

Immune System Tricks

Prof. Irit Sagi

A team of Weizmann Institute scientists has turned the tables on an autoimmune disease. In such diseases, including Crohn’s and rheumatoid arthritis, the immune system mistakenly attacks the body’s tissues. But the scientists managed to trick the immune systems of mice into targeting one of the body’s players in autoimmune processes, an enzyme known as MMP9. The results of their research appeared in Nature Medicine.

Prof. Irit Sagi of the Biological Regulation Department and her research group have spent years looking for ways to home in on and block members of the matrix metalloproteinase (MMP) enzyme family. These proteins cut through the tough support materials in our bodies, which makes them crucial for cellular mobilization, proliferation and wound healing, among other things. But when the activities of some members of the MMP family, especially MMP9, get out of control, it can aid and abet autoimmune disease and cancer metastasis. Blocking these proteins, then, might lead to effective treatments for a number of diseases.   

Originally, Sagi and others had designed synthetic drug molecules to directly target MMPs. But these drugs proved to be fairly crude tools that had extremely severe side effects. The body normally produces its own MMP inhibitors, known as TIMPs, as part of the tight regulation program that keeps these enzymes in line. Unlike the synthetic drugs, TIMPs work in a highly selective manner. An arm on each TIMP is precisely constructed to reach into a cleft in the enzyme that shelters the active bit – a zinc ion surrounded by three histidine residues – closing it off like a snug cork. “Unfortunately,” says Sagi, “it is quite difficult to reproduce this precision synthetically.”

Dr. Netta Sela-Passwell began working on an alternative approach as an M.Sc. student in Sagi’s lab and continued on through her Ph.D. research. She and Sagi decided that, rather than attempting to design a synthetic molecule to directly attack MMPs, they would try to trick the immune system into creating natural antibodies that would target MMP9 through immunization. Just as immunization with a killed virus induces the immune system to create antibodies that then attack live viruses, an MMP immunization would get the body to make antibodies that would block the enzyme at its active site.

Together with Prof. Abraham Shanzer of the Organic Chemistry Department, they created an artificial version of the zinc-histidine complex at the heart of the MMP9 active site. They then injected these small, synthetic molecules into mice and afterward checked the mice’s blood for signs of immune activity against the MMPs. Indeed, the team identified antibodies, which they dubbed “metallobodies,” that were similar but not identical to TIMPs. A detailed analysis of their atomic structure suggested the two work in a similar way – reaching into the enzyme’s cleft and blocking the active site. The metallobodies were selective for just two members of the MMP family – MMP2 and 9 – and they bound tightly to the mouse versions of these enzymes, as well as to the human ones.
 
Left: Natural control mechanism blocks the enzyme's zinc active site. Right: Novel antibody works as effectively as the natural control mechanism
 
As the team hoped, when they induced an inflammatory condition that mimics Crohn’s disease in mice, treating the mice with metallobodies prevented the symptoms. “We are excited not only by the potential of this method to possibly treat Crohn’s,” says Sagi, “but by the potential of using this approach to explore novel treatments for many other diseases.” Yeda, the technology transfer arm of the Weizmann Institute, has applied for a patent for the synthetic immunization molecules as well as the generated metallobodies.

Also participating in this research were Drs. Orly Dym, Haim Rozenberg, Rina Arad-Yellin and Tsipi Shoham, and Raanan Margalit of the Structural Biology, Immunology and Biological Regulation Departments; Raghavendra Kikkeri of the Organic Chemistry Department; Miriam Eisenstein of the Chemical Research Support Department; Ori Brenner of the Veterinary Resources Department; and Tamar Danon of the Molecular Cell Biology Department.
 
Prof. Irit Sagi's research is supported by the Spencer Charitable Fund; the Leona M. and Harry B. Helmsley Charitable Trust; Cynthia Adelson, Canada; Dr. Mireille Steinberg, Canada; the Leonard and Carol Berall Post Doctoral Fellowship; and the Ilse Katz Institute for Material Sciences and Magnetic Resonance Research. Prof. Sagi is the incumbent of the Maurizio Pontecorvo Professorial Chair.
 
Prof. Abraham Shanzer is the incumbent of the Siegfried and Irma Ullmann Professorial Chair.


 
 
Life Sciences

Production Overshoot


 

Prof. Yosef Yarden, Dr. Wolfgang Köstler, Amit Zeisel and Prof. Eytan Domany
 
One person’s trash, they say, is another’s treasure. For Amit Zeisel, a research student in the group of Prof. Eytan Domany of the Physics of Complex Systems Department, and Dr. Wolfgang Köstler, an M.D. conducting Ph.D. research in the lab of Prof. Yosef Yarden of the Biological Regulation Department, reluctance to throw out seemingly useless information led to a revision in our understanding of how certain RNAs are produced in the cell.
 
Ongoing collaboration between the labs of Domany and Yarden has been revealing the timing of the production of messenger RNA (mRNA), the strands that carry the protein-making instructions encoded in the genes out of the cell nucleus. After the cell receives a signal to activate a series of genes, the different mRNA levels begin to rise in what appear to be coordinated steps, some peaking within half an hour, others after one or two hours. These signals and the ensuing mRNA production are the first steps of nearly all cellular activity, and understanding the timing – which mRNAs are produced right away and which peak only several hours later – is helping researchers to understand some of the most basic processes in health and disease, for instance, how cancer develops.

The new finding arose out of some previous research between the labs involving apparent waste in the process of mRNA production. Starting with the pre-mRNA – an exact transcript of the gene code “written” in the DNA – short sequences called exons are snipped out and spliced together, while the pieces in between, called introns, are left out of the final mRNA strand. For their study, Zeisel and Köstler were using a research tool that enables them to track those pieces – individual exons and introns – over time.
 
Each row represents one of 400 genes induced by an EGF signal. Red denotes high expression levels. The expression profiles reveal that pre-mRNA expression does not always match that of the mRNA. Genes marked by green exhibit production overshoot

 
The two were looking at variations in the exon splicing to see if certain versions might be tied to the timing of mRNA production.

Most researchers simply discount the introns; after all, it is only the exons that go into the final instruction list. Even though their study only involved exons, Zeisel and Köstler hit upon the idea that the intron information might come in handy and left some of it in the data sets. That is when the researchers noticed something surprising: The shape, height and timing of the production peaks they were seeing – which belonged to pre-mRNAs – looked quite different from those of the corresponding mRNAs.

The team realized that because introns are excluded from the mRNA, the accidental discovery presented a novel way of measuring pre-mRNA dynamics on a genome-wide level. The findings implied that pre-mRNA is produced on a different schedule than mRNA and not, as scientists had assumed, at the same rate. To check this idea, they conducted a carefully designed experiment using both introns and exons. Their results showed that in a number of cases, pre-mRNA levels spiked quickly – rising to a much higher level than the mRNA peaks that appeared later on – and fell off just as quickly.

Afterward, the team took their insight to other Institute research groups to see if the same pattern is found in different cell types and in different situations. And indeed, in the immune cells researched in the group of Prof. Steffen Jung as well as in the embryonic stem cells in the lab of Dr. Yoav Soen, certain RNAs exhibited patterns that were similar to those Yarden, Domany and their students had seen in their original experiment.
 
 
The scientists termed these early, high and narrow pre-mRNA peaks “production overshoot.” “We begin observing these cells as they receive a signal that initiates a particular chain of events,” explains Domany. “If these cells suddenly need large amounts of a certain protein, a fast jumpstart in pre-mRNA production might get things moving.” He compares it to merging from a slow on-ramp into fast-moving freeway traffic: Instead of adjusting his foot on the gas to the level needed for cruising at the higher speed, a driver will put the pedal to the metal for a minute to get up to speed and then ease off again.
Pre-mRNA (red) and mRNA (blue) levels measured over several hours for a single gene show pre-mRNA production overshoot. The inferred time-dependent pre-mRNA production (green) and mRNA degradation (gold) profiles are also shown
 
In addition to challenging some previous assumptions about the steps in the process that leads to protein production, Domany points out that having a second “known” about this multi-step cellular activity can enable researchers to deduce other unknowns. For instance, if one knows the temporal profiles of both pre-mRNA and mRNA, one can calculate both the pre-mRNA production rate and the mRNA degradation rate. In fact, this finding is helping to paint a detailed picture of mRNA regulation. In that picture, pre-mRNA overproduction and the carefully timed degradation of the mRNA strand work together as a sort of combination gas-and-brake mechanism, adjusting its levels over time. Now, the team is adding an investigation of protein levels to their research, in hopes of obtaining a fuller picture of the entire procedure – from the instant the pre-mRNA is copied off the gene to the rolling of the proteins off the production line. And, they say, they have learned a valuable lesson: Don’t be surprised to find paradigm-changing information sitting right under your nose – in the discard pile.
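The gas-and-brake picture can be sketched as a minimal two-step kinetic model. The rates below are illustrative, not fitted to the experiments: pre-mRNA is produced in an early burst and spliced into mRNA, which is degraded more slowly.

```python
# Minimal two-step kinetic sketch (illustrative rates, not fitted data):
# pre-mRNA P is produced in an early burst and spliced into mRNA M, which
# is degraded more slowly, so P shows an early, narrow peak ("production
# overshoot") while M peaks later.

def simulate(t_end=10.0, dt=0.001, k_splice=2.0, k_deg=0.5):
    P, M, traj = 0.0, 0.0, []
    for i in range(int(t_end / dt)):
        t = i * dt
        production = 10.0 if t < 1.0 else 1.0   # early burst, then baseline
        P += dt * (production - k_splice * P)   # dP/dt = production - splicing
        M += dt * (k_splice * P - k_deg * M)    # dM/dt = splicing - degradation
        traj.append((t, P, M))
    return traj

traj = simulate()
peak_P_time = max(traj, key=lambda s: s[1])[0]
peak_M_time = max(traj, key=lambda s: s[2])[0]
print(f"pre-mRNA peaks at t = {peak_P_time:.2f}, mRNA at t = {peak_M_time:.2f}")
```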



Prof. Eytan Domany’s research is supported by the Kahn Family Research Center for Systems Biology of the Human Cell, which he heads; the Mario Negri Institute for Pharmacological Research - Weizmann Institute of Science Exchange Program; the Leir Charitable Foundations; and Mordechai Segal, Israel. Prof. Domany is the incumbent of the Henry J. Leir Professorial Chair.
 
Prof. Yosef Yarden’s research is supported by the European Research Council; the Dr. Miriam and Sheldon G. Adelson Medical Research Foundation; the Aharon Katzir-Katchalsky Center, which he heads; the M.D. Moross Institute for Cancer Research; the Kekst Family Institute for Medical Genetics; the Steven and Beverly Rubenstein Charitable Foundation, Inc.; Julie Charbonneau, Canada; and the Marvin Tanner Laboratory for Research on Cancer. Prof. Yarden is the incumbent of the Harold and Zelda Goldenberg Professorial Chair in Molecular Cell Biology.

 
 
 
Life Sciences

Meetings in the Cell


 

Walk across a sparsely occupied plaza and it shouldn’t take long to meet up with a person strolling through from the other side. Now, imagine crossing that same plaza filled with throngs of people. How much longer will it take to reach the other person?

A similar question is often asked by biochemists who study proteins. They can easily observe how long it takes two proteins to interact in a test tube solution. Test tubes are something like open plazas. The insides of cells, however, are extremely crowded spaces: To meet and interact, the proteins must make their way through a teeming mass of other macromolecules in the cell’s cytoplasm. In attempts to partially answer the question, scientists have tried adding other proteins and protein-like substances to their test tubes to simulate the crowd effect.
 
Yet the question remained open. Recently, Prof. Gideon Schreiber, research student Yael Phillip and Dr. Vladimir Kiss of the Biological Chemistry Department decided to resolve the issue by developing a method to directly observe protein interactions in individual living cells. Their results – an experimental first – appeared in the Proceedings of the National Academy of Sciences (PNAS).
 
To measure the interaction rate, the scientists needed a starting point. Creating that moment in time involved manipulating single cells into producing one of the proteins internally. The other protein was carefully injected into the cell with a microscopic needle as the measuring commenced. Both proteins were tagged with fluorescent molecules that produced a glow when an energy transfer took place, showing that binding was occurring between pairs of molecules.
Protein interaction in a cell. Lower left – light microscopy image shows the time from the injection of the second protein. Upper left – donor proteins appear in green. Upper right – as the proteins interact, energy is transferred to the acceptor proteins, causing them to glow red. Lower right – donor and acceptor dynamics combined
      
To their surprise, the scientists found that interaction rates for the proteins they tested were just a bit slower in a cell than they were in test tube solutions. Even when they mutated the proteins to make them faster or slower, the comparative rates did not differ by much.
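The comparison behind such measurements can be illustrated with textbook second-order binding kinetics. The concentrations and rates below are toy numbers, not the published measurements, and association is assumed irreversible for simplicity: the on-rate k_on sets how quickly the bound, FRET-active complex accumulates.

```python
import numpy as np

# Textbook second-order binding sketch (toy concentrations and rates, not
# the published measurements), assuming irreversible association A + B -> AB:
# the on-rate k_on sets how fast the bound (FRET-active) complex builds up.

def bound_complex(t, A0, B0, k_on):
    """Closed-form AB(t) for irreversible A + B -> AB with A0 != B0."""
    e = np.exp((A0 - B0) * k_on * t)
    return A0 * B0 * (e - 1.0) / (A0 * e - B0)   # -> min(A0, B0) at long times

t = np.linspace(0.0, 10.0, 101)                  # time after injection, s
ab_fast = bound_complex(t, 2.0, 1.0, k_on=1.0)   # faster on-rate
ab_slow = bound_complex(t, 2.0, 1.0, k_on=0.2)   # slower on-rate
print(ab_fast[-1], ab_slow[-1])                  # both approach B0 = 1.0
```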
 
Although they can’t yet be sure, Schreiber and his team think that, while counterintuitive, the crowd scenario could help explain why the protein interaction rates in living cells are faster than might have been expected in the thickly populated cytoplasm. “At first,” he says, “the crowding does slow the proteins down. But as they get closer to one another, the busy throng actually jostles them into meeting. These proteins don’t automatically recognize one another – they can bump into each other many times before an interaction takes place. Bouncing back and forth in the bustling mass of molecules increases the number of chance encounters between the two proteins, and thus the odds of interaction.”
Prof. Gideon Schreiber
 
 
On the one hand, says Schreiber, the new results provide strong evidence that the myriad protein experiments done in test tubes should give a good approximation of the true interaction rate. On the other, the method created in his lab is likely to advance the study of molecular interactions in their natural setting, inside living cells.
 
 
Schreiber: “This is the first time that anyone has managed to observe molecular interaction rates inside a single living cell. With the fluorescent imaging technology, we were literally able to see the interactions as they took place – we could determine not just the rate but the concentration of the different molecules over time as well. We found that we could even focus on specific areas within the cell body. We are continuing to develop new experiments with the method, and other scientists have already expressed interest in using it to perform various biomolecular studies.”
 
 
Life Sciences

Relating to Relationships


 

Hilary Finucane and Yakir Reshef. Tying the knot
 
 
 
Relationships can be a messy business. Fortunately for Yakir Reshef, identifying meaningful relationships is his forte. Born in Israel in 1987, Yakir found himself moving with his family to Kenya at age 3 and, shortly thereafter, immigrating to America. In high school he began dating Hilary Finucane, and since then they have taken the meaning of relationships to a whole new level. They both attended Harvard University, both in the Department of Mathematics. They are now both at the Weizmann Institute of Science and both in the Faculty of Mathematics and Computer Science: Yakir – a visiting Fulbright scholar hosted by Prof. Moni Naor;* and Hilary – a Ph.D. student in the group of Prof. Itai Benjamini.*
 
So the fact that Yakir and Hilary are joint authors of a recently published paper in Science on the subject of relationships seems quite appropriate: The paper reports a new data analysis tool that is able to search complex data sets for interesting relationships and trends that are invisible in other types of statistical analysis. And in another twist on relationships, the other first author of the paper, in addition to Yakir, is David Reshef – a computer scientist at the Broad Institute of MIT and Harvard in Cambridge, Massachusetts, and Yakir’s brother.

“While I was an undergraduate at Harvard, David asked me to help him create a program for visualizing and analyzing large public health data sets. But when we started, we realized that to visualize and analyze relationships in a data set, you first have to know what relationships you need to consider,” says Yakir. This sounds simple, but it can get very complicated when dealing with large data sets. Take, for example, bacterial species that colonize the gut of humans and other mammals: There are trillions of bacteria; even narrowing down the data set to just seven thousand yields over 22 million potential relationships between assorted pairs of bacteria. How can microbiologists keep themselves from drowning in such a huge sea of data, and know in advance what kinds of patterns to look for? Challenges like this are faced not only by microbiologists: Large, complex data sets with thousands of variables are increasingly common in fields as diverse as genomics, physics, political science, economics and more, and there is thus an increasing need for data-analysis tools to make sense of them.
Sign of status? (image: iStock)
 
The two brothers decided they needed an algorithm that could uncover new and important, yet unexpected, relationships that would otherwise go unnoticed.

The tool they developed – under the guidance of advisers Michael Mitzenmacher of the Harvard University School of Engineering and Applied Sciences and Pardis Sabeti of the Broad Institute – is called the maximal information coefficient, or MIC for short. It is based on the idea that if two variables are related to each other, there should be a way to draw a grid on a scatterplot of the two variables that captures the relationship between them. The algorithm that calculates the MIC searches through many such grids and uses the one best able to quantify how strong the relationship is. Researchers can calculate the MIC on each pair of variables in their data set, rank the pairs by their scores (the higher the score, the more related the pair) and then examine the top-scoring pairs – that is, the most strongly related pairs in the data.
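The grid-search idea can be sketched in a highly simplified form. This toy stand-in is not the published MIC algorithm (which searches grid partitions far more cleverly and handles normalization with more care), but it shows the core move: bin the scatterplot at many resolutions, score each grid by normalized mutual information, and keep the best.

```python
import numpy as np

# Simplified version of the grid-search idea (a toy stand-in, not the
# published MIC algorithm): for each grid resolution, bin the scatterplot,
# compute the mutual information of the binned variables, normalize by
# log(min(rows, cols)) so scores fall between 0 and 1, and keep the best.

def mutual_information(counts):
    p = counts / counts.sum()
    px = p.sum(axis=1, keepdims=True)   # marginal distribution of x
    py = p.sum(axis=0, keepdims=True)   # marginal distribution of y
    nz = p > 0
    return float((p[nz] * np.log(p[nz] / (px @ py)[nz])).sum())

def toy_mic(x, y, max_bins=8):
    best = 0.0
    for bx in range(2, max_bins + 1):
        for by in range(2, max_bins + 1):
            counts, _, _ = np.histogram2d(x, y, bins=(bx, by))
            best = max(best, mutual_information(counts) / np.log(min(bx, by)))
    return best

rng = np.random.default_rng(0)
x = rng.uniform(-1.0, 1.0, 500)
score_parabola = toy_mic(x, x**2)                      # strong nonlinear relationship
score_noise = toy_mic(x, rng.uniform(-1.0, 1.0, 500))  # no relationship
print(score_parabola, score_noise)
```

A parabola would score near zero under linear correlation, but the grid search still finds it, which is the point of the approach.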
 
 
To test how well the algorithm works, Yakir, David and Hilary applied the MIC to data sets in a variety of fields – global health, gene expression, human gut microbiota and even major-league baseball – and compared the MIC results to those of current methods.
 
How did they fare? With regard to the microbiota data, the MIC was able to narrow down 22 million variable pairs to just a few hundred interesting relationships, many of which had not been observed before. For instance, it identified examples of “non-coexistent” species in which if one bacterium is abundant, the other is not, and vice versa. Some of the non-coexistent relationships identified were familiar – known to be caused by differences in host diet – while others were novel. This finding raises the possibility of the existence of additional factors that, like diet, affect the make-up of the human microbiome.
Netting a raise? (image: iStock)
 
In another example, the team examined a data set from the World Health Organization covering 200 countries and containing 357 variables per country. One of the identified relationships was between female obesity and household income in the Pacific Islands, in which obesity increases with income, in contrast with other countries. It turned out that obesity, rather than being an anomaly, is considered a sign of status in the Pacific Islands. Most methods would treat this separate trend as “noise,” but the MIC is able to identify relationships, such as this one, that include more than one trend.

And major-league baseball? According to the MIC, hits, total bases and how many runs a player generates for a team are the most influential factors determining a player’s salary. A more traditional statistic had placed walks, intentional walks and runs batted in as the three strongest factors. So, which of the statistics is correct? The researchers are wisely leaving it to baseball enthusiasts to decide which set of variables is – or at least should be – more strongly tied to salary.

“What sets the MIC apart from other data analysis tools is twofold,” says Hilary. “Unlike other methods, the MIC assigns high scores to a wide variety of relationship types hidden in large data sets, and it gives similar scores to relationships with comparable amounts of noise.” Yakir: “In other words, the MIC has a ‘sweet spot’ – it finds cool things going on that you might not have expected and that are difficult to find with other types of analyses.”

As for Hilary and Yakir, while working on the MIC together they discovered the top-scoring relationship of all: marriage. “It’s really great for us that we share the same passion for mathematics.” Not stopping at that, they also share hobbies, including classical piano, jogging and cooking.
 
Associations between bacterial species in the gut microbiota of “humanized” mice, shown as a spring graph in which nodes correspond to “species”-level groups of bacteria and edges to the top 300 nonlinear relationships between them. Node size is proportional to the number of these relationships a species takes part in, black edges represent relationships explained by diet, and node glow color is proportional to the fraction of adjacent edges that are black (100% is red, 0% is blue)
 
 
*Profs. Itai Benjamini and Moni Naor had no involvement in this research.

Prof. Itai Benjamini is the incumbent of the Renee and Jay Weiss Professorial Chair.
 
Prof. Moni Naor's research is supported by Citi Foundation and Walmart.
 



 
 
 
Math & Computer Science

Let it Rain


Rain can be a blessing or a disaster. An hour of steady rainfall can water the roots of plants and percolate underground to replenish water supplies. If the same amount of water falls in a five-minute downpour, however, the results are more likely to be uprooted plants, runoff and even flooding. According to new research, the effects of human activity on the atmosphere appear to be nudging rain patterns in the direction of faster and heavier.

In a paper recently published in Nature Geoscience, Prof. Ilan Koren, research student Reuven Heiblum and Dr. Orit Altaratz of the Institute’s Environmental Sciences and Energy Research Department (Faculty of Chemistry), together with Dr. Graham Feingold of NOAA-ESRL, Lorraine Remer of NASA-GSFC and J. Vanderlei Martins of JCET, University of Maryland, Baltimore County and NASA-GSFC, looked at the rain rate – how much falls in a given period of time – for a large part of the globe. They then asked whether there is any connection between this rate and aerosols – tiny floating particles in the atmosphere. Aerosols often occur naturally, and they provide the “seeds” for the condensation of cloud droplets or, higher in the atmosphere, the nuclei for ice crystals. But today’s amounts are anything but natural: Burning fuels have added many tons of aerosols to the atmosphere over the past century. “For the first time,” says Koren, “we have managed to obtain a global picture of aerosols’ ability to change rain patterns.”
 
 
Prof. Ilan Koren
 
Koren and Feingold had previously shown how these aerosols affect cloud formation: Heavier loads of airborne particles stimulate the creation of more, but smaller, droplets. But the progression from cloud formation to rainfall is far from straightforward. Many different factors contribute, and diverse patterns of rainfall – from drizzles to torrential storms – arise from quite different weather conditions. Add to this the fact that rain is a widespread but sporadic occurrence, and it becomes quite difficult to formulate any hard-and-fast rules about cause and effect.

To assess rain rates – measured in millimeters per hour – around the world, the researchers turned to satellite data. These satellites cannot, as they orbit, monitor absolute rainfall amounts, but they do measure rain rates for so-called convective cloud systems – well-formed clouds that produce relatively heavy rainfall. Data amassed from several different research satellites, collected over a several-month period in 2007 (see below), enabled them to evaluate rain intensity over large parts of the earth where convective cloud systems form, and also to assess aerosol pollution levels in the same regions.

The research team then looked for possible connections between the two, comparing satellite-generated maps of rainfall intensity with those in which aerosol pollution was sorted according to clean, mildly polluted or heavily polluted. Their analysis showed a very strong correlation: In over 80% of the areas in which rain fell from convective cloud systems, more pollution went hand in hand with higher rain rates.
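The sorting-and-comparing step can be illustrated with synthetic numbers. This sketch uses made-up data with a built-in aerosol-rain trend, not the satellite measurements: grid cells are classed as clean, mildly or heavily polluted, and mean rain rates are compared across the classes.

```python
import numpy as np

# Schematic of the sorting-and-comparing step (synthetic numbers with a
# built-in trend, not the satellite data): class each grid cell by its
# aerosol loading, then compare the mean rain rate across classes.

rng = np.random.default_rng(1)
aerosol = rng.uniform(0.0, 1.0, 10_000)                    # aerosol-loading proxy per cell
rain = 2.0 + 3.0 * aerosol + rng.normal(0.0, 0.5, 10_000)  # synthetic rain rate, mm/h

classes = np.digitize(aerosol, [1 / 3, 2 / 3])             # 0 = clean, 1 = mild, 2 = heavy
mean_rates = [rain[classes == c].mean() for c in range(3)]
for label, rate in zip(("clean", "mildly polluted", "heavily polluted"), mean_rates):
    print(f"{label:>16}: mean rain rate {rate:.2f} mm/h")
```

Of course, such a correlation alone cannot rule out a confounding third factor, which is why the meteorological analysis described next matters.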

Were aerosol levels really affecting the rain? Or could a third factor, for example, atmospheric instability, be the cause of both higher levels of particles carried in the atmosphere and greater rainfall intensity? To address this question, the scientists undertook a detailed meteorological analysis to see if their results would stand up in light of the hundreds of possible factors affecting rainfall. Most convincingly, when they divided their findings into levels of relative atmospheric stability, they found that added aerosols were tied to increased rain rates across the board, for stable as well as neutral and unstable conditions. This strongly implies that rising aerosol levels do, indeed, intensify rainfall.
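The stratified comparison described above can be sketched in a few lines. This is a toy illustration with entirely synthetic data – the categories, numbers and variable names are invented, not the study’s actual values – but it shows the logic: if the aerosol-rain link holds within every stability stratum separately, then instability alone cannot be the common cause.

```python
import random

random.seed(0)

# Synthetic stand-in for the satellite data (all numbers invented):
# each record holds an aerosol category, an atmospheric-stability
# class, and a rain rate in mm/hr. The rates are constructed so that
# more aerosol means faster rain regardless of stability -- the
# pattern the study reports.
AEROSOL = ["clean", "mildly polluted", "heavily polluted"]
STABILITY = ["stable", "neutral", "unstable"]
BASE = {"clean": 2.0, "mildly polluted": 3.0, "heavily polluted": 4.0}

records = [
    (a, s, BASE[a] + random.uniform(-0.5, 0.5))
    for a in AEROSOL for s in STABILITY for _ in range(200)
]

# Stratify by stability, then compare mean rain rates across aerosol
# bins *within* each stratum.
stratified = {}
for s in STABILITY:
    stratified[s] = {
        a: sum(r for a2, s2, r in records if a2 == a and s2 == s) / 200
        for a in AEROSOL
    }
    print(s, {a: round(m, 2) for a, m in stratified[s].items()})
```

In every stratum the mean rain rate climbs with the aerosol category, which is the "across the board" result the paragraph above describes.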

Koren believes this phenomenon is mostly felt in areas in which rainfall patterns are close to borderline. In regions that are already flood-prone, higher rain rates might not significantly affect the flooding; and in arid regions, rain is too scarce for the rate to matter much. But somewhere in the middle, as more of the annual rains fall faster than the ground can absorb them, that beneficial rain may also bring more of the less-than-favorable effects.
 

The little satellite that could

satellite data: Tropical Rainfall Measuring Mission data for a 24-hour period
Sometimes unplanned surprises can work in our favor. That is the case with the Tropical Rainfall Measuring Mission (TRMM), a satellite operated by NASA and the Japan Aerospace Exploration Agency. When it was launched in 1997, carrying monitoring technology from the early 1990s, its projected life span was around three years, at the very most. In 2005, the agencies planned to destroy the TRMM with a controlled reentry, rather than letting it crash to Earth when it ran out of fuel. But scientists realized that the data it was providing were too important, and they even appealed to the US Congress, winning a stay of execution to keep the TRMM in orbit. “The satellite has an orbit that allows it to cover much of the tropics and extract as much rainfall data over 24 hours (16 orbits) as possible,” says Koren. “The fact that it has refused to give up the ghost for 15 years means we were able to use it to produce a good, solid study.”
 
Artist's image of the TRMM satellite at work. Images: Jupiterimages and NASA
 

Prof. Ilan Koren’s research is supported by the Yeda-Sela Center for Basic Research.

 

 

 


 

 

Proof of Inequality


(l-r) Dr. Zohar Komargodski, Prof. Adam Schwimmer and visiting Prof. Alexander Zamolodchikov
The LHC particle accelerator at CERN – the European Laboratory for Particle Physics – near Geneva, is expecting big discoveries in the near future. By big, of course, they mean very, very small. The accelerator has the capacity to reveal the existence of forces and particles that have been beyond the reach of science until now: particles so small that they are dwarfed by the protons and neutrons in the nucleus of an atom. If protons and neutrons measure a millionth of a billionth of a meter, the particles scientists hope to observe in the LHC are sized around a billionth of a billionth of a meter – an attometer. Inside the LHC, immensely powerful forces are exerted on these minuscule particles, bringing science into uncharted regions. What will be found there?  
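In round numbers, the two length scales just mentioned can be restated as powers of ten:

```latex
% "A millionth of a billionth of a meter" is a femtometer (fm);
% "a billionth of a billionth of a meter" is an attometer (am).
\begin{align*}
  \text{proton, neutron:} \quad & 10^{-6} \times 10^{-9}\,\text{m} = 10^{-15}\,\text{m} = 1\,\text{fm} \\
  \text{hoped-for LHC particles:} \quad & 10^{-9} \times 10^{-9}\,\text{m} = 10^{-18}\,\text{m} = 1\,\text{am}
\end{align*}
```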

Physicists, in truth, do not know what to expect. Nevertheless, they try to plan for all possibilities, using the tools they have at hand to endeavor to make sense of any new phenomena, particles or forces they might encounter. Can such key tools as quantum field theory describe this new region in physics, in which such tiny particles exist on extremely small-distance scales and with enormously high energies? Quantum field theory provides the framework for particle physics, and it is, in a very real sense, the basis of everything we know about the physical world. This theory is the unavoidable result of attempts to reconcile the field of quantum mechanics founded by Bohr and Heisenberg with Einstein’s Theory of Special Relativity.

One such attempt to formulate a general principle for quantum field theory was put forward by the British physicist John Cardy in 1988. His conjecture concerns systems governed by known “rules” but involving large numbers of players – systems, such as stock markets, traffic patterns or the weather, whose states cannot be deduced just from knowing the rules and the individual players. Cardy proposed that in such systems an inequality holds between degrees of freedom: Those existing at short-distance scales (for example, the scales of the new particles that LHC researchers hope to discover) can only outnumber, or at most equal, those at longer-distance scales (for instance, in the forms of matter we know today).
 
If Cardy’s conjecture proves correct, it might provide an explanation of how we arrive at the Standard Model – the main theory of particle physics in use today – when a system of infinitesimally small particles at small-distance scales and very high energies cools down. We might even begin to understand how the world as we know it arises out of a complex universe of subatomic particles and powerful forces.

In 1986, about two years before Cardy presented his conjecture, another physicist, Alexander Zamolodchikov, had shown that just such an inequality between short- and long-distance degrees of freedom holds in two-dimensional systems (one dimension of space and one of time). Cardy took the idea one step further, conjecturing a similar inequality in a four-dimensional system (three of space and one of time). But the conjecture remained unproven for the next 23 years, until one evening a few months ago, when two Weizmann Institute theoretical physicists were relaxing on a beach on an Aegean island. The two were Prof. Adam Schwimmer of the Physics of Complex Systems Department and Dr. Zohar Komargodski, a recent addition to the Particle Physics and Astrophysics faculty and an Institute alumnus who recently completed postdoctoral work at the Institute for Advanced Study, Princeton.  
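Stated compactly, both results are inequalities between a quantity that counts degrees of freedom at the shortest distances and highest energies (the ultraviolet, or UV) and the same quantity at the longest distances and lowest energies (the infrared, or IR):

```latex
% Zamolodchikov's c-theorem (1986): in two dimensions (one of space,
% one of time), a function c that counts degrees of freedom can only
% decrease in passing from short to long distance scales:
c_{\mathrm{UV}} \;\ge\; c_{\mathrm{IR}}
% Cardy's conjecture (1988): the analogous statement in four dimensions
% (three of space, one of time), for a quantity called a -- the
% inequality that remained to be proven:
a_{\mathrm{UV}} \;\ge\; a_{\mathrm{IR}}
```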

For the last few years, Schwimmer and Komargodski had been looking for a way to prove Cardy’s conjecture and turn it into an accepted theorem. They had traded ideas and explored a number of different avenues, but none had really panned out. That evening, sitting on the Aegean sands and relaxing between lectures at a scientific conference they were attending, the two began to discuss their old problem. As the sun set over the blue water, the solution seemed to float up to them: Although none of the methods they had applied to the problem had yielded the long-awaited proof, they realized that four or five of their beginnings might be combined into a framework on which they could erect that proof.

So far, a number of physicists have checked the proof and announced that it stands up to various challenges. The Institute scientists say that it must be shown to withstand a number of further challenges before it can be completely accepted. Even so, physicists in a number of different fields have begun to envision new applications of the theorem, now that it appears to stand on somewhat solid ground. In particular, it could prove to be a valuable tool in interpreting LHC results in which scientists are not only searching for new particles, but for evidence for such models as supersymmetry that will help to explain the physical world.
 

Dr. Zohar Komargodski’s research is supported by the Peter and Patricia Gruber Awards.


 
 

The X Factor of Ovulation

 
An essential prelude to the journey of life is the egg's ripening in the mother’s ovary, culminating with its release at ovulation. Understanding the details of this process is crucial both for developing future fertility treatments and for designing better contraceptives. A major molecular mystery concerning this early step – one that had baffled scientists for more than seven decades – has recently been conclusively solved.

It is now well known that a woman ovulates – that is, an egg matures and is released from its ovarian follicle – under the influence of a hormone called LH, which is secreted by the pituitary gland in the brain. But in 1935, the famous American biologist Prof. Gregory Pincus, later to become one of the fathers of the pill, working with Dr. E. V. Enzmann, discovered a remarkable paradox: Eggs removed from the follicle mature spontaneously in a test tube, without any need for hormonal stimulation. How is this possible?
 
Mature oocyte, ready for fertilization; surrounding the oocyte are follicle cells
The series of studies that would ultimately resolve this riddle began at the Weizmann Institute of Science in the early 1970s. Intrigued by the Pincus-Enzmann paradox, Prof. Alex Tsafriri of the Biological Regulation Department, then a doctoral student, created the first system for culturing the large ovarian follicles of rats in a laboratory dish. This model proved useful for later studies of egg maturation in all mammals, including humans, and it made it possible to reveal the exact role of the ovulation hormone, LH. These studies also provided the first indication that a crucial molecular messenger called cyclic adenosine monophosphate (cAMP) plays a role in egg maturation.

This role was elucidated through the studies of several groups over many years, with crucial contributions by Prof. Nava Dekel of the Weizmann Institute. In a nutshell, it turned out that LH triggers ovulation through a rise in follicle cAMP production. Yet in the egg, cAMP produces the opposite effect: A steady cAMP presence blocks egg maturation, which begins to proceed only when the level drops.

As a postdoctoral fellow in the United States, Tsafriri, together with Prof. Cornelia Channing of the University of Maryland in Baltimore, showed that not only the follicle cells but even the fluid filling the follicles prevented the spontaneous maturation of mammalian eggs in the test tube. The scientists postulated that a mysterious “X factor” blocked egg maturation in the follicle. Tsafriri and his colleagues partially purified it and proposed that it was a small peptide, giving it the general name OMI, for “oocyte maturation inhibitor.”
 
 
Prof. Alex Tsafriri
 
Tsafriri continued to study egg maturation upon returning to Israel and joining the Weizmann Institute faculty. In research spanning a decade, roughly between 1975 and 1985, Tsafriri and his colleagues from Baltimore, Profs. Channing and Seymour Pomerantz, came close to solving the Pincus-Enzmann paradox. As predicted earlier, OMI turned out to be a small peptide produced by follicle cells. Most importantly, by growing rat eggs and follicle cells together in test tubes, the scientists clarified how egg maturation is controlled: They suggested that follicle cells continually release OMI, blocking egg maturation, while LH overcomes this blockage. This explanation resolved the paradox in general terms: When the egg is placed in a laboratory dish, it matures without the hormone simply because OMI is no longer there to prevent it from doing so.

During a sabbatical at Stanford University, in collaboration with Prof. Marco Conti, Tsafriri identified the final link in this chain of events, the ultimate molecular switch that keeps egg maturation in check. This switch, an enzyme called PDE3A that degrades cAMP, is present in the egg itself.  The scientists showed that as long as PDE3A remains in the OFF position, the egg does not mature, although other ovulation-related changes in the ovary can proceed undisturbed. These findings suggest that it might be possible to develop a method for blocking PDE3A as an optimal approach to contraception.
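The logic of this molecular switch can be captured in a tiny toy model. The numbers and threshold below are invented for illustration only; the point is the causal chain the paragraph describes: cAMP blocks maturation, PDE3A degrades cAMP, so the egg matures only once PDE3A flips ON.

```python
# Toy sketch of the PDE3A switch (all values are arbitrary, invented
# units -- not measured quantities).

MATURATION_THRESHOLD = 1.0  # cAMP level below which maturation proceeds

def camp_level(supply, pde3a_on, degraded_fraction=0.9):
    """Steady cAMP level in the egg: a constant supply, degraded only
    when the PDE3A enzyme is active."""
    return supply * (1 - degraded_fraction) if pde3a_on else supply

def egg_matures(supply, pde3a_on):
    """The egg matures only when cAMP drops below the threshold."""
    return camp_level(supply, pde3a_on) < MATURATION_THRESHOLD

# PDE3A OFF: cAMP stays high, maturation is blocked.
print(egg_matures(supply=5.0, pde3a_on=False))  # False
# PDE3A ON: cAMP is degraded below threshold, maturation proceeds.
print(egg_matures(supply=5.0, pde3a_on=True))   # True
```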

With time, it became clear that blocking egg maturation is a multi-step process. Why is such a complex molecular sequence needed to regulate egg maturation?  “The sequence ensures that this process is carefully controlled, so that eggs don’t start maturing spontaneously by mistake,” says Tsafriri. “Egg maturation is so critical to life, it needs to be regulated with utmost precision, in concert with the changes that prepare the womb to be receptive to the embryo at the proper time.”


X-factor identified


As for OMI, the molecular “X factor,” it was finally identified recently by Prof. John Eppig and his group at the Jackson Laboratory in Bar Harbor, Maine, as reported in Science. Indeed, it’s a peptide, of the same size that Tsafriri and colleagues had predicted, called C-type natriuretic peptide (CNP); and indeed, it is continuously released by the follicle cells. A shutdown in CNP production unleashes the complex cascade of molecular events in the follicle cells that results in the activation of PDE3A (the ON position) in the egg, degradation of cAMP there and, eventually, egg maturation. “The discovery of this molecule and its action in the follicle has provided the ultimate proof for the OMI concept we formulated more than thirty years ago,” says Tsafriri.

 

The Bahat Prize


Prof. Alex Tsafriri was awarded the Bahat Prize given annually by the University of Haifa Press “for quality, original, non-fiction manuscripts in Hebrew that have not been published previously and which have a potentially large popular audience.” Tsafriri was awarded the prize for his work “Man as Animal?” (Ha-Adam K-Haya?), which provides a scientist’s perspective on the debates held in Israel and abroad over animal experimentation in biomedical research. Tsafriri discusses the moral and philosophical aspects of such experimentation, outlining how human-animal relations are treated by the different monotheistic religions, with an emphasis on Jewish theology. The manuscript will be co-published by the University of Haifa Press and Yedioth Books.
 
 

Work around the Clock

 
There’s a season for everything, King Solomon reportedly said: a time to be born and a time to die, a time for war and a time for peace. Modern science says there’s also the best time to wake up – at 6 am, when metabolism starts running; the best time for love – at 8 am, when sex hormones are about to peak; the best time to see a dentist – at 2 pm, when sensitivity to pain drops; and the worst time to drive – at 2 am, when the body is geared for its deepest sleep.
(l-r) Top: Dr. Adi Neufeld, Tal Shamia and Ziv Zwighaft. Bottom: Drs. Judith Cohen, Gad Asher and Liat Rousso
 
Such peaks and valleys are determined by internal biological “clocks” that operate in approximately 24-hour cycles known as circadian, from Latin, circa diem, meaning “approximately a day.” These clocks regulate the daily fluctuations in heartbeat, blood pressure, kidney function, body temperature, sleeping and waking, sensory perception and the secretion of many hormones. The body’s master circadian clock, residing in the brain, synchronizes a multitude of peripheral clocks present not only in every organ in the body but in every single cell. “Our body is a collection of myriad microscopic clocks that all ‘tick’ in unison,” says Dr. Gad Asher, who recently joined the Weizmann Institute’s Biological Chemistry Department. “Moreover, amazingly enough, these clocks function perfectly well even when the cells are cultured in a laboratory dish.”
 
 
A major aim of Asher’s new lab at the Institute is to find out why we need circadian clocks and how they function. Though science does not yet have an answer to these questions, it’s obvious that the clocks are vital. For one, studies suggest that shift workers tend to have a higher incidence of cancer, diabetes and obesity, as well as signs of accelerated aging, apparently because their circadian clocks are disrupted by work patterns that don’t follow the regular day-and-night cycle. Another indication of the clocks’ importance is their survival throughout millions of years of evolution, from plants and bacteria to humans: Such survival generally means that a mechanism is important enough for nature to keep. Finally, as many as 15 percent of all genes have circadian rhythms. “That’s an incredibly high number – it means that various processes in our bodies run entirely differently at different times of the day,” says Asher.
clock illustration
 
One striking example of such a process was discovered by accident, as often happens in science, in the lab of Prof. Ueli Schibler at the University of Geneva. Several years before Asher joined that lab to conduct his postdoctoral research, Schibler’s team had discovered that a protein called DBP could be purified in unusually high amounts from liver cells. But after the scientists had already published these findings in a prestigious journal, something seemed to go terribly wrong. When they repeated the same experiments, they suddenly couldn’t purify the protein at all. After a brief spell of panic, the quandary was resolved. It turned out that the original experiments had been performed by a student who used to come to the lab at 6 am, as was the custom in his Swiss peasant family. In contrast, the follow-up experiments were conducted by an American student who, needing a rest after late-night partying, used to start work at 2 pm. It so happened that the liver protein under study was regulated by the circadian clock: Its early-morning production was a hundred times higher than in the afternoon.
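The lesson of the anecdote – that sampling time alone can swing a measurement enormously – can be sketched with a toy circadian curve. The shape and every number below are invented for illustration; only the moral (a sharply peaked ~24-hour cycle makes 6 am and 2 pm samples wildly different) comes from the story above.

```python
import math

def dbp_abundance(hour, peak_hour=6.0, baseline=1.0, amplitude=99.0):
    """Hypothetical circadian profile, sharply peaked in the early morning.

    The cosine is raised to the 4th power purely to make the peak narrow;
    this is an invented shape, not a measured one.
    """
    phase = 2 * math.pi * (hour - peak_hour) / 24.0
    shape = ((1 + math.cos(phase)) / 2) ** 4  # 1 at the peak, ~0 off-peak
    return baseline + amplitude * shape

morning = dbp_abundance(6.0)     # the early-rising student's 6 am sample
afternoon = dbp_abundance(14.0)  # the late-rising student's 2 pm sample
print(round(morning), round(afternoon, 1))  # prints: 100 1.4
```

Same protein, same protocol, same lab – the only experimental variable is the clock on the wall.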
 
This fascinating discovery gave rise to an entire series of in-depth studies of circadian clocks in Schibler’s Geneva lab. “Geneva is a world leader in all types of clock – not just the Swiss cuckoo clocks, but circadian clocks as well,” says Asher.
 
Among the major questions addressed by Asher’s current studies is how circadian clocks are regulated. It’s known that the master clock in the brain is reset every day by light-dark cycles. As for the clocks in peripheral organs, the major cues for their synchronization are feeding times; it therefore seems that peripheral clocks are extremely sensitive to the metabolic state of cells. While working in Prof. Schibler’s lab, Asher discovered that a protein called SIRT1, which plays a central role in cellular metabolism, also controls the circadian clocks’ function. This discovery, reported in the journal Cell, suggests that SIRT1 might be the “missing link” between metabolism and circadian clocks. In his lab at Weizmann, Asher now intends to expand his studies into the relationship between metabolic factors and circadian mechanisms.

Asher’s ultimate goal is to clarify how circadian clocks work at the molecular and cellular levels, and how the central clock in the brain synchronizes peripheral clocks in other organs. His studies are aimed at answering fundamental biological questions that are related to a wide variety of physiological and pathological conditions – from jetlag and sleep disturbances to cancer, diabetes, obesity and aging.

 

Devoted to science


Born in Ramat Gan, Gad Asher started out by studying mathematics as part of a special program at Tel Aviv University to promote excellence, but then switched to medicine, earning his M.D. degree with distinction in 1998. Already as a medical student, he was fascinated by biological research, spending three summer internships at the Weizmann Institute. While working as a resident internist at the Tel Aviv Sourasky Medical Center, he began doctoral studies in the Weizmann Institute’s Department of Molecular Genetics under the guidance of Prof. Yosef Shaul. Upon earning his Ph.D. degree in 2006, he decided to devote himself entirely to science. After four years of postdoctoral research at the University of Geneva, Switzerland, he joined the Weizmann faculty as a Senior Scientist in May 2011. Outside the lab, he plays classical piano and enjoys bicycle racing and mountain climbing.
 



Dr. Gad Asher's research is supported by the estate of Dorothy Geller; Ms. Rudolfine Steindling, Austria; and the Willner Family Leadership Institute.
 

Touching Ground


 

Non-polarized cell six hours after seeding onto a compliant fibronectin-coated substrate. Yellow shows paxillin, a protein associated with focal adhesions
 
 
 
Adhesion, as they say, is a sticky business, especially for a cell. The adhesion complexes linking a cell’s outer surface to neighboring cells and connective tissues consist of hundreds of different proteins. These not only hold the cell in place but also sense the properties of the cell’s surroundings and relay that information to the inside of the cell. So crucial is the adhesion-mediated process, according to new Institute research, that even cells that actively migrate must first “stick around” in order to get into traveling shape.
 
Profs. Benjamin Geiger and Alexander Bershadsky of the Molecular Cell Biology Department are interested in the ways that cells sense and respond to the physical properties of their environment. Blood pressure, muscle tension, and tissue stiffness and tension are all features and forces that act on cells, affecting their behavior and fate. Included in these is the capacity of cells to generate contractile forces of their own. When a cell contacts a surface, for example, it first attaches weakly and then contracts so as to probe the surface with the force of its pull. If this test indicates that the surface is adhesive and displays the proper mechanical properties (i.e., rigidity), it adheres, extending its adhesion complexes into so-called focal adhesions and spreading itself out. To begin revealing the intricate relationship between an adhering cell and its substrate, Geiger and Bershadsky, together with postdoctoral fellows Drs. Masha Prager-Khoutorsky and Alexandra Lichtenstein, designed and constructed an experimental system in which cells were placed on adhesive polymer surfaces that had similar chemical properties and differed only in their stiffness. Their findings were published in Nature Cell Biology.
Prof. Alexander Bershadsky
 
They found that the differences in the cells’ behavior were immediately apparent: The cells on the softer surfaces spread out equally in all directions, forming tiny, circular “fried egg” configurations, whereas those on the stiffer surface became elongated. Elongation gives cells polarity – a “head” and a “tail.” This pear shape enables the cells to take off and migrate, an essential property for embryonic development, wound healing, and tissue growth and repair. A closer look revealed differences in the focal adhesions. The round cells on the softer surface had small adhesions distributed evenly around their periphery or throughout. By contrast, on the rigid surfaces, the focal adhesions were large, and they tended to align with the future head and tail regions of the cell even before elongation was observed. In other words, whether a cell is sleek and travel-worthy or comfortably round is directly connected to the rigidity of the substrate it adheres to. Focal adhesions, among other things, function as the rigidity sensors.
 
 
To understand the focal adhesion process on the molecular level, the researchers, using gene silencing technology, systematically depleted various genes that encode specific signaling cellular proteins – some 80 different genes in all – and they tested how each modified cell responded to its surface. The Institute’s Prof. Zvi Kam provided essential help with microscopy, automatic image analysis and quantification, and Dr. Ramaswamy Krishnan of Harvard University assisted with a method for measuring the forces applied by the cells to the underlying substrate.
Prof. Benjamin Geiger
 
Some of the cells with silenced genes lost their ability to polarize or to form different sizes of focal adhesions in response to the substrate rigidity; in others, the cells’ grip on the surface or the detection of force was affected. The researchers concluded that cell polarization is a highly complex process – one that is driven by mechanical force and mediated by focal adhesions. Regulation of this process occurs in multiple stages, affecting the generation of cellular forces as well as directing the response to force. “We were surprised at how many regulatory factors are involved,” says Bershadsky. Geiger: “We have revealed a strong tie between the development of focal adhesions, the generation of force and cell migration, and have identified some of the critical regulators of this process.”

Their findings may have relevance for many areas of biology and basic biomedical research. For instance, the scientists believe the insights they have gained may be relevant to processes affecting the cells lining the blood vessels. These cells are often exposed to turbulent blood flow or regular pressure changes, which can contribute to arterial plaque formation and aberrant collateral blood vessel development. Understanding the process may help to identify potential therapeutic targets for such disorders.
 
Prof. Alexander Bershadsky's research is supported by the Kahn Family Research Center for Systems Biology of the Human Cell. Prof. Bershadsky is the incumbent of the Joseph Moss Professorial Chair of Biomedical Research.

Prof. Benjamin Geiger's research is supported by the Leona M. and Harry B. Helmsley Charitable Trust; and IIMI, Inc. Prof. Geiger is the incumbent of the Professor Erwin Neter Professorial Chair of Cell and Tumor Biology.


 
 


 

 
 
 
 