We get by with a little help from our friends - but sometimes this help can come from an unexpected source. That’s what happened to a tiny relative of the mustard plant: Using genetic engineering, scientists endowed it with a strawberry gene, enabling it to recruit impressive numbers of "bodyguard" insects that attack the plant’s enemies. This is the first time genetic engineering has been used to devise plant protection involving natural bodyguards. Dr. Asaph Aharoni of the Weizmann Institute’s Plant Sciences Department, who performed this research with colleagues from the Netherlands, says the approach may help develop advanced environmentally friendly methods of pest control. "Instead of using large amounts of pesticides that pollute the soil and groundwater, we may enable the plants to recruit natural bodyguard insects that will protect them," says Aharoni.
Numerous plants in nature are capable of recruiting bodyguards via a chain process, which involves a slew of enzymes and culminates in the plant releasing a mixture of volatile organic materials, among them substances called terpenoids. Terpenoid-producing plants include corn, apple trees, beans, cucumbers, cotton and strawberries. They attract a wide variety of predator insects, such as ladybugs, which devour aphids, and parasitic wasps, which lay their eggs in the larvae of harmful bugs.
The pathway of terpenoid production and release is extremely complex, making it possible for the plants to generate different terpenoids to attract assorted insects for all sorts of purposes - from pollination to the repulsion of harmful bugs. But what happens when terpenoid production is ineffective and does not sufficiently protect the plant? Can the pathway be corrected to adjust the time, place and quantity of terpenoid release? Such a correction would significantly improve the plants’ ability to protect themselves against their enemies. The scientists studied this possibility in a model research plant called Arabidopsis thaliana, the first plant to have its entire genome mapped and deciphered.
In attempts to jump-start the terpenoid release system, scientists around the world have tried equipping the cells of different plants with a gene that codes for a unique enzyme responsible for terpenoid production. These experiments, however, failed to produce the desired results because the enzyme "chose" to work in a particular area of the plant cells that was lacking in sufficient raw materials to make terpenoids. To overcome this difficulty, Dr. Aharoni decided to insert into the Arabidopsis plant a single strawberry gene to which he attached a "routing" genetic segment. This segment directed the enzyme to a part of the cell that was rich in the required raw materials - a strategy that allowed the enzyme to step up terpenoid production.
The engineered plant released large quantities - 25 times more than the natural plant - of a signaling chemical that recruits predator mites. At this stage, the scientists decided to test the effectiveness of the method. Predator mites were allowed to roam freely and choose between a genetically engineered and a regular plant. The result: On average, 388 mites made their way to the engineered plant, while only 191 went to the regular plant. These results were recently published in the journal Science.
Unlike natural plants, which produce terpenoids only on demand, the engineered plant releases the signaling chemical continuously, so that it cries "Wolf!" even when it’s not being attacked. This never-ending alert could conceivably create a problem, as predator mites have occasionally been known to become disappointed and lose their "trust" in the help-recruiting signals. To prevent this undesirable situation, the scientists are currently striving to engineer plants in which it will be possible to control when the signaling substances are released.
Dr. Asaph Aharoni's research is supported by the Sir Charles Clore Research Prize; the Henry S. and Anne Reich Family Foundation; Sir Harry Djanogly, CBE; and Mr. and Mrs. Mordechai Segal, Israel. Dr. Aharoni is the incumbent of the Adolpho and Evelyn Blum Career Development Chair of Cancer Research.
Predatory mite (r) attacking a plant-eating insect. Photo: Koppert Biological Systems, the Netherlands
Prof. Amir Yacoby and Ph.D. students Merav Dolev and Sandra Foletti. Quantum states
Quantum computers might have the ability to work millions of times faster than today’s computers. For this reason, scientists around the world cherish the dream of creating practical quantum computers, even though no one is quite sure the undertaking is really feasible. But this much is already clear: to construct a quantum computer, they must devise a way of storing and processing information that is encoded in quantum bits.
A regular bit is an entity that can exist in one of two states, usually described by the digits zero (0) and one (1). In contrast, a quantum bit can exist in a superposition - a blend of both states at once - which gives it a continuum of possible states rather than just two. Several candidates might perform the function of a quantum bit. One is the electron, which forms the basis of modern electronics. Electronics takes advantage of the electron’s property of carrying an electric charge; but electrons are also characterized by a kind of spinning motion, or spin. This motion can have two opposite directions. Thus two electrons carrying the same electric charge can differ in the direction of their spins. The differences in spin direction, together with this unique state of superposition, are likely to provide the basis for quantum computation technology.
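In the standard textbook notation of quantum mechanics (notation not used in the article itself), such a spin superposition is written

    \[ |\psi\rangle = \alpha\,|{\uparrow}\rangle + \beta\,|{\downarrow}\rangle, \qquad |\alpha|^{2} + |\beta|^{2} = 1, \]

meaning that a measurement will find the spin "up" with probability |α|² and "down" with probability |β|²; until that measurement, the two possibilities coexist. A regular bit, by contrast, is always exactly one of its two values.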
Superposition is a quantum phenomenon first described by Erwin Schroedinger, who demonstrated it through a thought experiment known as "Schroedinger’s cat." In this experiment, a cat is placed in a closed box together with a bottle of poison whose cork is connected to a trigger made of radioactive material. When this material decays at some point, the accompanying radiation will activate the trigger, which will then open the bottle and release the poison that will kill the cat. An observer watching the closed box can’t know whether or not the radioactive trigger has already been activated, and therefore cannot be sure whether the cat inside is dead or alive. What, then, is the cat’s real state? According to quantum theory, before the observer opens the box or measures the cat’s condition in any way, the cat exists in a state of superposition: It is simultaneously dead and alive. This is not a trick but a statement of principle: In quantum theory, an unmeasured cat really is both dead and alive at the same time. (It’s worth noting that quantum theory is among the most precisely tested of all scientific theories.)
Prof. Amir Yacoby of the Weizmann Institute’s Condensed Matter Physics Department and a team headed by Prof. Charles Marcus of Harvard University have recently conducted an experiment showing how the spin of electrons in a state of superposition can be used as a quantum bit. The idea is based on the fact that superposition exists only until someone has observed or measured the system. At the moment of measurement, the "magic" vanishes - the cat can only be either dead or alive; it can no longer exist "in both worlds." The question is: How long can the "magic" last before somebody or something measures it, thereby ending its existence?
In the new experiment, the scientists created systems in which the electrons’ spin was in superposition, and they managed to measure, for the first time ever, the duration of this superposition in a single electron. In other words, they measured how long a situation in which an electron is characterized by spins pointed simultaneously in two directions can last - a situation, that is, which ends the moment something or someone performs a measurement on the electron. To conduct the experiment, the scientists developed a way to measure the spin directions of a single electron inside a "box." Their approach is based on a system that "translates" spin into an electric charge. The position of the charge can be measured by well-known methods, so that the system measures electric charges and derives the spins from these measurements. A report on this research has recently been published in the journal Science.
Essentially, the new approach allows scientists to point a particle’s spin in any desired direction, not only the usual two. This means that the scientists can control the spin, designing different superposition states on demand. Each such superposition can represent different information, making it possible to encode information in quantum systems based on the electrons' spin. That’s a major advance toward developing quantum computers that will be able to perform computational tasks which are difficult for regular computers - such as factoring a product of two large prime numbers back into its original components. The ability to perform such tasks has great practical significance in several areas, including the encryption and decryption of information.
Prof. Yacoby and his Ph.D. students Merav Dolev and Sandra Foletti are now studying ways of grouping quantum bits, moving them from one location to another, creating logical quantum gates and performing a number of other tasks that may advance the practical application of quantum computers.
Prof. Amir Yacoby's research is supported by the Rosa and Emilio Segre Fund; and the Joseph H. and Belle R. Braun Center for Submicron Research.
(l-r) Prof. Oded Goldreich and Prof. Dana Ron. Super efficiency
Scientists have confirmed what many suspected all along: We live in an imperfect world, filled with less-than-ideal choices. For example, in computing the solution to a problem with a large number of parts, we're forced to choose between efficient computation and an exact solution. If complete accuracy is essential, the computing process is likely to be slow and costly, whereas if we want a "fast and cheap" answer, there is no guarantee it will be exact. According to a recent study by Prof. Oded Goldreich of the Weizmann Institute's Computer Science and Applied Mathematics Department and his spouse, Prof. Dana Ron of the Electrical Engineering-Systems Department at Tel Aviv University, opting for efficiency over accuracy may very often yield results that are close enough to perfect to be useful in many applications.
Efficiency, in the case of computing, means a processing time that is relatively fast in comparison to the amount of data. If the time needed to carry out the computation grows linearly - at a steady rate relative to increasing input, so that doubling the data roughly doubles the running time - it's considered to be fairly efficient. In comparison, processing that expands exponentially in relation to the data, where each additional data item can double the running time outright, is inefficient and typically impractical.
For extremely large data sets, however, even computing at linear rates can be too slow to be feasible. For certain types of problems, Goldreich and Ron found a way to work with super efficiency - by developing a technique that reads just part of the data. What they had to sacrifice was absolute accuracy.
Their method was designed with networks in mind. Networks, such as the Internet, are large entities made up of sites, or nodes, and the links between them. Questions concerning networks may involve enormous amounts of data. They may ask, for instance, how connected is the network (i.e., can one get from any one point to any other point, and through how many different routes)? Or what is the average distance between two nodes? These issues affect research and planning in areas as diverse as computer networks, transportation and biological studies of gene and protein networks.
The standard algorithms (a computer's plans of action) one might use to answer these questions must analyze all of the data found in a network. Goldreich and Ron's algorithms are so efficient they're dubbed sublinear - the processing needed is smaller than the input size: They randomly choose a sample of the network's nodes and then scan only a portion of each node's immediate surroundings. Like election-time pollsters, who come close to predicting the actual election results on the basis of a representative sampling of voters, the scientists' method can produce approximate, but very close, answers on the basis of only a fraction of the data.
One of their algorithms, for example, can determine with near accuracy whether a given network is bipartite (separable into two parts). In a bipartite network, the nodes can be partitioned into two sets such that every link connects a node in one set to a node in the other. Processing the data in a large network with a standard algorithm could range from time-consuming and expensive to futile. In contrast, the scientists' algorithm works by taking a random sample of nodes (the sample size is determined by the size of the data and can be as small as its square root) and then taking a small number of steps in random directions from each.
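To give a rough feel for how a sublinear test can work, here is a simplified sketch in Python (the code and every name in it are illustrative; this is a caricature of the idea, not Goldreich and Ron's published procedure). It samples on the order of the square root of the nodes as starting points and takes short random walks from each; if some node can be reached from the same start by both an odd-length and an even-length walk, the network contains an odd cycle and therefore cannot be bipartite.

    import math
    import random

    def looks_bipartite(adj, walks_per_start=50, walk_len=20):
        """Randomized, sublinear-style check. Returns False only if evidence of
        an odd cycle is found; returns True ("probably bipartite") otherwise.
        `adj` maps each node to a list of its neighbours."""
        nodes = list(adj)
        num_starts = max(1, math.isqrt(len(nodes)))   # square-root-sized sample
        for start in random.sample(nodes, min(num_starts, len(nodes))):
            parities_seen = {}   # node -> set of walk-length parities reaching it
            for _ in range(walks_per_start):
                node, parity = start, 0
                for _ in range(walk_len):
                    if not adj[node]:
                        break
                    node = random.choice(adj[node])
                    parity ^= 1
                    parities_seen.setdefault(node, set()).add(parity)
            # reaching a node by both an odd and an even walk from the same
            # start implies an odd cycle - impossible in a bipartite network
            if any(len(p) == 2 for p in parities_seen.values()):
                return False
        return True

    if __name__ == "__main__":
        square = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [2, 0]}   # bipartite
        triangle = {0: [1, 2], 1: [0, 2], 2: [0, 1]}            # odd cycle
        print(looks_bipartite(square), looks_bipartite(triangle))

Because only a bounded sample of nodes and a bounded number of short walks are examined, the work done can be far smaller than the network itself - and, as with the pollsters, the answer is only very probably, not certainly, correct.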
Thus in cases in which close is good enough, or better than nothing at all, Goldreich and Ron's algorithms may make a lot of sense. For a small loss in accuracy, the scientists gain a great deal of efficiency. And that, in an imperfect world, is close to ideal - a solution based on compromise.
Prof. Oded Goldreich is the incumbent of the Meyer W. Weisgal Professorial Chair of Experimental Physics.
100 Proof
How can we convince someone of something without giving away any information? Prof. Goldreich, whose work often deals with randomness and interaction, came up with the following story to illustrate the key to so-called zero-knowledge proofs:
On Olympus, the practical-minded Hera declared that all new nectar jugs, formerly of gold, be henceforth fashioned from silver. But perceptive Athena complained, claiming the nectar from the silver jugs did not taste as good as that from the gold ones. Zeus, already irritated with Hera's economizing, let loose his ire: "Let her be served 100 gourds of nectar, each poured at random from an old or a new jug, and let her tell them apart!" To Zeus' surprise, Athena correctly identified the source of each sip of nectar, convincing him there really was a difference.
Athena must have had a way of discerning one taste from the other. (Otherwise, even a goddess would have gotten the answer right only about half the time.) But her proof was based only on the randomness of the repeated servings, not on her means of identifying them, so Zeus learned nothing beyond the fact that she was right. Such zero-knowledge proofs are useful in designing secure computer systems.
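The arithmetic behind Zeus' change of heart is simple: a taster with no way of telling the jugs apart has a 1-in-2 chance of guessing each gourd's origin, so the chance of getting all 100 right by luck alone is (1/2)^100 - about 8 in 10^31, less than one in a million trillion trillion. A perfect score is therefore overwhelming evidence of a real difference, while still revealing nothing about how Athena detects it.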
Prof. Benjamin Geiger. Big questions for nanomedicine
Even the most inveterate tinkerer must occasionally consult the instruction manual. Unfortunately, for some important machines we rely on every day - the assortment of microscopic machinery that makes up our body's cells - no such manual exists. While the parts list - genes, proteins and biological molecules - grows longer by the day, scientists are just beginning to understand how these components all work together to make up the complex machinery of cells, and how breakdowns in this equipment, the cause of many diseases, might be fixed.
A new international research project aims to improve this situation by setting out to write a "Cell Operations Manual" and a "Cell Repair Manual." This project is part of an ambitious initiative of the National Institutes of Health (NIH) in the U.S. called the "Roadmap for Medical Research." The brainchild of NIH director Elias Zerhouni, the Roadmap was set up to fund innovative biomedical research in a number of areas, with no less a goal than that of transforming medical science. In the futuristic area of nanomedicine, four groups were awarded grants totaling $43 million over five years. Prof. Benjamin Geiger, Dean of Biology and researcher in the Molecular Cell Biology Department at the Weizmann Institute of Science, is a member of one of these four groups, the NanoMedicine Center for Mechanical Biology. Each member of the group, which includes biologists, materials scientists, physicists and theoreticians from the U.S., Israel and Switzerland, will bring his or her own research experience to bear on fundamental questions concerning the mechanics of life on the incredibly tiny scales of cells and molecules.
Scale, in fact, is one of the more tangled puzzles the scientists plan to address. How do cells, around 40 microns across (a micron is a millionth of a meter), self-organize to become organisms that are meters in size? At the other end of the scale, single molecules, the information-bearing units of the cell, are in the nanometer range - just thousandths of the cell's size. If cells were the size of people, their sense organs would be little bigger than grains of sand. How does communication between a cell and its parts take place across this range?
Communication is another subject the scientists will tackle. Cells are continually subjected to mechanical forces, whether the pumping force of blood or the structural force of bones, tissues and neighboring cells. Endowed with sophisticated means of sensing these forces, they are able to convert their "readout" on the nature of the force into biochemical signals that then inform the cell's actions. But how exactly does this happen? Many disease processes - including cancer metastasis, in which cancer cells stop clinging to their neighbors and move away - might be tied to the cells' failure to properly sense and interpret forces. In addition, bioengineers attempting to grow tissues from various stem cells have found that cells need the proper mechanical cues in their environment to know how to develop into specific cell types.
From nanomaterials and nanoelectronics to molecular biology, the world of the ultra-small has its own physical laws, which very often differ from those of the everyday world. By incorporating knowledge from varied fields, the research group intends to develop new approaches to understanding the mechanics of the cell. As their work progresses, the scientists hope to gain insight into many of the major health issues facing us today: wound healing, hypertension and cardiovascular diseases, osteoporosis, nerve regeneration, immune responses and cancer. The "instruction manuals" they're planning will then become works-in-progress that can be applied to maintaining the machinery of life in good working order.
Prof. Benjamin Geiger's research is supported by the Clore Center for Biological Physics; the Ilse Katz Institute for Material Sciences and Magnetic Resonance Research; the Levine Institute of Applied Science; the Women's Health Research Center; the Edith C. Blum Foundation; the Samuel R. Dweck Family Foundation; Mr. and Mrs. James Adler, Chevy Chase, MD; the estate of Evelyn Blum, Switzerland; the estate of Ernst and Anni Deutsch, Liechtenstein; and Ms. Ruth Browns Gundelfinger, San Rafael, CA. Prof. Geiger is the incumbent of the Professor Erwin Neter Professorial Chair of Cell and Tumor Biology.
Surface Patterns
For a number of years Prof. Geiger, a molecular cell biologist, and Prof. Joachim Spatz of the University of Heidelberg, a materials scientist, have been working together to try to figure out how the cell "reads" and responds to the information in its environment. Spatz and his group in Heidelberg create materials with surfaces that mimic collagen, one of the body's support materials. They simulate different conditions by controlling various properties of these materials, such as their surface topography or relative hardness, and they also produce nanopatterned surfaces with minuscule areas of differing properties. Onto this carefully designed surface they anchor assorted molecules using gold nanoparticles.
These molecules are positioned so as to create tiny islands on which only one receptor (a cellular "antenna" through which cells connect to the world outside their walls) can gain a foothold at a time. "With these materials, we see exactly which molecules the receptors recognize and interact with," says Geiger. In recent studies, the scientists have discovered that the placement of the binding molecules affects the ability of the cell receptors to work together either to keep the cell stuck to the surface or to help it move. The method has the added advantage of providing a relatively large surface - a centimeter or so square (several football fields to a cell) - to work with.
(l-r) Dr. Erez Dekel and Prof. Uri Alon. Evolution in a test tube
In a world of stable populations where each individual must struggle to survive, those with the "best" characteristics will be more likely to survive, and those desirable traits will be passed to their offspring.
- Ernst Mayr, The Growth of Biological Thought
Was Darwin right, back in 1859, when he proposed what is arguably the most famous, and the most controversial, theory in the history of biology? Natural selection, or "survival of the fittest," despite the overwhelming scientific evidence in its support, remains just a theory. Direct proof of natural selection has been lacking because it's been difficult to test fitness quantitatively in the lab – until now.
To address the ongoing natural selection debate, post-doctoral fellow Erez Dekel, working under the supervision of Prof. Uri Alon of the Weizmann Institute of Science's Molecular Cell Biology and Physics of Complex Systems Departments, devised a set of experiments to measure, for the first time, the fitness of a simple organism and ascertain whether it is really the fittest organisms that survive into the next generation.
They studied a simple one-celled organism (the bacterium Escherichia coli) and its production of lactose protein, an enzyme that allows the bacterium to utilize lactose, the sugar in milk. The scientists asked themselves: Why does the E. coli cell make a specific number of lactose proteins – 60,000 – as opposed to, say, 50,000 or 70,000? And is this number really the optimum; that is, does this amount somehow make a better contribution to the organism's survival than any other?
Indeed, their results, recently published in Nature, showed that, like the baby bear's porridge – neither too hot nor too cold, but just right – there is a "just right" amount of protein that a bacterium should produce for maximum gain. "It is all a matter of weighing up the costs and benefits," explains Alon. For a bacterium living in a given lactose environment, produce too much protein, and the cost of production and maintenance becomes a burden. Produce too little protein, and the bacterium cannot reap all the benefits of the available sugar. But with just the right amount of protein, the bacterium thrives and produces offspring with the optimal potential.
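The weighing-up can be captured in a toy calculation. The sketch below (in Python; the functional forms and every number in it are illustrative assumptions, not the model or the measurements published in Nature) defines fitness as benefit minus cost and finds the protein level at which the difference peaks - the "just right" amount under the assumed conditions.

    import numpy as np

    # Toy cost-benefit model (illustrative assumptions only): making z copies of
    # the lactose protein costs resources, and the cost climbs steeply as
    # production nears an assumed cellular capacity M; the benefit of digesting
    # lactose grows with z but saturates once enzyme outstrips the sugar supply.
    M = 1.8e5              # assumed production capacity (protein copies)
    COST_PER_COPY = 1.0e-6
    BENEFIT_SCALE = 0.17

    def cost(z):
        return COST_PER_COPY * z / (1.0 - z / M)

    def benefit(z, lactose=1.0):
        return BENEFIT_SCALE * lactose * z / (z + 4.0e4)

    def fitness(z, lactose=1.0):
        # net growth advantage: benefit of using the sugar minus production cost
        return benefit(z, lactose) - cost(z)

    z = np.linspace(1.0, 1.5e5, 10_000)
    best = z[np.argmax(fitness(z))]
    print(f"optimal protein level in this toy model: {best:,.0f} copies")

Raise or lower the assumed lactose level and the optimum shifts accordingly - the same kind of environment-dependent "just right" point that the experiments set out to measure.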
After measuring the cost of manufacturing the lactose protein and the benefits gained by utilizing the sugar, Dekel decided to take the experiment one step further. He wanted to find out whether the level of protein production can actually change over evolutionary time scales. But how to test whether natural selection is really at work or whether the changes that occur are due to chance?
He approached this problem by growing E. coli in seven separate test tubes, each containing different amounts of lactose. Every day he took a sample of bacteria out of each tube and placed it in a fresh tube containing the same amount of lactose as the previous one. A few months and more than 550 generations of E. coli later, the time had come to analyze the results. Dekel found that not only did each line of bacteria evolve and adapt by altering the amount of the protein produced, but those amounts corresponded to the optimum levels he had predicted in his previous experiment. These heritable changes were seen to take effect after only 200-300 generations.
Some biologists have suggested that the details of various biological systems are just historical accidents that get handed down from generation to generation like Great-Grandma's old furniture. In this scenario, asking why a cell produces a specific number of proteins would be like wondering what advantage there is to having carved legs on the table - they just are. But the team's findings show that, just as you can replace the legs on Great-Grandma's table with ones that are easier to dust, simple organisms can rapidly and accurately evolve to reach an optimal state. "We have witnessed the survival of the fittest in a test tube," says Alon. Is this proof of natural selection "fit" enough to survive the scrutiny of the next generation of skeptics? Time, and further experiments, will tell.
Prof. Uri Alon's research is supported by the Nella and Leon Benoziyo Center for Neurological Diseases; the Clore Center for Biological Physics; the Yad Abraham Research Center for Cancer Diagnostics and Therapy; the Leon and Gina Fromer Philanthropic Fund; the Kahn Family Foundation for Humanitarian Support; the Minerva Stiftung Gesellschaft fuer die Forschung m.b.H.; the James and Ilene Nathan Charitable Directed Fund; the Harry M. Ringel Memorial Foundation; the estate of Ernst and Anni Deutsch, Liechtenstein; and Mr. and Mrs. Mordechai Segal, Israel.
Evolution's Building Blocks
In a series of related, ongoing projects, Alon uses computer simulations to study the theoretical basis of the mechanisms underlying evolution. His most recent work, done in collaboration with Ph.D. student Nadav Kashtan, offers an explanation as to why many biological systems seem to have evolved to be modular – made up of many separate building blocks that are able to perform independently.
For example, a cheetah and a giraffe both possess four limbs, which can be thought of as four modules. Evolution can make small adjustments to this template to meet the different needs of each animal. For the cheetah, the modules can be tweaked to result in limbs that run fast, whereas for the giraffe, the same basic modules can be tweaked slightly differently to create longer limbs. The study hints that modular systems may spontaneously arise with environmental changes that are themselves modular, thus allowing organisms to adapt more rapidly. Modular design might also help to speed up the process of evolution, allowing nature to tinker with the parts rather than redesigning the whole animal.
(l-r) Dr. Ishai Dror and Prof. Brian Berkowitz. Taking up the challenge
Part family saga, part invention that might help extend the world’s oil reserves for many years to come, the story of a patent for a method of extracting usable oil from so-called oil sands has all the elements of a "tragedy-to-bittersweet success" movie.
The story begins with Prof. Norbert Berkowitz, father of the Weizmann Institute’s Prof. Brian Berkowitz and a highly respected coal chemist at the University of Alberta in Edmonton, Canada. The northern part of Alberta, one of Canada’s western provinces, contains enormous deposits of oil in its oil sands. The problem is that the oil in these deposits is more like thick tar than a gushing liquid, making it impossible to simply sink a well and pump the stuff up. This oil is presently dug up and diluted with large amounts of solvent, or even larger amounts of a thinner oil, so it can be piped to refineries hundreds or even thousands of kilometers away. But these methods are expensive and problematic, severely limiting the extraction potential of these fields.
Norbert Berkowitz had an idea for turning the viscous gunk from oil sands into flowing oil, using water. He and his Calgary-based business partner, Stephen Dunn, applied for a patent for the process in Canada. But they needed to test the idea, and Norbert, who was retired at that point, did not have immediate access to a lab. Before he got the chance to begin carrying out experiments on his idea, he and his wife, Sheila, were tragically killed in an auto accident in 2001.
Dunn met Brian and his siblings in Edmonton, where they had gathered soon after the accident, and gave them a copy of the patent application as a family keepsake. Then and there, Brian promised he would take up where his father had left off and see the project to fruition. For Brian, whose area of research is hydrology, this involved first and foremost a crash course in hydrocarbon chemistry. Even with some knowledge of petroleum engineering (which he’d picked up some 20 years earlier during his M.Sc. studies), he needed specialized knowledge to carry on with the project. Fortunately, his father had written three books and dozens of scientific papers on the subject, and these became Berkowitz’s study guides, inspiring him to delve more deeply into the basic science behind the idea. For Brian, touching and reading these books was "both magical and intensely painful."
When the time came to build the actual apparatus in his lab in the Environmental Sciences and Energy Research Department with the help of then post-doctoral fellow Ishai Dror, it took two and a half years to plan, construct and test an experimental system. "We had to design everything ourselves, from the screws on up," says Berkowitz. The process then had to be fine-tuned. "Cook" it too long, and instead of free-flowing oil the result is coke, a sort of low-grade coal. More than once, Berkowitz found himself covered in oil from small explosions, and the lab walls had to be repainted. After countless late nights in the lab and dozens of nights spent reading and thinking, the day finally arrived when the oil, in a cinematic "Eureka!" moment, began to flow like milk.
Berkowitz’s father had come up with a way to create a system that could potentially extract a large amount of oil quickly and efficiently. The process relies on water that is heated under pressure. Although it had been known for some time that a hot "supercritical liquid" could be used to partially break down viscous oil, the new system translates this knowledge into a practical, flow-through apparatus that doesn’t require a long heating period. A supercritical liquid is neither a liquid nor a gas, but a substance that has properties of both, plus some atypical properties as well. It occurs at high pressures and temperatures: As the pressure goes up, the boiling point of water rises and the water remains liquid, even at temperatures well above 100° C (212° F). Oil won’t dissolve in either water or steam, but supercritical water can be used as a solvent for all kinds of oil. In the case of the thick oil from oil sands, the supercritical water reacts chemically with the long hydrocarbon chains that cause the oil to be so viscous, breaking them into shorter ones that flow past each other more easily.
In addition to the original patent (now handled by Yeda, the business arm of the Weizmann Institute), which has been expanded with add-ons that have come out of Berkowitz’s research, a four-country patent application has been filed under the aegis of Yeda. Besides Canada, other countries - among them Venezuela - have extensive oil sands and heavy oil deposits that could potentially be exploited with the method.
The final scene in the movie is yet to come: Norbert Berkowitz’s business partner, Stephen Dunn (who has since become Brian’s friend and business partner), recently signed an agreement with Yeda to build a demonstration plant in Calgary. Hopefully, as the partners ride off into the sunset, it will be in a car fueled with plentiful Canadian heavy oil.
Prof. Brian Berkowitz's research is supported by the Sussman Family Center for the Study of Environmental Sciences; the Brita Fund for Scholarships Research and Education; the Feldman Foundation; the P. and A. Guggenheim-Ascarelli Foundation; and Mr. and Mrs. Michael Levine, Pinckney, NJ. Prof. Berkowitz is the incumbent of the Sam Zuckerberg Professorial Chair in Hydrology.
In the fields of observation, chance favors only the prepared mind.
- Louis Pasteur, 1854
Serendipity - making fortunate discoveries by accident - has played a role in some of the most influential scientific achievements. This is more or less what happened to Prof. Nava Dekel of the Weizmann Institute's Biological Regulation Department. "It all began with a straightforward study," says Dekel, "but it has turned into something completely unexpected, opening up new research directions that might give hope to women with fertility problems."
Dekel was originally interested in discovering whether a specific protein, known to form channels for communication between neighboring cells, plays a role in the implantation of the fertilized egg into a woman's uterus. This stage is a critical one - IVF treatments most often fail at or soon after implantation. In an ongoing collaboration with Kaplan Hospital, Dekel and her research team recruited 12 women, all of whom suffered from fertility problems and had failed to conceive after numerous IVF treatments, and took a series of uterine biopsies at different stages in the menstrual cycle. After analyzing the results, the team found, as they had predicted, that levels of the protein under study undergo changes during the menstrual cycle, and the production patterns did, indeed, point to a role for this protein in the successful implantation of the fertilized egg in the uterus.
So far, they'd found what they were looking for and everything was going according to plan. The surprise was just around the corner: Of the 12 women who participated in the study, 11 became pregnant during the next round of IVF. Was this merely a coincidence, or did the biopsies somehow effect a "magic touch"? Dekel set up another experiment, again involving women who had problems conceiving. This time, however, the researchers divided the patients into a group of 45 volunteers who had biopsies taken and a control group of 89 women who did not undergo biopsy. There was no doubt about it: The biopsy procedure had somehow managed to double the pregnancy success rate.
How does a simple biopsy create such a dramatic increase in pregnancies? "It seems counter-intuitive. How can injury lead to a positive outcome?" Dekel wondered. She suggests, on the basis of this and other evidence obtained from previous studies, that some form of distress, in this case a biopsy procedure, provokes a response that renders the uterus more receptive to implantation. Dekel and her team are continuing to investigate, looking for the exact mechanisms involved when an unreceptive uterus turns into a receptive one as a result of local injury. Both animal studies and human clinical trials are now being conducted to identify genes that may play a role in this process.
From an accidental finding, these scientists have started down a new path of inquiry that may, in the future, give birth to new treatment procedures to improve the success rate of IVF or even tackle some types of fertility problems directly.
This study is conducted by Dekel and Drs. Yael Kalma and Yulia Gnainsky (former and present post-doctoral fellows, respectively), in collaboration with Drs. Amichai Barash and Irit Granot of the In Vitro Fertilization Unit of the Kaplan Medical Center Obstetrics and Gynecology Department.
Prof. Nava Dekel's research is supported by the Y. Leon Benoziyo Institute for Molecular Medicine; the Willner Family Center for Vascular Biology; the Dwek Family Biomedical Research Fund; the Paul Godfrey Research Foundation in Children's Diseases; the Rachel and Shaul Peles Fund for Hormone Research; Mr. Max Candiotty, Los Angeles, CA; and Shimon Shestovich Ltd. Prof. Dekel is the incumbent of the Philip M. Klutznick Professorial Chair of Developmental Biology.
(l-r) Vardit Eckhouse, Prof. Asher Friesem, Prof. Nir Davidson and Liran Shimshi
You can't have your cake and eat it too. Or maybe sometimes you can, at least according to scientists at the Weizmann Institute of Science who set out on a quest to devise lasers that are both high-powered and razor-sharp - properties that are generally mutually exclusive.
Usually, powerful lasers have low beam quality, while high beam quality is attainable only in weak lasers. As opposed to light waves from a light bulb, which are emitted in all different wavelengths and directions, laser beam light waves are emitted in a single wavelength and direction. The more closely matched the waves, the higher the beam quality. A high-quality beam can be focused to a point like that of a sharp dagger, while a low-quality beam spreads out more like a blunt butter knife.
But what would happen if several weak lasers were combined? Would they turn into one strong, sharp laser? And the big question: Would the high quality of the separate weak beams be maintained once combined, or would some quality be lost in the process? A team headed by Profs. Asher Friesem and Nir Davidson of the Physics of Complex Systems Department, including Ph.D. students Amiel Ishaaya, Vardit Eckhouse and Liran Shimshi, sought answers to just these questions when they created a laser joining 16 individual weak beams into one powerful beam. In a paper published in Optics Letters, they showed their combination laser sustained a beam quality as high as that of the original weak beams.
To combine beams, the properties of all the light waves have to be coherent (identical in every way). Then the light waves must be superimposed in such a way that the peaks and troughs line up exactly with one another. But this is not so simple in practice: These conditions are very difficult to achieve, and the minutest disturbance can knock the waves out of their superimposed state.
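Why coherence matters so much can be seen from a general textbook relation (not a formula quoted in the team's paper): when N identical light fields of amplitude E are superimposed exactly in phase, their amplitudes add before being squared into intensity,

    \[ I_{\mathrm{coherent}} \propto |N E|^{2} = N^{2}|E|^{2}, \]

whereas fields with random relative phases merely add their intensities, giving only N|E|². That gap between N² and N is why locking the beams' phases, rather than simply pooling their raw power, is what preserves the sharpness - the focusability - of the combined beam.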
The team - with a bit of "magic" - has managed to design a laser that largely overcomes such problems. In their device, 16 beams of light are produced and, like darts, they all travel straight toward a special optical element located within the laser cavity. This element is positioned such that beams that aren't coherent or properly aligned won't pass through; but if they are exactly the same and superimposed to boot, the element efficiently combines their individual powers. "And here comes the magic," says Davidson. "As if the laser beams have personalities and can make choices, they automatically choose to conform to one another. They seem to know their alternative is to be cast out, and they are able to rapidly and continuously self-conform even under unstable conditions." The end result: an uncompromisingly stable, powerful and sharply focused combined beam.
Although this is not the first time laser beams have been joined, the team has taken laser combining to new heights. Other techniques face restrictions in beam numbers, but with this innovation, the possibilities are unlimited, at least in theory. The design of this laser is also more stable and robust - crucial traits when translated to practical purposes. Their research, therefore, has significant implications in a wide range of commercial fields, from laser radars, optical communications, space exploration and material processing to laser treatments and surgery. Their next steps, already in progress, are to refine their design to eliminate practical constraints to combining even more beams, and to apply it to additional types of laser systems.
Prof. Nir Davidson's research is supported by the Fritz Haber Center for Physical Chemistry; the Levine Institute of Applied Science; the Rosa and Emilio Segre Fund; and the Cymerman-Jakubskind Prize. Prof. Davidson is the incumbent of the Peter and Carola Kleeman Professorial Chair of Optical Sciences.
Left to right: Dr. Atan Gross, Iris Kamer, Dr. Rachel Sarig, Limor Regev, Galia Oberkovitz, Dr. Hagit Niv and Dr. Yehudit Zaltsman. Cellular self-check
In the life of a cell, just as in the course of a person’s life, there are times that call for a break in the routine; a moment to run a “self-check” to review the state of one’s internal health and settle on a course of action. When the genetic material of the cell is damaged, the cell needs to assess the extent of that damage and decide whether to turn on the DNA repair machinery or commit suicide to prevent the possible development of cancer. If this crucial self-evaluation stage is skipped, wrong decisions are likely to result.
The process of self-checking, decision making and action is managed by a crack team of proteins. Depending on the circumstances, certain members of this team may be given more than one assignment. In research published in the journal Cell, Dr. Atan Gross and his research team in the Biological Regulation Department at the Weizmann Institute showed that one prominent protein called BID lives the life of a double agent. BID plays two different roles: one in the chain of events that leads to cell suicide, and another in the chain that ends in survival. How is this one protein able to perform two, seemingly opposed missions?
In its better-known role as a regulatory protein in the suicide chain, BID begins to act when an enzyme, caspase-8, cuts a piece off the protein, leaving a smaller protein, tBID, that is primed and ready to carry out its assigned task in the cell’s suicide.
Gross and his research team began to learn of the protein’s double role when they asked what happens to the suicide chain if BID does not undergo cleavage. To answer this question, they inserted into embryonic mouse cells a gene that encodes a mutant BID protein, one that resists cutting by the caspase-8 enzyme. To their surprise, the mutant, uncut BID was still activated. Gross and his team then tracked the activities of this protein and soon discovered that BID hooks up with another protein, ATM, a “player” for the anti-suicide side. ATM is a crucial manager of the DNA repair process in the cell nucleus; its role is so vital we can’t survive without it. It is called into action in response to toxins that severely damage DNA and cause breaks in both strands of the double-stranded chain. ATM, in turn, recruits a whole emergency crew to join in the rescue effort. Postdoctoral fellow Dr. Rachel Sarig, a member of Gross’s research team, found that along with the rescue workers, some of which science has not yet identified, ATM activates the full, uncut version of BID. This finding suggests that BID might play an important role in the chain of events that leads to DNA repair and cell survival.
Illustration of cell life cycle and BID
What is this role? To continue the investigation, research student Iris Kamer examined how the cell life cycle was managed in embryonic mouse cells lacking the gene for BID. These cells were exposed to a DNA-breaking toxin. The result: The BID-deficient cells skipped the critical self-check stage. Only when the scientists reintroduced BID into the cells did the cells pause in the cycle to run a self-check.
The researchers’ investigations into the activities of this double agent revealed further insights into its whereabouts and dealings. They refuted current thinking, which maintained that BID resides only in the body of the cell, showing that it also frequents the nucleus. The research team, which included Dr. Yehudit Zaltsman, Dr. Hagit Niv and research students Galia Oberkovitz, Limor Regev and Gal Haimovich, also found that BID can postpone the decision to commit suicide by a few hours, granting the cell’s DNA repair machinery a chance to undo the damage (a process in which BID might play yet another role). Although such a delay may seem inconsequential, pausing to seize the chance to preserve cell life rather than rushing into a decision to commit suicide may have some advantages.
When this self-checking process is improperly managed, the upshot might be unsound decisions that could lead, for example, to the replication of damaged DNA or a delay in cell death – mistakes that can result in uncontrolled cell proliferation and the development of cancer. Gross hopes that the findings of his research team will contribute to a better understanding of cancer progression and the resistance of some tumors to chemotherapy and radiation, as well as helping to explain genetic diseases in which ATM is absent or has been inactivated.
Dr. Atan Gross’s research is supported by the Y. Leon Benoziyo Institute for Molecular Medicine; the Dolfi and Lola Ebner Center for Biomedical Research; the David and Fela Shapell Family Center for Genetic Disorders Research; the Willner Family Center for Vascular Biology; the Jean-Jacques Brunschwig Fund for the Molecular Genetics of Cancer; the Harry and Jeanette Weinberg Fund for Molecular Genetics of Cancer; the Louis Chor Memorial Trust Fund; the Abisch Frenkel Foundation for the Promotion of Life Sciences; la Fondation Fernande et Jean Gaj; la Fondation Raphael et Regina Levy; and Mr. and Mrs. Stanley Chais. Dr. Gross is the incumbent of the Robert Armour Family Career Development Chair.
Right timing is in all things the most important factor.
- Hesiod (about 800 BC)
Everyone, from professional athletes to stock market investors to politicians, can tell you that timing is crucial. A path to the goal clears for a split second, a window of opportunity opens for buying or selling, the time is ripe to pass a law or sign a treaty. In the medicine of the future, when tissues and organs are grown specifically for transplants, doctors may also need to pay careful attention to timing. Prof. Yair Reisner of the Institute’s Immunology Department has found that the successful transplantation of tissues from developing organs may depend on zeroing in on a window of opportunity.
For over two decades scientists, Reisner among them, have been investigating the possibility of transplanting tissues from embryonic pigs into humans. Such “grown-to-order” organs might have the potential to resolve the pressing shortage of available transplant organs and could conceivably bypass problems associated with rejection. New tissues might be used to treat a host of diseases, as well. Experiments have been performed, for instance, on transplanting pancreatic tissue from embryonic pigs to treat type 1 diabetes. Yet the results of experimental attempts at such implantation have been unsatisfactory.
In the developing embryo, tissues become increasingly differentiated as the various organs form. Starting with the earliest embryonic stem cells, which literally have the potential to form any kind of tissue, each generation of cells gradually becomes more and more “locked” into becoming one kind of tissue, to the exclusion of other kinds. If tissues are transplanted at too early a stage, when they are still relatively undifferentiated, there is a risk that they will develop into teratomas – potentially malignant tumors. However, wait too long to transfer organs from one animal to another, and the tissues will already have developed identifiers that are likely to trigger rejection in the new host.
Pancreatic tissue that has developed normally after being transplanted in the sixth week from pig embryos to mice
“Basically, there is a complex interplay between three parameters,” says Reisner. “There is the risk of teratoma, the risk of graft rejection and the stage at which the transplant organs will grow best.” Generally, tissues harvested earlier are less likely to be rejected. In previous research, Reisner and his team had demonstrated that miniature, fully functioning pig and human kidneys could be grown in mice from transplanted embryonic tissue. By using mice that were bred to have inactive immune systems, thereby preventing rejection, the team discovered the earliest point in time that affords perfect growth without the risk of teratomas forming. They determined that the ideal time slots to harvest pig and human embryonic kidneys were at four and seven weeks of gestation, respectively.
In recent experiments, Reisner and his research team transplanted tissues from pig embryos, again into immunodeficient mice, and waited to see how they developed. They found that each type of organ has its own window of opportunity during which transplantation is more likely to succeed. Reisner’s findings appeared in the Proceedings of the National Academy of Sciences, USA.
The liver, for instance, was the earliest to reach its optimal stage of growth and function – combined with a lowered risk of teratoma formation – at four weeks. The window of opportunity for transplanting the pancreas opened two weeks later, at six weeks, and closed around week ten. After this date, the insulin-secreting capacity of transplanted pancreatic tissues began to decline. The development of mature lung tissue, with all the necessary elements for a fully functioning respiratory system, occurred relatively late in gestation. This sequence of the opening and closing of time windows seems to follow the normal order of embryonic development, in which the liver and pancreas are the first to develop, and the lungs last.
“Disappointing results in past transplantation trials may be explained, at least in part, by these findings,” says Reisner. “The attempts to cure diabetes, for instance, made use of late gestational organs but, according to the results of this study, they may have had a much higher success rate if tissues had been taken from younger embryos.” With this study, the idea of growing tissues and organs for transplantation may have come one step closer to becoming reality.
Prof. Yair Reisner’s research is supported by the M.D. Moross Institute for Cancer Research; the Abisch Frenkel Foundation for the Promotion of Life Sciences; Richard M. Beleson; Renee Companez; the Crown Endowment Fund for Immunological Research; Erica A. Drake; Ligue Nationale Francaise Contre le Cancer; Mr. and Mrs. Barry Reznik; the Gabrielle Rich Center for Transplantation Biology Research; the Union Bank of Switzerland Optimus Foundation; the Belle S. and Irving E. Meller Center for the Biology of Aging; the J & R Center for Scientific Research; the Mario Negri Institute for Pharmacological Research-Weizmann Institute of Science Exchange Program; and the Loreen Arbus Foundation. Prof. Reisner is the incumbent of the Henry H. Drake Professorial Chair in Immunology.