Working under Pressure


Several molecules must work together for proper joint lubrication. In this illustration, the lubricin molecules anchor the long hyaluronan chains to the cartilage collagen network; these hyaluronan chains in turn attach the phospholipid molecules in either single layers (green) or double layers (blue)

 

No man-made lubricant comes close to meeting the daily demands we make of our joints. As we run or walk, the surfaces of the cartilage in our hip and knee joints slide over one another at high pressures. And they do so over and over, every day, for many decades. Two new studies from the group of Prof. Jacob Klein of the Weizmann Institute’s Materials and Interfaces Department, which appeared in Nature Communications, are helping to solve the puzzle of joint lubrication, and they may, in the future, point the way to treatments for problems with this lubrication, especially osteoarthritis. One in three people suffers from osteoarthritis – the most common form of arthritis – by age 65, and nearly one in two does so by age 85; younger people often develop the disease after sports injuries or other accidents; and there is no cure.

Part of the reason the joint lubrication system had not been deciphered until now, says Klein, is that it forms a very thin layer on the cartilage surface, and the properties of that layer are difficult to probe. Moreover, of the many different molecular substances in the joints, several have been proposed as the main lubricating molecule. The three leading candidates for the role of lubricant are phospholipids, hyaluronan (a chain-like polymer molecule) and lubricin (a protein). The problem is that in some five decades of tests and experiments, none of them alone has performed nearly as well as the body’s actual lubrication system.

 

(L-R) Drs. Nir Kampf and Ronit Goldberg, Prof. Jacob Klein, Dr. Jasmine Seror and Anastasia Gaisinskaya-Kipnis

Klein and postdoctoral fellow Jasmine Seror suggested a way out of this bind: What if all three molecules worked together to lubricate the joints? They devised an experiment using two of the molecules. Working together with Linyi Zhu, a visiting student from the Chinese Academy of Sciences, Dr. Ronit Goldberg, a senior intern in Klein’s group, and Prof. Anthony Day of the University of Manchester, they created a model lubricating system of phospholipids and hyaluronan placed between atomically-smooth test surfaces made of the mineral mica. They then applied a wide range of loads to the surfaces, mimicking the pressures in actual joints, to see how well the two layers could slide when pressed.

Klein and his group had been studying the first type, phospholipids, because they are an unusual kind of lubricant that relies on water – via so-called hydration lubrication. Phospholipid molecules have two parts: a water-loving head and a couple of long, water-repelling tails. The heads have both a positive and a negative charge, and so they strongly attract water molecules, H2O, which also carry the two types of charge: negative on the oxygen and positive on the hydrogen. The water molecules form a hydration shell around the lipid head; it is this shell that works like a microscopic “ball bearing” and provides the lubrication.

An atomic force microscopy image of mica surfaces coated with hyaluronan molecules to which phospholipids were added. The “bead-necklace” features seen in the image – two of which are indicated by red lines – reveal a lubricating structure consisting of hyaluronan and phospholipids similar to that depicted in the above illustration

The results of the experiments showed that the combination of the two molecules, hyaluronan and phospholipids, worked remarkably well as lubricants – close to the performance of real joint lubrication. The scientists believe that each plays a different role in reducing friction between the joints’ surfaces. Klein explains: “Although you would not think it of a structure made of water, the hydration shell is quite strong – it is at once incompressible and yet fluid. In other words, it is an excellent lubricant. The long hyaluronan molecules act as a ‘sticky’ layer at the joints' surface, anchoring the phospholipids so that their hydrated heads are exposed on the outside face. This lubricates the sliding. We think that the third molecule – the lubricin – acts as a connector, attaching the hyaluronan molecules to the cartilage.”

 

Just the Right Resistance

In this model, the hydration shells on the phospholipid heads are what ultimately reduce the friction. How well do they work? One way to understand lubrication is to test the viscosity of the fluid substance – how resistant it is to flow. Anyone who has changed the motor oil in a car knows that the correct viscosity (the SAE numbers on the can) is important for keeping the motor running smoothly. The same is true of our joints. Not viscous enough, like plain water, and the substance would be squeezed out of the joint. Too viscous, like gelatin, and it would gum up the works. Hydration shells are a bit more complex, kept in place by the enclosed charges that attract the water molecules. But the question remained: Is their viscosity “just right,” so as to explain the lubrication?

To answer this, in the second study, Klein and his research group, including Liran Ma, Anastasia Gaisinskaya-Kipnis and Nir Kampf, investigated the viscosity of these watery shells. The researchers created simple hydration shells – small positive ions surrounded by water molecules. They then trapped these hydration shells between two atomically-smooth surfaces and measured their viscosity by compressing and sliding those surfaces at different velocities. Their results showed that a fluid made up of these shells is about 200 times as viscous as water – around that of motor oil SAE 30 or cold maple syrup. This viscosity, measured in trapped hydration shells, turns out to provide a good explanation for the observed reduction in friction in the previous study.
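
To put that factor of 200 in absolute terms – a back-of-the-envelope conversion in which the factor of 200 comes from the study but the reference values are standard textbook numbers:

\[
\eta_{\mathrm{shells}} \approx 200\,\eta_{\mathrm{water}} \approx 200 \times 10^{-3}\ \mathrm{Pa\cdot s} = 0.2\ \mathrm{Pa\cdot s},
\]

which is indeed in the range of SAE 30 motor oil near room temperature (very roughly 0.2–0.4 Pa·s) – viscous enough not to be squeezed out of the contact, yet fluid enough to keep the surfaces sliding.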

Taken together, says Klein, these experiments will lead both to a better understanding of the complex lubrication system that keeps our joints in good working order and, hopefully, in the future, to treatments for osteoarthritis.

 

Prof. Jacob Klein's research is supported by the European Research Council; the Charles W. McCutchen Foundation; the Minerva Foundation; the Israel Science Foundation; and the ISF-NSFC joint research program. Prof. Klein is the incumbent of the Hermann Mark Professorial Chair of Polymer Physics.


 

 

In a Heartbeat

Two hearts, said Keats, can beat as one; but a study led by Weizmann Institute scientists in collaboration with researchers from the University of Pennsylvania shows that sometimes a single heart muscle cell can beat as more than two dozen. The findings, reported recently in Nature Communications, provide an extremely detailed glimpse into the mechanisms behind normal and irregular heart muscle cell contractions. The study may help define the limitations of existing therapies for abnormal heartbeat and, in the future, suggest ways of designing new ones.

Each heart muscle cell consists of numerous parallel filaments comprising repeated subunits. When the heart beats, each individual filament contracts to produce muscle cell contractions.

Optimally, all the filaments should contract in a synchronized manner, thus ensuring the greatest amplitude of contraction for each muscle cell, and ultimately, the strongest and most effective beating of the entire heart. However, a new theoretical model proposed and analyzed by Prof. Samuel Safran and postdoctoral fellow Dr. Kinjal Dasbiswas of the Weizmann Institute’s Materials and Interfaces Department suggests that the filaments contract together only when their subunits, and subunit boundaries, are aligned with one another. Since such alignment usually only happens among a limited number of neighboring filaments, aligned filaments contract together as a bundle, but each such bundle contracts out of phase with others. Therefore, a heart cell does not necessarily beat as a single uniform entity; rather, the number of different beating entities in the cell depends on the bundle number, which may reach more than two dozen.

The theory, which uses the methods of statistical physics, further predicted that the alignment of the filaments in the heart muscle cell depends on the cell’s physical environment – more specifically, on the elasticity of the supporting structure called the extracellular matrix. The alignment is best when this structure is neither too soft nor too rigid. The prediction took into consideration various forces operating on the microscale, particularly mechanical forces that are exerted on each filament subunit by neighboring filaments via the extracellular matrix.
 
A chicken heart muscle cell under a fluorescent microscope; the filaments consist of repeated subunits (bright dotted lines). The schematic representation shows three neighboring filaments; the black lines are the boundaries of their subunits, such that the lower filament is aligned with the middle one, while the upper one is not

By assuming that only structurally aligned filaments beat together, the Weizmann theorists were able to quantitatively explain experimental findings by their collaborators from the University of Pennsylvania, Prof. Dennis Discher and Dr. Stephanie Majkut. In the experiments, these scientists had placed chick embryonic heart cells on support surfaces of varying stiffness and found that two strikingly different properties – the structural alignment of the filaments and the beating strength of the cell – depended on the rigidity of the supporting surface.
 
Providing a theoretical basis for these experiments, the Weizmann Institute model may help explain how filaments become aligned in heart muscle cells during embryonic development, and how their arrangement correlates with the muscle function in the adult heart.

This correlation suggests that the current means of treating irregular heartbeat may be limited to a certain extent by the structural order of heart muscle filaments. But the new understanding may one day help design improved treatments for heart disease. For example, in the future, if new heart cells are grown to replace diseased ones, their growth environment may be manipulated so that their structure is well-ordered and, to paraphrase Keats, all their filaments beat as one.


Prof. Samuel Safran’s research is supported by the Gerhardt M.J. Schmidt Minerva Center on Supramolecular Architectures, which he heads; the US-Israel Binational Science Foundation; the Israel Science Foundation; Antonio and Noga Villalon, Winnetka, IL; the Clore Center for Biological Physics; the Kimmelman Center for Structural Biology; and the Kimmel Stem Cell Research Institute. Prof. Safran is the incumbent of the Fern and Manfred Steinfeld Professorial Chair. Dr. Kinjal Dasbiswas’s research is supported by a fellowship from the Council of Higher Education.

 

 

 


Artificial Cells Produce Real Proteins


 

A living cell, from one point of view, is a sort of sprawling protein factory that can churn out thousands of different proteins to order. Prof. Roy Bar-Ziv of the Weizmann Institute’s Materials and Interfaces Department is pursuing the idea of creating “artificial cells” that might, in the future, enable us to control the production of proteins or other complex biological processes.

 
Fluorescent image of DNA (white squares) patterned in circular compartments connected by capillary tubes to the cell-free extract flowing in the channel at bottom. Compartments are 100 micrometers in diameter
 

 

 

The system, designed by PhD students Eyal Karzbrun and Alexandra Tayar in Bar-Ziv’s lab, in collaboration with Prof. Vincent Noireaux of the University of Minnesota, comprises multiple compartments etched onto a biochip. These tiny artificial cells, each a mere millionth of a meter in depth, are connected via thin capillary tubes, creating a network that allows the diffusion of biological substances throughout the system. The instructions – DNA designed by the scientists – are inserted into the cells, along with the protein-making machinery and raw materials – both provided by an extract of the bacterium E. coli.


The genetic sequence the researchers inserted contained two regulatory genes – basically “on” and “off” switches. Much like their real counterparts, the artificial cells are linked through a capillary system to a feeding channel that enables them to absorb nutrients and exchange materials; and they, in turn, produce proteins in a periodic fashion.

(l-r) Eyal Karzbrun, Alexandra Tayar and Prof. Roy Bar-Ziv
The network also mimicked a key facet of complex cellular communication – one that takes place during embryonic development. As the body plan takes shape – a process called morphogenesis – the diffusion of proteins out of the cells becomes crucial. Tayar: “We observed that when we place a gene in a compartment at the edge of the array, it creates a diminishing protein concentration gradient; other compartments within the array can sense and respond to this gradient – resembling an embryo during early development. We are now working to expand the system and to introduce gene networks that will mimic pattern formation, such as the striped patterns that appear during fly embryogenesis.”
 
Bar-Ziv: “Genes are like Lego in which you can mix and match various components to produce different outcomes; you can take a regulatory element from E. coli that naturally controls gene X, and produce a known protein; or you can take the same regulatory element but connect it to gene Y to get different functions that do not occur in nature.” This research may, in the future, help advance the synthesis of fuels, pharmaceuticals and chemicals, as well as the production of enzymes for industrial use, to name a few of the possibilities.
 
Prof. Roy Bar-Ziv's research is supported by the Yeda-Sela Center for Basic Research.
 
 

 

 

 

 

Artificial Cells Act Like the Real Thing

 
Imitation, they say, is the sincerest form of flattery, but mimicking the intricate networks and dynamic interactions that are inherent to living cells is difficult to achieve outside the cell. Now, as published in Science, Weizmann Institute scientists have created an artificial, network-like cell system that is capable of reproducing the dynamic behavior of protein synthesis. This achievement is not only likely to help gain a deeper understanding of basic biological processes, but it may, in the future, pave the way toward controlling the synthesis of both naturally-occurring and synthetic proteins for a host of uses.
 
(l-r) Eyal Karzbrun, Alexandra Tayar and Prof. Roy Bar-Ziv
 

The system, designed by PhD students Eyal Karzbrun and Alexandra Tayar in the lab of Prof. Roy Bar-Ziv of the Weizmann Institute’s Materials and Interfaces Department, in collaboration with Prof. Vincent Noireaux of the University of Minnesota, comprises multiple compartments “etched” onto a biochip. These compartments – artificial cells, each a mere millionth of a meter in depth – are connected via thin capillary tubes, creating a network that allows the diffusion of biological substances throughout the system. Within each compartment, the researchers inserted a cell genome – strands of DNA designed and controlled by the scientists themselves. In order to translate the genes into proteins, the scientists relinquished control to the bacterium E. coli: Filling the compartments with E. coli cell extract – a solution containing the entire bacterial protein-translating machinery, minus its DNA code – the scientists were able to sit back and observe the protein synthesis dynamics that emerged.


By coding two regulatory genes into the sequence, the scientists created a protein synthesis rate that was periodic, spontaneously switching from periods of being “on” to “off.” The amount of time each period lasted was determined by the geometry of the compartments. Such periodic behavior – a primitive version of cell cycle events – emerged in the system because the synthesized proteins could diffuse out of the compartment through the capillaries, mimicking natural protein turnover behavior in living cells. At the same time fresh nutrients were continuously replenished, diffusing into the compartment and enabling the protein synthesis reaction to continue indefinitely. “The artificial cell system, in which we can control the genetic content and protein dilution times, allows us to study the relation between gene network design and the emerging protein dynamics. This is quite difficult to do in a living system,” says Karzbrun. “The two-gene pattern we designed is a simple example of a cell network, but after proving the concept, we can now move forward to more complicated gene networks. One goal is to eventually design DNA content similar to a real genome that can be placed in the compartments.”
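
A rough way to see why the compartment geometry sets the clock – a generic protein-turnover sketch, not the specific model used in the paper – is to write the concentration p of a synthesized protein as

\[
\frac{dp}{dt} = \alpha(t) - \frac{p}{\tau_{\mathrm{dil}}},
\]

where \alpha(t) is the gene-regulated synthesis rate and \tau_{\mathrm{dil}} is the characteristic time for proteins to be diluted out of the compartment through its capillary. Because \tau_{\mathrm{dil}} depends on the dimensions of the compartment and of the capillary connecting it to the feeding channel, changing the geometry changes the effective protein lifetime – and with it the period of the “on”–“off” oscillations.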
 
Fluorescent image of DNA (white squares) patterned in circular compartments connected by capillary tubes to the cell-free extract flowing in the channel at bottom. Compartments are 100 micrometers in diameter
 
 
 

The scientists then asked whether the artificial cells actually communicate and interact with one another like real cells. Indeed, they found that the synthesized proteins that diffused through the array of interconnected compartments were able to regulate genes and produce new proteins in compartments farther along the network. In fact, this system resembles the initial stages of morphogenesis – the biological process that governs the emergence of the body plan in embryonic development. “We observed that when we place a gene in a compartment at the edge of the array, it creates a diminishing protein concentration gradient; other compartments within the array can sense and respond to this gradient – much as morphogen concentration gradients form across the cells and tissues of an embryo during early development. We are now working to expand the system and to introduce gene networks that will mimic pattern formation, such as the striped patterns that appear during fly embryogenesis,” explains Tayar.
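
The shape of such a gradient can be pictured with the standard diffusion-plus-removal estimate – a textbook form, not a result quoted from the paper. A protein produced in the edge compartment, diffusing with coefficient D and removed or diluted on a characteristic time \tau, settles into a steady-state profile

\[
c(x) \approx c_0\, e^{-x/\lambda}, \qquad \lambda = \sqrt{D\tau},
\]

so compartments farther along the array see an exponentially smaller concentration – the same kind of positional information an embryo reads out of its morphogen gradients.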

With the artificial cell system, according to Bar-Ziv, one can, in principle, encode anything: “Genes are like Lego in which you can mix and match various components to produce different outcomes; you can take a regulatory element from E. coli that naturally controls gene X, and produce a known protein; or you can take the same regulatory element but connect it to gene Y instead to get different functions that do not occur in nature.” This research may, in the future, help advance the synthesis of fuels, pharmaceuticals and chemicals, as well as the production of enzymes for industrial use, to name just a few of the possibilities.
 
 
Prof. Roy Bar-Ziv’s research is supported by the Yeda-Sela Center for Basic Research.
 

Higher End, Lower Cost


 

(l-r) Profs. Gary Hodes, Henry Snaith and David Cahen

 

The goal: To create photovoltaic cells that are highly efficient, easy to produce and economical enough to put on every rooftop. If new materials that have recently come onto the solar cell scene live up to their promise, that goal could possibly be met within the next few years. Prof. David Cahen of the Materials and Interfaces Department says: “It will get to the point that the cells themselves will be so cheap that the external costs involved in producing photovoltaics and installing them will be what ultimately determine their price.”


Cahen and Prof. Gary Hodes of the same department are big fans of these materials, called perovskites. Based on such inexpensive metals as tin or lead, the compounds in this group of materials all share a particular crystal structure. “Perovskite cells are the first inexpensive type of solar cell to work in the high-energy part of the spectrum and give high voltage. And they are so easy to make, you can basically produce them on a hot plate,” says Hodes.
 

 

The rapid rise in efficiency for perovskites as compared to other photovoltaic materials. Graph: Hodes

 

Ninety percent of today’s photovoltaic cells are made with silicon, explains Cahen. Silicon is an inexpensive, abundant material that can allow a current of electrons to flow when photons from sunlight hit its surface. To complete the process, other materials added to the solar cell facilitate charge separation: sending the electrons to one side of the cell for use and the “holes” they leave behind to the other.


But even the very best future silicon module, working at its peak, can use only up to 25% of the sunlight that falls on its surface. Photons at the low-energy end of the spectrum – infrared, for example – do not have enough energy to free silicon's electrons, while those at the high-energy end are so energetic that much of their energy is lost as heat. Scientists trying to improve the efficiency of solar cells have been investigating materials that can make better use of those higher-energy photons, which could provide more electricity from the same surface area. Over the years many of these groups – Cahen and Hodes's among them – have made progress, mostly incremental, in improving efficiency. But both the slow pace and the complexity of these improvements have often been a frustrating factor.
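
A rough, textbook illustration of where these losses come from – the numbers below are standard values for silicon, not figures taken from the article: a photon's energy is E = hc/\lambda, and only photons with energy above silicon's band gap of about 1.1 eV can free an electron.

\[
E = \frac{hc}{\lambda} \approx \frac{1240\ \mathrm{eV\cdot nm}}{\lambda}
\quad\Rightarrow\quad
\lambda_{\mathrm{cutoff}} \approx \frac{1240\ \mathrm{eV\cdot nm}}{1.1\ \mathrm{eV}} \approx 1100\ \mathrm{nm}.
\]

Infrared photons with wavelengths longer than about 1100 nm pass through unused, while a blue photon at 450 nm carries roughly 2.8 eV, of which some 1.7 eV ends up as heat. A material that delivers a higher voltage per absorbed photon – as the perovskites described below do – wastes less of that high-energy light.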


Perovskites leapt onto the solar cell scene in 2009, when a group led by Prof. Tsutomu Miyasaka from Yokohama, Japan, used them in a special type of solar cell. While the efficiencies they achieved were respectable, the stability of their cells was very poor. Within a short time, the groups of Profs. Nam-Gyu Park and Sang-Il Seok from Korea, that of Prof. Henry Snaith (see below) in the UK and that of Prof. Michael Graetzel in Switzerland showed the way to better efficiencies and stability. Snaith, who had recently set up his lab at Oxford University, and others showed that cells using these materials gave relatively high voltages. Before long, these groups were creating experimental perovskite solar cells that rivaled silicon alternatives in efficiency and produced higher voltages.

 
Cahen and Hodes immediately recognized the potential of this material to fill in the “missing link” – that is, a cheap cell that yields high voltages, efficiently using the high-energy part of the solar spectrum. In a number of recent papers published together with Dr. Saar Kirmayer and PhD student Eran Edri in Nature Communications, with Tel Aviv University colleagues in Nano Letters, and with colleagues from Princeton University in Energy & Environmental Science, they elucidated the mechanisms by which perovskites manage to convert sunlight into useful electricity with high-voltage efficiency, as well as demonstrating some methods of improving on solar cells made of these materials. In two studies that appeared in The Journal of Physical Chemistry Letters, they and MSc student Michael Kulbak tested perovskites layered with various materials that act as hole conductors for improved charge separation. These yielded high-voltage cells that surpassed 1.5V (silicon yields around 0.7V, by comparison). Their results, they say, can point the way toward further improvements.


What is the secret of these materials? Cahen and Hodes say their research and that of others points to the crystal structure. The materials form high-quality structures – that is, structures containing few irregularities. The behavior and efficiency of the perovskite cells fit in with a model that Cahen and his coworkers had proposed some years back, which holds that this order is critical. Furthermore, the material simultaneously contains both inorganic (lead and iodide or bromide) and organic (carbon- and hydrogen-based) components. It is the way these components fit together that makes perovskites so useful for photovoltaic cells: Like silicon, they form nicely ordered crystalline structures, but at the same time, the interactions between the organic parts are weak, making surfaces that allow electrons to cross easily from grain to grain.


At present, very small perovskite photocells have achieved 16% efficiency, and it appears likely that they can reach, and possibly surpass, 20%. There are still a few hurdles to overcome, cautions Cahen. For one, these materials must prove they are stable over time. For another, the compound contains a small amount of lead, and care must be taken either to find a substitute or to ensure that this toxic element cannot be released into the environment. Neither of these issues, say Cahen and Hodes, has dampened their enthusiasm. Perovskites have renewed both interest in the field and the hope that solar energy may finally become a widespread alternative to fossil-fuel-generated electricity.
 


 

 

Visit


Prof. Henry Snaith was as surprised as anyone by the rapid success of perovskite photovoltaics. “We looked at a whole range of materials before beginning to work with them,” he says. “Normally, if you start with 1% efficiency, you’re happy just to have proved that the material has photovoltaic properties. We started at 6%; within six months we had achieved 10%; and now we are already at 16% – nearly the efficiency of silicon.”


Snaith was recently in Israel as a guest of the Weizmann Institute, and through the Institute, he appeared at the annual meeting of the Israel Chemical Society. During his visit – just two days – he met with Cahen and Hodes and their groups. Snaith and the Weizmann scientists are at present undertaking a collaborative project that is supported by an initiative of Weizmann UK. They are working on widening the range of light energy the material can absorb. Snaith notes that he is looking forward to working with the Institute’s scientists: “Fantastic work is being done here at the Weizmann Institute,” he says.  
 

Prof. David Cahen’s research is supported by the Mary and Tom Beck Canadian Center for Alternative Energy Research, which he heads; the Nancy and Stephen Grand Center for Sensors and Security; the Ben B. and Joyce E. Eisenberg Foundation Endowment Fund; the Monroe and Marjorie Burk Fund for Alternative Energy Studies; the Leona M. and Harry B. Helmsley Charitable Trust; the Carolito Stiftung; the Wolfson Family Charitable Trust; the Charles and David Wolfson Charitable Trust; and Martin Kushner Schnur, Mexico. Prof. Cahen is the incumbent of the Rowland and Sylvia Schaefer Professorial Chair in Energy Research.


Prof. Gary Hodes’s research is supported by the Nancy and Stephen Grand Center for Sensors and Security; and the Leona M. and Harry B. Helmsley Charitable Trust.
 

 

 

When Material Worlds Unite


 

Profs. David Cahen, Leeor Kronik and Ron Naaman

 
 
 
An emerging class of materials and devices – organic electronics – has attracted much attention in the past two decades. In this world, such traditional inorganic semiconductors as silicon are combined with organic (carbon-based) materials. By combining desirable properties from both worlds – with a nearly infinite choice of possible combinations – organic-inorganic materials offer a number of advantages over conventional inorganic semiconductors. Many can be produced easily and at low cost; they can be lightweight; and they can be designed with mechanical and chemical flexibility in mind.

In fact, such materials as organic light-emitting diodes (OLEDs) are already in use today in digital displays for TV screens, computer monitors and smartphones.
 
Often overlooked when combining two materials is the interface – the area in which they meet. That thin boundary turns out to be crucial for the properties of new materials, especially for those used in electronic devices. “The interface is so important, it is essentially what makes or breaks our ability to use an electronic material,” says Prof. David Cahen of the Materials and Interfaces Department.

At the Weizmann Institute, various groups are conducting cutting-edge research into organic-inorganic interfaces from all angles – from theory to experiments. This has led to highly fruitful collaborations both within Weizmann and beyond; an Israel Science Foundation grant, awarded to them together with scientists from Bar-Ilan and Tel Aviv universities, designates this research a scientific “Center of Excellence.”  
 
 
 
 

Guided intuition


Prof. Leeor Kronik, Head of the Institute’s Materials and Interfaces Department, focuses his research on the theoretical: the novel electronic, optical and magnetic properties of organic-inorganic interfaces. His main tools are so-called first principles calculations – calculations based on the laws of quantum physics, in which a material’s properties are predicted from nothing more than the type of atoms of which it is made up.

“Although we have well-developed models for both organic and inorganic materials,” says Kronik, “combining these two different worlds creates a truly new class of materials, bringing with it novel, unexpected phenomena.” One striking example is the observation of magnetism at the interface between two non-magnetic materials – a discovery of Prof. Ron Naaman of the Chemical Physics Department. “Building new theories on traditional models can be misleading; but a bottom-up approach, using first principles calculations, can guide us to understanding such unique phenomena,” says Kronik.

To predict the electronic structure and properties of atoms and molecules making up a material, scientists use density functional theory (DFT), currently the only practical way of conducting first principles calculations of systems with many thousands of electrons. DFT calculations are exact – in principle – but their practical application is always approximate. So Kronik and his group are developing new theoretical methods and equations to refine DFT, in hopes of gaining a more accurate understanding of these new materials.
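
To make the “exact in principle, approximate in practice” point concrete, here is the standard Kohn-Sham form of DFT – a textbook equation, quoted for illustration rather than taken from Kronik's papers. Each electron is described by an effective single-particle equation

\[
\left[-\frac{\hbar^2}{2m}\nabla^2 + v_{\mathrm{ext}}(\mathbf{r}) + v_{\mathrm{H}}(\mathbf{r}) + v_{\mathrm{xc}}(\mathbf{r})\right]\varphi_i(\mathbf{r}) = \varepsilon_i\,\varphi_i(\mathbf{r}),
\]

where v_ext is the potential of the nuclei, v_H the averaged electron-electron repulsion, and v_xc the exchange-correlation potential. The first two terms are known exactly; in practice, essentially all of the approximation – and hence the room for the refinements Kronik's group develops – lies in v_xc.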
 

Control, control, control

Much of materials research revolves around the control of structure, since it is a material’s molecular structure that ultimately determines its properties. The idea of using organic materials at interfaces is attractive precisely because it is easier to manipulate their structure and composition than those of inorganic materials.

 
A transmission electron microscope (TEM) image of a methyl-styrene monolayer (“white stripe”) about 1 nm thick at the interface between silicon (Si) and lead (Pb). The superimposed molecular schemes are drawn to scale with the monolayer
 
Prof. David Cahen, who helped pioneer the field together with Prof. Abraham Shanzer of the Institute’s Organic Chemistry Department, is finding ways to optimize conventional electronic materials by adding organic materials at their surface, specifically to the silicon in solar cells.
 
The use of silicon – the ubiquitous semiconductor material found in most electronic devices today – became possible only once industry figured out a way to “tame” its “messiness” through the process of oxidation: A thin oxide layer is grown on its surface. This offers a controlled decrease in defects at the interface between the silicon and the oxide, as well as adding a protective layer to the silicon. But oxidation is a difficult process; placing a thin layer – one to three atoms thick – of organic molecules at the interface leads to better control and opens up the potential for obtaining new properties.
 
Why is this important for solar cells? The best conventional solar cells are close to their theoretical limit (about 30%), but most are much less efficient; much of the avoidable energy loss is the result of less-than-optimal control at the interface. “Though solar cells made with organic materials are not yet in commercial use, the example of organic LEDs, which are already being manufactured and used, gives us hope that it’s only a matter of time before organic photovoltaics will mature into a viable technology,” says Cahen. “Along the way, we are learning the fascinating science of these interfaces.”

 

Putting a spin on it

Experimental system for measuring spin-specific electron transfer through DNA
 
In another area of organic electronics known simply as molecular electronics, single molecules or one-molecule-thick layers of an organic compound are the active elements of electronic devices. This branch is being explored by Prof. Ron Naaman. More specifically, his research focuses on spin electronics or “spintronics.” Spintronic devices take advantage of an electron’s spin – a quantum property – and Naaman is trying to figure out, among other things, how spin can be used for memory devices and computing. Such computers would harness the power of atoms and molecules to perform calculations significantly faster than any silicon-based computer. Although they are mostly a “sci-fi” notion at present, Naaman is making some interesting strides toward turning this notion into reality.

An electron’s spin has two possible states, either “up” or “down,” and changing the ratio of spins in a material can change its properties. Controlling electron spin within a device is usually achieved with magnetic layers; but the use of these is complex, and they are sensitive to heat and cold.

Naaman discovered that spin direction can be controlled with organic, biological molecules, especially helical, double-stranded DNA. He based this discovery on a well-known property of many organic molecules, called chirality: They exist in either “left-” or “right-handed” forms that cannot be superimposed on one another. When Naaman exposed DNA to mixed groups of electrons with both directions of spin, the DNA “preferred” electrons with one particular spin over the other.

Naaman: “DNA has turned out to be a superb ‘spin filter,’ and this concept has opened up new avenues in using organic materials for spintronics.”
 

 

Joining forces
 

“The overwhelming response we received from students and postdocs around the world is testament: This field is alive and well,” says Prof. Leeor Kronik, co-organizer of the 7th International Conference on Electronic Structure and Processes at Molecular-Based Interfaces (ESPMI VII), recently hosted at the Weizmann Institute of Science. The conference brought together some of the best minds in the field, providing a platform for sharing the latest research results as well as the opportunity to consolidate old collaborations and forge new ones. To make the conference more useful to those at the beginning of their career, mainly students, a workshop with tutorials was organized by Weizmann Institute students in the days before the actual meeting.
 
The conference was attended by more than 150 scientists; some 42 acknowledged experts in the field – from the USA, through Europe, to Japan – were invited to speak. Kronik: “The decision to hold the conference in Israel this year is recognition that important and interesting research on organic-inorganic interfaces is being conducted here.”
 
 
 

 

 
Prof. David Cahen's research is supported by the Mary and Tom Beck Canadian Center for Alternative Energy Research, which he heads; the Leona M. and Harry B. Helmsley Charitable Trust; the Nancy and Stephen Grand Center for Sensors and Security; the Ben B. and Joyce E. Eisenberg Foundation Endowment Fund; the Monroe and Marjorie Burk Fund for Alternative Energy Studies; the Carolito Stiftung; the Wolfson Family Charitable Trust; the Charles and David Wolfson Charitable Trust; the estate of Theodore E. Rifkin; and Martin Kushner Schnur, Mexico. Prof. Cahen is the incumbent of the Rowland and Sylvia Schaefer Professorial Chair in Energy Research.

Prof. Leeor Kronik's research is supported by the Wolfson Family Charitable Trust; the Carolito Stiftung; the European Research Council; the Leona M. and Harry B. Helmsley Charitable Trust; and Antonio and Noga Villalon, Winnetka, IL.
 
Prof. Ron Naaman's research is supported by the Nancy and Stephen Grand Research Center for Sensors and Security, which he heads; and the estate of Theodore Rifkin. Prof. Naaman is the incumbent of the Aryeh and Mintzi Katzman Professorial Chair.
 
Prof. Abraham Shanzer is the incumbent of the Siegfried and Irma Ullmann Professorial Chair.

 

Cell on a Chip Reveals Protein Behavior


 

Protein interaction on a chip: Red proteins concentrated more on the right, farther from the chip-bound genes, while green proteins are more highly concentrated on the left, closer to the genes that encode them
 
 
 
 
For years, scientists around the world have dreamed of building a complete, functional, artificial cell. Though this vision is still a distant blur on the horizon, many are making progress on various fronts. Prof. Roy Bar-Ziv and his research team in the Weizmann Institute’s Materials and Interfaces Department recently took a significant step in this direction when they created a two-dimensional, cell-like system on a glass chip. This system, composed of some of the basic biological molecules found in cells – DNA, RNA, proteins – carried out one of the central functions of a living cell: gene expression, the process by which the information stored in the genes is translated into proteins. More than that, it enabled the scientists, led by research student Yael Heyman, to obtain “snapshots” of this process at nanoscale resolution.

The system, consisting of glass chips that are only 8 nanometers thick, is based on an earlier one designed in Bar-Ziv’s lab by Dr. Shirley Daube and former student Dr. Amnon Buxboim. After being coated in a light-sensitive substance, the chips are irradiated with focused beams of ultraviolet light, which enables the biological molecules to bind to the substance in the irradiated areas. In this way, the scientists could precisely place DNA molecules encoding a protein marked with a green fluorescent marker in one area of the chip and antibodies that “trap” the colored proteins in an abutting area. When they observed the chips under a fluorescence microscope, the area in which they had placed the antibodies turned a glowing bright green. This meant that the DNA instructions had been copied into RNA molecules, which were in turn translated into fluorescent green proteins. The green proteins were then ensnared by the antibodies.

Next, the scientists asked whether their cell-like system could reproduce complex structural assemblies of naturally-occurring proteins. This time, they attached to the chips’ surface a viral gene encoding a protein that can self-assemble into a nanotube. With the help of Dr. Sharon Wolf of the Electron Microscopy Unit, they used an electron microscope to observe a forest of minuscule tubes sprouting from the antibody area.

The researchers then sought a way to produce and trap multiple proteins simultaneously by confining each protein in the area of its gene on the chip. On top of the chip to which the DNA encoding green proteins was bound, the scientists added a solution with a second gene encoding a red protein. The resulting red and green proteins competed for binding on the antibody traps, yielding a graded spatial separation in which the antibodies closest to the green genes had the highest concentration of green protein, with red concentrations rising farther afield. The results of this research recently appeared in Nature Nanotechnology.
 
Illustrated biological chip: Genes (gray) are attached to the chip, along with antibodies (blue). The gene encoding the second protein is purple. RNA strands (red) are copied, and, depending on the information they encode, the proteins produced in the ribosomes (yellow) glow either green or red (green and purple cylinders)
 
 
 

Bar-Ziv: “We have shown that it is possible to build a protein 'production line' outside of the cell and use it to observe a spectrum of protein activities.” In the future, such a system may move from enabling the observation of proteins to providing the basis for techniques to create complex, active protein structures on demand.

Prof. Roy Bar-Ziv’s research is supported by the Yeda-Sela Center for Basic Research; and the Carolito Stiftung.


 

The Science of the Slow Slip

Dr. Eran Bouchbinder
 

 

We all know about the big earthquakes: The edges of two tectonic plates in the earth’s crust slip against one another, causing shaking and damage as the slip releases huge amounts of pent-up energy within mere seconds. The earth, however, literally moves under our feet much more often than we’re aware. In the last few years researchers have come to realize that there is another kind of earthquake – called a “slow” or “silent” earthquake – which does not entail a strong ground-shaking energy release. Rather, the slip takes place many orders of magnitude more slowly – over hours or even days. Although the ultimate energy release might be similar, the slow pace means such quakes are not felt.

Slow quakes, says Dr. Eran Bouchbinder of the Chemical Physics Department (Faculty of Chemistry), appear to be a common occurrence. It is only the recent development of novel geophysical and laboratory measurement techniques that has enabled scientists to uncover their existence and prevalence. Researchers are eager to understand the physical principles underlying the slow quakes: Are they governed by the same physics as their fast counterparts? What actually determines their speed? There is some evidence that in certain cases, slow quakes may be precursors of the fast ones, though no one can say, as yet, how to identify those that might presage a major event.
 
 
In research recently published in Geophysical Research Letters, Bouchbinder and his team proposed a mathematical model suggesting that slow quakes might be controlled by somewhat different physics than the fast ones, accounting for their leisurely rate of movement.

Both types of earthquake involve frictional interfaces – the surfaces where the plates meet. Such interfaces are present in a great number of physical systems and play a central role in phenomena occurring on widely disparate scales, ranging from nano-machines and tiny biological motors to the geophysical scale of earthquakes. The strength and stability of these interfaces is an issue of fundamental importance; when frictional interfaces fail, it could spell trouble.  
 
A rupture in frictional interfaces involves the interplay between stored “elastic” energy and frictional resistance. Two bodies – whether tectonic plates in the earth’s crust or car tires and the road – are typically brought into friction-causing contact by normal forces that press them together. Parallel to the interface, a shear force acts to slide the two bodies against each other. In other words, one force pushes the bodies to move; the other resists movement. As long as the frictional resistance balances out the shear force, the interface remains pinned in place. But when the shear force overcomes the frictional resistance, sliding occurs, and ruptures propagate along the interface. How fast these ruptures progress determines whether the elastic energy – often built up over long periods as the shear force strains against friction – will be released abruptly or a little at a time. In fast, shaking quakes, this propagation is limited by the speed of sound – on the order of kilometers a second through ordinary solids – the speed at which waves can move within a material.
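
For a sense of the scale set by the speed of sound – the representative numbers below are textbook values for crustal rock, not figures from the study – elastic shear waves travel at roughly

\[
c_s = \sqrt{\frac{G}{\rho}} \approx \sqrt{\frac{3\times 10^{10}\ \mathrm{Pa}}{2700\ \mathrm{kg/m^3}}} \approx 3\ \mathrm{km/s}
\]

for a shear modulus G of about 30 GPa and a density \rho of about 2700 kg/m³. That is why an ordinary rupture can sweep across many kilometers of fault within seconds, releasing the stored elastic energy almost all at once.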
 
Earthquake damage occurs when ruptures propagate at the speed of sound
 

 

For Bouchbinder, the question was: What makes the slow quakes slow? Is their speed simply a small fraction of the speed of sound or is it really a different physical quantity?
 
To investigate this, he and his team, research student Yohai Bar Sinai and Dr. Efim Brener, a visiting professor from Peter Grünberg Institut, Forschungszentrum Jülich, Germany, looked at geophysical measurements as well as experimental data from the group of Prof. Jay Fineberg of the Hebrew University of Jerusalem in which the factors involved in various types of earthquake are tested on a lab-sized scale. They then developed a model that offers a new explanation for the dynamics of the slow quakes.

According to conventional models, frictional resistance decreases as the sliding speed rises: The faster something slides, the easier the sliding becomes. This is why an interfacial rupture tends to accelerate until it approaches the limit – the speed of sound. But the new model Bouchbinder and his team proposed allows for a different scenario: Frictional resistance first acts as in the standard model, decreasing with increasing sliding speed; but at some point the effect is reversed, and friction begins to progressively rise again. The slow quake phenomenon, says Bouchbinder, should occur around the point in the model at which friction reaches its minimum and begins to swing upwards. The speed of the rupture, in this case, is determined by the frictional properties of the interface alone, rather than by the speed of sound, and thus can be orders of magnitude slower.
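
A minimal way to picture such a friction law – an illustrative form chosen here for simplicity, not the constitutive law actually used in the study – is a steady-state frictional resistance that depends on the sliding velocity v as

\[
f(v) = f_0 - a\,\ln(v/v_0) + b\,(v/v_0),
\]

with positive constants a and b and a reference velocity v_0. At low velocities the logarithmic term dominates and friction weakens with speed, as in the conventional picture; at higher velocities the linear term takes over and friction strengthens again. The curve has a minimum at v = (a/b) v_0, and it is around this crossover that ruptures can propagate steadily and slowly, at a speed set by the frictional properties of the interface rather than by the speed of sound.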

Bouchbinder and his team now plan to continue testing their mathematical model with more complicated calculations and to compare their predictions to experimental measurements, in hopes of providing a useful tool for better understanding the failure of frictional interfaces, including those that result in earthquakes.
 
 
Satellite images from 1992 and 1997 reveal gradual slippage of 2-3 cm along the Hayward fault (thin red line) in California, evidence of a slow earthquake. Image: NASA

 
 

 

 
 

New Nanotube Structures


Weizmann Institute Scientists Create New Nanotube Structures


Thanks to the rising trend toward miniaturization, carbon nanotubes – which are about 100,000 times thinner than a human hair and possess several unique and very useful properties – have become the choice candidates for use as building blocks in nanosized electronic and mechanical devices. But it is precisely their infinitesimal dimensions, as well as their tendency to clump together, that make it difficult for scientists to manipulate nanotubes.


Dr. Ernesto Joselevich, together with Ph.D. student Ariel Ismach and former M.Sc. student Noam Geblinger of the Weizmann Institute’s Materials and Interfaces Department, is developing techniques to coax carbon nanotubes to self-assemble into ordered structures – essentially making the nanotubes do the hard work for them.


Ironically, the universal principle of 'order through chaos' has allowed the team’s most recent research to give rise to nanotubes that are strikingly more ordered and complex than any ever observed before. These intriguing new nanotube structures, which the scientists have dubbed 'serpentines' due to their self-assembly into snake-like or looped configurations, have recently been reported in the cover article of the journal Nature Nanotechnology.


'It may seem paradoxical – trying to create order through chaos – but in fact, this is a common phenomenon on the macroscale. Systems affected by forces that fluctuate from one extreme to another tend to self-organize into much more complexly ordered structures than those in which the external forces are ‘calm.’ We applied this principle at the nanoscale to see if it would have the same effect, and indeed, it did,' says Joselevich.


Serpentines are a common geometry in many functional macroscale systems: antennas, radiators and cooling elements. Analogously, nanotube serpentines could find a wide range of nano-device applications, such as cooling elements for electronic circuits and opto-electronic devices, as well as in power-generating, single-molecule dynamos. 'But the feature I find most intriguing about these serpentines,' says Joselevich, 'is their beauty.'


Dr. Ernesto Joselevich’s research is supported by the Helen and Martin Kimmel Center for Nanoscale Science; and the Gerhardt Schmidt Minerva Center on Supramolecular Architectures.  Dr. Joselevich is the incumbent of the Dr. Victor L. Ehrlich Career Development Chair. An animated movie explaining nanotube serpentine formation can be seen at www.youtube.com/watch?v=8znin1nSo_I
 

 

 



 

 

 

 


A New Technique May Speed the Development of Molecular Electronics


THE INSIDE DOPE

 

Often, things can be improved by a little 'contamination.' Steel, for example, is iron with a bit of carbon mixed in. To produce materials for modern electronics, small amounts of impurities are introduced into silicon – a process called doping. It is these impurities that enable electricity to flow through the semiconductor and allow designers to control the electronic properties of the material.


Scientists at the Weizmann Institute of Science, together with colleagues from the U.S.A., recently became the first to implement doping in the field of molecular electronics – the development of electronic components made of single layers of organic (carbon-based) molecules. Such components might be inexpensive, biodegradable, versatile and easy to manipulate. The main challenge with molecular electronics, however, is that the organic materials must first be made sufficiently pure, and then ways must be found to dope these somewhat delicate systems successfully.


This is what Prof. David Cahen and postdoctoral fellow Dr. Oliver Seitz of the Weizmann Institute’s Materials and Interfaces Department, together with Drs. Ayelet Vilan and Hagai Cohen of the Chemical Research Support Unit and Prof. Antoine Kahn of Princeton University, did. They showed that such 'contamination' is indeed possible, after they succeeded in purifying the molecular layer to such an extent that the remaining impurities did not affect the system’s electrical behavior.

The scientists doped the 'clean' monolayers by irradiating the surface with UV light or weak electron beams, changing chemical bonds between the carbon atoms that make up the molecular layer. These bonds ultimately influenced electronic transport through the molecules.

This achievement was recently described in the Journal of the American Chemical Society (JACS). The researchers foresee that this method may enable scientists and electronics engineers to substantially broaden the use of these organic monolayers in the field of nanoelectronics. Dr. Seitz: 'If I am permitted to dream a little, it could be that this method will allow us to create types of electronics that are different, and maybe even more environmentally friendly, than the standard ones that are available today.'   

Prof. David Cahen’s research is supported by the Nancy and Stephen Grand Research Center for Sensors and Security; the Philip M. Klutznick Fund for Research; Mr. Yehuda Bronicki, Israel; Mr. and Mrs. Yossie Hollander, Israel; and the Wolfson Family Charitable Trust. Prof. Cahen is the incumbent of the Rowland Schaefer Professorial Chair in Energy Research.
 
 
 
 
