Biological Computer Diagnoses Cancer and Produces the Drug -- in a Test Tube


Weizmann Institute scientist’s vision: Microscopic computers will function inside living tissues, performing diagnosis and administering treatment.

 

The world’s smallest computer (around a trillion can fit in a drop of water) might one day go on record again as the tiniest medical kit. Made entirely of biological molecules, this computer was successfully programmed to identify, in a test tube, changes in the balance of molecules in the body that indicate the presence of certain cancers, to diagnose the type of cancer, and to react by producing a drug molecule to fight the cancer cells.

 

The Weizmann Institute of Science team that developed the computer published these results today in Nature. Headed by Prof. Ehud Shapiro, of the Departments of Computer Sciences and Applied Mathematics, and Biological Chemistry, the team included research students Yaakov Benenson, Binyamin Gil, Uri Ben-Dor and Dr. Rivka Adar. Shapiro presented the team’s findings today at the Brussels symposium “Life, a Nobel Story,” in which Nobel Laureates and others addressed the future of the life sciences.

 

As in previous biological computers produced in Shapiro’s lab, input, output and “software” are all composed of DNA, the material of genes, while DNA-manipulating enzymes are used as “hardware.” The newest version’s input apparatus is designed to assess concentrations of specific RNA molecules, which may be overproduced or underproduced, depending on the type of cancer. Using pre-programmed medical knowledge, the computer then makes its diagnosis based on the detected RNA levels. In response to a cancer diagnosis, the output unit of the computer can initiate the controlled release of a single-stranded DNA molecule that is known to interfere with the cancer cell’s activities, causing it to self-destruct.
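The diagnostic logic, abstracted away from its molecular realization, can be sketched in a few lines. The marker names, thresholds and rule below are hypothetical placeholders, not the actual RNA species used in the experiments:

```python
# Illustrative simulation of the diagnostic logic described above.
# Marker names, levels and thresholds are hypothetical placeholders.

# A "diagnosis rule": each marker must be over- or under-expressed
# relative to its normal level for the rule to fire.
PROSTATE_RULE = {
    "markerA": "under",   # hypothetically underproduced in this cancer
    "markerB": "under",
    "markerC": "over",    # hypothetically overproduced
    "markerD": "over",
}

NORMAL_LEVEL = 1.0  # arbitrary normalized baseline

def diagnose(levels, rule, threshold=0.5):
    """Return True only if every marker deviates in the direction the rule requires."""
    for marker, direction in rule.items():
        level = levels.get(marker, NORMAL_LEVEL)
        if direction == "over" and level < NORMAL_LEVEL * (1 + threshold):
            return False
        if direction == "under" and level > NORMAL_LEVEL * (1 - threshold):
            return False
    return True

def computer_cycle(levels):
    """One 'cycle': diagnose, then 'release' the drug strand on a positive result."""
    if diagnose(levels, PROSTATE_RULE):
        return "release single-stranded DNA drug"
    return "remain inactive"

sample = {"markerA": 0.2, "markerB": 0.3, "markerC": 2.0, "markerD": 1.8}
print(computer_cycle(sample))   # a matching profile triggers drug release
print(computer_cycle({}))       # normal levels leave the computer inactive
```

The real device computes with molecules rather than dictionaries, of course; the point of the sketch is only that the diagnosis is a conjunction of over/under-expression conditions followed by a conditional release.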

 

In one series of test-tube experiments, the team programmed the computer to identify RNA molecules that indicate the presence of prostate cancer and, following a correct diagnosis, to release the short DNA strands designed to kill cancer cells. Similarly, they were able to identify, in the test tube, the signs of one form of lung cancer. One day in the future, they hope to create a “doctor in a cell”, which will be able to operate inside a living body, spot disease and apply the necessary treatment before external symptoms even appear.

 

The original version of the biomolecular computer (also created in a test tube), capable of performing simple mathematical calculations, was introduced by Shapiro and colleagues in 2001. An improved system, which uses its input DNA molecule as its sole source of energy, was reported in 2003 and was listed in the 2004 Guinness Book of World Records as the smallest biological computing device.

 

Shapiro: “It is clear that the road to realizing our vision is a long one; it may take decades before such a system operating inside the human body becomes reality. Nevertheless, only two years ago we predicted that it would take another 10 years to reach the point we have reached today.”

 

Prof. Ehud Shapiro's research is supported by the M.D. Moross Institute for Cancer Research, the Samuel R. Dweck Foundation, the Dolfi and Lola Ebner Center for Biomedical Research, the Benjamin and Seema Pulier Charitable Foundation, and the Robert Rees Fund for Applied Research.

 

Further information and photos can be obtained online at www.weizmann.ac.il/udi/PressRoom or by contacting the Weizmann Institute Publications and Media Relations Department at 972-8-934-3856.

 


Tiny Computing Machine Fueled by DNA

Fifty years after the discovery of the structure of DNA, a new use has been found for this celebrated molecule: fuel for molecular computation systems. The study, conducted by scientists at the Weizmann Institute of Science, will appear in this week's issue of Proceedings of the National Academy of Sciences USA (PNAS).
 
Whether plugged in or battery-powered, computers need energy. Around a year ago, Prof. Ehud Shapiro of the Weizmann Institute made international headlines for devising a programmable molecular computing machine composed of enzymes and DNA molecules. Now his team has made the device uniquely frugal: the single DNA molecule that provides the computer with the input data also provides all the necessary fuel. 
 
The earlier device was fueled by a molecule called ATP, the standard energy currency of all life forms. The redesigned device processes its DNA input molecule using only spontaneous, energy-releasing operations. It breaks two bonds in the DNA input molecule, releasing the energy stored in these bonds as heat. This process generates sufficient energy to carry out computations to completion without any external source of energy.
 
A spoonful (5 milliliters) of “computer soup” can contain 15,000 trillion such computers, together performing 330 trillion operations per second with 99.9% accuracy per step. These computers need very little energy (all supplied, as mentioned, by the input molecule) and together release less than 25 millionths of a watt as heat. 
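A back-of-the-envelope check puts these figures in perspective: the aggregate throughput works out to a tiny per-device rate, and the 99.9% per-step accuracy compounds over a multi-step computation. The step counts below are chosen purely for illustration:

```python
# Back-of-the-envelope arithmetic on the figures quoted above.
computers = 15_000e12          # 15,000 trillion devices in a 5 ml spoonful
total_ops = 330e12             # 330 trillion operations per second, combined
step_accuracy = 0.999          # 99.9% accuracy per step

# Each individual device is very slow; the power comes from parallelism.
ops_per_computer = total_ops / computers
print(f"{ops_per_computer:.3f} operations per second per device")

# Probability that a computation of n steps completes without a single error:
for n in (10, 100, 1000):
    print(f"{n} steps: {step_accuracy ** n:.3f}")
```

The compounding shows why per-step accuracy matters: at 99.9% per step, a 1,000-step computation on a single device succeeds only about 37% of the time, which is one reason massive redundancy is natural for molecular computing.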
 
The device was recently awarded the Guinness World Record for “smallest biological computing device.”
 
The study was carried out by Yaakov Benenson, Dr. Rivka Adar, Dr. Tamar Paz-Elizur, Prof. Zvi Livneh and Prof. Ehud Shapiro of the Institute's Biological Chemistry Department and the Computer Science and Applied Mathematics Department.  
 
Prof. Ehud Shapiro's research is supported by the Dolfi and Lola Ebner Center for Biomedical Research, Yad Hanadiv, the Robert Rees Fund for Applied Research and the Samuel R. Dweck Foundation.
 
Additional information is available at www.weizmann.ac.il/udi

Weizmann Institute Scientists Develop a Novel System for Analyzing Genetic Data that Mimics the Human Capacity for Unsupervised Learning

 

Addressing this and similar challenges may soon be easier thanks to Prof. Eytan Domany of the Weizmann Institute's Physics of Complex Systems Department and doctoral students Gad Getz and Erel Levine. The team has designed a unique mathematical system for analyzing genetic data based on a computer algorithm that 'clusters' information into relevant categories. The algorithm searches simultaneously for clusters of 'similar' genes and patients by evaluating the gene expression of tissue samples. (A gene's 'expression' refers to the production level of the proteins it encodes.)

 

The algorithm, reported in the October 17 issue of the Proceedings of the National Academy of Sciences (PNAS), has one especially powerful feature: it mimics unsupervised learning. Unlike most automated 'sorting' processes, in which a computer must be informed of the relevant categories in advance, the algorithm is analogous to human intuition (such as the ability to intuitively categorize images of animals and cars into their proper classes). When given a clustering task, it analyzes the data, computes the degree of similarity among its components, and determines its own clustering criteria.

 

The new method builds on an approach previously developed by Domany and his colleagues, based on a well-known physical phenomenon. When a granular magnet, such as a magnetic tape, is warm, its grains are highly disorganized. But as it cools, the magnet's grains progressively organize themselves into well-ordered clusters. Using the statistical mechanics of granular magnets, Domany created an algorithm that can look for clusters in any data.
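The basic idea of letting the data define its own groups can be illustrated with a minimal threshold-based (single-linkage) grouping. This is only a sketch in the same spirit, not the superparamagnetic algorithm itself, and the data points are invented:

```python
# A minimal sketch of similarity-based clustering in the spirit described
# above: points joined by a chain of small pairwise distances end up in the
# same cluster. This is NOT Domany's superparamagnetic algorithm, which is
# based on the statistical mechanics of granular magnets.
import math

def clusters(points, radius):
    """Group points connected by chains of pairwise distances below `radius`."""
    parent = list(range(len(points)))   # union-find forest

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]   # path halving
            i = parent[i]
        return i

    for i in range(len(points)):
        for j in range(i + 1, len(points)):
            if math.dist(points[i], points[j]) < radius:
                parent[find(i)] = find(j)   # merge the two clusters

    groups = {}
    for i in range(len(points)):
        groups.setdefault(find(i), []).append(i)
    return sorted(groups.values())

# Two well-separated blobs of hypothetical 2-D "expression profiles":
data = [(0.0, 0.1), (0.2, 0.0), (0.1, 0.2), (5.0, 5.1), (5.2, 5.0)]
print(clusters(data, radius=1.0))  # → [[0, 1, 2], [3, 4]]
```

Note that the number of clusters is never supplied: it emerges from the data and the similarity scale, which is the "unsupervised" character the article describes.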

 

When applied in a cancer study using DNA chips, the new algorithm proved highly effective, evaluating roughly 140,000 values representing the cellular expression of 2,000 genes from 70 subjects. The algorithm categorized tissue samples into separate clusters according to their gene expression profiles. For example, one cluster consisted of cancerous tissues, while another contained samples from healthy subjects. The new method also distinguished among different forms of cancer, as well as demonstrating treatment effects, picking up differences in gene expression between leukemia patients who had received treatment and those who had not. The ability to monitor cell response to treatment and understand the origin of disease in each patient may improve future treatment protocols, which would be tailored to individual pathologies.

 

Finally, one of the algorithm's most promising features is that it enabled researchers to pinpoint, from within the 2,000 genes examined, a small group that can be used to accurately distinguish among cancerous cellular processes.

 

In a sense, however, applying the new algorithm to DNA chips is only a start. The new algorithm's inherent clustering capacity makes it invaluable for use in data-heavy scientific and industrial applications. It may be used to analyze financial information and MRI data in brain research, or to perform 'data mining,' the process by which specific details are culled from the world's huge and ever-growing data banks, such as those generated by the international Human Genome Project.

The Institute's technology transfer arm, Yeda Research and Development, has filed a patent application for the algorithm.

 

Prof. Eytan Domany holds the Henry J. Leir Professorial Chair.

 

The Weizmann Institute of Science is a major center of scientific research and graduate study located in Rehovot, Israel. Its 2,500 scientists, students and support staff are engaged in more than 1,000 research projects across the spectrum of contemporary science.


Weizmann Institute Scientist Designs The First General-Purpose Mechanical Computing Device To Serve As The Basis For A Biological Computer

The first general-purpose mechanical computer designed for biomolecular and pharmaceutical applications has been developed by Prof. Ehud Shapiro of the Computer Science and Applied Mathematics Department at the Weizmann Institute of Science.  The mechanical computer will be presented today at the Fifth International Meeting on DNA-Based Computers at the Massachusetts Institute of Technology.

In terms of the logic behind it, Shapiro's mechanical computer is very similar to biomolecular machines of the living cell such as the ribosome. Therefore, a future biomolecular version of the device may ultimately lead to the construction of general-purpose programmable computers of subcellular size. If scientists succeed in building such a computer, it may be able to operate in the human body and interact with the body's biochemical environment, with far-reaching biological and pharmaceutical applications.

"For example, such a computer could sense anomalous biochemical changes in the tissue and decide, based on its program, what drug to synthesize and release in order to correct the anomaly," Prof. Shapiro says.


The Turing machine

 

Unlike existing electronic computers, which are based on the computer architecture developed by John von Neumann in the U.S. in the 1940s, the new mechanical computer is based on the Turing machine, conceived as a paper-and-pencil computing device in 1936 by the British mathematician Alan Turing.  
 
The theoretical Turing machine consists of a potentially infinite tape divided into cells (each of which can hold one symbol), a read/write head, and a control unit that can be in one of a finite number of states. The operation of the machine is governed by a finite set of rules that constitute its "software program."
 
In each cycle the machine reads the symbol in the cell located under the read/write head, writes a new symbol in the cell, moves the read/write head one cell to the left or to the right, and changes the control state, all according to its program rules. Although the Turing machine is a general-purpose, universal, programmable computer and is key to the theoretical foundations of computer science, it has not been used in real applications. Shapiro's mechanical device embodies the theoretical Turing machine, and as such is a general-purpose programmable computer.
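The cycle described above maps directly onto a few lines of code. Below is a minimal Turing machine interpreter; the sample program (which flips every bit and halts) is purely illustrative:

```python
# A minimal Turing machine interpreter matching the description above:
# each rule maps (state, symbol) -> (symbol to write, move, new state).

def run(tape, rules, state="start", head=0, blank="_", max_steps=1000):
    tape = dict(enumerate(tape))          # sparse tape, extendable both ways
    for _ in range(max_steps):
        if state == "halt":
            break
        symbol = tape.get(head, blank)                 # read
        write, move, state = rules[(state, symbol)]    # look up the rule
        tape[head] = write                             # write
        head += 1 if move == "R" else -1               # move the head
    return "".join(tape[i] for i in sorted(tape))

# Example program: flip every bit, then halt on reaching a blank cell.
rules = {
    ("start", "0"): ("1", "R", "start"),
    ("start", "1"): ("0", "R", "start"),
    ("start", "_"): ("_", "R", "halt"),
}
print(run("1011", rules))  # → "0100_"
```

Every step performs exactly the read/write/move/change-state cycle the article describes, which is why such a small rule table suffices for a universal model of computation.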


The mechanical computer

 

The device employs a chain of three-dimensional building blocks to represent the Turing machine's tape, and uses another set of building blocks to encode the machine's program rules.  In each cycle the device processes one "rule molecule."  The device is designed so that the processing of the molecule modifies the polymer representing the Turing machine's tape in accordance with the intended meaning of the rule.

At the conference, Shapiro will present a 30-cm high plastic model of his mechanical computer. He hopes that in the future, with the advent of improved techniques for the analysis and synthesis of biomolecular machines, the actual computer could possibly be built from biological molecules, so that it would measure about 25 millionths of a millimeter in length, roughly the size of a ribosome.
 

The computer and the ribosome


In fact, Prof. Shapiro designed the mechanical computer with the ultimate goal of implementing it with biological molecules. The computer is no more complicated than existing biomolecular machines of the living cell, such as the ribosome, and all its operations are part of the standard repertoire of these machines.
 
These operations include the mechanical equivalents of polymer elongation, cleavage and ligation, as well as moving along a polymer and being controlled by coordinated structural changes. The ribosome is the molecular machine of the living cell that performs the final step of interpretation of the genetic code by translating messenger RNA, which is transcribed from DNA, into protein.
 
A key similarity between Shapiro's mechanical computer and the ribosome is that a "program rule" molecule specifies a computational step of the computer similarly to the way a transfer RNA molecule specifies a translation step of the ribosome.

The computer is similar to the ribosome in that both operate on two polymers simultaneously, and their basic cycle consists of processing an incoming molecule that matches the currently held molecules on the first polymer, elongating the second polymer, and moving sideways. However, unlike the ribosome, which only "reads" the messenger RNA in one direction, the computer edits the tape polymer and may move in either direction.
 

A future interactive biological computer


The computer design may allow it to respond to the availability and to the relative concentrations of specific molecules in its environment, and to construct program-defined polymers, releasing them into the environment. If implemented using biomolecules, such a device may operate in the human body,  interacting with its biochemical environment in a program-controlled manner. In particular, given a biomolecular implementation of the computer that uses RNA as the tape polymer, the computer may release cleaved tape polymer segments that function as messenger RNA, performing program-directed synthesis of proteins in response to specific biochemical conditions within the cell. Such an implementation could give rise to a family of computing devices with broad biological and pharmaceutical applications.


About Prof. Shapiro

 

Prof. Shapiro received his Ph.D. from Yale University and joined the Weizmann Institute in 1982. During the 1980s he was involved with the Japanese Fifth Generation Computer Project and published numerous scientific papers in the area of concurrent logic programming languages.

In the early 1990s, Shapiro's innovative research in programming languages led to the establishment of Ubique, a company that develops interactive online environments. Shapiro took a leave from Weizmann to establish Ubique, and when the company was bought by America Online, Inc., he moved to the U.S. to assist in incorporating Ubique's Virtual Places technology into America Online's Internet services.
 
When America Online sold Ubique to Lotus/IBM in 1998, Shapiro returned to his research post at the Weizmann Institute. The mechanical design of Shapiro's computer model was performed by K. Karunaratne from Korteks and M. Schilling from Schilling 3D Design, both from San Diego, CA.


More Bytes for the Buck

To boost the storage capacity of tomorrow's computers, information is digitally recorded on optical disks and read by a laser. Scientists hoped to enhance computer memory even further by stacking several optical disks on top of each other. However, information retrieval from such systems is relatively slow because the laser can focus on only one level at a time.

Dr. Erez Hasman and Prof. Asher Friesem of the Weizmann Institute's Physics of Complex Systems Department have developed an approach that may greatly speed up this process by reading multiple layers simultaneously. It also reduces background noise (light signals carrying unwanted information) relative to the signals carrying the useful information.

The new method employs a number of light sources, such as diode lasers, which emit several light beams of different wavelengths. These beams are radiated onto the different optical disks simultaneously using a novel optical element. The reflected light is then analyzed according to wavelength, providing information about each disk.

While optical disks can now store up to 650 megabytes of data, the new approach may push their storage capacity to thousands of megabytes. A patent was recently registered in the U.S.A. by Yeda Research and Development Co. Ltd., which handles the commercial development of Weizmann Institute research.

Prof. Friesem holds the Peter and Carola Kleeman Chair of Optical Sciences.


The Math of Distortion

 
Many of us in the modern world spend a good part of our day viewing 2-D representations of the 3-D world: drawings and photos, television and electronic screens, to name a few. Our brains automatically translate what we see into three dimensions. Not just that, but we unthinkingly group objects together, identify people from their photos no matter how they are posed, and gauge distances between objects. To the average computer, however, images are basically collections of colored dots in a 2-D grid. One of the bigger challenges for today’s computer scientists is to get these machines to relate to the 3-D world portrayed in their 2-D files as humans do – by sorting, classifying, comparing and inferring.
Dr. Yaron Lipman
 
Dr. Yaron Lipman, who joined the Weizmann Institute’s Computer Science and Applied Mathematics Department in 2011, deals with the mathematics of deformation – variations in shape between similar objects or in that of a single body as it turns, twists, stretches or bends. His work has implications for fields as far-ranging as biology and computer-generated animation, engineering and computer vision.

One of the more basic questions is: In a group of objects, how does one decide whether any two are the same or different? This is not a trivial question, even for humans: To the untrained eye, for example, a pile of old animal bones may be a collection of similar objects, but a trained morphologist or paleontologist will be able to sort them into different species. That ability often comes with years of practice and, just as the average person can tell a pear from an apple without much thought, the expert can distinguish between bones without consciously tracing each step leading up to a particular identification. How, then, does one translate this type of thinking, which is at least partly unconscious, to a mathematical computer algorithm?

Lipman and his colleagues developed an algorithm for comparing and classifying anatomical surfaces such as bones and teeth by analyzing plausible deformations among their three-dimensional models. While a human working from a subjective point of view might look for identifying “landmarks” – for example a specific type of recognizable ridge or protrusion – the computer would take a different track: matching the surfaces globally while minimizing the amount of distortion in the match. A human often works from the bottom up, looking for local cues and then putting them together to arrive at a conclusion. In contrast, the algorithm is designed to consider the collective surfaces as a geometric whole and match them in a top-down manner. Lipman then gave his algorithm the ultimate test – pitting its bone and tooth identification against that of expert morphologists. In all the tests, the computer performed nearly as well as the trained morphologists. That means, he says, that non-experts could use the algorithm to quickly obtain accurate species identification from bones and teeth. In the future, further biological information might be more easily extracted from shape.
 
Matching surfaces: (Top) the computer analysis and (bottom) the points selected by an expert. Note the similarities between the computer’s final result (top right) and that of the expert
 
 
 
 
 
A second line of research in Lipman’s group is the distantly related problem of image matching. In recent years, with digital cameras in every pocket and purse, vast numbers of images are recorded and uploaded to visual media every day. This makes one of the great challenges of computer vision – the ability to interpret, analyze and compare image content automatically – more pressing than ever. In contrast to surfaces, images contain many local cues and features that are easy for a human brain to grasp for recognition. A computer, however, would assign all the points in the image equal importance. So a pair of photos taken with different lighting or from different angles, which a human would have no trouble identifying as the same person, might confound a computer’s point-matching algorithm. Lipman’s solution is an algorithm that limits distortion – that is, it determines a mathematical bound on the ways that one set of points can morph into a second set. Somewhat surprisingly, this technique eliminates most of the false matches.
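The distortion-limit idea can be sketched as a filter over candidate point matches: a match is kept only if it does not stretch or shrink distances, relative to matches already accepted, beyond a fixed bound. Everything below (the points, the bound, the greedy order) is a hypothetical illustration, not Lipman's actual algorithm:

```python
# Illustrative sketch of filtering point matches with a distortion bound,
# in the spirit of the approach described above (not the actual method).
import math

def filter_matches(src, dst, matches, max_distortion=1.5):
    """Keep a match only if, against every other kept match, the ratio of
    corresponding distances in the two images stays within the bound."""
    kept = []
    for (i, j) in matches:
        ok = True
        for (k, l) in kept:
            d_src = math.dist(src[i], src[k])
            d_dst = math.dist(dst[j], dst[l])
            if d_src == 0 or d_dst == 0:
                continue
            ratio = max(d_src / d_dst, d_dst / d_src)
            if ratio > max_distortion:   # this pairing stretches too much
                ok = False
                break
        if ok:
            kept.append((i, j))
    return kept

# Hypothetical example: the last candidate match badly distorts distances.
src = [(0, 0), (1, 0), (0, 1)]
dst = [(0, 0), (1, 0), (0, 1), (9, 9)]
candidates = [(0, 0), (1, 1), (2, 3)]   # (2, 3) maps (0, 1) onto (9, 9)
print(filter_matches(src, dst, candidates))  # → [(0, 0), (1, 1)]
```

The global constraint is what does the work here: each match is judged not on local appearance but on whether it is geometrically consistent with the others.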
 
Image matching: (top) two images of the same person in different pose; (middle) a set of matching points with no distortion algorithm applied; (bottom) with the distortion algorithm, most of the false matches are eliminated
 

 

 
Yet another line of investigation Lipman pursues is the modeling of mappings and deformations, in 3-D space, of objects that possess desired geometric properties. This area is relevant to computer-generated animation, whose practitioners are constantly in search of better methods for creating lifelike motion on screen; engineering, in which computerized models of objects are deformed and mapped to each other; and such areas as medical imaging and computer modeling. Such mappings are laid out on top of a representation of the object as a sort of 3-D mesh of tetrahedrons, and they are then used to work out how the various parts of the mesh deform as the object moves. In real life, this movement involves many parameters, including flexibility, elasticity, movement patterns of joints and the sites where surfaces meet. Lipman develops models of deformations that avoid high distortion and self-penetration of matter, both properties of great importance for modeling deformations in real-life applications.
 
Applying a standard deformation technique to a 3-D mechanical model (left) produces high distortion and fold-overs (middle, highlighted in red and yellow), which leads to the unstable behavior of numerical algorithms; (right) deformation with strict distortion limits guarantees that the map is free of fold-overs and of bounded distortion
 
 

Dr. Yaron Lipman's research is supported by the Friends of Weizmann Institute in memory of Richard Kronstein.
 

 

 

 

Turing Award to the Weizmann Institute’s Shafi Goldwasser for Advances that Revolutionized the Science of Cryptography


 

 

Prof. Shafi Goldwasser
 

ACM, the Association for Computing Machinery, today announced that Prof. Shafi Goldwasser of the Weizmann Institute’s Computer Science and Applied Mathematics Department, and the Massachusetts Institute of Technology Computer Science and Artificial Intelligence Lab, will receive the ACM A.M. Turing Award. She receives the Award together with Prof. Silvio Micali of MIT “for transformative work that laid the complexity-theoretic foundations for the science of cryptography, and in the process pioneered new methods for efficient verification of mathematical proofs in complexity theory.”

 

The ACM Turing Award, widely considered the highest prize in the field of computing (there is no Nobel Prize in the field), carries a $250,000 prize, with financial support provided by Intel Corporation and Google Inc.
 
 

Probabilistic Encryption
 

In their 1982 paper on "Probabilistic Encryption," Goldwasser and Micali laid the rigorous foundations for modern cryptography. The work is universally credited with changing cryptography from an "art" into a "science."
 

This paper pioneered several themes that are today considered basic to the field. These include the introduction of formal security definitions that are now the gold standard for security; the introduction of randomized methods for encryption, which can satisfy stringent security requirements that previous deterministic encryption schemes could not; and the methodology of "reductionist proofs," which shows how to translate even the slightest attack on security into a fast algorithm for solving such hard classical mathematical problems as factoring integers. These proofs are a double-edged sword, in that they show that one of two things must be true: either we have a super-secure encryption scheme, or we have new algorithms for factoring integers.
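The randomized-encryption idea can be made concrete. Below is a toy version of the Goldwasser-Micali scheme from the 1982 paper, which encrypts one bit at a time using quadratic residuosity; the tiny primes are for illustration only and offer no security whatsoever:

```python
# Toy Goldwasser-Micali probabilistic encryption (the scheme from the
# 1982 paper), with insecurely small primes for illustration only.
import math
import random

p, q = 499, 547              # the secret key: two primes
n = p * q                    # part of the public key

def legendre(a, prime):
    # Euler's criterion: 1 if a is a square mod prime, prime-1 if not.
    return pow(a, (prime - 1) // 2, prime)

# The other public-key component: an x that is a non-square mod both primes.
x = next(a for a in range(2, n)
         if legendre(a, p) == p - 1 and legendre(a, q) == q - 1)

def encrypt(bit):
    # Fresh randomness every call, so the same bit encrypts differently
    # each time; this is the "probabilistic" part of the scheme.
    r = random.randrange(2, n)
    while math.gcd(r, n) != 1:
        r = random.randrange(2, n)
    return (r * r * pow(x, bit, n)) % n

def decrypt(c):
    # Knowing p, test quadratic residuosity: squares mod p decode to 0.
    return 0 if legendre(c % p, p) == 1 else 1

bits = [1, 0, 1, 1, 0]
cipher = [encrypt(b) for b in bits]
print([decrypt(c) for c in cipher])  # → [1, 0, 1, 1, 0]
```

Because each ciphertext carries fresh randomness, an eavesdropper cannot even tell whether two ciphertexts encrypt the same bit, which is exactly the property deterministic schemes cannot provide.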
 

The 1982 paper also introduced the "simulation paradigm," in which the security of a system is established by showing that an enemy could have simulated, on his own, any information he obtained during the employment of a cryptographic system, and thus this cryptographic system represents no risk. The simulation paradigm has become the most general method for proving security in cryptography, going beyond privacy to define and prove security of authentication methods, security of software protection schemes and security of cryptographic protocols that involve many participants, for example electronic elections and auctions.

 

Zero-Knowledge Interactive Proofs
 

In another influential paper, published in 1985 with Charles Rackoff, Goldwasser and Micali introduced the concept of “zero-knowledge interactive proofs.”
 

An example of a zero-knowledge interactive proof would be an ATM that did not need you to enter your PIN, but only to verify that you, yourself, know it. Zero-knowledge proofs can also enable users working on the Internet who may not trust each other to compute joint functions on their secret data.
 

In contrast to classical mathematical proofs, which can always be written down, an interactive proof is a sort of conversation. One side – the “prover” – tries to convince the other – the “verifier” – that he knows the proof of a mathematical statement. The verifier must ask the prover a series of questions, which are randomly chosen depending on the prover’s previous answers and the mathematical statement to be proved. If the prover does not know the proof, the overwhelming probability is that the verifier will be able to catch him out by his mistakes. Because the verifier can be convinced that the proof exists, without learning the proof itself, such proofs are truly “zero-knowledge proofs.”
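The catch-him-by-his-mistakes structure can be illustrated with a toy identification protocol in the Fiat-Shamir style, a close relative of (but not identical to) the 1985 constructions: the prover convinces the verifier that she knows a square root s of a public value v modulo n, without ever revealing s:

```python
# A toy zero-knowledge identification round (Fiat-Shamir style): the
# prover demonstrates knowledge of s with s*s = v (mod n) without
# revealing s. Parameters are illustratively small and insecure.
import random

n = 499 * 547            # public modulus; its factorization stays secret
s = 12345 % n            # the prover's secret
v = (s * s) % n          # the public "theorem": v has a known square root

def protocol_round():
    r = random.randrange(1, n)
    a = (r * r) % n                      # prover's random commitment
    e = random.randrange(2)              # verifier's random challenge bit
    z = (r * pow(s, e, n)) % n           # prover's response
    # Verifier's check: z^2 = a * v^e (mod n). Note s never appears here.
    return (z * z) % n == (a * pow(v, e, n)) % n

# A cheater who doesn't know s can answer each challenge correctly with
# probability only 1/2, so k rounds expose him with probability 1 - 2**-k.
print(all(protocol_round() for _ in range(20)))  # honest prover: True
```

Each round reveals either r's square or r times the secret, but never both for the same r, so the verifier learns nothing it could not have generated itself; that is the "simulation" intuition behind zero knowledge.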
 

When Goldwasser, Micali and Rackoff published their paper, its relevance to cryptography was immediately apparent. For example, it suggested an identification system that can be used safely over the Internet. The idea is that an individual user will know a proof for her own special theorem, which will be her password. To identify herself, the user can interact with a verifier (an ATM, for example) and give that verifier a zero-knowledge proof of her special theorem.
 

Interactive proofs are not only a cryptographic tool; they have had a major impact on complexity theory. What seemed obviously useful for cryptographic purposes – the use of randomization and interaction – has turned out to be widely applicable to complexity theory in general. Interaction enables faster verification of proofs than classical mathematics and even gives mathematicians the ability to prove that some mathematical statements are not correct, by proving the nonexistence of classical proofs.
 

In a further series of works by Goldwasser, Micali and others, interactive proofs were extended to include interactions between a verifier and multiple provers, which ultimately led to surprising new ways of proving NP-hardness results for approximation problems. This is an active area of research today.

 

Practical impact
 

ACM President Vint Cerf said the practical impact of the ideas put forward by Goldwasser and Micali is tangible. “The encryption schemes running in today’s browsers meet their notions of security. The method of encrypting credit card numbers when shopping on the Internet also meets their test. We are indebted to these recipients for their innovative approaches to ensuring security in the digital age.”
 

Alfred Spector, Vice President of Research and Special Initiatives at Google Inc., said Goldwasser and Micali developed cryptographic algorithms that are designed around computational hardness assumptions, making such algorithms hard to break in practice. “In the computer era, these advances in cryptography have transcended the cryptography of Alan Turing’s code-breaking era. They now have applications for ATM cards, computer passwords and electronic commerce as well as preserving the secrecy of participant data such as electronic voting. These are monumental achievements that have changed how we live and work.”

 

The third woman to receive a Turing Award
 

Prof. Shafi Goldwasser is a recipient of the National Science Foundation Presidential Young Investigator Award; she also won the ACM Grace Murray Hopper Award, given to an outstanding young computer professional. She has twice won the Gödel Prize, presented jointly by the ACM Special Interest Group on Algorithms and Computation Theory (SIGACT) and the European Association for Theoretical Computer Science (EATCS).
 

She was elected to the American Academy of Arts and Sciences, the National Academy of Sciences, and the National Academy of Engineering. She was recognized by the ACM Council on Women in Computing (ACM-W) as the Athena Lecturer, and received the IEEE Piore Award and the Franklin Institute's Benjamin Franklin Medal in Computer and Cognitive Science.
 

A graduate of Carnegie Mellon University with a B.A. degree in mathematics, she received M.S. and Ph.D. degrees in computer science from the University of California, Berkeley. She joined the Weizmann Institute in 1993 as a full professor. Goldwasser is the third woman to receive a Turing Award since the award's inception in 1966.
 

ACM will present the 2012 A.M. Turing Award at its annual Awards Banquet on June 15, in San Francisco, CA.  

 

Prof. Shafi Goldwasser’s research is supported by Walmart.
 

 

Verifying correctness and reliability of computer hardware and software

Prof. Amir Pnueli
 
Prof. Amir Pnueli developed sophisticated methods for verifying the correctness and reliability of computer hardware and software. He used a mathematical language called temporal logic, which makes it possible to formulate and prove theorems that contain statements relating to time.
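As an illustration of the kind of statement temporal logic can express, here is a minimal sketch that checks "always" and "eventually" properties over a finite execution trace. The signal names are hypothetical, and this is a toy checker, not Pnueli's verification machinery:

```python
# A minimal sketch of temporal-logic checking over a finite execution
# trace. Operator names follow standard temporal logic; the "reactor
# controller" signals below are hypothetical.

def always(trace, prop):
    """G prop: the property holds in every state of the trace."""
    return all(prop(state) for state in trace)

def eventually(trace, prop):
    """F prop: the property holds in some state of the trace."""
    return any(prop(state) for state in trace)

def leads_to(trace, p, q):
    """G (p -> F q): whenever p holds, q holds then or later."""
    return all(eventually(trace[i:], q)
               for i, state in enumerate(trace) if p(state))

trace = [
    {"request": False, "grant": False, "temp_ok": True},
    {"request": True,  "grant": False, "temp_ok": True},
    {"request": False, "grant": True,  "temp_ok": True},
]
print(always(trace, lambda s: s["temp_ok"]))                          # safety
print(leads_to(trace, lambda s: s["request"], lambda s: s["grant"]))  # liveness
```

The point of the formalism is exactly this: safety ("nothing bad ever happens") and liveness ("something good eventually happens") become statements one can formulate and prove about a system's behavior over time.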
 

Application

 
Pnueli’s research is being implemented in control systems of nuclear reactors and space missile launchers. It is also being considered for use in creating “smart roads” equipped with sensors for preventing traffic accidents.
 

Statecharts - Graphic Computer Language

Prof. David Harel
 
Prof. David Harel studied ways to program computer systems based on the use of visual structures.
 

Application

 
Harel’s research led to Statecharts, a revolutionary graphic computer language for the design of complex systems. Today, Statecharts is used around the world in the aircraft, automotive and chemical industries, and in communications systems. It also led to a new avenue of research for exploring living systems and is already being applied to study the immune system.

RSA Algorithm for Secure Transactions

Prof. Adi Shamir
 
Prof. Adi Shamir, together with two colleagues from the United States, developed an algorithm (now called RSA, after the initials of its inventors' surnames) that allowed encrypted messages to be exchanged and decrypted, using public keys, between parties that had not previously been in contact.
 

Application

 
The RSA algorithm is used worldwide to secure Internet, banking and credit card transactions. An additional algorithm developed by Prof. Shamir and colleagues led to the development of the smart cards used in TV satellite receivers to ensure that only subscribers receive broadcasts.
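As a sketch of the public-key idea behind these applications, here is textbook RSA with toy parameters; real deployments use keys thousands of bits long and message padding, which this illustration omits:

```python
# Textbook RSA with toy primes, illustrating the public-key idea:
# anyone can encrypt with (n, e); only the holder of d can decrypt.

p, q = 61, 53                 # secret primes (insecurely small)
n = p * q                     # public modulus: 3233
phi = (p - 1) * (q - 1)       # 3120
e = 17                        # public exponent, coprime to phi
d = pow(e, -1, phi)           # private exponent: e*d = 1 (mod phi); Python 3.8+

def encrypt(m):               # uses only the public key (n, e)
    return pow(m, e, n)

def decrypt(c):               # requires the private exponent d
    return pow(c, d, n)

message = 42
cipher = encrypt(message)
print(cipher, decrypt(cipher))  # the second value recovers 42
```

Decryption works because raising to the power e*d modulo n returns the original message (Euler's theorem); recovering d from the public key alone is believed to require factoring n, which is infeasible at real key sizes.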
 
