Considered one of the most influential computer scientists in the world and a leader in advancing the field of machine learning, Michael I. Jordan helped develop unsupervised learning into a powerful algorithmic tool for solving real-world challenges in many areas including natural language processing, computational biology, and signal processing. A potent blend of computer science, statistics, and applied mathematics, machine learning involves the use of algorithms and statistical models that enable computers to carry out specific tasks without explicit instructions and to continually improve. Jordan helped transform unsupervised machine learning, which can find structure in data without preexisting labels, from a collection of unrelated algorithms to an intellectually coherent field that solves real-world problems. His pioneering work on latent Dirichlet allocation (or topic models) demonstrated how statistical modeling ideas can be used to learn, in an unsupervised manner, models of nontraditional data sets (such as documents) as compositions of different parts (such as topics), where the representations of the parts themselves are also learned simultaneously.
In his work on topic models and beyond, Jordan augmented the classical analytical distributions of Bayesian statistics with computational entities having graphical, combinatorial, temporal, and spectral structure, and he then used ideas from convex analysis, optimization, and statistical physics to develop new approximation algorithms, referred to as variational inference algorithms, that exploited these structures. Variational methods became a major area of machine learning and the principal engine behind scalable unsupervised learning. Today, they transcend subdisciplines of machine learning and play an important role in both deep learning and probabilistic machine learning.
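To make the idea concrete, here is a minimal, illustrative sketch of mean-field variational inference — not Jordan's own algorithms, and far simpler than a topic model. It runs coordinate-ascent updates (CAVI) for a toy two-component Gaussian mixture with unit-variance components and a hypothetical prior variance of 10, alternately updating cluster responsibilities and approximate posteriors over the component means:

```python
import math

# Toy coordinate-ascent variational inference (CAVI) for a two-component
# Gaussian mixture with unit-variance components and prior mu_k ~ N(0, 10).
# Mean-field family: q(mu_k) = N(m[k], s2[k]), q(c_i) = Categorical(phi[i]).
# (Illustrative sketch only; data and prior variance are made up.)
data = [-5.2, -4.8, -5.1, -4.9, -5.0, 4.8, 5.2, 5.1, 4.9, 5.0]
prior_var = 10.0
m, s2 = [-1.0, 1.0], [1.0, 1.0]   # break symmetry in the initialization

for _ in range(20):
    # Update cluster responsibilities q(c_i) via a stabilized softmax.
    phi = []
    for x in data:
        logits = [x * m[k] - (m[k] ** 2 + s2[k]) / 2 for k in range(2)]
        mx = max(logits)
        w = [math.exp(l - mx) for l in logits]
        z = sum(w)
        phi.append([wk / z for wk in w])
    # Update the approximate posteriors q(mu_k) over the component means.
    for k in range(2):
        nk = sum(p[k] for p in phi)
        s2[k] = 1.0 / (1.0 / prior_var + nk)
        m[k] = s2[k] * sum(p[k] * x for p, x in zip(phi, data))
# m converges near the true component means (about -5 and +5)
```

Each update maximizes a lower bound on the data likelihood in closed form, which is what makes variational methods scale where exact posterior computation does not.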
An IEEE Fellow and member of the U.S. National Academy of Sciences and the U.S. National Academy of Engineering, Jordan is the Pehong Chen Distinguished Professor in the Department of Electrical Engineering and Computer Sciences and the Department of Statistics at the University of California, Berkeley, CA, USA.
Éva Tardos has reshaped and renewed the foundations of algorithm design with long-term vision, creativity, and technical strength that benefit the Internet through improved resource allocation, network formation, routing and congestion control, and electronic commerce. During a career spanning more than 30 years, Tardos has become best known for her work on network-flow algorithms, approximation algorithms, and quantifying the efficiency of selfish routing through the lens of algorithmic game theory. Her solo work on strongly polynomial algorithms was a breakthrough that resolved a major open problem in the field: she showed that the minimum-cost flow problem (a basic network-flow problem that models the efficient transport of goods through a network) can be solved in strongly polynomial time, with a running time that depends only on the number of nodes and edges of the network, not on the magnitudes of its capacities or costs. She then played a pivotal role in establishing the modern use of linear programming in algorithm design, advancing the field of approximation algorithms. She has developed approximation algorithms for fundamental problems in a wide range of application areas, including facility location, routing, clustering, classification, and social network analysis.
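For readers unfamiliar with the problem, the following is a compact textbook-style sketch of minimum-cost flow via successive shortest augmenting paths — an illustration of the problem itself, not Tardos's strongly polynomial algorithm, whose contribution lies precisely in its running-time guarantee:

```python
# Successive-shortest-path min-cost flow (illustrative sketch, not
# Tardos's algorithm). Each augmentation pushes flow along a cheapest
# path in the residual network, found with Bellman-Ford.
class MinCostFlow:
    def __init__(self, n):
        self.n = n
        self.g = [[] for _ in range(n)]

    def add_edge(self, u, v, cap, cost):
        # Store edge and its residual twin; last field indexes the twin.
        self.g[u].append([v, cap, cost, len(self.g[v])])
        self.g[v].append([u, 0, -cost, len(self.g[u]) - 1])

    def flow(self, s, t, maxf):
        total_cost = 0
        while maxf > 0:
            dist = [float('inf')] * self.n
            dist[s] = 0
            prevv, preve = [0] * self.n, [0] * self.n
            update = True
            while update:                      # Bellman-Ford relaxation
                update = False
                for u in range(self.n):
                    if dist[u] == float('inf'):
                        continue
                    for i, (v, cap, cost, _) in enumerate(self.g[u]):
                        if cap > 0 and dist[u] + cost < dist[v]:
                            dist[v] = dist[u] + cost
                            prevv[v], preve[v] = u, i
                            update = True
            if dist[t] == float('inf'):
                break                          # no more augmenting paths
            d, v = maxf, t
            while v != s:                      # bottleneck capacity
                d = min(d, self.g[prevv[v]][preve[v]][1])
                v = prevv[v]
            v = t
            while v != s:                      # push flow along the path
                self.g[prevv[v]][preve[v]][1] -= d
                e = self.g[prevv[v]][preve[v]]
                self.g[e[0]][e[3]][1] += d
                v = prevv[v]
            maxf -= d
            total_cost += d * dist[t]
        return total_cost

net = MinCostFlow(4)
net.add_edge(0, 1, 2, 1)   # arguments: from, to, capacity, cost
net.add_edge(0, 2, 1, 2)
net.add_edge(1, 2, 1, 1)
net.add_edge(1, 3, 1, 3)
net.add_edge(2, 3, 2, 1)
cost = net.flow(0, 3, 3)   # cheapest way to ship 3 units from node 0 to 3
```

Note that this method's running time can depend on the flow value; the point of Tardos's result was to remove any such dependence on the numbers involved.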
Tardos’ work has been one of the pillars of algorithmic game theory, a burgeoning field that brings theoretical computer science and economics together to develop algorithmic foundations for our highly connected digital world. Algorithmic game theory is concerned with algorithms designed in the presence of self-interested agents, governed by incentives and economic constraints. Her pioneering work with Tim Roughgarden demonstrated how game-theoretic ideas could quantify the performance gaps between centrally managed network traffic and the flow of traffic directed by self-interested agents. This innovation provided the tools by which computer scientists can analyze the behavior of rational entities in computerized systems and has sparked an enormous amount of further research.
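The flavor of this analysis can be seen in Pigou's classic two-road example, computed numerically below (a standard textbook illustration, not drawn from the citation itself): selfish routing settles on a worse outcome than a coordinated split, and the ratio of the two — the "price of anarchy" — tends to 4/3:

```python
# Pigou's example: one unit of traffic from s to t over two parallel
# links, one with constant delay c(x) = 1 and one with load-dependent
# delay c(x) = x, where x is the fraction of traffic on that link.
def total_delay(p):          # p = fraction of traffic on the x-delay link
    return p * p + (1 - p) * 1.0

# Nash equilibrium: the variable link never costs more than 1, so every
# self-interested driver takes it (p = 1).
nash_cost = total_delay(1.0)

# Social optimum: minimize total delay over all splits (grid search).
opt_cost = min(total_delay(i / 1000) for i in range(1001))

price_of_anarchy = nash_cost / opt_cost   # 1.0 / 0.75 = 4/3
```

The optimum splits traffic evenly (total delay 0.75), so uncoordinated behavior costs a third more than coordination in this network.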
A member of the U.S. National Academy of Sciences and National Academy of Engineering, Tardos is the Jacob Gould Schurman Professor of Computer Science at Cornell University, Ithaca, NY, USA.
With the introduction and development of abstract interpretation, Patrick Cousot has provided the computer programming industry with one of the most sweepingly influential and impactful tools in all of computing. Working with his wife Radhia (who passed away in 2014), Cousot gave a groundbreaking demonstration of abstract interpretation in 1977, a fundamental paradigm shift that placed static program analysis on a mathematical footing so that researchers could reason about correctness. It is now the dominant approach to static program analysis and is pervasive in today’s programming tools, including compilers and interactive development environments. Abstract interpretation provides a foundation for performing automatic program analysis, where the goal is to obtain information about the possible states that a program passes through during execution, but without actually running the program on specific inputs. In compilers, it is used to gather information used to decide which optimizations to employ, thereby allowing programs to run faster. In software-engineering tools, it is used to provide feedback to programmers about a program’s runtime properties, which helps them do a better job of developing, modifying, debugging, and testing programs. In verification tools, it is one of the key techniques used to show that a program never reaches a bad state, thereby establishing that the program is correct with respect to some property of interest. Verification has grown increasingly important as computers and microchip-based controllers have become pervasive, and it is especially crucial for critical systems, such as controllers in nuclear reactors, automobile-braking and airbag-deployment systems, and aircraft collision-avoidance systems.
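A toy sketch conveys the idea. Below, a program's variables are abstracted to intervals (one of the simplest abstract domains); the analysis soundly over-approximates every execution of a tiny program and proves a bound on its output without ever running it on a concrete input. This is an illustration of the concept, not the Cousots' formal framework:

```python
# A tiny interval-domain analysis: each variable is tracked as a range
# [lo, hi] that over-approximates every value it can take at run time.
class Interval:
    def __init__(self, lo, hi):
        self.lo, self.hi = lo, hi

    def __add__(self, other):
        return Interval(self.lo + other.lo, self.hi + other.hi)

    def __mul__(self, other):
        ps = [self.lo * other.lo, self.lo * other.hi,
              self.hi * other.lo, self.hi * other.hi]
        return Interval(min(ps), max(ps))

    def join(self, other):   # merge information flowing from two branches
        return Interval(min(self.lo, other.lo), max(self.hi, other.hi))

# Analyze:  x = input in [0, 10];  if cond: y = x + 1  else: y = x * 2
x = Interval(0, 10)
y = (x + Interval(1, 1)).join(x * Interval(2, 2))
# y is [0, 20]: we have proved y never exceeds 20, on any input, in
# either branch, without executing the program.
```

Real analyzers such as ASTRÉE combine far richer domains, but the principle — compute with sound approximations of program states — is the same.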
Cousot and his team developed the ASTRÉE software system to analyze C programs for the occurrence of runtime errors. ASTRÉE has been applied to many safety-critical applications, such as the flight-control software for the Airbus A340 and A380 aircraft.
An IEEE member and recipient of the 2008 Humboldt Research Award and the 2014 IEEE Computer Society Harlan D. Mills Award, Cousot is the Silver Professor of Computer Science at the Courant Institute of Mathematical Sciences, New York University, New York, NY, USA.
A living legend in the field of machine learning who is largely responsible for its historical and current success, Vladimir Vapnik has shaped the way modern researchers address the challenges of machine learning and how the field is practiced every day in applications ranging from large computer systems such as Google and Facebook to next-generation smart devices. Vapnik, with colleague Alexey Chervonenkis, developed the fundamental basis of statistical learning theory, which underlies practically all machine-learning techniques. Vapnik established an approach to machine learning based on the principle of fitting the available training data while controlling the complexity of the learned model, a capacity measured by what became known as the Vapnik-Chervonenkis (VC) dimension. This work helped researchers understand basic issues about the nature of learning in general, and about what it means for a model or a theory to be simple or complex. It has provided the mathematical foundations for the entire optimization-based approach to machine learning.
Another of Vapnik’s breakthroughs was the support vector machine (SVM) algorithm, which has become one of the most widely used techniques in machine learning. Building on Vapnik's statistical learning theory, this computationally efficient learning algorithm satisfies strong generalization guarantees. When combined with kernel functions, SVMs produce a highly flexible learning system for a wide range of data types and inductive biases, effectively using a linear-separator learning algorithm to perform well even for data requiring highly nonlinear separation boundaries. SVMs have been applied to a tremendous range of commercial, governmental, scientific, and academic problems, from spam and fraud detection, to the face detector in an iPhone, to supporting cutting-edge biological discoveries.
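The kernel idea can be sketched in a few lines. The example below is a kernel *perceptron*, not an SVM (an SVM additionally maximizes the margin, which this sketch omits), but it shows the same mechanism: a linear separator learned in an RBF feature space produces a nonlinear decision boundary on XOR-labeled points, which no straight line in the plane can separate:

```python
import math

# Kernel perceptron on XOR-labeled points: a linear separator in the
# RBF feature space yields a nonlinear boundary in the input space.
# (Illustrative only; an SVM would also maximize the margin.)
def rbf(a, b, gamma=1.0):
    return math.exp(-gamma * sum((u - v) ** 2 for u, v in zip(a, b)))

X = [(0, 0), (0, 1), (1, 0), (1, 1)]
y = [-1, 1, 1, -1]               # XOR: not linearly separable in the plane
alpha = [0.0] * len(X)

for _ in range(100):             # dual perceptron updates
    for i in range(len(X)):
        f = sum(alpha[j] * y[j] * rbf(X[j], X[i]) for j in range(len(X)))
        if y[i] * f <= 0:        # misclassified: raise this point's weight
            alpha[i] += 1.0

def predict(p):
    f = sum(alpha[j] * y[j] * rbf(X[j], p) for j in range(len(X)))
    return 1 if f > 0 else -1
```

Because the data appear only through the kernel function, swapping in a different kernel changes the inductive bias without changing the learning algorithm — the flexibility the citation describes.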
A member of the U.S. National Academy of Engineering and recipient of the Benjamin Franklin Medal (2012), Vapnik is a professor with the Department of Computer Science at Columbia University, New York, NY, USA, and a research consultant with Facebook AI Research, Menlo Park, CA, USA.
Christos H. Papadimitriou is a leader in showing how computational complexity can be used as a tool for understanding limits and solving problems across the broader scientific community, pioneering connections and collaborations between computer science and other disciplines. Papadimitriou has been the key player in developing the understanding of “NP total search problems,” computational challenges whose solutions are guaranteed to exist but may be hard to find.
He has been very influential in developing algorithmic game theory, which brings together computer science and economic theory. His work on the computational complexity of finding a Nash equilibrium has provided important insights for game theory and economics. He studied the algorithmic complexity of computing game-theoretic solutions to cooperative and noncooperative game scenarios, which has had important implications for economics and for gauging the health of the Internet amid the risks caused by congestion. He defined the “price of anarchy,” which provides a measure of the degree of inefficiency of equilibrium in a game and is important for quantifying the loss due to uncoordinated behavior of selfish agents within networks such as the Internet. Papadimitriou has also demonstrated how computational complexity can be applied to the natural sciences, describing algorithmic aspects of protein structure and providing a mixability theory for the role of sex in evolution. His novel Turing and the graphic novel Logicomix have been very successful in reaching the broader public and exposing many people to some of the fundamental principles and ideas of mathematics and computer science.
An IEEE member, an ACM Fellow, and a member of the U.S. National Academy of Sciences, the National Academy of Engineering, and the American Academy of Arts and Sciences, Papadimitriou is the C. Lester Hogan Professor of Electrical Engineering and Computer Science at the University of California, Berkeley, CA, USA.
A world-class innovator for over 30 years, James A. Gosling developed the Java programming language in 1995, a major milestone in computing that has had an immeasurable impact on computer science. Dr. Gosling combined the best ideas in programming languages with his own to create the first widely deployed programming language offering the portability to transmit code over the Internet from one computer to another for execution while still meeting security requirements. Its features include the portable "write once, run anywhere" byte-coded platform and libraries that use a standard class file format loadable and executable by any Java Virtual Machine; the robust and secure "sandbox" approach; type-safe automatic storage management; just-in-time compilation; and platform scaling from cell phones to enterprise servers. Used by approximately 9 million developers, Java is one of the most popular programming languages in history and can be found in servers, mobile phones, and the chips embedded in credit cards and identity badges. Dr. Gosling also influenced software engineering methodology with important contributions during the 1980s.
As a graduate student, he created one of the most widely used versions of the UNIX Emacs text editor. As a contributor to Carnegie Mellon’s Andrew Project, he developed the first UNIX window manager and one of the first modern, multiformat text editors, which allowed placement of tables, pictures, and graphics in a document. This open-source architecture influenced the evolution of Microsoft Windows. Dr. Gosling has also impacted the world of embedded systems with his early work on the ISIS II satellite, a real-time specification for Java, and his current work on autonomous ocean-going robots.
A member of the US National Academy of Engineering and an Officer of the Order of Canada (the second-highest Canadian civilian honor), Dr. Gosling is chief software architect with Liquid Robotics, Redwood City, CA, USA.
Cleve B. Moler is considered one of the most influential contributors to computational science and engineering for his development of the MATLAB high-level programming environment, which changed the face of numerical computation and provided an indispensable tool for engineers worldwide. Dr. Moler developed MATLAB, which stands for “matrix laboratory,” as a simple matrix calculator for student use in mathematics courses, but it soon found broader acceptance in engineering. MATLAB makes computing easier for scientists and engineers and increases productivity by allowing them to focus on solving the problem at hand without needing to write their own code to perform matrix computations.
In 1984, Dr. Moler founded MathWorks with Jack Little to commercialize MATLAB. Today, MATLAB has over 1 million users representing universities, industry, and government worldwide. It is an important tool in industries including automotive, aerospace, communications, electronics, industrial automation, financial services, and computational biology. MATLAB is used at over 5,000 universities, and it is often the first programming language taught to science and engineering students. Other impactful contributions from Dr. Moler include the LINPACK and EISPACK linear algebra software libraries for computation involving matrices, which he helped develop during the 1970s. LINPACK and EISPACK gave scientists the ability to solve complex problems without requiring them to be experts in the algorithms and software. The LINPACK Benchmark, used to rank the world’s fastest supercomputers, is named after the LINPACK software library. Dr. Moler also contributed to linear algebra during the 1960s by writing reliable state-of-the-art Fortran subroutines for matrix computations and creating (with Pete Stewart) the QZ algorithm for the generalized eigenvalue problem prevalent in many applications.
A member of the U.S. National Academy of Engineering and recipient of the IEEE Computer Society Computer Pioneer Award (2012), Dr. Moler is currently chief mathematician with MathWorks, Natick, MA, USA.
Jack D. Dennis’ early development and continued exploration of parallel computing architectures have yielded principles for creating systems with increased interactivity and sharing ability and have important implications for today’s multicore processors. With over 50 years of research, Dr. Dennis was one of the earliest advocates of addressing computer architectures and programming together for parallel computing. Parallelism involves performance of many calculations simultaneously using two or more processors to cooperatively solve a problem and has become an essential aspect of high-performance computing.
Dr. Dennis developed principles for executing programs securely in parallel environments, introducing the concepts of capability, protected domains, object lists, and protected call and return. Dr. Dennis’ dataflow concept remains a promising approach for future computer system architecture. In dataflow computation, individual instructions or groups of instructions known as codelets can be executed as soon as data become available. The dataflow graph models and execution algorithms developed by Dr. Dennis and his group at the Massachusetts Institute of Technology from the late 1960s through the 1980s inspired universities and research groups around the world to undertake dataflow projects. Dr. Dennis developed the VAL language for parallel computation, which evolved into the SISAL programming language. His work has influenced several generations of computer architects and compiler writers and is currently shaping new approaches to the architecture and programming of massively parallel computer systems.
An IEEE Fellow, an Association for Computing Machinery (ACM) Fellow, and a member of the U.S. National Academy of Engineering, Dr. Dennis has received honors including the ACM/IEEE Eckert-Mauchly Award (1984) and induction into the SOSP Hall of Fame. Dr. Dennis is Professor Emeritus with MIT’s Computer Science and Artificial Intelligence Laboratory, Cambridge, MA, USA.
Edward J. McCluskey provided the foundation for the design automation methods that make production of today’s complex computer chips possible. Credited with helping bridge the gap between computer science and electrical engineering to establish computer engineering as a discipline of its own, Dr. McCluskey has contributed techniques and concepts essential to computer design, testing, and reliability for over 50 years. He developed the Quine-McCluskey algorithm as the first systematic method to create a minimized two-level logic representation of a digital circuit. Computer scientists and engineers learn this algorithm as the foundation for logic optimization. It continues to be used in numerous design automation tools and has influenced practically all digital chips in use today.
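The heart of the method can be sketched compactly. The code below implements only the prime-implicant generation step of Quine-McCluskey (the subsequent covering step, which selects a minimal subset of primes, is omitted), repeatedly merging implicants that differ in a single bit:

```python
from itertools import combinations

# Prime-implicant generation for Quine-McCluskey (sketch; the covering
# step that picks a minimal subset of primes is omitted). Implicants are
# (value, dash_mask) pairs; set bits of dash_mask are "don't care"s.
def prime_implicants(minterms):
    terms = {(m, 0) for m in minterms}
    primes = set()
    while terms:
        merged, used = set(), set()
        for a, b in combinations(sorted(terms), 2):
            if a[1] == b[1]:                         # same dash positions
                diff = a[0] ^ b[0]
                if diff and diff & (diff - 1) == 0:  # differ in one bit
                    merged.add((a[0] & b[0], a[1] | diff))
                    used.update((a, b))
        primes |= terms - used       # terms that never merged are prime
        terms = merged
    return primes

def fmt(term, nbits):
    v, mask = term
    return ''.join('-' if mask >> i & 1 else str(v >> i & 1)
                   for i in reversed(range(nbits)))

# Classic 3-variable example: f has minterms 0, 1, 2, 5, 6, 7.
primes = {fmt(t, 3) for t in prime_implicants([0, 1, 2, 5, 6, 7])}
```

For this function the procedure yields six prime implicants (e.g. "0-0", meaning the first variable is 0, the second is free, the third is 0), from which the covering step would then pick a minimal two-level realization.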
Dr. McCluskey also developed the modern theory of hazards in logic networks and formulated the operating modes of sequential circuits, which defined techniques for designing high-speed circuits. Dr. McCluskey was also an early pioneer of digital testing methods, and his vision continues to impact the field. He demonstrated that test metrics rather than fault models were the key to obtaining high test quality, which helped change the way digital testing is conducted. He formulated the concept of algebraic properties of fault models and developed methods using the concept to reduce test sets. This work has been incorporated into today’s automatic test pattern generation and fault simulation tools. Dr. McCluskey also developed very low-voltage testing as a cost-effective alternative to burn-in methods for detecting weak chips. Dr. McCluskey has also made important contributions to fault-tolerant computing for applications where failure is not an option.
An IEEE Life Fellow, Dr. McCluskey is Professor Emeritus in the Departments of Electrical Engineering and Computer Science at Stanford University, Stanford, CA, USA.
Professor Sir Tony Hoare has established the foundation of much that is taken for granted today in software design. A major portion of Hoare’s 50-plus-year scientific career has been devoted to developing the theoretical underpinnings of software to the point where its creation becomes a true engineering field. His work has also had practical impact, with application to commercial software development projects involving database management systems for the telecommunications industry and security and safety applications in the medical, transportation, and nuclear power industries. In 1960, Hoare invented the Quicksort sorting algorithm, which has been widely studied and is implemented on virtually all modern computers.
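Quicksort's idea fits in a few lines: pick a pivot, partition the remaining elements around it, and recurse. This functional sketch favors clarity over the in-place partitioning of Hoare's original formulation:

```python
# Quicksort, written functionally for clarity (Hoare's original version
# partitions in place, which avoids the extra lists built here).
def quicksort(xs):
    if len(xs) <= 1:
        return xs
    pivot = xs[len(xs) // 2]
    less    = [x for x in xs if x < pivot]
    equal   = [x for x in xs if x == pivot]
    greater = [x for x in xs if x > pivot]
    return quicksort(less) + equal + quicksort(greater)
```

On average the pivot splits the input roughly in half, giving the O(n log n) expected running time that made the algorithm so practical.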
He also led a team during the 1960s that developed a successful early compiler for the ALGOL 60 high-level language. His compiler checked all array subscripts at run time, a precaution now common in modern object-oriented languages. Rejecting shared-variable interaction, he proposed “communicating sequential processes” (CSP) to address concurrency issues among programs. This bold step was very influential and saw application in the U.S. Department of Defense’s Ada language. It was also the inspiration for the Occam programming language used in the transputer microprocessor developed during the 1980s for parallel computing. More recently, Hoare has worked on the theory that would underpin a verification toolset and has encouraged computer scientists to work together toward its achievement.
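The CSP style — processes that share nothing and interact only by passing messages over channels — can be approximated in ordinary Python, using a bounded queue as the channel (true CSP channels are synchronous rendezvous points, which a small buffer only approximates):

```python
import threading
import queue

# CSP-style message passing: two processes share no variables and
# communicate only through a channel (a bounded queue approximates
# CSP's synchronous rendezvous).
def producer(ch):
    for i in range(3):
        ch.put(i)          # send a value into the channel
    ch.put(None)           # sentinel: signal end of stream

def consumer(ch, out):
    while True:
        v = ch.get()       # receive; blocks until a value arrives
        if v is None:
            break
        out.append(v * v)

ch = queue.Queue(maxsize=1)
out = []
t1 = threading.Thread(target=producer, args=(ch,))
t2 = threading.Thread(target=consumer, args=(ch, out))
t1.start(); t2.start()
t1.join(); t2.join()
# out == [0, 1, 4]
```

Languages descended from this idea (Occam, and later Go) make the channel a first-class construct rather than a library object.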
A Fellow of the U.K. Royal Academy of Engineering and a foreign associate of the U.S. National Academy of Engineering, he is currently a principal researcher with Microsoft Research Ltd., Cambridge, U.K.
Through pioneering research and wide-reaching textbooks, John Hopcroft and Jeffrey D. Ullman are known as two of the most influential figures in shaping the field of computer science. These giants of computer science first met in 1964, when Dr. Ullman took Dr. Hopcroft’s automata theory course at Princeton University. Together, they wrote the book on automata theory and formal languages that universities around the world used to educate the first generation of computer scientists, Formal Languages and Their Relation to Automata. Its successor, Introduction to Automata Theory, Languages, and Computation, is still in use today.
Dr. Hopcroft is considered one of only a handful of computer scientists who created the discipline of theoretical computer science, unifying automata theory and formal languages during the late 1960s. He also determined that computer programming could be synthesized into a theory of algorithms and that these algorithms could be evaluated by their asymptotic complexity. This led to a set of design principles that could be used to design optimal algorithms. An IEEE Life Fellow, Dr. Hopcroft is also a member of the National Academy of Sciences and the National Academy of Engineering. He is currently the IBM Professor of Engineering and Applied Mathematics at Cornell University, Ithaca, N.Y.
Dr. Ullman also focused on compiler technology and code optimization, writing the definitive book on compiler technology (Principles of Compiler Design, Addison-Wesley, 1977) with Alfred Aho. In the late 1970s, Dr. Ullman’s interests turned to database systems, and he became known as one of the founders of database theory. Dr. Ullman is a member of the National Academy of Engineering and is currently the Stanford W. Ascherman Professor of Computer Science (Emeritus) at Stanford University, Stanford, Calif. He also heads Gradiance Corporation, which he founded to provide online homework and programming-lab support for college students.
Susan L. Graham's seminal contributions and leadership have greatly influenced software development and high-performance computing. Together with her students, Dr. Graham developed programming-language implementation innovations including a sophisticated pattern-matching algorithm for generating machine code for high-level languages, which remains important for today's processors, and an important and useful elimination-style algorithm for flow analysis. She introduced automatic error-message generation as a practical tool, pointing the field in a new direction; as a result, IBM used automatic error-message generation in several of its production compilers. She and her students developed the "gprof" profiling tool for analyzing the execution efficiency of programs, as well as new algorithms for debugging programs. She led the development of the Berkeley Pascal compiler and the distribution and extension of Berkeley UNIX (BSD). Dr. Graham was also the co-creator of the Titanium language and system, which is used to develop parallel programs for scientific applications.
Dr. Graham has served on the Presidential Information Technology Advisory Committee, where she was actively involved in influencing the U.S. Congress to bolster information technology spending. She co-chaired the National Research Council's Future of Supercomputing committee. She was the chief computer scientist for the National Partnership for Advanced Computational Infrastructure, ensuring that the best computer science results were transferred into important computation-based science applications. She participates in a variety of science and engineering advisory groups.
Her current research interest is on developing interactive language-aware tools for creating and maintaining software. An IEEE member, Dr. Graham is currently the Pehong Chen Distinguished Professor of Electrical Engineering and Computer Science Emerita and a professor in the Graduate School at the University of California, Berkeley.
Leslie Lamport's pioneering work in distributed and concurrent algorithms has improved numerous consumer and industrial computing systems. The results of his work can be found in multi-processor technology such as very-large-scale-integration (VLSI) semiconductors and in multi-computer networks used in aircraft control systems. Since 2001, he has been at the Microsoft Research Silicon Valley Center, where he is a principal researcher. Prior to that, Dr. Lamport spent 16 years as a researcher at Digital Equipment Corporation (later Compaq). There he developed the Temporal Logic of Actions (TLA), a formalism and toolset for describing and mechanically verifying the behaviors of concurrent systems. Dr. Lamport developed several well-known concurrent and distributed algorithms, including solutions for Byzantine fault tolerance, which guards against failures in which a component of a system behaves erroneously and inconsistently in its interactions with the other components of the system. During his career, he has authored or co-authored nearly 150 publications on concurrent and distributed computing and their applications. One of his most notable papers, "Time, Clocks, and the Ordering of Events in a Distributed System," still ranks as one of the most important and influential papers in computer science. He is a past recipient of the IEEE Emanuel R. Piore Award, the Edsger W. Dijkstra Prize in Distributed Computing, and the influential paper award at the Principles of Distributed Computing Conference. Dr. Lamport holds a bachelor's degree from the Massachusetts Institute of Technology, Cambridge, as well as a master's and doctorate from Brandeis University, Waltham, Massachusetts.
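The central idea of that paper — logical clocks — is simple enough to sketch: every event increments a process's counter, and receiving a message advances the local clock past the message's timestamp, so causally related events always get increasing timestamps:

```python
# A sketch of Lamport's logical clocks from "Time, Clocks, and the
# Ordering of Events in a Distributed System": counters, not wall
# clocks, impose a consistent ordering on causally related events.
class Process:
    def __init__(self):
        self.clock = 0

    def local_event(self):
        self.clock += 1
        return self.clock

    def send(self):
        self.clock += 1
        return self.clock            # timestamp carried by the message

    def receive(self, msg_ts):
        # Jump past the sender's timestamp, then tick for the receive.
        self.clock = max(self.clock, msg_ts) + 1
        return self.clock

a, b = Process(), Process()
a.local_event()                      # A's clock becomes 1
ts = a.send()                        # A's clock becomes 2; message stamped 2
recv = b.receive(ts)                 # B's clock jumps to 3
```

The guarantee is one-directional: if event x can causally affect event y, then x's timestamp is smaller than y's — which is exactly what many distributed algorithms need.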
For over 35 years, Charles P. Thacker has led innovation in the area of distributed personal computing. He is one of the primary forces behind the introduction of the modern-day PC. While working at the Xerox Palo Alto Research Center (PARC) as a research fellow from 1970 to 1983, he served as the principal designer for the Alto personal computer system, widely considered the prototype for both workstations and windowed personal computers. Additionally, he revolutionized the computing industry as one of the co-inventors of the Ethernet local area network.
Mr. Thacker was a corporate consultant engineer at Digital Equipment Corporation's Systems Research Center (DEC SRC) from 1983 to 1997, where he developed the Firefly, one of the first multiprocessor workstation systems.
Since 1997, he has held the position of Technical Fellow at Microsoft Corporation, where he designed and implemented major parts of the prototype for the Tablet PC, including portions of its handwriting-recognition system. His prototype is also the basis for tablet PCs now being sold by several computer manufacturers.
An IEEE Member, Thacker holds a bachelor's degree in Physics from the University of California, Berkeley, and an honorary doctorate from the Swiss Federal Institute of Technology in Zurich. He is a fellow of the ACM, a member of the American Academy of Arts and Sciences, and a member of the National Academy of Engineering, which presented him with the 2004 Charles Stark Draper Prize, along with Alan C. Kay, Butler W. Lampson and Robert W. Taylor, for development of the first networked distributed personal computer system.
Dr. Ed Catmull, president and co-founder of Pixar Animation Studios, has made groundbreaking contributions to the field of computer graphics in modeling, animation and rendering that have revolutionized the way live-action and animated motion pictures are created. Dr. Catmull is one of the architects of the RenderMan software product utilized to create animated films such as Pixar's Toy Story and Finding Nemo and special effects in live-action films.
In 1974, while studying physics and computer science at the University of Utah, Dr. Catmull created a pioneering animation of a human hand that was incorporated into the first movie to use 3D computer graphics. Through his research, he made four key computer graphics discoveries: Z-buffering, texture mapping, subdivision surfaces, and the fast rendering of bicubic patches. Prior to Pixar, he was vice president of the computer division of Lucasfilm Ltd.
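Z-buffering, the first of those discoveries, resolves which surface is visible at each pixel. A minimal sketch: keep, per pixel, the depth of the nearest fragment drawn so far, and overwrite the pixel only when something nearer arrives:

```python
# A minimal z-buffer: per-pixel nearest depth decides visibility, so
# objects can be drawn in any order and still occlude correctly.
W, H = 4, 4
depth = [[float('inf')] * W for _ in range(H)]   # nearest depth seen so far
color = [[0] * W for _ in range(H)]              # 0 = background

def draw(pixels, z, c):
    for x, y in pixels:
        if z < depth[y][x]:          # nearer than anything drawn here yet
            depth[y][x] = z
            color[y][x] = c

# Draw a far quad covering the screen, then a nearer quad over one corner.
draw([(x, y) for x in range(4) for y in range(4)], z=5.0, c=1)
draw([(x, y) for x in range(2) for y in range(2)], z=2.0, c=2)
```

Because visibility is decided per pixel, no global sorting of surfaces is needed — which is why the technique maps so well onto graphics hardware.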
Dr. Catmull has been honored for his work with four Scientific and Technical Engineering awards, including an Oscar from The Academy of Motion Picture Arts and Sciences. In 1993 he was honored with the Steven A. Coons Award, the highest achievement in the computer graphics field, for his lifetime contributions.
Dr. Catmull is a member of the IEEE and the IEEE Computer Society, the U.S. National Academy of Engineering and The Academy of Motion Picture Arts and Sciences.
A founder and chief technology officer of several successful corporations and an adjunct professor at MIT, Dr. Michael Stonebraker made critically important contributions to relational database software, inventing many of the architectures and strategies that are the foundation of today's multi-billion-dollar-a-year database software industry. Virtually all business computing applications - from accounting, inventory and shipping to point-of-sale, online commerce, human resources and computer-aided design - manage data using relational database systems that can be traced to his work.
His earliest contributions were as the leader of the groundbreaking INGRES project. It applied relational theory to high-performance data-management software systems by allowing users to specify queries that describe desired data without specifying programs to find it. The relational systems he developed, along with those of the System R project from IBM, revolutionized database systems.
His next creation, POSTGRES, was an object-relational database that extended the relational model with abstract data types, including user-defined operators and procedures. It also allowed database systems to provide significantly better intelligence and efficiency in commercial, administrative and scientific applications. The types of functionality that Dr. Stonebraker pioneered in POSTGRES are now used in database systems worldwide.
In recent years, Dr. Stonebraker applied economic computing paradigms to federated data management. Currently, he is applying database concepts to high-performance stream-processing applications. In addition to his current role at MIT, Dr. Stonebraker founded StreamBase Systems in 2003 and serves as the company's chief technology officer. StreamBase has developed a new software platform designed to process real-time streaming data.
Alfred Aho's many contributions to computer science as a researcher, manager and educator demonstrate an elegant balance of theory and practice. His innovative research in formal languages and compiler theory led to key algorithms for modern compilers and string-pattern matching tools. His algorithms are used in widely deployed compiler construction tools like YACC and LEX, which have been used to design many computer programming languages.
Dr. Aho conducted research at Bell Labs in Murray Hill, N.J., from 1967 to 1991. During that time, he developed efficient algorithms for text-processing applications and programming language translation. His code-generation algorithms influenced the design of retargetable C compilers, which facilitated the porting of the UNIX operating system from minicomputers to supercomputers. He worked at Bellcore in Murray Hill, N.J., from 1991 to 1995 and then became a professor of computer science at Columbia University in New York. He later joined Lucent Technologies in Murray Hill, N.J., as an associate research vice president and vice president of computing sciences research. In 2002, he returned to Columbia as a professor of computer science and is chair of the Computer Science Department.
Dr. Aho co-wrote several of computer science's most widely used textbooks, including The Design and Analysis of Computer Algorithms; Compilers: Principles, Techniques, and Tools; and The AWK Programming Language. He has served on many professional committees, including the U.S. National Research Council's Committee on Information Technology Literacy and the Advisory Committee of the National Science Foundation's Computer and Information Science and Engineering Directorate. From 1992 to 1994, he was also editor of the Proceedings of the IEEE.
A member of the U.S. National Academy of Engineering, Dr. Aho is a Fellow of the IEEE, the Association for Computing Machinery, Bell Labs, and the American Association for the Advancement of Science.
When Ole-Johan Dahl and Kristen Nygaard created the Simula languages in the 1960s at the Norwegian Computing Center, they introduced a new way of modeling and programming complex tasks. Object-oriented programming is now dominant in system development and is part of computer science curricula everywhere, as are languages built on OOP concepts, such as Smalltalk, C++, Eiffel and Java.
Kristen Nygaard analyzed complex problems by computer simulation, which required interaction among many very dissimilar components. He saw the need for a description language that could be used to comprehend, describe and communicate complex systems, and that would also make it possible for computers to execute models of what had been described.
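That dual role — a description that a human can read and a machine can execute — is the essence of object orientation. As a loose illustration (in Python, with invented names; Simula itself provided classes, objects, and built-in simulation facilities), each component of a modeled system becomes an object whose behavior the computer can run:

```python
class Clerk:
    """One component of the modeled system: a single server."""
    def __init__(self):
        self.free_at = 0   # simulated time at which the clerk is next idle

class Customer:
    """Another component: a customer with an arrival time and a service demand."""
    def __init__(self, name, arrival, service):
        self.name, self.arrival, self.service = name, arrival, service

def simulate(customers):
    """Execute the description: customers queue, in arrival order, for one clerk.
    Returns a (name, service start, departure) record per customer."""
    clerk, log = Clerk(), []
    for c in sorted(customers, key=lambda c: c.arrival):
        start = max(c.arrival, clerk.free_at)       # wait if the clerk is busy
        clerk.free_at = start + c.service           # clerk occupied during service
        log.append((c.name, start, clerk.free_at))
    return log
```

For example, `simulate([Customer("a", 0, 2), Customer("b", 1, 2)])` yields `[("a", 0, 2), ("b", 2, 4)]`: the second customer arrives at time 1 but waits until time 2, exactly the kind of interaction among dissimilar components Nygaard needed to model.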
Ole-Johan Dahl joined Kristen Nygaard at the NCC. They designed the Simula concepts and language together, and Dahl carried out the difficult task of writing the program that translated descriptions into computer instructions. In 1968, Dahl became the University of Oslo's first computer science professor and built its Informatics Department. He later contributed to the theory of proving the correctness of programs. Nygaard continued to work on OOP, but also became the driving force in the "Scandinavian school of system development." His efforts are now aimed at improving the teaching and conceptual platform of computer science. Dahl and Nygaard are Professors Emeriti at the University of Oslo. Both have been named Commander of the Order of Saint Olav by the King of Norway and Honorary Fellows of the Object Management Group, and both have received the Norwegian Rosing Prize and the 2001 ACM A.M. Turing Award.
Combining a deep understanding of computer-science theory with practical implementation, Butler Lampson has contributed to a remarkable range of systems that make computing as we know it today possible.
As a member of the legendary Computer Science Laboratory of the Xerox Palo Alto Research Center, Dr. Lampson did work that laid much of the foundation for today's local area networks, client-server systems, laser printers, and editors such as Microsoft Word. He was instrumental to the group's many breakthroughs, including the Xerox Alto and its successor, the Dorado, which proved that an IBM 370-168 class computer could be packaged to fit in an office. He helped design Interpress, a precursor to Adobe's PostScript language. He was also a major contributor to the design and implementation of the Mesa, Cedar, and Modula-2+ programming languages, a series of designs that has influenced the structure of the Java programming language.
One of the designers of the SDS 940 timesharing system, Dr. Lampson was a pioneer in creating a commercial timesharing system that allowed user programming in machine language. Also a leader in computer security, he collaborated on the design of a global naming and authentication service that guided the structure of the OSF/DCE naming and authentication facilities.
Butler Lampson was born on 23 December 1943, in Washington, DC. He obtained an AB in Physics from Harvard University in 1964, and a Ph.D. in Electrical Engineering and Computer Science from the University of California at Berkeley in 1967. Currently, he is a Distinguished Engineer at Microsoft and an Adjunct Professor at MIT.
Dr. Lampson holds 24 patents on networks, security, raster printing, and transaction processing, and has published 64 papers. He is a member of the National Academy of Engineering and a Fellow of the Association for Computing Machinery (ACM) and the American Academy of Arts and Sciences. He holds honorary ScD's from the Eidgenössische Technische Hochschule, Zürich, and the University of Bologna. He received the ACM's Software Systems Award in 1984 for his work on the Alto, the IEEE Computer Pioneer Award in 1996, the National Computer Systems Security Award in 1998, and the Turing Award in 1992.