2019 - Erkki Oja

Erkki Oja's pioneering accomplishments in developing artificial neural networks for the automatic analysis of massive data sets have been critical to extracting reliable and useful information from today's overwhelming amount of digital data. He introduced a fundamental learning rule, now known as "Oja's Rule," that enables a neural network to find the principal components of input data, leading to efficient compression and feature extraction. Oja improved upon the Independent Component Analysis (ICA) technique with his FastICA method, which he applied to biomedical signals and images, demonstrating that very efficient separation of artifacts from actual brain signals was possible. His work on self-organizing maps has been used to classify both still pictures and videos and even to index nonverbal data.
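
As an illustration of how such a learning rule operates, here is a minimal sketch of Oja's rule (the toy data and learning rate are invented for the example): a single linear neuron whose weight vector converges toward the first principal component of its input.

```python
# Minimal sketch of Oja's rule (illustrative, not Oja's original code).
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 3)) @ np.diag([3.0, 1.0, 0.3])  # toy zero-mean data
w = rng.normal(size=3)
lr = 0.01

for x in X:
    y = w @ x                     # neuron output
    w += lr * y * (x - y * w)     # Oja's rule: Hebbian term with built-in normalization

w /= np.linalg.norm(w)
print("learned direction:", w)    # close to the leading eigenvector of the data covariance
```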

An IEEE Life Fellow, Oja is a Distinguished Professor Emeritus of Computer Science and Engineering with Aalto University, Espoo, Finland.

2018 - Enrique H. Ruspini

In a seminal 1969 paper, Enrique H. Ruspini provided the conceptual bases and tools for fuzzy clustering: the summarization and understanding of large data sets and complex objects as collections of fuzzy sets. In subsequent work, Ruspini defined methods that generalize fuzzy clustering by allowing the discovery of multiple, overlapping clusters of different natures and by recognizing important relations between those clusters. His work has led to numerous approaches to data representation and their application in fields ranging from image understanding to neurophysiology to genomics. His developments in approximate reasoning led to a better understanding of methodologies for analyzing systems described by uncertain data, as well as to approaches for the intelligent control of autonomous robots and for pattern matching in databases (finding "needles" in data "haystacks").
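
The core idea of a fuzzy partition can be sketched in a few lines (the objects, clusters, and membership values below are invented for illustration): every object belongs to each cluster with a degree between 0 and 1, and the degrees for one object sum to 1, so clusters may overlap.

```python
# Minimal sketch of a fuzzy partition in the sense introduced for fuzzy clustering.
memberships = {
    "object_1": {"cluster_A": 0.9, "cluster_B": 0.1},
    "object_2": {"cluster_A": 0.5, "cluster_B": 0.5},   # a genuinely ambiguous object
    "object_3": {"cluster_A": 0.2, "cluster_B": 0.8},
}

for obj, degrees in memberships.items():
    assert all(0.0 <= u <= 1.0 for u in degrees.values())     # degrees lie in [0, 1]
    assert abs(sum(degrees.values()) - 1.0) < 1e-9            # and sum to 1 per object
    print(obj, "->", degrees)
```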

An IEEE Life Fellow, Ruspini is currently an independent consultant residing in Palo Alto, CA, USA.

2017 - Stephen Grossberg

Stephen Grossberg's foundational work on modeling how brain mechanisms give rise to behavioral functions has played an important role in understanding the human mind and in enabling machines to adapt to unexpected changes. The ability of his models to adapt autonomously in real time to unexpected environments makes them suitable for applications including image processing, pattern recognition and prediction, and robotics. He is best known for his Adaptive Resonance Theory, which concerns how the brain can learn new objects without forgetting previously learned patterns. He also developed the concept of Laminar Computing. His principle of Complementary Computing demonstrates how the brain is organized into pairs of parallel streams that compute complementary processes, allowing the trade-offs necessary for adaptive intelligent systems.
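
A drastically simplified sketch of the match/resonance idea behind Adaptive Resonance Theory follows (this is an illustrative toy, not the full ART architecture; the vigilance value and patterns are invented): a new input either resonates with an existing category, refining it, or is stored as a new category, so earlier learning is not overwritten.

```python
# Highly simplified match/resonance cycle for binary inputs (illustrative only).
import numpy as np

vigilance = 0.6
categories = []                      # learned prototype vectors

def present(pattern):
    pattern = np.asarray(pattern, dtype=float)
    for proto in categories:
        overlap = np.minimum(proto, pattern)               # AND of input and prototype
        if overlap.sum() / pattern.sum() >= vigilance:     # vigilance test: good enough match?
            proto[:] = overlap                             # resonance: refine that category
            return proto
    categories.append(pattern.copy())                      # no match: open a new category,
    return categories[-1]                                  # so old knowledge is preserved

present([1, 1, 0, 0])
present([1, 1, 1, 0])    # similar enough: refines the first category
present([0, 0, 1, 1])    # too different: stored as a new category
print(len(categories), "categories learned")
```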

An IEEE Fellow, Grossberg is the Wang Professor of Cognitive and Neural Systems at Boston University, Boston, MA, USA.

2016 - Ronald R. Yager

With almost 40 years of groundbreaking contributions, Ronald R. Yager is one of the most highly cited researchers in the field of computational intelligence. Of major impact has been Yager's introduction of the Ordered Weighted Averaging (OWA) operator, which has been applied to multicriteria decision making, information fusion, database retrieval, and pattern recognition. His methodology for finding linguistic summaries of large data collections makes data easier to understand and has been integral to data mining applications. He also developed a generalized class of logical "and" operators, known as the Yager family of t-norms, that have been widely used to model the intersection of fuzzy sets. Yager's pioneering work on fuzzy-set-based approaches to social networks and recommender systems has been important to web applications.
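
A minimal sketch of the OWA operator (the scores and weight vectors are invented for illustration): the arguments are first ordered, and the weights attach to positions rather than to particular criteria, which lets the same operator behave anywhere between a logical "and" and a logical "or".

```python
# Minimal sketch of the Ordered Weighted Averaging (OWA) operator.
def owa(values, weights):
    # Sort the arguments in descending order, then take a weighted sum:
    # weights attach to positions (largest, second largest, ...), not to criteria.
    ordered = sorted(values, reverse=True)
    return sum(w * v for w, v in zip(weights, ordered))

scores = [0.7, 0.4, 0.9]             # satisfaction of three criteria
print(owa(scores, [1.0, 0.0, 0.0]))  # all weight on the maximum -> behaves like "or": 0.9
print(owa(scores, [0.0, 0.0, 1.0]))  # all weight on the minimum -> behaves like "and": 0.4
print(owa(scores, [1/3, 1/3, 1/3]))  # uniform weights -> plain average
```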

An IEEE Life Fellow, Yager is a professor with the Machine Intelligence Institute at Iona College, New York, NY, USA.

2015 - Marco Dorigo

Marco Dorigo's groundbreaking research on biologically inspired intelligent methods for solving optimization problems helped launch the discipline of swarm intelligence. Swarm intelligence studies distributed systems whose problem-solving abilities derive from self-organized local interactions between their constituent components. Prof. Dorigo is best known for his work on the ant colony optimization (ACO) methodology, inspired by the foraging behavior of ants, which is used by researchers worldwide and has generated many high-performance algorithms. He is also a leading contributor to swarm robotics, which applies swarm intelligence principles to coordinate large groups of autonomous robots without relying on any external infrastructure or any form of centralized control. This holds promise for performing tasks too difficult or dangerous for humans.
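
A compact sketch of the ACO idea on a tiny traveling-salesman instance (the distance matrix, parameters, and iteration counts are invented for illustration): artificial ants build tours probabilistically, favoring short edges with strong pheromone; pheromone then evaporates and is reinforced along good tours.

```python
# Compact sketch of ant colony optimization on a toy symmetric TSP.
import random

dist = [
    [0, 2, 9, 10],
    [2, 0, 6, 4],
    [9, 6, 0, 3],
    [10, 4, 3, 0],
]
n = len(dist)
pheromone = [[1.0] * n for _ in range(n)]
alpha, beta, rho, n_ants, n_iters = 1.0, 2.0, 0.5, 8, 50
random.seed(0)

def tour_length(tour):
    return sum(dist[tour[i]][tour[(i + 1) % n]] for i in range(n))

def build_tour():
    tour = [random.randrange(n)]
    while len(tour) < n:
        i = tour[-1]
        choices = [j for j in range(n) if j not in tour]
        # Probability of moving to city j grows with pheromone and with closeness.
        weights = [(pheromone[i][j] ** alpha) * ((1.0 / dist[i][j]) ** beta) for j in choices]
        tour.append(random.choices(choices, weights=weights)[0])
    return tour

best = None
for _ in range(n_iters):
    tours = [build_tour() for _ in range(n_ants)]
    # Evaporate pheromone, then let each ant deposit an amount inversely
    # proportional to its tour length.
    for i in range(n):
        for j in range(n):
            pheromone[i][j] *= (1.0 - rho)
    for t in tours:
        deposit = 1.0 / tour_length(t)
        for k in range(n):
            a, b = t[k], t[(k + 1) % n]
            pheromone[a][b] += deposit
            pheromone[b][a] += deposit
    candidate = min(tours, key=tour_length)
    if best is None or tour_length(candidate) < tour_length(best):
        best = candidate

print("best tour:", best, "length:", tour_length(best))
```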

An IEEE Fellow, Dr. Dorigo is an F.R.S.-FNRS research director and a co-director of IRIDIA, the artificial intelligence lab of the Université Libre de Bruxelles, Brussels, Belgium.

2014 - Geoffrey E. Hinton

Geoffrey E. Hinton helped establish the field of machine learning and has dedicated his research to understanding how the human brain works and how this knowledge can be applied to give machines brain-like capabilities for performing complex tasks. Prof. Hinton pioneered backpropagation learning algorithms for training neural networks and has revolutionized machine learning several times over. His work on deep learning provides a better model of biological learning than previous methods. By employing multiple levels of representation, it can produce the kind of deep hierarchy of abstract representations known to exist in the brain. His work has brought revolutionary changes to speech recognition technology, and his algorithms have been applied to collaborative filtering and object recognition.
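
A minimal sketch of backpropagation on the classic XOR problem (the architecture, learning rate, and initialization are invented for illustration): the forward pass computes the output, and the backward pass propagates the output error through each layer to update the weights.

```python
# Minimal two-layer network trained by backpropagation on XOR (illustrative only).
import numpy as np

rng = np.random.default_rng(1)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1 = rng.normal(size=(2, 4)); b1 = np.zeros(4)
W2 = rng.normal(size=(4, 1)); b2 = np.zeros(1)
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))
lr = 0.5

for _ in range(5000):
    h = sigmoid(X @ W1 + b1)              # forward pass through the hidden layer
    out = sigmoid(h @ W2 + b2)            # network output
    d_out = out - y                       # output delta (sigmoid with cross-entropy loss)
    d_h = (d_out @ W2.T) * h * (1 - h)    # error propagated back to the hidden layer
    W2 -= lr * h.T @ d_out; b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_h;   b1 -= lr * d_h.sum(axis=0)

print(np.round(out.ravel(), 2))           # typically approaches [0, 1, 1, 0]
```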

Dr. Hinton is a university professor with the Department of Computer Science at the University of Toronto, Canada, and a Distinguished Researcher at Google Inc.

2013 - Terrence Sejnowski

A pioneer of computational neuroscience, Terrence Sejnowski has developed methods important for studying and understanding how the human brain learns and stores memories. Dr. Sejnowski's contributions ultimately may provide medical specialists with important clues to combating Alzheimer's disease. He has created methods for representing how networks of neurons generate dynamical patterns of activity, how sensory information is represented in the cerebral cortex, and how memory representations are formed and consolidated during sleep. In 1987 he created NETtalk for converting English words to speech. In 1995, with Tony Bell, he introduced the infomax independent component analysis (ICA) algorithm for blind source separation.
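
A minimal sketch of the natural-gradient form of the infomax update for blind source separation (the synthetic sources, mixing matrix, learning rate, and iteration count are invented for illustration; this is not Bell and Sejnowski's original code):

```python
# Minimal sketch of an infomax-style ICA update on two artificially mixed signals.
import numpy as np

rng = np.random.default_rng(0)
S = rng.laplace(size=(2, 2000))             # two independent super-Gaussian sources
A = np.array([[1.0, 0.6], [0.4, 1.0]])      # "unknown" mixing matrix
X = A @ S                                   # observed mixtures

W = np.eye(2)                               # unmixing matrix to be learned
lr = 0.01
for _ in range(2000):
    U = W @ X                               # current source estimates
    Y = 1.0 / (1.0 + np.exp(-U))            # logistic nonlinearity
    # Natural-gradient infomax update: drive the outputs toward independence.
    W += lr * (np.eye(2) + (1.0 - 2.0 * Y) @ U.T / X.shape[1]) @ W

print(np.round(W @ A, 2))                   # approaches a scaled permutation matrix
```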

An IEEE Fellow, Dr. Sejnowski is one of only ten living people to be a member of the U.S. Institute of Medicine, the U.S. National Academy of Sciences, and the U.S. National Academy of Engineering. He is the Francis Crick Chair with the Salk Institute for Biological Studies, La Jolla, CA, USA.

2012 - Vladimir N. Vapnik

Vladimir N. Vapnik’s pioneering work became the foundation of a new research field known as “statistical learning theory” that has transformed how computers learn to tackle complex problems. Working with Alexey Chervonenkis in Moscow during the late 1960s and early 1970s, Dr. Vapnik developed the Vapnik-Chervonenkis (VC) learning theory. This theory established a fundamental quantity, the VC dimension, that characterizes the capacity and limitations of learning machines. Dr. Vapnik later created the principle of structural risk minimization to control the generalization factors identified by VC theory. Dr. Vapnik’s research was unknown to the Western world until he arrived in the United States shortly before the collapse of the Soviet Union. Working with AT&T Laboratories in Holmdel, NJ, during the 1990s, he put his theories into practical use with support vector machine (SVM) algorithms for recognizing complex patterns in data for classification and regression tasks. SVMs have become the method of choice for machine learning.
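
A minimal sketch of a linear soft-margin SVM trained by stochastic subgradient descent on the hinge loss (a Pegasos-style update; the data and regularization constant are invented for illustration, and this is not Vapnik's implementation). The regularization term embodies the capacity control that structural risk minimization calls for.

```python
# Minimal linear SVM via stochastic subgradient descent on the hinge loss.
import numpy as np

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(loc=-2, size=(50, 2)), rng.normal(loc=+2, size=(50, 2))])
y = np.array([-1] * 50 + [+1] * 50)

w = np.zeros(2)
b = 0.0
lam = 0.01                                   # regularization: margin vs. training-error trade-off
for t in range(1, 5001):
    i = rng.integers(len(X))
    eta = 1.0 / (lam * t)                    # decaying step size
    if y[i] * (X[i] @ w + b) < 1:            # point violates the margin
        w = (1 - eta * lam) * w + eta * y[i] * X[i]
        b += eta * y[i]
    else:
        w = (1 - eta * lam) * w

preds = np.sign(X @ w + b)
print("training accuracy:", (preds == y).mean())
```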

A member of the U.S. National Academy of Engineering and NEC Laboratories America Fellow, Dr. Vapnik is currently a professor with Columbia University, New York, NY, USA.

2011 - Hans-Paul Schwefel

Hans-Paul Schwefel’s groundbreaking contributions to evolution strategies helped define the field of evolutionary computation and have had a lasting impact on the computational intelligence community. Working with fellow students Ingo Rechenberg and Peter Bienert at the Technical University of Berlin during the mid-1960s, Dr. Schwefel carried out the first theoretical investigations of evolution strategies and their first industrial application: the shape optimization of a supersonic nozzle for a one-component, two-phase flow. He pioneered the shift from experiments done by hand to computational optimization by introducing the collective self-adaptation of internal parameters within evolutionary algorithms, which helped make such methods effective and efficient. He later introduced evolutionary principles beyond variation and natural selection into the algorithms to handle special features of the search space. In 1990 he co-founded the international conference series on Parallel Problem Solving from Nature (PPSN).
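
A minimal sketch of a (1, λ) evolution strategy with self-adaptive step size, in the spirit described above (the test function, population size, and learning rate are invented for illustration): each offspring first mutates its own strategy parameter and then uses it to mutate the object variables.

```python
# Minimal (1, lambda) evolution strategy with self-adaptive mutation step size.
import numpy as np

rng = np.random.default_rng(0)
sphere = lambda x: float(np.sum(x ** 2))      # toy objective to minimize

dim, lam = 5, 10
parent, sigma = rng.normal(size=dim), 1.0
tau = 1.0 / np.sqrt(dim)                      # learning rate for the step size

for _ in range(200):
    offspring = []
    for _ in range(lam):
        # Self-adaptation: mutate the strategy parameter first, then use it
        # to mutate the object variables.
        child_sigma = sigma * np.exp(tau * rng.normal())
        child = parent + child_sigma * rng.normal(size=dim)
        offspring.append((sphere(child), child, child_sigma))
    _, parent, sigma = min(offspring, key=lambda o: o[0])   # comma selection: best offspring only

print("best value found:", sphere(parent))
```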
 
An IEEE Fellow, Dr. Schwefel is Professor Emeritus of Computer Science at the Technical University of Dortmund, Germany.

2010 - Michio Sugeno

Michio Sugeno

Dr. Michio Sugeno first introduced fuzzy measures and the Sugeno integral, leading to the concept of the Choquet integral as an extension of the conventional Lebesgue integral. Dr. Sugeno has also been a leader in creating industry partnerships to bring practical applications to life, ranging from smart cameras to industrial control systems. One of Dr. Sugeno’s early breakthroughs was the automatic control of a small car using fuzzy systems based on the Takagi-Sugeno model. This groundbreaking work has had a tremendous impact on fuzzy control research and on applications such as home appliances, automobiles, and process control. Dr. Sugeno next tackled intelligent control of an unmanned helicopter, demonstrating fuzzy control through verbal instructions combined with real-time flight data from GPS and on-board cameras. This achievement has implications for the agriculture industry and for disaster-recovery applications.
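
A minimal sketch of the discrete Sugeno integral of a set of scores with respect to a fuzzy measure (the scores and measure values are invented for illustration): sort the items by score and take the largest min of a score and the measure of the corresponding top-ranked coalition.

```python
# Minimal discrete Sugeno integral with respect to a (monotone) fuzzy measure.
def sugeno_integral(scores, measure):
    # Sort items by score in descending order; the integral is the largest value
    # of min(score of the i-th ranked item, measure of the set of the top-i items).
    items = sorted(scores, key=scores.get, reverse=True)
    best = 0.0
    for i in range(1, len(items) + 1):
        coalition = frozenset(items[:i])
        best = max(best, min(scores[items[i - 1]], measure[coalition]))
    return best

scores = {"math": 0.9, "physics": 0.6, "literature": 0.3}   # partial evaluations
measure = {                                                 # importance of each needed subset
    frozenset(["math"]): 0.4,
    frozenset(["math", "physics"]): 0.8,
    frozenset(["math", "physics", "literature"]): 1.0,
}
print(sugeno_integral(scores, measure))   # 0.6 for these illustrative values
```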

An IEEE Member, Dr. Sugeno is currently an Affiliated Distinguished Researcher at the European Centre for Soft Computing, Oviedo, Spain.

2009 - John J. Hopfield

John J. Hopfield’s research demonstrated how modeling biological processes in the brain can be used to solve complex computational problems. The beginning of the modern era of neural networks can be traced to Dr. Hopfield’s pioneering work in the early 1980s.

Relating an understanding of the electrical and cellular activity that takes place in the brain to computer technology, Dr. Hopfield described a feedback network of highly interconnected neurons, now known as a “Hopfield network,” that could reconstruct memories from clues (associative memory). He showed how stable states of network activity could represent memories, emphasizing the importance of treating computers (and the brain) as dynamical systems. A large portion of studies concerning neural circuits is based on Hopfield’s concepts and the use of attractors for computation. Beyond the benefit to computing technology, Dr. Hopfield’s work also serves as a basic paradigm in neuroscience for understanding how the brain carries out its tasks.
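
A minimal sketch of a Hopfield network used as an associative memory (the stored patterns and update schedule are invented for illustration): Hebbian learning stores the patterns in the weight matrix, and asynchronous updates let a corrupted cue settle into the nearest stored memory.

```python
# Minimal Hopfield network: Hebbian storage, then recall from a corrupted cue.
import numpy as np

rng = np.random.default_rng(0)
patterns = np.array([
    [1, -1, 1, -1, 1, -1, 1, -1],
    [1, 1, 1, 1, -1, -1, -1, -1],
])
n = patterns.shape[1]

# Hebbian learning: correlate each pair of units over the stored patterns.
W = sum(np.outer(p, p) for p in patterns).astype(float)
np.fill_diagonal(W, 0)

state = patterns[0].copy()
state[:2] *= -1                     # corrupt the first two bits of the cue
for _ in range(5):                  # asynchronous updates until the state settles
    for i in rng.permutation(n):
        state[i] = 1 if W[i] @ state >= 0 else -1

print("recovered first pattern:", np.array_equal(state, patterns[0]))
```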

Dr. Hopfield is the Howard Prior Professor of Molecular Biology, emeritus, at Princeton University, NJ, USA.

2008 - Teuvo Kohonen

Teuvo Kohonen has made pivotal contributions to the field of artificial neural networks, having developed the self-organizing map (SOM), a data-analysis method that addresses the problems of clustering and data exploration. The SOM is regularly used in finance, trade, the natural sciences, and linguistics, and the method and its derivatives are also used in speech recognition and robotics. Considered by experts to be one of the most significant inventions in computational science, the SOM has been the subject of some 8,000 scientific papers, a dozen books, and six international workshops organized around the method. Dr. Kohonen is currently a professor and Academician at Helsinki University of Technology, Espoo, Finland. An IEEE Fellow, he has received several awards including the IEEE Neural Networks Pioneer Award, the IEEE Signal Processing Society’s Technical Achievement Award, and the International Neural Network Society Lifetime Achievement Award.
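
A minimal sketch of SOM training on toy two-dimensional data (the grid size, learning rate, and neighborhood schedule are invented for illustration): each sample pulls its best-matching prototype and that prototype's grid neighbors toward it, so the map self-organizes to reflect the data distribution.

```python
# Minimal self-organizing map trained on random points in the unit square.
import numpy as np

rng = np.random.default_rng(0)
data = rng.random((500, 2))                      # toy 2-D data
grid = np.array([(i, j) for i in range(8) for j in range(8)], dtype=float)
weights = rng.random((64, 2))                    # one prototype per map node

n_steps = 3000
for t in range(n_steps):
    x = data[rng.integers(len(data))]
    lr = 0.5 * (1 - t / n_steps)                 # decaying learning rate
    radius = 4.0 * (1 - t / n_steps) + 0.5       # decaying neighborhood radius
    bmu = np.argmin(((weights - x) ** 2).sum(axis=1))      # best-matching unit
    # Pull the BMU and its grid neighbors toward the sample; nearer nodes move more.
    grid_dist2 = ((grid - grid[bmu]) ** 2).sum(axis=1)
    h = np.exp(-grid_dist2 / (2 * radius ** 2))
    weights += lr * h[:, None] * (x - weights)

print("prototype range:", weights.min(axis=0), weights.max(axis=0))
```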

2007 - James C. Bezdek

Dr. Bezdek is the Nystul Professor and Eminent Scholar of Computer Science at the University of West Florida in Pensacola. He developed the fuzzy c-means (FCM) algorithm, considered one of the most important discoveries in fuzzy pattern recognition and related areas and the clustering algorithm of choice for most practitioners in fuzzy exploratory data analysis. The original model has inspired many applications in related areas of pattern recognition and image processing.
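
The alternating updates at the heart of FCM can be sketched as follows (the data, number of clusters, and fuzzifier are invented for illustration): cluster centers are membership-weighted means, and memberships are recomputed from distances to the new centers.

```python
# Minimal fuzzy c-means (FCM) on toy 2-D data.
import numpy as np

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(loc=0, size=(50, 2)), rng.normal(loc=5, size=(50, 2))])
c, m = 2, 2.0                                   # number of clusters, fuzzifier
U = rng.random((len(X), c))
U /= U.sum(axis=1, keepdims=True)               # fuzzy memberships sum to 1 per point

for _ in range(50):
    # Update centers as membership-weighted means, then recompute memberships
    # from the distances to the new centers.
    Um = U ** m
    centers = (Um.T @ X) / Um.sum(axis=0)[:, None]
    d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
    U = 1.0 / (d ** (2 / (m - 1)))
    U /= U.sum(axis=1, keepdims=True)

print("cluster centers:\n", np.round(centers, 2))
```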

Areas of research benefiting from Dr. Bezdek’s work include diagnostic medicine, economics, chemistry, image processing, meteorology, web mining, geology, target recognition, regression analysis, document retrieval, structural failure and irrigation models. One of the most notable applications has been in medical image analysis, where FCM segmentation of magnetic resonance images is used in conjunction with rule-based analysis for both diagnosis and pre-operative planning for brain tumor patients. Dr. Bezdek also has made pioneering contributions in deriving the theories for clustering of relational (Euclidean and non-Euclidean) data.

2006 - Lawrence J. Fogel

Dr. Lawrence J. Fogel has been described by colleagues as “a father of computational intelligence.”  

Beginning in 1960, he devised evolutionary programming, a radical approach to artificial intelligence that simulates evolution to evolve solutions to problems. While a senior staff scientist at General Dynamics/Astronautics in San Diego, CA, he conducted a research study in evolutionary programming to advise management on the technical aspects of man-machine relations within aerospace systems.

His 1964 doctoral dissertation on evolutionary programming was the basis of the first book on evolutionary computation, Artificial Intelligence through Simulated Evolution, which he co-authored with Alvin Owens and Michael Walsh. In 1965, Dr. Fogel, with Owens and Walsh, founded Decision Science, Inc. in San Diego, CA, the first company to focus on solving real problems via evolutionary computation.

As president of Decision Science, he directed its activities, guiding research in the informational sciences in areas such as computer simulation, mathematical prediction and control systems, real-time data processing, and materials handling systems. He also developed evolutionary programming methods that led to the Adaptive Maneuvering Logic, a heuristic approach to missile evasion for simulated aerial combat, as well as for naval and tank warfare. His method has also been used to discover new pharmaceuticals, improve industrial production, and optimize mission planning in defense applications. In 1982, Decision Science merged with and became a division of Titan Systems, Inc., in San Diego.

In 1993, Dr. Fogel founded Natural Selection, Inc. in La Jolla, CA, which combines evolutionary computation with neural networks, fuzzy systems, and other computational intelligence technologies. The company has addressed and solved problems in many areas, including bioinformatics, medical diagnostics, pattern recognition, data mining, perimeter security, factory optimization, route scheduling, autonomous vehicle capabilities and risk management.

An IEEE Life Fellow, Dr. Fogel is the recipient of the IEEE Neural Networks Council Evolutionary Computation Pioneer Award, the Lifetime Achievement Award from the Evolutionary Programming Society, and the Computational Intelligence Pioneer Award from the International Society for Optical Engineering.

He holds a bachelor’s degree from New York University in New York City and a master’s degree from Rutgers University in New Brunswick, NJ, both in Electrical Engineering, and a doctorate in Engineering with a major in Biotechnology from the University of California at Los Angeles, Los Angeles, CA, USA.
