Machine Learning: A History of Modern Achievements

18 November 2021
Machine Learning

This article explores the history and possibilities of modern AI and machine learning methods.

Today, machine learning is being integrated into every aspect of human life, yet it often remains invisible to those who do not work in information technology. This article explores the possibilities of modern methods of artificial intelligence (AI) and machine learning. It looks at the history, past problems, and evolution of these technologies and discusses their current status and opportunities.

Artificial intelligence and machine learning: what's the difference?

John McCarthy, the American computer scientist who coined the term “artificial intelligence”, described AI as the science and engineering of making intelligent machines, especially intelligent computer programs.

Arthur Samuel, who in 1952 developed one of the first checkers programs to incorporate self-learning, later defined machine learning as the field of study that gives computers the ability to learn without being explicitly programmed.

Early examples of artificial intelligence

Mechanical Turk or Automaton Chess Player

The first chess machine was designed by Wolfgang von Kempelen and demonstrated in Vienna in 1769. It took the form of a chess-playing automaton known as the "Turk" — a life-size wax figure of a man dressed in Turkish attire, sitting at a chessboard, which stood on a wooden box.

The box had doors that were opened for the audience to reveal a complex mechanism. The doors were then closed, the mechanism was wound with a key, and a game would begin. However, unbeknownst to the spectators, the game was actually played by a skilled chess player concealed in the box, hidden from view behind mirrors and partitions.

Turing test

Alan Turing first proposed the Turing test in his article "Computing Machinery and Intelligence", published in 1950 in the philosophical journal Mind. Turing was trying to determine whether machines could think.

As part of the Turing test, a person directs questions to two unseen respondents: one computer and one human. Based on the answers to these questions, the person must determine whether they are talking to a human or a computer program. The task of the computer program is to mislead the person and prompt them to make the wrong choice.

To test the machine's intelligence rather than its ability to recognize speech, the conversation is conducted via text using an intermediary computer. Correspondence is carried out at controlled intervals so that the speed of the responses does not influence the questioner. If the person cannot be certain which of the respondents is human, then the computer is deemed to have passed the test.


ELIZA

ELIZA is a computer program created by Joseph Weizenbaum in 1966 that mimics human conversation. Using natural language processing (NLP) techniques, ELIZA simulates the dialogue between patient and therapist through active listening. The program was named after Eliza Doolittle, a character in the play Pygmalion by George Bernard Shaw, who was taught "the language of Shakespeare, Milton, and the Bible."
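ELIZA's core technique was simple pattern matching with pronoun reflection rather than genuine understanding. The following is a toy sketch in that spirit (the rules and word lists here are invented for illustration; they are not Weizenbaum's original DOCTOR script):

```python
import re

# Toy ELIZA-style responder: match keyword patterns and reflect pronouns
# back at the user, mimicking a therapist's "active listening".
REFLECTIONS = {"i": "you", "my": "your", "am": "are", "me": "you"}

RULES = [
    (re.compile(r"i feel (.*)", re.I), "Why do you feel {0}?"),
    (re.compile(r"i am (.*)", re.I), "How long have you been {0}?"),
    (re.compile(r"my (.*)", re.I), "Tell me more about your {0}."),
]

def reflect(fragment: str) -> str:
    """Swap first-person words for second-person ones in a matched fragment."""
    return " ".join(REFLECTIONS.get(w, w) for w in fragment.lower().split())

def respond(utterance: str) -> str:
    """Return the first matching rule's response, or a generic prompt."""
    for pattern, template in RULES:
        match = pattern.search(utterance)
        if match:
            return template.format(reflect(match.group(1)))
    return "Please, go on."  # default when no rule matches

print(respond("I feel sad about my work"))
# → Why do you feel sad about your work?
```

Even this tiny rule table shows why ELIZA felt conversational: reflecting "my work" back as "your work" creates an illusion of comprehension from purely mechanical substitution.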


NLP applications include:

●      Speech recognition (an automatic process of converting speech to text)

●      Dialog systems

●      Text generation

●      Text analysis

●      Information extraction

●      Information retrieval

●      Statement analysis

●      Sentiment analysis (text tonality)

●      Question and answer systems
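As a taste of one of these applications, here is a minimal lexicon-based sketch of sentiment (tonality) analysis. The word lists are invented for illustration; real systems use large lexicons or trained models:

```python
# Minimal lexicon-based tonality scorer: count positive and negative words
# and report the overall tone of the text.
POSITIVE = {"good", "great", "love", "excellent", "happy"}
NEGATIVE = {"bad", "terrible", "hate", "awful", "sad"}

def tonality(text: str) -> str:
    """Classify a text as positive, negative, or neutral by word counts."""
    words = text.lower().split()
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    if score > 0:
        return "positive"
    if score < 0:
        return "negative"
    return "neutral"

print(tonality("I love this great product"))  # → positive
```

A word-counting approach like this misses negation and context ("not bad" scores as negative), which is exactly the gap that modern statistical and neural NLP methods address.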

Generative Pre-trained Transformer 3 (GPT-3) from OpenAI (released on June 11, 2020) is among the latest achievements in NLP systems involving text generation. GPT-3 can answer questions about a text it has read, write poetry, solve anagrams and simple arithmetic problems, and even translate between languages.


History of speech recognition development

●      Late 1950s — The first computer-based speech synthesis systems.

●      1963 — Miniature recognition devices with fiber-optic storage.

●      1976 — Harpy (recognition accuracy of 47% on a 1,011-word vocabulary).

●      1983 — Recognition of the commands of Apache helicopter pilots.

●      1990 — Dragon Dictate program.

●      1997 — Dragon Naturally Speaking (the first universal program).

●      2001 — Recognition accuracy of 80%.

●      2007–2009 — Use of LSTM (long short-term memory) and deep feedforward networks.

●      2010 — Systems no longer require speaker-specific training. Virtual assistants become established.

●      2017 — Microsoft develops technology that transcribes conversational telephone speech with a 5.1% word error rate, versus 5.9% for human transcribers.


History of chatbots and virtual assistants (dialog systems)

●      1995 — ALICE (provides natural dialogue on 40,000 topics)

●      2006 — Watson (IBM)

●      2011 — Siri (Apple)

●      2014 — Alexa (Amazon) and Cortana (Microsoft)

●      2016 — Google Assistant and Tay (a Microsoft chatbot, which in 16 hours turned from a harmless teenager into a bot spouting offensive messages)

●      2017 — Alice (Yandex)

History of computer vision development

●      1955 — The theoretical idea is first explored in the article "Eyes and ears for the computer" by Oliver Selfridge of the Massachusetts Institute of Technology.

●      1958 — The perceptron, Frank Rosenblatt's computational model of perception.

●      Late 1980s — The development of robots that assess the world around them.

●      Late 1990s — Computer analysis of movement.

●      2001 — Viola–Jones algorithm (face detector).

●      2003 — First commercial facial recognition systems.

●      Circa 2010 — Large-scale object classification competitions (e.g., ImageNet).

Artificial intelligence, machine learning, deep learning

Artificial intelligence means that a computer simulates human behavior in one way or another.

Machine learning is a branch of AI comprising methods that allow computers to draw conclusions from data.

Deep learning is a subset of machine learning based on multi-layer neural networks.
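To make the distinction concrete, here is a minimal sketch of machine learning in the "draw conclusions from data" sense: fitting a line to points by ordinary least squares, so that the program's coefficients are learned from examples rather than hard-coded. The data points are invented for illustration:

```python
# Fit y = a*x + b to data by ordinary least squares: the slope and intercept
# are "learned" from the examples rather than written by the programmer.
def fit_line(xs, ys):
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    a = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
         / sum((x - mean_x) ** 2 for x in xs))
    b = mean_y - a * mean_x
    return a, b

# Toy data generated from y = 2x + 1 (invented for illustration).
xs = [0, 1, 2, 3, 4]
ys = [1, 3, 5, 7, 9]
a, b = fit_line(xs, ys)
print(a, b)  # → 2.0 1.0, recovering the slope and intercept from the data
```

Deep learning follows the same principle, but replaces the single line with millions of parameters arranged in layered neural networks.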

Artificial intelligence winter

An AI winter refers to a period of reduced funding and declining interest in AI research. The term is analogous to the idea of a "nuclear winter". The field of AI has gone through several “hype” cycles, where initial interest and investment in the technology are followed by disappointment, criticism, and subsequent cuts in funding. Notable AI winters include:

●      1974 — Cancellation of a $3 million per year grant for speech recognition systems.

●      1987 — The collapse of the Lisp machine market and reduced funding for AI research.

●      1991 — Failure to meet the goals of Japan's Fifth Generation Computer Systems project, launched in 1981.

As AI has become increasingly associated with failed expectations, the term machine learning has become more commonplace.

The current status

Today, computers have become more skillful competitors than humans in board games, and AI solutions are beginning to appear in video games. Computer vision is moving from static images toward video stream analysis and augmented reality. Effective algorithms for image synthesis have emerged, and methods for detecting such synthetic images are already under development. Breakthroughs continue in the analysis and generation of text, while many business forecasting and analysis tools incorporate specific machine learning methods.



Author: V. Kurbatov