Friday, December 1, 2023

One of the founding fathers of AI

© Mark Ollig


John McCarthy, an assistant professor of mathematics at Dartmouth College, co-authored a proposal dated Aug. 31, 1955, titled “A Proposal for the Dartmouth Summer Research Project on Artificial Intelligence.”

McCarthy defined artificial intelligence (AI) as “the science and engineering of making intelligent machines.”

Inspired by McCarthy’s 17-page paper, the Dartmouth Summer Research Project on Artificial Intelligence was held at Dartmouth College in Hanover, NH, during the summer of 1956.

The conference, organized by McCarthy, included leading scientists, mathematicians, and researchers who discussed machines capable of human-like thinking.

Shortly afterward, artificial intelligence saw a surge of activity as researchers sought to develop computer programs that could replicate human intelligence.

In an article published April 16, 1957, in The State Journal newspaper of Lansing, MI, Watson Davis, then-editor of Science Service in Washington, DC, offered his predictions on what artificial intelligence might accomplish by the year 2000.

“Your grandchildren will probably enjoy artificial intelligence machines, which will do things people do now – write letters, do bookkeeping, translate languages, file and retrieve information, teach students individually, plan and operate factories, cook, serve meals, clean houses, drive automobiles, and fly airplanes,” said Davis, who made this accurate prediction 66 years and seven months ago.

In 1958, while at the Massachusetts Institute of Technology (MIT), John McCarthy created Lisp (short for “list processing”), a programming language that made it possible to apply mathematical concepts to complex programming problems.

Lisp is known for its symbolic computation capabilities and introduced tree data structures and automatic storage management for list manipulation.

This language is valuable for its ability to manipulate code as data, making it a powerful tool for developing complex algorithms.

Although it is one of the oldest programming languages, Lisp is still widely used in symbolic computation, functional programming, metaprogramming, and modern AI research.
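Lisp itself is beyond the scope of this column, but a rough Python sketch (an illustration only, not Lisp) hints at the “code as data” idea: a tiny program is stored as an ordinary nested list that another program can read and evaluate.

```python
# A minimal sketch (in Python, not Lisp) of the "code as data" idea:
# an arithmetic expression is stored as an ordinary nested list that
# a program can inspect, transform, or evaluate.
def evaluate(expr):
    """Evaluate a nested-list expression like ["+", 1, ["*", 2, 3]]."""
    if not isinstance(expr, list):
        return expr                          # a plain number
    op, left, right = expr
    a, b = evaluate(left), evaluate(right)
    return a + b if op == "+" else a * b

expr = ["+", 1, ["*", 2, 3]]                 # the "program," held as data
print(evaluate(expr))                        # prints 7
```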

In 1958, John McCarthy introduced a memory management technique known colloquially in the computing world as “garbage collection.”

First used in Lisp, this technique enables computers to reclaim unused memory automatically, simplifying programming and handling large memory demands efficiently.
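McCarthy’s collector was built for Lisp, but the same idea lives on in many modern languages. As a loose illustration (not McCarthy’s original algorithm), Python’s built-in gc module can reclaim objects that are no longer reachable, even when they still refer to each other.

```python
import gc

class Node:
    """A simple object that can point at another object."""
    def __init__(self):
        self.ref = None

# Build a reference cycle: a points at b, and b points back at a.
a, b = Node(), Node()
a.ref, b.ref = b, a

# Drop the only outside references; the cycle is now unreachable garbage.
del a, b

# Ask the collector to run.  Its return value is the number of
# unreachable objects it found on this pass.
found = gc.collect()
print(f"Garbage collector found {found} unreachable objects")
```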

McCarthy emphasized that AI systems should incorporate “common sense” to replicate human cognitive reasoning.

He believed AI needed to understand indirect and nuanced language, using formal logic to arrange rules and principles into systematic code so AI systems could reason and act more like humans.

In 1959, McCarthy proposed time-sharing, a technique that allows multiple users to share a single computer at the same time.

By 1961, his ideas had led to the first interactive time-sharing system at MIT, the Compatible Time-Sharing System (CTSS).

McCarthy also predicted cloud computing during MIT’s centennial week celebration, April 3 to 10, 1961.

“Computing may someday be organized as a public utility just as the telephone system is a public utility,” McCarthy said.

“Each subscriber needs to pay only for the capacity he actually uses, but he has access to all programming languages characteristic of a very large system. Certain subscribers might offer service to other subscribers. The computer utility could become the basis of a new and important industry,” he stated.

Today’s top three cloud computing platforms are Amazon Web Services, Microsoft Azure, and Google Cloud.

Together with other public cloud platforms, they generated $526 billion in revenue this year.

In 1990, John McCarthy published a position paper, “Artificial Intelligence, Logic and Formalizing Common Sense.”

In it, McCarthy discusses how we can teach machines to understand concepts that come naturally to humans but are difficult to explain in a way computers can process.

He explained that mathematical logic is essential to making machines intelligent.

McCarthy suggested an organized approach to building broad, encyclopedic knowledge into a computer, including objects, events, actions, and relationships.

He proposed creating computer programs that could use their acquired knowledge to reason logically, just like humans, so that artificial intelligence could simulate human thinking and decision-making.

In his view, intelligent computing devices need “formal computational logic” to comprehend complex information, connect data, and make informed decisions.
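As a loose, hypothetical illustration of that idea (not McCarthy’s own formalism), a short Python sketch shows how a program might chain simple if-then rules over stored facts to reach new conclusions.

```python
# A toy forward-chaining reasoner.  Facts are plain strings and rules
# are (premises, conclusion) pairs; the example facts are made up.
facts = {"Tweety is a bird"}
rules = [
    ({"Tweety is a bird"}, "Tweety has wings"),
    ({"Tweety has wings"}, "Tweety can fly"),
]

# Keep applying rules until no new facts can be derived.
changed = True
while changed:
    changed = False
    for premises, conclusion in rules:
        if premises <= facts and conclusion not in facts:
            facts.add(conclusion)
            changed = True

print(facts)  # now includes the derived facts about wings and flying
```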

McCarthy’s developments in formal computational logic played a significant role in advancing AI systems, leading to the emergence of sophisticated technologies like Natural Language Processing (NLP).

NLP is an essential branch of AI that uses advanced language analysis techniques to improve how machines process information and make decisions.

AI technologies like OpenAI’s ChatGPT (Chat Generative Pre-trained Transformer) use NLP to mimic how humans speak, responding to our questions and comments with “sentence fluency.”

Large language models (LLMs) are AI systems that process and generate human language using extensive data and complex algorithms.

They are built with deep-learning techniques, mainly neural networks inspired by the human brain.

LLMs have shown remarkable capabilities in NLP tasks and are used in AI chatbots like ChatGPT, Bard AI, and Bing AI Chat.

Together, NLP and LLMs allow AI to understand and carry on conversations that can leave us feeling as if we are talking with another human being.
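Real LLMs rely on neural networks trained on enormous datasets, but a toy Python sketch (a drastic simplification, with a made-up miniature corpus) hints at the basic idea of generating text by predicting the next word from the words seen so far.

```python
import random
from collections import defaultdict

# A toy next-word predictor built from a tiny, made-up corpus.
corpus = "the cat sat on the mat and the cat slept on the mat".split()

# Record which words follow each word in the corpus.
next_words = defaultdict(list)
for current, following in zip(corpus, corpus[1:]):
    next_words[current].append(following)

# Generate a short sentence by repeatedly picking a plausible next word.
word = "the"
output = [word]
for _ in range(6):
    if word not in next_words:        # no recorded successor: stop early
        break
    word = random.choice(next_words[word])
    output.append(word)

print(" ".join(output))
```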

American computer scientist John McCarthy, one of the founding fathers of AI, passed away Oct. 24, 2011, at age 84.