
A.I. Over the Years

By Neel C.


Starting in the late 1930s, a new style of science fiction began to evolve. Dubbed the Golden Age of Science Fiction, this period favored speculative experimentation, exploring the 'what ifs' rather than sticking to the scientifically accurate and logical approach of its predecessor, hard science fiction. One such book was Ursula K. Le Guin’s The Lathe of Heaven, which explored the possibility that one man's dreams could alter the course of reality. But the most popular work of the time has to be Isaac Asimov’s collection of short stories I, Robot, which was such a hit that it even earned a movie adaptation starring Will Smith as its protagonist.


Published in 1950, Asimov’s stories are set in a world in which humans and robots work alongside each other. The stories explore the ups and downs of collaborating with sentient robots, and how the world adapts to these inventions. Many people were intrigued by the new idea, but English mathematician Alan Turing treated this fiction as a genuine possibility. He posed the revolutionary question: “Can machines think?” To attempt an answer, Turing suggested a criterion for machine intelligence, now known as the Turing Test. First introduced in his paper Computing Machinery and Intelligence, the test consists of a human judge who tries to tell a machine apart from a real person based on conversations with each; if the judge cannot reliably do so, the machine is said to have passed.


Then, in 1955, a practical answer to Turing's question was presented. Allen Newell, Cliff Shaw, and Herbert Simon’s Logic Theorist was a program created to mimic the problem-solving skills of a human. It is considered by many to be the first artificial intelligence (AI) program, a term coined only a year later.


From 1957 to 1974, the AI field was on a roll. Researchers were getting the hang of applying AI to their work, and computers were becoming faster, more powerful, and more accessible to the public. The government also became aware of this revolutionary technology and strove to create a language translation program similar to the modern Google Translate.


Unfortunately, the public's awe quickly turned into fear. This was exactly the effect of Arthur C. Clarke and Stanley Kubrick’s popular film 2001: A Space Odyssey. In the movie, Clarke and Kubrick suggested that by 2001, humans would have developed machines with greater intelligence than a human being's. The motion picture was followed by the 1984 blockbuster The Terminator, which depicted a near-future world overrun by killer robots. This wildly popular movie put a roadblock in the AI industry's development, as consumers were frightened by the technology's darker possibilities.


Along with other pieces of fictional media, these movies helped set off a phenomenon called the AI winter, a term coined by analogy to the concept of nuclear winter. This phase saw a rapid decrease in funding and hype for artificial intelligence and an overall lack of interest in the field, and the slump lasted for roughly six years. To put that in perspective, had it not been for that six-year scare, ChatGPT, an influential AI invention, might have been released to the public in November of 2016. Who knows what scientists now, in 2023, could have done with that head start.

