Image via Warner Bros. Pictures

Hollywood’s romanticization of artificial intelligence

By Peter Lagosky
@Quantumrun
Aug 08, 2014,  6:52 PM

Cultural depictions of automated living are nothing new to the average North American media consumer. As early as the 1960s, shows such as The Jetsons whimsically foretold the coming millennium and its technological renaissance of floating cars, teleportation devices, and friendly robots that would tend the children, cook dinner, or clean the house in less time than it took to worry about it. While the millennium as portrayed in The Jetsons was a far-fetched utopia of man and machine coming together to rid the world of human error and inefficiency, it still reflected the wishful thinking popular among those creating film and television in the era.

As the year 2000 drew nearer, more and more consumer attention was given not only to the growth and evolution of technology, but also to the possible shortcomings of too much digitization, as well as what could happen if the machines overpowered us and took charge.

Plenty of Hollywood blockbusters have focused on the development, implementation, and often disastrous outcomes of artificial intelligence. Once the 1980s rolled around, Hollywood developed a sort of obsession with the future, and the film industry’s collective ability to accurately depict and assuage fears of an AI meltdown was met with varying levels of success. Before we look at some films that have shaped our perception of artificial intelligence, we need to travel back in time to when film-making and futurism merged to create a burgeoning business. We need to turn the clock back to 1982.

Our introduction to the future at home


In 1982, the Commodore 64 was released, revolutionizing home computing. For the first time, a personal computer reached a truly broad market, introducing new ways of accomplishing simple tasks and processing information and bringing computer science and programming into the home. Soon enough, one of the first computer viruses to spread in the wild, Elk Cloner, was discovered rampantly infecting Apple II computers through floppy disks.

Long before the introduction of the Internet, fears of information insecurity and machine rebellion shook the computer industry: before manufacturers knew it, their own end users were finding new and inventive ways to program and reprogram the machines to perform malicious tasks. Trust in machines was virtually non-existent, and it remains a foreign idea to many: why put any trust in a platform that, while using its own technology to help you, can just as easily compromise you?

The idea seemed ludicrous until later in 1982, when The Walt Disney Company, whose entertainment conglomerate already licensed a small collection of Disney video games playable on the Commodore 64, opened EPCOT (Experimental Prototype Community of Tomorrow) at Walt Disney World and changed perceptions of the future from a cold, sterile abstraction created by nerds into something accessible, fascinating, and worth getting excited about. Best of all, it made tons of money, and it arrived just as personal computing was finding its feet. One of EPCOT's most notable attractions is Future World, which features sections with names such as Spaceship Earth, Innoventions, and Wonders of Life. Computers were given new hope as life-preserving, joy-bringing, space-exploring wonder machines that, if we trusted them enough, could bring us great efficiency and innovation.

All of a sudden, the future was friendly, and with the continued development of both personal computing and EPCOT, technological innovation and imagination were at an all-time high. It seemed only natural to release movies that reflected this energy and played to a populace still naive about technology. It all started in 1984, just as personal computing took another humongous leap with Apple's release of the first Macintosh personal computer.

Apple's claim that 1984 wouldn't be like 1984 implied the abolition of any fears of technological uprising, surveillance, and control: for once, a machine made by the people, for the people, had been released. No longer was the computer a cold metal-and-plastic box with arcane codes and a bible of commands to memorize before doing anything meaningful: it became personal.

Are you Sarah Connor?


With this growing trend toward the personalization of technology, coupled with the programming scene's growing ability to make machines carry out tasks unimaginable a few years earlier, Hollywood had the perfect cultural framework for motion pictures that played on the fears, assumptions, and controversies surrounding artificial intelligence. The first major blip on the radar came later in 1984, when a then-unknown director on the fringes of the sci-fi scene named James Cameron released The Terminator.

Set in 1984, Cameron's film shows us the dichotomy between human and machine: a sinister robot from the year 2029 is determined to kill a woman named Sarah Connor, while a human soldier, Kyle Reese, has traveled back in time to save her and eliminate the Terminator. The Terminator is an agent of Skynet, an AI-powered defense network built to replace the military and homeland security systems of post-millennium America. All hell breaks loose when Skynet becomes self-aware and begins a purge of humankind, which eventually prompts Sarah Connor's yet-unborn son, John, to rally the survivors and fight off the machines. Running out of ideas and time, Skynet sends a cyborg back in time to eliminate Sarah before John is even born, creating the premise for the rest of the film. Kyle's mission is complicated by his attraction to Sarah, and their romance pushes the very serious issue of an angry death machine on the loose to the back of the viewer's mind.

By combining the ominous inevitability of technological uprising with the limitations of the human heart, Cameron broaches the subject of automation and human futility without fully exploring it or claiming too much, resulting in a box-office smash hit and a newfound curiosity about what robots are truly capable of. With the release of The Terminator, the masses glimpsed an entirely new paradigm of futurism, and they responded by demanding more of the same.

The uncanny valley


Fast-forward to Steven Spielberg's A.I. Artificial Intelligence, a movie Stanley Kubrick began developing as early as the 1970s but which was not completed and released until 2001, after Kubrick's death. What we see in A.I. is a total blurring of the lines between man and machine with the creation of Mecha, humanoid robots capable of receiving and giving love. Unlike The Terminator, which is set in an otherwise normal world, A.I. takes place in the late 21st century during a time of climate change and drastic population decline.

Cybertronics, the corporation that creates Mecha, has produced a child version of its humanoid robots and, as a prototype, gives the child, David, to two of its employees, Monica and Henry, whose real son, Martin, is in suspended animation with a rare disease. David, along with his artificially intelligent teddy bear, Teddy, fits in with the family swimmingly until Martin's disease is cured and a sibling rivalry ensues. It all comes to a head at a pool party, when an innocent poke in the ribs sets off David's self-protection mechanism and he drags Martin into the pool, nearly drowning him and prompting the family to return David to Cybertronics to be destroyed, fearing that he is as capable of harm as he is of love.

The human-machine bond proves too strong, however, and Monica instead abandons David in a forest, where he is eventually captured by the organizers of an anti-Mecha rally who destroy robots in front of raucous crowds. David once again escapes, and the rest of the movie follows his quest to find the Blue Fairy from Pinocchio so she can change him into a real boy. While A.I. is far less polemical than The Terminator in its approach to the mechanization of humanity, it nonetheless shows us the other end of the spectrum, where artificially intelligent beings are capable of replacing us not only in the workplace but at home as well.

We fall in love with David because he is a sweet little boy who just so happens to also be a robot, something that is never a point of contention in the film. Unlike the technologically deprived 1980s, when The Terminator stirred fear in its viewers, A.I. was developed over the course of almost three decades, giving both Kubrick and Spielberg a more vivid idea of exactly what technology could be capable of. Both films add elements of humanity to technology and build dramatic storylines around humanoids and real human beings, but in retrospect from 2014, both were overambitious in their attempts to bridge the gap between man and machine. In fact, both trivialize an idea they don't fully understand, to the point of fallacy and near-mockery.

Impact

Whether or not artificial intelligence will ever play as large a role in our lives as these films depict is still anybody’s guess. We can continue to expect Hollywood to make it into whatever sells. So grab a bag of popcorn and sit back – that is, unless all hell does break loose.
