As part of our 5-part series on the evolution of cybersecurity, check out our first article below, which covers how WarGames and worms led to the creation of the first cybersecurity efforts in the 1980s.
While cybersecurity is a multibillion-dollar industry today, during the early years of the commercial computing industry, most organizations did not fully understand the need for security or even the potential scope of the problem. As is still the case, criminals saw opportunities in those vulnerabilities early on, and most security efforts were reactive.
In this series we will explore the development of cybersecurity from the 1980s up until 2023. It is essential for the cyber community to understand its history, as doing so allows us to learn from past incidents and stay responsive to emerging technologies and threats in the future.
Before the internet was everywhere
The 1980s saw both the rise of the personal computer and the emergence of the cybersecurity industry, as academics and criminals alike tested the security of the first networked computer systems.
Remember, though, that many computers were disconnected during that first decade of widespread adoption. If they were networked at all, it was internally or across a campus. Organizations that were truly interconnected through the early iteration of what would be called the Internet were universities, government organizations, and very large companies.
As a result, threats tended to be more hardware oriented. To steal data or introduce malware, a criminal had to physically access a computer or server and use a floppy disk (or, in later years, a USB drive). Most individual computers were not password protected.
In response, companies tended to implement physical security such as cameras or access card systems. If you watched television or movies back in the '80s, you often saw sequences in which characters hurriedly downloaded data to a disk after breaking into a poorly secured room, then tucked the disk into a satchel and escaped.
However, there was growing awareness of the vulnerability of networked government computers, and it was a Hollywood depiction of cybercrime that would spur early security efforts. The 1983 film WarGames depicted a lone hacker and a rogue computer program taking over nuclear missile systems. After viewing the film, then-President Ronald Reagan asked the Department of Defense to investigate whether something similar could happen in real life. As it turned out, significant vulnerabilities had already been researched within the department as far back as the 1960s. (The screenwriters of WarGames had interviewed Willis Ware, a RAND engineer who wrote one of the first papers on cybersecurity and who helped them understand the demon-dialing modem hack used in the film.)
Early hackers also breached government computers via ARPANET (as well as systems at AT&T, Los Alamos National Laboratory, and other locations). Back then, these were primarily information crimes or sabotage: criminals would access data to sell to other governments or companies, or hijack computer systems to hobble the victim or create chaos.
The first cybersecurity legislation, the Computer Fraud and Abuse Act (CFAA) of 1986, was enacted in the U.S. to make accessing a protected computer without authorization illegal and prohibit the theft or destruction of data or computer programs.
The emergence of sophisticated viruses
In addition to espionage and sabotage, the 1980s also saw the emergence of sophisticated computer viruses and worms. The first computer virus, the Creeper virus, was created in the early 1970s and spread across the ARPANET. In the 1980s, the Cascade virus and, more notoriously, the Morris Worm drew more attention to these types of breaches.
A graduate student initially created the Morris Worm to gauge the size of the Internet. However, a programming error caused it to replicate wildly, nearly bringing down the entire global network.
The creator of the Morris Worm was the first person charged under the CFAA. The Morris Worm also led to the creation of the Computer Emergency Response Team (which evolved into US-CERT) and spurred growth in the emerging field of antivirus development. The first commercial products arrived in 1987 with antivirus software for the Atari ST, along with the NOD antivirus solution, Anti4us, Flushot Plus, and the launch of McAfee in the U.S. A Cascade virus infection at one of its locations brought IBM into the antivirus market as well.
These early products scanned computers for known virus code sequences and, in some cases, modified programs so that viruses would believe the machine was already infected. Once identified, a virus could be quarantined and removed. However, these products could only be designed to counter known viruses and required disk-based updates. Early antivirus was also signature-based, producing a lot of false positives and consuming a significant amount of processing power and memory.
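To illustrate the idea behind those early signature scanners, here is a minimal modern Python sketch of signature-based detection. It is not based on any specific 1980s product; the SIGNATURES table, the byte patterns, and the function names are invented purely for illustration.

```python
# Minimal sketch of signature-based scanning, in the spirit of early antivirus
# tools. The byte sequences below are made-up placeholders, not real signatures.
from pathlib import Path

# Hypothetical signature database: detection name -> byte sequence to look for.
SIGNATURES = {
    "ExampleVirus.A": bytes.fromhex("deadbeef4f5a"),
    "ExampleVirus.B": b"\x90\x90\xeb\xfe",
}

def scan_file(path: Path) -> list[str]:
    """Return the names of any known signatures found in the file's contents."""
    data = path.read_bytes()
    return [name for name, sig in SIGNATURES.items() if sig in data]

def scan_directory(root: Path) -> dict[Path, list[str]]:
    """Scan every regular file under root and report which signatures matched."""
    results = {}
    for path in root.rglob("*"):
        if path.is_file():
            matches = scan_file(path)
            if matches:
                results[path] = matches
    return results

if __name__ == "__main__":
    for path, matches in scan_directory(Path(".")).items():
        print(f"{path}: {', '.join(matches)}")
```

Even this toy version shows the weaknesses described above: it only detects patterns it already knows about, and any benign file that happens to contain one of the byte sequences gets flagged, which is exactly the false-positive problem early signature-based products faced.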
Within a few years, virus developers would find ways to circumvent these products, leading to a significant escalation in the cybersecurity arms race. In our next article, I will trace how these developments led to the evolution of additional cybersecurity tools in the 1990s.
That’s all for part one of our series on the evolution of cybersecurity. Look out for part two coming soon!
Photo: Shaiith / Shutterstock