Hacking becomes (kind of) formalized
Members of the Tech Model Railroad Club at MIT (Massachusetts Institute of Technology) were the first to use the term “hacking” in a technological context. After World War II, these students started using the word “hack” to denote an innovative (and sometimes unorthodox) solution to a technical problem. This can be considered the first “formal” definition of the term, and it still applies today.
Flash forward to the early 1960s, when computers started to become more accessible to academic institutions. These were the days of NASA’s Apollo program, in which the Instrumentation Lab at MIT was tasked with developing the computer hardware and software that would land people on the moon, much to the surprise and dismay of established companies like IBM. In fact, the contract to develop the computer systems for Apollo was one of the first contracts awarded in the entire program. Curious club members who entered this new technological field brought their terminology with them, and ever since, “hacking” has been closely associated with computing.
But it was not until the early 1980s that hacking took off as a widely recognized phenomenon. For the first time, computers were available and affordable to the general public; almost anyone could buy a computer and experiment with hacking. And experiment they certainly did. Hacking came in various forms, from playful challenges of the mind, to the mildly annoying, to the downright criminal. This was also the era of the Cold War, so computer espionage naturally made its fair share of headlines.
Many of the world’s most dangerous hackers of recent years draw inspiration from these early pioneers.
By 1986, criminal hacking had become so widespread in the US that the Computer Fraud and Abuse Act (CFAA) was passed, one of the world’s first laws specifically targeting cybercrime.