ENIAC (Electronic Numerical Integrator and Computer) was the world’s first general-purpose electronic computer, meaning it could be ‘reprogrammed’ to perform more than one task. It was designed to calculate mathematical problems (primarily artillery trajectories) for the US military. It weighed 30 tons and filled a room. Reprogramming it required flipping switches and rewiring its hardware.
A few years later, in Manchester, UK, the Small-Scale Experimental Machine, more commonly known as ‘The Baby’, was developed. This became the world’s first digital stored-program computer: it held its programs in the same memory as its data and could be reprogrammed by typing new instructions into it.
‘High-level programming languages’ emerged. The earliest computers had to be programmed in ‘machine code’, the raw instructions they could run directly. High-level languages were designed for people first, which meant that programmers did not need to know the details of a computer’s hardware or electronics.
A tool called a compiler would take the high-level code a programmer wrote and turn it into machine code to run on the computer. Grace Hopper developed one of the very first compilers in 1952, and compilers remain central to programming today.
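As a rough illustration, here is a tiny program in C, a high-level language that appears later in this timeline. The programmer describes the calculation in human-readable form, and a compiler translates it into the machine instructions of whichever processor it is compiled for:

    #include <stdio.h>

    int main(void) {
        /* The programmer writes what to do in readable form... */
        int distance = 25 * 4;

        /* ...and the compiler turns it into the load, multiply and
           store instructions the processor actually executes. */
        printf("Distance: %d\n", distance);
        return 0;
    }

The same source code can be compiled for very different machines, which is exactly the convenience high-level languages introduced.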
In the UK, J. Lyons and Co built LEO I (Lyons Electronic Office), the first computer used for business. The company went on to found LEO Computers and sell its machines to other businesses.
The BASIC language (Beginners’ All-purpose Symbolic Instruction Code) was designed at Dartmouth College, USA, and for decades served as many beginners’ introduction to programming.
In 1968, Doug Engelbart and his team demonstrated a huge variety of technologies in a single presentation, including the first computer mouse, video conferencing, multi-user document editing, and windowing interfaces. The 90-minute presentation became known as ‘The Mother of All Demos’.
The US landed a man on the moon. The code for the guidance computer in the Apollo spacecraft ran to 145,000 lines, and Margaret Hamilton led the development of its software.
Intel released the first commercial microprocessor, the 4004: a single chip containing all the functionality of a computer’s central processor. This made computing power far smaller and cheaper, and it defined the decades that followed.
The UNIX operating system (originally UNICS, for ‘Uniplexed Information and Computing Service’) and the C programming language emerged from Bell Labs in the US. Forms of Unix are still in use today, for example in many of the servers on the internet and on every Apple computer.
Microsoft released its first product, an implementation of the BASIC language for the Altair 8800.
Pong, released in 1972, was the first commercially successful arcade video game and a smash hit in the US.
The WIMP (Windows, Icons, Menus, Pointer) style of graphical user interface reached mainstream home computers. Pioneered at Xerox PARC in the 1970s, it was popularized by Apple’s Lisa in 1983 and Macintosh in 1984; Microsoft launched its first version of Windows in 1985.
The microprocessor era brought an explosion of affordable home computers, such as the British ZX Spectrum and the American Commodore 64. With a built-in version of the BASIC language, they gave many people their first exposure to programming, and small businesses sprang up producing software for them.
The CD-ROM, a compact disc used to store data, was invented.
At CERN in Switzerland, Tim Berners-Lee wrote the first design proposal for the World Wide Web.
The beginning of the internet era.
The first widely used web browsers were developed, including NCSA Mosaic and Netscape Navigator. The first Internet Service Providers (ISPs) emerged, and the web as we know it began to grow.
The web led to related technologies emerging. Netscape developed the JavaScript programming language in 1995, and the JPEG compressed graphics format, now common online and in digital cameras, was invented.
The first smartphones emerged. These mobile telephones behaved more like small computers and made heavy use of internet and data services. They were enabled in part by increasingly fast mobile networks: the first ‘third generation’ (3G) networks also emerged in the early 2000s. In 2007, Apple launched its first smartphone, the iPhone, and from 2008 it allowed anyone to release software for it (known as ‘apps’).
‘Web 2.0’ emerged as a term to describe websites that encouraged user participation; effectively, this was early social media. Such sites included Wikipedia, which launched in 2001, YouTube in 2005, and Twitter in 2006.