Did you know that before computers became household items, they could weigh tons?
That's right! These massive machines required entire rooms and cooling systems to prevent overheating. But before we get to that, let's go back in time and explore the history of computing…
The main historical milestones that led us to where we are today are described below.
The first period is known as the Age of Mechanical Devices (3000 B.C. – 1880).
The abacus - invented in China between 3000 and 2500 B.C. - is a simple calculating tool made of parallel wires with sliding beads. The position of the beads represents numbers, allowing basic arithmetic operations such as addition, subtraction, multiplication, and division.
The Pascaline - invented in 1642 by French mathematician Blaise Pascal at age 19, was a mechanical calculator for addition and subtraction. It used toothed wheels and gears, where each gear had 10 teeth. A full rotation moved the next gear, allowing calculations from units to tens, hundreds, and thousands.
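To make the carry idea concrete, here is a minimal sketch in Python (purely illustrative; the function and variable names are my own, not part of any historical description) of how a chain of ten-toothed wheels can add a digit and pass a carry to the next wheel:

```python
# Illustrative sketch of the Pascaline's carry idea: each "wheel" holds one
# decimal digit (0-9); when a wheel completes a full turn, it advances the
# next wheel by one tooth. Names and structure are hypothetical.

def add_to_wheels(wheels, amount, position=0):
    """Add `amount` to the wheel at `position`, carrying into higher wheels.

    wheels[0] is the units wheel, wheels[1] the tens wheel, and so on.
    """
    carry = amount
    while carry and position < len(wheels):
        total = wheels[position] + carry
        wheels[position] = total % 10   # the digit this wheel now shows
        carry = total // 10             # a full turn moves the next wheel
        position += 1
    return wheels

wheels = [7, 9, 0, 0]            # represents the number 97
print(add_to_wheels(wheels, 5))  # [2, 0, 1, 0] -> 102, the carry ripples upward
```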
Leibniz's calculating machine - created in 1671 by German mathematician Gottfried Leibniz, was an improved version of the Pascaline. It could perform all four arithmetic operations—addition, subtraction, multiplication, and division—using successive calculations.
Babbage's Difference Engine (1823) - was an automatic (though not general-purpose) calculating machine designed to compute tables by the method of finite differences, using only repeated addition and subtraction, which made it possible to produce logarithmic and trigonometric tables.
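The method of finite differences lets a machine tabulate a polynomial using additions alone, because the n-th difference of a degree-n polynomial is constant. As a rough modern illustration (a Python sketch, not Babbage's actual mechanism), here is how successive difference columns generate values of f(x) = x² + x + 1 without any multiplication:

```python
# Sketch of the method of finite differences: a degree-n polynomial can be
# tabulated with additions alone, since its n-th difference is constant.

def tabulate(initial_diffs, count):
    """Generate `count` values of a polynomial from its initial differences.

    initial_diffs = [f(0), first difference, second difference, ...];
    only additions are used, which is what the engine mechanized.
    """
    diffs = list(initial_diffs)
    values = []
    for _ in range(count):
        values.append(diffs[0])
        # fold each difference column into the one above it
        for i in range(len(diffs) - 1):
            diffs[i] += diffs[i + 1]
    return values

# f(x) = x^2 + x + 1:  f(0) = 1, first difference 2, second difference 2
print(tabulate([1, 2, 2], 6))   # [1, 3, 7, 13, 21, 31]
```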
Babbage's Analytical Engine (1834) - introduced a general-purpose, programmable design capable of a wide range of calculations. Though never completed, it was a precursor to modern computers. Ada Lovelace wrote programs for it and is considered the first programmer.
This marks the end of the Mechanical Devices period.
It is worth remembering that this Extension Activity would not be possible without the Unigran University book Software Engineering - 1st Semester and all its authors and their research. If anyone visiting this site wants to leave additional comments, it will be a pleasure to share them here.
As the world evolved, so did computers. With the invention of electric motors, many adding machines were developed using this new technology.
Hollerith's Tabulating Machine (1889) - was an electromechanical device that counted, classified, and organized data stored on punched cards, with each card holding information as a pattern of holes. The machine won a contest in 1888 to process the 1890 U.S. census and later became widely used worldwide.
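To show what "counting and classifying punched cards" means in modern terms, here is a small, purely hypothetical Python sketch that tallies census-style records by category, much as Hollerith's tabulator counted the cards falling into each compartment (the field names and data are invented for illustration only):

```python
# Hypothetical sketch of tabulation: each "card" is one record, and the goal
# is simply to count how many cards fall into each category.
from collections import Counter

cards = [
    {"state": "NY", "occupation": "farmer"},
    {"state": "NY", "occupation": "clerk"},
    {"state": "PA", "occupation": "farmer"},
    {"state": "PA", "occupation": "miner"},
]

by_state = Counter(card["state"] for card in cards)
by_occupation = Counter(card["occupation"] for card in cards)

print(by_state)        # Counter({'NY': 2, 'PA': 2})
print(by_occupation)   # Counter({'farmer': 2, 'clerk': 1, 'miner': 1})
```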
With continuous evolution, new electronic components soon became necessary for improved performance and reliability.
The Age of Electronic Devices - ENIAC Construction Period
John von Neumann and His Contributions
John von Neumann (1903–1957) developed the von Neumann architecture, where memory stores both data and instructions, enabling sequential execution.
ENIAC belonged to the first generation of computers, which used vacuum tubes that frequently burned out after only a few hours of use. These massive machines occupied entire rooms, marking the beginning of electronic computing.
ENIAC - built between 1943 and 1946, used about 18,000 vacuum tubes and 1,500 relays. It weighed around 30 tons and consumed roughly 150 kW of power, requiring a large room and a cooling system to manage the heat.
John von Neumann contributed to the EDVAC, improving stored-program computing. His model, with a control unit, memory, input/output, and arithmetic logic unit (ALU), became the foundation of modern computers.
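The defining idea of the von Neumann model is that instructions and data share the same memory and are executed one after another by a fetch-decode-execute cycle. Below is a minimal Python sketch of that cycle; the tiny instruction set (LOAD/ADD/STORE/HALT) is invented for illustration and is not EDVAC's actual instruction set:

```python
# Minimal stored-program machine: program and data share one memory list,
# and a control loop fetches and executes instructions sequentially.
# The instruction set here is hypothetical, chosen only to show the cycle.

memory = [
    ("LOAD", 5),     # 0: acc = memory[5]
    ("ADD", 6),      # 1: acc += memory[6]
    ("STORE", 7),    # 2: memory[7] = acc
    ("HALT", None),  # 3: stop
    None,            # 4: unused
    40,              # 5: data
    2,               # 6: data
    0,               # 7: result goes here
]

pc, acc = 0, 0                 # program counter and accumulator (the ALU's state)
while True:
    op, addr = memory[pc]      # fetch and decode the next instruction
    pc += 1                    # sequential execution
    if op == "LOAD":
        acc = memory[addr]
    elif op == "ADD":
        acc += memory[addr]
    elif op == "STORE":
        memory[addr] = acc
    elif op == "HALT":
        break

print(memory[7])  # 42
```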
Gordon Moore and His Contributions
Gordon Moore (1929–2023) was a co-founder of Intel and the creator of Moore's Law, which predicted that the number of transistors on a chip would double approximately every two years, increasing computing power. His insights drove advancements in semiconductor technology, shaping the modern computing era.
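As a quick back-of-the-envelope sketch of what doubling every two years implies (illustrative only; the starting point of roughly 2,300 transistors for the Intel 4004 in 1971 is used as a convenient round figure):

```python
# Moore's Law as a rough formula: transistor count doubles about every 2 years,
# so count(year) is approximately count(start) * 2 ** ((year - start) / 2).

def transistors(start_count, start_year, year, doubling_period=2):
    return start_count * 2 ** ((year - start_year) / doubling_period)

for year in (1971, 1981, 1991, 2001, 2011, 2021):
    print(year, f"{transistors(2300, 1971, year):,.0f}")
```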
In 1971, Intel introduced the Intel 4004, the first commercial microprocessor. This small chip allowed computers to become more powerful and compact. The project was led by Federico Faggin, an important figure in microprocessor development.
Starting in 1969 and through the early 1970s, Ken Thompson and Dennis Ritchie at Bell Labs created UNIX, a powerful and flexible operating system. UNIX's design became the foundation for modern systems like Linux and macOS.
In 1975, Bill Gates and Paul Allen founded Microsoft, developing software for personal computers. A year later, in 1976, Steve Jobs, Steve Wozniak, and Ronald Wayne founded Apple, introducing user-friendly personal computers.
The Internet started in 1969 as a military project called ARPANET. In 1990, Tim Berners-Lee created the World Wide Web (WWW), allowing people to browse and create websites.
In 1981, IBM launched the IBM PC, making computers more accessible. It used the MS-DOS operating system, developed by Microsoft, helping PCs become widely used.
In 1991, Linus Torvalds created Linux, an open-source operating system. Today, Linux runs on servers, supercomputers, and even smartphones (Android).
Recent advancements in Artificial Intelligence (AI) have transformed technology. AI systems like AlphaGo and ChatGPT use deep learning to perform complex tasks, shaping the future of computing.
The evolution of computers and software has been remarkable. From the mechanical designs of Charles Babbage’s Analytical Engine to the massive ENIAC and the birth of personal computers, each step brought new possibilities and challenges.
Programming languages evolved alongside hardware, starting with assembly code and moving to high-level languages like C, Java, and Python. The rise of the internet, open-source development, and artificial intelligence has made software more powerful and accessible than ever before.
Today, computers are faster, smaller, and smarter — but the core principles of programming and problem-solving remain the same. The journey from the first algorithm to modern software shows how human creativity and innovation continue to shape the future of technology.