Quite simply, computer hardware comprises the physical components that a computer system requires to function. It encompasses everything with a circuit board that operates within a PC or laptop, including the motherboard, graphics card, CPU (Central Processing Unit), cooling fans, webcam, power supply, and so on.
Although hardware design differs between desktop PCs and laptops because of their difference in size, both contain the same core components. Without hardware, there would be no way to run the essential software that makes computers so useful. Software refers to the programs that run on your computer: the operating system, internet browser, word processor, and so on.
Although a computer can function only when hardware and software work together, the speed of a system largely depends on the hardware used.

When building a new computer, or simply replacing old parts, you may need to know exactly what hardware is in your machine.
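One quick way to check is to ask the operating system itself. The following is a minimal Python sketch using only the standard library; built-in tools such as Windows’ System Information or lscpu on Linux report the same details.

```python
# A minimal sketch (Python 3, standard library only) that lists a few basic
# hardware details of the machine it runs on. Output varies by operating
# system, and some fields may be blank on certain platforms.
import os
import platform

def basic_hardware_summary() -> dict:
    """Collect a few hardware facts exposed by the standard library."""
    return {
        "machine": platform.machine(),      # e.g. 'x86_64' or 'arm64'
        "processor": platform.processor(),  # CPU description (may be empty on Linux)
        "logical_cpus": os.cpu_count(),     # number of logical CPU cores
        "system": platform.system(),        # 'Windows', 'Linux', 'Darwin', ...
        "release": platform.release(),      # operating system release string
    }

if __name__ == "__main__":
    for key, value in basic_hardware_summary().items():
        print(f"{key:>12}: {value}")
```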
Blaise Pascal builds the Pascal Adding Machine – the first workable calculator. This is more significant than Napier’s bones, the development of logarithmic tables or some mechanical devices, like the watch or the quadrant, because the device does the computing. (1642 – France)
Gottfried Leibniz builds a digital (mechanical) calculator in 1673 and goes on to perfect the binary number system, the foundation of digital ‘computing’. (1673–1679 – Germany)
Joseph Jacquard builds his textile loom using the concept of a punch card (stored program) to weave intricate designs into cloth. The Jacquard Loom is arguably the foundation of the programmable machine. (1801 – France)
Charles Babbage designs the Difference Engine in 1823 and later conceives the Analytical Engine, the first design for a general-purpose programmable machine. Although he never completed it, it set the foundation for all modern computers. Augusta Ada Byron, Countess of Lovelace, who worked with him, proposed using punch cards like those used in Jacquard’s loom to make the Analytical Engine programmable, and is credited with writing the first algorithm intended for such a machine. (1823–1833 – UK)
George Boole creates Boolean algebra, laying a foundation for digital logic and, later, information theory. This is where “and,” “or” and “not” come into mathematical formulas. Boolean algebra simplified logic design and made it possible for binary calculators to function as logic machines. Charles Sanders Peirce later developed the idea that Boole’s logic lends itself to electrical switching circuits. It would be 50 years before Bertrand Russell presented the idea that this logic is the foundation of all mathematics, and another 30 years until Claude Shannon incorporated the symbolic “true or false” logic into electrical switching circuits. (1854 – UK)
Thomas Edison discovers thermionic emission, the basis of the vacuum tube, which in turn becomes one of the building blocks of the entire electronics industry. When the vacuum tube is invented, in 1904, it enables amplified radio and telephone technology. (1883 – 1904 USA)
Electric computers were faster than their mechanical counterparts. The idea of using relays to realise logic circuits was not new even in the 1920s; however, it was not until the late 1930s that actual full-scale computers and calculators were developed, and large numbers of such systems were not built until the end of World War II.
Electronic computers, even at the beginning, were 1,000 times faster than their electric counterparts. The Atanasoff–Berry computer, a prototype of which was first demonstrated in 1939, is now credited as the first vacuum-tube computer. However, it was not a general-purpose computer, as it could only solve systems of linear equations, and it was also not very reliable. During World War II, special-purpose vacuum-tube digital computers such as Colossus were used to break German and Japanese ciphers. The military intelligence gathered by these systems was essential to the Allied war effort. Each Colossus used between 1,600 and 2,400 vacuum tubes. The existence of the machines was kept secret, and the public was unaware of their application until the 1970s.
Alan Turing was an amazing guy. His 1936 work on computability provided the basis for the development of automatic programming, demonstrating that a universal machine can simulate any other computing machine. Without him, the Bombe, the electromechanical machine used to break Germany’s Enigma cipher, would not have been built. And although the dream of artificial intelligence appears in much earlier thought, including Indian philosophies such as Charvaka dating back thousands of years, Turing championed the notion of AI for computers, leading to the Turing test (1950). (1936 – UK)
John Bardeen, Walter Brattain and William Shockley invent the transistor at Bell Labs. (1947 – USA)
An Wang invents magnetic core memory. Rather than commercialise it himself, he sold the patent to IBM for $400,000 to raise the funds to start his own company, and the idea was not fully practical until Jay Forrester at MIT arranged the cores into a matrix. This opened up far greater practical applications for the technology, and core memory went on to displace earlier stores such as Freddie Williams’s cathode-ray-tube memory. (1949 – USA)
Grace Hopper was a star. She pioneered the idea of using higher-level computer languages and developed the first compiler, so that we could program in words, not numbers. This work gave rise to COBOL, one of the first languages designed to run on computers from different manufacturers. (1952 – USA)
The airline industry develops the Semi-Automatic Business Research Environment (SABRE) using two connected mainframes, the start of computer networking. The project borrowed some logic from the military SAGE project, but it is nonetheless a foundation of networking, which really took off after Robert Metcalfe created Ethernet at Xerox. The current Internet gets its roots from ARPANET in 1969, which later adopted TCP/IP and is the ancestor of today’s Internet. (1953 – USA)
Arthur Rock is credited with coining the term “venture capitalist”, which makes the funding of technology ideas possible and launches the business model of the computer age. Without him, Robert Noyce, Gordon Moore and the rest of the “traitorous eight” could not have left Shockley Semiconductor to found Fairchild Semiconductor, which in turn spawned AMD and Intel. (1957 – USA)
IBM releases the IBM System/360, the first computer family to offer modular, compatible, general-purpose computing. This led to the expansion of computer systems and laid the foundation of the personal computer market. Some would argue that the DEC PDP-11, introduced in 1970, really led to the PC market: the PDP-11 was simply easier to program, had general-purpose registers and interrupts, and could be manufactured with semi-skilled labor. (1964 – USA)
The first mouse and the concept of a graphical user interface are demonstrated by Doug Engelbart. It wasn’t until roughly a decade later, however, that Xerox PARC developed the Alto, whose ideas were famously borrowed by Apple and Microsoft. (1964 – USA) Around the same era, Ted Nelson’s Project Xanadu (begun in 1960) introduced hypertext, a precursor to the WWW and in some ways more ambitious, with bi-directional links that Berners-Lee’s web did not adopt.
Gordon Moore and Robert Noyce, who had helped pioneer the integrated circuit at Fairchild, leave to create Intel and build integrated circuits commercially. Moore had already posited Moore’s Law in 1965, three years before the company was formed. (1968 – USA)
Intel releases the 8-bit 8008 microprocessor, soon replaced by the 8080. Together with the earlier 4-bit 4004, these first true microprocessors led to the PC revolution. (1972 – USA)
The World Wide Web is born at the CERN physics laboratory, created by Sir Tim Berners-Lee. His proposal is published in 1989, the WWW is built in 1990, and it launches publicly in 1991 (something I did not learn about until early 1994, when I was on a sales call working for BMC). (1989 – UK)
ARM Holdings, which has a great business model built on licensing its designs rather than manufacturing chips, is the company that made smartphones possible. Its processors are built on the RISC architecture, first developed at Acorn in 1985, which requires fewer transistors and so reduces cost, power consumption and heat. (1985 – UK)
The motherboard is at the center of what makes a PC work. It houses the CPU and acts as the hub that all other hardware runs through: it distributes power where it’s needed and lets all the other components communicate and coordinate with one another, making it one of the most important pieces of hardware in a computer.
When choosing a motherboard, it’s important to check which ports it supplies: how many USB ports there are and which generation they support (USB 2.0, 3.0, 3.1), as well as which display outputs it offers (HDMI, DVI, VGA) and how many of each. The sockets and slots on the motherboard also determine which other hardware will be compatible with your computer, such as the type of RAM and graphics card you can use.
Although the motherboard is just one piece of circuitry, it is home to another one of the most important pieces of hardware: the processor.
Parts of the Motherboard
The CPU (Central Processing Unit, or processor) is responsible for processing all the information from programs run by your computer. Its ‘clock speed’, the rate at which the processor carries out instructions, is measured in gigahertz (GHz), which means a processor advertising a higher GHz rating will likely perform faster than a similarly specified processor of the same brand and age.
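As a concrete illustration, the minimal sketch below, assuming a Linux system, reads the current operating frequency of each logical core from /proc/cpuinfo. Because modern CPUs scale their clocks up and down, the values reported may differ from the advertised base clock; on other platforms, the third-party psutil package offers psutil.cpu_freq() for the same purpose.

```python
# A minimal, Linux-only sketch: read the current clock speed of each logical
# core from /proc/cpuinfo and report it in GHz. The values reflect live
# frequency scaling, so they can sit above or below the advertised base clock.
def core_clock_speeds_ghz(path: str = "/proc/cpuinfo") -> list[float]:
    speeds = []
    with open(path) as f:
        for line in f:
            # Relevant lines look like: "cpu MHz        : 3400.000"
            if line.lower().startswith("cpu mhz"):
                mhz = float(line.split(":", 1)[1])
                speeds.append(mhz / 1000.0)  # convert MHz to GHz
    return speeds

if __name__ == "__main__":
    for core, ghz in enumerate(core_clock_speeds_ghz()):
        print(f"core {core}: {ghz:.2f} GHz")
```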
Central Processing Unit vs. Graphics Processing Unit
Especially important for 3D rendering, the GPU does exactly what its name suggests and processes huge batches of graphics data. Your computer’s graphics card contains at least one GPU. Unlike the basic on-board graphics capabilities that PC motherboards supply, a dedicated graphics card interfaces with the motherboard via an expansion slot and works almost exclusively on graphics rendering. This also means you can upgrade your graphics card if you want a bit more performance from your PC.

Beyond rendering, modern GPUs also take on a broad range of general computational workloads, making them in effect an extension of the central processing unit.
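As a hedged illustration of that general-purpose role, the sketch below runs the same matrix multiplication on the CPU with NumPy and on the GPU with CuPy. A CUDA-capable NVIDIA graphics card and the third-party numpy and cupy packages are assumptions of this example.

```python
# A sketch comparing the same numerical workload on the CPU (NumPy) and on the
# GPU (CuPy). It assumes an NVIDIA GPU with CUDA drivers and the third-party
# numpy and cupy packages; neither is implied by the article itself.
import numpy as np

N = 2000  # size of the square matrices; an arbitrary illustrative choice

def cpu_matmul(n: int = N) -> float:
    a = np.random.rand(n, n).astype(np.float32)
    b = np.random.rand(n, n).astype(np.float32)
    return float(np.matmul(a, b).sum())

def gpu_matmul(n: int = N) -> float:
    import cupy as cp  # only importable if CUDA and CuPy are installed
    a = cp.random.rand(n, n, dtype=cp.float32)
    b = cp.random.rand(n, n, dtype=cp.float32)
    checksum = cp.matmul(a, b).sum()
    cp.cuda.Stream.null.synchronize()  # wait for the GPU to finish its work
    return float(checksum)  # copies the single result value back to the host

if __name__ == "__main__":
    print("CPU checksum:", cpu_matmul())
    try:
        print("GPU checksum:", gpu_matmul())
    except ImportError:
        print("CuPy is not installed; skipping the GPU run.")
```

On a typical system the GPU version finishes the arithmetic far faster, although the first call pays a one-off cost for moving the data to the graphics card.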
Random Access Memory, or RAM, is hardware found in the memory slots of the motherboard. Its role is to temporarily store on-the-fly data created by programs, and to do so in a way that makes this data immediately accessible. Tasks that demand a lot of this working memory include rendering images for graphic design, editing video or photographs, and multi-tasking across several apps (for example, running a game on one screen while chatting via Discord on the other).
Double Data Rate Synchronous Dynamic Random-Access Memory module
How much RAM you require depends on the programs you’ll be running. Medium-intensity gaming generally uses around 8GB of memory when performed alongside other programs, while video editing and graphic design can use upwards of 16GB of RAM. By contrast, Ubuntu Linux needs only about 2GB to run and Puppy Linux as little as 256MB, so the memory needed depends on both the operating system and the applications running on it.
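If you are unsure how much memory a machine has, the minimal sketch below computes the installed RAM, assuming a POSIX system (such as Linux) that exposes the SC_PAGE_SIZE and SC_PHYS_PAGES values; on Windows, the third-party psutil package’s virtual_memory() call is the usual cross-platform alternative.

```python
# A minimal sketch that reports installed physical RAM, assuming a POSIX
# system (e.g. Linux) where os.sysconf exposes the page size and page count.
import os

def total_ram_gib() -> float:
    page_size = os.sysconf("SC_PAGE_SIZE")    # bytes per memory page
    page_count = os.sysconf("SC_PHYS_PAGES")  # number of physical pages installed
    return page_size * page_count / (1024 ** 3)

if __name__ == "__main__":
    print(f"Installed RAM: {total_ram_gib():.1f} GiB")
```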
The hard drive is a storage device responsible for storing permanent and temporary data. This data comes in many different forms but is essentially anything saved or installed on a computer: programs, family photos, the operating system, word-processing documents, and so on.

There are two main types of storage device: the traditional hard disk drive (HDD) and the newer solid-state drive (SSD). Hard disk drives work by writing binary data onto spinning magnetic disks called platters that rotate at high speed, while a solid-state drive stores data in flash memory chips with no moving parts.
Traditional Hard Disk Drive vs a Solid State Drive
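To see why this difference matters, the rough sketch below writes a scratch file and then times reading it back sequentially versus in random order. The file size and block size are arbitrary choices for illustration, and operating-system caching can blur the result, but on a spinning HDD the random pass is typically far slower because the read head must physically seek, while on an SSD the gap is much smaller.

```python
# A rough sketch of sequential versus random reads on whatever drive holds the
# temporary directory. The 100 MiB file size and 4 KiB block size are arbitrary
# illustrative choices; results are indicative only because of OS caching.
import os
import random
import tempfile
import time

BLOCK = 4096        # read in 4 KiB blocks
BLOCKS = 25_000     # roughly a 100 MiB scratch file

def timed_read(path: str, offsets: list[int]) -> float:
    """Read BLOCK bytes at each offset and return the elapsed time in seconds."""
    start = time.perf_counter()
    with open(path, "rb") as f:
        for off in offsets:
            f.seek(off)
            f.read(BLOCK)
    return time.perf_counter() - start

if __name__ == "__main__":
    with tempfile.NamedTemporaryFile(delete=False) as tmp:
        tmp.write(os.urandom(BLOCK) * BLOCKS)  # fill the scratch file
        path = tmp.name
    try:
        sequential = [i * BLOCK for i in range(BLOCKS)]
        shuffled = sequential[:]
        random.shuffle(shuffled)
        print(f"sequential read: {timed_read(path, sequential):.2f} s")
        print(f"random read:     {timed_read(path, shuffled):.2f} s")
    finally:
        os.unlink(path)  # remove the scratch file
```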
A power supply unit, commonly abbreviated as PSU, does more than just supply your computer with power. It is the point where power enters your system from an external power source and is then allocated by the motherboard to individual components. Not all power supplies are made equal, however, and without a PSU of the right wattage your system will fail to work.

A modern computer will generally need a PSU rated between 500W and 850W to power all of its hardware effectively, although the size of PSU required depends entirely on the power consumption of the system. Computers used for highly intensive tasks such as graphic design or gaming will require more powerful components, and thus a bigger PSU.

Without enough power, components won’t run effectively and the computer might crash or simply fail to boot at all. It’s recommended to choose a power supply that more than covers your system’s usage: not only do you guard against system failure, you also future-proof yourself against needing a new PSU when you upgrade to more powerful PC components.
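As a worked example of that advice, the sketch below adds up rough per-component power draws and applies roughly 30% of headroom. All of the wattage figures are illustrative assumptions rather than measurements, so substitute the numbers from your own components’ specifications.

```python
# A back-of-the-envelope sketch for sizing a PSU: sum rough per-component power
# draws and add headroom. Every wattage below is an illustrative guess; check
# the specifications of your actual parts before buying anything.
ESTIMATED_DRAW_WATTS = {
    "CPU": 125,
    "graphics card": 220,
    "motherboard": 50,
    "RAM (2 sticks)": 10,
    "SSD": 5,
    "fans and peripherals": 30,
}

HEADROOM = 1.3  # roughly 30% margin for load spikes and future upgrades

def recommended_psu_watts(draws: dict[str, int], headroom: float = HEADROOM) -> int:
    total = sum(draws.values())
    return int(round(total * headroom, -1))  # round to the nearest 10 W

if __name__ == "__main__":
    print(f"Estimated system draw: {sum(ESTIMATED_DRAW_WATTS.values())} W")
    print(f"Suggested PSU rating:  {recommended_psu_watts(ESTIMATED_DRAW_WATTS)} W")
```

With these example figures the estimate lands around 570W, comfortably inside the 500W to 850W range mentioned above.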
Power Supply Unit

Understanding your computer and its hardware components can prove very useful when the time comes to upgrade or replace any parts, or when building a computer. Should a problem arise with the internal workings of your computer, you will have a better understanding of the importance of each component, the need to keep each one in good working condition, and how to go about solving any issues.