It’s difficult to imagine our world today without computers!
They have been around since World War Two, but they were clunky, massively expensive things that had all the calculating power of a brick.
When Steve Wozniak and Steve Jobs introduced the original Apple computer in 1976, however, it changed everything and the rest is, as they say, history. Today computers are everywhere and we have become so dependent upon them that many people cannot function in their daily lives without one!
For some, they even provide the very means of maintaining a livelihood. We use them to keep track of our finances, write books, design logos and sell real estate. Plus, they are rapidly replacing the radio and television in their ability to entertain us with music, movies, and games.
It makes it hard to understand how our ancestors did so well without them, doesn’t it?
Mobile phones and tablets are getting so advanced that the line between laptop and mobile device blurs more every day, with some phones being more versatile and powerful than desktop computers. Even the latest iPad Pros perform faster than their laptop equivalents. Don’t believe me?
How many laptops are running octa-core processors with a dedicated graphics chip and 64-bit CPUs? How many laptops have Wi-Fi, Bluetooth, 5G and batteries that can last for days at a time?
Microelectronics, surface-mount components (SMCs) and printed circuit boards (PCBs) have all contributed to the development of computer technology.
The first mechanical computer was invented by Charles Babbage in 1822, but it bears no resemblance to a modern computer.
The Difference Engine was a huge machine made of brass cogs and levers. It was never completed in Babbage’s lifetime, but his son Henry Babbage finished part of it in 1910, and that portion could perform basic calculations. It established the principle upon which modern-day computers were developed.
The first programmable computer was invented by Konrad Zuse, who began building the Z1 in 1936. It is considered to be the first functional programmable computer.
Alan Turing invented the Turing Machine, also in 1936. Although it was a hypothetical device, it became the foundation for theories about computers and computing.
Alan Turing is considered "the father of the modern computer".
The first electronic programmable computer was Colossus in 1943, a device which helped to break German codes during the Second World War.
Development continued through the 1940s until the first commercial computer, Zuse’s Z4, was delivered in 1950. This was still a large and complex device which could only be operated by scientists and mathematicians.
Next came the birth of the digital computer. Just as in radio, transistors replaced valves and mechanical parts, allowing computers to become faster and smaller.
The first transistor-based computers were demonstrated in the mid-1950s. IBM would go on to be the largest manufacturer of computers through the 1970s, until Apple and Microsoft appeared.
Hewlett-Packard designed one of the first desktop personal computers in 1968, and Intel created the first microprocessor, the Intel 4004, three years later in 1971.
The microprocessor age was born, and throughout the 1970s huge companies as well as many start-ups and hobbyist outlets began to develop computers for personal use.
Apple created the Apple I in 1976 and became a real competitor to companies like IBM. The rest is history…
The 1980s saw a huge interest in computers, and videogames were the main attraction of home computers unless you were trying to do complex accounts or learn to program.
The 1980s will always be associated with the birth of videogames, home computing and electronic music, all made possible by the microchip, which has developed at a constant pace ever since.
It would be 1993 before the first multimedia computer was created; computers could now play music, show movies and more.
The ability to run more than spreadsheets or simple games was amazing at the time, even though it is something we take for granted, even on our mobile phones, today!
The introduction of the internet was a major factor in advancing computer technology, followed by rapid development in storage, memory, processor speed and screen size. To this can be added improved battery life, lower power consumption and better reliability.
Apple are one company who saw technology as just one part of the experience of using a computer.
While their competitors continued to produce drab beige boxes that looked very similar, Apple sought to redesign computers to be desirable and easy to use.
This is a fundamental part of their marketing strategy and it has clearly worked well for them: they are now one of the most successful companies, and most familiar brands, in the world. They have become market leaders not only in computers but also in tablets, music players, smartwatches and phones!
The diagrams on the next two slides show how they have developed their range from the humble Apple I (which didn’t even come with a case or keyboard) to the iMacs, iPads and iPhones of today.
When looking at new technology, we are all too often obsessed with the marketable features such as storage, CPU speed and screen size, but the one thing we often take for granted is the battery. We just presume that any device will be battery powered and easily recharged, lasting us days at a time.
The technology behind the ‘battery’ goes back as far as 1748 when Benjamin Franklin used the term to describe an array of charged glass plates.
Subsequent batteries relied on chemical reactions to generate electricity, typically a metal electrode in an acid.
Over the years, batteries as we know them have evolved massively. Dry-cell batteries such as the zinc–carbon cell removed the need for free liquid chemicals.
NiCad (nickel–cadmium) was an early rechargeable cell, and one of the first to use an alkaline electrolyte.
NiMH (Nickel Metal Hydride) had a longer lifespan and became popular for use in electronic items.
Lithium-ion batteries were developed through the 1980s and, in 1991, the first commercial lithium-ion battery was released by Sony.
In 1997, the lithium-ion polymer battery was released. These batteries hold their electrolyte in a solid polymer composite instead of a liquid solvent, and the electrodes and separators are laminated to each other. This allows the battery to be encased in a flexible wrapping instead of a rigid metal casing, which means such batteries can be shaped to fit a particular device. They also have a higher energy density than standard lithium-ion batteries. These advantages have made them the battery of choice for portable electronics such as mobile phones and tablets, as they allow for more flexible and compact designs.
Rechargeable batteries such as those above have existed since the mid-1800s, but a modern battery is very advanced, capable of hundreds or thousands of charge cycles before it begins to deteriorate. It also provides an incredible amount of portable power for its size.
The Apple iPad Air, released in November 2013, had a lithium-polymer battery capable of running the iPad for up to 10 hours on a single charge. Even with this battery the iPad was only 7.5 mm thick, yet had a high-resolution screen and a great deal of processing power, alongside Wi-Fi, Bluetooth, stereo speakers, a 64-bit dual-core CPU and a dedicated graphics chip.
It was probably the pinnacle of portable battery technology in a commercial product at the time of its release, and battery life is continuously being improved. Devices like the iPad Pro of 2020 make even this technology and design look outdated!
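As a rough illustration of what such battery figures mean, a battery’s stored energy can be estimated from its capacity and voltage. This is only a sketch: the 8800 mAh and 3.7 V values below are assumed, illustrative numbers, not Apple’s published specification.

```python
def watt_hours(capacity_mah: float, voltage: float) -> float:
    """Convert capacity in mAh and nominal voltage in V to watt-hours."""
    return capacity_mah / 1000 * voltage

# Hypothetical tablet battery figures, chosen purely for illustration.
capacity_wh = watt_hours(8800, 3.7)   # ~32.6 Wh of stored energy
avg_power_w = capacity_wh / 10        # spread over a 10-hour runtime

print(f"{capacity_wh:.1f} Wh, roughly {avg_power_w:.1f} W average draw")
```

Dividing the stored energy by the quoted runtime gives the average power the whole device can draw, which shows just how efficient modern tablets have to be.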
Although it seems to have been with us forever, the World Wide Web was only released to the world in the early 1990s; just 30 years ago.
The computer rendered the typewriter obsolete and made writing in long-hand a thing of the past, but it took the internet to truly turn the computer into the monster it is today.
While the aeroplane shrank our planet to the point that one could fly from New York to London in six hours, the internet made it possible to be there, virtually, in a few seconds.
It allows truth to make it into and out of repressive countries, it incites revolutions, and spreads lies at the speed of light. It also gives anyone the ability to buy and sell almost anything imaginable, find and torment old school mates, watch the latest YouTube videos, and even find their perfect life partner, all for a few pounds a month. Oh, and you can also get useful information off it if you don’t mind scrolling through 15,000 hits to find out just how long snails really live.
The Internet of Things (IoT) is the connection of a range of devices to one another over networks such as Wi-Fi and the internet. It has massive potential for improving the way that manufacturing works, as well as more mundane domestic tasks; take Alexa and Google Home as examples of how everyday operations can be voice controlled.
It is possible to visualize machines in a just-in-time (JIT) system being set up to automatically ensure there is a supply of materials and parts. Such a system could also respond to changes in operation, identify faults and facilitate maintenance.
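The idea of machines keeping their own supplies topped up can be sketched very simply. The part names, quantities and reorder level below are invented for illustration; a real IoT system would receive readings from networked sensors (for example over MQTT) rather than a hard-coded table.

```python
# Minimal sketch of an IoT-style just-in-time stock check.
REORDER_LEVEL = 50  # units at which a resupply order is raised (assumed)

def check_stock(readings: dict[str, int]) -> list[str]:
    """Return the parts whose sensed stock is below the reorder level."""
    return [part for part, qty in readings.items() if qty < REORDER_LEVEL]

# Pretend sensor readings from machines on the factory floor.
sensor_readings = {"M8 bolts": 32, "hinges": 120, "acrylic sheet": 18}

for part in check_stock(sensor_readings):
    print(f"Reorder triggered for {part}")
```

The same comparison-against-a-threshold logic could equally flag a machine fault or schedule maintenance, which is what makes the IoT approach so flexible.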
Already covered in the material units (1.1/1.2), it is worth reminding ourselves of the new materials that have also been developed in the last few decades:
Glulam: several pieces of timber glued together (in a similar way to plywood) to create strong, composite components for use in buildings.
Kevlar: a form of aromatic polyamide artificial fibre that has tremendous toughness and tensile strength.
Precious metal clay: a craft material that is made up of microscopic particles of precious metals that can be formed like clay then fired to leave a metal product.
Nanomaterials: materials engineered by the precise manipulation of particles at the atomic scale (nanotechnology); graphene is a good example.
Again, already covered in the processes sections (1.4), it is worth reminding ourselves of the new methods of manufacture that have been developed in the last few decades:
Electrohydraulic forming: the single-stage forming of complex sheet-metal parts over a single-sided former by the action of a shockwave generated by an electrical spark in a tank of water.
Direct metal laser sintering (DMLS): advanced 3D printing of metals using selective laser sintering (SLS).
Fibre injection moulding: pellets of glass- or carbon-fibre-filled polymers are injection moulded to produce parts with greater strength and stiffness.
Laser beam welding (LBW): the intense heat of a laser beam is used to join multiple pieces of metal.
Physical vapour deposition: a method of producing thin films of material or coating products with a finished surface as an alternative to electro-plating.
The growth in the use of standardized file formats such as DXF and STL to connect CAD and CAM continues to be a key factor in the increasingly influential manufacturing role of this technology.
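Part of why STL became such a common hand-off format is its simplicity: an ASCII STL file is just a plain-text list of triangles. The short sketch below builds a one-triangle STL solid in Python; the solid name and coordinates are arbitrary examples, not from any particular CAD package.

```python
# Build a minimal ASCII STL "file": a single triangular facet.
def triangle_stl(name: str) -> str:
    """Return an ASCII STL solid containing one triangle."""
    return (
        f"solid {name}\n"
        "  facet normal 0 0 1\n"       # facet's outward-facing direction
        "    outer loop\n"
        "      vertex 0 0 0\n"         # three corner points of the triangle
        "      vertex 10 0 0\n"
        "      vertex 0 10 0\n"
        "    endloop\n"
        "  endfacet\n"
        f"endsolid {name}\n"
    )

print(triangle_stl("demo"))
```

A real model is simply thousands of these facets, which is why almost any CAM or slicing package can read the format.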
CAD and CAM technology advances rapidly and much of this was covered in Unit 1.7, but it is important to keep up to date with technological advances in this area.
You should be familiar with advances in 3D printing and other rapid prototyping technologies, as well as integrated realization: the dynamic use of design tools such as FEA (finite element analysis) and CFD (computational fluid dynamics) during the early stages of designing parts and components.
There is also a move to greater integration of CAD/CAM with Augmented Reality (AR) and Virtual Reality (VR), but currently these are used more for immersive experiences than for serious modelling and testing.
Telecommunication is now an everyday part of our lives, but so are the increasingly virtual lives we lead. From avatars in Warcraft and Fortnite to building cities in The Sims, we are spending more and more of our time chatting in virtual rooms or updating our status on Facebook.
We text more than we chat in person and send emails rather than physical letters. Most of our entertainment is supplied via the internet; even our TV and radio, and we can Skype or FaceTime friends around the world.
It's not all bad: communication is fast and seamless, and we can stay in touch with friends and family anywhere in the world, 24 hours a day.
Unfortunately, there are those who spend far too much time in virtual or online worlds and become increasingly detached from the real world. With the release of the PlayStation 5, VR headsets and the latest Xbox consoles, the lines between reality and virtuality are becoming ever more blurred. An ‘always online’ approach to connectivity, along with social networking, file sharing and so on, means these consoles will not only become the entertainment hub of our homes but also a means of communicating with others. Handsets and phones are being integrated into these media monsters, and soon we will be connected to each other wherever we go!
One can only hope that we don’t lose the ability or desire to communicate directly with each other in person in the future.
TFT (Thin Film Transistor) used for computer screens
OLED (Organic Light Emitting Diode) similar to an LCD, but each pixel emits its own light, so blacks can be truly black without a backlight
HDTV (High Definition TV) television broadcast and displayed at a resolution higher than standard definition
IC (Integrated Circuit) also known as a ‘chip’. Thousands of transistors and other components on a single chip rather than a large circuit board.
SMC (Surface Mounted Components) tiny components soldered directly onto the surface of a circuit board rather than through holes in the board.
GUI (Graphical User Interface) the visual means by which we interact with modern technology.
MP3 a compressed digital music data format
CD (Compact Disc) the first commercial digital disc-based storage technology
Blu-ray a disc format with larger storage again than DVD, allowing for higher-definition content
DAB (Digital Audio Broadcasting) digital radio
Nanotechnology the miniaturization of technology to the nanometre scale, far too small for the human eye to perceive.
ICT (Information and Communication Technology) A term that encompasses all technology used to communicate information.
Broadband a form of data delivery with a high bandwidth allowing more data to be transferred at very fast speeds.
Internet the term given to the vast global network of connected devices. The World Wide Web (www) is the system of linked pages that runs over it; the two terms are often used interchangeably.
Intranet the term given to computer devices connected within an organization (an intranet can also be connected to the internet).
Computer the term given to a machine which uses technology to process (compute) data.