Course Overview: Living in the Information Technology (IT) Era
The "Living in the IT Era" course explores the science, culture, and ethics of information technology, its various uses and applications, and its influence on culture and society. This course not only explains the basic concepts and key terms in IT but also highlights the major IT trends and the issues and challenges these developments bring.
Click each topic below to see its learning objectives and video lesson.
At the end of the lesson, students will be able to do the following:
Identify the major technological development that characterizes each generation of computers;
List down the advantages and disadvantages of each generation; and
Reflect on what makes computers perform the way they do today.
Video Lesson: https://youtu.be/fjQ0AomS6Lg
PPT: https://docs.google.com/presentation/d/18deU_Y_mYWK3Goh23bHDXW6hSkYi2-MoLME-U-nGLY8/edit?usp=sharing
Evolution of Computers: From Primitive Counting to Advanced Technology
Before the first generation of computers, humans used fingers, ropes, beads, bones, pebbles, and other objects for counting. During this time, electricity had not yet been harnessed.
First Generation of Computers
The first generation of computers used vacuum tubes, which are sealed glass tubes containing a near vacuum that allows the free passage of electric current. These tubes were about the size of a light bulb. The first computers used vacuum tubes for circuitry and magnetic drums for memory. They were often enormous, taking up entire rooms, and relied on machine language. Examples include UNIVAC and ENIAC. These computers were very expensive to operate, consumed a lot of electricity, and generated significant heat, often causing malfunctions. Vacuum tubes were prone to frequent burnouts. Input was based on punched cards and paper tape, and output was displayed on printouts.
Advantages:
Pioneered electronic circuits for computation.
First electronic devices with memory.
Disadvantages:
Bulky and large in size.
Frequent vacuum tube burnouts.
Produced a significant amount of heat.
Second Generation of Computers
In the second generation, transistors replaced vacuum tubes, making computers smaller, faster, cheaper, more energy-efficient, and more reliable. Although they still relied on punched cards for input and printouts for output, second-generation computers moved from binary machine language to symbolic or assembly languages, allowing programmers to specify instructions in words. High-level languages such as COBOL and Fortran were developed during this time.
Transistors, invented in 1947, came into widespread use in computers around 1959 and characterized the second generation. These three-legged components significantly reduced the size of computers, with a transistor occupying only about one-hundredth of the space of a vacuum tube. Transistors were more reliable, faster, required no warm-up time, and consumed far less electricity.
Advantages:
Reduced size.
Faster and more reliable than first-generation computers.
Disadvantages:
Overheated quickly.
Maintenance problems.
Third Generation of Computers
The third generation of computers began around 1965 with the adoption of integrated circuits (ICs), square silicon chips containing circuitry that can perform the functions of hundreds of transistors. Miniaturized transistors were placed on silicon chips, called semiconductors, dramatically increasing the speed and efficiency of computers. Users interacted with these machines through keyboards and monitors, and an operating system allowed a single device to run many different applications at once, with a central program monitoring memory. Because they were smaller and cheaper, computers became accessible to a mass audience.
Advantages:
Very small size.
Improved performance.
Durable silicon chips.
Cheaper and more energy-efficient.
Disadvantages:
Required sophisticated technology to design and manufacture the chips.
Required frequent maintenance.
Fourth Generation of Computers
The fourth generation of computers emerged with the development of the microprocessor, a silicon chip containing the CPU (Central Processing Unit), where all processing takes place. The Intel 4004 chip, developed in 1971, placed all of the computer's core components, from the central processing unit and memory to input/output controls, on a single chip. Fourth-generation computers also saw the development of GUIs (graphical user interfaces), the mouse, and handheld devices. What once filled an entire room could now fit in the palm of a hand. In 1981, IBM introduced its first computer for home users, and in 1984, Apple introduced the Macintosh. As these small computers became more powerful, they could be linked together to form networks, eventually leading to the development of the internet.
Fifth Generation of Computers
Fifth-generation computing devices are based on artificial intelligence and are still in development. Applications such as voice recognition and the use of parallel processing and superconductors are helping to make artificial intelligence a reality. Future advancements in quantum computation, molecular computing, and nanotechnology are expected to radically change the face of computers. The goal of fifth-generation computing is to develop devices that respond to natural language input and are capable of learning and self-organization.
Today, computers are faster and more powerful, with tremendous data storage and processing capacity. New brands and models are frequently released, often more powerful and cheaper than their predecessors. Computers have become more affordable and are now commonly found in homes, schools, and offices. Software technology has also seen tremendous improvements, with a variety of applications available for word processing, spreadsheets, database management, games, and entertainment. Computer education is now offered not just to college students but also to elementary and high school students.
At the end of the lesson, students will be able to do the following:
Differentiate data and information;
Define information technology;
Provide a scenario in which information technology is present; and
Determine the composition of information technologies.
Video Lesson: https://youtu.be/EyVKFNxjLH0
PPT: https://docs.google.com/presentation/d/1ApYjSb3tVc9NhOk13jCPKkKCJS89JBTIwcb96q5wUdw/edit?usp=sharing
The IT Fundamentals: Exploring the Relationship Between Data and Information
To better understand Information Technology, it's important to differentiate between data and information. Data refers to raw, unorganized facts that need to be processed. It can be something simple and seemingly random and useless until it is organized. When data is processed, organized, structured, or presented in a given context to make it useful, it is called information. Information is any knowledge that comes to our attention. The methods for conveying information could include voice, image, text, or video.
According to computerhope.com, information informs you of something and answers a specific question, representing a specific truth or fact. Data is the collection of recorded values from which information can be derived. For example, consider the question “What is your height?” Data provides the basis for an answer to that question. If the data is "six" and "feet," the answer is "my height is six feet." To process the data into information, you must understand what "height" and "feet" are.
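To make the distinction concrete, here is a minimal Python sketch (the question, values, and helper function are illustrative assumptions) showing raw data becoming information once context is attached:

```python
# Raw data: recorded values with no context attached.
raw_data = ["six", "feet"]

def to_information(question, value, unit):
    """Attach context (a question and a unit) to raw values to produce information."""
    return f"In answer to '{question}': my height is {value} {unit}."

information = to_information("What is your height?", raw_data[0], raw_data[1])
print(information)  # In answer to 'What is your height?': my height is six feet.
```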
What is Information Technology?
Information technologies are systems of hardware and software that capture, process, exchange, store, and present information using electrical, magnetic, and/or electromagnetic energy.
Example Scenarios in Information Technology
Scenario 1: Phone Call
Capture: A cell phone captures the sound of a human voice and converts it into electrical signals.
Process: Network equipment determines where to route the call.
Exchange: A network routes the call from origination to destination.
Store: A voicemail system stores information for later use.
Present: A cell phone translates information from electrical signals to sound waves that the recipient can understand.
Scenario 2: Web Design and Browsing
Capture: A web designer captures multimedia information in HTML format.
Process: A web server processes information like reservations or transactions.
Exchange: Information is exchanged from the web server over the Internet to a wireless access point and to a Wi-Fi-enabled laptop.
Store: The web server stores information content.
Present: Information is conveyed to a user via a web browser on a laptop.
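The five functions that define information technology (capture, process, exchange, store, and present) can also be traced in code. The Python sketch below is purely illustrative of the phone-call scenario above; the function names and the in-memory "voicemail" list are assumptions, not part of any real telephone system.

```python
# Illustrative sketch of the five IT functions from the phone-call scenario.
voicemail_store = []  # stands in for a voicemail system

def capture(sound):
    """Capture: convert sound into a signal (represented here as a dict)."""
    return {"signal": sound}

def process(signal, destination):
    """Process: decide where the call should be routed."""
    signal["route_to"] = destination
    return signal

def exchange(signal):
    """Exchange: move the signal across the 'network' to its destination."""
    return signal  # in reality, routed across many network links

def store(signal):
    """Store: keep the message for later retrieval (voicemail)."""
    voicemail_store.append(signal)

def present(signal):
    """Present: convert the signal back into something the recipient understands."""
    print(f"Playing for {signal['route_to']}: {signal['signal']}")

msg = exchange(process(capture("Hello, are you free for lunch?"), "Ana"))
store(msg)
present(msg)
```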
At the end of the lesson, students will be able to do the following:
Define cyberspace, online, network, Internet, and World Wide Web;
Provide examples of the Internet of Things (IoT); and
Determine cloud computing services.
Video Lesson: https://youtu.be/oos6r3in1u0
PPT: https://docs.google.com/presentation/d/1pJCuwui01arJ8gbVcs4AMZatCNtm2qO9lTa350fpR9E/edit?usp=sharing
Understanding Cyberspace and Related Terms
In studying the internet, we often encounter terms like the Internet, Cyberspace, World Wide Web, network, and online. These terms are sometimes used interchangeably but have distinct meanings. Let's take a quick look at the history of the internet to understand how it has evolved into what we know today.
Cyberspace is a term coined by American-Canadian writer William Gibson in his 1984 science fiction novel Neuromancer. Set in the future, the novel follows Henry Case, a washed-up computer hacker hired for one last job, which brings him up against a powerful artificial intelligence. Gibson described a futuristic computer network that people could plug into directly with their brains. Today, the term "cyberspace" encompasses the internet and the World Wide Web, as well as the broader realm of wired and wireless communications. Cyberspace includes chat rooms, blogs, ATMs, conference calls, texting, and more.
Online refers to using a computer connected through a network to access information and services from another computer or information device.
A network is a communication system connecting two or more computers. The Internet is the largest network, a worldwide system that links thousands of smaller networks. It connects educational, commercial, non-profit, and military entities, as well as individuals. Initially developed to share text and numeric data, the internet now supports multimedia as well.
The World Wide Web is the multimedia part of the Internet. It is an interconnected system of servers supporting specially formatted documents in multimedia form. It includes text, still images, moving images, and sound. The World Wide Web is largely responsible for the growth and popularity of the Internet.
A Brief History of the Internet
Long before modern technology existed, Nikola Tesla toyed with the idea of a world wireless system in the early 1900s. Visionary thinkers like Paul Otlet and Vannevar Bush conceived the idea of mechanized, searchable storage systems for books and media in the 1930s and 1940s. In the early 1960s, MIT's Joseph Carl Robnett Licklider popularized the idea of an "Intergalactic Network" of computers. Computer scientists then developed the concept of packet switching, a method for effectively transmitting electronic data.
In the late 1960s, the first workable prototype of the Internet emerged with the creation of ARPANET (Advanced Research Projects Agency Network). On October 29, 1969, ARPANET delivered its first message—a node-to-node communication from one computer to another. It sent a "login" message from UCLA to Stanford.
In the 1970s, Robert Kahn and Vinton Cerf developed Transmission Control Protocol/Internet Protocol (TCP/IP), a networking protocol that allows two computers to communicate and a communications model that sets standards for how data can be transmitted between multiple networks. ARPANET adopted TCP/IP on January 1, 1983, and from there, researchers began to assemble the network of networks that became the Internet. In 1990, the online world took on a more recognizable form when Tim Berners-Lee invented the World Wide Web.
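As a concrete, greatly simplified illustration of two computers communicating over TCP/IP, the Python sketch below opens a TCP connection to a public web server and sends a minimal HTTP request; the host name is only an example.

```python
import socket

# Open a TCP connection to a web server (example host) on port 80.
with socket.create_connection(("example.com", 80), timeout=5) as conn:
    # Send a minimal HTTP request over the TCP/IP connection.
    conn.sendall(b"GET / HTTP/1.1\r\nHost: example.com\r\nConnection: close\r\n\r\n")
    response = b""
    while True:
        chunk = conn.recv(4096)
        if not chunk:
            break
        response += chunk

print(response.decode("utf-8", errors="replace")[:200])  # first 200 characters of the reply
```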
This era also introduced the concept of the Internet of Things (IoT). The IoT connects any device to the internet and other connected devices, creating a giant network of connected things and people that collect and share data about how they are used and their environments.
Examples of the Internet of Things:
Smart microwaves that automatically cook your food for the right length of time.
Self-driving cars with complex sensors that detect objects in their path.
Wearable fitness devices that measure your heart rate and the number of steps you take, then suggest personalized exercise plans.
Connected footballs that track how far and fast they are thrown, recording those statistics via an app for future training purposes.
IoT in Your Home
Imagine you wake up at 7:00 AM every day to go to work. Your alarm clock does the job just fine until something goes wrong. Suppose your train is canceled and you have to drive to work instead. Driving takes longer, so you would have needed to get up at 6:45 AM to avoid being late. Additionally, it's pouring rain, so you'll need to drive slower than usual.
An IoT-enabled alarm clock would reset itself based on these factors to ensure you get to work on time. It could recognize that your usual train is canceled, calculate the driving distance and travel time for your alternative route, check the weather, factor in slower travel speed due to heavy rain, and determine when it needs to wake you up so you're not late. If it's super smart, it might even sync with your IoT-enabled coffee maker to ensure your morning caffeine is ready when you get up.
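A minimal sketch of the decision logic such an alarm clock might apply is shown below. The commute times, the rain slowdown factor, and the getting-ready time are assumed values; a real IoT alarm clock would pull live data from transit, mapping, and weather services.

```python
from datetime import datetime, timedelta

def compute_wake_time(arrival_deadline, train_cancelled, base_drive_minutes, heavy_rain):
    """Work backwards from the time you must arrive to the time you must wake up."""
    if not train_cancelled:
        travel = timedelta(minutes=45)           # assumed usual train commute
    else:
        travel = timedelta(minutes=base_drive_minutes)
        if heavy_rain:
            travel *= 1.3                        # assumed 30% slowdown in heavy rain
    get_ready = timedelta(minutes=30)            # assumed time to shower and dress
    return arrival_deadline - travel - get_ready

deadline = datetime(2024, 5, 6, 9, 0)            # must be at work by 9:00 AM
wake = compute_wake_time(deadline, train_cancelled=True, base_drive_minutes=60, heavy_rain=True)
print("Alarm set for:", wake.strftime("%I:%M %p"))
```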
IoT Sensors
IoT sensors consist of analog or digital sensors connected to circuit boards such as the Arduino Uno or Raspberry Pi. These circuit boards can be programmed to measure a range of data, including carbon monoxide, temperature, vibration, and motion. What differentiates IoT sensors from simple sensors is their ability to not only gather data in different physical environments but also send this data to connected devices.
IoT sensors enable seamless data control through automation, delivering actionable insights. Businesses can use them for predictive maintenance, enhanced efficiency, and reduced costs.
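A minimal sketch of the "gather and send" behaviour that distinguishes an IoT sensor is shown below. The temperature read is simulated and the gateway URL is a placeholder; on a real Raspberry Pi or Arduino-based device, the reading would come from actual sensor hardware and be sent to that deployment's own gateway or cloud service.

```python
import json
import random
import time
import urllib.request

GATEWAY_URL = "http://192.168.1.50:8080/readings"  # placeholder endpoint, not a real service

def read_temperature_celsius():
    """Simulated sensor read; a real device would query its hardware sensor here."""
    return round(20 + random.uniform(-2, 2), 2)

def send_reading(value):
    """Send one reading to the gateway as JSON over HTTP."""
    payload = json.dumps({"sensor": "temp-01", "celsius": value, "ts": time.time()}).encode()
    req = urllib.request.Request(GATEWAY_URL, data=payload,
                                 headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(req, timeout=5) as resp:
        return resp.status

for _ in range(3):                       # report a few readings, one per minute
    reading = read_temperature_celsius()
    print("Sending", reading, "degrees C")
    try:
        send_reading(reading)
    except OSError as err:               # gateway unreachable in this sketch
        print("Send failed:", err)
    time.sleep(60)
```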
Understanding Cloud Computing
You might have heard of cloud computing before. Cloud computing provides access to information, applications, communications, and storage over the Internet. Before cloud computing, most computers ran software locally. For example, to use a word processor, you might open Microsoft Word, which you had installed on your computer's hard disk. Data storage was also local; emails, documents, photos, and music were all stored on your computer's hard drive or a flash drive.
With cloud computing, all of this changes. You can use your browser to access word processing applications that run from the Internet instead of software installed on your local hard disk. You can use online applications to manage your email, create floor plans, produce presentations, and carry out many other activities. Additionally, you can store your data in the cloud, making it available no matter what computer you're using, as long as it has an Internet connection.
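The shift from local to cloud storage can be sketched in a few lines. In the example below, the same document is written to the local disk and then uploaded over HTTP to a hypothetical cloud storage endpoint; the URL and token are placeholders, not any real provider's API.

```python
import urllib.request

document = "Quarterly report draft...".encode("utf-8")

# Local storage: the file lives on this machine's hard disk only.
with open("report.txt", "wb") as f:
    f.write(document)

# Cloud storage (illustrative): upload the same bytes to a storage service over HTTP,
# so any internet-connected device with the right credentials can retrieve them.
CLOUD_URL = "https://storage.example.com/my-bucket/report.txt"  # placeholder endpoint
req = urllib.request.Request(CLOUD_URL, data=document, method="PUT",
                             headers={"Authorization": "Bearer <token>"})  # placeholder token
# urllib.request.urlopen(req)  # uncomment when pointing at a real storage endpoint
```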
Cloud Computing Services
Cloud computing services include infrastructure, platform, and software. Here are the main types of cloud computing services:
Infrastructure as a Service (IaaS): IaaS provides highly automated and scalable infrastructure for storage, hosting, computing, and networking from third-party global data centers. Instead of owning assets like software licenses or on-premise servers, companies can flexibly rent resources according to their needs, paying only for what they use. Examples of IaaS include Amazon Web Services (AWS), Cisco Metapod, Microsoft Azure, and Google Compute Engine (GCE). IaaS is typically used by system administrators.
Platform as a Service (PaaS): PaaS offers all the basics of IaaS as well as the tools and capabilities needed to develop and deploy applications securely. It provides developers with everything they need to build and deploy applications. Examples of PaaS include Google App Engine, Heroku, and Windows Azure. Heroku is a cloud platform supporting several programming languages. PaaS is commonly used by developers.
Software as a Service (SaaS): SaaS is software hosted by a third party and accessible over the web, typically via a subscription model per user. This differs from the old model of buying and installing software on a machine or server manually. Examples of SaaS include Google Apps like Gmail, Docs, and Drive, Dropbox, Evernote, and Cisco WebEx. SaaS is most commonly used by end customers.
REFERENCES:
https://www.history.com/news/who-invented-the-internet
http://sarabhaiit.com/iot/
https://sea.pcmag.com/networking-communications-software/2919/what-is-cloud-computing
https://www.bmc.com/blogs/saas-vs-paas-vs-iaas-whats-the-difference-and-how-to-choose/
https://www.softwaretestinghelp.com/best-iot-examples/
At the end of the lesson, students will be able to do the following:
Define Information Society (IS);
Discuss the beginning of IS;
Differentiate Industrial Society and Information Society;
Explain an example of an Information Cycle;
List down the indicators and technologies of IS; and
Identify the laws to regulate IS.
Video Lesson: https://youtu.be/lMqZyqgG9u0
PPT: https://docs.google.com/presentation/d/1wI72S7tTaVZN4IVUiHBy4hc2WhL8CP2SJIs1dSGKbMU/edit?usp=sharing
The concept of the Information Society is frequently encountered in discussions about Information Technology. This section will define and explain what the Information Society truly entails.
Defining the Terms
Information refers to an assemblage of data presented in a comprehensible form that can be communicated and used effectively. It involves attaching meaning to raw facts, transforming them into a structured and useful form. For example, raw data such as numbers or text becomes meaningful information when organized into reports or analyzed to reveal trends and patterns.
Society, on the other hand, is a community of people living in a specific country or region, who are connected by shared customs, laws, and organizations. It encompasses the social structures, institutions, and interactions that shape how people live and work together.
When combined, the term Information Society describes a societal framework where information, rather than material goods, becomes the main economic, social, and cultural driving force. In an Information Society, activities such as the creation, distribution, use, integration, and manipulation of information play a central role in economic and cultural life. The aim of this society is to gain a competitive edge through the creative and effective use of Information Technology (IT).
The concept of the Information Society first emerged in Japan during the 1960s, with economist Fritz Machlup being one of the early proponents of the idea. It is sometimes referred to as the "Post-Industrial Society" due to its evolution from the earlier Industrial Society.
Industrial Society, as described by Ashley Crossman in her article “What Is an Industrial Society” on www.thoughtco.com, is characterized by the use of mass production technologies to create large quantities of goods in factories. This mode of production became the central force shaping social and economic life.
Comparing Industrial Society to Information Society
In the Industrial Society, engines and machines were central, whereas in the Information Society, computers take on this role. The basic labor in the Industrial Society was physical, while in the Information Society, mental labor is emphasized. The Industrial Society focused on producing material goods and services, while the Information Society concentrates on creating and managing information. The Industrial Society's production centers were facilities equipped with machinery, whereas the Information Society relies on information utilities and networks.
Nick Moore identifies three main characteristics of Information Societies. First, information is utilized as an economic resource, with organizations leveraging it to improve efficiency, spark innovation, and enhance their competitive position. Second, there is a significant increase in the general public’s use of information, which helps individuals make informed decisions, access public services, understand their rights, and participate in distance learning. Third, there is the development of an information sector within the economy, focusing on satisfying the demand for information facilities and services, including technological infrastructure such as telecommunications networks and computer systems.
The Information Cycle
The process of creating information begins with creators like writers, musicians, artists, researchers, and web developers. Their output consists of information products such as books, videos, magazines, and websites. These products are distributed by publishers, internet providers, and vendors, and are then made available through schools, libraries, universities, businesses, governments, and museums. Finally, this information is used by individuals, students, businesspersons, researchers, employees, and employers.
Example: A web developer creates a website intended for students to access online.
Indicators of an Information Society
Indicators of an Information Society include the growth of information and communication technologies (ICTs), an increase in skilled professionals, and changes in occupational structures that promote equal opportunities. The creation, management, and delivery of information become essential assets, and there is a heightened use of information for public engagement in policymaking and social activities.
Technologies Used
Technologies in the Information Society include phones, broadcasting media, computers, the World Wide Web, intranets, the internet, telecommunications, and satellites.
Regulation and Impact
As Information Societies expand, there are growing calls for regulation. Proposed regulations include guidelines for freedom of information and related laws, such as copyright and intellectual property rights.
In today’s world, the internet and computer networks have connected people globally. ICTs have made life more convenient and productive. For example, schools now use fewer physical books, businesses conduct transactions online, and organizations make predictions with minimal effort. Consequently, mental power is as important as physical power in the modern era.
REFERENCES:
https://whatis.techtarget.com/definition/Information-Society
https://www.thoughtco.com/industrial-society-3026359
https://files.dnb.de/EDBI/www.unesco.org/webworld/wirerpt/wirenglish/chap20.pdf
https://www.slideshare.net/shifanmihilar/information-society-2
At the end of the lesson, students will be able to do the following:
Identify the new trends in Information Technology (IT);
Predict the future of IT;
Explain the different mobile technology communication networks: cellular networks, Wi-Fi, and Bluetooth for mobile devices;
Categorize artificial intelligence;
Compare virtual reality and augmented reality; and
Discuss Emotional Intelligence Computing.
Video Lesson: https://youtu.be/pS_UbWdiixA
PPT: https://docs.google.com/presentation/d/1gyI2DE8vRKFsTAiWj0wuu9U1o7LvzVG62AI7owzcELU/edit?usp=sharing
New Trends and Future Directions of IT
This section explores the new trends and future directions of Information Technology (IT) that are increasingly prevalent across various sectors today. The discussion will cover mobile technology, artificial intelligence, machine learning, virtual reality, augmented reality, and affective computing.
Mobile Technology
Mobile technology refers to portable communication and computing devices that enable users to stay connected wherever they go. This technology includes devices such as smartphones, tablets, and smartwatches, along with the networks that support their communication capabilities.
Mobile Technology Communication Networks:
Cellular Networks: The evolution of cellular networks began in the 1980s with 1G, which provided analog, voice-only service. 2G, launched in 1991, added digital voice, text messaging, and later multimedia messaging. 3G, which arrived in the late 1990s and early 2000s, brought enhanced data capabilities, including video calling and mobile internet. The late 2000s saw the advent of 4G, offering speeds up to 500 times faster than 3G. 5G, which began rolling out in 2019, aims to support more devices and further expand the Internet of Things (IoT).
Wi-Fi: Often expanded as “Wireless Fidelity,” Wi-Fi allows devices to connect to networks using radio waves instead of wires.
Bluetooth: A technology for short-range wireless communication between devices such as phones, computers, and headphones.
Artificial Intelligence (AI)
According to Built In, Artificial Intelligence is a broad field within computer science dedicated to creating smart machines capable of performing tasks that typically require human intelligence.
AI generally falls into two main categories:
Narrow AI: Also known as Weak AI, this type of AI operates within a limited context and is designed to perform specific tasks. While these systems can accomplish these tasks effectively, they are constrained by their specialized functions and lack the general problem-solving abilities of human intelligence.
Artificial General Intelligence (AGI): Sometimes referred to as Strong AI, AGI represents a type of AI that possesses general intelligence akin to that of a human being. AGI systems can understand, learn, and apply knowledge across a wide range of tasks, similar to the sophisticated robots seen in movies like Westworld or Star Trek: The Next Generation.
Narrow AI is the most prevalent and successful form of artificial intelligence seen today. Some prominent examples of Narrow AI include:
Image Recognition Software: This technology identifies objects, people, places, text, and actions in images or videos. For instance, Facebook’s facial recognition feature, which identifies and tags friends in photos, is a practical application of this technology.
Google Search: Also known as Google Web Search, this web search engine, developed by Google, holds a dominant 92.62% market share as of June 2019. It handles over 5.4 billion searches daily.
Intelligent Virtual Assistants (IVAs): These software agents perform tasks or services based on user commands or questions. Examples include Google Assistant, Siri, and Alexa.
Self-Driving Cars: Also known as autonomous vehicles or driverless cars, these vehicles can sense their environment and navigate safely with minimal or no human intervention.
Question-Answering Systems: Systems like IBM’s Watson, developed by David Ferrucci's DeepQA research team, can answer questions posed in natural language. Watson, named after IBM's founder Thomas J. Watson, showcases advanced capabilities in processing and understanding human language.
Other Examples of Narrow AI include:
Conversational bots for marketing and customer service
Robo-advisors for stock trading
Spam filters for email
Social media monitoring tools for detecting dangerous content or false news
Recommendation systems for music or TV shows, such as those used by Spotify and Netflix
Disease mapping and prediction tools
Personalized healthcare treatment recommendations
Venture capitalist Frank Chen offers a clear overview of AI, describing it as "a set of algorithms and intelligence to try to mimic human intelligence," with machine learning and deep learning being key components.
Machine Learning is a field focused on enabling machines to learn from experience and improve their performance over time. Common applications include:
Face detection on smartphones
Friend and ad recommendations on social media
Product recommendations based on browsing history on platforms like Amazon
Fraud detection in banking transactions
Deep Learning is a specialized area of machine learning that utilizes a neural network with multiple hidden layers to process data. This approach allows machines to make complex connections and improve their learning outcomes.
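To illustrate what "a neural network with multiple hidden layers" means in code, the NumPy sketch below pushes a made-up input through two hidden layers to produce an output. It shows only the forward pass with random weights; training, that is, adjusting those weights from data, is what allows such a network to learn.

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    """Activation function: keeps positive signals, zeroes out negative ones."""
    return np.maximum(0, x)

# A made-up input with 4 features (e.g., pixel intensities or sensor measurements).
x = rng.random(4)

# Two hidden layers and an output layer, with randomly initialized weights.
w1, b1 = rng.normal(size=(4, 8)), np.zeros(8)   # hidden layer 1: 4 -> 8
w2, b2 = rng.normal(size=(8, 8)), np.zeros(8)   # hidden layer 2: 8 -> 8
w3, b3 = rng.normal(size=(8, 1)), np.zeros(1)   # output layer:   8 -> 1

h1 = relu(x @ w1 + b1)       # first hidden layer
h2 = relu(h1 @ w2 + b2)      # second hidden layer
y = h2 @ w3 + b3             # output (untrained, so the value is meaningless)
print("Network output:", y)
```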
Virtual Reality
Virtual Reality (VR) refers to the use of computer technology to create a simulated environment. The most recognizable component of VR is the head-mounted display (HMD). As visual creatures, humans are significantly influenced by display technology, which often distinguishes immersive VR systems from traditional user interfaces. Major players in the VR market include HTC Vive, Oculus Rift, and PlayStation VR (PSVR).
VR immerses users in a digital experience by placing them inside a 3D world rather than simply displaying content on a screen. By simulating multiple senses—such as vision, hearing, touch, and even smell—VR creates an artificial environment where users can interact with the virtual world.
Augmented Reality (AR) is often confused with VR, but they are distinct technologies. Let’s differentiate between the two:
Virtual Reality (VR) creates a completely artificial environment. In VR, the computer uses sensors and algorithms to place the user’s eyes within a simulated world. When the user moves their head, the graphics adjust accordingly to maintain the illusion of being in a different environment. VR focuses on creating a fully immersive and interactive experience.
Augmented Reality (AR), on the other hand, overlays artificial objects onto the real world. Unlike VR, AR combines real and virtual elements. The computer uses sensors and algorithms to track the position and orientation of a camera, rendering 3D graphics that appear to be part of the user’s real-world view. AR adds digital content to the user’s environment rather than creating a separate virtual world.
In short, while VR immerses users in a completely virtual environment, AR enhances the real world with computer-generated images and information.
One of the promising technologies for the future is affective computing, also known as artificial emotional intelligence. Affective computing refers to the study and development of systems and devices designed to recognize, interpret, process, and simulate human emotions. These technologies use various methods, such as sensors, microphones, cameras, and software algorithms, to detect a user’s emotional state and respond with predefined actions, such as adjusting a quiz or recommending videos based on the user’s mood.
Affective computing aims to achieve several key aspects of emotional intelligence, including:
Recognizing others’ emotions: Identifying emotional states through various cues.
Responding to others’ emotions: Reacting appropriately to emotional signals.
Expressing emotions: Simulating emotional responses.
Regulating and utilizing emotions in decision-making: Managing and using emotional information effectively.
How Does Affective Computing Work?
Affective computing involves human-computer interaction where a device can detect and respond to a user’s emotions. This is achieved through various means such as:
Facial expressions: Cameras capture and analyze facial expressions.
Posture and gestures: Sensors detect body language.
Speech: Speech recognition systems interpret tone and content.
Keystrokes: Analyzing typing patterns for emotional cues.
Temperature changes: Measuring changes in hand temperature on a mouse.
These methods collect data that is processed using algorithms to provide meaningful insights into the user’s emotional state. Technologies like facial recognition, speech analysis, and gesture recognition are key components of affective computing.
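As a toy illustration of turning such signals into an emotional state and a predefined response, the rule-based Python sketch below uses two assumed features, typing speed and a facial "smile score"; real affective computing systems rely on trained machine-learning models over much richer sensor data.

```python
def estimate_emotion(keystrokes_per_minute, smile_score):
    """Very rough rule-based guess at a user's emotional state.

    keystrokes_per_minute: typing speed from keystroke analysis (assumed feature)
    smile_score: 0.0-1.0 value from facial-expression analysis (assumed feature)
    """
    if smile_score > 0.7:
        return "pleased"
    if keystrokes_per_minute > 350 and smile_score < 0.3:
        return "frustrated"
    if keystrokes_per_minute < 100:
        return "bored or distracted"
    return "neutral"

def respond(emotion):
    """Predefined responses, e.g., adjusting a quiz or recommending a break."""
    actions = {
        "frustrated": "Offer an easier practice question and a short hint.",
        "bored or distracted": "Switch to a more interactive exercise.",
        "pleased": "Continue at the current difficulty.",
        "neutral": "No change.",
    }
    return actions[emotion]

state = estimate_emotion(keystrokes_per_minute=400, smile_score=0.1)
print(state, "->", respond(state))
```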
Applications of Affective Computing
Affective computing has significant potential across various sectors. Based on Prof. Ahmed Banafa’s insights from BBVA OpenMind, here are some specific applications:
E-Learning: Affective computing can adjust a computerized teacher’s presentation style based on the learner’s emotional state, such as boredom, interest, frustration, or pleasure.
Psychological Health Services: It can aid counselors in assessing a client’s emotional state, improving therapeutic outcomes.
Robotic Systems: Robots that process affective information can be more flexible and effective in uncertain or challenging environments.
Companion Devices: Digital pets and similar devices use affective computing to enhance realism and provide greater autonomy.
Examples of Affective Computing in Use
Automobiles: Cars can monitor a driver’s emotions and implement safety measures, such as alerting other drivers if the system detects anger.
Affective Mirrors: Mirrors that guide users on how to perform emotional expressions.
Emotion Monitoring Agents: Tools that warn users before sending an emotionally charged email.
Music Players: Devices that select music tracks based on the user’s mood.
Market Research: Companies can use affective computing to gauge public reception of their products.
Future Directions in IT
Looking ahead, future developments in IT will continue to focus on advancements in mobile technology, artificial intelligence, and emotional computing. The ongoing evolution of these technologies promises to reshape various aspects of daily life, from improving e-learning experiences to enhancing user interactions through innovative applications.
At the end of the lesson, students will be able to do the following:
Describe ubiquitous computing;
Discuss the beginning of ubiquitous computing;
Explain the goal of ubiquitous computing;
Identify examples of ubiquitous computing;
Differentiate virtual reality and ubiquitous computing;
List down challenges, advantages, and uses of ubiquitous computing;
Outline the key elements of ubiquitous computing; and
Create an example of ubiquitous computing and label its ubiquitous sensing, access, middleware, and networking.
Video Lesson: https://youtu.be/xa7jnY_mCaw
PPT: https://docs.google.com/presentation/d/1LL-_DL8tgDSKG33wMW3tIJr0HPF8mo6p0Gpg3Q88ZbI/edit?usp=sharing
Ubiquitous computing, also known as pervasive computing, refers to a concept in software engineering and computer science where computing technology is made available at any time and in any place. The term encompasses the idea that computing devices can be integrated into everyday objects and activities, creating a seamless and pervasive computing environment.
Ubiquitous computing envisions a world where almost any device—be it clothing, tools, appliances, cars, homes, or even a coffee mug—can be embedded with chips that connect to a vast network of devices. This approach supports a model where one user can manage multiple computing tasks and one device can perform many functions, such as a smartphone.
The term was coined by Mark Weiser, the chief scientist at Xerox PARC (Palo Alto Research Center) in the late 1980s. Weiser described the ideal of ubiquitous computing as making computers so integrated and natural that they become invisible and seamlessly embedded into everyday life.
Manuel Castells is another significant figure in the concept of ubiquitous computing. In his book, “The Rise of the Network Society,” Castells discusses the shift from stand-alone microcomputers and mainframes to a pervasive computing environment. He envisioned a future where billions of interconnected devices create a comprehensive network.
The goal of ubiquitous computing is to create an environment where connectivity is unobtrusive and constantly available. This involves devices that are tiny, affordable, robust, and capable of seamless integration into various objects and environments.
Examples of ubiquitous computing include:
Computer sensors in floors that monitor physical health.
Smart meters that replace old electric meters to provide real-time data on electricity usage.
Automatic intelligent lighting and cooling systems that adjust based on environmental and user data.
Smart thermostats like Nest, which learn and anticipate user needs for heating and cooling.
Interactive whiteboards used for education and business presentations.
RFID tags combined with barcodes and QR codes to connect products to the Internet of Things (IoT).
Virtual reality and ubiquitous computing are compared to illustrate their differences. While virtual reality immerses users in a computer-generated environment, ubiquitous computing integrates computing into the real world, making technology an invisible but omnipresent aspect of daily life.
Challenges of ubiquitous computing include:
Privacy and Security: Concerns arise over continuous data collection and monitoring.
Volatility: Rapid technological advancements can lead to obsolescence and increased costs.
Impromptu Interoperability: Proprietary technologies can create barriers between different devices and systems.
Advantages of ubiquitous computing are:
Effective Information Processing: Enhances user experiences and manages information efficiently.
Seamless Integration: Technology is designed for everyday use.
Enhanced Socialization and Decision Making: Facilitates better decision-making and social interactions.
Ubiquitous computing technologies include:
Information Access: Retrieval of text, multimedia documents, and e-books.
Automatic Indexing: Algorithms and machine learning for indexing without human intervention.
Networking and Wireless Protocols: Technologies for connecting devices and accessing information.
Key Elements of Ubiquitous Computing are:
Ubiquitous Sensing: Devices that detect physical stimuli and provide data.
Ubiquitous Access: Ensures cloud services are accessible from various devices and interfaces.
Ubiquitous Middleware: Software that manages interactions between applications and networks.
Ubiquitous Networking: Distribution of communication infrastructure to ensure continuous connectivity.
An example of these elements is a smart bathroom patent that uses sensors and cameras to monitor health and collect data for professional analysis.
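The four elements can also be labeled directly in a short sketch. The smart-thermostat example below is purely illustrative: the sensor read is simulated and the cloud endpoint is a placeholder, but the comments mark where ubiquitous sensing, networking, middleware, and access each come into play.

```python
import json
import random
import urllib.request

CLOUD_API = "https://home.example.com/api/temperature"  # placeholder endpoint

def read_room_temperature():
    """Ubiquitous sensing: a sensor embedded in the room reports its state."""
    return round(21 + random.uniform(-3, 3), 1)

def publish(reading):
    """Ubiquitous networking: the reading travels over the home network to the cloud."""
    payload = json.dumps({"room": "living-room", "celsius": reading}).encode()
    req = urllib.request.Request(CLOUD_API, data=payload,
                                 headers={"Content-Type": "application/json"})
    # urllib.request.urlopen(req)  # enable when a real endpoint exists

def decide(reading, target=22.0):
    """Ubiquitous middleware: logic between the sensor and the appliance decides what to do."""
    return "heat on" if reading < target else "heat off"

reading = read_room_temperature()
publish(reading)
print(f"{reading} degrees C -> {decide(reading)}")
# Ubiquitous access: the same reading and decision would also appear in the
# homeowner's phone app or web dashboard, wherever they happen to be.
```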
By understanding these concepts, one can appreciate how ubiquitous computing is shaping the future of technology and its impact on daily life.
At the end of the lesson, students will be able to do the following:
Explain the benefits of Information Technology (IT);
Outline the social issues of IT;
Identify economic issues of IT;
Discuss ethical issues of IT;
Justify legal issues of IT; and
Examine the environmental issues of IT.
Video Lesson: https://youtu.be/PLeaL7e1EeI
At the end of the lesson, students will be able to do the following:
Explain data, Big Data, and Big Data Analytics;
List down examples of Big Data and machine learning activities on Big Data;
Describe the different types of data based on the degree of organization, as follows: structured data, semi-structured data, and unstructured data;
Discuss various characteristics of Big Data, as follows: volume, variety, veracity, value, and velocity; and
Identify what organizations do to extract valuable information from Big Data.
Video Lesson: https://youtu.be/iH-XIkMplHo
At the end of the lesson, students will be able to do the following:
Discuss the benefits of big data to different sectors; and
Explain the drawbacks of big data.
Video Lesson: https://youtu.be/oACqGwtqm1M
At the end of the lesson, students will be able to do the following:
Discuss computer ethics;
List down the rules of computer ethics;
Explain netiquette;
Outline the core rules of netiquette;
Enumerate tips to stay safe online; and
Identify issues in the cyberworld.
Video Lesson: https://youtu.be/hamJzUXVwTw
At the end of the lesson, students will be able to do the following:
Define cybercrime;
Discuss the Cybercrime Prevention Act of 2012; and
List down the offenses included in Republic Act 10175.
Video Lesson: https://youtu.be/pJMGKeoIfyw
At the end of the lesson, students will be able to do the following:
Create strong passwords;
Lock personal devices;
Identify how the browser works;
Explain the threats of using public Wi-Fi;
Discuss the disadvantages of being too public on social media;
Check and clean up email;
List down steps to avoid malware delivered through adware;
Demonstrate backing up files; and
Explain the importance of securing a wireless network.
Video Lesson: https://youtu.be/Ka3v2f9B1Kk
At the end of the lesson, students will be able to do the following:
Discuss the different agencies responsible for IT programs;
Identify the laws that created those agencies; and
Outline their powers and functions.
Video Lesson: https://youtu.be/OXAg9uVw7g8
At the end of the lesson, students will be able to do the following:
Define copyright;
List down what can be protected using copyright;
Discuss what rights copyright gives;
Identify how long copyright lasts;
Describe patent;
Explain what kind of protection a patent offers;
Determine how long a patent lasts;
Define trademark;
Explain what rights trademark registration provides;
Identify how long a trademark lasts;
Discuss Republic Act Number 8293 or the Intellectual Property Code of the Philippines;
List down the mandate of the Intellectual Property Office of the Philippines; and
Explain what the IPOPHL logo represents.
Video Lesson: https://youtu.be/YlQFRzW6USQ
At the end of the lesson, students will be able to do the following:
Define e-commerce;
Discuss the Electronic Commerce Act of the Philippines; and
List down the salient features of Republic Act 8792.
Video Lesson: https://youtu.be/Sm62p5RNEZY
At the end of the lesson, students will be able to do the following:
Define data privacy;
Discuss the Data Privacy Act of 2012, or Republic Act No. 10173; and
Identify the functions of the National Privacy Commission.
Video Lesson: https://youtu.be/immgbXJRwKo
This lesson discusses the data subject's data privacy rights under Republic Act 10173, or the Data Privacy Act of 2012.
At the end of the lesson, students will be able to do the following:
Describe data subject, personal information, and personal information controller; and
Discuss the data privacy rights of a data subject.
Video Lesson: https://youtu.be/immgbXJRwKo
At the end of the lesson, students will be able to do the following:
Define personal data breach;
Classify data breaches;
Identify reasons why data breaches occur;
Discuss the COMELEC data breach and the Cebuana Lhuillier data breach; and
Explain the procedures of data breach notification of the National Privacy Commission.
Video Lesson: https://youtu.be/UXZZNvzr7qQ
Here are the review materials that will guide the students in studying for the periodical exams. Click each image to go to the resources.
This serves as review material for the Prelim period. The topics included are Computer Generations, Intro to IT, the Internet, Information Society, New Trends and Future Directions of IT, and Ubiquitous Computing.
Link: https://bit.ly/gelc111prm
This serves as review material for the Midterm period. Covered topics are Benefits and Issues of IT, Intro to Big Data, Computer Ethics, and Government Agencies Responsible for IT Programs.
Link: https://bit.ly/3uLaVR4
This serves as review material for the Final period. The topics included are the Intellectual Property Code, E-Commerce Act, Cybercrime Act, Data Privacy Act, Personal Data Privacy Rights, and Data Breach.
Link: https://bit.ly/3CqcQhP