It seems that no matter how high-tech or well-funded your systems and software are, there will always be issues present within them. We see software and system malfunctions all the time, as when "Eleven people were killed in U.S. crashes involving vehicles that were using automated driving systems during a four-month period" [1] in 2022. We need to be careful with what we buy and how we use it, because the computing systems we purchase could end up harming us regardless of their creators' intent. Errors are natural and happen all the time, but those who create products need to test them more thoroughly before releasing them to the public.
Errors Leading to System Malfunctions
There is a scene in which a BASH LiiF phone repeatedly purchases a song without the user's consent. This is likely due to a bug in a feature that purchases items based on what the phone hears: the owner was listening to talk show hosts discuss the artist whose song the phone then bought. There is also the scene in which the CEO of BASH touches the BASH LiiF 14.3, which reads the user's emotions and treats them accordingly. The phone determines that he is sad and recommends a video to cheer him up, but it is later revealed that the video did not satisfy him.
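The purchase bug described above can be sketched in a few lines. This is a hypothetical illustration, not the film's actual software: the function name, catalog, and transcript are all invented, and the defect is simply the absence of a confirmation step before buying.

```python
# Hypothetical sketch of the defect described above: a voice-driven purchase
# flow that acts on any mention of a catalog item. All names are illustrative.

def handle_transcript(transcript, catalog, purchases):
    """Buy every catalog item mentioned in ambient speech (the bug)."""
    for item in catalog:
        if item.lower() in transcript.lower():
            # BUG: no user confirmation -- background chatter (e.g. a talk
            # show discussing an artist) triggers a real purchase.
            purchases.append(item)
    return purchases

purchases = []
handle_transcript(
    "the hosts kept talking about that new Riley Bina single",
    catalog=["Riley Bina single"],
    purchases=purchases,
)
# The song is purchased even though the owner never asked for it.
```

The fix would be a guard requiring an explicit confirmation from the user before any purchase is committed.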
Errors Leading to System Failures
When the BEADS were launched into space to mine the comet for valuables, a glitch caused numerous drones to fail to mine the comet as planned. Because more drones failed than the government and scientists had anticipated, the mission was ultimately unsuccessful. Since the scientists could instantly see which drones were going offline and when, it is implied that real-time systems were used to monitor the drones.
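The kind of real-time monitoring implied here can be sketched as a heartbeat check: each drone periodically reports in, and mission control counts how many have been heard from recently. The threshold, timeout, and drone names below are assumptions for illustration; the film never specifies them.

```python
# Illustrative real-time monitoring sketch, not the film's actual system:
# track each drone's last check-in time and decide go/abort from the count.

def mission_status(last_heartbeat, now, min_online, timeout=5.0):
    """Count drones heard from within `timeout` seconds; decide GO or ABORT."""
    online = sum(1 for t in last_heartbeat.values() if now - t <= timeout)
    return online, ("GO" if online >= min_online else "ABORT")

beats = {"drone-1": 100.0, "drone-2": 97.0, "drone-3": 90.0}  # last check-ins
online, status = mission_status(beats, now=101.0, min_online=3)
# drone-3 has been silent for 11 s, so only 2 of 3 are online -> "ABORT"
```

This matches what the scientists see in the film: individual drones visibly dropping offline, with the mission's viability updated as the count falls.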
Assigning Moral Responsibility
While the government was not responsible for the comet's existence, it only tried to stop the comet when it was convenient to do so. With millions of dollars and high-tech resources at its disposal, it could have fought much harder to destroy the comet and save more lives. The government also withheld information about the severity of the comet, prevented information from spreading, and blatantly lied to the public. Dibiasky and Mindy, however, worked hard to make the severity of the comet known, and Dibiasky discovered the comet in the first place. They tried multiple times to persuade the government to at least attempt to stop it. Essentially, the scientists acted more morally than the government did.
Effort to Eliminate “Bad Experiences”
A false negative occurs when people fail to recognize and establish that a lethal predicament is present [2]. The government made an effort to downplay and outright deny the impact the comet would have on the planet. It issued slogans such as "Don't Look Up" and wanted to tell the public there was a much higher chance of surviving the comet than there actually was.
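The false-negative idea can be made concrete with a toy detector. Here the probabilities and thresholds are invented for illustration: setting the alert cutoff too permissively means a genuinely lethal situation produces no alarm, which is exactly the outcome the government's downplaying encouraged.

```python
# Minimal illustration of a false negative: real danger, no alarm.
# All numbers are invented for this sketch.

def comet_alert(impact_probability, threshold):
    """Raise an alert only when the estimated probability exceeds `threshold`.
    Too high a threshold yields false negatives."""
    return impact_probability >= threshold

true_danger = 0.997  # the scientists' estimate: near-certain impact
official_line = comet_alert(true_danger, threshold=0.999)  # downplayed cutoff
honest_call = comet_alert(true_danger, threshold=0.5)
# official_line is False (a false negative); honest_call is True.
```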
Simulations
When the drones and rockets were launched into space, simulations were run to determine whether they would succeed. The simulations were verified and validated, yet the drones and rockets still failed: the rockets were recalled, while the drones genuinely failed their mission. Simulations were also run by the BASH LiiF phone itself, which can predict a person's time and cause of death. The phone uses past events and people's statistics as data points, and uses that information to classify and diagnose people. For example, the phone classifies Mindy as a "lifestyle idealist," meaning that even though he likes to think of himself as ethical, he ultimately just tries to make himself happy and avoid confrontation. The phone also ran a health diagnosis and found issues before his doctors could. Lastly, computer simulations had to be performed so the phone could recognize people's feelings; it takes into account health statistics such as heart rate and blood pressure to determine the user's mood. Without verification and validation, however, simulations are ultimately useless.
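The mood-reading behavior described above amounts to mapping vital signs to a label. The sketch below is a hedged guess at the simplest possible form of such an inference; the thresholds and mood labels are invented, since the film never specifies how the phone actually works.

```python
# Hedged sketch of rule-based mood inference from vitals, like what the
# BASH LiiF is shown doing. Cutoffs and labels are invented for illustration.

def infer_mood(heart_rate_bpm, systolic_bp):
    """Classify mood from two vitals using illustrative cutoffs."""
    if heart_rate_bpm > 100 or systolic_bp > 140:
        return "stressed"
    if heart_rate_bpm < 60 and systolic_bp < 110:
        return "calm"
    return "neutral"

infer_mood(110, 150)  # elevated vitals -> "stressed"
infer_mood(55, 105)   # low, resting vitals -> "calm"
```

A real system would need the verification and validation the paragraph above insists on: without checking such rules against ground truth, the labels are guesses dressed up as readings.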
Software Development/Software Quality
Software engineering plays a huge role in this film, as it underlies essentially all of the computing devices shown. Specifically, CASE (computer-aided software engineering) tools were used to design the BEADS. It is likely that the BASH LiiF phone and the BEADS were created through object-oriented design. Since the BASH LiiF phone gave readings that were often accurate, and the Dell computers and social media platforms had no noticeable glitches, it can be assumed that their software is high quality. Since the BEADS failed, it can be assumed that their software quality is not as good.
[1] Moneywatch, "11 more people killed in car crash involving automated-tech vehicles," CBS News, October 19, 2022, www.cbsnews.com/news/self-driving-vehicles-crash-deaths-elon-musk-tesla-nhtsa-2022/ (accessed May 3, 2023).
[2] Michael J. Quinn, Ethics for the Information Age, Pearson, September 15, 2020, p. 646.