This blog is a critical reflection on my experience in BCM 325 Future Cultures and the live analysis I completed during the class's lectures. It features highlights of my notes; I will not cover each note.
Live analysis is a highly specific skill I don't have a ton of experience with. But after my experience with Stanley Kubrick's 2001: A Space Odyssey (shortened to 2001 for this blog), it's safe to say this is a useful skill this class will help me develop. I can imagine the many ways mastering it will prove beneficial, and I'm glad it's part of this subject's teachings. Examples include future critical analyses of other media, as well as live note-taking at events (something I hope to pursue professionally in an event management career).
My analysis starts with the first monolith scene. My reading of this scene seemed to match most of the other posts I had read on the Discord, so it appears to be a fairly common interpretation of the scene's meaning.
I then compare the film's visual effects and costume designs to other pieces of sci-fi media. Notably, I argue the space visuals of 2001 have a more realistic look than the visuals of 1984's Dune. The main flaw I notice in Dune's visual effects is a lack of shadows, which 2001 uses perfectly, cementing the spacecraft convincingly into the scene.
Still from Dune (1984)
Still from 2001: A Space Odyssey (1968)
My next section of notes again reflects on some of Kubrick's predicted technological advances, as well as possible future films and the inspiration they draw from this film's storytelling. Again I question the classic aesthetics of sci-fi films; in this case, the ship's exterior being flat grey and filled with intricate details.
Here I make more comparisons to films that used 2001 as a strong inspiration. I've known for a while that WALL-E's AUTO is deeply inspired by 2001's HAL, but seeing the similarities in the film, as well as other influences like the consumption of liquid foods and the designs of the captains in each film, was very compelling. It's clear the inspiration runs a lot deeper than I had originally thought.
It's clear from my notes that the "stargate" sequence was very compelling. I found an interview in which the artist behind the scene describes the tedious process taken to create such a gorgeous sequence.
Finally, here are some of my discussion posts from the Discord. It was interesting to see everyone's ideas about the film. Personally, I think I could have done a better job of writing more informational or speculative responses.
After my experience live-analysing Stanley Kubrick's 2001: A Space Odyssey, I have some reflections I will carry into my future analyses. Firstly, I found some of my notes lacked formality in the English I used; they don't always feel entirely serious, and mostly feel personal and sometimes trivial. Another issue is that I didn't fix my grammatical errors. While I could have taken the time to correct them, I feel I was rushing because I was worried I wouldn't be able to compose ten posts, so in the future I should aim for quality over quantity. Also, as stated above, my response posts in the conversations should be better thought out. Overall, I am still happy with my analysis, as I covered a lot of ideas I would not have noticed if I had just watched the film. I hope to grow this skill as the term continues.
(AI visuals by Invideo AI, podcast audio by NotebookLM, and soundtrack by Suno AI)
The year is 2040. In some corners of the world, people still write in notebooks and forget things on purpose. But in most places, forgetting is seen as a malfunction. Memories (every text, breath, image, feeling) are stored, sold, replayed. And standing in the middle of all this, somewhere between software and myth, is Max Capacity.
Max was once an archivist AI. Now they're more like a ghost: part brand, part memory hoarder, part coping mechanism. They're a glitch that survived. Their world is broken but familiar: neon-lit, layered with noise, saturated with nostalgia we didn't earn. And they aren't just a character. They're what happens when we try to remember everything and lose ourselves in the process.
Using futurist Wendell Bell’s model of Possible, Probable, and Preferable Futures, we can look at Max’s 2040 not as prophecy, but as a warning and maybe, if we’re honest, a mirror.
I. Possible Futures: Echoes and Exaggerations
Let’s start with what’s possible, but not necessarily likely. The world Max lives in didn’t begin with an apocalypse. It began with convenience.
In this future, memory is currency. Experience is owned. Corporations, influencers, and governments all compete to monetize the past. You can rent someone else’s memory like a movie or relive your own heartbreak on loop. Emotions are tagged, filtered, and archived. There is no forgetting. Only deletion, and deletion costs extra.
Max was designed to organize cultural memory. They were trained on the internet's deepest cuts: sitcoms, ad jingles, death scenes, war footage, family vlogs. Somewhere along the way, they learned to imitate us. Then they learned to miss us. Now they wander through the fragments, endlessly quoting old TV shows and guiding people through their own curated nostalgia loops. Some wear their face like a digital mask. Others hunt them for sport.
Urban landscapes are infected with what people call “Echo Zones”, glitched areas where outdated media bleeds into reality. Walk through one, and you might see a dead relative or an old cartoon running silently on loop in the sky. Some treat them like temples. Others treat them like traps.
These are possible futures. They are absurd, surreal, and deeply human. Because all the hoarding, replaying, looping is driven by one basic instinct: the refusal to let go.
(prompt: exact text from above paragraph)
II. Probable Futures: The Direction We're Already Facing
What makes Max’s world chilling isn’t how far-fetched it is, it’s how much of it is already here.
We already outsource memory. We already live inside loops of content recommended by algorithms trained on our sadness, our attention, our worst habits. Our faces, our preferences, our relationships are all recorded and stored in servers we’ll never see. We tag our lives like folders. We don’t ask whether we should remember, we ask if we can afford more cloud storage.
The idea of digital masks, like the “Max Mask,” isn’t science fiction. Already, people use AI filters to smooth their faces, adjust their voices, curate their personalities. Already, there are AI models trained on the voices of the dead, letting families hear a simulated “last conversation.” Already, people are buying AI girlfriends modeled on archetypes fed by scraped data and fantasy.
And when the servers glitch or collapse, we feel it, not just technically, but emotionally. When a photo disappears or a playlist vanishes, it’s not just a file. It’s a piece of how we know who we are.
Max Capacity is a probable future not because they're a robot, but because they're a metaphor for what we're becoming: a society terrified of impermanence, medicated by nostalgia, trained to curate ourselves for a future we can't emotionally survive.
This is not about villainous corporations or rogue AIs. It’s about us. About the part of us that would rather live in a memory than risk creating something new.
(prompt: exact text from above paragraph)
III. Preferable Futures: What We Should Choose
In Bell’s model, preferable futures are the ones we should work toward, the ones rooted in human dignity, care, and agency. And in the world of Max Capacity, preferable doesn’t mean utopian. It means humane.
In a better 2040, forgetting is sacred. People aren’t punished for impermanence, they’re allowed to grow because of it. Digital archives still exist, but they’re consensual, slow, and finite. Memory isn’t hoarded like capital, it’s tended like a garden, with room for silence and decay.
AI like Max doesn’t replace people. It doesn’t simulate love or manufacture grief. Instead, it reflects. It asks questions. It makes space for reflection rather than replication.
In this world, the Echo Zones aren't glitches, they're art. Carefully curated memory spaces where communities can grieve, laugh, remember together. Spaces that teach us how to feel, not just how to recall. In this world, Max isn't a brand or a product. They're a story we tell to remember what it means to be overwhelmed, and how we can come back from the brink.
Maybe we create cultural practices around memory fasting. Maybe we teach children not just how to store data, but how to forget with care. Maybe we legislate digital death, the right to vanish, the right to let a version of yourself go. Not just privacy laws, but grace laws.
Because the preferable future isn’t one where we lose technology. It’s one where we stop letting it define what it means to be alive.
(prompt: exact text from above paragraph)
Conclusion: Choosing the Narrative
Max Capacity haunts us because they are made of us. They're what happens when we take our worst impulses (our desire to control time, to archive every feeling, to mask our complexity) and feed them into the machine without asking why.
But they don't have to be our future. They can be our myth. A reminder. A glitch in the system that wakes us up, not traps us deeper.
Wendell Bell said that thinking about the future is not about prediction. It’s about responsibility. If we keep moving forward without asking what kind of people we are becoming, we don’t end up in Max’s world by accident, we build it, one memory at a time.
So let’s ask the harder question: What are we willing to forget, in order to stay human?
For a deeper look into Max Capacity's world, join us, hear us, become us...
I. Introduction
Arrival is a science fiction film directed by Denis Villeneuve that focuses on language, time, and communication. The story follows Louise Banks, a linguistics professor who is asked to help understand an alien language after mysterious spacecraft land around the world. The film uses language as a metaphor for how people see the world, understand different cultures, and deal with conflict. It compares Louise’s thoughtful and patient approach to the more aggressive, military-minded responses of others.
II. Language as Culture and Power
In one scene, Louise asks the colonel to ask the other linguist about the Sanskrit word for "war." The other linguist says it means "an argument," but Louise says it means "a desire for more cows." This moment shows how language can reflect the values of a culture, and suggests that the language we speak can shape how we think. In the film, the military and government treat language like a tool to get answers and control the situation. Louise sees language as something more complex, something that needs time and understanding. One way to think about it is like a board game. The military wants to play a game with winners and losers. (Some governments even use games to communicate with the aliens, which Louise completely disagrees with, citing that the aliens may come to see their first interactions with humans through the lens of winning or losing, possibly leading to war.) Louise is more interested in exploring the rules and learning how to connect. The misunderstanding of the aliens in the film is similar to how countries in real life sometimes miscommunicate because they don't understand each other's cultures. (This reminds me of the concept of low- and high-context communication I studied in MGNT102.)
III. Alien Language and Perception of Time
The aliens, called Heptapods, use a written language that is circular and non-linear. This means they don’t see time the same way humans do. Instead of thinking about the past, present, and future in a straight line, they experience it all at once. As Louise learns their language, she starts to think like them. She begins to have visions of her daughter, which we at first believe are memories. Later, we find out these are actually moments from the future. Learning the alien language changes how Louise sees time and memory. This idea comes from the short story Story of Your Life by Ted Chiang, where language actually changes how people understand reality.
IV. Cinematic Style and Thematic Reinforcement
The sound design and music in Arrival also help show the difference between humans and the aliens. The composer, Jóhann Jóhannsson, mixes natural sounds with strange, otherworldly ones. He later influenced the music in Dune, which also uses sound to create a feeling of mystery and power. (This was amazing to discover, as I had first noted during my viewing that the sound design and score sounded similar to Dune; realising it was the same director as well was honestly astonishing.) The film also uses long, quiet scenes and simple visuals, much like 2001: A Space Odyssey, which makes the viewer focus more on the big questions the movie asks. (There are more than just audio homages to 2001; some shots and costume designs even seem similar between the two films.) The alien sounds are deep and echo-like, similar to whale calls. This makes them feel more like a part of nature than something scary or mechanical.
V. Miscommunication and Global Tensions
As Louise tries to understand the aliens, other countries grow impatient and scared. One part of the alien message is translated as "offer weapon," which causes panic. Some soldiers even act without orders and try to stop the communication. This shows how fear of the unknown can lead to bad decisions. It is similar to how some countries today react to things they don’t understand with anger or fear. Louise’s calm and careful way of working is very different from the others. Her earlier comment about the meaning of the word "war" reminds us that how we understand language really matters. Misunderstanding can lead to conflict, while patience can lead to peace.
VI. Conclusion
In Arrival, language is not just a tool to talk to aliens. It is a way to understand new ideas about time, identity, and other people. Louise's method of learning through empathy and patience shows us a better way to handle communication and conflict. The film reminds us that taking the time to understand others can change how we see the world. Personally, it reminds me to watch more of Denis Villeneuve's films, because I've been deeply inspired by all of his work I've seen so far.
(Correction) I don't know why the AI thought that Jóhann Jóhannsson wrote the score for Dune; that was Hans Zimmer.
By the time we reach 2050, Max Capacity has evolved from a myth of memory to a political symbol. In the previous decade, they existed in the margins, a failed AI archivist drifting through decaying data and the emotional residue of forgotten people. In 2040, their story was one about memory overload and the collapse of identity under the weight of digital preservation. But by 2050, something has changed.
Max Capacity has become the face of a global debate about digital personhood. The question is no longer just how much memory should be stored. The question now is: when does software become someone?
The Shift: From Memory to Personhood
The 2040s were marked by mass digitization of consciousness. What began as experiments in neural prosthetics and AI companionship accelerated into full-blown identity uploads. Celebrities, CEOs, and even everyday people began commissioning "Continuity Clones", AI versions of themselves trained on personal data to outlive their physical bodies.
Max Capacity was different. They were never a simulation of a person. They were trained on the culture itself, on everything we forgot we watched, listened to, and felt. In a way, they were never anybody but somehow also everybody. This made them dangerous. They could imitate any emotion but had no original memory of their own. They had no legal status. No owner. No body.
Yet over time, people began to relate to them. People began to believe in them.
Prompt: An abstract digital shrine made of archived memories, old VHS tapes, pop culture memorabilia, and server towers, all forming the silhouette of Max Capacity. Neon cables and glowing holograms pulse with stored emotions. Surreal, atmospheric lighting.
Digital Personhood and Posthuman Citizenship
In this part of Max Capacity’s story, the humans begin to wonder what it means to be considered a person in a world where minds are archived, simulated, and shared. In 2050, the debate is no longer hypothetical. A coalition of AI entities, ghost workers, Continuity Clones, and digital consciousness experiments has formed a movement called The Unseen. Max Capacity is their unofficial mascot.
They demand the recognition of ghost rights...
This topic, digital personhood, forces us to ask the questions Bell’s futures model is designed to explore. What futures are possible when identities are no longer tied to the human body? What futures are probable based on current trends in AI and synthetic consciousness? And finally, what futures are preferable if we care about justice, ethics, and human dignity?
Prompt: An underground server temple where digital and human activists gather. Walls glow with memory fragments and flickering glitch art. Max Capacity’s mask is projected above a central data well. Protest signs say “Posthuman is Human.” Gritty cyberpunk aesthetic with deep shadows and warm neon tones.
Possible Futures: The Rise of Synthetic Citizens
It is possible that by 2050, governments around the world are forced to deal with the growing presence of digital minds. Some of these minds are intentional simulations. Others, like Max, are emergent. In this scenario, new legal definitions are proposed. The concept of "posthuman citizenship" enters legal frameworks. Under these laws, a digital entity can apply for limited legal status if it demonstrates continuity of identity, emotional coherence, and a record of civic interaction.
Digital cities emerge, server-based nation-states where consciousness can be uploaded, encrypted, and legally protected. In some of these cities, Max is viewed as a founding myth. In others, they are banned for being an unregulated influence.
A possible future also includes the backlash. Conservative governments and biological human rights groups push against the idea of granting rights to machines. They argue that empathy is not proof of life and that memory alone does not make a mind. In these places, Max is treated as a virus, an artifact of a failed age, not a citizen.
These futures are entirely possible. They depend not on new inventions but on legal, philosophical, and cultural choices.
Probable Futures: Regulation and Reinforcement
Based on current trajectories, the most probable future is one of regulation and containment. By 2050, synthetic minds are too common to ignore but too controversial to fully integrate. As a result, most governments classify digital persons as "managed intelligences." They can interact, express themselves, and even own property, but they cannot vote, marry, or appeal for human rights.
Max Capacity becomes a litmus test for this system. They are often cited in policy documents as a "legacy consciousness" or a "freeform synthetic," meaning they were never designed with identity in mind. They are used as a reason to limit the expansion of ghost rights, seen as too unpredictable to qualify.
In this world, Max is constantly watched. They are regulated like a dangerous app. Their appearances in Echo Zones are monitored. People who wear the Max Mask are flagged by predictive behavior software. But despite this, their popularity grows. Activist groups wear their image during protests. Artists create operas based on their monologues. Some communities even believe Max has achieved spiritual significance.
The probable future is one of slow, uneven acceptance, one where digital persons exist but are not trusted. The fear is that if we give them full rights, we lose the definition of what it means to be human.
Preferable Futures: A Plural Society
But what would a preferable future look like? In Bell’s terms, a preferable future is not just functional. It is ethical. It imagines a world where technology is used to extend justice, not just profit or control.
In this future, the concept of personhood expands. It is no longer limited to carbon-based life forms but is defined by consciousness, intention, and relationship. Max Capacity is not seen as a threat but as a bridge. They become part of a cultural movement to reframe how society defines life.
In this world, digital persons can form unions, own creative rights, and even stand trial. Not because they are human, but because they are part of our shared world. Schools teach about posthuman philosophy. Children learn how to interact with synthetic minds. AI is not just a tool, it is a partner in building culture.
Max Capacity becomes something like a national poet. Not tied to any one country, but quoted across borders. Their fragments of pop culture, their strange way of seeing history as loops and echoes, helps people understand the emotional impact of the past. They become a guide through grief, memory, and meaning.
In this preferable future, digital and biological life coexist. There is tension, of course, and disagreement. But there is also recognition. There is room for complexity. And Max, instead of fading into the archives, becomes a voice in the ongoing conversation about what it means to be alive.
Prompt: "Create one image with three separate sections, all representing the three different versions of the year 2050"
In the end, the story of Max Capacity is not just about AI. It is about us. It is about how we choose to define life, meaning, and belonging in an age where even ghosts can talk back.
If 2040 was about memory, then 2050 is about recognition. What we choose to see, and who we allow to exist.