Diamonds and Rust is a column that explores the good and the bad of any given topic or issue. The aim is to piece apart the "diamonds" from the "rust". This is especially important when the two seem intertwined. Although the topics range from race relations to music relevancy, Diamonds and Rust always provides a full perspective on which conversation can be continued.
Joel Herbert is a junior at Boston University studying neuroscience and philosophy. When he isn't reading or writing, Joel enjoys running, exploring the city of Boston, and socializing with friends. Similar to his article, Joel is a scattered human being. However, he hopes that you take some time from your life to adventure through the topics below, and go on your own adventure with the ideas and themes mentioned. Happy reading.
(first published 25 Feb 2021)
Freedom is a good thing, right?
We all want the freedom to be someone desirable, or the freedom to break away from something unpleasant. Freedom allows us to explore our lives on our own accord and pave our own paths.
Besides, both the U.S. government and American culture were founded on Enlightenment values of freedom. The freedom our society gives us to be independent is necessary for further enlightenment.
“Enlightenment is mankind’s exit from its self-incurred immaturity,” Enlightenment philosopher Immanuel Kant wrote.
Freedom, however, is a more complicated concept than we have let on. Without even considering what truly counts as freedom — or better yet, who is truly free — we are confronted with some hard truths concerning freedom.
Are those Enlightenment values that are ingrained so deeply in our society beneficial to us, or have we made a mistake?
The first and most glaring issue with freedom is its inescapability. Whether we like it or not, we are ultimately free to choose our actions. We make every decision, think our own thoughts and subsequently act on said thoughts. Outside factors may influence us, but we are ultimately responsible for every action we take.
“Man is condemned to be free; because once thrown into the world, he is responsible for everything he does,” wrote French philosopher Jean-Paul Sartre.
Now, you might be reminded of a certain Alanis Morissette song when you hear this. It’s rather ironic we are not free to choose to be free. Instead, freedom is required of us. This means the integrity of freedom — its self-sustaining value — is questionable at best and downright broken at worst.
And it gets worse.
Since we are condemned to be free, we are also condemned to be responsible for the consequences of our actions. But in a world where good and bad are often more subjective than not, and where we are largely incapable of preventing evil, we are also responsible for suffering, anguish and pain.
This is a complicated concept, but it can basically be summed up as such: Freedom allows us to make good and ethical decisions, but it also allows us to make bad ones. However, the concept of right and wrong is nuanced, so the decisions we make — and are responsible for — are usually both correct and incorrect.
The resulting effects, whether good or bad, must weigh on our conscience.
That’s a harsh reality to live with, and it makes freedom seem much less desirable. Some of us may find it significantly easier to absolve ourselves of choice and give in to being cogs in a machine. Some may even go so far as to say the Enlightenment and its values were a mistake that negatively impacts our society today.
But let’s take a step back from all the philosophy and consider freedom’s impact. Freedom may be complicated, and it may be harsh, but does it make us happy and whole?
Remarkably, the answer is a resounding yes.
First, let’s consider freedom solely in terms of economic freedom. It has been demonstrated that economic freedom largely correlates with increased levels of happiness. This is because economic freedom allows individuals to create their own wealth in the way they see fit, giving them both purpose and sustenance.
Moreover, freedom and happiness on a personal level have also been shown to have a high correlation, and perhaps even causation. In a psychology experiment performed in 1976, nursing home residents who were given the freedom to choose which night would be “movie night” — as well as the freedom to tend to plants — proved to be more alert and consistently in better moods.
But personal freedom isn’t all sunshine and rainbows. The dreary philosophers have a point: Freedom can make us anxious. It can impact us all differently, and it may even exacerbate inequality.
Freedom, however, is a good thing. It gives us the ability to create meaning for ourselves, and it allows us to be the unique individuals we all are. Without freedom, life may be easier and less stressful, but it would also be incredibly dull.
Freedom, in all of its nuance, allows us to be us. And that will always be good.
(first published 17 Feb 2021)
Some pieces of art never age. Like fine wine, they get better and better as the years go by, never quite reaching a point when they’re considered out of date.
While listening to my music on shuffle the other day, I had a realization: Last year was tumultuous, and 2021 may bring about even more change. Social rebellion and quests for moral truth remain abundant — from small movements to nationwide protests.
For that reason, there is no song that has aged as gracefully as Bob Dylan’s “The Times They Are a-Changin’.”
The song, written in 1963, takes us through the motions of change. Designed to be an anthem for change, it speaks to us just as strongly now as it did nearly six decades ago.
For this week’s article, I decided to revisit lyrics from each verse to try and find contemporary answers to the question of how, when and why to protest.
On that note, come gather ‘round and find what guidance we can pull from an ageless anthem.
“Admit that the waters around you have grown.”
It is often difficult to see the issues of a power structure when you are the one benefitting from it. Ultimately, this blindness leads to the unfair treatment of the people that structure is supposed to serve. Then, if the imbalance of justice reaches a breaking point, people begin to protest.
As long as there is injustice, there will be protests.
And as long as there is injustice, this is a good thing. Protests usually work to bring inequities to light and impact public opinion. Admitting and acknowledging we wade in the waters of injustice does, in some way or another, bring positive change.
How that works — or why it doesn’t in certain situations — is, however, a completely different story.
“Don’t speak too soon, for the wheel’s still in spin.”
Protests don’t work in the timeline that we assume they do. It is not the momentary call to action, nor the takeover of a street or building, that creates change. It’s rather the constant push for better policy and behavior and the one-too-many mornings spent prepping for one more event, one more conversation, one more day — with the constant hope that the future will be brighter.
Being a protestor and activist is a thankless job. Often, you don’t get to see the fruits of your labor, only others’ resistance to change.
But that’s how it works. The wheel moves, slowly but surely, and as long as the circle stays unbroken, progress is soon rolled in. The change is slowly made within ourselves and in every fabric of society, which means activism can’t just happen overnight.
We must take heed of this. Our work is not done after one social media post or even one rally. As long as a resistance effort is in the public eye, there will be pushback. We cannot celebrate an individual victory as if we have won the war.
Protestors must also be aware of the responsibility they carry. A successful protest is not one that coerces people into a belief that isn’t their own, but rather one that informs and allows people to come to the right conclusion themselves. Open conversation is imperative, and protests are not a one-way street. We must seek to find common ground with those we don’t agree with.
“He that gets hurt will be he who has stalled.”
The greatest myth we’ve been fed as citizens is that society is stagnant. We are taught there is a status quo that is either barely shifted or steadfastly maintained.
Instead, we must think of our society as an ever-growing, ever-changing union. No day is ever the same because progress toward a better tomorrow starts with a different today. In other words, change is inevitable.
As the ancient Greek philosopher Heraclitus said, “There is nothing permanent except change.”
We all have a civic duty to listen to agents of change — such as protestors — and approach them with an open mind. Common ground is not reached by standing firm in place, nor is it always found at a perfect halfway point. Each issue has its own spot where we can connect, and it is on the rest of us to find that spot as best we can.
Those who remain unmoved will be those who are left behind tomorrow.
This does not mean you must agree with any and all protests you engage with. You are allowed to be, and should be, skeptical. However, you must be open to conversation, even when it makes you uncomfortable.
We all must be actively involved for change to happen.
The fourth verse of the song says, “Your sons and your daughters are beyond your command.”
Unfortunately, change is difficult. Not only does change take us out of our comfort zone, but it also forces us to reckon with our morals and identity. For some who grew up fostering their own kind of change — such as in the civil rights movement — it may also be difficult to see younger generations attempting to change what they thought they had already fixed.
That is just the way it goes. What we — or God — deem to be morally right changes and shifts as our moral landscape develops. What was once the moral action, or at least permissible, is now immoral, and vice versa.
The job of a protester is to shake foundations and signal those shifts. But all sides must take accountability in this fight.
Protesters must understand that the change they fight for may make some people uncomfortable or be taken as a personal attack. If you want to create real change, though, you must steer clear of ad hominem arguments and be steadfast in your commitment to positive change.
At the same time, those who are not protesting must understand the great pillars of society often result from protests. Even if you are against a protested issue, you must keep your ears open and try to identify what biases could be preventing you from listening.
“The present now will later be past.”
What we do today determines who we become tomorrow. The values we fight for, as well as the struggles we overcome, represent who we are as a society and as a nation. Progress is slow but inevitable.
Fifty-eight years ago, Bob Dylan wrote his protest anthem. Now, and forevermore, the times are a-changin’.
(first published 9 Feb 2021)
Is college really worth it?
That’s a perennial question and one that tends to linger in the minds of students — who pay huge sums of money to get scolded for forgetting to do the assigned reading — and their parents.
For the most part, the answer to that question is a resounding yes. College remains an essential part of a successful life for a majority of people.
The pandemic, however, has changed college life immensely. Traditional college staples — such as in-person club meetings or late-night study sessions in the library — are virtually non-existent. What college was, even just a year or two ago, has completely changed, and it may not return to its former self for a couple of years still.
Does this mean COVID-19 has rendered the college experience useless, at least for the foreseeable future?
To answer this question — or at least have a better understanding of it — we must first look at everything through the lens of a college or university. Many colleges were struggling with enrollment even before the pandemic, but COVID-19 has created some insurmountable problems, especially for the small, liberal arts colleges.
And although a fully online education is not what students are paying for, some colleges may not be able to reduce the tuition due to their own financial hardships.
This does not mean, though, we as students have a responsibility to pay full tuition just to keep certain institutions afloat. The only responsibility we have is to own our individual desire for education.
But unfortunately, learning is difficult right now.
Online classes pose an array of challenges that disproportionately affect underprivileged students. For starters, it is significantly easier to get distracted in virtual classes than it is in traditional classes.
You can try as hard as you want, but the desire to check your email or the latest news is constant while taking a class on your computer, and it’s harder to create an environment with minimal distraction.
Professors also have to deal with remote learning complications. While some are teaching online and others in-person or in a hybrid format, all professors are expected to quickly adapt to different teaching methods and class structures.
Having to accommodate various situations and students — especially when teaching in a hybrid format — has caused severe burnout for educators. Juggling multiple duties and new platforms can serve as yet another barrier to teacher-student communication and an effective education.
On top of all of this is the difficulty of completing assignments online. Project-based assignments are nearly impossible, so some students have seen an increased number of writing assignments. This gives students who are better at writing a clear advantage, and it also caters heavily to those with stable access to technology.
School is stressful enough as is, but these new obstacles are incredibly difficult and can negatively impact students’ mental health. Although each student has had a different experience, most will tell you the past year has been extremely difficult academically.
Student anxiety and depression have risen dramatically in light of the pandemic. Suicidal ideation has also increased. About one in four people aged 18 to 24 said they had considered suicide in the last 30 days, according to a Centers for Disease Control and Prevention study from June.
College is currently high-risk and low-reward, given the mental health and learning struggles. So, it would seem college is currently not that worth it. And yet, we’re all still here.
Does that mean we’re dumb or insane? Or does it mean that a college education during the pandemic provides a different kind of value?
There isn’t really a right or wrong answer — we students could be oblivious idiots who just so happen to learn incredibly valuable life lessons along the way. However, an education during the COVID-19 pandemic counts for something. In fact, it may be greater than its usual value.
One of the most important skills you pick up from an undergraduate education is the ability to get work done under pressure. Even if you don’t use the content you learn, you will carry with you the strength, determination and time-management skills.
Pushing through adversity is a staple of being a college student. Therefore, if we can find a way to prevail through college with the added adversity right now, we may learn even more from the experience than we would have originally.
The value of our education may not be the literal material itself — at least for right now — but instead the ability to endure.
College is not worth what it once was. Its value has completely changed. However, that does not mean college is worth less, or even worth nothing.
A non-traditional approach to higher education is much needed right now. If we as students take control of our education and learn from this adversity, we may find our college pandemic experience is worth more than it appears.
(first published 3 Feb 2021)
Nostalgia is a fickle thing.
We’re all prone to reminisce on our past glories and even look at awkward or uncomfortable memories in a positive light. Every memory is rose-tinted and every personal story comes with a warm, fuzzy feeling.
Yet at times, we yearn for the past in a deep and painful way. We long for when life wasn’t as complicated and when we felt more free than we do now.
Nostalgia shapes how we view our past, but at its worst, it traps us in our own memories.
For that reason, we must keep nostalgia in a Goldilocks zone — not too caught up in the past, not too focused on the present and future, but just nostalgic enough. However, this is much easier said than done.
So, we must learn the history behind nostalgia to understand how nostalgic we should be.
The word “nostalgia” actually comes from the Greek words “nostos,” meaning homecoming, and “algos,” meaning pain. The term was first coined in the 17th century by Swiss physician Johannes Hofer, who used it as a mental disorder diagnosis for soldiers who were so homesick they were in pain.
Nostalgia was looked down on for centuries after. And there may be good reason for that.
If one longs for the past too much, they can fall into the traps of depression. This type of depression can result in a grass-is-greener mindset, where nothing is ever good enough because it can’t live up to romanticized memories.
Nostalgia can also fuel political campaigns that promise a return to the glory days of old. This is dangerous for two reasons. First, the appeal to nostalgia is a fantasy — an appeal to emotion and not actually an appeal to a real past.
Second, and arguably more importantly, is that most campaigns fueled by nostalgia do not have concrete plans to return to the past, but rather use it as a tactic to manipulate voters.
A great example of nostalgia in politics was the 2016 campaign run by former President Donald Trump. “Make America Great Again” is quite literally an appeal to a “great” past, and many have pointed out that this glorification of the country’s past comes with its own dangers.
It comes as no surprise, then, that many believe nostalgia is a negative thing. However, nostalgia cannot be simply summed up by its worst parts. We must also acknowledge its positive parts.
Despite the pain, nostalgia helps us remember what’s important. Looking back on meaningful times, people and places tells us how our past still affects us and can ground our sense of identity.
Nostalgia also reminds us of our relationships, which can be extremely important at times when socializing is close to impossible. When we cannot see our loved ones, nostalgia allows us to revisit times when we could.
“Sentimental recollections often include loved ones, which can remind us of a social web that extends across people—and across time,” according to an article from Scientific American.
Nostalgia’s good and bad usually exist simultaneously — the bittersweet pain of homecoming.
The Goldilocks zone for nostalgia — maximizing the happiness and minimizing the pain — is dependent not on any objective truth about nostalgia, but rather on the experience itself.
If we are feeling down because of the past, then we need to live more in the moment. However, if we experience joy and happiness from our memories, then we should continue to let it give us hope and meaning.
Nostalgia may be fickle, but we can all benefit from it if we so choose.
(first published 26 Jan 2021)
Spring semester is starting after an unusually long break, and because I’m a nerd, I’ve looked forward to classes. Being productive is in itself satisfying, but it also offers a much-needed distraction from the pandemic.
Usually, everyone dreads having to work, and I am no exception. However, after a month with nothing to do, I find myself smiling at the prospect of putting pen to paper once again.
Yet this shift in attitude leaves me wondering: if productivity is so welcome now, how come it isn’t when we have time to kill? Should free time be spent being productive, or is it important to build up that desire to work by slouching around and doing nothing?
As enjoyable as it is to go full-on couch potato, productive downtime can improve your well-being, according to author James Wallman in an article with CNBC Make It. Although it may be more instantly gratifying to watch TV than to exercise, the latter will make you happier in the long run — pun intended.
Spending your free time wisely can also improve your professional life. Finding enjoyable hobbies that can foster important career skills — such as reading or volunteering — will help kill two birds with one stone.
Free time doesn’t have to be task-oriented to be productive, either. Simply spending that time with friends and family can be incredibly rewarding.
“Socializing not only staves off feelings of loneliness, but also it helps sharpen memory and cognitive skills, increases your sense of happiness and well-being, and may even help you live longer,” according to a Mayo Clinic article.
Along with the health benefits, using your breaks to socialize can also help you network with others.
However, there is a catch. When play becomes work, it also becomes easier for you to burn out.
Relaxation is key for continued success, and a calm, clear mind allows you to tackle stressful assignments. Constantly working prevents you from getting that chance to relax — after a while, it can be draining and impact your effectiveness.
Being too productive can actually make you less productive.
The ongoing pandemic presents us with even more difficulties. Productive uses of free time have now become dangerous or impossible. For example, socializing is hindered by curfews, social distancing and masks — and seeing too many people could quite literally be deadly.
Under the constant pressure of a pandemic, mind-numbing activities have also become crucial for our sanity. Staying mentally healthy has become more and more of a challenge, and forcing ourselves to be productive can be a chore.
It seems as if we aren’t any closer to an answer on how we should spend our downtime. Being productive is great, but so is binging an entire TV series in one day.
Is there really a correct way to spend your off-days — one that maximizes your mental health for the workdays ahead?
There is a rather simple answer: do the things that make you happy.
Research has shown success leads to happiness. However, it has also shown the opposite can be true — happiness leads to success. When we are happy, we tend to do better and be better.
But if our happiness is dependent on our success, and our success is dependent on our happiness, how can we ever become either?
This is where free time becomes so important. When we spend our spare time wisely — either doing things we love or just unplugging — we can improve our happiness and overall quality of life.
If reading makes you happy, do that. But if you can’t get past two pages without falling asleep, then try another activity, such as listening to podcasts.
We can all tailor our free time to our individual preferences and needs to become happy, fulfilled and successful.
(first published 8 Dec 2020)
2020 has been rough.
A pandemic, wildfires and hurricanes. Murder hornets, police brutality and protests. Locust swarms in East Africa and an explosion that virtually destroyed the port of Beirut. These are just a few examples of the wrath of 2020. Oh, and for good measure, one of the most consequential presidential elections in the history of the United States.
We’re all well-acquainted with the reasons for 2020’s terribleness, and we should be thankful we’ve made it this far.
But how bad — objectively — has 2020 really been? It seems that right around this time every year is when we always talk about how awful the year has been and how we long for a return to the past. Nostalgia is a powerful and elusive drug, so how can we be sure our hatred for 2020 stems from actual reason and not just human fallacy?
Spoiler alert: we can’t. We can’t simply rate the “badness” of 2020 on an objective, universal scale, and we certainly can’t rate years on their worst moments. At the same time, we can’t fully discount our hatred for this seemingly god-awful year.
We can, however, try to understand what it is about 2020 that seems so inherently terrible, and how this compares to previous years.
First and foremost, the obvious negative quality ascribed to 2020 is the pandemic. COVID-19 has bulldozed its way through the year, and it seems to only get worse as time goes on. We all have our own experiences with the pandemic, but it is evident its impact has been almost entirely negative.
However, to say this is the reason for 2020 being the worst year on record is to forget previous pandemics. COVID-19, although awful, is not unique. The Spanish Flu killed an estimated 50 million people worldwide from 1918 to 1919, and the Black Death killed more than 20 million people in Europe — nearly one-third of the continent’s population at the time — during the mid-1300s.
From a simple utilitarian perspective, COVID-19 has not had as terrible an impact as these two pandemics have. We are incredibly fortunate to now have modern medicine and at least a relative trust in science and technology.
For that reason, the pandemic actually helps explain why 2020 is not the worst year on record, even as it explains why the year ranks high on the list.
The rough parts of 2020 cannot be branded as entirely negative.
For example, many would argue that one of the worst qualities of this year has been the continued display of racism and police brutality. However, because of this, 2020 has also seen some of the largest protests and movements demanding change.
This is not meant to undermine the evil we have witnessed this year. Instead, I want to emphasize that we cannot define things as simply good or simply bad.
At the same time, nostalgia can be incredibly powerful. It may be that we are correct in treating 2020 the way we do, but incorrect in seeing past years as glorious and stripped of their own worst qualities.
Research from Carey Morewedge, a Boston University professor of marketing, has shown that Western culture tends to interpret current events negatively while viewing past events positively.
The truth is this: no year is all good or all bad, or really even comparable to another year.
We will always experience loss, but we will also experience the joy of bringing life into this world. We will have our hearts broken, and we will fall in love. We will laugh, cry, smile, sing, dance, fail, fear, succeed, doubt and trust — and we will do this year in and year out.
Our experiences as human beings can’t be simply explained by how good or how bad a year was, and they never could be. Sure, outside influences may impact us, and may do so to an extreme extent. But we can’t expect everyone to have the same reaction, nor can we expect those influences to be reserved to one year alone.
2020 has been a rollercoaster, to say the least, but the most important experiences we have can’t be defined by a simple number.
(first published 3 Dec 2020)
Whenever my self-doubt overpowers my confidence, I try to remember a quote that is commonly attributed to the philosopher Confucius: “The man who says he can, and the man who says he cannot are both correct.”
Although simple, this quote has given me the strength to tackle everything from chemistry tests to track meets. It’s a testament to the power each of us holds. We can achieve anything as long as we choose to do so.
But this ambitious quote has a dark side.
To say that you “can” is to aim for future success. You acknowledge that what you have, or what you are, constantly requires improvement.
To say that you “cannot,” on the other hand, is to reject any improvement. It promotes the idea that you cannot grow and are as good as you will ever be.
Neither extreme is healthy, but where does the line of moderation lie? Can we learn to balance our ambition with acceptance, and vice versa?
To understand how we can best marry ambition and acceptance, we must first understand their importance as standalone values.
Ambition is central to American culture, and for good reason. Hard work has been proven, time and time again, to be the key to success. But, hard work alone doesn’t cut it — ambition provides direction to the work, making it impactful and important.
Grit and perseverance, although necessary for success, mean nothing without a specific goal in mind. Ambition provides that aim, creating a destination to work toward. In fact, the agenda ambition sets is what fuels grit.
Success is not the only advantage of ambition — it also keeps us mentally stable and happy. Many people find fulfillment in their work and goals. Conversely, a major characteristic of depression is a lack of ambition.
Ambition has its drawbacks, though. When we aspire to unrealistic, lofty goals and don’t reach them, it can adversely affect our mental health. The harrowing reality of failure — especially as an overachiever — is hard to face and can discourage us from future attempts.
Even worse, overambition can lead to a fixation on the future and an inability to enjoy the present. Constantly striving toward a goal can be taxing, so when we forget to enjoy ourselves, we risk burning out.
To combat overexerting ourselves, we can turn to acceptance.
Acceptance is an acknowledgment of how situations are, not an attempt to change them. Although seemingly bleak, acceptance teaches us to be happy with what we have instead of relying on an elusive future.
Interestingly, acceptance increases productivity. Ambition can be overwhelming and create endless goals that are always out of reach. Acceptance, on the other hand, allows us to put ourselves before our careers and work on what we’re passionate about, rather than what might be traditionally successful.
The freedom acceptance provides can also impact our individual wellness. One Buddhist teaching says: “The secret of health for both mind and body is not to mourn for the past, nor to worry about the future, but to live the present moment wisely and earnestly.”
Meditation is a practice of acceptance, and the positive physical and mental effects of it are well known.
Unfortunately, acceptance comes with its own crop of issues. For starters, new research shows that meditation can have some surprisingly negative effects, such as hallucinations and headaches.
Furthermore, acceptance can quickly become complacency in the fight against injustice and inequality. To live with the current state of the world is to stay silent about society’s wrongdoings, which poses an obvious problem if we hope to continue progressing socially.
Here, we are left with a much more nuanced understanding of acceptance and ambition. Luckily, there may be a way to incorporate both of them in our lives.
If we practice a form of ambitious gratitude, we may be able to have our cake and eat it, too.
Ambitious gratitude can be individualized depending on your personal preferences. However, the basic idea is this: we can recognize imperfections, strive for improvement and be thankful for the goodness that already exists.
Wanting better and enjoying what we have are not mutually exclusive. Often, the two can complement each other. For example, in terms of academics, we can work toward taking hard classes and earning good grades while being thankful for what we have accomplished.
Ambitious gratitude is a promise to the future, based on the relative goodness of the present.
(first published 16 Nov 2020)
Immigration has always been a hot-button issue, but no one can deny the profound impact immigrants have had on this country. The United States was founded on the ideal that anyone with a strong enough work ethic can make it here. Though we have sometimes strayed from this message, we can still thank our lucky stars for those who have immigrated here.
Yet, we must wonder if immigrants are thankful for us. Many have found safety and other forms of security here, which is obviously a positive, but what about culture? What about a home and an identity?
The challenge of finding, and then sticking to, one’s identity is made 10 times more difficult for immigrants, who must grapple with the added burden of assimilation.
So, let’s look at what the U.S. does well in terms of assimilation, and what needs improvement.
As the land of opportunity, the U.S. understandably provides greater means for immigrants to move upward socially. Despite the growing wage and social mobility gap, immigrants have actually been able to maintain a strong position in the world of entrepreneurship. The opportunity for immigrants to own and run businesses allows for continued socio-economic growth, even in the face of increasing inequality.
Business ownership also contributes to a greater sense of autonomy, which lets immigrants embrace the notion that they — and only they — have the power to change their trajectory.
However, the success of most immigrants is defined not solely by their willingness to work, but also by their willingness to sacrifice.
An obvious element in the immigrant story is sacrifice. Packing up and leaving your homeland in order to have a small shot at better opportunities requires giving up a lot in your life.
However, the real sacrifice, which is arguably the most difficult, comes after one has immigrated into the country. Upon arriving, immigrants must choose how they adapt to U.S. culture and lifestyle, and what parts of their own culture they want to keep.
Unfortunately, the amount one sacrifices to adapt to the American lifestyle is directly related to the amount of success one encounters.
As Peter Skerry, a political science professor at Boston College, wrote in a Brookings Institution article, immigrants must abide by three unwritten rules to find success. First, English must be accepted as the national language. Second, they must subscribe to a Protestant work ethic, which is defined by its focus on individualism and self-reliance. Finally, they must believe in their American identity and follow egalitarian principles.
While it’s not necessarily immoral to emphasize certain aspects of American culture, requiring immigrants to give up their own identity for a shot at success is wrong, and actually goes against traditional American ideals.
Assimilation puts an undue burden on immigrants by requiring them to be almost superhuman in their willingness to sacrifice themselves for something greater. However, it does not have to be this way.
In a perfect world, we want a system of immigration that allows immigrants to establish themselves in the U.S. while also maintaining whatever level of individuality they want. Luckily, if we stick to a system of participation, rather than assimilation, then this perfect world can easily become reality.
Participation-based immigration, sometimes referred to as integration, is defined by its effort to incorporate individuals into a society as equals. Conversely, assimilation is more of a forceful requirement for individuals to become part of a society. The difference here is subtle, but it is important.
While assimilation makes individuals blend into a society, integration asks society to adapt to its individuals.
An integrative approach to immigration would make way for individualization, which favors equity over equality. Integration would allow those whose cultures are significantly different from ours to learn such things as the English language and typical American customs — all with the goal of providing a way for them to participate in American society — while letting those whose cultures are more or less similar flourish sooner.
Participation-based immigration also gives Americans the chance to have a greater cultural understanding by increasing the amount of visible diversity in nearly any given setting. The lack of pressure to assimilate would allow immigrants to hold on to the parts of their home culture they admire, which would in turn expose the U.S. to different cultures.
If we can foster participation over assimilation, then we may be able to provide immigrants a system that prioritizes individuality over commonality and allows them to fully realize their potential as individuals and as Americans.
(first published 9 Nov 2020)
This past week has been one of the most stressful weeks of the year, and that’s saying something for 2020. The lack of certainty behind the presidential election seemed at times unbearable, and I know I’m not alone when I say I spent an ungodly amount of time tuned into CNN. So, for that reason, let’s talk about something a little lighter.
Let’s talk about death.
Now, that’s not exactly an encouraging introduction. I recognize this is a terrible time for an existential crisis, but that might actually be the reason why we should talk about death right now. Death, by nature, is difficult to handle and even harder to approach in conversation.
This sentiment is completely normal. Our sense of fear evolved in order to keep us from death, so fearing death itself is quite literally the most advantageous adaptation.
Similarly, avoiding conversations about death is also advantageous, because those who routinely contemplate a subject tend to be the ones most associated with the topic. The combination of fearing the act of dying and avoiding any conversation about dying provides us with the best chances to continue living.
However, this idea also fosters an intense and never-ending panic over our mortality, and it can seriously impede us from living a fruitful life. After all, a long life means nothing if not a good one.
Let’s try to detach ourselves from the emotional overcast of thinking about death and just approach it as is, if only to enhance our emotional understanding of life.
When we start to unpack the details of death, we come to a surprising conclusion rather quickly: death really isn’t that bad.
What we arguably fear the most about death is the disappearance of our consciousness, but this fear is itself a conscious thought. The permanent loss of consciousness is actually rather meaningless, because as soon as we lose our conscious self, we lose our capacity to be aware of its loss.
So, at the moment of our death, when our biggest fear becomes reality and we lose our thinking self, we neither feel any fear nor do we even think about it.
This also goes for any other fear we may have about the sensation of death, including the fear of emotional or physical pain. At the moment of death, all our sensations cease to exist, and thus any painful process we could imagine would in fact feel like, well, nothing.
Additionally, we have no clear evidence the process of dying is any more or less painful than anything we’ve already lived through.
As psychiatrist Ralph Lewis writes, “rationally and from a medical point of view, there is no particular reason to suppose that the intensity of pain … from various causes of death is greater than the intensity of pain from various illnesses and injuries that we ourselves may already have previously experienced.”
Now, I need to make it clear this obviously is not meant to encourage us to seek a premature death. Please don’t do that.
I also don’t mean to downplay the agony of dealing with a loved one’s death. A solipsistic view — meaning we see ourselves as the only thing to exist — of our own mortality is one thing, but in many cases the death of someone close to us is actually much harder to process than dying ourselves.
Instead, I mean to approach our worry about death head-on. Death anxiety is believed to be at the core of mental illnesses such as panic and depressive disorders, according to research published by the National Library of Medicine, and even a mild fear of death can grow to be something much larger given our inability to hold a conversation about it.
Despite the natural inclination we have to avoid death — that’s a good thing, by the way — we can be so inclined to do so that we forget we’re alive.
Fearing death as an almighty, terrible being can cause incredible stress and sadness, but it can also prevent us from giving our all to the life we still have. We may want to avoid death itself, but to fear it is to focus on it, and that would defeat whatever purpose of life there is.
So, instead of fearing a future process, which is in fact not as bad as it may seem to be, we should glean all that we can from the current process of life.
And on that note, please enjoy this video of tortoises eating tiny pancakes.
(first published 3 Nov 2020)
The United States was founded — at least in theory — on the principle that every individual has equal worth. We are no more or less than our personal desire to conquer each opportunity with as much might as we can muster.
So, if that’s the case, why, then, do we have an Electoral College, which enables only 538 people to vote for the president?
The Electoral College has caused confusion since at least 1824, when John Quincy Adams won the presidency through a contingent election in the House of Representatives despite losing both the popular and electoral vote.
In fact, “misfire” elections such as this, in which a candidate wins the presidential election despite losing the popular vote, have happened six times before: 1824, 1876, 1888, 1960, 2000 and 2016.
Even when an election does not turn out to be a misfire, the Electoral College still prevents individuals from directly choosing the president. This is quite obviously not democracy, and measures to abolish — or at least amend — the Electoral College have been proposed by both Democrats and Republicans.
Why, then, do we still hold on to it?
Although the principle of every vote counting is intuitively enticing, the Electoral College does help to preserve our democracy in a way.
The U.S. has always been defined by its constant struggle of balancing federal and state power, but the Electoral College prevents any state-by-state confusion by centralizing voting. This centralized system allows for a quicker, more definitive result, whereas a different system with the current state-by-state legal variations in voting would cause confusion and inefficiency.
Another benefit of the Electoral College is its ability to protect against what is referred to as the tyranny of the majority — a situation in which the rights of minority groups are overlooked and forgotten by the majority.
Our Founding Fathers feared the possibility of a tyrannical government forming if one majority were to mobilize against a minority group, or groups, so they implemented several measures of checks and balances to prevent this. The Electoral College was established as one of those measures.
Supporters of the Electoral College argue it protects the rights of those living in smaller, less-populated states. Without it, they say, presidential candidates would not care about farming in Iowa or the opioid crisis in New Hampshire.
But is this really true, and do these issues matter more than, say, forest fires in California or gun violence in New York City?
The way the Electoral College is currently set up helps protect the minority rights of people living in rural areas, but does nothing to help minority groups who lack a local majority anywhere gain rights that have been denied them. In fact, the current division of voting power has actually been used to further oppress minority groups seeking equality, such as Black Americans during the civil rights movement.
In giving a stronger voice to the less-populated states, the Electoral College subsequently denies cries for help from densely populated areas. At the same time, it gives only a small boost to less-populated states and does nothing to ensure the elected candidates will continue to give attention to these states following the election.
Public support for the Electoral College is also quickly diminishing.
Sixty-one percent of Americans support a national popular vote, with 23 percent of Republicans and 89 percent of Democrats supporting the measure, according to a September Gallup poll.
We have documented proof that a national popular vote can be problematic, but it would at least be in line with what the majority of Americans want.
What we should strive for is a voting system that can simultaneously satisfy the needs of our country as a whole and as individual parts. The Electoral College in its current form does not function in this way, but we must also be wary of forgetting the benefits it provides.
The truth is this: any form of voting, like any form of government, is imperfect. Our current voting institution harbors many issues, but even if we were to implement a new and improved system, some issues would remain and others would arise.
All in all, the primary focus should be ensuring every voice carries an equitable weight through our voting system.
(first published 29 Oct 2020)
Have you ever been in a debate with someone who disagrees with you, only to leave and realize that, despite saying a lot, your respective views remain unchanged?
A debate is one of four types of conversations, according to conflict consultant David Angel, who defines it as a competitive two-way conversation. The purpose of a debate is to convince the other person your point of view is more correct than theirs.
The importance of debate hinges on the assumption that as humans, we can adjust our minds to believe rational things. However, our ability to change our minds, even when we are so obviously incorrect, may be much weaker than we are willing to admit.
For much of history, it was assumed that human decision-making was relatively straightforward. We are presented with a set of premises and, using our ability to reason, make the best decision.
However, this idea of decision-making began to run into problems with the rise of cognitive science in the 1970s, when people began to realize just how flawed our thinking truly is.
In reality, our capacity to reason is not meant to lead us to an honest conclusion, but rather to increase our collective comfort within a group.
Elizabeth Kolbert wrote in a New Yorker article: “Reason developed not to enable us to solve abstract, logical problems or even to help us draw conclusions from unfamiliar data; rather, it developed to resolve the problems posed by living in collaborative groups.”
This means debate, which is a conversation that inherently disengages from collective group thought in order to pursue truth with a capital ‘T,’ runs counter to our natural human behavior. In fact, debate might not even be possible at all.
Debates are meant to pull no punches, but new research suggests that when our strongly held beliefs are challenged, the areas of our brain responsible for self-identity and negative emotions are activated.
This means our beliefs exist as a part of ourselves. When they are challenged, we perceive it as an attack on our own personal identity. Understandably, this makes us upset.
Not-so-understandably, this phenomenon makes it nearly impossible to reject a belief, no matter how wrong it may be.
If beliefs are stuck as is, then clearly our way of debating is flawed. It is rather bleak to think that, despite being presented with persuasive arguments, we continue to stand our — faulty — logical ground. In that case, debates would be rendered not only useless but, in fact, dangerous.
Fortunately, though our minds may be extremely stubborn, not all hope is lost.
Given that our beliefs tend to be wrapped up in what we consider our identity, it can be hard to recognize a thought as just a thought. However, if we find ourselves in a situation where our beliefs are being challenged, we can separate ourselves from our thoughts by stopping to think about what exists outside of us and what exists as part of us.
For example, if it feels as though someone is attacking you, take a step back and think about why you believe they are attacking you.
Saying your haircut is bad and your eye color is ugly? That’s an attack on your identity. Saying your disbelief in climate change is wrong? That’s an attack on your belief, which you have wrongly associated with your identity.
This metacognitive strategy of constantly checking your thoughts to understand more about them is extremely important for reasonable thinking, but is also extremely difficult to keep up consistently. Luckily, we may also be able to change our beliefs in an easier, albeit longer, way.
Rarely do our thoughts and beliefs change at one specific time or because of one specific event. Instead, our beliefs change gradually and subconsciously, not because of our specific desire to change, but rather because of many, many different experiences.
So, what does this mean for the nature of human debate? Are we doomed to always believe what we believe, or can we hope to change?
Well, if you have read anything I’ve written so far, you’ll know that the answer is — as it almost always is — yes and no. And a lot of both.
Our thoughts may not change immediately as the result of a debate, and we may leave many debates feeling worse than before. However, through constant exposure to differing ideas and personal techniques in thought, we can hope to live with reasonable and honest beliefs.
(first published 19 Oct 2020)
The United States has a problem with its education system, and it is only getting worse.
I grew up in the suburbs of Philadelphia where I attended Lower Merion High School, one of only two high schools in the Lower Merion School District. Four miles from my alma mater sits Overbrook High School, one of 57 high schools in the School District of Philadelphia.
Lower Merion is ranked No. 657 in the nation for public high schools and No. 14 in Pennsylvania by U.S. News and World Report. Of its 1,476 students, 62 percent took at least one AP exam and 13 percent are economically disadvantaged. Lower Merion has a graduation rate of 97 percent. The total minority enrollment? 30 percent.
Overbrook High School is ranked No. 13,345-17,792 in the nation for public high schools and No. 511-671 in Pennsylvania. Only 1 percent of students took an AP exam. All of the students are considered economically disadvantaged. Overbrook has a graduation rate of 45 percent. The total minority enrollment? 96 percent.
Four miles. That’s the difference between one of the best public school districts in the country and one of the worst.
Unfortunately, bad things do happen to young Black and Brown students in Philadelphia.
Segregation in schools was supposed to be abolished in the 1954 landmark decision of Brown v. Board of Education. Yet the story of public schools in Philadelphia, compared to private inner-city schools and public suburban schools, is reminiscent of nearly every major city in the U.S. So, how do we fix it?
One opinion article cannot hope to cover every aspect of fixing the racial disparities in public education. Instead, here I will delve into one of the most used “remedies” for desegregation: busing.
Despite the decision in Brown v. Board of Education, desegregation of schools did not start right away. It was not until 1971, with the decision in Swann v. Charlotte-Mecklenburg Board of Education, that desegregation began to pick up speed.
This decision brought busing into the spotlight, as it stated that “no rigid guidelines could be established concerning busing of students to particular schools.” This meant that busing, which had already been in use, was a possible avenue for desegregating schools.
Busing programs continued through the ‘70s and peaked in the early ‘80s, but slowly started to diminish in the ‘90s and 2000s. Now, many people are torn over whether busing’s impact was positive or not, and whether we should continue with the programs.
First, let us consider the positives of busing. Busing does, after all, increase racial diversity in schools, lending itself to the desegregation effort. Because of historical segregation, neighborhoods still tend to be divided by race.
Unfortunately, there is a trend of rich, white people living in the suburbs, and poorer minorities living in the city.
Busing allows children in these still-segregated neighborhoods to attend schools outside their immediate community, and given that students do better in schools with greater funding, busing gives historically disenfranchised students more opportunities.
Busing also represents a change in attitude, a shift toward the direction of justice and equality. As Rev. Jesse Jackson said to The New York Times, “Busing is absolutely a code word for desegregation,” and certain NAACP officials in the past have supported busing policies and efforts as a means to create change.
However, a policy is only as good as its ability to be implemented, and the history of busing is riddled with cut corners and bad faith.
The primary goal of busing is to integrate and diversify public schools, but this goal too often drowns out conversations about the quality of education received. Busing only directs kids to different schools; it has no impact on what happens at those schools once the kids get there.
So, in this way, busing merely addresses the symptom produced by racial inequality, but it does nothing to address the problem itself. In fact, busing may actually make the problem worse.
In 1981, at the peak of busing, a Gallup poll found that 78 percent of white Americans opposed busing, and only 17 percent favored it. The implementation of busing, contrary to the desire of the majority of white Americans, caused a significant increase in “white flight,” which saw middle-class and rich white parents leave the city as it began to integrate, resulting in a decrease in community funding.
As a result, in certain cases, busing actually worked to increase segregation and took funding from schools that became majority Black.
The overwhelming majority of students who are bused for desegregation purposes are minorities. Students of color are then forced to attend schools outside their community, meet friends who live in entirely different neighborhoods and separate from their own neighbors, while white communities remain untouched.
The busing of only minority students also assumes that predominantly white schools are all good, while predominantly Black and Brown schools are all bad. As Sharif El-Mekki said, “Integration as a standalone solution undervalues the power of Black communities while overvaluing that of white communities.”
Despite its best efforts, busing actually works to address neither inequality in education nor our thoughts concerning education. Those in positions of power, who tend to be white in this country, make faulty decisions on behalf of Black and Brown students and then make little effort to see those decisions come to fruition.
Busing was and is a well-intentioned program addressing desegregation, but it has fallen short of the mark.
Instead, as we move forward into an age of increased awareness and focus toward social justice, we must take accountability for our unequal public education system. Busing would not have received such a negative response from white parents if the schools were truly equal, but they obviously were not.
I personally profited from the unequal distribution of education, as my high school was able to receive funding and resources that the inner-city public schools did not — my success was ultimately ensured by their oppression. If you are a Boston University student and you are reading this, then there is a chance you experienced the same thing.
Recognizing the differing levels of education between schools is the first step in solving the issue that busing tried, and mostly failed, to correct. Then, we must follow it with fully realized action to level the playing field and provide a brighter future for all Americans.
(first published 15 Oct 2020)
One of my favorite things about the end of the calendar year is receiving my Spotify Wrapped playlist. I treat my Spotify as I hope to treat my kids, and my playlists are more curated and organized than any of my school work.
My obsession with music is not a quality reserved for me alone; it is inherently human. While birds can create incredibly intricate tunes, they do not cry at soaring arias or rock out to distorted guitar solos. Our capacity to make, understand and love music is uniquely human.
With streaming services such as Spotify and Apple Music, access to music has never been easier. But is this a good thing? Unfettered exposure to music may come with downsides.
Before looking at the possible negative effects of music, we must first acknowledge that it is abundantly positive.
Countless studies have shown that music can have an incredibly therapeutic effect. In addition to being a de-stressor, music reduces symptoms of depression and anxiety and can improve cognitive performance.
Intense forms of music therapy have had profound positive impacts. Music can return certain memories to patients suffering from dementia, as shown by the incredible featurette “Alive Inside: A Story Of Music & Memory.”
Music therapy can also be used to help people with trauma, those dealing with a range of mental and physical illnesses, certain individuals on the autism spectrum, those in correctional settings and many more people, according to the American Music Therapy Association.
The therapeutic effects of music are not, however, reserved only for those suffering from diagnosed illnesses. Studies have shown that listening to music improves both physical and mental fitness across the board.
Learning to play an instrument also induces a diverse range of positive effects. It has been linked to an increased ability to learn, improved relationships and faster recovery from traumatic events.
But is all music created equal? Does every kind of music provide equal benefit, or is there more nuance to this story?
To begin, there are certain demonstrably negative effects of music, such as its correlation with distracted driving and hearing damage — as both a drummer and a fan of loud music, my hearing is pretty much shot at the age of 20.
However, there is also much more complexity to the impact of listening to music. Certain studies have shown listening to music can actually increase stress, worsen relationships and reduce creativity.
Here emerges the idea that music may not intrinsically provide emotional benefits, but may instead be tied to certain emotions that are evoked when played.
A love song may not cause you to fall in love, but it can certainly provide an environment that fosters the emotional feeling of love, if present. The same can be said of calming music, nostalgic music and upbeat, anxiety-ridden music.
If we are feeling down and put on sad music, then we may actually be adding to our stress and anxiety. Indeed, studies have shown that Gen Z, which has an alarmingly high rate of depression and anxiety, on average listens to much more “sad” music than any previous generation.
So, should so-called “negative” music be frowned upon? Well, this response is too simplistic and does not consider the nuance hidden between the lines.
No genre has embraced its negative connotations more than heavy metal, and of course, no genre has been banned and disowned more often.
In 1990, the heavy metal band Judas Priest was infamously put on trial after the parents of two boys who attempted suicide blamed the band for putting subliminal messages in their music.
The band was ultimately found not guilty, but the judge did rule that there seemed to be subliminal messages within the band’s songs. It was decided, however, that these messages could not have been enough to actually cause the two boys to commit suicide, and thus the band was not directly responsible.
There are many examples of musicians being blamed for unfortunate events because of their “negative” music, and these accusations may not be entirely false. However, in the same way music alone cannot simply cause you to be happy and successful, it cannot be the sole reason for negative thoughts and the unfortunate acting-out of these thoughts.
We are then left with a much more subjective view of music. There are obvious benefits to listening to music, and even more to playing it, but there are also many downsides.
Music is simply a catalyst — very powerful, but still simply a catalyst — that has the potential to unlock within us those positive and possibly negative effects if we allow it to. At the end of the day, however, it is still within our power to conduct the train down whatever tracks we choose.
(first published 8 Oct 2020)
In 1492, Christopher Columbus sailed the ocean blue. He subsequently became one of the most polarizing figures in all of history.
With Columbus Day approaching Monday, now seems to be the best time to examine the mythology of Columbus, the impact he and his voyages had and the wide-ranging truths and lies associated with the explorer.
The classic story describes Columbus as an Italian explorer who convinced the crown of Spain to fund his daring expedition to find a new trading route to the Far East, bringing unspeakable wealth to himself and Spain, and proving that the Earth is round in the process.
After landing in what he believed to be India, he began trading with the locals, whom he called Indians. It was soon realized that Columbus was in fact not in India, but an entirely new part of the world. The rest is, well, history.
This story of Columbus, however, is incomplete at best, and is riddled with errors — a common theme when recounting history. So, let’s take a look instead at the real history of Columbus and his voyages.
Christopher Columbus is believed to have been born in Genoa, Italy. However, the Italy we know did not come into being until 1861, and Columbus was born in 1451. Columbus considered himself more Genoese than Italian, and given Genoa’s close contact with parts of Spain, his nationality is tricky.
As a teenager, Columbus participated in several voyages. But in 1476, French privateers attacked the ship he was on and sank it, which drove Columbus to swim to shore in Lisbon, Portugal. It was there that Columbus developed his skills as an explorer.
However, more important than the events that directly affected Columbus during this time were the events happening around the world, which ultimately led to his first Atlantic voyage.
The fall of Constantinople in 1453 threatened trade routes between the Far East and Western Europe. The Portuguese were developing trade routes around the southern tip of Africa, positioning themselves to monopolize trade with eastern Asia.
At the end of Catholic Spain’s “Reconquista” — the centuries-long reconquest of the Iberian Peninsula from Muslim rule, which culminated in the expulsion of Jews and Muslims — Spain turned its sights toward exploration and colonization.
Columbus went to the Spanish crown with his idea of sailing west for Asia in 1486, but was initially rejected. However, Spain’s newfound hunger for exploration drove the monarchs to call him back, and on Aug. 3, 1492, Columbus set sail westward.
Two months later, Columbus landed on an island in the Bahamas, then bounced around to several others. Before returning to Spain, he established a semi-permanent settlement and left several men on the island of Hispaniola — now Haiti and the Dominican Republic — home to the native Taino people.
Columbus was made governor of Hispaniola. But upon his return on his second voyage, he found the settlement destroyed and abandoned. He left his men, enslaved Indigenous people and his brothers Bartolomeo and Diego to rebuild it while he sailed west in search of gold. He then returned to Spain with various goods and about 500 enslaved native people.
Columbus made two other voyages, but conditions on Hispaniola had deteriorated so badly that he was stripped of his governorship, and was at one point even arrested and tried. Despite being found innocent, Columbus’ nobility was revoked, and he later died in 1506.
While several truths do arise from Columbus’ story, the more detailed a historical account is, the more questions — and inaccuracies — arise.
What should be considered, perhaps even more so than the declarative history of Columbus, is the contemporary relationship between us and him.
Since June of this year, more than 30 Columbus statues have been taken down across the United States. Many cities and states are also changing the name of Columbus Day to Indigenous Peoples’ Day.
Immortalizing a man who never actually set foot in this country does seem rather silly, but just how much condemnation does Columbus really deserve? Was he a monster with pure evil in his heart, or is there more to this story?
Many say that judging those of the past by today’s moral standards is an ethical fallacy, but this has also been countered by saying that current spotlights, such as statues, are meant to reflect current morality.
It is interesting to consider, first, how moral Columbus was relative to his time, and second, whether we should idolize people who have become, by today’s standards, immoral. However, that is not the topic I find the most interesting in regard to Columbus.
Instead, it is more interesting to think of Columbus the man as opposed to Columbus the legacy.
Columbus was an objectively good navigator and sailor, as well as an objectively bad governor — he has the blood of thousands of Taino people, as well as colonists, on his hands. However, the legacy of disease, senseless killing and genocide that has affected native people across the Americas cannot be simply chalked up to Columbus.
We have a tendency to consider large effects as having had large causes. It seems impossible that John F. Kennedy was simply assassinated — a word, by the way, we reserve for the killing of “important” people — or that the coronavirus could simply have been contracted unknowingly.
The same goes for Columbus. The domination of native people in the Americas is one of the saddest stories in human history, and it seems wrong that the person who kickstarted the colonization of two entire continents could not have had a profound impact on this tragic development.
But the U.S. government has done many things to native people objectively worse than anything Columbus did, and the only direct relation we have as a country with him is a day off in October.
It is easier to look at a central, individual cause as being responsible for the enormous impact that followed it, but this viewpoint absolves other entities that had a much more profound effect on that outcome.
Centralizing blame on Columbus for all of the terrible things that came in his wake shifts accountability away from whole governments of people who have committed, and continue to commit, such errors.
We must remember that Columbus was a man, no more and no less.
(first published 29 Sept 2020)
At the risk of sounding melodramatic, it’s hard to think of a time when understanding history has mattered more than it does right now.
Most of us have been fed a steady diet of U.S. history that is true only to one narrative, but a heightened focus on race relations in this country has pushed an alternative to the spotlight. President Donald Trump, on the other hand, has called for the creation of a more patriotic lesson plan that will teach young Americans to “love America.”
Understanding history through narratives is important, but there is also a danger in doing so.
History offers insight into how past problems arose, which allows modern-day society to see patterns that might not be noticeable in the moment. The importance of understanding how the issues of yesterday affect us today cannot be overstated, but the process of doing so is subjective by nature.
What impacts one person in a negative way may impact another in a positive way, and as such may later be told in an entirely different light. This is because history necessarily depends on a narrative, a kind of story that goes from A to B with certain events in between.
Without any kind of narrative, history just becomes a set of facts with no relation to the present or the future. The fact that something happened is intriguing only up to a certain point, but it is the narrative that forms around that event that sparks interest.
A good example of this “history in a vacuum” versus “history as a story” is the history of race relations in the United States. Stand-alone events — such as Nat Turner’s slave rebellion of 1831, the Tulsa Race Massacre of 1921 or the Los Angeles riots of 1992 — have certain value when considered by themselves, but it is almost impossible not to see the cause before and the effect after.
Understanding the narrative of history is not only important for understanding its modern context, but it is also intrinsic to human behavior. One of the capacities we have as humans that separates us from other animals is our capacity to have a theory of mind. Theory of mind is the basic ability to understand why someone acts a certain way and how they may act in the future.
Our theory of mind is what makes the history of something such as Turner’s rebellion so interesting as well as topical. Once we understand the setting, we then begin to see Turner and his followers, as well as the other slaves and slaveowners, as characters in a story. The historical event starts and ends much like the arc of a typical story regardless of what actually happened, but we are also able to trace its impact all the way up to modern times.
Storytelling makes history both more impactful and more fun, but it also comes with drawbacks. For example, if not all the events of the past are clearly told, then a false narrative can take form and lead our thinking astray. Even scarier, in a perfect situation where all the facts and voices are present and clear, our theory of mind may still piece together parts of the story that do not truly belong.
In a weird, roundabout way, our understanding of history may actually get worse the more we recall and narrativize it. Like memory, every time we retell history, we open up the possibility of tweaking it even just a little bit.
In its narrative structure, the history we assume to be true oftentimes does not accurately reflect what actually motivates someone to act in a certain way, nor does it correctly explain how subsequent events unfolded. For that reason, our understanding of history quickly gets muddled by our desire to tell a story.
Storytelling is also inherently flawed, as it requires that only one perspective be taken, which means that any and all other viewpoints are forgotten. A great example of this phenomenon is, unfortunately, race relations.
Recently, though, works such as The 1619 Project have attempted to rewrite the narrative from a different perspective by including the voices of people who have historically been silenced. However, even this retelling of history has been criticized for errors.
Is our understanding of history bound to be flawed, and are we doomed to repeat the same mistakes over and over? Unfortunately, it does not seem like there is a clear answer. However, it appears to be extremely important to understand as many perspectives as possible and the inherent difficulties that come with learning and teaching history.
We must continue to try to understand the past, but we must do so honestly and with a clear mind.
(first published 20 Sept 2020)
Before I started writing this, I assumed that there would not be much literature on this subject, and that there would be a relatively definite consensus on which path is the better of the two. And boy was I wrong.
Seriousness, and the lack of it, is a quality present throughout human history. From Stoicism to hedonism, the debate over how much seriousness is healthy has never quite been resolved.
As individuals, we tend to have a preference for one or the other, typically existing somewhere in the middle, but with a strong inclination toward one end of the spectrum. We either take ourselves way too seriously or not seriously enough, and we have that very typical human quality of assuming that whatever works best for us will work best for everyone else.
Of course, like almost everything, it seems as though living in a gray area is ideal, but where exactly should that be? Is there a one-size-fits-all answer to this, or must we all find out for ourselves?
At first glance, the answer seems more or less obvious: we are all humorless statues who need to loosen up. And to some degree, this is true. Not only is humor incredibly important for building relationships of all kinds, it is also one of the leading personality traits that employers look for in potential hires.
Humor itself may also have possible health benefits. As one study from 2007 found, “the act of laughter can lead to immediate increases in heart rate, respiratory rate, respiratory depth and oxygen consumption,” which is then followed by “a period of muscle relaxation, with a corresponding decrease in heart rate, respiratory rate and blood pressure.”
Not taking yourself too seriously is a way to open up to new people and engage them without fear of being ashamed or embarrassed. It can open doors to new relationships and opportunities, and can increase overall well-being immensely.
Unfortunately, like everything, not taking yourself too seriously has its drawbacks.
Not being too serious can and does quickly turn into not being serious enough. Within seriousness is a degree of ambition, and to lose all sense of seriousness would be to lose all hope of fulfilling a wish or dream.
Interestingly enough, despite the professional benefits of not taking yourself too seriously, many successful people actually argue for the opposite. Being serious means not only having ambition, but also not having any excuses.
Those who take themselves seriously have the ability to come up with an idea and execute it, while those who do not take themselves seriously tend to watch as others fulfill whatever idea they had originally come up with.
It seems once again that we find ourselves where we started. Not being serious works to improve well-being, but it also can kill ambition. Being serious can lead to success while also preventing healthy relationships. There is no clear answer, leading us to revert back to whatever default we had at first.
However, from a seemingly dark and depressing place, we may find hope.
The Internet Encyclopedia of Philosophy defines nihilism as “the belief that all values are baseless and that nothing can be known or communicated.” Yeah, pretty dark. But to define nihilism as its seemingly most depressing component is to discount all that it has to offer.
As most angsty teenagers will point out, existence does not itself hold standalone value. Instead of fighting this inherent meaninglessness as most belief systems attempt, nihilism accepts it.
Importantly, nihilism is not the same as pessimism, cynicism or apathy. Nihilism is not a negative point of view, but rather an empty one.
So, what does nihilism have to do with seriousness, or well-being in general?
It is upon the empty nothing of nihilism that we can build a meaningful something. As described in a video by Kurzgesagt, nihilism can be combined with optimism to create a life that holds meaning to you.
In other words, the lack of universal sincerity can be resolved by the creation of a personal seriousness. Living a life with ambition and meaning is still very much possible, but it can only truly be done if we acknowledge that this meaning is made serious only through our own volition.
Optimistic nihilism teaches us to approach life with sincerity, while remaining loose enough to find humor in the everyday.
(first published 14 Sept 2020)
With the rise of globalization, one would expect to see the emergence of a global identity. However, this is not the case, and if anything, national pride is steadily increasing. So, what is nationalism, and what should we do about it?
First and foremost, a clear distinction needs to be made between patriotism and nationalism. Although similar, patriotism is related more to a sense of love and devotion for one’s country, while nationalism refers to a pride that one has in the superiority of their country.
This distinction between patriotism and nationalism seems to paint a picture of good patriotism and bad nationalism, and most of the time this is true. To hold genuine love for one’s country is not only okay, but is possibly the standard to which we should all hold ourselves. On the other hand, we should avoid making false claims of exclusivity and superiority only in the name of the country in which we were born.
Unfortunately, nothing is ever as simple as strictly good or bad, and patriotism and nationalism are no exceptions. As Merriam-Webster says, “it seems certain that, at least with nationalism, it may mean different things to different people.” This difference in intended definition presents a challenge in trying to understand not only the usage of nationalism, but its context as well.
As mentioned in my previous column, a word is not defined only by its literal denotation, but also by the relationship it holds between the user and the world at large. For this reason, it may be easier to understand nationalism not by its definition, but through the connotation with which it is used.
Most arguments related to nationalism paint it as this tremendously powerful evil, and as I said earlier, this is mostly warranted. Nationalism tends to be divisive, and it has been the leading cause of some of the worst conflicts in human history.
However, especially recently, nationalism has been separated from inherently nationalistic things without the realization that one cannot exist without the other.
For example, in his essay “The Problem of Nationalism,” Kim Holmes, executive vice president at The Heritage Foundation, states that nationalism differs from national identity, national sovereignty and even national pride. To his credit, Holmes does acknowledge the benefits of nationalism — or what would probably be better defined as patriotism — but in separating nationalism from national identity, sovereignty and pride, he fails to realize the impact that one has on another.
Strong nationalism fuels a certain national identity built on the rejection of other identities, and this creates an even stronger sense of nationalism — one cannot exist without impacting the other.
Although we see that nationalism is related to national belonging, this does not necessarily mean it is something we should subscribe to. In fact, some of the reasons used to justify nationalism show just how bad nationalism can get.
In another essay entitled “The Virtues of Nationalism,” Reihan Salam argues for the use of nationalism over such things as polyethnic hierarchies and caste systems. Salam asserts that “melting-pot nationalism” can be used to blur the distinctions between ethnic groups within a country, and with the increase of globalization and diversification of countries, this melting-pot identity can increase intra-state unity.
However, we once again see a distinction being made without an actual solution being reached.
Sure, orienting toward a nationalistic view may help decrease the conflict within a country, but it can just as easily lead to an increase in conflict between that country and others. Nationalism, in this way, does not cure the problems created by cultural hierarchy, but rather changes the scale on which they operate.
Exploring nationalism and its context gives us a greater understanding of what nationalism is, but does it help us discern whether or not it is good? Well, it seems that in the best case scenario, nationalism can be used to unify against other nations, and in the worst case, it can blind those within a nation to believe in a false supremacy.
For that reason, it seems as though nationalism, for whatever good it affords us, should be left behind — if that were possible. Unfortunately, it isn’t.
Nationalism is inherent to human nature because we evolved to create specific in-group, out-group dynamics. Innate within our being is a need to subscribe to a larger identity, and this can only be done by rejecting other possible identities.
Fortunately, this acceptance of a greater identity is an incredibly powerful motivator, and that means we can use nationalism to further any objectively good agenda. For example, in the case of civil rights, we can adopt a nationalistic mentality toward justice, and use that motivation to improve our country for the better.
Nationalism has a complicated definition and an even more complicated connotation, and for that reason, it cannot be simply defined as good or bad. However, if we employ it sparingly with the intention of improving national well-being, we can create a country that makes us proud.
(first published 8 Sept 2020)
Although the term BIPOC has been around for several years, it gained immense popularity following the murder of George Floyd on May 25. The need for inclusion of all peoples has arguably never been greater, and changing language to match this need seems like a great step in the right direction.
Unfortunately, as language evolves, it takes on new meaning. BIPOC, as a term, has gone through a metamorphosis of its own. Like terms of the past and terms to be, BIPOC has strayed from its original meaning and is becoming something much more complicated.
Like most words and terms, the origin of BIPOC is disputed — The New York Times cites a 2013 tweet as the first usage of the term that it could find. However, it most likely dates back further than that. It is an extension of the term “people of color” — commonly abbreviated as POC — with the addition of the “B” and “I” representing Black and Indigenous respectively.
The term was created with the intention of acknowledging the differing levels of injustice faced by racial groups, and to promote Black and Indigenous voices. As the founders of the BIPOC Project state, the term is used “to highlight the unique relationship to whiteness that Indigenous and Black (African Americans) people have, which shapes the experiences of and relationship to white supremacy for all people of color within a U.S. context.”
Recent events such as the killings of George Floyd and Breonna Taylor have demonstrated the immense need for social justice in America, but they have also ignited conversations that many were not previously accustomed to.
Thus, it is no coincidence that the usage of “BIPOC” spiked over the past few months. However, widespread usage of a word eventually adds new connotation, and those connotations can lead to evolution in the meaning of the word itself.
The phrase “women of color” demonstrates this change in meaning better than any other term.
Initially coined in 1977 during the National Women’s Conference, the term was created as a positive, inclusive way to refer to minority women. In the present day, the term has diverged from its intended purpose and is now just a passive differentiation between white and non-white women.
“Women of color” was a term representative of strength in the face of injustice, but now it is simply a small distinction in biology.
For better or worse, the phrase has lost its original substance. Other terms have been commandeered to further injustice and hate, while many — like “Latinx” — have been adopted into elite groups as a badge of social status.
The worry with the term BIPOC is that those people who were supposed to be helped by its usage are actually being hurt. A Pew Research Center study in August found that 76 percent of Hispanic adults in the U.S. have never heard the term “Latinx,” and only 3 percent actually use it. Although no studies have examined the usage of “BIPOC,” one can safely assume that this trend continues.
So, who do these terms actually impact, if not the people they are meant to help?
Most often, terms regarding social justice are either created or adopted by university circles, and then distributed to those close to them. Those who have access to higher education are more likely to hear and be familiar with these terms.
However, that does not mean they are more likely to fight machismo in Latin culture or uplift Black and Indigenous voices — they just know what terminology to use when discussing the topic.
A term like BIPOC provides people a facade of pushing for social justice.
There is a moral crisis in the United States right now, and the quick acceptance of a term helps to alleviate that moral panic. But this isn’t the 16th century, and you can’t just buy your way into heaven.
If you are from a privileged community and use the term BIPOC, but don’t actively listen to the voices from people who look different than you, then you have simply changed your vocabulary without changing your mindset.
The job of changing your mentality in relation to social justice is both extremely difficult as well as extremely important, and terms such as BIPOC may be a great means for doing this. We cannot, however, let them simply be an end.