Below you will find articles exploring the humanities and their place in society.
The History of The Union Jack Flag
by Caleb MacRae
The Union Jack represents the United Kingdom (consisting of the nations of England, Scotland, Wales and Northern Ireland), and combines elements of the flags of the UK's constituent nations. It features the red cross of Saint George (the patron saint of England), superimposed upon the saltire of Saint Patrick, the patron saint of Ireland, represented by the diagonal red lines in the flag. This in turn is superimposed on the saltire of Saint Andrew, more commonly known as the flag of Scotland. Wales is not represented on the Union Jack because the earlier ‘Flag of Great Britain’ was designed when Wales was still part of the Kingdom of England.
The origins of the modern-day flag date back to the early 17th century: in 1606 the aforementioned Flag of Great Britain was established by a proclamation of King James VI of Scotland and I of England and Ireland, following the 1603 Union of the Scottish and English crowns.
The flags of British Overseas Territories, such as Bermuda and Anguilla, as well as some sovereign states and regions (particularly within the Commonwealth), such as Australia and New Zealand, incorporate the Union Jack into their own designs, or have official flags derived from it.
The aforementioned early Flag of Great Britain was introduced on the 12th of April 1606, to represent the regal union between England and Scotland. The flag was specified in a royal decree, according to which the flag of England (a red cross on a white background) and the flag of Scotland (a white saltire on a blue background, known as the saltire or St. Andrew’s Cross) were to be joined together, forming the first union flag and first flag of Great Britain.
At first, this royal flag was only to be used at sea on civil and military ships of both England and Scotland; land forces would continue to use their national banners. In 1634, King Charles I restricted the use of the flag to Royal Navy ships. After the 1707 Acts of Union, the flag was also adopted by land forces.
The main distinction between the current Union Jack and the early Flag of Great Britain is the representation of the Kingdom of Ireland, which was added to the flag on the 1st of January 1801. The current design dates from this period as a result of the 1800 Acts of Union, which merged the Kingdom of Great Britain and the Kingdom of Ireland to form the United Kingdom of Great Britain and Ireland. The new design added the red saltire, the cross of Saint Patrick, to represent the Kingdom of Ireland.
The early Flag of Great Britain also formed the canton of the Continental Union Flag, the first de facto national flag of the United States.
The official flag of the British Army is the Union Jack. However, in 1938 a ‘British Army non-ceremonial flag’ was devised, featuring a golden lion surmounting the royal crown, over two crossed swords on a red background. The word ‘Army’ in gold letters may appear beneath this motif.
The lack of Welsh representation on the Union Jack, as mentioned above, is a result of Wales not being considered an integral part of the Kingdom of England at the time the early Flag of Great Britain was introduced. On the 26th of November 2007, in a House of Commons debate, Wrexham’s Labour MP Ian Lucas proposed that the Union Jack be combined with the Welsh flag to reflect Wales’s modern-day status within the UK, with the Welsh red dragon, or Y Ddraig Goch, added to the Union Jack’s red, white and blue pattern. The Minister for Culture, Creative Industries and Tourism, Margaret Hodge, concluded that Lucas had raised a valid point for debate; however, the flag remains unchanged. Several redesigns have since been suggested.
Furthermore, proposed alterations and redesigns also attracted interest in the run-up to the 2014 Scottish independence referendum. However, as Scotland voted against independence, the question of an official redesign never arose, and the St Andrew’s Cross remains on the Union Jack.
Write What You Live - The Heart of Poetry
by Afiyah Rasool
Introductions to poetry generally start early on; I’m sure most of us can recall being made to write acrostic poems during primary school, and a lot of people disliked it. I know I did. As I moved through school, I was exposed to plenty of poetry, from Shakespeare to spoken word, and told to write it, too. “Write a poem about a bear”: what a prompt!
Yet, in all honesty, I always struggled to create poetry when merely handed a prompt and told to 'create'. I usually stared at my page as if I had been told to write a Shakespearean sonnet on the topic of concrete. It usually feels forced and artificial, because it's not as if I can create feelings from thin air. For a long time, I moved through school thinking I was either bad at poetry, or that poetry was just plain boring. It took me a long time to see the real problem: not poetry, but distance. Because, believe it or not, I haven’t had many striking experiences involving concrete.
But I have known what it’s like to wish things would move faster. I have outgrown friendships I thought would last forever. I have known the quiet of realization that I am changing, while things stay the same around me. If you think about it, you probably have your own versions of that core emotion.
We feel happiness, and loneliness, and stillness, and we go through it often in silence. Those are the moments that hold weight. No, maybe not the physical weight of concrete, but the concrete weight of emotion that sits with you.
Nobody lacks imagination. The problem was never creativity; it was that I was trying to invent something I already had. To me, poetry is less decoration and more a reflection of memory, empathy, understanding, observation. It takes the ability to look inward before you start writing.
It’s easy to reduce literature to a structure you have to follow, and as important as those skills are, I see it as far more than a pattern. Literature that stays with you involves your memories, experiences, interests, the things that matter to you. Until the prompt touches something real to you, it’s easy not to care.
At some point, I stopped asking what a poet looks like, and I started asking what I looked like. Not physically, but inside and out. What did I really care about? What had I seen? What interested me?
When I stopped forcing metaphors and started paying attention to my own life, sentences began to form. They weren’t perfect, and they still aren’t. I didn’t suddenly become a master of poetry, but that was never really the point. The point was that I wanted to be like the writers I read all about, yet I had no connection to my own work. Prompts are good practice, and technique will always be key, but they don’t provide the depth. Depth is what makes words matter.
Writing from the heart doesn’t mean every poem needs to be heavy and dramatic. It doesn’t have to be about heartbreak or loss. Sometimes writing honestly about any feeling, whether joy, boredom or awkwardness, has the same effect, because it’s true. I want people to start writing, and to create until they feel that shift.
Poetry doesn’t have to be perfect or dramatic. It’s not about impressing anyone or following every rule. It’s about writing what you’ve truly felt, what you’ve lived. When you do that, even the smallest moments can become something real and impactful. So start there, and let your own experiences guide you.
All in all, if poetry and writing isn’t your thing, it never hurts to find your own creative outlet. What you have to remember is that to create is to feel. So, no matter what it is, feel. Capture the memory with words, or colours. It doesn’t matter, just keep feeling, and keep creating.
The Mind and Modern Demise
by Jessica Oakley
It seems increasingly relevant that extreme ideologies have not been left in the past the way we might wish them to be. Whether it is an increase in misogyny, gender division and antifeminism, or racism and discrimination, we are far from eradicating harmful social narratives. It is sometimes difficult to comprehend that those who hold beliefs far from our own could believe that they are justified. But we cannot challenge adverse perspectives unless we first understand them, and so it also seems increasingly relevant that the mind holds the answer.
There is something inspiring about individuality: the way the world is perceived differently, and contributed to in a way that is never the same for any two people. But it is the very nature of our perception that it is completely subjective. This can be threatening when it informs decisions based on individual, and not always universal, rationality and morality. Perception is defined as “the way in which something is regarded, understood, or interpreted,” whereas reality is defined as “the state of things as they actually exist, as opposed to an idealistic or notional idea of them.” It is dangerous, and all too common, to confuse them. While our perceptions are necessary tools for understanding the world around us, they must consistently evolve with new experiences and critical engagement with the knowledge presented to us. Remaining open-minded is essential, rather than believing that our own views are infallible. One example of the consequences of failing to do so is a political system governing those who cannot find common ground: extreme contrast between alternative perceptions may leave no room for mediation between them. According to Psychology Today, massive divides between perceptions in a country would lead to a slow disintegration of the institutions that hold a society together, generating a real-world sense of dystopia.
Notably, the ever-expanding influence of social media and its online communities seems to be facilitating this division, a division that is a primary catalyst for conflict and violence. Exposing individuals to highly selective versions of reality that feed into their vulnerabilities and insecurities fosters a culture of fear and hatred in echo-chamber-like environments. Our habits and algorithmic filters influence the content we are presented with, potentially preventing individuals from being exposed to media that does not align with their viewpoints and opinions. This instigates the spread of misinformation, as critical thinking remains undeveloped when no new information is provided on which to exercise it. While books and articles that gain popularity usually face some scrutiny of their credibility or some general regulation against polarisation, we are seeing shortcomings in the policies of technology companies, who fail to keep users safe in the same way, allowing the exploitation of vulnerable individuals, the real-world consequences of online radicalisation, and the disruption of democracy.
Furthering this is the idea of herd mentality. The National Library of Medicine suggests that it has a powerful influence on people’s behaviour. This can be positive, allowing groups to align in achieving a common goal; however, the aforementioned developments in technology have allowed it to exacerbate the ignition of harmful stereotypes and the promotion of damaging behaviour, without sufficient consideration for the consequences of, or meaning behind, judgements and actions. Perhaps it is this, combined with the often unrealistic expectations we hold for each other, that has led to a severe neglect of empathy, not only for the general public but for incredibly influential members of society. Western cultures can be characterised as individualistic, emphasising the needs and desires of individuals rather than the relationships of communities as a whole. Valuing personal independence, self-sufficiency and autonomy can lead to loneliness and social isolation, as well as mental health struggles and transactional relationships that threaten social cohesion and are furthered by a focus on individual achievement online. This suggests that we may be depriving ourselves of necessary social connection by failing to balance our individual desires with the need for communal well-being.
Overall, it is crucial for every member of society to evaluate their own perceptions and the knowledge they are presented with, making personal and global decisions based on compassionate principles rather than self-serving or misguided ones. Evidently, the sociological and psychological aspects of human nature are crucial to understanding our downfalls; they should not excuse us from the basic obligation to challenge our own beliefs, especially when those beliefs contradict the bare minimum of treating each other with respect and dignity.
Shadows of the Mind: The Dark Side of Human Behaviour
by Ananya Kupperi
It begins in the shadows, in the breathless silence before a scream. A fleeting silhouette. The cold glint of a knife. Before the mind can even comprehend what has happened, blood is already seeping across the pavement while the figure disappears into the darkness.
Murderers. The word immediately conjures the image of someone who takes a life in sudden, impulsive moments, in instinctive acts of violence. Yet the reality may be far more disturbing: a darker impulse, a hedonistic craving for thrill, control and power that lurks beneath the surface of human behaviour. Do we ever truly stop to consider why someone may commit such an act, or what experiences might have led them to believe that taking a life is their only option? Society often assumes that murderers are born with malicious intent and are beyond redemption. However, if individuals struggling with drug addiction are offered rehabilitation in order to rebuild their lives, why should we automatically assume that those who commit violent crimes are a lost cause?
Throughout history, this subject has fascinated many, largely because most people struggle to comprehend how an individual could willingly take innocent lives. Perhaps, somewhere in their past, a traumatic or distressing experience altered their perception of the world. Perhaps they lacked the guidance, stability or emotional support that many others receive from family and society. Human personality is shaped not only by genetics but also by environment and childhood experiences. Studies have demonstrated that negative family environments including constant conflict, emotional neglect and a lack of warmth or affection can significantly influence psychological development. Childhood experiences act as the foundation upon which the rest of life is built. If that foundation contains emotional fractures or instability, the structure of adulthood may become deeply affected.
This raises an important question: are individuals born with violent tendencies, or are they shaped into such people by their environment? Some argue that many killers develop their behaviour as a consequence of difficult childhood experiences, where loss, trauma or neglect gradually shape their worldview. In my opinion, growing up in a household lacking emotional closeness or consistent support can leave individuals feeling isolated, abandoned and disconnected from others. Without positive guidance or strong role models, it becomes far more difficult for someone to learn how to regulate their emotions and understand moral boundaries. Much of what we know, how to behave, how to communicate, even how to carry out basic daily tasks, is learned from those around us. Without that guidance, a person’s sense of right and wrong may become distorted.
Criminal psychology has attempted to explore these patterns in greater depth. For example, some researchers studying serial offenders have identified connections between childhood trauma and later violent behaviour. In certain cases, individuals who experienced emotional neglect or psychological distress during childhood later developed disturbing interests or behaviours. One frequently discussed example is that of Jeffrey Dahmer. Reports suggest that after undergoing surgery at a young age and experiencing increasing social isolation, he began showing unusual fascination with dead animals and dissection. While such behaviour does not inevitably lead to violence, psychologists often associate these patterns with severe depression, attachment difficulties and emotional detachment. Over time, these psychological struggles can evolve into far more dangerous behaviours if left untreated.
But what exactly defines a serial killer? In reality, violent offenders are categorised in several different ways depending on the pattern and circumstances of their crimes. According to data from the Federal Bureau of Investigation, thousands of cases have been classified under different categories of serial, spree and mass killings. A serial killer is typically defined as someone who murders three or more individuals over a period of time, with a cooling off period between each crime. Mass murderers, however, kill multiple victims during a single event in one location, while spree killers commit murders in multiple locations with little or no time between them.
Beyond these categories, motivations for killing can also differ. Hedonistic killers seek pleasure, excitement or financial gain through their crimes. Power oriented killers are driven by a desire for dominance and control over their victims. Mission-oriented killers believe they are eliminating a particular group or individual for a perceived moral or ideological reason. For instance, Joseph Paul Franklin targeted victims based on racist beliefs, believing he was carrying out a personal “mission”. Others, such as David Berkowitz, claimed that hallucinations or delusional beliefs compelled them to commit murder. As these examples demonstrate, the motivations behind violent crime can vary significantly, making it difficult to apply a single explanation to all offenders.
Another factor that may contribute to violent behaviour is the presence of certain psychological disorders. Conditions such as Intermittent Explosive Disorder involve sudden and uncontrollable episodes of intense anger. Similarly, Antisocial Personality Disorder is characterised by a persistent disregard for the rights and feelings of others, along with a reduced capacity for empathy. Disorders such as Schizophrenia may involve hallucinations, delusions and episodes of psychosis that distort an individual’s perception of reality. Living with such conditions can make it extremely difficult for someone to distinguish between what is real and what exists only in their mind.
One example frequently discussed in criminal history is William Heirens, an American criminal known as the “Lipstick Killer” due to messages written at crime scenes. He was ultimately sentenced to decades in prison and later transferred to the Dixon Correctional Centre, where he remained until his death. Cases such as these raise complex ethical questions: if a person’s violent actions are strongly influenced by mental illness, should society treat them solely as criminals, or should earlier psychological intervention have been provided?
Statistics further illustrate the scale of the issue. Each year, thousands of people around the world lose their lives to homicide. While the number is lower than deaths caused by natural disasters or disease, the deliberate nature of murder makes it uniquely disturbing. Imagining life with severe psychological disturbances, constantly questioning what is real and what is imagined, highlights how profoundly mental illness can affect behaviour and perception.
Genetics may also play a role in violent tendencies. Some studies suggest that individuals with family histories of violent crime may have a higher risk of developing similar behaviours. However, this raises another ethical dilemma: are individuals destined to follow the path of their ancestors, or can environment and personal choices alter that trajectory? In reality, most people with family histories of crime never become offenders themselves.
One striking example is Harold Shipman, a respected doctor who was later revealed to be responsible for the deaths of hundreds of patients. For years, he appeared to be a trusted and respected member of society. His case demonstrates how outward appearances can be deceptive and how even individuals in positions of trust may conceal deeply disturbing behaviour.
To conclude, there are numerous theories attempting to explain why some individuals commit acts of extreme violence. Some emphasise the influence of genetics, while others focus on childhood experiences, psychological disorders or social environment. Yet despite decades of research, the true causes remain complex and not fully understood. Perhaps the most unsettling question is this: if society cannot fully understand the origins of such behaviour, how can it hope to prevent it? Or are we simply left to lock our doors at night and hope that the darkness remains outside?
By Evan Lendrum
The law is a more complex topic than you might first think. I am sure we all have a surface-level reasoning for why it is justified; however, have you ever wondered why law is necessary in society?
It’s a question that needs much more critical thinking than you may initially expect. Essentially, the law is a system of rules enforced by an authority, in our case the UK government, and is, in short, designed to maintain peace and a sustainable structure for society. It is the fundamental framework which provides the platform for the functioning of the modern world in which we live. Yet this raises the question once more: why do we, who live in such an advanced global society, require these binding rules?
This is an issue which has been the focus of a perpetual argument between philosophers, who have adopted varying standpoints on the purpose of these rules. There is no debating the fact that the law is needed for society to operate, however the root of the issue is how it places limits on people’s behaviour.
The Greek philosopher Aristotle took the firm stance that law should be enforced to cement moral purpose and criminal justice within society. Another Greek philosopher, Plato, argued that in the absence of law, morality would be neglected and humanity would regress into primal animals. John Stuart Mill suggested that the reasoning for law should be confined solely to the prevention of harm, and that other matters should not be micro-managed by authoritarian entities, leaving individuals to follow their own moral compass.
All these philosophers, whilst presenting different viewpoints, harboured one common belief, that laws serve as the building blocks of society and removing them would result in chaos.
An ongoing discussion surrounding the subject of law is the idea of “legal positivism”. This is the notion that law, which has been “posited” (created) by humans, is grounded in objective social facts rather than constructed on a moral basis. This idea that law can disregard humane moral values, and need not correlate with what is in theory “correct”, can be better understood by observing anti-positivist arguments.
On one hand, it is undeniable that a set of rules created by humans cannot be perfect; it contains imperfections which are exposed over time. In the UK, laws are at times rushed through Parliament, and even when fully debated, they cannot anticipate every situation to which they will be applied. It is the lawmakers’ role to fix such problems by amending the law when needed.
Notably, the ‘Dangerous Dogs Act’ 1991 was introduced rapidly in response to public pressure following a series of high-profile dog attacks. In subsequent years the Act was heavily criticised as poorly worded and poorly thought through, and it was amended in 1997 because it was deemed to unfairly criminalise certain dog owners through strict liability.
A difficulty with the law is that it is no secret to the public that it is imperfect to some degree. This generates doubt in the legal system and fears that it may be exploited by individuals in pursuit of power.
Citing Radbruch’s formula, derived from Gustav Radbruch, formerly a legal positivist from Germany, helps to advance our understanding of the topic. He believed that stripping morality from the law had only furthered the oppressive Nazi regime of the 1930s and early 1940s. In response he developed the idea of ‘statutory lawlessness’: if a law is intrinsically filled with discriminatory intent, then it is dissolved of all its legal character and validity, and therefore cannot be followed. Applying this logic, a Nazi accused of war crimes could not rely on the defence that they were simply following the law, as they would still be morally culpable.
This is entwined with his notion of an ‘intolerable’ threshold: positive laws, whilst usually to be followed, must yield to justice once their contradiction with values of fairness surpasses that threshold.
In short he believed that the law and morality are two abstractions which are inextricably linked and operate best when applied to each other.
On the other hand, it can be argued that making these changes is neither pertinent nor realistic, a viewpoint held by the Oxford academic H. L. A. Hart, who believed it inconceivable to rationalise a more flexible set of laws giving leeway to morally fair ideals. He saw the laws of a country as more of a generalised reflection of the legal system which designed them.
This corresponds to oppressive regimes such as Germany under Nazi control, whose posited laws disregarded basic human decency and set the precedent for the immoral treatment of people under that government’s rule.
All these perspectives trace straight back to the initial question: why is law necessary in society? What positivists and anti-positivists argue over is the direct reason the law is made, for it is not simply to maximise fairness in our legal system, nor to instigate complete societal safety.
The laws that we adhere to are implemented by our respective governments in order to determine the type of society that they want us to live in. Irrespective of whether or not the laws are constitutionally correct, they must be followed. Of course, most societies, such as the UK, have values which align with morality and criminal justice, and which aim to eliminate violence and oppression within the country.
However, countries which operate under dictatorship regimes cannot be said to have no laws; quite the opposite: these regimes enforce immeasurably restrictive rules on individuals in order to maintain totalitarian power and reduce the chance of opposition.
As the philosopher Edmund Burke famously said, “Bad laws are the worst sort of tyranny”.
By staying in compliance with the law, the people of a country essentially enter what can be labelled a ‘social contract’ with each other, all agreeing to comply with the formal rules and simultaneously benefiting from the security, living standards, resources and so on which are provided by the state that enforces these laws.
This is the primary reason why the law can be considered a necessity for modern society: it lays the groundwork for people to reside in a functional world which, although it will never be perfect, allows society to keep progressing in a positive direction.
In this modern world it is more important than ever to understand the reasoning behind the laws that we follow in our day-to-day living, and always to question how they impact our society. Changes are not made in the law without a motive or precedent, and these alterations must be made with the purpose of positively influencing the world in which we live; otherwise they can be considered meaningless in terms of achieving their overall intention.
A final thought is the idea that laws can be seen as fragile, reliant on remaining relevant in order to be followed. For example, highly ambitious individuals would likely oppose communism because its ideals emphasise the collective over the individual. This leads to a separation of unified thinking within a country and creates an inharmonious balance between the people and the government, which can result in protests, violence and, in the worst cases, civil war.
The law can cause as much destruction as it can serve the good of society, which is why it is so crucial that it is rationally implemented in the best interests of the people.
Another example of differing interpretations of the law can be seen in recent political escalations, epitomised by Donald Trump’s comments about his discussions with Keir Starmer relating to the strikes by the US and Israel on Iran, in which he said Starmer appeared to be “worried about the legality” of the attacks.
This insinuation that international law is irrelevant when it does not align with his moral views, whilst appearing practical in anti-positivist theory, can be seen in this situation to cause division between nations on a political scale, as it sets a precedent that any law can be overlooked given sufficient moral context and justification. Not all nations follow these sorts of constitutional guidelines, and this can result in a disconnect between countries.
This shows why it is important not only to have laws, but also that they be respected.
Gender Troubles: Butler, bell, (de)Beauvoir, and Becoming A Woman
By Isabella Heffer
Gender has been a key component of the cultural zeitgeist with increasing intensity in recent years, with the mocking challenge of ‘defining a woman’ often being hurled at those who purport to hold vaguely socially progressive views. Generally, this will be met with indignant tripping over words, responses of the tautological variety such as ‘someone who feels like a woman’, and an immediate defensiveness in said interviewee. While an understandable reaction after the tradition of lampooning has been hanging over the debate for years, it is nevertheless an interesting question, and, as with all interesting questions, worthy of debate and analysis. So what does it mean to ‘become a woman’? When do we acquire our gender? When we have our first periods? Try on a grown-up outfit for the first time? When we first hold keys between our fingers in case the man walking behind us at night decides ‘no’ doesn't mean no after all?
A biological approach is not uncommon in these cases, and when posed in good faith it is still an interesting argument. Yet the question remains: when does one become the gender assigned by society? At which point are we socialised (or do we socialise ourselves) into a gender? When our sex is established in chromosomes at fertilisation? If so, what do we do with the estimated 1.7% of people with conditions such as Klinefelter syndrome (XXY) or Turner syndrome (XO)? Even aside from the spanner the intersex community seems to throw into the works of this argument, if we take gender by its dictionary definition, as a construct defining a person’s ‘roles, behaviours, expressions, and identities’ in society, how do one’s chromosomes affect how one is viewed in society? OK, then surely we become women at week seven, when the XY embryo starts to develop differently from the XX? Well, apart from our continued avoidance of the intersex population, what about scales of different features? Do people with bigger characteristics ‘possess’ more gender? Even from a preliminary once-over we can see that applying a single-pronged answer to a social concept is difficult, and while biology is likely the most respected of the fields I will use today, it leaves us with more questions than answers, acting as a puzzle piece you have to shove in a little: it’ll work if you really want it to, but it’s best to look at all the options before settling down.
Definitionally, there has been a wide range of theories around the word ‘woman’ itself. If you ask the OED, it’s “an adult female human being; the counterpart of man”, a fundamentally tautological definition, as most are, but even when we try to define ‘female’ (my issue with the original definition, “an animal that can lay eggs or give birth to babies; a plant that can produce fruit”), we run into the same fertility problems as in the biological question. Etymologically, despite theories about the fateful ‘womb’, the word ‘woman’ actually comes from a compound of the Old English word for an adult female, ‘wif’, and the gender-neutral word for a human being or person, ‘man’, joined together to specify. The lack of standardised spelling in Old English led to a shift by the time of ‘standardisation’ (a word used very liberally, if we take a look at Chaucer) in the Middle English period into “wimman” and “wommon” (adult female human being). By the 1600s, likely owing to the Great Vowel Shift among other things, the versions we know today were established: “woman”, singular, and “women”, plural, modelled on man/men. So, somewhat predictably, tautological definitions and etymological trundles do little to aid us either.
Back in territory where I’m far more comfortable, the philosophical sphere (in whose remit everything falls, one way or another) holds a wealth of gender criticism stretching back far earlier than the coinage of the term ‘gender identity’ by psychiatry professor Robert J. Stoller in 1964.
The first person in history to be credited for their work (that we know of and still have) was the Mesopotamian poet and priestess Enheduanna, whose writings in circa 2300 BCE speak directly to the first glimpses of gender blurring and non-conformity. From an expression of the mutable nature of sex (“Now I have been cast out… Even my sex is dust.”) to the variability of gender (“The power to destroy, to build up, to tear out and to settle are yours, Inana. To turn a man into a woman and a woman into a man are yours, Inana.”), these texts speak to a gender ‘queerness’ which was a strong feature of the cults of the goddess for whom these poems were written: the pilipili, for example. In the Passionate Inanna (lines 80-90), a person brought to Inanna was raised as a woman but, blessed by the goddess by being handed a spear ‘as if she was a man’ and renamed pilipili, would come to be described as ‘the transformed pilipili’. Even though Sumerian did not use gendered pronouns, the imagery of the spear was enough to convince many of the shift from female to something quite different. In fact, Inanna was a source of a great deal of interesting gender ‘queerness’. Priests of Inanna were known as the gala and were said to have been created by the god Enki to sing laments for her, before their role was heavily expanded through the Old Babylonian period to include presiding over religious rites, healing and looking after the sick and poor, and performing elegies and lamentations. Perhaps most interesting for our purposes is that last duty, as mourning rites originally sung by women were replaced over time by members of the gala, who for all intents and purposes with this shift became women. They adopted female names, sang in the Sumerian eme-sal dialect strictly reserved for feminine speakers to render the speech of female gods, and the possibility of ritual castration has been suggested frequently.
In the later Babylonian poem The Epic of Erra we find perhaps a blunter gesture towards the fluid aspect of the feminine, when of the assinnu – a feminine character whom translator Stephanie Dalley calls ‘good-looks the playboy’ to avoid direct gendering, and who she suggests may have been a boy ritually castrated – the poet says: “Whose maleness Ishtar (the Babylonian name for the Sumerian Inanna) turned female, for the awe of the people”.
Archaeological evidence also points to a greater fluidity of gender than we would perhaps expect from the ancient past. The statuette of the singer Ur-Nanshe, found in the Sumerian city of Mari, has a ‘feminine’ form, with traces of makeup, the suggestion of breasts, and a soft face, despite their masculine name.
The intrigue of this statuette only increases when we take into account its attribution to the goddess, despite the name Ur-Nanshe literally translating to ‘servant of Nanshe’, a separate goddess of prophecy, social justice, and fish. This suggests a purposeful link with the goddess of love, whose cult allegedly welcomed those outside the gender binary, and that link could well be down to the naru (singer)’s own identity.
A fragment of a statue currently in the archive of the British Museum furthers this interpretation perhaps most significantly, in an inscription often translated as “Silimabzuta, hermaphrodite of Inanna”. Cheryl Morgan has offered the more literal, and arguably more accurate, translation “person-man-woman”, a formulation that has been used in relation to trans people in many cultures. So for the ancients (or as ancient as we have actual named sources for), in what in theory should be the cradle of human thought and of ‘natural’ roles, there were still these exceptions to seemingly frail norms, the veil between the genders deceptively thin. Interesting as this is, apart from introducing the concept of gender as a performance over four thousand years before Butler (though they’ll have their due turn soon enough), it fails to answer definitively the question of what it is to be a woman, so thanks for nothing.
Sailing through the quick interim between 20th-century BCE Sumer and 20th-century CE France requires little more than a few phrases when looking at the Western world (I know pitifully little about extra-European history and would be loath to make unsubstantiated judgements when, let’s face it, almost anyone but Europe was likely far better at handling the whole gender question in general. Certainly when we look at two-spirit people in some Indigenous communities of North America, such as the Haudenosaunee, or the pre-colonial Bugis people of Sulawesi in what is now Indonesia, they come up trumps). En masse, women are subjugated, at times not allowed to be seen or heard, raped, treated as commodities, and belittled. I’m always averse to citing media which thinks itself far too witty for words, but to quote Fleabag, ‘women are born with pain built in’. So is that what it means? Is the threshold for womanhood pain? No, of course not. One of perhaps the most unifying things about humanity is pain: we all have it, and it would be preposterous to suggest that all women ever have felt the same amount of pain, or that theirs is a pain inherently greater than that of men. Then perhaps it is a shared history of trauma, the whispers of fear in many cultures that at some point a man may just snap, and suddenly your body isn’t yours anymore. This too is flawed: while women may be the victims of much of the gendered violence we have known in history, to suggest we are the only gender to have that history only feeds abusers. Also, to define oneself by one’s inherited victimhood does nothing but perpetuate ancestral pain. Yet again, we find ourselves at an impasse.
Having exhausted most options I can think of at the moment, we relinquish power to the experts (a policy some current politicians could do with adopting). Gender theory has been grappling with these concepts since far before me, and frankly, I know when I’m beat.
Simone de Beauvoir, born in 1908 in Paris, is first and foremost a highly controversial figure. I was excited to include her for the revolutionary work her 1949 book ‘The Second Sex’ did for the fundamentals of second-wave feminism, and for what that means for the way in which we can perceive gender as the negative experiences one endures. De Beauvoir was also an alleged groomer: three separate girls accused her of sexually exploiting them as 16- and 17-year-old students in her care, within her open relationship with Jean-Paul Sartre, and along with others she signed a petition to lower the age of consent in France. On discovering her character and alleged grooming, I considered pulling her from the article altogether, but for the sake of intellectual rigour and the separation of art from artist, her words remain as a puzzle piece in the exploration of this article. The fact her writing is useful to us does not take away from the fact that she was an objectively bad person. In ‘The Second Sex’, de Beauvoir introduces the concept of gender as a social construct by depicting how women are cast as the ‘other’, the passive outline to man’s active agent: ‘immanence’ to his ‘transcendence’. In a way, women provide the negative space to drawings presented by men. As she says, “On ne naît pas femme: on le devient.” (One is not born a woman, but becomes one), by virtue, she argues, of one’s socio-political upbringing. In using the masculine ‘le’ to describe the emergence of womanhood, de Beauvoir in my view further proves her point: in a patriarchal society, to be an agent is to be male, and one who is placed in this position of subjugation takes on the role of the woman through the agency of the patriarchal society in which they reside. (It is probably just a grammatical thing, the ‘le’ denoting women as a whole rather than ‘la’ suggesting one woman in particular, but the afterlife and reception of such a phrase is what makes this saying so interesting.)
Judith Butler’s ‘Gender Trouble’, published in 1990, develops this theory of womanhood as an acquired characteristic rather than a biological fact. Butler argues that gender is a performance: not only something imposed upon us by society but something that we ourselves confirm and develop, even subconsciously, by virtue of the norms and roles prescribed to the label with which we identify. Their argument detaches sex from the conversation entirely, holding that the way one identifies oneself is the whole story. However, critiques have been made of this argument, as it seems to some to prescribe certain rules of femininity to which one must adhere in order to identify with the gender. In matters of personal identity it is often difficult to make widespread judgements which accurately encompass each individual experience, but Butler’s argument isn’t about that. They’re not trying to define femininity; they’re arguing that what matters to society (the sphere under which gender resides, and on which our relationships and the way we see each other hinge) is the way in which we perform our preferred gender, and that this is therefore the only thing of tangible importance to our daily lives.
bell hooks’s pen name is purposely left uncapitalised, by the nature of her work as a philosopher, as she seeks to decentre the ego of name-brand authorship. Perhaps inadvertently, she also exposed the ease with which Western society will baulk at any name it doesn’t understand, as evidenced most clearly by the extent of the reaction, from the publishing world in particular, to her insistence on the decapitalisation. For fear of inadvertently propagating this centring tradition against her express wishes by over-explaining, we move swiftly on to the content of the work itself. A woman is an entity emancipated through her struggle for freedom: a woman isn’t a feminist by nature of her womanhood but through her struggle for her rights. For hooks, womanhood is emancipation and freedom from patriarchal values and biological ties; it must be, if it is to exist apart from men rather than as a lesser partner. However, in ‘Ain’t I a Woman’ she brings out the idea of intersectionality as shattering the concept of women as a homogenised group. There is no such thing as the correct role for a woman, because women have such different lived experiences and biological realities; therefore one cannot discuss ‘women’ as a group without crushing the extent to which their other threatened characteristics also negatively affect them, as these are an essential element of how they perceive and receive their individual womanhood. Just as a wavefunction will intrinsically collapse upon interaction with the environment, or ‘observation’, the perception of blanket ‘womanhood’ destroys the nuance of the situation, even if you go on to explain; it cannot be restored once that impression has been made.
So what have we learnt? Absolutely nothing. Why? Because gender is essentially what we make of it. Platitudinal as the phrase may be, gender remains a social construct, which means you can do whatever the hell you want with yours. Perhaps, if you are to take anything from this, it should be that the performance of femininity, of gender, has been a tradition lit in the stars for millennia (even as its meaning has changed), and whether you want to join the kickline, take a seat, or leave the auditorium, it’s gonna continue to be a bloody good show whether you like it or not.
-The Archer Eye-
Est. 2022