

Uppingham Research Journal Issue 5 February 2026

Is morality a human invention?

“Classicism was the most important driving force behind art and architecture of the Renaissance” To what extent do you agree?

The Protein Folding Problem

Evaluating US Economic Growth in the AI Age

“Use every man after his desert, and who should ’scape whipping?” Should the law treat offenders better than they deserve?

The Scramble for Africa: A Resource Grab Disguised as Imperialism?

Rise of Shoegaze

Foreword

Dear Reader

Welcome to the fifth edition of the Uppingham Research Journal. The research and arguments constructed by the Upper Sixth editors continue to inspire creativity in each of us through their exploration of innovative and varied topics.

As you read on, we hope you are challenged as much as you are inspired. The journal exists as a space for ideas that do not fit neatly into exam specifications: a space to interrogate assumptions, to question orthodoxies, and to contribute thoughtfully to wider academic conversations. Writing, whether scientific, philosophical or persuasive, continues to drive progress in the world we live in. It reshapes our perspective on the world around us, our interpretations and our understanding, and in doing so creates a butterfly effect of influence. This journal aims to celebrate that tradition and to keep the power of writing alive as an art we master.

Once more I thank the entire team of editors, across all subjects, and celebrate the commended works of my incredibly talented peers. I thank Hermione for her help by my side, and hope this journal continues to pursue its aims in the years ahead. I am grateful to Mr Addis, an incredibly valuable member of our team, for his continuous support, guidance and expertise. Finally, we thank you, the reader, for celebrating the incredible effort and interest poured into these pages; we hope they bring you the inspiration and enjoyment they have brought us.

If you are inspired by what you read and have produced work you believe merits publication, we encourage you to submit it to a member of the editorial team or the relevant subject head editor.

“If you want to influence people, write the truth.” Leo Tolstoy

Is morality a human invention?

The question of whether morality is a human invention confronts the tensions between divine command, natural law, secular humanism, and existentialism. Whilst traditional religious moral codes posit morality as an objective, transcendent authority, existentialist and sociological perspectives undermine this view, especially amid secularisation and moral pluralism. The strongest argument holds that morality is neither discovered nor revealed, but constructed and shaped by individual freedom and reason within a cultural and social context. Thinkers who sustain this view include Sartre, Durkheim, Hare, and Mackie, as they explore how morality emerges from human interpretation and evolution as opposed to divine ordinance or fixed truths. Through existential freedom, linguistic development, and rational deliberation, morality emerges not as an eternal blueprint but as a reflective and functional human project: an ongoing invention rooted in human thought and social cohesion.

It can be argued that morality is a human invention through the concept of existentialism and its rejection of objective moral foundations. Morality's foundation could be viewed as lying on a scale of belief, the extremes being the purely atheistic and the purely divine. Sartre advocates one of these extremes, the atheistic approach, rejecting the notion of an objective and divinely ordained moral order. Sartre argued that 'existence precedes essence', meaning that we do not possess any preordained purpose or moral nature when born; we acquire it throughout our lives¹. Sartre argues that 'man first of all exists, encounters himself, surges up in the world and defines himself afterwards'², portraying how it is our existence and experience that cultivate a morality contingent on what we encounter; this is supported by the empirical evidence of individuals' differing moral compasses. Here we see support for morality as a human invention, as he argues that humans are the authors of value and meaning: it is we who assign meaning to life, since the absence of divine moral law produces radical freedom. Sartre poses this through the words 'If God does not exist, everything is permissible…man is forlorn, because neither within him nor without him does he find anything to cling to'. Sartre is saying that this absence of divine authority leaves no external source to guide morality; he emphasises that humans face an incredible moral responsibility, as we create our own truths and then bear sole responsibility for acting in accordance with them, and all choices have profound implications for the individual and the collective creation of the world³. Dostoevsky's phrase 'everything is permitted'⁴ signifies the freedom that accompanies this absence of a pre-ordained morality: humans invent their morality and take the consequences for their actions. Finally, one can read into Sartre's existentialism a form of humanism, as he believes that creating our own values means we can achieve true authenticity and self-realisation; it is our own self-creation, or 'invention', that leads to the treasure of meaning and purpose in our lives⁵. Macquarrie elaborates on this as he explains that morality is rooted in authenticity, defined as the individual's responsibility to act in accordance with their freely chosen ethical values⁶, the collective idea being that morality is 'invented in the freedom of the individual'⁷. Sartre critiques 'bad faith', whereby an individual attempts to escape the burden of moral freedom; moral failure becomes a matter of evasion, not ignorance, when aligned to his existentialist view of morality as an invention. Existentialism holds each person accountable for their created moral framework, as morality is humanly constructed⁸. Thus, Sartre sets out a grounding argument for morality as a human invention, since it is the root of his philosophy; his direct approach to moral order firmly supports the argument, alongside the evidence found in our natural moral codes and actions.

¹ Existentialism is a Humanism by Jean-Paul Sartre 1946

² Existentialism is a Humanism by Jean-Paul Sartre 1946, Page 22

³ Existentialism is a Humanism by Jean-Paul Sartre 1946, Page 34

⁴ Attributed to The Brothers Karamazov By Dostoevsky

⁵ Existentialism is a Humanism by Jean-Paul Sartre 1946

⁶ Existentialism: An Introduction, Guide and Assessment by John Macquarrie, Ch. 7 'Sartre', Page 152

⁷ Existentialism: An Introduction, Guide and Assessment by John Macquarrie, Ch. 7 'Sartre', Page 154

⁸ Being and Nothingness by Jean-Paul Sartre 1943

There is an obvious tension between religious moral authority and subjective morality; Kierkegaard, often regarded as a proto-existentialist, harmonises the two contrasting positions. Kierkegaard emphasises individual subjectivity within the parameters of faith and morality. He presents the story of Abraham's near sacrifice of Isaac as an illustration of faith exceeding ethical duty. This complements his concept of the 'teleological suspension of the ethical', which refers to obedience to the divine word suspending our moral order, in effect a rejection of universal moral reason and, indirectly, of Kantian ethics⁹. Kierkegaard sets this out through the words 'the ethical expression for what Abraham did is that he was willing to murder Isaac; the religious expression is that he was willing to sacrifice Isaac'¹⁰; this demonstrates how religion and moral norms can differ and create a preference for subjective commitment. Kierkegaard's proposal of this paradox is discussed by Gouwens, who argues that the teleological suspension illuminates the fragility and contingency of 'universal ethics' (whether that is an accurate label for ethics is part of this essay's discussion) in the face of personal and religious belief¹¹. An insightful summary of Kierkegaard's argument is that 'Kierkegaard shows us that the highest moral and spiritual truth cannot be captured by rational universality; it demands personal appropriation and inwardness', conveying how, within a religious framework, moral truth requires internalisation by the individual where an objective law fails¹². On the scale of views on morality, Kierkegaard falls into a niche category: he is positioned as a religious thinker who nonetheless challenges fixed moral laws, even divinely ordained ones, in favour of personal responsibility. For Kierkegaard, moral and religious truth is not imposed or objective; it requires an individual to commit to it in freedom, reinforcing the existentialist notion of morality as personal and interpretive. He is certain of his belief in God, yet he does not present divine command as a ready-made imperative: it must be interpreted through subjective experience¹³. Gouwens offers a few words which underline the human-mediated nature of the morality Kierkegaard manages to reconcile, noting that 'Faith is not the confident possession of divine truth but a trembling venture into the unknown'¹⁴. It is useful to reference Durkheim, who takes the view of morality and religion a step further: he argues that religion is simply a symbolic expression of society's moral order and is not dependent on supernatural revelation¹⁵. Much of his argument frames religion as serving to unify individuals through the creation of a shared system of beliefs, reinforcing the collective identity of a community. Durkheim believes that society is the foundation of religion, as it shapes religion's symbols, rituals, and moral codes to reflect and uphold the social order. Religion, therefore, does not exist independently of society but emerges as an expression of collective values and the need for social integration¹⁶. This positions moral codes within religion as social constructs, exceeding Kierkegaard's claim and stating that morality originates from human society, not any source of divine command. Ultimately, Kierkegaard's reconciliation of humanly sourced morality and divine law is persuasive, but more convincing still is Durkheim's analysis, which supports the existentialist view of morality as invention.

When assessing morality as a human invention it is important to evaluate and acknowledge the developments that portray it as such, a dominating factor here being secularisation. Charles Taylor offers an illuminating examination of the role of moral identity in the West and its reconfiguration through secularisation. The core of Taylor's argument is the view that within modernity the 'moral sources' that first stemmed from theocentric views have become increasingly intrinsic to the individual rather than to a God or transcendent order: 'we have moved from a conception of the good as something given in a cosmic order to one rooted in the inner depths of the self'¹⁷. Morality as a human invention cannot be assessed without reference to the philosophical alteration of the Enlightenment; Taylor looks to thinkers such as Kant,

⁹ Fear and Trembling by Søren Kierkegaard 1843

¹⁰ Fear and Trembling by Søren Kierkegaard 1843, Page 58

¹¹ Kierkegaard as a Religious Thinker by David J. Gouwens, Chapter 4

¹² Kierkegaard as a Religious Thinker by David J. Gouwens, Chapter 4, Page 87

¹³ Concluding Unscientific Postscript by Søren Kierkegaard 1846, Page 178

¹⁴ Kierkegaard as a Religious Thinker by David J. Gouwens, Chapter 3, Page 63

¹⁵ The Elementary Forms of Religious Life by Émile Durkheim

¹⁶ Seven Theories of Religion by Daniel L. Pals, Chapter 3

¹⁷ Sources of the Self by Charles Taylor, Part III, Page 395

Rousseau, and Mill, who aided the redefinition of moral obligation around autonomy, reason, and dignity, as opposed to the former grounding in divine authority¹⁸. The Enlightenment, and the modification of people's views on morality and order, implies that moral systems are historical developments shaped through an evolving understanding of the self. Taylor explores his concept of 'moral space', referring to the cultural frameworks within which an individual orients themselves ethically; moral identity is formed within these shifting maps of meaning, suggesting morality is always constructed in relation to its historical and socioeconomic context¹⁹. This view promotes morality's existence as a human invention, as we see how we shape our morals to the society and time we inhabit. Therefore, Taylor sees morality as something close to the self: there is no transcendent anchor, and morality becomes a culturally embedded, shared human project. Events such as the Enlightenment support this argument, as we see thinkers branch out of theocentric thought.

In disagreement with the Enlightenment's way of thinking is MacIntyre's account of modern moral discourse as being in a state of disarray due to its severance from teleological roots. MacIntyre's initial focus is on modern moral language, which he considers mere fragments of a destroyed conceptual scheme; terms once central to morality, like 'justice' or 'virtue', lose their original meanings through the destructive evolution of morality²⁰. Kant, like many Enlightenment thinkers, focused on reason, but his deontological ethics grounded morality in reason alone, a new methodology that broke away from the parameters of the time. MacIntyre argues this is incoherent because it lacks the Aristotelian context (a view towards which he has a significant bias) from which these values emerged, contending that the Enlightenment project failed because it sought a morality independent of tradition²¹. Taking Aristotelian ideas further, MacIntyre argues against morality as a human invention and states that without a telos morality collapses into emotivism, which treats moral judgements as expressions of preference, attitude or feeling. In effect, for morality to have meaning it needs a goal and virtue; it cannot be purely intrinsic to humanity and still have meaning²². But does MacIntyre truly reflect this view? MacIntyre does affirm that moral systems are in fact embedded in practices, narratives, and traditions, and these are human constructs²³, supporting the view that morality is shaped, maintained, and communicated through human communities, conveying how morality is not passively received but actively cultivated.

Whilst MacIntyre presents a view of moral language as meaningless without purpose or true morality, R.M. Hare develops a non-metaphysical and functional account of moral reasoning grounded in the logic of moral language rather than theology. Hare argues that moral language is prescriptive and universalizable; he takes a position close to prescriptivism or preference utilitarianism, stating that 'the function of moral language is to guide choices and decisions'²⁴. Hare differs from Natural Law or theist views as he posits that saying 'x is wrong' is not a factual statement aligned to some moral order, but a rational commitment to act and judge consistently²⁵; this diminishes divine command, as it removes the objectivity of morality and puts in its place a personal and rational process. Anyone using moral language 'is logically committed to universalising it'²⁶, meaning moral reasoning is intersubjectively valid, within secular contexts also. Hare avoids divine command theory and moral relativism as he argues morality is a human invention governed by logical consistency and rational analysis, allowing it to stand independent of theology. Singer extends Hare's

¹⁸ Sources of the Self by Charles Taylor, Part III, Page 352

¹⁹ Sources of the Self by Charles Taylor, Part III, Page 32

²⁰ After Virtue by Alasdair MacIntyre, Chapter 1, Page 2

²¹ After Virtue by Alasdair MacIntyre, Chapter 5, Page 50

²² After Virtue by Alasdair MacIntyre, Chapter 1, Page 11

²³ After Virtue by Alasdair MacIntyre, Chapter 15, Page 222

²⁴ The Language of Morals by R.M. Hare, Page 20

²⁵ The Language of Morals by R.M. Hare, Page 39

²⁶ The Language of Morals by R.M. Hare, Page 83

framework as he argues that universalizability ties ethics to reason and rejects religion. He states 'the essence of ethical judgement lies in universalizability…this provides the link between reason and ethics'. While Singer is more explicitly utilitarian than Hare, their ideas converge on the lack of necessity for metaphysical foundations of morality and support a secular morality.²⁷ Mackie, though more sceptical, agrees that morality is a human invention; he claims 'there are no objective values', aligning with both Singer and Hare, and describes morality as a historically evolved invention to aid social cohesion. While Mackie views moral language as a systematic error²⁸, all three thinkers acknowledge morality as a social invention that emerges from human reason, language, and context rather than divine authority.

Therefore, morality is best comprehended as a human invention crafted through individual autonomy, cultural evolution, and rational consideration rather than divine revelation or objective moral truth. Sartre, as an existentialist and humanist thinker, illustrates the burden of moral freedom in the absence of a fixed order, whilst Kierkegaard offers a proto-existentialist analysis of this view. It is also critical to assess the sociological view of Durkheim, which exposes how moral systems and religious order function to sustain communal and societal cohesion. Taylor's reflections on the secular reconfiguration of morality enhance this view, demonstrating how morality alters with historical and cultural paradigms. Even critics of modern morality, such as MacIntyre, acknowledge the role of narrative and tradition, as human constructs, in shaping moral order. Morality is further shown to be an invention through the way it operates via language, logic, and shared human values independent of theology, as argued by Singer, Hare, and Mackie. Together, these perspectives affirm that morality is neither discovered in divine law nor imposed by divine command; it thrives as a human invention that sustains both the sense of self and societal cohesion.

Bibliography:

Kierkegaard, S., 1843. Fear and Trembling. Translated by A. Hannay. 1985 ed. London: Penguin Classics.

Kierkegaard, S., 1846. Concluding Unscientific Postscript. Translated by D.F. Swenson. 1941 ed. Princeton: Princeton University Press.

Gouwens, D.J., 1996. Kierkegaard as a Religious Thinker. Cambridge: Cambridge University Press.

Sartre, J.-P., 1943. Being and Nothingness. Translated by H.E. Barnes. 1956 ed. London: Routledge.

Sartre, J.-P., 1946. Existentialism Is a Humanism. Translated by P. Mairet. 2007 ed. New Haven: Yale University Press.

Durkheim, E., 1912. The Elementary Forms of Religious Life. Translated by K.E. Fields. 1995 ed. New York: Free Press.

Pals, D.L., 1996. Seven Theories of Religion. New York: Oxford University Press.

Taylor, C., 1989. Sources of the Self: The Making of the Modern Identity. Cambridge, MA: Harvard University Press.

MacIntyre, A., 1981. After Virtue. 3rd ed. 2007. Notre Dame: University of Notre Dame Press.

Hare, R.M., 1952. The Language of Morals. Oxford: Oxford University Press.

Singer, P., 1979. Practical Ethics. 3rd ed. 2011. Cambridge: Cambridge University Press.

Mackie, J.L., 1977. Ethics: Inventing Right and Wrong. London: Penguin.

²⁷ Practical Ethics by Peter Singer, Page 12

²⁸ Ethics: Inventing Right and Wrong by J.L. Mackie, Page 15

“Classicism was the most important driving force behind art and architecture of the Renaissance” To what extent do you agree?

During the Italian Renaissance, the Florentines began to discover ancient Greek and Roman statues, art and architecture. These became a source of inspiration for their modern-day designs. Ancient texts gave an insight into the lifestyle modern Florentines became obsessed with imitating. Florence wanted to be the New Athens. Ancient Greek Mythology became a focal point for art in Florence despite it being a Christian society, so much so that the Medici hired a Greek tutor to educate their younger generations.

A piece of architecture in Florence that was influenced by Classicism is the Palazzo Rucellai. It was designed by Alberti and built between 1446 and 1451. It was originally eight buildings, which were bought by Giovanni Rucellai to allow for a large, classical façade. It is an asymmetrical, three-storey building. The ground floor has two trabeated Greek-style doors separated by two brick bays. There is a bench which runs along the front of the building with diaper work (opus reticulatum) beneath it. This was an ancient Roman technique that would have appeared on the ancient architecture being unearthed at the time. The bench shows the Rucellai's civic pride, itself a Classical idea, and acts as a public display of commitment to the citizens of Florence. On the first and second floors there are round-headed mullioned windows separated by pilasters. Each floor level has a different order; this is inspired by the Colosseum in Rome. The floors also decrease in height as you ascend, which follows the writings of Vitruvius in his Ten Books of Architecture. This is a strong classical influence, as the treatise was written roughly between 30 and 20 BCE under the rule of Caesar Augustus. There are two entablatures between the floors: the first shows the family crest of the Medici, a diamond ring with three feathers; the second has the Rucellai family symbol, the Vela Gonfiata. The use of both family crests is representative of the marriage between Bernardo Rucellai and Nannina de' Medici in 1461. Giovanni Rucellai displayed this on his home to show his connections to the Medici family, who were the wealthiest in Florence. The commemoration of the marriage on the classical façade shows Alberti's ability to adapt Classicism in a way which represents his client. This stylistic frontage shows the high level of classical education both families possessed, further asserting their wealth and societal status.

A sculpture strongly influenced by Classicism is Michelangelo's “David”, made between 1502 and 1504. This is a colossal nude statue commissioned by the Arte della Lana for the Duomo in Florence. “David” being naked makes him look like Apollo or Zeus, the ancient Greek gods. “David” was also the first colossal statue since “The Horse Tamers”. “The Horse Tamers” were the only giant statues known at the time, and Michelangelo would have seen them in the ruins of the Baths of Constantine, so they were very likely his inspiration. “David” has a contrapposto pose, which tenses the abdominal muscles, making him look more idealised, as preferred in Classicism. He holds a slingshot in his right hand, which is draped down his back; this is the only part of the statue that identifies him as “David”, and without it he could be any other Greek god. Michelangelo's obsession with classical sculpture began before he created “David”: earlier in his career he was making sculptures and burying them. Later, he would unearth these buried sculptures, and they would be taken for true classical art. This shows Michelangelo's incredible ability to recreate classically stylised works, and therefore what a heavy influence Classicism had on the art of the Renaissance. The statue is made from Carrara marble, which was widely used in ancient Greece and Rome. However, some of the Florentine population found “David” offensive, as they seemingly were not prepared for a seventeen-foot-tall nude statue. This shows that only the cognoscenti were educated in Classicism, meaning that not all social classes of artists would have been influenced by it.

A painting from the Renaissance which has been somewhat influenced by Classicism is “Primavera” by Botticelli, painted between 1482 and 1483. The subject matter of this painting is entirely inspired by a mythological classical text, Ovid's Fasti. “Primavera” shows the transition between seasons, which is the story told in Ovid's poem. The figures in the painting are not as muscular or idealised as would be expected in a classical painting. On the right-hand side of the painting is Zephyrus capturing the nymph Chloris. Zephyrus (the Greek god of the west wind) marries her and causes her transformation into Flora, the Roman goddess of springtime. Flora is then painted to the left of Chloris. On the central axis is Venus; she has myrtle behind her head, the plant associated with love. The darkness of the leaves contrasts with her pale face, making it stand out. This is common in Gothic-style paintings, where the outline is especially important, unlike in Classicism. The last figure, at the left side of the canvas, is the Roman god Mercury, who holds a caduceus which repels rain from the Garden of the Hesperides. At the bottom of the canvas the forest floor is depicted; in the painting this resembles a Gothic tapestry. The International Gothic style can be seen throughout the entire painting. Although the subject matter is drawn from classical texts such as Ovid's Fasti, including Greek and Roman gods, it is clear the style and execution of the painting were inspired by Gothicism. Therefore, other styles have also had an influence on the art produced during the Italian Renaissance.

One piece of architecture which was not heavily influenced by Classicism is Brunelleschi's dome. Work started on Florence's new cathedral (Cattedrale di Santa Maria del Fiore) in 1296, overseen by Arnolfo di Cambio. The cathedral was to replace the church of Santa Reparata, built in the fifth century. Work on the drum finished in 1418, at which point work stopped, as no architect knew how to construct a 180-foot dome upon 140-foot walls (these dimensions had been decided back in 1337). Brunelleschi won a competition which meant he oversaw the design of the dome. He was instructed to create a pointed-fifth (quinto acuto) dome, because the shape of the drum meant it could not simply be an arch rotated 360 degrees. The quinto acuto shape and the ribs in the ceiling of the dome are both inspired by the International Gothic style. The incredible height of the building is also taken from Gothicism: height was used to achieve spiritual ascension by reaching up to the heavens and imposing God's presence. The Gothic cathedral at Beauvais was seen as a warning of the maximum height of a stone building: much of it collapsed in 1298, even though its roof was lower than the Duomo's. Brunelleschi therefore looked to Beauvais and wanted to create a taller structure; this would aid Florence's cultural supremacy. The cupola was also inspired by the Pantheon in Rome, especially the thinning of the walls as the dome rises to reduce the weight. The only classical parts of the dome are the exedrae and the lantern which surmounts it. Overall, the main style that drove the design of the Duomo is Gothicism, which can also be seen in Brunelleschi's prior work.

Although Classicism was an important driving force for art and architecture during the Renaissance, other styles also served as sources of inspiration. Where artists or architects sourced their inspiration depended upon their training, the commission and the location of their work.

The Protein Folding Problem

Proteins, the building blocks of all human cells, are complex molecules made of amino acid chains. They are involved in various biochemical processes in the human body, including enzymatic catalysis¹, signalling and immune responses, and are used industrially in biofuel production. Considered alongside carbohydrates and fats as one of the three major macronutrients, proteins are especially important for growth, development and tissue repair. With the human genome encoding over 20,000 different proteins, a specific protein's function depends hugely on its unique tertiary structure. Therefore, the ability to predict a protein's structure allows for greater understanding in various areas of science, including tackling industrial waste with enzymes, drug development and disease treatment. Several experimental techniques, including Nuclear Magnetic Resonance (NMR) spectroscopy, X-ray crystallography and cryo-electron microscopy (cryo-EM)², are used to determine protein structures. (Cheng, Grigorieff, Penczek, & Walz, 2015)

However, these techniques have failed to determine structures efficiently and carry various constraints, partly due to the complicated mechanisms of proteins as well as the astronomical number of possibilities they can fold into before stabilising. American molecular biologist Cyrus Levinthal estimated by brute force calculations³ in 1969 that a typical protein holds around 10³⁰⁰ possible conformations (Levinthal, 1969) – a number that would take longer than the age of the known universe (≈ 13.8 billion years) to enumerate. Yet, in stark contrast, some proteins fold in as little as milliseconds. (Jumper, et al., 2021) Thus, the challenge posed by proteins' nearly inestimable conformational space kept the quest of computationally predicting a protein's 3D structure unsolved for five decades.
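The scale of the problem is easy to verify with a back-of-the-envelope calculation in Python. The per-residue conformation count and sampling rate below are illustrative assumptions rather than Levinthal's exact figures:

import math

residues = 150                # length of a modest protein chain (assumed)
conf_per_residue = 3          # commonly quoted conformations per residue (assumed)
samples_per_second = 1e13     # generously assumed sampling rate

log10_conformations = residues * math.log10(conf_per_residue)
log10_seconds = log10_conformations - math.log10(samples_per_second)
age_of_universe_s = 13.8e9 * 365.25 * 24 * 3600   # ≈ 4.4e17 s

print(f"~10^{log10_conformations:.0f} conformations")
print(f"enumeration would take ~10^{log10_seconds - math.log10(age_of_universe_s):.0f} ages of the universe")

Even with these generous assumptions the search runs roughly 10⁴¹ times longer than the universe has existed, which is the heart of Levinthal's paradox.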

X-ray crystallography, developed by physicists William Henry Bragg and William Lawrence Bragg in 1912, is a technique used to reveal structural information about molecules at an atomic level. (Smyth & Martin, 2000) Whilst resource intensive, it has been the primary means of determining protein structures in past decades. X-ray crystallography determines a protein's structure by crystallising it and exposing its crystal lattice to X-rays. The scattered radiation forms a diffraction pattern whose strength is proportional to the electron concentration at each point; this is captured and mathematically converted into an electron density map via a Fourier Transform, an integral transform⁴ (Mirsalehi, 2003) that converts time series data⁵ to complex-valued domain data⁶ through a sum of sine and cosine waves of different frequencies. (Brigham, 1988) Each point in the electron density map is then given by a single wave, consisting of the magnitude and relative phase of the X-rays scattered along that direction. (Cowtan, 2001) However, only the amplitudes of the waves can be obtained experimentally; the phase information, which determines where the peaks occur, cannot be directly measured, only estimated indirectly using molecular replacement, a time-consuming and often unreliable method. Constructing the accurate electron density map required for protein structure modelling has hence historically been a bottleneck for the process. This is known as the "Phase Problem".
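The Phase Problem can be illustrated with a minimal numpy sketch on a toy one-dimensional "density" (the real experiment is three-dimensional and records intensities proportional to the squared amplitudes; the simplification here is only for illustration). With the true phases the density is recovered exactly; with random phases the same amplitudes yield noise:

import numpy as np

rng = np.random.default_rng(0)

# Toy 1D "electron density": two sharp peaks standing in for atoms.
density = np.zeros(256)
density[[60, 180]] = 1.0

spectrum = np.fft.fft(density)
amplitudes = np.abs(spectrum)    # experimentally measurable (diffraction intensities)
phases = np.angle(spectrum)      # not directly measurable: the Phase Problem

# With the true phases, the inverse transform recovers the density exactly.
recovered = np.fft.ifft(amplitudes * np.exp(1j * phases)).real
# With random phases, the same amplitudes yield meaningless density.
random_phases = rng.uniform(-np.pi, np.pi, size=256)
scrambled = np.fft.ifft(amplitudes * np.exp(1j * random_phases)).real

print(np.allclose(recovered, density))                        # True
print(round(abs(np.corrcoef(scrambled, density)[0, 1]), 2))   # near 0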

Moreover, X-ray crystallography cannot be applied to membrane proteins or large and complex structures, as these proteins are too flexible to crystallize stably. X-ray crystallography is also limited to providing static snapshots of a protein; this conflicts with a protein's dynamic behaviour in solution and often leads to the loss of dynamic information. Hence, X-ray crystallography has been unsuccessful in

¹ Enzymatic catalysis: The acceleration of chemical reactions in living organisms by enzymes.

² Cryo-electron microscopy: A technique used to determine the 3D structure of biological molecules at near-atomic resolution.

³ Brute force calculation: A straightforward but computationally expensive problem-solving approach where every possible solution is systemically checked until the correct one is found.

⁴ Integral transform: A linear operation that converts a function into another function via integration.

⁵ Time series data: A collection of observations for a single entity (subject) at different time intervals.

⁶ Complex-valued domain data: Data consisting of both the magnitude and the phase of a set of sinusoids (sine curve) at the frequency components of the signal.

providing structural predictions of the required atomic accuracy. (Jumper, et al., 2021) Combined with its resource demands, this has prevented the technology from being utilised effectively.

What is AlphaFold 2?

AlphaFold 2 is a multicomponent artificial intelligence program that accurately predicts a protein's 3D structure from its primary amino acid sequence. The remarkable accuracy of AlphaFold 2 is demonstrated by its victory and record-breaking score in the 14th Critical Assessment of Structure Prediction (CASP14) in 2020, a biennial blind assessment founded by Professor John Moult and Professor Krzysztof Fidelis to evaluate protein structure prediction models. Seen as the “gold standard” for assessing protein prediction techniques, CASP requires participants to blindly predict the structures of recently determined proteins that have yet to be deposited in the Protein Data Bank (PDB), a universal database for three-dimensional structural data of biological molecules such as proteins and nucleic acids. (Burley, et al., 2019) These predictions are then compared to experimental data and scored from 0-100 according to the Global Distance Test (GDT). GDT measures the percentage of amino acid residues within a threshold distance of the correct position (Jumper, et al., 2021); a score of 90 GDT is usually considered extremely competitive. In CASP14, AlphaFold 2 achieved a high median score of 92.4 GDT overall and 87.0 GDT in the most challenging free-modelling category. (Jumper, et al., 2021) The predictions possess an average Root Mean Square Deviation⁷ (RMSD) of 0.96 Å⁸ relative to experimental coordinates for backbone atoms (Figure 1); in other words, the typical error is comparable to the atomic radius of chlorine (≈ 1.0 Å). (Jumper, et al., 2021) Google DeepMind scientists Demis Hassabis and John Jumper were consequently awarded the 2023 Lasker Award, a prestigious award often considered a precursor to the Nobel Prize, for their development of AlphaFold 2.
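A minimal sketch of a GDT_TS-style calculation, using the commonly quoted 1, 2, 4 and 8 Å cutoffs (the official CASP evaluation also searches over superpositions, which is omitted here for brevity):

import numpy as np

def gdt_ts(pred, true, thresholds=(1.0, 2.0, 4.0, 8.0)):
    # Mean percentage of residues whose predicted position lies within each
    # distance cutoff (Å); assumes (N, 3) arrays already optimally superposed.
    dists = np.linalg.norm(pred - true, axis=1)
    return 100 * np.mean([(dists <= t).mean() for t in thresholds])

rng = np.random.default_rng(0)
true = rng.uniform(0, 50, size=(100, 3))                # toy backbone coordinates
pred = true + rng.normal(scale=0.5, size=true.shape)    # ≈0.5 Å prediction error
print(f"GDT_TS ≈ {gdt_ts(pred, true):.1f}")             # close to 100 for small errors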

Figure 1: The performance of AlphaFold in CASP14 relative to the top 15 out of 146 entries. The bars represent a 95% confidence interval of the median. (Jumper, et al., 2021)

⁷ Root mean square deviation: Measures the average difference between a statistical model’s predicted values and the actual values.

⁸ Angstrom (Å): A unit of length named after Swedish physicist Anders Jonas Ångström. 1 Å = 10⁻¹⁰ m.

How does AlphaFold 2 Work?

The accuracy of protein structure prediction by AlphaFold 2 stems from its employment of sophisticated neural network structures and training procedures based on the evolutionary history as well as the physical and geometric patterns of protein structures. (Jumper, et al., 2021) The AlphaFold 2 network adopts the Evoformer, a network consisting of 48 blocks⁹ which iteratively process and refine protein structure predictions. Each block processes two key inputs: the Multiple Sequence Alignment (MSA) Representation and the Pairwise Residue Representation.

The MSA Representation is a Nseq × Nres matrix that aligns homologous protein sequences to capture evolutionary trends of proteins. (Mirabello & Wallner, 2019) Note that Nseq represents the number of homologous sequences in the MSA whilst Nres represents the number of amino acid residue positions in the protein sequence. The MSA Representation informs the model of any mutated amino acids that could induce a structural and/or functional change in the resulting protein.

Contrastingly, the Pairwise Residue Representation is an Nres × Nres matrix which stores the predicted distance information and interactions (i.e. hydrogen bonds and hydrophobic interactions) between two specific amino acid residues in the protein. Modelled as a graph but stored as a matrix, entry (i, j) of the pairwise representation helps the model learn how the positions of different amino acid residues relate to each other in 3D space. (Jumper, et al., 2021)
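In code, the two representations are simply arrays of these shapes. In practice AlphaFold 2 stores a learned feature vector, not a single number, in each matrix entry, so a channel dimension is added; all sizes in this sketch are illustrative:

import numpy as np

n_seq, n_res = 128, 250     # illustrative: homologues found, residues in the chain
c_msa, c_pair = 256, 128    # illustrative feature channels per entry

# MSA representation: one feature vector per (homologous sequence, residue) slot.
msa_rep = np.zeros((n_seq, n_res, c_msa))

# Pairwise representation: one feature vector per ordered residue pair (i, j).
pair_rep = np.zeros((n_res, n_res, c_pair))

print(msa_rep.shape, pair_rep.shape)   # (128, 250, 256) (250, 250, 128)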

These inputs are processed inside each Evoformer Block, which consists of three specialised layers: the Self Attention Layer, the Triangular Multiplicative Update Layer and the Transition Layer.

The Self Attention Layers analyse the MSA Representation by detecting residues that co-evolve and form physical contacts. Firstly, the Query (Q), Key (K) and Value (V) vectors of every residue in the MSA Representation are computed. The Q vector represents the residue's “question” (i.e. which other residues am I linked to evolutionarily and likely to interact with?), the K vector represents the residue's “answer”, and the V vector encodes information about the residue, including hydrophobicity and charge. These vectors are then used to calculate a Weighted Score which indicates the relationship between the residues. The Weighted Score for two residues i and j is given by the following relationship. (Vaswani, et al., 2023)

weight(i, j) = Softmax((Qi • Kj) / √dk)

Where:

Qi • Kj represents the dot product of the Query (i) and Key (j), √dk represents the scaling factor to stabilise gradients, and Softmax¹⁰ converts the scores into probabilities.

⁹ Block (computing): A section of code consisting of one or more declarations and statements.

¹⁰ Softmax: A mathematical function that converts a vector of numbers into a probability distribution.

Since the dot product of i and j measures how well two residues align with each other evolutionarily, a large value indicates that the residues frequently mutate together and hence co-evolve strongly. A high attention weight (≈ 1.0) is assigned when the residues are likely in direct contact in the protein's 3D structure. On the other hand, a low attention weight (≈ 0.0) indicates the residues are spatially distant from each other and have no functional relationship. (Jumper, et al., 2021) These weights are then refined using multi-head attention¹¹, where each “head” processes its own Q, K and V vectors for the same residues.

(Vaswani, et al., 2023) This allows different heads to detect different types of relationships simultaneously. The resulting attention weights, also known as final attention weights, are used to directly guide AlphaFold 2's 3D structure prediction for a protein. High-attention residue pairs are reflected by shorter distances in the Pairwise Residue Matrix, with the residues positioned according to the weights using Invariant Point Attention (IPA)¹².
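A minimal numpy sketch of one attention head makes the mechanism concrete. This is the generic scaled dot-product attention of Vaswani et al.; AlphaFold 2's variant adds gating and biases derived from the pair representation, which are omitted here:

import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))   # subtract max for stability
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(Q, K, V):
    # weight(i, j) = Softmax((Qi . Kj) / sqrt(dk)), as in the formula above.
    d_k = Q.shape[-1]
    weights = softmax(Q @ K.T / np.sqrt(d_k))   # (n_res, n_res) attention map
    return weights @ V, weights

rng = np.random.default_rng(0)
n_res, d_k = 50, 32
Q, K, V = (rng.normal(size=(n_res, d_k)) for _ in range(3))
out, weights = self_attention(Q, K, V)
print(out.shape, float(weights[0].sum()))   # (50, 32); each weight row sums to 1.0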

The inputs are then processed by the Triangular Multiplicative Update Layer, which enhances the geometric consistency of the prediction. Each Pairwise Residue Representation matrix contains distance probabilities, i.e. the likelihood that two residues lie within a given distance (e.g. 5 Å) of each other. The Triangular Multiplicative Update Layer first computes a conceptual “Triangle Rule” score for every triplet of residues; this determines whether the triplet obeys the triangle inequality theorem, which states that the sum of the lengths of any two sides of a triangle must be greater than the length of the third side. The “Triangle Rule” score for the triplet of residues i, j and k is calculated as

score (i, j, k) = exp (pair (i, j) + pair (j, k) – pair (i, k))

Where pair (i, j), pair (j, k) and pair (i, k) are logarithmic probabilities representing the predicted proximity of residue pairs. (Jumper, et al., 2021)

For example: if pair(i, j) = 5 Å, pair(j, k) = 6 Å and pair(i, k) = 8 Å, the triangle inequality is satisfied, as pair(i, j) + pair(j, k) ≥ pair(i, k). This shows that the predictions are geometrically feasible, and a large score of ≥ 1 is given. However, if the triangle inequality is violated, i.e. pair(i, j) + pair(j, k) < pair(i, k), a low score of < 1 is given, and the problematic pair(i, k) entry is suppressed to a lower Softmax weight and corrected to a value that satisfies the inequality. This is significant, as folded proteins always obey the triangle inequality. The Triangular Multiplicative Update Layer hence prevents gaps or steric clashes¹³ in the predictions of AlphaFold 2.
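A toy consistency check in this spirit can be written directly from the distances in the example above. This is only an illustration of the triangle inequality test, not AlphaFold 2's actual multiplicative update:

import numpy as np

def triangle_violations(dist):
    # Return index triplets (i, j, k) whose predicted distances break
    # d(i, k) <= d(i, j) + d(j, k).
    n = len(dist)
    return [(i, j, k)
            for i in range(n) for j in range(n) for k in range(n)
            if dist[i, k] > dist[i, j] + dist[j, k] + 1e-9]

# The 3-residue example from the text: pair(i,j)=5, pair(j,k)=6, pair(i,k)=8 (Å).
dist = np.array([[0.0, 5.0, 8.0],
                 [5.0, 0.0, 6.0],
                 [8.0, 6.0, 0.0]])
print(triangle_violations(dist))    # [] : 8 <= 5 + 6, geometrically feasible

dist[0, 2] = dist[2, 0] = 12.0      # now 12 > 5 + 6: infeasible
print(triangle_violations(dist))    # the offending triplets are flagged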

Finally, the inputs are processed by the Transition Layer, which acts as a refiner. Here, MSA Representations and Pairwise Residue Representations are polished: noise¹⁴ is removed, residues are projected through non-linear transformations, and any absent or mislaid information is filled in. The Transition Layer is a position-wise feed-forward network (FFN)¹⁵ applied to each amino acid residue in the single or pair representation. It comprises the following components: Layer Normalisation, a Feed-Forward Network, and a Residual Connection¹⁶.

¹¹ Multi-head attention: A mechanism used in neural networks to enhance the model’s ability to focus on different parts of the input sequence simultaneously.

¹² Invariant Point Attention: A specialised attention mechanism designed for processing spatial data in 3D structures like proteins and RNA.

¹³ Steric Clash: Overlapping atoms in a 3D model which violates physical constraints.

¹⁴ Noise: Random or unpredictable fluctuations in data that disrupt the ability to identify target patterns or relationships.

¹⁵ Position-wise feed-forward network: A mini neural network that processes each element of the input sequence independently.

¹⁶ Residual Connection: A neural network design where the input is added to the output to preserve information across layers.

Firstly, layer normalisation is applied to the input vector. This centres all values around zero and ensures they are on the same scale. Layer Normalisation applies the following LayerNorm formula: (Ba, Kiros, & Hinton, 2016)

xn = (x − μ) / (σ + ϵ), LayerNorm(x) = γ • xn + β

Where:

μ is the mean of all values in the input vector x, σ is the standard deviation (a small constant ϵ is added for numerical stability in the case where σ = 0), γ is the scaling factor, β is the shifting factor, and xn is the normalised input vector.

The Feed-Forward Network (FFN) then expands the dimensions of the input vector by a factor of four; this allows more room for information to be combined flexibly. The vector is then compressed back to its original size, now encompassing richer information. This is represented by the following formula: (Vaswani, et al., 2023)

h = W2 • ReLU (W1 • x + b1) + b2

Where:

x is the amino acid input vector, W1 is a matrix of shape ℝ4d×d that expands the input vector, b1 is a bias vector added to each weighted input, b2 is a bias vector added at the end, and ReLU is a non-linear activation function which keeps positive numbers and turns negative numbers into zero. It is defined as: (Nair & Hinton, 2010)

ReLU(x) = max(0, x)

W2 is another matrix, of shape ℝd×4d, which compresses the expanded vector back to its original size, and h is the final transformed vector, which has the same size as x.

Finally, the original input vector is added via residual connection, ensuring important information from the input vector is preserved. Residual connection involves the following formula: (He, Zhang, Ren, & Sun, 2015)

Output = x + h (xn)

Where:

x is the original input vector, xn is the normalised input vector, h (xn) is the output of FFN, Output is the result containing both original and new information.
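Putting the three components together, a minimal numpy sketch of the whole transition step (LayerNorm, then the 4×-expanding FFN, then the residual add) follows the formulas above. The weights are random placeholders, since the real values are learned:

import numpy as np

def layer_norm(x, gamma=1.0, beta=0.0, eps=1e-5):
    # Centre, scale to unit spread, then apply learnable scale/shift.
    return gamma * (x - x.mean()) / (x.std() + eps) + beta

def relu(x):
    return np.maximum(0.0, x)   # keep positives, zero out negatives

def transition_layer(x, W1, b1, W2, b2):
    # LayerNorm -> 4x-expanding FFN -> residual add, per the formulas above.
    xn = layer_norm(x)
    h = W2 @ relu(W1 @ xn + b1) + b2   # expand to 4d, compress back to d
    return x + h                       # residual connection preserves the input

rng = np.random.default_rng(1)
d = 64
x = rng.normal(size=d)
W1, b1 = 0.05 * rng.normal(size=(4 * d, d)), np.zeros(4 * d)
W2, b2 = 0.05 * rng.normal(size=(d, 4 * d)), np.zeros(d)
print(transition_layer(x, W1, b1, W2, b2).shape)   # (64,), same size as x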

It is important to note that an additional procedure, Dropout (Hinton, Srivastava, Krizhevsky, Sutskever, & Salakhutdinov, 2012), is applied to the FFN output during training. Dropout involves randomly zeroing approximately 10-25% of values; (Jumper, et al., 2021) this prevents the neural network from over-relying on any single value. Although disabled during inference, Dropout enhances AlphaFold 2's ability to process novel protein sequences, and its effect is reflected in the model's learned parameters.

Figures 2, 3 and 4 below show the flow of information from input to predicted structure in AlphaFold 2. Figure 2 demonstrates the overall pathway of Evoformer blocks iteratively refining the MSA Representation and Pairwise Residue Representation. Figure 3 focuses on one Evoformer block and demonstrates the operation of the Self Attention Layer, Triangular Multiplicative Update Layer and Transition Layer. The resulting output is then transferred to the structural module in Figure 4, where Invariant Point Attention (IPA) converts the spatial data into 3-dimensional atomic coordinates which shape the final protein structure prediction. (Jumper, et al., 2021) Stereochemical errors are further corrected with energy minimisation by Amber force fields; this involves adjusting bond lengths and bond angles but has a negligible effect on the accuracy of the final prediction. (Hornak, et al., 2006)

Figure 2: Model Architecture of AlphaFold 2. (Jumper, et al., 2021)

Figure 3: Architectural details of the Evoformer Block. Arrows show the direction of information flow; the shape of arrays is shown in parentheses. (Jumper, et al., 2021)

Figure 4: A structural module including the IPA. (Jumper, et al., 2021) Figure 4 describes how the final Pairwise Residue Representations are converted into 3D atomic coordinates via the IPA mechanism and rigid-body transformations.

This concludes the architecture of AlphaFold 2, yet the system must be trained to accurately map amino acid sequences into 3D predictions. The following section describes how training enables AlphaFold 2 to generalise predictions across a diverse range of proteins and sequences.

Training of AlphaFold 2

The AlphaFold 2 system is trained with both labelled and unlabelled data. Labelled data refers to protein sequences with experimentally known 3D structures, usually from the Protein Data Bank, whilst unlabelled data refers to protein sequences with experimentally unknown 3D structures, usually in source databases, e.g. UniProt (a database which stores protein sequences and functional annotations) (Consortium, 2020), and derived databases, e.g. UniClust30 (a database derived from UniProt to facilitate specific types of protein analysis). (Agrawal, Khater, Gupta, Sain, & Mohanty, 2017) The Protein Data Bank provides supervised learning with labelled data, with which the AlphaFold 2 system generally performs well across a wide range of historically challenging proteins, e.g. membrane proteins. However, the predictions of AlphaFold 2 may also contain less accurate regions with a lower confidence score, suggesting ambiguity in the structure of these regions. This is most prevalent in orphan proteins¹⁷ – proteins with no known homologous sequences – where side chain deviations, invalid interactions within protein domains and point mutations are often observed in highly variable sequences.

¹⁷ Orphan protein: A protein whose function is not yet known or that lacks a known binding partner or target organelle.

To enhance accuracy, AlphaFold 2 adopts a self-distillation approach similar to that of Noisy Student. (Xie, Luong, Hovy, & Le, 2019) This involves training on labelled data, generating pseudo-labels¹⁸ for unlabelled sequences, adding noise, and repeating rounds of self-distillation¹⁹. This approach allows AlphaFold 2 to learn effectively from the pseudo-labels and amplifies its performance in scenarios with limited labelled data.
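A schematic of the Noisy Student-style loop, with a trivial nearest-centroid "model" standing in for the network and toy arrays standing in for sequences and structures (everything here is illustrative, not AlphaFold 2's actual pipeline):

import numpy as np

rng = np.random.default_rng(0)

def fit_centroids(X, y):
    # Trivial "model" standing in for the network: one centroid per class.
    return {c: X[y == c].mean(axis=0) for c in np.unique(y)}

def predict(model, X):
    classes = np.array(sorted(model))
    dists = np.stack([np.linalg.norm(X - model[c], axis=1) for c in classes])
    return classes[dists.argmin(axis=0)], -dists.min(axis=0)  # labels, confidence

# "Labelled" data (structures known) and "unlabelled" data (sequences only).
X_lab = rng.normal(size=(40, 5)); y_lab = (X_lab[:, 0] > 0).astype(int)
X_unlab = rng.normal(size=(400, 5))

teacher = fit_centroids(X_lab, y_lab)                 # train on labelled data
pseudo_y, conf = predict(teacher, X_unlab)            # pseudo-label the rest
keep = conf > np.median(conf)                         # keep confident predictions
X_noisy = X_unlab[keep] + rng.normal(scale=0.3, size=(keep.sum(), 5))  # add noise
student = fit_centroids(np.vstack([X_lab, X_noisy]),
                        np.concatenate([y_lab, pseudo_y[keep]]))       # retrain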

Implications of AlphaFold 2 in Accelerating Pharmaceutical Development

AlphaFold 2's ability to generate high-confidence structural predictions for proteins without any crystallography data enables structure-activity relationship (SAR) analysis at the early stages of drug discovery. This involves systematically altering the structure of a drug molecule and recording the change in its chemical and biological activity to better understand the drug's potency, solubility and interaction with target proteins. The utility of AlphaFold 2-driven SAR is illustrated by the discovery of inhibitors of CDK20, a target for hepatocellular carcinoma (HCC) whose structure was previously unknown.

In 2022, the AI-powered drug discovery platforms PandaOmics²⁰ and Chemistry42²¹ used predictions from AlphaFold 2 to obtain the structure of the Cyclin-Dependent Kinase²² CDK20, an oncogenic driver²³ in HCC, an aggressive and dominant form of liver cancer accounting for ≈75% of the total patient population. (Ren, et al., 2023) Despite a recent Phase 3 clinical trial (IMbrave150) suggesting Atezolizumab²⁴ (an immunotherapy drug and PD-L1 inhibitor) and Bevacizumab²⁵ (an anti-angiogenic drug that inhibits VEGF²⁶, which is necessary for a tumour's blood supply) to be an effective immunotherapy combination for treating advanced and unresectable HCC, (Finn, et al., 2020) a significant number of patients exhibit strong resistance, indicating the urgent need for greater understanding and structural information on therapeutic targets like CDK20.

To predict the structure of CDK20, PandaOmics first analysed OMICs data from 10 HCC datasets containing 1800 samples (of which 1133 were disease samples and 647 control samples) and shortlisted 20 targets. (Ren, et al., 2023) Multidimensional filtration was then applied; this process evaluates the biologics accessibility, safety, small-molecule accessibility and tissue specificity of the targets. CDK20 was consequently selected due to its strong association with HCC. Yet the structure of CDK20 was novel and lacked the detailed dimensional data needed for pharmaceutical development. Nevertheless, AlphaFold was able to guide Chemistry42 in designing CDK20 inhibitors through its high-confidence prediction; this allowed the latter to generate 8918 different molecules for drug testing. Through molecular docking and clustering, 7 molecules were synthesised for biological testing. Among them, the compound ISM042-2-001 successfully bound to CDK20 with a Kd²⁷ value of 9.2 ± 0.5 μM (where n=3), (Ren, et al., 2023) suggesting weak binding but validating the structure of AlphaFold's prediction.

¹⁸ Pseudo Labelling: A semi-supervised learning technique where predicted labels for unlabelled data generated by a model are treated as true labels to expand the training set.

¹⁹ Self-distillation: A technique where the student model is trained to mimic the outputs of another instance of the teacher model, which has the same architecture and uses the same training data.

²⁰ PandaOmics: A biocomputational AI platform that uses algorithms and bioinformatics to discover therapeutic targets and biomarkers for various diseases.

²¹ Chemistry42: A generative AI chemistry platform that generates novel chemical structures from protein structure information.

²² Kinase: A phosphotransferase (enzyme) that catalyses the phosphorylation of a substrate.

²³ Oncogenic Driver: A gene that directly initiates and sustains the development and growth of cancer by promoting abnormal cellular proliferation and survival.

²⁴ Atezolizumab (Tecentriq): An immunotherapy drug used to treat various cancers. It blocks the PD-L1 protein on cancer cells to help the immune system recognize and attack them. Often used in combination with chemotherapy.

²⁵ Bevacizumab (Avastin): A targeted cancer therapy that stops tumours from growing new blood vessels. Often used in combination with chemotherapy.

²⁶ VEGF (Vascular endothelial growth factor): A signal protein that stimulates the formation of blood vessels (angiogenesis) especially in tumours.

²⁷ Kd (Dissociation Constant): Measures binding affinity, or how tightly a molecule binds to its target.

This initial binding data was fed back to Chemistry42, allowing a more potent hit molecule, ISM042-2-048, with a Kd value of 566.7 ± 256.2 nM (n=3), to be generated. (Ren, et al., 2023) This value indicates a moderate binding affinity exceeding previous models (see Figure 5 below), which is promising in the early stages but insufficient for a final drug. Through virtual docking²⁸, it has been found that the amide group (R₁C(=O)NR₂R₃) of the ligand ISM042-2-048 interacts with CDK20 by forming hydrogen bonds to the amino acid residues Lys33 (Lysine²⁹ at position 33) and Asp145 (Aspartic acid³⁰ at position 145)³¹ in its ATP-binding site, thereby competing with ATP and inhibiting kinase activity. Additionally, the conjugated aromatic rings in ISM042-2-048 form π–π stacking interactions³² with the phenyl group (-C6H5) of the amino acid residue Phe80 (Phenylalanine at position 80) in the CDK20 ATP-binding site; these forces arise from favourable interactions between electron clouds. This increases binding stability by enhancing the hydrophobic complementarity³³ between the water-averse aromatic rings and the highly hydrophobic Phe80, while also reducing steric repulsion between the molecules.

Energetically, the association of aromatic surfaces in Phe80 and ISM042-2-048 displaces ordered solvation shells of water from the binding site. This is known as desolvation. Although desolvation involves a penalty from hydration shell removal, the release of these water molecules into the solvent involves a favourable entropy gain (where ΔS > 0) which outweighs the initial desolvation penalty. This results in a more negative Gibbs Free Energy (ΔG)³⁴ which increases stability in the protein-ligand complex, because of the following standard-state free energy equation³⁵ assuming a 1:1 binding equilibrium³⁶:

ΔG = RT ln Kd

where ΔG decreases with Kd (assuming R and T are constant) and a lower Kd value is associated with tighter ligand-protein binding. This relationship thus explains the favourable binding and high binding affinity for ISM042-2-048.
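Substituting the two reported Kd values into this equation gives a concrete sense of the improvement. Room temperature is assumed below, since no temperature is stated above:

import math

R = 8.314   # J mol^-1 K^-1
T = 298.0   # K; assumed ~25 °C

for name, kd in [("ISM042-2-001", 9.2e-6), ("ISM042-2-048", 566.7e-9)]:
    dG = R * T * math.log(kd) / 1000   # kJ/mol; Kd in mol/L against a 1 M standard state
    print(f"{name}: ΔG ≈ {dG:.1f} kJ/mol")

# Output: ISM042-2-001 ≈ -28.7 kJ/mol, ISM042-2-048 ≈ -35.6 kJ/mol. The roughly
# 16-fold drop in Kd corresponds to about 7 kJ/mol of extra binding free energy.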

²⁸ Virtual docking: A computational simulation technique used to predict how small molecules (ligands) i.e. potential drug candidates, bind to a larger target molecule (receptor) i.e. a protein.

²⁹ Lysine: A hydrophilic amino acid with side chain –(CH2)4-NH3+ at physiological pH.

³⁰ Aspartic acid: A hydrophilic amino acid with side chain -CH2-COO- at physiological pH.

³¹ Lys33 and Asp145: Amino acid residues that form the ATP binding site in CDK20. They are often primary targets for inhibitors that suppress the activity of CDK20.

³² π–π stacking interactions: A noncovalent interaction between aromatic systems (i.e. benzene rings) arising from dispersion forces and electrostatic interactions.

Figure 5: Comparison of ISM042-2-048 and previously reported CDK20 inhibitors. (Ren, et al., 2023) Note that the structure of potent inhibitor MER-128 described by Mueller et al. (Mueller, et al., 2024) was not disclosed despite its highly competitive IC50 value.

Given that the hit molecule for CDK20 was identified without any existing experimental structure within 30 days of target selection, at unprecedented efficiency, (Ren, et al., 2023) the accurate predictions of AlphaFold 2 are expected to allow ligands for other targets to be produced and optimised before any crystallography data becomes available. Apart from CDK20, AlphaFold 2 has also aided various other campaigns, including: the prediction of the Plasmodium falciparum transmission-blocking (TB) vaccine candidate proteins Pfs48/45 and Pfs230 for malaria vaccine development, (Ko, et al., 2022) Trypanosoma cruzi proteins to support virtual screening campaigns, (Wheeler, 2021) the SARS-CoV-2 Spike Omicron JN.1, KP.2 and KP.3 variants during the COVID-19 pandemic, (Raisinghani, Alshahrani, Gupta, & Verkhivker, 2024) as well as G-protein-coupled receptors (GPCRs) (a type of integral membrane protein) for structural biology and pharmaceutical advancement. (Sala, Hildebrand, & Meiler, 2023) In conclusion, AlphaFold 2 and its successor models, e.g. AlphaFold 3, hold enormous potential to drastically shorten future pre-clinical discovery processes for pharmaceutical development when used with generative systems.

References

Agrawal, P., Khater, S., Gupta, M., Sain, N., & Mohanty, D. (2017, July 3). RiPPMiner: a bioinformatics resource for deciphering chemical structures of RiPPs based on prediction of cleavage and cross-links. Nucleic Acids Research, 45(W1), W80-W88.

Ba, J. L., Kiros, J. R., & Hinton, G. E. (2016, July 21). Layer Normalization. Cornell University. Ithaca, New York: Cornell University. Retrieved November 14, 2025, from arxiv.org: https://arxiv.org/abs/1607.06450

Brigham, O. E. (1988). The fast Fourier transform and its applications. Prentice Hall Signal Processing Series.

Burley, S. K., Berman, H. M., Bhikadiya, C., Bi, C., Chen, L., Costanzo, L. D., . . . Ioannidis, Y. E. (2019). Protein Data Bank: the single global archive for 3D macromolecular structure data. Nucleic Acids Res. 47, 520-528.

Cheng, Y., Grigorieff, N., Penczek, P. A., & Walz, T. (2015). A Primer to Single-Particle Cryo-Electron Microscopy. Cell, 161(3), 438-449.

Consortium, T. U. (2020, November 25). UniProt: the universal protein knowledgebase in 2021. Nucleic Acids Research, 49(D1), D480-D489. Retrieved November 14, 2025, from https://academic.oup.com/nar/article/49/D1/D480/6006196

Cowtan, K. (2001). Phase Problem in X-ray Crystallography, and Its Solution. In Encyclopedia of Life Sciences. Macmillan Publishers Ltd, Nature Publishing Group.

Finn, R. S., Qin, S., Ikeda, M., Galle, P. R., Ducreux, M., Kim, T.-Y., . . . Cheng, A.-L. (2020, May 13). Atezolizumab plus Bevacizumab in Unresectable Hepatocellular Carcinoma. The New England Journal of Medicine, 382(20), 1894-1905. doi:10.1056/NEJMoa1915745

He, K., Zhang, X., Ren, S., & Sun, J. (2015). Deep Residual Learning for Image Recognition. Cornell University, Department of Computer Science. Ithaca, New York: Cornell University. Retrieved November 14, 2025, from https://arxiv.org/abs/1512.03385

Hinton, G. E., Srivastava, N., Krizhevsky, A., Sutskever, I., & Salakhutdinov, R. R. (2012). Improving neural networks by preventing co-adaptation of feature detectors. Cornell University, Department of Computer Science. Ithaca, New York: Cornell University. Retrieved November 14, 2025, from https://arxiv.org/pdf/1207.0580

Hornak, V., Abel, R., Okur, A., Strockbine, B., Roitberg, A., & Simmerling, C. (2006, November 15). Comparison of multiple Amber force fields and development of improved protein backbone parameters. doi:10.1002/prot.21123

Jumper, J., Evans, R., Pritzel, A., Green, T., Figurnov, M., Tunyasuvunakool, K., . . . Hassabis, D. (2021). Highly Accurate Protein Structure Prediction with AlphaFold. Nature 596, no. 7873, 583-589.

Ko, K.-T., Lennartz, F., Mekhaiel, D., Guloglu, B., Marini, A., Deuker, A. J., . . . Higgins, M. K. (2022, September 24). Structure of the malaria vaccine candidate Pfs48/45 and its recognition by transmission blocking antibodies. Nature Communications, 13(5603), 1-11. doi:10.1038/s41467-022-33379-6

Levinthal, C. (1969). How to Fold Graciously. Mössbauer Spectroscopy in Biological Systems, 22-24.

Mirabello, C., & Wallner, B. (2019). rawMSA: end-to-end deep learning using raw multiple sequence alignments. PLoS ONE, 14.

Mirsalehi, M. M. (2003). Optical Information Processing. In R. A. Meyers, & R. A. Meyers (Ed.), Encyclopedia of Physical Science and Technology (Third ed., pp. 335-369). Academic Press.

Mueller, D., Totzke, F., Weber, T., Kraemer, D., Heidemann-Dinger, C., Rademann, C., & Torka, R. (2024, October 1). Comprehensive characterization of CDK inhibitors using a complete panel of all 20 human cyclin-dependent kinases. European Journal of Cancer, 211(114925). doi:10.1016/j.ejca.2024.114925

Nair, V., & Hinton, G. E. (2010). Rectified Linear Units Improve Restricted Boltzmann Machines. Proceedings of the 27th International Conference on Machine Learning, Haifa, Israel, 1-8. Retrieved November 14, 2025, from https://www.cs.toronto.edu/~fritz/absps/reluICML.pdf

Raisinghani, N., Alshahrani, M., Gupta, G., & Verkhivker, G. (2024, September 13). AlphaFold2 Modeling and Molecular Dynamics Simulations of the Confrontational Ensembles for the SARS-CoV-2 Spike Omicron JN.1, KP.2 and KP.3 Variants: Mutational Profiling of Binding Energetics Reveals Epistatic Drivers of the ACE2 Affinity and Escape... (P. Wang, Ed.) Viruses, 16(1458), 1-37. doi:10.3390/v16091458

Ren, F., Ding, X., Zheng, M., Korzinkin, M., Cai, X., Zhu, W., . . . Sun, C. (2023). AlphaFold accelerates artificial intelligence powered drug discovery: efficient discovery of a novel CDK20 small molecule inhibitor. Chemical Science, 14, 1443-1452. doi:10.1039/D2SC05709C

Sala, D., Hildebrand, P. W., & Meiler, J. (2023, February 16). Biasing AlphaFold2 to predict GPCRs and kinases with user-defined functional or structural properties. (Yuanpeng Janet Huang, Ed.) Frontiers in Molecular Biosciences, 1-8. doi:10.3389/fmolb.2023.1121962

Smyth, M. S., & Martin, J. H. (2000). X Ray Crystallography. Mol Pathol, 8-14.

Vaswani, A., Shazeer, N., Parmar, N., Uszkoreit, J., Jones, L., Gomez, A. N., & Polosukhin, I. (2023, June 12). Attention is All You Need. Cornell University. Ithaca, New York: Cornell University. Retrieved November 14, 2025, from arxiv.org: https://arxiv.org/abs/1706.03762

Wheeler, R. J. (2021, November 11). A resource for improved predictions of Trypanosoma and Leishmania protein three-dimensional structure. (Vyacheslav Yurchenko, Ed.) PLoS ONE, 16(11), 1-12. doi:10.1371/journal.pone.0259871

Xie, Q., Luong, M.-T., Hovy, E., & Le, Q. V. (2019). Self-training with Noisy Student improves ImageNet classification. Cornell University, Department of Computer Science. Ithaca, New York: Cornell University. Retrieved November 14, 2025, from https://arxiv.org/abs/1911.04252

Evaluating U.S. Economic Growth in the AI Age

Introduction

The U.S. economy’s headline growth appears robust, but a closer look reveals a tale of two economies. On one side is an AI-driven boom, fueled by unprecedented investments in data centres and AI infrastructure, while on the other side lies a stagnant consumer economy where household spending remains effectively unchanged¹. This divergence raises pressing questions: Is the nation quietly slipping into a technical recession masked by an AI investment frenzy? And what does it mean when GDP growth is powered by server farms rather than storefronts and salaries? This article expands upon analysis of recent discussions and data to answer these questions. I will also critique the shifting narrative of AI’s ambassadors – notably OpenAI’s CEO Sam Altman – and examine how tail risks of advanced AI (superintelligence) are being addressed (or not) amid the euphoria.

Beyond GDP growth, consumers face another headwind: the cost of essential services has climbed far faster than the cost of discretionary goods. Many goods like toys and televisions have become cheaper, while critical services such as health care, tuition and child care have skyrocketed in price since 2000. The chart below illustrates these divergent price trajectories.

Figure: Bureau of Labor Statistics chart showing price changes across consumer goods and services between 2000 and 2020. This chart, designed by Mark J. Perry, illustrates necessities like hospital services and college tuition soaring in price while goods such as televisions and software become cheaper.

¹ Thompson, D. “How AI Conquered the US Economy.” Substack, 2025.

Headline GDP vs. Technical Recession – The Data Center Discrepancy

Officially, U.S. gross domestic product grew in early 2025 – a reassuring signal to many. Yet beneath the surface, what makes up that growth is rather troubling. Harvard economist Jason Furman estimates that investment in data centres and information-processing technology accounted for nearly all U.S. GDP growth in the first quarter of 2025. In fact, information-processing equipment and software – just 4 % of GDP – were responsible for 92 % of GDP growth in Q1 2025². In other words, if you strip out the AI/data-centre spending boom, economic growth would have been essentially flat, at about 0.1 %⁴.
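The decomposition behind these figures is simple contribution arithmetic: a component’s contribution to GDP growth (in percentage points) is its share of GDP multiplied by its own growth rate. A minimal sketch in Python follows; only the 4 %, 92 % and ~0.1 % values are sourced from the text, while the 1.2 % headline rate is an illustrative assumption chosen to reproduce the quoted residual:

```python
# Growth-decomposition arithmetic behind the Furman estimate.
# Only info_share (4% of GDP), info_contribution_share (92% of growth)
# and the ~0.1% ex-AI residual come from the text; the 1.2% headline
# rate is an illustrative assumption, not a sourced figure.

headline_growth = 1.2            # assumed annualised headline GDP growth, %
info_share = 0.04                # info-processing equipment & software, share of GDP
info_contribution_share = 0.92   # share of growth attributed to that category

info_contribution = headline_growth * info_contribution_share   # in pp
ex_ai_growth = headline_growth - info_contribution

# contribution (pp) = GDP share x component growth rate, so we can back out
# how fast the category itself must have been growing:
implied_component_growth = info_contribution / info_share

print(f"AI/info-processing contribution: {info_contribution:.2f} pp")
print(f"Growth excluding that category:  {ex_ai_growth:.2f} %")     # ~0.1 %
print(f"Implied category growth rate:    {implied_component_growth:.1f} %")
```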

Consumer spending, which normally drives about 70 % of GDP, contributed almost nothing. Apollo Global Management’s chief economist notes that over the first half of 2025, the contribution to GDP growth from data-centre construction was equal to that from consumer spending⁵. This has led analysts to warn that the U.S. is in a technical recession by some measures, despite positive headline GDP, because the broad household and business sectors are stagnating while a single category (AI infrastructure) props up the aggregate numbers.

To illustrate this divergence, consider how much of overall growth comes from AI-related capital expenditure versus consumer spending. The figure below compares the contributions to GDP growth across two recent halves. Whereas in early 2024 consumer expenditure accounted for most of the expansion, by mid-2025 AI capital spending dominated the picture. Consumer spending’s contribution fell sharply even as total GDP remained elevated because AI investment took up the slack.

² Furman, J. “Information Processing Investment and GDP Growth.” 2025.

³ Fortune. “Information Processing Investment and GDP: The New Bubble?” 2025.

⁴ Fortune (2025).

⁵ Apollo Global Management. “Consumer vs. Data Center Contributions.” 2025.

Historical data highlight that this dynamic is new. In the early 2000s, personal consumption dwarfed tech spending, whereas information-processing investment barely registered in GDP figures. The next figure, adapted from a Bloomberg chart, shows how contributions to GDP growth were dominated by consumption in the early 2000s while information-processing investment remained negligible.

Figure: Bloomberg chart illustrating how personal consumption expenditures dwarfed information-processing equipment and software contributions to GDP growth in the early 2000s.

Such a disconnect becomes clearer when comparing real personal consumption to real GDP in recent years. Consumer spending growth has fallen drastically as pandemic savings have dwindled and inflation has eroded purchasing power. Real personal consumption expenditures (PCE) have plateaued since late 2024, even as real GDP is boosted by large capital outlays in tech. Recent quarterly data underscore this gap – for example, AI-related capital expenditures added about 1.1 percentage points to GDP growth in H1 2025, outpacing the U.S. consumer’s contribution as an engine of expansion⁶. In the second quarter of 2025, GDP surged at a 3.8 % annualised rate, but much of that strength traced back to tech capex and inventory builds, not household demand. Real final sales to private domestic purchasers (a core demand measure) grew far more slowly, reflecting softness in consumer outlays and business equipment spending outside of data centres. In effect, the economy’s growth is being propped up by a capex boom concentrated in a few giant firms, masking an underlying stall in the broader private sector. It is little surprise, then, that American consumers report overwhelmingly negative sentiment despite glowing GDP reports – by some surveys, U.S. consumer confidence in 2025 has been as low as it was in the depths of the 2008 crisis. In short, by looking past aggregate GDP and into its composition, we find an economy on a very weak footing – one that feels recessionary to households, even if the AI investment frenzy prevents the technical label.
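As a side note on the arithmetic, the “3.8 % annualised rate” follows the standard U.S. convention of compounding one quarter’s growth over four quarters. A minimal sketch, assuming nothing beyond the 3.8 % figure quoted above:

```python
# Converting between quarter-on-quarter growth and the annualised rate
# quoted for Q2 2025. Only the 3.8% annualised figure comes from the text.

annualised_rate = 0.038
quarterly_growth = (1 + annualised_rate) ** 0.25 - 1   # invert the compounding
recovered = (1 + quarterly_growth) ** 4 - 1            # round-trip check

print(f"Quarter-on-quarter growth: {quarterly_growth:.2%}")  # ~0.94%
print(f"Annualised rate:           {recovered:.1%}")         # 3.8%
```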

The chart below shows the widening gap between real GDP growth and real PCE growth across seven recent quarters. While GDP growth has accelerated with the AI boom, consumer spending growth has decelerated. The divergence suggests that the benefits of the AI investment boom are not trickling down to households, at least not yet.

⁶ J.P. Morgan Asset Management. “On the Minds of Investors: Is AI Already Driving U.S. Growth?” 2025.

The AI Investment Boom or a Mirage?

What exactly is happening with this AI investment boom, and why is it distorting our economic picture?

The “AI economy” has rapidly become a major macro force. Since ChatGPT’s debut in late 2022, U.S. tech giants have entered an arms race to build AI capabilities, pouring unprecedented sums into data centres, specialised hardware (like Nvidia GPUs), software development and related infrastructure. In 2025 alone, Microsoft, Alphabet (Google), Meta and Amazon are expected to spend about $370 billion on capital expenditures – mostly AI-related – and plan to raise that further in 2026⁷. Microsoft’s outlays (its spending) in the latest quarter reached $35 billion (45 % of its revenue), largely for AI data centres, while Alphabet has raised its 2025 capex (capital expenditure) guidance to as much as $93 billion⁸. Rarely, if ever, has a single technology absorbed this much corporate investment this quickly⁹.

Additional reports underscore the sheer scale of this spending spree. Wired notes that the combined AI-related capex of Microsoft, Alphabet, Meta and Amazon in 2025 approaches $370 billion. Microsoft’s $35 billion outlay represented roughly 45 % of its quarterly revenue, while Alphabet could invest up to $93 billion. Together these firms account for about 75 % of S&P 500 (a stock-market index of 500 leading U.S. companies listed on the NYSE and Nasdaq) returns and 80 % of corporate earnings growth¹⁰. Fortune further emphasises that absent data-centre spending, U.S. economic growth in the first half of 2025 would have been just 0.1 % and that information-processing equipment and software – about 4 % of GDP – accounted for 92 % of growth¹¹. A Sherwood News analysis concurs, observing that AI capex contributed more to

⁷ Matsakis, L. “The AI Data Center Boom Is Warping the US Economy.” Wired, 2025.

⁸ Matsakis, L. Wired, 2025.

⁹ Matsakis, L. Wired, 2025.

¹⁰ Matsakis, L. Wired, 2025.

¹¹ Fortune. 2025.

GDP growth than consumer spending in early 2025¹². These figures make clear that Big Tech’s investment frenzy is not a sideshow but the main act of economic expansion.

To visualise the distribution of this spending, the bar chart below approximates 2025 AI-related capital expenditures by Microsoft, Alphabet and the combined spending of Meta and Amazon. It illustrates how two companies – Microsoft and Alphabet – account for a significant share of the $370 billion total, with the remainder split between the other giants.
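Since the chart itself is not reproduced here, a back-of-envelope reconstruction from the figures quoted above is sketched below. Annualising Microsoft’s $35 billion quarterly outlay by a factor of four is my own illustrative assumption, not a reported annual figure, so the split should be read as indicative only:

```python
# Rough split of the ~$370bn 2025 AI capex cited from Wired.
# microsoft_annual assumes the $35bn quarterly outlay as a steady run-rate
# (an assumption for illustration); alphabet_annual is the guidance ceiling.

total_capex = 370            # $bn: Microsoft + Alphabet + Meta + Amazon
microsoft_annual = 35 * 4    # $bn: assumed annualised run-rate
alphabet_annual = 93         # $bn: upper end of Alphabet's 2025 guidance
meta_amazon = total_capex - microsoft_annual - alphabet_annual  # residual

for name, spend in [("Microsoft (annualised)", microsoft_annual),
                    ("Alphabet (guidance)", alphabet_annual),
                    ("Meta + Amazon (residual)", meta_amazon)]:
    print(f"{name:25s} ${spend:>3d}bn ({spend / total_capex:.0%} of total)")
```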

This surge has distorted the headline picture of GDP growth. Normally, consumer spending and broad business investment march in step with overall economic expansions or contractions. Now, however, “the American economy has split in two”: a booming AI-capex sector and a lacklustre consumer sector¹³. Corporate spending on AI infrastructure has effectively become the tail wagging the dog of growth. Preliminary data from S&P Global indicate that by Q2 2025, U.S. GDP was about 0.5 % higher than it would have been if AI-related investment had only grown at its normal trend. In fact, investments related to data centres have become the single largest contributor to U.S. GDP growth. J.P. Morgan analysts have dubbed AI capex the new economic “bellwether”, noting that heavy tech investment has offset weakness in interest-sensitive sectors and “added resilience to the economy at a time when consumption is softening”¹⁴. Simply put, Big Tech’s spending spree is doing the heavy lifting that consumer spending usually does – for the present moment.

However, there is an important detail: AI investment may not propagate its benefits through the economy in the same way consumer spending does. Building a data centre or buying advanced chips does create some construction and manufacturing jobs, but once built, data centres employ relatively few workers and have limited supply chains¹⁵. As J.P. Morgan’s research highlights, data centres and AI hardware have a low

¹² Sherwood News. “AI Capex Versus Consumer Spending.” 2025.

¹³ Thompson, D. Substack, 2025.

¹⁴ J.P. Morgan Asset Management. 2025.

¹⁵ J.P. Morgan Asset Management. 2025.

“multiplier effect” – “data centres also employ few workers once built…limiting their multiplier effect through wage-driven consumption”¹⁶. Moreover, a significant chunk of AI investment goes into imported high-tech equipment (like Taiwan-made chips), which doesn’t add to U.S. GDP at all¹⁷. In essence, billions poured into AI might lift headline GDP, but less of that money circulates to households in the form of wages and broader demand than traditional investments would. This raises the question: are we seeing a GDP mirage, where top-line growth overstates economic well-being? Many economists are cautious. Analysts at Pantheon Macroeconomics estimate that without the AI spending boom, U.S. growth in 2025 would be under 1 % – an anaemic pace that flirts with recession. Even Federal Reserve officials have noted the unusual support that tech investment is providing to an otherwise cooling economy (while also warning that such one-legged growth is unlikely to be sustainable).
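To make the “multiplier effect” argument concrete, the toy model below compares two stylised $100 investments under a textbook Keynesian multiplier. All parameters – domestic content, wage share and the marginal propensity to consume – are hypothetical illustrations of the mechanism, not figures from J.P. Morgan:

```python
# Stylised multiplier comparison: capital-intensive, import-heavy data-centre
# capex versus a traditional, wage-intensive domestic investment.

def total_activity(initial_spend, domestic_share, wage_share, mpc=0.7):
    """First round keeps only the domestic portion of the spend; each later
    round recirculates wage_share of spending as wages, of which workers
    consume mpc. Geometric series: first_round / (1 - wage_share * mpc)."""
    first_round = initial_spend * domestic_share
    return first_round / (1 - wage_share * mpc)

# Hypothetical parameterisations for illustration only:
traditional = total_activity(100, domestic_share=0.95, wage_share=0.50)
data_centre = total_activity(100, domestic_share=0.70, wage_share=0.15)

print(f"Traditional investment: ${traditional:.0f} of activity per $100")
print(f"Data-centre capex:      ${data_centre:.0f} of activity per $100")
```

The imported-equipment leakage enters through the lower domestic share, and the “few workers once built” point through the lower wage share; under these assumptions each data-centre dollar generates roughly half the downstream activity of a traditional investment dollar.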

There is also a temporal element: how long can the AI capex boom last? Tech capital-expenditure cycles are historically volatile. If demand for AI services doesn’t meet expectations or if credit conditions tighten, these investments could “turn on a dime”¹⁸. The current boom is in part enabled by huge cash flows from core tech businesses (e.g., advertising and cloud services). Should those profits disappear or shareholders grow impatient, AI spending could plateau, removing the main pillar holding up growth. In that scenario, the underlying weakness – soft consumption, rising credit-card delinquencies, dwindling household savings – would come to the forefront, confirming a recession that, arguably, has already been underway beneath the surface. This could have devastating effects, essentially making the economy pop like a champagne cork. As one commentator observed, “I thought data centres would cause economic growth, not be the economic growth. If they’re so great, why isn’t the rest of the economy ripping from the benefit of AI then?” This scepticism captures a widespread sentiment that the AI boom’s benefits have yet to trickle down – devastating news for the trickle-down economists. Consumers aren’t seeing cheaper goods or better jobs – at least not yet – even as corporations spend furiously. Indeed, if AI is truly revolutionary, its gains should eventually spread beyond corporate ledgers. For now, though, that promise remains speculative, and the risk of an AI bubble looms.

An insightful visualisation from Bloomberg emphasises just how circular the AI economy has become. The chart below depicts the network of investments, service agreements and hardware purchases among key players such as OpenAI, Nvidia, Microsoft, AMD and Oracle. The arrows show money flowing in loops: capital raised by one AI firm is used to purchase chips from another, which in turn invests back into AI labs. The same money practically flows up and down the supply chain, bloating these firms’ income statements, which in turn bloats their share prices. The sheer size of these corporate relationships underscores both the scale and the insularity of the current AI boom.

¹⁶ J.P. Morgan Asset Management. 2025.

¹⁷ J.P. Morgan Asset Management. 2025.

¹⁸ J.P. Morgan Asset Management. 2025.

Circular investment flows in the AI economy

Sam Altman’s Changing Narrative: From Equitable AI to Competitive Arms Race

The mixed economic picture is not only about numbers – it’s also about narratives and leadership in the AI space. Nowhere is this more evident than in the evolution of Sam Altman’s public stance. As CEO of OpenAI, Altman has been a chief evangelist of artificial intelligence, but the story he tells about AI’s role in society has shifted dramatically. In AI’s early days, Altman often spoke of it as a levelling force, something that could “lift all boats” and even pave the way for radical ideas like universal basic income. OpenAI itself was founded in 2015 as a non-profit with the mission of ensuring AI benefits all of humanity, rather than just a privileged few¹⁹. Back then, Altman declined to hold any equity in OpenAI and drew only a token salary (about $75,000)²⁰, signalling that his motivations were more idealistic than financial. This structure was meant to keep OpenAI focused on broad societal benefit, free from shareholder pressure. In Altman’s words, it was a mission beyond profit.

Today, however, Altman’s rhetoric and OpenAI’s direction paint a different picture. As the company’s prospects (and valuation) skyrocketed with the success of ChatGPT, OpenAI restructured into a for-profit enterprise (a “capped-profit” model initially, now morphing further). In late 2024, OpenAI moved to remove the old non-profit board’s control and to allow greater investor returns – essentially transforming into a typical startup²¹. Under this new plan, Sam Altman will, for the first time, receive an equity stake in OpenAI – a stake that could be extraordinarily lucrative. Reuters reports Altman’s equity could be valued around $150 billion in the future, especially as the company seeks to lift the cap on investor returns²². This

¹⁹ GrowthShuttle. “Why Sam Altman Doesn’t Own Equity (then).” 2023.

²⁰ GrowthShuttle. 2023.

²¹ Reuters. “OpenAI Restructuring and Altman Equity.” 2024.

²² Reuters. 2024.

is a stunning turnaround for someone who once emphasised he had no ownership because he wanted AI to be a public good. Altman, already a billionaire from prior ventures, is thus poised to become one of the richest individuals on the planet on the back of OpenAI’s commercial success. Notably, this shift coincides with changes in Altman’s messaging: he now frequently frames AI in terms of geopolitical and economic competition. In congressional hearings and interviews, Altman stresses that the U.S. must lead in AI as a matter of national destiny, implicitly casting AI as a “commercial weapon” in a global race rather than a freely shared utility. Concerns about safety and equality often take a back seat to talk of innovation speed and market dominance.

He continually pushes for further deregulation, whether profit-motivated or not. I would argue that this case is weak: US tech companies already operate in a free market, while their chief rival, China, runs a command economy with one of the most heavily regulated AI industries in the world, so further US AI deregulation simply does not follow as a competitive necessity.

Critics have not missed this change. The YouTuber and commentator Thought Slime, in a video essay titled “Sam Altman is Not Who You Think He Is,” highlights how Altman’s personal behaviour diverges from his altruistic pronouncements. For instance, even as Altman promoted OpenAI’s mission to benefit humanity, he was reportedly investing lavishly in profit-driven side projects and living a lifestyle far from ascetic. (Altman has backed for-profit ventures from nuclear-fusion startups to cryptocurrency experiments like Worldcoin – the latter pitched as global altruism but widely criticised as exploitative.) The philanthropic veneer around OpenAI has likewise faded. Early on, OpenAI promised to openly share research and even pledged that if it achieved true AI breakthroughs it would distribute the benefits widely. Yet as the company pivoted to a capped-profit model and struck exclusive deals (e.g., with Microsoft), it became far more secretive and commercially oriented. Its much-touted charitable initiatives or plans for equitable distribution have either been put on hold or repackaged into token gestures. In Thought Slime’s view, Altman has shifted from AI idealist to AI arms-dealer – leveraging a narrative of fear (warning that if the U.S. doesn’t push AI forward, rivals will) to justify aggressive commercial expansion. Even Altman’s own spending hints at the priorities: while he certainly isn’t the most extravagant tech CEO, the scale of resources he’s marshalled (and the personal wealth he stands to gain) dwarf any of his public philanthropic efforts to date.

In addition to shifting rhetoric, Altman’s personal and corporate dealings reveal a complex picture. In May 2024 he and his husband Oliver Mulherin joined the Giving Pledge, promising to donate more than half of their wealth. Altman stated that his philanthropic focus is on technology that creates abundance, and the couple support universal basic income initiatives and pandemic preparedness research²³ ²⁴. The Axel Springer Award recognised him as a philanthropist in 2025 for these contributions²⁵. However, the advocacy group Public Citizen reported that Altman donated $1 million to Donald Trump’s inaugural fund and that OpenAI hired former Trump campaign staff as lobbyists, casting doubt on his political impartiality²⁶. This mixture of philanthropy and controversial political spending complicates the narrative of a selfless tech visionary.

Recent reporting further underscores the financial stakes. Guardian and Economic Times articles revealed that OpenAI has committed to spending around $1.4 trillion on compute and data-centre

²³ AP News. “Sam Altman and Husband Join the Giving Pledge.” 2024.

²⁴ Lifestyles Magazine. “Sam Altman 2025 Axel Springer Award & Philanthropic Commitments.” 2025.

²⁵ Lifestyles Magazine. 2025.

²⁶ Public Citizen. “Sam Altman Donated $1 Million to Trump’s Inaugural Fund.” 2020.

infrastructure over the next eight years. During a Q&A with investors, Altman reportedly told investor Brad Gerstner to sell his shares if he didn’t like the plan, even conceding that some investors would lose a lot of money in the AI boom²⁷. These articles also described circular deals in which Oracle builds data centres for OpenAI, OpenAI pays Oracle, and Nvidia sells chips to OpenAI while investing back into the company²⁸. Such circular investment schemes raise concerns about transparency and sustainability.

Time magazine reported that after OpenAI’s reorganisation, the nonprofit OpenAI Foundation holds a $130 billion stake in the new for-profit OpenAI Group PBC, while Microsoft holds a 32.5 % stake valued at roughly $135 billion²⁹. This reorganisation underscores the interdependence between major tech firms and the high stakes involved in steering AI’s future.

In sum, Sam Altman’s case reflects a broader pattern in the AI industry: lofty promises of equality and humanitarian AI have given way to the reality of market forces and power consolidation. Understanding this narrative shift is important because it suggests that AI’s benefits may not automatically be broadly shared. If the technology’s leading proponents prioritise competitive advantage and profit – driving billion-dollar spending sprees to outgun each other – who will ensure that ordinary people see the upside? Altman himself once blogged about how AI could eventually fund a universal basic income by vastly increasing productivity; he later proposed the concept of “universal basic compute”, whereby everyone would receive a share of the compute of a future model such as GPT-7. But as of now, AI’s windfalls are accruing to AI companies and their investors, with little evidence of an upcoming UBI, UBC or widespread societal dividend. This critique is not to villainise making a profit or investing in AI – it is to point out that narratives matter. When AI was sold as a public good, it attracted trust and collaborative spirit. Now that it’s being treated as a gold rush or arms race, public trust may wane, and calls for regulation or redistribution will grow louder.

Tail Risks and Superintelligence – Confronting the “Catastrophic” Downside

While the AI economy’s impact on growth is one concern, another lies in the “tail risks” of advanced AI – the small-but-nonzero probability that something could go dramatically wrong as AI systems become more powerful. Here I draw on recent input from OpenAI itself and other experts, which underscores how seriously those at AI’s frontier are taking the issue – at least rhetorically. In a notable November 2025 statement, OpenAI warned that superintelligent AI systems could pose potentially catastrophic risks to humanity³⁰. This was one of the most stark public warnings yet from a leading AI lab about its own technology. “The potential upsides are enormous; we treat the risks of superintelligent systems as potentially catastrophic,” OpenAI wrote³¹. In plain terms, if artificial general intelligence (AGI) or superintelligence were achieved without proper safeguards, the consequences could be dire – ranging from the AI disempowering humanity, to unintended disasters (cyber or even physical, via misuse of biotech or military tech). Importantly, OpenAI’s statement didn’t just introduce fear; it coincided with calls for concrete safety measures to mitigate these tail risks. They stressed the need to “empirically study safety and alignment” techniques for advanced AI and suggested that, if needed, the world should even consider slowing down AI deployment to allow more time for careful evaluation of systems that can self-improve³² ³³. I speculate that such warnings may well act as a macro shock, feeding the growing investor impatience that could ultimately trigger the bubble’s burst.

²⁷ Guardian & Economic Times. “OpenAI’s $1.4 Trillion Compute Plan and Investor Reactions.” 2025.

²⁸ Guardian & Economic Times. 2025.

²⁹ Lee, C. “OpenAI Completes Major Reorganization with $135 Billion Microsoft Stake.” Time, 2025.

³⁰ Indian Express / PTI. “OpenAI Warns of Catastrophic Risks from Superintelligent AI.” 2025.

³¹ OpenAI. “Blog Post on Superintelligence & Coordination.” 2025.

³² Indian Express / PTI. 2025.

³³ OpenAI. 2025.

OpenAI’s leadership advocated several specific risk-management strategies: establishing shared safety standards among leading AI labs, instituting public oversight proportional to AI capabilities, building an “AI-resilient” societal infrastructure, and improving methods for auditing and controlling AI models³⁴ ³⁵. For example, they proposed that no AI system deemed “superintelligent” should be released without the ability to robustly align and control it³⁶ ³⁷. This aligns with what AI safety researchers have long argued – that you should be able to reliably steer an AI’s behaviour (alignment) and have off-switches or constraints (control) before it gets too advanced. OpenAI even floated the idea of international cooperation akin to arms-control treaties: “frontier labs should agree on shared safety principles, share research on new risks, and reduce race dynamics” – noting that collective action will be critical to prevent a destructive race to the bottom³⁸ ³⁹. Eliezer Yudkowsky has stated that current developers “cannot, I fear, control the god-machines they’re trying to build”, which highlights that such coordination is easier said than done. This too, I speculate, could deliver a macro shock – or, in the worst case, human extinction.

Notably, the Indian Express article highlights that firms like Meta, Microsoft and Elon Musk’s xAI are all vying to develop superintelligent systems, spending billions and offering colossal salaries to top researchers⁴⁰. This pace heightens the race-dynamic risk: companies may cut corners on safety to beat competitors. It’s a classic prisoner’s dilemma – everyone knows a misaligned AI could be catastrophic, but each player fears that if they don’t push forward, someone else will. We can only hope that we do not end up at the mutual-defection Nash equilibrium in this case. OpenAI’s call to “mitigate race dynamics” is essentially to avoid this scenario through coordination.
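The race dynamic described above can be made precise with a toy payoff matrix. The payoff values in the sketch below are hypothetical ordinal rankings chosen only to illustrate the structure of the dilemma – racing strictly dominates for each lab, so mutual racing is the unique Nash equilibrium even though mutual safety pays both more:

```python
# Toy two-lab "AI race" prisoner's dilemma; payoffs are illustrative only.
from itertools import product

MOVES = ("safety", "race")
PAYOFFS = {  # (A's move, B's move) -> (A's payoff, B's payoff)
    ("safety", "safety"): (3, 3),   # coordinated, careful progress
    ("safety", "race"):   (0, 4),   # the cautious lab falls behind
    ("race",   "safety"): (4, 0),
    ("race",   "race"):   (1, 1),   # corners cut, catastrophe risk rises
}

def best_response_a(b_move):
    return max(MOVES, key=lambda a: PAYOFFS[(a, b_move)][0])

def best_response_b(a_move):
    return max(MOVES, key=lambda b: PAYOFFS[(a_move, b)][1])

# A Nash equilibrium: neither lab gains by unilaterally deviating.
for a, b in product(MOVES, MOVES):
    if best_response_a(b) == a and best_response_b(a) == b:
        print(f"Nash equilibrium: A={a}, B={b}, payoffs={PAYOFFS[(a, b)]}")
# Prints only A=race, B=race -- the outcome OpenAI's coordination call targets.
```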

Earlier, this essay discussed tail risks like algorithmic bias, autonomous weapons, and the prospect of “runaway” self-improving AI (largely based on the ai-2027.com scenario). Reinforcing those points, I highlight that the very creators of cutting-edge AI acknowledge the severity of these risks in unprecedented terms. When a leading lab uses words like “catastrophic” in relation to its own field, it lends credence to policy advocates urging proactive measures. Indeed, there is a growing consensus (across companies, academics and governments) on at least one step: no superintelligent AI should be launched without rigorous alignment⁴¹ ⁴². How we determine that a system is safe remains an open question – one involving technical research into AI cognition and extensive testing. But the principle is clear. Additionally, OpenAI’s recommendations for public oversight suggest that leaving safety entirely to corporate self-policing is insufficient. Mechanisms like evaluation boards, audit requirements, or even licensing for training extremely large models are being floated to ensure an independent check.

One concrete proposal gaining traction is the idea of an “AI pause” at certain capability milestones. For instance, governments could mandate that any model surpassing a defined threshold of power (say, a model that could autonomously write and execute code to modify itself) must be held back until a safety review is passed. Another idea is a global monitoring body that tracks the computing resources (GPUs, electricity) being poured into AI projects – analogous to the IAEA’s tracking of nuclear materials – to spot when someone is potentially leaping toward a dangerous system. OpenAI’s latest statements nod to such ideas by emphasising “reporting and measurement by labs and governments on AI’s impact” and cross-country cooperation, especially to prevent things like AI-fueled bioterrorism⁴³ ⁴⁴.

³⁴ Indian Express / PTI. 2025.

³⁵ OpenAI. 2025.

³⁶ Indian Express / PTI. 2025.

³⁷ OpenAI. 2025.

³⁸ Indian Express / PTI. 2025.

³⁹ OpenAI. 2025.

⁴⁰ Indian Express / PTI. 2025.

⁴¹ Indian Express / PTI. 2025.

⁴² Indian Express / PTI. 2025.

⁴³ OpenAI. 2025.

Ultimately, addressing tail risks requires navigating a tension: how to reap AI’s benefits (which OpenAI rightly says could be “enormous” in areas like healthcare and education) without courting existential threats. The expanded discussion here, enriched by OpenAI’s own perspective, underscores my conclusion: robust AI safety is not a distant academic concern but an urgent economic and policy priority. As investors chase the next big thing and nations scramble for AI supremacy, the world must remember that a single misaligned superintelligence, even if highly improbable, could negate many of the gains – or worse. In the cost-benefit calculus of AI innovation, tail risks carry immense weight on the downside. It is encouraging that leading AI figures are voicing these concerns; the key will be translating words into concrete guardrails.

Conclusion

The American economy at the end of 2025 presents a paradoxical picture. GDP is growing, markets are exuberant about AI, and corporate earnings (at least for the tech elite) are robust. Yet many Americans feel as though a recession has already arrived, with living costs up, real incomes flat and consumer spending running on fumes. My research shows that these perceptions are not an illusion – they reflect an underlying reality in which headline growth is propped up by a narrow, extraordinary AI investment boom while the wider economy struggles. In effect, we’re living through a moment where an AI bubble (or revolution) is “covering up” a broader stagnation. Whether this ends happily – with AI delivering a productivity renaissance that does lift all boats – or ends painfully – with the bubble popping to reveal an economy in recession – remains uncertain.

What is clear is that policymakers, business leaders and investors must all adjust to this new landscape. Reading the aggregate GDP is no longer enough; one must ask which economy is growing. If it’s primarily the “AI economy”, then strategies must ensure that its gains translate into jobs, incomes and broad-based improvements – not just data centres and stock buybacks. Fiscal and monetary policy should also adapt: fighting inflation or fine-tuning interest rates based on top-line growth could misfire if that growth is narrowly based. For instance, if the Federal Reserve sees 3 %+ GDP and tightens policy further, it might unknowingly crunch the non-AI economy into a worse state (since the AI sector is less sensitive to rates), causing a deeper fall in consumer activity.

On the technology front, the narrative shift among AI leaders like Sam Altman is a reminder that who guides AI development and why they do it matters profoundly. Altman’s journey from espousing AI as a public good to championing OpenAI’s for-profit dominance encapsulates the commercialisation of AI. There’s nothing inherently wrong with commercialisation – it drives innovation – but the balance between profit and public interest must be watched. If the next generation of AI is developed under a purely competitive, winner-takes-all mindset, issues like fairness, transparency and equitable benefit may fall by the wayside. We might get powerful AI systems that boost GDP while exacerbating inequality (e.g., via job displacement) or eroding privacy and trust. Ensuring that AI serves society broadly, not just its creators’ wealth, is a challenge we can’t ignore. Regulators may need to step in – for example, mandating that foundational AI models undergo external audits for bias and risk, or that the profits from automating labour at least partially fund retraining programmes or social safety nets. Sam Altman’s earlier vision of AI-funded UBI might have been premature, but it’s not too early to discuss how AI’s economic windfalls can be shared – before they concentrate irrevocably.

Finally, on the existential and long-term risks, the twin to Altman’s narrative shift is his dual role as both AI hype-man and AI doomsayer. In public forums, he urges society to brace for potential “AGI ruin” even as he builds AGI. This has the effect of raising awareness (good) but also perhaps instilling fatalism (bad) – some might think “if even the CEO of OpenAI can’t guarantee safety, what can we do?” The answer, as our report reinforces, is that we must do a lot: demand transparency, insist on safety brakes, craft sensible regulations,

and encourage international cooperation. The Indian Express coverage of OpenAI’s warning noted that companies are hiring top talent and pouring money precisely into the goal of superintelligence – a goal that, if achieved without checks, could indeed be a Pandora’s box⁴⁵. The time to put guardrails on AI development is now, not after the fact.

In conclusion, evaluating U.S. economic growth in the AI age requires a multidimensional lens. We must distinguish cyclical or structural weakness in the traditional economy from the dazzling ascent of the AI sector. We should celebrate genuine technological progress but remain critical of “growth” that is narrowly sourced. We must hold leaders to their promises and principles, and be prepared to call out hypocrisy where we see it – because trust will be crucial in guiding public support for AI initiatives. And we need to approach AI’s future not with blind optimism or dystopian despair, but with pragmatism: investing in its promise, managing its perils, and ensuring its rewards don’t merely pad a few pockets but uplift the many. The AI age is here; our task is to navigate it so that economic growth and human progress move in alignment, rather than at odds.

Bibliography

1. Furman, Jason. Information Processing Investment and GDP Growth. 2025. This paper estimates that information-processing equipment and software accounted for around 92 % of U.S. GDP growth in the first half of 2025.

2. Apollo Global Management. Consumer vs. Data Center Contributions. 2025. Analysis showing that data-centre construction contributed about as much to GDP growth as consumer spending in early 2025.

3. J.P. Morgan Asset Management. On the Minds of Investors: Is AI Already Driving U.S. Growth? 2025. Report noting that AI-related capex added around 1.1 percentage points to GDP growth in H1 2025 and that data centres have low multiplier effects.

4. Thompson, Derek. How AI Conquered the US Economy. 2025. Substack report highlighting the split between a booming AI-capex sector and a sluggish consumer economy.

5. Reuters. OpenAI Restructuring and Altman Equity. 2024. Coverage of OpenAI’s shift to a capped-profit model and Sam Altman’s acquisition of a potentially $150 billion stake.

6. GrowthShuttle. Why Sam Altman Doesn’t Own Equity (then). 2023. Article explaining Altman’s earlier decision to forgo equity in OpenAI and his minimal salary.

7. Indian Express / PTI. OpenAI Warns of Catastrophic Risks from Superintelligent AI. 2025. Reports on OpenAI’s warning that superintelligent AI poses potentially catastrophic risks and the firm’s call for coordination.

8. OpenAI. Blog Post on Superintelligence & Coordination. 2025. Notes that the company treats superintelligent AI risks as catastrophic and advocates for strong alignment and oversight measures.

9. Lee, Chloe. OpenAI Completes Major Reorganization with $135 Billion Microsoft Stake. 2025. Time article reporting that the nonprofit OpenAI Foundation holds a $130 billion stake in the for-profit OpenAI Group PBC and that Microsoft holds a 32.5 % stake valued at roughly $135 billion.

10. Matsakis, Louise. The AI Data Center Boom Is Warping the US Economy. Wired, 2025. Article noting that Microsoft, Alphabet, Meta and Amazon plan to spend roughly $370 billion on AI-related capital expenditures in 2025, with Microsoft’s $35 billion representing 45 % of revenue and Alphabet’s capex possibly reaching $93 billion.

11. Fortune magazine (anonymous). Information Processing Investment and GDP: The New Bubble? 2025. Outlines that without data-centre investment, U.S. GDP growth in H1 2025 would have been only 0.1 %, and that information-processing equipment and software (about 4 % of GDP) accounted for roughly 92 % of growth.

12. Sherwood News. AI Capex Versus Consumer Spending. 2025. Reports that AI-related capital expenditures contributed more to GDP growth in early 2025 than consumer spending.

13. Lifestyles Magazine. Sam Altman 2025 Axel Springer Award & Philanthropic Commitments. 2025. Celebrates Altman’s philanthropic contributions, noting that he joined the Giving Pledge and supports universal basic income and pandemic research.

14. Public Citizen. Sam Altman Donated $1 Million to Trump’s Inaugural Fund. 2020. Reveals Altman’s political donation and the hiring of former Trump campaign staff by OpenAI.

15. AP News. Sam Altman and Husband Join the Giving Pledge. 2024. Confirms that Altman and his husband Oliver Mulherin promised to donate more than half their wealth and emphasises their focus on abundance-creating technologies.

16. Guardian and Economic Times. OpenAI’s $1.4 Trillion Compute Plan and Investor Reactions. 2025. Describes OpenAI’s massive spending plan and Altman’s testy exchange with investor Brad Gerstner; highlights circular deals among OpenAI, Oracle, Nvidia and other partners.

⁴⁵ Indian Express / PTI. 2025.

“Use every man after his desert, and who should ’scape whipping?” Should the law treat offenders better than they deserve?

“Use every man after his desert, and who should ’scape whipping?” says Hamlet to Polonius in Act 2, Scene 2 of Shakespeare’s Hamlet, reflecting one of the play’s central moral tensions: justice vs. mercy. Shakespeare uses this poignant rhetorical question to comment on human imperfection, thereby dramatically highlighting that treating everyone exactly as they ‘deserve’ would leave everyone punished, as everyone eventually does wrong. Therein lies the timeless question: should the law treat offenders better than they deserve? In other words, is strict justice fair, or even possible, or should the legal system aim to be merciful and kind, and focus on a humane view of offenders through rehabilitation, redemption and leniency, even for those who have violated justice?

At first glance, the immediate thought may seem to favour the latter, as treating offenders ‘better than they deserve’ would suggest a merciful approach to law, allowing for redemption in a legal model which enables peace as well as justice through kindness towards human flaws. Moreover, this would avoid any unnecessary or excessive punishment that serves no good if peace is maintained regardless. However, is there a moral responsibility, or a just cause, that calls for an ‘eye for an eye’ approach to law and offence? It is not only a responsibility that lies in the call for justice or deterrence in the path to preserving peace for the victims, their families and the world around us, but also a profound ethical dilemma that questions how ‘wrong’ or ‘offence’ is defined, whether other, equal persons have the moral right to condemn such actions, and finally how these persons should approach reaching a proportionate conclusion that preserves moral integrity and justice. Aristotle distinguished between ‘corrective’ justice and ‘distributive’ justice, the latter concerning the fair allocation of goods and burdens in accordance with what each person deserves, or ‘their due’. This Aristotelian concern with proportion and fairness is later echoed in Kant’s Metaphysics of Morals, where justice is framed as the objective proportion of action that the law demands: neither more nor less than what one deserves in answer to their desert. In this way, the principle of justice is preserved, ensuring a fair and consistent legal system that also maintains the law’s deterrent effect against wrongdoing. However, we then face the problem of justifying the actions taken, or punishment, that comes with this enforcement of justice. To resolve this, we should look towards Locke’s own ideas surrounding retribution and restitution; Locke argued that every man has a right to ‘punish an offender to such a degree as may suffice to make reparation and restraint’. Whilst Locke emphasised that punishment should be proportionate to the amount required for peace, rather than a purely equal amount of suffering in response to desert, he remains a key contributor to the retributive, justice-focused view, arguing against the law treating offenders better than they deserve. Moreover, whilst Hamlet points towards a more merciful and humane view of law and punishment, Shakespeare displayed opposing portrayals of justice within Othello and The Merchant of Venice (e.g. “If you wrong us, shall we not revenge?” – Shylock, The Merchant of Venice), demonstrating humanity’s own internal conflict between our desire for true, fair justice and our inclination to be merciful, which ultimately results in a sacrifice of one for the other. This is relevant because this inherently human dilemma leads us towards the rational conclusion that justice, fairness and an ‘eye for an eye’, almost Old-Testament approach to law seems the right approach to the legal system, preserving public trust in justice, and justice itself, without bias and without potential cultural relativism impacting the result.

Alexander Pope, though, puts it best in An Essay on Man: “To err is human; to forgive, divine.” We humans all have our faults and flaws and inevitably do wrong and accrue desert, whether within law or simply in ethics; it is unrealistic to suggest otherwise. This is truly what sustains Hamlet’s idea behind the quote, which emphasises our human quality of kindness and mercy, even if that means forgoing additional suffering merely to meet the invisible bar of what we conceive as the ‘deserved’ punishment, provided peace is equally well preserved. Some may also take a utilitarian perspective, grounded in Bentham and Mill’s arguments, holding that treating offenders better than they deserve might actually reduce reoffending and create a greater overall amount of peace, which should be a legal system’s overarching focus. Furthermore, whilst strict justice may seem reasonable and attractive, it has

irresolvable limitations: not only does it place a great burden on the legal system to ascertain what objectively proportionate justice is, potentially leading to great bias and relativism across different scenarios, but it also risks wrongful convictions or injustices due to subconscious preconceptions, unidentifiable by law, relating to racial, economic or other sociocultural factors. Additionally, Locke seems to lie in the balance of these two ideas, as he goes on to emphasise the value of liberty and individual dignity, which may be overruled by higher positions of excessive moral power and authority. In reality, however, there lies a simpler, human issue at heart: it is often those who have fallen victim to offence or suffering who go on to commit similar offences, having often experienced injustice themselves. As the famous aphorism goes, ‘hurt people hurt people’; whilst this does not strictly justify human wrongdoing or moral desert, it attaches a sympathetic and merciful lens to applying justice in the legal system. This leads us to question legal systems such as that of the United States, which still carries out capital punishment, and the way we punish offenders, overlooking the causes of and lead-up to their actions, serving justice based on the actions alone. Now, to prevent any miscommunication, I must clarify that this in no way suggests that the upbringings or factors affecting offenders, leading them to their desert, justify or annul punishment for such offences, but rather that treating offenders mercifully, on occasion treating them ‘better’ than what we perceive they deserve, may simply be the more humane alternative, and perhaps even, ultimately, fairer. Hamlet himself, in the very quote proposed, suffered his own internal conflict, as he fell outside the path that was fated for him. He was a character who did not belong in his own vengeful narrative, thereby exposing the inherently human instinct that rejects an absolutely vengeful legal model of punishment: we naturally incline towards empathy and mercy. It seems an intangible design, observable in people and legal systems globally, that strays from objective morality and enables change, redemption, and peace.

However, on a practical basis, it may seem unreasonable to bear the economic, social and moral cost of relying on rehabilitation and a human’s capacity to change. Moreover, whilst our humane instinct for mercy and empathy is undeniable, we cannot build our legal systems around a human instinct that is so often subject to our chosen ethical codes or to past and present environments. Instead, we should remain merciful whilst preserving moral law and a relatively ‘harsh’ legal system, not only for justice to victims and for deterrence, vital to preserving peace in real circumstances, but also to discourage the effects of treating offenders better than they deserve. Doing so, in practical terms, may cause an overall increase in offences and reoffending if offenders no longer expect a harsh punishment and instead come to count on a merciful, rehabilitative response to their actions. George Eliot expands on this idea: in her words, “Our deeds carry their terrible consequences […] but we can always be merciful”, thus reinforcing the emphasis on mercy and humanity in law, yet remaining opposed to the idea of treating offenders better than they deserve, as the consequences of such offences have great impact, which must be accounted for, not only for justice but also out of mercy to victims and their families. This judgement and administration of justice, however, should be softened by empathy and by treating offenders as humans, because people are more than their worst actions, and the law should understand change: not depend on it, but enable change in human behaviour and action. Through reason and conscience, as Locke highlights, we can become better people, and so may offenders after their actions; although treating them better than they deserve implies the legal system is obliged to do so, rather than this being its aim, as in situations of an excess of offenders with limited prison space, or other misallocations of resources and underestimations. Simply put, a harsh legal system carries a heavy financial and administrative cost that cannot be overlooked, and so does the opposing extreme. An example is evident in jurisdictions such as Norway’s, which may be seen as treating offenders ‘better’ than they deserve; this remains a heavy financial burden for the few who actually benefit from the rehabilitation system. This in itself poses an opportunity cost to the government, which could be spending on other public sectors such as education and healthcare. So, a pragmatic conclusion can be found in the middle ground: a just legal system, allowing for

justice, that treats offenders with empathy and humanity, but not ‘better’ than they deserve, enabling change without reckless sympathy while remaining kind and equal to all.

Ultimately, the answer seems clear: justice and mercy should work together, and thus the law should not treat offenders better than they deserve, although the direction of the prompt remains significant. The question suggests a great emphasis on mercy, kindness and redemption, which remains important; however, we must not take this idea to excess. Justice matters both for the effectiveness of law and for kindness to victims and the public, and must therefore work hand in hand with mercy. So, in Shakespeare’s own words, “earthly power doth then show likest God's, when mercy seasons justice” (The Merchant of Venice, Act 4, Scene 1).

Bibliography

Shakespeare, William. Hamlet. Edited by Harold Jenkins. London: Arden Shakespeare, 1982.

Locke, John. Two Treatises of Government. 1689. Edited by Peter Laslett. Cambridge: Cambridge University Press, 1988.

Locke, John. An Essay Concerning Human Understanding. 1690.

Locke, John. Some Thoughts Concerning Education. 1693.

Anscombe, G. E. M. “Modern Moral Philosophy.” Philosophy 33, no. 124 (1958): 1–19. Retrieved from https://doi.org/10.1017/S0031819100020319

Aquinas, Thomas. Summa Theologiae, Second Part of the Second Part, Q. 30–33. Retrieved from https://www.newadvent.org/summa/

Augustine. City of God, Book 21. Retrieved from https://ccel.org/ccel/augustine/civitate/civitate.html

Beccaria, Cesare. On Crimes and Punishments. 1764. Translated by Henry Paolucci. Indianapolis: Bobbs-Merrill, 1963. Retrieved from https://oll.libertyfund.org/title/beccaria-on-crimes-and-punishments

Eliot, George. Adam Bede. London: Blackwood and Sons, 1859. Retrieved from https://www.gutenberg.org/ebooks/507

Kant, Immanuel. The Metaphysics of Morals. Translated by Mary Gregor. Cambridge: Cambridge University Press, 1996. Retrieved from https://doi.org/10.1017/CBO9780511809640

Pope, Alexander. An Essay on Man. London: J. Wilford, 1733. Retrieved from https://www.poetryfoundation.org/poems/44899/an-essay-on-man-epistle-i

Allott, Anthony. “Just Deserts and the Criminal Justice Act 1991.” Law & Justice: The Christian Law Review, no. 95 (1991): 82–91. Retrieved from https://heinonline.org/HOL/P?h=hein.journals/ljusclr95&i=88

Simmons, A. John. “Locke and the Right to Punish.” Philosophy & Public Affairs 20, no. 4 (1991): 311–349. Retrieved from https://www.jstor.org/stable/2265370

Tuckness, Alex. “Retribution and Restitution in Locke’s Theory of Punishment.” The Journal of Politics 72, no. 3 (2010): 720–732. Retrieved from https://doi.org/10.1017/S0022381610000125

Wainwright, Joel. Locke and Traditional Punishment Doctrine. Columbia Center for Contemporary Critical Thought. Retrieved from https://cccct.law.columbia.edu/content/locke-and-traditional-punishment-doctrine

Green, Michael. “John Locke on Rights.” Carneades Political Philosophy. Retrieved from https://carneades.org/locke-on-rights/

Pratt, John. “Scandinavian Exceptionalism in an Era of Penal Excess.” British Journal of Criminology 48, no. 2 (2008): 119–137. Retrieved from https://doi.org/10.1093/bjc/azm072

Tonry, Michael. “Learning from the Limitations of Deterrence Research.” Crime and Justice 37, no. 1 (2008): 279–311. Retrieved from https://www.jstor.org/stable/10.1086/525023

The Scramble for Africa: A Resource Grab Disguised as Imperialism?

Africa was steadily colonized by European powers in the years 1884-1914 following the Berlin Conference (1884-85), yet not one African leader was present at the table where the fate of the continent was being decided. In 1870, roughly 10% of Africa’s territory was controlled by European countries, primarily coastal areas such as Algeria (by the French) and the Cape Colony (by the British). By 1914 (post-Berlin Conference), 90% of the continent was colonized; only Ethiopia and Liberia remained independent¹. This was because the former defeated Italian colonizers at the Battle of Adwa² (1896), and the latter remained a U.S.-backed settler state. European powers justified their colonization by claiming that most parts of Africa were terra nullius or ‘empty land’ (e.g., the vacant land theory in South Africa³), inhabited by primitive societies before the partitioning. This claim can be disproven by looking at the diverse economic and political systems of Africa at the time – from its extensive trade networks⁴ to the social and cultural complexity of the continent in the 17th century, there were sophisticated systems that dated back millennia. The establishment of the transatlantic slave trade (1500-1860) had already destabilized African economies, diverting labour to plantations. Colonialism further re-oriented resources away from these networks towards mining and extraction. The so-called Scramble for Africa was presented as imperialist expansion but functioned as a capitalist resource grab – as can be evidenced by exploring cases such as the Second Boer War, in which Britain fought for the Witwatersrand gold fields (around 40% of global output)⁵.

Europeans’ claims of empty land and their self-proclaimed civilizing mission were ideological cover-ups for exploitation, contradicted by Africa’s precolonial complexity and colonial brutality. Firstly, the claims of terra nullius were plainly inaccurate, given the political and economic sophistication that existed well before the 19th century. Politically, African societies demonstrated sophistication – for example the Ashanti Empire’s bureaucratic federal republic. This is one of many examples disproving the concept of empty land: the presence of such complex political organisation demonstrates the sophistication of African society before colonization. Meanwhile, pre-colonial Africa had rich economic networks – the Trans-Saharan trade route⁶ was a vital exchange network which connected the continent with the Mediterranean, facilitating the exchange of goods like gold and salt. While this operated in the west, the Swahili coast was flourishing thanks to the Indian Ocean trade network, which focused more on ivory and porcelain. These routes demonstrate Africa’s integration into global commerce, further evidencing the intricacy of precolonial society and invalidating the idea of Europeans taking vacant land. Secondly, colonizers justified the partitioning of the continent in part through the myth of ‘civilizing’ the African people. By 1914, West Africa (with a population of 15 million) had only 5 secondary schools – which catered only to colonial administrators’ children⁷. Access to secondary education was heavily restricted: less than 0.1% of children in Africa were actively educated in 1914⁸. There is also evidence of education being used as a tool of cultural erasure. French and British schools banned the use of local languages, distancing

¹ Kulik, Rebecca. “Scramble for Africa.” Britannica, September 3, 2024. https://www.britannica.com/event/Scramble-for-Africa (accessed 8/07/25)

² Africa Rebirth. (n.d.). How Ethiopians fought and defeated Italians in the Battle of Adwa. https://www.africarebirth.com/how-ethiopians-fought-and-defeated-italians-in-the-battle-of-adwa/ (accessed 9/07/25)

³ South African History Online. (n.d.). The empty land myth. https://sahistory.org.za/taxonomy/features/empty-land-myth (accessed 9/07/25)

⁴ Green, E. (2016). Production systems in pre-colonial Africa. African Economic History Network. https://www.aehnetwork.org/wp-content/uploads/2016/01/Green.Production-Systems-in-Pre-Colonial-Africa.pdf (accessed 9/07/25)

⁵ 911 Metallurgist. (n.d.). Witwatersrand gold deposits. https://www.911metallurgist.com/blog/witwatersrand-gold-deposits/ (accessed 9/07/25)

⁶ Prime Progress. (n.d.). How pre-colonial African trade networks powered the world’s economy. https://primeprogressng.com/perspective/how-pre-colonial-african-trade-networks-powered-the-worlds-economy/ (accessed 10/07/25)

⁷ UNESCO, General History of Africa, Vol. 7: Africa Under Colonial Domination (Paris: UNESCO, 1990), 412, https://unesdoc.unesco.org/. (accessed 12/07/25)

⁸ Gail P. Kelly, "French Colonial Education and Elite Formation," History of Education Quarterly 18, no. 1 (1978): 87, https://www.jstor.org/stable/216417. (accessed 13/07/25)

children from their own culture⁹. However, farming and mining training increased under colonial rule in order to sustain the economies that European colonizers were building. This, combined with the construction of mining- and extraction-based infrastructure – most notably railways built from mines to ports¹⁰ – shows that colonizers sought to exploit the resources newly available to them in Africa. Before colonization, education in Africa was far from the primitive picture colonizers painted. The 14th-century Sankore University in Timbuktu hosted over 25,000 students, teaching astronomy, law, medicine and more – all subjects later excluded under colonial schooling¹¹. Colonial education deliberately reduced schooling to basic literacy, focusing on cultural erasure and the vocational training needed by the new extractive industries the colonizers had brought to the continent. This demonstrates how Europe replaced existing systems with exploitative alternatives, disproving the claim that colonisation aimed to civilise or educate the continent. Resistance to these policies – for example the Kikuyu Independent Schools¹² (1920s Kenya), which secretly taught subjects like African history and maths – reveals Africans’ commitment to a well-rounded education. As soon as countries like Ghana and Tanzania gained independence, they rebuilt curricula to once more include local languages and history. Europe’s claims of ‘empty land’ and a ‘civilizing mission’ were not only inaccurate; they justified exploitation while simultaneously erasing Africa’s existing political, economic, and intellectual traditions.

The Scramble for Africa was not merely a territorial conquest, but an economic project designed to extract resources through exploitative labour. European powers designed extractive regimes that transformed entire regions into ‘single-commodity zones’ in order to maximize yields, as instances like South Africa’s mineral empires and the tragedy of Congo’s Rubber Terror demonstrate. South Africa’s mineral empires centred on the extraction of gold (in Witwatersrand, post-1886) and diamonds (in Kimberley, post-1870). The Witwatersrand mines saw severe injustices: the wage gap between white and black workers stood at 12:1¹³, and migrant workers from Malawi and Mozambique were paid far less than white miners – part of a broader system of racial exploitation later formalized as apartheid. This model was replicated in diamond mining, where De Beers enforced even stricter racial hierarchies. Founded by Cecil Rhodes in 1888 with British imperial backing, De Beers controlled an estimated 90% of the world’s production of rough diamonds¹⁴. Rhodes exploited local South African labour through closed compound systems, where workers were confined like prisoners¹⁵ and subjected to whippings for “slow work”. The mines created lasting damage, in the form of apartheid funding and modern ‘ethical’ washing: De Beers still owns 30% of the global diamond trade while paying workers in Botswana mines approximately $5/day.¹⁶ In the Congo Free State, Belgian colonizers enforced rubber quotas through systematic mutilation. Soldiers of the Force Publique were required to amputate workers’ hands if they failed to meet production quotas. Unlike the exploitative wage systems in Witwatersrand and Kimberley, the Congo operated purely on terror: no payment, only punishment for lack of output. The required level of output was 4 kg of rubber per week per village¹⁷, and if this wasn’t met, soldiers had to

⁹ Mkhize, N., & Banda, F. (2024). Indigenous language revitalization movements: Resistance against colonial linguistic domination. ResearchGate. https://www.researchgate.net/publication/381051290_Indigenous_Language_Revitalization_Movements_Resistance_Against_Colonial_Linguistic_Domination

¹⁰ Ndlovu, T. (2017). The political economy of gold mining in colonial and post-colonial Zimbabwe. Resources Policy, 52, 136–144. https://doi.org/10.1016/j.resourpol.2017.02.007

¹¹ UNESCO’s General History of Africa, Vol. IV

¹² James Arthur Wilson, Jr., African Resistance and Cultural Nationalism: The Kikuyu Independent Schools Movement in Kenya (thesis, May 1994)

¹³ South African History Online. (n.d.). 1946 African mineworkers’ strike. https://sahistory.org.za/article/1946-african-mineworkers-strike (accessed 15/07/25)

¹⁴ Gemological Institute of America (GIA). (n.d.). Diamond history & lore. https://www.gia.edu/diamond-history-lore (accessed 15/07/25)

¹⁵ Kimberley Mine Records, 1895

¹⁶ The Economist, 2023

¹⁷ Leopold II’s 1892 decree

account for every bullet used by providing a severed right hand.¹⁸ This led to an estimated 10 million deaths overall, and a 50% population decline between 1885 and 1908¹⁹. The Congo rubber trade epitomizes colonialism’s prioritization of resource extraction above all else, systematically dismantling economies and human rights in pursuit of profit.

This economic machinery of colonization is starkly illustrated in three case studies: palm oil in Nigeria (British), cotton in East Africa (German) and groundnuts in Senegal (French). Precolonial Nigeria had thriving palm oil markets, with women playing a central role in production²⁰. Exploitation can be seen in the land seizures under the 1903 Land Proclamation Act, which converted communal lands into plantations²¹, and in the Hut Tax²², payable only in palm oil, which forced farmers into cash-crop labour while export duties ensured that profits flowed to London. This triggered the Aba Women’s War (a rebellion against tax hikes on oil traders), famines caused by lost farmland, and long-term consequences such as Nigeria’s struggle to diversify its economy post-independence. Nigeria’s palm oil industry epitomizes colonialism’s weaponization of taxation and land theft to coerce Africans into global capitalism, sparking resistance that targeted economic frameworks.

This pattern of coercive monoculture repeated across the continent – for example, in German East Africa. The German colonial administration systematically transformed East African agriculture into a cotton export regime through coercive land policies, price controls, and violent enforcement, reducing subsistence economies to dependent suppliers of raw materials for European industrialization. The German colonial Cotton Ordinances (1902-8) mandated that farmers dedicate 50% of arable land to cotton²³. In doing so, colonizers confiscated fertile areas and eliminated subsistence alternatives, forcing workers to choose between wage labour and famine²⁴, while the seized land guaranteed cheap export inputs (due to little to no land costs). Farmers were fined for ‘poor quality cotton’, which trapped some in debt cycles²⁵. Colonizers also imposed fixed prices to take advantage of local workers: Tanzanian farmers were paid 1.5 marks/kg, far less than the global rate of 4.5 marks/kg²⁶. These examples underscore the systematic resource abuse in colonial East Africa. Violent enforcement included the ‘Kiboko’ (whip) system, hostage-taking and relocation to punitive ‘cotton concentration zones’²⁷. The Kiboko system consisted of 25 mandatory lashes for farmers ‘failing’²⁸ to meet weekly cotton quotas – over 300 deaths were recorded in the Singida District alone²⁹. Children were often taken hostage to give colonizers leverage over farmers, imprisoned until the workers’ quotas were met³⁰ – a misuse of power that sparked revolts preceding the outbreak of the Maji Maji Rebellion. The punitive zones led to over 4,000 deaths from starvation in German-administered camps like Kondoa Irangi (1908)³¹. The German colonial system thus implemented land expropriation, price manipulation, and unjust labour policies to transform local agriculture into a cotton export economy, creating a dependent economic relationship that prioritized industrial needs over indigenous subsistence systems. While German cotton policy relied on visible brutality, Senegal’s system employed subtler mechanisms of

¹⁸ Belgian officer’s diary, 1896 (Royal Museum for Central Africa, Tervuren).

¹⁹ Hochschild, King Leopold’s Ghost

²⁰ Economic Exploitation in Kenya (Brett, 1973)

²¹ Colonial Annual Reports, No. 438: Southern Nigeria, Report for 1903

²² James Kanali Kwatemba SJ, MA, Kenya III: Context & History, with contributions by Dr. Jörg Alt SJ, MA, BD

²³ German Colonial Archives, Dar es Salaam

²⁴ Giblin, 2005

²⁵ Iliffe, A Modern History of Tanganyika, p. 112

²⁶ German trade reports 1904

²⁷ Deutsches Koloniallexikon, 1920, Vol. 1, p. 304

²⁸ German district officer reports, Tabora Archives, 1904

²⁹ Leipzig Mission Society, Box 12

³⁰ Maji Maji Research Project, Dar es Salaam, Oral History #47

³¹ Sunseri, Vilimani, p. 62

control within the French groundnut monoculture system, where economic coercion was masked as ‘development’ while achieving the same goals as the German and British systems. Through export monopolies and extraction-focused infrastructure, France engineered Senegal’s dependence on a cash crop that enriched its own merchants while impoverishing local African farmers. Pre-colonially, Senegalese farmers mainly grew crops like millet and rice, with groundnuts grown only for local oils and soaps. In the 1890s, French colonial authorities introduced an “indigénat” head tax, payable only in groundnuts³². This coerced locals into cash-cropping in order to avoid imprisonment³³, intensifying groundnut cultivation and devastating Senegal’s economy and society. Over time, soil quality depleted, and yields dropped from 900 kg/ha in 1900 to 300 kg/ha in 1950³⁴. Additionally, farmers were mandated to sell to French firms (Maurel & Prom) and were paid 40% below global rates. This depletion led to famines and food crises across the region. Infrastructure bias was evident in railways (e.g. the Dakar-Saint Louis line of 1885) built solely to transport groundnuts to ports³⁵, neglecting local needs. This misuse of authority caused a plethora of strikes: in the years 1946-48, farmers withheld crops, reducing exports by 62% in 1947, according to French colonial trade logs. Post-independence, Senegal still devoted 60% of its land to groundnuts³⁶ and France retained control via EU tariffs³⁷. Together, these three cases reveal the core economic logic behind imperialism: the dismantling of pre-existing economic systems to create export-dependent economies through taxation, violence and market manipulation, all aimed at resource extraction.

The economic violence of colonialism – systematic land seizures, extractive forced-labour regimes, and monopolistic control over export markets – provoked resistance movements that were fundamentally anchored in the material realities of subsistence, survival, and the defence of indigenous economic autonomy. Samori Touré’s sovereignty over the Wassoulou Empire (encompassing present-day Guinea and Mali) provoked French aggression precisely because his control over the gold-rich Bouré region and trans-Saharan trade networks threatened colonial ambitions to monopolize West Africa’s wealth. Touré delayed French conquest by 16 years, the longest resistance in West Africa. His multi-pronged strategy combined scorched-earth tactics (incinerating cash crops to deplete French supply lines), industrial autonomy (establishing indigenous arms production to break reliance on European weapons), and economic warfare (barring French merchants from regional markets). As Michael Crowder (1971) wrote, “Samori didn't just fight armies – he strangled their supply chains”. The Mau Mau Rebellion³⁸ (1952-60) epitomizes anti-colonial resistance through its systematic dismantling of colonial economic infrastructure. The colonial administration expropriated Kikuyu ancestral lands, converting them into white-owned tea and coffee plantations where Kikuyu labourers earned a twentieth of what European settlers did. Simultaneously, Hut and Poll taxes³⁹, engineered to coerce Kenyans into wage labour and fund the colonial state, ignited mass indignation. Both factors fed the eventual rebellion, in which the Mau Mau employed guerrilla tactics (20,000 fighters hid in the Aberdare mountains), killed settler cattle and burned plantations. The rebellion forced Britain to eventually negotiate independence (1963). Both examples reveal resistance as economic warfare, where attacking supply chains (Samori’s trade bans) and means of production (the Mau Mau’s plantation burnings) proved more devastating to colonialism than battlefield confrontations. They exposed empire’s fatal contradiction: it

³² Archives Nationales du Sénégal, 1D-142

³³ Conklin, 1998

³⁴ World Bank Soil Surveys, 1961

³⁵ Colonial Railways in Africa

³⁶ FAO Statistics

³⁷ Van de Walle, African Economies, 2001

³⁸ Britannica, “Mau Mau Rebellion.” https://www.britannica.com/event/Mau-Mau-Rebellion

³⁹ James Kanali Kwatemba SJ, MA, Kenya III: Context & History

needed African labour and resources to thrive, yet its brutality guaranteed those very people would weaponize their economic leverage against it.

The Scramble’s economic motives remain fiercely debated, with competing frameworks – from Hobson’s theory of metropolitan capital surplus to Rodney’s thesis of deliberate underdevelopment – each illuminating distinct pathologies of colonial capitalism. These divergences reflect not merely academic disputes, but fundamentally opposed conceptions of how empire functioned as an economic project. The debates resolve into three primary frameworks. J.A. Hobson, in Imperialism: A Study (1902), diagnosed colonialism as Europe’s escape valve for excess capital – a view supported by Cecil Rhodes’ 1895 admission in the Westminster Gazette that empire prevented “civil war at home”, and by the 700% growth in French investment in Senegal’s groundnut trade between 1880 and 1900. Yet Hobson struggles when confronting cases like the Congo rubber terror, where violence dominated over profit, as well as Portugal’s unprofitable cotton schemes in Mozambique⁴⁰. V. I. Lenin, in Imperialism, the Highest Stage of Capitalism (1917), countered that imperialism reflected capitalism’s inevitable monopolization, exemplified by the German East Africa cotton boom of 1900-1914, in which Deutsche Bank and Dresdner Bank formed a cartel controlling 85% of exports⁴¹. Yet Lenin’s materialist framework stumbles over anomalies like the Belgian Congo, which operated at a loss until 1900⁴², sustained by Leopold II’s personal interest rather than profit. The theory’s blind spot emerges most sharply in its erasure of African agency – for example, its inability to explain why Samori Touré’s arms factories threatened French capital flows more profoundly than any European competitor. Walter Rodney, in How Europe Underdeveloped Africa (1972), reoriented this debate by demonstrating how European colonial regimes systematically dismantled Africa’s development pathways. His work contrasted complex pre-colonial institutions, such as the Sankore University mentioned earlier, with deliberate colonial underinvestment in education – France, for instance, allocated less than 0.5% of its West African budget to schooling⁴³. Rodney’s analysis focused on areas like Nigeria (palm oil production) and Senegal (groundnut exports). He argued that this structural violence created what he termed “development in reverse”: a paradigm in which, as the data show, former colonies like the DRC remain trapped in the same extractive patterns, now yielding $24 billion annually in cobalt exports while 73% of citizens live in poverty⁴⁴. Where Hobson exposed capital’s excess and Lenin its monopolistic logic, Rodney revealed their terrible synthesis: a system that could only ‘develop’ Europe by underdeveloping Africa.

To conclude, European claims of 'increasing productivity' masked a grim reality: colonial regimes didn’t create wealth, but forcibly redirected existing African economic flows into channels that bypassed African benefit – a continental-scale system of value diversion whose legacy persists today. The extractive colonial model – monoculture dependence, infrastructure designed for export, and racialized labour exploitation – has evolved rather than ended. Contemporary parallels abound: IMF structural adjustment programs replicate colonial cash-crop taxation, while De Beers' modern mining compounds in Botswana maintain apartheid-era labour control through corporate housing schemes.

The same cycle of exploitation continues today, just with different faces in charge. Where colonial powers once took rubber and minerals, foreign companies now take cobalt and lithium—all while most Africans stay poor. The system hasn’t really changed: roads and railways still lead from mines straight to ports, just like in colonial times, making it easier to ship resources out than to build connections between African communities.

⁴⁰ Newitt, 1995

⁴¹ German Colonial Archives, R1001/726*

⁴² Vangroenweghe, 1985

⁴³ Suret-Canale, 1971

⁴⁴ World Bank, 202

But Africans keep fighting back in new ways. Countries now ban raw mineral exports or demand local factories – just as past leaders burned cotton fields or manufactured their own guns. The real struggle is against the lie that this is simply "how global trade works." After all these years, we must still ask: why do Africa’s resources make others rich while leaving Africans locked out? The answers won’t come from foreign capitals, but from Africans building on their own history of resistance – from ancient scholars to today’s activists.


Rise of Shoegaze

This article is a journey through Shoegaze: I explore the four albums and EPs that took me through the genre, from its infancy in the early 90s to its digital revival in the late 2000s. I want to explore how each of these records came into relevance, their impact and inspirations, and to give some insight into how the genre evolved and how its culture defined itself. I will also give a short personal review of each record, including my top songs on the respective album/EP. This genre is defined by its noisiness and uncertainty, often dipping its toes into themes of drug addiction, personal discovery, and existential exploration.

My first ever Shoegaze album was none other than the famous Heaven or Las Vegas by the legendary Cocteau Twins. Released on the 17th of September 1990, it took the world by storm as the first truly groundbreaking, unique and experimental Shoegaze album, combining the dreamy elements of early 1960s psychedelic rock with the darker, more brooding atmosphere of the post-punk era (bands such as Joy Division), all while adding the Cocteau Twins’ own unmistakable ethereal beauty. Although the genre was still in its infancy, with bands such as My Bloody Valentine experimenting, this album helped mark a definitive start to the mainstream Shoegaze we all know. Lo and behold, it blew up, with over 235,000 sales worldwide as of 1996 (Wikipedia), and stands as what I consider the defining moment that kicked the genre into the mainstream – a major cornerstone for Shoegaze to grow upon.

Now I can talk about what makes this album as incredible as it is. From the moment you press play you are immediately blasted with a phenomenally warm, ambient and, let’s not forget, reverb-soaked riff that establishes expectations the album never once fails to meet. Cherry-Coloured Funk I believe to be the strongest opening track of any Shoegaze album to date. My top three songs are Cherry-Coloured Funk, Iceblink Luck and Fotzepolitic, though the last is interchangeable with Fifty Clowns. Each of these songs is so phenomenally unique and unmistakably Cocteau Twins; it is an absolute must-listen for anyone dipping their toes into Shoegaze.

It makes sense that after the magnificent Heaven or Las Vegas I should move on to Loveless by My Bloody Valentine. On the 4th of November 1991, after three years of production, My Bloody Valentine’s magnum opus, Loveless, was finally unveiled to the world. It instantly garnered the attention of Cocteau Twins and Sonic Youth fans, as it took noticeable inspiration from both in its heavily produced guitar tones and dissonance. The album instantly established itself as a cornerstone of the Shoegaze genre and is often praised as the greatest album of the era, as seen in multiple Shoegaze album rankings (Pitchfork: album of the year; NME) and polls such as Alternative Press. It is often called the cornerstone of Shoegaze because it has all the genre’s notable features (heavily refined yet noisy guitar tones, dissonance, and overwhelming walls of sound built from harmonics) while remaining an open and welcoming record that lets newcomers into the genre without holding back any of its key elements. The album was so phenomenally good that it effectively killed the band itself: lead guitarist and vocalist Kevin Shields couldn’t bear to release anything inferior to Loveless and therefore shelved all of the band’s post-Loveless work.

The band also struggled financially after this album because of how messy the production was. This album was such a peak piece of music that it tore the band apart; there is no way it can’t be counted among the greatest of all time.

The album itself can only be described as mesmerizingly noisy. I say this because of the incredible guitar tone established the moment the album starts. Only Shallow is not only a fantastic track that encompasses all the elements of Loveless; it also shows off the fantastic guitar tone Kevin managed to forge after years of gruelling experimentation. It’s loud, it’s brash, and it’s not going to take the idea of subtlety lightly. It opens with a now-iconic riff that sounds very much like a roar, then transitions into a gentler, but still very present, chord progression for the verse. This is a good time to touch upon the lyrics of this album. I had to look up whether there even were lyrics, because I couldn’t tell if they were singing actual words. It turns out there are, but this represents lyricism in Shoegaze rather well, I fear. The tone and production are so uniquely fantastic that the guitars sound as if they are blending, melting and bleeding into each other, all adding to the noisy, messy atmosphere. This was achieved through heavy production and layers upon layers of guitars with different distortions, tunings and playing techniques. My top three tracks are: When You Sleep, To Here Knows When, and I Only Said. Please listen to this album. There is no more I can say.

Souvlaki by Slowdive has a special place in my heart as the “Berlin album” – the album that made me realise the kind of person I am. Picture this: October 2024, just half a term back into the Upper Fifth, and I’m off to Berlin for a history trip. Obviously my headphones were a trusted companion on this trip, but I was slowly running out of music to play; both Clairo and Mac DeMarco were prominent figures in my Berlin music career, but the main contender was Slowdive, after I heard When the Sun Hits on a Shoegaze playlist. In the summer of 2024 I had been grappling with who I was. I felt as if I didn’t have a personality, as if I would mould myself into whatever cookie-cutter friend the person I was with expected, and I was searching all over the place to find who I was. One of the places I commonly searched was music, and in Berlin it finally clicked. Atop the Reichstag, headphones in, late at night, with a view over the whole of Berlin, I realised that everything alternative had my heart in an inescapable hold. I am proud to say that Souvlaki (especially Alison) helped me come to this conclusion.

The album itself was released on the 17th of May 1993. It is estimated that over 30 million copies have been sold (although this number is not confirmed). It received harsh reviews from critics, notably a 6/10 from NME and 2/4 stars from the Chicago Tribune. As time went on, however, critics began to see the album in a new light and give it the appreciation it deserves, with Pitchfork awarding it 9/10 (2005) and Uncut 9/10 (2023). The critics’ brutality stemmed from the feeling that it was derivative of Loveless – an inferior copy. Most were also over Shoegaze by this point, with publications such as NME focused on Britpop, wanting charm and swagger over the introspective, dramatic feel of Shoegaze. Amusingly, this mockery fed the name “Shoegaze” itself: bands like Slowdive were often shy and stared at their feet on stage, which the critics duly ridiculed. Although this album is often said to have been inspired by (and to an extent a recreation of) Loveless, I disagree. It staked out new territory in Shoegaze: it was heavier than the dreamy Heaven or Las Vegas, but it still

maintained rock-heavy elements, as heard in tracks such as When the Sun Hits. I don’t think Slowdive were trying to recreate the melting feel of Loveless or the dreamy feel of Heaven or Las Vegas; Souvlaki feels like the state between being awake and being asleep. Heavy and dreamy – a combination of Loveless and Heaven or Las Vegas, if you will.

Personally, Souvlaki means a lot to me, but above all else, and beyond the personal meaning, the album is just good. I could go on for days about how much I adore every single one of these tracks. But sadly I don’t have days; I only have time to delve into one song. Alison must obviously be mentioned, as it is arguably the most beautiful song on the album. The first thing that struck me on first listen was just how gentle and soft the vocals were. Neil Halstead delivers emotionally deep lyrics, resonant with themes of drug addiction and the loss of consciousness, in an almost lullaby-like way. I think this song was in fact inspired by lullabies and made as a tribute to one; the lyrics and general themes of the album would seem to suggest as much. The first line of the song is “listen close and don’t be stoned, I’ll be here in the morning.” It’s clear that Neil is trying to comfort someone and lull them to sleep, which already resonates with the sleepy, unclear, almost indefinite feel of the album and works with its hypnagogic theme. It perfectly sets the tone and establishes how angelic this album is. My top three songs are: Alison, Souvlaki Space Station, and When the Sun Hits. All three are phenomenal and an easy introduction to Shoegaze.

We take a time jump to 2007 and LSD and the Search for God’s self-titled EP, officially released to the world on the 16th of January 2007. Personally, I think that although this EP is very short, at only 22 minutes across five tracks, it is still the most raw and pure embodiment of Shoegaze. Whilst bands like the Cocteau Twins, MBV and Slowdive all paved the way for Shoegaze’s rise as a genre, I believe this self-titled EP is the genre’s final form. Seventeen years after Shoegaze broke into the mainstream with Heaven or Las Vegas, this EP managed to embody the genre perfectly in a short and sweet package that is yet to be topped as of 2025. It single-handedly helped revive the genre for a time in the 2010s. After this EP, Kevin Shields of MBV came back to release another album, which was completely unexpected, especially after the band’s self-destruction following Loveless. Although he never confirmed that they inspired his return, it is noted that he kept tabs on underground bands and was aware of the evolving music scene (Arthur Magazine, 2003 Kevin Shields interview). It may well be that this EP inspired a dormant music legend to come back for the revival of Shoegaze. That is an incredible achievement, and even if they didn’t directly trigger Kevin’s return, they are a figurehead of the 2000s Shoegaze revival. Given the massive success of bands such as Radiohead (Kid A) and The Strokes (Is This It), which were clearly inspired by genres gone by from the 90s, and the digitalization of music, it was no surprise that Shoegaze returned to the mainstream. Key figures of this return include LSD and the Search for God, Deftones and eventually MBV. The first time I heard this EP was because I was looking through a list of Shoegaze bands and the name “LSD and the Search for God” caught my eye. Let’s just take a moment to appreciate how cool that name is. The EP sounds exactly how you would imagine that name to feel: reverb-heavy, existentially large and overwhelmingly cosmic. There are no places to hold onto in the record; the structure of the songs isn’t

initially clear, and there are no obvious riffs to pick out as in other songs. It feels relentless in a comforting way: there is nothing you can do to give yourself any sense of control, and therefore there is no need to worry. It’s hard to explain, but it’s comforting and reassuring; once you’ve listened to it, you’ll understand. There isn’t much more to be said about this EP, but if you’re going to listen to any one of its songs, Starting Over is the one to go with.

Tracing the evolution of this genre, from the early dreamy days of Heaven or Las Vegas to the heavy, brooding existentialism of LSD and the Search for God, it becomes clear that Shoegaze never once died away but was instead evolving and shifting in the background, waiting for the right time to surface again. The journey between themes and ideas in this genre reflects Shoegaze’s ability to transcend time and influence. Shoegaze will never truly die, as there will always be someone willing to carry the genre’s legacy on, whether in the footsteps of previous artists or into new territory, as so many Shoegaze artists have done before. There is violence and brutality in this genre, but there is also an unmistakable beauty that links everything together. Not only does it make for an entertaining listening experience, it also serves as a basis for self-discovery and introspection. Overall, the rise of Shoegaze was a rare phenomenon in music, as many bands came together to contribute towards one of the most unique and powerful genres of all time.
