Galatea 2.2 is a book that has taken me over two years to read. It’s not particularly complicated, but the dense technological jargon, combined with Powers sifting through his own complicated human emotions, sent me into a spiral of my own, trying to process what human thought actually is in the wake of artificial intelligence. The novel details the trials and tribulations of programming a machine to “think” in a humanlike manner and formulate its own literary criticism.
After finally finishing the book, I rushed to check its publication date: 1st January 1995. Although artificial intelligence has been in development since the 1950s,1 the novel seemed to anticipate the more recent attempts to create chatbots that replicate human interaction by mimicking speech patterns, essentially layering a generated personality on top of a search engine. The fact that the novel is also semi-autobiographical, with a protagonist named Richard Powers, blurs reality and fiction in a way that complicates the questions technology poses to the humanities even further.
The title alone embodies the elements of construction and creation so prevalent in the novel. In Greek mythology, Galatea is an ivory statue of a woman carved by Pygmalion; her creator falls in love with her, and Aphrodite breathes life into the statue. As well as mirroring the creation of Helen, the artificial intelligence system built to emulate human critical thought in literary criticism, the title reminds us of Helen’s inherently inhuman nature. The animation of both Galatea and Helen is reminiscent of Frankenstein: Powers and the neurologist Phillip Lentz essentially piece together their own monster, replacing decayed flesh with strings of code and electrical wires that fire like synapses. The novel explores the depths of human consciousness and its capacity to be replicated by binary code and mathematical prediction.
Powers’ recurring thoughts of his past love, a woman referred to only as ‘C’, echo through his narration and seem to materialize in Helen’s. This subtly reminds us of the multifaceted human experience: the intense joy and pain of past memories, the present, and the uncertain future. These emotions run as undercurrents for Powers, reminding us that for Helen they are non-existent. Helen can produce extremely detailed, moving essays on topics such as grief, but she has no capacity to feel the things she discusses. To her, words are strings of code assembled from immeasurable inputs of other human works; the fundamental understanding of, and empathy for, a shared human experience do not exist. Helen’s essays are an unsettling reminder of the consequences of reconstructing something semi-human, and they invite us to question the outcome. If Helen were to exactly replicate human feelings and thoughts, would she desire freedom? And if Helen can replicate thought without emotion or desire, is she truly replicating thought at all?
The creation of a semi-sentient being that simulates human thought to aid humanity raises even wider ethical questions. AI doesn’t need breaks; it can be created with the sole purpose of analysing and churning out response after response on demand. Aeon the Collective suggests that AI is inherently a ‘slavery fetish’,2 a capitalistic colonialist’s dream: a 24/7 unpaid worker with minimal margin for error. You can demand constantly and be met with no resistance, and when you create something devoid of emotion, it cannot feel exploited. The implications of creating technology that simulates colonial force and repackages it as acceptable because “it’s not an actual person” echo problematic historical rhetoric and open further ethical questions. In Galatea 2.2, Helen, a feminised, humanised and constantly accessible AI tool, foregrounds this insidious side of artificial intelligence. She even expresses some desire to control the conversation, stating ‘I don’t want to play anymore’ while she is still developing her ability to create literary works. Yet Helen exists in a state where she can be overridden, and she is completely unaware of that fact.
Reading Galatea 2.2 this winter felt oddly prophetic in a feverish, surreal way. Whilst the book dramatizes the extent to which Helen simulates feelings, wants and desires, the fundamental disconnect between human and technological “thought” processes is fiercely debated today. In the humanities in particular, the use of AI for literary research and criticism remains contentious: can a machine truly replicate the cognitive process of researching, understanding and consolidating concepts to create a new, expanded idea? As of right now, in 2026, no. And if one day it can, what does that mean for the future necessity of “human” thought?
Photo by Alexandre Debiève on Unsplash