Google Dubtechno Now

Authenticity and the Privatisation of Emotion

11th of March, 2024




My nana died at the end of February, and one of the things I realised when looking at her body is that dead people don’t look like people. They’re too still and waxy; they look like movie prosthetics. The rise of artificial intelligence and humanoid robots has prompted horror writers to think more about the “uncanny valley” effect, and one of the fruits of this is a certain viral post from a few years ago which pops up periodically these days. It reads:

“The existence of the uncanny valley implies that at some point there was an evolutionary reason to be afraid of something that looked human but wasn't. You just discovered that the reason for which uncanny valley exist never disappeared, it just got better at looking human.”
—u/TheArtick

Thousands of people report that they get chills reading this; that it actually goes beyond its intended use as a writing prompt for pulp horror and reveals something deep and true about the human past. The result of this short-sightedness is, bluntly, romanticism for vampires and shapeshifters: pop culture tropes which are probably more visible in the zeitgeist of the secular post-industrial West than they were in the feudal societies of Europe. Very few people, when faced with that writing prompt, understood intuitively that the reason humans are generally apprehensive of things that look like humans but aren’t is that when people die they turn into corpses, which stink and bring about disease. For hundreds of thousands of years sapiens have been looking at bodies and, according to the sophistication afforded to a society that is intimate with death, realising their disgust, veneration, and love. It would be ahistorical to codify these feelings as poetry in the modern Western tradition, because they don’t well up from within as a complex of contradictions. They are socially known. Indigenous fishermen in the rainforest who catch their fish with bow and arrow know to aim below the fish without understanding refraction, and when a child dies of malaria they know grief.

We have reason to believe that myth and folklore were not usually thought of as explicitly factual by the people of these societies—they understood them to be a different sort of knowledge than that of their immediate experiences or eyewitness reports from trusted peers—but they did in fact believe in them, without categorising them as “fiction” or “entertainment” the way that we do. Those epistemic models are modern ones. The irony is that the very presence of these epistemic models (which should ostensibly clarify our discrimination between material and ideal) actually colonises our thinking, such that knowledge and authenticity are often mediated through the lens of entertainment media.

“His whole life was about putting himself in places where he could get the picture, and those places, of course, tended to be ones of extreme danger. When he died by stepping on a landmine, he had walked away from the convoy up a ridge in Vietnam; he wanted to be up there to get a good vantage point—to get a better picture.”
—David Kogan on Robert Capa
“If your pictures aren't good enough, you're not close enough.”
—Robert Capa

Robert Capa’s maxim works in both directions. If your picture isn’t good enough, don’t back up any farther. There is nowhere else to go but in, which is troublesome for the enterprise of authenticity. It seems hard to imagine now, but in the late 19th century there was something of a battle against romanticism in photography—which means that the old dialectic of man and nature that predominated in romantic works before 1848 somehow managed to live on, smuggled into the new medium. Peter Henry Emerson’s 1889 book Naturalistic Photography was designed to evangelise the author’s particular aesthetics, which entailed not just an explanation of what he thought photography ought to be but also diatribes against what he felt it currently was.

“Wherever the artist has been true to nature, art has been good; wherever the artist has neglected nature and followed his imagination, there has resulted bad art. Nature, then, should be the artist’s standard.”
—Peter Henry Emerson, Naturalistic Photography

This posited 19th-century contradiction between nature and humanity fails obviously at the first hurdle, in that it cannot actually define nature or humanity. It should not be controversial to say that what 19th-century Europeans understood nature and the imagination to be was circumscribed by their bourgeois myopia, and yet we have allowed their shallowness to continue to dictate our own standards for authenticity. We still learn to approach actors like Toshiro Mifune or even Nicolas Cage as eccentric expressionists who are uninterested in convincing us that we are not watching a film. We are not seriously asked to entertain the possibility that their characters are behaving like human beings; instead they are symbols or avatars of a kind of Platonic emotion that doesn’t exist in nature.

“The impulse of the century is toward naturalism. Today this force, racing toward us, is being emphasised more and more, and everything must obey it. This force has abducted the novel and the drama.”
—Émile Zola, Naturalism on the Stage

We should be sceptical about taking Zola at his word on this—he was writing in 1881 and, again, largely attempting to evangelise his own approach—but what he says was made true decades after his death by the total victory of the cinema over the theatre and painting. Whatever movements away from naturalism might have been developing in the arts, including in the cinema, were destined to be marginalised by Hollywood’s prescription of authenticity. Most of the people who have been recognised as the great film actors of the last century work similarly to Humphrey Bogart, who understood that his presence (specifically his presence) within the flux of the edit meant that looking bored was usually the best way to enchant audiences. Marlon Brando and Henry Fonda were born to be great actors because their faces look ambiguous all the time; the extent that they appear convincing is the extent that they refrain from interfering with the audience’s (mis)conceptions about what gravitational operatics are roaring behind their eyes.

“People estimated the mechanical forces acting on an object by judging the critical tilt angle that would cause the object to fall. The judgment was influenced by the image of a face to one side, staring at the object. The effect was as if a force-carrying beam came out of the eyes and gently pushed on the object. [...] About 5% of participants reported an explicit belief in the physically incorrect extramission theory in which vision involves something streaming from the eyes. However, even when those participants were removed from the dataset, the implicit effect of gaze remained. In our interpretation, people construct an implicit model of other people’s vision as an active process that emerges from an agent and that can physically affect objects in the world.”
—A. Guterstam, H. Kean, T. Webb, F. Kean, and M. Graziano, Implicit model of other people’s visual attention as an invisible, force-carrying beam projecting from the eyes

In discussions of emotion, particularly given its role in psychoanalysis, prepositions related to place and orientation tend to pop up. We describe emotions as coming from inside or within us; they originate beneath our conscious and logical cognition and rise above it in moments of passion; their expression usually issues from the eyes—modelled as a flow that joins regions of space over time. The particular cultural importance given to emotion as a flow implies that it is literally emanating from us; that it is not immanent in the dialectic of mind and body nor in simple social relation, but that it develops in a little factory in our hearts and then rises to the surface. This entails that all expression of emotion is in some sense a leakage. Emotion as the chemically driven, individually instantiated invisible aura becomes passion, and it is only an extreme intensification of emotion that initiates this state change—as in water getting too hot and becoming steam.

The authenticity of an expression of emotion appears to lie at a sweet spot in a negotiation between a communicative mode and a physiological arousal level—which is to say, the social and chemical components of feeling. The exact place this sweet spot can be found depends not on the subject of the emotion but on its observers, though of course an individual experiencing emotion can act as an observer of themselves and so declare their own feelings to be more or less real. Film is everywhere, and of course it is not solely or even primarily responsible for whatever cultural criteria we have for discriminating between real and unreal emotions. It does, however, play a part in a feedback loop that informs how emotion as communicative mode is judged by the public, in that films are the most important examples of obviously inauthentic displays of emotion.

This remains true even for documentary footage: in a recent example, pro-Israeli social media accounts confronted with footage of Palestinian women holding their dead babies and weeping declared that the babies were rubber prop dolls and that the women were actors, because if they really had just lost their babies then they wouldn’t be posing in front of a camera. Grief, then, is always most real when it is experienced alone and in silence. Grief is a private matter; and, phrased like this, it stands to reason that those espousing this view have never had to reckon with mass grief—experienced by a class—in the way that Palestinians are having to do right now.

In 2021, a Guardian article reported that North Korea had “banned laughter” during the ten-year anniversary period of Kim Jong-il’s death. Two days later they issued a correction noting that the story was based on an unverified claim from Radio Free Asia, a notorious disinformation platform. The characterisation of this platform is my own—they were also the ones who started the story about people having to get the same haircut as Kim Jong-un. Consider the above footage. The citizens of North Korea are mourning their leader. Their grief is animated, expressionistic, and passionate. The comments on the video range from ridiculing the mourners to expressing horror that they are forced to emote as they do; no-one seriously entertains the possibility that their grief is authentic.

“In fact, the English term “emotion” differs from such similar words as “feeling”, “affect”, “sentiment”, and “passion”—used in the broad sense—in that the former has become established through the development of modern psychology and its experimental method aimed at measuring and categorising fragments of emotive expression, whereas the latter stem from the humanistic-philosophical tradition. The term “passion”, used in the past in place of “emotion”, is distinguished from the latter in common usage as being a violent tension of a certain duration that one undergoes regardless of social restrictions or self control.”
—Paolo Santangelo, Sentimental Education in Chinese History: An Interdisciplinary Textual Research on Ming and Qing Sources

Confucianism prescribes a great deal of cultural propriety in Korea, as it does in many other cultures of East Asia. Its emphasis on individual temperance produces, at least among the literati and ruling classes, a culture of conservative sentimentality that borders on the repressive. In English the description of emotional “repression” often depends on the distinctly bourgeois Victorian English rigor mortis of the soul. Though the use of the word “repressive” in this sense to describe East Asian cultures can doubtless be appropriate in many cases, these cultures are unlike Victorian England in two important ways. Firstly, the moral doctrine of Confucianism is not stratified along class lines to anything like the degree of the enormous divergence between the temperaments of English workers and the English bourgeoisie; the doctrine of zhongyong 中庸 (the doctrine of the mean) was always much more rigorously applied to the aristocracy than to any other class, but the whole of society was touched by its influence. Secondly, as the name “doctrine of the mean” implies, the proper approach to passion was to exhibit and express it in its various forms, without any one form predominating over any other (to show neither an excess of joy nor a want of sorrow, for instance), and to display each in moderation. The notable exception to this, especially in Korean Joseon society, was of course the display of grief in funerary rites.

“In general, male literati were deprived of opportunities to weep and mourn both socially and officially, even in the private and domestic spheres. However, the funeral oration gave them the opportunity to weep, mourn, and lament as they wished. By writing a funeral oration, a man could express his grief and sorrow and even cry in front of family and friends who were also mourning. This did not mean that he was weak or immature; rather, it created a meaningful context through which he could reveal his emotional vulnerability and thus show his compassion as a family member, friend, or colleague. In Confucian culture, funeral writing functioned as (1) the manifestation of courtesy, composing a significant part of the ritual, (2) the authentication of sincerely felt feelings for the deceased, and (3) a natural and familiar response arising out of human nature, a sort of sphere in which to rediscover humanity.”
—Choe Key-sook, A Weeping Man and the Mourning Ritual: Literati Writing and the Rhetoric of Funeral Oration in Eighteenth-Century Joseon

Understanding the Democratic People’s Republic of Korea is obviously very difficult: we don’t have enough information to do it well, and almost all of what we do have is ideologically biased, either by the reactionary sentiment of the West or by the filtration system that any media must pass through to qualify for state sanction. For this reason we end up relying on sources that predate the Japanese annexation in 1910; and while assuming that social relations have remained largely unchanged from feudal Joseon is obviously wrong, we can measure the extent that these Confucian principles still persist in China (where they are either cultivated or tolerated) and in Japan and South Korea (where nationalist sentiment dictates that they are tolerated when invisible or convenient and rooted out otherwise) to give us clues as to how prevalent they are in the North. Of particular interest to us is the extent that Kim Jong-il resembles an Emperor to the masses: to what extent have they adopted the secular philosophy of communism, and to what extent are the leaders regarded as social superiors in the sense demanded by the institution of monarchy? These ideas are contradictory, in the sense of the social sciences, but they are not in practice mutually exclusive: Lenin’s interment in a glass case in Red Square makes this clear.

Thai people mourning the death of King Bhumibol in 2016

Western historiography tends to overemphasise the revolutionary transformations of the late 18th and early 19th centuries when explaining the radical and unprecedented newness of our world—Eric Hobsbawm’s neologism of the “dual revolution” (referring to the combined effect of the economic Industrial Revolution in Britain and the political French Revolution in establishing an unchallenged capitalist social order) marks the pivotal moment of modern history as 1789-1848. But Hobsbawm himself is very clear about the limited movement of this period, and instead emphasises that for the vast majority of humanity the relevance of this Western European revolutionary period is that it led to the (arguably contiguous) world war period of 1914 to 1945: a disruption of the usual European order that finally ushered in the long-awaited collapse of colonialism in Africa and Asia and the triumph of communism in East Asia.

For four fifths of humanity (and this majority has widened and will continue to widen) the revolutionary period as regards culture began in about 1945 and probably continues. The emancipation of women, the rise of a mass culture, the victory of the city over the town and village, and the proletarianisation of the peasantry are all phenomena which began to affect the whole world in this period. North Korea, having developed its national ideology of juche over the first few decades of Kim Il-sung’s premiership, closed itself off from the majority of the secondary cultural transformations which attended this post-war revolution.

Cho Sung-hyung, a South Korean-German filmmaker who lived in the North for some months in order to make her film My Brothers and Sisters in the North, writes: “It was for me like a journey back in time—today's North Koreans are comparable to the South Koreans of the 1970s. During that time, South Korean society was not as extremely capitalistic as it is now, and people were also much more naïve and humane.” The racist mythology of North Korean sensationalism contends that at that time, the 1970s or thereabouts, there was a cultural break along the 38th parallel that saw the Northerners pursue a wildly idiosyncratic course that created a nation of weirdos. Totalitarianism created their psyches and subjectivities, it might be said. By a relative spatial metric, North Korea is a deeply unusual state with deeply unusual citizens, in that they are quite unlike any other country or citizenry that currently exists. But by a relative temporal metric the North Korean subject is probably more like that of most other members of the human race before the 20th century’s revolution of subjectivity, and it is everyone else who has been made peculiar. When the rest of the world underwent this subjective revolution, North Koreans were insulated from some (but not all) of its critical results by their alienation from international capital: the material driver of its force.

One comment reads: “Imagine being forced to cry or your life would be in danger”. This is essentially what they are all doing, imagining. When a century of cinema has taught the West that the camera engenders subjectivity, whilst the material and cultural reality of North Korea’s profound strangeness makes this identification between subject and object impossible, we get intense emotional dissonance. Better political and historical literacy might have helped these people avoid being credulous dupes, but it is not this lack of literacy which has caused their judgements in the first place. What did cause them is complex: we might generalise the cause as the combined secondary cultural effects of the post-war subjective revolution. The emotional dislocation of the camera and the screen is one of those effects.

That these are North Koreans in particular is important—after all, very few people reacted to the passionate grief of the Thai people when King Bhumibol died in 2016 with the same accusations, since without the mythology of totalitarianism there didn’t seem to be an obvious political framework through which to understand their apparently unforced hysterics. The mythology of the North Korean totalitarian regime is obviously stupid; sometimes it is also racist. Invariably, when one looks into the sources for the claims made about people being forced to grieve (whether for Kim Jong-il in 2011 or for Kim Il-sung back in 1994), one finds their origin either in the sensationalism of journalists and cultural critics who are financially backed by right-wing think-tanks, or in defectors who are incentivised to lie by South Korean talk shows that pay them according to the peculiarity of their claims. Though this is important to note here, it is mostly irrelevant to the judgements made by those hapless commenters, who have no interest in reading sensationalist news and instead just watch the footage and form their opinions according to their instincts. The grief looks fake because it looks like bad acting, and bad acting is bad because it looks like acting at all.




Further Reading:

Anti-Communism in Left Philosophy
Cædmon the Outsider
Film Acting and Identity