The Rise of Digital Board Games in the Midst of a Global Crisis

As the COVID-19 pandemic continues, and many are trapped in their homes in isolation, a strange trend has emerged in gaming: a rise in digital board games. Berserk Games’ Tabletop Simulator is by no means a niche game, likely sitting in over one million players’ Steam libraries. Despite this, the concept of ‘regressing’ to board games when more exciting games are readily available seemed strange to many, with most players using the in-built modding capabilities to make more extraordinary tabletop games of their own. Since March of 2020, however, Tabletop Simulator has seen a greater spike in players than ever before. Of course, during these unprecedented times many gamers have been granted significant amounts of extra free time, and this clearly shows in the statistics for games across Steam. While this was bound to be the case, the spike in players of online, digital board games is distinct from this broader trend – simply put, more players are craving a board game experience than ever before.

There’s something intensely tactile and personal about a physical, tabletop game. Rolling dice, tapping counters against the board to count spaces, moving the wrong piece or flipping the table in rage. While most video games are far from linear and rigid, there’s an innate human unpredictability to a physical game that no coded events can match. It’s because of this that board games become the epicentre of fond social memories; every time a player forgets a rule in Monopoly, or adds a new wild card in Uno, the game becomes a completely unique and personal experience – for better or for worse – that only the players at the table can ever share. This, combined with the fact that board games are impossible – or at least incredibly difficult – to play alone, and that only the people sat metres away from you are able to share in the experience, is what makes board games so special.

Tabletop Simulator’s goal is simple: create a multiplayer, physics-based tabletop sandbox. The game offers suggestions of games, of rules, of strategies, but insists that all of this should be upheld by the players and the players only. It merely gives players a cursor to grab and move pieces and gives them free rein of the table. No event is scripted; the die doesn’t roll at the click of a button, the counters don’t move automatically – all the power is given to the players. Human mistakes, fumbling actions and strange interpretations of the rules are all facilitated by this format. The result is as close as many can get to the authentic board game experience without sitting around a table and actually playing one. It is no wonder, then, that despite many having left the board game format behind to pursue the more exciting digital world of gaming, so many players, cooped up in their homes and missing their friends and family, have leapt into the world of Tabletop Simulator. For many, at least for the near future, the game is one avenue for pure, authentic and chaotic human interaction in the loneliness of lockdown.

Spectacle and Visual Effects: Black Panther VS Infinity War

In ‘Space, Place, Spectacle’, Andrew Higson defines two interweaving concepts – narrative and spectacle – that, he argues, push and pull against each other; a conflict that is a prominent factor in the fundamental experience of cinema.

Narrative – in part, the sense of something lacking, installing a desire to explore, to find out what is missing, to move onto a new scene, and the possibility of achieving what is desired… And spectacle – the spectator confronted by an image which is so fascinating that it seems complete; no longer the desire to move on, no longer the sense of something lacking. (1984: 3)

The conflict of spectacle and narrative can at times be extremely effective in enveloping an audience in a film’s world and story, with the spectacular elements supplementing the narrative and ultimately providing a deeper experience. At other times, spectacular elements may be distracting or overbearing, ultimately compromising the importance of the narrative and taking the audience out of the film’s constructed world. Particularly in an age of visual effects and computer-generated imagery, cinema has unlocked the potential to tell fantastic stories that were once thought impossible. The advent of these technologies has become a fundamental part of elevating cinematic spectacle to bold new extremes. This essay will explore the concept of spectacle in modern film; the ways in which visual effects have become a vehicle to generate the spectacular, as in the case of Back to the Future Part II (Zemeckis, 1989) and Transformers: Age of Extinction (Bay, 2014); and finally directly compare the ways in which the Russo Brothers’ Avengers: Infinity War (2018) and Ryan Coogler’s Black Panther (2018) both use spectacle in an attempt to expand their cinematic worlds, weighing up how effective each film is in doing so.

Spectacle and Visual Effects

The importance of spectacle has been present since cinema’s advent – part of the pull of cinema is its potential to completely wow an audience. The writing of Tom Gunning surrounding the ‘cinema of attractions’ explores the fixation on spectacle in some of the earliest days of cinema – which he specifies as 1896-1907. He describes this age as ‘an exhibitionist cinema’, in that ‘it is a cinema that bases itself on… its ability to show something’, going on to explain it as ‘a cinema that displays its visibility, willing to rupture a self-enclosed fictional world for a chance to solicit the attention of the spectator’ (1984: 382). Gunning’s description of this era is particularly applicable to understanding the roots of spectacle within modern cinema. At the advent of cinema, grabbing the audience’s attention played a key role in lifting moving pictures into prominence and forming the foundations of the industry that exists today; in creating spectacular images that stunned spectators, early filmmakers formed markets of audiences hungry to see what more cinema could do – hungry for cinema that was bigger and better. In many ways, this idea is still at least partially prominent in modern cinema, though metamorphosed into something new. Modern blockbusters and studio films strive to hook audiences by promising the ‘biggest’ and ‘best’, just as in the cinema of attractions. Instead of hungering to be wowed by the raw medium and apparatus itself (the ability to display moving images), however, audiences shaped by modern cinema’s ubiquity and cultural prominence now desire unique methods of storytelling, and entirely new, unique – and previously impossible – stories; audiences now understand the fundamentals of the medium and the industry that has formed, and strive to see how the technology can be adapted and used in new ways, whereas in the cinema of attractions, audiences (and filmmakers) were barely beginning to understand film. This metamorphosis of what is still essentially the cinema of attractions forms the foundation for spectacle in modern film.

Technology has been an important factor in generating spectacle since the cinema of attractions. As aforementioned, the initial invention of the apparatus of film and its capabilities were spectacular in and of themselves, but as audiences grew familiar with its principles, technological advances that furthered film’s abilities became spectacular. The films of Georges Méliès, for example, emphasised spectacle generated by adapting and developing the technology available. Méliès saw film as a tool to further his illusions and, in focusing on that, drove technological advancements in the field through experimentation. In modern cinema, the very same drive exists, but for new reasons. Generally, modern cinema accepts narrative primacy as the default – already directly contrasting the attitude Méliès took. As Gunning translates in ‘Cinema of Attractions’, Méliès intended to present spectacle, with narrative as an afterthought to highlight the spectacle:

As for the scenario, the “fable,” or “tale,” I only consider it at the end. I can state that the scenario constructed in this manner has no importance, since I use it merely as a pretext for the “stage effects,” the “tricks,” or for a nicely arranged tableau. (1984: 382)

Since 1907, it could be argued that a shift has occurred in which the dialectic of narrative and spectacle has generally inverted; spectacle is now the addition that highlights the narrative, or at least in some way expands the cinematic world the filmmaker is presenting. In Art, Image and Spectacle, Isaacs discusses the work of James Cameron much as Gunning discusses Méliès:

I locate Cameron’s aesthetic orientation at the intersection of the two competing interests of the High Concept film. The auteurist vision subsists in the attempt to “invent cinema”, to make cinema new through the exponentially advancing technologies of the spectacle… For Cameron vision is more than a medium for the conveyance of “reality.” The special effect is never purely mimetic, but transformative. (2011: 91)

The drive that Isaacs argues Cameron plays a part in pioneering is extremely comparable to Méliès’ drive to adapt technology to further his illusions; Cameron, however, intends to adapt technology to further his narratives and expand the boundaries of the cinematic world he creates. Isaacs goes on to mention Cameron’s use of vision in The Terminator, allowing the audience to peer through the eyes of the Terminator himself – a tactic which not only creates great spectacle, but also arguably assists the audience in understanding the world and narrative at a deeper level. The scenes create a sense of tangibility – that this character is real, alive and ‘thinking’ – presenting a believability which supplements the film’s narrative and immerses spectators in the world of the film. Cameron’s principle is held by many modern filmmakers: that technology can create spectacle that enriches the narrative of the film – it is here that visual effects have become a vehicle for the spectacular.

A reliance on visual effects to generate spectacle in film emerges from the successes of in-camera special effects created through experimentation in the age of the cinema of attractions, such as the illusions of Méliès, and their refinement and adaptation through the sixties and seventies and onwards. Scenes like Moses parting the Red Sea in The Ten Commandments (DeMille, 1956) and almost countless scenes in Kubrick’s 2001: A Space Odyssey (1968) proved that narratives could be heightened by intense, spectacular effects, whilst also suggesting the potential for entirely believable new narrative universes, grounded in complete fantasy but at extreme new levels of realism or believability – a potential that was eventually exploited by the fantastic settings, creatures and vehicles, all made possible through visual effects, in Star Wars (Lucas, 1977). Through these massive advances in visual effects technologies, filmmakers like George Lucas paved the way for the technology to be used to many different extremes in order to expand the narrative worlds of their films. Some films use extreme visual effects to generate gratuitous spectacle that takes the forefront over narrative and serves no purpose other than to wow, much as the cinema of attractions used spectacle. This is obvious in the case of an almost infamous example: Michael Bay’s Transformers series, which seems to pride itself on compromising narrative depth for the sake of spectacular action sequences with each release. In Transformers: Age of Extinction (2014), a pivotal fight between Optimus Prime and an ancient legendary warrior is continuously interrupted by spectacular slow-motion punches, enormous explosions and a dragon-dinosaur Transformer breathing fire at nothing. Angela Ndalianis eloquently describes this calibre of spectacle as ‘an invitation [that] is extended to us to marvel at the speed, special effects, camera work, and ability the cinema has to extract from us a sense of wonder when confronted with these effects’ (2000). While undoubtedly spectacular, and however impressive a feat of CGI, the scene’s spectacular interludes pause the narrative briefly, just to forefront the effects.

Conversely, an important example of how the inclusion of spectacular visual effects, even at relatively subtle levels, can supplement a narrative is Robert Zemeckis’ Back to the Future trilogy – perhaps most notably Back to the Future Part II (1989). As with many contemporary blockbusters, spectacular visual and special effects are abundant in Back to the Future Part II: some are eye-catching moments of intensified spectacle (hoverboards, skyways and parodic CGI sharks), but some are relatively subtle illusions that are intended to go unquestioned or unseen whilst still maintaining spectacle – these effects are the quintessential example of how spectacle can supplement a narrative. Using miniatures and compositing tricks, the DeLorean in this instalment of the trilogy is given the ability to fly – an effect which, at the time of release, was not unfamiliar. This being said, in one sequence during the film, when Marty and Doc drop a fainted Jennifer home in 1985, virtuosic compositing and visual effects are used to seamlessly blend footage of a miniature with footage of the real, practical DeLorean. The DeLorean comes in to land in front of Jennifer’s house visualised by an animated miniature composited over an empty background plate, shot using motion control. The take then continues as the miniature DeLorean is slightly obscured by a street lamp, hiding a split screen that unveils a second motion control take featuring the practical, full-size DeLorean on set on the other side of the post; in one seamless take, the DeLorean flies into shot, touches down and drives to the porch for the car door to open and Einstein to step out. This moment is extremely spectacular – it is a feat of visual effects, pushing the bar for motion control cinematography of the time – yet it entirely revolves around being ‘invisible’ to the audience. What this presents is an antithesis of Gunning’s ‘exhibitionist cinema’ – this moment is an example of where arguably extreme spectacle is used exclusively to supplement the narrative, in an attempt to avoid audiences questioning the technical apparatus of the film. The effect itself is unquestionably spectacular, yet its seamlessness immerses the audience in the world of the film, as opposed to taking them out of the world to ‘show something’, as Gunning states of the cinema of attractions. Bob Gale, producer of Back to the Future Part II, explains that this was the intention behind the effects of the film:


We take [the effects] for granted. The story is the most important part of the film, and if the audience is involved with that story, the effects are there to enhance that enjoyment and make the story more believable, not to call attention to themselves… What we tried to do was make sure these effects are so tightly incorporated into the story, no one will question how they were done until they’re driving home from the movie. (1989)

Unlike other blockbusters of the time, spectacular elements are employed in the film specifically to serve the narrative and not just to exist for their own sake. The advancements made in technology to allow for these spectacular effects and elements were not driven by a necessity for eye-catching moments, but by the desire to supplement the narrative. These two different ways in which films use spectacular visual effects to prioritise and elevate either spectacle or narrative will serve as the foundation for comparison between spectacle’s usage in Marvel’s Black Panther (Coogler, 2018) and Avengers: Infinity War (A. and J. Russo, 2018) and its effect on their narratives.


Black Panther and Infinity War

Coogler’s Black Panther, like most entries in the Marvel Cinematic Universe, relies heavily on spectacle, and on visual effects as a vehicle to generate that spectacle. Throughout the film there are many strong examples of spectacle being used to immerse the viewer and expand the film’s constructed world, supplementing the narrative. One of the film’s biggest strengths is rendering the city of Wakanda in spectacular detail through CGI and digital matte paintings. In one example, as T’Challa first arrives in Wakanda in a flying vehicle, the audience is presented with a flyover of vast African plains occupied by wildlife and people alike, who wave at the passing vehicle – reinforcing to the audience that these people are real and inhabit the fictional world. Following this, the ship plunges into the trees and emerges before Wakanda in all its glory – great skyscrapers tower above the skyline, boats travel up and down the river, districts with individual streets form the city, all occupied by minuscule cars, with futuristic trams and hyperloop trains whizzing by alongside them. While this scene of course provides wonderful eye candy for an audience to acknowledge on the surface, the immense subtlety and detail serves a similar purpose to the aforementioned sequence in Back to the Future Part II; the decision to employ spectacle of this kind, in this way, convinces the viewer that the world within the film is as real as the world outside the cinema. In an article for the website CityMetric analysing the fictional city, Stephen Jorgenson-Murray eloquently distinguishes Wakanda from other cities in the Marvel Cinematic Universe:

Fictional cities in previous Marvel films… don’t feel like real places at all, but collections of random monuments joined together by unwalkably-wide and sterile open spaces. Wakanda’s capital, the Golden City, seems to have distinct districts and suburbs with a variety of traditional and modern styles, arranged roughly how you’d expect a capital to be – skyscrapers in the centre, high-rise apartments around it, and what look like industrial buildings on its waterfront. In other words, it’s a believable city. (2018)


This inclusion is absolutely integral to the success of Black Panther and plays a huge part in supplementing the narrative. The audience are invited to follow a plot about a battle for the future of an entire nation, implicating real-world, contemporary racial politics, the consequences of which would not easily be felt without the sheer spectacle of the visualisation of Wakanda. Without rendering the civilisation – arguably a character in its own right – in the extreme, spectacular detail the filmmakers chose to include, major character decisions that bear weight due to their grand repercussions for the nation would lose the impact they require to keep the story believable. The cultural and narrative weight of the film would be undermined were its setting not realistic, believable and applicable to the real world. The use of visual effects to render this detail and create such a cinematic spectacle directly supplements the narrative.

Contrary to this, Black Panther at times employs spectacle in the same vein as the aforementioned example from Transformers: Age of Extinction. In the climactic battle towards the end of the film, the action is intensified by spectacular moments, aided by visual effects. What begins as armed combat between tribes escalates when a character uses a horn to summon enormous rhinos to aid in the battle. Dan North explains of spectacular visual effects that ‘the first step in rendering an effects sequence consumable as spectacular fodder is to segregate it from the main body of the film’ (2005), an instruction that Black Panther follows: what is otherwise an intense, character-driven battle implicating all the leaders of the factions the audience has been introduced to is temporarily suspended, and the audience is distanced from the narrative as the CGI war rhinos emerge from the ground. The rhinos’ gratuitous presence seems to make no difference to the narrative of the battle, providing no real obstacle for the characters aside from a relatively minor set-back for the protagonist, when he is flung through the air into a rock. The inclusion of the rhinos, and the complete pause of the narrative to showcase them, is left as nothing more than spectacle added for the sake of wowing an audience, at one point even including a slow-motion rhino beat-down extremely reminiscent of the aforementioned Transformers scene. Sequences like this are incorporated throughout the film and compromise the narrative just to forefront the spectacle of the visual effects.

The Russo Brothers’ Avengers: Infinity War, a seemingly generic entry into the MCU, takes a radically new approach to spectacle by introducing the primary antagonist, Thanos. Thanos as a character presents incredible production challenges, due in part to his alien species, but also due to the necessity of making him feel believable. Infinity War intends to partially subvert the conventions of the MCU, treating its villain as a character with depth and motivation and not just as a shallow, evil force. It could be argued that the film’s narrative follows Thanos’ journey and the obstacles he faces in reaching his goal more than the journey of the Avengers, an argument only bolstered by the film’s subversive ending, in which Thanos succeeds. What was necessary, then, for the character of Thanos – and by extension the entire film’s narrative – to function was, on the one hand, to present his spectacular appearance and fantastic biology, while equally allowing him to hold as much emotional depth as an ordinary character, so as to be a fully believable presence in the unfolding story. Thanos in Infinity War attempts to become a balance of spectacle and narrative, providing spectacle whilst being fundamental to the narrative. In the film, Thanos is rendered in arguably unprecedented photoreal CGI with excruciating detail; elaborate muscle systems and physics simulations combined with ultra-high-resolution modelling and texturing, and a motion-capture performance by Josh Brolin, create the ultra-realistic antagonist. Absurd attention to detail furthers the illusion, with visual effects teams even including stubble that grows on Thanos’ head as the film progresses. Hopkins argues that ‘For the viewer to successfully… leave the real world and enter, if only partially, the imaginary cinematic place, the spectacle on screen must resemble at least vaguely the spectacle of everyday life’ (1994). In creating a spectacular CGI character that so closely resembles reality, the filmmakers maintain the spectacular appearance of Thanos, comparable to the aforementioned spectacle in Transformers, whilst also persuading the audience to accept Thanos in the same way that unquestioned moments of spectacle or ‘invisible’ visual effects function – serving the same purpose as the described moments in Back to the Future Part II – ultimately including spectacle without compromising the narrative of the film.

Even with Infinity War’s immense drive for spectacle that assists in immersing the audience in its narrative, the film is still arguably constructed mostly of spectacle. One could argue that despite the attempts of the filmmakers to make Thanos feel realistic enough to supplement their narrative, the entire idea of including Thanos in the film was fundamentally a decision fixated on spectacle over narrative; it is likely that Thanos was selected to be the film’s antagonist due to the spectacle his on-screen presence would generate. Additionally, Infinity War is guilty of the same gratuitous inclusion of spectacular elements purely for the sake of showcasing those elements. In the many large-scale battles of the film, in order to ‘raise the stakes’, the filmmakers introduce spectacle to wow the audience, as opposed to narratively introducing new obstacles for the characters to overcome. In the case of the battle that takes place on the planet Titan, the stakes of the fight never develop or evolve; to compensate, the filmmakers employ spectacle to intensify the scene. In a moment of sheer, unadulterated spectacle, Thanos tears a moon from its orbit and sends its debris crashing down upon the Avengers. While the sequence serves as immensely visually stimulating material, the narrative repercussions of this action are near non-existent; the Avengers are almost entirely unfazed, the only reaction to the event being a quip from Iron Man. This entire moment serves as nothing more than spectacle, disrupting the narrative, compromising the immersion and jarring the audience out of the constructed world for a moment to gaze in awe at the event itself – comparable to Black Panther’s rhinos and Transformers’ dragon.

Overall, then, the conflict between spectacle and narrative continues to affect cinema greatly. Filmmakers are able to use spectacle to supplement their narratives, as in the case of invisible effects made up of spectacular detail. Equally, some filmmakers rely on spectacle to intensify elements of their narrative in place of a more narrative-driven solution, ultimately compromising their narrative in favour of spectacle. An analysis and comparison of Black Panther and Infinity War reveals the methods contemporary blockbusters use to incorporate spectacle. Both films at times incorporate spectacular elements to immerse their audiences in the worlds they create, whilst at others they halt the immersion and exhibit pure spectacle as is. Despite emerging as a tactic to draw attention to the medium of film in the age of the cinema of attractions, the inclusion of spectacle is arguably still a necessity for modern cinema, as it forms the foundations for new stories; without elements of the spectacular, driven by technological advancements and visual effects, the ability to tell believable yet fantastic stories with believable yet fantastic settings, as in the case of Black Panther, or believable yet fantastic characters, as in the case of Infinity War, would not exist.


Thanks to Barry Langford for his incredible insight and thorough feedback.



The Importance of Sound in McCabe & Mrs Miller

McCabe & Mrs Miller (1971) is no exception to Robert Altman’s genre-bending and unconventional filmography. Altman’s insistent distortion of classical and conventional techniques gives all his films a consistent filmmaking style. Referring directly to the film’s soundtrack – in the broader sense, i.e. the accompanying audio elements of a synchronised sound film – this essay will consider the ways in which Altman defies the ‘classical Hollywood style’ that had dominated narrative cinema since its inception. The classical Hollywood style of filmmaking was focused on concise portrayal of information above all else – every element of the film would fixate on pointing the audience in the right direction to better understand the narrative. Additionally, films in this style would uphold the conventions of their genres for the same reason. McCabe & Mrs Miller, however, takes a significantly different approach, rejecting the stylistic tendencies of the sea of Hollywood films before it and challenging the conventions of its genre, being proclaimed an ‘anti-western’ by Altman himself (Phillips, 2008).

Altman ensures that every generic aspect of the Western is subverted in McCabe & Mrs Miller, but identifiably so; despite subverting every audience expectation, the film is still discernibly a Western. One of the more obvious ways he achieves this with regard to the soundtrack is the choice of drastically unconventional music. In contrast to the blaring horn ostinatos and galloping rhythms of classical Hollywood Westerns, such as those by John Ford, McCabe & Mrs Miller exclusively features folk tracks written and performed by Leonard Cohen. Instead of establishing the film with the optimistic, Hoedown-esque (Copland, 1942) fanfares that are somewhat intrinsically linked to the themes of manifest destiny and the broad open plains of Monument Valley, Cohen’s sombre tracks subvert this expectation and instead angle the atmosphere of the film towards its equally unconventional setting in snowy and rainy forests, its pessimistic themes of vulnerability and its decidedly anti-Western narrative. In what Scott Tobias describes as ‘mournful interstitials’ (2014) throughout the film, Cohen’s soft, downbeat guitar melodies provide the film with rich atmospheric texture and contrast with conventional Western scores, just as the film’s snowy forests directly contrast with desert plains. Phillips speaks of the stark inversion this decision presents, explaining that through the use of Cohen’s ‘melancholy ballads’, along with unconventional visuals, ‘it is evident that Presbyterian Church contrasts dramatically with John Ford’s Frontier’ (2008). In many ways, Cohen’s tracks achieve the same effect epic Western fanfares achieve, musically capturing and enhancing the fundamental emotions that drive the scenes. The first of the three Cohen tracks to appear in the film – ‘The Stranger Song’, which plays over the opening titles – is exemplary of this. Instead of blaring, upbeat brass exuding pioneer spirit and providing a fanfare for the stoic, alpha-male hero figure, the audience is met with pessimistic, sombre guitar licks and ballad lyrics that mirror the situation of the protagonist they are soon to meet. The lyrics tell of a man who is yet to find his place in the world, who lives his life on the road, alone, constantly leaving people behind and not looking back. While a lone wanderer is not far removed from the archetypal classical Western, the protagonist in the song is described less as a stoic hero and more as a vulnerable man on the run from himself, contrasting with the conventions of the genre. By employing this song in the film’s soundtrack to introduce McCabe, Altman subverts the classical Hollywood style of filmmaking and inverts the conventional Western protagonist, reshaping the genre that categorises the film.

Conversely, Altman recognises that the music he omits from the film is equally as subversive as the music he includes. As aforementioned, the film – besides a few diegetic songs – has but three music tracks throughout its entire two-hour runtime. The crucial, climactic gunfight that closes the film is a key moment where the omission of music is arguably more effective than its inclusion. The soundtrack feels most alive when deathly silence envelops the endangered protagonist. Traditionally, in a classical Hollywood Western – and in any classical Hollywood film, to an extent – music is included due to its arguably integral role in the tension and release of a scene. The epic gunfights typical of the genre are no exception to this stylistic choice, usually including at the very least dramatic stings and risers to heighten the tension of the scene. In Howard Hawks’ acclaimed 1966 Western El Dorado, a tense confrontation occurs when Mississippi threatens the last of his mentor’s murderers with revenge, eventually hurling a knife at one of the men. In classical Hollywood style, tension is heightened by musical cues; upon informing the men that he has killed all of the other murderers, a deep, bassy tuba stab marks the beginning of a musical sting that sonically conveys the danger of the situation. Even in a somewhat tamer scene for the genre, like this one, a dramatic sting or sometimes a full orchestral accompaniment to the dramatic action quickly becomes the centrepiece of the soundtrack in tense moments. This is equally true of plenty of classical Hollywood films outside the Western genre. Altman, however, does not shy away from breaking the conventions of the classical style. In McCabe’s climactic manhunt gunfight, the tension of the cat-and-mouse nature of the scene is enough to pique audience attention and drive the dramatic action without the need for cues and stings. Altman’s deliberate omission of a tense backing track to raise the stakes in the audience’s mind is arguably more effective than the technique employed by classical Hollywood movies; the deafening silence of the soundtrack puts the audience intimately close to the characters as they make the few sounds audible in the scene, and ramps up the tension by wedging the audience right in the midst of the action. Measured against this aspect of the classical Hollywood style traditional of the Western, McCabe & Mrs Miller, while subverting convention, is equally as, if not more, effective than its more traditional counterparts.

A great part of a film’s soundtrack is its sound mixing. Classical Hollywood styles of filmmaking favour comprehensible audio mixing, for the sake of dialogue and narrative clarity. Generally, to ensure this, the primary or main dialogue in a scene will be favoured over all else, with important sounds secondary, and atmospheric ambience or music, where applicable, tertiary. David Bordwell explains the classical Hollywood system for conveying space, discussing composition of shots and blocking of characters and objects, concluding that central characters or significant objects are usually presented centrally in the frame in a balanced environment where audience attention is hardly competed for – i.e. important characters tend to be foregrounded against a distinct background, fully in focus with nothing convoluting their presence, to ultimately ensure narrative clarity. He goes on to apply this to soundtracks in classical Hollywood, explaining that: ‘classical sound technique articulates foreground (principal voice) and background (silence, “background” noise, music “under” the action) with the same precision that camera and staging distinguish visual planes’ (Bordwell, Staiger and Thompson, 1985: 50-60). In McCabe, Altman goes against this convention, with sound mixing designed to draw attention away from the subject. On several occasions, inconsequential, ambient, ‘background’ conversations are amplified to the volume of a prominent line of primary dialogue, and vice versa. This is particularly obvious in a sequence where McCabe, checking on the progress of the saloon, asks where the tents are. As soon as this exchange of dialogue begins, the voice of the worker being questioned is soft and distant, despite him being centre frame and the significant voice of the scene – his speech is literally cut off by McCabe’s question. In response to McCabe’s question, the man starts explaining as his dialogue fades into obscurity beneath the rest of the convoluted soundtrack. Distant laughter occludes the conversation, and an entirely new and separate inconsequential conversation between otherwise irrelevant characters becomes the primary sonic layer. This unconventional sound mixing breaks away from the classical Hollywood style of mixing and shifts attention towards atmospheric texture over narrative progression.

Overall, while McCabe & Mrs Miller intended to bend the conventions of the traditional Western and to break new ground in doing so, the film also attacks the structure beneath nearly every classical Hollywood movie, even those outside its genre. In a feat of experimentation, the film’s soundtrack presents new ways to achieve desired effects, unaccounted for by the overarching style that dominated classical Hollywood. Experimenting with radically different music choices, omitting traditional soundtrack cues and stings, and toying with sound mixing techniques to draw attention to the texture of the created world and away from the narrative, Robert Altman’s anti-Western proves that the dominant method of filmmaking is not necessarily the most effective one; that conventions can be subverted in creative ways to produce results that rival even the best a genre has to offer – a statement only bolstered by the film’s rather ironic placement in the American Film Institute’s Top Ten Westerns (AFI, 2008).

Bibliography

AFI. (2008). AFI: Top 10 Western. [online] Available at: https://www.afi.com/10top10/category.aspx?cat=3 [Accessed 24 Feb. 2019].

Bordwell, D., Staiger, J. and Thompson, K. (1985). The Classical Hollywood Cinema: Film Style and Mode of Production to 1960. Routledge.

Copland, A. (1942). ‘Hoedown’, in Rodeo [musical composition].

El Dorado. (1966). [film] Directed by H. Hawks. Hollywood: Paramount.

McCabe & Mrs Miller. (1971). [film] Directed by R. Altman. Hollywood: Warner Bros.

Phillips, J. (2008). Cinematic Thinking: Philosophical Approaches to the New Cinema. Stanford, CA: Stanford U. P., 52-68.

Tobias, S. (2014). McCabe & Mrs. Miller: profound pessimism and Leonard Cohen kindness. [online] The Dissolve. Available at: http://thedissolve.com/features/movie-of-the-week/772-mccabe-mrs-miller-profound-pessimism-and-leonard-c/ [Accessed 24 Feb. 2019].

Free Labour in YouTube’s Capitalist Society

Digital media has had an undoubtedly huge effect on the world of work, in many positive but also many problematic ways. One of the many branches that digital media has unlocked is a whole new sphere of occupations; not only are there new ways to work, there are entirely new – and entirely digital – places to work. In 2007, YouTube launched its ‘Partner Program’ (Official YouTube Blog, 2007) after growing astronomically in 2006, creating advertising spaces on and around particular users’ generated content and offering those users a cut of the profits made from this. To users at the time, the partner program represented greener pastures on the horizon, where they could now be paid to create content that they were otherwise gladly making purely for their own personal interest, for free. This essay, however, will explore how digital media has bolstered the concept of ‘free labour’ and changed the way people work. Concerning user-generated content production, particularly on YouTube, this essay will discuss the impact of the YouTube Partner Program, its creator-centric facade and its sinister exploitative underbelly, which solidifies and embodies the notion of immaterial labour.


An Introduction to Free Labour in the Digital Age

Free labor is the moment where this knowledgeable consumption of culture is translated into productive activities that are pleasurably embraced and at the same time often shamelessly exploited. (Terranova, 2000)

Terranova wholly and eloquently summarises the dangers within the concept of free labour. Free labour, or immaterial labour, is a term for labour that produces no ‘material’ goods for the labourer (i.e. it produces cultural or personal rewards as opposed to currency or ‘material goods’) (Hardt and Negri, 2001). The danger, as Terranova explains, is the practice of commodifying – or transforming into trade value – tasks and activities that are not generally understood as traditional ‘labour’ because they yield immaterial results. In the early internet days, as Terranova goes on to explain when summarising the case of the AOL Community Leader Programme, corporations like AOL manipulated people into free labour by attracting users of their sites to moderate chat rooms and community boards purely for fun, cultural and social reasons. This has since become widely understood as ‘real’ labour that should certainly be rewarded like any other professional occupation, resulting in legal action being taken against the offending corporations (Priceonomics, 2014). Since the AOL controversy, users have become wiser to outward exploitation of this calibre, yet online corporations have arguably only become more exploitative and manipulative, disguising ever more deeply what is actually labour within the services of the sites they host.

In 2012, Facebook announced that the average quarterly revenue it made per individual user, across its then 955 million monthly active users, was $1.21 (Facebook, 2012); with companies like Facebook making extortionate amounts of money from the content users generate, immaterial labour is clearly easy to lace into online services. Of course, many users extract immaterial value – whether social or emotional – from generating content for sites like Facebook, due to the networking service they provide, so whether Facebook is forcing free labour out of its users is incredibly debatable. The point still stands that the users’ role in producing a stream of profit exclusively for the corporation sitting beneath the facade of the site or service is often overlooked by the very users in question, and the ethics of this lack of transparency are certainly something that should be discussed. Nicholas Carr puts this bluntly when comparing this type of business to sharecropping in agriculture – where a landowner will allow a tenant to use and work the land for a cut of the produce – explaining that:

One of the fundamental economic characteristics of Web 2.0 is the distribution of production into the hands of the many and the concentration of the economic rewards into the hands of the few. (Carr, 2017)

The premise of this business model is that the ‘many’ performing the labour make insignificant amounts of money through advertising revenue, but when every cent of every user’s revenue is totalled into one bucket, the ‘few’ snatch a lot of money exclusively for themselves for doing essentially nothing. Beyond the obvious parallels with sharecropping, the print industry and the like, a business model on this sheer scale is undeniably only achievable in the digital age, thanks to the advent of the internet and digital media, and is certainly a potential downside of Web 2.0.
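
To make the scale of this asymmetry concrete, a rough back-of-the-envelope calculation helps; the sketch below uses the Facebook figures quoted above, and nothing more, to show how negligible per-user value aggregates into enormous platform revenue:

```python
# Rough illustration of the 'many'/'few' asymmetry described above.
# The figures are the 2012 Facebook numbers cited in this essay; the point
# is simply that tiny per-user revenue totals to billions for the platform.

monthly_active_users = 955_000_000   # Facebook MAUs, 2012 (cited above)
revenue_per_user_quarter = 1.21      # average quarterly revenue per user, USD

platform_quarterly_revenue = monthly_active_users * revenue_per_user_quarter
print(f"Platform quarterly revenue: ${platform_quarterly_revenue:,.0f}")
# ≈ $1,155,550,000 – over a billion dollars per quarter flowing to the 'few'

# A single one of the 'many', by contrast, accounts for pennies:
print(f"A single user's quarterly contribution: ${revenue_per_user_quarter:.2f}")
```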

In the argument over what constitutes actual labour, reward is at the epicentre of the debate. In the traditional sense, the word ‘labour’ implies work that one is employed to do for a reward, usually payment in the form of money. Alongside this, volunteer work is understood as work done for free, with the ‘reward’ for the labour being something else of value to the person performing it. In the case of a volunteer shop assistant, for better or for worse, the personal value the volunteer may be rewarded with could be experience working in a shop, to further their career. In more specific industries, such as film or journalism, people will often volunteer their labour for the sake of exposure, to make a name for themselves. It is acknowledged by most that this type of free labour for ‘exposure’ or ‘experience’ is manipulation – to the point where it has become a niche internet meme of its own (Estrada, n.d.) – yet this type of work continues despite diminishing returns for the labourer, because the independent content generators being manipulated believe that they are receiving value for their labour.

This argument is not that experience or exposure are invalid or worthless reasons to work. It is that if certain corporations can offer these rewards alongside a monetary incentive, all – or at least most – corporations should. Digital media, for many reasons, has only made this form of work more common. Many users of the internet misunderstand, or are at least ignorant of, the laws that apply to their actions – whether copyright laws covering photographs, music or art, or labour laws that ensure employee rights are upheld – due to the free and open nature of cyberspace, meaning that work is unwittingly stolen, leaving the labourer unpaid. On top of this, the internet is a free-for-all for business startups and small companies that may be financially lacking, who recruit workers with the promise of reward later in the company’s life. These are just a few possible explanations for why digital media has extended this older work model and enforces the notion of free labour.

It is not just the rewards of labour that have been changed by developments in digital media; types of labour have evolved too. New ways to work, like remote working, are becoming feasible, and entirely new occupations have been birthed from all this change. With most, if not all, jobs requiring a comprehensive understanding of modern technology and social networks, it is not a stretch to say that the world of work has become technology-centric, with digital media infecting even the most isolated of jobs. With new jobs being born out of necessity, it is difficult to maintain a single definition of the word ‘labour’ broad enough to include modern and digital labour; the line between what is and is not ‘labour’ in the digital age is blurred. A perfect example of this is the discussion of review sites, whether establishment reviews, such as TripAdvisor, or film review sites like IMDb or Letterboxd. The main premise of these sites is that users review places they’ve been or films they’ve seen so that fellow users can gain an insight into a place or film without visiting or seeing it. These sites are so effective because they were born of the necessity for such a platform, and their shared, simple premise is very attractive to users. This becomes problematic when a user considers that they are the fundamental driving force of the platform; without the content produced by the user base, the sites would cease to function entirely.

The sites that host this content make money from advertising revenue, brand deals and similar streams, while the producers of the content – the users – see nothing of this money. The debate that stems from this is once again a question of value. As previously mentioned, reward for labour does not necessarily equate to financial reward; writers of the content must receive some form of personal reward that is valuable to them, else the user base of the sites would drop significantly. If a writer eats at a terrible restaurant or watches a terrible film, they may feel rewarded through warning other readers. In this case the readers, too, are rewarded for their use of the service, as they save time and money by avoiding poorly reviewed places. Peter O’Connor eloquently summarises why these platforms have such great success and why they are so ‘rewarding’ to their users: ‘By making it easier for consumers to disseminate their viewpoints, and facilitating access to such opinions, the Internet is having a profound effect on how consumers shop.’ (O’Connor, 2008). The owners of the sites acknowledge this as the reward for users’ labour, with monthly emails praising writers for how much they’ve helped readers and incentivising the production of further content with meaningless milestones – for example, TripAdvisor’s ‘You can collect points to reach new levels and unlock badges along the way!’ (TripAdvisor, 2019). While there is undeniably personal value for the users, the revenue that the site owners receive is largely generated from the content produced, and whether this revenue is of greater value than the value the users gain from the service is difficult to discern. Problems like this are certainly not exclusive to jobs created in the wake of the rise of digital media, though digital media has been integral to the creation of these platforms and has undeniably allowed this arguably exploitative form of labour to grow further.

YouTube’s Capitalist Society

Despite all the hype surrounding the innovative uses of the internet as a public medium, it is still a medium constructed in a capitalist era… it is susceptible to the same forces that… defined the nature of radio and television, media once hailed for providing innovative ways of communication… Nowadays, both media have transformed and produce commercial, formulaic programming for the most part. Advertising revenue has more impact on programming than democratic ideals. (Papacharissi, 2002)

Papacharissi’s fears about the internet are readily apparent in YouTube, whose rise echoes the rise of television and radio with great similarity. Upon launch in 2005, YouTube was a bare-bones video hosting site that had as much potential for freedom as the internet itself; users could share whatever video content they wanted with whoever they wanted, for ‘free’. The platform quickly developed into more than just a pool of random videos as creators started uploading to their own schedules, driving traffic not just to a single video but to the creator’s entire channel of content and content to come. Just as in film, television and radio, specific creators were cast into stardom and built up huge fanbases who would log in every upload day to catch the latest video as it went out – ‘YouTuber’ was quickly becoming an acknowledged profession. The ‘subscription’ service allowed creators to see how many fans they had, and provided easy access to their content for subscribers. This perfect blend of features propelled YouTube into being one of the fastest growing websites of its time (O’malley, 2006). Google was quick to see the capital potential of the platform, purchasing it in the early days of its extreme growth (BBC News, 2006). It was with Google’s purchase that YouTube’s focus on free and open sharing of video content began to take larger leaps towards a focus on generating revenue – understandably, as YouTube had been losing money since its launch. YouTube began ‘partnering’ with channels that had content ‘attractive for advertisers’ (Official YouTube Blog, 2007), offering them a fifty-five percent cut of revenue for every advertisement placed on or around their content, with YouTube/Google taking the remaining forty-five percent (Peterson, 2013). Since Google’s purchase, YouTube has only come to echo what Papacharissi describes of television, radio and the internet more and more, with the free and open expanse of communication on YouTube subjected to dozens of limiting policy changes that prioritise revenue over the very user base that allows that revenue stream to be viable in the first place.

YouTube has quickly evolved from snapping quick videos and posting them for the world to see into scheduled, professional production of videos as a full-time occupation. YouTube of course still offers free video sharing for all, but the community and the corporate underbelly of the platform have undoubtedly shifted focus onto the professional, money-making users. It is hard to fault YouTube for what it has become, offering thousands of users genuine financial reward, however small a cut, for doing what they love. Unfortunately, however, these top creators are just the tip of the iceberg of the user base, and what lies beneath the surface is what makes YouTube so problematic. YouTube has become an archetype of vicious capitalism. John Bellers’ famous statement in Proposals about the poor making the rich is impressively applicable, three hundred and seventeen years later, to the state of YouTube, as I will go on to discuss.

As a good and plentiful living must be the poor’s encouragement; so their increase, the advantage of the rich; Without them, they cannot be rich; for if one had a hundred thousand acres of land and as many pounds in money, and as many cattle, without a labourer, what would the rich man be, but a labourer? … the labour of the poor being the mines of the rich. (Bellers, 1696)

By 2013, YouTube’s Partner Programme was at its broadest. YouTube themselves were still accepting partnership requests that met their then unspecified and unspoken requirements, taking forty-five percent of the earnings, but had additionally handed power over partnership to Multi-Channel Networks (MCNs). These networks were businesses separate from YouTube and Google that offered YouTube partnerships and other services to help channels grow, in exchange for a further cut of the creator’s revenue (Google Support, n.d.). MCNs had much lower requirements for acceptance, meaning a lot of small creators jumped into long, unbreakable contracts stripping them of upwards of sixty percent of their total revenue just to reap the benefits of the partner programme. Harking back to Bellers’ ‘poor making the rich’ argument and Carr’s sharecropping analogy, YouTube itself and the Multi-Channel Networks represent the landowners, and users become labourers of the land, who keep the platform afloat for themselves and each other. In addition, for a long time, well-established media properties owned by huge corporations were granted a larger cut of revenue than YouTube’s independent creators, in order to make the platform more advertiser-friendly and generate more revenue for the corporations and Google. While this was allegedly equalised in 2013 (Peterson, 2013), its effect is visible even six years later, with roughly half of the video spots on the ‘Trending’ feed dominated by well-established corporate entities or properties, while independent content with similar view counts – content that better fits the definition of ‘trending’ – is not even included (YouTube, 2019). As evidenced by all the effects above, one could argue that YouTube has almost become a capitalist society of its own.
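
To illustrate how these stacked cuts erode a small creator’s earnings, the sketch below works through a hypothetical $100 of gross ad revenue; the 55/45 split is the figure cited above, while the MCN percentage is purely an assumed example, since contracts varied between networks:

```python
# Hypothetical breakdown of $100 of gross ad revenue earned by a small,
# MCN-partnered channel. The 45% platform share is the figure cited above;
# the 30% MCN cut is an assumption for illustration - real contracts varied.

gross_ad_revenue = 100.00
youtube_share = gross_ad_revenue * 0.45            # YouTube/Google's cut
creator_share = gross_ad_revenue - youtube_share   # 55% passed to the partner

mcn_cut = creator_share * 0.30                     # assumed MCN percentage
creator_takes_home = creator_share - mcn_cut

total_taken = youtube_share + mcn_cut
print(f"Creator keeps ${creator_takes_home:.2f} of ${gross_ad_revenue:.2f}")
print(f"Platform and network take {total_taken / gross_ad_revenue:.0%}")
# Under these assumed figures the creator keeps $38.50 - i.e. over 60% of the
# revenue their content generated goes to YouTube and the MCN combined.
```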


YouTube, Ad Revenue and Free Labour

It is almost an understatement to say that YouTube has been a driving force in the growth of digital media, fundamentally changing how media is consumed in the twenty-first century – an effect evident in other streaming services, like Netflix (Recode, 2017). YouTube’s effect on the digital workspace is equally dramatic. It has generated professions not just for independent personalities who wholly create their own content and post it on the service, but also for teams of content creators, just as in the television and film industries. For example, certain channels may uphold an individual entertainer as the face of the operation, with video editors, marketers, producers and the like all orchestrating the content underneath. Additionally, YouTube hires employees of its own to keep the platform afloat, as do MCNs. YouTube’s sheer scale has opened up new opportunities for work, and entirely new jobs have been created in the wake of its growth. On the other hand, the aforementioned ‘YouTube society’ that has emerged from the platform undeniably promotes exploitative behaviour, to the extent that it is no exaggeration to say that no mainstream online media platform epitomises the aforementioned dangers of free labour quite like YouTube.

The platform encourages a form of meritocracy, in the sense that those who dominate YouTube with the largest subscriber counts and paychecks are ‘selected’ based on the quality of their content; the biggest YouTubers on the platform have earned their rank on the ‘Most Subscribed’ ladder by attracting the most viewers and the most shares from viewers – PewDiePie would not have become the most subscribed YouTuber in 2013 (Cohen, 2013) if viewers did not enjoy and share his content enough to elevate him to that level. While in theory this is a neutral system on the part of the service provider, as the users decide who deserves the top spot, corporate interference has undoubtedly swayed it into biased territory. As corporations flooded YouTube to reap the benefits, their content became the most advertiser-friendly on the platform and was elevated in search results and homepage spots almost automatically. Additionally, users are more likely to stick to the content creators and entertainers they enjoy the most and rarely expand their horizons to smaller channels.

In 2018, YouTube shook the core of its platform by enforcing a harsh new requirement for YouTube Partnership, including partnership through an MCN. It stated that ‘starting today we’re changing the eligibility requirement for monetization to 4,000 hours of watchtime within the past 12 months and 1,000 subscribers.’ (YouTube Creator Blog, 2018). It meant that until a creator reached one thousand subscribers and four thousand hours of watchtime per year, a creator could not be partnered by any means, and thus not be paid for their content.  It is here that the problematic side of YouTube’s influence on work promotes free labour and by exploiting its own system. To reach the requirements of the partner programme, a YouTuber needs to grow their audience, theoretically, through engaging content. YouTube’s system does not necessarily promote engaging content, rather ‘fresh’ content that will make the most ad revenue and garner the most watchtime, usually promoting content with statistics that imply video quality – i.e high like-to-dislike ratios, high view counts and from users with high subscriber counts – by displaying them on the homepage or trending tab (Covington, Adams and Sargin, n.d.). This means that in creating engaging content, a creator will perform free labour and often investing their own finances into a video, in the hopes that they will rise the search ranks and gain views. YouTube’s algorithm will analyse the aforementioned statistics of the video, and in the case of a new channel, will see that the view count is very low, most likely discounting the like-to-dislike ratio from the calculation, as due to low sample size it is not necessarily a fair gage on how engaging the content is. As a result of the other statistics, the video will most likely not be promoted to users and the channel will not gain subscribers, as it is far less advertiser friendly than its competition. New channels are not just at a disadvantage, but the algorithm is actively working against these channels becoming popular. On the other side, established companies and properties are not only likely to gain subscribers and views through cross promotion on other popular media (for example, a chat show could promote their YouTube Channel on television and gain subscribers, theoretically without performing any labour in creating content for the platform), they are actively promoted by YouTube due to their appeal to advertisers. The established large scale YouTubers (in Bellers’ words ‘the rich’) dominate the platform, and as more new channels are created (‘the poor’), ‘the labour of the poor’ becomes ‘the mines of the rich’; The platform promotes free labour of small channels to generate content and culture for the platform to gain subscribers to be paid for their work, but does not reward this and in fact works against them in order to make it easier and quicker for the bigger and more established channels to take more money for themselves and for YouTube. An example of this in action is the rise of the YouTube channel Jablinski Games, created by Hollywood star Jack Black (Black, 2019), which within one week of creation and with merely twenty nine total seconds of content, gained one million subscribers (Socialblade, 2019). 
Without taking into account the quality of the content, it is clear that such huge channel growth in such a short amount of time, from a single video, is not a natural occurrence in the history of YouTube; the reason it was possible for Jablinski Games to explode like this is that Jack Black is a well-established star. His videos were almost immediately promoted on YouTube’s trending tab due to his celebrity status, and he was able to cross-promote his channel on Facebook to his millions of followers.
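
To make the feedback loop described above concrete, the sketch below scores videos on exactly the kinds of statistics the algorithm is said to weigh – views, watchtime, subscriber count and like ratio. It is a purely illustrative toy under my own assumptions, not YouTube’s actual system (which is a machine-learned recommendation model, per Covington, Adams and Sargin); every weight, threshold and field name here is invented for the example.

```python
# Purely illustrative toy -- NOT YouTube's real algorithm. All weights, field
# names and thresholds are assumptions made up for this sketch.
from dataclasses import dataclass

@dataclass
class Video:
    views: int
    likes: int
    dislikes: int
    watchtime_hours: float
    channel_subscribers: int

def promotion_score(v: Video, min_ratings_for_ratio: int = 100) -> float:
    """Score a video for hypothetical homepage/trending promotion."""
    ratings = v.likes + v.dislikes
    # The like ratio is discounted when the sample is too small to be a fair gauge.
    like_ratio = (v.likes / ratings) if ratings >= min_ratings_for_ratio else 0.5
    # Raw popularity signals dominate, so a new channel starts near zero.
    return (0.4 * v.views
            + 0.3 * v.watchtime_hours
            + 0.2 * v.channel_subscribers
            + 0.1 * like_ratio * v.views)

# A brand-new channel's video scores far below an established channel's,
# regardless of how engaging the content actually is.
new_channel = Video(views=200, likes=40, dislikes=1, watchtime_hours=15, channel_subscribers=50)
big_channel = Video(views=2_000_000, likes=90_000, dislikes=4_000,
                    watchtime_hours=400_000, channel_subscribers=5_000_000)
print(promotion_score(new_channel), promotion_score(big_channel))
```

Under any weighting of this kind, the already-popular channel wins by orders of magnitude before anyone judges the content itself – which is the self-reinforcing dynamic described above.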

As outlined earlier in this essay, it is important to note that this type of free labour is not exclusive to YouTube; it has been present for years in other industries and is comparable to working ‘for exposure’. That said, it is not usually so openly acknowledged or, as in the case of YouTube, promoted to such an extent. Usually, free labour like working for exposure is kept under wraps due to its exploitative nature, but YouTube seems to embrace the fact that it relies on the free labour of its small users to keep the larger users afloat, and with these policy changes it seems to be sealing this in as a fundamental part of its service. Just as with Facebook, TripAdvisor and the sharecropping of agriculture, labourers are forced to work fields they do not own, making little to no reward for themselves while the corporation heads or landowners milk the profit. Digital media did not invent sharecropping, but it has catalysed a whole new industry of it. While positive effects have come from digital media’s influence on the world of work, it is clear that without a platform like YouTube at the forefront of digital media, exploitation on this grand scale would not be possible and certainly would not be encouraged.

One of YouTube’s core values is to provide anyone the opportunity to earn money from a thriving channel, and while our policies will evolve over time, our commitment to that value remains… (YouTube Creator Blog, 2018)

… If you’re Jack Black.

Thanks to Alfie Bown for his feedback and input into the discussion.

 


Monster Marketing, 65 Million Dollars in the Making

Before its release, nothing had roared quite as loudly as Jurassic Park (Spielberg, 1993). The film broke incredible boundaries and delivered something an audience had yet to see so convincingly on-screen, with impressive spectacle that holds up even to modern standards. Perhaps just as impressive as the film was the enormous sum of money it made at the box office – over one billion dollars in all-time worldwide gross. While significant money and thought went into the production of the film, significant money and thought also went into its marketing. Peeling away the dinosaurs and the spectacle that emerges from them reveals a Hollywood blockbuster machine chugging away, conjuring up money and fuelling the industrialisation of film. This essay will explore the argument that Jurassic Park is but a feature-length advertisement for branded merchandise and analyse the extraordinary methods within the film’s marketing process; expanding upon this, it will explore the idea that Jurassic Park and other such blockbusters are purely acquisitive, and whether any art shines through in the finished product.

From the early 1960s, newcomer filmmakers and seasoned European veterans burst the seams of conventional Hollywood, creating fresh and incredible pieces of art for film lovers to enjoy and becoming Hollywood’s own ‘auteurs’. Dubbed the Hollywood Renaissance or the ‘American New Wave’, film, at least in Hollywood, had taken a turn for the better. 1975, however, to some marks the unfortunate decline of the Renaissance. The year holds significance due to the release of a pair of directly comparable films. November marked the release of Miloš Forman’s One Flew Over The Cuckoo’s Nest, regarded as one of the greatest films of all time according to the AFI (Afi.com). The film follows an in-depth narrative circling an interesting character of a kind Hollywood had rarely explored before, and was deemed “culturally, historically, or aesthetically important” when it was archived in 1993 in the U.S. National Film Registry for preservation (Cs.cmu.edu). Despite being so highly acclaimed, One Flew Over The Cuckoo’s Nest’s one hundred and eight million dollar total domestic gross was trumped by Spielberg’s Jaws, released earlier that year, which took two hundred and sixty million dollars domestically – and essentially doubled that number thanks to foreign markets (Boxofficemojo.com). This direct comparison is important to consider, as it perfectly conveys the discrepancy between ‘quality’ and profit in the movie industry, highlighting the importance of spectacle and marketing, and the power of the release date. From Jaws, Hollywood learned a new way to make films and money, and the importance of the spectacle re-emerged from its grave in film history. On top of this, Hollywood had now learned a new time to release films; executives had worried that, due to nice weather, no one would want to visit the cinema during the summer. It wasn’t until the June 20th release of Jaws that Hollywood grasped the potential of summer releases, when children and adults alike had free time to spend at the cinema. In his book Blockbuster (2004), Shone demonstrates this point eloquently by compiling quotes from Jaws itself:

If you want a trenchant analysis of Jawsmania… our best bet has always been to check out Jaws itself. It’s all there, up on the screen – the hysteria bleeding into the hoopla, the hoopla into hype…“We need summer dollars,” pleads the mayor… “We depend on the summer crowds for our very lives. You yell ‘shark’ and we got a panic on our hands on the 4th of July.” Which is when Dreyfuss delivers his great speech. “What we are dealing with here is a perfect engine, an eating machine. It’s really a miracle of evolution. All this machine does is swim, eat and make little sharks.” For those who care to see it, there was an allegory there for what was about to happen in Hollywood…

At this point, things began to teeter as the industry was slapped with the realisation of the money-making opportunities of more Jawses. Spielberg bolstered this notion with Close Encounters of the Third Kind (1977), proving once again that the scale of the film – the spectacle – makes the money. The phenomenal impact of Star Wars (Lucas, 1977) on this outlook goes without saying. Flipping back to the side of the New Hollywood ‘auteurs’ came the release of Heaven’s Gate (1980). While still widely considered an impressive film by Michael Cimino, Heaven’s Gate is infamous for being an unfortunate flop – wasting over forty million dollars of its forty-four million dollar budget in such a prosperous time for the industry, with blockbusters tearing down box office records with their spectacle. This box office tragedy, many argue, heralded the end of Hollywood auteurism – truly beginning the age of the summer blockbuster. Then came one of the biggest summer blockbusters ever created: Jurassic Park.

Advertising and the culture industry merge technically as well as economically. In both cases the same thing can be seen in innumerable places, and the mechanical repetition of the same culture product has come to be the same as that of the propaganda slogan. In both cases the insistent demand for effectiveness makes technology into psycho-technology, into a procedure for manipulating men. In both cases the standards are the striking yet familiar, the easy yet catchy, the skillful yet simple; the object is to overpower the customer, who is conceived as absent-minded or resistant. (Adorno and Horkheimer, 1944)

The words ‘Welcome to Jurassic Park’, accompanied by an overpowering ensemble of brass and strings, mark one of the most spectacular scenes in the history of cinema. Never had an audience seen dinosaurs – or much other computer-generated imagery – so vividly integrated into our world. It is undeniably phenomenal to see such a rich display of spectacle, but an underlying argument from many critics is that Jurassic Park holds little more than just that: spectacle. Under scrutiny, seams begin to burst and reveal something arguably just as spectacular in its own right at work: an unsinkable marketing machine – a maker of ‘little sharks’. Jurassic Park was conceived from the outset to be a shared international event – a summer blockbuster phenomenon, to make as big a mark as Star Wars. The production budget of over sixty million dollars was not only matched but exceeded by a sixty-five million dollar marketing campaign (Broeske, 1993). As Marcy Magiera rightly states, ‘That phenomenon is no accident. Rather, it’s the culmination of a carefully crafted marketing and merchandising plan set in motion in late 1991 for the movie based on Michael Crichton’s best-selling novel.’ (1994). It is evident even from the simple initial teaser trailer that the film relied on secrecy to build hype (Universal Studios, 1992, a). Not a single dinosaur appears in the teaser, and mere glimpses of the creatures appear in the trailers that followed (Universal Studios, 1992, b); the strategy was to reserve the big spectacle exclusively for the movie theatre, thus inducing further hype. Additionally, Jurassic Park formed promotions with one hundred companies, generating one thousand different items of merchandise (Broeske, 1993). This is where, one could argue, the pinnacle of Jurassic Park’s marketing lies – in a sense of identity; the film was more than just a film, it had become a brand.

It takes only three colours (red, yellow and white), one minimal drawing of a fossil, and some blank space fit for a title translated into dozens of languages to create arguably one of the most powerful and recognisable logos ever made. Chip Kidd’s logo for the original novel (Crichton, 1991) was so effective in showing so much while showing almost nothing at all (conveying dinosaurs with nothing but the silhouette of a fossil) that the marketing team for the film used it to equal effect. With such a simple image, the film promised so much to the audience. The workings of this logo provided a perfect basis for the workings of the film’s marketing scheme, and it is almost the quintessential example of the ‘striking yet familiar, the easy yet catchy, the skillful yet simple’ that Adorno and Horkheimer wrote about forty-seven years earlier. The film’s marketing didn’t stop there, however – it went even further by playing with diegesis. The filmmakers placed the logo diegetically within the world of the film: the park shares the film’s logo. While on the surface this is a mere gimmick, the choice was a monumental and integral part of the strength of the film’s marketing. To immerse the audience in the relatively realistic world of Jurassic Park, the park itself needed merchandise, which the filmmakers put on display within the film. This tactic also epitomises ‘skillful yet simple’, ducking under the audience’s notice while standing glaringly in full view in every scene the merchandise appears in. Soon after leaving the cinema, viewers would learn that the exciting merchandise of the film’s world was, of course, also available in the real world. This is arguably the true genius in the marketing of the film, and simultaneously the reason the film itself is harshly criticised when put under scrutiny; with this much thought and care put into the financial success and promotion of the film, some would argue the film itself is one spectacular feature-length advertisement for the real star of the show – the merchandise – which the film’s genre and appeal lend themselves to so well, as Barry Langford explains:

The rise of SF and fantasy moreover offers an obvious showcase for spectacular state-of-the-art technologies of visual, sound and above all special-effects design, the key attractions that provide a summer release with crucial market leverage. The genre is well suited to the construction of simplified, action-oriented narratives with accordingly enhanced worldwide audience appeal, potential for the facile generation of profitable sequels (often, as with the two Jurassic Park sequels (U 1997, 1999), virtual reprises), and ready adaptability into profitable tributary media such as computer games and rides at studio-owned amusement parks. (Langford, 2010)

To say there was not a large-scale industrial capitalisation on this film would be a complete fabrication, as marketing was a huge focus from the very start of pre-production. The real discussion comes down to the chicken and the egg of the film: whether the marketing was its primary purpose, or whether it came separately, as an additional thought to take advantage of the potential of the tale. Cynically taking all the aforementioned points as evidence that the film is nothing, or close to nothing, but an inventive money-making exercise puppet-mastered by greedy studios is a fair interpretation, and one further evidenced by the release of the critically acclaimed Schindler’s List (Spielberg, 1993). Released the same year, the deep and gritty film was on many occasions implied to be the project Spielberg was more involved in, with post-production of Jurassic Park taking more of a back-burner role. Spielberg and George Lucas worked on the post-production of the film together, with Spielberg working from Poland while shooting Schindler (Rothman, 1993). It could be argued that Spielberg himself knew of the financial value of the film and needed to release it, while his real passion lay in telling the story of Schindler’s List. Tom Shone agrees with this notion when criticising the film: ‘It doesn’t feel like it has all his enthusiasm, all his energies. As miraculous as some of it still is, it feels a tiny bit like he’s directing it with his left hand.’ (Shone, 2015).

In defence of Jurassic Park, then, it is important to recognise the stunning work put into the film. The realism of the combined visual and practical effects contends with modern-day effects thanks to some incredible innovations in filmmaking. While the plot may fall short of the depth and impact of Schindler’s List or even One Flew Over the Cuckoo’s Nest, to compare the films is not a fair stance. It could be argued, then, that while being a summer blockbuster relying on the spectacularity of its visuals and its fantastic tale, the film is a piece of art in its own right; the spectacle of Jurassic Park goes further than ever before, with a careful reforging of the source material to appeal to a much wider audience: keeping the thrills of the novel whilst dropping most of the violence, and keeping the novel’s attention to scientific detail whilst juggling hard-to-hate, engaging plots and interesting themes in a comprehensible, easy-to-follow, classically cinematic way. One could argue that Jurassic Park was never a fool-proof, pre-planned marketing success, as without the film being an engaging, fun and enjoyable one, it wouldn’t have sold to the extent that it did. Even if cinephiles and critics do not hold Jurassic Park up as an art piece, the film was still a carefully orchestrated and choreographed piece of media – arguably an art in itself.

To argue that Jurassic Park could have been made exclusively by acquisitive executives lacking a basic knowledge of visual storytelling – that the film could have been made without the vision of Spielberg or the musical prowess of Williams – would simply be false. While Jurassic Park was a marketing goldmine and its success was down to careful planning, to say anyone could have expected the level of engagement the film generated worldwide without the art of the film itself is also a complete falsity. And even if the film were nothing more than a feature-length advertisement for branded merchandise, it should still be credited as a monumental shared international experience, one which captivated the minds, hearts and souls of billions of people across the globe; even if it was nothing more than an advert, it was clearly, at the very least, an incredible one.

Bibliography

Afi.com. AFI’s 100 Years…100 Movies. [online] Available at: http://www.afi.com/100Years/movies.aspx [Accessed 20 Feb. 2018].

Boxofficemojo.com. Jaws (1975) – Box Office Mojo. [online] Available at: http://www.boxofficemojo.com/movies/?id=jaws.htm [Accessed 21 Feb. 2018].

Boxofficemojo.com. Jurassic Park Movies at the Box Office – Box Office Mojo. [online] Available at: http://www.boxofficemojo.com/franchises/chart/?view=main&id=jurassicpark.htm&p=.htm [Accessed 20 Feb. 2018].

Boxofficemojo.com. One Flew Over the Cuckoo’s Nest (1975) – Box Office Mojo. [online] Available at: http://www.boxofficemojo.com/movies/?id=oneflewoverthecuckoosnest.htm [Accessed 21 Feb. 2018].

Broeske, P. (1993). Promoting ‘Jurassic Park’. Entertainment Weekly. [online] Available at: http://ew.com/article/1993/03/12/promoting-jurassic-park/ [Accessed 21 Feb. 2018].

Crichton, M. (1991). Jurassic Park. Orbit.

Cs.cmu.edu. U.S. National Film Registry — Titles. [online] Available at: http://www.cs.cmu.edu/afs/cs.cmu.edu/Web/Unofficial/Movies/NFR-Titles.html [Accessed 20 Feb. 2018].

Adorno, T. and Horkheimer, M. (1944). The Dialectic of Enlightenment. Stanford, Calif.: Stanford Univ. Press.

Kidd, C. (2012). Designing books is no laughing matter. OK, it is.. [online] Ted.com. Available at: https://www.ted.com/talks/chip_kidd_designing_books_is_no_laughing_matter_ok_it_is [Accessed 21 Feb. 2018].

Langford, B. (2010). Post-classical Hollywood. Edinburgh: Edinburgh University Press, pp.191-218.

Magiera, M. (1994). The “Jurassic Park” logo is a proud centerpiece for Kellogg and Sega…. [online] Adage.com. Available at: http://adage.com/article/news/jurassic-park-logo-a-proud-centerpiece-kellogg-sega-kenner-learned-insatiable-craving-dino-jurassic-park-s-amazing-story-graphic-promotional-marketer-year-spielberg-mca-put-dream-team-herd-jurassic-licensees-happier/88013/ [Accessed 21 Feb. 2018].

Rothman, M. (1993). ILM beams F/X to Spielberg in Poland. Variety. [online] Available at: http://variety.com/1993/film/news/ilm-beams-f-x-to-spielberg-in-poland-105405/ [Accessed 21 Feb. 2018].

Shone, T. (2004). Blockbuster: How Hollywood Learned to Stop Worrying and Love the Summer. New York: Free Press.

Shone, T. (2015). Interviewed in How ‘Jurassic Park’ Changed the DNA of Blockbusters. Vice. [online] Available at: https://www.vice.com/en_uk/article/exqxxp/how-jurassic-park-changed-the-dna-of-blockbusters-456 [Accessed 21 Feb. 2018].

Thompson, K. and Bordwell, D. (2002). Film History. 3rd ed. New York (etc.): McGraw Hill, pp.487-488.

Universal Studios (1992):

  1. Jurassic Park (Initial Teaser). Available at: https://youtu.be/-QMue9j_RKg [Accessed 21 Feb. 2018].

  2. Jurassic Park (Trailer). Available at: https://youtu.be/Bim7RtKXv90 [Accessed 21 Feb. 2018].

Filmography

Heaven’s Gate. (1980). Directed by M. Cimino. Hollywood: United Artists.

Jaws. (1975). Directed by S. Spielberg. Hollywood: Universal Pictures.

Jurassic Park. (1993). Directed by S. Spielberg. Hollywood: Universal Pictures.

One Flew Over The Cuckoo’s Nest. (1975). Directed by M. Forman. Hollywood: United Artists.

Schindler’s List. (1993). Directed by S. Spielberg. Hollywood: Universal Pictures.

Star Wars. (1977). Directed by G. Lucas. Hollywood: 20th Century Fox.

Genre Mutation and The Film Noir

Collins English Dictionary defines genre as a ‘kind, category… of literary or artistic work’, but the modern-day concept of genre is vaster than ever. A genre was once defined by specific characteristics which set it apart from the rest – the lines were clear cut, black and white. As soon as the concept of genre began drawing its lines in the sand, filmmakers instinctively began to see the potential in crossing them. From this, filmmakers could invent new genres and revise old genres in ways unseen by the public. Naturally, this raises the question of the value of genre in contemporary filmmaking – whether a genre actually holds any worth in defining a film by today’s standards. This essay will explore the evolution and mutation of genre over time, specifically using the example of Film Noir. It will use this exploration as a criticism of genre theory, debating the idea that the contextual understanding of a genre is what gives meaning to a genre film, such as a gangster film, a Noir or a Western. On top of this, the essay will discuss the value of genre within the Hollywood industry.

‘Genre’ as a word is widely considered simply to mean the conventions of plot and iconography drawn out by the films that came before – the characteristics of a film, used to categorise it with similar ‘types’ of film. As an example, in the case of the genre this essay will scrutinise, Film Noir, there is a prominent set of characteristics which define a film as a Noir. In terms of plot, a Noir will always focus on an investigative character who is usually separate from the law, in order to maintain honesty and hold a darker edge to the character. This character is traditionally a man, who is seduced and toyed with by a femme fatale – a deadly woman – who eventually causes the demise, or at least the partial demise, of the detective or his values – corrupting the incorruptible. Most characteristics of Film Noir come from its radical visual style, causing much debate on whether Film Noir is actually a genre at all, or just an aesthetic. Visual conventions include low-key lighting with an emphasis on the light and the dark, as opposed to the grey in-betweens – the effect explored by Renaissance artists and dubbed ‘chiaroscuro’. Unnatural compositions and mise en scène made to jar and confuse the viewer (Place and Peterson, 1974), and non-sequential, confusing plotlines also come hand in hand with Film Noir. The iconography of Film Noir is among the most recognisable – silhouettes, fedoras, glamorous seductresses, cigarettes, and the neon signs glaring through the darkness of the mean streets that lie ahead of the detective. Nowadays it doesn’t take a Noir detective to deduce a film’s genre – genre conventions are deeply rooted, at the very least, in the subconscious of any moviegoer, though less deeply than semiotics. The reason an audience will feel, for example, that there is an underpinning darkness to an image is not necessarily due to the genre itself, but more the idea that, say, a silhouetted man against a window is threatening by nature because of the mystery of his identity. This is where the simplified definition of ‘genre’ begins to blur, as many critics have begun to point out the formulaic nature of genre films. To some, genres are more than a method for an audience to identify a film; they are a formula for a storyteller to follow in writing, a way to design the film based on this conventional context. To deny the mutualistic bond between producer and customer would be completely false. Film, just as every other medium has, found the most effective method of delivering to its target market; the categorisation of films allowed audience members who enjoy a specific ‘type’ of film to easily find and separate that type. The producer tailors the film to that specific audience, and that specific audience, in return, gains a quick and easy way to find where they will gain the most entertainment – they sort themselves into their target groups. Tom Ryall eloquently summarises this equation of audience and filmmaker together:

The ‘rules’ of a genre – the body of conventions – specify the ways in which the individual work is to be read and understood, forming the implicit context in which that work acquires significance and meaning. Genres were seen in social terms as institutions implying a bond, or contract, between producers… and audience relating to the significance and meaning of what was on the screen. (1998:328)

This summary of many critics’ opinions rather cynically points towards the idea that genre films, without the context of the genre, would somehow be less substantial or harder to grasp for the audience, suggesting that the genre film potentially lacks a level of originality due to its reliance on previously laid out conventions. This interpretation, while holding some value when applied to semiotics, has in more recent years begun to degrade due to the fluid, ever-changing nature of the modern genre. The consideration of the evolution and mutation of genres is key to understanding what genre actually is and how it affects both the creation and reception of film, separately from semiotics.

One of the most quintessential examples of the evolution of genre, or genre revisionism, is the direct comparison between the classic Marlowe Noir Murder, My Sweet (Dmytryk, 1944) and the post-modern reworking of the Film Noir genre featuring the same character: The Long Goodbye (Altman, 1973). In the 1944 Marlowe adaptation, we see all the tell-tale Noir conventions being met. The film follows a non-sequential narrative, just as other Noirs do, as evidenced by the opening sequence triggering a flashback. The following scene, set in Marlowe’s office, is particularly useful for direct comparison to The Long Goodbye. Immediately we see the iconic chiaroscuro effect, with a harsh edge light on the side of Marlowe’s face as he sits smoking his cigarette, while the rest of the shot is plunged into near-complete darkness. The iconic voice-over narration, combined with the cluttered composition of elements in the frame, is also textbook. The plot begins to be introduced, with a dark storyline waiting to unfold, thanks to the semiotics of the previous scene. Nearly every aforementioned convention of the Film Noir genre is checked off the list just moments into the film. Supporting the interpretation set out by critics, an audience will, thanks to these conventions, now understand that this is of the same calibre as other Film Noirs, or at the very least make this subconscious link.

Jumping ahead to The Long Goodbye, we see an opening scene that is directly comparable. Set in Marlowe’s apartment this time, he awakes in a scene filled with focused light and darkness, as per cinematographic convention. He then leaves his bedroom and enters the darkness of the hallway, plunged into the shadows. It is not until the following shot, of his cat running across the floor, that the audience is broken out of the traditions of Film Noir. While comedy is not necessarily a complete break from Noir – Marlowe has been something of a wisecracking private eye since his conception – to toy with the focus of the plot and sideswipe the audience into a somewhat ridiculous and seemingly irrelevant scene is newfound territory. The Long Goodbye certainly doesn’t break entirely from convention; it just radically churns the genre into something new. The iconography is fully present once again – with Marlowe striking another cigarette maybe every five minutes of screen time, the neon lights of the city surrounding his apartment, and the like. Meanwhile, composition and mise en scène are much tamer than in classic Noir, with fairly normal cinematography and production design for the time. Within the film, however, the ‘jarring’ composition remains, but in a new form. Instead of a convoluted image, the elements of sound within the film are now convoluted and jarring, with an unconventional method of recording sound – using lavalier mics to create dream-like, ethereal dialogue which, while coming out of the actors’ mouths in relative space, reaches the audience at a consistent, spatially nonsensical volume: the sound stays at the same level regardless of whether the character is two or ten feet away. This is the chaotic composition of images recycled into sound, keeping the need for in-depth audience perception but in a new way. This recycling also allowed another revision of the typical Noir convention of voice-over narration. Instead of a direct narration, the audience is presented with an external monologue, with Marlowe’s ramblings about cat food feeling like a narration of events, poking fun at these Noir clichés. Another radical twist from tradition comes in terms of plot, with Marlowe killing Lennox at the end of the film, which harms the honour of the detective that fans knew and loved. Robert Altman’s character description explains why this radical decision suits Marlowe, but breaks the boundaries of the Noir genre: ‘I see Marlowe the way Chandler saw him, a loser. But a real loser, not the false winner that Chandler made out of him. A loser all the way.’ (Spicer, 2010). This is uncommon in the Noir genre, as the credibility and integrity of the detective character is usually a primary factor in the story – most protagonists are lawfully good but separate from law enforcement, to steer them away from corruption – so The Long Goodbye’s ambiguous standpoint is a long step away from the norm.

Detective stories are often about a personal code; even when the character explicitly tells us he does not have “a code,” the role of the operator is to attempt to tilt the scales toward a form of justice he can live with, whether it is selfish, altruistic, cruel, or magnanimous. Marlowe’s last act in this version can be seen as justice or as selfish. (Pluck, 2013)

The question then arises: with such radical changes to genre, is The Long Goodbye even a Noir? The trick in this question provides suitable evidence of an ambiguity in genre; this is one of the main hurdles one must overcome in order to understand modern genre, and one of the main considerations that punctures the previously mentioned theory – the idea that one must understand the context of the genre in order to understand the film. One may argue that to fully appreciate all the rules the film toys with and revises, an audience must know these rules. At the same time, however, if the rules are being broken in the first place, then the rules needn’t be understood by the audience, as they are witnessing something new, no matter how derivative it is. This creation of something new is precisely what is meant by ‘genre mutation’, definable exactly as mutation is in biology: ‘a sudden departure from the parent type in one or more heritable characteristics’ (Dictionary.com). In terms of genre, this can be explained not only in the context of The Long Goodbye’s revision of its genre, but also in the context of multi-genre pieces. An archetypal example to draw upon is Blade Runner (Scott, 1982). Blade Runner welds together the (neo-)Noir with cyberpunk science fiction into something entirely new – dubbed ‘technoir’. While the whole film fits snugly within the conventions of both science fiction and neo-noir, an exemplary moment is the scene in which Deckard interrogates Rachael using the Voight-Kampff machine. The scene’s lighting recalls the chiaroscuro effect of the traditional Film Noir, with harsh edge lights and dark shadows causing strong contrast in the scene. The inclusion of colour is emphasised in a science fiction way, with strong saturation – vibrant skin tones, deep oranges and sharp, punchy blues. On the one hand, the inclusion of colour in such an expressive way obviously breaks from the classic Noir aesthetic – the iconic black and white look being a product of the film stock used – but one could argue that it stays within the realm of the Noir and adapts it to more modern technologies; the constant inclusion of neon lights and sharp contrast in Noirs, one could argue, would have been rendered with intense saturation of colour, had colour film been viable in the age of the genre. This is a key point when thinking about mutations in genre – adaptations like these, while drawing upon audience expectations, are still so radical and new that without the context of the genre (or in this case genres), the film, arguably, would still work.

To summarise, genre is an incredibly fluid entity that cannot be defined – or, to an extent, confined – by a black and white summary. The lines of genre blur more and more as time goes on. To understate the value of the genre system within the industry would simply be false, but so would claiming that the original theory of genre, with its lines in the sand, holds as prominently in modern film as it did in the era of the Noir and the Western. These ambiguities in genre are fatal flaws in the argument presented by critics such as Lawrence Alloway, as summarised by Tom Ryall. At base level, even if genre films and genre itself were as cynical as this, genre should not be a factor in the criticism of a film – the film in isolation, through what we perceive as the language of film and the conveyance of its art, should establish everything contained within its opening and closing titles (as the better films manage to), without needing the stabiliser wheels of genre conventions to help it out. On the other hand, an artist’s reliance on ingrained human semiotic response is fundamental to any art piece. To set aside film – specifically Hollywood genre films – as something of lesser value due to this reliance, clumsily labelling it a reliance on genre, fails to distinguish that semiotics and genre are separate entities.

 

Bibliography

Alloway, L. (1971). Violent America: The Movies, 1946-1964. New York: Museum of Modern Art.

Metz, C. and Taylor, M. (1974). Film Language: A Semiotics of the Cinema. New York: University of Chicago Press.

Place, J. and Peterson, L. (1974). Some Visual Motifs of Film Noir. Film Comment, pp.30-35.

Pluck, T. (2013). Reconsidering Robert Altman’s The Long Goodbye (1973). [Blog] Criminal Element. Available at: https://www.criminalelement.com/blogs/2013/11/reconsidering-robert-altman-the-long-goodbye-1973-neo-noir-elliott-gould-philip-marlowe-thomas-pluck [Accessed 21 Mar. 2018].

Ryall, T. (1998). Genre and Hollywood. In: J. Hill and P. Gibson, ed., The Oxford Guide to Film Studies. Oxford University Press, pp.327-337.

Spicer, A. (2010). Historical Dictionary of Film Noir. Lanham: The Scarecrow Press, p.108.

 

Filmography

Blade Runner. (1982). Directed by R. Scott. United States: Warner Bros.

Murder, My Sweet. (1944). Directed by E. Dmytryk. United States: RKO Pictures.

The Long Goodbye. (1973). Directed by R. Altman. United States: United Artists.

 

America’s Own Id: Forbidden Planet, Communism and Nuclear War

In the 1950s, the United States of America was wading through the bowels of the Cold War, with the threat of nuclear apocalypse constantly dangling in the minds of the public. On top of this, ingrained in their minds: the image of Bibles burning and their country in flames and complete ruin due to the ‘menace of communism’ (Is This Tomorrow, 1947). For a contemporary member of the public, anxieties were high. It is no coincidence that, in Hollywood, the Golden Age of science fiction dawned. Barry Langford (2009) states that ‘Fifties science fiction films… offered American cinema a means to explore, in particular, anxieties about the nuclear arms race that had been largely suppressed in official media.’ One could argue that the most quintessential example of this exploitation of the science fiction genre is Fred M. Wilcox’s Forbidden Planet (1956). This post will uncover Wilcox’s imprint on the film and the injection of the societal values and concerns of the post-war era.
Forbidden Planet, on the surface, is an impressive science fiction film which captivates the imagination, but the influence of societal issues is deeply buried within the countless metaphors the film conveys. The most vivid and apparent piece of imagery is the main antagonist of the film, the invisible monster. From the first communication with Altair IV and Morbius, within even the opening ten minutes of the film, the audience is informed of an unknown threat and a promise that no one is guaranteed to be safe. The monster, for the largest part of the movie, is nothing more than a whisper in the wind – a rumour, but an everlasting presence which stands tall over the characters, shrouded in mystery. It is without a doubt that the writers were affected by the ongoing fear of communism when forming ideas about this antagonist and how it should be represented in the film. The United States in particular, under the pressure of tension between itself and Russia, was – by the time the film was written – deeply woven into a tangle of anxiety (History.state.gov, n.d.). The government’s effort to contain the spread of communism rapidly grew into fear-mongering on a large scale. The production of Forbidden Planet fits neatly within the period of the Cold War referred to as the second Red Scare. During this time, Senator Joseph McCarthy began to make bold, yet baseless, claims that communism was infiltrating the U.S. Department of State (Storrs, 2014). He picked people blindly, pinning them as communists and causing massive uproar (dubbed ‘McCarthyism’). This fear spread much wider than just McCarthy – suspicions that ‘the reds’ had injected themselves into most industries and sections of American culture ran rampant; even teachers were under scrutiny (American Legion, 1951). This was an evil wave of unseeable enemies, hidden in plain sight – an invisible force. One could argue that these themes link very tightly to the representation of the Id Monster in Forbidden Planet; to some extent the monster represents the American perspective of communism. Every time the monster is addressed in the film, it is presented as a serious threat – and yet it is not only literally invisible, it is also merely referenced throughout most of the film, which is very similar to the way communism must have been perceived in America, thanks to government propaganda and McCarthyism. In the concluding portion of the film, there is only one instance of the invisible creature becoming visible – when the creature is caught in the force field device, which illuminates its features. Despite the force field interference lighting up blue – along with the ray gun rounds, turret shots, and so on – the creature is lit up red. Red is not only the primary colour of the communist party, it is also the nickname of its followers (the ‘Reds’). This takes the inference that the monster represents – or at least resembles – the American perspective of communism a step further, as the choice of the colour red seems somewhat groundless otherwise.
McCarthyism also infiltrated the film world, causing the attack on the Hollywood Ten – ten major film industry members who were denounced by the House Un-American Activities Committee. A second interpretation is that the Id Monster is an impressively socially aware, metapolitical representation of the second Red Scare itself (i.e. despite the filmmakers themselves being clouded by the Scare) and an antagonisation of McCarthyism. This is alluded to when the ‘innocent’ men shooting at the monster are picked up by the Id (McCarthy) and lit up red by its grasp. In this interpretation, the idea that people are picked and made red (communist) is a direct metaphor for McCarthyism. William Lorenzo agrees with this interpretation and infers that the monster is a direct criticism of Senator McCarthy:

This Id Monster is an image of McCarthy’s Red Scare, and it is only fitting that the monster itself is red. As this monster terrorizes Adams’ crew, it picks up and throws aside a few crew members. This moment in the film is crucial to the interpretation of Morbius and the Id Monster. When the Id Monster picks up the two crew members, these individuals actually turn red. They are both engulfed by the monster, which marks them as red and eventually destroys them. This is exactly what happened during the Red Scare. McCarthy’s own Id Monster marked certain Americans as communists and they were, in turn, blacklisted and “destroyed” (Lorenzo, 2016).

One may counter this argument with the statement that the choice was purely aesthetic, but the film is otherwise so thoroughly premeditated in terms of plot – for instance its orbit around Freudian psychology throughout – that it is not unreasonable to read this intent into the aesthetics of the film too. It is, however, without a doubt that the second Red Scare and McCarthyism affected the movies of the time, and Forbidden Planet is no exception.
In the 1950s, the threat of nuclear war was growing rapidly due to the ongoing Korean War, and anxieties were high. Even with the knowledge of possible mutually assured destruction, American generals, including General Douglas MacArthur (who once requested 34 atomic bombs be dropped on North Korea) – and even President Truman himself – did not rule out the use of atomic bombs against China, Russia and North Korea (Nti.org, 2005).

[On the use of the atomic bomb] There has always been active consideration of its use. I don’t want to see it used. It is a terrible weapon, and it should not be used on innocent men, women, and children who have nothing whatever to do with this military aggression (Truman, 1950).

With the nuclear attack on Hiroshima having happened a mere ten years previously, and with all this tension between countries, it is without question that Forbidden Planet, like many fifties science fiction films, was influenced by it. One primary theme within the film is the idea that we are all our own enemies. This is presented very literally in the concluding act, where we learn that the Id Monster is a formation of Morbius’ own subconscious – he is his own monster. This theme links very closely to nuclear war, perhaps particularly obviously in hindsight. The concept of Mutually Assured Destruction, while developing mostly in the sixties, still held in principle during the fifties. The MAD doctrine insisted that if one power were to use nuclear warheads against another, it would amount to suicide, as Josh Clark so eloquently explains:

Because the U.S. and the USSR both had enough nuclear missiles to clear each other from the map, neither side could strike first. A first strike guaranteed a retaliatory counterstrike from the other side. So launching an attack would be tantamount to suicide — the first striking nation could be certain that its people would be annihilated, too (Clark, 2008).

So, while this concept wasn’t explicitly applicable to the early fifties, the time of Forbidden Planet’s production, the destructive power of nuclear weapons – demonstrated by the bombings of Nagasaki and Hiroshima – was undoubtedly frightening for the American public and the filmmakers, particularly with the knowledge that the Soviets had by then created and detonated their first nuclear bomb (Atomicarchive.com, n.d.). One could argue that the Id Monster and its story are an exploration of this theme, with the monster representing the weapons themselves and Morbius representing trigger-happy American powers – for example General Douglas MacArthur – whose thoughtlessness could have cost vast numbers of innocent lives, not only in enemy territory but also at home, due to counter-strikes. The filmmakers play with not only Morbius’ lack of self-awareness of the issue, but also his denial. Drawing a parallel between this and, once again, General Douglas MacArthur: MacArthur’s treacherous strategies took no account of their consequences and rejected the idea that they would do more damage than good – a lack of self-awareness and denial. Forbidden Planet’s play on the idea that a human can be their own undoing, one could argue, links very closely to the fact that real people held a dangerous amount of control over this much power and could easily have caused World War III – perhaps not referencing anyone specific, but the concept itself. A bolstering factor to this interpretation is the monster itself. The monster is very quick to endanger – and kill – ‘innocent’ people (the soldiers from the spaceship). When Morbius uses the Krell machine to unlock new knowledge, which seems to him a good idea, he also lets out the monster within his subconscious – he is the direct cause of the soldiers’ deaths through the repercussions of using the machine. The idea that the filmmakers are reflecting the real state of the Korean and Cold Wars is highly probable.
Forbidden Planet reflects the threat of nuclear apocalypse that contemporary Americans anxiously anticipated, much like many other science fiction films of the time, but in a much more subtle way. The Day The Earth Stood Still (Wise, 1951) is a prime, in-your-face example of a criticism of our own dystopia-like world. The resonating, concluding message of the film, delivered by the character Klaatu, applies directly to Forbidden Planet also:

It is no concern of ours how you run your own planet. But if you threaten to extend your violence, this Earth of yours will be reduced to a burned-out cinder. Your choice is simple: Join us and live in peace, or pursue your present course and face obliteration. We shall be waiting for your answer; the decision rests with you.

Forbidden Planet, one could argue, presents the exact same message in a much more subtle and metaphorical way. Where The Day The Earth Stood Still’s plot revolves around this theme, Forbidden Planet buries it deep within its lore. The filmmakers present the threat of apocalypse as long forgotten and in the past, to give the audience a more third-person perspective on the dangerously teetering society of the Atomic Age. Continuing the aforementioned interpretation that the Id Monster represents nuclear weapons, one could argue that the human race is reflected in the remnants of the Krell race. While the audience is given no direct confirmation of anything about the Krell, from what they were to how they disappeared, the common inference is that they suffered a similar fate to Morbius: the Krell destroyed themselves with Id Monsters of their own. One could argue that this, just like other science fiction films of the time, reflects the anxieties of the public about nuclear annihilation of the whole human race. The Krell represent ourselves, and the implication is that our future follows the same dark path their species took. While The Day The Earth Stood Still poetically and optimistically ends on the note that there is still time to solve Earth’s problems, Forbidden Planet takes a different stance. One interpretation is that Forbidden Planet gives a much more pessimistic view – that despite the destruction of the Krell, Morbius, fuelled by pride and to an extent greed, follows suit and destroys himself too. This pessimistic approach serves the viewpoint that humans never change and that, as a species, we are doomed by our flaws. On the other hand, another interpretation is bittersweet: the final glimmer of hope is that the soldiers and Altaira all manage to escape, learning this lesson from Morbius and the Krell and potentially breaking the cycle – which could represent the audience doing the same after the film.
Overall, Forbidden Planet is certainly greatly affected by the social issues of its time, relating to communism – and all it entailed – and nuclear war. The filmmakers were clearly influenced by the threat and anxiety fifties American society faced, and this shines strongly through in the final film. The themes the film presents and the composition of its own world carry incredibly similar social issues to the real world of the time. My perspective is that the filmmakers were not only influenced by these issues subconsciously and coincidentally, but consciously chose to reflect these values and important messages from the society they inhabited, in order to convey their political opinions and viewpoints on the state of their country. I believe that the imagery used is incredibly significant and details, to a potentially unaware audience, the problems of the world around them. In a world of anxiety, threat and terror, where the viewpoints of most were clouded by politics and danger, I believe the filmmakers set out to diffuse some of this anxiety by expressing an awareness of their own surroundings. Much like The Day The Earth Stood Still attempts to present this viewpoint by directly mirroring the world that fifties American society knew, Forbidden Planet projects it onto another civilisation, another planet, light years away, to create a bond between audience and message which could not be achieved through literal means. Forbidden Planet reveals to the audience the dark society that surrounds them – the dangers of our own human flaws, the threat of nuclear holocaust and the injustice of McCarthyism – by cleverly seating the audience far from the action, in the hope that they can take this third-person perspective on their own culture too.


Horror/Slasher Movies


Horror/Slasher is a hugely popular genre, which people just cannot resist watching. This article will break down the genre, its codes and conventions, narrative structures and more. Be sure to check the hyperlinks for sources and examples!

Let’s Play: Name That Movie!

Let’s Set the Scene:
There is a killer on the loose. Tension is high in the college and students are dropping like flies. A girl and her best friends begin to investigate and find out about the killer, while the police follow cold, near-dormant leads. It becomes apparent, however, that the more they learn, the more danger they're in. Circumstances push the girl and her best friends into a dangerous situation, like a party. This becomes a bloodbath. The protagonist's best friend is killed brutally in front of her. The killer is revealed to be someone right under our noses the entire time. A motive is given and, with the help of some information revealed earlier in the film, the girl manages to escape the killer to tell the tale.

Worked it out yet?

Well, of course, as you probably guessed, it was a bit of a trick question. This synopsis, with a few discrepancies here and there, generally applies to a large chunk of the horror/slasher genre. It is a very formulaic, general and clichéd set of codes and conventions which can be spotted, at least in part, across a variety of horror films.

Codes and Conventions

Unfortunately, the conventions of horror/slasher movies have become so overplayed that they are now a set of tropes – a set of shortcuts that crop up time and time again. Rather than treating conventions as the defining baseline of a genre, as they should be, some movies take these conventions as a ready-written script and consist of them entirely. Generally speaking, there are a few common baseline conventions which tend to crop up to set the scene.

Helplessness is key to charging any sort of emotion in the horror genre. Without the characters being in a state of peril, it simply isn't a horror movie. There are a lot of ways this is done, mostly through the characters' physiques – i.e. the protagonists not being as physically strong as the enemy, as in the cases where the lead characters are smaller teenagers, usually girls (as you'll learn as this article goes on), pitted against a larger, brawny brute, or a high-functioning psychopath who has every step the protagonists take mapped out before they even take it. Another way this is achieved, and this one applies to nearly every horror movie, is through the use of a secluded location. A burning question that, if left unanswered, will ruin a horror movie is: 'Why didn't they just call for help?'. Use of a secluded location is the easiest answer to that question and all of its offshoots – whether it be an abandoned city, a spooky cabin in the woods, or even space. As long as the characters are trapped alone with no one to save them, peril is much easier to create. Conveniently, there is no mobile signal in these kinds of places, which helps tie up that loose end too. Another classic convention is that characters don't treat the situation seriously. There will always be a character who laughs off the danger, or characters who turn up to places where they could be at risk because they've put the situation out of their minds. There is also the idea that locking oneself in a room is apparently safe: usually a protagonist will jump to the conclusion that they must hide somewhere instead of running away or trying to escape. Either that, or they will willingly walk towards and investigate a strange incident.

Another trope that crops up very often is the fake scare – the tone is set and the audience expects a scare, but the scare is just a perfectly normal event, like someone making the protagonist jump. Another classic is the stereotypical use of pathetic fallacy – i.e. a dark and stormy night – OooOOOoOoOOoh!

Stereotypes

The most notorious stereotype of all is the 'Final Girl' trope. The 'Final Girl' stereotype states that someone has to live to tell the tale, that it's most likely the protagonist, and that it's most likely a young woman – hence, 'final girl'. There's also a clear pattern showing that, in a good number of cases, these girls have to be chaste. Laurie Strode from Halloween, Alice Hardy from Friday the 13th, Nancy Thompson from A Nightmare on Elm Street – hell, even Ellen Ripley from Alien shares this to some extent. The 'final girl' theory, written about by Carol J. Clover in her works Men, Women, and Chainsaws: Gender in the Modern Horror Film and Her Body, Himself: Gender in the Slasher Film, is defined by a few characteristics, described rather eloquently here:

“The image of the distressed female most likely to linger in memory is the image of the one who did not die: the survivor, or Final Girl. She is the one who encounters the mutilated bodies of her friends and perceives the full extent of the preceding horror and of her own peril; who is chased, cornered, wounded; whom we see scream, stagger, fall, rise, and scream again. She is abject terror personified.” – C.J Clover, Her Body, Himself: Gender in the Slasher Film

Not only is this a convention of slasher/horror films, it is also an extreme example of the stereotyping and representation of women – arguably sexism – that filmmakers choose to use simply for the sake of following a 'satisfying' narrative.

Other stereotypes include the alpha-male jock character, who always takes charge; the 'whore', an 'impure' popular girl who is far from chaste and, for some reason, always dies first because of it; the misleading character who turns out to have been the villain the whole time; and finally the stoner/pervert character.

 

Mise-en-scène

As with any genre, mise-en-scène is not something that carries over identically from one movie to the next – no film is an exact copy of another – but there are some clear tendencies that crop up across plenty of horror movies.

Colour is a common toy to play with in most genres, and most genres have a conventional scheme. With horror movies, it's green or blue: blue being cold and harsh, making the audience feel sad and off-kilter; green being dirty, eerie and unnerving, giving the audience a feeling of unease. This also ties into production technology, as nowadays it is mostly achieved through colour grading.
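
For a rough sense of what such a grade does to a frame, here is a tiny sketch using the Pillow library. It is only an illustration under my own assumptions – real grading is done in dedicated software, not a few lines of Python – and the channel multipliers are arbitrary, not values any colourist actually uses.

```python
# Crude 'cold' horror grade: dampen reds, nudge up greens/blues.
# Illustrative only; the scale factors are arbitrary assumptions.
from PIL import Image

def cold_grade(path_in: str, path_out: str,
               r_scale: float = 0.85, g_scale: float = 1.05, b_scale: float = 1.15) -> None:
    """Apply a simple blue/green-leaning grade by scaling each channel."""
    img = Image.open(path_in).convert("RGB")
    r, g, b = img.split()
    r = r.point(lambda v: min(255, int(v * r_scale)))
    g = g.point(lambda v: min(255, int(v * g_scale)))
    b = b.point(lambda v: min(255, int(v * b_scale)))
    Image.merge("RGB", (r, g, b)).save(path_out)

# cold_grade("frame.jpg", "frame_cold.jpg")  # hypothetical file names
```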


Another common convention is high contrast. A high-contrast image with low-key light tends to make darker details disappear completely into blackness while making richer details much more prominent. Everything from eyes down to deep wrinkles is far more accentuated, and this can be an incredibly powerful effect because of it. Even something as harmless as Teletubbies looks scary in high contrast.
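
In the same illustrative spirit (and with the same caveats) as the grading sketch above, the crushed-blacks, popping-highlights effect can be approximated with a single contrast adjustment; the factor here is an arbitrary assumption.

```python
# Illustrative only: push contrast so shadow detail sinks towards black
# while brighter details become more pronounced.
from PIL import Image, ImageEnhance

def crush_shadows(path_in: str, path_out: str, factor: float = 2.0) -> None:
    """Boost contrast; a factor of 1.0 leaves the image unchanged."""
    img = Image.open(path_in).convert("RGB")
    ImageEnhance.Contrast(img).enhance(factor).save(path_out)

# crush_shadows("teletubbies.jpg", "teletubbies_scary.jpg", factor=2.5)  # hypothetical files
```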

This is mostly used when something needs to be shrouded in darkness and needs to be intimidating. It was used to great effect with the spooky nun in The Conjuring 2, with her deep black robes, stark white face and deep, blood-red mouth.

(I don’t know about you but I find that proper scary.)

Low-key lighting is another classic horror convention when it comes to mise-en-scène. It really sells darkness effectively without the screen being pitch black, and it appears in almost every horror movie at some point. One of my favourite usages is in The Exorcist, as the priest arrives (one of the strongest and most memorable images in the entire movie). The lighting is so low-key it reaches a chiaroscuro-type effect, giving off a strong noir vibe and really adding creepiness to the scene.

A more contemporary lighting effect that crops up in modern horror films’ mise-en-scene is using lighting in reverse of what is expected. High-key lighting usually implies safety and purity; warm colours give you security and make you comfortable, able to see everything in full, as if you were in a tungsten-lit house. It’s a cosy feeling. Modern horror rather predictably plays on this and turns it on its head, throwing your subconscious off by dropping horror into a warm scene. Once again The Conjuring 2 comes to mind, with the very same shot shown earlier.

Because the hallway of the house is incredibly well lit and warmly toned, the stark contrast of Valak at the end of the corridor is quite jarring. It’s a super effective scene, and it hits harder this way than if it were just another jump scare. The setting of horror films works in much the same way as the lighting. Conventionally a horror film will be set, as mentioned, in a secluded location where no help is available – for example The Woman In Black‘s setting of a fairly stereotypical spooky old house. However, as the years have gone on, horror films have slowly started bringing the scares all the way to your front door, throwing terrifying experiences into a nice house and playing with the viewer’s sense of security. A very good example of this – one that isn’t quite as in-your-face as something like Paranormal Activity – is Sinister, which sets itself in a quaint little house not too different from yours or mine, rather than anywhere stereotypically spooky.

 

Delving back once again into the more production technology side of horror – though still a genre convention – one of the most classic things to do in a horror film is use the camera to your advantage. Horror cinematographers have a few tricks up their sleeves that they capitalise on across nearly every single horror movie. One of the most common is a simple Dutch angle, or tilted frame. This helps sell unease to the audience: because the frame is no longer level, it pushes the feeling that something is just slightly off about the scene. Here’s a classic use in The Twilight Zone.
 

Another classic cinematography technique in horror movies is lining a scare up on the left third of the frame. There are countless examples of this in Western cinema: as we read from left to right, our eye is drawn naturally to the left, so the first place you’ll look when the film cuts to a new scene is the left-hand side of the frame – and nine times out of ten that’s where the scare will be. This is pushed further by the cinematographer’s use of phi, the golden ratio, trying to steer the viewer’s gaze to the exact spot where the scare hits hardest (a quick illustration of where these lines sit in a frame is sketched below). Another way this is achieved is through racking focus. Essentially, any way a cinematographer can guide the viewer’s eye will crop up in horror.
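As a quick aside, it’s easy to see where these compositional lines actually sit. The snippet below is a minimal sketch – the 1920-pixel frame width and the variable names are purely illustrative assumptions – showing the left-third line and the golden-section line measured from the left edge of the frame:

```python
# Minimal sketch: where the "left third" and the golden-section (phi) line
# fall in a 1920-pixel-wide frame. The numbers are purely illustrative.

FRAME_WIDTH = 1920                            # e.g. a 1080p frame

left_third = FRAME_WIDTH / 3                  # rule-of-thirds line: 640 px
phi = (1 + 5 ** 0.5) / 2                      # golden ratio, ~1.618
golden_left = FRAME_WIDTH * (1 - 1 / phi)     # golden-section line: ~733 px

print(f"Left-third line:     {left_third:.0f} px from the left edge")
print(f"Golden-section line: {golden_left:.0f} px from the left edge")
```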

Semiotics

Semiotics – and the enigma code in particular – is put nicely and eloquently by Nick Lacey when discussing Barthes’ narrative codes:

[Image: Nick Lacey’s summary of Barthes’ narrative codes]

In short, an audience will see a sign or enigma in a text (in this case a film) and fill in the blanks using intertextuality. They’ll associate a feeling, image or sound – a sign – with another thing through the context brought from other texts in the genre, or from life in general. Glaringly obvious signs are things like knives, indicating that there’s a killer on the loose, or pentagrams – specifically upside-down ones – implying the religious imagery of perversion and evil. The same goes for crosses and crucifixes, and upside-down ones in particular, tending to convey a demonic attachment or an attachment to the antichrist. More recently, Ouija boards have become a symbol of poltergeists and, once again, demons. These signs work because people take the fear of the unknown, and the religious associations, very seriously. Even if you don’t believe in such things, a sense of dread and unease certainly comes with these items.

Another piece of imagery that has forever stuck with the horror genre is the eye. Eyes hold most of a face’s expression and have become a strong symbol in horror. The gaze can tell a story by itself, whether it’s menacing and unnatural, like in The Ring:

Glaring and simply uneasy, to show the intent of the character, like in Psycho:

Or even, pulling from Psycho again, vacant and lifeless, like the eyes of a dead person:

Punching in on a close-up of an eye is not only a convention in the mise-en-scene of the horror genre, but has at this point become a staple piece of symbolism in it too.

Soundtrack is another classic example of semiotics in horror, as well as of the production technology side of the genre. When rising strings kick in, you expect a scare, thanks to your brain’s desire to know what’s next, through intertextuality. When a droning sound holds, you feel uneasy on behalf of the character. When a large orchestral hit sounds, the scare is ramped up tenfold because you feel threatened by the sound itself. A contemporary and intriguing use of soundtrack, particularly in Paranormal Activity, is low-frequency rumbling tones. These trigger the brain to expect something bad to happen, as they sit at a similar frequency to the waves that accompany natural disasters – and that some believe accompany real paranormal activity. This has been dubbed the ‘fear frequency’, and it’s worth looking into if you’re interested in sound design (a rough sketch of generating such a tone is included below). The films also play with this: the rumble is not only placed before moments of peril, but is sometimes removed before them too, leaving the audience completely unsuspecting of scares that arrive without the pre-warning they have become accustomed to.
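For the curious, here is a minimal sketch of how a rumble in that range might be generated for experimentation. It assumes NumPy plus Python’s standard wave module, and uses 19 Hz purely as an illustrative figure near the bottom of human hearing – this is not a claim about how Paranormal Activity’s sound was actually produced:

```python
import wave
import numpy as np

# Minimal sketch: write a few seconds of a ~19 Hz sine "rumble" to a WAV file.
# The frequency, level and duration are illustrative assumptions only.

SAMPLE_RATE = 44100   # samples per second
DURATION = 5.0        # seconds
FREQUENCY = 19.0      # Hz, near the lower limit of human hearing

t = np.linspace(0.0, DURATION, int(SAMPLE_RATE * DURATION), endpoint=False)
tone = 0.4 * np.sin(2.0 * np.pi * FREQUENCY * t)   # quiet sine wave
samples = (tone * 32767).astype(np.int16)          # convert to 16-bit PCM

with wave.open("rumble.wav", "wb") as wav_file:
    wav_file.setnchannels(1)            # mono
    wav_file.setsampwidth(2)            # 2 bytes = 16-bit samples
    wav_file.setframerate(SAMPLE_RATE)
    wav_file.writeframes(samples.tobytes())
```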

 

 

Narrative Structure

Narrative structure is fairly similar across all strands of horror. Usually, we can apply some version of the ‘Hero’s Journey’ – Joseph Campbell’s monomyth structure – to them.

[Image: the Hero’s Journey / monomyth structure diagram]

Take Scream, for example. The film begins with a prologue – Drew Barrymore’s infamous death scene – but past that we enter the Hero’s Journey for Sidney. The film starts in an ordinary world, and the Call to Adventure is the spark of the killings and their ties to our protagonist. At this point we jump ahead somewhat, reaching the point where our protagonist crosses the threshold by being thrown into the situation when the killer phones her. She makes an enemy fairly early – Gale, the reporter – and thus begin the many tests of the movie. One of the more in-your-face tests is Sidney losing her virginity. Scream pokes fun at the final girl trope throughout, and this is one of the most obvious instances: the first time around it is played off, but the second time, when Billy tries again at the party, there is an outright narration of the virgin trope. Literally immediately afterwards, Billy is stabbed by the killer, and thus the Ordeal – the final test, if you will – presents itself. After gathering the information she has, Sidney now has to defeat and escape the killer for the story to satisfy the final girl trope. Unsurprisingly, she uncovers the killers’ motive and then works to defeat the enemy. Afterwards we return to the ordinary world as Gale reports the news in the closing scene.

You could argue that the film has a multi-strand story, as Gale undergoes a more selfish Hero’s Journey too, her Call to Adventure being the return of the killer and her strong belief that the man accused of killing Sidney’s mother was wrongly convicted. Her tests are the many attempts to uncover the true identity of the killers, her ordeal is finding them and ultimately saving Sidney’s life, and the closing of the film is obviously the resolution of her story too.
Generally speaking, horror as a genre closely follows this Hero’s Journey format, with slashers and more conventional horror films switching up the ending for a Final Girl finale. Overall, I’d say that a closely bonded multi-strand storyline, as in Scream, is very common in horror. The strands are tightly intertwined throughout, which may not be the case in other genres. Even in the most basic group slasher movies, the characters all have their own story branch, yet they remain incredibly tightly intertwined, and you could interpret the killer as another strand: a motive is usually presented over the course of the film and, while it links with the other strands, it is usually a very separate story – the other side of the coin to the one the protagonists are following. Horror movies are also fairly closed narratives. Even films that get sequels tend to end cleanly, wrapping up the loose ends rather than leaving things open in the way an adventure movie might. We always finish the Hero’s Journey in each film – nothing is left half-done to be closed off later.

 

 

Video Games, Aggression and Audience Theory


The age-old debate – “do video games cause violence?” – is a classic heated discussion of not just a vital medium in the media industry but also, at its base level, a discussion of audience theory. In order to fully understand whether or not video games negatively affect the brains of players, we have to learn about the way the brain consumes media and how it processes emotional responses to that media.

In the case of video games, it goes without saying that there is a huge impact on the brain: games usually feature highly brain-intensive situations, whether a puzzle game or a high-octane first-person shooter. The brain, as you can imagine, is certainly not short of things to do – unless the game is unnecessarily boring. It will be constantly occupied: thinking deeply about solutions to problems, surveying landscapes, understanding new worlds, or triggering high-speed instincts. This is what sets video games apart from other media, like film and television, and it is why video games as a medium are so highly praised – improving hand-eye coordination, sharpening problem-solving skills and alleviating stress. Essentially, by playing games a consumer is training themselves, which in some cases is an excellent thing, with puzzle games like Picross, KAMI and Portal honing problem-solving skills. On the other hand there are first-person shooters, e.g. Call of Duty and Battlefield, which heavily train hand-eye coordination, speed up reflexes and generally get adrenaline flowing. These two examples have a high competitive factor on top of that, which gives depth and reason to entering the world. With something like CS:GO or DOTA 2 the levels of adrenaline and competitive urge are so high that playing these games has become a full-blown electronic sport and profession.


To say that this doesn’t influence the brain would be a complete lie. That influence, however, includes negative effects too. For a start, it is well known that games can become incredibly addictive. Immersing oneself in a world that is rich in lore and pulls you in simply by existing in a satisfying way is more than likely going to negatively affect someone drawn to that type of game. MMOs and RPGs are most famous for this, with games like World of Warcraft – currently 13 years old – still retaining much of their original player base and causing serious addiction. The world is incredibly involving, with a full-blown economy, a breathing population of real people and hugely social guild features. People were fully able to – and some fully prepared to – lose themselves in this virtual expanse, and a huge number of people did: roughly the same number of people as the total population of Germany, Belarus and Sweden combined.

Another big problem – or alleged big problem – is the increase in aggression from playing video games. This is probably the most notorious of all video game debates. It stems from the belief, firstly, that video games cause aggression and, more specifically, that violent video games are the cause. First of all, it is a fact that a high level of focus brings potential frustration, and by nature all tests, activities and puzzles do this. It pretty much goes without saying, then, that playing video games can get intense and frustration can ensue. Does this cause aggression in general, however? No conclusive evidence has been found. Several studies have checked for a correlation between aggression and violent video games, and huge waves of mixed results have come back. There is no conclusive proof that explicit violence affects a gamer’s level of aggression. There is, though, evidence that higher aggression levels are caused by frustration instead. This was tested on Horizon using a game called Bastet (Bastard Tetris), which picks blocks based on how unhelpful they are: there’s a 70% chance that the next block you are given is the worst possible block for the situation, a 15% chance it’s the second worst, a 9% chance it’s the third worst and only a 6% chance you’ll get a useful block (a rough sketch of that block-picking logic appears below). Being designed to frustrate the player, it bumped aggression levels up noticeably compared to players playing regular Tetris.

Despite great assumptions and apparent correlations between mass murders and the consumption of violent video games, the evidence is so conflicting that nothing can be proven. It may hold for mentally unstable individuals who fail to draw a line between reality and a game, but to most players appreciating the art form and playing as a hobby – just like watching TV – the notion seems ridiculous and almost completely inapplicable. Most of these so-called correlations have no real evidence behind them beyond the fact that the person happens to play video games as well as being a killer – remember that 42% of Americans play video games and that, in 2013, more than 1.2 billion people regularly played games, so these ties are, statistically, no surprise. On the other hand, until we find conclusive evidence, we cannot prove that these ties are anything more than a violent individual who happens to own games too.
The lack of evidence works against both sides of the debate and truly until this evidence arises, we shan’t know.
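To make the Bastet odds mentioned above a little more concrete, here is a minimal, hypothetical sketch of how a ‘worst-piece’ picker with that 70/15/9/6 split might be written. The evaluation step is a placeholder – the real Bastet scores every possible placement against the current board – so treat this as an illustration of the idea rather than the game’s actual code:

```python
import random

# Hypothetical sketch of a Bastet-style piece picker using the 70/15/9/6 odds
# quoted above. This is an illustration of the idea, not Bastet's actual code.

PIECES = ["I", "O", "T", "S", "Z", "J", "L"]

def rank_pieces_worst_first(board):
    """Stand-in for the evaluation step.

    A real implementation would simulate dropping every piece onto the current
    board and score each outcome (holes created, height added, lines cleared),
    then order the pieces from least to most helpful. Here we just shuffle.
    """
    ranked = list(PIECES)
    random.shuffle(ranked)
    return ranked

def pick_next_piece(board):
    """Choose the next piece with the odds stacked against the player."""
    worst_to_best = rank_pieces_worst_first(board)
    roll = random.random()
    if roll < 0.70:
        return worst_to_best[0]   # 70%: the worst possible piece
    if roll < 0.85:
        return worst_to_best[1]   # 15%: the second worst
    if roll < 0.94:
        return worst_to_best[2]   # 9%: the third worst
    return worst_to_best[-1]      # 6%: a genuinely useful piece

if __name__ == "__main__":
    print(pick_next_piece(board=None))
```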
But how does any of this relate to audience theory? Well…


The Hypodermic Needle Model is an early method of understanding how an audience takes in media. In essence, it suggests that an audience consumes the media and immediately absorbs all of its messages in the same way – one specifically designed way. Many definitions call this a ‘passive audience’: the audience objects to nothing, is entirely impressionable, and consumes the media one hundred per cent as intended. I would describe it as a black-and-white view of the product with no grey areas – in fact there isn’t even a second colour; it’s the idea that everyone takes the same view with no questions asked. The name is rather apt, then: as through a hypodermic needle, the substance (in this case the product) is simply pushed into the body (in this case the brains) of the audience, and that’s that. This model is considered bonkers at this point; we know that no two humans think the same way – we’re all individuals, not one clump of brains that are all equally impressionable and open to everything. Each person has different morals, and each person will accept or reject certain statements. Gauging responses wasn’t really written about in detail until people began to question the plausibility – or in this case, the ridiculousness – of the 1920s Hypodermic Needle Model. In 1973, Stuart Hall devised a new model, defining a set of specific responses that audiences can take, which seem like no-brainers to us now.


This model was the Encoding/Decoding Model, which explores the different ways that audiences decode media, as opposed to ‘consume’ it. The term ‘consume’ has become increasingly out of date as we understand the psychology further, since it implies simply taking the media in. Hall refers instead to ‘decoding’ or ‘reading’, and the reasoning has been eloquently summed up in the quote below:

“By the word reading we mean not only the capacity to identify and decode a certain number of signs, but also the subjective capacity to put them into a creative relation between themselves and with other signs: a capacity which is, by itself, the condition for a complete awareness of one’s total environment” – The Cultural Studies Reader. Edited by Simon During. 2nd edn. London, England: Taylor & Francis.

In short, Hall talks about three main hypothetical positions, or stances on interpreting media:

  • Dominant-hegemonic Reading
    • The decoder takes the connotations of the media straight up, with no questions; taking the media in ‘full and straight’ is how it is described.
  • Negotiated Reading
    • The decoder accepts the connotations of the media – the ‘preferred reading’ – but applies them differently per person. In essence, the reading is modified in order to be accepted, by applying it to themselves or reflecting their own position. An example of this is the many branches of Christianity – it’s all the same book, but many different people take many different stances.
  • Oppositional Reading
    • Simply put, rejecting the code. The exact opposite of dominant-hegemonic reading.

(Source)

uses and gratifica

The Uses and Gratifications Theory is one of the leading interpretations of the consumption of media and bases itself on the audience’s reasons for using – reading or consuming – the product. It is famously framed in a question from Elihu Katz, who asks not “what media does to people” but “what people do with media”. At a base level, it states that audiences only consume the media they need in their lives – they aren’t decoding anything unless they want or need to. This once again kicks out the Hypodermic Needle Theory and shuts down the idea that an audience is completely passive to all media. According to the theory, a piece of media must be able to provide one of the following to a reader:
  • Identification: The consumer can pick apart the product, identify with it, and apply the thoughts it raises to their own life.
  • Education: Learning from the product.
  • Entertainment: Gaining pleasure and enjoyment from the product.
  • Social Interaction: Gaining conversation points and topics to discuss with other people – sharing an experience and discussing it.

(Sources: 1, 2)


What does this have to do with video games and aggression, you ask? Well, it’s more meta than that. The side that firmly believes violent video games make people aggressive and violent is essentially applying the hypodermic needle model to games and players. They assume that if a player plays a violent game, they immediately and uncontrollably accept everything within it as acceptable, and that the game implants a subconscious urge to replicate in-game content (i.e. violence). Thinking this way, with no evidence given for why, implies that this outdated model is in effect. There are of course people on this side who hypothesise, as opposed to believe, that violent video games make people more violent, and who are inclined to test those hypotheses – some of which do, admittedly, come back with substantial results. Taking a step further into the meta: people who firmly believe, without evidence, that video games cause violence and aggression because of their violent nature are reading that claim in a dominant-hegemonic position, whereas someone who is waiting for evidence is taking a more negotiated or oppositional position. You could even argue that hearing the statement ‘violent video games cause aggression and make people violent’ and believing it outright is the reader succumbing to the hypodermic needle model.


Overall, then, can we say conclusively that violent video games cause aggression? Of course not – not without substantial evidence. On the other hand, can we say that they definitely do not? Not even slightly. Despite much of the evidence sitting on that side, the only way to prove or disprove the notion is with evidence from the other, and until we know for sure it would be dishonest to state outright that games do not cause aggression. So why does this debate continue? Because it is about more than just aggression – it is about audience theory, about bias, about fact and fiction, and about the in-depth psychology of audiences, not just as a whole but individually.

Task 4 Understanding

Regulatory and Professional Bodies

Regulatory bodies are essentially the authorities responsible for regulating what is released to the public in the media. They help protect audiences from offensive material and head off complaints or uproar. Examples of regulatory bodies are:

  • BBFC
    • The BBFC are responsible for classifying films in the UK. They are an independent, non-governmental body made up of a council and examiners who classify movies. Any movie shown in a cinema or due for release on DVD has to be reviewed and given a rating. The BBFC help protect the public from viewing anything too indecent when they go to watch a movie; they also give reasoning for their ratings and provide information on what people can expect to see when watching the film. Their main area of concern is what is shown in the movie, making sure there is nothing obscene or unsuitable for the age group it is aimed at. An example of the BBFC’s more infamous decisions is the A Clockwork Orange case.
      • Stanley Kubrick’s ‘A Clockwork Orange’ was a film about violence and how the future of England might be shaped by it. It tells the story of Alex, who begins as a violent individual and ends up suffering the comeuppance of his own actions. Though there are many misconceptions about the film’s release in Britain, the BBFC never rejected it; rather controversially, they passed it with an X rating (viewable by those 18 and over only), saying:
        • “Disturbed though we were by the first half of the film, which is basically a statement of some of the problems of violence, we were, nonetheless, satisfied by the end of the film that it could not be accused of exploitation: quite the contrary, it is a valuable contribution to the whole debate about violence” – Stephen Murphy, then BBFC Secretary

      • In 1973, two years after it was passed, the film became more controversial when reports of ‘copycat violence’ began to flood in and, rather ironically, threats were made against the safety of Kubrick and his family, causing Kubrick to withdraw the film from the UK. It wasn’t until after Kubrick’s death in 1999 that the film returned to Britain, with his family’s permission.

 

  • Ofcom (Office for Communication)
    • Ofcom regulate communication services and broadcasting licences. Ofcom is run independently and has a main decision-making board which meets at least once a month, along with a policy management board, a content board and various committees. The board is reviewed annually to ensure it is representative of the public. Ofcom make sure people in the UK get the best out of their communication services, whether phones, internet, television or radio. They help protect the general public from things like scam calls, make sure a range of different programmes are shown, ensure there is no harmful or offensive material in programmes on radio or television, and ensure there is a wide range of electronic communication services. The rights and interests of the consumer are well protected, as Ofcom make sure nothing obscene is broadcast, especially before the watershed. Their biggest concerns are scam calls and offensive programming. An example of Ofcom’s work would be the Russell Brand/Jonathan Ross fine imposed on the BBC.
      • In October 2008, Russell Brand and Jonathan Ross broadcast a large amount of explicit, intimate and confidential information about Georgina Baillie. This was offensive, humiliating and demeaning material which should not have been aired, so the action taken was fairly severe.
        • “A fine of £70,000 was imposed for the breaches of Rules 2.1 and 2.3; and a fine of £80,000 imposed for the contraventions of Rule 8.1.” – Ofcom

        • The presenters’ shows were taken off air on BBC frequencies for a period of time following this too.
  • PCC (Press Complaints Commission)/IPSO (Independent Press Standards Organisation)
    • The PCC was the Press Complaints Commission and has been replaced by IPSO, the Independent Press Standards Organisation, which is responsible for monitoring standards in the news. It exists to uphold professional standards of journalism in the UK, addressing complaints from members of the public who believe a journalist has broken the Editors’ Code of Practice, originally put in place by the PCC. In short, the code ensures that journalists must be accurate, respect the privacy of individuals, avoid harassing individuals, be mindful of children in all cases (particularly in sex cases), be mindful of hospitals and their non-public areas, be mindful when reporting on crime, and follow other regulations that tend to apply across the whole of the media (including regulations on discrimination). An example of the PCC’s work, before it was replaced, is its response to a 2009 Scottish Sunday Express article about the survivors of the 1996 Dunblane school massacre.
      • The Express wrote a front-page article about the survivors of the shooting, scrutinising them for ‘shaming’ the memory of the deceased with “foul-mouth boasts about sex, brawls and drink-fuelled antics”. The paper ripped photos from the survivors’ social networking accounts and used them in print, despite there being no real need for the humiliating article in the first place. After upholding the complaint, the PCC stated the following, citing the failure to respect the teens’ private lives:
        • “[The survivors had done] nothing to warrant media scrutiny, and images [from social networking sites] appeared to have been taken out of context and presented in a way that was designed to humiliate or embarrass them” – The PCC

  • ASA (Advertising Standards Authority)
    • The ASA regulate adverts across all forms of media in the UK. The ASA is built up of a senior management team and a council, which deals with complaints about advertising. The ASA review around thirty thousand advertisements every year, and ads considered inappropriate have to be either amended or withdrawn. The ASA continuously help protect audiences from indecent material in adverts; their main goal is to ensure that adverts are responsible and appropriate. They look into every complaint made – of the 31,136 complaints made in 2013, 4,161 ads had to be amended or withdrawn. Although acting largely on complaints, they try to be as proactive as possible in taking action against misleading, harmful or offensive advertisements, and they work very closely with Ofcom. A noteworthy case of the ASA’s work is their handling of complaints about an atheist bus campaign.
      • In January 2009, the ASA ruled on a bus campaign launched by the British Humanist Association which read “There’s probably no God. Now stop worrying and enjoy your life.” The advert drew 326 complaints, largely because of the opening sentence and, I’d imagine, somewhat because it is printed larger than the text that follows (although that is clearly a stylistic, typographic choice rather than a ploy to offend). The complaints argued that the adverts were offensive and misleading, since the campaign could not substantiate the claim that God does not exist. The word “probably” is used in that claim, which largely invalidates the argument, as the campaign is not claiming to know for certain; “probably” is essentially a successful loophole, getting the point across without attempting to mislead (“allegedly” is another popular choice of word here). Because of the “stop worrying and enjoy your life” line, the ASA also ruled that the advert was not trying to offend but to get across a happy, friendly slogan. The ASA concluded that the ads were unlikely to mislead or cause widespread offence, and the case was closed. (source)
  • The Gaming Industry
    • There are many organisations within the gaming industry which help regulate and improve the business. These include:
      • TIGA (The Independent Games Developers Association)
        • Launched in 2001, TIGA was created to represent the interests of video game developers in the UK. TIGA was a founding member of the European Game Developers Federation (the EGDF), a federation that works to ensure the stability, vibrancy and creativity of game development in the EU. They provide a platform for collaboration and discussion within the community.
      • IGDA (The International Game Developers Association)
        • The IGDA are a non-profit organisation who are very similar to the EGDF, except on an international level. They work hard to identify and speak out on issues in the industry, connect members with peers and expand the reach of the developer community.
      • PEGI (Pan European Game Information)
        • The main regulatory body for the gaming industry is currently, technically, the ISFE (Interactive Software Federation of Europe), which set up the PEGI rating system, now a legal requirement in the UK. PEGI ratings are put in place to regulate games by rating their appropriateness for specific age groups.
  • BAFTA (British Academy of Film and Television Arts)
    • The British Academy of Film and Television Arts is an independent charity organisation which does its best to promote, support and develop the television and film industry by rewarding creators in the BAFTA Awards. They offer members workshops and classes across the world to inspire creators and benefit the public.
    • “BAFTA identifies, rewards and celebrates excellence at its internationally-renowned, annual awards ceremonies whilst providing opportunities for the public to find information and inspiration through its year-round programme of events” – BAFTA Mission Statement

  • CRCA (Commercial Radio Companies Association)
    • The CRCA is a trade body for commercial radio within Britain. They manage the Radio Advertising Clearance Centre, which regulates advertisements before they are broadcast, working to aid advertisers whilst also flagging work which may breach any codes set by Ofcom. (source) The CRCA also jointly owns Radio Joint Audience Research Ltd with the BBC, which carries out research into radio broadcasting as an industry (source).

Regulatory issues also exist and can be problematic. These include:

  • Control of Ownership
    • Ownership is regulated thoroughly in the media industry. The reason is not to limit success but to stop a monopoly (see below) from occurring. A monopoly is where one person or business gains so much power over an industry that it effectively controls it. When a company has that much power it has sway not only over the industry but over the public. For example, all newspapers could push the same view and there would be no competition, as all of the media would be owned by the same company.
    • What a monopoly means for the audience is that the same types of stories would be told across all outlets. Similar films, TV shows and newspaper reports would start to crop up. This would create uncertainty, as the public would be unsure what to believe – the endless stream of the same stories could all be biased.
  • Taste Vs. Decency
    • Taste and decency are keywords when it comes to censorship; they draw the line between obscenity and appropriateness. Anything can be seen as morally wrong by someone in a given audience. Taste and decency imply that censorship must be handled tastefully while also ensuring that the media stays decent. For example, when rating a film the BBFC may choose to cut certain elements. Say the film depicts scenes of a very sexual nature: the BBFC must decide whether this is done tastefully, in the sense that it helps move the story along, and, if cuts need to be made, ensure that the scene is cut to be decent – i.e. to censor anything that is quite clearly not tasteful, out of decency to the audience. It is all down to personal opinion and therefore becomes a big problem in regulation; many disagree on taste and decency arguments.

One thing that we can all agree on, however, is that there are far too many acronyms in the media industry.