
Collaborators have to bring something of themselves to the party

Apr 10

This post was initiated after a meeting of the Research Centre in Material Digital Culture at the end of February. Having read the two papers and come armed with notes, I took the view that one of them defined the world wide web in a manner I felt was technologically determinist and reductive. Adamant, I shared my views with the other members of the group present, including both my DPhil supervisors, who politely refrained from commenting on my reactionary attitude to the paper. I repeated the charge, confident that my reading of the author's views was illuminating and progressive. A calm descended, and the quiet that ensued was just long enough for me to consider the silence of the other readers, and for the discussion to move on to the views actually expressed in the paper in question, rather than the opinions I had inferred from my reading of the author's intent.

Hmm. I have reread the papers in question this evening, reviewed my notes and reconsidered the evidence for the charge I had in mind. On review, it's thin. "Of what was I charging the authors?" I hear you ask. Well, I thought that the paper included a descriptive definition of what "the internet does". So why had this impertinence developed into the intellectual equivalent of an 'ear-worm': a song so catchy it is impossible to remove from one's head, regardless of the gossamer-thin nature of the melody or the shallow veneer of the lyric? On reflection, and after rereading the papers, it is simply because the two papers concerned, one discussing crowd-sourcing in action on Wikipedia and the other the commercial application of the concept, are descriptive of the process and are not interested in a critical appraisal of the product of that process.

Wikipedia is critically appraised, and the claims made of the website are investigated through an analysis of data relating to the creation and maintenance of the site's pages. The implementation of software bots to translate pages from English into minority languages is hailed as a process of co-authorship and collaboration, removing opportunities for bias and inaccuracy to enter the text. In languages with smaller numbers of native speakers, bots outnumber human editors. For me, the obvious outcome of this reliance on automated software, designed to ensure that minority languages are included in Wikipedia, is to limit the opportunities for the creation of original content in those languages. If a page for "Super Furry Animals", for instance, is translated into Cornish from Welsh (or English), we will never receive the benefit of a page created from a Cornish perspective. All content is replicated and interlinked until it becomes part of the protocol of control active in much of the existing internet, and in this process the bots carry a great part of the workload.

I still feel that, while the activity of bots is important to maintaining the content of Wikipedia, elevating the processes they facilitate to collaboration is akin to giving the stapler a credit on a term paper, or acknowledging the printer in the bibliography of your dissertation. These are tools, just as the robotic lawn mower tested last week on the lawns of Sussex uni is a tool. It may be able to tell what is grass and what is a sleeping second year (at least I hope that was part of the programming), but it will not find a bare spot and lay new seed to fill in the missing turf unless it has been asked to, by the programmer. Collaborators have to bring something of themselves to the party.


Attack of the Mutant Camels - revisited

May 06

Not exactly close to either Brighton or Hove, and not part of this year’s festival (topical reference - tick), but the video games selected for an exhibition hosted by the Smithsonian American Art Museum, entitled 'The Art of Video Games', have been announced. The list was collated with the help of a public poll, with gamers able to nominate their personal favourites.

The list is speckled with classic titles, including work by Jeff Minter, Fumito Ueda, Hideo Kojima and Tetsuya Mizuguchi. It covers both console and PC gaming across most formats, including the early cassette-based software that powered the likes of the Commodore 64. While it is always lovely to gaze at the visual beauty of 'Shadow of the Colossus', I am glad to see that the gameplay and originality of Worms has found a place in this exalted company.

This exhibition, which takes place between March and September next year, is a further step in the acceptance of the video game as an aesthetic object; one where the narrative is co-determined by designer and player, and one where the graphics are frequently supplemented by the imagination of the gamer.

Video game code is all about compromise. How can the code provide a realistic impression of the physical laws of the world, or at least a consistent rendering of the world in which the action is based, while running on hardware that is frequently underpowered and may be of a specification up to five years old? (The PlayStation 2 had a production life of eight years, with the same hardware specification from day one.) Graphics have to be rendered and dropped with amazing efficiency, and any lag in the controls will appear to the gamer as a failure of the game to respond, making the game difficult to play, reducing its playable life, and ruining the reputation of the software company responsible for creating it.
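
One common way of squaring this circle is the fixed-timestep game loop: the simulation advances on a constant clock so the game world stays consistent, while rendering simply falls behind, dropping frames, when the hardware cannot keep up. The sketch below, in C, is only an illustration of that general pattern, not the method of any particular engine; every function in it (poll_input, update_world, render_frame, current_time_ms) is a hypothetical stand-in.

    /* A toy fixed-timestep loop. The simulation ticks at a constant
       rate so the world stays consistent; rendering happens once per
       pass and effectively drops frames when the hardware lags.
       All functions here are hypothetical stubs. */

    #define TICK_MS 16     /* roughly 60 simulation updates per second */
    #define MAX_CATCHUP 5  /* stop catching up; slow the game instead of hanging */

    static long fake_clock = 0;
    static long current_time_ms(void) { return fake_clock += 7; } /* stub clock */
    static void poll_input(void)   { /* read the pad or keyboard state */ }
    static void update_world(void) { /* advance physics by one fixed tick */ }
    static void render_frame(void) { /* draw the current world state */ }

    int main(void)
    {
        long previous = current_time_ms();
        long lag = 0;

        for (int frame = 0; frame < 100; frame++) { /* stand-in for the real loop */
            long current = current_time_ms();
            lag += current - previous;
            previous = current;

            poll_input(); /* sample controls every pass, so input never lags */

            /* Run as many fixed updates as elapsed time demands; cap the
               catch-up so a slow machine slows the game rather than hangs it. */
            int updates = 0;
            while (lag >= TICK_MS && updates < MAX_CATCHUP) {
                update_world();
                lag -= TICK_MS;
                updates++;
            }

            render_frame(); /* one draw per pass, however many updates ran */
        }
        return 0;
    }

Decoupling update from render in this way is why a game can still feel responsive even while visibly dropping frames: the controls are sampled and the world advanced on schedule, whatever the graphics hardware manages.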

The most affecting video games are remembered as having smooth, slickly rendered graphics that invite the gamer into a new paradise of ludic challenges. The games of yesteryear are remembered for the joy created by playing them, and it is always a shock to the memory to see how blocky or pixelated the graphics now appear. Time is unforgiving, and each subsequent generation of video console reduces the previous pinnacle of polygon performance to the status of a bundle of hopeless line drawings.

So speed is all; not necessarily in the game’s action, but definitely in the smooth progress of the game code. This, in part, has driven the segmentation and specialism within the game development industry, with game engine companies providing development software and gameplay specialists supporting designers and graphical specialists. As with all other industrial structures, specialisation is key to developing efficiencies in production. Which is great when the fruits of the development cycle are the likes of Heavy Rain or the forthcoming L.A. Noire, where the spirit of the auteur is channelled by producers looking to explore a new creative medium and develop narratives that take advantage of the higher levels of affect available to play with. Anyone who doubts whether gaming creates an embodied response in the gamer should try to cut their finger off, as demanded in one of the games on the Smithsonian list. However, I hope there is still space in the world of social mobile gaming for the development of a new Daredevil Dennis!