critical code

Critical Code Studies is a small but growing field of study, founded by Mark Marino in 2006, in which humanists read and analyze computer code as a sign system, interpreting its extra-functional aspects to produce further meaning.

projects

webs of knowledge

Year 2016 – present
On May 2, 2001, user Erdem Tuzen, a physician from Istanbul, Turkey, created the myasthenia gravis Wikipedia page. It was written as a simple paragraph describing the disease’s main symptoms, how the disease works, and how it is treated. Zooming through time shows that it has been edited 1023 times by 521 users. It has 10 main sections, with another 15 subsections. There are 14 links to other Wikipedia pages in the introductory paragraph alone. The amount of ‘information’ has multiplied over time. What does it mean to be an embodied being within systems of data? What happens when standards and practices rely on particular forms of information as being more worthy than others, causing gaps in the sharing of knowledge?
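
Figures like "edited 1023 times by 521 users" come from Wikipedia's public revision history. The sketch below shows one way such counts can be tallied through the standard MediaWiki API, assuming Python and the requests package; the numbers will have grown since.

```python
# Sketch: count revisions and unique editors for a Wikipedia article
# via the public MediaWiki API (assumes the `requests` package).
import requests

API = "https://en.wikipedia.org/w/api.php"
TITLE = "Myasthenia gravis"

params = {
    "action": "query",
    "format": "json",
    "prop": "revisions",
    "titles": TITLE,
    "rvprop": "user|timestamp",
    "rvlimit": "max",          # up to 500 revisions per request
}

revisions, editors = 0, set()
while True:
    data = requests.get(API, params=params).json()
    page = next(iter(data["query"]["pages"].values()))
    for rev in page.get("revisions", []):
        revisions += 1
        editors.add(rev.get("user", "anonymous"))
    if "continue" not in data:        # no more pages of history
        break
    params.update(data["continue"])   # continuation token for the next batch

print(f"{TITLE}: {revisions} revisions by {len(editors)} editors")
```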

queer ghosts in the machine:
the mechanics of networked anonymity in the Tor Project

Year 2013
In the United States, freedom, democracy, anonymity, and the individual are misunderstood entities within, and in relation to, digital spaces. Digital technologies have, until recently, been popularly considered open, democratic networks of distributed and decentralized power, allowing individuals equal access to receive and share information. United States cultural stories of the Internet also misplaced digital technologies within the utopian ideal of a democratic society, a free space (free as in freedom) where people could leave discrimination behind simply by becoming anonymous. Of course, academics and everyday users came to realize that socio-cultural biases repeat themselves across digital social spaces, regardless of anonymity, in much the same ways as in other public spheres.
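
The "mechanics" named in this project's title are Tor's onion routing, in which a message is wrapped in one layer of encryption per relay so that no single relay sees both sender and destination. Below is a toy sketch of that layering idea only, assuming the Python cryptography package; it is not Tor's actual protocol.

```python
# Toy illustration of onion-style layered encryption (not Tor's real
# protocol): each relay peels exactly one layer as the message passes through.
from cryptography.fernet import Fernet

# One symmetric key per relay in the circuit: guard -> middle -> exit.
relay_keys = [Fernet.generate_key() for _ in range(3)]

def wrap(message: bytes, keys: list[bytes]) -> bytes:
    """Encrypt outward from the exit relay, so the guard's layer is outermost."""
    for key in reversed(keys):
        message = Fernet(key).encrypt(message)
    return message

def unwrap_one_layer(onion: bytes, key: bytes) -> bytes:
    """What a single relay does: remove only its own layer."""
    return Fernet(key).decrypt(onion)

onion = wrap(b"hello from an anonymous sender", relay_keys)
for key in relay_keys:          # each relay in order peels its layer
    onion = unwrap_one_layer(onion, key)
print(onion)                    # b'hello from an anonymous sender'
```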

homeland security twitterbot

Year 2012
This was a temporary tactical performance bot that questioned Homeland Security’s creation and use of a keyword surveillance list on Twitter and Facebook. The bot randomly generated sentences seeded with words from this list, ranging from unlikely terms such as ‘bacteria’ and ‘plume’ to more commonly used words such as ‘disaster’ and ‘response.’
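
A minimal sketch of the bot’s core mechanic appears below: seed template sentences with random words drawn from the watch list. The keywords and templates shown are illustrative stand-ins, and posting to Twitter is omitted.

```python
# Sketch of the bot's generative step: fill template sentences with
# words drawn at random from a surveillance keyword list.
# The keywords and templates are illustrative; posting to Twitter is omitted.
import random

WATCHLIST = ["bacteria", "plume", "disaster", "response", "exposure", "cloud"]

TEMPLATES = [
    "Just saw a {a} near the {b} downtown, weird day.",
    "Can anyone explain why the {a} smells like {b}?",
    "My weekend plans: {a}, coffee, and a little {b}.",
]

def compose() -> str:
    """Pick a template and fill it with two random watch-list words."""
    a, b = random.sample(WATCHLIST, 2)
    return random.choice(TEMPLATES).format(a=a, b=b)

if __name__ == "__main__":
    for _ in range(3):
        print(compose())
```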

Reb00t

Year 2011
Reb00t is a series of short analyses of digital embodiment, written in 2011. Read the essay

the ego_page

Year 2011
(computer) code for everyday digital technologies and objects is usually written by a human, or a group of humans, who hold preconceived notions about the world: their own common senses, knowledges, and values that enable them to do their work according to best practices and web standards. This project uses accidentally released Facebook code to examine how normativity embedded within the code renders its content inaccessible.
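
As a purely hypothetical illustration (not the released Facebook code itself), the sketch below shows how a hard-coded binary in profile-rendering logic can make content unreachable for anyone outside the categories its authors assumed.

```python
# Hypothetical illustration only -- not the leaked Facebook code.
# A hard-coded gender binary baked into rendering logic quietly makes
# profile content unreachable for users outside the assumed categories.

PRONOUNS = {"male": "his", "female": "her"}   # the code's embedded norm

def render_profile_blurb(name: str, gender: str) -> str:
    if gender not in PRONOUNS:
        # Anything outside the binary simply never renders.
        raise ValueError(f"unsupported gender: {gender!r}")
    return f"{name} updated {PRONOUNS[gender]} profile."

print(render_profile_blurb("Alex", "female"))
# render_profile_blurb("Alex", "nonbinary") raises ValueError: the content
# is inaccessible because the category was never imagined by the authors.
```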

Read full paper

related conference presentations, forums, + workshops

Queer Ghosts in the Machine: the mechanics of networked anonymity in the Tor Project
Theorizing the Web
University of Maryland, 2012

Critical Code Studies Working Group
wg12.criticalcodestudies.com, 2012

Queer Profiles: embodying (computer) code
Theorizing the Web
University of Maryland, 2011

the_ego_page: queerly reading (computer) code
DC Queer Studies Symposium
University of Maryland, 2011

Critical Code Studies Forum
HASTAC, 2011