Digital Shift: The Cultural Logic of Punctuation (2015)
Connecting the Dots
Periodizing the Digital
The Digital Period
The most elemental of punctuation marks, the period gets its own place on the QWERTY keyboard, just to the right of the M and the comma keys. In the original design for this layout, the period was supposed to be housed where the R currently resides. But it was then moved so that all the letters of TYPEWRITER could be found along the top row. Economist Paul A. David explains,
In March 1873, Densmore succeeded in placing the manufacturing rights for the substantially transformed Sholes-Glidden “Type Writer” with E. Remington and Sons, the famous arms makers. Within the next few months, QWERTY’s evolution was virtually completed by Remington’s mechanics. Their many modifications included some fine-tuning of the keyboard design in the course of which “R” wound up in the place previously allotted to the period mark “.” Thus were assembled into one row all the letters which a salesman would need to impress customers, by rapidly pecking out the brand name: TYPE WRITER.1
Embedded within our contemporary computer keyboards, then, is this earlier history of the period’s shift, displaced from its originally central position on the keyboard for the sake of showmanship and selling commodities. Imagine the subliminal ways in which we might have thought of punctuation differently if it had indeed occupied a place among other letters, rather than on their peripheries. Punctuation would have an altogether different relationship to our fingers, our muscle memory, our touch, and our bodies. In calling attention to the relationship between punctuation and textual “shift,” I evoke the history of keyboard design and a broader idea of how keyboard design might impact the very ways in which punctuation signifies — in line with the phenomenological inquiry Vilém Flusser instantiated.
Figure 7. Shift key
More directly, however, this chapter is concerned with examining the intersection of textual shift and the period in relation to digital media. This most unassuming of inscriptions, when we stop to think about its varied uses, indexes changed signifying practices that correspond with networked computing’s digitally mediated communications. One can, in effect, “connect the dots” to narrate the history of digital media as a series of periods, dots, and points. From its typographical redefinition as an organizational structure for Internet protocol in domain names in the 1980s, through the dot-com boom and collapse in the 1990s and early 2000s, to the period-as-decimal-point that borrows from software versioning to mark different phases of the Web and speculative discourses in the early 2010s about what Web 3.0 is and could be, the dot inscribes the spirit of digital culture. This chapter unpacks the aesthetics, ideologies, logics, and politics of this punctuational inscription. In more than just a trivial sense, the lesson that emerges from this tale is that the period periodizes the cultural history of the Internet, differentiating its historical phases.
The very term Web 2.0, Tim O’Reilly has claimed, originated as a rehabilitative response to dominant sentiments about the dot-com era ending. Recalling the conference brainstorming session where the term was born, he explains the participants’ impression that “far from having ‘crashed,’ the web was more important than ever, with exciting new applications and sites popping up with surprising regularity.”2 By evoking the software versioning strategies of numbers with decimal points, Web 2.0, and the corresponding retroactive categorizing of the Internet’s prior phase as 1.0, was intended to be a corrective to discourses about the Internet becoming history that followed the dot-com collapse.
One could read the shift from the dot to the point, then, as the new media industry reframing the Internet as software that is, depending upon one’s perspective, planned-to-be-obsolete or planned-to-be-improved. The dot’s continuity but shift in both phases is worth further reflection too: the use of the mark suggests a way of conceptualizing the Internet that is more syntactic than semantic, more mathematical than humanistic, but also elemental and basic.
Conversely, attention to this punctuation mark also opens up an alternate — surface — side of this history that lies not in computer code and discourses about new media but in the very shifting mechanics of human languages that have accompanied the uses of proliferating digital technologies. For example, today in text messages and online conversations we commonly drop periods that would normally and formally belong at the ends of sentences in print. Language pundit Ben Yagoda discusses this phenomenon in a New York Times op-ed. He writes, “My 21-year-old daughter once criticized my habit of ending text-message sentences with a period. For a piece of information delivered without prejudice, she said, you don’t need any punctuation at the end (“Movie starts at 6”). An exclamation point is minimally acceptable enthusiasm (“See you there!”). But a period just comes off as sarcastic (“Good job on the dishes.”).”3 Another critic, Ben Crair, observes that the presence of a period in text messaging tends to be read as indicating anger, writing that “digital communications are turning it into something more aggressive,” lending the sentence a gratuitous finality.4 Indeed, one could say that as the period shifts into the middle of web addresses, it tends to shift out of the sentences of short-form communication, where the mark is increasingly viewed as unnecessary to convey information quickly.
While the mark continues to possess a key organizational function, the content it organizes has shifted. It no longer plays a strict role in sentence-level communication but takes on a looser role, and now alongside this older role it adopts an additional set of roles in organizing computer-to-computer communication. In other words, the period’s functions loosen in natural languages and develop new roles in machine languages. Both sides of the period’s roles in digital culture — in natural and machine languages — suggest that it no longer regularly inscribes and signifies finality, but rather a set of qualities that could be better understood as ongoing, architectural, and less conclusive. Viewed from this perspective, the period reflects a larger shift that displaces the priority of the semantic in favor of the mediated epistemological infrastructures that channel the textual practices and protocols of our postprint, digital era.
N. Katherine Hayles has been one of the critics to take on the greatest interest in these changes and to most articulately argue for the need for cross-disciplinary investigations of them. In My Mother Was a Computer she writes,
Now that the information age is well advanced, we urgently need nuanced analyses of the overlaps and discontinuities of code with the legacy systems of speech and writing, so that we can understand how processes of signification change when speech and writing are coded into binary digits. Although speech and writing issuing from programmed media may still be recognizable as spoken utterances and print documents, they do not emerge unchanged by the encounter with code. Nor is the effect of code limited to individual texts. In a broader sense, our understanding of speech and writing in general is deeply influenced by the pervasive use of code (my deliberate situating of them as legacy systems above is intended as a provocation to suggest the perceptual shifts underway).5
Hayles argues that beyond generating new frameworks for approaching new media textualities, we must also specify how the language of computer code bears continuities with the distinct historical textual regimes of speech and writing that have been elaborately theorized by semioticians and literary historians — and the ways machine languages present undeniable, significant ruptures with them. But in her formulation, she also importantly acknowledges that the effects of code upon language extend beyond literal inscriptions of code. The “broader sense” of which she speaks suggests that code is part of, even exemplary of, a larger shift in textual logics and systems that cuts to the core of today’s communication practices and that is reconditioning worldviews (not deterministically or unilaterally, but reconditioning nonetheless). Punctuation marks in particular, I argue here, map paths upon which one can examine the nature of these changes, signaled by the term shift.
The route this chapter takes to map this shift begins with some considerations of the cultural logic of the period, collecting examples and comparative questions from across a range of sites and discourses in visual culture, politics, digital etiquette guides, and literary criticism. Once this logic is mapped, I turn to two analyses that stage two sides of the typographical period’s cultural logics and textual shift across the information age. The first, taking us back to the 1980s, will look at the ways in which the mark was in effect redefined as a “dot” by computer programmers in Internet protocol documents. This analysis, so to speak, takes us “inside” the logic of the period’s textual shift into digitality, considering its literal redefinitions, infrastructural transformations, and new role in computing. By contrast, I juxtapose this with an analysis that repositions us to an “outside” of sorts to these logics. Moving forward in time two decades, yet to an older media form, I explore a perhaps unexpected text, Spike Jonze’s 2002 film Adaptation. I argue that the interpretation of this film — a film that has generated endless interpretations — is enriched by situating it in a different phase in digital culture marked by the dot-com frenzy. The title’s little-noticed period in fact offers what could be taken to be a synthetically symptomatic reading of the film’s significance, making sense of it in its historical context when digital dreams saturated public discourses and the national imagination. The chapter concludes by considering periodization as a type of knowledge work, asking how the typographical period’s textual shift maps onto notions of digital periodizations more broadly.
Zeroing In on the Period’s Cultural Logic
The period, off to the side on our typing interfaces, with its no-frills simplicity and ubiquity, might not seem worthy of much fuss. It sits firmly but quietly in the middle of a holy trinity of sentence-ending marks. On its one side rests the uncertainty of the question mark and on its other side one could locate the overpowering certainty of the exclamation mark. These two other terminal points tend to elicit stronger aesthetic convictions in our cultural imaginations than their neighbor in the middle — soliciting the disdain of everyone from F. Scott Fitzgerald to the bloggers of Excessive Exclamation!!, a website devoted to visually documenting various cultural artifacts and signs that illustrate a persistent overuse of exclamation marks.6 David Shipley and Will Schwalbe make sense of this trend in their digital etiquette book, Send: Why People Email So Badly and How to Do It Better. They write:
Exclamation points can instantly infuse electronic communication with human warmth. “Thanks!!!!” is way friendlier than “Thanks.” And “Hooray!!!!!” is more celebratory than “Hooray.” Because email is without affect, it has a dulling quality that almost necessitates kicking everything up a notch just to bring it where it would normally be. . . . The exclamation point is a lazy but effective way to combat email’s essential lack of tone. “I’ll see you at the conference” is a simple statement of fact. “I’ll see you at the conference!” lets your fellow conferee know that you’re excited and pleased about the event.7
Given that one of punctuation’s conventional uses is to convey a writer’s tone to a reader, communication in digital contexts would seem to particularly need punctuation to convey tone. The marks help recipients know how to read messages in media whose users often lament not being able to “read” each other’s moods, finding that with no voice to carry them, tones are easily misinterpreted. This offers an explanation for the recent pervasiveness of punctuation that explicitly registers affect: not only “excessive exclamation” but also, even more popularly, emoticons.
Figure 8. Excessive Exclamation!! blog
Iconographically, wireless technologies also use the exclamation point to instill panic. For example, when a computer is unable to connect to a network, and the four bars signaling connection strength on a monitor display do not fill, an exclamation mark is superimposed over the empty bars. One of the goals of this book is to call our attention to and reflect on such images, whereby textual inscriptions have come to form a patchwork of iconography throughout the visual culture of digital media, engaging in modified practices of signification. As we asked in the previous chapter in relation to the isolated equal sign removed from mathematical statements: Is the use of punctuation in digital culture as iconography, disentangled from its ties to language, something new, and if so what does it signify?
In contrast to the question mark and exclamation point, the period’s seeming neutrality projects a certain ambivalence, if it projects anything at all. In her witty Atlantic Wire article, “The Imagined Lives of Punctuation Marks,” Jen Doll personifies various marks, writing,
The period is the good-on-paper guy or girl (he/she is unisex really). You’ll never really fall in love, but you’ll appreciate and respect the Period deeply. And you do, at the end of the day, realize in your heart of hearts that you need him or her. Inevitably, however, you’ll cheat on the Period with the Ampersand, Semi-Colon, or possibly the Interrobang. The Period keeps an impeccably clean house and can be relied upon to come and visit you in the hospital. He/she always forgives. Full-stop.8
Yet even this seemingly inviolable inscription can elicit controversy, as in Barack Obama’s 2012 campaign for a second presidential term. The campaign’s slogan was “Forward.” A Wall Street Journal article reports: “The period was subject of a spirited debate as Mr. Obama’s senior advisers and outside consultants spent hours in a conference room at their Chicago campaign headquarters deliberating over the perfect slogan, according to an adviser who was in attendance. Does a period add emphasis? Yes! Does it undermine the sense of the word? Maybe!”9 (The article also illustrates how nearly impossible it is to not play with punctuation when writing about it.) If “forward” was intended to project voters into the future, the period following it, some feared, seemed to halt the word — and the idea it stands for — in the present. Catherine Pages, a Washington, D.C. art director, was quoted in the newspaper as saying, “There’s been some speculation that the period really gives the feeling of something ending rather than beginning.” Invoking the punctuation mark’s “full stop” alias, one of the president’s advisers and former chairman of the Council of Economic Advisers, Austan Goolsbee, explained, “It’s like ‘forward, now stop.’ It could be worse. It could be ‘Forward’ comma.” Linguist George Lakoff chimed in to the conversation, responding to questions regarding whether it was even proper English to include a period after one word, confirming that the single word is indeed a legitimate imperative sentence: “You can look at the period as adding a sense of finality, making a strong statement: Forward. Period. And no more. Whether that’s effective is another question.”
News outlets and critics of Obama noted the slogan’s ties to communist propaganda campaigns by the Soviets that also heavily used the word, but notably, they usually added an exclamation point: “Forward!” The Washington Times, for example, ran a story titled “New Obama Slogan Has Long Ties to Marxism, Socialism.”10 Less enthusiastic and more understated than its Soviet counterparts, Obama’s mark is more ambivalent: it might simultaneously read in conjunction with the “Forward” it follows as opening a conversation or, under the media’s unrelenting microscope, as cutting off possibilities for imagining what that “forward” might mean.
Figure 9. Obama’s 2012 “Forward.” campaign
Figure 10. Soviet wartime poster: “Forward! Victory is close!”
The period, as these illustrative examples drawn from popular culture and literature suggest, is not as definitive or neutral as we might tend to think. Once we spend some time with it, we realize that rather than making a statement it in fact raises more questions than it answers — about how effective our language is in communicating desired meanings, and about what our desired meanings are in the first place. In every writer’s inscription of a period there is a loaded paradox: one is relieved to have completed a sentence, but in this moment of relief one confronts an anxiety that threatens to overwhelm any sense of relief its inscription might have achieved. (Does something come next or have I finished? If something comes next, what is it? Is it someone else’s turn to speak? Or must I come up with something else to say?) Every period, in other words, seems to disguise at least four question marks. In this sense the period inscribes many of the same anxieties over finality that the idea of periodization does for many historians and humanists.
Defining the Dot
Alexander Galloway suggests that the social order of our current digital or postmodern period (descriptors he self-consciously alternates in using) is based on a configuration of control that is represented by the Internet’s style of management — its protocol — established through the circulation of Requests for Comments documents (RFCs) that set standards for computer-to-computer communication. The logic of control set by these protocols, Galloway explains, is characterized by a contradictory pull whereby on one hand Transmission Control Protocol/Internet Protocol (TCP/IP), which transmits data between computers across networks, “radically distributes control into autonomous locales,” while on the other hand, Domain Name System (DNS) protocol, which translates web addresses in natural languages into numerical IP addresses, “focuses control into rigidly defined hierarchies.”11 Thus even though TCP/IP might shape a dominant impression of the Internet as unbound, horizontally sprawling networks whose inorganic logic is impossible to grasp, DNS has the opposite effect, configuring a highly regimented, hierarchical tree structure responsible for facilitating successful communication between computers.
It is here, with the vertical, hierarchical, ordered logic of DNS protocol, where the actual dot of web addresses carries a new procedural function in the computer age. The dots in web addresses divide the chain of control into subdomains, where the label following the last dot on the right (such as com, edu, gov, org, or two-letter country codes like ca, au, fr, uk) is the top-level domain, and the subdomain hierarchy descends from right to left. As Jon Postel and Joyce Reynolds explain in RFC 920, a policy statement on domain requirements, “in the future most of the top level names will be very general categories like ‘government,’ ‘education,’ or ‘commercial.’ ”12 Indeed, it is also in an RFC such as this where the punctuation mark might be understood to be officially though casually renamed and redefined.
Across the various standards-setting documents for the Internet, the “.” character is treated as a noun, not just a mark of punctuation. In fact, it is often punctuated further, surrounded by quotation marks, foregrounding the intentional textuality of the punctuation, as if to call our attention to a mark so small we might otherwise think it is a typo or overlook it. In this context, if spoken aloud, it becomes a word to pronounce, not just a mark that aids the reader’s flow in pronouncing other words. We now write periods in web addresses before we write a website’s top-level domain, which, in Paul Mockapetris’s more technical language in RFC 1034, “mark the boundary between hierarchy levels” of “name spaces.”13 Nam June Paik’s collaborator, net activist, and Name.Space’s founder Paul Garrin perhaps best articulates the character’s newly achieved significance: “With the stroke of a delete key, whole countries can be blacked out from the rest of the net. With the ‘.’ centralized, this is easily done. With the ‘.’ decentralized such a deletion is not unilaterally possible. Control the ‘.’ and you control access. Control the content of the ‘.’ and you also control the market.”14 In other words, the architecture of the Internet relies on the character in its organization of web pages and in the chain of commands required to take us where we want to go. Every time we visit a website, we need to use the character to get there. This cannot be said of any other character. Nor can it even be said of the composition of every sentence in human languages: there is always the possibility of ending with a question or exclamation mark, or today, in the increasingly short messages we use to communicate with each other, with nothing at all.
This punctuational shift is perhaps most immediately perceptible via changed terminology: no longer a period, it is now referred to as a “dot.” In RFCs 1034 and 1035 from 1987, a pair of documents widely credited for defining and setting the standards for domain name protocol, Mockapetris defines the mark as such: “When a user needs to type a domain name, the length of each label is omitted and the labels are separated by dots (“.”).”15 Mockapetris’s parenthetical quotation of the character (symmetrically triple punctuation at that) in effect serves to name the mark. One could read it, as similar parenthetical expressions often are supposed to be read, as defining unfamiliar terms for future use in a text. In effect it says: “from here on out the “.” character will be referred to as ‘dot.’ ” One might almost interpret these RFCs as documents that are renaming and redefining punctuation insofar as they imagine and set standards for new textual uses of the characters in the hopes of optimizing the Internet’s future — for technologies that run it and users that operate it.
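The hierarchy Mockapetris describes can be made concrete in a few lines of code. The following minimal sketch (my own illustration, not an implementation from the RFCs, and using a hypothetical hostname) shows how a domain name decomposes at its dots into labels whose authority descends from right to left:

```python
def domain_hierarchy(name: str) -> list[str]:
    """Split a domain name at its dots ("."), per the label syntax of
    RFC 1034/1035, and return the labels ordered from the top-level
    domain downward -- i.e., in descending order of authority."""
    # A trailing dot marks the (usually implicit) root of the DNS tree.
    labels = name.rstrip(".").split(".")   # "www.example.edu" -> ["www", "example", "edu"]
    return list(reversed(labels))          # hierarchy reads right to left

print(domain_hierarchy("www.example.edu"))  # ['edu', 'example', 'www']
```

The reversal is the point: what reads left to right for a human user is resolved right to left by the machine, from top-level domain down to host.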
In their illustrated history of the Internet, Katie Hafner and Matthew Lyon have noted the importance of the particular tone and rhetoric of RFCs. They describe Steve Crocker, the author of the first such document, which was about the “basic ‘handshake’ between two computers,” as an “extremely considerate young man, sensitive to others.” They suggest that his personality inflected the language of the first RFC, written in a bathroom in the middle of the night so as not to disturb Crocker’s housemates. Even the note’s title of “request for comments” was chosen politely to “avoid sounding too declarative.” Hafner and Lyon explain, “The language of the RFC was warm and welcoming. The idea was to promote cooperation, not ego. The fact that Crocker kept his ego out of the first RFC set the style and inspired others to follow suit in the hundreds of friendly and cooperative RFCs that followed.”16 Computer scientist Brian Reid notes about this first document’s rhetorical friendliness, “It is impossible to underestimate the importance of that. I did not feel excluded by a little core of protocol kings.”17 It is worth considering how punctuational thinking metaphorically textures this history. We might recall that punctuation is the part of written language that lends it tone, and specifically, the period is the mark of punctuation that lends writing a declarative tone (ending a phrase, sentence, or thought). Does it not almost seem inevitable, read from this perspective, that these documents attempting to thoughtfully open up an inclusive and evolving dialogue would redefine with special care the inscriptions that tend to lend certainty and finality to our thought?
And if we think the period is declarative, imagine the extent to which early Internet protocol designers might have wanted to avoid the even greater assertiveness of what Walter Ong refers to as the exclamation point’s standard “sense value.”18 Consider for example Crocker’s fellow Internet pioneer Jon Postel’s 1979 memo IEN 116. (IENs, or Internet Experiment Notes, were a shorter-running series of protocol documents modeled after RFCs that Postel edited from 1977 to 1982.) It reads:
It is strongly recommended that the use of host names in programs be consistent for both input and output across all hosts. To promote such consistency of the internet level, the following syntax is specified:
The SYNTAX of names as presented to the user and as entered by the user is:
! NET ! REST
NET is a network name or number as defined in “Assigned Numbers” 
REST is a host name within that network expressed as a character string or as a number. When a number is used, it is expressed in decimal and is prefixed with a sharp sign (e.g., #1234).
Note that this syntax has minimal impact on the allowable character strings for host names within a network. The only restriction is that a REST string cannot begin with an exclamation point (!).
The !NET! may be omitted when specifying a host in the local network. That is “!” indicates the network portion of a name string.19
Without getting bogged down in the details of network infrastructure, one can discern at least two points worth considering for our present purposes. First, Postel explicitly defines here a new role for the exclamation point. And second, this redefinition rests on a fundamental assumption about our expectations about punctuation in natural languages — the term used to refer to languages humans speak and write (which are of course far from “natural”), as opposed to machine languages like code. This assumption is that punctuation is in a sense not as necessary as letters and digits; it is more disposable from language, perceived to have a hierarchically lower value than other types of characters. The exclamation point, in other words, can be redefined precisely because one does not need it to uphold its normal sense value. Domain names can be written without them. In this way, then, one can begin to understand how punctuation registers more precisely and cleanly than other elements of language the textual shift from human languages to machine languages that correspond with the interconnections forged between digital technologies.
On one hand, in a web address the period continues to mark the end of a semantic unit, insofar as we might understand a name space as a semantic unit. In this sense, and in a very real way, the period continues to organize our experience of textualities and communication. On the other hand, the punctuation mark takes on an irrevocably different role — one that is less for aiding speech in the classical sense of punctuation that has been explicated by scholars like M. B. Parkes and Walter Ong. Ong, for instance, writes in his study of the first punctuation marks (the period, comma, and colon) in Elizabethan and Jacobean English that in predominant early uses of all three marks, “the clarification of the syntax is coincidental. The grammarians are interested primarily in the exigencies of breathing.”20 It was only in later medieval writing, after writing came to be culturally valued more than speech, that the period’s function shifted to primarily syntactical clarity. With textuality’s increasingly close relationship with computer languages, the period’s primary sense value evolves again. Indeed, with the dot one detects a shifting role of punctuation that represents a much broader and more significant textual shift that indexes language’s modified role in the digital age. Now, the punctuation mark emerges as an inscription tool to help manage the Internet’s growth in the 1980s.
If the period steadily began to gain recognition as a dot in the 1980s, I will now propel us twenty years forward, to a moment after the dot became a fixture across not only the infrastructure of the circulation of visual culture but across the surfaces of it as well, with the centrality of the dot-com craze in the United States at the end of the 1990s. To settle into this time frame and typographical character, I wish to consider what at first would seem to be an extremely unlikely outlier as an example: a hardly noticed yet ingenious manifestation of the punctuation mark at the end of the title of the film Adaptation. (directed by Spike Jonze, U.S., 2002). While this period fairly consistently appears in official references to the film’s title and throughout its marketing, it is more often than not neglected in casual and even scholarly references to the film, demonstrating just how unnecessary and unobserved punctuation can be. The film, though — which takes writing, and even more specifically the anxiety of writing, so seriously — clearly intends and is enriched by its period. Beyond just encapsulating anxieties about writing and narrative ending, one could read this period as also inscribing a historically specific set of anxieties in the United States about the (momentary) ends of technological enthusiasm and investment, a claim I will move toward here.
Nicolas Cage plays two lead characters: a tormented Charlie Kaufman, an autobiographical version of the film’s real-life screenwriter of the same name (who in the film is fresh from writing Being John Malkovich, a 1999 film Kaufman did write); and his carefree twin brother, Donald, also a screenwriter in the film but with no real-life referent. Charlie, admired for his writing talent and originality, is enlisted to adapt Susan Orlean’s (Meryl Streep) New Yorker story-turned-book The Orchid Thief for the screen. The film focuses on Charlie’s struggle to adapt Orlean’s work. He wants to resist the Hollywood clichés. As he puts it to the studio executive Valerie Thomas (Tilda Swinton), who has solicited his adaptation: “I just don’t want to ruin it by making it a Hollywood thing, you know. Like an orchid heist movie, or something. . . . Or, you know, changing the orchids into poppies and turning it into a movie about drug-running. . . . Why can’t there be a movie simply about flowers?” He continues, “I don’t want to cram in sex or guns or car chases. You know? Or characters, you know, learning profound life lessons. Or growing, or coming to like each other, or overcoming obstacles to succeed in the end.”
If the first two-thirds of the film focuses on Charlie’s neuroses and writer’s block (often conveyed by a recurring voice-over that draws us into his obsessions and afflictions), then the final one-third self-consciously and ironically crams in all that Charlie was trying to keep out. The film unravels, sweeping into its narrative orbit everything its main character wanted to avoid: sex via an unlikely romance between Susan and her orchid-expert muse John Laroche (Chris Cooper); a drug-running scheme that Charlie discovers Laroche is orchestrating in Florida; and a fast-paced car chase to a swamp where Susan and Laroche run Donald and Charlie down, leaving Donald shot dead and Laroche killed by an alligator. And even that most important of Hollywood conventions, a moral: “you are what you love, not what loves you.”
Intricately interweaving and blurring fiction with reality, the film thus asks its viewer to reflect on what adaptation means and entails. Is Jonze’s film ultimately a true adaptation of Orlean’s story? Are its meandering attention to The Orchid Thief and insertion of the Kaufman twins and Hollywood clichés unfaithful to its source, or is it a faithful adaptation precisely insofar as it transposes The Orchid Thief’s own meandering attention to its subject, Laroche? Is the film about the real-life Kaufman and a fictional twin brother, or do both of Cage’s characters represent two competing halves of the same real-life Kaufman torn between maintaining an original, independent vision and selling out to Hollywood? (With screenwriting credits and Oscar nominations for both Charlie and Donald, this was the first time a fictional person was ever nominated for an Academy Award.) Whatever one’s ultimate reading is, Adaptation. is surely, at least partially, about the anxiety of ending.
Adaptation.’s seemingly tacked-on ending has been scrutinized in much of the film’s criticism, leaving spectators uncertain of what to make of it. In his otherwise positive New Yorker review, David Denby representatively writes of his disappointment with the ending:
What then envelops Orlean and Laroche and Charlie (who writes himself into the story) is awful nonsense. Drugs, guns, car crashes, alligators — the movie becomes a complete shambles, and far more desperate than anything conventional filmmakers would fall into. It’s hard to know how to read this mess of an ending. . . . The trouble with experimental comedies is that it’s often impossible to figure out how to end them. But at least this one is intricate fun before it blows itself up.21
While Denby and a significant number of others seem to find the film’s conclusion condescending and alienating, I disagree. It is certainly a joke, and one that we are allowed in on. After the action-packed narrative climax, however, the film in fact does not end. In a smart analysis of Adaptation. (which, despite its thoughtfulness, representatively neglects the title’s punctuation), Joshua Landy considers seven possible interpretations of the movie. He focuses on the significance of the film’s true ending:
Recall, however, that Adaptation closes with a time-lapse sequence of daisies on a meridian, an astonishingly powerful sequence, with rhythms borrowed (appropriately enough) from the Fibonacci series and set to music that ends in a lush, ethereal harmony [The Turtles’ “Happy Together”]. What if this sequence were not just the finale but also the telos of the movie? What if the entire film were simply building up to the daisies on the meridian, indeed making them possible, turning them for the first time into something that can be noticed?
This, I want to claim, is the deep strategy of the film, the seventh and only successful approach, the one that finally brings about a victory for the nonnarrative (the static, the cyclical) over the narrative.22
Landy claims that Adaptation.’s motivating question is how to make a cinematic narrative “simply about flowers,” recalling Cage’s words to Swinton. The film, Landy reasons, delivers an exaggerated, complex, and overstuffed narrative to satiate our desire for narrative beyond any reasonable doubt, so that by the time we see the flowers that close the film, we appreciate them. In the end, the goal — the “deep strategy” — is successful, and the viewer has arrived at a point of being able to notice the flowers and appreciate them for what they are. In a sense, according to this compelling interpretation, the entire rest of the film we have seen until this point, with all its loopholes and fictions, has canceled itself out to finally become a film “simply about flowers.”
Yet one could in fact read past the flowers to the film’s period as offering its ultimate meaning, an added interpretation of a film that invites nested layers of reading. The period, standing for writing’s final mark and an ending that goes unnoticed, inscribes all of the anxieties over finality that the film is about. I have little doubt that the real-life Kaufman, no stranger to including literary devices with multiple layers of meaning in his titles (his directorial debut in 2008 was the acclaimed Synecdoche, New York), is registering in this single dot the many narrative desires, anxieties, and interpretations the film stages. As a mark that ends a sentence but normally not a film title, the period’s placement asks us to think twice about how Adaptation. ends. Out of place and tacked on, does the period resemble or counterbalance the film’s own spectacular denouement? At the same time, could it be just conspicuous enough to catch our attention as the true finishing touch — the point of it all — much like the film’s closing flowers? Indeed, beyond its closing time lapse of flowers, the key to the film could be understood to lie in its period, since it at once evokes the writing process and an ending out of place.
As much as the period might seem to end the conversation and be Adaptation.’s final point, I would not stop there either. We should not forget when this movie was made. This particular punctuation mark was prominent across popular culture in 2002. To read the film in historical context, one might recall that the late 1990s and early 2000s were among other things characterized by great hype over the dot-com boom and crash. Discourses in an increasingly globalizing American society were saturated and undergirded by a hope bordering on greed invested in the futures of new Internet companies, followed by a quick realization that such a creative, economic, and expanding technological utopia could not be sustained — a historical lesson that the contemporary wave of excitement about social media seems not to have learned. As Andrew Ross puts it in his study of Silicon Alley workplaces, “In next to no time, the Internet gold rush story sucked in all the available currents of public attention.”23 Geert Lovink also emphasizes the pervasiveness of dot-com discourses during this time: “For a short while, around 1998-2000, the rhetoric of the New Economy was hot and glamorous; Internet reporting was everywhere, from the entertainment sections to media pages and IT supplements.”24 A symptomatic, situated analysis of Adaptation. can be enriched by understanding the movie, particularly the attitudes and actions of the twins portrayed by Cage, as deeply formed by and responsive to the cultural ideologies of this particular historical moment’s “dotcommania,” to borrow Lovink’s phrase for the phase.
As the new millennium approached, news outlets and analysts also talked with frequency about Y2K as though it would bring about apocalypse. Our computer systems, and by extension our networked world, were based on programs that stored and processed years using only their last two digits. The logic underlying the panic had it that the rollover from “99” to “00” would be misread as a leap back to 1900 and throw things — including life-sustaining systems — out of whack. January 1 came and passed, and everything was fine. But in only a few months’ time, with these anxieties hardly settled, there was new reason for panic. The new technology-based economy (often referred to just as “the New Economy”) unraveled: many Internet start-up companies turned out to be making less money than they had reported; many were fined by the government for misleading the public; many declared bankruptcy; and many employees in the industry were left without jobs. The riskier nature of the work and the career decisions dot-commers had made — often abandoning high-paying, stable jobs, a pattern that Gina Neff, in her ethnographic work with New York City’s Silicon Alley entrepreneurs, argues characterizes a broader shift in U.S. economic history at this time — proved to have their consequences.25
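The two-digit arithmetic behind the panic can be sketched in a few lines of Python — a hypothetical illustration of the general pattern, not code from any actual legacy system:

```python
# Legacy-style storage: keep only a year's last two digits.
def two_digit_year(year):
    return year % 100

# A record created in 1999 and checked in 2000:
issued = two_digit_year(1999)   # stored as 99
checked = two_digit_year(2000)  # stored as 0

# Elapsed time computed on the truncated values goes badly wrong:
elapsed = checked - issued      # -99 years instead of 1
```

Any system that compared, sorted, or subtracted such truncated years risked treating 2000 as if it preceded 1900 — which is what fueled the fears about billing, scheduling, and safety-critical software.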
The dot-com crash that began in March 2000 was soon followed and accompanied by 2001’s September 11 terrorist attacks in New York City. To reappropriate a remark from Walter Benjamin’s poetic ruminations, these occurrences had a cumulative effect of “piling wreckage upon wreckage.”26 A large chunk of the country’s strong sense of invincibility and inflated prosperity was revealed to have been built on false promises and beliefs. People lost jobs and were forced to redirect career paths, and a nation that imagined itself as a first-world safe haven from terrorist attacks lost thousands of its citizens in a symbolic act of violence, leading the nation to embark on a deeply ambiguous and one-sided “War on Terror.” Though these events did not lead to the radical reevaluations of the national dreams they were bound up with that one might have hoped for, they, and the ways in which they were retold in the media, had all the makings of a sensational Hollywood ending.
Adaptation.’s filming took place between March and June 2001, and its theatrical release was in late 2002. The dot-com boom and bust thus historically coincided with the film’s development and production: Kaufman had written two drafts by September 1999 (close to the height of the dot-com boom) and completed a third in November 2000 (just months after the crash had begun). I note this context to recall the zeitgeist of the film’s period of production, characterized by the dot-com ideologies of speculation, greed, big risks, and big hopes — and the sudden big disappointments — that immediately preceded it. Adaptation.’s characters are explicitly working in Hollywood, a media industry that scholars such as Neff in Venture Labor point out paralleled the new media industry at the time. In multiple scenes we see characters participating in the kind of social networking, meetings, technical seminars, and conversations about innovation and creativity that were also so constitutive of and influenced by the dot-com bubble.
Though the dot-com context is not obviously legible in Jonze’s film, its environment and its ideologies are subtly yet fundamentally intertwined with the film’s significance. Moreover, there are markers of the period within the movie that do make this historical moment legible. Beyond its characters’ enmeshment in the media industry milieu and the frequent images we see of Donald working on screenplays at his computer (by contrast Charlie — the quality, original writer of the two brothers — of course prefers a typewriter), Internet entrepreneurialism does appear within the film’s narrative, even planting the seed for the movie’s unraveling. In one blatant departure from The Orchid Thief, a few years have passed, and we see Susan drunk and, as a version of the script indicates, “dolled-up” in a hotel room.27 She calls Laroche. We see him in a room that had not been described in the 1999 scripts but is described in the November 2000 version as a “little boy’s bedroom,” “now filled with computer equipment. Posters of naked women adorn the walls” (76). After Susan asks Laroche how it’s going, he updates her:
Great! I’m training myself on the Internet. It’s fascinating. I’m doing pornography. It’s amazing how much these suckers will pay for photographs of chicks. And it doesn’t matter if they’re fat or ugly or what. (76)
Laroche’s new self-trained Internet pornography start-up is a startling shift in trajectory from his intense, obsessive involvement with orchids. It is both symptomatic of the time and an ironic extension of the film’s running reminders of the connections between orchids and sexuality: as if web porn is the technological version of its counterpart in the natural world, orchid collecting, where both hobbies are driven by obsessions, arousals, and specialized tastes. Laroche’s new undertaking, though, is also part and parcel of the film’s deeper theme and narrative ending of selling out, which we see enacted as the fundamental conflict of interest between Charlie and Donald. Laroche’s new embrace of Internet porn thus stands for the film’s “bad turn,” becoming a pivotal example of exactly those Hollywood clichés of sex and money with which Charlie did not want to sensationalize his script.
What I am suggesting, which I want to neither overemphasize nor underemphasize, is that put in context the bad narrative Charlie tries to avoid in the film is fundamentally bound up with the (generally unattractive) ideologies of dot-com entrepreneurialism that were dominant throughout American society at the time the film was in development and production. In a sense, too, this precisely furnishes the additional layer to and crucial difference with The Orchid Thief that Kaufman has contributed in his own adaptation of the source material. The process of adaptation itself — as we see in a variety of postmodern film adaptations, from Amy Heckerling’s spin on Emma with Clueless (U.S., 1995) to Baz Luhrmann’s updated cinematic treatments of literary classics, Romeo + Juliet (U.S., 1996) and The Great Gatsby (Australia/U.S., 2013) — is about modifying texts in current contexts, inviting spectators and readers to think about the continued yet modified relevance of older questions and concerns. The unusual period of Adaptation.’s title thus propels us to consider how the film as an adaptation is punctuated by the new historical context in which it was created. And in turn, the added narrative layers — the film’s self-reflexive anxieties over writing, ending, and selling out — are imbricated in a critique of the dominant social and cultural values that are encapsulated in the dot of the dot-com and open a window into an American psyche witnessing the sharp burst of the bubble of its presumed technological and economic prosperity. Read against this backdrop, the film’s deliberate period also provides an opportunity to consider the film’s persistent resistance to closure.
A punctuation mark that normally closes might here be taken as resisting the dominant “com” of the time that it accompanied (which, one might remember too, stands for “commercial”) — reminding us that this film is ultimately a challenging, self-reflexive experiment in narrative storytelling, not commercial fare.
We thus note how the film’s punctuation mark ties back to our consideration of the period’s textual shift in digital contexts. If, as we observed, the period today is arguably no longer primarily used to end sentences, then Adaptation.’s period performs this change, understatedly mediating the shifting and loosening of its epistemological certainty and the range of new roles it now has in the digital “period.”
Figure 11. Nicolas Cage staring at his typewriter, as tormented writer Charlie Kaufman in Spike Jonze’s Adaptation. (2002)
Periodizing the Period
In her book on quotation marks, Marjorie Garber notes that the pun, a linguistic shifter that in different utterances signals different referents, can be a way of “getting at the radical capacities to mean their various and often contradictory meanings.” She claims, “the mode of argument that takes words seriously — and takes them most seriously when confronted with their capacity to and for play — is an aspect of rhetorical criticism that has historically frightened some rationalist readers, by confronting them with their poets’ — or their own — unexpected inner thoughts.”28 With the power of word association in mind, and without hinging a full-blown rational argument on one, there is a line of thought, or perhaps a path of dots, to be pursued in the connections between the typographical period and the notion of the period as historical era.
One of the questions the preceding interpretation of Adaptation. surely invites one to pursue is what periodization provides as a reading strategy. How does a consideration of a historical context’s ideological conditions complement an understanding of stylistic operations within a text from that historical period — and vice versa? How does the visual culture of a punctuation mark seem to particularly mediate digital ideologies and aesthetics?
Similar to reflecting on the smaller-scale typographical period, attending to the idea of periodization encourages us to debate the stakes of using one word over another. It forces one to organize one’s thought, to determine how to divide and combine units of thought to express them most effectively to others, and to decide where they begin and end, change or continue. Italian philosopher Benedetto Croce writes, “To think history is certainly to divide it into periods, because thought is organism, dialectic, drama, and as such has its periods, its beginning, its middle, and its end, and all the other ideal pauses that a drama implies and demands. But those pauses are ideal and therefore inseparable from thought, with which they are one as the shadow is one with the body, silence with the sound.”29 Croce’s remarks invoke the conceptual intimacy between punctuation and periodization in referring to the concept of pause — one of punctuation’s primary functions as it has been identified by those who have written about the subject, such as M. B. Parkes in Pause and Effect. Much as Croce says of periodization, punctuation is a structural necessity, a “silence with the sound.”
Or as Marshall Brown puts it, “Without categories — such as periods — there can be no thought and no transcendence beyond mere fact toward understanding. Periods trouble our quiet so as to bring history to life.” Summarizing their role in scholarly inquiry, he writes, “We cannot rest statically in periods, but we cannot rest at all without them.”30 In other words, we need to periodize to order thought and make sense of history, but at the same time, we need to resist periodizations. David Perkins claims periods are “necessary fictions. . . . We require the concept of a unified period in order to deny it, and thus make apparent the particularity, local difference, heterogeneity, fluctuation, discontinuity, and strife that are now our preferred categories for understanding any moment of the past.”31 After the insights and demands generated by poststructuralist and postmodernist thought especially (whose very names, dependent upon prefixes, indicate a need to hold on to but move beyond master categories), the unifying perspectives that periodization threatens to impose become opportunities for counterreadings.
Fredric Jameson defends the importance of this kind of idea of periodization for the “exceptions,” or what one might think of as the counternarratives, it helps locate. (Adaptation., through its metanarrativity, must surely count as one of the most intricate counternarratives — counter to dot-commania, as I have read it at least — of the early 2000s.) Jameson writes:
[T]o those who think that cultural periodization implies some massive kinship and homogeneity or identity within a given period, it may quickly be replied that it is surely only against a certain conception of what is historically dominant or hegemonic that the full value of the exception . . . can be assessed. Here, in any case, the “period” in question is understood not as some omnipresent and uniform shared style or way of thinking and acting, but rather as the sharing of a common objective situation, to which a whole range of varied responses and creative innovations is then possible, but always within that situation’s structural limits.32
The emergence of digital media is deeply embedded in a variety of interrelated categories that have been used to describe periodizing shifts — whether in terms of philosophy or cultural production (postmodernism), social order (network society, control society), epistemologies of materiality (the information age), or technology (computer age), which are echoed in higher education with the recent move toward an interdisciplinary “digital humanities.”
In calling attention to digital media’s place in periodizing efforts, I am closely aligned with a range of models that posit a historical tripartite structure whose terms, again, differ depending on the overarching goals and contexts of such schema. For scholars of textuality, from Vilém Flusser to N. Katherine Hayles, the major regimes that periodize history are speech, writing, and computation.33 This schema more or less also forms the backdrop for Brian Rotman’s philosophical consideration of human subjectivity’s increasingly distributed nature as computational media shift textual systems away from their centuries-long emphasis on writing. Rotman situates the beginning of the end of the “alphabet’s textual domination of Western culture” with the introduction of photographic “new media” in the nineteenth century, which began to replace alphabetic representations of information and ideas with visual ones. But, Rotman claims, “this dethroning of the alphabetic text is now entering a new, more radical phase brought about by technologies of the virtual and networked media whose effects go beyond the mere appropriation and upstaging of alphabetic functionality. Not only does digital binary code extend the alphabetic principle to its abstract limit — an alphabet of two letters, 0 and 1, whose words spell out numbers — but the text itself has become an object manipulated within computational protocols foreign to it.”34
For Hayles and Rotman, then, this new textual situation is about much more than just code. Indeed, it is not code that they are analyzing — it is more accurately a collection of literary texts, critical theories, and historical and scientific discourses. In effect they urge their readers to reckon with code as a form of textual unconscious in contemporary life. What does it mean that the languages we encounter on computer screens undergo series of mostly invisible translations in coded machine languages? How do knowledge and experiences of this layered effect of machine translations — what Rotman would likely call “ghost effects” — affect, to quote the title of Hayles’s recent book, “how we think”?35 The stakes of this, according to Rotman, are huge. To vulgarly summarize a very complicated argument, he suggests that alphabetic text and its accompanying possibility of imagining disembodiment essentially invented God, and that the dismantling of alphabetic text brought about by digital technologies will have profound consequences for Western monotheism.
While Rotman and Hayles both offer important insights and provocations about human subjectivity, cognition, literacy, education, mathematics, religion, and technology, what I wish to emphasize by calling attention to their scholarship is in many ways a much more basic and obvious component of their more elaborate theorizations. I believe that their most important contributions are ambitious periodizing strategies, the significance of which is only further supported by how wide-ranging the spheres of thought they connect are and how distributed the consequences of their analyses prove to be. They in effect claim that digital media represent nothing less than a radically new phase in human history, with new epistemological configurations, ideas of the self, formations and relations of bodies, technological infrastructures of communication, and habits of living.
This is also of course the overarching conceptual periodization from which this book’s notion of textual shift departs, and I mobilize it to make sense of the changed nature of language systems, practices, and visual culture in the digital age. It allows us to ask what set of qualities characterizes contemporary textuality. The case of the period, this chapter’s starting point, demonstrates that as computing technologies have come to trump the printed page as our primary medium of communication, textual protocol shifts.
One key factor that makes this trend possible and also characterizes it is textuality’s increased mobility. By mobility I refer to a range of possible movements, only some of which have been explored in this chapter. From text messages to e-mails, from computer to computer, from one media form to another, from South America to Asia, textuality moves across platforms and locales — in short, what I would refer to as contexts — with increasing ease. With this, the range of functions that can be assigned to a given textual inscription expands and is redefined by various users of technologies in different contexts.
At the heart of this new textual period is the networked computer, the technology that makes language’s mobility possible. How might we interpret the fact that in this period, the period itself seems to take over society’s visual iconography and infrastructural logic? One way to understand the newfound significance of the period, and indeed of punctuation more generally, is in this context of textual travel and mobility. It might be helpful to think of an analogy. When a person travels, it is worth her while to pack for her destination smartly. She does not want to take more baggage than she will need, but she also wants to be prepared for the range of possible weather conditions and activities she might engage in. Versatility and lightness are qualities to strive for. If textuality is traveling as well, these same properties make punctuation marks attractive accessories to pack for the trip, so to speak. Punctuation marks, smaller than letters and certainly smaller than words and sentences, make communication efficient, because they are “lightweight” and can quickly register a mood or tone, but at the same time in many ways they represent what we might think of as floating signifiers, whose meanings and functions can be flexibly adapted for various desired effects.
It is precisely this versatility and lightness that make the period, for example, fit to be widely adapted throughout digital discourses and Internet protocol. Even though it has historically been attached to specific functions in writing, at the same time it is on one level only a dot, so basic and portable that it seems nearly impossible not to be reappropriated. Moreover, when taken to a reasonable level of abstraction, the period’s historical function has been syntactical clarity, and thus it makes sense that it continues to hold on to this function in new ways and for new types of clarity. These include the syntactical, as in the organization of chains of movement in subdomain names and commands, but also functions that could be said to offer periodizing clarity in cultural discourse, as in the decimal point that marks the phases of the Web’s historical progression from 1.0 to 2.0.
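The syntactical role the dot plays in domain names can be made concrete with a short sketch in Python (the hostname here uses the reserved example domain, purely for illustration):

```python
# The dots delimit a hostname's labels.
host = "mail.example.com"
labels = host.split(".")            # ['mail', 'example', 'com']

# The DNS hierarchy reads right to left: top-level domain first.
hierarchy = list(reversed(labels))  # ['com', 'example', 'mail']
```

Each dot thus performs exactly the clarifying work described above, partitioning a chain of movement through the name hierarchy into discrete, ordered units.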
Beyond such textual portability, the period’s specific periodizing capacities for the digital should also be thought through in terms of its visual aesthetic — of smallness and roundness — what Peter Sloterdijk might classify as a “microsphere” or bubble.36 For Sloterdijk, the sphere is a form that represents nothing short of the human condition, our neuroses and our most important questions, from the mother’s womb to the planet Earth. He calls for “spherology,” a mode of inquiry whose aim “is simply to retrace the formations of shapes among simple immanences that appear in human (and extra-human) systems of order — whether as organizations of archaic intimacy, as the spatial design of primitive peoples, or as the theological-cosmological self-interpretation of traditional empires.”37 Though Sloterdijk does not discuss textuality, his formulation applies to this context. If we take writing to be one of our most important “systems of order,” the period would in Sloterdijk’s terms be the textual system’s microsphere par excellence. Moreover, discourses about the “dot-com bubble” are provocative to imagine in this context: the term is almost redundantly spherical, or perhaps concentrically circular, describing a round enclosure in which cultural activity had its own set of rules and practices, which then burst when pressures and energies from outside proved too strong.
Understanding the period as a microsphere draws it into an even wider context of philosophical problems and intellectual history that might well help provide one final explanation for its pervasiveness in the digital age. If, as so many critics have observed, language has been reduced to zeros and ones, to what extent is the period a closed-in zero-sphere, representing the ultimate reduction of language from complex expressions to two digits to a single punctuation mark? Viewed from this perspective, the period represents the end of one system of textuality, serving as its ultimate periodizing mark. Or, alternately and perhaps even more provocatively, to what extent does the dot’s roundness stand in for a larger sphere, a big world that is now connected, a world whose scale has irrevocably changed and been reconceptualized beyond what other previous periods ever imagined possible? To what extent does the dot compress the large, unwieldy global sphere — what Timothy Morton might call a “hyperobject” — and make it manageable, small, almost invisible?38 In this sense the digital dot might be better viewed as paired not with zero but with another equally pervasive sphere across the visual iconography of digital media — the world. The period, a self-enclosed sphere, emerges as both a synecdoche for the world but also a more manageable version of it, lending it order and clarity at a time when the globe seems to be spiraling out of control — and when global warming, nuclear terrorism, and other threats on an unforeseen scale seem to be threatening the end of the world, with a finality that exceeds even Adaptation.’s crazy ending.
Figure 12. Microsphere/macrosphere conflation? Happy face planet on the cover of WIRED (July 1997)
Figure 13. Poster for Startup.com, a 2001 documentary about the rise and fall of a dot-com company, where the period becomes a boulder, a suggestive graphic representation of dot-com anxieties
Dotting the I
We might observe that this chapter began with one “holy trinity” in which the period belongs — alongside the other terminal punctuation marks, the question mark and the exclamation mark — and ends with another, where the period stands between the small-scale nothingness of the zero and the global sphere in which we are housed. The discussion also began with reference to a historical decision about typewriter design in 1873, an example where the period was quite literally shifted to the side in technological design, and I have ended it by drawing us in closer to today’s concerns, launching us into the digital “period.”
For the sake of bringing this chapter full circle, then, but also perhaps refracting it and even spinning it off in a new direction, I will close in not on zero or infinity, but on a personal aside about my own process of typing this manuscript, which has been accompanied by the added dimension of machine intelligence. In writing this chapter, Microsoft Word’s autocorrect feature has continually capitalized words following dots I write midsentence, assuming I intend to punctuate the sentence’s end with a period. In this sense, the shift function has now been internalized. The computer automatically performs this “correction” without responding to my own keyboard command, or lack thereof. Yet I have in fact often intended to use the character as a noun (much as the language of the RFCs did), or as part of a noun (as in the case of Adaptation.’s title), where in many cases, after the punctuation, my sentence was supposed to continue flowing.
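Word’s actual behavior is proprietary, but the rule I keep fighting can be approximated by a naive sketch in Python: capitalize the first letter that follows any period, regardless of whether the period actually ends a sentence.

```python
def naive_autocapitalize(text):
    """Capitalize the first letter after every period, as an
    aggressive autocorrect might (a simplification, not Word's code)."""
    out = []
    cap_next = False
    for ch in text:
        if cap_next and ch.isalpha():
            out.append(ch.upper())
            cap_next = False
        else:
            out.append(ch)
            if ch == ".":
                cap_next = True
    return "".join(out)

# The rule miscorrects a midsentence dot used as part of a noun:
naive_autocapitalize("the title Adaptation. ends with a period")
# -> "the title Adaptation. Ends with a period"
```

Because the rule has no way of knowing whether a dot is terminal punctuation or part of a name, it “corrects” precisely the cases this chapter has been tracing.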
What does it mean that writing is now a battle — or I could be kinder and say collaboration — with a machine? As I write, I not only have to be mindful of managing what I mean and how I express myself, but I rely on Microsoft Word to fix my spelling when I make a clumsy mistake, and I also have to watch out for the mistakes that it makes on my behalf. To my mind, the benefits of its corrections and the drawbacks of its errors, on one level, like the narrative loopholes of Adaptation., cancel each other out. The effect of this, thus, is not so much that machine intelligence makes one’s writing qualitatively better, but it makes writing itself a more layered process, with a built-in system of checks and balances between the human and the machine. The nature of this system and its distinctions from prior systems of writing will, as we are drawn deeper into the computer age, need to be more fully taken into consideration. One site where this will be especially important is in writing instruction. How will we cultivate in students the common sense, skills, and vigilance required to know when we are right and artificial intelligence wrong? Applying N. Katherine Hayles’s elucidation of three different reading strategies — close reading, hyper reading, and machine reading, to be used in conjunction with each other — seems like a promising direction forward.39 But adopting such strategies on a base level and refining different forms of writing and reading habits will certainly pose ongoing pedagogical questions, which will be scholars’ and teachers’ responsibilities to tackle as our writing technologies and habits evolve.