Anarchy, Status Updates, and Utopia

Social software has a power problem. Actually, it has two. The first is technical. Unlike the rule of law, the rule of software is simple and brutal: whoever controls the software makes the rules. And if power corrupts, then automatic power corrupts automatically. Facebook can drop you down the memory hole; PayPal can garnish your pay. These sovereigns of software have absolute and dictatorial control over their domains.

Is it possible to create online spaces without technical power? It is not, because of social software's second power problem. Behind technical power there is also social power. Whenever people come together through software, they must agree on which software they will use. That agreement vests technical power in whoever controls the software. Social software cannot be completely free of coercion, not without ceasing to be social, or ceasing to be software.

Rule-of-law values are worth defending in the age of software empires, but they cannot be fully embedded in the software itself. Any technical design can always be changed through an exercise of social power. Software can help by making this coercion more obvious, or by requiring more people to join together in it, but it alone cannot fully protect users. Whatever limits make social software humane, fair, and free will have to come from somewhere else. They will have to come from We the Users.


I. Technical Power

The Fifth Amendment provides that "No person shall . . . be deprived of life, liberty, or property, without due process of law." 2 But the Fifth Amendment doesn't apply to social software. Just ask Marc Bragg. He was a player in Second Life, 3 where almost anything you can imagine can be brought to life with a little sculpting, a little painting, and a little programming. 4 Like many other players, Bragg wanted a parcel of virtual land to make his home. On April 30, 2006, he won a land auction, paying $300 for a parcel named Taessot. 5 Two days later, though, Bragg received a warning from Second Life's administrators, alleging fraud in the auction. 6 At this point, a normal government could have taken him to court to set the sale aside. But Second Life doesn't have a normal government. The one it has rules by software. Second Life's administrators went into its database of land titles and took Marc Bragg's name off the records for Taessot, instantly ousting him from possession and locking him out. 7 And then, as if to further prove who was boss, Second Life took away all his other land as well, and sold it at auction to the highest bidder. 8 So much for "property" and "due process of law."

Or ask Vi Hart, a "recreational mathemusician" who creates stop-motion videos that mix obsessive doodling with whimsical soundtracks to explore mathematics in an inviting, hands-on way. 9 She posted her videos to YouTube, where she has over 800,000 subscribers and millions of views. 10 In November 2013, Google merged its Google+ social network with YouTube, requiring a Google+ account to post comments on YouTube. 11 The move encouraged more people to use the struggling Google+, but it also displaced fans' voices in favor of "popular G+ users . . . a very small segment of mostly male, professional, egotistical, entitled people" who leave distracting and harassing comments.
12 This put Vi Hart, and everyone like her, to an unpleasant choice: start using Google+ and its incoming wave of haters, or give up on YouTube entirely. As she explained: "I invested so much into my YouTube channel, and they're taking that investment and threatening to throw it away if I don't also start investing in Google+. No thank you Google, but you've already made me regret investing so much into you the first time. Do you really think I'm going to do it again? . . . Making huge forced changes to a platform is problematic for people whose livelihood depends on certain things being a certain way. I would not recommend making YouTube or Google+ a large part of your business . . . ." 13

Or take Mailpile, a project to create a "modern, fast webmail client with user-friendly encryption and privacy features." 14 It carried out an online fundraiser, bringing in $163,192 and 54 Bitcoins. 15 But $45,000 of those donations came through PayPal, 16 which froze the money, refusing to let Mailpile have it until the developers provided "an itemized budget and your development goal dates for your project." 17 Only after a wave of online bad publicity did PayPal release the funds. 18 PayPal has a "long history of similar things"; 19 it has blocked fundraisers for WikiLeaks 20 and Bradley Manning. 21

This is not the place to reargue these cases. Indeed, even calling them "cases" is a misnomer. In the first instance, before Bragg, Hart, and Mailpile were deprived of their rights and privileges within Second Life, YouTube, and PayPal, there was no litigation at all. The companies simply modified the software on which their platforms ran, and that was it: Bragg's land was gone, Hart was stuck with Google+ boors, Mailpile's money was inaccessible.
They were all victims of technical power: the authority exercised over any software-mediated space by the person or entity that controls the software. Code is law, and the platform operator controls the code. A few tweaks to settings in a database can banish a user, silence her, or confiscate all her digital goods. Virtual worlds, social networks, and payment processors hold technical power. So do Internet service providers ("ISPs") such as Comcast, web hosts such as Tumblr, and the millions of other middlemen who run the systems on which the Internet runs.
Technical power gives rise to a distinctive anxiety: the God problem. The exercise of legal power, no matter how dictatorial, is restrained by the fact that any legal threats must be carried out by humans, fallible humans. They can be bribed, persuaded, seduced, overwhelmed, or distracted. Legal power can be resisted, passively or violently. But technical power cannot: those who wield it are as gods. PayPal changed a status field in the database entry corresponding to Mailpile's account, and that was that. Mailpile's money was beyond its reach. Google combined Google+ and YouTube overnight, without so much as a hearing or a notice in the Federal Register. Second Life foreclosed on Taessot and ousted Bragg from possession with a few keystrokes. Mortgage lenders can only dream of such remedies. These software monarchs have metaphysical jurisdiction over their domains: absolute control over what happens, over what exists. 22
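The mechanics of such an exercise of technical power can be startlingly mundane. The following is a minimal sketch; the database schema, account name, and balance are invented for illustration and do not reflect any real platform's systems:

```python
import sqlite3

# Hypothetical illustration: a platform's entire "due process" can be
# a single row update. All names and values here are invented.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE accounts (user TEXT PRIMARY KEY, status TEXT, balance REAL)")
db.execute("INSERT INTO accounts VALUES ('mailpile', 'active', 45000.0)")

# The operator "freezes" the account: no hearing, no appeal, one statement.
db.execute("UPDATE accounts SET status = 'frozen' WHERE user = 'mailpile'")

status, balance = db.execute(
    "SELECT status, balance FROM accounts WHERE user = 'mailpile'").fetchone()
print(status, balance)  # the money still "exists", but access to it is gone
```

The point of the sketch is how little separates the two states: from the operator's side, banishing a user or confiscating her goods is indistinguishable from any other routine database write.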

II. Social Power
But focusing on technical power raises its own question: why didn't Marc Bragg and Mailpile head for the exit when things got bad, the way Vi Hart did? 23 Yes, Second Life and PayPal changed the way their systems worked, but so what? Database entries only matter if they control your access to something that matters in the real world. Technical power only has bite to the extent you use a software system-walk away from the keyboard and the software can't follow.
To understand where this argument goes wrong, consider what it suggests for our disappointed victims of technical power. Marc Bragg didn't need Second Life: he could have drawn a picture of Taessot on a napkin and continued to enjoy his imaginary property. Mailpile didn't need PayPal; it could have drawn pictures of Benjamin Franklin on napkins and used those. You don't need Facebook; just take a Sharpie to your living-room wall. You don't need YouTube for cute cat videos; just film your own damn cat.
These suggestions are so unsatisfying because they miss the inherently social nature of social software. The fun and the value of these systems come from sharing them with others. YouTube's other users provide me with better cat videos than I could film for myself; Facebook tells me what my friends are actually up to, not just what I imagine they're up to. Countless online journalists use social platforms to publish their work. Virtual property in Second Life, like a domain name or like a LinkedIn account, is valuable only because it's networked. To withdraw from the network in which the property is embedded is to give up something of real value, however virtual the property itself may be.
This, then, is a point about social power: The person or entity who controls the terms on which a community comes together enjoys authority over that community. The threat to boot you from YouTube if you don't accept Google+ comments isn't just about cat videos: it's also about the people who make and watch those cat videos. The threat to boot you off of a mailing list isn't just about the emails; it's about your access to the other people on the mailing list. The threat to boot you from eBay isn't just about the stars next to your name; it's about the community of people who know what those stars mean, who give those stars their meaning.
Facebook, for example, has a privacy problem the way alcoholics have sobriety problems. But it is Facebook's users who enable its addiction to personal information. Facebook's software exists in a constant state of flux; the user community built around that software is the source of stability. Each time Facebook redesigns its sharing settings to be more profligate with users' private lives, it subjects them to technical power. Each time users swallow hard and keep on using Facebook because their friends are there, they subject each other to social power. They are trapped in a dysfunctional codependent relationship with Facebook, and with each other.

This is the Cheers 24 problem: you want to go where everybody knows your name. Leaving a social software platform means leaving a social network. Whoever controls that network has you locked in. It's extraordinarily difficult for any individual user in a truly social medium to escape from policies she considers oppressive without giving up all the benefits of being in the same place as the rest of her social circle. This too is a form of power: if no one wants to be the first to leave, no one will leave. Whoever controls the agenda by which the community settles on the software it will use, like Facebook's programmers pushing out an "improvement" to its "privacy" controls, can take advantage of this social power to confer technical power on themselves. Wherever there is a software platform, there will be the potential for abuse. Technical power is inescapable because it is inescapably social.

III. Anarchy
There is no way to redesign the technologies of social software so that technical power disappears, because it is social power that gives technical power its bite. 25 We think of social software as being "social" because it enables social connections among users. But it is also "social" because it is socially constructed. If I use a drawing program to doodle for my own amusement, no one else cares what software I use. But if you and I want to share our doodles, we need to agree on which software to use, which requires us to agree on what that software is. It does no good for me to post to doodle.ly 26 if you are using Madoodle, 27 not if we want to see each other's work. Sharing a social medium requires running the same software. But it is this agreement, to interoperate at a technical level, that creates the possibility for technical power. 28 Because it is rooted in human agreement rather than in any specific details of software, technical power can be surprisingly tenacious. What makes Facebook the Facebook we know and love/hate? It's not just Facebook the company and its control over a server farm and a domain name. Facebook is also Facebook because its users choose to type "facebook.com" into their browsers; that is, to converge and coordinate on the Facebook software-mediated community.
Even systems specifically designed to escape technical power run afoul of social power. Take Diaspora*. Diaspora* is a peer-to-peer social network platform explicitly founded as an alternative to Facebook. 29 It allows (and encourages) users to host their own Diaspora* servers and gives them the software under a free software license so they can configure their servers as they wish. 30 Its developers explained, "Like the Internet itself, Diaspora* isn't housed in any one place, and it's not controlled by any one entity (including us)." 31 What makes Diaspora* a coherent community? Not the control over Diaspora* servers by one company, but rather the agreement to run a common set of software, with common protocols that interoperate in particular ways. And so there is technical power here, too. It resides in the current configuration of the Diaspora* protocols and the common software, and it flows from the practical ability to push an "upgrade" out to a user community that will agree to run it.

Or take Reddit. This "place friendly to thought, relationships, arguments, and to those that wish to challenge those genres" has what seems like a gold-plated exit option to preserve user freedom. Any user (or "redditor") can create a new section of the site (or "subreddit"), automatically becoming its moderator 32 and establishing its rules. 33 But the tale of its politics subreddit ("/r/Politics") shows why that option is often unsatisfying. /r/Politics has over three million readers, 34 and some of them became concerned in November 2013 about what they saw as the rightward political slant of the moderators. 35 The moderators kept a list of "banned domains" that produced "sensationalist titles" and "bad journalism," a list that included Salon, the Huffington Post, and Mother Jones.
36 In explaining why dissatisfied redditors didn't simply depart for a more left-leaning political subreddit, one journalist and redditor wrote: "First, let's remember what's at stake here: a vibrant community of three million subscribers. So 'start another reddit' is not a fair response to redditors who already built this community over most of a decade, only to watch it taken over and locked down by amateur dictators." 37 What made /r/Politics worth fighting over, that "vibrant community of three million subscribers," is also what made the fight necessary. The great value of a subreddit is that redditors are talking to each other rather than to themselves; if you split the community, you hurt it. But once you have a single community, someone has to be the moderator, and that someone has the power to determine which publications end up on the "banned" list.
Not even Bitcoin, 38 the libertarian peer-to-peer electronic currency "designed to allow people to buy and sell without centralized control by banks or governments," can escape from the problem of social power wielded through technical means. 39 Consider, carefully, how Bitcoin works. The global log of transactions is jointly maintained by users' computers; distributed cryptography substitutes for centralized anti-forgery controls. 40 The supply of Bitcoins is controlled by a function embedded in the cryptographic protocols, not by a single authority with the power to confiscate them or to make more. 41 But where do Bitcoin's cryptographic rules come from? Not from the mysterious "Satoshi Nakamoto" who originally designed the protocol. 42 They come from the community of users running the software. In March 2013, for example, an upgraded version of the Bitcoin software began accepting blocks of transactions that older versions rejected, accidentally splitting the network into two incompatible transaction logs. 44 To resolve the disagreement, some developers tried to "convince a majority of the network's miners to voluntarily downgrade their software." 45 It worked. 46 Similar disputes happen all the time; indeed, the Bitcoin protocol's stability depends on community consensus to resolve them. 47 This is social power, and once again, it creates technical power. If ninety-nine percent of Bitcoin users agree that they need to update their software to deal with a bug and that update requires rolling back a day's worth of transactions, then the one percent of Bitcoin traders who made a killing that day have just lost out to the others. If they update their software, they lose the Bitcoins they just made; if they don't, those Bitcoins will be worthless because there will be no one to trade them with. Bitcoin has no coercive central banker, but it does have a coercive global banker embedded in the software, chosen by the mass of users.
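The dynamic by which a miner majority settles such a dispute can be sketched as a toy model. All names and numbers below are invented for illustration; real Bitcoin nodes follow the valid chain with the most accumulated proof-of-work, which this sketch caricatures as a single counter:

```python
# Toy model of why a majority of mining power settles a protocol dispute.
# Numbers are invented; this is not the actual Bitcoin fork-choice code.

def winning_chain(chains):
    """Every node independently picks the chain with the most total work."""
    return max(chains, key=lambda c: c["work"])

# Two incompatible versions of the transaction log after a disputed upgrade:
old_rules = {"name": "old", "work": 0}
new_rules = {"name": "new", "work": 0}

# Suppose 60% of hash power downgrades to the old rules, 40% stays on the new.
for _ in range(100):
    old_rules["work"] += 60
    new_rules["work"] += 40

# Every participant, including the 40% minority, converges on the majority
# chain; transactions confirmed only on the losing chain simply vanish.
print(winning_chain([old_rules, new_rules])["name"])  # "old"
```

The coercion described in the text falls out of the model: a holdout who keeps mining the minority chain produces a log that the rest of the network ignores, so her Bitcoins are worthless to everyone she might trade with.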
Thus, while the God problem (the unilateral exercise of technical power) is immediately dramatic, it exists because of the Cheers problem (the social lock-in from agreeing to use a common social software platform). We can never completely get rid of technical power, and we can never make exiting any of these platforms completely costless. To join a platform is to commit to its user community, and since technical change over time is inevitable, it means also committing to living with the consequences of technical decisions the community will make in the future. The social is technical, the technical is social, and both are always and forever political. 48 Perfectly libertarian social software does not exist.

IV. State
All is not lost. It is possible to design software that makes it harder to misuse technical power. 49 Harder, not impossible, but that is still something. The heart of social power is the consensus to use particular software with a particular design. Technical design cannot prevent a group of users who have reached consensus from putting that consensus into effect, but it can influence the agenda by which the group decides which software to use.
A simple example: it matters whether changes to software can be made unilaterally by a single actor, or whether such changes require coordinated action by individual users. Facebook, for example, has immense agenda-setting power because it can simply update the software on its servers, automatically changing the "Facebook" experience for everyone. 50 Diaspora* is not immune from software change, but making a change requires persuading a critical mass of users to switch, since each user must make an individual decision to upgrade. 51 Each user may still face the same unpalatable choice, upgrade or quit, but it is harder to persuade a majority of users than it is to persuade one individual. On Diaspora*, the sheer force of social inertia protects users.
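The asymmetry between unilateral and coordinated change can be put in a toy model. The user names and the adoption threshold below are invented for illustration:

```python
# Toy model of agenda-setting power: a centralized platform updates
# everyone at once; a federated one needs each user to opt in.
# User names and the 50% threshold are invented for illustration.

def centralized_update(users):
    # The operator flips the switch; every user's experience changes at once.
    return {u: "new" for u in users}

def federated_update(users, willing, threshold=0.5):
    # The change "takes" across the network only if enough users
    # individually opt in; otherwise social inertia preserves the old rules.
    adopters = {u for u in users if u in willing}
    if len(adopters) / len(users) >= threshold:
        return {u: "new" for u in users}
    return {u: ("new" if u in adopters else "old") for u in users}

users = ["a", "b", "c", "d", "e"]
print(centralized_update(users))                    # everyone moved to "new"
print(federated_update(users, willing={"a", "b"}))  # most users stay on "old"
```

The design choice the text describes is visible in the second function: the cost of change is no longer one decision by one actor but a persuasion campaign across the user base.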
At first glance, it seems as though we could protect users by locking a design in place for all time and giving no one at all the ability to modify the software. Unfortunately, this approach (get the software right and then never change it) doesn't work, because technical power is secondary to social power. Software is not self-executing, so if people agree to discard a piece of software, no safeguards embedded in it will do any good. The parties to a contract can rescind it; the partners in a partnership can dissolve it; the users of software can replace it.
There are also strong practical reasons not to freeze code forever. Software is buggy, and users want someone to be able to fix bugs. If Bitcoin's current implementations can only process seven transactions a second, its users will want to be able to upgrade the protocol's capacity. 52 But once we admit that possibility, what counts as a "bug" and what counts as a "feature" is necessarily in the eye of the beholder. Marc Bragg, according to Second Life, took advantage of a bug to place early and artificially low bids for virtual land. 53 Leaving that bug unfixed could have broken the land-auction process for everyone else. But a Second Life that can roll back botched land auctions is a Second Life that can confiscate Bragg's property without a hearing.
The same goes for disagreements over how Bitcoin's blockchain protocol 54 should operate, or how to weigh redditors' votes when moderating comments. The necessity of change creates the possibility of oppression. Software is a human construct, made for social purposes; there is no such thing as perfect software, any more than there is a perfect human or a perfect society.
Put another way, even software that never changes still creates technical power. It freezes a specific set of rules and power relations in place for all time, favoring some tasks and users over others. An electronic stock exchange that executes trades in the order they are received favors whoever can shave the most microseconds off the time it takes their sell orders to arrive. 55 An Internet on which anonymity is easy and unmasking is hard favors harassers over victims. 56 Those who come out ahead under those rules may be disinclined to notice the technical power sustaining their advantages, but the power and the advantages are still there. The computational is political. 57

We return, therefore, to partial techniques that moderate power rather than eliminate it. One is competition: smaller communities, with more rivalry among them, make it easier for users to threaten to leave. The proliferation of subreddits makes redditors' threats to start their own more credible. The moderators of /r/Politics still have technical and social power over it; those who depart still give something up. But they give up less than those who leave Facebook do; the hurdles they must jump are lower. The design of Reddit doesn't prevent the moderators of a subreddit from behaving atrociously; it just makes it harder to force users to hold still while they do.
To generalize, distributed systems disperse social power; centralized systems concentrate it. While the nature of social software means that no technical design can eliminate the need for agreement on some aspects of the design, some designs require greater agreement than others. Facebook is a tightly coupled software system: more than one billion users 58 experience it through exactly the same server software. All one billion users must agree on what "Facebook" is, which gives Facebook enormous, concentrated power.
But other social-software systems are less tightly coupled; they are more tolerant of the possibility that people's experiences will be inconsistent. Factoring web discussions among social platforms such as Digg, Reddit, Slashdot, Metafilter, and a million others means that it is no longer necessary for each to have the same software-imposed rules as the others. This technical modularity creates social modularity: fewer people need to agree on what "Pinterest" or "Tumblr" is than on what "Facebook" is. Reducing the need for agreement on each platform reduces the degree of technical power that each platform possesses over its users.
But dispersion comes at a distinctive cost: fragmentation. It was harder to travel from Antioch to London after the collapse of the Roman Empire; the conversation about a photograph splinters as it crosses from one site to another. Conversations on /r/Liberal 59 and /r/Conservative 60 and /r/Neutralpolitics 61 take place in substantial isolation from each other. There will always be a tradeoff between freedom and interoperability in social software systems. 62 And note carefully: the technical power is not gone. It has simply been placed in more hands: a million mayors instead of a lone emperor. The moderators of /r/Anarchism (52,643 readers) 63 enjoy the same kind of technical power as the moderators of /r/Politics (3,085,888 readers). 64 And, if /r/Postleftanarchism (803 readers) 65 is to be believed, they have abused that power. A mailing list moderator exercises the power to decide which messages she will forward to the list and which messages she will block, just as Facebook does. A piranha's teeth are as sharp as a shark's.
Another technique for checking technical power, one so frequently mentioned that it needs little elaboration, is transparency. The EdgeRank algorithms Facebook uses to decide which stories to show to users are proprietary, secret, and inscrutable. 66 It is hard to detect censorship on Facebook, and even harder to prove. 67 PayPal, at least, cannot freeze a user's account without the freeze being obvious to the user, and thus open to public challenge. 68 Bitcoin's open-source implementation makes it visible to users what the protocol does and does not do. 69 This fact does not prevent one group of users from insisting on a change that hurts others, but it does make it harder: the consequences of a proposed change are visible in the proffered source code, which makes it easier to mobilize resistance.
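A toy illustration of why open source aids resistance: a proposed change to the rules arrives as an inspectable diff rather than a silent server-side update. The file name and the rule being changed are invented for this sketch:

```python
import difflib

# Sketch: with open-source software, a proposed "upgrade" is a visible
# diff that users can inspect before agreeing to run it. The rule text
# and file name here are invented for illustration.
current = ["max_supply = 21_000_000\n"]
proposed = ["max_supply = 42_000_000\n"]

for line in difflib.unified_diff(current, proposed,
                                 fromfile="rules.py", tofile="rules.py"):
    print(line, end="")
```

A user who would lose out under the new rule can point to the exact changed line when rallying others to refuse the upgrade, which is precisely the mobilization the text describes.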

V. Utopia
Technical power is dangerous because it can be abused, not because it is bad in itself. Facebook couldn't "give people the power to share" 70 without software and the technical power that comes with it. PayPal, Second Life, Reddit, Bitcoin, YouTube, and all the other social software platforms that enrich online life use technical power to do great things for users. Rather, the fundamental problem with technical power is that it is unconstrained by the rule of law. 71 Software itself can be almost perfectly rule-like (automatic, precise, consistent, and utterly indefatigable), but there is no way to make similar guarantees about the people who create the software. 72 It is deeply undemocratic, for example, for a government to make new rules in secret and impose them without warning or a chance to be heard. And yet, that's exactly what happens when a platform owner pushes out a new version of its software that takes away a feature users had come to take for granted. The handheld Nintendo 3DS comes with a stylus and a touchscreen, enabling users to run the Swapnote program to "create handwritten notes and then share those notes with other Swapnote users . . . from across the room . . . or across the world." 73 But when Nintendo decided that some users were using Swapnote to "exchange offensive material[,]" it disabled the feature. 74 No consultation, no vote, no warning, no appeal, no refund. Technical power can be wielded without any of the checks and balances that apply in any democracy worth its salt.
The rule of law is a characteristic of a social institution, not of a technology. When software treats users fairly, it is because the programmers and system administrators behind it are committed to treating users fairly. Those commitments don't just happen. They arise when the programmers care about making their online spaces vibrant, safe, fair, and just, and the programmers care when users care. Some administrators will share users' values and act on them; others will be afraid of what will happen if they don't. But either way, the culture of the rule of law must come from users. The users are the relevant political community entitled to make policy for themselves. They are the ones who can hold platform providers truly accountable. They are the ones who best understand the norms and values of their communities. They are the ones with a deep and personal stake in the success of those communities. They are the ones in a position to weigh the costs and the benefits to their community of different rules: to decide, for example, whether the platform should be relatively more tolerant of wide-ranging debate or relatively more protective of its users from abuse.
In the end, following extensive debate within /r/Politics, its moderators apologized, added an FAQ, and reopened consideration of each and every banned domain. 75 Whether you see them as foiled right-wing plotters or as overworked public servants, the debates that led them to change course look like deliberative democracy in action. 76 If the essence of the rule of law is that the government has guns and doesn't use them, /r/Politics comes off looking good. Whether by force or by force of argument, its moderators were persuaded not to use the technical power everyone agreed they possessed. 77

One last example. In 2007, Digg 78 users repeatedly posted a 32-digit hexadecimal number, an encryption key for HD DVDs. Digg's administrators initially complied with Digital Millennium Copyright Act ("DMCA") 79 takedown notices from the Motion Picture Association of America ("MPAA"), which sparked an outcry from Digg users. After a long night of the soul, Digg co-founder Kevin Rose posted a note: "But now, after seeing hundreds of stories and reading thousands of comments, you've made it clear. You'd rather see Digg go down fighting than bow down to a bigger company. We hear you, and effective immediately we won't delete stories or comments containing the code and will deal with whatever the consequences might be." 80 In the end, the MPAA quietly backed down. The moral of the story is not that Digg's software worked, but that its politics worked. Right or wrong, its users collectively made a decision and acted on it.
What Digg and Reddit had that PayPal and YouTube lacked was not just a conscientious administrator in a position of power, but also a user community that cared about how that power was wielded. The values that good administrators act on are the values of their communities. Good administrators online, like good governments offline, explain their policies, give fair warning whenever possible, seek comments and feedback on changes, and are ultimately accountable to those they serve. The technical power is still present, but its use is checked, less visibly and less formally, by the social power behind it.
The rule of law will come to social software when We the Users insist on it.