An FCC meeting, with the new chairman, Ajit Pai, on the left.
I wrote something for Jacobin about the new internet privacy bill that Trump just signed, why it matters, and how you can fight back. I also offered some thoughts on the limits of net neutrality, and what a left-wing agenda for internet governance might look like.
The piece also got picked up by The New York Times in their weekly roundup of "partisan" writing, whatever that means. The NYT describes my article as "against net neutrality," which isn't quite right.
Read the piece at Jacobin.
A worker using a stocking frame, an early form of automation.
The media talks a lot about mass automation—the idea that robots will take all our jobs. It's not happening yet, and it might not happen for a pretty long time. But when and if it does, the most plausible scenario given our current trajectory is really, really bad.
I wrote some cheery thoughts on the subject for The Guardian.
A protest outside of Palantir's offices. Photograph by Nitasha Tiku.
My latest for The Guardian is about the rising tide of resistance to Trumpism in tech. As tech workers have watched their CEOs cozy up to Trump, they've been increasingly alarmed. So they've begun to do something about it. I think they've got a real shot at disrupting Trump's agenda—and, potentially, transforming the tech industry as a whole.
Read the piece at The Guardian.
The iconic British World War I recruitment poster with Lord Kitchener, which inspired the Uncle Sam "I Want You" poster in the United States.
I wrote a review of Tim Wu's new book on the history of attention capture, The Attention Merchants: The Epic Scramble to Get Inside Our Heads, and had fairly mixed feelings about it. Wu offers an interesting account of the past century of people selling our attention to advertisers, from the Mad Men to Google. But he takes an oddly moralistic approach to the contemporary Internet that derails the book's later sections. It's disappointing, given Wu's stature and his valuable contributions as a scholar and an activist, to see him spill so much ink bashing BuzzFeed and selfies.
It's comforting to pretend that the digital sphere is a pure cesspool of narcissism and stupidity, and that we should simply abandon it by returning to more virtuous activities like reading long novels. (This is roughly what Wu recommends in his conclusion.) But for most of us, abandoning the Internet is neither feasible nor desirable. The task ahead is to democratize it.
Read the review at The Guardian.
Trump sits down with tech leaders at Trump Tower on December 14, 2016.
My latest for The Guardian is on the subject of Trump, tech, and neoliberalism. Today, Trump held a summit with various tech leaders at Trump Tower, and I use that meeting as a jumping-off point for a broader discussion of what tech and Trump, despite their obvious differences, have in common. Some commentators have been surprised that so many Silicon Valley executives have warmed up to Trump, after screaming bloody murder about him during the campaign. I think one of the reasons for that is because, at root, they share a basic worldview.
Read the piece at The Guardian.
A creepy still from Trump's victory speech on election night.
My latest for The Guardian is an analysis of how Trump won. The triumph of Trump, I argue, is the triumph of Trumpism, a new kind of right-wing politics perfectly engineered for our era.
The article is an attempt to understand Trumpism on its own terms. But I should add that I absolutely despise Trump and his politics, in case that's not evident from the piece.
Read the piece at The Guardian.
A terrifying cover for Philip K. Dick's gorgeous novel Ubik (1969), which includes a prophetic description of the Internet of Things.
My latest for The Guardian is on the subject of the sharing economy, and the potential for blockchain and the Internet of Things (IoT) to greatly accelerate it. It seems plausible that we could find ourselves surrounded by networked, programmable, "smart" property fairly soon, making it possible to borrow and lend a much greater variety of assets than before. In the piece I discuss both the dystopian and utopian prospects, although I'm afraid I fall a bit more on the dystopian side, inevitably. Blame it on Philip K. Dick.
Read the piece at The Guardian.
A very unlikeable tech CEO, courtesy of Silicon Valley.
I did a piece for The Guardian on the subject of why prominent people in the tech industry so often say things that make people upset. I talk about techno-utopianism, Sixties counterculture, Free Basics, Star Wars, and what it means when the tech industry says it wants to change the world.
Read the piece at The Guardian.
Yahoo on Mosaic: the Internet in 1995.
The upcoming transition of formal control over DNS from the Commerce Department to ICANN on October 1st gave me an opportunity to write about the privatization of the Internet for Jacobin. In the piece, I tell the story of how the Internet emerged from decades of public planning and billions of dollars of public funding, and explain how it was privatized fairly rapidly in the 1990s. I also explain why privatization wasn't inevitable, and how we might begin to reverse it.
Read the piece at Jacobin.
Peter Thiel speaks at the Republican National Convention.
My latest piece for The Guardian is about Peter Thiel's endorsement of Donald Trump, and why it might make perfect sense. It's easy for liberals to caricature Thiel (and Trump), but I think it's worth taking him seriously, as a powerful political actor and as a sophisticated political thinker. His politics are atrocious, but they're also fairly consistent and well argued. And in Trump, Thiel may have found the ideal instrument for his particular vision of authoritarian capitalism.
Read the piece here.
A team of scientists conduct the first Internet transmission in August 1976 at Rossotti's, a bar/beer garden outside of Palo Alto. Photo credit: Donald Nielson.
Forty years ago this August, a team of scientists from the Stanford Research Institute set up a computer terminal at a bar outside of Palo Alto and conducted what can pretty accurately be described as the first Internet transmission. I wrote about the experiment for The Guardian, and talked more broadly about how the Internet was invented, and how it works.
Read the piece at The Guardian's website.
The Atlas robot, built by Boston Dynamics and DARPA, getting whacked around by some dude in a warehouse.
My latest piece for The Guardian is about the tech industry’s fascination with universal basic income. It’s a subject that’s already been written about a fair amount. But none of the existing pieces I found took aim at one of the core arguments behind the tech case for UBI, which is that basic income is necessary because technology inevitably creates inequality, and as technological innovation continues it will create even more inequality.
This assumption isn't credible to me, and I go into some detail about why in The Guardian piece.
Unfortunately I didn’t have the space to address the more specific claim made by tech’s UBI advocates, which is that breakthroughs in robotics and AI will automate most jobs out of existence in the very near future. I don’t find that claim persuasive either, for reasons I hope to explore in another piece.
I also didn’t get a chance to discuss whether I think UBI is a good idea. I do, but I'm skeptical of the "basic income because automation" argument—not only because the facts on the imminence of mass automation don't add up, but because the politics seem dubious. I think UBI supporters are standing on firmer ground, factually and politically, when they see basic income as a tool to reduce poverty and precariousness in the present tense, not as a salary substitute for a fully automated economy in the near future.
Read the piece at The Guardian here.
A screenshot from Bernie Arcade, from Bernie's 2006 Senate campaign.
I wrote a piece for The Guardian about why large numbers of workers from the tech industry are supporting Bernie. The starting point for the article is FEC filing data showing that four of the top five employers of Sanders donors are tech companies (Google, Microsoft, Apple, and Amazon). Sanders has also received more individual contributions from tech workers than Clinton. I was curious about why that might be the case, because it contradicts a couple of pieces of received wisdom: the first is that tech workers are all libertarians or centrist Democrats, not left-wing; the second is that since tech workers are high-wage workers, they would have much to lose from Bernie's redistributive policies.
Both assumptions turn out to be wrong. The data suggests there's a not-insignificant number of left-wing tech workers. And many believe they would benefit from Bernie-style redistribution, because while they're better off than most other workers, they're still under significant economic pressure.
Read the piece at The Guardian here.
Possibly the dankest meme of all the memes in the Bernie Sanders' Dank Meme Stash.
Imagine for a moment that Bernie becomes President. It almost certainly won’t happen. But if it did, here’s the scenario: he starts racking up big primary wins, pulls ahead of Hillary in the delegate count, and clinches the nomination. Then he goes on to crush Trump in the general, as most polls suggest he would. Come January 2017, Bernie begins his term. Then what?
What could Bernie possibly hope to accomplish as President? Congressional Republicans have spent eight years obstructing a very centrist Obama. Why would they go any easier on a left-wing Bernie? If Republicans raised hell over Obamacare—a plan cooked up by their very own Heritage Foundation—what kind of hell might they raise over single-payer healthcare, free college, the expansion of Social Security, and so on? And how much support could Bernie muster from Congressional Democrats, many of whom are openly hostile to his agenda?
Answering these questions is important, because they’re the ones I hear most often from Bernie skeptics. In my experience, Hillary supporters are pretty sympathetic to Bernie’s proposals. Liberal voters, unlike liberal pundits, are enthusiastic about ideas like single-payer healthcare. Where Hillary supporters differ is usually not on the desirability of big social programs, but their political feasibility. They like many of Bernie’s ideas. They simply doubt his ability to implement them.
According to this logic, Hillary is the practical candidate. First, her proposals are more moderate than Bernie’s. Second, she has published more than two dozen heavily footnoted policy papers describing exactly how they will work. These two features of her candidacy may not always energize primary voters, the logic goes, but they will be an asset when she gets to the White House. There, her centrism and obsession with policy detail will enable her to overcome Republican opposition and enact at least a portion of her platform. It won’t be nearly as much as what the average Democratic primary voter wants, but it’ll be something. “I’m a progressive,” as Hillary memorably said at one debate, “but I’m a progressive who likes to get things done.”
This narrative rests on the assumption that a wonkish, centrist President is more likely to push progressive legislation through a Republican Congress than a combative, leftist one. It’s the kind of argument that might make intuitive sense, but which entirely fails on the evidence. We’ve just had eight years of a wonkish, centrist President, and he’s faced intense Republican resistance at every turn. Hillary may turn out to be a better negotiator than Obama (nearly anyone would be), but I don’t expect Republicans to treat her any differently. If anything, they may treat her worse, given their party’s particularly vicious history with her.
So how many of the progressive items on her agenda could Hillary get through a Republican Congress? About as many as Bernie could of his: close to zero. That Hillary’s proposals are more moderate and more complicated than Bernie’s will mean absolutely nothing to a Republican Party whose “mainstream” is light-years to the right of Reagan, and whose emerging Trumpist wing is essentially fascist.
On certain issues, of course, Hillary and mainstream Republicans will likely find common ground: the TPP comes to mind. But on other issues, especially the ones she’s staked her primary campaign on—criminal justice reform, gun control, and financial regulation, to name a few—she has very little hope of moving forward.
Of course, the President doesn’t need Congress for everything. A President can issue executive orders, set enforcement priorities, and conduct foreign policy. But the promises that Hillary is making in the debates and on her website—and the ones that seem to matter most to her supporters—would almost certainly require new legislation to fulfill. And that legislation will not pass a Republican Congress.
If you concede the point that Congressional Republicans are as likely to obstruct Hillary as they are Bernie, then the only path forward is to change the political math that’s filling Congress with Republicans in the first place. In other words, the question of whether Hillary or Bernie is the more “practical progressive” has nothing to do with which of them could “get things done” on day one of their presidency. Under current conditions, neither of them could. The more practical candidate, then, is the one with the better chance of changing those conditions, by transforming the political composition of Congress. This means leading Democrats to Congressional victories in 2016, 2018, 2020, and beyond.
It won’t be easy. Liberals often like to pretend that the Republican Party is doomed, based on the clownishness of the Republican debates. But it’s not. The Republican Party is in excellent health. Yes, they likely won’t win the White House if and when they nominate Trump, and more broadly, Trump’s popularity has opened a potentially damaging fissure between the party’s corporate and populist wings. But these are relatively small weaknesses compared to the party’s overall strength. The last midterm election in 2014 gave Republicans their biggest majority in Congress in eighty-five years. At the state level, Republicans are just as dominant. They control seventy percent of state legislatures and more than sixty percent of governorships. They enjoy unified control of twenty-five state governments, meaning they have both houses of the state legislature plus the governorship. By comparison, Democrats enjoy unified control of only seven states.
How do the Democrats begin to drill holes in this Republican wall? The Democrat sitting in the White House is the one responsible for coming up with the answer. As party leader, he or she has to put forward an agenda that will get more Democrats elected. Obama’s record on this front is not inspiring. He has been, at best, an indifferent party leader, and has suffered bruising midterm defeats that have produced the largest decline in Congressional Democratic membership in modern history.
Which candidate would be better able to reverse these losses and rebuild Democratic power at the state and national level? I’d choose Bernie, for two reasons. First, he’s better at competing on Republican territory. One of Bernie’s core constituencies is the much-maligned white working-class voter. Bernie’s rhetoric and platform resonate strongly with those white workers whose livelihoods have been decimated over the past few decades. When they vote, these workers have often voted Republican—not because they’re too stupid to act in their self-interest, as liberals are fond of saying, but because Republican candidates have been much better at capitalizing on these workers’ anger towards elites. By acknowledging that anger and offering a social-democratic interpretation of its origins—putting the blame for working-class immiseration where it belongs, on policies like NAFTA—Bernie has managed to do very well with a demographic that has strayed from the Democratic Party in recent decades. This could prove to be a valuable asset when it comes to building coalitions capable of ousting Republicans in places where they usually do well.
Now, white workers don’t have a monopoly on economic hardship—far from it. They may feel a greater sense of grievance because they have a golden age to mourn—the era of the Fordist family wage. But even in their current state, they’re typically better off than non-white workers.
One thing working-class Americans of all races share, however, is that they vote in far fewer numbers than wealthier Americans. The 2014 midterm elections that gave Republicans their commanding majorities in Congress saw very low turnout overall (the lowest since 1942, in fact), but especially among lower-income Americans. Only twenty-five percent of Americans earning less than $10,000 a year voted in that election. For those earning around $30,000, the number was closer to thirty-five percent. By contrast, for those earning $150,000 and above, turnout was more than fifty percent. If you add age, the contrasts become even starker. Younger, poorer Americans were significantly less likely to vote than older, richer Americans. To quote the Demos study where these numbers come from:
Among 18-24 year olds earning less than $30,000 turnout was 12 percent in 2014, but among those earning more than $150,000 and older than 65, the turnout rate was nearly four times higher, at 65 percent.
This leads us to the second reason that Bernie would be a more formidable party leader. He’d not only compete on Republican territory—he’d grow the map.
Non-voters are on the whole far more progressive than voters: non-voters broadly support expanding social services, increasing aid to the poor, reducing inequality, and enacting redistributive policies. In other words, they form a natural constituency for Bernie’s social-democratic platform. As a result, they are far more likely to be mobilized by Bernie than by Hillary.
Of course, it’s not merely a matter of motivation. Lower-income Americans face many structural impediments to voting—perhaps most obviously the fact that they can’t afford to take off work to go to the polls, given that Election Day isn’t a national holiday. But if the primaries are any indication, Bernie is good at turning out typically dormant sections of the electorate. The vast majority of young people don’t vote—in 2014, only about twenty percent of Americans 18 to 29 cast a ballot—but they’re coming out in droves for Bernie. The Democratic primaries have seen historically high turnout partly as a result. Bernie has much further to go, especially among black voters, but if he continues this trend, he could draw large numbers of people to the polls for the first time. These new voters might provide the building blocks of a social-democratic majority that could change the balance of power at all levels of government.
When Bernie calls for a “political revolution,” this is what he’s talking about. Throughout his campaign, he has argued that real change can only happen by pulling millions of people into the political process and keeping them there. You may doubt that such a mobilization will materialize. But Bernie is standing on solid ground when he says that such mobilizations are the only way progressive change ever happens. Every major progressive achievement in American history, from abolition to Social Security, had its basis in sustained bottom-up struggle.
Bernie is the only candidate who grasps this fact and who has built a campaign around it. Which is why it’s been strange to see liberals like Paul Krugman blasting Bernie for his “idealism.” Yes, Bernie’s ideas are to the left of the mainstream. But his strategy for how to fulfill them is quite practical—it may not succeed, but it rests on a realistic understanding of how politics actually works. The notion that Hillary, by continuing in the Obama tradition, will produce anything other than years of tepid incrementalism and continued Republican dominance seems to me far more fantastical.
But Krugman’s disdain signals a deeper threat to Bernie’s political revolution: the Democratic Party itself. As his skirmishes with the Democratic National Committee and near-total lack of Democratic endorsements make clear, the party establishment greatly distrusts Bernie. His politics, while squarely in the mold of Great Society liberalism, are still too far to the left for a party that has long since reconciled itself to the rightward turn in American politics over the past three decades. That Obamacare represented the outer limit of political possibility at a time when Democrats commanded both houses of Congress is revealing. Democratic complaints about Republican obstructionism, while correct, have often been used to obscure the fact that liberal elites simply aren’t very liberal anymore. As Krugman and other Bernie bashers like Ezra Klein and Paul Starr demonstrate, today’s liberal intelligentsia has little appetite for ambitious progressive policymaking. Their timidity dovetails nicely with the class interests of the Democratic financial base, composed of corporations and wealthy individuals who have much to lose from the redistributive bite of Bernie-style social democracy.
Ultimately, the most impractical part of Bernie’s campaign might be his choice to run as a Democrat. As he recently admitted, competing as a Democrat comes with big advantages: for one, it offers him a media platform that he wouldn’t have enjoyed as an independent. But in the long run, the Democratic Party establishment may be permanently inhospitable to both Bernie’s political vision and to his strategy for achieving it. In that case, the people energized by Bernie’s campaign will have to find a new vehicle for continuing the coalition-building he’s begun, and forge a practical progressive alternative to both parties.
Better times for Trump, back in his Reform Party days.
I have no doubt that Donald Trump will win the Republican nomination. I’m also pretty sure he’ll lose in the general election.
Neither of those statements is particularly controversial. Plenty of pundits have reached the same conclusion already. But many of them also seem to think that once Trump loses, his brand of politics will disappear. The weird joke will be over. Things will go back to normal.
They won't. Trump most likely won't win the White House. He may not ever win anything, and he may even decide to quit politics altogether. But that shouldn’t be too comforting, because the future of Trumpism doesn't depend on Trump. Trumpism is bigger than Trump. And it's here to stay.
The arrival of Trumpism is the biggest story of the 2016 election. I don’t want to diminish how much Bernie’s accomplished, and I'm genuinely excited about what his popularity says about the possibilities for social-democratic politics in the US. Still, the substance of Bernie-ism isn't that original—as Jedediah Purdy and others have observed, it's basically Great Society liberalism, the kind that used to be mainstream in the Democratic Party.
Trumpism, on the other hand, feels like something we haven’t seen before. Or, maybe more accurately, something we haven’t seen in a while, at least in the US.
What is Trumpism? The question might seem a little ridiculous when you think about Trump himself. He's extremely light on policy detail—he basically has no policies at all, at least in the conventional sense. He does have promises—his promise to build a wall on the southern border and make the Mexican government pay for it, for example—but these don’t generally cash out to specific proposals. The “Issues” page on Marco Rubio’s website has a decent amount of detail about what Rubio hopes to do as President. Trump’s has a series of videos on vaguer topics: “The Establishment,” “Unifying the Nation,” and so on.
So Trump’s a lightweight on policy—so what? In the early days of his candidacy, the media widely saw that lightness as fatal. When Trump was a joke, his lack of specifics was just another punchline. Now that he's a serious contender, the pundits have come around to the obvious: Republican voters like Trump’s lack of specifics. And it's not because they're all catastrophically stupid—the Idiocracy theory of American politics as pure cretinism. It's because Trump’s policy vagueness facilitates his biggest strength as a candidate: his ideological flexibility.
Trump has been accurately called a racist, misogynist, and warmonger. He's also promised to tax Wall Street, punish corporations for outsourcing jobs, block the Trans-Pacific Partnership, and renegotiate or rip up NAFTA. Now, it’s fair to ask whether he has any intention of following through on any of these promises—if his extraordinarily regressive tax plan is any indication, the answer is almost certainly no. Still, Trump’s rhetoric is significant. As Edward Luce has observed, he sounds significantly to the left of Hillary on economic issues. And this is his great strength as a candidate: by breaking with Republican orthodoxy on the economy, he can more powerfully speak to the anger and hopelessness felt by so many Americans, especially the white working-class voters that make up Trump’s base (the ones who aren’t voting for Bernie, that is). He can capture the ground long vacated by the Democratic Party, and tap into widespread rage over decades of social rot, sliding wages, and rising uncertainty and immiseration.
The result is a political juggernaut. Trumpism combines the fearmongering and racial hatred that have long been the emotional core of the Republican Party with a populism that is significantly more anti-corporate than the Republican Party can comfortably handle. Politically, this heterodox approach is paying huge dividends. But it scares the hell out of the Republican establishment—not because of Trump’s bigotry, as some Republicans have pretended. Bigotry is the bread and butter of Republican politics: when Trump says horribly racist things, he’s merely saying as text what many Republicans prefer to leave as subtext.
No, the real reason that Republican elites are freaking out over the prospect of Trump becoming the nominee is their terror at his populism. The Republican Party has faced insurgents before—but never an insurgent as ideologically dangerous as Trump. The Tea Party represented a radicalizing of Republican orthodoxy, not a departure from it. The problem with the Tea Party from the Republican establishment’s point of view was that it took right-wing ideology too seriously, memorably threatening to nuke America’s credit rating to make a point about small government. Trump’s problem is that he doesn’t take it seriously enough.
What kind of conservative says hedge fund managers are “getting away with murder”? Or that he wants “fair trade,” not “free trade”?
If Trump loses and disappears, maybe his heresy doesn’t matter. But if Trumpism catches on—if Trump clones start popping up in races at the state and local level—then the corporations and wealthy individuals who form the power base of the Republican Party will start to worry. The Republican Party is a very reliable political organ of big business—even more reliable than the Democratic Party, which is saying something. If Trump or Trumpists take elected office and enact even a portion of their populism, the traditional link between big business and the Republican Party might start to fray. Then you might have a Republican Party that starts to resemble the far-right insurgent parties of Europe: ultranationalist, racist, and authoritarian, yet full of invective against free trade, economic elites, and the professional political class.
Fortunately, we have two parties in America, for the same reason that Google keeps multiple copies of your Gmail inbox in case one goes down: redundancy. If one party isn’t doing its part for what Bernie calls the billionaire class, there’s always another to take its place. If and when Trump wins the GOP nomination, every elite with any reasonable sense of class interest will line up behind Hillary Clinton. And, in fact, this is just what Trump wants. The capitalist class closing ranks around Hillary makes it easier for him to portray her as the quintessential insider, the establishment darling, the favorite of Wall Street. And easier to portray himself, the billionaire son of a slumlord, as the savior of the working class.
Cover of the first issue of Radical Software.
In 1970, a handful of video artists started a magazine called Radical Software in New York. The Internet didn't exist yet (its predecessor, ARPANET, had launched only a year earlier) and computers were still mostly big mainframes. But the Radical Software editors saw technology's future as decentralized, personalized, mobile, and interactive--"technology as ecology." They were way ahead of their time, and in some respects, still are.
I find their politics particularly inspiring. At a very early stage, they saw the emancipatory potential of digital technology while avoiding the pitfalls of techno-utopianism.
Here's an excerpt from the first issue's opening editorial:
As problem solvers we are a nation of hardware freaks. Some are into seizing property or destroying it. Others believe in protecting property at any cost including life or at least guarding it against spontaneous use. Meanwhile, unseen systems shape our lives.
Power is no longer measured in land, labor, or capital, but by access to information and the means to disseminate it. As long as the most powerful tools (not weapons) are in the hands of those who would hoard them, no alternative cultural vision can succeed. Unless we design and implement alternate information structures which transcend and reconfigure the existing ones, other alternate systems and life styles will be no more than products of the existing process.
Fortunately, new tools suggest new uses, especially to those who are dissatisfied with the uses to which old tools are being put. We are not a computerized version of some corrupted ideal culture of the early 1900's, but a whole new society because we are computerized. Television is not merely a better way to transmit the old culture, but an element in the foundation of a new one.
Our species will survive neither by totally rejecting nor unconditionally embracing technology--but by humanizing it: by allowing people access to the informational tools they need to shape and reassert control over their lives. There is no reason to expect technology to be disproportionately bad or good relative to other realms of natural selection. The automobile as a species, for example, was once a good thing. But it has now overrun its ecological niche and upset our balance of optimum living. Only by treating technology as ecology can we cure the split between ourselves and our extensions. We need to get good tools into good hands--not reject all tools because they have been misused to benefit only the few.
One of the features of "extended adolescence" is that college feels less like a discrete life phase and more like a template for everything. An example: roommates. Everybody has roommates in college, but the idea of having roommates forever is a relatively recent one. The number of 25-to-34-year-olds living with roommates increased by 39 percent from 2005 to 2015. The reason is pretty obvious: it's cheaper.
A lot of the commentary seems to suggest this is a temporary phenomenon: that at some point in the future, a better economy will lift millennials' miserable incomes, and they'll be able to afford to form single-family households like their parents.
More likely, however, is that this doesn't happen. When you look at the sectors where millennials are working (leisure, hospitality, retail, wholesale), their wages aren't going up anytime soon. But their rent will: as urban cores continue to reurbanize/gentrify, the cost of living is going to take a bigger and bigger bite of millennials' stagnant incomes. This means, inevitably, more roommates.
It might take a while, but I'm pretty convinced that these economics will eventually make the single-family household obsolete. Living with roommates is just the beginning: there are going to be a lot more experiments in collective living over the next several decades. You already have "co-living" spaces popping up in New York and Syracuse: so-called "dorms for grownups," where people rent small apartments connected to a common area where they can watch movies, play games, cook group dinners, etc.
It actually doesn't sound so bad. The rise (or, more accurately, return) of collective living might be for all the wrong reasons, but it could have pretty okay consequences. As any punk rocker can tell you, the suburbs are lonely. The single-family household probably never made sense at a human or economic level, so I'm not sure we should mourn its demise. Still, homeownership is traditionally how people become middle-class in America. They get an asset that's supposed to appreciate indefinitely, and the federal government makes it possible with massive tax subsidies and the world's biggest socialized mortgage market. Excluding a generation of Americans from homeownership means a lot more than making them live with roommates forever--it also means barring them from the American government's biggest free-money machine, and the main route to middle-class wealth accumulation. It means their tax dollars go into helping their parents pay their mortgage, without ever being able to afford one of their own.
San Francisco in the 23rd century, according to Star Trek.
I routinely get into arguments with my parents about the San Francisco housing crisis. I tell them it’s all their generation’s fault, because their small-is-beautiful-localist-preservationist-Jane-Jacobs-ism prevented SF from building the kind of high-density housing and robust public transit infrastructure that it needs to be a livable city. They tell me that I’d be a Jacobsite too, if I’d lived through the postwar era when “urban renewal” maniacs like Robert Moses were slaying neighborhoods in the name of progress.
Fair enough. It’s mostly a matter of perspective: in the 1960s, when the prospect of living in a dystopian Corbusier city seemed like a real possibility, it made sense to stick up for the virtues of smallness. These days, smallness feels like a dead-end: or, more specifically, it seems destined to produce a city that's a picturesque playground for the rich. Say what you will about Robert Moses—and there’s plenty to say, the man was a racist, a tyrant, and an all-around unpleasant dude—but he recognized that what defines cities is their bigness—and that urbanists should embrace that bigness, not try to recover some impossible village ideal. As cities emptied out in the postwar decades, especially after deindustrialization, it was easy to forget the point about bigness, because it didn’t seem to apply anymore. But now that re-urbanization is making cities crowded and economically powerful again, recovering that large-scale thinking is essential. Otherwise, cities simply become enclaves for elites. Jane Jacobs loved the Village, but she’d never be able to afford it these days.
For a much more informed take on Bay Area housing, and Prop F in particular, see Kim-Mai Cutler's very thorough breakdown in TechCrunch. There’s a lot of excellent stuff in there, but I’ve been chewing over one quote in particular:
The central dilemma in the U.S. and U.K. systems is that housing is both a durable good as shelter and an investable asset as a person’s primary form of wealth. That introduces a fundamental tension between policies that maximize property values and conversely, policies that support the goal of affordability.
Just because there's a ton of momentum behind weed legalization doesn’t mean it’s inevitable. Yes, weed's already legal in 4 states, and ballot measures in 2016 might make it legal in several more, including California. Still, we’ve been down this road before—many states decriminalized in the 70s, only to get tough again in the Reagan 80s.
But one thing we have now that we didn’t have back then is something resembling a legitimate weed industry. There’s a publicly traded weed social network, a YC-backed weed delivery service, even a weed business incubator in Denver (and they’re opening another location in SF in early 2016). Big Tobacco's been trying to get into weed since the 70s, and now it might finally get its chance (imagine a world in which your local liquor store sells Marlboro spliffs).
This is why Willie Nelson's worried about pot getting too corporate. But here’s what’s weird about the weed industry: it’s not really about weed. So long as marijuana remains illegal at the federal level (and it remains very illegal—marijuana is a Schedule I drug, which means the federal government considers it as dangerous as heroin, and more dangerous than coke and meth), people who sell weed can’t get banking services, even if they’re operating in states where weed is legal. That’s because banks are terrified of running afoul of the feds by taking weed money: best-case scenario, they lose their federal deposit insurance; worst-case scenario, they get prosecuted on money laundering charges.
One consequence of this is that the “legit" weed business is about paraphernalia and services, not actual weed. CanopyBoulder, the Colorado-based weed biz incubator, says your company can’t apply if you “touch the plant.”
How sustainable is this? Imagine a tobacco industry that only consisted of companies that sold lighters and cigar-cutters. Pretty weird. But until the law changes at the federal level, which is unlikely anytime soon, it’s hard to see how the businesses actually selling weed can become fully mainstream. There might be temporary workarounds, like using Indian casinos as weed banks (brilliant), but selling something that the federal government considers as illegal as heroin makes it pretty hard to build a legitimate business.
A "continental" designed by Benjamin Franklin.
In 1775, the American Revolution began, and the Continental Congress started printing paper currency ("continentals") to help fund it. They asked Benjamin Franklin to design the notes. Never one to squander a potential platform for his views, Franklin emblazoned the notes with emblems and mottos intended to instill republican virtues of hard work and self-reliance. The bill above says "Mind Your Business;" his six-dollar bill had the Latin word Perseverando (“Perseverance”), while his one-dollar bill read Depressa Resurgit (“Though Crushed, it Recovers”). As both elegantly executed works of art and cleverly disguised propaganda, Franklin’s continentals were the most visually interesting paper money that America had ever seen.
Group of "contrabands" in Virginia, May 1862. Photograph by James F. Gibson.
In the opening months of the Civil War, three slaves escaped. This wasn't anything new; in fact, it happened so often that in 1850, Congress passed the Fugitive Slave Law to make it easier for slaveholders to recover their "property." The difference, as Adam Goodheart discusses in his 1861: The Civil War Awakening, was that now the North and South were at war. The escaped slaves showed up at the Union-held Fort Monroe in Virginia, and provided valuable intelligence about the Confederate fortifications they'd been building. When a Confederate officer arrived at Fort Monroe to demand the slaves be returned, the Union commander, Benjamin Franklin Butler, came up with a brilliant legal loophole that changed the course of the war. As he recalled in his memoir:
"What do you mean to do with those negroes?" [said Major Carey, the Confederate officer.]
"I intend to hold them," said I.
"Do you mean, then, to set aside your constitutional obligation to return them?"
"I mean to take Virginia at her word, as declared in the ordinance of secession passed yesterday. I am under no constitutional obligations to a foreign country, which Virginia now claims to be."
"But you say we cannot secede," he answered, "and so you cannot consistently detain the negroes."
"But you say you have seceded, so you cannot consistently claim them. I shall hold these negroes as contraband of war, since they are engaged in the construction of your battery and are claimed as your property."
"Contrabands" flocked to the Union lines by the thousands, serving as informants, scouts, laborers, and, when the Army began recruiting blacks in 1863, soldiers. They helped pave the way for emancipation. Their status as property, as something that could be confiscated because they helped the enemy fight the war--like a cask of gunpowder--ironically helped make them free. The Civil War, especially for the North, was all about drift. What started as a conservative war--a war to preserve the Union as it existed--became a radical one, involving the abolition of slavery and the annihilation of the Southern way of life. "Contrabands" were the first crucial step.
Imagine an industry where seventy percent of your products lose money. You knit ten different types of wool socks. Seven don’t sell enough to cover the cost of the wool, while the other three are so popular they’re capable of keeping the whole enterprise afloat. This is the basic math of book publishing, a business model that’s evolved over the course of the last couple centuries and has alternately baffled, unnerved, and outraged the long list of hugely intelligent people who have given their lives to it. The “worst business in the world,” Doubleday’s cofounder Walter Hines Page called it, and even in flush times, the refrain is usually the same. It’s hard to think of another industry so perpetually prone to grumbling and self-hatred. As early as 1896, Publishers’ Weekly wondered whether the book business was “A Doomed Calling”—a question that, by the late nineteenth century, had already become a cliché.
Recently, the doomsaying has reached a fever pitch over the threat posed by e-books. Publishers fear that companies like Amazon will erode their margins by setting unreasonably low prices for digital books. Even more frightening is the possibility that the handful of bestselling authors who keep the industry solvent will start self-publishing through digital platforms, leaving publishers out in the cold. The apocalypse of American book publishing, after a hundred or so years of false alarms, seems finally to have arrived.

Read the rest at Lapham’s Quarterly’s Roundtable.
Walt Whitman and his (likely) lover Bill Duckett.
The word "homosexual" didn't appear in print until 1869, when a Hungarian writer named Karl-Maria Kertbeny wrote an anonymous pamphlet arguing against a proposed section of the Prussian legal code that would make homosexual acts illegal. Kertbeny's close friend, a gay man, had committed suicide after an extortionist threatened to expose him. Kertbeny wanted to make sure nothing like that ever happened again.
For much of its history, homosexuality had a language problem. Gay people lived in a world with no words for what they were, where homosexual love wasn't only forbidden but invisible--enciphered in metaphor, perhaps, but never plainly discussed.
We two boys together clinging,
One the other never leaving,
Whitman had his own word for homosexuality: "adhesiveness." He borrowed the term from phrenology: a popular pseudoscience based on the idea that the size and shape of a person's skull said something fundamental about their character. According to your cranial measurements, you could be classified as "adhesive," which meant you were highly prone to same-sex friendships. As science, phrenology was bullshit--but, by linking sexuality to an unalterable fact of physiology, Whitman was making a radical point: he was born this way.
When Edward Bellamy’s strange sci-fi novel Looking Backward: 2000-1887 appeared in 1888, it quickly became a runaway bestseller. It sold half a million copies in its first decade, putting it in the same league as Uncle Tom’s Cabin and Ben-Hur. Everyone from Mark Twain to William Jennings Bryan loved it. So how did a novel of far-out, time-traveling fantasy fiction become such an instant classic? Bellamy’s book had plenty of futuristic gadgets: credit cards, electronic broadcasting, electric light, and pneumatic tubes. But the real reason Looking Backward became so popular was because it offered an elegant solution to the crisis of late nineteenth century America: an era when the widening gap between rich and poor was threatening to destroy the country’s democracy. The utopian society imagined by Bellamy reconciles the loftiest ideals of the American Revolution with the realities of modern industrial society, a balancing act that, more than a century later, still eludes us.
I wrote an essay about Bellamy for Lapham’s Quarterly. It’s called “Magical Thinking” and you can read it on their website.
A woman being measured for Women's Measurements for Garment and Pattern Construction, a 1941 government report that laid the groundwork for women's clothing sizes.
Where did women's clothing sizes come from? In 1939, during the Great Depression, the Department of Agriculture teamed up with FDR's Work Projects Administration to undertake an ambitious bit of research that had never been attempted before: a scientific survey of women's body measurements. Fifteen thousand women participated. They were paid a fee--money spent on food, most likely, this being the Depression--to stand in their underwear behind a curtain while researchers took 57 different measurements of their bodies. The results, published in 1941 as Women's Measurements for Garment and Pattern Construction, helped the garment industry develop the sizing system still in use today.
In 2004, a new national survey--done with laser scanners, not rulers--found that American bodies had changed dramatically in the last sixty years. People had gotten thicker, especially around the middle. The population had also become much more diverse. How do you manufacture ready-made clothes for such a wide spectrum of shapes and sizes? 3-D clothing printers?
A phrenological chart, mapping the different zones of the human brain. “Destructiveness” is near the ear; “Individuality” is right above the nose.
Why do we talk about foreheads when discussing culture? The word “highbrow” first appeared in the 1880s; “lowbrow” came into use right after the turn of the century. They came from phrenology, a nineteenth-century pseudoscience based on the (entirely false) idea that the shape of a person’s skull revealed something fundamental about their character. The creative, intellectual parts of the brain were located behind the forehead: thus Anglo-Saxons were superior to other, darker races because of their higher foreheads, or “brows.” Italians, Irishmen, Africans, Asians couldn’t create art on the level of Shakespeare or Milton because their brains simply weren’t built for it. They belonged to the “lowbrow,” on account of their lower foreheads.
During the Civil War, New York City almost became its own country.
In January 1861, as the South began seceding from the Union, New York City Mayor Fernando Wood came up with an unusual idea. Declaring disunion "a fixed fact," the Tammany Hall Democrat proposed that New York become an independent nation composed of three islands--Manhattan, Long Island, and Staten Island--called "Tri-Insula." The city council approved the plan. "I would have New York a free city," declared one supporter, "not a free city with respect to the liberty of the negro, but a free city in commerce and trade." Independence would have concrete economic benefits, since it would enable New Yorkers to continue their lucrative trade in Southern cotton. The Free City of Tri-Insula might've become a reality, if it weren't for the attack on Fort Sumter in April 1861, which triggered an outpouring of patriotism throughout the North and effectively killed Wood's proposal. Later that year, he lost his re-election campaign, and Tri-Insula became another footnote in the long history of American secession.
Cocaine, as Rick James said, is a hell of a drug. South Americans have been chewing coca leaves for centuries, but the drug’s modern history began in 1855, when a German scientist named Friedrich Gaedcke isolated the active ingredient. Cocaine became a popular medicine: doctors used it as an anesthetic, a stimulant, and an antidepressant.
It also became popular for other, less clinical purposes. A French chemist made a wine called Vin Mariani by infusing Bordeaux with coca leaves: his customers included Queen Victoria, the Pope, and Ulysses S. Grant, who sipped it while writing his memoirs. Vin Mariani’s success inspired John Pemberton to create a nonalcoholic coca drink called Coca-Cola.
Cocaine has stimulated a lot of famous nervous systems over the years. The list includes Robert Louis Stevenson (who allegedly wrote the Strange Case of Dr. Jekyll and Mr. Hyde in less than a week while coked out of his skull), Jules Verne, and Thomas Edison. Its best-known abuser might be Sigmund Freud, who published his first scientific paper on cocaine: “Über Coca” (1884). In June 1884, the 28-year-old doctor wrote his fiancée about the essay he planned to write.
Woe to you, my Princess, when I come. I will kiss you quite red and feed you till you are plump. And if you are forward you shall see who is the stronger, a gentle little girl who doesn’t eat enough or a big wild man who has cocaine in his body. In my last severe depression I took coca again and a small dose lifted me to the heights in a wonderful fashion. I am just now busy collecting the literature for a song of praise to this magical substance.
The Concord coach was the plane, train, and automobile of nineteenth-century America. Ben Holladay used them for his Overland Stage Route to California, later absorbed by Wells Fargo. These sturdy little carriages—named after Concord, New Hampshire, where they were designed—could withstand terrain so rugged it would scare the gnarliest SUV. They connected America in the days before railroads and highways and airports, when traveling from one side of the country to the other took weeks, not hours. What distinguished the Concord coach was its superior suspension: strips of leather called “thorough-braces” suspended the carriage and allowed it to rock back and forth, absorbing the shocks of the road.
“The Curse of California,” with the Central Pacific railroad monopoly as a hungry red octopus. Mark Hopkins and Leland Stanford are its eyes; farmers, miners, lumberjacks, and other victims are entangled in its tentacles. First published August 19, 1882 in The Wasp during Ambrose Bierce’s editorship.
Americans love using metaphors in their politics. They love to make complex issues intelligible by speaking in symbols. During the late nineteenth century, when a new industrial order was creating unprecedented consolidations of capital, a metaphor emerged to describe the powerful corporations built by Cornelius Vanderbilt, Andrew Carnegie, and other tycoons: the Octopus. Many cartoonists drew the Octopus (the National Humanities Center has even compiled a collection), and Frank Norris wrote a novel called The Octopus: A Story of California.
The Octopus is a great metaphor for monopoly: it’s big, slimy, sinister, and can squeeze a lot of things at once. What’s the right spirit animal for today’s economic overlords? Swordfish? Salamander?
The clientele pose at the Cuckoo's Nest, a bar in San Francisco, undated. Courtesy of the San Francisco Public Library.
In San Francisco, I get into a conversation with a cab driver about the old Port, the once bustling industrial waterfront that disappeared after the 1960s, when containerization forced the whole industry across the bay to Oakland. Back then, my cab driver reminisces, the city had a real nightlife: the bars and diners stayed open all night, serving the dockworkers who unloaded ships around the clock. This reminded me of my favorite map in Rebecca Solnit’s amazing Infinite City, showing the “The Lost Industrial City of 1960 and the Remnant 6 AM Bars.” Nothing like that exists anymore, my cab driver says. Now it’s sports bars, and parasites. “Bankers, brokers, and lawyers,” he says. “Which are you?”
On the night of November 3, 1870, Laura D. Fair pulled a pepperbox pistol from her cloak and drilled a single bullet into the chest of Alexander Parker Crittenden. Crittenden died forty-eight hours later, after a painful struggle.
The two were lovers: Crittenden, a prominent lawyer and legislator, had taken Fair, a former actress, as his mistress. Crittenden frequently told Fair he’d divorce his wife and marry her. When it grew clear he never intended to keep his promise, Fair swore revenge.
The subsequent trial became a national sensation. It captured the fears of adulterous men everywhere, and drew the sympathy of women’s rights crusaders like Elizabeth Cady Stanton and Susan B. Anthony. Fair’s first trial resulted in a conviction, overturned on a technicality; her second ended in acquittal. Upon her release, she promised to lecture on the topic of morality. Although an angry mob of men prevented her from speaking, she remained defiant to the end. Her advice to wronged women was simple: instead of waiting for men to do the right thing, take matters into your own hands. “[W]hen an American woman in justice avenges her outraged name,” she wrote, “the act will strike a terror to the hearts of sensualists and libertines.”
La vie boheme arrived in America in the late 1850s, care of a Nantucket-born radical named Henry Clapp, Jr. After three years in France spent loafing in the Latin Quarter, Clapp hoped to bring Parisian cafe culture to New York City. For his headquarters he chose a gritty German bar called Pfaff’s at 647 Broadway. It was an underground cellar, minimally furnished. There Clapp and his circle passed hours of inspired idleness, chugging mugs of cheap beer. Like their French forebears, they embraced lives of poverty and vice. They drank to excess, smoked opium and hashish, contracted venereal diseases.
Clapp had an unpleasant reputation—a “snapping turtle” is how William Dean Howells described him. Yet he drew fascinating men and women into his orbit, spearheading a short-lived experiment in American counterculture on the eve of the Civil War. His associates included Adah Isaacs Menken, the daughter of a French Creole mother and a free black father who converted to Judaism for her first (of four) husbands—and, more importantly, became an actress whose scandalously erotic performances made her an international sex symbol. Another was Fitz Hugh Ludlow, a consciousness-expander along the lines of Aldous Huxley and Timothy Leary whose memoir The Hasheesh Eater is a founding document of American psychedelia. There was also Walt Whitman, who had just published the second edition of his Leaves of Grass after encouraging praise from Emerson. He wrote an unfinished poem about his time at Pfaff’s:
The vault at Pfaffs where the drinkers and laughers meet to eat and drink and carouse
While on the walk immediately overhead pass the myriad feet of Broadway
As the dead in their graves are underfoot hidden
In 1856, a twenty-one-year-old Mark Twain was stranded in Keokuk, Iowa, working for his brother’s printing office, bored to death by the small town’s soporific pace. Restless, he needed a change. He started reading about the Amazon River, and soon cooked up a scheme to sail to Brazil. In August, he wrote to his younger brother Henry about his plans. Fifty-four years later, he reminisced about the episode in an essay published just two months before his death in April 1910:
Among the books that interested me in those days was one about the Amazon… [H]e told an astonishing tale about coca, a vegetable product of miraculous powers, asserting that it was so nourishing and so strength-giving that the native of the mountains of the Madeira region would tramp up hill and down all day on a pinch of powdered coca and require no other sustenance. I was fired with a longing to ascend the Amazon. Also with a longing to open up a trade in coca with all the world. During months I dreamed that dream, and tried to contrive ways to get to Para and spring that splendid enterprise upon an unsuspecting planet.
In short: Mark Twain, at twenty-one, almost became a drug dealer. He wanted to go to Brazil and start importing cocaine into the United States. He got as far as New Orleans before he decided to become a steamboat pilot instead.
In the September 16, 1852 issue of the Hannibal Journal, a young Mark Twain clowns a rival editor.
At sixteen, the boy who would become Mark Twain worked as a typesetter for his brother’s newspaper, the Hannibal Journal. In the summer of 1852, his brother took a business trip and left Twain in charge. The boy—small, impish, high-strung—took advantage of his brother’s absence to print a caustic attack on a rival editor, Josiah P. Hinton. Hinton had recently lashed out at the Journal for an editorial about dogs barking at night. Twain retaliated by making a crude woodcut with his penknife that pictured Hinton with a dog’s head, headed towards Bear Creek with a bottle of booze. Hinton had recently tried to drown himself after being rejected by a woman—the whole town knew the story, and Twain spun it into his satire. Hinton made some indignant noises in reply but basically crumpled. The boy with the short temper and savage wit had won his first literary feud.
"Highway #2" by Edward Burtynsky. Los Angeles.
Vacant land is an important idea in America. Vacuum domicilium was what John Winthrop called it: land that hadn’t been put under cultivation, and thus free for the taking from the Indians. Thomas Jefferson thought that democracy required a limitless reserve of empty land: a frontier where poor emigrants could escape the overcrowding and poverty of the Eastern cities and become farmers. So long as empty space existed, American democracy was safe.
So what happens when we run out of space? Jefferson was pessimistic. “When we get piled upon one another in large cities, as in Europe, we shall become corrupt as in Europe, and go to eating one another as they do there,” he wrote to James Madison in 1787.
In 1857, Lord Macaulay offered an even bleaker assessment in a letter to Henry S. Randall, a biographer of Jefferson’s:
As long as you have a boundless extent of fertile and unoccupied land, your laboring population will be far more at ease than the laboring population of the Old World, and, while that is the case, the Jefferson politics may continue to exist without causing any fatal calamity. But the time will come when New England will be as thickly peopled as old England. Wages will be as low, and will fluctuate as much with you as with us. You will have your Manchesters and Birminghams, and in those Manchesters and Birminghams hundreds of thousands of artisans will assuredly be sometimes out of work. Then your institutions will be fairly brought to the test…
When a society has entered on this downward progress, either civilization or liberty must perish. Either some Caesar or Napoleon will seize the reins of government with a strong hand, or your republic will be as fearfully plundered and laid waste by barbarians in the twentieth century as the Roman Empire was in the fifth; with this difference, that the Huns and Vandals who ravaged the Roman Empire came from without, and that your Huns and Vandals will have been engendered within your own country by your own institutions.
Los Angeles in 1871. Courtesy of the Bancroft Library.
Los Angeles was once the most dangerous city in America. In 1850, two years after Mexico officially surrendered, it boasted the highest murder rate in the country: one per day, on average. Its unpaved streets and adobe huts were the setting for constant shoot-outs: cowboys, crooks, and gamblers predominated, armed to the teeth. The justice system barely existed—lynchings were about as common as legal executions—and the law inevitably favored Anglos over Mexicans. Intense racial animosity between the two groups often erupted in violence.
When a Presbyterian minister from Massachusetts named James Woods visited, this is what he wrote in his diary:
April 29: 1855 Sabbath—
Between four and five oclock in the afternoon. And all around my house near the head of main Street, are hundreds of Spaniards in all sorts of revelry and noise—men on horse back—women on foot—children crying—and such a constant gibber jabber, as would remind one of bedlum. Horse racing is the object calling the crowd together. Several races have already occured this afternoon and also a fight or two… I would have left the house, but was afraid of its being broken open thefts committed… A poor child with its wretched mother I suppose, is crying constantly with cold—And now the dogs are in a fight. Just now a row was raised by one man trying to ride over another… I hear every few minutes the voices and conversations of Americans, betting, cursing, blaspheming as they stand leaning against my window blind—this is nominally a christian town, but in reality heathen.
"Nevada 2" by Thomas Struth, 1999.
From Wallace Stegner’s essay “Living Dry,” included in Where the Bluebird Sings to the Lemonade Springs (1992):
“Scale is the first and easiest of the West’s lessons. Colors and forms are harder. Easterners are constantly being surprised and somehow offended that California’s summer hills are gold, not green. We are creatures shaped by our experiences; we like what we know, more often than we know what we like. To eyes trained on universal chlorophyll, gold or brown hills may look repulsive. Sagebrush is an acquired taste, as are raw earth and alkali flats…
You have to get over the color green; you have to quit associating beauty with gardens and lawns; you have to get used to an inhuman scale; you have to understand geological time.”
For the last several years I’ve kept a notebook with passages from my favorite books, something to look at when I’m feeling clouded or sluggish. They’re not always inspiring—some are pretty dark—but they’re all written with the kind of brain-puncturing clarity that makes it feel like the author is sitting next to you, speaking—something to aspire to. The following is from Richard Wright’s “How ‘Bigger’ Was Born,” an essay about the process of writing his novel Native Son (1940).
With the whole theme in mind, in an attitude almost akin to prayer, I gave myself up to the story. In an effort to capture some phase of Bigger’s life that would not come to me readily, I’d jot down as much of it as I could. Then I’d read it over and over, adding each time a word, a phrase, a sentence until I felt that I had caught all the shadings of reality I felt dimly were there. With each of these rereadings and rewritings it seemed that I’d gather in facts and facets that tried to run away. It was an act of concentration, of trying to hold within one’s center of attention all of that bewildering array of facts which science, politics, experience, memory, and imagination were urging upon me. And then, while writing, a new and thrilling relationship would spring up under the drive of emotion, coalescing and telescoping alien facts into a known and felt truth. That was the deep fun of the job: to feel within my body that I was pushing out to new areas of feeling, strange landmarks of emotion, cramping upon foreign soil, compounding new relationships of perceptions, making new and—until that very split second of time!—unheard-of and unfelt effects with words. It had a buoying and tonic impact upon me; my senses would strain and seek for more and more of such relationships; my temperature would rise as I worked. That is writing as I feel it, a kind of significant living.
Lola Montez was an Irish-born dancer and courtesan. Her friends, lovers, and clients included Franz Liszt, Alexandre Dumas, and King Ludwig I of Bavaria, who rewarded her affections by making her (briefly) a countess. In 1851, she came to the United States, and in San Francisco first performed her notorious “Spider Dance”—in which she pretended to be bitten by a spider, flailing and wiggling in a way calculated to induce maximum lust in the mostly male audience. In 1858, she published The Arts of Beauty: or, Secrets of a Lady’s Toilet, excerpted below.
If Satan has ever had any direct agency in inducing woman to spoil or deform her own beauty, it must have been in tempting her to use paints and enamelling. Nothing so effectually writes memento mori on the cheek of beauty as this ridiculous and culpable practice. Ladies ought to know that it is a sure spoiler of the skin, and good taste ought to teach them that it is a frightful distorter and deformer of the natural beauty of the “human face divine.”… And let no woman imagine that the men do not readily detect this poisonous mask upon the skin. Many a time have I seen a gentleman shrink from saluting a brilliant lady, as though it was a death’s head he were compelled to kiss.
Without a fine head of hair no woman can be really beautiful. A combination of perfect features, united in one person, would all go for naught without that crowning excellence of beautiful hair. Take the handsomest woman that ever lived—one with the finest eyes, a perfect nose, an expanded forehead, a charming face, and a pair of lips that beat the ripest and reddest cherries of summer—and shave her head, and what a fright would she be! The dogs would bark at, and run from her in the street.
When I went to Charlottesville for the Virginia Festival of the Book, I found a store with a ton of Confederate bills for sale. I always knew the notes were cheap—the South printed a huge amount of money during the Civil War—but I was still surprised by just how cheap: anywhere from ten to thirty bucks apiece. So I couldn’t resist: I bought up a wide selection, with a few examples above.
What’s amazing is the visual diversity of Southern money. The Confederate government in Richmond wasn’t the only one printing cash during the war: individual states printed it too. The Confederate Constitution copied most of the original Constitution verbatim, but made a few revolutionary changes, including lifting the ban on state governments printing paper money. This greatly increased the amount of inflationary money in circulation, undermining the Southern war effort by making its currency virtually worthless. The best part is that many of these paper bills weren’t backed by anything but more paper. Take a look at the second note above: “The State of Georgia will pay the bearer fifty cents at the Treasury in Confederate Treasury Notes.” The value of each state’s money, in other words, relied on the value of the national money, which itself relied on promises the Southern government couldn’t possibly fulfill. Even so, the notes they printed are remarkable works of art, filled with vignettes celebrating the Southern way of life. And more than a century later, they’re still cheap.
To the uninitiated, San Francisco in the 1860s was a strange sight. It was densely urban, yet unmistakably Western; isolated yet cosmopolitan; crude yet cultured. It belonged to America yet held itself apart, swearing its allegiance while celebrating its independence. It appeared virtually overnight in 1849, created by one of the largest mass migrations in history. People came to strike it rich, yet they also cared deeply about culture. The gold they dug from the ground didn’t just build brothels and saloons—it also built theaters, opera houses, and music halls. It financed a large publishing industry, including twelve daily newspapers and a number of literary weeklies.
The motivation for all this was simple: boredom. Stranded on the far side of the continent, thousands of miles from home, in crude camps of the kind described by “Dame Shirley” and other correspondents of the Gold Rush, early Californians lived in a cultural vacuum. After long, grueling days at the diggings, they needed some entertainment. So they told each other stories—not romantic, pietistic tales but ironic, unsentimental ones. They had been lured West by fantasies of wealth and instead found an unforgiving reality, where a handful made a fortune and the vast majority destroyed themselves with liquor or overwork. Irony became the way Californians dealt with this disenchantment: “the Western predilection to take a humorous view of any principle or sentiment,” in Bret Harte’s words. They loved hoaxes, satire, burlesques. They loved to poke fun at the saintly and the self-serious. They had lost faith in what later generations would call the American Dream—the idea that any man could become a millionaire through hard work—and this made them a uniquely skeptical and subversive bunch.
On August 9, 1872, a pair of Western businessmen filed a patent. Jacob Davis, a Latvian tailor living in Reno, and Levi Strauss, a German dry-goods wholesaler in San Francisco, had come up with an idea for a new kind of denim work pant. They wanted to sew copper rivets into the seams. These would be placed at the pant’s most vulnerable points—the corners of the pockets, for instance, or the bottom of the button fly—to prevent the fabric from tearing, making the pants far more durable for the workers who wore them. Blue jeans were born.
Chinese vegetable garden in Cow Hollow, c. 1880s.
I grew up in San Francisco, in a neighborhood called Cow Hollow. I was always curious about what the name meant. I remember asking my parents about it. They said it was because cows used to graze there, a long time ago.
I grew up there during the dot-com boom of the 90s, and we had a lot of Internet types moving in. Since there’s never enough street parking in San Francisco, they used to park their SUVs on the sidewalk right in front of our house. They’d get ticketed for it but they were making so much money that they didn’t care, like that was the price of parking. They’d have barbecues on the sidewalk, get tanked on tequila, try to snatch our skateboards. It used to drive us crazy.
It was back then that I asked my parents about the name of the neighborhood. I tried to picture the cows, tried to make it work in my head, but I just couldn’t see it. I decided my parents were wrong. I figured Cow Hollow was probably one of those quaint countrified names a real estate agent had come up with to sell houses to dot-commers.
But it turns out, as with many things, my parents were right. There used to be a ton of cows in Cow Hollow.
In 1849, when the Gold Rush brought hordes of gold speculators to San Francisco and rapidly transformed the small Mexican village into a major city, Cow Hollow was a valley irrigated by several creeks, with a large freshwater pond. It was an ideal place to graze cattle. The first dairy sprang up in 1861, and more soon followed. On land now occupied by cupcake shops, clothing boutiques, and sports bars, there existed hundreds and hundreds of cows, supplying milk to the growing population of San Francisco. The city needed it, because no large agricultural region yet existed: San Francisco developed so rapidly that most food was imported rather than grown in California.
That soon changed. By 1890, there were roughly 800 cows roaming the area. And cows weren’t the only thing in Cow Hollow. The Chinese kept large vegetable gardens in the area. They would peddle the vegetables on the street, or sell them to local cooks. By the late 19th century, tanneries, slaughterhouses, and sausage factories had moved into the area, generating sewage.
The conditions at the dairies deteriorated, and a series of articles appeared in local papers about sick cows and contaminated milk. In 1891, the Board of Health banished the cows. By then, the area had become more fashionable, and wealthier San Franciscans had built ornate Victorian homes. Once the dairies and the gardens left, the area became entirely residential.
In the summer of 1854, William Tecumseh Sherman described Oakland as:
a low sandy piece of ground on the San Antonio Creek directly opposite this city with oak trees beautifully distributed. The climate is much milder than [San Francisco] as the winds do not blow so hard and the fog scatters before it crosses the bay. The titles to land are not well settled, yet quite a town has grown up there and two steam ferry boats cross with considerable regularity.
Half a century later, Jack London grew up along the city’s waterfront. From his novel Valley of the Moon (1913):
‘No more Oakland. No more living in Oakland. I’ll die if I have to. It’s pull up stakes and get out.’
He digested this slowly.
‘Where?’ he asked finally.
‘Anywhere. Everywhere. Smoke a cigarette and think it over.’
He shook his head and studied her.
‘You mean that?’ he asked at length.
‘I do. I want to chuck Oakland just as hard as you wanted to chuck the beefsteak, the coffee, and the butter.’
She could see him brace himself. She could feel him brace his very body ere he answered.
‘All right then, if that’s what you want. We’ll quit Oakland. We’ll quit it cold. God damn it, anyway, it never done nothin’ for me, an’ I guess I’m husky enough to scratch for us both anywheres. An’ now that’s settled, just tell me what you got it in for Oakland for.’
And she told him all she had thought out, marshaled all the facts in her indictment of Oakland, omitting nothing.
From Mark Twain’s The Innocents Abroad (1869):
“Men lived long lives, in the olden time, and struggled feverishly through them, toiling like slaves, in oratory, in generalship, or in literature, and then laid them down and died, happy in the possession of an enduring history and a deathless name. Well, twenty little centuries flutter away, and what is left of these things? A crazy inscription on a block of stone, which snuffy antiquarians bother over and tangle up and make nothing out of but a bare name (which they spell wrong)—no history, no tradition, no poetry—nothing that can give it even a passing interest. What may be left of General Grant’s great name forty centuries hence? This—in the Encyclopedia for A.D. 5868, possibly:
‘URIAH S. (OR Z.) GRAUNT—popular poet of ancient times in the Aztec provinces of the United States of British America. Some authors say flourished about A.D. 742; but the learned Ah-ah Foo-foo states that he was a cotemporary of Scharkspyre, the English poet, and flourished about A.D. 1328, some three centuries after the Trojan war instead of before it. He wrote Rock me to Sleep, Mother.’
Charles Warren Stoddard (1843-1909) suffered for his secrets. From an early age, he knew he had something to hide. He was what his idol and sometime correspondent, Walt Whitman, would call “adhesive”—homosexual. Dreamy and delicately built, he also made a natural poet.
One evening in 1861, he paced along the south side of Clay Street in San Francisco. He lingered at a box belonging to a literary weekly called the Golden Era, and spent a full hour summoning the courage to slide his submission through the slot. When the Era appeared the following Sunday, he bought a copy and leafed breathlessly through it. On the fifth page was his poem, accompanied by an appreciative note from the editors, requesting the anonymous author for more.
Stoddard was seventeen when he published his first poem. By day he worked in a bookstore, dusting the shelves. By night he wrote effusive, sensuous verse that swiftly secured his place as the boy wonder of the San Francisco literary scene. But his writing wasn’t the only thing to recommend him. “More delightful than either his prose or his verse,” William Dean Howells remembered, “was the man himself”—what Howells called his “utter loveableness.” Feminine, with pleading blue eyes and an exuberant laugh, Stoddard possessed a relentlessly endearing quality that made him a beloved companion to many writers. Yet he also had a more melancholy aspect, linked to his submerged homosexuality. He loved being in love, even—or especially—with those who didn’t reciprocate his affections. He came to expect rejection, occasionally to take a kind of pleasure in it. He lived in a world with no words for what he was, where gay love was not only forbidden but invisible—enciphered in metaphor, perhaps, but never plainly discussed.
These torments made him fragile, prone to depression. In 1864, after a brief stint as a college student, he suffered a nervous breakdown and sailed for Hawaii, where his sister lived. He adored it. The lushness of the scenery seemed ideally suited to his aesthetic; the sensuality of the natives helped liberate his own. He explored gay relationships with local boys, beginning a lifelong enchantment with the tropics. When he returned to San Francisco in 1865, he abandoned his studies and committed himself to a literary life. He grew close to Bret Harte, who gave invaluable guidance, and Ina Coolbrith, who advised him to marry—advice she notably never followed herself. In 1867, with Harte’s help, Stoddard published his first book. The response was withering: critics on both coasts slammed his poetry as derivative and overblown. The author was devastated. He renounced poetry, converted to Catholicism, and returned to Hawaii.
His Catholic faith and his infatuation with “primitive” cultures of the Pacific belonged to the same impulse. Both offered anti-modern alternatives to the dreary materialism of midcentury America. “I couldn’t be anything else than a Catholic,” he confessed to a friend, “—except—except a downright savage, and I wish to God I were that!”