
Style and eccentricity in wine


Two articles struck me this week, in publications that, you might say, are diametrically opposed to each other: The New York Times and Playboy. While the topics are different, I hope to be able to draw a connection between them, as concerns our current wine culture.

The Times article was about a fashion designer, Isabella Blow, whose glory years were the 1970s-1990s, and who now is the subject of a retrospective in London. Isabella was certainly a couture eccentric: the author, Andrew O’Hagan, describes her wearing “giant mink antlers” and “a sneering mouth so red with lipstick that it was like an open wound.” (Blow is Lady Gaga‘s spiritual grandmother.) She had a “phantasmagoric sense of fashion [and] beauty” that O’Hagan says is missing today, when too many people are mere “imitators” of fashion, “publicity scavengers…who think it’s merely about fame or attention.”

Isabella Blow

Other style setters whom O’Hagan admires are the famously infamous writer Quentin Crisp, Anna Piaggi, who wrote for Vogue, and the recluse Edith Bouvier Beale, Jackie Kennedy’s cousin, who lived and died alone in a falling-down mansion filled with garbage, even as she dressed as outrageously as anyone in the Hamptons.

O’Hagan’s point isn’t necessarily a new one: celebrate style. Be yourself, and unafraid to show the world who you are. He quotes another of his muses, Elsie de Wolfe: “Only those are unwise who have never dared to be fools.” When I read that, I immediately thought of those California vintners who are daring to march to a different beat from today’s consumer favorites. Not for them another oaky Cabernet Sauvignon or Chardonnay. No, they want to split off from the crowd and explore niches that interest them. I think of someone like Marimar Torres. True, she makes great Pinot and Chardonnay, and could easily get by with only them, but instead she pops out of the envelope with such interesting blends as her Chardonnay-Albariño and Syrah-Tempranillo. There’s Cambiata, whose Tannat is at the top of the list in California, even though most consumers wouldn’t know Tannat if it walked up to them and punched them in the nose. Or ONX’s Reckoning, which daringly combines Syrah, Petite Sirah, Zinfandel, Tempranillo and Grenache in a wholesome way. These are wines of a certain eccentricity, perhaps not for everyone: but they are wines of beauty and artistry.

The Playboy article, Talkin’ ‘Bout Your Generation, is funny and trenchant. The writer skewers every generation born during the 20th century (including mine, the Baby Boomers) right through Generation Z (born after 2000). You have to smile as you read his descriptions. Here’s a snippet from “Generation Y, AKA The Millennials”: “They’ve earned the nickname the Me Me Me Generation for a reason: They’re three times more likely than Boomers to have narcissistic personality disorder. Materialism and a lofty sense of entitlement–minus the means to realize their caviar dreams–have contributed to breathtaking delusions of grandeur. Generation Y is arguably the most medicated on record, their hazy state and sedentary social-media lifestyle contributing to a rise of obesity and its BFF, diabetes.” As for their obsession with social media: “Millennials who tried to quit social media showed the same symptoms as drug addicts in withdrawal.” Ouch.

I’ve tried to live my life in a way where I didn’t much care what anybody thought of me. And I like people who feel the same way. People of style are generally people of honesty and integrity. You can’t have integrity if you follow the herd, because having integrity takes guts. You have to be willing to take risks, to split off from the mainstream and explore new, and sometimes unpopular, dimensions. When I was in grad school, I’d take BART (the San Francisco subway) to S.F. State, outbound from downtown, and look at the mobs of people on the platform across from me, heading to the office towers of downtown. They all looked the same, dressed in severe business attire (men and women; we called it Financial District drag), with their little leather attaché cases and bored faces. I didn’t scorn them so much as feel sorry for them. They were just doing what they thought they were supposed to do–what everyone else was doing–what they hoped would bring them money and happiness.

Perhaps as a child of the Sixties I tend to romanticize the outlaw view, that people who “celebrate diversity” (to use that phrase) contribute more to humanity’s spectrum and upward spiral than those who remain confined within narrow limits. (I think of Steve Jobs in that respect, a hippie if ever there was one.) My sense of style tends to conform to O’Hagan’s; as he writes, “the true eccentric gives us more mystery, more wonder about being human, a new side to beauty…”. Wine is like that, too. There aren’t very many eccentrically mysterious wines being produced today in California, because most proprietors are too concerned with the bottom line to take risks. But I sense that may be changing. As for those Millennial social media addicts, I suppose the ultimate risk would be a Digital Sabbath: put the smart phone down and connect with the real world.

I’m off to Seattle today to celebrate Thanksgiving with my “northern” family. I’ll try to post something every day this week. Meanwhile, here’s wishing you a happy, healthy and safe Thanksgiving!


What today’s social media means for tomorrow’s wine industry


The advent of the Millennials and social media is said to be revolutionizing consumer behavior in wine to such an existential extent that the Old Order is in dire threat of imminent demise.

From this historical vantage point, some people say that the wine world has gone through two major eras and is now entering a third. Wine 1.0, which lasted for a millennium, saw a few European regions dominating that continent; wine was virtually non-existent in the rest of the world. Wine 2.0, which began roughly in the late 19th century and continues today, saw the emergence of the New World. But that was, in reality, merely an extension of Wine 1.0, because the New World mostly meant the former colonies of England (Australia, South Africa, New Zealand, America), which carried English traditions to the farthest points of the globe, perpetuating the dominance of Cabernet Sauvignon, Pinot Noir, Chardonnay, Sauvignon Blanc, etc. This continuity of English tradition, with its focus on rank, privilege and status, also guaranteed the public’s ongoing fascination with the Great Growths/Grand Crus of France, as well as their equivalents (decreed by cognoscenti) in the New World (Penfolds Grange and Harlan Estate, for example).

Now, according to the new historians, we see the nascent parameters of Wine 3.0. These are clear and distinct. One is that the world has shrunk so that ideas are now global rather than regional. Another is that technology has made the spread of ideas instantaneous. For the first time in history, an idea does not need a physical mode of transportation to convey it to the farthest reaches of the planet: the mere click of a mouse now does that. A third leg of this analysis is that a new generation (Millennials) is fundamentally different from its forebears, if for no reason other than that they grew up in a reality in which the first two parameters (a shrunken world and instantaneous transmittal of information) were taken for granted. The result, says this new interpretation, is that wine has been liberated from the shackles that bound it for centuries.

This is an attractive analysis for those who argue for a more liberal interpretation of history–such as, for example, the one governing a view of America that sees our country continually spiraling upward and outward in recognizing the human rights of all its inhabitants (notwithstanding that the reality of this view is not always consonant with the theory). Thus, the democratization of human society both anticipated and parallels the democratization in consumer wine preferences. According to this view, wines of any variety, style or flavor now may be permitted to stand beside glorious Bordeaux/Cabernet Sauvignon or Burgundy/Pinot Noir: younger consumers don’t care anymore about those old paradigms, nor do they care about Authority. Tannat, Furmint, Rkatsiteli, Welschriesling, Savatiano–Millennials happily embrace them all, perhaps all the more exuberantly because these grapes were formerly under-appreciated by those very Authorities whom they reject as arrogant and irrelevant.

This certainly is a viable, even compelling way of looking at things; the fact that it accords well with our own American experience in democratization adds vigor to it. (White male property owners at first had all the rights. Then came non-property owners, women, 18-year olds, African-Americans, the handicapped, the GLBT community; PETA is hoping animals may be next. If white male property owners were Bordeaux, the GLBT community is Rkatsiteli.) The argument is further enhanced when critics of the old school embrace it, as Jancis Robinson did last week, when, in Washington, D.C. to promote her new book (“The World Atlas of Wine,” co-written with Hugh Johnson), she declared that “This democratization of wine is great.”

Jancis might have been reciting the talking points of the blogging community when she added, “No longer are wine critics and reasonably well-known wine writers like me sitting on a pedestal, haughtily handing down our judgments.” This is if-you-can’t-beat-‘em-join-‘em-ism at its most resilient, although I do wonder if Jancis really thinks of herself (much less Hugh Johnson, God forbid) as “haughty.” At any rate, you can hardly blame a critic these days for going over to that side of the fence.

But I would like to segue now into history to make my point, which is that (as your financial statements constantly remind you), “past performance is not necessarily indicative of future results.” If the study of history proves anything, it is how utterly useless it is in predicting the future. We in the West like to assume that history proceeds according to some kind of orderly, predictable template, like the unfolding of a computer program, so that a proper understanding of the past can result in a fairly accurate knowledge of the future: not necessarily in detail, but in general outline. This philosophy was most famously summed up in Santayana’s slogan that “Those who cannot remember the past are condemned to repeat it.” We study history in order to more perfectly align with its forward direction.

Alas, reality has the unpleasant tendency to throw curveballs at us, upsetting the best-laid plans of men. (Heisenberg understood this tendency toward the erratic in the realm of the sub-atomic.) I referred earlier to Wine 1.0, which was totally dominated by Europe (“Old Europe,” Donald Rumsfeld contemptuously called it.) So, too, has the long political history of the West been dominated by events in Europe (and, after the year 1000 A.D. or so, the entire planet: when Europe coughed, the World caught cold). We saw this appalling phenomenon with the two World Wars, and then with the advent of the Cold War, which quickly spread to every continent on Earth.

As a result, my generation–the Baby Boomers–was obsessed with Europe. As a history buff, I’ve read about Europe all my life, and can tell you that, before 9/11, there was hardly a serious history book that even mentioned Islam. The Muslim world was seen merely as an adjunct of the great Western powers (subsequently joined by China, hardly a Muslim nation). The study of Western history tended to be about the causes and aftermath of World War II and the Cold War, and how those continuing power politics were shaping the political and economic realities of the world.

Suddenly, 9/11 occurred–and now Europe, with all its past problems and glories, seems almost irrelevant. If you know history at all, you will find that shocking. And yet, it happened: Europe was wiped out of global historical calculations overnight. History threw a curveball at the world, which didn’t see it coming. And the world now is scrambling to catch up.

I say these things simply to point out the uselessness of predictions based on prior assumptions–the assumption that how things appear today is necessarily prescriptive of how they will be tomorrow. We do certainly have a spike of interest in this social media phenomenon–an interest pushed by media eager to report on “trends”. But one cannot extrapolate from this any conclusions concerning how wines will be described, popularized, marketed or sold in the future, much less what kinds of wines the people of the world will demand. A fundamental truth of human experience is: For now, we see through a glass, darkly. Paul’s conclusion from that is that mankind ought to be charitable. Mine is that proprietors of wineries ought to be skeptical.

Could the Internet itself be the curveball that history has tossed at the world? Yes. But the outcome of that phenomenon is no more predictable than that of the world’s current situation vis-à-vis the rise of militant Islam. Nobody knows where that is going, and to make any predictions whatsoever based on what has happened in the past is futile and possibly dangerous.

My friend Rajeev, whom I mention here from time to time because he is emblematic of so many other small business owners, just enrolled in a social media course in Palo Alto, which he will attend next week. He has been reading and hearing so much about how entrepreneurs like him should be diving into social media that he’s finally decided to tackle something he’d been avoiding for years. He told me of his hopes and expectations: that mastering the intricacies of Foursquare, Twitter, Facebook and LinkedIn will help him make more money. Rajeev even used the metaphor of exploring a new land. I listened sympathetically but with (I must admit) some inward humor, and thought of Bob Dylan’s 115th Dream, in which the singer meets the captain of three ships sailing toward America, as the singer is heading in the opposite direction. “He said his name was Columbus,” the singer sings, “[and] I just said, ‘Good Luck.’”

We should be spending MORE time at social media? No way!


I got miffed the other day at someone I love. We hadn’t seen each other in quite a while, and agreed to meet up in Oakland to catch up. No sooner had we kissed cheeks than she whipped out her iPhone and began fumbling with it.

I had thought that we’d chat for a while. “How are you? What’s new?”–and do the real social thing, which is human interaction and communication. Instead, within 30 seconds of greeting each other, the lady was totally absorbed in trying to upload a photo to her Facebook page.

Well, I took some umbrage at that. But what can you do? Fifty million Frenchmen can’t be wrong. Yesterday, I was having lunch with two young friends, both in their twenties, a prime demographic for living the online life. I laid out my case: People spend too much time gazing into blue screens, and not enough time in the real world, perceiving the things around them, making eye contact, talking to actual people instead of digital ones.

I was surprised that my two young friends agreed with me.

A few weeks ago, a man on a bus in San Francisco shot another man in the back, in what police called a random shooting. The victim died. This would be just another shocking case of senseless violence, except for this telltale fact: Although the shooter had raised and lowered his gun “several…times,” pointing it down the aisle of a crowded bus, no one on the packed bus reacted, or even saw it. Instead, “Their eyes, focused on smartphones and tablets, don’t lift until the gunman fires a bullet…”. 

Their eyes could block out the reality around them, but their ears couldn’t. The San Francisco Chronicle, in reporting this troubling incident, headlined the article “Absorbed device users oblivious to danger.”

With all this fresh in my head, when I sat down at the computer yesterday morning, I found an online article through LinkedIn Today. The title, “Why Small Business Isn’t Winning on Social,” grabbed my attention, as a good headline should. I clicked on the link.

The article made some good, if hardly newsworthy, points: that lots of mom-and-pop businesses aren’t trying social media because they believe they can’t afford the time or the money. The author made the additional point that “many” social media consultants “can be dishonest about the realities of what they can do for their client,” which is something I’ve been saying about the social media consulting complex for years. (“Give me your money. I promise ROI!”) I thought it was pretty cool for the writer, who was obviously a proponent of social media, to admit that the field is riddled with fraud.

But my jaw dropped when, at the end of the article, the author came out and said the main problem with small businesses is that they don’t spend enough time at social media. He estimated it takes “a solid 9-10 hours a day of work”!!! I had to reread that. Didn’t he mean 9-10 hours a week? No, a day.

Can you imagine spending 9-10 hours a day doing social media?  It’s impossible for me to wrap my head around that. How would it even be possible, with everything else that people do, such as working, commuting, eating, raising kids, walking the dog, reading a book, keeping up with the news, maintaining actual relationships with friends, working out at the gym, and, oh yes, sleeping?

The author has an answer for that: “You can always sleep a few hours less every week.” This, in a nation where “insufficient sleep is [already] a public health epidemic,” according to the Centers for Disease Control.

When I got to the end of the article, still gob-smacked and incredulous, I realized who had written it, and why. “You can find out more,” the author concluded, “at garyvaynerchuk.com.”


Social Media is now just a part of Big Media. Welcome to the club!


Seems like just yesterday that social media was portraying itself as the revolutionary alternative to Big or Traditional media.

(Actually, social media, not being an animate being, cannot “portray” itself as anything. It can’t even drink wine! So I should have said certain social media adherents were portraying it that way.)

The world seemed divided into two camps: You were either a hopelessly old fuddy-duddy who read the New York Times and watched T.V., or you were a young, hip, cool trendster with a smart phone or tablet pasted onto your face.

No in-between. “You’re either for us or against us,” went the refrain of the social media-ists. (Longtime readers of this blog know that I was perceived in some circles as an “againster.”) The social media-ists insisted that the new media were qualitatively different from the old media–that in some way they were purer, more honest, closer to God and less controlled by the greedy hand of self-interested corporate America. Social media would, they asserted, knock old media to its knees.

Well, a funny thing happened on the way to the future. Things didn’t quite turn out the way they were supposed to. We now know that social media has quite a lot in common with old media. For one thing, social media is corporate-owned now; the people who run these networks are filthy rich–richer than most old media tycoons, in fact–and the us-versus-them mentality that fueled an infant Twitter or Facebook has now morphed into an Animal Farm ending. (Remember that in the book’s final chapter, the other animals could no longer tell the difference between men and pigs. Mark Zuckerberg hangs out with, and presumably advises, everyone from President Obama to Russian Prime Minister Dmitry Medvedev. Not sayin’ anyone’s a “pig,” just makin’ the point.)

We know, too, that businesses–from mom and pop wineries to the world’s biggest corporations–no longer perceive social media as weird or alternative, but rather as an integral part of their marketing mix. A company’s advertising and marketing budget now includes every aspect of modern media: social, print newspapers, magazines, radio and T.V., if they can afford it. In essence, then, the people who spend the money make no distinction in kind between Facebook and Vanity Fair magazine.

Finally, we now know far more about who actually uses social media than we ever did before, and you know what? It’s everybody! It’s not just hip cool tattooed kids, it’s grandma. A study published yesterday on social media usage demographics stunningly paints a picture of an increasingly fragmented, even fractured public. You can read a summary of the study here; a few illustrative highlights are that Facebook is increasingly trending old, Instagram and Pinterest are trending female, LinkedIn swings male (no surprise there) as does Google+. Twitter retains its juvenile appeal, again no surprise given that even the least literate being on Earth can peck out 140 characters.

An earlier study, from last May, analyzes social media use from a slightly different perspective. Its findings once again suggest that a kind of rainbow effect has influenced social media. Users are dividing up along racial, ethnic, age, educational and household income lines, making sweeping statements about social media, per se, unreliable to the point of being untenable.

The point I would like to make is that whatever allure social media had four years ago, as a kind of Jesus in the temple, cleansing it of the old money lenders, has now evaporated, if in fact it ever existed. We no longer have “social media” and “old media” in America. We have Media, pure and simple, and while each medium differs in distinctive ways, collectively they’re all the same. And you know what it’s all about? Profits.


The irrelevance of blogs that say the 100-point system is irrelevant


Back in the 1990s I was supplementing my wine-writing income by doing a little healthcare reporting. As things turned out, I became known as something of an expert in the intersection of the U.S. healthcare industry–the nation’s biggest–and the emerging Internet. Everyone from pharmaceutical manufacturers to insurers, hospital administrators and individual doctors wanted to know what would happen when these two gigantic forces met, and how they could incorporate this new-fangled World Wide Web into their businesses. I didn’t have a clue, to tell you the truth, but I knew just enough more than most people (including my editors) to keep me gainfully employed and give my writing the semblance of expertise.

One memory stays with me of that period. Dot-com startups were as common in those days as the squirrels now gathering nuts in my neighborhood here in Oakland as Fall approaches. Here’s how it typically worked (and this is a fascinating and useful glimpse into the troubled, and troubling, nexus of marketing and reporting, as well). Someone starts a dot-com company; let’s say it purports to keep Internet-based communications between physicians secure. Somehow, that new business, which may not yet even exist (which would qualify it as “vaporware”), attracts the attention of one of the big investment banks. That bank would buy a piece of the company or otherwise associate itself with it. Then the bank would make its analysts available to journalists writing about that topic; the analysts would help the hapless reporters understand the finer points of whatever the topic was. If you’ve ever been a working reporter, you know how much we depend on the kindness of analysts whom we can quote with certain knowledge that the quotes are accurate, because after all, that analyst is an expert who works for an important investment bank.

Well, you see the obvious conflict of interest. The analyst talks up the new company, explaining how the product or service it provides is badly needed, and that this company (which he has analyzed in detail) is in a good position to succeed. The company is, according to the analyst, a sure thing. Meanwhile, the reporter–me–is typing all this down, to incorporate into the story. Next thing you know, a big, glossy healthcare magazine is running it, complete with the analyst’s spin (but now in my words) about how great this new startup company is. It’s a win-win-win for everybody: the startup’s owner, the investment bank, the analyst, me, and the publisher of the magazine who’s paying me.

Except for one little thing: in many cases, the analysts either lied or were entirely incorrect (for whatever reason) in their judgments. As we all know, the dot-com era crashed spectacularly in the early 2000s. A lot of “sure things” perished overnight; a lot of people lost everything. Many of the “sure thing” predictions turned out to be as premature as reports of Mark Twain’s demise. That horrible era taught me some important lessons I’ve carried through in my journalism ever since: Be skeptical of claims, even by so-called  “experts” who seem so self-assured. I developed a B.S. radar that to this day serves me well. That radar always asks these questions of anyone giving information or advice: Does this person have a hidden agenda? What does he or she have to gain (or lose)? Is there solid evidence of this person’s claims?

When the dot-com collapse finally happened, I’d been out of healthcare writing for a few years. But I was shocked that my reporting had had something to do with instilling a sense of trust in these startup companies–trust that, as it turned out, they didn’t deserve. People all over the country had read my articles and made decisions based upon facts provided to me, and by me to readers, that had turned out not to be true. I vowed never to let that happen again.

We come now, after this somewhat labored intro, to the subject of today’s post. Over the past several years, we’ve had many reports of the Death of the 100 Point System. These have mainly come from the wine blogosphere. We can now see that these reports have been characterized, not by critical judgment or factual data, but by wishful (and even magical) thinking. Typical of the genre is this blog post from last week that incorporates the standard memes:

– people, especially younger ones, don’t care about point scores

– they would rather get a recco from a friend than from some famous [old] critic

– wine criticism is subjective anyway, so giving it a number is crazy

– the only way to judge a wine is to experience it yourself

The writer then offers examples from his own experience to “prove” the truth of each assertion.

Well, of course, each of these bullet points is true in its own way. But they’re no truer now than they were pre-social media. Only a tiny percentage of wine consumers ever cared about point scores. But in fact, more people today are influenced by them than ever. More and more big retailers (Beverages & More, Costco) are using point scores (by people like me) to market their wines, so that more and more consumers are exposed to them. And believe me, these retailers wouldn’t use point scores if they didn’t know a high score moves SKUs.

And younger people, of course, always have been resistant to the advice of elders. That’s what it means to be young: There’s nothing inherently different about kids in their twenties today than at any other point in history. They’ve always been more inclined to respect the opinions of their friends than of their elders. Social media hasn’t changed that. The point is that young people will someday be middle-aged people (that’s the way it goes, kids), and when they have a little more money in their pockets, they’ll do what people with disposable incomes have always done: seek the advice of experts when it comes to buying things, like autos, high tech devices and, yes, wine.

There’s simply no evidence that the 100-point system is endangered or irrelevant. In fact it’s at the height of its impact. Virtually every major critic in the world uses it (or some variation of it). The writer of the blog post I referenced understands this full well: in his concluding paragraph (which is where you always want to make an important point to leave with the reader), he writes: “Ultimately, the 100-point scale is here to stay.” That disclaimer, you’d think, would nullify everything he’d said up to that point. After all, you can’t be “here to stay” if you’re irrelevant, can you?


The solution to Napa Valley traffic? Not social media!


Last Wednesday’s public meeting of the Napa County Planning Commission, as reported by the Napa Valley Register, sounds like a typical exercise in broad-based, eye-glazing bureaucracy.

Everyone got to put in his two cents, and the rest of the audience had to sit through it and listen, no matter how rambling or opaque the remarks were. That is the essence, and the curse, of participatory democracy.

The main topic, so far as I could discern it, was the impact of growing winery infrastructure on traffic. This is a perennial concern in Napa Valley and a legitimate one. If you’ve driven Highway 29 lately, you know how awful it can be. Gridlock, sometimes stretching from north of St. Helena all the way through American Canyon down to the 101 Freeway, is the norm. It’s why I advise visiting tourists to avoid Napa and stick to Sonoma County. No fun sitting in your car inhaling gas fumes.

I don’t know what the answer to the traffic is, but I don’t think it involves social media. And yet, there at the meeting was the ubiquitous Paul Mabray, singing the social media hallelujah chorus to a roomful of voters and county officials who must have wondered just who he was, and why he was there, lecturing them about “digital human beings” and the mean number of times per day the average winery posts on Twitter (2.2 times a day, in case you’re wondering), when what they were there to talk about was traffic.

I mean, social media is very glorious and wonderful, and we all are grateful it exists and can hardly remember what life was like before it did. But social media has not yet turned into a deus ex machina–a miraculous intervention that solves all problems, like those quack nostrum peddlers used to claim when they sold horse liniment to naïve, uneducated people looking for a cure for their cancers, arthritis and venereal diseases.

I cannot, by any stretch of the imagination, see the relevance of “engag[ing] potential customers on social media networks” to local traffic conditions, unless Paul was trying to make the point that the more people who are buying wine online, the fewer people will be actually visiting Napa Valley. But I don’t think that was his point. And even if it was, it was effectively rebutted by vintner Michael Honig, who stated the obvious truth that “People…come and see our valley. They…see our barrels. They…kick the dirt.” That’s how you get a lifetime customer, not through tweeting 80 times a day.

Once Paul had his say, it seems that things got back to the topic at hand. The problem of traffic is insoluble, short of draconian steps that wineries would oppose, that tourists would hate, and that would probably be doomed to fail at any rate. (Back during the gas shortages of the 1970s, you could only fill up on certain days of the week, depending on the numbers on your license plate. Maybe they could drop a similar diktat on touring Napa.)

My own solution for traffic in Napa is to widen Highway 29. Make it four lanes instead of two–and put a detour around St. Helena, or perhaps a bridge over it, so you don’t have to drive through it. If four lanes are too much, make it three, and have the directions change the way they used to at the Caldecott Tunnel before there were four bores: two westbound lanes and one eastbound for the morning commute, then switch it for the evening rush.

Of course, an elevated freeway over Highway 29, with on- and off-ramps for the townships, is the ideal solution. But it’s probably too expensive, and I’m sure the enviros would object. They’re against everything that makes our lives more efficient.

