Connecting The Dots Of The Web Revolution

For several days my brain has been connecting the blogstorm over AP trying to dictate how much of their content can be quoted on the web with the “quote” that Nick Carr lifted from one of my blog posts in his Atlantic article — I finally figured out why. The problem with the AP isn’t really about linking, it’s about quoting. And the problem with quoting is that, now that anyone can publish any thought or idea on the web, and anyone can link to it or reproduce it, the whole notion of quoting and citation has been completely turned on its head. Let me try to explain.

Ever since Nick Carr’s Atlantic article appeared on the web (finally), there’s been a spike in Google alerts for my name. Prior to this quote in the Atlantic, whenever I checked out a site where my name was mentioned, it almost invariably had a link back to my site — because someone was quoting me from my blog and linked back. But The Atlantic article had no link to my blog, even though Nick lifted the quote verbatim from the site. So here are all these reproductions of this citation, but no links. And I’m getting a spike in traffic from people searching for “Scott Karp blog” because they’re looking for the source.

Something is very wrong on the internet.

Take a look at the way Nick quoted me:

I’m not the only one. When I mention my troubles with reading to friends and acquaintances—literary types, most of them—many say they’re having similar experiences. The more they use the Web, the more they have to fight to stay focused on long pieces of writing. Some of the bloggers I follow have also begun mentioning the phenomenon. Scott Karp, who writes a blog about online media, recently confessed that he has stopped reading books altogether. “I was a lit major in college, and used to be [a] voracious book reader,” he wrote. “What happened?” He speculates on the answer: “What if I do all my reading on the web not so much because the way I read has changed, i.e. I’m just seeking convenience, but because the way I THINK has changed?”

It’s very similar to the way journalists have traditionally quoted sources that they actually TALKED to. Substitute “he said” for “he wrote,” and you would think that this was a phone or in-person interview. Even when articles with such quotes are published on the web, the source isn’t typically linked because it doesn’t exist on the web.

But in this case, the source did exist.

The traditional practice of journalism also requires that you ask a source’s permission to quote them. Nick never asked my permission — he assumed he didn’t have to, because I had already published what he was quoting. And yet he doesn’t cite the post or even name my blog. It’s like he’s quoting me, personally, yet also citing a published source. When I first saw the quote, my gut reaction was to feel annoyed that Nick didn’t ask my permission, although technically he didn’t have to. Sources have forever complained about journalists quoting them out of context, and that’s exactly how I felt — yet if there had been a link, or a URL in print, I don’t think I would have felt that way.

It’s like Nick and The Atlantic were trying to play by the old rules and the new rules, and yet not really adhering to either.

So what does this have to do with the AP?

If The Atlantic, with its top shelf editorial standards, can do this, then why can’t a blogger quote AP — almost as if the AP were a person?

What’s happened is that the lines between quoting a person and quoting a published source have blurred.

Bloggers aren’t really reacting to the copyright issue, although that’s what everyone is talking about. It’s more like the AP is giving on-the-record interviews to bloggers with its stories, and then when bloggers quote them, the AP turns around and claims the interview was “off the record.”

The AP found itself deeper in the hole when a blogger discovered a page where the AP was asking for payment per word for citations. Yet the AP quotes from blogs and other sites — as if it were abiding by interview standards. Which has led some bloggers to turn the tables and demand payment for all the times the AP quoted them.

Geesh.

Yesterday Jay Rosen wondered on Twitter how the AP could have so distanced itself from Tom Curley’s speech in 2004. Mathew Ingram expressed what so many of us were thinking:

@jayrosen_nyu: i’d love to explain how they got here from there — i wish i knew :-) do you think curley has been steered wrong by others?

It just defies comprehension.

Here’s why I think this is all such a mess — why the AP is cutting off its nose to spite its face, defying comprehension, and why Nick Carr thinks access to more information, and more connections between information, is making us dumber (also defying comprehension).

Nobody has really been able to conceptualize yet just how dramatic the change is in our traditional systems of information, media, publishing, reading, writing, relating ideas, and thinking itself. Nick Carr has come close with his recent writing, and he’s brave enough to try, but he gets too distracted by his nostalgia for a simpler age.

Nick argues that we are losing our ability to “read deeply,” e.g. read a whole book and contemplate it, without “distraction.” The problem is he’s using an antiquated yardstick to measure the quality of thought.

Maybe I don’t need 250-page books anymore because the web enables me to connect ideas and create narratives that I used to depend on book authors to create for me, because I wasn’t able to access all the information and connect all the dots myself.

Maybe the reason why Nick and so many other literati are losing their patience with long form information is that it is so fundamentally inefficient and inferior to connected bits of information.

You look at a book, read a book, and you easily perceive a coherent whole. You look at all the information on that book’s topic on the web, all connected, and you can’t see the sum of the parts — but we are starting to get our minds around it. We can’t yet recognize the superiority of this networked thinking process because we’re measuring it against our old linear thought process.

Nick romanticizes the “contemplation” that comes with reading a book. But it’s possible that the output of our old contemplation can now be had in larger measure through a new entirely non-linear process.

Just look at this post. If there’s any insight here (which still remains to be seen), it didn’t come from a linear process of A to B to C. It came from all of these seemingly random nodes connecting, and all these bits of information coming together, and then suddenly I saw the whole. If you had watched me, tracked my reading and my thoughts, you would have judged me positively scatterbrained by traditional standards.

But even in presenting my “aha,” I’m jumping all over the place because I’m still trying to figure out how to make sense of this networked thought process. The end of this post may seem completely disconnected from the beginning, but it’s all deeply connected. (Although it makes choosing a pithy title difficult.)

So what’s the lesson for the AP and every other media business? We don’t “get it” yet — none of us do. We’re starting to connect the dots, slowly but surely, but we’re looking through a glass darkly at the change we’re immersed in.

As Jon Fine observed about all the Titans of Media speaking at the All Things D conference:

It is sobering when not even the smartest guys in the room have any plausible answers. But then, no one has the answers.

What is increasingly clear is that the thought processes, assumptions, and standards that governed analogue media, information, and thought are going to get us into ever more trouble in a digital media world.

What I’m hoping is that we’re bumbling through a “period of stupid” before we realize that we’ve actually become a lot smarter.

The next media business to connect those dots will be the next Google.

UPDATE

TheAtlantic.com is now linking to this blog in Nick’s article, which I suppose proves the squeaky wheel maxim, but there still aren’t links to other quoted blogs, e.g.

Bruce Friedman, who blogs regularly about the use of computers in medicine, also has described how the Internet has altered his mental habits. “I now have almost totally lost the ability to read and absorb a longish article on the web or in print,” he wrote earlier this year. A pathologist who has long been on the faculty of the University of Michigan Medical School, Friedman elaborated on his comment in a telephone conversation with me.

I’ve searched for this blog, but can’t find it (for shame, Google) — so a link isn’t just about principle, it’s about real utility.

Of course, Nick actually talked to Bruce, and also quoted him the old fashioned way. Ah, well.

Associated Press Hands Local And National News Sites An Opportunity To Get Links And Traffic

The Associated Press is facing a blog firestorm after issuing take down notices to Drudge Retort for linking to and reproducing snippets of AP stories. AP is now attempting to define how their stories can be linked to and excerpted — and the response from the blogosphere appears to be to boycott the AP, i.e. not link at all. This is a huge opportunity for local and national news sites to be the sources that bloggers and social news sites link to instead of the AP.

Take the story of flooding in Iowa, for example. The AP is covering this story extensively, as you can see in this Google News search result. But local news media in Iowa is also covering the story extensively, as you can see in this search limited to Iowa sources — the story is happening in their own backyard, giving these local sources a unique perspective and knowledge.

So if a blogger wants to discuss the Iowa floods and needs a source to cite, they can easily find an original local source instead of the AP story. And they can think of the link and the traffic they send as a contribution to the local news outlet’s original reporting, particularly the local newspapers struggling with new economic realities.

Or let’s take a national example. Britain and the EU have announced new sanctions against Iran. The AP covered this story, but so did national news sites like the Washington Post. Link to the Post instead, to support their original reporting.

To seize this opportunity, national and local news sites could get the word out to bloggers that they want the links and the traffic, if AP doesn’t. It’s up to them whether they agree with AP’s attempt to copyright the commodity of the news event itself (via Saul Hansell):

“The principal question is whether the excerpt is a substitute for the story, or some established adaptation of the story,” said Timothy Wu, a professor at the Columbia Law School. Mr. Wu said that the case is not clear-cut, but he believes that The A.P. is likely to lose a court case to assert a claim on that issue.

“It’s hard to see how the Drudge Retort ‘first few lines’ is a substitute for the story,” Mr. Wu said.

Mr. Kennedy argued, however, that The Associated Press believes that in some cases, the essence of an article can be encapsulated in very few words.

The problem with this argument is that, if the “essence of the article” is the fact of the event itself, e.g. floods in Iowa, and that is the entire value of the content, then the content is sorely lacking in value. If local and national news sites can create more value than just the commodity statement of the facts, which can be “given away” in a snippet, then they can earn all of the links and traffic on the web for their original reporting. They should also look at how they allocate resources to creating original content that people want to and can link to vs. licensing content that nobody wants to or can link to (especially when that same licensed content appears on Google, Yahoo, and hundreds of other sites).

If local and national news sites really want to seize the opportunity, they won’t just leave it to bloggers to link to their original reporting — they will start linking to each other’s original reporting, and help each other capture that economic value, which they so clearly need.

Google Friend Connect Disabled By Facebook

Google is taking a big shot at Facebook in the PR war over data portability and social network interoperability. I signed in to Google Friend Connect, implemented on the Go2Web2.0 blog, and saw this:

Google Friend Connect Disabled By Facebook

Normally, you wouldn’t list a service that isn’t a partner, but in this case Google chose to list Facebook and let users know loud and clear that they can’t connect to their friends on Facebook because the feature has been DISABLED BY FACEBOOK.

This is subtle in some ways, but in others it’s as big a smack as Apple’s brilliant I’m a Mac, I’m a PC ads.

Google is betting that hell hath no fury like a user denied access.

Probably a good bet.

What Magazines Still Don’t Understand About The Web

Since I already drilled a nerve with What Newspapers Still Don’t Understand About The Web, which is on its way to becoming one of my most linked posts ever — and since everyone loves a sequel — I thought I would do a follow up for magazines. The lessons, of course, apply to every print publisher, who constantly discovers new ways to frustrate web users by prioritizing print over web.

This time I’m going to pick on The Atlantic, which like the Washington Post is a publication I have a great deal of affection for (published by my former employer Atlantic Media), so this is not a general critique but rather a very specific example representative of a much larger industry-wide problem (i.e. I could find instances of the same problem on virtually any magazine website).

It started this past Saturday when a friend (also a former Atlantic employee) emailed me asking why I hadn’t mentioned my quote in the Atlantic’s latest cover story by Nick Carr. I responded saying I had no idea I had been quoted.

I immediately went to TheAtlantic.com, where I discovered that the current issue was still the June issue, and that the July issue with Nick’s cover story still hadn’t been posted. This is a common practice among publishers who make early receipt of the new issue a benefit for print subscribers.

But by doing that the publisher basically thumbs its nose at web readers and violates a fundamental principle of the digital age — if a user knows your content exists, but can’t access it, the result will be frustration or worse.

The Atlantic already made a brave move by following NYTimes.com and removing their paid subscriber wall on the website.

But in this instance, print subscribers still had access to content that, despite the power of the web, I couldn’t access.

To make matters worse, I stopped by Borders on Sunday to see if they had the July issue — physically driving to a location to obtain content that already existed in digital form seemed ludicrous. But I was willing to pay for the print issue (and probably would have read more than Nick’s article once I had it in hand).

Sadly, on the rack I found the June issue, just like on the website.

I joked to my friend by email about the frustrations of being unable to access content in the digital age. He offered to fax over the article… or 8-track tape it.

So I resigned myself to waiting for it to go up online, which I knew it would shortly.

This afternoon, I saw on TechMeme a link to this CNET story about the Atlantic article. Great, I thought, it’s up online.

It’s not yet on the Web, but the July issue of The Atlantic has an exceptional and provocative article by Nick Carr, asking “Is Google Making Us Stupid?”

Being a web user on a mission, as most are, I didn’t bother to read the sentence — I just clicked on the link and found the same June issue.

This is ridiculous, I thought — here is someone who has access to the article and wants to link to it, but can’t. And here I am, a consumer eager to read the article, and I can’t. Wall-to-wall frustration.

But guess who stepped in to save the day… can you guess?

This afternoon, I received an email from the Google alert ego feed for my name:

Google Alert Atlantic

Another print publisher trumped by Google.

But it gets even worse.

I clicked on the link in the email which took me to the article, which is in fact online. Actually, the whole July/August issue is online.

It’s not linked on TheAtlantic.com homepage yet, as of this writing — and it’s not on the current issue page.

Atlantic June 2008

But Google knows it’s there. Google knows everything. And most importantly, Google gives me what I want, even when print publishers, still trying to balance demands of two entirely different modes of publishing, choose to prioritize print over web.

The web is Google’s first and only priority. That’s why they are beating the pants off of every legacy media company on the web.

But wait, there’s more.

I found the section of the article where I was quoted, unbeknownst to me, because Nick lifted it from one of my blog posts. In fact, it’s in a section about bloggers who have commented on the issue at hand.

But there are no links to those posts. So readers have no opportunity to see my quote in context — it came from a post called The Evolution From Linear Thought To Networked Thought.

There are other links in the online version, so additional links may yet be added before the issue officially goes live. But most print publishers have no editorial process in place for converting print content to web content, e.g. putting in links, which invariably leads to a frustrating web user experience.

If publishers want to maximize value on the web, they have to put the web first every time — that means you can’t just take what you create for print and dump it on the web, regardless of the cost efficiencies, because you’re destroying value for web users.

If a user can’t find what they want going straight to your site, the next time they are going to go straight to Google — and Google will capture the value of that content distribution.

But this story has one last delicious drip of irony. Nick argues in the Atlantic article, with his usual brilliance, that Google and digital media are actually changing the way we think — to our detriment.

I agree with Nick that the way we think is likely changing, which is what my post was about. But I don’t know that I agree with Nick’s pessimism that the change is for the worse. Yet the way I’m quoted in the article leaves open the possibility that I agree with Nick that the change is negative.

When I mention my troubles with reading to friends and acquaintances—literary types, most of them—many say they’re having similar experiences. The more they use the Web, the more they have to fight to stay focused on long pieces of writing. Some of the bloggers I follow have also begun mentioning the phenomenon. Scott Karp, who writes a blog about online media, recently confessed that he has stopped reading books altogether. “I was a lit major in college, and used to be [a] voracious book reader,” he wrote. “What happened?” He speculates on the answer: “What if I do all my reading on the web not so much because the way I read has changed, i.e. I’m just seeking convenience, but because the way I THINK has changed?”

But if you read my whole post, you’d find the following:

What I’d be most curious to know is whether online reading actually has a positive impact on cognition — through ways that we perhaps cannot measure or even understand yet, particularly if we look at it with a bias towards linear thought.

If anything is making us dumber, it’s that we’re betwixt and between old modes and new modes of both information and thought.

The irony of The Atlantic’s print article is that by bounding the reader into a box where they can’t seek more context, and worse, by being the antithesis of the digital media experience that Nick describes, it becomes irrelevant to its own thesis.

Fortunately, if you take my quote from the print article and put it into Google, you can find my post — and the missing context.

I’d say in this instance, Google actually made me smarter.

If publishers followed Google’s example, they’d be smarter, too.

UPDATE

Lots of people are now discussing Nick’s article — although mostly they are discussing the CNET post ABOUT the article, because the article itself is not online. I’m guessing Matt Asay is a print subscriber who couldn’t wait for the article to get up on the web to start talking about it.

Atlantic Techmeme

It’s great to give print subscribers an advance look at the magazine — except those subscribers have blogs, and they don’t really want to keep to the print-centric program. They want to talk about it NOW, not when it finally shows up on the web. Matt even scanned in the brilliant cover:
Atlantic July August 2008 Cover

UPDATE #2

You can find all of The Atlantic’s July/August 2008 issue content indexed by Google News here, which is how I got the Google alert.

You can embargo the newsstand, but you can’t embargo Google, which is the new newsstand.

If Your Users Fail, Your Website Fails, Regardless Of Intent Or Design

On the web, in the age of Google, design has no margin of error, and there are no stupid users, only inadequate designs. Those were the main points of my critique of newspaper websites generally, and WashingtonPost.com in particular, which, to be fair, apply to all online publishers, and really any website. I’m writing another post on this same topic because the issue is so fundamental to the future of media, news, publishing, and journalism that it really can’t be over-emphasized or over-clarified.

In print, a design flaw is unlikely to cause a reader to abandon a newspaper or magazine entirely — they are a largely captive audience. But it will cause them to abandon a website.

Google understands this better than any web company, which is why they are the most successful. Google is obsessed with making sure its users never fail, no matter how “stupid” they are. Google makes users feel smart. That’s why they keep coming back.

Invariably, when I write about a negative experience with a website, e.g. Twitter or WashingtonPost.com, someone puts forth what I call the “stupid user” argument — essentially, I failed because I’m a stupid user. And if I were a better user, I would have been more successful with the site.

For example, I discovered that WashingtonPost.com has a local version of its homepage, which it displays to logged in users. Creating different versions of a site for different users is web-savvy. If I had been logged in, I would have found the content I was looking for on the homepage. That’s all good, and much to their credit.

Unfortunately, I never log in to WashingtonPost.com, although I read it frequently. Therefore, the “stupid user” argument goes, the failure to find the content I wanted was my fault.

Here’s the problem — my failure to find the information I wanted is not MY problem, because I went to Google and found it. I succeeded. The failure is the site’s problem, because I abandoned it and went instead to a site that would help me succeed without having to be smarter.

WashingtonPost.com and, to be fair, most other sites that require registration assume that users will register to help the site achieve its goals, whether customizing content or targeting advertising.

But users don’t care about the site’s goals. They care about THEIR OWN goals.

Nowhere on WashingtonPost.com’s homepage do I see a clear message that registering or logging in will help me achieve MY goals. There’s a link to the Washington version of the homepage in the upper right corner, which has the best of intentions, but because I didn’t find it, it might as well not exist.

This is why Google rules the web. In Google’s world, the user is always right. Google knows that if users fail at their task, they will abandon Google in a heartbeat. Google’s dominance is EARNED, with every search, every click.

I saw Google’s Marissa Mayer give a talk at Web 2.0 a few years back about Google page load times — the talk had a narrowly focused, OCD quality to it. It was weird on the face of it. But this is how Google wins. By obsessing over user experience above all else.

This is also why Google punishes advertisers who try to trick users or provide a poor user experience. Because it reflects poorly on Google. And users don’t come back.

A commenter argued that I should have asked the Washington Post for a comment before publishing a critique of their site. My response was that in an analysis of a user experience with a web site, the publisher’s intent DOESN’T MATTER. Web users are utterly unforgiving. If it doesn’t work the way I want, I’m gone in a click. There is no other side to the story.

That’s brutal and, as the commenter asserted, rude and irresponsible. It just doesn’t seem fair.

But it’s also the reality of the web. Google understands this. If publishers want to compete, they need to accept this reality, swallow their pride, and realize that the user experience is EVERYTHING. Design on the web is not about ideals — all that matters is whether the user succeeds.

Before the web, having great content was enough. The irony of my critique of WashingtonPost.com is that it wasn’t a critique of content. They had GREAT content, when I actually found it — there weren’t really any editorial shortcomings. The critique had much more to do with software design than with editorial quality or judgment. News organizations need to add software user interface design to their core competencies.

Lesson for publishers: The web is more about applications than publications.

This is why it’s so damaging for news organizations to apply the standards of print publishing for design, content, and experience — they simply don’t apply on the web. The reality is that designers didn’t necessarily know if they were successful in print, because people kept subscribing to the newspaper anyway. But on the web, success or failure is evident with every click.

Perhaps the biggest problem is that user interface and user experience design are HARD. Even the best designer can’t always anticipate what users will do — or fail to do. Sites need to create a continuous feedback loop with users and improve their design and user experience over time.

WashingtonPost.com’s homepage has a far better design than many other newspaper websites, but its relative merits didn’t matter for my specific use case.

And to be clear, helping users succeed isn’t about pandering. My goal in going to WashingtonPost.com, as it frequently is, could be to find out what’s going on in the world. How I determine whether I’ve succeeded can be much more a function of the quality of editing and content. But when I want specific information, my criteria are far more narrow, and much more unforgiving.

According to usability guru Jakob Nielsen, web users are actually getting MORE hyper-focused and unforgiving.

To remain relevant as a destination, news sites need to help me achieve ALL my objectives ALL of the time.

Just like Google.

UPDATE:

Google is inviting users to help them test out new features of Gmail. Can you imagine your average news site integrating users this deeply into their design process? I know that some have made meaningful efforts to test new designs, but Google keeps upping the ante on the embrace of users.