Is a blog without comments still a blog?

Blogger and Yahoo employee Russell Beattie has been taking a fair bit of flak for removing comments from his blog – and seems more than a little defensive about it, from what I can see. Fair enough. As he points out, it’s his blog and he can run it however he wants to. He says he got fed up with having to weed out the flames and spam, and was also spending too much of his time responding to comments, so he’s returning to “old-school blogging.”

With all due respect to Russell, I’m not sure blogging without comments constitutes “old-school blogging,” although I admit that the blogosphere’s eminence grise, Dave Winer, kind of screws up my argument by not allowing comments on his blog. But even Dave has come around of late, it seems, since he has a second WordPress blog where he does allow comments. In fact, I would argue that a website isn’t even a blog at all unless it includes comments, and I know that others agree. Don’t get me wrong – a blog without comments might still be valuable, but it’s not really a blog.

Russell says that now that everyone has blogs, people can simply respond to him on their own blog if they don’t like something he says, or want to get in touch with him – and failing that, they can hunt down his email address on his “About” page and reach him that way. As more than one person has pointed out, it’s ironic that Russell decided to do this only days after a new comment-tracking service called CoComment.com came out (which I am beta-testing and so far quite like, but more on that another time).

As anyone who has read my previous posts will know, I think the “conversation” is part of what makes blogs so powerful (even if it’s more of an argument :-)), so I’m disappointed Russ has done what he’s done. It’s his blog, and so I wouldn’t presume to tell him what to do, but I still think it’s a mistake.

Update:

Kent Newsome has some thoughts on the subject too (thanks for the compliments, Kent) and it’s probably a fair point that Russell’s views might have been influenced by the cease and desist letter he got recently – although he didn’t mention that in his post. I would also recommend – not surprisingly – that anyone reading this should look through the comments. There’s more good stuff in there, which kind of helps make my point.

Dave Winer has also clearly caved in under the convincing weight of my arguments and decided to get back in the comment game (hat tip to Kent Newsome for noticing). Just kidding about the caving part, Dave.

Google and Orwell? Come on, people

On the assumption that posting at least two comments about something on two separate blogs means I feel strongly about it, I’ve decided to wade into the whole Google-BMW fray. So here goes:

Can’t we save the term “Orwellian” for something really meaningful, like a state taking some kind of oppressive action against its own citizens, or using doublespeak in the service of some great wrong? Using it to describe an Internet search engine engaging in the site-indexing business is more than a little over the top, I would argue. Scott Karp of Publishing 2.0, who has said in the past that he wishes bloggers would take more time with their posts and not say things just to be inflammatory, wonders whether “total power [will] totally corrupt Google.”

Just one question, Scott: Since when does Google have anything approaching “total power”? It’s a search engine, for Pete’s sake. It indexes websites. Yes, it cut off BMW’s German site, because BMW used hidden front pages to try to game the indexing and page-rank process – something Google has made clear is not allowed. So the site was removed. But despite the inflammatory rhetoric everyone loves to use to make it seem a lot more exciting than it really is, this is hardly a “death penalty.” There are plenty of other search engines.
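For anyone wondering what “gaming the indexing” actually involves, here is a deliberately simple Python sketch of the general technique, usually called cloaking: serve one page to a search crawler and a different one to human visitors. It is purely illustrative (BMW’s pages reportedly used keyword-stuffed text and JavaScript redirects rather than this exact code), but it is the sort of thing Google’s webmaster guidelines explicitly forbid.

```python
# Illustrative only: a cloaked "doorway" page serves keyword-stuffed content
# to a search-engine crawler while bouncing real visitors to the normal site.
# This is not BMW's actual code, just a sketch of the general technique that
# Google's guidelines prohibit.
def render_page(user_agent: str) -> str:
    if "Googlebot" in user_agent:
        # The crawler sees text stuffed with the terms the site wants to rank for
        stuffed = "gebrauchtwagen BMW used car " * 100
        return f"<html><body>{stuffed}</body></html>"
    # Human visitors are immediately redirected to the real front page
    return '<html><head><meta http-equiv="refresh" content="0;url=/index.html"></head></html>'

if __name__ == "__main__":
    print(render_page("Googlebot/2.1")[:80])   # what the index sees
    print(render_page("Mozilla/5.0"))          # what a person sees
```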

Google’s market share isn’t even close to giving it the kind of dominance that would justify a term like “total power,” or make removing a site the equivalent of cutting off BMW’s “oxygen supply,” as my friend Paul Kedrosky describes it. And for Scott to mutter darkly about whether his post on the subject might earn him the wrath of the great Google, or for Alex Muse of Texas Venture Capital to wonder whether doing so might affect his page-rank, is just ridiculous. Surely there are some really important issues out there that we could all be devoting our time to instead of this.

Not a great start for FON

Well, the FON network has $22-million or so from Google, Skype and Index Ventures, but it might have a bit of a credibility problem as well, after reports from one U.S. Internet service provider that contradicted what founder Martin Varsavsky said when he announced the deal. He said the company has the support of Speakeasy, a large ISP — but Glenn Fleishman of Wi-Fi Networking News says that he got a message from the ISP saying that isn’t the case.

On his blog, Mr. Varsavsky wrote that he was “pleased to announce today that we have obtained the support of two significant ISPs for FON. In America, Speakeasy has said that they welcome FON and in Europe, Glocalnet and FON have signed an agreement.” While he doesn’t specifically say that FON has a deal or has signed anything with Speakeasy, he makes it sound as though the company has pledged its support in some fashion.

According to Glenn, Speakeasy said that it supports sharing of Wi-Fi (one of the few ISPs that does), but that doesn’t mean it supports FON. In fact, the company says that FON’s plan sounds like something Speakeasy came up with in 2003 called NetShare, which also involved revenue sharing with those who allowed others to use their wireless connection. Although Speakeasy says it has reconsidered its initial plan to take legal action against FON over the statements, it is obviously less than pleased.

Needless to say, this kind of thing doesn’t bode well for the success of FON. It’s possible that Mr. Varsavsky misspoke, or that he was over-eager, and wanted to show more support than his company actually had. Whatever the case may be, it doesn’t look good to be claiming relationships you don’t have when you are trying to get something as ambitious as FON off the ground – and I would argue that it’s not likely to help other ISPs feel particularly comfortable about negotiating deals with the company either.

FON sounds great, but will it work?

It’s nice to hear that FON, the share-your-Wi-Fi network founded by entrepreneur Martin Varsavsky, has gotten an investment from Google, along with Skype founders Niklas Zennström and Janus Friis – but while that is a huge vote of confidence, it doesn’t remove some of the uncertainties surrounding the FON business model. For one thing, as more than one person has mentioned (including in the comments on Scoble’s post), almost every major ISP specifies in its contracts that this kind of wide-open sharing isn’t allowed.

According to comments Martin sent to Om, the company is trying to bring ISPs on-side, but has so far only managed to strike a deal with Speakeasy (Update: According to Om, Speakeasy says it has no arrangement with FON). Alec Saunders of Iotum says that most ISPs don’t enforce these agreements, and that’s true – but they might decide to change their minds about that if they find widespread sharing of the type FON has in mind.

Glenn Fleishman of Wi-Fi Networking News, who has been a major skeptic on FON, says the investment by Google and the Skype gang (as well as Index Ventures, which made a bundle on its investment in Skype) makes him a little less skeptical, but he still has concerns – including the difficulty of getting ISPs on-side, and the challenge of building out a wireless network robust enough to make what the company has in mind actually feasible.

Not only that, but how many people are going to share the security concerns that one commenter on Scoble’s post raised? FON has a response here, but that might not persuade enough people to open up their networks – especially after everyone has been telling them to lock those networks down so no one piggybacks on them. FON has a response to the ISP question too, but that amounts to trying to convince the ISPs that FON will share revenue with them (assuming there is any). Like my friend Rob Hyndman, I think many providers (particularly in Canada) would be skeptical.

At last, a way to track blog comments

If you’re like me (and I know I am), you travel around the blogosphere reading different posts on blogs and adding your thoughtful comments here and there – and again, if you’re like me, you often forget what you said or where. As a result, you miss the responses to your comments, which in most cases (okay, some cases) have valuable information in them, or make a point that corrects your initial impressions. Like others, including Steve Rubel of Micropersuasion.com, I’ve been looking for an easy way to track this kind of activity.

At last, it looks as though someone has come up with it: Robert Scoble and TechCrunch are both talking about a new beta service called CoComment.com, which allows you to track your comments wherever they are made, to be notified when someone responds to your comment, and to see all your comments in one place (and publish them on your blog, if you wish to do so).

I wasn’t able to use any of the demo codes the CoComment guys left in the comments (how fitting) on Scoble’s post, but I’m eager to try the service out. I think it is exactly the kind of juice we need to keep the conversation going.

Update:

Someone at CoComment.com was kind enough to send me an activation code, so I am now signed up with the service, and have installed a “comment box” in my sidebar, which will track comments I’ve made, as well as responses to those comments. You can also subscribe to an RSS feed of that comment stream, which I’ve done – and in the future, the site says blog publishers will be able to add code to their comment sections so that the service will index comments left there even if the commenters themselves haven’t signed up for the service.
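For the geeks in the audience, the reader’s side of this kind of tracking isn’t magic: it amounts to polling an RSS feed of your comment stream and flagging anything you haven’t seen before. Here is a rough Python sketch of that idea, using the feedparser library; the feed URL is a placeholder, not CoComment’s actual address.

```python
# A rough sketch of watching a comment-stream RSS feed for new replies.
# The feed URL is a placeholder, not CoComment's real address.
import feedparser

COMMENT_FEED = "http://example.com/my-comment-stream/rss"  # placeholder

def new_replies(feed_url, seen_ids):
    """Return feed entries we haven't seen before, and remember them."""
    parsed = feedparser.parse(feed_url)
    fresh = []
    for entry in parsed.entries:
        entry_id = entry.get("id") or entry.get("link")
        if entry_id not in seen_ids:
            seen_ids.add(entry_id)
            fresh.append(entry)
    return fresh

if __name__ == "__main__":
    seen = set()
    for entry in new_replies(COMMENT_FEED, seen):
        print(entry.get("title", "(untitled)"), "->", entry.get("link", ""))
```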

Stowe Boyd has more here, and so does Solution Watch – including a Greasemonkey script that avoids the need for a bookmarklet. Ben Metcalfe has some thoughts as well.

The disruptiveness of doing what you love

Anne Zelenka, whose excellent blog I have only recently discovered, has a great post about how doing what you love can lead in unexpected directions – in which she uses the example of Mary Hodder, who started a Web 2.0 video-sharing service called Dabble about six months ago and is almost ready to launch (which is part of a much larger story about how easy it is to start companies now… but I digress).

Mary wrote about how she wanted to stop doing things she didn’t like and start doing something she loved, and how great it felt to do that, and she mentions the insightful (if long) piece by uber-geek Paul Graham called How To Do What You Love, which is worth a read. As Paul puts it: “The test of whether people love what they do is whether they’d do it even if they weren’t paid for it – even if they had to work at another job to make a living.” When you combine that impulse with Web 2.0, you wind up with something quite powerful. Even usually gruff blogger and Kurt Cobain lookalike Ben Barren gets a little misty-eyed at the idea.

Anne says:

One thing that must scare the wigs off of media moguls is that many writers and other content creators will work for free, because it’s so intrinsically enjoyable. In fact, they’ll pay to be able to create and publish content like essays, software, videos, and photographs. I’m a great example. Not only do I pay for TypePad for my momblog and Haloscan for the comments here, I am foregoing a six-figure income in software development for the opportunity to write and think and develop what I want. I am effectively paying more than $100,000 annualized in order to do what I love.

That is a pretty incredible statement. And yes, it must scare the wigs off of many media moguls, not to mention people in lots of other businesses. How can you compete with something that allows people to do what they love and start a business at the same time? Just think of Mary and Dabble, or Josh and del.icio.us, or Kevin and digg.com, or Gabe and memeorandum.com. A recent interview with Gabe noted that his email responses came in at 3 a.m. – would he be doing that if he worked anywhere but a company he started and runs for the love of it?

The courier, the driver and the Internet

Anyone outside Toronto might not have heard of this little story – unless they frequent the boingboing.net website – but a week or so ago there was an altercation downtown between a bicycle courier named Leah and a young male driver whose name remains unknown (probably for his own protection). A local photographer happened to be there and took some shots of the driver assaulting the courier, stomping on her bike and generally being a complete asshole. He was restrained (and pummeled, apparently) by some bystanders.

The photos produced an avalanche of comments on the citynoise.org forum, and that no doubt picked up after boingboing linked to it. After spreading through the blogosphere, the story made it into one of the Toronto newspapers, the National Post, and then into the Toronto Star and the Globe and Mail. Since then there have been a number of stories about the larger picture surrounding the incident – including the fact that the courier threw garbage back into the driver’s car after he tossed it out the window, and that she keyed his car (she has apologized, and is not pressing charges).

One of the most interesting elements from my point of view, however, is how this event would never have made it into the media at all if it were not for the blogosphere – and to that extent boingboing.net and citynoise.org and sites like them act as a kind of proto-journalism, an early-warning system for the “old” media. My colleague at the Globe and Mail, columnist and author Russell Smith, put it well in something he wrote, which I’m going to quote here because it will soon be behind our “pay wall.” He describes how the comments at citynoise.org start off with misconceptions, then there are flames, then the misconceptions are corrected, then Leah comments, then (apparently) the driver’s brother weighs in.

So there is a sort of fact-checking at work here: Multiple posters will correct each other, and at some point, a witness will step forward. The reporting, and its verification, happened about as fast as any mainstream news network could do it. The Internet is a parallel news network, spreading news much faster than we in the media can with all our technology and organization. The pictures were posted some time last Thursday; by Monday morning, the discussion about them had involved thousands of people from all over the world. By the time a newspaper ran the story the following day, it was old news on the Net.

This is an interesting point. There were plenty of trolls and flames on citynoise.org, but the “true” story came out eventually, and there was plenty of commentary from both sides (pro-courier and pro-driver). Russell goes on to say:

And why did these pictures not make it to the newspapers and the TV stations right away? They – we – would have loved to have them. I think, first of all, because it didn’t occur to the photographer to go there with them. His first instinct was to post on-line. Not only is it easier to do this – no phone call, just a mouse click – but you can control how your story appears and how you get credit for it. And he knew, too, that his story’s dissemination would be just as quick and just as effective.

Interestingly enough, considering that last point about controlling the story and credit, according to a note at citynoise.org the photos there were taken and printed in the Toronto Star without the photographer’s permission – in fact, against his specific wishes. Whether he later gave permission (because I saw them in other papers) isn’t clear.

Blogs — it’s all about the conversation

This may or may not be part of the “secret sauce” in Gabe’s memeorandum.com, but I think Stowe Boyd is onto something. In a post about what makes blogs work — i.e., what makes them vibrant and helps them grow, as opposed to stagnating or becoming echo chambers — he argues that it has something to do with the ratio of comments and trackbacks to posts.

Being a geek (and I mean that in a good way), Stowe comes up with a “conversational index” that quantifies that ratio, and figures that if it is one or more — that is, if there are as many or more comments and trackbacks as there are posts — then the blog will flourish. Don Dodge has come to a similar conclusion, and so has Zoli Erdos.
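To make that concrete, here is my reading of Stowe’s index in code form, a quick Python sketch (the exact formula is my interpretation of his post, not anything official):

```python
# My reading of Stowe's "conversational index": reader responses per post.
# A value of 1.0 or higher means readers are responding at least as often
# as the author is posting -- the rough threshold for a blog that flourishes.
def conversational_index(posts, comments, trackbacks):
    if posts == 0:
        return 0.0
    return (comments + trackbacks) / posts

# Example: 40 posts that drew 35 comments and 10 trackbacks
print(conversational_index(40, 35, 10))  # 1.125 -- above the threshold
```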

I don’t know if the ratio needs to be one, or close to one, or whether you can even put a number on it, but I think this hits the nail on the head — what makes most blogs interesting isn’t so much the great things that the writer puts on there (as much as I like to hear the sound of my own voice), but what kind of response it gets, and how that develops, and who carries it on elsewhere on their own blog. And I agree that it would be nice if someone like technorati.com or memeorandum.com could track that kind of thing and make it part of what brings blogs to the top.

I like to see what people are talking about — not just what a blogger has to say, but what others have to say about what they say. That’s why I also agree with Steve Rubel that it would be nice to have a way of tracking comments, other than by subscribing to a feed of comments, or bookmarking posts you’ve commented on with del.icio.us or some other tool.

Update:

Stowe Boyd has more on the “conversation” conversation, as it were, here. And as far as tracking comments goes, no sooner did I mention it than CoComment.com came out with exactly that. I’m sure that’s a coincidence though 🙂

Memeorandum is a black box

There’s no question that Gabe of Memeorandum.com has created a tremendous resource (there’s an interview with him at Don Dodge’s blog), but I must admit it baffles me sometimes. I considered not writing this post at all, because it will probably sound like I’m just whining about not being at the top of tech.memeorandum.com with the A-listers, but I’ve followed the site for quite a while now, and the reason some posts rise or fall in the ranking of topics — and why some stay longer while others disappear — eludes me. And it kind of bugs me a little bit. And no, I’m not writing this post just to try to get to the top by mentioning Gabe 🙂

I know that the algorithm behind the site is top secret, so there’s not much point in asking about it. But today is a good example of how mysterious the system is — I’ve been on memeorandum.com many times, sometimes linked under other posts and sometimes as a major topic. I was even at the top of the site briefly one day (although it was a weekend, so that might have increased my chances). Today, though, I wrote a post about IE7, commenting on some of the criticisms and joining in the conversation, and it never appeared anywhere on tech.memeorandum.com — nor did one I wrote the day before about network neutrality.

Neither one appeared, despite the fact that I wrote them around the same time as several other people whose posts were linked or formed major memeorandum.com topic headings, including Scott Karp of Publishing 2.0 and my friends Mark Evans and Rob Hyndman. Is there something I’m doing wrong, Gabe? WordPress automatically pings Technorati and a bunch of other sites. Is it that I’m linking too much to different people, or not linking enough? I have to know. Not that I care about that kind of thing, of course. It’s just bugging me. (Dave Taylor doesn’t like memeorandum because he says it adds “an amplifier to the echo chamber” of the blogosphere.)
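For what it’s worth, the pinging itself is about as simple as Web plumbing gets: a single XML-RPC call carrying the blog’s name and URL. Here is a minimal Python sketch, assuming the standard weblogUpdates.ping method; the Technorati endpoint URL and the blog details are placeholders, so treat them as assumptions.

```python
# A minimal sketch of the kind of ping WordPress fires off when a post is
# published. Assumes the standard weblogUpdates.ping XML-RPC method; the
# endpoint URL and blog details below are placeholders, not verified values.
import xmlrpc.client

def ping(blog_name, blog_url, endpoint="http://rpc.technorati.com/rpc/ping"):
    server = xmlrpc.client.ServerProxy(endpoint)
    # weblogUpdates.ping takes the weblog's name and its URL
    return server.weblogUpdates.ping(blog_name, blog_url)

if __name__ == "__main__":
    response = ping("My Weblog", "http://example.com/blog")
    print(response)  # ping servers typically return a struct with an error flag and a message
```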

Update:

This post is now near the top of the section about Gabe, and showed up only a few minutes after I posted it, which actually makes me more confused, not less.

Update 2:

Gabe emailed me and said that both of the posts I mentioned had actually been linked on the site at different times, and he sent me links to cached versions of the pages. I guess they appeared while I wasn’t looking and then fell off the radar quickly, so I never saw them. The strange thing is that some posts (like the one above) show up right away, while the ones Gabe checked on didn’t show up for hours. Maybe I’m trying too hard to figure this whole thing out — I should probably just go read a book, or alphabetize my CDs or something useful 🙂

Hey bloggers — MSFT doesn’t care about you

Many of the reviews and comments about IE7, the new beta of Microsoft’s Internet Exploder (er, Explorer), have focused on its RSS implementation. Adam Green at Darwinianweb.com got everybody’s attention when he said he thought the browser would kill a lot of aggregators, and later amended this to say that while IE7’s handling of RSS wasn’t that great, it was probably good enough. As he put it — in a phrase I wish I had come up with:

“Microsoft long ago mastered the trick of calculating exactly the minimal feature set needed to suck the air out of a market it wants to enter.”

That is exactly right. It’s not that IE7’s version of an RSS reader is that great — in fact, it is pretty much “just like favourites,” as Scott Karp at Publishing 2.0 puts it — it’s that it’s probably just good enough for most people. Dave Winer might be right when he says that the “river of news” is a better model for an aggregator, but Microsoft doesn’t really have a dog in that race. It just wants something simple that people can use without too much trouble.
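To make the distinction concrete, a “river of news” aggregator doesn’t bother with per-feed unread counts at all; it just merges everything into one reverse-chronological stream you can skim. Here is a rough Python sketch of that model, using the feedparser library and placeholder feed URLs.

```python
# A rough sketch of the "river of news" model: merge every item from every
# feed into a single reverse-chronological stream. Feed URLs are placeholders.
import time
import feedparser

FEEDS = [
    "http://example.com/blog-one/rss.xml",   # placeholder
    "http://example.com/blog-two/atom.xml",  # placeholder
]

def river(feed_urls, limit=25):
    items = []
    for url in feed_urls:
        parsed = feedparser.parse(url)
        source = parsed.feed.get("title", url)
        for entry in parsed.entries:
            published = entry.get("published_parsed") or entry.get("updated_parsed")
            if published:  # skip items with no usable timestamp
                items.append((published, source, entry.get("title", "(untitled)")))
    items.sort(key=lambda item: item[0], reverse=True)  # newest first
    return items[:limit]

if __name__ == "__main__":
    for published, source, title in river(FEEDS):
        print(time.strftime("%Y-%m-%d %H:%M", published), "|", source, "|", title)
```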

Is the way they have done it good enough? That remains to be seen. RSS is still not easy enough, as my friend Paul Kedrosky keeps pointing out, and people are (in general) lazy. Not everyone wants to see if they can break Robert Scoble’s record for most RSS feeds subscribed to. Kent Newsome asks why he should care about IE7, and the answer is that he probably shouldn’t.

We are all “edge cases,” as someone has pointed out, and I would have to go along with Jeff Nolan – IE7 wasn’t designed for us. Simple as that. We can keep on using Firefox and Performancing and Greasemonkey and all those great things, but the fact is IE still has 80 per cent of the browser market, and it got that way by not being on the edge.

Telecoms and the toll-road gambit

I wasn’t sure whether to write anything about the “network neutrality” issue, in part because my friend Rob Hyndman has done such a good job of covering the subject – including an overview of the current state of affairs in his latest post – but as usual I couldn’t resist 🙂 Verizon has reportedly filed documents with the Federal Communications Commission saying it plans to use as much as 80 per cent of its network for its own purposes. Everything else would get shoehorned into the remainder (although Cynthia at IPDemocracy says it might not be as bad as it sounds, and it looks like Om Malik agrees).

This, of course, is just the latest step in a campaign by the major telcos to strong-arm – sorry, convince – Internet companies such as Google and Yahoo to pay extra for delivery of their broadband content to consumers, a campaign that got its start with comments from Ed “pay up for those pipes” Whitacre of AT&T (formerly SBC) and Bill Smith of BellSouth. Why should they have to carry all that content on their networks, the telcos complain – why should Google make money from broadband and not share some of it with the carriers whose pipes it uses?

As Mike at Techdirt notes, part of the problem is that the phone companies haven’t spent the money necessary to do all the things they want to do on their networks. The telcos made all kinds of promises about upgrades they planned to make – in return for which they got various concessions from the U.S. government – and then they never followed through, as telecom analyst Bruce Kushnick writes in a new book called The $200-Billion Broadband Scandal.

The big question is: Will the U.S. government allow the telcos to get away with this move, or will it step in to enforce some form of network neutrality? There used to be a concept called the “common carrier” principle, under which telcos were required to carry any and all voice traffic — but that idea seems to have gone out the window.

Newspapers need to get a clue – quickly

The Paris-based World Association of Newspapers, a body that appears to be almost pathologically clueless when it comes to the Internet, is blustering and grumbling about how search engines such as Google News are “stealing” its members’ content and should be made either to stop or to pay for it. Although the group hasn’t said what it has in mind, it is muttering darkly about challenging the “exploitation of content” that its members feel is going on. In a magnanimous gesture, it admitted that search engines help drive traffic to newspaper sites, but said this didn’t justify the fact that Google and others have built their businesses on “taking content for free.”

This issue has come up before, when a representative of the European Publishers Council accused Google and other Web search companies of being “parasites” living off the content of others. Gavin O’Reilly of the WAN has been quoted as saying that the Web companies are engaging in “kleptomania.” Here’s what he told the Financial Times:

Mr O’Reilly likened the initiative to the conflict between the music industry and illegal file-sharing websites and said it was not a sign that publishers had failed to create a competitive online business model of their own. “I think newspapers have developed very compelling web portals and news channels but the fact here is that we’re dealing with basic theft,” he said [snip]. Services such as Google News link to original news stories on the home pages of newspapers and magazines and display only the headline and one paragraph of the story [but] “That’s often enough” for readers browsing the top stories, Mr O’Reilly said.

I must admit that I thought the WAN was out of its mind even to bring this subject up in the first place, but the comparison to the RIAA and its war against file-sharing took the association’s case well past stupidity and into the realm of farce (ironically, as Rafat at PaidContent points out, the WAN has a great blog called Editors Weblog). How exactly is linking to a newspaper’s website with the headline and first paragraph of a story the same as downloading an entire song from a P2P application? The answer: It isn’t.

As for Mr. O’Reilly’s argument that readers are often satisfied with the headline and one paragraph, whose fault is that? Maybe the WAN should try suing every user of Google News in court, the way the RIAA has — that’ll show them. Or its members could block all search engines, and get no traffic whatsoever. As James Robertson notes, this appears to be more about a cash grab than about the way search engines work. Techdirt asks whether newspapers can really be that clueless, and the short answer is: Yes.