The CIA’s venture business is going strong

Dozens of venture capital firms have come and gone since the great Internet bubble of the late 1990s, but one relatively little-known firm has stuck around and continued to invest heavily in some of the Web’s leading technologies. It’s probably a lot easier to do this if you only have one investor to answer to, and In-Q-Tel has that in spades: its sole controlling shareholder is the U.S. government, in the form of the Central Intelligence Agency.

Created in 1999, In-Q-Tel is the venture capital unit of the federal spy agency, and was set up as a way for the CIA to stay in touch with “cutting-edge technology” that might be useful to its purpose as the U.S. government’s intelligence arm. According to the CIA site, the idea for such a venture came from Dr. Ruth David, a former CIA Deputy Director for Science and Technology. The agency’s rationale was that “as an information-based agency, the CIA must be at the cutting edge of information technology in order to maintain its competitive edge and provide its customers with intelligence that is both timely and relevant.”

As the CIA site notes, the agency has been involved in many leading technologies over the past few decades, including the U-2 and SR-71 reconnaissance airplanes and Corona surveillance satellites — and, of course, the Internet itself was a spinoff of a research project developed by the Defense Advanced Research Projects Agency (DARPA). But the agency apparently realized that the pace of technology development was accelerating beyond its ability to keep up, and that investing in startups might be a way of keeping track of those new technologies, and possibly benefiting from them as well.

As the CIA says itself: “One of the great leaps of faith the Agency took in this venture was to recognize, early on, that private sector businessmen were better equipped than it was to design the Corporation and create its work program.”

One of In-Q-Tel’s investments from several years ago has paid off in both senses: the company invested in a small geo-targeting and satellite-mapping company called Keyhole, which was bought by Google in 2004 and whose technology became Google Earth. Although Google did not say how much it paid for the company, In-Q-Tel sold more than 5,000 shares of Google following the deal and pocketed about $2.2-million. As of 2006, it had reportedly invested more than $130-million in about 90 businesses.

The company invests in several broad areas, including: application software and analytics; bio-, nano- and chemical technologies; communications and infrastructure; digital identity and security; and embedded systems and power. Its investments include companies such as Asankya — which is working on a way of speeding up Internet traffic using a proprietary technology — and Decru, which makes highly secure data-storage products. The company has also invested in Attensity, which does advanced text analysis on large quantities of data, something the CIA no doubt does a fair bit of, and a company called Stratify that specializes in analyzing “unstructured data.”

All four of those investments have potential implications for the future of the Web, since both Asankya and Decru’s technologies are used in “cloud computing” infrastructure, and Attensity and Stratify’s products are useful for taking the existing Web and adding layers of meaning or understanding on top of it — along the lines of what Web creator Tim Berners-Lee has called the “semantic Web.”

Although not every investment the company has made has paid off, even formerly skeptical Silicon Valley investment watchers have grudgingly admitted that In-Q-Tel has done pretty well for itself, and one of its founders told the Washington Post as far back as 2005 that while he still saw the venture firm as an “unproved experiment,” it was already “far more successful than I ever dreamed it would be.”

Islamic clerics consider issuing a fatwa against Facebook

As one of the top — if not *the* top — social-networking sites, Facebook tends to draw a disproportionate amount of criticism from those concerned about the effect the site has on young minds or the welfare of society as a whole. And no one is more concerned about those risks than the various religious groups who routinely try to ban, block or otherwise crack down on the scourge of modern social networking.

The latest to make this kind of move — or at least a threat in that direction — was a group of Islamic clerics meeting in Indonesia. They declared that they were contemplating a “fatwa,” or religious ruling, on Facebook that would require observant Muslims to practice what amounts to “safe social networking” or suffer the wrath of their imam.

In a nutshell, a spokesman for the group said that Facebook and similar sites could be used for appropriate purposes such as education about the faith or keeping in touch with one’s family and friends, but noted that they could also present a temptation.

“The clerics think it is necessary to set an edict on virtual networking, because this online relationship could lead to lust, which is forbidden in Islam,” said Nabil Haroen, a spokesman for the Lirboyo Islamic boarding school, which was hosting the event. The head of the council of imams said that the growing number of Facebook users in Indonesia was a controversial subject among Muslim leaders and that he favored a ban because of possible sexual content.

“People using Facebook can be driven to engage in distasteful, pornographic chatting,” said Amidan, who — like most Indonesians — uses only one name. Another spokesman for the clerics said that “spreading ill words about others, gossiping and other things that go against religious teaching on social networking sites in the virtual world are forbidden according to Islamic law.” Despite these protests, one Muslim group on Facebook has 48,000 members, while a Muslim fan page has 18,000 fans.

The Indonesian clerical group’s move would not be the first time the country has stepped in to block social-networking sites for religious reasons. Last year, the country ordered its largest ISPs to block YouTube and MySpace because they both carried an anti-Islam film called Fitna, created by Dutch politician Geert Wilders, and the government said that seeing it might “disturb relations between faiths.”

And Muslims aren’t the only religious groups to be concerned about the Internet and the intrusion of social networking into the lives of the faithful. Although Pope Benedict and other senior members of the Catholic clergy have made positive statements about the benefits of the Internet — and even at one point created a Catholic version of Facebook called Xt3.com to appeal to young people — not everyone is quite so sanguine about these new services.

In a recent address to his flock, the Bishop of Paisley, Rt. Rev. Philip Tartaglia, warned the faithful that “In dialogue with others we need to be wary of the inane chatter that can go on in the digital world which does nothing to promote growth in understanding and tolerance.” He also raised concerns about who young people might contact through the networks, saying: “What parent has not wondered what their child is doing on the internet? What material are they accessing? Who are they talking to in social networking sites?”

Jewish groups have yet to raise any substantial concerns about social networking and its effects from a religious point of view, although there are Orthodox adherents who believe Facebook and similar sites can lure the faithful away from the path of righteousness. They have, however, become concerned about the use of Facebook as a tool to spread hatred about their faith, including a number of groups that deny the existence of the Holocaust. So far, Facebook has said that it believes the groups fall under the category of freedom of speech, and has resisted efforts to close them.

Facebook is making money on apps

By now, everyone seems to have become pretty comfortable with the idea that Facebook is a revenue-generating enterprise. Although originally there was a lot of skepticism about whether the social network would be able to produce much revenue, advertising deals with companies like Microsoft (which invested $240-million for 1.6 per cent of the company in 2007) have established that Facebook is definitely producing plenty — as much as $500-million in revenue this year.

What’s even more fascinating, however, is the amount of money that is being generated by the Facebook “ecosystem” — that is, the considerable number of applications, tools and games that are built using Facebook’s F8 platform. According to some estimates (and they are just estimates), app developers as a whole could bring in almost as much or possibly even more revenue this year than Facebook itself.

As Eric Eldon points out, coming up with an overall estimate of what Facebook app developers are making is difficult — if not impossible. Many developers don’t want to say publicly what they bring in, for competitive reasons. But some of the estimates that have appeared, particularly about the largest app developers, are accepted by most observers as being fairly accurate. Zynga, for example, which makes the Texas Hold ‘Em app, is believed to have a “run rate” that would produce revenue this year of about $100-million.

That’s a single developer (although it has multiple apps in its stable). About half of its revenue is estimated to come from its Facebook apps, and the other half from its MySpace businesses. Playfish is expected to make about $30-million this year. In all, Eldon estimates that the top handful of developers make about $150-million, followed by several other tiers of smaller developers who collectively make another $150-million or so. By the end of this year, several sources said that could hit $500-million.

Online gaming industry players told Advertising Age magazine the same thing, with many of them estimating developer revenue could hit $500-million or more. “It wouldn’t surprise me if apps on Facebook generate more revenue this year than Facebook,” LivingSocial CEO Tim O’Shaughnessy told the magazine. LivingSocial currently has the most popular Facebook app, “Pick Your Five.” Mark Pincus of Zynga recently wrote up some of his tips on how to make money through Facebook apps for the Facebook developers blog.

One interesting thing to note is that Zynga and several other developers make money in a variety of ways — not just through banner advertising, as many online media outlets do, but also through CPA (cost per action) payments, as well as the sale of virtual goods. In Zynga’s case, the company makes about a third of its revenue from the sale of poker chips for its Texas Hold ‘Em games. The virtual goods market is one that is already well established in other countries: in China, for example, gaming company Tencent made $1-billion in revenue from the sale of goods and services in its virtual world, including clothing for avatars.

Is there a lesson here for other companies such as Twitter, which is searching for revenue-generating opportunities for its fast-growing service? Clearly, there is. If you can build a social network that attracts the kind of devoted users that Facebook has, and in large enough numbers, you can generate substantial amounts of revenue both for yourself and your partners. And if Facebook is developing an integrated payment system (as it is rumoured to be), the revenue potential could be set to explode.

Europe protests U.S. control of ICANN

The popular perception of the Internet is that it is inherently global in nature, an international network that is just as open and accessible regardless of what country you happen to be in, or what language you speak. And for the most part (with the exception of some totalitarian states such as China) that is the case. However, the keys to this particular kingdom belong to one country: namely, the U.S., through its control of the Internet Corporation for Assigned Names and Numbers.

Some critics don’t like that state of affairs, and are trying to change it — including Viviane Reding, the European Union’s Commissioner for Information Society and Media. The EU official said this week that ICANN should not be overseen by the U.S. government (the agency operates under an agreement with the Commerce Department), but should be run democratically by a group of states. While the U.S. has done a good job of managing the process, she said, “in the long run, it is not defendable that the government department of only one country has oversight of an Internet function which is used by hundreds of millions of people in countries all over the world.”

ICANN is the non-profit entity that controls the Internet’s entire domain name structure — including the creation of new “top-level” domains such as .xxx and .mobi — and is in charge of handing out domain names and IP addresses. It manages this process through a separate entity known as IANA (the Internet Assigned Numbers Authority), which coordinates the 13 root nameservers that sit at the top of the domain name system, the hierarchy through which domain names are translated into IP addresses. Until relatively recently, these root servers were all located in the United States, in most cases inside government offices.
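The delegation chain those root servers anchor can be sketched with a toy resolver. The zone data below is invented for illustration; a real resolver sends network queries to the root, TLD and authoritative servers in turn, and handles caching and many more record types:

```python
# Toy sketch of hierarchical DNS resolution. All zone data is made up
# for illustration; real resolvers query actual root, TLD and
# authoritative nameservers over the network.

ROOT = {"com": "tld-com"}                            # root servers delegate TLDs
TLDS = {"tld-com": {"example.com": "ns-example"}}    # TLD servers delegate domains
AUTH = {"ns-example": {"www.example.com": "93.184.216.34"}}  # authoritative records

def resolve(hostname: str) -> str:
    """Walk the delegation chain: root -> TLD server -> authoritative server."""
    tld = hostname.rsplit(".", 1)[-1]
    tld_server = ROOT[tld]                       # step 1: ask a root server
    domain = ".".join(hostname.split(".")[-2:])
    auth_server = TLDS[tld_server][domain]       # step 2: ask the TLD server
    return AUTH[auth_server][hostname]           # step 3: ask the authoritative server

print(resolve("www.example.com"))  # -> 93.184.216.34
```

The point of the sketch is that the root servers never answer the final question themselves; they merely point resolvers to the next level down, which is why control of that top layer carries so much weight.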

The EU commissioner said in her recent address that she hopes President Obama continues with the Clinton administration’s plan to privatize ICANN, and she suggested that a perfect opportunity for such a move is coming in September, when the agency’s agreement with the U.S. government expires. Among other things, Reding said that she would like ICANN to be regulated by an international tribunal (any legal disputes involving the agency are currently handled by the courts in California, since it is located there).

There have been discussions in the past about the United Nations taking over control of ICANN, and other EU critics have requested that the European Union investigate the U.S. agency for restraint of trade because of its control of the plumbing of the Internet. Consumer advocate Ralph Nader has also criticized ICANN.

Will the September expiry of the U.S. government’s deal with ICANN be the opportunity that Viviane Reding and others are hoping for? Will the international community — either the EU itself or the UN — take over the reins of the Internet? There certainly seems to be increasing interest in doing so, and the Obama administration is ideologically a lot closer to the previous Clinton government, which had begun the process of releasing ICANN from its government associations. All that remains is for the last ties to be cut, and for the agency to become as international as the Internet whose plumbing it manages.

It’s all about dematerialization

If climate-change experts are correct in thinking that we could be close to a “tipping point” that might accelerate the already substantial climate change we’ve seen to date, the pressure to find solutions is sure to intensify. One of the potential mitigating factors is something that economists and scientists like to call “dematerialization.” Simply put, this is the process by which products and services with greater environmental impact are gradually replaced by those with less impact. As Wikipedia defines it, the term refers to the absolute or relative reduction in the quantity of materials required to serve economic functions in society.

Theoretically, the IT industry should be a great contributor to this effect, for a variety of reasons. For example, computers have become a lot smaller, cheaper and more efficient in terms of computing power per watt of energy. As this piece from Hewlett-Packard notes, the fastest computer in 1946 performed about 5,000 operations a second, weighed more than 30 tons, and consumed almost 200 kilowatts of electricity. In contrast, an off-the-shelf laptop purchased today has thousands of times the processing power, weighs just a few pounds, and consumes less than one-thousandth the electricity.
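A back-of-the-envelope calculation makes the efficiency gap concrete. The 1946 figures come from the comparison above; the laptop numbers (roughly 10 billion operations a second at about 50 watts) are illustrative assumptions, not sourced figures:

```python
# Rough compute-efficiency comparison. The 1946 figures are quoted in the
# text; the laptop figures are assumed for illustration only.
eniac_ops_per_sec = 5_000
eniac_watts = 200_000                 # ~200 kilowatts, as quoted

laptop_ops_per_sec = 10_000_000_000   # assumption: ~10 billion ops/sec
laptop_watts = 50                     # assumption: ~50 watts

eniac_eff = eniac_ops_per_sec / eniac_watts      # 0.025 ops per watt
laptop_eff = laptop_ops_per_sec / laptop_watts   # 200 million ops per watt

print(f"Efficiency gain: {laptop_eff / eniac_eff:.1e}x")
```

Under those assumptions the gain in operations per watt is on the order of billions — which is exactly why dematerialization looks so promising on a per-unit basis.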

This effect can have related “rebound” effects, however, as a number of researchers have noted. The fact that electronic devices, including computers, have become smaller and cheaper to both buy and operate means that there are also a lot more of them, as this paper points out, which can contribute to a consumer attitude of “replace rather than repair.” As the researchers note, while dematerialization “may be the case on a per-unit basis, the increasing number of units produced can cause an overall trend toward materialization with time.” And then there are all-new devices as well: according to Apple, the lifecycle carbon footprint of an iPhone amounts to 121 pounds of CO2-equivalent greenhouse gas emissions over an expected three-year lifetime of use.

While IT services might lead to improvements in a number of ways — thanks to developments such as videoconferencing, telework, and electronic commerce — it’s difficult to draw a straight line connecting these services to an overall reduction in environmental impact, according to some scientists who have looked at the issue. While IT can improve the efficiency of existing operations and create new, more efficient ways of doing the same things, “having the means to use less materials does not mean that they will be adopted, nor does it guarantee that their adoption will actually lead to dematerialization.” As an example, despite the arrival of personal computers and other electronic services such as email, the total consumption of paper has doubled since 1950.

So while dematerialization has promise as a mitigating factor in climate change — and the contributions of the IT industry on that score are many — it is far from being a panacea, since the effects of such changes in the IT business can be more than counterbalanced by contrary activity elsewhere in the global economy.

America has a new Chief Technology Officer

So America has a new “Chief Technology Officer” (although the actual name of the position is Associate Director for Technology in the Office of Science and Technology Policy). Aneesh Chopra, formerly Virginia’s Secretary of Technology, didn’t get any resounding cheers from the hardcore geeks — in part because he isn’t from the center of the geek-o-sphere, Silicon Valley — but even having the position at all is a big step up from previous administrations, so most industry observers are fairly positive about the appointment. But will the position become simply a platform for simplistic solutions, such as a broadband-for-all policy, as some fear, or will it have broader and more far-reaching effects, as some are recommending?

Those who are hoping for more depth than a simple broadband rollout strategy will likely be pinning their hopes on the newly announced President’s Council of Advisors on Science and Technology (PCAST), a non-policy-making body that President Obama named on Monday. Perhaps the most prominent name on the list is Eric Schmidt, the CEO of Google. As one observer noted, it wasn’t that long ago that the search-engine giant was seen as an outsider in Washington, with little or no ties to the government and no real influence. That has definitely changed with the arrival of Barack Obama and his technology-friendly White House.

The full PCAST roll call also includes Craig Mundie, the Chief Research and Strategy Officer at Microsoft Corporation; Shirley Ann Jackson, president of Rensselaer Polytechnic Institute and former Chair of the U.S. Nuclear Regulatory Commission; James Gates Jr., director of the Center for String and Particle Theory at the University of Maryland; and Rosina Bierbaum, dean of the School of Natural Resources and Environment at the University of Michigan. Breadth is something that the Obama administration seems to have more or less covered, since the list includes specialists in chemistry, physics, biology, geology, computer science, engineering, environmental science, astrophysics and internal medicine.

The President said that the PCAST board would be charged with advising him “about national strategies to nurture and sustain a culture of scientific innovation.” Some hope that Schmidt and Mundie will be able to sway Obama on the topic of net neutrality, something both are interested in. And those hoping for a broad mandate can also take comfort in the fact that the government has pledged to spend 3 per cent of the country’s GDP, or about $415.2-billion, on science and technology research, development and education — the largest financial commitment in America’s history (larger even than the original moon-shot space program). Among other things, Obama has proposed doubling the budgets for the National Science Foundation, the Department of Energy’s Office of Science, and the National Institute of Standards and Technology.

Could these kinds of investments be jeopardized by the shaky economy? No doubt. But the President has made it clear that investment in science and technology research and development is a priority for his government, and he arguably has an advisory group with enough intellectual heft and real-world experience to make it a reality.

Britannica tries to eat a little of Wikipedia’s lunch

After years of more or less ignoring its open-source competitor, the venerable Encyclopedia Britannica will soon be taking a page from Wikipedia’s playbook and allowing members of the public to contribute to articles and other content at Britannica.com. That’s according to Jorge Cauz, president of the 240-year-old institution, which at one time was synonymous with knowledge in many Western households and schools. The Britannica head told the Sydney Morning Herald in Australia and The Times in the UK that Britannica plans to offer the new features on its website soon.

Mr. Cauz made it clear, however, that anything submitted by users will have to be vetted by one of the encyclopedia’s staff of paid researchers before it appears either on the website or in the actual print version of the EB. “We’re not trying to be a wiki – that’s the last thing we want to be,” he told The Times. “Britannica doesn’t offer that voyeuristic benefit. Users won’t be able to write anything they want and have it published.” The changes — which are just part of the creation of a larger Britannica community portal — were first described last June, and it was made clear then that Britannica didn’t plan on letting the whole “crowd-sourcing” thing get out of hand: “We are not abdicating our responsibility as publishers or burying it under the now-fashionable ‘wisdom of the crowds,’” a blog post said at the time.

Of course, the Encyclopedia Britannica head probably knows that in many cases, users can’t just write anything they want and have it published in Wikipedia either. There are thousands of editors and administrators working on the open-source encyclopedia (all of them unpaid volunteers) who check page changes for accuracy and to make sure they uphold the Wikipedia principles of fairness and a “neutral point of view.” While there are some pages that can be edited freely, where mistakes might not be noticed quickly, other pages (including the one about President George W. Bush) are “locked,” and can’t be edited by anyone but a Wikipedia-sanctioned moderator. A well-known comparison between Britannica and Wikipedia found that the error rate in each case was roughly equivalent. Wikipedia founder Jimmy Wales has also recently proposed changes that would restrict editing even further.

It also seems pretty obvious from the Britannica president’s comments that he is a) more than a little jealous of Wikipedia’s traffic numbers (the open-source encyclopedia gets about 6 million visitors a day, while Britannica gets about 1.5 million a day) and b) irritated that Google features Wikipedia links so prominently in the search results for many common terms that people might otherwise go to Britannica.com for. “If I were to be the CEO of Google or the founders of Google I would be very [displeased] that the best search engine in the world continues to provide as a first link, Wikipedia,” Cauz told the Sydney Morning Herald. “Is this the best they can do?” He also made it clear that he sees Wikipedia as the fast-food version of knowledge, saying many people turn to it for answers, but that many people are also “happy to eat McDonalds every day.”

Britannica isn’t the only one trying to take the Wikipedia model and blend it with the authoritative voice of the expert: a project called Citizendium, which started up a little over a year ago, was created by Larry Sanger — a co-founder of Wikipedia — as an attempt to build a “crowd-sourced” encyclopedia with input from subject-matter experts rather than just anyone. Google has taken some steps in that direction as well, with a service called Knol (derived from “knowledge”), which encourages experts to create Wikipedia-style entries on specific subjects. But neither Knol nor Citizendium has gotten much traction, and they are certainly nowhere near challenging Wikipedia for the title of “the people’s encyclopedia.” Whether Britannica’s changes can put it back in the race remains to be seen.

China cracks down on Twitter

For a brief moment 20 years ago, a lone figure blocked the path of a giant tank in Tiananmen Square, and it seemed as though Chinese dissidents might be able to shake the pillars of power in that vast country and bring some semblance of democracy to China. In the wake of the pro-democracy protests, however, the Chinese government redoubled its crackdown on dissidents and crushed any hint of dissent. As the 20th anniversary of the Tiananmen Square demonstration approaches, Chinese authorities seem more determined than ever not to allow even a hint of unrest to appear anywhere — including the Internet.

Early this week, journalists and bloggers in China started noticing that Twitter, the chat-style social networking application, was no longer available — and that users were also having difficulty accessing other popular websites and services such as Google-owned YouTube and the Yahoo-owned photo-sharing website Flickr. Blogging platforms such as WordPress and Blogger (also owned by Google) were said to be unavailable as well. According to several reports, the Chinese authorities had also shut down message boards on thousands of websites used by college students.

Blocking and/or filtering services is nothing new for the Chinese government. Even widely used and official services such as Google, Yahoo and Microsoft’s MSN Search are filtered by the authorities to exclude any reference to the events in Tiananmen Square in 1989, and all of the major search engines have in the past supplied names of specific users to the Chinese government. As more and more official services have been blocked or filtered, the popularity of social-networking sites such as Twitter has increased. As a number of observers have noted, Twitter takes on even more significance in a country like China, not just because it allows people to exchange messages easily, but because the use of Chinese characters allows Twitter posts to include far more information than a similar English message.
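A quick illustration of that density difference (the translation pair below is my own example, not drawn from the article): a message that takes 46 characters in English fits into nine Chinese characters, leaving far more of Twitter’s 140-character limit free for additional content.

```python
# Each Chinese character carries roughly a word's worth of meaning, so a
# 140-character limit is far less constraining. The translation pair is
# an illustrative example only.
english = "Meet me at the square tomorrow at five o'clock"
chinese = "明天五点在广场见我"  # roughly the same message

print(len(english))  # character count of the English message
print(len(chinese))  # character count of the Chinese version
```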

As the Committee to Protect Journalists details in a recent piece, the Chinese government has been cracking down on virtually every form of media: newspapers published in Hong Kong that might mention the anniversary are blocked from delivery to mainland China, and radio broadcasts are interrupted by commercials if the topic comes up; even a former Communist Party leader’s memoirs are no longer available through official channels (according to the New York Times), presumably because they mention the events of June 3 and 4, 1989. Several prominent Chinese citizens associated with the demonstration were either detained or warned to leave the capital before the anniversary.

Despite the official crackdowns, however, Twitter users seemed determined to find a way around what Chinese dissidents and other critics refer to as the “Great Firewall of China.” As the blockage moved into its second day, users were exchanging tips on using so-called “anonymous proxy” software such as Hotspot Shield (which hides the IP address of a user who runs the software) and other similar technological workarounds.

The RIAA switches tactics

The Recording Industry Association of America has been waging war against copyright infringement — in the form of illegal downloads — since the file-sharing app known as Napster first appeared on the scene a decade ago. For much of that time, the RIAA’s preferred form of attack has been the lawsuit, with the record-industry body filing claims against tens of thousands of individuals for financial damages, including several high-profile cases that have targeted single mothers, war veterans, and teenagers for sharing as few as a dozen songs. Now, the RIAA has apparently decided to switch tactics, according to a recent report in the Wall Street Journal, and will no longer sue individual downloaders.

Instead, the agency plans to strike deals with Internet service providers under which the ISPs agree to contact individual file-sharers when notified by the RIAA that illegal activity is occurring. If a customer ignores two such warnings, the RIAA hopes, the ISP will then shut off that customer’s access to the Internet.

What this means is that the record-industry group is trading a head-on assault for a back-room negotiation with Internet providers, in the hope that it can do an end-run around the problems its previous strategy produced. Not only was the lawsuit approach a magnet for negative publicity, but it also ran into repeated legal roadblocks as well. In one of the latest, for example, a U.S. court ruled that the RIAA would have to show that songs had actually been downloaded before it could prove infringement. Unfortunately for the RIAA, however, its new strategy could cause some legal headaches as well, since the “three strikes” approach is almost certain to be challenged in court.

One of the biggest issues with such an approach is that it assumes the RIAA has correctly identified a) the specific songs being infringed, and b) the individual doing the infringing. If it makes a mistake in either determination, an innocent user could have their Internet access cut off without any recourse. And over the years that the record industry has been pursuing its legal strategy, there has been plenty of evidence that its ability to do either a) or b) is far from perfect. The agency has repeatedly mis-identified not only the individuals doing the sharing — who in many cases are not the customers the Internet account belongs to — but also songs as infringing when they had been legally acquired.

The “three strikes” approach is not a new one. The French government has proposed new legislation that would require ISPs to cut off the access of users who repeatedly infringe on copyright, although the legislation has yet to become law. An amendment by the European Parliament earlier this year strongly urged France and other European Union countries not to adopt this kind of legislation, but it was later overturned by the EU Council of Ministers, and according to the most recent reports France is going ahead with it. Britain’s record-industry lobby group managed to convince several ISPs to warn their users when notified by the agency, but they have stopped short of actually cutting off their users’ Internet access after a third “strike.”

The RIAA may think that it has come up with a clever way around its earlier legal hassles, but it could find that trying to make ISPs into copyright police faces just as many hurdles, if not more.

Would we create the Web today?

Duke University law professor and Creative Commons board member James Boyle has a thoughtful piece in the Financial Times about the Web and whether it would (or could) be created today the way it is now. He ends with a depressing thought (via Slashdot):

“Why might we not create the web today? The web became hugely popular too quickly to control. The lawyers and policymakers and copyright holders were not there at the time of its conception. What would they have said, had they been? What would a web designed by the World Intellectual Property Organisation or the Disney Corporation have looked like? It would have looked more like pay-television, or Minitel, the French computer network…. The lawyers have learnt their lesson now. The regulation of technological development proceeds apace. When the next disruptive communications technology — the next worldwide web — is thought up, the lawyers and the logic of control will be much more evident. That is not a happy thought.”