It’s all about dematerialization

If climate-change experts are correct in thinking that we could be close to a “tipping point” that might accelerate the already substantial climate change we’ve seen to date, the pressure to find solutions is sure to intensify. One of the potential mitigating factors is something that economists and scientists like to call “dematerialization.” Simply put, this is the process by which products and services with greater environmental impact are gradually replaced by those with less impact. As Wikipedia defines it, the term refers to the absolute or relative reduction in the quantity of materials required to serve economic functions in society.

Theoretically, the IT industry should be a major contributor to this process, for a variety of reasons. For example, computers have become much smaller, cheaper and more efficient in terms of computing power per watt. As this piece from Hewlett-Packard notes, the fastest computer in 1946 performed about 5,000 operations a second, weighed more than 30 tons, and consumed almost 200 kilowatts of electricity. In contrast, an off-the-shelf laptop purchased today has thousands of times the processing power, weighs just a few pounds, and consumes less than one-thousandth the electricity.
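As a rough sanity check of that power comparison (the laptop wattage here is an illustrative assumption, not a figure from the HP piece):

```python
# Back-of-envelope check of the ENIAC-vs-laptop power comparison.
eniac_watts = 200_000   # ~200 kilowatts, per the HP piece cited above
laptop_watts = 60       # assumed typical laptop draw; not from the source

ratio = eniac_watts / laptop_watts
print(f"A laptop draws roughly 1/{ratio:,.0f} of ENIAC's power")
```

At an assumed 60 watts, the laptop comes in at about one three-thousandth of ENIAC's draw, comfortably inside the "less than one-thousandth" claim.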

This effect can have related “rebound” effects, however, as a number of researchers have noted. The fact that electronic devices such as computers have become smaller and cheaper to both buy and operate means there are also a lot more of them, as this paper points out, which can contribute to a consumer attitude of “replace rather than repair.” As the researchers note, while dematerialization “may be the case on a per-unit basis, the increasing number of units produced can cause an overall trend toward materialization with time.” And then there are all-new devices as well: according to Apple, the lifecycle carbon footprint of an iPhone amounts to 121 pounds of CO2-equivalent greenhouse gas emissions over an expected three-year lifetime of use.

While IT services might lead to improvements in a number of ways — thanks to developments such as videoconferencing, telework, and electronic commerce — it’s difficult to draw a straight line connecting these services to an overall reduction in environmental impact, according to some scientists who have looked at the issue. While IT can improve the efficiency of existing operations and create new, more efficient ways of doing the same things, “having the means to use less materials does not mean that they will be adopted, nor does it guarantee that their adoption will actually lead to dematerialization.” As an example, despite the arrival of personal computers and other electronic services such as email, the total consumption of paper has doubled since 1950.

So while dematerialization has promise as a mitigating factor in climate change — and the contributions of the IT industry on that score are many — it is far from being a panacea, since the effects of such changes in the IT business can be more than counterbalanced by contrary activity elsewhere in the global economy.

America has a new Chief Technology Officer

So America has a new “Chief Technology Officer” (although the actual name of the position is Associate Director for Technology in the Office of Science and Technology Policy). Aneesh Chopra, formerly Virginia’s Secretary of Technology, didn’t get any resounding cheers from the hardcore geeks — in part because he isn’t from the center of the geek-o-sphere, Silicon Valley — but even having the position at all is a big step up from previous administrations, so most industry observers are fairly positive about the appointment. But will the position become simply a platform for simplistic solutions, such as a broadband-for-all policy, as some fear, or will it have broader and more far-reaching effects, as some are recommending?

Those who are hoping for more depth than a simple broadband rollout strategy will likely be pinning their hopes on the newly announced President’s Council of Advisors on Science and Technology (PCAST), a non-policy-making body that President Obama named on Monday. Perhaps the most prominent name on the list is Eric Schmidt, the CEO of Google. As one observer noted, it wasn’t that long ago that the search-engine giant was seen as an outsider in Washington, with little or no ties to the government and no real influence. That has definitely changed with the arrival of Barack Obama and his technology-friendly White House.

The full PCAST roll call (the complete list is here) also includes Craig Mundie, the Chief Research and Strategy Officer at Microsoft Corporation; Shirley Ann Jackson, president of Rensselaer Polytechnic Institute and former Chair of the US Nuclear Regulatory Commission; James Gates Jr., director of the Center for String and Particle Theory at the University of Maryland; and Rosina Bierbaum, dean of the School of Natural Resources and Environment at the University of Michigan. Breadth is something that the Obama administration seems to have more or less covered, since the list includes specialists and experts in chemistry, physics, biology, geology, computer science, engineering, environmental science, astrophysics and internal medicine.

The President said that the PCAST board would be charged with advising him “about national strategies to nurture and sustain a culture of scientific innovation.” Some hope that Schmidt and Mundie will be able to sway Obama on the topic of net neutrality, something both are interested in. And those hoping for a broad mandate can also take comfort in the fact that the government has pledged to spend 3 per cent of the country’s GDP, or about $415.2-billion, on science and technology research, development and education — the largest financial commitment in America’s history (larger even than the original moon-shot space program). Among other things, Obama has proposed doubling the budgets for the National Science Foundation, the Department of Energy’s Office of Science, and the National Institute of Standards and Technology.
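A quick back-of-envelope check of that pledge: a $415.2-billion commitment described as 3 per cent of GDP implies a GDP of roughly $13.8 trillion, which is consistent with U.S. output at the time.

```python
# Sanity check: what GDP does the quoted pledge imply?
pledge = 415.2e9   # the pledged science-and-technology spending, in dollars
share = 0.03       # described as 3 per cent of GDP

gdp_implied = pledge / share
print(f"Implied GDP: ${gdp_implied / 1e12:.2f} trillion")
```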

Could these kinds of investments be jeopardized by the shaky economy? No doubt. But the President has made it clear that investment in science and technology research and development is a priority for his government, and he arguably has an advisory group with enough intellectual heft and real-world experience to make it a reality.

Britannica tries to eat a little of Wikipedia’s lunch

After years of more or less ignoring its open-source competitor, the venerable Encyclopedia Britannica will soon be taking a page from Wikipedia’s playbook and allowing members of the public to contribute to articles and other content at Britannica.com. That’s according to Jorge Cauz, president of the 240-year-old institution, which at one time was synonymous with knowledge in many Western households and schools. The Britannica head told the Sydney Morning Herald in Australia and The Times in the UK that Britannica plans to offer the new features on its website soon.

Mr. Cauz made it clear, however, that anything submitted by users will have to be vetted by one of the encyclopedia’s staff of paid researchers before it appears either on the website or in the actual print version of the EB. “We’re not trying to be a wiki – that’s the last thing we want to be,” he told The Times. “Britannica doesn’t offer that voyeuristic benefit. Users won’t be able to write anything they want and have it published.” The changes — which are just part of the creation of a larger Britannica community portal — were first described last June, and it was made clear then that Britannica didn’t plan on letting the whole “crowd-sourcing” thing get out of hand: “We are not abdicating our responsibility as publishers or burying it under the now-fashionable ‘wisdom of the crowds,’” a blog post said at the time.

Of course, the Encyclopedia Britannica head probably knows that in many cases, users can’t just write anything they want and have it published in Wikipedia either. There are dozens of moderators and editors working for the open-source encyclopedia (although they are unpaid volunteers) who check page changes for accuracy and to make sure they uphold the Wikipedia principles of fairness and a “neutral point of view.” While there are some pages that can be edited freely, where mistakes might not be noticed quickly, other pages (including the one about President George Bush) are “locked,” and can’t be edited by anyone but a Wikipedia-sanctioned moderator. A well-known comparison between Britannica and Wikipedia found that the error rate in each case was roughly equivalent. Wikipedia founder Jimmy Wales has also recently proposed changes that would restrict editing even further.

It also seems pretty obvious from the Britannica president’s comments that he is a) more than a little jealous of Wikipedia’s traffic numbers (the open-source encyclopedia gets about 6 million visitors a day, while Britannica gets about 1.5 million a day) and b) irritated that Google features Wikipedia links so prominently in the search results for many common terms that people might otherwise go to Britannica.com for. “If I were to be the CEO of Google or the founders of Google I would be very [displeased] that the best search engine in the world continues to provide as a first link, Wikipedia,” Cauz told the Sydney Morning Herald. “Is this the best they can do?” He also made it clear that he sees Wikipedia as the fast-food version of knowledge, saying many people turn to it for answers, but that many people are also “happy to eat McDonalds every day.”

Britannica isn’t the only one to try to take the Wikipedia model and blend it with the authoritative voice of the expert: a project called Citizendium, which started up a little over a year ago, was created by Larry Sanger — a co-founder of Wikipedia — as an attempt to build a “crowd-sourced” encyclopedia, but with input from subject-matter experts rather than just anyone. Google has taken some steps in that direction as well, with a service called Knol (derived from “knowledge”), which encourages experts to create Wikipedia-style entries on specific subjects. But neither Knol nor Citizendium has gotten much traction, and neither is anywhere near challenging Wikipedia for the title of “the people’s encyclopedia.” Whether Britannica’s changes can put it back in the race remains to be seen.

China cracks down on Twitter

For a brief moment 20 years ago, a lone figure blocked the path of a giant tank in Tiananmen Square, and it seemed as though Chinese dissidents might be able to shake the pillars of power in that vast country and bring some semblance of democracy to China. In the wake of the pro-democracy protests, however, the Chinese government redoubled its crackdown on dissidents and crushed any hint of dissent. As the 20th anniversary of the Tiananmen Square demonstration approaches, Chinese authorities seem more determined than ever not to allow even a hint of unrest to appear anywhere — including the Internet.

Early this week, journalists and bloggers in China started noticing that Twitter, the chat-style social networking application, was no longer available — and that users were also having difficulty accessing other popular websites and services such as Google-owned YouTube and the Yahoo-owned photo-sharing website Flickr. Blogging platforms such as WordPress and Blogger (also owned by Google) were said to be unavailable as well. According to several reports, the Chinese authorities had also shut down message boards on thousands of websites used by college students.

Blocking and/or filtering services is nothing new for the Chinese government. Even widely used and official services such as Google, Yahoo and Microsoft’s MSN Search are filtered by the authorities to exclude any reference to the events in Tiananmen Square in 1989, and all of the major search engines have in the past supplied names of specific users to the Chinese government. As more and more official services have been blocked or filtered, the popularity of social-networking sites such as Twitter has increased. As a number of observers have noted, Twitter takes on even more significance in a country like China, not just because it allows people to exchange messages easily, but because the use of Chinese characters allows Twitter posts to include far more information than a similar English message.
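To illustrate that last point about information density (the example sentences below are my own rough equivalents, not drawn from any source):

```python
# Each Chinese character typically stands for a whole word or morpheme,
# so the same message fits a 140-character limit with far more room to spare.
english = "The government has blocked access to the website"
chinese = "政府封锁了该网站"  # a rough Chinese equivalent of the same sentence

print(len(english), len(chinese))  # prints: 48 8
```

The English sentence uses 48 characters where the Chinese version uses 8, which is why a 140-character tweet in Chinese can carry several times the content of an English one.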

As the Committee to Protect Journalists details in a recent piece, the Chinese government has been cracking down on virtually every form of media: newspapers published in Hong Kong that might mention the anniversary are blocked from delivery to mainland China, and radio broadcasts are interrupted by commercials if the topic comes up; even a former Communist Party leader’s memoirs are no longer available through official channels (according to the New York Times), presumably because they mention the events of June 3 and 4, 1989. Several prominent Chinese citizens associated with the demonstration were either detained or warned to leave the capital before the anniversary.

Despite the official crackdowns, however, Twitter users seemed determined to find a way around what Chinese dissidents and other critics refer to as the “Great Firewall of China.” As the blockage moved into its second day, users were exchanging tips on using so-called “anonymous proxy” software such as Hotspot Shield (which hides the IP address of a user who runs the software) and other similar technological workarounds.
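A minimal sketch of how such a proxy workaround operates, using Python’s standard library (the proxy address is a placeholder assumption; tools like Hotspot Shield set this up transparently for the user):

```python
import urllib.request

# Route HTTP(S) requests through a proxy server, so the destination site
# sees the proxy's IP address rather than the user's own.
# The address below is a hypothetical local proxy endpoint.
proxy = urllib.request.ProxyHandler({
    "http": "http://127.0.0.1:8080",
    "https": "http://127.0.0.1:8080",
})
opener = urllib.request.build_opener(proxy)

# opener.open("http://twitter.com")  # this request would travel via the proxy
```

From the censor’s side, the connection appears to go to the proxy rather than to the blocked site, which is why blocking efforts and proxy workarounds tend to become a cat-and-mouse game.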

The RIAA switches tactics

The Recording Industry Association of America has been waging war against copyright infringement — in the form of illegal downloads — since the file-sharing app known as Napster first appeared on the scene a decade ago. For much of that time, the RIAA’s preferred form of attack has been the lawsuit, with the record-industry body filing claims against tens of thousands of individuals for financial damages, including several high-profile cases that have targeted single mothers, war veterans, and teenagers for sharing as few as a dozen songs. Now, the RIAA has apparently decided to switch tactics, according to a recent report in the Wall Street Journal, and will no longer sue individual downloaders.

Instead, the industry group plans to strike deals with Internet service providers under which the ISPs will contact individual file-sharers whenever the RIAA notifies them that illegal activity is occurring. If a user ignores two such warnings, the RIAA hopes the ISPs will agree to shut off that customer’s access to the Internet.

What this means is that the record-industry group is trading a head-on assault for a back-room negotiation with Internet providers, in the hope that it can do an end-run around the problems its previous strategy produced. Not only was the lawsuit approach a magnet for negative publicity, but it also ran into repeated legal roadblocks. In one of the latest, for example, a U.S. court ruled that the RIAA would have to show that songs had actually been downloaded before it could prove infringement. Unfortunately for the RIAA, however, its new strategy could cause some legal headaches as well, since the “three strikes” approach is almost certain to be challenged in court.

One of the biggest issues with such an approach is that it assumes the RIAA has correctly identified a) the specific songs being infringed, and b) the individual doing the infringing. If it makes a mistake at either step, an innocent user could have their Internet access cut off without any recourse. And over the years that the record industry has been pursuing its legal strategy, there has been plenty of evidence that its ability to do either is far from perfect. The group has repeatedly misidentified not only the individuals doing the sharing — who in many cases are not the customers to whom the Internet accounts belong — but has also misidentified songs as infringing when they were legally acquired.

The “three strikes” approach is not a new one. The French government has proposed new legislation that would require ISPs to cut off the access of users who repeatedly infringe on copyright, although the legislation has yet to become law. An amendment by the European Parliament earlier this year strongly urged France and other European Union countries not to adopt this kind of legislation, but that amendment was later overturned by the EU Council of Ministers, and according to the most recent reports France is going ahead with it. Britain’s record-industry lobby group managed to convince several ISPs to warn their users when notified by the group, but they have stopped short of actually cutting off their users’ Internet access after a third “strike.”

The RIAA may think that it has come up with a clever way around its earlier legal hassles, but it could find that trying to make ISPs into copyright police faces just as many hurdles, if not more.

Would we create the Web today?

Duke University law professor and Creative Commons director James Boyle has a thoughtful piece in the Financial Times about the Web and whether it would (or could) be created today the way it is now. He ends with a depressing thought (via Slashdot):

“Why might we not create the web today? The web became hugely popular too quickly to control. The lawyers and policymakers and copyright holders were not there at the time of its conception. What would they have said, had they been? What would a web designed by the World Intellectual Property Organisation or the Disney Corporation have looked like? It would have looked more like pay-television, or Minitel, the French computer network…. The lawyers have learnt their lesson now. The regulation of technological development proceeds apace. When the next disruptive communications technology — the next worldwide web — is thought up, the lawyers and the logic of control will be much more evident. That is not a happy thought.”