Eric Schmidt writes in Foreign Affairs, “The Digital Disruption”  27.10.10

You never know with these Foreign Affairs articles how significant they will be for actual policy-making. But they reveal at least what is being discussed in US foreign-policy circles. Google’s ties with the US administration and the Department of State became visible to a larger audience in the course of the China-Google showdown earlier this year. The publication of Eric Schmidt’s and Jared Cohen’s article “The Digital Disruption – Connectivity and the Diffusion of Power” in the forthcoming issue of Foreign Affairs only underscores this special relationship.

Foreign Affairs continues its tradition of articles on the strategic use of information technology in US foreign policy. Back in 1996, Nye/Owens called for an “information umbrella” as a future means to allow the US to keep leading an alliance of like-minded states in a post-“nuclear umbrella” world. Schmidt/Cohen discuss in diplomatically sterile language the effects of “connection technologies” on politics, governments, and the diffusion of power among different actors. They have retained some techno-optimism:

In an era when the power of the individual and the group grows daily, those governments that ride the technological wave will clearly be best positioned to assert their influence and bring others into their orbits. And those that do not will find themselves at odds with their citizens.

But also within Western states, the notion of governance will further flourish:

Instead, governments, individuals, nongovernmental organizations, and private companies will balance one another’s interests.

Looks like multi-stakeholderism gone ubiquitous.

If you don’t want to register with the foreignaffairs.com website, Stefaan Verhulst has the complete article.

Gunter Ollmann (Damballa) has new figures on Botnet Hosting  26.10.10

According to Gunter Ollmann, VP Research at security provider Damballa, the world-wide leader in botnet CnC hosting is the German ISP 1&1 Internet AG.

1&1 headquarters will be relieved to read this:

It is important to note that the ISP’s and hosting providers listed in the top-10 do not necessarily conduct criminal practices, but they have found themselves in a position of being “preferred” by the criminals operating the botnets.

It is surprising to see 1&1 spearheading CnC hosting. The data for a study released earlier this year by my TU Delft colleagues Michel van Eeten, Hadi Asghari et al. reveal that 1&1 is among the best ISPs when it comes to dealing with malware and spam. From that perspective, 1&1 has one of the cleanest ASNs, much better than, say, Deutsche Telekom.

I’ve briefly skimmed through some Damballa papers, but I could not find a description of their method for detecting CnC servers.
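
Since the methodology isn’t documented, one can only guess how such a top-10 is produced. Purely for illustration, a naive per-network count of confirmed C&C hosts might look like the sketch below; the input data and helper names are made up, and this is not Damballa’s method:

    # Hypothetical sketch: rank networks by the number of C&C servers they
    # host. Inputs are invented; Damballa's actual methodology is not public.
    from collections import Counter

    # Assumed input: (cnc_ip, asn, provider) tuples, e.g. from resolving
    # confirmed C&C domains and mapping the resulting IPs to ASNs.
    observed_cnc_servers = [
        ("198.51.100.23", 8560, "1&1 Internet AG"),
        ("203.0.113.5",   3320, "Deutsche Telekom AG"),
        ("198.51.100.77", 8560, "1&1 Internet AG"),
    ]

    def rank_providers(cnc_servers, top_n=10):
        """Count distinct C&C hosts per provider and return the top N."""
        per_provider = Counter()
        for ip, asn, provider in set(cnc_servers):  # dedupe repeat sightings
            per_provider[(asn, provider)] += 1
        return per_provider.most_common(top_n)

    for (asn, provider), count in rank_providers(observed_cnc_servers):
        print(f"AS{asn} {provider}: {count} C&C host(s)")

Even a simple count like this hinges on a non-trivial upstream step, namely confirming that a host really is a C&C server, which is exactly the part I couldn’t find described.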

Stephen Walt, foreignpolicy.com, embraces Wikileaks: “a good thing”  26.10.10

Stephen M. Walt, good-ol’ Realist with an almost Niebuhrish image of humanity, embraces Wikileaks:

Realist that I am, I believe that human beings are more likely to misbehave if they think they can shield what they are doing from public view. For that reason, I also believe that democratic societies are more likely to adopt better policies when information is plentiful and when government officials cannot determine which facts are available to the public and which are not. Because its primary function is to make more information available on issues that concern us all, I therefore conclude that what Wikileaks is doing is on balance a good thing.

The German liberal internet-politics blogosphere and IT magazines still appear to have visions of transparent, democratically organised Wikileaks clones. I wonder how such an organisation would transparently and democratically deal with the spectre of its members being declared “enemy combatants”.

Seymour Hersh’s 6731-word take on “the online threat”  26.10.10

Summary: There is no cyberwar problem, only cyber espionage. Cyberwar is made up by cybergeddonists, who happen to work for security contractors after leaving their public cyber-security posts. China has no interest in launching a cyberwar against the US, even if it might have the means. Cyberwar is hardly wageable because of the unintended consequences caused by the openness of the web. Espionage could be dealt with by obligatory encryption, which, however, is costly and hard to operate and maintain. Then again, lack of encryption might not be the underlying cause of internet security problems, and military activities can have unintended consequences of their own. Nevertheless, recommended reading.

The political situation:

In the next few months, President Obama, who has publicly pledged that his Administration will protect openness and privacy on the Internet, will have to make choices that will have enormous consequences for the future of an ever-growing maze of new communication techniques: Will America’s networks be entrusted to civilians or to the military? Will cyber security be treated as a kind of war?

Blurring definitions of cyber war and cyber espionage…

Blurring the distinction between cyber war and cyber espionage has been profitable for defense contractors—and dispiriting for privacy advocates.

The cybergeddonists’ false scenarios:

The most common cyber-war scare scenarios involve America’s electrical grid. … There is no national power grid in the United States. There are more than a hundred publicly and privately owned power companies that operate their own lines…. …an electrical supplier that found itself under cyber attack would be able to avail itself of power from nearby systems.

Stuxnet:

If Stuxnet was aimed specifically at Bushehr, it exhibited one of the weaknesses of cyber attacks: they are difficult to target and also to contain. India and China were both hit harder than Iran… The real hazard of Stuxnet, he [Schneier] added, might be that it was “great for those who want to believe cyber war is here.”

On Army General Keith Alexander (head of US Cyber Command, director of the NSA):

One of Alexander’s first goals was to make sure that the military would take the lead role in cyber security and in determining the future shape of computer networks.

Military-civilian relationship:

If the military is operating in “cyberspace,” does that include civilian computers in American homes?

Encryption as the solution for the cyber security problems (citing John Arquilla):

“We would all be far better off if virtually all civil, commercial, governmental, and military internet and web traffic were strongly encrypted.” … “Today drug lords still enjoy secure internet and web communications, as do many in terror networks, while most Americans don’t.”

A Maginot line mentality (citing Marc Rotenberg, EPIC):

“The question is: Do you want an agency that spies with mixed success to be responsible for securing the nation’s security? If you do, that’s crazy.”

Clipper-Chip 2.0:

The legislation, similar to that sought two decades ago in the Clipper Chip debate, would require manufacturers of equipment such as the BlackBerry, and all domestic and foreign purveyors of communications, such as Skype, to develop technology that would allow the federal government to intercept and decode traffic.

A long list of interviewees and sources:

Jonathan Pollack, Whitfield Diffie, Jeffrey Carr, “a retired four-star Navy general”, John Arquilla, Marc Rotenberg, Howard Schmidt, “a senior official in the Department of Homeland Security”, William J. Lynn III, James Lewis (senior fellow at CSIS), Bruce Schneier, J. Michael McConnell, Army General Keith Alexander (head of US Cyber Command, director of the NSA), “a defense contractor” (“one of America’s most knowledgeable experts on Chinese military and cyber capabilities”), Richard Clarke (cybergeddonist, security contractor and Bush’s man for cybersecurity, “poison gas clouds…”), J. Michael McConnell (Bush’s second director of National Intelligence, now cybergeddonist and security contractor, “Our cyber-defenses are woefully lacking”).

Microsoft’s Zink on whether ISPs should cut off infected users  26.10.10

Terry Zink, Program Manager for Microsoft Forefront Online Security, wants ISPs to play a role similar to the one email security service providers play in mitigating the spam problem.

In my view, ISPs taking action on botted machines is very similar to the problem that we as an outbound mail relay had when we were taking action on customers that were/are sending outbound spam…

For an ISP, if they know which domains a botnet calls home to, then in theory they could tell which IP address is connecting to which botnet URLs. Whenever someone sends a request, either http, ftp, or some other DNS protocol, that attempts to resolve the botnet C&C’s domain, then it is a logical assumption that the machine behind the IP address is part of a botnet. …

Obviously, it would be nice to use a finer layer of granularity but that option is not available without deep packet inspection where you can possibly map finer levels of identification.

In short: Anti-botnetting should be done by ISPs without using DPI. Zink does not want to see ISPs filling their data centres with perimeter DPI boxes, a) for privacy reasons and b) because of the costs, which would force ISPs to find new revenue models and become, e.g., non-net-neutral.
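
The charm of the approach is that it needs nothing beyond the query logs of the ISP’s own DNS resolvers. A minimal sketch of the idea, with a made-up blocklist and log format (neither is from Zink’s post):

    # Flag a customer IP as likely botted when it tries to resolve a domain
    # on a known C&C blocklist. Blocklist and log format are assumptions.
    KNOWN_CNC_DOMAINS = {"evil-cnc.example.com", "bot-update.example.net"}

    def flag_botted_ips(dns_query_log):
        """dns_query_log: iterable of (client_ip, queried_domain) pairs,
        e.g. parsed from the ISP's resolver logs."""
        flagged = set()
        for client_ip, domain in dns_query_log:
            if domain.lower().rstrip(".") in KNOWN_CNC_DOMAINS:
                flagged.add(client_ip)
        return flagged

    log = [("192.0.2.10", "www.example.org"),
           ("192.0.2.10", "evil-cnc.example.com."),
           ("192.0.2.44", "bot-update.example.net")]
    print(flag_botted_ips(log))  # flags 192.0.2.10 and 192.0.2.44

The granularity caveat Zink mentions remains: this identifies a customer IP address, not which machine or process behind a NAT gateway issued the lookup.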

Microsoft isn’t the “internet security industry”, even though their Malicious Software Removal Tool and Security Essentials A/V are among the most widely deployed security tools out there. Microsoft is in the security business above all to get rid of infected Windows machines and to protect their Windows brand. Hence, my hunch is that they are rather pragmatic in their choices and would opt for any approach that helps to clean up the bot mess.

I wonder how such a botnet URL database would be operated: who would feed it, who would harvest it, how would it be governed? Centrally? Commons-based? Commercially? Based on a club model? Botnet URLs are too trivial to serve as the core of commercial security products in the way virus signatures are a core asset for AV software providers. But commercialising security problems isn’t Microsoft’s problem.

Outstanding crowdsourcing for Crowdsourcing Summit blogging  25.10.10

Next week, the Future of Crowdsourcing Summit will be held in San Francisco. Those apt writers who will cover the Summit with bright and insightful blog entries will not attend it.

For the next 10 days we need writers to write blog posts for the Future of Crowdsourcing Summit website: http://futureofcrowdsourcingsummit.com/

We are only looking for outstanding writers, who can write extremely well and who really understand crowdsourcing, global business, and the web space.

Candidates: 13 (avg $14.44/hr)

Pentagon’s point about harmfulness of openness  25.10.10

It doesn’t come as a surprise that the Pentagon doesn’t heartily embrace the leakage of some 400,000 classified records covering unfavourable Iraq incidents. The line is familiar among students of security institutions: openness would be detrimental to security by creating new vulnerabilities. In the words of Pentagon press secretary Geoff Morrell:

“Potentially what one could mine from a huge data base like this are vulnerabilities in terms of how we operate, our tactics, our techniques, our procedures, the capabilities of our equipment, how we respond in combat situations, response times — indeed how we cultivate sources,” Morrell said. “All of that, [given the] thinking and adaptive enemy we’ve been facing in Iraq and Afghanistan, can be used against us.”

(Source: smallwarsjournal.com; similar remarks in a press conference in early August)

Openness, i.e. sharing operational and tactical information with adversaries, can create opportunities for adversaries to mitigate attack or defence capabilities. Can. Potentially. But what are the real costs of openness? And how do they compare to the societal, political, and humanitarian costs of closure?

MSFT sec report: Non-technical roadblocks against “botnet superhighways” needed  25.10.10

Microsoft’s Malware Protection Center has released its latest Security Intelligence Report v9. It calls to mind that anti-botnetting isn’t just a technological challenge:

Regardless of how botnets are doing their distribution, one thing is clear: because of their networked and often organized structure, they allow malicious and illegal activities to be performed at a scale that has not been seen before. The solution to this problem isn’t always about technology. As a community, we can take collaborative and legislative action to take down massive botnets like we did with Waledac. As researchers, we must evolve the way we view these threats and continue to think of creative and novel ways to stop them.

Interesting technical finding: 33% of malware-infected machines are bots.

“adhocracy” – yet another governance-kid on the block  25.10.10

Piotr Konieczny proposes in the recent issue of the Journal of Information Technology & Politics that Wikipedia is an “adhocracy”. Adhocracies, as suggested by Mintzberg and McHugh in the mid-eighties, have the following five features:

1. They operate in a complex and dynamic environment and are highly innovative.
2. Innovations require highly trained and motivated experts.
3. The experts may be formally allocated to different divisions but usually work in informal, multidisciplinary teams.
4. Coordination and communication rely on semiformal structures, while more formalized structures and managerial practices are rare.
5. Parts of the organization are highly decentralized.

Konieczny sees “fundamental similarities” between “adhocracy” and “open-source development models”. He uses Bauwens’ concept of commons-based peer production as an example of those “open-source development models”. All the characteristics of Bauwens’ cbpp would also apply to “adhocracy”. All but one: “financial gain as a motivator”.

The problem here is that Bauwens uses a normatively stricter, more egalitarian, less capitalistic version of Benkler’s commons-based peer production. Benkler uses Wikipedia as an example of commons-based peer production, Konieczny as one of adhocracy. I don’t quite get it. A rather mediocre article, even more so as the review of the Wikipedia research literature appears to be incomplete.

Konieczny, Piotr (2010) ‘Adhocratic Governance in the Internet Age: A Case of Wikipedia’, Journal of Information Technology & Politics, 7(4), 263–283 (DOI: 10.1080/19331681.2010.489408)

Wikipedia fosters “a new form of gatekeeping” and is an “adhocracy”  25.10.10

No surprise here; nevertheless, it’s worth reading:

This article introduces criticism elimination, a type of information removal leading to a framing effect that impairs Wikipedia’s delivery of a neutral point of view (NPOV) and ultimately facilitates a new form of gatekeeping with political science and information technology implications. This article demonstrates a systematic use of criticism elimination and categorizes the editors responsible into four types. We show that some types use criticism elimination to dominate and manipulate articles to advocate political and ideological agendas.…

The examination of 627 edits spread over 16 Wikipedia articles demonstrates systematic removal of criticism published by reliable sources, despite policy. This leads to framing that runs counter to the NPOV policy…

We have shown that criticism elimination can have a gatekeeping effect that allows parts of Wikipedia to be dominated by those with an agenda. …

For now, criticism elimination means at least some parts of Wikipedia are susceptible to unexpected, systematic framing, and gatekeepers do indeed exist.

Oboler, Andre, Steinberg, Gerald and Stern, Rephael (2010) ‘The Framing of Political NGOs in Wikipedia through Criticism Elimination’, Journal of Information Technology & Politics, 7(4), 284–299 (DOI: 10.1080/19331680903577822)

Limits of commons-based peer production at Wikipedia  28.9.10

I’ve just returned from an inspiring weekend at a conference on Wikipedia in Leipzig, Germany. The conference title, CPOV (Critical Point of View), a bit awkward at first, is derived from the Wikimedia Foundation’s demand for a “neutral point of view”, to which all Wikipedia articles are supposed to adhere.

Geert Lovink, professor at the Amsterdam University of Applied Sciences and one of the conceptual figures behind this conference series, with previous stops in Bangalore and Amsterdam, pointed out that the location of the conference and the conference language, German, were not chosen by chance. The German-language Wikipedia is still the second largest behind the English one, and the German-language Wikipedia research community appears to be larger than its English counterpart.

Furthermore, one of the, well, peculiar things about German culture, its Vereinswesen (culture of associations and chapters), has become the organisational role model for Wikipedia. Under pressure from the successful Spanish Wikipedia fork, Enciclopedia Libre, Jimmy Wales & Co. decided to transfer the Wikipedia assets (e.g. its domain name) away from their tits&porn business bomis.com to the newly and specifically founded Wikimedia Foundation.

From my perspective as a student of community-driven internet security operations, two things were particularly interesting. First, the concept and model of peer production – or, to be precise, the commons-based variant as defined by Yochai Benkler – is starting to gain wider acceptance in the scientific community as a theoretical starting point for researching non-hierarchical and non-market governance approaches. Until now, I had the impression I was almost the only one using it as a framework for empirical analysis.

The second observation: Wikipedia, which is always mentioned as the poster child of collective, distributed, peer-produced collaboration on the internet, has numerous layers of control built into its organisational and operational structure. While access to resources is unrestricted, the right to edit and commit changes is increasingly restricted, at least for the 0.4% of all Wikipedia articles that are, on average, protected at any point in time (as of 2008), a sharp increase from 0.05% in 2003. These “protective” measures are taken to defend existing pages from being vandalised or deteriorating in edit-wars.

Wikipedia thus loses some of its openness (“The encyclopedia anyone can edit”). The authority over the revocation of general edit rights lies with a relatively small group of administrators, who have turned into a de facto club with high entrance barriers. Newcomers have to invest substantial amounts of time, effort and cultural adaptability to climb up the administrative Wikipedia ladder. Lengthy process instructions put increased demands on administrators and authors, and new roles increase the organisational complexity within Wikimedia/pedia.

Historically, one could argue that Wikipedia has simply grown out of its infancy. From an institutional perspective, it sheds some light on the viability of commons-based peer production in an unfriendly environment that attracts malevolent actors and informational opponents who engage in vandalism and edit-wars. While access to Wikipedia’s products, its articles, remains commons-based, the definition of its products doesn’t. It has, in part, been clubbified.

I’ve discussed the session on “Digitale Governance” in greater detail at the CPOV website (in German, though).

Script for turning messy texts into well-structured, -outlined and -formatted Word documents  16.6.10

Some interesting pieces of software have been developed in recent years that aim at replacing the venerable Word as an authoring tool for large and complex writing projects. On the Mac side, two humbly named applications, Ulysses and Scrivener, have emerged as the most popular writing tools. While everything is nice and fine as long as you are writing, sharing your output and delivering well-structured (in a technical sense) and well-formatted documents is a bit cumbersome and usually requires dreary manual intervention. As I had written a script for Word for Windows back in my, well, teens that did just some of the things I until now had to do manually on the Mac, it should be fairly easy to update and extend that old script and write some code.
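
The old script lives in Word’s macro language; purely to illustrate the core step, here is a rough Python sketch (the “#” heading convention and the python-docx dependency are my assumptions, not part of the script described above):

    # Illustrative sketch: turn a plain-text draft (e.g. exported from
    # Scrivener) into a Word document with proper heading styles.
    from docx import Document  # pip install python-docx

    def text_to_word(source_path, target_path):
        doc = Document()
        with open(source_path, encoding="utf-8") as f:
            for line in f:
                line = line.rstrip()
                if line.startswith("#"):  # "#" = H1, "##" = H2, etc.
                    level = len(line) - len(line.lstrip("#"))
                    doc.add_heading(line.lstrip("# "), level=min(level, 9))
                elif line:
                    doc.add_paragraph(line)
        doc.save(target_path)

    text_to_word("draft.txt", "draft.docx")

The real work, of course, is in mapping whatever structural conventions the source text uses onto Word’s style hierarchy, which is where the manual drudgery comes from.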

[Image: scrivener2word.png]

[…]

The security risk of bad security-provisioning design  10.6.10

I’ve pointed out earlier some of the research questions for social scientific internet governance research. The main issues I described there are:

  1. There is a lack of empirical analysis of the scale, quality and impact of internet security issues by social scientists who are not affiliated with biased agencies engaged in turf-wars or with the fear-mongering security industry. Furthermore, existing institutions have hardly been researched.
  2. Ongoing debates in the political sphere often refer to a lack-of-enforceability argument. More often than not, these arguments fail to be backed by scientific findings.
  3. The geopolitical dimension of internet security is under-researched.
  4. The potentially disruptive impact of internet-based collaboration on traditional security-provisioning processes is yet to be explored. We can observe discourses about new forms of distributed collaboration everywhere, but not in the field of internet security governance.

The main task for the social sciences, however, is to provide guidance for institutional and organisational design in internet security governance.

[Image: Ad-hoc defense system protecting railway embankment against Danube flood]

[…]

The emergence of internet security governance as a research field in social sciences  10.6.10

It’s finally happening. After an abysmally long time of politicians, the military, and the security industry coming up with streams of innovative policy tangle in the name of internet security or cybersecurity, a critical mass of social scientists and research-interested practitioners has teamed up to start deepening our knowledge of internet security and its governance. While Hungary was going through difficult times with floods and economic turmoil, Budapest couldn’t have been a more lovely and welcoming place over the last couple of days.


[…]

Internet and statehood – the battle over informational asymmetries  21.4.10

“Everything that can be thought is thought at some time or another. Now or in the future.”
“Those things which were thought can never be unthought.”
Friedrich Dürrenmatt, The Physicists

Ralf Bendrath and I gave a presentation on “statehood and the internet” at this year’s re:publica conference in Berlin. Re:publica is an annual conference for internet aficionados, bloggers, internet activists and, ever more so, politicians and public-authority representatives involved in internet regulation. First organised in 2007, it has by now grown to host some 2,500 visitors and has been extensively covered (DE) by old-media outlets.

We used the opportunity of the China-Google/US conflict to discuss basic relationships between states and private actors, a question raised (both links DE) in the blogosphere and media, and some general perspectives on internet politics.

[…]

Nagging questions in cybersecurity research  12.4.10

It doesn’t happen too often that you read about a conference or a workshop and think: now, that was about time! Internet governance is about to undergo some fundamental changes; states are getting ever more involved, mostly to address internet security problems. A plethora of questions needs to be resolved to deal with these problems through well-designed institutions. And yet, as far as I can tell, there is no major research programme on internet security governance going on anywhere on this planet. Hence, the workshop “Europe And The Global Information Society Revisited: Developing A Network Of Scholars And Agenda For Social Science Research On ‘Cyber Security’” could not have been launched at a more timely moment.

The Center for Media and Communication Studies at the Central European University (Budapest, Hungary), in partnership with the Centre for Global Communications Studies at the Annenberg School of Communications (Philadelphia, USA), will convene 30 selected experts next week at CEU in Budapest for a Strategic Workshop sponsored by the European Science Foundation (ESF). As flattering as it is rather undeserved, I will be on a panel discussing the relations between cybersecurity on the one hand and International Relations, governance and institutions on the other. Below, my take on some blind spots in internet security research from a social-scientific perspective.

[…]

Information production support systems: DEVONthink  24.2.10

Writers love having written. The problem is the time, work and brain-twisting necessary between the idea to produce something and actually having done it. Well, it’s not that bad; sometimes you love writing, but sometimes you hate it. Or it bores, is cumbersome, and annoyingly laborious. This is why the human species loves to create machines: to enjoy the fruits of life, ransomed from the need to pluck, wash and process them. In the field of information production, it’s about the same. The invention of computerized information processing has led to numerous attempts to create machines that support the human efforts of thinking, understanding, and creating meaning. In a sense, and high up the abstraction layers, this is what computing is about in general. More narrowly, the question is how and what kind of software can support individuals in their efforts to gather information, grasp it, recombine it, and create new insights, new meanings, new information, new knowledge. What would be the equivalent of an exoskeleton for the brain, which would enable the average mind to easily jump onto the notorious shoulders of giants and beyond?

[…]

26C3: internet politics 2010, defence of the digital habitat, internet utopia, decentralized technologies and implementing Cryptonomicon  6.1.10

“It seems like the Crypt is their worst nightmare.”
(Neal Stephenson, Cryptonomicon)

China spearheads the anything-goes movement of technology-based societal control, authoritarian countries worldwide follow suit, and we don’t yet know whether western democracies will manage to at least remain in their currently mediocre shape if one of the many ongoing global developments and crises should ever have a major and disruptive societal impact. From the perspective of the freedom and unhindered flow of information, the internet makes a bad impression these days, and things haven’t changed for the better in the last year, or indeed the noughties.
John Perry Barlow’s “fuck them” […]

A follow-up on the German botnet-center  18.12.09

I’ve written a quick analysis of the recent anti-botnet politics in Germany. The kind crew behind netzpolitik.org has published it on their blockbuster blog. It’s written in German, though; you could give Google Translator a moment of embarrassment.

Shadowserver Foundation publishes Conficker botnet stats  16.12.09

This is going to be an interesting experiment in internet security governance. Scientists have argued for years that internet security problems are as much caused by a misalignment of incentives as by technological flaws in software and hardware. One obvious recipe for calling ISPs to action against botnets is the same one that has helped spur software vendors into making their software more robust: public transparency.

Working under the umbrella of the Shadowserver Foundation, a group of engineers and scientists has scrupulously gathered evidence and background information about the activities of the Conficker botnet. They have known for months that millions of machines worldwide are infected with Conficker malware. Yet no one reacted; shoulders were shrugged. At govcert.nl in October, many were contemplating how to proceed with Conficker.

Starting today, Shadowserver lets everyone know where these Conficker-infected machines are. The move is a valuable contribution to increasing global transparency about the somewhat obscure botnet problem.

An interesting example from Germany immediately sticks out. 1&1, a big hosting and medium-sized access provider, had started an internal initiative against botnet-infected customer systems earlier this year. Today, only ten IP addresses, or 0% of its routed space, are assigned to infected machines. For customers of Deutsche Telekom, which hasn’t announced a similar program, things look worse: 0.1% of all IP addresses, more than 32,000, belong to Conficker-infected machines.
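
A quick back-of-the-envelope check of those figures (the address-pool sizes below are my inference and an example, not Shadowserver numbers):

    # Only the infected counts and percentages come from the stats above;
    # the pool sizes here are derived or invented for illustration.
    def share(infected, pool):
        return 100 * infected / pool

    # Deutsche Telekom: >32,000 infected IPs at ~0.1% of routed space
    # implies on the order of 32 million routed addresses:
    print(f"{32_000 / 0.001:,.0f}")  # ~32,000,000

    # 1&1: ten infected IPs are a vanishingly small share of any
    # realistically sized pool (the 1M figure is just an example):
    print(f"{share(10, 1_000_000):.4f}%")  # 0.0010%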