Troubling news from the EU – they’re blurring the distinction between “open source” and “homogeneity” in software. Here’s what Cory at Boing Boing had to say in reference to the story on Computer World / Slashdot:

Lobbyists at the EU have gutted the definition of “open” (part of a proposal to require more open standards and open source tools in European government) to mean “the willingness of persons, organisations or other members of a community of interest to share knowledge.” This meaningless drivel replaces a more robust definition that included, “The standard is adopted and will be maintained by a not-for-profit organisation, and its ongoing development occurs on the basis of an open decision-making procedure available to all interested parties (consensus or majority decision etc.).” Link.

Ars Technica teases out the implications for data – something we in the NHS should be particularly aware of when considering portals and the like:

This statement implies that interoperability can be achieved if everyone merely uses the same exact software. Although a homogeneous environment simplifies a lot of things, it’s deeply misleading to suggest that it is equivalent to open standards as a means of achieving interoperability. Open standards make it possible for third-party developers to create independent software implementations that can access and manipulate the data. Depending on a homogeneous environment to facilitate interoperability without open standards naturally creates a very strong risk of lock-in.
It’s not enough to be able to share content between users of the same software application; you need to be able to pull apart the data so it can be repurposed and used in other systems. This is a key aspect of interoperability. Open standards make data more future-proof and guarantee that it will still be accessible when the software you use to produce and access the data is no longer available or maintained. Link.