[As I touch on intellectual property issues herein, I'd like to remind everyone that this is my personal blog and should not be in any way taken as official Red Hat positions or statements, nor presented as such.]
Just because something is widely used doesn't make it a standard--de facto, de jure, or otherwise--in the sense that anyone can use and build implementations of that standard without restriction. Indeed, as we shall see, even standards that are "blessed" by powers-that-be are not always fully open in the ways that I have outlined previously.
Standardization has been around for a long time. The IEEE tells us that:
Based on relics found, standardization can be traced back to the ancient civilizations of Babylon and early Egypt. The earliest standards were the physical standards for weights and measures. As trade and commerce developed, written documents evolved that set mutually agreed upon standards for products and services, such as agriculture, ships, buildings and weapons. Initially, these standards were part of a single contract between supplier and purchaser. Later, the same standards came to be used across a range of transactions forming the basis for modern standardization.
A lot of this early standardization pretty much came down to custom. The convoluted history of why we drive on one side of the road in a given country is instructive. (Though each country's conventions are now enshrined in The Geneva Convention on Road Traffic.)
The history of the shipping container, as detailed in Marc Levinson's The Box: How the Shipping Container Made the World Smaller and the World Economy Bigger, offers another fairly typical historical example. Incompatible container sizes and corner fittings required different equipment to load and unload, and otherwise inhibited the development of a complete logistics system. The standardization that happened around 1970 made possible the global shipping industry as we know it today--and all that implies. The evolution of standardized railroad gauges is similarly convoluted. The development of many early computer formats and protocols was similarly Darwinian.
It's tempting to take this past as prologue and conclude that similar processes will continue to play out as we move to new styles of computing in which different forms of interoperability assume greater importance. For example, published application programming interfaces (APIs) are at the heart of how modular software communicates in a Web services-centric world. One set of APIs wins and evolves. Another set of APIs becomes a favorite of some particular language community. Still another doesn't gain much traction and eventually withers and dies. It sounds like a familiar pattern.
But there's an important difference. In today's software world, it's impossible to ignore intellectual property (IP) matters, whether copyright, patent, trademark, or something else. An API isn't a rail gauge--though perhaps someone today would try to patent that too.
As a result, tempting as it might be to adopt some API or other software construct because it's putatively a "de facto" standard, which is mostly a fancy way of saying that it's popular, that may not be such a good idea.
RedMonk's Stephen O'Grady offers some typically smart commentary on why:
it’s worth noting that many large entities are already behaving as if APIs are in fact copyrightable. The most obvious indication of this is Amazon. Most large vendors we have spoken with consider Amazon’s APIs a non-starter, given the legal uncertainties regarding the intellectual property involved. Vendors may in certain cases be willing to outsource that risk to a smaller third party – particularly one that’s explicitly licensed like a Eucalyptus [coverage]. But in general the low risk strategy for them has been to assume that Amazon would or could leverage their intellectual property rights – copyright or otherwise – around the APIs in question, and to avoid them as a result. Amazon, while having declined to assert itself directly on this basis, has also done nothing to discourage the perception that it has strict control of usage of its APIs. In doing so, it has effectively turned licensed access to the APIs into a negotiable asset, presumably an outcome that advocates of copyrightable APIs would like to see made common.
In fact, lack of openness can even extend to standards that have gained some degree of governmental or quasi-governmental approval--which is, after all, a political process. Last decade's fierce battle over Microsoft's submittal of its OOXML document format as a standard to ECMA and ISO is perhaps the most visible example. The details of this particular fight are complicated, but, in Kurt Cagle's words, "The central crux of the [then-]current debate is, and should be, whether Microsoft’s OOXML does in fact represent a standard that is conceivably implementable by anyone outside of Microsoft."
Questions about the conditions that should be satisfied in order for a vendor's preferred approach, format, or technology to become a "blessed" standard continue to reverberate. The latest round is about RAND (Reasonable-and-Non-Discriminatory) licensing and whether it can take the place of truly open implementations. It's essentially an attempt to slip proprietary approaches requiring a patent license into situations, such as government procurements, that require open standards.
The presence of RAND terms at best chills developer enthusiasm and at worst inhibits engagement, as for example it did in the case of Sender ID at IETF. As Välimäki and Oksanen say, RAND policy allows patent holders to decide whether they want to discourage the use of open source. Leaving that capability in the hands of some (usually well-resourced) suppliers seems unwise.
At one level, the takeaway here might be "it's complicated." And it is. But another takeaway is pretty simple. You can dress up proprietary standards in various ways and with various terms. And such standards have a place in the IT ecosystem. But they're not open, whatever you call them.