How Microsoft Fought True Open Standards IV
Published 10:52, 30 April 12
Yesterday I looked at the first part of a long document that Microsoft sent the Cabinet Office in October last year. Here I'd like to explore one of the other sections, which is headed as follows:
The revised definition [of open standards] would not advance the government’s underlying interoperability objectives, nor would it advance the government’s underlying cost saving objectives.
This part begins:
While standards are clearly an important part of a government interoperability program, it is important to remember that standards alone do not guarantee interoperability and that they are merely one piece of the puzzle towards fostering better interoperability.
Some of the others, Microsoft suggests, are "clear drafting, robust interoperability testing of implementations, widespread market acceptance."
As for that clear drafting, here's what someone wrote the day after Microsoft's OOXML was accepted as an ISO standard:
To begin with, it seems that the OOXML standard was poorly defined, leaving a huge number of ambiguities and undefined terms. That's not surprising, given the fact that it is 6,000 -- yes, six thousand -- pages long, a size which makes it nearly impossible to ensure internal consistency. The large size also ensures that it will be difficult to create alternative implementations; would you like to be the programmer charged with checking that a particular program adheres to all 6,000 pages of the standard?
Moreover, parts of the standard require a programmer to deviate from many other, correct standards. For example, 1900 was not a leap year, as is the case with three out of every four "00" years, whereas 2000 was. Microsoft got this point wrong when they first implemented Excel, and as a result, the OOXML standard requires that implementers make this same error, for the sake of consistency.
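To make the point concrete, here is a minimal sketch (the function names are mine, not from any specification) of the Gregorian leap-year rule the quoted passage describes, alongside a variant reproducing the 1900 bug that OOXML's legacy date system carries over from early Excel:

```python
def is_leap_gregorian(year: int) -> bool:
    """Correct Gregorian rule: divisible by 4, except that century
    years are leap years only when divisible by 400."""
    return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)

def is_leap_excel_1900(year: int) -> bool:
    """Hypothetical illustration: Excel's 1900 date system wrongly
    treats 1900 as a leap year, and OOXML preserves that error for
    backwards compatibility."""
    if year == 1900:
        return True
    return is_leap_gregorian(year)

print(is_leap_gregorian(1900))   # False: 1900 fails the century rule
print(is_leap_gregorian(2000))   # True: 2000 is divisible by 400
print(is_leap_excel_1900(1900))  # True: the compatibility bug
```

The two functions agree on every year except 1900, which is exactly the divergence an implementer conforming to the standard is required to reproduce.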
That may only be one example, but as I discussed recently, it's probably the one that Microsoft cares most about, and has put most effort into. And if it can't even produce clarity there, it doesn't augur well for future open standards it might offer.
So what about "robust interoperability testing of implementations"? Well, the ODF world has been running what it calls "plugfests" since 2009 - this year will see the eighth such meeting. Here's how the plugfest site describes them:
ODF plugfests are an ongoing series of vendor-neutral events, bringing together implementers and stakeholders of the standard. The goal is to achieve maximum interoperability by running scenario-based tests in a hands-on lab and discuss new and proposed features of the ODF specification.
Now, I may be wrong, but I don't think there's anything comparable in the world of OOXML - anyone know? And if there is, I certainly doubt that there have been eight of them. So it would seem that on "robust interoperability testing of implementations", ODF comes out well ahead. Again, this is only in one area, but it shows what open source implementations of open standards are already doing.
That brings us to "market acceptance". Claiming that lock-in to a product should be part of the puzzle of fostering interoperability is a bit rich: the whole point of promoting interoperability is to get away from that lock-in - it's a vice, not a virtue. If it were taken into account, nothing would ever change, since lock-in would argue strongly for continued lock-in (and, indeed, that's precisely what most procurement decisions come down to - we're using Microsoft products, so we'll carry on using them because it's too hard to extricate ourselves.)
Finally, Microsoft makes the following claim:
Governments around the world have been actively debating the extent to which standards that can be implemented with payment of a royalty impact interoperability generally and the overall costs of IT procurement. A recent, comprehensive and unbiased view was published by the Dutch Court of Audit, which found no evidence to support the claim that mandating royalty free standards in the procurement context leads to such outcomes.
The Dutch Court of Audit has failed to do an independent review of the government's savings possible with open source, writes Hans Sleurink, editor of the Open Source Jaarboek, an annual review on open source developments, in a public letter published this afternoon.
According to Sleurink, the Court has committed a grave error by limiting its scope to just two policy areas, the internal IT needs of the government and market competition. "Instead of reviewing the effectiveness of policy, the court is now setting the agenda." He also accuses the court of having become too close to the interior ministry, losing its role as an independent watchdog.
Lacking concrete numbers, the Court of Audit's report is in stark contrast with calculations done earlier by the ministry of the Interior in 2010. According to that earlier report, the government could save between one and four billion euros per year. That report looked at government costs for proprietary licences, costs for procurement, costs for licence management and costs for IT maintenance.
Based on evidence such as the Dutch Court of Audit it is therefore difficult to understand how the new definition will help advance the government’s goal to reduce the cost of IT and inadvertently may well increase the cost to government.
But one could equally write:
Based on evidence such as the Dutch Ministry of the Interior it is therefore easy to understand how the new definition will help advance the government’s goal to reduce the cost of IT and is very likely to decrease the cost to government.
What this - and my analysis of the other documents obtained under the FOI request - goes to show is that things aren't as black and white as Microsoft likes to paint them, and that its presentation of information to the Cabinet Office is highly partial, in both senses.
That's why it's crucial that other points of view are submitted to the UK government's consultation on open standards. The really great news is that we have more time to do that, as a result of a rather interesting situation, explained in this post on the Cabinet Office's blog:
Dr Hopkirk is a respected advocate for “openness and interoperability of systems, of people, processes and information technologies”. He has in the past, for example, been an invited observer at events such as Open Forum Europe.
However, at the time he was engaged to facilitate the Open Standards roundtable, while we were aware that he represented the National Computing Centre on the Microsoft Interoperability Executive Customer Council (along with 40 other CIOs/CTOs across the public and private sector who participate in a voluntary capacity) he did not declare the fact that he was advising Microsoft directly on the Open Standards consultation.
As a result, two things will be happening:
any outcomes from the original roundtable discussion [on Competition and European Interaction, and held in London on 4 April] will be discounted in the consultation responses and we will rerun that session and give time for people to prepare for it. We will also run a teleconference as well as a meeting to ensure that everybody has a chance to participate.
Furthermore the consultation will now be extended for an additional month. We would urge those interested in the Open Standards debate to fully participate in the consultation and you can submit your consultation responses here. The formal closing date for submissions will now be Monday, 4th June 2012.
The extended deadline is a great opportunity to make your voice heard, as is the re-run of the roundtable - do come along if you can. And you don't even have to come to London to make your views heard, since a teleconference will also be organised for those who find it hard to travel. Please try to take advantage of these new options, since every submission really counts.