Open Enterprise

Time to Fight Against a DRM'd Web - by Forking It


At the beginning of the year, I wrote about a shameful move by the BBC to support adding DRM to HTML to control the playback of video content. This scheme has now moved on, and the news is astonishingly bad:

There were various concerns by the EFF and individuals about the charter including playback of protected content within its scope. While we remain sensitive to the issues raised related to DRM and usage control, the Director reconfirmed his earlier decision that the ongoing work is in scope.

Translated into English, this means that Sir Tim Berners-Lee - the Director - has just officially approved the idea of a locked-down Web. In an excellent post on the EFF's site, Danny O'Brien explains why this is a terrible precedent:

EME [Encrypted Media Extensions] is exclusively concerned with video content, because EME's primary advocate, Netflix, is still required to wrap some of its film and TV offerings in DRM as part of its legacy contracts with Hollywood. But there are plenty of other rightsholders beyond Hollywood who would like to impose controls on how their content is consumed.

Just five years ago, font companies tried to demand DRM-like standards for embedded Web fonts. These Web typography wars fizzled out without the adoption of these restrictions, but now that such technical restrictions are clearly "in scope," why wouldn't typographers come back with an argument for new limits on what browsers can do?

Indeed, within a few weeks of EME hitting the headlines, a community group within W3C formed around the idea of locking away Web code, so that Web applications could only be executed but not examined online. Static image creators such as photographers are eager for the W3C to help lock down embedded images. Shortly after our Tokyo discussions, another group proposed their new W3C use-case: "protecting" content that had been saved locally from a Web page from being accessed without further restrictions. Meanwhile, publishers have advocated that HTML textual content should have DRM features for many years.

The point here is that now that Berners-Lee has officially blessed the idea of locking down HTML, it will be hard, if not impossible, to resist calls for the same technology to be applied in other ways. Here's where that takes us:

A Web where you cannot cut and paste text; where your browser can't "Save As..." an image; where the "allowed" uses of saved files are monitored beyond the browser; where JavaScript is sealed away in opaque tombs; and maybe even where we can no longer effectively "View Source" on some sites, is a very different Web from the one we have today. It's a Web where user agents—browsers—must navigate a nest of enforced duties every time they visit a page. It's a place where the next Tim Berners-Lee or Mozilla, if they were building a new browser from scratch, couldn't just look up the details of all the "Web" technologies. They'd have to negotiate and sign compliance agreements with a raft of DRM providers just to be fully standards-compliant and interoperable.

This is clearly the Web that the copyright industries have been dreaming of, one where they - not you - are in control of everything online. It is a complete and utter betrayal of everything that the open Web has stood for since it was invented by Berners-Lee in 1989, which makes his acquiescence in that betrayal both baffling and depressing.

Fortunately, we have free software. That means that hackers will build browsers that simply ignore the W3C's suicidal decision, and continue to make all elements of the Web page freely available. Many people will naturally choose such browsers over those from Microsoft or Google, which will dutifully follow the W3C's rules. That's good news, in that it will encourage people to move to open source; but also bad news, because some content will be locked down and therefore not generally accessible.
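To make that divide concrete, here is a minimal sketch of how a site gates video behind EME, written against the JavaScript API as it eventually shipped in browsers (navigator.requestMediaKeySystemAccess and friends) rather than the exact draft before the W3C; the key-system identifier, licence server and stream URL are illustrative assumptions, not details taken from the specification or this article.

    // Minimal sketch: a site withholds its video unless the browser exposes EME.
    // Assumes the EME API as later shipped; URLs and key system are illustrative.
    const config: MediaKeySystemConfiguration[] = [{
      initDataTypes: ['cenc'],
      videoCapabilities: [{ contentType: 'video/mp4; codecs="avc1.42E01E"' }],
    }];

    async function playProtectedVideo(video: HTMLVideoElement): Promise<void> {
      if (!('requestMediaKeySystemAccess' in navigator)) {
        // Browsers that decline to implement EME land here: no keys, no video.
        throw new Error('EME not supported; protected stream unavailable');
      }
      // 'com.widevine.alpha' is one widely deployed key-system identifier.
      const access = await navigator.requestMediaKeySystemAccess('com.widevine.alpha', config);
      const mediaKeys = await access.createMediaKeys();
      await video.setMediaKeys(mediaKeys);

      video.addEventListener('encrypted', async (event: MediaEncryptedEvent) => {
        const session = mediaKeys.createSession();
        // The licence exchange happens with the rightsholder's server, opaquely to
        // the user; session.update() hands the returned licence to the DRM module.
        session.addEventListener('message', async (msg: MediaKeyMessageEvent) => {
          const licence = await fetch('https://license.example.com', { // hypothetical licence server
            method: 'POST',
            body: msg.message,
          }).then(r => r.arrayBuffer());
          await session.update(licence);
        });
        await session.generateRequest(event.initDataType, event.initData!);
      });

      video.src = 'https://cdn.example.com/protected.mp4'; // hypothetical encrypted stream
      await video.play();
    }

A browser that never implements requestMediaKeySystemAccess fails at the first check, which is exactly how EME-gated content ends up inaccessible to browsers that refuse to play along.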

The solution is simple to describe, if harder to realise. Since the W3C has clearly lost its way here, and considers kowtowing to the copyright industry more important than safeguarding an open Web that serves all users, we must fork HTML. As O'Brien points out:

even if the W3C has made the wrong decision, that doesn't mean the Web will. The W3C has parted ways with the wider Web before: in the early 2000s, its choice to promote XHTML (an unpopular and restrictive variant of HTML) as the future led to Mozilla, Apple and Opera forming the independent WHATWG. It was WHATWG's vision of a dynamic, application-oriented Web that won—so decisively, in fact, that the W3C later re-adopted it and made it the W3C's own HTML5 deliverable.

Recently, WHATWG has diplomatically parted with the W3C again. Its "HTML Living Standard" continues to be developed in tandem with the W3C's version of the HTML standard, and does not contain EME or any other such DRM-enabling proposals.

By contrast, W3C has now put its weight behind a restrictive future: let's call it "DRM-HTML". Others have certainly bet against open, interoperable standards and user control before. It's just surprising and disappointing to see the W3C and its Director gamble against the precedent of their own success, as well as the fears and consciences of so many of their colleagues.

Indeed. If we want to keep the open Web, it looks like we are going to have to fight for it - and, sadly, against its inventor.

Follow me @glynmoody on Twitter or identi.ca, and on Google+
