In this final installment of our 2009 retrospective, we’ll look at standards and consortia related to rights technologies. These fall into three general categories: DRM, content identification, and rights licensing information. The most significant standard for digital content ecosystems or supply chains, DECE, had no real developments last year but made several announcements after the new year; I covered these last week.
The two most important standards in the DRM world are from the Open Mobile Alliance (OMA DRM) and the Marlin joint development group. OMA DRM is a product of the “handset maker axis,” while Marlin is a product of the “media player maker axis” — two groups with obvious overlaps.
OMA released version 2.1 of OMA DRM this year, an incremental upgrade from OMA DRM 2.0, which was released back in 2005. OMA DRM 2.0 had been languishing in the market, with very few adopters and some controversies around patent coverage. But OMA DRM 2.1 is finally getting some traction, primarily through Sony Ericsson’s Play Now Plus service and through adoption by the music service provider Omnifone and the large wireless carrier Vodafone in several countries.
Marlin is also seeing some uptake, thanks to its adoption in various Sony platforms (PlayStation gaming devices and Sony Reader e-book readers) and in the Japanese IPTV market, as well as its embrace by the Open IPTV Forum. Various startups are also beginning to work with Marlin, which is based on technology from Intertrust and governed by a group that also includes Sony, Samsung, Philips, and Panasonic.
OMA DRM 2.1 and Marlin play in different markets, at least for the time being: OMA DRM 2.1 in music and Marlin in video. But that could change. Just as the “axes” behind each of these technologies overlap, the applications could overlap as these technologies move forward.
The content identification market includes three major pieces: watermarking, fingerprinting, and content identifier standards. Standards have not existed for watermarking and fingerprinting, as each vendor of those technologies has attempted to sell its own “secret sauce.” 2009 was a slow year for new deployments of watermarking and fingerprinting, so slow that the market consolidated: Dolby’s Cinea division ceased operations, and the Philips spinoff Civolution acquired Thomson’s content identification business.
It is hard to see how standards could apply to fingerprinting, because each vendor’s proprietary technology includes not only its own algorithms for examining content but also its own fingerprint formats. Standards are much more necessary for watermarking, because watermarking depends on multiple entities inserting (embedding) and detecting watermarks at various points in the digital content supply chain; these entities need standards in order to coordinate their efforts.
The RIAA released a standard for watermark payloads (the data embedded in content files as watermarks) last year. The standard payload contains enough bits to include standard identifiers used (in theory) by the music industry, including ISRC and GRid. It also contains fields for parental advisory and other concerns that are separate from anti-piracy.
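To make the coordination point concrete, here is a minimal sketch, in Python, of what packing and unpacking such a payload might look like. The field names, widths, and sample identifiers are hypothetical illustrations, not the RIAA’s actual layout; the point is simply that an embedder and a downstream detector can only interoperate if they agree on a common layout.

```python
# Hypothetical watermark payload layout -- NOT the actual RIAA spec.
# It illustrates why embedders and detectors need a shared standard:
# both sides must agree on field order, width, and encoding.

from dataclasses import dataclass


@dataclass
class WatermarkPayload:
    isrc: str                # sample ISRC, e.g. "USRC17607839" (12 characters)
    grid: str                # sample GRid release identifier (18 characters)
    parental_advisory: bool  # flag unrelated to anti-piracy uses

    def pack(self) -> bytes:
        """Serialize the payload into a fixed-length 31-byte string."""
        flags = b"\x01" if self.parental_advisory else b"\x00"
        return (self.isrc.ljust(12).encode("ascii")
                + self.grid.ljust(18).encode("ascii")
                + flags)

    @classmethod
    def unpack(cls, data: bytes) -> "WatermarkPayload":
        """Recover the payload; a detector must assume the same layout."""
        return cls(
            isrc=data[0:12].decode("ascii").strip(),
            grid=data[12:30].decode("ascii").strip(),
            parental_advisory=(data[30:31] == b"\x01"),
        )


# An embedder packs the payload into the audio; a detector later extracts
# and unpacks it. Round-tripping works only because both use one layout.
payload = WatermarkPayload("USRC17607839", "A12425GABC1234002M", True)
assert WatermarkPayload.unpack(payload.pack()) == payload
```

A real payload would presumably be bit-packed rather than carried as ASCII strings, but the interoperability argument is the same.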
The RIAA took a low-key approach to releasing the spec, fearing consumer backlash. But it’s important to remember that watermarks enable a wide range of applications, not just forensic copyright tracking.
The final area of standards activity has been in rights licensing metadata. The idea behind standard rights metadata is not to control usage of content but to automate licensing, tracking, and monetization. The AP’s hNews tagging scheme, covered in Part 2 of our year-end roundup last week, falls into this category, as does the news publishing industry’s ACAP standard for expressing search engines’ rights to index content and include it in search results, which I also covered last week.
The rights licensing standard with the most potential to transform the industry is CC+ from Creative Commons, an extension to CC licenses that allows for commercial licensing. I was expecting to see a lot of activity in this area in 2009, but sadly, there was virtually none. Copyright Clearance Center (CCC), the US licensing agency for text content, helped design CC+ and launched a promising content licensing service around it, called Ozmo, in late 2008. Ozmo is now basically dead.
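For those who have not looked at the mechanics: CC+ works by pairing a standard CC license link with a “more permissions” pointer (the cc:morePermissions property from Creative Commons’ ccREL vocabulary) that leads to a commercial offer. Below is a minimal sketch, in Python, of generating that markup for a web page; the offer URL is a placeholder, and the exact markup a service like Ozmo emitted may have differed.

```python
# Minimal sketch of CC+ markup: a standard CC license link plus a
# cc:morePermissions link pointing to a separate commercial offer.
# The offer URL below is a placeholder, not a real licensing service.

CC_NS = "http://creativecommons.org/ns#"  # ccREL namespace


def ccplus_markup(license_url: str, more_permissions_url: str) -> str:
    """Return RDFa-style HTML pairing a CC license with a CC+ offer."""
    return (
        f'<div xmlns:cc="{CC_NS}">\n'
        f'  <a rel="license" href="{license_url}">Some rights reserved</a>\n'
        f'  <a rel="cc:morePermissions" href="{more_permissions_url}">'
        f'Commercial licensing available</a>\n'
        f'</div>'
    )


print(ccplus_markup(
    "http://creativecommons.org/licenses/by-nc/3.0/",  # noncommercial CC license
    "https://example.com/commercial-offer",            # hypothetical CC+ offer page
))
```

That division of labor is what makes CC+ attractive: the CC license covers free, noncommercial uses, while the “more permissions” link is where an agent such as CCC can attach a paid offer.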
CC also turned up in the ever-widening array of innovative services that Attributor has been building around its text fingerprinting technology. Back in March, Attributor announced FairShare, which enables content creators — such as bloggers — to attach CC licenses to their content and use the fingerprinting service to monitor its use. Content creators simply need an RSS feed of their content to participate.
I still say that CC+ is a great idea that could finally bring Creative Commons into mainstream commercial publishing. It’s instructive to think of CC as analogous to RSS. Before RSS, content syndication happened through arcane contracts and lawyers. Attempts to automate the process, such as the ICE standard from IDEAlliance, were over-engineered and never caught on. RSS came along as a simple, streamlined technology, with many free tools around it, and took off. Analogously, CC simplifies and streamlines many aspects of rights licensing that are currently prohibitively complex and arcane.
Let’s chalk up the lack of entrepreneurial activity around Creative Commons in 2009 to the economic meltdown. With enough such activity, the “religious dogma” on both sides of the CC phenomenon can finally melt away, so that CC can realize its commercial potential. Perhaps I date myself by saying that I remember the good old days of what we now call open source software, back when it was “free software” informed by Marxist principles. Much of that was conveniently forgotten once open source software companies began to sell for hundreds of millions of dollars.
I expect something similar to happen around Creative Commons once the economy picks up. Beyond CC’s many noncommercial uses, I expect to see more activity around CC and commercial content licensing, and not just licensing of traditional commercial content. Any successful content licensing model that uses Creative Commons to grease the wheels will need to apply equally, if not more so, to user-generated content.