Since formal semantics never saw wide adoption, everyone is trying to make sense of the Web through scraping. It's something most viable sites should want to get behind, from e-commerce, where you can find the best price faster, to education. But I suspect that if a really good shared scraping system were created, we'd see countermeasures pretty fast: from timid content producers, and from the Googles of the world who want exclusive access to this level of information. Still, it's a natural layer for a shared commons, and what could be built on it, once we finally get past the mess of arbitrary representations, is very exciting. Does anyone know of a standard for creating interoperable scrapers? In JavaScript, of course. Here's one submission: https://github.com/zotero/translators
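To make the question concrete, here's a minimal sketch of what an interoperable scraper convention might look like, loosely inspired by the detect/scrape split that Zotero's translators use. All the names here (`detect`, `scrape`, `runTranslators`, the example site) are hypothetical, not part of any actual standard, and the regex is just to keep the sketch dependency-free; a real scraper would parse the DOM.

```javascript
// Hypothetical "translator" shape: declare which URLs you handle
// and how to turn raw HTML into structured data.
const priceTranslator = {
  id: "example-price",
  // detect: can this translator handle the given URL?
  detect: (url) => /^https?:\/\/shop\.example\.com\//.test(url),
  // scrape: pull structured data out of the page
  // (regex only to keep this sketch self-contained)
  scrape: (html) => {
    const m = html.match(/<span class="price">\$([\d.]+)<\/span>/);
    return m ? { price: Number(m[1]) } : null;
  },
};

// A tiny registry: dispatch a page to the first translator that claims it.
function runTranslators(translators, url, html) {
  const t = translators.find((tr) => tr.detect(url));
  return t ? t.scrape(html) : null;
}
```

The interesting part of any such standard wouldn't be the code, but agreeing on the output schema so that translators from different authors compose.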