
first: i will not comment on the actual findings teased in this blog post, because we're missing a lot of information, data and context (javascript to make rendering faster? was it really the first pageview that got faster, or was this aimed at the second? client side rendering actually makes the first pageview render slower (please, prove me wrong))

second: this is the way SEO should be done - a systematic, analytics-driven dev approach - and they solved one of the challenges big sites regularly face SEO-wise: running multiple on-page tests (SEO is just one aspect) simultaneously over chunks of their sites.

most of the time you are stuck with setting a custom variable (or virtual tracker) in google analytics on the pages you changed (and on a control group). the issue with this approach is that GA only reports a sample of the data (50,000 rows a day), and for big sites this sample becomes insignificant very fast, especially if you run tests. additionally, it's not easy to compare the traffic figures of the tracked page group with log data like crawling, so you need a custom-built solution to connect these dots.
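(to illustrate the "connect these dots" part: a minimal sketch in python/pandas, assuming two hypothetical CSV exports - one with the traffic of the tagged page group, one with googlebot hits parsed out of the access logs; all file and column names are placeholders)

   # join per-URL traffic of the tagged page group with googlebot crawl hits
   import pandas as pd

   ga = pd.read_csv("ga_test_group_pages.csv")      # columns: url, visits
   crawl = pd.read_csv("googlebot_crawl_log.csv")   # columns: url, crawl_hits

   # one table per URL: traffic next to crawl activity, zero-filled where missing
   joined = ga.merge(crawl, on="url", how="outer").fillna(0)
   print(joined.sort_values("crawl_hits", ascending=False).head(20))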

this leads us to a serious limitation of the GA and pinterest approach: connecting their data with google serp impressions, average rankings and clicks. yeah, traffic is the goal of SEO, but it is pretty late in the funnel; crawling is pretty early in the funnel, and you can optimize everything in between. for the in-between we are stuck with google webmaster tools for reliable data (at least it's data directly from google and not from some third party). so to get the most out of such tests, you must set them up in a way that makes them traceable via google webmaster tools.

and making something traceable in google webmaster tools basically means you have to slice and dice your pages via namespaces in the URL.

simple setup

   www.example.com/ -> verify in google webmaster tools
   www.example.com/a/ -> verify in google webmaster tools to get data only for this segment
   www.example.com/b/ -> verify in google webmaster tools, ...
   ...
run your tests on /a/ -> if it performs better than the rest of the site, good
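(once each segment is verified as its own property, impressions, clicks and positions can be pulled per segment; a minimal sketch using the current search console API - the successor of webmaster tools - with placeholder dates, file names and an assumed service account)

   from google.oauth2 import service_account
   from googleapiclient.discovery import build

   # service account with read access to the verified segment properties (placeholder file)
   creds = service_account.Credentials.from_service_account_file(
       "service_account.json",
       scopes=["https://www.googleapis.com/auth/webmasters.readonly"])
   service = build("searchconsole", "v1", credentials=creds)

   # query the /a/ segment property only: impressions, clicks, position per page
   response = service.searchanalytics().query(
       siteUrl="https://www.example.com/a/",
       body={
           "startDate": "2024-01-01",   # placeholder test window
           "endDate": "2024-01-31",
           "dimensions": ["page"],
           "rowLimit": 1000,
       },
   ).execute()

   for row in response.get("rows", []):
       print(row["keys"][0], row["impressions"], row["clicks"], row["position"])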

the issue there is that to have a control group you basically need to move a comparable chunk of the site to a new namespace, i.e. /z/. site redirects are their own hassle, but on big sites they are worth it most of the time. also, you don't have to move millions of pages; most of the time a sample on the scale of 50,000 pages is enough. (p.s.: every (test) segment should of course have its own sitemap.xml so you get submitted/indexed data per segment)
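(the per-segment sitemap.xml part is easy to script; a minimal sketch with placeholder URL lists - one file per test namespace, so indexation can be tracked segment by segment)

   # one sitemap.xml per test segment (/a/, /b/, ...)
   from xml.etree.ElementTree import Element, SubElement, ElementTree

   SEGMENTS = {
       "a": ["https://www.example.com/a/page-1", "https://www.example.com/a/page-2"],
       "b": ["https://www.example.com/b/page-1"],
   }

   for segment, urls in SEGMENTS.items():
       urlset = Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
       for u in urls:
           SubElement(SubElement(urlset, "url"), "loc").text = u
       ElementTree(urlset).write("sitemap_%s.xml" % segment,
                                 encoding="utf-8", xml_declaration=True)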

one more thing: getting positive test results is actually quite hard - getting negative test results is much easier. make a test group of pages slow and watch your traffic plummet. make your titles duplicates and watch your traffic plummet, ... yeah, these tests suck business-wise, but from an SEO and development point of view they are a lot of fun.
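(the "make a test group of pages slow" kind of experiment is trivial to wire up; a minimal sketch with flask and a hypothetical app - artificial delay only for the /a/ namespace, the rest of the site stays untouched as the control group)

   import time
   from flask import Flask, request

   app = Flask(__name__)
   TEST_PREFIX = "/a/"     # the test segment
   ADDED_LATENCY = 1.5     # seconds of artificial delay, an arbitrary choice

   @app.before_request
   def slow_down_test_segment():
       if request.path.startswith(TEST_PREFIX):
           time.sleep(ADDED_LATENCY)

   @app.route("/<path:page>")
   def page(page):
       return "content for /%s" % page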

shameless plug: hey pinterest, check out my contacts on my profile. the goal of my company is to make all SEO agencies - including my own - redundant. we should do stuff.


