I saw the title of this week’s Whiteboard Friday on Moz, When Bounce Rate, Browse Rate (PPV), and Time-on-Site Are Useful Metrics… and When They Aren’t, and was hoping, praying (but I’m kinda agnostic) it would be about onsite metrics sniffed out and measured from Google Analytics (GA) data to rank websites. Alas, sadly, this was not the search ranking piece I’d hoped for. It’s a great video for those new to CRM and to analyzing your onsite website analytics, but it’s not about SEO.
SEO Conspiracy Theories: Google Analytics, Clickstream Data, & SEO
I’ve been speaking to people for many months about what I consider “conspiracy theories” of how Google sniffs out UX signals via Google Analytics or third-party clickstream data. I’m mildly shocked at how many fantastic SEOs talk about these metrics/signals without saying HOW Google evaluates them or how great UX actually translates to high search ranking. SEO was always about signals. Now many give advice as if the touchy-feely, unmeasurable side of user experience, backed by too-small sample sizes, directly affects ranking signals. Besides confusing new SEOs about what Google actually measures, this advice also misses the mark by not showing the specific ways in which search engines can or cannot understand engaging content and website design.
From my data studies for Brian Dean and Neil Patel, I can say Google evaluates UX signals by evaluating code, the minimal rendering Googlebot does, and, most overlooked, semantic analysis. A human debater just lost one of two debates against an IBM AI yesterday. Natural language processing in the hands of the biggest AI robot in the world, Google Search, has powers like a minor deity.
Googlebot does not know whether your web pages are interesting based on time on site or click paths through your site. Google does know how to measure, and give high search ranking to, comprehensive coverage of page and website topics. Google also definitely looks for “pogo stick” behavior: users click on a search result and click back to the SERP faster than they do for other pages competing in the same SERP, which shows your page or website did not satisfactorily answer the query when competing results did not send users back to the results page as quickly.
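To make the “pogo stick” idea concrete, here’s a toy sketch of the comparison described above. This is my own illustration, not Google’s actual algorithm; the function name, sample data, and the 0.5 threshold are all invented for demonstration. It flags results whose users return to the SERP much faster than the average for that query:

```python
# Toy illustration (NOT Google's actual algorithm): given hypothetical
# click logs for one query, flag results whose users bounce back to the
# SERP much faster than the query-wide average -- a rough "pogo stick" test.
from statistics import mean

def pogo_stick_flags(clicks, ratio=0.5):
    """clicks: list of (result_url, seconds_until_return_to_serp).
    Returns the set of URLs whose mean return time is below
    ratio * the overall mean, i.e. users come back unusually fast."""
    by_url = {}
    for url, seconds in clicks:
        by_url.setdefault(url, []).append(seconds)
    overall = mean(seconds for _, seconds in clicks)
    return {url for url, times in by_url.items()
            if mean(times) < ratio * overall}

clicks = [
    ("example.com/a", 4), ("example.com/a", 6),     # quick bounces
    ("example.com/b", 95), ("example.com/b", 120),  # long engagement
    ("example.com/c", 60),
]
print(pogo_stick_flags(clicks))  # → {'example.com/a'}
```

The point of the sketch is that the signal is relative: a fast return only looks bad compared to how long users dwell on competing results for the same search term.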
Google is not blatantly lying when they say they do not use Google Analytics data to study time-on-site engagement or similar metrics. They do not hire a third-party company to look at clickstream data on websites, either. They’ve said as much, ad nauseam. If they did this, they would open themselves up to lawsuits that could literally cost billions. Why do that when they can measure signals directly (and, increasingly, use machine learning)? That would be illogical and illegal.