
My site can progressively enhance itself to (mostly) client-side rendering (CSR) when a prediction can be made from mouse-movement analysis. The analysis gauges a visitor's familiarity with the site, signs of confusion, and so on, and the content is adjusted to appeal to the individual without being intrusive.
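For illustration, here is a minimal sketch (not my actual implementation) of the kind of signals such an analysis might compute from `mousemove` events; the feature names and the 500 ms pause threshold are assumptions, not values from my site:

```typescript
// Hypothetical mouse-movement feature extraction; names and thresholds are illustrative.
interface MouseSample { x: number; y: number; t: number }

const samples: MouseSample[] = [];

document.addEventListener("mousemove", (e) => {
  samples.push({ x: e.clientX, y: e.clientY, t: performance.now() });
});

// Long pauses may hint at hesitation or confusion; short, direct paths may
// hint at familiarity with the layout.
function extractFeatures(points: MouseSample[]) {
  let pathLength = 0;
  let hesitationMs = 0;
  for (let i = 1; i < points.length; i++) {
    const dx = points[i].x - points[i - 1].x;
    const dy = points[i].y - points[i - 1].y;
    const dt = points[i].t - points[i - 1].t;
    pathLength += Math.hypot(dx, dy);
    if (dt > 500) hesitationMs += dt; // treat gaps over 500 ms as hesitation
  }
  return { pathLength, hesitationMs, sampleCount: points.length };
}
```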

The problem is that Googlebot just "clicks" links without producing any mouse movement. Without usable movement data the algorithm can't make an intelligent prediction, so CSR is never enabled and the session remains server-side rendered (SSR).

The content Google sees is valid. Every new session starts out SSR, mouse movement is tracked until a prediction can be made, and from that point CSR takes over until the visitor leaves the site.
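To make that flow concrete, here is a minimal sketch of the session logic, assuming a hypothetical predictProfile() model and enhanceToClientSideRendering() hook (both placeholders, not my real code):

```typescript
// Hypothetical SSR-by-default flow: the page is served server-rendered, and
// client-side enhancement only happens once mouse data yields a confident
// prediction. A crawler that never moves the mouse never crosses the
// threshold, so it keeps the markup exactly as served.
type Prediction = { confidence: number; profile: "familiar" | "confused" | "neutral" };

// Placeholder for the real mouse-movement model.
function predictProfile(sampleCount: number): Prediction {
  return sampleCount > 200
    ? { confidence: 0.9, profile: "familiar" }
    : { confidence: 0, profile: "neutral" };
}

// Placeholder for whatever swaps the page over to client-side rendering.
function enhanceToClientSideRendering(profile: Prediction["profile"]): void {
  console.log(`enhancing for ${profile} visitor`);
}

const CONFIDENCE_THRESHOLD = 0.8; // illustrative value
let sampleCount = 0;
let enhanced = false;

document.addEventListener("mousemove", () => {
  sampleCount++;
});

// Re-check every couple of seconds; with no movement data (e.g. Googlebot)
// the prediction never becomes confident and the session stays SSR.
setInterval(() => {
  if (enhanced) return;
  const prediction = predictProfile(sampleCount);
  if (prediction.confidence >= CONFIDENCE_THRESHOLD) {
    enhanced = true;
    enhanceToClientSideRendering(prediction.profile);
  }
}, 2000);
```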

Is this allowed under Google's guidelines? I would think so, since the Google-cached version and the version served to someone entering my site from Google search match.

Not a duplicate of "Is it cheating to serve different versions of the same content to users and crawlers?", but the same general issue.

Stephen Ostermiller
FrostyFire

1 Answer


Whether this is a problem depends on the intent behind what you're doing.

Google has two kinds of penalties: Algorithmic penalties and manual actions.

1) I don't think your setup would ever trigger an algorithmic penalty, because Google's crawlers would never trigger the client-side updates you're describing.

2) You should safely pass a manual review, if you ever get one at all. A reviewer would see that the content your users receive is not deceptively different from what Google sees.

Ultimately Google can do whatever they want, but based on years of experience in this industry, I think you'll be fine.

Google cares about cloaking when you're showing normal content to the crawler and showing malware, scams, or otherwise banned (or illegal) content to users.

It takes manual action to catch those sites, so if you were ever in a queue for manual review, I think your site would easily pass the "gut check".

Hayk Saakian