Don’t try to reinvent the SEO wheel, says Google’s Martin Splitt


“You might shoot yourself in the foot when you don’t expect it, so why would you build something more brittle if all it does is solve a non-problem?” Martin Splitt, search developer advocate for Google, warned SEOs and developers. Reinventing the wheel with technical workarounds has been a frustrating trend, Splitt said on our crawling and indexing session of Live with Search Engine Land.

During his discussion Friday with Search Engine Land News Editor Barry Schwartz, Splitt shared ways that SEOs can avoid the most common indexing and JavaScript pitfalls, why he and other Google employees refrain from answering every question, and why the company isn’t as communicative as we’d like it to be.

Why it’s probably better that Googlers don’t answer every question

“[People say,] ‘Oh, but you as a Googler can’t talk about this,’ — I can pretty much talk about nearly everything, it’s just often unwise to do so,” Splitt said, adding that Google is a huge company that operates in many countries and that employees don’t always have a complete overview of what’s going on.

“For instance, ranking for me,” he used as an example, “I am not involved in ranking; I can look things up if I have to, but I really don’t want to because it’s not my domain and when people ask me about ranking I don’t say anything — not because I am bound to not say anything about ranking, but because I just don’t know.”

The changes Google makes, be it to the layout of its search results page or its ranking algorithms, have tremendous implications for businesses and users. Googlers commenting on topics that they do not have a comprehensive knowledge of could generate a lot of mixed messaging and confusion, which may lead SEOs to recommend tactics that just don’t work or may even hurt their chances at ranking.

Why Google has communication issues

Google’s Search Relations team, which Splitt is a member of, is the link between the company’s internal teams and the SEO community. Handling all those communications accurately and promptly has, at times, proved difficult.

“While most engineering teams work with us quite closely and are receptive for both feedback as well as giving us a heads up before they do things, sometimes some teams are a little less inclined to do that and then they just make changes and we find out randomly at some point when the change has already happened — that’s the rel=prev/next situation,” Splitt said. This particular announcement was met with frustration as the change had occurred years prior. 

“We should have proactively communicated this change to happen, even though [that team] made the decision because they thought nothing externally changes,” Splitt elaborated, adding “But that’s not quite the reality because, as we know, [SEOs] advise clients and customers to make a change that costs them money — that wasn’t exactly fantastic and we were in the middle of this entire thing; a very unfortunate position.”

A similar miscommunication occurred earlier this year when Google backtracked on some guidance regarding keywords within Google My Business descriptions. Despite the initial shock, these kinds of periodic missteps may have a silver lining: they prompt more dialogue and transparency between Google and the SEO community, which will hopefully lead to fewer botched communications.

Common, but avoidable, indexing and JavaScript mistakes

“These are things that worry me a lot and, oftentimes, it is either very over eagerly excited developers or SEOs who understand enough of the technology to be dangerous with it,” Splitt said of the needlessly complicated solutions that some developers and SEOs opt to implement.

These could lead to crawling errors. “We are still seeing websites not linking properly,” Splitt offered as one example, explaining that he’s seen errors on both internal and external links. “There is an HTML link tag, you put a URL in the href, that’s how you link and I don’t know why people are reinventing the wheel.” Most of the time these over-engineered solutions seem to work, but they also fail in certain cases, and those failed instances usually involve crawlers, said Splitt.
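The pattern Splitt describes is the difference between a plain anchor tag and a script-driven pseudo-link. A minimal illustration (the URL is a made-up example):

```html
<!-- Crawlable: the URL lives in the href attribute, where Googlebot looks for links -->
<a href="/products/blue-widget">Blue widget</a>

<!-- Brittle: no href, so crawlers see no link at all; navigation only
     works for users whose browsers run the JavaScript -->
<span onclick="window.location='/products/blue-widget'">Blue widget</span>
```

Google’s own guidance is that its crawler only follows links in `<a>` elements with an `href` attribute, so the second pattern can silently hide entire sections of a site from indexing while appearing to work fine for users.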

“My favorite question is ‘Can we, for Googlebot, not serve the CSS to save some bandwidth?’ and I’m like, ‘The straightforward answer to this question is yes, you can,’” said Splitt, caveating that “the real answer is no, you should not, because that means that you’re building in complexity to solve something that isn’t a problem in the first place.” These kinds of shortcuts can end up hurting your site’s visibility in ways you might not anticipate, and they can also waste resources during implementation and cost additional hours remedying the situation.

Don’t opt for JavaScript if there’s a simpler method. A lack of expertise and an inclination for what Splitt refers to as “brittle” technical workarounds are also responsible for many of the common JavaScript issues he comes across. “For instance, robots.txt is a classical thing where [SEOs and webmasters] are like, ‘I don’t know, Google has dropped my site from the index,’ and then you check and then it’s like, ‘Yeah, your robots.txt says that we shouldn’t take this URL,’” he said, adding that when a site’s robots.txt file blocks Google, the search engine won’t see that site’s content regardless of how the JavaScript is set up.
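The robots.txt failure mode Splitt describes is often a single overly broad rule. A hypothetical example of a file that removes an entire site from Google’s view, no matter how well the pages themselves are built:

```
# Blocks all compliant crawlers, including Googlebot, from every URL on the site
User-agent: *
Disallow: /
```

A blocked URL can’t be fetched at all, so no amount of server-side rendering or JavaScript tuning downstream will get its content indexed — checking robots.txt first is the cheapest diagnostic step when pages drop out of the index.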

“We do see people breaking websites for users rather than for search engines,” said Splitt, “So, it is indexable, we do rank it, but it is terrible if you open it on a normal website because [you] are shipping like five megabytes of JavaScript for a very simple scrolling list of products.” Building sites for users, instead of for search engines, has been Google’s go-to advice for years; after all, a site that ranks well isn’t much use if users have to wait so long for it to load that they eventually bounce.

“Another thing that I see relatively often is that people rely on JavaScript to do things that you can do without JavaScript — that’s not something that you need to inherently be careful about, it’s just something that I think is pointless,” said Splitt, emphasizing his prior advice on opting for simple, proven techniques over Frankenstein-like workarounds that may end up costing you over the long haul.
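One common instance of the pattern Splitt criticizes is reimplementing behavior the browser already provides. Lazy-loading images, for example, once required a JavaScript library; modern browsers support it natively with a single attribute (the file name here is illustrative):

```html
<!-- No JavaScript needed: the browser defers loading until the image nears the viewport -->
<img src="/images/product-photo.jpg" loading="lazy" alt="Product photo" width="400" height="300">
```

The explicit `width` and `height` also let the browser reserve layout space up front — another job often handed to script unnecessarily.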

About The Author

George Nguyen is an Associate Editor at Third Door Media. His background is in content marketing, journalism, and storytelling.
