
Search & SEO News Roundup May 2023

 

What’s new in search this May? Search is set to change forever, Google has issued some ambiguous advice, and an internal Google memo has leaked. 

These are some of the highlights from May 2023 so far: 

  • Traditional Search is Not Permanent 
  • Some Users Served Google Search Generative Experience (SGE) 
  • Google: Don’t Gamble with SEO Shortcuts 
  • Google State that Invalid Schema Doesn’t Hurt Your Rankings 
  • Cross Domain Canonical Advice Updated by Google 
  • Leaked Google Memo Concedes the AI Arms Race to Open Source 
  • Google Continue to Come Down on Link Disavow Projects 

Read on for our take on recent events in search. 

Traditional Search is Not Permanent 

Posted May 26th on Search Engine Land 

This is a very interesting post from Search Engine Land’s Danny Goodwin. In 2014, Google bought DeepMind. Mustafa Suleyman was a co-founder of DeepMind, and he believes that Google will “look much different in 2033 – where the conversation is the interface rather than the search box”. 

This is something that everyone in search is currently talking about. The journey began when Google introduced knowledge graph panels and started answering questions directly in search, back before complex, chat-based AI became mainstream. Now that this technology has more backing and investment, everything is accelerating. Google’s journey from search engine to information assistant has been catapulted forward, but can Google keep up? In many ways, Google’s Bard is still inferior to OpenAI’s ChatGPT. Though Bard has more data to draw upon, its interpretation engine is poor by comparison. 

Imagine what will happen when this technology is fully integrated with voice assistants. For example, the other day, I was driving along, using Google Maps (‘driving mode’) to navigate. I asked Google: “How many miles are left of my journey?” 

Google responded with, “I am sorry, I don’t know what you mean”. To me, that is exactly the kind of question Google’s voice assistant should have been programmed to handle. It’s not that complicated. With the power of AI behind voice assistants, the responses will become more fluid and complete. 

Now let’s take that through to mainline search. It’s going to become a lot more responsive and dynamic. Rather than supplying users with a simple list of links, the search engine will attempt to provide the needed information or answer the question directly. Of course, search engines will still have to hand off links to other sites where queries are navigational or demonstrate purchase intent. AI is still a far cry from being able to log in to a third-party site, enter our credit card details for us and make a purchase. Even once AI can do these things, how many of us will trust AI with that much power? 

In any case, the search interface will have to change. Right now, everyone seems obsessed with how this will look visually. However, once AI is really cemented as part of search, I think there will be a synergy with voice assistants. You’ll just ask your phone a question, and the AI will operate in the background and read out the answer. No more typing, no more screens. Of course, websites and PCs will still exist. But they will be used more for performing certain commercial or navigational actions. Most informational queries will probably be handled by an AI-powered voice assistant, making smartphones even more magical than they already are. 

Main takeaway: Search will have to change and adapt as AI becomes more and more mainstream. This will start with minor visual adaptations to search engines. Eventually, they will be unrecognisable – though that may still be some months or years away. After that, many of us will simply ask our phones a question and get an answer without looking at search engines (visually) at all. However, we will still need web-accessible, visual search engines. If our queries are navigational or commercial, or we want to read documents without an AI filter, we’ll need something a little more traditional. Aspects of traditional search will survive. 

Some Users Served Google Search Generative Experience (SGE) 

Posted May 26th on SEO Roundtable 

If you’ve been following Google’s announcements since May 10th, you might have seen this post. Google is teasing a potential new search interface, which it dubs the “Search Generative Experience”, or SGE for short. 

Barry’s post on SEO Roundtable references this announcement from Google: 

Today, we’re starting to open up access to Search Labs, a new program to access early experiments from Google. If you’ve already signed up for the waitlist at labs.google.com/search, you’ll be notified by email when you can start testing Labs experiments, like SGE (Search Generative Experience), Code Tips and Add to Sheets in the U.S. And if you want to opt-in to these experiments, simply tap the Labs icon in the latest version of the Google app (Android and iOS) or on Chrome desktop to sign up. You can also visit the Labs site to check your waitlist status. 

Once you’re in, the new generative AI-powered Search experience will help you take some of the work out of searching, so you can understand a topic faster, uncover new viewpoints and insights and get things done more easily. So instead of asking a series of questions and piecing together that information yourself, Search now can do some of that heavy lifting for you. 

This is very exciting news; however, the Search Labs link Google supplies is not working for many users. Some users are getting error messages about account-level access problems when they click the link, or perhaps there is simply too much interest for Google to handle. 

Main takeaway: Google says they are opening Search Labs so users can experiment with the new SGE preview. However, the link isn’t working for many users. 

Google: Don’t Gamble with SEO Shortcuts 

Posted May 19th on SEO Roundtable 

Google’s Search Advocate, John Mueller, has urged webmasters against engaging in SEO shortcut gambling. This post comes from the Mastodon social network: 

[John Mueller’s Mastodon post is embedded here in the original article.] 

This is an interesting response from John Mueller, and it mirrors our internal SEO philosophy at Anicca. No one can categorically state that black-hat SEO shortcuts never work. However, they’re not long-term solutions. Sites caught employing such tactics are often burned, and their domains banned from Google’s results. 

If you’re happy to make hay while the sun shines and then have to start all over again, fair enough. However, managing sites and networks on that scale takes as much (if not more) effort as employing white-hat SEO practices and doing the job properly. 

Since we (at Anicca) work with brands who want a long-term web presence, there’s no question or debate: we don’t employ shady, black-hat SEO shortcuts. Some may prefer such short-term, win-big business models. They’re risky, though, and the gambling analogy is not unwarranted. 

Main takeaway: Google warns users away from shady SEO shortcuts. Spend time building brands that last online. 

Google State that Invalid Schema Doesn’t Hurt Your Rankings 

Posted May 12th on SEO Roundtable 

Recently, a rather interesting video was posted on Google Search Central. Within the video, a user asks whether an invalid schema can hurt their site. The answer from Gary Illyes was: 

Short answer is no. The long answer is also no because if we can’t parse it, we just won’t use it. That might mean that you’ll miss out on some search features like rich attributes under your snippets though. 

Writing on SEO Roundtable, Barry Schwartz notes: 

Gary Illyes from Google said that having invalid schema markup does not hurt your rankings or your site. He said the worse case, if Google cannot parse the markup, it won’t use it. 

This is only a partial answer from Google, and many will take it too literally. Will having invalid schema on your site cause Google to actively down-rank your domain? No. Will your site miss out on rich snippets that might draw in greater volumes of SEO traffic? Yes. Do you consider that loss of traffic to be a form of ‘hurt’? Well, Google might say not. We think a lot of business owners will feel differently. 

If your site receives a certain amount of SEO traffic, and bad schema then costs you your rich snippets – and with them a drop in traffic and revenue – does that hurt you? Obviously, it does. 
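If you want a quick sense of whether the structured data on a page even parses, a crude check is to extract its JSON-LD blocks and run them through a JSON parser. The sketch below is purely illustrative and is not Google’s tooling: the URL is hypothetical, it only covers JSON-LD (not microdata or RDFa), and proper validation should go through Google’s Rich Results Test or the Schema.org validator.

```python
import json
from html.parser import HTMLParser
from urllib.request import urlopen


class JSONLDExtractor(HTMLParser):
    """Collects the contents of <script type="application/ld+json"> blocks."""

    def __init__(self):
        super().__init__()
        self._in_jsonld = False
        self.blocks = []

    def handle_starttag(self, tag, attrs):
        if tag == "script" and ("type", "application/ld+json") in attrs:
            self._in_jsonld = True

    def handle_endtag(self, tag):
        if tag == "script":
            self._in_jsonld = False

    def handle_data(self, data):
        if self._in_jsonld:
            self.blocks.append(data)


def check_jsonld(url):
    """Print whether each JSON-LD block on the page parses as valid JSON."""
    html = urlopen(url).read().decode("utf-8", errors="replace")
    extractor = JSONLDExtractor()
    extractor.feed(html)
    if not extractor.blocks:
        print("No JSON-LD blocks found.")
    for i, block in enumerate(extractor.blocks, 1):
        try:
            data = json.loads(block)
            label = data.get("@type", "no @type") if isinstance(data, dict) else "list"
            print(f"Block {i}: parses OK ({label})")
        except json.JSONDecodeError as err:
            # Google would simply ignore a block it cannot parse -
            # but you also lose any rich result it was meant to trigger.
            print(f"Block {i}: invalid JSON ({err})")


if __name__ == "__main__":
    check_jsonld("https://www.example.com/")  # hypothetical URL
```

This only confirms that the markup parses; a block can be syntactically valid yet still use properties Google doesn’t recognise, so the official testing tools remain the real arbiter. 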

Main takeaway: Invalid schema implementation will not attract active penalties that harm your rankings. However, your SEO traffic will suffer, since your site will fail to generate rich snippets within Google’s results – and so, in turn, will your revenue. That does cause you harm, so watch out! 

Cross Domain Canonical Advice Updated by Google 

Posted May 4th on Search Engine Journal 

Roger Montti, posting on Search Engine Journal, noticed that Google had updated their cross-domain canonical advice and documentation. This is what Roger had to say: 

Google has updated the guidance on Cross-Domain Canonicals for Syndicated content; however, the guidance for syndicated news remains the same. A cross-domain canonical is when the duplicate page appears on an entirely different website (domain). Google updated the guidance about crawling and indexing on Tuesday, May 2, 2023, to remove guidance about cross-domain canonicals. 

This has apparently caused confusion, as the documentation change was not accompanied by an explicit statement from Google. Google removed guidance for cross-domain canonicals. However, Google did not categorically state that they were no longer supported. 

Later, Google added documentation to clarify their prior removal of advice: 

The canonical link element is not recommended for those who wish to avoid duplication by syndication partners because the pages are often very different. 

The most effective solution is for partners to block indexing of your content. 

It ‘seems’ as if Google no longer recommend cross-domain canonical tags for syndicated content. 
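If you syndicate content, it may be worth checking which signal your partners’ copies actually carry. The sketch below is a rough, illustrative check only (the partner URL is hypothetical, and it inspects HTML-level signals using the Python standard library), not an official Google method.

```python
from html.parser import HTMLParser
from urllib.request import urlopen


class IndexingSignals(HTMLParser):
    """Records the canonical link href and robots meta content, if present."""

    def __init__(self):
        super().__init__()
        self.canonical = None
        self.robots = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "link" and "canonical" in (attrs.get("rel") or "").lower().split():
            self.canonical = attrs.get("href")
        elif tag == "meta" and (attrs.get("name") or "").lower() == "robots":
            self.robots = attrs.get("content")


def check_partner_copy(url):
    """Show which duplicate-handling signal a syndicated copy of an article carries."""
    html = urlopen(url).read().decode("utf-8", errors="replace")
    signals = IndexingSignals()
    signals.feed(html)
    print(f"canonical: {signals.canonical}")
    print(f"robots:    {signals.robots}")
    if signals.robots and "noindex" in signals.robots.lower():
        print("Copy is blocked from indexing - in line with Google's updated advice.")
    elif signals.canonical:
        print("Copy relies on a cross-domain canonical - no longer recommended for syndication.")
    else:
        print("No noindex or canonical found - this copy may compete with the original.")


if __name__ == "__main__":
    check_partner_copy("https://partner.example.com/syndicated-article")  # hypothetical URL
```

Note that this only looks at the page’s HTML; a noindex can also be served via an X-Robots-Tag HTTP header, which this sketch does not check. 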

Main takeaway: Documentation changes from Google seem to suggest that cross-domain canonical tags are no longer recommended for syndicated content. However, there has been no concrete official statement. 

Leaked Google Memo Concedes the AI Arms Race to Open Source 

Posted May 5th on Search Engine Journal 

Roger Montti of Search Engine Journal has revealed a leaked memo from Google. Here are the supposed contents of that memo: 

We’ve done a lot of looking over our shoulders at OpenAI. Who will cross the next milestone? What will the next move be? 

But the uncomfortable truth is we aren’t positioned to win this arms race, and neither is OpenAI. While we’ve been squabbling, a third faction has been quietly eating our lunch. 

I’m talking, of course, about open source. 

Plainly put, they are lapping us. Things we consider “major open problems” are solved and in people’s hands today. 

While our models still hold a slight edge in terms of quality, the gap is closing astonishingly quickly. 

Open-source models are faster, more customizable, more private, and pound-for-pound more capable. 

They are doing things with $100 and 13B params that we struggle with at $10M and 540B. 

And they are doing so in weeks, not months. 

Google’s acceptance that their massive data set and index of the web is failing to feed their own AI projects fast enough is astonishing. When I asked Bard directly to compare itself against ChatGPT, Bard stated that it was superior to ChatGPT due to its access to more (and fresher) data from the web. 

And yet, when I compare Bard with ChatGPT, its interpretation engine (working out what I want it to do) is massively inferior to ChatGPT’s. So, while Bard operates on much larger and more complete data sets, it’s not such a smart thinker. This memo seems to confirm that, whilst more data helps AIs perform better, it’s simply not enough in Bard’s case. It’s not enough to compete with OpenAI, and it’s not enough to compete with the open-source community. 

Consider why WordPress, as a CMS, is still so popular despite its faults and clunkiness. Every time Google release new features or technologies, the WordPress open-source plugin community reacts. Features that take months or years to reach other CMS platforms are there for WordPress overnight. How can anything proprietary hope to compete? 

The same may end up being true of AI. Even now, projects like Stable Diffusion are rapidly catching up with or overtaking proprietary equivalents (such as DALL-E 2). 

Main takeaway: If this leaked memo is legitimate, Google is in a lot of trouble. If you’ve been reading our recent SEO and Analytics posts, you’ll know that this is something we speculated about well in advance. It seemed as if Google’s Bard was failing to maintain pace with OpenAI’s ChatGPT, and in some ways this still seems to be true. But now, the sleeping giant of the open-source community is catching up with, and potentially set to overtake, both Google and OpenAI. 

Google Continue to Come Down on Link Disavow Projects 

Posted May 4th on SEO Roundtable 

John Mueller is at it again! He’s taking aim at those who sell large link disavow projects, particularly those who base such work on third-party metrics: