Latest SEO Updates for 2019
SEO moves quickly. So quickly, in fact, that by the time you’ve settled on best practice for your SEO team, you’ve already been penalised and are left working out what happened. This is why you have to stay on top of the latest SEO updates for 2019, and new ways of working within the SEO industry, so that you’re at least in line with what’s going on.
You may not be able to stay one step ahead, but you can at least put your website, or your client’s website, in the best possible position to succeed in the current climate.
Latest SEO Updates
So, here are the latest SEO updates for 2019 that we’ve seen over the last three months.
BERT Algorithm Update
If you haven’t heard of the BERT algorithm update and you work in SEO, you’ve probably been living under a very big, and very soundproof, rock. SEOs the world over have been dissecting this update and what it means for the future of rankings. Although coverage of it is currently at its peak, it’s always worth another hat being thrown into the ring.
In short, BERT is a neural network-based technique for natural language processing pre-training. In actual English, this means Google will become better at discerning the context of phrases. At the moment, it can struggle to put things into context, instead looking at a phrase as a collection of words rather than a full sentence.
With this update in place, you can add much more finesse to your language and not worry so much about getting the keyword exactly right. Content created for long-tail keywords can often feel quite stilted, as though you can’t stray too far from the exact phrase. But with this update, you can be a little more natural in the language you use and still pull in long-tail keywords.
Official Google Updates
Although Google updates its algorithm several times a week (and in some cases several times a day), these are often quite minor tweaks. Occasionally, though, an update is large enough to cause widespread and severe volatility in SERPs.
In the past, these updates were often named after animals (Penguin, Panda and Hummingbird), but they’re now given much more pedestrian names like ‘Google March Core Algorithm Update’ (boring, yes, but correct). This is because, surprisingly, Google has started releasing information about these updates before they happen. That would be handy, except Google also says nothing concrete can be done for your site if it gets hit, which is the equivalent of standing in front of a tsunami with an umbrella.
Over the past three months, we’ve seen a large core algorithm update and a number of low-to-medium severity updates (depending on who you talk to). As we’ve come to expect, these updates have been focused (surprise, surprise) on content. This has been a common trend for the past few years and looks set to continue for many more to come. As with every update, Google says the only way to recover, if your site gets hit, is to improve your content. How do you do this? Make it relevant, make it long and make it useful.
Google Speed Report
Google has now released its Speed Report in Search Console, which is essentially a condensed version of its Mobile Speed Report or Web.dev tool. At first glance, it doesn’t give you a lot of information, but it does give you an insight into whether Google likes or dislikes your page, grading it slow, moderate or fast.
Although this doesn’t give you further, more valuable insights that something like Web.dev does, it can be used as a tool to convince your client that work needs to be done on page speed. It can sometimes be difficult to convince clients, and their dev teams, that the site is sluggish for users, even though they can often plainly see it themselves.
But this is straight from the horse’s mouth, so it would be difficult to argue against. This is definitely a tool to watch, as it may be improved over the coming months and years, giving more insight into why Google doesn’t like a page and what you can do to improve it.
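If you want more detail than the Speed Report’s three-tier grade, the same Lighthouse data is available programmatically through Google’s PageSpeed Insights v5 API. Below is a minimal sketch of building a request URL and reading the performance score out of a response; `example.com` is a placeholder, and the trimmed-down `sample` dict only illustrates the shape of the relevant part of the response, not a full payload.

```python
import json
from urllib.parse import urlencode

# Public endpoint for the PageSpeed Insights v5 API.
PSI_ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

def build_psi_url(page_url, strategy="mobile", api_key=None):
    """Build a PageSpeed Insights v5 request URL for a page.

    strategy is "mobile" or "desktop"; an API key is optional for
    occasional use but recommended for anything automated.
    """
    params = {"url": page_url, "strategy": strategy}
    if api_key:
        params["key"] = api_key
    return PSI_ENDPOINT + "?" + urlencode(params)

def performance_score(psi_response):
    """Pull the Lighthouse performance score (0-100) out of a PSI response."""
    score = psi_response["lighthouseResult"]["categories"]["performance"]["score"]
    return round(score * 100)  # API returns a 0-1 fraction

# A trimmed-down response dict showing only the path we read:
sample = {"lighthouseResult": {"categories": {"performance": {"score": 0.43}}}}

print(build_psi_url("https://example.com/"))
print(performance_score(sample))  # 43
```

Fetching the built URL (with `urllib.request` or similar) returns the full JSON, which also includes the lab metrics behind the slow/moderate/fast grade.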
Screaming Frog Update
Screaming Frog has released its latest SEO update for 2019. Each update gives you a few more little tools to use to get an even more in-depth view of your site, and this update is no different.
Although this update doesn’t feel absolutely groundbreaking, it still delivers some valuable additions that weren’t there previously; for example, the PageSpeed tab now integrates with Google’s PageSpeed Insights API. You can also customise the tabs within the tool, either removing them entirely or moving them around to ensure you’re seeing the most valuable information first.
One downside, though, is that if, say, your client’s sitemap doesn’t use the standard .xml extension, Screaming Frog won’t be able to crawl it. There are workarounds, but it’s something to bear in mind if your client uses a CMS that doesn’t produce a standard XML sitemap.
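For reference, a “standard” sitemap here means one following the sitemaps.org protocol: a `urlset` root in the sitemap namespace containing one `url` entry per page. A minimal example might look like this (the domain and date are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
    <lastmod>2019-12-01</lastmod>
  </url>
  <url>
    <loc>https://example.com/about/</loc>
  </url>
</urlset>
```

If your CMS outputs this structure but serves it at a URL without the .xml extension, the content itself is fine; it’s the extension that trips the crawler up.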
‘Self Driving SEO’
And finally, BrightEdge has announced a new ‘Autopilot’ offering which promises what they call ‘self-driving’ SEO. In a nutshell, this will automate some of the more mundane SEO tasks, giving more time to put into overall strategy and content.
From the news we’ve seen coming out of the PPC world, this now seems to be a common trend. Why spend hours doing a mundane task when it can be done with the same precision by a program, app or AI? With Google becoming more interested in content and the usefulness of your site, it’s only natural that the small tasks that were once the focus of SEO work (like page titles, alt text and the like) are being pushed to the back burner.
Whether this tactic will be used by the wider SEO community is another story. It really depends on the type of client you’re working with. When it comes to a store selling clothes, this can be quite easy to automate, as the keywords speak for themselves. But when it comes to the world of medicine, interesting new apps and market-leading technology, will automation really be the tool to market your products or services?
It may be tiresome, but sometimes it’s necessary to trawl through hundreds of page titles and meta descriptions. If a human’s going to read them, why not have a human write them?