Messy SEO is a column covering the nitty-gritty, unpolished tasks involved in the auditing, planning, and optimization of websites, using MarTech’s new domain as a case study.
This installment of “Messy SEO” details my process of working with our marketing, content and development teams to further clean up the search engine results pages for MarTech. In Part 5, we discussed the fallout and improvements of our title changes and site structure issues.
Our MarTech website houses a lot of marketing industry content. In addition to the pieces we’ve published since its launch in May 2021, the domain has all of the content that was previously featured on Marketing Land and MarTech Today.
One would think that with so much industry-specific content, Google would have an easy time finding and serving up relevant results for searchers. Unfortunately, it seems like the search engine is having a difficult time identifying our main topics.
Many of the MarTech topics (shown below) that we cover are still getting little interaction in the SERPs.
| Queries | Clicks | Impressions | CTR | Position |
| --- | --- | --- | --- | --- |
| customer experience | 5 | 4,651 | 0.10% | 28 |
| email marketing | 22 | 24,239 | 0.09% | 40.04 |
| agile marketing | 5 | 7,046 | 0.10% | 48.4 |
| marketing automation | 11 | 66,534 | 0.02% | 53.93 |
| crm | 0 | 10 | 0% | 57.7 |

MarTech queries that are receiving little interaction.
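If you want to run the same kind of triage on your own data, a short script can flag topics that earn impressions but few clicks. Below is a minimal sketch, assuming a Google Search Console performance export saved as Queries.csv with the columns shown in the table above; the file name and both thresholds are our own illustrative choices, not a standard.

```python
import csv

# Assumed GSC performance export: Queries.csv with columns
# "Top queries", "Clicks", "Impressions", "CTR", "Position".
MIN_IMPRESSIONS = 1000   # enough search demand to matter (arbitrary)
MAX_CTR = 0.5            # percent; below this, the topic underperforms

with open("Queries.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        impressions = int(row["Impressions"].replace(",", ""))
        ctr = float(row["CTR"].rstrip("%"))
        if impressions >= MIN_IMPRESSIONS and ctr <= MAX_CTR:
            print(f'{row["Top queries"]}: {impressions:,} impressions, '
                  f'{ctr}% CTR, avg. position {row["Position"]}')
```

Queries this flags, like "marketing automation" above, are exactly the ones worth checking for a missing pillar page.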
After researching these keywords/topics and their related pages — taking note of the site structure issues we’d already identified — the problem we were experiencing became clear: We were missing pillar pages.
Understanding the importance of pillar pages
Content pillar pages are designed to be the go-to source for your site’s main topics. They cover subjects in-depth, linking to related pieces covering the same topic (known as topic clusters), which helps site users find all the information they’re searching for. They serve as the ideal landing pages, introducing readers to your site’s subject matter.
From a ranking perspective, pillar pages are gold. They have the potential to rank well for given topics and pass ranking signals to their related pages.
After our content analysis, our team quickly realized the MarTech site was missing these key pillar pages. We had plenty of content covering a slew of marketing and technology topics, but no central pages that gave in-depth overviews on any subject in particular.
Our top-ranking pages for the keywords shared above were largely evergreen “how to” articles. These are helpful resources for users, but don’t serve as good pillar pages.
The top-ranking page that came closest to the “pillar” style was our “Marketing Automation Landscape” article. It gave an overview of the topic, linked to related pages and was longer than an average piece of content on our site. So, seeing its potential, we added more in-depth content and links to other related pages.
We then analyzed the rest of these pages and mapped out a strategy for creating new pillar pages, consolidating similar pages into hubs, and updating old content.
Creating pillar pages that connect topic clusters
Developing pillar pages became our next big project for MarTech. Our team outlined the highest-ranking pages for the site’s main topics (as described above) and reviewed their structure. We also looked for pages that weren’t ranking well but had the potential to become pillar content.
We believe this was our missing puzzle piece. The issue wasn’t our lack of authoritative content; it was how we structured that content on this new MarTech domain, a conglomeration of content from two well-established marketing news sites.
We began creating new pillar pages (and modifying pages with pillar potential) that met the following conditions:
The content went in-depth on a relevant topic.
It contained at least 2,000 words.
It linked to at least five relevant pages.
It featured authoritative information on the topic, citing sources when necessary.
There’s no magic formula to crafting a high-ranking, engaging pillar page. We simply found these criteria helped us create content that meets users’ needs and establishes topical hierarchy.
Avoiding keyword cannibalization
While undertaking this process, our team is doing its best to avoid keyword cannibalization: the unfortunate scenario in which multiple pages on your site compete for the same keyword or topic. Left unchecked, it could harm our organic search performance.
To prevent this issue, we are creating pillar pages under the following guidelines:
Avoid long-tail keywords and topics (these are for sub-topic pages).
Review the site to see if any existing pages are competitors (see the sketch after this list).
Add internal links from sub-topic pages to the main pillar page.
Consolidate pages that aren’t unique enough into pillar pages.
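One way to check the second guideline is to look for queries where more than one URL is picking up traffic. Here is a minimal sketch, assuming a query-plus-page export (for example, from the Search Console API or an SEO tool) saved as query_pages.csv with query, page and clicks columns; all file and column names here are hypothetical.

```python
import csv
from collections import defaultdict

# Assumed export: query_pages.csv with columns query, page, clicks.
pages_by_query = defaultdict(set)

with open("query_pages.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        pages_by_query[row["query"]].add(row["page"])

# Queries answered by multiple URLs are cannibalization candidates.
for query, pages in sorted(pages_by_query.items()):
    if len(pages) > 1:
        print(f"{query}: {len(pages)} competing pages")
        for page in sorted(pages):
            print(f"  {page}")
```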
No guideline is foolproof; Google may still force these pillar pages to compete with similar content on our site. But we believe adding these content hubs to our site structure will help users and search engines find out what MarTech is all about.
Have you had difficulties ranking for your site’s main topics? How are you addressing the issue? Email me at cpatterson@thirddoormedia.com with the subject line “Messy SEO Part 6” to let me know.
Artificial intelligence, machine learning and neural networks are major buzzwords in the SEO community today. Marketers have highlighted these technologies’ ability to automate time-consuming tasks at scale, which can lead to more successful campaigns. Yet many professionals often have trouble distinguishing between these concepts.
“Artificial intelligence is essentially the term that defines the whole space,” said Eric Enge, president of Pilot Holding and former principal at Perficient, in his presentation at SMX Next. “Machine learning is a subset of that [AI] set around specific algorithms.”
Natural language processing (NLP) is another system that’s been used for SEO tasks in recent years. It’s primarily focused on understanding the meanings behind human speech.
“NLP is about helping computers better understand language the way a human does, including the contextual nuances,” he said.
With so many developing technologies available, marketers would be wise to learn how they can be applied to their campaigns. Here are three ways AI and its branches can automate SEO tasks at scale.
AI can address customers’ long-tail needs
Enge pointed to a customer search engagement study from Bloomreach that found that 82% of B2C shoppers’ experience is spent searching and browsing. This leaves room for plenty of long-tail searches, which are more niche in nature and, consequently, often overlooked by marketers.
Bloomreach’s own AI tool focuses primarily on extracting insights from this phase of discovery, Enge explained. It can identify site content that’s both underutilized and matches customer long-tail searches.
“AI improves pages by presenting more related pages that currently aren’t being linked to,” he said, “or even potentially create new pages to fill the holes of those long-tail needs to create a better customer experience.”
Marketers can use AI systems to generate more relevant pages based on these long-tail interests. But, there are some caveats to be aware of.
“Just be careful not to create too many new pages,” Enge said. “There are certainly cases where too many pages can be a bad thing. But deployed properly, this can be very effective.”
AI can enable automated content creation
Enge shared some information about GPT-3, a popular AI language model, to demonstrate AI’s content creation capabilities. While impressive, he noted how a system like this can get out of control if there aren’t proper constraints.
“They [AI systems] currently don’t have any model of the real world,” he said. “They only have the data that they were trained on. They don’t have any perspective or context for anything, so they can make really bad mistakes, and when they write, they’re prone to bias.”
“The wonderful thing about the web is that it has all the world’s information on it — the terrible thing about the web is all the world’s disinformation is on it, too,” he added.
Despite these weaknesses, AI systems have a lot of promise. Continuous improvements in these technologies can help marketers scale content efforts to meet customer expectations.
GPT-3, in particular, has the ability to generate content in a variety of formats, allowing SEOs to focus more on optimization efforts.
“You can use it [GPT-3] to create new content,” Enge said. “You’re going to have to put in a lot of effort and bring a lot of expertise to the table to do it. It might be more cost-effective than writing from scratch, or it may not, depending on how good you are.”
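For context, the draft-then-edit workflow Enge describes might look like the following. This is a minimal sketch, assuming the pre-1.0 openai Python client and the text-davinci-002 model available at the time; the prompt and settings are illustrative, and the output still needs the human review and expertise Enge warns about.

```python
import openai

openai.api_key = "YOUR_API_KEY"  # hypothetical placeholder

# Ask the model for a rough draft; an editor must still fact-check
# and revise it, since the model has no model of the real world.
response = openai.Completion.create(
    model="text-davinci-002",  # assumed model name of the era
    prompt="Write a 150-word overview of marketing automation "
           "for a B2B marketing audience.",
    max_tokens=250,
    temperature=0.7,
)

draft = response.choices[0].text.strip()
print(draft)
```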
AI can leverage deep learning to help establish topical authority
Having topical authority means your site is a perceived expert on a given subject. This is one of the factors many SEOs believe is vital for improving rankings, which is why so many have leveraged AI’s capabilities.
Enge pointed to seoClarity, whose AI tool, Content Fusion, is designed to help brands write with more authority, to highlight these deep learning capabilities. “The approach is to leverage deep learning to identify entities and words that help you establish authority in a topic,” Enge said. “It extracts intent, entities, terms and potentially related topics. Then they apply their machine learning models that are specific to your market space.”
The deep learning capabilities offer marketers a clearer view of their brand’s area of expertise, which can then be used to further develop their web properties. Establishing an automated deep learning system can provide them with fresh data to help demonstrate E-A-T (Expertise, Authoritativeness, Trustworthiness).
Every AI integration will look different, but each one has the potential to streamline your SEO efforts through automation and machine learning.
“There’s an incredible amount of stuff happening out there with AI,” Enge said. “Some of it you can take into your own hands if you’re willing to do the programming; in other cases, you can use tools. It’s up to you.”
Much like a physical marketplace, the online search environment has both its successful businesses and those that fail to gain traction. Matt Colebourne, CEO of Searchmetrics, used the analogy of a “high street,” the main shopping street of a town, to describe the current state of search marketing in his presentation at SMX Next.
“Just as you have the winners and the losers in the physical space, you have the same in the digital space; page two of Google or any search engine is fundamentally the ‘backstreet,’” he said. “That’s where a lot less audience is going to end up.”
Marketers have long used search data to optimize their content so it meets user needs. But many fail to apply those same insights to inform decisions that impact the long game.
“A lot of companies make the mistake of optimizing for growth way too soon,” Colebourne said. “They settled for their current product set and their question becomes, ‘How can we optimize sales of what we have?’ Whereas the questions they should be asking are, ‘What are the sales that we could have? How much of our target market do we have right now?’”
Each day Google processes over 3.5 billion searches, which provides marketers with a wealth of data. Here are three reasons why analyzing this search data improves marketers’ decision-making processes.
Search data shows where your growth is coming from
“Currently, about 15% of search terms that appear on Google every month are new,” said Colebourne, “So, that starts to give you an inkling of the pace of change that we have to deal with. We see trends come and go in months, and some cases even weeks. And as businesses, we have to respond.”
Many organizations focus too much energy on driving growth while neglecting to determine where that growth is coming from. And in this digital age, there’s a good chance much of it is coming from search. This data offers marketers valuable insights, especially those relating to their industry segment.
“You have to understand how your industry and category is structured and ask the right questions,” he said. “If, for example, you sell specialty sports shoes, it doesn’t make a lot of sense to compare yourselves with Nike or similar companies who have much much bigger coverage, but may not be leaders in certain segments.”
It helps address your most significant decision-making challenges
Data — specifically search data — should be part of any company’s core decision-making process. To show how often brands use it, Colebourne highlighted a survey from Alpha (now Feedback Loop) that asked 300 managers how they made decisions.
“The question they asked was, ‘How important or not important is it to you to use data to make decisions?’” said Colebourne. “And I think nobody is going to be surprised by the results — 91% think data-driven decision-making is important . . . But the corollary to this question was, ‘How frequently or infrequently do you use that?’”
The answer was just 58%.
Clearly, knowing search data is valuable isn’t enough to be successful — marketers need to use these insights from searchers to make better business decisions. Otherwise, they’re going to miss out on a good source of traffic insights.
“65% of all e-commerce sessions start with a Google search,” Colebourne said. “I would argue that makes it a good source for decision-making. It’s a massive sample set, completely up to date, and it’s continually refreshed.”
Search data gives you more consumer context
“That [search] data — sensibly managed and processed — can show you the target market and provide you with the consumer demand,” said Colebourne. “It can show you if the market is growing or contracting.”
Analyzing search data can give marketers a clearer view of their consumers, especially for those groups they haven’t reached yet. Reviewing what people are searching for, how often they’re searching and how your competitors are addressing the challenge can make decision-making that much easier.
But more than that, marketers must look at the marketplace as a whole, using search data to inform decision-making.
“We’re all very focused on keywords and rankings and all these good things that we know how to manage,” Colebourne said. “But what we need to do is step beyond that and not just look at what we have or what competitors have, but look at the totality of the market.”
“Let’s look at the input search data to understand what the real demand is and how big this market is,” he added.
“There are roughly three and a half billion Google searches made every day,” said Craig Dunham, CEO of enterprise SEO platform Deepcrawl, at our recent MarTech conference. “According to research from Moz, 84% of people use Google at least three times a day, and about half of all product searches start with Google. The way that consumers are engaging with brands is changing, and it’s doing so rapidly.”
He added, “Consumers begin their journey with the tool that many of us use hundreds of times a day. Thus, the connection to revenue becomes clear — it starts with search.”
The concept of digital transformation has been around for years, but it’s taken a whole new form in the wake of recent societal shifts. New technologies and the 2020 pandemic have led to a “greater focus on the need to drive optimal digital experiences for our customers,” said Dunham.
A brand’s website is often the first, and most lasting, impression customers will have of an organization. Here are some strategic actions he recommends marketers take to ensure their online properties are optimized for the search-first age.
Educate your team and higher-ups about the necessity of organic search
“The website is a shared responsibility and it requires proper strategic leadership,” Dunham said. “The first step is to take some time and educate yourself, your leadership, your board and your organization so they more broadly promote organic KPIs as business-wide objectives.”
“There’s great data out there on the impact of the efficiency of SEO as a low-cost acquisition channel,” he added.
Aside from sharing communication from Google on the importance of search from a business perspective, marketers can look for case studies from reputable organizations to encourage search prioritization. This can help higher-ups start seeing organic traffic as a key business metric.
“I was in a meeting recently and I had a digital leader say to me that, you know, website performance should not be an SEO metric — it has to be a business metric,” he said.
Create a cross-functional search ops task force
“Much of the data and insight generated by SEOs and their tools today are rarely utilized to their full potential,” Dunham said. “This is in part due to SEO not being seen as a business priority. As a result, it’s been siloed — pulling in teams from across the organization breaks down those silos.”
The more team members are involved with search processes, the more they’ll see its impact. People from each department will have more opportunities to contribute to growing online visibility using their unique skillsets.
“We know that businesses that are able to implement these organizational-wide search operations systems and practices — connecting a range of perspectives and search activities that are happening — are going to be the ones that will have a competitive advantage,” said Dunham.
Apply SEO testing automation
More and more brands are turning to automation tools to streamline tasks. According to Dunham, these solutions can be used for search-related activities as well.
“Automation can be well-deployed within web development processes,” Dunham said. “Until recently, this technology didn’t exist.”
Brands now have access to a wide variety of automation tools to streamline SEO-related tasks. The key is to pick solutions that align with your organization’s goals and give you full control over their deployment: “There are additional risk mechanisms that can be put in place to ensure you don’t release bad code that will result in large traffic losses, ultimately driving down revenue across your critical web pages,” said Dunham.
If brands can optimize their internal process, teams and tools around organic search, they’ll increase their chances of achieving long-term success in the search-first digital landscape.
The Microsoft Bing team announced that the IndexNow protocol has reached the point where participating search engines co-share submitted URLs: if you use IndexNow to submit URLs to Microsoft Bing, Microsoft will immediately share those URLs with Yandex.
Co-sharing URLs. The promise of IndexNow was to submit a URL to one search engine via this protocol and not only will that search engine immediately discover that URL, but it will also be discovered on all the other participating search engines. Right now, that is just Microsoft Bing and Yandex, but Google is exploring using this protocol.
Microsoft said “the IndexNow protocol ensures that all URLs submitted by webmasters to any IndexNow-enabled search engine immediately get submitted to all other similar search engines. As a result of co-sharing URLs submitted to IndexNow-enabled search engines, webmasters just need to notify one API endpoint. Not only does this save effort and time for webmasters, but it also helps search engines in discovery, thus making the internet more efficient.”
Microsoft said that Bing “has already started sharing URLs from IndexNow with Yandex and vice-versa, with other search engines closely following suit in setting up the required infrastructure.”
When IndexNow first launched, the participating search engines had not yet begun co-sharing URLs; now they are.
IndexNow API. You also no longer need to submit URLs to https://www.bing.com/IndexNow?url=url-changed&key=your-key or https://yandex.com/indexnow?url=url-changed&key=your-key; IndexNow.org now accepts these submissions directly at https://api.indexnow.org/indexnow?url=url-changed&key=your-key
Microsoft Bing updated its help document to make it easier to understand how to set this up at any of the URLs mentioned above.
80,000 sites. Microsoft said that 80,000 websites are now using IndexNow for URL submission. “80k websites have already started publishing and reaping the benefits of faster submission to indexation,” the company said. Last November, the company said 60,000 of those websites were using IndexNow directly through Cloudflare, which added a toggle button to turn on this feature for websites using Cloudflare.
What is IndexNow. IndexNow provides a method for website owners to instantly inform search engines about the latest content changes on their website. IndexNow is a simple ping protocol that lets search engines know that a URL and its content have been added, updated, or deleted, allowing them to quickly reflect the change in their search results.
How it works. The protocol is very simple — all you need to do is create a key on your server, and then post a URL to the search engine to notify IndexNow-participating search engines of the change. The steps include:
Host the key in a text file, named with the value of the key, at the root of your website.
Start submitting URLs when your URLs are added, updated, or deleted. You can submit one URL or a set of URLs per API call.
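In practice, the “ping” in step two is a single HTTP request. Here is a minimal sketch using Python’s requests library against the Bing endpoint named in this article; the key and URL values are placeholders.

```python
import requests

KEY = "your-indexnow-key"                         # placeholder key
CHANGED_URL = "https://www.example.com/new-page"  # placeholder URL

# Ping one IndexNow-enabled endpoint; participating engines co-share it.
response = requests.get(
    "https://www.bing.com/IndexNow",
    params={"url": CHANGED_URL, "key": KEY},
    timeout=10,
)
print(response.status_code)  # a 2xx response means the ping was accepted
```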
Why we care. Like we said before, instant indexing is an SEO’s dream when it comes to giving search engines the most updated content on a site. The protocol is very simple and it requires very little developer effort to add this to your site, so it makes sense to implement this if you care about speedy indexing. Plus if you use Cloudflare, it can be turned on with the flip of a switch.
Now that co-sharing is enabled, you should see your content flow faster between Microsoft Bing and Yandex; hopefully, other search engines will adopt this protocol going forward.
Microsoft Bing has published a new WordPress plugin that makes it easy to integrate your WordPress blog or site with the IndexNow protocol. The plugin was released over the holidays and is available in the WordPress plugin directory.
What is it. The WordPress IndexNow plugin enables automated submission of URLs from WordPress sites to multiple search engines without the need to register and verify your site with them. Once installed, the plugin automatically generates and hosts the API key on your site. It detects page creation/update/deletion in WordPress and automatically submits the URLs in the background. This ensures that search engines will always have the latest updates about your site, Microsoft wrote.
What is IndexNow. IndexNow provides a method for website owners to instantly inform search engines about the latest content changes on their website. IndexNow is a simple ping protocol that lets search engines know that a URL and its content have been added, updated, or deleted, allowing them to quickly reflect the change in their search results.
How it works. The protocol is very simple — all you need to do is create a key on your server, and then post a URL to the search engine to notify IndexNow-participating search engines of the change. The steps include:
Host the key in a text file, named with the value of the key, at the root of your website.
Start submitting URLs when your URLs are added, updated, or deleted. You can submit one URL or a set of URLs per API call.
The WordPress plugin makes this easier and you don’t have to go through all these steps to set it up.
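For sites not on WordPress, the same protocol can be driven directly. The IndexNow documentation also describes a bulk form, an HTTP POST with a JSON body, which is roughly what the plugin automates in the background; the host, key and URLs in this sketch are placeholders.

```python
import requests

# Placeholder values; the key file must be hosted at the site root.
payload = {
    "host": "www.example.com",
    "key": "your-indexnow-key",
    "urlList": [
        "https://www.example.com/post-1",
        "https://www.example.com/post-2",
    ],
}

# One POST submits a whole set of changed URLs in a single API call.
response = requests.post(
    "https://api.indexnow.org/indexnow",
    json=payload,
    timeout=10,
)
print(response.status_code)
```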
How to install. To install the IndexNow WordPress plugin, follow these steps:
Log in to WordPress admin panel for your WordPress site. Click on ‘Plugins > Add New’.
Search for ‘IndexNow Plugin’ and install.
Once installed, click on ‘Activate’ to enable plugin.
Go to IndexNow admin page and click ‘Let’s Get Started!’.
Why we care. Like we said before, instant indexing is an SEO’s dream when it comes to giving search engines the most updated content on a site. The protocol is very simple and it requires very little developer effort to add this to your site, so it makes sense to implement this if you care about speedy indexing. Plus if you use Cloudflare, it can be turned on with the flip of a switch.
Now, if you have WordPress, you can follow the steps above to easily activate IndexNow.
In 2021, SEOs faced a flurry of Google updates (including the highly anticipated yet possibly-overhyped Page Experience update), new search results page features like continuous scrolling and countless other updates that could potentially affect visibility for their brands — all while operating amid the second year of the COVID pandemic.
From core updates and title change fiascos to improved shopping options and new ways of tracking data, this year was full of surprises. Here’s our look back at the most impactful SEO news, tactics and tools of 2021.
Core updates. Google released three major core updates — one in June, one in July and one in November. The first caused a lot of search volatility, with tools like MozCast reaching a temperature of 107.3°F on June 3. The July update continued this spike until it all died down around July 12.
On November 17, Google announced a third, somewhat surprising core update, just days before the Thanksgiving holiday, Black Friday and Cyber Monday. According to many tracking tools, this update had higher volatility than June and July’s. Due to the speedy rollout and widespread ranking shifts, many SEOs rightfully wondered why Google chose this time of year to release such a large update.
Page title rewrites. Of all the algorithm updates from this past year, the changes to Google’s page title rewrites in the search engine results pages (SERPs) were the most controversial. Marketers began noticing significant changes to their SERP titles around mid-August.
Following a slew of feedback claiming huge decreases in result quality, Google rolled back some of these updates later in September. But many sites still experienced major ramifications in the following months, including our own properties.
Spam updates. Google released a number of major spam updates throughout the year. The first set rolled out on June 23 and June 28, although there wasn’t much of a noticeable impact on rankings.
The second update, released on July 28, targeted link spam. Rather than penalizing sites with bad links, Google stated that it focused on ignoring those signals.
The final spam update rolled out in November. Google didn’t offer much detail on this update, but search volatility skyrocketed following the release.
Product reviews updates. This year, Google launched two updates to help combat spam and/or thin product reviews. The first update was released in April and the follow-up came in December. Both of these were designed to prioritize reviews with in-depth research, including “content written by experts or enthusiasts who know the topic well.”
Passage indexing. Google introduced passage indexing, an algorithm tweak that ranks segmented pieces of content on a page, to the SERPs in early February. Google now displays these passages as featured snippets and links users to that particular part of the page.
The year in SEO news
The SERP. Google added an “about this result” box to the SERP in February, giving searchers more information about their results. It expanded this feature in July.
In a similar fashion, Microsoft Bing launched Page Insights in November, which features a lightbulb icon next to each search result that gives searchers more details about them.
Google added free listings to its Hotel search in March. Later, in December, it allowed hotels to use Google Posts in a limited manner to extend their local reach.
Google also rolled out continuous scroll on mobile search in October, which seemed like it would encourage more clicks on results past page one. SEOs are still measuring what impact this change has had on CTR.
In November, the search company added features designed to give more visibility to local news content.
On the Microsoft side, Bing Search gained a new interface to make its results more appealing, including an infographic-like search panel and expandable search carousels. It also introduced “Make Every feature Binary” (MEB), a new algorithm model designed to help improve search relevance. And in October, the company released IndexNow, a cross-search engine collaboration with Yandex to set a protocol that would index any new content instantly.
COVID-related updates. As many marketers know, the pandemic has spurred on more interest in SEO as businesses search for new ways to connect with customers. This interest in SEO has remained high over the past year, but there were a number of additional trends. These included an increase in searches for local businesses and pandemic-focused topics.
In April Google announced that additional COVID-related travel advisory information would be shown in Google Search to assist with trip planning. It also expanded its Explore section for its travel site.
In December, Google began rolling out a search feature that lets users see if a doctor or healthcare facility takes their insurance — no doubt spurred on by the increased number of COVID cases worsened by holiday gatherings and the Omicron variant.
Yelp introduced “Proof of vaccination required” and “Staff fully vaccinated” profile attributes. It also added a health and safety measures community feedback feature to help consumers learn more about local businesses’ health and safety compliance. And, to help prevent customer confusion, it added a virtual restaurant attribute.
SEO documentation. Google published updates to its search documentation throughout the past year, though some of those changes weren’t officially announced.
The company quietly published new manual actions targeting News and Discover penalties in February. In June, Google offered an SEO guide to address HTTP status codes, network issues and DNS errors. And in October, it refreshed its search quality guidelines to expand on the concepts of YMYL content and lowest-quality content.
Microsoft also published a list of Bingbot IP addresses in November, making it easier for site owners to confirm when Bingbot is crawling their sites.
Diversity and inclusion. In response to the growing amount of hateful rhetoric and attacks against people of color, women, and other minority groups, industry leaders — both search professionals and brands — made pushes for change.
Google announced in February that it would be changing its policies toward diversity research, following its questionable firing of AI ethicist Timnit Gebru. Due to criticism of how the situation played out, the company said it would tie business goals more closely to inclusivity and diversity — and change how it handles employee exits.
In April, Yelp rolled out an Asian-owned business profile attribute in response to the recent rise in anti-Asian violence and xenophobia. Later, in May, the company introduced an LGBTQ-owned attribute option to celebrate pride month.
Third Door Media (the parent company of Search Engine Land and SMX) held the second annual Search Engine Land Award for Advancing Diversity and Inclusion in Search Marketing. The previous winner, Areej AbuAli, served as a judge, with Rejoice Ojiaku and hasOptimization earning the accolade in 2021.
We also put together a list of inclusive marketing resources to help marketers highlight their brand values. Besides being the right thing to do, becoming a more inclusive organization has been shown to be better for your brand.
Image and YouTube. In February, Google provided documentation on image SEO best practices. The advice was focused primarily on ranking well in Google Images, but marketers can apply many of the suggestions to image ranking in general.
YouTube, seeking to assist creators with their reach, added video chapter previews and auto-translate captions. And in December, it launched a new feature that automatically linked to places mentioned in videos, giving users even more context.
Structured data. In May, Schema.org launched its schema markup validator tool in response to Google deprecating its structured data tool. It’s for more “general purpose” use than Google’s Rich Results tool.
In August, Google updated its Article structured data help document to reflect changes to its author properties. It added an author URL property to more easily identify authors of articles.
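In practice, that change means an Article’s author object can carry a url property pointing at an author profile page. Here is a minimal sketch of what such markup might look like, built and printed with Python; the names and URLs are placeholders, so check Google’s current Article documentation for the full property list.

```python
import json

# Illustrative Article structured data using the author url property.
article = {
    "@context": "https://schema.org",
    "@type": "Article",
    "headline": "Example headline",  # placeholder
    "author": {
        "@type": "Person",
        "name": "Jane Doe",          # placeholder
        "url": "https://www.example.com/authors/jane-doe",
    },
}

# Emit the JSON-LD block to place in the page's <head>.
print('<script type="application/ld+json">')
print(json.dumps(article, indent=2))
print("</script>")
```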
Industry and legal news. After postponing the mobile-first indexing deadline — first moving it from September 2020 to March 2021 — Google decided to leave the deadline open-ended. It said that there are still many sites not ready to shift over due to unexpected challenges they’re facing.
Mozilla tested Bing as the default search engine for 1% of Firefox users, leading many SEOs to reconsider the importance of optimization for non-Google search engines.
DuckDuckGo pushed past 100 million searches in a single day on January 11, showing how important private search experiences are to a growing number of users. And in December, the company announced that it’s working on a desktop browser, further signaling its support for greater privacy in search.
The battle for data privacy continued throughout 2021 with additional legal actions brought against Google. On March 12, a California judge ruled that Google must face a lawsuit claiming it tracks users in Incognito mode. In response, Google released a court filing saying that it makes clear that “Incognito” does not mean “invisible.” And in November, Google managed to win a dismissal of the U.K. Top Court’s data privacy suit relating to iPhone users.
Google’s run-ins with policy hit issues across the board. In October, the tech giant faced allegations from 17 state attorneys claiming it throttled non-AMP ads to give AMP a boost. This, along with Google’s decision to remove the AMP requirement from Top Stories, led many publishers to reconsider using the format.
Google was also fined €500 million ($589 million) by the French Competition Authority for failing to comply with negotiations with news outlets. Later, it lost a key appeal against the EU’s €2.4 billion ($2.8 billion) fine against the company from 2017, which found that Google broke an antitrust law in how it promoted its search engine regarding shopping.
In December, the company came under investigation for alleged harassment and discrimination against Black female workers. The report said the regulator began looking into the company’s practices after formal complaints.
Google Search Console (GSC). In April, Google released a pilot tool in Search Console that allowed users to report indexing issues; it was fully rolled out in August. Google also added practice problem rich results data, providing more insights for education content publishers. We also saw an upgrade to the AMP debugging section, which now links users to the AMP page experience guide.
To improve accessibility and user experience, Google introduced a new design for Search Console in November.
On December 14, the Review Snippets rich results report was updated, reducing the number of review objects; namely, the top-level schema.org/Rating objects.
Google Analytics 4. Google announced changes to Google Analytics 4 that included integration with Search Console, new machine learning models and data-driven attribution features. Interestingly, the language in this update suggests that the company may be considering sunsetting Universal Analytics in the not-too-distant future.
Google also unveiled a new version of Analytics 360, the company’s suite of products designed for enterprise-level companies, using Analytics 4 as its foundation.
Bing Webmaster Tools. Microsoft released its Bing content submission API to all users. Unlike its URL submission API, this version lets users submit content, images and HTML to the index as well.
Google Question Hub. In January, Google opened up its Question Hub for US-based publishers — it’s been available to users in India, Indonesia and Nigeria since 2018. The tool “enables creators to create richer content by leveraging unanswered questions,” according to Google.
Retail and e-commerce
In April, Google began enforcing its policy requiring merchants to show the actual price of items throughout the entire checkout phase. The company also updated Google Merchant Center’s product data specification requirements to encourage optimized Google Shopping ads and organic listings.
Google Shopping and WooCommerce partnered together in June to help retailers show their listings across Google. The search engine also released an e-commerce SEO guide to help improve retail sites’ search visibility. These updates reflected the changing landscape of retail due to COVID-19.
In an effort to put more offers in front of users, Google added a “Deals” feed to the Shopping tab and Merchant Center. It also began showing retailers when their items were eligible for badges. And, in order to show relative visibility and other metrics, Google provided Merchant Center users with a relative visibility report.
In November, Bing Shopping introduced customer-focused tabs to help shoppers find what they were looking for in one place. This update also made it easier for retailers to list their products. And later in December, Microsoft Bing launched the Ethical Shopping hub in the UK, which helps users shop for eco-friendly and fair-trade fashion items.
Microsoft also announced a partnership with Shopify to integrate Bing Shopping with the retail platform, which rolled out in December.
Local
Google Business Profile (formerly Google My Business). Early in the year, Google released a tool to help businesses better manage reviews, enabling business owners to monitor the status of reviews they’ve flagged.
On November 4, Google announced it would be renaming Google My Business to Google Business Profile. Along with this update, the company released new features that would give marketers and business owners more control over their accounts, which include:
Claiming and verifying Google Business Profiles directly in Google Search and Google Maps;
Call history launching in the U.S. and Canada;
Messaging directly from Google Search; and
Message read receipts being controlled within Search and Maps.
Maps. The importance of local maps has only increased throughout the past year. We saw a deeper integration between these and local business profiles across the board.
Google Maps started showing price ranges for U.S. restaurants, adding to a rollout of new features focused on expanding indoor business directories, which included airports, malls and transit stations.
Microsoft Bing introduced a new feature that allows users to search local stores. It’s designed to enable searchers to check store stock, helping them choose whether to buy online and pick up in-store.
Later in November and early December, Google rolled out an update to how it ranked the local search pack and map pack results. Termed the “Vicinity Update,” the change drastically impacted local rankings across industries.
Local SEO tools. Google Business Profile Product Experts worked together to help users find unique listing identifiers. Using a Chrome extension called GatherUp, they showed profile managers how to find their business’s unique CID number, which is useful to know when listings are merged or duplicated.
To help local businesses expand their reach, Yelp rolled out Custom Location Targeting, budget recommendations and other helpful local features. It also introduced custom search filters, themed ads and Project Cost Guides to support service businesses.
Looking forward to 2022
With so many algorithm and platform updates taking place this past year, many SEOs will be anxious to look at their data. Just remember: many of these updates are broad, and the most important thing for you to do is to keep your clients updated on Google’s changes.
Many brands are responding to user demand for greater accessibility and increased privacy, so be sure your websites and other properties are compliant and support all kinds of users.
Finally, we’re still a long way off from the end of the pandemic, so focus on answering your audience’s most pressing queries and making things as convenient for them as possible. Showing customers your brand’s values is more important than ever.
Landing pages expire every day: information becomes outdated, products sell out, services are discontinued and entire communities sunset. How that expiring content is handled from an SEO perspective can greatly impact a website’s organic search rankings. Handled poorly, landing pages with expired content have the potential to kill the organic rankings of the website overall.
PageRank vs. user signals
A frequent argument website owners make for maintaining landing pages with expired content, especially sold-out products, is preserving the incoming external PageRank to the website. It is a false assumption that a landing page must remain indexable and return a 200 OK status code, even when the product or service is no longer available to users, in order to keep whatever authority or PageRank that page has accumulated over time. Doing so effectively means creating a soft 404 landing page: an error page with no relevant content that nevertheless returns a 200 OK status code instead of a 404 or 410 status code.
For a number of reasons, that strategy is a recipe for disaster. Firstly, the conversion rate, rather than presumed PageRank accumulation, ought to be the primary goal of a commercial website. After all, no publisher cares about their PageRank value, high or low, as long as conversions meet or exceed expectations. Secondly, PageRank cannot be gauged with any degree of accuracy. PageRank changes continuously as Googlebot crawls the web, and Google does not disclose the actual value for individual landing pages or websites. No external third-party tool can substitute for that value in any meaningful way. Finally, product landing pages rarely attract lasting, high-quality, merit-based backlinks to begin with. Effectively, the perceived PageRank loss is debatable, while the actual PageRank loss is negligible.
Soft 404s are bad for user experience and therefore a thorn in the side of search engines, Google in particular. This is why maintaining expired content landing pages, especially unavailable product pages, considerably magnifies the risk of poor user signals. Google has become more adept at identifying negative on-page language and can accurately detect strings like “unavailable,” “out of stock,” “0 results found” or “sold out.” Frequently, yet not always, it will highlight the problem as soft 404 pages in Google Search Console.

A further issue is that CTR is likely to suffer when the snippet itself highlights that a service or product is unavailable. Worse yet, if users are still compelled to click on results that turn out to be discontinued landing pages (also known as soft 404s), they are almost inevitably going to return to the search results, look for an alternative and/or refine their query. In doing so, users indicate with their click behavior that the experience was bad for them. As this “bounce rate” grows (often mistaken for, yet unrelated to, the Google Analytics or on-site bounce rate), the relevance of the website as a whole suffers in the organic search rankings.
Although PageRank remains an important ranking factor, it pales in comparison with the weight of the user signals search engines collect for rankings. While emphatically denying the use of specific user signals, such as Google Analytics data or dwell time, Google continues to favor websites that are popular with users. Weighed against each other, the PageRank argument does not stand a chance: PageRank remains elusive and at best a means to an end, while user signals directly and immediately contribute to the success of a website, with and beyond SEO.
The trends game
Google rankings, to a large extent, depend on SEO signal trends. For a large website with many millions of relevant landing pages, a few thousand expired content landing pages are unlikely to trigger a ranking loss. They are too few, relative to the whole, to decisively tip the trend of the website’s signals one way. For a smaller website of ten thousand landing pages in total, a few hundred expired indexable landing pages can already pose an SEO danger.
Ultimately, the decisive factor is trends measured in percentages, rather than the actual total numbers of indexable expired content or soft 404 landing pages. Which website ranks well and which one does not depends on a number of critical factors. These include, among other factors, the total volume of crawlable landing pages, their content quality, the overall trends involved and, most importantly, the user experience signals trends indicating user satisfaction.
There are no fixed thresholds that must be observed. Instead, trends are front and center when SEO signals, and therefore organic search rankings, are to be improved. The question of how well a specific website fares in this regard can only be answered by analyzing the website’s specific data, especially its server logs. This is why commercial websites with a sizable and changing product database must regularly perform technical SEO audits.
In-depth SEO audits are the only means of accurately gauging crawl budget management, or how long it may take for Google to re-crawl expired landing pages in order to register the changes applied. Only an SEO audit can help to identify whether expired content landing pages pose a problem and/or if it’s a serious one.
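As a starting point for that kind of audit, a short script can show how often Googlebot still requests expired URLs and which status codes it receives. This is a minimal sketch, assuming an access log in combined format at a hypothetical path and a hand-maintained set of expired URLs; a real audit needs verified-bot checks and far more rigor.

```python
import re
from collections import Counter

EXPIRED_PATHS = {"/products/old-widget", "/community/sunset-forum"}  # hypothetical

# Combined log format: IP, date, request line, status, size, referer, UA.
LINE = re.compile(r'\[(?P<date>[^\]]+)\] "(?P<method>\S+) (?P<path>\S+)[^"]*" '
                  r'(?P<status>\d{3}) \S+ "[^"]*" "(?P<ua>[^"]*)"')

hits = Counter()
with open("access.log", encoding="utf-8") as log:  # hypothetical path
    for line in log:
        m = LINE.search(line)
        if m and "Googlebot" in m.group("ua") and m.group("path") in EXPIRED_PATHS:
            hits[(m.group("path"), m.group("status"))] += 1

for (path, status), count in hits.most_common():
    print(f"{path} -> HTTP {status}: {count} Googlebot hits")
```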
Doing it the right way
Larger sections of a website that have outlived their usefulness but can’t be deleted, like sunset communities, can be moved off the domain, thereby boosting the main website’s trend signals. In that instance, 301 Moved Permanently redirects must either be established and maintained without an end date, or the old URLs must return a 404 status code so search engines know to discount the content.
Expired product landing pages, however, must not be 301 redirected to other landing pages, which would muddle user signals. Instead, when products or services are no longer available, the respective landing pages must return either a 404 Not Found or a 410 Gone HTTP status code. These status codes signal to Google and other search engines that the landing pages no longer provide what they used to, and they strengthen the user signals of the remaining 200 OK landing pages that still offer products or services.
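As an illustration of that rule, here is a minimal sketch of a product route in Flask that returns 410 Gone for discontinued items; the route, the lookup and the discontinued set are hypothetical stand-ins for a real catalog.

```python
from flask import Flask, abort

app = Flask(__name__)

DISCONTINUED = {"old-widget", "legacy-gadget"}    # hypothetical catalog data
LIVE_PRODUCTS = {"new-widget": "Our new widget"}  # hypothetical

@app.route("/products/<slug>")
def product(slug):
    if slug in DISCONTINUED:
        abort(410)  # Gone: tells search engines the removal is permanent
    if slug not in LIVE_PRODUCTS:
        abort(404)  # Not Found: never a soft 404 behind a 200 OK
    return LIVE_PRODUCTS[slug]
```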
There is, however, a way to legitimately capitalize on 404 error pages without taking the unnecessary business risk of confusing search engines or diluting user signals: enhancing 404 Not Found pages that still return the correct status code by supplementing the error page’s content with relevant, in-context information for users. These so-called smart or custom 404 landing pages must continue to address the fact that their main purpose, product or service is unavailable. But they can be augmented with relevant product alternatives and/or the results of an internal search based on keywords from the request URL, enabling users to continue their journey within the website, and the website operator to potentially still capture the lead. Custom 404 pages are not an SEO growth method, but rather a means of maintaining user satisfaction and improving conversions. When applied, they pose no SEO risk as long as the status code remains a 404.
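A custom 404 page along those lines can be sketched as an error handler that keeps the 404 status while surfacing alternatives; the keyword-matching logic here is a deliberately naive placeholder for a real internal search.

```python
from flask import Flask, request

app = Flask(__name__)

PRODUCTS = {"blue-widget": "/products/blue-widget",
            "red-widget": "/products/red-widget"}  # hypothetical catalog

@app.errorhandler(404)
def custom_not_found(error):
    # Naive stand-in for an internal search: match keywords from the URL.
    keywords = [part for part in request.path.lower().split("/") if part]
    suggestions = [url for name, url in PRODUCTS.items()
                   if any(k in name for k in keywords)]
    links = "".join(f'<li><a href="{u}">{u}</a></li>' for u in suggestions)
    body = ("<h1>This page is no longer available.</h1>"
            f"<ul>{links}</ul>")
    return body, 404  # keep the correct 404 status code
```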
Ultimately, whether expired content landing pages return 404, 410 or a custom 404 response, it is important not to block the URLs in the robots.txt. Doing so inhibits search engines from crawling and understanding the changes applied and can have an undesirable effect on user signals.
At the same time, internal links to expired content landing pages must be updated and eventually removed. Internal linking is among the most important on-page signals, indicating to search engines both relevance and importance from a crawl priority point of view, so there is no point in boosting landing pages whose content has expired.
Lastly, it is important to always keep in mind that 404 Not Found landing pages, no matter how numerous, will not impact a website’s organic rankings. No website ranks poorer or better because of, or despite, its 404 Not Found pages. Soft 404 landing pages, however, can not only impact rankings but also have the potential to drag down the entire website in organic search.
In less than two months since Microsoft announced IndexNow, more than 60,000 websites that use Cloudflare have turned the protocol on. IndexNow is an open protocol that any search engine can participate in to enable site owners to have their pages and content instantly indexed by the search engine.
Microsoft and Cloudflare announced today that more than 60,000 unique websites have opted in to Crawler Hints. “Those zones have sent Bing about a billion Hints for when specific assets on their websites have changed and need to be re-crawled,” the companies said. I turned it on for the Search Engine Roundtable, my personal search blog, when it was announced.
How to turn it on. It literally is controlled by the flip of a switch in Cloudflare: the Crawler Hints setting lives under the Cache tab, in the Configuration section.
Microsoft said that once this setting is enabled, IndexNow “will begin sending hints to search engines about when they should crawl particular parts of your website.”
Google may adopt it. Google said recently that it too will test the IndexNow protocol for indexing. So while Microsoft Bing and Yandex are the only two who have fully adopted it, if Google adopts it, you can expect other search engines to as well.
Why we care. Like we said before, instant indexing is an SEO’s dream when it comes to giving search engines the most updated content on a site. The protocol is very simple and it requires very little developer effort to add this to your site, so it makes sense to implement this if you care about speedy indexing. Plus if you use Cloudflare, it can be turned on with the flip of a switch.
Microsoft and Yandex announced a new initiative today named IndexNow, a protocol that any search engine can participate in to enable site owners to have their pages and content instantly indexed by the search engine. Currently, Microsoft Bing and Yandex are the two search engines fully participating in the initiative but others are welcome to adopt this open protocol.
IndexNow allows “websites to easily notify search engines whenever their website content is created, updated, or deleted,” Microsoft wrote on its blog. The goal is to make for a “more efficient Internet,” the company said, by reducing the dependency on search engine spiders having to go out into the web and crawl each URL they find. Instead, the goal is for site owners to push these details and URL changes to the search engines directly. “By telling search engines whether an URL has been changed, website owners provide a clear signal helping search engines to prioritize crawl for these URLs, thereby limiting the need for exploratory crawl to test if the content has changed,” Microsoft wrote.
How it works. The protocol is very simple — all you need to do is create a key on your server, and then post a URL to the search engine to notify IndexNow-participating search engines of the change. The steps include:
Host the key in a text file, named with the value of the key, at the root of your website.
Start submitting URLs when your URLs are added, updated, or deleted. You can submit one URL or a set of URLs per API call.
Submitting one URL is as easy as sending a simple HTTP request containing the changed URL and your key: https://www.bing.com/IndexNow?url=url-changed&key=your-key. The same works using https://yandex.com/indexnow?url=url-changed&key=your-key
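The key in step one is simply a token you generate yourself and host in a text file at your site root. Here is a minimal sketch of generating and hosting one; the web root path is a placeholder, and the exact key-format rules live in the IndexNow documentation.

```python
import secrets
from pathlib import Path

WEB_ROOT = Path("/var/www/html")  # placeholder web root

# Generate a random hexadecimal key and host it at /<key>.txt, where
# the file name and the file contents are both the key itself.
key = secrets.token_hex(16)
(WEB_ROOT / f"{key}.txt").write_text(key, encoding="utf-8")

print(f"Ping: https://www.bing.com/IndexNow?url=url-changed&key={key}")
```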
They work together. If you use the Bing endpoint, then both Bing and Yandex (or other participating search engines) will get the update. You do not need to submit to both Bing’s and Yandex’s URLs; pick one, and all search engines that are part of this initiative will pick up on the change.
The search engines share this IndexNow system, so if you notify one, that search engine will immediately ping the other participating engines in the background, notifying them all. In fact, it is a requirement of IndexNow that any search engine adopting the protocol must agree that submitted URLs will be automatically shared with all other participating search engines. To participate, search engines must have a noticeable presence in at least one market, Microsoft told Search Engine Land.
Similar to Bing URL submission API. Is this similar to the Bing URL submission API? Yes, in that the aim is to reduce crawling requirements and improve efficiency. But, it is different in that this is a completely different protocol. If you are using the Bing URL submission API or the Bing content submission API, technically Bing will get your URLs and content changes immediately but these two APIs do not work with the IndexNow protocol, so the other search engines won’t get the changes.
Will these APIs go away if and when the IndexNow initiative becomes more popular? That is unclear. The URL submission API would be somewhat redundant to IndexNow but the content submission API is unique.
Integrations. IndexNow is gaining support among third-party websites, with eBay, LinkedIn, MSN, GitHub and others moving to integrate with the IndexNow API. Microsoft said many have adopted the Microsoft Bing Webmaster URL submission API and are planning a migration to IndexNow.
Microsoft said it is encouraging all web content management systems to adopt IndexNow to help their users get their latest website content immediately indexed and minimize crawl load on their websites. In fact, Microsoft provided WordPress with code it can use to integrate IndexNow into its CMS. Wix, Duda and others plan to integrate with IndexNow soon as well. CDNs like Cloudflare and Akamai are also working with the IndexNow protocol, and so are SEO tools like Botify, OnCrawl and others.
What about Google. We were told that Google is aware of the IndexNow initiative and the company was asked to participate but at this point Google is not an active IndexNow participant.
Why we care. Instant indexing is an SEO’s dream when it comes to giving search engines the most updated content on a site. Google has been very strict about its indexing API, which is currently limited to job postings and livestream content. So while it seems Google may not participate in IndexNow in the near future, search engines like Microsoft Bing and Yandex are aiming to push this initiative hard.
The protocol is very simple and it requires very little developer effort to add this to your site, so it makes sense to implement this if you care about speedy indexing. It seems more and more search engines will participate but in terms of the big one, Google, that remains unclear.