Google is now letting anyone under the age of 18, or their parent or guardian, request the removal of their images from Google Search results. The removal request can lead to the image no longer appearing in the Google Images tab or as a thumbnail in any feature in Google Search, the company said.
Reputation management. This may make the process of removing images from Google Search easier, and online reputation management companies may benefit. Having more and faster ways to remove content from Google Search is likely welcome news for SEO firms, especially those that focus on reputation management.
How it works. Assuming you are under 18 and there’s an image of yourself that you want removed from Google results, you, your parent/guardian or an authorized representative (perhaps an online reputation management firm) can follow these steps:
Visit the help page for this new policy to understand the information you’ll need to provide when using the request form.
“After you submit the request, our teams will review it and reach out for any additional information we might need to verify it meets the requirements for removal. And we’ll notify you once we’ve taken down the image, if it meets the requirements,” Google said.
Why we care. Sometimes teenagers and kids do mindless and dumb things with their phones. Having these controls in place can help undo some of the harm. On a more professional level, this may give more tools for online reputation management firms to deal with some content removal within Google Search.
Google has added new support, data and features to the Google Search Console Search Analytics API. The API now supports showing data for Google Discover, Google News and also supports Regex commands — all of which were already supported in the web interface.
Google announced this morning this support has been added to the Search Analytics API after many requests from the industry to add it.
API updates. “The searchType parameter, which previously enabled you to filter API calls by news, video, image, and web, will be renamed to type and will support two additional parameters: discover (for Google Discover) and googleNews (for Google News),” the company said. Google is still supporting the old name searchType for the time being, so it is backwards compatible.
Also, Google explained that “some metrics and dimensions are compatible only with some data types; for example, queries and positions are not supported by the Google Discover report.” This limitation also applies to the API, which will return an error message for unsupported combinations.
Regex API support. Google has added Regex support to the API, specifically to the query and page dimensions. Two new operators have been added to the existing match operations: includingRegex and excludingRegex.
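To make the API changes concrete, here is a minimal sketch of what a request body using the new `type` parameter and Regex operators might look like. The property URL, dates and regex pattern are placeholder values, and the `google-api-python-client` usage in the comment is one common way to send such a body, not the only one.

```python
# Sketch of a Search Analytics API request body using the new `type`
# parameter and the new Regex filter operators. All values below are
# placeholders for illustration.

def build_discover_request(start_date, end_date, page_regex):
    """Build a query body for Google Discover data, filtering pages by regex."""
    return {
        "startDate": start_date,
        "endDate": end_date,
        "type": "discover",  # replaces the old searchType parameter
        "dimensions": ["page"],  # query/position are not supported for Discover
        "dimensionFilterGroups": [{
            "filters": [{
                "dimension": "page",
                "operator": "includingRegex",  # excludingRegex also exists
                "expression": page_regex,
            }]
        }],
    }

body = build_discover_request("2021-09-01", "2021-09-30", r"/blog/.*")
# With google-api-python-client, this body would typically be passed to:
# service.searchanalytics().query(siteUrl="https://example.com/", body=body).execute()
```

Because the old `searchType` name is still accepted for now, existing scripts should keep working while you migrate to `type`.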
Already in the web interface. As noted above, these features have been in the web interface for a while; Google has now brought support to the API. Google News performance reports were added in January 2021, Google Discover performance reports gained full data in February 2021 and Regex support was added in April 2021.
Why we care. Many of you use APIs to help automate and streamline your day-to-day SEO practices and reporting. Having access to these additional data points and adding in Regex controls should make these reporting tasks easier and more automated.
This should save you time to do other SEO-related tasks, tasks you might have a harder time automating.
In August, Google introduced a new system for generating title links (the title of a search result in Google Search). “This is because we think our new system is producing titles that work better for documents overall, to describe what they are about, regardless of the particular query,” the company explained.
However, during the new system’s initial rollout, SEOs provided example after example after example of titles that not only failed to describe what the page was about, but may also confuse users and deter them from clicking through. Fortunately, the situation has since improved, but placing blind faith in Google’s new system can mean that you’re ceding control over a crucial aspect of your content, which could ultimately affect your business. Below, you’ll find a synopsis of how Google’s title changes have evolved, how you can verify whether your titles have been changed and what you can do to regain control over them.
Title changes: Then and now
A tale of two title changes. Google has been adjusting title links for a long time. In 2014, the company explained that it might change a title to match the query (to a certain extent). This is an important detail because Google would later cite these historical practices as precedent for its new system — a justification that some SEOs found misleading as the magnitude and impact of the changes contrast sharply.
“[More recently,] I’m rarely seeing examples in the wild of noticeably worse rewrites for large-scale sites that I’ve done in-depth audits for,” said Brodie Clark, Australian SEO consultant. “This was definitely not the case initially (for about a month post-update), but Google seems to have since turned down the dial and made the update work as intended.” The other SEOs who spoke to Search Engine Land for this article shared similar experiences.
The first weeks of the title change rollout. When the new title change system rolled out in August, SEOs took to Twitter to share examples of poorly rewritten titles in the search results. “While many of the title overwrites made sense and were unlikely to negatively affect performance, there were many (too many) examples of title overwrites gone awry,” said Lily Ray, senior director, SEO and head of organic research at Amsive Digital.
SEOs feared that rewritten titles might be inaccurate or simply worse than what was in the title tag. While the title changes do not affect rankings, the title itself can influence clickthrough rates (CTR), thus also potentially impacting business KPIs such as revenue. Consequently, Google’s botched rollout of title changes fueled a movement among some SEOs who demanded a way to opt out of the changes.
At one point, Danny Sullivan, a cofounder of Third Door Media (Search Engine Land’s parent company) and now public liaison for Google Search, also advocated for a similar feature: “As a site owner, I hate this. I want Google to use whatever page title I give it. Google argues back that it has to be creative, especially in cases where people have failed to provide titles. I’ve argued in the past that as a solution, Google should provide site owners with some type of ‘yes, I’m really really sure’ meta tag to declare that they absolutely want their page titles to be used.”
The nature of Google’s title rewrites. “It appeared that Google was truncating some article headlines in strange ways that changed the meaning of the title,” said Ray. “In other cases, it seemed that punctuation, like quotation marks or dashes, caused the title to break early. In even rarer and stranger situations, Google would choose anchor text or other article text to display as the title, which was occasionally taken out of context and was a poor representation of the full page content.”
“[Google] seemed to latch on to any type of header tag and really didn’t like the pipe character and overt branding,” said Colt Silva, SEO engineer at iPullRank. During our own analysis of Search Engine Land titles that changed in the search results, we also noticed that Google had a proclivity for removing the pipe character.
Google has since improved its system for rewriting titles (more on that below). To illustrate some of the types of title rewrites that we’re still seeing in the search results, Clark assembled a collection of examples from Search Engine Land article titles.
Here is Clark’s analysis of some of the changes:
There were many instances of adding the site name with a hyphen when the site name wasn’t included in the <title> tag. And additional examples where the vertical bar with the site name was changed to a hyphen (a commonality among sites that I’m seeing). Interestingly, most SEOs still prefer the vertical bar post-update.
Long title tags that result in truncation are simplified at key sections of the snippet. For example, #3 has the removal of a complete section of the title link, with the site name then added in with a hyphen.
Complete replacement was happening rarely for Search Engine Land, but there was the odd instance where Google was replacing the title link, such as for #4. In this example, the H1 was taken to replace what had been written in the <title> tag.
Google has since improved its title rewrites. After the initial blowback from the SEO community, Google’s Danny Sullivan published a post explaining why Google made the title changes. Several weeks after that, the company published more help documents on controlling titles and descriptions in Search. Just as important, Google’s explanations seem to be accompanied by improvements to its title change algorithm.
“Fortunately, many people submitted feedback and examples to Google, which caused them to acknowledge that they were still refining the title change,” Ray said. “Since then, it’s clear Google has made improvements to the title overwrites and even reverted many of the worst offenders back to their original <title> tag.”
“As soon as title-change-mania started, we saw one of our biggest e-commerce clients have 5% of their title tags changed without any real effect on their CTR,” Silva said. “Shortly before the Google announcement of rollbacks, we saw it drop to 2%. The client was concerned about a couple of high-traffic keywords, but those have since been rolled back and it’s no longer a point of discussion in any of our meetings.”
What to do if you suspect Google is changing your titles
If you’ve noticed fluctuations in your CTRs, it may be worthwhile to investigate whether Google has changed your title link. SEOs and tool providers have come up with numerous ways to do this — we’ll discuss a few of them below.
“Essentially, you’ll need a way to start tracking and trending titles. You’ll need to collect your site’s popular search terms, and then gather the Google SERPs title and compare it to the actual title,” Silva said, adding, “This Search Engine Land article is a solid highlight of options to track title changes. In addition to that, there’s Thruuu, Keywords in Sheets solution, and this creative bookmarklet to inject titles into a SERP.”
Ahrefs users also have a new tool that enables them to export title changes for deeper analysis. Brodie Clark has provided instructions on how to get started with it and how he analyzes the data.
The new tool is in the “Top pages” tab underneath the “Site Explorer 2.0” heading. Once you’re there, you’ll have to toggle the “SERP titles” button and change the date for comparison. Next, you can export the data for analysis.
“There are important aspects to keep in mind when interpreting the data to ensure you’re getting an accurate depiction,” Clark said, recommending that SEOs remove new URLs and URLs that are no longer ranking so that they’re only looking at titles that are eligible for comparison.
“Changing the grouping of the rows to the top pages based on est. traffic that has had a title link change, we can see trends for what has changed,” Clark said. At this point, you’ll have to perform a manual review. “When completing the manual review, you’ll also need to look out for titles that have manually changed for pages during the comparison period,” he added.
What you can do if you’re unhappy with how Google changed your titles
Some titles may still be unsatisfactory — it can be argued that the example in line #3 from the chart above is less informative than the original title, for example. Unfortunately, there is little you can do to directly change Google’s title links, but embracing a more holistic view of the issue can help you craft more informative titles and avoid bad rewrites from Google.
One thing you can do to bring a particularly inaccurate title change to Google’s attention is to submit feedback: “Google created a form where you can submit your feedback for incorrect or egregious titles,” Lily Ray pointed out. “Otherwise, pay attention to when the overwrites take place and what they look like; this could provide insight into potential issues Google may have with your titles and offer some inspiration about how to adjust them. Google also offers clear examples about the types of titles it intended to overwrite, so you can evaluate whether your titles fall into any of those categories.”
“If you’re seeing poor title links for your site, try to look at it from a non-biased point of view,” Clark recommended. “Are you keyword stuffing? Is the title accurate enough? Is the text using too much boilerplate content? All are important aspects to consider before making the judgment that Google has done something wrong. If you’re confident that they are at fault, try to make the on-page content more closely aligned with what you’re trying to achieve.”
“This is the perfect opportunity to start testing,” Silva said, recommending that SEOs “follow the scientific method from hypothesis to conclusion and find why an algorithm has latched onto a specific block of text [to replace your title tag].” Since you’re likely in control of your title tags and on-page content, you can use these levers to see what works for your business, your audience and Google’s algorithms.
The more things change, the more they stay the same
For SEOs. We’re now accustomed to optimizing for rich results, featured snippets, knowledge panels and dozens of other non-traditional search features, but titles — as Google has now reminded us — are one of the oldest forms of on-SERP SEO. While its system for title links has improved, SEOs will have to double-check their titles moving forward to ensure that they follow Google’s guidance so users see what we intend for them to see, instead of an inferior title scraped from anchor text, for example. This will simply have to become part of your workflow and best practices will adapt to account for Google’s title changes.
For the industry. We rely on Google for traffic and Google relies on us for content to show users. When the title changes rolled out in August, Google said the practice wasn’t new, which was only half-true: the search engine has long been known to replace titles, but had never done so to the extent we’ve recently experienced. What’s more, it was showing title links that could have confused users and deterred them from visiting our pages. It cannot be said for certain that SEOs holding the company accountable for the flaws in its new system was what moved the needle and got Google to “revert many of the worst offenders back to their original <title> tag,” as Ray put it, but it is something search marketers will likely have to continue doing to advocate for our businesses and the audiences we serve.
At this year’s SMX Report, I provided an overview of SEO Issues that hold us back from achieving our goals. I took a holistic look at resources, communication and mental constructs around SEO that often hinder progress.
Oftentimes we look for quick fixes that drive major ranking improvements. These still exist, but the relationships that connect us to clients, and the website to users, are where the most sustained value can be found.
Here are some questions to ask before we even get started with the fixes:
Is the company ready?
At LOCOMOTIVE, we work with a wide range of clients. One of the key benefits is we see a wide range of problems, and we also develop insights on how different companies handle SEO from an implementation perspective. Three of the key factors for our clients with the greatest YoY organic wins are:
Sufficient resources to implement,
Acceptance of value of SEO across various stakeholding teams,
Openness to testing and failing.
Technical SEO recommendations will impact a wide range of teams across your organization, from developers to content teams and more. If your teams are currently barely keeping up with a two-year backlog of issues, fresh Technical SEO recommendations will probably never see the light of day.
I once had to make a business case to an IT lead for the SEO team on having access to Search Console and Google Analytics. The level of distrust of SEO was so high within the IT team, they actively worked to thwart any requests from our team. This was at the kickoff of our engagement. If you have teams actively working against your SEO priorities, it will not work.
If it takes you six months to build an accepted business case to add two lines to the site’s robots.txt file to exclude paths with a poor content experience, then you will be very limited in what you are able to accomplish with SEO.
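For illustration, the two lines in question might look something like this — the paths are hypothetical, and the right rules depend entirely on the site:

```
User-agent: *
Disallow: /search-results/
Disallow: /print/
```

Two lines like these can keep crawlers out of thin or duplicate page paths, which is why a months-long approval cycle for such a small change is a sign the organization isn’t ready.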
Is the SEO team ready?
It is not always clients creating a bottleneck to SEO growth. It is often the SEO agency team. Every SEO team should focus strongly on the following three areas:
Communicating issues clearly,
Prioritizing projects well,
Testing and reporting on impact.
If you have ever sent a client raw output from Screaming Frog as a CSV and asked them to fix the 32,000 301 URIs, you are doing it wrong. This tells their teams:
I don’t value your time.
I don’t understand what is required to fix these issues.
It is on the SEO consultant to go through this list, look for site-wide 301s in the footer, clean up parameters (e.g., https://example.com/?sid=12345) and provide the client a clear, concise list of 301s that must be handled in themes or components, and 301s that must be resolved in content.
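That parameter cleanup step might look something like this in Python. The URLs and the `sid` parameter name are made-up examples standing in for a crawler export:

```python
# Sketch: condense a raw list of redirecting URLs (e.g. from a crawler
# export) into a deduplicated list with tracking parameters removed.
# The "sid" parameter name is just an example.
from urllib.parse import urlsplit, urlunsplit

def clean_redirect_list(urls, junk_params=("sid",)):
    cleaned = set()
    for url in urls:
        parts = urlsplit(url)
        # Keep only query parameters that aren't session/tracking noise
        kept = [p for p in parts.query.split("&")
                if p and p.split("=")[0] not in junk_params]
        cleaned.add(urlunsplit((parts.scheme, parts.netloc, parts.path,
                                "&".join(kept), "")))
    return sorted(cleaned)

urls = [
    "https://example.com/page.html?sid=12345",
    "https://example.com/page.html?sid=99999",
    "https://example.com/page.html",
]
print(clean_redirect_list(urls))  # three raw rows collapse to one clean URL
```

A pass like this can turn tens of thousands of raw rows into a short, actionable list before it ever reaches a client’s dev team.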
To help prioritize issues in our tech audits, we like using a tool called Notion.
Notion allows us to prioritize all issues for clients, create views that are filtered to the relevant teams and, finally, add very clear information on the what, why and how as tickets to resolve each issue.
Outside of technical audits, we use the ICE (impact, confidence, ease) method to help qualify and prioritize recommended growth projects. This allows you and the client to quickly prioritize quick wins versus projects that will take significant resources.
Finally, showing the value of the projects you have worked on is critical to gaining the trust of internal teams as well as resources to tackle larger projects.
Using Google Data Studio can make this very easy and efficient. By creating easy-to-reuse templates, regular expressions for URIs, and sharing with the right stakeholders, you can quickly and easily show the value of the work to gain buy-in for larger projects.
SEO Issues
Most SEO issues can be broken down into a few categories. I like talking about these items, rather than using technical jargon because it helps relate back to things that are meaningful and understandable for non-SEOs.
Links
Links, specifically <a> elements, are a vote that another URI is important. If you posted something to Reddit, you would not expect the post to gain wide visibility after receiving only one upvote. Links to pages on your website are similar.
Links are also discovery mechanisms for search engine crawlers. They help them find good, as well as bad, URIs. It is our job as SEOs to help them discover the good, but keep them from discovering anything bad. In this case, bad could mean a URI with no content, or a page specifically designed for your logged-in users. Essentially, “bad” are pages you don’t want users to find.
Having an up-to-date dynamic XML sitemap is the first step in solving for the “good” URIs. XML sitemaps help search engines find the content you want them to show to their users. An XML sitemap should list ALL URIs you want users to find on your website and nothing else.
Google gives site owners a tool called the Coverage report in Search Console. It will show you URIs that are indexed but aren’t in your sitemap. If your XML sitemap contains all the “good” URIs, then this is a good place to look to see why other URIs are being indexed, and whether they should be.
The Coverage report will also show you URIs that were submitted in your sitemap but that Google decided not to include in its search results. This is often because your robots.txt file or a meta robots tag is telling search engines you don’t want them to show the URI. In other cases, the URI is either not the best URI on your site for its topic, or the URI doesn’t align as a good answer to topics where there is searcher demand.
You can use Google Analytics as ground truth for all pages being found by users. Again, using your XML sitemap as the baseline for your “good” URIs, comparing the pages users are landing on from search results (organic) to the pages in your sitemap is a good exercise to find URIs you should include in your sitemap, or that you should be excluding.
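That sitemap-versus-landing-pages comparison boils down to two set operations. A minimal sketch, with entirely made-up URL lists:

```python
# Sketch: compare "good" URIs from the XML sitemap against organic
# landing pages from analytics. Both lists are illustrative.
sitemap_uris = {
    "https://example.com/",
    "https://example.com/services/",
    "https://example.com/blog/post-1/",
}
organic_landing_pages = {
    "https://example.com/",
    "https://example.com/blog/post-1/",
    "https://example.com/old-campaign/",  # getting traffic but not in sitemap
}

# Pages users land on that aren't in the sitemap: add or exclude them
missing_from_sitemap = organic_landing_pages - sitemap_uris

# Sitemap pages that never receive organic landings: investigate why
no_organic_traffic = sitemap_uris - organic_landing_pages

print(missing_from_sitemap)
print(no_organic_traffic)
```

Each difference is a worklist: the first set needs a sitemap or exclusion decision, the second needs a look at why the page isn’t earning organic landings.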
If search engines are finding URIs they shouldn’t find, you should consider:
Removing links to the URIs.
Blocking search engines from accessing the path or URI in robots.txt.
Blocking access to the URI at the server level (e.g., 403 Forbidden).
Removing the URI via the Search Console removal tool.
It is worth noting that site owners should carefully consider options 2 and 3 above, as blocking a URI in robots.txt will prevent search engines from reading and processing a meta robots or header noindex directive.
If search engines aren’t finding a URI that you want users to find, consider:
Adding more links from other pages to the URI.
Asking other websites to link to the URI.
Including the URI in your XML sitemap.
Content
Content is the most abused word in SEO. It is treated as an unhelpfully broad noun in most communications from Google and other SEOs: “Just make your content better.” What if we treated it as a verb instead — to content, as in to satisfy? Content is not text. In fact, there are millions of pages ranking right now with very little written content. Content enhancement by adding some entities or LSI keywords is not a far step from the keyword spamming of years past.
One of our biggest visibility wins in the last two years was simply adding a downloadable PDF to certain pages where a PDF was strongly associated with the user intent of the page. Content is all about listening and designing an experience that satisfies what the user was trying to learn, or do as clearly and efficiently as possible.
From a technical standpoint, there are things we can do to help us measure whether pages are satisfying users.
Parameters
Many companies use parameters to track usage across a website, or perform other functions like sorting content, or establishing user state. This can lead to situations where we have many URIs representative of the same page being tracked in reporting tools.
In the example above, Google has sent traffic to two separate versions of the same webpage based on the site’s usage of a sid parameter in internal links. This makes our lives harder as marketers because instead of seeing this page has 880 user sessions and is an important page, the data is fragmented across multiple URIs.
The fix here is updating internal links to remove unnecessary parameters.
It is important to note that internal links will almost always be a stronger signal for search engines than a canonical link element. Canonical link elements are a hint at the correct URI version. If Google finds forty internal links to https://example.com/page.html?sid=1234, even if you have https://example.com/page.html specified as the canonical version, the linked version will, most likely, be treated as the correct URI.
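As a purely illustrative fragment, the conflicting signals described above might look like this in a page’s markup (the URLs and parameter are invented):

```html
<!-- The canonical hint says page.html is the preferred version... -->
<link rel="canonical" href="https://example.com/page.html">

<!-- ...but internal links that keep the sid parameter send a stronger,
     conflicting signal to search engines -->
<a href="https://example.com/page.html?sid=1234">Read more</a>
```

Updating the anchor’s `href` to the clean URL resolves the conflict at its strongest source, the internal link.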
Feedback
Incorporate feedback mechanisms into your pages that report back to your analytics tools.
Using this feedback can help you to sort pages that have the following issues:
Outdated content,
Didn’t answer the user’s question,
The wrong page is linked to in navigation,
The content is confusing or the wrong media is used.
Custom Metrics
Consider using custom metrics like read time, persona, jobs-to-be-done, logged-in vs. logged-out content to enhance how you report on your pages in analytics tools.
Site Search
Ensure you are tracking site search queries in your analytics tools. Site search, over and over again, has proven to be a wonderful tool to diagnose:
Important pages that should be in the navigation.
Content you should be covering but are not.
Seasonal trends or outlier issues.
Cannibalistic Content
Cannibalistic content is problematic because you can lose control of the designed experience for users and search engines can get confused and rotate through the URIs they show to users for specific search terms. Combining highly similar pages is a great strategy for users and for search engines.
If you click on a single search query in Search Console, Google will show you all the pages on your site that have competed for that query over the given time frame.
This can sometimes be confusing because in many cases Google may display site links in search results causing multiple URIs to display in search results for queries.
Focusing on non-brand queries (search queries that don’t contain a discrete brand or product name) is often more fruitful for finding pages with highly cannibalistic content. If you are handy with Python, this data can be pulled from the Search Console API, and you can create spreadsheets that give counts of the number of URIs receiving clicks for the same query.
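That counting step is a simple group-by. A rough illustration, where the rows stand in for data already exported from the Search Console API (queries, pages and click counts are all invented):

```python
# Sketch: count how many distinct URLs receive clicks for the same query.
# The rows below are made-up stand-ins for Search Console API output.
from collections import defaultdict

rows = [
    {"query": "technical seo", "page": "/services/technical-seo/", "clicks": 40},
    {"query": "technical seo", "page": "/blog/what-is-technical-seo/", "clicks": 25},
    {"query": "technical seo", "page": "/blog/seo-basics/", "clicks": 3},
    {"query": "log file analysis", "page": "/blog/log-files/", "clicks": 12},
]

pages_per_query = defaultdict(set)
for row in rows:
    if row["clicks"] > 0:
        pages_per_query[row["query"]].add(row["page"])

# Queries served by more than one URL are cannibalization candidates
candidates = {q: sorted(p) for q, p in pages_per_query.items() if len(p) > 1}
print(candidates)
```

From here, each flagged query is a candidate for consolidating the competing pages into one stronger URL.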
Screaming Frog now has a content duplicates report that will allow you to crawl your website and quickly review content that is either exact or near-duplicate.
Finally, giving SEM teams a path location to place paid landing pages is a good strategy to ensure that disparate teams are not inadvertently creating cannibalistic content.
Experience
The experience that users have on websites can affect visibility as well as revenue. In many cases, if a change is good for revenue, it will also be good for visibility as search engines incorporate experience more into their understanding of metrics that quantify user satisfaction.
Page Speed
Two of the best ways to make page speed a priority for a company are to align it with revenue loss or to position it as a way a competitor is beating them.
Google Analytics has limited page speed metrics and, for smaller sites, can give strongly biased averages from small timing samples. But by increasing the timing sample rate and working to align metrics like document interactive time with meaningful revenue decline, you can get the ammo you need to get speed work prioritized.
One of my favorite reports to share with developers is the Measure report from web.dev. Not only does it provide an overview with prioritized issues and guides to resolution, but it also links to a Lighthouse report so developers can drill down into the details of individual issues.
Web.dev also provides a link to a handy CrUX Data Studio dashboard that will make it easy to see improvements and celebrate with a larger set of internal stakeholders.
Microsoft Clarity
Clarity is a wonderful free tool from Microsoft that connects from Bing Webmaster Tools and provides a rich tapestry of experience metrics as well as individual session recordings. There is no better way, in my opinion, to understand user experience than reviewing session recordings. You can see when people are reading, see when they are having to close 15 popups, see when the hamburger menu keeps closing unexpectedly, and see if there are other things getting in the way of something you want them to do.
Understanding Intent
In your analytics tool, utilizing the second page and exit page in landing page reports can give you really good information on what users want and their path to get it. Does the landing page include links to the information they were looking for? Did users end up navigating to another page for an answer that should have been answered on the landing page?
Hidden Issues
Getting in the habit of opening the Developer Tools Console in Chrome when visiting pages is a good way to spot hidden errors that may be impacting users or your metrics.
Errors here can lead to:
Incomplete tracking information,
Missing content,
Insecure pages,
Poor page performance.
Relevance
Relevance, to me, is how well a page aligns with and covers what the user was looking for. This is not at a keyword level; rather, does the page provide the answer or solution to the overriding intent behind the user’s search query?
In Google Data Studio, you can quickly pull user searches and landing pages along with other informative metrics like clicks and impressions.
Downloading these to a CSV and using a simple pivot table in Excel or Google Sheets allows you to get a high-level view of what the best relevance engine on the planet, Google, thinks your page is about.
Since this page on the Locomotive website is designed to sell Technical SEO services, we can quickly see where the page is relevant for things that maybe it shouldn’t be.
This is an opportunity for us to update the page with more text describing the type of analyses we offer, speak to the benefits of Technical SEO, and talk about our credentials as an agency. The searches in green (below) are aligned with the goal of the page.
The items in red (below) are an opportunity for us to produce more educational content which goes more into the details and mechanics of Technical SEO.
In addition, understanding your authority and expertise, as a search engine would see it, is critical to understanding what you can be relevant for. Around 2019, Google started elevating rankings for some absurdly unoptimized sites. Many of these were local government sites that had never seen an SEO and rarely a developer or designer. Google had simply gotten better at understanding the authority attributed to websites.
Search engines can also use the entire corpus of a site’s text content, authors, links and more to understand the expertise that a site has for a given subject area. Writing new content that is aligned with your website’s subject matter expertise, or your civic authority, will almost always perform better than content that isn’t. This also aligns with the concept of “a rising tide floats all boats”: over time, the more you demonstrate your expertise in new content, the more additive impact it has on all content in the subject area.
Finally, the last two areas around relevance include knowing what you can be relevant for, and understanding when Google adds relevance for you.
If I worked at an energy company and it was suggested that we name a new product plan “unlimited utilities”, unless substantial monies were applied to awareness of this name, it is very unlikely that users would ever find our landing pages via Google searches due to Google’s understanding of this as a navigational term for a specific company.
I like to think that Google just includes things that it knows about me into my search text. In the example below, Google knows that I am in Raleigh, so they included a +raleigh into my search.
I am sure it is more complex than this, but in terms of a mental construct, it is useful to consider that Google provides your location, search history, etc. into the processing of your search to provide results more tailored to you.
Wrapping Up
Effective SEO requires a holistic approach. These are the key elements to coming at it from every angle:
Companies need the team buy-in and resources to succeed with SEO.
SEO teams should focus on clarity of communication and efficient prioritization.
The key areas to consider in SEO strategies are Links, Content (page satisfaction), Experience, and Relevance.
GIGO (garbage in, garbage out) is a real thing. Taking the time to go slow with accurate XML sitemaps, custom metrics, user feedback mechanisms, etc., can make your life easier and give you data to inform growth.
Spend some time watching user sessions. You will thank me.
Work hard to ensure your pages solve a problem or provide the right answer.
Look at how your page’s content aligns with user searches provided by Google.
Write to support and build your site’s subject matter expertise. Credibility is key.
Want to watch the full session and others from SMX Report on-demand? Register here.
Search Engine Land’s daily brief features daily insights, news, tips, and essential bits of wisdom for today’s search marketer. If you would like to read this before the rest of the internet does, sign up here to get it delivered to your inbox daily.
Good morning, Marketers, tomorrow night commemorates the 9th anniversary of the death of my mother. In the Jewish world it is called a yahrzeit and it got me thinking about change and the change we experience with death and of course over the past year and a half with COVID.
Often, SMX East fell on the same week as the yahrzeit and this and last year, we didn’t have an in-person SMX event. In fact, last year, the Javits Center was a makeshift hospital for COVID patients. In 2019, the last time the show was in-person, I was thankful for several Jews who helped me hold a small prayer service in memory of my mother at the show. Who would have thought, just a few months later, that venue would be transformed into a hospital?
Change is not always bad. In fact, virtual conferences have given many professionals who were unable to fly to an event the opportunity to showcase their knowledge. As someone who has been involved in search conferences for almost 20 years, it is amazing how the industry has adapted to change — for the better. Oh, and even for my mother’s yahrzeit — we went virtual by offering a Jewish app in her memory.
How do you embrace change?
Barry Schwartz, The good son
Google search quality guidelines updated
Google has finally updated the company’s search quality raters guidelines after more than a year without changes to the document. This time Google expanded on the YMYL category, clarified what constitutes lowest quality content, simplified the definition of upsetting-offensive, and refreshed and modernized the overall document with minor updates throughout. In fact, the old document was a 175-page PDF; the new one is 172 pages.
Why we care. Although search quality evaluators’ ratings do not directly impact rankings (as Google clarified in the document), they do provide feedback that helps Google improve its algorithms. It is important to spend some time looking at what Google changed in this updated version of the document and compare that to last year’s version to see if we can learn more about what websites and web pages Google prefers to rank. Google made those additions, edits and deletions for a reason.
Microsoft Advertising is introducing Health insurance ads as a pilot program, the company announced Tuesday. The new format is now eligible for advertisers targeting U.S. customers.
Why we care. Health insurance ads can help health insurance providers get in front of searchers, which may be especially important given the upcoming annual enrollment period. Additionally, Health insurance ads are dynamically generated, which may help advertisers save time. This is the fourth vertical-specific ad type Microsoft Advertising has introduced this year and they are all similar in that they’re intent-triggered, appear on the right-hand rail of results and are dynamically generated based on a feed. Maintaining this formula across ad products can also help PPC professionals, particularly those at agencies, more easily get this ad type going for different clients since the requirements and placements are all the same.
Would you want an ad free Google Search for a monthly subscription fee?
Would you pay a monthly subscription fee to remove all the ads from the Google Search results? Neeva thinks so but so far, Google has not gone down that route. But Google is asking some users via a Google opinion rewards survey if they would like such an option.
Eli Schwartz spotted this survey and posted it on Twitter. The survey asks how interested you would be in paying a reasonable price for a search service with that feature. The feature is “Results show no ads at all.”
Google does offer a premium service for YouTube without ads – so I guess it would be feasible to offer this for Google Search. But honestly, I’d be shocked if Google ever did this in search. The only way I can see this happening is if government regulation pushed Google to a point where this might make sense for their revenues.
Announcing the winners of the 2021 Search Engine Land Awards
The competition was fierce for the 2021 Search Engine Land Awards. The pandemic caused lockdowns and shutdowns, which affected many businesses’ main income streams. Not only that but it forced many consumers almost completely online.
As such, agencies, in-house marketing teams, and individual marketers had to get creative, think on their toes, and often make a little go a long way. We are, as always, ever thankful to our amazing roster of Search Engine Land Awards judges who brought their keen expertise, provided thoughtful input, and donated their time.
Microsoft and Yandex announced a new initiative today named IndexNow, a protocol that any search engine can participate in to enable site owners to have their pages and content instantly indexed by the search engine. Currently, Microsoft Bing and Yandex are the two search engines fully participating in the initiative but others are welcome to adopt this open protocol.
IndexNow allows “websites to easily notify search engines whenever their website content is created, updated, or deleted,” Microsoft wrote on its blog. The goal is to make for a “more efficient Internet,” the company said, by reducing the dependency on search engine spiders having to go out into the web and crawl each URL they find. Instead, the goal is for site owners to push these details and URL changes to the search engines directly. “By telling search engines whether an URL has been changed, website owners provide a clear signal helping search engines to prioritize crawl for these URLs, thereby limiting the need for exploratory crawl to test if the content has changed,” Microsoft wrote.
How it works. The protocol is very simple — all you need to do is create a key on your server, and then post a URL to the search engine to notify IndexNow-participating search engines of the change. The steps include:
Host the key in a text file, named with the value of the key, at the root of your website.
Start submitting URLs when your URLs are added, updated, or deleted. You can submit one URL or a set of URLs per API call.
Submitting one URL is as easy as sending a simple HTTP request containing the changed URL and your key: https://www.bing.com/IndexNow?url=url-changed&key=your-key. The same works using https://yandex.com/indexnow?url=url-changed&key=your-key
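The steps above can be sketched in a few lines. This is a minimal illustration, not an official client: the host, key, and page URLs are hypothetical placeholders, and the snippet only builds the requests rather than sending them. Per the protocol, a single URL goes in a GET request, while a set of URLs can be submitted in one POST call with a JSON body containing the host, key, and URL list.

```python
import json
from urllib.parse import urlencode

# Hypothetical site and key, for illustration only. The key must also be
# served at https://www.example.com/a1b2c3d4e5f6.txt (a text file named
# with the value of the key, hosted at the root of the site).
HOST = "www.example.com"
KEY = "a1b2c3d4e5f6"

def single_url_ping(endpoint, changed_url, key):
    """Build the GET request URL that notifies one changed URL."""
    return endpoint + "?" + urlencode({"url": changed_url, "key": key})

def batch_payload(host, key, urls):
    """Build the JSON body for submitting a set of URLs in one POST call."""
    return json.dumps({"host": host, "key": key, "urlList": urls})

ping = single_url_ping("https://www.bing.com/IndexNow",
                       "https://www.example.com/new-page", KEY)
# `ping` can now be fetched with any HTTP client; notifying one
# participating endpoint is enough, since the engines share submissions.
```

Because the engines share submitted URLs with each other, you would send either the GET ping or the POST payload to just one participating endpoint.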
They work together. If you use the Bing method, then both Bing and Yandex (or other participating search engines) will get the update. You do not need to submit to both Bing and Yandex’s URLs, you just need to pick one and all search engines that are part of this initiative will pick up on the change.
The search engines are sharing this IndexNow system, so if you notify one, that search engine will immediately notify the other participating engines in the background. In fact, it is a requirement of IndexNow that any search engine adopting the protocol must agree that submitted URLs will be automatically shared with all other participating search engines. To participate, search engines must have a noticeable presence in at least one market, Microsoft told Search Engine Land.
Similar to Bing URL submission API. Is this similar to the Bing URL submission API? Yes, in that the aim is to reduce crawling requirements and improve efficiency. But, it is different in that this is a completely different protocol. If you are using the Bing URL submission API or the Bing content submission API, technically Bing will get your URLs and content changes immediately but these two APIs do not work with the IndexNow protocol, so the other search engines won’t get the changes.
Will these APIs go away if and when the IndexNow initiative becomes more popular? That is unclear. The URL submission API would be somewhat redundant to IndexNow but the content submission API is unique.
Integrations. IndexNow is gaining support among third-party websites, with eBay, LinkedIn, MSN, GitHub and others planning to integrate with the IndexNow API. Microsoft said many sites that adopted the Microsoft Bing Webmaster URL submission API are planning a migration to IndexNow.
Microsoft said it is encouraging all web content management systems to adopt IndexNow to help their users get their latest website content immediately indexed and minimize crawl load on their websites. In fact, Microsoft provided code that WordPress can use to integrate IndexNow into its CMS. Wix, Duda and others plan to integrate with IndexNow soon as well. CDNs like Cloudflare and Akamai are also working with the IndexNow protocol, as are SEO tools like Botify, OnCrawl and others.
What about Google. We were told that Google is aware of the IndexNow initiative and the company was asked to participate but at this point Google is not an active IndexNow participant.
Why we care. Instant indexing is an SEO’s dream when it comes to giving search engines the most updated content on a site. Google has been very strict about its Indexing API, which for now is limited to job postings and livestream content. So while it seems Google may not participate in IndexNow in the near future, search engines like Microsoft Bing and Yandex are aiming to push this initiative hard.
The protocol is very simple and it requires very little developer effort to add this to your site, so it makes sense to implement this if you care about speedy indexing. It seems more and more search engines will participate but in terms of the big one, Google, that remains unclear.
Messy SEO is a column covering the nitty-gritty, unpolished tasks involved in the auditing, planning, and optimization of websites, using MarTech’s new domain as a case study.
Hello marketers,
This installment for “Messy SEO” details my process of rectifying the broken link and image issues that arose following the MarTech website consolidation. In Part 2 we discussed fixes for incorrect canonicalization, which aimed to future-proof our site for proper indexation.
Anyone who’s worked with website moves and merges knows redirecting the old pages to the new domain is just part of the story. Consolidating duplicate pages, fixing canonical tags, and optimizing page indexation are foundational steps that make search engines happy. But an equally important stage in any SEO project is user experience optimization.
SEO is incomplete without good UX
Regardless of how well Google and other search engines crawl your website, few people will interact with your brand if they have a poor experience. Broken and non-HTTPS links discourage many from trusting what your page has to offer. People want relevant links and engaging images in their content, not outdated links that no longer work.
Link issues were everywhere after the MarTech site consolidation. Between broken and non-secure links, many of the newly consolidated pages were choppy, filled with blank boxes and links leading nowhere.
Fixing broken links
Virtually all of these instances had nothing to do with negligence; they were largely the result of old domain URLs on the site. The link issues we chose to tackle first were those that were completely broken, undoubtedly leaving visitors wondering why they were included within the article at all.
Many of these were simply outdated links to external sites that either completely removed their content or neglected to redirect it someplace else. In most cases, finding the lost page was relatively easy after combing through those sites.
The most conspicuous broken links, and those with the most direct impact on user experience, were those housing images that were no longer available.
Replacing images that no longer exist
Search marketers run into this messy problem most often during migrations. If the old site domain isn’t set to redirect every image file to its new URL, the new site will fail to pull the old, non-existent content.
Needless to say, visitors will have little interest in pages full of blank boxes. The question is, how do SEOs replace these missing pieces?
The Internet Archive and Wayback Machine
In scenarios such as this, some marketers opt to view cached images from their old domain in the search results. The problem is that search engines such as Google only store cached images for a limited amount of time. And this is only helpful if your domain URLs are still indexed post-migration.
We’ve found the Internet Archive’s Wayback Machine to be the most extensive and reliable source of images no longer published online. We used its archive of pages from the old MarTech Today and Marketing Land sites to find the corresponding missing images from MarTech’s domain.
This process may seem pretty straightforward; after all, most experienced SEOs are aware of Wayback and its functions. But simple mistakes—no matter how small they seem—can land marketers right back at square one.
Restoring images with cached images the right way
It may be tempting to save time in this process by copying the cached images and pasting them into the pages in question. But there are many problems with this method. For instance, the image’s source will lie on another site. This takes away all control from the webmaster; there’s no guarantee that the image will stay on Wayback forever.
In addition, a cached image on your site takes away your chance to rank for your own site’s images. And in today’s visually driven search landscape, you want to take advantage of every opportunity to improve image rankings.
To address this potential issue, we downloaded the applicable images from the cached MarTech Today and Marketing Land pages and uploaded them to MarTech. This process allowed us to replace the broken images with the restored versions on our new domain.
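As a rough sketch of that workflow: the Internet Archive exposes an availability API that returns, as JSON, the closest archived snapshot of a given URL, and a snapshot URL can be rewritten with the `id_` modifier after the timestamp so the archive serves the original file bytes (the image itself) rather than the page wrapped in the Wayback toolbar. The site URLs below are hypothetical, and this snippet only constructs the URLs; it does not send any requests.

```python
from urllib.parse import urlencode

WAYBACK_API = "https://archive.org/wayback/available"

def availability_query(page_url, timestamp=None):
    """Build a query to the Wayback availability API; its JSON response
    points at the closest archived snapshot of page_url."""
    params = {"url": page_url}
    if timestamp:
        params["timestamp"] = timestamp  # YYYYMMDD: snapshot closest to this date
    return WAYBACK_API + "?" + urlencode(params)

def raw_snapshot_url(snapshot_url):
    """Insert the id_ modifier after the timestamp in a snapshot URL, e.g.
    .../web/20200101000000/https://example.com/img.png becomes
    .../web/20200101000000id_/https://example.com/img.png,
    so the archive returns the original bytes, suitable for downloading
    and re-uploading to your own domain."""
    prefix, _, rest = snapshot_url.partition("/web/")
    timestamp, _, original = rest.partition("/")
    return prefix + "/web/" + timestamp + "id_/" + original
```

Downloading from the `id_` URL and re-hosting the file on your own domain is what keeps the image source under your control, rather than leaving it pointing at the archive.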
Though tedious at times, we found this solution to be a much better alternative than leaving visitors to comb through a sea of broken images and links.
Wrapping up
That’s it for the third installment of “Messy SEO.” We’ll continue to go through the steps taken toward cleaning up the issues that arose post-site consolidation and migration.
Have you had issues tracking down pages or images from an earlier version of a site? Did you find any hiccups using the Wayback Machine? Email me at cpatterson@thirddoormedia.com with the subject line Messy SEO Part 3 to let me know.
Good morning, Marketers, and where is search marketing headed?
This year’s SMX Next keynote (delivered by yours truly) is dedicated to advancing your search marketing career (regardless of whether you’re a manager, a specialist, a CEO, or a consultant). The job market is wild right now. More companies are investing in SEO and PPC after the pandemic proved online is THE place to be — no matter your business.
On the consultancy side, at the start of the pandemic, I had many clients pull back on their SEO investment because they weren’t sure what was going on and what the future looked like. A month or so in, though, I got a huge influx of leads because businesses realized they HAD to play the SEO and PPC game just to show up to the starting line. It became table stakes.
That reality has only continued to grow. And marketers are the beneficiaries. But what does it mean for your career especially as the big players continue to take away more of our controls and levers? What skills are critical for us as we’re continuing to advance in our professional development? Here are just a few I think will be the game-changers in 2022:
They see me scrollin’: Google rolling out continuous scroll on mobile
Google’s mobile search results now offer infinite scroll, which Google is calling continuous scroll. As you scroll, Google will no longer show you the “more results” button when you reach the bottom of the page; instead, it will load the next page of results automatically.
Why we care. This may (or may not) encourage searchers to look beyond the first few results and scroll through more of them. It is yet to be determined how this might impact your click-through rates and traffic from Google search, but keep an eye on it.
Microsoft will shutter LinkedIn in China by the end of the year
Microsoft will shut down its localized version of LinkedIn in China by the end of the year, the company announced Thursday. Microsoft cited a difficult operating environment, enhanced compliance requirements and a lack of success with the social aspects of its platform as reasons for shutting down LinkedIn in China.
InJobs to launch in China. As it sunsets LinkedIn, Microsoft plans to launch InJobs, a new, standalone jobs application for China, later this year as well. Unlike LinkedIn, InJobs will not have a social feed or the ability to share posts or articles.
Why we care. Sunsetting LinkedIn in China is likely to hinder B2B businesses that have a partner there or rely on the platform for communication with potential partners. Additionally, LinkedIn advertisers will no longer have access to users in China. However, it is likely that InJobs will offer some of these capabilities.
Microsoft publishes holiday 2021 checklist to help retail advertisers in a changing ecommerce landscape
The holiday season has officially started (hey, I count Halloween), and most retailers and shopping advertisers have planned their holiday campaigns already. New data from Microsoft indicates that “retail industry and advertising best practices have changed over the last two years, indicating growing e-commerce adoption and shopping that will begin earlier than ever,” said Eugene Goldenshteyn, Senior Product Marketing Manager at Microsoft Advertising. So what can you do now that we’re in the thick of it?
Start now: Respondents said they’ve already started holiday shopping.
Reconfigure your ROI calculations: The market will be more competitive than ever.
Always be testing: Now is not the time to set it and forget it.
Optimize your shopping campaigns: Lots more people will be buying online this year.
The guide includes specific “What to do” recommendations for each suggestion, including reviewing last season’s campaigns for successes and lessons learned and testing In-market Audiences, directly and indirectly, related to your industry.
How to build a high-performing marketing team
As we get closer to SMX Next, I’ve got professional development on the brain. This year we’re including a Career track with sessions on “How to find and support entry-level SEO talent” and “Climbing the ladder from paid media manager to CMO in a world of automation.”
That’s probably one reason this article from Emily Hackeling at Front stuck out to me. She partnered with Dr. Ron Friedman, a social psychologist specializing in human motivation, and author of The Best Place to Work and Decoding Greatness to collect data on what makes teams work. Here’s the tl;dr. The highest performing teams…
Communicate frequently and openly: Members of high-performing teams don’t shy away from honest dialogue, sharing both positive emotions, like wins and jokes, and negative comments, like critiques and complaints.
Build relationships on purpose. High-performing team members build friendships and are more likely to view their teammates as kind and trusting.
Meet with a purpose. The best teams don’t waste meeting time — 77% come to the meeting with a set agenda.
Prioritize inclusivity and diversity. Companies with executive teams in the top quartile for gender diversity were 25% more likely to have above-average profitability than companies in the fourth quartile.
Bring care to the workplace. High-performing teams care deeply about the quality of work they’re doing each day and the impact it has on the world around them.
If you’re a manager or looking to be one soon, these indicators may be #goals to work toward.
Search Shorts: Google Partners Rewards site down, temporary API extensions, a guide to content audits, and supply chain issues
The Google Partners Rewards site went down for maintenance for over 24 hours. Hat tip to Brett Bodofsky for the heads up. PPCers are hoping the extended downtime means more and better goodies.
Request temporary extension for v1.4 of the AdSense Management API. As of October 12, 2021, v1.4 of the AdSense Management API is no longer available. Google is “temporarily accepting extension requests for v1.4 of the AdSense Management API. If you would like to request an extension, please send us your project number. Note that this is a short extension period. All extensions will expire on October 26, 2021 regardless of when the request was made.”
The complete guide to content audits. This new guide from SEO whiz Crystal Oritz includes how to identify poorly performing content, determining which content is cannibalizing your rankings, and what to do when your content has a too-high bounce rate.
The shopping season is likely to be plagued by supply-chain issues. FedEx, UPS, and USPS have released their holiday 2021 shipping deadlines. Shipping issues, labor shortages, and a lack of materials mean advertisers should prep their holiday campaigns for potential supply-chain issues.
Quote of the Day
“People don’t buy products or services. They buy transformation… and results.” Julia McCoy, founder of Content Hackers.
Google’s mobile search results now offer infinite scroll, which Google is calling continuous scroll. As you scroll, Google will no longer show you the “more results” button when you reach the bottom of the page; instead, it will load the next page of results automatically.
Google said “with this update, people can now seamlessly do this, browsing through many different results, before needing to click the ‘See more’ button.”
Google added that while “you can often find what you’re looking for in the first few results, sometimes you want to keep looking.” And for those searchers who want to keep on looking, you will be able to continuously scroll “up to four pages of search results” without clicking to load more.
What it looks like. Here is a GIF of it in action:
When can I see it? Google has been testing this for some time, in fact, I’ve seen it myself over the past couple of weeks. Google did say it will “gradually roll out today for most English searches on mobile in the U.S.”
In 2018, Google took a step closer to infinite scroll by launching the more results button.
Why we care. This may (or may not) encourage searchers to look beyond the first few results and scroll through more of them. It is yet to be determined how this might impact your click-through rates and traffic from Google search, but keep an eye on it.
Good morning, Marketers, can you imagine a search engine without a results page?
That’s what Neeva, the ad-free, private search engine, introduced yesterday — a feature called FastTap Search that enables users to bypass the results page via a list of direct links that are generated when the user puts a query into the URL bar or the Neeva app (learn more about the feature below).
It is very interesting to see what kind of features are possible when a search engine isn’t reliant on ad revenue. Neeva is supported by its subscribers, meaning that the company has no skin in the game when it comes to whether users spend time on their properties or head directly to another site.
Of course, Neeva is not on the same plane as Google, but neither is its strategy. Google has a similar feature (the “I’m Feeling Lucky” button that takes users straight to the top result for their query), but appears to be moving towards more robust results pages. In fact, its MUM-related announcements from Search On in September gave us a preview of results pages that lead to even more search results, with featured snippets that may resolve queries without users having to click through to another site.
Will Neeva’s new feature change the way we optimize for search? Unlikely, but I’m hoping this spurs even more innovation among search engines. That kind of competition is important because it often provides marketers with more opportunities, like the free product listings Google introduced last year (and Bing also launched shortly after), which help it compete against e-commerce platforms like Amazon.
George Nguyen, Editor
Neeva’s ‘FastTap Search’ feature presents direct links instead of a results page
Neeva is launching a feature that enables searchers to type queries directly into the URL field of their browser or the Neeva app to be shown a drop-down menu with direct links. Dubbed “FastTap Search,” the feature allows users to bypass traditional search results pages and head directly to a site via the list of links generated based on the query. As Neeva’s founder and former SVP of ads at Google, Sridhar Ramaswamy, said in the announcement, this type of feature is made possible by the search engine’s unique business model, in which users pay $4.95 per month for an ad-free, customizable search experience.
Why we care. This feature may take on more significance if Neeva is able to increase its share of the search market or if Neeva is popular with your particular audience. While this is a reimagining of the search results page, it is still a list of results and there is still a top position, which means algorithms have to determine relevance and award that position to a page, just as they do on other search engines. However, since FastTap Search only presents a few results, the brands or publishers that are able to earn that top spot stand to gain significant visibility, which can be important if you operate in a highly competitive sector.
Facebook changes how it measures users for advertising purposes; Instagram introduces notifications for outages and a new Account Status tool
Facebook changes how it measures accounts for advertisers. “If someone does not have their Facebook and Instagram accounts linked in Accounts Center, we will consider those accounts as separate people for ads planning and measurement,” Facebook announced Monday, adding, “Facebook and Instagram accounts that are connected in Accounts Center will continue to be counted collectively as a single person.”
Previously, if someone used the same email address for both their Facebook and Instagram accounts or accessed both platforms using the same device, the company counted them as one person when they interacted with ads. As this new methodology rolls out over the next few weeks, advertisers should expect increases in pre-campaign estimates such as estimated audience size, “but for most campaigns we do not believe this will have a substantial impact on reported campaign reach,” Facebook said.
Instagram launches notifications for outages and a new Account Status tool. Instagram is testing Activity Feed notifications to inform users when the platform experiences outages (like the one from last week), technical issues and when those issues are resolved. “We won’t send a notification every single time there is an outage, but when we see that people are confused and looking for answers, we’ll determine if something like this could help make things clearer,” the company said. The test will run in the U.S. for the next few months.
Alongside that announcement, the company also unveiled a new tool called “Account Status.” The tool is designed to inform users about whether their account is at risk of being disabled. Within the tool, users can see if their content has been removed and why. They can also appeal a removal by requesting a review from their Account Status menu.
For the visual learners among us
“Short Videos” carousel spotted on desktop. First appearing as a test in mobile search results in November 2020, Brodie Clark has spotted a “Short Videos” carousel in Google’s desktop search results (shown above). The screenshot even includes a video from TikTok. Search feature junkies might also want to bookmark Brodie’s timeline of Google SERP features, which is sure to become even more handy as the company experiments with more features.
I heard you like flowcharts. Aleyda Solis has published her 10 favorite flowcharts to support SEO decision making. The charts can help you avoid the “it depends” answer with stakeholders, they may be easier to understand for non-SEOs and can make the criteria for various decisions more transparent.
Shipping delays mean Santa’s got some bad news this year. Marketoonist Tom Fishburne addresses supply chain slowdowns with the help of Santa and a young child wondering whether a year of good behavior was even worth it. Fishburne’s take is especially pertinent since so many businesses pivoted to e-commerce over the last year and a half. If you’re looking to learn how shipping delays might affect your advertising, check out Fred Vallaeys’ post on the subject over at Optmyzr.
What We’re Reading: Leaked documents suggest Amazon has been doing exactly what some merchants and regulators have suspected for years
“Use information from Amazon.in to develop products and then leverage the Amazon.in platform to market these products to our customers” — That quote comes from an internal Amazon document that details the strategy for Solimo, a private brand Amazon created in India. Reuters reporters have analyzed thousands of internal Amazon documents showing that, in India, the e-commerce platform ran campaigns to create knockoff products, undercut the original brands and promote them using manipulated search results.
“The documents show that two executives reviewed the India strategy – senior vice presidents Diego Piacentini, who has since left the company, and Russell Grandinetti, who currently runs Amazon’s international consumer business,” wrote Aditya Kalra and Steve Stecklow for Reuters.
The documents are about as damning as it gets: There are specific instructions on identifying which brands to “replicate,” a strategy for partnering with the manufacturers of the original item, there’s even a designated term for the practice of placing Amazon’s own newly launched private brand items in the top three ASINs in search results — “search seeding”; seedy indeed. Focusing specifically on the above-mentioned Solimo brand, the documents from 2016 revealed that Amazon’s products matched or exceeded the quality of competing products while being 10–15% cheaper.
This is bad news for merchants who have reached a high degree of success selling their own products on Amazon: “It is third-party sellers who bear the initial costs and uncertainties when introducing new products; by merely spotting them, Amazon gets to sell products only once their success has been tested,” Lina Khan, now-chair of the U.S. Federal Trade Commission, wrote in a 2017 paper for the Yale Law Journal. “The anticompetitive implications here seem clear: Amazon is exploiting the fact that some of its customers are also its rivals.”
“As Reuters hasn’t shared the documents or their provenance with us, we are unable to confirm the veracity or otherwise of the information and claims as stated,” Amazon said in a written response. “We believe these claims are factually incorrect and unsubstantiated.”