
SERP trends of the rich and featured: Top tactics for content resilience in a dynamic search landscape

These days, there is a lot going on at the top of the SERP. New features, different configurations, variations for devices and challenges for specific verticals pop up every day. Seasoned SEOs will tell you that this is ‘not new’, but the pace of change can sometimes pose a challenge for clients and webmasters alike. 

So how do you get ahead of the curve? How do you make your site better prepared for possible new SERP enhancements and make better use of what’s available now? I’ve outlined 4 potential strategies for SERP resilience in my session at SMX Advanced:

1. Prepare to share (more)

From Featured Snippets and Google for Jobs to Recipe Cards and Knowledge Graphs, Google is unceasing in its efforts to create more dynamic, user-pleasing SERPs. This is great for users because Google can serve lightning-fast results full of eye-catching information that is easy to navigate even on the smallest mobile screen. And with the range of search services available for a query – Google Lens, Google Maps and Google Shopping, to name but a few – the granularity of Google’s ability to provide information at the most crucial point of need is immense.

For SEOs, this means that it’s becoming increasingly rare to rank exclusively in the top spot for a given query. Even without ads, and even with the Featured Snippet, the top results can often include a mix of links, videos and/or images from different domains.

Have a look at this Featured Snippet for the query “What is a Featured Snippet”. 

Here, the main paragraph and blue link come from the Backlinko site, but there are four linked images in the carousel before you get to the text. Only two of them are from Backlinko’s page; the others are from pages ranking 9th and 2nd on the main SERP. So, while some tools would report the paragraph snippet result as ranking “first”, from a user perspective the text result is the fifth clickable link.

And while this scenario is not new, it does illustrate something that we are seeing more regularly and in more complex configurations. 

For instance, in the query for 50 Books to Read Before You Die, Google is serving a host of results ‘From sources across the web’ in an accordion. Then within each drop-down is a carousel of results that includes web pages and bookmarked YouTube videos. 

That means that plain blue links aren’t visible until after a row of ads and then after 20+ links from the accordion carousel.

This presents both challenges and opportunities for SEOs.

Strategic challenges from mixed SERPs

For those who are looking to protect traffic, the challenge is to ensure that you are offering users a means of connecting with your content across multiple media formats and channels. Relying on a single content type (a written blog post without images, for instance) could leave your traffic vulnerable to changes in the SERP. So, a strategic approach to your most important SERPs should include a mix of written, video and/or image content. This will ensure that you are optimized for how users are searching, as well as what they are searching for.

Strategic opportunities from mixed SERPs 

For sites looking to gain traffic from established rivals, top SERPs with multiple site links present an opportunity to gain precious ground by optimizing for search services that your rivals are ignoring. So as well as looking for keyword gaps, make sure your content plan is looking for gaps in media formats.

Use a good technical SEO framework

In both cases, the multi-media content you develop should be underpinned by sound technical infrastructure, like a good CDN, image sitemaps for unique images, structured data, and well-formatted on-page SEO.
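
To make one of those pieces concrete, here is a minimal sketch of generating an image sitemap with Python’s standard library. The page and image URLs are placeholders of my own, not examples from the session.

```python
import xml.etree.ElementTree as ET

# Namespaces for standard sitemaps and the image sitemap extension.
SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
IMAGE_NS = "http://www.google.com/schemas/sitemap-image/1.1"
ET.register_namespace("", SITEMAP_NS)
ET.register_namespace("image", IMAGE_NS)

# Placeholder data: each page mapped to the unique images it hosts.
pages = {
    "https://www.example.com/featured-snippet-guide/": [
        "https://www.example.com/images/featured-snippet-anatomy.png",
    ],
}

urlset = ET.Element(f"{{{SITEMAP_NS}}}urlset")
for page_url, image_urls in pages.items():
    url_el = ET.SubElement(urlset, f"{{{SITEMAP_NS}}}url")
    ET.SubElement(url_el, f"{{{SITEMAP_NS}}}loc").text = page_url
    for image_url in image_urls:
        image_el = ET.SubElement(url_el, f"{{{IMAGE_NS}}}image")
        ET.SubElement(image_el, f"{{{IMAGE_NS}}}loc").text = image_url

ET.ElementTree(urlset).write("image-sitemap.xml", encoding="utf-8", xml_declaration=True)
```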

2. Invest in knowledge hubs

In November 2020 and January 2021, Google tested Featured Snippet contextual links, which added reference links to other websites from within Featured Snippets. Then, in May 2021, a Google “bug” showed Featured Snippets that included links to further searches within Google.

While Google has yet to outline any specific plans to roll this out as standard, they have been known to test new SERP features on live results in the past. For instance, they were testing image carousels with Featured Snippets in 2018 before the wider rollout in 2020.

Not only that, but rivals at Bing are already using these techniques extensively. Their SERPs are bursting with contextual links that pull images, copy and clickable results from Wikipedia directly into the SERP.

This suggests to me that contextual links are likely to become a Google thing in the near future. 

How might you be able to optimize for this possible feature development? In my humble opinion, it is worth spending some time investing in knowledge hub-style content. Hubs enable you to become a reference for users on your own site and across the wider web. While it is likely that much of the traffic for potential contextual links would go to reference sites like Wikipedia, it is also the case that not every niche term or topic will have a wiki page. So, if you start building now, you could be adding value for current users and for the future needs of Google’s bots.

Example of a simple knowledge hub

A knowledge hub can be technically simple, or complex, but should be underpinned with good on-page SEO and unique content that is written in natural language. 

3. Stay ready so you don’t have to get ready (with structured data)

At the top of the SERP, plain blue links are becoming increasingly rare, and today your search results are likely to include a mix of links and information from Google-managed channels like:

  • Google for Jobs
  • Video snippets, predominantly from Google-owned YouTube
  • Structured data-enabled rich results, such as recipe cards, and/or Google Ads

These features are generated using Google APIs, YouTube, services like Google Ads and, to a large extent, Structured Data specifications. This serves Google well because these features deliver information with a more consistent user experience, which is particularly crucial within the constraints of a mobile-first web.
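
As a concrete (and invented) example of the structured data behind one of these features, here is a minimal sketch of Recipe markup built in Python and printed as JSON-LD. The recipe details are placeholders; Google’s documentation lists the full set of required and recommended Recipe properties.

```python
import json

# Minimal, illustrative Recipe markup (all values are placeholders).
recipe_markup = {
    "@context": "https://schema.org",
    "@type": "Recipe",
    "name": "Pumpkin spice latte",
    "author": {"@type": "Person", "name": "Jane Doe"},
    "image": ["https://www.example.com/images/latte.jpg"],
    "prepTime": "PT5M",   # ISO 8601 durations
    "cookTime": "PT10M",
    "recipeIngredient": ["2 shots of espresso", "250ml milk", "pumpkin spice syrup"],
    "recipeInstructions": [
        {"@type": "HowToStep", "text": "Brew the espresso."},
        {"@type": "HowToStep", "text": "Steam the milk with the syrup and combine."},
    ],
}

# Embed the output in a <script type="application/ld+json"> tag in the page markup.
print(json.dumps(recipe_markup, indent=2))
```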

I bring this up in a discussion about SERP resilience because, as these new and shiny features are added, they take the place of plain blue links and, historically, they have been seen to replace Featured Snippets. 

For instance, we saw a significant drop in Featured Snippets in 2017 as Google-managed Knowledge Panels increased.

During this time, one of the most prominent Featured Snippet category types was for recipes. But Google soon found a more user-friendly way to display this content via mobile-friendly Rich Results.

Now, you might say that 2017 was a long time ago, but we saw similar activity this February, when Moz reported that the number of Featured Snippets temporarily dropped to historic lows while rich results for video rose at the same time.

And though many of the Featured Snippets returned, the phenomenon of SERP features neither being created nor destroyed, but simply changing form, is a regular occurrence. Even this summer, the prevalence of People Also Ask has been steadily declining as video results increase.

June/July 2021 SERP feature changes as reported by Semrush Sensor

This means that Google SERP developments can cause traffic disruption for pages that are optimized for a single type of search result. 

The TLDR of this is, don’t put all your eggs in one basket. 

If you have a page that performs well as a top-ranking link, Featured Snippet or other feature, don’t expect that to be the case forever, because the SERP itself could completely change.

  • Protect your traffic by optimizing your pages for relevant APIs and strategic structured data for your niche, alongside your on-page optimizations.
  • Gain traffic by identifying competitors who are not using structured data and target your efforts accordingly.
  • Monitor your niche for changes to Rich Results and Google features, and plan accordingly. This will involve many of your regular tools, but also manually reviewing the SERP to understand new and emerging elements.

4. Dig into core topics for passage ranking

Google’s commitment to natural language processing within its algorithms gained pace in the last 12 months when Google introduced Passage Ranking at autumn’s Search On 2020 and MUM at Google I/O in Spring 2021.

Passage Ranking is often confused with jump-to-text links, but Google has explained that it is intended to help them understand content more intelligently: specifically, to find ‘needle in a haystack’ passages that answer queries more accurately, even if the page as a whole is not particularly well-formatted.

The analogy that I often use is that, if we imagine the SERP is a playlist of songs, then previously the whole song would have to be strong to make it onto the list. Passage Ranking is essentially saying that if the rest of the song is so-so, but the guitar solo is really, really good, then it’s still worth adding that song to the playlist.

On February 10, 2021, the update went live. Google said it would affect 7% of searches, and SEOs had a lot of questions:

  • Will Passage Ranking affect what the SERPs look like? 
  • Will Passage Ranking affect what Featured Snippets look like? 
  • Will Passage Ranking apply exclusively to Featured Snippets?

Speaking with Barry Schwartz, Google’s Search Liaison Danny Sullivan said the answer was no, no and no.

So why am I bringing this up in a discussion about SERPs?

Well, since Passage Ranking is now a contributing factor for ranking, and Featured Snippets are elevated from the top-ranking SERP results, in my opinion we are likely to see more variation in the kinds of pages that achieve Featured Snippet status. So, alongside pages that follow all of the content formatting best practices to the letter, we are likely to see more pages that offer query-satisfying information in a less polished way.

“The goal of this entire endeavor is to help pages that have great information kind of accidentally buried in bad content structure to potentially get seen by users that are looking for this piece of information” – Martin Splitt

Confronted with these results, SEOs who love an ‘If X, then Y’ approach may be perplexed, but my research has led me to believe that one of the contributing factors is user intent.

Ranking shifts directly following the Passage Ranking update suggest that the content that was boosted sought to answer both the what and the why behind user queries. Case in point: a website that was traditionally optimized for the query “different colors of ladybirds” owned the Featured Snippet in January.

This page is optimized using many established SEO techniques:

  • Literally optimized for the search query
  • Includes significant formatting optimizations
  • Covers the keyword topic directly to answer “What are the different colors of ladybirds?”
  • Core copy is around 500 words

But after the Passage Ranking update, the same query returned a page that was less literally optimized but provided better contextualization. This usurper showed a better understanding of why ladybirds were different colors and jumped from 5th to 1st position during February.

Reviewing the page itself, we see that, in contrast to the earlier snippet, on this page:

  • Core copy is over 1,000 words
  • Formatting is limited
  • The intent-based topic is covered in general terms to answer the why

Other examples and other big movers during this time showed a similar correlation with intent-focused search results. 

Possible example of Passage Ranking impact on Investopedia
Possible example of Passage Ranking impact on National Zoo

In each case, it seems that Google is attempting to think ahead about user intent, replying to queries with less literal results to better satisfy the thought process behind the query. Machine learning tools now allow Google to better understand topics as well as keywords.

So, what does this mean for SEOs?

Passage ranking looks like good news for long-form content

Well, where you have a genuine, unique perspective on a topic, Passage Ranking could be an incentive to create more thorough and in-depth content centered around users’ needs rather than search volume alone. 

  • Protect your traffic by optimizing your content for longtail keywords and intent.
  • Gain snippet traffic by creating intent-focused content. Answer the so what and don’t be afraid of detail.
  • Consider topics as well as keywords in content, navigation and customer journey. 

From a technical SEO perspective, top tactics include a solid internal link architecture and long-form content templates with tables of contents.
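
As a small illustration of that last point, a table of contents can be generated from the headings already in a long-form template. The snippet below is my own sketch, not a tool referenced in the talk, and assumes simple <h2> markup.

```python
import re

def add_table_of_contents(html: str) -> str:
    """Give each <h2> an anchor id and prepend a linked table of contents (illustrative only)."""
    toc_items = []

    def tag_heading(match: re.Match) -> str:
        heading = match.group(1).strip()
        slug = re.sub(r"[^a-z0-9]+", "-", heading.lower()).strip("-")
        toc_items.append(f'<li><a href="#{slug}">{heading}</a></li>')
        return f'<h2 id="{slug}">{heading}</h2>'

    body = re.sub(r"<h2>(.*?)</h2>", tag_heading, html, flags=re.IGNORECASE | re.DOTALL)
    return "<ol>\n" + "\n".join(toc_items) + "\n</ol>\n" + body

sample = "<h2>What is Passage Ranking?</h2><p>...</p><h2>How to optimize for intent</h2><p>...</p>"
print(add_table_of_contents(sample))
```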

How can you build SEO resilience for a dynamic SERP?

The same way you dress for a pumpkin spiced autumn day, with layers.

In this blog, I’ve discussed tactics for 

  • Optimizing content for mixed-media Featured Snippet panel results
  • Creating knowledge hubs for potential contextual-linking developments
  • Building structured data into your website before rich results arise
  • Using intent-focused, long-form content to potentially benefit from Passage Ranking

There is no single tactic that works in isolation. The SERP is so dynamic at the moment that aiming for, or banking on, a single part of it is likely to leave you vulnerable to traffic disruption if/when things evolve. Think about how you can use these tactics to build upon and level up your existing SEO foundations. Change is the only constant; plan accordingly.

The post SERP trends of the rich and featured: Top tactics for content resilience in a dynamic search landscape appeared first on Search Engine Land.


Top stories images aren’t showing in Google Search

Beginning early this morning, there have been numerous reports of images not loading in Google’s Top stories carousel (as seen below). The issue seems to affect search results globally and Google has confirmed that it’s a bug.

The Top stories carousel, where featured images are not currently loading. Some have also taken screenshots in which only one or two of the stories feature an image that loaded as usual.

A problem on Google’s end. “Yeah, it looks like an issue on our side,” Google’s John Mueller tweeted regarding the issue. The company is currently working to correct the bug, Danny Sullivan, the company’s public search liaison, has confirmed.

Why we care. A blurred featured image may negatively affect your clickthrough rate, so make sure to annotate your reports to reflect this oddity. Some professionals have taken screenshots showing the Top stories carousel with just one or two images that loaded successfully, which could mean fewer clicks for the stories that didn’t show a featured image. And, since the Top stories carousel is an important source of visibility and traffic for some publishers, this could also impact advertising revenue and other marketing opportunities that depend on getting a user onto your site if the bug goes unresolved for an extended period. Fortunately, Google is already aware of the issue — we’ll continue to provide updates as they come in.

The post Top stories images aren’t showing in Google Search appeared first on Search Engine Land.


How to craft a winning Search Engine Land Awards entry: Past judges share their advice

Since its inception in 2015, the Search Engine Land Awards has recognized exceptional marketers on an annual basis — showcasing their outstanding work, providing well-earned exposure in coverage and interviews, and bestowing upon them the highest honor in search.

But the road between deciding to begin an application and winning the award can be a long one. Although this year’s submission process has been significantly streamlined — it’s never been faster or easier to apply to the Search Engine Land Awards — there’s still a story that has to be told. And while the way in which you tell that story is entirely up to you, we thought we’d look back on some advice from past judges about what really wows them, what they would love to see more of, and what areas are best avoided...

Keep reading for 17 tips for creating an award-worthy submission:

What impresses the judges most:

  1. “What impresses me is when people have clearly aligned the tools and features they’re using to the goals they want to achieve. It sounds simple, but the entries that are goal-oriented rather than focused on tactics are always strongest.” – Ginny Marvin
  2. “When entries have a new take on a situation or feature, talk about how their strategy is different from the norm, and demonstrate why their strategy or tactics are award-worthy.” – Brad Geddes
  3. “When submissions are succinct but concrete in their campaign summaries, show examples (i.e., ad creative where relevant) and use straight-forward English rather than marketing speak.” – Greg Sterling
  4. “When applicants are able to go beyond percentages of increases and show tangible results of how the campaign directly impacted the bottom line of the business. Also, it helps to put results into perspective — so instead of simply saying: ‘Before the campaign, the client was only bringing in this # of leads, clicks, etc. — but the campaign raised that number to XXX,’ offer an example of how the campaign impacted the business overall and not just the analytics.” – Amy Gesenhues
  5. “When entrants share a lot of technical data around their case studies.” – Barry Schwartz
  6. “When entries prove their point with stats, graphs, and especially screenshots of GA/PPC Engine/ other paid search tech providers. Too many just say, ‘we increased business [some huge number]’ with no way to back it up.” – Brad Geddes
  7. “It really impresses me when entrants show how they retooled, revitalized [a campaign] or did something extraordinary to achieve extraordinary results. Or, how they outfoxed a competitor in a clever way – anything that shows how extraordinary results came from really extraordinary work.” – Matt Van Wagner

What judges want to see more of:

  1. “I love to see orchestration — when teams use tools, tactics and features in interesting ways to solve problems and execute on a strategy.” – Ginny Marvin
  2. “Images from the campaign and data illustrating concrete outcomes. Calling out what was innovative or especially significant or effective about the campaign.” – Greg Sterling
  3. “Stories around how the campaign was unique from other campaigns the agency and/or client had implemented in the past and the tools used to implement the campaign. Also, did you learn anything from the campaign that you’ve been able to introduce to other campaigns/clients? Were there any unexpected benefits that played out during the course of the campaign?” – Amy Gesenhues
  4. “I’d love to see more data from our entrants that pinpoint successes or failures in their case studies.” – Barry Schwartz
  5. “Entries that show the challenges they had to overcome that are outside of the norm (the scrappy startup against goliath, goliath showing it can innovate still against the scrappy startups stealing market share, etc), which might be market conditions, a business change, etc.” – Brad Geddes

What entrants need to stop doing:

  1. “It’s great to test new betas, but having access to betas doesn’t make you a great marketer. Be sure your entry doesn’t lean on implementing the newest beta features as evidence of running a successful campaign. That’s not enough.” – Ginny Marvin
  2. “Padding their discussions, using marketing jargon or bloated writing. I’d also like to see less self-congratulation.” – Greg Sterling
  3. “Using language like world-class, best-in-class, etc. to define your campaign. Talk specific numbers and results. Using flowery language to build-up the campaign takes away from actual/quantifiable results. (In other words, let the numbers speak for themselves.)” – Amy Gesenhues
  4. “Not differentiating on strategy or tactics. While it’s important that we see ‘best or standard practices’ are in place in an account, we are also looking for a detailed explanation of strategy that truly differentiates the work from others… For example, an account testing new ad extensions / formats or a landing page that breaks convention but delivers impressive conversion data.” – Brad Geddes
  5. “Claiming increases of 200% when you really mean 100%. A 100% increase means you doubled your number. Going from $100 to $137 is not a 137% increase. It is a 37% increase. I’d like it that when you say ROAS, you show the formula you used to calculate it. A 1000% increase is almost always ignored as a metric. It is the opposite of impressive – it is suspicious. It is most likely you were doing very little before and now you are doing a little more than nothing.” – Matt Van Wagner

The final deadline for the 2021 Search Engine Land Awards is September 3, 2021 at 11:59 pm PST. Review the categories for 2021 and begin your application here.

The post How to craft a winning Search Engine Land Awards entry: Past judges share their advice appeared first on Search Engine Land.


SMX Convert is tomorrow… don’t miss out!

If you’re searching for actionable tactics to drive more paid and organic conversions, you can’t afford to miss SMX Convert — happening online tomorrow, August 17, from 11:00am – 5:30pm ET.

At just $149, your All Access pass packs a tremendous amount of value:

  • Grow your knowledge step-by-step with a brand-new two-track program from the Search Engine Land experts.
  • Optimize user experiences, craft compelling copy, and boost landing page performance with actionable tactics you can implement immediately.
  • Prepare for upcoming privacy changes with insights from Google’s Group Product Manager for Trust and Privacy and more special guest speakers.
  • Discuss common CRO challenges and creative solutions with like-minded attendees during community meetups.
  • Soak up inspiration from expert-led audits of peer-submitted assets during live clinics.
  • Get your specific questions answered during Overtime, live Q&A with all of the SMX Convert speakers.

Here’s another way to look at it…

  • Because all sessions are available live and on-demand, you’ll get 12 hours of SEO and PPC conversion optimization tactics. That’s just $12 per hour of expert-led training!
  • You’ll attend presentations from 21 of the world’s leading search and conversion optimization experts. That’s just $7 per expert. (You’d spend more buying each a cup of coffee!)
  • Attend a track in its entirety to earn a personalized “Certificate of Completion”, a wonderful way to demonstrate your worth when asking for a promotion or a salary bump.
  • Hearing what industry experts are up to will help validate your ongoing initiatives and confirm you and your team are on the right track. That kind of peace of mind is priceless.

What are you waiting for? Secure your $149 All Access pass now and join us tomorrow at 11:00am ET.

The post SMX Convert is tomorrow… don’t miss out! appeared first on Search Engine Land.


Google’s tool to report indexing bugs is now available in the U.S.

Google’s reporting tool for indexing bugs is now available to all signed-in Search Console users in the U.S., the company announced on Monday. The tool, which was first announced as a pilot program back in April, can be accessed at the bottom of the URL inspection help document and indexing coverage report document.

The button to access the reporting tool, as it appears at the bottom of the URL inspection help document and indexing coverage report document.

Intended use. The tool enables SEOs and site owners to report an indexing issue directly to Google. It is designed for those who need further support with indexing issues outside of the Google community forums and support documentation.

How to report indexing issues. Below is a screenshot of the form.


As the form is filled out, follow-up questions are generated so that the SEO or site owner can add more details about the issue. “We may follow-up for more information if we confirm an actual indexing bug,” Google says on the form instructions, “We will not respond to other kinds of issues.”

Why we care. Indexing issues in Google Search are fairly common. In fact, we’ve reported numerous confirmed indexing issues with Google over the years. Now that this tool is out of the pilot program, SEOs and site owners in the U.S. have a way to escalate these indexing issues, which should help them get closer to resolving them.

The post Google’s tool to report indexing bugs is now available in the U.S. appeared first on Search Engine Land.


How to plan SEO content that actually ranks

Content has been king for a while now, but just because you wrote something doesn’t mean it’ll drive qualified traffic to your site. In fact, it doesn’t even guarantee that your content will show up in search results: 90% of content on the web gets no traffic from Google, according to 2020 data from Tim Soulo of Ahrefs.

The key to effective content is planning. I’m sure there are some people who just type out bangers from their stream of consciousness, but those writers are definitely few and far between. The rest of us rely on planning and thoughtful execution. So how do you plan SEO content that actually ranks?

Aja Frost, head of English SEO at HubSpot, went over just that at SMX Create this year. Hers was one of the highest-rated sessions at the event, and her top suggestions are covered here to help you get your content higher in search engine results.

How to identify measurable metrics for your content goals

Frost recommends picturing a chart like the one below and asking, “What do I want to see on the Y-axis?”

Step 1. “Figure out what would make your boss [or] your client over the moon on the Y-axis,” she said. It’s likely not traffic, but something more like leads, appointments, purchases—which should be your ultimate goal. The traffic goals will lead us to those end targets. “This is why we start with demand goals and back those into our traffic goals by [dividing] by your historical or expected conversion rates.”

demand goals ÷ historical (expected) conversion rates = traffic goals

Note: The formula above has been adjusted to correct for a typo in the presentation slides.

At this point you may be like, that’s great but I have no idea what my expected conversion rates are. Here’s how to figure it out: “Take your demand actuals from the last 12 to 24 months… and then compare them to your traffic actuals from the same time period,” Frost said. “Sum up the demand metric of your choice divided by your organic traffic, and there you go. You’ve got your CVR.”
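
Put into numbers, with figures I have invented purely for illustration, the calculation looks like this:

```python
# Illustrative figures only; substitute your own actuals.
demand_actuals = 1_200             # e.g., leads generated over the last 12 months
organic_traffic_actuals = 60_000   # organic sessions over the same period

# Historical conversion rate (CVR) from organic traffic to demand.
cvr = demand_actuals / organic_traffic_actuals   # 0.02, i.e., 2%

# Demand goal for the next 12 months, backed into a traffic goal.
demand_goal = 1_800
traffic_goal = demand_goal / cvr

print(f"CVR: {cvr:.2%}")                                       # CVR: 2.00%
print(f"Traffic goal: {traffic_goal:,.0f} organic sessions")   # 90,000
```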

If you don’t have this data, you’ll likely have to get creative and figure out a comparable conversion rate. For example, if you’re creating a new product or service, you can use comparable CVRs from other assets you’ve been working on (say a blog or an online community, etc.).

Step 2. Next, figure out the demand you want to drive in the next 12 months and divide those by your historical or expected CVR. That gives you traffic goals.

Once you have these goals, you should also calculate where you’d land if you did absolutely nothing. “Unless you have zero content right now, your traffic is going to grow regardless of what you do,” advised Frost. “So by figuring out where you’d land if you did nothing, and the gap between that and what you need to grow, you can figure out how much additional traffic and conversions you need to generate.”

Step 3. After that, you need to determine how much monthly search volume you have to target to make up the difference between your projections and how your content would grow if you did nothing. Frost recommends a CTR curve analysis and creating estimates by SERP positions 1-3, 4-6, 7-9 and 10th position on the first page of search results. “You can multiply your weighted CVR by the traffic you need to generate to find your MSV [monthly search volume] estimates by positions,” said Frost.
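
One plausible way to turn that into rough numbers (my interpretation, not Frost’s exact worksheet) is to divide the additional traffic you need by the expected CTR of each position bucket. The CTR values below are placeholders; use your own CTR curve data.

```python
# Placeholder CTR assumptions per SERP position bucket; swap in your own CTR curve.
ctr_by_position = {"1-3": 0.25, "4-6": 0.08, "7-9": 0.03, "10": 0.015}

additional_traffic_needed = 30_000   # the gap between your goal and "doing nothing"

# Estimated monthly search volume (MSV) to target if the traffic came from one bucket.
for bucket, ctr in ctr_by_position.items():
    msv_estimate = additional_traffic_needed / ctr
    print(f"Positions {bucket}: target roughly {msv_estimate:,.0f} MSV")
```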

How to perform keyword research based on personas

First, create or refine your personas. “When I talk to advanced SEOs, this is often a step they skip,” said Frost. But she implores SEOs of all skill levels not to forego this step. “The more deeply you understand your personas and the more detailed your insights, the more comprehensive and accurate your list of seed keywords will be.” All of your target keywords in the research process stem from these personas.

Some of the basic persona questions you need to answer include what their industry is, how big their department within their company is and what tools they need to do their jobs. So, if you find out that your target persona is in the hospitality industry with a team of two people and a company of 24 and that they typically use tools for hotel reservations and accounting, you’ll know that “hotel management software” is a seed keyword.

From there, develop your list of seed keywords and expand it out to related short-tail keywords and down to long-tail keywords. Along with the standard tools (Ahrefs, Moz, Semrush), Frost also recommends a few other keyword research tools that SEOs may not know about. Using different tools also means that you’ll get insights that other SEOs researching this space may not have. Her recommended tools include Keyword Keg, Bing’s keyword research tool and seedkeyword, which lets you ask your target audience how they’d search for a particular topic.

After that, clean the list up based on what you know about each persona and determine what’s relevant and what may not be worth your time. “Filter, categorize, and group your keywords together so you can efficiently create content,” recommended Frost.


Once you’ve got your list of seed keywords, upload them into the tool of your choice and download search suggestions.

Next, Frost categorizes queries by intent: informational, transactional, and navigational. “Informational queries contain modifiers like ‘who, what, where, when, why,’ transactional queries contain questions related to price, cost, and promotion, and navigational queries are specific to the brand or product you’re doing research for,” advised Frost.
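
A rough first pass at this categorization can be scripted. The modifier lists below are simplified assumptions based on Frost’s description, and the navigational terms are placeholders for your own brand and product names.

```python
INFORMATIONAL = {"who", "what", "where", "when", "why", "how"}
TRANSACTIONAL = {"price", "cost", "cheap", "discount", "promo", "buy"}
NAVIGATIONAL = {"your-brand", "your-product"}   # replace with your brand/product terms

def classify_intent(query: str) -> str:
    """Bucket a query as navigational, transactional or informational (rough heuristic)."""
    words = set(query.lower().split())
    if words & NAVIGATIONAL:
        return "navigational"
    if words & TRANSACTIONAL:
        return "transactional"
    if words & INFORMATIONAL:
        return "informational"
    return "unclassified"   # review these manually

for q in ["what is hotel management software", "hotel management software price", "your-brand login"]:
    print(q, "->", classify_intent(q))
```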

Build an editorial (or content) calendar

Find the editorial calendar tool that works best for you and that you’ll actually use — whether it’s Trello, Asana, Monday, or just plain Google Sheets. From there, Frost recommends adding the following to your content calendaring tool of choice:

  • The basics: Like target keyword, URL recommendations, headers and more.
  • Internal linking opportunities: Products, offers or signup pages.
  • Level of effort: The average keyword difficulty of the target keyword(s) multiplied by a competitor content quality score (see the sketch after this list).
  • Expected traffic: Multiply search volume by the CTR of your expected position.
  • Competitive advantage: Something that will differentiate your content (original data, a strong point of view, etc.).
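
Here is a minimal sketch of how the level-of-effort and expected-traffic columns could be calculated for a single calendar row. All of the input values are invented placeholders.

```python
# Placeholder inputs for one content calendar row.
keyword_difficulties = [42, 55, 38]    # difficulty scores of the target keyword(s)
competitor_quality_score = 0.8         # your own rating of competing content (0-1)
search_volume = 4_400                  # combined monthly search volume
expected_position_ctr = 0.08           # CTR for the position you expect to reach

level_of_effort = (sum(keyword_difficulties) / len(keyword_difficulties)) * competitor_quality_score
expected_traffic = search_volume * expected_position_ctr

print(f"Level of effort: {level_of_effort:.1f}")
print(f"Expected traffic: {expected_traffic:,.0f} visits per month")
```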

You can also group keywords by theme (as opposed to persona) and sum up how much search volume you’re targeting for each theme. Finally, you get to writing your next-level content based on these goals and data points.

Learn more from SMX Create on demand

This is just a taste of what’s available on-demand from the super popular SMX Create event. Check out Aja Frost’s full presentation and the rest of the SEO content creation journey including…

  • Creating compelling content for SEO with Alli Berry from The Motley Fool,
  • Optimizing your content for increased findability with Niki Mosier from AgentSync and
  • Alternative content strategies to increase organic traffic and tracking success in 2021 with Maria Amelie White from Floristpro and John Shehata of Conde Nast.

The post How to plan SEO content that actually ranks appeared first on Search Engine Land.


Google adds author URL property to uniquely identify authors of articles

Google updated the article structured data help document to add new author properties to the list of recommended properties you can use in Google Search. The company said it added a new recommended author.url property to the article structured data documentation.

What is author.url. The author.url property is a new recommended property you can add to your article structured markup that is essentially a link to a web page that uniquely identifies the author of the article. This link can be to the author’s social media page, an about me page, a biography page or some other page that helps identify this author.

Alternative. Google also said in the help documents that “you can use the sameAs property as an alternative.” Google can understand both sameAs and url when disambiguating authors, the company said.
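
For illustration, a minimal Article markup block with the new author.url property might look like the sketch below. The headline and URLs are placeholders; Google’s help document remains the reference for the full list of recommended properties.

```python
import json

# Illustrative article structured data with the recommended author.url property.
article_markup = {
    "@context": "https://schema.org",
    "@type": "NewsArticle",
    "headline": "Example headline",    # placeholder
    "author": {
        "@type": "Person",
        "name": "Jane Doe",
        "url": "https://www.example.com/authors/jane-doe",   # uniquely identifies the author
        # "sameAs": ["https://twitter.com/janedoe"],          # alternative, per Google's docs
    },
}

print(json.dumps(article_markup, indent=2))
```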

Why it’s important. Some authors, like myself, write across two or more websites. Giving the search engine a way to identify that the same author wrote articles on site A and on site B can help Google better understand the author’s footprint. It might be used for the new article carousel in the author knowledge panels and for broader reasons at Google.

Why we care. If your site publishes articles, it might benefit you to add this new property to your article structured data. Who knows if Google will use it more broadly than just in the author knowledge panels, and use it to try to understand the expertise of a specific author across multiple sites. Maybe, just maybe, that can help your site rank better in the long term. That is assuming SEO spammers do not manipulate it and post fake author markup for their stories.

The post Google adds author URL property to uniquely identify authors of articles appeared first on Search Engine Land.


How to maintain organic performance when merging multiple websites

Developing a new organizational structure when merging two or more businesses is a complicated affair. But if your new business is going to rely on its website to drive sales, leads or audience engagement, then defining a website structure that preserves and builds upon the performance of any existing websites that the merging parties own should be a top priority. With that in mind, creating a sitemap that draws on the strengths of the current websites will help to give the new business/brand the initial visibility it needs in order to be successful.

Over the years, I have managed many website migrations, but in the past year, I have had the opportunity to manage the successful migration of a merger of three different businesses/websites into one new website. In fact, I have been lucky enough to successfully manage this scenario twice in the past year and I’ve learned a lot in the process. By cherry-picking the most valuable pages to develop the sitemap, one project resulted in the website retaining nearly 100% of the traffic the previous domains were getting (there was some loss where previous services became irrelevant and therefore pages were removed), and the other project resulted in the website increasing traffic levels post-migration. For what was essentially two entirely new brands, this gave them a hell of a head start when entering their respective markets. 

While the migration strategy involves a lot more than just structuring a sitemap, when it comes to mergers this is a particular area of importance, and it needs the appropriate level of analysis to ensure the migration is a success. Get this part right and the new website will be well on its way to retaining and even improving upon the performance of the merging sites. 

What should you look for when structuring the new sitemap?

So, what exactly qualifies as a “valuable” existing page, and which pages are we looking to retain? This may look different from website to website, but as a general rule of thumb, I look at the following:

Traffic drivers. Pages that are already driving a lot of traffic to the existing websites are obviously going to be important, particularly pages that are driving traffic that is still relevant to the new business’s offering. Even if the high traffic-driving pages are slightly less relevant (but not completely irrelevant) to the new business’s offering, it might be worth keeping them to help build brand awareness in the early days. This won’t work for services/products that are no longer offered, but for loosely related blog topics etc. it can be a good brand builder to keep that traffic flowing through the site.

Convertors. Pages with a high number of conversions/conversion rate should be considered, as long as what users were converting for is still relevant to the new business. These pages can keep the sales/inquiries etc. rolling in whilst the site builds its rankings/visibility up in other areas. 

Ranking pages. The new site will likely have a target keyword list, but your current sites might already be ranking for some of those keywords. Finding pages that rank for valuable keywords, whether they have high search volumes or not (maybe they don’t drive a lot of traffic, but they attract the RIGHT traffic that converts) and whether they have high rankings or not (if a page ranks position 36 for a target keyword, it can be developed and improved to rank better, rather than trying to start completely from scratch) will be an important part of the strategy.

Pages with backlinks. Backlinks are a big part of what strengthens a domain over time, so if you don’t bring across pages that have backlinks, then the new site will be missing out on all of that potential authority-building goodness. This gives the new site a shortcut to quickly building a healthy backlink profile.

Priority page supporters. Some pages may appear to have no value as they get no traffic, conversions, rankings or backlinks, but they might be the supporting architecture helping to hold up the rankings of other pages. Relevant and high-quality content which links to priority pages that are already ranking should be retained where possible to ensure the priority page’s rankings don’t crumble because the architecture has been deconstructed.

New business offering/priorities. Of course, the sitemap needs to look to the future, and not just to the past, so any new offerings or priorities for the newly formed business will need to be considered within the sitemap, and pages will need to be built out within the proposed architecture to cater for these new offerings.

How do we find these pages in order to add them to the sitemap?

So, now that we know what we are looking for, how are we going to go about finding these pages? The following audit process pulls together data from multiple sources and analyses each page on the existing sites to discover whether any of them qualify as a “page of value” for the new site once the merger/migration is complete.

1. Keyword audit:
Pages of value discovered: Ranking pages and New business offering/priorities
Tools used: Semrush (or similar tool)

The first step is to conduct keyword research based on the offering of the new website. At this point, we are looking for relevant keywords for every product, service and user intention, as well as local variations of “[keyword] + [location]” if appropriate. If using Semrush, you can then add that keyword list to a new rank tracking project, and add all three (or more/less, depending on the merger) existing domains to that project. That way, you are able to see which pages on which domains currently rank best for each keyword, as you might find that more than one domain ranks for some of the target keywords. Pick the highest-ranking pages for each keyword (you might want to set a limit for what is an acceptable ranking to try and retain, e.g., position 40 or better) and add them to the sitemap if they seem like a good match for the new business and can be optimized/improved going forward. If the ranking for a keyword is too low, it might be better just to start from scratch when targeting that particular keyword.
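
If you export that position data to a spreadsheet, a short script can pick the best-ranking page per keyword across the merging domains. The column names and rows below are assumptions about a generic export, not a specific Semrush format.

```python
import pandas as pd

# Assumed export shape: one row per keyword/domain combination (invented data).
rankings = pd.DataFrame({
    "keyword": ["hotel software", "hotel software", "booking engine"],
    "domain": ["site-a.com", "site-b.com", "site-a.com"],
    "url": ["https://site-a.com/software/", "https://site-b.com/tools/", "https://site-a.com/booking/"],
    "position": [12, 4, 38],
})

MAX_ACCEPTABLE_POSITION = 40   # cut-off for rankings worth retaining

best_pages = (
    rankings[rankings["position"] <= MAX_ACCEPTABLE_POSITION]
    .sort_values("position")
    .drop_duplicates(subset="keyword", keep="first")   # best-ranking page per keyword
)
print(best_pages)
```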

2. Content performance audit:
Pages of value discovered: Traffic drivers and Convertors
Tools used: Google Analytics (or similar platform)

Next is the content performance audit, where we look to discover pages that are driving traffic and/or conversions deemed valuable to the business going forward. Most website owners will be keen to retain as much of their current traffic as possible, and as long as it is still relevant, then high traffic/conversion driving pages should be kept. 

Using Google Analytics, filter by organic traffic and look back at a specific timeframe (I usually look at the past year). Go to the Behavior > Site Content > Landing Pages report and sort by “Sessions” (descending). At this point, you need to set a limit for how many organic landing sessions a page should have had in the past year to be deemed valuable. This might be a specific number of sessions or just a percentage of the traffic overall. The limit will vary from website to website. Pull together a list of all the pages that are over that threshold and sense check them to ensure they are still relevant to the new business offering.

Next, you’ll filter that same list by Goal Completions or Revenue, depending on whether it’s an ecommerce site or not. Again, you’ll need to set a limit as to how many conversions / how much revenue is deemed valuable, and keep those pages that are driving a high number of conversions. You can also look at pages with high conversion rates, but be sure that there is enough traffic going to the page to make an informed decision about whether the conversion rate is actually good or not (e.g., one session at a 100% conversion rate could be a fluke, but if you have thousands of pages like this, it does add up, so again, decisions are made on a case by case basis).
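
The same threshold checks are easy to script once the landing page report is exported. The column names and limits below are illustrative assumptions, not a fixed Google Analytics export format.

```python
import pandas as pd

# Assumed shape of an organic landing page export (invented rows).
landing_pages = pd.DataFrame({
    "landing_page": ["/blog/guide/", "/pricing/", "/old-service/"],
    "sessions": [4_200, 900, 35],
    "goal_completions": [18, 64, 0],
})

SESSION_THRESHOLD = 500       # minimum organic landing sessions in the past year
CONVERSION_THRESHOLD = 20     # minimum goal completions in the past year

pages_of_value = landing_pages[
    (landing_pages["sessions"] >= SESSION_THRESHOLD)
    | (landing_pages["goal_completions"] >= CONVERSION_THRESHOLD)
]
print(pages_of_value)
```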

3. Content architecture audit:
Pages of value discovered: Priority page supporters
Tools used: Screaming Frog (or similar)

At this point, you should have an understanding of which pages are performing well on the website, whether it’s through rankings, traffic or conversions, and you should also know which pages/services/products are going to be a priority going forward for the business. 

Using Screaming Frog, crawl each website. You’ll then need to find the landing pages that are deemed a priority by searching for them in the “Search” box. In the bottom navigation menu, you can then click on “Inlinks.” This will show you all of the pages that are linking internally to the priority page and may be supporting its success. Keep in mind that if your priority page is in the footer or main navigation, every page on the site will likely link to it, so this gives you an idea of where that page should sit within your sitemap hierarchy. 

Of particular importance are any pages that are linking internally to the priority page using keyword-optimized anchor text, but other internal links may be helping too. At this point, you need to look through the list of internal linking pages, decide which ones are still relevant, and keep them in the sitemap if possible.

4. Backlink audit:
Pages of value discovered: Pages with backlinks
Tools used: Majestic SEO (or similar)

Next, we need to try and retain any pages that have strong, authoritative backlinks pointing to them. The best way to retain the value from the backlink is to replicate the page on the new site and redirect it appropriately. Later down the line you can then contact the owners of the site linking to that page and ask them to update it to the new domain.

Using Majestic SEO, search each of your domains, and filter by “Root Domain.” That way, you can see all of the backlinks across your site. Then, head to the “Backlinks” tab and export the data (ideally, you will look at “All backlinks per domain,” as this will show you if you have multiple pages being linked to from a single domain, but you may hit a limit on how many you can download, depending on your subscription). If there are less than 5,000 backlinks on your site, you can go ahead and export the data, but if you have more than this, you will need to create and download an Advanced Report. 

Once you have exported your data, you can sort by “TargetURL,” which will give you an understanding of which pages have the most backlinks and are a higher priority to keep. Majestic SEO has “TrustFlow” and “CitationFlow” scores which will give you an indication of the quality of those backlinks. Depending on the size/quality of the backlink profile, you may again need to set a limit on the quantity/quality of backlinks you want to retain and add those pages with high quantity/quality of backlinks to your sitemap.
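
Once the export is in hand, counting links per target URL is a short script. The rows below are invented, and the TargetURL/TrustFlow column names simply follow the naming mentioned above.

```python
import pandas as pd

# Invented rows shaped like a backlink export.
backlinks = pd.DataFrame({
    "TargetURL": ["/guide/", "/guide/", "/pricing/"],
    "SourceURL": ["https://news.example.org/a", "https://blog.example.net/b", "https://forum.example.io/c"],
    "TrustFlow": [38, 22, 9],
})

MIN_TRUST_FLOW = 15   # illustrative quality cut-off

summary = (
    backlinks[backlinks["TrustFlow"] >= MIN_TRUST_FLOW]
    .groupby("TargetURL")
    .agg(links=("SourceURL", "count"), avg_trust_flow=("TrustFlow", "mean"))
    .sort_values("links", ascending=False)
)
print(summary)
```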

5. Defining the information architecture:

Now that you know which historic pages hold SEO value, you need to define the information architecture in order to better enable crawling and indexing of priority pages. The safest way to migrate pages and retain their value is to keep URL structures as they are, but this most likely won’t be possible when bringing multiple sites together, so you’ll need to consider two things: one, the priorities for the new business, i.e., which pages are going to represent the main offering; and two, the performance of any existing pages on the old domains that represent those offerings.

If one of the existing sites is performing much better in terms of rankings, traffic and conversions than the other sites, and you are bringing across multiple pages from that domain, it makes sense to try and maintain that URL structure, if possible, and then replicate that across any similar pages coming from the other domains. Migrations do present an opportunity to improve URL structure, but as previously mentioned, the safest bet is to maintain current structures and not deepen the crawl depth/folder level of priority pages if possible.

Simply put, your top performers/priority pages need to sit at the top of the information architecture, and maintain as much of their current URL structure as possible. You can visualize your new sitemap and information architecture, and also ensure you have a spreadsheet that details all of the information you have discovered during this audit process for each page so that you, your client or your boss can see the justification behind each page.

Defining the sitemap/information architecture using any existing data is only one step in the migration process, but it is without a doubt one of the most important steps as it can lead to retained traffic, rankings, conversions and brand presence in the SERPs. That doesn’t mean you can then ignore the technical setup, landing page design, content optimization or any of the other factors that go into a successful migration, but for business mergers, this is one of the best places to start.

The post How to maintain organic performance when merging multiple websites appeared first on Search Engine Land.


Google drops safe browsing as a page experience ranking signal

Google is removing the safe browsing signal from the Google page experience update, the company announced. Google said “we recognize that these issues aren’t always within the control of site owners, which is why we’re clarifying that Safe Browsing isn’t used as a ranking signal and won’t feature in the Page Experience report.”

As a reminder, the page experience update is rolling out; it has been since June 15th and will continue to roll out through the end of August.

New page experience diagram. Here is the new diagram that removes “safe browsing” from the list of page experience signals:

You can compare it to the original diagram:

Why is Google removing safe browsing? Google said it is removing this as a signal because these issues are not always in the control of site owners. Google said “sometimes sites fall victim to third-party hijacking.” Google will continue to flag these notifications in Search Console, but outside of the page experience report.

Google is also removing the Ad Experience widget, the company said, “to avoid surfacing the same information on two parts of Search Console.” But Ad Experience was never used in the Google page experience update.

More changes to the page experience report. Google made additional changes to the page experience report including:

  • Added a “No recent data” banner to the Core Web Vitals report and Page Experience report.
  • Fixed a bug that caused the report to show “Failing HTTPS” when Core Web Vitals data was missing.
  • Rephrased the empty state text in the Page Experience report and Core Web Vitals report.

Why we care. This is one less ranking signal and factor you need to worry about when it comes to your performance in Google Search. Of course, you don’t want to provide an unsafe browsing experience for your users; you can still learn about those issues in Search Console, but they won’t count against you in your rankings.

The post Google drops safe browsing as a page experience ranking signal appeared first on Search Engine Land.


Meet Make Every feature Binary: Bing’s sparse neural network for improved search relevance

Bing has introduced “Make Every feature Binary” (MEB), a large-scale sparse model that complements its production Transformer models to improve search relevance, the company announced Wednesday. This new technology, which is now running on 100% of Bing searches in all regions and languages, has resulted in a nearly 2% increase in clickthrough rate for the top search results, a reduction in manual query reformulation by more than 1% and a 1.5% reduction of clicks on pagination.

What MEB does. MEB maps single facts to features, which helps it achieve a more nuanced understanding of individual facts. The goal behind MEB seems to be to better mimic how the human mind processes potential answers.

This stands in contrast to many deep neural network (DNN) language models that may overgeneralize when filling in the blank for “______ can fly,” Bing provided as an example. Most DNN language models might fill the blank with the word “birds”.

“MEB avoids this by assigning each fact to a feature, so it can assign weights that distinguish between the ability to fly in, say, a penguin and a puffin,” Bing said in the announcement, “It can do this for each of the characteristics that make a bird—or any entity or object for that matter—singular. Instead of saying ‘birds can fly,’ MEB paired with Transformer models can take this to another level of classification, saying ‘birds can fly, except ostriches, penguins, and these other birds.’”

Discerning hidden intent. “When looking into the top features learned by MEB, we found it can learn hidden intents between query and document,” Bing said.

Examples learned by MEB model. Image: Bing.

MEB was able to learn that “Hotmail” is strongly correlated to “Microsoft Outlook,” even though the two aren’t close in terms of semantic meaning. Hotmail was rebranded as Microsoft Outlook and MEB was able to pick up on this relationship. Similarly, it learned the connection between “Fox31” and “KDVR” (despite there being no overt semantic connection between the two phrases), where KDVR is the call sign of the TV channel that operates under the brand Fox31.

MEB can also identify negative relationships between phrases, which helps it understand what users don’t want to see for a given query. In the examples Bing provided, users searching for “baseball” don’t typically click on pages talking about “hockey” even though the two are both popular sports, and the same applies to 瑜伽 (yoga) and documents containing 歌舞 (dancing and singing).

Training and scale. MEB is trained on three years of Bing search that contain more than 500 billion query/document pairs. For each search impression, Bing uses heuristics to gauge whether the user was satisfied with the result they clicked on. The “satisfactory” documents are labeled as positive samples and other documents in the same impression are labeled as negative samples. Binary features are then extracted from the query text, document URL, title and body text of each query/document pair and fed into a sparse neural network model. Bing provides more specific details on how MEB works in its official announcement.
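
Bing has not published the exact feature definitions, but a toy way to picture “making every feature binary” is hashing query/document term pairs into a very large sparse space of indicator features, each of which would carry its own learned weight. Everything in the sketch below, including the feature space size and the use of title text only, is my own simplification, not Bing’s implementation.

```python
import hashlib

FEATURE_SPACE = 2 ** 24   # toy-sized sparse feature space; MEB's real space is far larger

def binary_features(query: str, doc_title: str) -> set[int]:
    """Hash each (query term, title term) pair to a binary feature index (illustration only)."""
    features = set()
    for q_term in query.lower().split():
        for d_term in doc_title.lower().split():
            digest = hashlib.md5(f"{q_term}|{d_term}".encode()).hexdigest()
            features.add(int(digest, 16) % FEATURE_SPACE)
    return features

# Each active index stands in for one "fact" the model can weight independently.
print(sorted(binary_features("hotmail login", "Microsoft Outlook sign in")))
```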

How MEB is refreshed on a daily basis. Image: Bing.

Even after being implemented into Bing, MEB is refreshed daily by continuously training on the latest daily click data (as shown above). To help mitigate the impact of stale features, each feature’s timestamps are checked and the ones that have not shown up in the last 500 days are filtered out. The daily deployment of the updated model is also fully automated.

What it means for Bing Search. As mentioned above, introducing MEB on top of Bing’s production Transformer models has resulted in:

  • An almost 2% increase in clickthrough rate on the top search results (above the fold) without the need to scroll down.
  • A reduction in manual query reformulation by more than 1%.
  • A reduction of clicks on pagination by over 1.5%.

Why we care. Improved search relevance means that users are more likely to find what they’re looking for faster, on the first page of results, without the need to reformulate their queries. For marketers, this also means that if you’re on page 2 of the search results, your content probably isn’t relevant to the search.

MEB’s more nuanced understanding of content may also help to drive more traffic to brands, businesses and publishers, since the search results may be more relevant. And, MEB’s understanding of correlated phrases (e.g., “Hotmail” and “Microsoft Outlook”) and negative relationships (e.g., “baseball” and “hockey”) may enable marketers to spend more time focusing on what customers are really searching for instead of fixating on the right keywords to rank higher.

For the search industry, this may help Bing maintain its position. Google has already laid out its vision for MUM (although we’re far from seeing its full potential in action), and MEB may bolster Bing’s traditional search capabilities, which will help it continue to compete against the industry leader and other search engines.

The post Meet Make Every feature Binary: Bing’s sparse neural network for improved search relevance appeared first on Search Engine Land.
