Category: SEO

Google December 2021 product reviews update is finished rolling out

Google has confirmed that the December 2021 product reviews update has finished rolling out, officially completing just a few days before Christmas.

The announcement. “The Google product review update is fully rolled out. Thank you!” Google’s Alan Kent wrote on Twitter.

December 2021 product reviews update. As a reminder, the December 2021 product reviews update started to roll out at about 12:30pm ET on December 1, 2021. The rollout took 20 days after it was announced, starting on December 1, 2021 and lasting through December 21, 2021.

When and what was felt. Based on early data, this was not a small update. It appeared bigger than the April 2021 product reviews update and remained fairly volatile throughout the rollout. Community chatter and tracking tools showed consistently elevated levels over the past few weeks.

Why we care. If your website offers product review content, you will want to check your rankings to see if you were impacted. Did your Google organic traffic improve, decline or stay the same?

Long term, you will want to put more detail and effort into your product review content so that it is unique and stands out from the competition on the web.

More on the December 2021 product reviews update

The SEO community. The December 2021 product reviews update, like I said above, was likely felt more than the April version. I was able to cover the community reaction in one blog post on the Search Engine Roundtable. It includes some of the early chatter, ranking charts and social shares from some SEOs. In short, if your site was hit by this update, you probably felt it in a very big way.

What to do if you are hit. Google has given advice on what to consider if you are negatively impacted by this product reviews update. We posted that advice in our original story over here. In addition, Google provided two new best practices around this update: one recommending more multimedia in your product reviews and the other recommending links to multiple sellers, not just one. Google posted these two items:

  • Provide evidence such as visuals, audio, or other links of your own experience with the product, to support your expertise and reinforce the authenticity of your review.
  • Include links to multiple sellers to give the reader the option to purchase from their merchant of choice.

Google product reviews update. The Google product reviews update aims to promote review content that is above and beyond much of the templated information you see on the web. Google said it will promote these types of product reviews in its search results rankings.

Google is not directly punishing lower quality product reviews that have “thin content that simply summarizes a bunch of products.” However, if you provide such content and find your rankings demoted because other content is promoted above yours, it will definitely feel like a penalty. Technically, according to Google, this is not a penalty against your content; Google is simply rewarding sites with more insightful review content by ranking them above yours.

Technically, this update should only impact product review content and not other types of content.

More on Google updates

Other Google updates this year. This year we had a number of confirmed updates from Google and many that were not confirmed. Google MUM rolled out in June for COVID names and was lightly expanded for some features in September, but MUM is unrelated to core updates. The confirmed updates included the June 23rd spam update, the June 28 spam update, the Google page experience update, the Google predator algorithm update, the June 2021 core update, the July 2021 core update, the July link spam update and the November spam update.

Previous core updates. The most recent previous core update was the November 2021 core update, which rolled out hard and fast and finished on November 30, 2021. Before that came the July 2021 core update, which was quick to roll out (much like this one), and the June 2021 core update, which was slow to roll out but a big one. Earlier, the December 2020 core update was very big, bigger than the May 2020 core update, which was itself big and broad and took a couple of weeks to fully roll out. Before that was the January 2020 core update; we had some analysis on that update over here. The one prior to that was the September 2019 core update, which felt weaker to many SEOs and webmasters, as many said it didn’t have as big an impact as previous core updates. Google also released an update in November 2019, but that one was specific to local rankings. You can read more about past Google updates over here.

The post Google December 2021 product reviews update is finished rolling out appeared first on Search Engine Land.


Google Search Console accessibility issues fully resolved five days later

Last Thursday, many SEOs noticed that Google Search Console was inaccessible. Google confirmed the issue saying “We’re aware of an issue with Search Console that prevents some users from using the service. We’re working on fixing it and we’ll post an update when the issue is resolved.” The issue was not impacting all users, but it did impact many users.

The issue is now officially resolved, five days after it was first confirmed. Google posted in an update “The issue is now resolved. Thanks for the patience.”

The notices. Google posted two notices about this issue on Twitter, one confirming the problem and one announcing the fix.

Resolved earlier. I believe this was mostly resolved earlier, within about 48 hours of the issue, but Google probably only fully restored access five days later. It is not clear exactly what the issue was, but it seemed to me it was around server capacity and resources for the Search Console tools. Again, that is not confirmed, but the errors displayed were HTTP 429 responses served by Apache; 429 is the “Too Many Requests” status code, which “indicates the user has sent too many requests in a given amount of time (‘rate limiting’).”
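For anyone unfamiliar with how 429 rate limiting works on the client side, here is a minimal, generic Python sketch (not Google's code, and the URL is a placeholder) of backing off and retrying when a server answers 429 and optionally sends a Retry-After header.

```python
import time
import requests

def fetch_with_backoff(url, max_retries=5):
    """Retry a GET request when the server responds with HTTP 429 (Too Many Requests)."""
    delay = 1
    for attempt in range(max_retries):
        response = requests.get(url, timeout=30)
        if response.status_code != 429:
            return response
        # Honor the Retry-After header if the server provides one,
        # otherwise fall back to exponential backoff.
        wait = int(response.headers.get("Retry-After", delay))
        time.sleep(wait)
        delay *= 2
    raise RuntimeError(f"Still rate limited after {max_retries} attempts: {url}")

# Example: fetch_with_backoff("https://example.com/report")
```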

Why we care. If you were having issues with accessing Google Search Console, those issues should now be fully resolved. Some of you may already be off on your holiday break, so I guess any reporting you need to run can wait until you return.

The post Google Search Console accessibility issues fully resolved five days later appeared first on Search Engine Land.


How to optimize your site for better findability

“You wouldn’t build a house without having a strong foundation,” Niki Mosier, head of SEO and content at AgentSync said, “And you shouldn’t build a website without having a strong foundation either, and without constantly making sure that foundation is strong and that there are no cracks in it.”

Optimizing the architecture of your site can help search engine crawlers find and index your content, which enables them to show those pages to users in the search results. It can also help send link authority around your site as well as make it easy for visitors to find what they’re looking for.

In her session at SMX Create, Niki Mosier shared the strategies she uses to ensure that the foundations of her sites are solid and to identify opportunities for greater search visibility.

Crawl budget analysis

Crawl budget refers to the number of URLs per site that Googlebot (or any other search engine crawler) can and wants to crawl.

“Every website gets a crawl budget, which can vary depending on the size of the site and the frequency that new content is being published on the site, so having an idea of what a website’s crawl budget is can be really beneficial in making informed decisions on what to optimize,” Mosier said.

Conducting a crawl budget analysis enables you to get a more comprehensive view of:

  • How your website is being crawled. “If you identify that Googlebot is the client, you can use log file analysis to find out how Googlebot is handling the URLs on your site [and] if it is crawling any pages with parameters,” she said.
  • How fast your site is. While there are many tools that can tell you how fast your server reacts, a log file analysis shows you how long it’s taking for a bot to download a resource from your server. 
  • Indexing problems. “Getting into the log files can really show us whether bots are having trouble downloading a page fully,” Mosier said.
  • How often a URL is being crawled. The crawl frequency can be used to figure out if there are URLs that a search engine crawler should be crawling but isn’t, or vice versa.
  • Crawling problems. This tactic can also reveal when a crawler is encountering 404 errors or redirect chains, for example.

“When it comes to actually doing crawl budget analysis, there’s a couple of tools that are helpful,” Mosier said, recommending Screaming Frog’s Log File Analyser, Microsoft Excel and Splunk.

Image: Niki Mosier.

Mosier outlined her steps to performing a crawl budget analysis:

  1. Obtain your log files; Mosier recommended working with at least a month of data.
  2. Look at URLs with errors.
  3. Assess which bots are crawling which areas of your site.
  4. Evaluate by day, week and month to establish patterns that may be useful for analysis.
  5. See if a crawler is crawling URLs with parameters, which may indicate wasted crawl budget.
  6. Cross-reference crawl data with sitemap data to assess for missed content.
SEOs should evaluate the impact of fixes, developer resources needed for fixes and the time to fix when prioritizing optimizations.
Image: Niki Mosier.

“Once you’ve dived into the server logs and have a good sense for what your crawl budget looks like, you can use this data to prioritize your SEO tasks,” she said, adding that SEOs should “prioritize based on the impact that fixing different areas of your site will have, the dev resources needed to fix issues and the time to fix those issues.”
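For SEOs comfortable with scripting, a similar first pass can be done without a dedicated tool. The sketch below is a hypothetical example, not part of Mosier's talk: it assumes a standard Apache/nginx combined log format and a placeholder file name, and counts Googlebot hits per URL, error responses and parameterized URLs (a proper audit would also verify Googlebot via reverse DNS).

```python
import re
from collections import Counter

# Assumed Apache/nginx "combined" log format; adjust the regex to your server's format.
LINE = re.compile(
    r'\S+ \S+ \S+ \[[^\]]+\] "(?:GET|POST|HEAD) (?P<url>\S+) \S+" '
    r'(?P<status>\d{3}) \S+ "[^"]*" "(?P<agent>[^"]*)"'
)

url_hits, error_hits, param_hits = Counter(), Counter(), Counter()

with open("access.log") as log:               # hypothetical file name
    for line in log:
        match = LINE.search(line)
        if not match or "Googlebot" not in match.group("agent"):
            continue                          # only look at Googlebot requests
        url, status = match.group("url"), match.group("status")
        url_hits[url] += 1
        if status.startswith(("4", "5")):
            error_hits[(status, url)] += 1    # crawl errors and redirect/server problems
        if "?" in url:
            param_hits[url] += 1              # possible wasted crawl budget

print("Most-crawled URLs:", url_hits.most_common(10))
print("Errors hit by Googlebot:", error_hits.most_common(10))
print("Parameterized URLs crawled:", param_hits.most_common(10))
```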

RELATED: How to optimize your website’s crawl budget

Driving traffic with technical SEO 

Finding out how well your site is functioning can help you put the right strategies in place to drive more traffic to it.

Factors that can help drive traffic to a site.
Image: Niki Mosier.

“Doing regular site audits is a great way to keep a pulse on what’s happening with our websites,” Mosier recommended. In addition, Google Search Console should be used to check for Core Web Vitals or schema issues, for example. “Using monitoring tools, [such as] Rank Ranger, Semrush and Ahrefs, these are great ways to stay alerted to any issues that might pop up with your website,” she said.

Assessing the search results pages (SERP) can give you a feel for the landscape of the keywords you’re targeting. In addition to seeing what search features may be available, the SERP also shows you which sites are ranking higher than you — “See what those sites are doing; looking at their source code can tell you what schema they’re using,” Mosier said, adding that you should also be viewing their pages to scope out what their headings and user experience look like.
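Inspecting competitors’ source code for schema can also be scripted. Here is a minimal sketch with a placeholder URL that extracts the schema.org types declared in a page’s JSON-LD blocks; a thorough audit would also need to handle @graph wrappers and microdata.

```python
import json
import re
import requests

def schema_types(url):
    """Return the schema.org @type values declared in a page's JSON-LD blocks."""
    html = requests.get(url, timeout=30, headers={"User-Agent": "schema-check/0.1"}).text
    blocks = re.findall(
        r'<script[^>]+type=["\']application/ld\+json["\'][^>]*>(.*?)</script>',
        html, re.DOTALL | re.IGNORECASE,
    )
    types = []
    for block in blocks:
        try:
            data = json.loads(block)
        except json.JSONDecodeError:
            continue                              # skip malformed blocks
        items = data if isinstance(data, list) else [data]
        for item in items:
            if isinstance(item, dict) and "@type" in item:
                types.append(item["@type"])
    return types

print(schema_types("https://competitor.example.com/top-ranking-page"))  # placeholder URL
```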

Updating your old content can also result in a rankings boost. Mosier recommends paying extra attention to your headings and above-the-fold content. Adding schema markup may also enable your content to appear as a rich result, which may also increase your visibility on the SERP.

“Using tools like Frase or Content Harmony can help you see what other sites that are ranking for the keywords that you want to be ranking for are using for headings, what kind of FAQ content they’re using and what content they have above the fold,” she added.

“Paying attention to page speed is definitely an important metric to think about, [but] I think it’s also important to pay attention to what the industry average is,” Mosier said, “So, go and look at where your competitors’ sites are ranking or are at as far as page speed and kind of set that as your benchmark.”

It’s also important to assess individual page speed versus overall site speed: “You want to see what each page on your site is loading for and make improvements on a page-by-page basis and not just look at the site speed as a whole because pages are what is ranking, not necessarily the whole site,” she said.
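One way to benchmark speed page by page rather than site-wide is Google's public PageSpeed Insights API. The sketch below is an illustrative example only, with placeholder URLs, no API key and response fields as I understand the v5 API; it pulls the Lighthouse performance score for your own page and a competitor's so you can set a realistic benchmark.

```python
import requests

PSI = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

def performance_score(url, strategy="mobile"):
    """Fetch the Lighthouse performance score (0-1) for a single URL via PageSpeed Insights."""
    response = requests.get(PSI, params={"url": url, "strategy": strategy}, timeout=120)
    response.raise_for_status()
    return response.json()["lighthouseResult"]["categories"]["performance"]["score"]

# Compare your pages against competitor pages to set a realistic benchmark.
pages = [
    "https://www.example.com/product-page",        # placeholder URLs
    "https://competitor.example.com/product-page",
]
for page in pages:
    print(page, performance_score(page))
```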

Additionally, how your pages render can affect your user experience as well as what search engine crawlers “see.” “Is there a pop-up or a really big header on a particular page that’s taking up a lot of the above-the-fold space? That can be a problem,” Mosier said, noting that page speed can also impact how search engines render a page.

More from SMX

The post How to optimize your site for better findability appeared first on Search Engine Land.


How to gain SEO insights using data segmentation

Recognizing patterns is a major part of successful SEO strategies, yet it can seem daunting with the sheer amount of data available. Murat Yatağan, consultant for growth and product management at Brainly, recommends marketers address these issues using data segmentation tactics.

“You need to focus on generating insights based on patterns that you can recognize,” said Yatağan in his presentation at SMX Next. “An important part of a successful SEO strategy is relying on these patterns that you have recognized — these things are telling you a story.”

Yatağan suggests marketers use one of two data segmentation tactics — with a developer’s help, if needed — to serve as the foundation for their SEO strategy: custom scraping using Regex/XPath or segmentation post-crawl.

“I segment data by traffic,” he said. “It’s organic traffic along with the crawl information that I gathered because these two [metrics] enable me to triangulate the data about the website, so it shows me some patterns.”
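A minimal pandas sketch of that kind of triangulation might look like the following. It assumes, purely for illustration, that you have a crawl export (including custom fields scraped via XPath, such as author and word count) and an organic sessions report keyed by the same URLs; the file and column names are placeholders.

```python
import pandas as pd

# Placeholder exports: a crawl file (one row per URL, including custom-extracted
# fields such as author and word count) and an organic traffic report.
crawl = pd.read_csv("crawl_export.csv")          # columns: url, author, word_count, ...
traffic = pd.read_csv("organic_sessions.csv")    # columns: url, sessions

# Join the two sources on URL so every page carries both crawl and traffic data.
pages = crawl.merge(traffic, on="url", how="left").fillna({"sessions": 0})

# Segment by author: pages produced vs. organic sessions earned.
by_author = (
    pages.groupby("author")
    .agg(pages=("url", "count"), sessions=("sessions", "sum"))
    .sort_values("sessions", ascending=False)
)
print(by_author.head(10))
```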

Here are four common patterns Yatağan recommends marketers look for to gain insights from their data segments.

Author productivity and credibility

Yatağan presented an example of a campaign using segments to track authors, along with the numbers of the pages they have produced. Grouping the data this way gives SEOs insights into data they wouldn’t see by only looking at broad metrics, such as pageviews or dwell time.

Image: Murat Yatağan

“This shows you that it’s not just the amount of the articles that are being created, it’s more about the credibility,” he said. “By just looking into this data, you have a direction.”

Content quality and traffic

Obviously, content quality is a huge factor when it comes to SEO and readability. But, it’s often difficult to measure its impact on site traffic and rankings.

Yatağan gave an example of how he segmented data using word count. Though most SEOs consider this metric less relevant than other factors, it can still serve as a good measuring stick when comparing pieces of content.

“I don’t think word counts are good indicators of quality of the page by themselves,” he said. “But you can use them to compare different pages’ performance against each other.”

“So it’s not about the number of articles that were produced, it’s the amount of quality,” he added.

measuring content quality using word count and sessions
Image: Murat Yatağan

These metrics can help marketers determine which articles provided the most information and how well they were written.
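As a rough illustration of that comparison, the hypothetical pandas sketch below buckets pages by word count and reports the median organic sessions per bucket; the file name, column names and bucket boundaries are all placeholders.

```python
import pandas as pd

pages = pd.read_csv("pages_with_sessions.csv")   # placeholder: columns url, word_count, sessions

# Bucket pages by word count and compare median sessions per bucket.
bins = [0, 300, 600, 1000, 1500, 2500, float("inf")]
labels = ["<300", "300-600", "600-1000", "1000-1500", "1500-2500", "2500+"]
pages["word_bucket"] = pd.cut(pages["word_count"], bins=bins, labels=labels)

summary = pages.groupby("word_bucket", observed=True)["sessions"].agg(["count", "median"])
print(summary)   # medians resist being skewed by a handful of viral pages
```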

Similar content affecting traffic

Yatağan pointed to a graph of a content segment made up of pieces that were near-duplicates or thin content — both poor quality signals — and showed how those segments correlated with decreases in SEO sessions.

using data segments to measure thin and duplicate content's negative impact
Image: Murat Yatağan

Viewing similar content in this way can show a direct correlation between poor traffic numbers and low-quality content. These segments can serve as the foundation for content strategies.

“I’m not only saying it’s a strategy, but that it’s a finding that leads you to create a strategy,” Yatağan said.

Effects of internal links to content

One of the most effective ways to get more eyes on your content is internal linking. Internal links tell search engines and readers that a page contains important information. And if marketers are struggling to prove the worth of those links, data segmentation can help.

“Links are a big part of the web experience. So this is how you navigate the web,” Yatağan said. But, unfortunately, there are many instances where adding internal links to your pieces doesn’t lead to an increase in SEO sessions.

data segments measuring impact of internal linking
Image: Murat Yatağan

Yatağan suggests looking for those pieces that had the highest interaction via comments, click actions or some other activity. Then, focus your internal linking on those pieces.

“Identify the content that has the largest amount of comments,” he said, “Then you can restructure your internal linking and show your users and new visitors that there’s a big community talking about it.”

“Your community is also creating value that you have been adding to the website by creating this content,” he added. “So, it is part of your entire website experience and you might want to boost it.”
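One simple way to act on that advice is to rank pages by engagement and flag the ones that currently receive few internal links. The sketch below is a hypothetical example with placeholder column names and an arbitrary threshold, not a prescribed workflow.

```python
import pandas as pd

pages = pd.read_csv("engagement_and_links.csv")  # placeholder: columns url, comments, internal_inlinks

# High-engagement pages that receive few internal links are the first candidates
# for restructured internal linking.
candidates = (
    pages.sort_values("comments", ascending=False)
    .head(50)                                    # top pages by community engagement
    .query("internal_inlinks < 10")              # arbitrary threshold; tune to your site
)
print(candidates[["url", "comments", "internal_inlinks"]])
```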

Watch the full SMX Next presentation here (free registration required).

The post How to gain SEO insights using data segmentation appeared first on Search Engine Land.


Google confirmed serving issue with Google Search results

Google said “there’s an ongoing issue with our serving system in Google Search that’s affecting a small number of sites” on Twitter this morning. Google confirmed at 11:30am ET that it had fixed the issue. The issue started at about 4:30am ET and lasted several hours.

The tweet. Google announced the issue in a post on Twitter.

What is the issue. At around 4:30am there were reports, which I covered on the Search Engine Roundtable, that Google was having issues indexing or crawling or serving new content from both major and smaller publishers. I showed screenshots of Google not showing new content from publishers like the Wall Street Journal, New York Times, and niche publishers like TechCrunch and The Verge.


But then at around 6:30am ET, it seemed like Google fixed the issue. Well, it seems Google did not fully resolve the issue because now the company officially confirmed the issue on Twitter.

Smaller publishers? If I look at what Google indexed on my personal site in the past 24 hours, it shows a story from yesterday that was published before Google had these serving issues, and a story that I manually pushed to Google using the request indexing feature in Search Console. It does not show the several other stories I published this morning at the Search Engine Roundtable.

But even now, if I do a site: command for wsj.com and filter to show stories from the past hour, I am still seeing the issue, after I thought it was resolved.

Resolved. Google fixed the issue at around 11:30am ET today and posted notifications on Twitter confirming the fix.

Why we care. If you were having issues with Google not showing your recent content in Google Search today, December 16, 2021, do not worry; it was an issue on Google’s end, and Google has now resolved it.

While the issue was ongoing, the workaround was to use Google Search Console to push your content into Google Search and hope for the best. It is also a reminder that having other channels to drive traffic to your site, like social media, email marketing, direct traffic and other means, is important.

Now that the issue is fixed, you should see your content showing up in Google Search.

The post Google confirmed serving issue with Google Search results appeared first on Search Engine Land.


How to effectively deliver the results of your work to clients

One of the most challenging aspects of providing SEO services is proving the value of your work to clients. 

To address this problem, SE Ranking sat down with representatives of both digital agencies and businesses from the US, UK and Australia to discuss the main issues that occur in the communication between agencies and their clients and to find reasonable solutions for them. 

This article highlights actionable insights and useful advice shared during the discussion.

The agency side was represented by Jen Cornwell, Harry Sanders, and Lidia Infante, while Dave Lavinsky and Hayley Keith voiced their opinions as business representatives. 


“Why does it take so much time for SEO to show palpable results?!”

Clients often don’t fully understand all of the nuances and value of SEO. As a result, you will have a hard time finding an SEO contractor who hasn’t been asked multiple times why it takes so long to produce results.

Here are a few things you can do to relieve the anxiety of your clients and build trust from day one. 

  1. Set clear expectations. As soon as you start working with a client, let them know what results you can help them achieve. Give them a clear idea of what’s ahead by providing projections on what results they can see in 3, 6, 9, and 12 months.
  2. Communicate with and educate clients. Make sure to spend enough time explaining to clients that SEO is a massive field with many moving parts where each part needs a different amount of time and effort to produce results. Besides talking to clients, build a good rapport with their in-house teams too, and make sure they’re also on the same page.
  3. Provide interim results. Your clients must be able to see actual progress as soon as possible. Alleviate their concerns by showing specific progress that’s been made since you started optimizing their website.

For example, let’s take tech SEO. You can audit the website of your client using website audit tools, such as SE Ranking, to let them know how healthy their website is from a technical perspective.

Then, you can explain which issues to focus on and why in, say, the first month, as well as during the first three months. That way, your clients will have a better picture of what to expect.

Plus, the Website Audit also contains detailed information on each issue in terms of why it’s important for SEO and how it can be fixed, allowing your clients to learn everything they need to better understand the urgency of fixing particular errors. 

And as for providing interim results, you can compare several audit reports to show clients an overview of the progress that’s been made. 

The feeling of having control over the situation is very important for your clients. That is why a bit of education supported by clear self-explanatory reports will help you explain the key things that matter for SEO and show the progress of your work. 

“I don’t think these changes are necessary. We have more important things to do!”

Another issue that agencies often come across is that clients don’t implement all the necessary changes. For example, your client doesn’t see the need to create new content, update their landing pages, or build backlinks. So, what do you do?

  1. Find a common point of agreement. Make your clients feel that you are on their side and want to achieve the same things they do. While it’s okay for SEO and business reps to see things differently on occasion, it’s important to articulate what the common objective is.
  2. Provide stats and case studies to prove the validity of your point when it comes to dealing with big changes and investments. Your clients probably want to take the beaten path to success instead of being trailblazers.
  3. Prove the value of suggested changes by providing examples from competitor websites. With a Competitive Research tool, you can gather the necessary data to show your clients how their website needs to be optimized in order to catch up or even outrank the competition.

Depending on your task, you can point out the keywords that competitors use, the amount of optimized content, the quality of their links, the technical performance of their websites, and so on. 

  4. Prioritize what clients need to implement. Clients are often too busy with their business and need your help to set their SEO priorities straight. To do this, score everything you need a client to do by its potential effectiveness and urgency, plus point out how much time each issue can take.

“I don’t understand what I am paying you for!”

Sometimes, clients don’t really understand or haven’t the slightest idea of what their SEO agency is currently working on. What this means is that they don’t know what they’re paying for, which can end up being harmful to both clients and agencies. 

Here’s the advice our experts shared that will help your clients avoid having any question marks.
 

  1. Be extra transparent and honest. Make sure to be as open as possible in your communication with clients. Present them with a detailed plan of action and hold regular sync-up meetings to report on what’s been done and what you are planning to do next. Communicating KPI progress reports also helps achieve this.
  2. Talk about your own mistakes. If the strategy you’ve opted for didn’t work, don’t try to cover it up with some vague conclusions. Instead, be open and frank, and have a plan of recovery ready to make up for the losses in the future.

To avoid your clients guessing what they are paying you for, take advantage of a Report Builder tool to send out regular reports. Depending on how often your clients want reports, you can send them on a daily, weekly or monthly basis, or on demand.

Sit with your client and agree on the data points that must be included in the report and how often they want to get such reports. You can design and personalize the report however you want and add all the necessary data points to make sure the client understands exactly where you are and where you are going. 

“I’m overwhelmed with the technical details you are sharing with me and don’t understand why it matters for my business!”

In addition to providing SEO reports full of easy-to-read charts and tables, you must make sure your clients, as well as their teammates, understand all of the data they see. So, besides focusing on SEO progress, provide additional information that shows how each aspect of your work correlates with their business goals.

Let’s take a look at some actionable tips you can benefit from.

  1. Set up tracking straight away. Don’t delay getting access to your clients’ analytics and verify that you are tracking every part of the client website. That way, you will be able to make data-informed decisions along the way. Keep in mind that linking SEO deliverables to business parameters is key. 

If you want to give your clients the opportunity to check in on specific aspects of your work whenever they choose to, you can also share a guest link with them. Sharing a guest link facilitates the work of SEO agencies. 

  2. Provide context to your reports. Don’t just throw raw SEO data at your clients; provide detailed explanations of what exactly you’ve been working on, why it’s important, what the objectives were, and which parameters indicate the results. Adding comments to each report section and tailoring your reports to each specific client can go a long way.
  3. Know who you’re talking to. Prepare different reports for the CEO of a client company and for that company’s SEO or marketing team. After all, they have different levels of engagement in the actual SEO process and need different levels of detail in the information you provide. So, before putting together a report, find out whose table it’s going to end up on.

Remember that smart reporting is not just about sending out visually stunning graphs to your clients, but about providing enough relevant data along with all of the necessary context. 

Final thoughts

It’s absolutely critical that you and your clients realize that you are playing for the same team and that you are both invested in the growth of the client’s business. So keeping your clients as well-informed as possible is key. 

Dedicate the time to make sure your clients understand what you are planning to do or are already doing to their website. That way, you won’t have any unpleasant surprises further down the road. Also, be transparent and honest in your communication so that the client doesn’t just know when everything is going according to plan but when things fall through as well.

While there are multiple ways of communicating with your clients, it’s best to manage it all from one central location like SE Ranking which gives you all of the SEO data you need to carry out your work as well as the tools you need to keep your clients fully informed.

The post How to effectively deliver the results of your work to clients appeared first on Search Engine Land.


How marketers can prepare for what’s next in page experience

When Google’s Page Experience update finished rolling out in early September, it changed how the search engine evaluates websites — namely, a new emphasis on user experience signals. In his session at SMX Next, Patrick Stox, product advisor for technical SEO and brand ambassador at Ahrefs, noted some important alterations to this update that took place in the months following, which continue to leave many SEOs confused.

“Safe browsing is already out,” he said, “And cumulative layout shift has changed a bit. It’s the five seconds where the most shifting occurs. Google has also removed AMP from Top stories requirements and many news sites are looking at dropping it.”

Image: Patrick Stox.

One of the key pieces of the Page Experience update — Core Web Vitals — is also one of the most hotly debated. SEOs and agencies have questioned how big of a difference these metrics make in terms of rankings, leading many, such as Stox, to take a deeper look at the data.

“We looked at about 5.2 million individual pages, which I think is the largest data set that’s been studied now,” he said. His study found that only 11.4% of them met the recommended standards for Core Web Vitals.

This raises the question: Is Core Web Vitals optimization necessary?

“Maybe these are small ranking factors,” said Stox, “But I think if you argue this from an SEO standpoint, you’re going to fail. Many Google employees have now said these are small factors, that these are tiebreaker signals.”

“It’s going to be hard to get any prioritization when it’s not going to have a huge impact for SEO,” he added.

Still, Core Web Vitals point to some key principles of page experience. Even if they fail to provide the ranking boosts SEOs hope for, these signals point to aspects of user experience that are necessary if marketers want to convert more visitors. And, as Google continues to add these kinds of signals to its algorithms, your site’s visibility is that much more likely to improve.

Stox offered five concepts marketers should understand and prioritize when optimizing with the Page Experience update in mind.

Smaller is faster

Largest Contentful Paint (LCP) measures perceived load speed, and it’s an important user-centric metric. This is the point at which the main content on a page has loaded.

It’s no surprise that users want their content loaded as quickly as possible. But to make that happen, marketers must work with their development teams to cut down resource sizes — the smaller the resource, the faster the page will load and the better the page’s LCP score will be.

Image: Patrick Stox.

“If you don’t need something, don’t load it,” said Stox. “A smaller site is faster, so that means you need to zip your JavaScript files, CSS and HTML — make everything as small as possible. Get rid of things you don’t need.”

Server location matters

The location of your site’s servers has a direct impact on its page experience. That’s why Stox recommends marketers and site owners use a content delivery network (CDN). These geographically distributed server networks can work together to enhance user experience.

“Pretty much every article will tell you to use a CDN,” said Stox. “Simply, location matters. The connection time that it takes, the amount of time that it takes to get things from a server.”

He added, “If you have copies of your site all over the world, that time is cut out.”

Image: Patrick Stox.

CDNs can be particularly effective for large sites because they enable them to draw in resources from locations across the globe. That, in turn, can help ensure your content gets to searchers when they need it.

Use the same server

Stox noted that marketers should opt to use a single server if possible when improving page experience signals. Each additional server connected adds further delay in rendering, which could end up causing slow load times on pages.

Image: Patrick Stox.

“Keep as much as you want or as much as you can on the same server,” he said. “Every connection to a different location takes additional time, additional roundtrips.”

“If you are going to use additional servers, you need DNS-prefetch and preconnect,” he added.

Image: Patrick Stox.

Adding preconnect and DNS-prefetch code can serve as an alternative. These can help sites establish early connections between servers.
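As a rough helper for that audit, the hypothetical sketch below lists the third-party hosts a page pulls resources from and prints the corresponding resource hints to consider adding to the page’s head. The URL is a placeholder, and the regex only catches straightforward src/href references.

```python
import re
from urllib.parse import urlparse

import requests

PAGE = "https://www.example.com/"                # placeholder URL

html = requests.get(PAGE, timeout=30).text
own_host = urlparse(PAGE).netloc

# Collect hosts referenced by src/href attributes that point off-site.
hosts = set()
for url in re.findall(r'(?:src|href)=["\'](https?://[^"\']+)', html):
    host = urlparse(url).netloc
    if host and host != own_host:
        hosts.add(host)

# Print resource hints to consider adding to the <head> for critical third parties.
for host in sorted(hosts):
    print(f'<link rel="preconnect" href="https://{host}" crossorigin>')
    print(f'<link rel="dns-prefetch" href="https://{host}">')
```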

Ensure caching is set up correctly

Setting up effective caching can help provide a good page experience over the long run, Stox argues. He recommends SEOs and site owners use this function regularly to put less strain on servers — even if the first page takes slightly longer to load.

“Cache as much as possible,” Stox said. “Your first load may take longer, but then every other page after that — they’ve already got your CSS downloaded, your JavaScript downloaded, your fonts downloaded. They’re in their browser. They’re stored locally at that point, which means it’s going to be super fast.”

Image: Patrick Stox.
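A quick way to audit this is to check which caching headers your static assets actually send. The sketch below is a minimal, illustrative example with placeholder asset URLs.

```python
import requests

# Placeholder asset URLs; swap in your site's CSS, JavaScript and font files.
assets = [
    "https://www.example.com/static/app.css",
    "https://www.example.com/static/app.js",
]

for asset in assets:
    headers = requests.head(asset, timeout=30, allow_redirects=True).headers
    print(asset)
    print("  Cache-Control:", headers.get("Cache-Control", "missing"))
    print("  Expires:      ", headers.get("Expires", "missing"))
    print("  ETag:         ", headers.get("ETag", "missing"))
```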

Prioritize items in page resource loading

Web pages need to load many resources before users can view and engage with them. But some resources are more important than others.

“In the browser, you have fonts, CSS, HTML and JavaScript — they [the servers] have to figure out what to prioritize,” Stox said. “You need to load things that are going to make up the initial viewport — the things that people are going to see first — and then everything else comes later.”

Image: Patrick Stox

The goal should be to load the things people see first, then load secondary resources. Structuring your resource loading with inline coding this way can help you deliver your most important content to users faster.

“When I say ‘inline’ CSS, what I mean is taking part of the CSS file and putting it in the HTML,” he said. “So when the HTML is being downloaded, I get the CSS that I need for the visible page. I don’t have to wait for another CSS file to download for that to be processed and the render to start.”

“We’re going to be prioritizing things that are needed and the render can just happen a little earlier this way,” he added.
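The snippet below is a very rough sketch of that inlining step done at build time, with placeholder file names; in practice this is usually handled by a build tool or a critical-CSS generator rather than by hand.

```python
from pathlib import Path

# Placeholder file names; in practice a build tool usually handles this step.
html = Path("index.html").read_text()
critical_css = Path("critical.css").read_text()

# Inline the critical rules so the initial viewport can render without waiting
# for the full stylesheet to download.
inline_block = f"<style>{critical_css}</style>"
html = html.replace("</head>", f"{inline_block}\n</head>", 1)

Path("index.optimized.html").write_text(html)
```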

Ultimately, as Stox notes, platforms like Google will become advanced enough that SEOs won’t have to worry too much about these technical aspects of page experience: “I think the platforms are going to solve a lot of these speed issues for you so that people won’t have to worry about this as much. SEOs won’t have to bother devs and devs won’t necessarily have to focus on it.”

But that future is still a long way off. Optimizing for Core Web Vitals — even if they’re just a tiebreaker — can bring audiences to your site instead of a competitor’s.

Watch the full SMX Next presentation here (free registration required).

The post How marketers can prepare for what’s next in page experience appeared first on Search Engine Land.


Google issue sent Search Console redirect error notifications; Google will fix issue

Over the weekend, an internal Google issue caused Search Console to send out notifications about redirect errors. Google confirmed those emails were sent because of that internal issue, not because of any problem with the affected websites.

Bug confirmed. Google confirmed the issue on the Search Console Twitter account.

The email. Here is a sample of what the email looked like. It read: “Coverage issues detected on domain. Search Console has identified that your site is affected by 1 Coverage issues: Top Errors. Errors can prevent your page or feature from appearing in Search results. The following errors were found on your site: Redirect error. We recommend that you fix these issues when possible to enable the best experience and coverage in Google Search.”

When you clicked on “fix coverage issues,” Google would take you into the Search Console Coverage report, drilled down directly into the Redirect error statuses page.

More screenshots of this issue. More screenshots of these notifications were shared on Twitter.

Why we care. Just to be clear, if you received this notification, you do not need to take action. Google will fix the coverage report in Search Console and the redirect errors should go away. Google will likely notify us when the report is fixed.

If you receive this error in the future, you probably want to address it. But if you received this notification over the past weekend, you can likely safely ignore the notification.

The post Google issue sent Search Console redirect error notifications; Google will fix issue appeared first on Search Engine Land.


Top 4 backlink API vendors compared (1 million domain study)

For a couple of years, SEO PowerSuite has been working on enhancing its backlink index, which is used by the SEO PowerSuite backlink API and the SEO SpyGlass backlink tool. In June 2021, a huge infrastructure update went live, tripling our crawling speed.

To better understand our progress and see where we stand, we’ve decided to run an in-house analysis of the top backlink database providers — Ahrefs, SEO PowerSuite, Semrush, and Majestic.

We’ve described all the major findings of the comparison here, but more details can be found on this page.

Disclaimer: For this study, we purchased API keys from each of the compared providers ($6,200 worth in total). In this report, SEO PowerSuite has enclosed the raw data files for your reference. To verify that the data hasn’t been modified in any way, you can reach out to the mentioned companies and purchase similar stats from them to run your own comparison.

The methodology

To run this analysis, we decided to use the Majestic Million dataset.

Why this dataset? The Majestic Million is a list of the top million sites on the web, based on the number of citations (aka links) from other websites. That’s why each of these domains should have sufficient backlinks to make the comparison statistically significant.

So, we have taken this dataset and analyzed the results from the following backlink index providers:

SEO PowerSuite

SEO PowerSuite backlink index counts 3.6 trillion external backlinks and 264 million indexed domains.

The cost of checking backlink stats for 1M domains: $899.

Ahrefs

According to the official website, Ahrefs has 3.2 trillion external backlinks and 193 million indexed domains.

The cost of checking backlink stats for 1M domains: $2,000.

Semrush

Semrush claims to have over 43 trillion backlinks and 1.6 billion domains in its index.

The cost of checking backlink stats for 1M domains: $2,500.

Majestic

We haven’t found any information on the number of backlinks and referring domains in the Majestic backlink index. The only stats that are available on the official website are that the Fresh backlink index has 434 billion URLs crawled and over 1 trillion URLs found.

The cost of checking backlink stats for 1M domains: $800.

After setting up the accounts and checking the data for one million domains, we then calculated

  • the total wins by referring domains;
  • the total wins by backlinks;
  • how often each of the vendors occupied the 1st, 2nd, 3rd and 4th position in the comparison (for links and referring domains);
  • and a few more metrics covered on this page.

The study was conducted in autumn 2021.

Findings

Who wins by the number of referring domains?

First off, we looked at the number of referring domains returned by every provider, the metric that lets us evaluate the backlink index coverage.

When it came to the largest number of reported backlink domains, SEO PowerSuite led the way with 44.4% wins:

The backlink index by Semrush took second place with over 36.3% wins. Ahrefs received 14% wins and Majestic 5.3%.

Please keep in mind that the graph above highlights the number of wins. Here’s how the quantity of reported referring domains looked in absolute numbers:

What’s the difference between wins and absolute numbers? Here’s a simple example: let’s imagine we’ve checked how many referring domains each of the analyzed databases reported for the same URL:

  • SEO PowerSuite: 100 referring domains
  • Ahrefs: 101 referring domains
  • Semrush: 99 referring domains
  • Majestic: 100 referring domains

In this case, Ahrefs would receive the “point” for their “win.” And then, we repeat the process for the remaining 999,999 domains.
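In code, that win counting reduces to a per-row comparison. The hypothetical pandas sketch below assumes a CSV with one row per domain and one referring-domain count column per vendor; note that ties simply go to the first column listed here, which the actual study may handle differently.

```python
import pandas as pd

# Placeholder file: one row per domain, one column of referring-domain counts per vendor.
data = pd.read_csv("referring_domains_1m.csv")
vendors = ["seo_powersuite", "ahrefs", "semrush", "majestic"]

# The vendor reporting the highest count for a domain takes the "win" for that row.
wins = data[vendors].idxmax(axis=1).value_counts()
print(wins / len(data) * 100)        # share of wins per vendor, in percent
```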

🥇 The winner in the referring domains category: SEO PowerSuite

Who wins by the number of backlinks?

For the total number of backlinks, Semrush came out on top with 47.65% wins. It was followed by SEO PowerSuite (21.38% wins) and Ahrefs (16.25% wins). Majestic showed the lowest number of backlinks in this run and gained 14.73% wins.

Total backlinks in absolute numbers:

🥇 The winner in the referring backlinks category: Semrush

How positions in the comparison were distributed

Additionally, we have calculated how often each of the vendors occupied the 1st, 2nd, 3rd and 4th position in the comparison.

This is how the positions were distributed for referring domains:

And this is how the vendors scored for backlinks:

Important factors to keep in mind

All tests have limitations, and this one’s no exception. It’s quite challenging to run research on backlink databases and draw a totally unbiased conclusion.

Below, we’ve outlined a few important aspects that you should keep in mind when you read any backlink index study.

Different types of backlink indexes

Every backlink index is unique since every company uses a different approach to crawl, store and count backlink data. So, with any backlink index comparison, you have this big flaw of analyzing databases that are inherently not equal.

In our research, we’ve accessed the data via APIs that allegedly return the results from the following indexes:

  • Ahrefs Live index (all links that are currently live, a link is considered live until recrawled and found otherwise)
  • Semrush Fresh index (all links crawled in the last six months)
  • Majestic Fresh index (all links crawled in the last four months)
  • SEO PowerSuite backlink index includes all live links (similarly to Ahrefs, a link in our index is considered live until we recrawl it and see otherwise).

Unlike Ahrefs and SEO PowerSuite, Semrush and Majestic return both live and dead backlinks, and in most cases, it can make a huge difference in the backlink numbers.

For example, if you check the backlink profile of Amazon.com in Semrush, you’ll see that this domain has a total of over 18 billion backlinks (that’s what you get via the API as well). But if you switch to the Active links mode, you’ll see only seven million links.

Although the Ahrefs index includes links that are currently live, you can still spot some inconsistencies. The Ahrefs API returns 114 million backlinks for Ahrefs.com, which is close to the number we see in the Summary dashboard, but if we switch to the backlinks report, we’ll see just 35 million links.

There’s probably a good explanation behind this mismatch (low-quality or lost backlinks filtered out?), but it brings us to two important conclusions:

  1. We can’t be 100% certain about the type of backlink index used by APIs.
  2. We can’t eliminate this type of discrepancy from our research.

Different ways to count links

Yet another problem you’ll face when comparing the backlink indexes is that links are counted differently. For instance, you may calculate duplicate links on a page as a single link or as multiple links. The same’s true for redirects, canonicals, etc. All these things will affect the total calculation, and that’s why we have to take every backlink index comparison with a grain of salt.

Different ways to record IPs

Similarly, there’s a question of how backlink API providers record IP data. Most providers count only the most recent IP they’ve found for the backlink, which means they have one IP recorded per domain. But Semrush seems to record every IP they’ve detected in the last six months; thus, they may have a few dozen IPs recorded for the same domain.

The IP/subnet stats are included in the raw data file, and initially, we planned to use these numbers in the report. However, when we noticed this crucial difference in IP calculation, we considered that these stats could not be used to draw meaningful conclusions.

Different understanding of referring domains

A domain is a domain, right? Our analysis shows that things are a bit more complicated when we talk about backlink indexes.

According to Verisign, “the second quarter of 2021 closed with 367.3 million domains across all top-level domains.”

And while many data providers have a matching number of domains indexed (Ahrefs – 193.4 million referring domains, SEO PowerSuite – 264 million domains), Semrush claims to have a very dubious 1.6 billion referring domains.

Hence, what Semrush counts as a unique domain seems to be different from the widely accepted notion.

To sum it up, there’s probably no winner or loser in this type of comparison, due to all the mentioned factors and varying approaches.

But we believe that the more experiments we run, the more transparent backlink index providers will become about their data in the long run. And that means that enterprise and API users will better understand what they are paying for.

If you’re interested in testing our backlink API for your SEO services and apps (or running your own independent analysis), please drop us a note to api@link-assistant.com, and we’ll set you up with a free trial.

The post Top 4 backlink API vendors compared (1 million domain study) appeared first on Search Engine Land.


How expired landing pages kill your Google rankings

A lot of landing pages expire every day as outdated information becomes obsolete, products sell out, services are discontinued and entire communities sunset. How that expiring content is handled from an SEO perspective can greatly impact the organic search rankings of websites. If it is handled poorly, landing pages with expired content have the potential to kill the organic rankings of the website overall.

PageRank vs. user signals

A frequently mentioned argument made by website owners for maintaining landing pages with expired content, especially sold-out products, is to preserve incoming external PageRank to the website. It is a false assumption that a landing page must be kept indexable and returning a 200 OK status code, even when a product or service is no longer available to users, in order to keep whatever authority or PageRank that landing page has accumulated over time. Doing so effectively means creating a soft 404 landing page. A soft 404 is an error page with no relevant content that continues to return a 200 OK status code instead of a 404 or 410 status code.

For a number of reasons, that strategy is a recipe for disaster. Firstly, the conversion rate, rather than presumed PageRank accumulation, ought to be the primary goal of a commercial website. After all, no publisher cares about their PageRank value, high or low, as long as conversions meet or exceed expectations. Secondly, PageRank cannot be gauged with any degree of accuracy. PageRank changes continuously as Googlebot crawls the web, and Google does not disclose the actual value for individual landing pages or websites. No external third-party tool can substitute for that value in any meaningful way. Finally, product landing pages rarely attract lasting, high-quality, merit-based backlinks to begin with. Effectively, the perceived PageRank loss is debatable, while the actual PageRank loss is negligible.

Soft 404s are bad for user experience and therefore a thorn in the side of search engines, Google in particular. This is why maintaining expired content landing pages, especially unavailable product pages, considerably magnifies the risk of poor user signals. Google has become more adept in identifying negative on-page language and can accurately detect strings like “unavailable,” “out of stock,” “0 results found” or “sold out.” Frequently, yet not always, it will highlight the problem as soft 404 pages in Google Search Console. However, a major issue is that CTR is likely to suffer from snippet representation, highlighting information that services or products are unavailable to the user. Worse yet, if users are still compelled to click on results that turn out to be discontinued landing pages (also known as soft 404s), they are almost inevitably going to return to search results, look for an alternative and/or refine their query. Doing so, the users indicate with their click behavior that the individual user experience was bad for them. With this “bounce rate” growing, which is often mistaken for, yet unrelated to, the Google Analytics or on-site bounce rate, the relevance of the website as a whole suffers in the organic search rankings.

Negative wording in expired content leaks into snippets, adversely affecting CTR.

Although PageRank remains an important ranking factor, it pales in comparison with the weight of the user signals search engines collect for rankings. While emphatically denying the use of specific user signals, such as Google Analytics data or dwell time, Google continues to favor websites that are popular with users. When the two are weighed against each other, the PageRank argument does not stand a chance. On the one hand, PageRank remains elusive and is at best a means to an end. User signals, on the other hand, directly and immediately contribute to the success of a website, with and beyond SEO.

The trends game

Google rankings, to a large extent, depend on SEO signal trends. For a large website with many millions of relevant landing pages, a few thousand expired content landing pages are unlikely to trigger a ranking loss. Relatively speaking, they are too few to decidedly tip the trend of the website’s signals one way. For a smaller website comprising ten thousand landing pages in total, a few hundred expired indexable landing pages can already pose an SEO danger.

Negative wording from expired content leaks into snippets, adversely affecting CTR.

Ultimately, the decisive factor is trends measured in percentages, rather than the actual total numbers of indexable expired content or soft 404 landing pages. Which website ranks well and which one does not depends on a number of critical factors. These include, among other factors, the total volume of crawlable landing pages, their content quality, the overall trends involved and, most importantly, the user experience signals trends indicating user satisfaction.

Soft 404s are likely to impact both CTR and bounce rate, effectively dragging the website’s rankings down over time.

There are no fixed thresholds that must be observed. Instead, trends are front and center when SEO signals, and therefore organic search rankings, are to be improved. The question of how well a specific website fares in this regard can only be answered by analyzing the website’s specific data, especially its server logs. This is why commercial websites with a sizable and changing product database must regularly perform technical SEO audits. 

In-depth SEO audits are the only means of accurately gauging crawl budget management, or how long it may take for Google to re-crawl expired landing pages in order to register the changes applied. Only an SEO audit can help to identify whether expired content landing pages pose a problem and/or if it’s a serious one.

Trends are critical SEO indicators. Growing volumes of soft 404s are a potential risk.

Doing it the right way

Larger sections of a website that have outlived their usefulness but can’t be deleted, like sunset communities, can be moved off domain, thereby boosting the main website’s trend signal. In that instance, 301 Moved Permanently redirects must be established and maintained without an end date, or the URLs must return a 404 status code, so search engines know to discount the content.

Expired product landing pages, however, must not be 301 redirected to other landing pages, thereby meddling with user signals. Instead, when products or services are no longer available, respective landing pages must return either 404 Not Found or 410 Gone HTTP status codes. Doing so, these status codes will signify to Google and other search engines that the landing pages no longer provide what they used to and strengthen the user signals of the remaining, still available 200 OK landing pages that continue to offer products or services.

There is, however, a possibility to legitimately capitalize on 404 error pages without taking the unnecessary business risk of confusing search engines or diluting user signals. That is by enhancing 404 Not Found pages, which still return this correct status code and supplementing the content of the error page with relevant, in-context information for users. These so-called smart or custom 404 landing pages must continue to address the fact that their main purpose, product or service is unavailable. But, they can be augmented with relevant product alternatives and/or the results of an internal search based on keywords from the request URL, enabling users to continue on their journey within the website — and for the website operator to potentially still capitalize on the lead. Custom 404 pages are not an SEO growth method, but much rather a means for maintaining user satisfaction and improving conversions. When applied, they pose no SEO risk as long as the status code is still a 404.
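As a rough illustration of such a custom 404, here is a minimal Flask sketch. The catalog lookup and suggestion logic are placeholders for whatever product database or internal search a real site would use; the important part is that the page keeps its 404 status code while still offering relevant alternatives based on keywords pulled from the requested URL.

```python
from flask import Flask, request, render_template_string

app = Flask(__name__)

# Placeholder catalog; a real site would query its product database or search index.
CATALOG = {"red-running-shoes": "/products/red-running-shoes",
           "blue-running-shoes": "/products/blue-running-shoes"}

@app.errorhandler(404)
def custom_not_found(error):
    # Pull keywords from the requested path and look for live alternatives.
    keywords = [part for part in request.path.lower().split("/") if part]
    suggestions = [url for slug, url in CATALOG.items()
                   if any(keyword in slug for keyword in keywords)]
    body = render_template_string(
        "<h1>This product is no longer available.</h1>"
        "{% if suggestions %}<p>You might be looking for:</p><ul>"
        "{% for url in suggestions %}<li><a href='{{ url }}'>{{ url }}</a></li>{% endfor %}"
        "</ul>{% endif %}",
        suggestions=suggestions,
    )
    return body, 404          # keep the 404 status code so search engines discount the page
```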

Ultimately, whether expired content landing pages return 404, 410 or a custom 404 response, it is important not to block the URLs in the robots.txt. Doing so inhibits search engines from crawling and understanding the changes applied and can have an undesirable effect on user signals. 

At the same time, internal linking to expired content landing pages must be updated and consequently discontinued. Internal linking is among the foremost important on-page signals indicating to search engines both relevance and importance from a crawl priority point of view, hence there’s no point in boosting content landing pages that have expired. 

Lastly, it is important to always keep in mind that 404 Not Found landing pages, no matter how numerous, will not impact a website’s organic rankings. No website ranks poorer or better because of, or despite, its 404 Not Found pages. Soft 404 landing pages, however, can not only impact rankings but also have the potential to drag down the entire website in organic search.

The post How expired landing pages kill your Google rankings appeared first on Search Engine Land.
