
How to analyze your SEO competitors and find opportunities

Understanding your competitors can give you tactical insights to help you discover opportunities.

Before competing, athletes spend many hours studying their opponents, looking for weaknesses to exploit and gaps in the playing field.

Your digital marketing should come with the same level of insight and analysis.

Identifying and reviewing your competitors can help you come up with a strong list of potential keywords, find link building opportunities and build your persona profiles.

Starting your competitor analysis

When putting together a competitor analysis, it is important to make sure that your analysis tells you:

  • What a competitor is doing.
  • How they are doing it.
  • Which parts of their SEO strategy they excel at.
  • Where their gaps are.

Using this information, you can create strategies at scale that will help you outperform your competitors and earn higher rankings.

The first step is to assess the bigger picture of your competitors. You can start by using Google and typing in the main keywords to see what sites show up in the rankings.

Next, you can use tools to help you drill down into your competitor analysis process.


Identify your competitors

You can use Semrush to give you a starting point to identify your competitors.

In the Organic Research report, click on the Competitors tab.

From there, you can figure out who your main competitors are.

Some of the data points that can help you in this research are:

  • Competition Level: Based on the number of keywords each domain ranks for and the number of keywords the domains have in common. The more keywords the domains share, the more likely they are to compete with one another.
  • Common Keywords: The number of keywords the domains have in common.
  • SE Keywords: The total number of keywords each site is ranking for.
  • Traffic: Estimated organic traffic.
  • Cost: Traffic cost based on CPC data and estimated volume.
  • Paid Keywords: The number of keywords being bid on via Google Ads.

Having identified a list of competitors, the next step is to look at their backlink profile.

  • How many unique referring links do they have pointing to them?
  • What is the quality of those links?
  • At what velocity are they acquiring new links?

Understanding their link profiles will help you determine how many backlinks you need to acquire before you are capable of competing for specific keywords.

In Semrush, go to Backlink Analytics and type in your root domain. You can also add up to four competitors to see backlink metrics side by side.

The overview will give you an idea of how your backlink profile looks compared to others vying to rank for the same keywords.  

Pay close attention to the number of referring domains you and your competitors have. To drill down, click on Referring Domains at the top of the report and sort the list by Authority Score.  

If your site has many referring domains, all with low Authority Scores, it may not have enough power to rank against competitors with better-quality inbound links.
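
To run this comparison outside the tool, you can export the Referring Domains report to CSV and sort it yourself. Below is a minimal Python sketch of that step; the file name, the column names and the quality threshold of 40 are all assumptions you would adapt to your actual export.

    import csv

    # Hypothetical CSV export of a Referring Domains report; the file name
    # and column names ("domain", "authority_score") are assumptions.
    with open("referring_domains.csv", newline="") as f:
        rows = list(csv.DictReader(f))

    # Sort referring domains by Authority Score, strongest first.
    rows.sort(key=lambda r: int(r["authority_score"]), reverse=True)

    # Count domains that clear an assumed quality threshold of 40.
    strong = [r for r in rows if int(r["authority_score"]) >= 40]
    print(f"{len(strong)} of {len(rows)} referring domains score 40 or higher")
    for r in strong[:10]:
        print(r["domain"], r["authority_score"])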

The different reports in Semrush’s Backlink Analytics will show you the total number and types of backlinks and whether the majority are followed or nofollowed. Analyzing the anchor text will also help you figure out whether link building work has been done and whether those links include keywords in the anchor text.

Your backlink analysis data will help you in upcoming research, as it will help you select keywords that you can realistically rank for.

If, for example, your site has an Authority Score of 25, but your competitors average an Authority Score of 45, you are unlikely to outrank them for root terms. You would have to target keywords with less volume, where the ranking pages have a lower Authority Score.

Keyword data and content decisions should be closely intertwined with your backlink analysis as it will allow you to select “low-hanging fruit” keywords and realistic ranking targets.

Traffic analytics

The Semrush Traffic Analytics tool gives you traffic estimates for competitors. Whereas Google Analytics shows you how your own site performs, Traffic Analytics can give you estimated traffic data for your competitors.

At a glance, you can compare how your competitors match up when it comes to:

  • Visits
  • Unique visitors
  • Pages / visit
  • Average visit duration
  • Bounce rate

Additionally, you can see trends over time for each of these categories. You’ll see if competitors have recently lost traffic or have made dramatic progress during a specific time period.

There are myriad other data points and reports that can help you deepen your understanding of your competitors, such as top pages, traffic sources and traffic journeys.

Keyword research

Keyword research is one of the oldest and most undervalued skills in SEO. Choosing the right keywords can mean the difference between success and failure in SEO.

Keyword research is not sexy, but it is necessary, foundational work that needs to be done properly. Let’s go over a step-by-step process for keyword research.

Think in terms of “Keyword Sets”

Ideally, you should group your keywords into Keyword Sets. Start with a “seed keyword,” then look for long-tail variants of that keyword.

For example, if your primary keyword is “Real Estate Auction,” you could build a set around it from long-tail variants of that term.

Then you could move to another keyword set and gather variants for that term.

Examples of other seed sets include:

  • Online foreclosure auctions
  • Home auctions
  • Online property auctions

Once you have these in place, you can go deeper and deeper using secondary keywords, questions, and other variants.
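
To make the grouping step concrete, here is a minimal Python sketch that buckets a flat keyword list into keyword sets by checking each keyword against a seed term. The seed terms and keywords are made-up examples, and a real workflow would want fuzzier matching (stemming, synonyms) than this simple substring check.

    from collections import defaultdict

    # Made-up seed terms (singular, so plural variants also match).
    seeds = ["real estate auction", "home auction", "online property auction"]

    # A flat keyword list, as exported from a research tool (made-up examples).
    keywords = [
        "real estate auction sites",
        "how do home auctions work",
        "online property auctions uk",
        "real estate auction near me",
    ]

    # Bucket each keyword under the first seed term it contains.
    keyword_sets = defaultdict(list)
    for kw in keywords:
        for seed in seeds:
            if seed in kw:
                keyword_sets[seed].append(kw)
                break
        else:
            keyword_sets["unmatched"].append(kw)

    for seed, kws in keyword_sets.items():
        print(seed, "->", kws)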

Look for long-tail keywords

You can use Semrush’s Keyword Magic tool to obtain these seed sets. Type in the seed sets you’ve uncovered, and Semrush will give you a list of possible keywords.

You can group these using the tool’s match type filters.

As you find keywords that you would like to rank for and track, use the checkmark and add them to your “Keyword Manager” list.

Assess the competition level

Next, you need to assess the viability of ranking for the keywords you chose. Ranking for your seed keywords can be your end goal, but you have to select keywords that you can realistically rank for based on your site’s current Authority Score. If you choose keywords that are too competitive, you’ll never see positive results from your SEO efforts.

Most keyword research tools provide a Keyword Difficulty (KD) metric. You’ll want to identify keywords that have good volume but low KD. What counts as low varies based on your site’s age, structure, backlink profile and much more.
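
As a sketch of that screening step, the snippet below filters an exported keyword list for terms with decent volume and low difficulty. The file and column names are hypothetical, and the thresholds (volume of at least 500, KD of at most 40) are assumptions to tune to your own site.

    import csv

    MIN_VOLUME = 500  # assumed threshold; tune to your site's authority
    MAX_KD = 40       # assumed threshold

    with open("keywords.csv", newline="") as f:  # hypothetical export
        rows = list(csv.DictReader(f))

    # Keep "good volume, low KD" candidates, easiest wins first.
    candidates = [r for r in rows
                  if int(r["volume"]) >= MIN_VOLUME and float(r["kd"]) <= MAX_KD]
    candidates.sort(key=lambda r: float(r["kd"]))

    for r in candidates:
        print(r["keyword"], r["volume"], r["kd"])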

Find keyword gaps

You can expand your keyword list by doing a keyword gap analysis. There might be keywords you never thought about that competitors are ranking for.

To perform a keyword gap analysis, go to the Keyword Gap report, type your top competitors into the top bar, and click Compare.

The tool lists the keywords that you and your competitors are ranking for. You’ll see which keywords are shared among all competitors, which ones they are ranking for but your site isn’t, which keywords you rank for but not at the top, and more.

Review these keywords, and if any seem relevant, add them to your Keyword Manager list.
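
Conceptually, a keyword gap analysis is just set arithmetic over ranking keyword lists. This small Python sketch, using made-up lists, shows the two views worth reviewing: keywords every competitor shares and keywords competitors rank for that your site does not.

    # Made-up ranking keyword lists for illustration.
    yours = {"online property auctions", "home auctions"}
    competitor_a = {"home auctions", "real estate auction sites", "foreclosure auctions"}
    competitor_b = {"home auctions", "foreclosure auctions", "bank owned homes"}

    # Keywords every competitor ranks for: the core of the market.
    shared = competitor_a & competitor_b

    # Keywords any competitor ranks for that you do not: your gap.
    gap = (competitor_a | competitor_b) - yours

    print("Shared by competitors:", shared)
    print("Keyword gap:", gap)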

Compile your final keyword list

The final step is to collect all of your keyword data into the Keyword Manager. You can segment your lists or create one large list with all of your keywords.

By adding keywords from the Keyword Magic research, keyword gap analysis, competitor research and baseline assessment, you’ll have a comprehensive list of keywords to work with.

Make sure you click “Update Metrics” to get current, accurate data on the volume, competitiveness and Keyword Difficulty of each term on your list.

Using competitor data to craft your strategy

At this stage, you’ll have a comprehensive understanding of who you’re up against in the SERPs. You’ll know which competitors have the strongest link profiles and what keywords you can target, and you’ll have various data points about potential opportunities that your competitors have overlooked.

The data you have accumulated can help you build your final target keyword list and prepare you for the next stage, which includes understanding your target audience and building persona profiles.


DMCA request removes Moz from Google Search index

If you search for [Moz] in Google Search, you won’t see the moz.com home page; it was removed from the Google index due to a DMCA takedown request. The takedown complaint alleges that Moz’s home page, along with 185 other URLs, was used to “distribute modified, cracked and unauthorized versions” of the Dr. Driving app.

The takedown complaint. The Digital Millennium Copyright Act (DMCA) takedown complaint can be viewed here. You can see the Moz home page listed on line 122. As Cyrus Shepard posted on Twitter: “Crazy! You can’t access the Moz homepage from Google right now. A search for “Moz” shows an incredible 8(!) removed results from an overly-broad DMCA filing. DMCA literally lets anybody abuse the system, and it breaks Google.”

Google is aware. Danny Sullivan, the Google Search Liaison, responded, saying, “I’ve passed it on for review.” We suspect Google will reverse the removal quickly – but so far, Moz is still not showing.

The Google search results. Here is a screenshot of the search results page showing the Moz blog coming up in the first position, not the Moz home page.

The footer of the Google results shows that Google “removed 8 result(s) from this page” due to DMCA violations.

Should not happen but it does. You are probably all thinking that this should not happen – how can Moz not rank for its own name in Google Search? How can it be that easy for someone to use a DMCA request to remove a large, respected brand from the Google Search results? And you are right, this should not happen – but it does.

We had our own site, Search Engine Land, mistakenly removed from Google because Google thought the site was hacked – it was not. Digg was also removed from Google Search because someone accidentally classified it as spam.

Mistakes happen, even at massive companies. But how? We don’t know yet. We have reached out to Google for a statement and will update this story if we hear back.

More on DMCA requests and Google Search. Google has its transparency report that says “It is our policy to respond to clear and specific notices of alleged copyright infringement. The form of notice we specify in our web form is consistent with the Digital Millennium Copyright Act (DMCA) and provides a simple and efficient mechanism for copyright owners from countries/regions around the world. To initiate the process to delist content from Search results, a copyright owner who believes a URL points to infringing content sends us a takedown notice for that allegedly infringing material. When we receive a valid takedown notice, our teams carefully review it for completeness and check for other problems. If the notice is complete and we find no other issues, we delist the URL from Search results.”

You can dispute these requests and have them reversed, but how long does that take? You can submit DMCA requests to Google here.

Why we care. This is a nightmare for most SEOs and site owners: being removed from Google Search for your own branded term. It should not happen, and it is inexcusable and sad to see, but it did happen.

I am sure Moz will return shortly, but there is really nothing we can say on how to prevent this from happening to your site. The good news: Moz is a big enough brand that this caught Google’s radar quickly and will likely be fixed soon because of that. But for small brands – good luck.

Postscript. Moz is now back, less than 12 hours after this issue was first reported here.


Axios news SEO playbook: Speed, authority and brevity

If you’ve ever read Axios, you’ll remember it. At least that’s what Ryan Kellett, VP of Audience at Axios, hopes.

A news article on Axios has a distinctive look. It’s all in the name of Smart Brevity. 

Smart Brevity: “Axios gets you smarter, faster on what matters.” That’s their mission. And they have five excellent guiding principles:

  • Audience first
  • Elegant efficiency
  • Smart, always
  • No BS for sale
  • Excellence, always

What about SEO? Axios had 24.8 million visitors in March, with 16% of traffic coming from organic search, according to Semrush’s Website Traffic Checker. (For comparative purposes, the New York Times is the largest news site, with more than a billion visitors in March – but Axios doesn’t cover nearly as many topics as the Times, CNN or other large news publishers.)

News is incredibly competitive. And Smart Brevity seems to go against what many consider to be SEO best practices. Namely, longer and more is better.

So how does Axios make SEO and Smart Brevity work together?

Go deeper. Here are highlights from my recent Q&A with Kellett. It has been edited for smartness and brevity.

SEO is an entry point. For Axios, it’s audience first, always. Kellett said Axios delivers trustworthy breaking news and insights in the smartest and most efficient way possible. So what role does SEO play at Axios? Here’s what Kellett said:

  • “The goal with our SEO discipline is to introduce readers to our fact-based coverage and Smart Brevity in a way that earns their time, attention and trust. You’ll notice Axios is distinctive not just in the quality of what we produce but in the actual look of the written article. Have you seen our bullet points and bold before? It’s memorable.”
  • “If we can convince a search reader to identify Smart Brevity wherever they next see Axios (say, in a social feed, Apple News or a forwarded email from a friend), chances are good they will eventually convert to being a subscriber to one of our many amazing newsletters.”
  • “Sure, we’d love it if they signed up for a newsletter on the spot after coming over from search, but search mostly is a top-of-funnel entry point for readers who may be coming to us for the first or second time.”

One exception: Axios Pro, which is a specialized news subscription. Pro readers coming from search may be looking for a specific company or individual that Axios is covering, Kellett said. 

Smart Brevity is core to Axios. It’s hard to stand out in a crowded news space. Making content longer would go against everything Axios is trying to do. Kellett believes shorter is better. And that most news articles across the web could benefit from being shorter and more readable:

  • “Everyone focuses on the brevity part. But it’s brevity in combination with making the reader smarter that drives everything we do.”
  • “We most certainly get dinged for this in search rankings, but we have to be comfortable knowing that the reader will appreciate Smart Brevity when they encounter us, recognize it in the wild, and eventually seek it out from us. We obviously can’t abide if our stories don’t index or rank at all, so we look to avoid that. But otherwise, we have broad shoulders and navigate the best we can.”

Why Axioms matter. As Kellett noted, Axios articles have a distinctive look. And this look has a name internally at Axios: axioms. The style can be traced back to Axios co-founder Mike Allen’s flagship newsletter, Axios AM. Kellett said axioms have debatable SEO value – but undeniable value for their audience:

  • “How many articles have you read where the most important point and the whole reason you should care is buried in paragraph seven or ten? We use the ‘Why it matters’ axiom so you, the reader, can quickly identify why the story is relevant in the first place.”
  • “The SEO part of axioms may be quite subtle. I could make the argument we’re giving a consistent pattern for search engines to identify and parse our content. But I’m unsure how much it practically helps us rank at the moment. If you have creative ideas on how to extend axioms for search, my DMs on Twitter are open until Elon Musk shuts me down.”

“Start from a good place and optimize from there.” Axios’ homegrown CMS defaults help nudge reporters to write good URLs and SEO-friendly headlines. Axios doesn’t have a separate search team outside of its larger audience team, so it relies on its newsdesk, copy editing and audience teams every day, Kellett said.

Headlines are one area Axios regularly tries to improve to enhance performance. Their teams do this manually, by looking at performance and going into the CMS. Axios does not use any kind of automated testing of headlines. 

5 Axios SEO best practices

1. SEO education. Kellett said this is ongoing and an important part of keeping up with changes to SEO.

  • “I am always on the lookout for myths and bad habits to correct. And Director of Audience Neal Rothschild writes a weekly email (produced using our own Axios HQ software) to the newsroom that reviews examples and changes in best practices, which is incredibly helpful to get the word out internally quickly.”

2. Authority and depth. Axios is not CNN.com. They don’t cover every single topic under the sun. Kellett said Axios focuses on reinforcing its areas of authority:

  • “Readers may see some of the breaking news coverage we publish, but we also have amazingly deep coverage on narrow topics like space, sports betting, electric vehicles, China, privacy, immigration, to name a few. All of these are forward-looking areas that will only be more important to the country as time goes on.”
  • “And that depth can come into play down the road. As an example, Axios had more authority than you’d think when the war in Ukraine started because we had covered Volodymyr Zelensky, including this amazing interview we did with him for Axios on HBO in 2021. Using that expertise furthers both pure journalistic and SEO goals.”

3. Explainers.

  • “I’ve loved working on our explainers, which really help us step back and give regular readers a chance to access a storyline with Smart Brevity. Axios Explains Ukraine is a great example of where we are headed with this.”

4. Speed. For breaking news, Axios is fast. Really fast. 

  • “Being among the first URLs on the web on a big story helps us rank as news develops. We’ll often get beat when the big publishers come knocking but generally staying fast, pointing to our expertise through internal links, and peeling off really great story angles gives us a fighting chance.”

5. Pillar and evergreen content. Axios publishes more than just short articles.

  • “We have a set of Deep Dives that cover topics in-depth and from a bunch of different angles. Those stories have potential to be great pillar content and evergreen too with some continued editorial and technical SEO work.”

Breaking Axios SEO news: Axios has just hired their first Director of SEO: Priyanka Vora, formerly Quartz’s Audience Editor. She will help build out Axios’ SEO practice further.

  • “I would also keep an eye on the Axios job listings page as the company is in growth mode across a number of areas, both in and around our newsroom.”

Axios’ SEO tool of choice. “We’re a Semrush shop for now,” Kellett said. Though he is personally partial to exporting GSC data to Google Sheets.

Most important SEO KPIs or metrics? Raw referrals matter to Kellett most. Also, the percentage of overall referred traffic from search.

“Though, of course, I will also celebrate anyone in our newsroom who sends me a screenshot of when Axios is in Top Stories carousel,” Kellett said.


4 content marketing steps that will help you rank higher on Google

A hub and spoke content marketing strategy can help you boost keywords rankings, increase website traffic and enhance downstream metrics like conversions, leads and sales.

Join Conductor in a live webinar and learn the basics of a hub and spoke model. You’ll also take away a four-step process you can use to rank higher.

Register today for “4 Steps to Ranking Higher on Google with Hub and Spoke Content Marketing” presented by Conductor.


Bingbot user-agent change coming in the Fall of 2022

In 2019, Microsoft Bing announced new Bingbot user-agent names that fit better with its evergreen Bingbot crawling and rendering service. Microsoft’s Fabrice Canel has now said that by the fall of 2022, the old user-agent will stop being used and the search company will fully transition to the new user-agents.

Old user-agent. Microsoft said it will stop using its historical user-agent by Fall 2022. That user-agent looks like this:
Mozilla/5.0 (compatible; bingbot/2.0; +http://www.bing.com/bingbot.htm)

New user-agent. Bing will use a user-agent that identifies the specific version of Microsoft Edge that is crawling your site. Here is the format for both desktop and mobile:

Desktop – Mozilla/5.0 AppleWebKit/537.36 (KHTML, like Gecko; compatible; bingbot/2.0; +http://www.bing.com/bingbot.htm) Chrome/W.X.Y.Z Safari/537.36

Mobile – Mozilla/5.0 (Linux; Android 6.0.1; Nexus 5X Build/MMB29P) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/W.X.Y.Z Mobile Safari/537.36 (compatible; bingbot/2.0; +http://www.bing.com/bingbot.htm)

“W.X.Y.Z” will be substituted with the latest Microsoft Edge version Bing is using, e.g., “100.0.4896.127”.

Do we need to worry? Most sites do not need to worry. Microsoft said, “For most web sites, there is nothing to worry as we will carefully test the sites to dynamically render fine before switching them to Microsoft Edge and our new user-agent.” But if you have hardcoded any user-agent strings into your scripts, you will need to revise those scripts to ensure Bingbot can continue to crawl your site.
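
If a script matches the full historical string, it will miss the new Edge-based variants. A more resilient approach, sketched below in Python, is to match the stable bingbot/2.0 token that appears in both the old and new user-agents. (Real bot verification should also confirm the requesting IP belongs to Bing, which this sketch omits.)

    import re

    # Both the old and new Bingbot user-agents contain the "bingbot/2.0"
    # token, so match that instead of hardcoding the full string.
    BINGBOT_RE = re.compile(r"bingbot/2\.0", re.IGNORECASE)

    def is_bingbot(user_agent: str) -> bool:
        return bool(BINGBOT_RE.search(user_agent))

    # The historical user-agent still matches...
    assert is_bingbot(
        "Mozilla/5.0 (compatible; bingbot/2.0; +http://www.bing.com/bingbot.htm)"
    )
    # ...and so does the new Edge-based desktop format.
    assert is_bingbot(
        "Mozilla/5.0 AppleWebKit/537.36 (KHTML, like Gecko; compatible; "
        "bingbot/2.0; +http://www.bing.com/bingbot.htm) "
        "Chrome/100.0.4896.127 Safari/537.36"
    )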

How to test. Bing previously said you can test it by installing the new Microsoft Edge browser “to check if your site looks fine with it.” Bing said “if it does then you will not be affected by the change.” “You can also register your site on Bing Webmaster Tools to get insights about your site, to be notified if we detect issues and to investigate your site using our upcoming tools based on our new rendering engine,” Bing added.

Bing added that “we will carefully test websites before switching them to our new user-agent.” Bing Webmaster Tools URL Inspection has already started using the new desktop user-agent for the Live URL Test to help you investigate potential issues.

Google also. Google is also migrating to a new Googlebot user-agent this month. Google is currently testing the new user-agents, so you may be able to see them in your log files. I do not believe the change is fully rolled out yet for Google.

Why we care. You probably should have been prepared for this change, since it was announced back in 2019. But in any event, this change can impact your site if you have any user-agent detection methods for Bingbot. Make sure to test your site to see if it supports the new user-agent. Most sites probably do not need to worry about this, but if you have done any advanced bot detection, you may need to update those scripts.


SERP feature trends every SEO must know

Every time you type a question into Google, the results page can look completely different. Think videos, images, ads, graphs, and related questions.

Today there are more than 40 different interactive elements or SERP features that can appear. These responsive results offer an improved user experience, but pose a real challenge to search engine optimization (SEO).

So, how can you know which SERP features should be at the forefront of your strategic planning? In this report, Similarweb analyzed the most popular SERP features across various industries. It covers:

  • Best practices to help you rank for key SERP features
  • Important factors that influence search behavior
  • Varying trends and growth rates of SERP features
  • Which SERP features are most prominent by industry
  • How branded and non-branded searches impact the SERP

Read it now to find out which strategies are best to leverage and how.


Google Search Console’s URL parameter tool is officially not working

Google has now turned off support for the URL parameter tool within Google Search Console. Google notified us about a month ago that this would be happening, and it has – the URL parameter tool no longer functions.

What happened. If you try to access the URL parameter tool for a specific Search Console profile, Google will tell you “this report is no longer available here” and show a warning icon alongside the message.

What is the URL parameter tool. The URL parameter tool launched in 2009 as a parameter handling tool, a way to tell Google to ignore specific URL parameters or combinations of parameters. Two years later, in 2011, Google upgraded the tool to handle many more parameter scenarios.

The tool essentially let you block Google from indexing URLs on your site.

You can currently still access the tool here, but when you try to use it, that error will show up.

Why is it going away. Google said it has become “much better at guessing which parameters are useful on a site and which are —plainly put— useless.” Google added that “only about 1% of the parameter configurations currently specified in the URL Parameters tool are useful for crawling.” “Due to the low value of the tool both for Google and Search Console users, we’re deprecating the URL Parameters tool in 1 month,” Google said.

What do I do going forward. Google said there is nothing specific to do: “going forward you don’t need to do anything to specify the function of URL parameters on your site, Google’s crawlers will learn how to deal with URL parameters automatically.” You can always use robots.txt rules, Google said, “or use hreflang to specify language variations of content.” Plus, Google said, your CMS and platforms handle building quality URLs these days.
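
For example, if you previously used the tool to keep a tracking parameter out of the crawl, a robots.txt rule can express the same intent. This is a minimal sketch: the sessionid parameter name is hypothetical, and the * wildcard shown here is supported by Google’s robots.txt parsing.

    User-agent: Googlebot
    # Block any URL whose query string contains a hypothetical
    # "sessionid" tracking parameter.
    Disallow: /*?*sessionid=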

The old rules will no longer function or be considered going forward.

Why we care. If you previously used the URL parameter tool, now that the tool is no longer being used by Google, you will want to annotate your reports to document the change. You probably should keep a close eye on your analytics and Search Console reports to see changes, if any, in crawling, indexing and ranking that may be related to this change. This might be a gradual impact, so keep an eye on issues over the next several days to several weeks.


Beware of fake DMCA link requests by AI-generated lawyers

Have you recently received a DMCA copyright infringement notice through email from a person claiming to be a lawyer? Well, that email might be a scam, and the lawyer who emailed you might not be a real person, but rather an AI-generated persona for a lawyer at a fake law firm. That is what The Next Web uncovered in a recent report about one such DMCA request.

What is a DMCA request. A DMCA request is a request to remove content or a web page due to copyright violations. DMCA stands for Digital Millennium Copyright Act, and it is used to have hosting companies, Google and website owners remove content that infringes on copyright.

What is the scam. In this case, the fake, machine-generated lawyer is emailing sites claiming DMCA copyright infringement, but instead of asking the site to remove the content, they are asking for a link. The email starts off threatening, as most legal notices do, but it ends by saying “our client is happy for their image to be used and shared across the internet. However, proper image credit is due for the past or ongoing usage.” The proper image credit should be given with “a link to” the site “within 7 days.” “Otherwise, we are required to take legal action,” the email continues.

In short, the scam is to threaten copyright legal action for a link to a site.

Here is a copy of the email:

Fake lawyers. It gets even creepier. As the reporter dug into this issue, they investigated who Arthur Davidson Legal Services was. The law firm’s site looked legit, but the domain name was only registered this year, while the site claims the firm has been around for many years. The reporter then dug into the profile of Nicole Palmer and learned that she never existed – she was made up by a generative adversarial network, a deep learning model that can be trained to create faces, art, or anything else. This is her photo; notice how the earrings and other aspects don’t exactly line up.

It is pretty wild how far scammers will go to manipulate the Google search rankings.

Why we care. Beware of such legal threats: do your research to ensure the firm exists, the lawyer who emailed you is real and the request is not a scam. I can see many folks just reading the email, quickly adding the link attribution credit and emailing back to say it was done – without asking for more details or verifying that the claim is a real issue.

Online scams are only going to get more sophisticated and look more real with AI and machine learning at scammers’ disposal. So we all need to get more sophisticated about questioning everything we see, every email we receive and every request made of us.


4 technical SEO tasks that are critical to organic success

“Driving revenue and awareness from search relies on your website health — today’s success in organic search is about a lot more than just keywords and content,” said Shachar Radin Shomrat, CMO of Deepcrawl, in her presentation at The MarTech Conference. “It demands technically sound websites in today’s marketing landscape.”

“Over a longer timeframe, the technical aspects of your website play a huge role in how well your content performs,” she added.

If SEOs don’t take the time to optimize the critical technical aspects of a site that influence page speed, indexing and more, it could mean lost traffic and revenue.

Here are four key technical tasks SEOs should take care of to help increase organic visibility.

1. Optimize site architecture

“Architecture is your foundational stage,” Shomrat said. “If the overall website structure is not optimized for search performance, then any individual page on that site is not set up to have its best chance at being crawled, appearing with search results and ultimately converting into a revenue-driving asset.”

No two sites are the same, but most search professionals agree site architectures should generally have a logical flow with a hierarchy of pages. This helps users and crawlers make sense of your site.

2. Ensure pages are crawlable

Your pages should be accessible to search engines and users via valid status codes. SEOs should make sure the pages they want to be included in the index have a 200 HTTP status code.

Search marketers should also ensure their robots.txt files aren’t blocking pages they want to be indexed. A misplaced disallow directive could prevent crawlers from viewing your pages at all.
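
As a quick spot check of both points, the Python sketch below fetches a page and confirms it returns a 200 status and is not disallowed by robots.txt for Googlebot. It assumes the third-party requests library is installed, and example.com stands in for your own site.

    import urllib.robotparser
    import requests  # third-party; pip install requests

    url = "https://example.com/important-page"  # stand-in URL

    # 1. The page should return a 200 HTTP status code.
    resp = requests.get(url, timeout=10)
    print("Status code:", resp.status_code)

    # 2. robots.txt should not disallow the page for Googlebot.
    rp = urllib.robotparser.RobotFileParser("https://example.com/robots.txt")
    rp.read()
    print("Crawlable by Googlebot:", rp.can_fetch("Googlebot", url))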



3. Check which pages allow indexing

Just because your pages are crawlable doesn’t mean they’re indexable. Marketers need to make sure their pages’ robots tags allow for indexing.

“If a page is not indexed by search engines, then it will not appear to users in their search results at all,” Shomrat said.
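
A simple way to audit this at small scale, sketched below in Python, is to fetch each page and look for a noindex directive in either the robots meta tag or the X-Robots-Tag header. The URL is a placeholder, and the regex is deliberately rough (it assumes the name attribute appears before the content attribute).

    import re
    import requests  # third-party; pip install requests

    url = "https://example.com/important-page"  # placeholder URL
    resp = requests.get(url, timeout=10)

    # Indexing can be blocked via a meta robots tag in the HTML...
    meta_noindex = re.search(
        r'<meta[^>]+name=["\']robots["\'][^>]*noindex',
        resp.text,
        re.IGNORECASE,
    )
    # ...or via an X-Robots-Tag HTTP response header.
    header_noindex = "noindex" in resp.headers.get("X-Robots-Tag", "").lower()

    print("Blocked by meta robots:", bool(meta_noindex))
    print("Blocked by X-Robots-Tag:", header_noindex)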

4. Improve page experience

People are less likely to convert on sites that offer poor experiences, such as slow-loading pages. Google and other search engines encourage site owners to optimize their technical structures to prevent this from happening, allowing their content to shine in the search results.

“If you do not fix your [technical] foundation, your content and keyword investments are not going to get the return that you expect,” Shomrat said.

The necessity of technical SEO

Paying attention to the technical quality of your site is vital to SEO success. Yet getting company team members and executives on board can be a tricky task.

“Creating a strong technical foundation for your website to rank well in Google and to provide an excellent experience to your site users involves a lot of moving parts,” Shomrat said.

“It expands beyond the boundaries of most marketing teams,” she added.

She encourages marketers to prove the worth of technical SEO to their teams by establishing clear goals and showing how their efforts are meeting them. She also recommends providing colleagues and executives with studies and other educational resources to show the full impact of technically sound sites for organic search.

“Get the support you need from leadership by promoting organic and website health KPIs as business-wide OKRs [objectives and key results],” she said. “There is great data out there about the impact website health and technical SEO can have as a comparatively low-cost customer acquisition channel.”


Google SpamBrain: AI-based spam prevention system launched in 2018

SpamBrain. That is the name of Google’s AI-based spam prevention system, which the search company launched in 2018 – yes, several years ago.

SpamBrain is credited by Google with catching about six times more spam sites in 2021 than it did in 2020, reducing hacked spam by 70% and reducing gibberish spam on hosted platforms by 75%.

SpamBrain. This is the first time we are hearing the name SpamBrain, which Google said launched in 2018. Google referenced its 2018 spam report, specifically the spam trends section, where it talks about using “machine learning systems” to improve search spam detection.

Google confirmed that this is the first time they are talking about this name, SpamBrain, publicly. A Google spokesperson told us: “SpamBrain refers to our machine-learning/AI solution for fighting spam. Since we first started using it, we’ve been updating SpamBrain constantly, and it’s grown much more effective at detecting and nullifying both existing and new spam types.”

Google also said that SpamBrain “was built to be a robust and evolving platform to address all types of abuse.”

Spam improvements. Google said that in 2021 it made additional strides in detecting and thwarting search spam attempts. These highlights include:

  • ~6 times higher identification of spam sites
  • 70% reduction in hacked spam sites
  • 75% reduction in gibberish spam on hosted platforms
  • 99% “spam free” searches

More on spam. Google talked a bit about its spam-fighting efforts, saying that links are still important for ranking and that its link spam algorithm in 2021 helped “broadly identify unnatural links and prevent them from affecting search quality.” Google also released a two-part spam update in June 2021 and then a November 2021 spam update.

Why we care. You can check out the previous Google web spam reports from 2020, 2019, 2018, 2017, 2016, and 2015 if you want to follow the progress.

In short, Google will continue to fight spam in order to try to keep its search results useful and of higher quality. While some sites may still get away with some spam tactics, Google is constantly launching new methods to detect and block those sites from ranking highly in Google search.

Long-term success in Google search should be about creating a spam-free site that can stand the test of time. Build quality, and build something you are proud of. Hopefully, your site will rank through all future spam and quality updates.
