Category: Google: SEO


We’re turning off AMP pages at Search Engine Land

“Gasp! Think of the traffic!” 

That’s a pretty accurate account of the more than two dozen conversations we’ve had about Search Engine Land’s support of Google’s Accelerated Mobile Pages in the past few years. At first, it was about the headache in managing the separate codebase AMP requires as well as the havoc AMP wreaks on analytics when a nice chunk of your audience’s time is spent on an external server not connected to your own site. But, Google’s decision to no longer require AMP for inclusion in the Top Stories carousels gave us a new reason to question the wisdom of supporting AMP. 

So, this Friday, we’re turning it off.

How we got here

Even when Google was sending big traffic to AMP articles that ranked in Top Stories, the tradeoff had its kinks. For a small publisher with limited resources, the development work is considerable. And not being able to fully understand how users migrated between AMP and non-AMP pages meant our picture of returning and highly engaged visitors was flawed. 

But, this August we saw a significant drop in traffic to AMP pages, suggesting that the inclusion of non-AMP pages from competing sources in Top Stories was taking a toll.

Our own analytics showed that between July and August we saw a 34% drop in AMP traffic, setting a new baseline of traffic that was consistent month-to-month through the fall.

Monthly AMP Page traffic to Search Engine Land from April 2021 to October 2021.

This week we also learned that Twitter stopped referring mobile users to AMP versions, which zeroed out our third-largest referrer to AMP pages behind Google and LinkedIn. We’ve seen LinkedIn referrals fall as well, suggesting that when November ends, we’ll be faced with another, lower baseline of traffic to AMP pages.

Publishers have been reluctant to remove AMP because of the unknown effect it may have on traffic. But what our data seemed to tell us was that there was just as much risk on the other side. We could keep AMP pages, which we know deliver a good experience by Google’s standards, and their visibility would fall anyway due to competition in Top Stories and waning support from social media platforms.

Read next: Google throttled non-AMP page speeds, created format to hamper header bidding, antitrust complaint claims

We know what a road to oblivion looks like, and our data suggests AMP visibility is on that path. Rather than ride that to nowhere, we decided to turn off AMP and take back control of our data.

How we are doing it

“If you are ready and you have good performance of your mobile pages, I think you should start testing.” That’s what Conde Nast Global VP of Audience Development Strategy John Shehata told attendees at SMX Next this month when asked about removing AMP.

Shehata suggested a metered strategy that starts with removing AMP on articles after seven days and then moves on to removing AMP on larger topical collections.

“If all goes well, then go for the whole site,” he said. “I think it’s gonna be better in the long run.”

That, of course, hinges on the speed and experience of your native mobile pages, he said.

The Washington Post, which is still listed as an AMP success story on the AMP Project site, actually turned off AMP a while back, said Shani George, VP of Communications at the Post.

“Creating a reading experience centered around speed and quality has long been a top priority for us,” she added, pointing us to an extensive write-up its engineering team published this summer about its work on Core Web Vitals.

Because we are a smaller, niche publisher, our plan is to conserve our resources and turn off AMP for the entire site at once. Our core content management system is WordPress, and AMP is currently set for posts only, not pages. But that includes the bulk of our content by far.

Our plan is to use 302 redirects initially. This way we’re telling Google these are temporary, and there won’t be any PageRank issues if we turn them off (or replace them with 301s). We’ll then see how our pages are performing without AMP. If there’s no measurable difference, we’ll then replace those 302 redirects with permanent 301 redirects. The 301s should send any PageRank gained from the AMP URLs to their non-AMP counterparts.
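The redirect logic itself is simple enough to sketch. This is a minimal, hypothetical illustration (it assumes AMP URLs live at the canonical path plus a trailing /amp/ segment, which may not match your permalink setup), not our actual implementation:

```python
# Sketch of the 302-then-301 plan described above. Paths are illustrative.

AMP_SUFFIX = "/amp/"

def amp_redirect(path, permanent=False):
    """Return (status, location) for an AMP path, or None if not AMP.

    Start with permanent=False (302, temporary and reversible). Flip to
    True (301) once traffic looks stable without AMP, so any PageRank
    gained by the AMP URL passes to its non-AMP counterpart.
    """
    if not path.endswith(AMP_SUFFIX):
        return None  # not an AMP URL; serve the page normally
    canonical = path[: -len(AMP_SUFFIX)] + "/"
    return (301 if permanent else 302, canonical)

print(amp_redirect("/2021/11/some-post/amp/"))        # (302, '/2021/11/some-post/')
print(amp_redirect("/2021/11/some-post/amp/", True))  # (301, '/2021/11/some-post/')
```

If the worst-case scenario hits, dropping the redirect is a one-line change back to serving the AMP page, which is the whole point of starting with 302s.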

Of course, if the worst-case scenario happens and traffic drops beyond what we can stomach, we’ll turn off the 302 redirects and plan a different course for AMP.

It’s a risk for sure. Though we have done a considerable amount of work to improve our CWV scores, we still struggle to put up high scores by Google’s standards. That work will continue, though. Perhaps the best solace we have at this point is many SEOs we’ve spoken to are having trouble seeing measurable impacts for work on CWV since the Page Experience Update rolled out.

Maybe it’s not about traffic for us

The relationship between publishers and platforms is dysfunctional at best. The newsstands of old are today’s “news feeds” and publishers have been blindsided again and again when platforms change the rules. We probably knew that letting a search platform host our content on its own servers was doomed to implode, but audience is our lifeblood, so can you blame us for buying in?

We also know that tying our fates to third party platforms can be as risky as not participating in them at all. But when it comes to supporting AMP on Search Engine Land, we’re going to pass. We just want our content back.

The post We’re turning off AMP pages at Search Engine Land appeared first on Search Engine Land.

Jason November 19, 2021

Google November 2021 Core Update rolling out today

Google is rolling out a new broad core update today named the November 2021 Core Update. This is the third core update Google released in 2021.

The announcement. Google announced this rollout on the Google Search Central Twitter account, not the Search Liaison account, which had made all previous core update announcements.

Rollout started at about 11am ET. Google updated us that the rollout began at about 11am ET, saying, “The November 2021 Core Update is now rolling out live. As is typical with these updates, it will typically take about one to two weeks to fully roll out.”

Timing before holidays. It is a bit shocking to see Google roll out this update before, and likely during (assuming this is a normal two-week rollout), the biggest online holiday shopping season. Black Friday and Cyber Monday are less than two weeks away, and Google is rolling out this update starting today.

Previously, Google took breaks from updates before the holiday shopping season; former Googler Matt Cutts called it Google’s gift to webmasters.

Danny Sullivan of Google responded to the timing of this update on Twitter:

Previous core updates. The most recent core update was the July 2021 core update, and before that the June 2021 core update, which was slow to roll out but a big one. Before those came the December 2020 core update, which was very big, bigger than the May 2020 core update; that one was also big and broad and took a couple of weeks to fully roll out. Earlier still was the January 2020 core update (we had some analysis on that update over here), preceded by the September 2019 core update. That update felt weaker to many SEOs and webmasters, as many said it didn’t have as big an impact as previous core updates. Google also released an update that November, but that one was specific to local rankings. You can read more about past Google updates over here.

What to do if you are hit. Google has given advice on what to consider if you are negatively impacted by a core update in the past. There aren’t specific actions to take to recover, and in fact, a negative rankings impact may not signal anything is wrong with your pages. However, Google has offered a list of questions to consider if your site is hit by a core update. Google did say you can see a bit of a recovery between core updates but the biggest change you would see would be after another core update.

Why we care. Whenever Google updates its search ranking algorithms, it means that your site can do better or worse in the search results. Knowing when Google makes these updates gives us something to point to in order to understand if it was something you changed on your website or something Google changed with its ranking algorithm. Today, we know Google is releasing a core ranking update, so keep an eye on your analytics and rankings over the next couple of weeks.

The post Google November 2021 Core Update rolling out today appeared first on Search Engine Land.

Jason November 17, 2021

Lighthouse 9.0 includes API changes, user flows, updated reports and more

Google has released version 9.0 of Lighthouse, the website auditing tool for developers, SEOs and site owners. With this release, Google has updated the API, added user flows, refreshed some reporting, added new accessibility elements, and more.

9.0 is available. You can access Lighthouse 9.0 using the command line, in Chrome Canary, and in PageSpeed Insights, which began rolling out this week. Google said it will also be part of the stable release of Chrome 98.

Here is what is new. Google has made some changes to the Lighthouse API saying if you use the Lighthouse report JSON, there “may be some breaking changes in 9.0 that you need to be aware of.” Those technical changes are listed here.

User flows: Lighthouse 9.0 supports a new user-flow API that allows lab testing at any point within a page’s lifespan. Puppeteer can be used to script page loads and trigger synthetic user interactions, and Lighthouse can be invoked in multiple ways to capture key insights during those interactions, the company said. “This means that performance can be measured during page load and during interactions with the page,” Google explained.

Reports: Google has refreshed some of the reports within Lighthouse. The refreshed reports aim “to improve readability and make the source of the report and how it was run clearer,” Google explained. A final screenshot has been embedded at the top of the report to make it obvious at a glance if the page being tested loaded correctly and is in the format expected. Plus, the summary information at the bottom of the report has also been redesigned to better communicate how Lighthouse was run and the report collected.

Accessibility: In Lighthouse 9.0, when an accessibility audit flags a duplicated ID, all the elements sharing that ID are now listed, rather than just the first one found.

Why we care. Google recently updated the PageSpeed Insights report, and a lot of the features it uses are based on Lighthouse. Using these tools can help you make your website faster, more usable and more accessible. In terms of SEO or rankings, the changes might not make a huge difference, but this is more about your users and making them happy.

The post Lighthouse 9.0 includes API changes, user flows, updated reports and more appeared first on Search Engine Land.

Jason November 17, 2021

Google adds local news features to search, giving publishers more exposure

Google is showing local news Tweets in certain queries now.

Google on Tuesday released new search features intended to give more visibility to local news content when searchers are looking for information about their communities.

The company said it has expanded a feature previously rolled out for COVID searches that adds a carousel of local news stories when relevant to a searcher’s query. For example, a search on “football” may bring up stories on local sports.

Why we care. Local news organizations depend on organic search traffic and, according to Google, queries like “news near me” have tripled in the past five years. Unfortunately, that also comes as local news companies continue to struggle for survival. According to The Poynter Institute, a journalism advocacy group, more than 90 local news organizations closed during the pandemic. That’s on top of declines that have stretched back years. In 2019, the New York Times reported that 1 in 15 U.S. newspapers had closed in the past 15 years.

So, it’s certainly good that Google is finding ways to give the surviving local news ecosystem more visibility.

Read next: Google News app will display non-AMP content and send readers to publisher pages

Not just news sources. In addition to adding the local news carousel, Google also is adding a “Popular on Twitter” carousel for queries on local news topics. While local news organization tweets may be included, the feature will also pull in tweets from a range of sources it deems to have local authority on those topics.

Remaining questions. The new features are noble and timely, especially as COVID created closer bonds with homes and local communities. What will be interesting to see is whether the added visibility benefits the most vulnerable local publishers. While regional dailies and local TV affiliates have not been spared by declines in local publishing, community weeklies, alt-weeklies and other niche local publications struggle the most. Will they be able to compete for slots in these carousels?

“Any publisher’s content is eligible to rank within the carousel if their content is relevant to what a reader is searching for,” said Meghann Farnsworth, a Google spokesperson. “Publishers with more Expertise, Authority and Trustworthiness for a given topic or location will rank accordingly.”

But Google acknowledged that the new carousel is not reserved for local publishers, just local content.

“Local news includes sources with news about the user’s location, which is often local publishers but can sometimes include national reporting as well,” said Farnsworth.

News SEO. These new features also highlight why good SEO is essential for local news organizations. Inclusion in Google News (if you can even get in), authoritative news content that demonstrates E-A-T and a speedy website with good UX all help determine SERP visibility, even for smaller local audiences.

Here’s what the local news carousel looks like.

The post Google adds local news features to search, giving publishers more exposure appeared first on Search Engine Land.

Jason November 16, 2021

SiteGround’s Google crawling and indexing issues fixed

For the past several days, the 2 million or so domains hosted on SiteGround were potentially not being crawled, and thus not indexed, by Google Search. There was some sort of “network issue between AWS Global Accelerator service and Google,” the company said, and as of this morning the issue was resolved.

When the issue began. Matt Tutt first reported the issue this past Tuesday, November 9th. So the issue started sometime before November 9th; Matt suspects it began as early as Monday, November 8th.

Here is a screenshot showing how Google Search Console’s URL inspection tool was unable to access sites hosted on SiteGround. As you can see from the screenshot below, Google is reporting that it failed to crawl the page. Matt posted more debugging details on his post.

Confirmation. Then on November 10th, SiteGround confirmed the issue and said it is investigating. “We have escalated the issue to Google and we are working to troubleshoot and identify the cause of the problem,” the company said.

The issue. On November 11th, SiteGround confirmed the issue was between Amazon Web Services and Google. The company said “we traced it down to a network issue between AWS Global Accelerator service and Google. We’re collaborating with engineers from both teams to fix it.”

Resolved. Then a day later, on November 12th, SiteGround confirmed the issue was resolved, noting it can take a bit more time for DNS changes to propagate, but once they do, Google will once again be able to crawl sites hosted on SiteGround. The company said, “We are glad to inform you that we have implemented a fix for the Google bot crawling issue experienced by some sites. Websites are already being crawled successfully. Please allow a few hours for the DNS changes to take effect.”

Google advice. John Mueller of Google posted some advice on Twitter on how Google deals with these outages. In short, don’t worry too much; the issues you may have experienced from the outage will auto-correct and “settle down.” There won’t be any “lasting effects” from the outage, John added. John posted several tweets; here is the first one if you want to click on it to read through the rest.

Why we care. If you or one of your clients are among the two million domains hosted on SiteGround, you may have been impacted by this. That means any new or updated content or pages on your site were invisible to Google for most of the work week.

The issue is resolved and those pages should be crawled by Google going forward. But you may want to annotate your analytics and reporting if you were impacted by this crawling issue.

The post SiteGround’s Google crawling and indexing issues fixed appeared first on Search Engine Land.

Jason November 12, 2021

Google November spam update is fully rolled out

The Google November spam update that began rolling out on November 3rd is now fully rolled out. Google confirmed the rollout was complete eight days after it first started.

The announcement. Here is the announcement about the update from Google on Twitter:

Impact. It is hard to gauge the full impact of this update. We feel this update “had legs,” but you would have only noticed it if you were doing some form of spam that this algorithm targeted. Also, Google suggested this was about content spam and not specifically link spam.

Previous updates. Before this, the most recent confirmed Google update was back in July 2021 named the link spam update. Before that was the July 2021 core update, followed by the June 2021 core update, then part one and part two of the spam updates in June 2021. It’s been quite a year of updates.

Why we care. If you notice large ranking or traffic changes from your organic Google search results, you may have been hit by this spam update. Spam updates target specific guideline violations. This update may have been more focused on content spam efforts. Check your rankings and Google organic traffic over the past week to see if you noticed any big changes to your positions.

The post Google November spam update is fully rolled out appeared first on Search Engine Land.

Jason November 11, 2021

Google logo schema gains ImageObject type support

Google logo schema now supports the ImageObject type, in addition to the URL type, according to updated information in the help document. Google said this provides “new flexibility” to specify an organization logo using this schema markup.

What is new. The logo required properties used to say only the URL type was accepted, but now they say either URL or ImageObject. Here is a screenshot of this section of the help document:

The ImageObject type gives you the ability to add additional data to an image, such as the width and height, the author or a caption; the URL type did not give you these added values.
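To make the difference concrete, here is a hedged sketch of the two forms, built in Python with a hypothetical organization (the name, URL and dimensions are illustrative, not taken from Google’s documentation):

```python
import json

# Two accepted shapes for the Organization "logo" property.
# All values below are hypothetical.

logo_as_url = {
    "@context": "https://schema.org",
    "@type": "Organization",
    "url": "https://example.com",
    "logo": "https://example.com/images/logo.png",  # plain URL type
}

logo_as_image_object = {
    "@context": "https://schema.org",
    "@type": "Organization",
    "url": "https://example.com",
    "logo": {  # ImageObject type: same image, plus extra properties
        "@type": "ImageObject",
        "url": "https://example.com/images/logo.png",
        "width": 600,
        "height": 60,
        "caption": "Example Inc. logo",
    },
}

print(json.dumps(logo_as_image_object["logo"], indent=2))
```

Both forms validate as Organization markup; the second simply carries more detail about the image itself.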

Why we care. Google is giving us a bit more flexibility with implementing logo schema and structured data going forward. So if you are using this schema, you may decide that going forward you want to use the ImageObject type over the URL type – or not.

The post Google logo schema gains ImageObject type support appeared first on Search Engine Land.

Jason November 11, 2021

Google to add page experience ranking signals to desktop search in February 2022

Google will begin rolling out the page experience ranking update to desktop search results starting in February 2022, and said the rollout will be complete by the end of March 2022. This update will include all the current signals of the mobile version of the page experience update, except the requirement that the page be mobile-friendly.

“This ranking launch will be based on the same page experience signals that we rolled out for mobile earlier this year,” Jeffrey Jose, Product Manager on Search at Google said.

We knew this would be coming, Google told us this would happen back in May 2021 at Google I/O.

Mobile vs desktop. Which factors will be included in this desktop version? Google said all of them, with the exception of the mobile-friendliness requirement, which is kind of obvious. Here is a chart Google designed showing the specific factors:

Search Console tools. Google will be updating the Google Search Console tools and reports to help site owners prepare for this update. “We are also planning to help site owners understand how their desktop pages are performing with regards to page experience using a Search Console report which will launch before desktop becomes a ranking signal,” Google said.

Don’t expect drastic changes. Google said with this rollout and this new Google update, do not expect drastic changes. “While this update is designed to highlight pages that offer great user experiences, page experience remains one of many factors our systems take into account… Given this, sites generally should not expect drastic changes,” said Google. We expect the same to be true for the desktop rollout.

Why we care. While I do not believe this page experience update will be a significant update where you will see tons of sites see their rankings drastically change, those working toward improving their page experience have been primarily focused on their mobile pages. Now that you have your mobile pages ready for this update, you can shift focus toward your desktop pages.

The post Google to add page experience ranking signals to desktop search in February 2022 appeared first on Search Engine Land.

Jason November 4, 2021

Google on Penguin algorithm; aims to ignore spammy links but can lead to distrusting your site

When Google released the real-time Penguin algorithm update, which some SEOs code-named 4.0, back in 2016, Google told us this version devalues or ignores most spammy links. For the most part, Google’s Penguin algorithm no longer penalized for bad links because it would aim to neutralize the spammy links and just not count them, as opposed to penalizing for them.

John Mueller, a Search Advocate at Google, said on Friday in a video question and answer session that Penguin does try to ignore the spammy links. However, in the cases where Google cannot because there is a “very strong pattern” of spammy links pointing to the site, Penguin may penalize and distrust the site as a whole and not act in the granular way it was designed for.

John Mueller said this at the 37:06 mark in this video he posted on Friday on the Google Search Central YouTube channel.

What was said. The question John was asked was: “Is the Penguin penalty still relevant at all or less relevant? Spammy, toxic backlinks are more or less ignored by the ranking algorithms these days.”

John responded, “I’d say it’s a mix of both.” He explained, “For the most part when we can recognize that something is problematic and any kind of a spammy link and we will try to ignore it.” “If our systems recognize that they can’t isolate and ignore these links across a website, if we see a very strong pattern there, then it can happen that our algorithms say well we really have kind of lost trust with this website and at the moment based on the bigger picture on the web, we kind of need to be more on almost a conservative side when it comes to understanding this website’s content and ranking it in the search results and then you can see kind of a drop in the visibility there.”

John is saying that, in some cases, Google’s Penguin link algorithm can demote the whole site based on the links and not just ignore the specific spammy links. But it seems like it requires very high levels of spammy links.

“But for the most part like the web is pretty messy and we recognize that we have to ignore a lot of the links out there. So for the most part I think that’s fine. Usually, you would only see this kind of a drop if it’s really a strong and a clear pattern that’s associated with the website,” John added.

John came back on Twitter to clarify that “this is the case for many spam & low-quality signals” in the Google algorithms. He explained, “we’ll work to ignore the irrelevant effects, but if it’s hard to find anything useful that’s remaining, our algorithms can end up being skeptical with the site overall.” “Our spam algorithms are pretty nuanced and they do look at a number of factors,” he added.

Disavow links. So do we need to disavow links now, even when Google said we really don’t need to? The answer is no, you don’t need to disavow links. But you can; John Mueller said, “I’d either ignore it or use the disavow file (for the worse domains).”
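For the rare cases where a disavow is warranted, the file Search Console’s disavow tool expects is just a plain UTF-8 text file, one entry per line. A sketch with hypothetical domains (not real offenders):

```text
# Comment lines start with "#".
# Disavow every link from an entire domain:
domain:spammy-directory.example
domain:paid-links.example

# Or disavow links from a single page:
https://blog.example/guest-post-with-paid-link.html
```

As both John and the consultants below suggest, reserve the domain: form for the worst offenders rather than disavowing broadly.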

The video. Here is the video embed where John said this:

SEO consultants chime in. I asked a few SEO consultants their opinion on what John said in this video and here is what they had to say:

Lily Ray, the Senior Director, SEO & Head of Organic Research of Amsive Digital told us, “John’s advice here shouldn’t come as much of a surprise to SEOs who have dealt with companies engaging in large scale link building initiatives using tactics that violate Google’s guidelines, only to encounter massive declines in ranking and traffic.” “Google doesn’t always send out a manual action when the sites run into trouble. But in many cases, sites can either struggle to rank or feel that they’ve received an algorithmic penalty without any formal notification. Often, when you take a look at their backlink profile and talk to the company about their SEO strategy, you might discover that most of the links are paid links, guest posts, footer links, exact match anchor text, etc. on websites no one has ever heard of. In these cases, I believe it’s important to reconcile Google’s trust issues with your site by disavowing the paid/offending links, as well as earning new, trustworthy links organically,” she added.

Glenn Gabe, SEO consultant at G-Squared Interactive told us, “Rolling out Penguin 4 was a great move by Google in 2016, since it devalued link spam versus penalizing it. But as John explained, if Google’s algorithms cannot find any useful links (which would be an extreme situation), and there is a strong and clear pattern of spammy links, then it can be skeptical with the site, and Google can lose trust with the site overall. As a result, the site can see a drop in search visibility. The problem is that many site owners believe they are being attacked via negative SEO (and that those link attacks are working — and it’s the reason they have seen drops over time). Google has explained in the past that negative SEO attacks don’t work and that its algorithms can just ignore the link spam (especially for sites with a normal mix of links). So for many of those sites fearing negative SEO attacks, the situation John covered in the latest Search Central Hangout would not really apply. In my opinion, if a site has an overwhelmingly spammy link profile (almost all of the links are unnatural and spammy), without any other quality links, then that obviously can be problematic. But for most sites that have a normal mix of links, what John is explaining should NOT be a problem. Unfortunately, I’m already hearing from site owners about this… when their sites definitely don’t fit into the situation John explained in the video.”

Dr. Marie Haynes, the CEO of MHC inc told us, “Google’s communication on what site owners need to know about Penguin has been frustrating. At a Pubcon conference in late 2016 Gary Illyes told us that it was indeed possible for Penguin to algorithmically cause harm saying, “If Penguin sees signs of manipulation, it can decide to discount ALL the links, which can be pretty bad for a site.” Our belief is that while this can happen, it is reserved for cases where there is an obvious history of links being built solely to manipulate PageRank on an astronomical scale. While we have seen improvements for some sites after filing a thorough disavow, even if no manual action is present, the only cases that we feel we can attribute improvements to disavow work are for sites with a history of years of very manipulative link building.”

Why we care. Ultimately, when it comes to link building, you should be careful. Don’t buy links, and don’t look for cheap and easy ways to get links to your site. Make sure to review Google’s link schemes help document and stay far away from those methods.

When working on client sites that may have link issues, you need to decide if you really need to disavow links or not and then which links you should disavow. At the same time, Google is likely already ignoring most of the bad links, so you may not have to worry too much about it.

It may just be easier to avoid the practice of link building and build a website and content that other sites naturally want to link to without you asking them.

The post Google on Penguin algorithm; aims to ignore spammy links but can lead to distrusting your site appeared first on Search Engine Land.

Jason November 1, 2021

Google removes 12 structured data fields from the help documents

Google has removed 12 documented structured data fields from its help documents, saying they were removed because they are “unused by Google Search and Rich Result Test doesn’t flag warnings for them.”

What was removed. Google removed 12 different structured data fields within the HowTo, QAPage and SpecialAnnouncement rich result types. These include:

  • HowTo: description field.
  • QAPage: mainEntity.suggestedAnswer.author, mainEntity.dateCreated, mainEntity.suggestedAnswer.dateCreated, mainEntity.acceptedAnswer.author, mainEntity.acceptedAnswer.dateCreated, and mainEntity.author fields.
  • SpecialAnnouncement: provider, audience, serviceType, address, and category fields.

Google removed these 12 fields from the help documents to more accurately describe what Google Search and the Rich Results Test support.

Remove the code? Should you remove these fields from the structured data on your web pages? No, you do not have to. Google will not support them, but it doesn’t hurt to keep the fields populated on your pages; Google simply won’t use them for Google Search.
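If you prefer to tidy your markup anyway, the cleanup is trivial. A minimal sketch (the announcement values are hypothetical) that drops the removed SpecialAnnouncement fields from a JSON-LD dict:

```python
# Optional cleanup -- removing these fields is NOT required; Google just
# ignores them. This only shows how you might, if you prefer lean markup.

REMOVED_SPECIAL_ANNOUNCEMENT_FIELDS = {
    "provider", "audience", "serviceType", "address", "category",
}

def strip_unused_fields(markup):
    """Return a copy of SpecialAnnouncement JSON-LD without the
    no-longer-documented fields listed in the article above."""
    return {
        key: value
        for key, value in markup.items()
        if key not in REMOVED_SPECIAL_ANNOUNCEMENT_FIELDS
    }

announcement = {  # hypothetical example markup
    "@type": "SpecialAnnouncement",
    "name": "Example notice",
    "datePosted": "2021-10-29",
    "category": "https://www.wikidata.org/wiki/Q81068910",
    "provider": {"@type": "Organization", "name": "Example Inc."},
}
print(sorted(strip_unused_fields(announcement)))  # ['@type', 'datePosted', 'name']
```

The supported fields pass through untouched, so the cleaned markup stays valid.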

Why we care. If you are using these fields, just be aware that they have been officially removed from Google’s Search help documentation. They do not work for rich results in Google Search, and the testing tool won’t notify you if there are errors or warnings with these field types.

Again, you do not need to remove the fields from your structured data, but Google will simply ignore them.

The post Google removes 12 structured data fields from the help documents appeared first on Search Engine Land.

Jason October 29, 2021