Why server logs matter for SEO

The majority of website operators are unaware of the importance of web server logs. They do not record, much less analyze their website’s server logs. Large brands, in particular, fail to capitalize on server log analysis and irretrievably lose unrecorded server log data.

Organizations that choose to embrace server log analysis as part of their ongoing SEO efforts often excel in Google Search. If your website consists of 100,000 pages or more and you wish to find out how and why server logs pose a tremendous growth opportunity, keep reading.

Why server logs matter

Each time a bot requests a URL hosted on a web server, a log record entry is automatically created reflecting the information exchanged in the process. Over an extended period of time, server logs become representative of the history of requests received and of the responses returned.

The information retained in server log files typically includes the client IP address, the request date and time, the page URL requested, the HTTP response code and the volume of bytes served, as well as the user agent and the referrer.

While server logs are created at every instance a web page is requested, including user browser requests, search engine optimization focuses exclusively on the use of bot server log data. This is relevant with regard to legal considerations touching on data protection frameworks such as GDPR/CCPA/DSGVO. Because no user data is ever included for SEO purposes, raw, anonymized web server log analysis remains unencumbered by otherwise potentially applicable legal regulations. 

It’s worth mentioning that, to some extent, similar insights are possible based on Google Search Console Crawl stats. However, these samples are limited in volume and time span covered. Unlike Google Search Console with its data reflecting only the last few months, it is exclusively server log files that provide a clear, big picture outlining long-term SEO trends.

The valuable data within server logs

Each time a bot requests a page hosted on the server, a log instance is created recording a number of data points, including:

  • The IP address of the requesting client.
  • The exact time of the request, often based on the server’s internal clock.
  • The URL that was requested.
  • The HTTP method and protocol version used for the request.
  • The response status code returned (e.g., 200, 301, 404, 500 or other).
  • The user agent string from the requesting entity (e.g., a search engine bot name like Googlebot/2.1).

A typical server log record sample may look like this:

150.174.193.196 - - [15/Dec/2021:11:25:14 +0100] "GET /index.html HTTP/1.0" 200 1050 "-" "Googlebot/2.1 (+http://www.google.com/bot.html)" "www.example.ai"

In this example:

  • 150.174.193.196 is the IP of the requesting entity.
  • [15/Dec/2021:11:25:14 +0100] is the date and time of the request, including the time zone offset.
  • "GET /index.html HTTP/1.0" is the HTTP method used (GET), the file requested (index.html) and the HTTP protocol version used. 
  • 200 is the server HTTP status code response returned.
  • 1050 is the byte size of the server response.
  • "Googlebot/2.1 (+http://www.google.com/bot.html)" is the user agent of the requesting entity.
  • "-" is the referrer (empty in this example), and "www.example.ai" is the hostname for which the request was made.
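To make these fields usable for analysis, each log line first needs to be parsed. Below is a minimal Python sketch using a regular expression matched to the sample record above; real log formats vary by server configuration, so the pattern will likely need adjusting.

```python
import re

# Regex for an Apache-style combined log line with a trailing hostname field,
# matching the sample record above. Field order can differ per server config.
LOG_PATTERN = re.compile(
    r'(?P<ip>\S+) \S+ \S+ '
    r'\[(?P<time>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<path>\S+) (?P<protocol>[^"]+)" '
    r'(?P<status>\d{3}) (?P<bytes>\d+|-) '
    r'"(?P<referrer>[^"]*)" "(?P<user_agent>[^"]*)"'
    r'(?: "(?P<host>[^"]*)")?'
)

def parse_log_line(line):
    """Return the log fields as a dict, or None if the line doesn't match."""
    match = LOG_PATTERN.match(line)
    return match.groupdict() if match else None

sample = ('150.174.193.196 - - [15/Dec/2021:11:25:14 +0100] '
          '"GET /index.html HTTP/1.0" 200 1050 "-" '
          '"Googlebot/2.1 (+http://www.google.com/bot.html)" "www.example.ai"')
record = parse_log_line(sample)
```

Records that fail to parse should be counted rather than silently dropped, so format drift in the log pipeline is noticed early.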

How to use server logs

From an SEO perspective, there are three primary reasons why web server logs provide unparalleled insights: 

  1. Helping to filter out undesirable bot traffic with no SEO significance from desirable search engine bot traffic originating from legitimate bots such as Googlebot, Bingbot or YandexBot.
  2. Providing insights into crawl prioritization, giving the SEO team an opportunity to proactively tweak and fine-tune their crawl budget management.
  3. Allowing for monitoring and providing a track record of the server responses sent to search engines.

Fake search engine bots can be a nuisance, but they only rarely affect websites. A number of specialized service providers, such as Cloudflare and AWS Shield, can help in managing undesirable bot traffic. In the process of analyzing web server logs, fake search engine bots tend to play a subordinate role.

In order to accurately gauge which parts of a website are being prioritized by major search engines, other bot traffic has to be filtered out when performing a log analysis. Depending on the markets targeted, the focus can be on search engine bots from Google, Apple, Bing, Yandex or others.

Especially for websites where content freshness is key, how frequently those sites are being re-crawled can critically impact their usefulness for users. In other words, if content changes are not picked up swiftly enough, user experience signals and organic search rankings are unlikely to reach their full potential.

Only through server log filtering is it possible to accurately gauge relevant search engine bot traffic.
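As a rough illustration, that filtering can start with a user-agent check over parsed log records. The bot names below are examples, and user-agent strings can be spoofed, so a production pipeline should additionally verify bots via reverse DNS.

```python
# Substrings identifying major search engine crawlers (illustrative list).
SEARCH_BOT_TOKENS = ("googlebot", "bingbot", "yandexbot", "applebot")

def is_search_engine_bot(user_agent):
    """User-agent check only; spoofable, so confirm via reverse DNS as well."""
    ua = user_agent.lower()
    return any(token in ua for token in SEARCH_BOT_TOKENS)

# Hypothetical parsed records; in practice these come from the log parser.
records = [
    {"ip": "66.249.66.1",
     "user_agent": "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"},
    {"ip": "203.0.113.7",
     "user_agent": "Mozilla/5.0 (Windows NT 10.0) Chrome/96.0"},
]
bot_records = [r for r in records if is_search_engine_bot(r["user_agent"])]
```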

While Google is inclined to crawl all information available and re-crawl already known URL patterns regularly, its crawl resources are not limitless. That’s why, for large websites consisting of hundreds of thousands of landing pages, re-crawl cycles depend on Google's crawl prioritization allocation algorithms.

That allocation can be positively stimulated with reliable up-time, highly responsive web services, optimized specifically for a fast experience. These steps alone are conducive to SEO. However, only by analyzing complete server logs that cover an extended period of time is it possible to identify the degree of overlap between the total volume of all crawlable landing pages, the typically smaller number of relevant, optimized and indexable SEO landing pages represented in the sitemap and what Google regularly prioritizes for crawling, indexing and ranking.

Such a log analysis is an integral part of a technical SEO audit and the only method to uncover the degree of crawl budget waste, and whether crawlable filtering, placeholder or lean content pages, an open staging server or other obsolete parts of the website continue to impair crawling and ultimately rankings. Under certain circumstances, such as a planned migration, it is specifically the insights gained through an SEO audit, including server log analysis, that often make the difference between success and failure for the migration.

Additionally, for large websites, the log analysis offers critical SEO insights. It can answer how long Google needs to recrawl the entire website. If that answer happens to be decidedly long, months or more, action may be warranted to make sure the indexable SEO landing pages are crawled. Otherwise, there’s a great risk that any SEO improvements to the website go unnoticed by search engines for months after release, which in turn is a recipe for poor rankings.
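A sketch of how that recrawl question could be answered from parsed, bot-filtered log data. The record structure and URLs are hypothetical; real sitemaps and log volumes are orders of magnitude larger.

```python
from datetime import datetime

def full_recrawl_days(crawl_events, sitemap_urls):
    """Days between the first crawl event and the moment every sitemap URL
    has been seen at least once. crawl_events is a time-sorted list of
    (url, datetime) tuples from bot-filtered log records. Returns None if
    full coverage is never reached in the observed window.
    """
    seen = set()
    targets = set(sitemap_urls)
    start = crawl_events[0][1]
    for url, ts in crawl_events:
        if url in targets:
            seen.add(url)
            if seen == targets:
                return (ts - start).days
    return None

# Toy example: three sitemap URLs, fully covered after 90 days.
events = [
    ("/a", datetime(2021, 1, 1)),
    ("/b", datetime(2021, 2, 15)),
    ("/c", datetime(2021, 4, 1)),
]
days = full_recrawl_days(events, ["/a", "/b", "/c"])
```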

A three-part Venn diagram showing the overlap between what Google crawls, your XML sitemap and your SEO landing pages.
A high degree of overlap between indexable SEO landing pages and what Google crawls regularly is a positive SEO KPI.

Server responses are critical for great Google Search visibility. While Google Search Console does offer an important glimpse into recent server responses, any data Google Search Console offers to website operators must be considered a representative, yet limited sample. Although this can be useful to identify egregious issues, with a server log analysis it is possible to analyze and identify all HTTP responses, including any quantitatively relevant non-200 OK responses that can jeopardize rankings. Possible alternative responses can be indicative of performance issues (e.g., 503 Service Unavailable scheduled downtime) if they are excessive.

An abstract graphic showing 503 and 200 status codes.
Excessive non-200 OK server responses have a negative impact on organic search visibility.
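A simple status-code breakdown over filtered bot records can surface excessive non-200 responses. This sketch assumes records already parsed into dicts with a status field; the sample data is invented.

```python
from collections import Counter

def status_breakdown(records):
    """Share of each HTTP status code across parsed bot log records."""
    counts = Counter(r["status"] for r in records)
    total = sum(counts.values())
    return {code: round(n / total, 3) for code, n in counts.items()}

# Hypothetical sample: mostly 200 OK, with some 503 and 404 responses.
records = ([{"status": "200"}] * 95
           + [{"status": "503"}] * 3
           + [{"status": "404"}] * 2)
shares = status_breakdown(records)
non_200_share = 1 - shares.get("200", 0)
```

Tracking this share per day or per week makes it easy to spot when error responses start creeping up.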

Where to get started

Despite the potential that server log analysis has to offer, most website operators do not take advantage of the opportunities presented. Server logs are either not recorded at all or regularly overwritten or incomplete. The overwhelming majority of websites do not retain server log data for any meaningful period of time. This is good news for any operators willing to, unlike their competitors, collect and utilize server log files for search engine optimization.

When planning server log data collection, it is worth noting which data fields at a minimum must be retained in the server log files in order for the data to be usable. The following list can be considered a guideline:

  • remote IP address of the requesting entity.
  • user agent string of the requesting entity.
  • request scheme (e.g., was the HTTP request for http or https or wss or something else).
  • request hostname (e.g., which subdomain or domain was the HTTP request for).
  • request path, often this is the file path on the server as a relative URL.
  • request parameters, which can be a part of the request path.
  • request time, including date, time and timezone.
  • request method.
  • response http status code.
  • response timings.

When the request path is recorded as a relative URL, the hostname and scheme of the request are the fields most often neglected in server log files. In that case, check with your IT department that the hostname and scheme are also recorded. An easy workaround is to record the entire request URL as one field, including the scheme, hostname, path and parameters in one string.
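That workaround can be sketched in a few lines. The field names are illustrative and should be mapped to whatever your log format actually records.

```python
def full_request_url(scheme, host, path, params=""):
    """Combine separately logged fields into one absolute request URL."""
    query = f"?{params}" if params else ""
    return f"{scheme}://{host}{path}{query}"

url = full_request_url("https", "www.example.ai", "/index.html", "lang=en")
```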

When collecting server log files, it is also important to include logs originating from CDNs and other third-party services the website may be using. Check with these third-party services about how to extract and save the log files on a regular basis.

Overcoming obstacles to server log analysis

Often, two main obstacles are put forward to counter the urgent need to retain server log data: cost and legal concerns. While both factors are ultimately determined by individual circumstances, such as budgeting and legal jurisdiction, neither have to pose a serious roadblock.

Cloud storage can be a long-term option, and physical storage hardware is likely to cap the cost: with retail pricing for 20 TB hard drives below $600, the hardware cost is negligible. Given that the price of storage hardware has been in decline for years, the cost of storage is unlikely to pose a serious challenge to server log recording.

Additionally, there will be a cost associated with the log analysis software or with the SEO audit provider rendering the service. While these costs must be factored into the budget, once more it is easy to justify in the light of the advantages server log analysis offers. 

While this article is intended to outline the inherent benefits of server log analysis for SEO, it should not be considered as a legal recommendation. Such legal advice can only be given by a qualified attorney in the context of the legal framework and relevant jurisdiction. A number of laws and regulations such as GDPR/CCPA/DSGVO can apply in this context. Especially when operating from the EU, privacy is a major concern. However, for the purpose of a server log analysis for SEO, any user-related data is of no relevance. Any records that cannot be conclusively verified based on IP address are to be ignored.

With regard to privacy concerns, any log data which does not validate as a confirmed search engine bot must not be used for SEO and can instead be deleted or anonymized after a defined period of time, based on relevant legal requirements. This tried and tested approach is applied by some of the largest website operators on a regular basis.
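A sketch of that approach: a hostname check against the reverse-DNS suffixes Google documents for genuine Googlebot hosts, plus one-way hashing for records that do not validate. A complete verification also re-resolves the hostname forward to confirm it maps back to the logged IP (a network call, omitted here), and the salt value is a placeholder that should be managed and rotated per your own policy.

```python
import hashlib

# Reverse-DNS suffixes Google documents for genuine Googlebot hosts.
GOOGLEBOT_SUFFIXES = (".googlebot.com", ".google.com")

def looks_like_googlebot_host(hostname):
    """Suffix check on a reverse-DNS hostname; pair with a forward lookup."""
    return hostname.endswith(GOOGLEBOT_SUFFIXES)

def anonymize_ip(ip, salt="rotate-me"):
    """One-way hash so unverified records can be kept without raw IPs."""
    return hashlib.sha256(f"{salt}{ip}".encode()).hexdigest()[:16]

verified = looks_like_googlebot_host("crawl-66-249-66-1.googlebot.com")
anon = anonymize_ip("203.0.113.7")
```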

When to get started

The major question remaining is when to start collecting server log data. The answer is now! 

Server log data can only be applied in a meaningful way and lead to actionable advice if it is available in sufficient volume. The critical mass of server logs’ usefulness for SEO audits typically ranges between six and thirty-six months, depending on how large a website is and its crawl prioritization signals. 

It is important to note that unrecorded server logs cannot be acquired at a later stage. Chances are that any efforts to retain and preserve server logs initiated today will bear fruit as early as the following year. Hence, collecting server log data must commence at the earliest possible time and continue uninterrupted for as long as the website is in operation and aims to perform well in organic search.

The post Why server logs matter for SEO appeared first on Search Engine Land.

Jason January 11, 2022

SEO software tools: What marketers need to know

Search Engine Optimization remains the stalwart mainstay of digital marketing, with search driving around 50% of website traffic on average, according to an analysis of SimilarWeb data by Growth Badger. But the practice of SEO has become more complex and it involves more considerations than SEOs enjoyed in the “ten blue links” era.

Today, SEO includes everything from content marketing and distribution to user experience, and even the core job of gathering and interpreting search intelligence has become more challenging as the search engines continually change their display of results and port them over to other media like voice assistants. This doesn’t mean that the well-established SEO best practices should be cast aside, however. Keyword research, page-level analysis, backlink tracking and acquisition, and rank tracking are still of critical importance, even as the environment continues to change.

SEO platforms offer numerous capabilities that range from keyword research and rank-checking to backlink analysis and acquisition, as well as competitive intelligence, social signal integration, and workflow rights and roles.

Enterprise-level platforms may also provide more extensive link and site audits or analytics that include predictive scoring systems to identify potential opportunities to improve page performance or link authority. Vendors differentiate by offering more frequent or detailed data updates or content marketing features that sometimes require additional investment.

The following section discusses some of these capabilities and the key considerations involved in choosing an enterprise SEO platform.

Get the full report on Enterprise SEO Tools here

Link analysis and acquisition

Links continue to be one of the most important external or “off-the-page” signals that can help a website rise in search engine rankings. Most enterprise SEO platforms provide link analysis (i.e., what sites are linking to yours), link building or removal recommendations via competitive analysis, and other reports that reveal opportunities for obtaining links (i.e., what sites should you solicit links from) as part of their base platforms.

Keyword research/rank analysis

Keyword research – knowing what terms people use to find your website, how your pages rank for various queries, and how you should use those terms in your copy – has been a pillar of effective SEO. Virtually all enterprise SEO platforms provide keyword research tools that allow marketers to discover the ways that consumers search for content, and what keywords are driving traffic for competitors.

Vendors source this data differently, however. Some vendors license data from point solutions or ISPs, due to Google’s restrictions on scraped data in its terms of use and the percentage of search results that are “keyword (not provided).” Other vendors develop and manage a proprietary database of keyword terms. As a result, reliable keyword data has become less of a commodity and more expensive.

It’s also important to note that rank analysis has grown increasingly complex as Google has upped its use of more dynamic and visual SERPs. Marketers are no longer satisfied with simple numeric designation of how their page ranks for a particular query; they want to know if it’s displayed in a Carousel, in a Knowledge panel, with Sitelinks — or any of the other ways in which crawled content is being displayed on the SERPs. One of the newest entrants to this category, Visably, offers a very different look at ranking, going so far as to look at all of the content on pages that rank for a particular keyword and then categorizing those pages.

With all of this data, it seeks to give brands a sense of how they’re coming across in search generally, even if the brand-related activity is happening on third-party sites.

Search intent-based analysis

Google’s search algorithms are focusing less on keyword matches and more on search intent. Recent algorithm updates, including the addition of BERT, have reduced the value of keywords in SEO. To counter the lack of keyword data, SEO platform vendors are developing more “search intent”-based tools that analyze search intent and predict or recommend the most relevant content that would meet the searcher’s needs.

Custom site crawls/audits

With content quality becoming the lynchpin for many marketers’ SEO strategies, site crawls or audits are important tools offered by enterprise SEO platform vendors. Some platforms offer optimization recommendations for keywords, page structures, and crawlability; prioritizing and assigning scores for such factors as HTML title tags, body tags, and meta-tags.

Most SEO platforms provide daily site crawls; others offer a weekly frequency. Ideally, the tool should be able to crawl the entire site, not just random pages, and should support the analysis of mobile-optimized and AMP pages as well. However, some enterprise sites are so large that it’s unrealistic to expect a tool to crawl them in their entirety.

Social signal tracking and integration

Social media activity isn’t directly included in search engine ranking algorithms, but pages that are highly shared benefit from higher traffic, and watching social activity can help inform content creation and distribution strategies. Most enterprise SEO platforms track, measure, and integrate social signals into their analytics and dashboard reports.

Sites that experience strong social sharing typically perform better in organic search results. Capabilities range from social signal tracking and correlations to site traffic and conversions, as well as social profile monitoring and sentiment analysis, and contact-relationship management.

While most vendors do well at tracking organic traffic, few currently track paid social activity.

Content marketing and analysis

SEO and content marketing have become closely aligned, as Google has raised the content quality bar through developments like BERT and RankBrain (Hummingbird), and its regular algorithm updates. As a result, relevant, up-to-date content has become integral to SEO success.

Many vendors have upgraded the content optimization and content marketing capabilities of their enterprise SEO platforms and expanded the tools’ content marketing features. These include page management tools or APIs to monitor on-page content and errors, reports on content performance and traffic trends, influencer identification and campaign management, and real-time content recommendations.

More advanced platforms perform analysis to help improve the depth and quality of content by performing topical analysis of content and comparing it against competition to identify potentially important gaps and make recommendations for improvement.

One emerging area in which vendors are investing is the ability to automatically and proactively suggest topics that marketers should create content about — eliminating the need to spend lots of time on analysis. Some even provide assistance with developing the type of content that will show up in queries for target keywords.

International search tracking

International search coverage has become a critical capability, as the global economy leads more U.S.-based enterprises to conduct business online and offline in multiple countries and languages. Most enterprise SEO platforms offer some level of international search coverage that crosses borders, languages, and alphabets. The capabilities include international keyword research, integrating global market and search volume data into the platform, as well as integrating global CPC currency data.

Mobile/local analytics

Google’s search engine updates are increasingly focused on improving the mobile/local search user experience. As mobile-friendly sites rise to the top of the SERPs, marketers are demanding more and better mobile and local data and analytics to help them optimize their sites for mobile users and improve search engine rankings. Many vendors offer features such as mobile audits, rankings, and metrics by device (i.e., desktop, tablet, iPhone, and Android) as well as by location.

Technical SEO crawling

The increasing importance of mobile traffic is also driving the development of tools to identify problems that may be slowing page load or affecting mobile-friendliness. This includes providing information about a site’s ranking for Core Web Vitals.

Additionally, technical implementation of schema markup is necessary if a page is to be used in one of the featured snippets or other advanced displays. Many of today’s tools can identify schema errors and advise on correcting them.

Cross-device attribution

Recognizing that SEO is just one aspect of a brand’s marketing efforts, and also that search traffic (especially on brand keywords) is influenced by paid media, some vendors are developing capabilities that help marketers determine what marketing initiative is driving site visits or sales. This is becoming increasingly difficult, however, as third-party cookies are no longer being supported by many companies.

The benefits of using SEO platforms

With hundreds, thousands, tens of thousands, and even millions of pages, sites, social conversations, images, and keywords to manage and optimize, enterprise SEO has become increasingly complicated and time-consuming.

Using an SEO platform can increase efficiency and productivity while reducing the time and errors involved in managing organic search campaigns. More specifically, managing SEO through an enterprise toolset can provide the following benefits:

  • Many tools, one interface. SEO platforms perform many tasks in one system. A comprehensive dashboard can help your organization monitor SERP rankings and trends, how you measure up to competitors and your share of voice. The integration and prioritization of tasks, reporting, and user permissions can offer substantial benefits to enterprise-level SEO operations.
  • Intent insights. Because of the search engines’ increased focus on user intent, enterprise-level SEO tool vendors are developing machine learning models that analyze user behavior and site content to help marketers answer searchers’ questions.
  • More efficient management of global operations. Enterprise SEO tools have built-in diagnostics that can be invaluable on a global scale to identify site-wide issues across languages, countries or regions. These tools uncover macro and micro issues with pages, templates and infrastructure.
  • Keeping pace with the search engines. SEO software vendors have dedicated teams and engineers to follow frequent search engine algorithm changes and their impact on the SEO reporting required by enterprises.
  • Automated reporting to provide data in near real-time. Many brands end up trying to put a lot of data in spreadsheets and manually update them. But that doesn’t provide a complete view of the data. Most enterprise SEO platforms offer highly customized reporting capabilities that are widget- and wizard-driven to make reporting faster and easier. Many also allow for the export of data to business intelligence tools or other analytics software.

The post SEO software tools: What marketers need to know appeared first on Search Engine Land.

Jason January 10, 2022

January 10: The latest jobs in search marketing

SEO & Content Specialist @ OnBoard (U.S. remote)

  • Salary: $75-85k/yr
  • Work with the business leaders and stakeholders to develop and optimize new and existing content for search engines, creating organic traffic growth and increased on-site conversion.    
  • In partnership with the Demand Generation team, build and iterate keyword strategies that generate leads and accelerate sales funnel velocity. 

Content Strategist @ A Network for Grateful Living (U.S. remote)

  • Salary: $58-65k/yr
  • Cultivate a cohesive content marketing strategy targeted at: driving traffic and brand awareness, increasing engagement, generating new subscribers, building on a strong donor base and retaining activity within an existing community.
  • Work with internal/external collaborators and creative resources to deliver content that is consistent with A Network for Grateful Living’s mission, brand voice and strategy.

Senior Paid Media Specialist @ SEMbyotic (Western U.S. remote)

  • Salary: $65-95k/yr
  • Developing paid search strategy and campaigns across a variety of digital channels (Google, Bing, LinkedIn, Twitter, Facebook, Terminus, etc).
  • Articulating and presenting those strategies to clients for approval.

Senior SEO Analyst @ CB2 (U.S. remote or hybrid in Chicago)

  • Salary: $70-90k/yr
  • Responsible for writing highly-polished, clear, concise and grammatically correct copy that is consistent with brand voice and in-line with SEO best practices and commonly-used style guides.
  • Use keyword research tools to identify, research, and prioritize key traffic and revenue-driving terms from which to write and optimize website copy, meta data and inter-linking strategies.

Want a chance to include your job listing in the Search Engine Land newsletter? Send along the details here.

The post January 10: The latest jobs in search marketing appeared first on Search Engine Land.

Jason January 10, 2022

Google Ads to allow ads for sports betting in New York

New York state will begin allowing mobile and online sports bets to be placed beginning Saturday, January 8, 2022. With that, Google has updated its Google Ads gambling policy to allow ads for sports betting from certified and state-licensed entities in New York State. This was confirmed by the New York State Gaming Commission yesterday, and it goes into effect tomorrow at 9 a.m. local time.

The announcement. Google posted the announcement saying “in January 2022, the Google Ads Gambling and games policy United States country-specifics will be updated. We will begin to accept and run ads for sports betting from certified, state-licensed entities in New York from January 8.”

Google added “Advertisers must apply for certification” and “application for certification will be open to advertisers who wish to promote online gambling content in this region on January 7, 2022.”

Google won’t update the policy page until tomorrow, when the change takes effect. Google wrote “the Gambling and games policy page will be updated when the policy goes into effect.”

Why the change. The Supreme Court overturned the Professional and Amateur Sports Protection Act, a 1992 federal law that restricted all but a handful of states from legalizing sports gambling.

Who is impacted. Well, you can expect gambling sites and apps from companies like Caesars Sportsbook, DraftKings, FanDuel, and Rush Street Interactive to jump at this.

Why we care. If you have clients or run ads for a company that does sports betting online, then you may want to look to expand your Google Ads for New Yorkers.

The post Google Ads to allow ads for sports betting in New York appeared first on Search Engine Land.

Jason January 8, 2022

Google AdSense related search experiments re-enabled

Google has announced that it has re-enabled the AdSense related search experiments. Google said that as of January 6, 2022, AdSense publishers can once again create Related search Custom search style experiments in their AdSense accounts.

Paused for eight months. Google originally announced that it was pausing the related search experiments for just a couple of months. That pause lingered on for a total of eight months, and now Google has unpaused the feature.

The original pause announcement, which was changed slightly, now reads:

We would like to inform you about a few changes that impact your ability to create Custom search style experiments, specifically for your Related search styles in your AdSense account.

Beginning May 10, 2021, you will not be able to create new experiments for your Related search styles in your AdSense account. Your existing Related search experiments that are currently in progress will also stop on May 10, 2021. Please note that this change does not prevent you from creating experiments for other elements of your styles (e.g., search ads, shopping ads).

Our engineering teams are working hard to bring this functionality back.

Thank you for your understanding and we apologize for any inconvenience caused.

Where to access. You should be able to see this option under the “optimization” section and within the “experiments” tab in the AdSense console.

Why we care. If you missed using the related search experiments feature for your AdSense ads on your site, you can now re-enable it. It might be worth testing the feature if you have not done so for a while.

The post Google AdSense related search experiments re-enabled appeared first on Search Engine Land.

Jason January 7, 2022

Apple Maps: What a maps web snapshot can do for your online content

Apple recently announced Maps Web Snapshots, a new static map product. Maps Web Snapshots allow users to create a static map image from a URL that can be used any time an interactive map is not necessary or JavaScript is unavailable. The map image pulls from a URL, making it a fantastic option for web pages and email clients alike.

How it works.  Maps Web Snapshots allow you to visually share points of interest and location details simply by loading a URL. The maps share chosen data points and details, including business locations, geographic boundaries, routing information and parameters, which can be customized to display different overlay styles, color schemes and map types, allowing you to choose what information your audience sees and how it appears on their screen. All you need is an Apple Developer account, MapKitJS key information and a domain to refer the Snapshot from. Once you set up the appropriate credentials, you can utilize Snapshots Studio to build your Maps Web Snapshot.

Unlike Apple’s Core Maps product, these maps are static and do not offer interactive features, making them perfect for situations where JavaScript is not available.

What it looks like. If you’ve ever used DuckDuckGo, you’ve likely seen Maps Web Snapshots in action.

Apple provides the basic map layout for developers to use as a starting point. They can then add parameters to modify the map display based on the user’s light or dark mode display, in addition to choosing the type of map displayed using the Snapshot Studio tool. Once a location is chosen for the center point of the map, parameters such as map size, language, map or satellite view, color and marker style can be customized.

Once the map is created, it can be embedded anywhere you choose and allows up to 25,000 daily views. Developers can request capacity increases as needed, depending on their level of developer membership.
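For illustration, constructing a snapshot URL might look like the sketch below. The endpoint and parameter names are assumptions based on Apple’s Maps Web Snapshots documentation and should be verified against the current docs; real requests must also include your team ID, key ID and a signature generated with your MapKit JS key, which is omitted here.

```python
from urllib.parse import urlencode

# Assumed endpoint; check Apple's Maps Web Snapshots docs before relying on it.
SNAPSHOT_BASE = "https://snapshot.apple-mapkit.com/api/v1/snapshot"

def snapshot_url(lat, lon, width=600, height=400, color_scheme="light"):
    """Build an unsigned snapshot URL; signing parameters must still be added."""
    params = {
        "center": f"{lat},{lon}",          # map center point
        "size": f"{width}x{height}",       # image dimensions in pixels
        "colorScheme": color_scheme,       # light or dark display mode
    }
    return f"{SNAPSHOT_BASE}?{urlencode(params)}"

url = snapshot_url(37.3349, -122.0090)
```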

Why we care. Maps Web Snapshots allow you to add a map to your website, email, or anywhere else you can share a link that you’ve chosen the data points on. Adding business locations, points of interest, and routing information that you choose to share offers a variety of opportunities for business owners and users alike:

  • Share route information with customers.
  • Present shareable, branded, clutter-free (and competitor-free) map views.
  • Highlight business locations.
  • Curate chosen points of interest for visitors.
  • Embed map content that is not dependent upon Google.
  • Share routes privately.

Apple has been quietly working to create a map product that rivals Google Maps and, as a result, is rolling out innovative features that users seem increasingly pleased with. Apple's commitment to privacy was further solidified by its DuckDuckGo partnership, which offers private map route planning. Its maps are currently displayed in DuckDuckGo's SERPs, and this trend will likely continue across other search platforms as users' desire for privacy increases.

The post Apple Maps: What a maps web snapshot can do for your online content appeared first on Search Engine Land.

Jason, January 7, 2022

Google Ads error tells advertisers exact match keywords are saving as broad match

Search marketers entering exact match keywords may see an alert towards the bottom of their Google Ads interface stating, “Broad match keywords are on. Keywords will be saved as broad match.” This message is an error and has no impact on how keywords are saved, Ginny Marvin, ads product liaison at Google, has confirmed.

Image: Drew Cannon.

Tip of the hat to Drew Cannon for bringing this to our attention.

Google is working to remove the incorrect message. "This message is an error that we are working to remove," Marvin said on Twitter. "The message was intended for a potential opt-in experiment, but it has no impact on how keywords are saved or the traffic to which these keywords match. Our apologies for the confusion."

Why we care. This message tells advertisers exactly what they don’t want to see when using exact match keywords, so it can be frustrating and cause them to question whether the platform is functioning properly. However, the ability to save exact match keywords remains intact, despite the confusing alert, so you can carry on as usual. And, Google is working on a fix, so the message should go away soon.

The post Google Ads error tells advertisers exact match keywords are saving as broad match appeared first on Search Engine Land.

Jason, January 7, 2022

Engage Your Online Prospects With These New Mortgage Website Templates

Looking to start 2022 with a fresh online presence? Check out these two brand-new mortgage website templates we just released! These professional, eye-catching designs make a striking first impression, and the sleek interface makes for an enjoyable user experience for your loan prospects.

They’re mobile responsive, modern, fully customizable (no coding needed!), and ready to launch in minutes, just like all our other templates. They also feature our new Mortgage Calculator, underscoring the modern borrower’s preference to engage, not just consume.

Perfect for either personal or company branding, the new mortgage templates have GROWTH built into every design and integration element. Click the images below to see them live.

Template 62

Template 61

Jason, January 6, 2022

Google launches ‘Shops’ section in mobile search results

Google has launched a “Shops” section in the mobile search results, a company spokesperson has confirmed to Search Engine Land. The Shops section shows three retailers (but can be expanded to show up to ten) based on their organic search rankings and is available on mobile devices for select shopping-related queries in the U.S.

The Shops section in the mobile search results. Image: Khushal Bherwani.

Tip of the hat to Khushal Bherwani for bringing this to our attention.

Google’s statement. “We recently launched Shops, a new module available on mobile devices for select US-English shopping-related queries,” a Google spokesperson told Search Engine Land. “We launched this to help present more seller options to users on Search. This feature currently shows 3 shops and users can then expand to see up to 10 merchants depending on availability. The selection of results shown and their order are based on organic search ranking.”

Another milestone for Google’s organic shopping efforts. Over the last two years, Google has expanded its shopping-related results from being a paid product to also offering plenty of visibility opportunities organically, beginning with the introduction of free product listings in April 2020.

The company has also introduced a “deals” section in the search results and launched Shopping integrations for Shopify, WooCommerce and GoDaddy, among other e-commerce platforms.

Why we care. The Shops section is another area of the search results where retailers can appear, which can increase awareness for their brands and drive traffic. However, since the Shops section is based on organic search ranking, retailers who don’t already rank well may not be able to reap its benefits.

As Google continues to add support for organic shopping features, it only becomes more important for merchants to ensure their sites are optimized both for traditional search results and shopping-related features.

Non-shopping results may also appear in the same results page as the Shops section (in the case of the screenshot above, there is a listing for a tutorial on how to fix a broken bike chain). The addition of more shopping features may potentially push non-shopping-related results further down the page, which can affect clickthroughs.

From an industry perspective, Google’s buildout of organic e-commerce features supporting both users and merchants speaks to the rise of digital commerce and its role in the company’s strategy.

The post Google launches ‘Shops’ section in mobile search results appeared first on Search Engine Land.

Jason, January 6, 2022