Evolving Core Web Vitals tactics using Cloudflare and WebpageTest

In our guide to Core Web Vitals tactics using Cloudflare and WebpageTest, we outlined basic requirements for using Cloudflare as a reverse proxy for testing tactical HTML changes with WebpageTest. Our version of the test is simplified from Patrick Meenan’s original concept, which uses HTMLRewriter() to select an element and modify code.

We’re going in-depth with this tutorial, but if you’re just looking for the Cloudflare Worker script, you can find it here.

Our first installment noted that the script wouldn’t keep up with changes at Search Engine Land: the LCP was hard-coded, and we need it to interact with a dynamic page and its values. While WebpageTest has, at the time of publication, the most well-thought-out waterfall chart and more detail than you can imagine, it isn’t the fastest way to get results.

Lighthouse from the Command Line

Running the Lighthouse CLI (command-line interface) with the --extra-headers option needed for the test also lets us simulate standard settings for Core Web Vitals the way we did with WebpageTest. You’ll need to work from a terminal emulator.

The easiest way to install Lighthouse is with NPM (Node Package Manager); assuming a recent Node.js is already on your machine, a global install looks like this:
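
$ npm install -g lighthouse

Once installed, run the following command (the single quotes around the extra-headers value keep your shell from mangling the JSON):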

$ lighthouse https://sel.deckart.workers.dev \
  --extra-headers='{"x-host":"searchengineland.com","x-bypass-transform":"false"}' \
  --form-factor=mobile \
  --throttling.cpuSlowdownMultiplier=4 \
  --only-categories=performance \
  --view
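
The mobile form factor and 4x CPU slowdown mirror Lighthouse’s default simulated-mobile environment, which keeps scores roughly comparable with PageSpeed Insights, and --view opens the HTML report in your browser once the run completes.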

The evolution of our Testbed

Our aim is to demonstrate the evolution from an original concept for a testbed into a project suitable for our future events and articles. The testbed shouldn’t be confined to running performance evaluations; that’s just where we’ll start. But it has to work fairly well across a range of websites, and that can prove pretty difficult. We’ll supply methods to help.

For example, sites often use relative paths for asset references rather than absolute URLs (with the HTTP protocol and all). We’ll supply a block to match these so the HTML will generally work. When things still don’t work after applying it, switching troublesome references between the test and test-subject hostnames often does the trick, even for CORS policy violations.

That’s where the beauty of Cloudflare’s HTMLRewriter() really shines. Site-wide assets are usually loaded as child elements of the page HEAD. With jQuery-like matching flexibility, even similar syntax, we can select child elements of HEAD when necessary; the selector engine supports CSS-style selectors rather than XPath or regular expressions. Let’s keep it simple and look for relative paths that start with “/” in src or href attributes:

return new HTMLRewriter()
  // Rewrite root-relative stylesheet and other <link> references to absolute URLs.
  .on('link', {
    element: el => {
      const link_href = el.getAttribute('href');
      if (link_href && link_href.startsWith('/')) {
        el.setAttribute('href', 'https://' + host + link_href);
      }
    }
  })
  // Do the same for script sources...
  .on('script', {
    element: el => {
      const script_src = el.getAttribute('src');
      if (script_src && script_src.startsWith('/')) {
        el.setAttribute('src', 'https://' + host + script_src);
      }
    }
  })
  // ...and for images.
  .on('img', {
    element: el => {
      const img_src = el.getAttribute('src');
      if (img_src && img_src.startsWith('/')) {
        el.setAttribute('src', 'https://' + host + img_src);
      }
    }
  })
  // "response" is the proxied origin response fetched earlier in the full script.
  .transform(response);

We’re leveraging the power (and cost effectiveness) of edge computing to conduct seriously useful tests. Modify the x-host request header to load different sites in the testbed and open DevTools. Transformations may not be needed, but your mileage may vary; frontend experience gives you a feel for it.

Comment blocks in and out like switches when something fails; a little experimentation may be all you need. For example, some asset references may be protocol-relative, written without the scheme and colon. You would need to write another conditional that checks for paths where href or src starts with “//” and then modify the selected element value in the script, as sketched below. Try to end up with no console errors the actual site doesn’t have.
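
Here’s a minimal sketch of that extra conditional for link elements (the same pattern applies to script and img). Note that the “//” check has to run before the single-“/” check, or protocol-relative paths get mangled:

.on('link', {
  element: el => {
    const link_href = el.getAttribute('href');
    if (link_href && link_href.startsWith('//')) {
      // Protocol-relative reference: prepend the scheme only.
      el.setAttribute('href', 'https:' + link_href);
    } else if (link_href && link_href.startsWith('/')) {
      // Root-relative reference: prepend scheme and the x-host value.
      el.setAttribute('href', 'https://' + host + link_href);
    }
  }
})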

Lighthouse gives you LCP

It’s relatively easy to retrieve LCP references using Lighthouse, PageSpeed Insights or WebpageTest. Presuming the LCP element qualifies for preload, like when it’s an image rather than a <div> or a <p>, and isn’t already being preloaded, provide our script the href value as a URL query parameter (or return HTML with a form) to test how preload changes a page’s LCP timing.

Most technical SEO practitioners are handy at modifying request query parameters to drive server-side programs, like Google search results. Using the same interface, our script preloads the LCP resource at the path you supply in the “lcp” parameter value, passing it to a function called addPreloadAfter() that interpolates the HTML for the test.

async function handleRequest(request) {
  const { searchParams } = new URL(request.url);
  const lcpHref = searchParams.get("lcp");

  // "newResponse" is the proxied origin response built earlier in the
  // full script (elided here for brevity).
  return new HTMLRewriter()
    .on('title', addPreloadAfter(lcpHref))
    .transform(newResponse);
}

The addPreloadAfter() function takes our “lcpHref” value from searchParams.get() and processes it as “href” to build HTML.

const addPreloadAfter = (href) => ({
  element: (el) => {
    // Browsers ignore rel="preload" hints that lack an "as" attribute;
    // "image" is assumed here because LCP candidates are usually images.
    el.after(`<link rel="preload" href="${href}" as="image" />`, { html: true });
  }
});

Notice the option “html: true”? Cloudflare requires this setting for safety when a Worker writes markup with HTMLRewriter() API methods: without it, the string is escaped and inserted as plain text rather than parsed as HTML. You are going to want to learn its capabilities and constraints for coding your own tests.
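
To try it, request the Worker with a candidate LCP path appended as the lcp parameter (the image path here is hypothetical):

https://sel.deckart.workers.dev/?lcp=/wp-content/uploads/hero-image.jpg

Then compare Lighthouse or WebpageTest LCP timings with and without the parameter to see what the preload buys you.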

Cloudflare’s KV

If we’re ever going to do anything remotely interesting, we need a way to store persistent data between script executions. Luckily, Cloudflare also offers a neat little data storage mechanism called KV that we can bind with our Workers to store a small data ‘value’ field, accessible by its ‘key.’ It’s surprisingly easy to comprehend and implement. To demonstrate how to use it we’ll write a quick little hit counter.

// "KV" is the namespace binding we'll set up below; "host" holds the
// x-host request header parsed earlier in the script.
const counter = parseInt(await KV.get("counter"), 10) || 0;

if (!host || counter > 1000) {
  return new Response('hit limit exceeded or x-host missing', {status: 403});
} else {
  // KV stores values as strings, so serialize the incremented count.
  await KV.put("counter", String(counter + 1));
}

Find the KV navigation menu item under Workers.

Add a KV Namespace and counter variable with zero for a starting value

Once you’ve created a Namespace (“SEL” is used in the example above), use the KV dashboard UI to create your first Key (‘counter’ in the above case) and assign a starting value. Once set up, navigate back to the Worker dashboard for the interface required to bind our new KV Namespace with Cloudflare Workers so they can access Keys and the associated stored Values.

Bind KV Namespaces to Workers

Choose the Worker you want to bind with and click its Settings menu to find the submenu for Variables (directly under General). Notice you can define environment variables, Durable Object Bindings (which we’ll explore in a future installment), and finally KV Namespace Bindings. Click Edit Variables and add the Variable you want to use in script.

In the following case, you can see our redundantly named ‘KV’ variable that we’ll be using in the associated Worker script (the one we navigated from); the name was chosen purely for illustration. Select it from the dropdown, save it, and you’ll immediately be able to use your variable in the script. Create as many script and KV Namespace combinations as you like.

KV Namespace Bindings.

The trick is remembering to bind each Variable you want used in the Worker. It’s flexible enough that you can feel free to munge about and make a mess at first; you’ll probably be able to organize it into something cohesive later, which is exactly what you want when prototyping applications or authoring microservices for use in your applications.

Once you’ve gotten your KV service and starting values set up, navigate back to the Worker and open the built-in “Quick Edit.” Replace what’s there with this updated gist, which includes the hit counter and everything else written about in this post. Click “Save and Deploy” and you should have the service up and running at your publicly available Workers demo URL.

Why we care

Our original guide was meant to whet your appetite and get you excited to start learning. To supply that, we have a free platform and code combination that is simple enough to understand on its own, coupled with a process that should be easy enough to follow through to a test result.

Standardizing website testing to demonstrate SEO to developers shouldn’t require understanding code: you can copy and paste the script into Cloudflare, follow the steps and test certain SEO tactics. Core Web Vitals tests are about as reliable a signal as we’re going to get for improving RUM (real user monitoring) performance scores for a boost in rankings, given how metrics-dependent that signal is.


Neeva seeks to expand user base with free subscriptions

Neeva, the ad-free, private search engine co-founded by former SVP of Google Ads Sridhar Ramaswamy, has launched a free basic subscription as an alternative to its full-featured subscription, which costs $4.95 per month. While a worldwide rollout is planned, both subscriptions are currently only available to users in the U.S.

Image: Neeva.

Why we care. Newer search engines, like Neeva, DuckDuckGo and Ecosia, are finding novel ways to differentiate themselves from Google and Bing by rallying behind a unique selling point to appeal to a niche, but enthusiastic user base. A subscription fee can be a strong deterrent for new user acquisition, and Neeva is notably one of the few search engines that charges one.

“Even with a limited trial period, hundreds of thousands of users search with Neeva every month, and we think that the introduction of a free tier will drive this to new heights,” the company said in the announcement. If Neeva’s free offering gains traction, search marketers may need to pay attention as organic campaigns will be crucial for reaching Neeva’s users.

However, if free users don’t upgrade to the paid subscription, Neeva may have to adjust its strategy, especially if it can’t compensate for the additional cost of those users with ad revenue.

Free vs. premium. Free subscribers have access to Neeva’s ad-free search engine; however, customization options may be limited.

The premium subscription includes everything in the free subscription, but also includes access to Neeva’s latest search features, membership in a Neeva-hosted community, access to a monthly Q&A with the founders and additional privacy tools, such as a VPN and password manager.

RELATED: Neeva’s ‘FastTap Search’ feature presents direct links instead of a results page


How AI can automate SEO tasks at scale

Artificial intelligence, machine learning and neural networks are major buzzwords in the SEO community today. Marketers have highlighted these technologies’ ability to automate time-consuming tasks at scale, which can lead to more successful campaigns. Yet many professionals often have trouble distinguishing between these concepts.

“Artificial intelligence is essentially the term that defines the whole space,” said Eric Enge, president of Pilot Holding and former principal at Perficient, in his presentation at SMX Next. “Machine learning is a subset of that [AI] set around specific algorithms.”

Natural language processing (NLP) is another system that’s been used for SEO tasks in recent years. It’s primarily focused on understanding the meanings behind human speech.

“NLP is about helping computers better understand language the way a human does, including the contextual nuances,” he said.

With so many developing technologies available, marketers would be wise to learn how they can be applied to their campaigns. Here are three ways AI and its branches can automate SEO tasks at scale.

AI can address customers’ long-tail needs

Enge pointed to a customer search engagement study from Bloomreach that found that 82% of B2C shoppers’ experience is spent searching and browsing. This leaves room for plenty of long-tail searches, which are more niche in nature and, consequently, often overlooked by marketers.

Bloomreach’s own AI tool focuses primarily on extracting insights from this phase of discovery, Enge explained. It can identify site content that’s both underutilized and matches customer long-tail searches.

“AI improves pages by presenting more related pages that currently aren’t being linked to,” he said, “or even potentially creating new pages to fill the holes of those long-tail needs to create a better customer experience.”

Source: Eric Enge and Bloomreach

Marketers can use AI systems to generate more relevant pages based on these long-tail interests. But, there are some caveats to be aware of.

“Just be careful not to create too many new pages,” Enge said. “There are certainly cases where too many pages can be a bad thing. But deployed properly, this can be very effective.”

AI can enable automated content creation

Enge shared some information about GPT-3, a popular AI language model, to demonstrate AI’s content creation capabilities. While impressive, he noted how a system like this can get out of control if there aren’t proper constraints.

“They [AI systems] currently don’t have any model of the real world,” he said. “They only have the data that they were trained on. They don’t have any perspective or context for anything, so they can make really bad mistakes, and when they write, they’re prone to bias.”

“The wonderful thing about the web is that it has all the world’s information on it — the terrible thing about the web is all the world’s disinformation is on it, too,” he added.

Despite these weaknesses, AI systems have a lot of promise. Continuous improvements in these technologies can help marketers scale content efforts to meet customer expectations.

GPT-3, in particular, has the ability to generate content in a variety of formats, allowing SEOs to focus more on optimization efforts.

“You can use it [GPT-3] to create new content,” Enge said. “You’re going to have to put in a lot of effort and bring a lot of expertise to the table to do it. It might be more cost-effective than writing from scratch, or it may not, depending on how good you are.”

AI can leverage deep learning to help establish topical authority

Having topical authority means your site is a perceived expert on a given subject. This is one of the factors many SEOs believe is vital for improving rankings, which is why so many have leveraged AI’s capabilities.

Enge pointed to seoClarity, which uses an AI tool called Content Fusion designed to help brands write with more authority, to highlight these deep learning capabilities: “The approach is to leverage deep learning to identify entities and words that help you establish authority in a topic,” Enge said. “It extracts intent, entities, terms and potentially related topics. Then they apply their machine learning models that are specific to your market space.”

Deep learning content fusion pipeline. Source: Eric Enge and seoClarity

The deep learning capabilities offer marketers a clearer view of their brand’s area of expertise, which can then be used to further develop their web properties. Establishing an automated deep learning system can provide them with fresh data to help demonstrate E-A-T (Expertise, Authoritativeness, Trustworthiness).

Every AI integration will look different, but each one has the potential to streamline your SEO efforts through automation and machine learning.

“There’s an incredible amount of stuff happening out there with AI,” Enge said. “Some of it you can take into your own hands if you’re willing to do the programming; in other cases, you can use tools. It’s up to you.”

Watch the full SMX Next presentation here (free registration required).


Google adds new robots tag indexifembedded

Google has a new robots tag for when you use embedded content on your pages named indexifembedded. Google said with this new tag “you can tell Google you’d still like your content indexed when it’s embedded through iframes and similar HTML tags in other pages, even when the content page has the noindex tag.”

Why we care. If you embed content on your site and want to control indexing of the content on the page, now you have more control with this new indexifembedded robots tag. Give it a try and see if it helps you with any indexing issues you may have had with pages where you embed content.

Why a new tag. Google explained that sometimes publishers want the content on the page to be indexed and sometimes not, when they embed content. This new robots tag gives you more control over communicating those wishes to Google Search.

“The indexifembedded tag addresses a common issue that especially affects media publishers: while they may want their content indexed when it’s embedded on third-party pages, they don’t necessarily want their media pages indexed on their own,” Google said. “Because they don’t want the media pages indexed, they currently use a noindex tag in such pages. However, the noindex tag also prevents embedding the content in other pages during indexing.”

Noindex and indexifembedded. Google said this new indexifembedded tag works with the original noindex tag: “The new robots tag, indexifembedded, works in combination with the noindex tag only when the page with noindex is embedded into another page through an iframe or similar HTML tag, like object.”

The example Google gave was if podcast.host.example/playpage?podcast=12345 has both the noindex and indexifembedded tag, it means Google can embed the content hosted on that page in recipe.site.example/my-recipes.html during indexing.

Code examples. Google’s announcement showed two implementations: the standard robots meta tag and the X-Robots-Tag HTTP header.
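
Based on the syntax in Google’s documentation, the meta tag form combines the two tokens:

<meta name="robots" content="noindex, indexifembedded" />

And the HTTP header form sends both directives:

X-Robots-Tag: noindex
X-Robots-Tag: indexifembedded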

Other search engines. It seems Google is the only search engine to currently support this new robots meta tag.

Why use it? I asked John Mueller of Google why anyone would use this. I am still not sure I am convinced, but this is what he said:


Google updates product structured data for car review snippets

Google has added a note to the product structured data help documentation to explain how to specify car markup and still have Product review snippet feature eligibility.

What is new. The note says “currently Car is not supported automatically as a subtype of Product.” You “will need to include both Car and Product types if you would like to attach ratings to it and be eligible for the Search feature,” Google said.

Code example. Google then provided this example in JSON-LD:

Image: Google.
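
The screenshot isn’t reproduced here, but the gist is declaring both types together in the @type array. A minimal illustrative sketch (the product name and rating values are hypothetical):

{
  "@context": "https://schema.org",
  "@type": ["Product", "Car"],
  "name": "2022 Example Sedan",
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.4",
    "reviewCount": "89"
  }
}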

Why we care. If you are just using car schema or just using product schema on your car and automobile landing pages, you will want to make sure to use both. If you do not, any review snippets may not show up in the Google search results for your site and web pages.


Semrush acquires SEO training website Backlinko

Semrush, the SEO toolset provider, has acquired SEO training website Backlinko, the company announced Wednesday, for an undisclosed sum.

Why we care. Semrush’s acquisition will likely strengthen the company’s own SEO education hub, Semrush Academy. This could make Semrush a more robust source of SEO knowledge and may help it pull away from other toolset providers.

Additionally, Backlinko draws in 500,000 organic visits a month, according to the announcement. Semrush is likely to market to those users to drive them further down its sales funnel.

Why Semrush may have acquired Backlinko. “The desire to acquire Backlinko was fueled by Semrush’s commitment to inspiring both the current and next generation of digital marketers,” according to the announcement.

In an email sent to Backlinko subscribers, Brian Dean, founder of Backlinko, explained that he wanted to further scale his business, but had hit a plateau in terms of what he could do on his own. “Then, out of the blue, I got an email that changed everything,” Dean wrote, sharing that he was approached for this deal by Max Roslyakov, SVP of marketing at Semrush.

Image: Sugandha Bansal.

Some SEOs have speculated about other potential reasons for the acquisition: “How much was Backlinko worth to Semrush? Breaking down courses, traffic, and estimated MRR from converting that traffic. What if a competitor came in?” tweeted Victor Pan, head of technical SEO at HubSpot.

“That’s what makes the acquisition itself the joke about our industry — A leading, listed tool company, buying a site, purely for the links gained by publishing nonsense,” Peter Mindenhall tweeted, suggesting that one of the motivators behind the deal was the acquisition of Backlinko’s backlink profile. Mindenhall’s remark about “publishing nonsense” refers to the mixed reception that some of the guidance in Backlinko’s content has garnered from the SEO community.

The future for Semrush/Backlinko. “On joining Semrush, Backlinko founder Brian Dean and his colleagues will continue to grow the Semrush community by creating and curating original content for the Semrush Academy,” according to the announcement.

“As part of my agreement with Semrush, I’ll still be working on Backlinko for the foreseeable future on a part-time basis,” Dean tweeted.


How analyzing search data can improve your decision-making

Much like a physical marketplace, the online search environment has both its successful businesses and those that fail to gain traction. Matt Colebourne, CEO of Searchmetrics, used the analogy of a “high street” — the main commercial and shopping area — to describe the current state of search marketing in his presentation at SMX Next.

“Just as you have the winners and the losers in the physical space, you have the same in the digital space; page two of Google or any search engine is fundamentally the ‘backstreet,’” he said. “That’s where a lot less audience is going to end up.”

Marketers have long used search data to optimize their content so it meets user needs. But many fail to apply those same insights to inform decisions that impact the long game.

“A lot of companies make the mistake of optimizing for growth way too soon,” Colebourne said. “They settled for their current product set and their question becomes, ‘How can we optimize sales of what we have?’ Whereas the questions they should be asking are, ‘What are the sales that we could have? How much of our target market do we have right now?’”

Each day Google processes over 3.5 billion searches, which provides marketers with a wealth of data. Here are three reasons why analyzing this search data improves marketers’ decision-making processes.

Search data shows where your growth is coming from

“Currently, about 15% of search terms that appear on Google every month are new,” said Colebourne. “So, that starts to give you an inkling of the pace of change that we have to deal with. We see trends come and go in months, and in some cases even weeks. And as businesses, we have to respond.”

Many organizations focus too much energy on driving growth while neglecting to determine where that growth is coming from. And in this digital age, there’s a good chance much of it is coming from search. This data offers marketers valuable insights, especially those relating to their industry segment.

“You have to understand how your industry and category is structured and ask the right questions,” he said. “If, for example, you sell specialty sports shoes, it doesn’t make a lot of sense to compare yourselves with Nike or similar companies who have much much bigger coverage, but may not be leaders in certain segments.”

It helps address your most significant decision-making challenges

Data — specifically search data — should be part of any company’s core decision-making process. To show how often brands use it, Colebourne highlighted a survey from Alpha (now Feedback Loop) that asked 300 managers how they made decisions.

“The question they asked was, ‘How important or not important is it to you to use data to make decisions?’” said Colebourne. “And I think nobody is going to be surprised by the results — 91% think data-driven decision-making is important . . . But the corollary to this question was, ‘How frequently or infrequently do you use that?’”

Source: Matt Colebourne

The answer was just 58%.

Clearly, knowing search data is valuable isn’t enough to be successful — marketers need to use these insights from searchers to make better business decisions. Otherwise, they’re going to miss out on a good source of traffic insights.

“65% of all e-commerce sessions start with a Google search,” Colebourne said. “I would argue that makes it a good source for decision-making. It’s a massive sample set, completely up to date, and it’s continually refreshed.”

Search data gives you more consumer context

“That [search] data — sensibly managed and processed — can show you the target market and provide you with the consumer demand,” said Colebourne. “It can show you if the market is growing or contracting.”

What search data really means. Source: Matt Colebourne

Analyzing search data can give marketers a clearer view of their consumers, especially for those groups they haven’t reached yet. Reviewing what people are searching for, how often they’re searching and how your competitors are addressing the challenge can make decision-making that much easier.

But more than that, marketers must look at the marketplace as a whole, using search data to inform decision-making.

“We’re all very focused on keywords and rankings and all these good things that we know how to manage,” Colebourne said. “But what we need to do is step beyond that and not just look at what we have or what competitors have, but look at the totality of the market.”

“Let’s look at the input search data to understand what the real demand is and how big this market is,” he added.

Watch the full SMX Next presentation here (free registration required).


Google recipe markup now requires specific times, no more time ranges

Google has updated the recipe schema markup help documents to remove all references to time ranges for food prep, cook time and total time as a supported field type. Now you need to specify a singular time, and no longer provide time ranges.

What changed. Google wrote that it “removed guidance about specifying a range for the cookTime, prepTime, and totalTime properties in the Recipe documentation. Currently, the only supported method is an exact time; time ranges aren’t supported. If you’re currently specifying a time range and you’d like Google to better understand your time values, we recommend updating that value in your structured data to a single value (for example, "cookTime": "PT30M").”

Old docs. The old documentation had references to using minimum and maximum time frames for the range of time it takes to prepare and cook the dish. Here is an old screenshot from the totalTime field about the min and max ranges:

Now there are only references to using a singular and fixed time without any ranges.
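
For reference, the supported form is a single ISO 8601 duration per property. A minimal illustrative snippet (the recipe name and times are hypothetical):

<script type="application/ld+json">
{
  "@context": "https://schema.org/",
  "@type": "Recipe",
  "name": "Example beef stew",
  "prepTime": "PT15M",
  "cookTime": "PT30M",
  "totalTime": "PT45M"
}
</script>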

Why we care. If you use recipe schema markup on your pages and have time ranges in that markup, you will want to adjust those ranges to use singular and fixed times. One would assume the Search Console reports will soon show errors for the use of ranges but you should jump on this and modify any uses of ranges in your markup.


Yoast SEO launches on Shopify

Yoast SEO is now available for Shopify site owners, Yoast announced Tuesday.

An example of Yoast SEO within Shopify. Image: Yoast.

The company first unveiled plans for the Shopify version of its well-known WordPress app on January 4, 2022. Yoast SEO for Shopify costs $29 per 30 days, unlike the WordPress version of the app, which operates under a freemium model.

Why we care. Yoast SEO is one of the most commonly used SEO apps in the WordPress ecosystem and the launch of an app for Shopify speaks to the rise of e-commerce (particularly over the last two years).

This app is primarily aimed at SMBs, just like the available Google and Bing Shopify integrations (more on those below). The proliferation of SMB-oriented apps for merchants makes it easier for smaller retailers to establish an online presence, even if they’re not working with an agency partner. Together, these products could increase overall competition both in shopping and traditional search results.

RELATED: Shopify SEO Guide: How to increase organic traffic to your store

Search visibility for retailers of all sizes is now a thing. Beginning in 2020, e-commerce took on a more crucial role for most people as pandemic-related safety precautions inhibited in-person shopping. That also drove many retailers to offer their goods online, pushing them towards platforms like Shopify.

The search engines picked up on this trend: Google announced its expanded Shopify integration in May 2021 and Bing launched its Shopify integration in December 2021, offering Shopify merchants an easy way to get their products listed in organic shopping results.

Yoast SEO for Shopify offers features that are complementary to those integrations. Instead of enabling merchants to show product listings, it may help them optimize their pages to show in organic, non-shopping results (like the WordPress version of the app).

The same Yoast SEO, but for Shopify. Yoast SEO for Shopify will offer much of the same functionality as its WordPress counterpart. This includes controls for your titles and descriptions in Google Search and social media, feedback on readability and Yoast’s schema graph.

While the functionality remains similar, the price points vary: At launch, Yoast SEO for Shopify will cost $29 per 30 days (after a free 14-day trial). The WordPress version operates under a freemium model, with the premium version costing $99 per year.

Why Yoast is launching a Shopify app. “An app on the Shopify platform is a huge business opportunity,” Thijs de Valk, CEO at Yoast, said. “Shopify is growing fast. It makes sense to build an app and profit from the growth of that specific platform.”

de Valk also cited risk-diversification as a motivator for Yoast’s Shopify app, explaining that the company’s growth up until now has been highly dependent on WordPress.


Calling all search marketers! Check out these exclusive Master Classes

The days of title tag tweaks, meta keywords, and SERPs of ten blue links are long gone. SEO is one of the fastest-evolving aspects of the digital realm… and gaining visibility (not to mention quality traffic) is more complex and challenging than ever before.

If you want to stay ahead of the competition and at the top of the SERPs, it’s time to level up your SEO skills: Attend your choice of SMX Master Class – online March 8-9 – to take the first step on that critical journey.

This spring, we’re offering seven exclusive Master Classes, four of which are designed specifically for SEOs. (Not focused on SEO? Check out the complete Master Class lineup, featuring deep dives on Google Ads, Google Analytics 4, and more.)

SMX Master Classes SEO speakers

  • Advanced SEO training with Bruce Clay is perfect for seasoned SEO professionals craving next-level knowledge, tactics, and best practices that boost visibility, traffic, and revenue.
  • Advanced technical SEO training with Eric Enge dives head-first into sophisticated, technical SEO topics and tactics, including Schema, page experience, Core Web Vitals, and more.
  • SEO-friendly content marketing training with Michael Brenner focuses on the crucial role content plays in SEO, and how you can effectively leverage it to support a cohesive site-wide strategy.
  • SEO for developers training with Detlef Johnson explores how SEOs can implement technical tactics to boost visibility… and how developers can take a more SEO-friendly approach to coding.

At just $199 each, the SMX Master Classes pack a ton of value: Six focused hours of live, expert-led training that deliver actionable tactics you can implement immediately to drive measurable results – and intimate Q&A opportunities with industry legends that address your specific queries and curiosities. Plus, 100% virtual means you can tune in from anywhere – no plane ticket, hotel reservation, or travel risks required.

Ready to register? Smart move. Secure your spot at one of these exclusive Master Classes for just $199!
