Jake Serota, Author at The HOTH

Page Speed and SEO: The Complete Guide
https://www.thehoth.com/blog/page-speed-seo/ (published Tue, 29 Oct 2024)

Three seconds.

That’s roughly how long you have to make a positive first impression on your website’s visitors. 

So, if your site doesn’t load within those three seconds, don’t expect anyone to stick around. 

Instead, they’ll likely click back to the search results and select a website that loads quicker. 

This is why page speed is a crucial SEO factor that you need to master. 

Also, you should never just assume that your website loads properly, as page speed can be tricky to pin down if you’re not paying attention. 

It could be that the majority of your web pages load within an instant, but something like a JavaScript error is causing one of your most important landing pages to render at a snail’s pace. 

In that scenario, a valuable money-making page on your site is rendered useless, and you’d have no clue why until you audited your page speed. 

The #1 way to avoid problems like these is to keep a close eye on your page speed, which is what we’re going to teach you how to do today. 

Stick around to learn how to audit, analyze, and improve the page speed of all your web pages. 

What Does Page Speed Actually Refer to?

Broadly speaking, page speed refers to the amount of time it takes for a web page to fully render on someone’s computer. 

If you’re old enough to remember dial-up internet connections (raises hand), you’ll likely have not-so-fond memories of web pages taking eons to load completely. 

Loading times and internet speeds have improved dramatically since then, which is why modern internet users have little-to-no patience when it comes to waiting for websites to render. 

If your web pages don’t load at the drop of a hat, it’ll likely cause frustration and the user to venture elsewhere on the internet to meet their needs. 

Lots of factors can cause page speed issues, including:

  • Images and videos that are too large 
  • Too much JavaScript and CSS 
  • Traffic volume 
  • Redirects 
  • Web hosting issues 

Slow page speed negatively affects your SEO and user experience, so you have every incentive to ensure it stays lightning-fast. 

The different types of page speed 

Page speed is an umbrella term that encompasses multiple metrics that represent various stages of the loading process. 

In other words, web pages don’t load in one seamless process. Instead, multiple processes take place whenever a web page loads on a user’s screen, including:

  • Time to First Byte (TTFB). This metric measures how long it takes for the first byte of a page’s data to reach the browser after the page is requested. 
  • First Contentful Paint (FCP). This measures how long it takes for a user to see the very first element of a page, like an image or block of text. 
  • Interaction to Next Paint (INP). This refers to how long it takes for a web page to respond to interactions, such as a user clicking on a link or button. 
  • Cumulative Layout Shift (CLS). A layout shift occurs whenever something like an ad causes the layout of your web page to ‘shift’ slightly. The CLS measures all the layout shifts that occur on your site.  
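To make the last metric more concrete: Google scores each individual layout shift as the product of two numbers, the impact fraction (how much of the viewport the unstable elements affect) and the distance fraction (how far they moved, relative to the viewport). Here's a simplified sketch; Google's full definition also groups shifts into "session windows," which we skip here:

```python
def layout_shift_score(impact_fraction: float, distance_fraction: float) -> float:
    """Score one layout shift: the share of the viewport affected by the
    unstable elements, times how far they moved (as a viewport fraction)."""
    return impact_fraction * distance_fraction

def cumulative_layout_shift(shifts) -> float:
    """Simplified CLS: sum the scores of a list of (impact, distance)
    pairs, as if they all fell into a single session window."""
    return sum(layout_shift_score(i, d) for i, d in shifts)
```

So an ad that pushes content covering 75% of the viewport down by a quarter of the screen scores 0.75 * 0.25 = 0.1875 on its own, well above the 0.1 that Core Web Vitals treats as "good."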

These are also the main metrics Google measures with its Core Web Vitals test. 

It runs this test on every website in its index because Google only rewards its highest search rankings to web pages with excellent load times. 

What’s the difference between page speed and site speed? Page speed and site speed are not interchangeable terms. Page speed refers to the time it takes a specific web page to load, while site speed represents the average speed of multiple pages across your site.

How Does Page Speed Impact SEO? 

Google first introduced page speed as a ranking factor way back in 2010, so it’s mattered for SEO for quite some time now. 

An algorithm update in 2018 kicked things up a notch, as it made page speed an even more crucial ranking factor, especially for mobile search rankings. 

Why is fast page speed such a big deal to Google?

It has to do with how slow page speed negatively impacts user experience. 

As a search engine, Google’s reputation hinges on providing the highest quality search results for any given query. Not only do the sites they rank need to be relevant and accurate, but they also need to provide a pleasant user experience. 

If Google ranked websites that load super slow on page one (especially in the top 3), their users would quickly become frustrated. Moreover, they would probably start using another search engine instead (especially if slow sites pop up for everything they search for on Google). 

This is the primary reason why Google places so much importance on page speed for search rankings. 

By only ranking websites that can pass its Core Web Vitals test, Google ensures that its search results are always populated with sites that provide smooth, ultra-fast experiences. 

So, if your web pages aren’t loading as fast as they should, you may notice that you aren’t ranking as high as you’d like. 

The flip side is also true. If your website boasts excellent page speed, you’re more likely to rank on page one (assuming the rest of your SEO is in order). 

How to Test Your Page Speed Using Several Different Methods 

Now that you know that you need to keep an eye on your page speed, how do you do that?

Are you supposed to sit with a stopwatch and time how long each of your web pages takes to load?

No! 

The good news is there are plenty of tools out there that will let you check your page speed, including a free tool from Google itself. 

You can also use tools like Semrush and Ahrefs to check your speed metrics, but they require paid subscriptions. 

Here’s a look at the top ways you can start testing your page speed (and site speed) today. 

Google PageSpeed Insights 

Google provides a free tool that any site owner can use to test their site speed via PageSpeed Insights (which is powered by Google Lighthouse). 

The best part is that the tool uses Google’s real Core Web Vitals metrics, so you get to see how your website performs on the test. 

How does the tool work?

It’s extremely simple. All you have to do is visit the PageSpeed Insights web page and enter the URL for the web page you want to check. 

Once you hit the Analyze button, you’ll get to see a breakdown of how the web page scored on each page speed metric. 

As you can see, these are all the speed metrics that we mentioned earlier. To pass the Core Web Vitals test, your site must score in the green range for each metric. If your scores are yellow or red, it’s a sign that your page speed needs work. 

Another handy feature of the tool is that it helps you diagnose underlying performance issues that affect page speed. 

If you scroll down to Diagnose Performance Issues, you’ll get to see a full report containing your website’s:

  • Performance 
  • Accessibility 
  • Best Practices 
  • SEO 

Each metric will show up as green (good), yellow (needs work), and red (serious issues). 

If you keep scrolling down, you’ll see a list of issues and recommendations to improve your page speed. 

These suggestions are extremely helpful, so you should definitely take a look at them. For example, one of the recommendations is to properly size images, and it even includes how much data you stand to save (75 KiB). 

This tool is one of the most reliable ways to improve your overall site speed, so it’s a must-use for site owners everywhere. 
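If you'd rather pull these scores programmatically (say, to monitor many pages on a schedule), PageSpeed Insights also exposes a public v5 API. The sketch below only builds the request URL; actually fetching it requires network access, and Google recommends an API key for anything beyond light use. The helper name here is our own:

```python
from urllib.parse import urlencode

# PageSpeed Insights API v5 endpoint
PSI_ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

def psi_request_url(page_url: str, strategy: str = "mobile") -> str:
    """Build the query URL for a PageSpeed Insights run. Fetching it
    (e.g. with urllib.request.urlopen) returns a JSON report containing
    Lighthouse and Core Web Vitals data for the page."""
    return PSI_ENDPOINT + "?" + urlencode({"url": page_url, "strategy": strategy})
```

Swapping `strategy` between "mobile" and "desktop" mirrors the toggle you see in the web interface.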

Checking page speed using Ahrefs 

If you’ve got a subscription to Ahrefs, you’ll be able to check your site speed using its Site Audit tool. 

You can even enable official Google Core Web Vitals metrics to show up in the tool, which you need to do in the Crawl settings. 

First, you’ll have to get an API key from Google (the link is in Ahrefs) that will enable Core Web Vitals (and Lighthouse) data to appear in Ahrefs. 

To check your site speed, open the Site Audit tool and select your website’s project. 

If you haven’t added any projects yet, you can learn how to do that here. 

Once you select your project, you’ll get directed to the Overview page. 

Here, you get to see your crawl results and an overall health score for your website. However, we want to check site speed, so navigate to Performance under Reports on the left-hand sidebar. 

This will take you to the Performance report that contains your Core Web Vitals metrics (if they’re enabled), Ahrefs speed metrics, and your load time distribution. 

The Issues tab will let you know if there are any major errors you need to fix, which is similar to the PageSpeed Insights report. 

However, you aren’t given as much insight into what caused the problem (or how to fix it). 

Best Practices for Optimizing Page Speed 

The best way to ensure fast page speed is to optimize your entire website for it, especially your most important pages for SEO. 

For example, it’s extremely important that your informative blogs and high-converting landing pages run flawlessly. 

These pages are crucial for your sales funnel, so they should receive top priority when optimizing your site to run fast. 

Here’s a look at the top best practices to implement on a web page that you want to run screaming fast. 

Compress your images and videos 

Excessive file size is a leading culprit behind slow loading times, so you should aim to make your files as small as possible. 

Images and videos are notorious offenders in this regard, as it doesn’t take much for their file sizes to get out of control. This is especially true for uncompressed high-definition video, as those files can easily take up several gigs if you aren’t careful. 

ImageOptim is a great tool for compressing images without losing quality. For videos, HandBrake is an open-source video transcoder that can convert virtually any video file into a number of different codecs, many of which feature high-quality compression. 

Minimize JavaScript and CSS 

Your website’s code could also be slowing things down, especially if you haven’t minified it. 

What’s that?

Minification is the process of cleaning up excessive JavaScript and CSS code to save space and improve page speed. 

Standard code is filled with white space, line breaks, comments, and other elements that help humans read it but are unnecessary for browsers. 

When you minify code, you get rid of these unnecessary elements which tightens up the code without losing functionality. This makes it possible for computers to read the code much faster, which improves page speed. 

There are lots of tools online that will automatically minify your code for you, like minifier.org. 
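To see what minification actually does, here's a deliberately tiny CSS minifier in Python. This is a sketch for illustration only; real tools like minifier.org handle many more edge cases, such as quoted strings that contain spaces:

```python
import re

def minify_css(css: str) -> str:
    """Strip comments, collapse whitespace, and drop spaces around
    punctuation. The rules still mean the same thing to the browser,
    but the file gets smaller."""
    css = re.sub(r"/\*.*?\*/", "", css, flags=re.S)  # remove /* comments */
    css = re.sub(r"\s+", " ", css)                   # collapse whitespace runs
    css = re.sub(r"\s*([{}:;,])\s*", r"\1", css)     # tighten punctuation
    return css.strip()
```

For example, `minify_css("/* note */\nbody {\n  color: red;\n}")` returns `"body{color:red;}"`, identical in effect but noticeably shorter.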

Get rid of redirect chains 

Using too many redirects (like a 301 or 302 redirect) can slow down your website. A redirect will automatically ‘direct’ users from a broken or outdated URL to a new URL. 

It’s common to use redirects to fix broken links and direct users away from expired services. 

However, too many redirects can slow things down, especially if there are redirect chains. 

A redirect chain occurs whenever users are redirected more than once after visiting a URL. 

For example, let’s say www.yoursite.com no longer works, so you redirect users to www.yoursite2.com. However, you recently migrated to a new domain, so you added another redirect to www.yoursite3.com. 

At this point, visiting www.yoursite.com will cause the browser to redirect to yoursite2, and then AGAIN to yoursite3, which will obviously slow things down. 

You may build a redirect chain without even knowing it, so it’s crucial to regularly audit your links. 
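If you keep your redirects in a simple old-URL-to-new-URL map, spotting chains is mechanical. Here's a sketch (the function name and data shape are our own, not from any particular tool):

```python
def find_redirect_chains(redirects: dict) -> list:
    """Given a mapping of old URL -> new URL, return every path that
    hops through more than one redirect (i.e. a redirect chain)."""
    chains = []
    for start in redirects:
        path = [start]
        while path[-1] in redirects:
            nxt = redirects[path[-1]]
            if nxt in path:          # guard against redirect loops
                break
            path.append(nxt)
        if len(path) > 2:            # more than one hop = a chain
            chains.append(path)
    return chains
```

The yoursite-to-yoursite2-to-yoursite3 example above would come back as a single three-step chain, telling you to point the first URL straight at the final destination.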

Here’s a guide on how to use Screaming Frog’s SEO Spider to identify redirect chains on your website. 

Improve Page Speed for Better Rankings and User Experience 

Let’s briefly recap what we’ve covered so far:

  1. Page speed is a crucial ranking metric on Google and other search engines 
  2. You can use Google’s PageSpeed Insights to keep track of your page speed 
  3. Minifying code, compressing images, and reducing redirects are all ways to improve page speed

As long as you follow these best practices and regularly audit your page speed, you shouldn’t have much trouble maintaining a super-fast website. 

Do you need help with the technical SEO for your website?

One of our Technical SEO Audits will give you a comprehensive report containing every technical issue that could harm your SEO. 

We’re always willing to fix what we find, so don’t wait to reach out for a free SEO consultation to discuss your needs in more detail!      

URL Structure: Must-Know Best Practices to Improve SEO
https://www.thehoth.com/blog/url-structure/ (published Fri, 25 Oct 2024)

Do your URLs look like this:

www.yoursite.com/blog/random-blog-post 

Or this?

www.yoursite.com//d/1mrSXWNhuvDxzio44Ky60o9Mbh6pvejmkYhqRniMtsyU/edit

If the answer is the latter, your URL structure needs some tightening up. 

Long, complicated URLs like the one above are inconvenient for users and search engine algorithms. 

For users, complicated URLs make it more difficult for them to reach the web pages they want. Imagine trying to memorize the garbled mess of letters and numbers in the example provided above!

Compare that with the first URL, which you can easily remember and type into a web browser. 

For search engine crawler bots, concise, logical URLs make your website easier to crawl and index. 

However, a URL structure optimized for SEO stretches beyond simplifying the web address. 

You’ll also want to add target keywords, remain consistent with categories and page names, and use HTTPS protocol for ideal security. 

There’s quite a bit to know about URL structure, including the components that make up a URL, so it can seem a tad daunting for newcomers. 

That’s why we’re here to teach you everything you need to know about creating and maintaining an SEO-friendly URL structure, so stick around! 

What is a URL?

The internet is home to well over a billion websites (1.98 billion as of 2024), so it’s not exactly a place you can navigate on your own. 

Instead, you need a way to reliably find the exact website, web page, or file you’re looking for – which is what URLs provide. 

An acronym for Uniform Resource Locator, a URL is an address that lets you directly access content on the internet. As the name implies, it’s a tool you use to locate a unique resource in the vastness of the internet. 

URLs are most commonly used for accessing websites and web pages using HTTP protocols on web browsers, but that’s not their only use. 

You can also use URLs to access your default email client (mailto) and upload and download files (file transfer protocol or FTP). We’ll discuss URL protocols more in a bit, but that covers the basics for now. 

Web browsers work by letting users enter URLs into their address bar. If a user knows the URL for the website they want to visit, they can type it directly into the address bar. 

If they don’t know what the exact URL is, they can use a search engine like Google to find what they’re looking for. An example would be searching for Netflix on Google because you aren’t sure what the URL is. In this case, clicking on Netflix’s search result will hyperlink you to their URL. 


Understanding the Components of a URL

Even the simplest URLs contain syntaxes like ‘://’ and suffixes like ‘.com’ or ‘.net,’ which may seem confusing if you don’t know what they mean.

Every URL follows a uniform structure consisting of at least three mandatory parts, but there can be up to 10. 

Once you know what each part represents, URLs become much easier to understand. 

Here’s a breakdown of all 10 URL components. 

Component #1: Protocol (also called scheme) 

The first building block of a URL is a protocol, also called a scheme. 

What’s that?

In a nutshell, a protocol is a set of rules for how a connection between a web server and a browser should be established. 

The most common protocols include:

  • HTTP, which stands for Hypertext Transfer Protocol. You’ve seen this protocol in action whenever a website begins with http:// (although https has pretty much taken over at this point). With http, your browser sends a request to the web server for the resource represented by the URL address. If the resource is available, it will appear on your screen. For many years, this was the most common way to access websites and web pages from browsers. The main flaw of this protocol is that it’s not secure; data travels in plain text, so anyone who intercepts the traffic can read it. 
  • HTTPS, which stands for Hypertext Transfer Protocol Secure. This protocol works in the exact same way as http, but it has the added benefit of being encrypted (hence the secure part of the name). This makes it much more difficult for someone to intercept the data. Https has become the norm in recent years, and Google strongly recommends it for all websites to ensure users’ privacy and security. It’s also a must for e-commerce stores where users enter their personal financial information. 
  • Mailto. You can use this protocol to automatically open your default email client and prep a message to a recipient that you specify in the URL. An example would be entering ‘mailto:recipient@gmail.com’ into Firefox. Once you press the Enter key, your default email client will open a new message with the recipient field populated with the address you put in the URL (in this case, recipient@gmail.com). 
  • FTP (file transfer protocol). If you need to transfer files from one system to another, you can use FTP. It’s a protocol that enables file sharing between two servers, and you can both download and upload files with it. It’s most commonly used with dedicated FTP clients for tasks like uploading files to a web server, although most modern web browsers have dropped direct FTP support. 

The protocol always comes at the very beginning of a URL, whether it’s http, ftp, mailto, or something else. 

Component #2: Subdomain 

Not every URL will contain subdomains, as it’s not a mandatory component of a URL. A subdomain is a way of dividing your website into different sections. 

Subdomains are often used to manage a distinct part of your website that requires its own hierarchy of interconnected pages. 

For example, it’s common to use subdomains if your website also has an online store or blog. By separating your online store away from the rest of your website with a subdomain, you can contain and manage all its inner pages in one location. 

Also, users won’t have to visit a different website to purchase your merch or read your blog posts. 

Here’s what a subdomain looks like in the context of a URL:

https://blog.yoursite.com/how-tos/gardening-tips 

As you can see, the subdomain ‘blog’ appears before the primary domain name, and directly after the protocol. 

Subdomains function as mini-websites that are connected to your main site via the URL. 

So, it’s wise to use subdomains for any part of your website that could function as its own site, such as:

  1. Online stores 
  2. Blogs 
  3. Offering separate services for business clients and consumers (starting a subdomain for B2B clients) 
  4. Job boards 
  5. Support platforms 
  6. Data analytics platforms 
  7. Versions of your site in a different language (or for a different geographical region) 

You aren’t limited to just one subdomain, either. Websites can have up to 500 subdomains, so it’s entirely possible to have them set up for a blog, online store, customer support, and analytics platform all at the same time (with plenty of slots to spare, too). 
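As a rough illustration of where the subdomain sits, here's a naive way to peel it off a hostname in Python. This is only a sketch: it assumes a two-label registered domain like yoursite.com, so it breaks on multi-part TLDs like .co.uk, where a real implementation would consult the Public Suffix List instead:

```python
def subdomain_of(hostname: str) -> str:
    """Return everything left of the last two labels of a hostname,
    e.g. 'blog' from 'blog.yoursite.com'. Naive: assumes the registered
    domain is exactly two labels."""
    labels = hostname.split(".")
    return ".".join(labels[:-2]) if len(labels) > 2 else ""
```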

Component #3: Second-level domain (SLD) 

Moving down the chain, a URL’s second-level domain is the main name of your website. 

Here’s what it looks like in a URL (‘yoursite’ is the SLD):

https://www.yoursite.com

Your second-level domain lets people know that the website belongs to your brand, and is often a version of the brand name (although that’s not always the case). 

It’s best to use a short, memorable SLD that your target audience can easily memorize and enter into a web browser. 

For example, our URL is https://www.thehoth.com. 

Our SLD is literally our brand name (The HOTH), making it extremely easy for our clients and prospects to find us online. 

Component #4: Top-level domain (TLD) 

Okay, now we’re on to the final domain-related URL component, which is the top-level domain. 

Your TLD specifies the type of entity you’re registered as on the internet. The most well-known TLD is ‘.com,’ which is a shortened version of the word ‘commercial.’

An example would be:

https://www.yoursite.com

This is because the TLD .com was originally intended for commercial organizations, which is why most businesses register with the .com TLD. 

Other popular top-level domains include:

  • .net. Derived from the word ‘network,’ .net is a TLD that’s open to anyone to use (similar to .com). It’s a common choice for technology companies due to the connotations of the word network. 
  • .edu. The TLD .edu was developed in 1985 to provide a TLD for institutions focused on education. This TLD is NOT available to anyone, as the site owner must be a US post-secondary institution officially accredited by the US Department of Education. 
  • .gov. Another restricted TLD, .gov is only available for official government entities. To receive a .gov TLD, site owners must meet eligibility requirements and send in a letter for authorization. 

There are lots of other TLDs, such as .co and .biz, but they’re far less common (and seem spammy). 

Component #5: Subdirectory 

Next, a subdirectory is a reference that lets search engine crawlers and users know where they are in the greater context of a site. 

For instance, if you were browsing the Jackets section of an online clothing store, the subdirectory in the URL would most likely read ‘jackets.’ 

Here’s an example of a subdirectory in a URL (the subdirectory is ‘jackets’):

https://shop.yoursite.com/jackets/red-jacket-large 

The subdirectory lets you know that you’re in the Jackets section of the online store, which is A) a useful navigational resource for users and B) a way to keep the pages on your site organized. 

Component #6: Port 

The port in a URL is a number designating a specific gateway (which is why it’s called a port) that directs traffic to your website. 

You can think of it as a door that lets users pass through to your website’s content. 

Most of the time, ports don’t appear in URLs because they use standard ports assumed by the most commonly used protocols, like HTTP (port 80) and HTTPS (port 443). 

For reference, here’s a port number inside a URL (it appears after the hostname, separated by a colon):

https://shop.yoursite.com:443/jackets/red-jacket-large

Here, 443 is the standard port for the HTTPS protocol. Since browsers assume each protocol’s default port automatically, most of the time the port won’t show up in the URL.

Component #7: Path 

By now, we’ve got a port to go through, but we don’t have a map to our specific destination online. 

This is what the path component is for, as it maps out the route you need to take to reach your requested resource. 

So, if the resource you want to access is a large red jacket, this is what the path would look like in a URL:

https://shop.yoursite.com/jackets/red-jacket-large

This lets the web browser know that you’re trying to navigate to the web page selling a large jacket in the color red. 

Component #8: Query 

You may notice question marks appearing in search URLs, but why are they there?

A question mark in a URL represents a query string that defines a set of parameters for the data you’re trying to retrieve from the website. 

They most commonly appear in URLs for search queries on engines like Google and Bing. 

We’ll move on to parameters next, but here’s what a query string looks like:

https://shop.yoursite.com/jackets?color=red&size=large

Remember, the question mark begins the query string (also called parameters), so everything you see after a question mark in a URL is a parameter. 

Component #9: Parameters 

Parameters modify the content of a page based on key and value designations. The key specifies what you want to change, and the value sets the criteria for the modification. 

Here’s a quick example:

https://www.yoursite.com/blog?category=gardening 

Here, the key is ‘category’ and the value is ‘gardening,’ meaning the search will only display gardening articles from your blog. 

Component #10: Fragments 

Lastly, the final component is called fragments, which are special codes at the end of a URL that direct users to a specific part of a web page. 

They’re marked with a hash symbol (#) and indicate an exact location on a web page, such as the header or footer. 

Here’s an example (the fragment is ‘#footer’):

https://www.yoursite.com/blog?category=gardening#footer

In this scenario, you would be directed to the very bottom of the page after clicking on the link. 

Not every URL will contain all 10 components, but it’s useful to know what they mean. 
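Conveniently, you don't have to dissect URLs by hand: Python's standard library can pull most of these components apart for you. The URL below is a made-up example that uses a non-default port so the port actually appears:

```python
from urllib.parse import urlsplit, parse_qs

url = "https://blog.yoursite.com:8443/how-tos/gardening-tips?category=gardening#footer"
parts = urlsplit(url)

print(parts.scheme)           # protocol: 'https'
print(parts.hostname)         # subdomain + SLD + TLD: 'blog.yoursite.com'
print(parts.port)             # port: 8443 (None when omitted)
print(parts.path)             # subdirectory/path: '/how-tos/gardening-tips'
print(parse_qs(parts.query))  # parameters: {'category': ['gardening']}
print(parts.fragment)         # fragment: 'footer'
```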

Best Practices for SEO-Friendly URLs 

Now that you know all the building blocks that comprise a URL, it’s time to learn how to make them SEO-friendly. 

Google and other search engines have certain preferences for URLs, such as using the HTTPS protocol for enhanced security. 

By adhering to what Google wants with your URL structure and hierarchy, it becomes SEO-friendly, and you may see better rankings as a result. 

More importantly, maintaining SEO-friendly URLs will ensure that your website is easy for search engine bots to crawl and index, ensuring that your most important content continues to appear on the SERPs (search engine results pages). 

To ensure you have an SEO-friendly URL structure, follow these best practices. 

Implement a clear URL hierarchy 

A URL hierarchy refers to the structure you use for naming web page URLs. 

In order to appeal to both web users and search engines, you should employ a uniform naming structure that follows the same rules across the board. 

You also need to segment your content into categories to make your inner pages easier to navigate. 

Here’s a quick example of what your URL hierarchy might look like:

  1. A user begins on your homepage; let’s say it’s www.yoursite.com. 
  2. Next, they click on a hyperlink called Services, which takes them to this subdirectory: www.yoursite.com/services 
  3. They’re interested in consulting, which has its own series of pages. Clicking on Consulting takes them to www.yoursite.com/services/consulting
  4. Finally, they go to a specific consulting page about developing a strategy. The last page they visit has a URL like this: www.yoursite.com/services/consulting/strategy

As you can see, the naming device remains consistent, and each subcategory follows a logical structure (parent pages and child pages). 

Avoid non-ASCII characters 

When naming your URLs, it’s a good idea to avoid non-ASCII characters as much as possible. 

What are those?

ASCII (American Standard Code for Information Interchange) characters include the alphabet from A – Z, numbers 0 – 9, and most basic punctuation characters. 

Non-ASCII characters, then, refer to anything outside of those parameters. 

This means you should avoid symbols, accented letters, and characters from other languages when naming your URLs, otherwise you could confuse users and crawler bots. 

Use short, simple URLs 

The more complicated a URL is, the more likely it is to cause confusion. 

Even without considering web users and search engine crawlers, using a complicated naming structure has the potential to confuse you, too. 

Also, Google can’t display long URLs in its search results, so it abbreviates them. 

That’s why it’s best to keep your URLs short and sweet. 

Include relevant keywords 

It’s an SEO best practice to include target keywords inside URLs, although this has more of an impact on Bing than it does on Google. 

Regardless, using keywords in your URLs still yields some SEO benefit on Google, and it helps users know what your content is about. 

For example, if you have a blog post called Top 5 Digital Marketing Tips for This Year, and the target keyword is ‘digital marketing tips,’ you could use a URL like this:

www.yoursite.com/blog/digital-marketing-tips 

Now you have the keyword working double duty, both in your blog’s title and the URL (don’t forget to include it in the title tag, too). 

Use lowercase letters and hyphens to separate words 

There’s a bit of an unwritten rule in URL structure, and it’s to use hyphens to separate words instead of underscores. 

Good: www.yoursite.com/digital-marketing-tips

Bad: www.yoursite.com/digital_marketing_tips 

Sure, underscores can work fine for your image gallery on your PC, but stick to hyphens when creating web page URLs. 

This is because not every search engine recognizes underscores as word separators, but they all recognize hyphens. 
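These rules (lowercase, hyphens, ASCII only) are easy to automate when you generate slugs from page titles. A minimal sketch; note that it simply drops accented characters rather than transliterating them:

```python
import re

def slugify(title: str) -> str:
    """Turn a page title into a lowercase, hyphen-separated URL slug,
    replacing every run of characters outside a-z and 0-9 (including
    punctuation and non-ASCII letters) with a single hyphen."""
    slug = title.lower()
    slug = re.sub(r"[^a-z0-9]+", "-", slug)
    return slug.strip("-")
```

For example, `slugify("Top 5 Digital Marketing Tips for This Year!")` gives `"top-5-digital-marketing-tips-for-this-year"`, ready to drop into your URL structure.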

Place redirects on any URL that you change 

URL structure isn’t set in stone, and you’ll likely make some changes to it down the line. Reasons for this include:

  • Redesigning or restructuring your website
  • No longer offering a particular product or service 
  • Changing domains 

Whatever the reason may be, you must use a 301 redirect to permanently direct users to the new, changed URL. 

Otherwise, your website will start to get riddled with broken links, which is never a good thing. 

However, bear in mind that using too many redirects will slow your website down, so do your best to use them sparingly. 
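How you declare a 301 depends on your server or CMS. On Apache servers, for example, a one-line mod_alias rule in your .htaccess file does it (the paths here are placeholders):

```apache
# Permanently send visitors (and link equity) to the new location
Redirect 301 /old-page https://www.yoursite.com/new-page
```

Most CMS platforms and hosts offer an equivalent setting, so check your own stack's documentation before editing server files directly.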

Implement an SEO-Friendly URL Structure Today 

To recap what we’ve covered so far:

  1. A URL is a Uniform Resource Locator that helps you find specific files, websites, and web pages online. 
  2. URLs can have up to 10 components, but only 3 are essential, which are a protocol, domain name, and path. 
  3. Implementing an SEO-friendly URL structure can help your site rank better and ensure stronger crawling and indexing. 

URL structure is often overlooked by digital marketers and website owners, so don’t forget to optimize your URLs to achieve maximum SEO effectiveness. 

Do you need help managing your website’s URL structure and other technical SEO factors?

If so, one of our Technical SEO Audits is exactly what the doctor ordered. We’ll give your site a soup-to-nuts audit and provide you with a detailed report of every issue that we find. If you don’t feel like handling the fix yourself, our experts will take the wheel, so don’t wait to try it out!        

Hreflang and Canonical Tags: The Right Ways to Use Both https://www.thehoth.com/blog/hreflang-canonical-tags/ https://www.thehoth.com/blog/hreflang-canonical-tags/#respond Tue, 10 Sep 2024 14:58:30 +0000 https://www.thehoth.com/?p=36741 Two HTML elements that have the ability to drastically affect your SEO (in both good and bad ways) are hreflang and canonical tags.  Improper use of either could cause incorrect pages to appear for users, or for Google to index an incorrect version of a web page.  Complicating things further, many SEOs confuse hreflang and […]

The post Hreflang and Canonical Tags: The Right Ways to Use Both appeared first on The HOTH.

]]>
Two HTML elements that have the ability to drastically affect your SEO (in both good and bad ways) are hreflang and canonical tags

Improper use of either could cause incorrect pages to appear for users, or for Google to index an incorrect version of a web page. 

Complicating things further, many SEOs confuse hreflang and canonical tags since they both serve similar purposes. 

Both HTML tags tell search engines which version of a page to index, but the similarities end there

Here’s a simple breakdown of the difference:

  1. Hreflang (pronounced H-ref-lang, by the way) tags determine which language your site appears in for search users. These tags only apply if you have multiple versions of your site in other languages
  2. Canonical tags are a way to deal with duplicate content, like two extremely similar versions of a product page (one for each color). A canonical tag lets search engines know which version of the page to include on search engines. Without using canonical tags, search engines won’t know which version to rank, so they’ll alternate between the two (causing major dips in traffic). 

An easy way to remember the difference is that hreflang deals with languages, and canonical tags deal with duplicate content

Besides knowing when to use which, there’s a proper way to implement hreflang and canonical tags in your content. If you don’t format them correctly, they could wind up having no impact at all. 

That’s why we put together this article breaking down the proper way to use hreflang and canonical tags, so stay tuned. 

What are Hreflang and Canonical Tags?

Technical SEO dives into the code and server level of your website, including your HTML and CSS. 

Since it’s more ‘technical’ in nature, this aspect of SEO can intimidate some marketers. 

However, you only need a basic grasp of HTML to understand how to properly use things like title tags, hreflang tags, and canonical tags. 

The best way to understand hreflang and canonical tags is to provide an example of each, so let’s do just that. 

What are hreflang tags?

First, let’s break down what the term ‘hreflang’ actually means. 

At first glance, it may seem like a random hodgepodge of letters, but it’s shorthand for Hypertext Reference Language

H-ref, or hypertext reference, is an HTML attribute that specifies the destination of a link

Lang is an abbreviation for language.

Therefore, hreflang refers to link destinations related to languages

If you recall the definition from the intro, a hreflang tag lets search engines know which language version of your site to display to users on search engines. 

For example, you can set a hreflang tag to display the German version of your site whenever someone conducts a search in German. 

That way, they’ll automatically see the version of your site in their language, which provides a pleasant user experience. 

You can see hreflang tags in action by searching for popular brand names + specific countries. 

Here’s what happens when we search for ‘Apple Germany’ on Google:

As you can see, the German version of Apple’s website (https://www.apple.com/de/) appears in the search results, and it’s because they’ve set up the appropriate hreflang tag to do so. 

A hreflang tag in action

Here’s an example of what a hreflang tag looks like in HTML:

<link rel="alternate" href="https://www.yoursite.com/de/" hreflang="de" />

Let’s break down each element so you can understand what’s happening here. 

‘Link rel’ means Link relation, and in this case, ‘alternate’ means we’re providing an alternative link destination. 

H-ref means hypertext reference (link destination), and your website comes after. 

Lastly, hreflang="de" specifies the language: "de" is the ISO 639 language code for German (Deutsch). 

So, putting it all together, this line of HTML code means you’re providing an alternative link destination that’s a version of your website in German

We’ll dive more into the specifics of using hreflang tags in a bit, but that’s the basics. 
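
Putting it all together, a complete hreflang setup lists every language version of the page plus an x-default fallback, and the same set of tags must appear on every version (hreflang tags are reciprocal). Here's a sketch with placeholder URLs:

```html
<!-- Placed in the <head> of EVERY language version of the page -->
<link rel="alternate" hreflang="en" href="https://www.yoursite.com/" />
<link rel="alternate" hreflang="de" href="https://www.yoursite.com/de/" />
<link rel="alternate" hreflang="fr" href="https://www.yoursite.com/fr/" />
<!-- Fallback for searchers whose language isn't listed above -->
<link rel="alternate" hreflang="x-default" href="https://www.yoursite.com/" />
```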

What are canonical tags?

Canonical tags, on the other hand, help you avoid duplicate content issues on search engines like Google. 

You may have heard the term ‘canon’ to describe fictional movies, novels, and TV shows. The official narrative for a franchise is considered ‘canon’ (official), while things like fan fiction and spinoffs are non-canon (not part of the official narrative). 

Canonical tags follow the same concept, as the page you designate as ‘canon’ is the one Google will display in its search engine results. All other versions of the web page will remain unindexed, but users will still be able to access them on your website. 

A common example is when an eCommerce store features nearly identical web pages for different sizes and colors of their products. 

Consider a product page for a baseball cap that comes in three colors: black, red, and blue. 

There’s a version (or dynamic filter) of the page for each color, and they all contain the same keywords. 

This confuses search engines, as they won’t know which version of your page to include in the search results. 

By including a canonical tag on the black version, you’re telling Google’s crawler bots to include it in the rankings and ignore the rest (we’ll dive more into the specifics of making this happen later). 

Duplicate content, when left unchecked, can wreak serious havoc on your SEO performance. 

How is that?

It causes crawler bots to become confused when indexing your pages since you have two nearly identical pages attempting to rank for the same keyword. 

This causes the bot to do one of two things. It will either A) ignore both pages, causing your content to disappear from the SERPs or B) alternate ranking the two pages, causing both to periodically disappear. Neither outcome is desirable, which is why canonical tags are so useful. 

Here’s an example of what a canonical tag looks like in HTML code:

<link rel="canonical" href="https://www.yoursite.com/baseball-cap-black/" />

To break that down, the link relation (link rel=) this time is canonical, meaning you're telling search engines which web page to include in their search rankings. 

The next part of the code tells the bot which page is canonical, which, in this case, is the product page for the black baseball cap. 

As a result, Google will only include this web page in its results, and the web pages for other colors of the hat are just for customers already on your site. 
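
To make that concrete, every color variant carries the same canonical tag pointing at the black version, and the black page itself gets a self-referencing canonical. A sketch with placeholder URLs:

```html
<!-- In the <head> of /baseball-cap-red/ and /baseball-cap-blue/ -->
<link rel="canonical" href="https://www.yoursite.com/baseball-cap-black/" />

<!-- In the <head> of /baseball-cap-black/ itself (self-referencing) -->
<link rel="canonical" href="https://www.yoursite.com/baseball-cap-black/" />
```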

How do Hreflang and Canonical Tags Impact SEO?

Depending on the nature of your website, you may not have the need to use either hreflang or canonical tags. 

For instance, if your business only operates in America, you won’t need hreflang tags because you won’t have versions of your site written in other languages. 

If the same site has no issues with duplicate content (maybe it’s a simple blog with only a handful of posts so far), canonical tags aren’t necessary, either. 

Strong web design is all about keeping things simple, so you shouldn’t complicate your HTML with unnecessary tags. 

Keeping that in mind, here are the main ways canonical and hreflang tags will impact your SEO if you use them. 

How do canonical tags affect SEO?

You should only use canonical tags if you suspect you have issues with duplicate content

Some common instances of duplicate content include:

  • Dynamic pages that let users filter and sort through many options (like multiple colors and sizes of a product) 
  • Using the same content across multiple sites 
  • Engaging in a content syndication strategy 
  • The same content appears across multiple categories on your site 

These are all cases where it makes sense to use canonical tags to avoid duplicate content impacting your SEO. 

However, there are many instances where it makes sense to use a 301 redirect instead of a canonical tag. An example would be if you have two URLs that direct users to your homepage:

  1. https://www.yoursite.com 
  2. https://yoursite.com 

In this case, it’s better to use a 301 to redirect users to your preferred version. This prevents two versions of your homepage from floating around the web, which helps keep things simple. 

In general, you should opt for 301 redirects whenever possible since they’re a simpler solution than using canonical tags. 

Here are some ways canonical tags will positively impact your SEO when implemented properly:

  • Consolidated link equity. If you don't use a canonical tag, Google will try to rank each version of a web page, which means spreading the link equity from backlinks. Whenever you designate one page as the official version for search engines, all the link equity focuses on it, making it a stronger contender in the search rankings.  
  • Improved crawling efficiency. Besides eliminating the risk of duplicate content, canonical tags also make the crawling process run smoother. This is because search engines will know which pages to index and which to ignore. 

How do hreflang tags affect SEO?

Hreflang tags direct users to the appropriate language version of your website, which improves your user experience. 

There are different ‘ISO 639 language codes’ which let search engines know the language you want to display. We already mentioned German (de) before, but here’s a comprehensive list of the rest of them

Hreflang tags will positively impact your SEO in the following ways:

  • Lower bounce rates. If a German user clicks on your site and it’s displayed in English, you can bet with almost 100% certainty that they’ll click back to the search results to choose another website instead. This isn’t good news for your bounce rates, and hreflang tags are the remedy. 
  • Higher CTR rates. This goes hand-in-hand with the first point. Since your content will appear in the correct language, more users will click through to your website on search engine results pages (SERPs). 
  • Content differentiation. There may be content that appeals to certain cultures more than others, and hreflang tags give you the freedom to split things up. For example, there may be content that you only include on the French version of your site because you know it only appeals to a French audience. 

Don’t Cross the Wires: How to Implement Hreflang and Canonical Tags 

Now, let’s learn how to properly use hreflang and canonical tags in your web pages. 

You’ll have several options when it comes to implementation, including:

  1. Entering them in the <head> section of your HTML 
  2. Using them in your sitemap 
  3. Using tools to implement them 

Here’s how to knock out all three. 

Entering hreflang and canonical tags into the <head> section 

Both hreflang and canonical tags will only count if they appear in the <head> section of your HTML, so don’t include them anywhere else. 

If you're using a CMS like WordPress, you need to navigate to your header.php file. You can get to it either by using FTP (File Transfer Protocol) or by going to Appearance > Theme Editor. 

Once there, either add a rel=canonical or hreflang=(language code) to the <head> section to activate the tags. Ensure everything is formatted properly, as so much as an unnecessary space can throw everything out of whack. 
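
For reference, here's roughly what a correctly placed pair of tags looks like (the URLs are placeholders, and a real theme's <head> will contain much more than this):

```html
<head>
  <title>Black Baseball Cap | Your Site</title>
  <!-- Both tags must sit inside <head> to count -->
  <link rel="canonical" href="https://www.yoursite.com/baseball-cap-black/" />
  <link rel="alternate" hreflang="de" href="https://www.yoursite.com/de/baseball-cap-black/" />
</head>
```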

Using canonical tags and hreflang tags in your sitemap 

If you’d rather not mess with your website’s code, you can opt to add canonical and hreflang tags to your sitemap instead. 

The XML sitemap method tends to work better for larger websites, as manually editing the HTML code of every web page can be a daunting (or nearly impossible) task for sites with thousands of web pages. 

However, the tradeoff is that it requires a trickier setup than simply adding HTML tags. 

Once you've created a sitemap, you have to add hreflang annotations to every URL it lists. 

Google has some highly specific instructions for this, so be sure to follow their guidance to the letter. 

While adding hreflang tags to each page on your sitemap is still time-consuming, it’s far quicker than manually entering HTML tags since the sitemap provides a centralized location for all your URLs. 
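
Following Google's documented sitemap format, each <url> entry repeats the full set of language alternates via xhtml:link elements. A sketch with placeholder URLs:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xhtml="http://www.w3.org/1999/xhtml">
  <url>
    <loc>https://www.yoursite.com/</loc>
    <!-- Every language version is listed on every entry (reciprocal) -->
    <xhtml:link rel="alternate" hreflang="en" href="https://www.yoursite.com/" />
    <xhtml:link rel="alternate" hreflang="de" href="https://www.yoursite.com/de/" />
  </url>
  <url>
    <loc>https://www.yoursite.com/de/</loc>
    <xhtml:link rel="alternate" hreflang="en" href="https://www.yoursite.com/" />
    <xhtml:link rel="alternate" hreflang="de" href="https://www.yoursite.com/de/" />
  </url>
</urlset>
```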

Using tools to make implementing HTML tags quicker and easier 

If you really don’t want to waste any time implementing canonical and hreflang tags on your website, there are plenty of tools that will automate the process for you. 

An example is the Yoast SEO WordPress plugin, which will automatically add canonical tags to your URLs without you having to do anything. 

For hreflang tags, GeoTargetly offers a free hreflang tag generator you can use to automatically create the appropriate hreflang tag for each web page. 

Implement Canonical and Hreflang Tags to Improve Your SEO Performance 

Those are the proper ways to use hreflang and canonical tags on your website to improve your SEO. 

Both tags will enhance your user experience and make the crawling process easier for search engines, so it’s essential to use them when it’s appropriate. 

Remember that it’s not always necessary to use canonical tags if you don’t have any issues with duplicate content. Also, if your business isn’t international, you probably won’t need hreflang tags either (unless your country features multiple languages). 

Do you need help mastering the technical aspects of your SEO?

One of our Technical SEO Audits is just what the doctor ordered, so don’t wait to try the service out!    


]]>
https://www.thehoth.com/blog/hreflang-canonical-tags/feed/ 0
Breadcrumb Navigation for Better SEO: A Complete Guide https://www.thehoth.com/blog/breadcrumb-navigation-seo/ https://www.thehoth.com/blog/breadcrumb-navigation-seo/#comments Tue, 03 Sep 2024 09:33:28 +0000 https://www.thehoth.com/?p=36749 You know those maps at large shopping malls that say “YOU ARE HERE” so that patrons know where they are in the complex? Well, breadcrumb navigation serves the same purpose online, except it’s for your website instead of a mega mall.  A ‘breadcrumb trail’ is a series of hyperlinks that A) lets users know where […]

The post Breadcrumb Navigation for Better SEO: A Complete Guide appeared first on The HOTH.

]]>
You know those maps at large shopping malls that say “YOU ARE HERE” so that patrons know where they are in the complex?

Well, breadcrumb navigation serves the same purpose online, except it’s for your website instead of a mega mall. 

A ‘breadcrumb trail’ is a series of hyperlinks that A) lets users know where they are in the website’s hierarchy and B) provides quick access to all the previous pages in their path. 

They’re most commonly used on websites that feature lots of subcategories and inner pages, such as eCommerce stores. 

Here’s a quick example of what a breadcrumb path looks like on Macy’s website:

This group of hyperlinks represents where we are in Macy’s website hierarchy, as we’re currently on Men’s Shirts & Tops, which is 2 links away from the general Men’s clothing category. 

These navigational tidbits are extremely helpful for both users and search engine crawler bots

In other words, your human visitors aren’t the only ones who will appreciate breadcrumb links. Whenever a search bot crawls your web page, breadcrumbs help them better understand your site’s structure.

Breadcrumbs work best on websites with deep site architecture, as they don’t serve much purpose on sites where each page is only a click or two away from the homepage. 

For sites with deep architecture, breadcrumbs are pretty much essential, so you should know how to implement them. 

This guide will teach you everything you need to know about breadcrumb navigation, so stay tuned! 

The Three Different Types of Breadcrumbs 

We’ve already mentioned that a breadcrumb trail lets users know where they are in your site’s hierarchy, but they can take three distinct forms

There are:

  1. Location-based breadcrumbs
  2. History-based breadcrumbs
  3. Attribute breadcrumbs 

Let’s analyze each one. 

Location-based breadcrumbs 

The most common type of breadcrumb trail is location-based, meaning it provides a series of links to the ‘parent’ pages a user visited prior to the page they’re currently on. 

What are parent pages? A ‘parent’ page refers to web pages at the top of your site hierarchy. They contain ‘child’ pages, which tend to be subcategories for products but can be any type of inner page. 

The Macy’s example in the intro is a location-based breadcrumb trail, as are most breadcrumbs that you’ll find online. 

The basic formula for a location-based breadcrumb trail looks like this: Services / SEO / Blog Content

As you can see, the trail consists of links to each parent page before the next subcategory. 

Here’s an example on Barnes & Noble’s website:

History-based breadcrumbs 

A history-based breadcrumb trail has more in common with a web browser’s back button than it does with location-based breadcrumbs. 

The catch is that when users go back, they will retain their spot in the site’s hierarchy – such as going back to Men’s Casual Shoes instead of all the way back to Men’s Shoes. 

Clothing websites make heavy use of history-based breadcrumbs for this reason. Here’s another example from Macy’s:

Attribute breadcrumbs 

The final type of breadcrumb trail you can use is attribute-based, which is a tad different from the other two. 

Rather than keeping track of the web pages a user visited, this type of breadcrumb trail keeps track of the attributes they selected on a page – most commonly product attributes. 

An example would be breadcrumbs recounting product attributes like sizes, colors, and other specifications. 

Here are attribute-based breadcrumbs at work on Office Depot’s website for scanners:

As you can see, we have attributes selected for the brand (Epson), color (black), and scanner type (documents) – all of which are represented by attribute breadcrumbs at the top of the page. 

How Breadcrumbs Impact Your SEO 

You may be wondering how breadcrumb trails can affect your search rankings. 

After all, aren’t they primarily for improving your user experience and crawlability?

Yes. 

Then, how does that translate to higher (or lower) search rankings? 

To learn why, you need to understand how Google ranks search results. 

While the company keeps its search algorithms a secret, we do know what some of the most important ranking factors are (and recently leaked documents have proven these to be true):

  • Authority. Google typically only grants top rankings (the top 3) to websites that it views as authoritative, meaning it trusts them. The way you build that trust is by building backlinks from reputable websites in your industry. Check out our Learning Hub post to learn more about the importance of link-building. 
  • Relevance. Irrelevant websites serve no purpose to users, so your content must be relevant to the keyword to rank. 
  • Quality. One of the most important ranking factors is the quality of your website’s content. Factors that contribute to high-quality content (at least to Google’s quality raters) include original insights, first-hand experiences, demonstrating expertise, and more. High-quality content should also contain at least 1,000 words and some type of visual aid (such as high-resolution images or videos). 
  • Usability. Who wants to visit a website that won’t load properly? The answer is no one, which is why Google includes usability as a ranking factor. 

Here’s a quick look at the two ranking factors that pertain to breadcrumb navigation. 

Breadcrumb Google ranking factor #1: Relevance 

It’s very important to Google that the search results it displays are relevant to the user query (keyword). 

Otherwise, the user will see a bunch of irrelevant and unhelpful results, likely resulting in the user choosing another search engine. 

For this reason, Google’s algorithms have become increasingly adept at identifying relevant content and filtering out spam. Therefore, websites will only find success on Google’s SERPs (search engine results pages) if they target relevant keywords

This relates to breadcrumbs because they make it easier for Google’s crawler bots to identify relevant content. That brings us to another important point: you should always include relevant keywords in your breadcrumbs

Why is that?

It’s because Google uses the breadcrumb markup to categorize information from web pages in its search results. Whenever you include keywords in your breadcrumbs, it adds semantic clarity, enabling crawler bots to better understand the context and relevance of a web page in the overall site hierarchy. 

Breadcrumb Google ranking factor #2: Usability 

Google also wants its search results to only display websites with pleasant user experiences. 

Factors that affect user experience include:

  • Speeds for loading, interactivity, and stability (Google’s Core Web Vitals test checks these) 
  • Internal linking structure (breadcrumbs help with this) 
  • Site architecture (breadcrumbs help with this, too) 
  • Ease of navigation (once again, breadcrumbs) 

As you can see, breadcrumbs aid with numerous factors that affect your user experience. Since breadcrumb trails make your site easier to navigate for users, you’ll get the usability nod from Google’s algorithms, paving the way for higher rankings. 

Breadcrumbs help click-through rates due to rich snippets 

Another SEO perk associated with breadcrumbs is their ability to improve click-through rates by appearing as rich snippets

Also called enriched results, rich snippets are simply normal Google search results with some additional information displayed. 

You’ve seen rich snippets in action on Google’s SERPs whenever something like a recipe or phone number appears alongside a search result. 

Common rich snippets include:

  • Reviews
  • Events
  • Recipes 
  • Phone numbers 
  • ‘Top Stories’ for news articles 
  • Videos 

Here’s a recipe rich snippet that appears when we search for ‘pizza recipes’ on Google:

Websites are able to target rich snippets by including structured data (also called schema markup) in their site’s code. 

What’s that?

Structured data is just a standardized format for:

  1. Including information about a page 
  2. Classifying web page content 

Using structured data makes your content eligible for rich snippets and featured snippets on Google, which is a plus. 

This relates to breadcrumbs because you can include them in your structured data, and they can appear in the search results

Here’s an example of what this looks like:

In this case, we can see that the web page comes from the website’s Men’s Sneakers category, which is housed under the parent page Collections

This looks more attractive than a search result without breadcrumbs and provides users with immediate context about where they are on the site – which can lead to higher click-through rates (CTRs). 
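
Breadcrumb rich results are powered by schema.org's BreadcrumbList markup. Here's a sketch of the JSON-LD version for the sneaker example above (the names and URLs are placeholders):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "BreadcrumbList",
  "itemListElement": [
    {
      "@type": "ListItem",
      "position": 1,
      "name": "Collections",
      "item": "https://www.yoursite.com/collections/"
    },
    {
      "@type": "ListItem",
      "position": 2,
      "name": "Men's Sneakers",
      "item": "https://www.yoursite.com/collections/mens-sneakers/"
    }
  ]
}
</script>
```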

When you combine all these factors, it becomes clear that breadcrumbs will have a significantly positive impact on your SEO, which is why they’re worth implementing (if your site architecture calls for it, of course). 

How to Add WordPress Breadcrumbs to Your Website 

Okay, now it’s time to learn how to actually add breadcrumb trails to your website so you can reap the aforementioned benefits. 

For this guide, we’re going to stick with how you can add breadcrumbs to WordPress websites, which is the most popular CMS (content management system). 

As a bonus, some WordPress themes support breadcrumbs by default, so you won’t have to do anything to implement them. Even if your theme doesn’t automatically add breadcrumbs, there are numerous plugins that will add the capability. 

Yoast SEO is one such plugin, and breadcrumbs are included in its free version, which you can’t beat. 

If you haven’t installed it already, go to Plugins > Add New on the left-hand sidebar. Search for the Yoast SEO plugin, and hit install. Before you can get started, don’t forget to hit the Activate button once the installation is complete. 

Next, you’ll need to add a snippet of code to your WordPress theme in order for breadcrumbs to appear. You can add it to the single.php or page.php files just above the page’s title. Also, some themes require you to place it at the end of the header.php file.  

Here’s the code:

<?php

if ( function_exists( 'yoast_breadcrumb' ) ) {

  yoast_breadcrumb( '<p id="breadcrumbs">', '</p>' );

}

?>

Once you’ve added the code to your WordPress theme, head over to the Advanced Settings on Yoast SEO, and select Breadcrumbs

Here, you’ll be able to customize the appearance and structure of your breadcrumbs:

There are options for customizing the anchor text, prefixes, and the graphic that separates the breadcrumbs. 

The team at Yoast offers more in-depth advice on implementing breadcrumbs in this document

Best Practices for Breadcrumb Navigation 

Getting breadcrumbs to display on your website is only the first step. You also need to optimize them to ensure they’re helping your SEO and not hindering it. 

Here’s a look at the top best practices for implementing breadcrumb trails on your website. 

Don’t use breadcrumbs if they don’t make sense for your website 

Regardless of the benefits breadcrumbs can bring to your SEO, you still should not use them if it doesn’t make sense for your website’s structure. 

If your website has an extremely shallow architecture, wherein each page is only a click or two away from the homepage, adding breadcrumbs will serve no purpose. The same is definitely true for one-page websites that have no site hierarchy whatsoever (since all the content is on a single page). 

Also, if you have a lot of lower-level pages that are accessible through various landing pages, breadcrumbs may become a source of confusion. 

Why is that?

It’s because users may access the same page from different starting points, throwing off the breadcrumb trail entirely. 

Don’t make your breadcrumbs too large

In all the visual examples we’ve provided so far, did you notice how small the breadcrumb trail links were?

This was very much on purpose

There’s a visual hierarchy involved with your site’s navigation menus, and your primary menu (Home, About Us, Services, etc.) should always be the largest. 

It will quickly become confusing if you have gigantic breadcrumb trails that dwarf your other navigation menus, so keep them small. 

Always include the full path in your breadcrumb trail 

Sometimes, a user may visit a page deep within your hierarchy directly from the search results. An example would be a product page for a pair of sneakers contained within numerous collection pages. 

Since the user effectively skipped those pages to get to the sneaker page, they shouldn’t be included in the breadcrumb trail, right?

Wrong! 

You should ALWAYS include the full path in your breadcrumbs, even if a user visits a deep page directly from the SERPs. Otherwise, your users could become confused. 

More importantly, not including the full path negates the opportunity for users to explore more of your website

Let’s say the user decides they actually don’t care for the sneakers they clicked on, and they want to look at what else you have to offer. By including the full path in your breadcrumbs, they can easily click back to your Men’s Sneakers category to browse other options. 

Including the full navigational path provides opportunities to A) improve dwell time (users spending more time on your site) and B) score conversions and generate leads. 

Use Breadcrumbs to Bolster Your SEO 

To summarize, breadcrumbs are a fantastic addition to your website when they’re appropriate. If you have a very deep site architecture, breadcrumbs will provide users with a quick and easy way to navigate your site. 

They will also make your website easier to crawl and aid with numerous ranking factors. 

However, don’t force breadcrumbs into your site if they don’t make sense. 

Do you need help forming a successful SEO strategy for your website?

HOTH X, our fully managed SEO service, is the answer to all your problems. We’ll take the entire process off your hands to simplify your SEO success, so don’t wait to get in touch!     


]]>
https://www.thehoth.com/blog/breadcrumb-navigation-seo/feed/ 6
Crawlability: What It Is and Why it Matters for SEO https://www.thehoth.com/blog/crawlability/ https://www.thehoth.com/blog/crawlability/#comments Thu, 22 Aug 2024 08:42:26 +0000 https://www.thehoth.com/?p=36675 Nothing.  That’s what will show up in the results whenever someone searches for your brand name if Google can’t crawl and index your content (the same goes for other search engines, too).  Accordingly, crawlability and indexability are two crucial concepts every search engine marketer needs to understand. Without knowledge of how both processes work, your […]

The post Crawlability: What It Is and Why it Matters for SEO appeared first on The HOTH.

]]>
Nothing. 

That’s what will show up in the results whenever someone searches for your brand name if Google can’t crawl and index your content (the same goes for other search engines, too). 

Accordingly, crawlability and indexability are two crucial concepts every search engine marketer needs to understand.

Without knowledge of how both processes work, your content may disappear from Google’s search results, and you won’t have a clue why it happened or how to fix it. 

So what are they, then?

Crawling is how search engines discover new and updated content to store in their indexes, which is where they draw search results from. 

To break that down:

  1. Your content must be in a search engine’s index to appear in the search results.
  2. Crawling is how search engines find new content to store in their indexes. 

Search engine crawlers, also called spiders and bots, ‘crawl’ (closely examine) all the information on a web page to understand:

  1. The topic 
  2. Content quality
  3. Relevance to certain keywords 
  4. The page’s role in the website’s overall architecture 

That’s the crawling process in a nutshell, and it’s by no means a flawless process. 

Crawling and indexing errors happen all the time, which is why you need to know how to fix them. 

In this article, we’re going to cover everything related to crawlability and indexability, including how to ensure your most important pages get crawled and indexed – so stick around to learn more! 

What is Crawlability?

We’ve already covered what the crawling process is, but crawlability is a tad different. 

Crawlability refers to how easy it is for bots to crawl, navigate, and index your web pages. 

You can have good crawlability, so-so crawlability, or poor crawlability – depending on several key factors. 

Search engine crawlers can easily become confused if certain best practices aren’t in place, such as:

  • A sound internal linking structure (meaning each web page has at least one internal link pointing at it)
  • A logical URL structure (short URLs, dashes to separate words, avoiding long strings of code, etc.)
  • Access to your XML sitemap through Google Search Console or Bing Webmaster Tools (sitemaps make crawling far easier for search bots)
  • Fast loading speed 
  • A properly formatted robots.txt file 
  • No duplicate content 
  • Functional links 

Conversely, here are some factors that will confuse search engine crawlers and cause problems:

  1. Slow loading speed 
  2. Broken links 
  3. No robots.txt or XML sitemap 
  4. Duplicate content 
  5. Orphan pages (web pages that have no internal links pointing at them) 
  6. Poorly formatted content (long blocks of text, no subheadings, etc.) 

The good news is that all of these factors are well within your control and tend to be easy to fix. 

For example, if you discover that you have orphan pages, fixing them is as easy as adding an internal link to them on your homepage (or another related page). 
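To make this concrete, here’s a minimal sketch (in Python, with hypothetical page URLs) of how orphan pages can be detected once you know which internal links each page contains:

```python
def find_orphan_pages(internal_links):
    """Return pages that no other page links to (orphan pages).

    internal_links maps each page URL to the set of internal
    URLs it links out to. All page URLs here are hypothetical.
    """
    all_pages = set(internal_links)
    linked_to = set()
    for source, targets in internal_links.items():
        linked_to |= targets - {source}  # ignore self-links
    # The homepage is reachable by definition, so exclude it.
    return all_pages - linked_to - {"/"}

site = {
    "/": {"/blog", "/products"},
    "/blog": {"/", "/products"},
    "/products": {"/"},
    "/old-landing-page": set(),  # nothing links here -> orphan
}
print(sorted(find_orphan_pages(site)))  # ['/old-landing-page']
```

In practice, a crawler like Screaming Frog builds this link map for you; the fix is the same either way — point at least one internal link at each orphan.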

What is indexing?

A search engine’s index is its database of websites and web pages. For a website to show up in a search engine’s index, its bots have to crawl the site’s content first. After that, a suite of search algorithms determines whether the content is worth storing in the index and ranking in the search results. 

You can think of Google’s index as a giant catalog of web pages that its crawlers determined are worthy of inclusion in the search results. Whenever a user searches for a keyword, Google references its index to determine if it has any relevant content to display in the results. 

7 Common Crawlability Issues (and How to Fix Them) 

Some crawling issues are a lot more common than others, so it’s important to familiarize yourself with the ‘usual suspects,’ so to speak. 

If you know how to quickly address and resolve these issues, your technical SEO audits will run much smoother. Also, you’ll be able to restore your content to the SERPs faster since you’ll know why your content suddenly disappeared. 

We’ve discussed a few of these already, but the most common crawlability issues are:

  1. Poor site structure 
  2. Not enough internal links 
  3. Broken links  
  4. A missing or improperly formatted robots.txt file 
  5. No XML sitemap 
  6. Slow loading and response times 
  7. Web pages not optimized for mobile devices 

Let’s take a closer look at each problem and uncover the best way to fix them. 

Issue #1: Poor site structure (navigation and URLs) 

A crawler bot can become just as confused as a member of your target audience if your site doesn’t feature logical structure and navigation. 

Users and bots alike tend to prefer sites that have ‘flat’ architectures due to how easy they are to navigate. 

The two hallmarks of flat site architecture are a shallow page hierarchy (meaning each page is only a few clicks away from the homepage) and minimal subcategories. 

In other words, it’s a minimalist approach to website design, and it’s a great way to prevent your site from filling up with countless subcategories and internal pages. 

Also, do your best to incorporate navigational best practices like breadcrumbs, which show the user (and bot) where they are in your site hierarchy. 

Issue #2: Not enough internal links 

We’ve already mentioned orphan pages, which are web pages that have no internal links, but they’re only one side effect of not including enough internal links on your website. 

Besides ensuring all your most important pages are discoverable through site navigation, internal links also:

  • Make your website easier to crawl and understand for bots 
  • Can keep readers engaged in your content loop for longer 

Crawler bots love internal links because they use them to A) discover other web pages on your site and B) understand the greater context behind your content. 

For this reason, you should try to include internal links on every web page you create, especially in your articles. Whenever you’re writing a new blog post, think of instances where you could link to other pages on your site. 

As an example, let’s say you mention a topic in passing that you shot a video about a few months back. Adding an internal link to the video will give readers the opportunity to learn more about the subject, and it will give crawlers more context about your site as a whole. 

If you have low dwell times (how long users spend on your site), adding more internal links can help. 

Why is that?

It’s because internal links to other pieces of content provide users with the opportunity to remain engaged with your content, and they’ll spend longer on your site (which may end in a conversion). 

Issue #3: You have broken links 

When a link is broken, it means it no longer points to its original destination. 

Because of this, it will return an error page, most commonly a 404 Not Found. 

Reasons for this vary, but common culprits include website updates, CMS migrations, and human error. 

Broken links are an SEO killer because they cause valuable content to vanish, so it’s essential to keep your eyes open for them. 

Screaming Frog is a huge help in this regard, as it will let you know if you have broken links on your website. 

Coincidentally enough, Screaming Frog is a website crawler, so it’s a great tool for ensuring good crawlability overall. 

Issue #4: A missing or improperly formatted robots.txt file 

Robots.txt is a file, based on the Robots Exclusion Standard, that tells search engine crawlers which URLs they can access on your site and which they’re ‘excluded’ from. 

The purpose is to prevent crawlers from overloading your site with too many requests at once. 

Also, it’s important to note that not every page on your site needs to appear in search engine indexes. 

As a rule of thumb, you should only let search engines crawl your most important pages, such as your homepage, content, and product/landing pages. 

Admin pages, thank you pages, and login pages are examples of web pages that don’t need to show up in search results since they provide no value to users. 

Also, crawl budgets are a very real thing. 

Search engine bots don’t run on fairy dust, and it takes quite a bit of computing resources for them to crawl a web page. To conserve those resources, search engines like Google use ‘crawl budgets,’ where their bots will only crawl a set number of pages on a site. 

More popular web pages receive bigger budgets, while obscure sites have to make do with less. 

Here’s Google’s advice on how to create and format a robots.txt file. 
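As a minimal sketch, a robots.txt file looks like this — the disallowed paths are hypothetical, matching the kinds of admin, login, and thank-you pages mentioned above (the `Sitemap` line is optional but helps crawlers find your sitemap):

```
User-agent: *
Disallow: /admin/
Disallow: /login/
Disallow: /thank-you/

Sitemap: https://www.yoursite.com/sitemap.xml
```

The file lives at the root of your domain (yoursite.com/robots.txt), and `User-agent: *` means the rules apply to all crawlers.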

Issue #5: A missing XML sitemap 

An XML sitemap provides a clear ‘roadmap’ of your site’s architecture for crawler bots, so it’s pretty important to create one for your website. 

Moreover, you should submit it directly to Google Search Console (and Bing Webmaster Tools if your SEO strategy includes Bing). 

The Sitemaps Report in Google Search Console lets you view your sitemap once it’s uploaded, including which URLs Google’s crawlers currently have visibility of, which comes in handy. 

You can learn more about XML sitemaps (including how to format them) here. 
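For reference, a minimal XML sitemap follows the sitemaps.org protocol; the URLs and date below are hypothetical:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.yoursite.com/</loc>
    <lastmod>2024-10-29</lastmod>
  </url>
  <url>
    <loc>https://www.yoursite.com/blog/crawlability/</loc>
  </url>
</urlset>
```

Each `<url>` entry needs only a `<loc>`; `<lastmod>` is optional but helps crawlers prioritize recently updated pages.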

Issue #6: Slow speeds for loading, interactivity, and responsiveness 

Poor loading speed will throw a wrench into any SEO strategy. 

Besides providing a terrible user experience, slow loading times can cause you to fail Google’s Core Web Vitals test, meaning you won’t receive any favors in the rankings. 

The Core Web Vitals test checks not only how quickly every website’s pages load, but also their interactivity and visual stability. 

Google’s PageSpeed Insights tool (which is powered by Lighthouse) lets you preview how well you’ll do on the Core Web Vitals test. 

The tool also contains suggestions for improvement, so it’s worth using. 

Issue #7: Not optimized for mobile devices 

Ever since 2018, Google has used mobile-first indexing, meaning it ranks the mobile version of a website first. 

As a result, site owners must make doubly sure their pages display properly on smartphones and tablets. 

Mobile web browsing has been king for quite some time now, as mobile browsing currently accounts for 61.79% of all traffic. 

Responsive designs work best, which is where your website automatically adjusts its dimensions in accordance with a user’s device. Check out our mobile optimization guide to learn more. 

Improve Your Website’s Crawlability Today 

Countless factors affect a website’s crawlability, and all it takes is a single issue to cause your content to not appear in Google’s index. 

That’s why it’s so crucial to know how to identify and resolve errors related to crawling and indexing. 

Do you need help improving the crawlability of your website?

One of our Technical SEO Audits is just what the doctor ordered, then. We’ll provide your website with a top-to-bottom audit, including discovering any issues with your crawlability. For truly hands-off SEO, check out HOTH X, our fully managed SEO service.      

The post Crawlability: What It Is and Why it Matters for SEO appeared first on The HOTH.

]]>
https://www.thehoth.com/blog/crawlability/feed/ 3
The Ins and Outs of E-Commerce Technical SEO https://www.thehoth.com/blog/commerce-technical-seo/ https://www.thehoth.com/blog/commerce-technical-seo/#comments Thu, 15 Aug 2024 12:59:05 +0000 https://www.thehoth.com/?p=36619 E-commerce is one of the most fiercely competitive industries around, with 40% of online store owners reporting the market has ‘very tough competition’ in a recent survey.  Online shoppers also have sky-high expectations and lack patience for things like slow loading times and stores that don’t have many reviews.  These two factors make standing out […]

The post The Ins and Outs of E-Commerce Technical SEO appeared first on The HOTH.

]]>
E-commerce is one of the most fiercely competitive industries around, with 40% of online store owners reporting the market has ‘very tough competition’ in a recent survey. 

Online shoppers also have sky-high expectations and lack patience for things like slow loading times and stores that don’t have many reviews. 

These two factors make standing out in the e-commerce world exceptionally difficult, but e-commerce SEO provides a reliable solution. 

By outranking your competitors on search engines like Google, you’ll score the majority of clicks from your target audience, generating a ton of traffic to your store in the process. 

43% of all e-commerce traffic comes from Google, so engaging in SEO is a must for any online store eager to gain visibility. 

Yet, you must learn how to master e-commerce technical SEO before you can become a true search engine Jedi. 

Technical SEO refers to crucial ‘behind-the-scenes’ factors on your website that affect your search rankings. While all websites contend with technical SEO, e-commerce stores tend to have the most trouble with it. 

Slow loading speed, indexing problems, and duplicate content are all examples of common technical errors that affect many online stores, and it’s important to know how to identify and fix them. 

Airtight technical SEO will do your site a ton of good, including:

  • Higher rankings on search engines like Google 
  • Lightning-fast speed for loading and responsiveness 
  • A top-tier user experience
  • Lower bounce rates and higher dwell times 
  • Higher customer conversions 
  • A better mobile experience (which is extremely important for online stores) 

As you can see, it’s well worth your time to learn how technical SEO works. 

That’s why we put together this extremely comprehensive guide breaking down technical SEO for e-commerce businesses, so stay tuned to learn more! 

What is Technical SEO, Anyway?

There are countless subsets of SEO, which can make things a bit confusing for newcomers. 

You can pump just about any term into the formula ‘(something) SEO,’ and there’s bound to be an article about it somewhere. 

Just for fun, here’s what happens if you search for ‘Harry Potter SEO’ on Google:

Sure enough, it’s a thing. 

Anyway, let’s keep things simple. 

At the most basic level, there are three major subsets of SEO that apply to all strategies – regardless of industry or niche. 

They are on-page SEO, off-page SEO, and technical SEO:

  • On-page SEO. These are all the optimizations you make on the customer-facing side of your website to rank better on search engines. The most common examples are creating content and using relevant keywords. 
  • Off-page SEO. This refers to all the optimizations you make which occur off your website, like building backlinks (links on other websites that ‘point back’ to yours), and creating social media content. 

Lastly, there’s technical SEO. 

With technical SEO, you dive behind-the-scenes into your website’s code and architecture. There, you optimize your site for both users and search engines. This includes things like improving loading speed, using canonical tags (more on these later), and fixing broken links.

Poor technical SEO will spell disaster for your performance on search engines, and your user experience will take a nosedive. 

This is because technical errors can cause your website to:

  1. Disappear from search engine results pages (SERPs). 
  2. Not load correctly on mobile devices (which will likely lead to lost conversions). 
  3. Load slowly, causing many users to click off your site. 
  4. Develop lots of broken links, costing you valuable traffic and potential revenue. 

These problems (more like catastrophes) should provide you with more than enough incentive to keep track of your technical SEO.  

Why Do E-Commerce Sites Struggle with Technical SEO?

As mentioned previously, e-commerce stores have a tumultuous relationship with technical SEO, and they struggle with it more than other types of sites do. 

Why is that?

It has to do with how online stores work. 

By nature, e-commerce stores will have:

  1. Hundreds, sometimes thousands of web pages to manage (the average is between 227 and 423 pages)
  2. Frequent content updates due to new products, prices, and promotions 
  3. A majority of customers shopping on mobile devices like smartphones and tablets 

These three reasons pose significant challenges to online stores, so let’s take a closer look at each one. 

Tons of web pages to manage (duplicate content issues) 

An e-commerce store typically has more web pages to deal with than other types of websites. 

The average store owner contends with tons of product pages, category pages, and subcategory pages. Speaking of product pages, it’s normal to have dozens of them for a single product. This is because store owners have to create a new web page for different colors, sizes, and styles. 

For example, let’s say you sell a cowboy hat on your site, but it comes in 3 different sizes and 10 different colors. 

If each size and each color gets its own URL, that’s 13 pages for a single product! 

This very issue also causes problems with duplicate content, where you’re trying to rank two virtually identical web pages for the same keyword. 

In the case of our cowboy hat, Google will see 13 versions of it, all attempting to rank for identical keywords. 

What will Google do?

It will only rank one page at a time, causing dips in traffic and visibility for all pages, which is bad news. 

We’ll dive into how to fix duplicate content in a bit, but for now just know it’s a frustrating problem that plagues many e-commerce store owners. 

Frequent content updates to implement 

Let’s stick with the cowboy hat example for a second because we can use it to illustrate the next problem store owners have with technical SEO. 

For the longest time, you sell your cowboy hats for $19.95 each. However, you’ve got a promotion coming up that’ll knock the price down by 20%, making them $15.96. 

This means you just need to go in and tweak the price on the cowboy hat, and you’re done, right?

Wrong. 

As we’ve already established, you have 13 distinct product pages for the hat, meaning you’ll have to update the price 13 individual times. 

What’s worse is that it’s just one product. 

Imagine if the promotion you’re running affects all the products in your store, which is typically what happens. 

That’s why larger stores make heavy use of API integrations, scheduled updates (which update product information automatically), and product information management (PIM) systems. 

A majority of customers shop on mobile devices

In the current era, the vast majority of customers browse e-commerce stores using smartphones and tablets instead of desktop computers. 

Research shows that at least 79% of smartphone users have made a purchase using their device in the past 6 months, and it’s estimated that over 50% of all e-commerce purchases made during the 2022 holiday season were made using mobile devices. 

As a result, it’s imperative that online stores display properly on smartphones and tablets. Otherwise, you’ll lose a lot of business. 

How to Improve Your E-Commerce Store’s Technical SEO 

Okay, enough talk about problems; let’s move on to the solutions! 

While technical SEO is undoubtedly challenging for online stores, it’s still doable, especially if you follow the right steps. 

Here’s a step-by-step guide that will teach you how to audit and improve your store’s technical SEO. 

Step #1: Audit your current technical SEO performance 

Regardless of your goals, you should ALWAYS kick off a technical SEO strategy with a comprehensive audit. 

This is because you’ll uncover your current strengths and weaknesses during the audit process, including any glaring errors that you should address before moving on. 

For example, if you start optimizing your store for mobile devices, but fail to realize that Google hasn’t indexed a majority of your product pages, your efforts will be in vain. 

Here are common tools of the trade that’ll help you perform successful audits:

  • HOTH SEO Report Tool. This free tool will let you know whether your site is properly optimized for mobile, SSL-enabled, and has a sitemap. It also contains lots of other SEO metrics, so don’t wait to try it out.
  • Google Search Console (GSC). You should set up your website on GSC if you haven’t already because it lets you monitor your SEO performance on Google. In particular, the Index Coverage Report notifies you of any crawling and indexing errors, which is invaluable. You can read more about how to use GSC here.
  • HOTH PageSpeed Checker. Sonic the Hedgehog-worthy loading speeds and response times are a must, so use our free tool to see if your site is up to snuff. 

We’ll be honest, technical SEO audits are pretty complicated, and there’s a lot beginners may miss. Our Technical SEO Audit services are the way to go if you have zero experience with the process. 

Our experts will audit your website from head to toe, checking every technical factor known to humanity in the process. Once we’re done, we’ll provide you with a detailed report and hop on a call to go over everything we found. 

Step #2: Make sure your store is easy to navigate 

Remember, technical SEO isn’t just about appealing to search engine algorithms; it’s also about enhancing your user experience. 

In that vein, your site must feature effortless navigation. Otherwise, your prospects will have a hard time finding what they’re looking for, causing them to click off your site and look elsewhere. 

Your product categories should be front and center, and it shouldn’t be too difficult to sift through subcategories, either, so try to limit them if you have a lot. 

Besides that, make heavy use of internal links and breadcrumbs. 

If your site contains plenty of internal links, your users won’t have any trouble finding what they’re looking for. Internal links also keep users engaged in your content loop, which is great for your dwell times. 

Breadcrumb navigation refers to keeping track of a user’s path through your categories and subcategories with little reminders. 

Here’s a quick example of what breadcrumb navigation looks like:

Clothing > Hats > Cowboy Hats 

As you can see, it’s a small visual reminder of where you are in a store’s hierarchy of pages, categories, and subcategories. Also, each category would contain a hyperlink. 

Here’s an example of breadcrumbs on Nordstrom’s site:

It’s a helpful way to remind users where they are on your website, and it provides a quick and easy way to jump to other categories (such as Clothing or Men in the image above). 
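In HTML, a breadcrumb trail like the one above can be marked up as a simple list of links; the paths here are hypothetical:

```html
<nav aria-label="Breadcrumb">
  <ol>
    <li><a href="/clothing/">Clothing</a></li>
    <li><a href="/clothing/hats/">Hats</a></li>
    <li aria-current="page">Cowboy Hats</li>
  </ol>
</nav>
```

Every level except the current page is a link, so users (and crawlers) can jump back up the hierarchy from anywhere.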

Step #3: Use a clean URL structure 

The URLs you use also matter for your SEO and for your users. For instance, if your URLs are all long strings of nonsense numbers and characters, your target audience won’t remember them. 

However, if they’re short and logical, it’ll be easy for your users to share your web pages and navigate straight to the pages they want. 

Here’s an example of a cluttered URL:

https://www.yoursite.com/d/1nUCnTq-unl8OnLQO9WEXv2XZ5m_Ktrhexy9OWcD9p

Now, here’s an example of a clean URL:

https://www.yoursite.com/seo-blog 

See how much easier that is to remember? Google’s search algorithms also prefer shorter URLs that follow logical structures, so you have every incentive to tighten things up. 

Here are some best practices for writing SEO-friendly URLs:

  • Keep them short
  • Include target keywords (if possible) 
  • Separate words with hyphens
  • Always use lowercase letters 
  • Use static URLs instead of dynamic 
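To sketch how these best practices fit together, here’s a small, hypothetical Python helper that turns a page title into a clean, lowercase, hyphen-separated slug:

```python
import re

def slugify(title):
    """Turn a page title into an SEO-friendly URL slug following
    the best practices above: lowercase letters, hyphens between
    words, and no special characters. A hypothetical sketch."""
    slug = title.lower()
    slug = re.sub(r"[^a-z0-9\s-]", "", slug)        # drop special characters
    slug = re.sub(r"[\s-]+", "-", slug).strip("-")  # spaces -> single hyphens
    return slug

print(slugify("Cowboy Hats: 2024 Buyer's Guide!"))
# cowboy-hats-2024-buyers-guide
```

Most CMS platforms (WordPress, Shopify, etc.) generate slugs like this automatically, but it’s worth checking that yours follows the same rules.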

Step #4: Make sure your site loads and responds quickly 

Your site’s loading speed is another factor that matters to both users and search engines. 

On the user side, your prospects will get fed up if your site doesn’t load and respond in an instant, and they’ll click back to the search results if they experience a delay. 

Is it really that bad, though?

Yes. 

According to research, a one-second delay in loading speed causes a 7% loss in conversions and 11% fewer page views. 

Put in perspective, if your store earns $50,000 per day, a delay of just one second will cost you approximately $1.28 million per year ($50,000 × 7% × 365 days). 

On the search engine side, Google’s Core Web Vitals tests the loading speed of every site in its index, and sites with poor loading speed don’t receive high rankings. 

To both check your site speed and find suggestions for improvement, take advantage of Google’s PageSpeed Insights tool, which is powered by Lighthouse. It will test your loading times and recommend ways to speed things up. 

Step #5: Make your store mobile-friendly 

Google has used mobile-first indexing since 2018, so your store needs to work on mobile devices if you want to succeed with your SEO strategy. 

This means Google will always rank the mobile version of your website first, which is bad news if you only have a desktop version of your store. 

The #1 way to make your website mobile-friendly is to incorporate a responsive design, which means your site automatically adjusts its layout and dimensions to fit a user’s device. 

Check out our guide on mobile-friendly web design to learn more. 

Step #6: Implement structured data and rich snippets 

If you want to gain even more visibility for your online store, then you should target rich snippets by including structured data. 

What’s that?

Structured data, also called schema markup, is a standardized format for A) providing information about a web page and B) classifying the page’s content.

In other words, it’s how you qualify for rich snippets on Google like this:

Do you see how Google provided us with a direct answer to our query via a ‘snippet’ from a highlighted page? Moreover, this web page appears in position zero, meaning it appears above the organic search results. 

Here’s a rich snippet specifically for e-commerce:

Check out this guide on how to incorporate structured data to learn how it works. 
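As a sketch, product structured data is commonly added as a JSON-LD snippet in the page’s HTML; the product details below are hypothetical, borrowed from the cowboy hat example:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Cowboy Hat",
  "image": "https://www.yoursite.com/images/cowboy-hat.jpg",
  "offers": {
    "@type": "Offer",
    "price": "19.95",
    "priceCurrency": "USD",
    "availability": "https://schema.org/InStock"
  }
}
</script>
```

With markup like this in place, Google can display the price and availability directly in the search results, which is exactly the kind of rich result shown above.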

Step #7: Get rid of duplicate content 

As mentioned before, duplicate content is an issue that affects many e-commerce stores, but there are solutions. 

In particular, canonical tags make managing similar (or identical) pages a breeze. 

A canonical tag signals to search engines which version of a web page is the one to include in search results, telling them to ignore the rest. 

Here’s a quick rundown of how it works:

  • The primary version of the page receives a canonical tag that points at itself. 
  • Every similar web page also receives a canonical tag, which points to the original version. 

In other words, every canonical tag should point to the original page. This lets Google and other search engines know that they should only rank the primary page and keep the other identical pages off the search results. 
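As a minimal sketch (with hypothetical URLs), the canonical tag is a single line in each page’s `<head>`, and every version points to the primary URL:

```html
<!-- On the primary version of the page (a self-referencing canonical): -->
<link rel="canonical" href="https://www.yoursite.com/products/cowboy-hat" />

<!-- On every similar version (e.g., a color or size variant),
     the tag points back to that same primary URL: -->
<link rel="canonical" href="https://www.yoursite.com/products/cowboy-hat" />
```

So for the 13 cowboy hat pages from earlier, all 13 would carry the same canonical URL, and only that one page would compete in the rankings.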

Step #8: Build lots of internal links 

You need to make a habit of including internal links in every piece of content you create. Also, ensure every page on your site has at least one internal link pointing at it. Otherwise, you could wind up with orphan pages, which are web pages that have no links pointing to them. 

As a result, orphan pages are next to impossible for users and search engines to find. That’s bad news, especially if the page in question is an important product page. 

Whenever you’re writing blogs or product descriptions, brainstorm any relevant internal linking opportunities. For instance, if you’re writing a blog about a particular product, why not link to it? As stated before, internal links keep users engaged with your content, so don’t be shy about using them. 

Step #9: Create an XML sitemap and upload it to GSC 

An XML sitemap is a file that helps guide search engines to the most important pages on your site that you want to appear in their indexes. 

Without one, search engine crawler bots may not locate all the pages you want to include in the search results. 

You could create an XML sitemap manually, but this is a time-intensive process, especially for larger websites. The manual process involves creating a list of all your website’s URLs, coding them, and then uploading them to GSC (and other search engine webmaster tools). 

There are also free tools that will create sitemaps for you, like this one. 
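If you’d rather script it, here’s a minimal Python sketch (with hypothetical URLs) that builds a sitemap from a list of pages using only the standard library:

```python
from xml.etree import ElementTree as ET

def build_sitemap(urls):
    """Build a minimal XML sitemap string from a list of URLs,
    following the sitemaps.org protocol. URLs are hypothetical."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for url in urls:
        # Each page becomes a <url> element containing a <loc>.
        loc = ET.SubElement(ET.SubElement(urlset, "url"), "loc")
        loc.text = url
    return ET.tostring(urlset, encoding="unicode")

xml = build_sitemap([
    "https://www.yoursite.com/",
    "https://www.yoursite.com/products/cowboy-hat",
])
print(xml)
```

Save the output as sitemap.xml at your site’s root, then submit that URL in GSC’s Sitemaps report.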

Step #10: Monitor your progress regularly 

At this point, all that’s left is to keep an eye on your technical SEO factors going forward. 

You should never go too long without checking crucial technical metrics like your site’s loading speed, indexing errors, broken links, and site architecture – since things are always subject to change. 

GSC and our free suite of SEO tools are your best friends in this regard, as they’ll help you keep track of all your technical SEO efforts. 

Get in Touch with E-Commerce Technical SEO Experts 

That’s what it takes to ensure flawless technical SEO for your online store. 

Doing so will ensure a pleasant user experience and higher search rankings, so it’s definitely worth the effort. 

However, technical SEO is both complex and time-consuming, which can prove to be too much for online store owners without much time on their hands. 

You can always hand the process off to our experts for a technical SEO audit or for HOTH X, our fully managed SEO service.      

The post The Ins and Outs of E-Commerce Technical SEO appeared first on The HOTH.

]]>
https://www.thehoth.com/blog/commerce-technical-seo/feed/ 3
14 Technical SEO Tools You Need to Start Using https://www.thehoth.com/blog/technical-seo-tools/ https://www.thehoth.com/blog/technical-seo-tools/#comments Thu, 18 Jul 2024 09:00:07 +0000 https://www.thehoth.com/?p=33053 Whenever site owners hear about technical SEO for the first time, it’s normal to feel slightly intimidated.  After all, the word ‘technical’ usually means that something is pretty complicated and requires special knowledge to understand.  The good news? There are plenty of free technical SEO tools out there that simplify and speed up the process.  […]

The post 14 Technical SEO Tools You Need to Start Using appeared first on The HOTH.

]]>
Whenever site owners hear about technical SEO for the first time, it’s normal to feel slightly intimidated. 

After all, the word ‘technical’ usually means that something is pretty complicated and requires special knowledge to understand. 

The good news?

There are plenty of free technical SEO tools out there that simplify and speed up the process. 

These tools will help you perform tasks like identifying and resolving indexing errors, improving page loading speed, and finding broken links. 

Regular technical SEO audits are absolutely necessary for maintaining search rankings, so it’s not something you can ignore. 

For example, you may have what’s known as an orphan page on your website, which means it has no internal links pointing at it (making it next to impossible for anyone to find). If you never audit your technical SEO, you’ll never catch this error, and the page will remain in obscurity forever. 

A solid technical SEO tool will quickly identify things like orphan pages and broken links. 

Yet, not every technical SEO tool is worth your time, so how do you know which ones to use?

In this article, we’re going to cover the top 14 technical SEO tools that you need to start using today, so stick around! 

Why Do Technical SEO Tools Matter?

Without testing tools, some aspects of technical SEO become impossible. 

For instance, how are you supposed to know how fast your pages load without using a tool to check them? It’s not something you can eyeball, as you need specific numbers to know if you’ll pass Google’s Core Web Vitals test or not. 

Without using a tool like Google’s PageSpeed Insights to uncover your speed metrics, you’ll just be shooting in the dark. 

That’s why technical SEO tools are necessary: they take the guesswork out of the equation. 

The best tools provide in-depth metrics you can use to refine your strategy without leaving anything to chance. They also provide essential functions like creating XML sitemaps, uncovering duplicate content, and fixing indexing errors. 

In short, you have nothing to lose and everything to gain from using technical tools, especially since most of them are either free or have free versions. 

The Top 14 Technical SEO Tools Available Right Now

Search for ‘technical SEO tools’ on Google, and you’re met with a ton of options:

That’s not to mention all the listicles comparing and contrasting dozens of other technical SEO programs. 

Given the sheer variety, it can be confusing for site owners to know which tool will work best for their needs.  

Also, some tools require expensive memberships yet only provide basic SEO services you can easily replicate with free tools. 

It would be a shame to waste your marketing budget on a paid membership when you could have used a free program, which is why you shouldn’t pick the first tool you come across. 

Beyond that, you need to know what you’re going to use each tool for, as their features will vary. 

Here are some common technical SEO tasks:

  • Conduct SEO audits 
  • Check your website’s page loading times
  • View crucial SEO metrics 
  • Check how Google views your website content 
  • Fix broken links 
  • Find duplicate content 
  • Create & upload XML sitemaps 
  • Replace 404 pages with 301 redirects 
  • Clean up messy JavaScript & CSS 
  • Test schema structured data markup
  • Optimize your website for mobile devices 

Before you go searching for tools, make a list of the main tasks you want them to knock out. It could be that you want a comprehensive technical SEO audit tool, or you may want to focus on broken links and duplicate content. 

Either way, you need to know what you want before you start entering your credit card information. 

This is why we put together a list containing the 14 highest quality technical SEO tools. They’ll handle everything from boosting page speed to testing structured data markup and everything in between! 

#1: Google Search Console (GSC)

Google Search Console is a go-to tool for everything SEO. 

Whether you want to monitor your keyword position rankings, check your backlinks, or uncover indexing errors (plus ways to fix them), GSC is your central hub for monitoring your on-site and off-site optimizations. 

GSC is an extremely powerful tool for technical SEO because you get to peek behind the curtain and see how Google’s algorithms view your website. 

For example, you’ll get to see which pages on your site Google has in its index via the Index Coverage Report. If a page you want to rank isn’t in Google’s index, it won’t appear in the SERPs (search engine results pages) at all. 

The good news is GSC will notify you of any indexing errors. 

As you can see, it provides a reason for the indexing error, which also informs you of the fix. 

For example, let’s say a page you want to rank shows an ‘excluded by noindex tag’ error like one of the web pages in the picture. 

This means the web page has a noindex tag, which is a piece of HTML code that tells Google’s crawlers not to include the page in its index. 

Therefore, removing the noindex tag from the HTML is all that’s required to fix the issue. 
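For reference, a noindex directive usually appears as a robots meta tag in the page’s head section — here’s a generic illustrative snippet (your CMS or SEO plugin may generate it slightly differently):

```html
<head>
  <!-- Tells all crawlers not to add this page to their index -->
  <meta name="robots" content="noindex">
</head>
```

Delete that single line (or the equivalent `X-Robots-Tag` HTTP header, if it’s set server-side), then request reindexing in GSC to clear the error.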

While this is an easy fix, you would have had no clue why the web page wasn’t getting indexed if it weren’t for GSC!

Other useful GSC features for technical SEO 

Besides the Index Coverage Report, you’ll also want to make use of GSC’s other reports for your technical SEO. 

These include:

The Page Experience Report

Technical SEO isn’t only about making complex tweaks behind the scenes of your website. 

It’s also about ensuring a pleasant user experience for visitors to your site, and that’s where GSC’s Page Experience Report comes in handy. 

This report provides insights into your website’s user experience. 

Crucial metrics to pay attention to here include your performance on the Core Web Vitals Test (which examines loading speed, stability, and interactivity) and your HTTPS status. 

In order for users to have a safe experience on your website, your HTTPS status needs to be ‘good.’

You also get to see the percentage of your web pages that feature a good page experience on both desktop and mobile. 

The XML Sitemap Report 

It’s an SEO best practice to upload your XML sitemap to GSC. Doing so makes it easier for Google’s bots to crawl your website and make sense of your internal linking structure. 
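If you’ve never looked inside one, an XML sitemap is just a list of your URLs in a standard format. Here’s a minimal illustrative example (a real sitemap would list every page you want crawled, and the URL is hypothetical):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/blog/</loc>
    <lastmod>2024-10-01</lastmod>
  </url>
</urlset>
```

To submit it, open the Sitemaps report in GSC and enter the sitemap’s URL (typically something like /sitemap.xml).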

Once you’ve uploaded your sitemap, the XML Sitemap Report provides lots of useful features, including:

  1. Checking the status of your sitemap, which indicates whether Google has successfully processed your sitemap or if there were any errors. 
  2. Viewing in-depth details about your sitemap, including how many URLs were submitted vs. how many were actually indexed. 

It’s important to view both the status and details of your sitemap once you submit it to ensure everything is in order. 

For instance, you may discover that one of the URLs that you want to rank was submitted but not indexed. In that scenario, you’d need to head over to the Index Coverage Report to see what went wrong. 

Once you’ve checked to make sure each page important to your SEO is indexed, you’ll know they’ll have a fighting chance on the SERPs. 

#2: Google Analytics 4 (GA4)

Google Analytics 4 is another one of Google’s free tools that’s a must-use for SEOs. 

What’s the difference between GA4 and GSC?

Where GSC focuses on search metrics (search traffic, keyword rankings, etc.), GA4 deals with user behavior and engagement metrics. 

In other words, Google Search Console is for monitoring search algorithm optimizations, and Google Analytics 4 is for monitoring the behavior of your target audiences. 

This means that using GSC and GA4 in tandem will ensure your SEO appeals to both search algorithms and humans, which is what you want. 

On the technical SEO side of things, GA4’s reporting capabilities are second to none. 

Primarily, GA4 is a useful tool because it gives you a real-time view of what’s happening on your website. 

This is huge because most website audit tools report problems after they’ve already occurred. With GA4, you can uncover major issues as soon as they happen. 

Key metrics to pay attention to include:

  • Bounce rate. This metric lets you know how frequently users click away from your website without venturing further into your content. If a user visits one of your landing pages from the SERPs and doesn’t click on an internal link to another page, it counts as a bounce. 
  • Dwell time. Equally important are your dwell times. Bounce rates measure how often users leave without visiting another web page, while dwell times measure how long users stay on one of your web pages. If you discover that users are spending an average of 5 – 10 minutes on a piece of content, it means they’re consuming it all the way through – which is a great sign. Short dwell times indicate a problem with your content, such as poor loading speed or irrelevant information. 

It’s also crucial to pay attention to your organic traffic levels. If you notice a steep dropoff, it’s typically a sign that you have an indexing issue. In that case, you could open GSC to pinpoint and resolve the issue (see how well these two programs work together?). 

Pro tip: High bounce rates don’t always mean there’s a problem with your content. This is because, for a web visit to count as a bounce, a user has to visit one page on your site and then click away. That’s not always a bad thing, especially if the user got what they needed from your content and thus had no need to venture further into your website. 

Also, bounce rates don’t consider dwell time, meaning a user could stay on one page for two hours before clicking away, and it would still count as a bounce. This doesn’t mean that bounce rates are a bad metric; it’s just that you should always check your dwell time before jumping to conclusions. 
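To make the metric concrete, here’s a back-of-the-napkin sketch of how a classic bounce rate is calculated — illustrative only, since GA4 actually defines a bounce in terms of non-engaged sessions rather than raw single-page visits:

```python
def bounce_rate(single_page_sessions: int, total_sessions: int) -> float:
    """Classic bounce rate: the share of sessions that viewed only one page."""
    if total_sessions == 0:
        return 0.0  # avoid dividing by zero on a page with no traffic
    return single_page_sessions / total_sessions * 100

# 40 of 160 sessions left after viewing a single page
print(round(bounce_rate(40, 160), 1))  # 25.0
```

The takeaway is that the number alone says nothing about *why* users left — which is exactly why you should read it alongside dwell time.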

#3: HOTH SEO Audit Tool 

Next, let’s take a look at our free SEO Audit Tool, which you can use to uncover tons of valuable technical SEO insights. 

This tool is not only great for auditing your own website, but also for gauging the strength of your competitors. 

To use it, you’ll need to enter your:

  • Website URL
  • First name
  • Email address

Once you’ve entered all the information, hit Run Report to receive a free comprehensive SEO audit. 

Here’s what the audit page includes:

  • An overall grade for your website’s SEO performance (on a scale from 1 – 100, with 100 being the best) 
  • Recommendations for keyword usage (such as if a keyword is missing from a title tag) 
  • An in-depth backlink analysis 
  • A task list detailing all the fixes necessary to improve your score 

Our favorite feature of the tool is definitely the ‘task list’ we provide at the end. Improving your SEO is as simple as checking off each box on the list! 

Other features of the report include:

  • Canonical tag checks
  • Duplicate H1 tags 
  • Missing alt text
  • JavaScript errors
  • Schema markup 
  • HREFLANG attributes 
  • Meta description length (the optimal range is between 70 and 160 characters)
  • Unwanted noindex tags
  • The presence of a robots.txt file
  • HTTPS & SSL usage
  • The presence of an XML sitemap 

As you can see, the tool notifies you about virtually every technical SEO issue there is, so don’t hesitate to use it. 

#4: Google PageSpeed Insights (PSI) 

If you want to enjoy top keyword rankings on Google, you’ll have to pass its Core Web Vitals test. 

What’s that?

It’s a test Google runs on every website in its index to measure:

  • Page loading speed 
  • Interactivity 
  • Visual stability 

It’s no secret that years of lightning-fast internet have spoiled us all, and modern internet users are anything but patient when it comes to loading times. 

So, if your web pages struggle with loading speed, users won’t hang around for more than a few seconds – which will cause your bounce rate to increase and your dwell time to go down. 

Google’s PageSpeed Insights provides detailed metrics for your loading times, including how fast your website loads on desktops and mobile devices, which is a plus. 

There is one important note to make about PageSpeed Insights, though. 

It does not use the exact loading speed of your website; it uses approximations instead. 

While these approximations are generally reliable, you should still use multiple tools to get a clearer picture of your loading times. 

However, a stand-out feature of PageSpeed Insights is its ability to point out and recommend potential solutions for problems with your loading times. There will even be a link under each issue it points out that says, “Should Fix”:

That’s why this tool makes our list. 

It’s an excellent way to find & resolve the top errors that are holding your loading times back, which will vastly improve your website’s user experience.  

#5: Google Lighthouse

Up until December 2023, Google’s Mobile-Friendly Test was THE tool for testing mobile friendliness on Google. 

However, on December 4th, 2023, Google put an end to its Mobile-Friendly Test API. 

According to the team at Google, the tool had become antiquated, as there were far better ways of testing mobile friendliness. 

One of which is Google Lighthouse, a suite of tools that audit:

  • Performance
  • Accessibility 
  • Progressive web apps 
  • SEO 

It’s crucial to note that PageSpeed Insights actually runs on Lighthouse under the hood. While PSI is still a fantastic tool to use on its own, Lighthouse provides a more comprehensive view of your website’s performance, including loading speed and mobile-friendliness.   

If you want to use Lighthouse, then you’ll have to download Google Chrome if you don’t have it already. 

This is because Lighthouse is a part of Chrome DevTools. In fact, it has its very own panel in DevTools. 

Whenever you run a report, you can choose as many (or as few) categories as you want, and you have the option to analyze your performance on both mobile and desktop. 

#6: GTmetrix Page Speed Report

Yes, this tool provides yet another way to test your website’s loading speed and performance.

The difference is that GTmetrix has some truly unique features you won’t find elsewhere. Its score blends your raw loading speed with how well your website is optimized for performance. 

Therefore, the GTmetrix score is an effective judgment of your loading speed AND user experience. 

What’s even cooler is that the tool uses real data from Google Lighthouse and the Core Web Vitals test. This means you’re getting extremely accurate data coming straight from Google. 

Here are some other stand-out features of the tool:

  • Unique performance metrics: You get to peep crucial metrics like load times, resource usage, visual stability, and more. This comprehensive analysis helps identify and prioritize performance issues that could negatively impact SEO.
  • Real-world testing: You can test your website’s speed from different locations and under various device and connection settings. GTmetrix helps ensure the site performs well regardless of location or device.
  • Waterfall analysis: This chart is invaluable for diagnosing complex performance issues. That’s because it shows how each resource loads and helps pinpoint specific elements that slow down the page, allowing for targeted optimizations.

As you can see, GTmetrix provides the most in-depth analysis options for loading times, which is why we had to include it on the list. 

While Google’s free tools are more than adequate for most simple issues, GTmetrix is the way to go if you’re experiencing complicated loading issues. 

The tool doesn’t have a free version, but the starter pricing plan is only $4.25 per month, which is definitely affordable considering the advanced features you gain access to. 

#7: Screaming Frog 

Spotty URL structures and duplicate content are two SEO issues that will wreak havoc on your keyword rankings. 

Duplicate content occurs whenever you try to rank two identical pages for the same keywords, and it confuses Google’s algorithm. This causes:

  1. Google to choose one page over the other (which is usually the page you DIDN’T want to rank)  
  2. The algorithm to fluctuate between ranking both pages, causing major dips in traffic

URL structure refers to how you format your URLs, such as www.yourwebsite.com/blog. 

If your URLs are too complex and don’t follow a logical structure, search engine crawlers will have a difficult time discovering relevant pages on your website to index. 
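As a quick illustration of the difference between a crawler-hostile URL and a crawler-friendly one (both hypothetical):

```text
Messy:  www.yourwebsite.com/index.php?id=4827&cat=7&ref=xj29
Clean:  www.yourwebsite.com/blog/technical-seo-tools/
```

The clean version tells both crawlers and humans exactly where the page sits in your site hierarchy.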

What do these two issues have in common?

They’re both next-to-impossible to spot without the aid of a website crawler, and Screaming Frog is the industry leader. 

Screaming Frog will help you quickly identify the following technical issues on your website:

  • Images that are too large (which can affect loading speed) 
  • Errors in your URLs 
  • Canonical tag issues (i.e., more than one canonical tag)
  • Missing meta descriptions & meta keywords 
  • Missing page titles 
  • Response code errors 
  • Pagination issues 
  • International SEO issues 

These issues can potentially tank your SEO profile, so it’s worth crawling your website regularly. 

You can also use Screaming Frog to pinpoint inconsistencies in your URL structure.

Your URLs need to follow a rigid structure to make it easier for Google’s bots to crawl your site, and Screaming Frog is an excellent tool for spotting any errors. 

#8: Google’s Rich Results Testing Tool 

Rich results are a goldmine for SEO, so you should definitely target them with your content. 

What are rich results?

We could go on a convoluted explanation of what they are, but a visual example will work best:

This ‘snippet’ that appears at the top of the search results (in what’s called position zero) is an example of a rich result. 

Coincidentally, the rich result is also describing rich results, which include:

  • Carousels (most common for e-commerce products) 
  • Images
  • Knowledge bars 
  • Featured snippets (which is what the image above is) 

Rich results are powerful for SEO because you appear above the organic search results and paid ads. According to research, featured snippets and other results snag 35.1% of the total click share whenever they appear. 

Achieving position zero for a keyword means you’ll automatically beat out all your organic competitors since you’ll reign above the #1 result. 

But how do you know if your content will trigger rich results?

That’s where Google’s Rich Results Testing Tool enters the picture. 

It will let you know whether your web pages are set up to trigger rich results. In particular, the tool checks to see if your structured data (also called schema markup) is considered valid according to Google’s guidelines. 

Structured data is simply a standardized format for providing information about the content on your page. Accepted structured data formats include JSON-LD, RDFa, and Microdata. 
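JSON-LD is the format Google recommends, and it lives inside a script tag in your HTML. Here’s a pared-down illustrative example for an article (the values are placeholders you’d swap for your own page’s details):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "14 Technical SEO Tools You Need to Start Using",
  "author": { "@type": "Person", "name": "Jake Serota" },
  "datePublished": "2024-10-29"
}
</script>
```

Paste your page’s URL (or the markup itself) into the Rich Results Testing Tool, and it will validate exactly this kind of block.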

What’s truly handy is that the tool highlights any errors it finds and provides suggestions for improvement. That’ll save you the time and hassle of trying to figure out why your content isn’t triggering featured snippets or knowledge bars. 

Not only that, but you can preview how your rich results may appear in Google Search. This is extremely helpful for seeing how your content gets displayed to users, which is great for uncovering any potential issues. 

For instance, you may discover that Google isn’t displaying a featured snippet correctly, such as failing to include the right quote. 

It’s far better to uncover and resolve any issues during the preview phase instead of having to deal with them after they’ve gone live. 

Once your structured data is confirmed valid and you’re happy with the preview, you can submit your page to Google for indexing. As soon as that’s done, your content should start displaying rich results, which is great news for your organic traffic numbers (hint: they’ll go up). 

#9: Siteliner 

As stated previously, duplicate content presents real problems for search rankings, and it’s not something you want to deal with on a regular basis. 

This is why Siteliner comes in handy. 

While not quite a website crawler like Screaming Frog, it provides a similar service by identifying duplicate content and broken links. 

If you want to quickly discover whether you have duplicate content, enter your URL into Siteliner and hit Go. 

It will crawl up to 250 pages at once and then notify you of any duplicate content it finds. 

Why use this tool instead of Screaming Frog?

You can use either, but we like to use Siteliner for its pure speed

While it doesn’t have as many features as Screaming Frog, it excels at what it does provide. Finding duplicate content is extremely fast and easy with Siteliner, so it’s a good first stop if you suspect you have duplicate pages causing issues. 

Siteliner is particularly valuable for E-commerce site owners since duplicate content runs amok in that industry. 

Why is that?

It’s because E-commerce sites often have multiple identical pages for different versions of the same product, such as different colors for a hat. 

The blue hat page is virtually identical to the red hat page, and they’re both trying to rank for the same keywords. Strategic use of canonical tags is the solution to this issue, but Siteliner is a great way to find out if you have an issue to begin with. 
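In practice, that means every color variant points a canonical tag at one ‘master’ version of the product page — a generic example with hypothetical URLs:

```html
<head>
  <!-- Placed on /products/hat-blue and /products/hat-red alike -->
  <link rel="canonical" href="https://www.example.com/products/hat/">
</head>
```

Google then consolidates ranking signals onto the canonical URL instead of splitting them between near-identical pages.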

Other useful features 

Besides uncovering duplicate content, Siteliner has other useful features for technical SEO. 

These include:

  • Uncovering broken links
  • Finding skipped pages 
  • Identifying related domains 
  • Running site reports 
  • Downloading XML sitemaps 

The Broken Links tab will help you uncover any broken links on your website at the click of a button. You’ll get to see a complete breakdown of all the broken links on your site, as well as page summaries for each (where the broken links are highlighted). 

Once you uncover them, you can fix them by either adding a new hyperlink or using a 301 redirect. 
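If you go the redirect route, the syntax depends on your web server. On Apache, for instance, a one-line rule in your .htaccess file does the job (the paths here are hypothetical):

```apache
# Permanently send the dead URL to its closest live equivalent
Redirect 301 /old-broken-page/ https://www.example.com/new-page/
```

A 301 tells search engines the move is permanent, so most of the old page’s ranking power carries over to the new URL.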

You can also use Siteliner to double-check your loading times & page sizes. 

Siteliner will display your average page speed and average page size compared to other websites – which is a great way to see how you stack up with the competition. 

#10: HOTH SSL Certificate Checker 

SSL certificates are a must for any website in today’s age, especially if you want to rank on Google. 

Google announced HTTPS as an official ranking factor way back in August of 2014, which means your site needs an SSL certificate. 

What is an SSL certificate? Put simply, it’s a digital certificate that enables an encrypted connection to your website, scrambling sensitive user information like credit card numbers in transit. SSL stands for Secure Sockets Layer, and the name is a bit outdated: modern ‘SSL’ certificates actually use its successor, TLS (Transport Layer Security), but everyone still refers to them as SSL certificates. 

Secure browsing is especially important for websites with E-commerce features since nobody wants to enter their payment information on a site that isn’t encrypted. 

As we’ve detailed in a blog post from the past, it’s quite easy to get an SSL certificate for free. 

If you aren’t sure if one of your online properties has an SSL certificate, you can use our free SSL certificate checker tool. 

All you have to do is enter your URL and hit Run Certificate Check. 

Here’s what happens when we run our site through the tool:

As you can see, we have an extremely up-to-date SSL certificate on our site. The tool also lets us know that future Chrome updates will consider SSL certificates issued before June 1st, 2016, as obsolete. 

This means you should definitely use this tool to check your certificates to make sure they’re all up to date! 

#11: Web Developer Toolbar

Next up is the Web Developer toolbar extension that’s available for Firefox, Chrome, Edge, and Opera. 

What is it?

It’s a free browser extension that equips web developers and SEO professionals with powerful tools for analyzing and manipulating web pages. 

With it, you can quickly inspect page elements that matter for your technical SEO, including:

  • HTML analysis: The toolbar lets you inspect and analyze the HTML structure of any webpage. This matters for SEO because your HTML structure plays a big role in how search engine crawlers perceive your content. If anything is off, such as the inclusion of a noindex tag, it can cause certain web pages to disappear from the search rankings. Having a tool that lets you view your HTML to check for errors is incredibly useful, and it’s free to boot. 
  • Meta information: Metadata is crucial for SEO, as your meta tags need to contain important keywords. Whenever a search engine bot crawls your website, your metadata is the very first thing it will see, so your target keywords better be in there. With the Web Developer toolbar, you can check your metadata to ensure it’s properly optimized for search engines. 

You can also disable the CSS and JavaScript using the Web Developer toolbar, which is useful for revealing how search engines view the page. 

With CSS and JavaScript turned off, you can quickly uncover content rendering and accessibility issues, such as missing image alt tags. 

Speaking of image alt tags, you can analyze every image on your website with the toolbar. You’ll be able to see the alt attributes for your images, which are extremely important to include.

Why is that?

It’s because alt text is how search engines (and users with disabilities) make sense of the images you include. Crawler bots don’t have computer vision, so they won’t be able to see any of your images. Alt text is a short line of text describing what an image shows, and it’s an SEO best practice to include target keywords in alt tags (so that crawler bots see them). 
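In the HTML itself, alt text is just an attribute on the image tag — for example (hypothetical image, with a target keyword worked in naturally):

```html
<img src="technical-seo-audit.png"
     alt="Technical SEO audit checklist showing crawl errors and fixes">
```

Crawlers and screen readers both read that alt attribute, which is why the toolbar flags images that are missing it.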

The toolbar also lets you view and edit image dimensions, which is another factor that can affect performance and SEO. For example, if your image’s dimensions are off, they may not display properly on mobile devices. 

Lastly, the Web Developer toolbar is excellent for examining your backlinks and internal links. The ‘view link details’ feature lets you take a look at all the links on a page, including internal and external links. 

This is useful because:

  1. It helps you identify broken links 
  2. You can check the follow/nofollow tags for your backlinks 

There’s no literal ‘follow’ tag in HTML – a followed link is simply a normal link without a nofollow attribute, and Google counts it toward your SEO. Conversely, a nofollow tag (the rel=”nofollow” attribute) tells Google to ignore a backlink (although it may still have some impact on your rankings, just not as much as a followed link). 
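The distinction is visible right in the anchor tag’s rel attribute — illustrative markup:

```html
<!-- Followed: passes ranking credit by default -->
<a href="https://www.example.com/">Example</a>

<!-- Nofollowed: tells Google not to count the link -->
<a href="https://www.example.com/" rel="nofollow">Example</a>
```

This is exactly what the toolbar’s ‘view link details’ feature surfaces for every link on the page.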

For the backlinks you want to impact your rankings the most, ensure they all contain follow tags, as long as they don’t violate Google’s guidelines. 

In particular, Google requires all press backlinks to contain nofollow tags, so your press releases CANNOT contain follow tags – so bear that in mind when using the toolbar. 

#12: W3C Validator

The W3C Validator is a free online tool that acts as a quality control check for your website’s code. 

It ensures your site follows the official “rulebook” set by the W3C, the organization responsible for web standards. 

That may seem technical, but adhering to these standards has clear benefits for both your users and your search engine rankings.

Here’s how it works. 

Think of the Validator as a spell checker for code. You simply enter your website’s address, and it scans the underlying HTML, CSS, and other elements. 

If it finds anything that doesn’t follow the rules, it flags it as an error or warning. As a bonus, the tool also provides helpful tips on how to fix any issues it finds.

While the W3C Validator won’t magically boost your site to #1 on Google – clean, valid code contributes to a better overall SEO strategy through the following ways:

  • Easier crawling: Search engines like Google can navigate your site more smoothly when the code is nice and tidy. This helps Google’s crawler bots understand your content and index your pages properly.
  • Happy users: Valid code often means a website that works flawlessly across different browsers and devices. A good user experience keeps people on your site longer, which signals to search engines that your content is valuable.
  • Accessibility: Following web standards often improves accessibility for people with disabilities. Inclusive websites tend to rank better as they cater to a wider audience.
  • Technical SEO fixes: The Validator can catch technical SEO issues like broken links or incorrect header tags.

#13: MozBar

MozBar is a free browser extension for Google Chrome that yields real-time SEO insights while you’re browsing the web or analyzing the SERPs (search engine results pages). 

Moz is the brand behind the Domain Authority score, and the company’s data is extremely reliable. 

The primary benefit of MozBar is the ability to get instant SEO metrics and insights without having to leave your web browser. 

Whenever you’re on a web page, click on the MozBar to uncover its:

  • Domain Authority (DA) score 
  • Page authority (PA) score 
  • Total number of backlinks 
  • Keywords used on the page 
  • Details about a web page’s title tags, meta descriptions, header tags, and other key on-page SEO elements 

Let’s say you’re browsing the web when you come across a site in your niche that’s doing better than you are on the SERPs. 

That’s when you open up MozBar to check out the web page’s backlinks. When doing so, you uncover a few websites in your industry that accept guest posts (that you didn’t know about before). 

Voila, you just used MozBar to inform your SEO strategy. 

That’s one simple example of MozBar’s power, and the fact that it’s free means installing it is a no-brainer.

#14: Barracuda Panguin 

Google is constantly updating and tweaking its search algorithms, even outside of its officially announced ‘core’ updates. 

In fact, Google’s algorithms are updated literally thousands of times per year. 

That’s a lot of opportunities for things to go wrong with your SEO, especially if an algorithm change tanks the rankings of one of your web pages. 

Barracuda’s Panguin tool is a quick, easy, and free way to find out if your website (or a site you’re investigating) has been impacted by Google’s algorithm updates. In particular, it’s useful for determining whether a site has had its rankings impacted by an algorithm change or not. 

To use the tool, sign in using your Google account. From there, Panguin will access your website’s GA4 data and provide a visual overview of your organic traffic. 

At the bottom of the page, you’ll see a group of icons representing various Google algorithm changes:

You also have the option to toggle certain changes on and off to home in on particular types.    

Wrapping Up: The 14 Best Technical SEO Tools 

The tools on this list will make performing an in-depth technical SEO audit on your website a heck of a lot easier. 

From identifying duplicate content to fixing indexing errors, these tools will ensure all the technical aspects of your SEO strategy work flawlessly – making it effortless for web bots to crawl your website. 

Do you need help with the technical SEO at your company?

Then don’t wait to check out our extensive Technical SEO Service. Our digital marketing gurus know everything there is to know about technical SEO, and our audits leave no stone unturned. Go with your gut and book a call today!      

The post 14 Technical SEO Tools You Need to Start Using appeared first on The HOTH.

]]>
How to Identify Websites For High-Quality Guest Posts In 2024 https://www.thehoth.com/blog/quality-guest-post-sites/ https://www.thehoth.com/blog/quality-guest-post-sites/#comments Thu, 13 Jun 2024 08:50:48 +0000 https://www.thehoth.com/?p=28707


]]>
Link-building is a significant needle-mover for SEO, which is something we bring up time and time again on this blog. 

Yet, for the process to bear fruit, you need to know which tactics are worth your time and which aren’t. 

One of the most tried-and-true methods for building powerful backlinks is to publish guest posts on other websites in your industry. 

What are those?

A guest post is a piece of content (typically a blog or article, but other content types can apply) that you publish on another website. These are great for SEO because you can include a link back to your website (a backlink) to boost your ranking power on search engines like Google. 

Backlinks are credibility votes for the quality of your content, which is why generating links from respected websites is crucial. 

As with any SEO technique, there’s a right and a wrong way to approach guest posting. 

Stay tuned to learn how to identify the highest quality guest post opportunities to improve your search rankings. 

Are Guest Posts Worth the Time and Effort? 

Guest posting is one of the oldest link-building techniques, and it still generates impressive results to this day. 

It’s the embodiment of ‘if it’s not broken, don’t fix it!’

However, the outreach involved with targeting guest posts is notoriously time-intensive and highly competitive. 

Since guest posting is so popular, your competitors will likely target the exact same website for guest posts. This is especially true for niches that only have a handful of websites that accept guest posts (which can be a challenge for more popular industries, too). 

Also, there’s the additional time and resource expense involved with producing a fresh piece of content for the guest post. 

Given all the work involved, is targeting guest posts really worth it?

The answer is yes, and there are several reasons why. 

Exposure to new audiences 

Besides generating valuable backlinks, guest posting is also an excellent digital PR tactic that exposes your content to new audiences. 

Whenever you land a guest post, a whole new audience gets the chance to experience your brand. 

This can lead to referral traffic, increased brand awareness, and even conversions. 

The more sites you guest post on, the more this benefit intensifies. 

High-quality backlinks 

While this is the most obvious benefit associated with link-building, that doesn’t diminish its significance. 

If you want to outrank your competitors, then you need more high-quality, relevant (this is huge) backlinks than they have. 

Guest posting is so powerful because you’re gaining backlinks from respected, relevant websites in your industry (if you target the right websites, that is, but more on this in a bit). 

You’re also creating valuable content for another website’s users, which is great for building trust with a new audience. 

Therefore, backlinks from guest posts will:

  • Signal to Google that your content is trustworthy and authoritative 
  • Generate referral traffic from qualified prospects 

In the SEO world, it doesn’t get much better than that! 

Relationship building 

At its heart, guest posting is a form of networking (as is link-building as a whole). This provides the additional benefit of forming close relationships with other websites in your industry. 

Guest post outreach will likely have you connecting with bloggers, journalists, and website owners in your field. 

If you play your cards right and aim to build long-lasting relationships instead of one-and-done link placements, you’ll see the following benefits:

  1. Opportunities for future guest posts 
  2. Collaborations on other projects/content types 
  3. Sharing contacts and resources 
  4. Syndicating your top-performing articles on related websites 

Boosted credibility 

The more guest posts and backlinks you have to other high-quality websites, the more credibility you’ll have – both on search engines and with audiences. 

This will inch you closer toward thought leader status, which is an extremely advantageous position to have. 

Whenever your audience views you as a thought leader, they’ll turn to your brand first whenever there’s a new development or major news story. As a result, other websites will start to link back to your content as a valuable resource, meaning you’ll generate backlinks without having to do any outreach! 

Metrics to Target: Website Authority, Traffic, or Both?

If you want to find success with guest posting, then the websites you publish on need to already have established authority with Google. 

Why is that?

It’s because the whole aim of link-building is to pass ‘link juice’ (a fancy term for ranking power) from one site to another. 

Low-quality websites that Google doesn’t trust won’t pass any link juice to your website, so they won’t have any effect on your rankings (spammy sources can even hurt you, but we’ll cover more on this later). 

The two metrics used to gauge the quality of a website are:

  1. Its authority score 
  2. The amount of traffic it generates each month 

Ideally, you should guest post on websites that have relatively high authority scores and generate a consistent amount of traffic. 

If the site has high authority but doesn’t see many visitors, the backlink may positively affect your SEO – but you likely won’t see any referral traffic from it. 

The opposite is also true, as a website with a low authority score but high traffic won’t do much for your SEO, but it could lead to referral traffic. 

While the best of both worlds is ideal, targeting strictly authority or traffic can still yield benefits, so don’t be afraid to mix and match. 

Calculating website authority scores 

How can you tell if a website is authoritative or not?

The quickest and easiest way is to use a third-party metric that calculates a website’s likelihood to rank well on the SERPs. 

The two most popular metrics (and the two that we use at The HOTH) are Moz’s Domain Authority Score and Ahrefs’ Domain Rating. 

For us to consider a website for a guest post, it MUST have a minimum authority score of 20. Otherwise, it’s not worth the time and effort. 

If a site has a DA/DR score of 20, we want it to have a minimum of 150 organic visitors. If the website has a DA/DR score of 40+, then we shoot for a minimum of 500 organic visitors. 

These stipulations prevent us from acquiring backlinks that have artificially inflated authority and don’t provide value to users (which are the most likely to land you in trouble with Google). 
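The thresholds above boil down to a simple gate, sketched below. The function name is ours, and we’re assuming the 150-visitor floor applies to the whole DA/DR 20–39 band (the article only states it for a score of exactly 20):

```python
def passes_vetting(authority: int, monthly_organic_visitors: int) -> bool:
    """Apply the guest-post vetting thresholds described above.

    - Authority (Moz DA or Ahrefs DR) must be at least 20.
    - DA/DR in the 20-39 band needs 150+ organic visitors/month
      (assumption: the floor stated for a score of 20 covers the band).
    - DA/DR 40+ needs 500+ organic visitors/month.
    """
    if authority < 20:
        return False
    required = 500 if authority >= 40 else 150
    return monthly_organic_visitors >= required
```

So a DA 25 site with 200 monthly visitors passes, while a DA 45 site with only 400 visitors fails the stricter floor.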

What’s the difference between Domain Authority and Domain Rating?

You may be asking yourself why we use two third-party metrics to calculate a website’s level of authority. Beyond that, you may wonder why third-party metrics are necessary in the first place. 

Doesn’t Google have an official domain authority metric for SEOs to use?

They do, but they keep it a secret. PageRank (Google’s backlink algorithm) used to be public, but Google made it private to prevent marketers from ‘gaming the system.’ 

They’ve even stated publicly numerous times that they don’t calculate domain authority, but this claim has always raised suspicion from SEOs. 

Thanks to the recent leak of Google’s internal search documentation, we now know that Google DOES have a domain authority metric called siteAuthority – but we don’t have access to it. 

In the meantime, Moz’s DA and Ahrefs’ DR are the best that we have. 

Here’s the difference (and the reason why we use both):

Domain Rating examines the strength of your backlink profile but doesn’t examine any other ranking factors. 

Domain Authority, on the other hand, grades the quality of your backlink profile AND other important SEO ranking factors like: 

  • Age of the domain
  • The mobile-friendliness of the site
  • Quality of backlinks
  • Quality and quantity of unique content
  • Social share signals
  • Various on-page and technical SEO elements

As you can see, DA takes a more comprehensive look at a website’s SEO performance, whereas DR is strictly about backlink profile strength. 

Using both metrics gives us the ability to determine if a website is worth targeting for a guest post. If a site has sky-high DR but a low DA, it’s a sign that they need to work on their on-page SEO. The ideal scenario is a website that has suitable DR and DA scores. 

How much does traffic matter?

We’ve already gone over how websites that see a lot of visitors are great for generating referral traffic, but there’s an additional benefit. 

A website’s organic traffic is a ranking signal to Google. The more traffic a website has, the more Google views it as relevant and trustworthy. 

From thousands of campaigns on our managed service, HOTH X, we found that a mixture of website authority and traffic-based links is an effective link-building strategy. 

It’s also a great way to ensure a variety of referring domains pointing to your money pages.

Giving a website the ‘eye test’

You want to make sure that the websites you guest post on are real websites that serve a purpose for users. 

Why is that? Are there fake websites out there accepting guest posts?

The answer is a resounding yes. 

There’s no shortage of spammy websites that exist solely to link to other sites in an attempt to artificially boost search rankings. 

An example would be a private blog network, which is a series of websites that exist to provide links to one another. The issue is that these are fake websites that often publish:

  1. Nonsense content 
  2. The same blogs over and over 
  3. Inaccurate or made-up information 

The good news is these websites are easy to spot with a quick eye test. 

When peeping a website’s layout, look for:

  • Internal pages that go beyond the homepage 
  • Actual articles and resources 
  • A common theme or topic (many spam sites post random content) 

Also, pay attention to the overall design. Is the site easy to navigate, or is it cluttered with confusing sub-menus?

Here is an example of a site that DOES NOT pass the eye test:

Needless to say, you wouldn’t want to target a guest post from a website that looks like that! 

9 Red Flags to Look for When Vetting Guest Posting Websites 

If you want to become a guest posting guru, you need to know what NOT to look for just as much as what to look for on other websites. 

Guest posting on low-quality websites will backfire and have a negative effect on your search rankings, so it’s imperative to proceed with caution. 

Here are the top factors to avoid when looking for potential websites to guest post on. 

Too many ads

Have you ever visited a website and thought to yourself, “Gee, this site sure could use some more ads”?

Of course you haven’t!

Nobody likes sites crammed with ads since they negatively affect the user experience. An excessive amount of ads is also a telltale sign of a spam site you should steer clear of, so beware! 

Here is an example:

Outdated content

How often does the domain post content?

If they haven’t posted in over a year, you’re better off targeting another website. Dead domains (sites that haven’t been updated in years) aren’t good for SEO because Google likely hasn’t crawled them in quite some time. 

Google gives crawling priority to websites that are A) popular and B) post fresh content all the time. 

In fact, if you pick up a backlink from a super old domain, chances are you’ll have to wait an eon (possibly up to a year) for it to impact your SEO. 

Irrelevant site categories

Relevance is one of the most crucial factors for a backlink to positively affect your rankings. Google will either ignore or flag irrelevant links as spam. 

This means you need to target websites that are relevant to your niche, either in topic or contextually. 

Here’s the difference:

  • Topical relevance means the website covers the same topics as you do. It could be a blog, forum, e-commerce store, or other type of site. 
  • Contextual relevance refers to websites that aren’t normally relevant to yours but make sense given the context. An example would be a news website publishing a backlink to a project management software website. Ordinarily, the two aren’t related. Yet, if the backlink comes from an article entitled “New Project Management Software Taking Over Offices Nationwide,” then the link makes sense in context. 

The anchor text you use for your backlinks also matters. Here’s an in-depth guide on achieving proper anchor text ratios. 

Dofollow vs. nofollow

Links with dofollow tags will impact your SEO much more than nofollow tags. 

A dofollow tag is a line of HTML code that signals to Google that the backlink will count toward your search rankings. 

A nofollow tag does the opposite and signals to Google that the backlink SHOULD NOT affect your search rankings (although there’s evidence that nofollow links still retain some ranking power). 

There are certain situations where Google requires site owners to mark backlinks with nofollow tags. One example is press release backlinks. 
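Under the hood, these tags are just values in an anchor’s rel attribute, so you can check them programmatically. Here’s a minimal sketch using Python’s standard-library html.parser (the sample HTML is invented for illustration; the same logic also catches the sponsored attribute covered in the next section):

```python
from html.parser import HTMLParser

class LinkClassifier(HTMLParser):
    """Collect (href, kind) pairs, where kind is 'sponsored',
    'nofollow', or 'dofollow' (i.e., no restricting rel value)."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        rel = (attrs.get("rel") or "").lower().split()
        if "sponsored" in rel:
            kind = "sponsored"
        elif "nofollow" in rel:
            kind = "nofollow"
        else:
            kind = "dofollow"
        self.links.append((attrs.get("href"), kind))

sample = """
<a href="https://example.com/a">plain link</a>
<a href="https://example.com/b" rel="nofollow">nofollow link</a>
<a href="https://example.com/c" rel="sponsored nofollow">paid link</a>
"""
parser = LinkClassifier()
parser.feed(sample)
for href, kind in parser.links:
    print(kind, href)
```

Fetch a couple of a site’s published guest posts, run them through a classifier like this, and you’ll know whether its outbound links actually carry ranking power before you pitch.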

How do you find out if a website provides dofollow links?

Check out a few of the articles on the website. Use Moz’s toolbar extension (or any other SEO tool) to check whether the articles contain dofollow links.

You can also use Ahrefs Site Explorer. Here’s how. 

First, enter the URL of the domain you want to check.

[Image: Ahrefs Site Explorer example]

Next, scroll down and click linked domains:

From here, you can select the type of link and see the total number of each type coming from this domain.

This is a great way to ensure your link will be dofollow.

Additionally, confirm with the webmaster during your initial outreach email that your link will be dofollow.

Sponsored Tags/Sponsored Messages

Some websites accept guest posts but attach sponsored tags or sponsored messages to them.

In 2019, Google introduced the rel=”sponsored” attribute, which identifies sponsored or promotional links. Think of it as a type of nofollow link.

It’s basically telling Google that you’ve paid for the link and that it is promotional content.

While this isn’t inherently a bad thing, sponsored backlinks don’t carry the same ranking power as genuine guest posts do. 

These sites will often mark all of their sponsored content with a “Sponsored” tag, which makes it easy to find:

[Image: sponsored content example]

Even if they don’t use the sponsored link attribute or a sponsored tag, they might have sponsored messages before or after the content on their blog.

PBNs

We’ve already gone over the idea behind private blog networks. An SEO (or group of SEOs) creates a group of dummy sites that only exist to link to one another. 

It qualifies as a link scheme under Google’s official search guidelines, so you shouldn’t use them. 

When buying backlinks, you need to carefully vet where they get their links from. 

There are plenty of link vendors out there claiming to sell premium backlinks when they’re actually all from PBNs. 

How to spot PBNs online 

Here are the telltale signs of PBNs to look out for. 

Blogs covering a wide variety of topics with no dedicated focus:

A Write for Us page (these types of websites accept guest posts from pretty much everyone):

While it may seem like a godsend when looking for websites that accept guest posts, that’s exactly what they’re capitalizing on. It’s worth taking the time to find actual websites that accept posts instead of going with a quick and easy fix. 

How to Find Guest Post Opportunities 

Now that you know what not to look for, it’s time to learn how to find genuine guest posting opportunities online. 

Here’s a look at some of our favorite techniques for pinning down websites for guest blogging. 

Google Search operators 

Search operators are special strings of characters that let you further refine your searches, and they’re perfect for finding guest post opportunities. 

The most basic type of search operator involves wrapping keywords in quotations. This tells Google that you only want to see search results that contain those words. 

Going with this formula, here’s how you can target guest posts:

  • Basic search: “your industry/niche” + “write for us” (e.g., “marketing” + “write for us”)
  • Find websites accepting submissions: “your industry/niche” + “submit a guest post”
  • Target specific websites: “website name” + “guest post guidelines”
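If you’re prospecting several niches at once, these quoted-operator patterns are easy to batch up. A trivial helper (the function and template names are ours):

```python
# Quoted-operator templates from the patterns above.
OPERATOR_TEMPLATES = [
    '"{niche}" + "write for us"',
    '"{niche}" + "submit a guest post"',
    '"{niche}" + "guest post guidelines"',
]

def prospecting_queries(niches):
    """Expand each niche into the three guest-post search queries,
    ready to paste into Google one at a time."""
    return [t.format(niche=n) for n in niches for t in OPERATOR_TEMPLATES]

for query in prospecting_queries(["marketing", "personal finance"]):
    print(query)
```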

Social Media

Social media is a great tool for finding guest opportunities. 

After all, no matter your target audience, they’re bound to have a presence on social media somewhere. For instance, TikTok is popular with a younger crowd, while older generations prefer Facebook. Professional audiences hang out on platforms like LinkedIn, especially B2B companies. 

Basically, you should use it as a networking tool. 

Look up groups and forums related to your industry, and start entering your brand into the discussion. Also, visit websites mentioned or linked to by top influencers in your field. 

Content Analysis Tools

What’s trending in your niche? A great way to find out is to use a tool like Buzzsumo. It will connect you with hundreds of journalists in your field, and you’ll get the chance to identify blogs that cover topics similar to yours. 

Google Trends is another tool you can use to stay current with your audience. Simply enter a few keywords related to your industry, and you’ll get to see what’s currently trending. 

Leveraging Competitor Backlinks

Last but not least, you can let your competitors guide your backlink strategy. 

Check out the top-ranking websites for important industry keywords. From there, enter their URL into our free backlink checker tool to peep their backlinks. 

Make a note of any relevant business directories, link insertions, and guest blogs they’ve written. 

Since you share the same niche, any websites they target are also fair game for you! 

You can also try an advanced search operator on Google to find your competitors’ guest posts – though be aware that Google officially retired the link: operator in 2017, so its results are spotty at best, and a dedicated backlink tool is the more reliable route. Here’s the pattern:

link:domain.com -domain.com “guest post” (replace domain.com with your competitor’s domain)

This reveals websites where your competitor has contributed guest posts.

Example: link:backlinko.com -backlinko.com “guest post”

Do lots of outreach (and follow-up)

If you found a relevant website that you want a link from, it’s not guaranteed that they will want your post or that they will even RESPOND.

We make sure to follow up ASAP with webmasters we work with if we haven’t heard back from them for a few days.

It’s normal for webmasters’ inboxes to fill up every single day, which is why follow-up emails are a must (without overdoing it). 

Also, personalize each outreach email with the webmaster’s name, and throw in some details about how you’re a fan of their work (without sucking up too much).

Ultimately, your goal should be to form a positive, long-lasting relationship with the website instead of targeting a single link placement. 

Check out our complete guide for more information on how to do quality manual outreach.

Extra Link Outreach Tips 

We’re not done yet! 

Here are some extra tips to employ whenever you find a suitable website to write a guest post for:

Make sure YOUR content is quality

Quality content is the name of the game for any SEO strategy. Without it, all your outreach efforts are for naught. 

Also, remember that larger websites receive dozens (sometimes hundreds) of backlink requests per week. 

This means your content and pitch must stand out from the crowd. 

In particular, do your best to produce helpful content that answers users’ questions, educates them, and entertains them. 

Contact others who have done a guest blog 

Are you a total guest-posting newbie?

If so, you should connect with bloggers who’ve previously written guest posts for the website you’re targeting for a backlink. Politely ask about their experience, the website’s editorial process, and responsiveness.

This gives you insider knowledge about the publication’s expectations, which will help you craft a picture-perfect pitch.

Scrutinize guest post guidelines

As stated before, PBNs accept guest posts from anybody and everybody online, which is to their detriment. 

To find true success, you should post to websites that have standards. In particular, look for quality guidelines, style guides, and direction for tone of voice. These are all signs of quality blogs that are worth pursuing backlinks from.

Conclusion: Identifying Quality Websites for Guest Posts 

That wraps up our breakdown of how to vet websites for guest posts. 

We follow these guidelines to the letter every time we consider a website for a guest post, and they haven’t failed us yet. 

It takes a lot of effort to not only vet link opportunities, but also to pursue them. In fact, it proves to be too much to handle for most businesses. 

At The HOTH, we can take the link outreach process off your hands to simplify your SEO success, so don’t wait to book a call today.   

The post How to Identify Websites For High-Quality Guest Posts In 2024 appeared first on The HOTH.

]]>
https://www.thehoth.com/blog/quality-guest-post-sites/feed/ 15
How to Do a Competitor Analysis in Semrush to Find SEO Ideas https://www.thehoth.com/blog/competitor-analysis-in-semrush/ https://www.thehoth.com/blog/competitor-analysis-in-semrush/#respond Tue, 14 May 2024 09:11:53 +0000 https://www.thehoth.com/?p=35816 Consistently coming up with outstanding content ideas and high-authority backlink opportunities is a tall task for anyone, even seasoned digital marketing experts.  The good news is that in the SEO world, it’s perfectly acceptable to cheat off a classmate’s test.  What we mean by that is you can analyze your competitors’ websites to find new […]

The post How to Do a Competitor Analysis in Semrush to Find SEO Ideas appeared first on The HOTH.

]]>
Consistently coming up with outstanding content ideas and high-authority backlink opportunities is a tall task for anyone, even seasoned digital marketing experts. 

The good news is that in the SEO world, it’s perfectly acceptable to cheat off a classmate’s test. 

What we mean by that is you can analyze your competitors’ websites to find new SEO opportunities and improve the performance of your own site. 

One of the best ways to pull this off is to learn how to do a competitor analysis in Semrush, which offers a suite of SEO tools perfect for peeking under the hood of competing websites. 

With their tools, you’ll be able to:

  • Discover your competitor’s top traffic sources
  • Analyze a competitor’s backlink profile to find new opportunities 
  • Find user experience enhancements that your site lacks (e.g., faster loading speeds, responsive design, easier navigation)
  • Determine the effectiveness of their landing pages 

As you can see, analyzing your SEO competitors will yield many valuable insights for your own strategy, which is why learning how to do a competitor analysis in Semrush is worth your time – so stick around to learn more. 

What’s a Competitor Analysis in SEO?

SEO is a fiercely competitive space, as websites are constantly outranking and outperforming one another on Google’s search engine results pages (SERPs). 

In particular, websites fight for the coveted #1 organic result, as well as landing SERP features like knowledge panels and the ‘local pack’ (a grouping of three local businesses complete with their address and location on Google Maps). 

Even when a website achieves stellar rankings for its most important keywords, it has to fight to maintain those positions – as there are always competitors out there eager to dethrone it at a moment’s notice. 

That means continuing to put out excellent content that’s optimized for search engines, which can be difficult to maintain after a while. 

That’s why so many SEOs look to their competitors for inspiration, especially for new content ideas and fresh link opportunities. 

Platforms like Semrush provide detailed data about virtually any website’s SEO and marketing activities, and taking a peek can provide invaluable insights. 

Therefore, you shouldn’t hesitate to analyze your competitors’ SEO profiles, as they’re likely doing the same to you. 

During a competitor analysis, you should pay attention to things like:

  • Their total organic traffic
  • Whether their audience is growing or declining
  • The channels that drive the most traffic 
  • Their backlink profiles (especially where they’re getting their links from) 

Analyzing your competitors isn’t a one-time task, either. You can and should make a regular habit of auditing your top competitors’ websites. 

How Do You Identify Your SEO Competitors?

Before you can analyze your competitors, you need to know who they are – which isn’t always straightforward. 

Sure, you can search on Google for the keywords you’re targeting and see who shows up, but that won’t give you a comprehensive overview of your top competitors. 

That’s where Semrush comes in handy. 

Its Organic Research tool will take all the mystery out of identifying your competitors. Simply enter your URL in the search bar, and then navigate to the Competitors tab. 

Voila, you now have a complete list of the websites competing for the same traffic, keywords, and target audience. 

You’ll also get to view key metrics like how many keywords they rank for, their total organic traffic, any keywords that you share (common keywords), and more. 
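If you’d rather pull that competitor list programmatically, Semrush also exposes its reports over an HTTP API. The sketch below only builds the request URL; the endpoint, the domain_organic_organic report type, and the export-column codes follow Semrush’s public Analytics API as we understand it, so treat them as assumptions and verify against the official docs (YOUR_API_KEY is a placeholder):

```python
from urllib.parse import urlencode

API_ROOT = "https://api.semrush.com/"

def competitors_report_url(domain, api_key, database="us", limit=10):
    """Build a request URL for the 'competitors in organic search'
    report. Report type and column codes are taken from Semrush's
    Analytics API docs as best recalled -- double-check before use."""
    params = {
        "type": "domain_organic_organic",
        "key": api_key,
        "domain": domain,
        "database": database,
        "display_limit": limit,
        # Dn=domain, Cr=competition level, Np=common keywords,
        # Or=organic keywords, Ot=organic traffic (assumed codes)
        "export_columns": "Dn,Cr,Np,Or,Ot",
    }
    return API_ROOT + "?" + urlencode(params)

print(competitors_report_url("example.com", "YOUR_API_KEY"))
```

Fetching that URL (with a real key) returns a CSV-style report you can feed straight into a spreadsheet or monitoring script.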

Semrush’s Traffic Analytics Tool 

Once you know who your top competitors are, you can do head-to-head comparisons between their site and yours using Semrush’s Traffic Analytics tool. 

First, enter a competitor’s URL into the tool to see a breakdown of their traffic, bounce rate, dwell time, market share, and other important metrics. 

This will give you a detailed overview of the competitor’s current SEO performance. If you scroll further down the page, you’ll find a graph displaying the same metrics as above but over a period of time. 

The graph is useful for spotting competitor trends, such as certain times of the year when they experience dips in traffic (like during the holidays). 

Besides analyzing a competitor’s metrics on their own, you can also plug in your URL for a direct side-by-side comparison of SEO and user experience metrics. 

Under Root Domain, you’ll see four blank competitor slots. Add your URL into one of them, and hit the Compare button. 

This is extremely effective for visualizing the gap (if there is one) between you and a competitor. In the example provided, you can see that Vrbo trails behind Airbnb in nearly every category except pages per visit and bounce rate. 

That could mean that while Airbnb is generating more traffic, users are interacting more with Vrbo’s site. Since the bounce rate is lower and the pages-per-visit metric is higher, it’s a sign that users are continuing on to other pages on Vrbo’s website instead of ending their session after a single page. 

If you were Airbnb, it would be worth looking into Vrbo’s content and internal linking structure to see what’s engaging users so much. 

Analyzing a Competitor’s Backlink Profile 

Once you’ve done an in-depth analysis of a competitor’s content and traffic sources, it’s time to look at their backlink profile. 

You should never complete a competitor analysis without looking at their backlinks, as they often provide the most valuable insights and opportunities. 

Semrush’s Backlink Analytics tool is perfect for checking out a competing site’s top backlink sources. 

Enter a competitor’s URL into the tool, and you’ll be able to see an overview of their backlink profile, including their total number of referring domains and backlinks. 

Under the Backlinks tab, you can view a competitor’s complete backlink profile, which will help you uncover new link opportunities to pursue. In particular, pay attention to their source pages. 

Are they getting links from directories you don’t know about? Or are they using websites in your niche that accept guest posts? Either way, you could be a few outreach emails away from lots of new backlinks. 

Claim an Exclusive 14-Day Trial of Semrush Pro Today 

Analyzing competitors is one of the best ways to mix up your SEO strategy – especially if you’ve been a bit stagnant lately. 

If you’re fresh out of engaging content ideas and can’t seem to move the SEO needle anymore, your competitors’ websites should be your first stop. 

Are you ready to enjoy all the powerful SEO tools that Semrush offers?

Then, don’t wait to take advantage of our exclusive extended 14-day trial of Semrush Pro. The standard trial only lasts 7 days, so don’t forget to use our special link to snag an extra week for free!    

The post How to Do a Competitor Analysis in Semrush to Find SEO Ideas appeared first on The HOTH.

]]>
https://www.thehoth.com/blog/competitor-analysis-in-semrush/feed/ 0
An Empty Cookie Jar: Google Phases Out Third-Party Cookies https://www.thehoth.com/blog/google-phases-out-third-party-cookies/ https://www.thehoth.com/blog/google-phases-out-third-party-cookies/#comments Fri, 05 Jan 2024 17:19:53 +0000 https://www.thehoth.com/?p=35222 On January 4th, 2024, Google disabled the tracking data from third-party cookies for 1% of Chrome users – which is part of a greater plan to phase out tracking cookies completely by Q3 2024.  These aren’t the type of cookies that your grandma makes, either.  Despite the innocent-sounding name, third-party cookies are small files that […]

The post An Empty Cookie Jar: Google Phases Out Third-Party Cookies appeared first on The HOTH.

]]>
On January 4th, 2024, Google disabled the tracking data from third-party cookies for 1% of Chrome users – which is part of a greater plan to phase out tracking cookies completely by Q3 2024. 

These aren’t the type of cookies that your grandma makes, either. 

[Image: a grandma discovering cookies online]

Despite the innocent-sounding name, third-party cookies are small files that collect analytics, track browsing behavior, and personalize online ads. 

In other words, cookies are the reason why YouTube and Amazon ads all display things you’ve searched for online recently (like a YouTube ad for mattresses playing shortly after visiting mattress websites). 
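Mechanically, that cross-site tracking rides on an ordinary Set-Cookie header scoped to the ad network’s domain and marked SameSite=None; Secure so browsers will send it on cross-site requests. A sketch with Python’s standard-library http.cookies (the cookie name, value, and domain are made up for illustration):

```python
from http.cookies import SimpleCookie

cookie = SimpleCookie()
cookie["_tracker"] = "user-8f3a2c"  # made-up visitor ID
morsel = cookie["_tracker"]
morsel["domain"] = ".ads.example.com"  # ad network's domain, not the site being visited
morsel["path"] = "/"
morsel["samesite"] = "None"  # allow the browser to send it cross-site
morsel["secure"] = True      # required alongside SameSite=None

# The header an ad server would emit on every page carrying its ads:
print(cookie.output())
```

Because that same header is served on every site embedding the network’s ads, the network can stitch one user ID across all of them – which is exactly the behavior being phased out.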

While third-party cookies have been invaluable for digital ad agencies to gather precious consumer data, they’ve long been viewed as a violation of user privacy. 

Adding fuel to the fire is the fact that hackers can use cookies for nefarious purposes, such as stealing a user’s personal information or financial data. 

That’s why Google plans to eliminate third-party cookies, as they feel it will provide a safer, more private browsing experience. 

Not everyone is celebrating the news, though. 

Lots of ad tech firms rely on third-party cookies to inform their advertising strategies, and as soon as Google gets rid of cookies entirely in Q3, they’ll lose their data collection abilities. 

There is a light at the end of the tunnel, though. 

Stay tuned to learn more about Google’s campaign against cookies (including their plan for the future of ad tracking) and how focusing on SEO will help you kick the cookie habit. 

1% of Chrome Users are Now Cookie-Free 

While 1% of Google Chrome users may not seem like a lot, it’s crucial to remember that Chrome is the most popular online web browser. 

As of December 2023, a whopping 62.85% of online users cite Chrome as their preferred browser. Their closest competitor is Apple’s Safari, which accounts for 20.04% of users, so Google has an immense lead. 

That means that 1% of Chrome users translates to about 30 million online users, which is pretty significant. 

Google chose to start with 1% to enable small-scale testing of the Privacy Sandbox tools during Q1 of this year. 

Once they’re done tweaking and refining the Tracking Protection feature, it will roll out to the rest of Chrome users in Q3 2024. 

This change has been a long time coming, as competing browsers like Mozilla’s Firefox and Apple Safari already phased out third-party cookies by 2020. 

Google planned to follow suit shortly after, but its plans to phase out cookies have been postponed twice already. Initially scheduled for 2022, the implementation was postponed until late 2023 and then again to early 2024. 

The reason for the delay?

Privacy Sandbox vice president Anthony Chavez claims the delays were necessary to test and refine the technologies involved with the process. 

Since Safari and Firefox have already kissed cookies goodbye, third-party cookies will basically become a thing of the past as soon as Google’s implementation is complete in Q3 2024. 

Cookie-Free Advertising: The Way of the Future 

The writing is on the wall for third-party cookies, so the last thing you should do is plan to keep using them in 2024. 

Instead, it’s time to audit your cookie usage to find ways to collect user data without them. 

This is easier said than done for lots of ad tech agencies, and they remain wary of Google’s proposed alternatives (more on this in a bit). 

As a result, some ad publishers may see an initial decrease in revenue from the removal of tracking cookies. They will only recover from the loss if they adapt and find a new solution to track user data. 

The good news is there are plenty of ways that ad agencies can get ahead of the curve and preserve their business – one of which is Google’s suggestion to switch to APIs. 

Weaning off cookies with Google’s API alternatives

Google isn’t giving digital ad agencies the cold shoulder during the cookie phaseout, as they have developed alternative ways for advertisers to collect user data. 

In particular, Google has several application programming interfaces (APIs) that mimic the invaluable functions of third-party cookies for online advertisers – albeit while preserving users’ anonymity. 

But can you really kick your cookie cravings with one of Google’s API patches?

The Topics and Protected Audience APIs provide advertisers with limited information about users’ interests. Chrome discovers these interests by analyzing browsing history data that it stores on user devices instead of external servers for better security. 

From there, websites can choose ads based on these user interests, although the API won’t track cross-site activity as third-party cookies do. 

Still, these two APIs provide advertisers a way to collect valuable user data without violating their privacy (hence the name ‘Protected Audience’ API). 

Since these APIs don’t fully replicate the tracking capabilities of third-party cookies, some advertisers may view them as watered down. 

There are also ID-based solutions that advertisers can flock to if they aren’t a fan of Google’s APIs. These cookieless solutions use anonymized email addresses instead of third-party cookies, and they provide cross-site tracking. 

Got No Cookies? Focus on Organic SEO Instead 

If you don’t want to see a significant revenue dropoff once cookies fall by the wayside, you should start focusing on organic SEO sooner rather than later. 

With search engine optimization, you can generate tons of traffic, leads, and sales – and no third-party cookies (or expensive ad bidding) are required. 

As long as you have outstanding content and an impressive backlink profile, you’ll always appear at the top of the SERPs (search engine results pages), Google will trust your content, and you’ll enjoy lots of referral traffic. 

The prevalence of AI, Google SGE, and other factors are significantly changing the online advertising space, so you need to find ways to stay ahead of the curve – and SEO is the best way to do that. 

Moreover, SEO is a form of inbound marketing, so you don’t have to disrupt user experiences with ads. Instead, your informative and engaging content will draw your audience to your front door, making it far easier to land conversions. 

If you need help adjusting to a cookie-less internet, don’t wait to sign up for HOTH X, our renowned managed SEO service run by experts.    

The post An Empty Cookie Jar: Google Phases Out Third-Party Cookies appeared first on The HOTH.

]]>
https://www.thehoth.com/blog/google-phases-out-third-party-cookies/feed/ 1