9 Powerful Tips to Troubleshoot Your Technical SEO
Many publications offer best-practice SEO guides. However, a practice guide alone is not enough; webmasters and SEO agencies also need help diagnosing the issues they actually run into. If you work at an SEO agency or provide SEO services, this article is for you: it collects solutions to many common SEO problems. Check it out.
Info: Search Operator
In many cases, [info:https://www.domain.com/page] can aid you in diagnosing a variety of issues. It also shows you whether a page is indexed and how it is indexed. Sometimes Google chooses to fold pages together in its index and treat two or more duplicates as the same page. This command shows you the canonicalized version, which is not necessarily the one specified by the canonical tag; rather, it is what Google views as the version it wants to index.
Basically, Google doesn't want two copies of the same page in its index. That is why, if you search for your page with this operator, you may see another URL ranking instead of the one you expected to see in search results. If you have exact duplicates across country-language pairs in hreflang tags, the pages may be folded into one version, and the wrong page may show for the affected locations.
&filter=0 added to Google Search URL
You can add &filter=0 to the end of the URL in a Google search to remove filters and see more of the websites in Google's consideration set. When you do this, you might see two versions of a page, which may indicate issues with duplicate pages that weren't rolled together; both might claim to be the correct version, for instance, and have signals to support that claim.
This parameter also shows you other eligible pages on your website that could rank for the query. If you have multiple eligible pages, you likely want to consolidate them, or add internal links from the other relevant pages to the page you want to rank.
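A minimal sketch, using only Python's standard library, of where the parameter goes in the search URL; the query string is a placeholder:

```python
# Minimal sketch: build an unfiltered Google search URL.
# The query string is a placeholder.
from urllib.parse import urlencode

query = "example keyword"
url = "https://www.google.com/search?" + urlencode({"q": query, "filter": "0"})
print(url)  # -> https://www.google.com/search?q=example+keyword&filter=0
```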
Site: search operator
A site: search can reveal a wealth of knowledge about a website. It helps you find pages that are indexed in ways you wouldn't expect, such as with parameters, pages in site sections you may not know about, and pages being indexed that shouldn't be.
Site:domain.com keyword
To check for relevant pages on your site, for another look at consolidation or internal link opportunities, you can use [site:domain.com keyword].
Another benefit of this search is that it shows you whether your website is eligible for a featured snippet for that keyword. Check through the top pages returned by this search to see what is included in their eligible featured snippets, and try to figure out what your website is missing or why one page may be showing over another.
Searching for a "phrase" instead of a keyword is useful for checking whether particular content is being picked up by Google at all, which is handy on websites that are JavaScript-driven.
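If you run these checks often, a minimal sketch like the following can generate the queries for pasting into Google; the domain, keyword, and sentence are placeholders:

```python
# Minimal sketch: build the site: and "phrase" queries described above.
# Domain, keyword, and sentence are placeholders.
domain = "domain.com"
keyword = "blue widgets"
sentence = "a distinctive sentence copied from the page"

print(f"site:{domain} {keyword}")      # consolidation / internal link check
print(f'site:{domain} "{sentence}"')   # is this exact content indexed?
```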
Static vs. Dynamic
When you're dealing with JavaScript (JS), it's important to know that JS can rewrite the HTML of a page. When you look at view-source or even Google's cache, you are looking at the unprocessed code, which doesn't show what may have changed after the JS is processed. To see what is actually loaded into the DOM (Document Object Model), use "Fetch and Render" in Google Search Console; it shows you how Google actually sees the page, since it may be you who is looking at the wrong version. There are cases where the raw HTML looks right, but when processed, something in the <head> section breaks and causes it to end early, throwing tags like canonical or hreflang into the <body> section. These tags aren't supported in the body, because supporting them there would allow the hijacking of pages from other websites.
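To compare the two versions yourself outside of Search Console, something like the following minimal sketch works; it assumes the requests and selenium packages, a locally installed headless Chrome, and a placeholder URL:

```python
# Minimal sketch: compare the raw HTML with the JS-processed DOM to
# spot tags that move out of the <head> after rendering.
# Assumes requests, selenium, and a local headless Chrome; placeholder URL.
import requests
from selenium import webdriver
from selenium.webdriver.chrome.options import Options

url = "https://www.example.com/page"

# 1. Unprocessed source, roughly what view-source shows.
raw_html = requests.get(url, timeout=10).text

# 2. The DOM after JavaScript has run, closer to what a rendering crawler sees.
opts = Options()
opts.add_argument("--headless")
driver = webdriver.Chrome(options=opts)
driver.get(url)
rendered_dom = driver.page_source
driver.quit()

# Check whether the canonical tag is still inside <head> in each version.
for name, html in (("raw", raw_html), ("rendered", rendered_dom)):
    head = html.split("</head>", 1)[0]
    print(name, "canonical in <head>:", 'rel="canonical"' in head)
```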
Check Redirects and Header Responses
Another tip is to understand how your redirects are being handled. If you are unsure whether a certain path is being consolidated, you can check the "Links to Your Site" report in Google Search Console and look for links that point to pages earlier in the chain. If they appear in the report for the final page and are shown as "via this intermediate link," it's a safe bet that Google is counting the links and consolidating the signals to the latest version of the page.
Header responses are where things get really interesting. Canonical tags and hreflang tags can be set here, and they can conflict with the tags on the page itself. Redirects that use the HTTP header can also be problematic. More than one case has been found where people set the "Location:" header with nothing in the field and then redirected users on the page with a JS redirect; as a result, users are redirected to nothing before they can ever reach the other redirect.
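A minimal sketch with the requests library for walking a redirect chain and checking the headers at each hop; the URL is a placeholder:

```python
# Minimal sketch: walk a redirect chain and print the status code plus
# any Location or Link (canonical/hreflang) headers set in the HTTP
# response itself. Placeholder URL.
import requests

resp = requests.get("https://www.example.com/old-page", allow_redirects=True)

for hop in resp.history + [resp]:
    print(hop.status_code, hop.url)
    for header in ("Location", "Link"):
        if header in hop.headers:
            print("   ", header + ":", hop.headers[header])
```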
Check for Multiple Sets of Tags
Tags can be found in multiple locations, such as the HTTP header, the <head> section, and the sitemap. Check for any inconsistencies between them. There's also nothing stopping a page from carrying multiple sets of tags. For instance, your template adds a meta robots tag set to index, and then a plugin adds one set to noindex. You can't assume there is only one tag for each item, so don't stop looking after the first one you find.
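A minimal sketch, assuming the requests and beautifulsoup4 packages, that lists every robots and canonical directive it can find in the HTTP header and the HTML, so conflicts stand out; the URL is a placeholder:

```python
# Minimal sketch: surface all robots/canonical directives on a page.
# Assumes requests and beautifulsoup4; placeholder URL.
import requests
from bs4 import BeautifulSoup

resp = requests.get("https://www.example.com/page")
print("X-Robots-Tag:", resp.headers.get("X-Robots-Tag"))

soup = BeautifulSoup(resp.text, "html.parser")
for tag in soup.find_all("meta", attrs={"name": "robots"}):
    print("meta robots:", tag.get("content"))
for tag in soup.find_all("link", rel="canonical"):
    print("canonical:", tag.get("href"))
```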
Change UA to Googlebot
Sometimes you need to see things the way Google sees them. Interesting issues such as cloaking, user redirection, and caching only show up when you look as Googlebot, and you can change your user-agent with Chrome Developer Tools or with a plugin. It's best to do this in incognito mode. Among other things, you'll want to check that Googlebot isn't being redirected somewhere it shouldn't be.
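A minimal sketch with requests for running the same check from the command line; the URL is a placeholder and the user-agent strings are only illustrative:

```python
# Minimal sketch: fetch a page as a regular browser and as Googlebot,
# then compare where each request ends up. A quick check for cloaking
# or bot-specific redirects. Placeholder URL; illustrative UA strings.
import requests

URL = "https://www.example.com/page"
AGENTS = {
    "browser": "Mozilla/5.0 (Windows NT 10.0; Win64; x64)",
    "googlebot": "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)",
}

for name, ua in AGENTS.items():
    r = requests.get(URL, headers={"User-Agent": ua}, allow_redirects=True)
    print(name, r.status_code, r.url, len(r.text))
```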
Robots.txt
Check robots.txt for anything that might be blocked. If a page is blocked there, Google cannot crawl it and therefore can't see any of the tags on it either, so check for recent changes to the file as well. Furthermore, if you have a problem with a page not being indexed and can't figure out why: although not officially supported, a noindex directive via robots.txt will keep a page out of the index, and this is just another possible location to check.
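A minimal sketch using the standard library's urllib.robotparser to test whether Googlebot is allowed to crawl a given URL; the domain is a placeholder:

```python
# Minimal sketch: check whether Googlebot may crawl a URL according to
# the site's robots.txt. Placeholder domain.
from urllib import robotparser

rp = robotparser.RobotFileParser()
rp.set_url("https://www.example.com/robots.txt")
rp.read()

url = "https://www.example.com/some-page"
print("Googlebot allowed:", rp.can_fetch("Googlebot", url))
```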
Summing Up
In a complex environment with many teams working on the same project, it's important to assume that everything will change and everything will break at some point. All of those possible points of failure are what make the job of a technical SEO interesting and challenging.