How Google Classifies Webpages as ‘Low Quality’


When we talk about producing websites of real value, there is no denying that they have to pass Google’s quality assessment. Google rolls out quality updates on a regular basis to classify sites, and as a web developer you surely want yours to be rated high quality. That means you need to know how Google classifies webpages as ‘Low Quality’. Knowing what makes a webpage count as low quality is just as important as knowing what makes it high quality, so to keep your site from ranking poorly, read the following.

Excessive & Unnatural Internal Structural Links

Google has been hinting at its low-quality criteria list since early 2012, when it filed a patent that directly targets websites repeating internal links across sidebars and footers. Repeating such links used to be a standard SEO technique, but it is now treated as an unnatural form of SERP manipulation.

Some signs that a website is untrustworthy are obvious: it has been hacked, or it is full of spam comments and spam pages. But there are also less obvious factors that Google treats as low-quality signals.
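To see what the patent is describing in practice, here is a minimal sketch of an internal-link audit, assuming the `requests` and `beautifulsoup4` packages are installed; the page URLs and the 90% repetition threshold are hypothetical illustration values, not anything Google has published:

```python
# Sketch: flag internal links that repeat across most pages of a site.
# The URLs and the 0.9 threshold are hypothetical illustration values.
from collections import Counter
from urllib.parse import urljoin, urlparse

import requests
from bs4 import BeautifulSoup

PAGES = [
    "https://example.com/",
    "https://example.com/about",
    "https://example.com/products",
]

link_counts = Counter()
for url in PAGES:
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    # collect each page's internal links once, resolved to absolute URLs
    internal = {
        urljoin(url, a["href"])
        for a in soup.find_all("a", href=True)
        if urlparse(urljoin(url, a["href"])).netloc == urlparse(url).netloc
    }
    link_counts.update(internal)

# Links that appear on (almost) every crawled page are candidates for the
# kind of repeated sidebar/footer linking the patent describes.
threshold = 0.9 * len(PAGES)
for link, count in link_counts.most_common():
    if count >= threshold:
        print(f"{link} appears on {count}/{len(PAGES)} pages")
```

Links that show up on nearly every page, typically sidebar and footer boilerplate, are the ones worth reviewing against the patent’s description.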

Phantom 2

As Google kept updating its algorithm, a change in rankings appeared at the end of April 2015, and in May 2015 it was dubbed the Phantom 2 update. Many webmasters speculate that Phantom 2 was built to enforce the newly patented low-quality criteria list; many websites with poor-quality content saw upwards of a 10 percent decrease in organic search traffic.

Over-Monetization of Content

Google treats pages that are designed to trick users, search engines, or both as the lowest quality of all. If Google determines that your site falls into this category, it will record it as lowest quality. At the same time, Google has taken greater steps to cater for high-volume search queries that don’t have a single dominant interpretation.

Ads & Affiliate Links

When it comes to ads and affiliate links, make sure that the main focus of the page remains its main content: users shouldn’t have to scroll past affiliate links and ads, or interact with intrusive overlays, to reach it. Ad widgets such as Taboola and Outbrain are designed to blend in with the page; this makes them look like they belong to the page and increases users’ trust, but the downside is that it can hurt your quality rating.

E-commerce Trust Factors

Another issue worth underlining is a lack of attention to detail on e-commerce hygiene pages and hygiene content. Below are the pages you should pay the most attention to.

Financial Transaction Pages

Financial transaction pages are not only the cart/checkout and subsequent stages on the website, but also any page that allows a user to “purchase” or add a product to their cart/basket. Good transaction pages should contain prominent links to standard e-commerce hygiene pages, such as contact details, returns, and terms. Since these can be considered the most important user-facing pages, you should develop them with the best effort possible.

Financial Information Pages

Don’t forget to create high-quality content for your financial information pages, whether they cover financing product purchases on your own site or insurance for products or services. High-quality content here should use keywords deliberately and reflect a real understanding of the relationship between keywords and content.

The bottom line of this article: to get your SEO basics right, make sure your pages and internal linking structures serve users first, rather than existing merely to pass link equity to the money pages, especially on e-commerce and brochure service websites.

 

Why Real-Time Search Algorithm Updates May Be Bad News


Have you ever wondered what updates will come after Panda, Penguin, and the rest? Wonder no more: Google has moved to real-time search algorithm updates, which means there is no longer any need to name each update. On the other hand, real-time updates will surely make the job of any SEO service harder. So, what kinds of problems arise from real-time search algorithm updates?

  • Troubleshooting Algorithmic Penalties

Google hands out two types of penalties: algorithmic penalties and manual actions. Algorithmic penalties are a lot more difficult to troubleshoot, because with a manual action Google informs you of the penalty via Google Search Console, giving webmasters the ability to address the issues that are negatively impacting their sites. With an algorithmic penalty it works the opposite way: you are never told that a problem exists.

So, how can you tell whether your site has been hit by an algorithmic penalty? The easiest way is to match a drop in your traffic against the dates of known algorithm updates. You can also look for telltale ranking patterns, such as a site that ranks high in maps but poorly in organic results for the same phrases.
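As a rough illustration, here is a minimal sketch of that matching process; the update date, traffic numbers, and 20% drop threshold below are all hypothetical illustration values:

```python
# Sketch: flag days where organic traffic dropped sharply near a known
# algorithm update. All dates, numbers, and the 20% threshold are
# hypothetical illustration values.
from datetime import date

known_updates = {
    date(2015, 4, 29): "Phantom 2 (approximate rollout)",
}

# daily organic sessions, e.g. exported from your analytics tool
traffic = {
    date(2015, 4, 27): 1040,
    date(2015, 4, 28): 1010,
    date(2015, 4, 29): 760,
    date(2015, 4, 30): 705,
}

days = sorted(traffic)
for prev, cur in zip(days, days[1:]):
    drop = (traffic[prev] - traffic[cur]) / traffic[prev]
    if drop < 0.20:
        continue
    # check whether a known update landed within a few days of the drop
    for update_day, name in known_updates.items():
        if abs((cur - update_day).days) <= 3:
            print(f"{cur}: traffic fell {drop:.0%}; near update: {name}")
```

In practice you would export daily organic sessions from your analytics tool and maintain the update list from public SEO news sources.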

  • Lead to Misdiagnosis and Confusion

What makes this difficult and confusing is that Google’s crawlers don’t crawl all pages at the same frequency. Even if you keep a detailed timeline of website changes and actions, you still stand a good chance of being blindsided by an algorithm update, because there may be server issues or site changes you are not aware of, and that makes penalties easy to misdiagnose. In fact, SEO companies that react hastily, for example by disavowing links in the hope of better link building, may do more harm than good.

  • Negative SEO

With real-time updates, you will need more time and effort to analyze your link profile and judge whether your backlinks are healthy. If they are not, it is time to remove them.
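If you cannot get bad links removed by contacting site owners, the usual fallback is Google’s disavow tool, which accepts a plain-text file of domains and URLs. A minimal sketch of such a file, with entirely hypothetical domains:

```
# Lines starting with "#" are comments.
# Contacted the site owner twice, no response:
domain:spammy-link-farm.example
domain:paid-links-network.example
# Disavow a single page rather than a whole domain:
https://blog.example/thin-guest-post.html
```

Use the tool as a last resort: as noted above, disavowing healthy links can do more harm than good.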

3 Facts You Should Know About Google’s Truth Algorithm


In SEO, creating dependable content is one of the biggest issues every SEO service has to face. One approach is to use KBT (Knowledge-Based Trust) in your content research. If you are not yet familiar with KBT, the research paper at http://arxiv.org/pdf/1502.03519.pdf will help you understand the three facts about Google’s truth algorithm below.

Issue #1: Irrelevant Noise

The algorithm identifies facts by examining three elements, known collectively as “Knowledge Triples”: a subject, a predicate, and an object. A subject is a “real-world entity” such as a person, place, or thing. A predicate describes an attribute of that entity. According to the research paper, an object is “an entity, a string, a numerical value, or a date.” Together, those three attributes form a fact, known in the research paper as a Knowledge Triple and often referred to simply as a Triple.
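To make the structure concrete, here is a minimal sketch of a Triple as a small data type; the field types follow the paper’s definition quoted above, while the example values are purely illustrative:

```python
# Sketch: representing a Knowledge Triple as a small data type.
# The example values are illustrative, not taken from the paper.
from dataclasses import dataclass
from datetime import date
from typing import Union

@dataclass(frozen=True)
class Triple:
    subject: str     # a real-world entity, e.g. a person or place
    predicate: str   # an attribute of that entity
    object: Union[str, float, date]  # an entity/string, a number, or a date

fact = Triple(subject="Barack Obama",
              predicate="place_of_birth",
              object="Honolulu")
print(fact)
```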

Issue #2: Duplicate Content

Since KBT cannot sort out facts copied from other sites, content that copies facts from “trusted” sources such as Wikipedia, Freebase, and other knowledge bases can still score as trustworthy, because KBT has no way to tell the duplication apart from original work.

The researchers tried to apply scaled copy detection as part of the knowledge-based trust algorithm, but it is simply not ready. This is a fourth issue that will delay the deployment of KBT to Google’s search results pages.

Issue #3: Accuracy

KBT’s accuracy figure comes from a review in which, out of one hundred random high-trust sites, 15 sites (15%) were errors: two sites were topically irrelevant, twelve scored high because of trivial triples, and one website had both kinds of errors (topical irrelevance and a high number of trivial triples). In other words, in a random sample of high-trust sites with low PageRank, KBT’s false positive rate is on the order of 15%.

Research papers whose algorithms eventually make it into production usually demonstrate a vast improvement over previous efforts. That is not the case with Knowledge-Based Trust. While a “Truth Algorithm” makes an alarming headline, the truth is that there are five important issues that need to be solved before it makes it to an algorithm near you.