Saturday, August 18, 2012

Google Panda vs. Penguin


The Google Penguin

Google Penguin was released in April 2012. The update determines how web pages are indexed and whether the links pointing to them actually match the keywords that users type into the search bar.

For example, if you search Google for an online directory (say, the ABC directory), typing the keywords “ABC Directory” should surface a direct link to it. Before Penguin, however, many pages linked to the directory, yet a direct link never appeared in Google's results unless you searched for the exact keyword “ABCdirectory.com”. The domain was indexed, but the directory itself was not, and this is what Penguin rectified: it targeted sites that were poorly indexed and propped up by lousy link building.

As a result, many web pages were penalized: Google downgraded their rankings, dropping them from the first page of results to the tenth page or beyond, and some were banned from Google's results entirely. Penguin targeted websites stuffed with excessive keywords and links and flagged them for further irregularities, which the Google Panda update then analyzed.

To recover a web page from a Penguin penalty, you had to file a reconsideration request. Most sites, however, could not complete this step successfully, because it required you to first bring the site up to code and then wait for Google's crawlers to revisit the new pages, which could take months. Some businesses are simply waiting for the next Penguin update in the hope that it will automatically lift the penalty.

The Google Panda

Google Panda was released before Penguin, and it did more damage to web page rankings than Penguin did. In fact, once both updates were out, web page owners were baffled by how drastically their rankings had fallen, and confused about whether Panda or Penguin had done the damage. This turned out to be the best time to be in the SEO business: every web developer and coder was busy bringing pages up to code so that clients could once again profit from sitting at the top of the search results.

The Google Panda update was created because Google now demanded proper content: professional content that serves the needs of the user. So how does the algorithm actually determine whether a page has good or poor content and whether it deserves a high rank?
It analyzed a number of factors.
  • First, Google analyzed how much time users spent on the web page, the overall exit rate, and which specific pages had higher exit rates than others.
  • Then it analyzed how many times people visited the same web page from the same IP address, i.e. how often they returned to it.
  • Then it analyzed the web content itself.
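The first factor above, exit rate, is easy to compute from session logs. The sketch below is purely illustrative (the function name and log format are my own assumptions, not anything Google has published): for each page, it measures what fraction of its views were the last page of a visit.

```python
from collections import Counter

def exit_rates(sessions):
    """Per-page exit rate: the fraction of a page's views that were
    the final page of a session. Illustrative analytics sketch only."""
    views = Counter()
    exits = Counter()
    for pages in sessions:
        views.update(pages)
        if pages:
            exits[pages[-1]] += 1  # last page in the session is the exit
    return {page: exits[page] / views[page] for page in views}

# Hypothetical session logs: each list is one visit's page sequence.
sessions = [
    ["/home", "/article", "/contact"],
    ["/home", "/article"],
    ["/article"],
]
```

Here `/contact` has an exit rate of 1.0 (every view ended the visit), while `/home` has 0.0. A page whose exit rate stands out against the rest of the site is exactly the kind of signal the list above describes.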
The content Panda demanded had to be professional and authentic, which meant no more plagiarism. A popular strategy before the Panda update was for firms to reuse the same content, building multiple websites with shuffled copies of it that all linked to one target web page.
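Google's actual duplicate-detection methods are not public, but the shuffled-content scheme described above is easy to catch in principle with word-shingle comparison, a standard near-duplicate technique. The sketch below is an assumption-laden illustration, not Panda itself: it measures Jaccard similarity between the sets of word trigrams in two texts, so shuffled copies still score high.

```python
def shingles(text, n=3):
    """Return the set of word n-grams (shingles) in a text."""
    words = text.lower().split()
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

def jaccard_similarity(a, b, n=3):
    """Jaccard similarity of two texts' shingle sets: 1.0 means identical."""
    sa, sb = shingles(a, n), shingles(b, n)
    if not sa and not sb:
        return 1.0
    return len(sa & sb) / len(sa | sb)

original = "quality content serves the needs of the user first"
shuffled = "content quality serves the needs of the user first"
```

Swapping a couple of words leaves most shingles intact, so the similarity stays well above zero; genuinely independent articles share almost no shingles.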

Now the Google Panda update sought out these web pages and lowered their rankings. It judged how in-depth the information on a page was, how much of it was repeated or copied from other sites, how often keywords appeared, and how naturally they were worked into the articles. Articles jammed with keywords were rejected, and heavy link building was no longer acceptable: pages with huge numbers of links (more than 600,000, for example) were watched carefully. Articles of fewer than 400 words were penalized with lower rankings, so the recommended length rose to 500-800 words.

These requirements pushed many web pages down in the results and left their owners scrambling to comply with the new Google rules. SEO firms are now challenged to build authentic, genuine content for web pages and linking pages alike. Manipulation no longer works, and even an attempt at it means posting more authentic information online, which once again benefits Google.
