First of all, it is important to distinguish between the two…
Google Panda & Penguin – and their differences (courtesy of Wikipedia):
[accordion_toggle title="Panda"]Google Panda is a change to Google’s search results ranking algorithm that was first released in February 2011. The change aimed to lower the rank of “low-quality sites” or “thin sites”, and return higher-quality sites near the top of the search results. CNET reported a surge in the rankings of news websites and social networking sites, and a drop in rankings for sites containing large amounts of advertising. This change reportedly affected the rankings of almost 12 percent of all search results. Soon after the Panda rollout, many websites, including Google’s webmaster forum, became filled with complaints of scrapers/copyright infringers getting better rankings than sites with original content. At one point, Google publicly asked for data points to help detect scrapers better. Google’s Panda has received several updates since the original rollout in February 2011, and the effect went global in April 2011. To help affected publishers, Google published an advisory on its blog, thus giving some direction for self-evaluation of a website’s quality. Google has provided a list of 23 bullet points on its blog answering the question of “What counts as a high-quality site?” that is supposed to help webmasters “step into Google’s mindset”.[/accordion_toggle]
[accordion_toggle title="Penguin"]Google Penguin is a code name for a Google algorithm update that was first announced on April 24, 2012. The update is aimed at decreasing search engine rankings of websites that violate Google’s Webmaster Guidelines by using black-hat SEO techniques, such as keyword stuffing, cloaking, participating in link schemes, deliberate creation of duplicate content, and others.[/accordion_toggle]
[accordion_toggle title="Differences"]Before Penguin, Google released a series of algorithm updates called Panda that first appeared in February 2011. Panda aimed at downranking websites that provided a poor user experience. To identify such websites, a machine-learning algorithm by Navneet Panda was used, hence the name. The algorithm follows the logic by which Google’s human quality raters determine a website’s quality. In January 2012, the so-called page layout algorithm update was released, which targeted websites with little content above the fold. The strategic goal that Panda, Penguin and the page layout update share is to display higher-quality websites at the top of Google’s search results. However, sites that got downranked as a result of these updates have different sets of characteristics. The main target of the Google Penguin update is to combat webspam.[/accordion_toggle]
Now, with that out of the way, we can proceed with the 3 Reasons Why Google Panda And Penguin Are Killing Your Business…
[dropcap]1[/dropcap]The first is a case of your past catching up with you. This is with respect to SEO (search engine optimization) – your historic methods of SEO might have been less than honest: blog comment spamming, paid links, bad backlink neighborhoods and a variety of other “spammy” linking methods that Google Penguin could now classify as inorganic linking.
The inorganic-links warning in GWT (Google Webmaster Tools) is a message you don’t want to get. The moment you open your GWT and see it, you know you’re in trouble – and along with it comes a bucketload of work removing those links in order to appease Google and have it reconsider how it indexes your site in the SERPs.
Essentially, if at any time over the past 5 years (más o menos) you’ve retained the services of inexperienced SEOs, you’re likely going to find yourself in the crosshairs of the Penguin algo, which was designed specifically to target webspam.
[pullquote style="left" quote="light"]Someone’s sitting in the shade today because someone planted a tree a long time ago. – Warren Buffett[/pullquote]A good guideline for any and all backlinking efforts is to get links from niche-related blogs and to keep a diverse selection from various web properties, including Web 2.0 sites, bookmarking, social networks, commenting, contextual links and homepage or blogroll links. And please don’t forget to index them slowly, at rates of around 40-70 a day depending on the authority and age of your site. This is the “new” school of thought when it comes to backlinking, compared to the commenting blasts from our past that are now catching up with us.
[dropcap]2[/dropcap]Secondly is what I often refer to as “crap content syndrome”: scraping content (stealing it) and then spinning it (applying synonyms to words in the article, then using software to churn out “spun” articles that may or may not appear unique to Google). Before I sound too hypocritical here, I’d like to state for the record that this is in fact a method I myself utilize, however… let me explain just HOW I use it before you judge me.
An important fact to note is this…you get what you pay for! Simple? I think so.
When you pay someone $5 to write you content and then disperse that content across hundreds or even thousands of article or press release sites, do you seriously expect to get quality? Let me explain the reality behind properly spinning content and subsequently delivering that content.
Please understand that I do this day in and day out so I would hope you’d lend me some credibility on that note alone…
Writing a 400-500 word unique and compelling article can take anywhere from 20-40 minutes, depending on the amount of research necessary to make it interesting to your target market. Then comes the spinning part – to properly spin an article so that the results are near perfect for human readability, expect to invest some time spinning it manually. Of course you can always outsource this, but you’ve got to be 100% sure that your outsourcing agent will actually spin the article manually instead of just saying they will. It is also important to hire manual article spinners whose first language is English, for obvious reasons.
*Any mass-submitted content MUST be a minimum of 35% unique, or you will certainly face duplicate content penalties, which makes all your efforts futile.
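If you want a rough way to sanity-check that 35% figure yourself, here’s a quick Python sketch of one common way to estimate uniqueness – word n-gram (shingle) overlap between the original and the spun version. To be clear: Google’s actual duplicate-content detection is not public, and this function and the sample sentences are mine, purely for illustration.

```python
def uniqueness(original: str, spun: str, n: int = 3) -> float:
    """Estimate how unique a spun article is versus its source,
    using word n-gram (shingle) overlap. Illustrative only --
    this is not Google's actual duplicate-content metric."""
    def shingles(text: str) -> set:
        words = text.lower().split()
        return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}
    a, b = shingles(original), shingles(spun)
    if not b:
        return 0.0
    shared = len(a & b)
    # Fraction of the spun article's shingles NOT found in the original.
    return 1.0 - shared / len(b)

original = "the quick brown fox jumps over the lazy dog near the river"
spun = "the fast brown fox leaps over the lazy dog near the stream"
score = uniqueness(original, spun)
print(f"estimated uniqueness: {score:.0%}")  # 60% -- clears the 35% bar
```

Swapping just three words pushed this (tiny) example past the threshold, but on a real 400-500 word article you need far more rewriting than that to hit 35% – which is exactly why lazy auto-spinning gets caught.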
Now, once you have your article manually spun and you’re happy with the results it produces, it’s then time to find a service for drip feeding your article submission.
It is always a wise idea to slowly drip your submissions over time – 20-40/day is a good guideline. This prevents setting off any red flags for spam (i.e., Panda).
[dropcap]3[/dropcap]Thirdly is accepting that there is a new SEO paradigm and with Google Panda and Penguin wreaking havoc, you’d be wise to adapt.
Google is dynamic…very dynamic. And with over 200 specific considerations when ranking your site, it would be advisable to be on top of this paradigm shift.
A few examples of what has changed with these two updates to the algorithm are as follows:
- Meta keywords no longer matter – don’t waste your time entering your keywords into the meta keywords field, because Google has long since denounced it.
- Meta title and description – these two still remain of the utmost importance. Although the character limits still stand at 150-160 for the meta description and 65-70 for the meta title tag, Google is now giving strong consideration to LSI keywords (explained below).
- Keyword density – at one time it was suggested to keep this anywhere between 2-4%; under the new algorithms this is far too high and could trigger the “over optimization” filters within Google Panda.
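To put a number on it: keyword density is simply occurrences of the keyword divided by the total word count of the page. Here’s a minimal Python sketch (my own illustrative function and sample text – not any official Google formula) you can use to check your own copy:

```python
import re

def keyword_density(text: str, keyword: str) -> float:
    """Keyword density = keyword occurrences / total word count.
    Multi-word keywords count once per occurrence against the
    page's total word count (one common convention)."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    kw = keyword.lower().split()
    hits = sum(1 for i in range(len(words) - len(kw) + 1)
               if words[i:i + len(kw)] == kw)
    return hits / len(words) if words else 0.0

text = ("Panda update news: the Panda update changed SEO. "
        "Sites hit by Panda saw rankings drop.")
density = keyword_density(text, "panda")
print(f"density: {density:.1%}")  # 20.0% -- stuffed far past safe levels
```

That sample copy comes out at 20% – the kind of figure that, post-Panda, you want nowhere near your pages; aim well under the old 2-4% rule of thumb.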
- Deep page linking – old-schoolers used to fire all their ammunition at the front page of a site when it came to backlinking, and this is no longer advisable. A more diverse selection of pages throughout a site has to be considered when building backlinks, to avoid the previously mentioned “over optimization” filters put in place by Panda. A good split to stick to is 65% – 35% (for the rest).
The science of SEO is far from exact, but since Panda and Penguin, LSI (Latent Semantic Indexing) keywords have been elevated substantially in importance.
To explain in more detail what LSI keywords are…
[message type="info"]Latent semantic indexing (LSI) is an indexing and retrieval method that uses a mathematical technique called singular value decomposition (SVD) to identify patterns in the relationships between the terms and concepts contained in an unstructured collection of text. LSI is based on the principle that words that are used in the same contexts tend to have similar meanings. A key feature of LSI is its ability to extract the conceptual content of a body of text by establishing associations between those terms that occur in similar contexts.[/message]
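For the curious, here’s a tiny Python sketch of the idea behind that definition, using NumPy’s SVD on a made-up term-document count matrix. The terms, the counts and the two-concept cutoff are all my own illustrative choices – nothing here is Google’s actual implementation. The point it demonstrates: terms that appear in the same documents end up with similar vectors in the reduced “concept” space.

```python
import numpy as np

# Tiny term-document count matrix (rows = terms, columns = documents).
terms = ["seo", "ranking", "google", "recipe", "cooking"]
#            d1  d2  d3  d4
X = np.array([[2, 1, 0, 0],   # seo
              [1, 2, 0, 0],   # ranking
              [1, 1, 1, 0],   # google
              [0, 0, 2, 1],   # recipe
              [0, 0, 1, 2]])  # cooking

# LSI: a truncated SVD projects terms into a low-rank "concept" space.
U, s, Vt = np.linalg.svd(X, full_matrices=False)
k = 2                          # keep only the top 2 latent concepts
term_vecs = U[:, :k] * s[:k]

def cos(a, b):
    """Cosine similarity between two term vectors."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

i = {t: n for n, t in enumerate(terms)}
# Terms used in the same contexts land close together...
print(cos(term_vecs[i["seo"]], term_vecs[i["ranking"]]))   # high
# ...while unrelated terms land far apart.
print(cos(term_vecs[i["seo"]], term_vecs[i["cooking"]]))   # low
```

Note that “seo” and “ranking” never need to co-occur in a query for LSI to relate them – sharing documents is enough, which is exactly why sprinkling related terms through your copy helps Google connect your pages to the primary keyword.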
…to be blunt about it, it’s selecting variations of your primary keyword and peppering your site’s content (including titles and descriptions) with these terms/phrases to ensure Google associates your site with the primary keyword phrase even when it uses LSI methods. It also prevents ringing alarm bells at Google for targeting – and overdoing SEO for – one specific keyword.
For more information on SEO, inbound marketing and direct-response copywriting, sign up for my newsletter and immediately get access to my gift… the Quick-Start SEO Guide.