5 Tips to Discover if a Site Received a Google Panda Slap, and a Quick Remedy


Since the May 2014 Panda 4.0 update, announced on Twitter by Google’s Matt Cutts, SEO pros, site developers and others involved in site optimization have been struggling to determine what’s really going on with this recent update, and why it appears to be affecting far more sites than Google claims it should.


Jun 02, 2014 /prREACH/ -- The question many in the SEO world are still asking is whether the Panda update affected far more sites than Cutts suggests. Though he previously said "Google's Panda algorithm is designed to prevent sites with poor quality content from working their way into Google's top search results," it appears Panda is affecting more than a select few.

According to a May 21 Searchmetrics report, sites like eBay, Ask, health.com and DigitalTrends lost 50% of their search rankings. That said, many in the SEO community are seeing a wide variety of penalized sites, radically different in size and type from behemoths like eBay. To be fair, Google admits updates are directed at both large and small sites.

Search pros are also wondering whether the Panda update was a double whammy for sites previously affected by the Penguin update. A Google spokesperson told Search Engine Land "there is no Penguin or other spam efforts going on now".

To compound matters, Google finally admitted to regularly rolling out a variety of smaller updates called "refinements," released with little fanfare or comment over the past year. During that time, sites may have been affected by a series of small, seemingly negligible changes that slowly lowered their search rankings, making them increasingly difficult to find.

A site may have been affected and penalized if:

1. It loads slowly or poorly
2. Valuable, searchable content isn't added regularly
3. Images aren't crawlable, or the site isn't indexed
4. It has broken links or errors in HTML or CSS
5. It has duplicate content, titles or meta descriptions
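Several of the checks above can be partly automated. As a rough illustration only (this is a hypothetical sketch, not any specific audit tool mentioned here), the Python snippet below scans a set of HTML pages for three of these red flags: missing titles, duplicate titles and meta descriptions, and images without alt text:

```python
from html.parser import HTMLParser
from collections import Counter

class AuditParser(HTMLParser):
    """Collects a page's title, meta description, and count of images lacking alt text."""
    def __init__(self):
        super().__init__()
        self.title = ""
        self.meta_description = ""
        self.images_missing_alt = 0
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self._in_title = True
        elif tag == "meta" and attrs.get("name") == "description":
            self.meta_description = attrs.get("content", "")
        elif tag == "img" and not attrs.get("alt"):
            self.images_missing_alt += 1

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data

def audit_pages(pages):
    """pages: dict mapping URL -> HTML string. Returns a list of issue strings."""
    issues = []
    titles = Counter()
    descriptions = Counter()
    for url, html in pages.items():
        p = AuditParser()
        p.feed(html)
        if not p.title.strip():
            issues.append(f"{url}: missing <title>")
        titles[p.title.strip()] += 1
        if not p.meta_description.strip():
            issues.append(f"{url}: missing meta description")
        descriptions[p.meta_description.strip()] += 1
        if p.images_missing_alt:
            issues.append(f"{url}: {p.images_missing_alt} image(s) without alt text")
    # Flag duplicates across pages (ignoring pages where the field was empty)
    for title, n in titles.items():
        if title and n > 1:
            issues.append(f"duplicate <title> used on {n} pages: {title!r}")
    for desc, n in descriptions.items():
        if desc and n > 1:
            issues.append(f"duplicate meta description used on {n} pages")
    return issues
```

A real audit would also need to fetch the pages, follow links to find broken ones, and measure load times; this sketch only covers the static HTML checks.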

What, if anything, can be done to help protect a website from Google's meaty paw slap? While there are several options, top of the list should be a simple and fast site audit. Since no SEO pro can be expected to spend 24/7 tweaking a site, a periodic website audit will point out current issues, expose potential problems, and keep the site running in top form with only occasional tweaking.

    • Run a website audit every 90 days to discover current and potential issues and eliminate them. Every website needs a periodic health check to stay fit and sound. Poor site health can: 1) turn visitors away through poor navigation and slow loading 2) create order problems, cart abandonment and lack of product fulfillment 3) let competitors steal clients 4) cause client problems, regular client loss and customer service issues.
    • A site audit helps improve a site's future indexing and crawling, ensuring its valuable content is indexed quickly, appears in Google's search results, and passes muster when crawled by a Google bot.
    • A good site audit tool can provide information to help optimize page content for important keywords. With smart technology, it can offer suggestions on keyword density, prominence and other word usage for every element of a web page, including the title and meta description, h1 and h2-h6 tags, image alt texts and link anchors.
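To make the keyword checks above concrete, here is a minimal sketch of the kind of per-element keyword report such a tool might produce. It assumes a single-word keyword and plain-text page elements; `keyword_report` is a hypothetical helper, not the API of any real audit product:

```python
import re

def keyword_report(keyword, page_elements):
    """page_elements: dict mapping an element name (e.g. 'title', 'h1',
    'meta_description', 'body') to its plain text.

    Returns a dict noting whether the keyword appears in each element,
    plus the keyword's density in the body text.
    Assumes a single-word keyword (a real tool would handle phrases too).
    """
    kw = keyword.lower()
    report = {name: kw in text.lower() for name, text in page_elements.items()}
    body = page_elements.get("body", "")
    words = re.findall(r"[a-z0-9']+", body.lower())
    hits = sum(1 for w in words if w == kw)
    report["body_density"] = round(hits / len(words), 3) if words else 0.0
    return report
```

For example, running it on a page whose title and h1 contain the keyword but whose meta description does not would flag the description as a candidate for rewriting.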

Google bots crawl the Internet 24/7, penalizing sites that fail to adhere to updates, or that load slowly or poorly. No site is beyond their reach. Though these may seem like simple problems, any of them can easily send a site spiraling downward.

With Google's penchant for launching major updates, and for rolling out regular, unannounced refinements that can slowly erode visibility, visitors and rankings, it makes sense to run periodic site audits to keep a site healthy and running well.

Contact Info

Jean L. Serio, CPC


A May 20, 2014 article by Search Engine Land News Editor Barry Schwartz points out that “Panda 4.0 must be a major update to the actual algorithm versus just a data refresh.”