I wish diagnosing SEO issues were easy.
But it’s never easy...
We need more data, less opinion.
Someone told him this was a good idea.
How to debug SEO
I even went to the length of building my own PHP app to understand the process of elimination.
Caveat: you need a certain level of technical knowledge.
Rand said this in 2011, and it was already a big list.
The debugging process. The following is based on real-life events.
I’ve always wanted to say that.
Here’s the overall organic graph for 2012 versus 2011. Seasonal fluctuation seemed unlikely.
If this was programming, we’d get an error message telling us where the problem is.
Looking for problems in SEO isn’t as straightforward.
You can use the Content > Content Drilldown report, but it only shows pageviews, not visits.
If you’re using GA, set up a custom report using these settings.
When the folder depth isn't constant (not every URL has a /city/ level), segment by the number of slashes (/) using regex to see the full picture:
/product/
/product/region/
/product/region/city/
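The deck describes doing this inside a GA custom report; as a minimal sketch of the same idea, here is how segmenting URLs by slash count could look in Python (the sample paths and function names are hypothetical, for illustration only):

```python
import re

# Hypothetical sample of URL paths pulled from an analytics export.
paths = [
    "/product/",
    "/product/region/",
    "/product/region/city/",
    "/product/other-region/",
]

def depth(path: str) -> int:
    """Folder depth = number of slashes after the leading one."""
    return path.count("/") - 1

def filter_by_depth(paths, n):
    """Keep only paths exactly n folders deep.

    Equivalent GA-style regex filters, one per depth level:
    depth 1 -> ^/[^/]+/$    depth 2 -> ^/[^/]+/[^/]+/$    and so on.
    """
    pattern = re.compile(r"^/" + r"[^/]+/" * n + r"$")
    return [p for p in paths if pattern.match(p)]

print(filter_by_depth(paths, 2))  # pages exactly two folders deep
```

Segmenting by depth rather than by a fixed folder name is what surfaces the drop even when not every URL contains a /city/ segment.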
It turns out that it was the /region/ city pages.
These are top converting revenue pages!
Step 1:
Explain the problem clearly with specific details.
Organic visits to /product/region/city/ dropped 50% in June compared to May.
Step 2:
Gather data and investigate.
Data gathered | Tool(s) used | Double-checked with
Before/after traffic metrics (visits, bounce rate, etc.) | Google Analytics | Server logs imported into Splunk
Response codes, link architecture, on-page elements (titles, robots directives, etc.) | Screaming Frog, IIS crawler | Chrome inspect element, manual inspection
KW rankings, backlink analysis | Authority Labs, Searchmetrics, Open Site Explorer, Majestic | Custom scripts, manual Google checks
Indexation, PageRank | Scrapebox | Custom scripts, SEOstats
Googlebot activity | Server logs, with Splunk | Google webcache (not perfect!)
Source code similarity (scrape) | ImportXML + Excel | Manual checks, online text comparison (DiffNow)
Environmental activity | SEOmoz algorithm updates, development queue/logs, holidays, world events | Webmaster forums, other SEOs, SEO articles
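The deck checks Googlebot activity via server logs in Splunk; a rough sketch of the same check in Python could look like this (the log lines, format assumptions, and function name are all hypothetical, based on the common combined log format):

```python
import re
from collections import Counter

# Two hypothetical access-log lines in combined log format (illustrative only).
log_lines = [
    '66.249.66.1 - - [12/Jun/2012:06:25:24 +0000] "GET /product/region/city/ HTTP/1.1" 200 5120 "-" '
    '"Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
    '10.0.0.7 - - [12/Jun/2012:06:25:30 +0000] "GET /product/ HTTP/1.1" 200 2048 "-" "Mozilla/5.0"',
]

# Pull out the day portion of the timestamp and the requested path.
line_re = re.compile(r'\[(?P<day>[^:]+):.*?"GET (?P<path>\S+) HTTP')

def googlebot_hits_per_day(lines):
    """Count Googlebot requests per day, the kind of tally Splunk was used for."""
    hits = Counter()
    for line in lines:
        if "Googlebot" not in line:  # crude UA filter; verify IPs for rigor
            continue
        m = line_re.search(line)
        if m:
            hits[m.group("day")] += 1
    return hits

print(googlebot_hits_per_day(log_lines))  # Counter({'12/Jun/2012': 1})
```

A drop in daily Googlebot hits on the affected folder is one of the signals that corroborates (or rules out) a crawl problem before blaming an algorithm update.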
The process of elimination: think of it as Logger.log();
Comparing two of the pages section by section:

Footer → Same
Lead generation form → Same
Navigation → Same
Heading (10%) → Same
Sales text → Different
Now I know what the problem is, and I have data to explain it.
http://www.diffnow.com/
These pages were 90% identical to each other, AND the Panda 3.8 & 3.9 updates rolled out around the time of the loss.
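The deck used DiffNow for the comparison; the same near-duplicate check can be scripted with Python's standard-library difflib (a sketch under assumed sample text, not the method actually used in the deck):

```python
import difflib

# Hypothetical page texts: boilerplate identical, only the sales text differs.
boilerplate = "navigation heading lead-generation form footer " * 10
page_a = boilerplate + "Sales text about widgets in Region A, City X."
page_b = boilerplate + "Sales text about widgets in Region A, City Y."

# ratio() returns similarity in [0, 1]; near-duplicates score close to 1.
ratio = difflib.SequenceMatcher(None, page_a, page_b).ratio()
print(f"{ratio:.0%} similar")  # well above the 90% observed on the real pages
```

Scripting the check makes it repeatable across thousands of pages, which matters once you want to verify the fix at scale rather than spot-check in a browser tool.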
Step 3:
Form a hypothesis that might explain the problem.
If we make these pages at least 50% unique, they won’t be considered
duplicate and will regain traffic after a Panda refresh.
Step 4:
Conduct an experiment to test the hypothesis.
Without proof, don't expect anyone to do the work.
We created content to make a small sample set of pages (5k) at least 50% unique.
We conducted the experiment, and we now have proof (data) that it worked.
Step 5:
Based on your experiment results, revise or validate your hypothesis.
It’s ok to fail – but try to minimize the damage.
Keep testing, keep breaking stuff, be a GREAT SEO
Image credits:
http://eastcountyoralsurgery.net/wp-content/uploads/2011/06/kid-diver.jpg
amateurexpert92.deviantart.com
http://www.tommyzor.com
http://www.nouse.co.uk/2012/12/28/5-films-that-make-my-christmas/home-alone/
www.amommyismade.com
http://www.failepicfail.com/gymnast-fail-gymnast-parallel-bars-epic-fail-229.html