Think like a developer: debugging SEO - Be Wizard 2013, Rimini

Transcript of "Think like a developer: debugging SEO" - Be Wizard 2013, Rimini

1. I wish diagnosing SEO issues was easy.

2. But it's never easy...
3. We need more data, less opinion.
4. Someone told him this was a good idea.
5. data
6. How to debug SEO
7. I even went to the length of understanding the process of elimination by building my own PHP app.
8. Caveat: you need to have a certain level of knowledge.
9. Rand said this in 2011, and it was already a big list.
10. The debugging process. The following is based on real-life events. I've always wanted to say that.
11. Here's the overall organic graph for 2012 versus 2011. Seasonal fluctuation seemed unlikely. [graph: 2012 vs. 2011]
12. If this was programming, we'd get an error message telling us where the problem is. Looking for problems in SEO isn't as straightforward.
13. You can use the Content > Content Drilldown report, but it only shows pageviews, not visits.
14. If you're using GA, set up a custom report using these settings.
15. When folders are not constant (/product/, /city/, /product/region/, /product/region/city/), segment by number of slashes (/) using regex to see the full picture.
16. It turns out that it was the /region/city pages.
17. These are top-converting revenue pages!
18. Step 1: Explain the problem clearly with specific details.
19. /product/region/city/ dropped 50% in organic visits comparing June to May.
20. Step 2: Gather data and investigate.
21. Data gathered | Tool(s) used | Double-check with
    Before/after traffic metrics (visits, bounce rate, etc.) | Google Analytics | Server logs imported into Splunk
    Response codes, link architecture, on-page elements (titles, robots directives, etc.) | Screaming Frog, IIS crawler | Chrome inspect element, manual inspection
    KW rankings, backlink analysis | Authority Labs, Searchmetrics, Open Site Explorer, Majestic | Custom scripts, manual Google checks
    Indexation, PageRank | Scrapebox | Custom scripts, SEOstats
    Googlebot activity | Server logs with Splunk | Google webcache (not perfect!)
    Source code similarity (scrape) | ImportXML + Excel | Manual checks, online text comparison (Diffnow)
    Environmental activity (algorithm updates, holidays, world events) | SEOmoz, webmaster forums, other SEOs, SEO articles | Development queue / logs
22. The process of elimination: think of it as Logger.log();
23. Side-by-side template comparison: Navigation (same), Heading (10% different), Lead generation form (same), Sales text (same), Footer (same).
24. Now I know what the problem is, and I have data to explain.
25. These pages were 90% identical to each other AND the Panda 3.8 & 3.9 updates rolled out around the time of the loss.
26. Step 3: Form a hypothesis that might explain the problem.
27. If we make these pages at least 50% unique, they won't be considered duplicate and will regain traffic after a Panda refresh.
28. Step 4: Conduct an experiment to test the hypothesis.
29. Without proof...
30. Don't expect anyone to do the work.
31. We created content to make a small sample set of pages (5k) 50% unique.
32. We conducted the experiment, and we now have proof (data) that it worked.
33. Step 5: Based on your experiment results, revise or validate the hypothesis.
34. It's OK to fail, but try to minimize the damage.
35. "We can make SEO a ..." - My client
36. "Keep testing, keep breaking stuff, be a GREAT SEO" - My client
37. DAVID SOTTIMANO, Lead Consultant, Distilled, [email protected], @dsottimano
38. Image credits: http://www.tommyzor.com, www.amommyismade.com
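Slide 15's advice (segment URLs by number of slashes when folder names aren't constant) can be sketched in a few lines. This is a minimal illustration, not from the deck; the sample paths and the `depth` helper are assumptions, and the regex is the kind of pattern you might paste into a GA custom report filter:

```python
import re

# Illustrative sample paths (assumed, not real site data).
paths = [
    "/product/",
    "/product/region/",
    "/product/region/city/",
    "/city/",
]

def depth(path: str) -> int:
    """Count folder segments: '/product/region/' -> 2."""
    return len([seg for seg in path.strip("/").split("/") if seg])

# Equivalent regex filter: matches paths exactly three folders deep,
# e.g. /product/region/city/ but not /product/.
three_deep = re.compile(r"^/[^/]+/[^/]+/[^/]+/$")

# Group paths by folder depth to see the full picture at a glance.
by_depth = {}
for p in paths:
    by_depth.setdefault(depth(p), []).append(p)

print(by_depth)
print([p for p in paths if three_deep.match(p)])
```

Grouping by depth first, then drilling into the worst-performing depth, is what surfaced the /region/city pages in the deck's story.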
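Slide 22's framing (treat the process of elimination like Logger.log()) could look something like the following. This is an assumed sketch of the idea, not the speaker's PHP app; the candidate causes and their verdicts are invented for illustration:

```python
import logging

# Log each candidate cause and whether the gathered data rules it out,
# exactly as a developer would log their way through a bug hunt.
logging.basicConfig(level=logging.INFO, format="%(message)s")
log = logging.getLogger("seo-debug")

# Hypothetical findings from the data-gathering step (slide 21's table).
candidates = {
    "response codes changed": False,      # crawl showed all 200s
    "robots directives changed": False,   # manual inspection: unchanged
    "sitewide ranking drop": False,       # only /region/city pages fell
    "near-duplicate templates": True,     # pages ~90% identical
}

for cause, still_possible in candidates.items():
    log.info("%s: %s", cause, "KEEP" if still_possible else "ELIMINATED")

remaining = [cause for cause, keep in candidates.items() if keep]
```

Whatever survives elimination becomes the hypothesis to test in Step 3.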
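The duplicate-content finding on slides 23-25 (pages 90% identical to each other) can be approximated with a similarity ratio. A minimal sketch, assuming two toy page bodies built from the template blocks the slide lists; the deck used ImportXML + Excel and Diffnow rather than this approach:

```python
from difflib import SequenceMatcher

# Two toy pages sharing navigation, lead form, sales text, and footer;
# only the heading differs (the "10% different" block on slide 23).
page_a = "\n".join([
    "NAVIGATION home products regions",
    "HEADING widgets in Rome",
    "LEAD FORM name email submit",
    "SALES TEXT buy our widgets today",
    "FOOTER contact terms privacy",
])
page_b = page_a.replace("Rome", "Milan")

# Ratio near 1.0 flags a near-duplicate pair.
similarity = SequenceMatcher(None, page_a, page_b).ratio()
print(f"similarity: {similarity:.2f}")
```

Run pairwise across a sample of templated pages, a threshold like 0.9 separates "needs unique content" from "fine as is", which is effectively the experiment slides 27-32 describe.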
Data Gathered Tool(s) used Double check with:Before / After traffic metrics (visits, Google Analytics Server logs imported into Splunk.bounce rate etc..)Response codes, link architecture,Screaming Frog, IIS crawlerChrome inspect element, manualOn page elements (titles, robots inspection.directives etc..)KW rankings, Backlink analysisAuthority Labs, Searchmetrics, Custom scripts, manual GoogleOpen site explorer, Majestic checks.Indexation, PageRankScrapeboxCustom scripts, SEOstatsGooglebot activityServer logs with SplunkGoogle webcache (not perfect!)Source code similarity (scrape) ImportXML + ExcelManual checks, Text comparing online (Diffnow)Environmental activitySEOmoz Algorithm updates,Webmaster forums, other SEOs,Development queue / logs,SEO articles.Holidays, world events 22. The process of elimination, thinkof it as Logger.log(); 23. Navigation SameNavigationHeading 10%Different Heading 10%SameLead generation Form Lead generation Form Sales text SameSales textSameFooter Footer 24. Now I know what the problem is, and I have data to explain. 25. These pages were 90%identical to each other ANDPanda 3.8 & 3.9 updates rolled out around the time of loss. 26. Step 3Form a hypothesis that mightexplain the problem 27. If we make these pages atleast 50% unique, they wont be consideredduplicate and will regaintraffic after a Panda refresh. 28. Step 4Conduct an experiment to testthe hypothesis. 29. Withoutproof.. 30. Dont expectanyone to dothe work. 31. We created content to makea small sample set of pages 50% unique (5k). 32. We conducted the experiment and wenow have proof (data) that it worked 33. Step 5:Based on your experimentresults, revise or validate hypothesis. 34. Its ok tofail buttry tominimizethedamage. 35. We can make SEO aMy client 36. Keep testing, keepbreaking stuff, be aMy clientGREAT SEO 37. DAVID SOTTIMANO Lead Consultant, Distilled [email protected] @dsottimano 38. Image credits http://www.tommyzor.com www.amommyismade.com