Breaking Bad SEO - The Science of Crawl Space

Description

Breaking Bad SEO - The Science of Crawl Space - with notes. Why Breaking Bad? It's the story of a high school chemistry teacher who discovers he has cancer and starts producing crystal meth to earn some fast cash, to help fund his treatment and leave some money for his family if he dies. It got me thinking about SEO as a science: it requires knowledge, skill and regular testing to deliver the right results or improvements. Even the lead characters reflect certain aspects of the SEO industry… There is an episode called Crawl Space, which ties in nicely with this topic. This version contains the notes on the slides.

Transcript of Breaking Bad SEO - The Science of Crawl Space

1. Breaking Bad SEO: The Science of Crawl Space

2. Meet Walter
Meet Walter, the chemistry teacher. Despite getting his hands pretty dirty throughout the series, Walt is fundamentally a good guy and, as the star of the show, from a search perspective he should be considered a white hat SEO. As a scientist, Walt is methodical in his approach: he understands the principles and practices needed to achieve results and has the skill to produce the very best crystal meth.

3. Jesse Pinkman
Jesse Pinkman, on the other hand, is a black hat SEO. An ex-student of Walt's who failed chemistry at school and became a drug dealer, Jesse knows his industry and has the right contacts to get by, but he isn't really interested in delivering the very best results or long-term plans. So why "Crawl Space"?

4. The Totality of Possible URLs For a Website
"[You] have several options to optimize the crawl space (the totality of URLs on your site known to Googlebot) for unique content pages, reduce crawling of duplicative pages, and consolidate indexing signals." - Google Webmaster Central Blog, http://googlewebmastercentral.blogspot.co.uk/2014/02/faceted-navigation-best-and-5-of-worst.html
What is crawl space? Google regularly refers to it, and it's fundamentally about knowing your site architecture: identifying your crawl space is the first step towards successful website optimisation. So why should we care about crawl space?

5. Size Does Matter
Google warns you about potential crawl space issues in GWT and outlines some of the implications:
- Unnecessarily crawling a large number of duplicate content URLs
- Discovering undesired parts of your site
- Consuming more bandwidth than necessary
- Potential inability to completely index all of your site

6. How'd I Get so Many Pages?
domain.com
domain.com/home
domain.com/home/
domain.com/Home
domain.com/?source=ppc
domain.com/index.html
So check what's in the indexes already: is it a realistic number of pages for your site? Do you recognise the URL structures/formats being returned? The most common contributor to a large crawl space is duplicate content (seen here with BooHoo.com), which can be caused by a number of valid and invalid reasons. (See the URL normalisation sketch after slide 12.)

7. For SEO There Can Only Be One!

8. But Don't Lose Your Head
Auditing your crawl space will help you understand the full mix of URLs and help formulate a consistent implementation.

9. What Makes Up Your Crawl Space?
To build a picture of your site architecture and to discover your complete crawl space, you need to consider what contributes to your URL Universe: the place where all theoretically possible URLs exist.

10. The Threats
A lack of crawl space management can leave you wide open to threats and lead to a load of GWT warnings! Poor management has implications:
- Orphaned pages
- Incomplete sitemaps
- Dilution of backlinks and shares
- Performance is harder to track
- Limited volume of unique URLs in analytics
- Crawl inefficiency for SEO
- Traffic growth strategies just won't work

11. Social and Backlink Synergy
Do social signals impact organic rankings? Twitter? Facebook? Google+? Potentially. Who cares!? Even if social signals are not used in a ranking algorithm, social media and SEO are both able to drive discovery at scale. They both have enormous reach, so understanding the crawl space for search and social is crucial. URL aggregation is not necessarily required, but I'd recommend it. So let's take a look at how to tackle all this.

12. Let's Cook!
To get going we need all the components:
- A Cook
- A Lab
- The Formula
- Organic Ingredients
- A Recipe
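Picking up the duplicate URL formats from slide 6: whether those six addresses count as one page or six depends entirely on how consistently they are normalised. Here is a minimal sketch, not from the deck, of collapsing crawled URLs to a normalised form to estimate how much of a crawl space is pure duplication; the normalisation rules and the ignored parameters are assumptions you would tune per site.

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Tracking parameters assumed to carry no unique content (tune per site).
IGNORED_PARAMS = {"source", "utm_source", "utm_medium", "utm_campaign"}

def normalise(url):
    """Collapse common duplicate-URL formats into one normalised form."""
    scheme, netloc, path, query, _fragment = urlsplit(url)
    netloc = netloc.lower()
    path = path.lower()
    if path.endswith("/index.html"):            # /index.html -> /
        path = path[: -len("index.html")]
    path = path.rstrip("/") or "/"              # /Home/ -> /home
    kept = sorted((k, v) for k, v in parse_qsl(query) if k not in IGNORED_PARAMS)
    return urlunsplit((scheme, netloc, path, urlencode(kept), ""))

crawled = [
    "http://domain.com",
    "http://domain.com/home",
    "http://domain.com/home/",
    "http://domain.com/Home",
    "http://domain.com/?source=ppc",
    "http://domain.com/index.html",
]
unique = sorted({normalise(u) for u in crawled})
print(f"{len(crawled)} crawled URLs -> {len(unique)} unique pages: {unique}")
```

This collapses the six example URLs down to two (the root and /home); deciding which of those should win is the canonical tag and redirect work covered in the later slides.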
13. The Cook: You!
Be the boss (of URLs). An SEO needs to be a great cook:
- Make sure you have the right equipment
- Follow an accurate formula
- Use quality ingredients
- Take ownership of your crawl space (benchmark and URL roadmap)
Just like Walt, you need to be the boss, the head chef, the daddy; ideally a scientist, but that's not a prerequisite if you have the right tools to help you take ownership of your crawl space.

14. The Lab
This is Jesse's RV, effectively a mobile meth lab. We need to set up our lab with the tools and equipment necessary to develop search marketing recipes, linking a range of data sources to understand the crawl space:
- Webmaster Tools (Google & Bing)
- Landing page data (Analytics)
- Linked URL data (WMT/Ahrefs/Majestic/OSE)
- Website crawler (Xenu/Screaming Frog/DeepCrawl)
DC's new Universal Crawl now includes Google Analytics landing page data, along with link equity and social tagging reports, to assess a comprehensive crawl space in one.

15. The Formula
Maximised + Minimised = Optimised
- Maximise indexable space: increase the volume of valuable pages and increase crawl efficiency
- Minimise crawlable space: define your crawl space, identify and eliminate threats
- Optimise canonical space: a clean version of your website, your URL à la carte
Discovery, management and optimisation of crawl space is essential and lays the foundation for strong performance. All spaces need to be carefully defined and managed efficiently.

16. Use Organic Ingredients
As SEOs, we have a whole host of juicy ingredients at our disposal to help cook up an optimised crawl space; picking the very best ingredients will make all the difference in testing your recipes.
- Use the custom controls in DC to extract and schedule regular data comparisons for each source
- Overwrite robots.txt rules to assess the full crawl space as well as the one delivered to search engines
- Schedule regular sitemap crawls and compare against internal and external links; review canonical setup and index controls to maintain consistency
- DeepCrawl Backlinks crawl
- OG tags & hreflang DeepCrawl report

17. So What's the Recipe?

18. Crawl Space Solutions in One
The new Universal Crawl is now out of beta! Review website, XML sitemap and organic landing page data in one Universal Crawl with DeepCrawl, and take advantage of a significant head start in defining, managing and optimising your crawl spaces. Universal Crawl is the new Heisenberg.

19. Deep Crawl Goes Universal

20. Deep Crawl Goes Universal
DC's Universal Crawl helps you quickly and easily understand your crawl space and identify URL sources, gaps, new formats and traffic value. Plus you get all the regular DC features, including fully customisable crawl settings, the ability to compare a test environment against the live site (to support QA), custom extraction tools and scheduled crawls that record change and impact and help quantify SEO deliverables. DeepCrawl automatically shows you what's changed between crawls so you can understand how much of the site is changing; you might spot some URL formats which are changing very frequently and affecting crawl efficiency. Let's take a look at some recipes to control your crawl space.

21. Identify Indexable & Non-indexable Parameters
Understand what's in your crawl space:
- Assess indexation reports
- Review the current index
- Test URL parameter changes
- Quick improvements through the GWT parameter settings
DeepCrawl indexation reports help you quickly assess all crawlable URLs, unique pages and noindex pages, and identify URL parameters that you might not want indexed. Use Webmaster Tools data and site: checks to understand current search engine indexation, and use the Parameter Removal controls in DeepCrawl to test the impact of stripping parameters. Monitor crawl efficiency changes and, for a quick win, update your URL parameter settings in Webmaster Tools when you find the right formula.
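Before changing anything in the Webmaster Tools parameter settings, it helps to quantify which parameters are actually inflating the crawl space. A minimal sketch along those lines; the input file crawled_urls.txt (one crawled URL per line) is a hypothetical export from whichever crawler you use, and the reporting is deliberately crude.

```python
from collections import Counter
from urllib.parse import urlsplit, parse_qsl

# Hypothetical export from your crawler: one crawled URL per line.
with open("crawled_urls.txt") as f:
    urls = [line.strip() for line in f if line.strip()]

# How many crawled URLs carry each query parameter?
param_counts = Counter()
for url in urls:
    for key, _value in parse_qsl(urlsplit(url).query):
        param_counts[key] += 1

print("Parameter frequency across the crawl:")
for key, count in param_counts.most_common():
    print(f"  {key}: found on {count} URLs")

def unique_without(param):
    """How many distinct URLs remain if this parameter were stripped?"""
    stripped = set()
    for url in urls:
        parts = urlsplit(url)
        kept = tuple(sorted((k, v) for k, v in parse_qsl(parts.query) if k != param))
        stripped.add((parts.netloc, parts.path, kept))
    return len(stripped)

total = len(set(urls))
for key in param_counts:
    print(f"Stripping '{key}': {total} -> {unique_without(key)} unique URLs")
```

Parameters whose removal collapses a large share of the crawl are the first candidates to test via DeepCrawl's Parameter Removal controls and, once proven, via the GWT parameter settings.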
22. Canonical URL Configuration
Make sure you're not self-harming:
- Check your canonical implementation
- All canonical pages should be linked internally
- Assess pages without canonical tags

23. Which URLs Are Being Shared?
Check the consistency of canonical URLs against social URLs: are your OG and Twitter Card URLs consistent? DeepCrawl has a report to automatically show any errors, and you can also schedule custom extraction crawls to regularly assess social share equity changes for specific URLs using DC. Likewise for blog comments etc.

24. Identify Low Value Navigational Pages
Where's your thin content hiding out?

25. Identify Low Value Non-Navigational URLs
Take a good look at your analytics:
- Review URLs delivering minimal traffic
- Identify and assess URLs outside of your canonical setup
Review and assess whether you can make your crawl more efficient by excluding certain pages. Check your non-canonical URLs and use Universal Crawl to see non-navigational pages that aren't driving traffic.

26. Identify Domain Aliases
Understanding who's working with you and who's against you is important too. Review your domain aliases: test all registered domains to check whether they return a duplicate or a redirect, and check your www/non-www and HTTP/HTTPS configuration. Implement redirects or use cross-domain canonical setups.

27. Monitor Your Domain Portfolio & Keep Alert!
Monitor domains using Robotto.org (www.robotto.org).

28. Disallowed URLs
Check all disallowed URLs in Webmaster Tools and the DeepCrawl indexation reports. DeepCrawl reports all disallowed URLs so you can easily see what's already being excluded. Test changes to your robots.txt file using the Robots Overwrite functionality and develop an optimised file that increases disallowed URLs and focuses the crawl on primary pages; test the impact.

29. Identify All Linked URLs
Review and validate all linked URLs, and check that your website's internal linking is working towards an optimised crawl space. Use DeepCrawl's internal broken links, redirected links, 4xx and 5xx error reports to identify internal links that are broken or point to redirected URLs that may be affecting your crawl efficiency.

30. Compare Sitemaps
Crawl your sitemap regularly. Run analysis and compare:
- Scheduled sitemap crawls
- Scheduled website crawls
- All validated, linked URLs
Identify pages that are only in sitemaps and not linked internally, plus all pages linked internally which are not in the sitemaps. Schedule regular sitemap and comparative website crawls to assess what's changed; you can monitor how much of the site is changing and map it against performance. You might spot some URL formats which are changing frequently and impacting crawl efficiency.

31. Compare Landing Page URLs to Linked URLs
Where's the link equity? Identify pages delivering traffic but not internally linked, and understand the link profile of all pages by crawling aggregated link data; DeepCrawl automatically applies link metrics to all reports. By comparing sitemaps, GA and internal links, a Universal Crawl easily highlights URLs discovered only in organic landing pages and not linked internally. You can add Backlink Crawls to your DC projects: simply upload a comprehensive backlink profile URL list and let DC crawl and validate the URLs. This crawl also helps identify pages generating traffic but not necessarily linked internally, plus it automatically applies inbound link equity metrics to all DC reports at a URL level, which is very useful.
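The comparisons behind slides 30 and 31 boil down to set arithmetic over URL lists. Here is a minimal sketch, assuming you've exported three plain-text lists (one URL per line); the file names are placeholders, and in practice a Universal Crawl produces these comparisons for you.

```python
def load(path):
    """Load a plain-text URL list, one URL per line, into a set."""
    with open(path) as f:
        return {line.strip() for line in f if line.strip()}

sitemap = load("sitemap_urls.txt")              # scheduled sitemap crawl
linked = load("internally_linked_urls.txt")     # website crawl
landing = load("ga_organic_landing_pages.txt")  # Google Analytics export

# Pages only in sitemaps and not linked internally (orphan candidates).
print("In sitemaps but not linked internally:", len(sitemap - linked))

# Pages linked internally but missing from the sitemaps.
print("Linked internally but not in sitemaps:", len(linked - sitemap))

# Pages earning organic traffic yet not linked internally: link equity gaps.
print("Driving traffic but not linked internally:", len(landing - linked))
```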
32. International SEO Considerations
Watch your language: check your hreflang! The correct use and implementation of hreflang tags is important for effective international SEO, but it can be confusing and even experienced SEOs get it wrong.

33. International SEO Considerations
Universal Crawl tests the implementation across sitemaps, headers and on-page tags. Review a matrix of language alternatives for each page, assess gaps and inconsistencies in the setup, and review the Pages Without Hreflang Tags report. DeepCrawl helps manage a seamless hreflang integration by detecting hreflang tags in sitemaps, headers and on-page before showing a matrix of language alternatives for every page, so you can see the gaps and inconsistencies in the setup. See David Sottimano's Moz post on hreflang: http://moz.com/blog/hreflang-behaviour-insights (a minimal return-tag check is sketched after the final slide).

34. #BreakingBadSEO
Review your options and consider your URL Universe. Set up your lab: Google Analytics, Google/Bing Webmaster Tools and DeepCrawl Universal Crawl. Follow the formula: Maximised + Minimised = Optimised. Develop and test new recipes to focus your crawl spaces.

35. Thanks, Keep in Touch
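As a footnote to the hreflang matrix in slides 32 and 33: the mistake that catches out even experienced SEOs is the missing return tag, where page A declares a French alternative but the French page never points back. Here is a minimal sketch of that reciprocity check; the mapping below is illustrative sample data, and in practice it would be extracted from your sitemaps, HTTP headers or on-page tags by your crawler.

```python
# page URL -> {hreflang value: alternate URL}, as extracted by your crawler
# from sitemaps, HTTP headers or on-page link elements (sample data only).
hreflang = {
    "https://example.com/en/": {"en": "https://example.com/en/",
                                "fr": "https://example.com/fr/"},
    "https://example.com/fr/": {"fr": "https://example.com/fr/"},  # no en return tag
}

for page, alternates in hreflang.items():
    for lang, alt_url in alternates.items():
        if alt_url == page:
            continue  # self-reference is fine
        # Every alternate should list the original page among its own alternates.
        if page not in hreflang.get(alt_url, {}).values():
            print(f"Missing return tag: {alt_url} ({lang}) does not reference {page}")
```

Run over a full crawl, this surfaces the kind of gaps and inconsistencies the slide's matrix of language alternatives is designed to expose.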