A technology- and experiment-driven digital agency

Introduction


Forward3D has always been driven by bespoke technology. We build our expertise on and around tools we developed to improve our own work, such as Cardwall and Datasets.

Experiments help improve best practices
We are running experiments across all channels to test our current understanding of the digital landscape and to fill gaps in best practices. We recently launched an SEO study that measures how crawl budget is allocated based on the directives provided to Googlebot spiders. This includes:

  • Parameter settings in Search Console and their impact on the number of pages crawled daily by Googlebot (especially in conjunction with the number of pages indexed on a weekly basis);
  • Setting up canonical rules across our clients' websites to observe how ranking authority is redistributed;
  • Measuring the impact internal nofollow links have on traffic, ranking, visibility, anchor text transmission, authority (trust) and PageRank;
  • Creating a mix of manual and dynamic XML sitemaps and measuring how site crawling changes as a result (see the sketch after this list);
  • Changing the navigational structure on support pages (including local pages, customer service URLs and HTML sitemaps) to see how connected pages are crawled;
  • Tracking caching dates and measuring indexation rates on a sample of pages.
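
To make the sitemap test concrete, below is a minimal sketch of a dynamic XML sitemap generator in Python, using only the standard library. The URLs and dates are hypothetical placeholders; in a real setup they would come from a crawl or a CMS export rather than a hard-coded list.

    # Minimal dynamic XML sitemap generator (illustrative sketch).
    import xml.etree.ElementTree as ET
    from datetime import date

    # Hypothetical pages; a real generator would pull these from a database or crawl.
    pages = [
        ("https://www.example.com/", date(2016, 9, 1)),
        ("https://www.example.com/category/shoes", date(2016, 9, 12)),
        ("https://www.example.com/support/contact", date(2016, 8, 25)),
    ]

    urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    for loc, lastmod in pages:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = loc
        ET.SubElement(url, "lastmod").text = lastmod.isoformat()

    # Regenerate the file whenever the URL inventory changes.
    ET.ElementTree(urlset).write("sitemap.xml", encoding="utf-8", xml_declaration=True)

The manual sitemaps in the mix are maintained by hand, while a generated file like this can be rebuilt on every inventory change, which is what allows a comparison of how crawling responds to each variant.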

 

The final results of the experiment will be packaged into a case study illustrating the outstanding questions we face today regarding crawl optimisation. The main questions are:

  • How does link building change the allocated crawl budget? Do target pages with an external backlink profile take priority in the crawl order?
  • Does a passive parameter setup decrease the number of indexed URLs with tracking parameters? Does it help mitigate fluctuations caused by paid/PPC parameters such as the "utm" terms?
  • Can server log analysis help identify thin and duplicate content issues by looking at which pages with canonical directives Googlebot accessed? (A minimal log-parsing sketch follows this list.)
  • How well does the SEO community understand the function of nofollow links? Does the nofollow directive help create a lean site architecture?
  • Does Googlebot use the international country-switcher tool to crawl the site, and does this help international visibility?
  • Are non-scripted URLs picked up and indexed by the crawler?
  • What are the best practices for optimising internal site search for crawling? Do designated internal search pages pollute the SERPs with duplicates, or do they improve overall site indexation and crawlability?
  • Do the W3C's accessibility guidelines overlap strongly with white hat Google guidelines, and does improving site usability for users with disabilities also improve it for other visitors as well as search engines?
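
To ground the server log question above, here is a minimal Python sketch that counts Googlebot requests per URL from an Apache/Nginx combined-format access log. The file name is a placeholder, and matching on the user-agent string alone is only indicative, since it can be spoofed; a rigorous analysis would also verify the crawler via reverse DNS.

    # Count Googlebot hits per URL from a combined-format access log (sketch).
    import re
    from collections import Counter

    LOG_FILE = "access.log"  # hypothetical path
    # Request line, status, size, referer and user-agent from the combined log format.
    line_re = re.compile(r'"(?:GET|HEAD) (\S+) HTTP/[\d.]+" \d{3} \S+ "[^"]*" "([^"]*)"')

    hits = Counter()
    with open(LOG_FILE) as f:
        for line in f:
            m = line_re.search(line)
            if m and "Googlebot" in m.group(2):
                hits[m.group(1)] += 1

    # Most-crawled URLs first; cross-check these against pages carrying canonical directives.
    for url, count in hits.most_common(20):
        print(count, url)

Joining these counts against the set of pages carrying canonical directives shows whether Googlebot keeps spending crawl budget on duplicated URLs, which speaks directly to the thin and duplicate content question above.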

This experiment has been running for more than six months now, and the data collected has already partially answered some of these questions. We hope to conclude the study by the end of October.