Most businesses live or die on Google search rankings. If you can’t crack the top few results, it’s almost impossible to consistently drive organic traffic to your site.
But how does Google decide which sites appear at the top of the results list? It uses Googlebot, a proprietary automated crawler, to “crawl” every website on the Internet and build the index that search results are drawn from. The problem is that Googlebot’s behavior is opaque, making it nearly impossible for website creators to truly optimize for it and for other search engines’ crawlers.
Enter Botify, a cloud-based crawler that examines your website and produces a detailed SEO analysis with actionable feedback. The France-based company, which launched today onstage at Disrupt NY, has already raised $7.2 million and on-boarded big-name customers like eBay, BlaBlaCar and Expedia.
But a few things make Botify stand out from the hundreds of other SEO companies out there.
First, the crawler is powerful. The startup jokes that it holds the “world record” for the most URLs crawled from a single website — 150,000,000. Botify also lets site administrators specify the crawl speed — anywhere from 10 to more than 200 pages per second. Why would a site want its content crawled more slowly? Because Botify can crawl so fast that it sometimes stresses a site’s servers.
Beyond raw crawling power, Botify can also detect which pages on a site Google has crawled. When Googlebot visits a site, it leaves traces in the site’s server logs. Once the site provides these logs to Botify, the service compares them with its own crawl data to see whether Google has visited a given page.
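The core idea is straightforward even if Botify’s implementation is proprietary: scan the server’s access logs for requests whose user-agent string identifies Googlebot, and collect the paths it requested. The sketch below is an illustration of that idea only — it assumes logs in the common “combined” format, and the `googlebot_paths` helper is hypothetical, not part of any Botify tooling.

```python
import re

# Matches one line of the Apache/Nginx "combined" log format (an assumption;
# real deployments vary, and Botify's actual parsing is proprietary).
LOG_LINE = re.compile(
    r'\S+ \S+ \S+ \[[^\]]+\] "(?:GET|POST|HEAD) (?P<path>\S+) [^"]*" '
    r'\d{3} \S+ "[^"]*" "(?P<agent>[^"]*)"'
)

def googlebot_paths(log_lines):
    """Return the set of URL paths requested by a Googlebot user agent."""
    paths = set()
    for line in log_lines:
        m = LOG_LINE.match(line)
        if m and "Googlebot" in m.group("agent"):
            paths.add(m.group("path"))
    return paths

# Two fabricated sample log lines: one Googlebot hit, one ordinary browser.
sample = [
    '66.249.66.1 - - [01/May/2015:10:00:00 +0000] "GET /products/42 HTTP/1.1" '
    '200 512 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; '
    '+http://www.google.com/bot.html)"',
    '10.0.0.5 - - [01/May/2015:10:00:01 +0000] "GET /about HTTP/1.1" '
    '200 128 "-" "Mozilla/5.0 (Windows NT 10.0)"',
]

print(googlebot_paths(sample))  # {'/products/42'}
```

One caveat worth noting: user-agent strings can be spoofed, so a production tool would likely also verify the requesting IP (for example via reverse DNS) before treating a hit as a genuine Googlebot visit.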