Google Custom Search Results Not Loading [results on own domain]

A big problem with Google's Custom Search Engine is that the embed code it generates is not formatted for use on HTTPS websites.

The two URLs in the snippet, http://www.google.com and http://www.google.com/coop/cse/brand?form=cse-search-box&lang=en, are both insecure. This is the case even after providing Google with the secure version of the site where the embed code will be displayed. When the search box and results page are implemented on an HTTPS page, browsers block the insecure requests as mixed content, so most users see a blank page or a search box missing the Google Custom Search logo.

The simple fix for this issue is to change the http in both URLs to https. Secure resources load without complaint on secure and insecure pages alike, so this stops the mixed content errors, and visitors on your http and https site alike will have no trouble loading the search box or search results.
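To illustrate, here is a sketch of what the corrected search box embed might look like, based on the classic CSE snippet; the cx value below is a placeholder for your own engine's ID:

    <form action="https://www.google.com/cse" id="cse-search-box">
      <div>
        <!-- placeholder cx value; use the ID from your own CSE control panel -->
        <input type="hidden" name="cx" value="partner-pub-0000000000000000:0000000000" />
        <input type="hidden" name="ie" value="UTF-8" />
        <input type="text" name="q" size="31" />
        <input type="submit" name="sa" value="Search" />
      </div>
    </form>
    <!-- the brand script is now requested over https -->
    <script type="text/javascript" src="https://www.google.com/coop/cse/brand?form=cse-search-box&amp;lang=en"></script>

Protocol-relative URLs (//www.google.com/...) should also work if you serve your site over both HTTP and HTTPS.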

Hope this helps! Ask any questions in the comments!

Google Analytics “Redundant Hostnames”

As of October 14th, 2014, Google Analytics warns users when redundant hostnames are sending hits to a property.
Many users are now seeing the notice: ‘You have 1 unresolved issue: Redundant Hostnames.’

Redundant Hostnames - Property example.com is receiving data from redundant hostnames.

This means there is more than one hostname that can be used to reach the same pages. For example, example.com has redundant hostnames because it is accessible from both www.example.com and example.com. The issue can also occur if your site is reachable directly by its IP address. For optimal user experience and SEO, webmasters should 301 redirect all traffic to one consolidated domain.
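As a sketch, assuming an Apache server with mod_rewrite enabled and example.com as your preferred host, rules along these lines in your .htaccess will consolidate the www and IP-address variants; adapt the hostname (or use the equivalent directives if you're on nginx or IIS):

    # Send any request whose Host header isn't example.com
    # (e.g. www.example.com or the raw IP address) to example.com
    RewriteEngine On
    RewriteCond %{HTTP_HOST} !^example\.com$ [NC]
    RewriteRule ^(.*)$ http://example.com/$1 [R=301,L]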

In addition to providing 301 redirects, there are some best practices you can put into place to ensure your content is not duplicated across hosts.
The first is to add the following line to your robots.txt file:
Host: example.com
Replace example.com with your preferred host, be it www.yourdomain.com or just yourdomain.com. (Note that Host is a non-standard directive; historically it has been honored by Yandex, while Google relies on your redirects and Webmaster Tools settings.)
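For example, a minimal robots.txt with the directive in place might look like this:

    User-agent: *
    Disallow:

    # preferred host for the whole site
    Host: example.com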

Google Webmaster Tools also allows you to set a preferred hostname under “Site Settings”. This ensures that your host is consistent across all traffic from Google. You must have both the www and non-www versions of your site verified in WMT in order to use this feature.

Google Custom Crawl Rate

I’d like to share my experience with Google’s crawl rate change feature, found under Settings in Google Webmaster Tools.
There seems to be a consensus across the interwebs that this feature is only for slowing down Google’s crawl rate on your servers. Let me show you my logs, which tell a different story.

Around May 14th, 2014, I released a new site with several hundred thousand unique pages of decent content (sitemaps included).
Within 48 hours, Google was crawling the URLs in my sitemap at a rate of about 1 per second.
On the 19th, unsatisfied with a crawl rate of just 1 request per second and looking to improve it, I tweaked the settings in GWT to “limit” the crawl to 3 requests per second. I received the following confirmation message:

We’ve received a request from a site owner to change the rate at which Googlebot crawls this site: http:// – .co/
New crawl rate: Custom rate
– 3.062 requests per second
– 0.327 seconds per request
Your request will take a day or two to come into effect.
This new crawl rate will stay in effect for 90 days.
We recommend that you set a custom crawl rate only if you’re experiencing traffic problems with your server. To use Google’s recommended crawl rate, follow these steps.
1. On the Dashboard, select the site you want.
2. Click Settings.
3. In the Crawl rate section, select “Let Google determine my crawl rate”.

On the evening of May 20th, Google bumped my crawl rate up to 3 requests per second.

Here is a snapshot of the logs over the past couple of days showing the change. You’re welcome to draw your own conclusions, and I’d be happy to hear alternative explanations for why Google tripled my crawl rate.

[web request log]

This evening (May 20th), I made another change to increase the crawl rate yet again. We shall see whether, in another 48 hours, my crawl rate is bumped to 5 requests per second.

UPDATE, evening of May 21st: Just about 24 hours later, Google has once again bumped the crawl rate, this time to about 5 requests per second. I’m convinced that GWT’s crawl rate setting can be used to increase crawling of your site. If you have content Google is interested in AND your server can handle the load, max out your crawl settings!

UPDATE 2: My experience with a couple of other domains shows that it may take more than 24 hours (36-48 hours in some cases).
Cheers,
Luke