Definition and Meaning of PsBattle

PsBattle (sometimes incorrectly written PSBattle, Ps-Battle, or Ps Battle) is a term used to shorten “Photoshop Battle”.

Photoshop Battles happen when users of a social media platform come together to exercise their Photoshop skills on an original, interesting picture posted with the word PsBattle.

The subreddit /r/photoshopbattles is entirely dedicated to PsBattles: users post an interesting, usually unmodified picture and prefix the title with “PsBattle: “. Once a “PsBattle” image is posted, reddit users will then photoshop certain elements of the original picture and superimpose them onto other images, often making fun of the subject(s) of the original photo.

/r/photoshopbattles encourages the posting of ‘non-manipulated images’ to the subreddit, with the photoshopped versions posted in the comments.


How to download iOS 9

Apple’s new iOS, iOS 9, hasn’t been released yet. You’re probably looking for iOS 8, which came out today, September 17th, 2014.
You can get it using one of the methods below.

Method 1.
1) On your device, open “Settings”
2) Tap “General”
3) Tap “Software Update”
4) Download and install iOS 8

Method 2. (Recommended)
1) Download iOS 8 firmware for iPhone/iPad/iPod Touch.
2) Open iTunes and connect your iDevice via USB cable
3) Select your device from the left navigation panel on iTunes 11
4) Hold down the Shift (Windows) or Alt/Option (Mac) key and hit the restore button on iTunes
5) Browse and select the downloaded iOS 8 IPSW file. Let the restore process complete.

Method 3.
1) Connect your iPhone/iPod Touch/iPad to your computer running iTunes 11 via USB cable
2) A message will appear announcing that the new iOS 8 firmware is available
3) Hit the “Download and Install” button.
4) It’ll download and install iOS 8 on your iDevice.

iOS 8 Final Download Links:
Done. You can install via OTA (Method 1)

Mirror download links (for Method 2) coming soon. Very soon.

iPad Air (5th generation WiFi + Cellular)
iPad Air (5th generation WiFi)
iPad (4th generation CDMA)
iPad (4th generation GSM)
iPad (4th generation WiFi)
iPad mini (CDMA)
iPad mini (GSM)
iPad mini (WiFi)
iPad mini 2 (WiFi + Cellular)
iPad mini 2 (WiFi)
iPad 3 Wi-Fi (3rd generation)
iPad 3 Wi-Fi + Cellular (model for ATT)
iPad 3 Wi-Fi + Cellular (model for Verizon)
iPad 2 Wi-Fi (Rev A)
iPad 2 Wi-Fi
iPad 2 Wi-Fi + 3G (GSM)
iPad 2 Wi-Fi + 3G (CDMA)
iPhone 5 (CDMA)
iPhone 5 (GSM)
iPhone 5c (CDMA)
iPhone 5c (GSM)
iPhone 5s (CDMA)
iPhone 5s (GSM)
iPhone 6 (GSM)
iPhone 6 (CDMA)
iPhone 4s
iPhone 4 (GSM Rev A)
iPhone 4 (GSM)
iPhone 4 (CDMA)
iPod touch (5th generation)
iPhone 6 Plus (GSM)
iPhone 6 Plus (CDMA)

Taptic Engine – Apple Watch (iWatch)

The Taptic Engine is Apple’s feedback ‘vibrator’ that provides a new dimension to the Apple Watch.
It allows you to be notified by your Apple Watch, or even to send subtle taps to someone you’d like to contact, so the new watch isn’t limited to just visible notifications. The intention of this element in the Apple Watch seems to be to provide the feeling of a friend ‘tapping’ you on the wrist to get your attention.
Share heartbeat via Apple Watch
The “iWatch’s” features include amazing possibilities such as sharing your heartbeat with your significant other.

Taptic Engine

What great things can you envision this feedback provider doing for Apple devices?
I’m excited to no longer have to keep my iPhone in my pocket 🙂

Apple Watch Lineup

Related: is anyone looking to purchase the domains applepay.li or applepay.ws?

Nighthawk Router Antenna Upgrade Experience

I upgraded my Nighthawk router (running DD-WRT) with these antennas: http://www.amazon.com/gp/product/B00HMRJ8WK/ref=cm_cr_ryp_prd_ttl_sol_0

Here are some stats comparing the stock antennas to the new ones after my upgrade.
Wireless strength with the stock antennas:
(screenshot: signal strength readings, stock antennas)
Wireless strength with the new antennas:
(screenshot: signal strength readings, upgraded antennas)

The device ending in A6 is running on the 5 GHz band and is about 20 feet from the router. The device ending in B0 is on the 2.4 GHz band and is across the street in another building, 150+ feet away. Although the results are not amazing, remember that these percentages are on a logarithmic scale, so a 4 or 5% gain is actually a decent increase. Additionally, although I don’t have exact statistics for it, throughput with these antennas increased noticeably (I’m using it as a 5 GHz router with a client on the 2.4 GHz band to bridge the network connections).
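
To put the logarithmic scale in perspective, here’s a quick back-of-the-envelope sketch (this assumes the readings roughly track signal strength in dBm, which is an assumption on my part):

// Sketch: convert a gain in decibels to a linear power ratio.
// Assumption: the router's percentages roughly track dBm values.
function dbToPowerRatio(db) {
  return Math.pow(10, db / 10);
}

console.log(dbToPowerRatio(3)); // ~2.0, i.e. 3 dB of gain doubles the power
console.log(dbToPowerRatio(5)); // ~3.2x the power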

Additionally, I could likely increase the TX broadcast strength of this router (currently running at 251 mW) for some further gain.

8/10 do recommend the upgrade on the Nighthawk running DD-WRT.

Recommended Crawl Rate for Bots

You can set your desired bot crawl delay in your robots.txt file by adding this after the user-agent field: Crawl-Delay: 10

That will cause most legitimate robots to wait 10 seconds between requests as they crawl your site for links. (Note that Googlebot ignores Crawl-Delay; its rate is controlled through Google Webmaster Tools instead, as discussed below.)
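
For illustration, a minimal robots.txt along those lines might look like this (the 10-second delay and the wildcard user-agent are just placeholders):

# Ask all compliant crawlers to wait 10 seconds between requests
User-agent: *
Crawl-Delay: 10
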
My recommendation, however, is not to set a crawl delay at all. You want bots like Googlebot and Bingbot to crawl your website as often as possible so your freshest content is in the search results. It’s only when you have an underpowered server, perhaps running poorly written code, that you want to add a crawl delay; in that case, you don’t want the bots to overwhelm your server with traffic and cause it to crash. Googlebot, however, is pretty smart: if it notices increased response times due to the volume of requests it is sending, it will back off and make requests more slowly. I’m unsure how Bingbot handles an accidental DoS, but you can set your preferred crawl settings in Bing Webmaster Tools so Microsoft can focus their crawling on off-peak times and keep from overwhelming your server.

In terms of SEO, faster crawling is better, and quality new content is key.
Questions and experiences in the comments!
Cheers,
Luke

Google Apps script bot “GoogleApps script”

The GoogleApps Script Bot is a user agent / bot that Google Apps Script (Google’s server-side JavaScript platform) uses to fetch pages. An example and the JS code are found below.
Here’s the standard Apache log entry when this bot accesses your site:
64.233.172.162 - - [02/Jun/2014:15:12:59 -0400] "GET / HTTP/1.1" 200 749 "-" "Mozilla/5.0 (compatible; GoogleApps script; +http://script.google.com/bot.html)"
This bot can be used by any Google Docs user to scrape or otherwise access content on a website.
The particular IP accessing my site resolved to google-proxy-64-233-172-162.google.com.
I’ve set up a test doc for anyone who would like to check out the script in action. The code powering it can be found under the script editor.
You’re welcome to punch in your own URL to have the crawler fetch your site.
See it in action: https://docs.google.com/a/rehmann.co/spreadsheets/d/1junWawm5kNziFJAZHdUP9wMpW9o-HHu3DGgQ0bLTyY4/edit#gid=0

Unfortunately, it looks like this script doesn’t follow the rules set forth in your robots.txt file, so if your website is being abused by a user wielding the Google Docs script bot, I would block the IP or set up your site to serve a 404 error to any user agent matching “GoogleApps script”.
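
For example, here’s a minimal .htaccess sketch along those lines (assuming Apache with mod_rewrite enabled; treat it as a starting point and test before deploying):

# Serve a 404 to any request whose User-Agent contains "GoogleApps script"
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} "GoogleApps script" [NC]
RewriteRule .* - [R=404,L]
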
I’d love to hear of your experience with this bot. Are people abusing it to scrape your content?

Cheers,
Luke

Functioning Example:
https://docs.google.com/a/rehmann.co/spreadsheets/d/1junWawm5kNziFJAZHdUP9wMpW9o-HHu3DGgQ0bLTyY4/edit#gid=0
(feel free to make a copy or test with your own URL)
Code behind it:
function readRows() {
  // Log every row of the active spreadsheet
  var sheet = SpreadsheetApp.getActiveSheet();
  var rows = sheet.getDataRange();
  var numRows = rows.getNumRows();
  var values = rows.getValues();

  for (var i = 0; i <= numRows - 1; i++) {
    var row = values[i];
    Logger.log(row);
  }
}

// Fetch a URL and return the page body as text
function GetPage(url) {
  var response = UrlFetchApp.fetch(url);
  return response.getContentText();
}

// URL-encode a single value, or every cell of a 2D range array
function encodeURIC(r) {
  if (r.constructor == Array) {
    var out = r.slice();
    for (var i = 0; i < r.length; i++) {
      for (var j = 0; j < r[i].length; j++) {
        out[i][j] = encodeURIComponent(r[i][j].toString());
      }
    }
    return out;
  } else {
    return encodeURIComponent(r.toString());
  }
}
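
In a Google Sheet, functions like these become custom formulas; presumably the demo spreadsheet invokes the fetcher from a cell with something like the following (the URL here is just a placeholder):

=GetPage("http://example.com/")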

Google Custom Crawl Rate

I’d like to share my experience with Google’s crawl rate change feature under settings in Google Webmaster Tools.
It seems there is a consensus across the interwebs that this feature is only for slowing down Google’s crawl rate on your servers. Let me show you my logs, which say otherwise.

Around May 14th, 2014, I released a new site of mine with several hundred thousand unique pages with decent content (sitemaps too).
Within 48 hours, Google was crawling all the URLs in my sitemap at a rate of about 1 per second.
On the 19th, unsatisfied with the crawl rate of just 1 per second and looking to improve it, I tweaked the settings in my GWT to “limit” the crawl to 3 requests per second. I received the following confirmation message:

We’ve received a request from a site owner to change the rate at which Googlebot crawls this site: http:// – .co/
New crawl rate: Custom rate
– 3.062 requests per second
– 0.327 seconds per request
Your request will take a day or two to come into effect.
This new crawl rate will stay in effect for 90 days.
We recommend that you set a custom crawl rate only if you’re experiencing traffic problems with your server. To use Google’s recommended crawl rate, follow these steps.
1. On the Dashboard, select the site you want.
2. Click Settings.
3. In the Crawl rate section, select “Let Google determine my crawl rate”.

On the evening of May 20th, Google bumped my crawl rate up to 3 requests per second.

Here is a snapshot of the logs over the past couple of days showing the change. You’re welcome to draw your own conclusions, and I’d be happy to hear alternative explanations for why Google tripled my crawl rate.

(screenshot: web request log)

This evening (May 20th), I made another change to increase the crawl rate yet again. We shall see if in 48 hours my crawl rate is bumped to 5 requests per second.

UPDATE, evening of May 21st: Just about 24 hours later, Google has once again bumped their crawl rate up, to about 5 requests per second. I’m convinced that GWT’s crawl rate setting can be used to increase the crawl rate on your site. If you have content that Google is interested in AND your server can handle the load, max out your crawl settings!

UPDATE 2: My experience with a couple of other domains shows that it may take more than 24 hours (36-48 in some cases).
Cheers,
Luke

CrawlDaddy Crawler Bot?

Shortly after creating a new website & domain, the following requests from CrawlDaddy popped up in the logs:

64.202.161.41 - - [13/May/2014:10:54:37 -0400] "GET /index.php HTTP/1.1" 200 5789 "-" "Mozilla/5.0 (compatible; MSIE 9.0; Windows NT 6.1; CrawlDaddy v0.3.0 abot v1.2.0.0 http://code.google.com/p/abot)"

64.202.161.46 - - [13/May/2014:10:54:37 -0400] "GET /FAQ.php HTTP/1.1" 200 8391 "-" "Mozilla/5.0 (compatible; MSIE 9.0; Windows NT 6.1; CrawlDaddy v0.3.0 abot v1.2.0.0 http://code.google.com/p/abot)"

The bot requested 7 pages via the IPs 64.202.161.41 and 64.202.161.46, and then finally made a HEAD request for the homepage before exiting:

64.202.161.41 - - [13/May/2014:10:54:40 -0400] "HEAD / HTTP/1.1" 200 - "-" "-"

The pages all existed; it seemed the bot was crawling rather than checking for URL-related vulnerabilities.
Based on the URL provided in the user agent, the crawler seems to be based on abot, an open-source website crawler project.
It’s a curious thing that the crawler is coming from GoDaddy’s IP block (64.202.160.0/19) and that the bot did not request a robots.txt file.
A Google search for CrawlDaddy didn’t reveal much information on this bot; I’d love to hear about your experiences with it in the comments.

Cheers,
Luke