Semalt: How To Fight Spambots Like Buttons-For-Website, Darodar And Others

Many Google Analytics reports show sudden traffic spikes from unfamiliar referrers. When users notice such a spike, they often copy the referral link and paste it into a new browser tab to find out where the traffic came from. A spambot is usually at work when the referring page turns out to be full of links selling SEO services, with the user's own site nowhere to be found.

Spambots are frustrating and clever. They prey on web administrators who do not understand how spambots work. Whatever their motive, spambots ruin a site's metrics, so such traffic should be filtered out as soon as it appears.

Therefore, Lisa Mitchell, a leading expert at Semalt, outlines in this article several ways of eliminating spambots from Google Analytics reports.

Bot Filtering in Google Analytics

This is the easiest way to eliminate bots from Google Analytics. When the feature is enabled, bot traffic still hits the site, but Google Analytics excludes its effect from reports. The feature must be re-enabled when creating a new website view or switching Google Analytics accounts. The detailed procedure is outlined below:

  • Sign in to your Google Analytics account.
  • Choose the property that needs to be worked on.
  • Click the Admin button (at the top) and select "View Settings" in the right-most column.
  • Scroll down and check the box for "Exclude all hits from known bots and spiders".

Google Analytics Filters

Setting up a filter that hides site traffic from a certain domain or ISP (Internet Service Provider) prevents that referrer from appearing in future web metrics. Internet experts regard this as an out-of-sight, out-of-mind fix: the spambots will still hit the site, but Google Analytics will no longer report the traffic. The approach requires creating several filters, and whenever spambots change their TLD (Top Level Domain), another filter is needed. Even so, experts recommend Google Analytics filters because whenever a user changes web hosts or recodes their site, nothing needs to be copied over, as long as the same Google Analytics account is still in use.
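A Google Analytics exclude filter is essentially a regular expression matched against the campaign source. As a rough sketch (the domain list below is illustrative, taken from the spambots named in the title), the pattern can be checked locally before pasting it into the filter's "Filter Pattern" field:

```python
import re

# Illustrative referrer-spam pattern, written as it would be pasted into
# a Google Analytics "Campaign Source" exclude filter. Dots are escaped
# so they match literally; subdomains are matched via the (^|\.) prefix.
SPAM_REFERRERS = re.compile(
    r"(^|\.)(darodar\.com|buttons-for-website\.com)$"
)

def is_spam_referrer(hostname: str) -> bool:
    """Return True if the referring hostname matches the spam list."""
    return bool(SPAM_REFERRERS.search(hostname.lower()))
```

Note that each spambot domain (and each new TLD variant) must be added to the pattern by hand, which is exactly why this approach needs ongoing maintenance.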

htaccess Rules

This technique stops spambots before the server delivers the first byte of front-end code. Its main advantage is that a .htaccess file placed in a site's public_html directory blocks spambots for everything on that server, so users hosting many sites need to set it up only once. A major drawback is that one must remember to carry these rules over every time they change hosts or rebuild a site that the previous .htaccess file does not cover.
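A minimal sketch of such rules, assuming an Apache server with mod_rewrite enabled (the blocked domains are examples from this article's title; extend the list as new spambots appear):

```apache
# Block known referrer spambots before any page code runs.
# Requires mod_rewrite; place in the site's top-level .htaccess.
RewriteEngine On
RewriteCond %{HTTP_REFERER} darodar\.com [NC,OR]
RewriteCond %{HTTP_REFERER} buttons-for-website\.com [NC]
RewriteRule .* - [F,L]
```

The [NC] flag makes the match case-insensitive, and [F] returns a 403 Forbidden response, so the request never reaches the site's pages or its analytics tracking code.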

Valid Hostname

Most internet experts prefer this way of blocking spambots. In most instances, the method is combined with several server-side filters (such as the htaccess rules above) as well as a custom PHP function that pulls a domain list from a regularly updated list of the most common spambots. Rather than filtering out unwanted domains one by one, the method allows through only valid hostnames.
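The article describes this as a custom PHP function; a minimal sketch of the same whitelist idea in Python (the hostnames below are hypothetical placeholders; replace them with the domains that legitimately serve your pages):

```python
# Whitelist approach: instead of chasing known-bad referrer domains,
# accept a hit only when its hostname is one a real visitor could use.
VALID_HOSTNAMES = {
    "example.com",        # hypothetical production domain
    "www.example.com",
    "shop.example.com",
}

def is_valid_hit(hostname: str) -> bool:
    """Keep the hit only if it came from a hostname we actually serve."""
    return hostname.lower() in VALID_HOSTNAMES
```

Because the whitelist only ever contains your own domains, it never needs updating when spambots appear under new names or TLDs, which is the chief advantage over the exclusion filters described earlier.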