Sifting bot traffic out from human traffic is important when you analyse your traffic data. To detect the different sorts of traffic, many small to medium businesses rely on the free version of Google Analytics; others employ web engineers to spot bots. Whether you use a human expert or an analytics tool, both look out for similar visitor behaviours.
Is Bot Traffic Bad?
There are two types of bot traffic – good and bad. Good bots crawl your site to check whether your content matches search terms, which can drive your website up the SERPs. When disguised as human traffic, bots ordered in large numbers can provide simple conversions (clicks and page visits). Others act as enforcers, checking that you don’t publish copyrighted content or carry out other illegal activities.
However, some bots are sent with malicious intent – to steal data, content and credit card information, for example. They can be programmed to inject malware that damages your website. Using a professional, regularly updated antivirus and malware tool can prevent such damage.
Why Do I Need Bot Detection Tools?
Whether or not you order bot traffic to improve your marketing strategy, it is very important to separate bot traffic from human web traffic during data analysis. Why? Bots don’t respond to campaign design, price offers, page layout or free shipping; they do whatever the person who sends them tells them to do. If you pay for bot traffic to arrive in generous volumes every month to make Google’s crawlers think your website is very popular, this non-human traffic will land and add pageviews whatever the product, service, campaign or design.
But bot traffic doesn’t convert when you are selling something – bots don’t have bank accounts. So when bot data is mixed with the behavioural data of your human visitors, you can’t tell how well your online presence is really performing.
Humans, on the other hand, definitely do react to campaign design, price offers, page layout and free shipping. This means you need to use bot detection tools and filter bot traffic out of your data.
What Does Bot Traffic Look Like?
Analytics tools and engineers look at a number of characteristics to detect bots. These characteristics are:
- Very high bounce rate: many bot traffic sources don’t care how long a bot remains on a page. Their bots load a page, interact with nothing and leave the domain within seconds – and any visit that does nothing on a webpage and leaves within around three seconds adds to your bounce rate.
- Unusual session durations: if your web page’s average visitor session duration suddenly becomes very high or very low, this can indicate large volumes of bot visitors. When numerous bots clog the bandwidth of a smaller site, loading times slow down, so both human and bot visitors take longer to navigate the pages and your session duration metric rises. Conversely, if the bot traffic landing on your site is instructed to click through the pages, it tends to do so significantly faster than a human visitor, and your session duration metric falls well below average.
- Strange form entries: if your signup forms are filling up with fake email addresses, names and telephone numbers, this is a sure sign of bot activity.
- Very high pageviews: if you have ordered bot traffic as part of your SEO strategy, you will notice a sudden surge in your pageview metric. The same applies to paid human web traffic – or even to a successful organic campaign.
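The behaviours above can be turned into simple rules. Here is a minimal, hypothetical sketch of such a heuristic check – the `Session` fields, thresholds and regex are illustrative assumptions, not part of any real analytics API, and a production detector would need far more signals:

```python
import re
from dataclasses import dataclass, field

@dataclass
class Session:
    duration_seconds: float  # time spent on the landing page
    interactions: int        # clicks, scrolls, form events
    click_intervals: list = field(default_factory=list)  # seconds between clicks
    form_email: str = ""     # email submitted via a signup form, if any

def looks_like_bot(s: Session) -> bool:
    # Bounce heuristic: left within three seconds without interacting.
    if s.duration_seconds < 3 and s.interactions == 0:
        return True
    # Speed heuristic: clicking through pages far faster than a human could.
    if s.click_intervals and max(s.click_intervals) < 0.5:
        return True
    # Form heuristic: a signup "email" that is not even email-shaped.
    if s.form_email and not re.match(r"^[^@\s]+@[^@\s]+\.[^@\s]+$", s.form_email):
        return True
    return False

sessions = [
    Session(duration_seconds=1.2, interactions=0),                             # instant bounce
    Session(duration_seconds=40, interactions=6, click_intervals=[0.2, 0.3]),  # inhuman click speed
    Session(duration_seconds=90, interactions=4, click_intervals=[8, 12],
            form_email="jane@example.com"),                                    # plausible human
]
flags = [looks_like_bot(s) for s in sessions]
print(flags)  # → [True, True, False]
```

Each rule maps to one of the characteristics listed above; tuning the thresholds against your own human baseline matters more than the exact numbers used here.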
How Can I Detect Bot Traffic And Filter It Out?
A web engineer will know exactly how to filter out bot traffic and produce bot-free reports for his or her clients. However, not all of us have the luxury of an expert, and many rely on Google Analytics instead.
Google Analytics detects and removes bots via its “Exclude all hits from known bots and spiders” option. This removes the most common types, although malicious bots have been designed to get past this first obstacle. Web bots are also called ‘intelligent agents’, and even Google can’t detect the more advanced ones, even though web bot detection is an active area of research in the computer science community. Antivirus tools with bot spam detection features can stop some of these bots from landing at all.
If you know where your bot traffic is coming from, you can also filter those sources out manually in your analytics tool – for example, by excluding a referral domain or IP address.
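Manual filtering of this kind amounts to matching hits against a list you maintain yourself. A minimal sketch, assuming hypothetical user-agent substrings and a placeholder IP list (the entries below are illustrative examples, not a vetted blocklist):

```python
# User-agent fragments and IPs to exclude; maintain these yourself.
KNOWN_BOT_AGENTS = ("googlebot", "bingbot", "ahrefsbot", "semrushbot")
BLOCKED_IPS = {"203.0.113.7"}  # TEST-NET address used purely as a placeholder

def is_known_bot(user_agent: str, ip: str) -> bool:
    # A hit is excluded if its IP is blocked or its user agent
    # contains any known bot fragment (case-insensitive).
    ua = user_agent.lower()
    return ip in BLOCKED_IPS or any(bot in ua for bot in KNOWN_BOT_AGENTS)

hits = [
    {"ua": "Mozilla/5.0 (compatible; Googlebot/2.1)", "ip": "66.249.66.1"},
    {"ua": "Mozilla/5.0 (Windows NT 10.0; Win64; x64)", "ip": "198.51.100.4"},
    {"ua": "Mozilla/5.0", "ip": "203.0.113.7"},
]
human_hits = [h for h in hits if not is_known_bot(h["ua"], h["ip"])]
print(len(human_hits))  # → 1
```

The same substring-and-IP logic is what an analytics exclusion filter applies for you; doing it in code is only worthwhile when you are processing raw server logs directly.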
When you don’t take the time to separate bot from human website traffic, your data will be unreliable. With up to 50% of all web traffic coming from non-human sources, half of your gathered data could be misleading your marketing efforts – unless all you want is pageviews and clicks, that is.