Bot protection

Bots are a threat to many sites. The term covers a wide variety of tools that competitors and attackers use to make automated requests, from small scripts to sophisticated software systems that use various techniques to masquerade as ordinary users' browsers. Although bots rarely create a high load on a site, they can perform unauthorized actions, threaten the security of user data, and disrupt the functioning of a website.

The primary threats that bots may pose are:

  • Data scraping.

    As a site owner, you may not want your competitors' bots to automatically download and process the information hosted on your site. Such bots may not threaten the performance of your site, but they can pose a threat to your business.

    Automatically traversing pages and extracting data from them is called scraping. The Qrator Labs solution makes scraping difficult by checking the digital fingerprints of users' devices with JavaScript code (see the fingerprinting sketch after this list). By default, the protection works transparently for legitimate users with minimal added latency and does not resort to captchas or other distractions that would degrade the user experience.

    JavaScript-based device fingerprint checks are used to combat data scraping.

  • Brute forcing.

    Bots are often created to discover information by exhaustive guessing. For example, bots can make a large number of login attempts, trying different passwords. In the same way, secret links, promo codes, and other values that you do not want to make public can be guessed. Such activity is called brute forcing.

    Qrator Labs detects brute-force attacks and blocks traffic specifically from the public IP addresses that such requests originate from. This limits bot activity with minimal impact on legitimate users (see the rate-limiting sketch after this list).

    A reverse HTTP proxy is used to combat brute-force attacks.
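
The snippet below is a minimal TypeScript sketch of the general fingerprinting technique described above: a script collects a few browser signals and submits them for verification. The specific signals and the /verify-fingerprint endpoint are illustrative assumptions; Qrator Labs' actual checks are proprietary and considerably more extensive.

    // Minimal sketch of browser fingerprinting (illustrative only; the
    // chosen signals and the /verify-fingerprint endpoint are hypothetical,
    // not Qrator Labs' actual implementation).
    interface Fingerprint {
      userAgent: string;
      language: string;
      screen: string;
      timezoneOffset: number;
      hasWebdriver: boolean; // headless automation tools often expose this flag
    }

    function collectFingerprint(): Fingerprint {
      return {
        userAgent: navigator.userAgent,
        language: navigator.language,
        screen: `${window.screen.width}x${window.screen.height}x${window.screen.colorDepth}`,
        timezoneOffset: new Date().getTimezoneOffset(),
        hasWebdriver: navigator.webdriver === true,
      };
    }

    // Send the collected signals to the protection layer, which can compare
    // them against profiles of real browsers before admitting the request.
    async function verifyVisitor(): Promise<boolean> {
      const response = await fetch("/verify-fingerprint", {
        method: "POST",
        headers: { "Content-Type": "application/json" },
        body: JSON.stringify(collectFingerprint()),
      });
      return response.ok;
    }

Because the check runs as ordinary JavaScript in the visitor's browser, a legitimate user never notices it, while simple scripts that do not execute JavaScript fail the check outright.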
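
The sketch below illustrates the kind of per-IP rate limiting a reverse proxy can apply to brute-force traffic. The BruteForceGuard class, the threshold, and the time window are illustrative assumptions, not Qrator Labs' actual detection logic or tuning.

    // Minimal sketch of per-IP brute-force detection (illustrative only).
    // An IP exceeding `maxAttempts` failed attempts within `windowMs` is
    // flagged for blocking.
    class BruteForceGuard {
      private attempts = new Map<string, number[]>(); // IP -> failure timestamps

      constructor(
        private maxAttempts = 10,
        private windowMs = 60_000,
      ) {}

      // Record a failed attempt and report whether the IP should be blocked.
      recordFailure(ip: string, now = Date.now()): boolean {
        const recent = (this.attempts.get(ip) ?? []).filter(
          (t) => now - t < this.windowMs,
        );
        recent.push(now);
        this.attempts.set(ip, recent);
        return recent.length > this.maxAttempts;
      }
    }

    const guard = new BruteForceGuard();
    if (guard.recordFailure("203.0.113.7")) {
      // At the reverse proxy, this is where traffic from the offending
      // address would be dropped or challenged.
    }

Keying the counters to individual source addresses is what keeps the collateral damage low: only the offending IPs are blocked, while the rest of the traffic passes through untouched.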

To enable bot protection on your account, please contact the Qrator Labs support specialists. Once the service is activated, you'll be able to configure the anti-bot policy in your dashboard.
