About Our Bot
 

Our automated bot is designed to help identify and monitor security risks across the web. It is part of our mission to make the internet a safer place by detecting malicious behavior, phishing pages, and other deceptive or harmful practices.

 

Purpose

 

The bot performs non-invasive scans of public web pages to collect signals related to trustworthiness, hosting patterns, domain age, cloaking behavior, and other web-based indicators of abuse or fraud. This data is used solely to improve user safety and threat intelligence.
 

Respect for Site Owners

 

We strictly adhere to the rules defined in robots.txt files. If a site's robots.txt file disallows our bot, we do not access that site.
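As an illustration, this kind of robots.txt check can be done with Python's standard `urllib.robotparser`. The rules and URL below are hypothetical examples, not taken from any real site:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt a site might publish to block our bot.
rules = """User-agent: StingrayBot
Disallow: /
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# With this rule in place, StingrayBot may not fetch any page on the site,
# while other crawlers remain unaffected.
allowed = parser.can_fetch("StingrayBot", "https://example.com/login")
print(allowed)  # False
```

A crawler that runs a check like this before every request never touches a page the site owner has disallowed.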
 

Identification

 

Our bot identifies itself with a clear User-Agent string that includes our organization name and contact information. If you operate a website and have questions or concerns about our bot’s behavior, please contact us directly at security@stingray.com.

 

Responsible Crawling

 

We crawl at a conservative rate and avoid placing unnecessary load on web servers. Domains that rate-limit our requests, are unreachable, or disallow access are excluded from future requests.
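A politeness policy like this can be sketched as a per-domain delay plus an exclusion list. The specific thresholds below are illustrative assumptions, not our actual settings:

```python
import time

CRAWL_DELAY_SECONDS = 10   # assumed minimum pause between requests to one domain
MAX_FAILURES = 3           # assumed failure count before a domain is excluded

last_request = {}   # domain -> monotonic timestamp of last request
failures = {}       # domain -> consecutive failure count
excluded = set()    # domains we will no longer contact

def may_request(domain, now=None):
    """Return True if a request to `domain` is permitted right now."""
    if domain in excluded:
        return False
    now = time.monotonic() if now is None else now
    last = last_request.get(domain)
    if last is not None and now - last < CRAWL_DELAY_SECONDS:
        return False  # too soon; stay polite
    last_request[domain] = now
    return True

def record_failure(domain):
    """Count a timeout, 429, or refusal; exclude the domain after repeats."""
    failures[domain] = failures.get(domain, 0) + 1
    if failures[domain] >= MAX_FAILURES:
        excluded.add(domain)
```

For example, `may_request("example.com")` succeeds at most once per delay window, and three calls to `record_failure("example.com")` would remove the domain from all future crawling.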

 

Opt-Out

 

To opt out of future visits from our bot, add the following directive to your site's robots.txt file:

User-agent: StingrayBot
Disallow: /

 

We check for robots.txt updates before each crawl and respect all opt-out requests.

Made with ❤️, 🔥, and 🧠 in Canada

Copyright © 2025 Stingray Security Ltd. All rights reserved.
