Roboo uses advanced non-interactive HTTP challenge/response mechanisms to detect and then mitigate HTTP robots by verifying the existence of HTTP, HTML, DOM, JavaScript and Flash stacks on the client side.

This deep level of verification weeds out the large percentage of HTTP robots that do not use real browsers or implement full browser stacks, mitigating a variety of web threats:
  • HTTP Denial of Service tools - e.g. Low Orbit Ion Cannon
  • Vulnerability Scanning - e.g. Acunetix Web Vulnerability Scanner, Metasploit Pro, Nessus
  • Web exploits
  • Automatic comment posters/comment spam, as a replacement for conventional CAPTCHA methods
  • Spiders, Crawlers and other robotic evil
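The challenge/response idea behind this filtering can be sketched roughly as follows. This is a minimal illustration of the general pattern, not Roboo's actual implementation; the hashing scheme and function names here are invented for the example. The server issues a random seed, the injected JavaScript must compute a response from it and return it (typically as a cookie), and clients without a working JS stack never produce the answer:

```python
import hashlib
import secrets

def issue_challenge():
    # Server embeds this random seed in the challenge page; the
    # page's JavaScript must derive a response from it.
    return secrets.token_hex(16)

def expected_response(seed):
    # The same computation the client-side JavaScript would perform.
    # A robot that never executes the script cannot produce this value.
    return hashlib.sha256(seed.encode()).hexdigest()

def verify(seed, cookie_value):
    # Real browsers pass the check; bare HTTP robots fail it.
    return cookie_value == expected_response(seed)

seed = issue_challenge()
# A full browser runs the script and replies with the computed value:
assert verify(seed, expected_response(seed))
# A bare HTTP robot sends nothing and is weeded out:
assert not verify(seed, None)
```

Because the challenge is solved automatically by any real browser, the check stays non-interactive: legitimate users never see it, unlike a CAPTCHA.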
You can find the first public version here.
