Roboo uses advanced non-interactive HTTP challenge/response mechanisms to detect and subsequently mitigate HTTP robots by verifying the existence of HTTP, HTML, DOM, JavaScript and Flash stacks on the client side.
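
To illustrate the general idea, here is a minimal sketch of a non-interactive JavaScript challenge/response filter in Python/Flask. This is not Roboo's own code; the cookie name, secret and hashing scheme are assumptions made for the example. The server answers the first request with a tiny JavaScript challenge that sets a cookie and reloads, so only clients that actually execute JavaScript reach the real content:

```python
# Sketch of a non-interactive JavaScript challenge/response filter.
# Illustrative only: cookie name, secret and hashing scheme are assumptions.
import hashlib
from flask import Flask, request, make_response

app = Flask(__name__)
SECRET = "replace-with-a-random-secret"  # hypothetical shared secret

def expected_token(ip: str) -> str:
    # Token a real browser must echo back; tied to the client IP.
    return hashlib.sha256(f"{SECRET}:{ip}".encode()).hexdigest()[:16]

CHALLENGE_PAGE = """<!doctype html>
<script>
  /* A real browser executes this, sets the cookie and reloads;
     a bare HTTP robot with no JavaScript stack never gets past it. */
  document.cookie = "robocheck={token}; path=/";
  location.reload();
</script>"""

@app.before_request
def verify_client():
    token = expected_token(request.remote_addr or "")
    if request.cookies.get("robocheck") != token:
        # Challenge: serve the small JS page instead of the requested resource.
        return make_response(CHALLENGE_PAGE.replace("{token}", token), 200)

@app.route("/")
def index():
    return "Hello, verified browser."

if __name__ == "__main__":
    app.run()
```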

Such a deep level of verification weeds out the large percentage of HTTP robots that do not use real browsers or implement full browser stacks, mitigating various web threats:
  • HTTP Denial of Service tools - e.g. Low Orbit Ion Cannon
  • Vulnerability Scanning - e.g. Acunetix Web Vulnerability Scanner, Metasploit Pro, Nessus
  • Web exploits
  • Automatic comment posters/comment spam as a replacement for conventional CAPTCHA methods
  • Spiders, Crawlers and other robotic evil
You can find the first public version here.
