The Lazy Genius

Security News & Brain Dumps from Xavier Ashe, a Bit9 Client Partner

Stopping Automated Attack Tools

Posted by Xavier Ashe on May 26, 2005

Abstract:

An almost infinite array of automated tools exist to spider and
mirror application content, extract confidential material, brute force
guess authentication credentials, discover code-injection flaws, fuzz
application variables for exploitable overflows, scan for common files
or vulnerable CGIs, and generally attack or exploit web-based
application flaws. While of great value to security professionals, the
use of these tools by attackers represents a clear and present danger
to all organizations.

These automated tools have become increasingly popular for attackers
seeking to compromise the integrity of online applications, and are
used during most phases of an attack. Whilst there are a number of
defense techniques which, when incorporated into a web-based
application, are capable of stopping even the latest generation of
tools, unfortunately most organizations have failed to adopt them.

This whitepaper examines techniques which are capable of defending
applications against these tools; providing advice on their particular
strengths and weaknesses and proposing solutions capable of stopping
the next generation of automated attack tools.

By Gunter Ollmann.  Get the PDF at Infosecwriters.com
This is a good read and has some suggestions I had not thought of
before.  I strongly suggest looking at intrusion prevention if you
have public web servers.  Here's a peek inside the PDF:

The 10 most frequently utilised defences are:

  • Renaming the server hosting software,
  • Blocking HEAD requests for content information,
  • Use of the REFERER field to evaluate previous link information,
  • Manipulation of Content-Type to “break” file downloads,
  • Client-side redirects to the real content location,
  • HTTP status codes to hide informational errors,
  • Triggering thresholds and timeouts to prevent repetitive content requests,
  • Single-use links to ensure users stick to a single navigation path,
  • Honeypot links to identify non-human requests (see the sketch below),
  • Turing tests to block non-human content requests.
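
Two of those defences translate directly into code.  Here is a minimal
sketch, using only the Python standard library, of how honeypot links
and request-rate thresholds might work together: a hidden link that only
an automated spider would follow gets the client blacklisted, and a
rolling per-IP counter turns away anything requesting pages faster than
a person plausibly could.  The honeypot path, the rate limit, and the
handler names are my own illustrative assumptions, not values taken
from Ollmann's paper.

    # Sketch of two defences from the list above: honeypot links and
    # request-rate thresholds.  Paths and limits are illustrative only.
    import time
    from collections import defaultdict, deque
    from http.server import BaseHTTPRequestHandler, HTTPServer

    HONEYPOT_PATH = "/trap/admin-backup"  # linked in HTML, hidden from humans
    RATE_LIMIT = 60                       # max requests per client...
    RATE_WINDOW = 60                      # ...per rolling 60-second window

    blacklist = set()                     # client IPs we refuse to serve
    request_log = defaultdict(deque)      # client IP -> recent request times

    PAGE = f"""<html><body>
    <h1>Example catalogue</h1>
    <a href="/products">Products</a>
    <!-- Honeypot: invisible to people, but spiders follow every href -->
    <a href="{HONEYPOT_PATH}" style="display:none" rel="nofollow">backup</a>
    </body></html>"""

    class DefendedHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            ip = self.client_address[0]

            # Refuse clients that previously followed the honeypot link.
            if ip in blacklist:
                self.send_error(403, "Forbidden")
                return

            # Honeypot: only automated spiders should ever request this.
            if self.path == HONEYPOT_PATH:
                blacklist.add(ip)
                self.send_error(403, "Forbidden")
                return

            # Threshold: drop clients that exceed RATE_LIMIT requests
            # within the rolling RATE_WINDOW.
            now = time.monotonic()
            log = request_log[ip]
            log.append(now)
            while log and now - log[0] > RATE_WINDOW:
                log.popleft()
            if len(log) > RATE_LIMIT:
                self.send_error(429, "Too Many Requests")
                return

            # Normal response for well-behaved clients.
            body = PAGE.encode("utf-8")
            self.send_response(200)
            self.send_header("Content-Type", "text/html; charset=utf-8")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)

    if __name__ == "__main__":
        HTTPServer(("127.0.0.1", 8080), DefendedHandler).serve_forever()

In a real deployment the blacklist and request log would live in shared
storage rather than process memory, but the shape of the check is the
same: one signal that only a robot would trip, and one rate signal that
humans rarely exceed.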