Message-ID: <BLU0-SMTP82AEE56542B8A96651FDDEFD0C0@phx.gbl>
Date: Tue, 5 Jun 2012 09:27:46 +0200
From: Frank Dittrich <frank_dittrich@...mail.com>
To: john-users@...ts.openwall.com
Subject: Designing a good password cracking contest,

An internal discussion about the PHD contest triggered some thoughts
about how to design such a contest.
Alexander asked me to re-post my thoughts on john-users.

Designing a good contest really is rather complex.

If I were to design a contest, my first priority would be to prove that
I really know the passwords to all the hashes prepared for the contest.
This could be done by posting an additional file with one line per
contest hash:
<contest hash>\t<other hash>
The other hash should use a different hash format which is much harder
to crack than the contest hash. Additionally, an unusually high iteration
count could be used, so that attacking these hashes instead of the
contest hashes doesn't make sense, even if your cracking tool doesn't
support a particular contest hash format.
(For password protected zip files etc, the archive could contain a file
with the clear text password.)
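
A minimal sketch of what such a commitment file could look like (raw MD5
as the contest format and PBKDF2-HMAC-SHA512 with a huge iteration count
as the slow hash are just illustrative choices, not a recommendation):

import hashlib, os

def commitment_line(password):
    # the (fast) contest hash that the teams are supposed to attack
    contest_hash = hashlib.md5(password.encode()).hexdigest()
    # a deliberately slow hash of the same password; 5,000,000 iterations
    # should make attacking the commitment instead of the contest hash
    # pointless
    salt = os.urandom(16)
    slow = hashlib.pbkdf2_hmac("sha512", password.encode(), salt, 5000000)
    return "%s\t%s$%s" % (contest_hash, salt.hex(), slow.hex())

with open("commitments.txt", "w") as out:
    for pw in ["example1", "example2"]:  # the real contest plaintexts
        out.write(commitment_line(pw) + "\n")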


The next problem is to ensure that the contest hashes match reality.
So, instead of generating those hashes with some tool, I'd prefer to
generate (or at least: verify) them on a system using those hashes.
For a large number of hashes this can be hard even if the number of
different hash formats is limited.
John supports a plethora of hash formats, so even getting hold of
systems/applications which use these formats can be quite a challenge.
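
One way to do the "at least verify" part for crypt(3)-style hashes,
assuming a Unix system where Python's crypt module wraps the native
crypt(3) (a sketch of the idea, not a complete verification tool):

import crypt

def verify_on_system(password, stored_hash):
    # crypt() re-hashes the password with the salt and parameters embedded
    # in stored_hash; a match means the system's own mechanism really
    # produces this hash for this password
    return crypt.crypt(password, stored_hash) == stored_hash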


But to test not just the power of the tools used in the contest, but
also the abilities of their users, more needs to be done.

When designing a contest, you can do a lot to favor people with good
pattern recognition abilities (not limited to password patterns).

A good pen tester will be able to detect weaknesses in a specific
implementation or configuration. So I would try to test this ability:

For salted hash formats that support an iteration count, use a variety
of different iteration counts.
Using an iteration count that is not the default for that format might
also test the ability of the cracking tool, but the main idea here is:
The pen testers should realize that for a certain format there are
easier-to-attack hashes. So they should attack these hashes first, or at
least spend more effort on these hashes, to detect patterns that could
be used when cracking the hashes with a higher iteration count.
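
A sketch of how such a mix could be generated, with sha512crypt as an
example format and passlib (a third-party library) doing the hashing;
the 20% share and the rounds values are made up:

from passlib.hash import sha512_crypt
import random

def make_contest_hash(password):
    # most accounts get an expensive setting, a minority get a much
    # cheaper one - attentive teams should crack the cheap ones first
    # and mine them for patterns
    rounds = 5000 if random.random() < 0.2 else 500000
    return sha512_crypt.using(rounds=rounds).hash(password)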

Similarly, I would of course use an uneven distribution of hashes across
salts. (To make this less obvious, I would do that for formats with a
limited salt space, like DES.)
So, the pen testers should try to crack the hashes with a higher number
of hashes per salt first, then use the password patterns or base words
when attacking the other hashes.
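
A sketch of the skewed salt distribution for descrypt (passlib assumed
again; the 50% share and the shared salt value are arbitrary):

from passlib.hash import des_crypt
import random, string

SALT_CHARS = "./" + string.ascii_letters + string.digits
popular_salt = "ab"   # hypothetical salt shared by many accounts

def make_des_hash(password):
    # half of the hashes share one salt, the rest get random salts;
    # teams that notice this should attack the crowded salt first
    if random.random() < 0.5:
        salt = popular_salt
    else:
        salt = "".join(random.choice(SALT_CHARS) for _ in range(2))
    return des_crypt.using(salt=salt).hash(password)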

Then I would throw in some files with ambiguous hashes, so that the pen
testers will have to find out which hash type (format) has been used for
which file. (There could be several files with the same hash type.)
So the testers would have to throw all those ambiguous hashes into one
large file, then try to crack the easiest passwords for each hash
format, then split these hashes into separate files based on the cracked
hashes and their origin.
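
For 32-hex-digit hashes, the sorting step could look roughly like this
once a password from one of the files has been cracked (raw MD5 and NTLM
as candidate formats are just examples; MD4 support in hashlib depends on
the OpenSSL build):

import hashlib

def identify_format(cracked_password, stored_hash):
    stored = stored_hash.lower()
    if hashlib.md5(cracked_password.encode()).hexdigest() == stored:
        return "raw-md5"
    try:
        # NTLM is MD4 over the UTF-16LE encoded password
        ntlm = hashlib.new("md4",
                           cracked_password.encode("utf-16-le")).hexdigest()
        if ntlm == stored:
            return "ntlm"
    except ValueError:
        pass   # md4 not available in this OpenSSL build
    return "unknown"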

Since there have been broken implementations using (part of) the
password as the salt, I would also provide files with hashes generated
by such flawed implementations.
E.g., DES hashes using the first 2 characters of the password as the
salt. Once the pen testers have cracked the first hashes and detected
that pattern, they need to spend time preparing an external mode, a
specially crafted .chr file, or even an adjusted DES format
implementation in order to crack these hashes efficiently.
Using the password as the salt could even be made more obvious (like
using 123456 as the salt of an MD5 hash for the password 123456), or
less obvious (for formats which use hex salts or MIME encoding...).
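
A sketch of both sides of that flaw, again with descrypt and passlib; it
assumes the first two password characters happen to be valid crypt salt
characters:

from passlib.hash import des_crypt

def flawed_hash(password):
    # broken implementation: the salt leaks the first two password chars
    return des_crypt.using(salt=password[:2]).hash(password)

def worth_trying(candidate, stored_hash):
    # descrypt hashes start with the two salt characters, so only
    # candidates whose first two characters match the salt need testing
    return candidate[:2] == stored_hash[:2]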

I'd also throw in some samples similar to the Gawker user data published
after the break-in in 2010.
Here, for several accounts two hashes were stored: a DES hash and an
OpenBSD Blowfish hash.
After cracking the DES hash, you just need to append additional
characters (if the DES password is exactly 8 characters long) and/or
take non-ASCII characters into account.
Similar tasks could be created using LM/NT or SAP hashes (CODVN G or H).
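
The follow-up attack could be sketched like this: descrypt only sees the
first 8 characters, so the cracked DES password becomes a prefix and
short suffixes are verified against the bcrypt hash of the same account
(passlib with a bcrypt backend assumed; suffix alphabet and length are
arbitrary):

import itertools, string
from passlib.hash import bcrypt

def extend_des_crack(des_prefix, bcrypt_hash, max_extra=2):
    # try the 8-character prefix on its own, then short suffixes
    for extra in range(max_extra + 1):
        for tail in itertools.product(string.printable.strip(), repeat=extra):
            candidate = des_prefix + "".join(tail)
            if bcrypt.verify(candidate, bcrypt_hash):
                return candidate
    return None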

Depending on what abilities of the teams you'd want to test, you could
also include some formats which currently are not supported by any
password cracker.
(For a 2 day contest, you'd probably have to limit your choice to
password hash algorithms that are already open source.
Installing some non-free software and reverse engineering the password
hash algorithm might be out of scope.)
An image of a LUKS encrypted partition would be a good candidate.
(You could even use a very high iteration count and a complicated
passphrase for the first slot, and a simpler password and smaller
iteration count for the second slot, and let the smarter teams attack
the passphrase of the second slot.)


Since several implementations limit the maximum password length for a
higher c/s rate, I would even throw in some very simple, but long,
passwords - close to the maximum password length supported by the
systems using those hash formats.
To make it obvious that such long passwords could exist, I'd throw in
some shorter samples of the same pattern:
aaaaaaaa
bbbbbbbbbbbb
cccccccccccccccccccccccccc
or
1212121212121212121212121212
abababababaabababab
or
123aaaaaaaaaaaaaaaaaaaaaaaaaaa
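
Generating candidates for such patterns is cheap; a sketch (the blocks,
the prefix and the 32-character limit are illustrative):

MAX_LEN = 32   # match the target format's real maximum password length

def pattern_candidates():
    # pure repetitions: aaaa..., bbbb..., 121212..., abab...
    for block in ("a", "b", "c", "12", "ab"):
        for reps in range(1, MAX_LEN // len(block) + 1):
            yield block * reps
    # short prefix plus a repeated filler: 123aaaa...
    for n in range(1, MAX_LEN - len("123") + 1):
        yield "123" + "a" * n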


Finally, to test the abilities of the tools used, I'd try to get some
"real" password hashes from systems with Chinese or Japanese users,
for "simple" passwords (basic Chinese or Japanese words), to test how
well the tools can handle unusual code pages, like non-Unicode
double-byte code pages.
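
The point is that the same word hashes differently depending on the
legacy encoding the system stored it in, so candidates have to be tried
per code page; a sketch with raw MD5 as a stand-in format:

import hashlib

def hashes_per_encoding(word, encodings=("gbk", "shift_jis", "utf-8")):
    result = {}
    for enc in encodings:
        try:
            result[enc] = hashlib.md5(word.encode(enc)).hexdigest()
        except UnicodeEncodeError:
            pass   # word not representable in this code page
    return result

print(hashes_per_encoding("密码"))   # "password" in Chinese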


You could also have a password file with hashes for admins and normal
users, and give more points for admin passwords found.

Or, you could provide a password history, used in some systems (like
SAP) to prevent password reuse.
In this case, you could provide about 10 passwords per user, and make
sure that knowledge of a few passwords cracked helps cracking the other
passwords of the same user.
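
One hypothetical way to build such a history, where cracking a single
entry gives away the other nine (the base word and the mutation rule are
made up):

def password_history(base_word, entries=10):
    # e.g. "winter01" ... "winter10": crack one, derive the rest
    return ["%s%02d" % (base_word, i) for i in range(1, entries + 1)]

print(password_history("winter"))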


So many choices for how to design a contest, and I didn't even start
mentioning password patterns.


Frank
