Date: Sun, 10 Jul 2016 00:51:41 +0300
From: Solar Designer <solar@...nwall.com>
To: john-users@...ts.openwall.com
Subject: Re: Loading a large password hash file

On Sun, Jul 10, 2016 at 12:46:14AM +0300, Solar Designer wrote:
> perl -e 'use Digest::SHA1 qw(sha1_hex); for ($i = 0; $i < 200000000; $i++) { print sha1_hex($i), "\n"; }'
> 
> which is 8200000000 bytes.  On a machine with enough RAM, JtR loaded it
> in 6 minutes, and the running "john" process uses 13 GB.
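
For scale: each of those lines is 40 hex digits plus a newline, i.e. 41
bytes, and 200000000 * 41 = 8200000000.  To reproduce, redirect the
one-liner's output to a file and point john at it, along these lines
(the file name is a placeholder, and Raw-SHA1 is what the jumbo
versions call unsalted SHA-1):

./john --format=Raw-SHA1 hashes.txt
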
[...]
> There's also the --save-memory option, which may actually speed things
> up when you don't have enough RAM.  But that's sub-optimal, and high
> memory saving levels may hurt cracking speed a lot.  They also hurt
> loading time when there would have been enough RAM to load the hashes
> without memory saving.  I've just tried --save-memory=2 on the 200M
> SHA-1 file, and it looks like it'll load in about 1 hour (instead of
> 6 minutes), consuming something like 11 GB.  So probably not worth it in
> this case.
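
(For reference, that test is invoked along these lines, with the same
placeholder file name as above:

./john --save-memory=2 --format=Raw-SHA1 hashes.txt

Level 2 is in the middle of the 1 to 3 range that --save-memory
accepts.)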

My --save-memory=2 test completed loading in about 40 minutes, with the
"john" process using a little over 9 GB.  So in this case it's roughly
a 7x increase in loading time (40 vs. 6 minutes) to save about 30% of
memory (9 GB vs. 13 GB).  Cracking is also about 5x slower.  That's
usually not worth it, but if you had, say, 12 GB RAM and didn't want to
split the input file in two, it could help.
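
Splitting is easy with standard tools; e.g., with GNU coreutils (the
line count and file names here are just an example):

split -l 100000000 hashes.txt hashes.part-

and then you'd run john on each of the two resulting parts separately.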

Alexander
