Message-Id: <689D352A-AC71-4D20-92CF-3CB459A2FFCD@thireus.com>
Date: Tue, 11 Sep 2012 01:12:12 +0200
From: "Thireus (thireus.com)" <contact@...reus.com>
To: john-users@...ts.openwall.com
Cc: magnum <john.magnum@...hmail.com>,
 Jeroen <jeroen@...ronken.nl>
Subject: Re: MPI with incremental mode working fine?

Thank you, magnum, for this relevant information!

I did what you suggested and generated 1500 passwords (I'm running MacOS, and 1500 seems to be the limit when using your command :-/ it hangs somewhere in the perl script, I guess...).
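
For the record, here is roughly how I built and ran the test, adapted from magnum's pipeline quoted at the bottom of this mail. The only change is the head count lowered to 1500, and on MacOS shuf may have to come from GNU coreutils (installed as gshuf), so treat this as a sketch rather than my exact command line:

thireus$ ../run/john -stdout -inc | head -1500 | shuf | ../run/pass_gen.pl rawmd5 > testfile
thireus$ mpiexec -np 8 ../run/john -inc -ses:test -pot:test.pot -fo:raw-md5 testfile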

thireus$ mpiexec -np 8 ../run/john -status:test
  0: guesses: 147 time: 0:00:00:04 c/s: 9194M
  1: guesses: 424 time: 0:00:00:04 c/s: 7147M
  2: guesses: 330 time: 0:00:00:04 c/s: 7721M
  3: guesses: 209 time: 0:00:00:04 c/s: 8650M
  4: guesses: 274 time: 0:00:00:04 c/s: 8303M
  5: guesses: 2 time: 0:00:00:04 c/s: 9907M
  6: guesses: 62 time: 0:00:00:04 c/s: 9556M
  7: guesses: 52 time: 0:00:00:04 c/s: 9658M
SUM: guesses: 1500 time: 0:00:00:04 c/s: 70139M avg 8767M

So as you can see, MPI is running well here. I still don't understand why I get duplicates (passwords generated twice) when restoring my old session :-/
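
To quantify those duplicates I simply look for hashes that were written to the pot file more than once, something like this (a rough check with standard sort/uniq; it assumes the hash field itself contains no colon, and john.pot stands for whatever pot file the real session uses):

thireus$ cut -d: -f1 john.pot | sort | uniq -d | wc -l            # how many hashes appear more than once
thireus$ cut -d: -f1 john.pot | sort | uniq -cd | sort -rn | head  # the most duplicated entries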

Here is some other useful information about my case; this is my reply to a user having the same issue:

> Hi Jeroen,
> 
> That's the point, I do not know what is wrong :(. And I do not understand the numbers that are saved in the session files.
> 
> I think I'll have to debug it myself to understand what is wrong here. Let's hope someone can provide relevant answers, as I don't have the time to dig into the john source code.
> 
> I have launched several sessions in the past using the same session name, but never using MPI. As far as I can tell, MPI sessions are saved separately under the name "session.X.rec", so my guess was that previous non-MPI sessions could interfere with MPI sessions... but that should not normally happen :-/.
> Also, my server crashed several times during the cracking process, so I had to restore the crashed session every time this happened; that could have contributed to this potential corruption.
> 
> Do you have exactly the same issue?
> (I also sent another email in which I give an example of the issue)
> 
> Regards,
> Thireus (contact@...reus.com), 
> IT Security and Telecommunication Engineering Student at ENSEIRB-MATMECA & Master 2 CSI University of Bordeaux 1 (Bordeaux, France).
> http://blog.thireus.com
> 
> On 10 Sep 2012, at 23:32, Jeroen wrote:
> 
>> Hi Thireus,
>> 
>> How do you know that the saved sessions are corrupted? I might have the same
>> problem and want to look into it. Thanks for your help!
>> 
>> Cheers,
>> 
>> Jeroen

Thanks for your help. I now think the issue comes from the saved session files. One of them might be corrupted, I guess... do you think that is possible?
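
Before restoring again I will at least sanity-check the per-node session files along these lines (session stands for my real session name, and I am only assuming that every rank from 0 to 7 should have its own recent, non-empty .rec file whose first lines are plain text):

thireus$ ls -l session.rec session.*.rec    # one recovery file per MPI rank, with fresh timestamps
thireus$ head -3 session.1.rec              # a truncated or empty file would be obvious here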

Regards,
Thireus (contact@...reus.com), 
IT Security and Telecommunication Engineering Student at ENSEIRB-MATMECA & Master 2 CSI University of Bordeaux 1 (Bordeaux, France).
http://blog.thireus.com

On 11 Sep 2012, at 00:15, magnum <john.magnum@...hmail.com> wrote:
>>> On 2012-09-10 22:38, Thireus (thireus.com) wrote:
>>> > I come back with an example that just happened right now with
>>> > incremental and MPI x8.
>>> > 
>>> > ...
>>> > 
>>> > So the first two passwords were found with an interval of 0.5 seconds.
>>> > The third one came after about 20 seconds. The next two, which are
>>> > similar to the first two, came after 1 minute... then nothing
>>> > else... john.pot only gets filled 5 minutes later.
>>> > 
>>> > If MPI is not working with incremental mode, why is the second
>>> > alphonce2012 missing?? Umh... unless I'm missing the point and
>>> > john.pot is not the place where the MPI processes share their work
>>> > to avoid duplicate passwords popping up...
>>> 
>>> MPI is supported, known good and produces no dupes. When there are
>>> changes, I usually test it like this:
>>> 
>>> $ ../run/john -stdout -inc | head -10000 | shuf | ../run/pass_gen.pl
>>> rawmd5 > testfile
>>> $ mpiexec -np 8 ../run/john -inc -ses:test -pot:test.pot -fo:raw-md5
>>> testfile
>>> 
>>> (After it stops finding stuff, do a "pkill -USR1 mpiexec" and then
>>> ctrl-c the session).
>>> 
>>> This cracks all 10,000 passwords with no duplicates. I just tried this
>>> now, no problems here.
>>> 
>>> The most common mistake people make is that they do not actually run an
>>> MPI build when they think they do, but in that case all cracked passwords
>>> would be duped by all nodes, and that does not seem to be the case here.
>>> Unless *some* of your nodes are running a non-MPI binary (provided you
>>> actually run MPI across several different machines).
>>> 
>>> Not sure what else could be going on. Check the log file!
>>> 
>>> The output of "mpiexec -np 8 ../run/john -status" might give a clue. For
>>> my test session above, it looks like this:
>>> 
>>> $ mpiexec -np 8 ../run/john -status:test
>>>   0: guesses: 74 time: 0:00:00:06 0.00% c/s: 2984M
>>>   1: guesses: 276 time: 0:00:00:06 0.00% c/s: 2541M
>>>   3: guesses: 83 time: 0:00:00:06 0.00% c/s: 3104M
>>>   4: guesses: 118 time: 0:00:00:06 0.00% c/s: 2997M
>>>   5: guesses: 172 time: 0:00:00:06 0.00% c/s: 2831M
>>>   2: guesses: 434 time: 0:00:00:06 0.00% c/s: 2377M
>>>   7: guesses: 246 time: 0:00:00:06 0.00% c/s: 2670M
>>>   6: guesses: 97 time: 0:00:00:06 0.00% c/s: 2914M
>>> SUM: guesses: 1500 time: 0:00:00:06 0.00% c/s: 22422M avg 2802M
>>> 
>>> BTW you could change "Save = 600" in john.conf to something lower; I use 60.
>>> 
>>> magnum
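
Regarding the Save interval magnum mentions just above: if I read my john.conf correctly, this is the relevant snippet (the [Options] section is where I found it in my copy, so double-check yours):

[Options]
# Crash recovery file saving delay in seconds (default is 600)
Save = 60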

