Message-ID: <20130725013654.GA10917@openwall.com>
Date: Thu, 25 Jul 2013 05:36:54 +0400
From: Solar Designer <solar@...nwall.com>
To: john-dev@...ts.openwall.com
Subject: Re: Parallella: bcrypt

Hi Katja,

On Wed, Jul 24, 2013 at 11:42:53PM +0200, Katja Malvoni wrote:
> I made use of dual-issue; the speed I'm getting is 976 c/s when compiling
> the Epiphany code with -O2. If I compile with -O3, I get 979 c/s.

This is nice.  This is for 1 instance of bcrypt per core per invocation,
right?  I mean that there's no interleaving yet.

Can you try interleaving two instances, perhaps with C code initially?
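
Roughly, something along these lines, as an untested sketch with made-up
names (BF_ctx, ctx0/ctx1 and the field layout here are placeholders, not
the actual BF_std data structures).  The point is that the two Blowfish
dependency chains are independent, so the dual-issue core can overlap
their latencies:

typedef unsigned int BF_word;

typedef struct {
	BF_word S[4][256];	/* S-boxes */
	BF_word P[18];		/* subkeys */
} BF_ctx;

/* Blowfish F(): two S-box lookups added, one XORed in, one more added */
#define BF_F(c, x) \
	((((c)->S[0][(x) >> 24] + (c)->S[1][((x) >> 16) & 0xFF]) ^ \
	(c)->S[2][((x) >> 8) & 0xFF]) + (c)->S[3][(x) & 0xFF])

static void BF_encrypt_2x(BF_ctx *ctx0, BF_word *LR0,
    BF_ctx *ctx1, BF_word *LR1)
{
	BF_word L0 = LR0[0], R0 = LR0[1];
	BF_word L1 = LR1[0], R1 = LR1[1];
	int i;

	L0 ^= ctx0->P[0];
	L1 ^= ctx1->P[0];
	for (i = 1; i <= 16; i += 2) {
		/* alternate the two independent instances */
		R0 ^= BF_F(ctx0, L0) ^ ctx0->P[i];
		R1 ^= BF_F(ctx1, L1) ^ ctx1->P[i];
		L0 ^= BF_F(ctx0, R0) ^ ctx0->P[i + 1];
		L1 ^= BF_F(ctx1, R1) ^ ctx1->P[i + 1];
	}
	/* final swap and P[17] whitening, as in standard Blowfish */
	LR0[0] = R0 ^ ctx0->P[17]; LR0[1] = L0;
	LR1[0] = R1 ^ ctx1->P[17]; LR1[1] = L1;
}

If gcc's scheduler doesn't pair the two chains up on its own, manually
interleaving the loads and ALU ops may be needed, but C first.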

> Code is in https://github.com/kmalvoni/JohnTheRipper/tree/master

I took a look, and besides the pieces of inline asm I was surprised to
notice something unrelated: you seem to have inconsistent BF_binary
sizes between the Epiphany and host sides.  I thought you had addressed
that already?  Maybe you forgot to commit?  Also, your host-side code
only checks 32 bits of the computed hash value, whereas you could check
64 bits just as easily (so you should).
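
For instance (untested; "computed" is a made-up name for wherever you
keep the per-candidate output words, and the actual array layout in
your tree may differ):

typedef unsigned int BF_word;

/* hypothetical per-candidate output; 6 words = full bcrypt output */
extern BF_word computed[][6];

static int cmp_one(void *binary, int index)
{
	BF_word *b = (BF_word *)binary;

	/* compare two 32-bit words (64 bits) instead of just the first */
	return computed[index][0] == b[0] && computed[index][1] == b[1];
}

Of course, the binary size in the format's params and the host/Epiphany
transfer size need to agree with whatever width you compare.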

> I tried to crack BF_std.in; it cracked all the passwords, but I had to
> start it twice. The first time, it got stuck somewhere, and when it
> didn't start cracking I aborted it. I found that a bit strange; I'll
> look into it in more detail. It's not the first time that it got stuck,
> but previously that happened with assembly code that produced incorrect
> results (failed self-test on get_hash[0](0)), so I didn't pay attention
> to it.

Actually, when I briefly tested your older code revision a week ago or
so, it similarly got stuck somewhere before visibly starting to crack,
but I could not reproduce this (the next time it just worked).

So not a new issue, it seems.

Alexander
