<jfhbrook>
yeah like it's not as easy to copy-paste but
<jfhbrook>
the sparkline thing is really handy
|jemc| has quit [Ping timeout: 260 seconds]
* pikajude
starts the process of upgrading work's node app to 4.x
<pikajude>
one of our tests fails with a "syntax error"
<pikajude>
turns out it's hitting a case in one of our dependencies that uses eval.
<ljharb>
burn that dep with fire
<jfhbrook>
oh sick
<jfhbrook>
yeah how hard would it be to rip that shit out
<jfhbrook>
seriously, when's the last time *anyone* used straight eval?
<ljharb>
i've used Function eval a few times
<ljharb>
but a) that's for detecting syntax support, and b) i'm a bad person for doing it
<ljharb>
(is-arrow-function and is-generator-function on npm)
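(A minimal sketch of that Function-constructor trick, not the actual is-arrow-function / is-generator-function source: hand the engine some source text and see whether it parses.)

```js
// Detect syntax support by handing source text to the Function constructor.
// If the engine can't parse it, Function throws a SyntaxError at definition
// time; the resulting function is never called, so nothing executes.
function supportsSyntax(src) {
  try {
    Function(src); // parse only
    return true;
  } catch (e) {
    return false;
  }
}

console.log(supportsSyntax('return () => 0;'));         // arrow functions?
console.log(supportsSyntax('return function* () {};')); // generators?
```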
<pikajude>
it's strong-agent
<pikajude>
we're tightly bound to the strong- ecosystem
<pikajude>
because god hates us
gq has joined #elliottcable
<jfhbrook>
oh
<jfhbrook>
that makes sense
<jfhbrook>
strong?
<jfhbrook>
what the fuck is that
<ljharb>
like strongloop?
<pikajude>
yeah
<jfhbrook>
can you submit a patch?
<jfhbrook>
or similar?
<pikajude>
uhh
<pikajude>
no, it's not OSS
<ljharb>
using a non-OSS ecosystem, eesh
<pikajude>
welcome to hell
<jfhbrook>
you have *access* to the source though
<jfhbrook>
can you float a patch?
<pikajude>
sure
<jfhbrook>
I don't actually know what the tooling for that looks like
<jfhbrook>
but like, fuck
<jfhbrook>
why are you using strongshit anyway?
<pikajude>
idk
<pikajude>
it won the trial
<pikajude>
now i'm waiting 30 minutes for the testsuite to fail
<pikajude>
i generally find that javascript-based testing frameworks tend to forget that the callbacks exist
<pikajude>
so you have to wait for the timeout
<pikajude>
but upgrades are always painful
<pikajude>
npm is now npmjs.com instead of npmjs.org. that's odd
<ljharb>
they're a COMpany now
<pikajude>
that's a shame
alexgordon has quit [Quit: My MacBook Pro has gone to sleep. ZZZzzz…]
<ljharb>
why is it a shame that now they won't fail and disappear into obscurity?
<pikajude>
oh, i read their blurb
<pikajude>
i respect their mission
<pikajude>
dedicated to the long-term success of the node.js and npm projects
<pikajude>
i was going to make some long-winded moral argument against monetary support of the continued existence of node.js, but i'm really just annoyed i'm professionally obligated to keep using it
<pikajude>
and making it go away wouldn't *really* fix anything
<ljharb>
on the contrary, i'd think
Hrorek has joined #elliottcable
Rurik has quit [Ping timeout: 246 seconds]
eligrey has quit [Quit: Leaving]
Hrorek has quit [Ping timeout: 260 seconds]
Hrorek has joined #elliottcable
alexgordon has joined #elliottcable
<alexgordon>
rurik... rarik?: ping
<Hrorek>
alexgordon, yes
<alexgordon>
Hrorek?
<alexgordon>
this is confusing
<Hrorek>
yes
Hrorek is now known as Rurik
Rurik has quit [Changing host]
Rurik has joined #elliottcable
creationix_ is now known as creationix
<Rurik>
alexgordon, pong
<alexgordon>
lolol
<purr>
lolol
<alexgordon>
Rurik: so I have figured out 70% of the issues with this thing
<alexgordon>
e.g. faulting: when you have an object that represents something in the database, but that data hasn't been fetched yet, what happens when someone tries to access a field on it?
<alexgordon>
Core Data has this crazy faulting mechanism, which causes all sorts of problems
<Rurik>
Core Data is that iOS thing, right?
<alexgordon>
yeah iOS and Mac
<alexgordon>
but the same goes for django's ORM and rails too I guess
<alexgordon>
but alternatively, you don't want to make every call asynchronous!
<alexgordon>
so what I decided to do is just throw an exception in that case, and force people to fetch explicitly
<alexgordon>
which is a much cleaner solution, because now only the fetch call has to be asynchronous
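(A rough sketch of that design, using a hypothetical Record class rather than any real ORM's API: field access on an unfetched object throws, and fetch() is the only asynchronous call.)

```js
// Hypothetical record wrapper: accessing a field before the data is loaded
// throws instead of silently "faulting"; only fetch() is asynchronous.
class Record {
  constructor(id, db) {
    this.id = id;
    this._db = db;     // assumed to expose an async load(id) -> plain object
    this._data = null; // not fetched yet (a "fault", in Core Data terms)
  }

  get(field) {
    if (this._data === null) {
      throw new Error('Record ' + this.id + ' not fetched; call fetch() first');
    }
    return this._data[field];
  }

  async fetch() {
    this._data = await this._db.load(this.id); // the one async call
    return this;
  }
}

// usage, inside an async function:
//   const user = await new Record(42, db).fetch();
//   user.get('name');               // fine
//   new Record(43, db).get('name'); // throws, instead of a lazy fault
```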
<pikajude>
hey all, i asked this yesterday but i don't think anybody saw it
<pikajude>
how safe do you think it would be to store my hashed (pbkdf2) password in a text file in a public repo?
<alexgordon>
pikajude: not safe
<pikajude>
damn
<alexgordon>
pikajude: but it depends on the password
<alexgordon>
if the password is randomly generated with enough bits of entropy, then it's ok
<alexgordon>
but if it's low-entropy, then it could be cracked
<pikajude>
even if one attempt takes 500ms?
alexgordon has quit [Ping timeout: 276 seconds]
eligrey has joined #elliottcable
alexgordon has joined #elliottcable
<alexgordon>
pikajude: yes because it can be parallelized
<pikajude>
oh
<pikajude>
i wonder how many boxes someone is willing to throw at the effort to crack my website
<alexgordon>
probably not many :P
<pikajude>
say i have a 24-character random password
<alexgordon>
then it's fine, nobody can crack that
<pikajude>
and it gets hashed 131072 times
<alexgordon>
doesn't matter
<pikajude>
or something
<alexgordon>
24-character random is fine
<pikajude>
2^17 is the repetition factor afaict
<alexgordon>
what matters is the strength of the password
<alexgordon>
if it's beyond ~14 characters
<alexgordon>
and randomly generated, then it's not going to be cracked by brute force
<alexgordon>
because log(62^14) / log(10) = ~10^25
<alexgordon>
erm, = ~25
<alexgordon>
so about 10^12 guesses
<pikajude>
unless they get it right the first time
<alexgordon>
lol
<purr>
lol
<pikajude>
but they can't know the length of the password based on the hash
<pikajude>
i don't know how these things normally go
<alexgordon>
that kind of odds is like.. winning the lottery every day or something
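(The back-of-the-envelope version of that entropy argument; the charset sizes are assumptions, 62 for alphanumerics and 95 for printable ASCII.)

```js
// A random password of `length` characters from a `charset`-sized alphabet
// has charset^length possibilities, i.e. length * log2(charset) bits.
function entropyBits(length, charset) {
  return length * Math.log2(charset);
}

console.log(entropyBits(14, 62)); // ~83 bits  (62^14 ~= 10^25, as above)
console.log(entropyBits(24, 62)); // ~143 bits (far beyond any brute force)
console.log(entropyBits(9, 95));  // ~59 bits  (the 9-char printable-ASCII case below)
```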
<pikajude>
but brute-forcing pbkdf2 for even a 9 character password
<pikajude>
that's 500ms per guess
<pikajude>
and there are like 40,000 unicode characters
<pikajude>
which gives us 2.6e41 possibilities
<alexgordon>
well if it's _unicode_
<pikajude>
ok, ascii then
<pikajude>
26 * 2 + numbers and symbols
<alexgordon>
pikajude: how the maths works is that increasing the length makes much more difference than increasing the set of characters
<alexgordon>
because it's exponential in the length
<pikajude>
how do people normally guess these, anyway?
<pikajude>
do they just start with 1 character and move up from there?
<alexgordon>
get a list of the most common passwords :P
<alexgordon>
123456
<alexgordon>
password
<pikajude>
i'm just trying to get an idea of how long it would actually take to guess a 9 character ASCII password that doesn't contain common words
<pikajude>
if a single guess takes 500ms
<pikajude>
and it can't be rainbow tables'd because of the salt
<alexgordon>
pikajude: with symbols or without?
<pikajude>
with symbols
<pikajude>
so that's like what, 1e16 possibilities
<alexgordon>
95 printable characters
<alexgordon>
95 ^ 9
<pikajude>
so 6.3024941e17
<pikajude>
500ms per
<pikajude>
3.15124705e17 seconds
<pikajude>
divide by how many cores and machines you have to guess with
<pikajude>
is that *genuinely* feasible
<alexgordon>
pikajude: you only have to search half the space
<pikajude>
why
<alexgordon>
birthday paradox
<pikajude>
oh, ok
<alexgordon>
if you search half the space the probability of collision is 50% I think
<pikajude>
so, roughly 1.1 times the age of the earth
<pikajude>
with one machine
<alexgordon>
log(95 ^ 9) / log(10) / 2 = 8.9
<pikajude>
10 million years with a network of 500 cores
<alexgordon>
call it 10^9 attempts
<pikajude>
what is log(10) for
<alexgordon>
work out the exponent
<alexgordon>
log(95^9) / log(10) is 18, which means 10^18
<alexgordon>
i.e. 95^9 = ~10^18
<pikajude>
so you said you only have to search half the space
<pikajude>
how does that correspond to taking the square root of 10^18
<pikajude>
rather than dividing by 2
<pikajude>
sorry, local idiot here
<alexgordon>
well I mean half the exponent. I may be wrong in my remembering of it though
<alexgordon>
pikajude: "The "birthday paradox" places an upper bound on collision resistance: if a hash function produces N bits of output, an attacker who computes only 2^N/2 hash operations on random input is likely to find two matching outputs. If there is an easier method than this brute-force attack, it is typically considered a flaw in the hash function.
<pikajude>
hey ljharb, do you know what the intent of "Buffer.concat(some_array)" is in node?
<pikajude>
because in 0.10 it returns some_array[0] and in 4 it throws an error
<pikajude>
is it just some_array.join()?
<jfhbrook>
nah, it's supposed to take 2 buffers and give you a new buffer with them concatted together
<pikajude>
i see
<jfhbrook>
like the intent is that you do Buffer.concat(someBuffer, someOtherBuffer) I think?
<pikajude>
well, some genius is using it apparently as a substitute for join
<jfhbrook>
oh no, I'm lying
<jfhbrook>
I'm lying hard core
<jfhbrook>
see this is what happens when I try to intelligently talk about something I never use
<jfhbrook>
it *looks* like you're supposed to do Buffer.concat(arrayOfBuffers, someExpectedLength)
<jfhbrook>
with someExpectedLength being optional, but faster if you have it
<jfhbrook>
is your array of not-so-buffers?
<pikajude>
it's an array of strings
<pikajude>
what's funny is that even in 0.10 it won't work unless the array only has one item
<pikajude>
so someone hasn't tested their shit at all
<jfhbrook>
yeah, post-0.10 added a number of new asserts
<pikajude>
asserts: when you want incorrect code to crash as late as possible
<pikajude>
nah jk
<ljharb>
pikajude: in older node it's a node-specific implementation
<ljharb>
in later nodes it's based off of ArrayBuffer
<pikajude>
ok
<ljharb>
so there def might be changes there
<ljharb>
but also, using buffer for a join is just wacky
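(For reference, the signature jfhbrook landed on matches the documented Node API: Buffer.concat(list[, totalLength]) takes an array of Buffers. For an array of strings, a plain join does the job.)

```js
// Buffer.concat wants an array of Buffers; passing the total length is
// optional but saves Node from summing the pieces itself.
var bufs = [new Buffer('foo'), new Buffer('bar')]; // Buffer.from(...) on newer Node
console.log(Buffer.concat(bufs, 6).toString());    // 'foobar'

// An array of strings just wants Array#join:
console.log(['foo', 'bar'].join(''));              // 'foobar'
```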