cfbolz changed the topic of #pypy to: PyPy, the flexible snake (IRC logs: https://quodlibet.duckdns.org/irc/pypy/latest.log.html#irc-end ) | use cffi for calling C | if a pep adds a mere 25-30 [C-API] functions or so, it's a drop in the ocean (cough) - Armin
<lesshaste>
cfbolz, do you mean this line "sys.setrecursionlimit(n) sets the limit only approximately, by setting the usable stack space to n * 768 bytes. On Linux, depending on the compiler settings, the default of 768KB is enough for about 1400 calls."
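The quoted passage boils down to simple arithmetic: on PyPy the usable stack is roughly the recursion limit times 768 bytes. A minimal sketch of that calculation (figures are approximate and depend on compiler settings, per the docs):

    # Rough arithmetic from the documentation passage quoted above: on PyPy,
    # sys.setrecursionlimit(n) reserves roughly n * 768 bytes of usable stack.
    for limit in (1000, 10**4, 10**6):
        approx_bytes = limit * 768
        print("setrecursionlimit(%d) -> roughly %d KiB of usable stack"
              % (limit, approx_bytes // 1024))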
agates[m] has joined #pypy
edd[m] has joined #pypy
Rhy0lite has joined #pypy
<mattip>
tumbleweed: could you file an issue so we can track it?
<tumbleweed>
yeah, sure
<mattip>
i just now sent a mail with the release candidates and hashes
<lesshaste>
one suggestion is to document that exceeding the recursion depth is undefined behaviour
<lesshaste>
or something like that :)
<mattip>
it seems segfaulting is not very user friendly, but not sure what else we can do.
<lesshaste>
mattip, is it clear why pypy segfaults with sys.setrecursionlimit(10**4) (note the 4) but cpython doesn't with sys.setrecursionlimit(10**6) (note the 6)?
<lesshaste>
and/or why pypy 3.7 doesn't segfault at all but pypy 3.6 does?
<mattip>
with large enough values, everything would crash, right?
<lesshaste>
mattip, cpython doesn't crash I think. At least what I see is a maximum recursion depth error
<lesshaste>
maybe I should say I haven't managed to make it segfault yet
<mattip>
an error would be friendlier, agreed
<mattip>
it seems like a corner case, what happens if you do not call setrecursionlimit?
<lesshaste>
mattip, then you get the error RuntimeError: maximum recursion depth exceeded
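A minimal sketch of that default behaviour: with the limit left alone, runaway recursion raises an exception instead of crashing (on Python 3 the exception is RecursionError, a subclass of RuntimeError):

    import sys

    def recurse(n):
        return recurse(n + 1)

    try:
        recurse(0)
    except RuntimeError as exc:      # RecursionError on Python 3
        print("caught:", exc)        # message: maximum recursion depth exceeded
        print("limit was:", sys.getrecursionlimit())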
<lesshaste>
I am not clear why it is segfaulting at all in fact
<atomizer>
lesshaste, just my opinion of the API: hitting OS/environment stack limits should probably prompt a rethink of the approach
<lesshaste>
atomizer, got you
<lesshaste>
atomizer, I don't think it's clear why it was segfaulting in the first place
<lesshaste>
but yes, I should rewrite the code in an iterative style
<lesshaste>
which would solve all of this
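A hedged sketch of what such an iterative rewrite can look like, using a hypothetical tree walk (the node type and its `children` attribute are assumptions, not code from the discussion): the recursion is replaced by an explicit list used as a stack, so the depth is bounded by heap memory rather than the C stack.

    def visit_recursive(node, handle):
        # depth limited by the recursion limit / C stack
        handle(node)
        for child in node.children:
            visit_recursive(child, handle)

    def visit_iterative(root, handle):
        # same preorder visit, depth limited only by available memory
        stack = [root]
        while stack:
            node = stack.pop()
            handle(node)
            stack.extend(reversed(node.children))  # reversed keeps the original order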
<cfbolz>
lesshaste: it's clear why you segfault
<cfbolz>
because you blow the C stack
antocuni has joined #pypy
<Dejan>
but why does it not segfault with CPython and recent PyPy?
<Dejan>
lesshaste, simply get PyPy 3.6 rc1
<Dejan>
it works as expected
<atomizer>
just because stars aligned. increase depth and it should
<kenaan>
cfbolz release-pypy2.7-v7.x 6fe1105c1e17 /pypy/module/_pypyjson/: corner case in the json decoder: like regular object maps, don't make the json maps arbitrarily huge ...
<kenaan>
arigo release-pypy2.7-v7.x 6a59a88053cc /rpython/translator/revdb/src-revdb/revdb_include.h: Issue #3084 Fix compilation error when building revdb (grafted from cd96ab5b8d1e4364105cb4a3c21a31b5d...
<kenaan>
cfbolz release-pypy3.6-v7.x bd5961409c41 /pypy/module/_pypyjson/: corner case in the json decoder: like regular object maps, don't make the json maps arbitrarily huge ...
<kenaan>
arigo release-pypy3.6-v7.x 7fbabb23dfec /rpython/translator/revdb/src-revdb/revdb_include.h: Issue #3084 Fix compilation error when building revdb (grafted from cd96ab5b8d1e4364105cb4a3c21a31b5d...
xcm has quit [Remote host closed the connection]
xcm has joined #pypy
<cfbolz>
atomizer: exactly
<Dejan>
nobody talked about increasing the depth
<Dejan>
point is that with the depth lesshaste used in his code it should not segfault
<Dejan>
i tried the code on 4 different machines, and it behaves the same - CPython 3.6 works, rc1 PyPy 3.6 works, latest PyPy 3 stable segfaults
<Dejan>
i can't align the stars on all of them
<Dejan>
I wish I had such powers
<Dejan>
good thing is that it got fixed somehow
<lesshaste>
cfbolz, shouldn't it report a maximum recursion depth error rather than segfault?
<cfbolz>
lesshaste: no, you disabled that by setting setrecursionlimit too high
<cfbolz>
That's what the recursion limit is for
<lesshaste>
cfbolz, I see what you mean. In that case the question is, why is the C stack so small in pypy compared to cpython
<lesshaste>
is/was
<cfbolz>
The C stack is the same. But pypy uses more of it per python function call
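One way to see the shared OS-level limit is to ask the operating system directly; this sketch assumes Linux/Unix and uses the standard-library resource module. The number is the same under CPython and PyPy; what differs is how many Python-level frames fit into it.

    import resource

    soft, hard = resource.getrlimit(resource.RLIMIT_STACK)
    for name, value in (("soft", soft), ("hard", hard)):
        if value == resource.RLIM_INFINITY:
            print(name, "stack limit: unlimited")
        else:
            print(name, "stack limit:", value // 1024, "KiB")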
<lesshaste>
cfbolz, ah I see. Is there a reason it has to be a constant size? That is, why can't it be dynamically resized until you run out of RAM?
<lesshaste>
or even have a size which is a function of the currently set recursion limit?
<cfbolz>
You could achieve that (stackless python did) but it's slower
<Dejan>
lesshaste, btw did you see my msg above that it works with PyPy 3.6 rc1 ?
<lesshaste>
Dejan, yes, thanks
<lesshaste>
!
<lesshaste>
that ! is meant to be attached to the thanks
<Dejan>
okidoki
<lesshaste>
in any case.. someone not on IRC currently fixed it
<lesshaste>
I wonder if it's easy to see the patch that did that
jvesely has quit [Quit: jvesely]
<cfbolz>
lesshaste: no, it's not a 'fix'. It's a random rejiggling that happens to make it work. It can't be really fixed easily. If you write different code that sets the recursion limit differently and recurses you can still segfault, both on PyPy and CPython
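An illustrative (and deliberately dangerous) sketch of that point: set the limit far beyond what the C stack can hold and recurse, and the guard is gone on CPython too, so the process may die with a segfault instead of a clean exception. Not something to run in a session you care about.

    import sys

    sys.setrecursionlimit(10**6)   # far more frames than the default C stack holds

    def recurse(n):
        return recurse(n + 1)

    recurse(0)                     # may crash the interpreter outright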
<lesshaste>
cfbolz, ok. The main issue seems to be the size of the stack relative to the extra stack space pypy uses per python function call. A simple fix would seem to be to enlarge the stack by the per-call overhead that pypy adds.
<lesshaste>
i.e. make the pypy stack larger than the cpython stack to compensate
<cfbolz>
No, there are situations where CPython uses more stack. Do you then want pypy to compensate in the other direction?
<lesshaste>
cfbolz, ah.. that sounds messy indeed.
<lesshaste>
cfbolz, one last dim question... why isn't accessing something outside the stack detected as a run-time error rather than a segfault?
<cfbolz>
That's how c works
<cfbolz>
The segfault *is* the way that the problem is detected
<lesshaste>
cfbolz, hmm.. aren't there wrappers one can use to make functions behave better? Like out of bounds checkers
<cfbolz>
Yes, but they cost performance
<cfbolz>
Anyway, that wrapper is the recursion limit
<cfbolz>
Which you disabled
<lesshaste>
cfbolz, the recursion limit is only loosely connected to the stack size I thought
<cfbolz>
So?
<lesshaste>
the user has no way to know what recursion limit to set if they don't want the system to segfault
<lesshaste>
user/coder
<cfbolz>
lesshaste: sure! Leave the default value
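If a particular algorithm genuinely needs a deeper limit, one common compromise is to raise it only around that call and restore the old value afterwards. The helper below is a hypothetical sketch, not an API offered by CPython or PyPy:

    import sys
    from contextlib import contextmanager

    @contextmanager
    def recursion_limit(n):
        old = sys.getrecursionlimit()
        sys.setrecursionlimit(n)
        try:
            yield
        finally:
            sys.setrecursionlimit(old)

    # usage (hypothetical function name):
    # with recursion_limit(5000):
    #     result = some_deeply_recursive_function(data)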
<lesshaste>
cfbolz, :)
<cfbolz>
Yes, I agree, it's very much a 'use at your own risk' api
<lesshaste>
it is a shame as python is very much a teaching language
<lesshaste>
and recursion is very much a tool used in teaching CS
<lesshaste>
but it seems a mess in python in general
<cfbolz>
Yep
<lesshaste>
and I have a friend who keeps on telling me to move to Julia :)