antocuni changed the topic of #pypy to: PyPy, the flexible snake (IRC logs: https://botbot.me/freenode/pypy/ ) | use cffi for calling C | "PyPy: the Gradual Reduction of Magic (tm)"
danieljabailey has quit [Quit: ZNC 1.6.4+deb1 - http://znc.in]
<arigato>
simpson: yes, you need FinalizerQueue. be aware that you should only enqueue a smallish number of objects
<arigato>
otherwise, performance will suffer
<pjenvey>
unless you can get away with it being a @rgc.must_be_light_finalizer
<simpson>
Huh, okay.
<arigato>
right, but it looks a bit unlikely here, as @rgc.must_be_light_finalizer is meant for finalizers that don't really do anything apart from calling a simple C freeing function
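A minimal sketch of the kind of light finalizer arigato means, i.e. one that only frees raw (non-GC) memory; the class and field names are invented for illustration:

    from rpython.rlib import rgc
    from rpython.rtyper.lltypesystem import lltype, rffi

    class RawBuffer(object):
        def __init__(self, size):
            # raw, GC-invisible memory that must be freed explicitly
            self.ptr = lltype.malloc(rffi.CCHARP.TO, size, flavor='raw')

        @rgc.must_be_light_finalizer
        def __del__(self):
            # nothing but a simple free: no GC objects are touched and
            # nothing is kept alive, so the GC can run this cheaply
            lltype.free(self.ptr, flavor='raw')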
<arigato>
and well, "smallish" number of objects in quote, because any normal Python program full of app-level __del__() still mostly works fine
<Alex_Gaynor>
Can you have a light finalizer that does the cheap check "was this deferred handled properly?", and in the common case does nothing, but in the bad case uses a finalizer queue to handle something?
<pjenvey>
you can deregister from the finalizer queue up front if you know it doesn't need to do anything
<pjenvey>
generator does that
<arigato>
Alex_Gaynor: light finalizers cannot cause the object to be kept alive for longer
<Alex_Gaynor>
arigato: ahh, too bad
<arigato>
mostly, you could set a flag in some global variable, like "my_glob.oops = True", and then check it later
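A tiny sketch of that global-flag trick (all names invented): a light finalizer may not store GC pointers or keep the dying object alive, but flipping a boolean on a prebuilt global and checking it later from normal code is fine:

    from rpython.rlib import rgc

    class Globals(object):
        def __init__(self):
            self.oops = False
    my_glob = Globals()

    class Deferred(object):
        def __init__(self):
            self.handled = False

        @rgc.must_be_light_finalizer
        def __del__(self):
            if not self.handled:
                my_glob.oops = True   # writes a bool, not a GC pointer

    def check_for_leaks():
        # called later from ordinary code, e.g. at shutdown or in tests
        if my_glob.oops:
            print "some deferred was dropped without being handled"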
<simpson>
Could I walk a graph, dump the information into some global spot, and then let the object die?
<arigato>
simpson: yes, as long as you write no GC pointer anywhere
<arigato>
think about using a C function to dump the info into some memory that it would malloc() itself
<simpson>
Hmmmm. Yeah, okay, so that's a little bit of a problem. I wonder if there's a better way to identify when this particular sort of object is unreachable; really, what I care about is whether or not it'll resolve, which supposedly I could be tracking better.
<arigato>
unsure what you mean by "it'll resolve"
<simpson>
Sorry, um. Okay, so these are promises which either succeed or fail, but sometimes nobody resolves them to either success or failure. In Twisted, they have finalizers to print a debug message if an unresolved Deferred is GC'd.
<simpson>
So I was thinking that I could do something on GC too.
<arigato>
ah ok
<arigato>
that looks like what pjenvey said, then
<arigato>
pypy's generators register a (regular) finalizer in a FinalizerQueue, but when a generator finishes, it unregisters
<arigato>
the common case is that generators are finished before being GC'ed
<simpson>
Hm, interesting.
<arigato>
this has much less of the performance problem
<arigato>
you can register the object only when needed, and unregister it at some point (and then you can't re-register it, but that's probably not needed)
<simpson>
We split the promise into two pieces, the promise and also a *resolver*, and really what I care about is a resolver getting GC'd before it's called. So I could register the resolver on creation, and unregister it after it's been used.
<arigato>
yes
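A rough sketch of that register-on-creation / unregister-on-resolve pattern using rgc.FinalizerQueue, loosely modelled on what was said above about pypy's generators; the Resolver class and the reporting hook are invented, and the exact deregistration hint pypy's generators use is only described in a comment, not shown:

    from rpython.rlib import rgc

    class Resolver(object):
        def __init__(self):
            self.resolved = False

    def handle_leaked_resolver(r):
        # hypothetical reporting hook; a real one might print a traceback
        # captured at creation time, as Twisted does for Deferreds
        print "a resolver was garbage-collected without being resolved"

    class ResolverQueue(rgc.FinalizerQueue):
        Class = Resolver

        def finalizer_trigger(self):
            # called by the GC some time after registered objects die
            while True:
                r = self.next_dead()
                if r is None:
                    break
                if not r.resolved:
                    handle_leaked_resolver(r)

    resolver_queue = ResolverQueue()

    def make_resolver():
        r = Resolver()
        resolver_queue.register_finalizer(r)   # register on creation
        return r

    def resolve(r):
        r.resolved = True
        # pypy's generators additionally hint the GC at this point that
        # the finalizer can be skipped; that call is omitted here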
<simpson>
ndash: Wondering whether we should write an RPython-level generic promise module now.
forgottenone has quit [Quit: Konversation terminated!]
forgottenone has joined #pypy
forgottenone has quit [Client Quit]
forgottenone has joined #pypy
zmt00 has quit [Quit: Leaving]
tbodt has joined #pypy
tbodt has quit [Ping timeout: 240 seconds]
adamholmberg has joined #pypy
adamholmberg has quit [Ping timeout: 260 seconds]
inhahe has joined #pypy
inhahe_ has quit [Ping timeout: 248 seconds]
gutworth_ has quit [Ping timeout: 248 seconds]
kipras`away has quit [Ping timeout: 240 seconds]
kipras`away has joined #pypy
kipras`away is now known as kipras
gutworth has joined #pypy
jimbaker has quit [*.net *.split]
Garen has quit [*.net *.split]
__main__ has quit [*.net *.split]
redj has quit [*.net *.split]
carljm has quit [*.net *.split]
dpn` has quit [*.net *.split]
john51 has quit [*.net *.split]
[0__0] has quit [*.net *.split]
syntaxman has quit [*.net *.split]
holdsworth_ has quit [*.net *.split]
john51 has joined #pypy
danieljabailey has quit [Ping timeout: 260 seconds]
danieljabailey has joined #pypy
<runciter>
simpson: hey i've got a weird idea
<runciter>
simpson: would it be possible to write an emulator in rpython?
adamholmberg has joined #pypy
<runciter>
simpson: i have been poking around m68k emulators and some of them have JITs, and it occurs to me that an emulator is an interpreter for machine code
<runciter>
simpson: and it'd be cool to have an automatically generated JIT
<runciter>
simpson: do you think such a thing would be possible, or even close to a good idea?
adamholmberg has quit [Ping timeout: 268 seconds]
<simpson>
runciter: There's a toy Game Boy emulator already.
<runciter>
simpson: for real?
<runciter>
ok then
jimbaker has joined #pypy
Garen has joined #pypy
syntaxman has joined #pypy
__main__ has joined #pypy
dpn` has joined #pypy
carljm has joined #pypy
holdsworth_ has joined #pypy
redj has joined #pypy
[0__0] has joined #pypy
__main__ has quit [Max SendQ exceeded]
holdsworth_ has quit [Max SendQ exceeded]
<runciter>
simpson: ty :)
john51 has quit [Read error: Connection reset by peer]
holdsworth has joined #pypy
Alex_Gaynor has quit [*.net *.split]
Alex_Gaynor has joined #pypy
john51 has joined #pypy
__main__ has joined #pypy
pulkitg has quit [Ping timeout: 240 seconds]
bendlas has quit [Ping timeout: 255 seconds]
abvi[m]1 has quit [Ping timeout: 276 seconds]
yuvipanda has quit [Ping timeout: 240 seconds]
<cfbolz>
runciter: there's an arm and a mips emulator
<cfbolz>
The arm one is good enough to run pypy-for-arm
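A toy sketch of what runciter is asking about: an RPython interpreter loop for a made-up one-character "machine code", with the JitDriver hints from which the translator generates a tracing JIT. The instruction set is invented for illustration; real emulators like the Game Boy/ARM/MIPS ones mentioned above are of course far more involved:

    from rpython.rlib.jit import JitDriver

    # greens identify a position in the emulated program,
    # reds are the mutable state of the emulated machine
    driver = JitDriver(greens=['pc', 'program'], reds=['acc'])

    def run(program, acc):
        # 'd' decrements the accumulator, 'j' jumps back to the start
        # while the accumulator is positive, anything else halts
        pc = 0
        while pc < len(program):
            driver.jit_merge_point(pc=pc, program=program, acc=acc)
            op = program[pc]
            if op == 'd':
                acc -= 1
                pc += 1
            elif op == 'j':
                if acc > 0:
                    pc = 0
                    driver.can_enter_jit(pc=pc, program=program, acc=acc)
                else:
                    pc += 1
            else:
                break
        return acc

    # untranslated this is just slow Python, e.g. run("dj", 1000) == 0;
    # translated with the JIT enabled, the loop gets traced and compiled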
drolando has quit [Remote host closed the connection]
adamholmberg has joined #pypy
drolando has joined #pypy
marky1991 has joined #pypy
TheAdversary has quit [Ping timeout: 248 seconds]
<kenaan>
rlamy default e68c2a6d0069 /testrunner/get_info.py: Add testrunner/get_info.py script for the buildbot
cstratak has quit [Quit: Leaving]
<antocuni>
may I complain that the rvmprof tests are a tangled mess of copy&pasted code? :(
<antocuni>
looking at hg log, it seems that they started small, then everyone expanded them with slight variations, but nobody ever did a proper refactor
<antocuni>
arigato, fijal, plan_rich_: I'm complaining to you ^^^ :-P
* antocuni
refactors
mattip has joined #pypy
<mattip>
antocuni: thanks for taking this up, it is lacking a clear path forward
cstratak has joined #pypy
cstratak has quit [Remote host closed the connection]
cstratak has joined #pypy
<antocuni>
mattip: we did a "cleanup" sprint once, years ago. Maybe we should do a "refactor" one eventually :)
<antocuni>
in the "cleanup" sprint you were allowed only to remove code, not to add new one :)
<mattip>
wow, that would be tough
* ronan
likes the idea
<mattip>
ronan: perhaps add an argparse interface to testrunner/get_info.py, like python-config, so you could do get_info.py --exe or get_info.py --binpath ?
marr has joined #pypy
<mattip>
ronan: it would save having to parse JSON in the buildbot step
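A hedged sketch of the python-config-style interface mattip is suggesting for testrunner/get_info.py; the option names and the generate_config() stand-in (including its keys and paths) are assumptions for illustration, not the actual script:

    import argparse
    import json

    def generate_config():
        # stand-in for whatever get_info.py currently collects and dumps
        return {'exe': 'pypy/goal/pypy-c', 'binpath': 'pypy/goal'}

    def main():
        parser = argparse.ArgumentParser(prog='get_info.py')
        parser.add_argument('--exe', action='store_true',
                            help='print only the path of the built executable')
        parser.add_argument('--binpath', action='store_true',
                            help='print only the directory holding the executable')
        args = parser.parse_args()
        info = generate_config()
        if args.exe:
            print(info['exe'])
        elif args.binpath:
            print(info['binpath'])
        else:
            print(json.dumps(info))   # current behaviour: dump everything

    if __name__ == '__main__':
        main()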
<fijal>
ronan: what's the plan with assert_?
oberstet has quit [Ping timeout: 248 seconds]
<fijal>
I must say I'm very skeptical about changing all the tests
<ronan>
mattip: parsing json actually seems easier than importing properties one by one
<ronan>
(well, we only need one, ATM, but ...)
<ronan>
fijal: there aren't that many tests to change
<antocuni>
yes, I also don't like the assert_() thing
<fijal>
ok, so maybe start with explaining what it does?
<ronan>
well, it's the only way I've found to switch to assert rewriting
<ronan>
assert reinterpretation is both deprecated and severely broken
<antocuni>
what is assert rewriting vs reinterpretation?
<ronan>
antocuni: it's what happens when an assert fails in pytest
<ronan>
with reinterpretation, you run the assert statement again to display the failure message
<fijal>
what's wrong with that?
<ronan>
things can fail in ways that are unrelated to the test
<ronan>
I've lost many hours to this over the years
zmt00 has joined #pypy
<antocuni>
so, assert rewriting is better; then I suppose that the problem is that it rewrites the asserts in a way which is not RPython, and thus the translator complains?
<ronan>
antocuni: yes, exactly
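For readers without the branch at hand, a hedged sketch of what an assert_() helper of the kind being discussed could look like; the real one on ronan's branch may well do more (nicer messages, rich comparisons):

    def assert_(condition, msg=""):
        # a plain function call: pytest's AST rewriting never touches it,
        # and the annotator only ever sees ordinary RPython code
        if not condition:
            raise AssertionError(msg)

    # inside an RPython test function one would then write
    #     assert_(x == 42, "x should be 42")
    # instead of a bare `assert x == 42`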
<antocuni>
can't we tell pytest to disable assert rewriting only for some specific functions?
<fijal>
ronan: why would you lose hours?
<fijal>
ronan: if you have side effects you immediately try not to and boom, no?
<ronan>
fijal: you have to know that there are side effects and remember that reinterpretation triggers them
<fijal>
yeah
<antocuni>
well, I agree with ronan that assert rewriting is just better
<antocuni>
but still, I'd like to find a way to keep using plain asserts in my tests
marky1991 has quit [Ping timeout: 250 seconds]
<bremner>
I guess it's too late to tell pytest not to use assert ;)
<ronan>
antocuni: we can't easily disable rewriting, because it's done at the AST level
<antocuni>
two ideas: 1) tell pytest not to rewrite some functions; 2) write our own decorator which uses ast to rewrite plain asserts into calls to assert_()
<kenaan>
antocuni vmprof-0.4.10 6847b345ac78 /rpython/: WIP: rvmprof tests are 90% a copy of one another, but they are a tangled mess. Start to refactor into ...
<kenaan>
antocuni vmprof-0.4.10 5f1804f818b4 /rpython/rlib/rvmprof/test/test_rvmprof.py: one more refactor
<ronan>
antocuni: I'm not sure that adding a decorator is less intrusive than changing assert foo to assert_(foo)
<antocuni>
well, I was thinking of automatically applying the decorator, e.g. when we call "self.interpret(fn, ...)"
<ronan>
wouldn't work, fn would have already been rewritten by then
<antocuni>
ah, I see; they are rewritten at import time
<ronan>
rewriting happens at import-time
<lapinot>
hi folks
<ronan>
hi
<antocuni>
ronan: looking at pytest's code, if you have PYTEST_DONT_REWRITE in the module docstring, you disable rewriting for the entire module
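Concretely, that means putting the marker in a test module's docstring; the module contents below are invented:

    """Tests that must stay plain RPython.  PYTEST_DONT_REWRITE"""
    # with this marker in the module docstring, pytest leaves every
    # assert in this file un-rewritten

    def test_addition():
        assert 2 + 2 == 4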
<antocuni>
alternative idea: looking at the bytecode of the rewritten functions, I see that they call stuff from a global variable called "@pytest_ar"
<antocuni>
we could hijack it and put in another global whose methods (_call_reprcompare & co.) are RPython-friendly
<antocuni>
this can be done transparently by self.interpret&co
oberstet has joined #pypy
<lapinot>
i've been following the pypy/rpython project for a long time, but there are still some dark spots in my mental picture of the whole beast: during the warmup phase, in classical jit compilers, the user code gets interpreted (by the compiled interpreter), but with the pypy toolchain the interpreter is one level higher, so i'm not sure what happens. Is it that, during jit generation, every operation done by the
<lapinot>
interpreter is interleaved with some tracing code (so that "on the same level" there is some code doing the work, and just after it some code adding to the trace that we did this work)? Or is the interpreter compiled to an IR (the one we see in the traces) that gets interpreted by some "hard-coded" interpreter built into the pypy toolchain?
<cfbolz>
lapinot: in addition to being traceable, the interpreter is also just compiled to C
<cfbolz>
so during warmup, this C version runs
<lapinot>
okay, so it's the first of my 2 options
<simpson>
lapinot: The "interleaving" is of a different structure than you supposed, though. The translation process creates several versions of the interpreter, and the system can switch between an interpreter which only executes, and an interpreter which traces as it executes.
<lapinot>
oh and i always forget: the jit-generation (tracing and other stuff) is done by an rpython library, not during the actual compilation of the rpython to C, right?
<cfbolz>
lapinot: the jit is both: it's an rpython library of optimizers, backends and the tracer, and a rewriter that inserts the correct calls to that library into the final C executable
<lapinot>
simpson: oh now i remember that (there's the one that interprets and counts, the one that traces and the one that executes compiled code)
<lapinot>
cfbolz: ok, makes sense
lritter has joined #pypy
<ronan>
antocuni: poking into the internals of pytest is a maintenance headache, and rewritten asserts would still be different from regular RPython asserts
antocuni has quit [Ping timeout: 268 seconds]
<cfbolz>
ronan: can we somehow cleanly fail in the annotator if we detect a rewritten assert? I am a bit worried about third-party rpython projects
<fijal>
I don't like the rewriting at all
<fijal>
tbh
<cfbolz>
in any case, if somebody accidentally uses a rewriting pytest, an understandable error would be something we can all agree on, no?
<fijal>
yep
<ronan>
cfbolz: we already have a message that works in "most" cases
<cfbolz>
ah, cool
<ronan>
(though ATM it fails to explain that you also need to use an obsolete version of pytest)
<fijal>
ronan: IMO side effects are better known and less harmful, maybe because of years of fighting with stuff being accidentally not RPython, because reasons
<cfbolz>
fijal: sorry, you are ranting, basically ;-). it's not like ronan is to blame for the current situation
<fijal>
No, but I don't want to have a random change that also makes all the merges a complete nightmare
<fijal>
Also I'm not going to discuss it any more today
tbodt has joined #pypy
tbodt has quit [Client Quit]
tbodt has joined #pypy
tbodt has quit [Client Quit]
<cfbolz>
yes, I think this should be on pypy-dev
<cfbolz>
also, since it affects every rpython project
drolando has quit [Ping timeout: 240 seconds]
<ronan>
fijal: merges don't seem to be a problem, I forgot about the branch for a year and didn't have a single conflict
drolando has joined #pypy
<ronan>
TBH, the real reason I've been working on this is that I don't want to maintain an obsolete fork of pytest forever