cfbolz changed the topic of #pypy to: PyPy, the flexible snake (IRC logs: https://quodlibet.duckdns.org/irc/pypy/latest.log.html#irc-end ) | use cffi for calling C | if a pep adds a mere 25-30 [C-API] functions or so, it's a drop in the ocean (cough) - Armin
omry has joined #pypy
<omry>
Hi all, I have a library (OmegaConf) and I was curious to see if the tests would run faster on PyPy. To my surprise, they ran twice as slowly. I am a PyPy noob (this is my first time trying it), so I might have made some silly mistake.
<omry>
I created a conda environment with pypy36 : Python 3.6.9 (?, Apr 10 2020, 19:47:05)
<simpson>
Tests tend to be a worst-case situation, because so little code is run multiple times, and PyPy's speed comes from optimizing frequently-run sections of code.
<omry>
Python 3.7: 1539 passed in 3.22s, PyPy: 1539 passed in 7.20s
<omry>
this is a small library; the code being tested is about 2000 lines (and you can see that there are over 1500 tests).
<omry>
so the same code is being executed many times.
<omry>
although not in a dense manner.
<omry>
(and above it's Python 3.6, not 3.7 - typo).
gavinc has joined #pypy
gavinc has left #pypy [#pypy]
<ronan>
omry: one way to think about it is that pypy can only optimise loops. You'd get much better performance if you were running the same test 1500 times than with 1500 different tests
<ronan>
your timings seem fairly normal to me
<ronan>
to actually check whether pypy improves performance, you need proper benchmarks
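(Editor's note: a minimal sketch of the measurement ronan is describing, using only the standard library. The `parse` function and iteration counts are invented for illustration; on CPython both timings scale roughly linearly with call count, while on PyPy the hot loop lets the JIT warm up and amortize compilation, which one-shot test runs never do.)

```python
import timeit

def parse(s):
    # stand-in for a small unit of library work, like one OmegaConf test
    return {k: int(v) for k, v in (pair.split("=") for pair in s.split(","))}

# "1500 different tests": each code path runs once, so a JIT never warms up
one_shot = timeit.timeit(lambda: parse("a=1,b=2,c=3"), number=1)

# "the same test 1500 times": a hot loop, which is what PyPy's JIT optimizes
hot_loop = timeit.timeit(lambda: parse("a=1,b=2,c=3"), number=1500)

print("one-shot: %.6fs, hot loop: %.6fs" % (one_shot, hot_loop))
```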
<antocuni>
again, anyone who wants to step in is welcome, as I don't really know what to answer :)
<cfbolz>
antocuni: not a bad start, they said it themselves: "I think it needs, in some way, to retain the "segments" of the ouroboros"
BPL has quit [Quit: Leaving]
<arigo>
rjarry: passing None as a pointer should not be allowed at all
<arigo>
nor as an integer, or probably as anything
YannickJadoul has joined #pypy
lritter has quit [Ping timeout: 240 seconds]
<rjarry>
arigo: I figured as much, but wanted to make sure
Kronuz has quit [Ping timeout: 272 seconds]
Kronuz has joined #pypy
lritter has joined #pypy
Ai9zO5AP has quit [Ping timeout: 240 seconds]
jcea has joined #pypy
dddddd has joined #pypy
<arigo>
hum, to call C from C#, there is a rich set of pointer types and operators in C# (using the 'unsafe' keyword) that allow basically any kind of C-like operation
<arigo>
but to actually call C, you generally don't use that (!), you use some different special support to declare your external C functions
<arigo>
I'm thinking about how easy it would be to make a cffi extension, where you write the ffibuilder in a .py file like always, but simply run a different tool on it to produce a piece of C#, which you can then use together with the existing rich pointer operators
<arigo>
no, I meant I have a use case not involving python at all, only C# and C
<arigo>
it would still be useful if I could run a python script that inputs a cffi binding for the C code and outputs a C# wrapper
<arigo>
(or, of course, for any other language, but I just found out that in C# there is already all the stuff about manipulating C-like types)
<antocuni>
arigo: basically, you would like to use the ffibuilder as a general way to describe the C API of a given library
<antocuni>
(in an ideal world, the .h file would be enough)
<arigo>
well, I already have that with cffi :-) but yes, the point is to generate the bindings for other languages than python
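(Editor's note: a toy illustration of the idea being discussed — it is not cffi's real parser or any existing tool. It takes a C prototype of the kind one writes in `ffibuilder.cdef()` and emits the corresponding C# `DllImport` stub, i.e. the "special support to declare your external C functions" arigo mentioned. The regex and type map are invented for the sketch and handle only trivial declarations.)

```python
import re

# very rough C -> C# type map for the sketch (real cffi knows far more types)
TYPE_MAP = {"int": "int", "double": "double", "char *": "string", "void": "void"}

DECL_RE = re.compile(r"\s*([A-Za-z_][\w\s\*]*?)\s*([A-Za-z_]\w*)\s*\(([^)]*)\)\s*;")

def to_csharp(cdef_line, libname):
    """Translate one simple C prototype into a C# P/Invoke declaration."""
    m = DECL_RE.match(cdef_line)
    ret, name, args = m.group(1).strip(), m.group(2), m.group(3)
    cs_args = []
    for i, arg in enumerate(a for a in args.split(",") if a.strip()):
        parts = arg.strip().rsplit(" ", 1)
        ctype, argname = (parts[0], parts[1]) if len(parts) == 2 else (parts[0], "a%d" % i)
        cs_args.append("%s %s" % (TYPE_MAP[ctype.strip()], argname))
    return ('[DllImport("%s")]\npublic static extern %s %s(%s);'
            % (libname, TYPE_MAP[ret], name, ", ".join(cs_args)))

print(to_csharp("int add(int x, int y);", "example"))
```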
<ronan>
arigo: be careful, you'll end up trying to reinvent SWIG ;-)
<arigo>
the plan is not to change anything at all in the ffibuilder :-)
<arigo>
if that's not possible, then the plan fails
<antocuni>
arigo: luckily, you called it "cffi" without any "py" in it. The name is generic enough to be an FFI for C for all languages :)
<arigo>
:-)
<arigo>
actually, I can think about a python script that takes as input an already-compiled python module produced by cffi, and outputs the C# interface for it
<antocuni>
because the compiled module contains all the needed metadata?
<arigo>
yes, because they are needed for normal python usage
<antocuni>
what about generating rffi-like bindings from cffi? :)
<arigo>
doable too :-)
<arigo>
well, or not really
<arigo>
or yes, but you'd need to run cffi and then this script to get the rffi source code that corresponds to the current machine
<arigo>
yes, maybe that's even a solution
<arigo>
e.g. the value of C constants is captured by gcc when building the cffi-generated module
<antocuni>
why do you need machine-specific rffi code? Can't you generate machine-independent code like the one we write manually?
<arigo>
no, this you can't do from the compiled cffi-generated module, because it is specific to an architecture
<antocuni>
right, but you could generate it from ffibuilder?
<arigo>
not in all cases: in general, a file containing ffibuilder may contain "if" statements to define some things or not
<antocuni>
ah, true
<arigo>
remember that right now we need to run gcc too from rffi. it would be the same from this point of view
<antocuni>
yes, but I am imagining a world in which you run gcc explicitly in a sort of "configure" step, instead of running it implicitly at import time
<arigo>
right, and this change would make that easier, also because we could design it to support that from day one
<antocuni>
I mean, we can probably tweak rffi to do that if we try hard enough, but while we are at it, it would be nice to use cffi
<arigo>
exactly
<antocuni>
does it mean that we will be able to use cffi to run the C code in the tests, without having to pass through ll2ctypes?
xcm has quit [Remote host closed the connection]
<arigo>
ah, maybe?
<arigo>
though there would be confusion between lltypes and cffi types
<antocuni>
I have a vague feeling that this looks like a brilliant idea on the surface, and then there will be tons of corner cases which will bite us
<arigo>
yes, I fear it would end up with an almost-as-bad ll2cffi
xcm has joined #pypy
<antocuni>
arigo: note that there are places in which we are already using a cffi-like way to express lltype types, e.g. in cpyext.ctypespace (which is used by hpy as well)
<antocuni>
being able to express lltype types in a cffi-like way is probably a good idea on its own
Ai9zO5AP has quit [Remote host closed the connection]
Ai9zO5AP has joined #pypy
jcea has quit [Quit: jcea]
<antocuni>
arigo: what are the tricky problems with ll2ctypes, fwiw? I mean, why is it not so simple as "take this C function and call it" or "take this python function and make a C callback out of it"?
<antocuni>
it's a genuine question; I can imagine some of the problems, but I might be missing some tricky ones
<arigo>
the tricky one is converting lltype stuff to real C data
<arigo>
via auto-generated ctypes.Structure and so on
<arigo>
and then hacking the lltype data so that they can be convinced that their real data should now be read out of these ctypes.Structure
<arigo>
in case it's accessed by Python afterwards
<antocuni>
right. What about re-implementing lltype.Struct in a way which is backed by REAL C data? (possibly through cffi, again)
<antocuni>
I think we would miss some niceties, such as errors if you try to access uninitialized fields, but it might be possible to hack around it
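(Editor's note: the "errors on uninitialized fields" nicety could in principle be approximated in plain Python; here is a toy sketch. The class and exception names are invented and this is not actual lltype/RPython code — it only mimics the behaviour of complaining when a declared field is read before being written.)

```python
class UninitializedField(AttributeError):
    pass

class CheckedStruct:
    """Toy struct: reading a declared-but-unset field raises,
    roughly mimicking lltype's uninitialized-field errors."""
    _fields = ("x", "y")

    def __getattr__(self, name):
        # only called when normal attribute lookup fails,
        # i.e. when the field was never assigned
        if name in self._fields:
            raise UninitializedField("field %r was never initialized" % name)
        raise AttributeError(name)

s = CheckedStruct()
s.x = 42
print(s.x)  # the initialized field reads back normally
try:
    s.y     # declared but never written
except UninitializedField as e:
    print("caught:", e)
```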
<arigo>
no, the problem is that you still need symbolic names sometimes, e.g. the name of the constants (not just their value)
<arigo>
in other words, we sometimes need to store something more than just the C bits (say, during translation), but at other times (during tests) we really need the C bits
<antocuni>
wait, this is another problem, but not directly related to lltype.Struct vs ctype.Structure, isn't it?
<antocuni>
ah no
<antocuni>
I understand
<antocuni>
you have this problem for prebuilt structs
<arigo>
yes, which is what lltype.Struct is used for during translation
<antocuni>
yes, I understand now
<antocuni>
from some point of view, I would probably like to refactor RPython completely and make it a more "standard" language, in which you don't have prebuilts but you have startup code which initializes things
<antocuni>
as everybody else in the world :)
<arigo>
:-)
<antocuni>
fwiw, now hg claims that I have multiple heads on default :(
xcm has quit [Ping timeout: 240 seconds]
xcm has joined #pypy
<YannickJadoul>
antocuni: Ronan merged my problematic MR yesterday
<YannickJadoul>
Is that commit still the offending one?
<antocuni>
I think it's something else now, although it doesn't tell me which revision is that
<ronan>
hmm, it might be because I pushed a new topic
<YannickJadoul>
I had a weird error when pushing yesterday, but updating hg and evolve fixed that
* antocuni
tries to update hg
<antocuni>
same problem; hg 5.3.2, hg-evolve 9.3.1
<YannickJadoul>
Yeah, same :-(
Ai9zO5AP has quit [Ping timeout: 240 seconds]
Ai9zO5AP has joined #pypy
sknebel has quit [Ping timeout: 256 seconds]
sknebel has joined #pypy
lritter has quit [Ping timeout: 256 seconds]
xcm has quit [Remote host closed the connection]
xcm has joined #pypy
andi- has quit [Ping timeout: 256 seconds]
andi- has joined #pypy
YannickJadoul has quit [Quit: Leaving]
andi- has quit [Quit: WeeChat 2.8]
oberstet has quit [Quit: Leaving]
andi- has joined #pypy
andi- has quit [Excess Flood]
andi- has joined #pypy
jcea has joined #pypy
altendky has quit [Quit: Connection closed for inactivity]