cfbolz changed the topic of #pypy to: PyPy, the flexible snake (IRC logs: https://quodlibet.duckdns.org/irc/pypy/latest.log.html#irc-end ) | use cffi for calling C | if a pep adds a mere 25-30 [C-API] functions or so, it's a drop in the ocean (cough) - Armin
dddddd has quit [Remote host closed the connection]
antocuni has quit [Ping timeout: 268 seconds]
xcm has quit [Read error: Connection reset by peer]
<arigato>
Alex_Gaynor: help
<arigato>
:-)
xcm has joined #pypy
<arigato>
I just released cffi 1.13 in source code
<arigato>
and immediately afterwards I noticed that ci.cryptography.io no longer exists
antocuni has joined #pypy
<arigato>
as usual, it means that cffi is now 1.13 on pypi but without any binary, which means probably that random pip users right now are getting failures
<arigato>
pypi got a new website design that looks cool but doesn't add anything like "please hide my 1.13 for a moment"
mhroncok has joined #pypy
mhroncok has left #pypy [#pypy]
<arigato>
does anyone know if cryptography moved their wheel builder somewhere else?
<arigato>
azure pipelines don't support mercurial at all, right?
<mattip>
I *think* you can connect azure pipelines to gitlab, but definitely not to bitbucket
ionelmc has joined #pypy
antocuni has quit [Ping timeout: 268 seconds]
<arigato>
mattip: thanks!
<Dejan>
Would be nice to enable the 3.7 builds
<Dejan>
and add aarch64 3.7 builds too, if possible
<arigato>
I have no idea if azure pipelines supports that
Ai9zO5AP has joined #pypy
<arigato>
if someone has got an example of a public Python-with-C-extension module whose release is driven by azure pipelines, that would be tremendously helpful
<mattip>
do you mean uploading it to PyPI?
<mattip>
I think most projects do that via twine by hand after creating and testing the wheel
<arigato>
which I cannot do because I don't have a Mac
<mattip>
the strategy would be to build the wheels and store the artifacts somewhere, then download them and push to PyPI
<mattip>
so the question is "how to store an artifact when running azure pipelines"
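A minimal sketch of that build, check, upload flow, assuming the usual setuptools layout and that `wheel` and `twine` are installed; in practice the wheels come from CI and only the upload is done by hand, as mattip describes:

```python
# Hypothetical driver for the manual release flow: build a wheel,
# sanity-check it, then push it to PyPI with twine.
import glob
import subprocess

subprocess.run(["python", "setup.py", "bdist_wheel"], check=True)
wheels = glob.glob("dist/*.whl")
subprocess.run(["twine", "check"] + wheels, check=True)   # validate metadata
# ... download/test the CI-built wheels here ...
subprocess.run(["twine", "upload"] + wheels, check=True)  # push to PyPI
```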
<Ashleee>
no need to ask anything, already found in history :D `as usual, it means that cffi is now 1.13 on pypi but without any binary, which means probably that random pip users right now are getting failures`
<mattip>
that ^^ is with a chroot on /extra1/xenial64 on bencher4
<arigato>
mattip: thanks for the link, I can now download the builds
rjarry has left #pypy ["Leaving"]
<mattip>
cool
<arigato>
painfully so far (need to click around for each single build)
<arigato>
ah, found out how to make a single downloadable file containing all builds
<arigato>
HTTPError: 400 Client Error: Binary wheel 'cffi-1.13.0-cp27-cp27m-linux_x86_64.whl' has an unsupported platform tag 'linux_x86_64'. for url: https://upload.pypi.org/legacy/
rjarry has joined #pypy
<rjarry>
guys, is it possible to define dynamic cffi callbacks using the new style?
<rjarry>
I know I can use @ffi.def_extern() but that limits the callback itself to only one instance
<rjarry>
I have a lib that takes a function pointer as argument and uses it on certain events
<rjarry>
It can store multiple instances of these functions and call them according to certain conditions
<rjarry>
having only 1 possible callback is not practical
<mattip>
arigato: do you have an auditwheel stage to make the wheels manylinux1 compliant?
<rjarry>
I only found 2 solutions: either use the old (not recommended) style, or maybe handle the "routing" to the correct python callback in a single @def_extern one
<rjarry>
I don't like either of these :(
<arigato>
mattip: no---I'm just reading about that now
<mattip>
heh
<arigato>
mattip: I won't even try to actively complain about WHY ON EARTH it's not the default given that it's what PyPI supports
<arigato>
where does "auditwheel" come from?
<mattip>
yeah, I was trying to see what cryptography does too
<arigato>
right, "pip install auditwheel" is what I'm looking for, and I won't try to understand why that line is not present in cryptography's example
<arigato>
...OK, no
<arigato>
I can't run auditwheel on python 2.x
<arigato>
WHAT A MESS
<mattip>
I am pretty sure you can run it on python3 against any wheel
<arigato>
so I need to first install it in another independent venv, cool
<arigato>
and probably pip install --user because otherwise there's an error
<arigato>
"sometimes"
<mattip>
well, 3.4 is officially EOL, 3.5 is dying out soon too
<arigato>
just saying
<arigato>
I'm still not out of this infinite yak
<mattip>
but yeah, it is a bit silly that there is no good tutorial
<arigato>
that's not the most silly thing in my opinion
<mattip>
maybe it is an elegant plot to get people to avoid c-extension packages
<arigato>
if only
<arigato>
"apt-get install patchelf" fails because I'm not root
<arigato>
I don't even know why I'm trying to run it on the Azure machine, when I could download the non-fixed versions and patch them locally
<mattip>
sorry, I didn't realize you were doing this on azure. Yes, that is a much more reasonable route
<arigato>
__main__.py: error: cannot repair "dist/cffi-1.13.0-cp27-cp27mu-linux_x86_64.whl" to "manylinux1_x86_64" ABI because of the presence of too-recent versioned symbols. You'll need to compile the wheel on an older toolchain.
<mattip>
... which is why people use the manylinux1 docker image to make the wheel on
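For reference, a sketch of the repair step being discussed: `auditwheel repair` grafts the external shared libraries into the wheel and retags it, but it only succeeds if the wheel was built against an old enough glibc, hence the manylinux1 docker image. The wheel filename is the one from the error above; the output directory is arbitrary:

```python
# Retag a linux_x86_64 wheel as manylinux1 with auditwheel's CLI.
# Must run on a wheel built with an old toolchain (e.g. inside the
# manylinux1 docker image), or it fails exactly as shown above.
import subprocess

subprocess.run(
    ["auditwheel", "repair",
     "dist/cffi-1.13.0-cp27-cp27mu-linux_x86_64.whl",
     "--plat", "manylinux1_x86_64",
     "-w", "dist/"],
    check=True,
)
```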
<arigato>
OK, now I've got another question
<arigato>
assume I want, just once, to upload a wheel for linux and I'm on a linux machine
<arigato>
is there a way to do it, or should I just give up on the strange idea?
marky1991 has joined #pypy
<arigato>
not talking about azure or any other service, just about my own machine
ekaologik has joined #pypy
<mattip>
I cannot defend the current situation, but I think it goes like "what is linux_x86_64? is it ubuntu 18.04, centos6, alpine? which glibc does it use?"
<mattip>
so they created manylinux1, manylinux2010, and are now working on manylinux2014 tags which demand auditwheel check the symbol versions
<arigato>
let me try to make wheels for OSX and Windows, for now
<arigato>
I'm giving up linux for now
<arigato>
oh well, I'm giving up wheels completely at the moment
<mattip>
:(
<mattip>
you could just upload source tarballs
<arigato>
I'll wait for Alex_Gaynor I guess, just in case he has got a working solution already
<Alex_Gaynor>
arigato: hmm, what's the problem?
<arigato>
mattip: the source tarball is already uploaded a few hours ago, which is why it's a mess right now
<rjarry>
I'm not sure what it involves, but couldn't there be a solution that allows passing multiple "instances" of an extern "Python" function?
<arigato>
Alex_Gaynor: thanks!
<rjarry>
arigato: portability is not a real issue here, I mainly target linux; however, is there any performance difference with @def_extern functions?
<Alex_Gaynor>
arigato: the windows builds are a bit slow, and it looks like the 3.8 builders don't quite work, but otherwise should be good. I'll find the "download all" link for you when it's done
<arigato>
so "portability" includes some linux systems that are "hardened" against the kind of hacks needed for old-style callbacks
<arigato>
there's no notable performance difference as far as I know
<rjarry>
ok
<arigato>
rjarry: the problem is that C only allows taking pointers to functions written in C, of which there is only a finite number in a given executable
<arigato>
so a C type "pointer to function" cannot by itself contain also a pointer to extra data
<rjarry>
ok, and if you wish to CALL on a function pointer that is not in the executable (i.e. in non-executable memory), you need to do some hacks, understood
<arigato>
exactly
<rjarry>
thanks for the details
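A sketch of the usual workaround for rjarry's situation, assuming the C library accepts a `void *userdata` alongside the function pointer (the declarations, module name, and `register_handler` call are hypothetical): a single extern "Python" trampoline routes to any number of Python callables via `ffi.new_handle()` / `ffi.from_handle()`:

```python
# In the build script (out-of-line API mode), something like:
#   ffibuilder.cdef("""
#       typedef int (*handler_t)(int event, void *userdata);
#       void register_handler(handler_t h, void *userdata);
#       extern "Python" int my_trampoline(int event, void *userdata);
#   """)
from _example_cffi import ffi, lib   # hypothetical compiled module

_live_handles = []                   # handles must stay alive while registered

@ffi.def_extern()
def my_trampoline(event, userdata):
    callback = ffi.from_handle(userdata)   # recover the Python callable
    return callback(event)

def register(callback):
    handle = ffi.new_handle(callback)
    _live_handles.append(handle)
    lib.register_handler(lib.my_trampoline, handle)
```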
<arigato>
Alex_Gaynor: I discovered that each job can publish an artifact of the same name; then at the end the files are all in a single one (fwiw)
<Alex_Gaynor>
arigato: Meaning you were able to download them?
<Alex_Gaynor>
we should have a fix for 3.8 manylinux1 in a little bit
jvesely has joined #pypy
<arigato>
yes; though indeed I didn't find any way to download them all at once by default
<arigato>
and yes, by publishing in each job to the same artifact, it creates only one with all the files in it, and then I was able to download it with a single click
<arigato>
for today I'll go with clicking around 20 times
<Alex_Gaynor>
interesting -- I seem to recall that when we named them all with the same name that it just replaced one with the other
<Alex_Gaynor>
reaperhulk: ^ is that your recollection
<reaperhulk>
Yeah I think that's what we saw at the time, but it's possible they've changed it
<arigato>
ah. it worked now, as far as I can tell, and all five files of my test were in it (though they are created in parallel)
antocuni has joined #pypy
<arigato>
the cffi-macos artifacts also contain pycparser-2.19-xxxx.whl, just mentioning that
<Alex_Gaynor>
cool, good job us
<reaperhulk>
ugh, well we should definitely fix that.
* Alex_Gaynor
files a bug
<arigato>
cool, thank you! the wheels are now uploading to pypi
<arigato>
three panicked hours later than the source
<mattip>
well, that was certainly less frustrating to watch
<Alex_Gaynor>
In the future if you ping one of reaperhulk or me we can trigger it right after the release. (ATM we have pretty good timezone coverage since reaperhulk is in China half the time)
<antocuni>
arigato: I'm just reading the logs and it seems you solved the issue, but btw, capnpy has a travis-based CI which automatically uploads wheels to pypi when I push a git tag: https://github.com/antocuni/capnpy/blob/master/.travis.yml
<arigato>
:-)
<arigato>
Alex_Gaynor: will do. I didn't do it this time because I thought I could do it myself with the old url
<arigato>
antocuni: cool too
<antocuni>
still reading the logs, but the standard approach to build manylinux wheels is to use a docker image: this way you have a precise environment which is guaranteed to work
<jerith>
arigato and friends: Thanks for fixing this so quickly. :-)
<YannickJadoul>
arigato, Alex_Gaynor: Reading the logs, this might be the moment to advertise cibuildwheel, btw. It does all the auditwheel/delocate things for you :)
marky1991 has quit [Remote host closed the connection]
marky1991 has joined #pypy
<altendky>
YannickJadoul: yes, and the pypy-wheels (directory) said six months, didn't it?
<arigato>
YannickJadoul: if it works out of the box as advertised in "Minimal setup" instructions, I'll buy
<altendky>
https://github.com/sqlalchemy/sqlalchemy/issues/4827 `venv/bin/romp --command 'git clone https://github.com/sqlalchemy/sqlalchemy; python -m pip install cibuildwheel; python -m cibuildwheel --output-dir wheelhouse sqlalchemy' --version 3.7 --artifact-paths wheelhouse` (as an example, with another layer of utility around it to launch the 'ci' on all three os')
dddddd has joined #pypy
xcm has quit [Remote host closed the connection]
xcm has joined #pypy
marky1991 has quit [Remote host closed the connection]
marky1991 has joined #pypy
jvesely has quit [Quit: jvesely]
<YannickJadoul>
altendky: Yeah, this romp is pretty cool as well :D We should advertise this possibility better on cibuildwheel, too!
<simpson>
Hm. When I use timeit on PyPy, I am warned that I should use perf. However, $(pypy -mpip install perf) does not work, and it doesn't seem like there's such a package on PyPI. What should I do instead?
<altendky>
YannickJadoul: maybe after i get around to the docs. i went and tried to switch to using magic wormhole first, and kinda failed so far. :[ also trying to do a ci config generator now that would of course use cibuildwheel.
<YannickJadoul>
altendky: Yeah, there's always other stuff on the backlog as well. But anyway, it's still pretty cool
<YannickJadoul>
Didn't see your answer on the 6-month-old pypy-wheels directory, btw. I hadn't noticed that yet
<mattip>
Dejan: no buildbot docker available, sorry
<mattip>
simpson: what version of pypy?
<YannickJadoul>
arigato: Most of the time, more or less, yes. We just have a bit of a backlog over the last months in implementing manylinux2010 and are now waiting for more Python 3.8 support on CI services
<altendky>
YannickJadoul: i'm not sure which response you are talking about. your link in the ticket seemed to be to a newer thing than i linked. i didn't get an answer here yet confirming so i didn't follow up in the ticket.
<antocuni>
I used it to compile a 32bit version of pypy
<Dejan>
antocuni, i am probably going to borrow your code to set up buildbot on my aarch64 VPS
<antocuni>
it's very likely that you have to modify/adapt it, but it might be a good starting point
<Dejan>
ah, this is just building pypy
<Dejan>
i thought it contains buildbot buildslave :)
<antocuni>
ah no sorry, then I misunderstood what you were looking for :)
<Dejan>
mattip, buildslave must be ubuntu, right?
<mattip>
"must" is a bit strong. I tried vanilla debian and something was missing to pass all the tests, I don't remember what
<simpson>
mattip: It's an older version, I guess. altendky was right, and pyperf works great.
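For anyone hitting the same warning: the package was renamed, so it is `pypy -m pip install pyperf`, and a minimal benchmark looks like:

```python
# Minimal pyperf benchmark (pyperf is the successor of the "perf" package).
import pyperf

runner = pyperf.Runner()
runner.timeit("sum-range", stmt="sum(range(1000))")
```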
<Dejan>
it is OK, as long as I know what I need to put there
<Dejan>
I will try to make a docker container for it
<Alex_Gaynor>
YannickJadoul: good to know for the future -- for now we've already built our own azure pipelines so we're going to be lazy and not replace it
<YannickJadoul>
altendky: Just the reply on here, that I'd missed. No rush on the GitHub issue :)
<YannickJadoul>
Alex_Gaynor: Yeah, makes sense. Just thought it was interesting to mention ;)
<arigato>
of course, the three-hour window during which only cffi sources were available broke things for many people
<arigato>
I think I'll go and file it as a bug of PyPI
<Alex_Gaynor>
arigato: There's an open bug I think, asking for "partial uploads" or something like that, where you can upload it, but say "don't install by default until later"
marky1991 has quit [Remote host closed the connection]
marky1991 has joined #pypy
<YannickJadoul>
Somehow, these `--only-binary` or `--prefer-binary` flags for pip also seem like they should be used more, or maybe even be the default if the installation from source fails
<arigato>
mwaa ha haa, I hold in my hands the power to make many people angry by uploading a wheel-less version of cffi and keeping it that way for three hours
<arigato>
...ok, I'm only now noticing that there was not one issue on cffi (with many people saying "me too")... but FIVE of them
<Ashleee>
glad that our devs have taught me about constraints for pip install :P
<YannickJadoul>
It's óne way of getting an idea of the number of users of a library ;)
<Ashleee>
trying to think what is using cffi down the dependency chain in my case, but it's probably related to either memcache or flask or mysql clients :)
xcm has quit [Remote host closed the connection]
xcm has joined #pypy
<arigato>
there are a lot of packages that indirectly depend on cffi nowadays
lritter has quit [Ping timeout: 265 seconds]
oberstet has quit [Remote host closed the connection]
lesshaste has joined #pypy
<lesshaste>
hi all
xcm has quit [Remote host closed the connection]
xcm has joined #pypy
firespeaker has quit [Quit: Leaving.]
<YannickJadoul>
Is there a way of running a specific test I'm writing, but adding the `--withmod-signal` flag? I need it to import pdb (to test `breakpoint()`)
<lesshaste>
how can I differentiate a*e^(y-a*e^y)?
<antocuni>
YannickJadoul: are you talking about AppTest?
<antocuni>
look e.g. at pypy/module/struct/test/test_struct.py
<antocuni>
in particular, spaceconfig = dict(usemodules=['struct', 'array'])
<YannickJadoul>
antocuni: Yes, indeed! Thanks :)
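The pattern from test_struct.py that antocuni points at, sketched with a hypothetical test; the spaceconfig line replaces the `--withmod-signal` command-line flag:

```python
# In a PyPy AppTest, spaceconfig requests interp-level modules for the
# test's object space; class and test body are hypothetical.
class AppTestBreakpoint:
    spaceconfig = dict(usemodules=['signal'])

    def test_breakpoint_is_there(self):
        assert callable(breakpoint)
```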
<cfbolz>
lesshaste: wrong channel, maybe?
marky1991 has quit [Remote host closed the connection]
marky1991 has joined #pypy
jacob22 has quit [Ping timeout: 252 seconds]
antocuni has quit [Ping timeout: 276 seconds]
firespeaker has joined #pypy
marky1991 has quit [Ping timeout: 240 seconds]
marky1991 has joined #pypy
<lesshaste>
cfbolz, sorry!
<lesshaste>
cfbolz, I worked out the answer though :)
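(For the record, the chain rule gives, differentiating with respect to y:)

```latex
\frac{d}{dy}\!\left(a\,e^{\,y - a e^{y}}\right)
  = a\,e^{\,y - a e^{y}}\,\frac{d}{dy}\!\left(y - a e^{y}\right)
  = a\,e^{\,y - a e^{y}}\left(1 - a e^{y}\right)
```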
jvesely has joined #pypy
<arigato>
YannickJadoul: note that "import pdb" is probably overkill in an AppTest
<arigato>
you should instead write a minimal unit test
<arigato>
just checking that breakpoint() causes the underlying function to be called
<arigato>
or if breakpoint() by default causes "import pdb", then maybe set up the test with making a tiny module, inserting it manually into sys.modules['pdb'], and checking that a function in it is indeed called from breakpoint()
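A sketch of the stub-module test arigato describes, assuming the default sys.breakpointhook (i.e. PYTHONBREAKPOINT unset), which imports pdb and calls pdb.set_trace():

```python
# Insert a tiny fake module as sys.modules['pdb'] and check that
# breakpoint() ends up calling its set_trace().
import sys
import types

def test_breakpoint_calls_pdb_set_trace():
    calls = []
    fake_pdb = types.ModuleType('pdb')
    fake_pdb.set_trace = lambda *args, **kwargs: calls.append(True)
    saved = sys.modules.get('pdb')
    sys.modules['pdb'] = fake_pdb
    try:
        breakpoint()
        assert calls
    finally:
        if saved is None:
            del sys.modules['pdb']
        else:
            sys.modules['pdb'] = saved
```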
<tumbleweed>
arigato: would you like access to a machine I can reproduce that pypy issue on?
<YannickJadoul>
arigato: OK, that indeed reduced execution time of this test by half!
<cfbolz>
YannickJadoul: nice!
firespeaker has quit [Quit: Leaving.]
niso13 has joined #pypy
<niso13>
Hi, I'm trying to figure out what takes up so much memory in an object, and saw the warning on sys.getsizeof. ATM I'm using pympler and running a snippet on CPython to profile the memory; any better way to do that? (Because I assume there are differences between PyPy and CPython in memory usage.) Thank you
forgottenone has quit [Quit: Konversation terminated!]
xcm has quit [Remote host closed the connection]
xcm has joined #pypy
firespeaker has joined #pypy
YannickJadoul has quit [Quit: Leaving]
niso89 has joined #pypy
niso13 has quit [Ping timeout: 260 seconds]
<mattip>
niso13: there really is no good way to measure per-object memory usage on PyPy. You can measure global program usage.
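One coarse way to do the global measurement mattip suggests, portable across CPython and PyPy (on Linux, ru_maxrss is in KiB; the list is a stand-in for the object being investigated):

```python
# Compare process peak RSS around building the structure.
import resource

before = resource.getrusage(resource.RUSAGE_SELF).ru_maxrss
data = [str(i) for i in range(10**6)]   # stand-in for the real object
after = resource.getrusage(resource.RUSAGE_SELF).ru_maxrss
print("peak RSS grew by ~%d KiB" % (after - before))
```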
<niso89>
If I want to "dig" into a high memory issue, what would you recommend doing?
antocuni has joined #pypy
lesshaste has quit [Ping timeout: 268 seconds]
<cfbolz>
niso89: you can try to analyze a heap dump
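A sketch of that heap-dump route, per PyPy's gc docs (PyPy only; the function does not exist on CPython): dump the RPython-level heap to a file descriptor, then analyze it offline with pypy/tool/gcdump.py, the low-level output mentioned further down:

```python
import gc
import os

fd = os.open("heap.dump", os.O_WRONLY | os.O_CREAT | os.O_TRUNC, 0o644)
try:
    gc.dump_rpy_heap(fd)   # PyPy-only extension of the gc module
finally:
    os.close(fd)
# afterwards:  pypy pypy/tool/gcdump.py heap.dump
```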
firespeaker has quit [Quit: Leaving.]
<mattip>
what is the real problem? you have a long running program that continues to take up more memory as it runs?
<mattip>
you have a large data set that you do not want to copy?
<cfbolz>
Trying to find docs about it
<niso89>
I have a complicated object which takes "too much" memory and I want to try to optimize it
<niso89>
How do projects which use pypy do memory tests, to see if their memory usage became high in any specific commit?
xcm has quit [Remote host closed the connection]
xcm has joined #pypy
<marky1991>
I've had similar problems in cpython and wasn't really satisfied there either
<marky1991>
various tools can tell you that you've got 100k strings/ints/etc floating around, but figuring out where those came from isn't simple
<simpson>
niso89: Basically, yeah. Notice when memory pressure is a problem, and work to reduce the number and size of allocations. It's tough work and there are no magic tricks. The various memory debuggers on CPython are quite nice (I like Meliae still), and can guide how to do similar stuff on PyPy.
<niso89>
Yeah, but in cpython I can "struggle" with it; with tools like pympler, I can get a good estimate
<simpson>
The main difference is that tricks like __slots__ are not very helpful on PyPy.
lesshaste has joined #pypy
lesshaste has quit [Remote host closed the connection]
<niso89>
gcdump.py output seems to be low-level, it might be hard to understand where those types are defined
ekaologik has quit [Ping timeout: 268 seconds]
CrazyPython has joined #pypy
CrazyPython has quit [Read error: Connection reset by peer]
niso89 has quit [Ping timeout: 260 seconds]
ekaologik has joined #pypy
ekaologik has quit [Read error: Connection reset by peer]
<arigato>
tumbleweed: yes please :-)
firespeaker has joined #pypy
marky1991 has quit [Ping timeout: 265 seconds]
Rhy0lite has quit [Quit: Leaving]
xcm has quit [Ping timeout: 268 seconds]
xcm has joined #pypy
<tumbleweed>
arigato: ssh arigo@51.15.246.122 - it has your ssh keys from github. You have root via sudo
<tumbleweed>
and it's a throw-away VM, so do whatever you want. I was about to shut it down, this morning, when I noticed there was activity on the ticket
xcm has quit [Remote host closed the connection]
xcm has joined #pypy
<arigato>
tumbleweed: am I missing it somewhere or is there not even a pypy checkout?
<tumbleweed>
arigato: you can see builds in my homedir
<tumbleweed>
I didn't do one from hg HEAD on that box, but I can, or you can