<njs>
mattip: the interpreter itself is only linked to a small handful of things
<njs>
mattip: when you 'import ssl', then the ssl module is an extension module that the main interpreter dlopen()s
<mattip>
ahh, like pypy3
<njs>
and dlopen() works differently than linking directly
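[For illustration: a minimal sketch of seeing this split on Linux, assuming the ldd tool is available — compare what the interpreter binary itself links against with what the dlopen()ed _ssl extension pulls in:]

    import subprocess
    import sys
    import _ssl  # the extension module that 'import ssl' loads via dlopen()

    # ldd lists the shared libraries a binary is linked against
    for target in (sys.executable, _ssl.__file__):
        print("==>", target)
        print(subprocess.check_output(["ldd", target]).decode())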
<mattip>
pypy3 uses cffi for the ssl module. It still links to libcrypt, libexpat, libbz2, libffi, libtinfo, and libgcc that are not linked in cpython
<mattip>
of those, crypt, expat, bz2 could be refactored into external modules. I am not clear why tinfo and gcc are linked in
* mattip
updating the issue
<njs>
I'm trying to figure out whether having libffi in the list would cause any problems
<njs>
cpython links to expat, and it's weird and seems like probably a bad thing to me, but it hasn't actually caused any problems I know of so you might be able to get away with it
<mattip>
strange, cpython3 does but cpython2 doesn't
<mattip>
i guess ssl is the main problematic one, which is solved for pypy3 already
<mattip>
ssl being outside causes bootstrap problems on windows: we need setuptools to find a compiler to build the cffi module,
<mattip>
but setuptools needs ensurepip,
<mattip>
and ensurepip wants to reach out to the internet to check pip, but reaching the internet needs ssl
<njs>
it seems like there must be a way to find a compiler on a local disk without using the internet, but I sympathize with whoever has to untangle that
<mattip>
njs: on another note, can cpython maintainers upload manylinux2010 wheels to PyPI yet?
<njs>
I think so?
<njs>
at least in theory, the build tooling might not be very easy to use
<mattip>
ok, thinking about reworking the numpy wheel scripts to use it
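[Roughly, that workflow means building inside the official `quay.io/pypa/manylinux2010_x86_64` docker image and then running `auditwheel repair` on the resulting wheels so they get the manylinux2010 tag; the details of numpy's scripts are not shown here.]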
<njs>
ah, cool
<njs>
that's almost certainly a good idea. newer compilers make faster code.
<mattip>
we have moved to c99 on HEAD now that we are not constrained to python2
<mattip>
"InvalidMatch: variable mismatch: 'p55' instead of 'p55'"
<mattip>
huh???
<cfbolz>
mattip: will take a look once I'm in my office
<arigato>
Alex_Gaynor: not that I know
<arigato>
(unrelated: pip started actively refusing to work on officially deprecated versions of Python. That's stupid imho)
<energizer>
is it an ssl thing?
<arigato>
no, the message is "DEPRECATION: Python 3.4 support has been deprecated. pip 19.1 will be the last one supporting it. Please upgrade your Python as Python 3.4 won't be maintained after March 2019"
<mattip>
arigato: do you want to publish the release on cffi-dev
<arigato>
which sounds a lot like "we're not willing to maintain Python 3.4 support, we'd prefer to use all new 3.5 features inside pip"
<arigato>
mattip: what is cffi-dev?
<mattip>
mailing list
<mattip>
sorry, python-cffi
<arigato>
as far as I can tell, I did already
<mattip>
indeed, it seems 10 hours ago. My mail delivery must be slow
<mattip>
congrats :)
<cfbolz>
arigato: yes, Julian complained to them that he was seeing the message on PyPy2
<cfbolz>
But his pull request to fix that was accepted
<arigato>
cfbolz: right, but I'm really complaining about the attitude rather than the message
<arigato>
right now if you're somehow stuck with Python 3.3, then pip no longer works at all
<arigato>
I bet if you use an old pip, you see a warning "please update your pip", and if you do update, then you get a pip that no longer works at all and you're stuck
<cfbolz>
Yes
<kenaan>
mattip default 271763ca6c71 /rpython/rlib/test/test_rutf8.py: release notes for hypothesis 4.0.0: "average_size was a no-op, is removed
<njs>
arigato: pip won't upgrade itself to a version that doesn't work on the current python
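[The mechanism, sketched under the assumption that it's the Requires-Python metadata doing the work: a project declares which interpreters it supports, and pip (9 and newer) skips releases whose declaration doesn't match the running Python.]

    # setup.py for a hypothetical project
    from setuptools import setup

    setup(
        name="example",
        version="1.0",
        # pip >= 9 reads the resulting Requires-Python metadata and
        # refuses to select this release on older interpreters
        python_requires=">=3.5",
    )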
<arigato>
OK
<njs>
there is currently some debate about when pip should drop py2 support, but it seems like "sometime after january 1" is winning
<arigato>
and how do you install the older pip if you have a brand new installation of python 3.3?
<njs>
heh, no idea
<njs>
3.4+ ship with a vendored copy of pip though
<njs>
so they're fine
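[That vendored copy is exposed through the ensurepip module: `python -m ensurepip --upgrade` installs pip from wheels bundled with the interpreter, without touching the network.]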
<arigato>
right, that's why there are no cffi wheels for windows for cpython 3.3
<arigato>
because I couldn't figure it out either
<arigato>
in this case it's not really a problem, given that python 3.3 really doesn't seem to be used any more
<arigato>
I'm just wondering what we'll do after january 1, 2020
* mattip
mumbles about ensurepip
<arigato>
a hard drop of python 2.7 where you can't install pip any more... sounds like the worst possible idea as long as python 2.7 is the most used version of python
<arigato>
right, 2.7 has got ensurepip, sorry
<mattip>
but I agree, like most libraries, pip should have an LTS path for python2
<arigato>
says the man who drops 2.7 support for numpy :-) ...sorry
<njs>
arigato: 2.7 is going to be handled very differently from 3.3 :-)
<arigato>
yes, sorry, I'm just rambling along
<mattip>
numpy will not ship new features, but will offer lts on 1.16 for bug fixes
<arigato>
ok
<arigato>
(rumbling)
<mattip>
it will be a mess though, backporting bug fixes will get harder and harder as the branches diverge
<mattip>
in the mean time, we are still running CI on 1.17 for python2
<njs>
mattip: I doubt the py2 diehards care that much about bugfixes honestly...
<fijal>
they care if pip is working
<fijal>
maybe we should pick up the torch?
<dstufft>
a few things:
<dstufft>
1) pip is unlikely to stop working: the oldest version of pip, pip 0.8, is still fully functional on 2.7; all you have to do is adjust the default URL to point it towards the new PyPI location
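[A sketch of what that adjustment might look like, assuming old pip understands the same option as today: `pip install --index-url https://pypi.org/simple/ somepackage`, or an equivalent `index-url` line in pip.conf.]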
<dstufft>
2) While pip is warning people about 2.x, I'm on the side that we have to take a usage-based approach to deciding when to drop support, which means we're likely to support 2.7 for quite some time
<dstufft>
3) Re: refusing to work on versions we've stopped supporting: the problem is we have to decide between just not testing but still allowing people to upgrade (which means people can pip install -U pip into a broken install), or actively preventing installation, which means that while it might work, we have no idea, so it might start failing, who knows
<LarstiQ>
dstufft: I think I've seen situations where pip upgrades to the latest version automatically, is that pip's doing or something else?
<dstufft>
LarstiQ: pip itself should never upgrade automatically, but virtualenv does install the latest pip by default when creating a new virtualenv, so maybe you mean that?
<LarstiQ>
dstufft: ah, that sounds like it yeah
<mattip>
dstufft: thanks for the clarifications. Is there a statement of backward/forward compatibility on https://pip.pypa.io ?
<mattip>
and indeed it is a conundrum when people use pip in so many different ways across so many platforms and versions
<dstufft>
mattip: there's no direct statement. Previously we never dropped support for any version of Python until it was well and dead (<5%ish of total downloads on PyPI), so there wasn't a huge need
<dstufft>
we're still figuring out exactly what our policy on 2.7 is going to be
<dstufft>
so the warning is more of a warning that something is going to happen, but we don't know what yet
<mattip>
yeah, mess. One of the problems is the mass of people with private PyPI instances behind firewalls using 2.7 but updating every now and then
<mattip>
not that you have to support them, but they will be annoyed when things break
<fijal>
dstufft: would be cool to have an official statement at some stage
<dstufft>
fijal: when we figure out what that is we will
<fijal>
Cool!
<fijal>
I don't think people understand what end of support means
<fijal>
Clearly, it's not clear since you keep debating
<mjacob>
cffi question:
<mjacob>
i have a "typedef struct { uint64_t v; } float64_t;"
<mjacob>
a function returning float64_t will return a python object <cdata 'float64_t' owning 8 bytes>
<mjacob>
how can i create this kind of object myself?
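[One plausible answer, sketched with the standard cffi API (whether the repr comes out exactly as 'owning 8 bytes' may differ):]

    import cffi

    ffi = cffi.FFI()
    ffi.cdef("typedef struct { uint64_t v; } float64_t;")

    p = ffi.new("float64_t *")  # owning cdata: <cdata 'float64_t *' owning 8 bytes>
    p.v = 42                    # struct fields are reachable through the pointer
    x = p[0]                    # the struct value itself, passable by value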
<simpson>
dstufft: FWIW what I'm doing for NixOS, and what I'd like to see other distros do, is get ready to simply pop out CPython 2.7 for PyPy: https://github.com/NixOS/nixpkgs/pull/55182
<simpson>
CPython 2.7 wants to die? Great! I will help it.
<dstufft>
simpson: I think it is unlikely pip supports 2.7 indefinitely. Most of the pip developers want to drop support for 2.7 and originally were very on board with dropping support for 2.7 on Jan 2020 when CPython does
<dstufft>
I think I've convinced them to take a more usage based approach, and pay attention to PyPI numbers
<dstufft>
(at current trends, 2.7 vs 3.x on PyPI will be roughly ~50% on Jan 2020)
<simpson>
dstufft: Bet I can get NixOS to build Python packages without pip.
<simpson>
(And I will not miss it!)
<simpson>
But, like, push coming to shove, I won't let pip hold PyPy on NixOS hostage, mostly because I don't value pip as much as I value PyPy.
<dstufft>
simpson: okay
<mattip>
simpson: do you intend to replicate the top 20k packages on PyPI into the NixOS package manager? Otherwise your users will not appreciate you dropping pip support
<simpson>
mattip: We already support installation from wheels, and we've already packaged a pretty big pile of PyPI shit.
<mattip>
cool
<simpson>
I'm not at all the Python maintainer, though, and they're more conservative about this. So who knows~
<dstufft>
a lot of work has gone into making it possible to implement a thing that installs Python packages that isn't pip
<dstufft>
so it's cool to know someone is using it
<mattip>
so as a NixOS user I can do `tool install package --from-pypi` and it will go to pypi to look for it? Nice!
<simpson>
mattip: Yikes, that's entirely backwards. You'd do something like `nix-shell -p pypy27Packages.package`
<simpson>
dstufft: Something that may eventually be interesting to y'all is that, in nixpkgs, we generally attempt to run tests for Python packages. As a result, we have (in Nix code) a detailed nosology of Python packages which have broken/network-required/fancy-filesystem/nondeterministic tests.
<simpson>
dstufft: I was rather hoping that you would know! I'm following the links at the head of https://python3wos.appspot.com/ to try to find that out.
<dstufft>
oh wait, this graph only goes up to 46k packages
<mattip>
:)
<dstufft>
so left to right, PyPI would be somewhere like another 2.5 image widths to the right, what's the definition of fresh used for the y axis?
<simpson>
dstufft: Right, things are usually only added to nixpkgs if somebody actually uses them, or if they're in a dependency graph. There's piles of nixpkgs stuff that is leaves that only one person ever wanted.
<dstufft>
is it like, released in the last year or something?
<simpson>
IIRC a "fresh" package is reasonably close to its upstream version.
<mattip>
every month numpy gets about 10 million downloads from pypi. Assume 1% is actual users and 99% is CI, that's about 100k real users
<dstufft>
hm, not sure how to model that for PyPI, though I see CRAN and Hackage on here so they must be doing something different for fresh
<dstufft>
for "source" repos at least
<mattip>
assume 10% need "only one" single-use package, that's still 10k packages to pull from PyPI
<simpson>
Hm, "fresh" might be "not outdated". Their API talks only of outdatedness (https://repology.org/api).
<dstufft>
PyPI has something like 2 billion files downloaded in the last month
<dstufft>
it's crazy to me
<dstufft>
that doesn't account for the aggressive caching pip does of files (basically it'll only download each file once per computer per user, unless something deletes files out of the cache)
<simpson>
dstufft: I'm reading https://repology.org/addrepo right now and it looks involved. I think that the hard part is going to be figuring out what all of the Python packages are called downstream, although you could declare that PyPI's package names are the authoritative ones, since, y'know, PyPI.
<mattip>
dstufft: travis uses pypi as a caching service for clean docker images. It would be nice if they taught people how to update their images to avoid that
<simpson>
Which is a shame because the code's there, it's written in Python, and it's getting touched regularly.
<dstufft>
the hilarious weirdness there is on legacy PyPI, /pypi was the home page that everyone knew and "loved", and /pypi/ was a massive list of every package on PyPI
<dstufft>
obviously it is not great to send you a 30MB+ HTML file because you accidentally added a second slash
<simpson>
dstufft: I'm gonna set up a local Repology, make exactly that change, and see what happens. It does look like that "simple index" is the right shape.
<dstufft>
I've done exactly zero investigation into what information they're actually pulling out of that data, so maybe they need more, but that is probably a reasonable place to start
<dstufft>
oh you probably can't get the version or summary from the simple index
<dstufft>
you'd have to do 1 http request per package to get all of that
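[For what it's worth, a sketch of that per-package request against PyPI's JSON API (the endpoint is real; the helper name is made up):]

    import json
    from urllib.request import urlopen

    def pypi_info(name):
        # one request per package: https://pypi.org/pypi/<name>/json
        with urlopen("https://pypi.org/pypi/%s/json" % name) as resp:
            info = json.load(resp)["info"]
        return info["version"], info["summary"]

    print(pypi_info("numpy"))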
<simpson>
Right. Whereas, while I haven't read the Hackage Parser class, I don't have to wonder how far 1 HTTP request will go; it could get everything and then some: https://hackage.haskell.org/api#core
<dstufft>
Yea, one HTTP request is kind of hard to scale
<dstufft>
we could probably do something that only updated once a day or something
<dstufft>
Like, according to repology, Hackage has 13339 packages, PyPI has 168302 (though not all of those actually have files uploaded)
<simpson>
Right, and I figure that Repology packages probably only exist if they can actually be installed, although I haven't thought through what that implies.
<simpson>
dstufft: Sorry, nope, it's not that simple. I see a path forward but it's not going to be pleasant.
<simpson>
In addition to updating all the URLs and changing the regex, we still need at least one request/package to determine (a) if there's actually code behind the package, and (b) what versions exist.
<simpson>
But the good news is that one doesn't *actually* need to run Pg and the dev server and all that crap. One can directly import the parser and work with its unit. So there's that.
<antocuni>
bah, eventlet's monkeypatch is horribly broken on pypy
<antocuni>
if obj is of type threading.RLock, it tries to access obj._RLock__block, which apparently doesn't exist on PyPy
<antocuni>
uhm, and apparently on pypy 6.0.0 it works
<antocuni>
ok, it seems that the "culprit" is the rlock-in-rpython branch, although admittedly it's more eventlet's fault
<mattip>
antocuni: is there a reason they need that interface?
<antocuni>
mattip: they do it inside "_green_existing_locks"
<antocuni>
they monkey-patch rlocks AFTER they have been created, by walking through gc.get_objects() 😱
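[The snippet being run isn't shown in the log; a minimal reconstruction of the check that follows:]

    import gc
    import threading

    lock = threading.RLock()
    # eventlet's _green_existing_locks relies on finding live locks this way
    if any(obj is lock for obj in gc.get_objects()):
        print("Found!")
    else:
        print("WTF")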
<antocuni>
on CPython2.7, it prints "Found!", as expected
<antocuni>
on CPython3.6, it prints WTF
<fijal>
antocuni: gc.get_objects() is not a very reliable interface?
<antocuni>
isn't it?
<fijal>
I don't know
<fijal>
seems not to be!
<antocuni>
uhm, gc.get_objects returns only the objects "tracked by the collector", i.e. I think it means only the objects which reference other PyObject*
<fijal>
heh
<antocuni>
indeed, the RLocktype in CPython's _threadmodule.c doesn't have Py_TPFLAGS_HAVE_GC set
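[That's directly observable from Python — a minimal sketch:]

    import gc
    import threading

    # False on CPython 3.x, where RLock is the untracked C implementation;
    # True on CPython 2.7, where threading.RLock() is a plain Python object
    print(gc.is_tracked(threading.RLock()))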
<fijal>
i told you it's a terrible interface ;-)
<antocuni>
I suppose I just found a nasty bug in eventlet on py3k
<mattip>
antocuni: but somehow it raised an exception, which means it found a lock and tried to set the non-existent attribute
<mattip>
or does python2 use a different codepath
<antocuni>
they are different issues
<antocuni>
there are two paths, one for py2 and one for py3
<antocuni>
pypy fails because 7.0.0 is no longer compatible with the py2 path (but it might/should be compatible with the py3 path)
<antocuni>
however, the py3 path relies on gc.get_objects() to discover existing locks: but on py3, locks cannot be discovered by using gc.get_objects!
<fijal>
eventlet is a mess
<antocuni>
ironically, I think that on pypy it works because gc.get_objects DOES work
<mattip>
ahh, the bug is cpython3
<mattip>
which is what you said above
<mattip>
... and as for the benchmark runs, the buildbot triggers a comparison of "pypy" with "pypy --jit off" for "baseline" and "changed"
* mattip
running pypy2-v7 compared to cpython-2.7.15rc1 (default python2 on ubuntu 18.04),
<mattip>
then I will run pypy2-v6, pypy2-v5 and latest default-with-unicode-utf8
<antocuni>
mattip: yes, I confirm that with pypy3.6-7.0.0 it prints "Found"
<fijal>
antocuni: that's a blog post material IMO
<fijal>
of the "relying on obscure APIs is a bad idea, even if you think you know what you are doing"
<fijal>
or even if you know what you are doing, but the underlying platform disagrees
<mattip>
theoretically, they should have a test that checks that API
<mattip>
like the one antocuni wrote
<fijal>
can you reliably predict all the ways that an underlying platform can change?
<fijal>
in a way that gives you >50% chance (say) that it's not your convoluted test that's broken?
<fijal>
somehow doubt it
<fijal>
not using obscure APIs seems like a safer way to go
<antocuni>
fijal: it's not that pypy/rpython doesn't use obscure/undocumented/internal APIs or hacks
<antocuni>
although usually for us, the more hackish the feature, the more robust the test we write