antocuni changed the topic of #pypy to: PyPy, the flexible snake (IRC logs: https://botbot.me/freenode/pypy/ ) | use cffi for calling C | "PyPy: the Gradual Reduction of Magic (tm)"
irclogs_io_bot has quit [Remote host closed the connection]
dmalcolm has quit [Ping timeout: 240 seconds]
irclogs_io_bot has joined #pypy
dmalcolm has joined #pypy
ssbr has quit [Ping timeout: 276 seconds]
ssbr has joined #pypy
<xqb>
pypy and pygobject should be working now, eh?
<xqb>
now I have to make sure I keep Python 2 compatibility
<xqb>
I have asyncio code :(
<xqb>
I mean, async def and await and such
<xqb>
or wait, does it work, did I misread?
marky1991 has joined #pypy
<xqb>
so I've just learned there's no asyncio in Py2
<xqb>
:(
* xqb
can just wait for pypy3 compatibility
jcea has joined #pypy
<xqb>
might someone know how far that is from happening?
<cfbolz>
xqb: what do you mean with "pypy3 compatibility"?
<kenaan_>
cfbolz default 14e8d60878c2 /pypy/module/cpyext/: when tp_hash is NULL and either tp_compare or tp_richcompare are Null then set __hash__ to None making the type un...
<kenaan_>
cfbolz default c61a8f2001e3 /pypy/module/cpyext/sequence.py: fix name of erasing pair (has no runtime effect, but sometimes easier to debug)
<xqb>
cfbolz: I mean pycairo support for pypy3
<Rotonen>
xqb: you could also use a thread pool executor from multiprocessing where asyncio would be a thing
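(A minimal sketch of Rotonen's suggestion, for code that has to stay Python 2 compatible: multiprocessing.dummy exposes the Pool API backed by threads, so I/O-bound work that would otherwise use asyncio can be fanned out to it. The fetch() helper and the URLs are made up for illustration.)

    from multiprocessing.dummy import Pool   # thread-backed Pool, works on 2.7 and 3.x

    try:
        from urllib2 import urlopen           # Python 2
    except ImportError:
        from urllib.request import urlopen    # Python 3

    def fetch(url):
        # hypothetical I/O-bound task; this is where the async def/await work would go
        return url, len(urlopen(url).read())

    urls = ["https://pypy.org/", "https://docs.python.org/"]
    pool = Pool(4)                             # four worker threads
    for url, size in pool.map(fetch, urls):
        print("%s -> %d bytes" % (url, size))
    pool.close()
    pool.join()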
<cfbolz>
xqb: that's a question to the pycairo devs
<xqb>
is it? my understanding is that 'the hot potato' is now in the pypy devs' hands
<arigato>
cfbolz: cpyext-faster-arg-passing looks like a good approach
<cfbolz>
arigato: cool, thanks
<arigato>
mattip: 3d1f618efa2e looks very much like it changes nothing
<kenaan_>
cfbolz cpyext-faster-arg-passing f997dff28b7f /pypy/module/cpyext/test/test_typeobject.py: can't pass with -A
marr has quit [Ping timeout: 240 seconds]
<cfbolz>
arigato: right, but the cpyext-datetime2 branch also looks like an unlikely culprit
<arigato>
yes, so we have to suspect unrelated things like a Spectre update
<cfbolz>
or a broken fan that throttles the CPU, or something
<arigato>
yes, some kind of hardware wear-down looks quite possible
<Rotonen>
you should be able to access the sensor data
<arigato>
it's all on tannit, right?
<arigato>
this is by now a very old machine
<cfbolz>
arigato: yes, amazing that it lasted that long, given how taxing this stuff is
<arigato>
agreed
<arigato>
it's running Ubuntu 12.04 and that was not the first installed version
<mattip>
is it time to talk to the speed.python.org people about running pypy benchmarks for us?
<cfbolz>
yes, that would be pretty awesome
<arigato>
Rotonen: do you have a link that gives a few pointers to easy places to check?
<arigato>
mattip: +1
<Rotonen>
lm-sensors provides sensors-detect, which will try multiple heuristics to modprobe the correct drivers (answer yes to all on the command line wizard), after which you just run the command sensors
<arigato>
(...sorry, I didn't mean "please do it")
<Rotonen>
by default it just prints the commands, or you can tell it to also modprobe, and also to append them to some kernel modules autoload mechanism for the next boot
<Rotonen>
it's rather friendly if you install lm-sensors and run sensors-detect as root
<Rotonen>
and it has support all the way back to arcane ISA bus stuff, so as long as the hardware has sensors, it is likely it can manage
<Rotonen>
also, if you actually hit thermal throttling, this will be in dmesg
<arigato>
nothing in dmesg
Rhy0lite has quit [Quit: Leaving]
<Rotonen>
not even with dmesg | grep 'hrot' ?
<Rotonen>
hmm, it could also be that the hardware is old enough that it does not expose those events to the OS
<Rotonen>
i'm fuzzy as to when the intel C states stuff landed in hardware and in kernels
<Rotonen>
and a good trick is to 'watch sensors' in another terminal while you do something heavy like a build, to see what's happening vs. idle
tbodt has joined #pypy
<Rotonen>
and come to think of it, maybe keep 'dmesg -H -w' open in a third terminal so you see with humanized timestamps if any relevant kernel events happen
<Rotonen>
another thing to suspect would be a degraded disk; check the SMART data while at it
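(For reference, a small Python helper sketching the checks discussed above; it assumes lm-sensors and smartmontools are installed, that sensors-detect has already been run as root, and /dev/sda is only a guessed device name.)

    import subprocess

    def output(cmd):
        # run a command and return its text output, or a note about the failure
        try:
            return subprocess.check_output(cmd, stderr=subprocess.STDOUT).decode("utf-8", "replace")
        except (OSError, subprocess.CalledProcessError) as e:
            return "(%s failed: %s)" % (cmd[0], e)

    print(output(["sensors"]))                         # current temperatures / fan speeds
    dmesg = output(["dmesg"])
    print("\n".join(line for line in dmesg.splitlines()
                    if "hrot" in line))                # the 'hrot' trick: matches Throttling/throttled
    print(output(["smartctl", "-a", "/dev/sda"]))      # SMART data, to rule out a degraded disk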
<mattip>
did they update the kernel for the intel bugs?
<mattip>
"the internal variable we were previously using is no longer exported from the CRT"
<arigato>
__pioinfo? argh argh this shouts "don't do that"
tbodt has quit [Read error: Connection reset by peer]
<mattip>
so now we need to rework the code in rposix to (for old win32 compiler) keep the current behaviour, but also
<mattip>
wrap dangerous things in a context manager
<mattip>
fun
tbodt has joined #pypy
marr has joined #pypy
tbodt has quit [Remote host closed the connection]
<arigato>
so wait
<arigato>
ah I see
<mattip>
maybe we could go the other way around
<arigato>
the CPython 3.x source code just checks that _MSC_VER >= 1900
<arigato>
and if not, it doesn't do anything at all
<arigato>
but we need to
<arigato>
do both
<mattip>
we could do something different - install a _set_thread_local_invalid_parameter_handler always, and only release it
<mattip>
in cpyext
<arigato>
it would be different from CPython e.g. during cffi calls and create messes for embedding
tbodt has joined #pypy
<mattip>
ok. So I will create a _set_thread_local_invalid_parameter_handler() call pair to install/remove a handler
<mattip>
like CPython
<arigato>
but it also needs to continue working on the old VC used for 2.7
<mattip>
right, the new functions will be no-ops on old vc and non-win32
<arigato>
ideally the interface should contain both, e.g.:
<arigato>
with validate_fd(fd):
<arigato>
code
<arigato>
which might raise during validate_fd() if using the old VC,
<mattip>
+1
<arigato>
or if using the new VC, the "fd" argument is ignored
<arigato>
which is obscure in itself but well
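(To make the interface concrete, a rough sketch of the context manager under discussion; this is not the real rposix code, and HAVE_NEW_CRT plus the two handler helpers are invented names standing in for the MSVC-specific parts.)

    import os, errno
    from contextlib import contextmanager

    HAVE_NEW_CRT = False          # invented flag standing in for a check like _MSC_VER >= 1900

    def _install_invalid_parameter_handler():
        # would call the CRT's _set_thread_local_invalid_parameter_handler(...)
        return None

    def _remove_invalid_parameter_handler(old_handler):
        pass

    @contextmanager
    def validate_fd(fd):
        if HAVE_NEW_CRT:
            # new VC: the fd argument is ignored; CRT calls inside the block get an
            # error (e.g. EBADF) instead of crashing the whole process
            old = _install_invalid_parameter_handler()
            try:
                yield
            finally:
                _remove_invalid_parameter_handler(old)
        else:
            # old VC / non-win32: validate eagerly, so entering the block may raise
            try:
                os.fstat(fd)
            except OSError:
                raise OSError(errno.EBADF, "invalid file descriptor")
            yield

    # usage, matching the shape arigato sketched:
    #     with validate_fd(fd):
    #         code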
<arigato>
with no_validation_crash_or_else_validate_fd_now(fd) ?
<mattip>
it sounds better in German :)
<arigato>
:-)
Frankablu has joined #pypy
<cfbolz>
heh
<Rotonen>
"crash, or else" also sounds brutal when parsed that way
<Frankablu>
Hi, I'm trying to use PyPy3 on Windows (the beta, "this won't work" thing) and the multiprocessing module appears to be fried on it. Any recommendations for how to split my workload over multiple cores, e.g. ZMQ, etc.?
<arigato>
with dont_crash_here_possibly_by_validating_this_fd(fd):
<cfbolz>
with HierNichtAbstuerzenVielleichtDurchValidierungDiesesFDs(fd):
<Rotonen>
arigato: with raise_or_validate(fd) ?
<xorAxAx>
with dwim(fd): pass
<arigato>
cfbolz: +1 :-)
<mattip>
cfbolz: I am tempted to use it ...
<mattip>
Frankablu: "appears to be fried" - can you give us more details?
<cfbolz>
mattip: unfortunately python 2 doesn't support unicode identifiers yet, or we could use hebrew and finally test the right-to-left support of all our editors
<mattip>
:) Doesn't HierNichtAbstuerzenVielleichtDurchValidierungDiesesFDs need an umlaut or two?
<Frankablu>
mattip, Sure no problem 2 seconds
<Rotonen>
do emoji RTL?
<arigato>
Rotonen: no because it doesn't mean the two different things it could do, one of which is "ignore fd, run this block of code, and if it calls a CRT function then have this CRT function give us an error if fd was invalid, pretty please, instead of crashing the whole process"
<cfbolz>
mattip: yes, should be "Abstürzen" really
<arigato>
but "Abstuerzen" is a somehow officially valid workaround, right?
<cfbolz>
yes
<arigato>
unlike french accents where there is no alternative
<cfbolz>
arigato: the ß is a bit more of a problem because the ascii replacement ss really introduces ambiguities
<Frankablu>
mattip, It looks bit rotted to me, but I've only looked at it for 5 minutes :p
<xorAxAx>
cfbolz: not in switzerland :-)
<cfbolz>
they disambiguate by context, which breaks down for stuff like "in Massen" and "in Maßen"
<xorAxAx>
a swiss person would always write in massen
<cfbolz>
yep
<cfbolz>
bit annoying, given that they mean kind of opposites ;-)
<arigato>
Frankablu: ah, so it just doesn't work at all. it's not bitrotted as much as "we never implemented this Windows-only stuff in _multiprocessing"
<mattip>
indeed, it is in pure python lib_pypy/_winapi.py
<mattip>
and WAIT_ABANDONED_0 is missing
<arigato>
and probably a certain number of other things too
<arigato>
Frankablu: it's all in pure Python though, which means one way forward would be to implement it (lib_pypy/_winapi.py and possibly _pypy_winbase_cffi)
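(For reference, the addition mattip mentions presumably amounts to something like the following in lib_pypy/_winapi.py, assuming the standard Win32 value for the constant.)

    # standard Win32 value: abandoned-mutex wait results start at WAIT_OBJECT_0 + 0x80
    WAIT_ABANDONED_0 = 0x00000080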
<Frankablu>
Thanks, I'll guess it will get fixed eventually
<Frankablu>
I'm just porting over a genetic algorithm I wrote on CPython; it appears to run ~2x faster on a single core in PyPy than it does on CPython with 4 cores. If I need it to run faster I'll do something with ZMQ later, I guess
<mattip>
pypy3 is beta-quality. If you can, use pypy2 for now
<Frankablu>
I probably can, it's not like I use much python 3 syntax
<Frankablu>
mattip, Thanks for the help and your time
<mattip>
:)
<mattip>
it seems we never updated the _multiprocessing module for win32 python3
<mattip>
gnite
mattip has left #pypy ["Leaving"]
<kenaan_>
mattip py3.5 ea2362e153ff /lib_pypy/_winapi.py: add missing constant, more are lacking
jamesaxl has quit [Quit: WeeChat 2.0.1]
tbodt has quit [Quit: My Mac has gone to sleep. ZZZzzz…]