cfbolz changed the topic of #pypy to: PyPy, the flexible snake (IRC logs: https://quodlibet.duckdns.org/irc/pypy/latest.log.html#irc-end ) | use cffi for calling C | if a pep adds a mere 25-30 [C-API] functions or so, it's a drop in the ocean (cough) - Armin
jvesely has joined #pypy
jcea has joined #pypy
phoe62 has joined #pypy
phoe62 has quit [Quit: The Lounge - https://thelounge.github.io]
phoe62 has joined #pypy
jcea has quit [Quit: jcea]
andi- has quit [Remote host closed the connection]
andi- has joined #pypy
kiwi_83 has joined #pypy
sfdye has joined #pypy
sfdye has left #pypy [#pypy]
kiwi_83 has left #pypy [#pypy]
sfdye has joined #pypy
sfdye has quit [Client Quit]
ionelmc has quit [Quit: Connection closed for inactivity]
xcm has quit [Ping timeout: 250 seconds]
xcm has joined #pypy
dddddd has quit [Remote host closed the connection]
altendky has quit [Quit: Connection closed for inactivity]
jvesely has quit [Quit: jvesely]
dansan has quit [Read error: Connection reset by peer]
dansan has joined #pypy
oberstet has quit [Remote host closed the connection]
lritter has joined #pypy
<kenaan> mattip py3.6 ce5b78b5c182 /pypy/module/cpyext/test/test_eval.py: skip flaky test on darwin too
<kenaan> mattip py3.6 b4b11d42cd34 /lib_pypy/_testcapimodule.c: py_w_stopcode is only for "ifndef PYPY_VERSION"
antocuni has joined #pypy
YannickJadoul has joined #pypy
oberstet has joined #pypy
antocuni has quit [Ping timeout: 240 seconds]
Ai9zO5AP has quit [Ping timeout: 268 seconds]
xcm has quit [Remote host closed the connection]
xcm has joined #pypy
<mattip> this segfaults for me on default, on ubuntu 18.04
<mattip> python2 pytest.py pypy/module/_ssl/test/test_ssl.py -k test_sslwrap
<mattip> somehow the context is not valid
<mattip> we could just delete that module since we have the cffi-based one, like we did on py3.6
jcea has joined #pypy
<cfbolz> mattip: sounds good
<kenaan> mattip py3.6 73df60cabe78 /pypy/interpreter/test/test_gateway.py: fix error msg for py3.6
<kenaan> mattip py3.6 90a079f62b2b /pypy/module/: fix failing tests
<kenaan> mattip default 8d3df17fc5ee /pypy/: remove the _ssl module; we now use the cffi-based one
<mattip> let's see what breaks
jcea has quit [Remote host closed the connection]
jcea has joined #pypy
<Dejan> i smell success
<Dejan> :)
<cfbolz> mattip: Yay, deleting code is good!
edd[m] has quit [Write error: Connection reset by peer]
bendlas has quit [Write error: Broken pipe]
agates[m] has quit [Write error: Connection reset by peer]
extraymond[m] has quit [Remote host closed the connection]
Ashleee has quit [Ping timeout: 240 seconds]
Ashleee has joined #pypy
dddddd has joined #pypy
jcea has quit [Remote host closed the connection]
antocuni has joined #pypy
jvesely has joined #pypy
jvesely has quit [Quit: jvesely]
jcea has joined #pypy
BPL has joined #pypy
BPL has quit [Remote host closed the connection]
xcm has quit [Remote host closed the connection]
xcm has joined #pypy
BPL has joined #pypy
BPL has quit [Remote host closed the connection]
antocuni_ has joined #pypy
antocuni has quit [Ping timeout: 252 seconds]
Nik8 has joined #pypy
<Nik8> Hi, I'm trying to benchmark some code on pypy using pytest-benchmark (which uses cprofile), and I'm getting different results in different runs; take this example: https://pastebin.com/gq5T9vUs
<Nik8> What am I missing? ty
forgottenone has joined #pypy
<mattip> Nik8: benchmarking is non-trivial. Did you read this http://doc.pypy.org/en/latest/faq.html#how-fast-is-pypy ?
<mattip> I don't know details of pytest-benchmark, but using cprofile is going to interfere with the JIT.
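An aside on the point above: cProfile is a tracing profiler, and hooking every call is exactly what disturbs the JIT. A minimal stdlib illustration of what tracing profiling looks like (the `work` function here is just a stand-in workload, not anything from the discussion):

```python
import cProfile
import io
import pstats

def work():
    # Stand-in workload: a simple arithmetic loop.
    return sum(i * i for i in range(100000))

# Tracing every call like this gives per-function call counts and times,
# but on PyPy the tracing itself keeps the JIT from optimizing normally,
# so the absolute numbers do not reflect an untraced run.
pr = cProfile.Profile()
pr.enable()
work()
pr.disable()

buf = io.StringIO()
pstats.Stats(pr, stream=buf).sort_stats("cumulative").print_stats(5)
stats_text = buf.getvalue()
```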
<Nik8> I'm benchmarking the code in the same environment, if I wasn't clear: just the "same" run twice (or more)
<Nik8> with some warmup runs.
<Nik8> what would be recommended instead of cprofile? And do you know of any reference code that benchmarks pypy code?
<mattip> it depends what you are trying to do
<mattip> microbenchmarking is nice for comparing two versions of pypy, not for comparing across implementations
<mattip> if you want to compare cpython to pypy, you need to do it with your actual code
<mattip> for profiling, we recommend statistical profilers like vmprof
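To make the contrast concrete, here is a toy stdlib-only sketch of the *idea* behind a statistical (sampling) profiler like vmprof: a background thread periodically snapshots the main thread's stack instead of tracing every call, so the profiled code runs nearly unperturbed. This is only an illustration of the principle; vmprof's real implementation samples at the C level.

```python
import collections
import sys
import threading
import time

def sample_stacks(main_id, counts, stop, interval=0.001):
    # Periodically snapshot the main thread's stack and count which
    # functions appear on it; hot functions accumulate the most samples.
    while not stop.is_set():
        frame = sys._current_frames().get(main_id)
        while frame is not None:
            counts[frame.f_code.co_name] += 1
            frame = frame.f_back
        time.sleep(interval)

def profile(func):
    counts = collections.Counter()
    stop = threading.Event()
    sampler = threading.Thread(
        target=sample_stacks,
        args=(threading.main_thread().ident, counts, stop))
    sampler.start()
    try:
        func()
    finally:
        stop.set()
        sampler.join()
    return counts

def busy():
    # Stand-in hot loop; should dominate the collected samples.
    total = 0
    for i in range(5000000):
        total += i * i
    return total

counts = profile(busy)
```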
Nik8 has quit [Ping timeout: 260 seconds]
firespeaker has quit [Quit: Leaving.]
BPL has joined #pypy
BPL has quit [Remote host closed the connection]
<mattip> pytest asserts no longer print the values on the default branch as well as py3.6, is that on purpose?
<mattip> ronan: ^^^
<cfbolz> mattip: yes, I noticed too, sounds wrong
<cfbolz> nik8 is gone again, hm
altendky has joined #pypy
YannickJadoul has quit [Quit: Leaving]
Nik8 has joined #pypy
<Nik8> I'm trying to benchmark an action which takes 300-400 ms; is that considered a micro-benchmark? (I'm using 9-10 warmup iterations)
<cfbolz> Nik8: what does your *actual* workload look like
<cfbolz> Nik8: also, what do you see for this concrete example that you consider "getting different results"?
<Nik8> I'm profiling and benchmarking the creation of a big python object, which is built from json and includes some more logic processing. I don't have a good way to describe it, just normal business logic
<Nik8> different results = +-15% of the time it took
<Nik8> Today, the described action is taking "too long" and I want to make it faster, so I'm searching for a way to measure whether my changes improve anything. If there is any other automated, recommended way I'll be glad to hear it
<Nik8> And if you know of any project which uses pypy and does these things (benchmarking), I'll be glad to read about it too
<cfbolz> +-15% in the above example is unfortunately pretty common
<cfbolz> on my machine, do() is 30ms, lots of noise sources on the ms scale
<cfbolz> you always need to run several processes, and average the results
<cfbolz> and look at the variation, to see whether your improvement is statistically significant
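The multi-process approach cfbolz describes could be sketched with the stdlib like this (the inline child script is a hypothetical stand-in workload, timed inside the child so interpreter startup is excluded):

```python
import statistics
import subprocess
import sys

# Child script run in a fresh interpreter each time; a new process resets
# per-process noise sources such as address space layout and JIT
# compilation order.
CHILD = """\
import time
t0 = time.time()
total = sum(i * i for i in range(200000))  # stand-in workload
print(time.time() - t0)
"""

def bench(n_procs=5):
    times = []
    for _ in range(n_procs):
        out = subprocess.run([sys.executable, "-c", CHILD],
                             capture_output=True, text=True, check=True)
        times.append(float(out.stdout))
    return times

times = bench()
mean = statistics.mean(times)
spread = statistics.stdev(times)  # variation across processes
```

Comparing `spread` against the difference between two versions of the code is the statistical-significance check mentioned above, in its simplest form.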
<Nik8> the benchmark plugin of pytest runs several iterations and does that
<Nik8> (not processes)
<cfbolz> Nik8: in the same process? or starting a new process
<cfbolz> the latter is really needed
<Nik8> why?
BPL has joined #pypy
<cfbolz> because a lot of the variation gets reset when you start a new process. you could get unlucky with address space randomization, for example
<cfbolz> (or with the order in which pypy compiles stuff)
<cfbolz> (fwiw, I see the same relative variation on the pasted example in CPython, about 5%)
<cfbolz> but I was just running it as a script, not with pytest-benchmark
<Nik8> Vmprof, as I saw, is a profiler which shows me a plot of where the code spends its time. Did you mean to use it for benchmarking?
<cfbolz> Nik8: vmprof is great for showing you which parts of your code are slow
<cfbolz> But it will give you slightly slower results than running outside of vmprof
<Nik8> How are the measurements on speed.pypy done?
<cfbolz> Just a loop, with time.time calls around
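The "just a loop, with time.time calls around" approach could look roughly like this stdlib sketch (not the actual runner code from bitbucket.org/pypy/benchmarks; the lambda is a stand-in workload):

```python
import time

def bench(func, iterations=10):
    # Time each iteration separately; on PyPy the first few iterations
    # are typically slower while the JIT warms up, and later ones show
    # the steady-state speed.
    times = []
    for _ in range(iterations):
        t0 = time.time()
        func()
        times.append(time.time() - t0)
    return times

times = bench(lambda: sum(i * i for i in range(100000)))
```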
<Nik8> I'll look for the code
<Nik8> With several processes and so on?
<mattip> bitbucket.org/pypy/benchmarks
<cfbolz> Nik8: unfortunately not
<cfbolz> That's why the day to day results are so noisy
<Nik8> That uses cprofile -_-
<cfbolz> Nik8: if you manage to extract a piece of your actual workload into a benchmark, that would be quite interesting for us, fwiw
<cfbolz> Nik8: where do you see cprofile usage?
<Nik8> My bad, it's just a configurable option
xcm has quit [Read error: Connection reset by peer]
xcm has joined #pypy
<Nik8> Just to understand better, a benchmark like this: https://bitbucket.org/pypy/benchmarks/src/default/own/bm_gzip.py, as I saw, refers to https://bitbucket.org/pypy/benchmarks/src/default/unladen_swallow/performance/util.py, which means no warmup?
<Nik8> And I'll correct myself again: it benchmarks, it doesn't profile, by default (cprofile), so in my case that wasn't the issue
Ai9zO5AP has joined #pypy
forgottenone has quit [Quit: Konversation terminated!]
jacob22_ has joined #pypy
jacob22 has quit [Read error: Connection reset by peer]
<ronan> mattip: I'm not sure, and I don't have time to check now, sorry. I'll look into it
<mattip> no hurry, just curious what changed
marvin_ has quit [Remote host closed the connection]
marvin has joined #pypy
<mattip> Nik8: we reached the conclusion that it was unfair to throw away warmup time
marvin is now known as Guest61850
<mattip> but those benchmarks run enough times so that the JITted runs are dominant
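The point about warmup not being thrown away can be sketched as follows: time the whole run, warmup included, and rely on a large iteration count so the JIT-compiled iterations dominate the per-iteration average (the workload here is a hypothetical stand-in):

```python
import time

def avg_including_warmup(func, iterations):
    # Total wall time over all iterations, warmup included. With a
    # large iteration count, the post-warmup (JITted) iterations
    # dominate the average, so warmup is counted but amortized away.
    t0 = time.time()
    for _ in range(iterations):
        func()
    return (time.time() - t0) / iterations

work = lambda: sum(i * i for i in range(50000))
few = avg_including_warmup(work, 5)    # warmup weighs heavily here
many = avg_including_warmup(work, 50)  # warmup is mostly amortized
```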
Guest61850 has quit [Remote host closed the connection]
marvin_ has joined #pypy
antocuni_ has quit [Ping timeout: 264 seconds]
jvesely has joined #pypy
<mattip> the logic in runicode.str_decode_utf_8_impl is different from unicode_helper.str_decode_utf8
<mattip> and the test from issue 2389 never made it out to module/_codecs/test/test_codecs.py
Nik8 has quit [Ping timeout: 260 seconds]
Nik8 has joined #pypy
<kenaan> mattip default c4b55d31320d /pypy/: issue 2389: (take 2) redo faulty logic by copying runicode.str_decode_utf8_impl
<mattip> now there are 2 things to be broken tonight
<Nik8> Is pypy3 present somehow on speed.pypy?
<mattip> nope, it is pypy2 only. There is also speed.python.org, but that benchmark suite is not really suitable for cross-implementation comparison
<mattip> it would be cool if someone would convert bitbucket.org/pypy/benchmarks to work on python3
lritter has quit [Ping timeout: 268 seconds]
forgottenone has joined #pypy
xcm has quit [Remote host closed the connection]
xcm has joined #pypy
jvesely has left #pypy [#pypy]
jvesely has joined #pypy
forgottenone has quit [Quit: Konversation terminated!]
Nik8 has quit [Remote host closed the connection]
BPL has quit [Quit: Leaving]
BPL has joined #pypy
ionelmc has joined #pypy
bendlas has joined #pypy
bendlas has quit [Ping timeout: 246 seconds]
bendlas has joined #pypy
jacob22_ has quit [Ping timeout: 246 seconds]
jacob22_ has joined #pypy
edd[m] has joined #pypy
extraymond[m] has joined #pypy
agates[m] has joined #pypy