[01:57:48] <njs> okay sillier question: is there a way to tell pip 'please use the *oldest* version of every package you can find that meets these requirements'?
[01:58:14] <dstufft> Nope, it's been asked for before and we were kinda iffy on adding it
[01:58:32] <dstufft> It probably wouldn't work well at all until #988 gets resolved anyways
[02:04:26] <njs> my use case is that I just discovered that our actual requirements went up, but we didn't notice because all our CI runs against the latest version
[02:04:59] <njs> so I would like to have a CI job that checks our declared requirements are actually up to date, which seems like a reasonable use case to me :-)
[02:05:03] <dstufft> Yea I understand the desire, it just will be very broken
[02:08:58] <njs> actually that's not too ridiculous, even with flit it would be easy to write a little script that reads the install-requires out of pyproject.toml and converts them into low-end pins
[02:09:09] <njs> it's too bad that it wouldn't apply to transitive requirements, but oh well
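A minimal sketch of the script described above: take each requirement string and turn its `>=` lower bound into an exact pin. The function name and the sample requirements are made up for illustration; a real version would first read the install-requires out of `pyproject.toml`.

```python
import re

def pin_minimums(requirements):
    """Turn '>=' lower bounds into '==' pins so CI installs the oldest
    declared version of each direct dependency (hypothetical helper)."""
    # Requirements without a '>=' bound are left untouched.
    return [re.sub(r">=", "==", req, count=1) for req in requirements]

print(pin_minimums(["attrs>=19.2.0", "sortedcontainers", "idna>=2.5"]))
# -> ['attrs==19.2.0', 'sortedcontainers', 'idna==2.5']
```

The output could be written to a requirements file and handed to pip in the CI job.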
[02:10:13] <dstufft> seems like if you applied it to transitive requirements you're very likely to find lots of bugs in other libraries :)
[02:11:11] <dstufft> that may be a wholesome outcome though! but likely to be frustrating as you become the min dependency task force for your entire dependency chain :D
[02:11:13] <njs> people don't talk much about this downside of having a robust testing setup, but it is very true
[02:11:43] <njs> our CI finds upstream bugs on a *regular* basis
[02:12:12] <dstufft> have you tried testing less things
[02:13:18] <njs> you seem to think that my goal is to have less pointless aggravation in my life
[02:14:13] <dstufft> you are active in packaging and you wrote a new I/O framework
[02:20:39] <njs> I'm not sure why lacking #988 is *necessarily* catastrophic... pip is already computing some kind of ranges, with.... more or less accuracy, let's say, but to the extent they work at all picking the bottom end doesn't seem harder than picking the top
[02:20:51] <dstufft> Neither answer here is right, because it's possible that the second library got compatibility with foo>1 at some point and changed its version specifier
[02:21:02] <njs> dstufft: oh I see, because right now pip actually does satisfy the >=2, but only by accident?
[02:21:39] <dstufft> njs: yea, generally the implication of #988 is that if people aren't capping their top end version (which most people don't) you don't really notice it because it happens to satisfy it
[02:23:37] <njs> next step, hypothesis-pip, which picks random package versions consistent with your declared requirements
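The hypothesis-pip joke in miniature: pick a random candidate version that is consistent with a lower bound. All names and version lists below are invented, and the dotted-integer parsing is a toy; real requirements would need a PEP 440 parser such as `packaging`.

```python
import random

def parse(v):
    # naive version key, enough for dotted-integer versions
    return tuple(int(p) for p in v.split("."))

def random_consistent_version(candidates, minimum):
    """Pick any candidate satisfying the bound, not just the newest."""
    ok = [v for v in candidates if parse(v) >= parse(minimum)]
    return random.choice(ok)

print(random_consistent_version(["1.0", "1.1", "1.2", "2.0"], "1.1"))
```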
[02:23:45] <dstufft> it's also possible that the library that depends on foo>=2 later widened its version range (perhaps foo is actually numpy and people got mad it was hard to upgrade their numpy or something) so it's entirely possible that some combination of dependencies will install foo-1
[02:24:40] <dstufft> though I guess if #988 is solved, then the resolver will just work in reverse, iterating over options until it picks the maximally lowest total set
[02:25:18] <dstufft> (this isn't to say this problem makes it a bad idea or whatever, perfect enemy of good and all that, but it's not clear what the right answer is!)
[02:25:42] <njs> I mean, in general there is no one right answer to resolving any set of requirements, even under the normal rules
[02:27:44] <njs> but any resolver is going to have some heuristic to pick between multiple acceptable packages – for a backtracker I guess it's some function that takes a set of candidates and sorts them to determine which order it explores the possible branches in
[02:28:06] <njs> just flipping the sort order in that one place seems like a pretty reasonable thing to do
[02:28:47] <njs> the goal of testing "old versions" is already undefined and somewhat misguided; it's totally possible that you say foo >= 1.1, and it turns out 1.1 and 1.3 work great but 1.2 (which you didn't test) is totally broken
[02:29:29] <njs> but trying "newest" versions and "oldest" versions is probably the biggest bang-for-buck in terms of probability of catching bugs with a small number of checks
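The "one place" being pointed at can be sketched as a candidate-ordering function: the same sort with the direction flipped gives newest-first or oldest-first exploration. This is a simplification with hypothetical names, not pip's actual resolver code.

```python
def parse(v):
    # naive version key; a real resolver uses PEP 440 parsing
    return tuple(int(p) for p in v.split("."))

def ordered_candidates(candidates, newest_first=True):
    # A backtracking resolver explores branches in this order; flipping
    # newest_first is the entire "prefer oldest versions" change.
    return sorted(candidates, key=parse, reverse=newest_first)

print(ordered_candidates(["1.1", "1.3", "1.2", "2.0"]))
# -> ['2.0', '1.3', '1.2', '1.1']
print(ordered_candidates(["1.1", "1.3", "1.2", "2.0"], newest_first=False))
# -> ['1.1', '1.2', '1.3', '2.0']
```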
[02:30:36] <njs> oh fun wrinkle, apparently we're still using the old pre-PEP 508 way to do conditional dependencies, I wonder if that still has any value or not
[02:31:00] <dstufft> obviously you just need to test the entire combinatorial set of all possible dependencies
[02:31:23] <dstufft> where dependencies also includes OS-level dependencies >:]
[02:31:28] <njs> dstufft: yeah the hypothesis-pip thing is kind of a joke only... not really
[02:32:25] <njs> dstufft: UGH YES you joke but I literally filed a bug last week to ask if dependabot could start managing the OS version for me please: https://github.com/dependabot/feedback/issues/346
[02:33:27] <njs> (that's not about checking the range of dependencies, but that would be the obvious next step)
[02:34:26] <njs> dstufft: btw do you remember off-hand when pip started supporting PEP 508
[02:35:30] <dstufft> njs: for wheels or for sdists
[02:36:22] <dstufft> IIRC the answer for wheels is forever
[02:36:33] <dstufft> the answer for sdist I think is like, pip 8ish?
[02:36:56] <njs> I thought for wheels you had to use that weird magic extras syntax
[02:37:03] <njs> did that always compile into the same thing?
[02:38:13] <dstufft> This is going way back in esoteric memory for me, but IIRC *all* forms of environment markers have always compiled into the same thing inside the wheel file itself
[02:38:36] <dstufft> there's just been 3 different ways to spell them for sdists as we transitioned from "here's this weird hack in setup.cfg" to "here's a weird extras syntax for reasons I don't remember" to "oh hey, here's real support"
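For reference, the three spellings look roughly like this; the `enum34` / `python_version` example is illustrative, not taken from the log.

```python
# 1) The old build-time hack: conditionals run when setup.py executes, so
#    the sdist's metadata depends on the environment it was built in:
#
#      if sys.version_info < (3, 4):
#          install_requires.append("enum34")
#
# 2) The "weird extras syntax": an extras_require key that is really an
#    environment marker (note the leading colon):
#
#      extras_require={":python_version<'3.4'": ["enum34"]}
#
# 3) Real PEP 508 support, a marker attached to the requirement itself:
#
#      install_requires=["enum34; python_version < '3.4'"]
```

Spellings 2 and 3 compile to the same marker-bearing `Requires-Dist` line once built into a wheel; spelling 1 bakes the decision in at build time, which is why it was a hack.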
[02:38:47] <njs> well, maybe the thing to do is to just switch and see if anyone notices :-)
[02:39:20] <dstufft> IIRC the implication is your environment markers get ignored
[02:39:32] <dstufft> if you use PEP 508 markers in too old pips inside of sdist
[02:39:37] <njs> the comment says that the original reason we switched to using extras was because of an old bug in setuptools, but that's fixed now. And I guess people on python 3 are mostly not using pip 1.5.4? I sure hope so anyway
[02:41:36] <dstufft> last 30 days, what versions of pip downloaded a file for trio from PyPI
[02:41:57] <njs> oh heh, right, I was trying to figure out how to restrict to python 3 :-)
[02:42:21] <njs> I guess trio gets enough downloads these days for that query to be somewhat meaningful though, nice
[02:42:38] <dstufft> https://bpaste.net/show/7d1f8fca3d75 the query I used FWIW
[02:45:14] <dstufft> (unrelated, I'm kinda jazzed by the fact pip made 180 million /simple/ requests in the 18h40m that the new linehaul was sending data on 2/6)
[02:45:31] <njs> oh heh, and python_requires is only in 9.0.0+, so I bet some of those ancient pips are trying to install trio on python 2
[02:46:36] <dstufft> python versions, trio, last 30 days
[02:52:13] <dstufft> So the new linehaul started logging data at 5:20 AM UTC on 2/6, but from there until midnight 2/7 UTC, pip was used to make 180 million requests to /simple/ on PyPI. If I extrapolate that out, that's like 230 million http requests a day. Add in the 83m from file downloads and pip is making 310ish million http requests to PyPI a day.
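A quick back-of-envelope check of those numbers, using only the figures from the message above:

```python
hours = 18 + 40 / 60              # 18h40m observation window
per_day = 180e6 / hours * 24      # extrapolate /simple/ requests to a day
total = per_day + 83e6            # add the ~83M file downloads
print(round(per_day / 1e6))       # -> 231  (the "like 230 million")
print(round(total / 1e6))         # -> 314  (the "310ish million")
```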
[03:01:03] <dstufft> I'm kind of sad we switched Fastly services so the old historical data can't be graphed with the new data
[03:04:46] <njs> I like how in that last screenshot 3.8 is more popular than 2.7
[03:10:41] <dstufft> njs: 2.7 is more popular than 3.4 tho
[03:12:02] <njs> well, 7 is bigger than 4, that probably explains it
[19:17:59] <adamchainz> I dropped Python 2 support from some packages but accidentally released these as universal wheels. I've removed those, but the top-level requires_python is still showing as empty in the metadata on PyPI (e.g. https://pypi.org/pypi/django-modeldict-yplan/json ). This causes pip on Python 2 to download the new version but then fail to install. Is there any way of updating this metadata?
[19:21:15] <ngoldbaum> adamchainz: do another py2-supporting point release
[19:21:25] <ngoldbaum> then do a py3-only release with requires_python
[19:28:22] <adamchainz> thanks, fair that the metadata is immutable
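The suggested fix, sketched as the setup call for the follow-up Python-3-only release. The version number and exact floor are placeholders; the point is `python_requires`, which pip 9.0+ reads from PyPI, so Python 2 users get served the last compatible release instead of this one.

```python
from setuptools import setup

setup(
    name="django-modeldict-yplan",   # package from the discussion above
    version="X.Y.Z",                 # placeholder for the py3-only release
    python_requires=">=3",           # pip >= 9.0 honors this and will not
                                     # offer this release on Python 2
)
```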