[00:54:34] <gsnedders> dstufft: so, to support a superset of all legacy environment markers, and be compatible with pip, we need "platform_in_venv" in packaging
[01:02:39] <gsnedders> dstufft: we already have some releases of setuptools disallowing some markers, I don't really want to end up with some releases of pip disallowing some markers
[01:04:21] <gsnedders> dstufft: otoh, I can't find anything on GitHub that uses it
[01:04:24] <dstufft> gsnedders: I mean, I don't think it's a good marker in general
[01:04:33] <dstufft> I didn't notice it existed or I would have pushed back against it
[01:06:31] <gsnedders> dstufft: i'd rather add it with some explicit deprecated warning and then remove it soonish, but maybe we can just get away with dropping it and seeing if anyone notices
[01:06:45] <dstufft> I'd just drop it and see if anyone notices
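A sketch of what dropping the marker means in practice for `packaging` users: standard PEP 508 markers parse and evaluate fine, while legacy setuptools-only names such as `platform_in_venv` are not in the grammar, so `packaging` rejects them at parse time (the exact marker strings below are just illustrative).

```python
# Standard PEP 508 markers evaluate normally; legacy names that were
# never standardized (like "platform_in_venv") fail to parse.
from packaging.markers import InvalidMarker, Marker

# A well-formed marker evaluates against the current environment.
print(Marker("python_version >= '2.7'").evaluate())

# The legacy setuptools-only marker is not a valid PEP 508 variable,
# so constructing it raises InvalidMarker.
try:
    Marker("platform_in_venv == 'True'")
except InvalidMarker as exc:
    print("rejected:", exc)
```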
[01:13:35] <gsnedders> https://github.com/pypa/pip/issues/3624 is a bug for the pip fun
[07:55:36] <ChrisWarrick> I can’t edit any of my packages in pypi, known issue?
[08:47:33] <ChrisWarrick> also, what the hell happened to file URLs?
[08:47:55] <ChrisWarrick> Do you realize how many package managers you managed to break by using non-predictable URLs like this garbage? https://pypi.python.org/packages/6f/ba/a6f5614ecdba894de4ad023039f78b7809d9db76a142b52bc289941b1f48/datecalc-0.1.0.tar.gz
[08:50:13] <ronny> ChrisWarrick: i don't see what's wrong with that one
[08:52:32] <ronny> ChrisWarrick: dstufft clearly stated the issues that happen there
[08:53:23] <ronny> ChrisWarrick: also - if package managers do not follow the official api and do random things - it's entirely acceptable to break them
[08:54:01] <ChrisWarrick> ronny: because talking to a fancy API is always feasible
[08:55:28] <ronny> ChrisWarrick: taking random shortcuts is broken :) and the api in question is a simple http GET to an actual index
[08:56:02] <ronny> let's face it, if people can't be bothered to do even that step, they don't care about correctness to begin with, and those kinds of tools tend to break and fall apart
[08:56:27] <ChrisWarrick> ronny: Great, add support for that to every single package build tool out there, and do the same for ALL THE OTHER APIS IN THE WORLD
[08:56:40] <ronny> ChrisWarrick: wrong kind of escalation
[08:58:21] <ronny> ChrisWarrick: but in the end, a packaging tool that doesn't care for correctness will break, it's the fault of the tool, not the fault of the api that works as designed before and after an implementation detail change
[08:58:54] <ronny> ChrisWarrick: the point of an api is to work, if the tool doesn't use it, then don't point to the api that's correct before and after a change for being at fault
[08:59:19] <ronny> tools that take on technical debt in any way have to pay it at some point, or someone else will be paying
[08:59:29] <ronny> incorrect use of an api is such a debt
[09:00:50] <ronny> so now if a valid change that was keeping the api intact broke a 3rd party tool it is entirely the fault of the 3rd party tool
[09:03:21] <ChrisWarrick> ronny: The API might exist. Package build tools, however, just cannot support every single idiosyncratic get-URL-of-package API out there. And if you want to be the definitive source of Python packages, shouldn’t you put in some extra effort to ensure it works for everyone?
[09:04:03] <ronny> the api is that effort, and the change was necessary because it was affecting users and moved towards more correct behaviour
[09:04:33] <ChrisWarrick> Package build tools, however, just cannot support every single idiosyncratic get-URL-of-package API out there.
[09:04:53] <ChrisWarrick> Especially one that requires parsing JSON or HTML responses.
[09:05:03] <ronny> well, then you either give them explicit urls, or watch them burn as being broken
[09:05:11] <FRidh> ChrisWarrick: on Nix we were also affected by this change. We also trusted the predictability of the urls. That doesn't work anymore. As the amount of Python packages in Nixpkgs has grown quite a bit, we're going to need a tool to autogenerate lists of urls and hashes. Whether using the "url API" or one of the actual APIs is a small difference then. Sure, you can't do it from bash but do you really want that?
[09:05:32] <ChrisWarrick> FRidh: You need support in your tooling.
[09:07:19] <ChrisWarrick> ronny: Here’s an idea that would keep everyone happy: just redirect old-style URLs to real files. Would that be so hard?
[09:08:38] <ronny> ChrisWarrick: potentially, the old codebase is __hairy__ - how about making a pull request - this one is mostly volunteer driven
[09:09:33] <ronny> the dependable solution is, use the api
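For concreteness, "use the api" here means PyPI's JSON index (`GET https://pypi.python.org/pypi/<name>/<version>/json`), whose response includes a `urls` list with the real download locations. A minimal sketch of the parsing ChrisWarrick objects to, using an abbreviated, illustrative sample response (real responses carry many more fields, and the digest below is elided):

```python
import json

# Abbreviated, illustrative sample of a PyPI JSON API response.
sample_response = """
{
  "urls": [
    {
      "filename": "datecalc-0.1.0.tar.gz",
      "packagetype": "sdist",
      "url": "https://pypi.python.org/packages/6f/ba/a6f5614ecdba894de4ad023039f78b7809d9db76a142b52bc289941b1f48/datecalc-0.1.0.tar.gz",
      "md5_digest": "..."
    }
  ]
}
"""

def sdist_url(response_text):
    """Pick the sdist download URL out of a JSON API response."""
    data = json.loads(response_text)
    for entry in data["urls"]:
        if entry["packagetype"] == "sdist":
            return entry["url"]
    return None

print(sdist_url(sample_response))
```

So the "extra effort" is one HTTP GET plus a few lines of JSON handling, rather than guessing a URL layout that is an implementation detail.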
[09:10:11] <ChrisWarrick> which requires extra effort from most distros in the world to support it in packaging tools
[09:11:21] <ChrisWarrick> unless you leech off the pypi.debian.net server
[09:11:29] <ronny> well, that's expected, proper solutions take more effort than quick hacks ...
[09:41:02] <FRidh> What does surprise me is that with older packages the urls are also updated. I do hope the urls keep working for the older packages.
[14:10:25] <dstufft> FRidh: URLs for older packages aren't going to stop working (at least until metrics show that nobody is using them anymore)
[14:12:42] <FRidh> dstufft: thanks, good to know. I just noticed that the JSON API did give updated urls.
[14:13:48] <dstufft> yes, all URLs (but 7 of them which have hash mismatches) are migrated now
[14:14:15] <dstufft> Example: https://pypi.python.org/pypi/Endure/0.2 Endure-0.2.tar.gz is broken, endure-0.2.tar.gz isn't broken.
[14:14:33] <dstufft> changing the URL scheme is part of fixing these problems.
[14:15:50] <dstufft> (which is also an example of how the URLs were never fully predictable, since they were case sensitive and people could upload either)
[14:27:09] <dstufft> ChrisWarrick: 500 errors appear to be fixed
[14:28:03] <ChrisWarrick> dstufft: do you have any plan with the URLs thing? Perhaps take that pypi.debian.net thing and make it official?
[14:30:13] <dstufft> (Also redirecting old style URLs to new style is a bit harder than it appears, we currently don't serve /packages/ with a dynamic web application it goes straight to S3 from VCL and VCL can't determine what the hash is to do the redirect inside of Varnish. So we need to either narrow down our regex that sends /packages/ to s3 so that it only sends the _hashed_ packages to S3... or something else)
[14:30:34] <dstufft> a different domain or something, i dunno
[14:31:13] <dstufft> ChrisWarrick: not sure yet. the migration just finished over night (which is what was causing the 500 errors, lots of updates to the biggest table in PyPI)
[14:34:09] <dstufft> Most likely I'll just narrow the regex hash down from ^/packages/ to ^/packages/[a-z0-9]{2}/[a-z0-9]{2}/[a-z0-9]{60}/ and then implement the redirect in Warehouse.
[14:38:01] <ChrisWarrick> most of those things are well-behaved HTTP clients (especially since they have to deal with redirects in other places)
[14:39:53] <dstufft> Yea, I'm just stating up front that if something doesn't respect a 301 (or maybe 302) redirect they're beyond the level of compatibility I care about
[17:52:31] <rardiol> what's the difference between gui_scripts and console_scripts in entry points?
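For context on rardiol's question: both groups declare the same kind of entry point, and the generated wrappers behave identically on POSIX; the difference only matters on Windows, where `console_scripts` wrappers are built as console executables (a terminal window is attached) and `gui_scripts` wrappers as GUI executables (no console window pops up). An illustrative `setup.py` fragment, with the package and callable names (`myapp`, `myapp.cli:main`, `myapp.gui:main`) being hypothetical:

```python
# Hypothetical entry_points mapping for a setup() call.
entry_points = {
    "console_scripts": [
        # CLI wrapper: gets a console on Windows.
        "myapp = myapp.cli:main",
    ],
    "gui_scripts": [
        # GUI wrapper: no console window on Windows.
        "myapp-gui = myapp.gui:main",
    ],
}
```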
[18:50:05] <ChrisWarrick> dstufft: thanks for fixing #438 in pip! Is pypi.io a dependable URL?
[18:51:02] <dstufft> ChrisWarrick: once Warehouse is ready to go live pypi.io is going to be the new PyPI URL and pypi.python.org will redirect to it, however pypi.io is currently pre-production grade, so we don't have any alerting on if it goes down, and if it's broken I won't stay up all night to fix it etc
[18:52:45] <dstufft> Oh, for anyone who is in here and who wasn't following the issue, try curl -L -I https://pypi.io/packages/source/p/packaging/packaging-16.7.tar.gz /cc prometheanfire gchristensen tdsmith
[19:48:39] <rardiol> what package do I need to do bdist_wheel?