#pypa-dev logs for Friday the 3rd of April, 2020

[14:07:49] <sumanah> hey techalchemy .... hope you are holding up ok!
[14:09:10] <sumanah> I see from https://dev.azure.com/pypa/pipenv/_build/results?buildId=21264&view=logs&jobId=e5c2fdf2-ea94-5e63-a966-a1772ac249a4&j=e5c2fdf2-ea94-5e63-a966-a1772ac249a4 that Azure is failing _differently_ now?
[14:17:28] <sumanah> I updated the release tracking issue to reflect that April is now the projected release month and that you're cleaning up a big chunk of technical debt with fixing the CI setup
[14:20:21] <sumanah> (also, test suites are uncomfortably reminiscent of a Skinner box https://en.wikipedia.org/wiki/Operant_conditioning_chamber )
[19:48:02] <sumanah> di_codes: thank you for your work on improving the infrastructure of metadata validation!
[19:48:24] <sumanah> I used to work with someone who would sign off his emails with "peace, love, and metadata" and I wish you all of those things as well di_codes
[19:59:33] <sumanah> for those who were not watching: https://github.com/pypa/packaging/issues/147 and https://github.com/pypa/warehouse/pull/7582 reflect that there is a new standalone pkg https://pypi.org/project/trove-classifiers/
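
As a quick sketch of what the standalone package enables, assuming the `classifiers` set that trove-classifiers documents as its public API:

    # Validate a trove classifier against the canonical list before upload.
    from trove_classifiers import classifiers

    candidate = "Programming Language :: Python :: 3.8"
    if candidate not in classifiers:
        raise ValueError(f"unknown trove classifier: {candidate!r}")
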
[20:05:23] <PSFSlack> <di> sumanah: and to you as well :slightly_smiling_face:
[20:06:13] <sumanah> :-)
[20:06:42] <sumanah> as it approaches 5pm ET: techalchemy I hope you will consider taking the weekend off. But please ignore me if that's not what you want
[20:07:11] <sumanah> (I mean I know it's not even 4:10, but, like, in the grand scheme of things, it's near the end of the workweek)
[20:07:21] <techalchemy> PSF slack eh
[20:08:07] <techalchemy> also thanks sumanah, i'm just glad i got windows tests running... apparently azure has decided to start prompting for host key verification when using git over ssh
[20:08:12] <sumanah> di_codes: I am taking a fresh look at https://github.com/pypa/packaging-problems/issues/264 on metadata strictness and a bit of discussion from https://discuss.python.org/t/increasing-pips-pypis-metadata-strictness-summit-followup/1855/9 and trying to disentangle what else to ask for soon
[20:08:30] <sumanah> techalchemy: oh! glad you figured that out!
[20:08:32] <techalchemy> but that was not being written to stdout/stderr
[20:09:25] <techalchemy> yeah also i couldn't spin up any windows vms locally because of some weird interactions between hw virtualization and secure boot + changes to the kernel recently i think, so i wound up just spinning up a vm in GCE to figure that out
[20:09:46] <PSFSlack> <di> yeah, got tired of running my IRC bouncer so I asked ernest to enable the bridge for these channels as well
[20:10:38] <techalchemy> verifying metadata is like the single most important thing we can do imho
[20:14:10] <sumanah> di_codes: I am grateful to be able to communicate with you in a way that works for you. Thanks, Ernest.
[20:14:32] <sumanah> techalchemy: so we've made some progress in the last few months, which is great....
[20:14:49] <techalchemy> sumanah, just getting to the bottom of the last few tests
[20:15:06] <techalchemy> I think there is a bug in pip actually
[20:15:32] <techalchemy> I think it doesn't resolve short paths on windows / thinks they are distinct from long paths
[20:16:23] <techalchemy> AssertionError: Egg-link c:\users\runner~1\appdata\local\temp\pipenv-mv0dt017-test\six does not match installed location of six (at c:\users\runneradmin\appdata\local\temp\pipenv-mv0dt017-test\six)
[20:20:54] <techalchemy> pradyunsg, am i wrong ^
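
The mismatch above is between an 8.3 "short" path (runner~1) and the long form (runneradmin) of the same directory. A sketch of normalizing the two on Windows via the Win32 API, reusing the paths from the traceback purely for illustration:

    # Expand a Windows 8.3 short path to its long form so both spellings of
    # the same directory compare equal; a no-op on other platforms.
    import ctypes
    import os

    def long_path(path: str) -> str:
        if os.name != "nt":  # GetLongPathNameW only exists on Windows
            return path
        buf = ctypes.create_unicode_buffer(260)
        ctypes.windll.kernel32.GetLongPathNameW(path, buf, len(buf))
        return buf.value or path

    short = r"c:\users\runner~1\appdata\local\temp\pipenv-mv0dt017-test\six"
    long_ = r"c:\users\runneradmin\appdata\local\temp\pipenv-mv0dt017-test\six"
    # On that CI machine these should normalize to the same location:
    print(os.path.normcase(long_path(short)) == os.path.normcase(long_path(long_)))
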
[20:24:19] <sumanah> Warehouse started hard failing package uploads on invalid markup with explicit description type https://github.com/pypa/warehouse/issues/3285 , and there's progress on the yanking of old releases https://github.com/pypa/warehouse/pull/5838 , and we now check that .tar.gz files are actually just that https://github.com/pypa/warehouse/pull/5609
[20:27:48] <PSFSlack> <di> "verifying metadata is like the single most important thing we can do imho" not sure I really understand this
[20:28:15] <PSFSlack> <di> we're already verifying metadata, we're just moving the logic to a place where others (like twine) can verify it before trying to upload it to PyPI and get it rejected there instead
[20:28:35] <PSFSlack> <di> the quality of the metadata remains the same
[20:31:05] <pradyunsg> ^ +1
[20:31:13] <techalchemy> di: i mean, verifying metadata is kind of meaningless if we still let people upload things
[20:31:27] <pradyunsg> techalchemy: you're asking the wrong person about Windows quirks. :)
[20:31:29] <techalchemy> metadata verification needs to happen in warehouse imho, not in the upload client
[20:31:41] <techalchemy> validation is probably the word i need to be using
[20:32:04] <PSFSlack> <di> it'll still happen there. we're just making it possible to do the exact same validation PyPI will do without the round-trip to pypi
[20:32:11] <techalchemy> yes, we should have tools to allow maintainers to know if they are meeting standards too, but accepting artifacts that don't conform is bad
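
The client-side check that exists today is `twine check dist/*`, which verifies that the long description will render. A sketch of the same check done directly with readme_renderer (the library both twine and PyPI use), assuming a local README.rst:

    # render() returns None when the reST would fail to render on PyPI;
    # warnings explaining why are written to the supplied stream.
    import io

    import readme_renderer.rst

    warnings = io.StringIO()
    with open("README.rst") as f:
        rendered = readme_renderer.rst.render(f.read(), stream=warnings)
    if rendered is None:
        print("description would be rejected:", warnings.getvalue())
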
[20:32:43] <pradyunsg> techalchemy: a GH issue (or zulip topic 😬) might be a better way to reach out to people with some idea of both pip's internals and Windows. :P
[20:33:23] <pradyunsg> techalchemy: we don't, other than those that don't comply with manylinux.
[20:33:48] <techalchemy> @di right, my only point is that preventing people from uploading things that don't follow metadata standards is the topic I have had the most conversations about with other package manager maintainers, etc
[20:33:51] <pradyunsg> @di IIUC, the plan is to enhance packaging to have those checks, and to make twine check do them as well, correct?
[20:33:55] <techalchemy> pradyunsg, i mean, we have setup.py...
[20:34:18] <sumanah> so techalchemy I do recommend you take a fresh look at https://github.com/pypa/packaging-problems/issues/264
[20:34:18] <pradyunsg> techalchemy: there's obviously stuff we can't protect ourselves from. :)
[20:34:18] <PSFSlack> <di> pradyunsg: yes, and setuptools, so you can't build invalid sdists, etc.
[20:34:54] <sumanah> "Increasing pip's & PyPI's metadata strictness" which is the issue summarizing TODOs on this topic, from the meeting at PyCon last year
[20:34:56] <pradyunsg> @di oh, nice! That'd be great to have.
[20:35:01] <PSFSlack> <di> techalchemy: sure, what i'm saying is that you already can't upload something that doesn't follow "metadata standards".
[20:35:29] <techalchemy> @di my complaint isn't that the sdist itself is invalid, just that the metadata can be -- e.g. requirement and extras specifiers, python_requires specifiers, etc
[20:35:40] <techalchemy> if that's true then it's news to me i guess
[20:35:46] <techalchemy> it's great news if so
[20:36:39] <PSFSlack> <di> techalchemy: yeah, pypi verifies the specifiers too, so that shouldn't be possible. now, they could be over-constrained, but that's not "invalid" per the metadata specs
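
A sketch of the specifier validation being described, using the `packaging` library that Warehouse relies on; malformed values raise at parse time:

    # Parse a requirement (with extras) and a Requires-Python-style value;
    # invalid metadata of this kind fails loudly rather than being accepted.
    from packaging.requirements import InvalidRequirement, Requirement
    from packaging.specifiers import InvalidSpecifier, SpecifierSet

    try:
        Requirement("requests[security]>=2.8.1,==2.8.*")  # requirement + extras
        SpecifierSet(">=3.6")                             # python_requires
    except (InvalidRequirement, InvalidSpecifier) as exc:
        print(f"would be rejected: {exc}")
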
[20:36:43] <techalchemy> @ sumanah that's the exact discussion i was thinking of
[20:37:32] <techalchemy> right -- i mean we have to believe what people say their package is and needs
[20:37:43] <PSFSlack> <di> techalchemy: which is why I don't understand discussions about "increasing PyPI's metadata strictness" -- it's already as strict as it can be to conform with the specs, and if there's something missing, that's a bug
[20:38:05] <techalchemy> @di I didn't realize it had been implemented since the conversations last year
[20:38:38] <PSFSlack> <di> techalchemy: it's been implemented since warehouse became PyPI in 2018
[20:38:40] <sumanah> so maybe you would revise your statement to "verifying metadata is like the single most important thing we can do imho.... so I'm really glad we're doing it" ;-)
[20:42:51] <sumanah> di_codes: I've gotten a bit confused about what we've already achieved and what still needs checking/tightening up, lemme look again at this issue
[20:42:56] <techalchemy> @di I didn't realize the server side work was already sorted out -- thanks for your work on that then!
[20:43:20] <PSFSlack> <di> techalchemy: wasn't me!
[20:43:36] <techalchemy> well thanks for filling me in anyway :p
[20:43:49] <techalchemy> seems like the most important piece of the puzzle anyway
[20:43:49] <PSFSlack> <di> techalchemy: you're welcome!
[20:44:01] <sumanah> di_codes: I think you've been a part of that work IMO
[20:44:06] <sumanah> reviewing, nudging, etc.
[20:44:48] <sumanah> di_codes: ok so I have a few questions -- I may be missing something
[20:45:05] <sumanah> does Warehouse currently reject uploads if they lack python_requires?
[20:45:08] <PSFSlack> <di> sumanah: eh, most of that was already done before I started working on warehouse. deciding what can/can't be uploaded is kind of central to a pypi implementation
[20:45:14] <sumanah> :-)
[20:45:39] <sumanah> I see a note "For packages where no restrictions on Python version are desired, a “python_requires==*” would be satisfactory"
[20:46:13] <PSFSlack> <di> sumanah: no, because it's an optional field
[20:46:55] <PSFSlack> <di> sumanah: I left a comment about python_requires specifically here: https://github.com/pypa/packaging-problems/issues/264#issuecomment-578950775
[20:47:03] <sumanah> Right - I just saw that as I reread
[20:47:23] <sumanah> (as I read multiple times, different things make sense to me or jump out at me. Sorry for the redundancy di_codes )
[20:47:46] <sumanah> I saw there was a note: "Also, can we fail when there's missing author, author_email, URL. Currently warnings on setup" so I should go check as to which of those are optional
[20:48:31] <sumanah> ok! so the only required fields, I am reminded, are:
[20:48:31] <sumanah> Metadata-Version
[20:48:31] <sumanah> Name
[20:48:31] <sumanah> Version
[20:49:11] <sumanah> so never mind re: my question about those, then
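
Core metadata is a set of RFC 822-style headers, so checking those three required fields takes only the stdlib; a sketch against a local PKG-INFO file (the filename is assumed):

    # Verify the three required core-metadata fields are present.
    from email.parser import HeaderParser

    REQUIRED = ("Metadata-Version", "Name", "Version")

    with open("PKG-INFO") as f:
        headers = HeaderParser().parse(f)

    missing = [field for field in REQUIRED if headers[field] is None]
    print("missing required fields:", missing)
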
[20:49:44] <techalchemy> yeah, at least for what I personally care about, it's just that for things that _are_ populated, they should contain valid metadata (I guess warehouse parses this to make the json api anyway)
[20:49:58] <sumanah> Running auditwheel on new manylinux uploads https://github.com/pypa/warehouse/issues/5420 . di_codes I see you said last year "while I do think PyPI should eventually start auditing uploaded wheels, this is not going to happen overnight and/or without fair warning."
[20:50:26] <sumanah> and we agreed it was blocked on 2-phase upload/"package preview"
[20:50:54] <sumanah> (also di_codes if now is a bad time to dive into this, please say so and I'll switch mediums)
[20:52:41] <sumanah> di_codes: I remember a while back you were working on package preview/2-phase upload https://github.com/pypa/warehouse/issues/726 . Is that the case? I may be mistaken
[20:53:34] <PSFSlack> <di> sumanah: haven't started work on that yet. I don't think wheel-auditing is a blocker for that (or vice versa)
[20:53:45] <sumanah> oh good
[20:53:56] <sumanah> then I would love for us to start auditing those wheels. I'll leave a comment on that issue
[20:54:20] <PSFSlack> <di> wheel-auditing just requires all the infra to run auditwheel in a sandbox or something. we don't have anything like that now
[20:54:22] <sumanah> pradyunsg: https://github.com/pypa/warehouse/issues/69 "Clean database of UNKNOWN and validates against it" is this something that would aid the pip resolver work? I don't think so but I want a 2nd opinion
[20:54:59] <PSFSlack> <di> or maybe a sandbox wouldn't be totally necessary? I'm not sure if auditwheel is 100% static analysis
[20:55:06] <techalchemy> the resolver stuff is mostly hampered by the Requires-Python metadata issue
[20:55:06] <pradyunsg> sumanah: probably not?
[20:55:23] <PSFSlack> <di> techalchemy: the issue being that the field is not required?
[20:55:42] <pradyunsg> @di: auditwheel is 100% static, as per my understanding.
[20:56:04] <pradyunsg> But I could be wrong.
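
For reference, auditwheel's analysis can be exercised out of process; `auditwheel show` reports which manylinux policy a wheel actually satisfies (the wheel filename here is hypothetical):

    # Run auditwheel's static analysis on a built linux wheel.
    import subprocess

    result = subprocess.run(
        ["auditwheel", "show", "example-1.0-cp38-cp38-linux_x86_64.whl"],
        capture_output=True,
        text=True,
    )
    print(result.stdout)
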
[20:56:07] <techalchemy> the issue being that the field can be (or at least used to be) something you didn't have to populate with a specifier in all cases, so there are weird variants in there that require some additional cleaning
[20:57:08] <techalchemy> iirc some legacy packages have very bizarre values in there
[20:57:13] <pradyunsg> techalchemy: I think what you're asking for is more consistency in the values of Python-requires when specified, *and separately* that the value be made non-option, correct?
[20:57:58] <sumanah> (the value be made mandatory, right?)
[20:57:59] <techalchemy> I mean sure I would love things to be required but the more difficult challenge for me has always been the things that are already populated but populated with invalid values
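
A sketch of spotting those already-populated-but-invalid values through PyPI's JSON API; the project name is just an example:

    # Fetch requires_python from the JSON API and test whether it parses as
    # a valid specifier set.
    import json
    from urllib.request import urlopen

    from packaging.specifiers import InvalidSpecifier, SpecifierSet

    with urlopen("https://pypi.org/pypi/requests/json") as resp:
        requires_python = json.load(resp)["info"].get("requires_python")

    try:
        SpecifierSet(requires_python or "")
    except InvalidSpecifier:
        print(f"legacy value needs cleaning: {requires_python!r}")
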
[20:58:36] <pradyunsg> sumanah: yup. My phone's keyboard did a weird autocorrect.
[20:58:51] <PSFSlack> <di> definitely can't go back in time and edit metadata. yanks will help though
[20:59:07] <pradyunsg> +1 on that.
[20:59:50] <techalchemy> yeah, I don't think there's a real solution to that problem which is why I am just looking ahead at new uploads
[21:00:04] <techalchemy> as long as those are being validated it's probably the best we can do for now
[21:00:51] <sumanah> so in https://github.com/pypa/warehouse/issues/474#issuecomment-370986838 Nick talked about retrofitting metadata for old releases ... di_codes it sounds like you're saying we can't/won't do that. is that a policy thing, technical feasibility, both?
[21:01:06] <techalchemy> i'm guessing policy
[21:02:05] <sumanah> pradyunsg: would it help the resolver work if PyPI ran auditwheel on manylinux uploads, stopping less-valid wheels from being uploaded? (I have suspicions that the answer is yes)
[21:02:23] <PSFSlack> <di> policy. we've done it literally once for a crucial django release and people flipped out
[21:02:38] <techalchemy> yeah modifying metadata could cause problems
[21:02:54] <sumanah> di_codes: ok! understood. thanks
[21:03:04] <sumanah> I remember the Django Incident
[21:03:06] <sumanah> (I think)
[21:03:12] <PSFSlack> <di> policy is generally: once it's on PyPI it never changes
[21:03:54] <sumanah> yeah. I remember. dredging up those memories from the ancient Before Time back when I worked more on PyPI
[21:04:31] <sumanah> I see someone said in that discussion at PyCon last year: "Could we explore banning old upload clients from PyPI?" is there still any reason to do that, now that we've maybe solved the "bad metadata" upload problem at a different abstraction layer?
[21:05:00] <techalchemy> IMHO it shouldn't matter what is putting the bits into the api if the api validates them
[21:05:06] <PSFSlack> <di> sumanah: seems like it would make sense to ban old metadata versions instead
[21:05:12] <techalchemy> +1
[21:05:14] <sumanah> HMMM.
[21:05:38] <sumanah> Looking now at https://packaging.python.org/specifications/core-metadata/#metadata-version
[21:05:40] <PSFSlack> <di> https://github.com/pypa/warehouse/issues/7403 could help with that
[21:06:11] <dstufft> banning old metadata versions doesn't work I think
[21:06:16] <PSFSlack> <di> (to track usage of different versions)
[21:06:27] <sumanah> dstufft: I'm listening
[21:06:41] <PSFSlack> <di> yeah, IIRC wheel will use the minimum metadata version or something like that
[21:06:56] <dstufft> IIRC distutils (and I suspect setuptools does as well, as an extension to distutils) only emits the minimum metadata version that the project needs
[21:06:57] <PSFSlack> <di> if you're not trying to use fields from newer versions
[21:07:05] <dstufft> so if you have a thing that only has a name and version, it'll be metadata 1.0
[21:07:30] <dstufft> which makes sense, that's, in theory, more compatible
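
One way to see which Metadata-Version a build tool actually emitted is to read METADATA out of the wheel it produced; a sketch with illustrative filenames:

    # Inspect the Metadata-Version a build backend wrote into a wheel.
    import zipfile
    from email.parser import HeaderParser

    with zipfile.ZipFile("example-1.0-py3-none-any.whl") as whl:
        with whl.open("example-1.0.dist-info/METADATA") as f:
            meta = HeaderParser().parsestr(f.read().decode())

    print(meta["Metadata-Version"])  # e.g. "1.0" for a bare name+version project
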
[21:10:43] <sumanah> I'm thinking about this
[21:11:23] <sumanah> dstufft: can you help me understand -- when did we start versioning metadata with a proper Metadata-Version?
[21:11:34] <techalchemy> Metadata-Version: 2.1
[21:11:39] <dstufft> sumanah: Pretty sure Metadata-Version has existed forever
[21:11:49] <techalchemy> just with 'name' and 'version' in setup.py
[21:12:01] <dstufft> It used to be 1.0
[21:12:14] <dstufft> maybe it's changed in the interim so we no longer emit the lowest common denominator for Metadata-Version
[21:12:16] <PSFSlack> <di> sumanah: https://www.python.org/dev/peps/pep-0241/
[21:12:16] <techalchemy> figured i'd try it out
[21:12:25] <dstufft> It def *used* to work that way
[21:12:27] <sumanah> Right - just looked at that on my own di_codes :-)
[21:12:40] <dstufft> I think we went, 1.0, 1.1, 1.2, 2.1?
[21:12:44] <dstufft> or so
[21:12:48] <techalchemy> dstufft, do you think it's distutils specific or setuptools specific?
[21:13:10] <sumanah> I'm reading https://packaging.python.org/specifications/core-metadata/#metadata-version now "For broader compatibility, build tools MAY choose to produce distribution metadata using the lowest metadata version that includes all of the needed fields."
[21:13:11] <techalchemy> i can try older versions of setuptools or i can try older versions of python
[21:13:13] <dstufft> techalchemy: the original logic was def in distutils, and I think setuptools inherited by nature of being an extension to distutils
[21:13:18] <techalchemy> ok
[21:13:20] <techalchemy> lemme try 2.7
[21:13:27] <PSFSlack> <di> don't forget about 2.0, the spooky metadata version that never was: https://github.com/pypa/warehouse/blob/master/warehouse/forklift/legacy.py#L401
[21:13:38] <dstufft> I haven't looked at that code in a long time
[21:14:02] <dstufft> so it's entirely likely it changed out from underneath me at some point
[21:14:11] <dstufft> but regardless, I think the point still stands, we don't require people to emit the latest metadata
[21:14:21] <dstufft> conceptually
[21:14:28] <dstufft> so it would be a PEP level change I think
[21:15:00] <techalchemy> if you want all of the tools to be compatible i guess it would yeah
[21:15:16] <sumanah> dstufft: ok. I get that this has been our approach so far. I think I am having trouble thinking about the use case for favoring compatibility-with-old-tools over making-them-do-what-I-want
[21:15:20] <techalchemy> i mean, tools can still emit the lowest common denominator whether warehouse accepts them or not :p
[21:16:01] <dstufft> sumanah: in theory tools should warn and/or error if they get a package with a Metadata-Version that they don't understand right?
[21:16:16] <dstufft> because it might have important information in it that they don't know how to understand
[21:16:27] <sumanah> (I sense a trap) yes, I think so, yes, I follow
[21:17:31] <dstufft> so if we release a Metadata-Version 2.2, all of those tools are going to suddenly reject/warn anything that uses Metadata-Version 2.2, even if that project is using nothing from Metadata 2.2 and is otherwise (besides the Metadata-Version) a 2.1 compatible version
[21:17:54] <techalchemy> well I think the goal would be to set a minimum
[21:18:13] <techalchemy> and yeah pushing out new metadata versions should happen rarely and should be as backward compatible as possible
[21:19:05] <dstufft> I guess the other thing is, what value do you get from setting a minimum? All of those metadata <old> things still exist on PyPI and so tooling is going to generally still need to be able to understand them
[21:20:45] <techalchemy> a guarantee that new artifacts have more thorough and complete information since it helps with discovery, search, gives a more complete understanding of what is actually in pypi, i mean
[21:21:25] <dstufft> how does requiring Metadata 2.1 do that?
[21:21:45] <sumanah> so, I may be misunderstanding the causal chain, but:
[21:22:21] <techalchemy> I guess it's a question of do you really think people should still be uploading artifacts that only contain a name and a version? If we did a review of other packaging ecosystems we'd find a lot of other information like SPDX license specifiers, URLs, some even have mandatory screenshots (not really relevant to us but still)
[21:22:22] <sumanah> step a. increase what's necessary in Metadata, like, 2.2 to require a Requires-Python field
[21:23:01] <sumanah> step b. plumb that through all the tooling
[21:23:03] <dstufft> We can mandate things in Warehouse that aren't mandated in the Metadata spec
[21:23:14] <sumanah> (now various resolvers have a much easier time)
[21:23:39] <dstufft> This can be super useful to make it easier for projects that are not meant to be uploaded to PyPI
[21:24:14] <dstufft> like just for instance, SPDX license specifiers... if your package is never meant to be distributed to the public, being forced to set that is just make-work
[21:24:25] <sumanah> and then step c. use the 2-phase upload to give a heads-up to the people who are uploading to PyPI....... ok I'll stop this thinking-aloud and answer your point
[21:24:54] <techalchemy> dstufft, but SPDX specifiers cover all of the licenses, you can still upload a project and type in the name of your license now
[21:24:55] <sumanah> dstufft: so I think this is where I was trying to respond to what di_codes had said, which is that if we want people to upload .... hold on
[21:25:27] <sumanah> if we want people to specify Requires_Python and we want Warehouse to mandate it, then we should change the metadata specification
[21:25:30] <techalchemy> at least if someone uses an spdx license and i'm programmatically consuming metadata i will know i can't redistribute
[21:26:01] <dstufft> I don't think we should change the metadata spec for requires-python
[21:26:09] <sumanah> techalchemy: I can't tell: are you arguing that we should mandate SPDX for PyPI upload?
[21:26:17] <techalchemy> no just using it as an example
[21:26:31] <dstufft> Like I said, Warehouse can mandate things more easily than the metadata spec can
[21:26:48] <dstufft> because the metadata spec has to function for all things, even things that never get released to PyPI
[21:27:10] <techalchemy> hrm
[21:27:42] <techalchemy> does the information warehouse displays about a package even necessarily need to tie to package metadata?
[21:27:52] <sumanah> ok, I agree with dstufft that it's overkill to require the creators of all packages to do SPDX paperwork, and I get how techalchemy would really prefer that all new Warehouse packages have SPDX metadata
[21:28:12] <techalchemy> well nobody has to do paperwork you just pick a specifier
[21:28:18] <dstufft> specific to SPDX I'm on the fence for requiring it on PyPI
[21:28:21] <sumanah> techalchemy: I am calling that paperwork.
[21:28:31] <dstufft> but hard -1 for just random packages that exist on my local computer
[21:28:37] <sumanah> techalchemy: so you're thinking that, alternatively, Warehouse could do some smart parsing of the repository or something? instead of tying to package metadata?
[21:28:45] <techalchemy> dstufft, agree actually
[21:28:57] <sumanah> ok but there's a point here that I really don't want to get lost:
[21:29:06] <techalchemy> sumanah, no, thinking that there could be a web UI for a maintainer to add some extra info
[21:29:29] <sumanah> it sounds like di_codes and dstufft disagree about the levels of Warehouse mandates outside the metadata spec
[21:29:42] <sumanah> especially re: Requires-Python
[21:29:59] <sumanah> am I reading that right?
[21:30:08] <dstufft> I'm not sure what di_codes's position is
[21:30:22] <sumanah> linking now: https://github.com/pypa/packaging-problems/issues/264#issuecomment-578950775
[21:30:39] <techalchemy> e.g. https://i.imgur.com/EtxVzau.png <- here is what the canonical snap store UI looks like for packages
[21:30:48] <sumanah> "For things like "warn on missing python_requires", we could have this today if we made this field required in the metadata specification." and I think earlier today in the backscroll he reiterated that would be the approach he would prefer
[21:31:25] <techalchemy> there is also a build metadata file but what you see in the UI doesn't come from there
[21:31:45] <techalchemy> @di since he's on slack
[21:32:14] <dstufft> techalchemy: I think the answer to your question is there is nothing technically preventing it, but I'm not sure that it's something we would want to do. I would have to think about it more
[21:32:27] <techalchemy> yeah not sure either
[21:32:33] <techalchemy> never really thought about it
[21:32:41] <dstufft> PyPI has done it before in the past
[21:32:47] <dstufft> bugtrack_url was a UI only thing
[21:33:01] <EWDurbin> It feels like a pretty bad idea to blur the line we've drawn around modification of metadata post upload.
[21:33:14] <sumanah> So then we would have to explain to maintainers "there are 2 places you need to modify things, and one of them cannot be automated"
[21:33:23] <dstufft> Yea basically
[21:33:40] <techalchemy> for one I find it kind of odd when I login to pypi and can't do anything really
[21:34:00] <techalchemy> in terms of readme modifications etc
[21:34:00] <dstufft> you can delete things!
[21:34:01] <dstufft> and add maintainers!
[21:34:02] <dstufft> remove maintainers!
[21:34:03] <dstufft> :D
[21:34:11] <techalchemy> yeah no i mean there is a lot of like actual important package maintenance i can do
[21:34:20] <techalchemy> i just mean aesthetically
[21:34:30] <dstufft> We certainly wouldn't want to allow you to modify metadata different than what is inside the package
[21:34:52] <sumanah> techalchemy: so it sounds like we are making different design choices than some other package management sites
[21:34:58] <dstufft> if we implemented it, it would have to be wholly distinct metadata that didn't exist in the package at all
[21:35:16] <dstufft> Like.. idk if we added a project logo field or something
[21:35:27] <techalchemy> yeah that'd be the level of modification i guess i'd expect
[21:36:12] <sumanah> and, again, there would be a support cost. Of time and maybe money someday. Of explaining to maintainers "here's what you can change where" and dealing with confusion about what can and cannot be automated, whether owners vs maintainers can do the web UI things..... I am not saying "veto" but there would be this cost
[21:36:20] <dstufft> I could see a world where we had made different choices and hadn't stuffed a lot of this metadata into the package and instead had made it some project level metadata on PyPI or something, but I think we're well past the point of no return on that
[21:36:32] <PSFSlack> <di> sumanah: yeah, I wasn't saying we should/shouldn't change python_requires, just that if the spec said that it was required, it should probably be required on PyPI too.
[21:36:47] <sumanah> di_codes: ah, ok, thanks, I appreciate understanding better
[21:38:01] <techalchemy> sure, I think the ship has sailed on most of these decisions for now. And the interface is super intuitive and such now as well so I can't really complain
[21:38:21] <sumanah> ok so then let's return to the question of whether we want to ban old metadata versions, or old upload clients, from uploading to PyPI
[21:38:45] <dstufft> Yea, I think that if we require something in the metadata spec, then PyPI should 100% require it, and we can also opt to require (or further constrain) what PyPI accepts as valid metadata
[21:38:58] <dstufft> (for instance, we could require a long_description if we wanted)
[21:40:01] <dstufft> I don't think banning old metadata versions buys us anything unless we did something like change the meaning of a metadata field in a backwards incompatible way where the only discriminator was metadata-version
[21:40:08] <sumanah> right.
[21:40:15] <dstufft> and i think it's a poor user experience to use metadata-version as the enforcement mechanism
[21:40:38] <dstufft> like let's say that we change it so that in 2.2 python-requires is required, and we say "ok PyPI requires metadata 2.2 now"
[21:40:51] <dstufft> the error message people would get is "metadata-version 2.1 not allowed"
[21:41:03] <dstufft> which is going to make approximately everyone but like 10 people go ????
[21:41:08] <sumanah> Understood.
[21:41:39] <dstufft> From my POV, the metadata-version validation on PyPI is primarily there to ensure that PyPI gets a metadata-version that it understands how to parse
[21:41:46] <techalchemy> well you'd have to give a better message than that
[21:42:11] <dstufft> any other constraints (like required fields, invalid field, etc) should just be validated on a field by field basis
[21:42:40] <techalchemy> OH, wow
[21:42:41] <sumanah> I don't have any particular notes on _why_ people were interested in exploring banning old upload clients last year -- the notes don't say I think
[21:42:57] <techalchemy> dstufft, that last line finally made me understand what you're saying
[21:44:11] <techalchemy> so if we want pypi to require more metadata that should be decided based on the specific metadata fields we want, and the error that it spits back should tell you which fields it expected to find but didn't
[21:44:20] <dstufft> techalchemy: Yes, 100%
[21:44:53] <techalchemy> instead of talking about some indirect question about metadata versions which act as a surrogate for the real issue, and probably a poor one
[21:44:59] <dstufft> Yup
[21:45:04] <dstufft> Sorry if I wasn't clear about that
[21:45:18] <techalchemy> no I just needed a minute to catch up
[21:46:47] <techalchemy> yes that makes plenty of sense to me because as you and dustin both mentioned we still have a ton of packages with the old metadata so it's not like those are going anywhere
[21:46:56] <dstufft> sumanah: I think that came out of the idea of just having twine warn about requires-python
[21:47:02] <dstufft> versus having PyPI require it
[21:47:06] <sumanah> ahhhhh
[21:47:14] <dstufft> and then it was "well if twine warns, but nobody is using the new version, wat do"
[21:47:15] <sumanah> ok, thank you dstufft, that does help me
[21:47:35] <dstufft> that is my vague recollection
[21:47:48] <sumanah> dstufft: that clicks with me :-)
[21:48:02] <dstufft> there might be additional reasons someone had, I dunno
[21:48:48] <dstufft> I don't think blocking old versions is a very worthwhile change except for maybe some edge case where we change the meaning of something such that the old client will give us erroneous information that we can't otherwise detect, but I don't think we'd generally do that
[21:49:26] <sumanah> ok, so I'm going away with 3 main things: nudging staged releases along, nudging the auditwheel checking of new wheel uploads along, and filing an issue where Warehouse would block uploads that lack python_requires
[21:49:58] <sumanah> and maybe we won't do that last thing, but at least I figure we should have the discussion on GitHub instead of IRC
[21:50:01] <sumanah> or in addition to IRC
[21:50:15] <dstufft> Reading back through the history from before I sat back down, I think if we still have any issues that are like "make things more strict" we should probably just close them unless those issues have specific completion criteria
[21:50:39] <dstufft> a lot of that is my fault, when I originally wrote a lot of the issues in Warehouse they were mostly just notes to myself, to "do this thing i know we're going to need at some point"
[21:51:17] <sumanah> I started https://github.com/pypa/packaging-problems/issues/264 as a place to list the things we wanted coming out of the summit last year; it is ripe for checklisting
[21:51:42] <techalchemy> ^ is a great issue
[21:53:08] <dstufft> Also WRT metadata inaccuracies and such
[21:53:39] <dstufft> We *might* be able to modify the metadata in the database so that it more correctly matches the metadata in the files, in cases where they deviate
[21:54:03] <dstufft> but we probably need to change the database model some
[21:54:21] <dstufft> like requires-dist is currently a release-specific field, but different wheel files can have different requirements
[21:54:28] <dstufft> so it probably needs to move to the files table
[21:54:57] <dstufft> and as part of that, we can *probably* "fix" the metadata in the database by backfilling it by looking inside the wheel files
[21:55:39] <dstufft> I guess a better way to word that, is the metadata inside the files should be considered authoritative, and I'm pretty sure we can adjust the Database to match the authoritative record
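
A rough sketch of that backfill idea, treating each wheel's embedded METADATA as the authoritative source of Requires-Dist; the iteration and update helpers are hypothetical, not Warehouse code:

    # Pull Requires-Dist out of a wheel so a per-file database row could be
    # backfilled from the authoritative record inside the artifact.
    import zipfile
    from email.parser import HeaderParser

    def requires_dist_from_wheel(path):
        with zipfile.ZipFile(path) as whl:
            name = next(n for n in whl.namelist()
                        if n.endswith(".dist-info/METADATA"))
            with whl.open(name) as f:
                meta = HeaderParser().parsestr(f.read().decode())
        return meta.get_all("Requires-Dist") or []

    # for file_id, path in all_wheel_files():  # hypothetical iteration
    #     update_file_metadata(file_id, requires_dist_from_wheel(path))
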
[21:55:44] <sumanah> ok, so, https://github.com/pypa/warehouse/issues/194 is a general "let's check more things upon upload" issue, which is the kind of thing I think you were talking about dstufft - not concrete
[21:55:50] <sumanah> I will take a look at that now
[21:56:23] <dstufft> sumanah: yea exactly
[21:57:26] <dstufft> FWIW I suspect I am going to have more time to work on packaging stuff again
[21:57:37] <dstufft> (not really related to any discussion here, just a general FYI ;P)
[21:58:07] <sumanah> that's wonderful to hear dstufft
[21:58:33] <sumanah> dstufft: sometime next week if you want to get on a call, we can renew the "Donald's Responsibility Inventory" to get a fresh one
[21:59:02] <dstufft> sumanah: sure
[22:00:04] <dstufft> I also spent all last weekend processing my like, many thousands of unread email so I actually can notice incoming email again :P
[22:00:36] <sumanah> Thank you for doing that!
[22:05:59] <pradyunsg> dstufft: yay! I feel like that's an accomplishment that deserves a congratulations. :)
[22:06:22] <dstufft> woops, forgot to restore my ssh keys to get onto the pypi box to do a query
[22:06:41] <EWDurbin> @dstufft i can update it from your GH keys
[22:07:30] <pradyunsg> 19 seconds for EWDurbin's response. πŸ™ƒ
[22:08:01] <EWDurbin> sometimes you're just sitting watching the messages scroll by while you eat dinner, drink a beer, and watch the horror of a whitehouse press briefing.
[22:08:22] <pradyunsg> LOL
[22:08:57] <EWDurbin> dstufft: updated :-D
[22:09:03] <EWDurbin> go query my friend
[22:09:16] <pradyunsg> EWDurbin: I've been checking https://www.covid19india.org/ daily, which is better and worse in some ways.
[22:09:32] <dstufft> EWDurbin: my GH keys are not restored either :P I'm grabbing my backup drive to restore them
[22:09:36] <dstufft> I had to wipe my computer
[22:09:37] <sumanah> oh for like the last hour I didn't think about the pandemic at all! I was actually absorbed in work!
[22:09:38] <dstufft> well "had"
[22:09:51] <sumanah> what a gift. thank you all for that.
[22:09:57] <EWDurbin> dstufft: contents of your authorized_keys on the bastion host now match https://github.com/dstufft.keys anyway
[22:10:02] <dstufft> my laptop got so hot it warped
[22:10:03] <dstufft> so I had to mail it to apple to get it fixed
[22:10:14] <dstufft> and I don't give people my laptop (even though it has encryption) with any of my actual data on it
[22:10:34] <EWDurbin> oh dang, i should probably wipe and send my macbook for keyboard repair while i'm certainly not traveling
[22:10:38] <EWDurbin> i wonder if they're inundated
[22:10:49] <pradyunsg> sumanah: whoops sorry. πŸ™ˆ
[22:11:05] <dstufft> EWDurbin: mine was 2 weeks? ago or
[22:11:10] <dstufft> they were extremely fast
[22:11:22] <dstufft> the box they sent me was set up for overnight delivery
[22:12:02] <dstufft> they had it fixed and back on the truck in < 24h from when Fedex picked it up from my house
[22:12:04] <dstufft> of course it took an extra day for it to get back to my house
[22:12:05] <dstufft> but total time was < 48h gone
[22:13:24] <pradyunsg> dstufft: Huh, interesting!
[22:13:58] <sumanah> pradyunsg: you're not the only one who reminded me of The News so no apology necessary ;-)
[22:14:02] <dstufft> they also fixed the keyboard at that time
[22:14:24] <sumanah> dstufft: when that laptop got super hot, did it start booting OS/2? ;-)
[22:14:26] <dstufft> the only thing that has been on my TV in weeks is the news and tiger king
[22:14:28] <sumanah> (since it Warped)
[22:14:37] <dstufft> sumanah: lol
[22:14:56] <dstufft> I was actually surprised it got hot enough to physically warp the laptop
[22:15:01] <dstufft> the lid wouldn't shut anymore
[22:15:04] <techalchemy> yikes
[22:15:16] <dstufft> and if you set it on a flat surface you could rock it back and forth
[22:15:57] <techalchemy> my thinkpad swings in the opposite direction and aggressively thermal throttles due to firmware issues making it think it is always in lap mode
[22:16:04] <techalchemy> as long as i run linux *
[22:16:08] <techalchemy> *which i have to for my job
[22:17:18] <sumanah> pradyunsg: would it help the resolver work if PyPI ran auditwheel on manylinux uploads, stopping less-valid wheels from being uploaded? (I have suspicions that the answer is yes)
[22:17:49] <EWDurbin> we definitely couldn't be running auditwheel in real time to stop uploads
[22:18:15] <EWDurbin> it would have to be async and send a notice email to the uploader/maintainers
[22:19:39] <dstufft> depends on how radical we want to get
[22:20:17] <EWDurbin> if we had 2-phase uploads, we could probably hold manylinux wheels from being public until results existed...
[22:20:27] <sumanah> right
[22:20:32] <dstufft> e.g. if we modified the upload API to be async in general
[22:20:37] <dstufft> which we arguably should do
[22:20:55] <EWDurbin> hmmm
[22:21:07] <EWDurbin> that would have some benefits for TUF too AFAICT
[22:21:14] <dstufft> Yea
[22:21:20] <dstufft> I mean it's our worst route by far
[22:21:30] <dstufft> there are a ton of benefits to doing it
[22:21:51] <sumanah> is this something that already has a GitHub issue open?
[22:22:02] <dstufft> it's just threading the needle on how to actually do that
[22:22:11] <dstufft> probably not unless we have a nebulous upload api 2.0 issue
[22:22:15] <dstufft> which we might
[22:22:20] <dstufft> sounds like the kind of terrible issue I'd make
[22:22:28] <sumanah> looking in https://github.com/pypa/warehouse/labels/APIs%2Ffeeds
[22:22:46] <sumanah> "Determine new API URL structure for warehouse (starting with new JSON API)" is not quite it
[22:23:29] <sumanah> dstufft: I think today is your lucky day and there is no particular issue for this, IMO
[22:23:39] <sumanah> shall I make 1?
[22:24:01] <dstufft> Sure
[22:24:06] <dstufft> the hardest part about that issue is figuring out how we deal with the existing uploaders
[22:25:24] <dstufft> because we'd want some mechanism where twine could get an identifier back, and then poll for completion (and any errors / warnings)
[22:26:01] <dstufft> and I'm not sure we could just YOLO change the existing API to do that
[22:26:12] <dstufft> so it'd probably be something where we stand up a second API with the new semantics, then deprecate the old one and start pushing people to it
[22:26:13] <dstufft> etc etc
[22:26:31] <dstufft> but maybe spending more than 30s thinking about it would come up with something clever
[22:27:12] <dstufft> It would be interesting to talk to the TUF folks when doing it
[22:27:28] <dstufft> because if we do PEP 480 then that's going to change the upload API too
[22:28:01] <dstufft> so we'd want to make sure whatever we do is compatible with that
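
To make the shape of that concrete, a purely hypothetical sketch of the post-then-poll flow described above; none of these endpoints or response fields exist on PyPI today:

    # Hypothetical async upload: POST the artifact, receive an identifier,
    # then poll for completion plus any errors/warnings.
    import time

    import requests

    resp = requests.post(
        "https://upload.pypi.org/2.0/upload",  # hypothetical endpoint
        files={"content": open("dist/example-1.0.tar.gz", "rb")},
    )
    task_id = resp.json()["task_id"]  # hypothetical response field

    while True:
        status = requests.get(
            f"https://upload.pypi.org/2.0/status/{task_id}"  # hypothetical
        ).json()
        if status["state"] in ("accepted", "rejected"):
            print(status["state"], status.get("warnings", []))
            break
        time.sleep(5)
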
[22:28:27] <sumanah> ok, filed as https://github.com/pypa/warehouse/issues/7730
[22:31:02] <pradyunsg> sumanah: while it won't directly affect the resolver work (since we're only checking the tags for compatibility), it would help the actual compatibility issues that users face. Basically, not directly affecting but certainly nice to have!
[22:32:01] <sumanah> right .... it would help reduce the general support load and help reduce how often users get into bad situations
[22:35:55] <pradyunsg> sumanah: Yep!
[22:36:58] <sumanah> ok, thanks pradyunsg!
[22:37:06] <pradyunsg> I'mma go sleep now because I'd like to slowly pull my sleep timings to be more... appropriate for my timezone.
[22:37:08] <sumanah> also, pradyunsg it is 4:07 am in India
[22:37:11] <sumanah> yes please do
[22:37:13] <sumanah> sleep well
[22:37:18] <pradyunsg> Yea :)
[22:37:23] <sumanah> techalchemy: I hope you will at least consider taking the weekend off
[22:37:38] <pradyunsg> sumanah: thanks! ^.^
[22:37:53] <techalchemy> meh i just tweak one thing and push tests, wait 30-60 mins and repeat
[22:38:00] <techalchemy> i will consider it though
[22:38:20] <techalchemy> thanks for all your help sumanah i couldn't have caught up without you
[22:38:49] <sumanah> Thanks techalchemy.
[22:39:46] <sumanah> catch y'all later
[22:41:36] <travis-ci> pypa/pip#15464 (master - f7c5a69 : Paul Moore): The build passed.
[22:41:36] <travis-ci> Change view : https://github.com/pypa/pip/compare/d53d3d6b24dd67dee7c89298ff192bbd9867bbbd...f7c5a69e69c96dadb50f182654135ba7a464d91b
[22:41:36] <travis-ci> Build details : https://travis-ci.org/pypa/pip/builds/670778743
[23:30:15] <travis-ci> pypa/pip#15465 (master - 73bfea6 : Pradyun Gedam): The build passed.
[23:30:15] <travis-ci> Change view : https://github.com/pypa/pip/compare/f7c5a69e69c9...73bfea6d28b1
[23:30:15] <travis-ci> Build details : https://travis-ci.org/pypa/pip/builds/670793999