#pypa logs for Wednesday the 9th of September, 2020

[09:33:03] <ThiefMaster> could pypi reject packages that contain python_requires in their setup.py but lack the proper package metadata?
[09:51:13] <abn> @ThiefMaster: as far as I am aware PyPI does not inspect the packages being uploaded. As long as the upload request is valid it should succeed.
[09:51:13] <abn> https://warehouse.pypa.io/api-reference/legacy/#upload-api
[09:51:13] <abn> You can always use the test PyPI to verify this.
[09:53:11] <abn> Does anyone know what happens to the BigQuery data entry when a release is yanked? Specifically the project metadata https://warehouse.pypa.io/api-reference/bigquery-datasets/#project-metadata-table
[09:55:01] <ThiefMaster> maybe someone more familiar with it could chime in here, since the maintainer is having trouble publishing with the proper metadata and is breaking downstream because of that ;x https://github.com/tobgu/pyrsistent/issues/208
[10:08:10] <abn> ThiefMaster: jfyi, the package metadata does not contain a `python_requires` constraint; it says `"requires_python": ""` (see https://pypi.org/pypi/pyrsistent/json). Anyway, I am sure someone more familiar can help :)
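
For reference, a minimal sketch of the relationship being discussed: the `python_requires` argument to setuptools' `setup()` is what populates the `Requires-Python` metadata field, which the JSON API exposes as `requires_python`. The snippet below just reads that field for pyrsistent:

    # Read requires_python straight from the JSON API endpoint abn links above.
    # At the time of this log it printed '' (no constraint), even though the
    # project's setup.py passes python_requires.
    import json
    from urllib.request import urlopen

    with urlopen("https://pypi.org/pypi/pyrsistent/json") as resp:
        info = json.load(resp)["info"]

    print(repr(info["requires_python"]))
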
[17:11:31] <graingert> have I got the syntax of these extras wrong? https://github.com/dask/s3fs/pull/362/files#diff-2eeaed663bd0d25b7e608891384b7298R28-R29
[17:13:18] <graingert> abn: can it just support both ways around?
[17:13:49] <graingert> pradyunsg: am I getting hit by nested extras again?
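
For context, a rough sketch (placeholder names, no version pins) of the "pass-through extras" pattern being tested here: an extra whose only job is to enable the same-named extra on a dependency.

    # Hypothetical setup.py: mypkg[boto3] simply forwards to aiobotocore[boto3].
    from setuptools import setup

    setup(
        name="mypkg",                      # placeholder project name
        version="0.1.0",
        install_requires=["aiobotocore"],
        extras_require={
            "boto3": ["aiobotocore[boto3]"],
            "awscli": ["aiobotocore[awscli]"],
        },
    )
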
[17:15:13] <pradyunsg> graingert: is that with the default pip resolver? Probably. Try it with --use-feature=2020-resolver?
[17:16:14] <graingert> pradyunsg: ah that works
[17:17:11] <graingert> https://www.irccloud.com/pastebin/gU599gxM/
[17:17:24] <graingert> pradyunsg: but it works without the pass-through extras
[17:17:41] <graingert> pradyunsg: buuut it downloads like every boto3
[17:18:15] <graingert> pradyunsg: bit of a pathological case: 55-44
[17:18:22] <pradyunsg> boto3 🏳️
[17:18:35] <graingert> * boto3-1.14.55 to 44
[17:18:45] <graingert> pradyunsg: a known issue?
[17:19:02] <graingert> aiobotocore should just vendor botocore right?
[17:19:47] <pradyunsg> graingert: yea. I'm too sleepy to find it in the issue tracker - but I've known since forever that boto3 is painful in dependency resolution. :)
[17:20:01] <pradyunsg> Try it with --use-feature=fast-deps?
[17:20:49] <graingert> HTTP range requests to obtain dependency information!
[17:20:58] <pradyunsg> Yes!
[17:21:00] <graingert> that's subzero
[17:21:07] <pradyunsg> As in?
[17:21:16] <graingert> cooler than very cold
[17:21:22] <pradyunsg> Ahaha.
[17:21:39] <graingert> gonna clear my pip cache tho so I can experience it fully
[17:22:27] <pradyunsg> It was McSinyx[m]'s GSoC project. Last I looked, it's actually slower than the default, but boto3 might make things better.
[17:22:44] <graingert> it's about the same it feels
[17:23:27] <graingert> in speed, but probably saves bandwidth
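
A toy sketch of the "lazy wheel" idea behind fast-deps, for illustration only (this is not pip's implementation): a file-like object that fetches byte ranges over HTTP on demand, so `zipfile` can pull just the `*.dist-info/METADATA` member out of a remote wheel instead of downloading the whole artifact. The wheel URL is a placeholder.

    import io
    import zipfile
    from urllib.request import Request, urlopen

    class LazyHTTPFile(io.RawIOBase):
        """Read-only, seekable view of a remote file via HTTP Range requests."""

        def __init__(self, url):
            self.url = url
            self.pos = 0
            # Learn the total size without fetching the body.
            head = urlopen(Request(url, method="HEAD"))
            self.size = int(head.headers["Content-Length"])

        def seekable(self):
            return True

        def readable(self):
            return True

        def tell(self):
            return self.pos

        def seek(self, offset, whence=io.SEEK_SET):
            base = {io.SEEK_SET: 0, io.SEEK_CUR: self.pos, io.SEEK_END: self.size}[whence]
            self.pos = base + offset
            return self.pos

        def read(self, n=-1):
            if n < 0:
                n = self.size - self.pos
            if n <= 0 or self.pos >= self.size:
                return b""
            end = min(self.pos + n, self.size) - 1
            req = Request(self.url, headers={"Range": "bytes=%d-%d" % (self.pos, end)})
            data = urlopen(req).read()
            self.pos += len(data)
            return data

    # Usage: open a wheel lazily and read only its METADATA member.
    wheel = zipfile.ZipFile(LazyHTTPFile("https://files.pythonhosted.org/.../some.whl"))
    name = next(n for n in wheel.namelist() if n.endswith(".dist-info/METADATA"))
    print(wheel.read(name).decode())
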
[17:23:33] <graingert> can't use pypi.org json api?
[17:24:00] <pradyunsg> Not standardized, nor protected w/ hashes or whatever.
[17:29:41] <graingert> urgh
[17:30:37] <graingert> a transparency log with metadata and whl hashes?
[17:34:23] <sumanah> hey - I'm coming into the middle of this, but graingert, if you are interested in the possibility of securing the Python package pipeline further: PEP 458 (https://www.python.org/dev/peps/pep-0458/) is in progress: https://github.com/pypa/warehouse/issues/5247
[17:34:50] <sumanah> contrast with transparent logs: https://ssl.engineering.nyu.edu/blog/2020-02-03-transparent-logs
[17:39:15] <graingert> sumanah: TUF doesn't match my threat model :(
[17:40:25] <sumanah> graingert: oh hmmmm - could you talk a little more about that?
[17:42:12] <graingert> I think I mentioned it on discuss a while back
[17:42:37] <graingert> I'm after a drop-in replacement for --generate-hashes' TOFU (trust-on-first-use) behaviour
[17:44:00] <graingert> but TUF signed metadata probably helps with boto3
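
A minimal sketch of the trust-on-first-use behaviour graingert means: pin an artifact's digest the first time it is seen, then refuse anything that later hashes differently. (Names here are illustrative; in practice `pip-compile --generate-hashes` writes the pins into requirements.txt and pip enforces them.)

    import hashlib

    pins = {}  # filename -> sha256 hex digest; persisted to disk in practice

    def check(filename, payload):
        digest = hashlib.sha256(payload).hexdigest()
        if filename not in pins:
            pins[filename] = digest          # first use: trust and record
        elif pins[filename] != digest:
            raise RuntimeError("hash mismatch for %s" % filename)
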
[17:45:29] <graingert> How big is the pypi dep graph? E.g. every install_requires/extras_require from every pypi.org package?
[17:45:47] <graingert> Or just all the metadata
[17:46:08] <sumanah> I don't know
[17:47:06] <graingert> How many times do I need to download boto3 before it's faster to download the full dep graph ;)
[17:48:15] <graingert> Is this going to massively inflate the downloads of newer versions of boto3?
[17:57:40] <dstufft> we don't have the dep graph available yet
[17:58:10] <dstufft> there's a warehouse ticket to make it available
[18:15:18] <abn> dstufft: is a dependency graph from warehouse planned to be made available?
[18:26:09] <dstufft> abn: yes
[18:27:33] <abn> Is there a discussion you'd suggest I follow if I am interested in it?
[18:31:12] <dstufft> https://github.com/pypa/warehouse/issues/8254
[18:33:04] <abn> Thank you. :)
[18:34:00] <abn> Seems I missed that one.
[21:04:00] <graingert> pradyunsg: does the "HTTP range requests to obtain dependency information!" actually validate any hashes?
[21:04:14] <graingert> " Not standardized, nor protected w/ hashes or whatever."
[21:04:40] <dstufft> no
[21:16:26] <graingert> dstufft: .METADATA sounds like a good solution
[21:17:20] <graingert> dstufft: do you happen to know the order of magnitude of what all that data would be? Downloading it and pre-filling the pip-tools pip-compile cache seems like it might be useful if it's only a few hundred MB
[21:35:23] <dstufft> no idea
[21:41:31] <graingert> fair
[21:42:19] <graingert> > The full PyPI mirror requires approximately 120 GB.
[21:42:19] <graingert> hmm metadata gonna be at least 1000 times smaller
[22:34:11] <dstufft> that number is super out of date
[22:34:16] <dstufft> wherever that came from
[22:34:29] <dstufft> full pypi mirror now is like 7tb
[23:05:01] <abn> graingert: not sure if this helps, but support was recently added to bandersnatch to do a metadata-only sync, https://github.com/pypa/bandersnatch/issues/665 (obviously not `.METADATA` :) )
[23:16:08] <graingert> abn: ah neat
[23:26:30] <tos9> agronholm: hey -- sorry for direct ping (feel free to tell me to buzz off :) -- but how do you use sphinx-autodoc-typehints if the typing import is hidden behind `if False`?
[23:27:33] <tos9> agronholm: specifically I'm trying to help fix a warning for python-build, but it currently warns (from autodoc-typehints) about a typing comment not being forward referenceable, which I think is just because it's hidden behind the magic import
[23:27:47] <tos9> set_type_checking_flag doesn't seem to work for that, it's just for `if TYPE_CHECKING` yeah?
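
The distinction tos9 is describing, roughly: `set_type_checking_flag` flips `typing.TYPE_CHECKING` to True while the docs build, so imports guarded by it resolve, whereas a literal `if False:` block never executes and leaves the annotation an unresolvable forward reference. A minimal example of the pattern that does work with the flag:

    from typing import TYPE_CHECKING

    if TYPE_CHECKING:  # becomes True under set_type_checking_flag
        from collections.abc import Iterator  # example guarded import

    def count_up(n):
        # type: (int) -> Iterator[int]
        yield from range(n)
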
[23:28:51] <tos9> oh, I also now see https://github.com/agronholm/sphinx-autodoc-typehints/issues/146 so extra sorry for pinging
[23:35:38] <graingert> tos9: the trick is to ping people by posting links to their GitHub repos
[23:36:38] <graingert> > You would be responsible for the continued development of this project. This means fixing bugs, dealing with pull requests and occasionally publishing new releases. I would ask you to keep the existing structure of the code base more or less the same and not make major changes without first checking with me.
[23:36:38] <graingert> You can't wish for more wishes
[23:37:23] <graingert> altendky: https://github.com/agronholm/sphinx-autodoc-typehints/issues/146
[23:38:53] <cooperlees> Wherever that PyPI size was advertised, it should link to https://pypi.org/stats/
[23:39:08] <altendky> graingert: yep. Not promising now, but... Who knows
[23:39:22] <cooperlees> That size does not include html and json files tho
[23:41:14] <graingert> cooperlees: ooh, is there a bytes high score table for maintainers?
[23:41:47] <cooperlees> That's the top 100 there
[23:42:04] <cooperlees> Can fetch it in JSON too
[23:42:34] <cooperlees> https://github.com/cooperlees/pypistats
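
A hedged sketch of what that tooling does: pypi.org/stats will serve JSON when asked for it via content negotiation. The field names below are assumptions based on the linked repo, not a documented stable API.

    import json
    from urllib.request import Request, urlopen

    req = Request("https://pypi.org/stats/", headers={"Accept": "application/json"})
    stats = json.load(urlopen(req))
    print(stats.get("total_packages_size"))  # assumed field: total bytes stored
    print(stats.get("top_packages"))         # assumed field: the top-100 table
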
[23:59:03] <altendky> oh hey, i'm nearly tied with pypi...