[22:55:51] <sigmavirus24> I don't want to arbitrarily retry
[22:55:59] <sigmavirus24> and requests' retries are not sophisticated enough here
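For context, the retry support requests exposes is urllib3's Retry policy mounted on an HTTPAdapter. A minimal sketch follows (parameter names match recent urllib3; older releases spell `allowed_methods` as `method_whitelist`). It can re-send on connection errors or selected status codes, but it has no way to check whether a retried upload actually landed intact, which is the gap being discussed here.

```python
# Sketch of requests' built-in retry hook: urllib3's Retry mounted on an
# HTTPAdapter. It retries on the listed status codes with exponential
# backoff, but cannot verify the integrity of an upload after the fact.
import requests
from requests.adapters import HTTPAdapter
from urllib3.util.retry import Retry

retry = Retry(
    total=3,
    backoff_factor=1,                     # ~1s, 2s, 4s between attempts
    status_forcelist=[500, 502, 503],
    allowed_methods=frozenset({"POST"}),  # POST is not retried by default
)
session = requests.Session()
session.mount("https://", HTTPAdapter(max_retries=retry))
# session.post(upload_url, data=..., files=...)  # upload as usual
```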
[22:56:40] <sigmavirus24> dstufft: is there no common URI that I could use to do something like `https://pypi.python.org/pypi/{safe_name}/{filename}` and
[22:59:40] <sigmavirus24> So if the upload is incomplete, how do I deal with that?
[23:00:00] <sigmavirus24> dstufft: is the md5 in the json just what I tell PyPI when I upload something?
[23:00:25] <sigmavirus24> or is it what pypi calculates?
[23:01:06] <dstufft> sigmavirus24: If the upload is incomplete the only thing you can do is error-- Doing a hash check like that might be a separate thing like twine upload --verify or something I dunno
[23:01:46] <sigmavirus24> dstufft: so I can't just use the md5 returned in the json to do a hash check?
[23:02:43] <dstufft> sigmavirus24: re hash, it doesn't matter, PyPI buffers the whole file locally and compares the hash that is sent by twine to a hash it computes of the file before doing anything else. The incomplete upload thing happens when something interrupts PyPI's writing the file to persistent storage
[23:03:19] <dstufft> it's basically PyPI being really bad at transaction management
[23:03:26] <dstufft> and warehouse doesn't have the same problem
[23:04:03] <dstufft> sigmavirus24: does that description make sense? I think it sounds confusing but idk
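A rough sketch of what the hypothetical `twine upload --verify` step mentioned above could look like, assuming PyPI's JSON API (which reports an md5_digest per uploaded file). The function name and structure here are illustrative only, not twine's actual code:

```python
# Compare the md5 PyPI reports for an uploaded file against a digest of
# the local file. If they differ (or the file is missing), the upload
# did not land intact and the recourse is to re-release (see below).
import hashlib
import requests

def verify_upload(project, version, filepath, filename):
    resp = requests.get(
        "https://pypi.python.org/pypi/{0}/{1}/json".format(project, version)
    )
    resp.raise_for_status()
    # Each entry under "urls" describes one uploaded file for the release.
    remote = {f["filename"]: f["md5_digest"] for f in resp.json()["urls"]}

    with open(filepath, "rb") as fh:
        local_md5 = hashlib.md5(fh.read()).hexdigest()

    return remote.get(filename) == local_md5
```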
[23:16:29] <sigmavirus24> dstufft: so the 500s can be caused by failing to write to persistent storage. in that case, is there any recourse to fix that?
[23:20:27] <dstufft> sigmavirus24: bump a new version and re-release
[23:20:34] <dstufft> -> goes to daughter's school for a bit