[23:15:02] <techalchemy> dalley, finally responded, no idea if i was remotely helpful though
[23:17:15] <dalley> techalchemy, so I had a paragraph explaining why and decided it wasn't super relevant. the tl;dr is that one of our primary features is "lazy sync" which is basically a transparent on_demand cache
[23:17:39] <techalchemy> that'd be super cool to support
[23:18:07] <dalley> you 'mirror' PyPI (or an RPM repo, or a Debian repo, or a Docker registry), all the metadata gets ingested, but none of the packages are actually downloaded until a client requests it
[23:18:18] <techalchemy> yeah, makes perfect sense
[23:18:40] <dalley> and then once it's downloaded it sticks around
[23:18:58] <techalchemy> depending on how you're mirroring, there's potentially more configuration to do in the webserver etc
[23:20:00] <techalchemy> for instance, if you want to avoid redirects and still act like you have the bits
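A minimal sketch of the on-demand pattern being described here, written against aiohttp since that's the stack under discussion. The URL layout, cache directory, and upstream host are invented for illustration; this is not Pulp's actual content-serving code, just the shape of a handler that fetches on first request, keeps the bits around, and streams them straight through so the client never sees a redirect:

```python
import pathlib

from aiohttp import ClientSession, web

UPSTREAM = "https://files.pythonhosted.org"  # hypothetical upstream host
CACHE_DIR = pathlib.Path("./cache")          # hypothetical cache location


async def serve_package(request: web.Request) -> web.StreamResponse:
    """Serve one file, fetching and caching it on first request."""
    rel_path = request.match_info["path"]
    cached = CACHE_DIR / rel_path

    if cached.is_file():
        # Downloaded once already: it sticks around, serve the local copy.
        return web.FileResponse(cached)

    # First request: stream the upstream response through to the client
    # while also writing it into the cache -- no redirect involved.
    async with ClientSession() as session:
        async with session.get(f"{UPSTREAM}/{rel_path}") as upstream:
            if upstream.status != 200:
                raise web.HTTPNotFound()
            response = web.StreamResponse(status=200)
            response.content_type = upstream.content_type
            await response.prepare(request)

            cached.parent.mkdir(parents=True, exist_ok=True)
            # A real implementation would write to a temp file and rename,
            # so an interrupted download doesn't poison the cache.
            with open(cached, "wb") as fh:
                async for chunk in upstream.content.iter_chunked(64 * 1024):
                    fh.write(chunk)
                    await response.write(chunk)
            await response.write_eof()
            return response


app = web.Application()
app.add_routes([web.get("/packages/{path:.+}", serve_package)])

if __name__ == "__main__":
    web.run_app(app, port=8080)
```

Streaming through the app process is what "still act like you have the bits" costs: the server holds the connection open for the full upstream transfer instead of bouncing the client with a 302.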
[23:24:00] <dalley> requirementslib looks like a nice library, I'll look into it
[23:24:56] <techalchemy> it's maybe not ideally suited for what you want
[23:25:59] <techalchemy> i'm currently ingesting all of the package metadata into a db and i think for requirementslib to download it you'd have to measure that time in days
[23:26:22] <techalchemy> with aiohttp it took hours, and i didn't transform it at all or instantiate any objects from it
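A rough sketch of the kind of bulk ingestion described above: pull raw metadata with aiohttp and store it in a db completely untransformed. The package list, table schema, and the use of PyPI's JSON API are illustrative assumptions; the actual job enumerates the full index (hundreds of thousands of projects), which is what makes it a multi-hour run:

```python
import asyncio
import sqlite3

import aiohttp

# Hypothetical sample; the real run walks every project on the index.
PACKAGES = ["requests", "aiohttp", "flask"]


async def fetch_metadata(session: aiohttp.ClientSession, name: str):
    """Fetch one project's raw JSON metadata from PyPI."""
    async with session.get(f"https://pypi.org/pypi/{name}/json") as resp:
        resp.raise_for_status()
        return name, await resp.text()


async def ingest(db: sqlite3.Connection) -> None:
    async with aiohttp.ClientSession() as session:
        rows = await asyncio.gather(
            *(fetch_metadata(session, name) for name in PACKAGES)
        )
    # Stored as raw text: no transformation, no objects instantiated from it.
    db.executemany(
        "INSERT OR REPLACE INTO metadata (name, json) VALUES (?, ?)", rows
    )
    db.commit()


if __name__ == "__main__":
    db = sqlite3.connect("pypi.db")
    db.execute(
        "CREATE TABLE IF NOT EXISTS metadata (name TEXT PRIMARY KEY, json TEXT)"
    )
    asyncio.run(ingest(db))
```

Unbounded asyncio.gather is fine for three names; a full-index run would want a semaphore or worker pool to bound concurrency.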
[23:29:09] <dalley> but yeah, our entire pipeline is set up around controlling the file downloads
[23:34:47] <techalchemy> i guess i could go read the code if i want to understand the flow a bit