[15:50:56] <fdv> hi! I have now gone through pipenv, poetry and pip-tools, and it seems these tools are all lacking essential (for me) functionality, unless I'm mistaken
[15:52:08] <fdv> pip supports (through the -c flag) a file with constraints that are taken into account when installing from requirements.txt, does anybody know whether pipenv or pip-tools (pip-sync) support that?
[15:53:31] <fdv> sorry, pip-compile, not pip-sync. I'd like my final requirements.txt to contain the actual versions I want (including recursive dependencies) without having to also consider a second constraints file part of the spec
[15:54:48] <fdv> poetry probably isn't an option for me anyhow, as it seems restricted to installing only within a virtualenv (please do correct me if I'm wrong)
[19:54:00] <nanonyme> Is it possible to have --pre only for a single package?
[19:57:08] <ngoldbaum> i think "pip install --pre --no-deps <package>"
[19:58:01] <ngoldbaum> i don't think there's a way to only install the dependencies first
[19:58:12] <ngoldbaum> so i guess you could do "pip install <package>"
[19:58:16] <ngoldbaum> then "pip uninstall package"
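A sketch of the two-step approach ngoldbaum describes, with a hypothetical package name `mypkg` (install once so the dependencies resolve to release versions, then swap in just the package itself as a prerelease):

```shell
# Step 1: install normally; all dependencies resolve to release versions.
pip install mypkg

# Step 2: replace only mypkg itself with its prerelease, leaving the
# already-installed dependencies untouched (--upgrade stands in for the
# uninstall/reinstall dance above).
pip install --pre --no-deps --upgrade mypkg
```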
[20:00:11] <nanonyme> What about if I'd want to pip wheel a package and its dependencies such that my package is a prerelease but dependencies should be release packages?
[20:00:32] <ngoldbaum> you want to bundle all the dependencies in a single wheel?
[20:00:50] <nanonyme> Just create wheels of everything into a directory that works as a package cache in TravisCI
[20:01:26] <nanonyme> I currently have the setup but I just realized it doesn't work as expected for my own package for some reason. I think I broke things when I started using extras
[20:01:41] <ngoldbaum> and you want to create the wheels yourself and not rely on the upstream wheels on pypi?
[20:02:42] <nanonyme> TravisCI has cache that is faster than PyPI
[20:03:05] <nanonyme> I've been putting entire wheel build directory into cache and rotating over builds
[20:05:29] <nanonyme> Never mind. Looks like I got my scheme working again. pip ate the .[extras] syntax
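nanonyme's caching scheme might look roughly like this (a sketch with a hypothetical `wheelhouse` directory; note the quotes around `.[extras]`, which the shell can otherwise mangle). Building from a local source tree doesn't need `--pre` at all: pip uses the tree as-is whatever its version, while dependencies fetched from PyPI stay release-only by default:

```shell
# Build wheels for the local package (plus its extras) and all of its
# dependencies into a directory that can be cached between CI runs.
pip wheel --wheel-dir wheelhouse '.[extras]'

# Later builds install from the cached wheelhouse instead of PyPI.
pip install --no-index --find-links wheelhouse '.[extras]'
```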
[20:13:18] <tos9> fdv: what functionality are you looking for
[20:13:33] <tos9> fdv: all 3 of those are certainly missing plenty of it :), but what thing are you trying to do
[20:47:17] <fdv> tos9: we've (they've) been using pipenv in the project that I joined a little while back, and now I needed to upgrade a package (pika), so I did pipenv install pika==0.12.0 and, lo and behold, the dependencies of all the other packages were also upgraded to the latest permitted version. Now, this is deployed on systems running python 3.5.2, and some of the upgraded dependencies required python 3.5.3, but they got upgraded anyway, and compatible versions
[20:47:17] <fdv> were of course nowhere to be found in the package repository
[20:47:56] <ngoldbaum> i don't think it's possible in python to have more than one version of the same library in a dependency tree
[20:48:35] <fdv> python was set to 3.5 in the Pipfile, didn't make much of a difference
[20:49:47] <fdv> pipenv install even has an option (--selective-upgrade) that by the looks of it should do what I need (upgrade one package and its deps and no others), but that didn't work
[20:49:51] <ngoldbaum> rust does that, it's pretty neat
[20:52:42] <fdv> "the role of a developer is to make the use of a computer less shitty for the user" :-)
[20:53:00] <fdv> anyhow, after a bit of reading, it seems that this behaviour (pipenv upgrading the world) is by design, and won't change
[20:53:52] <fdv> and frankly, I disagree on that stance, but to each his own, by all means. you make a tool, you get to decide its philosophy
[20:54:39] <fdv> (I want to do controlled upgrades of the packages I need. If I need to upgrade or install a package to fix some issue, I want that to be as atomic as possible)
[20:55:40] <fdv> but anyhow, I set about looking for options, and poetry seemed nice. It also seemingly did a better job at dependency resolution, as it didn't upgrade deps to versions incompatible with my python version.
[20:56:02] <fdv> poetry, however, doesn't believe in system installs and will only play with virtualenvs
[20:56:39] <fdv> which, again, is fine, I didn't make the tool and I'm in no position to complain, but I need that functionality, so I keep looking
[20:58:15] <energizer> fdv: can you explain the situation that causes you to want to manage a system install?
[20:58:25] <fdv> enter pip-tools, which is the closest I've come to a good solution. only downside is that I end up with a finalized requirements.txt that includes dependencies with versions I can't use. pip has a solution for that (--constraint), but pip-tools don't seem to support that
[20:59:25] <ngoldbaum> messing with the system python installation is a great way to have a bad time
[20:59:26] <energizer> you can include constraints in your requirements.in, which pip-compile will process
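A minimal sketch of what energizer is suggesting, using the `aiohttp`/`async-timeout` pair from fdv's case: pin the transitive dependency directly in `requirements.in`, and `pip-compile` will honour the pin when it resolves the full tree.

```
# requirements.in
aiohttp            # the direct dependency
async-timeout<3.0  # transitive, but pinned here so pip-compile honours it
```

One caveat (relevant to the discussion below): unlike pip's `-c` constraints file, listing the package here promotes it to a full requirement, so it ends up installed even if nothing else pulls it in.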
[20:59:31] <fdv> energizer: we deploy to docker images, and those are used in various ways. installing the packages in a virtualenv would require specific knowledge about the platform you deploy on
[20:59:46] <ngoldbaum> ah in docker it's less bad kinda
[20:59:49] <fdv> energizer: can you do that? that's awesome!
[21:00:09] <ngoldbaum> although still easy to fuck up the OS if you aren't careful
[21:05:16] <fdv> to find python in $(pwd)/venv/bin
[21:06:10] <energizer> if you have a version of python that has venv, (this is typical), then `python3 -m venv venv && venv/bin/pip install . && venv/bin/python` will work
[21:06:27] <fdv> my solution so far is using pip-compile to merge a couple of requirements files into one and then use pip --constraint <constraint_file> --requirement merged_requirements.txt
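fdv's workaround, spelled out as a sketch (the file names are hypothetical):

```shell
# Merge several input files into one compiled, fully pinned requirements file.
pip-compile --output-file merged_requirements.txt app.in extra.in

# Install, letting the separate constraints file cap transitive versions.
pip install --constraint constraints.txt --requirement merged_requirements.txt
```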
[21:06:43] <fdv> energizer: sure, but that's a lot of moving parts
[21:06:54] <fdv> and they have to look the same everywhere, or you have more moving parts
[21:09:01] <energizer> what i described, or pex.rtfd.org, are standard practice, and they work for many people in many environments. you can try to blaze a trail, but you don't have to
[21:24:42] <fdv> energizer: currently, we set up the images in one place, and then run applications from elsewhere. as it is today, running the application requires no knowledge about where to find python or the necessary libraries, as they are in standard places
[21:25:08] <energizer> fdv: pex.rtfd.org might be what you want then
[21:25:15] <fdv> so the design is built around docker and that's currently where these dependencies are handled
[21:25:46] <fdv> yes, it *might*, I'm not saying it couldn't work, but it's not really what I want at the moment
[21:26:26] <fdv> we also need these applications to work in other contexts, like in dev, where we *do* use virtualenvs throughout
[21:27:04] <energizer> there's some writing about virtualenv+docker https://hynek.me/articles/virtualenv-lives/
[21:27:04] <fdv> and as I stated, I do lack some knowledge here, and perhaps I would have gone the pex route had I known about that before
[21:27:44] <energizer> there's plenty more options, https://sedimental.org/the_packaging_gradient.html but i believe most of them will be much less familiar to most python developers than virtualenv
[21:29:35] <fdv> and perhaps I still will (thanks for pointing it out), but I really have to do one thing at a time and the Newest Hottest Thing That Solves Everything has never really happened to do that for me (and I've tried my fair share)
[21:29:52] <fdv> so going for something I know and understand makes a lot of sense in the first place
[21:32:08] <fdv> energizer: but you said I could have constraints in requirements.in, do you know if that is documented anywhere?
[21:32:35] <energizer> fdv: it's the same constraint format as in install_requires=
[21:38:58] <fdv> energizer: thanks. a quick round of googling didn't enlighten me much, however. it looks more or less like install_requires isn't that different from just specifying dependencies in requirements.txt?
[21:41:09] <fdv> but with pip's constraint file, I can set a constraint that (as is part of my current issue) async-timeout should be less than 3.0, but I don't have that dependency directly, it's a transitive one from aiohttp (which I do use)
[21:45:42] <fdv> the packages defined in the constraint file aren't installed because they are there, but *if* they are installed, either because they are mentioned in requirements.txt or are dependencies from those, it will adhere to the version requirements from the constraint file
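Concretely, for the async-timeout example above (a sketch; file names and contents are illustrative):

```shell
# A constraint caps a version *if* the package gets installed at all;
# it does not by itself cause the package to be installed.
printf 'async-timeout<3.0\n' > constraints.txt
printf 'aiohttp\n'           > requirements.txt

# async-timeout is pulled in only as a dependency of aiohttp,
# but the constraint keeps it below 3.0.
pip install -c constraints.txt -r requirements.txt
```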
[21:45:45] <energizer> if i want to set a constraint, normally i'd add it to requirements.in that way, but maybe there's a way around doing that. i'm not sure
[21:46:15] <fdv> that's the missing part for me to be happy with pip-tools for this task :)
[21:47:01] <fdv> it would be awesome if pip-compile could take in a constraint file as well (as pip can) to create a final requirements.txt that is actually authoritative
[21:50:39] <prsn> hi! can anyone explain this output to me? https://paste.pound-python.org/show/eiQYp0oMifUuhLYV09jU/
[21:50:55] <prsn> (ignore the xs in the protocol, that was just to get the paste site to let it through)
[21:51:22] <prsn> is this something as stupid as it looks, where pip thinks 11 is less than 9 because it's comparing the first digit only?
[21:53:44] <energizer> prsn: try --no-cache-dir or whatever it's called
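For what it's worth, PEP 440 versions compare numerically per component, not as strings, so a conforming comparison never treats 11 as less than 9. A stdlib-only illustration of the difference (a deliberate simplification that ignores pre/post/dev segments):

```shell
python3 - <<'EOF'
# Naive numeric-tuple parsing: split on dots, compare component by component.
parse = lambda s: tuple(int(p) for p in s.split("."))

print(parse("11.0") > parse("9.0"))  # numeric comparison: True
print("11.0" > "9.0")                # plain string comparison: False
EOF
```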