#pypa logs for Tuesday the 11th of June, 2019

[10:30:51] <AlexisBRENON> Hi all. I am currently facing the problem of install_requires vs requirements.txt. I read https://packaging.python.org/discussions/install-requires-vs-requirements/ and https://caremad.io/posts/2013/07/setup-vs-requirement/ but cannot embrace this philosophy yet. I would like some feedback to choose the best pattern.
[10:31:33] <AlexisBRENON> I am developing a server and provide two entry points: serve and client.
[10:32:03] <AlexisBRENON> The first one is the nominal use case, the latter is for test/demo purposes
[10:32:56] <AlexisBRENON> This server relies on a package that is published on a private repo.
[10:33:30] <AlexisBRENON> Finally, this "application" (my server) will be published as a Docker image
[10:34:49] <AlexisBRENON> Currently, to build the image, I package my application (python setup.py sdist), copy the archive into the Docker image and run pip install server.tar.gz to install it.
[10:37:20] <AlexisBRENON> Before that, I have to copy requirements.txt and run pip install -r requirements.txt to pull in the private dependency.
[10:39:07] <AlexisBRENON> In this case, I do not need wide distribution, so I see no advantage in "abstract" dependencies. But it is not very clean to copy both the archive and requirements.txt into the Docker image and install them separately.
[10:41:10] <AlexisBRENON> In your opinion, which is the best pattern? Stay as is? Use "-e ." in requirements.txt and not package anything (which requires copying a whole folder into the image)? Or use the dependency_links anti-pattern and drop requirements.txt?
[12:41:31] <tos9> AlexisBRENON: Stay as is, ship fat things to your server.
[12:42:32] <tos9> AlexisBRENON: Use `pip download -r requirements.txt .`, and ship that (entire thing) to your server, install with `--no-index`, and have those deps in your install_requires.
[12:48:47] <AlexisBRENON> tos9: The pip download command will download all required wheels. Then I copy them to my Docker daemon and install my own package?
[12:49:11] <tos9> AlexisBRENON: yep, including your own
[12:49:21] <tos9> AlexisBRENON: But now you've made sure ahead of time that the versions you're going to install are the ones you want
[12:50:13] <AlexisBRENON> That's a lot of things to pass to my Docker daemon: all the downloaded packages plus my own package
[12:50:46] <tos9> AlexisBRENON: Well sure, that's what volumes are for.
[12:54:11] <AlexisBRENON> Volumes are used with an existing image. Here I need all these dependencies at build time, to build a docker image running my server
[13:00:24] <tos9> AlexisBRENON: You'd build your image once, and onbuild install the new versions
[13:00:36] <tos9> AlexisBRENON: But it doesn't matter, you can use whatever crazy parts of docker you like :)
[13:00:42] <tos9> Including yes, just copying the whole thing
[13:00:47] <tos9> (at image build)
[15:11:16] <exarkun> is there something written up about best practices for pinning
[15:12:47] <exarkun> one of the obvious approaches, `pip install -r requirements.txt`, is pretty unmaintainable since it doesn't guarantee all dependencies are actually pinned, and if you're careless when you bump versions you'll probably introduce new unpinned dependencies
[15:13:19] <dstufft> exarkun: Warehouse uses pip-tools and it works ok. I don't know of anything written up
[15:35:56] <exarkun> thanks, pip-tools looks useful
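A rough sketch of the pip-tools workflow, under the assumption that top-level dependencies live in a requirements.in file (the tool's usual convention):

```
pip install pip-tools

# requirements.in lists only the top-level, unpinned dependencies
# (e.g. a line that just says "requests"); pip-compile resolves the
# full tree and writes exact pins to requirements.txt.
pip-compile requirements.in

# pip-sync makes the current virtualenv match requirements.txt exactly,
# removing anything that is not listed there.
pip-sync requirements.txt
```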
[15:49:27] <dstufft> exarkun: one trick you can do to make sure that you don't get any unpinned dependencies is use hashes
[15:49:47] <exarkun> I don't know that trick. What kind of hashes do you use, and where?
[15:49:56] <dstufft> https://github.com/pypa/warehouse/blob/master/requirements/main.txt like that
[15:50:13] <dstufft> pip enforces the invariant that if one project has hashes listed for it, ALL of them must have hashes
[15:50:20] <dstufft> e.g. it's an all or nothing switch
[15:50:37] <dstufft> so if somehow something slips through, pip will error out because it doesn't know any hashes
[15:50:42] <exarkun> interesting
[15:56:53] <dstufft> exarkun: the downside of course is that it makes your requirements.txt much longer, and it requires more work to update, since you have to discover all the hashes for a project, and currently, if someone uploads a new wheel or something that pip prefers, your build will break because you won't have the new hash in the list
[15:57:20] <dstufft> there's a PR (might have been merged recently? I'm not sure) that will make pip skip files it doesn't know hashes for which will make that final case better
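For illustration, a hedged sketch of what a hash-pinned setup looks like; the package, version, and digest in the comment are made up, not taken from the Warehouse file linked above:

```
# pip-tools can produce the hashes while compiling:
pip-compile --generate-hashes requirements.in

# The resulting requirements.txt pins each entry with one or more --hash
# options, roughly like this (placeholder digest):
#   alembic==1.0.10 \
#       --hash=sha256:0000000000000000000000000000000000000000000000000000000000000000
#
# Hash-checking mode switches on as soon as any requirement carries a hash,
# and can also be forced explicitly:
pip install --require-hashes -r requirements.txt
```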
[16:15:37] <toad_polo> exarkun: Also may be worth noting that you can give pip a constraints file.
[16:17:14] <toad_polo> So you can have an unpinned requirements.txt file and then pip install that into a fresh venv, then `pip freeze > constraints.txt` to get pins for your dependencies (or whatever you do, hashes, whatever)
[16:18:04] <toad_polo> It at least makes it easier to separate conceptual dependencies from "this is the environment I tested this in"
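A minimal sketch of that split; the throwaway venv path is an arbitrary choice for this sketch:

```
# Resolve the unpinned requirements once in a fresh virtualenv, then freeze
# the result as the constraints file.
python -m venv /tmp/resolve-env
/tmp/resolve-env/bin/pip install -r requirements.txt
/tmp/resolve-env/bin/pip freeze > constraints.txt

# Later installs keep requirements.txt conceptual but stay inside the pins:
pip install -r requirements.txt -c constraints.txt
```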
[16:35:19] <ychaouche> hello #pypa, I need help figuring out why I can't update my selenium python package. Here's the log output if anyone can figure out what's going on and how I can fix it: https://gist.githubusercontent.com/ychaouche/c88a247913e9cbaa29a6cbdd40fd013d/raw/653d7ef3a6f0d33005ff3ca8fcd20df851c37e3e/pip.log
[16:54:17] <ngoldbaum> ychaouche: are you on debian or ubuntu? if so are you using the system pip package?
[16:54:41] <ngoldbaum> ychaouche: what pip version is this? I bet it's old...
[16:57:50] <ychaouche> ngoldbaum, I'm on an old Mint PC and I frankly didn't check which version of pip that was, but even upgrading pip with pip itself fails (pip install -U pip)
[16:57:53] <ychaouche> will post version
[16:58:25] <ychaouche> I have one pip that is 1.5.4
[16:58:29] <ychaouche> in /usr/bin/
[16:58:33] <ychaouche> that's the system one
[16:58:53] <ychaouche> wow, the other is 19.1.1, from pyenv
[16:59:11] <ychaouche> yeah, big difference :)
[16:59:29] <ychaouche> the pyenv pip could install selenium after a couple of tries
[16:59:49] <ychaouche> but I still have the same problem with selenium, maybe I need to update the geckodriver now
[17:01:41] <toad_polo> ychaouche: It's pretty dangerous to use pip to manage your system dependencies.
[17:01:50] <toad_polo> It can cause all kinds of problems.
[17:02:40] <toad_polo> I generally reserve the system Python for the system and let apt/pacman/yum/brew/etc manage it.
[17:04:53] <toad_polo> I find it best to use some user-space installation of Python for my own personal use, either something lightweight like a virtualenv (see virtualenvwrapper) or something heavier but more thorough like pyenv or Anaconda.
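For example, something along these lines leaves the distro's pip alone; the venv path is an arbitrary choice for this sketch:

```
# Keep /usr/bin/pip (the system 1.5.4 above) untouched; create a user-owned
# virtualenv and do all upgrades inside it.
python3 -m venv ~/.venvs/selenium
~/.venvs/selenium/bin/pip install --upgrade pip selenium
```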