[02:15:46] <_habnabit> is there anything that actually uses requirements.txt? i was rereading https://caremad.io/2013/07/setup-vs-requirement/ and i realized that i'm not sure what the point of having requirements.txt is
[02:39:27] <glyph> _habnabit: that is kind of how I feel whenever I read it :)
[03:37:31] <Alex_Gaynor> To install all the packages in requirements.txt
[03:37:52] <_habnabit> Alex_Gaynor, yes, but why via requirements.txt instead of, say, `pip install -e .`
[03:38:39] <Alex_Gaynor> "pip install -e ." only installs things in install_requires. Many things I work on are not packaged as a python package. Further, even ones that are often have optional or non-installation dependencies that are listed elsewhere
[03:40:24] <_habnabit> Alex_Gaynor, if it is packaged as a python package, why not just make the optional dependencies extras?
[03:40:57] <Alex_Gaynor> because there's no end user purpose for them, e.g. linters or documentation tools
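The split Alex_Gaynor is describing might be sketched like this (all names and versions here are invented for illustration): end-user optional features go in `extras_require`, while dev-only tools with no end-user purpose live outside setup.py entirely.

```python
# Hypothetical setup.py sketch of the split being discussed.
from setuptools import setup, find_packages

setup(
    name="myapp",
    version="1.0",
    packages=find_packages(),
    # abstract dependencies every end user needs
    install_requires=["requests>=2.0,<3.0"],
    # optional end-user features, installable as: pip install myapp[redis]
    extras_require={"redis": ["redis>=2.10"]},
)

# Linters and documentation tools have no end-user purpose, so they go
# in a separate file instead, e.g. requirements-dev.txt:
#   flake8
#   sphinx
```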
[03:46:38] <_habnabit> i can't decide if i agree or not re: whether setup.py shouldn't contain development dependencies
[03:49:58] <_habnabit> somewhat unrelatedly, is the expectation that python applications aren't going to be packaged as wheels or whatever as part of deployment? the blog post seems to say that even applications shouldn't have pinned versions in install_requires
[05:16:05] <dstufft> _habnabit: I use requirements.txt as a deployment file
[05:16:30] <dstufft> "here are the exact things I want to install to deploy this thing"
[05:16:42] <dstufft> where deployment might also equal develop
[05:17:48] <_habnabit> dstufft, yeah, but right now my deployment is 'pip install -U' from an internal pypi, so there's no requirements.txt to read from
[05:20:09] <dstufft> _habnabit: that works if your internal PyPI only has the versions you want to install from (or the latest version is only the version you want to install from)
[05:20:55] <_habnabit> dstufft, right, neither of those are true, so i have the install_requires contain pinned versions
[05:22:04] <dstufft> _habnabit: unless you're pinning transitive dependencies in your top level setup.py you're only pinning some of the things
[05:22:24] <dstufft> well then you're just using setup.py as a requirements.txt
[05:23:07] <dstufft> the difference can be super important if you don't control the thing you're deploying
[05:23:45] <dstufft> Like I deploy devpi and use requirements.txt to pin the specific versions of dependencies
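A deployment file in dstufft's sense pins everything, top-level and transitive, with exact `==` pins. A sketch (the file contents and version numbers are hypothetical; the check assumes the `packaging` library, which ships alongside pip/setuptools in most environments):

```python
# Sketch of a fully pinned deployment manifest and a sanity check on it.
from packaging.requirements import Requirement

DEPLOY_REQUIREMENTS = """\
devpi-server==2.1.2
py==1.4.26
itsdangerous==0.24
"""

for line in DEPLOY_REQUIREMENTS.splitlines():
    req = Requirement(line)
    ops = {spec.operator for spec in req.specifier}
    # every dep, including transitive ones, is pinned to one exact version
    assert ops == {"=="}, f"{req.name} is not exactly pinned"
```

In practice such a file is usually generated with `pip freeze` after a known-good install, then deployed with `pip install -r requirements.txt`.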
[05:25:14] <_habnabit> dstufft, ok, but for applications, is that really a problem?
[05:25:42] <dstufft> _habnabit: absolutely! Things changing out from underneath is often a cause of confusing bugs
[05:25:56] <_habnabit> dstufft, what would change?
[05:26:22] <dstufft> _habnabit: if a new version of X was released and you hadn't tested it with your app deployment
[05:26:33] <_habnabit> dstufft, but the versions are all pinned
[05:26:47] <dstufft> _habnabit: how are they pinned if you don't have control over setup.py
[05:26:59] <_habnabit> dstufft, i do have control over setup.py; i'm developing the application
[05:27:08] <dstufft> _habnabit: sure, in that situation you can do that
[05:27:12] <dstufft> but that's not every situation
[05:28:12] <dstufft> should I need to fork devpi and change its setup.py to pin every single dep, including transitive deps, just so I can get repeatable deployments?
[05:28:49] <_habnabit> it seems like this depends entirely on how deployments are done
[05:29:09] <dstufft> _habnabit: even in your situation though, there is value between differentiating between "the range of versions that I expect this thing to work with" and "the exact version I want to install right now"
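dstufft's distinction between "the range I expect this to work with" and "the exact version to install right now" can be made concrete with the `packaging` library (the specifiers and version numbers here are made up for illustration):

```python
from packaging.specifiers import SpecifierSet

# setup.py install_requires: the range of versions you expect to work
abstract = SpecifierSet(">=2.2,<3.0")
# requirements.txt: the one version you tested and want deployed right now
concrete = SpecifierSet("==2.4.2")

assert "2.4.2" in abstract and "2.4.2" in concrete
# a new upstream release still satisfies the range but not the deploy pin,
# so an untested version can't sneak into a deployment
assert "2.5.0" in abstract and "2.5.0" not in concrete
```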
[07:21:12] <sontek> I think the biggest issue for most people is that requirements.txt is shipped in the code repo (like git) but people rarely deploy from that, they deploy from an sdist/private devpi server
[07:21:35] <sontek> So the requirements.txt isn't even accessible, so you end up having to pin your reqs in the setup.py for repeatable deploys
[07:22:25] <sontek> We have people in our company who read that blog post and tried to make it work by doing things like reading requirements.txt from setup.py, but in the end it just doesn't work
[07:23:17] <sontek> If you could include requirements.txt via MANIFEST.in and have pip grab it and use it then we would be in business
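The workaround sontek mentions usually looks something like the sketch below (a hypothetical helper, not anyone's actual code), and the comment marks exactly where it breaks: when installing from an sdist or a devpi server, requirements.txt may simply not be in the archive, so the pins silently vanish.

```python
# Hypothetical "read pins from requirements.txt in setup.py" pattern.
import os

def parse_requirements(path="requirements.txt"):
    if not os.path.exists(path):  # typical inside an sdist/devpi install
        return []                 # pins are lost; deploys stop being repeatable
    with open(path) as f:
        return [ln.strip() for ln in f
                if ln.strip() and not ln.lstrip().startswith("#")]

# then in setup.py: setup(..., install_requires=parse_requirements())
```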
[07:44:13] <sontek> I have more questions about PEP440: "The version specified ('1.0.342-develop') is an invalid version,"
[07:44:55] <sontek> Previously, this was a "pre release" version
[08:05:46] <dstufft> sontek: oh, we didn't add a normalization for "develop"
[08:06:27] <dstufft> you want something like 1.0.342-dev
[08:06:41] <dstufft> the normalized form is 1.0.342.dev0
[08:07:50] <sontek> ahh, we generate our pre-release versions via git branches, so that was released from `develop`
[08:08:27] <sontek> We also have things like 1.0.342-FEATURE_THAT_ISNT_FINISHED; will those be rejected as well?
[08:09:09] <dstufft> sontek: yea, but you can use local versions for that if you want, probably you want something like 1.0.342.dev0+feature.that.isnt.finished
[08:10:19] <sontek> and local versions won't be picked up by normal installs like `pip install app>=1.0.342`?
[08:13:59] <dstufft> local versions don't have an effect on that, you'd want to add the .dev0 part to mark it as a pre-release
[08:14:08] <dstufft> local versions are just non semantic extra info
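What dstufft describes can be checked against the `packaging` library, the reference implementation of PEP 440 (assuming it is installed; it usually ships alongside pip/setuptools):

```python
from packaging.version import Version, InvalidVersion

# "-dev" is a recognized pre-release spelling; it normalizes to ".dev0"
assert str(Version("1.0.342-dev")) == "1.0.342.dev0"
assert Version("1.0.342-dev").is_prerelease

# an arbitrary branch name like "develop" is not, so it gets rejected
try:
    Version("1.0.342-develop")
    raise AssertionError("should have been rejected")
except InvalidVersion:
    pass

# a local version carries the branch name as non-semantic extra info;
# it's the .dev0 part that marks the release as a pre-release
v = Version("1.0.342.dev0+feature.that.isnt.finished")
assert v.local == "feature.that.isnt.finished"
assert v.is_prerelease
```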
[08:18:03] <sontek> This is going to be a rough transition
[08:19:17] <sontek> 100+ developers, 50+ services/libraries, and our CI tools are all using the -develop branch to mark pre-releases, since pip 1.4.1 came out and stopped installing them by default
[08:19:47] <sontek> and we heavily use OR logic in our pins, for stuff like python-dateutil<2.0.0,>2.2
[08:20:23] <sontek> Because python-dateutil before 2.0 was Python 2.7 only, and 2.0 and 2.1 were Python 3 only, but then 2.2 came out and supported a single code base
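For python-version-dependent pins like the dateutil case, the PEP 440-era alternative to OR-style version logic is environment markers (PEP 508), where the selection happens on the interpreter rather than in the version specifier. A sketch using the `packaging` library (assumed installed; the requirements lines in the comment are illustrative):

```python
from packaging.markers import Marker

# requirements lines like these select by interpreter instead of OR-ing pins:
#   python-dateutil<2.0 ; python_version < "3"
#   python-dateutil>=2.2 ; python_version >= "3"
m = Marker('python_version < "3"')
assert m.evaluate({"python_version": "2.7"}) is True
assert m.evaluate({"python_version": "3.4"}) is False
```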
[08:24:04] <sontek> Just pinned our ansible scripts to pip 1.5.6 and virtualenv 1.11.6, moving to PEP440 is not going to be a simple task. I'm going to have to release new versions of tons of packages and do them in pieces so that I can update dependent package pins as I go
[08:24:24] <sontek> and change our CI tools to generate the new versions with .dev0 on there
[18:21:19] <yoh> Hi! I wonder if it is feasible to ask for pip to be made "compatible" with Debian Python builds where SSLv3 was disabled; right now "stock" pip fails with AttributeError: 'module' object has no attribute 'PROTOCOL_SSLv3' (see http://nipy.bic.berkeley.edu/builders/seaborn-py2.x-sid-sparc/builds/85/steps/shell_2/logs/stdio) or should I check with the urllib3 folks?
[18:30:51] <yoh> actually -- urllib3 already handled this in https://pypi.python.org/pypi/urllib3/1.10 but installed pip carries urllib3 of unknown ('dev') version which I guess wasn't patched yet
[18:32:21] <yoh> ah -- because that is also bundled within requests which you bundle within pip ... so I guess I now need to look into requests...
[18:44:53] <yoh> ok -- nevermind, the issue seems to have been resolved just recently and I was digging in the mud blindly instead of checking recent builds ;)
[19:00:37] <dstufft> yoh: it should be fixed in pip 6 I think