[00:16:35] <__machine> can anyone point me in the right direction… i want to run a private pypi index that will simply point to VCS repos for our internal private library packages… was previously using setuptools dependency_links but support for that is going away in pip 1.6
[00:20:09] <dstufft> __machine: erm, can you try making an html page with <a href="git+https://whatever.git#egg=packagename-version">packagename-version</a>
[00:20:41] <dstufft> __machine: assuming you're using git that is
[00:20:48] <dstufft> hg+https:// or svn+https:// or whatever if you're not
[00:22:08] <__machine> dstufft: yes we are using git… on github, assembla and bitbucket… so i just make one html page and put git+ssh://git@github.com/account/repo.git in the href? i dont need a different html page for every package/version etc?
[00:23:04] <dstufft> __machine: moment, I'm looking to see
[00:25:25] <__machine> dstufft: also i read your threads in google groups, you initiated deprecation of dependency_links support in pip, right? can you tell me if dependency_links is only removed from pip but is still valid for setuptools? im not sure how dependency links worked (if it overrides public index or supplements it)… but i had hoped support could stay as a fallback, even with an additional warning/prompt if needed, as it is very useful
[00:27:14] <dstufft> __machine: pip install --no-index --find-links http://localhost:8000/index.html packaging successfully installed with this index.html http://bpaste.net/show/AQ9ATFVjJTnZvXNLcIgC/
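A minimal version of the index.html dstufft pastes might look like this (the package name, repo URL, and paths below are illustrative placeholders, not the contents of the actual paste):

```shell
# Sketch of a one-page --find-links "index" whose links are VCS URLs.
mkdir -p /tmp/pkgindex
cat > /tmp/pkgindex/index.html <<'EOF'
<html><body>
<a href="git+ssh://git@github.com/account/repo.git@1.2#egg=mypackage-1.2">mypackage-1.2</a>
</body></html>
EOF

# Install against it without ever touching PyPI (commented out here,
# since it needs network access and a real repo):
# pip install --no-index --find-links file:///tmp/pkgindex/index.html mypackage
```

One page can list links for every package and version; pip matches on the `#egg=name-version` fragment.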
[00:28:53] <dstufft> __machine: yes I initiated the deprecation of support for it in pip. I don't have control over what setuptools does so I don't know what it will do (or not do)
[00:29:16] <dstufft> I think as of right now, setuptools still supports it though
[00:30:21] <dstufft> __machine: also long term, we'll likely end up with a better solution to this (in metadata 2.0), dependency_links themselves aren't the problem, the problem is using them on publicly available packages, e.g. something from PyPI shouldn't be able to influence you to use a URL that isn't PyPI
[00:30:53] <dstufft> metadata 2.0 has the concept of a "direct reference", which would let you put a URL directly into the dependency information... but PyPI would reject uploads using that metadata
[00:31:08] <dstufft> so people could privately still use it, but the public part of that isn't a problem anymore
[00:31:47] <dstufft> __machine: does that answer your question?
[00:31:55] <__machine> why is it such a bad thing for a package to be able to bundle an optional extra index or links in this way? being able to pip install --process-dependency-links was easy and useful for private repos… the alternative ends up doing the same thing but forces those doing the installing to find out where to get those repos and someone to maintain a private index that stays in sync with releases
[00:33:36] <__machine> i see… could pip just ignore dependency links for packages that were obtained from pypi, but allow it for packages installed locally or from private indexes?
[00:34:10] <dstufft> __machine: it's bad for it to happen on a public index. The problem is pip doesn't know if it's contacting a public index or a private index and pypi doesn't know that those projects have a dependency_links
[00:34:33] <mugwump> yeah I think you end up with multiple requirements files, and start pip with different options for each
[00:35:10] <dstufft> __machine: I was thinking earlier that it may make sense to revert the removal of --process-dependency-links in 1.6 and wait until metadata 2.0's direct references are implemented to actually remove it, but I hadn't thought that through the whole way yet
[00:37:28] <__machine> dstufft: that would be awesome… as we (and a few others at least who commented on the groups thread) are using dependency links in this way for private packages that are not uploaded to pypi… but i wouldnt know about the negative consequences that you might know of… however if it is not the default, its up to the user doing the install to make the call on if they should trust dependency links or specify their own index
[00:39:44] <dstufft> __machine: eh, the primary negative consequence is just maintenance burden (it was a largely untested feature and required code to handle it) and that if someone was using it on PyPI or another public index, that people might just use --process-dependency-links instead of fixing their projects to not rely on dependency links for public indexes
[00:40:11] <dstufft> __machine: when I say I hadn't thought it through, I don't really mean it requires a ton of thought or anything, the thought popped into my head and I just jotted it down to think about it later :)
[00:42:24] <__machine> i'd be happy with a big fat warning and even an additional prompt every time a dependency is installed from a dependency link… something along the lines of… "are you really sure you want to install from FOO? you really should setup your own private index for privately hosted packages"?
[00:43:39] <__machine> educate people to make the transition themselves instead of simply taking it away and having people learn why dependency links in setup.py are bad because their packages are broken and wont install (and they dont already have a private index etc)
[04:09:51] <Ivo> __machine: you give people a warning and they'll just ask you what option they can set to disable the warning
[04:48:45] <__machine> sure… removing the ability to disable the warning is better than removing the whole feature :)
[04:56:19] <__machine> if my requirements includes -e . to install the current git clone… when i freeze requirements it is changed to a full git URL and commit hash… can i override that by having two requirements files… one having -e . and -r req-frozen.txt … and req-frozen.txt having the actual output from pip freeze?
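Spelled out, the two-file layout __machine is asking about would look something like this (a sketch only; the file names follow the message, and whether `pip freeze` output needs the project's own entry stripped from the frozen file is exactly the open question):

```shell
# Hypothetical split: requirements.txt keeps the editable local checkout
# and delegates everything pinned to a second file.
cd "$(mktemp -d)"
cat > requirements.txt <<'EOF'
-e .
-r req-frozen.txt
EOF

# req-frozen.txt would be regenerated with something along the lines of:
#   pip freeze | grep -v '^-e' > req-frozen.txt
cat requirements.txt
```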
[05:11:15] <__machine> dstufft: when i create my own index.html and point to git repo… it will grab the latest commit on the named branch (which is what i want) but when i freeze, those packages were not installed as editable so they are frozen with the version number from setup.py which doesnt always change (e.g. pre-1.8 during development)… is there any way to automatically install the latest dependencies from git repos recursively and freeze them?
[05:22:29] <dstufft> __machine: hm, i'm not sure. I'd have to mess with that more to figure it out
[05:26:42] <__machine> there's no way to specify an editable requirement in setup.py is there?
[05:35:54] <__machine> seems like with either dependency_links or a private index.html and --find-links … i can install from git but it will be frozen as packagename==version … which when deployed elsewhere will probably get me a different commit for in development packages unless i change the version number every single commit AND update index.html
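For illustration (an invented example, not captured from a real run), the difference __machine describes shows up in `pip freeze` output roughly like this:

```shell
# Invented contrast: pip freeze keeps the VCS URL and commit hash only for
# editable installs; a plain install from git freezes to setup.py's version.
cat > /tmp/freeze-example.txt <<'EOF'
-e git+ssh://git@github.com/account/repo.git@0123abcd#egg=mypackage
mypackage==1.8.dev0
EOF
cat /tmp/freeze-example.txt
```

The first line pins an exact commit; the second will silently resolve to whatever commit the branch points at when someone reinstalls.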
[05:40:44] <dstufft> __machine: yea, setup.py doesn't support editable at all
[05:41:12] <dstufft> __machine: you might just want to create a tarball in a post commit hook and publish it
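One way to wire up the tarball-per-commit idea dstufft mentions (entirely a sketch: the hook body, repo path, and publish directory are placeholders):

```shell
# Hypothetical git post-commit hook: build an sdist after every commit and
# copy it into a directory served as a --find-links index.
REPO=/tmp/demo-repo
mkdir -p "$REPO/.git/hooks"   # a real clone already has .git/hooks
cat > "$REPO/.git/hooks/post-commit" <<'EOF'
#!/bin/sh
python setup.py sdist
cp dist/*.tar.gz /var/www/pkgindex/   # placeholder publish location
EOF
chmod +x "$REPO/.git/hooks/post-commit"
```

For the version to change per commit, setup.py would still need to derive it from something like the commit count or a dev-version scheme.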
[05:42:23] <__machine> and requirements.txt doesnt support recursive/automatic dependency handling… so im screwed… ill need to maintain in comments in requirements.txt a list of all the requirements for each app that might be used by a project… so people can uncomment the app they want and all its dependencies when creating a new project from a template…
[05:43:00] <dstufft> __machine: I'm confused what you're trying to do here
[05:45:14] <__machine> my company has several private repos for utility apps that sometimes have dependencies on each other… and we have a new project template which includes (commented out) the available private utility apps that a developer might want to use on that project, either in requirements.txt or setup.py…
[05:45:43] <__machine> i want the developer to be able to uncomment the core private library apps that they actually want to use in their project… and have pip/setuptools automatically install the required dependencies of those library apps
[05:46:06] <__machine> then the developer can pip freeze > requirements.txt and when deploying to other environments we can do pip install -r requirements.txt --no-deps
[05:47:01] <dstufft> forgive me, as I've had dental work and i'm loaded on pain killers, but pip freeze doesn't work with dependency_links either like you want does it?
[05:48:15] <__machine> no… dependency_links would just be used for the initial install (with unpinned abstract requirements, via setup.py or via requirements-unpinned.txt which is tracking @master for our private utility repos, for example)
[05:48:36] <dstufft> ok, just making sure I'm groking the problem
[05:49:07] <dstufft> so yea, pip freeze loses information, there's a half done branch that I did like 2 years ago that could have fixed that
[05:49:30] <dstufft> there's really not much to help you there :(
[05:49:41] <__machine> then pip freeze should record all of the packages that were installed by the recursive dependency handling of pip/setuptools… but because these utility repos are all git, not public released, they often dont have actual "releases" where the version number changes… i just want to freeze it at a commit
[05:50:21] <dstufft> yea, we don't record that information currently
[05:50:27] <dstufft> so pip freeze has nowhere to pull it from
[05:50:27] <__machine> but to get recursive dependency handling in pip/setuptools, i end up with packages installed into site-packages (not git clones, even though it cloned from git to get the files)… so when it is frozen, it always has the same version number and not a commit hash
[05:50:49] <dstufft> yea, you need -e to make it an editable
[05:50:55] <dstufft> you can install from git without an editable
[05:51:01] <__machine> so as far as i can tell… im screwed… and i need to ask developers to manually uncomment all of the nested/recursive dependencies of every utility app they want to use in a project… right?
[05:51:49] <dstufft> that, or setup some automation so that you generate a "release" for each commit/push/whatever
[05:51:57] <dstufft> sorry I don't have a better answer
[05:54:42] <__machine> is this something that pip 2 might help with?
[05:55:38] <dstufft> this is something that could be handled in pip yes, you should file a bug with what you're trying to do
[13:06:08] <rere> can i use pip with the python which doesnt have _ctypes? I get an error at the moment: 'ImportError: No module named _ctypes' while starting pip: $ pip -V
[13:14:39] <rere> apollo13: I would probably be able to recompile whole python but at this moment that would be a pain. I dont use any sophisticated features of pip, maybe I can use older version?
[13:14:56] <apollo13> hmm, dunno my aix python has ctypes (at least one of them)
[13:15:02] <apollo13> can you provide a full traceback?
[13:15:08] <apollo13> what is trying to import ctypes
[14:43:29] <apollo13> Ivo: yeah, PR against pip ;)
[14:44:10] <apollo13> although it might be a nice wrapper around it for linux and windows support, but the page seems to suggest it's mainly for windows
[14:47:39] <dstufft> it has some utilities for linux
[14:48:37] <dstufft> mostly in that it has constants defined for the various colors and such
[14:49:23] <dstufft> it shouldn't be hard to change that to use constants defined in pip.log instead of in colorama so that colorama is only needed on windows
[23:59:34] <__machine> dstufft: is this something that might be integrated into pip itself in the future? https://github.com/prezi/snakebasket http://engineering.prezi.com/blog/2013/04/19/snakebasket/ — basically, a drop in replacement for pip that will recurse into git deps and install the requirements.txt for the dep before installing the dep… sounds like this might solve my problem…