[01:26:50] <tos9> guest423: Don't use setup.py (which will use setuptools or distutils) to upload, it sends credentials in plaintext -- use twine instead -- pip install twine then twine upload dist/whatever.1.0.0.tar.gz
[01:37:43] <guest423_> nevermind. I found my solution: http://stackoverflow.com/questions/13387516/authorization-header-missing-in-django-rest-framework-is-apache-to-blame
[08:43:42] <doismellburning> anyone have a preferred way of determining latest version for a given package name?
[08:43:56] <doismellburning> bored of typing https://pypi.python.org/pypi/$packagename into my browser
[09:19:37] <mgedmin> I usually google "$packagename pypi", then click the link, then get a 5-minute timeout and a 503 error (yep, that sporadic bug is still not fixed), then curse and give up
[14:15:42] <nedbat> here at edX we've got a gnarly complicated set of requirements.txt files. We currently have both setuptools and distribute in the mix, but we want to get rid of distribute to have only one. We tried to do that, but a dependency of a dependency is somehow unhappy with the new situation. I'm not even sure how best to begin describing the situation to get help untangling it... :(
[14:16:44] <singingwolfboy> I'm working with nedbat on this
[14:16:56] <singingwolfboy> but neither of us know what the problem is
[14:17:22] <ronny> nedbat: at my workplace we had a similar issue; we made a custom wheel on a devpi server, so we could just skip some setup-time dependencies
[14:17:55] <nedbat> ronny: interesting, we haven't done anything like that. How does it skip the dependencies?
[14:19:59] <ionelmc> nedbat: he probably made an empty wheel with the name of `distribute`
[14:20:11] <ronny> nedbat: we made a local version of it, and devpi will mask out the pypi version if you upload a custom one to a non-volatile index
[14:21:35] <nedbat> ronny: a local version of distribute?
[14:23:39] <ronny> nedbat: a local version of the package that depended on it
[14:23:52] <ronny> nedbat: basically we uploaded everything that was broken in a working version
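The local-override approach ronny describes (and ionelmc's guess about an empty shim) can be sketched as a minimal setup.py for a placeholder package uploaded to a non-volatile devpi index, so it masks the PyPI release. The name, version, and description here are illustrative, not what edX actually used:

```python
# setup.py -- hypothetical empty shim for `distribute`, published to a
# private devpi index so it masks the real PyPI release. The version is
# deliberately high so resolvers prefer it over anything on PyPI.
from setuptools import setup

setup(
    name="distribute",
    version="999.0.0",
    description="Empty placeholder that satisfies requirements on distribute",
    py_modules=[],  # ships no code at all
)
```

With the devpi client, something like `devpi upload` against a non-volatile index would then serve this shim in place of the PyPI package.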
[14:29:06] <singingwolfboy> here's a log of the error we're seeing: https://gist.github.com/singingwolfboy/814c430ab1e7031e767d
[14:29:39] <singingwolfboy> I'm going to work on paring it down to something more minimal -- I need to see which packages are actually interacting here, and which are merely present but not contributing to the problem
[14:32:34] <ionelmc> gotta love how pbr silently changes your dist-info/bdist output (eg: wheels) when you don't request it in setup.py
[14:34:16] <singingwolfboy> I found a much more minimal test case that reproduces the problem: https://gist.github.com/singingwolfboy/7f11ffc5b95b4437ba9d
[14:34:44] <singingwolfboy> basically: pip and setuptools are at their latest versions, Cython is installed. Try to install dm.xmlsec.binding, and it fails because setuptools goes into an infinite loop.
[14:36:17] <ronny> singingwolfboy: a friend of mine told me of a potential issue in setuptools with nested setup_requires; basically the package has a really bad setup_requires
[14:36:28] <ronny> singingwolfboy: there is no easy fix; as a workaround, pip install lxml beforehand
[14:37:23] <singingwolfboy> ronny: I've tried that, but it still recompiles lxml
[14:38:10] <singingwolfboy> ronny: I'm going to try it again, to confirm
[14:38:21] <singingwolfboy> and if it fails the way I think it will, I'll make another gist to show you
[14:41:48] <doismellburning> everything compiles lxml ever in the whole world :(
[14:42:42] <singingwolfboy> interesting -- ronny, looks like you were right, and installing lxml first fixes it. I've updated the gist.
[14:45:48] <ronny> singingwolfboy: it tends to be useful to build wheels of the packages you need and put them on a private devpi
[14:46:58] <nedbat> ronny: this project is open source, and in use by many people around the world, so we're trying to find a fix that doesn't require custom infrastructure.
[14:55:05] <ronny> nedbat: pull requests; does it need lxml as a setup_requires, for example?
[14:55:33] <ronny> nedbat: but yeah, due to setuptools/distutils freedoms, there are a lot of things out there that are packaged really badly
[14:55:53] <nedbat> ronny: why did you say "pull requests"?
[14:56:27] <ronny> nedbat: people having to fix things themselves is always going to be slower than people being provided a fix
[14:56:38] <ronny> and pull request is the common tool for that these days
[14:58:03] <ronny> oh, that package has a bad project infrastructure ^^
[14:58:28] <nedbat> ronny: sorry, i'm not sure what you are talking about. Are you saying we should make pull requests to fix the broken packages?
[14:58:52] <ronny> nedbat: yes, how else do you expect to fix it when you refuse custom infrastructure?
[14:58:59] <ronny> but that package has no place to send patches
[15:00:20] <ronny> ouch, its setup.py is pretty evil
[15:00:39] <nedbat> ronny: sure, we're willing to submit fixes to packages, but we're also looking for ways that we could make the installation succeed while we wait for them to accept fixes.
[15:00:48] <nedbat> ronny: which package are you looking at?
[15:04:23] <nedbat> wow, even before getting to the installation logic, this is just weird python code...
[16:58:24] <DRMacIver> So I'm currently thinking of rebuilding my slightly baroque hypothesis-extra system on top of setuptools's extras_require. Basically I have packages that I want to only be usable if the dependencies are installed with the appropriate versions. a) Is this a good idea? b) Is there a sensible way to query whether the extra dependencies are installed with the right versions?
[16:59:35] <DRMacIver> (i.e. someone has done pip install hypothesis[foo] and I'd like to check that hypothesis[foo] is really available, and only make hypothesis.extra.foo visible if it is)
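The setup.py side of what DRMacIver describes might look like the following fragment; the extra name "foo" comes from his example, but the dependency and its pins are illustrative assumptions, not his real constraints:

```python
# setup.py fragment -- declaring an optional "foo" extra. Running
# `pip install hypothesis[foo]` additionally installs these requirements;
# a plain `pip install hypothesis` does not.
from setuptools import setup

setup(
    name="hypothesis",
    version="1.0",
    extras_require={
        # illustrative pin; the real constraints would come from the project
        "foo": ["somelib>=1.2,<2.0"],
    },
)
```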
[17:03:43] <dstufft> DRMacIver: you can use pkg_resources to see what version of something is available
[17:05:31] <DRMacIver> dstufft: Hm, ok, but if I want to query whether the specific version constraints in an extras_require are satisfied, I have to write the code to do that explicitly?
[17:06:29] <xafer> btw dstufft, did you get a chance to look at https://github.com/pypa/pip/pull/2833 ?
[17:06:41] <dstufft> DRMacIver: I think so, at least I'm not aware of anything to do that automatically
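The explicit check dstufft alludes to could be sketched with pkg_resources, expanding an extra into its concrete requirements and verifying each one against the installed environment. The distribution and extra names passed in are placeholders:

```python
# Sketch: check whether the requirements behind a declared extra are
# satisfied, using pkg_resources. Returns False if the distribution
# itself is missing, the extra is unknown, or any requirement is
# missing or at a conflicting version.
import pkg_resources


def extra_is_available(dist_name, extra):
    try:
        dist = pkg_resources.get_distribution(dist_name)
    except pkg_resources.DistributionNotFound:
        return False
    try:
        # requires() expands the extra into its concrete requirements
        reqs = dist.requires(extras=(extra,))
    except pkg_resources.UnknownExtra:
        return False
    for req in reqs:
        try:
            pkg_resources.require(str(req))
        except (pkg_resources.DistributionNotFound,
                pkg_resources.VersionConflict):
            return False
    return True
```

So hypothesis.extra.foo could be exposed only when `extra_is_available("hypothesis", "foo")` is true, subject to the metadata caveats ionelmc raises below.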
[17:08:12] <dstufft> xafer: it says you tested it to verify it works? (and I assume you tested to make sure it didn't work previously?)
[17:08:18] <ionelmc> dstufft: isn't pkg_resources.get_distribution("pip").version notoriously unreliable? afaik pip still has the problem with leaving multiple dist-info around
[17:10:22] <xafer> dstufft, well I tested it like my comment says: pip list returned the warning but pip install --upgrade didn't
[17:11:00] <dstufft> ionelmc: not that I'm aware of
[17:14:13] <natefoo> dstufft: hey, sorry to bug you again. i mentioned yesterday that i made a PR for linux wheels, not sure if you saw it.
[17:14:29] <dstufft> natefoo: can you link it to me
[17:14:51] <natefoo> also while i'm here, anyone know if it's possible to use wheels as setup_requires dependencies?
[17:16:22] <ionelmc> DRMacIver: imho you should just check the package's __version__/VERSION (if it has an attribute like that) - seems more reliable than pkg_resources to me
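ionelmc's alternative of trusting the module's own version attribute could be sketched like this, assuming the package actually exposes `__version__` (not all do):

```python
# Sketch: read a package's version from its own __version__ attribute
# instead of installed metadata. Returns None if the module is missing
# or does not define __version__.
import importlib


def module_version(name):
    try:
        mod = importlib.import_module(name)
    except ImportError:
        return None
    return getattr(mod, "__version__", None)
```

The trade-off: this imports the package (side effects and all) and only works for packages that follow the `__version__` convention, but it cannot be confused by stale dist-info directories.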
[17:16:48] <dstufft> natefoo: it'd probably be smart to post about this on distutils-sig
[17:17:19] <tomprince> I think the format of platform tags definitely belongs there.
[17:18:12] <DRMacIver> Why is pkg_resources unreliable?
[17:23:32] <ionelmc> DRMacIver: it's easy to have invalid metadata (which trips up pkg_resources) because there isn't any common way to store metadata across all the tools (pip, setuptools, whatever OS package manager)
[17:24:12] <ionelmc> pkg_resources is a nice idea in theory, but in practice it's something way different
[17:27:57] <DRMacIver> hmm. At this point I'm kinda tempted to just go with "document which versions it's expected to work with and let users sort out versioning themselves" as a solution.
[17:28:13] <DRMacIver> (And still include the extras_require but not otherwise validate it)
[17:30:27] <dstufft> pkg_resources is how pip tells what's installed *shrug*
[17:36:49] <dstufft> (and is probably solved by using the utility function xafer just added)
[17:39:11] <dstufft> the problem there is that the version has changed without the pip process restarting, so you sometimes get the old version cached; if you do ``pip list`` afterwards it'll report the correct version because there's no in-process cache happening
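One way to sidestep the in-process cache dstufft describes, sketched under the assumption that re-scanning sys.path is acceptable, is to build a fresh WorkingSet instead of relying on the module-level one that pkg_resources populates at import time:

```python
# Sketch: pkg_resources builds its global working set when it is first
# imported, so after an in-process upgrade the module-level API can
# report a stale version. Constructing a new WorkingSet re-scans
# sys.path and sees the environment as it is now.
import pkg_resources


def fresh_version(dist_name):
    ws = pkg_resources.WorkingSet()  # fresh scan of sys.path
    dist = ws.find(pkg_resources.Requirement.parse(dist_name))
    return dist.version if dist is not None else None
```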
[17:44:43] <ionelmc> also https://github.com/pypa/pip/issues/2958 https://github.com/pypa/pip/issues/2642 https://github.com/pypa/pip/issues/2319 https://github.com/pypa/pip/issues/2235 https://github.com/pypa/pip/issues/1548 https://github.com/pypa/pip/issues/1895
[17:45:54] <ionelmc> it's a terrible mess, the main flaw i see here is that dist-info dirs have versions in the name, but multiple version at the same time are not really supported by pip (well, they are in the sense that you can have orphaned dist-infos around)
[18:01:06] <dstufft> The -e thing is because we don't uninstall versions before we install -e, so you legitimately have two different versions installed. However, pkg_resources functions perfectly fine in that case. The second issue you linked is the same as the other issue you linked; the third issue is similar to the -e thing: you have multiple versions installed and pkg_resources does the right thing again by telling you the version you'll get from an import. It's
[18:01:07] <dstufft> a side effect of easy_install putting eggs first that the newly installed version is shadowed, but you're getting the right data. The fourth and fifth are the same issues as the first, and the final one is a problem with the recorded files being empty for some reason; unsure what it is.
[18:01:28] <dstufft> In all of those cases though, pkg_resources would have reported the version you were going to actually be importing.
[19:49:55] <breakingmatter> I'm having some issues using a `settings.py` file with a project run through `setuptools`. Seems that it can't find the file from the `entry_point` location. Any ideas?
[19:50:53] <breakingmatter> The console_script that is created references a __main__() that calls a function which looks in the local directory for a file called `settings.py`
[19:51:10] <breakingmatter> If I execute the script directly, it works. If I execute the console_script, it fails.
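A likely cause, inferred from the symptoms rather than stated in the chat: a console_script runs with whatever directory the user happens to be in as cwd, so a lookup for a "local" `settings.py` resolves against the wrong place. Anchoring the lookup on the module's own location avoids that; `settings_path` and the layout are illustrative:

```python
# Sketch: resolve settings.py relative to the package's own location
# instead of the current working directory, so the file is found the
# same way whether the module is run directly or via a console_script.
import os


def settings_path(package_dir):
    # Anchor the lookup on the given directory, not os.getcwd()
    return os.path.join(package_dir, "settings.py")

# Inside the package you would typically write something like:
# HERE = os.path.dirname(os.path.abspath(__file__))
# SETTINGS = settings_path(HERE)
```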
[20:11:16] <superfly> ronny: I managed to run pip freeze, and I see this message above the package my other package depends on: ## FIXME: could not find svn URL in dependency_links for this package: