PMXBOT Log file Viewer


#pypa logs for Sunday the 6th of November, 2016

[09:37:32] <agronholm> dstufft: I just confirmed, the latest twine still uploads to https://pypi.python.org/pypi by default
[09:37:52] <agronholm> unless you meant some development version
[10:35:36] <AlecTaylor> hi
[10:35:45] <AlecTaylor> How do I force virtualenv pip to be used? - http://stackoverflow.com/q/40438089
[11:20:31] <agronholm> AlecTaylor: how about you try this
[11:20:43] <agronholm> virtualenv testenv
[11:20:49] <agronholm> source testenv/bin/activate
[11:21:18] <agronholm> then pip --version
[11:22:03] <AlecTaylor> tech2_ - Just tried in a Docker image with Ubuntu 12.04.5, Python 2.7.10, virtualenv 13.1.2 and same issue. It's always telling me "pip 9.0.0 from /usr/local/lib/python2.7/dist-packages (python 2.7)"
[11:22:44] <AlecTaylor> agronholm: It is using the right pip binary, it's just that it's setting the wrong location which makes things cumbersome
[11:22:56] <agronholm> "setting the wrong location"?
[11:25:12] <agronholm> are you trying to say that virtualenv or pip doesn't work specifically for you?
[11:25:17] <agronholm> (they do work fine here)
[11:35:04] <AlecTaylor> agronholm: They work fine for me, what I'm trying to do is find the install location
[11:35:31] <agronholm> you mean it doesn't install to virtualenv/lib/pythonX.Y/site-packages?
[11:35:36] <AlecTaylor> `from distutils.sysconfig import get_python_lib` doesn't always work, nor does `from pip import __file__ as pip_loc; path.dirname(pip_loc)`
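Neither of the snippets above is fully reliable (`distutils.sysconfig` is deprecated, and `pip.__file__` reflects where pip itself lives, not where it installs). A minimal sketch of asking the running interpreter directly, via the stdlib `sysconfig` module:

```python
import sys
import sysconfig

# "purelib" is the directory where pure-Python packages get installed
# for the *currently running* interpreter. Inside an activated
# virtualenv this points into the virtualenv, not the system Python.
site_packages = sysconfig.get_paths()["purelib"]

print(sys.prefix)      # the environment's root (the virtualenv, if active)
print(site_packages)   # e.g. <prefix>/lib/pythonX.Y/site-packages
```

Because the answer is computed from the interpreter that is actually running, it sidesteps the PYTHONPATH confusion described below.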
[11:45:51] <AlecTaylor> Ahh found the issue, something with PYTHONPATH
[11:45:52] <AlecTaylor> https://groups.google.com/forum/#!topic/comp.lang.python/5lIcq8P2REE
[11:46:10] <AlecTaylor> agronholm: Shouldn't PYTHONPATH be set/upserted by the activation of the virtualenv?
[11:47:59] <agronholm> no
[11:48:14] <agronholm> but PATH should be modified
[11:53:10] <AlecTaylor> Okay well I've taken PYTHONPATH out of my /etc/environment and will restart and see if it works
[12:00:43] <frgtn> Is there a way to make pip install from sdist instead of a wheel for a specific dependency in requirements.txt? What about only installing from sdist for a specific platform (OSX)?
[12:01:36] <apollo13> pip install --no-binary some_dep your_app I think
[12:03:47] <frgtn> apollo13: thanks. Any way to make that platform-specific in requirements.txt?
[12:04:34] <apollo13> no idea, why would you wanna do that though?
[12:04:45] <apollo13> fixing the wheel to work seems like a better option :D
[12:06:08] <frgtn> I'm fixing a build that segfaults on OSX, so want to fix immediate problem first, then go see about fixing a third-party wheel
[12:07:53] <apollo13> https://pip.readthedocs.io/en/stable/reference/pip_install/#requirements-file-format
[12:08:08] <apollo13> but I doubt you can limit that to a platform
[12:08:08] <frgtn> Cheers!
[12:08:25] <apollo13> unless you do like requirements_osx.txt which then includes another requirements file and sets the option
[12:08:34] <apollo13> so you'd have different requirement files for different platforms
[12:08:54] <apollo13> actually you might be able to do it with <requirement specifier> [; markers] [[--option]...]
[12:18:02] <frgtn> looks like the option with that syntax gets specified regardless of the marker
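Since per-requirement options don't appear to be gated by markers, apollo13's layered-files idea is the fallback. A hypothetical sketch (file names and the `some_dep` package are made up; pip does accept `--no-binary` as a global option line in a requirements file):

```
# requirements.txt -- shared dependencies for all platforms
some_dep
other_dep

# requirements_osx.txt -- passed to pip only on OS X builds
--no-binary some_dep
-r requirements.txt
```

The platform selection then happens outside pip, e.g. the OS X CI job runs `pip install -r requirements_osx.txt` while other platforms use the plain file.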
[17:19:19] <dstufft> agronholm: you sure you don't have a ~/.pypirc or something forcing it to pypi.python.org? It uploads to warehouse to me
[17:19:23] <dstufft> for me*
[17:19:35] <agronholm> dstufft: positive
[17:20:10] <agronholm> in fact twine did upload to warehouse when I explicitly set the repository URI
[17:20:22] <agronholm> but when I commented it out, it uploaded to pypi.python.org
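For reference, the kind of `~/.pypirc` override being discussed might look like the following (the URL matches the Warehouse upload endpoint that twine 1.8 links to below; username is a placeholder):

```
[distutils]
index-servers = pypi

[pypi]
repository = https://upload.pypi.org/legacy/
username = your-username
```

With the `repository` line commented out, twine falls back to whatever default its installed version ships with, which is the behavior agronholm is debugging here.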
[17:20:40] <dstufft> https://github.com/pypa/twine/blob/1.8.1/twine/utils.py#L42-L58 hm I am confused
[17:22:34] <agronholm> hm it seems that my globally installed twine was older
[17:22:39] <agronholm> which would explain this
[17:22:50] <agronholm> now I just have to figure out why it happens on Travis
[17:23:03] <agronholm> which installs the latest twine first
[17:26:18] <agronholm> found the culprit: https://github.com/travis-ci/dpl/blob/424c7a61c341f184b8aa66df178d4aca31455a6f/lib/dpl/provider/pypi.rb#L4
[17:28:31] <dstufft> agronholm: ah yea that would do it
[17:28:54] <agronholm> too bad they seem to be really slow about fixing stuff
[17:29:31] <agronholm> the travis tool has a bug that causes "travis setup pypi" to put the "distributions" key in the wrong place and there's been a simple PR to fix it for months
[17:29:44] <agronholm> I've explicitly bugged their support about it but no dice
[17:32:00] <dstufft> :/
[17:32:08] <dstufft> at least it looks like they let you override it?
[17:32:24] <dstufft> just set a PYPI_SERVER environment variable, it looks like
[17:32:26] <agronholm> yeah but I have a couple dozen projects and I'd hate to explicitly override it everywhere
[17:32:53] <dstufft> yea :
[17:32:55] <dstufft> :/ l*
[17:32:58] <dstufft> bleh
[17:33:00] <dstufft> you get what I mean
[17:33:04] <agronholm> yes :)
[17:33:25] <agronholm> ok, so everything is cool on this end at least
[17:33:44] <agronholm> when twine uploaded to the legacy url both on travis and locally, I assumed it was twine's fault
[17:33:57] <agronholm> when it was two separate problems
[17:34:27] <dstufft> agronholm: no problem :] Glad we sorted it out
[17:34:36] <dstufft> or you did anyways, I just linked to some code :D
[17:37:38] <agronholm> dstufft: one more q: if I upload to the legacy url, the project will never show up on warehouse unless indexes are refreshed manually?
[17:38:17] <dstufft> agronholm: Um, it'll show up eventually, stuff falls out of the cache on there more often than on legacy because fewer people are hitting it
[17:38:26] <dstufft> I think the longest we cache anything is for 1 hour
[17:38:28] <dstufft> er
[17:38:29] <dstufft> 1 day
[17:38:48] <agronholm> https://pypi.org/search/?q=fcgiproto
[17:39:04] <agronholm> https://pypi.python.org/pypi/fcgiproto/1.0.2
[17:39:23] <agronholm> 1.0.0 was uploaded in September and it still doesn't show up in Warehouse
[17:39:38] <dstufft> oh the search
[17:40:08] <dstufft> I think indexing is broken right this minute https://github.com/pypa/warehouse/pull/1473
[17:40:55] <agronholm> I see
[17:42:09] <dstufft> agronholm: there, just merged the fix
[17:42:19] <agronholm> *thumbs up*
[22:19:42] <brian_> hi :) is there a guide or any recommendations on how to handle packaging/distribution of a module that has a few different c dependencies? it seems my choices are either build them at setup time or distribute binaries
[22:19:48] <brian_> and i'd really rather not distribute binaries
[22:21:03] <brian_> i would love to just build from source everywhere but i'm not sure how much user frustration that results in
[22:25:06] <brian_> i guess python wheels suggest packaging binaries which i guess would put the onus on me to set up cross compilers and script the build process on my host. but that seems like a huge security risk. what if my host has a trojan i don't know about, and everyone who installs my module gets infected? am i liable if that happens?
[22:26:19] <tdsmith> i am not a lawyer but if the question is "can someone file suit against you" the answer is almost always "probably yes"
[22:26:44] <tdsmith> whether they'll win is a separate concern :p
[22:26:51] <brian_> ok let me rephrase then
[22:27:02] <brian_> running binaries built on joe schmoe's computer is a _really_ bad idea
[22:27:10] <tdsmith> that's what wheels are, yep
[22:27:16] <brian_> is that really the recommended way to go here?
[22:29:31] <nanonyme> brian_, are you using a license that says you can't be held responsible if your program melts the target machine? (I think most opensource licenses basically do that)
[22:29:52] <brian_> sure, i am using bsd
[22:32:22] <brian_> if i converted all my dependencies to use cmake, i guess i could just require users to have it installed before attempting to pip install my thing?
[22:32:34] <brian_> or even attempt to run the cmake installer if it's missing
[22:32:43] <tdsmith> that seems unfriendly
[22:32:52] <brian_> that's what i'm afraid of
[22:33:17] <tdsmith> most packages rely on them already being installed; some additionally ship binaries
[22:36:52] <brian_> heh i wonder if you can reliably run cmake without installing it
[22:44:24] <brian_> thanks for the feedback. ok, so i wasn't missing anything. it sounds like binaries are the way to go
[22:44:52] <brian_> i'm a little sketched out at the idea a few years from now of hypothetically reading the story where i infected my users :P
[22:45:21] <brian_> also i really, really hate cross compiling
[22:47:53] <brian_> oh hey, i can make travis do this? that's way better than running it on my host
[22:48:38] <dstufft> yea travis supports that
[22:49:09] <brian_> i'm way more on board with this idea :D
[22:49:14] <dstufft> PyPI / pip / etc don't really consider the author as an untrusted person. It's assumed that if you're installing from said author they (or their machine if compromised) can do pretty much anything they want to your computer
[22:49:28] <dstufft> it's really really easy to hide malicious code as an author inside of a package
[22:49:47] <brian_> that's true, but at least you *could* still audit python source
[22:51:17] <brian_> as an end user i would be more concerned about this if i were putting packages into production, but it sounds like linux users generally build python packages from source anyway
[22:51:39] <dstufft> pip allows people to opt out of binary packages from PyPI if they want to build from source fwiw
[22:52:06] <brian_> is there anything i need to do on my end to enable that?
[22:52:14] <brian_> i guess the answer is just shipping a source only distribution
[22:52:15] <dstufft> just publish a sdist
[22:52:17] <dstufft> yea
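The two sides of that arrangement, sketched as commands (the package name is hypothetical, and `twine` is assumed for the upload step):

```
# Maintainer: publish a source distribution alongside any wheels
python setup.py sdist
twine upload dist/*

# User: force building from source instead of using a wheel
pip install --no-binary :all: mypackage
```

Nothing extra is needed on the maintainer's side beyond uploading the sdist; the opt-out is entirely the installing user's choice.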
[22:52:30] <brian_> ok cool
[22:52:52] <brian_> i don't remember, does build-essentials usually include cmake? meh whatever if they're building from source they'll figure it out
[22:53:07] <dstufft> it's typical for Python packages that use a C library to just assume the C library is installed, in their sdist's setup.py script. Some projects will optionally try to compile those libraries using a bundled source
[23:01:24] <brian_> i'm not quite sure what that means. the project contains the build steps to compile?
[23:02:48] <brian_> i guess if youre releasing a source dist and binary dists, the source dist doesn't have to be especially robust/work on non-linuxy environments
[23:10:34] <brian_> thanks for all the help everyone. i feel like i'm much more on track now. but wow this is going to be a lot of work :)
[23:12:09] <tdsmith> !logs
[23:12:09] <pmxbot> http://chat-logs.dcpython.org/channel/pypa
[23:24:28] <brian_> is there a blessed set of boilerplate for creating binaries for osx/linux/win? it looks like people are using appveyor over travis, which i had never heard of
[23:31:10] <tdsmith> i think many projects use travis for os x and linux, and appveyor for windows
[23:31:13] <tdsmith> windows ci seems like their niche
[23:32:43] <tdsmith> https://github.com/pypa/manylinux is helpful for linux wheels
[23:34:06] <tdsmith> https://github.com/MacPython/wiki/wiki/Spinning-wheels for os x
[23:43:15] <brian_> ok, sounds like you just take it one platform at a time and try to get everything working
[23:43:29] <brian_> it's kind of staggering how many combinations of python version, os, and cpu type there are
[23:47:08] <tdsmith> arm? what's an arm
[23:50:06] <brian_> heh
[23:50:22] <brian_> i read somewhere about people using python on android and ios, i'm guessing they just manage their own dependency nightmares