[14:51:47] <offlim> when i try to start the server with an ssl connection i see this message “waiting for connections on port 27017 ssl”
[14:52:53] <offlim> so i open a new terminal window and type in “./mongo” but now i'm getting this error “network error while attempting to run command 'isMaster' on host '127.0.0.1:27017'”
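that isMaster network error is typically what a plain-text shell connection gets against a server that requires ssl. a minimal sketch of connecting with the ssl options, where /path/to/ca.pem is a placeholder for the CA certificate that signed the server's cert:

    ./mongo --ssl --sslCAFile /path/to/ca.pem --host 127.0.0.1 --port 27017
    # for a throwaway self-signed test certificate only (skips certificate validation):
    ./mongo --ssl --sslAllowInvalidCertificates --host 127.0.0.1 --port 27017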
[19:09:44] <shayden> i guess you're trying to copy right from one db to the other, over the network? that could be difficult because you have all the networking firewalls, gateways and routing between local and remote, any of which could block you. it's probably easiest to see if you can download a mongodump or mongoexport, copy it locally, then use mongoimport or mongorestore
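a rough sketch of that dump-and-restore route, with the host, credentials and paths as placeholders:

    # run this wherever the source database is reachable
    mongodump --host source-host:27017 -d my-db -u user -p password -o ./dump
    # copy ./dump to the destination machine, then load it into the local server
    mongorestore --host 127.0.0.1:27017 -d my-db ./dump/my-db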
[19:10:23] <netcho> i opened everything on my local machine
[19:10:43] <shayden> i've never used mongolab, so i'm not sure how to get a mongodump or mongoexport from it
[19:13:02] <netcho> makes no sense... when i use mongo ds0xxxxx-a0.mlab.com:xxxxx/my-db -u user -p password i get access to PRIMARY
[19:13:16] <netcho> but mongodump -h ds0xxxxx-a0.mlab.com:xxxxx -d my-db -u user -p password -o /home/ubuntu says auth failed
[19:18:23] <shayden> If you do not specify an authentication database, mongodump assumes that the database specified to export holds the user’s credentials.
[19:19:11] <shayden> maybe your auth database is different than 'my-db' ?
[19:19:47] <shayden> i vaguely remember something like this tripping me up before
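for example, if the user account lives in a different database than the one being dumped (admin here is just a guess), mongodump has to be told explicitly:

    mongodump -h ds0xxxxx-a0.mlab.com:xxxxx -d my-db -u user -p password \
        --authenticationDatabase admin -o /home/ubuntu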
[19:23:11] <netcho> actually it's the same as without --authdb
[19:23:18] <GitGud> hey there, i have a question. so in my app, if someone makes a post on something or a comment somewhere, it goes to that post's db entry and writes to an array of objects. it's all well and good, but what would happen theoretically if 2 people were to put a comment down on the same post at the same time?
[19:23:43] <GitGud> if it gets write locked isn't that bad? is there a way to tell it to wait some time for the write lock to come off? or will the app just get errors?
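for context: if each comment is added with a single $push, MongoDB applies each update to a given document atomically, so two simultaneous comments are simply serialized; both end up in the array, the second write just waits briefly for the first, and neither the writes nor the app see an error. a sketch in the mongo shell, with the collection and field names (posts, comments, user, text) made up for illustration:

    // postId is the _id of the post being commented on
    db.posts.update(
        { _id: postId },
        { $push: { comments: { user: "alice", text: "nice post", ts: new Date() } } }
    )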
[19:23:58] <netcho> dbCopy gave me "errmsg" : "unable to login { ok: 0.0, errmsg: \"auth failed\", code: 18 }"
[19:32:48] <shayden> netcho: hmm, it looks like mlab lets you pull a mongodump through the web portal... that might be easier than guessing at remote authentication
[19:36:50] <netcho> shayden: yeah oly to s3 :) but i will give it a shot
[19:40:08] <AlmightyOatmeal> is there any way to do batch find requests? doing a simple find() over an entire collection gives one result at a time and is becoming painfully slow :(
[19:41:25] <AlmightyOatmeal> i'm pulling data from mongodb and bulk-inserting it into elasticsearch but the collections are anywhere from 15M to 30M documents
[19:46:38] <offlim> When i use “mongod --enableEncryption --encryptionKeyFile data/encryrest/mongodb-keyfile” I keep seeing this error “Unable to retrieve key .system, error: There are existing data files, but no valid keystore could be located.”… the current db doesn't have any data though
[19:47:59] <AlmightyOatmeal> StephenLynx: there is no query, it's literally find({}) -- i'm pushing everything into elasticsearch
[19:48:30] <StephenLynx> is that a one-time operation?
[19:48:34] <AvianFlu> AlmightyOatmeal, I actually found a reference to a parallel collection scan api the other day, hang on and let me try to find it
[19:48:47] <AlmightyOatmeal> StephenLynx: for the time being, yes. this is the initial move.
[19:48:48] <AvianFlu> no guarantees and I've never used it, but I thought of you when I saw it
[19:49:02] <AlmightyOatmeal> AvianFlu: that sounds wonderful :)
[19:49:10] <AvianFlu> yeah I mean check it out before you get excited
[19:49:12] <StephenLynx> I would move in batches and keep track of the latest moved document.
[19:49:17] <AvianFlu> but it sounded like it was for the problem you have
[19:49:33] <offlim> do I need to create a new encrypted database?
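encryption at rest can't be turned on over an existing set of data files, so the usual route is to point the encrypted mongod at a fresh, empty dbpath and mongorestore any old data into it afterwards. a sketch, where /data/encrypted is an arbitrary new directory and the keyfile must be readable only by the mongod user:

    mkdir -p /data/encrypted
    chmod 600 data/encryrest/mongodb-keyfile
    mongod --enableEncryption --encryptionKeyFile data/encryrest/mongodb-keyfile --dbpath /data/encrypted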
[19:49:36] <StephenLynx> I assume that many docs would take way too much RAM
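one way to do that batching from the mongo shell is to page through the collection on _id (which is always indexed) and hand each page to the bulk indexer; the collection name and page size below are placeholders:

    var lastId = ObjectId("000000000000000000000000");  // smallest possible ObjectId
    var page;
    do {
        page = db.mycollection.find({ _id: { $gt: lastId } })
                              .sort({ _id: 1 })
                              .limit(1000)
                              .toArray();
        if (page.length > 0) {
            // feed `page` to the elasticsearch bulk API here
            lastId = page[page.length - 1]._id;
        }
    } while (page.length > 0);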
[19:50:48] <AlmightyOatmeal> AvianFlu: that looks like something i would have liked some time ago :) i'll play around with that and see if that will give me a boost for my large queries and scans :)
[19:51:44] <AvianFlu> yeah I'd never heard of it before, I just stumbled upon it while looking for something mostly unrelated
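for reference, the feature AvianFlu is pointing at appears to be the parallelCollectionScan command (available in roughly the 2.6 to 4.0 range, and on WiredTiger it tends to hand back only a single cursor, so the gain is mostly on MMAPv1). numCursors is an upper bound, and the collection name here is a placeholder:

    var res = db.runCommand({ parallelCollectionScan: "mycollection", numCursors: 4 });
    // res.cursors is an array of cursor documents; iterate each one from its own
    // worker (a getMore loop, or the driver's cursor wrapper) to read the collection in parallel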