[00:30:30] <justinsf> Hello, is there any way to speed up creating indexes after running mongoimport? All of our indexes have { background: true }. Our server has 16 CPUs, but I noticed only one CPU is being used to index. Is there any way to make creating background indexes multi-threaded?
[00:33:37] <Doyle> justinsf, indexing during initial sync drives me nuts. Same behavior: get a DB, index it, get a DB, index it, seemingly single-threaded
[00:34:08] <Doyle> I know there was supposed to be an upgrade to initial sync between 2.6.? and 2.6.7 to enable 32 threads of replication, but I don't see it.
[00:34:19] <justinsf> Like I think this literally could take hours to create the indexes.
[00:39:24] <justinsf> from the doc: " The background index operation uses an incremental approach that is slower than the normal “foreground” index builds. If the index is larger than the available RAM, then the incremental process can take much longer than the foreground build."
[00:40:43] <Doyle> That's just the index though, not the working set for the index, right?
[00:40:51] <Doyle> The index should be tiny most of the time
[00:41:09] <justinsf> Yeah I mean I have a ton of free memory on the server
[00:41:20] <justinsf> yet it's still only using 1 CPU, pegged at 100%
[00:42:58] <Doyle> My data's just silly I guess. Is 20M objects a lot?
[00:44:15] <justinsf> Our mongodump is around 6 GB
[00:44:34] <Doyle> that should wrap in seconds then
[00:45:07] <justinsf> how do I tell what is going on? Are there commands I can run in the mongo shell?
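One way to watch a build from the mongo shell is db.currentOp(): long-running index builds report progress in their "msg" field. A sketch (requires a live mongod, so not tested here; any collection names in the output are your own):

```javascript
// List in-progress operations and keep only index builds; background
// builds show a "msg" like "Index Build (background) ... 1234567/20000000 6%"
db.currentOp().inprog.filter(function (op) {
    return op.msg && op.msg.indexOf("Index Build") === 0;
})
```

The progress counter gives a rough ETA; db.stats() alongside it shows data and index sizes for the database being built.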
[02:47:29] <quasiben> I'm working with an older version of mongo: 2.5 or 2.6? In any case, I'm helping to develop a web application to store 3rd-party auth information and was looking for guidance
[02:48:42] <quasiben> So UserA has login credentials to the site, and they store credentials for DigitalOcean or some other service. Are there recommendations on how to store those 3rd-party creds in a way which is secure and can only be read by UserA?
[03:14:44] <sellout> Well, this was a surprise, `isNumber(NumberInt(0)) == false`
[14:15:15] <deathanchor> what's the operator to query that the array of a field is not empty?
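There is no single operator for this, but two common filters do the job. A sketch, assuming the array field is called `tags` (a hypothetical name):

```javascript
// Matches documents where "tags" exists and is not an empty array.
var query = { tags: { $exists: true, $not: { $size: 0 } } };

// Same idea, stated as "the array has a first element".
var altQuery = { 'tags.0': { $exists: true } };
```

Caveat: the first form also matches documents where `tags` is a non-array scalar, since any non-array trivially fails `$size: 0`; the second form only matches when element 0 exists.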
[14:16:53] <mbwe> doc_tuna: say I have a document like {name: "mbwe", age: 12}, and I update that document with exactly the same content; I don't want that update to go through
[14:18:30] <doc_tuna> you should keep track of whether it needs writing or not yourself
[14:19:31] <doc_tuna> it's a no-op if you send it with the same contents, but you are still wasting the database's CPU and network by sending the update; it would be better not to send it
[14:27:34] <mbwe> I can't, because those documents get updated by different apps, doc_tuna
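When the writers can't coordinate, one server-side pattern is to fold the new values, negated, into the update's filter, so an identical update matches nothing and writes nothing. A mongo-shell sketch using mbwe's example document (collection name and `someId` are placeholders; not run against a live server):

```javascript
// Only matches when at least one field differs from the new values,
// so sending the same content again is a no-op on disk.
db.people.update(
    { _id: someId, $or: [ { name: { $ne: "mbwe" } }, { age: { $ne: 12 } } ] },
    { $set: { name: "mbwe", age: 12 } }
)
```

Note this doesn't address doc_tuna's point: the update is still sent over the network and evaluated by the server; it only avoids the write itself.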
[20:11:01] <cheeser> i'd look at sqoop or flume which already do something like this.
[20:16:30] <terminal_echo> maybe, but the export CSV is streaming, the rsync is streaming, and the bulk insert is quite fast..
[20:29:38] <Kamuela> can any basic mongodb instance be connected to with mongodb:// ?
[21:11:32] <saml> what's the downside of using a string as _id? Most of my queries will be db.docs.find({_id: <url> })
[21:12:06] <saml> or I can leave _id alone and create a new field, url, which has a unique index. Not sure what's better
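Either layout works; a mongo-shell sketch of the two options (collection name from the question, the URLs and fields hypothetical; not run against a live server):

```javascript
// Option 1: the URL is the primary key. One index, and _id lookups are
// the fastest path; but _id is immutable, so a URL change means
// delete-and-reinsert, and long strings bloat the _id index.
db.docs.insert({ _id: "http://example.com/a", title: "A" })
db.docs.find({ _id: "http://example.com/a" })

// Option 2: default ObjectId _id plus a unique secondary index on url.
// Costs a second index, but the URL can be updated in place.
db.docs.ensureIndex({ url: 1 }, { unique: true })
db.docs.find({ url: "http://example.com/a" })
```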
[21:12:26] <terminal_echo> cheeser: thanks for the sqoop idea, this is definitely the way to go. I take it it's easy to transfer from mongod -> MSSQL?
[21:23:31] <MacWinne_> if I want to insert an activity document into a collection, but not insert it if a specified set of fields already matches an existing document, what would be the best way? I want to do this atomically... I'm not looking to upsert, since I don't want to update the existing record
[21:23:52] <MacWinne_> is there a specific collection command I need?
[21:26:11] <MacWinne_> also I can't use indexes for it beacuse the sets of fields I'm looking for can vary
[21:38:50] <diegoaguilar> hello I have documents in a collection with a onDemandPoints attribute looking like this
[21:38:51] <diegoaguilar> Ahorita te pague normal y lunes o martes espero poderte confirmar cuando cambiamos a asimilados
[21:57:34] <diegoaguilar> what problem are u having?
[21:58:37] <Nikesh> Well, I'm just considering which Ubuntu to install on a new machine. I've seen that Mongo doesn't officially support 15.04 yet, so I wanted to see if others got it to work
[21:58:43] <Nikesh> Otherwise I'll just use Ubuntu 14.04
[21:58:53] <diegoaguilar> oh well the best u can do is to install any of the LTS
[23:01:16] <roo00t> i don't know mongodb right now but i want to use it with my RESTful API. Could someone tell how the json file looks like so that i can test my code
[23:01:43] <roo00t> any resource for quick guidence
[23:11:32] <roo00t> Boomtime: are there mongo dump available for testing?
[23:12:00] <roo00t> i searched but it gets me to dump command for db
[23:12:31] <Boomtime> testing what? it's a database, so you get out what you put in
[23:13:44] <Boomtime> you have bought an empty filing cabinet and you are asking what is the the content of the pages that are put in it - there is nothing in it, until you put it there