PMXBOT Log file Viewer

#mongodb logs for Sunday the 27th of July, 2014

[07:46:48] <deanclkclk_> anyone here?
[07:47:05] <deanclkclk_> is it possible to copy a database over to another server?
[07:50:06] <cheeser> mongodump, mongorestore
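
A minimal sketch of the mongodump/mongorestore route cheeser suggests, assuming a database named mydb and placeholder hostnames:

    # dump the database from the source server
    mongodump --host source.example.com --db mydb --out /tmp/dump

    # restore that dump into the target server
    mongorestore --host target.example.com --db mydb /tmp/dump/mydb
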
[10:23:51] <KrzyStar> Hai
[10:24:25] <KrzyStar> I'm trying to migrate an existing postgres database to mongo using references instead of relations
[10:24:57] <KrzyStar> For now, I've migrated `id` field together with all the documents
[10:25:16] <dawik> whyy
[10:25:26] <dawik> turn around and dont look back
[10:25:50] <KrzyStar> Is there an easy way to replace the IDs using ObjectIDs?
[10:26:03] <cheeser> you don't need to use ObjectID
[10:26:34] <cheeser> but once your document has an _id in mongo, it can't be changed. you'll have to copy that document to a new one with a new _id and update all references.
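
Since _id cannot be changed in place, the copy-and-replace cheeser describes looks roughly like this in the mongo shell (the collection and id variables are placeholders):

    var doc = db.mycoll.findOne({ _id: oldId });  // fetch the existing document
    doc._id = newId;                              // give the copy its new _id
    db.mycoll.insert(doc);                        // write the copy
    db.mycoll.remove({ _id: oldId });             // drop the original
    // documents that reference oldId still have to be updated separately
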
[10:26:49] <KrzyStar> Yeah, I know
[10:27:13] <KrzyStar> But then, can I set the id field to be an autoincrementing integer?
[10:27:27] <KrzyStar> Or do I have to keep the counter outside of the DB?
[10:27:31] <cheeser> http://docs.mongodb.org/manual/tutorial/create-an-auto-incrementing-field/
[10:29:16] <KrzyStar> So I need to keep the counters
[10:29:20] <KrzyStar> Aite, thanks cheeser :)
[10:31:35] <cheeser> np
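
The linked tutorial keeps per-sequence counters in a counters collection and increments them atomically with findAndModify; a condensed sketch of that pattern (the sequence name "userid" and the users collection are just examples):

    function getNextSequence(name) {
        var ret = db.counters.findAndModify({
            query: { _id: name },
            update: { $inc: { seq: 1 } },
            new: true,
            upsert: true
        });
        return ret.seq;
    }

    db.users.insert({ _id: getNextSequence("userid"), name: "example user" });
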
[13:48:50] <Zzz> Hey
[13:49:49] <Zzz> Is it possible to give the group command a list of keys as an array
[13:50:04] <Zzz> and generate default values when there are no documents matching a key
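
For what it's worth, group() takes its keys as a document with several fields rather than an array, and the aggregation framework does the same thing with a compound _id in $group. A sketch with made-up collection and field names:

    db.items.aggregate([
        { $group: {
            _id: { type: "$type", status: "$status" },  // group on two keys at once
            count: { $sum: 1 }
        } }
    ])
    // keys with no matching documents produce no group at all,
    // so default values for them have to be filled in by the application
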
[14:38:06] <ATuin> hey
[18:05:56] <leo3> Hi guys, I am looking to store something like http://jsonresume.org/. Imagine I want to store JSON like the examples there, and I would like to retrieve users based on skills, company name, etc. So what should I use for such a scenario? A SQL db or a NoSQL db?
[18:21:28] <leo3> any suggestions ?
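
A jsonresume.org-style document can be stored in MongoDB as-is, and lookups by skill or company name are then plain queries; a sketch against a made-up resumes collection:

    db.resumes.insert({
        basics: { name: "Jane Doe" },
        skills: [ { name: "Python" }, { name: "MongoDB" } ],
        work:   [ { company: "ACME" } ]
    });

    db.resumes.find({ "skills.name": "MongoDB" });  // users with a given skill
    db.resumes.find({ "work.company": "ACME" });    // users who worked at a company
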
[19:09:10] <Logicgate> hey guys
[19:09:11] <Logicgate> http://pastebin.com/6pwMUc0Y
[19:09:22] <Logicgate> Here is a php snippet, trying to aggregate some data
[19:09:31] <Logicgate> basically I want to aggregate unique clicks / impressions
[19:09:49] <Logicgate> the data model looks like this {action: 'click', ip: '192.168.0.1'}
[19:10:03] <Logicgate> the action is either 'click' or 'impression'
[19:10:16] <Logicgate> I want to aggregate the sum by type of action with unique ips
[19:10:32] <Logicgate> I'm getting: exception: A pipeline stage specification object must contain exactly one field.
[19:10:40] <Logicgate> As an error with the snippet of code I posted
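
That error usually means two $-operators ended up in one pipeline stage object; every stage has to be its own object containing exactly one field. Counting unique IPs per action then takes two $group stages. A mongo shell sketch (the collection name is a placeholder, since the pastebin isn't reproduced here):

    db.events.aggregate([
        // stage 1: collapse to one document per (action, ip) pair
        { $group: { _id: { action: "$action", ip: "$ip" } } },
        // stage 2: count the distinct ips for each action
        { $group: { _id: "$_id.action", uniques: { $sum: 1 } } }
    ])
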
[20:57:16] <Asterfield> I have a mongoose instance whose `save` callback never gets called. Any ideas why that might be?
[21:16:45] <s2013> what format does jsonArray expect?
[21:16:53] <s2013> is it [ {},{}] or is it {},{}
[21:16:57] <s2013> {} = a json doc
[21:17:27] <Derick> [] is an array
[21:17:45] <s2013> I've been trying to import a db for a week now
[21:17:54] <s2013> no luck. No one can help me on stackoverflow. I'm so confused
[21:23:36] <s2013> https://gist.github.com/ss2k/1f068aa6fab9c8975074 this is currently how the data looks, except it's like 100s of thousands of documents instead of the 3 I pasted
[21:24:00] <s2013> I changed it so the results key was removed and it just became [ {},{},{} ].. still no luck.. then I removed the [], still no luck. Same error: that it's too large
[21:25:10] <Derick> how are you trying to import that?
[21:27:21] <s2013> mongoimport -d dbname -c collection --file filename.json --jsonArray
[21:31:40] <s2013> brb
[21:33:16] <Asterfield> Ahaha, one of my pre-save callbacks never returned
[21:33:29] <Asterfield> That was the problem :P
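
In other words, a mongoose pre('save') hook has to call next() (or hand it an error), or save()'s callback never fires; a minimal sketch with a made-up schema:

    var mongoose = require('mongoose');

    var userSchema = new mongoose.Schema({ name: String });

    userSchema.pre('save', function (next) {
        // ... async work here ...
        next();  // if this is never called, save()'s callback never runs
    });

    var User = mongoose.model('User', userSchema);
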
[22:59:49] <joannac> s2013 does it work with 3?
[23:00:21] <joannac> s2013: With 1000s of thousands it probably is too large - it's a single document
[23:09:15] <ranman> s2013: what's the SO link?
[23:09:27] <ranman> also greetings from #hackny
[23:33:34] <ranman> joannac: we solved s2013's issue I think, same issue I had back in Chattanooga with mongoimport needing each doc on a single line
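
The fix ranman describes: without --jsonArray, mongoimport reads newline-delimited JSON, one complete document per line, which avoids handing it one huge array. A sketch with placeholder data, filename.json containing:

    { "a": 1 }
    { "a": 2 }
    { "a": 3 }

imported with:

    mongoimport -d dbname -c collection --file filename.json
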