[00:36:53] <stephenmac7> Is it possible to disable auth for a specific db?
[00:40:51] <_jc> Anyone have insights into my question here on SO: http://stackoverflow.com/questions/22724846/mongodb-ingest-etl-design-options
[01:30:28] <_jc> Anyone have insights into my question here on SO: http://stackoverflow.com/questions/22724846/mongodb-ingest-etl-design-options
[01:41:17] <_jc> Trying to find ETL options for getting data into mongo. Any ideas?
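For bulk loading, the stock tool is mongoimport, which accepts JSON, CSV, and TSV. A minimal sketch (the db, collection, and file names here are made up):

    mongoimport --db etl --collection records --type csv --headerline --file records.csv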
[08:23:07] <HSP_> hey guys, I'm using a CMS with pictures and I only want certain users of my site to be able to see them. Any ideas on how to limit users' access even if they have the URL?
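One common approach, sketched below in Express 4 style: keep the files outside any publicly served directory and stream them only through a route that checks authorization first, so knowing the URL alone is never enough. The isAuthorized() helper, the ./private directory, and the header check are all placeholders to adapt to the CMS.

    var express = require('express');
    var path = require('path');
    var app = express();

    // hypothetical auth check; wire this to the CMS's real session logic
    function isAuthorized(req) {
      return Boolean(req.headers['x-user']);   // placeholder condition
    }

    app.get('/images/:name', function (req, res) {
      if (!isAuthorized(req)) return res.status(403).send('Forbidden');
      // path.basename() blocks ../ traversal out of the private directory
      res.sendFile(path.join(__dirname, 'private', path.basename(req.params.name)));
    });

    app.listen(3000);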
[08:36:07] <shantanoo> I want to add the current date/time whenever I insert a document into the collection, something like a DEFAULT now() value in SQL.
[08:36:25] <shantanoo> is there a way to do such a thing server-side?
[08:36:25] <fl0w> shantanoo: an ObjectId _id embeds a timestamp
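To unpack fl0w's point: a default ObjectId _id embeds its creation time (UTC, one-second resolution), and the shell can read it back. There is no server-side DEFAULT now() equivalent for inserts, so the usual pattern is to set the field client-side; on updates, 2.6+ can stamp the server clock with $currentDate. A sketch in the mongo shell (collection and field names are made up):

    // read the creation time embedded in an ObjectId
    db.events.findOne()._id.getTimestamp()   // returns an ISODate (UTC)

    // no server-side default for inserts; set the field when writing
    db.events.insert({ type: "login", createdAt: new Date() })

    // on updates, MongoDB 2.6+ can stamp the server time itself
    db.events.update({ type: "login" }, { $currentDate: { lastSeen: true } })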
[08:55:45] <shantanoo> fl0w, ISO 8601 supports tz info: instead of Z at the end, an offset is added.
[08:56:02] <shantanoo> but I suppose UTC should be fine. :)
[08:57:52] <fl0w> shantanoo: You are correct sir. My bad.
[08:58:51] <fl0w> never knew it supported intervals, durations and such
[10:57:02] <the8thbit> So, I have a record: I grab it, alter it, and then save it (all with mongoose in Node.js), but after I save, the record is not updated, and .save() doesn't throw any error. What gives?
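A common cause, assuming a schema with untracked paths: mongoose only persists paths it knows are dirty, so in-place edits inside a Schema.Types.Mixed (or otherwise untracked) path are silently ignored by .save(), and documents fetched with .lean() are plain objects that cannot be saved at all. A sketch with a hypothetical meta path:

    // if `meta` is a Schema.Types.Mixed path, in-place edits aren't tracked
    doc.meta.count = 5;
    doc.markModified('meta');   // tell mongoose the path changed
    doc.save(function (err) {
      if (err) return console.error(err);
      // note: results of .lean() queries have no change tracking or .save()
    });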
[13:49:44] <arussel> I want to create a collection out of https://github.com/commoncurriculum/standards-data/blob/master/clean-data/CC/math/CC-math-0.8.0.json. Is there a method that would just parse a URL?
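There is no built-in import-from-URL method; the usual route is to download the file and pipe it to mongoimport, which reads stdin when no --file is given. A sketch, assuming the file is a top-level JSON array (hence --jsonArray) and using made-up db/collection names:

    curl -L "https://raw.githubusercontent.com/commoncurriculum/standards-data/master/clean-data/CC/math/CC-math-0.8.0.json" \
      | mongoimport --db standards --collection cc_math --jsonArray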
[14:47:35] <mbroadst> hey, I'm having trouble importing a rather large CSV into mongo using mongoimport. I'm getting the exception "read error, or input line too long (max length: 16777216)", but I can verify that no individual line is anywhere near that long (the longest is ~2629 bytes). Is that a cap on the whole document?
[14:47:48] <stephenmac7> Would it be possible to figure out which gridfs chunks/files have nothing referencing them?
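For the GridFS question: with the default fs prefix, orphaned chunks are those whose files_id matches no _id in fs.files. A sketch in the mongo shell, workable while fs.files is small enough for its ids to fit in memory:

    // collect every file _id, then find chunks referencing none of them
    var fileIds = db.fs.files.distinct("_id");
    db.fs.chunks.find(
      { files_id: { $nin: fileIds } },
      { files_id: 1 }
    )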
[14:49:53] <mbroadst> cheeser: is there any easy way to extract that kind of detail from the error? I mean it's a huge file (177MB) exported in csv from excel :)
[14:50:50] <cheeser> maybe using the 'file' utility
[14:51:29] <mbroadst> cheeser: "ASCII text, with very long lines, with CR line terminators"
[14:56:28] <mbroadst> cheeser: should work, right?
[15:02:39] <cheeser> hard to say. clearly it doesn't, though. ;)
[15:02:44] <cheeser> i have to run, though. good luck!
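The file(1) output above is likely the whole story: mongoimport splits its input on LF, so a file with bare CR line terminators (classic Mac style, as Excel sometimes exports) looks like one enormous line, which trips the 16777216-byte (16 MB) per-line cap. Converting the terminators should let the original CSV import directly (file and db names are made up):

    tr '\r' '\n' < export.csv > export-unix.csv
    mongoimport --db mydb --collection mycoll --type csv --headerline --file export-unix.csv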
[15:11:18] <mbroadst> alright, looks like I was able to import it by converting from csv -> json and importing that file. However, the resulting database is over twice as large as the source data (~700M source, mongo db ~1.9GB); is that to be expected?
[15:14:53] <mbroadst> oh, I guess the result of "show databases" isn't the actual size; db.stats() indicates ~830M. Is there a full backup then?
[15:15:16] <mbroadst> this is quite a jump from an original 40M excel spreadsheet though..
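Both size observations have stock explanations for the MMAPv1 storage engine of that era: "show databases" reports allocated file size, and mongod preallocates data files in doubling sizes (64 MB, 128 MB, ... up to 2 GB), so files run ahead of actual data; and since BSON repeats every field name in every document, verbose CSV headers inflate each row well past the spreadsheet's footprint. db.stats() separates the figures:

    // dataSize:    BSON size of the documents themselves
    // storageSize: space allocated to collections (includes padding)
    // fileSize:    size of the data files on disk (includes preallocation)
    db.stats()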