[15:18:10] <sterns> hello, mongodb 3.6.3 on Ubuntu Xenial nodeJS with 2.x driver. I am getting BSONObj size (exceeds 16MB) error when using insertMany with an array of about 13K small documents
[15:18:45] <sterns> it is my understanding that mongodb will batch docs into groups of 100K
[15:19:12] <sterns> and will *just* work even if the total result document were to exceed 16MB
[15:20:01] <sterns> I am certain that none of the 13K documents exceeds 16MB individually
[15:22:03] <sterns> the full error is here: https://gist.github.com/mattcollier/359cbf0b6e7ab7fe5b732cdf475077cc
[15:24:14] <sterns> this is a stand-alone mongo instance
[15:26:50] <sterns> and this is an unordered insert
[15:40:32] <croberts> Derick: ah thank you so much
[15:41:15] <Derick> sterns: all 13k docs together can't exceed 16MB
[15:41:55] <Derick> sterns: it depends on the driver how these are batched... (and I thought it was only 10k).
[15:47:03] <sterns> javascript/nodejs with the 2.x mongo driver
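(A possible workaround for the thread above, given Derick's point that all 13k docs together can't exceed 16MB: split the array into smaller batches yourself before calling insertMany. This is a sketch, not sterns' actual code — the batch size of 1000 and the `collection` handle are assumptions.)

```javascript
// Sketch: chunk a large document array and insert it batch by batch,
// keeping each insertMany call comfortably under the 16MB BSON limit.
function chunkArray(docs, size) {
  const chunks = [];
  for (let i = 0; i < docs.length; i += size) {
    chunks.push(docs.slice(i, i + size));
  }
  return chunks;
}

// `collection` is a driver collection handle (hypothetical here);
// 1000 per batch is an assumed safe size for small documents.
async function insertInBatches(collection, docs, batchSize = 1000) {
  let inserted = 0;
  for (const batch of chunkArray(docs, batchSize)) {
    const result = await collection.insertMany(batch, { ordered: false });
    inserted += result.insertedCount;
  }
  return inserted;
}
```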
[16:15:25] <mariusvw> Does anyone know how to get all fields from a collection when using "aggregate" ?
[16:25:30] <Derick> they will all be in there unless you restrict it yourself - so, I'm not sure what you're asking. Can you share your pipeline on pastebin?
[16:42:41] <Derick> sterns: looks like it might be a bug, can you file a ticket at https://jira.mongodb.org/projects/NODE/issues ?
[17:03:41] <mariusvw> But that is exactly the same, I get the ID and Time field, but I would like to get the other fields too
[17:04:21] <mariusvw> In the example, the price and quantity are not in the result, the same as what I experience :)
[17:08:10] <Derick> mariusvw: you need to do a $last for each of these fields then - remember, they might not be all the same for each of the documents that are grouped in the "$key" bucket
[17:09:32] <mariusvw> Isn't it possible to get the "key" field grouped and then get all documents matching that key but only the last one based on time?
[17:10:04] <mariusvw> Again I'm new to mongo, in MySQL I know how to write such things with one join :|
[17:10:08] <Derick> only if in a previous pipeline operation you have sorted them - and then you can use $last
[17:11:55] <mariusvw> but that is what the first $sort should do right?
[17:15:27] <mariusvw> so, to get all the fields I have to run another query with all the keys to be fetched?
[17:16:38] <mariusvw> I don't understand why all the fields vanish in the result, but also, I have no idea if aggregate is the right way to get the data
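(A sketch of the pipeline Derick describes: sort by time first, then $group on the key and take $last of every field you want to keep — fields not mentioned in the $group stage are the ones that "vanish". The field names key, time, price, and quantity follow the example under discussion and are assumptions; adapt them to the real schema.)

```javascript
// Sort oldest-first so that $last inside each group picks the newest doc.
const pipeline = [
  { $sort: { time: 1 } },
  {
    $group: {
      _id: '$key',
      time: { $last: '$time' },
      // every non-grouped field you want back must get its own accumulator:
      price: { $last: '$price' },
      quantity: { $last: '$quantity' }
    }
  }
];

// usage with the node driver (hypothetical collection handle):
//   const rows = await collection.aggregate(pipeline).toArray();
```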
[17:23:29] <mariusvw> other thing, just a personal opinion, if you had to work with nodejs, would you write it native or use tools such as Node-RED? :)
[17:50:46] <sterns> Derick: ty for your response, I can file a ticket. Do you already know if there is some additional logging/debug information I can look for and include in the ticket?
[18:41:55] <Derick> sterns: I think it would help if you can give a short reproducible script - not sure what else, as I don't know the node driver
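(A hypothetical skeleton for the short reproducible script Derick asks for: generate ~13k small documents and run an unordered insertMany against a standalone server. The database/collection names and payload shape are placeholders; the 2.x driver call is shown in a comment so the generator itself stays self-contained.)

```javascript
// Generate n uniform small documents — each one far below the 16MB
// per-document limit, matching sterns' description.
function makeDocs(n) {
  return Array.from({ length: n }, (_, i) => ({ seq: i, payload: 'x'.repeat(64) }));
}

// With the 2.x node driver (callback style), the failing call would
// look roughly like this:
//
//   MongoClient.connect('mongodb://localhost:27017/repro', function (err, db) {
//     db.collection('bsonsize')
//       .insertMany(makeDocs(13000), { ordered: false }, function (err, result) {
//         console.log(err || result.insertedCount);
//         db.close();
//       });
//   });
```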