[12:23:34] <Industrial> Can I stream (nodejs) stuff into mongodb?
[12:23:50] <Industrial> I made a streaming converter and I need to put an arbitrarily large CSV input into mongo
[12:24:08] <Industrial> So I am wondering if I can stream them to MongoDB
[12:24:23] <Industrial> or if I need to pull 100 from a stream and then .insertMany() on it and then pull 100 more
[12:33:14] <GothAlice> Industrial: If you have some data ingress process acquiring data and issuing a "stream" of inserts… how does this not qualify? Additionally, for such use, bulk inserts are often an improvement in efficiency, noting that a bulk write operation is internally naturally batched. (Since you have a complete CSV to begin with.)
[13:18:19] <colegatron> Hi, mongodb newbie here. I've just installed mongo in an HA config on Kubernetes using a Helm chart. It works great; I can destroy a node and the k8s+mongo failover is incredible.
[13:19:58] <colegatron> maybe what I am going to ask is silly, but... the install does not provide a load balancer pointing at the cluster's master; it just gives you a round-robin load balancer, which fails 2 out of 3 times when trying to write to the db.
[13:22:03] <colegatron> I've been told that mongos takes care of which node to write to, but I am not sure if any java client will do the same.
[13:22:34] <colegatron> any MongoDB client, I meant
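For context on colegatron's question: official MongoDB drivers (the Java driver included) that are given a replica-set seed-list connection string discover the topology themselves and route writes to the current primary, so a round-robin load balancer in front of the members is not needed. The sketch below just builds such a URI; the host names and the replica set name `rs0` are hypothetical stand-ins for whatever the Helm chart actually created (typically per-pod DNS names behind a headless Kubernetes service).

```javascript
// Sketch: build a replica-set seed-list URI. Hosts and "rs0" are
// hypothetical examples, not values from the chat. A driver given this
// URI discovers the members and sends writes to the primary, retrying
// the discovery after a failover.
function replicaSetUri(hosts, replicaSet, db = 'admin') {
  return `mongodb://${hosts.join(',')}/${db}?replicaSet=${replicaSet}`;
}

const uri = replicaSetUri(
  ['mongo-0.mongo:27017', 'mongo-1.mongo:27017', 'mongo-2.mongo:27017'],
  'rs0'
);
// With the Node driver this would then be passed to: new MongoClient(uri)
```

(mongos is a different component — the query router for *sharded* clusters; for a plain replica set the `replicaSet` connection option is what gives you primary-aware routing.)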
[15:26:36] <GothAlice> Ah, apologies, https://github.com/marrow/mongo/wiki/ObjectID for the module "docstring" made readable by proper interpretation as markdown. ;)
[15:58:32] <GothAlice> Now to monkeypatch bson to deserialize mine instead of the default implementation. ¬_¬
[16:00:25] <GothAlice> This, though, is hella gross: https://github.com/marrow/mongo/blob/next/marrow/mongo/util/oid.py?ts=4#L295-L305