PMXBOT Log file Viewer

#mongodb logs for Friday the 30th of September, 2016

[03:44:35] <steverandy> does mongodb support read preference option at collection level?
[04:05:56] <steverandy> opened the issue here https://jira.mongodb.org/browse/NODE-824?jql=project%20%3D%20NODE
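For context, recent versions of the Node driver do accept a readPreference option when a collection handle is created; how consistently it is honoured is what the NODE-824 ticket above is about. A minimal sketch (connection string, database, and collection names are illustrative):

    const { MongoClient } = require('mongodb');

    async function main() {
      const client = await MongoClient.connect('mongodb://localhost:27017');
      // ask for a collection-level read preference
      const orders = client.db('shop').collection('orders', {
        readPreference: 'secondaryPreferred'
      });
      console.log(await orders.countDocuments());
      await client.close();
    }

    main().catch(console.error);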
[07:51:01] <crazyadm> i added replset in repl server, do i need to add repl set to configsvr?
[07:51:37] <crazyadm> what if i have two sets of replset, and each one of those two sets consists of 2 shards
[08:04:55] <crazyadm> how do i add multiple repl set to configsvr, and enable sharding on them
[08:17:48] <crazyadm> anyone help me
[08:57:13] <crazyadm> i have 2 replica sets, how do i add them to configsvr
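For what it's worth, replica sets are not registered with the config server directly; they are added as shards through a mongos. A rough mongo shell sketch with two replica-set shards (set names and hosts are illustrative):

    // run against a mongos, not against the config server
    sh.addShard("rsA/host1:27018,host2:27018")
    sh.addShard("rsB/host3:27018,host4:27018")

    // then enable sharding per database and per collection
    sh.enableSharding("mydb")
    sh.shardCollection("mydb.items", { userId: 1 })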
[13:31:06] <INSANU> is there any kind of study or document related to a generic schema in mongodb?
[13:31:44] <cheeser> http://blog.mongodb.org/post/87200945828/6-rules-of-thumb-for-mongodb-schema-design-part-1
[13:31:53] <cheeser> https://docs.mongodb.com/manual/core/data-model-design/
[13:41:02] <INSANU> cheeser: thanks for the links
[13:42:07] <cheeser> yep
[15:18:29] <moos3> question, so my three nodes all got rebooted at the same time, now the prompt says OTHER and rs.status says REMOVED. How do I fix this?
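A common recovery when every member reports REMOVED after a mass reboot is to force a reconfig from the locally stored config on one member; a hedged mongo shell sketch (verify the host names in the config still match the nodes first):

    // mongo shell on one of the members
    var cfg = db.getSiblingDB("local").system.replset.findOne()
    // check that cfg.members[i].host entries still resolve correctly
    rs.reconfig(cfg, { force: true })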
[17:46:11] <diegows> hi, is there a counter for slow queries somewhere?
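For reference, MongoDB doesn't expose a single slow-query counter, but the profiler records slow operations in a queryable collection. A hedged mongo shell sketch:

    // log operations slower than 100 ms to system.profile
    db.setProfilingLevel(1, 100)

    // later: count and inspect what was captured
    db.system.profile.count()
    db.system.profile.find().sort({ ts: -1 }).limit(5)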
[18:09:45] <TheEpitome> Hello, when I perform a "db.hosts.find({},{dmiSystem: 1, networkInterfaces: 1, lanPrint: 1, pduPorts: 1, comment: 1})" from mongo shell it returns the fields I ask for almost instantly. When I perform this same query from PHP using MongoDB\Client it takes about 5 seconds, the same amount of time as if I just ran a find() without any parameters. Any ideas?
[18:40:47] <synthmeat> StephenLynx: if i don't need any caching/authentication, are there any other (relatively) complex things remaining if i decide to roll with just node.js http for my next project? (i recall that you do so)
[18:41:09] <StephenLynx> eh
[18:41:17] <StephenLynx> I do have caching and auth though
[18:41:22] <StephenLynx> with nothing but node and mongo.
[18:41:51] <StephenLynx> and I'd say that sharding was a noticeable part of it too.
[18:42:22] <synthmeat> sharding? in context of http servers or in mongo?
[18:42:28] <StephenLynx> http servers.
[18:42:36] <synthmeat> what's that, load balancing?
[18:42:38] <StephenLynx> yes.
[18:42:47] <StephenLynx> not only for requests, but for cache generation.
[18:43:04] <StephenLynx> I used a lib for forwarding requests.
[18:43:32] <synthmeat> how about url parsing?
[18:44:10] <StephenLynx> node handles that.
[18:44:13] <StephenLynx> what about it?
[18:44:30] <StephenLynx> you mean, stuff like routing?
[18:49:02] <synthmeat> yeah
[18:49:03] <synthmeat> routing
[18:49:41] <StephenLynx> trivial, I implemented it on my own.
[18:50:57] <synthmeat> coolio. i'm sold.
[18:51:07] <synthmeat> and *finally* dropping mongoose!
[18:52:13] <StephenLynx> kek
[18:52:39] <StephenLynx> another part that is more complex than you'd think
[18:52:48] <StephenLynx> because of how long it has been there
[18:52:51] <StephenLynx> is templating.
[18:52:58] <moos3> question, so my three nodes all got rebooted at the same time, now the prompt says OTHER and rs.status says REMOVED. How do I fix this?
[18:53:10] <StephenLynx> my engine generates HTML pages
[18:53:27] <StephenLynx> so it has to get the base html from somewhere, manipulate and output the final page
[18:53:44] <StephenLynx> what I do is to load and cache the pure HTML file that will be a template
[18:53:48] <StephenLynx> create a document using jsdom
[18:53:53] <StephenLynx> manipulate and output.
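A minimal sketch of the load-template / jsdom / serialize flow described above, using jsdom's newer JSDOM class (not necessarily the exact API LynxChan used at the time; file and element names are illustrative):

    const fs = require('fs');
    const { JSDOM } = require('jsdom');

    const templateCache = {};

    // load and cache the pure HTML file that will be a template
    function loadTemplate(path) {
      if (!templateCache[path]) {
        templateCache[path] = fs.readFileSync(path, 'utf8');
      }
      return templateCache[path];
    }

    // create a document, manipulate it and output the final page
    function renderPage(templatePath, title, bodyText) {
      const dom = new JSDOM(loadTemplate(templatePath));
      const document = dom.window.document;
      document.title = title;
      document.getElementById('content').textContent = bodyText;
      return dom.serialize();
    }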
[18:54:34] <synthmeat> in my hand-rolled static gen i load up pure html into cheerio then manipulate with a lot of template literals
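The cheerio variant synthmeat describes looks roughly like this (a sketch; file names and selectors are illustrative):

    const fs = require('fs');
    const cheerio = require('cheerio');

    // load up pure html into cheerio, then manipulate with template literals
    const $ = cheerio.load(fs.readFileSync('page.html', 'utf8'));
    const title = 'hello world';
    $('#content').html(`<h1>${title}</h1>`);
    fs.writeFileSync('out/page.html', $.html());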
[18:54:46] <StephenLynx> I looked into cheerio
[18:54:49] <StephenLynx> but fuck jquery syntax
[18:54:51] <StephenLynx> fuck it hard.
[18:55:18] <StephenLynx> what I do to make it modular is to have a .json file that says which file should be used for which template
[18:55:26] <synthmeat> yeah, i'm battling the syntax all the time, but it's powerful (didn't evaluate jsdom tho)
[18:55:34] <StephenLynx> then the engine checks the fields on the template to output missing parts.
[18:55:41] <StephenLynx> and after that it caches the file.
[18:56:05] <StephenLynx> with jsdom I can generate a slightly complex page in 130ms.
[18:56:23] <StephenLynx> less than that, actually
[18:56:37] <StephenLynx> because that time also includes generating its json counterpart
[18:56:39] <StephenLynx> and store it on mongo
[18:57:10] <StephenLynx> so if your performance with cheerio isn't doing that much better, it isn't paying off.
[18:57:30] <StephenLynx> besides, depending on how you generate the pages, you might optimize around efficiency rather than speed, like I did.
[18:57:51] <StephenLynx> I have two kinds of pages on my system:
[18:57:53] <synthmeat> i optimized for readability since it's just static gen, irrelevant how long it takes to build it
[18:58:02] <StephenLynx> eh
[18:58:17] <StephenLynx> readability doesn't matter performance wise.
[18:58:30] <synthmeat> (next one is just json rest api)
[18:58:32] <StephenLynx> it is important, but is not what I am talking about.
[18:58:52] <StephenLynx> I have static pages that are the same for every user and dynamic pages used by few users.
[18:58:59] <synthmeat> (yeah, please talk, i'm just shooting off random semi-related stuff)
[18:59:06] <StephenLynx> the dynamic ones are always generated on the fly
[18:59:10] <StephenLynx> and return 200
[18:59:16] <StephenLynx> the static ones are queued
[18:59:24] <StephenLynx> and this queue removes duplicate tasks.
[18:59:57] <StephenLynx> so even if the generation gets a little slow, it will reach a point where tasks will be dropped left and right due to being duplicates.
[19:00:09] <StephenLynx> the more work it has to do, the less work it has to do.
[19:00:20] <StephenLynx> and these pages can return a 304.
[19:00:41] <StephenLynx> and they are also compressed.
[19:01:02] <StephenLynx> so it doesn't matter too much if the generation for them isn't too fast.
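A small sketch of a rebuild queue that drops duplicate tasks, along the lines described above (the shape of the task callback is an assumption):

    const pending = new Set();   // keys of pages already waiting to be rebuilt
    const queue = [];
    let working = false;

    function enqueue(pageKey, buildFn) {
      if (pending.has(pageKey)) return; // duplicate task: drop it
      pending.add(pageKey);
      queue.push({ pageKey, buildFn });
      drain();
    }

    function drain() {
      if (working || !queue.length) return;
      working = true;
      const task = queue.shift();
      pending.delete(task.pageKey);
      // buildFn regenerates the static page and calls back when done
      task.buildFn(function () {
        working = false;
        drain();
      });
    }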
[19:01:04] <synthmeat> yeah, should throw an interesting curve in page speed x requests/time
[19:01:50] <StephenLynx> and this is what I meant about optimizing for efficiency rather than speed.
[19:02:56] <StephenLynx> not only that, but I also cache individual posts.
[19:03:08] <StephenLynx> each post has 5 different caches.
[19:03:18] <StephenLynx> because they can be presented in 5 different ways.
[19:03:32] <StephenLynx> preview, inner page, outer page, low moderation, high moderation
[19:03:46] <synthmeat> yeah, that's exactly why i want to go down to node - so i can tailor in fairly straightforward way for my use-case
[19:04:14] <StephenLynx> the part that builds the post cell checks if the proper cache exists and uses it. if it doesn't, it generates the cell and caches it.
[19:04:30] <StephenLynx> completely abstracting it from the parts of the code that request the cell generation.
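The per-post caching he describes can be sketched as a cache-or-generate helper keyed by presentation mode (names are illustrative):

    // the five ways a post can be presented, each with its own cached cell
    const MODES = ['preview', 'innerPage', 'outerPage', 'lowModeration', 'highModeration'];

    function getPostCell(post, mode, generateCell) {
      if (MODES.indexOf(mode) === -1) throw new Error('unknown mode: ' + mode);

      post.caches = post.caches || {};
      if (!post.caches[mode]) {
        // callers never know whether this was a cache hit or a fresh build
        post.caches[mode] = generateCell(post, mode);
      }
      return post.caches[mode];
    }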
[19:04:49] <synthmeat> oh, some kind of reddit-like thingie?
[19:04:53] <StephenLynx> a chan
[19:04:55] <StephenLynx> lynxhub.com
[19:04:59] <StephenLynx> freech.net
[19:05:01] <StephenLynx> bunkerchan.xyz
[19:05:03] <StephenLynx> endchan.xyz
[19:05:08] <StephenLynx> all these use it.
[19:05:12] <StephenLynx> on different versions
[19:05:38] <StephenLynx> lynxhub is mine, using the latest stable version, freech uses the beta version, bunkerchan uses the previous stable version
[19:05:49] <StephenLynx> and endchan uses the latest stable with some modifications
[19:05:56] <StephenLynx> because the admin didn't want to use my addon system
[19:06:04] <StephenLynx> cuz hes a dumbass sometimes :^)
[19:06:15] <synthmeat> pretty sweet
[19:07:58] <StephenLynx> currently I'm working on a fork of a front-end
[19:08:10] <StephenLynx> the penumbra one, used on spacechan.xyz
[19:08:19] <StephenLynx> it was a mess.
[19:08:34] <StephenLynx> I finished adding a few features and now I'll add support for 1.7
[19:08:39] <StephenLynx> which is the beta version
[19:08:59] <synthmeat> main reason i never participated in any chan-like thing is that i can't make sense of it, on frontend. i guess it takes a while to get used to it
[19:10:48] <StephenLynx> :v
[19:11:01] <StephenLynx> after a while, you just get it.
[19:11:24] <StephenLynx> on the other hand, I can't figure out reddit for the life of me.
[19:11:35] <StephenLynx> and all that nesting
[19:16:18] <StephenLynx> anyway, if you plan on doing a bare http system, you could learn a lot from mine.
[19:16:27] <StephenLynx> gitgud.io/LynxChan/LynxChan
[19:19:52] <synthmeat> yeah, was about to ask for package.json
[19:20:39] <StephenLynx> src/be
[19:20:47] <synthmeat> it is both hilarious and amazing
[19:21:02] <StephenLynx> kek
[19:21:03] <StephenLynx> what
[19:22:19] <synthmeat> any package.json of my projects that has a 10th of the functionality lynxchan has is 5 times bigger :D
[19:22:25] <StephenLynx> kek
[19:22:37] <StephenLynx> and one of those is legacy.
[19:22:39] <StephenLynx> the bcrypt one.
[19:23:11] <StephenLynx> I can't remove it because then older accounts couldn't be migrated
[19:23:30] <StephenLynx> because I changed the auth from bcrypt to pbkdf2
[19:23:57] <StephenLynx> so I still need to be able to check accounts using bcrypt and only then re-encrypt the hash using pbkdf2
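A hedged sketch of that legacy path: verify the old bcrypt hash once, then store a pbkdf2 hash instead (field names and pbkdf2 parameters are illustrative, not LynxChan's actual values):

    const crypto = require('crypto');
    const bcrypt = require('bcrypt'); // the legacy dependency being discussed

    function migrateLogin(password, account, callback) {
      bcrypt.compare(password, account.legacyBcryptHash, function (error, matches) {
        if (error || !matches) return callback(error || new Error('wrong password'));

        const salt = crypto.randomBytes(16).toString('hex');
        crypto.pbkdf2(password, salt, 10000, 64, 'sha512', function (hashError, key) {
          if (hashError) return callback(hashError);

          // from now on the account only carries the pbkdf2 hash
          account.passwordSalt = salt;
          account.passwordHash = key.toString('hex');
          delete account.legacyBcryptHash;
          callback(null, account);
        });
      });
    }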
[19:24:17] <StephenLynx> I used to have one, ip-address
[19:24:28] <StephenLynx> that I removed and instead used jsbn directly.
[19:24:45] <StephenLynx> the node-mailer is barely used.
[19:24:53] <StephenLynx> only for password resets.
[19:25:29] <StephenLynx> the proxy one is used by no one with a live server, afaik
[19:25:33] <StephenLynx> since it's used for sharding
[19:26:12] <StephenLynx> I am always trying to keep the least amount of dependencies. this is how you avoid a leftpad.
[19:26:18] <StephenLynx> have you heard of it?
[19:26:58] <synthmeat> yup. i'm hoping to learn more about what matters through this (because you learn exactly nothing but express when you use express)
[19:27:21] <StephenLynx> yeah, web frameworks are cancer.
[19:27:41] <StephenLynx> I'd have to REALLY not give a single fuck about a project if I were to ever use one.
[19:27:45] <StephenLynx> literally a hit and run.
[19:32:25] <synthmeat> http-proxy looks well maintained. might even drop nginx from the equation
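Fronting a few node workers with http-proxy is only a handful of lines; a sketch (ports and the round-robin pick are illustrative):

    const http = require('http');
    const httpProxy = require('http-proxy');

    const proxy = httpProxy.createProxyServer({});
    const workers = ['http://127.0.0.1:8081', 'http://127.0.0.1:8082'];
    let next = 0;

    http.createServer(function (req, res) {
      const target = workers[next++ % workers.length]; // naive round-robin
      proxy.web(req, res, { target: target }, function () {
        res.statusCode = 502;
        res.end('bad gateway');
      });
    }).listen(8080);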
[19:32:38] <StephenLynx> and the way to avoid redundant dependencies is to first evaluate how the task can be implemented on your own.
[19:32:50] <StephenLynx> then you check the library and see if you REALLY need it.
[19:33:04] <StephenLynx> for example, I almost ended up using one for pbkdf2
[19:33:12] <synthmeat> and if you need to cut corners, you can do that as you go
[19:33:24] <StephenLynx> because all it did was to provide support for older versions of node.
[19:33:44] <StephenLynx> but since I work based on hard version requirements, that wasn't an issue to me.
[19:34:18] <StephenLynx> and it's also important to evaluate how much you care about how the task is done.
[19:34:21] <StephenLynx> for example, e-mailing
[19:34:30] <StephenLynx> I don't care how the e-mail is handled, as long as it works.
[19:34:42] <StephenLynx> so that's one less reason to not use a library for it.
[19:35:05] <StephenLynx> you should be always thinking about the libraries that you need.
[19:35:08] <StephenLynx> and not the other way around.
[19:35:21] <StephenLynx> don't try and stop using a lib. only use a lib if you really feel like you have to.
[19:35:26] <synthmeat> yeah, i'm beyond framework window-shopping phase
[19:36:53] <synthmeat> i mean, i'm at the age where i noticed that learning almost any high-level framework was (larger than i expected) sunk effort. nothing gained beyond possibly a faster start to getting something up
[19:37:00] <StephenLynx> yes.
[19:37:15] <synthmeat> no transferable skills/knowledge
[19:37:17] <StephenLynx> people really ignore the increase in the learning curve from adding dependencies to a project.
[19:37:30] <StephenLynx> and that too, how that is only applicable for that specific tool.
[19:37:43] <StephenLynx> while if you learn the base tool, you'll have a much broader range.
[19:38:23] <StephenLynx> not to mention that the more a tool permeates the project, the more it will fail if the tool fails.
[19:38:49] <StephenLynx> if the framework has a vulnerability, you're screwed.
[19:39:04] <synthmeat> yeah, once those dropped socket.io connections start to accumulate, you're hosed
[19:43:39] <synthmeat> (that was pre 1.0 socket.io issue, prolly fixed now)
[19:44:05] <StephenLynx> never touched that either.
[19:44:17] <StephenLynx> not only frameworks, but I don't use do-all libs.
[19:44:39] <StephenLynx> I once used websockets by using just a websocket library.
[19:46:21] <StephenLynx> at first I tried not using a lib but the assholes made websockets such a convoluted protocol that I gave up on doing it from scratch.
[19:46:34] <StephenLynx> it gets to the point it uses a "magical string" for the handshake.
[19:46:37] <StephenLynx> its fucking retarded.
[19:46:54] <StephenLynx> should've just enabled raw TCP.
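The "magical string" being complained about is the fixed GUID from RFC 6455: the server proves it understood the handshake by hashing the client's key together with it. A sketch of just that step:

    const crypto = require('crypto');

    // fixed GUID defined by RFC 6455
    const WEBSOCKET_GUID = '258EAFA5-E914-47DA-95CA-C5AB0DC85B11';

    // value for the Sec-WebSocket-Accept response header,
    // derived from the client's Sec-WebSocket-Key request header
    function acceptValue(secWebSocketKey) {
      return crypto.createHash('sha1')
        .update(secWebSocketKey + WEBSOCKET_GUID)
        .digest('base64');
    }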
[19:47:38] <synthmeat> i don't hear much about node and udp
[19:47:53] <StephenLynx> websockets is UDP?
[19:48:01] <StephenLynx> I thought it was just a fancy tcp.
[19:48:28] <synthmeat> no, just mentioning it, unrelated
[19:48:38] <teprrr> it's tcp, with tricks :P
[19:49:15] <StephenLynx> i'd guess that you can use udp with node. but don't quote me on that.
[19:49:29] <StephenLynx> yup
[19:50:11] <synthmeat> yeah, dgram
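For the record, node's dgram module is the UDP counterpart of the http/net modules discussed above; a tiny echo-server sketch:

    const dgram = require('dgram');

    const socket = dgram.createSocket('udp4');

    socket.on('message', function (message, remote) {
      // echo each datagram back to where it came from
      socket.send(message, remote.port, remote.address);
    });

    socket.bind(41234);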
[20:01:33] <synthmeat> StephenLynx: gtg. thanks for the chat, was pleasure
[20:01:42] <StephenLynx> np
[22:00:01] <lacour> In production with Mongoid and MongoDB Atlas, I'm noticing some random fields being corrupted with ObjectIDs that are entirely nonsensical (i.e. with timestamps that are far in the past or future, one is for 2028). The documents don't have any ObjectIDs beyond _id, and the app itself doesn't handle/create any. Any ideas?
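One detail that explains the symptom: the first 4 bytes of an ObjectId are a creation timestamp, so corrupted id bytes decode as dates far in the past or future. In the mongo shell:

    // the creation time embedded in any ObjectId
    ObjectId().getTimestamp()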
[22:01:53] <cheeser> you'll want to post to mongo-user with that one. neither emily nor durran lurk here.
[22:02:14] <lacour> Thanks
[22:02:19] <cheeser> yep