node.js Redis cache and Mongo persistence architecture
Disclaimer: this page is a translation of a popular StackOverflow question, provided under the CC BY-SA 4.0 license. You are free to use and share it, but you must do so under the same license, link to the original, and attribute it to the original authors (not me) on StackOverflow.
Original question: http://stackoverflow.com/questions/11218941/
Architecture for Redis cache & Mongo for persistence
Asked by Ryan Ogle
The Setup:
Imagine a 'Twitter-like' service where a user submits a post, which is then read by many (hundreds, thousands, or more) users.
My question is regarding the best way to architect the cache & database to optimize for quick access & many reads, but still keep the historical data so that users may (if they want) see older posts. The assumption here is that 90% of users would only be interested in the new stuff, and that the old stuff will get accessed occasionally. The other assumption here is that we want to optimize for the 90%, and it's OK if the older 10% take a little longer to retrieve.
With this in mind, my research seems to strongly point in the direction of using a cache for the 90%, and then to also store the posts in another longer-term persistent system. So my idea thus far is to use Redis for the cache. The advantages are that Redis is very fast, and it also has built-in pub/sub, which would be perfect for publishing posts to many people. And then I was considering using MongoDB as a more permanent data store to hold the same posts, which will be accessed as they expire off of Redis.
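To make the idea concrete, here is a minimal Node.js sketch of the read path this implies: recent posts are served straight from the Redis cache, and anything that has already expired is fetched from MongoDB instead. It assumes the node-redis v4 client and the official mongodb driver; the `posts:<id>` key scheme, database name, and collection name are placeholders invented for the example.

```js
const { createClient } = require('redis');
const { MongoClient } = require('mongodb');

const redis = createClient({ url: 'redis://localhost:6379' });
const mongo = new MongoClient('mongodb://localhost:27017');

async function getPost(id) {
  // Hot path (~90% of reads): recent posts are still cached in Redis.
  const cached = await redis.get(`posts:${id}`);
  if (cached) return JSON.parse(cached);

  // Cold path: the post has expired out of Redis, so read it from MongoDB.
  return mongo.db('app').collection('posts').findOne({ id });
}

async function main() {
  await Promise.all([redis.connect(), mongo.connect()]);
  console.log(await getPost('42'));
  await Promise.all([redis.quit(), mongo.close()]);
}

main().catch(console.error);
```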
Questions:
1. Does this architecture hold water? Is there a better way to do this?
2. Regarding the mechanism for storing posts in both Redis & MongoDB, I was thinking about having the app do 2 writes: 1st, write to Redis, where the post is then immediately available to subscribers; 2nd, after successfully storing to Redis, immediately write to MongoDB (a rough sketch of this dual write follows below). Is this the best way to do it? Should I instead have Redis push the expired posts to MongoDB itself? I thought about this, but I couldn't find much information on pushing to MongoDB from Redis directly.
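Below is a hedged sketch of that dual write, again assuming node-redis v4 and the official mongodb driver: the post is written to Redis (cache plus pub/sub fan-out) first, and only then persisted to MongoDB. The key, channel, database, and collection names are illustrative only.

```js
const { createClient } = require('redis');
const { MongoClient } = require('mongodb');

const redis = createClient({ url: 'redis://localhost:6379' });
const mongo = new MongoClient('mongodb://localhost:27017');

async function createPost(post) {
  const payload = JSON.stringify(post);

  // 1st write: Redis, so the post is immediately readable and pushed to subscribers.
  await redis.set(`posts:${post.id}`, payload, { EX: 60 * 60 * 24 * 7 }); // expire after a week
  await redis.publish('new-posts', payload);

  // 2nd write: only after Redis succeeded, persist the post to MongoDB for history.
  await mongo.db('app').collection('posts').insertOne(post);
}

async function main() {
  await Promise.all([redis.connect(), mongo.connect()]);
  await createPost({ id: '42', author: 'ryan', body: 'hello world', ts: Date.now() });
  await Promise.all([redis.quit(), mongo.close()]);
}

main().catch(console.error);
```

Note that if the MongoDB write fails after the Redis write has succeeded, the post is visible but not persisted; this trade-off is part of the resiliency discussion in the accepted answer below.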
Accepted answer by Didier Spezia
It is actually sensible to associate Redis and MongoDB: they are good team players. You will find more information here:
One critical point is the resiliency level you need. Both Redis and MongoDB can be configured to achieve an acceptable level of resiliency, and these considerations should be discussed at design time. It may also put constraints on the deployment options: if you want master/slave replication for both Redis and MongoDB, you need at least 4 boxes (Redis and MongoDB should not be deployed on the same machine).
Now, it may be a bit simpler to keep Redis for queuing, pub/sub, etc., and store the user data in MongoDB only. The rationale is that you do not have to design similar data access paths (the difficult part of this job) for two stores featuring different paradigms. Also, MongoDB has built-in horizontal scalability (replica sets, auto-sharding, etc.), while Redis only has do-it-yourself scalability.
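A sketch of that simpler split, under the same driver and naming assumptions as above: MongoDB is the only data store, and Redis is used purely to notify subscribers that a new post exists.

```js
const { createClient } = require('redis');
const { MongoClient } = require('mongodb');

const redis = createClient({ url: 'redis://localhost:6379' });
const mongo = new MongoClient('mongodb://localhost:27017');

async function createPost(post) {
  // Single source of truth: the post document lives only in MongoDB.
  await mongo.db('app').collection('posts').insertOne(post);

  // Redis only carries the notification; subscribers fetch the post from MongoDB.
  await redis.publish('new-posts', String(post.id));
}

async function main() {
  await Promise.all([redis.connect(), mongo.connect()]);
  await createPost({ id: '44', author: 'didier', body: 'single store', ts: Date.now() });
  await Promise.all([redis.quit(), mongo.close()]);
}

main().catch(console.error);
```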
Regarding the second question, writing to both stores would be the easiest way to do it. There is no built-in feature to replicate Redis activity to MongoDB. Designing a daemon listening to a Redis queue (where activity would be posted) and writing to MongoDB is not that hard though.
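Such a daemon could look roughly like the sketch below. It assumes the producer LPUSHes serialized posts onto a Redis list (here called `posts:queue`, an invented name) and that node-redis v4 is used, where `brPop` resolves to `{ key, element }` or to null on timeout.

```js
const { createClient } = require('redis');
const { MongoClient } = require('mongodb');

const redis = createClient({ url: 'redis://localhost:6379' });
const mongo = new MongoClient('mongodb://localhost:27017');

async function run() {
  await Promise.all([redis.connect(), mongo.connect()]);
  const posts = mongo.db('app').collection('posts');

  // Block on the queue, persist each popped post to MongoDB, then wait for the next one.
  for (;;) {
    const item = await redis.brPop('posts:queue', 5); // 5-second timeout, then loop again
    if (!item) continue;
    await posts.insertOne(JSON.parse(item.element));
  }
}

run().catch(console.error);
```

With this in place, the app would lPush each new post onto `posts:queue` alongside its Redis cache write, and leave the MongoDB write to the daemon.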

