Disclaimer: This page is a translation of a popular StackOverflow question, provided under the CC BY-SA 4.0 license. You are free to use and share it, but you must attribute it to the original authors (not me). Original source: http://stackoverflow.com/questions/8417320/

Automatically detecting file changes and synchronizing via S3

linux, bash, filesystems, amazon-s3, inotify

Asked by Cerin

I have a local directory of media files on a Linux system, which I synchronise with an Amazon S3 account using an s3sync script. Currently, I'm manually running the s3sync script when I know the media files have been modified.

How would I automatically run the script when files are modified?

I was thinking of creating a cron job to run the script every few minutes, but that seems like an excessive amount of processing, because even if there are no changes, the script still has to scan the entire directory structure, which is quite large.

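For reference, a crontab entry for that approach might look like this (the script path is hypothetical):

*/5 * * * * /usr/local/bin/s3sync.sh >/dev/null 2>&1
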
I also considered incron/inotify, which allow running commands when a specific file or directory changes, but these tools don't seem to automatically support monitoring an entire nested directory tree. Correct me if I'm wrong, but it seems that incron/inotify can only monitor files they've been explicitly told to monitor. For example, if I wanted to monitor changes to all files at any level inside a directory, I'd have to write separate scripts that watch for directory and file additions/deletions and update the list of files and directories monitored by incron.

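For illustration: the underlying inotify API is indeed non-recursive, but inotifywait from the inotify-tools package can add watches for a whole tree via its -r flag (one watch per subdirectory under the hood). A minimal sketch, assuming inotify-tools is installed and s3sync.sh is a hypothetical wrapper around the sync command:

#!/bin/bash
# Watch the whole tree (-r) in monitor mode (-m) and re-run the
# sync script whenever anything is created, modified, deleted or moved.
inotifywait -m -r -e create,modify,delete,move /path/to/media |
while read -r dir event file; do
    /usr/local/bin/s3sync.sh
done
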
Are there more efficient solutions?

Answered by sparrovv

For this kind of task, I'm using the fssm gem.

Create a file watcher.rb:

require 'fssm'

# Recursively watch /dir_to_watch/ -- the '**/*' glob matches files at
# any depth -- and shell out to your sync script on every change.
FSSM.monitor('/dir_to_watch/', '**/*') do
  update {|base, relative| `your_script` }
  delete {|base, relative| `your_script` }
  create {|base, relative| `your_script` }
end

then:

ruby watcher.rb

Of course you can daemonize it, if you want.

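For example, a minimal way to keep the watcher running in the background (a plain nohup sketch rather than a proper daemon; the log path is arbitrary):

nohup ruby watcher.rb >> watcher.log 2>&1 &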

Answered by Bemehow

Here is a sample scenario you might use instead, utilizing a simple rsync script.

http://andrewwilkinson.wordpress.com/2011/01/14/rsync-backups-to-amazon-s3/

Basically, it means using FUSE and s3fs (http://code.google.com/p/s3fs/) to mount an S3 bucket as a directory on your local filesystem, and using rsync to keep the two in sync. A simple cron job would do the trick.

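A minimal sketch of that setup (the bucket name, mount point, and media path are hypothetical, and s3fs credentials are assumed to be configured in ~/.passwd-s3fs):

# Mount the bucket once, e.g. at boot:
s3fs my-media-bucket /mnt/s3 -o passwd_file=${HOME}/.passwd-s3fs

# Then sync periodically from cron, e.g. every 15 minutes:
# */15 * * * * rsync -av --delete /home/user/media/ /mnt/s3/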

Answered by Tal Weiss

Now there is an efficient solution. This was just announced (long overdue):
http://aws.amazon.com/blogs/aws/s3-event-notification/

It is very simple to implement - time to throw out all the ugly cron jobs and list-loops.

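A minimal sketch of enabling this for a bucket with the AWS CLI (the bucket name and queue ARN are hypothetical):

# notification.json -- deliver object create/delete events to an SQS queue:
# {
#   "QueueConfigurations": [
#     {
#       "QueueArn": "arn:aws:sqs:us-east-1:123456789012:media-changes",
#       "Events": ["s3:ObjectCreated:*", "s3:ObjectRemoved:*"]
#     }
#   ]
# }

aws s3api put-bucket-notification-configuration \
    --bucket my-media-bucket \
    --notification-configuration file://notification.json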