bash gzip: stdout: File too large when running customized backup script
Disclaimer: this page is a Chinese–English translation of a popular StackOverflow question, provided under the CC BY-SA 4.0 license. If you use it, you must do so under the same CC BY-SA license, cite the original URL and author information, and attribute it to the original authors (not me): StackOverflow
Original URL: http://stackoverflow.com/questions/2699946/
Asked by Elitmiar
I've created a plain and simple backup script that only backs up certain files and folders.
tar -zcf $DIRECTORY/var.www.tar.gz /var/www
tar -zcf $DIRECTORY/development.tar.gz /development
tar -zcf $DIRECTORY/home.tar.gz /home
Now this script runs for about 30 minutes and then gives me the following error:
gzip: stdout: File too large
Are there any other solutions I can use to back up my files with shell scripting, or a way to solve this error? I'm grateful for any help.
Answered by Jürgen Hötzel
File too large is an error message from your libc: the output has exceeded the file size limit of your filesystem.
So this is not a gzip issue.
Options: use another filesystem, or use split:
tar czf - www | split -b 1073741824 - www-backup.tar.
creates the backup.
Restore it from multiple parts:
cat www-backup.tar.* | gunzip -c | tar xvf -
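On systems with GNU coreutils, split also understands human-readable size suffixes, so the same idea can be written a little more legibly (a minimal sketch, assuming GNU split and the /var/www directory from the question):

# stream the archive through split, producing 1 GiB pieces named www-backup.tar.aa, www-backup.tar.ab, ...
tar czf - /var/www | split -b 1G - www-backup.tar.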
Answered by Adrian
Can the file system you are backing up to support large files?
Specifically, FAT32 has a limit of ~4GB in a single file, and other filesystems have similar limits.
If your backup is running for 30 minutes, the file could easily be reaching that sort of size.
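If you're not sure what the target filesystem is, one quick way to check on a Linux host is df -T with the backup directory from the question's script (an illustrative command, not part of the original answer):

# print the filesystem type (ext3, ext4, vfat, ...) of the device holding $DIRECTORY
df -T $DIRECTORY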
Answered by frankc
Use a different compression utility, like compress or bzip2.
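For instance, GNU tar can invoke bzip2 directly via the -j flag; adapting one line from the question's script (a sketch, not from the original answer — tighter compression only helps if it keeps the archive under the filesystem's size limit):

# same backup as before, but compressed with bzip2 instead of gzip
tar -cjf $DIRECTORY/var.www.tar.bz2 /var/www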

