Bash script for archiving log files

Disclaimer: this page is a Chinese-English side-by-side translation of a popular StackOverflow question, provided under the CC BY-SA 4.0 license. If you reuse or share it, you must follow the same CC BY-SA license and attribute it to the original authors (not me), citing the original address: http://stackoverflow.com/questions/18850214/

Date: 2020-09-18 06:36:04  Source: igfitidea

Script for archiving log files

linux bash

Asked by Mercer

I have created a Bash script for archiving log files:

#!/bin/bash

# For every "log" directory found.
for folder in $(find . -name log -type d )
do :
    # For every log directory containing ".log" files older than 30 days.
    for file in $(find $folder -name "*.log" -mtime +30)
    do :
            # Archive the ".log" files.
        tar czf archive-log.tar.gz $folder/*.log
    done

    # If an archive exists.
    if [ -e $folder/*.tar.gz ]
         # Move the archive.
         then mv $folder/*.tar.gz $ARCHIVE
    fi

done

The output I get is:

[logs]$ ll
total 8
drwxr-xr-x 2 webadm webgrp 4096 sep 17 14:26 log_weblogic
-rwxr--r-- 1 webadm webgrp  456 sep 17 14:31 scriptArchivesLog

[log_weblogic]$ ll
total 200
-rw-r----- 1 webadm webgrp 98005 mai 16 04:04 test.log
-rw-r----- 1 webadm webgrp 98005 sep 13 15:29 WLS-CONSOLE-DOMAINE-PUB.log


[logs]$ ll
total 32
-rw-r--r-- 1 webadm webgrp 21734 sep 17 14:31 archive-log.tar.gz
drwxr-xr-x 2 webadm webgrp  4096 sep 17 14:26 log_weblogic
-rwxr--r-- 1 webadm webgrp   456 sep 17 14:31 scriptArchivesLog

When I execute my script, why does my archive contain all the files? I only want the files that match -mtime +30:

[logs]$ tar tvzf archive-log.tar.gz
-rw-r----- webadm/webgrp 98005 2013-05-16 04:04:00 ./log_weblogic/test.log
-rw-r----- webadm/webgrp 98005 2013-09-13 15:29:03 ./log_weblogic/WLS-CONSOLE-DOMAINE-PUB.log

Answered by pndc

You have made the critical error of not checking whether a program or library already exists that does what you want. In this case there is logrotate, which is probably already present on your system, diligently cleaning up the system logfiles in /var/log. As a bonus, it will already be configured to run periodically, so you won't even have to remember to set up a cron job.

There is a tutorial on using logrotate at https://www.linode.com/docs/uptime/logs/use-logrotate-to-manage-log-files
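
As a sketch of what such a rule looks like (the path, filename, and retention values here are assumptions for illustration, not taken from the question), a drop-in logrotate configuration might be:

```
# /etc/logrotate.d/weblogic  (hypothetical path and values)
/path/to/logs/*/*.log {
    monthly          # rotate once a month (roughly the 30-day window asked for)
    rotate 6         # keep six rotated archives
    compress         # gzip the rotated files
    missingok        # don't complain if no logs match
    notifempty       # skip empty files
}
```

logrotate reads every file in /etc/logrotate.d/ when cron runs it, so installing a file like this is usually all that is needed.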


Answered by devnull

Replace the following:


for file in $(find $folder -name "*.log" -mtime +30)
do :
        # Archive the ".log" files.
    tar czf archive-log.tar.gz $folder/*.log
done

with

tar czf archive-log.tar.gz $(find $folder -name "*.log" -mtime +30)
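
One caveat with this form (my addition, not part of the answer): word-splitting the output of `$(find …)` breaks on filenames containing spaces. With GNU find and GNU tar, a NUL-delimited pipeline avoids that:

```shell
# Same selection, but NUL-delimited so filenames containing spaces
# (or even newlines) reach tar intact (GNU find + GNU tar).
find "$folder" -name "*.log" -mtime +30 -print0 |
    tar czf archive-log.tar.gz --null -T -
```

Note that if find matches nothing, tar still creates an (empty) archive, so a separate existence check remains useful.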

Answered by anubhava

It is because of your tar command:


tar czf archive-log.tar.gz $folder/*.log

This actually archives all the *.log files, irrespective of the timestamps on those files.

gnu-tar has a switch:

--newer-mtime=date

for this use case.
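
Note the direction of the comparison, though: GNU tar's --newer-mtime keeps files modified *after* the given date, so on its own it selects recent files rather than the 30-day-old ones the question asks for. A sketch of the syntax (the archive name here is my own):

```shell
# Archive only the *.log files modified within the last 30 days.
# GNU tar keeps files NEWER than the date; older ones are skipped
# (directory entries themselves are still added to the archive).
tar czf recent-logs.tar.gz --newer-mtime="30 days ago" "$folder"
```

There is no inverse "--older-mtime" switch in GNU tar, which is why the find-based answers on this page filter the file list before handing it to tar.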

Answered by konsolebox

This is the wrong part:


for file in $(find $folder -name "*.log" -mtime +30) # For every log directory containing ".log" files older than 30 days.
do :
    tar czf archive-log.tar.gz $folder/*.log # Archive the ".log" files.
done

Despite the find searching for matching files, tar czf archive-log.tar.gz $folder/*.log would still archive all of those files.

You can change that to something like this instead:


readarray -t files < <(exec find $folder -name "*.log" -mtime +30) ## read those files (list) to array
[[ ${#files[@]} -gt 0 ]] && tar czf archive-log.tar.gz "${files[@]}"  ## create an archive if a file was found.
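
As an extra hardening step (my addition, assuming bash ≥ 4.4 for readarray -d and GNU find for -print0), NUL-delimited input keeps the array correct even for filenames with spaces or newlines:

```shell
# Read the matching files into an array, NUL-delimited, so every
# filename survives intact regardless of spaces or newlines.
readarray -d '' -t files < <(find "$folder" -name "*.log" -mtime +30 -print0)
# Create the archive only if at least one file matched.
[[ ${#files[@]} -gt 0 ]] && tar czf archive-log.tar.gz "${files[@]}"
```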

Answered by SriniV

Try this:


for folder in $(find . -name log -type d ) # For every "log" directory found.
do :
    for file in $(find $folder -name "*.log" -mtime +30) # For every log directory containing ".log" files older than 30 days.
    do :
        tar czf archive-log.tar.gz $folder/*.log && rm -f $folder/*.log  # Archive the ".log" files and remove the originals.
    done

    if [ -e $folder/*.tar.gz ] # If an archive exists.
         then mv $folder/*.tar.gz $ARCHIVE # Move the archive.
    fi

done
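
Pulling the fixes on this page together, a complete sketch of the script (the $ARCHIVE destination is an assumption carried over from the question; here it defaults to a hypothetical /tmp/archives):

```shell
#!/bin/bash
# Per log directory, archive only the *.log files older than 30 days,
# then move the resulting archive to $ARCHIVE.
ARCHIVE=${ARCHIVE:-/tmp/archives}   # assumed destination; adjust as needed
mkdir -p "$ARCHIVE"

find . -name log -type d | while read -r folder; do
    # Collect only the old files, so the archive never includes fresh logs.
    readarray -t files < <(find "$folder" -name "*.log" -mtime +30)
    if [[ ${#files[@]} -gt 0 ]]; then
        tar czf "$folder/archive-log.tar.gz" "${files[@]}" &&
            rm -f "${files[@]}"              # remove only what was archived
        mv "$folder"/*.tar.gz "$ARCHIVE"
    fi
done
```

Unlike the loop above, this removes only the files that actually went into the archive, so logs newer than 30 days are left in place.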