Disclaimer: this page is a Chinese–English translation of a popular StackOverflow question, provided under the CC BY-SA 4.0 license. If you use or share it, you must do so under the same license and attribute it to the original authors (not me): StackOverflow
Original link: http://stackoverflow.com/questions/7719628/
How to read word by word from a file in a for loop
Asked by user968623
This code is supposed to read a directory of text files and match them against input.txt. I got reading the word from input.txt working, but I don't know how to extract each word from a text file and compare it. The file is in paragraph form, so I can't look for a similar character and such. Is there a way to read every word, one at a time, and compare?
#!/bin/bash
findkeyword () {
    file=$1
    keyword=$2
    value=$3
    count=0
    while read line
    do
        # problem right here
        set -- $line
        a=$(expr length "$file")
        for i in '$line'; do
            if [ "$i" = "$keyword" ]; then
                count=`expr $count + 1`;
            fi
        done
    done <$file
    echo "Profile: " $file
    scorefile $value $count
}
scorefile () {
    value=$1
    count=$2
    echo "Score: " $((value * count))
}
while read line
do
    set -- $line
    keyword=$1
    value=$2
    echo "key: " $keyword
    echo "value: " $value
    for xx in `ls submissions/*`
    do
        filename=$xx
        findkeyword $filename $keyword $value
    done
done <input.txt
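A note on the line marked "problem right here": single quotes around $line stop the shell from expanding and splitting it, so the for loop runs exactly once over the literal string '$line'. With the quotes removed, the shell splits the line into words that can be compared one by one. A minimal sketch of that behavior, with made-up sample data:

```shell
line="the cat sat on the mat"
keyword="the"
count=0
for i in $line; do                    # unquoted: word-split on whitespace
    if [ "$i" = "$keyword" ]; then
        count=$((count + 1))
    fi
done
echo "$count"    # 2
```

With `for i in '$line'` instead, the loop body would run once with i set to the five characters $line, and count would stay 0.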
Answered by user unknown
To count the occurrences of a word in a file, just use grep -c (count):
for word in $(<input.txt); do echo -n $word " " ; grep -c $word $file; done
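One caveat worth adding to this answer: grep -c counts matching lines, not individual matches, so a keyword appearing twice on one line is counted once. If per-occurrence counts matter for the score, grep -o (print each match on its own line) piped to wc -l is a common alternative, and -w restricts matching to whole words. A small sketch using a throwaway file:

```shell
# sample file: "cat" appears twice on line 1 and once on line 2
printf 'cat dog cat\ncat\n' > /tmp/sample.txt

grep -c cat /tmp/sample.txt            # matching lines: 2
grep -ow cat /tmp/sample.txt | wc -l   # individual whole-word matches: 3
```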
For different files in a dir, never¹ ever use ls.
for file in submissions/*
do
    echo "$file"
    for word in $(<input.txt)
    do
        echo -n "$word " ; grep -c "$word" "$file"
    done
done
¹ In very, very rare cases, it might be the best solution, but blanks, linefeeds and special characters in filenames will corrupt your commands.
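If you really do want to read a file word by word in the shell itself, as the question title asks, a while read loop over lines plus unquoted expansion does the splitting without any external tools. A minimal POSIX sketch (the file name and keyword here are made-up sample data, not from the question):

```shell
file=/tmp/words.txt
printf 'cat dog cat\ncat\n' > "$file"

keyword="cat"
count=0
while read -r line; do
    for w in $line; do                 # unquoted: split the line into words
        [ "$w" = "$keyword" ] && count=$((count + 1))
    done
done < "$file"
echo "$count"    # 3
```

Unlike grep -c, this counts every occurrence, including repeats within a single line.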

