How to download multiple URLs using wget with a single command (Windows)?
Disclaimer: this page is a translation of a popular StackOverflow question, provided under the CC BY-SA 4.0 license. You are free to use and share it, but you must do so under the same license and attribute it to the original authors (not me): StackOverflow
Original question: http://stackoverflow.com/questions/14578264/
Asked by Olaf Dietsche
I am using the following command to download a single webpage with all of its images and JS using wget on Windows 7:
wget -E -H -k -K -p -e robots=off -P /Downloads/ http://www.vodafone.de/privat/tarife/red-smartphone-tarife.html
It downloads the HTML as required, but when I tried to pass a text file containing a list of 3 URLs to download, it didn't give any output. Below is the command I am using:
wget -E -H -k -K -p -e robots=off -P /Downloads/ -i ./list.txt -B 'http://'
I tried this also:
wget -E -H -k -K -p -e robots=off -P /Downloads/ -i ./list.txt
This text file already had http:// prepended to the URLs in it.
list.txt contains the list of 3 URLs which I need to download using a single command. Please help me resolve this issue.
Answered by Olaf Dietsche
From man wget:
2 Invoking
By default, Wget is very simple to invoke. The basic syntax is:
wget [option]... [URL]...
So, just use multiple URLs:
wget URL1 URL2
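For example, with the options from the original question and two of the URLs from list.txt below (a minimal sketch, written as one line):
wget -E -H -k -K -p -e robots=off -P /Downloads/ http://www.vodafone.de/privat/tarife/red-smartphone-tarife.html http://www.verizonwireless.com/smartphones-2.shtml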
Or using the links from the comments:
$ cat list.txt
http://www.vodafone.de/privat/tarife/red-smartphone-tarife.html
http://www.verizonwireless.com/smartphones-2.shtml
http://www.att.com/shop/wireless/devices/smartphones.html
and your command line
wget -E -H -k -K -p -e robots=off -P /Downloads/ -i ./list.txt
works as expected.
Answered by Tek Mentor
First create a text file with the URLs that you need to download, e.g. download.txt.
download.txt will be as below:
http://www.google.com
http://www.yahoo.com
Then use the command wget -i download.txt to download the files. You can add many URLs to the text file.
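A minimal end-to-end sketch of these two steps, assuming a Unix-like shell (on Windows the file can simply be created in a text editor instead):
# write one URL per line, then hand the file to wget
printf '%s\n' http://www.google.com http://www.yahoo.com > download.txt
wget -i download.txt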
Answered by Ardhi
pedantic version:
for x in {'url1','url2'}; do wget $x; done
the advantage of it is that each download is still handled as a plain single-URL wget command
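Since the question is about Windows 7, a roughly equivalent loop for cmd.exe might look like the sketch below; this is my own assumption, not part of the original answer (inside a .bat file the variable must be written as %%u):
rem read list.txt line by line and call wget once per URL
for /F "usebackq delims=" %u in ("list.txt") do wget %u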