Disclaimer: this page is a Chinese-English translation of a popular StackOverflow question. It is provided under the CC BY-SA 4.0 license; if you use or share it, you must do so under the same license and attribute it to the original authors (not me): StackOverflow
Original: http://stackoverflow.com/questions/10571335/
How to get a list of available files using wget or curl?
Asked by nachocab
I'd like to know if it's possible to do an ls of a URL, so I can see what *.js files are available in a website, for example. Something like:
wget --list-files -A.js stackoverflow.com
and get
ajax/libs/jquery/1.7.1/jquery.min.js
js/full.js
js/stub.js
...
Accepted answer by Lars Kotthoff
You can't do the equivalent of an ls unless the server provides such listings itself. You could however retrieve index.html and then check for includes, e.g. something like
wget -O - http://www.example.com | grep "type=.\?text/javascript.\?"
Note that this relies on the HTML being formatted in a certain way -- in this case with the includes on individual lines for example. If you want to do this properly, I'd recommend parsing the HTML and extracting the javascript includes that way.
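To make that recommendation a bit more concrete, here is a sketch (my addition, not part of the original answer) that extracts the src attribute from each &lt;script&gt; tag using grep and sed. For illustration it runs against a saved sample page; against a live site you would pipe `curl -s http://www.example.com` into the same filter. Note it is still regex-based, so a real HTML parser remains the more robust choice.

```shell
# Hypothetical sketch: list the src attribute of every <script> tag.
# sample.html stands in for a fetched page, e.g.:
#   curl -s http://www.example.com > sample.html
cat <<'EOF' > sample.html
<html><head>
<script type="text/javascript" src="ajax/libs/jquery/1.7.1/jquery.min.js"></script>
<script src="js/full.js"></script>
</head></html>
EOF

# 1. grab each <script ... src="..."> opening tag
# 2. isolate the src="..." attribute
# 3. strip the src=" prefix and trailing quote
grep -oE '<script[^>]+src="[^"]+"' sample.html \
  | grep -oE 'src="[^"]+"' \
  | sed 's/^src="//; s/"$//'
# prints:
# ajax/libs/jquery/1.7.1/jquery.min.js
# js/full.js
```

Like the grep one-liner in the answer, this assumes the src attribute is double-quoted and sits in the same tag as `<script`; multi-line tags or single-quoted attributes would slip through.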