bash: How to see all Request URLs the server is doing (final URLs)
Disclaimer: this page is a Chinese-English parallel translation of a popular StackOverflow question, provided under the CC BY-SA 4.0 license. If you reuse or share it, you must follow the same CC BY-SA license, cite the original URL, and attribute it to the original authors (not me): StackOverflow
Original URL: http://stackoverflow.com/questions/33570391/
How to see all Request URLs the server is doing (final URLs)
Asked by Cedric
How can I list, from the command line, the URL requests that are made from the server (a *nix machine) to another machine?
For instance, I am on the command line of server ALPHA_RE. I ping google.co.uk and then ping bbc.co.uk. I would like to see, from the prompt:
google.co.uk bbc.co.uk
So, not the IP address of the machine I am pinging, and NOT a URL from servers that pass my request on to google.co.uk or bbc.co.uk, but the actual final URLs.
Note that only packages available in the normal Ubuntu repositories can be used, and it has to work from the command line.
Edit: The ultimate goal is to see what API URLs a PHP script (run by a cronjob) requests, and what API URLs the server requests 'live'. These mainly make GET and POST requests to several URLs, and I am interested in knowing the params:
Does it make requests to:
foobar.com/api/whatisthere?and=what&is=there&too=yeah
or to:
foobar.com/api/whatisthathere?is=it&foo=bar&green=yeah
And do the cron jobs or the server make any other GET or POST requests? And that, regardless of what response (if any) these APIs give.
Also, the API list is unknown, so you cannot grep for one particular URL.
Edit: (The OLD ticket specified: note that I cannot install anything on that server (no extra packages; I can only use the "normal" commands, like tcpdump, sed, grep, ...) // but since getting this information with tcpdump is pretty hard, I have since made it possible to install packages.)
Answered by Bert Neef
You can use tcpdump and grep to get info about the network traffic from the host; the following command line should get you all lines containing Host:
tcpdump -i any -A -vv -s 0 | grep -e "Host:"
If I run the above in one shell and start a Links session to stackoverflow I see:
Host: www.stackoverflow.com
Host: stackoverflow.com
If you want to know more about the actual HTTP request, you can also add patterns to the grep for GET, PUT or POST requests (e.g. -e "GET"), which gets you some info about the relative URL (combine it with the host determined earlier to get the full URL).
EDIT: based on your edited question I have tried to make some modifications. First, a tcpdump approach:
[root@localhost ~]# tcpdump -i any -A -vv -s 0 | egrep -e "GET" -e "POST" -e "Host:"
tcpdump: listening on any, link-type LINUX_SLL (Linux cooked), capture size 65535 bytes
E..v.[@[email protected].$....P....Ga .P.9.=...GET / HTTP/1.1
Host: stackoverflow.com
E....x@[email protected].$....P....Ga.mP...>;..GET /search?q=tcpdump HTTP/1.1
Host: stackoverflow.com
And an ngrep one:
[root@localhost ~]# ngrep -d any -vv -W byline | egrep -e "Host:" -e "GET" -e "POST"
GET //meta.stackoverflow.com HTTP/1.1..Host: stackoverflow.com..User-Agent:
GET //search?q=tcpdump HTTP/1.1..Host: stackoverflow.com..User-Agent: Links
My test case was running links stackoverflow.com, putting tcpdump in the search field and hitting enter.
This gets you all the URL info on one line. A nicer alternative might be to run a reverse proxy (e.g. nginx) on your own server, modify the hosts file (as shown in Adam's answer) so that all queries hit the proxy, have the reverse proxy forward them to the actual host, and use the proxy's logging features to get the URLs from there; those logs would probably be a bit easier to read.
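The reverse-proxy idea can be sketched roughly as follows; this is a minimal, untested sketch assuming nginx is installed, api.foobar.com is a hypothetical API host redirected to 127.0.0.1 via /etc/hosts, and REAL_API_IP is a placeholder for the API's actual IP address:

```nginx
# Hypothetical /etc/nginx/conf.d/api-sniff.conf (included in the http context).
# Log the method, full URL (host + URI with query params) and request body.
log_format api_urls '$time_local $request_method $host$request_uri "$request_body"';

server {
    listen 80;
    server_name api.foobar.com;
    access_log /var/log/nginx/api_urls.log api_urls;

    location / {
        # Forward to the real API by IP so the /etc/hosts override
        # does not loop the request back to this proxy.
        proxy_pass http://REAL_API_IP;
        proxy_set_header Host $host;
    }
}
```

Then tailing /var/log/nginx/api_urls.log shows every URL (with parameters) the PHP script requests.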
EDIT 2: If you use a command line such as:
ngrep -d any -vv -W byline | egrep -e "Host:" -e "GET" -e "POST" --line-buffered | perl -lne 'print $3.$2 if /(GET|POST) (.+?) HTTP\/1\.1\.\.Host: (.+?)\.\./'
you should see the actual URLs.
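If you would rather post-process a capture with a script than with a perl one-liner, the same extraction can be mirrored in Python. This is a hypothetical helper (full_url and REQUEST_RE are my names, not from the answer), assuming the captured request line renders the CR/LF between headers as "..", as in the output shown above:

```python
import re

# Capture the method, the relative path (with query params) and the Host
# header from a single flattened request line.
REQUEST_RE = re.compile(r"(GET|POST) (.+?) HTTP/1\.[01]\.\.Host: (.+?)\.\.")

def full_url(line):
    """Return 'host/path' for a captured HTTP request line, else None."""
    match = REQUEST_RE.search(line)
    if match is None:
        return None
    _method, path, host = match.groups()
    return host + path

print(full_url("GET /search?q=tcpdump HTTP/1.1..Host: stackoverflow.com..User-Agent: Links"))
# stackoverflow.com/search?q=tcpdump
```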
Answered by Adam
A simple solution is to modify your /etc/hosts file to intercept the API calls and redirect them to your own web server:
127.0.0.1 api.foobar.com
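The "own web server" on the receiving end can be as small as a stdlib HTTP server that logs whatever the redirected calls hit it with. A minimal sketch (LoggingHandler and run are my names, not from the answer; it answers everything with an empty 200, so it only works if you do not care about the API's real responses):

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

class LoggingHandler(BaseHTTPRequestHandler):
    """Print each intercepted request's method, host and path (with params)."""

    def _log_and_ack(self):
        # Host header + path reconstructs the full URL the script requested.
        print(f"{self.command} {self.headers.get('Host', '')}{self.path}")
        self.send_response(200)
        self.end_headers()

    do_GET = do_POST = _log_and_ack

def run(host="127.0.0.1", port=80):
    # Port 80 needs root; it is where the /etc/hosts redirect lands.
    HTTPServer((host, port), LoggingHandler).serve_forever()
```

With the hosts entry above in place, running this (as root, for port 80) and then triggering the cronjob prints lines like "GET api.foobar.com/api/whatisthere?and=what&is=there&too=yeah".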