Disclaimer: this page is a translated mirror of a popular StackOverflow question, provided under the CC BY-SA 4.0 license. If you use or share it, you must do so under the same CC BY-SA terms and attribute it to the original authors (not me): StackOverFlow
Original question: http://stackoverflow.com/questions/445643/
How to delete files via FTP when directory has over 100,000 files?
Asked by Zoredache
I went to upload a new file to my web server, only to get a message back saying that my disk quota was full... It wasn't my allotted space I had used up, but rather my allotted FILE QUANTITY. My host caps my total number of files at about 260,000.
Checking through my folders, I believe I found the culprit...
I have a small DVD database application (Video dB By split Brain) that I have installed and hidden away on my web site for my own personal use. It apparently caches data from IMDB, and over the years has secretly amassed what is probably close to a MIRROR of IMDB at this point. I don't know for certain, but I did have a second (inactive) copy of the program on the host, created a few years back, that I was using for testing while modifying portions of it. The cache folder in this inactive copy had 40,000 files totalling 2.3GB. I was able to delete that folder over FTP, but it took over an hour. Thankfully it also gave me some much-needed breathing room.
...But now, as you can imagine, the cache folder for the active copy of this web app likely has closer to 150,000 files totalling about 7GB of data.
This is where my problem comes in... I use FlashFXP as my FTP client, and whenever I try to delete the cache folder, or even just view its contents, it sits there trying to load a file list for a good 5 minutes and then loses the connection to the server...
My host has a web-based file browser, and it crashes when trying to do this... as do free online FTP clients like net2ftp.com. I don't have SSH access on this server, so I can't log in directly to delete the files either.
Does anyone have any idea how I can delete these files? Is there a different FTP program I could download that would have better luck... or perhaps a small script I could run that would take care of it?
Any help would be greatly appreciated.
Answered by Zoredache
Anyone have any idea how I can delete these files?
Submit a support request asking them to delete it for you?
Answered by Justin Scott
It sounds like it might be time for a command-line FTP utility; one ships with just about every operating system. With that many files, I would write a script for my command-line FTP client that goes to the folder in question and performs a directory listing, redirecting the output to a file. Then, use magic (or Perl or whatever) to process that file into a new FTP script that runs a delete command against all of the files. Yes, it will take a long time to run.
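That listing-then-delete pipeline can be sketched directly in Python with the standard ftplib module, skipping the intermediate script file altogether. This is only an illustration of the approach, not the asker's setup: the host name, credentials, and cache path below are placeholders.

```python
from ftplib import FTP


def files_to_delete(names):
    """Filter out the '.' and '..' entries that some servers include in NLST output."""
    return [n for n in names if n not in (".", "..")]


def purge_directory(host, user, password, path):
    """Delete every file in `path` over FTP, then remove the directory itself.

    NLST returns bare file names, which is far lighter than the full LIST
    output that makes GUI clients time out on huge directories.
    """
    ftp = FTP(host)
    ftp.login(user, password)
    ftp.cwd(path)
    for name in files_to_delete(ftp.nlst()):
        ftp.delete(name)  # one DELE round trip per file: slow but steady
    ftp.cwd("..")
    ftp.rmd(path.rstrip("/").rsplit("/", 1)[-1])  # remove the now-empty folder
    ftp.quit()
```

Called as, say, `purge_directory("ftp.example.com", "user", "secret", "/videodb/cache")` (all placeholders), expect it to run for hours at one DELE per file; but unlike a GUI client it never has to render a 150,000-entry listing, so it should not drop the connection the same way.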
If the server supports wildcards, do that instead and just delete *.*
If that all seems like too much work, open a support ticket with your hosting provider and ask them to clean it up on the server directly.
Having said all that, this isn't really a programming question and should probably be closed.
Answered by Norman Ramsey
We had a question a while back where I ran an experiment showing that Firefox can browse a directory with 10,000 files over FTP with no problem. Presumably 150,000 will also be OK. Firefox won't help you delete, but it might be helpful for capturing the names of the files you need to delete.
But first I would just try the command-line client ncftp. It is well engineered and I have had good luck with it in the past. You can delete a large number of files at once using shell patterns. And it is available for Windows, macOS, Linux, and many other platforms.
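A session might look roughly like the sketch below. The host, login name, and cache path are placeholders, and the exact commands may vary by ncftp version, so check its built-in help before running anything destructive:

```
$ ncftp -u myuser ftp.example.com
ncftp /> cd /videodb/cache
ncftp /videodb/cache> rm *          # the shell pattern expands into many DELE commands
ncftp /videodb/cache> cd ..
ncftp /videodb> rmdir cache
ncftp /videodb> quit
```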
If that doesn't work: you sound like a long-term customer, so could you beg your ISP for the privilege of a shell account for a week, so that you can log in remotely with PuTTY or ssh and blow away the entire directory with a single rm -r command?
Answered by Gatorhall
If your ISP provides SSH access, you can remove all the files with a single rm command.
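One reason the single rm -r matters at this scale: something like rm cache/* makes the shell expand all 150,000 names onto one command line and can fail with "Argument list too long", whereas rm -r walks the tree itself. A self-contained demonstration in a scratch directory (the real target would be wherever the cache actually lives on the server):

```shell
# Build a throwaway directory with many files, then remove it in one go.
mkdir -p /tmp/cache-demo
for i in $(seq 1 1000); do
    : > "/tmp/cache-demo/file$i"   # create an empty file
done
rm -r /tmp/cache-demo              # one command, no per-name expansion
```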
If there is no command-line access, you can try a more robust FTP client such as CrossFTP. It works on Windows, Mac, and Linux. When you delete a huge number of files on your server, it queues the delete operations, so you don't need to reload the folder listing again. If you restart CrossFTP, the queue is restored and the deletion continues.