Disclaimer: this page is a Chinese–English translation of a popular StackOverflow question, provided under the CC BY-SA 4.0 license. If you use or share it, you must do so under the same CC BY-SA license and attribute it to the original authors (not me): StackOverFlow
Original URL: http://stackoverflow.com/questions/6527811/
How to download large files through PHP script
Asked by brenns10
Using PHP, I am trying to serve large files (up to possibly 200MB) which aren't in a web-accessible directory due to authorization issues. Currently, I use a readfile() call along with some headers to serve the file, but it seems that PHP is loading it into memory before sending it. I intend to deploy on a shared hosting server, which won't allow me to use much memory or add my own Apache modules such as X-Sendfile.

I can't let my files be in a web-accessible directory for security reasons. Does anybody know a method that is less memory intensive which I could deploy on a shared hosting server?
EDIT:
if (/* My authorization here */) {
    $path = "/uploads/";
    $name = $row[0]; // Filename retrieved from MySQL
    $fullname = $path . $name; // Build the full path
    $fd = fopen($fullname, "rb");
    if ($fd) {
        $fsize = filesize($fullname);
        $path_parts = pathinfo($fullname);
        $ext = strtolower($path_parts["extension"]);
        switch ($ext) {
            case "pdf":
                header("Content-type: application/pdf");
                break;
            case "zip":
                header("Content-type: application/zip");
                break;
            default:
                header("Content-type: application/octet-stream");
                break;
        }
        header("Content-Disposition: attachment; filename=\"" . $path_parts["basename"] . "\"");
        header("Content-length: $fsize");
        header("Cache-control: private"); // Use this to open files directly
        while (!feof($fd)) {
            $buffer = fread($fd, 1024 * 1024); // Read 1MB at a time
            echo $buffer;
            ob_flush();
            flush(); // These two flush calls seem to have helped with performance
        }
        fclose($fd);
    } else {
        echo "Error opening file";
    }
}
Accepted answer by Francois Deschenes
If you use fopen and fread instead of readfile, that should solve your problem.
There's a solution in the PHP readfile documentation showing how to use fread to do what you want.
Answered by ursitesion
To download large files from server, I have changed the below settings in php.ini file:
upload_max_filesize = 1500M
max_input_time = 1000
memory_limit = 640M
max_execution_time = 1800
post_max_size = 2000M
Now, I am able to upload and download a 175MB video on the server. Since I have a dedicated server, making these changes was easy.
Below is the PHP script to download the file. I have not made any changes to this code snippet for large file sizes.
// Begin writing headers ($filetype, $content_type, $filepath and $filename
// are assumed to have been set earlier in the script)
ob_clean(); // Clear any previously written headers in the output buffer
if ($filetype == 'application/zip')
{
    if (ini_get('zlib.output_compression'))
        ini_set('zlib.output_compression', 'Off');
    $fp = @fopen($filepath, 'rb');
    if (strstr($_SERVER['HTTP_USER_AGENT'], "MSIE"))
    {
        header("Content-Type: $content_type");
        header('Content-Disposition: attachment; filename="' . $filename . '"');
        header('Expires: 0');
        header('Cache-Control: must-revalidate, post-check=0, pre-check=0');
        header("Content-Transfer-Encoding: binary");
        header('Pragma: public');
        header("Content-Length: " . filesize(trim($filepath)));
    }
    else
    {
        header("Content-Type: $content_type");
        header('Content-Disposition: attachment; filename="' . $filename . '"');
        header("Content-Transfer-Encoding: binary");
        header('Expires: 0');
        header('Pragma: no-cache');
        header("Content-Length: " . filesize(trim($filepath)));
    }
    fpassthru($fp);
    fclose($fp);
}
elseif ($filetype == 'audio' || $filetype == 'video')
{
    global $mosConfig_absolute_path, $my;
    ob_clean();
    header("Pragma: public");
    header('Expires: 0');
    header('Cache-Control: no-store, no-cache, must-revalidate');
    header('Cache-Control: pre-check=0, post-check=0, max-age=0');
    header("Cache-Control: public");
    header("Content-Description: File Transfer");
    header("Content-Type: application/force-download");
    header("Content-Type: $content_type");
    header("Content-Length: " . filesize(trim($filepath)));
    header("Content-Disposition: attachment; filename=\"$filename\"");
    // Force the download
    header("Content-Transfer-Encoding: binary");
    @readfile($filepath);
}
else { // For all other file types except zip and audio/video
    ob_clean();
    header("Pragma: public");
    header('Expires: 0');
    header('Cache-Control: no-store, no-cache, must-revalidate');
    header('Cache-Control: pre-check=0, post-check=0, max-age=0');
    header("Cache-Control: public");
    header("Content-Description: File Transfer");
    header("Content-Type: $content_type");
    header("Content-Length: " . filesize(trim($filepath)));
    header("Content-Disposition: attachment; filename=\"$filename\"");
    // Force the download
    header("Content-Transfer-Encoding: binary");
    @readfile($filepath);
}
exit;
Answered by Karoly Horvath
If you care about performance, there is X-Sendfile, available as a module for Apache, nginx and lighttpd. Check the user comments in the readfile() documentation.
There are also modules for these webservers which accept a url with an additional hash value which allows downloading the file for a short time period. This can be also used to solve authorization issues.
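With Apache's mod_xsendfile enabled (which the questioner's shared host may not allow), the PHP side reduces to setting a header and letting the web server stream the file. A minimal sketch, where the file path and name are illustrative:

```php
<?php
// Requires Apache with mod_xsendfile installed and "XSendFile On" in the
// server or vhost configuration. Apache, not PHP, streams the file, so
// PHP's memory usage stays flat regardless of file size.
$filepath = '/uploads/report.pdf'; // illustrative path outside the web root

header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="report.pdf"');
header('X-Sendfile: ' . $filepath); // mod_xsendfile intercepts this header
exit; // Send no body from PHP; the module serves the file itself
```

The equivalent on nginx uses an internal location and the X-Accel-Redirect header instead.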
Answered by Winfield Trail
You could also handle this in the style of the Gordian Knot - that is to say, sidestep the problem entirely. Keep the files in a non-accessible directory, and when a download is initiated you can simply
$tempstring = rand();
symlink('/filestore/filename.extension', '/www/downloads/' . $tempstring . '-filename.extension');
echo 'Your download is available here: <a href="/downloads/' . $tempstring . '-filename.extension">download</a>';
and set up a cron job to unlink() any download links older than 10 minutes. Virtually no processing of your data is required, no massaging of HTTP headers, etc.
There are even a couple of libraries out there for just this purpose.