Downloading large files reliably in PHP

Disclaimer: this page is a translation of a popular StackOverflow question, provided under the CC BY-SA 4.0 license. If you use or share it, you must do so under the same CC BY-SA license and attribute it to the original authors (not me). Original question: http://stackoverflow.com/questions/597159/

Date: 2020-08-24 23:12:27 | Source: igfitidea

Downloading large files reliably in PHP

Tags: php, download

Asked by Erwing

I have a PHP script on a server to send files to recipients: they get a unique link and can then download large files. Sometimes the transfer fails and the file is corrupted or never finishes. I am wondering if there is a better way to send large files.

Code:

$f = fopen(DOWNLOAD_DIR.$database[$_REQUEST['fid']]['filePath'], 'r');
while(!feof($f)){
    print fgets($f, 1024);
}
fclose($f);

I have seen functions such as

http_send_file
http_send_data

But I am not sure if they will work.

What is the best way to solve this problem?

Regards
erwing

Answered by garrow

If you are sending truly large files and are worried about the impact this will have, you could use the x-sendfile header.

See the SO question using-xsendfile-with-apache-php, and a howto: blog.adaniels.nl/how-i-php-x-sendfile/

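
With mod_xsendfile installed, the PHP side reduces to emitting headers and letting Apache stream the file itself. Here is a minimal sketch; the file path and the build_xsendfile_headers() helper are this example's own assumptions, not anything from the answer above:

```php
<?php
// Hypothetical helper: returns the header lines as an array so they can be
// inspected in a test; a real script would pass each one straight to header().
function build_xsendfile_headers(string $path): array
{
    return [
        'X-Sendfile: ' . $path,                 // Apache (mod_xsendfile) picks this up
        'Content-Type: application/octet-stream',
        'Content-Disposition: attachment; filename="' . basename($path) . '"',
    ];
}

foreach (build_xsendfile_headers('/var/files/big.iso') as $h) {
    // header($h); // uncomment when running under Apache with mod_xsendfile enabled
    echo $h, "\n";
}
```

Apache intercepts the X-Sendfile header, discards it, and serves the named file on its own, so the PHP process neither loads the file into memory nor stays busy for the duration of the download.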
Answered by garrow

The best solution would be to rely on lighty or Apache, but if it must be PHP, I would use PEAR's HTTP_Download (no need to reinvent the wheel, etc.), which has some nice features, like:

  • Basic throttling mechanism
  • Ranges (partial downloads and resuming)

See intro/usage docs.

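
The "Ranges" feature above boils down to honoring the HTTP Range request header. As a rough sketch of the mechanics HTTP_Download handles for you (the parse_byte_range() helper and its validation rules are assumptions of this example, not PEAR's API):

```php
<?php
// Hypothetical helper illustrating single-range "Range: bytes=A-B" parsing;
// returns [start, end] byte offsets within a file of $filesize bytes,
// or null when the header is absent or malformed.
function parse_byte_range(?string $header, int $filesize): ?array
{
    if ($header === null || !preg_match('/^bytes=(\d*)-(\d*)$/', $header, $m)) {
        return null;
    }
    [, $start, $end] = $m;
    if ($start === '' && $end === '') {
        return null;                              // "bytes=-" is invalid
    }
    if ($start === '') {                          // "bytes=-500": the last 500 bytes
        $start = max(0, $filesize - (int) $end);
        $end   = $filesize - 1;
    } elseif ($end === '') {                      // "bytes=500-": from offset 500 to EOF
        $end = $filesize - 1;
    } else {
        $end = min((int) $end, $filesize - 1);    // clamp to the file's last byte
    }
    $start = (int) $start;
    $end   = (int) $end;
    return $start <= $end ? [$start, $end] : null;
}

// A server honoring the range would then respond "206 Partial Content" with
// a Content-Range: bytes {start}-{end}/{filesize} header and fseek() to {start}.
```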
Answered by trejder

Chunking files is the fastest/simplest method in PHP, if you can't or don't want to use something a bit more professional like cURL, mod_xsendfile on Apache, or some dedicated script.

$filename = $filePath.$filename;

$chunksize = 5 * (1024 * 1024); //5 MB (= 5 242 880 bytes) per one chunk of file.

if(file_exists($filename))
{
    set_time_limit(300);

    $size = intval(sprintf("%u", filesize($filename)));

    header('Content-Type: application/octet-stream');
    header('Content-Transfer-Encoding: binary');
    header('Content-Length: '.$size);
    header('Content-Disposition: attachment;filename="'.basename($filename).'"');

    if($size > $chunksize)
    { 
        $handle = fopen($filename, 'rb'); 

        while (!feof($handle))
        { 
          print(@fread($handle, $chunksize));

          ob_flush();
          flush();
        } 

        fclose($handle); 
    }
    else readfile($filename);

    exit;
}
else echo 'File "'.$filename.'" does not exist!';

Ported from richnetapps.com / NeedBee. Tested on 200 MB files, on which readfile() died even with the maximum allowed memory limit set to 1G, five times the downloaded file's size.

BTW: I also tested this on files >2GB, but PHP only managed to write the first 2GB of the file and then broke the connection. File-related functions (fopen, fread, fseek) use INT, so you ultimately hit the 2GB limit. The above-mentioned solutions (i.e. mod_xsendfile) seem to be the only option in this case.

EDIT: Make 100% sure that your file is saved in UTF-8. If you omit that, downloaded files will be corrupted. This is because this solution uses print to push each chunk of the file to the browser.

Answered by Andreas Baumgart

We've been using this in a couple of projects and it works quite fine so far:

/**
 * Copy a file's content to php://output.
 *
 * @param string $filename
 * @return void
 */
protected function _output($filename)
{
    $filesize = filesize($filename);

    $chunksize = 4096;
    if($filesize > $chunksize)
    {
        $srcStream = fopen($filename, 'rb');
        $dstStream = fopen('php://output', 'wb');

        $offset = 0;
        while(!feof($srcStream)) {
            $offset += stream_copy_to_stream($srcStream, $dstStream, $chunksize, $offset);
        }

        fclose($dstStream);
        fclose($srcStream);   
    }
    else 
    {
        // stream_copy_to_stream() behaves strangely when filesize > chunksize:
        // it seems to never hit EOF.
        // On the other hand, file_get_contents() is not scalable,
        // so we only use file_get_contents() on small files.
        echo file_get_contents($filename);
    }
}

Answered by Hummdis

Create a symbolic link to the actual file and make the download link point at the symbolic link. Then, when the user clicks on the DL link, they'll get a file download from the real file but named from the symbolic link. It takes milliseconds to create the symbolic link and is better than trying to copy the file to a new name and download from there.

For example:

<?php

// validation code here

$realFile = "Hidden_Zip_File.zip";
$id = "UserID1234";

if ($_COOKIE['authvalid'] == "true") {
    $newFile = sprintf("myzipfile_%s.zip", $id); //creates: myzipfile_UserID1234.zip

    system(sprintf('ln -s %s %s', escapeshellarg($realFile), escapeshellarg($newFile)), $retval);

    if ($retval != 0) {
        die("Error getting download file.");
    }

    $dlLink = "/downloads/hiddenfiles/".$newFile;
}

// rest of code

?>

<a href="<?php echo $dlLink; ?>">Download File</a>

That's what I did because Go Daddy kills the script after about 2 minutes 30 seconds of running. This prevents that problem and hides the actual file.

You can then set up a CRON job to delete the symbolic links at regular intervals.

This whole process then sends the file to the browser, and it doesn't matter how long the download takes, since the transfer itself isn't handled by a script.

Answered by Andrew Grant

For downloading files the easiest way I can think of would be to put the file in a temporary location and give them a unique URL that they can download via regular HTTP.

As part of generating these links, you could also remove files that are more than X hours old.

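
A sketch of that idea: publish the file under a random name in a web-served directory, then purge stale files. The publish_temp_copy() and purge_old_files() helpers, the paths, and the TTL are all illustrative assumptions, not part of the answer:

```php
<?php
// Publish a copy of $srcFile in $publicDir under an unguessable name and
// return the new path (which you would map to a download URL).
function publish_temp_copy(string $srcFile, string $publicDir): string
{
    $token = bin2hex(random_bytes(16));                       // unguessable URL part
    $ext   = (string) pathinfo($srcFile, PATHINFO_EXTENSION);
    $dest  = $publicDir . '/' . $token . ($ext !== '' ? '.' . $ext : '');
    copy($srcFile, $dest);                                    // a symlink() would also work
    return $dest;
}

// Delete files in $publicDir older than $maxAgeSeconds; returns how many were removed.
function purge_old_files(string $publicDir, int $maxAgeSeconds): int
{
    $removed = 0;
    foreach (glob($publicDir . '/*') as $file) {
        if (is_file($file) && time() - filemtime($file) > $maxAgeSeconds) {
            unlink($file);
            $removed++;
        }
    }
    return $removed;
}

// Example: publish a file, then purge anything older than 6 hours.
// $path = publish_temp_copy('/data/report.zip', '/var/www/downloads');
// purge_old_files('/var/www/downloads', 6 * 3600);
```

The purge function would typically run from cron, so cleanup does not depend on anyone requesting a download.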
Answered by UnkwnTech

When I have done this in the past I've used this:

set_time_limit(0); //Set the execution time to infinite.
header('Content-Type: application/exe'); //This was for a LARGE exe (680MB) so the content type was application/exe
readfile($fileName); //readfile will stream the file.

These 3 lines of code will do all the work of the download: readfile() will stream the entire specified file to the client. Be sure to set an infinite time limit, or the script may run out of time before the file finishes streaming.

Answered by lpfavreau

If you are using lighttpd as a webserver, an alternative for secure downloads would be to use ModSecDownload. It needs server configuration but you'll let the webserver handle the download itself instead of the PHP script.

Generating the download URL would look like that (taken from the documentation) and it could of course be only generated for authorized users:

<?php

  $secret = "verysecret";
  $uri_prefix = "/dl/";

  # filename
  # please note file name starts with "/" 
  $f = "/secret-file.txt";

  # current timestamp
  $t = time();

  $t_hex = sprintf("%08x", $t);
  $m = md5($secret.$f.$t_hex);

  # generate link
  printf('<a href="%s%s/%s%s">%s</a>',
         $uri_prefix, $m, $t_hex, $f, $f);
?>

Of course, depending on the size of the files, using readfile() as proposed by UnkwnTech is excellent. And using xsendfile as proposed by garrow is another good idea, also supported by Apache.

Answered by ahmed

header('Content-Length: '.filesize($filename));
header('Content-Type: application/octet-stream'); // generic binary type; use application/zip for ZIP files
header('Content-Disposition: attachment; filename="downloadpackage.zip"');
header('Content-Transfer-Encoding: binary');
ob_end_clean(); // discard any buffered output so it doesn't corrupt the file
readfile($filename);
exit();

Answered by Nawras

I had the same problem; it was solved by adding session_cache_limiter('none'); before starting the session.

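
A minimal sketch of that fix (the explanatory comments are this example's framing, not the original answer's): the call must come before session_start(), because the default limiter emits Cache-Control/Pragma no-cache headers along with the response, which some browsers treat as a reason to abort or truncate the download.

```php
<?php
// Disable PHP's session cache limiter *before* the session starts, so no
// no-cache headers accompany the download response.
session_cache_limiter('none');
var_dump(session_cache_limiter()); // the active limiter is now "none"
// session_start();                // start the session only after setting it
// ... then authenticate, send Content-Type / Content-Disposition, readfile()
```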