Java: download large files using a servlet
Disclaimer: this page is a translation of a popular StackOverflow question, provided under the CC BY-SA 4.0 license. You are free to use or share it, but you must do so under the same license and attribute it to the original authors (not me): StackOverflow
Original question: http://stackoverflow.com/questions/2818507/
download large files using servlet
Asked by niks
I am using Apache Tomcat Server 6 and Java 1.6 and am trying to write large mp3 files to the ServletOutputStream for a user to download. Files range from 50 to 750 MB at the moment.
The smaller files aren't causing too much of a problem, but with the larger files it appears that it is being written into the heap which is then causing an OutOfMemory error and bringing down the entire server. I am also getting socket exception broken pipe.
File fileMp3 = new File(objDownloadSong.getStrSongFolder() + "/" + strSongIdName);
FileInputStream fis = new FileInputStream(fileMp3);
response.setContentType("audio/mpeg");
response.setHeader("Content-Disposition", "attachment; filename=\"" + strSongName + ".mp3\";");
response.setContentLength((int) fileMp3.length());
OutputStream os = response.getOutputStream();
try {
    int byteRead = 0;
    while ((byteRead = fis.read()) != -1) {
        os.write(byteRead);
    }
    os.flush();
} catch (Exception excp) {
    downloadComplete = "-1";
    excp.printStackTrace();
} finally {
    os.close();
    fis.close();
}
Answered by BalusC
but with the larger files it appears that it is being written into the heap which is then causing an OutOfMemory error and bringing down the entire server
The cause lies somewhere else than in the code snippet given so far. One possible cause would be reading the entire file into a byte[], but that doesn't seem to happen in the code you posted. Also, Tomcat 6 by default auto-flushes the response stream every 2KB. In the future, please include the entire stack trace in the question as well. It might indicate an HttpServletResponseWrapper and/or a Filter in the chain which is possibly buffering the entire response.
also getting socket exception broken pipe.
This just means that the other side has aborted the request. There is nothing you can do about it from the server side, and technically it should not do any harm either. You can safely ignore it.
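If those aborted requests clutter the log, one option is to stop printing a full stack trace for them. Below is a minimal sketch, reusing the question's fis, os and downloadComplete variables plus a byte buffer as the other answers suggest; it assumes the abort surfaces as an IOException (on Tomcat it is typically a ClientAbortException, which extends IOException):
byte[] buf = new byte[8 * 1024];
try {
    int n;
    while ((n = fis.read(buf)) != -1) {
        os.write(buf, 0, n);
    }
    os.flush();
} catch (IOException excp) {
    // A "broken pipe" here just means the client stopped the download
    // (closed the player, lost the connection, ...). There is nothing the
    // server can do about it, so record the outcome and move on instead
    // of dumping a stack trace.
    downloadComplete = "-1";
} finally {
    os.close();
    fis.close();
}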
Answered by Jon Strayer
You are writing out files that are 3/4 of a gig one byte at a time?
Try using a bigger buffer.
int BUFF_SIZE = 1024;
byte[] buffer = new byte[BUFF_SIZE];
File fileMp3 = new File(objDownloadSong.getStrSongFolder() + "/" + strSongIdName);
FileInputStream fis = new FileInputStream(fileMp3);
response.setContentType("audio/mpeg");
response.setHeader("Content-Disposition", "attachment; filename=\"" + strSongName + ".mp3\";");
response.setContentLength((int) fileMp3.length());
OutputStream os = response.getOutputStream();
int byteCount = 0;
try {
    do {
        byteCount = fis.read(buffer);
        if (byteCount == -1)
            break;
        os.write(buffer, 0, byteCount);
        os.flush();
    } while (true);
} catch (Exception excp) {
    downloadComplete = "-1";
    excp.printStackTrace();
} finally {
    os.close();
    fis.close();
}
BTW, before I started to answer this question I started an empty loop that will loop 750,000,000,000 (your large file size) times to see how long that would take. It's still running.
Answered by ZZ Coder
One common problem with downloading large files is buffering the whole file, which can cause a broken pipe if the client is not patient enough. You may also run out of memory on the server when the file is too big.
Try to force chunked encoding by doing this:
response.setContentLength(-1);
This will cause the connector to stream the file chunk by chunk.
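In the context of the question's code, that just means not promising an exact length up front. A sketch, with the header values taken from the question (whether the connector actually chunks also depends on the client speaking HTTP/1.1):
response.setContentType("audio/mpeg");
response.setHeader("Content-Disposition", "attachment; filename=\"" + strSongName + ".mp3\";");
// No setContentLength((int) fileMp3.length()) call: with the length unknown
// (or set to -1), Tomcat can switch to Transfer-Encoding: chunked and stream
// the bytes as they are written instead of holding on to the whole response.
response.setContentLength(-1);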

