How to upload a file with ajax in small chunks and check for fails, re-upload the parts that failed

Disclaimer: this page is a translation of a popular StackOverflow question, provided under the CC BY-SA 4.0 license. You are free to use/share it, but you must do so under the same CC BY-SA license and attribute it to the original authors (not me): StackOverflow
Original question: http://stackoverflow.com/questions/33537769/
Asked by Relm
I have a file uploaded by a user, and I'd like to achieve the following.
- Divide the file into smaller chunks of about a megabyte each.
- Upload each chunk, and wait for it to finish before starting to upload the next chunk.
- Get a success or failure report for every chunk.
- Re-upload the chunks that failed.
- Report progress as a percentage.
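The chunk boundaries themselves are just arithmetic over the file size. As a sketch (the helper name is illustrative, not from the post), the [start, end) byte ranges that `file.slice()` would receive can be precomputed like this:

```javascript
// Compute the [start, end) byte ranges for splitting a file of
// totalSize bytes into chunks of at most chunkSize bytes each.
function computeChunks(totalSize, chunkSize) {
  const ranges = [];
  for (let start = 0; start < totalSize; start += chunkSize) {
    // The last chunk is clamped to the end of the file.
    ranges.push([start, Math.min(start + chunkSize, totalSize)]);
  }
  return ranges;
}

// Example: a 2.5 MiB file split into 1 MiB chunks yields three ranges,
// the last one shorter than the rest.
// computeChunks(2621440, 1048576)
//   -> [[0, 1048576], [1048576, 2097152], [2097152, 2621440]]
```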
Here's some rough JavaScript. I'm literally lost; I got some code online and tried modifying it.
$.chunky = function(file, name){
    var loaded = 0;
    var step = 1048576; // 1024 * 1024
    var total = file.size;
    var start = 0;
    var reader = new FileReader();

    reader.onload = function(e){
        var d = {file: reader.result};
        $.ajax({
            url: "../record/c/index.php",
            type: "POST",
            data: d
        }).done(function(r){
            $('.record_reply_g').html(r);
            loaded += step;
            $('.upload_rpogress').html((loaded / total) * 100);
            if(loaded <= total){
                blob = file.slice(loaded, loaded + step);
                reader.readAsBinaryString(blob);
            } else {
                loaded = total;
            }
        })
    };

    var blob = file.slice(start, step);
    reader.readAsBinaryString(blob);
}
How can I achieve the above? Please explain what's happening if there's a viable solution.
Answered by afzalex

You are not doing anything about the failure of a chunk upload.
$.chunky = function(file, name){
    var loaded = 0;
    var step = 1048576; // 1024 * 1024, the size of one chunk
    var total = file.size; // total size of the file
    var start = 0; // starting position
    var reader = new FileReader();
    var blob = file.slice(start, step); // a single chunk of step size, from the start of the file
    reader.readAsBinaryString(blob); // read that chunk; when it has been read, onload is invoked

    reader.onload = function(e){
        var d = {file: reader.result};
        $.ajax({
            url: "../record/c/index.php",
            type: "POST",
            data: d // d is the chunk got by readAsBinaryString(...)
        }).done(function(r){ // if 'd' was uploaded successfully ->
            $('.record_reply_g').html(r); // update the status in the HTML view
            loaded += step; // increase loaded, which is used as the start position of the next chunk
            $('.upload_rpogress').html((loaded / total) * 100);
            if(loaded <= total){ // if the file is not yet completely uploaded
                blob = file.slice(loaded, loaded + step); // get the next chunk
                reader.readAsBinaryString(blob); // read it through the FileReader, which calls onload again, so this recurses until the file is completely uploaded
            } else { // if the file is uploaded completely
                loaded = total; // clamp loaded, which could be used to show status
            }
        })
    };
}
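The endpoint in the post (../record/c/index.php) is not shown, but whatever it is, its core job is to append each received chunk, in order, to the data collected so far. A minimal sketch of that reassembly step, modeled as a pure helper over an in-memory store (the names here are illustrative, not from the answer):

```javascript
// Append one received chunk to the data accumulated so far for a file.
// In a real endpoint the store would be the filesystem or a database;
// here it is a plain object keyed by file name.
function appendChunk(store, fileName, chunkData) {
  store[fileName] = (store[fileName] || '') + chunkData;
  return store;
}

// Example: three chunks arriving in order rebuild the original data.
const store = {};
appendChunk(store, 'upload.bin', 'he');
appendChunk(store, 'upload.bin', 'll');
appendChunk(store, 'upload.bin', 'o');
// store['upload.bin'] is now 'hello'
```

Because the client waits for each chunk's .done() before sending the next, the server can rely on chunks arriving in order.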
EDIT
To upload a failed chunk again, you can do the following:
var totalFailures = 0;

reader.onload = function(e) {
    ....
    }).done(function(r){
        totalFailures = 0;
        ....
    }).fail(function(r){ // if the upload failed
        if((totalFailures++) < 3) { // try at least 3 times to upload the chunk, even on failure
            reader.readAsBinaryString(blob);
        } else { // if the chunk upload has failed a 4th time
            // show a message to the user that the file upload failed
        }
    });
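The retry counter above can also be factored into a generic promise-based helper. This is only a sketch with illustrative names, not part of the original answer; the transport is left as a function argument so $.ajax, fetch, or anything else can be plugged in:

```javascript
// Call an async upload function, retrying up to maxAttempts times
// before giving up and rethrowing the last error.
async function uploadWithRetry(uploadChunk, maxAttempts = 3) {
  let lastError;
  for (let attempt = 1; attempt <= maxAttempts; attempt++) {
    try {
      return await uploadChunk(attempt); // resolves on success
    } catch (err) {
      lastError = err; // remember the failure and loop to retry
    }
  }
  throw lastError; // every attempt failed
}
```

With $.ajax, the .fail() branch would reject the wrapping promise, which makes the loop retry; on success the counter effectively resets because each chunk gets its own call.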
Answered by Vince Busam
I've modified afzalex's answer to use readAsArrayBuffer(), and to upload the chunk as a file.
var loaded = 0;
var reader = new FileReader();
var blob = file.slice(loaded, max_chunk_size);
reader.readAsArrayBuffer(blob);

reader.onload = function(e) {
    var fd = new FormData();
    fd.append('filedata', new File([reader.result], 'filechunk'));
    fd.append('loaded', loaded);
    $.ajax(url, {
        type: "POST",
        contentType: false,
        data: fd,
        processData: false
    }).done(function(r) {
        loaded += max_chunk_size;
        if (loaded < file.size) {
            blob = file.slice(loaded, loaded + max_chunk_size);
            reader.readAsArrayBuffer(blob);
        }
    });
};
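Both answers chain the next read off the previous chunk's .done() callback. The same sequencing can be written transport-agnostically with async/await; this is only a sketch, where uploadInChunks and sendChunk are illustrative names and the actual transport ($.ajax, fetch, ...) is supplied by the caller:

```javascript
// Upload a Blob/File sequentially in chunks of chunkSize bytes.
// sendChunk(chunk, index) must return a promise that resolves when the
// chunk has been uploaded; onProgress (optional) receives a percentage.
async function uploadInChunks(blob, chunkSize, sendChunk, onProgress) {
  let loaded = 0;
  let index = 0;
  while (loaded < blob.size) {
    const chunk = blob.slice(loaded, loaded + chunkSize);
    await sendChunk(chunk, index++); // wait before starting the next chunk
    loaded += chunk.size;            // chunk.size handles the short last chunk
    if (onProgress) onProgress(Math.round((loaded / blob.size) * 100));
  }
}
```

Combined with a retry wrapper around sendChunk, this covers all five requirements from the question: chunking, sequential upload, per-chunk success/failure, re-upload, and percentage progress.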