So I am working on an uploader for one of our websites, and one of the things I am trying to achieve is a way for our clients to upload thousands of images via the website, instead of using an FTP/SFTP solution.
One of the issues I am running into is upload speeds, so here is the current user flow:
The client clicks the 'Add Images' button and selects the images they wish to upload.
There is a `@change` handler set on the input, which processes the images for Vue by taking the data from the `event.target.files` array and adding it to the Vue data so that we can display the content.

There is also a tick loop running that checks the first 10 images and preloads them, so the client can see the first 10 image previews; any more would kill the browser's memory. As files get uploaded and removed from the array, the previews update, so the first 10 preview images are always displayed.
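For context, the preview window described above is roughly equivalent to this sketch (`refreshPreviews` and `PREVIEW_LIMIT` are illustrative names, not the actual code; it assumes `files` is a plain array of `File`/`Blob` objects and uses object URLs so memory can be reclaimed as files leave the window):

```javascript
const PREVIEW_LIMIT = 10;

// Given the current file array and the existing Map of file -> object
// URL, return a new Map holding preview URLs for only the first
// PREVIEW_LIMIT files, revoking URLs that are no longer needed.
function refreshPreviews(files, previews) {
  const next = new Map();
  for (const file of files.slice(0, PREVIEW_LIMIT)) {
    // Reuse an existing object URL if we already made one
    next.set(file, previews.get(file) || URL.createObjectURL(file));
  }
  // Revoke URLs for files that left the preview window or the list,
  // so the browser can free the underlying memory
  for (const [file, url] of previews) {
    if (!next.has(file)) URL.revokeObjectURL(url);
  }
  return next;
}
```

Calling this once per tick (or whenever the array changes) keeps exactly the first 10 previews alive.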
Once they are happy with this, our client clicks 'Upload Files', which starts the upload. This is also part of the tick loop: it checks whether anything is currently uploading and, if not, starts on the first file in the array.
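The loop above keeps exactly one upload in flight at a time; one obvious variation would be to keep a few going at once. A minimal sketch of such a concurrency pool (`runWithConcurrency` is an illustrative name; each task would be a function that wraps one upload request in a Promise):

```javascript
// Run async task factories with at most `limit` in flight at once.
// `tasks` is an array of functions, each returning a Promise
// (e.g. one file upload). Results keep the original order.
async function runWithConcurrency(tasks, limit) {
  const results = new Array(tasks.length);
  let next = 0;
  async function worker() {
    // Each worker repeatedly claims the next unstarted task
    while (next < tasks.length) {
      const i = next++;
      results[i] = await tasks[i]();
    }
  }
  const workers = [];
  for (let n = 0; n < Math.min(limit, tasks.length); n++) {
    workers.push(worker());
  }
  await Promise.all(workers);
  return results;
}
```

With thousands of files, a small limit (3–6) usually saturates the connection without overwhelming the browser or server.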
It then sets the status of the file to uploading, so it shows in the UI, creates an `XMLHttpRequest()`, sets the URL, and creates a new `FormData` object, appending the image handle (the `File` object) and any other data that needs to be sent.

I set the request to use POST, and set an `onreadystatechange` handler so that I can catch when it finishes. This basically sets the file state to uploaded and removes it from the array, unless there is an issue, in which case it moves the file to the issues array.

The request is then sent to the server, which receives the file in the `$_FILES` variable, resizes the image three times for the various sizes, saves them to the correct place, and returns a success or failure message.
The main problem stems from the upload itself. The resize code is fairly quick, around 200-500ms, so the issue doesn't come from there.

Our internet connection is around 4MB per second; using FTP, a 4MB file takes around 300-400ms, but from the browser it takes about 2.2s, and I am not sure why.

I understand that FTP/SFTP is a direct upload, using chunks (I think), whereas we are making many Ajax requests, which in itself explains some of the slowdown, but is there no way to make this upload quicker at all?
Another thing to note is that this is all running within Joomla.
I am using the below code (amended for me to post):
```javascript
// Create new request
var http = new XMLHttpRequest();

// Set URL
var url = 'POST_API_URL';

// Create form data object
var params = new FormData();
params.append('name', this.input.files[i].name);
params.append('size', this.input.files[i].size);
params.append('type', this.input.files[i].type);

// Append file to form data object
params.append('images[]', this.input.files[i].handle, this.input.files[i].name);

// Open post request
http.open("POST", url);

// On return
http.onreadystatechange = function() {
    // Check http codes
    if (http.readyState == 4 && http.status == 200) {
        // Write data to page
        window.vm.$data.app.response = http.responseText;
        // Get response array
        var response = JSON.parse(http.responseText);
        // Check response status
        if (response.status) {
            console.log('Completed');
        } else {
            console.log('Failed');
        }
    }
};

// Send request
http.send(params);
```
The PHP code to receive the file is here:
```php
// Main upload function
public function save()
{
    // Wrap everything in a try/catch
    try {
        // Import system libraries
        jimport('joomla.filesystem.file');
        jimport('joomla.filesystem.folder');

        // Include PHP image resizing library
        require_once JPATH_ROOT . '/lib/php-image-resize/lib/ImageResize.php';

        // Decode request data from client
        $request = (object) $_POST;

        // Define the different sizes we need
        $sizes = array('small' => '200', 'medium' => '320', 'xlarge' => '2000');

        // Define auction number
        $auction = $request->auction;

        // Set path of the uploaded temporary file
        $path = $_FILES['images']['tmp_name'][0];

        // Create image object
        $image = new \Eventviva\ImageResize($path);

        // Loop the sizes so we can generate an image for each size
        foreach ($sizes as $key => $size) {
            // Resize image
            $image->resizeToWidth($size);

            // Set folder path
            $folder = JPATH_ROOT . '/catalogue_images/' . $auction . '/' . $key . '/';

            // Check if folder exists; if not, create it
            if (!JFolder::exists($folder)) {
                JFolder::create($folder);
            }

            // Set file path
            $filepath = $folder . $request->name;

            // Save resized file
            $image->save($filepath);
        }

        // Return to the client
        echo json_encode(array('status' => true));
    } catch (Exception $e) {
        // Return error, with message
        echo json_encode(array('status' => false, 'message' => $e->getMessage()));
    }
}
```
I am open to any ideas on how we can use chunked uploads, or anything else, but do keep in mind that our clients can upload anywhere from 20 up to 5000 images, and some clients upload around 4000-5000 quite often. So it needs to be robust enough to support this.
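For reference, the client side of a chunked upload would start by slicing each file into fixed-size pieces. A minimal sketch (`sliceIntoChunks` is a hypothetical helper; each chunk would then be POSTed with its index and the total count, and reassembled on the server before resizing):

```javascript
// Split a File/Blob into fixed-size chunks using Blob.slice, which
// creates views onto the original data without copying it.
function sliceIntoChunks(file, chunkSize) {
  const chunks = [];
  for (let offset = 0; offset < file.size; offset += chunkSize) {
    chunks.push(file.slice(offset, offset + chunkSize));
  }
  return chunks;
}
```

Note that for many small-to-medium images, chunking mainly buys resumability; parallelising whole-file uploads tends to help throughput more.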
My last test was:
- Time taken: 51 minutes, and 15 seconds
- Files: 1000 images (jpg)
- Sizes: 1.5MB and 6.5MB
Also noticed that it gets slower as time progresses, adding maybe an extra 500ms to 1s at most on top of the 2.2s upload time.
Thanks in advance.