A while back I wrote a post where I explained how to implement the new XMLHttpRequest2 object. The main point of the post was to use sendAsBinary() so we can stream file uploads from the client to the server.
Some testing revealed that the server needs at least an amount of memory equal to the file size. For small files this is no problem, but for bigger files it is. Although I couldn't reproduce Jean-Pierre's results, I wasn't very happy with my own test results either:
Uploading a 2.8 MB file results in 3.1 MB memory usage
Uploading a 29 MB file results in 30 MB memory usage
A bit more testing revealed that file_put_contents() was the culprit. Which seems logical if you think about it: it reads the whole file into memory and then dumps it out again. Not very elegant for big files. Besides, we are trying to stream files, so why should we read them completely into memory? We shouldn't :)
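To illustrate the problem, this is roughly what the memory-hungry version looks like (a sketch; $targetPath is a placeholder for wherever the upload should land):

```php
<?php
// file_get_contents() buffers the ENTIRE request body in memory
// before file_put_contents() writes it out, so peak memory usage
// grows linearly with the size of the uploaded file.
$targetPath = sys_get_temp_dir() . '/upload.bin'; // illustrative path
$data = file_get_contents('php://input');
file_put_contents($targetPath, $data);
```

With a 29 MB upload, $data alone occupies 29 MB before a single byte hits disk.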
Jean-Pierre decided to go with the FormData object to solve the issue. I still want to look into that, but I haven't found any information about it yet. Besides, I knew the problem was on the server side, so I rewrote the receive() method to be less memory intensive. The results are considerably better than before:
Uploading a 2.8 MB file results in 0.4 MB memory usage
Uploading a 29 MB file results in 0.4 MB memory usage
The memory usage was measured with memory_get_peak_usage(), and the new code is posted below:
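The original code was not preserved here, so this is a sketch of the streaming approach the post describes (the function and path names are illustrative): instead of buffering the request body, copy it to disk in small chunks so peak memory stays constant regardless of upload size.

```php
<?php
// Streamed receive(): copy the raw request body to the target file
// in 8 KB chunks, so memory usage no longer depends on file size.
function receive($targetPath)
{
    // Open the request body and the target file as streams.
    $input  = fopen('php://input', 'rb');
    $output = fopen($targetPath, 'wb');

    // Copy chunk by chunk; only one 8 KB buffer lives in memory.
    while (!feof($input)) {
        fwrite($output, fread($input, 8192));
    }

    fclose($input);
    fclose($output);
}
```

PHP also ships stream_copy_to_stream(), which does the same chunked copy in one call and could replace the while loop entirely.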