Times out returning very large artifact
Attempting to start a build of nvidia-cuda-toolkit (currently ~6GiB) causes the Debusine server to hang (see #457) until the request eventually times out with a 504, e.g. https://debusine.debian.net/artifact/554528/download/?archive=tar.gz.
That error-fails the task (e.g. https://debusine.debian.net/work-request/12028/).
This is because the tar.gz endpoint builds the entire archive in memory, due to a Django bug.
We could work around that Django bug with some monkey-patching etc. But for any artifact of non-negligible size, downloading the files separately is probably more efficient for both client and server (not to mention retryable). How about we:
- Implement an archive-contents size limit for the tar.gz download endpoint (of, say, 100 MiB); a rough sketch of that check is below.
- Have the debusine client download artifact files separately if any of them is bigger than 20 MiB or the total is bigger than the size limit above; a sketch of that decision logic is also below.
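
For the server side, a minimal sketch of the kind of pre-check the tar.gz view could run before assembling the archive. The `ARCHIVE_SIZE_LIMIT` constant and `check_archive_size()` helper are hypothetical, not existing Debusine code; the real view would get the per-file sizes from the artifact's file metadata.

```python
from django.http import JsonResponse

# Suggested limit from the proposal above (name and value are placeholders).
ARCHIVE_SIZE_LIMIT = 100 * 1024 * 1024  # 100 MiB


def check_archive_size(file_sizes):
    """Return a 413 response if the archive would exceed the limit, else None.

    ``file_sizes`` is assumed to be an iterable of per-file sizes in bytes.
    """
    total = sum(file_sizes)
    if total > ARCHIVE_SIZE_LIMIT:
        return JsonResponse(
            {
                "title": "Artifact too large to download as a single archive",
                "detail": (
                    f"Total size {total} bytes exceeds the "
                    f"{ARCHIVE_SIZE_LIMIT} byte archive limit; "
                    "download the files individually instead."
                ),
            },
            status=413,  # Request Entity Too Large
        )
    return None
```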
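
And on the client side, roughly the decision logic proposed in the second bullet. The thresholds and the helper name are illustrative only, not existing debusine client API.

```python
# Hypothetical thresholds matching the proposal above.
SINGLE_FILE_THRESHOLD = 20 * 1024 * 1024   # 20 MiB
TOTAL_SIZE_THRESHOLD = 100 * 1024 * 1024   # 100 MiB (the server's archive limit)


def should_download_files_separately(file_sizes: list[int]) -> bool:
    """Decide whether to fetch files one by one instead of as one tar.gz.

    ``file_sizes`` is the list of file sizes (in bytes) reported for the
    artifact by the server.
    """
    return (
        any(size > SINGLE_FILE_THRESHOLD for size in file_sizes)
        or sum(file_sizes) > TOTAL_SIZE_THRESHOLD
    )
```

Downloading separately also means a failed transfer only has to retry the one file that broke, rather than the whole multi-GiB archive.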