
Chrome memory issue - File API + AngularJS


I can't see any obvious memory leaks or things I can change to help garbage collection. I store the block IDs in an array, so obviously there will be some memory creep, but this shouldn't be massive. It's almost as if the File API is holding the whole file it slices in memory.

You are correct. The new Blobs created by .slice() are being held in memory.

The solution is to call Blob.prototype.close() on each Blob reference once processing of that Blob or File object is complete.

Note also that the JavaScript at the Question creates a new instance of FileReader each time the upload function is called.
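The slice-then-release pattern can be sketched as follows. Note that Blob.prototype.close() comes from the File API draft quoted below and is not implemented by current engines, so the sketch feature-detects it rather than calling it unconditionally; the chunk size is an arbitrary choice, not from the Question.

```javascript
// 1 MiB per slice; an arbitrary choice for illustration
const CHUNK_SIZE = 1024 * 1024;

// Yield consecutive .slice() views over a Blob or File.
function* sliceBlob(blob, chunkSize = CHUNK_SIZE) {
  for (let offset = 0; offset < blob.size; offset += chunkSize) {
    yield blob.slice(offset, offset + chunkSize);
  }
}

// Release a processed chunk: call close() where implemented,
// otherwise just drop the reference and let GC reclaim it.
function releaseChunk(chunk) {
  if (typeof chunk.close === "function") {
    chunk.close();
  }
}
```

After uploading each slice, call releaseChunk(slice) and do not keep the slice in any long-lived array, so that only the block IDs remain in memory.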

4.3.1. The slice method

The slice() method returns a new Blob object with bytes ranging from the optional start parameter up to but not including the optional end parameter, and with a type attribute that is the value of the optional contentType parameter.

Blob instances exist for the life of the document, though a Blob should be eligible for garbage collection once removed from the Blob URL Store:

9.6. Lifetime of Blob URLs

Note: User agents are free to garbage collect resources removed from the Blob URL Store.

Each Blob must have an internal snapshot state, which must be initially set to the state of the underlying storage, if any such underlying storage exists, and must be preserved through StructuredClone. Further normative definition of snapshot state can be found for Files.

4.3.2. The close method

The close() method is said to close a Blob, and must act as follows:

  1. If the readability state of the context object is CLOSED, terminate this algorithm.
  2. Otherwise, set the readability state of the context object to CLOSED.
  3. If the context object has an entry in the Blob URL Store, remove the entry that corresponds to the context object.

If the Blob object has been passed to URL.createObjectURL(), first call URL.revokeObjectURL() with the Blob URL, then call .close() on the Blob or File object.
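That teardown order can be sketched as a small helper; again, close() is feature-detected since it is not shipped by current engines:

```javascript
// Revoke the Blob URL first (removing the Blob URL Store entry),
// then close the Blob itself where close() is implemented.
function disposeBlob(blob, url) {
  if (url) {
    URL.revokeObjectURL(url);
  }
  if (typeof blob.close === "function") {
    blob.close();
  }
}
```

Calling revokeObjectURL() before close() matters: revoking a URL whose Blob already has readability state CLOSED does nothing, per the spec text below.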

The revokeObjectURL(url) static method

Revokes the Blob URL provided in the string url by removing the corresponding entry from the Blob URL Store. This method must act as follows:

  1. If the url refers to a Blob that has a readability state of CLOSED, OR if the value provided for the url argument is not a Blob URL, OR if the value provided for the url argument does not have an entry in the Blob URL Store, this method call does nothing. User agents may display a message on the error console.
  2. Otherwise, user agents must remove the entry from the Blob URL Store for url.

You can view the result of these calls by opening

chrome://blob-internals 

and reviewing the details of the entries before and after the calls which create and close a Blob.

For example, from

xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx
Refcount: 1
Content Type: text/plain
Type: data
Length: 3

to

xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx
Refcount: 1
Content Type: text/plain

following the call to .close(). Similarly, from

blob:http://example.com/c2823f75-de26-46f9-a4e5-95f57b8230bd
Uuid: 29e430a6-f093-40c2-bc70-2b6838a713bc

An alternative approach could be to send the file as an ArrayBuffer, or in chunks of ArrayBuffers, then reassemble the file at the server.
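The server-side reassembly can be sketched as follows. The transport and chunk bookkeeping are assumptions, since the Question does not specify a server stack; the sketch only shows concatenating the received chunks, assuming they arrive in order as Uint8Arrays:

```javascript
// Concatenate ordered Uint8Array chunks back into a single buffer.
function reassemble(chunks) {
  const total = chunks.reduce((n, c) => n + c.byteLength, 0);
  const out = new Uint8Array(total);
  let offset = 0;
  for (const chunk of chunks) {
    out.set(chunk, offset);
    offset += chunk.byteLength;
  }
  return out;
}
```

The server would buffer each received chunk keyed by its sequence number, then call reassemble() once all chunks have arrived.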

Or you can call the FileReader constructor, FileReader.prototype.readAsArrayBuffer(), and the load event of FileReader each only once.

At the load event of FileReader, pass the ArrayBuffer to a Uint8Array, then use a ReadableStream with TypedArray.prototype.subarray(), .getReader(), and .read() to pull N chunks of the ArrayBuffer as TypedArrays from the Uint8Array. When N chunks equaling the .byteLength of the ArrayBuffer have been processed, pass the array of Uint8Arrays to the Blob constructor to recombine the file parts into a single file at the browser; then send the Blob to the server.

<!DOCTYPE html>
<html>
<head></head>
<body>
  <input id="file" type="file">
  <br>
  <progress value="0"></progress>
  <br>
  <output for="file"><img alt="preview"></output>
  <script type="text/javascript">
    const [input, output, img, progress, fr, handleError, CHUNK] = [
      document.querySelector("input[type='file']"),
      document.querySelector("output[for='file']"),
      document.querySelector("output img"),
      document.querySelector("progress"),
      new FileReader,
      (err) => console.log(err),
      1024 * 1024 // 1 MiB chunk size
    ];

    // Custom "progress" event carries the running byte count and a
    // resolver that lets the read loop continue after the UI updates.
    progress.addEventListener("progress", e => {
      progress.value = e.detail.value;
      e.detail.promise();
    });

    let [chunks, NEXT, CURR, url, blob] = [Array(), 0, 0];

    input.onchange = () => {
      NEXT = CURR = progress.value = progress.max = chunks.length = 0;
      // Release resources held for the previous file, if any.
      if (url) {
        URL.revokeObjectURL(url);
        if (blob.hasOwnProperty("close")) {
          blob.close();
        }
      }
      if (input.files.length) {
        console.log(input.files[0]);
        progress.max = input.files[0].size;
        fr.readAsArrayBuffer(input.files[0]);
      }
    };

    fr.onload = () => {
      const VIEW = new Uint8Array(fr.result);
      const LEN = VIEW.byteLength;
      const {type, name: filename} = input.files[0];
      // Enqueue CHUNK-sized views of the ArrayBuffer on demand.
      const stream = new ReadableStream({
        pull(controller) {
          if (NEXT < LEN) {
            controller.enqueue(VIEW.subarray(NEXT, NEXT + CHUNK));
            NEXT += CHUNK;
          } else {
            controller.close();
          }
        },
        cancel(reason) {
          console.log(reason);
          throw new Error(reason);
        }
      });
      const [reader, processData] = [
        stream.getReader(),
        ({value, done}) => {
          if (done) {
            return reader.closed.then(() => chunks);
          }
          chunks.push(value);
          // Wait for the progress handler before reading the next chunk.
          return new Promise(resolve => {
            progress.dispatchEvent(
              new CustomEvent("progress", {
                detail: {
                  value: CURR += value.byteLength,
                  promise: resolve
                }
              })
            );
          })
          .then(() => reader.read().then(data => processData(data)))
          .catch(e => reader.cancel(e));
        }
      ];
      reader.read()
      .then(data => processData(data))
      .then(data => {
        // Recombine the processed chunks into a single Blob.
        blob = new Blob(data, {type});
        console.log("complete", data, blob);
        if (/image/.test(type)) {
          url = URL.createObjectURL(blob);
          img.onload = () => {
            img.title = filename;
            input.value = "";
          };
          img.src = url;
        } else {
          input.value = "";
        }
      })
      .catch(e => handleError(e));
    };
  </script>
</body>
</html>

plnkr http://plnkr.co/edit/AEZ7iQce4QaJOKut71jk?p=preview


You can also utilize fetch():

fetch(new Request("/path/to/server/", {method:"PUT", body:blob}))

To transmit body for a request request, run these steps:

  1. Let body be request’s body.
  2. If body is null, then queue a fetch task on request to process request end-of-body for request and abort these steps.

  3. Let read be the result of reading a chunk from body’s stream.

    • When read is fulfilled with an object whose done property is false and whose value property is a Uint8Array object, run these substeps:

      1. Let bytes be the byte sequence represented by the Uint8Array object.
      2. Transmit bytes.

      3. Increase body’s transmitted bytes by bytes’s length.

      4. Run the above step again.

    • When read is fulfilled with an object whose done property is true, queue a fetch task on request to process request end-of-body for request.

    • When read is fulfilled with a value that matches with neither of the above patterns, or read is rejected, terminate the ongoing fetch with reason fatal.
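The read-until-done steps above have the same shape as consuming any ReadableStream by hand; a minimal sketch, assuming a generic stream of Uint8Array chunks:

```javascript
// Read a ReadableStream chunk by chunk, mirroring the spec steps:
// transmit each Uint8Array, count its bytes, stop at done.
async function drain(stream) {
  const reader = stream.getReader();
  let transmitted = 0;
  for (;;) {
    const {value, done} = await reader.read();
    if (done) break;                  // end-of-body: stop reading
    transmitted += value.byteLength;  // "increase transmitted bytes"
  }
  return transmitted;
}
```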

See also