
JSON disk / memory size ratio


A JSON file's size on disk is simply the number of bytes its text actually occupies.

The RAM usage will vary depending on the language used and the variable types. As the documentation states, the language will allocate memory according to each value's type size.

{"aNumber":1}

Here the number 1 uses 1 byte in the file, but 8 bytes (in Java, as a long or double) once allocated in memory.

To simplify, you can estimate that RAM usage will be roughly the file size multiplied by the size of the largest type. So yes, file size times eight is a reasonable estimate.
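
As a rough illustration, here is a small Java sketch; the class name and printed numbers are my own, and real in-memory sizes depend on the parser and JVM:

    import java.nio.charset.StandardCharsets;

    // Rough illustration of the disk-vs-RAM gap for {"aNumber":1}.
    // The sizes below are ballpark assumptions for a 64-bit JVM, not exact measurements.
    public class SizeEstimate {
        public static void main(String[] args) {
            String json = "{\"aNumber\":1}";

            // Bytes on disk: just the UTF-8 length of the text.
            int diskBytes = json.getBytes(StandardCharsets.UTF_8).length; // 13

            // In memory, the single digit "1" typically becomes an 8-byte long or double.
            int valueBytesOnDisk = "1".length();   // 1 byte in the file
            int valueBytesInRam  = Long.BYTES;     // 8 bytes as a Java long

            // Rule-of-thumb estimate: file size times the largest primitive size.
            int estimatedRam = diskBytes * Long.BYTES;

            System.out.println("File size:     " + diskBytes + " bytes");
            System.out.println("Value on disk: " + valueBytesOnDisk + " byte");
            System.out.println("Value in RAM:  " + valueBytesInRam + " bytes");
            System.out.println("Estimated RAM: ~" + estimatedRam + " bytes");
        }
    }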

Edit:

Also, remember that field names (strings) take up space in the file, and in RAM the corresponding string objects and pointers will use even more than that.
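
For example, a key like "aNumber" is only 7 bytes in the file, but once parsed it becomes a Java String with its own object header and backing array. The exact overhead depends on the JVM (use a tool like JOL for real figures); the number below is just an assumed ballpark:

    import java.nio.charset.StandardCharsets;

    // Back-of-the-envelope numbers for a 64-bit HotSpot JVM; not exact.
    public class KeyOverhead {
        public static void main(String[] args) {
            String key = "aNumber";

            int diskBytes = key.getBytes(StandardCharsets.UTF_8).length; // 7 bytes in the file

            // Assumed per-String overhead: object header + fields + backing array header.
            int assumedStringOverhead = 40;
            int ramEstimate = diskBytes + assumedStringOverhead;

            System.out.println("Key on disk: " + diskBytes + " bytes");
            System.out.println("Key in RAM:  ~" + ramEstimate + " bytes (rough estimate)");
        }
    }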

Edit 2:

"And if so, how would you suggest I could reduce memory load?"

We would need more information about your system and requirements, but roughly: the more you segregate the data, the better.

I have a system with 3200 sensors and an expected 7000 in total. For my system, each "file output" holds a hundred records, and each record has up to 10 fields. That usually means a ~30 KB file that uses up to 300 KB of RAM; a distributed service can then read one file, work on it, and re-output it wherever needed.
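
A sketch of that chunking step, assuming Jackson is on the classpath; the chunk size, record layout, and file names are placeholders:

    import com.fasterxml.jackson.databind.ObjectMapper;

    import java.io.File;
    import java.util.ArrayList;
    import java.util.List;
    import java.util.Map;

    // Split a large list of sensor readings into files of ~100 records each,
    // so every output file stays around a few tens of KB.
    public class ChunkWriter {
        private static final int RECORDS_PER_FILE = 100; // assumed chunk size

        public static void writeChunks(List<Map<String, Object>> readings, File outDir) throws Exception {
            ObjectMapper mapper = new ObjectMapper();
            int fileIndex = 0;
            for (int start = 0; start < readings.size(); start += RECORDS_PER_FILE) {
                int end = Math.min(start + RECORDS_PER_FILE, readings.size());
                List<Map<String, Object>> chunk = new ArrayList<>(readings.subList(start, end));
                mapper.writeValue(new File(outDir, "sensors-" + fileIndex++ + ".json"), chunk);
            }
        }
    }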

If you can, use multiple files with a small amount of data each, and once one has been processed, move on to the next one.
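
A minimal sketch of that one-file-at-a-time loop, again assuming Jackson and using hypothetical directory names:

    import com.fasterxml.jackson.databind.JsonNode;
    import com.fasterxml.jackson.databind.ObjectMapper;

    import java.io.File;

    // Process one small JSON file at a time, so only ~100 records are in memory at once.
    public class ChunkProcessor {
        public static void main(String[] args) throws Exception {
            ObjectMapper mapper = new ObjectMapper();
            File inDir = new File("chunks");      // assumed input directory
            File outDir = new File("processed");  // assumed output directory
            outDir.mkdirs();

            for (File file : inDir.listFiles((dir, name) -> name.endsWith(".json"))) {
                JsonNode records = mapper.readTree(file);   // load just this chunk
                // ... work on the ~100 records here ...
                mapper.writeValue(new File(outDir, file.getName()), records); // re-output
                // after this iteration the chunk becomes garbage-collectable
            }
        }
    }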