Hi Danny, it's hard to know the RAM consumption in advance, as it depends on many factors, some related to how you build the data model and its relationships.
This app can give you an approximation: RAM Calculation QlikView App
What does "memory usage" mean here? Does it include memory for users (each user adds roughly 10% to the memory footprint)? Does it include the cache and QVS overhead? Does it include the growth caused by reloading ever-growing source tables?
Calculating memory requirements is largely a matter of estimates (see the document Ruben suggested), experience, and continuous run-time measurements. It's not an exact science, unfortunately.
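To make those estimates concrete, here is a minimal sizing sketch. It assumes the common rule of thumb mentioned in this thread (each concurrent user adds roughly 10% of the app's base footprint) plus some headroom for the QVS cache; the factors and numbers are illustrative assumptions, not measured values.

```python
# Rough RAM sizing sketch for one QlikView app.
# Assumptions (not measured): each concurrent user adds ~10% of the
# base footprint, and we reserve 50% extra headroom for the QVS cache.

def estimate_ram_mb(base_footprint_mb, concurrent_users,
                    per_user_factor=0.10, cache_headroom_factor=0.5):
    """Return a rough total RAM estimate in MB."""
    user_overhead = base_footprint_mb * per_user_factor * concurrent_users
    cache_headroom = base_footprint_mb * cache_headroom_factor
    return base_footprint_mb + user_overhead + cache_headroom

# Example: a 230 MB app with 10 concurrent users
print(estimate_ram_mb(230, 10))  # 230 + 230 + 115 = 575.0
```

Treat the result as a planning floor, then validate it with run-time measurements on the actual server.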
RAM usage is variable, I think, because it depends on the server: for example, whether the server is virtual, which Windows processes are running, or whether other tools are running on the same server.
Performance can also change depending on your BIOS settings; I'm sharing some information on this topic.
Check the app I shared, it may be useful for you.
I'm not sure which tool you were using as the "Memory Usage analyzer app", but it could be that it was calculating space for the data only, and not the sheet objects.
Qlikview Cookbook: QV Document Analyzer http://qlikviewcookbook.com/recipes/download-info/document-analyzer/
does a fair job of calculating what it calls the "RAM Footprint" on the Summary sheet, which includes both the data and the sheet objects. It also identifies the per-user increment on the Memory sheet.
Note that as soon as you open a document in QVS, results start getting added to the cache. So just looking at QVS memory in Task Manager is not an accurate measure of the baseline needed for an app. However, I think it's still a useful number, because your app will need some cache when it's actually used -- so using actual QVS memory for planning makes some sense.
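As a rough illustration of that split (the figures below are made up, not from any real server), the cached-results portion can be backed out by subtracting the app's measured RAM footprint from the QVS memory you observe:

```python
# Back out the approximate cache share of observed QVS memory.
# Inputs are illustrative; in practice, take the footprint from
# Document Analyzer and the observed figure from Task Manager.

def cache_estimate_mb(observed_qvs_mb, ram_footprint_mb):
    """Estimate how much observed QVS memory is cached results/overhead."""
    return max(observed_qvs_mb - ram_footprint_mb, 0)

print(cache_estimate_mb(1500, 300))  # 1200
```

It's only an approximation, since QVS overhead and other apps sharing the service also contribute to the observed number.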
Thanks for your answer. I used the Document Analyzer for app number 2 (in my overview above), and the RAM footprint result is 230 MB. This differs a lot from the memory usage on the server (1,400 MB). Is the remaining ~1,100 MB caused by results in the cache, then?
Any idea how to get more accurate memory analysis results?