Author Topic: heap size over available RAM  (Read 3952 times)

mrspock1

  • New Member
  • *
  • Posts: 29
heap size over available RAM
« on: July 30, 2018, 08:49:06 am »
I use a TDictionary that takes over 4 GB of heap for dynamic variables. My computer has only 4 GB of RAM. I assume Windows will put the variables in virtual memory. Do you have any experience with how significant the slowdown in running time will be? Will I need a computer with more RAM? My application may need between 8 GB and 64 GB of RAM for dynamic variables.

Thaddy

  • Hero Member
  • *****
  • Posts: 14373
  • Censorship about opinions does not belong here.
Re: heap size over available RAM
« Reply #1 on: July 30, 2018, 09:02:08 am »
Well, the best solution is to have 64 GB of memory; anything less will be slower.
The default FPC memory manager looks at true available memory, not at swap space, so relying on swap won't work properly.
But you can write a custom memory manager that keeps the heap (partially or wholly) on disk. That will be slower.
If cost is an issue, such variables are better replaced with a (buffered) file stream.
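To illustrate the buffered-file-stream idea: a minimal sketch using FPC's TWriteBufStream (from the bufstream unit) to append fixed-size records to disk instead of keeping them all in RAM. The record type TBoardState and the file name are hypothetical, and the 64 KiB buffer size is an arbitrary choice:

```pascal
program DiskBackedDemo;
{$mode objfpc}

uses
  Classes, SysUtils, bufstream;

type
  TBoardState = packed record
    Key: Int64;   // hypothetical packed board encoding
    Cost: Byte;
  end;

var
  Base: TFileStream;
  Buf: TWriteBufStream;
  S: TBoardState;
  i: Integer;
begin
  Base := TFileStream.Create('states.dat', fmCreate);
  Buf := TWriteBufStream.Create(Base, 64 * 1024);  // 64 KiB write buffer
  try
    for i := 1 to 1000 do
    begin
      S.Key := i;
      S.Cost := i mod 256;
      Buf.WriteBuffer(S, SizeOf(S));  // buffered: few actual disk writes
    end;
  finally
    Buf.Free;   // flushes the remaining buffer
    Base.Free;
  end;
end.
```

Reading back works the same way with TReadBufStream; the point is that only the buffer lives in RAM, not the whole data set.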
Object Pascal programmers should get rid of their "component fetish" especially with the non-visuals.

marcov

  • Administrator
  • Hero Member
  • *
  • Posts: 11452
  • FPC developer.
Re: heap size over available RAM
« Reply #2 on: July 30, 2018, 10:22:45 am »
I use a TDictionary that takes over 4 GB of heap for dynamic variables. My computer has only 4 GB of RAM. I assume Windows will put the variables in virtual memory. Do you have any experience with how significant the slowdown in running time will be? Will I need a computer with more RAM? My application may need between 8 GB and 64 GB of RAM for dynamic variables.

This depends on your access pattern.  It can range from nearly transparent to very, very bad.

AFAIK TDictionary is not the most memory-conserving structure, and hash tables scatter their entries randomly in memory. So even if your accesses follow a pattern (e.g. after you access an item, the next item accessed is typically near it in time, place, or some other factor), the hash will still place them pseudo-randomly in memory.

The more logical approach, then, is to build an index that places items that are near each other in that factor close together in memory. That will improve swapping performance.
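A minimal sketch of that idea, assuming the locality factor can be packed into an Int64 key (TItem, SortByKey, and Find are hypothetical names, not anything from the poster's code): sort a dynamic array by the key once, so neighbouring keys become neighbouring memory pages, then look items up with a binary search instead of a hash:

```pascal
program LocalityIndexDemo;
{$mode objfpc}

uses
  SysUtils;

type
  TItem = record
    Key: Int64;   // hypothetical locality factor, e.g. a packed board encoding
    Value: Byte;
  end;

var
  Items: array of TItem;

{ Insertion sort by Key; fine for a demo, use a quicksort for real data. }
procedure SortByKey;
var
  i, j: Integer;
  Tmp: TItem;
begin
  for i := 1 to High(Items) do
  begin
    Tmp := Items[i];
    j := i - 1;
    while (j >= 0) and (Items[j].Key > Tmp.Key) do
    begin
      Items[j + 1] := Items[j];
      Dec(j);
    end;
    Items[j + 1] := Tmp;
  end;
end;

{ Binary search over the sorted array. Items with nearby keys are now
  adjacent in memory, so pattern-following lookups touch few pages. }
function Find(AKey: Int64): Integer;
var
  Lo, Hi, Mid: Integer;
begin
  Result := -1;
  Lo := 0;
  Hi := High(Items);
  while Lo <= Hi do
  begin
    Mid := (Lo + Hi) div 2;
    if Items[Mid].Key = AKey then
      Exit(Mid)
    else if Items[Mid].Key < AKey then
      Lo := Mid + 1
    else
      Hi := Mid - 1;
  end;
end;

var
  i: Integer;
begin
  SetLength(Items, 100);
  for i := 0 to 99 do
  begin
    Items[i].Key := (97 * i) mod 100;  // arrival order is scattered
    Items[i].Value := i;
  end;
  SortByKey;  // physical order now matches key order
  WriteLn('index of key 42: ', Find(42));
end.
```

The trade-off versus TDictionary: O(log n) lookups instead of O(1), but far better page locality and no per-bucket overhead.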

In general, a rule of thumb is that virtual memory does not work well once (total used memory) / (physical memory) > 2.

Anyway, I think this will require a lot of rethinking, and in most cases it is wiser to just plug in more memory. 64 GB is expensive, but still doable. If you are on a budget, a bunch of old machines with 8 GB each in a cluster configuration might be cheaper (but more work, since you would need to adapt your software for that).



mrspock1

  • New Member
  • *
  • Posts: 29
Re: heap size over available RAM
« Reply #3 on: July 30, 2018, 02:25:16 pm »
I wrote a request to the dean's office of the IT department at our university to give me access to a computer with 64 GB of RAM for an hour. My application only needs to do the job once (solve the peg solitaire board puzzle with graph searching), so that's enough for me. IT is not my department; I study at the philological faculty. They will answer my request in a month, so I will have time to master Lazarus and move my code from 32-bit Delphi to 64-bit to be able to use more than 4 GB of RAM.

Thaddy

  • Hero Member
  • *****
  • Posts: 14373
  • Censorship about opinions does not belong here.
Re: heap size over available RAM
« Reply #4 on: July 30, 2018, 06:28:12 pm »
I answered your (related) question in the other thread you opened: your net data requirement is about 1.4 GB, not more:
(50 * 30,000,000) / 1024^3 ≈ 1.4 GB. Your gross data requirement should be less than ~2 GB using a TDictionary.
That leaves you ~1.5 GB of processing memory, given 4 GB of RAM and allowing for the OS.
As long as you don't copy all the data, that is.
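The estimate is easy to check in a few lines, assuming the figures from the other thread (30 million entries at roughly 50 bytes each; both numbers are inputs, not measurements). The division comes out to about 1.4 GiB:

```pascal
program MemEstimate;
{$mode objfpc}

uses
  SysUtils;

const
  Entries     = 30000000;  // assumed entry count from the other thread
  BytesPerRec = 50;        // assumed bytes per entry

var
  GiB: Double;
begin
  GiB := (Int64(Entries) * BytesPerRec) / (1024.0 * 1024.0 * 1024.0);
  WriteLn(Format('Net data: %.2f GiB', [GiB]));  // prints "Net data: 1.40 GiB"
end.
```

Any per-entry overhead of TDictionary (bucket slots, hash codes, load factor slack) comes on top of this net figure, which is why the gross estimate is closer to 2 GB.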
Object Pascal programmers should get rid of their "component fetish" especially with the non-visuals.
