
Heap/Stack space problems

I'm helping write a large application for TP7.  It will be used on Win95 or WinNT systems with at least 16MB
memory.  The application is a simulation which uses a lot of linked lists and dynamic variables that get stored
on the heap.  We are at the point where if we increase stack space we get heap overflow errors and if we
decrease stack space we get stack overflow errors.  This is from the IDE and occurs around the 33K value for
stack size.  If run from the DOS prompt, I think that frees up about 10K, so that is OK currently.  But it
doesn't leave much room for growth, and this application definitely will be growing.  My question is what can
be done to ease this problem?

In reading the manuals, the only help I have found so far is to change value parameters to constant parameters,
as that will reduce the need to create local variables on the stack.  There may be some very large records
passed as value parameters which are causing the problem and will have to be hunted down.
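
To be concrete, this is the kind of change we are hunting for (the record
here is made up, purely for illustration):

    type
      TBigRec = record                   { hypothetical 500-byte record }
        Data: array[1..250] of Integer;  { 250 * 2 bytes = 500 bytes }
      end;

    { Value parameter: the whole record is copied onto the stack on
      every call. }
    procedure ProcessByValue(R: TBigRec);
    begin
      { ... }
    end;

    { Const parameter: only a 4-byte reference goes on the stack, and
      the compiler rejects any attempt to modify R. }
    procedure ProcessByConst(const R: TBigRec);
    begin
      { ... }
    end;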

I don't know much about overlays, so I don't know if that will help.

There are about 20 units in the program.  One of them is very large.  Will breaking this unit into smaller
units help the stack space problem?

To reduce heap space, some of the linked lists could be changed to arrays and placed in the data segment (which
is at about 30K right now). But that would require massive changes to the code so I don't think it is a
realistic solution for us.

Any help appreciated.

Tom

 

Re:Heap/Stack space problems


Dear Tom,

Quote
On Thu, 05 Dec 1996 09:39:50 -0800, you wrote:
>I'm helping write a large application for TP7.  It will be used on Win95 or WinNT systems with at least 16MB
>memory.  The application is a simulation which uses a lot of linked lists and dynamic variables that get stored
>on the heap.  We are at the point where if we increase stack space we get heap overflow errors and if we
>decrease stack space we get stack overflow errors.  This is from the IDE and occurs around the 33K value for
>stack size.  If run from the DOS prompt, I think that frees up about 10K, so that is OK currently.  But it
>doesn't leave much room for growth, and this application definitely will be growing.  My question is what can
>be done to ease this problem?

There are several ways around this problem, I'll name a few:

- reduce the need for stack and heap space (I know you don't like to hear this
one).
- overlay the units (this will not help you very much though)
- Use EMS or XMS memory to store data temporarily... or use protected mode. This
will give you the 16 Mb of heap space you were talking about. (or are you doing
this already?)
- use a swap file. (this will give you almost any amount of storage space you
would want to have, and slow down things a bit.)

Peter de Jong,
wpdej...@worldonline.nl

Re:Heap/Stack space problems


Dear Peter,

   Thank you for responding.  My comments are interspersed with yours down below.

Tom

Quote
Peter de Jong wrote:

> Dear Tom,

> On Thu, 05 Dec 1996 09:39:50 -0800, you wrote:

> >I'm helping write a large application for TP7.  It will be used on Win95 or WinNT systems with at least 16MB
> >memory.  The application is a simulation which uses a lot of linked lists and dynamic variables that get stored
> >on the heap.  We are at the point where if we increase stack space we get heap overflow errors and if we
> >decrease stack space we get stack overflow errors.  This is from the IDE and occurs around the 33K value for
> >stack size.  If run from the DOS prompt, I think that frees up about 10K, so that is OK currently.  But it
> >doesn't leave much room for growth, and this application definitely will be growing.  My question is what can
> >be done to ease this problem?

> There are several ways around this problem, I'll name a few:

> - reduce the need for stack and heap space (I know you don't like to hear this
> one).

We did find a large record being passed as a value parameter.  We changed it to
constant and this freed up about 10K of stack space.

Quote
> - overlay the units (this will not help you very much though)
> - Use EMS or XMS memory to store data temporarily... or use protected mode. This
> will give you the 16 Mb of heap space you were talking about. (or are you doing
> this already?)

I thought that compiling using TPX.EXE instead of TURBO.EXE automatically meant we
were using protected mode.  I am under the impression that stack/heap space is
limited to the bottom 640K of memory and that only program code actually goes above
640K.  There is no way that our dynamic variables are using up 16MB of memory
unless we are allocating too much heap space or not disposing of old pointers
properly.  As for EMS or XMS memory, I thought Win95 automatically took care of
this.  Or is there a way to tell Turbo to use this area for stack/heap?


Quote
> - use a swap file. (this will give you almost any amount of storage space you
> would want to have, and slow down things a bit.)

> Peter de Jong,
> wpdej...@worldonline.nl

Re:Heap/Stack space problems


Quote
> I'm helping write a large application for TP7.  It will be used on Win95 or WinNT systems with at least 16MB
> memory.  The application is a simulation which uses a lot of linked lists and dynamic variables that get stored
> on the heap.  We are at the point where if we increase stack space we get heap overflow errors and if we
> decrease stack space we get stack overflow errors.  This is from the IDE and occurs around the 33K value for
> stack size.  

   I suggest you look carefully at your dynamic data: the amount of
space allocated for each element of each L/L node (note that every
allocation rounds UP to a mod-8 value); and the size/dimension of other
structures.  Realize, too, that the L/L may allocate more Heap space
than, say, a pointer array implementation of the same type of
information (every L/L element carries a pointer, which may cause the
mod-8 allocation to carry over to the next mod-8 value).  Although it
seems that a L/L is the only way to use a dynamically-defined amount of
data, it's not, since you can allocate a variable pointer array and use
only the number of pointers you need.
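
   A rough sketch of that alternative (names and sizes are invented; the
point is that you allocate only as many pointer slots as you currently
need):

    type
      PElement = ^TElement;
      TElement = record
        Value: array[1..240] of Integer;      { hypothetical 480-byte payload }
      end;
      { Declare the array type with the largest bound you might ever
        need, but allocate only part of it. }
      PElemArray = ^TElemArray;
      TElemArray = array[1..1000] of PElement;

    var
      List : PElemArray;
      Count: Integer;
      I    : Integer;
    begin
      Count := 36;                            { just what's needed right now }
      GetMem(List, Count * SizeOf(PElement)); { 36 * 4 = 144 bytes }
      for I := 1 to Count do
        New(List^[I]);
      { ... work with List^[1]..List^[Count] ... }
      for I := 1 to Count do
        Dispose(List^[I]);
      FreeMem(List, Count * SizeOf(PElement));
    end.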

> If run from the DOS prompt, I think that frees up about 10K, so that is
> OK currently.  But it

Quote
> doesn't leave much room for growth, and this application definitely will be growing.  My question is what can
> be done to ease this problem?

> In reading the manuals, the only help I have found so far is to change value parameters to constant parameters,

   No, you can also pass Var (reference) parameters.  Large structure
parameters _should_ be passed this way, since they require only 4 bytes of
Stack, rather than 4 bytes plus the copied data value.  FTM, do you _have_ to pass
all that data as parameters?

Quote
> as that will reduce the need to create local variables on the stack.  There may be some very large records
> passed as value parameters which are causing the problem and will have to be hunted down.

   Yes, that will definitely cause high Stack usage.  Look into Var
(reference) parameters, if parameters must be used.  You won't be sent
to Hell if you use some global data, you know...8<}}

Quote
> I don't know much about overlays, so I don't know if that will help.

> There are about 20 units in the program.  One of them is very large.  Will breaking this unit into smaller
> units help the stack space problem?

   Nope.  Smart linking will link only what's referenced by the program,
and none of that has bearing on the Stack usage.  However, if you can
redesign some subprograms, by combining functions, etc., you would save
Stack usage by decreasing the depth of Stack utilization.

Quote
> To reduce heap space, some of the linked lists could be changed to arrays and placed in the data segment (which
> is at about 30K right now). But that would require massive changes to the code so I don't think it is a realistic solution for us.

   Don't overlook that option.  You're having to balance the uses of
many finite resources (Stack, Heap, Data Segment, etc.), and compromise
may be necessary (such as using global data, redesign of modularity,
different kinds of variable data, etc.).  It sounds as though you are
facing extreme situations, and "extreme" action may well be necessary.
   BTW, I would always take care to analyze/evaluate the data you're
using: can fixed-length strings be used instead of "string", can array
dimensions be reduced, can you use Byte instead of Integer, etc.?
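
   For example (a made-up record, but the sizes are what TP7 reports):

    type
      TLoose = record
        Name : string;          { 256 bytes, used or not }
        Count: Integer;         { 2 bytes }
        Flag : Integer;         { 2 bytes spent on a yes/no value }
      end;
      TTight = record
        Name : string[20];      { 21 bytes }
        Count: Integer;         { 2 bytes }
        Flag : Boolean;         { 1 byte }
      end;
    begin
      WriteLn(SizeOf(TLoose));  { 260 }
      WriteLn(SizeOf(TTight));  { 24 }
    end.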

Re:Heap/Stack space problems


Quote
Tom Sella wrote:

> I'm helping write a large application for TP7.  It will be used on Win95 or WinNT systems with at least 16MB
> memory.  The application is a simulation which uses a lot of linked lists and dynamic variables that get stored
> on the heap.  We are at the point where if we increase stack space we

> I don't know much about overlays, so I don't know if that will help.

In fact overlays are just what the doctor ordered. Your copious use of
units is costing you memory, not freeing it. If anything I would try to
consolidate your units before trying overlays, because I think the
compiler opens up a new segment of memory for each unit, even if the
previous segment was not all used up. Overlaying forces the compiler to
use up all of the memory in one segment before going on to the next one.
Actually, if your units are going over the 64 KB limit then you want to
break them up. If they are going under, you want to consolidate.

I think this is how it works, but I'm not sure. Hope this helps.

Re:Heap/Stack space problems


Mike,

   Thanks for responding.  See my comments (and further questions!) below.

Tom

Quote
Mike Copeland wrote:

>    I suggest you look carefully at your dynamic data: the amount of
> space allocated for each element of each L/L node (note that every
> allocation rounds UP to a mod-8 value); and the size/dimension of other
> structures.  Realize, too, that the L/L may allocate more Heap space
> than, say, a pointer array implementation of the same type of
> information (every L/L element carries a pointer, which may cause the
> mod-8 allocation to carry over to the next mod-8 value).  Although it
> seems that a L/L is the only way to use a dynamically-defined amount of
> data, it's not, since you can allocate a variable pointer array and use
> only the number of pointers you need.

By using MemAvail and how much free DOS memory the compiler says we have, we
are using about 10K of heap space right now for a typical simulation load.
Code size is 460K (and growing), stack size is 22.5K, and the data size is
currently 25K (these have all been pared down in the past couple of days as
this problem has cropped up).  The largest L/L elements are 480 and 280 bytes
respectively.  We desire to have up to 36 of the size 480 and 10 of the size
280.  So that is 46 pointers, but there are a lot of other smaller
L/Ls about.
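
(For the record, the heap check is nothing fancier than this:)

    begin
      WriteLn('Free heap total   : ', MemAvail, ' bytes');
      WriteLn('Largest free block: ', MaxAvail, ' bytes');
    end.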

Running without the IDE we currently have about 30K free at maximum loading
since we have pared things down.  But 30K is not a large margin considering
how much the program has to grow.  

Others have written to me about 3rd-party Turbo XMS handlers available.  
Would this help the problem any by moving any or hopefully all of the code
and data to extended memory?

Quote
> > If run from the DOS prompt, I think that frees up about 10K, so that is
> > OK currently.  But it
> > doesn't leave much room for growth, and this application definitely will be growing.  My question is what can
> > be done to ease this problem?

> > In reading the manuals, the only help I have found so far is to change value parameters to constant parameters,

>    No, you can also pass Var (reference) parameters.  Large structure
> parameters _should_ be passed this way, since they require only 4 bytes of
> Stack, rather than 4 bytes plus the copied data value.  FTM, do you _have_ to pass
> all that data as parameters?

Yes, we have to pass the data as parameters.  Our team leader requires it as
"good programming practice" to keep other programmers on the team from
modifying data that they shouldn't.  Specifying CONST instead of VAR for data
that only needs to be read should save the same amount of space on the
stack.

Quote
> > as that will reduce the need to create local variables on the stack.  There may be some very large records
> > passed as value parameters which are causing the problem and will have to be hunted down.

>    Yes, that will definitely cause high Stack usage.  Look into Var
> (reference) parameters, if parameters must be used.  You won't be sent
> to Hell if you use some global data, you know...8<}}

Our team leader would not like that.  We do use global data, about 20K worth,
and we have to because local variables cannot be static.

Quote

> > I don't know much about overlays, so I don't know if that will help.

> > There are about 20 units in the program.  One of them is very large.  Will breaking this unit into smaller
> > units help the stack space problem?

>    Nope.  Smart linking will link only what's referenced by the program,
> and none of that has bearing on the Stack usage.  However, if you can
> redesign some subprograms, by combining functions, etc., you would save
> Stack usage by decreasing the depth of Stack utilization.

> > To reduce heap space, some of the linked lists could be changed to arrays and placed in the data segment (which
> > is at about 30K right now). But that would require massive changes to the code so I don't think it is a realistic solution for us.

>    Don't overlook that option.  You're having to balance the uses of
> many finite resources (Stack, Heap, Data Segment, etc.), and compromise
> may be necessary (such as using global data, redesign of modularity,
> different kinds of variable data, etc.).  It sounds as though you are
> facing extreme situations, and "extreme" action may well be necessary.
>    BTW, I would always take care to analyze/evaluate the data you're
> using: can fixed-length strings be used instead of "string", can array
> dimensions be reduced, can you use Byte instead of Integer, etc.?

We're looking at all of these options.  We don't use strings very much as
this is a "scientific" application.  But we can change Reals to Single and
other such things.

Thanks again for your help and any further insights.

Tom

Re:Heap/Stack space problems


Quote
> By using MemAvail and how much free DOS memory the compiler says we have, we
> are using about 10K of heap space right now for a typical simulation load.
> Code size is 460K (and growing), stack size is 22.5K, and the data size is
> currently 25K (these have all been pared down in the past couple of days as
> this problem has cropped up).  The largest L/L elements are 480 and 280 bytes
> respectively.  We desire to have up to 36 of the size 480 and 10 of the size
> 280.  So that is 46 pointers, but there are a lot of other smaller
> L/Ls about.

   Well, I would then consider a "paging" system, using either a RAM
Disk or EMS/XMS memory - where you develop an MRU/LRU system of keeping a
small number of elements in memory and page what's needed in and out, as
required.  I do this with an application which must have a "virtual"
memory-based set of data which cannot all be stored in memory.  I
maintain a table of "pages" of this data, which tells me what's in
memory and what's not, which page was used last, and whether a page has
been updated and must be written back to the "disk" before its slot is
reused.  This is a rather conventional method of using much
more data than can be kept in actual memory - and it's actually how
EMS/XMS data is used by the system.  You say you need 36 (or whatever)
elements, but you can surely work with a pool of, say, 5 elements quite
efficiently, if you have a "disk" which moves elements in and out.
Depending on the system I'm running on (laptop with/without Hyperdisk,
RAM disk, or even a hard disk), the application runs at varying
performance levels...but it always runs.
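
   In outline, such a pager might look like this (a sketch only;
ReadPageFromDisk and WritePageToDisk are placeholders for whatever
backing store you choose - a file, a RAM disk, or EMS/XMS):

    const
      PoolSize = 5;                    { hypothetical in-memory pool }
      PageSize = 480;                  { hypothetical element size }

    type
      TPage = array[1..PageSize] of Byte;
      TSlot = record
        PageNo  : Integer;             { which page is loaded (0 = free) }
        LastUsed: Longint;             { "clock" value for LRU selection }
        Dirty   : Boolean;             { must it be written back? }
        Data    : TPage;
      end;

    var
      Pool : array[1..PoolSize] of TSlot;
      Clock: Longint;

    procedure ReadPageFromDisk(N: Integer; var P: TPage);
    begin
      { placeholder: fetch page N from the backing store }
    end;

    procedure WritePageToDisk(N: Integer; var P: TPage);
    begin
      { placeholder: flush page N to the backing store }
    end;

    { Return a pointer to page N, loading it if necessary. }
    function GetPage(N: Integer): Pointer;
    var
      I, Victim: Integer;
    begin
      Inc(Clock);
      for I := 1 to PoolSize do        { already in memory? }
        if Pool[I].PageNo = N then
        begin
          Pool[I].LastUsed := Clock;
          GetPage := @Pool[I].Data;
          Exit;
        end;
      Victim := 1;                     { evict the least recently used slot }
      for I := 2 to PoolSize do
        if Pool[I].LastUsed < Pool[Victim].LastUsed then
          Victim := I;
      if Pool[Victim].Dirty then
        WritePageToDisk(Pool[Victim].PageNo, Pool[Victim].Data);
      ReadPageFromDisk(N, Pool[Victim].Data);
      Pool[Victim].PageNo   := N;
      Pool[Victim].LastUsed := Clock;
      Pool[Victim].Dirty    := False;
      GetPage := @Pool[Victim].Data;
    end;

    begin
      FillChar(Pool, SizeOf(Pool), 0); { all slots start out free }
      Clock := 0;
      { ... GetPage(N) wherever an element is needed ... }
    end.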

Quote

> Running without the IDE we currently have about 30K free at maximum loading
> since we have pared things down.  But 30K is not a large margin considering
> how much the program has to grow.

   Yes, and a paging system would support that.

Quote
> Others have written to me about 3rd-party Turbo XMS handlers available.
> Would this help the problem any by moving any or hopefully all of the code
> and data to extended memory?

   Not as you imply: you'd have to overlay code to get such benefit from
XMS, and I don't think you could swap data in and out (unless you do as
I suggest above, in the "paging" scheme).

Quote
> > > If run from the DOS prompt, I think that frees up about 10K, so that is
> > > OK currently.  But it
> > > doesn't leave much room for growth, and this application definitely will be growing.  My question is what can
> > > be done to ease this problem?

   A data paging system, using MRU/LRU (Most Recently Used/Least
Recently Used) logic - see above.

Quote
> > > In reading the manuals, the only help I have found so far is to change value parameters to constant parameters,

> >    No, you can also pass Var (reference) parameters.  Large structure
> > parameters _should_ be passed this way, since they require only 4 bytes of
> > Stack, rather than 4 bytes plus the copied data value.  FTM, do you _have_ to pass
> > all that data as parameters?

> Yes, we have to pass the data as parameters.  Our team leader requires it as
> "good programming practice" to keep other programmers on the team from
> modifying data that they shouldn't.  Specifying CONST instead of VAR for data
> that only needs to be read should save the same amount of space on the
> stack.

   Well, _something_ has to "give" and a (pathologically) rigid mandate,
such as that, is indeed unfortunate, I feel.  In this business, there
are no absolutes, since requirements and resources are never static.  I
would attempt to prove to this team leader that _his_ requirement is
negatively impacting the ability to deal with the requirement(s) of this
application - by showing sample programs which demonstrate the value
added by using global data, Var parameters, etc. versus value parameters
- he might see the importance of "guidelines" versus "mandates".  As he
should, IMHO...

Quote

> > > as that will reduce the need to create local variables on the stack.  There may be some very large records
> > > passed as value parameters which are causing the problem and will have to be hunted down.

> >    Yes, that will definitely cause high Stack usage.  Look into Var
> > (reference) parameters, if parameters must be used.  You won't be sent
> > to Hell if you use some global data, you know...8<}}

> Our team leader would not like that.  We do use global data, about 20K worth,
> and we have to because local variables cannot be static.

   I'd say that person needs to be taught/convinced of some realities
(and costs/losses imposed by such rigidity).  This is something you may
have to work on...8<}}

Quote
> Thanks again for your help and any further insights.

   My pleasure.  Yours is an interesting problem/issue.  <g>

Re:Heap/Stack space problems


Quote
Mike Copeland <mrc...@primenet.com> writes:
> > By using MemAvail and how much free DOS memory the compiler says we have, we
> > are using about 10K of heap space right now for a typical simulation load.
> > Code size is 460K (and growing), stack size is 22.5K, and the data size is
> > currently 25K (these have all been pared down in the past couple of days as
> > this problem has cropped up).  The largest L/L elements are 480 and 280 bytes
> > respectively.  We desire to have up to 36 of the size 480 and 10 of the size
> > 280.  So that is 46 pointers, but there are a lot of other smaller
> > L/Ls about.

From these figures, data size is not really the problem. A total of about
60 KB for data, heap and stack is not that much. OTOH, code size is the problem
here; 460 KB is quite a lot for real mode. Furthermore, when sizes are growing,
usually code size grows faster than data size.

Quote
>    Well, I would then consider a "paging" system, using either a RAM
> Disk or EMS/XMS memory - where you develop an MRU/LRU system of keeping a
> small number of elements in memory and page what's needed in and out, as
> required.

Indeed, you need such a paging system - though not for the data, as Mike
suggested, but for the code. And this is exactly what overlays are for.
They use some kind of MRU/LRU system, AFAIK, and can swap the code out to EMS,
if wanted, or just reload it from disk, as needed.
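
For reference, the basic TP7 setup looks roughly like this (unit and file
names are invented; each overlaid unit must itself be compiled in the
{$O+,F+} state):

    program Sim;
    uses Overlay, SimCore, SimStats;  { SimCore, SimStats: your own units }

    {$O SimCore}                      { mark the units to be overlaid }
    {$O SimStats}

    begin
      OvrInit('SIM.OVR');             { open the overlay file }
      if OvrResult <> ovrOk then
      begin
        WriteLn('Overlay init failed: ', OvrResult);
        Halt(1);
      end;
      OvrInitEMS;                     { optional: page overlays through EMS; }
                                      { without EMS they stay on disk }
      { ... main simulation ... }
    end.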

You might have to break down bigger units to let the overlay manager be more
effective, because it pages in and out only whole units. AFAIR, another poster
said breaking down units could waste space since units have to begin at
segment boundaries. However, segment boundaries in real mode occur every 16
bytes, so this "waste" of space is negligible.

Paging the data probably won't help much, since it's "only" 60 KB. Even if you
managed to save much of this space, it wouldn't amount to much compared to
the expected growth of your code.

Quote
> > Others have written to me about 3rd-party Turbo XMS handlers available.
> > Would this help the problem any by moving any or hopefully all of the code
> > and data to extended memory?

As said above, the Overlay unit can use EMS. If you'd rather use XMS, however,
there are 3rd-party handlers available (I think OVERXMS.ZIP is the filename).

The other solution, besides overlays, would be using protected mode. In this
mode, you have the whole memory available for code and heap data (static data
and stack are still limited to 64 KB each). You won't have to change much in
the program, so this might be the easiest solution, but you'll have to buy a
BP 7.0 upgrade.

Quote
> > Yes, we have to pass the data as parameters.  Our team leader requires it as
> > "good programming practice" to keep other programmers on the team from
> > modifying data that they shouldn't.
> > Specifying CONST instead of VAR for data
> > that only needs to be read should save the same amount of space on the
> > stack.

CONST and VAR parameters use the same amount of stack (except for data <4
bytes, which always take 4 bytes when passed as "VAR"). What really wastes
stack, if improperly used, is value parameters (those without "CONST" or
"VAR"). Those should (almost) never be used for bigger structures.

Quote
> > Our team leader would not like that.  We do use global data, about 20K worth,
> > and we have to because local variables cannot be static.
>    I'd say that person needs to be taught/convinced of some realities
> (and costs/losses imposed by such rigidity).  This is something you may
> have to work on...8<}}

True, but OTOH you should also consider the costs of using too much global
data, such as harder maintainability and reusability of your code. I usually
start to worry about the problem when parameter lists get very long
(more than 10 parameters, as a guideline, but it depends). Often, the problem
can be solved by packing some of the parameters into a record, and passing the
record by reference - especially if the same parameters are often used
together. You might note that this points in the direction of object-
oriented programming, where data are packed into records (objects) and
(implicitly) passed with each procedure (method) call.
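
A toy example of that refactoring (all names invented):

    type
      TBody = record                  { fields that always travel together }
        Pos, Vel, Mass, Drag: Real;
      end;

    { One VAR record (4 bytes of stack) replaces a list of five
      separate parameters. }
    procedure StepBody(var B: TBody; DeltaT: Real);
    begin
      B.Pos := B.Pos + B.Vel * DeltaT;
      B.Vel := B.Vel - B.Drag * B.Vel * DeltaT / B.Mass;
    end;

    var
      Ball: TBody;
    begin
      Ball.Pos := 0.0; Ball.Vel := 10.0;
      Ball.Mass := 1.0; Ball.Drag := 0.1;
      StepBody(Ball, 0.01);
      WriteLn(Ball.Pos:0:4, ' ', Ball.Vel:0:4);
    end.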

BTW, I wouldn't call the requirements rigid, since you are already using 25 KB
of global data (much more than heap data, from the figures above). Rather than
making more data global, I would consider the opposite, since global data are
limited to 64 KB (and even less under Windoze, if that matters), whereas heap
is limited only by the amount of conventional (real mode) or total (protected
mode) memory. I would suggest using global data only for data that are really
global, i.e. that have the same meaning to the whole program (rather than such
data that may have different meanings and different values, e.g. to each
window in a multi-window-program, and would have to be changed each time the
active window changes - or things like that).

Frank

Re:Heap/Stack space problems


Thanks everyone for your help.  We have "solved" the problem with overlays and
EMS.  I was worried about execution speed, so I tested it by overlaying the
biggest unit, which is called quite frequently, along with some other units
called less frequently.  The speed penalty was about 5%, much better than I
expected.

  The long-term solution will probably be to upgrade to Delphi.

Tom

Quote
Frank Heckenbach wrote:

> BTW, I wouldn't call the requirements rigid, since you are already using 25 KB
> of global data (much more than heap data, from the figures above). Rather than
> making more data global, I would consider the opposite, since global data are
> limited to 64 KB (and even less under Windoze, if that matters), whereas heap
> is limited only by the amount of conventional (real mode) or total (protected
> mode) memory. I would suggest using global data only for data that are really
> global, i.e. that have the same meaning to the whole program (rather than such
> data that may have different meanings and different values, e.g. to each
> window in a multi-window-program, and would have to be changed each time the
> active window changes - or things like that).

> Frank
