Memory usage and management

Latest revision as of 18:29, 22 November 2024

Overview

Some models may require large amounts of memory, especially when they contain arrays with many dimensions (Indexes). When the memory used by your model approaches the available memory on your computer, you may see an error saying that there is insufficient memory to carry out the computation, or you may experience slow performance as the process starts using virtual memory. This page explains how to monitor memory usage. It also suggests ways to work around memory bottlenecks, or modify your model to reduce memory usage.

Viewing memory usage

You can monitor the total amount of memory in use by your Analytica process by using the Memory Usage dialog. See also Memory usage.

Memory limits and memory types

Roughly speaking, there are two memory limits on your computer. The first is the amount of RAM (random access memory). The second, the maximum available memory, adds VM (virtual memory), which saves some data to a page file on disk. If your model needs more than the maximum available memory (RAM plus page file, less space used by other processes, including the Windows OS), Analytica will report an insufficient memory error.

When your model's memory needs exceed the available RAM and it starts to use VM, performance can slow dramatically. This slowdown occurs because the operating system has to swap memory between RAM and disk, and disk is much slower than RAM. The degree of slowdown depends on how localized the memory demands are. If a single array's storage space exceeds RAM, the slowdown may be intolerable. If all the individual arrays are much smaller than RAM, and the total usage exceeds RAM only because the model has many variables, you might not experience much slowdown.

Hardware cures for memory limits

Nowadays many computers, including laptops, have an SSD (solid-state drive), which is much faster than a hard disk, though not as fast as RAM. Using an SSD can greatly alleviate the slowdown from use of VM. Adding an SSD to your computer can sometimes be an easy fix for the slowdown of memory-intensive models. Or you can simply add more RAM, which gets ever cheaper.

Configuring virtual memory

If you need more memory than the amount of RAM on your computer, you can increase your virtual memory (VM). The Windows operating system lets you adjust the size of the VM page file, i.e. the amount of memory available to your process. On Windows, you can configure this from My Computer → Properties → Advanced system settings → Performance → Settings → Advanced → Virtual Memory (Change).

Windows tends to freeze for several minutes when it needs to increase the size of its page file, so you should configure a custom initial size large enough for your largest computations. If you have (or add) a solid-state drive (SSD), make sure to locate your page file on the SSD.

The virtual memory settings below show our recommended configuration:

Recommended Virtual memory settings.png

On this computer, the paging file has been located on the solid state drive E: for maximum speed. To ensure it uses the SSD, C: is set to No paging file. It specifies a Custom size for E: -- this is important. We strongly recommend against using the System managed size option (unless you enjoy having Windows freeze for several minutes). Set the initial size to something large enough to accommodate the largest memory usage you might ever encounter. In this example, the maximum size has been set to something a bit larger, but in reality there is little benefit to setting the Maximum size to anything larger than the Initial size, since you should never exceed the initial size.

How to reduce memory used by your model

There are several ways to reduce memory use by a model, described below. But before you do, we strongly recommend starting out by using the Performance Profiler library to find out which variables are using most of the memory. This is a feature available in Analytica Enterprise (and above). The Profiler shows how much memory (and computation) is used by each variable. Often you will find that just a few variables consume the bulk of memory. Then you can focus your efforts to reduce memory usage on just those variables.

Reduce dimensionality

Excessive memory usage is usually the result of arrays with too many dimensions (Indexes). Analytica makes it very easy to add indexes to your model, but the effect is multiplicative in space and time when those indexes appear in the same array. Finding the right trade-off between dimensionality, level of detail and resource usage (time and computation) is an inherent aspect of model building.
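The multiplicative effect is easy to quantify. The following Python sketch uses illustrative numbers (not from any particular model) to show how adding one more index of size 100 to an array multiplies its memory by 100:

```python
# Sketch: memory grows multiplicatively with each index on the same array.
# Index sizes below are hypothetical, purely for illustration.

def array_bytes(index_sizes, bytes_per_cell=8):
    """Memory for a dense array carrying all the given indexes,
    assuming 8-byte numeric cells."""
    total = bytes_per_cell
    for size in index_sizes:
        total *= size
    return total

# A 3-index array: 100 time periods x 500 scenarios x 200 regions
small = array_bytes([100, 500, 200])        # 80 MB
# Adding one more index of size 100 multiplies memory by 100:
large = array_bytes([100, 500, 200, 100])   # 8 GB

print(small, large)  # 80000000 8000000000
```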

When reducing dimensionality, focus on individual arrays that consume a lot of space as a result of high dimensionality. Can you reformulate your problem using a lower-dimensional array?

Selective parametric analysis

Parametric analysis refers to the practice of inserting an extra index into a model input in order to explore how outputs vary across a range of values for the given input. Analytica makes it easy to do this simultaneously for multiple inputs, but doing so adds an extra dimension for each parametric variable. Selective parametric analysis means selecting just a few inputs for parametric analysis at a time. Leave the other inputs at a single point. You can repeat the process, selecting a different subset of inputs for parametric analysis. First use a Tornado chart, which varies just one uncertain input at a time, to identify the most sensitive variables deserving of parametric analysis. You can facilitate selective parametric analysis by using Choice menu input controls for each potential parametric input. See Creating a choice menu.
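The idea can be sketched outside Analytica in Python. The model function, inputs, and ranges below are all hypothetical; the point is that only one input carries a parametric range at a time while the rest stay at the base case:

```python
# Sketch of selective parametric analysis: vary one input over its
# parametric range while all other inputs stay at a single base-case
# point. The model and values here are hypothetical.

def model(price, demand, cost):
    return (price - cost) * demand

base = {"price": 10.0, "demand": 1000.0, "cost": 6.0}
parametric_ranges = {
    "price": [8.0, 10.0, 12.0],
    "cost": [5.0, 6.0, 7.0],
}

def sweep_one(input_name):
    """Parametric sweep over a single input; others fixed at the base case."""
    results = []
    for v in parametric_ranges[input_name]:
        point = dict(base, **{input_name: v})
        results.append(model(**point))
    return results

print(sweep_one("price"))  # [2000.0, 4000.0, 6000.0]
```

Repeating `sweep_one` for each input of interest gives a sequence of small one-dimensional results instead of one large array carrying every parametric index at once.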

Configuring caching

Whenever Analytica computes the result of a variable, it stores the result in memory. If the value is requested later by another variable, or if the user views the result, Analytica simply returns the previously computed result without having to recompute it. This is referred to as caching.

It may be unnecessary to cache many of the intermediate variables within your model, or to keep the cached value once all the children of that variable have been computed. You can configure how cached results are retained by setting the CachingMethod attribute for each variable[1]. You can configure a variable to always cache its results, never cache, never cache array values, or release cached results after all children are fully computed. To control the CachingMethod, you must first make the attribute visible as described in Managing attributes. Note that you should always cache results for any variable containing a random or distribution function. There are some other limitations and interactions to be aware of when managing caching policies, as described further in Controlling When Result Values Are Cached on the Analytica Wiki.
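As a rough illustration of the "release cached results after all children are fully computed" policy, here is a Python sketch. It is not Analytica's implementation, and all names in it are hypothetical:

```python
# Sketch of one caching policy: an intermediate variable's cached result
# is freed as soon as every dependent ("child") has consumed it.

class CachedVar:
    def __init__(self, name, compute, parents=(), caching="always"):
        self.name = name
        self.compute = compute        # function of the parent values
        self.parents = parents
        self.caching = caching        # "always" or "release_after_children"
        self.cached = None
        self.pending_children = 0     # how many dependents still need us

    def value(self):
        if self.cached is None:
            self.cached = self.compute(*[p.value() for p in self.parents])
            for p in self.parents:
                p.child_done()
        return self.cached

    def child_done(self):
        self.pending_children -= 1
        if self.caching == "release_after_children" and self.pending_children <= 0:
            self.cached = None        # free the memory held by the result

# A big intermediate that only one downstream variable needs:
a = CachedVar("a", lambda: list(range(1_000_000)),
              caching="release_after_children")
a.pending_children = 1
b = CachedVar("b", lambda av: sum(av), parents=(a,))

total = b.value()
print(total, a.cached is None)  # 499999500000 True
```

After `b` is computed, the large list held by `a` is released, while `b` keeps its (small) cached result.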

Looping over the model

Analytica’s array abstraction computes the entire model in one pass. Because all intermediate variables are normally cached, this means that the results for all scenarios, across all indexes, are stored in memory for all variables.

An alternative is to loop over key dimensions, computing the entire model for a single case at a time, and collecting only the results for the final output variable in each case. This consumes only the amount of memory required to compute the entire model for a single scenario, plus the memory for the final result.

The basic technique is illustrated by the following example. Suppose your model has an input X indexed by I, and an output Y. We also assume that no intermediate steps in the model operate over the index I. Then you can compute Y by looping over I using:

For xi[] := X do WhatIf(Y, X, xi)

This simple example is illustrative. As you apply this to complex models involving multiple inputs, outputs and looping dimensions, a greater level of sophistication becomes necessary. The article Looping over a model on the Analytica Wiki covers this topic in greater detail.
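For readers more familiar with conventional languages, here is a Python analog of the same pattern; the model chain and the names X, Y, and evaluate_model are hypothetical. Each pass computes the whole model for one element of the index and keeps only the final output:

```python
# Python analog (not Analytica code) of the WhatIf loop: evaluate the
# model for one element of the index at a time, keeping only the final
# output rather than every intermediate array.

def evaluate_model(x):
    """Whole model for a single scalar input x. Intermediates are local
    variables, so their memory is freed when the function returns."""
    intermediate = [x * k for k in range(1000)]  # stands in for a large array
    return sum(intermediate)

X = [1, 2, 3]                          # the values of input X over index I
Y = [evaluate_model(xi) for xi in X]   # collect only the final result
print(Y)  # [499500, 999000, 1498500]
```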

Large sample library

Monte Carlo simulations with a very large sample size may also lead to insufficient memory. The Large sample library runs a Monte Carlo simulation in small batches, collecting the entire Monte Carlo sample for a selected subset of output variables. The Large Sample Library User Guide is found on the Analytica Wiki, where the library itself is also available for download.
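The batching idea itself is straightforward. The following Python sketch (not the Large sample library itself; the simulation and names are hypothetical) runs a simulation in batches of 1000, retaining only the output sample:

```python
# Sketch of batched Monte Carlo: each batch's inputs and intermediates
# are discarded after the batch, so memory stays proportional to the
# batch size plus the collected output sample.
import random

def simulate_batch(batch_size, rng):
    """One batch: draw inputs, compute a toy model, return output sample."""
    return [rng.uniform(0, 1) ** 2 for _ in range(batch_size)]

def large_sample(total_size, batch_size=1000, seed=0):
    rng = random.Random(seed)
    output_sample = []
    for _ in range(total_size // batch_size):
        output_sample.extend(simulate_batch(batch_size, rng))
        # the batch's intermediates are now garbage-collected
    return output_sample

sample = large_sample(10_000)
print(len(sample))  # 10000
```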

CompressMemoryUsedBy(A)

In some cases, the CompressMemoryUsedBy() function is able to reduce the amount of memory consumed by array A. The logical contents of A remain unchanged, so you will not see a difference, other than possibly a drop in memory usage.

Analytica’s internal representation of arrays is able to accommodate certain forms of sparseness, and when such patterns of sparseness occur within A, CompressMemoryUsedBy(A) condenses the internal representation to leverage the sparseness. There are two forms of sparseness that may occur: Shared subarrays and constant subvectors. For more details on these forms of sparseness, see CompressMemoryUsedBy on the Analytica Wiki.
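The "constant subvector" form of sparseness can be illustrated with a short Python sketch. This is only an analogy for the idea, not Analytica's internal representation:

```python
# Sketch: when a whole slice of an array repeats one value, store the
# single value and a length instead of the full vector.

def compress_rows(rows):
    """Replace any row whose cells are all equal by a (value, length) pair."""
    out = []
    for row in rows:
        if row and all(cell == row[0] for cell in row):
            out.append(("const", row[0], len(row)))  # O(1) storage
        else:
            out.append(("dense", list(row)))
    return out

def expand_rows(compressed):
    """Recover the original rows; logical contents are unchanged."""
    out = []
    for entry in compressed:
        if entry[0] == "const":
            _, value, n = entry
            out.append([value] * n)
        else:
            out.append(list(entry[1]))
    return out

table = [[0, 0, 0, 0], [1, 2, 3, 4], [7, 7, 7, 7]]
packed = compress_rows(table)
assert expand_rows(packed) == table  # contents logically unchanged
```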

Notes

  1. Use of CachingMethod requires Analytica Enterprise.
