Memory Pool

Memory pools, also called fixed-size block allocation, use pools of preallocated memory blocks for memory management, providing dynamic memory allocation comparable to malloc or C++'s operator new. Because those general-purpose allocators suffer from fragmentation due to variable block sizes, they are often unsuitable for real-time systems where predictable performance is required. A more efficient solution is to preallocate a number of memory blocks of the same size, called the memory pool. The application can then allocate, access, and free blocks, represented by handles, at run time.

Many real-time operating systems use memory pools, such as the Transaction Processing Facility.

Some systems, such as the web server Nginx, use the term memory pool to refer to a group of variable-size allocations which can later be deallocated all at once. This is also known as a region; see region-based memory management.
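
A minimal sketch of such a region (arena) allocator in C is given below, assuming a single fixed-size backing buffer. The names region_alloc and region_reset are illustrative and are not Nginx's actual API.

    /* Region (arena) allocation: variable-size requests are carved from
       one buffer and everything is released with a single reset. */
    #include <stddef.h>
    #include <stdint.h>

    #define REGION_SIZE 4096

    static uint8_t region[REGION_SIZE];
    static size_t  region_used = 0;

    /* Bump-pointer allocation: O(1), no per-allocation bookkeeping. */
    void *region_alloc(size_t size)
    {
        size = (size + 7) & ~(size_t)7;            /* align to 8 bytes */
        if (size > REGION_SIZE - region_used)
            return NULL;                           /* region exhausted */
        void *p = &region[region_used];
        region_used += size;
        return p;
    }

    /* Deallocate every allocation made from the region at once. */
    void region_reset(void)
    {
        region_used = 0;
    }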

Simple memory pool implementation

A simple memory pool module can allocate, for example, three pools at compile time with block sizes optimized for the application deploying the module. The application can allocate, access and free memory through the following interface:

  • Allocate memory from the pools. The function determines the pool in which the required block fits. If all blocks of that pool are already reserved, it tries to find a free block in the next larger pool(s). An allocated memory block is represented by a handle.
  • Get an access pointer to the allocated memory.
  • Free the formerly allocated memory block.
The handle can, for example, be implemented as an unsigned int. The module can interpret the handle internally by dividing it into a pool index, a memory block index, and a version. The pool and block indices allow fast access to the corresponding block, while the version, which is incremented on each new allocation, allows detection of handles whose memory block has already been freed (for example, when a handle has been retained too long).
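
A minimal sketch of such a module in C is shown below. For brevity it uses a single pool of 64-byte blocks and a linear search for a free block; the names (pool_alloc, pool_get, pool_free) and the handle layout are illustrative assumptions, not a standard API.

    #include <stdint.h>
    #include <stdio.h>
    #include <string.h>

    #define BLOCK_SIZE  64
    #define BLOCK_COUNT 32

    typedef uint32_t handle_t;             /* 0 is reserved as "invalid handle" */

    static uint8_t  blocks[BLOCK_COUNT][BLOCK_SIZE];
    static uint8_t  used[BLOCK_COUNT];     /* 1 while the block is reserved     */
    static uint16_t version[BLOCK_COUNT];  /* incremented on every allocation   */

    /* Pack block index and version into a handle; a full module would also
       encode a pool index to select among several block sizes. */
    static handle_t make_handle(uint32_t block, uint16_t ver)
    {
        return ((uint32_t)ver << 16) | (block + 1);    /* +1 keeps 0 invalid */
    }

    /* Allocate a block and return its handle, or 0 if the pool is full. */
    handle_t pool_alloc(void)
    {
        for (uint32_t i = 0; i < BLOCK_COUNT; i++) {
            if (!used[i]) {
                used[i] = 1;
                version[i]++;              /* stale handles now mismatch */
                return make_handle(i, version[i]);
            }
        }
        return 0;
    }

    /* Get an access pointer, or NULL if the handle is stale or invalid. */
    void *pool_get(handle_t h)
    {
        uint32_t block = h & 0xFFFFu;
        uint16_t ver   = (uint16_t)(h >> 16);
        if (block == 0 || block > BLOCK_COUNT)
            return NULL;
        block -= 1;
        if (!used[block] || version[block] != ver)
            return NULL;                   /* block freed or reallocated */
        return blocks[block];
    }

    /* Free the formerly allocated block; stale handles are ignored. */
    void pool_free(handle_t h)
    {
        if (pool_get(h) != NULL)
            used[(h & 0xFFFFu) - 1] = 0;
    }

    int main(void)
    {
        handle_t h = pool_alloc();
        char *msg = pool_get(h);
        strcpy(msg, "hello from the pool");
        printf("%s\n", msg);
        pool_free(h);
        printf("after free: %p\n", pool_get(h));   /* NULL: stale handle detected */
        return 0;
    }

In a deployment with several pools, the handle would additionally carry the pool index, so that pool_get can locate the block directly without searching.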

Memory pool vs malloc

Benefits

  • Memory pools allow memory allocation with constant execution time. Releasing thousands of objects held in a pool is a single operation, rather than freeing each object individually as required when malloc is used for every object (see the sketch after this list).
  • Memory pools can be grouped in hierarchical tree structures, which suits special programming structures such as loops and recursion.
  • Fixed-size block memory pools do not need to store allocation metadata for each allocation, describing characteristics like the size of the allocated block. Particularly for small allocations, this provides substantial space savings.
  • Memory pools allow deterministic behavior in real-time systems, avoiding sudden out-of-memory errors.
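
As an illustration of the constant-time and bulk-release properties, the following C sketch threads the blocks of a fixed-size pool onto an intrusive free list; the objpool_* names are assumptions made for this example.

    #include <stddef.h>
    #include <stdint.h>

    #define OBJ_SIZE  32                   /* must be at least pointer-sized */
    #define OBJ_COUNT 1024

    typedef union objblock {
        union objblock *next;              /* link used while the block is free */
        uint8_t         payload[OBJ_SIZE];
    } objblock_t;

    static objblock_t objpool[OBJ_COUNT];
    static objblock_t *free_list;

    /* A single call returns every block to the pool, instead of thousands
       of individual free() calls; it also initializes the pool. */
    void objpool_reset(void)
    {
        for (size_t i = 0; i + 1 < OBJ_COUNT; i++)
            objpool[i].next = &objpool[i + 1];
        objpool[OBJ_COUNT - 1].next = NULL;
        free_list = &objpool[0];
    }

    /* O(1): pop the head of the free list (NULL when exhausted). */
    void *objpool_alloc(void)
    {
        objblock_t *b = free_list;
        if (b != NULL)
            free_list = b->next;
        return b;
    }

    /* O(1): push the block back onto the free list. */
    void objpool_free(void *p)
    {
        objblock_t *b = p;
        b->next = free_list;
        free_list = b;
    }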

Drawbacks

  • Memory pools may need to be tuned for the application which deploys them.



  This article uses material from the Wikipedia page available here. It is released under the Creative Commons Attribution-Share-Alike License 3.0.
