Python consumes a lot of memory, or how to reduce the size of objects?
A memory problem may arise when a large number of objects are active in RAM during the execution of a program, especially if there are restrictions on the total amount of available memory.
Below is an overview of some methods of reducing the size of objects, which can significantly reduce the amount of RAM needed for programs in pure Python.
Note: This is the English version of my original post (in Russian).
For simplicity, we will consider Python structures that represent points with coordinates `x`, `y`, `z`, with access to the coordinate values by name.
Dict
In small programs, especially in scripts, it is quite simple and convenient to use the built-in `dict` to represent structural information:
>>> ob = {'x':1, 'y':2, 'z':3}
>>> x = ob['x']
>>> ob['y'] = y
With the advent of the more compact implementation in Python 3.6, with its ordered set of keys, `dict` has become even more attractive. However, let's look at the size of its footprint in RAM:
>>> import sys
>>> print(sys.getsizeof(ob))
240
It takes a lot of memory, especially if you suddenly need to create a large number of instances:
Number of instances | Size of objects |
---|---|
1 000 000 | 240 MB |
10 000 000 | 2.40 GB |
100 000 000 | 24 GB |
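Such table entries are just the shallow per-instance size multiplied by the instance count. A minimal sketch of that estimate (the `footprint` helper is my illustration; shared objects such as keys and small ints are deliberately not counted):
>>> import sys
>>> def footprint(obj, n):
...     # shallow size of one instance times the number of instances
...     return sys.getsizeof(obj) * n
...
>>> footprint({'x': 1, 'y': 2, 'z': 3}, 1_000_000)
240000000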
Class instance
For those who like to wrap everything in classes, it is preferable to define structures as a class with access by attribute name:
class Point:
    #
    def __init__(self, x, y, z):
        self.x = x
        self.y = y
        self.z = z
>>> ob = Point(1,2,3)
>>> x = ob.x
>>> ob.y = y
The structure of the class instance is interesting:
Field | Size (bytes) |
---|---|
PyGC_Head | 24 |
PyObject_HEAD | 16 |
__weakref__ | 8 |
__dict__ | 8 |
TOTAL: | 56 |
Here `__weakref__` is a reference to the list of so-called weak references to this object, and `__dict__` is a reference to the instance dictionary, which holds the values of the instance attributes (note that on 64-bit platforms a reference occupies 8 bytes). Starting with Python 3.3, a shared key space is used for the dictionaries of all instances of a class. This reduces the instance's footprint in RAM:
>>> print(sys.getsizeof(ob), sys.getsizeof(ob.__dict__))
56 112
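In other words, the per-instance cost behind the following table is the object itself plus its instance dictionary, i.e. 56 + 112 = 168 bytes on the build measured here:
>>> print(sys.getsizeof(ob) + sys.getsizeof(ob.__dict__))
168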
As a result, a large number of class instances has a smaller footprint in memory than a regular dictionary (`dict`):
Number of instances | Size |
---|---|
1 000 000 | 168 MB |
10 000 000 | 1.68 GB |
100 000 000 | 16.8 GB |
It is easy to see that the size of the instance in RAM is still large because of the instance dictionary.
Instance of a class with `__slots__`
A significant reduction in the size of a class instance in RAM is achieved by eliminating `__dict__` and `__weakref__`. This is possible with the help of a "trick" with `__slots__`:
class Point:
    __slots__ = 'x', 'y', 'z'

    def __init__(self, x, y, z):
        self.x = x
        self.y = y
        self.z = z
>>> ob = Point(1,2,3)
>>> print(sys.getsizeof(ob))
64
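A side effect worth knowing about: with `__slots__` the set of attributes is fixed, so assigning an attribute that was not declared fails (a quick illustrative check):
>>> ob.w = 4
Traceback (most recent call last):
  ...
AttributeError: 'Point' object has no attribute 'w'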
The object size in RAM has become significantly smaller:
Field | Size (bytes) |
---|---|
PyGC_Head | 24 |
PyObject_HEAD | 16 |
x | 8 |
y | 8 |
z | 8 |
TOTAL: | 64 |
Using `__slots__` in the class definition makes the footprint of a large number of instances in memory significantly smaller:
Number of instances | Size |
---|---|
1 000 000 | 64 MB |
10 000 000 | 640 MB |
100 000 000 | 6.4 GB |
Currently, this is the main method of substantially reducing the memory footprint of an instance of a class in RAM.
This reduction is achieved by storing the object references (the attribute values) directly in memory after the object header, and by accessing them through special descriptors stored in the class dictionary:
>>> from pprint import pprint
>>> pprint(Point.__dict__)
mappingproxy(
    ....................................
    'x': <member 'x' of 'Point' objects>,
    'y': <member 'y' of 'Point' objects>,
    'z': <member 'z' of 'Point' objects>})
To automate the process of creating a class with `__slots__`, there is the library [namedlist](https://pypi.org/project/namedlist). The `namedlist.namedlist` function creates a class with `__slots__`:
>>> from namedlist import namedlist
>>> Point = namedlist('Point', ('x', 'y', 'z'))
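Unlike a namedtuple, the generated class is mutable (a quick illustrative check):
>>> ob = Point(1, 2, 3)
>>> ob.x = 10
>>> ob.x
10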
Another package, [attrs](https://pypi.org/project/attrs), allows you to automate the process of creating classes both with and without `__slots__`.
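For example, a slots-based point class with attrs might be declared like this (a minimal sketch using the classic `attr.s`/`attr.ib` API):
import attr

@attr.s(slots=True)
class Point:
    x = attr.ib()
    y = attr.ib()
    z = attr.ib()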
Tuple
Python also has the built-in type `tuple` for representing immutable data structures. A tuple is a fixed structure or record, but without field names; fields are accessed by index. The tuple fields are bound to their value objects once and for all, at the moment the tuple instance is created:
>>> ob = (1,2,3)
>>> x = ob[0]
>>> ob[1] = y  # ERROR: tuples do not support item assignment
Instances of `tuple` are quite compact:
>>> print(sys.getsizeof(ob))
72
They occupy 8 bytes more in memory than instances of classes with `__slots__`, since the tuple's memory layout also contains a field holding the number of elements (`ob_size`):
Field | Size (bytes) |
---|---|
PyGC_Head | 24 |
PyObject_HEAD | 16 |
ob_size | 8 |
[0] | 8 |
[1] | 8 |
[2] | 8 |
TOTAL: | 72 |
Namedtuple
Since tuples are used very widely, sooner or later there was a demand for access to the fields by name as well. The answer to this demand was the `collections.namedtuple` module.
The `namedtuple` function is designed to automate the process of generating such classes:
>>> from collections import namedtuple
>>> Point = namedtuple('Point', ('x', 'y', 'z'))
It creates a subclass of tuple, in which descriptors are defined for accessing fields by name. For our example, it would look something like this:
class Point(tuple):
    #
    @property
    def x(self):
        return self[0]

    @property
    def y(self):
        return self[1]

    @property
    def z(self):
        return self[2]
    #
    def __new__(cls, x, y, z):
        return tuple.__new__(cls, (x, y, z))
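A quick illustrative check that fields are readable by name but not assignable:
>>> ob = Point(1, 2, 3)
>>> ob.y
2
>>> ob.y = 4  # AttributeError: can't set attribute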
All instances of such classes have a memory footprint identical to that of a tuple, so a large number of instances leaves a slightly larger memory footprint than classes with `__slots__`:
Number of instances | Size |
---|---|
1 000 000 | 72 MB |
10 000 000 | 720 MB |
100 000 000 | 7.2 GB |
Recordclass: mutable namedtuple without cyclic GC
Since `tuple` and, accordingly, `namedtuple` classes generate immutable objects, in the sense that attribute `ob.x` can no longer be bound to another value object, a demand arose for a mutable namedtuple variant. Since Python has no built-in type identical to a tuple that supports assignment, many options have been created. We will focus on [recordclass](https://pypi.org/project/recordclass), which received a high rating on [stackoverflow](https://stackoverflow.com/questions/29290359/existence-of-mutable-named-tuple-in-python/29419745). In addition, it can be used to reduce the size of objects in RAM compared to `tuple`-like objects.
The recordclass package introduces the type `recordclass.mutabletuple`, which is almost identical to the tuple but also supports assignment. On its basis, subclasses are created that are almost completely identical to namedtuples but also support the assignment of new values to fields (without creating new instances). The `recordclass` function, like the `namedtuple` function, automates the creation of these classes:
>>> from recordclass import recordclass
>>> Point = recordclass('Point', ('x', 'y', 'z'))
>>> ob = Point(1, 2, 3)
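Unlike a namedtuple, the fields can be reassigned in place, without creating a new instance (a quick illustrative check):
>>> ob.x = 100
>>> ob.x
100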
Instances of such classes have the same structure as a `tuple`, but without `PyGC_Head`:
Field | Size (bytes) |
---|---|
PyObject_HEAD | 16 |
ob_size | 8 |
x | 8 |
y | 8 |
z | 8 |
TOTAL: | 48 |
By default, the `recordclass` function creates a class that does not participate in the cyclic garbage collection mechanism. Typically, `namedtuple` and `recordclass` are used to generate classes representing records or simple (non-recursive) data structures, and using them correctly in Python does not generate circular references. For this reason, the `PyGC_Head` fragment, which is needed by classes that support the cyclic garbage collection mechanism, is by default excluded from instances of classes generated by `recordclass` (more precisely: in the `PyTypeObject` structure corresponding to the created class, the `Py_TPFLAGS_HAVE_GC` flag is not set in the `flags` field by default).
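This can be observed from pure Python with `gc.is_tracked`, which reports whether the cyclic collector tracks a given object (an illustrative check; the `False` result follows from the missing flag described above):
>>> import gc
>>> gc.is_tracked(Point(1, 2, 3))
False
>>> gc.is_tracked([1, 2, 3])
True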
The memory footprint of a large number of instances is smaller than that of instances of the class with `__slots__`:
Number of instances | Size |
---|---|
1 000 000 | 48 MB |
10 000 000 | 480 MB |
100 000 000 | 4.8 GB |
Dataobject
Another solution, proposed in the recordclass library, is based on the following idea: use the same storage structure in memory as in class instances with `__slots__`, but without participating in the cyclic garbage collection mechanism. Such classes are generated using the `recordclass.make_dataclass` function:
>>> from recordclass import make_dataclass
>>> Point = make_dataclass('Point', ('x', 'y', 'z'))
A class created this way produces mutable instances by default.
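For instance (a quick illustrative check):
>>> p = Point(1, 2, 3)
>>> p.x = 10
>>> p.x
10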
Another way is to use a class declaration inheriting from `recordclass.dataobject`:
from recordclass import dataobject

class Point(dataobject):
    x: int
    y: int
    z: int
Classes created in this way produce instances that do not participate in the cyclic garbage collection mechanism. The structure of the instance in memory is the same as in the case of `__slots__`, but without the `PyGC_Head`:
Field | Size (bytes) |
---|---|
PyObject_HEAD | 16 |
x | 8 |
y | 8 |
z | 8 |
TOTAL: | 40 |
>>> ob = Point(1,2,3)
>>> print(sys.getsizeof(ob))
40
Field access here also uses special descriptors, located in the class dictionary, that reach each field by its offset from the beginning of the object:
>>> pprint(Point.__dict__)
mappingproxy({'__new__': <staticmethod at 0x7f203c4e6be0>,
    .......................................
    'x': <recordclass.dataobject.dataslotgetset at 0x7f203c55c690>,
    'y': <recordclass.dataobject.dataslotgetset at 0x7f203c55c670>,
    'z': <recordclass.dataobject.dataslotgetset at 0x7f203c55c410>})
The size of the memory footprint of a large number of instances is the minimum possible for CPython:
Number of instances | Size |
---|---|
1 000 000 | 40 MB |
10 000 000 | 400 MB |
100 000 000 | 4.0 GB |
Cython
There is also an approach based on the use of [Cython](https://cython.org). Its advantage is that fields can hold values of atomic C types. Descriptors for accessing the fields from pure Python are created automatically. For example:
cdef class Point:
    cdef public int x, y, z

    def __init__(self, x, y, z):
        self.x = x
        self.y = y
        self.z = z
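One way to try this without a build setup is Cython's `pyximport`, which compiles a `.pyx` module on first import (a sketch; the file name `point.pyx` is my assumption):
>>> import pyximport; pyximport.install()
>>> from point import Point  # compiles point.pyx on first import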
In this case, the instances have an even smaller memory size:
>>> ob = Point(1,2,3)
>>> print(sys.getsizeof(ob))
32
The instance's memory layout has the following structure:
Field | Size (bytes) |
---|---|
PyObject_HEAD | 16 |
x | 4 |
y | 4 |
z | 4 |
(padding) | 4 |
TOTAL: | 32 |
The memory footprint of a large number of instances is smaller still:
Number of instances | Size |
---|---|
1 000 000 | 32 MB |
10 000 000 | 320 MB |
100 000 000 | 3.2 GB |
However, it should be remembered that every access from Python code performs a conversion from C `int` to a Python object and vice versa.
Numpy
Using multidimensional arrays or arrays of records for a large amount of data yields memory savings. However, for efficient processing in pure Python, you should favor processing methods based on functions from the `numpy` package.
>>> import numpy
>>> Point = numpy.dtype([('x', numpy.int32), ('y', numpy.int32), ('z', numpy.int32)])
An array of `N` elements, initialized with zeros, is created as follows:
>>> points = numpy.zeros(N, dtype=Point)
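Individual elements and whole columns can then be addressed by field name (a quick illustrative check):
>>> points[0]['x'] = 1
>>> points['x'][:3]
array([1, 0, 0], dtype=int32)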
The size of the array in memory is the minimum possible:
Number of objects | Size |
---|---|
1 000 000 | 12 MB |
10 000 000 | 120 MB |
100 000 000 | 1.20 GB |
Normal access to array elements and rows requires a conversion from a Python object to a C `int` value and vice versa every time. Moreover, extracting a single row creates an array containing a single element, whose footprint is not so compact anymore:
>>> sys.getsizeof(points[0])
68
Therefore, as noted above, in Python code it is necessary to process arrays using functions from the `numpy` package.
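As an illustration of what that means in practice, a computation such as the length of every point vector can be written as a few vectorized operations over whole columns, with no per-element Python objects (a minimal sketch; the computation is my example, not from the original post):
>>> # one vectorized pass over all N points
>>> norms = numpy.sqrt(points['x']**2.0 + points['y']**2.0 + points['z']**2.0)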
Conclusion
Using a clear and simple example, we have verified that the community of developers and users of the Python programming language (CPython) has real options for significantly reducing the amount of memory used by objects.