mygrad.turn_memory_guarding_off

mygrad.turn_memory_guarding_off()

Globally disables all memory-guarding mechanisms, except within contexts where they are explicitly enabled.

See also

turn_memory_guarding_on

Globally enables all memory-guarding mechanisms

mem_guard_off

context manager & decorator for suspending memory guarding

mem_guard_on

context manager & decorator for enabling memory guarding

Notes

With memory guarding disabled, arrays participating in active computational graphs are not protected from being mutated by the user. Mutating such an array corrupts the derivatives computed via backpropagation and produces incorrect results.

This can speed up computations involving many small tensors substantially.

If you want to disable memory guarding at the system level, you can set the environment variable MYGRAD_MEM_GUARD=False. NOTE THAT THIS IS NOT RECOMMENDED.
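For instance, the variable can be set from within Python before the library is loaded. This is a sketch under an assumption not stated above: that MYGRAD_MEM_GUARD is read once, when mygrad is first imported, so it must be in the environment before that import happens.

```python
import os

# Assumption: MYGRAD_MEM_GUARD is read when mygrad is first imported,
# so it must be set before that import occurs, e.g. here at the very
# top of the entry-point script
os.environ["MYGRAD_MEM_GUARD"] = "False"

# any subsequent `import mygrad` would then start with guarding disabled
```

Equivalently, the variable can be set by the shell that launches the process, e.g. `MYGRAD_MEM_GUARD=False python my_script.py`.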

Examples

The following demonstrates how one can unwittingly corrupt backpropagation through a computational graph.

>>> import mygrad as mg
>>> import numpy as np
>>> mg.turn_memory_guarding_off()  # speeds up calculations, but with risks involved...
>>> x = np.arange(3.)
>>> y = mg.ones_like(x)
>>> z = x * y
>>> x[:] = 0  # mutates x, corrupting state associated with z
>>> z.backward()
>>> y.grad  # would be array([0., 1., 2.]) if graph wasn't corrupted
array([0., 0., 0.])