Welcome to Slick-dnn’s documentation!

Autograd

How it works
When you write:

    import numpy as np
    from slick_dnn.variable import Variable

    a = Variable(np.ones(3))
    b = Variable(np.ones(3))
    c = a + b
a new Variable c is created. Its c.data is the numpy array [2., 2., 2.]. But c also tracks the history of its creation: its backward_function was set to Add.backward and its backward_variables was set to [a, b].
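You can inspect this tracked history directly (a small sketch; the attribute names come from the description above, and their printed form may vary between versions):

    import numpy as np
    from slick_dnn.variable import Variable

    a = Variable(np.ones(3))
    b = Variable(np.ones(3))
    c = a + b

    print(c.data)                # [2. 2. 2.]
    print(c.backward_function)   # the Add.backward function
    print(c.backward_variables)  # the two input Variables, a and b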
Fundamental classes
class slick_dnn.autograd.Autograd

    Autograd is the base class for all operations made on Variables.
    __call__(*variables_list)

        For convenience: one can use any Autograd object by simply calling it
        instead of using the apply method.

        Parameters: variables_list – same as in the forward method
        Returns: whatever apply returns
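        For example (a brief sketch; ReLU is documented below, and per the
        description above the two call forms are equivalent):

            import numpy as np
            from slick_dnn.variable import Variable
            from slick_dnn.autograd.activations import ReLU

            x = Variable(np.array([-1.0, 0.0, 2.0]))
            relu = ReLU()

            y1 = relu(x)        # convenience form, via __call__
            y2 = relu.apply(x)  # explicit form; same result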
    apply(*variables_list)

        Actual creation of a new Variable. It calls the overridden forward
        method, creates a new Context (the same context is used in the forward
        and backward passes), and sets backward_function and
        backward_variables on the new Variable.

        Parameters: variables_list – any Variables; the backward function will have to calculate gradients w.r.t. all input variables
        Returns: one Variable, with tracked history and calculated data
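        Conceptually, apply does something like the following (a simplified
        sketch reconstructed from the prose above, not the library's actual
        source; the forward signature and how the Context reaches backward are
        assumptions):

            from slick_dnn.variable import Variable
            from slick_dnn.autograd import Context

            def apply_sketch(op, *variables_list):
                # One fresh Context per call; the same ctx is meant to be
                # reused by the backward pass (assumed constructor).
                ctx = Context()
                # Assumed signature: forward receives the context plus the raw data.
                data = op.forward(ctx, *[v.data for v in variables_list])
                new_var = Variable(data)
                new_var.backward_function = op.backward   # e.g. Add.backward for Add
                new_var.backward_variables = list(variables_list)
                return new_var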
class slick_dnn.autograd.Context

    This class stores information for back propagation. Autograd uses this
    class instead of self to allow constructions like:

        relu = ReLU()
        b = relu(a)
        c = relu(b)

    That means you can use one instance of an Autograd class for all of your
    operations. Without it, the example above would have to be:

        relu1 = ReLU()
        relu2 = ReLU()
        b = relu1(a)
        c = relu2(b)
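    The Context also makes it straightforward to write a custom operation.
    Below is a hypothetical Square op (a sketch only: the forward(ctx, ...) /
    backward(ctx, ...) signatures and storing attributes on ctx are
    assumptions based on the description above, not documented API):

        import numpy as np
        from slick_dnn.autograd import Autograd

        class Square(Autograd):  # hypothetical, not part of slick_dnn
            def forward(self, ctx, tensor):
                ctx.saved_input = tensor           # stash what backward needs (assumed usage)
                return tensor * tensor

            def backward(self, ctx, grad):
                return 2 * ctx.saved_input * grad  # d(x^2)/dx = 2x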
Mathematical

All mathematical operations available for Variables.
class slick_dnn.autograd.mathematical.Add

    Adds the given tensors.

class slick_dnn.autograd.mathematical.MatMul

    Matrix multiplication: tensor1 @ tensor2.

class slick_dnn.autograd.mathematical.Mul

    Element-wise multiplication.
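A quick usage sketch (the @ operator is shown in the MatMul docstring above;
that + and * dispatch to Add and Mul is an assumption based on the first
example in this document):

    import numpy as np
    from slick_dnn.variable import Variable

    a = Variable(np.ones((2, 3)))
    b = Variable(np.ones((2, 3)))
    w = Variable(np.ones((3, 2)))

    s = a + b   # Add
    p = a * b   # Mul, element-wise (assumed operator overload)
    m = a @ w   # MatMul, result shape (2, 2)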
Activation Functions
All activation functions available.
class slick_dnn.autograd.activations.ReLU

    Applies the ReLU function element-wise.

class slick_dnn.autograd.activations.Softmax

    Applies the Softmax function.
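Usage follows the same calling convention as above (a sketch; the commented
values assume the standard ReLU and Softmax formulas applied to Variable.data):

    import numpy as np
    from slick_dnn.variable import Variable
    from slick_dnn.autograd.activations import ReLU, Softmax

    x = Variable(np.array([-1.0, 0.0, 2.0]))

    h = ReLU()(x)      # data: [0., 0., 2.]
    y = Softmax()(h)   # data is non-negative and sums to 1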