activations
Abs
Bases: Module
Source code in orthogonium\layers\custom_activations.py (lines 11-24)
__init__()
Initializes an instance of the Abs class, an element-wise absolute-value activation.
This method is called automatically when a new Abs object is instantiated. It simply calls the initializer of its superclass so that the inherited Module machinery is set up correctly.
Source code in orthogonium\layers\custom_activations.py (lines 12-21)
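A minimal usage sketch, assuming the import path shown above; `Abs` behaves like any other `torch.nn` activation module:

```python
import torch
from orthogonium.layers.custom_activations import Abs

act = Abs()
x = torch.tensor([[-1.5, 0.0, 2.0]])
y = act(x)
print(y)  # expected: tensor([[1.5, 0.0, 2.0]]), i.e. |x| element-wise
```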
HouseHolder
Bases: Module
Source code in orthogonium\layers\custom_activations.py (lines 68-106)
__init__(channels, axis=1)
An activation that applies a parameterized transformation via the Householder reflection technique. It is initialized with the number of input channels, which must be even, and an axis that determines the dimension along which the operation is applied. This is a corrected version of the original implementation from Singla et al. (2019); the correction adds a 1/sqrt(2) scaling factor so that the activation is 1-Lipschitz.
Attributes:

Name | Type | Description
---|---|---
`theta` | `Parameter` | Learnable parameter that determines the transformation applied via Householder reflection.
`axis` | `int` | Dimension along which the operation is performed.
Parameters:

Name | Type | Description | Default
---|---|---|---
`channels` | `int` | Total number of input channels. Must be an even number. | *required*
`axis` | `int` | Dimension along which the transformation is applied. | `1`
Source code in orthogonium\layers\custom_activations.py (lines 69-93)
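A usage sketch, assuming the import path shown above and a `(batch, channels, height, width)` input layout for `axis=1`; the empirical ratio below illustrates (but does not prove) the 1-Lipschitz property:

```python
import torch
from orthogonium.layers.custom_activations import HouseHolder

act = HouseHolder(channels=16, axis=1)  # channels must be even
x1 = torch.randn(4, 16, 8, 8)
x2 = torch.randn(4, 16, 8, 8)

# A 1-Lipschitz map satisfies ||f(x1) - f(x2)|| <= ||x1 - x2||.
ratio = (act(x1) - act(x2)).norm() / (x1 - x2).norm()
print(ratio)  # expected to be <= 1 (up to numerical precision)
```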
HouseHolder_Order_2
Bases: Module
Source code in orthogonium\layers\custom_activations.py (lines 109-195)
__init__(channels, axis=1)
Represents a layer that performs operations using Householder transformations of order 2, parameterized by one set of angles per group of channels. This is a corrected version of the original implementation from Singla et al. (2019); the correction adds a 1/sqrt(2) scaling factor so that the activation is 1-Lipschitz.
Attributes:

Name | Type | Description
---|---|---
`num_groups` | `int` | The number of groups, which is half the number of channels.
`axis` | `int` | The axis along which the computation is performed.
`theta0` | `Parameter` | A tensor parameter of shape `(num_groups,)` of learnable angles.
`theta1` | `Parameter` | A tensor parameter of shape `(num_groups,)` of learnable angles.
`theta2` | `Parameter` | A tensor parameter of shape `(num_groups,)` of learnable angles.
Parameters:

Name | Type | Description | Default
---|---|---|---
`channels` | `int` | The total number of input channels. Must be an even number. | *required*
`axis` | `int` | Specifies the axis for computations. | `1`
Source code in orthogonium\layers\custom_activations.py (lines 110-156)
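A usage sketch under the same assumptions as above (import path and channels-first layout); only shape preservation is asserted here:

```python
import torch
from orthogonium.layers.custom_activations import HouseHolder_Order_2

act = HouseHolder_Order_2(channels=8, axis=1)  # 8 channels -> 4 groups
x = torch.randn(2, 8, 16, 16)
y = act(x)
print(y.shape)  # the activation preserves the input shape: (2, 8, 16, 16)

# theta0, theta1 and theta2 are learnable, so they show up in .parameters()
print(sum(p.numel() for p in act.parameters()))
```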
MaxMin
Bases: Module
Source code in orthogonium\layers\custom_activations.py
48 49 50 51 52 53 54 55 56 57 58 59 60 61 62 63 64 65 |
|
__init__(axis=1)
This class implements the MaxMin activation function, a pairwise activation that returns the maximum and the minimum (in that order) of each pair of elements in the input tensor.
Parameters:

Name | Type | Description | Default
---|---|---|---
`axis` | `int` | The axis along which to apply the activation function. | `1`
Source code in orthogonium\layers\custom_activations.py (lines 49-60)
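A usage sketch assuming the import path shown above. Since the exact pairing convention (adjacent elements vs. split halves) is an implementation detail, the check below relies only on the fact that MaxMin reorders values within pairs:

```python
import torch
from orthogonium.layers.custom_activations import MaxMin

act = MaxMin(axis=1)
x = torch.randn(4, 8)  # the size along `axis` should be even
y = act(x)
print(y.shape)  # shape is preserved: torch.Size([4, 8])

# MaxMin only reorders values within each pair, so the sorted values
# along `axis` are unchanged.
print(torch.allclose(x.sort(dim=1).values, y.sort(dim=1).values))  # True
```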
SoftHuber
Bases: Module
Source code in orthogonium\layers\custom_activations.py
27 28 29 30 31 32 33 34 35 36 37 38 39 40 41 42 43 44 45 |
|
__init__(delta=0.05)
Initializes the SoftHuber class. This class implements the Soft Huber function, a differentiable approximation of the Huber loss: it behaves like abs(x) when the absolute error is large and like x**2 when it is small. The transition between these two regimes is controlled by the delta parameter.
Parameters:

Name | Type | Description | Default
---|---|---|---
`delta` | `float` | The threshold at which to switch between L1 and L2 loss. | `0.05`
Source code in orthogonium\layers\custom_activations.py (lines 28-41)
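A usage sketch assuming the import path shown above. The exact formula is not reproduced here (a common choice for such a smooth Huber is the pseudo-Huber function `delta**2 * (sqrt(1 + (x/delta)**2) - 1)`, but that is an assumption); the sketch only exercises the documented asymptotic behavior:

```python
import torch
from orthogonium.layers.custom_activations import SoftHuber

act = SoftHuber(delta=0.05)
x = torch.tensor([-2.0, -0.01, 0.0, 0.01, 2.0])
y = act(x)
# For |x| >> delta the output grows roughly like |x|;
# for |x| << delta it behaves like a quadratic near zero.
print(y)
```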