All Classes and Interfaces

- Implements common functionality for all batch normalization implementations for Tensor_F64.
- Base class which implements common functionality between all DFunction.
- Interface of ActivationReLU which adds functionality for the backwards step.
- Implementation of DActivationReLU for Tensor_F64.
- Interface of ActivationSigmoid which adds functionality for the backwards step.
- Implementation of DActivationSigmoid for Tensor_F64.
- Interface of ActivationTanH which adds functionality for the backwards step.
- Implementation of DActivationTanH for Tensor_F64.
- Implementation of batch normalization for training networks.
- Backwards implementation of ClippedPadding2D_F64.
- Backwards implementation of ConstantPadding2D_F64.
- Implementation of Batch Normalization for training networks.
- Implementation of DFunctionBatchNorm for Tensor_F64.
- Drop out is a technique introduced by [1] for regularizing a network and helps prevent overfitting.
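The drop-out entry above can be illustrated with a minimal sketch of the standard "inverted dropout" scheme. This is not the library's DFunctionDropOut API; the class name, the use of plain `double[]` arrays in place of Tensor_F64, and the method signatures are all illustrative assumptions. During training each activation is zeroed with probability p and the survivors are scaled by 1/(1-p), so no rescaling is needed at test time; the backwards step routes the gradient through the same mask with the same scale.

```java
import java.util.Arrays;
import java.util.Random;

/** Hypothetical inverted-dropout sketch (not the library's DFunctionDropOut). */
public class DropoutSketch {

    /** Samples a keep/drop mask: each unit is kept with probability 1 - dropProbability. */
    public static boolean[] sampleMask(int n, double dropProbability, Random rand) {
        boolean[] keep = new boolean[n];
        for (int i = 0; i < n; i++)
            keep[i] = rand.nextDouble() >= dropProbability;
        return keep;
    }

    /** Forward pass: zero the dropped units, scale kept units by 1/(1-p). */
    public static double[] forward(double[] input, boolean[] keep, double dropProbability) {
        double scale = 1.0 / (1.0 - dropProbability);
        double[] output = new double[input.length];
        for (int i = 0; i < input.length; i++)
            output[i] = keep[i] ? input[i] * scale : 0.0;
        return output;
    }

    /** Backwards step: the gradient flows only through kept units, with the same scale. */
    public static double[] backward(double[] dOutput, boolean[] keep, double dropProbability) {
        double scale = 1.0 / (1.0 - dropProbability);
        double[] dInput = new double[dOutput.length];
        for (int i = 0; i < dOutput.length; i++)
            dInput[i] = keep[i] ? dOutput[i] * scale : 0.0;
        return dInput;
    }

    public static void main(String[] args) {
        double[] x = {1.0, 2.0, 3.0, 4.0};
        boolean[] keep = {true, false, true, true};
        // With p = 0.5 kept units are doubled and the dropped unit becomes zero
        System.out.println(Arrays.toString(forward(x, keep, 0.5))); // prints [2.0, 0.0, 6.0, 8.0]
    }
}
```

Because the scaling happens during training, the forward pass at inference time is simply the identity, which is why the interface only needs the mask during the learning phase.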
- Implementation of DFunctionDropOut for Tensor_F64.
- Interface of FunctionLinear which adds functionality for the backwards step.
- Implementation of DFunctionLinear for Tensor_F64.
- Interface of Spatial Batch Normalization for training networks.
- Implementation of DSpatialBatchNorm for Tensor_F64.
- Implementation of Spatial Convolve 2D for training networks.
- Implementation of DSpatialConvolve2D for Tensor_F64.
- Interface of SpatialMaxPooling which adds functionality for the backwards step.
- Interface for computing the gradient of a padded spatial tensor.
- Backwards functions for operations which convolve a window across the input spatial tensor and process the image in a BCHW (batch, channel, (row, column)) order, e.g. …
- Backwards functions for operations which convolve a window across the input spatial tensor.
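The BCHW ordering named in the entries above determines how a four-dimensional spatial tensor is laid out and traversed: batch outermost, then channel, then row, then column. A small sketch of flat BCHW indexing makes the ordering concrete; the class and method names here are illustrative, not the library's.

```java
/** Hypothetical sketch of BCHW (batch, channel, row, column) flat indexing. */
public class BchwIndexSketch {

    /**
     * Flat index of element (b, c, r, col) in a tensor with C channels,
     * H rows, and W columns stored in BCHW order: the column index varies
     * fastest and the batch index slowest.
     */
    public static int index(int b, int c, int r, int col, int C, int H, int W) {
        return ((b * C + c) * H + r) * W + col;
    }

    public static void main(String[] args) {
        // batch=2, channels=3, height=4, width=5 -> 120 elements total;
        // the last element (1, 2, 3, 4) lands at flat index 119.
        System.out.println(index(1, 2, 3, 4, 3, 4, 5)); // prints 119
    }
}
```

Iterating b, then c, then r, then col in nested loops therefore walks the flat array sequentially, which is why windowed backwards operations process the image in this order.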
- Base class for element-wise derivative functions.
- Given a …
- Function implementations of this interface will compute the gradient of its inputs and parameters.
- Implementation of NumericalGradient for Tensor_F64.
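A numerical gradient of the kind the last entries describe is typically computed with central differences and used to unit-test an analytic backwards implementation. The sketch below is a generic version on `double[]`, not the library's NumericalGradient API; the class name and signatures are assumptions for illustration.

```java
import java.util.function.Function;

/** Hypothetical central-difference gradient sketch (not the library's NumericalGradient). */
public class NumericalGradientSketch {

    /**
     * Approximates the gradient of f at x by perturbing one input at a time:
     * df/dx_i ~ (f(x + h*e_i) - f(x - h*e_i)) / (2h).
     */
    public static double[] gradient(Function<double[], Double> f, double[] x, double h) {
        double[] grad = new double[x.length];
        for (int i = 0; i < x.length; i++) {
            double original = x[i];
            x[i] = original + h;
            double fPlus = f.apply(x);
            x[i] = original - h;
            double fMinus = f.apply(x);
            x[i] = original; // restore the input before the next perturbation
            grad[i] = (fPlus - fMinus) / (2.0 * h);
        }
        return grad;
    }

    public static void main(String[] args) {
        // f(x) = x0^2 + 3*x1 has analytic gradient (2*x0, 3); at (2, 5) that is (4, 3)
        Function<double[], Double> f = v -> v[0] * v[0] + 3.0 * v[1];
        double[] g = gradient(f, new double[]{2.0, 5.0}, 1e-5);
        System.out.printf("%.4f %.4f%n", g[0], g[1]); // approximately 4.0000 3.0000
    }
}
```

Comparing such a numerical estimate against the output of a backwards implementation, within a small tolerance, is the standard way to validate hand-derived gradients.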