namespace mlp
// classes
template <typename t_FloatType,
          typename t_XDataType,
          unsigned int t_DdrWidth,
          unsigned int t_XDdrWidth,
          unsigned int t_aColMemWords = 1,
          unsigned int t_aRowMemWords = 1,
          unsigned int t_bColMemWords = 1,
          unsigned int t_maxWSize = 0,
          unsigned int t_maxBSize = 0>
class Fcn
relu
#include "mlp/activations.hpp"
template <typename t_DataType> t_DataType relu (t_DataType x)
relu (rectified linear unit) is a very common activation function in deep neural networks.
Parameters:
x | is the input value |
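The signature above gives only the interface. As a sketch, relu is conventionally defined as max(0, x); the following is a plausible reference implementation, not the shipped mlp/activations.hpp source, which targets HLS and may differ in detail:

```cpp
// Sketch of a relu function template (assumed implementation):
// returns x for positive inputs and 0 otherwise.
template <typename t_DataType>
t_DataType relu(t_DataType x) {
    return (x < t_DataType(0)) ? t_DataType(0) : x;
}
```

Because the comparison and select map to simple hardware, relu is especially cheap to implement in an FPGA datapath compared to the exponential-based activations below.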
sigmoid
#include "mlp/activations.hpp"
template <typename t_DataType> t_DataType sigmoid (t_DataType x)
The sigmoid function is a very common activation function in MLPs (multilayer perceptrons).
Parameters:
x | is the input value |
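The standard logistic sigmoid is 1 / (1 + exp(-x)). The sketch below is an assumed straightforward implementation of that formula; the actual library code may use a hardware-friendly approximation instead of std::exp:

```cpp
#include <cmath>

// Sketch of a sigmoid function template (assumed implementation):
// maps any real input into the open interval (0, 1), with sigmoid(0) == 0.5.
template <typename t_DataType>
t_DataType sigmoid(t_DataType x) {
    return t_DataType(1) / (t_DataType(1) + std::exp(-x));
}
```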
tansig
#include "mlp/activations.hpp"
template <typename t_DataType> t_DataType tansig (t_DataType x)
The tansig (hyperbolic tangent sigmoid) function is used as an activation function in some MLPs; it is mathematically equivalent to tanh.
Parameters:
x | is the input value |
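tansig is conventionally written as 2 / (1 + exp(-2x)) - 1, which equals tanh(x) but reuses the same exponential building block as sigmoid. The sketch below assumes that formulation; the shipped library code may differ:

```cpp
#include <cmath>

// Sketch of a tansig function template (assumed implementation):
// maps any real input into (-1, 1), with tansig(0) == 0.
template <typename t_DataType>
t_DataType tansig(t_DataType x) {
    return t_DataType(2) / (t_DataType(1) + std::exp(t_DataType(-2) * x))
           - t_DataType(1);
}
```

Expressing tansig via the logistic form rather than calling std::tanh directly lets an HLS flow share one exponential unit between sigmoid and tansig.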