NMSIS-NN  Version 1.0.2
NMSIS NN Software Library
Activation Functions

Perform activation layers, including ReLU (Rectified Linear Unit), sigmoid and tanh. More...

Functions

void riscv_nn_activations_direct_q15 (q15_t *data, uint16_t size, uint16_t int_width, riscv_nn_activation_type type)
 Q15 neural network activation function using direct table look-up. More...
 
void riscv_nn_activations_direct_q7 (q7_t *data, uint16_t size, uint16_t int_width, riscv_nn_activation_type type)
 Q7 neural network activation function using direct table look-up. More...
 
void riscv_relu6_s8 (q7_t *data, uint16_t size)
 s8 ReLU6 function. More...
 
void riscv_relu_q15 (q15_t *data, uint16_t size)
 Q15 ReLU function. More...
 
void riscv_relu_q7 (q7_t *data, uint16_t size)
 Q7 ReLU function. More...
 

Detailed Description

Perform activation layers, including ReLU (Rectified Linear Unit), sigmoid and tanh.

Function Documentation

◆ riscv_nn_activations_direct_q15()

void riscv_nn_activations_direct_q15 ( q15_t *  data,
uint16_t  size,
uint16_t  int_width,
riscv_nn_activation_type  type 
)

Q15 neural network activation function using direct table look-up.

Note
Refer to the header file for details.

◆ riscv_nn_activations_direct_q7()

void riscv_nn_activations_direct_q7 ( q7_t *  data,
uint16_t  size,
uint16_t  int_width,
riscv_nn_activation_type  type 
)

Q7 neural network activation function using direct table look-up.

Parameters
[in,out]  data       pointer to input
[in]      size       number of elements
[in]      int_width  bit-width of the integer part, assumed to be smaller than 3
[in]      type       type of activation function

This is the direct table look-up approach.

The integer part of the fixed-point representation is assumed to be at most 3 bits wide. A wider integer part would add little: after saturation, each of these activation functions produces effectively the same output.

◆ riscv_relu6_s8()

void riscv_relu6_s8 ( q7_t *  data,
uint16_t  size 
)

s8 ReLU6 function.

Parameters
[in,out]  data  pointer to input
[in]      size  number of elements

◆ riscv_relu_q15()

void riscv_relu_q15 ( q15_t *  data,
uint16_t  size 
)

Q15 ReLU function.

Parameters
[in,out]  data  pointer to input
[in]      size  number of elements

Optimized ReLU using saturating QSUB instructions.

◆ riscv_relu_q7()

void riscv_relu_q7 ( q7_t *  data,
uint16_t  size 
)

Q7 ReLU function.

Parameters
[in,out]  data  pointer to input
[in]      size  number of elements

Optimized ReLU using saturating QSUB instructions.