NMSIS NN Software Library
Version 1.2.0

Convolutional Neural Network Example  
Gated Recurrent Unit Example  
Public  A collection of functions to perform basic operations for neural network layers. Functions with a _s8 suffix support the TensorFlow Lite framework 
Activation Functions  Perform activation layers, including ReLU (Rectified Linear Unit), sigmoid, and tanh 
GroupElementwise  
Concatenation Functions  
Convolution Functions  Collection of convolution and depthwise convolution functions and their variants 
GetBufferSizeNNConv  
Fully-connected Layer Functions  Collection of fully-connected and matrix multiplication functions 
GetBufferSizeFC  
LSTM Layer Functions  
Pooling Functions  Perform pooling functions, including max pooling and average pooling 
GetBufferSizePooling  
Reshape Functions  
Softmax Functions  
SVDF Functions  
Private  Perform data type conversion between neural network operations 
Convolution  Support functions for Convolution and DW Convolution 
LSTM  Support functions for LSTM 
Fully Connected  Support functions for Fully Connected 
Softmax  Support functions for Softmax 
Basic math functions  Elementwise add and multiplication functions 
Basic Math Functions for Neural Network Computation  
Copy  
Fill  
Nndata_convert  
SupportConversion 