# Softmax Functions

void `riscv_softmax_q15`(const q15_t *vec_in, const uint16_t dim_vec, q15_t *p_out)
void `riscv_softmax_q7`(const q7_t *vec_in, const uint16_t dim_vec, q7_t *p_out)
group `Softmax`

Softmax functions based on a 2-based exponent (exp2) rather than the natural exponent.

Functions

void `riscv_softmax_q15`(const q15_t *vec_in, const uint16_t dim_vec, q15_t *p_out)

Q15 softmax function.

Here, instead of the typical e-based softmax, we use a 2-based softmax, i.e.:

y_i = 2^(x_i) / sum(2^x_j)

The relative output will be different here. But mathematically, the gradient will be the same, up to a log(2) scaling factor.

Return

none.

Parameters
• `[in] vec_in`: pointer to input vector

• `[in] dim_vec`: input vector dimension

• `[out] p_out`: pointer to output vector

void `riscv_softmax_q7`(const q7_t *vec_in, const uint16_t dim_vec, q7_t *p_out)

Q7 softmax function.

Here, instead of the typical natural-exponent e-based softmax, we use a 2-based softmax, i.e.:

y_i = 2^(x_i) / sum(2^x_j)

The relative output will be different here. But mathematically, the gradient will be the same, up to a log(2) scaling factor.

Return

none.

Parameters
• `[in] vec_in`: pointer to input vector

• `[in] dim_vec`: input vector dimension

• `[out] p_out`: pointer to output vector