In this study we present a number of schemes for generating elementary functions for hardwired neural-network emulators. We propose hardware designs for the sigmoid and logarithm functions based on a hybrid approach that combines lookup tables with direct computation. The proposed designs outperform existing schemes in speed and hardware requirements while achieving accuracy on the order of the IEEE double-precision floating-point format. Additionally, ten elementary functions commonly used in the neural-network paradigm have been designed using first- and second-degree piecewise approximations. These functions attain precision on the order of $2^{-10}$ with inexpensive hardware. The elementary functions are: the sigmoid and its derivative, the logarithm, the sine and cosine, the exponential, the hyperbolic tangent, the square root, the inverse, and the inverse square. The proposed design approaches outperform existing schemes in performance, hardware cost, and precision. It is shown that one of the proposed designs can accommodate all ten elementary functions without loss of performance. Furthermore, the function generator can be made programmable, which extends the computations to other elementary functions with no penalty in performance, hardware cost, or additional design effort. These features make our low-precision schemes suitable for neural-network emulators that require "moderate" precision when computing elementary functions; their inclusion allows such emulators to achieve high-performance computation at low hardware cost.
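To illustrate the first-degree (piecewise-linear) approximation idea in software, the sketch below tabulates per-segment slopes and intercepts for the sigmoid and evaluates the approximation with a single table lookup plus one multiply-add, much as a hardware generator would. The input range, segment count, and saturation behavior here are illustrative assumptions, not the parameters of the designs described above.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# Illustrative parameters: 64 uniform segments over [-8, 8).
LO, HI, SEGMENTS = -8.0, 8.0, 64
WIDTH = (HI - LO) / SEGMENTS

# Each table entry holds (slope, intercept) obtained by linearly
# interpolating the sigmoid between the segment endpoints.
TABLE = []
for i in range(SEGMENTS):
    x0 = LO + i * WIDTH
    y0, y1 = sigmoid(x0), sigmoid(x0 + WIDTH)
    slope = (y1 - y0) / WIDTH
    TABLE.append((slope, y0 - slope * x0))

def sigmoid_pwl(x):
    # Saturate outside the tabulated range, as a hardware unit would.
    if x <= LO:
        return sigmoid(LO)
    if x >= HI:
        return sigmoid(HI)
    slope, intercept = TABLE[int((x - LO) // WIDTH)]
    return slope * x + intercept

# Worst-case error over a fine grid stays on the order of 2^-10.
err = max(abs(sigmoid_pwl(k / 100.0) - sigmoid(k / 100.0))
          for k in range(-800, 801))
```

With these parameters the interpolation error bound is roughly $h^2 \max|f''|/8 \approx 7.5\times10^{-4}$, i.e. below $2^{-10}$, which is consistent with the "moderate" precision target; a second-degree scheme would reach the same precision with far fewer segments.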