A weight initialization method based on neural network with asymmetric activation function

Liu, J., Liu, Y. and Zhang, Q. (2022) A weight initialization method based on neural network with asymmetric activation function.

Full text not available from this repository.
Item Type: Article
Additional Information: Weight initialization of neural networks has an important influence on the learning process, and the selection of initial weights is related to the activation interval of the activation function. An improved and extended weight initialization method is proposed for neural networks with asymmetric activation functions, called 'GLIT' (generalized LIT), as an extension of the linear interval tolerance (LIT) method; it is more suitable for higher-dimensional inputs. The purpose is to expand the selection range of the activation function so that the input falls in the unsaturated region, thereby improving the performance of the network. A tolerance solution theorem based on the neural network system is then given and proved, and an algorithm is given for determining the initial weight interval. The validity of the theorem and the algorithm is verified by numerical experiments. Under the GLIT method, the input falls into any preset interval in the sense of probability. In this sense, the GLIT method could provide a theoretical basis for further study of neural networks.
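The abstract does not spell out the GLIT algorithm itself, so the following Python snippet is only a minimal sketch of the general idea it describes: choosing weights and biases so that pre-activations fall inside a preset unsaturated interval of the activation function. The function name, the box-shaped input bounds, and the rescaling heuristic are all assumptions for illustration, not the method from the paper.

    import numpy as np

    def interval_tolerance_init(n_in, n_out, x_low, x_high, act_low, act_high, rng=None):
        """Illustrative interval-constrained initialization (LIT-style sketch).

        Weights and bias of each unit are scaled so that the pre-activation
        w.x + b stays inside the target unsaturated interval [act_low, act_high]
        for every input with x_low <= x <= x_high (element-wise).
        """
        rng = np.random.default_rng() if rng is None else rng
        x_low = np.asarray(x_low, dtype=float)
        x_high = np.asarray(x_high, dtype=float)

        # Center and half-width of the target pre-activation interval.
        center = 0.5 * (act_low + act_high)
        half_width = 0.5 * (act_high - act_low)

        # Draw a random direction per unit, then rescale so the worst-case
        # deviation of w.(x - x_center) over the input box never exceeds half_width.
        W = rng.uniform(-1.0, 1.0, size=(n_out, n_in))
        x_center = 0.5 * (x_low + x_high)
        x_half = 0.5 * (x_high - x_low)
        worst = np.abs(W) @ x_half            # max |w.(x - x_center)| over the box
        worst = np.maximum(worst, 1e-12)      # guard against division by zero
        W = W * (half_width / worst)[:, None]

        # Bias re-centers the pre-activation on the middle of the target interval.
        b = center - W @ x_center
        return W, b

For example, with inputs known to lie in [0, 1] per dimension and an asymmetric activation whose unsaturated region is roughly [-1, 3], calling interval_tolerance_init(784, 128, np.zeros(784), np.ones(784), -1.0, 3.0) guarantees every pre-activation of the first layer starts inside that interval; how GLIT chooses and generalizes the interval for higher-dimensional inputs is detailed in the paper, not here.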
Depositing User: RED Unit Admin
Date Deposited: 18 Dec 2024 12:06
Last Modified: 18 Dec 2024 12:06
URI: https://bnu.repository.guildhe.ac.uk/id/eprint/19541
