Over the past two decades, feedforward neural network (FNN) optimization has been a key interest among researchers and practitioners across multiple disciplines. FNN optimization is often viewed from several perspectives: the optimization of weights, network architecture, activation nodes, learning parameters, the learning environment, and so on. Researchers adopted these different viewpoints mainly to improve the FNN's generalization ability. Gradient-descent algorithms such as backpropagation have been widely applied to optimize FNNs, and their success is evident from the FNN's application to numerous real-world problems.
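The kind of gradient-descent training the paragraph refers to can be sketched as follows. This is a minimal illustration, not any specific method from the text: the one-hidden-layer architecture, the XOR task, the learning rate, and the iteration count are all illustrative assumptions.

```python
import numpy as np

# Minimal sketch: a one-hidden-layer FNN trained by plain gradient
# descent (backpropagation) on XOR. Network size, learning rate, and
# task are illustrative choices, not taken from the text.

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1 = rng.normal(0, 1, (2, 8)); b1 = np.zeros((1, 8))
W2 = rng.normal(0, 1, (8, 1)); b2 = np.zeros((1, 1))

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 0.5
losses = []
for _ in range(5000):
    # Forward pass through the two layers
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    losses.append(float(np.mean((out - y) ** 2)))
    # Backward pass: gradients of the mean squared error
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    # Gradient-descent weight updates
    W2 -= lr * h.T @ d_out; b2 -= lr * d_out.sum(0, keepdims=True)
    W1 -= lr * X.T @ d_h;   b1 -= lr * d_h.sum(0, keepdims=True)

print(f"loss: {losses[0]:.4f} -> {losses[-1]:.4f}")
```

Even this toy run shows the point the paragraph makes: iterating the forward pass, error backpropagation, and weight update steadily drives the training loss down.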