200 IEEE TRANSACTIONS ON FUZZY SYSTEMS, VOL. 9, NO. 1, FEBRUARY 2001
Fuzzy Wavelet Networks for Function Learning
Daniel W. C. Ho, Ping-An Zhang, and Jinhua Xu
Abstract—Inspired by the theory of multiresolution analysis (MRA) of wavelet transforms and fuzzy concepts, a fuzzy wavelet network (FWN) is proposed in this paper for approximating arbitrary nonlinear functions. The FWN consists of a set of fuzzy rules, where each rule corresponds to a sub-wavelet neural network (WNN) composed of single-scaling wavelets. Through efficient basis selection, the dimension of the approximated function does not become a bottleneck in constructing the FWN. In particular, by learning the translation parameters of the wavelets and adjusting the shape of the membership functions, the model accuracy and the generalization capability of the FWN can be remarkably improved. Furthermore, an algorithm for constructing and training the fuzzy wavelet networks is proposed. Simulation examples are also given to illustrate the effectiveness of the method.

Index Terms—Fuzzy wavelet networks, wavelet neural networks, wavelet transforms.

…the number of wavelet candidates would drastically increase with the dimension. Therefore, constructing and storing wavelet bases/frames for large-dimension problems is of prohibitive cost. In [13], a genetic algorithm (GA) combined with a steepest descent technique and a least squares technique has been used for optimal selection of the basis of the wavelet networks. However, the greedy nature of the GA restricted the development of the method. In [1], an algorithm for wavelet networks has been constructed by using radial wavelet frames with a natural single-scaling characteristic. In that work, attention is paid to sparse training data so that problems of large dimension can be better handled. Obviously, to improve the approximation accuracy, a large number of wavelet neurons is required for a WNN with fixed wavelet bases. This will result in a complicated network structure and cause overfitting problems.
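The fixed-basis case discussed above can be illustrated with a minimal sketch. This is not the FWN of this paper, but a generic one-dimensional WNN with Mexican-hat wavelets at a single, fixed scale: since the translations and dilation are frozen, only the output weights are trainable, and fitting reduces to linear least squares. The target function, the grid of 15 translations, and the scale 0.1 below are illustrative assumptions.

```python
import numpy as np

def mexican_hat(u):
    # Mexican-hat ("Ricker") mother wavelet: psi(u) = (1 - u^2) * exp(-u^2 / 2).
    return (1.0 - u**2) * np.exp(-0.5 * u**2)

def wnn_design_matrix(x, translations, dilation):
    # Column k holds psi((x - b_k) / a) evaluated at all sample points.
    return mexican_hat((x[:, None] - translations[None, :]) / dilation)

def wnn_fit(x, y, translations, dilation):
    # Fixed-basis WNN: f_hat(x) = sum_k w_k * psi((x - b_k) / a).
    # The wavelet parameters are fixed, so only the weights w_k are
    # unknown and the fit is ordinary linear least squares.
    Phi = wnn_design_matrix(x, translations, dilation)
    w, *_ = np.linalg.lstsq(Phi, y, rcond=None)
    return w

def wnn_predict(x, w, translations, dilation):
    return wnn_design_matrix(x, translations, dilation) @ w

# Illustrative target: a smooth nonlinear 1-D function on [0, 1].
x = np.linspace(0.0, 1.0, 200)
y = np.sin(2 * np.pi * x) * np.exp(-x)

# Fixed grid of translations at a single scale ("single-scaling" wavelets).
b = np.linspace(0.0, 1.0, 15)
a = 0.1

w = wnn_fit(x, y, b, a)
err = np.max(np.abs(wnn_predict(x, w, b, a) - y))
print(f"max approximation error with {b.size} fixed neurons: {err:.4f}")
```

Shrinking the grid spacing (more neurons) drives the error down, which is exactly the trade-off noted above: with fixed bases, accuracy is bought by enlarging the network, at the risk of overfitting.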
For WNN with variable wav