
Abstract: The number of hidden neurons is a key constant for tuning a neural network to achieve superior performance. These parameters are usually set manually through experimentation, and the performance of the network is evaluated repeatedly to choose the best input parameters. Random selection of hidden neurons may cause underfitting or overfitting of the network. We propose a novel fuzzy controller for finding the optimal number of hidden neurons automatically. The hybrid classifier helps to design a competent neural network architecture, eliminating manual intervention for setting the input parameters. The effectiveness of tuning the number of hidden neurons automatically on the convergence of a back-propagation neural network is verified on speech data. The experimental outcomes demonstrate that the proposed Neuro-Fuzzy classifier can be effectively used for speech recognition with maximum classification accuracy.


1. INTRODUCTION
The Backpropagation Neural Network (BPNN) classifier permits more complex, non-linear relationships between input data and output results [1]. The structure of a BPNN classifier is affected by factors such as the size of the training set, the Input Layer (IL) size, the Hidden Layer (HL) size, the activation function used for learning, and the Output Layer (OL) size. Of these factors, this work deals with the number of Hidden Neurons (HN). Some empirically derived rules of thumb exist for setting the number of hidden neurons. Each hidden neuron added increases the number of weights, contributing to increased training time and impeding generalisation. Conventionally, the number of hidden neurons is increased gradually with repetitive training of the network while the Mean Squared Error (MSE) is monitored. This work proposes a technique to estimate the optimal number of hidden neurons using a fuzzy controller based on the MSE. The fuzzy controller removes the overhead of training the network repeatedly to find the optimal number of hidden neurons, and it avoids the need to check the testing accuracy iteratively [2,31]. The proposed Neuro-Fuzzy classifier is tested for accurate speech recognition. This paper is arranged as follows: Section 2 discusses related work, Section 3 describes the proposed methodology, Section 4 discusses the results, and the conclusion is given in Section 5.
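The conventional grow-and-retrain procedure described above can be sketched as follows. Here `train_mse` is a hypothetical stand-in for actually training the BPNN and returning its MSE; the toy convex error curve (with a minimum at 37 neurons) is an illustrative assumption, not the paper's data:

```python
import random

def train_mse(n_hidden, rng):
    # Hypothetical stand-in for training a BPNN and reporting its MSE:
    # a toy convex error curve plus small noise, mimicking under- and
    # over-fitting. Real network training would replace this function.
    return (n_hidden - 37) ** 2 / 1000.0 + 0.001 + rng.uniform(0.0, 1e-4)

def trial_and_error_search(max_hidden=100, seed=0):
    # Conventional search: grow the hidden layer one neuron at a time,
    # retrain each time, and keep the size with the lowest observed MSE.
    rng = random.Random(seed)
    best_hn, best_mse = 1, float("inf")
    for hn in range(1, max_hidden + 1):
        mse = train_mse(hn, rng)
        if mse < best_mse:
            best_hn, best_mse = hn, mse
    return best_hn, best_mse
```

The cost is one full training run per candidate size, which is exactly the overhead the fuzzy controller proposed here is meant to remove.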

2. RELATED WORK
Many applications of neural networks test a series of systems with different numbers of hidden neurons and compute the resulting MSE. In [3], Li proposed a monotone index-based method to estimate the number of hidden neurons in a three-layer feedforward network. The method requires a large dataset and does not support multiple inputs or outputs. Yuan et al. [4] decided the hidden layer size based on information entropy and a decision tree algorithm. Initially, a neural network with a random number of hidden neurons is trained on a set of training samples; then the activation values of the hidden neurons and the information gain are calculated. A decision tree is used to avoid both the problems associated with too many neurons and the shortage of capacity caused by too few. Boger et al. [5] suggested initialising the number of hidden neurons to the number of input neurons; if this does not produce sufficient accuracy, the number of output neurons is added to it. Researchers have proposed further rules of thumb for choosing the size of the HL: the number of hidden neurons ought to be less than twice the number of IL neurons, and the size of the HL should lie between that of the IL and the OL [6,7]. Estimation of the signal-to-noise ratio is used to optimise the number of hidden neurons while avoiding overfitting in function approximation [8]. That criterion uses a small number of computations and can also be applied to other parameters of the network, such as the number of iterations, based on the training error itself; the technique works well for small datasets. A review of methods to fix the number of hidden neurons is given in [9], where the authors also propose a new technique. They examined 101 different hidden-neuron fixation criteria to estimate the error of the neural network, and the criterion with the minimum estimated error is used to train and test the network.
The selected criterion is (4n² + 3)/(n² − 8), where n is the number of input parameters. Panchal et al. [10] have reviewed various approaches to finding the optimal number of hidden neurons, including the trial-and-error method with repeated attempts, the rule-of-thumb method where predefined rules decide the size of the HL, and a simple method where the number of hidden neurons is matched to the number of input and output nodes. The authors experimented with the back-propagation and conjugate gradient methods, increasing the number of HLs and the number of hidden neurons simultaneously; the combination that yields the minimum MSE is used for experimentation.
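The selected criterion is a direct formula and can be evaluated in a few lines; note that it is only defined for n² > 8:

```python
def hidden_neurons_criterion(n):
    # Criterion selected in [9]: Nh = (4*n^2 + 3) / (n^2 - 8),
    # where n is the number of input parameters (requires n^2 > 8).
    return (4 * n**2 + 3) / (n**2 - 8)
```

For example, n = 3 gives (36 + 3)/(9 − 8) = 39 hidden neurons, while for large n the value approaches 4; in practice the result would be rounded to an integer layer size.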

3. METHODOLOGY
BPNN is a type of gradient-descent algorithm [25]. A conventional BPNN implementation requires the number of hidden neurons to be fed as an input; if the value entered is too large, the network enters a saturation state [29,30]. The proposed classifier overcomes these drawbacks by auto-tuning the number of hidden neurons. The fuzzy controller determines whether the number of hidden neurons should be increased or decreased. The problem is framed in linguistic terms: "if the MSE at the output of one iteration is too high, then the number of hidden neurons is reduced," and conversely, "if the MSE at the output of one iteration is too low, then the number of hidden neurons is increased." Such rules imply a dead zone between 'high' and 'low' in which the number of hidden neurons is "about right" for the network. The question is what counts as 'high', 'low', and 'about right' for the network; fuzzy logic answers this using membership functions. ' ' represents the activation value of each cell.
For the first iteration, the number of hidden neurons is set equal to the number of output neurons. The fuzzy membership function gives the corresponding degree of membership for the computed MSE. The required change in the number of hidden neurons, ∆HN, is computed using (2). As shown in (3), ∆HN is added to the number of hidden neurons of the current iteration to obtain the value for the next iteration. The scaling factor in (2) is a tunable scalar parameter, set to 10 empirically.

The above two steps are repeated until ∆HN becomes zero, and the final value is taken as the optimal number of hidden neurons. The experiments on the speech datasets give the highest results compared to state-of-the-art approaches.

4. RESULTS
The strength of a neural network is its ability to generalise and make proper predictions for unseen situations [26]. Backward error propagation is the most popular learning algorithm for artificial neural networks, in which an error at the output is corrected by back-propagating it [24]. Existing methods determine the number of hidden neurons through several trials: they start with an undersized hidden layer and then add neurons to it. This approach is time-consuming and does not guarantee that the number of hidden neurons is fixed at its optimal value. The results of the speech-recognition experiments with the proposed hybrid Neuro-Fuzzy classifier are discussed in this section. Three different speech datasets are used to train and test the network and to analyse the performance of the proposed technique. The first is the Spoken Marathi Numerals Dataset (SMND) [11], our own recorded dataset. The others are the TIDIGITS dataset [12] and the Vowels Dataset (VD) [13]. Separate training and testing TIDIGITS datasets are available from the course 'Fundamentals of speech recognition'. The parameters of the three datasets are described in Table 1. Increasing the number of hidden neurons improves the performance of the neural network. Fig. 3 shows the effect of varying the number of hidden neurons on the training accuracy of the BPNN. The network is trained for 1000 iterations for all three datasets. The conventional BPNN is trained with different numbers of hidden neurons found by experimentation, while the fuzzy controller finds the optimal value for the proposed Neuro-Fuzzy classifier. The plot shows that, in terms of training accuracy, the proposed hidden-neuron tuning through the fuzzy controller converges faster and outperforms the conventional network during training. For all three datasets, maximum accuracy is achieved in the fewest iterations.
Thus, the fuzzy controller provides a near-optimal number of hidden neurons that gives higher training accuracy while converging faster than the conventional BPNN. The proposed Neuro-Fuzzy approach incorporates hidden-neuron tuning by the fuzzy controller to avoid trial-and-error runs and guarantees faster convergence. The optimal number of hidden neurons estimated by the fuzzy controller is 37 for both SMND and TIDIGITS, and 29 for VD. The BPNN is trained with these values to test the effectiveness of the proposed approach. For SMND, the training accuracy of the proposed technique is 94.32%, against 85.30% for the conventional network.

Table 4 lists some state-of-the-art hidden-neuron fixation criteria implemented for the TIDIGITS dataset, where 'n' is the number of inputs, 'o' is the number of outputs, and the remaining symbol denotes the number of training samples; 's' is a constant between 0 and 10. The minimum MSE is attained using the proposed fuzzy controller. The trained-network details generated by the Neuro-Fuzzy classifier are used for classifying speech data. The size of the testing dataset is given in Table 1. The performance evaluation of the Neuro-Fuzzy classifier is presented in Fig. 4. The performance criteria used are classification accuracy and precision: accuracy is the proportion of predictions that are correct, and precision is the proportion of positive predictions that are correct. The classification results show that the Neuro-Fuzzy classifier produces higher accuracy as well as precision than the conventional neural network classifier for the SMND and TIDIGITS datasets, while the performance of the two classifiers is comparable for VD. Thus, the Neuro-Fuzzy classifier allows training the network with fewer iterations, improving the testing performance. HN tuning by the fuzzy controller yields a training accuracy that rises faster and more smoothly than the conventional approach, and for all three datasets maximum accuracy is achieved in the fewest iterations. The fuzzy controller therefore provides a near-optimal number of HN that gives higher training accuracy while converging faster than the conventional BPNN. A further advantage is fast processing, as it avoids experimentation to fix the optimal number of HN. The highest accuracy provided by the proposed classifier is 99.03%, and the highest precision achieved is 94.73%.
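The two evaluation criteria can be stated directly in code. This is a minimal binary sketch of the definitions only; for the multi-class speech labels the paper's precision figure would presumably be computed per class and averaged, which is not shown here:

```python
def accuracy(y_true, y_pred):
    # Proportion of predictions that are correct.
    correct = sum(t == p for t, p in zip(y_true, y_pred))
    return correct / len(y_true)

def precision(y_true, y_pred, positive):
    # Proportion of predictions of the positive class that are correct.
    predicted = [t for t, p in zip(y_true, y_pred) if p == positive]
    if not predicted:
        return 0.0
    return sum(t == positive for t in predicted) / len(predicted)
```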

5. CONCLUSION
This research paper introduces a hybrid classifier capable of automatically tuning the number of hidden neurons to an optimal value. The network is shown to converge quickly, with a considerable reduction in MSE and in the number of iterations. The hybrid Neuro-Fuzzy classifier achieves the highest training accuracy of 99.27% with an MSE as low as 0.00013, ensuring an enhanced testing accuracy of 99.03% and a precision of 94.73% for speech recognition. For qualitative analysis, we plotted the convergence curves of the conventional BPNN and the proposed hybrid algorithm; the curves show that the technique converges faster while maintaining testing accuracy on the speech datasets. Although the algorithm is applied to speech recognition in this paper, it may be used effectively in other soft-computing applications. The work presented here can further be extended to other types of neural networks.