Show simple item record

dc.contributor.author: Naik, Bighnaraj
dc.contributor.author: Nayak, Janmenjoy
dc.contributor.author: Behera, H. S.
dc.contributor.author: Abraham, Ajith
dc.date.accessioned: 2016-03-30T12:36:44Z
dc.date.available: 2016-03-30T12:36:44Z
dc.date.issued: 2016
dc.identifier.citation: Neurocomputing. 2016, vol. 179, p. 69-87.
dc.identifier.issn: 0925-2312
dc.identifier.issn: 1872-8286
dc.identifier.uri: http://hdl.handle.net/10084/111416
dc.description.abstract: In the data classification process involving higher order ANNs, it’s a herculean task to determine the optimal ANN classification model due to non-linear nature of real world datasets. To add to the woe, it is tedious to adjust the set of weights of ANNs by using appropriate learning algorithm to obtain better classification accuracy. In this paper, an improved variant of harmony search (HS), called self-adaptive harmony search (SAHS) along with gradient descent learning is used with functional link artificial neural network (FLANN) for the task of classification in data mining. Using its past experiences, SAHS adjusts the harmonies according to the maximum and minimum values in the current harmony memory. The powerful combination of this unique strategy of SAHS and searching capabilities of gradient descent search is used to obtain optimal set of weights for FLANN. The proposed method (SAHS–FLANN) is implemented in MATLAB and the results are contrasted with other alternatives (FLANN, GA based FLANN, PSO based FLANN, HS based FLANN, improved HS based FLANN and TLBO based FLANN). To illustrate its effectiveness, SAHS–FLANN is tested on various benchmark datasets from UCI machine learning repository by using 5-fold cross validation technique. Under the null-hypothesis, the proposed method is analyzed by using various statistical tests for statistical correctness of results. The performance of the SAHS–FLANN is found to be better and statistically significant in comparison with other alternatives. The SAHS–FLANN differs from HS–FLANN (HS based FLANN) by the elimination of constant parameters (bandwidth and pitch adjustment rate). Furthermore, it leads to the simplification of steps for the improvisation of weight-sets in IHS–FLANN (improved HS based FLANN) by incorporating adjustments of new weight-sets according to the weight-sets with maximum and minimum fitness.
dc.language.iso: en
dc.publisher: Elsevier
dc.relation.ispartofseries: Neurocomputing
dc.relation.uri: http://dx.doi.org/10.1016/j.neucom.2015.11.051
dc.rights: Copyright © 2015 Elsevier B.V. All rights reserved.
dc.subject: Self adaptive harmony search
dc.subject: Gradient descent learning
dc.subject: Artificial neural network
dc.subject: Classification
dc.subject: Data mining
dc.subject: Machine intelligence
dc.subject: Functional link artificial neural network
dc.title: A self adaptive harmony search based functional link higher order ANN for non-linear data classification
dc.type: article
dc.identifier.doi: 10.1016/j.neucom.2015.11.051
dc.type.status: Peer-reviewed
dc.description.source: Web of Science
dc.description.volume: 179
dc.description.lastpage: 87
dc.description.firstpage: 69
dc.identifier.wos: 000370090300006
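
Note: the abstract above describes SAHS–FLANN only at a high level (the authors' own implementation is in MATLAB and is evaluated on UCI benchmark datasets with 5-fold cross validation). The Python sketch below is an illustration of that general idea, not the paper's method: the trigonometric expansion order, harmony memory size, HMCR value, the exact self-adaptive adjustment toward the memory's maximum and minimum, the gradient-descent refinement step, and the synthetic toy data are all assumptions made here to keep the example runnable.

# Hypothetical sketch: self-adaptive harmony search (SAHS) optimizing the weights of a
# functional link ANN (FLANN) for binary classification, loosely following the abstract
# above. Expansion order, HMCR, memory size, the adjustment rule and the gradient
# refinement pairing are assumptions, not the authors' MATLAB implementation.
import numpy as np

rng = np.random.default_rng(0)

def expand(X, order=2):
    """Trigonometric functional expansion commonly used in FLANN variants (assumed)."""
    parts = [X]
    for k in range(1, order + 1):
        parts.append(np.sin(k * np.pi * X))
        parts.append(np.cos(k * np.pi * X))
    return np.hstack(parts + [np.ones((X.shape[0], 1))])  # bias column

def predict(w, Phi):
    return 1.0 / (1.0 + np.exp(-Phi @ w))           # sigmoid output node

def loss(w, Phi, y):
    return np.mean((y - predict(w, Phi)) ** 2)      # mean squared error as fitness

def gradient_step(w, Phi, y, lr=0.1):
    """One gradient-descent refinement of a candidate weight-set (assumed pairing)."""
    p = predict(w, Phi)
    grad = -2.0 * Phi.T @ ((y - p) * p * (1 - p)) / len(y)
    return w - lr * grad

def sahs_flann(Phi, y, hms=10, hmcr=0.9, iters=2000):
    dim = Phi.shape[1]
    hm = rng.uniform(-1, 1, size=(hms, dim))        # harmony memory of weight-sets
    fit = np.array([loss(w, Phi, y) for w in hm])
    for _ in range(iters):
        lo, hi = hm.min(axis=0), hm.max(axis=0)     # current per-dimension bounds
        new = np.empty(dim)
        for j in range(dim):
            if rng.random() < hmcr:                 # memory consideration
                v = hm[rng.integers(hms), j]
                # self-adaptive pitch adjustment: nudge toward the current max or min
                # of the memory instead of using a fixed bandwidth/PAR (assumed rule)
                if rng.random() < 0.5:
                    v += rng.random() * (hi[j] - v)
                else:
                    v -= rng.random() * (v - lo[j])
                new[j] = v
            else:
                new[j] = rng.uniform(-1, 1)         # random consideration
        new = gradient_step(new, Phi, y)            # local gradient refinement
        f = loss(new, Phi, y)
        worst = np.argmax(fit)
        if f < fit[worst]:                          # replace the worst harmony
            hm[worst], fit[worst] = new, f
    return hm[np.argmin(fit)]

# Toy usage on synthetic data (the paper itself reports UCI benchmarks with 5-fold CV).
X = rng.uniform(-1, 1, size=(200, 2))
y = (X[:, 0] * X[:, 1] > 0).astype(float)           # a simple non-linear target
Phi = expand(X)
w = sahs_flann(Phi, y)
acc = np.mean((predict(w, Phi) > 0.5) == y)
print(f"training accuracy of the sketch: {acc:.2f}")

The point the abstract emphasizes, eliminating the constant bandwidth and pitch adjustment rate, corresponds to the improvisation step of this sketch: each value drawn from memory is nudged toward the current per-dimension maximum or minimum of the harmony memory rather than perturbed by a fixed amount.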


Files in this item


There are no files associated with this item.

This item appears in the following collection(s)
