
dc.contributor.author: Ganaie, M. A.
dc.contributor.author: Tanveer, M.
dc.contributor.author: Suganthan, P. N.
dc.contributor.author: Snášel, Václav
dc.date.accessioned: 2022-10-03T10:40:39Z
dc.date.available: 2022-10-03T10:40:39Z
dc.date.issued: 2022
dc.identifier.citation: Neural Networks. 2022, vol. 153, p. 496-517.
dc.identifier.issn: 0893-6080
dc.identifier.issn: 1879-2782
dc.identifier.uri: http://hdl.handle.net/10084/148670
dc.description.abstract: Random Forest is an ensemble of decision trees based on the bagging and random subspace concepts. As suggested by Breiman, the strength of unstable learners and the diversity among them are the core strengths of ensemble models. In this paper, we propose two approaches, known as oblique and rotation double random forests. In the first approach, we propose a rotation-based double random forest, in which a transformation or rotation of the feature space is generated at each node. Since a different random feature subspace is chosen for evaluation at each node, the transformation at each node is different. Different transformations result in better diversity among the base learners and hence better generalization performance. With the double random forest as the base learner, the data at each node are transformed via two different transformations, namely principal component analysis and linear discriminant analysis. In the second approach, we propose an oblique double random forest. Decision trees in the random forest and double random forest are univariate, which results in axis-parallel splits that fail to capture the geometric structure of the data. Moreover, the standard random forest may not grow sufficiently large decision trees, resulting in suboptimal performance. To capture the geometric properties of the data and to grow decision trees of sufficient depth, we propose the oblique double random forest, whose models are multivariate decision trees. At each non-leaf node, a multisurface proximal support vector machine generates the optimal plane for better generalization performance. In addition, different regularization techniques (Tikhonov regularization, axis-parallel split regularization, and null-space regularization) are employed to tackle small-sample-size problems in the decision trees of the oblique double random forest.
The proposed ensembles of decision trees produce larger trees than the standard ensembles because bagging is used at each non-leaf node, which results in improved performance. The baseline models and the proposed oblique and rotation double random forest models are evaluated on 121 benchmark UCI datasets and real-world fisheries datasets. Both the statistical analysis and the experimental results demonstrate the efficacy of the proposed oblique and rotation double random forest models over the baseline models on the benchmark datasets.
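The rotation idea in the abstract can be illustrated with a minimal numpy sketch (this is not the authors' code; the subspace size, PCA-only rotation, and Gini criterion are illustrative assumptions): at a node, a random feature subspace is rotated via PCA, and the best axis-parallel threshold in the rotated space is an oblique split in the original space.

```python
import numpy as np

def gini(y):
    # Gini impurity of a label vector.
    _, counts = np.unique(y, return_counts=True)
    p = counts / counts.sum()
    return 1.0 - np.sum(p ** 2)

def rotation_split(X, y, subspace_size, rng):
    # Choose a random feature subspace for this node (random subspace concept).
    n, d = X.shape
    feats = rng.choice(d, size=min(subspace_size, d), replace=False)
    Xs = X[:, feats]
    # PCA rotation of the chosen subspace: center, then project onto
    # the principal directions (right singular vectors).
    Xc = Xs - Xs.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    Z = Xc @ Vt.T  # data expressed in the rotated coordinates
    # Best axis-parallel split in the rotated space = oblique split originally.
    best = (np.inf, None, None)  # (weighted child impurity, component, threshold)
    for j in range(Z.shape[1]):
        for t in np.unique(Z[:, j])[:-1]:
            left, right = y[Z[:, j] <= t], y[Z[:, j] > t]
            score = (len(left) * gini(left) + len(right) * gini(right)) / n
            if score < best[0]:
                best = (score, j, t)
    return feats, Vt, best

# Toy data with an oblique class boundary (x0 + x1 > 0).
rng = np.random.default_rng(0)
X = rng.normal(size=(40, 5))
y = (X[:, 0] + X[:, 1] > 0).astype(int)
feats, Vt, (score, comp, thresh) = rotation_split(X, y, subspace_size=3, rng=rng)
```

Because the subspace is re-drawn at every node, each node sees a different rotation, which is the source of the extra diversity the abstract attributes to the rotation-based variant.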
dc.language.iso: en
dc.publisher: Elsevier
dc.relation.ispartofseries: Neural Networks
dc.relation.uri: https://doi.org/10.1016/j.neunet.2022.06.012
dc.rights: © 2022 Elsevier Ltd. All rights reserved.
dc.subject: double random forest
dc.subject: oblique random forest
dc.subject: ensemble learning
dc.subject: bootstrap
dc.subject: decision tree classification
dc.title: Oblique and rotation double random forest
dc.type: article
dc.identifier.doi: 10.1016/j.neunet.2022.06.012
dc.type.status: Peer-reviewed
dc.description.source: Web of Science
dc.description.volume: 153
dc.description.lastpage: 517
dc.description.firstpage: 496
dc.identifier.wos: 000828089700009


Files in this item


There are no files associated with this item.

