Aksela M., Laaksonen J. 2006. Using diversity of errors for selecting members of a committee classifier. Pattern Recognition 39(4), 608–623.

Alaiz-Rodriguez R. 2008. Local decision bagging of binary neural classifiers. Lecture Notes in Artificial Intelligence 5032, 1–12.

Amores J., Sebe N., Radeva P. 2006. Boosting the distance estimation: application to the K-nearest neighbor classifier. Pattern Recognition Letters 27, 201–209.

Babenko B., Yang M. H., Belongie S. 2009. A family of online boosting algorithms. In IEEE 12th International Conference on Computer Vision Workshops, September 27–October 4, Kyoto, 1346–1353.

Bakker B., Heskes T. 2003. Clustering ensembles of neural network models. Neural Networks 16(2), 261–269.

Banfield R. E., Hall L. O., Bowyer K. W. 2007. A comparison of decision tree ensemble creation techniques. IEEE Transactions on Pattern Analysis and Machine Intelligence 29, 173–180.

Banfield R. E., Hall L. O., Bowyer K. W., Kegelmeyer W. P. 2005. Ensemble diversity measures and their application to thinning. Information Fusion 6(1), 49–62.

Blaszczynski J., Slowinski R., Stefanowski J. 2010. Variable consistency bagging ensembles. Lecture Notes in Computer Science 5946, 40–52.

Bauer E., Kohavi R. 1999. An empirical comparison of voting classification algorithms: bagging, boosting, and variants. Machine Learning 36, 105–139.

Bifet A., Holmes G., Pfahringer B. 2010a. Leveraging bagging for evolving data streams. Lecture Notes in Artificial Intelligence 6321, 135–150.

Bifet A., Holmes G., Kirkby R., Pfahringer B. 2010b. MOA: massive online analysis. Journal of Machine Learning Research 11, 1601–1604.

Bifet A., Holmes G., Pfahringer B., Kirkby R., Gavalda R. 2009a. New ensemble methods for evolving data streams. In 15th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, New York, 139–148.

Bifet A., Holmes G., Pfahringer B., Gavalda R. 2009b. Improving adaptive bagging methods for evolving data streams. Lecture Notes in Artificial Intelligence 5828, 23–37.

Bradley J. K., Schapire R. E. 2008. FilterBoost: regression and classification on large datasets. Advances in Neural Information Processing Systems 20, 185–192.

Breiman L. 1996. Bagging predictors. Machine Learning 24, 123–140.

Breiman L. 1999a. Pasting small votes for classification in large databases and on-line. Machine Learning 36(1–2), 85–103.

Breiman L. 1999b. Prediction games and arcing algorithms. Neural Computation 11(7), 1493–1517.

Breiman L. 2000. Randomizing outputs to increase prediction accuracy. Machine Learning 40, 229–242.

Breiman L. 2001. Random forests. Machine Learning 45(1), 5–32.

Brown G., Wyatt J., Harris R., Yao X. 2005. Diversity creation methods: a survey and categorisation. Information Fusion 6(1), 5–20.

Bryll R., Gutierrez-Osuna R., Quek F. 2003. Attribute bagging: improving accuracy of classifier ensembles by using random feature subsets. Pattern Recognition 36, 1291–1302.

Brzezinski D., Stefanowski J. 2013. Reacting to different types of concept drift: the accuracy updated ensemble algorithm. IEEE Transactions on Neural Networks and Learning Systems 24(7), in press.

Buhlmann P. 2012. Bagging, boosting and ensemble methods. In Handbook of Computational Statistics. Springer, 985–1022.

Buhlmann P., Yu B. 2002. Analyzing bagging. The Annals of Statistics 30(4), 927–961.

Buja A., Stuetzle W. 2006. Observations on bagging. Statistica Sinica 16, 323–351.

Cai Q.-T., Peng C.-Y., Zhang C.-S. 2008a. A weighted subspace approach for improving bagging performance. In IEEE International Conference on Acoustics, Speech and Signal Processing, March 31–April 4, Las Vegas, 3341–3344.

Cai Q.-T., Peng C.-Y., Zhang C.-S. 2008b. Cost-sensitive boosting algorithms as gradient descent. In IEEE International Conference on Acoustics, Speech and Signal Processing, March 31–April 4, Las Vegas, 2009–2012.

Chawla N. V., Hall L. O., Bowyer K. W., Kegelmeyer W. P. 2002. SMOTE: synthetic minority over-sampling technique. Journal of Artificial Intelligence Research 16, 321–357.

Chawla N. V., Lazarevic A., Hall L. O., Bowyer K. 2003. SMOTEBoost: improving prediction of the minority class in boosting. In Principles of Knowledge Discovery in Databases, PKDD-2003, 107–119.

Coelho A., Nascimento D. 2010. On the evolutionary design of heterogeneous bagging models. Neurocomputing 73(16–18), 3319–3322.

Croux C., Joossens K., Lemmens A. 2007. Trimmed bagging. Computational Statistics and Data Analysis 52, 362–368.

Derbeko P., El-Yaniv R., Meir R. 2002. Variance optimized bagging. Lecture Notes in Artificial Intelligence 2430, 60–72.

Dietterich T. 2000. An experimental comparison of three methods for constructing ensembles of decision trees: bagging, boosting, and randomization. Machine Learning 40, 139–157.

Domingos P. 2000. A unified bias-variance decomposition for zero-one and squared loss. In Seventeenth National Conference on Artificial Intelligence and Twelfth Conference on Innovative Applications of Artificial Intelligence, 564–569.

Elwell R., Polikar R. 2011. Incremental learning of concept drift in nonstationary environments. IEEE Transactions on Neural Networks 22(10), 1517–1531.

Esposito R., Saitta L. 2003. Monte Carlo theory as an explanation of bagging and boosting. In 18th International Joint Conference on Artificial Intelligence, IJCAI'03, Acapulco, Mexico, 499–504.

Faisal Z., Uddin M. M., Hirose H. 2010. On selecting additional predictive models in double bagging type ensemble method. Lecture Notes in Computer Science 6019, 199–208.

Fan W., Stolfo S. J., Zhang J., Chan P. K. 1999. AdaCost: misclassification cost-sensitive boosting. In 16th International Conference on Machine Learning, Slovenia, 97–105.

Frank E., Pfahringer B. 2006. Improving on bagging with input smearing. Lecture Notes in Artificial Intelligence 3918, 97–106.

Freund Y., Schapire R. E. 1996a. Experiments with a new boosting algorithm. In 13th International Conference on Machine Learning, Bari, Italy, 148–156.

Freund Y., Schapire R. E. 1996b. Game theory, on-line prediction and boosting. In Ninth Annual Conference on Computational Learning Theory, COLT '96, Desenzano sul Garda, Italy, 325–332.

Freund Y., Schapire R. E. 1997. A decision-theoretic generalization of on-line learning and an application to boosting. Journal of Computer and System Sciences 55(1), 119–139.

Friedman J. H., Hall P. 2007. On bagging and nonlinear estimation. Journal of Statistical Planning and Inference 137(3), 669–683.

Friedman J. H., Hastie T., Tibshirani R. 2000. Additive logistic regression: a statistical view of boosting. Annals of Statistics 28(2), 337–407.

Fu Q., Hu S. X., Zhao S. Y. 2005. Clustering-based selective neural network ensemble. Journal of Zhejiang University Science 6A(5), 387–392.

Fumera G., Roli F., Serrau A. 2005. Dynamics of variance reduction in bagging and other techniques based on randomisation. Lecture Notes in Computer Science 3541, 316–325.

Fumera G., Roli F., Serrau A. 2008. A theoretical analysis of bagging as a linear combination of classifiers. IEEE Transactions on Pattern Analysis and Machine Intelligence 30(7), 1293–1299.

Fushiki T. 2010. Bayesian bootstrap prediction. Journal of Statistical Planning and Inference 140, 65–74.

Galar M., Fernandez A., Barrenechea E., Bustince H., Herrera F. 2012. A review on ensembles for the class imbalance problem: bagging-, boosting-, and hybrid-based approaches. IEEE Transactions on Systems, Man, and Cybernetics, Part C: Applications and Reviews 42(4), 463–484.

Gama J. 2010. Knowledge Discovery from Data Streams. CRC Press.

Gambin A., Szczurek E., Dutkowski J., Bakun M., Dadlez M. 2009. Classification of peptide mass fingerprint data by novel no-regret boosting method. Computers in Biology and Medicine 39, 460–473.

Gao W., Zhou Z.-H. 2010. Approximation stability and boosting. Lecture Notes in Artificial Intelligence 6331, 59–73.

Gao Y., Gao F., Guan X. 2010. Improved boosting algorithm with adaptive filtration. In 8th World Congress on Intelligent Control and Automation, July 6–9, Jinan, China, 3173–3178.

Garcia-Pedrajas N. 2009. Supervised projection approach for boosting classifiers. Pattern Recognition 42, 1742–1760.

Garcia-Pedrajas N., Ortiz-Boyer D. 2008. Boosting random subspace method. Neural Networks 21, 1344–1362.

Garcia-Pedrajas N., Ortiz-Boyer D. 2009. Boosting k-nearest neighbor classifier by means of input space projection. Expert Systems with Applications 36, 10570–10582.

Gomez-Verdejo V., Ortega-Moral M., Arenas-Garcia J., Figueiras-Vidal A. 2006. Boosting by weighting critical and erroneous samples. Neurocomputing 69, 679–685.

Grabner H., Bischof H. 2006. On-line boosting and vision. In 2006 IEEE Computer Society Conference on Computer Vision and Pattern Recognition, June 17–22, 260–267.

Grandvalet Y. 2004. Bagging equalizes influence. Machine Learning 55, 251–270.

Hall L., Banfield R., Bowyer K., Kegelmeyer W. 2007. Boosting lite: handling larger datasets and slower base classifiers. Lecture Notes in Computer Science 4472, 161–170.

Hall P., Samworth R. J. 2005. Properties of bagged nearest neighbour classifiers. Journal of the Royal Statistical Society Series B 67(3), 363–379.

Hernandez-Lobato D., Martinez-Munoz G., Suarez A. 2007. Out of bootstrap estimation of generalization error curves in bagging ensembles. Lecture Notes in Computer Science 4881, 47–56.

Hido S., Kashima H., Takahashi Y. 2009. Roughly balanced bagging for imbalanced data. Statistical Analysis and Data Mining 2, 412–426.

Ho T. K. 1998. The random subspace method for constructing decision forests. IEEE Transactions on Pattern Analysis and Machine Intelligence 20(8), 832–844.

Hothorn T., Lausen B. 2003. Double-bagging: combining classifiers by bootstrap aggregation. Pattern Recognition 36(6), 1303–1309.

Hothorn T., Lausen B. 2005. Bundling classifiers by bagging trees. Computational Statistics and Data Analysis 49, 1068–1078.

Jiang Y., Ling J.-J., Li G., Dai H., Zhou Z.-H. 2005. Dependency bagging. Lecture Notes in Artificial Intelligence 3641, 491–500.

Jimenez-Gamero M. D., Munoz-Garcia J., Pino-Mejias R. 2004. Reduced bootstrap for the median. Statistica Sinica 14, 1179–1198.

Joshi M. V., Kumar V., Agarwal R. C. 2001. Evaluating boosting algorithms to classify rare classes: comparison and improvements. In IEEE International Conference on Data Mining, 257–264.

Kalai A., Servedio R. 2005. Boosting in the presence of noise. Journal of Computer and System Sciences 71, 266–290.

Khoshgoftaar T. M., Van Hulse J., Napolitano A. 2011. Comparing boosting and bagging techniques with noisy and imbalanced data. IEEE Transactions on Systems, Man and Cybernetics, Part A: Systems and Humans 41(3), 552–568.

Koco S., Capponi C. 2011. A boosting approach to multiview classification with cooperation. In European Conference on Machine Learning and Knowledge Discovery in Databases, ECML-PKDD'11, Athens, 209–228.

Kotsiantis S., Pintelas P. 2004. Combining bagging and boosting. International Journal of Computational Intelligence 1(4), 324–333.

Kotsiantis S. B., Kanellopoulos D. 2010. Bagging different instead of similar models for regression and classification problems. International Journal of Computer Applications in Technology 37(1), 20–28.

Kotsiantis S. B., Kanellopoulos D., Pintelas P. E. 2006. Local boosting of decision stumps for regression and classification problems. Journal of Computers 1(4), 30–37.

Kotsiantis S. B., Tsekouras G. E., Pintelas P. E. 2005. Local bagging of decision stumps. In 18th International Conference on Innovations in Applied Artificial Intelligence, Bari, Italy, 406–411.

Kuncheva L. I., Skurichina M., Duin R. P. W. 2002. An experimental study on diversity for bagging and boosting with linear classifiers. Information Fusion 3, 245–258.

Kuncheva L., Whitaker C. J. 2002. Using diversity with three variants of boosting: aggressive, conservative and inverse. Lecture Notes in Computer Science 2364, 81–90.

Kuncheva L. I., Whitaker C. J. 2003. Measures of diversity in classifier ensembles and their relationship with the ensemble accuracy. Machine Learning 51, 181–207.

Kuncheva L. 2004a. Combining Pattern Classifiers: Methods and Algorithms. Wiley.

Kuncheva L. 2004b. Classifier ensembles for changing environments. Lecture Notes in Computer Science 3077, 1–15.

Latinne P., Debeir O., Decaestecker Ch. 2000. Mixing bagging and multiple feature subsets to improve classification accuracy of decision tree combination. In 10th Belgian-Dutch Conference on Machine Learning, Tilburg University, 15–22.

Le D.-D., Satoh S. 2007. Ent-Boost: boosting using entropy measures for robust object detection. Pattern Recognition Letters 28, 1083–1090.

Lee H., Clyde M. A. 2004. Lossless online Bayesian bagging. Journal of Machine Learning Research 5, 143–151.

Leistner C., Saffari A., Roth P., Bischof H. 2009. On robustness of on-line boosting: a competitive study. In IEEE 12th International Conference on Computer Vision Workshops, Kyoto, Japan, 1362–1369.

Leskes B., Torenvliet L. 2008. The value of agreement, a new boosting algorithm. Journal of Computer and System Sciences 74, 557–586.

Li C. 2007. Classifying imbalanced data using a bagging ensemble variation (BEV). In 45th Annual Southeast Regional Conference, 203–208.

Li G.-Z., Yang J. Y. 2008. Feature selection for ensemble learning and its application. In Machine Learning in Bioinformatics, Zhang Y.-Q. & Rajapakse J. C. (eds). Wiley.

Li W., Gao X., Zhu Y., Ramesh V., Boult T. 2005. On the small sample performance of boosted classifiers. In IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR'05), June 20–25, 574–581.

Littlestone N., Warmuth M. K. 1994. The weighted majority algorithm. Information and Computation 108(2), 212–261.

Liu X., Yu T. 2007. Gradient feature selection for online boosting. In IEEE 11th International Conference on Computer Vision (ICCV 2007), Rio de Janeiro, Brazil, 1–8.

Lu Y., Tian Q., Huang T. 2007. Interactive boosting for image classification. Lecture Notes in Computer Science 4472, 180–189.

Markowitz H. 1959. Portfolio Selection: Efficient Diversification of Investments. Yale University Press.

Martinez-Munoz G., Suarez A. 2007. Using boosting to prune bagging ensembles. Pattern Recognition Letters 28(1), 156–165.

Martinez-Munoz G., Suarez A. 2010. Out-of-bag estimation of the optimal sample size in bagging. Pattern Recognition 43, 143–152.

Martinez-Munoz G., Hernandez-Lobato D., Suarez A. 2007. Selection of decision stumps in bagging ensembles. Lecture Notes in Computer Science 4668, 319–328.

Mason L., Baxter J., Bartlett P., Frean M. 1999. Functional gradient techniques for combining hypotheses. In Advances in Large Margin Classifiers. MIT Press, 221–247.

Melville P., Mooney R. 2005. Creating diversity in ensembles using artificial data. Information Fusion 6, 99–111.

Minku L. L., Yao X. 2012. DDD: a new ensemble approach for dealing with concept drift. IEEE Transactions on Knowledge and Data Engineering 24(4), 619–633.

Nanculef R., Valle C., Allende H., Moraga C. 2007. Bagging with asymmetric costs for misclassified and correctly classified examples. Lecture Notes in Computer Science 4756, 694–703.

Oza N. C. 2003. Boosting with averaged weight vectors. Lecture Notes in Computer Science 2709, 15–24.

Oza N. 2005. Online bagging and boosting. In 2005 IEEE International Conference on Systems, Man and Cybernetics, October 10–12, Hawaii, USA, 2340–2345.

O'Sullivan J., Langford J., Caruana R., Blum A. 2000. FeatureBoost: a meta-learning algorithm that improves model robustness. In 17th International Conference on Machine Learning, 703–710.

Panov P., Dzeroski S. 2007. Combining bagging and random subspaces to create better ensembles. Lecture Notes in Computer Science 4723, 118–129.

Pelossof R., Jones M., Vovsha I., Rudin C. 2009. Online coordinate boosting. In IEEE 12th International Conference on Computer Vision Workshops, Kyoto, Japan, 1354–1361.

Peng J., Barbu C., Seetharaman G., Fan W., Wu X., Palaniappan K. 2011. ShareBoost: boosting for multi-view learning with performance guarantees. Lecture Notes in Computer Science 6912, 597–612.

Pham T., Smeulders A. 2008. Quadratic boosting. Pattern Recognition 41, 331–341.

Pino-Mejias R., Jimenez-Gamero M., Cubiles-de-la-Vega M., Pascual-Acosta A. 2008. Reduced bootstrap aggregating of learning algorithms. Pattern Recognition Letters 29, 265–271.

Pino-Mejias R., Cubiles-de-la-Vega M., Lopez-Coello M., Silva-Ramirez E., Jimenez-Gamero M. 2004. Bagging classification models with reduced bootstrap. Lecture Notes in Computer Science 3138, 966–973.

Puuronen S., Skrypnyk I., Tsymbal A. 2001. Ensemble feature selection based on contextual merit and correlation heuristics. Lecture Notes in Computer Science 2151, 155–168.

Redpath D. B., Lebart K. 2005. Boosting feature selection. Lecture Notes in Computer Science 3686, 305–314.

Reyzin L., Schapire R. E. 2006. How boosting the margin can also boost classifier complexity. In 23rd International Conference on Machine Learning, Pittsburgh, 753–760.

Rodriguez J. J., Kuncheva L. I., Alonso C. J. 2006. Rotation forest: a new classifier ensemble method. IEEE Transactions on Pattern Analysis and Machine Intelligence 28(10), 1619–1630.

Rodriguez J. J., Maudes J. 2008. Boosting recombined weak classifiers. Pattern Recognition Letters 29, 1049–1059.

Schapire R., Freund Y., Bartlett P., Lee W. S. 1997. Boosting the margin: a new explanation for the effectiveness of voting methods. In Fourteenth International Conference on Machine Learning, ICML'97, 322–330.

Schapire R. E., Singer Y. 1999. Improved boosting algorithms using confidence-rated predictions. Machine Learning 37, 297–336.

Seiffert C., Khoshgoftaar T., Van Hulse J., Napolitano A. 2008. Resampling or reweighting: a comparison of boosting implementations. In 20th IEEE International Conference on Tools with Artificial Intelligence, ICTAI'08, Ohio, USA, 445–451.

Seni G., Elder J. 2010. Ensemble methods in data mining: improving accuracy through combining predictions. Synthesis Lectures on Data Mining and Knowledge Discovery 2(1), 1–126.

Servedio R. A. 2003. Smooth boosting and learning with malicious noise. Journal of Machine Learning Research 4, 633–648.

Shen C., Li H. 2010. Boosting through optimization of margin distributions. IEEE Transactions on Neural Networks 21(4), 659–667.

Shirai S., Kudo M., Nakamura A. 2008. Bagging, random subspace method and biding. Lecture Notes in Computer Science 5342, 801–810.

Shirai S., Kudo M., Nakamura A. 2009. Comparison of bagging and boosting algorithms on sample and feature weighting. Lecture Notes in Computer Science 5519, 22–31.

Skurichina M., Duin R. 2000. The role of combining rules in bagging and boosting. Lecture Notes in Computer Science 1876, 631–640.

Sohn S. Y., Shin H. W. 2007. Experimental study for the comparison of classifier combination methods. Pattern Recognition 40, 33–40.

Stefanowski J. 2007. Combining answers of sub-classifiers in the bagging-feature ensembles. Lecture Notes in Artificial Intelligence 4585, 574–583.

Su X., Khoshgoftaar T. M., Zhu X. 2008. VoB predictors: voting on bagging classifications. In 19th International Conference on Pattern Recognition, ICPR 2008, December 8–11, Florida, USA, 1–4.

Sun Y., Kamel M. S., Wong A., Wang Y. 2007. Cost-sensitive boosting for classification of imbalanced data. Pattern Recognition 40, 3358–3378.

Tang W. 2003. Selective ensemble of decision trees. Lecture Notes in Artificial Intelligence 2639, 476–483.

Tao D., Tang X., Li X., Wu X. 2006. Asymmetric bagging and random subspace for support vector machines-based relevance feedback in image retrieval. IEEE Transactions on Pattern Analysis and Machine Intelligence 28(7), 1088–1099.

Terabe M., Washio T., Motoda H. 2001. The effect of subsampling rate on subagging performance. In ECML 2001/PKDD 2001, 48–55.

Ting K., Witten I. 1997. Stacking bagged and dagged models. In Fourteenth International Conference on Machine Learning, ICML'97, Tennessee, USA, 367–375.

Torres-Sospedra J., Hernandez-Espinosa C., Fernandez-Redondo M. 2007a. Mixing Aveboost and Conserboost to improve boosting methods. In International Joint Conference on Neural Networks, Orlando, Florida, USA, August 12–17, 672–677.

Torres-Sospedra J., Hernandez-Espinosa C., Fernandez-Redondo M. 2007b. Designing a multilayer feedforward ensemble with the weighted conservative boosting algorithm. In International Joint Conference on Neural Networks, Orlando, Florida, USA, August 12–17, 684–689.

Torres-Sospedra J., Hernandez-Espinosa C., Fernandez-Redondo M. 2008. Researching on combining boosting ensembles. In International Joint Conference on Neural Networks, IJCNN 2008, Hong Kong, 2290–2295.

Tsao C., Chang Y. I. 2007. A stochastic approximation view of boosting. Computational Statistics and Data Analysis 52, 325–334.

Tsymbal A., Puuronen S. 2000. Bagging and boosting with dynamic integration of classifiers. Lecture Notes in Artificial Intelligence 1910, 116–125.

Valentini G., Masulli F. 2002. Ensembles of learning machines. Lecture Notes in Computer Science 2486, 3–19.

Valentini G., Dietterich T. G. 2003. Low bias bagged support vector machines. In 20th International Conference on Machine Learning, ICML-2003, Washington, USA, 752–759.

Vezhnevets A., Barinova O. 2007. Avoiding boosting overfitting by removing confusing samples. In ECML 2007, Poland, September, 430–441.

Wall R., Cunningham P., Walsh P., Byrne S. 2003. Explaining the output of ensembles in medical decision support on a case by case basis. Artificial Intelligence in Medicine 28(2), 191–206.

Wang X., Wang H. 2006. Classification by evolutionary ensembles. Pattern Recognition 39, 595–607.

Wang C.-M., Yang H.-Z., Li F.-C., Fu R.-X. 2006. Two stages based adaptive sampling boosting method. In Fifth International Conference on Machine Learning and Cybernetics, Dalian, August 13–16, 2925–2927.

Wang S., Yao X. 2009. Diversity analysis on imbalanced data sets by using ensemble models. In IEEE Symposium on Computational Intelligence and Data Mining, 324–331.

Wang W., Zhou Z.-H. 2010. A new analysis of co-training. In 27th International Conference on Machine Learning, ICML'10, Haifa, Israel, 1135–1142.

Webb G. I. 2000. MultiBoosting: a technique for combining boosting and wagging. Machine Learning 40, 159–196.

Wolpert D. H. 1992. Stacked generalization. Neural Networks 5(2), 241–259.

Xu W., Meiyun Z., Mingtao Z., He R. 2010. Constraint bagging for stock price prediction using neural networks. In International Conference on Modelling, Identification and Control, Okayama, Japan, July 17–19, 606–610.

Xu X., Zhang A. 2006. Boost feature subset selection: a new gene selection algorithm for microarray data set. In International Conference on Computational Science, UK, 670–677.

Yang L., Gong W., Gu X., Li W., Liu Y. 2009. Bagging null space locality preserving discriminant classifiers for face recognition. Pattern Recognition 42, 1853–1858.

Yasumura Y., Kitani N., Uehara K. 2005. Integration of bagging and boosting with a new reweighting technique. In International Conference on Computational Intelligence for Modelling, Control and Automation and International Conference on Intelligent Agents, Web Technologies and Internet Commerce (CIMCA-IAWTIC'05), Vienna, Austria, 338–343.

Yin X.-C., Han Z., Liu C.-P. 2004. Selective bagging based incremental learning. In Third International Conference on Machine Learning and Cybernetics, Shanghai, August 26–29, 2412–2417.

Yin H., Dong H. 2011. The problem of noise in classification: past, current and future work. In 2011 IEEE 3rd International Conference on Communication Software and Networks (ICCSN), May 27–29, 412–416.

Yin X.-C., Liu C.-P., Han Z. 2005. Feature combination using boosting. Pattern Recognition Letters 26, 2195–2205.

Zaman F., Hirose H. 2008. A robust bagging method using median as a combination rule. In IEEE 8th International Conference on Computer and Information Technology Workshops, Dhaka, Bangladesh, 55–60.

Zhang C. X., Zhang J. S. 2008a. A local boosting algorithm for solving classification problems. Computational Statistics and Data Analysis 52(4), 1928–1941.

Zhang C. X., Zhang J. S. 2008b. RotBoost: a technique for combining Rotation Forest and AdaBoost. Pattern Recognition Letters 29, 1524–1536.

Zhang C. X., Zhang J. S., Zhang G.-Y. 2008. An efficient modified boosting method for solving classification problems. Journal of Computational and Applied Mathematics 214, 381–392.

Zhang C. X., Zhang J. S., Zhang G.-Y. 2009. Using boosting to prune double-bagging ensembles. Computational Statistics and Data Analysis 53, 1218–1231.

Zhang D., Zhou X., Leung S., Zheng J. 2010. Vertical bagging decision trees model for credit scoring. Expert Systems with Applications 37, 7838–7843.

Zhou Z.-H., Wu J., Tang W. 2002. Ensembling neural networks: many could be better than all. Artificial Intelligence 137(1–2), 239–263.

Zhou Z.-H., Yu Y. 2005a. Adapt bagging to nearest neighbor classifiers. Journal of Computer Science and Technology 20(1), 48–54.

Zhou Z.-H., Yu Y. 2005b. Ensembling local learners through multimodal perturbation. IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics 35(4), 725–735.

Zhu X., Yang Y. 2008. A lazy bagging approach to classification. Pattern Recognition 41, 2980–2992.