Ensemble Methods MCQs

1. What are ensemble methods mainly used for?

(A) Data cleaning


(B) Improving model performance by combining models


(C) Feature selection


(D) Data visualization



2. Which ensemble technique builds models sequentially?

(A) Bagging


(B) Boosting


(C) Stacking


(D) Random Forest



3. Which ensemble method reduces variance?

(A) Boosting


(B) Stacking


(C) Bagging


(D) AdaBoost



4. Which algorithm is an example of bagging?

(A) AdaBoost


(B) Gradient Boosting


(C) XGBoost


(D) Random Forest



5. Which ensemble method focuses more on misclassified instances?

(A) Bagging


(B) Voting


(C) Stacking


(D) Boosting



6. What does bagging stand for?

(A) Bayesian Aggregation


(B) Balanced Grouping


(C) Bootstrap Aggregating


(D) Batch Aggregation
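
For reference on bagging (questions 3, 4, 6, and 15): a minimal sketch using scikit-learn's BaggingClassifier, which trains base learners on bootstrap samples and aggregates their votes. The synthetic dataset and hyperparameter values are illustrative choices, not part of the quiz.

```python
# Bagging (Bootstrap Aggregating): train base learners on bootstrap samples,
# then combine their predictions by majority vote.
from sklearn.datasets import make_classification
from sklearn.ensemble import BaggingClassifier
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=1000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# The default base learner is a decision tree; each of the 50 trees is fit
# on a bootstrap sample of the training set (bootstrap=True).
bagging = BaggingClassifier(n_estimators=50, bootstrap=True, random_state=0)
bagging.fit(X_train, y_train)
print("Bagging test accuracy:", bagging.score(X_test, y_test))
```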



7. Which technique combines predictions using a meta-model?

(A) Bagging


(B) Boosting


(C) Stacking


(D) Random Forest
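
For question 7 (and question 14): a sketch of stacking with scikit-learn's StackingClassifier, where a meta-model learns to combine the predictions of heterogeneous base models. The choice of base models and of logistic regression as the meta-model is illustrative.

```python
# Stacking: base models' predictions become input features for a meta-model.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier, StackingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

X, y = make_classification(n_samples=1000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

base_models = [
    ("rf", RandomForestClassifier(n_estimators=50, random_state=0)),
    ("svm", SVC(probability=True, random_state=0)),
]
# final_estimator is the meta-model trained on the base models' outputs.
stack = StackingClassifier(estimators=base_models,
                           final_estimator=LogisticRegression())
stack.fit(X_train, y_train)
print("Stacking test accuracy:", stack.score(X_test, y_test))
```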



8. Which ensemble method uses majority voting?

(A) Stacking


(B) Bagging


(C) Voting classifier


(D) Boosting
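
For question 8 (and the weighted variant in question 13): a sketch of a hard-voting classifier in scikit-learn. The optional weights argument turns simple majority voting into weighted voting; the models and weights below are illustrative.

```python
# Voting classifier: each model votes and the majority class wins.
from sklearn.datasets import make_classification
from sklearn.ensemble import VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=1000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

voter = VotingClassifier(
    estimators=[("lr", LogisticRegression(max_iter=1000)),
                ("nb", GaussianNB()),
                ("dt", DecisionTreeClassifier(random_state=0))],
    voting="hard",        # simple majority voting
    weights=[2, 1, 1],    # optional: weighted voting (the idea in question 13)
)
voter.fit(X_train, y_train)
print("Voting test accuracy:", voter.score(X_test, y_test))
```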



9. Which ensemble technique reduces bias?

(A) Bagging


(B) Random Forest


(C) Boosting


(D) Voting



10. Which algorithm is a boosting method?

(A) KNN


(B) Naive Bayes


(C) Linear Regression


(D) AdaBoost
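
For question 10: a minimal AdaBoost sketch. AdaBoost builds weak learners sequentially and re-weights misclassified samples so later learners focus on them, which also illustrates questions 2, 5, 9, and 17. Hyperparameter values are illustrative.

```python
# AdaBoost: sequential boosting that up-weights misclassified samples.
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=1000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Each new weak learner (a shallow decision tree by default) concentrates on
# the samples the previous ones got wrong; the final prediction is a
# weighted vote over all weak learners.
ada = AdaBoostClassifier(n_estimators=100, learning_rate=0.5, random_state=0)
ada.fit(X_train, y_train)
print("AdaBoost test accuracy:", ada.score(X_test, y_test))
```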



11. Which ensemble method builds decision trees on random feature subsets?

(A) Bagging


(B) Random Forest


(C) Boosting


(D) Stacking
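
For question 11: a Random Forest sketch. On top of bootstrap sampling, each tree split considers only a random subset of the features (max_features); the "sqrt" setting and other values below are illustrative.

```python
# Random Forest: bagged decision trees plus random feature subsets per split.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=1000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# max_features="sqrt" means each split examines a random subset of roughly
# sqrt(n_features) candidate features, which decorrelates the trees.
forest = RandomForestClassifier(n_estimators=200, max_features="sqrt",
                                random_state=0, n_jobs=-1)
forest.fit(X_train, y_train)
print("Random Forest test accuracy:", forest.score(X_test, y_test))
```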



12. Which ensemble method is most prone to overfitting noisy data?

(A) Bagging


(B) Voting


(C) Random Forest


(D) Boosting



13. Which ensemble technique uses weighted voting?

(A) Bagging


(B) Simple voting


(C) Random Forest


(D) Boosting



14. Which ensemble method combines different types of models?

(A) Bagging


(B) Stacking


(C) Boosting


(D) Random Forest



15. Which ensemble technique is best for reducing overfitting?

(A) Boosting


(B) Bagging


(C) Stacking


(D) AdaBoost



16. Which ensemble algorithm is based on gradient descent?

(A) Gradient Boosting


(B) AdaBoost


(C) Random Forest


(D) Bagging
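
For question 16: a gradient boosting sketch. Each new tree is fit to the gradient of the loss with respect to the current ensemble's predictions, so training amounts to gradient descent in function space; the hyperparameters below are illustrative.

```python
# Gradient Boosting: each tree fits the loss gradient left by the previous trees.
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=1000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

gbm = GradientBoostingClassifier(n_estimators=100, learning_rate=0.1,
                                 max_depth=3, random_state=0)
gbm.fit(X_train, y_train)
print("Gradient Boosting test accuracy:", gbm.score(X_test, y_test))
```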



17. Which ensemble method combines weak learners into a strong learner?

(A) Bagging


(B) Boosting


(C) Voting


(D) Random Forest



18. Which ensemble technique is also known as parallel learning?

(A) Boosting


(B) Stacking


(C) Bagging


(D) Voting



19. Which ensemble method is commonly used in machine learning competitions?

(A) XGBoost


(B) Linear Regression


(C) K-Means


(D) PCA
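
For question 19: a sketch using the third-party xgboost package (installed separately, e.g. with pip install xgboost), via its scikit-learn-style XGBClassifier wrapper. The dataset and hyperparameters are illustrative.

```python
# XGBoost: a regularized, highly optimized gradient boosting implementation.
from sklearn.datasets import make_classification
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split
from xgboost import XGBClassifier  # requires the separate xgboost package

X, y = make_classification(n_samples=1000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

xgb = XGBClassifier(n_estimators=200, learning_rate=0.1, max_depth=4,
                    random_state=0)
xgb.fit(X_train, y_train)
pred = xgb.predict(X_test)
print("XGBoost test accuracy:", accuracy_score(y_test, pred))
```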



20. Which statement about ensemble methods is TRUE?

(A) They always reduce bias


(B) They use only one model


(C) They eliminate the need for training data


(D) They can improve prediction accuracy


