#C14544. Iris KNN Classifier with Hyperparameter Optimization
In this problem, you are given the famous Iris dataset. Your task is to implement a K-Nearest Neighbors (KNN) classifier that automatically tunes its hyperparameters using grid search with 5-fold cross-validation. Specifically, search for the optimal number of neighbors `n_neighbors` from the set {3, 5, 7, 9} and the best distance metric from {euclidean, manhattan, chebyshev, minkowski}. After selecting the best model, generate a classification report that gives the precision, recall, and f1-score for each iris class (setosa, versicolor, virginica). Note that your solution must read any input from standard input (stdin) and write all output to standard output (stdout).
## Input Format
The program does not require any specific input. It should accept an empty stdin (and ignore any extraneous input).
## Output Format

The output consists of two parts: the best hyperparameters found by the grid search and the corresponding classification report on the test set. The classification report should include columns for precision, recall, f1-score, and support for each iris class.

## Sample
Best Parameters: {'n_neighbors': 3, 'metric': 'euclidean'}
Classification Report:
                  precision    recall  f1-score   support

          setosa       1.00      1.00      1.00        15
      versicolor       0.94      1.00      0.97        16
       virginica       1.00      0.88      0.93        14

        accuracy                           0.96        45
       macro avg       0.98      0.96      0.97        45
    weighted avg       0.98      0.96      0.97        45
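A minimal sketch of one possible solution using scikit-learn is shown below. The train/test split parameters (`test_size=0.3`, `random_state=42`, stratification) are assumptions chosen to yield a 45-sample test set like the sample output; they are not specified by the problem, so the exact numbers in your report may differ.

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split, GridSearchCV
from sklearn.neighbors import KNeighborsClassifier
from sklearn.metrics import classification_report

# Load the Iris dataset (150 samples, 3 classes).
iris = load_iris()
# Assumed 70/30 stratified split; the problem does not fix these parameters.
X_train, X_test, y_train, y_test = train_test_split(
    iris.data, iris.target, test_size=0.3, random_state=42, stratify=iris.target
)

# Hyperparameter grid from the problem statement.
param_grid = {
    "n_neighbors": [3, 5, 7, 9],
    "metric": ["euclidean", "manhattan", "chebyshev", "minkowski"],
}

# Grid search with 5-fold cross-validation on the training set.
grid = GridSearchCV(KNeighborsClassifier(), param_grid, cv=5)
grid.fit(X_train, y_train)

# Report the best hyperparameters and the test-set classification report.
print(f"Best Parameters: {grid.best_params_}")
print("Classification Report:")
print(classification_report(y_test, grid.predict(X_test),
                            target_names=iris.target_names))
```

`GridSearchCV` refits the best estimator on the full training set by default, so `grid.predict` uses the tuned model directly.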