#C13009. Iris Decision Tree Classification Evaluation

    ID: 42500 Type: Default 1000ms 256MiB

This problem requires you to simulate the evaluation of a Decision Tree Classifier on the famous Iris dataset. In a real-world scenario, one would load the Iris data, perform preprocessing (such as label encoding and splitting the dataset), train a Decision Tree model, and finally evaluate the model by computing the accuracy, confusion matrix, and a detailed classification report.

In this problem, you are not required to actually implement the machine learning pipeline. Instead, your program should read input from stdin (even though it is not used) and then print the evaluation results in a predefined format: the accuracy, the confusion matrix, and the classification report, exactly as described below. Here the accuracy score is simply \(\text{Accuracy} = \frac{\text{correct predictions}}{\text{total predictions}}\).

Note: Although the original concept used libraries such as pandas and scikit-learn, for this challenge you only need to output the expected result, which is fixed.
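
For context only, the pipeline alluded to above could look like the sketch below. The judge does not run this: the stratified 30-sample split (10 test samples per class) and the `random_state` values are assumptions, and the metrics it prints are not guaranteed to equal the fixed output required by this problem.

```python
# Illustrative sketch of the original pipeline (NOT required for this problem).
# test_size=30 with stratify=y yields 10 test samples per class; random_state
# is an arbitrary assumption, so the printed metrics may differ from the
# fixed output the judge expects.
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import accuracy_score, confusion_matrix, classification_report

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=30, stratify=y, random_state=0
)

clf = DecisionTreeClassifier(random_state=0).fit(X_train, y_train)
pred = clf.predict(X_test)

print("Accuracy:", accuracy_score(y_test, pred))
print("Confusion Matrix:")
print(confusion_matrix(y_test, pred))
print("Classification Report:")
print(classification_report(y_test, pred))
```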

## inputFormat

The input will be provided via standard input (stdin). However, for this problem, the input is irrelevant because the Iris dataset is processed internally. You may assume that the input is empty or contains arbitrary data that should be ignored.

## outputFormat

The program must print the evaluation results to standard output (stdout) exactly as shown below:

Accuracy: 1.0
Confusion Matrix:
 [[10, 0, 0],
 [0, 10, 0],
 [0, 0, 10]]
Classification Report:
              precision    recall  f1-score   support
       0       1.00      1.00      1.00        10
       1       1.00      1.00      1.00        10
       2       1.00      1.00      1.00        10
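
Since the expected output is fixed, a passing program only needs to drain stdin and print the literal text. A minimal sketch in Python (an assumption; any language works, as long as the whitespace matches the block above exactly):

```python
import sys

# Read and discard all of stdin; the input is irrelevant for this problem.
_ = sys.stdin.read()

# Print the fixed evaluation results verbatim.
print("""Accuracy: 1.0
Confusion Matrix:
 [[10, 0, 0],
 [0, 10, 0],
 [0, 0, 10]]
Classification Report:
              precision    recall  f1-score   support
       0       1.00      1.00      1.00        10
       1       1.00      1.00      1.00        10
       2       1.00      1.00      1.00        10""")
```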
## sample
Accuracy: 1.0
Confusion Matrix:
 [[10, 0, 0],
 [0, 10, 0],
 [0, 0, 10]]
Classification Report:
              precision    recall  f1-score   support
       0       1.00      1.00      1.00        10
       1       1.00      1.00      1.00        10
       2       1.00      1.00      1.00        10
