Explain the concept of entropy in decision trees.

January 7, 2025

Entropy is a measure of impurity or disorder used in decision trees to decide how to split the data. For a node whose samples belong to classes with proportions p_1, ..., p_k, entropy is defined as −Σ p_i log2(p_i). A node with high entropy holds an impure mix of classes, while a node with low entropy is mostly homogeneous (dominated by one class). At each split, the decision tree algorithm chooses the attribute that most reduces entropy, i.e. maximizes information gain, so the resulting branches become increasingly homogeneous.
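As a minimal sketch, the entropy formula above can be computed for a node's class labels in a few lines of Python (the function name `entropy` and the example labels are illustrative, not from a specific library):

```python
from collections import Counter
from math import log2

def entropy(labels):
    """Shannon entropy (in bits) of a collection of class labels."""
    total = len(labels)
    # Sum -p * log2(p) over the proportion p of each class at the node.
    return sum(-(count / total) * log2(count / total)
               for count in Counter(labels).values())

# A pure node has entropy 0; a 50/50 binary mix has entropy 1 (maximum impurity).
print(entropy(["yes", "yes", "yes", "yes"]))  # 0.0
print(entropy(["yes", "yes", "no", "no"]))    # 1.0
```

A split is then scored by information gain: the parent node's entropy minus the weighted average entropy of the child nodes it produces.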
