Hands-on Machine Learning
=== Decision Trees ===
Task: Labelling
Input data types: nominal (or numeric, via threshold conditions)
Description: A decision tree is something like a flow chart. It's a tree of decision boxes: you start at the root and, based on your data's attribute values, follow decisions down to a leaf node. At the leaf node, you'll typically have a label.
Training: Build the tree greedily: at each node, choose the attribute that best separates the training examples (e.g. by information gain), split on it, and recurse on each branch until the examples at a node are (nearly) all one label.
Evaluation: Measure labelling accuracy on held-out data; deep trees can overfit, so limiting depth or pruning is common.
Application: Feed a new example in at the root and follow its attribute values down to a leaf; the leaf's label is the prediction.
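The flow-chart traversal described above can be sketched in a few lines of Python. The tree and the fruit attributes here are hypothetical illustrations, not from the article: internal nodes test one attribute, and leaves hold labels.

```python
# A minimal decision tree as nested dicts (hypothetical fruit data).
# Internal nodes name an attribute and branch on its value; leaves are labels.
TREE = {
    "attribute": "color",
    "branches": {
        "yellow": {
            "attribute": "shape",
            "branches": {
                "long": "banana",   # leaf: final label
                "round": "lemon",
            },
        },
        "red": "apple",
        "orange": "orange",
    },
}

def classify(tree, example):
    """Start at the root and follow decisions until a leaf label is reached."""
    node = tree
    while isinstance(node, dict):            # still inside a decision box
        value = example[node["attribute"]]
        node = node["branches"][value]       # follow the matching branch
    return node                              # leaf: the predicted label

print(classify(TREE, {"color": "yellow", "shape": "long"}))  # banana
```

Training would amount to building such a dict from data (e.g. by recursive splitting); application is just this traversal.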
=== Naive Bayes Classifier ===
Description: Naive Bayes is a statistical technique for predicting the probability of every label given a set of inputs. For instance, suppose we've trained a naive Bayes system on (color, kind of fruit) pairs. We can then ask it for the probability distribution of "kind of fruit" given the color "yellow." This will tell us that it's almost certainly a banana or lemon, but it could be an apple, and might occasionally be an orange, and so on. That is, it returns a list of labels, each with an associated probability.
Training: Count how often each label occurs and how often each input value co-occurs with each label; these counts give the priors P(label) and the likelihoods P(input | label).
Evaluation: Check how often the most probable label matches the true label on held-out data; the "naive" assumption that inputs are independent given the label can hurt accuracy when inputs are correlated.
Application: For a new input, apply Bayes' rule: multiply each label's prior by the likelihood of the observed input values, then normalise to get a probability distribution over labels.
=== Support Vector Machines ===