1. K-nearest neighbour classifier
A new instance is classified into the majority class among its k nearest training instances. The distance between the new instance and each training instance is often computed using Euclidean distance, and the k closest instances then vote on the class.
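As a concrete illustration, here is a minimal sketch in Python (assuming NumPy is available); knn_predict and the toy data are illustrative names, not from any particular library:

```python
import numpy as np

def knn_predict(X_train, y_train, x_new, k=3):
    """Classify x_new by majority vote among its k nearest training instances."""
    # Euclidean distance from x_new to every training instance
    distances = np.linalg.norm(X_train - x_new, axis=1)
    # Indices of the k closest training instances
    nearest = np.argsort(distances)[:k]
    # Majority class among those k neighbours
    labels, counts = np.unique(y_train[nearest], return_counts=True)
    return labels[np.argmax(counts)]

# Toy data: two well-separated clusters
X_train = np.array([[1.0, 1.0], [1.2, 0.8], [5.0, 5.0], [5.2, 4.9]])
y_train = np.array([0, 0, 1, 1])
print(knn_predict(X_train, y_train, np.array([1.1, 1.0]), k=3))  # -> 0
```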
2. K-fold cross validation
The idea of k-fold cross validation is to estimate the true predictive accuracy of a classifier, and it is especially useful when the training dataset is small. Instead of setting aside a separate small test dataset, we divide the training dataset into k portions, use 1 portion for testing and the remaining k-1 for training, and repeat so that each portion serves as the test set exactly once. Averaging the k results gives a more reliable estimate of the classifier's predictive accuracy.
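A minimal sketch of the splitting logic, again assuming NumPy; k_fold_accuracy and the train_and_score callback are hypothetical names used only for illustration:

```python
import numpy as np

def k_fold_accuracy(X, y, train_and_score, k=5, seed=0):
    """Average accuracy over k train/test splits of (X, y)."""
    rng = np.random.default_rng(seed)
    indices = rng.permutation(len(X))        # shuffle instance indices once
    folds = np.array_split(indices, k)       # k roughly equal portions
    scores = []
    for i in range(k):
        test_idx = folds[i]                  # 1 portion held out for testing
        train_idx = np.concatenate(          # the other k-1 portions train
            [folds[j] for j in range(k) if j != i])
        scores.append(train_and_score(X[train_idx], y[train_idx],
                                      X[test_idx], y[test_idx]))
    return float(np.mean(scores))            # average over the k repeats
```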
3. Leave-one-out cross validation
Instead of dividing the training dataset into k portions, only one instance is held out for testing and the rest are used for training; this is repeated once for every instance. Leave-one-out is thus the special case of k-fold cross validation with k equal to the number of instances.
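Under the same assumptions as the sketches above, leave-one-out then needs no new machinery; score_knn below simply wires the illustrative knn_predict into k_fold_accuracy:

```python
# Leave-one-out = k-fold with k equal to the number of instances.
def score_knn(X_tr, y_tr, X_te, y_te):
    preds = [knn_predict(X_tr, y_tr, x, k=1) for x in X_te]
    return float(np.mean(np.array(preds) == y_te))

loo_acc = k_fold_accuracy(X_train, y_train, score_knn, k=len(X_train))
print(loo_acc)  # -> 1.0 on the toy clusters above
```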