Podcast: Machine Learning Guide
Episode: 15. Performance

Category: Technology
Duration: 00:41:24
Publish Date: 2017-05-07 00:00:00
Description:

Performance evaluation & improvement

## Episode

Performance evaluation

- Performance measures: accuracy, precision, recall, F1/F2 score
- Cross validation: split your data into train, validation, and test sets
  - Training set: for training your algorithm
  - Validation set: for testing your algorithm's performance; it can be used to inform changes to your model (i.e., hyperparameters)
  - Test set: for your final score; it cannot be used to inform changes to your model
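The measures and the split above can be sketched in plain Python. This is a minimal illustration on toy data (the labels and the `split_data` helper are hypothetical, not from the episode); in practice a library like scikit-learn provides equivalents.

```python
import random

# Toy binary labels for illustration (hypothetical data).
y_true = [1, 1, 0, 1, 0, 0, 1, 0]
y_pred = [1, 0, 0, 1, 0, 1, 1, 0]

# Count the four outcomes of a binary classifier.
tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)

accuracy = (tp + tn) / len(y_true)   # fraction of all predictions that are right
precision = tp / (tp + fp)           # of predicted positives, how many are real
recall = tp / (tp + fn)              # of real positives, how many were found
f1 = 2 * precision * recall / (precision + recall)  # harmonic mean

# F-beta generalizes F1; beta=2 (the F2 score) weights recall more heavily.
beta = 2
f2 = (1 + beta**2) * precision * recall / (beta**2 * precision + recall)

# Hypothetical helper: shuffle once, then carve off test and validation sets.
def split_data(rows, val_frac=0.2, test_frac=0.2, seed=0):
    rows = rows[:]
    random.Random(seed).shuffle(rows)
    n_test = int(len(rows) * test_frac)
    n_val = int(len(rows) * val_frac)
    test = rows[:n_test]
    val = rows[n_test:n_test + n_val]
    train = rows[n_test + n_val:]
    return train, val, test
```

Tune hyperparameters against the validation score only; touch the test set once, at the end, so the final score is not biased by your tuning.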

Performance improvement

- Modify hyperparameters
- Data: collect more, fill in missing cells, normalize fields
- Regularize: combat overfitting (high variance); too much regularization causes underfitting (high bias)
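The regularization point can be sketched as an L2 penalty added to a loss function. This is a minimal sketch, not the episode's code; `lam` is a hypothetical name for the regularization-strength hyperparameter, and the model is a one-feature linear fit.

```python
# L2-regularized mean squared error for a linear model y = w*x + b.
# A larger lam shrinks the weight toward zero, reducing variance
# (overfitting) at the cost of more bias (underfitting); lam itself
# is a hyperparameter tuned on the validation set.
def mse_loss(w, b, xs, ys, lam=0.1):
    n = len(xs)
    mse = sum((w * x + b - y) ** 2 for x, y in zip(xs, ys)) / n
    penalty = lam * w ** 2  # L2 penalty; the bias b is usually not penalized
    return mse + penalty
```

With `lam=0` this is ordinary least-squares loss; raising `lam` trades data fit for smaller weights.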
