Machine Learning Tutorial

Improving Performance of ML Models

It is important to understand the performance metrics associated with a learning model and to know which hyperparameters need to be tuned to get better results. The goal is for the model to generalize well to new data. Let us look at different ways of improving the performance of learning models.

Improvement efforts generally fall into one of four approaches:

  • Improving model performance with data
  • Improving model performance by tuning the algorithm
  • Improving model performance by choosing a different algorithm
  • Improving model performance using ensembles

Improving model performance with data

The data can be processed and cleaned in different ways: selecting different features, engineering new ones, and so on. Looking at the data from new perspectives can yield better predictions. The data can also be resampled before it is fed to the learning algorithm.

Data can also be rescaled or normalized to a common range, which often helps algorithms that are sensitive to differences in feature scale.
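As a minimal sketch of rescaling (using only NumPy; the function name and data are illustrative), min-max scaling maps each feature column to the [0, 1] range:

```python
import numpy as np

def min_max_scale(X):
    """Rescale each column of X to the [0, 1] range."""
    X = np.asarray(X, dtype=float)
    col_min = X.min(axis=0)
    col_range = X.max(axis=0) - col_min
    col_range[col_range == 0] = 1.0  # avoid division by zero for constant columns
    return (X - col_min) / col_range

# Two features on very different scales
X = np.array([[1.0, 200.0], [2.0, 300.0], [3.0, 400.0]])
X_scaled = min_max_scale(X)
# After scaling, each column spans exactly [0, 1]
```

Standardization (subtracting the mean and dividing by the standard deviation) is a common alternative when features should be centered rather than bounded.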

Improving model performance by tuning the algorithm

An algorithm and a data representation can be chosen and a baseline defined; any result above this baseline can be treated as an improvement worth benchmarking.

The available data should be used as fully as possible, and different resampling strategies can be tried. The evaluation metric should be the one that best captures the problem's requirements. Linear and non-linear algorithms should be spot-checked to understand how much bias each introduces. Finally, the chosen algorithm should be configured properly for the input data.
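Spot-checking can be as simple as scoring one linear and one non-linear model with cross-validation and comparing the results. A sketch assuming scikit-learn, with a synthetic dataset standing in for real data:

```python
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import cross_val_score

# Synthetic stand-in for a real dataset
X, y = make_classification(n_samples=300, n_features=10, random_state=42)

models = {
    "logistic_regression": LogisticRegression(max_iter=1000),  # linear
    "decision_tree": DecisionTreeClassifier(random_state=42),  # non-linear
}

# Mean 5-fold cross-validation accuracy for each candidate
scores = {name: cross_val_score(model, X, y, cv=5).mean()
          for name, model in models.items()}
```

Comparing the entries in `scores` gives a quick, low-effort signal about which family of algorithms suits the data before investing in tuning.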

Improving model performance by choosing a different algorithm

When one algorithm does not yield the required result, you can switch to a different one. The learning curves of an algorithm should be understood before it is applied, so that it neither overfits nor underfits the data. Sometimes intuition helps as well.

Randomized search tunes hyperparameters by sampling a fixed number of random combinations from specified distributions and keeping whichever yields the best result.

Grid search exhaustively evaluates every combination in a predefined grid of hyperparameter values, which makes it clear which settings yield better results.
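Both approaches can be sketched with scikit-learn's `GridSearchCV` and `RandomizedSearchCV` (the estimator and parameter ranges below are illustrative assumptions, not recommendations):

```python
from scipy.stats import randint
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GridSearchCV, RandomizedSearchCV

X, y = make_classification(n_samples=200, random_state=0)

# Grid search: exhaustively evaluates every combination in the grid.
grid = GridSearchCV(
    RandomForestClassifier(random_state=0),
    param_grid={"n_estimators": [50, 100], "max_depth": [3, None]},
    cv=3,
).fit(X, y)

# Randomized search: samples n_iter random combinations from distributions.
rand = RandomizedSearchCV(
    RandomForestClassifier(random_state=0),
    param_distributions={"n_estimators": randint(50, 200),
                         "max_depth": [3, 5, None]},
    n_iter=5, cv=3, random_state=0,
).fit(X, y)

best_grid_params = grid.best_params_   # best combination found by grid search
best_rand_params = rand.best_params_   # best combination found by random search
```

Randomized search scales better when the hyperparameter space is large, since its cost is fixed by `n_iter` rather than by the size of the grid.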

An alternate implementation of the same algorithm can also be tried, since a different implementation may yield better results. Extensions of the algorithm can be used to improve performance, and algorithms can be customized to suit the specific data.

Improving model performance using ensembles

Ensembles combine the predictions of more than one model to improve overall results. Predictions from well-performing models are chosen and combined; multiple learning algorithms can be trained and their outputs blended in various combinations. The data representation can also be varied so that each model is trained on the view of the data it handles best.

Multiple subsamples of the training data can be created, a model trained on each, and their predictions combined; this is known as bootstrap aggregation, or bagging. Methods like boosting can also improve performance: models are trained sequentially, with each new model giving more weight to the examples the previous models predicted incorrectly.
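Bootstrap aggregation can be sketched with scikit-learn's `BaggingClassifier`, which handles the subsampling and vote-combining internally (the base learner and settings here are illustrative):

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import BaggingClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import cross_val_score

X, y = make_classification(n_samples=300, random_state=1)

# Each of the 25 trees is trained on a bootstrap subsample of the data;
# predictions are combined by majority vote.
bagged = BaggingClassifier(
    DecisionTreeClassifier(),
    n_estimators=25,
    random_state=1,
)
score = cross_val_score(bagged, X, y, cv=5).mean()
```

High-variance base learners such as unpruned decision trees tend to benefit most from bagging, since averaging many bootstrapped models reduces variance.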

Conclusion

In this post, we saw various ways in which the performance of machine learning models can be improved. 