Abstract
This thesis investigates the use of genetic algorithms to optimize the hyperparameters of machine learning models, with a focus on time-series data gathered during industrial processes. The combination of machine learning and the meta-heuristic genetic algorithm is reviewed to determine its suitability for hyperparameter optimization in anomaly detection. The machine learning model is a variational autoencoder with long short-term memory layers, whose output is the reconstruction error. A skewness-adjusted boxplot for non-normally distributed data is then applied to the reconstruction error for outlier detection. A new variant of the genetic algorithm is introduced that evaluates each individual at most once per generation and fold; it is designed to overcome the long runtime and expert knowledge required by the popular approach of manual hyperparameter optimization. In addition, a combination of two crossover functions is introduced for better exploration of promising regions of the search space. Outlier detection is performed in an unsupervised manner: only non-anomalous data is used for hyperparameter optimization and training, and the trained network is then applied to all the data, which improves performance on highly biased training data. The approach was successfully applied to datasets obtained from an industrial process.
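To make the outlier-detection step concrete, the following is a minimal sketch of the skewness-adjusted boxplot rule (Hubert and Vandervieren's adjusted boxplot), assuming the `medcouple` implementation from statsmodels; the function name and the application to reconstruction errors are illustrative, not the thesis's actual code.

```python
import numpy as np
from statsmodels.stats.stattools import medcouple

def adjusted_boxplot_outliers(errors):
    """Flag outliers in a skewed sample via the medcouple-adjusted boxplot.

    For medcouple MC >= 0 the fences are
      [Q1 - 1.5*exp(-4*MC)*IQR, Q3 + 1.5*exp(3*MC)*IQR];
    for MC < 0 the exponents swap to -3 and 4.
    """
    errors = np.asarray(errors, dtype=float)
    q1, q3 = np.percentile(errors, [25, 75])
    iqr = q3 - q1
    mc = medcouple(errors)  # robust skewness measure in [-1, 1]
    if mc >= 0:
        lower = q1 - 1.5 * np.exp(-4.0 * mc) * iqr
        upper = q3 + 1.5 * np.exp(3.0 * mc) * iqr
    else:
        lower = q1 - 1.5 * np.exp(-3.0 * mc) * iqr
        upper = q3 + 1.5 * np.exp(4.0 * mc) * iqr
    return (errors < lower) | (errors > upper)

# Hypothetical usage: flag windows with anomalously large reconstruction error.
# reconstruction_errors = model.reconstruction_error(windows)
# anomalous = adjusted_boxplot_outliers(reconstruction_errors)
```

Since reconstruction errors are typically right-skewed (the medcouple is positive), the upper fence widens relative to the classical 1.5-IQR rule, so only pronounced reconstruction failures are flagged as anomalies.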
| Translated title of the contribution | LSTM Hyperparameteroptimierung: Der Einfluss von Hyperparametern auf die Performance des maschinellen Lernens in Bezug auf Zeitreihen physikalischer Prozesse |
| --- | --- |
| Original language | English |
| Qualification | Dipl.-Ing. |
| Awarding Institution | |
| Supervisors/Advisors | |
| Award date | 25 Jun 2021 |
| Publication status | Published - 2021 |
Keywords
- Machine Learning
- Genetic Algorithms
- Hyperparameter
- Variational Autoencoder
- Outlier Detection
- LSTM