Dropout - a Method to Regularize the Training of Deep Neural Networks [Lecture 6.4] | AMILE - Machine Learning with Christian Nabert
Activation Functions of Neural Networks - Step, Sigmoid, Tanh, ReLU, LeakyReLU, Softmax [Lecture 5.3]
Building Neural Networks - Neuron, Single Layer Perceptron, Multi Layer Perceptron [Lecture 5.2]
When to Stop the Training of a Decision Tree? - Hyperparameters of Decision Trees [Lecture 4.3]
How to Make a Decision Tree - Mathematical Theory of Training with Gini Impurity [Lecture 4.2]
How to Evaluate Classification Models - Confusion Matrix and Precision-Recall Curve [Lecture 2.7]
What is the Meaning of Cross Entropy / Log Loss as a Cost Function for Classification? [Lecture 2.6]
Cross Entropy vs. MSE as Cost Function for Logistic Regression for Classification [Lecture 2.5]
Softmax Regression as a Generalization of Logistic Regression for Classification [Lecture 2.3]
Regularization - Early Stopping, Ridge Regression (L2) and Lasso Regression (L1) [Lecture 1.6]
Data Preprocessing - Normalization, Outliers, Missing Data, Variable Transformation [Lecture 1.4]