author    TheSiahxyz <164138827+TheSiahxyz@users.noreply.github.com>  2024-07-29 14:14:27 +0900
committer TheSiahxyz <164138827+TheSiahxyz@users.noreply.github.com>  2024-07-29 14:14:27 +0900
commit    2f32142ea7162aa05ac33e4e89df5b9a1c657ddd (patch)
tree      6bf8bec6680b7d8c819e9a88bf04f8c7756fd440 /SI/Resource/Data Science/Machine Learning/Contents/Gradient descent.md
parent    8abb4730aff5f380a2e0f3a29b5fe439817d28a0 (diff)
Init
Diffstat (limited to 'SI/Resource/Data Science/Machine Learning/Contents/Gradient descent.md')
-rw-r--r--SI/Resource/Data Science/Machine Learning/Contents/Gradient descent.md3
1 file changed, 1 insertion(+), 2 deletions(-)
diff --git a/SI/Resource/Data Science/Machine Learning/Contents/Gradient descent.md b/SI/Resource/Data Science/Machine Learning/Contents/Gradient descent.md
index fdf8905..6d047a1 100644
--- a/SI/Resource/Data Science/Machine Learning/Contents/Gradient descent.md
+++ b/SI/Resource/Data Science/Machine Learning/Contents/Gradient descent.md
@@ -7,7 +7,6 @@ tags:
- Machine-Learning
- Gradient-descent
---
-
# Gradient Descent
- Iteratively update parameters to minimize the value of a loss function
@@ -18,4 +17,4 @@ tags:
1. Find the derivative of the loss function at the current parameters.
2. Update parameters in the opposite direction of the derivative.
-3. Repeat steps 1 and 2 as many epochs (hyperparameter) until the differential value becomes 0. \ No newline at end of file
+3. Repeat steps 1 and 2 for as many epochs (a hyperparameter) as needed, or until the derivative becomes 0.
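
The three steps described in the patched note can be sketched as a short loop. The quadratic loss, learning rate, and epoch count below are illustrative assumptions, not taken from the note itself:

```python
def gradient_descent(grad, x0, lr=0.1, epochs=100, tol=1e-8):
    """Minimize a 1-D loss given its derivative `grad`, starting from x0."""
    x = x0
    for _ in range(epochs):      # epochs is a hyperparameter
        g = grad(x)              # 1. derivative of the loss at the current parameter
        if abs(g) < tol:         # 3. stop once the derivative is (near) 0
            break
        x -= lr * g              # 2. step in the opposite direction of the derivative
    return x

# Example: minimize L(x) = (x - 3)^2, whose derivative is 2(x - 3).
minimum = gradient_descent(lambda x: 2 * (x - 3), x0=0.0)
```

For this convex example the loop converges to the true minimizer x = 3; for non-convex losses it only finds a point where the derivative vanishes.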