author    TheSiahxyz <164138827+TheSiahxyz@users.noreply.github.com>  2024-04-29 22:06:12 -0400
committer TheSiahxyz <164138827+TheSiahxyz@users.noreply.github.com>  2024-04-29 22:06:12 -0400
commit    4d53fa14ee0cd615444aca6f6ba176e0ccc1b5be (patch)
tree      4d9f0527d9e6db4f92736ead0aa9bb3f840a0f89 /SI/Resource/Data Science/Machine Learning/Contents/Gradient descent.md
init
Diffstat (limited to 'SI/Resource/Data Science/Machine Learning/Contents/Gradient descent.md')
-rw-r--r--  SI/Resource/Data Science/Machine Learning/Contents/Gradient descent.md  21
1 file changed, 21 insertions, 0 deletions
diff --git a/SI/Resource/Data Science/Machine Learning/Contents/Gradient descent.md b/SI/Resource/Data Science/Machine Learning/Contents/Gradient descent.md
new file mode 100644
index 0000000..fdf8905
--- /dev/null
+++ b/SI/Resource/Data Science/Machine Learning/Contents/Gradient descent.md
@@ -0,0 +1,21 @@
+---
+id: 2023-12-17
+aliases: December 17, 2023
+tags:
+- link-note
+- Data-Science
+- Machine-Learning
+- Gradient-descent
+---
+
+# Gradient Descent
+
+- Gradient descent updates the parameters to minimize the value of the loss function
+- **At a minimum of the loss, the instantaneous rate of change (the derivative) is always _0_**
+- Therefore, update the parameters toward the point where the derivative of the loss function equals 0 (see the update rule below)
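+
+A minimal sketch of the resulting update rule, assuming parameters $\theta$, a loss $L(\theta)$, and a learning rate $\eta$ (symbols chosen here for illustration, not from the note itself):
+
+$$\theta \leftarrow \theta - \eta \frac{\partial L}{\partial \theta}$$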
+
+## Pseudocode
+
+1. Find the derivative of the loss function at the current parameters.
+2. Update the parameters in the opposite direction of the derivative.
+3. Repeat steps 1 and 2 for a number of epochs (a hyperparameter) until the derivative becomes (approximately) 0, as sketched in the code below.
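+
+A runnable sketch of this loop on a toy quadratic loss; the loss function, learning rate, and epoch count below are illustrative assumptions, not part of the original note:
+
+```python
+# Gradient descent on L(w) = (w - 3)^2, whose minimum is at w = 3.
+
+def loss(w):
+    return (w - 3.0) ** 2
+
+def grad(w):
+    # Analytic derivative: dL/dw = 2 * (w - 3)
+    return 2.0 * (w - 3.0)
+
+w = 0.0       # initial parameter guess
+lr = 0.1      # learning rate (hyperparameter)
+epochs = 100  # maximum number of epochs (hyperparameter)
+
+for _ in range(epochs):
+    g = grad(w)        # step 1: derivative at the current parameter
+    w -= lr * g        # step 2: move against the derivative
+    if abs(g) < 1e-8:  # step 3: stop once the derivative is ~0
+        break
+
+print(f"w = {w:.6f}, loss = {loss(w):.10f}")  # w converges toward 3
+```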