Commit db18108 (parent 97beb0f)
committed: obsidian 24-10-10 12:02:40
Affected files: 1.2.Regression.with.Multiple.Input.Variables.md (1 file changed, +24 -27 lines)

# C1_W2: Regression with Multiple Input Variables

This week, you'll extend linear regression to handle multiple input features. You'll also learn some methods for improving your model's training and performance, such as _vectorization_, _feature scaling_, _feature engineering_ and _polynomial regression_. At the end of the week, you'll get to practice implementing linear regression in code.

## C1_W2_M1 Multiple Linear Regression

### C1_W2_M1_1 Multiple features

![](/img/1.2.1.1.multiple.features.png)
- $\vec{x}^{(i)}$ = __vector__ of the 4 features for the $i^{th}$ row (see the indexing sketch after the quiz)
- this is __multiple linear regression__
- __Not__ _multivariate regression_

#### Quiz

In the training set below (see slide: C1_W2_M1_1 Multiple features), what is $x_{1}^{(4)}$?

<details><summary>Ans</summary>852</details>

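A minimal indexing sketch of that notation (NumPy is 0-indexed while the course notation is 1-indexed; the matrix values are illustrative, except that $x_1^{(4)} = 852$ matches the quiz answer above):

```python
import numpy as np

# Training matrix: one row per example, one column per feature
# (size, bedrooms, floors, age). Values are illustrative, except that
# x_1^(4) = 852 matches the quiz answer above.
X_train = np.array([
    [2104, 5, 1, 45],
    [1416, 3, 2, 40],
    [1534, 3, 2, 30],
    [ 852, 2, 1, 36],
])

i, j = 4, 1                    # course notation x_j^(i) is 1-indexed
print(X_train[i - 1, j - 1])   # -> 852
print(X_train[i - 1])          # the whole row vector x^(4)
```
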
### C1_W2_M1_2 Vectorization part 1

Learning to write __vectorized code__ allows you to take advantage of modern numerical linear algebra libraries, as well as possibly GPU hardware.

- Vectorization has 2 benefits: _concise and efficient_
- `np.dot` can use parallel hardware

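A minimal sketch of both implementations of the model $f_{\vec{w},b}(\vec{x}) = \vec{w} \cdot \vec{x} + b$, assuming NumPy; the parameter values are illustrative:

```python
import numpy as np

w = np.array([1.0, 2.5, -3.3])    # one parameter per feature
x = np.array([10.0, 20.0, 30.0])  # features of a single example
b = 4.0

# Unvectorized: loop over the n features one at a time
f = 0.0
for j in range(w.shape[0]):
    f += w[j] * x[j]
f += b

# Vectorized: one library call, which NumPy can run on parallel hardware
f_vec = np.dot(w, x) + b

print(f, f_vec)  # both -35.0
```
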
### C1_W2_M1_3 Vectorization part 2

How does a vectorized algorithm work?

![](/img/i1.2.1.3.gradient.descent.png)

### C1_W2_Lab01: Python Numpy Vectorization

- [Coursera](https://www.coursera.org/learn/machine-learning/ungradedLab/zadmO/optional-lab-python-numpy-and-vectorization/lab#?path=%2Fnotebooks%2FC1_W2_Lab01_Python_Numpy_Vectorization_Soln.ipynb)
- [Local](/code/C1_W2_Lab01_Python_Numpy_Vectorization_Soln.ipynb)
- $a \cdot b$ returns a scalar
- e.g. $[1, 2, 3, 4] \cdot [-1, 4, 3, 2] = 24$

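A quick check of that example; `np.dot` on two 1-D arrays returns a scalar:

```python
import numpy as np

a = np.array([1, 2, 3, 4])
b = np.array([-1, 4, 3, 2])

# (1)(-1) + (2)(4) + (3)(3) + (4)(2) = -1 + 8 + 9 + 8 = 24
print(np.dot(a, b))        # 24
print(np.dot(a, b).shape)  # () -- a scalar, not an array
```
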
### C1_W2_M1_4 Gradient descent for multiple linear regression

![](/img/1.2.1.4.gradient.descent.png)

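A minimal sketch of gradient descent for multiple linear regression with the squared-error cost; the helper name, data, learning rate, and iteration count are illustrative, not the lab's code:

```python
import numpy as np

def gradient_step(X, y, w, b, alpha):
    """One update of w and b for f(x) = w . x + b with squared-error cost."""
    m = X.shape[0]
    err = X @ w + b - y          # errors for all m examples at once
    dj_dw = (X.T @ err) / m      # partial derivatives w.r.t. each w_j
    dj_db = err.sum() / m        # partial derivative w.r.t. b
    return w - alpha * dj_dw, b - alpha * dj_db

# Tiny illustrative problem: 3 examples, 2 features
X = np.array([[1.0, 2.0], [2.0, 0.5], [3.0, 1.5]])
y = np.array([3.0, 2.5, 4.5])

w, b = np.zeros(2), 0.0
for _ in range(1000):
    w, b = gradient_step(X, y, w, b, alpha=0.05)
print(w, b)
```
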
![](/img/1.2.1.4.normal.equation.png)
- __Normal Equation__

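The normal equation solves for the least-squares parameters directly, with no iterations. A minimal sketch, assuming a bias column appended to $X$; it uses `np.linalg.lstsq`, the numerically safer route to $(X^{\top}X)^{-1}X^{\top}\vec{y}$:

```python
import numpy as np

X = np.array([[1.0, 2.0], [2.0, 0.5], [3.0, 1.5]])
y = np.array([3.0, 2.5, 4.5])

# Append a column of ones so the bias b is learned as the last coefficient
Xb = np.hstack([X, np.ones((X.shape[0], 1))])

# Least-squares solution in one shot -- no learning rate, no iterations
theta, *_ = np.linalg.lstsq(Xb, y, rcond=None)
w, b = theta[:-1], theta[-1]
print(w, b)
```
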
### C1_W2_Lab02: Multiple linear regression

- [Optional Lab: Multiple linear regression | Coursera](https://www.coursera.org/learn/machine-learning/ungradedLab/7GEJh/optional-lab-multiple-linear-regression/lab)
- [Local](/code/C1_W2_Lab02_Multiple_Variable_Soln.ipynb)

## Quiz: Multiple linear regression

1. In the training set below, what is $x_4^{(3)}$?

<details><summary>Ans</summary>30, 4, F</details>

# C1_W2_M2 Gradient Descent in Practice

## C1_W2_M2_01 Feature scaling part 1

![](/img/1.2.2.01.values.png)
- Use __Feature Scaling__ to enable gradient descent to run faster

:bulb: We can __speed up gradient descent by scaling our features__

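A minimal before/after sketch of that claim; the data, learning rates, and iteration count are illustrative:

```python
import numpy as np

def final_cost(X, y, alpha, iters):
    """Run gradient descent and return the final squared-error cost."""
    m = X.shape[0]
    w, b = np.zeros(X.shape[1]), 0.0
    for _ in range(iters):
        err = X @ w + b - y
        w -= alpha * (X.T @ err) / m
        b -= alpha * err.sum() / m
    return ((X @ w + b - y) ** 2).sum() / (2 * m)

# size in sq. ft. vs number of bedrooms: wildly different ranges
X = np.array([[2000.0, 3.0], [1200.0, 4.0], [800.0, 1.0]])
y = np.array([400.0, 280.0, 160.0])

# Unscaled: alpha must stay tiny to avoid divergence, so progress crawls
print(final_cost(X, y, alpha=1e-7, iters=1000))                 # still large

# Scaled to (0, 1]: a much larger alpha is stable and the cost collapses
print(final_cost(X / X.max(axis=0), y, alpha=0.5, iters=1000))  # near zero
```
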
## C1_W2_M2_02 Feature scaling part 2

![](/img/1.2.2.02.scale.png)
- scale by dividing by the feature's maximum: $x_j^{(i)} / \max_i x_j^{(i)}$
- but the range is ok if it's relatively close
- rescale if range is too large or too small

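A minimal sketch of the divide-by-max scaling above; the values are illustrative:

```python
import numpy as np

X = np.array([[2000.0, 3.0],
              [1200.0, 4.0],
              [ 800.0, 1.0]])

# Divide every column by its max over the training set: each scaled
# feature now lies in (0, 1], so the ranges are comparable.
X_scaled = X / X.max(axis=0)
print(X_scaled)
```
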
### Quiz:

Which of the following is a valid step used during feature scaling? (see bedrooms vs size scatterplot)
- [ ] Multiply each value by the maximum value for that feature
- [ ] Divide each value by the maximum value for that feature

<details><summary>Ans</summary>2</details>

## C1_W2_M2_03 Checking gradient descent for convergence

![](/img/1.2.2.03.alpha.png)
- We can choose $\alpha$

![](/img/)
- Want to minimize _cost function_ $\min\limits_{\vec{w}, b} J(\vec{w}, b)$

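A minimal sketch of an automatic convergence test: record $J$ each iteration and stop when it barely decreases. The data and the $\varepsilon$ threshold are illustrative:

```python
import numpy as np

def cost(X, y, w, b):
    err = X @ w + b - y
    return (err @ err) / (2 * X.shape[0])

X = np.array([[1.0, 2.0], [2.0, 0.5], [3.0, 1.5]])
y = np.array([3.0, 2.5, 4.5])
w, b, alpha, epsilon = np.zeros(2), 0.0, 0.05, 1e-6

prev = cost(X, y, w, b)
for it in range(10_000):
    m = X.shape[0]
    err = X @ w + b - y
    w -= alpha * (X.T @ err) / m
    b -= alpha * err.sum() / m
    J = cost(X, y, w, b)
    if prev - J < epsilon:   # J barely moved: declare convergence
        print(f"converged at iteration {it}, J = {J:.6f}")
        break
    prev = J                 # J should fall every iteration if alpha is ok
```
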
## C1_W2_M2_04 Choosing the learning rate

## C1_W2_M2_05 Optional Lab: Feature scaling and learning rate

## C1_W2_M2_06 Feature engineering

## C1_W2_M2_07 Polynomial regression
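Polynomial regression is feature engineering: powers of an input become extra columns, and the same linear machinery fits a curve. A minimal sketch (the data, scaling choice, and solver are illustrative):

```python
import numpy as np

x = np.arange(0, 20, 1.0)
y = 1.0 + x**2                 # clearly not a straight line in x

# Feature engineering: one input becomes three features x, x^2, x^3
X = np.c_[x, x**2, x**3]
X = X / X.max(axis=0)          # scaling matters: x^3 spans a huge range

# Fit with ordinary least squares (gradient descent would work too)
Xb = np.hstack([X, np.ones((X.shape[0], 1))])
theta, *_ = np.linalg.lstsq(Xb, y, rcond=None)
print(theta)                   # the x^2 column carries nearly all the weight
```
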
## C1_W2_M2_08 Optional lab: Feature engineering and Polynomial regression

## C1_W2_M2_09 Optional lab: Linear regression with scikit-learn
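A minimal sketch of fitting the same kind of model with scikit-learn's `LinearRegression` (illustrative data; the actual lab may use a different estimator):

```python
import numpy as np
from sklearn.linear_model import LinearRegression

X = np.array([[2000.0, 3.0], [1200.0, 4.0], [800.0, 1.0]])
y = np.array([400.0, 280.0, 160.0])

model = LinearRegression()     # ordinary least squares under the hood
model.fit(X, y)
print(model.coef_, model.intercept_)
print(model.predict([[1500.0, 3.0]]))  # prediction for a new example
```
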
## C1_W2_M2_10 Practice quiz: Gradient descent in practice

## C1_W2_M2_11 Week 2 practice lab: Linear regression

![](Screenshot%202024-10-09%20180220.png)
![](Screenshot%202024-10-09%20180317%201.png)
