
For the sake of the TF beginner (this is Example #2, after all), added a helpful comment explaining how the optimizer knows to modify W and b

Davy Durham · 8 years ago · parent commit c0a784a04a
1 changed file with 1 addition and 0 deletions:
examples/2_BasicModels/linear_regression.py

@@ -38,6 +38,7 @@ pred = tf.add(tf.multiply(X, W), b)
 # Mean squared error
 cost = tf.reduce_sum(tf.pow(pred-Y, 2))/(2*n_samples)
 # Gradient descent
+#  Note, minimize() knows to modify W and b because Variable objects are trainable=True by default
 optimizer = tf.train.GradientDescentOptimizer(learning_rate).minimize(cost)
 
 # Initializing the variables
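The added comment points at a TensorFlow convention: `tf.Variable` objects default to `trainable=True`, so `minimize()` automatically collects W and b as targets of the gradient-descent update. As a conceptual sketch of that mechanism, here is a minimal pure-Python illustration (the `Var` class and `minimize` helper are hypothetical, not TensorFlow API):

```python
# Conceptual sketch of how an optimizer can update only variables flagged
# trainable=True. Var and minimize are illustrative, NOT TensorFlow API.

class Var:
    def __init__(self, value, trainable=True):
        self.value = value
        self.trainable = trainable  # mirrors tf.Variable's default

def minimize(variables, grad_fn, learning_rate=0.05, steps=500):
    """Plain gradient descent over every variable marked trainable."""
    for _ in range(steps):
        grads = grad_fn([v.value for v in variables])
        for v, g in zip(variables, grads):
            if v.trainable:          # non-trainable vars are left untouched
                v.value -= learning_rate * g

# Fit y = W*x + b to data generated by y = 2x + 1, using MSE gradients.
xs = [0.0, 1.0, 2.0, 3.0]
ys = [1.0, 3.0, 5.0, 7.0]
W, b = Var(0.0), Var(0.0)

def mse_grads(params):
    w, b_ = params
    n = len(xs)
    dw = sum(2 * (w * x + b_ - y) * x for x, y in zip(xs, ys)) / n
    db = sum(2 * (w * x + b_ - y) for x, y in zip(xs, ys)) / n
    return [dw, db]

minimize([W, b], mse_grads)
# W.value and b.value converge toward 2.0 and 1.0
```

Setting `trainable=False` on a `Var` (just as on a `tf.Variable`) would freeze it: the gradient step skips it entirely, which is exactly why `minimize(cost)` in the example needs no explicit list of what to update.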