View source code

For the sake of the TF beginner (this is Example #2, after all), added a helpful comment about how the optimizer knows to modify W and b

Davy Durham 8 years ago
parent
commit
c0a784a04a
1 changed file with 1 addition and 0 deletions
  1. examples/2_BasicModels/linear_regression.py (+1, -0)

examples/2_BasicModels/linear_regression.py (+1, -0)

@@ -38,6 +38,7 @@ pred = tf.add(tf.multiply(X, W), b)
 # Mean squared error
 cost = tf.reduce_sum(tf.pow(pred-Y, 2))/(2*n_samples)
 # Gradient descent
+#  Note, minimize() knows to modify W and b because Variable objects are trainable=True by default
 optimizer = tf.train.GradientDescentOptimizer(learning_rate).minimize(cost)

 # Initializing the variables
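
For context, here is a minimal sketch (not part of the commit) of the behavior the added comment describes, assuming the TensorFlow 1.x API used in this example: minimize() collects every Variable created with the default trainable=True, which is how it finds W and b without being told about them explicitly. The n_samples value and the global_step variable below are placeholders for illustration only.

import numpy as np
import tensorflow as tf

X = tf.placeholder(tf.float32)
Y = tf.placeholder(tf.float32)

# trainable=True is the default, so W and b are both added to the
# tf.GraphKeys.TRAINABLE_VARIABLES collection when they are created.
W = tf.Variable(np.random.randn(), name="weight")
b = tf.Variable(np.random.randn(), name="bias")

n_samples = 17  # placeholder value for this sketch

pred = tf.add(tf.multiply(X, W), b)
cost = tf.reduce_sum(tf.pow(pred - Y, 2)) / (2 * n_samples)

# minimize() defaults to var_list=tf.trainable_variables(), i.e. [W, b] here,
# so gradients are computed and applied only to those two variables.
optimizer = tf.train.GradientDescentOptimizer(0.01).minimize(cost)

# A variable created with trainable=False is excluded from that collection
# and is therefore left untouched by the optimizer.
global_step = tf.Variable(0, trainable=False, name="global_step")

print(tf.trainable_variables())  # lists W and b only; global_step is absent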