
For the sake of the TF beginner (this is Example #2, after all), added a helpful comment about how the optimizer knows to modify W and b

Davy Durham 8 years ago
Parent
Commit
c0a784a04a
1 changed file with 1 addition and 0 deletions

+ 1 - 0
examples/2_BasicModels/linear_regression.py

@@ -38,6 +38,7 @@ pred = tf.add(tf.multiply(X, W), b)
 # Mean squared error
 cost = tf.reduce_sum(tf.pow(pred-Y, 2))/(2*n_samples)
 # Gradient descent
+#  Note, minimize() knows to modify W and b because Variable objects are trainable=True by default
 optimizer = tf.train.GradientDescentOptimizer(learning_rate).minimize(cost)
 
 # Initializing the variables
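The added comment is worth unpacking: `minimize()` differentiates the cost with respect to every variable created with `trainable=True` (the default for `tf.Variable`) and applies a gradient-descent step to each. Here is a minimal, dependency-free sketch of that mechanism for this exact model (pred = W*x + b, cost = MSE/2); the function name and defaults are illustrative, not part of the repository:

```python
def gradient_descent(xs, ys, learning_rate=0.01, epochs=1000):
    """Fit y = W*x + b by plain gradient descent, mirroring what
    GradientDescentOptimizer(learning_rate).minimize(cost) does."""
    n = len(xs)
    # W and b play the role of trainable tf.Variable objects; only
    # these get updated, which is what trainable=True means.
    W, b = 0.0, 0.0
    for _ in range(epochs):
        # cost = sum((W*x + b - y)^2) / (2*n), as in the diff above
        errs = [W * x + b - y for x, y in zip(xs, ys)]
        dW = sum(e * x for e, x in zip(errs, xs)) / n  # d(cost)/dW
        db = sum(errs) / n                             # d(cost)/db
        # Descend each trainable parameter along its negative gradient
        W -= learning_rate * dW
        b -= learning_rate * db
    return W, b

# Points lying exactly on y = 2x + 1; W and b should approach 2 and 1
W, b = gradient_descent([0.0, 1.0, 2.0, 3.0], [1.0, 3.0, 5.0, 7.0],
                        learning_rate=0.1, epochs=2000)
```

In the TF1 API used by this example, the same control is available by passing `trainable=False` when creating a variable, or by giving `minimize()` an explicit `var_list`, so only the listed variables are updated.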