
For the sake of the TF beginner (this is Example #2, after all), added a helpful comment about how the optimizer knows to modify W and b

Davy Durham committed 8 years ago
parent
commit c0a784a04a
1 changed file with 1 addition and 0 deletions
  1. examples/2_BasicModels/linear_regression.py  (+1, −0)

examples/2_BasicModels/linear_regression.py  +1 −0

@@ -38,6 +38,7 @@ pred = tf.add(tf.multiply(X, W), b)
 # Mean squared error
 cost = tf.reduce_sum(tf.pow(pred-Y, 2))/(2*n_samples)
 # Gradient descent
+#  Note, minimize() knows to modify W and b because Variable objects are trainable=True by default
 optimizer = tf.train.GradientDescentOptimizer(learning_rate).minimize(cost)
 
 # Initializing the variables
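
The comment added by this commit rests on a TF 1.x default worth spelling out: tf.Variable objects are created with trainable=True, and Optimizer.minimize() called without a var_list argument gathers every variable in the trainable-variables collection, which is how it knows to update W and b. The following is a minimal, self-contained sketch of that behavior using the same TF 1.x API as the example; the variable names, initial values, and the extra non-trainable variable are illustrative and not part of this commit.

import tensorflow as tf  # TF 1.x API, matching linear_regression.py

# Inputs and parameters (names and values here are illustrative)
X = tf.placeholder(tf.float32)
Y = tf.placeholder(tf.float32)
W = tf.Variable(0.3, name="weight")   # trainable=True by default
b = tf.Variable(-0.3, name="bias")    # trainable=True by default
c = tf.Variable(1.0, name="frozen", trainable=False)  # opted out of training

pred = tf.add(tf.multiply(X, W), b)
cost = tf.reduce_mean(tf.pow(pred - Y, 2))

# With no var_list, minimize() collects every trainable variable,
# i.e. W and b here, but not `frozen`.
optimizer = tf.train.GradientDescentOptimizer(learning_rate=0.01)
train_op = optimizer.minimize(cost)

print([v.op.name for v in tf.trainable_variables()])  # ['weight', 'bias']

# Equivalent, with the variables listed explicitly:
# train_op = optimizer.minimize(cost, var_list=[W, b])

Passing var_list explicitly gives the same result in this case; leaving it out, as linear_regression.py does, simply defers to the trainable-variables collection.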