
output layer activation, add fc2 in call (#358)

softmax was applied to the output layer during training, and the fc2 layer was unused in the forward pass

Co-authored-by: Aymeric Damien <aymeric.damien@gmail.com>
Nikhil Kilari 5 years ago
parent
commit e7353b7761
1 changed file with 2 additions and 2 deletions
  1. tensorflow_v2/notebooks/3_NeuralNetworks/neural_network.ipynb  +2 -2

+2 -2  tensorflow_v2/notebooks/3_NeuralNetworks/neural_network.ipynb

@@ -111,12 +111,12 @@
     "        # First fully-connected hidden layer.\n",
     "        self.fc2 = layers.Dense(n_hidden_2, activation=tf.nn.relu)\n",
     "        # Second fully-connecter hidden layer.\n",
-    "        self.out = layers.Dense(num_classes, activation=tf.nn.softmax)\n",
+    "        self.out = layers.Dense(num_classes)\n",
     "\n",
     "    # Set forward pass.\n",
     "    def call(self, x, is_training=False):\n",
     "        x = self.fc1(x)\n",
-    "        x = self.fc2(x)\n",
+    "        x = self.fc2(x)\n"
     "        x = self.out(x)\n",
     "        if not is_training:\n",
     "            # tf cross entropy expect logits without softmax, so only\n",