Please note that eager execution is now officially part of the main module, so there is no need for `import tensorflow.contrib.eager as tfe`. With that said, all occurrences of `tfe` should be changed to `tf`, since every method can now be accessed directly from `tf`.
```tfe.GradientTape()``` --> ```tf.GradientTape()```
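For example, here is a minimal sketch of the renamed API in eager mode (assuming a TF version where `tf.GradientTape` is available; the `enable_eager_execution` guard is only needed on TF 1.x, since TF 2.x is eager by default):

```python
import tensorflow as tf

# TF 1.x requires enabling eager execution; TF 2.x removed this function
# from the top level because eager is the default.
if hasattr(tf, "enable_eager_execution"):
    tf.enable_eager_execution()

x = tf.constant(3.0)
with tf.GradientTape() as tape:
    tape.watch(x)  # constants are not watched automatically, only variables
    y = x * x
print(float(tape.gradient(y, x)))  # dy/dx = 2x = 6.0 at x = 3
```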
Hi, it's me again.
I have a question about __reg_loss__.
The total_loss should equal __reg_loss__ + softmax_loss, but I do not know how to get __reg_loss__.
I have configured the Conv2D parameter (kernel_regularizer=tf.keras.regularizers.l2) like [this demo](https://github.com/tensorflow/tensorflow/blob/9e0b05bbc4bb88d1b34fb2147429dc4ad7bd25cd/tensorflow/contrib/eager/python/examples/densenet/densenet.py#L49), but when the loss of this model is computed in [compute loss](https://github.com/tensorflow/tensorflow/blob/9e0b05bbc4bb88d1b34fb2147429dc4ad7bd25cd/tensorflow/contrib/eager/python/examples/densenet/densenet_test.py#L105), only softmax_loss is computed. Where is __reg_loss__, and how do I get it for computing the gradients?
I have also tried the tf.losses.get_total_loss() API, but I do not know whether it helps here.
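One way this is usually handled (not taken from the demo, just a sketch with a hypothetical toy model): a Keras layer built with `kernel_regularizer` appends its regularization term to `model.losses`, so you can sum that list and add it to the softmax loss inside the tape:

```python
import tensorflow as tf

if hasattr(tf, "enable_eager_execution"):  # TF 1.x only; TF 2.x is eager
    tf.enable_eager_execution()

# Hypothetical toy model with one L2-regularized conv layer.
model = tf.keras.Sequential([
    tf.keras.layers.Conv2D(
        4, 3, kernel_regularizer=tf.keras.regularizers.l2(1e-4),
        input_shape=(8, 8, 1)),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(10),
])

images = tf.zeros([2, 8, 8, 1])
labels = tf.constant([1, 3])

with tf.GradientTape() as tape:
    logits = model(images)
    softmax_loss = tf.reduce_mean(
        tf.nn.sparse_softmax_cross_entropy_with_logits(
            labels=labels, logits=logits))
    # model.losses holds one tensor per regularized layer.
    reg_loss = tf.add_n(model.losses)
    total_loss = softmax_loss + reg_loss

# Gradients now include the regularization term.
grads = tape.gradient(total_loss, model.variables)
```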
Thank you for these great tutorials. I like your style of putting the model and all of its training functions in one class.
However, I was not able to find a way to load one of the pretrained TF/Keras models into such a class. Do you have an idea how to accomplish this?
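One pattern that might work (just a sketch, not from the tutorial): keep a pretrained `tf.keras.applications` model as an attribute of your custom `tf.keras.Model` subclass and call it as the backbone. The class name `TransferModel` and the choice of MobileNetV2 are my own; pass `weights='imagenet'` to actually load the pretrained weights (here `weights=None` keeps the sketch offline):

```python
import tensorflow as tf

class TransferModel(tf.keras.Model):
    """Hypothetical wrapper: pretrained backbone plus a new classifier head."""

    def __init__(self, num_classes):
        super(TransferModel, self).__init__()
        # Use weights='imagenet' to load pretrained weights (downloads them);
        # weights=None here only avoids network access in this sketch.
        self.backbone = tf.keras.applications.MobileNetV2(
            include_top=False, weights=None, input_shape=(96, 96, 3))
        self.backbone.trainable = False  # freeze the pretrained layers
        self.pool = tf.keras.layers.GlobalAveragePooling2D()
        self.classifier = tf.keras.layers.Dense(num_classes)

    def call(self, inputs, training=False):
        x = self.backbone(inputs, training=training)
        x = self.pool(x)
        return self.classifier(x)

model = TransferModel(num_classes=5)
logits = model(tf.zeros([1, 96, 96, 3]))
```

The training loop from the tutorial should then work unchanged, since the wrapper is itself a `tf.keras.Model` and exposes its trainable variables.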