Merge pull request #542 from AyuavnGautam/master
Tensorboard Visualisation
MukulCode authored Oct 31, 2022
2 parents a13c1f5 + 4b22a62 commit 04387fb
Showing 1 changed file with 105 additions and 0 deletions: Tensorboard Visualisation/tensorboard_visualization.py
# -*- coding: utf-8 -*-
"""Tensorboard Visualization
Automatically generated by Colaboratory.
Original file is located at
https://colab.research.google.com/drive/1csbf3PrSm7OipUtCdmNAzXmco0JdthJY
# Get started with TensorBoard
In machine learning, to improve something you often need to be able to measure it. TensorBoard is a tool for providing the measurements and visualizations needed during the machine learning workflow. It enables tracking experiment metrics like loss and accuracy, visualizing the model graph, projecting embeddings to a lower dimensional space, and much more.
This quickstart will show how to quickly get started with TensorBoard. The remaining guides in this website provide more details on specific capabilities, many of which are not included here.
"""

# Commented out IPython magic to ensure Python compatibility.
# Load the TensorBoard notebook extension
# %load_ext tensorboard

import tensorflow as tf
import datetime

# Clear any logs from previous runs
!rm -rf ./logs/

"""Using the [MNIST](https://en.wikipedia.org/wiki/MNIST_database) dataset as the example, normalize the data and write a function that creates a simple Keras model for classifying the images into 10 classes."""

mnist = tf.keras.datasets.mnist

(x_train, y_train),(x_test, y_test) = mnist.load_data()
x_train, x_test = x_train / 255.0, x_test / 255.0
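
The division by 255.0 rescales the uint8 pixel intensities into the [0, 1] range. A minimal NumPy sketch of that step (using a small fake array in place of the MNIST download) is:

```python
import numpy as np

# A tiny stand-in for an MNIST image batch: uint8 pixels in [0, 255]
pixels = np.array([[0, 127, 255]], dtype=np.uint8)

# Same normalization as x_train / 255.0: promotes to float64 and scales to [0, 1]
scaled = pixels / 255.0

print(scaled.min(), scaled.max())  # 0.0 1.0
```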

def create_model():
    return tf.keras.models.Sequential([
        tf.keras.layers.Flatten(input_shape=(28, 28)),
        tf.keras.layers.Dense(512, activation='relu'),
        tf.keras.layers.Dropout(0.2),
        tf.keras.layers.Dense(10, activation='softmax')
    ])

"""## Using TensorBoard with Keras Model.fit()
When training with Keras's [Model.fit()](https://www.tensorflow.org/api_docs/python/tf/keras/models/Model#fit), adding the `tf.keras.callbacks.TensorBoard` callback ensures that logs are created and stored. Additionally, enable histogram computation every epoch with `histogram_freq=1` (this is off by default).
Place the logs in a timestamped subdirectory to allow easy selection of different training runs.
"""

model = create_model()
model.compile(optimizer='adam',
              loss='sparse_categorical_crossentropy',
              metrics=['accuracy'])

log_dir = "logs/fit/" + datetime.datetime.now().strftime("%Y%m%d-%H%M%S")
tensorboard_callback = tf.keras.callbacks.TensorBoard(log_dir=log_dir, histogram_freq=1)
model.fit(x=x_train,
          y=y_train,
          epochs=5,
          validation_data=(x_test, y_test),
          callbacks=[tensorboard_callback])

# List the physical devices (CPU/GPU/TPU) visible to TensorFlow
tf.config.list_physical_devices()

"""Start TensorBoard through the command line or within a notebook experience. The two interfaces are generally the same. In notebooks, use the `%tensorboard` line magic. On the command line, run the same command without "%"."""
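
The command-line equivalent of the `%tensorboard` magic (assuming the `tensorboard` package is installed and on your PATH) is:

```shell
# Start TensorBoard against the log directory written by the callbacks;
# it serves the dashboards on http://localhost:6006 by default.
tensorboard --logdir logs/fit
```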

# A second, convolutional model, logged as a separate run
model = tf.keras.Sequential()
model.add(tf.keras.layers.Conv2D(32, (3, 3), activation='relu', input_shape=(28, 28, 1)))
model.add(tf.keras.layers.MaxPooling2D((2, 2)))
model.add(tf.keras.layers.Conv2D(64, (3, 3), activation='relu'))
model.add(tf.keras.layers.MaxPooling2D((2, 2)))
model.add(tf.keras.layers.Conv2D(64, (3, 3), activation='relu'))
model.add(tf.keras.layers.Flatten())
model.add(tf.keras.layers.Dense(512, activation='relu'))
model.add(tf.keras.layers.Dropout(0.2))
model.add(tf.keras.layers.Dense(10, activation='softmax'))
model.compile(optimizer='adam',
              loss='sparse_categorical_crossentropy',
              metrics=['accuracy'])
log_dir = "logs/fit/" + datetime.datetime.now().strftime("%Y%m%d-%H%M%S")
tensorboard_callback = tf.keras.callbacks.TensorBoard(log_dir=log_dir, histogram_freq=1)

model.fit(x=x_train[..., tf.newaxis],  # add a channel dimension for Conv2D
          y=y_train,
          epochs=5,
          validation_data=(x_test[..., tf.newaxis], y_test),
          callbacks=[tensorboard_callback])

# Commented out IPython magic to ensure Python compatibility.
# %tensorboard --logdir logs/fit

"""
A brief overview of the dashboards shown (tabs in top navigation bar):
* The **Scalars** dashboard shows how the loss and metrics change with every epoch. You can also use it to track training speed, learning rate, and other scalar values.
* The **Graphs** dashboard helps you visualize your model. In this case, the Keras graph of layers is shown which can help you ensure it is built correctly.
* The **Distributions** and **Histograms** dashboards show the distribution of a Tensor over time. This can be useful to visualize weights and biases and verify that they are changing in an expected way.
Additional TensorBoard plugins are automatically enabled when you log other types of data. For example, the Keras TensorBoard callback lets you log images and embeddings as well. You can see what other plugins are available in TensorBoard by clicking on the "inactive" dropdown towards the top right.
"""
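
Beyond what `Model.fit()` logs automatically, custom scalars can be written with the `tf.summary` API; a minimal sketch (the run directory name `logs/custom` is an arbitrary choice here) might look like:

```python
import tensorflow as tf

# Create a writer for a separate run directory; TensorBoard picks it up
# as another run under the same --logdir root.
writer = tf.summary.create_file_writer("logs/custom")

with writer.as_default():
    for step in range(5):
        # Log an arbitrary scalar; it shows up in the Scalars dashboard
        tf.summary.scalar("my_metric", 0.9 ** step, step=step)
writer.flush()
```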
