Tags: python, tensorboard

Why isn't TensorBoard showing any graphs even though I have log files?


When I open TensorBoard, I get a window that says "no dashboards are active for the current data set" even though my TensorBoard log directory has files in it.

Here's the command I'm using to start TensorBoard:

tensorboard --logdir tf_logs/

The tf_logs directory contains these folders and files:

run-20190609234531
   events.out.tfevents.1560125157.BRUBIN
run-20190610010816
   events.out.tfevents.1560128897.BRUBIN
run-20190610010949
   events.out.tfevents.1560128989.BRUBIN

Here's the code I used to create the log files (the add_summary call is near the end).

import datetime
import numpy as np
import sklearn
import tensorflow as tf

from datetime import datetime
from sklearn.datasets import fetch_california_housing
from sklearn.preprocessing import StandardScaler   

def fetch_batch(epoch, batch_index, batch_size):
    np.random.seed(epoch * n_batches + batch_index)
    indices = np.random.randint(m, size=batch_size)
    X_batch = scaled_housing_data_plus_bias[indices]
    y_batch = housing.target.reshape(-1, 1)[indices]
    return X_batch, y_batch

now = datetime.utcnow().strftime("%Y%m%d%H%M%S")
root_logdir = "tf_logs"
logdir = "{}/run-{}/".format(root_logdir, now)

housing = fetch_california_housing()
m, n = housing.data.shape
housing_data_plus_bias = np.c_[np.ones((m, 1)), housing.data]
scaler = StandardScaler(copy = True)
scaled_housing_data = scaler.fit_transform(housing.data)
scaled_housing_data_plus_bias = np.c_[np.ones((m, 1)), scaled_housing_data]

X = tf.placeholder(tf.float32, shape=(None, n + 1), name="X")
y = tf.placeholder(tf.float32, shape=(None, 1), name="y")

theta = tf.Variable(tf.random_uniform([n + 1, 1], -1.0, 1.0, seed=42), name="theta")
y_pred = tf.matmul(X, theta, name="predictions")
error = y_pred - y
mse = tf.reduce_mean(tf.square(error), name="mse")
optimizer = tf.train.GradientDescentOptimizer(learning_rate=learning_rate)
training_op = optimizer.minimize(mse)

init = tf.global_variables_initializer()

batch_size = 100
learning_rate = 0.01
n_epochs = 1000
n_batches = int(np.ceil(m / batch_size))

mse_summary = tf.summary.scalar('MSE', mse)
file_writer = tf.summary.FileWriter(logdir, tf.get_default_graph())

with tf.Session() as sess:
    sess.run(init)

    for epoch in range(n_epochs):
        for batch_index in range(n_batches):
            X_batch, y_batch = fetch_batch(epoch, batch_index, batch_size)
            if batch_index % 10 == 0:
                summary_str = mse_summary.eval(feed_dict={X: X_batch, y: y_batch}) 
                step = epoch * n_batches + batch_index

                ##### Write the Tensorboard log #####
                file_writer.add_summary(summary_str, step)  

            sess.run(training_op, feed_dict={X: X_batch, y: y_batch})

    best_theta = theta.eval()

    file_writer.close()
    sess.close()    

Why is TensorBoard not showing these graphs?


Solution

  • I ran your code and found one error:

    learning_rate = 0.01
    

    is defined after it's used.

    I moved the definition above the point where it's used, and the code ran fine; I also ran TensorBoard, and it showed me the graph and the scalars (see the sketch at the end of this answer).

    However, if that is not your problem, I can think of only one other cause:

    You have to run tensorboard from the directory that contains the tf_logs directory.
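
    For reference, here is a minimal sketch of that reordering, assuming the TF 1.x API used in the question; the feature count n is a placeholder value here purely for illustration, not taken from the housing data:

    import tensorflow as tf

    n = 8                  # placeholder feature count, just for this sketch
    learning_rate = 0.01   # defined BEFORE the optimizer that consumes it

    # Graph construction: the optimizer below reads learning_rate, so the
    # variable must already exist at this point in the script.
    X = tf.placeholder(tf.float32, shape=(None, n + 1), name="X")
    y = tf.placeholder(tf.float32, shape=(None, 1), name="y")
    theta = tf.Variable(tf.random_uniform([n + 1, 1], -1.0, 1.0, seed=42), name="theta")
    y_pred = tf.matmul(X, theta, name="predictions")
    mse = tf.reduce_mean(tf.square(y_pred - y), name="mse")
    optimizer = tf.train.GradientDescentOptimizer(learning_rate=learning_rate)
    training_op = optimizer.minimize(mse)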