Tags: python, tensorboard, word-embedding

Why does the TensorBoard display the wrong cosine distance?


I want to visualize word embeddings in the TensorBoard Projector, but the cosine distances don't seem right.

If I compute the cosine distances via sklearn, I get different results.

Am I using the TensorBoard Projector wrong?

TensorBoard: https://i.sstatic.net/3lEv0.png

Sklearn: https://i.sstatic.net/QGilv.png
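
A minimal sketch of what the sklearn side of this comparison might look like, assuming category_embeddings maps each category name to a single 300-dimensional vector as in the script below:

import numpy as np
from sklearn.metrics.pairwise import cosine_distances

names = list(category_embeddings.keys())
vectors = np.asarray([category_embeddings[name][0] for name in names])

# Pairwise cosine distances in the original 300-dimensional space.
distances = cosine_distances(vectors)
print(names[0], '->', names[1], distances[0, 1])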

import os
import numpy as np
import tensorflow as tf
from tensorflow.contrib.tensorboard.plugins import projector

LOG_DIR = 'logs'
metadata = os.path.join(LOG_DIR, 'metadata.tsv')

arr = []

# category_embeddings -> dictionary
# category_embeddings["Category 1"] -> array([[...,...,...,...,]]) # 300 dimensions

# Collect one 300-dimensional vector per category.
for category in category_embeddings:
    arr.append(category_embeddings[category][0])
embds_arr = np.asarray(arr)

# One metadata label per embedding row; iterating the same dict keeps
# the label order aligned with the rows of embds_arr.
with open(metadata, 'w', encoding="utf-8") as metadata_file:
    for key in category_embeddings.keys():
        metadata_file.write(key + "\n")

embds = tf.Variable(embds_arr, name='embeds')

with tf.Session() as sess:  
    saver = tf.train.Saver([embds])

    sess.run(embds.initializer)
    saver.save(sess, os.path.join(LOG_DIR, 'category.ckpt'))

    config = projector.ProjectorConfig()
    embedding = config.embeddings.add()
    embedding.tensor_name = embds.name
    embedding.metadata_path = metadata

    projector.visualize_embeddings(tf.summary.FileWriter(LOG_DIR), config)
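
After running the script, launch TensorBoard on the log directory (tensorboard --logdir logs) and open the Projector tab; the cosine option in the nearest-neighbors panel is what the screenshot above compares against sklearn.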

Solution

  • Solved.

    I tested it with different datasets and training cycles, and it seems to be a bug within TensorBoard. Sklearn returns the correct results for the original vector space; TensorBoard possibly calculates the distance from a reduced-dimensionality projection.

    https://github.com/tensorflow/tensorboard/issues/2421
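
    To illustrate how a reduced-dimensionality projection can change cosine distances, here is a small self-contained sketch; it uses random vectors and PCA purely as a stand-in for whatever projection TensorBoard might apply internally (an assumption, since the linked issue does not pin down the exact cause):

    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.metrics.pairwise import cosine_distances

    rng = np.random.default_rng(0)
    vectors = rng.normal(size=(20, 300))  # toy stand-ins for 300-dim embeddings

    # Cosine distances in the original space vs. after projecting to 3 dims.
    full = cosine_distances(vectors)
    reduced = cosine_distances(PCA(n_components=3).fit_transform(vectors))

    # Nearest-neighbor orderings generally disagree between the two spaces.
    print(np.argsort(full[0])[:5])
    print(np.argsort(reduced[0])[:5])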