apache-spark, pyspark, databricks

Failed to load preview: Notebook size exceeded the byte limit


Due to some large plotly plots in my Databricks notebook, I'm exceeding the file size limit of 10 MB and can't work with the notebook anymore.

How can I increase the file size limit? I assumed some SparkConf setting was responsible, but I couldn't find a corresponding setting in the docs.


Solution

  • I found the answer myself in the docs, under "Limits & FAQ for Git integration with Databricks Git folders":

    Files larger than 10 MB can’t be viewed in the Databricks UI.
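So the 10 MB limit is a hard UI limit, not a configurable Spark setting. A practical workaround is to keep the heavy plot output out of the notebook source entirely. Below is a minimal sketch that writes a plotly figure to DBFS instead of rendering it inline; the example data and the /dbfs/FileStore path are illustrative assumptions, not part of the original answer:

```python
import plotly.express as px

# Example figure; the data here is purely illustrative.
fig = px.scatter(x=[1, 2, 3], y=[4, 5, 6])

# Rendering the figure inline embeds its full HTML/JSON in the
# notebook source, which is what pushes the file past 10 MB.
# Writing it to DBFS keeps the notebook itself small.
fig.write_html("/dbfs/FileStore/plots/my_plot.html")  # path is an assumption

# Files under /FileStore are served by the workspace, e.g.:
#   https://<databricks-instance>/files/plots/my_plot.html
```

Clearing the notebook's existing cell outputs should also shrink it back under the limit, since rendered results count toward the notebook's size.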