python · databricks · azure-databricks · hyper-api

I/O error while accessing file:/dbfs/my_hyper.hyper: SIGBUS


I'm trying to write Tableau's .hyper file to a directory in Databricks.

However, it fails with:

The database "hyper.file:/dbfs/my_hyper.hyper" could not be created: I/O error while accessing file:/dbfs/my_hyper.hyper: SIGBUS

Why is this happening? I face no issues when writing other file types, but the error persists with .hyper files.

Is this a permissions issue or a bug?

Please advise. I'd be happy to provide additional info.


Solution

  • Most probably this happens because DBFS doesn't support random writes (see the docs for the list of limitations), and the Hyper engine relies on them when building the database file. The workaround is to write the file to the local disk first, e.g. under /tmp/, and then copy or move it onto DBFS with dbutils.fs.cp or dbutils.fs.mv (see the docs), as sketched below.
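
A rough sketch of that workaround, assuming the tableauhyperapi package is installed on the cluster and that this runs in a Databricks notebook where dbutils is available; the table name, columns, and rows are purely illustrative:

```python
from tableauhyperapi import (
    HyperProcess, Telemetry, Connection, CreateMode,
    TableDefinition, TableName, SqlType, Inserter,
)

local_path = "/tmp/my_hyper.hyper"   # local driver disk: supports random writes
dbfs_path = "dbfs:/my_hyper.hyper"   # final destination on DBFS

# 1. Build the .hyper file on the driver's local disk.
with HyperProcess(telemetry=Telemetry.DO_NOT_SEND_USAGE_DATA_TO_TABLEAU) as hyper:
    with Connection(endpoint=hyper.endpoint,
                    database=local_path,
                    create_mode=CreateMode.CREATE_AND_REPLACE) as connection:
        table = TableDefinition(
            table_name=TableName("Extract", "Extract"),
            columns=[
                TableDefinition.Column("id", SqlType.int()),
                TableDefinition.Column("name", SqlType.text()),
            ],
        )
        connection.catalog.create_schema(schema=table.table_name.schema_name)
        connection.catalog.create_table(table)
        with Inserter(connection, table) as inserter:
            inserter.add_rows([(1, "foo"), (2, "bar")])
            inserter.execute()

# 2. Copy the finished file onto DBFS in a single sequential write.
dbutils.fs.cp(f"file:{local_path}", dbfs_path)
```

The copy step works because dbutils.fs.cp streams the finished file sequentially, which DBFS handles fine; it's the Hyper engine's in-place random writes against /dbfs/ that trigger the SIGBUS.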