If a PySpark DataFrame reads data from a table and writes it out to Azure Delta Lake, can we add comments to the newly written table? For example:
df = spark.sql("select * from table1")  # doing some manipulation on this table data
df.write.mode("overwrite").format("delta") \
    .option("overwriteSchema", "true").save(newfolder)
To set a comment on the table, use the COMMENT ON TABLE SQL command:
spark.sql(f"COMMENT ON TABLE delta.`{newfolder}` IS 'my comment'")
Note that we use the special syntax delta.`path` to refer to a Delta table by its path rather than by a catalog name.
If you want to set a comment on a specific column, use the ALTER TABLE ... ALTER COLUMN SQL command:
col_name = "abc"
spark.sql(f"ALTER TABLE delta.`{newfolder}` ALTER COLUMN {col_name} COMMENT 'my comment'")
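Since both commands are built with f-strings, a comment containing a single quote would break the generated SQL. A minimal sketch of helper functions that construct these statements and escape quotes (the helper names `table_comment_sql` and `column_comment_sql` are my own, not part of any API):

```python
def table_comment_sql(path: str, comment: str) -> str:
    # Double up single quotes so the comment is a valid SQL string literal
    safe = comment.replace("'", "''")
    return f"COMMENT ON TABLE delta.`{path}` IS '{safe}'"

def column_comment_sql(path: str, col: str, comment: str) -> str:
    safe = comment.replace("'", "''")
    return f"ALTER TABLE delta.`{path}` ALTER COLUMN {col} COMMENT '{safe}'"
```

You would then run, for example, `spark.sql(table_comment_sql(newfolder, "my comment"))`. To check the result, `DESCRIBE DETAIL delta.`path`` shows the table-level description, and `DESCRIBE TABLE` lists per-column comments.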