I have a table whose structure is roughly as follows:
CREATE TABLE keyspace_name.table_name (
id text PRIMARY KEY,
type text,
bool_yn boolean,
created_ts timestamp,
modified_ts timestamp
)
Recently I added a new column to the table:
alter table keyspace_name.table_name add first_name text;
When I query the new column in cqlsh, it returns results as expected. For example:
select first_name from keyspace_name.table_name limit 10;
But when I run the same query in dse spark-sql, it gives the following error:
Error in query: cannot resolve 'first_name' given input columns: [id, type, bool_yn, created_ts, modified_ts];
I don't know what's wrong in spark-sql. I've tried nodetool repair, but the problem still persists.
Any help would be appreciated. Thanks.
If the table schema changes, the Spark metastore doesn't automatically refresh to reflect them. Manually remove the stale table definition from spark-sql with a DROP TABLE command, then run SHOW TABLES. The table will be re-registered automatically with the latest schema. This does not change the data in Cassandra; it only rebuilds the metastore entry.
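The steps above can be sketched in the dse spark-sql shell as follows (table and column names are taken from the question; behavior assumes DSE's automatic registration of Cassandra tables in the Spark metastore):

```sql
-- Drop only the stale metastore entry; the Cassandra data is untouched
DROP TABLE keyspace_name.table_name;

-- Listing tables triggers DSE to re-register the table with the current schema
SHOW TABLES;

-- The new column should now resolve
SELECT first_name FROM keyspace_name.table_name LIMIT 10;
```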