I want to insert a Python dictionary in the format below into a Cassandra database column.
data = {'key1': 25, 'key2': 10.12, 'key3': [1, 2, 3], 'key4': 'SomeText'}
As you can see, it's a mixture of elements of various data types. I am thinking about defining the column as
some_column=map()
and passing the dictionary during the insert operation. This dictionary will be inserted in a group insert along with 100 other variables. Is there any better or more memory-efficient way of doing this? I don't have access to the database yet, so I cannot execute the code above and check the result.
You can't insert multiple data types as values into the same map - you always need to declare one specific type for the keys and one for the values of a map.
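For illustration, a map column is declared with exactly one key type and one value type, so a schema like the following (table and column names are just examples) can hold only `text -> int` pairs:

```sql
-- Hypothetical table: every map entry must be text -> int
CREATE TABLE my_keyspace.my_table (
    id uuid PRIMARY KEY,
    some_column map<text, int>
);
```

A value like `10.12` or `[1, 2, 3]` simply cannot go into such a column.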
If you really need to store heterogeneous data there, you can declare the field as text, serialize your dictionary to JSON before inserting, and deserialize it when reading. (Or you can use the blob type and serialize/deserialize to/from a byte array with pickle, depending on your requirements.)
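A minimal sketch of the JSON approach, assuming a `text` column; the commented-out insert uses hypothetical table/column names and the DataStax driver's `session.execute`:

```python
import json

# The heterogeneous dictionary from the question
data = {'key1': 25, 'key2': 10.12, 'key3': [1, 2, 3], 'key4': 'SomeText'}

# Serialize to a JSON string that fits into a Cassandra `text` column
payload = json.dumps(data)

# Hypothetical insert with the cassandra-driver (names are assumptions):
# session.execute(
#     "INSERT INTO my_table (id, some_column) VALUES (%s, %s)",
#     (row_id, payload),
# )

# When reading the row back, deserialize the stored string
restored = json.loads(payload)
assert restored == data
```

Note that JSON keeps the data human-readable and queryable from any client, while pickle into a blob preserves Python-specific types (tuples, sets) but is only readable from Python.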