
Handling a huge number of fields in a Scalatra model


I'm building a REST API using Scalatra and Cassandra. My Cassandra data model has 1000+ fields. I need to read these fields into the Scalatra middleware and do a lot of JSON manipulation according to the business logic. What are the ways in which I can automatically/easily map the Cassandra fields -> Scalatra object -> JSON response?

Thanks in advance.


Solution

  • Cassandra 2.2 added JSON support, so you can use SELECT JSON and hand the result straight to Scalatra (see the sketch after the examples below).

    The SELECT statement has also been extended to support retrieval of rows in a JSON-encoded map format. The results for SELECT JSON will only include a single column named [json]. This column will contain the same JSON-encoded map representation of a row that is used for INSERT JSON. For example, if we have a table like the following:

    Let your schema be:

    CREATE TABLE users (
        id text PRIMARY KEY,
        age int,
        state text
    );
    

    You can use

    SELECT JSON * FROM users;
    

    The results will look like this:

    {"id": "user123", "age": 42, "state": "TX"}
    

    or you can use

    SELECT JSON id, writetime(age), ttl(state) as ttl FROM users;
    

    Output:

    {"id": "user123", "writetime(age)": 1434135381782986, "ttl": null}
    

    Source: http://www.datastax.com/dev/blog/whats-new-in-cassandra-2-2-json-support
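
    Since the whole row comes back as a single text column literally named [json], a Scalatra route can return that string as-is instead of binding 1000+ columns to a case class. Below is a minimal sketch, assuming the DataStax Java driver 3.x; the contact point, keyspace, table, servlet, and route names are placeholders:

    import com.datastax.driver.core.{Cluster, Session}
    import org.scalatra.ScalatraServlet

    // Sketch only: contact point and keyspace are placeholders.
    object Cassandra {
      val cluster: Cluster = Cluster.builder().addContactPoint("127.0.0.1").build()
      val session: Session = cluster.connect("my_keyspace")
    }

    // Mount e.g. in ScalatraBootstrap: context.mount(new UserServlet(Cassandra.session), "/*")
    class UserServlet(session: Session) extends ScalatraServlet {

      get("/users/:id") {
        contentType = "application/json"

        // SELECT JSON returns one row with a single column named "[json]"
        // that holds the whole row as a JSON-encoded map.
        val rs = session.execute("SELECT JSON * FROM users WHERE id = ?", params("id"))

        Option(rs.one())
          .map(_.getString("[json]"))                      // pass the JSON text through untouched
          .getOrElse(halt(404, """{"error": "not found"}"""))
      }
    }

    If the business logic does need to transform individual fields, that string can be parsed once into a json4s JValue (the scalatra-json module already uses json4s) and manipulated there, which still avoids declaring a 1000-field model class.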