Tags: java, azure, spring-data, azure-cosmosdb, azure-cloud-services

Azure Cosmos DB payload size limit


I have a Cosmos DB collection whose primary key is the id field. The problem is that my payload is larger than 2 MB, and Cosmos DB does not allow writes larger than 2 MB. If I split the payload into smaller chunks and write them to Cosmos DB, each write overrides the previous record with the same id. I want to add data under the same id, not overwrite whatever is already stored under that id. Is there a way to overcome this? Please see my structure below; I am saving the User object(s). Thanks.

import java.util.List;

import org.springframework.data.annotation.Id;

public class User {

    @Id
    private String id;          // document id: one User document per id
    private People people;      // whole People graph embedded in this one document
}

public class People {
    private List<Person> persons;   // list that can grow past the 2 MB document limit
}

JDK 1.8

Spring Boot
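
For reference, the sketch below assumes a standard Spring Data Cosmos write path; the UserRepository interface name is illustrative and not part of the question. With Spring Data Cosmos, save() effectively upserts, so saving a second User under the same id replaces the existing document rather than appending to it.

import com.azure.spring.data.cosmos.repository.CosmosRepository;

// Hypothetical repository used to persist User documents; a second save() with
// the same id replaces the previously written document instead of adding to it.
public interface UserRepository extends CosmosRepository<User, String> {
}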


Solution

  • If your array is very large it should be referenced, not embedded. Give your Person class its own id that you can use to look persons up, and insert them as individual documents rather than as an array embedded in one User document. Embedding is only appropriate for 1:1 or 1:few relationships. You can learn more here: Data modeling in Azure Cosmos DB. A minimal sketch of the referenced model follows the PS below.

    PS: If you're new to Cosmos DB, I would encourage you to read all the articles on modeling and partitioning. These are key to understanding how to design a database that will perform and scale.
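
    To make the referencing approach concrete, here is a minimal sketch assuming the azure-spring-data-cosmos annotations (@Container, @PartitionKey) and a derived-query repository. The PersonDocument and PersonRepository names, the userId field, and the "persons" container name are illustrative, not part of the original question.

    import java.util.List;

    import org.springframework.data.annotation.Id;
    import com.azure.spring.data.cosmos.core.mapping.Container;
    import com.azure.spring.data.cosmos.core.mapping.PartitionKey;
    import com.azure.spring.data.cosmos.repository.CosmosRepository;

    // Each person becomes its own document that references the owning user,
    // so no single document grows toward the 2 MB limit.
    @Container(containerName = "persons")
    public class PersonDocument {

        @Id
        private String personId;   // unique per person, so writes no longer collide

        @PartitionKey
        private String userId;     // reference back to the owning User document

        private Person person;     // the original Person payload

        // getters and setters omitted for brevity
    }

    // Persons are written one document at a time and read back for a given user
    // via a derived query on the reference field.
    public interface PersonRepository extends CosmosRepository<PersonDocument, String> {
        List<PersonDocument> findByUserId(String userId);
    }

    The two types are shown in one listing for brevity; in a real project each goes in its own file. Partitioning on userId keeps all of a user's persons in the same logical partition, so findByUserId stays a single-partition query.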