This is an interview question:
Suppose I have 100 trillion elements, each between 1 byte and 1 trillion bytes (~0.909 TiB) in size. How can I store them and access them very efficiently?
My thoughts: they want to test knowledge of handling large volumes of data efficiently. It is not a question with a single correct answer.
Should I save them into some special data structure?
Honestly, I have no idea how to approach this kind of open-ended question.
Any help is really appreciated.
It really depends on the data set in question. I think the point is for you to discuss the alternatives and describe the various pros and cons.
Perhaps you should answer their question with more questions!
The data structure you choose will depend on what kinds of trade-offs you are willing to make.
For example, if you only ever need to iterate over the set sequentially, perhaps you could use a linked list, as it has relatively small storage overhead.
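At this scale the "list" would live on disk rather than in memory, but the same idea applies: if access is purely sequential, you can avoid any per-element index by storing length-prefixed records and streaming through them. A minimal sketch (the 8-byte length prefix and file layout here are assumptions, not a standard format):

```python
import os
import struct
import tempfile

# Hypothetical on-disk layout: each record is an 8-byte little-endian
# length prefix followed by its payload. An 8-byte length comfortably
# covers elements up to 1 trillion bytes. Sequential iteration then
# needs no index at all -- the overhead is 8 bytes per element.

def write_records(path, payloads):
    with open(path, "wb") as f:
        for payload in payloads:
            f.write(struct.pack("<Q", len(payload)))  # length prefix
            f.write(payload)                          # raw bytes

def iter_records(path):
    with open(path, "rb") as f:
        while True:
            header = f.read(8)
            if not header:            # clean EOF
                break
            (length,) = struct.unpack("<Q", header)
            yield f.read(length)

path = os.path.join(tempfile.mkdtemp(), "data.bin")
write_records(path, [b"alpha", b"bee", b"c"])
records = list(iter_records(path))
print(records)  # [b'alpha', b'bee', b'c']
```

The trade-off is the same as with an in-memory linked list: iteration is cheap, but reaching the N-th element means reading past the N-1 before it.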
If instead you need random access, you might want to look into indexed structures such as hash tables or B-trees.
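One common way to get random access over variable-sized elements is to append the payloads to a data file and keep a separate index mapping each key to its (offset, length). The class below is a hypothetical sketch (names are mine, not from any library); at 100 trillion elements the index itself would of course also have to live on disk, e.g. in a B-tree or hash-partitioned structure:

```python
import os
import tempfile

# Sketch of an append-only record store with a key -> (offset, length)
# index. A lookup is then one seek plus one read, regardless of how
# large the file grows. Assumes keys are written at most once.

class RecordStore:
    def __init__(self, path):
        self.path = path
        self.index = {}           # key -> (offset, length)
        open(path, "wb").close()  # create/truncate the data file

    def put(self, key, payload):
        with open(self.path, "ab") as f:
            offset = f.tell()     # append position = current file size
            f.write(payload)
        self.index[key] = (offset, len(payload))

    def get(self, key):
        offset, length = self.index[key]
        with open(self.path, "rb") as f:
            f.seek(offset)        # jump straight to the record
            return f.read(length)

store = RecordStore(os.path.join(tempfile.mkdtemp(), "store.bin"))
store.put("a", b"hello")
store.put("b", b"world!")
print(store.get("b"))  # b'world!'
```

The trade-off relative to the sequential layout is the extra space and maintenance cost of the index, in exchange for O(1) (hash) or O(log n) (tree) lookups.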
TL;DR: It's all problem-dependent. There are many alternatives.
This is essentially the same problem that file systems and databases face.