Tags: c#, binary, binaryfiles, large-data

Sorting gigantic binary files with C#


I have a large file, roughly 400 GB in size, generated daily by an external closed system. It is a binary file with the following format:

byte[8]byte[4]byte[n]

Where n is equal to the int32 value of byte[4].

This file has no delimiters; to read the whole file you just repeat until EOF, with each "item" represented as byte[8]byte[4]byte[n].

The file looks like

byte[8]byte[4]byte[n]byte[8]byte[4]byte[n]...EOF

byte[8] is a 64-bit number representing a period of time expressed as .NET Ticks. I need to sort this file but can't seem to figure out the quickest way to do so.
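
For reference, a single item could be modeled roughly like this (the type and member names are mine, purely for illustration):

    using System;

    // Hypothetical model of one on-disk item:
    // [8 bytes: ticks][4 bytes: payload length n][n bytes: payload]
    struct Record
    {
        public long Ticks;     // byte[8], interpretable as DateTime ticks
        public byte[] Payload; // byte[n]

        // .NET Ticks map directly onto a DateTime.
        public DateTime Timestamp => new DateTime(Ticks);
    }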

Presently, I read to the end of the file, loading the Ticks and the byte[n] start and end positions of each item into a struct. After this, I sort the List in memory by the Ticks property, then open a BinaryReader, seek to each position in Ticks order, read the byte[n] value, and write it to an external file.
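
To make that concrete, here is a minimal sketch of that index-then-sort-then-seek approach; the file paths, the IndexEntry type, and the assumption that the output keeps the same record format are mine, not from the actual code:

    using System;
    using System.Collections.Generic;
    using System.IO;

    class TwoPassSort
    {
        // In-memory index entry: timestamp plus where the record lives on disk.
        struct IndexEntry
        {
            public long Ticks;   // byte[8] value
            public long Offset;  // start of the record in the input file
            public int Length;   // n, from the byte[4] field
        }

        static void Main()
        {
            const string inputPath = "input.bin";   // assumed path
            const string outputPath = "sorted.bin"; // assumed path

            // Pass 1: scan the file once and build the index.
            var index = new List<IndexEntry>();
            using (var reader = new BinaryReader(File.OpenRead(inputPath)))
            {
                long pos = 0, fileLen = reader.BaseStream.Length;
                while (pos < fileLen)
                {
                    long ticks = reader.ReadInt64();
                    int n = reader.ReadInt32();
                    index.Add(new IndexEntry { Ticks = ticks, Offset = pos, Length = n });
                    reader.BaseStream.Seek(n, SeekOrigin.Current); // skip the payload
                    pos += 8 + 4 + n;
                }
            }

            // Sort the index in memory by timestamp.
            index.Sort((a, b) => a.Ticks.CompareTo(b.Ticks));

            // Pass 2: random reads in sorted order, sequential writes to the output.
            using (var input = File.OpenRead(inputPath))
            using (var reader = new BinaryReader(input))
            using (var output = File.Create(outputPath))
            {
                foreach (var entry in index)
                {
                    input.Seek(entry.Offset, SeekOrigin.Begin);
                    byte[] record = reader.ReadBytes(8 + 4 + entry.Length);
                    output.Write(record, 0, record.Length);
                }
            }
        }
    }

The second pass is presumably where the time goes: seeking in Ticks order means essentially random reads scattered across the whole 400 GB file.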

At the end of the process I end up with a sorted binary file, but it takes FOREVER. I am using C# .NET and a pretty beefy server, but disk IO seems to be an issue.

Server Specs:

  • 2x 2.6 GHz Intel Xeon (Hex-Core with HT) (24-threads)
  • 32GB RAM
  • 500GB RAID 1+0
  • 2TB RAID 5

I've looked all over the internet and can only find examples where a huge file is 1GB (makes me chuckle).

Does anyone have any advice?


Solution

  • A great way to speed up this kind of file access is to memory-map the entire file into address space and let the OS take care of reading whatever bits from the file it needs to. So do the same thing as you're doing right now, except read from memory instead of using a BinaryReader/seek/read (a rough sketch follows below).

    You've got lots of main memory, so this should provide pretty good performance (as long as you're using a 64-bit OS).
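
    As a rough illustration of that idea (the file path and type names are assumptions, not part of the answer), reading a record at an arbitrary offset through a MemoryMappedViewAccessor replaces the BinaryReader seek/read:

        using System;
        using System.IO;
        using System.IO.MemoryMappedFiles;

        class MappedReads
        {
            static void Main()
            {
                const string inputPath = "input.bin"; // assumed path

                using (var mmf = MemoryMappedFile.CreateFromFile(
                           inputPath, FileMode.Open, null, 0, MemoryMappedFileAccess.Read))
                using (var view = mmf.CreateViewAccessor(0, 0, MemoryMappedFileAccess.Read))
                {
                    // Reading at an arbitrary offset replaces Seek + Read;
                    // the OS pages the relevant parts of the file in on demand.
                    long offset = 0; // would come from the sorted in-memory index
                    long ticks = view.ReadInt64(offset);
                    int n = view.ReadInt32(offset + 8);

                    var payload = new byte[n];
                    view.ReadArray(offset + 12, payload, 0, n);

                    Console.WriteLine($"{new DateTime(ticks):O} -> {n} payload bytes");
                }
            }
        }

    Mapping a 400 GB file this way requires a 64-bit process, which is why the 64-bit OS caveat above matters.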