
Write big blocks to a file with fwrite() (e.g. 1000000000 bytes)


I am attempting to write large blocks with fwrite(). So far the largest block I could write was 100000000 bytes (the real limit is probably somewhat higher; I did not try). When I try to write a block of size 1000000000, the output file is 0 bytes.

Is there any possibility to write blocks of e.g. 1000000000 bytes and greater?

I am using uint64_t to store these large numbers.

Thank you in advance!

Code from pastebin in comment: -zw

    char *pEnd;
    uint64_t uintBlockSize = strtoull(chBlockSize, &pEnd, 10);
    uint64_t uintBlockCount = strtoull(chBlockCount, &pEnd, 10);

    char *content = (char *) malloc(uintBlockSize * uintBlockCount);

    /*
        Create vfs.structure
    */
    FILE *storeFile;
    storeFile = fopen(chStoreFile, "w");
    if (storeFile != NULL)
    {
        uint64_t i = uintBlockCount;

        size_t check;

        /*
            Fill storeFile with empty blocks
        */
        while (i != 0)
        {
            fwrite(content, uintBlockSize, 1, storeFile);
            i--;
        }

Solution

  • You're assuming that the type your C library uses to represent object sizes and index memory (size_t) can hold the same range of values as uint64_t. This may not be the case!

    fwrite's man page indicates that the size of each block you write is limited by the size_t type. If you're on a 32-bit system, the block-size value you pass to fwrite will be converted from uint64_t to whatever the library's size_t is (uint32_t, for example), in which case a very large value will have its most significant bits silently discarded.