
C++ program running out of memory for large data


I am trying to solve an issue in a C++ program I wrote. I am basically running out of memory. The program is a cache simulator. There is a file which has memory addresses collected beforehand, like this:

Thread Address Type Size Instruction Pointer
0 0x7fff60000000 1 8 0x7f058c482af3

There can be 100-500 billion such entries. First, I am trying to read all those entries and store them in a vector. Also, while reading, I build up a set of these addresses (using a map) and store the sequence numbers of each address. Sequence number simply means the position of the address entry in the file (one address can be seen multiple times). For large inputs the program fails while doing this, with a bad_alloc error at around the 30 millionth entry. I guess I am running out of memory. Please advise on how I can circumvent this problem. Is there an alternative way to handle this kind of large data? Thank you very much! Sorry for the long post; I wanted to give some context and the actual code I am writing.
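For scale, here is a rough back-of-the-envelope estimate of the memory footprint around the point where it fails (the per-object sizes below are guesses for a typical 64-bit build, not measurements of my program):

#include <cstdint>
#include <iostream>

int main() {
  const double GiB = 1024.0 * 1024.0 * 1024.0;
  const std::uint64_t entries = 30000000;  // roughly where the bad_alloc appears

  // Assumed per-object costs (they vary by compiler and standard library):
  const std::uint64_t vec_elem  = 16;   // one Address object in the vector
  const std::uint64_t seq_num   = 8;    // one sequence number pushed per entry
  const std::uint64_t map_node  = 64;   // std::map node overhead per *distinct* line address
  const std::uint64_t deque_min = 512;  // first block each std::queue's deque allocates

  // Best case: addresses repeat heavily, so the map stays small.
  const double best = entries * double(vec_elem + seq_num) / GiB;
  // Worst case: nearly every entry touches a distinct cache line.
  const double worst = entries * double(vec_elem + seq_num + map_node + deque_min) / GiB;
  std::cout << "roughly " << best << " to " << worst
            << " GiB, plus the raw trace file buffer" << std::endl;
  return 0;
}

Even at 30 million entries this lands somewhere between well under one and well over ten gigabytes, before counting the buffer that holds the whole file.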

Below is the relevant code. ParseTraceFile() reads each line and calls StoreTokens(), which extracts the address and size and calls AddAddress(), which actually stores the address in a vector and a map. The class declaration is also given below. The second try block in AddAddress() (the one that fills the address set) is the one that throws the bad_alloc exception.

void AddressList::ParseTraceFile(const char* filename) {
  std::ifstream in_file;
  std::cerr << "Reading Address Trace File..." << std::endl;
  in_file.exceptions(std::ifstream::failbit | std::ifstream::badbit);
  char *contents = NULL;
  try {
    in_file.open(filename, std::ifstream::in | std::ifstream::binary);
    in_file.seekg(0, std::ifstream::end);
    std::streampos length(in_file.tellg());
    if (length < 0) {
      std::cerr << "Can not read input file length" << std::endl;
      throw ExitException(1);
    }
    contents = (new char[length]);
    in_file.seekg(0, std::ifstream::beg);
    in_file.read(contents, length);
    in_file.close();
    uint64_t linecount = 0, i = 0, lastline = 0, startline = 0;
    while (i < static_cast<uint64_t>(length)) {
      if ((contents[i] == '\n') or (contents[i] == EOF)) {
        contents[i] = '\0';
        lastline = startline;
        startline = i + 1;
        ++linecount;
        if (linecount > 1) {
          StoreTokens((contents + lastline), &linecount);
        }
      }
      ++i;
    }
  } catch (std::bad_alloc& e) {
    delete [] contents;
    std::cerr << "error allocating memory while parsing" << std::endl;
    throw;
  } catch (std::ifstream::failure &exc1) {
    if (!in_file.eof()) {
      delete[] contents;
      std::cerr << "error in reading address trace file" << exc1.what()
          << std::endl;
      throw ExitException(1);
    }
  }
  std::cerr << "Done" << std::endl;
}
//=========================================================    
void AddressList::StoreTokens(char* line, uint64_t * const linecount) {
  uint64_t address, size;
  char *token = strtok(line, " \t");
  uint8_t tokencount = 0;
  while (NULL != token) {
    ++tokencount;
    switch (tokencount) {
    case 1:
      break;
    case 2:
      address = strtoul(token, NULL, 16);
      break;
    case 3:
      break;
    case 4:
      size = strtoul(token, NULL, 0);
      break;
    case 5:
      break;
    default:
      break;
    }
    token = strtok(NULL, " \t");
  }
  AddAddress(address, size);
}
//================================================================
void AddressList::AddAddress(const uint64_t& byteaddr, const uint64_t& size) {

  //allocate memory for the address vector
  try {
    if ((sequence_no_ % kReserveCount) == 0) address_list_.reserve(kReserveCount);

  } catch (std::bad_alloc& e) {
    std::cerr
        << "error allocating memory for address trace vector, address count"
        << sequence_no_ << std::endl;
    throw;
  }
  uint64_t offset = byteaddr & (CacheParam::Instance()->LineSize() - 1);
  //lineaddress = byteaddr >> CacheParam::Instance()->BitsForLine();
  // this try block is for allocating memory for the address set and the queue it holds
  try {
    // splitter
    uint64_t templinesize = 0;
    do {
      Address temp_addr(byteaddr + templinesize);
      address_list_.push_back(temp_addr);
      address_set_[temp_addr.LineAddress()].push(sequence_no_++);
      templinesize = templinesize + CacheParam::Instance()->LineSize();
    } while (size + offset > templinesize);
  } catch (std::bad_alloc& e) {
    address_list_.pop_back();
    std::cerr
        << "error allocating memory for address trace set, address count"
        << sequence_no_ << std::endl;
    throw;
  }
}

//======================================================
typedef std::queue<uint64_t> TimeStampQueue;
typedef std::map<uint64_t, TimeStampQueue> AddressSet;
class AddressList {
public:
  AddressList(const char* tracefilename);
  bool Simulate(uint64_t *hit_count, uint64_t* miss_count);
  ~AddressList();

private:
  void AddAddress(const uint64_t& byteaddr, const uint64_t& size);
  void ParseTraceFile(const char* filename);
  void StoreTokens(char* line, uint64_t * const linecount);

  std::vector<Address> address_list_;
  AddressSet address_set_;
  uint64_t sequence_no_;
  CacheMemory cache_;

  AddressList (const AddressList&);
  AddressList& operator=(const AddressList&);
};

The output is like this:

Reading Cache Configuration File...
Cache parameters read...
Reading Address Trace File...
error allocating memory for address trace set, address count 30000000
error allocating memory while parsing


Solution

  • Since your datasets will be much larger than your memory, you will have to build an on-disk index. It is probably easiest to import the whole thing into a database and let it build the indexes for you (see the sketch below).
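A minimal sketch of that route using SQLite, streaming the trace line by line instead of reading it all into one buffer first. The table and column names are made up for illustration, the parsing assumes the five-column layout shown in the question, and error handling is trimmed to keep it short:

#include <cstdint>
#include <cstdio>
#include <cstdlib>
#include <fstream>
#include <sstream>
#include <string>
#include <sqlite3.h>

int main(int argc, char** argv) {
  if (argc < 3) {
    std::fprintf(stderr, "usage: %s trace.txt trace.db\n", argv[0]);
    return 1;
  }
  sqlite3* db = NULL;
  if (sqlite3_open(argv[2], &db) != SQLITE_OK) return 1;
  // One row per trace entry; seq is the sequence number (position in the file).
  sqlite3_exec(db,
      "PRAGMA synchronous=OFF;"
      "CREATE TABLE IF NOT EXISTS trace("
      "  seq     INTEGER PRIMARY KEY,"
      "  address INTEGER NOT NULL,"
      "  size    INTEGER NOT NULL);",
      NULL, NULL, NULL);
  sqlite3_stmt* ins = NULL;
  sqlite3_prepare_v2(db, "INSERT INTO trace(seq, address, size) VALUES(?,?,?);",
                     -1, &ins, NULL);
  std::ifstream in(argv[1]);
  std::string line;
  std::getline(in, line);  // skip the header line
  std::uint64_t seq = 0;
  sqlite3_exec(db, "BEGIN;", NULL, NULL, NULL);
  while (std::getline(in, line)) {
    std::istringstream fields(line);
    std::uint64_t thread, type, size;
    std::string address, ip;
    if (!(fields >> thread >> address >> type >> size >> ip)) continue;
    sqlite3_bind_int64(ins, 1, static_cast<sqlite3_int64>(seq++));
    sqlite3_bind_int64(ins, 2, static_cast<sqlite3_int64>(
        std::strtoull(address.c_str(), NULL, 16)));
    sqlite3_bind_int64(ins, 3, static_cast<sqlite3_int64>(size));
    sqlite3_step(ins);
    sqlite3_reset(ins);
    if (seq % 1000000 == 0)  // commit in batches so the inserts stay fast
      sqlite3_exec(db, "COMMIT; BEGIN;", NULL, NULL, NULL);
  }
  sqlite3_exec(db, "COMMIT;", NULL, NULL, NULL);
  // Let the database build the on-disk index over addresses,
  // instead of keeping a std::map of queues in RAM.
  sqlite3_exec(db, "CREATE INDEX IF NOT EXISTS idx_addr ON trace(address);",
               NULL, NULL, NULL);
  sqlite3_finalize(ins);
  sqlite3_close(db);
  return 0;
}

A query such as SELECT seq FROM trace WHERE address = ? then returns the sequence numbers for one address from the on-disk index, taking over the role of the address_set_ map that currently has to fit in RAM (store the cache-line-aligned address in the address column if that is the key you need).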