I have defined a class like this:
class myClass {
private:
    int count;
    string name;
public:
    myClass(int, string);
    ...
    ...
};
myClass::myClass(int c, string n)
{
    count = c;
    name = n;
}
...
...
I also have a *.txt file in which each line contains a name:
David
Jack
Peter
...
...
Now I read the file line by line, create a new object on the heap for each line, and store pointers to all the objects in a vector. The function looks like this:
vector<myClass*> myFunction(string fileName)
{
    vector<myClass*> r;
    myClass* obj;
    ifstream infile(fileName);
    string line;
    int count = 0;
    while (getline(infile, line))
    {
        obj = new myClass(count, line);
        r.push_back(obj);
        count++;
    }
    return r;
}
For small *.txt files I have no problem. However, sometimes my *.txt files contain more than a million lines, and in those cases the program is dramatically slow. Do you have any suggestions to make it faster?
First, find faster I/O than the standard streams. Reading line by line through an ifstream pays the stream overhead once per line, a million times over; reading the whole file in one go pays it once.
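For example, the whole file can be slurped into a single string with one bulk read. This is a sketch of one way to do it, not the only option:

#include <fstream>
#include <sstream>
#include <string>

// Read the entire file into one string with a single bulk copy,
// instead of a million small getline calls.
std::string slurp(const std::string& fileName)
{
    std::ifstream infile(fileName, std::ios::binary);
    std::ostringstream contents;
    contents << infile.rdbuf();   // one streambuf-to-streambuf copy
    return contents.str();
}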
Second, can you use string views instead of strings? std::string_view is C++17, but compatible implementations for C++11 and earlier are available everywhere.
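The idea, sketched below under the assumption that myClass is changed to store a std::string_view: keep the whole file alive in one buffer and let every object point into it, so no per-line string is ever allocated. The splitLines helper is my illustration, not code from the question:

#include <cstddef>
#include <string_view>
#include <vector>

// Split a buffer into per-line views. The views borrow the buffer's
// memory, so the buffer must outlive every view (and every myClass
// built from one).
std::vector<std::string_view> splitLines(std::string_view buffer)
{
    std::vector<std::string_view> lines;
    std::size_t start = 0;
    while (start < buffer.size())
    {
        std::size_t end = buffer.find('\n', start);
        if (end == std::string_view::npos)
            end = buffer.size();
        lines.push_back(buffer.substr(start, end - start));
        start = end + 1;
    }
    return lines;
}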
Third,
myClass::myClass(int c, string n) {
    count = c;
    name = n;
}
should read
myClass::myClass(int c, std::string n) :
    count(c),
    name(std::move(n))
{}
which would make a difference for long names, though none for short ones thanks to the small string optimization.
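To get the full benefit at the call site, the question's loop can move the line into the constructor as well. This is my suggested tweak to that loop, not code from the original:

while (getline(infile, line))
{
    // std::move hands line's buffer straight to the new object;
    // the next getline simply gives line a fresh (or reused) buffer.
    r.push_back(new myClass(count, std::move(line)));
    count++;
}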
Fourth, stop making vectors of pointers. Create vectors of values.
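Applied to the question's function, that might look like the sketch below: one vector owns the objects directly, emplace_back constructs each element in place, and there is nothing to delete afterwards:

#include <fstream>
#include <string>
#include <vector>

std::vector<myClass> myFunction(const std::string& fileName)
{
    std::vector<myClass> r;
    std::ifstream infile(fileName);
    std::string line;
    int count = 0;
    while (std::getline(infile, line))
    {
        // Construct in place; no per-object new, no later delete.
        r.emplace_back(count, std::move(line));
        count++;
    }
    return r;
}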
Fifth, failing that, find a more efficient way to allocate/deallocate the objects.
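If the vector<myClass*> interface has to stay, one cheap option is to own the objects in a std::deque, which allocates in chunks and never relocates its elements, and hand out pointers into it. A sketch, with the storage deque passed in by the caller (my arrangement, not the question's):

#include <deque>
#include <fstream>
#include <string>
#include <vector>

std::vector<myClass*> myFunction(const std::string& fileName,
                                 std::deque<myClass>& storage)
{
    std::vector<myClass*> r;
    std::ifstream infile(fileName);
    std::string line;
    int count = 0;
    while (std::getline(infile, line))
    {
        // deque allocates in large blocks and never moves existing
        // elements, so these pointers stay valid as it grows.
        storage.emplace_back(count, std::move(line));
        r.push_back(&storage.back());
        count++;
    }
    return r;
}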