I have been looking for a way to get the tensors of weights/parameters and biases for each layer of a network using the C++ API of the OpenVINO framework. I can't find anything in the documentation nor any example in the samples. How could I extract these tensors?
Thanks, César.
EDIT: Code for getting weights and biases separately:
for (auto&& layer : this->pImplementation->network) {
    weightsbuf << "Layer name: " << layer->name << std::endl;
    weightsbuf << "Parameters:" << std::endl;
    for (auto&& param : layer->params) {
        weightsbuf << '\t' << param.first << ": " << param.second << std::endl;
    }

    // Parse the comma-separated kernel size, e.g. "3,3" (assumes a convolution layer).
    std::vector<int> kernelvect;
    auto kernelsize = layer->params.at("kernel");
    std::stringstream ss(kernelsize);
    for (int i = 0; ss >> i;) {
        kernelvect.push_back(i);
        if (ss.peek() == ',')
            ss.ignore();
    }

    int noutputs = std::stoi(layer->params.at("output"));
    int nweights = kernelvect[0] * kernelvect[1] * noutputs;
    int nbias = noutputs;

    for (auto&& blob : layer->blobs) {
        // The first nweights values are treated as weights, the following nbias as biases.
        weightsbuf << '\t' << blob.first << ": ";
        for (int w = 0; w < nweights; ++w) {
            weightsbuf << blob.second->buffer().as<float*>()[w] << " ";
        }
        weightsbuf << std::endl;
        weightsbuf << '\t' << "biases: ";
        for (int b = 0; b < nbias; ++b) {
            weightsbuf << blob.second->buffer().as<float*>()[nweights + b] << " ";
        }
    }
    weightsbuf << std::endl;
}
It looks like there is no official example that shows this functionality; I haven't found anything like that either.
I implemented a basic sample which prints information about each layer of a network. Please take a look: https://github.com/ArtemSkrebkov/dldt/blob/askrebko/iterate-through-network/inference-engine/samples/cnn_network_parser/main.cpp
I believe the idea of how to use the API is clear from it.
The sample is based on the current state of the dldt repo (branch '2019', which corresponds to release 2019 R3.1).
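In case the link becomes unavailable, the core of the sample boils down to the following. This is only a minimal sketch against the 2019 R3.1 Inference Engine API (CNNNetReader, CNNNetwork, CNNLayer); the model file names are placeholders.

#include <inference_engine.hpp>
#include <iostream>

int main() {
    // Read the IR produced by the Model Optimizer (paths are placeholders).
    InferenceEngine::CNNNetReader reader;
    reader.ReadNetwork("model.xml");
    reader.ReadWeights("model.bin");
    InferenceEngine::CNNNetwork network = reader.getNetwork();

    // CNNNetwork exposes begin()/end(), so a range-based for visits every layer.
    for (auto&& layer : network) {
        std::cout << "Layer: " << layer->name << " (" << layer->type << ")" << std::endl;

        // String parameters taken from the IR, e.g. kernel, strides, output.
        for (auto&& param : layer->params) {
            std::cout << "  param " << param.first << " = " << param.second << std::endl;
        }

        // Trainable data: a map from blob name to Blob::Ptr.
        for (auto&& blob : layer->blobs) {
            std::cout << "  blob " << blob.first
                      << ", elements: " << blob.second->size() << std::endl;
        }
    }
    return 0;
}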
Another link that might be useful is the documentation for the CNNLayer class: https://docs.openvinotoolkit.org/latest/classInferenceEngine_1_1CNNLayer.html
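As for the element counts in your EDIT: each Blob already knows its own size, so instead of deriving the number of weights from the "kernel" and "output" parameters you can look the blobs up by name. For Convolution and FullyConnected layers the keys are conventionally "weights" and "biases" (an assumption here; check which keys your IR actually contains):

// Sketch: dump weights and biases by their conventional blob names.
auto wIt = layer->blobs.find("weights");
if (wIt != layer->blobs.end()) {
    const float* w = wIt->second->buffer().as<float*>();
    for (size_t i = 0; i < wIt->second->size(); ++i)
        weightsbuf << w[i] << " ";
    weightsbuf << std::endl;
}
auto bIt = layer->blobs.find("biases");
if (bIt != layer->blobs.end()) {
    const float* b = bIt->second->buffer().as<float*>();
    for (size_t i = 0; i < bIt->second->size(); ++i)
        weightsbuf << b[i] << " ";
    weightsbuf << std::endl;
}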