Tags: c++, system-verilog, system-verilog-dpi

Error in reading value from SV in C++ function using DPI


I am trying to pass a string from SV to a C++ function through DPI, but the value is not getting passed to the C++ function properly.

Code on the SV side:

    import "DPI" function string mainsha(string str);
    class scoreboard ;
     string text_i_cplus;
     string text_o_cplus; 
    text_i_cplus="abc";
    text_o_cplus=mainsha(text_i_cplus);

That is how I am sending the value to C++. On the C++ side I am receiving it as:

    extern "C" string mainsha(string input)
 {

    string output1 = sha256(input);

    cout << "sha256('"<< input << "'):" << output1 << endl;
   return output1;
 }

I get the correct output when I run the C++ program on its own, but from the simulation I get the following output on the console:

sha256(''):e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855

Can someone please suggest where I am going wrong, or what I am missing?


Solution

  • The DPI in the current SystemVerilog standard only supports the C language, not C++; you should be using import "DPI-C". Also, in C++ string is a class, while C only has arrays of char, so the function exposed to the simulator has to take and return const char * (a matching SystemVerilog import is sketched at the end of this answer):

      extern "C" const char * mainsha(const char * input)
     {
    
        string output1 = sha256(input); // not sure if you need a cast here
    
        cout << "sha256('"<< input << "'):" << output1 << endl;
       return output1.c_str();
     }
    

    If you are using ModelSim/Questa, there is a -dpiheader switch that automatically generates the DPI prototypes for you; include that generated header when you compile your C++ code. That way a mismatch shows up as a compile-time error instead of run-time behavior you have to debug.
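
    For completeness, a minimal sketch of the matching SystemVerilog side, reusing the names from the question: only the import specification changes, since SystemVerilog's string type already maps to const char * across the DPI boundary.

        // Only the import spec changes to "DPI-C"; the scoreboard class
        // that calls mainsha() can stay exactly as it is.
        import "DPI-C" function string mainsha(input string str);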