Tags: c++, rest, http, curl, restbed

Client only gets very limited response from REST service


I ran into the following problem, and I have no clue how or why it happens:

I have written a web service in C++ with restbed. I have also used an example client that should test the response.

client.cpp

#include <memory>
#include <future>
#include <cstdio>
#include <cstdlib>
#include <restbed>

using namespace std;
using namespace restbed;

void print( const shared_ptr< Response >& response )
{
    fprintf( stderr, "*** Response ***\n" );
    fprintf( stderr, "Status Code:    %i\n", response->get_status_code( ) );
    fprintf( stderr, "Status Message: %s\n", response->get_status_message( ).data( ) );
    fprintf( stderr, "HTTP Version:   %.1f\n", response->get_version( ) );
    fprintf( stderr, "HTTP Protocol:  %s\n", response->get_protocol( ).data( ) );

    for ( const auto header : response->get_headers( ) )
    {
        fprintf( stderr, "Header '%s' > '%s'\n", header.first.data( ), header.second.data( ) );
    }

    auto length = response->get_header( "Content-Length", 0 );

    Http::fetch( length, response );

    fprintf( stderr, "Body:           %.*s...\n\n", length, response->get_body( ).data( ) );
}

int main( ){
    auto request = make_shared<Request>(
            Uri("http://localhost:3030/apps/17?start_date=2017-02-01&end_date=2017-01-31&kpis=foo,bar"));
    request->set_header("Accept", "application/json");
    request->set_header("Host", "localhost");

    auto response = Http::sync(request);
    print(response);

    auto future = Http::async(request, [](const shared_ptr<Request>, const shared_ptr<Response> response){
        fprintf(stderr, "Printing async response\n");
        print(response);
    });

    future.wait();

    return EXIT_SUCCESS;
}

My service streams the response in chunks (or at least it is supposed to). First, it streams the parameters of the request, such as start_date, end_date and kpis. Second, it streams the requested data.

Here is the stream_result_parameter function:

void stream_result_parameter(std::shared_ptr<restbed::Session> session, const Parameter params,
                             const std::string endpoint, const std::string mime_type)
{
    std::stringstream       stream;

    if(mime_type.compare("application/json") == 0)
    {
        std::vector<std::string> kpis = params.get_kpis();
        stream << "\n{\n"
               << "\"result_parameter\":{\n"
               << "\"App\":" << params.get_app_id() << ",\n"
               << "\"start_date\":" << params.get_start_date() << ",\n"
               << "\"end_date\":" << params.get_end_date() << ",\n"
               << "\"Kpis\":[";

        for(std::vector<std::string>::iterator kpi = kpis.begin(); kpi != kpis.end(); ++kpi)
        {
            if(kpi == kpis.end()-1)
            {
                stream << *kpi << "]\n},";
            }
            else
            {
                stream << *kpi << ",";
            }

        }

    }
    else
    {
        if(endpoint.compare("app") == 0 )
        {
            stream << "Called App Endpoint App: "
                   << std::to_string(params.get_app_id())
                   << "\r\nStart Date: "
                   << params.get_start_date()
                   << "\r\nEnd Date: "
                   << params.get_end_date()
                   <<"\n";

        }
        else
        {
            stream << "Called Cohorts Endpoint App: "
                   << std::to_string(params.get_app_id())
                   << "\r\nStart Date: "
                   << params.get_start_date()
                   << "\r\nEnd Date: "
                   << params.get_end_date()
                   <<"\n";

        }
    }
    session->yield(200, "\r"+stream.str()+"\r",
                   { { "Content-Length", std::to_string( stream.str().length())},
                     { "Content-Type", mime_type },
                     { "Connection", "keep-alive" } });
}     
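
For reference, the two-step streaming described above is usually done with Transfer-Encoding: chunked instead of a Content-Length, yielding each piece with a continuation callback and ending with a zero-length chunk. Below is a minimal sketch modelled on restbed's transfer-encoding example; the handler name and chunk payloads are placeholders, and the exact yield/close overloads may differ between restbed versions:

    #include <memory>
    #include <restbed>

    using namespace std;
    using namespace restbed;

    // Sketch: stream a response in two pieces with chunked transfer encoding.
    // Each chunk is framed as "<hex size>\r\n<data>\r\n"; "0\r\n\r\n" terminates the body.
    void get_method_handler( const shared_ptr< Session > session )
    {
        session->yield( OK, "", { { "Transfer-Encoding", "chunked" } },
            [ ]( const shared_ptr< Session > session )
        {
            // first piece, e.g. the result parameters ("params" is 6 bytes -> size "6")
            session->yield( "6\r\nparams\r\n", [ ]( const shared_ptr< Session > session )
            {
                // final piece plus the terminating zero-length chunk ("data" is 4 bytes -> "4")
                session->close( "4\r\ndata\r\n0\r\n\r\n" );
            } );
        } );
    }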

Now, my problem occurred after I added the Content-Length: it simply stops there and closes the conversation between the client and the service. curl gives me the following output, with an excess warning:

*   Trying ::1...
* TCP_NODELAY set
* Connected to localhost (::1) port 3030 (#0)
> GET /apps/17?start_date=2017-02-01&end_date=2017-01-31&kpis=foo,bar HTTP/1.1
> Host: localhost:3030
> User-Agent: curl/7.54.0
> Accept:text/csv
> 
< HTTP/1.1 200 OK
< Connection: keep-alive
< Content-Length: 94
< Content-Type: text/csv
< 
* Excess found in a non pipelined read: excess = 2, size = 94, maxdownload = 94, bytecount = 0
Called App Endpoint App: 17
Start Date: Wed. February 1 2017
* Connection #0 to host localhost left intact
End Date: Tue. January 31 2017

Does the excess have anything to do with it?
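
For what it's worth, the two excess bytes line up with the two "\r" characters wrapped around the body in stream_result_parameter, which the advertised Content-Length (computed from stream.str() alone) does not cover. A standalone illustration of the arithmetic, with a 94-byte string standing in for stream.str():

    #include <iostream>
    #include <string>

    int main( )
    {
        const std::string chunk( 94, 'x' );               // stands in for stream.str( )
        const std::string payload = "\r" + chunk + "\r";  // what yield( ) actually sends

        std::cout << "advertised Content-Length: " << chunk.length( )   << '\n'   // 94
                  << "bytes on the wire:         " << payload.length( ) << '\n'   // 96
                  << "excess:                    " << payload.length( ) - chunk.length( ) << '\n'; // 2
        return 0;
    }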

Lastly, I want to show you the output from my test client and from curl if I take the Content-Length away.

client.cpp output:

*** Response ***
Status Code:    200
Status Message: OK
HTTP Version:   1.1
HTTP Protocol:  HTTP
Header 'Connection' > 'keep-alive'
Header 'Content-Type' > 'application/json'
Body:           ...

Printing async response
*** Response ***
Status Code:    200
Status Message: OK
HTTP Version:   1.1
HTTP Protocol:  HTTP
Header 'Connection' > 'keep-alive'
Header 'Content-Length' > '64'
Header 'Content-Type' > 'application/json'
Body:           
"result_set":{
"i":1,
"j values": [
{"j":1,"kpi_values":[1,1]}...

But curl gives me everything I need:

*   Trying ::1...
* TCP_NODELAY set
* Connected to localhost (::1) port 3030 (#0)
> GET /apps/17?start_date=2017-02-01&end_date=2017-01-31&kpis=foo,bar HTTP/1.1
> Host: localhost:3030
> User-Agent: curl/7.54.0
> Accept:text/csv
> 
< HTTP/1.1 200 OK
< Connection: keep-alive
< Content-Type: text/csv
* no chunk, no close, no size. Assume close to signal end
< 
Called App Endpoint App: 17
Start Date: Wed. February 1 2017
End Date: Tue. January 31 2017
1,1,1,1
1,2,1,2
1,3,1,3
1,4,1,4
1,5,1,0
1,6,1,1
1,7,1,2
1,8,1,3
1,9,1,4
1,10,1,0
2,1,2,1
2,2,2,2
2,3,2,3
2,4,2,4
2,5,2,0
2,6,2,1
2,7,2,2
2,8,2,3
2,9,2,4
2,10,2,0

(Please note that I did not want to copy 17,400 lines, so this is just part of the complete and correct output.)

Maybe I am violating some rule or missing something else, but I just can't think of it. Thanks in advance.

UPDATE:

The excess message is gone once I accounted for the "\r"s, but the response is still sent as complete and no more chunks can follow:

*   Trying ::1...
* TCP_NODELAY set
* Connected to localhost (::1) port 3030 (#0)
> GET /apps/17?start_date=2017-02-01&end_date=2017-01-31&kpis=foo,bar HTTP/1.1
> Host: localhost:3030
> User-Agent: curl/7.54.0
> Accept:text/csv
> 
< HTTP/1.1 200 OK
< Connection: keep-alive
< Content-Length: 96
< Content-Type: text/csv
< 
Called App Endpoint App: 17
Start Date: Wed. February 1 2017
End Date: Tue. January 31 2017
* Connection #0 to host localhost left intact
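
A Content-Length header tells the peer exactly how many body bytes make up the response, so once those bytes have arrived the message is complete and anything yielded afterwards on that response is never read. If the whole payload is available up front, one alternative (a sketch reusing the names from stream_result_parameter above, with stream assumed to already contain the parameters followed by the result set) is to send a single response whose declared length matches the bytes actually written:

    // Sketch: one complete response whose Content-Length matches the body exactly.
    const std::string body = stream.str( );

    session->close( 200, body,
                    { { "Content-Length", std::to_string( body.length( ) ) },
                      { "Content-Type",   mime_type },
                      { "Connection",     "keep-alive" } } );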
 

Solution

  • Once the response is received, your application will end. future.wait() only blocks until the async callback below has run, after which main returns:

    auto future = Http::async(request, [](const shared_ptr<Request>, const shared_ptr<Response> response){
        fprintf(stderr, "Printing async response\n");
        print(response);
    });
    
    future.wait();
    

    Please see the GitHub issue for the final result.