I discovered a memory leak while reading data via USB interrupt transfers using the synchronous libusb API. My simple user program does not use any dynamic memory allocation itself; internally, libusb makes heavy use of dynamic memory allocation. The communication flow works as expected. Is there a special function to free internal dynamic memory after using libusb_interrupt_transfer? Does anyone have an idea what causes the continuous increase of memory during runtime?
My protocol implements a two-way handshake, so a simple data exchange consists of an OUT (request), IN (Ack/Nack), IN (response) and OUT (Ack/Nack) transfer. The report size is 32 bytes, outEndpointAddr is 1 and inEndpointAddr is 129. Here are the relevant code snippets.
int main(void)
{
    uint32_t devFound = 0;
    uint32_t devErrors = 0;
    ...
    int libUsbErr = 0;
    if (!findSensor(&devFound, &devErrors, &libUsbErr, foundCB))
        printf("finding sensor failed %d\n", libUsbErr);
    if (!openSensor(mySensor, &libUsbErr))
        printf("open sensor failed %d\n", libUsbErr);
    int i = 0;
    while (1)
    {
        printf("[%06d] Int Temp %f C\n", i++, readIntTemper());
        Delay(0.5);
    }
    closeSensor(&mySensor, NULL); // note: unreachable while the loop above runs forever
    closeSensorContext();
    return 0;
}
float readIntTemper()
{
    static uint8_t tmp[32];
    static uint8_t response[32];
    int written = 0, read = 0;
    ... //Prepare request frame
    int libUsbErr = 0;
    if (!HID_Write(mySensor, tmp, &written, 4000, &libUsbErr))
    {
        printf("write request failed %d\n", libUsbErr);
        return 0;
    }
    //Read Ack / Nack
    if (!HID_Read(mySensor, tmp, &read, 4000, &libUsbErr))
    {
        printf("Read ACK NACK failed %d\n", libUsbErr);
        return 0;
    }
    ... //Test if Ack / Nack
    if (!HID_Read(mySensor, response, &read, 4000, &libUsbErr))
    {
        printf("Read response failed %d\n", libUsbErr);
        return 0;
    }
    ... //Prepare ACK
    if (!HID_Write(mySensor, tmp, &written, 4000, &libUsbErr))
    {
        printf("Ack response failed %d\n", libUsbErr);
        return 0;
    }
    ...
    float* temper = (float*)&response[8];
    return *temper;
}
bool HID_Write(const Sensor* sens, uint8_t* repBuf, int* transferred, uint32_t timeout, int* libUsbErr)
{
    if (sens == NULL || repBuf == NULL || transferred == NULL)
        return returnlibUSBErr(libUsbErr, -1008); ///TODO nice error codes
    if (!sens->claimed)
        return returnlibUSBErr(libUsbErr, -1012); ///TODO nice error codes
    int r = libusb_interrupt_transfer(sens->devHandle, sens->outEndpointAddr,
                                      repBuf, sens->outRepSize, transferred, timeout);
    if (r < 0)
        return returnlibUSBErr(libUsbErr, r);
    return returnlibUSBErr(libUsbErr, LIB_USB_OK);
}
bool HID_Read(const Sensor* sens, uint8_t* repBuf, int* read, uint32_t timeout, int* libUsbErr)
{
    if (sens == NULL || repBuf == NULL || read == NULL) // also reject a NULL buffer, as in HID_Write
        return returnlibUSBErr(libUsbErr, -1008); ///TODO nice error codes
    if (!sens->claimed)
        return returnlibUSBErr(libUsbErr, -1012); ///TODO nice error codes
    int r = libusb_interrupt_transfer(sens->devHandle, sens->inEndpointAddr,
                                      repBuf, sens->inRepSize, read, timeout);
    if (r < 0)
        return returnlibUSBErr(libUsbErr, r);
    return returnlibUSBErr(libUsbErr, LIB_USB_OK);
}
EDIT
I followed these instructions to monitor memory usage: to find the leak I used the Windows UMDH tool, as mentioned here:
The problem is that I have to use the NI CVI compiler to build my application. I wasn't able to get the symbol table out of this compiler, so my heap dump diff only shows addresses.
// Each log entry has the following syntax:
//
// + BYTES_DELTA (NEW_BYTES - OLD_BYTES) NEW_COUNT allocs BackTrace TRACEID
// + COUNT_DELTA (NEW_COUNT - OLD_COUNT) BackTrace TRACEID allocations
// ... stack trace ...
//
// where:
//
// BYTES_DELTA - increase in bytes between before and after log
// NEW_BYTES - bytes in after log
// OLD_BYTES - bytes in before log
// COUNT_DELTA - increase in allocations between before and after log
// NEW_COUNT - number of allocations in after log
// OLD_COUNT - number of allocations in before log
// TRACEID - decimal index of the stack trace in the trace database
// (can be used to search for allocation instances in the original
// UMDH logs).
//
+ 80000 ( 80000 - 0) 1 allocs BackTrace4920B3C
+ 1 ( 1 - 0) BackTrace4920B3C allocations
ntdll!RtlAllocateHeap+274
cvirte!LoadExternalModule+291EC
cvirte!CVIDynamicMemoryInfo+12B6
cvirte!CVIDynamicMemoryInfo+1528
cvirte!CVIDynamicMemoryInfo+1AF9
cvirte!mblen+84D
cvirte!_CVI_Resource_Acquire+116
cvirte!malloc+68
libUSB_HID!???+0 : 41DCE8
libUSB_HID!???+0 : 4E95C7
libUSB_HID!???+0 : 4C13BE
libUSB_HID!???+0 : 4BA09D
libUSB_HID!???+0 : 4C7ABA
libUSB_HID!???+0 : 4F92F0
libUSB_HID!???+0 : 4FB3BD
libUSB_HID!???+0 : 4FC50E
libUSB_HID!???+0 : 415C31
libUSB_HID!???+0 : 408847
libUSB_HID!???+0 : 402967
libUSB_HID!???+0 : 41B51E
libUSB_HID!???+0 : 41A021
kernel32!BaseThreadInitThunk+E
ntdll!__RtlUserThreadStart+70
I also replaced all free, malloc, calloc and realloc calls within libusb with my own implementation that tracks every single memory request. This tracking does not show any memory leak: the amount of allocated bytes stays constant during runtime, as expected. Nevertheless, the UMDH tool shows a heap allocation difference, so I'm completely out of ideas what to test next at the moment.
Sorry guys, I ported my program to MinGW GCC and everything works as expected. It seems that my port of libusb for the CVI compiler is not completely correct. Now I use the standard DLL and the memory leak is gone.