Tags: c++, sql, sql-server, odbc, bulk

bcp_bind fails when used with size over 8000 bytes for VARBINARY(MAX) column


I am using the Bulk Copy Functions (http://msdn.microsoft.com/en-us/library/ms130922.aspx) via SQL Server Native Client v11 for fast data insertion into a SQL Server 2012 table. Everything works fine, except insertion of big (over 8K) data chunks into a BLOB column (inserting data under 8K works without any issues). I get the following extended error information:

HY000 "Invalid field size for datatype"

Example C++ code:

std::vector<char> vData(9000, 0);
BYTE* pData = reinterpret_cast<BYTE*>(&vData[0]);
DBINT nLength = static_cast<DBINT>(vData.size());
auto nRet = bcp_bind(hHandle, pData, 0, nLength, nullptr, 0, SQLVARBINARY, 2 /*column number is correct*/);
assert(nRet != FAIL); // Fails

Table creation SQL:

CREATE TABLE [Table1] (C1 INT IDENTITY (1,1) NOT NULL, C2 VARBINARY(MAX) NULL);

I’ve tried different data type values (SQLBIGBINARY, SQLBIGVARBINARY, SQLBINARY, …) – same error.

I’ve tried setting pData to nullptr to signal that data will be provided via calls to bcp_moretext - same error.
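
For reference, that attempt looked roughly like the sketch below (same handle and column number as above; per the bulk copy docs a null data pointer means the value is streamed after bcp_sendrow, but here the bind call itself failed the same way):

auto nRet = bcp_bind(hHandle, nullptr /*data to be streamed later*/, 0, nLength, nullptr, 0, SQLVARBINARY, 2 /*column number*/);
// If the bind succeeded, the row would then be sent with bcp_sendrow(hHandle),
// followed by bcp_moretext(hHandle, chunkSize, pChunk) calls for the BLOB value.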

Binding a data block under 8K works without any issues.

Transferring the same data via ODBC SQLBindParameter with type SQL_VARBINARY (SQL_C_BINARY) works without any issues.
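
For comparison, the ODBC route that does work looks roughly like this (a sketch, assuming a statement handle hStmt on the same connection; error handling omitted):

SQLLEN cbData = static_cast<SQLLEN>(vData.size());
SQLRETURN rc = SQLBindParameter(hStmt, 1, SQL_PARAM_INPUT, SQL_C_BINARY, SQL_VARBINARY,
                                vData.size(), 0,            // column size, decimal digits
                                vData.data(), cbData, &cbData);
rc = SQLExecDirect(hStmt, (SQLCHAR*)"INSERT INTO [Table1] (C2) VALUES (?)", SQL_NTS);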

Any chance to get bulk copy functionality working?

Thank you!


Solution

  • Answering my own question:

    even though bcp_bind, for some strange reason, fails when binding more than 8K bytes, another BCP function, bcp_collen, works just fine with the same data. So the solution code is:

    std::vector<char> vData(9000, 0);
    BYTE* pData = reinterpret_cast<BYTE*>(&vData[0]);
    DBINT nLength = static_cast<DBINT>(vData.size());
    // Bind with a dummy length below the 8000-byte limit...
    auto nRet = bcp_bind(hHandle, pData, 0, 1 /*or any value < 8000*/, nullptr, 0, SQLVARBINARY, 2 /*column number*/);
    assert(nRet != FAIL); // OK
    // ...then set the real length via bcp_collen
    nRet = bcp_collen(hHandle, nLength, 2 /*column number*/);
    assert(nRet != FAIL); // OK
    

    Works like a charm!
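
    For context, here is roughly where the bind/collen calls sit in a complete bulk-copy session (a sketch; connection setup, binds for the other columns, and error handling are omitted, only the table name is taken from the question, and the wide string literal assumes a Unicode build):

    bcp_init(hHandle, L"Table1", nullptr, nullptr, DB_IN);   // direction: into the server
    bcp_bind(hHandle, pData, 0, 1 /*dummy length*/, nullptr, 0, SQLVARBINARY, 2);
    bcp_collen(hHandle, nLength, 2);   // real length of the VARBINARY(MAX) value
    bcp_sendrow(hHandle);              // send one row using the bound buffers
    bcp_done(hHandle);                 // flush the batch to the server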