Delphi - Loading a CSV into a Dataset using FireDac ( Field Sizing Issue... )


I'm using TFDBatchMove, TFDBatchMoveTextReader, TFDBatchMoveDataSetWriter, and TFDMemTable to load data from a CSV file into an in-memory dataset. It works well except that one field contains a lot of text (400-500 characters), and for some reason the memTable component caps that field's size at 233 characters. The fields are all created by the batch components, and I can't find an option on either the batch components or the memTable to raise the field size limit. How do I get around this?

unit Unit1;

interface

uses
  System.SysUtils, System.Types, System.UITypes, System.Classes, System.Variants,
  FMX.Types, FMX.Controls, FMX.Forms, FMX.Graphics, FMX.Dialogs,
  FireDAC.Stan.Intf, FireDAC.Stan.Option, FireDAC.Stan.Param,
  FireDAC.Stan.Error, FireDAC.DatS, FireDAC.Phys.Intf, FireDAC.DApt.Intf,
  FMX.Edit, FMX.Controls.Presentation, FMX.StdCtrls, Data.DB,
  FireDAC.Comp.DataSet, FireDAC.Comp.Client, FireDAC.Comp.BatchMove.DataSet,
  FireDAC.Comp.BatchMove, FireDAC.Comp.BatchMove.Text, System.Rtti,
  FMX.Grid.Style, FMX.ScrollBox, FMX.Grid, FireDAC.UI.Intf, FireDAC.FMXUI.Wait,
  FireDAC.Comp.UI, Data.Bind.EngExt, Fmx.Bind.DBEngExt, Fmx.Bind.Grid,
  System.Bindings.Outputs, Fmx.Bind.Editors, Data.Bind.Components,
  Data.Bind.Grid, Data.Bind.DBScope;

type
  TForm1 = class(TForm)
    BatchMove: TFDBatchMove;
    csvReader: TFDBatchMoveTextReader;
    datasetWriter: TFDBatchMoveDataSetWriter;
    memTable: TFDMemTable;
    btnConvert: TButton;
    FilePath: TEdit;
    StringGrid1: TStringGrid;
    FDGUIxWaitCursor1: TFDGUIxWaitCursor;
    BindSourceDB1: TBindSourceDB;
    BindingsList1: TBindingsList;
    LinkGridToDataSourceBindSourceDB1: TLinkGridToDataSource;
    procedure btnConvertClick(Sender: TObject);
  private
    { Private declarations }
  public
    { Public declarations }
  end;

var
  Form1: TForm1;

implementation

{$R *.fmx}

procedure TForm1.btnConvertClick(Sender: TObject);
begin
  csvReader.FileName := FilePath.Text;
  BatchMove.Execute;
end;


Solution

  • Figured out a way around it. Even if I set the AnalyzeSample size to encompass the entire dataset, for some reason the TFDBatchMove was not able to figure out the correct field size to fit every record in the dataset and was setting it way too low. I was unable to get this to work. However, by creating all of the FieldDefs in my TFDMemTable manually (at runtime), I could define the field sizes to be whatever I wanted them to be. Then by unchecking the "poCreateDest" option, the TFDBatchMove would no longer try to format the fields based off of its own analysis and instead just write data to the fields that I had already created. It's less dynamic this way because the field parameters are fixed but it works well enough for what I need it to do.