Tags: triggers, salesforce, apex, dataloader

Salesforce trigger - not able to understand


Below is the code written by my colleague, who doesn't work at the firm anymore. I am inserting records into the object with Data Loader and I can see a success message, but I do not see any records in my object. I am not able to understand what the trigger below is doing. Please, someone, help me understand, as I am new to Salesforce.

 trigger DataLoggingTrigger on QMBDataLogging__c (after insert) {
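        // Record types on the target object, the freshly inserted log rows, a dynamically
        // typed list for the new letters, and the letter-type mapping from a custom setting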
        
        Map<string,Schema.RecordTypeInfo> recordTypeInfo = Schema.SObjectType.QMB_Initial_Letter__c.getRecordTypeInfosByName();    
        List<QMBDataLogging__c> logList = (List<QMBDataLogging__c>)Trigger.new;
        List<Sobject> sobjList  =  (List<Sobject>)Type.forName('List<'+'QMB_Initial_Letter__c'+'>').newInstance();
        Map<string, QMBLetteTypeToVfPage__c> QMBLetteTypeToVfPage  = QMBLetteTypeToVfPage__c.getAll();
        Map<String,QMBLetteTypeToVfPage__c> mapofLetterTypeRec = new Map<String,QMBLetteTypeToVfPage__c>();
        set<Id>processdIds = new set<Id>();
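        // Re-key the custom setting rows by Letter_Type__c instead of by their setting Name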
        for(string key : QMBLetteTypeToVfPage.keyset())
        {
            if(!mapofLetterTypeRec.containsKey(key)) mapofLetterTypeRec.put(QMBLetteTypeToVfPage.get(Key).Letter_Type__c, QMBLetteTypeToVfPage.get(Key)); 
        }
        for(QMBDataLogging__c log : logList)
        {
            Sobject logRecord = (sobject)log;
            Sobject QMBLetterRecord = new QMB_Initial_Letter__c();
            if(mapofLetterTypeRec.containskey(log.Field1__c))
            {
                string recordTypeId = recordTypeInfo.get(mapofLetterTypeRec.get(log.Field1__c).RecordType__c).isAvailable() ? recordTypeInfo.get(mapofLetterTypeRec.get(log.Field1__c).RecordType__c).getRecordTypeId() :  recordTypeInfo.get('Master').getRecordTypeId();  
                string  fieldApiNames = mapofLetterTypeRec.containskey(log.Field1__c)  ?  mapofLetterTypeRec.get(log.Field1__c).FieldAPINames__c : '';
                //QMBLetterRecord.put('Letter_Type__c',log.Name);
                QMBLetterRecord.put('RecordTypeId',recordTypeId);
                processdIds.add(log.Id);
                if(string.isNotBlank(fieldApiNames) && fieldApiNames.contains(','))
                {
                    Integer i = 1;
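                    // FieldN__c on the log supplies the value for the Nth entry in the
                    // comma-separated target field list, converted to match the field's type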
                    
                    
                    for(string fieldApiName : fieldApiNames.split(','))
                    {
                        string logFieldApiName = 'Field'+i+'__c';
                        fieldApiName = fieldApiName.trim();
                        system.debug('fieldApiName=='+fieldApiName);
                        Schema.DisplayType fielddataType =   getFieldType('QMB_Initial_Letter__c',fieldApiName);
                        if(fielddataType == Schema.DisplayType.Date)
                        {
                            Date dateValue = Date.parse(string.valueof(logRecord.get(logFieldApiName)));
                            QMBLetterRecord.put(fieldApiName,dateValue);
                        }
                        else if(fielddataType  == Schema.DisplayType.DOUBLE)
                        {
                            string value = (string)logRecord.get(logFieldApiName);
                            Double  dec  = Double.valueOf(value.replace(',',''));
                            QMBLetterRecord.put(fieldApiName,dec);
                        }
                        else if(fielddataType == Schema.DisplayType.CURRENCY)
                        {
                          Decimal  decimalValue  = Decimal.valueOf((string)logRecord.get(logFieldApiName));
                          QMBLetterRecord.put(fieldApiName,decimalValue);   
                        }
                         else if(fielddataType == Schema.DisplayType.INTEGER)
                        {
                          string value = (string)logRecord.get(logFieldApiName);
                          Integer  integerValue  = Integer.valueOf(value.replace(',',''));
                          QMBLetterRecord.put(fieldApiName,integerValue);   
                        }
                        else if(fielddataType == Schema.DisplayType.DATETIME)
                        {
                          DateTime  dateTimeValue  = DateTime.valueOf(logRecord.get(logFieldApiName));
                          QMBLetterRecord.put(fieldApiName,dateTimeValue);   
                        }
                       
                        else
                        {
                            QMBLetterRecord.put(fieldApiName,logRecord.get(logFieldApiName));
                        }
                        i++;
                    }
                }  
            }
           
            sobjList.add(QMBLetterRecord);
            
        }
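        // Save the generated letters, then clean up the processed staging records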
        
        if(!sobjList.isEmpty())
        {
            insert sobjList;
           if(!processdIds.isEmpty()) DeleteDoAsLoggingRecords.deleteTheProcessRecords(processdIds);
        }
        
        public static Schema.DisplayType getFieldType(string objectName, string fieldName)
        {
            SObjectType r = ((SObject)(Type.forName('Schema.'+objectName).newInstance())).getSObjectType();
            DescribeSObjectResult d = r.getDescribe();
            return(d.fields.getMap().get(fieldName).getDescribe().getType());
        }
        
    }

Solution

  • You might be looking in the wrong place. Check if there's a unit test written for this thing (there should be one, especially if it's deployed to production); it should help you understand how it's supposed to be used.
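
    If there's no test, here's roughly the shape one could take. This is only a sketch: the custom setting values and the SomeText__c target field are made up to illustrate the flow, not your org's real configuration.

    @IsTest
    private class DataLoggingTriggerTest {
        @IsTest
        static void createsLetterAndDeletesLog() {
            // Hypothetical mapping row - adjust names and values to your org
            insert new QMBLetteTypeToVfPage__c(
                Name = 'XYZ',
                Letter_Type__c = 'XYZ',
                RecordType__c = 'Master',
                FieldAPINames__c = 'Name,SomeText__c'
            );
            Test.startTest();
            insert new QMBDataLogging__c(Field1__c = 'XYZ', Field2__c = 'Hello');
            Test.stopTest(); // flushes the (possibly async) log cleanup
            // The trigger should have created a letter and deleted the staging record
            System.assertEquals(1, [SELECT COUNT() FROM QMB_Initial_Letter__c]);
            System.assertEquals(0, [SELECT COUNT() FROM QMBDataLogging__c]);
        }
    }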


    That would explain why you see a success message but no records: you're inserting QMBDataLogging__c records, but then it seems they're immediately deleted in DeleteDoAsLoggingRecords.deleteTheProcessRecords(processdIds) - whether or not whatever this thing was supposed to do succeeded.
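
    DeleteDoAsLoggingRecords isn't shown in your post; presumably it's something along these lines (a guess - check the actual class; the @future is my assumption, since records generally can't be deleted synchronously from their own after insert trigger):

    public class DeleteDoAsLoggingRecords {
        // Removes the staging records once the trigger has consumed them
        @future
        public static void deleteTheProcessRecords(Set<Id> processedIds) {
            delete [SELECT Id FROM QMBDataLogging__c WHERE Id IN :processedIds];
        }
    }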

    This seems to be some poor man's CSV parser or a generic "upload anything" tool... It takes data staged in QMBDataLogging__c and creates QMB_Initial_Letter__c records out of it.

    QMBLetteTypeToVfPage__c.getAll() suggests you could go to Setup -> Custom Settings, find this thing and examine it. Maybe it has some values in production, but in your sandbox it's empty and that's why essentially nothing works? Or maybe the values that are there are outdated?
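
    You can also dump the mappings from Execute Anonymous in the Developer Console (the three field names come straight from the trigger):

    // List every row of the mapping custom setting
    for (QMBLetteTypeToVfPage__c row : QMBLetteTypeToVfPage__c.getAll().values()) {
        System.debug(row.Letter_Type__c + ' -> ' + row.RecordType__c + ' : ' + row.FieldAPINames__c);
    }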

    There's a check whether what you upload into Field1__c can be matched to what's in that custom setting. I guess you load some kind of subtype of your QMB_Initial_Letter__c in there. The record type name and the list of fields to read from your log record are also fetched from the custom setting, based on that match.

    Then this thing takes what you loaded, looks at the list of fields from the custom setting, and parses the values one by one.

    Let's say the custom setting contains something like

    Name = XYZ, FieldAPINames__c = 'Name,SomePicklist__c,SomeDate__c,IsActive__c'

    This thing will look at the first record you inserted. Let's say your CSV looks like this:

    Field1__c,Field2__c,Field3__c,Field4__c
    XYZ,Closed,2022-09-15,true
    

    This thing will try to parse and map it, so eventually you create a record that "normal" Apex code would express as:

    new QMB_Initial_Letter__c(
       Name = 'XYZ',
       SomePicklist__c = 'Closed',
       SomeDate__c = Date.parse('2022-09-15'),
       IsActive__c = true
    );
    

    It's pretty fragile, as you probably already know. And because parsing CSV is an art - I expect it to absolutely crash and burn when text with commas in it shows up (some text,"text, with commas in it, should be quoted",more text).

    In theory an admin can change the mapping in Setup - but then they'd need to add the new field to the loaded file anyway. Overcomplicated. I guess somebody did it to solve an issue with Record Type Ids - but there are better ways to achieve that and still have a normal CSV file with normal columns and strong type matching, not just chucking everything in as strings.
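
    For example, you could load QMB_Initial_Letter__c directly and resolve the record type from a plain text column in a small "before insert" trigger. A sketch (RecordTypeName__c is a made-up helper field your file would populate with the record type's developer name):

    trigger QMBInitialLetterBefore on QMB_Initial_Letter__c (before insert) {
        Map<String, Schema.RecordTypeInfo> byDevName =
            Schema.SObjectType.QMB_Initial_Letter__c.getRecordTypeInfosByDeveloperName();
        for (QMB_Initial_Letter__c letter : Trigger.new) {
            // Translate the stable developer name into the org-specific Id
            if (String.isNotBlank(letter.RecordTypeName__c) && byDevName.containsKey(letter.RecordTypeName__c)) {
                letter.RecordTypeId = byDevName.get(letter.RecordTypeName__c).getRecordTypeId();
            }
        }
    }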

    In theory this lets you have "jagged" CSV files (row 1 having 5 fields, row 2 having a different record type and 17 fields? No problem.)

    Your call whether it's salvageable or you'd rather ditch it and try normal loading of QMB_Initial_Letter__c records (get back to your business people and ask for requirements?). If you do have a variable number of columns at the source, you'd need to standardise it or group the data so only one "type" of record (well, whatever's in that Field1__c) goes into each file.