My input is a CSV file:
col1,col2,col3
A,13/01/2023,1
B,14/01/2023,2
C,15/01/2023,3
When I import it using Power Query, I get this auto-generated code:
let
Source = Csv.Document(File.Contents("C:\Users\VERBOSE\Desktop\file.csv"),[Delimiter=",", Columns=3, Encoding=1252, QuoteStyle=QuoteStyle.None]),
#"En-têtes promus" = Table.PromoteHeaders(Source, [PromoteAllScalars=true]),
#"Type modifié" = Table.TransformColumnTypes(#"En-têtes promus",{{"col1", type text}, {"col2", type date}, {"col3", Int64.Type}})
in
#"Type modifié"
The problem is that the file I receive may sometimes have missing columns or new ones. Even worse, the same column name may have different types across two consecutive files.
That throws some annoying errors:
Expression.Error: Sorry... We couldn't find the column 'col4' of the table. Details: col4
DataFormat.Error: Sorry... We couldn't parse the input provided as a Date value. Details: 1
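For the first error I can imagine something like the sketch below: keep the desired types in a list and only apply the pairs whose column actually exists in the file (the WantedTypes list and dropping the hard-coded Columns=3 are my own assumptions, not what Power Query generates). That would deal with missing or extra columns, but not with the types changing between files.
let
    Source = Csv.Document(File.Contents("C:\Users\VERBOSE\Desktop\file.csv"), [Delimiter=",", Encoding=1252, QuoteStyle=QuoteStyle.None]),
    Promoted = Table.PromoteHeaders(Source, [PromoteAllScalars=true]),
    // the types I would like for the columns I know about (hypothetical list)
    WantedTypes = {{"col1", type text}, {"col2", type date}, {"col3", Int64.Type}},
    // keep only the pairs whose column really exists in this particular file
    Existing = List.Select(WantedTypes, each List.Contains(Table.ColumnNames(Promoted), _{0})),
    Typed = Table.TransformColumnTypes(Promoted, Existing)
in
    Typed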
Can anyone tell me if there is a dynamic way to infer the column types?
In the interface you can use Transform > Detect Data Type, but that is a one-off detection that hard-codes the result, not something dynamic.
Otherwise it is a long and arduous journey
There is an attempt at this here: detect types, but I'm not convinced it works properly.
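To make it concrete, below is roughly the kind of thing I am after: guess each column's type from a sample of its values and build the transform list dynamically. The GuessType helper, the 100-row sample and the fr-FR culture are my own assumptions, nothing built in, so treat it as a sketch rather than a working answer.
let
    Source = Csv.Document(File.Contents("C:\Users\VERBOSE\Desktop\file.csv"), [Delimiter=",", Encoding=1252, QuoteStyle=QuoteStyle.None]),
    Promoted = Table.PromoteHeaders(Source, [PromoteAllScalars=true]),
    // guess the type of a single text value: integer first, then number, then date, else text
    GuessType = (value as any) as type =>
        if value = null or value = "" then type text
        else if not (try Int64.From(value))[HasError] then Int64.Type
        else if not (try Number.From(value))[HasError] then type number
        else if not (try Date.From(value, "fr-FR"))[HasError] then type date
        else type text,
    // sample the first rows and guess one type per column from the first non-null value
    Sample = Table.FirstN(Promoted, 100),
    Types = List.Transform(
        Table.ColumnNames(Promoted),
        (name) =>
            let
                values = List.RemoveNulls(Table.Column(Sample, name)),
                guess = if List.IsEmpty(values) then type text else GuessType(values{0})
            in
                {name, guess}),
    // apply the guessed types; the culture is assumed to match my dd/MM/yyyy dates
    Typed = Table.TransformColumnTypes(Promoted, Types, "fr-FR")
in
    Typed
The obvious weakness is that the guess only looks at one sample value per column, so a column that starts with numbers and later contains text would still fail, which is why I'm asking whether there is a better or built-in way.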