In my Phoenix application, I have a function that takes two maps and creates two entries in the database via Ecto.Changeset:
def create_user_with_data(user_attrs, data_attrs) do
  name =
    cond do
      data_attrs["name"] -> data_attrs["name"]
      data_attrs[:name] -> data_attrs[:name]
      true -> nil
    end

  Ecto.Multi.new()
  |> Ecto.Multi.insert(:user, User.registration_changeset(%User{}, Map.put(user_attrs, :name, name)))
  |> Ecto.Multi.run(:user_data, fn %{user: user} ->
    %MyApp.Account.UserData{}
    |> MyApp.Account.UserData.changeset(Map.put(data_attrs, :user_id, user.id))
    |> Repo.insert()
  end)
  |> Repo.transaction()
end
Because the keys in these maps can be either atoms or strings, I have to check for both forms.
But the expression
Map.put(user_attrs, :name, name)
will cause an error
** (Ecto.CastError) expected params to be a map with atoms or string keys, got a map with mixed keys: %{:name => "John", "email" => "m@gmail.com"}
if the keys are strings.
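The mixed map comes from Map.put adding an atom key to a map whose existing keys are strings. This can be reproduced in plain Elixir, without Ecto:

```elixir
# Params arriving from a Phoenix controller have string keys
user_attrs = %{"email" => "m@gmail.com"}

# Map.put with an atom key now produces a map with mixed key types,
# which Ecto.Changeset.cast/4 rejects with Ecto.CastError
mixed = Map.put(user_attrs, :name, "John")
# mixed == %{:name => "John", "email" => "m@gmail.com"}
```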
Is there any best practice in dealing with this issue?
I'd convert all the keys to atoms first and then use atoms everywhere.
def key_to_atom(map) do
  Enum.reduce(map, %{}, fn
    {key, value}, acc when is_atom(key) ->
      Map.put(acc, key, value)

    # String.to_existing_atom saves us from overloading the VM by
    # creating too many atoms. It'll always succeed because all the fields
    # in the database already exist as atoms at runtime.
    {key, value}, acc when is_binary(key) ->
      Map.put(acc, String.to_existing_atom(key), value)
  end)
end
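To illustrate, here is the helper in a self-contained snippet. The atoms :name and :email are created explicitly so that String.to_existing_atom/1 can find them, which stands in for the schema module having been loaded:

```elixir
defmodule KeyHelper do
  # Same helper as above, wrapped in a module so the snippet runs standalone
  def key_to_atom(map) do
    Enum.reduce(map, %{}, fn
      {key, value}, acc when is_atom(key) ->
        Map.put(acc, key, value)

      {key, value}, acc when is_binary(key) ->
        Map.put(acc, String.to_existing_atom(key), value)
    end)
  end
end

# Ensure the atoms exist for this demo; in an app, defining the Ecto
# schema creates them at compile time
_ = [:name, :email]

KeyHelper.key_to_atom(%{"name" => "John", :email => "m@gmail.com"})
# => %{name: "John", email: "m@gmail.com"}

# A string key with no corresponding atom raises ArgumentError —
# that failure is the safety property protecting the atom table:
# KeyHelper.key_to_atom(%{"no_such_key" => 1})
```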
Then pass all such maps through this function:
user_attrs = user_attrs |> key_to_atom()
data_attrs = data_attrs |> key_to_atom()
Now you can Map.put atom keys whenever you want to.
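If you would rather not depend on the schema atoms already being loaded, a variant that only atomizes an explicit list of allowed keys is a common alternative (this SafeKeys module is a hypothetical sketch, not from the answer above); unknown keys are dropped rather than raising:

```elixir
defmodule SafeKeys do
  # Atomizes only the keys in `allowed`; anything else is discarded.
  # Avoids String.to_existing_atom/1 entirely, so it is safe even if
  # the schema module has not been loaded yet.
  def take_atomized(map, allowed) when is_list(allowed) do
    for atom_key <- allowed,
        {:ok, value} <- [fetch_either(map, atom_key)],
        into: %{} do
      {atom_key, value}
    end
  end

  # Look the key up under both its atom and string forms
  defp fetch_either(map, atom_key) do
    case Map.fetch(map, atom_key) do
      {:ok, value} -> {:ok, value}
      :error -> Map.fetch(map, Atom.to_string(atom_key))
    end
  end
end

SafeKeys.take_atomized(%{"name" => "John", "admin" => true}, [:name, :email])
# => %{name: "John"}  ("admin" is dropped, the missing :email is skipped)
```

Dropping unlisted keys doubles as a crude param filter, much like the whitelisting Ecto.Changeset.cast/4 already does with its `permitted` argument.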