I'm using SQL Server 2012 on a new server with 16 GB of RAM. I have a long-running stored procedure on a 50 GB database with about 2.5 million records in an input table. The process reads the input table, normalizes the rows (most of the columns hold text values instead of IDs), and inserts the normalized rows into another table. The problem: the memory consumption of the SQL Server service climbs as high as it can, and I had to set an upper limit to keep the server from freezing. Is this normal? I know SQL Server keeps data in memory instead of releasing it, to avoid reloading it later, but in my case I won't need this data again — it's a one-off process that reads, normalizes, and moves rows to another table.
What you described is absolutely normal. SQL Server caches as much data in memory as it's allowed to in order to speed up execution: a trip to RAM is much, much faster than a trip to disk. Without a limit, it will stuff everything into memory, even if that forces the OS to page things out to disk — which will slow your server to a crawl.
When SQL Server hits its memory limit, it decides what to keep and what to evict to make room for new data. Don't try to micro-manage it.
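Keeping the cap you already set is the right move; it can be configured with `sp_configure`. A minimal sketch (the 12288 MB value is just an illustrative figure for a 16 GB box, leaving headroom for the OS — tune it for your workload):

```sql
-- Enable advanced options so 'max server memory' is visible
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;

-- Cap the buffer pool at 12 GB (value is in MB); example value only
EXEC sp_configure 'max server memory (MB)', 12288;
RECONFIGURE;
```

This takes effect immediately, without a service restart, and prevents SQL Server from starving the OS.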