Best options to deal with TempDB when loading a lot of data

I need to load a large amount of data from multiple data sources using SQL Server 2014.

My ETL scripts are written in T-SQL, and they take a long time to execute because TempDB keeps filling up.

In your opinion, what is the best way to deal with this:

  • Using commit transactions?
  • Cleaning TempDB?
  • etc.


  • The only way to answer this question is with a fairly high-level, general response.

    You have a few options:

    1. Simply allocate more space to TempDB.
    2. Optimize your ETL queries and tune your indexes.
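    Option 1 can be sketched roughly as follows. The file names, path, and sizes below are illustrative only; check sys.master_files for your actual TempDB file layout before changing anything:

    ```sql
    -- Grow the existing primary TempDB data file and set a sensible autogrowth
    -- increment (pre-sizing avoids repeated autogrow pauses during the ETL run).
    ALTER DATABASE tempdb
    MODIFY FILE (NAME = tempdev, SIZE = 8GB, FILEGROWTH = 512MB);

    -- Optionally spread TempDB across additional, equally sized data files
    -- to reduce allocation contention (file name and path are hypothetical).
    ALTER DATABASE tempdb
    ADD FILE (NAME = tempdev2, FILENAME = 'T:\tempdb2.ndf',
              SIZE = 8GB, FILEGROWTH = 512MB);
    ```

    The size changes take effect immediately; added files require the instance to have access to the target drive.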

    Option 2 is often the better approach. Excessive use of TempDB usually indicates that inefficient sorts or joins are occurring. To resolve this, you need to analyze the actual execution plans of your ETL code. Look for the following:
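    Before digging into individual plans, it can help to see which currently running requests are actually consuming TempDB. A sketch using the standard DMVs (requires VIEW SERVER STATE permission; run it in the tempdb context while the ETL is executing):

    ```sql
    -- Show per-request TempDB page allocations, worst offenders first.
    -- internal_objects = sort/hash spills and worktables;
    -- user_objects     = #temp tables and table variables.
    SELECT r.session_id,
           t.text AS query_text,
           u.internal_objects_alloc_page_count,
           u.user_objects_alloc_page_count
    FROM sys.dm_db_task_space_usage AS u
    JOIN sys.dm_exec_requests AS r
        ON u.session_id = r.session_id
       AND u.request_id = r.request_id
    CROSS APPLY sys.dm_exec_sql_text(r.sql_handle) AS t
    ORDER BY u.internal_objects_alloc_page_count DESC;
    ```

    A high internal-object count points at spilling sorts/hashes; a high user-object count points at heavy #temp table usage in the ETL itself.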

    • Warning icons (exclamation marks) in your query plan. These often indicate that a join or a sort operation had to spill over to TempDB because the optimizer underestimated the amount of memory required. You might have statistics that need to be updated.
    • Large differences between the estimated and actual number of rows. This can also indicate out-of-date statistics or parameter sniffing issues.
    • Sort operations. It is often possible to remove these by adding indexes to your tables.
    • Inefficient access methods, e.g. a table scan when you only need a small number of rows from a large table. These can often be resolved by adding covering indexes. Just note that table scans are often the best approach when loading data warehouses.
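    If the estimated and actual row counts are far apart, refreshing statistics is a cheap first step before any index changes. A minimal sketch (the table name is illustrative):

    ```sql
    -- Rebuild statistics on one heavily loaded table with a full scan
    -- for maximum accuracy (slower than sampled updates).
    UPDATE STATISTICS dbo.FactSales WITH FULLSCAN;

    -- Or refresh all statistics in the current database that SQL Server
    -- considers out of date (uses the default sampling rate).
    EXEC sp_updatestats;
    ```

    After updating statistics, re-capture the actual execution plan and check whether the estimates and the spill warnings have improved.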

    Hope this was helpful.