Database Programming: Temp Tables or Table Variables?

On the occasion of this blog's one hundredth post, I thought I'd pass along some information I recently discussed with my youngest cousin, Tom, who, like me, is an otherwise rational person who's been bitten by the database development bug.  In a conversation on the topic, Tom wrote:

My concern with using temp tables is partly superstitious - Using [table variables] sounds so much quicker than writing to disk.

The tidbit I passed along to Tom is now a tidbit for you: The contents of both table variables and temp tables are instantiated in tempdb, so you’re not really buying yourself as much as you might think you are when you try to stay off the disk by using table variables.  So what’s the difference between them?  Table variables don’t have statistics on them, so they are far less likely to provoke a recompile when the optimizer sees them; temp tables have statistics on them, so they generate a recompile every time they’re built.  In SQL Server 2005, you can isolate the recompile to the statement rather than the entire proc (which is how SQL Server 2000 behaves), but you still get some flavor of a recompile. 
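To make the distinction concrete, here's a quick sketch of the two declarations (the table and column names are invented for illustration); both end up in tempdb either way:

```sql
-- Table variable: no statistics, scoped to the batch or procedure
DECLARE @Orders TABLE
(
    OrderID   int      NOT NULL,
    OrderDate datetime NOT NULL
);

-- Temp table: statistics are maintained, scoped to the session
CREATE TABLE #Orders
(
    OrderID   int      NOT NULL,
    OrderDate datetime NOT NULL
);
```

The syntax difference is small, but the statistics behavior described above is what drives the optimizer's (and the recompiler's) treatment of each.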

SO..  if your data volume in the temp structure is low, OR if you’re certain you’ll always be scanning the entire table, use a table variable.  If your data volume in the temp structure is unpredictable, or if you’re going to be performing highly selective queries against the temp structure, use a temp table, and build indexes on it after you populate it if testing shows they boost performance at your highest volumes.

Don’t build a clustering key, though, unless you can prove it helps overall performance and you put an appropriate ORDER BY on the INSERT statement so you’re not churning out-of-order data on the disk when you build the index.  This is a temp structure, so clustering it on disk when we’re going to blow it away in a couple hundred milliseconds anyway doesn’t make much sense unless it’s done at zero overhead.
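As a hedged sketch of the populate-first, index-second pattern (the table names are made up, and Sales.SalesOrderHeader is assumed from the AdventureWorks sample database):

```sql
CREATE TABLE #WorkSet
(
    CustomerID int   NOT NULL,
    Total      money NOT NULL
);

-- Populate first, so the index isn't maintained row-by-row during the load...
INSERT INTO #WorkSet (CustomerID, Total)
SELECT CustomerID, SUM(TotalDue)
FROM   Sales.SalesOrderHeader
GROUP  BY CustomerID;

-- ...then index, and only if your volume testing shows it pays for itself
CREATE NONCLUSTERED INDEX ix_WorkSet_CustomerID
    ON #WorkSet (CustomerID);
```

A nonclustered index like this avoids the on-disk reordering cost a clustering key would impose on a structure you're about to throw away.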

If the temp table is only referenced once, consider using a derived table instead.
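For example (again with hypothetical names), a single-use temp table can often collapse into a derived table in the FROM clause, sparing tempdb entirely:

```sql
-- Instead of materializing #CustomerTotals for one reference...
SELECT d.CustomerID, d.Total
FROM  (SELECT CustomerID, SUM(TotalDue) AS Total
       FROM   Sales.SalesOrderHeader
       GROUP  BY CustomerID) AS d
WHERE  d.Total > 10000;
```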

In all instances, you should unit test all permutations of this with low, medium, and high volumes, weight those for the number of times each volume is run (if high volume is run once a day, you might have a different position on its performance than you would if it was run 10,000 times an hour; both are possible), and work to meet the established SLA for whatever it is you’re writing.

Here's to the next 100 posts..


Comments (9)

  1. Anonymous says:

    Kimberly Tripp, one of the most incredibly astute SQL people on the planet, calls her blog, "Improving…

  2. Anonymous says:

    One of the great things about my involvement with the SQL Ranger community is that I occasionally hear

  3. Anonymous says:

    When I posted yesterday on this topic (first broached here), I’d found some good information pertinent…

  4. Anonymous says:

    Hi ifo..

    If I understand your requirement correctly, using the SET IDENTITY_INSERT ON property on your table before you make your second insert would be the way to go.  You could even insert the record straightaway, without incrementing and then updating the value, if that’s what you desire.

    If you’re using SQL Server 2005, the OUTPUT clause of INSERT .. SELECT may also be of use.
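    As a rough sketch of the OUTPUT route (the table and column names here are entirely hypothetical — substitute your own), you can duplicate a row, change the fields you need, let the identity increment on its own, and capture the new key in one statement:

```sql
-- Hypothetical table dbo.Widgets with identity column WidgetID.
-- Duplicate row 42, overriding two column values; list every column
-- EXCEPT the identity column so it increments automatically.
INSERT INTO dbo.Widgets (Name, Color, Price)
OUTPUT inserted.WidgetID          -- SQL Server 2005: returns the new key
SELECT 'New name', 'Blue', Price
FROM   dbo.Widgets
WHERE  WidgetID = 42;
```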

    Hope this helps!  Please let me know if there’s anything else you need.


  5. Scott Whigham says:

    Ward –

    I’ve just found your blog and I’ve enjoyed reading your posts. One question, you write, "… temp tables have statistics on them, so they generate a recompile every time they’re built."

    When you say "recompile", are you talking about using a temp table inside a stored procedure and that the stored proc will have to recompile upon every execution?

  6. Ward Pond says:

    Hi Scott..

    Thanks for your kind words.

    A procedure that creates a temp table will recompile every time it’s invoked.  A procedure which references an already-defined temp table will only recompile if the cardinality estimate for the query is different than that for the previous compilation.


  7. Scott Whigham says:

    So, if you load up Profiler, do you see an SP:Recompile event upon each invocation of any proc that includes CREATE TABLE #… or SELECT INTO?

  8. Ward Pond says:

    Hi Scott..

    Based on my reading, that’s what I’d expect; however, my research to this point does *not* bear this out.  Please give me a day or two to look into this further.

    I’ll keep you posted.  Thanks..


  9. ifo says:


    Please I need advice.

    SQL Server 2005. I have a table consisting of 200 fields with a primary key (auto-increment). I need to be able to duplicate, via a stored procedure, a specific row in the table identified by primary key value, i.e. insert a record from that table into the SAME table. The primary key of the inserted record should be incremented, and I would need to change two of the field values in the inserted record (i.e. duplicating a row in a table with an auto-increment primary key). What is the best way to do this? Someone suggested writing to a temp table and then re-inserting into the original table. I have had some issues with this; is there another method? Please note that other users may try to access the original record during this insert process. Can you give me example syntax and comments?

    Thank You!
