Have you ever encountered a query that SQL Server could not execute because it referenced too many tables?


Solution 1

For SQL Server 2005, I'd recommend using table variables and partially building the data as you go.

To do this, create a table variable that represents the final result set you want to send to the user.

Then find your primary table (say the orders table in your example above) and pull that data, plus a bit of supplementary data that is only one join away (customer name, product name). You can use an INSERT ... SELECT to put this straight into your table variable (note that SELECT ... INTO cannot target a table variable).

From there, iterate through the table and, for each row, run a bunch of small SELECT queries that retrieve all the supplemental data you need for your result set, filling in each column as you go.

Once complete, you can then do a simple SELECT * from your table variable and return this result set to the user.
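
Here's a minimal sketch of that pattern in T-SQL. The table and column names (Orders, Customers, Products, Deliveries, DeliveredOn, and so on) are made up for illustration, and the per-row loop is collapsed into set-based UPDATEs, one small query per supplemental column, which tends to be the more idiomatic way to fill them in:

    -- Table variable holding the final result set for the user.
    DECLARE @Report TABLE (
        OrderId           INT PRIMARY KEY,
        CustomerName      NVARCHAR(100),
        ProductName       NVARCHAR(100),
        FirstDeliveryDate DATETIME NULL,
        LastDeliveryDate  DATETIME NULL
    );

    -- Seed it from the primary table plus the data that is one join away.
    INSERT INTO @Report (OrderId, CustomerName, ProductName)
    SELECT o.OrderId, c.CustomerName, p.ProductName
    FROM dbo.Orders o
    JOIN dbo.Customers c ON c.CustomerId = o.CustomerId
    JOIN dbo.Products  p ON p.ProductId  = o.ProductId;

    -- Fill in each supplemental column with its own small query.
    UPDATE r
    SET FirstDeliveryDate = (SELECT MIN(d.DeliveredOn)
                             FROM dbo.Deliveries d
                             WHERE d.OrderId = r.OrderId),
        LastDeliveryDate  = (SELECT MAX(d.DeliveredOn)
                             FROM dbo.Deliveries d
                             WHERE d.OrderId = r.OrderId)
    FROM @Report r;

    -- Return the assembled result set.
    SELECT * FROM @Report;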

I don't have any hard numbers for this, but in three distinct instances I have worked on to date, doing these smaller queries actually ran faster than one massive SELECT query with a bunch of joins.

Solution 2

@chopeen You could change the way you're calculating these statistics and instead keep a separate table of all per-product stats: when an order is placed, loop through the products and update the appropriate records in the stats table. This shifts a lot of the calculation load to the checkout page rather than running everything in one huge query at report time. Of course, some stats aren't going to work as well this way, e.g. tracking customers' next purchases after purchasing a particular product.
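
A rough sketch of that rollup approach, with hypothetical ProductStats and OrderLines tables (all names here are illustrative, and it assumes a stats row already exists for each product):

    -- Hypothetical per-product rollup table.
    CREATE TABLE dbo.ProductStats (
        ProductId     INT PRIMARY KEY,
        TimesOrdered  INT NOT NULL DEFAULT 0,
        QuantitySold  INT NOT NULL DEFAULT 0,
        LastOrderedOn DATETIME NULL
    );
    GO

    -- Called once per order from the checkout code, folding the new
    -- order's lines into the rollup instead of recomputing at report time.
    CREATE PROCEDURE dbo.UpdateProductStats
        @OrderId INT
    AS
    BEGIN
        UPDATE s
        SET TimesOrdered  = s.TimesOrdered + 1,
            QuantitySold  = s.QuantitySold + x.Qty,
            LastOrderedOn = GETDATE()
        FROM dbo.ProductStats s
        JOIN (SELECT ol.ProductId, SUM(ol.Quantity) AS Qty
              FROM dbo.OrderLines ol
              WHERE ol.OrderId = @OrderId
              GROUP BY ol.ProductId) x ON x.ProductId = s.ProductId;
    END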

Solution 3

I have never come across this kind of situation, and to be honest the idea of referencing more than 256 tables in a query fills me with mortal dread.

Your first question should probably be "Why so many?", closely followed by "What bits of information do I NOT need?" I'd also be worried that the amount of data returned by such a query would begin to impact the performance of the application quite severely.

Solution 4

This would happen all the time when writing Reporting Services reports for Dynamics CRM installations running on SQL Server 2000. CRM has a nicely normalised data schema, which results in a lot of joins. There's actually a hotfix that will up the limit from 256 to a whopping 260: http://support.microsoft.com/kb/818406 (we always thought this a great joke on the part of the SQL Server team).

The solution, as Dillie-O alludes to, is to identify appropriate "sub-joins" (preferably ones that are used multiple times) and factor them out into temp tables or table variables that you then use in your main joins. It's a major PIA and often kills performance. I'm sorry for you.
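
A rough sketch of that factoring, with made-up CRM-flavoured table names; the point is that the temp table counts as a single table name in the final query, no matter how many tables were joined to build it:

    -- Factor a frequently reused "sub-join" out into a temp table first.
    SELECT c.ContactId, c.FullName, a.City
    INTO #ContactWithAddress
    FROM dbo.Contact c
    JOIN dbo.CustomerAddress a ON a.ContactId = c.ContactId;

    -- The main report query then joins the pre-built result instead of
    -- repeating the two-table join, keeping the table count down.
    SELECT o.OrderId, cwa.FullName, cwa.City
    FROM dbo.Orders o
    JOIN #ContactWithAddress cwa ON cwa.ContactId = o.ContactId;

    DROP TABLE #ContactWithAddress;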

@Kevin, love that tee -- says it all :-).


Comments

  • Marek Grzenkowicz (almost 2 years ago)

    Have you ever seen any of these error messages?

    -- SQL Server 2000

    Could not allocate ancillary table for view or function resolution.
    The maximum number of tables in a query (256) was exceeded.

    -- SQL Server 2005

    Too many table names in the query. The maximum allowable is 256.

    If yes, what have you done?

    Given up? Convinced the customer to simplify their demands? Denormalized the database?


    @(everyone wanting me to post the query):

    1. I'm not sure if I can paste 70 kilobytes of code in the answer editing window.
    2. Even if I could, it wouldn't help, since those 70 kilobytes of code reference 20 or 30 views that I would also have to post; otherwise the code would be meaningless.

    I don't want to sound like I am boasting here, but the problem is not in the queries. The queries are optimal, or at least almost optimal; I have spent countless hours optimizing them, looking for every single column and every single table that can be removed. Imagine a report with 200 or 300 columns that has to be filled by a single SELECT statement (because that's how it was designed a few years ago, when it was still a small report).

    • Stu (almost 16 years ago)
      Are you using SQL Server 2000 SP3?
    • Calvin Allen (almost 16 years ago)
      Could you possibly create some views?
    • Marek Grzenkowicz (over 14 years ago)
      Views won't help. Tables used in views count towards the limit, too.
  • Marek Grzenkowicz (over 14 years ago)
    Imagine a customer who wants to see a list (in a grid) of the items in their stores, along with a great many related pieces of information (e.g. about the first order of every item, the last order, the first delivery, the last delivery, the customers who buy it, the costs of delivery, ...). Believe me, it is possible; I've seen it, and I have had to cope with it. :) Yes, the impact on performance can be severe in such situations.
  • Marek Grzenkowicz (over 13 years ago)
    It's weird that it helped - as far as I know, tables used in views count towards the limit. Are you using indexed views?
  • Marek Grzenkowicz (over 13 years ago)
    No iterator of any kind. Just a single SELECT statement.