"Query Too Complex" message

  • Thread starter: Greg Blum

Greg Blum

I cannot get more than 9 small subqueries in a master
query without getting "Query Too Complex". I have used
the same query elsewhere with no problem. I pull out
random subqueries and it will run as soon as I get below
9.

This is a pretty simple thing. Any ideas?
 
Run the data to a temp table (in a temp file),
then run a query on that, so that you can break
the processing into two parts.
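
A rough sketch of the idea, where Accounts, Transactions, TempSummary
and the field names are only placeholders:

  First query (make-table) does the heavy lifting once and writes the
  results to a temp table:

    SELECT Acct, Sum(Amount) AS QtrTotal
    INTO TempSummary
    FROM Transactions
    GROUP BY Acct;

  Second query then reads the temp table instead of repeating the
  subqueries:

    SELECT a.Acct, t.QtrTotal
    FROM Accounts AS a
    INNER JOIN TempSummary AS t ON a.Acct = t.Acct;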

(david)
 
I have a similar problem with a 61-field query, but the query is supporting
a form where changes are made to a table, so a temp table won't work. Any
other thoughts on what makes a query "too complex"?
 
"To complex" means that the query plan builder ran out of memory
(which used to be 64K, but may have been slightly increased).

Since the working memory requirements of the query plan builder
depend on which path it takes while analysing the query, it
can change if you break the query into sub-queries, or consolidate
queries, or compact the FE and BE database, or change the run
conditions (for example, by using dynamic SQL instead of a stored
Querydef).
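
As one concrete way of breaking things up, each subquery can be saved
as its own query and referenced by name in the master query instead of
being embedded inline. It may or may not get you under the limit, but
it changes the path the plan builder takes. The names below
(Transactions, Accounts, qryQ1Totals, qryQ2Totals) are only
placeholders:

  Saved as qryQ1Totals (and similarly for the other quarters):

    SELECT Acct, Sum(Amount) AS Q1Total
    FROM Transactions
    WHERE Qtr = 1
    GROUP BY Acct;

  Master query joining the saved queries by name:

    SELECT a.Acct, q1.Q1Total, q2.Q2Total
    FROM (Accounts AS a
      INNER JOIN qryQ1Totals AS q1 ON a.Acct = q1.Acct)
      INNER JOIN qryQ2Totals AS q2 ON a.Acct = q2.Acct;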

The error message may also indicate just a bug in the query
plan builder, or a corrupted query.

In any case, there is no general magic solution: some queries are
too complex; if this query is one of them, you need to adopt an
alternate approach.

(david)
 
Thanks David. I figured I'd have to go a different route. I had tried
nested queries with the same result.

My problem is how to restructure the data to get the results I'm after.
Has anyone figured out how to have indexed data elements inside a single
field, the way other languages permit: EndBal(1), EndBal(2), etc.?
 
Dear Richard:

To have an indexed array you should create another table. The columns
of this table would be the Primary Key of your existing table, plus
another column with the index number. That becomes the Primary Key of
the new table. Add the column for your data.
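
In Jet DDL terms it might look like this, assuming the existing
table's key is AccountID and the repeating value is EndBal (all of
these names are just for illustration):

  CREATE TABLE QuarterBalances (
    AccountID LONG NOT NULL,
    QuarterNo INTEGER NOT NULL,
    EndBal CURRENCY,
    CONSTRAINT pkQuarterBalances PRIMARY KEY (AccountID, QuarterNo)
  );

What you would write in another language as EndBal(2) becomes the row
where AccountID matches and QuarterNo = 2.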

Tom Ellison
Microsoft Access MVP
Ellison Enterprises - Your One Stop IT Experts
 
Tom, your words are beyond my comprehension.

If I understand you, I'll still end up with multiple records holding the
data (recall the carry-forward problem), and a single form can't display
that.

I know that I am unwashed, so forgive my hubris.
 
Dear Richard:

I am forwarding you some moist towelettes. As for your Hubris, I
thought they quit making those years ago. I used to drive one myself.
So much for us not understanding one another.

If your goal is an updatable form, and you are not prepared to perform
some expert level coding and design, then you may be stuck with a
non-normalized approach, having separate columns in the table for each
value.

There is a way to do this dynamically using a proper normalized
design, but a thorough explanation is going to be more difficult to
understand, let alone implement, than what I said already. And it's a
long subject to cover in a newsgroup post. Besides the data
considerations I've already covered, the form itself would have to be
dynamic to handle varying numbers of columns, perhaps even exceeding
the normal limit of 255. This also can be handled using the data
scheme I recommended.
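
For the read-only display side, a crosstab query can pivot the
normalized rows back into one column per period, reusing the
placeholder names from the earlier sketch (the crosstab result is not
updatable, which is where the expert-level work comes in):

  TRANSFORM Sum(EndBal)
  SELECT AccountID
  FROM QuarterBalances
  GROUP BY AccountID
  PIVOT QuarterNo;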

Given what I have seen of subjects that are rated "expert", this one
would have to be "extreme expert", and I just don't see us going there.

Tom Ellison
Microsoft Access MVP
Ellison Enterprises - Your One Stop IT Experts
 
Thanks Tom. The other consideration pecking at this is that the project
just isn't worth the trouble. Everything runs fine for the first four
quarters, so at the end of the first year I'll dump all of the computed
fields into a new table and run the second four quarters. Done.
 