Adapting the plan at runtime seems like the most universal solution for optimizer edge cases and is already implemented in the big 3.
If you think about it, the adaptive mechanism doesn't have to be perfect to deliver a lot of uplift. Even coarse-grained detection followed by restarting from zero can make a huge difference when the alternative is a query burning up the server for hours and a failed batch job.
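To make that concrete, here is a minimal sketch of what "coarse-grained detection plus restart from zero" could look like. Every name in it (the planner, the executor, the exception) is hypothetical and not any real engine's API:

    class RowLimitExceeded(Exception):
        """Hypothetical: raised by the executor when the row-count guard trips."""

    def run_with_guard(query, planner, executor, blowup_factor=10):
        # Plan as usual; the estimate is allowed to be wrong, we just
        # bound how wrong it can get before we intervene.
        plan = planner.optimize(query)
        budget = plan.estimated_rows * blowup_factor
        try:
            return executor.run(plan, row_limit=budget)
        except RowLimitExceeded:
            # Coarse-grained detection fired: discard the partial work and
            # restart from zero with a conservative plan (e.g. hash joins
            # and scans instead of nested loops over a "tiny" input).
            fallback = planner.optimize(query, assume_worst_case=True)
            return executor.run(fallback)

Even a guard this crude turns "hours of runaway execution and a failed batch job" into "one wasted attempt plus a slower but safe plan".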
qazxcvbnm 1 hour ago
Where can I find more information on the state of the art in runtime adaptation? I confess I don't feel like I get anything of the sort from the databases I regularly use.
Adaptation sounds very compelling: instead of emitting a plan based on a single cardinality estimate, emit the plan together with a range of reasonable intermediate cardinalities and expected times, interrupt the original plan when those expectations are exceeded by an order of magnitude, and switch to an alternative plan based on the newly gathered physical information. That sounds like it would be greatly beneficial. Are there concrete reasons this has not been done (e.g. cost, complexity)?
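For what it's worth, the big commercial engines do a constrained version of this inside a single plan (Oracle's adaptive plans and SQL Server's batch-mode adaptive joins pick between pre-wired alternatives at runtime) rather than a full interrupt-and-replan loop. The loop you're describing might look roughly like this sketch; again, every name here is hypothetical:

    class CardinalityViolation(Exception):
        """Hypothetical: raised when an operator's observed row count
        exceeds its expected range by more than the slack factor."""
        def __init__(self, measured):
            self.measured = measured  # operator id -> observed row count

    def execute_adaptively(query, planner, executor, slack=10, max_replans=3):
        observed = {}  # true cardinalities gathered from aborted attempts
        for _ in range(max_replans):
            # Re-plan with whatever physical information we have so far;
            # on the first pass this is just the usual estimates.
            plan = planner.optimize(query, known_cardinalities=observed)
            try:
                return executor.run(plan, slack=slack)
            except CardinalityViolation as v:
                # Keep the information we paid for (a fully built hash table,
                # say, reveals the true size of its build input), then retry.
                observed.update(v.measured)
        # Out of retries: let the best-informed plan run to completion.
        plan = planner.optimize(query, known_cardinalities=observed)
        return executor.run(plan, slack=None)

The sketch also hints at the concrete costs you're asking about: aborted attempts waste work unless partial results can be reused, every operator must be safely cancellable mid-pipeline, and the optimizer has to accept runtime feedback, which is presumably much of why cost and complexity keep this rare.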
RaftPeople 2 hours ago
> Adapting the plan at runtime seems like the most universal solution for optimizer edge cases and is already implemented in the big 3
But the issues frequently aren't edge cases, and they frequently do arise at runtime (i.e. a brand-new query); it's just the best the optimizer can do with limited information (cardinality) and a limited duration in which to evaluate alternatives.
EDIT:
I just realized I misunderstood the post I responded to. I thought "adapt" meant updating a previously stored query plan, but I believe the intended meaning is adapting during execution of the query.