
We have a situation where a query run through an SSIS package takes a long time during the weekend run, whereas it completes in 3 minutes during weekday runs. There is a considerable increase in the record count at the weekend, but even with that it should run in at most 15 minutes.

We have found a temporary workaround, but it is a manual effort that needs to be performed before every weekend run.

The workaround is: we run the SQL task's source query in SSMS before the package is triggered from the job.

This manually run query also takes a long time to execute, but we abort it partway through. Running it is enough to put an execution plan into that server's plan cache.

After this, when the query runs from the package it executes in 3 minutes regardless of the number of records.
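The manual warm-up described above can be sketched roughly as follows. This is an illustration only: the table, columns, and parameters are placeholders, since the actual query is not shown in the question.

```sql
-- Step 1 (in SSMS): start the same source query the SSIS SQL task uses,
-- then cancel it once it begins running. Compiling the statement is what
-- places a plan in the plan cache; the query does not need to finish.
-- (dbo.SourceTable, LoadDate, @StartDate, @EndDate are hypothetical names.)
SELECT col1, col2
FROM dbo.SourceTable
WHERE LoadDate BETWEEN @StartDate AND @EndDate;
-- abort manually after execution starts

-- Step 2 (optional): confirm a plan for the statement is now cached
-- before the job fires.
SELECT cp.usecounts, cp.objtype, st.text
FROM sys.dm_exec_cached_plans AS cp
CROSS APPLY sys.dm_exec_sql_text(cp.plan_handle) AS st
WHERE st.text LIKE '%SourceTable%';
```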

Kindly let us know whether a permanent fix is possible for this.

Thanks,

SANDY

Sandeep212
  • http://www.sommarskog.se/query-plan-mysteries.html pretty much covers most of the options? – Richard Hansell Nov 04 '14 at 09:12
  • Hi Richard, I had read that previously, but the thing here is that we cannot create an index on any column, as the data is not similar. We also tried a BETWEEN clause in the query, but the problem remains: the query doesn't execute fast on the first run, but once we abort it and execute again, it runs fast from both SSMS and the SSIS package. – Sandeep212 Nov 04 '14 at 10:37
  • 1
    maybe read this then? http://stackoverflow.com/questions/20864934/option-recompile-is-always-faster-why If you use OPTION (RECOMPILE) and it fixes the problem then this isn't a proper solution. But it does help you to identify what sort of issue you have with your query... – Richard Hansell Nov 04 '14 at 10:42
  • Is there anything between the last fast run and the slow run that would be flushing the data cache? A scheduled server restart, for instance? – Ben Thul Nov 04 '14 at 13:32
  • Hi Ben, no, no restart or update of plans was done. The query runs fast only after the plan is available in the cache, which we create by manually executing and aborting the query; otherwise the package runs for a long time. Today the records-per-minute rate was not what we expected. – Sandeep212 Nov 05 '14 at 04:44
  • I suspect that @RichardHansell has a good suggestion to try. What you are describing in both the problem and solution suggest that you have a plan which is being reused. However, the plan is inappropriate for some executions because of the change in record volume. I actually think it is an appropriate solution to compile your proc with recompile if this is being called once a day in a batch process and is supposed process different volumes of records. Though updating statistics might be an appropriate maintenance task to fit in there as well. – Mark Wojciechowicz Nov 05 '14 at 15:01
  • Yes, @MarkWojciechowicz is correct and I do have a number of overnight processes with OPTION (RECOMPILE) at the start for precisely this reason. However, this is not a "silver bullet" fix. Similar to people who stick NO LOCK all over their scripts because it fixed a problem they had once and they now assume it is good practice (when really it just means they will be reading uncommitted data). There is a cost to recompiling the query plan every time you run a script; not a huge cost but still a consideration. – Richard Hansell Nov 05 '14 at 17:07
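The two fixes suggested in the comments can be sketched as below. Object names are placeholders (the real query is not shown), so this is only an outline of the approach, not the asker's actual code.

```sql
-- Option 1: force a fresh plan on every execution, so a plan compiled
-- for the small weekday row count is never reused for the large weekend
-- run. There is a per-execution compile cost, as Richard Hansell notes.
SELECT col1, col2
FROM dbo.SourceTable
WHERE LoadDate BETWEEN @StartDate AND @EndDate
OPTION (RECOMPILE);

-- Option 2: refresh statistics before the weekend run (e.g. as a job
-- step ahead of the package), so the optimizer has accurate row-count
-- estimates even without RECOMPILE.
UPDATE STATISTICS dbo.SourceTable WITH FULLSCAN;
```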

0 Answers