I checked a query using explain plan and found it was doing a full table scan in instance A. After I created an index, the query used the index and the run time improved by around 40%. However, when we ran the same query two weeks later with the same index, it still used the index but the cost was much higher than before. And when I ran the same query in another instance (B) after creating the index there, the explain plan showed a full table scan. Could you please let me know how to get the optimizer to use the index at a lower cost?
Normally, I'd recommend keeping your statistics up to date by regularly recalculating statistics on the table and its index(es). If the problem persists, then I'd recommend locking the statistics for this table so that even if the data distribution changes, you will still get the execution plan you consider optimal. The DBMS_STATS package can do more than recalculate statistics: its SET_TABLE_STATS and SET_INDEX_STATS procedures let you manually set the optimizer statistics to the values you want. Once you have them set, you can use the LOCK_*_STATS procedures to lock those statistics down so that further recalculations will not modify their values (* = SCHEMA or TABLE). Locking stats can be useful when you have values you know work for your environment. See the Oracle documentation for more information on these DBMS_STATS procedures.
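As a minimal sketch of the approach described above, the calls might look like the following. The schema name APP_OWNER, table MYTABLE, and the statistic values are placeholders you would replace with your own; run the block from SQL*Plus or another Oracle client as a user with the appropriate privileges.

```sql
-- Sketch only: APP_OWNER, MYTABLE, and the numeric values are hypothetical.

-- 1. Recalculate statistics on the table and its indexes.
BEGIN
  DBMS_STATS.GATHER_TABLE_STATS(
    ownname => 'APP_OWNER',
    tabname => 'MYTABLE',
    cascade => TRUE);          -- CASCADE also gathers index statistics
END;
/

-- 2. Or manually set the table statistics to values known to work.
BEGIN
  DBMS_STATS.SET_TABLE_STATS(
    ownname => 'APP_OWNER',
    tabname => 'MYTABLE',
    numrows => 1000000,        -- row count the optimizer should assume
    numblks => 8000);          -- block count the optimizer should assume
END;
/

-- 3. Lock the statistics so later gathers cannot overwrite them.
BEGIN
  DBMS_STATS.LOCK_TABLE_STATS(
    ownname => 'APP_OWNER',
    tabname => 'MYTABLE');
END;
/
```

To lock statistics for an entire schema instead of a single table, the analogous call is DBMS_STATS.LOCK_SCHEMA_STATS, and locked statistics can later be released with the corresponding UNLOCK procedures.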