Excellent use on extended statistics
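For readers curious about the extended statistics mentioned here, a minimal sketch of creating a column-group statistic with `DBMS_STATS` (the table and column names are illustrative, not from the video):

```sql
-- Create a column group so the optimizer can account for the
-- correlation between the two columns (names are hypothetical)
SELECT DBMS_STATS.CREATE_EXTENDED_STATS(
         ownname   => USER,
         tabname   => 'CUSTOMERS',
         extension => '(cust_state_province, country_id)')
FROM   dual;

-- Re-gather statistics so the new column group is populated
EXEC DBMS_STATS.GATHER_TABLE_STATS(ownname => USER, tabname => 'CUSTOMERS');
```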
Hi Maria, your explanation is extraordinary. I am learning performance tuning from sites and tutorials, but everyone explains with only simple examples. You are giving valuable information. I wish you would post more videos on performance tuning.
I love this Maria, you've made what can be a complex subject so easy to understand!
Very, very useful video, and it is helping with my daily DBA routines. Thanks Maria
Love it! Helpful, useful, effective, and entertaining!
Great info with excellent explanation as always.
Hi, and thanks for your explanation! It makes it much easier to understand the internal workings of a query
Glad it was helpful!
Hi! Great content. I have a question: what exactly is the point of changing the join method if the cost of the new plan is 9x more expensive?
The original plan is slower, and its cost is artificially low because the cardinality estimates are incorrect. Fixing the cardinality estimates changes the cost of both plans, and the hash join plan that was 9x more expensive now becomes the cheaper of the two and provides better performance (a shorter elapsed time).
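The estimate-versus-actual mismatch described above can be checked directly. A hedged sketch, assuming a statement run with the `GATHER_PLAN_STATISTICS` hint; the tables and predicates are illustrative:

```sql
-- Run the statement with row-source statistics collection enabled
SELECT /*+ GATHER_PLAN_STATISTICS */ c.cust_id, s.amount_sold
FROM   customers c, sales s
WHERE  c.cust_id = s.cust_id;

-- Compare E-Rows (estimated) with A-Rows (actual) for the last execution;
-- a large gap points at stale or missing optimizer statistics
SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY_CURSOR(format => 'ALLSTATS LAST'));

-- If the estimates are off, refresh the statistics and re-check
EXEC DBMS_STATS.GATHER_TABLE_STATS(ownname => USER, tabname => 'CUSTOMERS');
```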
Hello Maria, great content as always. I have a question: at 2:44 I still don't get why the optimizer could not use the actual row count of 10K when it was able to take 1% of the 10K to calculate the cardinality estimate ;-)
Absolutely loved it.
Thank you very much for all your work!
Is it good practice to use a bitmap index for a column like cust_id? I assume that this column is effectively the primary key of the Customers table, so it will contain lots of distinct (unique) values.
Hi Vivi, honestly, no, it's not good practice to use a bitmap index on a column like cust_id, which has a lot of distinct values. A bitmap index is typically most effective on a column with a low number of distinct values.
@@SQLMaria Thank you
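The rule of thumb from the exchange above can be sketched as follows; the index and column names are hypothetical:

```sql
-- High-cardinality / unique column: a B-tree index is the usual choice
-- (here created explicitly; a primary key constraint would do this implicitly)
CREATE UNIQUE INDEX cust_id_uix ON customers (cust_id);

-- Low-cardinality column (only a handful of distinct values):
-- a bitmap index is where that structure pays off
CREATE BITMAP INDEX cust_gender_bix ON customers (cust_gender);
```

Note that bitmap indexes are also best suited to read-mostly data: concurrent row-level DML against a bitmap-indexed column can cause significant locking contention.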
At 4:01 the closing parenthesis on the 2nd statement is an open paren instead of a close paren.
The second statement is actually the heading of the statement output that is truncated automatically by Oracle.
Great
Your avatar so cute :)
Oh thank you!
First off, the first SQL statement is literally terrible. 1) Never use a WHERE clause to define joins; in the order of operations WHERE comes after the join, and in a million-row dataset you can exclude rows directly BEFORE they hit the WHERE clause. These are literally horrific examples. NEVER use functions on indexes. Just use the indexes without functions. This person should be taken off TH-cam. Please find other creators. Holy god.
Hi Trevor, thanks for your comment, but I fear you missed the point of the video.
The purpose of the video is to show how to tune a SQL statement rather than how to write the best SQL statement. Also, I suspect your knowledge of how the sophisticated Oracle Optimizer works is sadly lacking. Regardless of where you specify a join condition within a SQL statement, the Oracle Optimizer will first find the most efficient access methods to retrieve the data, applying the WHERE-clause predicates as either access or filter predicates, and then join the data sets.
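This point is easy to verify: the two forms below typically produce identical plans, because the optimizer treats a join condition in the WHERE clause and in an ANSI ON clause the same way. A sketch, assuming SH-style CUSTOMERS and SALES tables:

```sql
-- Traditional (WHERE-clause) join syntax
EXPLAIN PLAN FOR
SELECT c.cust_id, s.amount_sold
FROM   customers c, sales s
WHERE  c.cust_id = s.cust_id;

SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);

-- ANSI join syntax: compare the plan output -- it should match
EXPLAIN PLAN FOR
SELECT c.cust_id, s.amount_sold
FROM   customers c
JOIN   sales s ON c.cust_id = s.cust_id;

SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);
```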