Comments •

  • @akashkhantwal2430 · 2 months ago

    Amazing video sir, very informative ❤❤❤. It helped me a lot.

    • @TheMLMine · 2 months ago

      Glad to hear that

    • @akashkhantwal2430 · 2 months ago

      @TheMLMine Sir, how can I connect with you on Instagram?

  • @gospelmoto2833 · 1 month ago

    Very nice video! You nailed it. You got a new sub here. Thanks for your informative tutorials, exactly what I needed.

    • @TheMLMine · 1 month ago +1

      Thanks. Glad it was helpful

  • @sweetsubha514 · 4 months ago +2

    Your efforts are highly appreciated 👍👍

    • @TheMLMine · 4 months ago

      Thank you 😊

  • @deepak12428 · 12 days ago

    Thanks for the video. You can also use a CTE instead of creating temporary tables.
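    The CTE variant of the duplicate-deletion pattern can be sketched as follows. This is a minimal demo using Python's stdlib `sqlite3` (the video uses MySQL; the `emp` table and its columns here are assumed sample data, not from the video):

    ```python
    # Deleting duplicate rows with a CTE instead of a temp table (SQLite demo).
    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.executescript("""
    CREATE TABLE emp (emp_id INTEGER PRIMARY KEY, name TEXT, age INT, salary INT);
    INSERT INTO emp VALUES
      (1, 'Aviral', 30, 50000),
      (2, 'Meena',  28, 40000),
      (3, 'Aviral', 30, 50000);  -- duplicate of emp_id 1
    """)

    # The CTE ranks rows within each (name, age, salary) group;
    # any row ranked > 1 is a duplicate of the group's first row.
    conn.execute("""
    WITH ranked AS (
      SELECT emp_id,
             ROW_NUMBER() OVER (PARTITION BY name, age, salary ORDER BY emp_id) AS row_num
      FROM emp
    )
    DELETE FROM emp WHERE emp_id IN (SELECT emp_id FROM ranked WHERE row_num > 1)
    """)

    remaining = [r[0] for r in conn.execute("SELECT emp_id FROM emp ORDER BY emp_id")]
    print(remaining)  # [1, 2]
    ```

    The same WITH ... DELETE shape works in MySQL 8.0+, which is what makes the CTE a drop-in replacement for the temp-table step.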

    • @echodelta7680 · 12 days ago

      Hi,
      Suppose there are two tables, A and B, each with only two columns, ID and NAME. We have to produce an output that interleaves their rows:
      1st row from A
      1st row from B
      2nd row from A
      2nd row from B
      3rd row from A
      3rd row from B ... and so on.
      What would be the query to generate this output?
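      One possible answer (my sketch, not from the video): number the rows of each table, tag each row with its source, and sort by position first, source second. Demonstrated with Python's stdlib `sqlite3`; the sample data is assumed:

      ```python
      # Interleave rows of A and B: 1st of A, 1st of B, 2nd of A, 2nd of B, ...
      import sqlite3

      conn = sqlite3.connect(":memory:")
      conn.executescript("""
      CREATE TABLE A (ID INT, NAME TEXT);
      CREATE TABLE B (ID INT, NAME TEXT);
      INSERT INTO A VALUES (1, 'a1'), (2, 'a2'), (3, 'a3');
      INSERT INTO B VALUES (10, 'b1'), (20, 'b2'), (30, 'b3');
      """)

      rows = conn.execute("""
      SELECT ID, NAME FROM (
        SELECT ID, NAME, 1 AS src,
               ROW_NUMBER() OVER (ORDER BY ID) AS pos FROM A
        UNION ALL
        SELECT ID, NAME, 2 AS src,
               ROW_NUMBER() OVER (ORDER BY ID) AS pos FROM B
      )
      ORDER BY pos, src
      """).fetchall()

      names = [name for _, name in rows]
      print(names)  # ['a1', 'b1', 'a2', 'b2', 'a3', 'b3']
      ```

      Since neither table has a defined row order, ROW_NUMBER needs some ORDER BY; ID is used here as the assumed ordering column. In MySQL the derived table would also need an alias.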

  • @mahenderchilagani5916 · 2 months ago

    Very well explained, thanks for the videos.

    • @TheMLMine · 2 months ago

      Thanks. Glad it was helpful

  • @rajm5349 · 2 months ago

    Thanks for explaining window functions clearly. If possible, can you explain a healthcare project?

  • @willz2622 · 2 months ago

    Please keep making these videos.

    • @TheMLMine · 2 months ago

      Feedback noted

  • @AbhishekSharma-vm7tr · 3 months ago +1

    Nice video. Do they ask these types of questions to freshers, or do they ask harder questions for data analytics?

    • @TheMLMine · 3 months ago +1

      Thanks Abhishek. They may ask these types of questions in a data analytics interview. However, the idea is that you should be aware of the different functions and options available in SQL and how they can be used in different applications.
      Some may ask more elaborate questions where you will have to join different tables and apply these functions to different requirements. This playlist is highly recommended but certainly not enough on its own. If you want to solve more questions, you can refer to www.hackerrank.com/domains/sql
      Hope it helps.

  • @akashkhantwal2430 · 2 months ago +1

    Sir, when I run the first duplicate-deletion method you showed, why does it give an error?
    delete from emp
    where emp_id in (select emp_id from (
        select *, row_number() over (partition by name, age, salary order by emp_id) as row_num from emp
    ) as table1 where row_num > 1);
    This is the error:
    Error Code: 1175. You are using safe update mode and you tried to update a table without a WHERE that uses a KEY column.
    To disable safe mode, toggle the option in Preferences -> SQL Editor and reconnect. 0.016 sec

    • @TheMLMine · 2 months ago

      Hi akash,
      This error is because MySQL safeupdate setting is on. Safeupdate restricts certain types of queries that can unintentionally affect a large number of rows while deletion. And, using a WHERE clause with condition on a non-key column (that can have repeated values) could be one cause of that.
      What can you do? For now, you can turn the setting off in Edit->Preferences->SQL Editor->Uncheck Safe Updates (you can find the checkbox at the bottom most position in the window). Then restart MySQL and try again.
      Your query should work.
      Let me know if you face any issues

  • @yashikachugh4198 · 2 months ago

    Regarding Ans 1), the basic row_number approach:
    Here Aviral has 2 IDs, and wherever row_num is greater than 1 you deleted that record arbitrarily. If nothing is specified, what is the best way to explain to the interviewer how we decide which record to delete? Also, in that case the same row_num > 1 solution would not work; should we use the rank or dense_rank function instead, since they give the same rank to identical values, so we could pull those values and then decide? (Just thinking.)
    The group by approach is fine.

    • @TheMLMine · 2 months ago

      Good question, Yashika.
      - If nothing is specified (about which row to delete): if you observe, the entries with row_num > 1 are the ones that may have been inserted at a later stage. You can justify to the interviewer that you are deleting only the later copies of repeated rows, keeping the originally entered row (row_num = 1) as it is.
      - If the interviewer asks you to keep specific rows, chosen by emp_id among the repeated values, you can simply do
      DELETE FROM emp
      WHERE emp_id IN ()
      - I am not sure how you would use rank or dense_rank here, because you would still need to select which rank value to keep, and you don't know at what rank the duplicate values sit.
      In all cases, you will either use the emp_id column (since that is the only unique column we have) or a self-created column (row_num/rank/dense_rank), depending on which rows you need to delete.
      Hope it helps. Let me know if you have any further queries.
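      The "keep one specific row per group" idea above can also be expressed without any window function, by retaining the smallest emp_id of each duplicate group. A minimal sketch using Python's stdlib `sqlite3` (the video uses MySQL; the sample data is assumed):

      ```python
      # Keep the row with the minimum emp_id in each (name, age, salary) group;
      # delete every other copy.
      import sqlite3

      conn = sqlite3.connect(":memory:")
      conn.executescript("""
      CREATE TABLE emp (emp_id INTEGER PRIMARY KEY, name TEXT, age INT, salary INT);
      INSERT INTO emp VALUES
        (1, 'Aviral', 30, 50000),
        (2, 'Aviral', 30, 50000),  -- duplicate of emp_id 1
        (3, 'Meena',  28, 40000);
      """)

      conn.execute("""
      DELETE FROM emp
      WHERE emp_id NOT IN (
        SELECT MIN(emp_id) FROM emp GROUP BY name, age, salary
      )
      """)

      kept = [r[0] for r in conn.execute("SELECT emp_id FROM emp ORDER BY emp_id")]
      print(kept)  # [1, 3]
      ```

      Note that MySQL rejects a subquery on the table being deleted from (error 1093); there you would wrap the subquery in a derived table first.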

  • @rokithkumar4905 · 4 months ago +1

    Appreciated, bro 👍🏻 Is there any chance we get matplotlib after this series?

    • @TheMLMine · 4 months ago

      Sure bro, soon you will see a video on that as well 👍

  • @BE__jagdishChintamani · 4 months ago +2

    Hello sir,
    Please make detailed videos on these Python libraries:
    Pandas
    Numpy
    Matplotlib
    Seaborn

    • @TheMLMine · 4 months ago

      Hello Jagdish, the next video, on matplotlib, is coming soon.

  • @shinewithshreyashi2945 · 1 month ago

    Please make more videos 😇