🎯 Get Latest SQL Guided Project (1M sales records)
🎯topmate.io/zero_analyst/1237072
🎯 Get Latest
End to End Advanced SQL Project (Amazon)+ 70 Business Problems
th-cam.com/video/J9qFIAH788Q/w-d-xo.html
🐍 All Python Projects:
Python ETL - Data Cleaning Project
th-cam.com/video/uDQO9xOg4xg/w-d-xo.html
Sir, please create a job-oriented Data Analyst playlist with real-world projects and interview questions and answers 😊
Your consistency in creating videos and your clear explanations are truly impressive.🎉 It would be great if you could upload advanced-level Data Analytics projects for experienced professionals in Power BI, Excel, and SQL in the same style.
Sure will make it!
Finally I have found one of the great YouTube sources for SQL practice, thanks a lot sir.
Can you please provide a lecture on an end-to-end Python project that is directly relevant to data analysts?
While removing NULL values you omitted the 'age' column, which has 10 NULL values, so after deleting all NULL values you should have 1987 rows left.
Great work!!
In Q.10 we can also do this:
SELECT
    CASE
        WHEN HOUR(sale_time) < 12 THEN 'Morning'
        WHEN HOUR(sale_time) BETWEEN 12 AND 17 THEN 'Afternoon'
        ELSE 'Evening'
    END AS shift,
    COUNT(*) AS orders
FROM retail_sales
GROUP BY 1;
Your consistency is great!!!
Thank you for supporting!
Please upload next part soon... very well explained
Sure
Good initiative... Gradually increase the number of tables and relationships, and make the analysis a bit more advanced and complex.
Nice wallpaper!
I just subscribed to this channel, and your teaching is great ❤
Thanks and welcome
I am using MySQL Workbench and got 1987 rows of data; I guess the rows with NULL values did not import. Is there any issue with that, since if the age is not present we could delete that row anyway? If we had done this in Python, deleting would have been the first option. What harm is there in using MySQL with 1987 rows?
Not able to import the CSV file; getting this error:
File ":\program files\mysql\mysql workbench 8.0\modules\sqlide_power_import_export_be.py", line 431, in start_import
    self._editor.executeManagementCommand(query, 1)
grt.DBError: ("Unknown column 'None' in 'field list'", 1054)
ERROR: Import data file: ("Unknown column 'None' in 'field list'", 1054)
In the CSV file there is an age column after the gender column, but in the NULL check (timestamp 22:30) there is no age column after gender. Can you check it for me? It shows some NULL values in the age column as well.
Yeah it has some null values
You can delete it or keep it based on the requirements!
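If you do decide to remove them, here is a minimal sketch (the table name retail_sales is from the video; the exact column list is an assumption, so adjust it to your schema):

```sql
-- Inspect the affected rows first (column names are assumptions for illustration)
SELECT *
FROM retail_sales
WHERE age IS NULL
   OR quantity IS NULL
   OR price_per_unit IS NULL;

-- Then remove them, using the same predicate
DELETE FROM retail_sales
WHERE age IS NULL
   OR quantity IS NULL
   OR price_per_unit IS NULL;
```

Running the SELECT first lets you confirm the row count before the destructive DELETE.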
Thanks for this video sir
Thank you for supporting!
In MS SQL Server, the TO_CHAR() and EXTRACT() functions are not working, bro. Which functions should I use instead?
Yes... me too, same problem bro.
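For reference, SQL Server ships close equivalents; a sketch assuming the project's retail_sales table and columns:

```sql
-- PostgreSQL: TO_CHAR(sale_date, 'YYYY-MM')
-- SQL Server equivalent (FORMAT is available from SQL Server 2012 onward):
SELECT FORMAT(sale_date, 'yyyy-MM') AS sale_month
FROM retail_sales;

-- PostgreSQL: EXTRACT(HOUR FROM sale_time)
-- SQL Server equivalent:
SELECT DATEPART(HOUR, sale_time) AS sale_hour
FROM retail_sales;
```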
Hey, do we need any installed prerequisites? pwd is not working for me in Command Prompt.
On Windows, use cd (with no arguments) to see the present working directory.
We don't have the luxury of deleting rows with missing data in a professional setting; the tables are hosted on production databases. In my short experience, we should learn to deal with NULL values.
True!
Hi @zero_analyst,
For question 9, I understood that we need to find unique customers who have purchased items from all categories, but what you have done implies the unique customers for each category.
Can you please explain where I am going wrong?
The question itself implies that you have to find customers for each category; for that you have to use DISTINCT.
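For comparison, if the intended reading were "customers who purchased from every category", a relational-division sketch (assuming the project's retail_sales table) would look like:

```sql
-- Customers whose distinct category count equals the total number of categories
SELECT customer_id
FROM retail_sales
GROUP BY customer_id
HAVING COUNT(DISTINCT category) =
       (SELECT COUNT(DISTINCT category) FROM retail_sales);
```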
Facing a problem with importing the dataset into MySQL Workbench: the whole dataset is not imported, even though I handled the datatype of each column. What to do?
For MySQL, please use the command line to import!
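A sketch of such a command-line import that keeps blank fields as NULLs instead of failing on them (the file path and the column list are assumptions; match them to your CSV):

```sql
-- Run in the mysql client started with --local-infile=1
LOAD DATA LOCAL INFILE 'retail_sales.csv'
INTO TABLE retail_sales
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
IGNORE 1 LINES
(transactions_id, sale_date, sale_time, customer_id, gender,
 @age, category, @quantity, price_per_unit, cogs, total_sale)
SET age      = NULLIF(@age, ''),
    quantity = NULLIF(@quantity, '');
```

The NULLIF() mapping is what stops empty strings in numeric columns from aborting the import.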
Thanks a lot, bro.
Thank you mate!
Please subscribe and support if it was helpful!
thank you 🙏
How would it be different if I want to do it in MySQL? Meaning, what changes or points do I need to take into consideration if I do it in parallel while watching your video?
Please let me know, as it would help me with your other project videos as well. I have completed courses on SQL and am new to building projects.
Core SQL concepts stay the same across platforms; only a few keywords or functions may change, but everything else is the same. And even if you don't find a particular keyword or function in MySQL, you can refer to ChatGPT for an alternative.
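A couple of the common one-to-one swaps, as a sketch against this project's table:

```sql
-- PostgreSQL
SELECT TO_CHAR(sale_date, 'YYYY-MM') AS sale_month,
       EXTRACT(HOUR FROM sale_time)  AS sale_hour
FROM retail_sales;

-- MySQL equivalent
SELECT DATE_FORMAT(sale_date, '%Y-%m') AS sale_month,
       HOUR(sale_time)                 AS sale_hour
FROM retail_sales;
```

(EXTRACT itself actually works in both; TO_CHAR is the one MySQL lacks.)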
Hey Najir, thanks for the videos. I have a query: I uploaded the dataset into MySQL and found there are only 1987 records. How can I handle this situation?
When I did the NULL values check on the data, I found 13 rows with NULLs, of which 10 rows are from the age column (5 female and 5 male). Will this affect my analysis? Any suggestions?
Yes, I missed the age column!
You can delete all NULL values!
Understood, thank you by the way. It's my daily routine to watch at least one of your videos; if by any chance I miss one, I make up for it the next day!
Hey, also when I ran COUNT(*) it gave 1987 rows, not 2000. Was it the same for you? I don't know why SQL didn't import the rows with NULL values. Did you do your project with 1987 rows?
@@Aurora-rd1fd Do this in pgAdmin; then it will import all 2000 rows.
Can I download the raw file instead of doing all of this?
Yes, you can; check the video description.
Hey! I'm new to SQL so I'm not sure of this. Why would you 'CREATE TABLE' and add data types, instead of just importing the data as a CSV file? Is there any difference if I follow importing data as a CSV file over creating a table? I don't know much about this since I just started to learn SQL, great videos btw.
In Postgres you cannot import data without first creating the table and defining its structure.
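In concrete terms, a minimal sketch (the column names and types here are assumptions based on the dataset shown in the video; adjust as needed):

```sql
CREATE TABLE retail_sales (
    transactions_id INT PRIMARY KEY,
    sale_date       DATE,
    sale_time       TIME,
    customer_id     INT,
    gender          VARCHAR(10),
    age             INT,
    category        VARCHAR(20),
    quantity        INT,
    price_per_unit  NUMERIC(10, 2),
    cogs            NUMERIC(10, 2),
    total_sale      NUMERIC(10, 2)
);

-- Then, from psql:
-- \copy retail_sales FROM 'retail_sales.csv' WITH (FORMAT csv, HEADER true)
```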
can I build this project using snowflake?
Yes you can!
Can you please upload the series as quickly as possible, because it will be helpful for our careers?
Project 2 uploaded please check
th-cam.com/video/6X2-P9fNVvw/w-d-xo.htmlsi=eqimtm8H_iFxEiW7
One doubt: is there any way of checking NULL values across the entire table instead of mentioning each column? Because what if we have a hundred or more columns?
This is the only way i follow!
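You still end up listing the columns in the final query, but in Postgres you can generate that predicate from the catalog instead of typing it by hand; a sketch:

```sql
-- Produces text like: age IS NULL OR category IS NULL OR ...
-- Paste the result into: SELECT * FROM retail_sales WHERE <predicate>;
SELECT string_agg(column_name || ' IS NULL', ' OR ')
FROM information_schema.columns
WHERE table_schema = 'public'
  AND table_name   = 'retail_sales';
```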
Sir, is there any way to save these queries in pgAdmin? When I close pgAdmin and reopen it, all my queries disappear.
The query history shows them, but in a very confusing way.
Simply click save!
Subscribed
Please share your feedback so I can work on improving it.
thank you
I'm not able to do the steps mentioned in Command Prompt.
You can download the file from my GitHub!
I am facing an error on using the TO_CHAR() function: "Error Code: 1305. FUNCTION sq_project_p1.TO_CHAR does not exist". I am using MySQL Workbench.
In MySQL you should use a different function!
SELECT *
FROM transactions
WHERE category = 'Clothing'
AND quantity_sold > 4
AND DATE_FORMAT(transaction_date, '%Y-%m') = '2022-11';
4:06 Can you please tell how this effect is made? In OBS, with a shortcut key, or in editing?
I use a tool called ScreenFlow; it is only available on Mac!
While importing data into MySQL, only 1987 rows are showing.
Please use PostgreSQL and pgAdmin 4; check my other video on how to install pgAdmin 4: th-cam.com/video/x73dpaYzFoY/w-d-xo.html
Same problem, bhai. I also get only 1987 rows in MySQL.
@@MOTIVATIONALQUOTE-wt3de Even I got the same count. The reason is that we have ten NULL values in the age column, which are dropped in addition to those three records.
@@ajaysherkhane8797 Same for me too; I am only able to import 1987 rows. I also went through sir's importing lecture, but the issue I detected is that only the rows having any NULL column (13) are being left out. I tried to alter the table to allow NULL values, but the result is still the same. If anyone can help me with this, I will be truly grateful. Sir, if you are familiar with this issue, kindly help. Thanks.
Why don't we just clean the data in Excel and import it into the DBMS?
Yes, we can do it in Excel as well!
This is a SQL project, so we are using SQL to clean the data!
Excel has limitations: a sheet can only hold about 1 million rows (1,048,576), whereas in SQL you don't have any such limit, and once you create a function you can reuse it!
Your English is very good 😂
With more than 11 lakh (1.1M) rows, how do I find MAX(LEN(column)) to identify the character length? Doing it manually is not possible.
You can use a Python script.
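Alternatively, the database itself can do this in a single pass; a sketch (the column name here is an assumption):

```sql
-- MySQL: CHAR_LENGTH(); SQL Server uses LEN() instead
SELECT MAX(CHAR_LENGTH(category)) AS max_len
FROM retail_sales;
```

The result is the smallest VARCHAR size that fits the existing data.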
Is there anybody else getting an issue importing the full 2000 rows into the table in MySQL? I am only able to import 1987 rows. I also went through sir's importing lecture, but the issue I detected is that only the rows having any NULL column are being left out. I tried to alter the table to allow NULL values, but the result is still the same. If anyone can help me with this, I will be truly grateful. Sir, if you are familiar with this issue, kindly help. Thanks.
This project was built using Postgres, so you may have import issues in MySQL!
Tq🥰
Can we make it using MySQL?
Yes you can!
How often are you going to upload a video?
Part 2 uploaded today please check!
Sir, please provide the dataset.
github.com/najirh/Retail-Sales-Analysis-SQL-Project--P1
Hello could you please share your excel sheet
Check the video description and download the dataset!
First watch the full video; I have explained how to download the dataset there.
I have also added the Git repo in the description!
the excel sheet that you provided has incomplete values
Just checked; the CSV file has the correct number of rows (2000)!
Please go through the video and use pgAdmin 4 and PostgreSQL, as this dataset is only compatible with pgAdmin 4 and PostgreSQL.
If you want to know how to install and import the file, go through the respective video mentioned in the description!
It's Spanish, buddy.