We need full data engineering course py + sql + big data hadoop + apache spark + apache airflow + apache kafka + aws + project
If you're running into an error with "exit code 1" @1:28:55, you need to update the Dockerfile @1:08:08. Go to the Dockerfile and set "image: postgres:9.2" for both "source_postgres" and "destination_postgres"
I think this was a good idea, but many details were off. I wish he'd be more specific about the versions of the software he's using next time.
Thanks, I wasted many hours on that error. I would appreciate it if you could expand on the reasons why this error happens. Again, thanks for your contribution.
God bless you mate !
Thanks a lot for the point. I spent a bit of time trying to solve the issue.
Bro, my code is still running (after I edited the part you mentioned), but I believe in you! Thank you so much, wallah you are a savior! 🔥🔥
thanks mate
We need a 60-hour course for data engineering.
U make one
Check out the data engineering zoomcamp.
😂@@dijik123
@@StarLord-571 it's not about the quantity but the quality
The best DE course I've ever seen. Most courses only stick to the theory and never show the practical part.
Amazing!
Looking forward to it.
Also, it would be perfect to have a more comprehensive version of this course as well. Covering all batch and stream processing tools and methods.
Please bring a bigger course that covers all aspects from the basics, like from scratch: MySQL, Python/Java/Scala, Hadoop, Spark, PySpark, something that covers data engineering with one cloud.
yes!!
Yes, please. This would be so helpful but thanks for this resource, a great start.
YES Please
Yes
Yes please
At 1:53:56 you may face an error because the dbt service is launched before the elt_script service has completed.
To solve the issue, add condition: service_completed_successfully under the depends_on clause of the dbt service, so that it will always be launched after the elt_script service completes.
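For anyone who wants the full stanza, here's a sketch of what that looks like in docker-compose.yaml (the service names dbt and elt_script are assumed from the video, so adjust to your own file):

```yaml
# Hypothetical sketch of the depends_on fix; service names assumed from the video.
services:
  dbt:
    depends_on:
      elt_script:
        condition: service_completed_successfully  # wait for elt_script to finish
```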
Thank you so much.
This did not work for me, and I can't progress any further. Frustrating.
Thanks man, I was struggling with this for a while
depends_on:
  elt_script:
    condition: service_completed_successfully
@@wusswuzz5818 did you end up figuring out any errors? dbt creates the views for the films, film_actors, and actors tables for me, but not for the film_ratings table, because the relation "public.films" does not exist
in the docker-compose.yaml file, for anyone wondering (like me 1 min ago)
Good job, you are creating visibility for airbyte in a great way, by providing an evolutionary view of the stack that gets one to eventually need it. Hope they continue to support you making content using this approach.
Thanks for the course. To all those using VMs: make sure the files are located on the VM. I tried running the Docker exercise with Docker on the VM and the files on the host (Windows) and wasted a lot of time resolving errors; I finally moved all the files to the Ubuntu VM, where things ran smoothly.
Hey, currently there is no set path to becoming a data engineer. So please create a proper certification with a clear roadmap of the foundations and most-used cloud tech in data engineering, so that those of us interested in this career can get some structure going.
Thanks!
Finally a data engineering course!
Since when is Justin a data engineer? Well, I guess the constant learning is real.
I've been following him since I started coding (almost 4 years now); he's always learning and growing.
@@briabytes which channel ?
@@briabytes Please share his YT channel.
@@JREQuickPods he codes and games on his Twitch channel and he has a YouTube channel (you can search his name) where he shares his experiences: twitch.tv/justinbchau. I watched him make some of this course a few months back on Twitch.
his twitch is in the other comment
1:59:58
The reason he encountered the error is that he wrote {% generate_ratings() %}.
To avoid the error, you should put {% macro generate_ratings() %}
YES THANK YOU SO MUCH keep expanding this!!!!
Please make this a series!
(Edit)
Suggestion 1: include an overview before each section of 1) the overarching pipeline and 2) where in the pipeline we are for the given section. In other words, having a map or diagram of what is happening would help with conceptual understanding.
Suggestion 2: Explain the code line-by-line conceptually. The time spent writing the code on screen could be cut and replaced by explanation. This would save time.
Yes! I’ve been waiting for this.
Please add more data engineering course like this. I really love it.
I just started and love the style. You teach fluently and focus on the important takeaways. I've seen so much bullshit that I really expected to watch you spend 10 minutes installing Docker and had already started skipping, but you didn't show that part, which is nice. Makes perfect sense. Someone who can't RTFM and install Docker on their own shouldn't be focusing on DE at this point anyway, imho.
More bi and DE courses like this plz
Been waiting a long time for this course
We would like to see more content for Data Engineering, possibly a full course.
This is a good video if you already know the technology used in this project, like Postgres and Python. Unfortunately, too many details are skipped over, which is understandable; you can only teach so much in just 3 hours.
Here I am thinking about data engineering, then boom! YouTube shows me a data engineering course 😅
You probably googled
Thank you for this! Learned a lot today. 👍🏾
If you get the "pg_dump: error: aborting because of server version mismatch" error:
I found that running "apt-get update && apt-get install -y postgresql-client" actually installs version 15, while "postgres:latest" pulls version 16.
To fix this, specify your image as "postgres:15" for both source_postgres and destination_postgres in the docker-compose.yaml.
I also changed my RUN command in the Dockerfile to "apt-get update && apt-get install -y postgresql-client-15" to be explicit.
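Putting those two changes together, here's a sketch of the pinned setup (service names and the exact file layout are assumed from the video, not verified against the repo):

```yaml
# docker-compose.yaml: pin the server so it matches the pg_dump client version.
services:
  source_postgres:
    image: postgres:15
  destination_postgres:
    image: postgres:15
# And in the Dockerfile, pin the client to the same major version:
#   RUN apt-get update && apt-get install -y postgresql-client-15
```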
Thank you. I spent 30 minutes on this error.
love you mate.
This course is not for beginners; he moves through it like hell without explaining how to install anything. He moves fast-forward, yet the video title says it's for beginners.
Came across another issue while running the code for the data pipeline: it gave me a version error between pg_dump and PostgreSQL. I changed the version from latest to 15.5 to match the dump version... hope this helps in case anyone faces issues in this module.
Hero
Some advice: it doesn't make sense to read and type out the code from your top monitor. Save your time and the watchers' time: just copy-paste what's there and explain it line by line.
Well you should read the description
The Airbyte section is complicated and badly explained.
It looks to me like some part is missing from the recording, e.g. how was Airbyte started?
After this video I don't see a significant advantage to using Airbyte over the elt_script from the video example.
In the "Building a Data pipeline from Scratch" section, when I am running the containers by docker compose up, I get the following error:
elt-elt_script-1 | pg_dump: error: aborting because of server version mismatch
elt-elt_script-1 | pg_dump: detail: server version: 16.1 (Debian 16.1-1.pgdg120+1); pg_dump version: 15.5 (Debian 15.5-0+deb12u1)
I have installed the latest version of postgres on my machine i.e. 16.1. I have also removed the images and volumes and rerun the docker compose up command, still I get the above error. Can someone please help? Thanks!
I ran into this same problem. I fixed it by changing the docker-compose.yaml: instead of image: postgres:latest under source_postgres and destination_postgres, I wrote image: postgres:15 in both places. This way, when the code runs, it installs the same version of PostgreSQL in both the source and the destination.
@@oddlang687 it works, thank you very much; it took me two days to find the cause of that problem.
Thanks man!
@@oddlang687 Appreciate this comment, fixed for me. Thank you!
I ran into the below error trying to run my container; can you please advise me on how to resolve it? All my files are in one directory. 2024-05-03 15:34:42 Node.js v18.20.2
2024-05-03 15:34:45 node:internal/modules/cjs/loader:1143
2024-05-03 15:34:45 throw err;
2024-05-03 15:34:45 ^
2024-05-03 15:34:45
2024-05-03 15:34:45 Error: Cannot find module '/app/src/index.js'
2024-05-03 15:34:45 at Module._resolveFilename (node:internal/modules/cjs/loader:1140:15)
2024-05-03 15:34:45 at Module._load (node:internal/modules/cjs/loader:981:27)
2024-05-03 15:34:45 at Function.executeUserEntryPoint [as runMain] (node:internal/modules/run_main:128:12)
2024-05-03 15:34:45 at node:internal/main/run_main_module:28:49 {
2024-05-03 15:34:45 code: 'MODULE_NOT_FOUND',
2024-05-03 15:34:45 requireStack: []
2024-05-03 15:34:45 }
2024-05-03 15:34:45
2024-05-03 15:34:45 Node.js v18.20.2
Another one: I had issues with host.docker.internal, hence I added extra_hosts:
- "host.docker.internal:host-gateway" to the docker compose for dbt
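In context, the fix would look roughly like this in the compose file (the dbt service name here is my assumption, not necessarily the video's exact config):

```yaml
# Sketch of the extra_hosts fix; service name assumed.
services:
  dbt:
    extra_hosts:
      - "host.docker.internal:host-gateway"  # lets the container resolve the host
```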
Thank you! I ran into all the missing configuration/typos Justin had. But for this one, his github yaml file didn't have the line, so it was difficult to find the root cause. thank you for sharing.
@@paolaprieto8111 heya could you share the exact code you wrote, I can't seem to get it working
@2:56:09 You can avoid relaunching Airflow containers in every change by mapping your DAG dir to Airflow's DAG dir;
by adding a volumes mapping in the Airflow compose file, under x-airflow-common:
x-airflow-common:
...
volumes:
- /path/to/project/dags:/opt/airflow/dags
@2:58:52 Normally you can force rebuilding the Docker images using "--build" flag:
docker compose -f docker-compose.yml up --build -d
Thank you for this. I like the way you teach it step-by-step and I always got lost with JOINS. Haha anyways thanks for this! Keep it up
1:40:58 can't find the schema.yml file in the course github. Had to copy everything from the video :/
Ran into the same issue; it's in a different branch
My man...Justin!
You're a top crossfitter and I miss working out under your guidance
50:40 How did he get out of END by skipping the line ? I was stuck and had to close the terminal.
At 2:52:34 how was airbyte started ??
try looking into course resources > airbyte
I like to see more of your content, good explanation skills
| 05:53:48 Encountered an error:
dbt-1 | Runtime Error
dbt-1 | fatal: Invalid --project-dir flag. Not a dbt project. Missing dbt_project.yml file
dbt-1 exited with code 2
I'm getting this issue; how do I solve this error?
I had the same problem, I solved it like this:
command:
  [
    "run"
  ]
networks:
  - elt_network
volumes:
  - ./custom_postgres:/usr/app
  - ~/.dbt/:/root/.dbt/
You can change "run" to "debug" to see which file it does not recognize
Absolute path instead of ~ helped me
The setup for these services seems like A LOT of work and because of that things can go wrong in production, not to mention handover for something like this...what alternatives are there for this type of ELT pipeline?
I have the same folder structure, but when I execute the docker compose command I receive an error: fatal: Invalid --project-dir flag. Not a dbt project. Missing dbt_project.yml file.
I had an issue with a postgres version mismatch. I changed the version of the postgres server to match:
services:
  source_postgres:
    image: postgres:15.5
Same for destination_postgres.
Hope this helps
thank you man! Great help
Thank you for the amazing video. Could you please make a second video covering data engineering projects?
I am getting an issue with docker compose up
'dbt-1 | relation "public.actors" does not exist'
seems like dbt runs first and then the elt_script, even though I have the same docker compose file as the video shows.
fixed it by doing this on my dbt service
depends_on:
elt_script:
condition: service_completed_successfully
@@josesalazar2384 thank you for sharing this! I ran into the same problem and was looking forever for a solution. This worked great for me
thanks a lot
@@josesalazar2384 I was having the same issue, but with public.films does not exist. I went into the docker file and added the condition, but that did not solve the issue. I looked around, and it turns out that in sources.yml I had the table source for films set to "films.sql".
You don't need to add the .sql.
Leaving this here in case someone has the same issue as me.
best video on the internet
Finalllyyyy an data project
Errors:
Database Error in model actors (models/example/actors.sql)
relation "public.actors" does not exist
relation "public.actors" does not exist
===> Solution:
In 'docker-compose.yaml', add the below code to dbt:
depends_on:
elt_script:
condition: service_completed_successfully
A hero, but why does it work for most people out of the box?
i did this and the dbt runs for the films, film_actors, and actors tables but when trying to create the view for the film_ratings table it still says that relation "public.films" does not exist. I have been frustratingly stuck on this for days, I even tried rebuilding the project. Not sure what else to do, any thoughts on what I can do next?
@@jomeltapawan9316 same bro =(
@@jomeltapawan9316 I had this same issue. I had to recreate my project due to another setup issue and as a result, I forgot to edit the dbt_project.yml file such that '+materialized: table' rather than 'view'. Once fixing this, the issue was resolved. During the beginning of the dbt section, he mentioned that the view setting is not useful for the model application. Hope this helps.
Upon doing "docker compose up" (this is right towards the end of "Building a Data Pipeline from Scratch") I get the error:
"source_postgres-1 | 2024-01-20 23:30:21.504 UTC [247] FATAL: password authentication failed for user "root"
source_postgres-1 | 2024-01-20 23:30:21.504 UTC [247] DETAIL: Role "root" does not exist.
source_postgres-1 | Connection matched file "/var/lib/postgresql/data/pg_hba.conf" line 128: "host all all all scram-sha-256""
Anyone else get the same issue? I'm not a huge expert on Docker, so I can only guess I have to do something with the postgres user "root", even though Justin didn't make a root user, only "postgres"... but then how do I do this in Docker as well?
instead of postgres:latest use image: postgres:12 and add this line POSTGRES_INITDB_ARGS: --auth-host=scram-sha-256
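Sketched out, that fix would sit in the compose file roughly like this (the service name and credential values below are placeholders, not the video's actual config):

```yaml
# Sketch of the auth fix; names and credentials are placeholders.
services:
  source_postgres:
    image: postgres:12
    environment:
      POSTGRES_USER: postgres      # placeholder; use the course's values
      POSTGRES_PASSWORD: secret    # placeholder
      POSTGRES_INITDB_ARGS: --auth-host=scram-sha-256
```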
ok, this is super helpful.
should i first aim to become a data scientist then go to data engineering? just wondering if it's better that way or not, if my end goal is to be a data engineer. thoughts?
I am stuck at installing Docker; it can't run on my PC (Windows 10)
Why is dbt using the port 5434 to communicate with the destination database? Both containers are running in the same docker network so why do we need to use the exposed port number?
It does not need that port. That port is exposed so that if the user wants to take a peek at the database, they can run a DB manager against it; otherwise, to peek at the database, you would have to exec -it into the db container to look at the tables.
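To illustrate the distinction (service names assumed from the video): "5434:5432" publishes the container's 5432 on host port 5434, while other containers on the same Docker network use the service name and the container port directly.

```yaml
# Sketch only; service name assumed from the video.
services:
  destination_postgres:
    image: postgres:15
    ports:
      - "5434:5432"   # host port 5434 -> container port 5432
# From another container on the same network you would connect to
# destination_postgres:5432; port 5434 only matters from the host machine.
```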
Okay, major question. As you are adding in new technologies like Airflow in the docker file, where is your console log or terminal to let you know if there’s any syntax errors etc.? For example, with React, Django, Flutter, I always have the app running on local host and ALWAYS have that window open to see if there are any errors in the error log as I am updating files.
How do you do that with this workflow to make sure you’re not making mistakes while you’re writing code?
Do you need to know maths for data engineering? If so, what types?
You didn't solve the column level macro bug @ 2:00:03. That's not the attitude mate.
It was actually that he forgot the macro keyword when he was defining the macro in the ratings_macro.sql file.
{% macro generate_ratings() %}
CASE
WHEN user_rating >= 4.5 THEN 'Excellent'
WHEN user_rating >= 4.0 THEN 'Good'
WHEN user_rating >= 3.0 THEN 'Average'
ELSE 'Poor'
END as rating_category
{% endmacro %}
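For reference, a macro like that would then be called from a model roughly like this (a sketch only; the model, ref, and column names here are my assumptions, not the video's exact code):

```sql
-- hypothetical film_ratings model calling the macro
SELECT
    film_id,
    title,
    user_rating,
    {{ generate_ratings() }}  -- expands to the CASE ... END expression above
FROM {{ ref('films') }}
```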
Keep up the good work!
I really need a longer video.
Hi! Thanks for the tutorial. Where can I find those yml files? They are not in the GitHub repo.
I mean the dbt part.
what happened with the macro on 1:59:56? you got an error and then just jumped into another topic :D
LOL. Chau just skipped debugging it 🤣
He actually just forgot the macro keyword. It should be {% macro generate_ratings() %} in the rating_macro.sql file
What are the learning pre-requisites for this course?
brain
this guy could barely code in JavaScript like a year ago so not too much i guess.
@@AndrewHuange thanx for answering my query👍
Can you pls do a complete end to end video on AWS or GCP data engineering data ingestion, etl, analytics using pyspark
Please make a detailed tutorial on data engineer, about 20-30 hours full end to end course please it's a request 🙌
I was not able to access the Docker file through localhost. Could you please help me with that?
Yes! Thanks!
The intro and description don't match the video - there is no Spark or Kafka anywhere in the video.
I love it. But let's say I need to make an email classification system that distributes these emails after their classification and sends them to their classified department's server.
Would you use a task queue like Celery, or a broker like Kafka or RabbitMQ?
How come you didn’t use pyspark for the etl script?
Is spark and kafka actually covered here as mentioned in the description?
nope
This is excellent content
heyy anybody knows where i can found the github repo for the dbt project? I'm searching for the schema and can't find it
same
@@joshwigginton9881 you switch to the dbt branch of the repo
Does the Alpine container run the same as the Ubuntu container? I presume it does, but I want to know if it changes the scope of the project.
Sir, on the first run of docker compose up, here is the error I have been facing: elt_script-1 exited with code 1. Someone please help
did you solve this?
@@muh.zakyfirdaus6100 not yet solved, please help
How does the "data_dump.sql" file transfer from source container to the destination container?
Hi Justin, I have a problem loading data into Postgres from the SQL scripts. The error I encounter is "unf_inv_name expected afer this token". May I get some advice please?
Finalllyyyy, a data project. What are the learning pre-requisites for this course?
Anyone know why at 1:59:55 defining and using that macro didn't work?
I'm pretty sure it's because he forgot to type macro. ie. he wrote {% generate_ratings() %}, but it should be {% macro generate_ratings() %}
@@Fuhrmaaj thank you
This channel is gold 🪙❤
Thanks for the video
please come up with big data course
Quallity content as always
@freeCodeCamp please put more content on data engineering ;)
Please make video on performance testing using jmeter
Can someone help me? When running the docker run command, I get an Exited(1) error in Docker, saying it cannot find the module:
Error: Cannot find module '/app/src/index.js'
I followed the instructions; should I move the index.js file somewhere else? It is in the app directory.
Fixed it by running a bunch of install commands:
npm install uuid
npm install express
npm install sqlite3
But now I'm getting "Cannot GET /"
Am I stupid, or was the first part about Docker a bit incomprehensible?
Docker tends to feel like that...
I'm curious what tools besides Airbyte can be used? (Airbyte is the presenter's employer; FYI, this is a marketing channel for Airbyte.)
Not even on the Gartner Quadrant. And Airbyte is freemium, with only 14 days of free use. And as you might have guessed, it's greed-priced, errr, usage-priced: the lowest tier (10 GB and only 4 rows to replicate per month) is $160.00. That is hugely expensive. I think I'm going to skip this advertisement.
@@markk364 you can run airbyte for free using the OSS version.
Great course!
Awesome!
Please make more videos about data engineering
Please, we need courses about big data, Hadoop, and Spark with practical projects
Might be great, but source code is not working at all (starts with main branch, and keeps going).
My boy, Chau!
thank you very much
Any Pre-requisites required ?
How do i learn all of these? What course should i take? Please help
Hey Everyone, I hope you are well. I am stuck on the last 20 minutes due to the requirement to have SSH instead of being able to bypass it how he did in the video. Has anyone else encountered this? How did you move past it?
Love it!
Can anyone suggest some YouTube channels which cover all the topics like SQL, Python, Spark, PySpark, Azure???
Highest paid job in Australia: 250K a year + bonus + + + +