I didn't expect to laugh out loud - TWICE - but I did. Great video. I love how MotherDuck can demonstrate awesome desktop data-engineering superpowers and tell jokes at the same time. Now excuse me, I need to deal with the coffee I just spit everywhere after the 2nd laugh. 😂
Would love a tutorial with this and meltano :D
I'm getting 403 errors trying to access the S3 buckets. I tried setting the two S3 environment variables to random values in profiles.yml and also in an .env file.
s3://us-prd-motherduck-open-datasets/who_ambient_air_quality/parquet/who_ambient_air_quality_database_version_v6_april_2023.parquet - I can access this file (the one returning the 403) from a Jupyter notebook.
Hey there! I'd recommend joining slack.motherduck.com; we can follow up there to help you debug! Cheers
Encountered an error: Parsing Error
Env var required but not provided: 'S3_ACCESS_KEY_ID'
Sounds like you need to provide an S3 access key.
For the path option, what path or file am I supposed to locate or create, and point to in my profiles.yml? I ran into an error in my MotherDuck project because of this when running dbt + MotherDuck.
Any path will do - it's just where the data gets persisted (if you want it persisted) as a DuckDB database file. You probably ran into an issue because you don't have write access to the path given in the example. Maybe replace it with './my_db.db' (the current directory of your project)?
Happy to help you on the MotherDuck Slack too: slack.motherduck.com!
Hey, how are you? I have a question: what can I do if I have a stream in Snowflake that I want to "consume" in dbt without creating a physical table or view - something like an ephemeral materialization - just to purge the stream and keep it from going stale? I created an ephemeral model that selects from the stream source, but that obviously only creates an ephemeral materialization and doesn't actually clear the data from the stream. Thoughts?
Searching for "dbt" confused me too - I didn't find anything technical at first.
Hair quality 😉🤣
woo
Can you write to CSV and DuckDB in a single model?
Yes! You can use a post-hook strategy with macros to get multiple outputs - essentially a macro (or hook) that does a COPY to a given location. Hope it helps!
Hi, running dbt on local data was the thing I was waiting for, for some use cases.
BTW: "Getting started with dbt" is targeting a dead link ;-p
Thanks for the heads-up, fixed!
Terrible accent, awesome content.