Torbjorn Storli
Joined Apr 25, 2017
A (deeper) dive into DuckDB using RStudio - PART 10 (DuckDB in R)
Welcome to the tenth video in my series on (a deeper dive into) DuckDB!
In this episode, I will take a deeper dive into DuckDB using RStudio; specifically, we will look at how to use DuckDB with R. We will use the tidyverse and tidyquant packages to obtain stock data (a rough R sketch of this workflow follows below).
Whether you’re new to DuckDB or looking to expand your knowledge, this video has something for everyone.
In this series, I’ll cover:
1) How to use DuckDB in general within VSCode and the DuckDB CLI environment.
2) Creating a DuckDB macro to dynamically query Yahoo Finance, retrieve JSON data, parse it, and convert it for further use.
3) Utilizing DuckDB with Python and R for statistical analysis.
Join me as I explore the powerful features of DuckDB and how it can enhance your data analysis workflows.
Don’t forget to like, comment, and subscribe to stay updated with the latest videos in this series!
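As a rough illustration of the workflow described above (not the exact code from the video), the sketch below pulls stock prices with tidyquant and then queries them through DuckDB from R. The ticker, date range, database file name, and table name are illustrative assumptions.

# Hedged sketch, not the video's exact code: ticker, dates and names are assumptions.
library(DBI)
library(duckdb)
library(tidyquant)   # tidyverse-friendly wrapper for Yahoo Finance price downloads

# Download daily prices as a tibble
prices <- tq_get("AAPL", get = "stock.prices", from = "2023-01-01")

# Open (or create) a DuckDB database file and store the data
con <- dbConnect(duckdb::duckdb(), dbdir = "stocks.duckdb")
dbWriteTable(con, "prices", as.data.frame(prices), overwrite = TRUE)

# Query it back with SQL: monthly average closing price
dbGetQuery(con, "
  SELECT date_trunc('month', date) AS month,
         avg(close)                AS avg_close
  FROM prices
  GROUP BY month
  ORDER BY month
")

dbDisconnect(con, shutdown = TRUE)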
Videos
A (deeper) dive into DuckDB using DuckDB CLI and VSCode - PART 9 (Python in VSCode)
Welcome to the ninth video in my series on (a deeper dive into) DuckDB! In this episode, I will take a deeper dive into DuckDB using the DuckDB CLI; specifically, we will look at how to use DuckDB with Python. Whether you’re new to DuckDB or looking to expand your knowledge, this video has something for everyone. In this series, I’ll cover: 1) How to use DuckDB within VSCode and the DuckDB CLI envir...
A (deeper) dive into DuckDB using DuckDB CLI and VSCode - PART 8 (Full Text Search - fts)
Welcome to the eighth video in my series on (a deeper dive into) DuckDB! In this episode, I will take a deeper dive into DuckDB using the DuckDB CLI; specifically, we will look at how to use the Full Text Search (fts) feature. Whether you’re new to DuckDB or looking to expand your knowledge, this video has something for everyone. In this series, I’ll cover: 1) How to generally use DuckDB within V...
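For reference, here is a minimal sketch of how DuckDB's fts extension can be driven from R. The table, its contents, and the search terms are made-up placeholders, not taken from the video.

# Minimal FTS sketch; the 'documents' table and its rows are invented examples.
library(DBI)
library(duckdb)

con <- dbConnect(duckdb::duckdb())
dbExecute(con, "INSTALL fts")
dbExecute(con, "LOAD fts")

dbExecute(con, "CREATE TABLE documents (id INTEGER, body VARCHAR)")
dbExecute(con, "INSERT INTO documents VALUES
                  (1, 'a deeper dive into duckdb full text search'),
                  (2, 'an unrelated row about something else')")

# Build the index: create_fts_index(<table>, <id column>, <text column(s)>)
dbExecute(con, "PRAGMA create_fts_index('documents', 'id', 'body')")

# Rank rows with BM25; rows that do not match return NULL
dbGetQuery(con, "
  SELECT id, fts_main_documents.match_bm25(id, 'full text search') AS score
  FROM documents
  WHERE score IS NOT NULL
  ORDER BY score DESC
")

dbDisconnect(con, shutdown = TRUE)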
A (deeper) dive into DuckDB using DuckDB CLI and VSCode - PART 7 (large files and partition)
Welcome to the seventh video in my series on (a deeper dive into) DuckDB! In this episode, I will take a deeper dive into DuckDB using the DuckDB CLI; specifically, we will look at how to load large CSV files and then partition them into the much more efficient Parquet format. Whether you’re new to DuckDB or looking to expand your knowledge, this video has something for everyone. In this seri...
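As a rough sketch of the idea (the file names, columns, and partition key are assumptions, not the dataset used in the video), DuckDB can read a large CSV and write it out as Hive-partitioned Parquet in one COPY statement, here driven from R:

# Sketch only: 'big_trades.csv', its columns and the output folder are assumptions.
library(DBI)
library(duckdb)

con <- dbConnect(duckdb::duckdb())

# Read the CSV and write partitioned Parquet files (one folder per year)
dbExecute(con, "
  COPY (
    SELECT *, year(CAST(trade_date AS DATE)) AS yr
    FROM read_csv_auto('big_trades.csv')
  )
  TO 'trades_parquet' (FORMAT PARQUET, PARTITION_BY (yr))
")

# Query the partitioned dataset back with a glob pattern
dbGetQuery(con, "
  SELECT yr, count(*) AS n
  FROM read_parquet('trades_parquet/**/*.parquet', hive_partitioning = true)
  GROUP BY yr
  ORDER BY yr
")

dbDisconnect(con, shutdown = TRUE)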
A (deeper) dive into DuckDB using DuckDB CLI and VSCode - PART 6 (duckDB scripting)
Welcome to the sixth video in my series on (a deeper dive into) DuckDB! In this episode, I will take a deeper dive into DuckDB using the DuckDB CLI. Whether you’re new to DuckDB or looking to expand your knowledge, this video has something for everyone. In this series, I’ll cover: 1) How to use DuckDB in general within VSCode and the DuckDB CLI environment. 2) Creating a DuckDB macro to dynamic...
A (deeper) dive into DuckDB using DuckDB CLI and VSCode - PART 5 (modes, time_buckets and log files)
Welcome to the fifth video in my series on (a deeper dive into) DuckDB! In this episode, I will take a deeper dive into DuckDB using the DuckDB CLI. Whether you’re new to DuckDB or looking to expand your knowledge, this video has something for everyone. In this series, I’ll cover: 1) How to use DuckDB in general within VSCode and the DuckDB CLI environment. 2) Creating a DuckDB macro to dynamic...
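The output modes (.mode) only exist in the DuckDB shell, but the time_bucket part of this episode translates directly to R. A small hedged sketch follows; the log file name and its ts column are assumptions, not the files used in the video.

# Sketch: bucket log events into one-hour windows; 'server_log.csv' and 'ts' are assumptions.
library(DBI)
library(duckdb)

con <- dbConnect(duckdb::duckdb())
dbGetQuery(con, "
  SELECT time_bucket(INTERVAL '1 hour', CAST(ts AS TIMESTAMP)) AS bucket,
         count(*) AS events
  FROM read_csv_auto('server_log.csv')
  GROUP BY bucket
  ORDER BY bucket
")
dbDisconnect(con, shutdown = TRUE)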
A (deeper) dive into DuckDB using DuckDB CLI and VSCode - PART 4 (duckDB cool features)
Welcome to the fourth video in my series on (a deeper dive into) DuckDB! In this episode, I will take a deeper dive into DuckDB using the DuckDB CLI. Whether you’re new to DuckDB or looking to expand your knowledge, this video has something for everyone. In this series, I’ll cover: 1) How to use DuckDB in general within VSCode and the DuckDB CLI environment. 2) Creating a DuckDB macro to dynami...
A (deeper) dive into DuckDB using DuckDB CLI and VSCode - PART 3 (duckDB cool features)
Welcome to the third video in my series on (a deeper dive into) DuckDB! In this episode, I will take a deeper dive into DuckDB using the DuckDB CLI. Whether you’re new to DuckDB or looking to expand your knowledge, this video has something for everyone. In this series, I’ll cover: 1) How to use DuckDB in general within VSCode and the DuckDB CLI environment. 2) Creating a DuckDB macro to dynamic...
A (deeper) dive into DuckDB using DuckDB CLI and VSCode - PART 2 (Yahoo Finance and JSON)
Welcome to the second video in my series on (a deeper dive into) DuckDB! In this episode, I will take a deeper dive into querying JSON from Yahoo Finance using the DuckDB CLI. Whether you’re new to DuckDB or looking to expand your knowledge, this video has something for everyone. In this series, I’ll cover: 1) How to use DuckDB in general within VSCode and the DuckDB CLI environment. 2) Creatin...
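As a hedged sketch of the same idea from R: DuckDB's httpfs extension can fetch the JSON directly and read_json_auto can flatten it. The endpoint URL and the JSON paths below are assumptions based on Yahoo's public chart API, which may change or require extra headers.

# Sketch only: Yahoo may throttle or change this endpoint; URL and JSON paths are assumptions.
library(DBI)
library(duckdb)

con <- dbConnect(duckdb::duckdb())
dbExecute(con, "INSTALL httpfs")
dbExecute(con, "LOAD httpfs")

url <- "https://query1.finance.yahoo.com/v8/finance/chart/MSFT?range=1mo&interval=1d"

# Let DuckDB fetch and parse the JSON document
raw <- dbGetQuery(con, sprintf("SELECT * FROM read_json_auto('%s')", url))
str(raw)

# Drill into the nested structure (list indexes are 1-based in DuckDB SQL)
dbGetQuery(con, sprintf("
  SELECT unnest(chart.result[1].timestamp)                 AS epoch,
         unnest(chart.result[1].indicators.quote[1].close) AS close
  FROM read_json_auto('%s')
", url))

dbDisconnect(con, shutdown = TRUE)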
A (deeper) dive into DuckDB using DuckDB CLI and VSCode - PART 1
Welcome to the first video in my series on (a deeper dive into) DuckDB! In this episode, I will take a deeper dive into using DuckDB with the DuckDB CLI. Whether you’re new to DuckDB or looking to expand your knowledge, this video has something for everyone. In this series, I’ll cover: 1) How to use DuckDB in general within VSCode and the DuckDB CLI environment. 2) Creating a DuckDB macro to dy...
Loading Faker Data into a DuckDB database using VSCode and Python
In this video, we will look at how we can create and load data from the faker Python package into a DuckDB database.
Excel Financial Functions in VSCode - Part 3
In this video, we will examine some of the supplied Excel Financial Formulas for computing the Time Value of Money. We will compare the Excel output with the VSCode Financial Function output. GitHub: github.com/Tor-Storli/Excel-Financial-Functions-VSCode.git
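For readers who just want the arithmetic, here is a generic R illustration of two common Time Value of Money formulas. This is plain R, not the Excel Financial Functions .NET library used in the video, and the numbers are arbitrary examples.

# Generic textbook TVM formulas in R (not the .NET library from the video).
fv_lump_sum <- function(pv, rate, nper) pv * (1 + rate)^nper                   # future value of a single amount
pv_annuity  <- function(pmt, rate, nper) pmt * (1 - (1 + rate)^-nper) / rate   # present value of an ordinary annuity

fv_lump_sum(1000, 0.05, 10)   # ~1628.89
pv_annuity(100, 0.05, 10)     # ~772.17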
Excel Financial Functions in VSCode - Part 2
In this video, we will look at the supplied Excel Financial Formulas for computing depreciation under the Generally Accepted Accounting Principles (GAAP) methods and the IRS (MACRS) methods. GitHub: github.com/Tor-Storli/Excel-Financial-Functions-VSCode.git
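As a rough plain-R companion (not the .NET library, and not the actual IRS MACRS percentage tables), here are straight-line and double-declining-balance schedules for an arbitrary asset:

# Illustrative GAAP-style schedules; MACRS uses IRS tables and is not shown here.
straight_line <- function(cost, salvage, life) {
  rep((cost - salvage) / life, life)
}

double_declining <- function(cost, salvage, life) {
  rate <- 2 / life
  book <- cost
  dep  <- numeric(life)
  for (y in seq_len(life)) {
    dep[y] <- min(book * rate, max(book - salvage, 0))  # never depreciate below salvage value
    book   <- book - dep[y]
  }
  dep
}

straight_line(10000, 1000, 5)     # 1800 1800 1800 1800 1800
double_declining(10000, 1000, 5)  # 4000 2400 1440  864  296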
Using Excel Financial Functions in VSCode (Adding Nuget Packages) - Part1
This is the first video in a new series where I will be covering Excel Financial Functions in VS Code. In this video, we will look at how to add the Excel Financial Functions library to VS Code using the NuGet Package Manager.
Advanced Power Query - Yahoo Web API Application
In this video, we will use Advanced Power Query techniques along with the M language to extract complex JSON stock data from a Yahoo web API in Excel. learn.microsoft.com/en-us/power-query/ learn.microsoft.com/en-us/powerquery-m/ www.youtube.com/@GoodlyChandeep GitHub: github.com/Tor-Storli/PowerQuery.git
Spatial queries in DuckDB using python in VSCode, Jinja2, and magic_duckdb
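The video drives this from Python with Jinja2 and magic_duckdb; purely as a hedged illustration of the SQL side, the same spatial extension can also be used from R. The coordinates below are arbitrary, and ST_Distance here returns planar units rather than metres.

# Minimal spatial sketch; the points are arbitrary lon/lat pairs.
library(DBI)
library(duckdb)

con <- dbConnect(duckdb::duckdb())
dbExecute(con, "INSTALL spatial")
dbExecute(con, "LOAD spatial")

dbGetQuery(con, "
  SELECT ST_AsText(ST_Point(10.75, 59.91))                           AS point_a,
         ST_Distance(ST_Point(10.75, 59.91), ST_Point(10.39, 63.43)) AS planar_distance
")

dbDisconnect(con, shutdown = TRUE)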
Scraping Forex / Stock Data from Yahoo Finance and Analyzing it using Oracle Pattern Matching and R
CSV Files - How to select Delimiters, Encoding, and loading files into Excel and Python.
import large Text and Binary Files using the DBMS_LOB PL/SQL package in Oracle
Create a JSON file in PowerShell using the SQLPS and JSON Cmdlets.
GUI Data Development with ttkbootstrap, yahoo finance API and numpy-financial
R Styled tables using tidyverse, gt and the scales packages
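As a small hedged sketch of the kind of pipeline covered here (the dataset and formatting choices are arbitrary, not the video's exact table):

# Arbitrary example table; not the video's exact code.
library(dplyr)
library(gt)
library(scales)

mtcars |>
  head(5) |>
  select(mpg, hp, wt) |>
  gt() |>
  tab_header(title = "Styled table sketch", subtitle = "tidyverse + gt + scales") |>
  fmt_number(columns = c(mpg, wt), decimals = 1) |>
  fmt(columns = hp, fns = scales::label_comma())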
Database Documentation Builder - Oracle 21c, OpenXML, ASP.NET Web API and Blazor Technologies
Hi, thanks for the explanation. Can you share the code? 🤓
I am sorry, @ALIAl-khatib-g4l. I didn't realize that my GitHub repository for this video was set to Private visibility. I made it Public, so please try the GitHub link in the description again. Hopefully it works for you now. Let me know if it works, OK?
excellent work
Thank you! Glad you liked it.😊
Your duckdb series is amazing, thanks for sharing your knowledge 😊
Glad you like them!
Awesome video
Glad you enjoyed it 😊
Thanks for the video
Thanks 😊
Nice explanation
Thanks for liking
Thanks
Unfortunately, this still doesn't fix my problem. The GUI font (letters) is still too small for me. When I do what you presented, it does increase the font, but everything is so blurry. Any other fix?
thanks
you are the MAN
I ran into a problem. When I render PDF files, the function tab_options(table.width = pct(...)) can only be used once. If I use it more than once, I get the error below. There is no problem with HTML files.
ERROR: compilation failed - error
LaTeX Error: Command \holdLTleft already defined. Or name \end... illegal, see p.192 of the manual.
See the LaTeX manual or LaTeX Companion for explanation. Type H <return> for immediate help. ...
l.336 \newlength\holdLTleft \newlength\holdLTright\setlength\holdLTleft{\LTle...
see table.log for more information.
Thanks, Sir.
Thank you sir! :)
Most welcome!
The GitHub link for the code is not working. Can you repost the code?
Try it now! Thanks, :)
Thank you very much for this video! I am just in the process of combining HTML and Python in Jupyter.
I am glad you got something out of it. Thanks for your kind words!
Hi, and great video! I watched your Excel 365 YF add-in video first, since that was what I was looking for. I am new to Excel 365 and a little disappointed with its STOCKHISTORY function's reliability, so I am looking for a data solution. Are you using Python to build a database and then having Excel view it, or some other viewing platform? Either way, it would be great to see how you view the data.
Just remember the R native pipe has some limitations compared to the magrittr pipe
Yes, you are absolutely correct. Thanks for pointing this out!👍
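To make that point concrete, here are a few small, hedged examples of where the two pipes differ (behaviour as of R 4.2+; the data is arbitrary):

# Arbitrary examples illustrating |> vs %>% differences (R >= 4.2).
library(magrittr)

x <- c(4, 9, 16)

# The simple case is identical:
x |> sqrt()
x %>% sqrt()

# magrittr's dot placeholder can appear anywhere, even inside a brace block:
x %>% { sum(. > 5) }

# The native pipe's underscore placeholder must be passed to a *named* argument:
mtcars |> lm(mpg ~ wt, data = _)

# And the native pipe's right-hand side must be a call with parentheses:
# x |> sqrt    # syntax error
x %>% sqrt     # fine with magrittr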
Hi, could you please show how to render a PDF file and a Reveal.js presentation in Quarto? Thanks.
This is a fantastic tutorial. These are really vexing issues for those starting out and showing how to assert control in a clear way like this is extremely helpful. Kudos.
You're very welcome! Thank you!
Dude, you have great tutorials, but you could cut the time in half, or even more. There is no need to explain the base R pipe or how to install R.
This video went from 0 to 100 very quickly, hahahaha. I barely work with python code, but I certainly will try some of the things you suggest. Great vid.
Thanks for your kind words! It makes it worthwhile to create videos. :)
Great video, thanks
Thank You. Glad you liked it 🙂
Thanks for solving this, man. You deserve a million views.
Thank you! I am glad it helped!
thank you. found it informative
Thank You for your kind words. Glad I could help!
Thanks. The source code link does not work.
Try it now. Thanks!
Fnu, you could very well use PARTITION BY ticker and then ORDER BY date. Either way, you would most likely end up in the same place. You are right, it is probably more correct to first partition and then order by. Thanks for the comment! Well spotted 😅
I need some help with R Services. I am looking for help everywhere, but there does not seem to be much on the internet. Can we connect for some of your time?
At 16:38, should it be PARTITION BY ticker ORDER BY date instead of ORDER BY ticker, date?
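For other readers, the difference between the two window specifications looks roughly like this. A hedged sketch with an invented prices table (not the exact query at 16:38), shown here with DuckDB from R, although the same SQL applies elsewhere:

# Assumed table and columns; illustrates PARTITION BY ... ORDER BY vs a single global ORDER BY.
library(DBI)
library(duckdb)

con <- dbConnect(duckdb::duckdb())
dbExecute(con, "CREATE TABLE prices AS
  SELECT * FROM (VALUES
    ('ABT',  DATE '2024-01-02', 100.0),
    ('ABT',  DATE '2024-01-03', 102.0),
    ('ABBV', DATE '2024-01-02', 150.0),
    ('ABBV', DATE '2024-01-03', 149.0)
  ) t(ticker, date, close)")

dbGetQuery(con, "
  SELECT ticker, date, close,
         -- restart the running average for each ticker, ordered by date:
         avg(close) OVER (PARTITION BY ticker ORDER BY date) AS run_avg_per_ticker,
         -- one global ordering across all tickers:
         avg(close) OVER (ORDER BY ticker, date)             AS run_avg_overall
  FROM prices
  ORDER BY ticker, date
")
dbDisconnect(con, shutdown = TRUE)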
"promo sm" 🎶
I unzipped the folder, imported the .sln into Visual Studio, and then built it, but it failed. I tried this on my personal computer.
I hope I'm not being too rude, my friend, but I came to see a video about SQL, not about how to do geoprocessing in QGIS. At what time do we get to the damn WKT?
Slick setup. Well done.
Thanks!
Thanks so much for your content! It has been hard to find out how to use spatial data in C# with SQL Server and NTS.
Great to hear!
Hello, excellent video. I'm having a problem creating the spatial index, in a query that compares one point against 7 million points in another table. Could you help? I saw that you stopped Part 1 right at explaining the indexes. Thank you.
Hi Torbjorn, I was able to obtain the queries from the JSON. But after configuring all the settings in SQL Server and running the query inside SQL, I get the following message. Do you know what could be happening? Or what should I review?

Msg 39004, Level 16, State 20, Line 36
A 'R' script error occurred during execution of 'sp_execute_external_script' with HRESULT 0x80004004.
Msg 39019, Level 16, State 2, Line 36
An external script error occurred:
Loading required package: xts
Loading required package: zoo
Attaching package: 'zoo'
The following objects are masked from 'package:base': as.Date, as.Date.numeric
Attaching package: 'PerformanceAnalytics'
The following object is masked from 'package:graphics': legend
Loading required package: lubridate
Attaching package: 'lubridate'
The following object is masked from 'package:base': date
Loading required package: quantmod
Loading required package: TTR
Version 0.4-0 included new data defaults. See ?getSymbols.
Loading required package: tidyverse
── Attaching packages ─────────────────────────── tidyverse 1.2.1 ──
✔ ggplot2 3.1.0   ✔ purrr   0.3.0
✔ tibble  2.0.1   ✔ dplyr   0.7.8
✔ tidyr   0.8.2   ✔ stringr 1.3.1
✔ readr   1.3.1   ✔ forcats 0.3.0
Msg 39019, Level 16, State 2, Line 36
An external script error occurred:
── Conflicts ──────────────────────────── tidyverse_conflicts() ──
✖ lubridate::as.difftime() masks base::as.difftime()
✖ lubridate::date()        masks base::date()
✖ dplyr::filter()          masks stats::filter()
✖ dplyr::first()           masks xts::first()
✖ lubridate::intersect()   masks base::intersect()
✖ dplyr::lag()             masks stats::lag()
✖ dplyr::last()            masks xts::last()
✖ lubridate::setdiff()     masks base::setdiff()
✖ lubridate::union()       masks base::union()
Error in bind_rows_(x, .id) : Argument 1 must have names
Calls: source ... withVisible -> eval -> eval -> bind_rows -> bind_rows_
Además: Warning messages:
1: x = '^GSPC', get = 'stock.prices': Error in curl::curl_download(cu, tmp, handle = h): Failed to connect to finance.yahoo.com port 443: Bad access
Msg 39019, Level 16, State 2, Line 36
An external script error occurred:
2: x = 'ABT', get = 'stock.prices': Error in curl::curl_download(cu, tmp, handle = h): Failed to connect to finance.yahoo.com port 443: Bad access
3: x = 'ABBV', get = 'stock.prices': Error in curl::curl_download(cu, tmp, handle = h): Failed to connect to finance.yahoo.com port 443: Bad access
4: x = 'JNRFX', get = 'stock.prices': Error in curl::curl_download(cu, tmp, handle = h): Failed to connect to finance.yahoo.com port 443: Bad access
5: x = 'JNOSX', get = 'stock.prices': Error in curl::curl_download(cu, tmp, handle = h): Failed to connect to finance.yahoo.com port 443: Bad access
6: x = 'JANRX', get = 'stock.prices': Error in curl::curl_download(cu, tmp, handle = h): Failed to connect to finance.yahoo.com port 443: Bad access
7: x = 'JANIX', get = 'stock.prices': Error in curl::curl_download(cu, tmp, handle = h): Failed to connect to finance.yahoo.com port 443: Bad access
Msg 39019, Level 16, State 2, Line 36
An external script error occurred:
8: x = 'JANWX', get = 'stock.prices': Error in curl::curl_download(cu, tmp, handle = h): Failed to connect to finance.yahoo.com port 443: Bad access
Error in execution. Check the output for more information.
Error in eval(ei, envir) : Error in execution. Check the output for more information.
Calls: runScriptFile -> source -> withVisible -> eval -> eval -> .Call
Ejecución interrumpida

The query that I tried is:

DECLARE @rScript nvarchar(max)
SET @rScript = N'library("PerformanceAnalytics")
library("tidyquant")
GSPC <- tq_get("^GSPC", get = "stock.prices", from = "2013-01-01")
ABT <- tq_get("ABT", get = "stock.prices", from = "2013-01-01")
ABBV <- tq_get("ABBV", get = "stock.prices", from = "2013-01-01")
JNRFX <- tq_get("JNRFX", get = "stock.prices", from = "2013-01-01")
JNOSX <- tq_get("JNOSX", get = "stock.prices", from = "2013-01-01")
JANRX <- tq_get("JANRX", get = "stock.prices", from = "2013-01-01")
JANIX <- tq_get("JANIX", get = "stock.prices", from = "2013-01-01")
JANWX <- tq_get("JANWX", get = "stock.prices", from = "2013-01-01")
data <- bind_rows("^GSPC" = GSPC, "ABT" = ABT, "ABBV" = ABBV, "JNRFX" = JNRFX,
                  "JNOSX" = JNOSX, "JANRX" = JANRX, "JANIX" = JANIX, "JANWX" = JANWX,
                  .id = "symbol")
data$date <- as.character(data$date)
OutputDataSet <- data;
';
EXEC sp_execute_external_script
  @language = N'R',
  @script = @rScript
  --, @output_data_1 = N'OutputDataSet'
WITH RESULT SETS(
  ([symbol] nvarchar(50), [date] nvarchar(50), [open] Decimal(20,2), [high] Decimal(20,2),
   [low] Decimal(20,2), [close] Decimal(20,2), [volume] Decimal(20,0), [adjusted] Decimal(20,2))
);
GO
Aldo, try this:
1) Make sure "SQL Server" and "SQL Server Launchpad" are running (see Services).
2) Make sure R Services are enabled:
   EXEC sp_configure 'external scripts enabled', 1
   RECONFIGURE WITH OVERRIDE
3) Stop and restart "SQL Server" and "SQL Server Launchpad".
4) Make sure the following packages are installed in R Services:
   curl, quantmod, TTR, xts, zoo
5) Now try to run the query below. If all the packages are installed correctly and you still get an error, it could be that your SQL Server instance is not configured to access the internet. If so, Google it and figure out what you have to do in order for that to work. If you are able to run the query successfully, you will then have to install all the other packages that tidyverse needs (Depends / Imports / Suggests) and make sure that there are no version conflicts. Yes, it is very painful to get all the ducks in a row (I mentioned this in the first video, approx. 10 minutes in). Go to this CRAN URL to see what package dependencies tidyverse / tidyquant require: cran.r-project.org/web/packages/tidyquant/index.html (see: Depends / Imports / Suggests). Hope that helps. Good luck!
=============================
DECLARE @rScript nvarchar(max)
SET @rScript = N'library("quantmod")
stock_list <- c("FB")
start_date <- Sys.Date()-365
end_date <- Sys.Date()
master_df <- NULL
for (idx in seq(length(stock_list))){
  stock_index = stock_list[idx]
  getSymbols(stock_index, verbose = TRUE, src = "yahoo", from=start_date, to=end_date)
  temp_df = as.data.frame(get(stock_index))
  temp_df$Date = row.names(temp_df)
  temp_df$Index = stock_index
  row.names(temp_df) = NULL
  colnames(temp_df) = c("Open", "High", "Low", "Close", "Volume", "Adjusted", "Date", "Index")
  temp_df = temp_df[c("Date", "Index", "Open", "High", "Low", "Close", "Volume", "Adjusted")]
  master_df = rbind(master_df, temp_df)
}
OutputDataSet <- master_df;';
EXEC sp_execute_external_script
  @language = N'R',
  @script = @rScript
  --, @output_data_1 = N'OutputDataSet'
WITH RESULT SETS(
  ([Date] varchar(50), [Index] nvarchar(50), [Open] Decimal(20,2), [High] Decimal(20,2),
   [Low] Decimal(20,2), [Close] Decimal(20,2), [Volume] Decimal(20,0), [Adjusted] Decimal(20,2))
);
GO
--=======================
/*
If it runs successfully you should see something like this:
----------------------------------------------------------------------------
STDERR message(s) from external script:
Loading required package: xts
Loading required package: zoo
Attaching package: 'zoo'
The following objects are masked from 'package:base': as.Date, as.Date.numeric
Loading required package: TTR
Version 0.4-0 included new data defaults. See ?getSymbols.
'getSymbols' currently uses auto.assign=TRUE by default, but will use auto.assign=FALSE in 0.5-0. You will still be able to use 'loadSymbols' to automatically load data. getOption("getSymbols.env") and getOption("getSymbols.auto.assign") will still be checked for alternate defaults. This message is shown once per session and may be disabled by setting options("getSymbols.warning4.0"=FALSE). See ?getSymbols for details.
WARNING: There have been significant changes to Yahoo Finance data. Please see the Warning section of '?getSymbols.yahoo' for details. This message is shown once per session and may be disabled by setting options("getSymbols.yahoo.warning"=FALSE).
STDERR message(s) from external script:
Downloaded 2594 bytes... Downloaded 4905 bytes... Downloaded 10105 bytes... Downloaded 11560 bytes... Downloaded 15460 bytes... Downloaded 18935 bytes...
STDOUT message(s) from external script: downloading FB ..... done.
(252 rows affected)
Total execution time: 00:00:04.168

Output:
=======
Date        Index  Open    High    Low     Close   Volume    Adjusted
2020-08-06  FB     249.04  266.60  248.67  265.28  45241600  265.28
2020-08-07  FB     264.08  278.89  263.43  268.44  72766400  268.44
2020-08-10  FB     268.04  273.86  259.69  263.00  30248800  263.00
2020-08-11  FB     260.19  265.92  255.13  256.13  28238300  256.13
2020-08-12  FB     258.97  263.90  258.11  259.89  21415700  259.89
2020-08-13  FB     261.55  265.16  259.57  261.30  17374000  261.30
ETC......
*/
Hello Torbjorn, can you upload the scripts that you're using in your tutorial again? I entered the GitHub branch where you have them and I get an error. Thanks. Great content!
Aldo, not sure how your error occurs, but when I go to the GitHub repository and download R_Scripts.ipynb and open it in Azure Data Studio, it works fine. In the GitHub repository named Tor-Storli/SQL_Server_R_Services, try this (if you are on Windows):
1) Click on "R_Scripts.ipynb".
2) It will try to load and return "Sorry, something went wrong. Reload?"
3) Click the "download" button.
4) The file opens up in a "raw" JSON format.
5) Right-click and click "Save As".
6) Select save as type: "All files (*.*)".
7) Save the file with the following name: "R_Scripts.ipynb".
8) Install Azure Data Studio: docs.microsoft.com/en-us/sql/azure-data-studio/download-azure-data-studio?view=sql-server-ver15
9) Open the file in Azure Data Studio.
If you are trying to open it in VS Code or Jupyter Notebook it will probably give you errors. I used Azure Data Studio to create the notebook and it has references to the ADS SQL kernel. The other file in the repository, "TestDB_Objects.sql", contains SQL statements, which you should be able to just copy/paste. Hope that helps. :)
@torbjornstorli2880 Thank you Torbjorn, I was able to get the scripts. Regards :)
What a great presentation. Thank you!
Thank you Roman. That is very kind of you.
So you didn't do any image preprocessing? Why not? What could have been done for this dataset?
FB, well, I do not know how much preprocessing you can do with X-rays. That is, every X-ray is in a vertical position and every X-ray seems to look the same from a positional point of view, so I do not see the point of tilting, rotating, etc. any image to improve the performance. If you think you need to shift the center of the image to the left or right, or zoom it, try it and see if you are able to get better results. Let me know if it helps. Sorry for the delayed response. Hope that answers your question. Thanks again. :)
PostgreSQL is better for geospatial data, right? (Using PostGIS)
I don't know. You may be right. I do not have any experience with PostgreSQL Geospatial. I have only used PostgreSQL to transfer Palantir Datasets into Postgres for Reporting purposes. That is all I know about PostgreSQL, so maybe. I just focus on SQL Server Geospatial capabilities in this video. Thanks for your interest, and sorry for the delayed response. :)
Great code you have there! I am trying to run it at the moment, but when running learn.fit_one_cycle, the process gets interrupted almost when finishing the first epoch. Do you have any idea what that might be? Thanks.
Sorry, no idea. Sorry for the delayed response. :)
where is the part 2
Part 2 of what?
Can you email me your slides
Sorry, I cannot. Those were slides I converted into a PowerPoint executable and put into my OneDrive about 7-8 years ago, so I do not have them anymore. Thanks, :)
@torbjornstorli2880 No problem.