In this webinar, Michael Tarallo from Qlik demonstrates how Qlik Cloud integrates with the OpenAI APIs using Qlik's OpenAI Analytics Connector and the Application Automation OpenAI Connector Blocks.
Sign up for the next Do More with Qlik:
pages.qlik.com/21Q3_QDEV_DA_GBL_DoMorewithQlikTargetpage_Registration-LP.html
Do More Archive Playlist:
th-cam.com/play/PLW1uf5CQ_gSo8GzlcFKAjIxLfFETDECcK.html
Index:
0:00 - Introduction and Presentation
15:00 - Demonstration - Embedded Generative AI - OpenAI Analytics Connector
25:00 - Add Generative AI to your Qlik data model - OpenAI Analytics Connector
29:51 - Adding Generative AI to your automated workflow (OpenAI Connector Blocks - Application Automation)
Resources:
Embed Generative AI in Qlik Sense charts - real-time:
th-cam.com/video/R9ScDzEU9DQ/w-d-xo.html
Add Generative AI to your data model:
th-cam.com/video/XCaaRenozb8/w-d-xo.html
Add Generative AI to your automated workflows:
th-cam.com/video/GxUSUqgEfK0/w-d-xo.html
Complex stuff, but as always, so nice & neatly presented. Thanks, Mike!
Great Mike!
Is it correct that there is a general limit of 25 rows per request during data load? Does that mean that if my table is bigger, I have to split it into parts of 25 rows?
Hello Oli - For Qlik Sense, this is true, and the exact limit depends on the specific connector. Requests to OpenAI are run one row at a time and are priced per token in each request (with rate limiting in place). Practically, we have set this limit so customers don't run up massive bills with OpenAI without consciously doing so. It is mostly designed to ensure that when you add requests in a sheet you don't send every value in a table to OpenAI. So yes, if you need to run more than 25 rows, you need to make these requests in batches. In the load script, this can be done with a simple script. However, it all depends on whether sending larger batches of data to OpenAI is the right, cost-effective solution for your use case; many use cases are better served by using the chart expression for on-demand processing by OpenAI instead of loading the data from OpenAI into the data model. If you would like to discuss this further, please let me know.
Regards,
Mike T
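For reference, here is a minimal load-script sketch of the batching approach described above. The table name SourceData, the fields RowId and PromptText, and the connection name OpenAI_Completions are placeholders for illustration; copy the exact EXTENSION / endpoint call for your own connection from the connector's select-data dialog and help page.

// Minimal sketch, assuming a resident table [SourceData] with a sequential
// RowId field (e.g. built with RowNo()) and a PromptText field, plus an
// OpenAI analytic connection named "OpenAI_Completions" (all placeholders).

Set vBatchSize = 25;                         // the connector's per-request row limit
Let vTotalRows = NoOfRows('SourceData');     // SourceData must already be loaded

For vStart = 0 To $(vTotalRows) - 1 Step $(vBatchSize)

    // Pick out the next block of up to 25 rows from the source table.
    [Batch]:
    NoConcatenate
    LOAD RowId, PromptText
    Resident SourceData
    Where RowId > $(vStart) and RowId <= $(vStart) + $(vBatchSize);

    // Send the batch to the analytic connection. The JSON below follows the
    // generic endpoint syntax; adjust it to match your connection.
    [Responses]:
    LOAD * EXTENSION endpoints.ScriptEval('{"RequestType":"endpoint", "endpoint":{"connectionname":"OpenAI_Completions"}}', Batch{PromptText});
    // Each batch result has the same fields, so it auto-concatenates into [Responses].

    Drop Table [Batch];

Next vStart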
@QlikOfficial This has been a problem for me. I am working with a table of 19,000 rows with a lot of information. Does that mean I am unable to use the OpenAI connector?
👍