The Java Tech Learning
India
Joined 2 Jan 2019
One-stop platform to learn all the languages, frameworks, open-source technologies and a lot more...
By Vishal Uplanchwar
Apache Kafka Interview Questions | For Experienced | Kafka FAQs | Interview Preparation
This video covers around 17-18 frequently asked interview questions for experienced IT folks. Answers & explanations are provided for all of the questions.
Link to pdf File : github.com/vishaluplanchwar/KafkaTutorials/blob/master/Apache%20Kafka%20Interview%20Questions.pdf
LIKE. SHARE. SUBSCRIBE
Views: 1,999
Videos
Kafka Streams Stateful Operation | GroupBy | Aggregate | Calculate running average using aggregation
781 views · 1 year ago
This one is a progression of the Kafka Streams stateful operations concept, where I'll illustrate the aggregate method. I'm going to calculate the running average of the order price for BUY stock orders by streaming the influx of events from a stock order events topic. Checkout source code: github.com/vishaluplanchwar/KafkaTutorials Follow me on LinkedIn: www.linkedin.com/in/vishal-uplanchwar-a2a746138 LIKE | SUBSCRIBE | SHARE
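For readers who want the shape of the code before watching, here is a minimal sketch of a running-average aggregation with the Kafka Streams DSL. The topic names, the "BUY,&lt;price&gt;" value format, and the String-encoded count/sum aggregate are illustrative assumptions, not taken from the video (the real implementation is in the GitHub repo above):

```java
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.kstream.*;

public class RunningAvgTopology {
    public static void build(StreamsBuilder builder) {
        // Assumed input: key = stock symbol, value = "<ORDER_TYPE>,<price>", e.g. "BUY,231.50"
        builder.stream("stock-order-events", Consumed.with(Serdes.String(), Serdes.String()))
            .filter((symbol, value) -> value.startsWith("BUY,"))          // keep BUY orders only
            .mapValues(value -> Double.parseDouble(value.substring(4)))   // extract the price
            .groupByKey(Grouped.with(Serdes.String(), Serdes.Double()))
            // The aggregate carries "count,sum" as a String so default serdes suffice in this sketch
            .aggregate(
                () -> "0,0.0",
                (symbol, price, agg) -> {
                    String[] parts = agg.split(",");
                    long count = Long.parseLong(parts[0]) + 1;
                    double sum = Double.parseDouble(parts[1]) + price;
                    return count + "," + sum;
                },
                Materialized.with(Serdes.String(), Serdes.String()))
            // Derive the running average from count and sum
            .mapValues(agg -> {
                String[] parts = agg.split(",");
                return String.valueOf(Double.parseDouble(parts[1]) / Long.parseLong(parts[0]));
            })
            .toStream()
            .to("buy-order-avg-price", Produced.with(Serdes.String(), Serdes.String()));
    }
}
```

In a real application you would typically aggregate into a small POJO with a JSON or Avro serde rather than a delimited String.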
Kafka Streams Stateful Operations | GroupBy | Count | Create State Store and query with Rest Api
2.1K views · 1 year ago
This video talks about Kafka Streams stateful operations. It covers in detail how to group events and apply the count aggregation, and also how to create a state store and access it through a REST service. Checkout source code: github.com/vishaluplanchwar/KafkaTutorials LIKE | SUBSCRIBE | SHARE
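A rough sketch of this pattern, assuming Spring Boot with spring-kafka's @EnableKafkaStreams and StreamsBuilderFactoryBean; the topic name, store name, and endpoint path are invented for illustration:

```java
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.common.utils.Bytes;
import org.apache.kafka.streams.StoreQueryParameters;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.kstream.Consumed;
import org.apache.kafka.streams.kstream.Grouped;
import org.apache.kafka.streams.kstream.Materialized;
import org.apache.kafka.streams.state.KeyValueStore;
import org.apache.kafka.streams.state.QueryableStoreTypes;
import org.springframework.kafka.config.StreamsBuilderFactoryBean;
import org.springframework.web.bind.annotation.*;

@RestController
public class OrderCountResource {

    private final StreamsBuilderFactoryBean factoryBean;

    public OrderCountResource(StreamsBuilderFactoryBean factoryBean, StreamsBuilder builder) {
        this.factoryBean = factoryBean;
        // Re-key each event by its value (e.g. product name) and count per key,
        // materializing the result in a queryable state store named "order-counts"
        builder.stream("stock-orders", Consumed.with(Serdes.String(), Serdes.String()))
               .groupBy((key, value) -> value, Grouped.with(Serdes.String(), Serdes.String()))
               .count(Materialized.<String, Long, KeyValueStore<Bytes, byte[]>>as("order-counts"));
    }

    @GetMapping("/counts/{product}")
    public Long count(@PathVariable String product) {
        // Query the local state store through the running KafkaStreams instance
        return factoryBean.getKafkaStreams()
                .store(StoreQueryParameters.fromNameAndType("order-counts",
                        QueryableStoreTypes.<String, Long>keyValueStore()))
                .get(product);
    }
}
```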
Kafka Streams Joins | Types of Joins | Stream-Stream join implementation with Spring Boot
2.5K views · 1 year ago
In this video I'm going to explain how to join multiple Kafka input streams using the join method provided by Kafka Streams, and look at the different types of joins and how they work. Finally I'll take one real-life scenario and implement join operations on data present in two different Kafka topics. Don't forget to subscribe to my channel if you like the content. Subscribe | Like | Share
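For a flavor of what a stream-stream join looks like in code, here is a minimal sketch (Kafka Streams 3.x API; the order/payment topics and the 5-minute window are invented for illustration):

```java
import java.time.Duration;
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.kstream.*;

public class OrderPaymentJoin {
    public static void build(StreamsBuilder builder) {
        KStream<String, String> orders =
                builder.stream("orders", Consumed.with(Serdes.String(), Serdes.String()));
        KStream<String, String> payments =
                builder.stream("payments", Consumed.with(Serdes.String(), Serdes.String()));

        // Inner join: emits only when an order and a payment with the same key
        // arrive within 5 minutes of each other
        orders.join(payments,
                    (order, payment) -> order + " | " + payment,          // ValueJoiner
                    JoinWindows.ofTimeDifferenceWithNoGrace(Duration.ofMinutes(5)),
                    StreamJoined.with(Serdes.String(), Serdes.String(), Serdes.String()))
              .to("orders-with-payments", Produced.with(Serdes.String(), Serdes.String()));
    }
}
```

leftJoin and outerJoin have the same shape; KStream-KTable and KTable-KTable joins take no window.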
Confluent Cloud Demo | Create Apache Kafka Cluster | Connect to Confluent Cloud with Spring Boot
8K views · 1 year ago
This video explains creating environments and a cluster for Apache Kafka on the Confluent Cloud platform. It also demonstrates connecting to the Kafka cluster created in Confluent Cloud, and producing and consuming events. Like | Subscribe | Share. I have outlined the Spring Kafka properties used to connect to the Confluent Cloud platform in the video -
spring:
  kafka:
    bootstrap-servers: confluent-cloud-broker-url:9092
...
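The Spring properties above are cut off; for reference, the equivalent plain-Java client configuration for Confluent Cloud generally looks like this sketch (the broker URL, API key/secret, and topic name are placeholders, not values from the video):

```java
import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class CloudProducerDemo {
    public static void main(String[] args) {
        Properties props = new Properties();
        // Placeholders: take the real values from the cluster's client config in Confluent Cloud
        props.put("bootstrap.servers", "confluent-cloud-broker-url:9092");
        props.put("security.protocol", "SASL_SSL");
        props.put("sasl.mechanism", "PLAIN");
        props.put("sasl.jaas.config",
                "org.apache.kafka.common.security.plain.PlainLoginModule required "
                + "username=\"<API_KEY>\" password=\"<API_SECRET>\";");
        props.put("key.serializer", StringSerializer.class.getName());
        props.put("value.serializer", StringSerializer.class.getName());

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            producer.send(new ProducerRecord<>("test-topic", "key1", "hello from Confluent Cloud"));
        }
    }
}
```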
Kafka Stream Processor Api implementation with Spring Boot | Stateless Operations
7K views · 1 year ago
This video will explain the following: 1. What is the Processor API? 2. Configuring the Processor API with Spring Boot. 3. Implementing a real-time scenario to stream events using the Processor API. Checkout Source Code: github.com/vishaluplanchwar/KafkaTutorials Like.Subscribe.Share
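As a taste of the Processor API (the newer org.apache.kafka.streams.processor.api variant), here is a minimal stateless sketch; the topic names and upper-casing logic are invented, and default String serdes are assumed in the streams config:

```java
import org.apache.kafka.streams.Topology;
import org.apache.kafka.streams.processor.api.Processor;
import org.apache.kafka.streams.processor.api.ProcessorContext;
import org.apache.kafka.streams.processor.api.Record;

public class UppercaseTopology {

    // A stateless processor that upper-cases every value it sees
    static class UppercaseProcessor implements Processor<String, String, String, String> {
        private ProcessorContext<String, String> context;

        @Override
        public void init(ProcessorContext<String, String> context) {
            this.context = context;
        }

        @Override
        public void process(Record<String, String> record) {
            context.forward(record.withValue(record.value().toUpperCase()));
        }
    }

    public static Topology build() {
        Topology topology = new Topology();
        topology.addSource("Source", "input-topic")                  // read from input-topic
                .addProcessor("Uppercase", UppercaseProcessor::new, "Source")
                .addSink("Sink", "output-topic", "Uppercase");       // write to output-topic
        return topology;
    }
}
```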
Convert complex Java objects to Avro schema | Existing POJOs to Avro schema for Kafka uses
11K views · 2 years ago
This video will explain how to convert existing complex Java objects in your application to an Avro schema, which can then be used with Kafka. Like | Subscribe | Share Download source code from GitHub: github.com/vishaluplanchwar/KafkaTutorials
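One common way to do this is Avro's reflection support; a minimal sketch (the Order POJO is a made-up example, not from the video):

```java
import org.apache.avro.Schema;
import org.apache.avro.reflect.ReflectData;

public class PojoToAvroDemo {

    // An existing application POJO (made-up example)
    public static class Order {
        private String symbol;
        private double price;
        private int quantity;
        // getters/setters omitted for brevity
    }

    public static void main(String[] args) {
        // Derive an Avro schema directly from the class via reflection
        Schema schema = ReflectData.get().getSchema(Order.class);
        System.out.println(schema.toString(true)); // pretty-printed schema JSON

        // Use ReflectData.AllowNull.get() instead if fields should map to nullable unions
    }
}
```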
Configure Kafka Streams with Spring Boot | Implement Kafka Streams DSL filter, mapValues methods
13K views · 2 years ago
This video illustrates configuring Kafka Streams using the Spring Boot framework. It shows the implementation of a stream processing topology using DSL methods like filter and mapValues. Checkout Source Code from here: github.com/vishaluplanchwar/KafkaTutorials Like | Subscribe | Share
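A minimal sketch of what such a topology looks like with spring-kafka (this assumes the usual spring.kafka.streams.* properties are configured; topic names and the filter condition are invented):

```java
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.kstream.Consumed;
import org.apache.kafka.streams.kstream.KStream;
import org.apache.kafka.streams.kstream.Produced;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.kafka.annotation.EnableKafkaStreams;

@Configuration
@EnableKafkaStreams
public class OrderStreamConfig {

    @Bean
    public KStream<String, String> orderStream(StreamsBuilder builder) {
        KStream<String, String> stream =
                builder.stream("orders-input", Consumed.with(Serdes.String(), Serdes.String()));
        stream.filter((key, value) -> value != null && value.contains("BUY")) // drop non-BUY events
              .mapValues(String::toUpperCase)                                 // transform the payload
              .to("orders-output", Produced.with(Serdes.String(), Serdes.String()));
        return stream;
    }
}
```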
What is Kafka Streams ? | Stream Processing Topology | Ways to implement Kafka Streams
4.2K views · 2 years ago
This video gives a high-level understanding of what Kafka Streams is, what a stream processing topology is, and what the key features of Kafka Streams are, along with the ways to define a Kafka Streams application. Like | Subscribe | Share
Spring Kafka Avro Consumer | Consume Avro messages from kafka topic | Confluent | Schema Registry
13K views · 2 years ago
This video talks about creating a Spring consumer to read Avro-schema messages from a Kafka topic. Checkout source code from GitHub - github.com/vishaluplanchwar/KafkaTutorials LIKE | SUBSCRIBE | SHARE
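A rough sketch of the consumer side, reading values as GenericRecord so no generated classes are needed. The topic, group id, and field names are invented, and the consumer factory is assumed to be configured with io.confluent.kafka.serializers.KafkaAvroDeserializer plus a schema.registry.url:

```java
import org.apache.avro.generic.GenericRecord;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.stereotype.Component;

@Component
public class AvroOrderListener {

    // Deserialization to GenericRecord is handled by KafkaAvroDeserializer,
    // configured on the consumer factory together with schema.registry.url
    @KafkaListener(topics = "avro-orders", groupId = "avro-consumer-group")
    public void listen(ConsumerRecord<String, GenericRecord> record) {
        GenericRecord order = record.value();
        System.out.printf("key=%s symbol=%s price=%s%n",
                record.key(), order.get("symbol"), order.get("price"));
    }
}
```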
Spring Kafka Avro Producer | Produce Avro messages on topic | Confluent | Schema Registry
20K views · 2 years ago
This video will explain producing Avro messages on a Kafka topic using the Spring Boot framework. It uses the Confluent platform and its Schema Registry service to handle Avro schema files. Checkout source code from GitHub - github.com/vishaluplanchwar/KafkaTutorials LIKE | SUBSCRIBE | SHARE
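For orientation, here is a plain-Java version of the idea; the schema, topic, and Schema Registry URL are placeholders (the Spring Boot version in the video wires the same serializer through configuration):

```java
import java.util.Properties;
import org.apache.avro.Schema;
import org.apache.avro.generic.GenericData;
import org.apache.avro.generic.GenericRecord;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;

public class AvroProducerDemo {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");
        props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        // KafkaAvroSerializer registers the schema and encodes records against it
        props.put("value.serializer", "io.confluent.kafka.serializers.KafkaAvroSerializer");
        props.put("schema.registry.url", "http://localhost:8081");

        Schema schema = new Schema.Parser().parse(
                "{\"type\":\"record\",\"name\":\"Order\",\"fields\":["
                + "{\"name\":\"symbol\",\"type\":\"string\"},"
                + "{\"name\":\"price\",\"type\":\"double\"}]}");
        GenericRecord order = new GenericData.Record(schema);
        order.put("symbol", "AAPL");
        order.put("price", 189.50);

        try (KafkaProducer<String, GenericRecord> producer = new KafkaProducer<>(props)) {
            producer.send(new ProducerRecord<>("avro-orders", "AAPL", order));
        }
    }
}
```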
Spring Kafka Consumer Example | Spring Boot | Listen to String & JSON messages from Kafka Topics
7K views · 2 years ago
This video talks about creating a Kafka consumer to read String and JSON messages from a Kafka topic using Spring Boot. Source Code: github.com/vishaluplanchwar/KafkaTutorials Like | Subscribe | Share
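A compact sketch of both listener styles. Topic names, the group id, the Order type, and the two container factory beans (stringListenerFactory with StringDeserializer, jsonListenerFactory with org.springframework.kafka.support.serializer.JsonDeserializer) are all assumptions for illustration:

```java
import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.stereotype.Component;

@Component
public class MessageListeners {

    // Plain String payloads
    @KafkaListener(topics = "string-orders", groupId = "demo-group",
                   containerFactory = "stringListenerFactory")
    public void onString(String message) {
        System.out.println("string message: " + message);
    }

    // JSON payloads bound to a POJO by JsonDeserializer
    @KafkaListener(topics = "json-orders", groupId = "demo-group",
                   containerFactory = "jsonListenerFactory")
    public void onOrder(Order order) {
        System.out.println("order received: " + order);
    }

    // Invented payload type for the sketch
    public record Order(String symbol, double price, int quantity) { }
}
```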
Spring Kafka Producer Example | Connecting to Apache Kafka broker with Spring Boot | Publish Events
7K views · 2 years ago
This video provides insights on connecting to an Apache Kafka broker using the Spring Boot framework. It also helps you create a Spring Kafka message producer to publish String and JSON values to a topic. GitHub Code: github.com/vishaluplanchwar/KafkaTutorials Like | Subscribe | Share
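The core of such a producer is just a KafkaTemplate; a minimal sketch (topic names and the Order type are invented; JSON support assumes spring.kafka.producer.value-serializer is set to org.springframework.kafka.support.serializer.JsonSerializer):

```java
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.stereotype.Service;

@Service
public class OrderProducer {

    private final KafkaTemplate<String, Object> kafkaTemplate;

    public OrderProducer(KafkaTemplate<String, Object> kafkaTemplate) {
        this.kafkaTemplate = kafkaTemplate;
    }

    public void sendString(String message) {
        kafkaTemplate.send("string-orders", message);    // String value
    }

    public void sendOrder(String key, Order order) {
        kafkaTemplate.send("json-orders", key, order);   // POJO serialized to JSON
    }

    // Invented payload type for the sketch
    public record Order(String symbol, double price, int quantity) { }
}
```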
Integration of JMS with Kafka | JMS Source Connector | Source ActiveMQ messages to Confluent Kafka
4.7K views · 2 years ago
This video explains connecting any JMS-compliant broker to Confluent using the Kafka Connect JMS Source Connector. Here we source messages pushed to ActiveMQ into a Kafka topic. You can find all the source connector configs in my GitHub repository github.com/vishaluplanchwar/KafkaTutorials Like | Subscribe | Share Note: In order to run the JMS Source Connector on the Confluent platform fi...
Sink Kafka Topic to Database Table | Build JDBC Sink Connector | Confluent Connector | Kafka Connect
23K views · 2 years ago
This video explains sinking Kafka topic data to a MySQL table using the Confluent JDBC Sink Connector. It walks through creating the sink connector config and then deploying it on the Confluent server using Control Center. Like | Subscribe | Share
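For a sense of what such a connector config contains, here is a sketch that registers one through the Kafka Connect REST API instead of Control Center. All names, URLs, and credentials are placeholders, and the text block requires Java 15+:

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class RegisterJdbcSink {
    public static void main(String[] args) throws Exception {
        // Typical JDBC sink settings: pk.mode/pk.fields drive upserts, auto.create builds the table
        String connectorJson = """
            {
              "name": "mysql-orders-sink",
              "config": {
                "connector.class": "io.confluent.connect.jdbc.JdbcSinkConnector",
                "topics": "orders",
                "connection.url": "jdbc:mysql://localhost:3306/demo",
                "connection.user": "demo",
                "connection.password": "demo",
                "insert.mode": "upsert",
                "pk.mode": "record_value",
                "pk.fields": "id",
                "auto.create": "true"
              }
            }""";

        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("http://localhost:8083/connectors"))
                .header("Content-Type", "application/json")
                .POST(HttpRequest.BodyPublishers.ofString(connectorJson))
                .build();
        HttpResponse<String> response =
                HttpClient.newHttpClient().send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.statusCode() + " " + response.body());
    }
}
```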
Source MySQL table data to Kafka | Build JDBC Source Connector | Confluent Connector | Kafka Connect
28K views · 2 years ago
What is Confluent Control Center? | Walkthrough of Control Center GUI | Control Center features
8K views · 2 years ago
Installing Confluent on local machine | Confluent Installation Guide | Start Confluent Services
17K views · 2 years ago
What is Apache Kafka? | What is Confluent? | Apache Kafka v/s Confluent | Confluent for beginners
29K views · 2 years ago
Conditional Routing in Mule 4 | Choice Router | Rest APIs | Execute flow based on condition
3.7K views · 3 years ago
File Connector in Mule 4 | On New or Updated File | ForEach Scope | Iterate data with ForEach Scope
3.5K views · 3 years ago
Asynchronous Batch Processing in MuleSoft 4 | Database Scheduler | Batch Step | Batch Aggregator
2K views · 3 years ago
DataWeave 2.0 in MuleSoft 4 | DWL Transformations | JSON to JSON | JSON to XML | DWL Operators
2.8K views · 3 years ago
MuleSoft4 JMS Connector | Publish, Consume, Listen to Queues | JMS operations with ActiveMQ
3.5K views · 3 years ago
Implementation of RAML Api specification in Mule Application | Build RESTful Services | Session-3
3.6K views · 3 years ago
Api Designing | Advanced features in RAML | Import RAML specification in MuleSoft 4 App | Session-2
4.2K views · 3 years ago
Api Designing in RAML | RAML Tutorial | American Flight Api Demo | Anypoint Exchange | Session-1
12K views · 3 years ago
Merging Maps in Java 8 using stream apis | Different ways to merge Maps in Java 7 v/s Java 8
2.6K views · 4 years ago
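For reference, a self-contained sketch of the two common merging approaches (the fruit maps are made-up sample data):

```java
import java.util.HashMap;
import java.util.Map;
import java.util.stream.Collectors;
import java.util.stream.Stream;

public class MergeMapsDemo {
    public static void main(String[] args) {
        Map<String, Integer> first = new HashMap<>();
        first.put("apples", 3);
        first.put("pears", 2);
        Map<String, Integer> second = new HashMap<>();
        second.put("pears", 5);
        second.put("plums", 1);

        // Java 8 streams: concat both entry sets; the third toMap argument
        // resolves duplicate keys (here by summing the values)
        Map<String, Integer> merged =
                Stream.concat(first.entrySet().stream(), second.entrySet().stream())
                      .collect(Collectors.toMap(Map.Entry::getKey, Map.Entry::getValue, Integer::sum));

        // Java 8 without streams: Map.merge resolves collisions in place
        Map<String, Integer> mergedInPlace = new HashMap<>(first);
        second.forEach((k, v) -> mergedInPlace.merge(k, v, Integer::sum));

        System.out.println(merged);        // e.g. {plums=1, apples=3, pears=7}
        System.out.println(mergedInPlace); // same contents
    }
}
```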
Error Handling in MuleSoft 4 | On Error Continue, On Error Propagate, Try Scope mechanism in Mule
24K views · 5 years ago
Batch Processing in MuleSoft | Creating XML files from database records using Batch Processing
6K views · 5 years ago
Most sensible video on Kafka topic, thank you very much.
I really appreciate this brief and well-delivered tutorial. Thanks. However, I am wondering why you would use POST when you are retrieving departments; I would have thought it would still be a GET. Would appreciate knowing.
You made it sooo simple. Thanks!
Hi, do you have any example of how to configure the Confluent Control Center in a project? I couldn't find one.
Hi, can you please confirm: 1) Why are you using the Avro converter? If I haven't set any schema for the topic, what should I give here? 2) Are pk.mode and pk.fields mandatory?
If we want to extract header JSON into the DB table columns, how can this be done?
Very good content!! Thank you very much for creating this!! 💯👌
Really very useful video ❤ thank you very much!
I have a question... let's say we have in the store "DENIM" - 7, "SHIRTS" - 5. If I receive a tombstone for one of them, will the KTable reduce the related count after grouping by? Actually not... I created a test with TopologyTestDriver and sent three key/value pairs, then sent a tombstone value for the first key, and guess what? The KTable returned me all three values related to the keys, and the tombstone was ignored. If you go to the aggregate() operation you will see that it skips null-value entries, and I am wondering how we can manage our store, because it will only grow over time, right?
This is really helpful. Please keep us posted with more. Thanks a lot. By any chance do you take classes? Let me know, I am interested.
Really good one, but I got one question. For example, if I am writing only the consumer, how do I know about StockHistory and where do I need to keep it? I mean, how exactly does the package structure need to be set up on the consumer side?
very nice content, congratulations and thank you so much. Keep up the good work!
Any alternative to Avro? If possible, provide an example for that also. Thanks
You can publish in JSON format
First view
Thank you, please continue with new programming topics
Cheers bro!
How to insert this type of data?
{
  "_id": { "string": "663ba280e41f1aa25e72f32a" },
  "Address": { "array": [ { "string": "a" }, { "string": "b" }, { "string": "c" } ] },
  "City": { "string": "Norway" },
  "FirstName": { "string": "Tom" },
  "LastName": { "string": "Cardinal" },
  "PersonID": { "int": 1 }
}
Is this JSON? You can either convert it or create an Avro schema from it and then publish to the topic
Yes. In the JSON there is an Address field which is an array
Link to source code please?
💯👏👏👏👏
Can we do this same thing using Spring Boot?
Yes we can
@javatechlearning How, sir? Can you please provide some example or code?
Hello, good video. But at an enterprise level, we don't want the Avro schema to be defined up front; we want the incoming XML files to be dynamically converted to Avro schemas and pushed to the topic. Please provide your inputs on how to do this.
Kafka does not understand XML; you have to convert the XML into an Avro object. You can search for a tool to convert an XML schema into an Avro schema
@javatechlearning Can I use Kafka Connect? Or please suggest what's your best bet
I think there is also a file connector present in Confluent. Try exploring that
Hi Vishal, can you share the Gradle project dependencies for creating the target folder for the Avro schema?
Can you please share how we can generate the Java file for an avsc in a Gradle project?
You must continue your teaching, bro. You are excellent. Thanks
Thank you
@javatechlearning : source code missing
@javatechlearning : Can you check in the code for the stateful operations?
Hi Sir, I am not able to see the skip payment option. What should I do? Please tell me.
Would this work with any RDBMS? Would it work with Sybase ASE as the sink?
Yes, this should work with Sybase DB as well. You need to use the Sybase driver for that
How can I use the values I received after the HTTP call and assign them to a variable?
Hi, I could not see the source code of this tutorial in the given GitHub repo. Can you please add it?
How does ADF authenticate with MuleSoft, how does MuleSoft authenticate with the API sources, and how does ADF trigger a MuleSoft workflow after it is published to CloudHub?
Thanks for the quick video. This is helpful.
Bro, any videos you can refer me to on how we leverage Confluent in data engineering? I know Kafka in and out, just not Confluent in practice
'buildStreamsProperties()' is deprecated and marked for removal. Edit: this worked for me instead:

@Bean
public StreamsConfig streamsConfig(KafkaProperties properties) {
    Properties props = new Properties();
    props.put(StreamsConfig.APPLICATION_ID_CONFIG, "kafka-streams-app");
    props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
    props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
    props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());
    // Add any additional properties as needed
    return new StreamsConfig(props);
}
Vishal, now it's asking for a mandatory credit card - it's giving the error "Forbidden: No credit card on file"
I appreciate your great work, but one suggestion: you are using deprecated classes. It would be great to avoid using deprecated classes and methods
Hey, how can you handle errors, let's say the destination connector is down, MQ is down, etc.?
Hi, I have one question... can this Kafka sink connector be used to connect 100 different databases? Need your input on this
But where did we use the Schema Registry? We haven't used it anywhere
We did use the Schema Registry. It's not possible to produce Avro on a topic without the Schema Registry
Can you please help me in setting up Confluent Control Center? I want to know what changes have to be made in the properties file to run Control Center locally for a cluster in Confluent Cloud. Your reply means a lot
Can you also do a video using the Confluent Kafka Java client? My Java application cannot connect to the Confluent Cloud cluster.
Can I use the same commands with Apache Kafka?
How to do the same using spring.cloud.stream.bindings? I'm trying to apply the schema-related configs to the binding consumer, but I'm getting "schema.registry.url was supplied but isn't a known config". Can you help with how to configure schema configs on a binding consumer, please?
Does this work in Android as well, considering it is working in IntelliJ?
In IntelliJ it works
Can we connect TIBCO EMS to Kafka?
Simple, concise and elegant. Great work.
Thanks for the great tutorial. Did you have to create the Avro schema, or does it get generated?
nvm, I found the answer in the video.
Great... Nicely explained 👌
My schema registry is continuously throwing an error. Need help ASAP.
Can you paste the error?
Hi Michel, thanks for the videos and explanations, these are really useful. But I suggest you add Kerberos and show how to implement it.