Awesome, I will link it along with the Vue.js one. It's so awesome, clear, and lovely. Live long, my friend. Great tutorial.
Illuminate\Queue\MaxAttemptsExceededException: App\Jobs\QuestionCsvProcess has been attempted too many times or run too long. The job may have previously timed out.
Can you please help? I have increased the timeout to 9000, but I still get the same error in the failed_jobs table in the DB.
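For anyone hitting this error: a common cause is that the queue connection's retry_after in config/queue.php is smaller than the time the job actually needs, so the worker assumes the job died and re-dispatches it until the attempt limit is hit. A minimal sketch, assuming a job class like the one in the video (the numbers are illustrative):

    <?php
    // app/Jobs/QuestionCsvProcess.php; a sketch, values are illustrative
    namespace App\Jobs;

    use Illuminate\Bus\Queueable;
    use Illuminate\Contracts\Queue\ShouldQueue;
    use Illuminate\Foundation\Bus\Dispatchable;
    use Illuminate\Queue\InteractsWithQueue;
    use Illuminate\Queue\SerializesModels;

    class QuestionCsvProcess implements ShouldQueue
    {
        use Dispatchable, InteractsWithQueue, Queueable, SerializesModels;

        public $tries = 3;     // fail properly after 3 attempts instead of retrying forever
        public $timeout = 600; // max seconds a single attempt may run

        public function handle()
        {
            // ... insert the CSV rows ...
        }
    }

Whatever values you pick, make sure 'retry_after' on the connection in config/queue.php is larger than $timeout, and restart the worker afterwards, since queue:work keeps the old code in memory.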
Laravel LazyCollection chunks would be great; we don't create a physical file, and it also saves a lot of memory.
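For anyone curious, a minimal sketch of that idea; the file path, table name, and column mapping are assumptions:

    use Illuminate\Support\Facades\DB;
    use Illuminate\Support\LazyCollection;

    // Stream the CSV one row at a time instead of loading it all into memory.
    LazyCollection::make(function () {
        $handle = fopen(storage_path('app/pending/questions.csv'), 'r');
        while (($row = fgetcsv($handle)) !== false) {
            yield $row;
        }
        fclose($handle);
    })
    ->skip(1)    // skip the header row
    ->chunk(500) // one insert query per 500 rows
    ->each(function ($rows) {
        DB::table('questions')->insert(
            $rows->map(function ($r) {
                return ['question' => $r[0], 'answer' => $r[1]];
            })->all()
        );
    });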
Hi!
I have transaction data in Excel that I save in a database, and other data with exactly the same fields in an external database. My concern is to compare whether the transaction references extracted from my Excel file have exactly the same status as the ones stored in the external database. How can I proceed?
Example: I have 1000 rows in total, then I call
array_chunk($data, 500);
and it divides them into two files in the pending folder.
My issue is that only the first 500 rows from the first file get inserted and the second file fails. The second loop isn't working. Why is this happening?
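Hard to say without seeing the code, but a common cause is reusing the same file name for every chunk, so the second write overwrites the first. A working pattern as a sketch; the folder and write logic are assumptions:

    // Split the rows into slices of 500 and give each chunk its own file name.
    $chunks = array_chunk($data, 500);
    foreach ($chunks as $index => $chunk) {
        $handle = fopen(storage_path('app/pending/chunk_' . ($index + 1) . '.csv'), 'w');
        foreach ($chunk as $row) {
            fputcsv($handle, $row);
        }
        fclose($handle);
    }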
How can I do this without Redis?
Thanks! Very clear and helpful!
Great to hear!
First time I've liked a video.
Thank you
$import->failures(); is not showing any failed columns. I have removed the onFailed() function from the import file, but it always shows up blank.
The problem is how to validate and show errors to the user; you are only validating that the file is a CSV, which is very poor validation.
If you could help with proper validation, I would appreciate it.
Thanks
Awesome videos. Thanks a lot for your quality content. Could you please make a Laravel multi-level marketing project series? I can't find anyone on TH-cam who has built one with Laravel.
Thanks for the kind words. I have to analyze how much time it will take to build. Thanks for the suggestion.
Alright. Thanks for the reply.
Will it work using the database queue driver with MySQL instead of Redis?
Predis\Connection\ConnectionException
No connection could be made because the target machine actively refused it. [tcp://127.0.0.1:6379]. I got this error; can you tell me what happened?
Looks like Redis is not installed, or the correct port is not specified in .env.
@WebDevMatics But I have installed Redis. Which port should we specify, Apache's or MySQL's?
Search for Redis installation instructions.
@WebDevMatics OK.
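For reference: it is neither Apache's nor MySQL's port; 6379 is Redis's own default port. A sketch of the standard .env entries for a local Redis queue setup:

    QUEUE_CONNECTION=redis
    REDIS_HOST=127.0.0.1
    REDIS_PASSWORD=null
    REDIS_PORT=6379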
How can I split the file into parts based on a tag?
Can you explain how to import from a live external API with 50k records?
Class "App\Models\Details" not found......On web,php
Use a controller, and don't forget to import the class.
You probably know this, but I had a CSV file whose unique key was a string, so I added an extra key as the primary id for the imported table and forgot to index the string column in the migration. Make sure you create the index in the migration in the first place ($table->string('sku')->unique();). It has a big impact on performance (about 30 times faster on updateOrCreate).
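To make that concrete, a sketch of what it looks like inside a migration's up() method; the table name and extra columns are assumptions:

    use Illuminate\Database\Schema\Blueprint;
    use Illuminate\Support\Facades\Schema;

    Schema::create('products', function (Blueprint $table) {
        $table->id();
        // Unique index from day one: updateOrCreate() then does an index
        // seek instead of a full table scan on every imported row.
        $table->string('sku')->unique();
        $table->string('name');
        $table->timestamps();
    });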
Yes, you are right; I missed that.
@WebDevMatics I didn't mean you specifically by "you"; I'm pretty sure you knew it. I had a hard time finding out why the worker process was being killed; I thought it needed the right balance between slice size and time, which turned out not to matter after indexing.
I also had problems with the data itself, like a date with the value 00000000 and unescaped double quotation marks. Laravel doesn't catch errors in those workers and gets stuck in a loop on any file with bad data. Where is the best place to check and fix those errors?
Is there any way to catch or log those errors?
What do we do on a production site to process the files?
Thanks
Laravel queues are a different topic; they need a separate series. Validation in CSV import is very important. Basically, we escape unwanted characters using a regex and catch errors using try/catch. Put a $tries property in the job so that it stops after 3 tries in case of failure.
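A minimal sketch of that advice inside the job; the regex, table name, and row shape are assumptions:

    <?php
    namespace App\Jobs;

    use Illuminate\Contracts\Queue\ShouldQueue;
    use Illuminate\Support\Facades\DB;
    use Illuminate\Support\Facades\Log;

    class QuestionCsvProcess implements ShouldQueue
    {
        public $tries = 3; // stop after 3 failed attempts, as mentioned above

        public $rows;

        public function __construct(array $rows)
        {
            $this->rows = $rows;
        }

        public function handle()
        {
            foreach ($this->rows as $row) {
                try {
                    // Strip control characters that tend to break inserts.
                    $question = preg_replace('/[[:cntrl:]]/u', '', trim($row[0]));
                    DB::table('questions')->insert(['question' => $question]);
                } catch (\Throwable $e) {
                    // Log and skip the bad row instead of letting one row kill the job.
                    Log::warning('Skipped bad CSV row', ['row' => $row, 'error' => $e->getMessage()]);
                }
            }
        }
    }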
Excellent.....
Many many thanks
@WebDevMatics How do queued jobs keep working on cPanel (production)?
Thank You!
Hi, can you help me? I have 1381 rows, but only 2 rows get imported into the database. Why does this happen?
Hard to say without seeing the code.
Use additional parameters if your delimiter is a semicolon: $data = array_map(function($l) { return str_getcsv($l, ';'); }, file($file));
Hey guys, what's the Laravel version here?
I think it is 7, but it also works on version 8.
Hey, thanks for the video. I have a question: how can I upload an xlsx file, and how can I pass the auth id so it is saved along with the data from the Excel file?
Put the auth middleware on the import route; then you can get the authenticated user's id with auth()->id().
@WebDevMatics I have tried it, sir; it still doesn't work.
@WebDevMatics The auth id is not accessible in the handle() function.
@nkesigaclinton3013 If you are using a queue, you can pass the auth id while dispatching the job.
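A sketch of that suggestion; the job name and constructor are assumptions. The key point is that auth() is empty inside a queue worker (there is no session there), so capture the id at dispatch time:

    // In the controller, while the user is still authenticated:
    ProcessCsvUpload::dispatch($path, auth()->id());

    // In the job, store the id in the constructor:
    use Illuminate\Contracts\Queue\ShouldQueue;

    class ProcessCsvUpload implements ShouldQueue
    {
        public $path;
        public $userId;

        public function __construct($path, $userId)
        {
            $this->path = $path;
            $this->userId = $userId; // use $this->userId inside handle()
        }
    }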
Cheers, thanks a lot. Could you please provide the source code for this?
But how can I upload a CSV with 10 million records and a file size greater than 1 GB?
You don't need to upload it. Just put the file in some location, probably inside the public folder, and write a script that will read the file, split it, and import it chunk by chunk (we covered this part in the video).
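A sketch of that script; the file name, chunk size, and job name are assumptions:

    // Stream the huge file with fgetcsv() so memory stays flat, and queue
    // one job per 1000-row slice.
    $handle = fopen(public_path('big.csv'), 'r');
    fgetcsv($handle); // skip the header row

    $rows = [];
    while (($row = fgetcsv($handle)) !== false) {
        $rows[] = $row;
        if (count($rows) === 1000) {
            ImportChunk::dispatch($rows); // hypothetical job that inserts the slice
            $rows = [];
        }
    }
    if ($rows !== []) {
        ImportChunk::dispatch($rows); // leftover rows
    }
    fclose($handle);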
Great
Can you share the source code or a snippet?
The audio is quite low. Please consider upgrading your mic.
Yes, I have upgraded now.
I'm having problems; here is the code, can you check it out please?
github.com/earhackerdem/ImportLargeCsv
The error stored in the failed_jobs table is:
Illuminate\Queue\MaxAttemptsExceededException: App\Jobs\ProcessCsvUpload has been attempted too many times or run too long. The job may have previously timed out.
I'm uploading a file that has only 128 rows; the first queued job works fine, but the others don't. The code chunks 50 rows at a time.
Is a collection better than an array in this context if I have to compare and manipulate the data before inserting?
Performance-wise they are the same; you just get easier syntax with a collection.