As a few people mentioned, there's also the fast-excel package. Here are the results for that, based on a Pull Request by Hendra Susanto (thanks!):
I find that with the right tweaks, Fast Excel is the best in terms of memory usage for exporting, although it can be slower than the no-packages or Spatie approaches.
In my local test:
- No packages: 2741 ms, 12 MB memory usage
- Spatie: 2873 ms, 58 MB memory usage
- Fast Excel: 4096 ms, 8 MB memory usage
For importing, it sits somewhere between the no-packages and Spatie approaches in terms of duration and memory usage.
- No packages: 289 ms, 110 MB memory usage
- Spatie: 927 ms, 54 MB memory usage
- Fast Excel: 881 ms, 88 MB memory usage
Pull Request:
github.com/LaravelDaily/Laravel-Excel-Export-Import-Demo/pull/1
Hello there!
Thanks for your video, instructive as always.
I had this particular issue recently with a project for a client. My client needs to export a bunch of users, sometimes 10-30k of them, to CSV with some data transformation (to make the data in the rows more user-friendly). At first sight, Laravel Excel seemed to be a great choice because of its "plug-and-play" aspect. But in the end, we chose to remove it progressively, as it was just causing errors in production.
The documentation is sometimes a bit tricky; things don't work as expected or as explained.
As you mentioned, many people have complained about this performance issue and asked for the use of Eloquent cursors, for example, to avoid the memory overload or request timeouts.
On our side, we chose to query our database through Eloquent and stream the file download to the user, which is way better than what we had with Laravel Excel.
I may do as you did in this video to choose future packages, instead of just going by GitHub stars.
Thanks for the daily videos, I learn something new every day.
BTW, learned even more from your paid courses.
Keep going!
P.S. You said recently that you might refresh older courses for v9. I'm really eager to see it.
I always learn something new from you, but the debug packages you used are a lifesaver for me 😍
For low RAM usage, set chunkSize() and batchSize() in your Laravel Excel import class.
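A short sketch of that tip, based on the Laravel Excel (Maatwebsite) WithChunkReading and WithBatchInserts concerns; the User model and the column mapping are placeholders, not code from the video:

use App\Models\User;
use Maatwebsite\Excel\Concerns\ToModel;
use Maatwebsite\Excel\Concerns\WithBatchInserts;
use Maatwebsite\Excel\Concerns\WithChunkReading;

class UsersImport implements ToModel, WithBatchInserts, WithChunkReading
{
    public function model(array $row)
    {
        // Placeholder column mapping: adjust the indexes to your spreadsheet.
        return new User([
            'name'  => $row[0],
            'email' => $row[1],
        ]);
    }

    // Insert rows in batches of 1000 instead of one query per row.
    public function batchSize(): int
    {
        return 1000;
    }

    // Read the spreadsheet in chunks of 1000 rows to keep memory usage low.
    public function chunkSize(): int
    {
        return 1000;
    }
}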
Also, for performance: if you are certain that the values in your array will never be null, I highly recommend using isset($arrayName[$keyName]) over array_key_exists($keyName, $arrayName), as isset is much faster. It will, however, return false if the value of that key is null. If you never need to worry about that, use isset (see the sketch after this thread).
Thanks for the explanation. Did you have the chance to test values with empty()? Curious what kind of timing that method would give.
@@kchqq I'm not sure if empty handles non-existing keys as well. It may throw a warning at minimum, whereas isset is designed to simply detect whether something is set and non-null, and won't throw warnings or errors if it doesn't exist.
@@JouvaMoufette I think it throws an error for a non-existent key when working with arrays, but it doesn't throw for collections, so I usually put ?? '' afterwards to be safe. Probably not the most efficient piece of code, but it's short and readable.
@@kchqq Yeah, but what I'm saying here is that empty isn't a good replacement for isset, while isset is a pretty good replacement for array_key_exists.
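A quick standalone sketch of the isset / array_key_exists / empty behavior discussed in this thread (plain PHP, nothing Laravel-specific):

$row = ['name' => 'Alice', 'deleted_at' => null];

// Key exists but the value is null:
var_dump(array_key_exists('deleted_at', $row)); // true
var_dump(isset($row['deleted_at']));            // false (isset() is false for null)
var_dump(empty($row['deleted_at']));            // true  (null counts as "empty")

// Key does not exist at all; neither isset() nor empty() emits a warning:
var_dump(array_key_exists('missing', $row));    // false
var_dump(isset($row['missing']));               // false
var_dump(empty($row['missing']));               // true

// Typical fallback when the key may be missing:
$email = $row['email'] ?? ''; // the null coalescing operator also avoids the "undefined key" warning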
Wow, what a great comparison! Thank you very much.
I had no idea there were so many ways to export/import Excel/CSV files.
Time to optimize!
This is really helpful; I was having trouble with Laravel Excel.
In my humble opinion, your measurements are very close to reality.
Logically, the handwritten export code using arrays is the most optimized in terms of performance and memory, because:
1) In terms of performance, you're using the toArray() method, which converts the collection into an array of associative arrays. That way you cut out the effort of transforming the collection into an array, and of transforming every item from an Illuminate\Database\Eloquent\Model instance into an associative array, and it happens only once, before the actual loop starts.
2) In terms of memory, you're just taking each entry and writing it directly into the stream that points to the CSV file, so nothing needs to be kept in memory. Plus, you're not sending the exported data directly from the stream to the browser; you're sending the finished file. This explains why not much memory is consumed.
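As a rough sketch of that kind of hand-rolled export (not the exact code from the video; the User model and the selected columns are placeholders):

use App\Models\User;

Route::get('/users/export', function () {
    $path = storage_path('app/users.csv');
    $out  = fopen($path, 'w');

    fputcsv($out, ['ID', 'Name', 'Email']); // header row

    User::query()
        ->select(['id', 'name', 'email'])
        ->chunkById(1000, function ($users) use ($out) {
            // toArray() converts the chunk to plain associative arrays up front,
            // so the loop below only pushes values into the file stream.
            foreach ($users->toArray() as $row) {
                fputcsv($out, $row);
            }
        });

    fclose($out);

    return response()->download($path, 'users.csv');
});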
WTF! 😅 This proves that you shouldn't use third-party packages all the time; sometimes the raw approach is better.
If you watched the whole video, it wasn't outright better. Spatie outperformed raw on imports.
@@AdamPerkinsOneAndOnly just like he said - not all the time ;-)
This video is particularly useful. Thank you very much.
This is incredible! I would have wanted to know this a year ago
Great and very interesting content! Thanks for sharing!
If you use fgetcsv instead of str_getcsv, maybe you won't have "high" memory usage. "$users = array_map('str_getcsv', file($request->file));" puts the whole CSV into RAM.
And if you add generator logic (as I suppose the Spatie package does), you will likely use even less memory.
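A rough sketch of the fgetcsv-plus-generator idea (the file path is a placeholder; this is not code from the video):

function readCsvRows(string $path): \Generator
{
    $handle = fopen($path, 'r');

    // fgetcsv() reads one line at a time, so only the current row is in memory,
    // unlike array_map('str_getcsv', file($path)), which loads the whole file.
    while (($row = fgetcsv($handle)) !== false) {
        yield $row;
    }

    fclose($handle);
}

foreach (readCsvRows(storage_path('app/users.csv')) as $row) {
    // insert / process one row at a time here
}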
Treat it a bit like a commercial...
Laravel Excel has a batch import feature which reduces memory usage by not using Eloquent but direct DB queries to insert... For exports, I think there is another option: laracsv by usmanhalalit.
Can we use spatie/simple-excel to download a file in a REST API flow?
I'm facing an issue with import/export in Laravel 10; it's not working. Could you please suggest which package is best to sort this out? Maatwebsite and rap2hpoutre are not working for me.
Hi, thanks for sharing this amazing video. I will try to implement this concept in my project.
How can I export data to CSV/XLSX in chunks if I only have a plain array of about 17,000 rows? I didn't use any Eloquent model in this case. Could you please explain?
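One possible sketch for that plain-array case ($rows is assumed to already hold the ~17,000 rows as arrays; chunking mainly keeps each write small, since the array itself is already in memory):

$out = fopen(storage_path('app/report.csv'), 'w');

// Process the already-loaded array in chunks of 1000 rows.
foreach (array_chunk($rows, 1000) as $chunk) {
    foreach ($chunk as $row) {
        fputcsv($out, $row); // write each row straight into the file stream
    }
}

fclose($out);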
Hello, I have a big question, please: what are the advantages of using the DomPDF / Maatwebsite\Excel packages to export, instead of just using the normal frontend JavaScript DataTables with its export-to-PDF/Excel buttons?
You forgot fast-excel! For large data it's, in my opinion, the best.
Good morning, and thanks for the video!
When I try to export data from my database I get "Call to a member function lazy() on array". How can I fix this?
Can you build a simple editor for me?
Please let me know if you can help.
I need your help!
Good morning. Please use the map method instead of foreach in Laravel.
Hello, I have a challenge here. I need to create a TXT file based on a bank manual here in Brazil; we call them CNAB files. Each piece of information needs to be written in a specific column range. Example: columns 1 to 40 - customer's name, columns 41 to 50 - customer's code, and so on. These files will be used to send information to the bank's online system. What's the best way to go about it? Congratulations on the channel.
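One possible sketch for fixed-width lines like that, using str_pad; the two fields below only follow the example in the comment (columns 1-40 and 41-50), and a real CNAB layout has many more fields and stricter rules:

function cnabLine(string $customerName, string $customerCode): string
{
    return str_pad(mb_substr($customerName, 0, 40), 40, ' ')                // columns 1-40
         . str_pad(mb_substr($customerCode, 0, 10), 10, '0', STR_PAD_LEFT); // columns 41-50
}

$out = fopen(storage_path('app/remessa.txt'), 'w');

foreach ($customers as $customer) { // $customers is an assumed iterable of ['name' => ..., 'code' => ...]
    fwrite($out, cnabLine($customer['name'], $customer['code']) . "\r\n");
}

fclose($out);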
Does Laravel Excel work with Laravel 9? Please advise 😀
Yes, it does, with version 3.1.
cool video
And if you're using MySQL, there's a faster, more efficient way to import large CSVs: "LOAD DATA INFILE..."
You mean LOAD DATA LOCAL INFILE. I use it to import millions of rows every day. 👍🏻
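A rough sketch of how that can look from Laravel; the table and column names are placeholders, and LOCAL also requires local_infile to be enabled on the MySQL server and PDO::MYSQL_ATTR_LOCAL_INFILE on the connection:

use Illuminate\Support\Facades\DB;

$path = storage_path('app/users.csv');

// Let MySQL parse and insert the CSV itself, bypassing row-by-row PHP inserts.
DB::unprepared("
    LOAD DATA LOCAL INFILE '{$path}'
    INTO TABLE users
    FIELDS TERMINATED BY ','
    OPTIONALLY ENCLOSED BY '\"'
    LINES TERMINATED BY '\\n'
    IGNORE 1 LINES
    (name, email, created_at)
");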
$collection = collect(['taylor', 'abigail', null])->map(function ($name) {
return $name;
});
What about Maatwebsite?
That's the same package, Laravel Excel.
I disapprove of the way you're measuring the time. The sample size is too small for a reasonably precise timing. You need to run each operation multiple times and calculate the average, instead of running it just once. Even local machines can have a big discrepancy in processing time between runs: one operation can take just 240 ms the first time but 1500 ms the second time. That's why benchmark operations are executed multiple times (see the sketch after this thread).
That's behind the scenes, but I did run those multiple times while testing BEFORE shooting the actual video.
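A tiny sketch of that run-it-several-times-and-average idea ($runExport is a hypothetical callable wrapping the operation under test):

$timings = [];

for ($i = 0; $i < 10; $i++) {
    $start = hrtime(true);
    $runExport();                               // the operation under test
    $timings[] = (hrtime(true) - $start) / 1e6; // nanoseconds -> milliseconds
}

printf("avg: %.1f ms over %d runs\n", array_sum($timings) / count($timings), count($timings));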
I don't think you're teaching newbies, because you did things a newbie could never understand how you arrived at.
Dammmm ....
I will never use Laravel Excel again.
Using lazy collections will probably solve the memory issues (see the sketch after this thread).
Yes, maybe, but he used what's advised in the Laravel Excel package docs, right?
Lazy collections = generators = Spatie's implementation.
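A small sketch of the lazy-collection idea from this thread (assumed User model and columns): cursor() returns a LazyCollection backed by a generator, so rows are hydrated one at a time instead of all at once.

use App\Models\User;

$out = fopen(storage_path('app/users.csv'), 'w');
fputcsv($out, ['ID', 'Name', 'Email']);

// cursor() yields one hydrated model at a time via a generator.
foreach (User::query()->select(['id', 'name', 'email'])->cursor() as $user) {
    fputcsv($out, [$user->id, $user->name, $user->email]);
}

fclose($out);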
Very bad way of teaching this, man.
rap2hpoutre/fast-excel
This package is much faster than the other 3 examples.
The only problem here is that box/spout is abandoned and Raphaël's package sits on top of that. There is a replacement for Spout called OpenSpout and there is a PR awaiting approval in fast-excel to migrate to OpenSpout v3 as it is a drop-in replacement for Spout v3.
Meanwhile, OpenSpout is now at version 4 and that will require rewriting fast-excel as there are some breaking changes.
Hopefully, fast-excel will remain very fast and keep its ability to work with ODS spreadsheets as well, which is a major advantage for me.