https://packagist.org/packages/bayareawebpro/laravel-simple-csv
- Import to LazyCollection.
- Export from Collection, LazyCollection, Iterable, Generator, Array.
- Lower Memory Consumption through LazyCollection Generators.
- Uses Native PHP SplFileObject.
- Facade Included.
Simply require the package and Laravel will Auto-Discover the Service Provider.
composer require bayareawebpro/laravel-simple-csv
<?php
use BayAreaWebPro\SimpleCsv\SimpleCsv;
$lazyCollection = SimpleCsv::import(storage_path('collection.csv'));
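A quick sketch of consuming the returned LazyCollection in chunks. It assumes each row comes back keyed by the CSV header row; the users table and column names are hypothetical.
<?php
use BayAreaWebPro\SimpleCsv\SimpleCsv;
use Illuminate\Support\Facades\DB;

// Rows are read lazily from disk, so memory stays flat even for large files.
SimpleCsv::import(storage_path('collection.csv'))
    ->chunk(500)
    ->each(function ($rows) {
        // Insert 500 rows per query (hypothetical table and columns).
        DB::table('users')->insert(
            $rows->map(fn ($row) => [
                'name'  => $row['name'],
                'email' => $row['email'],
            ])->all()
        );
    });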
<?php
use BayAreaWebPro\SimpleCsv\SimpleCsv;
use Illuminate\Support\Collection;
use Illuminate\Support\LazyCollection;
// Collection
SimpleCsv::export(
Collection::make(...),
storage_path('collection.csv')
);
// LazyCollection
SimpleCsv::export(
LazyCollection::make(...),
storage_path('collection.csv')
);
// Generator (Cursor)
SimpleCsv::export(
User::query()->where(...)->limit(500)->cursor(),
storage_path('collection.csv')
);
// Array
SimpleCsv::export(
[...],
storage_path('collection.csv')
);
<?php
use BayAreaWebPro\SimpleCsv\SimpleCsv;
return SimpleCsv::download([...], 'download.csv');
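The response can be returned straight from a route or controller. A minimal sketch, with a made-up route path and row data:
<?php
use BayAreaWebPro\SimpleCsv\SimpleCsv;
use Illuminate\Support\Facades\Route;

// Hypothetical route returning the generated CSV as a file download.
Route::get('/export', function () {
    return SimpleCsv::download(
        [
            ['name' => 'Jane', 'email' => 'jane@example.com'],
            ['name' => 'John', 'email' => 'john@example.com'],
        ],
        'download.csv'
    );
});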
<?php
use Illuminate\Support\Facades\Config;
Config::set('simple-csv.delimiter', ...);
Config::set('simple-csv.enclosure', ...);
Config::set('simple-csv.escape', ...);
<?php
// config/simple-csv.php
return [
    'delimiter' => ",",
    'enclosure' => "\"",
    'escape'    => "\\",
];
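For example, to read a tab-separated file you might override the delimiter at runtime before importing (the file path here is made up):
<?php
use BayAreaWebPro\SimpleCsv\SimpleCsv;
use Illuminate\Support\Facades\Config;

// Switch to tab-delimited parsing for this request, then import as usual.
Config::set('simple-csv.delimiter', "\t");

$rows = SimpleCsv::import(storage_path('report.tsv'));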
A file-splitting utility is included that breaks large CSV files into chunks (while retaining column headers), which you can move or delete after importing. This helps automate the import of large data sets.
Tip: Find your Bash Shell Binary Path: which sh
/bin/sh vendor/bayareawebpro/laravel-simple-csv/split-csv.sh /Projects/laravel/storage/big-file.csv 5000
File Output:
/Projects/laravel/storage/big-file-chunk-1.csv (chunk of 5000)
/Projects/laravel/storage/big-file-chunk-2.csv (chunk of 5000)
/Projects/laravel/storage/big-file-chunk-3.csv (chunk of 5000)
etc...
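After splitting, the chunks can be imported one at a time and removed as they are processed. A rough sketch, with the per-row handling left to you:
<?php
use BayAreaWebPro\SimpleCsv\SimpleCsv;

// Import each chunk in turn; every chunk keeps the original header row.
foreach (glob(storage_path('big-file-chunk-*.csv')) as $chunk) {
    SimpleCsv::import($chunk)->each(function ($row) {
        // Handle a single row here.
    });

    // Remove the chunk once it has been imported.
    unlink($chunk);
}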
- Using Lazy Collections is the preferred method.
- Using the queue worker, you can import several thousand rows at a time without much impact.
- Be sure to use "Database Transactions" and "Timeout Detection" to ensure safe imports (see the sketch below).
- Article: How to Insert & Update Many at Once
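Putting the tips together, a hypothetical queued job that imports one chunk inside a database transaction, with a worker timeout for safety. The class name, table, column mapping, and chunk size are all assumptions, not part of the package.
<?php

namespace App\Jobs;

use BayAreaWebPro\SimpleCsv\SimpleCsv;
use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Foundation\Bus\Dispatchable;
use Illuminate\Queue\InteractsWithQueue;
use Illuminate\Queue\SerializesModels;
use Illuminate\Support\Facades\DB;

class ImportCsvChunk implements ShouldQueue
{
    use Dispatchable, InteractsWithQueue, Queueable, SerializesModels;

    // Let the worker kill the job if the import hangs (timeout detection).
    public int $timeout = 120;

    public function __construct(public string $path)
    {
    }

    public function handle(): void
    {
        // Wrap the whole chunk in a transaction so a failure rolls back cleanly.
        DB::transaction(function () {
            SimpleCsv::import($this->path)
                ->chunk(1000)
                ->each(fn ($rows) => DB::table('records')->insert($rows->all()));
        });
    }
}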