This is the main code of my method for importing an Excel file (using version 2 of the package) into the database:
$data = Excel::selectSheetsByIndex(0)->load($file, function($reader) {})->get()->toArray();
DB::beginTransaction();
try {
    foreach ($data as $key => $value) {
        $med = trim($value["med"]);
        $serial = trim($value["nro.seriemedidor"]);
        DB::table('medidores')->insert([
            "med" => $med,
            "serial_number" => $serial
        ]);
    }
    DB::commit();
} catch (\Exception $e) {
    DB::rollback();
    return redirect()->route('myroute')->withErrors("Some error message");
}

This works fine when I have "little" data (say, fewer than 5000 rows in the Excel file). But I need to process a large Excel file with 1.4 million rows, split across more than one worksheet. How can I make my method faster? Any tips?
EDIT: I am updating this question with the code from the link posted in one of the comments on the answer:
$data = Excel::selectSheetsByIndex(0)->load($file, function($reader) {})->get()->toArray();
DB::beginTransaction();
try {
    $bulk_data = [];
    foreach ($data as $key => $value) {
        $med = trim($value["med"]);
        $serial = trim($value["nro.seriemedidor"]);
        $bulk_data[] = ["med" => $med, "serial_number" => $serial];
    }
    $collection = collect($bulk_data); // turn the data into a collection
    $chunks = $collection->chunk(100); // split it into chunks of 100
    // loop through the chunks, one insert per chunk:
    foreach ($chunks as $chunk) {
        DB::table('medidores')->insert($chunk->toArray());
    }
    DB::commit();
} catch (\Exception $e) {
    DB::rollback();
    return redirect()->route('myroute')->withErrors("Some error message");
}

This is what ended up working for me.
Answered on 2018-05-14 00:14:02
Yes. Instead of issuing X (the number of database requests) * N (the number of worksheets) requests, you can try a single bulk insert, which costs you only the loop over the data and saves you the X * N database requests. Here is an example:
$data = Excel::selectSheetsByIndex(0)->load($file, function($reader) {})->get()->toArray();
DB::beginTransaction();
try {
    $bulk_data = [];
    foreach ($data as $key => $value) {
        $med = trim($value["med"]);
        $serial = trim($value["nro.seriemedidor"]);
        $bulk_data[] = ["med" => $med, "serial_number" => $serial];
    }
    DB::table('medidores')->insert($bulk_data);
    DB::commit();
} catch (\Exception $e) {
    DB::rollback();
    return redirect()->route('myroute')->withErrors("Some error message");
}

For more explanation about database requests, you can refer to this answer: https://stackoverflow.com/a/1793209/8008456
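The speedup comes from what the bulk insert compiles to: one multi-row INSERT statement (one round-trip, one parse) instead of one statement per row. A minimal sketch in plain PDO showing the statement being built; the `medidores(med, serial_number)` table and the `$pdo` connection are assumptions from the question. Be aware that a single statement for all 1.4M rows can hit limits such as MySQL's `max_allowed_packet`, which is why chunking the insert (as in the edit above) is common in practice:

```php
<?php
// Sketch: build one multi-row INSERT instead of N single-row INSERTs.
$rows = [
    ['med' => 'A1', 'serial_number' => '001'],
    ['med' => 'B2', 'serial_number' => '002'],
];

// Build "(?, ?), (?, ?), ..." with one placeholder pair per row.
$placeholders = implode(', ', array_fill(0, count($rows), '(?, ?)'));
$sql = "INSERT INTO medidores (med, serial_number) VALUES $placeholders";

// Flatten the values in the same order as the placeholders.
$params = [];
foreach ($rows as $row) {
    $params[] = $row['med'];
    $params[] = $row['serial_number'];
}

// $pdo->prepare($sql)->execute($params); // one round-trip for all rows
```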
https://stackoverflow.com/questions/50321679