Laravel & MySQL: Duplicate entry inside loop and error on mass insertion to table
I have this function that uploads images to Amazon S3 and stores information about each image in my "images" table.
Everything works fine on my localhost (the upload to Amazon S3 and the data stored in the table), which runs PHP 5.5.6. On my web server, which runs PHP 5.5.2, I can upload the images to Amazon S3 but NOT insert the data into my table when doing the mass/bulk insert:
/*
 * HANDLES FILE UPLOAD
 */
public function fileUpload($user_id, $car_id) {
    if (Input::hasFile('images')) {
        # Initialize array for mass inserting into table
        $insert = array();
        # General info
        date_default_timezone_set('America/Mexico_City');
        $created_at = new DateTime;
        $files = Input::file('images');
        $main_img = Input::all()["image-upload"][0];
        $countlimit = 0;
        try {
            foreach ($files as $file) {
                if ($countlimit >= 25 || in_array($file->guessClientExtension(), ['jpg','jpeg','png','gif']) == false) continue;
                $countlimit++;
                # Image data
                $image_id = mt_rand(1000000000, 9999999999);
                $extension = $file->guessClientExtension();
                $filename = $user_id.'/'.$car_id.'/'.$image_id.".".$extension;
                $path = $file->getRealPath();
                $is_main = ($file->getClientOriginalName() == $main_img) ? 1 : NULL;
                # UPLOAD TO AMAZON S3
                $s3 = AWS::get('s3');
                $obj = array(
                    'Bucket' => $_ENV['aws_bucket'],
                    'Key' => 'cars/'.$filename,
                    'SourceFile' => $path,
                    'ACL' => 'public-read',
                );
                $result = $s3->putObject($obj);
                # ARRAY FOR STORING IMAGE DATA
                $insert[] = array(
                    'car_id' => $car_id,
                    'image_id' => $image_id,
                    'image_extension' => $extension,
                    'is_main' => $is_main,
                    'url' => $filename, # $url, FOR IMGIX JUST STORE $FILENAME
                    'created_at' => $created_at
                );
            }
            # Mass insertion
            DB::table('images')->insert($insert);
        } catch (Exception $e) {
            return false;
        }
        return true;
    } else {
        return false;
    }
}
Now, when I try to bulk-insert into my database this way, the image data does not get stored; the strange thing is that it works if I execute a single query.
The other thing I've tried is doing the query/insertion on each iteration, but then I get a 'Duplicate entry' error for 'image_id'. What could it be? Thanks in advance.
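For reference, the per-iteration variant I tried replaces the $insert[] accumulation and the final mass insert with a single-row insert inside the loop, roughly like this (a sketch using the same variables as above):

# Inside the foreach, instead of accumulating rows in $insert:
DB::table('images')->insert(array(
    'car_id'          => $car_id,
    'image_id'        => $image_id,
    'image_extension' => $extension,
    'is_main'         => $is_main,
    'url'             => $filename,
    'created_at'      => $created_at
));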
If you actually catch the exception that is most likely being thrown, namely a PDOException, in your try/catch, you will probably find some useful information that will let you identify the error yourself.
I'm assuming you have PDO set up to throw exceptions.
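Laravel's MySQL connector enables exception mode by default, but if you want to be explicit you can set it in the connection config yourself; a minimal sketch (the 'options' array is handed straight to the underlying PDO constructor):

// app/config/database.php (Laravel 4) -- force PDO to throw on SQL errors
'mysql' => array(
    'driver'  => 'mysql',
    // ... host, database, username, password as you already have them ...
    'options' => array(
        PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION,
    ),
),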
try {
    foreach ($files as $file) {
        if ($countlimit >= 25 ||
            in_array($file->guessClientExtension(), ['jpg','jpeg','png','gif']) == false)
            continue;
        $countlimit++;
        # Image data
        $image_id = mt_rand(1000000000, 9999999999);
        $extension = $file->guessClientExtension();
        $filename = $user_id.'/'.$car_id.'/'.$image_id.".".$extension;
        $path = $file->getRealPath();
        $is_main = ($file->getClientOriginalName() == $main_img) ? 1 : NULL;
        # UPLOAD TO AMAZON S3
        $s3 = AWS::get('s3');
        $obj = array(
            'Bucket' => $_ENV['aws_bucket'],
            'Key' => 'cars/'.$filename,
            'SourceFile' => $path,
            'ACL' => 'public-read',
        );
        $result = $s3->putObject($obj);
        # ARRAY FOR STORING IMAGE DATA
        $insert[] = array(
            'car_id' => $car_id,
            'image_id' => $image_id,
            'image_extension' => $extension,
            'is_main' => $is_main,
            'url' => $filename, # $url, FOR IMGIX JUST STORE $FILENAME
            'created_at' => $created_at
        );
    }
    # Mass insertion
    DB::table('images')->insert($insert);
}
// added catch of PDOException
catch (PDOException $e) {
    echo $e->getMessage();
}
catch (Exception $e) {
    return false;
}