Get the rows that have the same value in one column

I am fetching data from a CSV file based on an index, as shown below:

$filename = $_FILES["file"]["tmp_name"];

if ($_FILES["file"]["size"] > 0) {
    $file = fopen($filename, "r");
    $rowCount = 0;
    $secondColumnArray = array();

    while (($getData = fgetcsv($file, 100000, ",")) !== FALSE) {
        if ($rowCount >= 0) {
            if (strpos($getData[0], "Bestand") !== false) {
                array_push($secondColumnArray, $getData[0]);
            }
            foreach ($secondColumnArray as $all_elements) {
                $all_elements_refine  = explode(',', $all_elements);
                $new_refine           = $all_elements_refine[0];
                $all_elements_refine1 = explode(';', $new_refine);

                $specialChars = array(" ", "\r", "\n", '"', "*");
                $all_elements_refine2 = str_replace($specialChars, "", $all_elements_refine1);
                print_r($all_elements_refine2);
            }
        }
        ++$rowCount;
    }

    //echo "<pre>"; print_r($secondColumnArray); echo "</pre>";
    fclose($file);
}

In the example above I am reading the data starting from index 306, but that is not a good approach for me. I only want the rows whose first column has the value "Bestand".

For example:

Row 1:  Bestand , 1 , Hell , World
Row 2:  farb    , 2 , Hell , World
Row 3:  Bestand , 3 , Hell , World

These are the actual rows from my CSV (check the fourth column in the real CSV); after filtering the CSV I get duplicated rows:

Bestand   ;000000000;"I";00000000842143;000000000;00000095;009598;00000198;000000000;"MK Sterling mondstein         ";"moonstone           ";000000000;000000000;000000800;"20160525";"*           ";000000100;000000000;000008990;000008990;000008990;1900;"MK Sterling mondstein         "            
Bestand   ;000000000;"I";00000000842144;000000000;00000095;009598;00000198;000000000;"MK Sterling mondstein         ";"moonstone           ";000000000;000000000;000000800;"20160525";"*           ";000000100;000000000;000008990;000008990;000008990;1900;"MK Sterling mondstein         "            
Bestand   ;000000000;"I";00000000842148;000000000;00000095;009598;00000198;000000000;"MK Sterling mondstein         ";"moonstone           ";000000000;000000000;000000800;"20160525";"*           ";000000100;000000000;000008990;000008990;000008990;1900;"MK Sterling mondstein         "            
Bestand   ;000000000;"I";00000000842157;000000000;00000095;009598;00000198;000000000;"MK Sterling mondstein         ";"moonstone           ";000000000;000000000;000000800;"20160525";"*           ";000000100;000000000;000008990;000008990;000008990;1900;"MK Sterling mondstein         "            
Bestand   ;000000000;"I";00000000842158;000000000;00000095;009598;00000198;000000000;"MK Sterling mondstein         ";"moonstone           ";000000000;000000000;000000800;"20160525";"*           ";000000100;000000000;000008990;000008990;000008990;1900;"MK Sterling mondstein         "            
Bestand   ;000000000;"I";00000000842161;000000000;00000095;009598;00000198;000000000;"MK Sterling mondstein         ";"moonstone           ";000000000;000000000;000000800;"20160525";"*           ";000000100;000000000;000008990;000008990;000008990;1900;"MK Sterling mondstein         "            
Bestand   ;000000000;"I";00000000842162;000000000;00000095;009598;00000198;000000000;"MK Sterling mondstein         ";"moonstone           ";000000000;000000000;000000800;"20160525";"*           ";000000100;000000000;000008990;000008990;000008990;1900;"MK Sterling mondstein         "            
Bestand   ;000000000;"I";00000000842346;000000000;00000095;009598;00000198;000000000;"MK Sterling grün              ";"green zirconia      ";000000047;000000000;000000800;"20160525";"*           ";000000100;000000000;000008990;000008990;000008990;1900;"MK Sterling grün              "            
Bestand   ;000000000;"I";00000000842349;000000000;00000095;009598;00000198;000000000;"MK Sterling grün              ";"green zirconia      ";000000047;000000000;000000800;"20160525";"*           ";000000100;000000000;000008990;000008990;000008990;1900;"MK Sterling grün              "            

Duplicated rows:

id  A   B   C   D   E   F   G   H   I   J   K   L   M   N   O   P   Q   R   S   T   U   V   W
1   Bestand 000000000   I   00000000842143  000000000   00000095    009598  00000198    000000000   MKSterlingmondstein moonstone   000000000   000000000   000000800   20160525        000000100   000000000   000008990   000008990   000008990   1900    MKSterlingmondstein
2   Bestand 000000000   I   00000000842143  000000000   00000095    009598  00000198    000000000   MKSterlingmondstein moonstone   000000000   000000000   000000800   20160525        000000100   000000000   000008990   000008990   000008990   1900    MKSterlingmondstein
3   Bestand 000000000   I   00000000842144  000000000   00000095    009598  00000198    000000000   MKSterlingmondstein moonstone   000000000   000000000   000000800   20160525        000000100   000000000   000008990   000008990   000008990   1900    MKSterlingmondstein
4   Bestand 000000000   I   00000000842143  000000000   00000095    009598  00000198    000000000   MKSterlingmondstein moonstone   000000000   000000000   000000800   20160525        000000100   000000000   000008990   000008990   000008990   1900    MKSterlingmondstein
5   Bestand 000000000   I   00000000842144  000000000   00000095    009598  00000198    000000000   MKSterlingmondstein moonstone   000000000   000000000   000000800   20160525        000000100   000000000   000008990   000008990   000008990   1900    MKSterlingmondstein

So I only want the rows that have the value "Bestand" in the first column. How can I get them? Please help.
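One direct way (a sketch, not a drop-in replacement: the sample lines and variable names below are made up, but they follow the same shape as the CSV shown above) is to split each line on ";" and compare the trimmed first field against "Bestand", instead of counting rows:

```php
<?php
// Hypothetical sample lines shaped like the semicolon-delimited CSV above.
$lines = [
    'Bestand   ;000000000;"I";00000000842143',
    'farb      ;000000000;"I";00000000842144',
    'Bestand   ;000000000;"I";00000000842148',
];

$bestandRows = [];
foreach ($lines as $line) {
    // str_getcsv() splits one line the same way fgetcsv($file, 0, ';') would.
    $fields = str_getcsv($line, ';');
    // trim() strips the padding spaces, so the comparison is an exact match.
    if (trim($fields[0]) === 'Bestand') {
        $bestandRows[] = $fields;
    }
}

print_r($bestandRows);
```

With `fgetcsv($file, 100000, ";")` the same `trim($getData[0]) === 'Bestand'` test works directly while reading the file, so no hard-coded start index is needed.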

Here is a rewrite:

$filename = $_FILES["file"]["tmp_name"];
if ($_FILES["file"]["size"] > 0) {
    $file = fopen($filename, "r");
    $bestandRows = [];
    // The file is semicolon-delimited, so split on ";" instead of ","
    while (($getData = fgetcsv($file, 100000, ";")) !== false) {
        // Keep only the rows whose first column is "Bestand";
        // trim() strips the padding spaces around the value
        if (trim($getData[0]) === "Bestand") {
            $bestandRows[] = $getData;
        }
    }
    fclose($file);
    // Print once, after the loop, so nothing is repeated
    echo "<pre>"; print_r($bestandRows); echo "</pre>";
}
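If identical rows still appear more than once (as in the "Duplicated rows" table above), they can be collapsed after reading. A minimal sketch, using each serialized row as an array key so that two rows count as duplicates only when every column matches (the `$rows` data here is hypothetical):

```php
<?php
// Hypothetical parsed rows; the first and third are exact duplicates.
$rows = [
    ['Bestand', '000000000', 'I', '00000000842143'],
    ['Bestand', '000000000', 'I', '00000000842144'],
    ['Bestand', '000000000', 'I', '00000000842143'],
];

$unique = [];
foreach ($rows as $row) {
    // serialize() turns the whole row into a string key, so assigning
    // twice with the same key silently overwrites the duplicate.
    $unique[serialize($row)] = $row;
}
$unique = array_values($unique); // re-index: two rows remain

print_r($unique);
```

`array_unique($rows, SORT_REGULAR)` achieves the same result in one call; the loop version just makes the deduplication key explicit.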