PHP script dies when it calls a Bash script, maybe a problem of server configuration

Posted by user347501 on Stack Overflow
Published on 2010-05-21T21:42:33Z Indexed on 2010/05/21 21:50 UTC
Hit count: 387


Hi!!!

I have a problem with a PHP script that calls a Bash script. The PHP script receives an uploaded XML file, then calls a Bash script that splits the file into chunks of 10,000 lines each (for example, a 30,000-line XML upload is split into three files of 10,000 lines each).
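As an aside, if the goal is just fixed-size chunks, coreutils can do the split in a single pass. This is only a sketch (file name and sizes are illustrative, `-d` numeric suffixes assume GNU coreutils, and it does not reproduce the header-line handling the Bash script below does):

```shell
# Sketch: split a 30,000-line file into 10,000-line pieces in one pass.
seq 1 30000 > datasource.xml             # stand-in for the uploaded XML
split -l 10000 -d datasource.xml part_   # writes part_00, part_01, part_02
wc -l part_00 part_01 part_02            # each piece has exactly 10,000 lines
```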

The file is uploaded and the Bash script splits it, but when the Bash script returns control to the PHP script, the PHP script dies, and I don't know why. I tested the same scripts on another server and they work fine. I don't believe it's a memory problem; maybe it's a processor problem, but I really don't know. What can I do? (I'm using PHP's shell_exec() function to call the Bash script.)

The error only happens when the XML file has more than about 8,000 lines; with fewer than 8,000 lines everything works (the exact threshold varies, since it depends on how much data each line contains).
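A failure that starts at a particular file size often points at a time limit (PHP's max_execution_time, or a web-server/FastCGI timeout — this is an assumption, not something the question confirms) rather than memory: the loop in the Bash script below re-reads the whole file once per line, so its cost grows with the square of the line count. A rough back-of-the-envelope illustration (these are counts of line reads, not measurements):

```shell
# Total line reads for a loop that rescans the file once per line: n * n.
for n in 1000 8000 30000; do
  reads=$(( n * n ))
  echo "$n lines -> $reads line reads"
done
# 8,000 lines already costs 64 million line reads; 30,000 costs 900 million.
```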

What would you suggest? (Sorry for my bad English, I need to practice a lot.) Here is the code.

PHP script (at the end, after the ?>, there is HTML and JavaScript code, but the HTML didn't survive posting, only the JavaScript; the HTML is basically just the form that uploads the file):


" . date('c') . ": $str
"; $file = fopen("uploadxmltest.debug.txt","a"); fwrite($file,date('c') . ": $str\n"); fclose($file); } try{ if(is_uploaded_file($_FILES['tfile']['tmp_name'])){ debug("step 1: the file was uploaded"); $norg=date('y-m-d')."_".md5(microtime()); $nfle="testfiles/$norg.xml"; $ndir="testfiles/$norg"; $ndir2="testfiles/$norg"; if(move_uploaded_file($_FILES['tfile']['tmp_name'],"$nfle")){ debug("step 2: the file was moved to the directory"); debug("memory_get_usage(): " . memory_get_usage()); debug("memory_get_usage(true): " . memory_get_usage(true)); debug("memory_get_peak_usage(): " . memory_get_peak_usage()); debug("memory_get_peak_usage(true): " . memory_get_peak_usage(true)); $shll=shell_exec("./crm_cutfile_v2.sh \"$nfle\" \"$ndir\" \"$norg\" "); debug("result: $shll"); debug("memory_get_usage(): " . memory_get_usage()); debug("memory_get_usage(true): " . memory_get_usage(true)); debug("memory_get_peak_usage(): " . memory_get_peak_usage()); debug("memory_get_peak_usage(true): " . memory_get_peak_usage(true)); debug("step 3: the file was cutted.
END"); } else{ debug("ERROR: I didnt move the file"); exit(); } } else{ debug("ERROR: I didnt upload the file"); //exit(); } } catch(Exception $e){ debug("Exception: " . $e->getMessage()); exit(); } ?> Test function uploadFile(){ alert("start"); if(document.test.tfile.value==""){ alert("First you have to upload a file"); } else{ document.test.submit(); } }

Bash script with AWK


#!/bin/bash

# For single messages (one message per contact).
# Arguments: $1 = source XML file, $2 = output directory,
#            $3 = base name for the chunk files, $4 = header line.
function cutfile(){
 lines=$( awk 'END {print NR}' "$1" )
 fline="$4"

 if [ ! -d "$2" ]; then
  mkdir "$2"
 fi

 cp "$1" "$2/datasource.xml"
 cd "$2" || return 1

 i=1
 contfile=1
 while [ "$i" -le "$lines" ]
 do
  # NOTE: this fetches one line per pass over the file, so the loop is
  # quadratic in the line count; `exit` at least stops each pass at the
  # matching line instead of scanning to the end of the file.
  currentline=$( awk -v fl="$i" 'NR==fl {print; exit}' "datasource.xml" )

  # the first file starts with the header line
  if [ "$i" -eq 1 ]; then
   echo "$fline" >>"$3_1.txt"
  else
   # start a new file every 10,000 contacts
   rsd=$(( ( i - 2 ) % 10000 ))
   if [ "$rsd" -eq 0 ]; then
    echo "" >>"$3_$contfile.txt"
    contfile=$(( contfile + 1 ))
    echo "$fline" >>"$3_$contfile.txt"
   fi
  fi

  echo "$currentline" >>"$3_$contfile.txt"
  i=$(( i + 1 ))
 done

 echo "" >>"$3_$contfile.txt"
 return 1
}


# For multiple messages (one message for all contacts) -- not implemented yet.
function cutfile_multi(){
 return 1
}

cutfile "$1" "$2" "$3" "$4"
echo 1
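For comparison, the same chunking can be done in a single pass with awk. This is a sketch, not a drop-in replacement: the header value, chunk size, and file names are placeholders, and it does not reproduce the original's exact chunk boundaries or trailing blank lines.

```shell
fline='HEADER'                 # placeholder for the $4 header line
seq 1 25000 > datasource.xml   # stand-in input
awk -v hdr="$fline" -v size=10000 '
  (NR - 1) % size == 0 {       # first data line of each chunk
    if (out) close(out)
    out = sprintf("out_%d.txt", ++n)
    print hdr > out            # each chunk starts with the header line
  }
  { print > out }
' datasource.xml
wc -l out_1.txt out_2.txt out_3.txt   # 10001, 10001, 5001 (header + data)
```

The file is read exactly once, so the cost is linear in the number of lines instead of quadratic.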

thanks!!!!! =D

