grep from a log file to get a count

Posted by subodh1989 on Stack Overflow on 2012-11-16.


I have to get a certain count from log files. The grep statement I am using is like this:

counter_pstn=0
completed_count_pstn=0
rec=0
# take the first field of every matching line and strip the filename prefix to get the number
for rec in `(grep "merged" update_completed*.log | awk '{print $1}' | sed 's/ //g' | cut -d':' -f2)`
do
    if [ $counter_pstn -eq 0 ]
    then
        completed_count_pstn=$rec
    else
        completed_count_pstn=$(($completed_count_pstn+$rec))
    fi
    counter_pstn=$(($counter_pstn+1))
done
echo "Completed Orders PSTN Primary " $completed_count_pstn

But the log file contains data in this format:

2500 rows merged.
2500 rows merged.
2500 rows merged.
2500 rows merged.2500 rows merged.
2500 rows merged.
2500 rows merged.

As a result, it misses the count of one merge (e.g. on line 4 of the output above, where two merges share a single line). How do I modify the grep, or use another command, to get the full count? Note that the number 2500 may differ across logs, so the "rows merged" pattern has to be used to find the counts. I have tried the -o and -w grep options, but it is not working.

Expected output from the above data:

17500

Actual output:

15000
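
One way to get the expected 17500 (a sketch, not from the original post, assuming every count appears in the form "N rows merged."): let grep -o print each match on its own line, so two merges that share a physical line are still counted separately, then sum the numbers with awk. The -h option suppresses the filename prefix that grep adds when searching several files (a GNU/BSD extension, not strict POSIX).

# -o prints one match per occurrence, -h drops the "update_completedX.log:" prefix
completed_count_pstn=$(grep -oh '[0-9][0-9]* rows merged' update_completed*.log | awk '{sum += $1} END {print sum}')
echo "Completed Orders PSTN Primary " $completed_count_pstn

With the sample data above this prints 17500, because all seven occurrences of "rows merged" are summed, including the two that share one line.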

