Thursday, February 27, 2014

Script to get Facebook comments, shares, and likes for a list of URLs

Another little helper script, this time tapping into the Facebook API to get comments, shares, and likes.
Access to this API does not require a login or an account, but it cuts off after roughly 400 requests. That is far more than I usually need, so the script does nothing to work around the limit. (One could add a timer, e.g. 'sleep 600' after every 400 loops or so.)
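The timer idea could be sketched like this. The helper function and its name are my own construction, not part of the original script; the 400-request count and 'sleep 600' come from the post's own estimate:

```shell
# Sketch: count requests and pause after every 400 of them.
LIMIT=400
count=0

throttle() {
    # call once per processed URL, right after the wget calls
    count=$((count + 1))
    if [ $((count % LIMIT)) -eq 0 ]; then
        sleep 600   # ten-minute pause before the next batch
    fi
}
```

Calling throttle at the bottom of the loop body keeps the request rate under the cap without changing the rest of the script.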

First, write a header line into the output file; the output file name is derived from the input file $1 (the first parameter passed to the script). Then loop over each URL. Each iteration makes three wget calls to the Facebook API, one per value.
# write the header line; output file name derived from the input file $1
echo -e "FB-comments\tFB-shares\tFB-likes\tUrls" > "${1}"-store.csv

while read -r line; do
    # generate the request URL -- the original post does not show this step;
    # the login-free FQL link_stat table is assumed here
    pull="https://graph.facebook.com/fql?q=SELECT+comment_count+FROM+link_stat+WHERE+url=%27${line}%27"
    # pull the data with wget, strip the JSON wrapping with sed,
    # and store the result in comment_count
    comment_count=$(wget -qO- "$pull" | sed -e 's/^.*://g' -e 's/\}//g' -e 's/\(]\)//g')

    # now the same for shares and likes
    pull="https://graph.facebook.com/fql?q=SELECT+share_count+FROM+link_stat+WHERE+url=%27${line}%27"
    share_count=$(wget -qO- "$pull" | sed -e 's/^.*://g' -e 's/\}//g' -e 's/\(]\)//g')
    #echo $share_count
    pull="https://graph.facebook.com/fql?q=SELECT+like_count+FROM+link_stat+WHERE+url=%27${line}%27"
    like_count=$(wget -qO- "$pull" | sed -e 's/^.*://g' -e 's/\}//g' -e 's/\(]\)//g')
    #echo $like_count

    # append all three values plus the URL as one tab-separated row
    echo -e "${comment_count}\t${share_count}\t${like_count}\t${line}" >> "${1}"-store.csv
done < "${1}"
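For illustration, here is the sed cleanup applied to a canned response; the exact JSON shape is an assumption about what the API returned at the time:

```shell
# canned response standing in for the wget output (assumed shape)
response='[{"comment_count":12}]'
# same three sed expressions as in the script: strip everything up to the
# last colon, then the closing brace, then the closing bracket
count=$(printf '%s' "$response" | sed -e 's/^.*://g' -e 's/\}//g' -e 's/\(]\)//g')
printf '%s\n' "$count"   # -> 12
```

The first expression is greedy, so anything before the last colon goes, and the remaining two expressions peel off the trailing } and ], leaving just the number.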
I chose to store the data in variables and then concatenate them into one line with echo, because that makes it easy to separate the fields with tabs; echoing each item into the file separately would have required removing the trailing line break (\n) each time.
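For illustration, the tab-joined row looks like this (the values and the URL are made up); printf is a portable alternative to echo -e for assembling it:

```shell
# dummy values standing in for the counts pulled from the API
comment_count=3; share_count=7; like_count=42; line="http://example.com/"
# join the four fields with tabs into a single row
row=$(printf '%s\t%s\t%s\t%s' "$comment_count" "$share_count" "$like_count" "$line")
printf '%s\n' "$row"
```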
