Showing posts with label blogging. Show all posts

Thursday, March 27, 2014

Rackspace blog - ripples, likes, shares, comments


This time Rackspace is in focus: how does the Rackspace blog do in Facebook comments, likes, and shares? Pretty well, I would say, and the same goes for its ripples. It is not an industry blog like Moz with the public shares we can see there, but these are still solid numbers for a company blog, roughly in the range of the coveted Moz guest post blog.




Again, I used these scripts to get the Facebook data, and this one to get the Google ripples.

Monday, January 27, 2014

Guest blogging on Moz.com: Good location to get ripples

Last week I checked the company blog of moz.com for ripples and found truly astounding numbers. This week I decided to check their user-generated content - you find it at moz.com/ugc.

First, I pulled all URLs from the sitemap moz.com/ugc-sitemap.xml, then again used the ripple script to get the number of ripples (public shares) per URL. Each URL stands for one guest blog post.
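Extracting the URLs can be done with a short grep/sed pipeline; a minimal sketch, assuming the sitemap was saved locally first (e.g. with curl -s http://moz.com/ugc-sitemap.xml > ugc-sitemap.xml):

```shell
#!/bin/bash
# pull every <loc> entry out of the locally saved sitemap,
# and strip the tags so one URL per line remains
grep -o '<loc>[^<]*</loc>' ugc-sitemap.xml | sed -e 's/<loc>//' -e 's#</loc>##'
```

Each line of the output can then be fed to the ripple script.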

The results are impressive again: the user-generated content, 1708 posts in total, generates 1251 ripples, with the top two posts at 73 and 56 ripples.
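Totalling such per-URL counts is a one-liner; a sketch, assuming a hypothetical file ripples.txt with one share count per line as collected by the ripple script:

```shell
#!/bin/bash
# sum all per-URL ripple counts in ripples.txt (one number per line)
awk '{ sum += $1 } END { print sum }' ripples.txt
# the top posts are just the largest values:
sort -rn ripples.txt | head -2
```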

The author of the top post, http://moz.com/ugc/google-plus-authorship-one-critical-thing-you-need-to-know, +Samuel Scott, is currently in 222 people's circles on G+, and I doubt there are many other places, if any, where he could have gotten as many shares (nor could I or many others, just to be clear).


Again: Moz is a great place for SEO content, and its narrow focus on inbound marketing is highly beneficial for readers, for writers, and for the company.
Imagine adding 1251 ripples to YOUR site with guest blogging!

Tuesday, January 7, 2014

Check sharpening modes with imagemagick for blogging

Blogging needs pictures - and the pictures need to be good and small. To reduce pictures in size, I use ImageMagick and run it on a folder as described in this recent script.

Depending on the source and target size of the picture, its darkness, and its details, different settings produce the best results - there is no single best setting. So how do you efficiently figure out which one is best?

I move the pictures I want to use into a separate folder and then run the script below to try several settings of the 'unsharp' option. Then I open the results in a viewer, flip back and forth between pictures, and delete the weaker one each time until I am left with the one I will use.
#!/bin/bash
# create the target folder (named after the target size) if it does not exist
if [[ ! -d $1 ]]
then mkdir "$1"
fi
for i in *.jpg
do echo "processing $i"
convert "$i" -resize "${1}>" -quality 25%  "$1"/"$i"_0s    # no sharpen
convert "$i" -resize "${1}>" -quality 25% -unsharp 1.2x1.2+1+0 "$1"/"$i"_1s   # best with raffia - high contrast
convert "$i" -resize "${1}>" -quality 25% -unsharp 1.2x1.2+0.5+0 "$1"/"$i"_2s # best for 600 px tent (dark)
convert "$i" -resize "${1}>" -quality 25% -unsharp 0x1.2+0.5+0 "$1"/"$i"_3s # best for 800px tent
convert "$i" -resize "${1}>" -quality 25% -unsharp 0x1.2+5+0 "$1"/"$i"_4s # good with raffia
convert "$i" -resize "${1}>" -quality 25% -unsharp 0x1.4+5+0 "$1"/"$i"_5s # good with raffia
convert "$i" -resize "${1}>" -quality 25% -unsharp 1.5x1+0.7+0.02 "$1"/"$i"_6s # forum 2 // good outside
convert "$i" -resize "${1}>" -quality 25% -unsharp 0x0.75+0.75+0.008 "$1"/"$i"_7s # forum 1 // good outside
convert "$i" -resize "${1}>" -quality 25% -unsharp 0x6+0.5+0 "$1"/"$i"_8s # gimp   // good outside

done
There is a little shortcut here: Linux does not seem to care about the file extension (jpg) and displays the files fine anyway. This way I can keep the filename first and the variation number of the script last, which, in the photo viewer, keeps all variations of the same picture next to each other for the back-and-forth deletion process. At the end I just need to rename the final picture for use.
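That final rename can use bash suffix stripping; a sketch with a hypothetical surviving variant named sunset.jpg_3s:

```shell
#!/bin/bash
# strip the "_<n>s" variant suffix to recover the usable filename
f="sunset.jpg_3s"        # hypothetical survivor of the deletion round
echo "${f%_[0-9]s}"      # prints sunset.jpg
```

So mv "$f" "${f%_[0-9]s}" restores the usable name in one step.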

Tuesday, December 31, 2013

Reduce pictures with a script for imagemagick

Blogging is fun, but it can be quite some effort. One necessary task is scaling pictures so they fit into the blog, are large and sharp enough to show the necessary details, but are also as small as possible for great page load times.

The best results I can get are with Photoshop, which also has a nice batch option. On Windows, IrfanView is a great tool that automates this very easily with pretty good quality as well. My tool of choice on Linux is ImageMagick. While it has tons of options, the settings below work great for me.

I start this script from the folder containing the pictures. It takes one parameter on the command line: the length wanted for the longer side. So calling it like 'image-resize.sh 800' is the way to go.
It first checks whether the target folder exists and creates it if not; then it renames all filenames in the start folder to lowercase and renames .jpeg to .jpg so that all JPEGs are picked up by the ImageMagick loop.
#!/bin/bash
# create the target folder (named after the target size) if it does not exist
if [[ ! -d $1 ]]
      then mkdir "$1"
fi
# normalize filenames: lowercase everything, then .jpeg -> .jpg
rename 'y/A-Z/a-z/' *
rename 's/\.jpeg$/.jpg/' *

for i in *.jpg
do convert "$i" -resize "${1}^>" -quality 25% -unsharp 1.2x1.2+1+0 "$1"/s_"$i"
done

It then reduces pictures whose larger side (height or width) exceeds the given value (e.g. 800 px) down to exactly that value. It maintains the aspect ratio, sharpens, and reduces the quality to 25% - a value I found to be the sweet spot between quality and file size for many of my pictures. The final step is to prefix the filename with s_ and write the result into that folder. The most important insight (from a forum) was the 'value^>' geometry setting, which applies the value to the longer side.