I would strongly suggest sticking to either linear or logarithmic, as these are de facto standards and will be intuitive for the user; between them they cover most visualization needs. The data you are plotting in your example, is it production data? Is your data always going to look like this, or is this a one-off case? The easiest way to decide between the two is to look at the spread (standard deviation) of your data: if it is small relative to the mean, use linear; if it is large, use logarithmic (the threshold between small and large can be tweaked for your data).
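As a rough illustration of that heuristic, here is a minimal sketch in plain TypeScript. The threshold value is an arbitrary starting point you would tune against your own data, and the function name is mine, not from any library:

```typescript
// Minimal sketch of the heuristic above: pick a scale from the spread of the data.
// The default threshold of 1.0 is arbitrary -- tune it for your data.
function pickScale(values: number[], threshold = 1.0): "linear" | "logarithmic" {
  const mean = values.reduce((sum, v) => sum + v, 0) / values.length;
  const variance =
    values.reduce((sum, v) => sum + (v - mean) ** 2, 0) / values.length;
  const stdDev = Math.sqrt(variance);
  // Coefficient of variation: std dev relative to the mean, so the
  // decision is not dominated by the absolute magnitude of the values.
  const cv = stdDev / Math.abs(mean);
  return cv > threshold ? "logarithmic" : "linear";
}

console.log(pickScale([5, 7, 6, 9]));    // "linear"       -- small spread
console.log(pickScale([5, 7, 6, 9000])); // "logarithmic"  -- huge spread
```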
I would say the visualizations that you have got in the jsFiddle are pretty much what you should use.
1) Linear: you see one data point shooting out while the others are nearly the same value (as a percentage of the largest). You should show it exactly as it is; this gives the user an immediate hint that something is terribly wrong. If you instead try to make the curve look nice, the user will always have to refer back to the raw values and won't be able to draw inferences just by looking at the graph, which is the whole purpose of charts. A good-looking graph is of little use if the user has to go through the trouble of manually inspecting data values to make quick inferences.
2) Logarithmic: when your data is heavily skewed yet there are pockets/clusters of data (not just a single outlier), go for this one; see the sketch after this list.
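To see why the two scales read so differently, consider this quick sketch (the sample values are made up for illustration), which prints the positions each axis would effectively assign:

```typescript
// Made-up sample with one extreme outlier, as in case 1) above.
const values = [5, 7, 6, 9000];

// Linear: the outlier is 1800x the smallest value, so the other points
// collapse into a sliver -- a deliberate, honest signal to the user.
console.log(Math.max(...values) / Math.min(...values)); // 1800

// Log10: the same span shrinks to under 4 axis units, so the small values
// become readable -- useful only if they form genuine clusters to compare.
console.log(values.map((v) => +Math.log10(v).toFixed(2))); // [0.7, 0.85, 0.78, 3.95]
```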
If you really must use some other scale (thoroughly understand your production data first, or you will end up prettifying your test data and messing up the real data), use a similarly standard one like the square or square-root scale from @Dan Thomas's answer. Better still, if possible, derive a generic equation for your data (not the real values, but the ideal shape they should follow). If your data looks like y = A*x^2 + B*x + C, go for a squared scale; if it is of the form y = A*x + B, go for linear; if it is y = A*log(x) + B, go for logarithmic; and so on.
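If you do go down that route, the axis scale is essentially the inverse of the suspected generating function. Here is a minimal sketch; most charting libraries let you supply such a transform, but the names below are illustrative, not from any specific library:

```typescript
// Illustrative axis transforms: each maps a raw value to its
// position on the axis, matching the generic equation forms above.
const scales: Record<string, (v: number) => number> = {
  linear: (v) => v,
  sqrt: (v) => Math.sqrt(v),
  log10: (v) => Math.log10(v),
};

// Example: data growing roughly like y = A*x^2 becomes evenly spaced
// under the square-root transform, i.e. a straight line on that axis.
const data = [1, 4, 9, 16, 25];
console.log(data.map(scales.sqrt)); // [1, 2, 3, 4, 5]
```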
More @ http://www.forbes.com/sites/naomirobbins/2012/01/19/when-should-i-use-logarithmic-scales-in-my-charts-and-graphs/