choosing a programming language to save money

The other day I did a REST web API shootout in 5 different languages. The results were pretty interesting in terms of performance, and it's easy to treat those numbers as just an abstract value. As I thought about it more, I realized there is a lot more to them.

In many environments a box is set up, whether virtual or physical, and a service is deployed on it. Generally the service is run with some headroom to make sure it performs well. Let's assume we have a service that can fully peg the CPU (it's "right sized"). In a simplified sense we are then maximizing what we get for every dollar it costs to run that box. Of course somebody eventually gets worried about the CPU being pegged ... but for now let's say this simplistic thinking holds.

Let's assume further that we can maximize utilization and now need a way to evaluate what the service costs to run. AWS to the rescue: since AWS charges by the hour, it's easy to calculate the cost of differently performing implementations in that environment.

In that shootout the golang implementation beat the other languages. So let's assume we run it on an instance that costs about $0.10 per hour (right now that's roughly the price of a c3.large on AWS) and that the instance is fully utilized. To keep the math simple we'll run 10 of those for a grand total of $1.00 per hour, ignoring headroom calculations for now.

The golang implementation ran at about 448 transactions per second, while the ruby implementation only managed around 305 transactions per second. That means we need about 1.47 times as much horsepower to match what the golang implementation can do, so instead of 10 instances we need about 15 after rounding up. That's $1.50 per hour.
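
To make that concrete, here is a minimal Go sketch of the calculation. The throughput figures (448 and 305 transactions per second) and the $0.10 instance price are the numbers from above; the rest is just illustration.

```go
package main

import (
	"fmt"
	"math"
)

func main() {
	// Throughput figures from the shootout (transactions per second).
	goTPS, rubyTPS := 448.0, 305.0

	// How many ruby instances does it take to match 10 golang instances?
	ratio := goTPS / rubyTPS           // ~1.47
	instances := math.Ceil(10 * ratio) // 15 after rounding up

	// Instance price from above: roughly a c3.large at $0.10 per hour.
	pricePerHour := 0.10
	fmt.Printf("ratio %.2f, %.0f instances, $%.2f per hour\n",
		ratio, instances, instances*pricePerHour)
}
```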

It gets pricier from there: the nodejs implementation would cost about $1.90 per hour, the python implementation $2.20, and the perl code $2.70.

Yup, it's all still less than what a barista might charge you in the morning, but as things scale up the numbers multiply quickly. Need a hundred instances for a month? Using the unrounded performance ratios (rather than the heavily rounded small numbers from earlier), it looks like this, with a small code sketch of the arithmetic after the list:

  • golang: 730 hours * $0.10 * 100 instances = $7,300 per month
  • ruby: 730 hours * $0.10 * ~147 instances = $10,722 per month
  • nodejs: 730 hours * $0.10 * ~187 instances = $13,683 per month
  • python: 730 hours * $0.10 * ~213 instances = $15,573 per month
  • perl: 730 hours * $0.10 * ~266 instances = $19,466 per month
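
For the curious, here is a minimal Go sketch of the monthly arithmetic behind the table. The hours, price, and throughput numbers are the ones used above; the `monthlyCost` helper is just for illustration.

```go
package main

import "fmt"

// monthlyCost multiplies the hours in a month by the hourly
// instance price and the number of instances.
func monthlyCost(hours, pricePerHour, instances float64) float64 {
	return hours * pricePerHour * instances
}

func main() {
	hours, price := 730.0, 0.10
	goTPS, rubyTPS := 448.0, 305.0

	// golang baseline: 100 instances at full utilization.
	fmt.Printf("golang: $%.2f per month\n", monthlyCost(hours, price, 100))

	// ruby: scale the instance count by the unrounded throughput
	// ratio; this is where the $10,722 figure above comes from.
	fmt.Printf("ruby:   $%.2f per month\n",
		monthlyCost(hours, price, 100*goTPS/rubyTPS))
}
```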

I'm not trying to say that my simple test is the ultimate answer as to what language to choose for an implementation. There are a lot of differences in the various libraries, and the test ran siege in benchmark mode, which couldn't be sustained for very long.

Overall it's just interesting to consider hard data for systems at scale and possibly leverage it in a positive way. I'll be using this idea to help me in the future, and I'll likely run some more experiments.

@matthias