Make it right, then make it fast

Alan Perlis once said that a Lisp programmer knows the value of everything, but the cost of nothing.

I rediscovered this maxim this past week.

As many of you may know, we’re using Clojure, Datomic, and Storm to build Zolodeck. (I’ve described my ideal tech stack here.) I’m quite excited about the leverage these technologies provide, and I’m a big believer in getting something working any way I can, as fast as I can, and only then worrying about performance. I never want to fall prey to the evils of premature optimization and all that… In fact, on this project, I keep telling my colleague (and everyone else who’ll listen) how awesome (and fast) Datomic is, and how its built-in cache means we can stop worrying about database calls.

A function I wrote (one that does some fairly involved computation over relationship graphs and so on) was taking 910 seconds to complete. Yes, more than 15 minutes. Of course, I immediately suspected the database calls, thinking my enthusiasm was somehow misplaced or that I didn’t really understand the costs. As it turned out, Datomic is plenty fast. It was my algorithm that was naive and basically sucked… I had knowingly glossed over a number of functions that weren’t exactly performant, and when they were called inside a set of intensive, tight loops, the cost added up fast.
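To make that failure mode concrete, here’s a minimal Clojure sketch of the general pattern (the names are hypothetical, not the actual Zolodeck code): a pure helper that looks cheap gets recomputed inside nested loops, and because it’s pure, memoizing it (or hoisting the work out of the loop) removes the redundant cost.

```clojure
;; A hypothetical sketch, not the actual Zolodeck code.
(require '[clojure.set :as cset])

;; contact-graph stands in for an expensive, pure graph computation.
(defn contact-graph [contact]
  (set (:connections contact)))

;; Naive version: contact-graph is recomputed for every (a, b) pair,
;; so n contacts means ~2n^2 graph computations inside the tight loop.
(defn naive-overlap-scores [contacts]
  (for [a contacts, b contacts, :when (not= a b)]
    (count (cset/intersection (contact-graph a) (contact-graph b)))))

;; Because the helper is pure, memoizing it means each contact's graph
;; is computed only once -- often the difference between minutes and seconds.
(def contact-graph* (memoize contact-graph))

(defn better-overlap-scores [contacts]
  (for [a contacts, b contacts, :when (not= a b)]
    (count (cset/intersection (contact-graph* a) (contact-graph* b)))))
```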

After profiling with YourKit, I was able to bring the time down to about 900 ms. At nearly a second, it’s still quite an expensive call, but certainly less so than the original version, which was roughly 1000x slower.
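If you just want a quick before/after number without firing up a profiler, Clojure’s built-in `time` macro is the fastest sanity check (the function name below is a made-up stand-in):

```clojure
;; time prints the elapsed milliseconds of the wrapped expression and
;; returns its value -- handy for a rough before/after comparison.
;; compute-relationship-strengths is a hypothetical stand-in here.
(time (compute-relationship-strengths db contact))
;; prints something like: "Elapsed time: ... msecs"
```

For steadier numbers than a single `time` call, the criterium library’s `quick-bench` is the usual next step; a profiler like YourKit is what tells you *where* the time actually goes.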

I relearnt that tools are great and can help in many ways, just not in making up for my stupidity :-)