Cloud computing is a game changer for developers. Not because it demands a new architectural model; architecture is driven as much by fads and fashion as by actual hardware requirements. Nor is it the seemingly endless capacity and near-perfect scalability that the cloud promises. The game changer is that poorly performing code now has a real price in hard currency.
Since personal computers replaced time-sharing systems, performance has been a nice-to-have. Generally speaking, either an application's performance is good enough for the hardware it runs on or it isn't. You gain nothing by dropping your peak CPU utilization from 90% to 81%, except perhaps a small discount on your electric bill.
With a cloud platform, dropping your CPU utilization by 10% translates directly into a 10% reduction in your monthly bill from your cloud provider. For example, Windows Azure charges 12 cents per machine-hour of computational time. Armed with that rate and a good profiler, you could literally say that a certain block of code is costing the company X dollars per month.
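The arithmetic is simple enough to sketch. The following snippet assumes the $0.12-per-machine-hour Azure rate quoted above; the profiler fraction and instance count are hypothetical inputs you would supply from your own measurements.

```python
HOURLY_RATE = 0.12          # dollars per machine-hour (Windows Azure rate cited above)
HOURS_PER_MONTH = 24 * 30   # roughly 720 machine-hours per instance per month

def monthly_cost_of_block(cpu_fraction, instance_count=1):
    """Dollars per month attributable to a block of code that a
    profiler reports as consuming `cpu_fraction` of total CPU time."""
    return cpu_fraction * instance_count * HOURS_PER_MONTH * HOURLY_RATE

# A hypothetical block responsible for 10% of CPU time across 20 instances:
print(monthly_cost_of_block(0.10, 20))  # 0.10 * 20 * 720 * 0.12 = $172.80/month
```

That last figure is exactly the "X dollars per month" a manager can put on a line item, which is something profiler percentages alone never gave us.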
Once the cost of poorly performing code is known, companies can make economically sound decisions about whether to spend time and money fixing it. Simply by comparing the monthly cost of the code with the salary of the developer tasked with improving it, engineering managers can say with certainty how much time can be spent before the law of diminishing returns kicks in.
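The comparison above reduces to a break-even calculation. Here is a minimal sketch; the savings figure, developer rate, and one-year horizon are all illustrative assumptions, not numbers from the text.

```python
def break_even_hours(monthly_savings, dev_hourly_cost, horizon_months=12):
    """Developer-hours of optimization work that would exactly equal
    the hosting savings accrued over the given horizon."""
    return (monthly_savings * horizon_months) / dev_hourly_cost

# Hypothetical: a block costing $500/month, a developer at $75/hour,
# evaluated over one year:
print(break_even_hours(500, 75))  # 500 * 12 / 75 = 80.0 hours
```

If the fix is expected to take less than those 80 hours, it pays for itself within the horizon; if it takes more, the manager has a hard number telling them to leave the code alone.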
The performance=money equation will also bring dynamically typed languages into sharp focus. If we have truly reached the point where dynamically typed languages are "fast enough", that will be reflected in the price of renting cloud servers. If, on the other hand, production costs start to skyrocket, there will be irrefutable evidence that a statically typed language is in order. Of course, this will have to be decided on a case-by-case, project-by-project basis.