Monday, October 26, 2009

Validity of development tactics

Many software development practices and methodologies are presented as panaceas and silver bullets, as if they were valid in every domain and situation. But a responsible developer must be pragmatic (a term that gave its name to a successful book and series) about where and when he applies his preferred technology, knowledge, or workflow.
I blogged some weeks ago about overusing a specific technology, but now I am focusing on more general methodologies and paradigms like object orientation, particularly when applying them means killing a fly with a bazooka.
Let's start with a series of examples regarding fields of application:
  • Test-Driven Development might be the best and most controversial XP practice, and it is widely applied for producing robust and maintainable software. However, it is not easily applicable to applications which are not object-oriented, as you cannot easily isolate pieces of functionality in structured programming; moreover, there are specific tasks where TDD is an obstacle, like creating a graphical user interface or a throwaway prototype.
  • Limitations are intrinsic in Design Patterns too: Ralph Johnson, one of the authors of the original Design Patterns book, affirmed along with his fellow authors in a recent interview that functional programming, for instance, requires different patterns than the ones presented in the book.
  • Digging further, even object-oriented programming is not applicable in every domain and deployment node, mainly where there is no infrastructure to provide polymorphism and inheritance, like in embedded applications, which typically employ the C programming language, or in low-level operating system routines. Some zealots will say that an object-oriented website will never scale, but that is an exaggeration.
  • Agile software development is a great methodology for delivering value to a customer, but it is not suitable if your organization does not support it, or if software development is not your job. One could argue that software development should be managed by professionals, but this is not usually true in the real world, where in-house programmers have more than one responsibility.
  • At the programming level, using a Factory to build your objects is a good choice if they require external dependencies, but sometimes that is not the case. For instance, objects that have to be serialized are commonly required not to have external dependencies, as you can always pass a service class through the stack when calling a method. These objects are often called newables (a small sketch of this distinction follows the list).
  • Version control is great and even solo developers should give it a try. I put nearly everything I produce in a Subversion repository, but I do not store binaries in it, for example, since they can be generated from the source code stored in the repository and would only slow down the server with enormous diffs. In a previous post I said that Subversion and similar applications are general-purpose systems, but even here there is a limit to what they are meant for.
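To make the factory/newable distinction concrete, here is a minimal Python sketch; the Invoice, Mailer, and MailerFactory names are hypothetical, invented only for illustration. The newable has no external dependencies, so it can be instantiated and serialized anywhere, while the service class is built through a factory that hides its configuration.

import json


class Invoice:
    """A newable: no external dependencies, safe to instantiate and serialize anywhere."""

    def __init__(self, number, amount):
        self.number = number
        self.amount = amount

    def to_json(self):
        return json.dumps({"number": self.number, "amount": self.amount})


class Mailer:
    """A service class with an external dependency (an SMTP host, here)."""

    def __init__(self, smtp_host):
        self.smtp_host = smtp_host

    def send(self, recipient, body):
        # A real implementation would talk to self.smtp_host; this sketch only prints.
        print("Sending to %s via %s: %s" % (recipient, self.smtp_host, body))


class MailerFactory:
    """The factory encapsulates the external configuration for client code."""

    def __init__(self, smtp_host):
        self.smtp_host = smtp_host

    def build(self):
        return Mailer(self.smtp_host)


# Usage: the newable is created with a plain constructor call,
# while the service object is obtained from the factory.
invoice = Invoice(42, 100.0)
mailer = MailerFactory("smtp.example.com").build()
mailer.send("customer@example.com", invoice.to_json())

The point is only the shape of the distinction: anything that must travel through serialization stays free of services, while anything wired to infrastructure is built by a factory.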
I could go on for hours, and I'm sure you can find a flaw in every single practice I could mention here. As Fred Brooks wrote in his famous essay:
There is no single development, in either technology or management technique, which by itself promises even one order of magnitude [tenfold] improvement within a decade in productivity. --Fred Brooks
And it is indeed true that every practice we adopt has limits, both in its field of application and in the productivity improvement it can give back to us. There is a temptation to apply our new knowledge or technology straight to every problem at hand, but instead of performing well, the technique usually jumps the shark and produces a horrible result like the ones I listed previously.
Consider the TDD example: test-first programming helps us to produce decoupled components for our applications, and shrinks the time required for bug fixing. So should we apply this technique to user interfaces? I would say a big No, as it would slow down our development, cluttering the codebase with brittle tests that have to change every day. So the improvement in productivity is not gained on all the production code, but only in the domain layer, which is fortunately the most important one. You may also have to write infrastructure code which serves as the glue between layers. Is it useful to test-first this code? I think it is not: the effort spent on automating tests that discover wiring or user interface bugs is often not worth the value they provide.
You still have to test the wiring of your application, but unit tests like the ones prescribed by TDD are not the right tool there.
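As a rough illustration of why test-first pays off mostly in the domain layer, here is a minimal Python sketch with a hypothetical Cart class: the pure domain logic is trivial to isolate and exercise with a fast unit test, while an equivalent test against a user interface widget or the wiring between layers would be far slower and more brittle.

import unittest


class Cart:
    """Hypothetical domain class: pure logic, no user interface, no infrastructure."""

    def __init__(self):
        self._items = []

    def add(self, name, price):
        self._items.append((name, price))

    def total(self):
        return sum(price for _, price in self._items)


class CartTest(unittest.TestCase):
    """A test written first: it drives the design and runs in milliseconds."""

    def test_total_sums_item_prices(self):
        cart = Cart()
        cart.add("book", 20.0)
        cart.add("pen", 2.5)
        self.assertEqual(22.5, cart.total())


if __name__ == "__main__":
    unittest.main()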

Take care of what you learn, but also research and discover when you can use it. Like in physics, no formula is always valid: you must put it in context before starting to tackle a problem with your Swiss Army knife.
