This article is bound to be controversial, but I beg you to keep an open mind as you read. I have put a lot of research into this; it is not your standard knee-jerk “OOP Sucks” article. In fact, you will find in the end that I still recommend learning Object Oriented Programming (OOP), if you haven’t already.

Here are the main points of my article:
- There is no official OOP definition.
- OOP productivity myth.
- So what? Learn OOP anyway.
There is no official OOP definition.

The other day, it occurred to me that there is a lot of controversy in the programming world about what the true definition of OOP is. Many of the given definitions revolve around specific programming languages. Some will say that Smalltalk’s syntax is the definition of OOP; others say the same about Java or C++. On the other side of the coin, people like to tear specific languages down for NOT being OOP. I’ve heard people say ColdFusion is not OOP, PHP is not, Perl is not, etc.

But if you look, you can find OOP definitions that have nothing to do with specific programming languages. Here is one definition I found on Wikipedia: “OOP is the act of using ‘map’ data structures or arrays that can contain functions and pointers to other maps, all with some syntactic and scoping sugar on top. Inheritance can be performed by cloning the maps (sometimes called ‘prototyping’).” That’s fine, but here is another Wikipedia quote on the subject: “Attempts to find a consensus definition or theory behind objects have not proven very successful, and often diverge widely. For example, some definitions focus on mental activities, and some on mere program structuring.” I think part of the difficulty revolves around deciding which OOP concepts should be included in the definition. For example, some say that true OOP requires interfaces; others say that’s rubbish. Also, even the definitions of some of these core OOP concepts differ between languages and gurus.

So I ask you this question: how do you even know if you are using OOP? I think this is a real problem. The database industry solved this years ago for its specific problem domain by defining and agreeing on the rules of Database Normalization. So how about it, OOP gurus? Why don’t you all come together and work out a definition that we can all use? I am not talking about a simple, one-size-fits-all definition (that would never work), but a formalized, comprehensive definition covering various application types and sizes.
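To make that quoted “objects as maps” definition concrete, here is a minimal sketch of my own in Python. It is not how any particular language implements its object system; it just illustrates the idea that an object can be a plain map of data and functions, and that “inheritance” can be nothing more than cloning the parent map:

```python
# An "object" is just a dict whose values are data fields and functions.
def animal_speak(self):
    return f"{self['name']} makes a sound"

def make_animal(name):
    return {"name": name, "speak": animal_speak}

def dog_speak(self):
    return f"{self['name']} barks"

def make_dog(name):
    obj = dict(make_animal(name))  # "inheritance" by cloning the parent map
    obj["speak"] = dog_speak       # override one entry, like a method override
    return obj

def call(obj, method, *args):
    # Method dispatch: look the function up in the map, pass the map as "self".
    return obj[method](obj, *args)

print(call(make_animal("Generic"), "speak"))  # Generic makes a sound
print(call(make_dog("Rex"), "speak"))         # Rex barks
```

Notice there is no `class` keyword anywhere, yet you get encapsulated state, methods, overriding, and inheritance, which is exactly why pinning down where “OOP” begins and ends is so hard.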
OOP is not more productive than procedural programming.
But let’s ignore the OOP definition problem for now and talk about the merits of OOP. Proponents of OOP will tell you that it will make you more productive through code reuse. But has anybody ever sat down and tried to prove this? Yes, they have, and the results may surprise you. Thomas E. Potok, Mladen Vouk, and Andy Rindos published a study in Software – Practice and Experience (Vol. 29, No. 10, pp. 833–847, 1999) titled “Productivity Analysis of Object-Oriented Software Developed in a Commercial Environment” [PDF]. In this study they rigorously examined the many facets of software projects. They looked at 19 development projects of varying size, some of which involved software being ported between operating systems. The assumption is that porting software by its nature involves a high degree of code reuse. Their study found that while “the ported products, as expected, have significantly higher productivity than the non-ported projects, there are no obvious [productivity] differences between the procedural and object-oriented products.” They looked at project size, developer experience, developer team size, and many other common software project variables. They COULD NOT find any significant productivity increase due to the use of OOP methodology. Instead, they found that “the governing influence may not be the methodology, but the business model imposed through schedule deadlines.”

One puzzling thing Potok et al. noticed was that the larger the project, the more productive the programmers were, regardless of methodology. Why would that be? Their conclusion is that larger projects bring more pressure from management to meet aggressive deadlines, and programmers under pressure are more productive (again, regardless of the methodology chosen).
That is why they concluded that while “the introduction of object-oriented technology does not appear to hinder overall productivity”, the true driving factor of developer productivity was deadlines and pressure from management.
So what? Learn OOP anyway.
While I am frustrated by the lack of consensus in the OOP community, I still think learning OOP is a worthy goal. I will reiterate that while the Potok et al. study couldn’t find any evidence that OOP is more productive than procedural programming, it also couldn’t find evidence that OOP hinders productivity. So if productivity is your goal, it doesn’t matter whether you use OOP or procedural. However, there are more factors to consider than productivity.

One major factor that would make me learn OOP is its popularity. I could stick with procedural, but let’s face it: it’s a dying breed. Most programming jobs showing up on the job boards these days require some OOP experience. And again, there is no harm in using it, so if you want to stay current it’s probably a good idea to learn OOP. You can still fall back to procedural now and again if you want to, but a good, solid understanding of OOP will get you far in today’s programming world.

Another factor for learning OOP is that modern programming languages are moving more and more in that direction. With each release, ColdFusion gains more OOP features. Sure, you can still program procedurally, but over time that might become more difficult (especially if you switch to another language).

The last factor I want to highlight is communicating with your fellow programmers. If you want to easily communicate your programming problems and ideas to another programmer, and that programmer is used to OOP, it will be much easier if you speak in OOP terms.
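To show what “falling back to procedural now and again” looks like in practice, here is a small illustrative sketch (my own toy example, in Python) of the same task written both ways. Neither version is more correct; they simply organize the same data and behavior differently:

```python
# Task: sum the line-item totals of an invoice.

# Procedural style: plain data (tuples) and free functions.
def line_total(price, qty):
    return price * qty

def invoice_total(items):
    # items is a list of (price, qty) tuples
    return sum(line_total(price, qty) for price, qty in items)

# OOP style: the same data and behavior bundled into classes.
class LineItem:
    def __init__(self, price, qty):
        self.price = price
        self.qty = qty

    def total(self):
        return self.price * self.qty

class Invoice:
    def __init__(self, items):
        self.items = items

    def total(self):
        return sum(item.total() for item in self.items)

data = [(10.0, 2), (3.5, 4)]
invoice = Invoice([LineItem(price, qty) for price, qty in data])
print(invoice_total(data))  # 34.0
print(invoice.total())      # 34.0
```

The OOP version costs a few more lines here, but its vocabulary (classes, methods, instances) is the shared language you will need when discussing designs with other programmers, which is really the point of this section.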
In conclusion, I really hope the OOP world can come up with a consensus on a comprehensive definition of OOP. Also, there is some evidence that OOP doesn’t increase your productivity. But even with these two arguments against OOP, there are still plenty of reasons why you should learn and use it.