The Flaw of Everything is an Object

By | August 4, 2007

Several newer languages (as in less than 10 years old) have been designed around the idea that everything is an object. I started programming professionally almost two decades ago, so I was programming before the object craze became mainstream. I have done the OO thing for over a decade and am not as impressed as I once was.

Although not everyone will agree with me, I suspect that OO is not always the best choice. In fact, in some applications, a simple, straightforward procedural approach is better.

Many OO advocates will point out that the procedural approach doesn’t scale. They tend to forget that a large fraction of the code written in the world doesn’t need to deal with terabytes or petabytes of data. Much code isn’t called thousands of times a second. Even more code is not mission critical. And some applications run on systems without the resources for the overhead of OO.

Learning OO

I also find that the object approach appears to be harder for people to really master. Many people claim to have learned OO programming in a few weeks, but it seems to take at least two years for people to really get it. Before that point, new OO programmers end up creating collections of data and methods with no cohesion, or multiple classes with intimate knowledge of each other’s inner workings. Even the habit of inheriting everything just because they can usually takes a few months to break. I won’t even discuss the objects that serve only as a holder for a handful of procedural methods that show no signs of abstraction.
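
To make that last complaint concrete, here is a hypothetical sketch (the class and its names are my own invention, not code from any real project) of the kind of “object” I mean: a public grab bag of unrelated data and procedures with no cohesion and no abstraction.

    #include <string>
    #include <vector>

    // A "class" that is really just a pile of unrelated data and procedures.
    // Nothing here shares state or purpose, everything is public, and there
    // is no abstraction to speak of.
    class MiscStuff {
    public:
        std::string customerName;        // belongs to some customer concept
        std::vector<double> readings;    // belongs to some measurement concept

        double averageReading() const {
            if (readings.empty()) return 0.0;
            double sum = 0.0;
            for (std::vector<double>::size_type i = 0; i < readings.size(); ++i)
                sum += readings[i];
            return sum / readings.size();
        }

        std::string greeting() const {   // unrelated to averageReading()
            return "Hello, " + customerName;
        }
    };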

On the other hand, I’ve seen non-programmers pick up a little bit of procedural programming relatively quickly. Just enough to automate some portion of their computer usage and save a little time. This is some of the appeal of many of the dynamic languages. It’s easier to whip out a little code to get a job done. Would I recommend this kind of programming for life- or mission-critical applications? No, but that doesn’t make it useless.

Not The One True Way

I suspect that objects have become the latest Golden Hammer in our field. The OO paradigm has also been around long enough that many people have worked in it for their entire careers and have begun to believe that there is nothing else.

As such, people tend to forget that there is overhead involved with using OO. I’m not just talking about memory and CPU overhead. While it is true those forms of overhead exist, both are getting cheaper fast enough that they are not the major problem (unless you are doing embedded systems). The more important overhead is conceptual. Everyone starts with the obvious (and incorrect) premise that programming objects are just like objects in the real world. This view of objects is easy to understand, but very limited. Amusingly enough, in the real world it is obvious that everything is not an object. Energy, thoughts, emotions, and probability are fundamentally different from a table or a cup. Forcing them into the same framework would not make much sense.

Concepts

In most of the OO programming I’ve done or seen, only a tiny fraction of the classes have anything to do with real physical objects. I think the best breakthrough for me was the comment that concepts are classes in the book Ruminations on C++ by Koenig and Moo. This finally solidified for me the non-real-world things that need to be classes; I had done this many times without really having a reason. Wrapping your mind around these concepts and learning to work with them seems to take months to years for every programmer I’ve ever worked with, taught, or mentored. Along the way, they normally reach several Aha! moments that move them toward real ability with OO.
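
As a minimal sketch of what I mean (my own example, not one from the book), here is a class that models a concept rather than a physical thing: the notion of a measurement of elapsed time.

    #include <ctime>

    // A class that models a concept -- "a measurement of elapsed time" --
    // rather than anything you could pick up off a desk.
    class Stopwatch {
    public:
        Stopwatch() : start_(std::clock()) {}

        // Begin measuring again from now.
        void reset() { start_ = std::clock(); }

        // Seconds of processor time since construction or the last reset().
        double elapsedSeconds() const {
            return static_cast<double>(std::clock() - start_) / CLOCKS_PER_SEC;
        }

    private:
        std::clock_t start_;
    };

Nothing about it corresponds to an object in the physical world, yet it makes a perfectly sensible class.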

This does not sound like the best approach for people who are just beginning to learn to program or for people who only need to program as a sideline to get their real work done.

Even in the most fanatical of OO programming languages, we eventually reach the level of individual statements. These are not objects, and making them objects would only increase the difficulty of understanding the code. However, most people overlook that.

Language Feature?

I have come to believe that the everything is an object meme has nothing to do with ease of learning or use of the language. I suspect that the approach only simplifies things for the language designer. After all, as a user of the language, how does having an integer be an object really make your coding or design easier? In order to make the code usable, these integer objects either need special syntactic support so that we can write mathematical expressions somewhat naturally, or the language needs support for operators built into the classes. Both complicate the language for the (marginal) benefit of being able to say that all of my numbers are objects, just like everything else.
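
To sketch that second option (the Integer class below is hypothetical, purely to illustrate the point): with operator support built into the class, expressions still read like arithmetic; without it, every calculation turns into a chain of method calls.

    // A wrapper that makes an integer "an object".  Operator overloading is
    // what keeps expressions readable.
    class Integer {
    public:
        explicit Integer(int v) : value_(v) {}

        Integer operator+(const Integer& other) const { return Integer(value_ + other.value_); }
        Integer operator*(const Integer& other) const { return Integer(value_ * other.value_); }

        int value() const { return value_; }

    private:
        int value_;
    };

    // With operator support:
    //     Integer total = price * quantity + shipping;
    // Without it, the same calculation degenerates into something like:
    //     Integer total = price.multiply(quantity).add(shipping);

Either way, someone pays for the uniformity: the language grows operator machinery, or the code grows method-call noise.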

As a rule, I find this assertion to be a quirk of a given language rather than a particular bonus; kind of like Lisp’s parentheses, C++’s templates, or Perl’s punctuation variables. They are part of the language and are only good or bad in as much as they help or hinder my ability to learn the language and read and write useful code.