When good comments mean bad language

I’ve had an epiphany.

For years, I’ve urged developers to write better comments. I still claim that’s a good idea (a very good one), but as I’ve pondered what a better programming language might look like, I’ve come to an important conclusion:

A lot of “best practice” commenting is just workarounds for inadequate language design.
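Here’s a contrived taste of the pattern (a hypothetical declaration of my own, not one of the snippets below): the comment is doing work that the language gives us no way to express or enforce.

```cpp
// Everything a caller really needs to know about this function lives
// in prose, where the compiler can't check a word of it.
struct Connection;  // some connection type; the details don't matter here

// timeoutMillis is in MILLISECONDS, not seconds.
// The caller retains ownership of conn; do not delete it inside.
// NOT thread-safe: callers must hold the connection-pool mutex.
void resetConnection(Connection* conn, int timeoutMillis);
```

Units, ownership, and threading rules are exactly the sort of “best practice” comments I’ve been preaching, and exactly the sort of thing a better-designed language could check for us.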

This might seem like a crazy or arrogant claim. The Wirths and Matsumotos and Hejlsbergs and van Rossums and McCarthys of the world are incredibly smart people; how could I claim to know something that they do not? Each of these language designers has probably forgotten more about computer science than I will ever learn.

And yet, I think Randall Munroe (the cartoonist at xkcd) was right to make fun of our industry’s facile assumption that context-free grammar is all you need to know about formal language:

Image credit: xkcd.com

To show you what I mean, I’ve inlined snippets of code from a variety of programming languages below. Don’t worry about digesting them carefully right now; give them a quick glance, then move on to my analysis and see if you agree that there’s an unhealthy pattern. Continue reading

Headers, babies, and bathwater

I claim that by eliminating the C/C++-style dichotomy between headers and implementation, most modern programming languages have thrown out the baby with the bathwater.
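If you haven’t lived the dichotomy, here’s roughly what it looks like: a trivial, hypothetical Point class spelled out twice, once as a declaration in the header and again as a definition in the implementation file.

```cpp
// point.h -- the interface: what callers are allowed to see.
#ifndef POINT_H
#define POINT_H

class Point {
public:
    Point(double x, double y);
    double distanceTo(const Point& other) const;

private:
    double x_, y_;
};

#endif

// point.cpp -- the implementation: the same signatures, restated.
#include "point.h"
#include <cmath>

Point::Point(double x, double y) : x_(x), y_(y) {}

double Point::distanceTo(const Point& other) const {
    return std::hypot(x_ - other.x_, y_ - other.y_);
}
```

Every signature appears twice, and the two copies have to be kept in sync by hand. That redundancy is exactly what most newer languages set out to eliminate, and, as I’ll argue, they tossed something valuable along with it.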


Don’t throw out the baby with the bathwater! Photo credit: StubbyFingers (Flickr)

If that sounds crazy, just hang with me for a minute.

I know my claim runs counter to popular wisdom; have a look at this thread on stackoverflow.com. Designers of languages like python and go and D and ruby and java consider it a feature that developers don’t have two redundant pictures of the same functionality. This comment from the C# 5.0 specification is typical:

“Because an assembly is a self-describing unit of functionality containing both code and metadata, there is no need for #include directives and header files in C#. The public types and members contained in a particular assembly are made available in a C# program simply by referencing that assembly when compiling the program” (p 3).

I agree.

Sort of…

Bad headers are a royal pain

It can be onerous to maintain the parallelism between a .h and a .cpp. And most C/C++ headers are managed so poorly that the benefits you might claim for them are theoretical rather than real. Three common antipatterns that I particularly detest: Continue reading

What are your software’s vital signs?

Most software has a profoundly inadequate concept of “health.” In order for applications to run, they must:

  • have adequate resources (RAM, disk, network, CPU)
  • receive cooperation from services exposed by the operating system or by network endpoints
  • be adequately and correctly configured
  • not be hacked
  • acquire delegated privileges from users

… and so forth. And yet, most software that I’ve encountered in my career does little to see whether it’s working properly and has what it needs. Sure, it may log a catastrophic error if the disk fills up, but it makes no effort to see the problem coming or to plan more graceful recovery than a crash.
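To make “seeing the problem coming” concrete, here’s a minimal sketch (my own hypothetical example, with placeholder thresholds and logging) of a disk-space vital sign that warns well before a write ever fails:

```cpp
// A minimal "vital sign" check: notice that disk space is running low
// long before a write fails catastrophically.
#include <filesystem>
#include <iostream>
#include <system_error>

// Returns true if the volume containing `path` still looks healthy.
bool diskVitalsOk(const std::filesystem::path& path,
                  double warnBelowFraction = 0.10) {
    std::error_code ec;
    const auto info = std::filesystem::space(path, ec);
    if (ec) {
        std::cerr << "vitals: cannot stat " << path << ": "
                  << ec.message() << '\n';
        return false;  // can't even measure -- treat as unhealthy
    }
    const double freeFraction =
        static_cast<double>(info.available) / static_cast<double>(info.capacity);
    if (freeFraction < warnBelowFraction) {
        std::cerr << "vitals: only " << (freeFraction * 100)
                  << "% free under " << path << '\n';
        return false;
    }
    return true;
}

int main() {
    // Check the vital sign periodically (here, just once) and react
    // before the disk actually fills up: purge caches, rotate logs,
    // alert an operator, degrade gracefully...
    if (!diskVitalsOk(".")) {
        return 1;
    }
    return 0;
}
```

The same idea generalizes to the other vitals listed above: RAM headroom, reachability of the services you depend on, validity of configuration, and so on.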

In my most recent post on cloudifying your software, I explore how cloud computing is magnifying the need to understand and to regularly check your software’s vital signs. Head on over to adaptivecomputing.com/blog and check it out.

Checking vitals isn’t just for healthcare… Photo credit: U.S. Pacific Fleet (Flickr)

Stay tuned for further installments of this series each Friday. As I said in Part 1, I believe that competence with the cloud (cloud-oriented programming, if you will) will be a checkbox on future tech resumes.

The third half of computational economics

If you look up “computational economics” on wikipedia, you’ll find out all about software models that economists use to study game theory, recessions, scarcity, and so forth.

Tweak your search terms a bit, and google takes you to discussions about the economics of the computer industry: how Moore’s Law plays out in changing prices for compute power, why cloud computing and cheap GPUs are changing how much we expect to pay, how the mobile revolution is killing traditional PCs, and what the job market looks like for us software geeks.

That’s all well and good.

But there is a third half of the computer+economics interaction that I don’t hear anybody talking about.


My buddy Ken Ebert likes to joke about incomplete thinking by saying, “There are 2 aspects of the issue…” — while he raises three fingers. :-) Interestingly, this three-fingered gesture is a symbol of sustainable development, which connects nicely to our theme of economics. Photo credit: !/_PeacePlusOne (Flickr)

Continue reading

Encapsulation isn’t just for code

When computer science folks talk about encapsulation, they are usually thinking of how the principle applies to objects and functions inside a codebase. Best practice calls for a separation of concerns–each object responsible for one type of work, hiding all details from its neighbors.

That’s great. But it’s not the only way encapsulation ought to show up in software.

In actual deployment, software packages often manifest anti-patterns in the way that they are configured. A web server has to know all about three different database servers that contribute data for its pages; HA failover scripts must know the identity and responsibility of every actor in the system, as well as many particulars about how these entities use resources to accomplish their tasks.

No wonder our deployments are fragile and high-maintenance…

The cloud computing wave is raising the bar for encapsulation in the way applications–not just objects–discover and interact with one another. In this week’s installment of my series of posts about how to “cloudify”, I discuss how role-based interactions insulate components from details they don’t need to know. It’s encapsulation all over again. And this encapsulation pattern manifests itself in unlikely places–like the order queue at McDonald’s…
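Here’s a hypothetical sketch of the difference (the names and the registry are mine, not a real API): instead of the web server memorizing three database hosts, it asks a registry for whatever currently plays the role it cares about.

```cpp
// Hypothetical sketch: components look up collaborators by role,
// not by hard-coded identity. Only the registry knows which concrete
// endpoint currently fills each role.
#include <iostream>
#include <map>
#include <optional>
#include <string>

class RoleRegistry {
public:
    void bind(const std::string& role, const std::string& endpoint) {
        roles_[role] = endpoint;
    }
    std::optional<std::string> resolve(const std::string& role) const {
        const auto it = roles_.find(role);
        if (it == roles_.end()) return std::nullopt;
        return it->second;
    }

private:
    std::map<std::string, std::string> roles_;
};

int main() {
    RoleRegistry registry;
    // Deployment (or a discovery service) decides who plays each role...
    registry.bind("session-store", "db1.internal:5432");
    registry.bind("product-catalog", "db2.internal:5432");

    // ...and the web server only knows the roles it depends on.
    if (const auto endpoint = registry.resolve("session-store")) {
        std::cout << "connecting to session store at " << *endpoint << '\n';
    }
}
```

Now swapping in a replica or moving the session store touches the registry rather than every component that consumes the role; the consumers stay blissfully ignorant of details they never needed.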

What can McDonald’s teach a developer of cloud-friendly software? Photo credit: phogel (Flickr)

Stay tuned for further installments of this series each Friday. As I said in Part 1, I believe that competence with the cloud (cloud-oriented programming, if you will) will be a checkbox on future tech resumes.