Since the onset of my programming career, it has seemed to me that there must be a better way.
In college I really enjoyed learning how to “think like a computer” (as one teacher put it). I got very good at dissecting each problem and tuning the gears and syntax of a language to make the programs work well. I felt destined to be a great coder, and I was excited for the future that surely awaited.
In the more advanced courses, shortly before graduation, I felt a resistance to some of the more complex patterns and practices, but I didn’t question them because I knew that I was just some silly undergraduate with no real experience—and besides, all I needed was time to understand the new practices that seemed a bit odd and I would be able to master those too. Despite my hesitance, I did very well in my classes. I really enjoyed my professors and it seemed that they were impressed by me as well.
And then I graduated.
I had a rude awakening at my first job when I realized how little I really knew, and when I discovered how deeply flawed the code I had inherited, and was expected to curate, really was.
In each job I’ve begun I’ve made the same realizations—I don’t know enough and I’ll be working on some ugly code.
I’ve attempted to address the first issue by continuing to educate myself (as well as can be expected for a full-time worker who also has a two-year-old).
As for the second point, however, I’m almost beginning to wonder if there exists a software project that I wouldn’t classify as abysmal in its architecture, organization, and/or general messed-up-ness. Perhaps the odds have been stacked against me and I’ve just been unlucky with the jobs that I’ve had, but the way other programmers talk, there are even worse fates than the piles of spaghetti code I’ve worked on.
This isn’t to say that all the code I’ve worked on has been completely bad; there were certainly strong points in each code base. It’s also certainly true that I’m not nearly as experienced as other software engineers, and perhaps I wouldn’t recognize good code if I saw it. But, well, I’m pretty arrogant, and I don’t think that’s the case.
I’m starting to wonder if what Tolstoy says of happy and unhappy families can be said of software as well:
All good code is alike; all bad code is bad in its own way.
But one thing bothers me more than anything else: What is good code?
Is there one true way to code that is superior to all other methods? Is there a language that is better than all the rest? If there is a better way to organize, write, and create code, does anyone know what it is?
To be sure, there are a lot of opinions out there. I’m not interested in opinions! I want facts—something that can be proven and is measurable. There is a litmus test for this better way:
- Are fewer bugs recorded?
- Is the code easy to follow and understand?
- Does it avoid unnecessary complexity and overhead (unlike the pomp and circumstance you see in a good number of frameworks)?
- How easy is it to respond to errors?
I don’t purport to have all the answers; I may be close to understanding some of them, but there is also the danger of finding more questions along the way….
(This is the first post in a series: A Better Way.)