It’s a well-worn saying… “If something is worth doing, then it’s worth doing right.” Sure, there are variations on it, but the point is always the same: if you’re going to do something, you might as well do it right the first time.
Makes sense, right? Then why is it so darn hard to find people who actually follow through?
Whose Job Is It?
If you couldn’t tell, there’s one thing that constantly drives me nuts: people who say that they know how to do X, Y or Z, but who have no idea how to do it properly.
Whether it’s “web developers” who don’t know how to write valid HTML and CSS, “designers” who have never learned about typography, “financial advisors” who can’t tell you what a P/E ratio is, “chefs” who cook with produce from a can, “researchers” who make up their minds about the solution before they begin their investigation, “contractors” who cut corners rather than building to code, whatever…
Is this excusable? Some people will argue that they just don’t know any better, so how can they meet a standard if they don’t know it exists?
Maybe it’s just me, but my perspective is that if you’re going to call yourself an X, then you ought to know how to be a proper X. It’s your responsibility to go out and find out what the best practices are for X-ish things.
I don’t want to pick on web designers and developers, but it’s an easy example to make and it’s one I know relatively well, so forgive me in advance 😉 If you want to call yourself a web designer or developer, then I believe it’s your responsibility to know how to do it right.
Part of your job before you hang out your shingle is to learn about typography, color theory, layout techniques, usability, digital and online copyright, basic web implementation techniques, etc. (for designers) or HTML/XHTML, CSS, validation, usability, accessibility, coding standards for your language of choice, etc. (for developers).
The problem is that we’ve got a whole lot of people out there who would rather do things the fast (or cheap) way instead of the right way — even educational institutions aren’t immune from this. Too many people graduate with diplomas and degrees knowing how to produce something that looks like it’s doing the right thing, but if you were to look deeper at their work, it’s just a mess.
The Lowest Common Denominator
Doing things “wrong” can be contagious in a way, too. Let’s say you’re in a software development company and you’re working on building an application that has a complex data set which has to be displayed on the screen. The display can either be dynamically generated (based on the data needed at the time), or it could be manually created — say, by creating a different button graphic for every possible piece of data, and then tying that into the application.
The right way to do it would be to dynamically generate the display — and not just because someone “said so”. No, it’s right because there are reasons and benefits for doing it this way as opposed to any other. For example, it’s more efficient (in terms of programmer time and in terms of run-time), it’s more proactive (the data can be more easily updated when it changes), and it’s more flexible (it can be adapted to future applications, and reused).
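To make the trade-off concrete, here is a minimal sketch in Python. The data set and field names are invented purely for illustration; the point is that the dynamic version derives its display elements from the data itself, so a new or changed item requires no new code, whereas the manual approach needs a hand-made element for every possible value.

```python
# Hypothetical data set, for illustration only.
data = [
    {"label": "Sales", "value": 1200},
    {"label": "Costs", "value": 800},
    {"label": "Profit", "value": 400},
]

# The "manual" approach would hard-code one element per item:
#   render_button("Sales", 1200)
#   render_button("Costs", 800)
#   ...and break (or silently go stale) as soon as the data changes.

def render_buttons(items):
    """Dynamically generate one display element per data item."""
    return [f"[{item['label']}: {item['value']}]" for item in items]

print(render_buttons(data))
```

The same loop keeps working if an item is added, removed, or renamed, which is exactly the efficiency and flexibility argument made above.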
Let’s say that you know this, and you start working on a project, making sure that you’re doing it according to industry standards and best practices and that sort of thing. Good for you! What’s sad, though, is the number of times I’ve seen people start to do things the right way, only to be told to stop. “The other developers on the team won’t be able to understand your code, so they won’t be able to adapt it,” you’re told. (Don’t laugh, it happens more often than you might want to believe.)
How ridiculous! By focusing on the lowest common denominator rather than the best practices, we guarantee that we will never attain a higher standard. Mediocrity becomes a high standard, and subpar work becomes… well… “par for the course”.
Expertise and How To Get There
Now, I’m not saying that you should be an expert in software development before you write your first program, that you have to have mastered web design before you take on your first client, or that you have to be perfect in your knowledge before you can even try something once.
You don’t have to be an expert in whatever the area in question is, but you should at least know the basic foundations and best practices. In fact, if you want to learn anything, those foundations and best practices are where you should start.
Very often, we try to learn things starting with the result. Want to develop a communications strategy? Learn about vendors for the food at your open house. Want to learn how to create a brochure? Start with a Microsoft Word template (sad but common). Building a website? Figure out how to get text to show up on a page in FrontPage (or Dreamweaver, or any other WYSIWYG editor) and call it a day.
If you do that, you can get a seemingly passable result. But you won’t be a professional. You won’t have done it right. And you won’t have really created lasting value.
Instead, start with the universal principles, the best practices, the hows and the whys. If you do that, and then go on to worry about specific tools and techniques, your skills may go out of date but your foundational knowledge won’t. It’s not your mastery of tools that determines whether you’re an expert or not. It’s whether or not you have an underlying understanding which supports your use of those tools.
Right Doesn’t (Have To) Mean Perfect
Now, it’s easy to confuse this striving for “rightness” with a striving for “perfection”. It’s easy to point at people (like myself) who get frustrated by others “not doing it right” and say that they’re just “perfectionists”. But here’s the thing: it doesn’t have to be perfect; it just has to be right. Sometimes they are the same thing, but not always.
For example, when I took religious studies in university, one of my very first courses was on the “nature of religion”. This course was interesting for a variety of reasons — we studied the nature of myth, the ideas of symbolism and archetypes, the problems with categorization, the variety of traditions, and a whole host of other neat things. But even putting aside the subject matter, I’d have to say that this course was one of the most important courses I took.
What I got out of this course was an appreciation for the best practices in the study of religion. I learned the difference between an emic and an etic perspective, and understood how those different perspectives would shape my views of the traditions I studied in the future. I became aware of the problem of bias, and learned to identify it in my own views. Was I perfect in eliminating bias? No, of course not. But as a result of that class, I was constantly aware of it, and so I strove to at least take that into account whenever I encountered a new idea.
Am I Alone?
Am I the only one who feels this way? That it’s not enough to make something work — it has to work for the right reasons, and you should be able to defend why you did something by giving sound reasoning (and “it was cheap” isn’t sound in my books)?
Maybe I’m strange (I’ve been called worse), maybe I’m an idealist, but to me it’s a real shame that this attitude isn’t common sense.
I’d rather work with someone who knows how and why something ought to be done — even if I have to teach them the particular tools for the particular job, at least I know they’ll do it right.
I’d rather partner with someone if I know I can trust them to do their part of the project properly — not because I think I could do it better (a lot of the time, I know I couldn’t), but because I want to know that the end result will stand up on its own merits.
So here’s your challenge. Take a few moments and honestly look at what you’re doing, the aspects of your life that you call yourself expert in, the things that you do as a professional, or just your day job that pays the bills. Are you doing things right? Do you even know? And if not (to either), what would it take to change that?