Sunday, March 29, 2009

Licensed to Code

Here's an idea: Professional standards for software engineers.

The Concept

I have quite a few friends who are doctors, dentists, and the like. They went through hard training at university... just like me. They sat through lectures, did assignments, took exams, and gave presentations, just like me. However, if they want to practise, they have to continually prove their proficiency through professional exams. I don't. Once I was handed my degree at university, that was the end of ever being examined.

I for one would welcome being examined every two, three, four or five years, just to keep myself provably up to date with my field. I would also like to prevent people from awarding themselves all sorts of silly titles such as "information architect", "solutions architect", "enterprise architect", "analyst programmer", and so on. In my view, any software developer is already an architect: a solution architect, an information architect, a user experience architect, etc.

Certainly, there are areas of specialisation, such as distributed systems, user interfaces, security, and algorithms, and there are key industries such as finance and travel. So how about Junior/Senior Software Engineer, with an associated specialisation, such as distributed systems or security? I just don't like that people can give themselves any silly title they want, never undergo any professional proficiency verification in their lives, and still claim professional standing on par with other professions. I feel slightly embarrassed when my medical friends ask me about software engineering roles and titles.

Good Old Job Interviews

As an aside, I am currently interviewing for jobs. Again, I have to do basic programming tests, along with aptitude tests, IQ tests, psychometric tests (here's a snippet from a four hour one I am about to do - along with some interesting rants), etc, all of which I am more than happy to do. It proves my worth, and it also keeps me up to date with programming practices (I am forced to go back and relearn what I have forgotten before the test). However, there are some people (you know who they are) who will spend years in a company saying the right things to management and climbing the corporate ladder (see the next section below).

The problem with these people is that they will end up being your boss, and they will be making the design decisions. This is fundamentally bad. If, however, everyone must regularly demonstrate their ability, then people who position themselves into the title of "enterprise architect", "solutions tsar", etc will only keep that position if they actually hold a reasonable basic level of competence. Those who can't can then move on to whatever suits them better: BA, account manager, strategic manager, and so on.

Your Boss Is Incompetent! *
* and you will be too one day

Here is a sad fact, pointed out to me recently by a colleague. The process of workplace promotions has a slight but fundamental flaw. Let me demonstrate by way of a question: at the end of the following program, what is the value of the variable "Job_Level", presuming that you get promoted from tester (level 1) to developer to architect to delivery manager?

10 Declare Job_Level = 1
20 Do work
30 If it is time to retire or Job_Level >= MAX_JOB_LEVEL, end program
40 Time goes by
50 If you are really good, Job_Level = Job_Level + 1
60 If you are really bad, Job_Level = Job_Level - 1
70 If you are really really bad, end program
80 Goto 20

I guess the answer is level 4. Great news: you are a delivery manager (a hard job... dealing with BAs, developers, architects, and stakeholders... and the buck stops with you). However, if you were good at being a delivery manager, would you not have been promoted (over time) to the next level up (say Strategic Director, Board Member, etc)? Yes, you most likely would have, unless you said "actually, I am happy with where I am, and don't wish to climb the ladder any further". But if you weren't promoted, that doesn't mean you were doing a great job... it just means you weren't doing the job badly enough to get demoted or fired. This is essentially the Peter Principle: people rise until they reach a level they are not good enough to rise beyond, and that is where they stay.

So... why don't companies attach conditions to every promotion, with a probation period covering the first few rounds of reviews? That would get around a problem I see a lot: people position themselves into a nice, easy middle-management job, and then just sit there for years, collecting a big salary, making ineffective decisions, pushing their own political agenda, and working out exactly why it is that they should never be fired. In small companies this is not much of an issue, but as companies grow, and relationships across business units become more complex, it is possible to get yourself into a position of power - where no one knows what you are actually doing - but everyone is too afraid to question it :)

Sunday, March 22, 2009

How to Develop Good Software

Here are a few thoughts on where we are going right, and wrong, with commercial software development. Enjoy, agree, disagree, make comments, ...

Test-driven design (TDD)
  • I really like the concept of TDD. Get the test team in early to think about the proposed software, and to commit that they can in fact test the software to be developed (how often are the QAers brought in right at the end of a project?). One recent project I was on ended up costing about three times the projected amount because it turned out we simply couldn't test the software. It sent updated itineraries to customers based on changes to airline schedules; it worked fine in development, but was hard to test in pre-production because we had no real way of simulating airlines cancelling flights... short of calling in bomb threats to airports, that is. Other good things about TDD include: designing the system for testing, so there are no nasty surprises when it comes time to test; and mocking out each component in the system, so you can test each part (including the core) in isolation. Unit tests have to run fast... otherwise there is little point in having them.
  • Test-driven design works, if done well. Martin Fowler suggests taking very small steps (the red/green/refactor cycle), but if you are more experienced, I suggest taking bigger steps. If you are a competent developer, there is no point writing a test that doesn't compile just to get to the "red bar" stage. If you know that you need to implement a stub service before any of the tests compile, then do so. This won't break the project, and it keeps your eyes on the goal of producing working software. Too often I see very good engineers get bogged down by the religion and process of Test-Driven Design/Agile Methods.
  • I do, however, like the idea of writing unit tests that fail the first time around (note the difference between failing and simply not compiling). A red-bar (failed) test the first time round proves that your unit test, in its initial state, will not give you a false positive (see the first sketch after this list). This is a lot better than writing unit tests that mistakenly give you a positive result, meaning that defects will only be found at a later, more expensive stage of development.
  • Be careful not to get too carried away with unit tests. Be prepared to throw your unit tests out just as quickly as BAs throw out requirements. It is easy to get good coverage quickly (the 80/20 rule), but don't spend too long on any one test or part of the system.
  • Watch out for over-architecting the test set-ups. Say you have a central system that takes requests from a UI, makes calls to a web service, and writes to a database and/or message queue; there is already an exponential number of possible test combinations here. The web service, for example, can be mocked out on the client side or on the server side, or you could use SoapUI to mock both the server and the client. So one test set-up could call the business layer directly (bypassing the UI), skip the web service client and call the service implementation in-process, and have that implementation write to a mock database. This is a valid test, which certainly tests the business layer. However, you could also configure the test to call the real web service client, which calls a fake web service implementation. So the real question to ask is "what am I trying to test within this unit test", and then focus on not confusing unit testing with integration testing (the second sketch after this list shows one such shape of test). Integration testing I see as a superset of unit testing: you might not test all cases and inputs, but you do want to test each pair of closely related components, as well as a complete (end-to-end) run of components if that is possible. The only catch is that if you are calling third-party, high-latency or non-controllable services, you probably want to mock these out at the far end of the component if possible.
Disclaimer: all of this assumes the business supports up-front investment in TDD, which of course ties up BA resources, testing resources, and so on. On a blue-skies project this is not an issue, but in your typical large-scale enterprise (where systems are already in place, multiple versions must be supported, and large areas of code were never designed for test at all), it can take time - if it is possible at all - to gain traction with TDD.
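
To make the red-bar point concrete, here is a minimal JUnit 4 sketch. FareCalculator and its rounding rule are invented for illustration; the point is that the stub compiles (so the build isn't broken), yet the first run gives a red bar, proving the test cannot hand you a false positive:

import static org.junit.Assert.assertEquals;
import org.junit.Test;

// Invented class under test: rounds a fare up to the nearest dollar.
class FareCalculator {
    int roundUp(double fare) {
        return 0; // deliberate stub - compiles, but gives a red bar
    }
}

public class FareCalculatorTest {
    @Test
    public void roundsFareUpToNearestDollar() {
        // Fails against the stub above; replacing the stub body with
        // "return (int) Math.ceil(fare);" turns the bar green.
        assertEquals(100, new FareCalculator().roundUp(99.20));
    }
}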
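
And here is one way to mock at the web service boundary so a unit test exercises only the business layer. Again, ItineraryService and BookingManager are invented names, and a hand-rolled anonymous class stands in for a mocking framework:

import static org.junit.Assert.assertEquals;
import java.util.Arrays;
import java.util.List;
import org.junit.Test;

// Invented client-side interface for the remote itinerary web service.
interface ItineraryService {
    List<String> flightsFor(String bookingRef);
}

// Invented business layer under test; it depends only on the interface.
class BookingManager {
    private final ItineraryService service;
    BookingManager(ItineraryService service) { this.service = service; }

    String summarise(String bookingRef) {
        return service.flightsFor(bookingRef).size()
                + " flight(s) on booking " + bookingRef;
    }
}

public class BookingManagerTest {
    @Test
    public void summarisesFlightsWithoutTouchingTheRealService() {
        // Mock at the client boundary: no network, no SOAP stack, so the test runs fast.
        ItineraryService mock = new ItineraryService() {
            public List<String> flightsFor(String bookingRef) {
                return Arrays.asList("LHR-SYD", "SYD-AKL");
            }
        };
        assertEquals("2 flight(s) on booking ABC123",
                new BookingManager(mock).summarise("ABC123"));
    }
}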

Continuous Integration
  • I am a fan of Continuous Integration. Use something like FinalBuilder, set up triggers so that your code is compiled on every check-in, run all your unit tests on every build, have an installer built automatically, and push the installer out to a clean VM on every build as well (making sure that it extracts, installs, and runs correctly). At each company where I have set this up, it has taken a week and saved a year. One of the most expensive parts of software development is QA, and when QAers don't have a reliable, trustworthy platform to test on, developers stop believing that the bugs are "real bugs"... they suspect the QAer has just hit an environment set-up issue, and often this is the case. It truly surprises me how much money companies are willing to lose on badly set-up, under-resourced QA environments before they finally sort things out and become productive.
Architectural Overkill
  • Don't get bogged down by architecture and buzzwords. Nearly every company I have worked for has had some "glory project" where a framework/architecture is designed to revolutionise the way the company writes code. There are two problems with this. First, architectures designed by architects are too complex and abstract; architectures need to come from developers, based directly on business needs... trust me on this one. Second, architectures need to be maintained as technologies are updated and replaced. I have never once seen a proprietary architecture/framework actually used successfully in a large organisation... they tend to be pipe-dreams that are sold to management but have little useful content for developers (and can actually stifle productivity).
  • Don't get carried away with new techniques such as dependency injection and inversion of control. Dependency injection is nice, and there are some great frameworks out there (check out Seam for Java and the Unity application block for .Net); however, the frameworks and related hype can far outweigh the actual point of the technique. Dependency injection is just the concept of assigning objects at runtime... for example, creating an instance of a text-stream writer instead of a database writer, and passing that reference to the consuming component. That's about it. Inversion of control is basically the same concept, but the decision about which instance to instantiate is made from another context (perhaps a unit test, a manager class, or even an XML configuration file). A minimal sketch follows below.
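To show how small the concept really is, here is a minimal dependency-injection sketch in Java. Every name here is invented for illustration; nothing is taken from Seam or Unity:

import java.io.PrintStream;

interface OutputWriter {
    void write(String line);
}

// Writes to a text stream (e.g. the console or a file).
class TextStreamWriter implements OutputWriter {
    private final PrintStream out;
    TextStreamWriter(PrintStream out) { this.out = out; }
    public void write(String line) { out.println(line); }
}

// Stand-in for a database-backed writer.
class DatabaseWriter implements OutputWriter {
    public void write(String line) { /* INSERT INTO log ... */ }
}

// The consuming component never decides which writer it gets;
// the instance is assigned at runtime by whoever constructs it.
class ReportGenerator {
    private final OutputWriter writer;
    ReportGenerator(OutputWriter writer) { this.writer = writer; }
    void run() { writer.write("report complete"); }
}

public class Main {
    public static void main(String[] args) {
        // Dependency injection: the decision lives here, not in ReportGenerator.
        // Inversion of control just moves this decision into another context,
        // such as a unit test, a factory, or an XML configuration file.
        ReportGenerator generator = new ReportGenerator(new TextStreamWriter(System.out));
        generator.run();
    }
}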
Watch your Language(s)
  • Don't go crazy with complicated language constructs. Certainly, you can do a lot with annotations, XML configuration, dependency injection, and so on, and your code will be more elegant and concise, but chances are it is harder to read, took just as long to write, and fewer people will be able to maintain or extend it. So where is the real gain here? (See the sketch below for a before/after.)
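As a hypothetical before/after (all names invented): both halves of the snippet below wire up the same dependency, but the "clever" version needs an annotation, reflection, and a scan loop, while the plain version is a single constructor call:

import java.lang.annotation.*;
import java.lang.reflect.Field;

@Retention(RetentionPolicy.RUNTIME)
@interface Inject {}

class Greeter {
    String greet() { return "hello"; }
}

class CleverClient {
    @Inject Greeter greeter; // populated by reflection below
}

public class Wiring {
    public static void main(String[] args) throws Exception {
        // Clever: scan the fields, honour the annotation, assign via reflection.
        CleverClient clever = new CleverClient();
        for (Field field : CleverClient.class.getDeclaredFields()) {
            if (field.isAnnotationPresent(Inject.class)) {
                field.setAccessible(true);
                field.set(clever, field.getType().newInstance());
            }
        }
        System.out.println(clever.greeter.greet());

        // Plain: one line, no framework, no surprises.
        Greeter greeter = new Greeter();
        System.out.println(greeter.greet());
    }
}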
Software Development Processes
  • Don't micromanage software teams. If you are using Scrum (check out this gem of a Scrum parody), then great, but don't let the project manager rule the team (if he/she does, then you don't have Scrum, you have conventional software development). The importance of self-organising teams is underrated and overlooked. The problem with micromanagement is that the team ends up working on project planning, estimates, and meetings rather than actually coding. A scrum master really is just someone to keep the meetings on track, to report outwards on the team's behalf, and to prioritise the next stories in the sprint. There is still very much a role for project management in Scrum... the same role as for traditional project managers: to ask whether there are any problems or impediments within the team, and to fix those impediments as pragmatically as possible.
  • Everyone in a software development group must be professional. This includes the IT support staff, the BAs, the PMs, etc. Without formal tertiary training in software development, the team is unfortunately reduced to the lowest common denominator. This might sound high and mighty, but software is hard to write and most projects fail. There is a reason for this: software is a hard topic to master, and the risk of failure grows as the team size increases. Having untrained people in the team - adopting a "let's just see how this goes" attitude - is a recipe for disaster. I guarantee that the teams who design planes or perform surgery all have years of study behind them.
  • Be careful of the term "agile". Agile doesn't mean being simplistic; it means doing the simplest, most logical thing because the business doesn't yet know which way it wants to go. If you know the system must meet certain performance characteristics, support multi-threading, or support a particular set of business functions and/or complex workflows, don't ignore those requirements for the sake of being agile and "taking small steps". Take steps appropriate to your skill level and judgement.
  • I am incredibly wary of any technology or process that is hailed as the solution to guarantee the success of a project. A small, competent and fluid team of developers can make nearly anything work using any process or technology, whereas a technology or development process will fail with 100% certainty if the people behind it are not competent. It's all about getting what you paid for.
Writing the Good Stuff
  • Design patterns work. Use them. But don't get sucked in by every pattern book out there; stick to the highly rated ones (the GoF book, Fowler's Patterns of Enterprise Application Architecture, etc).
  • Stay consistent with your code. Don't use every language construct just because you can. If you do, your code will look bad, and probably be harder to debug.
  • Refactoring is useful, but it is always better to write things correctly the first time. This might involve thinking about code before you write it, spending time on design, writing pseudo-code, or building prototypes (and then happily throwing them away).
  • I like the idea of writing as little code as possible. If there is a way to use existing code, an existing API, etc, then do it. Spend time searching around and play with existing libraries (the Microsoft Application Blocks are great, for example), but at some point you will need to call it a day and just write what you need. As long as you write your code in a modular fashion, you should be able to swap it out when someone points out the library that you should have used all along (see the sketch below).
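A sketch of what "modular enough to swap out" might look like. NaiveCsvParser and friends are invented; the point is the seam, not the parsing:

import java.util.Arrays;
import java.util.List;

// Hide the home-grown parser behind a small interface, so a proper
// library can replace it later without touching the callers.
interface CsvParser {
    List<String> parseLine(String line);
}

// Quick home-grown version: good enough to ship today.
class NaiveCsvParser implements CsvParser {
    public List<String> parseLine(String line) {
        return Arrays.asList(line.split(","));
    }
}

public class Importer {
    private final CsvParser parser;

    Importer(CsvParser parser) { this.parser = parser; }

    public static void main(String[] args) {
        // When someone points out the library you should have used all
        // along, write a second CsvParser that delegates to it, and swap
        // it in here - the one place a parser is constructed.
        Importer importer = new Importer(new NaiveCsvParser());
        System.out.println(importer.parser.parseLine("a,b,c"));
    }
}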
Project Estimates
  • This idea is taken from Joel Spolsky and I like it a lot. Record your estimated hours to complete a task, along with the estimates of anyone who disagrees with you. Once the task is complete, record the actual hours against those estimates, and keep doing this for every task. The more history you build up, the more weight you will carry when discussing estimates in the future. It often surprises me that BAs and PMs with very little software development experience will argue over how many hours should go into an estimate. I don't argue with my dentist when he tells me that it will cost $400 to remove a tooth, or with my lawyer when she tells me that I need to keep records of all income earned by my family trust. I figure they went to university for a reason and know better than I do. Why are software engineers not given the same respect?
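
To make that concrete, here is a minimal sketch of such a track record in Java. All class and method names are invented, not taken from anything Spolsky describes:

import java.util.ArrayList;
import java.util.List;

class EstimateRecord {
    final String task;
    final double estimatedHours;
    double actualHours; // filled in once the task is done

    EstimateRecord(String task, double estimatedHours) {
        this.task = task;
        this.estimatedHours = estimatedHours;
    }
}

public class EstimateTracker {
    private final List<EstimateRecord> records = new ArrayList<EstimateRecord>();

    EstimateRecord estimate(String task, double hours) {
        EstimateRecord record = new EstimateRecord(task, hours);
        records.add(record);
        return record;
    }

    // Average of actual/estimated across completed tasks: 1.0 means you
    // estimate perfectly, 2.0 means tasks take twice as long as you say.
    double accuracyRatio() {
        double totalRatio = 0;
        int completed = 0;
        for (EstimateRecord r : records) {
            if (r.actualHours > 0) {
                totalRatio += r.actualHours / r.estimatedHours;
                completed++;
            }
        }
        return completed == 0 ? 1.0 : totalRatio / completed;
    }

    public static void main(String[] args) {
        EstimateTracker tracker = new EstimateTracker();
        EstimateRecord task = tracker.estimate("Add airline schedule feed", 16);
        task.actualHours = 24; // recorded once the task is complete
        System.out.println("Accuracy ratio: " + tracker.accuracyRatio()); // 1.5
    }
}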