.NET Code Quality in 20 Minutes per Day

Code quality becomes a concern when defects seem to come out of nowhere. There are code reviews, and then there are code reviews. Testability is another code quality parameter: unit tests and their code coverage are indicators of code quality, and a high-quality set of unit tests gives the confidence to make changes to the code. However, we are not covering testing here.

What causes code quality issues

  • Lack of awareness across teams about what code quality means and about coding principles like SOLID.
  • Many projects treat reviews as a box to tick: the process is remembered just before delivery, and then the team scrambles to become compliant.
  • In other cases, once the project has gone live the core team is dismantled, and the support engineers who take over are scared of touching code written by someone else. They make changes with some explanation, and the reviewer, equally scared of taking risks, approves the changes without really doing a code quality check. They also wrap the changed code block in comments carrying the defect number.
  • Naming conventions are treated as a matter of choice. I wrote about naming conventions and code readability quite some time back. In the old days most developers used Hungarian notation, prefixing variables to indicate the data type, UI control type and so on. Microsoft dropped all of this at the very beginning of .NET, yet many years later we still see people using Hungarian notation. Junior developers tend to mimic the coding styles of seniors from old projects, and this habit dies hard. So in one project you can see code styled differently by different developers, and technical leads buy the argument that this is a trivial issue as long as the code works. However, in a modern IDE like Visual Studio, which can immediately flag a wrong assignment, such abbreviated prefixes are simply unnecessary, and they have a huge impact on the readability of the code.
  • In some projects, for whatever reason, no one bothers about code quality. Developers are either unaware, pressed by timelines, or take liberties because tech leads are too busy to review the code. The only quality gate left is the black-box testing executed by the test team.

Fixing code quality issues

  • Learn the SOLID principles of coding and why they are important.

    Of these principles, the one that impacts code quality most in the long term is SRP, the Single Responsibility Principle. Developers often struggle with what counts as "single". An easy test: things that change together should stay together. Another is to look at it from a testing perspective: does this block of code need to be tested separately? Can something go wrong while executing these lines? If so, extract the block into a separate method. SRP applies to methods, and it extends to class and component design as well.

    For a given process, the preferred coding style could be a top-level method that defines the process flow, with a separate method for each step. This way the code gets split at least at each conditional statement, which also brings down the cyclomatic complexity mentioned in the code metrics section. A minimal sketch of this style appears after this list.

  • Code Metrics is one of the most useful tools built into Visual Studio. It is not as powerful as tools like NDepend, but it is extremely useful for day-to-day tracking of where the code is likely to have quality issues. Two key parameters to watch are lines of code (LoC) for any entity such as a class or method, and cyclomatic complexity (CC) for any method. Typically, any method with LoC around 30-40 or CC around 10 must be reviewed and simplified through refactoring.
  • Code Analysis is another built-in tool of Visual Studio, an integrated version of the FxCop that many folks used with earlier versions of VS and .NET. Code Analysis depends on a set of rules, categorized into groups like Performance, Globalization, Design, Security and Maintainability. Many Microsoft-defined rule sets are available out of the box in Visual Studio, but teams can customize the rules applicable to their project. For example, if the application is developed for only a single language and culture, the team can decide to ignore the globalization rules. Code Analysis can be configured to run on every build, and each rule can be set to raise a warning or an error when it is violated.
  • Improve readability using StyleCop, a tool distributed free of cost by Microsoft. StyleCop works much like Code Analysis, using a predefined set of rules, and it can be integrated with Visual Studio through the StyleCop extension from the gallery.
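
To make the SRP and cyclomatic complexity points above concrete, here is a minimal C# sketch. The order-processing domain and all class and method names are invented for illustration; the point is the shape of the code, not the domain.

    // A top-level method that reads like the process flow itself;
    // each step is a small, separately testable method, so the
    // cyclomatic complexity of every method stays low.
    using System;

    public class Order
    {
        public int Id { get; set; }
        public decimal Amount { get; set; }
    }

    public class OrderProcessor
    {
        public void Process(Order order)
        {
            Validate(order);
            Save(order);
            NotifyCustomer(order);
        }

        // Each step changes for its own reason, per SRP.
        private void Validate(Order order)
        {
            if (order == null || order.Amount <= 0)
                throw new ArgumentException("Order is invalid.");
        }

        private void Save(Order order)
        {
            // Persistence goes here; it can now be changed and
            // tested without touching validation or notification.
        }

        private void NotifyCustomer(Order order)
        {
            Console.WriteLine("Order {0} confirmed.", order.Id);
        }
    }

    public static class Program
    {
        public static void Main()
        {
            new OrderProcessor().Process(new Order { Id = 1, Amount = 100m });
        }
    }

Every method here stays far below the thresholds from the code metrics bullet (LoC around 30-40, CC around 10), and each step can grow or change without dragging the others along.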

How to include the code quality check in your daily routine

  • Select the Code Analysis rule set suitable for your project and make running code analysis at the end of every build the default for all developers. Running code analysis only after a lot of code has been written has a negative impact on developers: it normally shows so many errors that it creates the impression the code needs to be rewritten. Running the analysis on every build instead gives developers an early warning before they check the code in to source control, which helps the code stay high quality. Where a violation is a conscious decision rather than an oversight, record it in the code itself; see the sketch after this list. Time required on every build: about 2 minutes. Assuming there are 5 major check-ins per day in your source control, the total time taken is about 10 minutes per day.
  • Run Code Metrics at least once a day across the whole code base. This helps the leads identify where the team is writing complex code or large code blocks. Code Metrics is a comparatively time-consuming process and needs a successful build. Long methods and complex code are easy smells for an individual developer to catch, and developers keep refactoring and simplifying the code once they understand how it helps in the long run. Typically a team takes about one iteration to realize the importance of cyclomatic complexity; the first aha moment comes with the first change request, whatever its reason. Time required every day for a medium-sized project: about 10 minutes to run the metrics and export the results to MS Excel.
  • Advanced techniques:
    1. Use a commercial tool like ReSharper or CodeRush
    2. Use a build server and a custom build definition to run code analysis, code coverage and code metrics, and publish the results to a SonarQube server
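
When the analysis flags a rule the team has consciously decided to waive, as in the single-language example above, the deviation can be recorded in the code itself with the SuppressMessage attribute instead of being silently ignored. A minimal sketch; the rule chosen and the justification text are just examples:

    using System.Diagnostics.CodeAnalysis;

    public static class Messages
    {
        // Suppresses exactly one Code Analysis rule on this member
        // and records the reason where every reviewer can see it.
        [SuppressMessage("Microsoft.Globalization",
            "CA1303:DoNotPassLiteralsAsLocalizedParameters",
            Justification = "Application ships in a single language.")]
        public static void ShowWelcome()
        {
            System.Console.WriteLine("Welcome!");
        }
    }

Note that SuppressMessage takes effect only when the CODE_ANALYSIS compilation symbol is defined, which Visual Studio does automatically when code analysis is enabled for the build.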

Universal Phonetic Script Needed

I tweeted just a few days back (Twitter: @hemant_sathe) about a Tamil minister changing the spelling of his name by removing the zhagram and replacing it with l: Azhagiri became Alagiri. The primary reason for this is that Tamil has three different pronunciations of L, and there is no way to represent this in English, or I should rather say in the Latin script. So Tamil people started using "zh" for the third pronunciation, which is not at all intuitive for a non-Tamil person; as a result most of the country pronounces Tamil names containing "zh" differently. It is an altogether different matter that Tamil and other South Indian people spell most of their names differently from the rest of India. I had a friend whose father's name was spelled Santhirasekaran, which otherwise would have been spelled Chandrashekharan. I have seen four different spellings of Tyagaraya (of the famous T. Nagar in Chennai) on the four nameplates of shops in the same building. The zhagram problem, however, is quite genuine.

Are the people crazy or is the script crazy

At first I laughed at these crazy ways of spelling names, but then I wondered: is it really the Tamil people who are funny, or the script in which they have to write their names? The answer is obvious. Despite being among the most popular languages in the world, English, and for that matter many European languages, still uses the same old Latin script, and this script has serious limitations. Though some languages like German use it quite strictly, almost as a phonetic script, you cannot escape the fact that the script is limited. Most Indian scripts, on the other hand, are almost fully phonetic: you speak what you write. In English, by contrast, all of these spellings have almost the same pronunciation for the "red" part: red, raid, read, greyed.

Non-phonetic scripts are a big hurdle in mobile computing

Speech to text is still an emerging technology and is not as widely popular as it should have been by now, primarily because with scripts like Latin most of the effort and intelligence is wasted on finding the right context of a word and then using the appropriate spelling; the tool needs a huge dictionary to parse through. A phonetic script for English, and for other languages with the same problem, could take today's computing to new heights. Imagine: no more spellings. No more spelling bee contests. Just vacate a large portion of your brain for something more important than remembering spellings. What is more, it would give a tremendous boost to managing all the computers across the world with verbal commands, removing the need for keyboards and making gadgets more portable than ever. It is not a simple journey, but it is achievable. Despite all the attempts to popularize computers and internet access, it was the mobile phone, and that too the voice call, that made the revolution in today's India.

One universal script or multiple scripts

Are all phonetic scripts good enough for today's computing needs? Not really. Scripts like Devnagari need to reverse the order of glyphs for the short "ee" and the short "oo", and some languages need glyphs written one below the other. I would also go so far as to say that scripts like Chinese or Japanese are way too crammed. For today's modern languages we need a script that is as loose as Latin and as phonetic as possible, which means coming up with a new script or modifying current ones. Is a universal script possible? It may be. But it is very difficult to implement, because come what may, there will be some part of the world where a particular letter has a variant the script does not cover; scripts also run in different directions (left-to-right, right-to-left, vertical), and we may end up with thousands of sound glyphs. What we can do is restrict the number of scripts and ask multiple languages to start moving towards a unified script. History has seen such attempts succeed across the world. Some language fanatics may get hurt in this process, but remember, we are not discarding the language, just a script. We can also club scripts based on style and satisfy egos. For example, Devnagari, Gujrati, Bengali and so on are similar scripts, and we could have a combined set of glyphs for them. The same can be done for the four South Indian scripts, leaving just two to three scripts across India while still retaining all 20+ languages.

So I am eager to see this funny script I am using disappear from the world and be replaced by a better one. How about you?

Follow-up on competency

My colleague Deepak Sharma wrote two good follow-up articles on competency building. The one I liked most was the can do/will do analysis of team members. To quote from the blog:

Here “Can do” refers to the employee’s qualification to do the job and “Will do” refers to the employee’s motivation to perform. The result is four alternatives:

  • Can Do/Will Do – Ideal Situation. The employee is fully qualified and doing the job as desired. The Manager should motivate and incentivize the employee suitably.
  • Can’t Do/Will Do – In this case, the employee is willing to put in the effort but is ill-equipped skill-wise to do the job. This suggests a competency gap, and Training will help in this case.
  • Can Do/Won’t Do – In this case, employee has all required competencies to complete the job but still is not performing the job as desired. This shows a Motivational problem and counselling will help in this case.
  • Can’t Do/Won’t Do – In this case, the employee is lacking in both skills and motivation. Employer needs to weigh the options of counselling the employee versus the success of such counselling. The result could be Job in Jeopardy situation. As Kenneth Cooper notes in his book, Attitudes cannot be developed, only counselled.

It is interesting to note that the Can Do/Can’t Do dimension of this model is competency-based, while the Will Do/Won’t Do dimension is not, since it deals with motivational attitude.

I am wondering how to counsel the Can Do + Won’t Do guys.