title: Software maintenance is an anti-pattern
url: https://18f.gsa.gov/2016/02/23/software-maintenance-is-an-anti-pattern/
hash_url: e6f00d9b489f69b28250cd43e33b96d2
<p>In his 1977 book “A Pattern Language: Towns, Buildings, Construction”,
Christopher Alexander defined ways to architect physical spaces to
enhance and support how people interact. This concept was very
influential in software development, leading to the popularization of
software <em>design patterns</em>, which identified ways in which we construct
software that are mostly independent of a programming language. These
are patterns of how software components behave and interact — ways to
effectively solve common problems.</p>
<p>This book also inspired the idea of an <em>anti-pattern,</em> which is a common
response to a problem that appears to solve the issue, but is actually
ineffective and even counterproductive. This term has grown to include
social patterns of human interaction, as well as software patterns.</p>
<p>Software maintenance is the common practice where, after a software
project is “complete,” a small, often part-time team or a single
developer ensures critical upgrades and fixes serious bugs, with limited
improvements, as time allows. This is commonly referred to as Operations
&amp; Maintenance, or by its acronym, O&amp;M. Software is never complete in its
first release, so devoting ongoing staff to sustain a piece of software
is a necessary practice. However, at some point, when the audience for a
particular piece of software is no longer growing, it’s appropriate to
reduce staff devoted to sustaining a software project.</p>
<p>Governments often fall into two anti-patterns when sustaining software:
equating the &quot;first release&quot; with &quot;complete&quot; and reducing
sustaining staff too early; and mismanaging staff reductions when a
reduction in budget is appropriate. To address the latter
anti-pattern, managers need to rethink how they approach spending their
O&amp;M budget.</p>
<h2 id="what-happens-in-private-industry?">What happens in private industry?</h2>
<p>In the software industry, software is not considered complete when it’s
first released. The first release is the very beginning of the journey.
In successful software projects, the first version is always released
with open issues, deferred features, and a roadmap (or at least ideas)
of how it can and should be improved in the future. The team will
immediately start developing new features and fixing bugs and typically
will get <em>larger</em> over time, not smaller. Investing more to increase the
size of the team is justified in the private sector since companies will
remain focused on a growing market with growing revenue.</p>
<p>When the market stops growing and becomes saturated or declines and
revenues decrease, the investment in the software will be reduced to be
as small as possible. There’s an awareness that the software is on life
support, expected to die a slow death as customers become increasingly
dissatisfied.</p>
<h2 id="maintenance-invites-technical-debt">Maintenance invites technical debt</h2>
<p>When a maintenance team is staffed with a skeleton crew, the team does
not have the time or authority to do user research or validate that
software is effectively meeting the needs of users. Instead, a support
team queues up a list of issues that are ranked based on negative
impact. The impact is typically assessed by the level of complaints and
measured by the expense of staffing the customer support line. It’s
assumed that “fixing” the bug will lead to a lower volume of complaints.</p>
<p>We know from experience that a complaint is often the tip of the
iceberg, reporting a symptom rather than the cause. When software is in
active development, we might look at a collection of a dozen bugs and
realize that altering the design of the feature will be a less expensive
approach and lead to increased customer productivity and ease of use.
Fixing any one or two bugs can seem like the right solution in the
short term, but often doesn’t actually lead to a decrease in customer
complaints, or worse, can lead to the “whack-a-mole” phenomenon, in which
a developer fixes one bug only to cause another to pop up.</p>
<p>The software is then allowed to grow old until a full rewrite or
replacement is needed.</p>
<p><strong>It’s precisely this model of maintenance that has created the negative
legacy of government software systems we face today.</strong></p>
<h2 id="a-different-model-of-software-operations">A different model of software operations</h2>
<p>Government software is different. There’s no profit motive. The market
is fixed (or fluctuates slowly with population or employment changes).
We need a model where we can create software that thrives when the
number of users and usage remains constant. We need to be able to reduce
costs when the software meets the needs of the people it serves, yet
detect when changes are needed.</p>
<p>To create a new, productive pattern, we’ll still need monitoring that
detects signs of trouble that could lead to outages, and to keep up with
regular software upgrades for security concerns. However, we need to
take a completely different approach to bug reports and customer
complaints.</p>
<p>To keep teams from constantly plugging leaks while missing the reason
the boat is sinking, we need to apply the principles of agile
development and user-centered design to how we work on software after
the initial release. Instead of a small team working constantly to fix
small bugs, an entire team should work in bursts of activity. A team
might work on a few software projects, focusing on a different project
each quarter, depending on the data.</p>
<p>The key point is that a full cross-disciplinary team would devote its
complete attention to the software for a sustained interval. Periodic
discovery activities would be followed by a burst of implementation,
then validation.</p>
<p>The discovery activities could take place once per year or every six
months or could be triggered by a significant change in system health or
usage metrics. The team would look at:</p>
<ul>
<li> <strong>User need.</strong> Who is using the product? Are they the same target users? Who is not using the product but might find it useful? What other products or services are they using?</li>
<li> <strong>Competitive analysis.</strong> What exists in the marketplace that is similar to our solution? If there’s a clear competitor with market traction and a great user experience, the team should perform a cost analysis and, if comparable with current costs, consider buying the solution.</li>
<li> <strong>Usability testing</strong> with a sampling of current users.</li>
<li> <strong>Backlog triage</strong> of customer complaints, bug reports, and system health indicators — cross-referenced with the results of usability tests and usage metrics. Recommend key improvements and changes that would address underlying issues. Prioritize critical concerns.</li>
<li> <strong>Net Promoter Score</strong> or another happiness metric to evaluate the effectiveness of the software solution.</li>
</ul>
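<p>The post describes these triggers only qualitatively. As an illustrative sketch (not part of the original piece), a team’s monitoring could encode the “trigger discovery on a significant metric change” idea roughly as follows; the function names and thresholds here are hypothetical choices, not a prescribed standard:</p>
<pre><code class="language-python">def net_promoter_score(ratings):
    """Standard NPS: percent promoters (9-10) minus percent detractors (0-6),
    given 0-10 survey ratings. Returns a value in [-100, 100]."""
    if not ratings:
        raise ValueError("need at least one rating")
    promoters = sum(1 for r in ratings if r >= 9)
    detractors = sum(1 for r in ratings if r <= 6)
    return 100.0 * (promoters - detractors) / len(ratings)

def should_trigger_discovery(baseline_usage, current_usage, nps,
                             usage_drift=0.20, nps_floor=0.0):
    """Flag a discovery burst when usage drifts more than 20% from its
    baseline, or when the happiness metric falls below a floor.
    Both thresholds are illustrative and would be tuned per service."""
    drift = abs(current_usage - baseline_usage) / baseline_usage
    return drift > usage_drift or nps < nps_floor
</code></pre>
<p>For example, a service whose monthly active users fell from 1,000 to 700 (a 30% drift) would be flagged for a discovery cycle even if its happiness metric were healthy.</p>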
<h2 id="potential-impact">Potential impact</h2>
<p>We don’t know if this new approach will be successful, but we do know
that the old approaches don’t work. We need alternate techniques to
sustain software services at moderate budget levels so that we don’t
simply re-create the failures of the past when digital services that
meet the needs of today slowly become obsolete as the world changes
around us.</p>
<p>This is particularly important as the pace of change is increasing,
along with the interconnectedness of new government services that are
increasingly delivered as web applications. In the ’70s and ’80s, we
could build software that would essentially function as designed for
5-10 years. In the ’90s, with the growth of the web and changes in how
people use technology, that timeframe narrowed to 3-5 years. Now web
applications need more frequent updates. As we move from a paper-based
society to one that is increasingly connected and as our population
increasingly relies on mobile devices and online services for their
routine needs, we’ll need to make periodic improvements to keep our
services relevant and up to date.</p>
<p>There will always be some software systems that need a dedicated team
focused on continuous improvement. However, we also anticipate that
government will always have a need for software systems that serve a
stable audience of users with needs that might not change in a year,
though they will change at some point, and we need to notice and respond
to that change.</p>
<p>The potential impact of this kind of model will be to continue to reduce
costs while delivering higher value to the people who use our services.
If you see your agency follow this anti-pattern for O&amp;M, talk to <a href="mailto:inquiries18f@gsa.gov">our
transformation services team</a> about
helping to change how you sustain software.</p>