Design Guidelines

Apart from principles, these are guidelines for approaching software design. A good design spans the following areas.

Coding Styles

A good design builds on a good coding style. I wrote a separate section of my guides for it.

Division of Labor

Code is best designed so that:

  • internal parts can be developed independently
  • the internal details of other classes need not be understood; the interface is sufficient (see the sketch below)
  • test code is kept separate from the source code, regardless of test organization
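
A minimal sketch (the Stack class and its test are illustrative assumptions, not a prescribed design): callers and tests rely only on the public interface, so the internals can change freely, and the test code lives in a separate module:

    # stack.py -- source module; push/pop/__len__ form the public interface.
    class Stack:
        def __init__(self):
            self._items = []              # internal detail; callers never touch it

        def push(self, item):
            self._items.append(item)

        def pop(self):
            return self._items.pop()

        def __len__(self):
            return len(self._items)

    # test_stack.py -- test code kept apart from the source module.
    from stack import Stack

    def test_push_then_pop_returns_last_item():
        s = Stack()
        s.push(1)
        s.push(2)
        assert s.pop() == 2 and len(s) == 1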

Delegation

Delegation is simply having a class/structure/module/interface rely on another class/structure/module/interface. For example, a sort function relies on a compare function in order to sort.

Judicious delegation facilitates code reuse, as the sketch after this list shows. Examples:

  • the compare function is reusable with other data types.
  • the sort function is usable for arbitrary sort orders.
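
A minimal sketch of such delegation (insertion_sort and the compare functions are illustrative assumptions): the sorter makes no ordering decisions itself; it delegates them to the compare function, so the same sorter serves other data types and arbitrary orders:

    # insertion_sort delegates all element ordering to the compare function.
    def insertion_sort(items, compare):
        result = list(items)
        for i in range(1, len(result)):
            j = i
            # compare(a, b) < 0 means a orders before b, as in C's qsort.
            while j > 0 and compare(result[j], result[j - 1]) < 0:
                result[j], result[j - 1] = result[j - 1], result[j]
                j -= 1
        return result

    def ascending(a, b):
        return (a > b) - (a < b)          # works for ints, strings, tuples, ...

    def descending(a, b):
        return ascending(b, a)            # arbitrary order via another compare

    print(insertion_sort([3, 1, 2], ascending))      # [1, 2, 3]
    print(insertion_sort(["a", "b"], descending))    # ['b', 'a']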


Approaches

  1. Small public-facing interface with a clear contract.
  2. Encapsulated implementations.
  3. Class inheritance.
  4. Subtyping (polymorphism). The sketch below combines approaches 1, 2, and 4.
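
A minimal sketch (Shape, Circle, and Square are illustrative assumptions): a small abstract interface with a clear contract, encapsulated implementations, and subtype polymorphism at the call site:

    from abc import ABC, abstractmethod
    import math

    class Shape(ABC):                     # 1. small public-facing interface
        @abstractmethod
        def area(self) -> float:
            """Contract: return the area as a non-negative float."""

    class Circle(Shape):                  # 2. implementation details encapsulated
        def __init__(self, radius):
            self._radius = radius

        def area(self) -> float:
            return math.pi * self._radius ** 2

    class Square(Shape):
        def __init__(self, side):
            self._side = side

        def area(self) -> float:
            return self._side ** 2

    def total_area(shapes):               # 4. subtyping: accepts any Shape
        return sum(shape.area() for shape in shapes)

    print(total_area([Circle(1.0), Square(2.0)]))    # pi + 4.0 ~= 7.14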


Deployment

Package code so that it can be deployed and reused easily, for example as a versioned library or module.

Test Driven

Software can contain errors of varying depth and breadth, including:

  • Functional errors
  • Performance errors
  • Deadlock
  • Race conditions
  • Boundary errors
  • Buffer overflow
  • Integration errors
  • Usability errors
  • Robustness errors
  • Load errors
  • Design defects
  • Versioning and configuration errors
  • Hardware errors
  • State management errors
  • Metadata errors
  • Error-handling errors
  • User interface errors
  • API usage errors
  • and more

Hence, all code should be designed to be testable by various tools and approaches, which leads to the development process of choice: test-driven development, sketched below.
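
As a minimal test-driven sketch (clamp and its tests are illustrative assumptions), the tests are written first to pin down the contract, then the implementation is written to make them pass; the suite then doubles as a regression and boundary check:

    import unittest

    def clamp(value, low, high):
        """Implementation written after the tests below, to make them pass."""
        return max(low, min(value, high))

    class ClampTest(unittest.TestCase):
        def test_value_inside_range_is_unchanged(self):
            self.assertEqual(clamp(5, 0, 10), 5)         # functional

        def test_boundary_values_are_kept(self):
            self.assertEqual(clamp(0, 0, 10), 0)         # boundary
            self.assertEqual(clamp(10, 0, 10), 10)

        def test_out_of_range_values_are_clamped(self):
            self.assertEqual(clamp(-1, 0, 10), 0)
            self.assertEqual(clamp(99, 0, 10), 10)

    if __name__ == "__main__":
        unittest.main()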


Degree of Testing

  1. Compiler - catches type errors, malformed statements, and other code-to-program translation errors.
  2. Static Analysis - catches known code problems via analysis built on cumulative insights and data-driven rules (see the sketch below).
  3. Manual/Formal Verification - manual testing and verification of outcomes; certification is granted for compliance.
  4. Automated Unit Testing - automated tests, run as a full regression suite or cherry-picked.
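
As a minimal sketch of levels 1-2 (assuming a static type checker such as mypy is available), type annotations let analysis catch a mistake before the program runs at all:

    # average.py -- a static analyzer such as mypy flags the call below before
    # the code ever runs; plain Python fails only with a TypeError at run time.
    def average(numbers: list[float]) -> float:
        return sum(numbers) / len(numbers)

    average("12345")    # mypy: incompatible type "str"; expected "list[float]"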


Decision Making

  1. Who does which test?
    • Developer? Doing what? Functional and boundary value analysis testing?
    • Tester? Doing what? Black-box/white-box testing? Penetration testing?
    • Quality Assurance team? Monitoring and management? Managing bugs?
    • Customer? Reporting bugs?
  2. When to test?
    • Before development? Smoke test?
    • During development? Unit test? Regression testing? Performance testing?
    • After development? End-to-End (E2E) testing? Performance testing?
    • Before shipping?
  3. How far should we test?
    • Define an acceptable coverage threshold (see the sketch after this list)?
    • Critical path tested?
    • Method coverage?
    • Statement coverage?
  4. When to stop testing?
    • Are the tests delivering the intended overall value?
    • Do the tests give sufficient confidence in the quality assurance / guarantees?
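
As a minimal sketch for question 3 (assuming the third-party coverage.py package; test_stack is the test module from the earlier Division of Labor sketch), statement coverage can be measured and held to an agreed threshold:

    # Assumes coverage.py is installed: pip install coverage
    import coverage

    cov = coverage.Coverage()
    cov.start()

    import test_stack                        # loads the code under test
    test_stack.test_push_then_pop_returns_last_item()

    cov.stop()
    percent = cov.report(show_missing=True)  # prints per-file statement coverage
    assert percent >= 80.0, "coverage fell below the agreed 80% threshold"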

That's all for the design guidelines.