IMPORTANT NOTE: This document has not yet passed the implementation phase. To see the latest approved version, go to PLATFORM_STANDARDS_GENERAL.

General Platform Standards

This document contains information about the general standards for analysing, designing, implementing and testing the different kinds of tasks.

Task kinds

Currently we have five task kinds, each with different requirements for what should be done in each of the four phases - analysis, design, implementation and testing. Note that the examples given were created according to the old standards and do not fully reflect the new ones. Some tasks may belong to several categories.

Coding task

These are tasks that are related to code - adding new functionality, improving features, refactoring, etc. The different phases' requirements are as follows:

  • Analysis - contains a brief overview of the task and information about the required functionality, the expected results, and ideas for implementation and demonstration, as well as links to related tasks.
  • Design - describes the technologies that will be used to meet the task's requirements. It should contain initial tests, libraries needed, a rough algorithm explanation, class diagrams, etc. (an initial test is sketched after this list).
  • Implementation - describes what has been done. There should be a link to the changeset(s) of the commit(s) where the modifications were made. An explanation of the changes should be provided: write down the added functionality or improvements that the new code brings to the project, explain which parts of the source you have added/edited and how, and link the classes/packages/modules that you have created/modified.
  • Testing - includes writing user documentation, release documentation (where applicable), manual test cases in Testlink, executing test cases and reporting bugs.
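
As an illustration of what an initial design-phase test might look like, here is a minimal sketch. It assumes JUnit as the test framework; TaskNameParser and stripRevision are hypothetical names invented for the example, with a stub included so the sketch compiles.

    import org.junit.Test;
    import static org.junit.Assert.assertEquals;

    // Design-phase test: written before the real implementation exists,
    // it pins down the expected behaviour of the proposed feature.
    public class TaskNameParserTest {

        // Minimal stub standing in for the class the design proposes
        // (hypothetical; the real names come from the task's design).
        static class TaskNameParser {
            String stripRevision(String taskName) {
                // Drop a trailing revision suffix such as "_R0".
                return taskName.replaceAll("_R\\d+$", "");
            }
        }

        @Test
        public void stripRevisionRemovesTheRevisionSuffix() {
            TaskNameParser parser = new TaskNameParser();
            assertEquals("PRO_LIB_INSPECTOR",
                         parser.stripRevision("PRO_LIB_INSPECTOR_R0"));
        }
    }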

Coding tasks can be split into the following subkinds, which have specific requirements for the different phases:

  • External feature (visible from the user):
    • Analysis should include a specification (a draft of a specification diagram is recommended).
    • Design should provide a manual testing scenario and initial auto tests. If the code exists, the design should state what is to be added/changed. When adding a new library feature, use case tests should be provided (an example is sketched after this list). It is recommended that you write some demos and an outline of your design.
    • Implementation should follow the design and make sure all features work and all tests pass. During the process more tests should be added where needed.
    • Example - PRO_LIB_INSPECTOR_R0
  • Researching a technology or library
    • Analysis should state what needs to be researched - what will the new technology/library solve and why it is needed.
    • Design should do the actual research. You can experiment with things but you should not pollute the main source - use the new libraries in a copy of the project or in another branch.
    • Implementation should present the written results/conclusions of your research, demo code, tutorials and how-tos, etc.
    • Example - S2S_DEPLOY_TECHNOLOGIES_R0
  • Internal feature (not directly visible)
    • Analysis should state what the new feature should provide.
    • Design should include use case tests, samples, demos and a design outline.
    • Implementation should follow the design, make sure the required functionality is achieved and add tests where needed.
    • Example - BASE_PERSISTENCE_COMMONS_R0
  • Structure changes (refactoring)
    • Analysis should state what needs to be changed (and why).
    • Design should explain the changes to be made and how the problems can be fixed.
    • Implementation should do the actual refactoring and describe the changes made.
    • Example - PRO_LIB_CORE_COMMONS_R1
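
As an illustration of a use case test for a new library feature, here is a minimal sketch that exercises the feature the way client code would. JUnit is again assumed; PersistentStore is a hypothetical class invented for the example, with a stub included so the sketch compiles.

    import java.util.HashMap;
    import java.util.Map;

    import org.junit.Test;
    import static org.junit.Assert.assertEquals;

    // Use case test: drives the new feature through its public API,
    // documenting how a client of the library is expected to use it.
    public class PersistentStoreUseCaseTest {

        // Minimal stub for the proposed library feature (hypothetical).
        static class PersistentStore {
            private final Map<String, String> data = new HashMap<String, String>();

            void put(String key, String value) { data.put(key, value); }
            String get(String key) { return data.get(key); }
        }

        @Test
        public void storedValueCanBeReadBack() {
            PersistentStore store = new PersistentStore();
            store.put("greeting", "hello");
            assertEquals("hello", store.get("greeting"));
        }
    }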

Bug Fix

Bug fixes are unplanned tasks. They represent different kinds of unwanted application behavior - missing functionality, incorrect functionality, errors, etc. Bug tasks should be presented in Trac as "BUG_TASK_NAME". They do not have the usual workflow and do not go through the four phases like the other tasks. The workflow for bug fixes is still not clear and will be described later.

Document

Document tasks require the creation/improvement of different documents. In most cases, these documents are auxiliary to other tasks. The most common type of document we use is a wiki page, but the result may also be a Google doc or something else. The content of the documents varies and may include text, diagrams, media files, spreadsheets, etc.

  • Analysis - contains document requirements (type, structure and contents of the document).
  • Design - provides an outline of the document with the sections that it contains and a description of each section's content.
  • Implementation - contains a link to the document(s) and a brief overview. An explanation of the things added/changed should be provided.
  • Testing - currently there is no testing phase for document tasks.
  • Example - PLATFORM_DEPLOYMENT_BUILD_ECLIPSE_R1

Setup

The result of these tasks is the hardware/software setup of different computer appliances that will be used for executing other tasks. These include website, wiki, developer platform setup, etc.

  • Analysis - states what the requirements for this appliance are - both hardware and software. For example, some of the community server's hardware requirements are hard disk space and bandwidth, while the software ones include a running web server, security considerations, etc.
  • Design - describes which computer appliance will satisfy the requirements, how it will be set up and what technologies will be used.
  • Implementation - describes how the appliance was set up. Links to the result (a new server, some wiki pages, etc.).
  • Testing - currently there is no testing phase for setup tasks.
  • Example - SCS_MACHINE_SETUP_R1

Maintenance

Maintenance tasks should keep servers, important documents, code, etc. in perfect working condition. These tasks have revisions in every iteration.

  • Analysis - covers the current issues of the server/document/code and suggestions for improvement. Should also contain a list of trivial actions that have to be done in every revision.
  • Design - explains what should be done to meet the requirements and links the tools that will be used, algorithms, diagrams and whatever else is needed for an easy implementation.
  • Implementation - consists of the trivial actions done in every maintenance revision and the improvements listed in the design. The implementation steps should be described, and a link should be provided to any documents maintained.
  • Testing - currently there is no testing phase for maintenance tasks.
  • Example - INTERNAL_BACKLOG_MAINTENANCE_R2

Task results

The results of the tasks should be described in the Implementation section of the task's page. Depending on the task kind, results can be:

  • Documents (wiki pages, Google docs, etc.)
  • Source code (with unit tests)
  • Diagrams (or the design section of the same task)

The expected results should be described in the Task Result section of the analysis and linked in the implementation section. In different revisions the same results can be linked, but the changes made in the current revision should be described.

Reviewing

Requirements

Here are the requirements for each of the phases (in general). For specific requirements, see the other standards pages linked below.

  • Analysis
    • all the sections of the Analysis template filled in (with no fake information)
  • Design
    • for coding tasks:
      • unit tests
      • diagram(s)
    • for document tasks:
      • outline of the document's structure
  • Implementation
    • description of the changes made
    • for coding tasks - links to the corresponding changesets

Scoring

Here are general guidelines for what score a given task should get. Specific guidelines for the separate phases can be found in the other documents (linked below).

  • Score 1 - the phase reviewed does not comply with the standards in the current document or the other standards documents.
  • Score 2 - the phase reviewed complies with the standards for the most part but has some things missing, is unclear, or might be misleading to the designer/implementer.
  • Score 3 - the phase reviewed complies with the standards but could be structured better and could be considerably more complete.
  • Score 4 - the phase reviewed complies with the standards and is clear, well-structured and sufficient.
  • Score 5 - the phase reviewed complies with the standards, is clear, well-structured and sufficient, and there is nothing more to be added - even a person who is not deeply involved in the project can understand it.

Other standards

This document provides only general rules. For specific ones, take a look at:
PLATFORM_STANDARDS_ANALYSIS
PLATFORM_STANDARDS_DESIGN
PLATFORM_STANDARDS_CODE
PLATFORM_STANDARDS_AUTO_TESTS
PLATFORM_STANDARDS_MANUAL_TESTS

Naming conventions for wiki pages

When creating a new wiki page, comply with the following conventions:

  • write the page name in uppercase, with words separated by underscores (e.g. PLATFORM_STANDARDS_GENERAL)
  • end task pages with a revision suffix - _R0, _R1, etc. (e.g. PRO_LIB_INSPECTOR_R0)
  • name bug task pages "BUG_TASK_NAME", as described in the Bug Fix section above

Comments

Your comment here --developer-id@YYYY-MM-DD