Changes between Version 30 and Version 31 of PLATFORM_STANDARDS_ANALYSIS


Timestamp:
01/20/09 18:48:54
Author:
boyan

  • PLATFORM_STANDARDS_ANALYSIS

[[BackLinksMenu]]
Removed (v30):
 * The purpose of the analysis is to give as much of the information needed for designing and implementing the task as possible.
 * The analysis of the first revision of each task should contain a brief overview of the whole task.
 * Think well about the task! You should figure out very well what the aim (result) of the task should be and how it should be reached.
 * List the main features that have to be present in the final result of the task. These are used later to verify the design and implementation.
 * Stick to the current revision of the task, but keep an eye on the whole task's progress, and stay alert for possible smells.
 * It is strongly recommended to discuss unclear aspects of the task with other team members.
 * It is advisable to include some rough implementation ideas.
 * Analyses must be kept in the task's page, in a chapter called Analysis.
 * Every analysis must be made by one developer and one QA, and reviewed by another developer/QA.
 * You can use the PageTemplates/TaskPageTemplate#Analysis wiki page template for your analysis.
 * An example of an analysis that meets the requirements is GLOBAL_SPEC_STRUCTURE_R0.
Depending on the task type, the content of the analysis section varies.
 * Coding tasks - the analysis of these tasks should describe the aim of this revision.
  * Overview - contains an explanation of the task. This explanation should present what will be done from the user's point of view and what the visible result of the task will be.
  * Requirements - here the requirements for the result should be listed - what the desired functionality is, what this element should have, etc.
  * Task result - in this section the result of the task should be listed. For these tasks it will be code, but depending on the task it may also contain diagrams, graphics, etc.
  * Implementation idea - a brief overview of implementation methods, suggested algorithms, etc.
  * Related - here related tasks that might be useful for solving this task should be listed. You may also point to previous and next revisions of the same task, useful external links, etc.
  * How to demo - explains how the result of this task may be presented to the team or an external person.
Added (v31):
[[PageOutline]]

Removed (v30):
 * Bug Fix - the analysis of bug fixes should explain what the problem is, in which module, and when it appears.
  * Overview - explains the malfunction in a sentence. Describe what happens and in what circumstances.
  * Requirements - here the requirements for the result should be listed - what the expected functionality is. Please give detailed requirements; the bug can be a result of misunderstanding the original task requirements.
  * Task result - in this section the result of the task should be listed. For these tasks it will be maintained code, but depending on the task it may also contain diagrams, graphics, etc.
  * Implementation idea - a brief overview of implementation methods, suggested algorithms, etc. You may point to the broken part of the code here.
  * Related - here related tasks that might be useful for solving this task should be listed. You should list tasks that depend on this bug (malfunction, lack of a feature, etc.). You should also point to related use cases.
  * How to demo - depending on the task requirements, explain the problem in Implementation or Design, and present the solution in the Implementation section. Link useful test cases.
Added (v31):
'''IMPORTANT NOTE:''' This document is being worked on. To see the last approved version, go to [wiki:PLATFORM_STANDARDS_ANALYSIS?rev=30].

Removed (v30):
 * Document - the analysis of document tasks should state the aim of the document.
  * Overview - the overview of a specific revision should state what will be done in this revision.
  * Requirements - requirements for the documents that will be the result:
   * What sections should they contain?
   * Are these internal or external documents?
   * Optional - who has to read these documents, and in what period?
   * Is it an important document?
  * Task result - should point to the revision's target documents - wiki pages, diagrams, etc.
  * Implementation idea - you may research related examples and share them in this section.
  * Related - here related tasks that might be useful should be listed. You should link previous revisions of this task. You may list useful external links too.
  * How to demo - link the document and give an idea about highlighting parts of the implementation.
 * Setup
  * Overview - explains the role of this appliance in the project and what it will be used for.
  * Requirements
   * What services will be available, and what depends on them?
  * Task result
   * In the most common case the result is a set-up appliance and/or setup and backup scripts.
  * Implementation idea - a brief overview of implementation methods, suggested hardware requirements, etc.
  * Related - here related tasks that might be useful for solving this task should be listed. You should list tasks that depend on this appliance.
  * How to demo - list how this setup can be presented.
Added (v31):
= How to write analyses =
This page contains requirements and guidelines for writing good analyses. Here you will find information about what the structure of an analysis should be and how to approach the different kinds of tasks. Rules for reviewing are provided as well.

Removed (v30):
 * Maintenance
  * Overview - explains what should be revised, in a sentence.
  * Requirements
   * Are there impediments related to this maintenance in the backlog?
   * What are the steps that are repeated on each maintenance?
   * Log requirements.
  * Task result
   * In the most common case the result is a maintained appliance/tool and/or setup and backup scripts.
  * Implementation idea - a couple of sentences about the implementation.
  * Related - here related tasks that might be useful for solving this task should be listed. You should also list previous revisions and the setup of this tool/appliance. External links may be useful too.
  * How to demo - give instructions about presenting the results, depending on the task requirements.
Examples of good analyses:
 * [wiki:SCHEDULE_MAINTENANCE_R1#Analysis]
 * [wiki:SCS_INFORMATION_R0#Analysis]
 * [wiki:PRO_LIB_INSPECTOR_R0#Analysis]
Added (v31):
== General information ==

Removed (v30):
Review and super review scores are 1-5. A score of >=3 points means the analysis passed the review and the task can be designed. The review should judge whether the analysis answers the question "what has to be done this revision" and whether it meets the analysis requirements. Passed analyses are not edited; only a super review may force refactoring of an analysis.
Added (v31):
The analysis should be kept in a section of the task's wiki page called Analysis. When creating the task's page, use the TaskPageTemplate - it provides a backbone structure for the analysis (a minimal sketch of it is shown after this list), which consists of the following:
 * Overview - a brief description of the task (not more than a couple of sentences). In the first revision of the task, it should provide a brief overview of the whole task as well as what should be done in this revision. Otherwise, it should state what the current revision of the task is about.
 * Task requirements - probably the most important section. It should include a list of requirements that the task must fulfill. These are used for reviewing the design and implementation phases, so they should be chosen very carefully. It is recommended to write them as a bullet list. Make sure the requirements are fulfillable within the effort of the current revision of the task. When not sure, mark the less important requirements as optional.
 * Task result - a short phrase or sentence stating what the task result should be (for example, "Source code"). If there is more than one result, it is recommended to list them as bullets.
 * Implementation idea - a brief description of how you would approach the task if you were to implement it. If you don't have a clear implementation idea, then you shouldn't be writing the analysis of this task.
 * Related - links to similar tasks and previous revisions of this task, as well as useful external resources - everything that might be helpful in the design or implementation phase.
 * How to demo - step-by-step instructions on how to demonstrate the task at the sprint closing.

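For orientation, here is a minimal sketch of what that backbone might look like in wiki markup. The exact contents of the TaskPageTemplate may differ - treat this as an illustration of the structure, not as the template itself:

{{{
= Analysis =
== Overview ==
A couple of sentences about what this revision of the task is about.

== Task requirements ==
 * First requirement (used later to verify the design and implementation).
 * Second requirement (mark the less important ones as optional).

== Task result ==
Source code (or documents, a set-up appliance, etc.).

== Implementation idea ==
A brief description of how you would approach the implementation.

== Related ==
 * Links to similar tasks, previous revisions and useful external resources.

== How to demo ==
Step-by-step instructions for demonstrating the result at the sprint closing.
}}}
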
When you write an analysis, you should:
 * Think well about the task and figure out what its aim is and how it can be reached.
 * Give as much of the information needed for designing and implementing the task as possible.
 * Stick to the current revision of the task, but keep an eye on the whole task's progress and stay alert for possible smells.
 * Discuss unclear aspects of the task with other team members.
 * Fill in all the sections of the analysis that are mentioned above.
 * Fill in the task name in the ticket query at the top of the page.

== Task kinds ==
Depending on the task kind, the content of the analysis varies. This section explains what the content for the different kinds of tasks should be.

=== Coding task ===
The analysis of coding tasks should clearly describe what the expected functionality after this revision is. The different sections should contain the following:
 * Overview - an explanation of what will be done, what the visible result of the task will be, and what the specific library/module related to this task is about. It is recommended to include a code block with the description of the task as found in the [source:/manage/sched/sophie2_wbs.py] file (see the sketch after this list).
 * Task requirements - a list of required features, or prerequisites to be fulfilled in order to have these features.
 * Task result - in most cases the result of these tasks is source code, but it may also contain diagrams, graphics, etc.
 * Implementation idea - a brief explanation of implementation methods, suggested algorithms, etc.
 * Related - a list of related tasks, previous revisions of this task, useful external links, etc. (same as all other task kinds).
 * How to demo - an explanation of how to present the new functionality to the team (same as all other task kinds).
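
For example, the Overview of a coding task analysis might embed the WBS description like this. The task name and description text below are invented for illustration - copy the real entry from [source:/manage/sched/sophie2_wbs.py]:

{{{
== Overview ==
This revision adds the first working version of the feature (your explanation here).

Description from sophie2_wbs.py (hypothetical entry - replace with the real one):
  SOME_TASK: "An invented one-line description of what the task should achieve."
}}}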

There are subkinds of coding tasks with specific requirements for the analysis. These are:
 * '''External feature (visible to the user)''' - should provide a specification. A draft of a specification diagram is recommended. Adding a manual testing scenario is a task requirement.
 * '''Researching a technology or a library''' - should state what the new technology/library will solve and why it is needed.
 * '''Internal feature (not directly visible)''' - should state what the new feature will provide.
 * '''Structure changes (refactoring)''' - should state what should be improved and why.

==== A sample approach ====
When analysing coding tasks, the following resources might be useful:
 * The [source:/manage/sched/sophie2_wbs.py] file - find the task name and take a look at the description, the dependencies, the total effort, the effort for the current revision, and when the next revisions are scheduled.
 * The [wiki:TASK_INDEX] page - take a look at the efforts and the schedule in a more convenient form than in the WBS file.
 * The googledocs - skim through the specifications - although incomplete and somewhat inaccurate, they can provide you with some guidelines.
 * The source code - take a look at the existing functionality and try to think about what to add and how to improve the existing code.
 * The team - ask someone who has done a previous revision or has more experience in that area of the code.

=== Bug Fix ===
The analysis of bug fixes should explain the problem - what it is, where it is, and when it happens. The different sections should contain the following:
 * Overview - a brief explanation of the bug - what happens and in what circumstances; a link to the broken code should be provided if possible.
 * Task requirements - a list of requirements that, if fulfilled, will fix the bug - what the expected functionality is. Give details - the bug can be a result of misunderstanding the original task requirements.
 * Task result - in most cases the result of these tasks will be fixed source code, but it may also contain diagrams, graphics, etc.
 * Implementation idea - suggest a fix if possible; if not, a general approach for fixing the bug.
 * Related - a list of related tasks and tasks that depend on this bug (that suffer from the malfunction, lack of features, etc.), tests that break, use cases.
 * How to demo - instructions on how to prove the bug is fixed - usually explaining the bug, presenting the solution, and running tests.

=== Document ===
The analysis of document tasks should describe the type, structure, and contents of the document. The different sections should contain the following:
 * Overview - a brief description of the contents of the document or of what should be changed in this revision.
 * Task requirements - an explanation of the document type (wiki page, googledoc, etc.), its structure and contents, the document's visibility (internal/external), and whether it is an important document (one that should be listed in the backlog or on the [wiki:ImportantDocs] page).
 * Task result - a link to the document(s) that will be created.
 * Implementation idea - ideas on where to start from.
 * Related - a list of related documents, previous revisions of this task/document, useful external links, etc. (same as all other task kinds).
 * How to demo - usually showing the document, describing its structure, and highlighting the most important information.

=== Setup ===
The analysis of setup tasks should describe what will be set up and where. The different sections should contain the following:
 * Overview - an explanation of the role of this appliance in the project, its usage, and its benefits.
 * Task requirements - a list of the new services that should be available, and requirements about their functions and dependencies.
 * Task result - in most cases a set-up appliance and/or setup and backup scripts.
 * Implementation idea - a brief overview of implementation methods, suggested hardware requirements, etc. (same as all other task kinds).
 * Related - a list of related tasks and tasks that depend on this appliance.
 * How to demo - an explanation of how to present this setup (same as all other task kinds).

=== Maintenance ===
The analysis of maintenance tasks should describe what will be maintained. The different sections should contain the following:
 * Overview - a brief explanation of what should be revised (not more than a couple of sentences).
 * Task requirements - a list of specific things to look at, for example:
  * Are there impediments related to this maintenance in the backlog?
  * What are the steps that are repeated on each revision of this task?
 * Task result - in most cases a maintained appliance/tool/document and/or setup and backup scripts.
 * Implementation idea - a sample approach for the implementation (same as all other task kinds).
 * Related - a list of links to related tasks, previous revisions of this task, etc. (same as all other task kinds).
 * How to demo - an explanation of how to present this task to the team (same as all other task kinds).

== Examples ==

== Reviewing ==

The reviewer should check for the presence of the following things:
 * Required things:
  * Information in all sections of the analysis, according to the descriptions mentioned above.
  * Clarity of expression - no misleading statements that can be misunderstood or misinterpreted.
 * Recommended things:
  * A clear and consistent structure that is easily readable (bullet lists, etc.).
  * A code block with the description from the WBS file, for coding tasks.
  * A draft of a specification diagram, for external features.

Reviewers should either follow the standards in this document or comment on them in the '''Comments''' section of this page. Scores are in the range 1-5. Here are the rules for scoring an analysis (an example of a motivated review comment follows):
 * Score 1 (fail): The analysis is not structured according to the standards (or only to a small extent), OR the task requirements are incorrect and irrelevant to this task.
 * Score 2 (fail): The analysis is structured according to the standards for the most part but has some things that are unclear or may mislead the designer/implementer.
 * Score 3 (pass): The analysis is structured according to the standards but is too short, and there is a lot more to be added.
 * Score 4 (pass): The analysis is structured according to the standards and provides enough information for the designer and implementer (useful links, a good implementation idea, clear task requirements that are fulfillable within the stated effort, etc.).
 * Score 5 (pass): The analysis is structured according to the standards and there is nothing more to be added - it is so good that even a person who is not quite familiar with the project can do well with the design and implementation.

All reviews should be motivated. A detailed comment about why an analysis fails is required. For score 3, a list of things that could be better should be provided. Comments are encouraged for higher scores as well. Non-integer scores are STRONGLY discouraged. If you give an analysis a score of 3.5, then you probably have not reviewed it thoroughly enough and cannot clearly state whether the analysis is good or not. Once an analysis has been reviewed, it cannot be altered. If you think an analysis is wrong, you should request a super review. Currently all super reviews should be discussed with Milo. Make sure you are able to provide clear arguments for why the analysis is not good before you request a super review.
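
For illustration, a motivated review comment on a task page might look like this (the score, the wording, and the reviewer id are invented):

{{{
 ^Review: 3 (pass). All sections are filled in and the structure follows the
 standards, but the implementation idea is too brief - suggest an algorithm and
 link the previous revision in Related. --reviewer.id@2009-01-20^
}}}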

= Comments =
Removed (v30):
 * boyan:
  * The overview for coding tasks does not always present things from the user's point of view, as stated - there is often implementation information that the user does not know.
  * I don't agree that [wiki:SCHEDULE_MAINTENANCE_R1#Analysis] can be used as a good example of an analysis - the overview section lists the task requirements instead of a more general explanation.
  * The example listed in the beginning (GLOBAL_SPEC_STRUCTURE_R0) should be listed along with the other examples.
  * Different task kinds can be marked as headings instead of listed in a bullet list - this will improve readability.
 * deyan:
  * Currently the implementation idea is given as advice, not as a requirement.
  * For some tasks it may be useful to have a draft of a specification diagram. This should be decided on as a recommended or an optional rule.
  * A recommendation for adding the descriptions from [source:/manage/sched/sophie2_wbs.py] to the overview as "code" might be useful.
   EDIT: adding the descriptions to the overview formatted as code, for example. You can see [wiki:APP_PLUGIN_MANAGER_LIST_R0] for an example.
 * sriggins:
  * Deyan: Could you elaborate on "A recommendation for adding the descriptions from the [source:/manage/sched/sophie2_wbs.py] to the overview as "code" might be useful." please?
Added (v31):
 ^Your comment here --dev.id@YYYY-MM-DD^