IMPORTANT NOTE: This document has not yet passed the implementation phase. To see the last approved version, go to PLATFORM_STANDARDS_ANALYSIS.
How to write analyses
This document contains requirements and guidelines for writing good analyses. It describes the structure an analysis should have, how to approach the different kinds of tasks, and the rules for reviewing analyses.
General information
The analysis of a task is written in a section of its wiki page called Analysis. When creating the task's page, use the TaskPageTemplate - it provides a backbone structure for the analysis, which consists of the following sections:
- Overview - a brief description of the task (not more than a couple of sentences). In the first revision of the task, it should provide a brief overview of the whole task as well as what should be done in this revision. In later revisions, it should state what the current revision of the task is about.
- Task requirements - probably the most important section. It should include a list of requirements that the task must fulfill. These are used for reviewing the design and the implementation phases, so they should be chosen very carefully. It is recommended to write them as a bullet list. Make sure the requirements are achievable within the effort of the current revision of the task. When not sure, mark the less important requirements as optional.
- Task result - a short phrase or sentence describing what the task result should be (for example "Source code"). If there is more than one result, it is recommended to list them as bullets.
- Implementation idea - a brief description of how you would approach the task if you were to implement it. If you don't have a clear implementation idea, then you shouldn't write the analysis of this task.
- Related - links to similar tasks and previous revisions of this task, as well as useful external resources - anything that might be helpful in the design or the implementation phases.
- How to demo - step-by-step instructions for demonstrating the task at the sprint closing.
When you write an analysis, you should:
- Remember the main guideline of this project: Work towards the goal!
- Think carefully about the task and figure out what its aim is and how it can be reached.
- Provide as much of the information needed for designing and implementing the task as possible.
- Stick to the current revision of the task, but keep an eye on the overall task progress and stay alert for possible smells.
- Discuss unclear aspects of the task with other team members.
- Fill in all the sections of the analysis that are mentioned above.
- Fill in the task name in the ticket query at the top of the newly created page.
Task kinds
Depending on the task kind, the analysis content varies. This section will explain what the content for the different kinds of tasks should be.
Coding task
The analysis of coding tasks should clearly describe the functionality expected after this revision. The different sections should contain the following:
- Overview - an explanation of what will be done, what the visible result of the task will be, and what the specific library/module related to this task is about; it is recommended to include a code block with the task's description from the manage/sched/sophie2_wbs.py file.
- Task requirements - a list of required features, or of the prerequisites that must be fulfilled in order to have these features.
- Task result - in most cases, the result of these tasks is source code, but it may also include diagrams, graphics, etc.
- Implementation idea - a brief explanation of implementation methods, suggested algorithms, etc.
- Related - a list of related tasks, previous revisions of this task, useful external links, etc. (same as all other task kinds).
- How to demo - an explanation of how to present the new functionality to the team (same as all other task kinds).
There are subkinds of coding tasks with specific requirements for the analysis. These are:
- External feature (visible to the user) - should provide a specification. A draft of a specification diagram is recommended.
- Researching a technology or a library - should state what the new technology/library will solve and why it is needed.
- Internal feature (not directly visible) - should state what the new feature will provide.
- Structure changes (refactoring) - should state what should be improved and why.
A sample approach
When analysing coding tasks, the following resources might be useful:
- The manage/sched/sophie2_wbs.py file - find the task name and take a look at the description, the dependencies, the total effort, the effort for the current revision and when the next revisions are scheduled (a small helper sketch for locating a task's entry is given after this list).
- The TASK_INDEX page - presents the efforts and the schedule in a more convenient way than the WBS file.
- The googledocs - skim through the specifications - although not complete and somewhat inaccurate, they can provide some guidelines.
- The source code - take a look at the existing functionality and think about what could be added and how the existing code could be improved.
- The team - ask someone who has done a previous revision or has more experience in that area of the code.
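The following sketch illustrates one way to pull a task's entry out of the WBS file while writing the analysis. It is only a hypothetical helper, not part of the project tooling: it makes no assumptions about the internal structure of manage/sched/sophie2_wbs.py and simply searches the file as plain text, and the task name in the example call is taken from the Examples section below as a placeholder.

    # Hypothetical helper (not part of the project tooling): print the lines of
    # the WBS file around the first occurrence of a task name, so the description
    # and effort data can be copied into the analysis page.
    def find_task_entry(task_name, wbs_path="manage/sched/sophie2_wbs.py", context=10):
        with open(wbs_path) as wbs:
            lines = wbs.readlines()
        for index, line in enumerate(lines):
            if task_name in line:
                # Return the matching line plus a few lines of context after it.
                return "".join(lines[index:index + context])
        return None  # Task name not found in the WBS file.

    if __name__ == "__main__":
        print(find_task_entry("BASE_HALOS"))  # placeholder task name

Adjust the context size or the search string as needed; reading the entry directly in the file works just as well.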
Bug fix
The analysis of a bug is filled in when the bug is reported. It should contain:
- Overview - a description of the bug - what happens, where, and under what circumstances; a link to the broken code.
- Task requirements - a list of requirements that, if fulfilled, will fix the bug, and what the expected functionality is. Give details - the bug may be a result of a misunderstanding of the original task requirements.
- Task result - in most cases, the result of these tasks is fixed source code, but it may also include diagrams, graphics, etc.
- Implementation idea - suggest a fix if possible; if not, outline a general approach for fixing the bug.
- Related - a list of related tasks and tasks that depend on this bug (that suffer from broken functionality, missing features, etc.), tests that break, and links to use cases.
- How to demo - instructions for proving that the bug is fixed - usually explaining the bug, presenting the solution, and running tests.
Document
The analysis of document tasks should describe the type, structure and contents of the document. The different sections should contain the following:
- Overview - a brief description of the contents of the document or what should be changed in this revision.
- Task requirements - an explanation of the document type (wiki page, googledoc, etc.), its structure and contents, its visibility (internal/external), and whether it is an important document (one that should be listed in the backlog or on the ImportantDocs page).
- Task result - a link to the document(s) that will be created.
- Implementation idea - ideas on where to start.
- Related - a list of related documents, previous revisions of this task/document, useful external links, etc. (same as all other task kinds).
- How to demo - usually showing the document, describing its structure and highlighting the most important information.
Setup
The analysis of setup tasks should describe what will be set up and where. The different sections should contain the following:
- Overview - an explanation of the role of this appliance in the project, its usage and benefits.
- Task requirements - a list of new services that should be available, requirements about their functions and dependencies.
- Task result - in most cases, a set-up appliance and/or setup and backup scripts.
- Implementation idea - a brief overview of implementation methods, suggested hardware requirements, etc. (same as all other task kinds).
- Related - a list of related tasks, tasks that depend on this appliance.
- How to demo - an explanation of how to present this setup (same as all other task kinds).
Maintenance
The analysis of maintenance tasks should describe what will be maintained. The different sections should contain the following:
- Overview - a brief explanation of what should be revised (not more than a couple of sentences).
- Task requirements - a list of specific things to look at, for example:
- Are there impediments related to this maintenance in the backlog?
- What are the steps that are repeated on each revision of this task?
- Task result - in most cases, a maintained appliance/tool/document and/or setup and backup scripts.
- Implementation idea - a sample approach for the implementation (same as all other task kinds).
- Related - a list of links to related tasks, previous revisions of this task, etc. (same as all other task kinds).
- How to demo - an explanation of how to present this task to the team (same as all other task kinds).
Examples
Please note that these examples have been created according to the previous version of the standards and do not fully reflect the new ones.
BASE_HALOS_R0 - clear, concise and easily readable.
AUDIO_CONTENT_R0 - although a bit long, the overview gives a clear idea about what the task is about.
S2S_TESTING_SERVER_R0 - clear and well structured.
Reviewing
The reviewer should check for the following things:
- Required things:
- Information in all sections of the analysis according to the descriptions mentioned above (both the general ones and those regarding the specific kind of task).
- Clarity of expression - nothing misleading that can be misunderstood or misinterpreted.
- Effort is considered when listing the task requirements - they should be achievable within the given effort.
- Recommended things:
- Clear and consistent structure that is easily readable (bullet lists, etc.).
- A code block with the description from the WBS file for the coding tasks.
- A draft of a specification diagram for external features.
Reviewers should either follow the standards in this document or comment on them in the Comments section of this page. If you state that a task does not comply with the standards, point to the requirements that are not met. Scores are in the range 1-5. Here are the rules for scoring an analysis:
- Score 1 (fail): The analysis is not structured according to the standards (or only to a very small extent) OR the task requirements are incorrect and irrelevant to this task.
- Score 2 (fail): The analysis is structured according to the standards for the most part but has some things that are missing, unclear or may mislead the designer/implementer.
- Score 3 (pass): The analysis is structured according to the standards but is too short and there's a lot more to be added.
- Score 4 (pass): The analysis is structured according to the standards and provides enough information for the designer and implementer (useful links, a good implementation idea, clear task requirements that are achievable within the stated effort, etc.).
- Score 5 (pass): The analysis is structured according to the standards and there is nothing more to be added - it is complete enough that a person who is not very familiar with the project could do well with the design and the implementation.
All reviews should be motivated. A detailed comment about why an analysis fails is required. For a score of 3, a list of things that could be better should be provided. Comments are encouraged for higher scores as well. Non-integer scores are STRONGLY discouraged. If you give an analysis a score of 3.5, you probably have not reviewed it thoroughly enough and cannot clearly state whether it is good or not. Once an analysis has been reviewed, it cannot be altered. If you think it is wrong, you should request a super review. Currently, all super reviews should be discussed with Milo. Make sure you are able to provide clear arguments for why the analysis is not good before you request a super review.
Comments
Your comment here --developer-id@YYYY-MM-DD