Sophie 2.0 Process

Overview

To produce the highest quality product (both internal and external), we have defined process guidelines in this section. (This is related to SCRUM's mandatory definition of the "done" state.)

This will help us meet the following challenges:

  • Internal quality (quality of code)
  • External quality (quality of the visible product)
  • Documentation reading (many things are written, few things are read)
  • Reducing unnecessary process overhead

SCRUM

We are continuing to use SCRUM (info at http://en.wikipedia.org/wiki/SCRUM). What is new:

  • Becoming much more strict (and unfortunately increasing the process overhead).
  • Trying to have much higher quality.
  • The sprint backlogs will be on wiki pages (for example, ITERATION_01).
  • The four states of a story/task are analyzing, designing, implementing, and testing.
  • Each passage between states requires a review.

We also maintain an internal backlog, which tracks:

  • Impediments backlog
  • Documentation backlog
  • Smells
  • Open Questions

Here is the document: Backlog

Teams

We have defined the following teams to improve performance and management while developing Sophie 2. The team diagram, showing how the developers are split into teams, is located here: http://sophie2.org/trac/browser/manage/sched/teams.odg
The teams are as follows:

  • Analysis team - responsible for analysing the tasks for the current and the next iteration.
  • Base team - responsible for the Platform, Core and Base categories.
  • Main team - responsible for Main and End Product categories.
  • Server Team - responsible for Supporting, SCS and S2S categories.
  • Release Team - responsible for the releases and the reviews of the implementation phases (see PROCESS).

Note: The categories are described in the work breakdown structure: manage/sched/sophie2_wbs.py

Rules

This is a list of rules you have to follow. The list is not final yet, but disobeying these rules may lead to penalties.

General

  1. Work towards the goal!
  • No matter what this document defines, or how precisely your task is specified, there is no way to define exactly what should be considered good for reaching our goal and what should be considered bad.
  • You have to work towards the goal.
  • When you work on something, you have to understand why and how it is related to our goal.
  • You have to try to drive the whole team towards the goal (no one alone is capable of reaching it).
  2. Be honest!
  • You have to present the situation to the leader as is.
  • Do this, even if it does not look good.
  • For example, if you have no idea how to make something, don't say "almost done" to the leader.
  3. Take initiative!
  • If you see a way to make something better for the team, don't wait - propose it.
  4. Look around!
  • If you see problems in the code, in the product, in the organization, or whatever, announce them.
  • Try to keep an eye on what other people do (either for things you can learn from, or for things they are doing poorly).
  • Try to get a global vision of what is happening in our team.
  5. Seek improvements (for yourself and for the team)
  • No matter how good you are, there are always more things you can learn, or skills you can acquire (note that knowledge and skills are different things).
  • Improving and learning is your responsibility.
  • Learning only that this does this and that does that, without trying to understand how things work, will lead you nowhere! Really! Even if you have programmed for 20 years, it is possible to make this mistake.
  6. Seek the most appropriate solution!
  • There is no perfect code. There is no perfect solution (or even if there is it is not reachable).
  • There are no universal rules for good design. A design is good if it works well, is easy to use (for example, clients of a library should not have to write much code), is simple, has good performance, and is modifiable/extensible, testable, etc. Neither the number of design patterns used nor a code convention guarantees this... nothing does. You have to find a design that is good.
  • Hacking around just to make things work is usually a time bomb. It is not acceptable to do this, because saving 1 day this month usually costs 7 days the next month.
  • What we should do is to seek something between just making it work and making it perfect, but closer to perfection than just working.
  • That is, seek a good solution.
  7. Quality is not negotiable! Scope and resources are.
  • If something cannot be done with acceptable quality, given the resources (time) and scope (functionality) that it has, then we will either reduce the scope or increase the resources.
  • We are not allowed to reduce quality!
  8. Accept the challenge!
  • We are not building the next store information system, the next lawyer's web site, or the next database module for big company X.
  • What we are doing is something that only a few companies in the world can do, and even many of those are not good enough at it.
  • There are no instructions on "how to make the next generation desktop publishing software"!

Discipline

  1. You have to try to follow the process as much as possible.
  • Self-discipline is needed.
  • If you don't have enough self-discipline, then this team is not appropriate for you (you should go to a company that forces you to be disciplined by other means).
  2. Our workday starts at 09:30 sharp.
  3. You may work from home or from other locations, but only with the approval of the team leader.
  4. At the end of every working day, you have to write a daily report about what you have done that day. At the end of every iteration, you have to write a monthly report - see the REPORTS page to learn how to write reports.

Task In Time

  1. Integrate continuously.
  • Do not hold on to uncommitted source code for long (and do not prolong the task).
  2. Time box to 150% of the estimate.
  • If you see that you are about to exceed the estimated time, warn the leader immediately.
  • If your leader agrees, and it seems that the task can be completed with a little more time, a small extension may be allowed.
  • A task reaching 150% (or 200% if the leader agrees) of its estimate should be dropped, even if it is 99.9% complete.
  3. Do not start the implementation phase of a task if its design phase has not been reviewed yet!
  • Wait for the review of the design, otherwise your work will be for nothing.
  4. It is recommended that you do not design/implement a task for which you wrote the analysis.

Task States

Each task taken in the current sprint should pass through the following phases:

  • new
    • the task is waiting to be taken, and for someone to determine what should be done
  • analysis started
    • to learn how to write good analysis, see Platform Standards Analysis
    • there should be at least one example of how to do one or more of the things that need to be done
    • it should take no more than 0.25 of the task effort
  • analysis finished
    • it is already determined what should be done
    • the task is waiting for review. (Wait for the review of the analysis before starting the design, otherwise your work can be for nothing!)
  • review
    • the reviewer looks through the analysis to see if it is correct
    • the analysis should follow its template
  • analysis ok: the reviewer decides that the analysis of the task is correct
  • design started
    • the implementer is designing the task, in other words deciding how it should be done
    • it is not compulsory but recommended that the design should be made by the implementer
    • it is recommended that the design should not be made by the analyzer of the task
    • designing is not only getting an idea of how to do something but understanding the concept, step by step
    • for many tasks the design phase should take most of the time
  • design finished
    • it is determined how the task should be done
    • it is designed and is waiting for review.
  • review
    • the reviewer looks through the design to see if it is correct.
    • the reviewer can imagine being the implementer and check whether the task could be implemented by following the design without problems
  • design ok: the reviewer decides that the design of the task is correct
  • implementation started
    • the implementer makes an attempt to complete the task
    • he/she should follow the design strictly
    • he/she should write an Implementation section of the task as the result of the implementation
  • implementation finished: the implementer decides that the implementation of the task is ready
  • review: the reviewer looks through the implementation to see if it is correct
  • implementation ok
    • the reviewer decides that the implementation of the task is correct
    • the implementation section must be filled
  • test started: there should be appropriate tests written for the task
  • test finished: testing state is done, waiting for review
  • review
    • the reviewer looks for missing or wrong tests
    • if any are missing, the reviewer may decide they can be added later
    • if they are too important to postpone, or are wrong, they should be fixed - the review fails
  • test ok
  • notes
    • failing a state (for example, moving back from implementing to analyzing because the implementation failed) requires that the product is reverted to its initial state
    • the transitions of each state are defined in the Workflow diagram (a code sketch of these transitions follows this list)
    • every task has a ticket, available for reading and filtering in the View Tickets section in Trac. For every state of a particular task, you have to go to its ticket and update its status depending on the state you are working on. See SCS_ISSUE_TRACKER_SETUP_R1 for more information about our issue tracking.
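
The states above form a simple linear workflow with a review gate between each phase. Below is a minimal sketch of that workflow as a Java enum with an explicit transition table. It is an illustration only: the type name, the constant names, and the exact failure transitions are assumptions based on the list above, not part of the Sophie 2 code base; the Workflow diagram remains the authoritative definition.

  // Illustrative sketch of the task workflow above; names and failure transitions are assumptions.
  import java.util.EnumSet;
  import java.util.Set;

  public enum TaskState {
      NEW,
      ANALYSIS_STARTED, ANALYSIS_FINISHED, ANALYSIS_OK,
      DESIGN_STARTED, DESIGN_FINISHED, DESIGN_OK,
      IMPLEMENTATION_STARTED, IMPLEMENTATION_FINISHED, IMPLEMENTATION_OK,
      TEST_STARTED, TEST_FINISHED, TEST_OK;

      /** States that may follow this one; a failed review sends the task back a phase. */
      public Set<TaskState> allowedNext() {
          switch (this) {
              case NEW:                     return EnumSet.of(ANALYSIS_STARTED);
              case ANALYSIS_STARTED:        return EnumSet.of(ANALYSIS_FINISHED);
              case ANALYSIS_FINISHED:       return EnumSet.of(ANALYSIS_OK, ANALYSIS_STARTED);    // review passed / failed
              case ANALYSIS_OK:             return EnumSet.of(DESIGN_STARTED);
              case DESIGN_STARTED:          return EnumSet.of(DESIGN_FINISHED);
              case DESIGN_FINISHED:         return EnumSet.of(DESIGN_OK, DESIGN_STARTED);        // review passed / failed
              case DESIGN_OK:               return EnumSet.of(IMPLEMENTATION_STARTED);
              case IMPLEMENTATION_STARTED:  return EnumSet.of(IMPLEMENTATION_FINISHED);
              case IMPLEMENTATION_FINISHED: return EnumSet.of(IMPLEMENTATION_OK, DESIGN_STARTED); // failed review reverts the phase
              case IMPLEMENTATION_OK:       return EnumSet.of(TEST_STARTED);
              case TEST_STARTED:            return EnumSet.of(TEST_FINISHED);
              case TEST_FINISHED:           return EnumSet.of(TEST_OK, TEST_STARTED);            // review passed / failed
              default:                      return EnumSet.noneOf(TaskState.class);              // TEST_OK is terminal
          }
      }
  }

For example, a task in design finished can move either forward to design ok (review passed) or back to design started (review failed).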

Task Tips

  • Implementing a new external feature (visible to the user):
    • analyzing
      • should have a specification, and a manual testing scenario
    • designing
      • write an automatic test or tests that verify that your feature is working (see the test sketch after this list)
      • look at the related code
      • decide what needs to be added
      • if you add a new library feature
        • write use case tests for it
      • you may also write skeleton types (declarations only) and demos
      • write an outline of your design
    • implementing
      • make all the designed things work
      • ensure that all tests pass
      • during the process, add more tests
  • Researching about a technology or library
    • analyzing: specify what needs to be researched
      • what the research will solve
      • what needs to be solved (this is important)
    • designing
      • you can try things, but do not pollute the main source (for example with libraries)
      • if you need to use other libraries, do it in another project (or in another branch)
      • if you don't need other libraries, you can do it in a research package, but you should make sure that you don't introduce warnings, failing tests, etc.
    • implementing
      • you present the written results / conclusions of your research, demo codes, etc.
    • reviewers: 1-2 developers
  • Implementing a new internal feature (a library or similar, not directly visible to the user)
    • analyzing: what the library will provide
    • designing: should include:
      • use case tests
      • samples
      • demo (in some cases)
      • design outline
    • implementing
      • the library should be implemented
      • enough tests should be written during the process
    • reviewers: 2 developers
  • Performing structure changes or refactoring.
    • analyzing: what the issues are
    • designing: understanding the issues, and providing an idea of how to fix them
    • implementing: fixing the issues
    • reviewers: 1-2 developers
  • Writing a specification
    • analyzing: what needs to be specified
    • designing: brief ideas about the subject of the specification
    • implementing: writing the specification exactly
    • reviewers: 1 qa + 1 developer
  • Doing documentation.
    • analyzing: what needs to be documented
    • designing: what it will contain (outline) and understanding of the content
    • implementing: adding the appropriate content
    • reviewers: 1-2 team members
  • External testing (the usual testing task).
    • analyzing: has to state what the focus of the tests will be
    • designing: has to select test scenarios, and make new scenarios if necessary
    • implementing: should apply the testing scenarios and report bugs and recommendations (things that users will probably like/dislike)
    • reviewers: 1 developer
  • Fixing a bug.
    • analyzing: what the bug is. Needs a manual testing scenario (in Trac)
    • designing: a failing automatic test should be present, along with an idea of what is causing the bug, more tests (which detect related internal bugs), and an idea of how it can be fixed
    • implementing: fixing the bug, making the tests pass, and some refactoring
    • reviewers: 1 QA and 1 developer
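
Several of the task types above call for automatic tests during the design phase (new external features, library features, bug fixes). The sketch below is a hedged illustration of the "failing test first" step: BookTitleFormatter and trimTitle are hypothetical names invented for this example, not Sophie 2 API, and JUnit is assumed as the test framework.

  // Hypothetical example of writing the failing test during the design phase of a bug-fix task.
  import static org.junit.Assert.assertEquals;

  import org.junit.Test;

  // Hypothetical class under test, with the bug still present (it does not trim).
  // The name is illustrative; this is not part of the Sophie 2 code base.
  class BookTitleFormatter {
      String trimTitle(String title) {
          return title; // bug: surrounding whitespace is kept; the fix would return title.trim()
      }
  }

  public class BookTitleFormatterTest {

      @Test
      public void trimTitleRemovesSurroundingWhitespace() {
          // Reproduces the reported bug: titles pasted with surrounding spaces should be stored trimmed.
          // This test is expected to fail until the fix is implemented, and to pass afterwards.
          BookTitleFormatter formatter = new BookTitleFormatter();
          assertEquals("My Book", formatter.trimTitle("  My Book  "));
      }
  }

During the implementation phase the fix is made (here, trimming the title), the test turns green, and further tests covering related cases are added as described above.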

Templates for logging specific task types should be added

Roles

  • leader: someone assigned to do the management (default: Milo)
  • implementers: the workers on a given task
  • reviewers: the people assigned to review a task
    • the reviewers and the implementers of a task must not overlap!

Comments

You can put your comments here.
