Trello is a great tool for managing just about anything. It is generic enough to be applied to many situations, but has just enough features to be a powerful tool.
After trying dozens of online project management tools, we’ve settled on Trello for dealing with the vast majority of projects.
This short guide explains how we use Trello when managing software projects with remote clients. We’ll also update the guide as our process evolves.
Our Agile Process
The process we use is lightweight, and pulls together concepts from SCRUM and Kanban methodologies, discards complexity, and adapts to our customers and projects.
We maintain the weekly or bi-weekly rhythm from SCRUM to help schedule work, but forgo the heavy planning and ceremonial demo days, instead holding regular review and planning calls with our customers to keep momentum.
Our review calls involve key stakeholders, developers, and the Product Owner for a project. During a call, we’ll discuss new story cards in the backlog, write and update definitions of done, prioritise them, and schedule them into a Sprint list.
We then pull story cards from the sprint backlog across to the right, as they progress through the stages of the project. We use work-in-progress limits (the maximum number of cards that can be in a list at any one time) from Kanban to ensure the board is fluid, and any bottle-necks can be identified and resolved quickly.
Trello lets you set up a lightweight process easily. A few lists and you’re ready to go. We start any new project board with the following lists. Default work-in-progress limits are set (shown in square brackets), but these may change as the project progresses.
- Ideas / Backlog
- Sprint 
- In Progress 
- Acceptance Testing 
- Accepted 
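Board setup can even be scripted. Below is a minimal sketch that creates the five default lists via Trello’s REST API. The board id, API key, and token are placeholders (yours come from trello.com/app-key), and the WIP limits shown are illustrative only, since ours vary per project.

```python
from urllib import parse, request

# Default lists, left to right. Trello has no native WIP-limit field,
# so we embed the limit in the list name, e.g. "In Progress [3]".
# The numeric limits here are illustrative, not our defaults.
DEFAULT_LISTS = [
    ("Ideas / Backlog", None),  # the backlog is unlimited
    ("Sprint", 10),
    ("In Progress", 3),
    ("Acceptance Testing", 5),
    ("Accepted", None),
]

def list_payloads(board_id):
    """Build one POST payload per list, in board order."""
    payloads = []
    for pos, (name, wip) in enumerate(DEFAULT_LISTS, start=1):
        title = f"{name} [{wip}]" if wip else name
        payloads.append({"name": title, "idBoard": board_id, "pos": pos})
    return payloads

def create_lists(board_id, key, token):
    """POST each list to the Trello API (placeholder key/token)."""
    for payload in list_payloads(board_id):
        qs = parse.urlencode({**payload, "key": key, "token": token})
        request.urlopen(f"https://api.trello.com/1/lists?{qs}", data=b"")
```

Calling `create_lists("<board-id>", "<key>", "<token>")` would populate a fresh board in one go.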
A project is started by adding all the cards that have been identified from our Discovery Workshop. Story cards are added to the Backlog (the first column) in the order of highest value, with adjustments for technical dependency.
The priority in the Backlog list can be changed at any time, and is expected to change as the project develops and our knowledge increases.
The card’s title is used to briefly describe the single unit of value that card delivers:
New users can register an account.
The title should be short and to the point. The description field is used to expand on the details, and describe why the feature is beneficial and valuable.
Except in rare cases, the value should always be expressed from a non-technical point of view, such that it is clear why the feature is being developed. The description should not assume any implementation details.
Typically, this can be expressed in formal “user story” language:
So that I can get personalised news feeds
As a new user
When I visit the website
I want to register a new account
For smaller stories, just the benefits or business value may be given:
So users can get personalised news feeds.
Definition of Done
As part of the first week kickoff meeting, and each subsequent weekly review, story cards from the Backlog are reviewed and updated according to the latest knowledge. Cards can also be discarded during a review.
Before any story can be moved into the Sprint list, it must first have a Definition of Done. This is a list of very specific requirements that must be met for this card to deliver the value and results expressed in the description.
Again, implementation (i.e. how things are done) should be ignored in the definition of done; it should only be concerned with outcomes, though hints can be given for what the customer may expect to see (e.g. “show a message warning about the error”).
For example, our Register Account story card above might have the following acceptance criteria:
- [ ] Register with email address, first name, last name and password
- [ ] Email address must be verified to activate account
- [ ] Password must be at least 5 characters long
- [ ] If email has been used, show a message and send password reset.
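Checklists like this can also be attached programmatically. A sketch using Trello’s checklist endpoints, again with placeholder card id and credentials:

```python
import json
from urllib import parse, request

API = "https://api.trello.com/1"

# The acceptance criteria from the Register Account story above.
CRITERIA = [
    "Register with email address, first name, last name and password",
    "Email address must be verified to activate account",
    "Password must be at least 5 characters long",
    "If email has been used, show a message and send password reset",
]

def checkitem_payloads(criteria):
    """One payload per criterion, unchecked by default."""
    return [{"name": c, "checked": "false"} for c in criteria]

def post(path, params, key, token):
    """POST to the Trello API and decode the JSON response."""
    qs = parse.urlencode({**params, "key": key, "token": token})
    with request.urlopen(f"{API}{path}?{qs}", data=b"") as resp:
        return json.load(resp)

def add_definition_of_done(card_id, criteria, key, token):
    """Create a 'Definition of Done' checklist on a card."""
    checklist = post("/checklists",
                     {"idCard": card_id, "name": "Definition of Done"},
                     key, token)
    for item in checkitem_payloads(criteria):
        post(f"/checklists/{checklist['id']}/checkItems", item, key, token)
```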
Knowing what the definition of done for a card is, we can begin to estimate the effort required to successfully deliver each card.
In the past we’ve used SCRUM style points in a Fibonacci(ish) sequence (1, 3, 5, 8, 13, 21), or t-shirt sizing (s, m, l, xl, xxl), to do this.
We prefer t-shirt sizing because the scale can’t be confused with time. Numeric scales, no matter how abstract, are usually read as units of time (hours, days, etc.). T-shirt sizing avoids this confusion, and is a “good enough” indicator of how much effort a piece of work will require.
As we don’t bill by time, we rely on the t-shirt sizing and Work in Progress limits to agree the scope of work. WIP limits can be adjusted as we begin to calculate average lead and cycle times for our cards.
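Lead and cycle time fall straight out of a card’s movement dates (which Trello records as actions). A minimal sketch of the arithmetic, with hypothetical timestamps:

```python
from datetime import datetime
from statistics import mean

def lead_time(created, accepted):
    """Lead time: from the card entering the backlog to acceptance."""
    return (accepted - created).days

def cycle_time(started, accepted):
    """Cycle time: from a developer pulling the card to acceptance."""
    return (accepted - started).days

# Hypothetical card history: (created, pulled into In Progress, accepted).
cards = [
    (datetime(2015, 3, 2), datetime(2015, 3, 9), datetime(2015, 3, 12)),
    (datetime(2015, 3, 2), datetime(2015, 3, 10), datetime(2015, 3, 16)),
]

avg_lead = mean(lead_time(c, a) for c, _, a in cards)    # 12 days
avg_cycle = mean(cycle_time(s, a) for _, s, a in cards)  # 4.5 days
```

Averages like these are what let us tune the WIP limits with some confidence rather than by feel.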
Once cards have a definition of done and an estimate, they can be moved into the Sprint list. Again, the cards are listed in descending priority, taking into account any dependencies (e.g. a story for users to publish a new post can’t be tackled before the story allowing users to create a post).
Once the sprint begins, a developer pulls a card from the Sprint list into In Progress.
Work continues on that card until it is ready, that is, until the definition of done has been met.
Items from the definition are ticked as they are completed by the developer. Once all criteria have been ticked, the card can be considered ready, and is pulled into the Acceptance Testing list.
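That pull can even be automated. A sketch that reads a card’s checklists and, only if every item is complete, moves the card; the list id and credentials are placeholders:

```python
import json
from urllib import parse, request

API = "https://api.trello.com/1"

def all_done(check_items):
    """True once every checklist item has been ticked ("complete")."""
    return bool(check_items) and all(
        i["state"] == "complete" for i in check_items)

def move_if_ready(card_id, acceptance_list_id, key, token):
    """Pull the card into Acceptance Testing when its checklist is done."""
    auth = {"key": key, "token": token}
    url = f"{API}/cards/{card_id}/checklists?{parse.urlencode(auth)}"
    with request.urlopen(url) as resp:
        checklists = json.load(resp)
    items = [i for cl in checklists for i in cl["checkItems"]]
    if all_done(items):
        qs = parse.urlencode({**auth, "idList": acceptance_list_id})
        req = request.Request(f"{API}/cards/{card_id}?{qs}", method="PUT")
        request.urlopen(req)
```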
Trello’s comments can contain screenshots and attachments just by dragging the files into the comment field.
Comments and Conversations
A story card is a token for conversation— something to trigger discussion around a feature or requirement, rather than just a list of To-Dos.
Throughout the life cycle of a card, notes and conversations are added to the card’s comments. This helps to keep everything related to the card in one place (and out of email). Notifications can be generated by @ mentioning another user.
@marycustomer Looks great, we'll update the pages to match these designs.
Inevitably throughout the process, issues come up which block the progress of a card. This could be an internal (e.g. waiting for content to be delivered) or an external block (e.g. waiting for approval from a third-party API provider). In either case, the card is labelled as blocked and— crucially— the reason for the block is added to the card.
A blocked card in theory cannot be pulled along until it is unblocked. However, for practical reasons (and if the card is not blocking any dependent cards), it can be put into “holding”, or moved back into the Sprint list to open up the pipeline for another card.
It’s perfectly normal (indeed, expected) for the needs and complexity of a card to change as it is developed. New information comes to light through the course of the week; feedback is received; or some initial assumptions were simply wrong.
New criteria should only be added if they are forced by unforeseen circumstances or constraints, and should be added to a separate checklist (e.g. Additional Requirements).
If these changes are substantial enough to affect the original estimate for the card, chat with the customer to determine if the card should be discarded and a new card added to the Backlog. Discarding the card will open up space in the pipeline to pull in other work.
Once a card has been moved to Acceptance Testing, it’s ready for the customer to test. For web apps, this means the feature is available on the staging server; for mobile apps, it means a build has been distributed via Crashlytics (or a similar beta distribution platform) and can be installed by customers and testers on their device(s).
The card now becomes the customer’s— specifically, the Product Owner’s— responsibility. She should check to see that the definition of done has been met.
If the customer is happy that the definition has been met, and the feature is working as expected, she should move the card to the Accepted list. This “signs off” the feature, and allows us to release it (where releasing could be publishing to a production version of the app, or simply marking the story as “complete”).
If something isn’t working as expected— say a bug has been found or not all of the definition of done criteria have been met— then the card should be rejected by the customer.
In this instance, the customer must leave a comment on the card explaining why it has been rejected. The comment should be descriptive enough that the developer can easily recreate the problem, understand what the expectation was, and what is different. For example:
@joedeveloper Email field accepts invalid addresses when registering, e.g. spaces: John@ company. Should show an error message saying "Please enter a valid email address"
The customer then moves the card back to the Sprint list to be picked up by a developer. It can also be labelled with a rejected tag to indicate that it has been rejected.
Once every one or two weeks, usually on a Monday, we’ll hold a short review call over phone or video chat (Google Hangouts, Appear.in, Skype). We rarely hold these reviews in person, as the distances involved often make that impractical.