
Seilevel Requirements Management Tool Evaluation – Part 2

9/22/11 IMPORTANT NOTE AND UPDATE: We have updated the criteria and tool scores — please go to this post for the most current material.

Phase 2 of our requirements management tool research is complete! We have posted the criteria and tool scores in a downloadable Excel document, and we have a whitepaper coming out shortly to share some of the interesting observations about the tools, as well as some of the challenges we faced in this phase of the research.

For a reminder about our research process, you can see our post on phase 1. We did a similar (though less rigorous) study back in 2007, and one thing I will share is that I’m so much happier with the requirements management tools on the market now than four years ago. At Seilevel, we are particularly interested in advanced modeling functionality within the tools to support RML®, and there are a lot of tools that support this relative to our original study. There is also better support now in general for working offline. You can see the results for yourself.

This results list is meant for you to use, so please download it and do what you want with it! One key point to remember: there are both raw scores and weighted scores, and the weighted scores are based on Seilevel's prioritization of the criteria, which may not be right for your organization, so feel free to change the weights. You can also spend some time skimming the comments for the criteria you care most about. Please download and use it however you want; we'd just like credit where appropriate for the results.
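If you want to rework the weighted scores with your own priorities, the arithmetic is simple. Here is a minimal sketch in Python; the tool names, criteria, and numbers are invented for illustration and do not come from the actual spreadsheet:

```python
# Hypothetical example of recomputing weighted tool scores with your
# own criterion weights. All names and values here are made up.

raw_scores = {
    "ToolA": {"modeling": 4, "offline": 2, "traceability": 5},
    "ToolB": {"modeling": 3, "offline": 5, "traceability": 3},
}

# Replace these weights with your organization's own prioritization.
weights = {"modeling": 3, "offline": 1, "traceability": 2}

def weighted_score(scores, weights):
    """Sum of raw score times criterion weight across all criteria."""
    return sum(scores[criterion] * w for criterion, w in weights.items())

for tool, scores in raw_scores.items():
    print(tool, weighted_score(scores, weights))
```

The same calculation is what the spreadsheet's weighted columns do; editing the weight cells there has the same effect as editing the `weights` dictionary above.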

Within the spreadsheet, you will find a few tabs that may be useful to you. The Phase 2 evaluation is on the first tab, and the resulting scores are summarized on the second. We kept the Phase 1 evaluation results in as well, along with a summary of those scores. And while we didn't write full use cases, we did think about them by title/concept, so you can find those on the fifth tab; the last tab shows a pivot table of features by use case, which is part of how we ensured we weren't missing major features we cared about.

Now, a few disclaimers seem appropriate as I'm publishing this for the first time. The results are probably not perfect, but we think they will be useful. By that I mean there are going to be some criteria we unknowingly mis-scored. Some vendors were more helpful than others in providing evaluation copies or demos of their tools, and with those that did not, we had to be creative about our evaluation. In the cases where we only got to see demos, the results are at risk of being less reliable because we didn't get our hands on the features. There will be more on this in the full paper.

We are now beginning phase 3 of our research, in which we will use a select few of the tools on our actual projects. We are hoping to select the tool that works best for us, but it is also exciting because it allows us to really see how the features are supported beyond just demoing them. More on that will come out later in the year!

10 Responses to Seilevel Requirements Management Tool Evaluation – Part 2

  1. ChrisG September 12, 2011 at 2:01 pm #

    Hello,

    Perhaps your moderator can delete this comment once I resolve the issue. Whenever I click on the downloadable Excel link, I am given the option to save a zip file, which is fine. But when I save and then open the zip file, the contents are four folders and an xml file. The folders appear to be files for a web site. None of the folders contain an Excel file. Is something wrong with my download capability? Can you check the link in the posting?

    Thanks – Chris

  2. Lori Witzel September 12, 2011 at 5:29 pm #

    Hey Chris – are you running a version of Excel that cannot create .xlsx files?

    FYI, I haven’t been able to replicate the issue to fix it…but I will happily send you the doc directly.

    You can email me at lori {dot} witzel {at} seilevel {dot} com and I’ll get you fixed up.

  3. Vicki James September 13, 2011 at 11:38 am #

    What a great idea for a product review and example of a quality gap analysis.

    I am confused about the first phase scoring. I am a frequent user of CaseComplete and was surprised to see it did not make the cut to phase 2. How was this determined? I also noticed an inconsistency in the CaseComplete Phase 1 evaluation. If I am reading the scoresheet correctly, CaseComplete was marked as not supporting mocks. This would be incorrect as it does include the ability to create mock screens within the tool. Please let me know if I misunderstood the intent of this column.

    Thank you!

  4. ChrisG September 14, 2011 at 8:05 am #

    Hello –

    What version of HP QC did you evaluate?

    Thanks,
    Chris

  5. Lori Witzel September 14, 2011 at 5:45 pm #

    Vicki and Chris, I will check with Joy and have her respond ASAP. 🙂

  6. Joy Beatty September 16, 2011 at 5:43 pm #

    First of all to Chris – we started with QC 9.2, but then realized we had access to a copy of ALM 11.0, so we updated our results to reflect that. As a side note, we’ll talk more about this in the paper, but this was one of the tools where we had more challenges getting a clean demo copy of it.

  7. Balaji Vijayan September 22, 2011 at 3:23 pm #

    Vicki – I apologize for the delay in responding. Regarding the first phase, we did not look at mockups in close detail. We found that CaseComplete allows users to link mockups to use cases. As for why the tool did not make it to the second round of evaluations, we felt the tool didn't have the breadth of requirements management features and was focused more heavily on requirements definition. In addition, the evaluation was constrained by the time we had available – if you are interested in evaluating CaseComplete against the full criteria, we would love to see the results and will gladly credit you in the papers.

  8. Cobus November 30, 2011 at 4:50 am #

    Thank you for sharing this study. Very interesting. Could you perhaps elaborate on the criteria you used in phase 1 (Definition, Mockup, Management, Agile Specific Tools).

    Thanks

    Cobus

  9. Lori Witzel November 30, 2011 at 11:18 am #

    Hello, Cobus – although Joy Beatty is out of the office for a time, I will see who here can share more details about the criteria, and have them post a response. Thank you for stopping by and commenting!

  10. Balaji Vijayan December 1, 2011 at 10:43 am #

    Cobus – there are no specific criteria used in phase 1 under each of those four categories. The four categories were used to categorize the tools and to see the general capabilities each tool offered.

    If you have further questions please feel free to email me at Balaji.Vijayan at Seilevel.com

    Thanks!
