- OSLC Home: http://open-services.net
- OSLC on Twitter: http://twitter.com/oslcNews
- OSLC-CM Home: http://open-services.net/bin/view/Main/CmHome
- Details on the RTC Implementation: https://jazz.net/wiki/bin/view/Main/ResourceOrientedWorkItemAPIv2
- Carolyn Pampino on the RQM-RTC integration: http://jazz.net/blog/index.php/2009/08/26/sprint-alignment-for-developers-and-testers/
- Scaling Agile with C/ALM eBook: http://www.infoq.com/resource/articles/scaling-agile-with-calm/en/resources/InfoQ-IBM-ScalingAgilewithCALMeBook.zip
Introduction
Deleting information from any document is something to think about twice, maybe three times. But deleting a requirement from a specification is not as simple as deleting a sentence. A requirement is an object that holds not only the specification text but also other information such as name-value pair attributes, history, and link information.
In this document we explain possible approaches to deleting requirements.
Challenges
Sometimes information must be discarded from a project for various reasons. But even in such cases, when specification information is discarded, there are still some risks:
- It may break the integrity of the overall information map.
- The information to be discarded may not be needed in one configuration but may still be critical in others.
- The information to be discarded now may still be an important component of an audit trail.
- The information itself may have no value anymore, but the links it bears may still constitute some value.
So when we decide to delete any information from the requirements base, we should consider several cases, or somebody has to consider them for us in advance.
How to delete requirements from DOORS Next Generation
How can we delete any requirement from DNG? Or what is the best approach to delete requirements from the requirements database?
For the reasons mentioned above, deleting requirements from the GUI doesn’t actually remove the data from the database; it just becomes inaccessible. To keep the integrity intact, any deleted requirement still occupies a placeholder in the database.
Soft Delete of Artifacts
Since a “delete” operation via the GUI doesn’t physically remove the data from the database, yet still makes the data permanently unreachable, we may want to consider a “soft” delete operation instead.
It is a widely accepted approach to organize requirements in modules within DNG. Just as we create and edit requirements within a module, we also tend to execute the delete command in modules. This command is called “Remove Artifact”:
What we are doing with this command is not actually deleting the requirement but removing it from the module, so it will remain accessible among the base artifacts and folders.
Also, by design, there is a concept of requirement reusability, which means the requirement may coexist in other modules.
This is why a requirement removed from a module is not “deleted” by default. However, if the artifact is present in only one module, the “Remove” command offers an option to delete it as well:
If the artifact to be removed doesn’t exist in any other module, this option permanently deletes it.
The permanent deletion of the artifact will still leave a record in the database for the integrity of the data, but it will no longer be possible to retrieve the artifact.
So our recommendation is to remove artifacts rather than permanently delete them. However, there may still be some issues with removing an artifact from a module:
When an artifact is removed from a module, its specification text, name-value pair attributes, history, and all other information remain; only its binding to the module it belonged to is broken. But one more important piece of information gets broken as well: the links to other artifacts. In this example we see that the requirement is linked to another requirement in another module. When the artifact is removed from the module, we lose that link as well, so when we bring it back into the module, it won’t have the link information anymore:
Sometimes our clients prefer this behavior, sometimes not. For those who don’t, we generally recommend an even “softer” delete operation: define a boolean attribute called “Deleted” with a default value of “false”, assign this attribute to all artifact types, and instead of deleting an artifact, simply change its value from “false” to “true”.
Of course, this alone is not enough. We also need to define a filter for every view that excludes requirements whose “Deleted” attribute is set to “true”.
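For teams that also reach DNG programmatically through its OSLC interface (the available APIs are described later on this page), the same “Deleted” filter idea can be applied in queries. The following is only a minimal sketch under stated assumptions: the server URL, project area identifier, and the property URI of the custom “Deleted” attribute are placeholders, the exact query capability URL should be taken from your project’s OSLC services document, and real deployments typically require form- or JAS-based login rather than the basic authentication shown here.
```python
# Minimal sketch: list requirements whose custom "Deleted" attribute is still
# "false" via an OSLC RM 2.0 query. All URLs and the attribute URI are placeholders.
import requests

BASE = "https://jazz.example.com:9443/rm"                        # hypothetical server
QUERY_CAPABILITY = (BASE + "/views?oslc.query=true"
                    "&projectURL=" + BASE +
                    "/process/project-areas/_exampleProjectId")  # hypothetical project

params = {
    # Prefix and property URI of the custom "Deleted" attribute are placeholders;
    # look them up in your project's resource shapes.
    "oslc.prefix": "rm_property=<" + BASE + "/types/>",
    "oslc.where": 'rm_property:_deleted="false"',
    "oslc.select": "dcterms:identifier,dcterms:title",
    "oslc.pageSize": "50",
}
headers = {"OSLC-Core-Version": "2.0", "Accept": "application/rdf+xml"}

resp = requests.get(QUERY_CAPABILITY, params=params, headers=headers,
                    auth=("user", "password"),   # real servers usually need form/JAS login
                    verify=False)                # self-signed certificates are common
resp.raise_for_status()
print(resp.text[:1000])  # RDF/XML result set; parse with rdflib or similar
```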
Hard Delete of Artifacts
Hard delete of artifacts can be understood as deleting an artifact with the option “If the artifact is not in other modules, permanently delete it.” selected.
But this will still not be sufficient if the artifact is used in more than one module. To make sure the delete is a hard one, first remove the artifact from all the modules it is used in.
You can use the “In modules” information to see where the artifact is used.
Then simply locate the base artifact in the folder structure and select “Delete Artifact” from the context menu.
Upon confirming the deletion, the artifact is permanently deleted:
This operation will still not delete the artifact from other configurations if there is more than one stream and the artifact appears in those streams.
How to clean up requirements
As mentioned above, requirements deleted from the GUI are not deleted from the database, for various reasons, mostly to maintain data integrity and database indices. In theory, reindexing and database maintenance scripts should be run after every physical deletion of artifacts, which would not make sense as a daily operation performed by an end user.
For artifacts that are permanently deleted from the GUI, there is a repotools command, deleteJFSResources, that helps remove them from the database as well. However, use it with extra caution and please review the information in the link below:
https://www.ibm.com/support/pages/deleting-data-permanently-doors-next-generation-project
Summary
Making use of configurations, streams, and modules complicates the deletion concept. We may lose information where we don’t expect it, and we may still get results even after properly deleting artifacts, since they will still be available in other configurations. That is why we generally advise using the “Deleted” attribute approach and filtering views to show only non-deleted artifacts.
Overview
IBM Rational Quality Manager is a collaborative, web-based, quality management solution that offers comprehensive test planning and test asset management from requirements to defects. The Jazz platform enables teams to seamlessly share information. It uses automation to speed project schedules and report on metrics for informed release decisions. It can also be purchased as part of the Collaborative Lifecycle Management solution—a set of seamlessly integrated tools: IBM Rational Team Concert, IBM Rational Quality Manager, and IBM Rational DOORS Next Generation.
Rational Quality Manager works with requirements in IBM Rational DOORS Next Generation to keep test cases in sync whenever requirements evolve. Rational Quality Manager also integrates with a wide range of test automation tools like IBM Rational Functional Tester, enabling you to run tests and collect results, all from a central location.
Embedded software development, however, has its own specifics, because the software runs on machines and devices that are not typically thought of as computers. Usually, this kind of software is specialized for the particular hardware it runs on and has time and memory constraints. Often few or none of its functions are initiated or controlled through a human interface; they are driven through machine interfaces instead. Examples include cars, phones, modems, robots, toys, security systems, pacemakers, TV sets, digital watches, various medical devices, and so on. Such software can be very simple, such as lighting controls running on an 8-bit microcontroller with a few kilobytes of memory, or very sophisticated, as in airplanes, missiles, and process control systems.
IBM Rational Quality Manager supports reliable and flexible integration with testing systems from National Instruments that are specially designed for use in embedded software development. The National Instruments (NI) Test Integration Adapter for IBM Rational Quality Manager enhances test teams' efficiency by automating NI TestStand sequence execution and reporting from Rational Quality Manager.
Engineering departments create increasingly complex products, and as a result, automobiles, aircraft, medical devices, consumer electronics, and more depend on software driving the hardware components. IBM Rational software and National Instruments integrate development and test environments to help clients with their most important goals:
- Test the code of smart products, which might have hundreds of thousands or millions of lines of code, for defects.
- Reduce the cost of code defects by identifying them earlier in the development process.
- Mitigate increasing product complexity by tackling quality challenges earlier in the development process.
- Improve efficiency by breaking down the tasks between engineering departments.
Combining Rational Quality Manager software and National Instruments (NI) TestStand provides comprehensive test case traceability, test case results management, and automated test scheduling and execution. Test results are made available to all teams so that applications can be validated at virtually every point along the development path, from simulation and prototyping, through deployment onto hardware, and integration into the end system. Both operational efficiency and test accuracy can be improved through the ability to reuse test components throughout multiple project phases and even on different projects.
A specially designed piece of software - the NI Test Integration Adapter for Rational Quality Manager software - provides integration between NI TestStand and Rational Quality Manager. This integration product includes the following main features and functions:
Requirements-to-test traceability: Test engineers can use the integration to link automated tests to test cases and to requirements. NI TestStand applications (and, by extension, all the code modules they call) and parameter files on the test machine are linked to a Rational Quality Manager test case, which provides traceability to other project aspects including requirements, the overall quality plan, the project plan, the change and defect management system, and so on.
Test automation: From the web-based interface of Rational Quality Manager, test engineers can invoke the execution of the NI TestStand applications that are linked to the Rational Quality Manager test case. The execution status reported by the NI TestStand sequence is displayed in the Rational Quality Manager web interface as part of the test case execution results. Test engineers can then optionally create defects linked to the test case results to keep track of noncompliance detected by running the test.
Test case results management: Upon completion of the NI TestStand sequence, various outputs from the sequence are automatically published to Rational Quality Manager storage, and an HTML report is linked to the Rational Quality Manager test case execution result page. The HTML report is created by the adapter on the local machine, and the details of the report content, as well as what is stored in IBM Jazz, are configured by the user.
Videos
An excellent video about integrating the modeling, requirements, and testing areas of software development into the overall development process with solutions from IBM and National Instruments.
For completeness, the DNG public APIs that are available are:
- RDNG Reportable API - extracts read-only information from RDNG, generally for creating custom reports to be run from DNG or RPE, but it can also be used in many custom applications that interact with RDNG. The big advantages of this API are its simplicity and its good performance when you need to work with large amounts of data. It is sometimes called the RDNG Reportable REST API (see the sketch after this list).
- OSLC RM V2 API - the open-standard way to programmatically integrate with DOORS Next Generation; an elaborate workshop to get you started is linked below (OSLC Workshop). Client programs can be written in any language that supports HTTP requests, e.g. Java, C#, etc. They are therefore flexible to run, either as part of a web server or standalone, and can read and update DNG data, but they have some limitations set by the current OSLC standard, for example around module support (extensions for modules were added in release 6.0.5). It is a very powerful way to work with RDNG data, including data modification, but can be a little more complex to use than the other options (see the sketch after this list).
- TRS 2.0 - the OSLC Tracked Resource Set open-standard REST API for hooking into the stream of low-level changes made to resources in the DNG application (and other Jazz applications as well); read only.
- RM API - realized through the client extension capability accessible from Rational DOORS Next Generation. This is a JavaScript extension framework: extensions are generally added as widgets that you run from the dashboard and can be used to view and edit DNG data. The widgets are developed according to the OpenSocial standard (based on the Google Gadget framework). This API is more tailored to DNG, so it has better module support, but it is limited to running in a browser/JavaScript environment. It makes it very easy to automate your work; you can do a lot of interesting tricks with the currently open web page based on DOM/XML standards, and you can add support for many popular libraries such as jQuery. But this simplicity comes with some implementation constraints. For example, you can easily access RDNG artifacts in the current module or collection, as well as any selected artifacts, but it is harder to access the artifacts in the current view or those currently shown on the page. Used together with the other APIs described above, the RM API lets you do stunning things with RDNG.
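To make the first two options more concrete, here is a minimal, hedged sketch in Python. It is an illustration only: the server URL and credentials are placeholders, DNG deployments usually require form- or JAS-based authentication rather than the basic authentication assumed here, and the authoritative request formats are described in the Reportable API and OSLC documents linked in the table below.
```python
# Minimal sketch (not production code): read data from RDNG via the
# Reportable REST API and start OSLC RM discovery at the rootservices document.
# Assumptions: hypothetical server https://jazz.example.com:9443/rm, basic auth,
# self-signed certificate (verify=False).
import requests

BASE = "https://jazz.example.com:9443/rm"   # hypothetical server URL
session = requests.Session()
session.auth = ("user", "password")         # placeholder credentials
session.verify = False

# 1) Reportable REST API: read-only XML feed of requirement artifacts.
#    Filtering parameters (by project, module, etc.) are described in the
#    Reportable API documentation linked in the table below.
reportable = session.get(BASE + "/publish/resources",
                         headers={"Accept": "application/xml"})
print(reportable.status_code, reportable.headers.get("Content-Type"))

# 2) OSLC RM V2: discovery starts at the rootservices document, which points
#    to the service provider catalog; from there you navigate to a project's
#    query and creation capabilities.
rootservices = session.get(BASE + "/rootservices",
                           headers={"Accept": "application/rdf+xml",
                                    "OSLC-Core-Version": "2.0"})
print(rootservices.status_code)
# Parse the RDF (e.g. with rdflib) to find the RM service provider catalog,
# then follow it to the per-project service documents.
```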
Helpful Links
1. API Landing page - A landing page for the various API wiki pages that exist on the Jazz.net development wiki, as well as a central collective page of the known APIs that are available for integrating programmatically with the CE/CLM products. It serves as a convenience for accessing API information about CE/CLM products and is not guaranteed to have up-to-the-minute information.
2. DOORS Next Generation Reportable API - The DOORS Next Generation server provides REST APIs for accessing information about requirement artifacts for reporting. Each API can be accessed with a standard web browser, IBM Rational Publishing Engine, or a third-party tool that can consume the DNG reporting service responses. This document covers reporting capabilities for DNG versions 6.0 and higher.
3. Using OSLC capabilities in the Requirements Management application - You can use Open Services for Lifecycle Collaboration (OSLC) 2.0 capabilities in the Requirements Management (RM) application of the Rational solution for Collaborative Lifecycle Management (CLM). The article includes basic examples of how to use those capabilities in an HTTP poster tool.
4. Open Services for Lifecycle Collaboration Workshop - The workshop guides you in leveraging the Open Services for Lifecycle Collaboration (OSLC) standard interfaces for interoperating with Jazz-based products, including RDNG. The labs highlight key aspects through web browser access and programmatic access via Java client programs; the final lab illustrates by example how to write your own server using Java servlets. The workshop is based on the OSLC-CM 2.0 and OSLC-RM 2.0 specifications. After you complete these labs, you will have a good foundation for leveraging OSLC in an interoperability project.
5. OSLC Requirements Management Version 2.1. Part 1: Specification - This specification defines the OSLC Requirements Management domain, also known as OSLC RM. The specification supports key RESTful web service interfaces for software requirements management systems.
6. OSLC Requirements Management Version 2.1. Part 2: Vocabulary - This specification defines a vocabulary and resource shapes for the OSLC Requirements Management resources.
7. IBM DOORS Next Generation Server API Documentation (additions from v6.0) - Additions to the standard OSLC API developed specifically for use with RDNG. They take into account some RDNG specifics that are out of the scope of the OSLC specification.
8. Getting Started with RDNG Extending Capabilities - A brief video giving a quick introduction to the client extension capabilities.
9. Client extension API for the Requirements Management (RM) application - This document specifies the client extension API that forms part of the 6.0.5 release of the Requirements Management (RM) application. The version of this API is 1.1.
10. Client extension capability (all versions) - A list of links to the client extension APIs for all supported versions.
11. jQuery API - A useful API when developing RDNG client extensions.
12. Web APIs - When writing code for the Web with JavaScript, there are a great many APIs available; this is a list of all the interfaces (that is, types of objects) that you may be able to use while developing your web app or site.
Useful Links
https://www.facebook.com/oslcfest/
DOORS Next Generation (v4.0.6): Adding an extension to your dashboards via URL
https://www.youtube.com/watch?v=q3DxAneAPYQ
Adding widgets to the widget catalog of IBM Rational DOORS Next Generation (V4.0.6)
https://www.youtube.com/watch?v=mXl1YcDYjVg