Metrics for verifying application quality
The Application Quality landing page is as essential to the role of a Lead System Architect (LSA) as the rest of Dev Studio. Not only do you implement functionality based on requirements, but you also need to prove that it works correctly and meets established quality standards. You also need to monitor how application quality trends over time: has it improved or worsened? Application quality affects the rate at which the application moves through a DevOps pipeline and the rate at which development teams can deliver new features.
The Application Quality landing page provides configurable settings that relate to quality metrics. You can change the default settings for metrics displayed to meet your business needs.
Settings
The following table describes the available settings and their effects:
Setting | Description | Effect |
---|---|---|
Applications included | Current application or include built-on applications | If Include Built-On Applications is selected, users can select which built-on applications to include. |
Guardrails | Ignore test rulesets when calculating the guardrails score. The default value is true. | When true, the guardrail score excludes rules in test rulesets, such as unit test setup activities and data transforms. |
Quality trend interval | Two weeks to six months | Defines the period over which the quality trend is calculated. |
Test execution look-back duration | One week to six months | Defines how far back test execution results are included. |
Scenario test case execution | Configure delay for scenario test execution? Default value is false. | Enables or disables a scenario test case run delay. |
For more information about how to change application quality metrics settings, see Application quality metrics settings.
Rule-coverage testing
The Application Quality landing page displays metrics for guardrails, test coverage, and unit testing that you can use to assess your application's overall health and identify areas that require improvement.
On the Test Coverage landing page, view a chart displaying test coverage metrics and generate specific user-level, application-level, and merged coverage reports. User-level reports contain the results of a single test coverage session that a user performs. Different users can simultaneously perform their own user-level tests. In contrast, application-level reports contain results from multiple test coverage sessions that many users run. Merged reports combine the results of the most recent application-level reports.
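Conceptually, merging coverage works like taking the union of the rules executed across sessions. The following sketch illustrates that idea only; the data shapes and rule names are hypothetical, and Pega computes these reports internally:

```python
# Illustrative sketch: merging per-user coverage sessions into one
# application-level view. All names and structures here are invented
# for explanation; they are not Pega APIs.

def merge_coverage(sessions):
    """Union the rules covered across several coverage sessions."""
    covered = set()
    for session in sessions:
        covered |= session["covered_rules"]
    return covered

def coverage_percentage(covered, all_rules):
    """Percentage of coverable rules executed in at least one session."""
    return 100.0 * len(covered & all_rules) / len(all_rules)

# Two users run separate user-level sessions against the same application.
alice = {"covered_rules": {"ActA", "FlowMain", "DT_Route"}}
bob = {"covered_rules": {"FlowMain", "DP_Customers"}}

app_rules = {"ActA", "FlowMain", "DT_Route", "DP_Customers", "WhenVIP"}

merged = merge_coverage([alice, bob])
print(coverage_percentage(merged, app_rules))  # 80.0
```

A rule counted in any one session counts as covered overall, which is why a merged report can show higher coverage than any individual session.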
The following examples are use cases for rule-coverage testing:
- A team is building or modifying an application and maintains its test artifacts in a sample test application that is built on top of the actual application. The team wants to generate a test coverage report for the actual application by running the tests in the test application against the current application or its built-on applications.
- The team wants to generate a test coverage report as part of the automated tests run in the continuous integration and continuous delivery (CI/CD) pipeline and use it for quality-gating purposes.
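A quality gate of this kind typically compares a reported coverage figure against a threshold and fails the pipeline when it falls short. The sketch below assumes a simple JSON report with a `coverage_percent` field; that format and the threshold value are illustrative, not part of any Pega interface:

```python
# Hypothetical CI/CD quality gate: fail the build when coverage drops
# below a threshold. The report format is assumed for illustration.
import json

THRESHOLD = 90.0  # illustrative quality-gate value

def passes_gate(report_json, threshold=THRESHOLD):
    """Return True when the reported coverage meets the threshold."""
    report = json.loads(report_json)
    return report["coverage_percent"] >= threshold

sample = json.dumps({"coverage_percent": 92.5})
print(passes_gate(sample))  # True
```

In a real pipeline, the gate would read the report exported from the coverage run and return a nonzero exit code on failure so the pipeline stage stops.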
Coverage and unit test rules | Coverage-only rules |
---|---|
Activities, Case types, Collections, Data pages, Data transforms, Decision tables, Decision trees, Declare expressions, Flows, Map values, Report definitions, Strategies, When | Correspondence, Declare Expression, Flow action, Section, Validate, Decision Data, XML Stream, HTML, HTML Fragment, Harness, Paragraph |
For more information, see Estimating test coverage.