Test Plans
What are test plans?
Test plans are designed to help you streamline the testing processes associated with supporting new software releases in a structured and easily repeatable manner. Test plans give you the tools you need to specify which test cases need to be run, and how, to test a new release or milestone.
You can build multiple test plans, run a repeatable collection of tests for each release cycle, and make global changes to environment settings such as browser settings, build number and build server. Plus, you can get consolidated reports of your results.
How are test plans different from test cycles?
Test cycles can help you group tests together and execute basic functions such as rerunning failed tests. Test plans provide more advanced configuration options and are designed specifically to help support your CI/CD business objectives.
Why should I use test plans?
Test plans are designed to help teams support an accelerated software release cycle in two primary ways.
- Test plans help you reuse existing test cases, making it easier to find and select them and add them to your test plan using tags. (This will be part of a later release.)
- Test plans help you make global changes to the test cases that are within each test plan – eliminating the need to open and edit each test case individually.
Using test plans at a glance
Here is a quick overview of the key steps you’ll need to take.
- Planning: Identifying how you want to use test plans to support your business objectives.
- Creating a test plan (the umbrella under which you will add and configure test cases).
- Creating and naming folders to organize individual test cases.
- Configuring your settings.
- Running test plans and reports.
Planning
Most organizations leverage their own unique approach to software development, testing and deployment. That’s why we designed test plans to give you the flexibility you need to create and manage tests to support your organization’s individual business objectives.
Test plans are designed to help testers simplify the process of supporting CI/CD needs – and you can customize the application as needed.
While it’s not required, we highly recommend that you first document how you want to use test plans. By planning your use cases in advance, you can build more meaningful tests that provide actionable insights.
- At what milestones in the development process do you want to test?
- What are the most common testing scenarios?
- What are the most critical workflows that you need to monitor?
- What tests and test components are most often used and why?
- What are the configuration settings you would like to change on a regular basis to test different scenarios?
Configurations: What can I change?
Provar has several built-in configuration settings and also allows custom settings to be added. The most commonly used configuration options are described below.
Test environment
This includes the servers, web addresses and endpoints that the tests should be executed against, e.g. DEV, UAT, PROD. Test environments are defined in Provar’s settings view.
Web browser
Web browsers that tests will be run against are defined in the browser providers tab (inside Provar’s settings view).
- Local desktop browsers can be configured with different screen resolutions. These run on the same machine as Provar.
- Desktop browsers can be provided through Selenium Grid or through cloud-based services such as BrowserStack and SauceLabs.
- Mobile devices can be connected through local Appium servers or through cloud-based services such as Perfecto and TestObject (part of SauceLabs).
- When a web browser is selected, further settings are shown as appropriate. For example, if Desktop: Full Screen is selected, a drop-down is shown offering Chrome, Firefox, etc.
Salesforce user experience (UX)
Identifies whether tests should be run in Salesforce Classic or Lightning Experience.
Custom settings
The test plan editor allows custom settings to be defined.
- These are made available to the test cases as variables when they are executed.
- They can be supplied via environment variables in build scripts.
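As a sketch of how a CI build script might forward environment variables into an exported ANT run, here is a minimal Python example. The property names (`buildNumber`, `buildServer`) and variable names are illustrative assumptions, not Provar's actual setting names:

```python
# Hypothetical sketch: forward CI environment variables to an exported
# Provar ANT script as properties. Property/variable names are assumptions.
def build_ant_command(env):
    cmd = ["ant", "-f", "build.xml"]
    for var, prop in [("BUILD_NUMBER", "buildNumber"),
                      ("BUILD_SERVER", "buildServer")]:
        if var in env:
            # ANT exposes -D properties to the build file at run time.
            cmd.append(f"-D{prop}={env[var]}")
    return cmd

cmd = build_ant_command({"BUILD_NUMBER": "142", "BUILD_SERVER": "ci-01"})
print(" ".join(cmd))  # ant -f build.xml -DbuildNumber=142 -DbuildServer=ci-01
```

In a real pipeline, the equivalent would be setting the variables in the CI job configuration and invoking the generated build file directly.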
How can I create a test plan?
Step 1: Creating a test plan.
Test plans are created via the New Test Plan drop-down menu within Provar’s main toolbar.
Step 2: Opening the test plan editor.
Double-clicking a test plan opens the test plan editor, which you can use to make global changes to the test cases within your test plan.
Step 3: Adding plan folders.
Plan folders can be added to a test plan via the New tool item in Provar’s main toolbar or via context menus on the test plan. Plan folders are optional; you can use them to organize test cases. For example, you might want to create folders for test cases associated with various parts of your development cycle.
Step 4: Adding tests.
You can add tests by copying the desired tests and pasting them into the test plan or a plan folder. You can also drag and drop test cases.
How do I configure test plans?
The test plan editor allows various features to be applied to a test plan.
- The test plan editor allows high-level details to be configured for each feature.
- The plan folder editor allows the settings for each added feature to be overridden where applicable.
How do I run my test plan?
Once the test plan has been set up, the tests can be run in various ways:
- The Run in Provar context menu executes the tests in Provar Desktop’s Test Runner.
- The Run under TrailheadDx context menu executes the tests using Provar’s TrailheadDx integration.
- The Export as ANT Script context menu creates the appropriate ANT build file.
How can I configure output reports?
Provar is able to format test results into various file formats:
- These are stored in the run’s artifacts folder under the name supplied via the Artifact Name field. This makes them accessible to tools like Jenkins.
- They can then be included in Email and Slack notifications using the TestArtifact() function.
CSV
This produces a comma-separated file with the following values:
- Test case path
- Outcome
- Start and end times
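To show how such a report could be consumed downstream, here is a short Python sketch that parses a CSV with the values listed above. The header names and sample rows are assumptions for illustration, not Provar's exact output:

```python
import csv
import io

# Hypothetical sample of the CSV report described above; the header
# names are assumptions, not necessarily Provar's actual column names.
sample = """Test Case Path,Outcome,Start Time,End Time
tests/Login.testcase,Passed,2024-01-10T09:00:01,2024-01-10T09:00:14
tests/Checkout.testcase,Failed,2024-01-10T09:00:15,2024-01-10T09:01:02
"""

rows = list(csv.DictReader(io.StringIO(sample)))
# Collect the paths of all failed test cases for follow-up.
failed = [r["Test Case Path"] for r in rows if r["Outcome"] == "Failed"]
print(failed)  # ['tests/Checkout.testcase']
```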
PDF
Provar can be configured to produce PDF reports for test executions. These are used in several ways:
- A PDF can be produced for an entire run. This report can be stored as an artifact and/or emailed to a list of recipients at the end of the run.
- PDFs can be produced for individual test executions and attached to the execution in the test management tool (TMT).
Pie chart
Produces a pie chart showing success, failure and skipped counts.
xUnit
Provar can produce an xml file which can be post-processed by tools like JUnit and Allure.
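For illustration, here is a minimal xUnit-style file in the generic JUnit XML format and a Python sketch that post-processes it, as a tool like Jenkins would. The suite and test names are invented; Provar's real output may carry additional attributes:

```python
import xml.etree.ElementTree as ET

# Minimal xUnit/JUnit-style result file. The structure is the standard
# JUnit XML shape; names and values here are invented for the example.
xml = """<testsuite name="Regression" tests="2" failures="1">
  <testcase classname="plans.Smoke" name="Login" time="12.5"/>
  <testcase classname="plans.Smoke" name="Checkout" time="47.1">
    <failure message="Element not found"/>
  </testcase>
</testsuite>"""

suite = ET.fromstring(xml)
# A test case failed if it contains a <failure> child element.
failures = [tc.get("name") for tc in suite.iter("testcase")
            if tc.find("failure") is not None]
print(failures)  # ['Checkout']
```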
How can I receive test results?
Provar can publish run results through various channels, such as email and Slack notifications.
The body of the messages can include the following substitutions:
- Test plan settings, e.g. {buildNumber}
- Outcome counters, e.g. <TODO: expand>
- Outputs via the TestArtifact() function, e.g. {TestArtifact("pieChart.png")}.
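The substitution style above can be sketched generically in Python. This is only an illustration of how `{name}` placeholders expand from plan settings; Provar performs the expansion itself, and the setting names below are examples:

```python
import re

# Illustrative sketch of {placeholder} expansion from test plan settings.
# Setting names are examples; Provar handles this substitution internally.
settings = {"buildNumber": "142", "testEnvironment": "UAT"}

def expand(template, settings):
    # Replace each {name} with its setting value; leave unknown names as-is.
    return re.sub(r"\{(\w+)\}",
                  lambda m: settings.get(m.group(1), m.group(0)),
                  template)

msg = expand("Build {buildNumber} finished on {testEnvironment}.", settings)
print(msg)  # Build 142 finished on UAT.
```

Function-style substitutions such as {TestArtifact("pieChart.png")} are resolved by Provar's own function handling rather than simple name lookup.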
Reporting integration
Provar has built-in integration with many test management tools allowing it to record test executions and their outcomes to these tools. Provar also supports custom reporting which allows it to be extended to support any reporting solution.
Reporting connections
Provar’s test settings tab allows reporting destinations to be configured in its Reporting sub-tab.
Typically only the high-level information needed to connect to the server is stored in the reporting connection:
- This includes information like the server address, username, password and project/solution.
- Secret information such as passwords and API keys is stored in Provar’s .secrets file so it can be encrypted.
Reporting settings
Provar’s plan folder and test instance editors allow reporting settings to be configured for reporting connections that have been defined in the settings tab.
For Micro Focus ALM, for example, the test set is selectable via a drop-down.
Test case definitions
Some test management tools store the test suite and test case definitions alongside the test plan. Micro Focus ALM, Zephyr and TestRail are examples of this.
Provar provides the ability to Download (or import) and Upload (or export) the test case definitions from and to these tools.