Automation V3: Test Plans
What are test plans?
Test plans help you streamline the testing processes associated with new software releases in a structured, easily repeatable way. They give you the tools to specify which test cases need to be run, and how, to test a new release or milestone.
You can build multiple test plans, run a repeatable collection of tests for each release cycle, and make global changes to environment settings such as browser settings, build number and build server. Plus, you can get consolidated reports of your results.
How are test plans different from test cycles?
Test cycles can help you group tests together and execute basic functions such as rerunning failed tests. Test plans provide more advanced configuration options and are designed specifically to help support your CI/CD business objectives.
Why should I use test plans?
Test plans are designed to help teams support an accelerated software release cycle in two primary ways.
- Test plans help you reuse existing test cases, making it easier to find and select them and add them to your test plan using tags. (This will be part of a later release.)
- Test plans help you make global changes to the test cases that are within each test plan – eliminating the need to open and edit each test case individually.
Using test plans at a glance
Here is a quick overview of the key steps you’ll need to take.
- Planning: Identifying how you want to use test plans to support your business objectives.
- Creating a test plan (the container where you will add and configure test cases).
- Creating and naming folders to organize individual test cases.
- Configuring your settings.
- Running test plans and reports.
Planning
Most organizations leverage their own unique approach to software development, testing and deployment. That’s why we designed test plans to give you the flexibility you need to create and manage tests to support your organization’s individual business objectives.
Test plans are designed to help testers simplify the process of supporting CI/CD needs – and you can customize the application as needed.
While it’s not required, we highly recommend that you first document how you want to use test plans. By planning your use cases in advance, you can build more meaningful tests that provide actionable insights.
- At what milestones in the development process do you want to test?
- What are the most common testing scenarios?
- What are the most critical workflows that you need to monitor?
- What tests and test components are most often used and why?
- What are the configuration settings you would like to change on a regular basis to test different scenarios?
Configurations: What can I change?
Provar has several built-in configuration settings and also allows custom settings to be added. The following are the most commonly used configuration options.
Test environment
This includes the servers, web addresses and endpoints that the tests should be executed against, e.g. DEV, UAT, PROD. Test environments are defined in Provar’s Settings view.
Web browser
Web browsers that tests will be run against are defined in the browser providers tab (inside Provar’s settings view).
- Local desktop browsers can be configured with different screen resolutions. These run on the same machine as Provar.
- Desktop browsers can be provided through Selenium Grid or through cloud-based services such as BrowserStack and SauceLabs (see the sketch after this list).
- Mobile devices can be connected through local Appium servers or through cloud-based services such as Perfecto and TestObject (part of SauceLabs).
- When a web browser is selected, further settings are shown as appropriate. For example, if Desktop: Full Screen is selected, a drop-down is shown offering Chrome, Firefox, etc.
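For context, the sketch below shows what a remote browser session against a Selenium Grid hub looks like using the Selenium Python bindings directly; the hub URL is a placeholder. Once a grid or cloud provider is configured in the browser providers tab, Provar establishes the equivalent session for you.

```python
# Sketch only: a remote Chrome session against a Selenium Grid hub.
# The hub address is hypothetical; Provar manages the equivalent session
# once a grid is configured in the browser providers tab.
from selenium import webdriver
from selenium.webdriver.chrome.options import Options

options = Options()
options.add_argument("--window-size=1920,1080")  # e.g. a fixed screen resolution

driver = webdriver.Remote(
    command_executor="http://grid-host:4444/wd/hub",  # placeholder hub URL
    options=options,
)
try:
    driver.get("https://login.salesforce.com")
    print(driver.title)
finally:
    driver.quit()
```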
Salesforce user experience (UX)
Identifies whether tests should be run in Salesforce Classic or Lightning.
Custom settings
The test plan editor allows custom settings to be defined.
- These are made available to the test cases as variables when they are executed.
- They can also be supplied via environment variables in build scripts (see the sketch below).
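As a rough illustration of supplying a setting from a build script, the sketch below reads a CI environment variable and passes it to an exported ANT build file as a property. The build file path and the exact property name are assumptions; check the file Provar exports for the names it actually expects.

```python
# Sketch only: pass a CI environment variable through to an exported ANT build file.
# BUILD_NUMBER is commonly set by CI servers such as Jenkins; the build file
# path and the "buildNumber" property name are illustrative assumptions.
import os
import subprocess

build_number = os.environ.get("BUILD_NUMBER", "local-dev")

subprocess.run(
    [
        "ant",
        "-f", "ANT/build.xml",            # path to the exported build file (assumed)
        f"-DbuildNumber={build_number}",  # ANT property consumed by the build
    ],
    check=True,
)
```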
How can I create a test plan?
Step 1: Creating a test plan.
Test plans are created via the + menu within the Plans section.
Step 2: Opening the test plan editor.
Double-clicking a test plan opens the test plan editor, which you can use to make global changes to test cases within your test plan.
Step 3: Adding plan folders.
Plan folders can be added to a test plan via the New tool item in Provar’s main toolbar or via context menus on the test plan. Plan folders are optional; you can use them to organize test cases. For example, you might create a folder for the test cases associated with each part of your development cycle.
Step 4: Adding tests.
You can add tests by copying the desired tests and pasting them into the test plan or test plan folder. You can also drag and drop test cases.
How do I configure test plans?
The test plan editor allows various features to be applied to a test plan.
- The test plan editor allows high-level details to be configured for each feature.
- The plan folder editor allows the settings for each added feature to be overridden where applicable.
How do I run my test plan?
Once the test plan has been set up, the tests can be run in various ways:
- The Run in Provar context menu executes the tests in Provar Desktop’s Test Runner.
- The Run under TrailheadDx context menu executes the tests using Provar’s TrailheadDx integration.
- The Export as ANT Script context menu creates the appropriate ANT build file.
How can I configure output reports?
Provar is able to format test results into various file formats:
- These are stored in the run’s artifacts folder under the name supplied via the Artifact Name field. This makes them accessible to tools like Jenkins.
- They can then be included in Email and Slack notifications using the TestArtifact() function.
CSV
This produces a comma-separated file with the following values (a parsing sketch follows the list):
- Test case path
- Outcome
- Start and end times
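A minimal sketch of consuming such a file is shown below. It assumes the columns appear in the order listed above and that there is no header row; verify both against a real results file.

```python
# Sketch only: summarize outcomes from a CSV results file.
# Column order (test case path, outcome, start/end times) follows the list
# above; the file name and absence of a header row are assumptions.
import csv
from collections import Counter

outcomes = Counter()
with open("results.csv", newline="") as f:
    for row in csv.reader(f):
        if not row:
            continue
        outcomes[row[1]] += 1  # column 1 = outcome, per the column order above

print(dict(outcomes))  # e.g. {'Passed': 12, 'Failed': 1}
```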
PDF
Provar can be configured to produce PDF reports for test executions. This is used in several ways:
- A PDF can be produced for an entire run. This report can be stored as an artifact and/or emailed to a list of recipients at the end of the run.
- PDFs can be produced for individual test executions and attached to the execution in the test management tool (TMT).
Pie chart
Produces a pie chart showing success, failure and skipped counts.
xUnit
Provar can produce an XML file which can be post-processed by tools like JUnit and Allure.
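Tools that post-process xUnit output generally expect the common JUnit layout of <testsuite> and <testcase> elements. The sketch below tallies results from a report in that layout; the element and attribute names are the conventional JUnit ones, so confirm them against a report Provar actually generates.

```python
# Sketch only: count outcomes in a JUnit/xUnit-style XML report.
# Element names follow the common JUnit convention; verify them against a
# real Provar-generated file before relying on this.
import xml.etree.ElementTree as ET

root = ET.parse("xunit-results.xml").getroot()  # file name is illustrative

total = failed = skipped = 0
for case in root.iter("testcase"):
    total += 1
    if case.find("failure") is not None or case.find("error") is not None:
        failed += 1
    elif case.find("skipped") is not None:
        skipped += 1

print(f"total={total} failed={failed} skipped={skipped}")
```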
How can I receive test results?
Provar can publish run results through various channels, such as email and Slack notifications.
The body of the messages can include the following substitutions (see the sketch after this list):
- Test plan settings, e.g. {buildNumber}
- Outcome counters, e.g. <TODO: expand>
- Outputs via the TestArtifact function, e.g. {TestArtifact("pieChart.png")}.
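Conceptually, these substitutions are a template expansion: each {name} placeholder in the message body is replaced with a value from the run before the notification is sent. The sketch below illustrates the idea only; it is not Provar's implementation, and it does not handle function-style placeholders such as {TestArtifact(...)}.

```python
# Sketch only: simple {name} placeholder expansion for a notification body.
# Function-style placeholders like {TestArtifact("pieChart.png")} are not
# handled here; this just illustrates the substitution concept.
import re

def expand(template: str, values: dict) -> str:
    # Replace each {name} with its value; unknown names are left as-is.
    return re.sub(
        r"\{(\w+)\}",
        lambda m: str(values.get(m.group(1), m.group(0))),
        template,
    )

print(expand("Build {buildNumber} finished.", {"buildNumber": "1042"}))
# -> Build 1042 finished.
```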
Reporting integration
Provar has built-in integration with many test management tools allowing it to record test executions and their outcomes to these tools. Provar also supports custom reporting which allows it to be extended to support any reporting solution.
Reporting connections
Provar’s test settings tab allows reporting destinations to be configured in its Reporting sub-tab.
Typically only the high-level information needed to connect to the server is stored in the reporting connection:
- This includes information like the server address, username, password and project/solution.
- Secret information such as passwords and API keys is stored in Provar’s .secrets file so it can be encrypted.
Reporting settings
Provar’s plan folder and test instance editors allow reporting settings to be configured for reporting connections that have been defined in the settings tab.
For Micro Focus ALM, for example, the test set is selectable via a drop-down.
Test case definitions
Some test management tools store the test suite and test case definitions alongside the test plan. Micro Focus ALM, Zephyr and TestRail are examples of this.
Provar can Download (import) and Upload (export) the test case definitions from and to these tools.