Salesforce Test Integration Package
Important Note! The Test Results application is no longer supported and we encourage you to upgrade to Provar Manager. For ease of integration, please refer to our guide Integration with Provar Manager without changing your test cases.
This page explains how to install and configure Provar’s Test Integration package for Salesforce. This is a packaged, customizable solution for integrating the results of your Provar tests into Salesforce for analysis and reporting.
This page also summarizes an optional feature for invoking Provar tests on a CI server, should you wish to trigger your test automation from Salesforce. A separate help page with more detail is being prepared; in the meantime, please contact us if you have questions.
Choosing a Salesforce org
Before installing the Provar Test Integration package, choose the target Salesforce org where you want to install it. This should be the org where you want to store and report on your test results.
Note: You do not need to install the application into the org you are testing against. We generally recommend storing your test results in a separate environment; this can be a Developer Edition, trial org, sandbox, or even your production org.
Please remember that if you choose to install in a sandbox, you will lose your test result data when the sandbox is refreshed.
Installing the package
Once you have chosen your target Salesforce org, click here to access the Test Integration package’s private AppExchange Listing. Follow the instructions to install the app into your chosen Salesforce org.
Note: The app is currently listed privately while it completes security review, and will become public once the review is finished. You may receive warnings on installation until then.
When installing the app, you will be asked which profiles or users should have access to the package. Please ensure that you give full access to the package to the user profile used in the Provar Connection. If in doubt, use the default option, ‘Install For Admins Only’. If you need to extend access later, you can use the Provar Test Integration Results Permission Set, which will provide access to the package objects, fields, and reports.
Package contents
The app package contains the following elements:
- A Test Suite Execution custom object that summarises the execution results for a given deployment.
- A Test Case Execution custom object that gives test result information relating to an individual test case. These records are all related to a parent Test Suite Execution record.
- A Test Results Lightning app to display the relevant tabs contained in the Test Results package. (Note: a Utility Bar Lightning page is created automatically with the app, but it is empty.)
- Some helpful Reports and Dashboards to get you started on visualizing your results. These are stored in the Provar Results folders.
- A Provar Test Integration Results Permission Set for extending access to package objects, fields, and reports.
- Two Lightning pages providing examples of how to place a chart onto the Test Suite Execution or Test Case Execution objects.
The app also contains optional elements for integrating Salesforce with your test automation runs in your chosen Continuous Integration (CI) server.
These are as follows:
- A hierarchical Custom Setting, Provar CI Settings, to parameterize the Continuous Integration Server connection and test execution job details.
- An Apex class, CIJobNotificaton, which includes an invocable Apex method invokeCIJob() that can be called from Process Builder, and an httpCallout() method that can be called from Apex. Both methods are global.
Adding a reporting connection
Once you’ve completed the Provar Test Integration package installation, you’ll need to add some additional configuration in Provar to write your test run results to your target Salesforce org.
First, add a new Connection for the Salesforce org where you have installed the Test Integration package. This new Connection will be used to create and update your Salesforce org with Test Suite Execution and Test Case Execution data. This is a mandatory step unless you store your test results in the same Salesforce org you are testing against.
We suggest naming the Connection TestResults to avoid confusion with your active testing Connections.
Enabling logging in Provar
Once you have added your TestResults Connection, you need to make some updates in Provar to enable logging and update your target Salesforce org with your test run results.
The updates needed are as follows:
- A new setup test case (You can learn more about setup test cases here)
- A new write results test case
- Adding logging to existing tests
- Optional: calculating total test run duration
You can create test cases from scratch using the specifications below or download these preconfigured test cases from GitHub.
Download the preconfigured example test cases and place them in the /tests folder of your Provar project. Then amend the Connection defined for each test case so that it uses the TestResults Connection you created above.
The following sections explain each test case in further detail. Use these to create your test cases from scratch or to further customise the examples provided above.
Setup test case
This test case will insert a Test Suite Execution record in Salesforce.
Note: This is a Setup test case, meaning it will run at the beginning of your test run. If you are creating the test case from scratch, make sure that you select Setup Test Case on the New Test Case screen.
The other key step for this test case is a Create Object step.
Within the Create Object step, set the following values:
- Record Name Field: Give the record a name that identifies the Test Suite, such as the date and time plus a short description. This will become the name of the Test Suite Execution record in Salesforce.
- Result Object Id: Set this to ‘TestSuiteExecutionId’ or similar. Once the Test Suite Execution record is created, its Salesforce record Id will be stored in this parameter. We will use this parameter in subsequent test cases to link test case execution results to the parent execution record.
- Result Scope: Remember to set the Result Scope parameter to ‘Test Run’ to make the Result Object Id visible to all test cases executed in the same test run.
Optional steps:
- If you want to link the Test Suite Execution record to another record in your org, first go to the Test Suite Execution object in Salesforce and add a custom Lookup field to the desired object. Then, map this field in the Create Object test step and provide the ID of the record you want to link to.
Write results test case
This test case will insert Test Case Execution records in Salesforce each time it is called.
Give this test a generic name such as ‘Write Test Results’ and make it callable so that you can reuse it. You can then call this test from other test cases to log their execution results in Salesforce.
At the top level of this test, set the test case to ‘Callable’ and add three input parameters:
- Test Case Name: the name of the test case being executed
- Test Case Result: the outcome of the test
- Test Start Time: the time that the test started
Next, add an IF step to make your logging dependent on TestSuiteExecutionId being non-null. This will provide an ‘off’ switch for logging, so that you can disable logging across your entire test suite simply by disabling the Setup Test Case (where the TestSuiteExecutionId is set).
Under your IF step, create a local variable (using a Set Values step) with a Value Name of ‘TestResult’. Default this to the value of the Test Case Result input parameter you added at the top level of this test.
Next, we’ll convert Provar’s standard TestCaseOutcome() values into a more reporting-friendly format. Provar’s TestCaseOutcome() values are typically ‘successful’, ‘failed’ and ‘skipped’; we will convert these to ‘Pass’, ‘Fail’ and ‘Skipped’. This is also useful if you want to use the package to log the results of manual tests, so that you can combine manual and automated test results in your reports and dashboards.
We’ll convert these values using a combination of IF/ELSE and Set Values test steps.
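In logic terms, those IF/ELSE and Set Values steps implement a simple lookup. The following is a minimal Python sketch of that mapping (for illustration only; it is not Provar expression syntax, and the `OUTCOME_LABELS` and `to_report_status` names are invented for this example):

```python
# Map Provar's TestCaseOutcome() values to reporting-friendly labels.
# The keys are the standard outcomes named above; any unexpected value
# is passed through unchanged rather than raising an error.
OUTCOME_LABELS = {
    "successful": "Pass",
    "failed": "Fail",
    "skipped": "Skipped",
}

def to_report_status(outcome: str) -> str:
    """Return the reporting-friendly status for a raw Provar outcome."""
    return OUTCOME_LABELS.get(outcome, outcome)
```

Passing the raw value through unchanged (rather than failing) mirrors the IF/ELSE approach, where any unmatched outcome simply keeps the default TestResult value.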
The other key step for this test case is a Create Object step.
Within the Create Object step, set the following values (please note your parameters may appear in a different order from the order below):
- Test Suite Execution: Set this to be TestSuiteExecutionId. If this is not visible, check that you set Result Scope to ‘Test Run’ in the Setup Test Case (see above).
- Name: The Test Case Name passed as an input parameter.
- Status: The value of your local TestResult variable.
- StartDateTime [optional]: The Test Start Time input parameter.
- EndDateTime [optional]: Set this to ‘{NOW}’
- Comments [optional]: Any additional information you want to record in the Test Case Execution record.
Logging existing tests
Finally, modify your existing tests to add logging to them. Since you’ve already added a callable Write Results test case (see above), you can call this test from any other test to enable logging.
To add logging to an existing test, first add a Finally block to the end of the test. This creates a grouping of test steps that are always executed, regardless of the outcome of prior test steps.
Next, locate the callable Write Results test you created above and drag and drop it from the Navigator into the Finally block. This will create a Call step in the test.
Within the Call step, map the following input parameters:
- TestCaseResult: Use the expression ‘{TestCaseOutcome()}’ which will return ‘successful’, ‘failed’ or ‘skipped’
- TestCaseName: Enter ‘{TestCaseName()}’ to insert the name of the current test case
Optional steps:
- TestStartTime: This parameter lets you measure how long the test case took to run. To use it, first add a Set Values step at the top of your test with a value of ‘{NOW}’. Position this step consistently across all your test cases, either as the first step in the test or immediately after the Connect steps. Once you have created this variable, reference it from this parameter.
Optional: calculating total test run duration
If desired, you can also record the total duration of a given test run by creating a test ‘timer’ that starts and stops at the beginning and end of the test suite execution. This requires a few more updates.
Firstly, in your Setup Test Case, add a Set Values step to create a new variable. Set the Value Name to ‘TestRunStart’ and the Value to ‘{NOW}’ to set the current date/time. Set the Value Scope to ‘Test Run’. We will refer back to this in the new Teardown Test Case (see below).
Next, create a Teardown test case to write the test run duration back to Salesforce.
Note: This is a Teardown test case, which means that it will run at the end of your test run. If you are creating the test case from scratch, make sure that you select Teardown Test Case on the New Test Case screen.
Then, add an Object Update step to update the Test Suite Execution record with the total test suite duration. To find the correct Test Suite Execution record, set Object Id to ‘{TestSuiteExecutionId}’. Then, in Fields, map the Test Suite Duration as the current time minus the value captured in TestRunStart. This duration is in milliseconds by default; divide by 1,000 for seconds, 60,000 for minutes, 3,600,000 for hours, or 86,400,000 for days.
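The duration mapping is plain unit conversion. As a quick sanity check on the divisors, here is a small Python sketch (illustrative only, not a Provar expression; the function name is invented for this example):

```python
def duration_from_millis(millis: float, unit: str = "seconds") -> float:
    """Convert a millisecond duration into the requested unit."""
    divisors = {
        "seconds": 1_000,       # 1,000 ms per second
        "minutes": 60_000,      # 60 * 1,000
        "hours": 3_600_000,     # 60 * 60 * 1,000
        "days": 86_400_000,     # 24 * 60 * 60 * 1,000
    }
    return millis / divisors[unit]
```

For example, a run that took 90,000 ms is 90 seconds, or 1.5 minutes.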
Then, save the test case.
Viewing the results
You’re ready to run! When you execute your tests, the results will be logged automatically in Salesforce. To view the results, go to the Provar Test Results App in the org where you installed the package.
Note: You may not want to include your Setup and Teardown tests as part of your project when you’re using Test Builder, as every Test Builder session will create records you probably won’t want to include in your results. You can either disable these tests in your local project or move them to a separate folder when you don’t want them to execute.
We’d like to hear your thoughts on the Provar Test Integration Package. If you have any questions or comments on the package or on this guide, please contact us at support@provartesting.com.