Provar Automation
Assistant
The Assistant view displays help information inside Automation, providing in-tool guidance as you build tests.
This includes a search function for finding information in Provar Help, as well as the full Provar Help menu (scrolling down may be required). You can return to the Provar Help home at any time by clicking the Home icon at the top of the Assistant view.
Error log
The Error Log view shows internal errors and problems encountered while running Provar. It can be useful for debugging issues or for describing them to Support. To open the Error Log, click the View icon and select Show Error Logs from the dropdown.
Menubar
The Provar menubar has the following features:
New Test: Used for creating new files, such as a new test case, Page Object or Test API. The shortcut CTRL + N can also be used.
Save / Save All: Used for saving one or multiple files. The shortcut CTRL + S can also be used.
Run / Debug / Test Builder: These icons are for running tests. A test can be run in one of three modes. Refer to Running Tests to learn more about the different modes.
Test Environment: Used for selecting the current Test Environment.
Web Browser: Used to select the browser in which to execute tests. Chrome, IE, Firefox, Safari, Edge and Chrome Headless are available (Chrome Headless was added in Provar version 1.9.8). PhantomJS was previously available for headless testing but was retired in Provar version 1.9.8.
View: Used to define the appearance and positioning of your views. You can close or expand views, or return to the default perspective, by selecting the View icon. To define your own perspective, customize the layout and then choose Window/Save Perspective from the Menubar.
Navigator
The Navigator shows your test project data as it is stored on your machine. By default, tests are stored in the tests folder. You can create sub-folders to further organize your tests.
You can perform standard file-based operations such as Copy, Delete and Rename. If you change any of the files in the filesystem, you can also use Refresh to pick up the latest versions.
If you double-click on a file in the Navigator, it will open in the most recently used editor.
Org browser
When a new Connection is added, Provar will connect to the environment and cache its metadata. The Org Browser shows this metadata as currently cached.
There are filters to locate a specific object (at the top-right of the Org Browser) and to locate a specific row of data (at the bottom of the Org Browser). It is also possible to drag and drop a row of data from the Org Browser into a test case to create an API test step. Provar will periodically check the state of the cached metadata and prompt a refresh if the data is stale. To perform a refresh on the cache, click the icon in the top-right of the Org Browser.
The options are as follows:
Just reconnect: Use this option if you have detected a timeout in the Org Browser (this will not refresh the cache)
Refresh only changed files: This is a fast method of downloading changes
Reload all files: This will download all files
Check if metadata is stale: This will manually trigger the stale check
If any inconsistencies are detected that impact test cases, these will be highlighted in the Problems view.
Problems
The Problems view highlights any current errors in your project by checking through your tests for inconsistencies.
Examples of when an error might appear are:
- When metadata has changed and a test step needs to be updated (e.g. if a field referenced in a test step has been deleted)
- When a test step is missing a mandatory field
- When a test step is referring to a Connection which does not exist
When an error occurs, you can double-click on the row to navigate to the exception. A quick fix option is provided where possible to help with the refactoring.
Test Palette
The Test Palette provides a test step library for use in your automated tests. You can add any test step by clicking on it and dragging it into your test case. To search for a specific test step, use the magnifying glass icon at the top of the view.
Test case
The Test case view is the central view which displays the currently open test case. You can open any test case by double-clicking it in the Navigator.
A test case has a number of icons in the top-right corner. Note that icons will be grayed out if they are not relevant.
The icons provide access to the following features:
- Expected Exception
- Tags and SLAs
- Result Assertion
- Parameter Value Source
- Configure Parameters
Expected exception
Clicking the icon adds an Expected Exception section to the test step. Use this when the application is expected to throw an exception (a stacktrace or error) and the test step should pass as a result.
Once an Expected Exception is added, the different operators can be used to capture the exception and make the test step pass. If the anticipated exception is not thrown, the test step fails. This should not be confused with asserting that error text is displayed on screen during UI testing, which is part of functional testing.
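The concept is the same as expected-exception assertions in unit-testing frameworks. Purely as an analogy (this is JUnit 5, not Provar syntax, and the failing application call is invented for illustration):

```java
import static org.junit.jupiter.api.Assertions.assertThrows;

import org.junit.jupiter.api.Test;

class ExpectedExceptionAnalogy {

    // Hypothetical stand-in for an application call that is expected to fail.
    private void submitInvalidRecord() {
        throw new IllegalArgumentException("Required field missing");
    }

    @Test
    void stepPassesWhenExpectedExceptionIsThrown() {
        // Passes only if the exception is actually thrown;
        // if submitInvalidRecord() succeeds, the test fails instead.
        assertThrows(IllegalArgumentException.class, this::submitInvalidRecord);
    }
}
```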
Tags and SLAs
The second icon adds a Tag or SLA Requirement to the test step. Once a Tag or SLA is associated with a test step, reports can be generated with results grouped by Tag, e.g. CRM.
Result assertion
Clicking the icon will add a Result Assertion to a test step. Certain test steps will create these for you automatically, but they can be added manually as well. Multiple Result Assertions can be added per test step.
Parameter value source
Parameter Value Source is used to populate a given variable from data stored in Excel or a database. Once populated, this variable is available for use in Test Parameters.
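Under the hood this amounts to a keyed lookup against an external data source. As an illustration of the idea only (this uses Apache POI directly and is not Provar's mechanism; the file name and sheet layout are assumptions):

```java
import java.io.FileInputStream;

import org.apache.poi.ss.usermodel.Sheet;
import org.apache.poi.ss.usermodel.Workbook;
import org.apache.poi.ss.usermodel.WorkbookFactory;

public class ValueSourceSketch {
    public static void main(String[] args) throws Exception {
        // Hypothetical workbook: header in row 0, data starting at row 1.
        try (Workbook wb = WorkbookFactory.create(new FileInputStream("testdata.xlsx"))) {
            Sheet sheet = wb.getSheetAt(0);
            // Read the first data cell into a variable, which could then
            // be used to populate a test parameter.
            String accountName = sheet.getRow(1).getCell(0).getStringCellValue();
            System.out.println("accountName = " + accountName);
        }
    }
}
```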
Configure parameters
The Configure Parameters icon allows you to amend the fields associated with a test step. Once selected, a dialog box of fields to select from will appear. This is commonly used when adding an API test step via the Org Browser.
Test plug-ins
The Test Plug-ins view displays the plug-ins currently running. Provar is architected to run a single plug-in, for Salesforce APIs.
The plug-in should have a status of Started. Any errors are displayed in the log underneath. If errors occur, the plug-in can be restarted, started or stopped using the icons in the top-right corner.
Test Runner
The Test Runner provides a detailed output of the tests you have run. A new test run tab is created for each separate Test Run.
You can view the output of each test step, including detailed timings and screenshots, by clicking on it.
You can also:
- Access relevant test data by clicking on hyperlinks in the output pane
- Export results to CSV or PDF by clicking on the Export icon
- View runtime variables in the Variables view by clicking on the relevant Test Step and then clicking the Variables icon to bring the Variables View to the front. (Variables are only available when running in Test Builder mode or Debug mode)
Test settings
Test Settings contains six sub-tabs: Browser Configuration, Connections, Environments, Environment Variables, Tags and SLA, and Test Cycle.
Browser configuration
Browser Configuration allows you to launch a browser at the desired resolution when executing your test case.
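Provar handles this through the Browser Configuration tab itself; purely to illustrate the underlying idea, here is how a fixed resolution might be requested with raw Selenium WebDriver (not Provar configuration, and the 1366x768 resolution is just an example):

```java
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.chrome.ChromeDriver;
import org.openqa.selenium.chrome.ChromeOptions;

public class BrowserResolutionSketch {
    public static void main(String[] args) {
        ChromeOptions options = new ChromeOptions();
        // Ask Chrome to open at a fixed resolution (example values).
        options.addArguments("--window-size=1366,768");

        WebDriver driver = new ChromeDriver(options);
        try {
            driver.get("https://example.com");
        } finally {
            driver.quit();
        }
    }
}
```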
Connections
Provar accesses applications through interactive connections when authoring tests. The Connections tab is where these connections' login details are stored.
As well as Salesforce Connections, Provar can create connections for non-Salesforce testing, database testing, messaging and Google applications. Refer to Adding a Connection for more information.
Double-click on a given connection to open a new Org Browser for that environment.
Environments
This feature allows a test case to be run in multiple test environments without modifying the test case.
This screen allows you to create the environments used in your project.
Environment variables
Environment variables are used for any static variable whose value can vary between environments. Each variable is assigned a default value, which can be overridden for a specific environment. This section displays each variable's value as it relates to the test environment selected in the Menubar.
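The resolution rule is simple: an environment-specific value wins, and the default applies otherwise. A minimal sketch of that rule (illustrative only; the variable name, environment and values are invented):

```java
import java.util.Map;

public class EnvVarResolutionSketch {
    public static void main(String[] args) {
        // Default values, plus overrides for a hypothetical "UAT" environment.
        Map<String, String> defaults = Map.of("AdminEmail", "admin@example.com");
        Map<String, String> uatOverrides = Map.of("AdminEmail", "uat-admin@example.com");

        // Environment-specific value wins; otherwise fall back to the default.
        String resolved = uatOverrides.getOrDefault("AdminEmail",
                defaults.get("AdminEmail"));
        System.out.println(resolved); // prints uat-admin@example.com
    }
}
```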
Tags and SLA
Tags allow you to tag and categorize your test steps for easier maintenance and to provide grouping levels in a report. SLAs allow you to measure performance at a specific test step execution, such as the time taken to load a page.
Once these have been created, they can be assigned to test cases.
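An SLA check boils down to timing a step and comparing the elapsed time against a threshold. A minimal sketch of that idea (not Provar syntax; the 3-second limit and the sleep standing in for a page load are invented):

```java
public class SlaCheckSketch {
    public static void main(String[] args) throws InterruptedException {
        long slaMillis = 3_000; // invented example threshold

        long start = System.nanoTime();
        Thread.sleep(150); // stand-in for the measured step, e.g. a page load
        long elapsedMillis = (System.nanoTime() - start) / 1_000_000;

        System.out.println(elapsedMillis <= slaMillis
                ? "SLA met (" + elapsedMillis + " ms)"
                : "SLA breached (" + elapsedMillis + " ms)");
    }
}
```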
Test cycle
A test cycle is conceptually a collection of test case executions that are reported on collectively. This feature is helpful for getting a clean report from a test suite execution.
Variables
The Variables view displays information relevant to a specific test step of a test run. It is populated after you run tests in Debug mode or Test Builder mode. It also displays any environment variables.
To see variables for a given test step, click on the test step in the Test Runner.
The variables will be split into three groups:
- Parameters: the value of parameters passed into the test step
- Variables (Before): the state of each variable before executing the test step
- Variables (After): the state of each variable after executing the test step
For more information, check out this course on University of Provar.