Test case design
Below are some best practices for designing Provar test cases.
Reusable tests
Wherever possible, Callable test cases should be used to maximize reuse.
- Always use input and return parameters to pass data in and out (avoid variables with Scope = Test Folder or Test Run).
- Keep the test flexible enough to support multiple use cases; an If statement can help branch between them.
- Split the logic into multiple Callable test cases if a single test case is becoming overly complex and difficult to support.
- Choose a meaningful Callable test case name and provide a meaningful summary description. The summary description is displayed in the Assistant view when using the Callable test.
Readable tests
It is important that team members can run and understand each other’s tests.
- Use Group Steps to organize your test cases and add comments. They can be added at any level and help to reduce complexity.
- Create meaningful folder structures in the Navigator and conform to an agreed naming standard for test cases.
Maintainable tests
Ease of maintenance must be considered when creating test cases.
- Reduce the number of UI test steps by using SFDC API test steps.
- Reduce the number of UI test steps by allowing Provar to navigate directly to SFDC layouts. For this, you can use the Navigate option on any UI On Screen Test Step.
- Do not rely on existing data in your test environment; always create what you need for your test case.
- Avoid unreliable field locators.
Field locators
The following types of locators should be avoided:
- Salesforce IDs: IDs such as 00eb0000000uFma are not guaranteed to be the same between environments. Use Provar’s metadata integration instead.
- J_ids on Visualforce pages: IDs such as j_id0:j_id1:j_id2:j_id32:j_id33 can change frequently. Use Provar’s Visualforce locators instead.
- Extremely long XPaths (e.g. /html/body/div[4]/div/div[2]/div[2]/div[1]/div/table[1]/tbody/tr[2]/td[2]): when testing non-Salesforce pages, Provar offers an XPath editor, which is recommended for creating an optimized XPath.
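To illustrate why long positional XPaths are fragile, here is a minimal sketch in plain Python (the page markup and field names are invented for the example, and this is not Provar syntax): a long positional path and a short attribute-based one resolving the same cell.

```python
# Illustrative sketch: the same table cell located two ways using
# Python's stdlib XML parser. The long positional path breaks whenever
# the page structure shifts; the attribute-based path keys on stable
# markup and survives layout changes.
import xml.etree.ElementTree as ET

page = ET.fromstring("""
<html><body>
  <div><div>
    <table id="results">
      <tbody>
        <tr><td>Name</td><td>Status</td></tr>
        <tr><td>Acme Corp</td><td class="status">Active</td></tr>
      </tbody>
    </table>
  </div></div>
</body></html>
""")

# Brittle: every positional step is an assumption about the layout.
brittle = page.find("./body/div/div/table/tbody/tr[2]/td[2]")

# Robust: anchor on a stable id, then match the field by attribute.
table = page.find(".//table[@id='results']")
robust = table.find(".//td[@class='status']")

print(brittle.text, robust.text)  # both resolve to the same cell
```

Inserting one extra wrapper div would silence the brittle locator while leaving the attribute-based one untouched, which is the behavior an optimized XPath aims for.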
Other standards
- A peer review process should be used in the creation of test cases.
- If tests are executed on multiple Orgs or environments, keep those Orgs/environments consistent with each other so that tests do not fail due to environmental differences.
- Transient data (not present after a sandbox refresh) should not be used in tests.
- Page Objects should not have multiple definitions for the same field.
- If modifying data in your Org, ensure it is returned to its original state.
- Tests should be able to run using the Run mode without breakpoints or debugging.
- Use the Sleep test step sparingly. Page Object timeouts (for UI testing) and the Wait For test step (for asynchronous testing) give better control.
- All tests should use group steps to help explain the functional logic.
- Screenshots should be added at essential points such as UI Asserts.
- Test cases should be backed up in the Configuration management system of choice.
- When performing table testing, the With Row should use a WHERE clause in preference to a hardcoded Row Number.
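The reasoning behind preferring a WHERE clause over a hardcoded row number can be sketched in plain Python (the row data is invented for the example, and this is not Provar’s With Row syntax): an index-based lookup encodes an assumption about row order, while a key-based match does not.

```python
# Illustrative sketch: two ways to pick a row out of a table.
rows = [
    {"Name": "Acme Corp", "Stage": "Prospecting"},
    {"Name": "Globex",    "Stage": "Closed Won"},
]

# Brittle: assumes the target record is always the second row.
by_index = rows[1]

# Robust: the equivalent of "WHERE Name = 'Globex'"; survives re-ordering,
# added rows, and changed sort criteria.
by_where = next(r for r in rows if r["Name"] == "Globex")

print(by_index is by_where)  # True only while the order happens to match
```

A freshly created record, a different sort order, or another user’s data can all change which record occupies row 2, so the WHERE-style match is the only lookup that still targets the intended record.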