Generate Test Case
The Generate Test Case test step is a beta feature, released for feedback, that automatically generates UI test cases for the requested Salesforce objects along with an example Excel document that makes the test cases data-driven.
Note: This is an experimental feature released to gather customer feedback before general availability, while we continue to add capabilities to it.
Provar’s TechOps team developed this feature during a recent innovation competition using our Custom API feature. It is an excellent example of the power of Provar Automation and of how you can extend the platform. To learn more about Custom APIs, please check out our documentation here.
Note: The use of Provar Labs prototypes is at your own risk. Provar Labs prototypes should only be used on your non-production instance to test their functionality if you accept the risk of doing so. These prototypes have not completed the beta testing phase and might pose a higher-than-normal risk for bugs. We may enhance, withdraw, or replace prototype features based on extended testing and feedback gathered. Do not rely on these features as part of your test automation.
Steps to Generate Test Case
Step 1: Create a new test case.
Click the New Test Case button in the header.
Step 2: Drag and drop the Generate Test Case API from the Test Palette into a test case.
Click Test Palette on the Provar screen and drag the Generate Test Case API from the ProvarLabs section into the test case.
Step 3: Enter the required details in the Test Step Parameters section: the Object Name and the Connection Name.
Users can enter multiple object names as comma-separated values in the Object Name field, for example Case, Account, Lead, or a custom object identified by its Metadata API name. The connection should be an OAuth (Web Flow) connection.
Above: Snapshot of Object Name and Connection Name (OAuth) in the Test Step Parameters section.
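For illustration only, the two parameters might be filled in as follows; the custom object API name and the connection name below are placeholders, not values shipped with Provar:

```
Object Name:      Case, Account, Lead, Invoice__c
Connection Name:  SalesforceOAuth   (an OAuth Web Flow connection)
```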
Step 4: Run the test case
Click Run to generate the test case automatically.
Above: Snapshot of the automatically generated test case.
When the test case is generated, an Excel sheet is also created containing all of the fields of the objects specified. The fields pre-filled with dummy data in the Excel sheet are the mandatory fields.
Above: Snapshot of example Excel sheet created while generating a test case.
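If you want to review the generated template programmatically, the minimal Python sketch below uses openpyxl to list the fields and the ones pre-filled with dummy data; the file path and the sheet layout (field names in row 1, a single dummy-data row in row 2) are assumptions based on the description above, not a documented Provar contract.

```python
# Minimal sketch: inspect a generated Excel template with openpyxl.
# Assumptions: the path is hypothetical, field API names sit in row 1,
# and a single dummy-data row sits in row 2.
from openpyxl import load_workbook

workbook = load_workbook("Templates/Account.xlsx")  # hypothetical path
sheet = workbook.active

fields = [cell.value for cell in sheet[1]]       # all object fields
dummy_row = [cell.value for cell in sheet[2]]    # dummy-data row

# Fields whose dummy cell is populated are the mandatory ones.
mandatory = [name for name, value in zip(fields, dummy_row) if value not in (None, "")]
print("Fields in template:", fields)
print("Mandatory fields (pre-filled with dummy data):", mandatory)
```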
Step 5: Disable the test case
After generation, disable this test case so it is not re-run unintentionally. We are sure you would not run this prototype as part of your Ant build or CI/CD pipeline. However, if you do run it in your CI/CD pipeline, existing test cases will not be overridden; if the utility is run multiple times, it generates new test cases and Excel files with _{num} appended to the test case name.
Above: Snapshot of multiple test cases generated for the same object.
Note: The test cases are generated in the Basic Validation Testcases folder, with two test cases generated for each object, and an Excel file is created inside the Templates folder containing all of the object’s fields and one row of dummy data for the mandatory fields. Currently, Generate Test Case only generates test cases for connections made in Provar’s default environment, and test cases will not be generated if the secrets file is encrypted.
Above: Snapshot of the default environment.
Above: Snapshot of file encryption.
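As an illustration of the note above (the file names are placeholders, not Provar’s documented naming), the generated artifacts after two runs for the Account object might look like this:

```
Basic Validation Testcases/
  Account_Create.testcase          (placeholder name)
  Account_Validate.testcase        (placeholder name)
  Account_Create_1.testcase        (second run, _{num} appended)
  Account_Validate_1.testcase
Templates/
  Account.xlsx
  Account_1.xlsx
```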
Known Limitations
This beta feature has the following known limitations, and we would appreciate customer feedback at provarlabs if this is of interest or if you would like to participate in a discussion on this feature:
- Limited log entries output when the test case generation is performed
- Environment overrides cannot be used; only the Default environment can be executed
- Encrypted projects and environments cannot be used with test case generation
- The feature has been tested with various Salesforce Orgs but not with every possible feature, such as Person Accounts, Multi-Currency, or Territory Management. Please share any issues you find so we can improve this before our GA release.
- The utility will generate test cases for the default record type if any object has multiple record types.
- The feature is currently placed in the Test Palette for visibility, but we plan to move it to a right-click option or button that opens a modal dialog to generate test cases. This will remove the risk of accidentally overriding generated test cases if the test case is run more than once.
- If a reference field is mandatory, the user must add it to the Excel sheet manually (see the sketch after this list).
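For the reference-field limitation above, the sketch below shows one way you might fill the lookup value into the generated Excel sheet with openpyxl; the file path, the AccountId column name, and the record Id are hypothetical examples, not values Provar produces.

```python
# Minimal sketch: manually add a mandatory reference (lookup) value to the
# generated Excel template. Path, column name, and record Id are hypothetical.
from openpyxl import load_workbook

workbook = load_workbook("Templates/Contact.xlsx")
sheet = workbook.active

fields = [cell.value for cell in sheet[1]]
column = fields.index("AccountId") + 1                        # 1-based column index
sheet.cell(row=2, column=column, value="001XXXXXXXXXXXXXXX")  # placeholder record Id

workbook.save("Templates/Contact.xlsx")
```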