Generate Test Case
The Generate Test Case test step is a beta feature, released for customer feedback, that automatically generates UI test cases for the requested Salesforce objects together with an example Excel document that makes the generated test cases data-driven.
Note: This is an experimental feature released to gather customer feedback before general availability; we will continue to add capabilities to it.
Provar’s TechOps team developed this feature during a recent innovation competition using our Custom API feature. This is an excellent example of the power of Provar Automation and how you can extend the platform. To learn more about Custom APIs, please see our Custom APIs documentation.
Note: The use of Provar Labs prototypes is at your own risk. Provar Labs prototypes should only be used on your non-production instance to test their functionality if you accept the risk of doing so. These prototypes have not completed the beta testing phase and might pose a higher-than-normal risk for bugs. We may enhance, withdraw, or replace prototype features based on extended testing and feedback gathered. Do not rely on these features as part of your test automation.
Steps to Generate Test Case
Step 1: Create a new test case.
Click the New Test Case button in the header.
Step 2: Drag and drop the Generate Test Case API from the Test Palette into the test case.
Click Test Palette on the Provar screen and drag the Generate Test Case API from the ProvarLabs section into the test case.
Step 3: Enter the required details in the Test Step Parameters section, providing the Object Name and Connection Name.
Users can enter multiple object names as comma-separated values in the Object Name field, for example Case, Account, Lead, or a custom object, using the object's Metadata API name. The connection should be an OAuth (Web flow) connection.
Above: Snapshot of Object Name and Connection Name (OAuth) in the Test Step Parameters section.
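For illustration, the parameters might be filled in as follows. The object and connection names below are example values only, not names the feature requires:

Object Name: Account, Case, Invoice__c
Connection Name: MyOrgOAuth (an OAuth Web flow connection to the org under test)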
Step 4: Run the test case
Click Run to generate the test case automatically.
Above: Snapshot of Basic Validation Testcases folder.
Above: Snapshot of the automatically generated test case.
When the test case is generated, an Excel sheet is also created that lists all of the object's fields. The fields populated with dummy data in the Excel sheet are the object's mandatory fields.
Above: Snapshot of example Excel sheet created while generating a test case.
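For illustration only, a generated sheet for the Case object might look like the following, with one column per field on the object and dummy values supplied only for the mandatory fields (the exact columns and values depend on your org's configuration; here Status and Origin are assumed to be mandatory):

Subject | Status | Origin | Priority | Description
        | New    | Web    |          |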
Step 5: Disable the test case
We are sure you would not run this prototype as part of your ANT build or your CI/CD pipeline. However, if you do run it as part of your CI/CD pipeline, the old test cases will not be overwritten; if the utility is run multiple times, it generates new test cases and Excel files with _{num} appended to the test case name.
Above: Snapshot of multiple test cases generated of the same object.
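For example, if the utility were run three times for the Account object, the first run would produce the original test cases and Excel file, the second run would produce copies with _1 appended, and the third run copies with _2 appended. The file names below are hypothetical and shown only to illustrate the suffix pattern:

Account.testcase
Account_1.testcase
Account_2.testcase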
Note: The test cases are generated in the Basic Validation Testcases folder; two test cases are generated for each object, and an Excel file is created inside the Templates folder containing all of the object's fields and one row of dummy data for the mandatory fields. Currently, Generate Test Case only generates test cases for connections made in Provar's default environment. Test cases will not be generated if the secrets file is encrypted.
Above: Snapshot of the default environment.
Above: Snapshot of file encryption.
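For illustration, after running the utility for the Account object, the project might contain something like the following (the folder names come from the note above; the file contents are described as placeholders rather than exact names):

Basic Validation Testcases/
    (two generated test cases for Account)
Templates/
    (Excel file listing Account's fields, with dummy data for mandatory fields)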
Known Limitations
This beta feature has the following known limitations. We would appreciate customer feedback at provarlabs if this is of interest to you or if you would like to participate in a discussion about this feature:
- Limited log entries are output when test case generation is performed
- Environment overrides cannot be used; only the Default environment can be executed
- Encrypted projects and environments cannot be used with test case generation
- The feature has been tested with various Salesforce Orgs but not with every possible feature, such as Person Accounts, Multi-Currency, or Territory Management. Please share any issues you find so we can improve this before our GA release.
- If an object has multiple record types, the utility generates test cases for the default record type only.
- The feature is currently placed in the Test Palette for visibility, but we plan to move it to a right-click option or button that uses a modal dialog to generate test cases. This will remove the risk of accidental regeneration if the test case is run more than once.
- If a reference field is mandatory, the user must manually add it to the Excel sheet (see the example below).
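For example, if test cases are generated for Contact and a lookup to Account is mandatory in your org, the generated Excel sheet will not contain a value for that reference field; you would need to add a valid Account record Id (or other reference value) to the corresponding column yourself before running the generated test case. The object and field in this example are illustrative only.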