
Key Concepts to Know Before You Start Testing

Before we move on to recording your first feature, let’s look at how things are structured inside AutoTester.

A project is like a folder that stores everything related to one specific web app or product you are testing. A project includes features, test flows, test cases, and results.

Each project contains multiple features. A feature is a specific function or section of your app you want to test. For example, the login feature, the search feature, or the checkout feature.

For each feature, AutoTester records a test flow. A test flow is the full process a user would go through when using a particular feature from start to finish.

AutoTester uses your recorded test flow to generate multiple test cases. Test cases reflect different possible ways the feature might behave based on how a user interacts with it. This could be completing the action successfully, showing an error, or handling something unexpected.

Think of each test case as a scenario a user might go through when using that feature.

Each test case includes detailed test steps. Test steps are the individual actions AutoTester follows to check if your app is working as expected. These could be things like clicking a button, checking a message, or verifying that a page loaded correctly.
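To make this hierarchy concrete, here is a rough sketch of how a project, its features, test flows, test cases, and test steps relate to one another. The type names and fields below are illustrative only, not AutoTester’s actual data format.

```ts
// Illustrative sketch: these names and fields are hypothetical,
// not AutoTester's actual data model.
interface Project {
  name: string;          // e.g. "My Web Shop"
  features: Feature[];   // login, search, checkout, ...
}

interface Feature {
  name: string;          // e.g. "Login"
  testFlow: TestStep[];  // the recorded start-to-finish flow
  testCases: TestCase[]; // generated from the recorded flow
}

interface TestCase {
  name: string;          // e.g. "Login with wrong password"
  scenario: "success" | "error";
  steps: TestStep[];     // the individual actions to perform and check
}

// A single action AutoTester follows while running a test case.
interface TestStep {
  action: "navigate" | "click" | "type" | "verify";
  target: string;        // a selector such as "#login-button"
  value?: string;        // text to type, or text that is expected to appear
}
```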

Every test case ends in a scenario: either a success scenario or an error scenario.

  • A success scenario means the feature worked as expected. The user provides valid input, and the system returns the correct result without any problems.
    Example: A user logs into an app with the correct username and password, and they are taken to the dashboard.

  • An error scenario means something went wrong, usually due to incorrect input, unexpected system behaviour, or external failures.
    Example: A user tries to log in with the wrong password. The system correctly identifies this and shows an error message like "Incorrect password."

Once you define the success or error scenario, AutoTester uses validations to check whether the app responded the right way. Validations are things like a success message, an error alert, or even a page change: anything that confirms the expected outcome actually happened and that the feature worked as intended.

To find and check these validations, AutoTester relies on a selector. A selector tells AutoTester exactly where to look on the page. For example, a specific message, button, or input field.
With the right selector in place, AutoTester can locate that element, check if it shows the expected result, and determine whether your test passes.
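If you’ve worked with web pages before, selectors will look familiar: they are usually short CSS-style strings that point at one element. The examples below are purely illustrative; the element names are assumptions, not part of your app.

```ts
// Illustrative CSS selectors (the element names here are hypothetical):
const selectors = {
  loginButton: "#login-button",       // find an element by its id
  errorAlert: ".alert-error",         // find an element by its class
  emailField: "input[name='email']",  // find an element by an attribute
};
```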

This combination of validation and selector allows AutoTester to test similar scenarios automatically and confirm whether your feature is behaving as expected.

Together, the validation (expected result) and the selector (where to find it) form an assertion. An assertion is what AutoTester uses to decide whether the test passed or failed.
Example: If you expect to see “Welcome back” after login, AutoTester looks for that exact message on the page. If it’s there, the test passes. If not, it fails.
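As a rough sketch, you can think of an assertion as a selector paired with an expected result. The helper below is hypothetical and uses plain browser DOM calls, not an AutoTester API; it only illustrates the pass/fail decision.

```ts
// Hypothetical helper illustrating an assertion: a selector plus an expected result.
// Not an AutoTester API; just "find the element, compare, pass or fail".
function assertTextVisible(selector: string, expectedText: string): boolean {
  const element = document.querySelector(selector);
  const actualText = element?.textContent ?? "";
  return actualText.includes(expectedText); // true = pass, false = fail
}

// Example: expect the page to show "Welcome back" after login.
// ".welcome-banner" is an assumed selector used here for illustration.
const passed = assertTextVisible(".welcome-banner", "Welcome back");
```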

You can add assertions manually or choose from AI-suggested ones during test setup.

Edge cases are unusual situations that might not happen often, but still need to be tested. They’re the “what if” scenarios like submitting a form with no input, entering special characters in a name field, or using an unusually long email address.

Edge cases help you catch bugs or unexpected behaviour that might not show up in typical user flows. AutoTester can generate many of these edge cases automatically, based on your recorded flow.
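In practice, many edge cases boil down to unusual inputs. The values below are illustrative examples of what you might feed into a form to exercise them.

```ts
// Example edge-case inputs for a form (values are illustrative only):
const edgeCaseInputs = [
  "",                                // submitting with no input
  "José-María O'Brien",              // special characters in a name field
  "a".repeat(300) + "@example.com",  // unusually long email address
];
```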

Now let’s move on to testing your app.