Refined Ayon Desktop App Integration Testing Workflow Proposal

This proposal aims to enhance the existing host integration testing for Ayon (formerly known as OpenPype). The primary goal is to enable Studios and the Client Success team to conduct tests independently before considering a release version suitable for production deployment.

Integration Testing Triggers

To initiate the new integration testing, the following triggers are proposed:

  1. Ayon Client App (Staging Mode): Starting the Ayon client app in “staging” mode will display a “Testing” section within the Ayon Tray menu. This section will offer a hierarchy of options:
  • Running host-specific tests in headless mode.
  • Executing all tests or a specific testing unit under the chosen host.
  • Conducting a test using the code’s default settings.
  • Performing a test with Studio default overrides.
  • Running a test with Project overrides.
  2. App Launcher (Staging Mode): Using the App Launcher in “staging” mode allows users to open a host in a project-specific context. The host launcher will offer testing options, enabling the creation of a test project with a cloned hierarchy and connected project settings overrides.
  3. CI Headless Mode: The existing headless mode used in the TeamCity framework.
+---------------------+     +------------------+     +------------------+
| Ayon Client App     |     | App Launcher     |     | CI Headless Mode |
| (Staging Mode)      |     | (Staging Mode)   |     | (TeamCity)       |
+----------+----------+     +--------+---------+     +--------+---------+
           |                         |                        |
           +-------------------------+------------------------+
                                     |
                                     v
                          +----------+----------+
                          |  Integration Tests  |
                          +---------------------+

Types of Tests

There are two ways to perform tests:

  1. Deterministic Testing (Code-Release Studio Setting Defaults): Deterministic testing compares outputs against predefined expected metrics. These tests are used mostly in CI and in headless (non-interactive) testing initiated from the Ayon Tray submenu. They are not applicable when studio project settings overrides are in use; in such cases, non-deterministic testing is employed instead (see the comparison sketch after this list).

    • Advantages: Measurable and suitable for CI since results are comparable
    • Disadvantages: Limited testing scope due to the use of predefined database dumps
  2. Non-Deterministic Testing (Studio Project Settings Override): Non-deterministic testing is a simplified method for assessing outputs. It is only used when a user initiates tests with selected studio project settings overrides. These tests confirm that the code does not disrupt or hinder production, making them a valuable tool for the Client Success team. Non-deterministic user tests allow users to examine output data and visually check the results (e.g., comparing output with or without Burn-ins).

    • Advantages: Testable in production, can better reveal edge cases
    • Disadvantages: Not measurable since outputs cannot be compared with predefined expected data
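
As a rough sketch of what a deterministic check could look like: the produced output folder is compared against a JSON manifest of expected results. The manifest schema used here (relative path, byte size, SHA-256) is an assumed example format, not Ayon's actual dataset layout.

```python
"""Minimal sketch of a deterministic check against a hypothetical
JSON manifest of expected outputs."""
import hashlib
import json
from pathlib import Path


def verify_output(output_dir: str, manifest_path: str) -> list[str]:
    """Return a list of human-readable failures; empty means the test passed."""
    root = Path(output_dir)
    failures = []
    for entry in json.loads(Path(manifest_path).read_text()):
        path = root / entry["path"]
        if not path.is_file():
            failures.append(f"missing: {entry['path']}")
            continue
        if path.stat().st_size != entry["size"]:
            failures.append(f"size mismatch: {entry['path']}")
        digest = hashlib.sha256(path.read_bytes()).hexdigest()
        if digest != entry["sha256"]:
            failures.append(f"hash mismatch: {entry['path']}")
    return failures
```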

Testing with Studio Default Settings or Project Overrides

To run tests with Studio default settings or project overrides, a new testing project should be created in MongoDB or PostgreSQL. The selected overrides should then be copied into the testing database and connected to the testing project, ensuring that no studio projects are altered during testing.
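
As a rough illustration of the cloning step, here is the MongoDB flavor using pymongo. The database and collection names (studio_settings, project_overrides) and the document shape are hypothetical placeholders, not Ayon's actual schema.

```python
"""Sketch: create a disposable test project by cloning another project's
settings overrides, so the originals are never touched."""
from pymongo import MongoClient


def clone_project_for_testing(uri: str, source_project: str, test_project: str):
    client = MongoClient(uri)
    db = client["studio_settings"]  # hypothetical database name

    # Copy every override document of the source project, re-keyed
    # to the test project; the source documents stay untouched.
    for doc in db["project_overrides"].find({"project": source_project}):
        doc.pop("_id", None)  # let MongoDB assign a fresh id
        doc["project"] = test_project
        db["project_overrides"].insert_one(doc)


if __name__ == "__main__":
    clone_project_for_testing(
        "mongodb://localhost:27017", "my_show", "my_show_test"
    )
```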

Current Datasets for Deterministic Testing

At present, each integration test sources its dataset package from cloud storage (specifically, Google Drive). These packages must be prepared for each testing unit and may include an "expected" folder containing the expected folder hierarchy, file naming conventions, formats, and properties (e.g., bit depth, bit size, hash, layers, parts).

For a detailed description of the .zip package contents, visit Ayon Integration Testing.

Improving the Workflow

To enhance the deterministic testing workflow, the expected folder and its physical data need not be included in the dataset. Instead, a tool can be developed to convert this data into a JSON manifest. This change would significantly reduce dataset sizes, enabling their inclusion in addons and eliminating the need for separate distribution.
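
A minimal sketch of such a conversion tool, assuming the "expected" folder layout described earlier; the manifest schema mirrors the hypothetical one used in the deterministic check above:

```python
"""Sketch: convert physical expected files into a JSON manifest so the
heavy .zip payload no longer needs to ship with the dataset."""
import hashlib
import json
from pathlib import Path


def build_manifest(expected_dir: str, manifest_path: str) -> None:
    root = Path(expected_dir)
    entries = []
    for path in sorted(root.rglob("*")):
        if not path.is_file():
            continue
        digest = hashlib.sha256(path.read_bytes()).hexdigest()
        entries.append({
            "path": path.relative_to(root).as_posix(),  # hierarchy + naming
            "size": path.stat().st_size,                # byte size
            "sha256": digest,                           # content fingerprint
        })
    Path(manifest_path).write_text(json.dumps(entries, indent=2))


if __name__ == "__main__":
    build_manifest("tests/expected", "tests/expected_manifest.json")
```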

Additional Recommendations

  1. Test Coverage and Documentation: Ensure comprehensive test coverage for all host integrations, including edge cases. Maintain clear and concise documentation for all tests, detailing their purpose, functionality, and any required setup or prerequisites.
  2. Automated Test Execution and Reporting: Implement automated test execution to streamline the testing process and generate reports that highlight any issues or discrepancies. This can help the Client Success team quickly judge whether a release is ready for production.
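
As a sketch of the execution side, assuming the integration tests are collected by pytest (the paths and report location are placeholders): JUnit XML output is a report format CI servers such as TeamCity can ingest.

```python
"""Sketch: run the integration test suite and emit a machine-readable
report for the CI server."""
import sys

import pytest

if __name__ == "__main__":
    # --junitxml produces a report that CI dashboards can parse.
    exit_code = pytest.main([
        "tests/integration",
        "--junitxml=reports/integration.xml",
    ])
    sys.exit(exit_code)
```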

It should also be mentioned that an Issue has already been created for this.

Sounds great!

I think you’ve covered all the cases needed. I would just add that, in order for studios and contributors to adopt integration testing, we’ll need to lower the barrier to entry as much as possible.
A current pain point with integration tests is having to code your way to replicating an issue or feature. Due to time or skill constraints, coding a replication of a workfile may leave a lot of issues untested.

My suggestion would be to use workfiles as a source for the integration tests, alongside coded test cases. This might only be useful for publishing, but that is a big part of troubleshooting issues.
Potentially, something like template building could serve to test loading functionality.

Looks great!! Just one remark from when I tried to run the current OP testing framework: since my DCC installations don’t follow the default paths in the settings, I couldn’t run the default tests and had to inject my own database and hack the test framework. I see that you are addressing some of this with the non-deterministic testing part; however, even for deterministic testing, it would be good to allow some basic overrides that we know are (or at least should be) harmless to the test results (e.g., setting the paths where my DCCs live).


Sounds amazing.

About non-deterministic testing: it might not be too difficult to compare images/video produced by a test with expected files by calculating PSNR, or maybe VMAF, with ffmpeg (a sketch follows below).
For other data, comparing hashes comes to mind, but that might be misleading.
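
For instance, a rough sketch of the PSNR variant, assuming ffmpeg is available on PATH; the 35 dB threshold is an arbitrary placeholder, and the stderr parsing may need adjusting across ffmpeg versions:

```python
"""Sketch: score a produced video against an expected one with
ffmpeg's psnr filter."""
import re
import subprocess


def psnr(produced: str, expected: str) -> float:
    result = subprocess.run(
        ["ffmpeg", "-i", produced, "-i", expected,
         "-lavfi", "psnr", "-f", "null", "-"],
        capture_output=True, text=True,
    )
    # The psnr filter prints e.g. "... average:42.35 ..." to stderr;
    # identical inputs yield "average:inf".
    match = re.search(r"average:([0-9.]+|inf)", result.stderr)
    if not match:
        raise RuntimeError("could not parse PSNR from ffmpeg output")
    value = match.group(1)
    return float("inf") if value == "inf" else float(value)


if __name__ == "__main__":
    score = psnr("output/review_h264.mp4", "expected/review_h264.mp4")
    print("close enough" if score > 35.0 else "too different", score)
```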