Beta: Automated testing
As your Extension grows in complexity, verifying that it performs all of its responsibilities correctly becomes more labor-intensive. You may eventually want to automate that task, and software tests are a great way to do so. Airtable maintains a JavaScript library, "blocks-testing", which lets you simulate how your Extension behaves under controlled conditions (for example, given specific Base structures and user interactions). This guide explains how you can use it to automate the quality-assurance process.
Please note: We are actively working to improve the experience of writing automated tests. This is still a beta feature: we anticipate updates to this experience in the future.
Setup
First, install the library using your preferred package manager. Here's how you would do that using npm:
$ npm install --save-dev @airtable/blocks-testing
...and here's how you would install the library using yarn:
$ yarn add --dev @airtable/blocks-testing
Next, install a testing framework of your choice. Airtable uses Jest to test its own Extensions internally, but any framework should work.
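For example, if you go with Jest, you can install it the same way (shown here with npm; your package manager and version constraints may differ):
$ npm install --save-dev jest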
After that, install and configure the JSDom library. This library simulates a web browser; it intentionally lacks a visual component, making it suitable for use in continuous integration environments like Travis CI or GitHub Actions. If you chose to use Jest, you can skip this step because Jest includes JSDom automatically.
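If you do need to configure the environment explicitly (for example, with a test runner or Jest setup that doesn't use JSDom by default), a minimal sketch of a Jest configuration might look like this. Depending on your Jest version, the jsdom environment may also require installing the separate jest-environment-jsdom package:

```js
// jest.config.js -- a minimal sketch; adjust to your project's needs.
module.exports = {
    // Run tests in a simulated browser (JSDom) environment instead of plain Node.
    testEnvironment: 'jsdom',
};
```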
Writing tests
Each of your test files must import the TestDriver constructor before any other parts of the Blocks SDK (including your Extension).
```js
import TestDriver from '@airtable/blocks-testing';

// Import your Extension *after* the testing library (the name and location of your
// Extension will likely differ from this example):
import MyExtension from '../frontend/my-extension';
```
The next steps are much more open-ended. At a high level, each test will typically perform the following steps:
1. Create a TestDriver instance, providing valid fixture data:

   ```js
   const testDriver = new TestDriver({/* fixture data here */});
   ```

   You can use the Test Fixture Generator Extension to create test data from the Base you are using for development.

2. Create an instance of the Extension under test by rendering the Extension's Component as a child of the TestDriver#Container Component:

   ```js
   render(
       <testDriver.Container>
           <MyExtension />
       </testDriver.Container>,
   );
   ```

3. Provide some input to the Extension. This may be in the form of a simulated user interaction (e.g. via the @testing-library/user-event library)...

   ```js
   // Simulate a user choosing the "Gruyere" option from the table labeled
   // "Cheeses".
   const input = screen.getByLabelText('Cheeses');
   const option = screen.getByText('Gruyere');
   userEvent.selectOptions(input, [option]);
   ```

   ...or a simulated backend behavior (e.g. via the Blocks SDK or the TestDriver API):

   ```js
   // Invoking `createTableAsync` in a test script simulates the condition
   // where the Airtable backend reports that a table has been created by
   // another viewer of the Base.
   await testDriver.base.createTableAsync('a new table', [
       {name: 'address', type: FieldType.EMAIL},
   ]);
   ```

4. Verify that the Extension responded to the input as expected. For many kinds of interactions, the Extension's response will be discernible by some change in the UI. The @testing-library/react library is a good choice for inspecting the state of the user interface...

   ```js
   // Ensure that the UI updated to display the checkbox as "checked"
   const checkbox = screen.getByRole('checkbox');
   expect(checkbox.checked).toBe(true);
   ```

   ...while the TestDriver API can be used to verify Extension behaviors which do not influence the UI:

   ```js
   // Track every time the Extension attempts to expand a record.
   const recordIds = [];
   testDriver.watch('expandRecord', ({recordId}) => recordIds.push(recordId));

   // Simulate a user interacting with the Extension running in the test
   // environment.
   const expandButton = screen.getByRole('button');
   userEvent.click(expandButton);

   // Ensure that the Extension attempted to expand the Record with ID `reca`
   expect(recordIds).toEqual(['reca']);
   ```
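Putting these steps together, a complete test file might look something like the following sketch. It assumes a Jest-style runner and reuses the APIs shown above; the file paths, the fixture module, and the behavior under test (a button that expands a record) are illustrative assumptions and will differ in your own Extension:

```js
import React from 'react';
import TestDriver from '@airtable/blocks-testing';
import {render, screen} from '@testing-library/react';
import userEvent from '@testing-library/user-event';
// Hypothetical paths -- substitute your own fixture data and Extension.
import fixtureData from './fixtures/project-tracker';
import MyExtension from '../frontend/my-extension';

describe('MyExtension', () => {
    let testDriver;

    beforeEach(() => {
        // Step 1: create a TestDriver instance from fixture data.
        testDriver = new TestDriver(fixtureData);

        // Step 2: render the Extension inside the test driver's Container.
        render(
            <testDriver.Container>
                <MyExtension />
            </testDriver.Container>,
        );
    });

    it('expands a record when its button is clicked', () => {
        // Step 4 (setup): record every attempt to expand a record.
        const recordIds = [];
        testDriver.watch('expandRecord', ({recordId}) => recordIds.push(recordId));

        // Step 3: simulate a user clicking the Extension's button.
        userEvent.click(screen.getByRole('button'));

        // Step 4: verify the Extension responded as expected.
        expect(recordIds).toHaveLength(1);
    });
});
```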
You can repeat these steps to verify almost any feature of your Extension. With tests written, you'll have a fast and repeatable process to vet future changes to your Extension.
Test fixture data
To make it easier to feed realistic table, view, and field data into your automated tests, the Test Fixtures Generator Extension allows you to create valid fixture data based on an existing Base. You can then copy and paste the generated output into your automated tests to mimic the real-world scenarios your unit tests cover. All collaborator details and Airtable IDs (e.g. base ID, record ID) are redacted; field data, however, will still appear in the results as-is.
Let's say the Extension you are building will primarily be used within a project tracker Base. A sample Base that you put together during development might include the following tables and fields:
- Tasks
  - Name
  - Status
  - Due Date
  - Project
- Projects
  - Name
  - Owner
- Owners
  - Name
  - Email Address
If we want to write tests for our Extension that cover scenarios involving our Tasks and Projects tables, we can use the Test Fixtures Generator Extension to enable just those tables for the output.
Additionally, you can narrow the output further if only certain fields are required for the scenario you are testing.
Once you've selected which models to include in the fixture data, click Generate to produce the results. From there, you can copy and paste the results into the relevant test file, as shown in the examples above.
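If you'd like to keep the pasted output separate from your test logic, one option is to save it in a module of its own and import it from each test file that needs it. The file name below is just an example:

```js
// fixtures/project-tracker.js -- a hypothetical module holding generated fixture data.
// Paste the output of the Test Fixtures Generator Extension here and export it.
export default {
    /* generated fixture data goes here */
};
```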
More examples
The source code repositories for some of the official example Extensions include automated tests. You can review those to get inspiration for testing your own Extensions.