`content/community/3.codebase/3.testing.md`:

---
title: Testing
description: How to run unit and blackbox tests in the Directus codebase.
---

Directus has two main methods of testing: unit tests and blackbox tests. Unit tests are located in each package and can be run from the package itself. Blackbox tests, on the other hand, are located in `tests/blackbox`.

## Running tests

### Unit tests
```bash
# Run this from the package whose tests you want to start
pnpm test
```

### Blackbox tests

::callout{icon="material-symbols:warning-rounded" color="info"}
[Docker](https://docs.docker.com/get-docker/) is required to run the blackbox tests locally.
::

```bash
# Test against Directus running with Postgres
pnpm test

# Test against all databases
pnpm test:all

# Test against a specific database. The --project option can be used multiple
# times to test against several different databases at the same time.
pnpm vitest --project sqlite
```

### Vitest options

Both unit and blackbox tests run through [Vitest](https://vitest.dev) and thus support all of the options Vitest offers for customizing which tests to run.

```bash
# Run all tests that have "permission" in their filename.
pnpm test permissions

# Run all tests and watch for changes in the test files.
pnpm test -w
```

For more options, see the [Vitest CLI reference](https://vitest.dev/guide/cli.html).
## Writing unit tests

Unit tests are written throughout the codebase using [Vitest](https://vitest.dev), a Vite-native unit test framework.

```ts
import { expect, test } from 'vitest';

import { formatTitle } from './format-title.js';

test('should format a title', () => {
	const result = formatTitle('hello-world');

	expect(result).toBe('Hello World');
});
```

::callout{icon="material-symbols:info-outline" color="success"}
Please follow <u>[these guidelines](https://github.com/goldbergyoni/nodejs-testing-best-practices/blob/master/README.md)</u>, as they form a good and extensive baseline for how tests should be structured and organized, and they explain a lot of useful concepts.
::

## Writing blackbox tests
The basic test structure is as follows:

1. (Optionally) create a new folder for your tests.
2. Create your test file, ending with `.test.ts`.
3. You can start with this template:

```ts
import { createDirectus, rest, serverPing, staticToken } from '@directus/sdk';
import { useSnapshot } from '@utils/useSnapshot.js';
import { expect, test } from 'vitest';

import { port } from '@utils/constants.js';
import type { Schema } from './schema.js';

const api = createDirectus<Schema>(`http://localhost:${port}`).with(rest()).with(staticToken('admin'));

const { collections } = await useSnapshot<Schema>(api);

test('ping', async () => {
	const result = await api.request(serverPing());

	expect(result).toBe('pong');
});
```

### Using a custom schema

Blackbox tests let you easily set up custom schemas for testing via the `useSnapshot` function. By default, the function uses the `snapshot.json` file that should be located at the same level as your test file. It ensures that collection names are always unique by mapping each name in the schema to a name that is unique to the test, and it returns the mapped collection names as well as the used schema in JSON format.

```ts
useSnapshot<Schema>(api: DirectusClient<Schema> & RestClient<Schema>, file?: string = 'snapshot.json'): Promise<{
collections: Collections<Schema>;
snapshot: Snapshot;
}>
```

To quickly create such a schema and the types for it, use the sandbox CLI:

```bash
# Run this in the same folder as your tests
sandbox -s -x postgres
```

The `-x` option exports the schema every 2 seconds, and `-s` starts the Directus instance using a preexisting `snapshot.json`, so editing an existing snapshot is also quickly possible.

When creating collections, make sure that each collection name ends with `_1234`, as this is required to ensure that each collection gets a unique name in the blackbox tests.
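To illustrate the idea behind the `_1234` placeholder (this helper and the mapping format are hypothetical, not the actual `useSnapshot` internals), the remapping could look roughly like this:

```typescript
// Hypothetical sketch of the collection-name mapping idea: the `_1234`
// placeholder at the end of a collection name is swapped for a suffix
// that is unique to the running test file.
function mapCollectionName(name: string, uid: string): string {
	return name.replace(/_1234$/, `_${uid}`);
}

// Every test file gets its own suffix, so the same snapshot can be
// applied many times without the collections colliding.
console.log(mapCollectionName('articles_1234', 'a1b2c3')); // articles_a1b2c3
```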

### Avoiding naming conflicts

If you want to manually create collections or other items such as users, you have to make sure that they are unique across test runs and across different tests. A good way to do that is to use the `getUID()` or `randomUUID()` functions. `getUID()` returns a string that is unique to the file you are currently running in but stays the same for that file, whereas `randomUUID()` always returns a random UUID.
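As a sketch of why the two differ (this `getUIDFor` helper is illustrative, not the actual `@utils/getUID.js` implementation), a per-file ID can be derived deterministically, e.g. by hashing the file path, while `randomUUID()` changes on every call:

```typescript
import { createHash, randomUUID } from 'node:crypto';

// Illustrative stand-in for getUID(): hash the calling file's path so the
// result is stable for one file but different between files.
function getUIDFor(filePath: string): string {
	return createHash('sha256').update(filePath).digest('hex').slice(0, 8);
}

const a = getUIDFor('/tests/blackbox/ping.test.ts');
const b = getUIDFor('/tests/blackbox/ping.test.ts');
console.log(a === b); // true: same file, same ID

// randomUUID() is different on every call, so it never collides across runs.
console.log(randomUUID() === randomUUID()); // false
```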

### Launching a custom Directus instance

It is possible to spin up separate Directus instances inside a test file itself. This is useful for testing cases that involve, for example, horizontal scaling.

```ts
import { join } from 'node:path';

import { sandbox } from '@directus/sandbox';
import { database } from '@utils/constants.js';
import { getUID } from '@utils/getUID.js';
import getPort from 'get-port';

const port = await getPort();

const directus = await sandbox(database, {
	port,
	// Custom schema to start the instance with
	schema: join(import.meta.dirname, 'snapshot.json'),
	inspect: false,
	silent: true,
	docker: {
		basePort: port + 1,
		// Make sure this is unique, as it could collide with other Docker project names
		suffix: getUID(),
	},
});
```

::callout{icon="material-symbols:info-outline" color="info"}
For more information about the configuration options, see the [sandbox CLI documentation](/community/codebase/sandbox-cli).
::
`content/community/3.codebase/4.sandbox-cli.md`:
---
title: Sandbox CLI
description: How to make use of the Sandbox CLI and API when working with Directus.
---

Utility functions for quickly spinning up and tearing down instances of Directus for purposes such as testing or development.

## Usage

The package offers two ways of interacting with it: calling JS functions or using the command line interface.

### CLI

```bash
Usage: sandbox [options] <database>

Arguments:
database What database to start the api with (choices: "maria", "cockroachdb", "mssql", "mysql", "oracle", "postgres", "sqlite")

Options:
-b, --build Rebuild directus from source
-d, --dev Start directus in developer mode. Not compatible with build
-w, --watch Restart the api when changes are made
--inspect Start the api with debugger (default: true)
-p, --port <port> Port to start the api on
-x, --export Export the schema to a file every 2 seconds
-s, --schema [schema] Load an additional schema snapshot on startup
--docker.basePort <dockerBasePort> Minimum port number to use for docker containers
--docker.keep Keep containers running when stopping the sandbox
--docker.name Overwrite the name of the docker project
--docker.suffix Adds a suffix to the docker project. Can be used to ensure uniqueness
-e, --extras <extras> Enable redis,maildev,saml or other extras
--silent Silence all logs except for errors
-i, --instances <instances> Horizontally scale directus to a given number of instances (default: "1")
--killPorts Forcefully kills all processes that occupy ports that the api would use
-h, --help display help for command
```

### API

The API is accessed through the following two functions:

```ts
function sandbox(database: Database, options?: DeepPartial<Options>): Promise<Sandbox>

function sandboxes(sandboxes: SandboxesOptions, options?: {
build?: boolean;
dev?: boolean;
watch?: boolean;
}): Promise<Sandboxes>
```

#### Example

```ts
import { sandbox } from '@directus/sandbox';

const sb = await sandbox('postgres', { dev: true });

// Interact via Rest, GQL or WebSockets
const result = await fetch(sb.env.PUBLIC_URL + '/items/articles');

console.log(await result.json());

await sb.close();
```

## Inner workings

Depending on what is set in the configuration, some of these steps might be skipped:

1. **Building of the API**: If enabled, the API is freshly built each time the sandbox is started. Use the `watch`
   option to quickly iterate on changes.
2. **Starting of the Docker containers**: All required Docker containers, such as databases or extras like Redis, are
   spun up and awaited until healthy. If the containers are still running from a previous run, they are reused instead
   of starting new ones.
3. **Bootstrapping of the database**: If not already bootstrapped, the sandbox will make sure that all necessary
database tables are created.
4. **Loading of a schema snapshot**: In case the `schema` option is set, the snapshot is also applied to the database
   before starting Directus.
5. **Starting of the API**: Finally, the API instance(s) are spun up with the right environment variables configured.
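The sequence above can be sketched as follows. This is a minimal illustration with stubbed step names and a hypothetical `startSandbox` function, not the actual `@directus/sandbox` internals:

```typescript
// Illustrative sketch of the startup sequence described above; the real
// implementation lives in @directus/sandbox.
interface SandboxOptions {
	build: boolean;
	schema?: string;
}

async function startSandbox(opts: SandboxOptions): Promise<string[]> {
	const steps: string[] = [];

	// 1. Optionally rebuild the API from source.
	if (opts.build) steps.push('build api');

	// 2. Start (or reuse) the required Docker containers and wait until healthy.
	steps.push('start containers');

	// 3. Bootstrap the database tables if needed.
	steps.push('bootstrap database');

	// 4. Apply a schema snapshot when one is configured.
	if (opts.schema) steps.push(`apply snapshot ${opts.schema}`);

	// 5. Finally, start the API instance(s).
	steps.push('start api');

	return steps;
}

console.log(await startSandbox({ build: true, schema: 'snapshot.json' }));
```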