Developing

Basic system requirements

Installing other system requirements

Run the following command to check that you have installed the correct versions of the above requirements and install the remaining system requirements:

python tools/system-requirements.py

Environments behind proxy: Run python tools/system-requirements.py --attach-ca path/to/ca.crt instead, passing a copy of the PEM (base64 encoded) root CA.

This installs pnpm 8.x and Poetry 1.x into your default environment, and tries to add Python's Scripts/ directory to your PATH.

Configuration

Create a .env file in dotenv format by copying the sample file .env.sample to .env. Open the file in a text editor and modify the following variables:

CAPE_BASEURL=http://(external-capev2-host):(port)
OPENCTI_BASEURL=http://(external-opencti-host):(port)
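As a sketch, with clearly hypothetical host names and ports (not real endpoints), a filled-in pair of entries might look like:

```ini
# Hypothetical hosts and ports, for illustration only
CAPE_BASEURL=http://cape.lab.internal:8000
OPENCTI_BASEURL=http://opencti.lab.internal:8080
```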

If you have access to our staging server, you can use the staging URLs.

If your CAPEv2 or OpenCTI instance does not send a Content-Security-Policy header that permits embedding, the <iframe>s will not load correctly; in that case, you might need to run the local CSP proxy and configure the local CSP proxy's URLs here.

Authentication with GitHub Container Registry (GHCR)

If some images are pulled directly from GHCR, ensure that this step has been completed.
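A typical login follows GitHub's documented flow (USERNAME and the token variable are placeholders; the token needs the read:packages scope):

```shell
# Authenticate Docker against GHCR; never commit tokens to the repository
echo "$GHCR_TOKEN" | docker login ghcr.io -u USERNAME --password-stdin
```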

Pipeline configuration

Create a pipeline.yml configuration file by copying the sample file pipeline.yml.sample to pipeline.yml.

Installing dependencies

python tools/deps.py
# Optional but recommended: Install pre-commit hooks
pnpm exec husky install

When new dependencies are added or updated, remember to re-run tools/deps.py to update your local dependencies to the new versions.

Setting up the entire stack

The development setup starts the frontend, the pipeline, and numerous other services in Docker containers, which watch for file changes and restart the components automatically:

docker compose up -d --build

# For development behind proxy, configure CA_CERTIFICATE_PATH=./path/to/cert.crt in the .env file and run
docker compose -f dev-proxy.docker-compose.yml up -d --build
Avoid specifying -f with each command

To avoid specifying -f dev-proxy.docker-compose.yml with every docker compose command, add the following to your .env file:

COMPOSE_FILE=dev-proxy.docker-compose.yml

Docker Compose will automatically use this file, so you can simply run docker compose up -d --build and other commands without the -f flag.

Certificate extension

The CA certificate file must have a .crt extension (e.g., CA_CERTIFICATE_PATH=./certs/proxy-ca.crt). The update-ca-certificates command in Debian-based containers only processes .crt files, so using .pem or other extensions will cause Docker builds to fail.
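Since a PEM-encoded certificate is plain text, renaming (or copying) is enough; no conversion step is needed. A self-contained sketch with a dummy file and throwaway paths:

```shell
# Dummy PEM file standing in for a real CA certificate
mkdir -p /tmp/acube-certs
printf '%s\n' '-----BEGIN CERTIFICATE-----' 'MIID...' '-----END CERTIFICATE-----' \
  > /tmp/acube-certs/proxy-ca.pem
# Copying to a .crt name changes nothing about the content
cp /tmp/acube-certs/proxy-ca.pem /tmp/acube-certs/proxy-ca.crt
cmp -s /tmp/acube-certs/proxy-ca.pem /tmp/acube-certs/proxy-ca.crt && echo identical
```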

Issues during first-time setup

Elasticsearch will fail to start if vm.max_map_count is not set to at least 262144.

To make this setting persist across WSL reboots, add the following to your %USERPROFILE%\.wslconfig file (e.g., C:\Users\<YourUsername>\.wslconfig):

[wsl2]
kernelCommandLine = "sysctl.vm.max_map_count=262144"
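To check the value currently in effect inside the Linux VM (and, assuming root access, to apply it immediately without restarting WSL):

```shell
# Print the current limit; Elasticsearch needs at least 262144
cat /proc/sys/vm/max_map_count
# Apply immediately for the running kernel (requires root):
# sudo sysctl -w vm.max_map_count=262144
```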

Congratulations, you now have a development environment!

Workarounds for hot-reload in docker containers

Frontend: the environment variable WATCHPACK_POLLING must be set to true for WSL Docker users

Reference: Environment variable for hot reloading in React applications

Pipeline: nodemon has to be launched with the --legacy-watch flag

Reference: Environment variable for hot reloading in Nodemon

These workarounds fall back to polling for filesystem events and may consume significantly more resources. Disable them if you are facing hardware limitations.
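As a sketch (assuming the frontend container takes its environment from .env), the polling switch is a single variable:

```ini
# .env — force polling-based file watching for webpack under WSL Docker
WATCHPACK_POLLING=true
```

For the pipeline, the equivalent is launching nodemon with --legacy-watch (or its short form -L), which likewise falls back to polling.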

Starting components manually

You can opt to run the development environment outside of a container platform, or selectively start only the services your development or testing needs. This conserves a significant amount of RAM and CPU.

tip

To start only the frontend, run docker compose up -d --build frontend

Running tests and other tools

After making any changes, you should write tests and add type annotations to verify those changes.

Node.js

Ensure you are in the component directory (like frontend/ or pipeline/).

# Type-checks are useful for quick sanity checks
pnpm run type-check
# Tests ensure code is correct
pnpm run test
# Lints help enforce code quality, run them last
pnpm run lint
# Formatting before pushing is recommended; the formatter applies fixes automatically
pnpm run format

Python

Ensure you are in the component directory (like services/cape_to_csv/).

# Tests ensure code is correct
poetry run python -m pytest
# Type-checks are useful for quick sanity checks
poetry run mypy .
# Lints help enforce code quality, run them last
poetry run pflake8 .
# Formatting before pushing is recommended; the formatter applies fixes automatically
poetry run black .

frontend: Using Storybook

Storybook is an open source tool for building UI components and pages in isolation. It streamlines UI development, testing, and documentation.

Most importantly, we use Storybook to simplify building UI components: instead of running the full ACUBETotal stack, we can render and test components in isolation by giving them various inputs. Visit the Storybook documentation for more details.

Run storybook with:

pnpm run storybook

To write a story, create a [ComponentName].stories.tsx file in the same folder as the component. See FileUpload.stories.tsx for an example demonstrating the important features.

A good story will exercise as many variants of the component as possible.

frontend: Subsystem boundaries

We're using boundaries to enforce some subsystem boundaries in frontend. This prevents things like API clients from being imported in the UI. See .eslintrc.js for the exact rules.

frontend: Browser Compatibility

For older browser support, import core-js polyfills in the specific files where the feature is needed. For example:

// Example: polyfill Object.groupBy for older browsers
import 'core-js/actual/object/group-by'

frontend: Updating API Documentation

The ACUBETotal API uses next-swagger-doc to generate OpenAPI specs from @swagger JSDoc annotations in API route files (frontend/src/pages/api/).

Building and publishing

  1. Generate the OpenAPI spec (from the frontend/ directory):

    pnpm run build-docs

    This outputs the spec to frontend/public/swagger.json.

  2. Copy to the documentation repository:

    cp frontend/public/swagger.json ../documentation/docs/api/acubetotal.json
  3. Commit changes to both the ACUBETotal and documentation repositories.

Dependency management

Node.js

Ensure you are in the package root (and not the project root). Use pnpm to manage dependencies for a particular workspace:

cd COMPONENT
pnpm add [--save-dev] PACKAGE
pnpm update [PACKAGE]

Python

Ensure you are in the component directory (like services/cape_to_csv/). Use Poetry to manage dependencies for that component:

poetry add [--dev] PACKAGE
poetry update [PACKAGE]

Updating dependencies

When you pull in new code changes that include changes in dependencies, remember to run python tools/deps.py to update your local dependencies to the new versions.

Vendored dependencies

To avoid using a GitHub personal access token for now, we vendor some internal packages into vendor/. To update these packages, do the following:

# Grab the new versions, update vendor.txt if necessary
python tools/vendor.py
# Update to the new version in all components
cd COMPONENT
poetry add --dev ../../vendor/PEP427_FILENAME.whl
# Remove the older version if unused by all components
rm ../../vendor/PREVIOUS_PEP427_FILENAME.whl
# Commit the changes
cd ../..
git add vendor.txt vendor/ COMPONENT
git commit -m 'Bump PACKAGE to the latest version'

Once the version has been updated, remember to test and fix any changes in the API interface.

Database management

Our database schema is managed with Prisma. The schema is declared in prisma/schema.prisma.

When adding models or fields, you need to create the migration files and migrate your local instance with pnpm exec prisma migrate dev from the project root. Afterwards, you need to commit the migration files.

When fetching these new schema changes made by someone else, you might need to do the following in your project root:

# Apply the migrations to your local database, and regenerate the Prisma client
pnpm exec prisma migrate dev

Local CSP proxy

For security reasons, a site can only be embedded in an <iframe> if it allows it. This is controlled with a Content-Security-Policy header.

If your CAPEv2 or OpenCTI instance runs on an external host, it is likely that they will block embedding in <iframe>s. You can work around that by running the development CSP proxy and then configuring .env to use the CSP proxy as the CAPEv2 or OpenCTI host.

python tools/csp-proxy.py --bind http://localhost:(port) http://(external-host):(port)
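For example (the port here is a placeholder), after binding the proxy to localhost, point .env at the proxy instead of the external host:

```ini
# .env — route <iframe> traffic through the local CSP proxy (hypothetical port)
CAPE_BASEURL=http://localhost:9080
```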

Maintenance

Bumping system requirements' versions

To bump Node.js or Python versions, perform these steps:

  1. Create a new branch
  2. Update the supported versions list in tools/system-requirements.py
  3. Update the supported versions list in every GitHub Action. For code formatting jobs, use the latest (LTS if possible) version
  4. Update this document with the latest (LTS if possible) version
  5. Submit a Pull Request. Ensure all tests pass before merging

Bumping Postgres or Elasticsearch versions

Replace all occurrences of the version number:

sed -i \
-e 's,docker.io/library/postgres:.*,docker.io/library/postgres:15,g' \
-e 's,docker.elastic.co/elasticsearch/elasticsearch:.*,docker.elastic.co/elasticsearch/elasticsearch:7.16.2,g' \
docker-compose*.yml \
.github/workflows/*.yml
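The substitution can be sanity-checked on a scratch file first; this self-contained sketch (throwaway path) shows the Postgres replacement in isolation:

```shell
# Demonstrate the replacement on a throwaway compose fragment
tmp=/tmp/acube-sed-demo.yml
printf 'image: docker.io/library/postgres:14\n' > "$tmp"
sed -i -e 's,docker.io/library/postgres:.*,docker.io/library/postgres:15,g' "$tmp"
cat "$tmp"   # image: docker.io/library/postgres:15
```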

Tips

Cheatsheets

Removing all state

This will wipe the database and Elasticsearch state and apply migrations to the empty database:

docker compose down -v
pnpm exec prisma migrate dev

Make your feature branch up-to-date

You can make your feature branch up-to-date with the main branch in two main ways:

Rebase

Recommended when your feature branch contains few changes relative to the main branch, especially overlapping ones. Do not rebase after doing a merge.

git fetch origin main
git rebase origin/main
# Now, resolve all conflicts for each commit you previously made and then
git rebase --continue
# until you have resolved them all

Merge

Recommended otherwise, especially for feature branches that have significantly diverged from the main branch.

git fetch origin main
git merge origin/main
# Now, resolve all conflicts and then
git commit -a

Git cleaning

This removes all untracked files and directories, including those ignored via .gitignore. This is useful to get a fresh working directory, close to a fresh clone.

git clean -xfd

If you want to remove only ignored files while keeping other untracked files:

git clean -Xfd
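The difference between -x and -X can be seen in a scratch repository (throwaway path; the git identity is set inline just for the demo commit):

```shell
# Throwaway repository just for the demonstration
demo=/tmp/acube-clean-demo
rm -rf "$demo" && mkdir -p "$demo" && cd "$demo"
git init -q .
echo 'ignored.log' > .gitignore
git add .gitignore
git -c user.name=demo -c user.email=demo@example.com commit -qm init
touch ignored.log untracked.txt
git clean -Xfd   # -X: removes only the ignored file
ls               # untracked.txt remains; ignored.log is gone
git clean -xfd   # -x: removes untracked.txt as well
```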