ACSP REST API
This is the Python implementation of the REST API for the Autoliv Connected Services Platform.
Source Code
The src folder contains the source code for the Azure Functions app. These are
the main points of interest in this folder:
- app: The shared code, which also contains the different layers of the app. The name of this folder should be changed to the name of the project.
- app.blueprint: The folders that contain the Azure Functions. Each resource has its own module with its blueprint, as sketched below.
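A minimal sketch of what such a resource module could look like, using the Azure Functions Python V2 programming model (the vehicles resource name is a made-up example, not necessarily one of the project's resources):

```python
import azure.functions as func

# Hypothetical resource module, e.g. app/blueprint/vehicles.py.
bp = func.Blueprint()


@bp.route(route="vehicles")
def list_vehicles(req: func.HttpRequest) -> func.HttpResponse:
    """Handle requests to /api/vehicles for this resource."""
    return func.HttpResponse('{"vehicles": []}', mimetype="application/json")
```

The Functions host entry point then registers each blueprint, e.g. with app.register_functions(bp).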
Documentation
The docs folder contains the sources for building the documentation. Currently the
documentation supports handwritten pages and inline OAS (OpenAPI Specification)
documentation. The inline OAS documentation has a base template at docs/oas/base.json.
The full OAS file is assembled from the schemas in app/lib/schema.py and the paths
from the @oas annotated comments in the app.blueprint.* files.
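The exact shape of an @oas comment is defined by this project's docs pipeline; purely as a hypothetical illustration of the idea, a path definition could live next to its function like this:

```python
# @oas (hypothetical format; see the real app.blueprint.* files for the actual one)
# /vehicles:
#   get:
#     summary: List vehicles
#     responses:
#       "200":
#         description: A list of vehicles
```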
Building the docs requires the Python and npm packages listed in docs/requirements.txt
and docs/npm_requirements.txt. The generate_docs.sh script runs the entire pipeline
and puts the docs under the site folder.
Container Services
The docker-compose.yaml file provides services related to the app. It has the
following services:
- app: The Azure Functions application
- mssql: Microsoft SQL Server instance that the app utilizes
- azurite: Azure Storage emulator used by the Azure Functions
- auth: Local authentication service for testing
- auth-*: Other services that the auth service requires to run
- docs: Utility service for building the documentation
app
The app service deploys the Functions app as a container. The app container
copies the src folder during build, so it must be rebuilt whenever changes are
made.
mssql
The mssql service creates the database and populates it with example data on build.
The service also uses a volume so that the data persists across subsequent runs.
azurite
The Python V2 model of Azure Functions requires a linked Azure Storage account.
This is a storage emulator linked to the app Functions app. It is also the
storage used for storing blobs. This service automatically creates the default
container for data storage.
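For development it can be handy to poke at the emulated blob storage directly. A minimal sketch using azure-storage-blob with Azurite's documented default development credentials (the container name "data" is an assumption; use whatever default container the service actually creates):

```python
from azure.storage.blob import BlobServiceClient

# Azurite's well-known default account and key, assuming the default local ports.
CONN_STR = (
    "DefaultEndpointsProtocol=http;"
    "AccountName=devstoreaccount1;"
    "AccountKey=Eby8vdM02xNOcqFlqUwJPLlmEtlCDXJ1OUzFT50uSRZ6IFsuFq2UVErCz4I6tq/K1SZFPTOtr/KBHBeksoGMGw==;"
    "BlobEndpoint=http://127.0.0.1:10000/devstoreaccount1;"
)

client = BlobServiceClient.from_connection_string(CONN_STR)
container = client.get_container_client("data")  # assumed container name
for blob in container.list_blobs():
    print(blob.name)
```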
auth and auth-*
These are services used to run an authentication service locally. The authentication mechanism is OAuth/OpenID Connect.
Access Token Exchange
The recommended way of getting a token is by using the OpenID Flow with PKCE. More details about the flow can be read in PKCE Flow.
As a starting point, visit the URL below to sign in. The URL has a predefined
code_challenge, which is used as part of the PKCE. The code_challenge should
ideally be changed on every sign-in to make sure the token is issued to the
party that made the request. To generate your own, visit the PKCE Generator,
or use the Python sketch after the curl example below.
http://localhost:9011/oauth2/authorize?client_id=656bc474-8d3f-4cc7-abb2-db0512be58ac&scope=openid%20offline_access&response_type=code&redirect_uri=http%3A%2F%2Flocalhost%2Foauth-redirect.html&code_challenge=Vn00vbxRzVXXQ2zOAlBPsP41Rh9YFhXn-ExVrEoOWPM&code_challenge_method=S256
The URL will bring you to a login screen. After entering the correct credentials,
the auth server will redirect you to the URL below. The important value here is
the code value, which is exchanged for an access_token.
http://localhost/oauth-redirect.html?code=<code>&locale=en&userState=Authenticated
Implementing a server at this redirect URL is optional. The fastest way to get
the token is to run the curl command below to exchange the code for an
access_token. Note that the code_verifier must be the one used to generate
the code_challenge.
curl --request POST \
--url http://localhost:9011/oauth2/token \
--header 'Content-Type: application/x-www-form-urlencoded' \
--data grant_type=authorization_code \
--data redirect_uri=http://localhost/oauth-redirect.html \
--data client_id=656bc474-8d3f-4cc7-abb2-db0512be58ac \
--data code=uUcShbYZhGEi8U8MoXQzj9V85dUYLbtJKqbWkI5xqFo \
--data 'code_verifier=PXBxDNnZg2uGOfDjXE4L8gjKaGUbZ~65p3TihCG4~DKMGGSG8Ny~HZ_jgm0T3NIkeojnwXtzMCGR--UCBEpbrpau2aPUDkUkaQaOEIBPFvFobS1UENz2ti9hzNyaxRmg'
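To generate your own code_verifier/code_challenge pair instead of the predefined one, a minimal Python sketch using only the standard library (this follows the S256 method from RFC 7636):

```python
import base64
import hashlib
import secrets

# A high-entropy, URL-safe code_verifier (RFC 7636 allows 43-128 characters).
code_verifier = secrets.token_urlsafe(64)

# code_challenge = BASE64URL(SHA256(code_verifier)), without '=' padding.
digest = hashlib.sha256(code_verifier.encode("ascii")).digest()
code_challenge = base64.urlsafe_b64encode(digest).rstrip(b"=").decode("ascii")

print("code_verifier: ", code_verifier)
print("code_challenge:", code_challenge)
```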
Using the Password Grant
If there is no need to test the authentication flow, it can be bypassed by
using the password grant, which is enabled on the local FusionAuth service.
Running the command below, assuming the default configuration, will immediately
return the id_token and refresh_token.
curl --request POST \
--url http://localhost:9011/oauth2/token \
--header 'Content-Type: application/x-www-form-urlencoded' \
--data grant_type=password \
--data client_id=656bc474-8d3f-4cc7-abb2-db0512be58ac \
--data client_secret=Gi70H65aWCYm2fkQjzwPVTFILvFuMi3vrz9-kX4w0v4 \
--data username=user@auth.com \
--data password=Autoliv1@Bananas \
--data 'scope=openid offline_access'
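To inspect the claims in the returned id_token, it can be decoded without verifying the signature. A small sketch using PyJWT (the choice of library is an assumption; this is for local inspection only, never for authentication decisions):

```python
import jwt  # PyJWT

id_token = "<id_token from the response above>"

# Skipping signature verification is fine for inspecting a token locally,
# but must never be done when actually authenticating a request.
claims = jwt.decode(id_token, options={"verify_signature": False})
print(claims.get("sub"), claims.get("email"))
```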
Existing Accounts and Credentials
The auth server has two accounts by default: an admin account (admin@auth.com)
and a user account (user@auth.com). They have the same password, defined by
the OAUTH_ACCOUNT_PASSWORD environment variable.
Other secrets and values validated by the server are also set using environment
variables. Please check the docker-compose.yaml file for more details.
docs
The docs service is a container with the environment for building the documentation.
Running docker compose run docs builds the docs into the site folder.
Development
Dependencies
Development dependencies are listed in requirements-dev.txt. These are a set of
tools that help ensure the existing code is of good quality. To install them,
run the command below.
pip install -r requirements-dev.txt
Git repository hooks are managed by Pre-commit. Its documentation contains installation instructions; aside from the documented methods, it is also available in several Linux package repositories.
Type Hints
The code in the src folder is type hinted following the strict settings from
pyright. # type: ignore comments are reserved for third-party library calls
that do not provide enough typing information.
Database interaction with DBeaver
DBeaver is a cross-platform database tool. You can use it to visualize DB schemas and run SQL queries. To get started with DBeaver for use with the ACSP backend, proceed as follows.
- Install DBeaver.
  - On Windows, install from source or use a package manager ($ choco install dbeaver).
  - If you are using WSL, the tested way is to install it on Windows.
- Set up the connection.
  - Launch DBeaver, click "New Database Connection", and select the correct database type (Microsoft SQL Server).
  - Enter the appropriate database name along with your credentials (username and password).
  - Click Test Connection. Download missing driver files if you are prompted to. Note: this requires the database instance to be running.
  - Click OK to connect.
- Explore.
  - Schemas can be explored in the navigator menu.
  - To run a SQL query, open the SQL Editor menu in the menu bar and select Execute SQL query.
Liquibase
We use Liquibase to keep track of all SQL updates and database schema versioning.
- Install Liquibase locally.
  - If using WSL, you can install it on Linux.
- Ensure that liquibase is in your PATH variable and that the database instance is running.
- Run the following command to populate the database: $ liquibase update --labels="dev"
Connecting to other SQL instances (Staging/Prod)
By default, Liquibase connects to your local SQL docker instance. To run Liquibase against other databases, you need to create a separate config file with the connection settings for that specific database. Refer to the liquibase.properties file to see the required parameters. You can run Liquibase with a custom .properties file using the following command:
$ liquibase update --defaults-file=path/to/your/custom-file.properties
Pre-commit
The repository contains a pre-commit configuration that can be installed to format and verify the code before making a commit. To install it, run the command below.
pre-commit install
By default, the pre-commit configuration also enables pre-push hooks that run the tests. If your workflow relies on frequent pushes, there is an option to enable only the pre-commit hooks.
pre-commit install -t pre-commit
To run the formatting and verification manually, the command below can be used.
pre-commit run -a
To run only on specific files, the names of the files can be provided.
pre-commit run --files file1.py file2.py package/*
Testing
Test Dependencies
The testing code does some database access using pyodbc, so the development host needs an ODBC driver for MS SQL installed. See the installation instructions for how to install it on your distro.
Make sure that src/local.settings.json contains the correct connection string
for the database.
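To sanity-check that connection string outside the app, a minimal pyodbc sketch (the setting key "SqlConnectionString" is an assumption; use whatever key src/local.settings.json actually defines):

```python
import json

import pyodbc

with open("src/local.settings.json") as f:
    settings = json.load(f)

# The key name below is hypothetical; match it to your local.settings.json.
conn_str = settings["Values"]["SqlConnectionString"]

with pyodbc.connect(conn_str) as conn:
    cursor = conn.cursor()
    cursor.execute("SELECT @@VERSION")
    print(cursor.fetchone()[0])
```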
Unit Tests
The unit tests reside in tests/unit/, and the structure of the folders should
mirror where the units are placed in src/app.
Some unit tests are not strictly unit tests, such as the database and Azure Functions tests, since they test calls that result in a call chain going much further than just the function under test. For business logic where it is possible, though, proper unit testing is encouraged so that small units can be tested in isolation.
Unit tests use transactions in order to preserve the database content. Each test function has its own transaction that is rolled back after the test.
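A minimal sketch of how such a per-test transaction could be set up with a pytest fixture (hypothetical, not necessarily the project's actual fixture code):

```python
import pyodbc
import pytest

# Hypothetical connection string; the real tests would read it from
# src/local.settings.json instead of hardcoding it.
CONN_STR = "DRIVER={ODBC Driver 18 for SQL Server};SERVER=localhost;UID=sa;PWD=<password>"


@pytest.fixture
def db_connection():
    # autocommit=False means every statement runs inside an open transaction.
    conn = pyodbc.connect(CONN_STR, autocommit=False)
    try:
        yield conn
    finally:
        # Undo whatever the test changed so the database content is preserved.
        conn.rollback()
        conn.close()
```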
Integration Tests
Integration tests are run against the containerized version of the app. They test the endpoints as a black box, as well as the app's integrations with other systems such as the auth server and the storage server.
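Conceptually, an integration test drives the running containers over HTTP. A hypothetical sketch using requests (the port and resource path are assumptions, not the project's actual endpoints):

```python
import requests

BASE_URL = "http://localhost:7071/api"  # assumed Functions port in docker-compose


def test_requires_authentication():
    # A request without a bearer token should be rejected by the app.
    response = requests.get(f"{BASE_URL}/vehicles")  # hypothetical resource
    assert response.status_code == 401
```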
Running the tests
There's a helper script called run_tests.sh that is used to run the tests. By
passing different options you can choose to run either the unit tests, the
integration tests, or both.
If you're only running the unit tests, there's also the possibility of using a manually started mssql container, which reduces the time it takes to run the tests.
Use run_tests.sh -h to see a usage description.