mySkillbox

Sources

Development

Server

  • Install pipenv
  • Create the virtualenv and install dependencies: pipenv --python 3.8 install --dev
  • Create PostgreSQL database & user
  • Create a .env file in server with SECRET_KEY and DATABASE_URL, or copy the existing .env.example file (see the example below)
  • Migrate databases: pipenv run python manage.py migrate
  • Create super user: pipenv run python manage.py createsuperuser
  • Run: pipenv run python manage.py runserver
  • Dummy data: pipenv run python manage.py dummy_data recreates the database with test data and a new superuser (test/test); restart the development server afterwards
  • Install the Heroku CLI, then run heroku login and heroku git:remote -a skillbox-hep
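
A minimal server/.env might look like the following sketch. The values are placeholders; the variable names actually read by the Django settings are the ones listed in .env.example:

SECRET_KEY=<any long random string>
DATABASE_URL=postgres://skillbox:skillbox@localhost:5432/skillbox

The DATABASE_URL above assumes the user, password and database name of the dockerized DB described below; adjust it to match your local PostgreSQL setup.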

Dockerize DB

  • To dockerize the DB, after installing docker, run the following command:
docker run --name skillboxdb -d -p 5432:5432 -e POSTGRES_PASSWORD=skillbox -e POSTGRES_USER=skillbox -e POSTGRES_DB=skillbox postgres
  • After a reboot, start the container again with docker start skillboxdb
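
To verify that the container is up and accepting connections, you can open a psql shell inside it (optional sanity check):

docker exec -it skillboxdb psql -U skillbox skillbox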

Notes

Commands

Create a new teacher demo account on prod
heroku login # if not already logged in
heroku run --remote heroku python server/manage.py create_teacher <firstname> <lastname> <email>

Tip: create an alias in ~/.bash_aliases:

alias create_teacher="heroku run --remote heroku python server/manage.py create_teacher"

Then you can just run in terminal:

create_teacher <firstname> <lastname> <email>

Import a CSV file

To import a CSV file locally, run:

python manage.py import_users <csv-file>

To import a CSV file on prod, first upload the CSV file to some public S3 bucket (or make it publicly available some other way), then run:

heroku login # if not already logged in
heroku run --remote heroku python server/manage.py import_users --s3 <csv-url>
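
For example (the bucket and file name are placeholders for whatever public URL you uploaded the CSV to):

heroku run --remote heroku python server/manage.py import_users --s3 https://example-bucket.s3.eu-west-1.amazonaws.com/users.csv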

Client

# install dependencies
npm install --prefix client

# serve with hot reload at localhost:3000
npm run dev --prefix client

# build
npm run build --prefix client

# build for production with minification
export NODE_ENV=production && npm install --prefix client && npm run build --prefix client

After running npm run dev, log in to the Django admin view on the same domain the webpack dev server is running on (example: the client runs on localhost:8080 and Django on localhost:8000). This way you will have a session and a CSRF cookie set, and the Apollo client will be able to make requests.

Production Build

Docker

Update the Docker image

  • Edit Dockerfile
  • docker build --memory=1g --memory-swap=1g -t iterativ/skillbox-test:<tag> -f Dockerfile .
  • docker push iterativ/skillbox-test:<tag>
  • Update the bitbucket-pipelines.yml to use the new tag
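
For example, with a hypothetical tag 2024-05 (any unique tag name works), the build and push steps become:

docker build --memory=1g --memory-swap=1g -t iterativ/skillbox-test:2024-05 -f Dockerfile .
docker push iterativ/skillbox-test:2024-05

and the image reference in bitbucket-pipelines.yml (typically the image: key) is then changed to iterativ/skillbox-test:2024-05.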

Urls

(trailing slashes are required)

Heroku

heroku run python server/manage.py <command> --app <appname>
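
For example, to run the Django migrations on the skillbox-hep app (substitute the instance you are targeting):

heroku run python server/manage.py migrate --app skillbox-hep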

Rollback

After doing a rollback under https://data.heroku.com/, check the database status with:

heroku pg:info --app=<appname>

Change DATABASE_URL (e.g. after a rollback)

heroku pg:promote HEROKU_POSTGRESQL_PINK
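
heroku pg:info should list each attached database together with its HEROKU_POSTGRESQL_<COLOR> attachment name; that attachment name (PINK in the example above) is what you pass to pg:promote. Outside a directory with the Heroku git remote configured, add the app explicitly:

heroku pg:promote HEROKU_POSTGRESQL_PINK --app=<appname>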

Backup

See Docs

AWS

  1. Create a user with API access, add the access key ID and secret access key to .env, and set USE_AWS=True (see the sketch after this list)
  2. Create an S3 bucket in the EU (Ireland) region.
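
A sketch of the corresponding .env entries; the exact variable names are an assumption and should be checked against the server settings / .env.example:

USE_AWS=True
AWS_ACCESS_KEY_ID=<access key id>
AWS_SECRET_ACCESS_KEY=<secret access key>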

Change bucket Permissions / Bucket Policy

{
    "Version": "2008-10-17",
    "Statement": [
        {
            "Sid": "PublicReadForGetBucketObjects",
            "Effect": "Allow",
            "Principal": {
                "AWS": "*"
            },
            "Action": "s3:GetObject",
            "Resource": "arn:aws:s3:::<bucket-name>/*"
        },
        {
            "Sid": "",
            "Effect": "Allow",
            "Principal": {
                "AWS": "<user arn>"
            },
            "Action": "s3:*",
            "Resource": [
                "arn:aws:s3:::<bucket-name>/*",
                "arn:aws:s3:::<bucket-name>"
            ]
        }
    ]
}
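
The policy can be pasted into the S3 console, or applied with the AWS CLI (assuming the JSON above is saved as policy.json and your CLI credentials may administer the bucket):

aws s3api put-bucket-policy --bucket <bucket-name> --policy file://policy.json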

Release

Pushing to Production

Whenever you want to do a release, just merge develop into master and push to origin. The current master will always be deployed in the evening, if there are new commits that are not yet on Heroku.

Tagging

Update:

Please use one of the provided scripts in bin

./bin/tag-release.sh

or

./bin/tag-hotfix.sh

and follow the instructions.

Note: Please make sure you're on the master branch first.

Manual Way (deprecated)

Whenever you do a new release, please tag the current commit (after merging) with the current date:

git tag -a v2019-09-10

git tag -a creates an annotated tag, which must have a release message (like a commit message). For now, just repeat the current date, like:

Release to production on 2019-09-10
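
You can also pass the message directly with -m, for example:

git tag -a v2019-09-10 -m "Release to production on 2019-09-10"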

You should then push the tag to the repo:

git push origin v2019-09-10

You can later see the metadata of the tag with

git show v2019-09-10

or

git tag -ln

NB: If there are two releases on the same day, use the pattern v2020-03-01.x, e.g. v2020-03-01.1, for the subsequent releases

Hotfixes

Please use the pattern v2020-03-01.hotfix for hotfix releases

Testing

Mocking GraphQL calls

We use cypress-graphql-mock for mocking GraphQL calls in Cypress tests.

For an example, please see spellcheck.spec.js.

There is a schema.json in the fixtures folder. For now it has been generated once, and if there is a significant update to the schema on the server, it has to be regenerated.

To generate a new schema, use the management command

python manage.py export_schema_for_cypress
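
As with the other local management commands, this is typically run through pipenv from the server directory; the regenerated schema.json then replaces the one in the Cypress fixtures folder (check the exact path in the client):

pipenv run python manage.py export_schema_for_cypress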

GraphQL

Generate GraphQL SDL Document

For Linux:

cd server
./graphql-schema.sh && npm run codegen --prefix ../client

For macOS (there is a problem with the sed command):

cd server
./macos-graphql-schema.sh && npm run codegen --prefix ../client

Backup to S3

From https://pawelurbanek.com/heroku-postgresql-s3-backup

Initial setup per Heroku Dyno

heroku buildpacks:add heroku-community/awscli
heroku config:set BACKUP_AWS_ACCESS_KEY_ID=[Your AWS Access Key ID]
heroku config:set BACKUP_AWS_SECRET_ACCESS_KEY=[Your AWS Secret Access Key]
heroku config:set BACKUP_S3_BUCKET_NAME=[Your S3 bucket name]
heroku authorizations:create => TOKEN
heroku config:set HEROKU_API_KEY=[TOKEN]
heroku buildpacks:add heroku-community/cli
heroku config:set APP_NAME=app-name
heroku config:set PG_BACKUP_KEY=$(openssl rand -base64 32)
heroku addons:create scheduler:standard
heroku addons:open scheduler

Then schedule the backup for daily at 3:00 UTC, for example

Command:

./bin/pg-backup-to-s3

Note on components

Our own components remain in kebab-case; imported components from third-party libraries are used in PascalCase. E.g. <password-change-form/> vs. <ValidationProvider/>