# mySkillbox
## Development
### Server
- Install pipenv
- Create a virtualenv and install the dependencies: `pipenv --python 3.8 install --dev`
- Create the PostgreSQL database & user
- Create a `.env` file in `server` with `SECRET_KEY` and `DATABASE_URL` (or copy the existing `.env.example` file)
- Migrate the databases: `pipenv run python manage.py migrate`
- Create a superuser: `pipenv run python manage.py createsuperuser`
- Run the server: `pipenv run python manage.py runserver`
- Load dummy data: `pipenv run python manage.py dummy_data` (restart the development server afterwards). This recreates the db with test data and a new superuser: test/test
- Install the Heroku CLI, then run `heroku login` and `heroku git:remote -a skillbox-hep`
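The `.env` file mentioned above is a plain `KEY=VALUE` list. As a quick illustration (real setups usually load it with a library such as python-dotenv; the values below are placeholders, and the `DATABASE_URL` shown is an assumed local default), a minimal parser sketch:

```python
# Minimal .env parser sketch -- illustration only; production code
# should use a maintained loader such as python-dotenv.
def parse_env(text: str) -> dict:
    env = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue  # skip blanks and comments
        key, _, value = line.partition("=")
        env[key.strip()] = value.strip()
    return env

# Example contents; SECRET_KEY and DATABASE_URL values are placeholders.
example = """
SECRET_KEY=change-me
DATABASE_URL=postgres://skillbox:skillbox@localhost:5432/skillbox
"""
config = parse_env(example)
```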
#### Dockerize DB

To run the DB in Docker, after installing Docker, run:

```shell
docker run --name skillboxdb -d -p 5432:5432 -e POSTGRES_PASSWORD=skillbox -e POSTGRES_USER=skillbox -e POSTGRES_DB=skillbox postgres
```

After a reboot, start the container again with:

```shell
docker start skillboxdb
```
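The environment flags above determine the `DATABASE_URL` the server needs. A small sketch of how the pieces map, assuming the default `5432:5432` port mapping (the URL itself is an assumption, adjust to your setup):

```python
from urllib.parse import urlsplit

# Assumed local connection URL implied by the docker run flags above.
db_url = "postgres://skillbox:skillbox@localhost:5432/skillbox"

parts = urlsplit(db_url)
print(parts.username)              # POSTGRES_USER
print(parts.password)              # POSTGRES_PASSWORD
print(parts.hostname, parts.port)  # host and the -p port mapping
print(parts.path.lstrip("/"))      # POSTGRES_DB
```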
#### Notes

`DEBUG=True` enables the debug middleware, see http://docs.graphene-python.org/projects/django/en/latest/debug/
#### Commands

##### Create a new teacher demo account on prod

```shell
heroku login  # if not already logged in
heroku run --remote heroku python server/manage.py create_teacher <firstname> <lastname> <email>
```

Tip: create an alias in `~/.bash_aliases`:

```shell
alias create_teacher="heroku run --remote heroku python server/manage.py create_teacher"
```

Then you can simply run:

```shell
create_teacher <firstname> <lastname> <email>
```
##### Import a CSV file

To import a CSV file locally, run:

```shell
python manage.py import_users <csv-file>
```

To import a CSV file on prod, first upload the CSV file to a public S3 bucket (or make it publicly available some other way), then run:

```shell
heroku login  # if not already logged in
heroku run --remote heroku python server/manage.py import_users --s3 <csv-url>
```
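The exact columns `import_users` expects are defined in the management command itself; purely as an illustration, assuming a minimal `first_name,last_name,email` layout, reading such a file with the stdlib looks like:

```python
import csv
import io

# Hypothetical layout -- the real columns are whatever the
# import_users management command expects.
sample = io.StringIO(
    "first_name,last_name,email\n"
    "Ada,Lovelace,ada@example.com\n"
    "Alan,Turing,alan@example.com\n"
)

rows = list(csv.DictReader(sample))
emails = [row["email"] for row in rows]
```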
### Client

```shell
# install dependencies
npm install --prefix client

# serve with hot reload at localhost:3000
npm run dev --prefix client

# build
npm run build --prefix client

# build for production with minification
export NODE_ENV=production && npm install --prefix client && npm run build --prefix client
```

After running `npm run dev`, log in to the Django admin view on the same domain as the one the webpack dev server is running on.
Example: the client runs on localhost:8080 and Django on localhost:8000. This way you will have a session and a CSRF cookie set, and the Apollo client will be able to make requests.
## Production Build

### Docker

To update the docker image:

- Edit `Dockerfile`
- Build it: `docker build --memory=1g --memory-swap=1g -t iterativ/skillbox-test:<tag> -f Dockerfile .`
- Push it: `docker push iterativ/skillbox-test:<tag>`
- Update `bitbucket-pipelines.yml` to use the new tag
## Urls

(trailing slashes are required)

- Admin interface: http://127.0.0.1:8000/guru/
- CMS interface: http://127.0.0.1:8000/cms/
- GraphQL interface: http://localhost:8000/api/graphiql/
## Heroku

```shell
heroku run python server/manage.py <command> --app <appname>
```

### Rollback

After doing a rollback under https://data.heroku.com/, check the database info:

```shell
heroku pg:info --app=<appname>
```

To change the `DATABASE_URL` (e.g. after a rollback):

```shell
heroku pg:promote HEROKU_POSTGRESQL_PINK
```

### Backup

See Docs
## AWS

- Create a user with API access, add the `access key id` and `access key` to `.env`, and set `USE_AWS=True`
- Create an S3 bucket in location EU Ireland
- Change the bucket Permissions / Bucket Policy:

```json
{
    "Version": "2008-10-17",
    "Statement": [
        {
            "Sid": "PublicReadForGetBucketObjects",
            "Effect": "Allow",
            "Principal": {
                "AWS": "*"
            },
            "Action": "s3:GetObject",
            "Resource": "arn:aws:s3:::<bucket-name>/*"
        },
        {
            "Sid": "",
            "Effect": "Allow",
            "Principal": {
                "AWS": "<user arn>"
            },
            "Action": "s3:*",
            "Resource": [
                "arn:aws:s3:::<bucket-name>/*",
                "arn:aws:s3:::<bucket-name>"
            ]
        }
    ]
}
```
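Before pasting a concrete policy into the bucket settings, it can help to sanity-check the JSON. A small sketch with the stdlib, using placeholder values for the bucket name and user ARN:

```python
import json

# Sketch: validate a filled-in bucket policy before uploading.
# "my-bucket" and the user ARN below are placeholder values.
policy_text = """
{
  "Version": "2008-10-17",
  "Statement": [
    {"Sid": "PublicReadForGetBucketObjects",
     "Effect": "Allow",
     "Principal": {"AWS": "*"},
     "Action": "s3:GetObject",
     "Resource": "arn:aws:s3:::my-bucket/*"},
    {"Sid": "",
     "Effect": "Allow",
     "Principal": {"AWS": "arn:aws:iam::123456789012:user/skillbox"},
     "Action": "s3:*",
     "Resource": ["arn:aws:s3:::my-bucket/*", "arn:aws:s3:::my-bucket"]}
  ]
}
"""

policy = json.loads(policy_text)  # raises ValueError on malformed JSON
public_read = policy["Statement"][0]
```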
## Release

### Pushing to Production

Whenever you want to do a release, just merge develop into master and push to origin. The current master will always be deployed in the evening if there are new commits that are not yet on Heroku.
### Tagging

Update: please use one of the provided scripts in `bin`:

```shell
./bin/tag-release.sh
```

or

```shell
./bin/tag-hotfix.sh
```

and follow the instructions.

Note: please make sure you're on the master branch first.
### Manual Way (deprecated)

Whenever you do a new release, please tag the current commit after merging with the current date:

```shell
git tag -a v2019-09-10
```

`git tag -a` creates an annotated tag, which must have a release message (like a commit message). For now, just repeat the current date, like:

    Release to production on 2019-09-10

You can and should then push the tag to the repo:

```shell
git push origin v2019-09-10
```

You can later see the metadata of the tag with:

```shell
git show v2019-09-10
```

or

```shell
git tag -ln
```
NB: If there are two releases on the same day, use the pattern v2020-03-01.x, e.g. v2020-03-01.1, for the subsequent releases.

### Hotfixes

Please use the pattern v2020-03-01.hotfix for hotfix releases.
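The date-based tag pattern described above can be sketched as a small helper (illustrative only; the scripts in `bin` are the authoritative way to create tags):

```python
import re

def next_release_tag(date: str, existing: list) -> str:
    """Return the next tag for `date` (YYYY-MM-DD) given existing tags.

    Follows the pattern described above: v<date> for the first release
    of the day, then v<date>.1, v<date>.2, ... for subsequent ones.
    Illustrative sketch only.
    """
    base = f"v{date}"
    if base not in existing:
        return base
    suffixes = [
        int(m.group(1))
        for tag in existing
        if (m := re.fullmatch(re.escape(base) + r"\.(\d+)", tag))
    ]
    return f"{base}.{max(suffixes, default=0) + 1}"
```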
## Testing

### Mocking GraphQL calls

We use cypress-graphql-mock for mocking GraphQL calls in Cypress tests.
For an example, please see spellcheck.spec.js.

There is a schema.json in the fixtures folder. For now it has been generated once; if there is a significant update to the schema on the server, it has to be regenerated.

To generate a new schema, use the management command:

```shell
python manage.py export_schema_for_cypress
```
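For a quick check that a regenerated schema.json is well-formed, you can load it with the stdlib. This sketch assumes the file holds a standard GraphQL introspection result (the example below is a heavily trimmed synthetic stand-in, not the real file):

```python
import json

# Synthetic, trimmed stand-in for the fixtures schema.json -- a real
# introspection result is much larger.
schema_text = json.dumps({
    "__schema": {
        "queryType": {"name": "Query"},
        "types": [{"kind": "OBJECT", "name": "Query"}],
    }
})

schema = json.loads(schema_text)
query_type = schema["__schema"]["queryType"]["name"]
```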
## GraphQL

### Generate GraphQL SDL Document

For Linux:

```shell
cd server
./graphql-schema.sh && npm run codegen --prefix ../client
```

For macOS (there is a problem with the sed command):

```shell
cd server
./macos-graphql-schema.sh && npm run codegen --prefix ../client
```
## Backup to S3

From https://pawelurbanek.com/heroku-postgresql-s3-backup

Initial setup per Heroku dyno:

```shell
heroku buildpacks:add heroku-community/awscli
heroku config:set BACKUP_AWS_ACCESS_KEY_ID=[Your AWS Access Key ID]
heroku config:set BACKUP_AWS_SECRET_ACCESS_KEY=[Your AWS Secret Access Key]
heroku config:set BACKUP_S3_BUCKET_NAME=[Your S3 bucket name]
heroku authorizations:create   # => TOKEN
heroku config:set HEROKU_API_KEY=[TOKEN]
heroku buildpacks:add heroku-community/cli
heroku config:set APP_NAME=app-name
heroku config:set PG_BACKUP_KEY=$(openssl rand -base64 32)
heroku addons:create scheduler:standard
heroku addons:open scheduler
```

Then schedule the backup daily at 3:00 UTC, for example, with the command:

```shell
./bin/pg-backup-to-s3
```
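The `PG_BACKUP_KEY` above is just 32 random bytes, base64-encoded; the same kind of value can be produced from Python if `openssl` is not at hand:

```python
import base64
import secrets

# Equivalent of `openssl rand -base64 32`: 32 random bytes, base64-encoded.
key = base64.b64encode(secrets.token_bytes(32)).decode("ascii")
```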
## Note on components

Our own components remain in kebab-case; components imported from third-party libraries are used in PascalCase.
E.g. `<password-change-form/>` vs. `<ValidationProvider/>`
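The relationship between the two spellings can be sketched with a small (purely illustrative) conversion helper:

```python
import re

def pascal_to_kebab(name: str) -> str:
    """Convert a PascalCase component name to kebab-case (illustrative)."""
    return re.sub(r"(?<!^)(?=[A-Z])", "-", name).lower()
```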