# mySkillbox

Sources:

- [Django](https://docs.djangoproject.com/en/2.0/)
- [Wagtail](http://docs.wagtail.io/)
- [Graphene](https://github.com/graphql-python/graphene-django)
- [Vue-Apollo](https://github.com/Akryum/vue-apollo)

## Development

### Server

- [Install](https://docs.pipenv.org/#install-pipenv-today) pipenv
- Create a virtualenv and install dependencies: `pipenv --python 3.8 install --dev`
- Create a PostgreSQL database & user
- Create a `.env` file in `server` with `SECRET_KEY` and `DATABASE_URL` (or copy the existing `.env.example` file)
- Migrate the database: `pipenv run python manage.py migrate`
- Create a superuser: `pipenv run python manage.py createsuperuser`
- Run: `pipenv run python manage.py runserver`
- Dummy data: `pipenv run python manage.py dummy_data` (restart the development server afterwards). Recreates the db with test data and a new superuser: test/test
- [Install](https://devcenter.heroku.com/articles/heroku-cli#download-and-install) the Heroku CLI, then run `heroku login` and `heroku git:remote -a skillbox-hep`

#### Dockerize DB

To dockerize the DB, after installing Docker, run:

```
docker run --name skillboxdb -d -p 5432:5432 -e POSTGRES_PASSWORD=skillbox -e POSTGRES_USER=skillbox -e POSTGRES_DB=skillbox postgres
```

After a reboot, start the container again with `docker start skillboxdb`.

#### Notes

- `DEBUG=True` enables the debug middleware: http://docs.graphene-python.org/projects/django/en/latest/debug/

#### Commands

##### Create a new teacher demo account on prod

```
heroku login  # if not already logged in
heroku run --remote heroku python server/manage.py create_teacher
```

Tip: create an alias in `~/.bash_aliases`:

```
alias create_teacher="heroku run --remote heroku python server/manage.py create_teacher"
```

Then you can just run in a terminal:

```
create_teacher
```

##### Import a CSV file

To import a CSV file locally, run:

```
python manage.py import_users
```

To import a CSV file on prod, first upload the CSV file to some public S3 bucket
(or make it publicly available some other way):

```
heroku login  # if not already logged in
heroku run --remote heroku python server/manage.py import_users --s3
```

### Client

```bash
# install dependencies
npm install --prefix client

# serve with hot reload at localhost:3000
npm run dev --prefix client

# build
npm run build --prefix client

# build for production with minification
export NODE_ENV=production && npm install --prefix client && npm run build --prefix client
```

After running `npm run dev`, log in to the Django admin view on the same domain the webpack dev server is running on. Example: the client runs on localhost:8080 and Django on localhost:8000. This way you will have a session and a CSRF cookie set, and the Apollo client will be able to make requests.

## Docker

Update the docker image:

- Edit `Dockerfile`
- `docker build --memory=1g --memory-swap=1g -t iterativ/skillbox-test: -f Dockerfile .`
- `docker push iterativ/skillbox-test:`
- Update the `bitbucket-pipelines.yml` to use the new tag

## Urls

Trailing slashes are required.

- Admin interface: http://127.0.0.1:8000/guru/
- CMS interface: http://127.0.0.1:8000/cms/
- GraphQL interface: http://localhost:8000/api/graphiql/

## Heroku

`heroku run python server/manage.py --app `

### Rollback

After doing a rollback under https://data.heroku.com/, check the database status:

`heroku pg:info --app=`

Change the `DATABASE_URL` (e.g. after a rollback):

`heroku pg:promote HEROKU_POSTGRESQL_PINK`

### Backup

See the [docs](./docs/heroku-backup.md).

## AWS

1. Create a user with API access, add the `access key id` and `access key` to `.env`, and set `USE_AWS=True`
2. Create an S3 bucket in location EU Ireland.
3. Change the bucket's `Permissions` / `Bucket Policy`:

   ```
   {
     "Version": "2008-10-17",
     "Statement": [
       {
         "Sid": "PublicReadForGetBucketObjects",
         "Effect": "Allow",
         "Principal": { "AWS": "*" },
         "Action": "s3:GetObject",
         "Resource": "arn:aws:s3:::/*"
       },
       {
         "Sid": "",
         "Effect": "Allow",
         "Principal": { "AWS": "" },
         "Action": "s3:*",
         "Resource": [
           "arn:aws:s3:::/*",
           "arn:aws:s3:::"
         ]
       }
     ]
   }
   ```

## Release

### Pushing to Production

Whenever you want to do a release, just merge develop into master and push to origin. The current master will always be deployed in the evening if there are new commits that are not yet on Heroku.

### Tagging

#### Update

Please use one of the provided scripts in `bin`, `./bin/tag-release.sh` or `./bin/tag-hotfix.sh`, and follow the instructions.

Note: Please make sure you're on the master branch first.

#### Manual Way (deprecated)

Whenever you do a new release, please tag the current commit after merging with the current date:

```
git tag -a v2019-09-10
```

`git tag -a` creates an annotated tag, which must have a release message (like a commit message). For now, just repeat the current date, like:

```
Release to production on 2019-09-10
```

You can and should then push the tag to the repo:

```
git push origin v2019-09-10
```

You can later see the metadata of the tag with

```
git show release/v2019-09-10
```

or

```
git tag -ln
```

NB: If there are two releases on the same day, use the pattern `v2020-03-01.x`, e.g. `v2020-03-01.1`, for the subsequent releases.

#### Hotfixes

Please use the pattern `v2020-03-01.hotfix` for hotfix releases.

## Testing

### Mocking GraphQL calls

We use [cypress-graphql-mock](https://github.com/tgriesser/cypress-graphql-mock) for mocking GraphQL calls in Cypress tests. For an example, please see `spellcheck.spec.js`.

There is a `schema.json` in the fixtures folder. For now it has been generated once; if there is a significant update to the schema on the server, it has to be regenerated.
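Because the fixture is generated once, it can silently drift from the live schema. A quick stdlib-only staleness check can help; this helper and the paths in the usage comment are illustrative assumptions, not part of the project:

```python
import hashlib
import json
import pathlib


def schema_digest(path):
    """Return a digest of a schema.json file that ignores key order
    and whitespace, so only real schema changes alter the hash."""
    data = json.loads(pathlib.Path(path).read_text())
    canonical = json.dumps(data, sort_keys=True, separators=(",", ":"))
    return hashlib.sha256(canonical.encode()).hexdigest()


# Hypothetical usage: compare the committed fixture against a fresh export.
# if schema_digest("client/cypress/fixtures/schema.json") != schema_digest("/tmp/schema.json"):
#     print("schema.json is stale -- regenerate it")
```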
To generate a new schema, use the management command:

```
python manage.py export_schema_for_cypress
```

## GraphQL

### Generate GraphQL SDL Document

For Linux:

```bash
cd server
./graphql-schema.sh && npm run codegen --prefix ../client
```

For macOS (there is a problem with the sed command):

```bash
cd server
./macos-graphql-schema.sh && npm run codegen --prefix ../client
```

## Backup to S3

From https://pawelurbanek.com/heroku-postgresql-s3-backup

### Initial setup per Heroku Dyno

```
heroku buildpacks:add heroku-community/awscli
heroku config:set BACKUP_AWS_ACCESS_KEY_ID=[Your AWS Access Key ID]
heroku config:set BACKUP_AWS_SECRET_ACCESS_KEY=[Your AWS Secret Access Key]
heroku config:set BACKUP_S3_BUCKET_NAME=[Your S3 bucket name]
```

```
heroku authorizations:create  # => TOKEN
heroku config:set HEROKU_API_KEY=[TOKEN]
heroku buildpacks:add heroku-community/cli
```

```
heroku config:set APP_NAME=app-name
```

```
heroku config:set PG_BACKUP_KEY=$(openssl rand -base64 32)
```

```
heroku addons:create scheduler:standard
heroku addons:open scheduler
```

Then schedule the backup, for example daily at 3:00 UTC, with the command:

```
./bin/pg-backup-to-s3
```

## Note on components

Our own components remain in kebab-case; imported components from third-party libraries are used in PascalCase.
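For reference, Vue treats the two spellings of the same component name as interchangeable; a minimal sketch of the PascalCase-to-kebab-case normalization (naive about consecutive capitals such as acronyms):

```python
import re


def pascal_to_kebab(name):
    """Convert a PascalCase component name to its kebab-case template
    form, e.g. "SkillCard" -> "skill-card" (naive about acronyms)."""
    return re.sub(r"(?<!^)(?=[A-Z])", "-", name).lower()


print(pascal_to_kebab("SkillCard"))  # skill-card
```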