# skillBox
## Sources
* [Django](https://docs.djangoproject.com/en/2.0/)
* [Wagtail](http://docs.wagtail.io/)
* [Graphene](https://github.com/graphql-python/graphene-django)
* [Vue-Apollo](https://github.com/Akryum/vue-apollo)
* [Vuetifyjs](https://vuetifyjs.com/en/getting-started/quick-start)
## Development
### Server
* [Install](https://docs.pipenv.org/#install-pipenv-today) pipenv
* Create a virtualenv and install dependencies: `pipenv --python 3.6 install --dev`
* Create a PostgreSQL database & user (PostgreSQL needs to be version 10, not 11)
* [Install](https://devcenter.heroku.com/articles/heroku-cli#download-and-install) Heroku cli, run `heroku login` and `heroku git:remote -a skillbox-hep`
* Create a `.env` file in `server` with `SECRET_KEY` and `DATABASE_URL`, or copy the existing `.env.example` file (see the sketch after this list)
* Migrate databases: `pipenv run python manage.py migrate`
* Create super user: `pipenv run python manage.py createsuperuser`
* Run: `pipenv run python manage.py runserver`
* Dummy data: `pipenv run python manage.py dummy_data` recreates the database with test data and a new superuser (test/test); restart the development server afterwards
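A minimal `server/.env` could look like the sketch below; the values are placeholders, and the database credentials must match the PostgreSQL user and database you created:
```bash
# server/.env – placeholder values, adjust to your local setup
SECRET_KEY=<some-long-random-string>
DATABASE_URL=postgres://<db-user>:<db-password>@localhost:5432/<db-name>
```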
#### Dockerize DB
* To run the database in Docker, install Docker and run the following command (the matching `DATABASE_URL` for `server/.env` is shown below):
```
docker run --name skillboxdb -d -p 5432:5432 -e POSTGRES_PASSWORD=skillbox -e POSTGRES_USER=skillbox -e POSTGRES_DB=skillbox postgres:10-alpine
```
* After a reboot, start the container again with `docker start skillboxdb`
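With the container above, the `DATABASE_URL` in `server/.env` would presumably be:
```bash
DATABASE_URL=postgres://skillbox:skillbox@localhost:5432/skillbox
```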
#### Notes
* `DEBUG=True` enables the [debug middleware](http://docs.graphene-python.org/projects/django/en/latest/debug/)
#### Commands
##### Create a new teacher demo account on prod
```
heroku login # if not already logged in
heroku run --remote heroku python server/manage.py create_teacher <firstname> <lastname> <email>
```
Tip: create an alias in `~/.bash_aliases`:
```
alias create_teacher="heroku run --remote heroku python server/manage.py create_teacher"
```
Then you can just run the following in a terminal:
```
create_teacher <firstname> <lastname> <email>
```
##### Import a CSV file
To import a CSV file locally, run:
```
python manage.py import_users <csv-file>
```
To import a CSV file on prod, first upload the CSV file to a public S3 bucket or make it publicly available some other way (see the AWS CLI sketch below), then run:
```
heroku login # if not already logged in
heroku run --remote heroku python server/manage.py import_users --s3 <csv-url>
```
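For the upload step, a possible sketch using the AWS CLI (the bucket name and the one-hour expiry are just examples):
```bash
# upload the file and generate a pre-signed URL that is valid for one hour
aws s3 cp users.csv s3://<bucket-name>/users.csv
aws s3 presign s3://<bucket-name>/users.csv --expires-in 3600
```
Pass the printed URL as `<csv-url>` to the command above.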
### Client
``` bash
# install dependencies
npm install --prefix client
# serve with hot reload at localhost:3000
npm run dev --prefix client
# build
npm run build --prefix client
# build for production with minification
export NODE_ENV=production && npm install --prefix client && npm run build --prefix client
```
After running `npm run dev`, log in to the Django admin view on the same domain that the webpack dev server is running on.
Example: the client runs on localhost:8080 and Django on localhost:8000.
This way a session and a CSRF cookie are set, and the Apollo client will be able to make requests.
## Urls
(trailing slashes are required)
* Admin interface: http://127.0.0.1:8000/guru/
* CMS interface: http://127.0.0.1:8000/cms/
* GraphQL Interface: http://localhost:8000/api/graphiql/
## Heroku
`heroku run python server/manage.py <command> --app <appname>`
### Rollback
After doing a rollback under https://data.heroku.com/, check the databases with
`heroku pg:info --app=<appname>`
Change `DATABASE_URL` (e.g. after a rollback):
`heroku pg:promote HEROKU_POSTGRESQL_PINK`
### Backup
`heroku pg:backups:capture --app <appname>`
`heroku pg:backups:url b001 --app <appname>`
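To restore such a backup into the local database, something along these lines should work (the dump file name and the local credentials are assumptions):
```bash
# download the latest backup and restore it into the local skillbox database
curl -o latest.dump "$(heroku pg:backups:url --app <appname>)"
pg_restore --verbose --clean --no-acl --no-owner -h localhost -U skillbox -d skillbox latest.dump
```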
## AWS
1. Create a user with API access, add the `access key id` and `secret access key` to `.env`, and set `USE_AWS=True` (see the `.env` sketch after the bucket policy)
2. Create an S3 bucket in the EU (Ireland) region.
Change the bucket's `Permissions` / `Bucket Policy` to:
```
{
    "Version": "2008-10-17",
    "Statement": [
        {
            "Sid": "PublicReadForGetBucketObjects",
            "Effect": "Allow",
            "Principal": {
                "AWS": "*"
            },
            "Action": "s3:GetObject",
            "Resource": "arn:aws:s3:::<bucket-name>/*"
        },
        {
            "Sid": "",
            "Effect": "Allow",
            "Principal": {
                "AWS": "<user arn>"
            },
            "Action": "s3:*",
            "Resource": [
                "arn:aws:s3:::<bucket-name>/*",
                "arn:aws:s3:::<bucket-name>"
            ]
        }
    ]
}
```
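The corresponding `.env` entries could look roughly like this; apart from `USE_AWS`, the exact variable names are assumptions and should be checked against the Django settings:
```bash
USE_AWS=True
AWS_ACCESS_KEY_ID=<access key id>
AWS_SECRET_ACCESS_KEY=<secret access key>
```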
## Release
### Pushing to Production
Whenever you want to do a release, just merge develop into master and push to origin.
The current master is always deployed in the evening if there are new commits that are not yet on Heroku.
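A typical release could therefore look like this:
```bash
git checkout master
git merge develop
git push origin master
```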
### Tagging
Whenever you do a new release, please tag the current commit after merging, using the current date:
`git tag -a release/v2019-09-10`
`git tag -a` creates an annotated tag, which must have a release message (like a commit message). For now, just repeat the current date, like:
`Release to production on 2019-09-10`
You can and should then push the tag to the repo:
`git push origin release/v2019-09-10`
You can later see the metadata of the tag with:
`git show release/v2019-09-10`
## Testing
### Mocking GraphQL calls
We use [cypress-graphql-mock](https://github.com/tgriesser/cypress-graphql-mock) for mocking GraphQL calls in Cypress tests.
For an example, please see `spellcheck.spec.js`.
There is a schema.json in the fixtures folder. For now it has been generated once, and if there is a significant update to the schema on the server, it has to be regenerated.
To generate a new schema, use the management command
```
python manage.py graphql_schema --schema api.schema.schema --out schema.json --indent 4
```
Then, remove the `data` property from the generated `schema.json`, so the `__schema` property is on the top level.
Also remove the two objects with `"name": "__debug"` from the JSON file.
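Both cleanup steps can also be scripted; a possible sketch using `jq` (assuming jq 1.6+ for `walk`, not a committed script):
```bash
# keep only the contents of "data" (so __schema ends up at the top level)
# and drop every object whose "name" is "__debug"
jq '.data | walk(if type == "array"
      then map(select(if type == "object" then .name != "__debug" else true end))
      else . end)' schema.json > schema.clean.json && mv schema.clean.json schema.json
```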