# skillBox

## Sources

## Development

### Server

- Install `pipenv`
- Create a virtualenv and install the dependencies: `pipenv --python 3.6 install --dev`
- Create a PostgreSQL database & user
- Install the Heroku CLI, then run `heroku login` and `heroku git:remote -a skillbox-hep`
- Create a `.env` file in `server` with `SECRET_KEY` and `DATABASE_URL` (or copy the existing `.env.example` file)
- Migrate the database: `pipenv run python manage.py migrate`
- Create a superuser: `pipenv run python manage.py createsuperuser`
- Run the development server: `pipenv run python manage.py runserver`
- Load dummy data: `pipenv run python manage.py dummy_data` (restart the development server afterwards). This recreates the database with test data and a new superuser: `test`/`test`
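A minimal `.env` might look like the following sketch (the values are placeholders; the database credentials mirror the dockerized database described below — adjust them to your local setup):

```
SECRET_KEY=<your-secret-key>
DATABASE_URL=postgres://skillbox:skillbox@localhost:5432/skillbox
```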

### Dockerize DB

- To dockerize the DB, after installing Docker, run the following command:

  ```shell
  docker run --name skillboxdb -d -p 5432:5432 -e POSTGRES_PASSWORD=skillbox -e POSTGRES_USER=skillbox -e POSTGRES_DB=skillbox postgres:alpine
  ```

- After a reboot, start the container again with `docker start skillboxdb`
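With that container running, the `DATABASE_URL` for `.env` follows directly from the `docker run` flags; a small illustrative helper (hypothetical, not part of the project):

```python
# Build the DATABASE_URL matching the `docker run` flags above.
# User, password, and database name all default to "skillbox",
# as in the command; override the arguments if you changed them.
def database_url(user="skillbox", password="skillbox",
                 host="localhost", port=5432, db="skillbox"):
    return f"postgres://{user}:{password}@{host}:{port}/{db}"

print(database_url())
# postgres://skillbox:skillbox@localhost:5432/skillbox
```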

### Notes

### Client

```shell
# install dependencies
npm install --prefix client

# serve with hot reload at localhost:3000
npm run dev --prefix client

# build
npm run build --prefix client

# build for production with minification
export NODE_ENV=production && npm install --prefix client && npm run build --prefix client
```

After running `npm run dev`, log in to the Django admin view on the same domain the webpack dev server is running on. Example: the client runs on `localhost:8080` and Django on `localhost:8000`. This way a session and a CSRF cookie are set, and the Apollo client will be able to make requests.
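The cookie handshake can be sketched outside the browser as well. In this illustration, only the `csrftoken` cookie name and `X-CSRFToken` header are Django defaults; the helper itself is hypothetical:

```python
# Once the admin login has set a session and csrftoken cookie, every
# mutating request must echo the token back in the X-CSRFToken header --
# this is what the Apollo client does when it shares a domain with Django.
def csrf_headers(cookies):
    token = cookies.get("csrftoken", "")
    return {"X-CSRFToken": token}

print(csrf_headers({"csrftoken": "abc123", "sessionid": "xyz"}))
# {'X-CSRFToken': 'abc123'}
```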

## Production Build

## URLs

(trailing slashes are required)

## Heroku

```shell
heroku run python server/manage.py <command> --app <appname>
```

### Rollback

After doing a rollback under https://data.heroku.com/, check the state of the databases:

```shell
heroku pg:info --app=<appname>
```

Change the `DATABASE_URL` (e.g. after a rollback):

```shell
heroku pg:promote HEROKU_POSTGRESQL_PINK
```

### Backup

Capture a new backup:

```shell
heroku pg:backups:capture --app <appname>
```

Get a download URL for a backup (e.g. `b001`):

```shell
heroku pg:backups:url b001 --app <appname>
```

## AWS

1. Create a user with API access, add the access key ID and secret access key to `.env`, and set `USE_AWS=True`
2. Create an S3 bucket in location EU (Ireland)
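One way the `USE_AWS` flag might be consumed in `settings.py` — a sketch under the assumption that the project uses django-storages; the function and flag handling are illustrative, not the project's actual code:

```python
import os

# Pick the Django file storage backend based on the USE_AWS flag
# from .env. The backend dotted paths are the django-storages S3
# backend and Django's default filesystem storage, respectively.
def storage_backend(env=os.environ):
    if env.get("USE_AWS") == "True":
        return "storages.backends.s3boto3.S3Boto3Storage"
    return "django.core.files.storage.FileSystemStorage"

print(storage_backend({"USE_AWS": "True"}))
# storages.backends.s3boto3.S3Boto3Storage
```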

### Change bucket Permissions / Bucket Policy

```json
{
    "Version": "2008-10-17",
    "Statement": [
        {
            "Sid": "PublicReadForGetBucketObjects",
            "Effect": "Allow",
            "Principal": {
                "AWS": "*"
            },
            "Action": "s3:GetObject",
            "Resource": "arn:aws:s3:::<bucket-name>/*"
        },
        {
            "Sid": "",
            "Effect": "Allow",
            "Principal": {
                "AWS": "<user arn>"
            },
            "Action": "s3:*",
            "Resource": [
                "arn:aws:s3:::<bucket-name>/*",
                "arn:aws:s3:::<bucket-name>"
            ]
        }
    ]
}
```
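The `<bucket-name>` and `<user arn>` placeholders can also be filled programmatically; a small illustrative helper (the example bucket name and ARN are made up) that produces the policy above for concrete values:

```python
import json

def bucket_policy(bucket, user_arn):
    """Return the bucket policy above with the placeholders filled in."""
    return {
        "Version": "2008-10-17",
        "Statement": [
            {
                "Sid": "PublicReadForGetBucketObjects",
                "Effect": "Allow",
                "Principal": {"AWS": "*"},
                "Action": "s3:GetObject",
                "Resource": f"arn:aws:s3:::{bucket}/*",
            },
            {
                "Sid": "",
                "Effect": "Allow",
                "Principal": {"AWS": user_arn},
                "Action": "s3:*",
                "Resource": [
                    f"arn:aws:s3:::{bucket}/*",
                    f"arn:aws:s3:::{bucket}",
                ],
            },
        ],
    }

# Example values only; substitute your own bucket name and user ARN.
print(json.dumps(bucket_policy("skillbox-media",
                               "arn:aws:iam::123456789012:user/skillbox"),
                 indent=2))
```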