This is what we use to document things: both public documentation, such as this site, and private documentation, such as internal handbooks.
We tried WordPress. The problem with WordPress, particularly for a developer, is that there is a lot of overhead. You have to head over to a website, enter a username and password, and only then write content. By the time you open the browser and log in, half of your enthusiasm for writing is gone. You are also bothered by whether you should create a post or a page.
Let's look at the workflow in MkDocs. You are writing code and you come across an interesting idea. You Cmd-Tab to the Sublime window on the right, create a file, and start writing. Once you are done, you can choose how to organise it: commit it if you like it, or let it sit for a while if you are not satisfied with it.
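The steps above can be sketched as a few shell commands; the folder and file names here are hypothetical, and the preview/commit steps are shown as comments since you run them only when you want to:

```shell
# A minimal sketch of the capture-first workflow; names are illustrative.
mkdir -p handbook/docs
cat > handbook/docs/interesting-idea.md <<'EOF'
# An interesting idea
Notes captured mid coding session.
EOF
# Preview locally whenever you like:
#   cd handbook && mkdocs serve
# Commit only once you are happy with it:
#   git add docs/interesting-idea.md
#   git commit -m "notes: interesting idea"
```

The point is that capturing the idea is a single file write; organising and publishing are deferred decisions.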
The obvious disadvantage of a system like this is that it is only suited to developer-writers. It is going to be very difficult for a non-developer to contribute.
The generator for switchless is available in switchless-cli. Run the switchless CLI to install mkdocs with our flavor.
`sudo easy_install mkdocs` — installs the mkdocs CLI. (On newer systems, `pip install mkdocs` works as well.)
```shell
# Create a folder
mkdir asyncauto_handbook
# Set up the folder
cd asyncauto_handbook
# Initialise git
git init
# Initialise npm and install the switchless CLI
npm init
npm install --save-dev @switchless-io/[email protected]
# Run the CLI and install the latest theme of mkdocs
./node_modules/@switchless-io/cli/index.js
# choose --> install --> mkdocs
# Lift the server
mkdocs serve
# Create a remote repo and push to it
git add .
git commit -m "mkdocs setup"
git remote add origin https://github.com/your_name/your_repo.git
git push -u origin master
```
If you are starting a new project, mkdocs is included in the switchless CLI setup; choose mkdocs from the menu.
`mkdocs serve` — serves the static files you are currently working on, so you can view them in your local browser.
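For reference, `mkdocs serve` reads its configuration from a `mkdocs.yml` in the project root; a minimal sketch (the site name and nav entries here are illustrative, not taken from the actual project):

```yaml
site_name: AsyncAuto Handbook
nav:
  - Home: index.md
theme: readthedocs
```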
We generally prefer readthedocs.org for hosting public documentation. It has some additional features for versioning documentation.
To deploy via readthedocs:
CNAME handbook.asyncauto.com --> cloudflare-to-cloudflare.readthedocs.io
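The same record, written as a BIND-style zone entry (the TTL here is illustrative):

```
handbook.asyncauto.com.  300  IN  CNAME  cloudflare-to-cloudflare.readthedocs.io.
```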
Use the following settings
If you point the CNAME at cloudflare-to-cloudflare.readthedocs.io before setting up the project, you will experience some DNS propagation delays.
The browser may also flag the site as "Not Secure" for some time.
Use Cloudflare and S3 to privately host your static files
Refer to this guide: How to host a static website using AWS S3 and Cloudflare, to set up your static site.
Perform the steps in the previous section.

Protection from cross-domain access

Because of how S3 works, your bucket name should be the name of the domain where you host your static site. S3 will not allow any other domain to point to your S3 bucket.
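As a sketch, creating a bucket whose name matches the site's domain could look like this. The domain `docs.example.com` is a placeholder, and the actual AWS calls are commented out since they require credentials:

```shell
# The bucket name must equal the domain the site is served from.
DOMAIN="docs.example.com"      # placeholder domain
BUCKET="s3://${DOMAIN}"
echo "Bucket for ${DOMAIN}: ${BUCKET}"
# With credentials configured, you would then run:
#   aws s3 mb "${BUCKET}"
#   aws s3 website "${BUCKET}" --index-document index.html
```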
Add some content to the S3 bucket and you will be able to see it at your new endpoint. Two more things to do:
If you don't mind paying for it, Cloudflare Access gives you much more sophisticated access control.
Configure your AWS credentials as environment variables
Go to your repo on GitLab, then go to Settings > CI/CD > Variables.
Add these variables: AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY.
These environment variable names are special: GitLab passes them into the job, and the AWS CLI automatically recognises AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY as credentials, so you don't have to configure them manually in the runner. Because you never echo them in the script, this also prevents your credentials from leaking into the logs.
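Conceptually, what happens inside the job is plain environment-variable pickup; a sketch with obviously fake values:

```shell
# GitLab injects the CI/CD variables into the job's environment.
# The AWS CLI reads these exact names automatically, so no credentials
# file or `aws configure` step is needed. The values below are fake.
export AWS_ACCESS_KEY_ID="AKIAEXAMPLEKEY"
export AWS_SECRET_ACCESS_KEY="example-secret-value"
# Any aws command in the job now authenticates with them, e.g.:
#   aws sts get-caller-identity
```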
Configure the GitLab config - .gitlab-ci.yml
Create a file called .gitlab-ci.yml in your repo and add the following content to it.
```yaml
image: ruby

deploy_to_s3:
  type: deploy
  image: ruby
  only:
    - master
  script:
    - apt update
    - apt install -yq python3-pip nodejs git
    - pip3 install "mkdocs>=1.1.2"
    - mkdocs build --verbose
    - pip3 install awscli
    - echo "Pushing the static files"
    # - mkdir ~/.aws/
    # - touch ~/.aws/credentials
    # - printf "[eb-cli]\naws_access_key_id = %s\naws_secret_access_key = %s\n" "$AWS_ACCESS_KEY_ID" "$AWS_SECRET_KEY" >> ~/.aws/credentials
    - aws s3 sync public/ s3://agent-handbook.mralbert.in --exclude ".DS_Store/*" --cache-control "max-age=120000" --delete
```
Notice that three lines are commented out. The GitLab runner already exposes the AWS credential variables you set via the UI, and the AWS CLI recognises those environment variable names, so you don't have to write a credentials file in the runner script.