This is not the article to discuss the relative merits of CI/CD or the testing you should include in that process - that's for another day. But assuming you have already decided to deploy using a CI/CD pipeline (and really, why wouldn't you?), here's how I have been using Azure Pipelines with GitHub to achieve just that. I've included my basic azure-pipelines.yml below these steps, but there's some work to do before you create that file...
First you'll need some basics:
An Azure account with an active subscription (you can create an account for free).
An Azure DevOps organization.
Sign in to Azure DevOps - it will take you automagically to your DevOps dashboard. Within your organization, create a new Project.
You'll need an Environment to run your pipeline in, so select Environments in the left-hand menu and create your first environment.
In my case I'm working with virtual machines, so select Virtual machines as the resource type for your new environment.
On the next page, you'll see a Registration script - this is really slick: you just copy that script and run it in PowerShell on your VM, and that registers the machine (via a personal access token) as a resource in this environment.
In your project, navigate to the Pipelines page in the left-hand menu and select the option to create a new pipeline.
Select GitHub as your repository - you'll be asked to confirm access and select the source repository. You will also need to grant the Azure Pipelines app access to your selected repos.
If you've copied my YAML file from below, make sure it is in the root folder of your repo - then you can select Existing Azure Pipelines YAML file as your starting point.
You'll be asked to review the YAML file - make sure you update at least the following (see the snippet after this list):
workingDirectory for both the Ubuntu test job and your deployment target
the environment values for name, resourceType and tags
displayName - this one is for your future reference, but make it something you'll remember!
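For reference, here are those lines as they appear in my file - the values are only placeholders from my own setup, so swap in your own environment name and tag, and point every workingDirectory at the folder the repo lives in on that machine:

- deployment:
  displayName: PreprodDeploy          # something you'll recognise in the run history
  environment:
    name: Preprod                     # the Environment you created earlier
    resourceType: VirtualMachine
    tags: preprod                     # only VMs carrying this tag receive the deployment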
Go back to GitHub and create a Personal Access Token to substitute for <PAT> in the git pull on your server - that allows the server to pull your repo without you having to log in each time.
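If you want to prove the token works before the first full run, you could drop a temporary step like this into the deploy stage (this is my own addition, not part of the file below) - git ls-remote just lists the remote branches, so it fails fast on a bad PAT without touching the working copy:

# Temporary sanity check: fails immediately if the PAT can't read the repo
- script: git ls-remote https://<PAT>@github.com/<githubOrg>/<myRepo>.git
  workingDirectory: X:\Code\myRepo
  continueOnError: false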
azure-pipelines.yml
# Basic deployment script
trigger:
- main

variables:
  isMain: $[eq(variables['Build.SourceBranch'], 'refs/heads/main')]

stages:
- stage: test
  jobs:
  - job: testJS
    pool:
      vmImage: 'ubuntu-16.04'
    steps:
    # Lint and unit-test on a hosted agent before anything touches the server
    - script: npm ci
      workingDirectory: .
      continueOnError: false
    - script: npm run lint
      workingDirectory: .
      continueOnError: false
    - script: npm run test:unit
      workingDirectory: .
      continueOnError: false

- stage: deploy
  # Only deploy builds that came from main and passed the test stage
  condition: and(succeeded(), eq(variables.isMain, true))
  jobs:
  - deployment:
    displayName: PreprodDeploy
    environment:
      name: Preprod
      resourceType: VirtualMachine
      tags: preprod
    strategy:
      runOnce:
        deploy:
          # These steps run on every VM registered in the Preprod environment
          # with the 'preprod' tag
          steps:
          - script: git pull https://<PAT>@github.com/<githubOrg>/<myRepo>.git main
            workingDirectory: X:\Code\myRepo
            continueOnError: false
          - script: npm install -g npm
            workingDirectory: X:\Code\myRepo
          - script: npm ci
            workingDirectory: X:\Code\myRepo
          - script: npm run test:unit
            workingDirectory: X:\Code\myRepo
            continueOnError: false
          - script: npm run build
            workingDirectory: X:\Code\myRepo
Use npm ci instead of npm install - it installs exactly what package-lock.json specifies, so you get the SAME libraries you just carefully developed with, not just the latest versions of everything. That will save a lot of heartache and bug tracking on its own!
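npm ci also refuses to run at all if package-lock.json is missing or out of sync with package.json, which is exactly what you want in CI. If you'd like that failure to be more obvious, a tiny guard step in the Ubuntu test stage does the trick (my own suggestion, not part of the file above):

# Fail early with a clear message if the lockfile wasn't committed
- script: test -f package-lock.json || (echo package-lock.json is missing && exit 1)
  workingDirectory: .
  continueOnError: false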
Working with this script on a new deployment today, the root folder on the agent was, for some reason, NOT the same as the name of the repo! It took a little playing around to figure that out so I could get the pre-deployment testing step to work.
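If you hit the same thing, a throwaway debug step that just echoes the agent's predefined directory variables will tell you where the checkout actually landed (the variables are built into Azure Pipelines; the step itself is my own addition):

# Temporary debug step: print where this agent actually puts the sources
- script: |
    echo System.DefaultWorkingDirectory is $(System.DefaultWorkingDirectory)
    echo Build.SourcesDirectory is $(Build.SourcesDirectory)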