How to Set Up a CI/CD Pipeline for Automated API Tests

Automating API test suite execution through CI/CD pipelines provides a significant advantage over local execution. By leveraging CI/CD, teams get test results for every change without manual intervention, improving the speed, quality, and reliability of testing. Manually triggering the API suite is no longer required, freeing up valuable time for team members.

In this blog post, we will guide you through the creation of a workflow file using GitHub Actions for your automated API tests. However, before diving into the creation of a CI/CD workflow, it’s essential to understand some crucial points for a better grasp of the concept.

Before we start creating a CI/CD workflow for our API tests, I suggest you first go through the API test automation framework here, and also read this blog on creating a web test automation framework, as it explains the points we should all consider before selecting a test automation framework. The API test automation framework is written in Python and uses the Behave library for BDD.

Let’s understand some basic and important points to start with the CI/CD workflow.

What is DevOps?

DevOps is a set of practices and tools that integrate and automate tasks in the software development and IT industry. It establishes communication and collaboration between development and operations teams, enabling faster and more reliable software build, testing, and release processes. DevOps is a methodology that derives its name from the combination of “Development” and “Operations.”

The primary goal of DevOps is to bridge the gap between development and operations teams by fostering a culture of shared responsibility and collaboration. This helps to reduce the time it takes to develop, test, and deploy software while maintaining high quality and reliability standards. By automating manual processes and eliminating silos between teams, DevOps enables organizations to respond more quickly to changing market demands and customer needs.

To know more about DevOps and its history, please visit the site https://en.wikipedia.org/wiki/DevOps 

What is CI/CD?

CI/CD refers to Continuous Integration and Continuous Delivery, which are processes and practices that help to deliver code changes more frequently and reliably. These processes involve automating the building, testing, and deployment of code changes, resulting in faster and higher-quality software releases for end-users.

The CI/CD pipeline follows a workflow that starts with continuous integration (CI), followed by continuous delivery (CD). The CI process involves integrating code changes into a shared repository and automatically building and testing them to identify errors early in the development process. Once the code has been tested and approved, the CD process takes over and automates the delivery of code changes to production environments.

The CI/CD pipeline workflow helps to reduce the risks and delays associated with manual code integration and deployment while ensuring that the changes are tested and delivered quickly and reliably. This approach enables organizations to innovate faster, respond more quickly to market demands, and improve overall software quality.

[Image: CI/CD pipeline process]

What are GitHub Actions?

GitHub Actions is a feature that makes it easy to automate software workflows, including world-class CI/CD capabilities. With GitHub Actions, you can build, test, and deploy your code directly from GitHub, while also customizing code reviews, branch management, and issue-triaging workflows to suit your needs.

To learn more about GitHub Actions, please refer to the official documentation: https://docs.github.com/en/actions

The GitHub platform offers integration with GitHub Actions, providing flexibility for customizing workflows to automate tasks such as building, testing, and deploying code. Developers can create custom workflows using GitHub Actions that are automatically triggered when specific events occur, such as code push, pull request merge, or as per a defined schedule.

Workflows are defined using YAML syntax, which is a human-readable data serialization language. YAML is commonly used for configuration files and in applications to store or transmit data. To learn more about YAML syntax and its history, please visit the following link
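
For illustration, here is a minimal, hypothetical workflow skeleton in YAML; the workflow name, job id, and echo step are placeholders and not part of the framework we build later. Key-value pairs, nested mappings, and lists introduced with a dash are the only constructs a basic workflow needs.

name: Sample workflow              # a human-readable workflow name
on: [push]                         # list of events that trigger the workflow
jobs:
  sample-job:                      # a job is a named mapping of settings and steps
    runs-on: ubuntu-latest         # the runner (virtual machine) for this job
    steps:
      - run: echo "YAML uses indentation to express nesting"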

Advantages of using GitHub Actions for a CI/CD pipeline:

  • Seamless integration: GitHub Actions seamlessly integrates with GitHub repositories, making it easy to automate workflows and tasks directly from the repository.
  • Highly customizable: GitHub Actions offers a high degree of customization, allowing developers to create workflows that suit their specific needs.
  • Time-saving: GitHub Actions automates many tasks in the software development process, saving developers time and reducing the potential for errors.
  • Flexible: GitHub Actions can be used for a wide range of tasks, including building, testing, and deploying applications.
  • Workflow visualization: GitHub Actions provides a graphical representation of workflows, making it easy for developers to visualize and understand the process.
  • Large community: GitHub Actions has a large and active community, providing a wealth of resources, documentation, and support for developers.
  • Cost saving: GitHub Actions comes bundled with GitHub Free and Enterprise plans, reducing the cost of maintaining separate CI/CD tools like Jenkins.

Framework Overview:

This is a BDD API automation testing framework. The reason behind choosing a BDD framework is simple: it provides the following benefits over other testing frameworks.

  • Improved Collaboration
  • Increased Test coverage
  • Better Test Readability
  • Easy Test Maintenance
  • Faster Feedback
  • Integration with Other Tools
  • Focus on Business Requirements

Discover the different types of automation testing frameworks available, and why to prefer a BDD framework over the others, here.

Framework Explanation:

The framework is simple. As you will notice, it includes a feature file written in the Gherkin language. Gherkin is a plain-text language with a simple structure; the feature file is easy for a non-technical person to understand, which is why we prefer a BDD framework for automation. To learn more about the Gherkin language, please visit the official site here https://cucumber.io/docs/gherkin/reference/. We have also covered the POST, GET, PUT & DELETE API methods, and the feature file describes all of them in simple, understandable language.

The next component of our framework is the step file. The feature and step files are the two most essential parts of a BDD framework. The step file contains the implementation of the steps mentioned in the feature file: it maps each step from the feature file and executes the corresponding code. We use the Behave library to achieve this; Behave matches step definitions to feature file steps because both share the same language structure.

Then there is the utility file, which contains the methods we reuse most often. There is also a configuration file where we store commonly used data. Furthermore, to install all the dependencies, we have created a requirement.txt file that lists the packages with specific versions. To install the packages from the requirement.txt file, we use the following command.

pip install -r requirement.txt

The above framework is explained in detail here. I suggest you check out that blog first and understand the framework before we move on to the detailed workflow description. A proper understanding of the framework is essential to understand how to create the CI/CD workflow file.

How to create a Workflow File?

  • Create a GitHub repository for your framework
  • Push your framework to that repository
  • Click on the Actions tab
  • Click on the "set up a workflow yourself" option
  • Give a proper name to the workflow file

Additionally, please check out the video below for a detailed, step-by-step walkthrough. The video shows how to create workflow files and the steps you need to follow to do so.

[Video: GitHub Actions workflow file creation]

Components of CI/CD Workflow File:

Events:

Events are responsible for triggering the CI/CD workflow. They are simply activities that happen in the repository, for example pushing to a branch or creating a pull request. Please check the sample events below that can trigger a CI/CD workflow; a minimal example follows the list.

  • push: This event is triggered when someone pushes code to a branch in your repository.
  • pull_request: This event is triggered when someone opens a new pull request or updates an existing one.
  • schedule: This event is triggered on a schedule that you define in your workflow configuration file.
  • workflow_dispatch: This event allows you to manually trigger a workflow by clicking a button in the GitHub UI.
  • release: This event is triggered when a new release is created in your repository.
  • repository_dispatch: This event allows you to trigger a workflow using a custom webhook event.
  • page_build: This event is triggered when GitHub Pages are built or rebuilt.
  • issue_comment: This event is triggered when someone comments on an issue in your repository.
  • pull_request_review: This event is triggered when someone reviews a pull request in your repository.
  • create: This event is triggered when someone creates a branch or tag in your repository (for example by pushing a new tag).

To know more about the events that trigger workflows, please check out GitHub's official documentation here.
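
As a small sketch (the branch name and cron expression below are only examples), several of these events can be combined in a single on: block:

on:
  push:
    branches: ["main"]          # run on every push to main
  pull_request:                 # run for pull requests as well
  workflow_dispatch:            # allow manual runs from the Actions tab
  schedule:
    - cron: '00 12 * * *'       # and once a day at 12:00 UTC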

Jobs:

After setting up the events to trigger the workflow, the next step is to set up the jobs for the workflow. A job consists of a set of steps that perform specific tasks. Every job gets its own runner, essentially a virtual machine (VM), so jobs can run in parallel. This allows us to execute multiple tasks concurrently.

A workflow can have more than one job, each with a unique name and a set of steps that define the actions to perform. For example, one job in the workflow file can build the project, another can test its functionality, and a third can deploy it to a server. The jobs defined in the workflow file can depend on each other. They can also have different requirements from one another, such as specific operating systems, software dependencies or packages, or environment variables.

Discover more about using jobs in a workflow from GitHub’s official documentation here
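
For example, a workflow with two jobs, where the test job waits for the build job and each job runs on a different operating system, might look like this minimal sketch (the job names and echo steps are illustrative only):

jobs:
  build:
    runs-on: ubuntu-latest        # first job runs on a Linux runner
    steps:
      - run: echo "build the project"
  test:
    needs: build                  # this job starts only after 'build' succeeds
    runs-on: windows-latest       # and runs on a different operating system
    steps:
      - run: echo "run the tests"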

Runners:

To execute jobs we need runners. Runners in GitHub Actions are simply virtual machines or physical servers. GitHub categorizes them into two types: GitHub-hosted and self-hosted. The runners are responsible for running the steps described in a job.

Self-hosted runners allow us to execute jobs on our own systems or infrastructure, for example our own physical servers, virtual machines, or containers. We use self-hosted runners when jobs have specialized hardware or infrastructure requirements that must be met.

GitHub-hosted runners are provided by GitHub itself and can be used for free by anyone. These runners are available in a variety of configurations. Furthermore, the best thing about GitHub-hosted runners is that they automatically update with the latest software updates and security patches.

Learn more about runners for GitHub actions workflow here from GitHub’s official documentation. 
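
As a quick illustration (the job names and the self-hosted labels are hypothetical and would need to match runners registered for your repository), the runs-on property is where we choose either kind of runner:

jobs:
  hosted-job:
    runs-on: ubuntu-latest                 # a GitHub-hosted runner
    steps:
      - run: echo "runs on GitHub's infrastructure"
  self-hosted-job:
    runs-on: [self-hosted, linux, x64]     # labels matching a runner we registered ourselves
    steps:
      - run: echo "runs on our own machine"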

Steps:

Steps in the workflow file carry out particular actions. After adding the runner to the workflow file, we define these steps with the help of the steps property. Steps consist of actions and commands to perform on the build, for example steps to download dependencies, check out the code, run the tests, or upload artifacts.

Learn more about the steps used in the workflow file from GitHub’s official documentation here
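
A step can either call a reusable action with the uses property or execute a shell command with the run property. A minimal sketch (the step names and command are illustrative):

steps:
  - name: Check out the repository
    uses: actions/checkout@v3        # a step that calls a published action
  - name: Run a shell command
    run: echo "any command the runner's shell understands"   # a step that runs a command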

Actions:

In a GitHub Actions workflow file, we use actions, which are reusable code modules that can be shared across different workflows and repositories. One or more steps use actions to perform specific tasks such as running tests, building the project, or deploying the code. We can also define input and output parameters for actions, which help us pass data to and receive data from other steps in the workflow. Actions are written by developers and published on the GitHub Marketplace. To use an action in a workflow, we use the uses property.

Find out more about actions for GitHub actions from GitHub’s official documentation here 
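
As a hedged sketch of how inputs and outputs work: the with block passes an input to the action, and an id lets later steps read the action's outputs through the steps context. The python-version output name below comes from the setup-python action's documentation; treat the exact names as an assumption.

steps:
  - name: Set up Python
    id: setup                              # give the step an id so later steps can read its outputs
    uses: actions/setup-python@v3
    with:
      python-version: '3.8.9'              # input parameter passed to the action
  - name: Print the resolved version
    run: echo "Installed Python ${{ steps.setup.outputs.python-version }}"   # output read back

When the workflow runs, the second step prints the exact Python version the action installed.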

We have now covered all the basic topics needed before creating our CI/CD workflow file for the API automation framework. Let's walk through the workflow file.

CI/CD Workflow File:

name: Python API CI/CD Pipeline
on:
  push:
    branches: ["main"]
#  schedule:
#    - cron: '00 12 * * *'
jobs:
  build:
    runs-on: windows-latest
    steps:
      - uses: actions/checkout@v3
      - name: Set up Python
        uses: actions/setup-python@v3
        with:
          python-version: '3.8.9'
      - name: Install dependencies
        run: |
          python -m pip install --upgrade pip
          pip install -r requirement.txt
      - name: install allure
        run: npm install -g allure-commandline
        continue-on-error: true
      - name: run test
        run: behave Features -f allure_behave.formatter:AllureFormatter -o Report_Json
        working-directory: .
        continue-on-error: true
      - name: html report
        run: allure generate Report_Json -o Report_Html --clean
        continue-on-error: true
      - uses: actions/upload-artifact@v2
        with:
          name: HTML reports
          path: Report_Html
        continue-on-error: true

Explanation:

Name:

  • We use the name property to name the workflow. It is good practice to give your workflow a proper name; generally, the name relates to the feature or the repository name.
name: Python API CI/CD Pipeline

Event:

Now we have to set up the event that triggers the workflow. In this workflow, I have added two events for your reference. The pipeline is triggered by the push event on the ‘main‘ branch. Additionally, I have added a schedule event (commented out in the file) to trigger the workflow automatically as per a defined schedule.

on:
  push:
    branches: ["main"]
#  schedule:
#    - cron: '00 12 * * *'

The above schedule indicates that the pipeline runs every day at 12:00 UTC. Action schedules use UTC time and can run at most every 5 minutes.

We can customize the schedule timing as per our needs. The following cron specification shows how the schedule timing is read.
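
As a quick reference (reconstructed here, since the original diagram is not reproduced), the five space-separated fields of the cron expression are minute, hour, day of month, month, and day of week:

#  00   12    *    *    *
#  │    │     │    │    └─ day of the week (0–6, Sunday = 0)
#  │    │     │    └────── month (1–12)
#  │    │     └─────────── day of the month (1–31)
#  │    └───────────────── hour (0–23, in UTC)
#  └────────────────────── minute (0–59)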

Job:

The job we are defining here is build. We want to build the project and perform the required tasks whenever new code changes are merged.

jobs:
  build:

Runner:

The runner we are using here is a GitHub-hosted runner. In this workflow, we are using a windows-latest virtual machine. The VM will build the project and then execute the defined steps.

runs-on: windows-latest

Apart from windows-latest, there are other runners too, such as ubuntu-latest, macos-latest, and self-hosted runners. A self-hosted runner is one that we set up on our own infrastructure, such as our own server or virtual machine, giving us more control over the environment and resources.
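
If the same job ever needs to run on several operating systems at once, a strategy matrix is one way to do it. The following is a minimal sketch for illustration only and is not part of the pipeline above:

jobs:
  build:
    strategy:
      matrix:
        os: [ubuntu-latest, windows-latest, macos-latest]   # one job instance per OS
    runs-on: ${{ matrix.os }}      # each instance picks its runner from the matrix
    steps:
      - uses: actions/checkout@v3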

Steps:

The steps describe the different actions required to perform on the project build. Here, the first action we perform is checking out the repository so that the runner has the latest code.

steps:
  - uses: actions/checkout@v3

Then we set up Python. As this framework is an API automation testing framework using Python and Behave, we need Python to execute the tests.

- name: Set up Python
  uses: actions/setup-python@v3
  with:
    python-version: '3.8.9'

After installing Python, we also need to install the different packages required to run the API tests. These packages are defined in the requirement.txt file, and we can install them using the following command.

- name: Install dependencies
  run: |
    python -m pip install --upgrade pip
    pip install -r requirement.txt

For reporting purposes, we are using Allure reports. To generate the Allure report, we need to install the Allure command-line tool separately.

- name: install allure
  run:  npm install -g allure-commandline
  continue-on-error: true

Now that we have installed all the packages, we can run our API tests. We run the tests with the Allure Behave formatter so that, once execution is completed, a Report_Json folder is generated, which is required to produce the HTML report.

- name: run test
  run: behave Features -f allure_behave.formatter:AllureFormatter -o Report_Json
  working-directory: .
  continue-on-error: true

We cannot share the generated Report_Json folder as a report directly. To produce a shareable report, we need to convert the JSON results into an HTML report.

- name: html report
  run: allure generate Report_Json -o Report_Html --clean
  continue-on-error: true

To view the report locally, we first need to upload the artifacts; only then can we download the generated HTML result.

- uses: actions/upload-artifact@v2
  with:
    name: HTML reports
    path: Report_Html
  continue-on-error: true

How to download and view the HTML Report?

Please find the attached GitHub repository link. I have uploaded the same project to this repository, along with a Readme file that explains the framework and the different commands we have used so far in this project. The workflow explanation is also included for better understanding. To view the report, open the workflow run under the repository's Actions tab, download the "HTML reports" artifact from the run summary, extract it, and open it locally (for example with the allure open command).

Conclusion:

In conclusion, creating a CI/CD pipeline workflow for your project using GitHub Actions streamlines the development and testing process by automating tasks such as building the project for new changes, testing the build, and deploying the code. This results in reduced time and minimized errors, ensuring that your software performance is at its best.

GitHub Actions provides a wide range of pre-built actions and the ability to create custom actions that suit your requirements. By following established practices and continuously iterating on workflows, you can ensure your software delivery is optimized and reliable.

I hope this blog has answered the most commonly asked questions and helps you start creating CI/CD pipelines for your own projects. Do check out the blogs on how to create a BDD framework for Web Automation and API automation for a better understanding of automation frameworks and how a robust framework can be created.

How to Set Up an Automatic Backup of Your Source Code Repository?

One day my colleague David was working on a task to migrate a repository from one platform to another. While he was doing that, the server crashed, the platform went down, and all the data, code, and documentation in the repository were lost. Have you ever imagined yourself in David's situation? What if this happened to you? Wondering what solution you can use instead of manual effort?

You’re not alone.

We faced the same problem and found one of the best automation solutions that will work on all the platforms.

Problem:

We want to protect our source code repositories at any cost. They contain all of our code, documentation, and work history.

The real nightmare comes when you don’t have a backup. We all know that taking frequent backups manually is a tedious process.

So how can we do that through automation?

Solution:

There are many possible solutions. We used this one because it is based on a bash script and works on all platforms, such as Azure DevOps, GitLab, and Bitbucket.

In this example, we copy an Azure DevOps repository to GitHub, but as mentioned, this solution can be used on every platform.

  1. First, log in to your GitHub account and create a GitHub repository (if you don't have one already).
  2. Generate a personal access token on GitHub.

Settings -> Developer Settings -> Personal Access tokens

Generate and copy the token value. (refer to the following screenshot)

[Screenshot: generating a GitHub personal access token]

3. Navigate to your Azure DevOps repository and create a pipeline there.

4. Add the following bash script task to the pipeline.

What will this script do?

This script will overwrite the branch code on GitHub. In simple words, if you want to copy code from a feature branch in Azure to GitHub, then:

  1. In GitHub – if the feature branch is present, its code will be overwritten.
  2. In GitHub – if the feature branch is not present, the branch will be created.

- bash: |
    git push --prune https://ghp_iYwH9mwvmWVAFCD@github.com/SpurQLabs/SpurQLabsProject \
    +refs/remotes/origin/master:refs/heads/master +refs/tags/*:refs/tags/*
  displayName: 'Copy to GitHub'
  condition: eq(variables['Build.SourceBranch'], 'refs/heads/master')

Note:

If your branch (in Azure) is master, then there is no need to change anything, but if your branch is main, you should replace "master" with "main".

Be Careful:

If you already have a repository (with code) in GitHub, then instead of main or master, use a feature branch in Azure, and then raise a PR to the main branch.

And if you want to update code from the main branch in Azure to the main branch in GitHub, then no worries, you are good to go. (It will overwrite the main branch code; if the branch is not there, it will create a new one.)

  • You can trigger your pipeline according to your requirements.
  • If you want to schedule it on a daily, weekly, or monthly basis, you can use the "schedules" and "cron" options in the pipeline (see the sketch after the link below).
  • Below is the link for your reference on scheduling the pipeline

https://docs.microsoft.com/en-us/azure/devops/pipelines/process/scheduled-triggers?view=azure-devops&tabs=yaml
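
For reference, a scheduled trigger in Azure Pipelines YAML might look like the following sketch; the cron time and display name are examples, and the branch name matches the reference pipeline below:

schedules:
  - cron: "0 2 * * *"             # every day at 02:00 UTC
    displayName: Nightly backup to GitHub
    branches:
      include:
        - DemoSync                # the branch this pipeline runs from
    always: true                  # run even if there are no new code changes

The always: true flag makes the scheduled run happen even when there are no new commits since the last backup.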

Run your pipeline.

  And Done and Dusted..!

#Azure-pipeline.yml for reference
trigger:
- DemoSync #your feature Branch, Run this pipeline from your feature branch.
pool:
  vmImage: ubuntu-latest
steps:
- checkout: self
- bash: |
    git push --prune https://ghp_cbvnn@github.com/SpurQLabs/SpurQLabsProject \
    +refs/remotes/origin/DemoSync:refs/heads/DemoSync +refs/tags/*:refs/tags/*
  displayName: 'Copy to GitHub'
  condition: eq(variables['Build.SourceBranch'], 'refs/heads/DemoSync')

Conclusion

This is how source code is migrated from Azure repositories to GitHub repositories automatically after every check-in/PR merge.

As mentioned above, you can schedule it daily, weekly, or monthly.

Read more blogs here