<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>The Ops Community ⚙️: Jeiman Jeya</title>
    <description>The latest articles on The Ops Community ⚙️ by Jeiman Jeya (@jei).</description>
    <link>https://community.ops.io/jei</link>
    <image>
      <url>https://community.ops.io/images/Migxvp1d4jt3HgJwrgR8gYZNe0E8Akpxh9KVrrTBayw/rs:fill:90:90/g:sm/mb:500000/ar:1/aHR0cHM6Ly9jb21t/dW5pdHkub3BzLmlv/L3JlbW90ZWltYWdl/cy91cGxvYWRzL3Vz/ZXIvcHJvZmlsZV9p/bWFnZS84NzIvNTFl/N2E4NWQtNDhiMC00/NjM0LTg0MjMtY2Zl/NzU0OWZlNDI3LnBu/Zw</url>
      <title>The Ops Community ⚙️: Jeiman Jeya</title>
      <link>https://community.ops.io/jei</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://community.ops.io/feed/jei"/>
    <language>en</language>
    <item>
      <title>Let's Dive into AWS CloudFront</title>
      <dc:creator>Jeiman Jeya</dc:creator>
      <pubDate>Wed, 05 Apr 2023 03:15:00 +0000</pubDate>
      <link>https://community.ops.io/jei/lets-dive-into-aws-cloudfront-3i3k</link>
      <guid>https://community.ops.io/jei/lets-dive-into-aws-cloudfront-3i3k</guid>
      <description>&lt;h2&gt;
  
  
  What is AWS CloudFront?
&lt;/h2&gt;

&lt;p&gt;AWS CloudFront is a content delivery network (CDN). It securely delivers data, videos, applications, and APIs to customers globally with low latency and high transfer speeds. It offers a simple and cost-effective way to distribute content to end users, with features such as edge caching and SSL/TLS encryption.&lt;/p&gt;

&lt;h2&gt;
  
  
  How does it work?
&lt;/h2&gt;

&lt;p&gt;When a user or customer requests content, CloudFront routes the request to the edge location closest to the user in that region, delivering the content with low latency and high transfer speeds. CloudFront can also be used to encrypt content, protect against DDoS attacks, and integrate with other AWS services such as S3, Application Load Balancer, Elastic Beanstalk, and more.&lt;/p&gt;
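&lt;p&gt;As a toy illustration of that routing idea (not CloudFront's actual mechanism, which works through DNS and real-time measurements), picking the "closest" edge can be thought of as choosing the location with the lowest latency to the viewer:&lt;/p&gt;

```javascript
// Toy sketch: choose the edge location with the lowest measured latency.
// Edge names and latency numbers here are purely illustrative.
function nearestEdge(edges, latencyMs) {
  return edges.reduce(function (best, edge) {
    // Keep whichever of the two locations has the smaller latency.
    return latencyMs[best] > latencyMs[edge] ? edge : best;
  });
}

console.log(nearestEdge(['singapore', 'sydney', 'tokyo'], {
  singapore: 12,
  sydney: 90,
  tokyo: 60,
})); // 'singapore'
```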

&lt;h2&gt;
  
  
  Components in AWS CloudFront
&lt;/h2&gt;

&lt;h3&gt;
  
  
  Distribution
&lt;/h3&gt;

&lt;p&gt;When you want to deliver or distribute your content to a location, you create what is called a distribution. With a distribution, you choose how to deliver your content to your end users through configuration settings:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Content origin:&lt;/strong&gt; Your source of content, such as an S3 bucket, an Elastic Load Balancer, or an Elastic Beanstalk environment, from which CloudFront gets the files to distribute&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Origin request:&lt;/strong&gt; Whether you require CloudFront to include a specific set of HTTP headers, cookies, or query strings in the requests it sends to your origin&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Access:&lt;/strong&gt; Whether you want your content to be accessible by everyone or restrict access to certain users&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Security:&lt;/strong&gt; Whether you want your users to access your content over HTTPS, and which encryption protocols (e.g., TLS v1.1, v1.2) to support&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Logs:&lt;/strong&gt; Tell CloudFront to create standard logs or real-time logs that show viewer activity&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Geographic restrictions:&lt;/strong&gt; CloudFront can prevent users in selected countries from accessing your content&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;CloudFront provides flexibility in configuring your CDN for any web application according to your business or engineering needs.&lt;/p&gt;

&lt;h3&gt;
  
  
  Functions (Lambda@Edge)
&lt;/h3&gt;

&lt;p&gt;Lambda@Edge is a feature of AWS CloudFront that allows you to run code closer to your application's users, improving performance and reducing latency. With Lambda@Edge, you don't need to provision or manage infrastructure in multiple locations worldwide. You only pay for the compute time you consume on that Lambda application, and there is no charge when your code is not running. By using Lambda@Edge, you can enhance your web applications by making them globally distributed and improving their performance, all without any server administration. Simply upload your code to AWS Lambda, which takes care of everything necessary to run and scale your code with high availability at an AWS location closest to your end user.&lt;/p&gt;

&lt;p&gt;One use case for this is to introduce an HTTP Basic Authentication layer on your AWS services, such as S3 buckets or ECS Fargate, to provide an additional layer of security for end users accessing your application. When users access the website, they will be prompted with a native Basic auth form. Based on the logic defined in the Lambda function, you can approve or deny access. From there, simply attach the Lambda to your CloudFront distribution and every user will be prompted to enter login credentials to access content on your application.&lt;/p&gt;

&lt;h3&gt;
  
  
  Invalidation
&lt;/h3&gt;

&lt;p&gt;Invalidation is the process of instructing CloudFront to purge content from your distribution's cache, refreshing it for your end users in all edge locations. The next time an end user requests a file or page, CloudFront returns to the origin to fetch the latest version of the file.&lt;/p&gt;

&lt;p&gt;This is especially useful if you are running web applications on S3 or an HTTP server and want your users to receive the latest updates immediately, without waiting for the original TTL on the file, which could be up to 30 days. By default, CloudFront sets a TTL of 24 hours for all files, meaning that after 24 hours, it fetches new content from your origin. With invalidations, you can define which types of files, pages, or HTTP paths you would like to invalidate. This provides flexibility to meet your engineering needs.&lt;/p&gt;
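&lt;p&gt;Programmatically, an invalidation is created against a distribution ID with a list of paths and a unique caller reference. The sketch below builds request parameters in the shape expected by CloudFront's CreateInvalidation API; the helper name and distribution ID are illustrative:&lt;/p&gt;

```javascript
// Sketch: build the parameters for a CloudFront invalidation request.
// The CallerReference must be unique per request; a timestamp works for that.
function buildInvalidationParams(distributionId, paths) {
  if (!paths || paths.length === 0) {
    throw new Error('At least one path is required, e.g. "/*" or "/index.html"');
  }
  return {
    DistributionId: distributionId,
    InvalidationBatch: {
      CallerReference: 'invalidate-' + Date.now(),
      Paths: {
        Quantity: paths.length,
        Items: paths,
      },
    },
  };
}

// These params could then be sent with the AWS SDK, or the same request made
// from the AWS CLI, e.g.:
// aws cloudfront create-invalidation --distribution-id E123EXAMPLE --paths "/*"
```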

&lt;h3&gt;
  
  
  Policies
&lt;/h3&gt;

&lt;p&gt;CloudFront allows you to define policies on your distribution. It offers three distinct policy types:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Specify cache and compression settings:&lt;/strong&gt; You can define which HTTP headers, cookies, and query strings CloudFront includes in the cache key with a &lt;strong&gt;CloudFront cache policy&lt;/strong&gt;. The cache key is used to determine whether a viewer's HTTP request results in a cache hit (i.e., whether the object is served to the viewer from the CloudFront cache). Including fewer values in the cache key increases the likelihood of a cache hit. You can also specify TTL settings for objects in the CloudFront cache and enable CloudFront to compress objects before serving them to your end users.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Specify values to include in origin requests:&lt;/strong&gt; With a &lt;strong&gt;CloudFront origin request policy&lt;/strong&gt;, you can specify the HTTP headers, cookies, and query strings that CloudFront includes in origin requests. These are the requests that CloudFront sends to the origin when there is a cache miss for your content.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Specify HTTP headers to remove or add in viewer responses:&lt;/strong&gt; Using a &lt;strong&gt;CloudFront response headers policy&lt;/strong&gt;, you can control the HTTP headers included in HTTP responses that CloudFront sends to viewers (web browsers or other clients). You can add or remove headers from the origin's HTTP response without modifying the origin or writing any code. All of this can be handled through CloudFront.&lt;/li&gt;
&lt;/ul&gt;
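&lt;p&gt;To see why including fewer values in the cache key raises the hit rate, consider this simplified sketch of cache-key derivation (an illustration of the concept, not CloudFront's actual implementation):&lt;/p&gt;

```javascript
// Illustrative only: derive a cache key from a request according to a cache
// policy. The fewer headers/query strings the policy names, the more requests
// collapse onto the same key, and the likelier a cache hit becomes.
function cacheKey(request, policy) {
  const headerParts = policy.headers.map(function (h) {
    return h + '=' + (request.headers[h] || '');
  });
  const queryParts = policy.queryStrings.map(function (q) {
    return q + '=' + (request.query[q] || '');
  });
  return [request.uri].concat(headerParts, queryParts).join('|');
}
```

&lt;p&gt;With a policy that ignores the &lt;code&gt;User-Agent&lt;/code&gt; header, two viewers with different browsers requesting the same URI produce the same key, so the second request is a cache hit.&lt;/p&gt;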

&lt;p&gt;&lt;a href="https://community.ops.io/images/I4UW2nr_PoCU9E-Z8pbapuu2m28YsH-7slhI2PJmK70/w:880/mb:500000/ar:1/aHR0cHM6Ly9jb21t/dW5pdHkub3BzLmlv/L3JlbW90ZWltYWdl/cy91cGxvYWRzL2Fy/dGljbGVzL2R5dmFl/YXJ6a2o5a2JnYnZ6/bXFoLnBuZw" class="article-body-image-wrapper"&gt;&lt;img src="https://community.ops.io/images/I4UW2nr_PoCU9E-Z8pbapuu2m28YsH-7slhI2PJmK70/w:880/mb:500000/ar:1/aHR0cHM6Ly9jb21t/dW5pdHkub3BzLmlv/L3JlbW90ZWltYWdl/cy91cGxvYWRzL2Fy/dGljbGVzL2R5dmFl/YXJ6a2o5a2JnYnZ6/bXFoLnBuZw" alt="" width="880" height="495"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Above is an &lt;a href="https://aws.amazon.com/blogs/startups/how-to-accelerate-your-wordpress-site-with-amazon-cloudfront/"&gt;example of a CloudFront distribution with 2 different configurations&lt;/a&gt; of serving WordPress content from an S3 bucket and an Elastic Load Balancer connected to an EC2 instance.&lt;/p&gt;

&lt;h2&gt;
  
  
  Advantages of using AWS CloudFront
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;High performance:&lt;/strong&gt; AWS CloudFront delivers content with low latency and high transfer speeds, improving user experience and reducing load times.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Flexible distribution options:&lt;/strong&gt; CloudFront offers a range of configuration settings for content delivery, including geographic restrictions, access control, and security features.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Cost-effective:&lt;/strong&gt; CloudFront offers a pay-as-you-go pricing model, making it a cost-effective solution for content delivery.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Integration with AWS services:&lt;/strong&gt; CloudFront integrates seamlessly with other AWS services, such as S3, Elastic Beanstalk, and Application Load Balancer, making it easy to distribute content from these services.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Lambda@Edge:&lt;/strong&gt; The Lambda@Edge feature allows developers to add custom code to CloudFront, improving application performance and reducing latency.&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Disadvantages of using AWS CloudFront
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Steep learning curve:&lt;/strong&gt; CloudFront can be complex to set up and configure, requiring a significant amount of time, focus, and expertise.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Limited content size to cache:&lt;/strong&gt; CloudFront has a limit on the size of content that can be cached, which may not be suitable for larger files or applications serving files above 30GB in size.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Potential cost overruns:&lt;/strong&gt; While CloudFront is cost-effective, it can be easy to exceed usage limits, leading to unexpected costs. Keep a close eye on your cache requests and usage.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Limited geographic coverage:&lt;/strong&gt; While CloudFront has a global network of edge locations, there may be regions where it is not available, limiting performance for some users.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Invalidation process:&lt;/strong&gt; The invalidation process can be slow and cumbersome, making it difficult to update content quickly, especially when invalidating an application with a large number of files at the origin.&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Conclusion
&lt;/h2&gt;

&lt;p&gt;AWS CloudFront is a powerful content delivery network (CDN) that works by caching content at edge locations around the world. CloudFront routes the request to the edge location closest to the user, delivering the content with low latency and high transfer speeds. CloudFront can also be used to encrypt content, protect against DDoS attacks, and integrate with other AWS services to further enhance and protect your applications.&lt;/p&gt;

&lt;h2&gt;
  
  
  Resources
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;&lt;a href="https://aws.amazon.com/blogs/startups/how-to-accelerate-your-wordpress-site-with-amazon-cloudfront/"&gt;How to Accelerate Your WordPress Site with Amazon CloudFront&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://aws.amazon.com/cloudfront/"&gt;AWS CloudFront&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://aws.amazon.com/blogs/networking-and-content-delivery/using-amazon-cloudfront-with-aws-lambda-as-origin-to-accelerate-your-web-applications/"&gt;Using Amazon CloudFront with AWS Lambda as origin to accelerate your web applications&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://docs.aws.amazon.com/AmazonCloudFront/latest/DeveloperGuide/lambda-at-the-edge.html"&gt;Customizing at the edge with Lambda@Edge&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;

</description>
      <category>aws</category>
      <category>cdn</category>
      <category>devops</category>
    </item>
    <item>
      <title>What is DevOps?</title>
      <dc:creator>Jeiman Jeya</dc:creator>
      <pubDate>Tue, 04 Apr 2023 02:47:25 +0000</pubDate>
      <link>https://community.ops.io/jei/what-is-devops-23eo</link>
      <guid>https://community.ops.io/jei/what-is-devops-23eo</guid>
      <description>&lt;p&gt;DevOps is more than just a software development methodology; it's a culture. The DevOps culture emphasizes three main pillars: collaboration, communication, and shared responsibility between development and operations teams. In this post, we will discuss the importance of DevOps culture in software development and how developers and operations teams can benefit from it.&lt;/p&gt;

&lt;h2&gt;
  
  
  DevOps is not a Title
&lt;/h2&gt;

&lt;p&gt;This misconception needs to be addressed. Career titles such as DevOps Manager and DevOps Lead mean those individuals are stewarding the DevOps culture in their teams; the title should not be taken literally as implying that DevOps is a standalone role.&lt;/p&gt;

&lt;h2&gt;
  
  
  The Importance of DevOps Culture - The 3 Pillars
&lt;/h2&gt;

&lt;h3&gt;
  
  
  Collaboration and Communication
&lt;/h3&gt;

&lt;p&gt;One of the key principles of DevOps culture is collaboration and communication. By breaking down silos between teams, DevOps promotes cross-functional collaboration, which leads to better outcomes. Collaboration between development and operations teams can help identify potential problems earlier in the development cycle, saving time and resources down the line.&lt;/p&gt;

&lt;p&gt;Effective communication is also essential in DevOps culture. By sharing information and feedback, teams can work together to ensure that everyone is on the same page and working towards the same goals. This helps identify areas for improvement and implement changes more effectively.&lt;/p&gt;

&lt;h3&gt;
  
  
  Shared Responsibility
&lt;/h3&gt;

&lt;p&gt;Another principle of DevOps culture is shared responsibility. In a DevOps environment, developers and operations teams share the responsibility of delivering software. This means that developers are responsible not only for writing code but also for ensuring that it runs smoothly in production. Similarly, operations teams are responsible not only for deploying and maintaining infrastructure but also for providing feedback to developers on how their code is performing in a production environment. In other words, “you build it, you ship it!”; developers can utilize the DevOps tools available to deploy their code to production.&lt;/p&gt;

&lt;p&gt;DevOps teams can create a culture of accountability and ownership by sharing responsibility. When everyone is responsible for the success of the software, it creates a sense of ownership and pride in the work being done.&lt;/p&gt;

&lt;h3&gt;
  
  
  Continuous Improvement
&lt;/h3&gt;

&lt;p&gt;Continuous improvement is a fundamental value of DevOps culture. DevOps teams continuously seek ways to enhance their processes, tools, and practices. By continuously evaluating and improving their work, DevOps teams can remain competitive in a rapidly changing digital landscape.&lt;/p&gt;

&lt;p&gt;Continuous improvement also assists teams in learning from their mistakes. When something goes wrong, DevOps teams utilize the incident as an opportunity to learn and improve. By implementing changes based on what they have learned, teams can avoid making the same mistakes in the future. DevOps teams should always be looking for ways to improve their automation and processes.&lt;/p&gt;

&lt;h2&gt;
  
  
  Tools
&lt;/h2&gt;

&lt;p&gt;DevOps teams can employ a range of tools and practices to achieve their goals, such as continuous integration and delivery (CI/CD), automated testing, and infrastructure as code (IaC). By automating tasks and using standardized processes, DevOps teams can reduce errors and improve the quality of their software. Some of the tools include:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;a href="https://github.com/features/actions"&gt;GitHub Actions&lt;/a&gt;, &lt;a href="https://azure.microsoft.com/en-us/products/devops"&gt;Azure DevOps&lt;/a&gt; - CI/CD tool&lt;/li&gt;
&lt;li&gt;
&lt;a href="https://www.terraform.io/"&gt;Terraform&lt;/a&gt;, &lt;a href="https://www.pulumi.com/"&gt;Pulumi&lt;/a&gt;, &lt;a href="https://aws.amazon.com/cdk/"&gt;AWS CDK&lt;/a&gt;, &lt;a href="https://azure.microsoft.com/en-us/get-started/azure-portal/resource-manager/"&gt;Azure Resource Manager&lt;/a&gt; - IaC tool&lt;/li&gt;
&lt;li&gt;
&lt;a href="https://www.sonarsource.com/products/sonarcloud/"&gt;SonarCloud&lt;/a&gt;, &lt;a href="https://www.codacy.com/"&gt;Codacy&lt;/a&gt; - Code analysis and security tool&lt;/li&gt;
&lt;li&gt;
&lt;a href="https://azure.microsoft.com/en-us/products/devops/test-plans/"&gt;Azure Test Plan&lt;/a&gt; - Automated testing tool&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Conclusion
&lt;/h2&gt;

&lt;p&gt;DevOps culture is more than just a set of practices; it is a mindset that emphasizes collaboration, communication, shared responsibility, and continuous improvement. By embracing these values, DevOps teams can create a culture of innovation and agility that can help them stay competitive in today's fast-paced digital landscape with the toolsets available.&lt;/p&gt;

</description>
      <category>devops</category>
      <category>cicd</category>
      <category>productivity</category>
    </item>
    <item>
      <title>Deploy a web app to S3 with CloudFront Invalidation via GitHub Actions</title>
      <dc:creator>Jeiman Jeya</dc:creator>
      <pubDate>Mon, 03 Apr 2023 05:21:17 +0000</pubDate>
      <link>https://community.ops.io/jei/deploy-a-web-app-to-s3-with-cloudfront-invalidation-via-github-actions-4433</link>
      <guid>https://community.ops.io/jei/deploy-a-web-app-to-s3-with-cloudfront-invalidation-via-github-actions-4433</guid>
      <description>&lt;p&gt;In this guide, we will show you how to set up a GitHub Actions workflow to deploy your web application to S3 and invalidate your cache on CloudFront for your end users. The guide includes pre-requisites, creating an IAM user, creating a custom policy for the IAM user, fetching your CloudFront distribution ID, saving secrets in GitHub Secrets, and a YAML pipeline workflow. Please note that this post assumes that your source code is hosted on GitHub and is running on a Node.js framework.&lt;/p&gt;

&lt;h2&gt;
  
  
  Pre-requisites
&lt;/h2&gt;

&lt;ol&gt;
&lt;li&gt;An AWS account with admin privileges&lt;/li&gt;
&lt;li&gt;An S3 bucket&lt;/li&gt;
&lt;li&gt;A CloudFront distribution&lt;/li&gt;
&lt;li&gt;A GitHub account&lt;/li&gt;
&lt;li&gt;Basic understanding of GitHub Actions workflows&lt;/li&gt;
&lt;/ol&gt;

&lt;h2&gt;
  
  
  Steps
&lt;/h2&gt;

&lt;p&gt;The steps involved are straightforward and shouldn't take too long to complete.&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Create an IAM user in AWS&lt;/li&gt;
&lt;li&gt;Attach a custom policy tailored to your pipeline needs&lt;/li&gt;
&lt;li&gt;Create or retrieve your CloudFront distribution ID&lt;/li&gt;
&lt;li&gt;Save the Access Key ID, Secret Access Key from IAM, and CloudFront distribution ID in GitHub Secrets&lt;/li&gt;
&lt;li&gt;Create the pipeline using a GitHub workflow&lt;/li&gt;
&lt;/ol&gt;

&lt;h3&gt;
  
  
  Creating the GitHub Actions IAM user
&lt;/h3&gt;

&lt;ol&gt;
&lt;li&gt;Visit the IAM page on AWS.&lt;/li&gt;
&lt;li&gt;Create a new user. Give it a meaningful and distinct name.&lt;/li&gt;
&lt;li&gt;Don’t tick the box that says &lt;strong&gt;“Provide user access to the AWS Management Console - optional”&lt;/strong&gt;. This user does not need access to the console to function.&lt;/li&gt;
&lt;li&gt;For permissions, choose the option, &lt;strong&gt;“Attach policies directly”&lt;/strong&gt; and click &lt;strong&gt;“Create Policy”&lt;/strong&gt;. Copy the JSON policy mentioned below in the next section.&lt;/li&gt;
&lt;li&gt;Review the new user and create it.&lt;/li&gt;
&lt;li&gt;Next, visit the users page on IAM and click on the newly created user.&lt;/li&gt;
&lt;li&gt;Navigate to the &lt;strong&gt;Security credentials&lt;/strong&gt; tab.&lt;/li&gt;
&lt;li&gt;Scroll down to &lt;strong&gt;Access keys&lt;/strong&gt; and click &lt;strong&gt;Create access key&lt;/strong&gt;.&lt;/li&gt;
&lt;li&gt;Take note of the Access key ID and Secret access key. You will need them later when saving them to GitHub Secrets.&lt;/li&gt;
&lt;/ol&gt;

&lt;h3&gt;
  
  
  AWS IAM user policy
&lt;/h3&gt;

&lt;p&gt;To set up the pipeline, you will need an IAM user that will authenticate with your AWS account and perform the updates automatically. It is best to follow security principles and grant this user least-privilege access to safeguard your AWS account from accidental or malicious activity.&lt;/p&gt;

&lt;p&gt;Since the pipeline only needs to communicate with AWS S3 and CloudFront, we scope the policy for the IAM user to the following permissions.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;S3 permissions&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;List the buckets&lt;/li&gt;
&lt;li&gt;Get the objects in the buckets&lt;/li&gt;
&lt;li&gt;Create objects in the buckets&lt;/li&gt;
&lt;li&gt;Delete objects in the buckets&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;CloudFront permissions&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Create invalidation&lt;/li&gt;
&lt;li&gt;Get invalidation&lt;/li&gt;
&lt;li&gt;List invalidations&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  Custom JSON policy for the IAM user
&lt;/h3&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight json"&gt;&lt;code&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"Version"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"2012-10-17"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"Statement"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
            &lt;/span&gt;&lt;span class="nl"&gt;"Action"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="w"&gt;
                &lt;/span&gt;&lt;span class="s2"&gt;"s3:ListBucket"&lt;/span&gt;&lt;span class="w"&gt;
            &lt;/span&gt;&lt;span class="p"&gt;],&lt;/span&gt;&lt;span class="w"&gt;
            &lt;/span&gt;&lt;span class="nl"&gt;"Effect"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"Allow"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
            &lt;/span&gt;&lt;span class="nl"&gt;"Resource"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="w"&gt;
                &lt;/span&gt;&lt;span class="s2"&gt;"arn:aws:s3:::example.com"&lt;/span&gt;&lt;span class="w"&gt;
            &lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="p"&gt;},&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
            &lt;/span&gt;&lt;span class="nl"&gt;"Action"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="w"&gt;
                &lt;/span&gt;&lt;span class="s2"&gt;"s3:GetObject"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
                &lt;/span&gt;&lt;span class="s2"&gt;"s3:PutObject"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
                &lt;/span&gt;&lt;span class="s2"&gt;"s3:PutObjectAcl"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
                &lt;/span&gt;&lt;span class="s2"&gt;"s3:DeleteObject"&lt;/span&gt;&lt;span class="w"&gt;
            &lt;/span&gt;&lt;span class="p"&gt;],&lt;/span&gt;&lt;span class="w"&gt;
            &lt;/span&gt;&lt;span class="nl"&gt;"Effect"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"Allow"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
            &lt;/span&gt;&lt;span class="nl"&gt;"Resource"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="w"&gt;
                &lt;/span&gt;&lt;span class="s2"&gt;"arn:aws:s3:::example.com/*"&lt;/span&gt;&lt;span class="w"&gt;
            &lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="p"&gt;},&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
            &lt;/span&gt;&lt;span class="nl"&gt;"Action"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="w"&gt;
                &lt;/span&gt;&lt;span class="s2"&gt;"cloudfront:CreateInvalidation"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
                &lt;/span&gt;&lt;span class="s2"&gt;"cloudfront:GetInvalidation"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
                &lt;/span&gt;&lt;span class="s2"&gt;"cloudfront:ListInvalidations"&lt;/span&gt;&lt;span class="w"&gt;
            &lt;/span&gt;&lt;span class="p"&gt;],&lt;/span&gt;&lt;span class="w"&gt;
            &lt;/span&gt;&lt;span class="nl"&gt;"Effect"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"Allow"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
            &lt;/span&gt;&lt;span class="nl"&gt;"Resource"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="w"&gt;
                &lt;/span&gt;&lt;span class="s2"&gt;"arn:aws:cloudfront::&amp;lt;aws_account_id&amp;gt;:distribution/*"&lt;/span&gt;&lt;span class="w"&gt;
            &lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;With the custom policy above, you can attach it to the IAM user you have created. Ensure the following is updated:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;S3 bucket name - &lt;code&gt;example.com&lt;/code&gt;
&lt;/li&gt;
&lt;li&gt;AWS account ID - for the CloudFront ARN&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  Fetching your CloudFront distribution ID
&lt;/h3&gt;

&lt;ol&gt;
&lt;li&gt;Visit your CloudFront page.&lt;/li&gt;
&lt;li&gt;Click on &lt;strong&gt;Distributions&lt;/strong&gt;.&lt;/li&gt;
&lt;li&gt;Select your Distribution that points to your S3 bucket.&lt;/li&gt;
&lt;li&gt;Take note of the &lt;strong&gt;Distribution ID&lt;/strong&gt; in the &lt;strong&gt;first column&lt;/strong&gt;. You will need this ID when you save it in your GitHub Secrets.&lt;/li&gt;
&lt;/ol&gt;

&lt;h3&gt;
  
  
  Saving secrets in GitHub Secrets
&lt;/h3&gt;

&lt;p&gt;You need to save the following secrets in your GitHub repository secrets:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;IAM credentials that you created earlier - Access Key ID (&lt;code&gt;AWS_ACCESS_KEY_ID&lt;/code&gt;) and Secret Access Key (&lt;code&gt;AWS_SECRET_ACCESS_KEY&lt;/code&gt;)&lt;/li&gt;
&lt;li&gt;CloudFront Distribution ID that you retrieved earlier (&lt;code&gt;AWS_DISTRIBUTION_PRODUCTION&lt;/code&gt;)&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  GitHub Actions workflow
&lt;/h2&gt;

&lt;p&gt;A workflow is a configurable, automated process that can run one or more jobs. Workflows are defined by a YAML file that is checked into your repository. They run when triggered by an event in your repository, or they can be triggered manually or on a defined schedule. Below is a template of an S3 Sync workflow.&lt;/p&gt;

&lt;h3&gt;
  
  
  YAML pipeline workflow
&lt;/h3&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight yaml"&gt;&lt;code&gt;&lt;span class="na"&gt;name&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;S3 Sync - Production&lt;/span&gt;

&lt;span class="na"&gt;on&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
  &lt;span class="na"&gt;push&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
    &lt;span class="na"&gt;branches&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; 
      &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;master'&lt;/span&gt;

&lt;span class="na"&gt;jobs&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
  &lt;span class="na"&gt;build&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
    &lt;span class="na"&gt;name&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;Build&lt;/span&gt;
    &lt;span class="na"&gt;runs-on&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;ubuntu-latest&lt;/span&gt;
    &lt;span class="na"&gt;steps&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
    &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;uses&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;actions/checkout@v2&lt;/span&gt;
    &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;uses&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;actions/setup-node@v2.5.1&lt;/span&gt;
      &lt;span class="na"&gt;with&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
        &lt;span class="na"&gt;node-version&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;15'&lt;/span&gt;

    &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;name&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;Configure AWS Credentials&lt;/span&gt;
      &lt;span class="na"&gt;uses&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;aws-actions/configure-aws-credentials@v1&lt;/span&gt;
      &lt;span class="na"&gt;with&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
        &lt;span class="na"&gt;aws-access-key-id&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;${{ secrets.AWS_ACCESS_KEY_ID }}&lt;/span&gt;
        &lt;span class="na"&gt;aws-secret-access-key&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;${{ secrets.AWS_SECRET_ACCESS_KEY }}&lt;/span&gt;
        &lt;span class="na"&gt;aws-region&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;ap-southeast-1&lt;/span&gt;

    &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;name&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;Install packages&lt;/span&gt;
      &lt;span class="na"&gt;run&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;npm install&lt;/span&gt;

    &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;name&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;Run build&lt;/span&gt;
      &lt;span class="na"&gt;run&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;npm run build&lt;/span&gt;

    &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;name&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;Generate&lt;/span&gt;
      &lt;span class="na"&gt;run&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;npm run generate&lt;/span&gt;

    &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;name&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;Upload artifact&lt;/span&gt;
      &lt;span class="na"&gt;uses&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;actions/upload-artifact@master&lt;/span&gt;
      &lt;span class="na"&gt;with&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
        &lt;span class="na"&gt;name&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;web-app-dist&lt;/span&gt;
        &lt;span class="na"&gt;path&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;./dist'&lt;/span&gt;

  &lt;span class="na"&gt;deploy_to_production&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
    &lt;span class="na"&gt;name&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;Deploy to S3 Production&lt;/span&gt;
    &lt;span class="na"&gt;runs-on&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;ubuntu-latest&lt;/span&gt;
    &lt;span class="na"&gt;needs&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;build&lt;/span&gt;
    &lt;span class="na"&gt;environment&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
      &lt;span class="na"&gt;name&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;production&lt;/span&gt;
      &lt;span class="na"&gt;url&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;https://example.com&lt;/span&gt;
    &lt;span class="na"&gt;steps&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
    &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;name&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;Download landing page artifact&lt;/span&gt;
      &lt;span class="na"&gt;uses&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;actions/download-artifact@v2&lt;/span&gt;
      &lt;span class="na"&gt;with&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
        &lt;span class="na"&gt;name&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;web-app-dist&lt;/span&gt;
        &lt;span class="na"&gt;path&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;dist&lt;/span&gt;

    &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;name&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;Display structure of downloaded files&lt;/span&gt;
      &lt;span class="na"&gt;run&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;ls -R&lt;/span&gt;
      &lt;span class="na"&gt;working-directory&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;dist&lt;/span&gt;

    &lt;span class="c1"&gt;# S3 sync&lt;/span&gt;
    &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;name&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;S3 Sync&lt;/span&gt;
      &lt;span class="na"&gt;uses&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;jakejarvis/s3-sync-action@v0.5.1&lt;/span&gt;
      &lt;span class="na"&gt;with&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
        &lt;span class="na"&gt;args&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;--acl public-read --follow-symlinks --delete&lt;/span&gt;
      &lt;span class="na"&gt;env&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
        &lt;span class="na"&gt;AWS_S3_BUCKET&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;example.com'&lt;/span&gt;
        &lt;span class="na"&gt;AWS_ACCESS_KEY_ID&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;${{ secrets.AWS_ACCESS_KEY_ID }}&lt;/span&gt;
        &lt;span class="na"&gt;AWS_SECRET_ACCESS_KEY&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;${{ secrets.AWS_SECRET_ACCESS_KEY }}&lt;/span&gt;
        &lt;span class="na"&gt;AWS_REGION&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;ap-southeast-1'&lt;/span&gt;   &lt;span class="c1"&gt;# optional: defaults to us-east-1&lt;/span&gt;
        &lt;span class="na"&gt;SOURCE_DIR&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;dist'&lt;/span&gt;      &lt;span class="c1"&gt;# optional: defaults to entire repository&lt;/span&gt;

    &lt;span class="c1"&gt;# Invalidate Cloudfront&lt;/span&gt;
    &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;name&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;Cloudfront Invalidation&lt;/span&gt;
      &lt;span class="na"&gt;uses&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;chetan/invalidate-cloudfront-action@master&lt;/span&gt;
      &lt;span class="na"&gt;env&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
        &lt;span class="na"&gt;DISTRIBUTION&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;${{ secrets.AWS_DISTRIBUTION_PRODUCTION }}&lt;/span&gt;
        &lt;span class="na"&gt;PATHS&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;/*'&lt;/span&gt;
        &lt;span class="na"&gt;AWS_REGION&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;ap-southeast-1'&lt;/span&gt;
        &lt;span class="na"&gt;AWS_ACCESS_KEY_ID&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;${{ secrets.AWS_ACCESS_KEY_ID }}&lt;/span&gt;
        &lt;span class="na"&gt;AWS_SECRET_ACCESS_KEY&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;${{ secrets.AWS_SECRET_ACCESS_KEY }}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Using this workflow, you can easily deploy your applications to S3 by syncing the files and invalidating your cache on your CloudFront distribution. This ensures that your end users receive the latest content from your release.&lt;/p&gt;

&lt;h3&gt;
  
  
  Workflow breakdown
&lt;/h3&gt;

&lt;p&gt;The workflow is divided into two sections:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;CI section&lt;/strong&gt;, which builds your application, generates the static files, and compresses them to be uploaded as artifacts.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;CD section&lt;/strong&gt;, which downloads the compressed artifact, configures the AWS credentials, syncs the static files to S3, and invalidates your CloudFront distribution.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;The reason for this approach is to enable the reuse of our artifacts, if necessary, for other GitHub workflows. It also provides a clear separation between continuous integration builds and continuous deployments.&lt;/p&gt;

&lt;h2&gt;
  
  
  Conclusion
&lt;/h2&gt;

&lt;p&gt;Setting up a GitHub Actions workflow is fairly simple once you get the hang of it. This workflow allows you to deploy your S3-hosted applications with confidence to various environments.&lt;/p&gt;

</description>
      <category>aws</category>
      <category>devops</category>
      <category>cicd</category>
      <category>github</category>
    </item>
    <item>
      <title>Let’s Dive into AWS Lambda</title>
      <dc:creator>Jeiman Jeya</dc:creator>
      <pubDate>Sat, 01 Apr 2023 15:16:26 +0000</pubDate>
      <link>https://community.ops.io/jei/lets-dive-into-aws-lambda-32do</link>
      <guid>https://community.ops.io/jei/lets-dive-into-aws-lambda-32do</guid>
      <description>&lt;h2&gt;
  
  
  What is AWS Lambda?
&lt;/h2&gt;

&lt;p&gt;AWS Lambda is a serverless, event-driven compute service that lets you run code for virtually any type of application or backend service without provisioning or managing servers. With over 200 AWS services and SaaS applications that can trigger Lambda, the possibilities are endless. Some application use cases include:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;File processing&lt;/li&gt;
&lt;li&gt;Stream processing&lt;/li&gt;
&lt;li&gt;IoT backends&lt;/li&gt;
&lt;li&gt;Web applications&lt;/li&gt;
&lt;li&gt;Mobile backends&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  How it works
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://community.ops.io/images/p03zoCWF1h2ypx_39qXeFKQe5vrJLx6uA7_VpqnBJzM/w:880/mb:500000/ar:1/aHR0cHM6Ly9jb21t/dW5pdHkub3BzLmlv/L3JlbW90ZWltYWdl/cy91cGxvYWRzL2Fy/dGljbGVzL3MxM29z/bnh0Z216NTdlMW5x/YnNhLnBuZw" class="article-body-image-wrapper"&gt;&lt;img src="https://community.ops.io/images/p03zoCWF1h2ypx_39qXeFKQe5vrJLx6uA7_VpqnBJzM/w:880/mb:500000/ar:1/aHR0cHM6Ly9jb21t/dW5pdHkub3BzLmlv/L3JlbW90ZWltYWdl/cy91cGxvYWRzL2Fy/dGljbGVzL3MxM29z/bnh0Z216NTdlMW5x/YnNhLnBuZw" alt="How Lambda Works by Whizlabs" width="880" height="495"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Lambda automatically runs your function when it's needed and can scale from just a few requests per day to thousands per second. You only pay for the compute time you use, so you won't be charged when your code isn't running. Your function can be triggered by various types of events, which are listed below.&lt;/p&gt;
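To make the model concrete, here is a minimal Python handler sketch; the function name, event fields, and response shape are illustrative (roughly what an API Gateway trigger would expect), not something prescribed by the article:

```python
import json

def lambda_handler(event, context):
    # Lambda calls this function once per event. 'event' carries the
    # trigger payload; 'context' exposes runtime metadata such as the
    # remaining execution time.
    name = event.get("name", "world")
    return {
        "statusCode": 200,
        "body": json.dumps({"message": f"Hello, {name}!"}),
    }
```

The same handler signature applies regardless of which of the triggers below invokes the function; only the shape of `event` changes.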

&lt;h2&gt;
  
  
  Components in AWS Lambda
&lt;/h2&gt;

&lt;h3&gt;
  
  
  Function Invocation
&lt;/h3&gt;

&lt;p&gt;Lambda functions can be invoked manually using several methods, including the AWS console, a function URL (HTTPS) endpoint, the Lambda API, the AWS CLI, an AWS SDK, and the AWS toolkit.&lt;/p&gt;
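For example, a one-off invocation from the AWS CLI might look like the following; the function name and payload are illustrative (the `--cli-binary-format` flag is needed on AWS CLI v2 when passing an inline JSON payload):

```shell
# Invoke a function once and write the response payload to out.json
aws lambda invoke \
  --function-name my-function \
  --cli-binary-format raw-in-base64-out \
  --payload '{"name": "Ops"}' \
  out.json

cat out.json
```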

&lt;h3&gt;
  
  
  Function Triggers
&lt;/h3&gt;

&lt;p&gt;If you’re designing an event-driven application, triggers are best suited for this use case. There are various types of triggers readily available in Lambda, most of which are listed below:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;AWS S3 triggers - changes in a bucket or object&lt;/li&gt;
&lt;li&gt;Respond to incoming HTTP request - from AWS API Gateway or a third-party system&lt;/li&gt;
&lt;li&gt;Consume events from a queue - AWS SQS&lt;/li&gt;
&lt;li&gt;Scheduled events - CloudWatch Events&lt;/li&gt;
&lt;li&gt;DocumentDB&lt;/li&gt;
&lt;li&gt;DynamoDB&lt;/li&gt;
&lt;li&gt;AWS ALB&lt;/li&gt;
&lt;li&gt;AWS IoT&lt;/li&gt;
&lt;li&gt;AWS Kinesis&lt;/li&gt;
&lt;li&gt;AWS MQ&lt;/li&gt;
&lt;li&gt;AWS MSK&lt;/li&gt;
&lt;li&gt;AWS SNS&lt;/li&gt;
&lt;li&gt;Apache Kafka&lt;/li&gt;
&lt;li&gt;Amazon Alexa&lt;/li&gt;
&lt;/ul&gt;
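As an example of the first trigger in the list, a handler for an S3 notification might unpack the records like this; a minimal sketch assuming the documented S3 event shape (the bucket and key values in the comment are illustrative):

```python
def handle_s3_event(event, context):
    # An S3 trigger delivers one or more records, each describing a
    # change to an object (e.g. bucket "uploads", key "report.csv").
    processed = []
    for record in event.get("Records", []):
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]
        processed.append((bucket, key))
    return processed
```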

&lt;h3&gt;
  
  
  Function Destination
&lt;/h3&gt;

&lt;p&gt;&lt;a href="https://community.ops.io/images/Msn5wEIO6lMKSykoHpD1bMG_fah2991JSCeTT_LmMX0/w:880/mb:500000/ar:1/aHR0cHM6Ly9jb21t/dW5pdHkub3BzLmlv/L3JlbW90ZWltYWdl/cy91cGxvYWRzL2Fy/dGljbGVzLzlnOHNt/d29uMzBzeGg4dGFi/aHN6LnBuZw" class="article-body-image-wrapper"&gt;&lt;img src="https://community.ops.io/images/Msn5wEIO6lMKSykoHpD1bMG_fah2991JSCeTT_LmMX0/w:880/mb:500000/ar:1/aHR0cHM6Ly9jb21t/dW5pdHkub3BzLmlv/L3JlbW90ZWltYWdl/cy91cGxvYWRzL2Fy/dGljbGVzLzlnOHNt/d29uMzBzeGg4dGFi/aHN6LnBuZw" alt="AWS Lambda Destinations" width="880" height="495"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Destinations route the results of asynchronous and stream-based invocations in Lambda. The feature provides visibility into function invocations by sending the execution results (whether a success or a failure) to other AWS services, simplifying event-driven applications and reducing code complexity.&lt;/p&gt;

&lt;p&gt;At the time of this writing, there are 4 types of destinations:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;SNS topic&lt;/li&gt;
&lt;li&gt;SQS queue&lt;/li&gt;
&lt;li&gt;Lambda function&lt;/li&gt;
&lt;li&gt;EventBridge event bus&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;With these features, you will have greater visibility and control over the results of function execution. This is helpful in building better event-driven applications, reducing code, and utilizing Lambda's native failure handling controls. An example use case: if you have developed a resilient payment transaction system in Lambda and want to keep track of all successes and failures, you can build another function that processes those results into a DynamoDB table for auditing purposes.&lt;/p&gt;
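Destinations can be configured from the console or the CLI; as a sketch, the CLI call looks like this (the function name and the queue/topic ARNs are illustrative):

```shell
# Route async invocation results to destinations:
# successes to an SQS queue, failures to an SNS topic.
aws lambda put-function-event-invoke-config \
  --function-name my-function \
  --destination-config '{
    "OnSuccess": {"Destination": "arn:aws:sqs:ap-southeast-1:123456789012:success-queue"},
    "OnFailure": {"Destination": "arn:aws:sns:ap-southeast-1:123456789012:failure-topic"}
  }'
```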

&lt;h3&gt;
  
  
  Lambda console
&lt;/h3&gt;

&lt;p&gt;AWS offers an intuitive, user-friendly web-based interface for managing your Lambda functions, monitoring tools, CloudWatch log streams, and CPU and memory usage. As previously mentioned, you can manually trigger your Lambda functions as needed from the Web console. Furthermore, you may test your functions without saving an event by configuring a JSON event in the Web console.&lt;/p&gt;

&lt;h3&gt;
  
  
  CloudWatch Logs
&lt;/h3&gt;

&lt;p&gt;By default, for every Lambda function that you create, AWS will create a CloudWatch log group and bind it to your Lambda function. This allows AWS to send all function application logs, including any log methods defined in your code, to CloudWatch. This feature is useful and convenient for debugging and monitoring your application. In addition, you have the option to group your functions and manage all logs using a single CloudWatch log group.&lt;/p&gt;
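As a minimal sketch of this in code (assuming the standard Python runtime, which attaches a handler to the root logger), anything logged from the handler ends up in the function's CloudWatch log group:

```python
import logging

# The Lambda runtime pre-configures the root logger; setting the level
# is enough to get INFO-level entries into the CloudWatch log stream.
logger = logging.getLogger()
logger.setLevel(logging.INFO)

def lambda_handler(event, context):
    logger.info("received event with %d top-level keys", len(event))
    return {"ok": True}
```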

&lt;h2&gt;
  
  
  Advantages of using AWS Lambda
&lt;/h2&gt;

&lt;p&gt;AWS Lambda has several advantages that make it a popular choice for developers:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Cost-effectiveness:&lt;/strong&gt; With AWS Lambda, you only pay for the time your code is running, which means that you won't be charged when your code is not being executed. This is particularly beneficial for applications that have infrequent usage patterns, as you only pay for the computing resources that you use.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Highly scalable:&lt;/strong&gt; AWS Lambda automatically adjusts and scales to the amount of traffic that your application receives, which means that your application can handle large spikes in traffic without any manual intervention. This is especially important for applications that experience large fluctuations in traffic, such as e-commerce websites during peak shopping seasons. As a result, scaling is no longer an issue and you can easily manage traffic spikes and large workloads with AWS Lambda.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Easy to use:&lt;/strong&gt; You don't have to worry about infrastructure management or server maintenance. This means that you can focus on writing code and building your application, rather than worrying about the underlying infrastructure. Additionally, AWS Lambda integrates with a wide range of other AWS services, such as Amazon S3 and Amazon DynamoDB, which makes it easy to build complex applications.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Security:&lt;/strong&gt; AWS has implemented several security measures to ensure that your code and data are protected. For example, AWS Lambda functions are executed within a secure VPC (Virtual Private Cloud), which means that your functions are isolated and protected from external threats. Additionally, AWS Lambda integrates with AWS Identity and Access Management (IAM), which allows you to control access to your functions and resources.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Flexibility:&lt;/strong&gt; This feature allows you to write functions in your preferred language, although currently only Node.js, Python, Java, Ruby, C#, Go, and PowerShell are supported. This provides developers with the freedom to choose the language they prefer, rather than being restricted to a specific language. In addition, Lambda functions can be used for a wide variety of use cases, such as file processing, stream processing, IoT backends, web applications, and mobile backends. Lambda integrates with over 200 AWS services and SaaS applications that can trigger Lambda, which makes the possibilities endless.&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Disadvantages of using AWS Lambda
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Cold start delays:&lt;/strong&gt; One of the major disadvantages of AWS Lambda is cold start delays. The first execution of a Lambda function can have a delay in response time due to initialization. This delay can be a problem for time-sensitive applications. To mitigate this problem, developers can use AWS Lambda Provisioned Concurrency, which pre-warms the Lambda function to reduce cold start delays.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Limited execution time:&lt;/strong&gt; AWS Lambda functions are limited to 15 minutes of execution time, which may not be enough for complex applications. This limitation means that developers must plan their applications accordingly. To overcome this limitation, developers can use AWS Step Functions, which allow them to execute multiple Lambda functions in a sequence, enabling long-running applications.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Dependency management:&lt;/strong&gt; Managing dependencies can be challenging as the functions are deployed as packages. This can be a problem, especially when your functions have many dependencies or require frequent updates. To overcome this problem, developers can use AWS Lambda Layers, which allow them to package and manage their dependencies separately from their Lambda functions.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Debugging:&lt;/strong&gt; Debugging Lambda functions can be challenging as your code runs in a serverless environment. This means that you must use different tools and techniques to debug your code. To overcome this problem, developers can use AWS X-Ray, which allows them to trace requests through their Lambda functions and identify errors and performance bottlenecks.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Vendor lock-in:&lt;/strong&gt; AWS Lambda is a proprietary technology, which may lead to vendor lock-in. This means that once you start using AWS Lambda, it can be challenging to switch to another vendor or technology. To overcome this problem, developers can use AWS SAM (Serverless Application Model), which is an open-source framework for building serverless applications. This framework allows developers to define their serverless applications in a standard way, making it easy to migrate to other serverless platforms.&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Conclusion
&lt;/h2&gt;

&lt;p&gt;AWS Lambda is a powerful technology that has several advantages for developers. It is cost-effective, highly scalable, and easy to use, which makes it an ideal choice for building modern applications. With the flexibility to use any currently supported programming language, AWS Lambda is an excellent choice for developers who want to build applications quickly and efficiently. If you're looking to build a new application or migrate an existing one to the cloud, AWS Lambda is definitely worth considering.&lt;/p&gt;

&lt;h2&gt;
  
  
  Resources
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;&lt;a href="https://aws.amazon.com/blogs/compute/introducing-aws-lambda-destinations/"&gt;Lambda Destinations&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://aws.amazon.com/lambda/"&gt;AWS Lambda Homepage&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://docs.aws.amazon.com/lambda/latest/dg/welcome.html"&gt;AWS Lambda Developer Guide&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://www.whizlabs.com/blog/wp-content/uploads/2021/04/How-Lambda-Works.png"&gt;How Lambda Works&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://dev.to/sosnowski/anatomy-of-aws-lambda-1i1e"&gt;Anatomy of AWS Lambda&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;

</description>
      <category>aws</category>
      <category>devops</category>
      <category>cloudops</category>
    </item>
    <item>
      <title>Terraforming in the Cloud</title>
      <dc:creator>Jeiman Jeya</dc:creator>
      <pubDate>Sun, 26 Mar 2023 08:24:10 +0000</pubDate>
      <link>https://community.ops.io/jei/terraforming-in-the-cloud-2h7j</link>
      <guid>https://community.ops.io/jei/terraforming-in-the-cloud-2h7j</guid>
      <description>&lt;p&gt;This post discusses the benefits of using Terraform for cloud automation. It explains the limitations of manual infrastructure management, the inception of IaC and how Terraform can help mitigate those issues. The document covers topics such as the Terraform configuration language, backends, workspaces, and Terraform Cloud, while using AWS as a cloud provider example.&lt;/p&gt;

&lt;h2&gt;
  
  
  The Iron Age
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://community.ops.io/images/wH7tW5HaG0FUOa1MPGG3tg64mHXa31ifQIoSVtIgZCk/w:880/mb:500000/ar:1/aHR0cHM6Ly9jb21t/dW5pdHkub3BzLmlv/L3JlbW90ZWltYWdl/cy91cGxvYWRzL2Fy/dGljbGVzLzd1ZHB4/eXV0bHZmZHQydmZo/MmVsLmpwZw" class="article-body-image-wrapper"&gt;&lt;img src="https://community.ops.io/images/wH7tW5HaG0FUOa1MPGG3tg64mHXa31ifQIoSVtIgZCk/w:880/mb:500000/ar:1/aHR0cHM6Ly9jb21t/dW5pdHkub3BzLmlv/L3JlbW90ZWltYWdl/cy91cGxvYWRzL2Fy/dGljbGVzLzd1ZHB4/eXV0bHZmZHQydmZo/MmVsLmpwZw" alt="The Iron Age" width="880" height="584"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;In the Iron Age, managing IT infrastructure was done manually. People would physically set up servers and configure them based on required settings by the OS and applications before deploying the application. This manual process often resulted in several problems, such as cost, scalability, availability, monitoring, and performance visibility. &lt;/p&gt;

&lt;p&gt;In order to maintain all of this infrastructure, you would need to hire IT professionals and engineers. However, this can create a variety of issues such as inconsistencies, silos among teams when raising tickets to IT teams to provision new resources, miscommunications, and delays in reviewing and processing infrastructure requests.&lt;/p&gt;

&lt;p&gt;Furthermore, when engineering teams need new infrastructure quickly, they typically use cloud consoles, such as AWS, Azure, Google, or Proxmox, to provision those resources. This approach is not an issue, and some engineers actually prefer spinning up resources this way to gain a better understanding of the prerequisites of a Cloud service or resource. While documentation can be helpful in this case, many engineers prefer hands-on experience and getting down to the nitty-gritty.&lt;/p&gt;

&lt;p&gt;However, there is an issue to consider. What if you need to make changes that are not available through the user interface console? What if you need to make changes that depend on other changes being made first? This could have a widespread impact on the resources you were trying to set up. It can be difficult to keep track of all the resources and you might overlook updating or naming some of them with best practices in mind.&lt;/p&gt;

&lt;p&gt;With modern technology, these problems can indeed be mitigated through automated processes, leading to improved efficiency and reduced costs. That's where Infrastructure as Code comes in.&lt;/p&gt;

&lt;h2&gt;
  
  
  Lo and behold, Terraform
&lt;/h2&gt;

&lt;p&gt;Terraform is a powerful tool that allows you to manage your infrastructure as code. With Terraform, you can define, provision, and manage resources such as virtual machines, cloud instances, and containers across multiple cloud providers using a declarative configuration language known as HashiCorp Configuration Language, or HCL for short.&lt;/p&gt;

&lt;p&gt;Imagine an orchestra of musicians. In this case, Terraform acts as the conductor who ensures that the right number of instruments (providers, modules) are playing and the sound (configuration) is correct. If there is an issue, the conductor replaces the broken instrument with a working one.&lt;/p&gt;

&lt;h2&gt;
  
  
  Wait, what about AWS CloudFormation, Azure RM and the rest?
&lt;/h2&gt;

&lt;p&gt;CloudFormation is one option for provisioning your infrastructure resources, but if you prefer to deploy to a different cloud provider of your choice, it can be a challenge. Terraform, on the other hand, excels at managing infrastructure across multiple cloud platforms. In addition, compared to CloudFormation and other cloud providers' configuration languages, Terraform's configuration files are simpler to understand due to their flat format. This makes it easier to manage dependencies and references, which is where Terraform truly shines.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;resource "aws_instance" "helloworld" {
  ami           = "ami-12d3df"
  instance_type = "t3.micro"

  tags = {
    Name = "EC2-Instance-HelloWorld"
  }
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The Terraform configuration language, designed for human readability, allows you to write infrastructure code with ease. As shown in the snippet above, provisioning an EC2 instance on AWS is a straightforward task.&lt;/p&gt;

&lt;h2&gt;
  
  
  How does Terraform maintain the newly provisioned resources?
&lt;/h2&gt;

&lt;p&gt;Each Terraform configuration has a backend associated with it that determines how operations are carried out and where data, such as the Terraform state, is stored. The state is a necessary requirement for Terraform to function since it is used to compare existing resources with newly added ones specified in the configuration files.&lt;/p&gt;

&lt;p&gt;Backends are typically used to store Terraform state snapshots. A Terraform configuration can specify a backend, integrate with Terraform Cloud, or store state locally by default.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://community.ops.io/images/sJSsmreDSAYyPociuE05kiabZu6O4-iEpEu8fVpIwP0/w:880/mb:500000/ar:1/aHR0cHM6Ly9jb21t/dW5pdHkub3BzLmlv/L3JlbW90ZWltYWdl/cy91cGxvYWRzL2Fy/dGljbGVzLzJ1dWZh/enNrZDJ6YXZuaTFp/NXZoLnBuZw" class="article-body-image-wrapper"&gt;&lt;img src="https://community.ops.io/images/sJSsmreDSAYyPociuE05kiabZu6O4-iEpEu8fVpIwP0/w:880/mb:500000/ar:1/aHR0cHM6Ly9jb21t/dW5pdHkub3BzLmlv/L3JlbW90ZWltYWdl/cy91cGxvYWRzL2Fy/dGljbGVzLzJ1dWZh/enNrZDJ6YXZuaTFp/NXZoLnBuZw" alt="Terraform state and its ecosystem with cloud providers" width="880" height="414"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;By default, Terraform stores its state in a local file directory named &lt;code&gt;terraform.tfstate&lt;/code&gt;. However, if you need to, you can choose to use alternative backend repositories, such as the following:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Amazon S3&lt;/li&gt;
&lt;li&gt;Kubernetes&lt;/li&gt;
&lt;li&gt;Remote&lt;/li&gt;
&lt;li&gt;Postgres&lt;/li&gt;
&lt;li&gt;AzureRM (via Storage accounts)&lt;/li&gt;
&lt;li&gt;Consul&lt;/li&gt;
&lt;li&gt;and many others&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;The above can be configured by using a &lt;code&gt;backend&lt;/code&gt; block in the configuration file (typically the &lt;code&gt;main.tf&lt;/code&gt; file). This is how an S3 backend would look in your configuration file:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;terraform {
  backend "s3" {
    bucket = "sample"
    key    = "path/to/my/samplekey"
    region = "ap-southeast-1"
  }
}

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;To store the backend states on S3, simply supply your AWS IAM credentials from your local machine or your CI/CD tools.&lt;/p&gt;

&lt;p&gt;In my opinion, it is best to store state files remotely in cloud storage because this option allows for state locking. Locking prevents two people working on the same project from making conflicting changes to the same resources at the same time.&lt;/p&gt;
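With the S3 backend, locking is typically enabled by pointing the backend at a DynamoDB table. A sketch, where the bucket and table names are illustrative and the table is assumed to have a &lt;code&gt;LockID&lt;/code&gt; string partition key:

```hcl
terraform {
  backend "s3" {
    bucket         = "sample"
    key            = "path/to/my/samplekey"
    region         = "ap-southeast-1"
    # DynamoDB table used for state locking and consistency checks
    dynamodb_table = "terraform-locks"
  }
}
```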

&lt;p&gt;There is a lot that you can achieve with Terraform states and their respective backends, but you should now have a general understanding of how states work in Terraform and why they are important to have.&lt;/p&gt;

&lt;h2&gt;
  
  
  Ok, so how do I use the same resource configuration to provision different environments - Staging and Production?
&lt;/h2&gt;

&lt;p&gt;In Terraform, &lt;strong&gt;&lt;em&gt;workspaces&lt;/em&gt;&lt;/strong&gt; are separate instances of state data that can be used from the same working directory. Workspaces can be used to manage multiple non-overlapping groups of resources with the same configuration (for example, staging, production, uat, etc.).&lt;/p&gt;

&lt;p&gt;Almost all directories that have a main configuration file have a default workspace, named &lt;code&gt;default&lt;/code&gt;. If you want to configure new workspaces, simply run the following command: &lt;code&gt;terraform workspace new &amp;lt;workspace_name&amp;gt;&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;Under the hood, this creates a &lt;code&gt;terraform.tfstate.d&lt;/code&gt; child directory in your project directory that holds a separate state file per workspace:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;terraform.tfstate.d/
    - staging/
        - terraform.tfstate
    - prod/
        - terraform.tfstate
    - uat/
        - terraform.tfstate
    - ...

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;With that, you can maintain different state files for all of your environments. All you have to do is switch workspaces accordingly when working in that specific environment, using &lt;code&gt;terraform workspace select &amp;lt;workspace_name&amp;gt;&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;Utilize workspaces when your organization operates in multiple environments where your services are deployed.&lt;/p&gt;
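Putting the commands above together, a typical workspace flow from the CLI looks like this (workspace names are illustrative):

```shell
terraform workspace new staging      # create and switch to 'staging'
terraform workspace list             # '*' marks the active workspace
terraform workspace select prod      # switch before planning against prod
terraform plan                       # reads prod's state from terraform.tfstate.d/
```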

&lt;h2&gt;
  
  
  JARVIS-level Automation
&lt;/h2&gt;

&lt;p&gt;In typical scenarios, engineering teams commonly integrate Terraform with their preferred CI/CD toolset (e.g. GitHub Actions, Azure DevOps, GitLab, etc.) to ensure flexibility and self-governance over their state files and configuration, and to maintain coherence within their tech ecosystem. However, if teams are not familiar with CI/CD tools, they can rely on Terraform Cloud.&lt;/p&gt;

&lt;p&gt;Terraform Cloud is a managed service offering by HashiCorp that eliminates the need for excessive tooling. You can provision your infrastructure in a remote environment hosted by HashiCorp that is optimized for Terraform workflows. This means that you can simply link your Cloud provider of choice with Terraform Cloud and let the platform handle all of the heavy lifting for you. All you need to do from your end is build and commit your Terraform (HCL) files to your repository and link your Git repository to your Terraform Cloud organization. Ideally, this follows the GitOps process. Write your infrastructure code, commit the changes, and Terraform Cloud executes your changes in a remote environment.&lt;/p&gt;

&lt;p&gt;Here are some of the pros and cons of using CI/CD tools versus Terraform Cloud, broken down into four criteria:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Control:&lt;/strong&gt; With CI/CD tools, you have complete control over your infrastructure automation workflows, and you can customize your pipeline as required. With Terraform Cloud, you have less control over your infrastructure automation workflows, but you don't have to manage your own infrastructure.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Integration:&lt;/strong&gt; CI/CD tools provide a wide range of integrations with other tools and services. Terraform Cloud integrates with a limited set of cloud providers, but it is optimized for Terraform workflows.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Maintenance:&lt;/strong&gt; With CI/CD tools, you need to manage and maintain your own infrastructure for running the toolset if they are mostly on-premise/self-managed. With Terraform Cloud, you don't have to manage and maintain your own infrastructure.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Cost:&lt;/strong&gt; CI/CD tools can be expensive, especially if you require a lot of customization and integrations. Terraform Cloud has a free tier, which provides basic functionality, and paid tiers that provide additional features.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;So there you have it. Terraform is undeniably a powerful tool for managing infrastructure as code. Its declarative language, ability to manage resources across multiple cloud providers, and powerful set of tools make it an unrivaled choice for managing infrastructure and third-party systems. Whether you are managing a small set of resources or a large, complex infrastructure, Terraform can help you manage it in a reliable and consistent way. That’s why it’s a go-to tool for managing infrastructure and a great fit for DevOps. &lt;/p&gt;

&lt;p&gt;By adopting Terraform, you can guarantee that your infrastructure is constantly in a known state, and that any modifications are made in a controlled manner. Regardless of whether you're overseeing a small or large infrastructure, Terraform is a useful tool for increasing efficiency and effectiveness.&lt;/p&gt;

&lt;h2&gt;
  
  
  Image sources
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;&lt;a href="https://www.devopsschool.com/blog/wp-content/uploads/2021/07/terraform-architecture-components-workflow-2.png"&gt;https://www.devopsschool.com/blog/wp-content/uploads/2021/07/terraform-architecture-components-workflow-2.png&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://www.iwantthatdoor.com/wp-content/uploads/2021/06/iron-age-smith.jpg"&gt;https://www.iwantthatdoor.com/wp-content/uploads/2021/06/iron-age-smith.jpg&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;

</description>
      <category>terraform</category>
      <category>devops</category>
      <category>aws</category>
    </item>
    <item>
      <title>Automating mobile application deployments using Fastlane and CI/CD tools</title>
      <dc:creator>Jeiman Jeya</dc:creator>
      <pubDate>Tue, 31 May 2022 02:57:15 +0000</pubDate>
      <link>https://community.ops.io/jei/automating-mobile-application-deployments-using-fastlane-and-cicd-tools-2d4f</link>
      <guid>https://community.ops.io/jei/automating-mobile-application-deployments-using-fastlane-and-cicd-tools-2d4f</guid>
      <description>&lt;h1&gt;
  
  
  The Problem
&lt;/h1&gt;




&lt;p&gt;Engineering teams these days find it troublesome to build, test and deploy their mobile application changes locally without having to maintain the tools required for it. There is a lot of maintenance involved, as you need to keep track of which versions of these tools are installed to avoid compatibility issues when building your application bundles, especially for hybrid apps built on React Native.&lt;/p&gt;

&lt;h1&gt;
  
  
  The Cure
&lt;/h1&gt;




&lt;p&gt;&lt;a href="https://community.ops.io/images/a7ESGpC3KYhhq_5rMOMxrVr8oIy34JoUQl_3Jc4AKJU/w:880/mb:500000/ar:1/aHR0cHM6Ly9kZXYt/dG8tdXBsb2Fkcy5z/My5hbWF6b25hd3Mu/Y29tL3VwbG9hZHMv/YXJ0aWNsZXMvb2Ri/eWRhbWV2djBmaTk0/d2tqMmgucG5n" class="article-body-image-wrapper"&gt;&lt;img src="https://community.ops.io/images/a7ESGpC3KYhhq_5rMOMxrVr8oIy34JoUQl_3Jc4AKJU/w:880/mb:500000/ar:1/aHR0cHM6Ly9kZXYt/dG8tdXBsb2Fkcy5z/My5hbWF6b25hd3Mu/Y29tL3VwbG9hZHMv/YXJ0aWNsZXMvb2Ri/eWRhbWV2djBmaTk0/d2tqMmgucG5n" alt="Alt Text" width="296" height="78"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;&lt;em&gt;&lt;a href="https://fastlane.tools/"&gt;fastlane&lt;/a&gt;&lt;/em&gt;&lt;/strong&gt;, an automation tool that aids in handling all of the tedious tasks so you don't have to. It's by far, the easiest way to build and release your mobile apps.&lt;/p&gt;

&lt;p&gt;fastlane can easily:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Distribute beta builds to your testers&lt;/li&gt;
&lt;li&gt;Publish a new release to the app store in seconds&lt;/li&gt;
&lt;li&gt;Reliably code sign your application - alleviates all of your headaches&lt;/li&gt;
&lt;li&gt;Reliably maintain your provisioning profiles and application certificates in a Git repository&lt;/li&gt;
&lt;/ul&gt;

&lt;h1&gt;
  
  
  How it works
&lt;/h1&gt;




&lt;p&gt;All actions in fastlane are written into &lt;em&gt;lanes&lt;/em&gt;. Defining lanes is easy. Think of them as functions in any programming language of your choosing. You define all of your actions within that lane.&lt;/p&gt;

&lt;p&gt;An example of a &lt;em&gt;lane&lt;/em&gt; is as follows:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight ruby"&gt;&lt;code&gt;&lt;span class="n"&gt;lane&lt;/span&gt; &lt;span class="ss"&gt;:beta&lt;/span&gt; &lt;span class="k"&gt;do&lt;/span&gt;
  &lt;span class="n"&gt;increment_build_number&lt;/span&gt;
  &lt;span class="n"&gt;build_app&lt;/span&gt;
  &lt;span class="n"&gt;upload_to_testflight&lt;/span&gt;
&lt;span class="k"&gt;end&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;So when it comes to executing that lane, all you do is run &lt;code&gt;fastlane beta&lt;/code&gt; in your terminal.&lt;/p&gt;

&lt;h1&gt;
  
  
  Installation &amp;amp; Setup
&lt;/h1&gt;




&lt;p&gt;In this article, we will look at setting up a fastlane script to build, sign and deploy an iOS application to TestFlight.&lt;/p&gt;

&lt;h2&gt;
  
  
  Pre-requisites
&lt;/h2&gt;

&lt;p&gt;As with most projects, you need to perform the initial project setup to support building your application, such as:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Install your project dependencies&lt;/li&gt;
&lt;li&gt;Install Xcode and Android Studio&lt;/li&gt;
&lt;li&gt;Install the Java SDK&lt;/li&gt;
&lt;li&gt;Set up a Git repository for Android and iOS certificates - &lt;em&gt;explained later in the article&lt;/em&gt;
&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  fastlane folder structure
&lt;/h2&gt;

&lt;p&gt;In ideal cases, you would have an Android application project and an iOS application project, both hosted in the same code repository as your project:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight markdown"&gt;&lt;code&gt;projectFolder/
    app/
    scripts/
    ios/
        fastlane/
            ....
  android/
        fastlane/
            ....
    package.json
    .gitignore
  ...
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;In this case, you will want to initialise the fastlane folder within the respective &lt;code&gt;android&lt;/code&gt; and &lt;code&gt;ios&lt;/code&gt; project sub-directories.&lt;/p&gt;

&lt;h2&gt;
  
  
  Setup
&lt;/h2&gt;

&lt;ol&gt;
&lt;li&gt;Install fastlane 

&lt;ul&gt;
&lt;li&gt;via Homebrew (&lt;code&gt;brew install fastlane&lt;/code&gt;)&lt;/li&gt;
&lt;li&gt;via RubyGems (&lt;code&gt;sudo gem install fastlane&lt;/code&gt;)&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;Navigate your terminal to your respective &lt;code&gt;android&lt;/code&gt; and &lt;code&gt;ios&lt;/code&gt; project directory and run &lt;code&gt;fastlane init&lt;/code&gt;
&lt;/li&gt;
&lt;li&gt;You should have a folder structure that is similar to the one below:
&lt;/li&gt;
&lt;/ol&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;fastlane/
    Fastfile
    Appfile
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The most interesting file is &lt;code&gt;fastlane/Fastfile&lt;/code&gt;, which contains all the information that is needed to distribute your app.&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Inside your &lt;code&gt;Fastfile&lt;/code&gt;, you can start writing lanes:
&lt;/li&gt;
&lt;/ol&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight ruby"&gt;&lt;code&gt;&lt;span class="n"&gt;lane&lt;/span&gt; &lt;span class="ss"&gt;:my_lane&lt;/span&gt; &lt;span class="k"&gt;do&lt;/span&gt;
  &lt;span class="c1"&gt;# Whatever actions you like go in here.&lt;/span&gt;
&lt;span class="k"&gt;end&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;ol&gt;
&lt;li&gt;&lt;p&gt;You can start adding actions to your lanes. fastlane actions can be found &lt;a href="https://docs.fastlane.tools/actions/"&gt;here&lt;/a&gt;. &lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Copy the following file structure to your &lt;code&gt;Fastfile&lt;/code&gt;&lt;br&gt;
&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight ruby"&gt;&lt;code&gt;&lt;span class="n"&gt;default_platform&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="ss"&gt;:ios&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

&lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;beta&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;arg1&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;arg2&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
  &lt;span class="c1"&gt;# You may use Ruby functions to write custom actions for your app deployment&lt;/span&gt;
&lt;span class="k"&gt;end&lt;/span&gt;

&lt;span class="n"&gt;platform&lt;/span&gt; &lt;span class="ss"&gt;:ios&lt;/span&gt; &lt;span class="k"&gt;do&lt;/span&gt;

    &lt;span class="n"&gt;desc&lt;/span&gt; &lt;span class="s2"&gt;"Building the IPA file only"&lt;/span&gt;
      &lt;span class="n"&gt;lane&lt;/span&gt; &lt;span class="ss"&gt;:build_ios_app_ipa&lt;/span&gt; &lt;span class="k"&gt;do&lt;/span&gt;
        &lt;span class="n"&gt;app_identifier&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="s2"&gt;"com.appbundle.id"&lt;/span&gt;
        &lt;span class="n"&gt;beta&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s2"&gt;"AppSchemeName"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;app_identifier&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
      &lt;span class="k"&gt;end&lt;/span&gt;

&lt;span class="k"&gt;end&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;ol&gt;
&lt;li&gt;&lt;p&gt;&lt;code&gt;default_platform(:ios)&lt;/code&gt; - Initialise your Fastfile file with a default platform&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;code&gt;platform :ios do&lt;/code&gt; - Add all actions under a platform&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;Together, these two declarations inform fastlane that this particular Fastfile is purely for iOS operations. So instead of running the command &lt;code&gt;fastlane &amp;lt;lane_name&amp;gt;&lt;/code&gt;, you will actually run &lt;code&gt;fastlane ios &amp;lt;lane_name&amp;gt;&lt;/code&gt;. Anything parked under &lt;code&gt;platform :ios do&lt;/code&gt; will be executed when its lanes are invoked in your terminal.&lt;/p&gt;

&lt;p&gt;If you are well-versed in Ruby, you may write your own Ruby functions to help you write custom actions that you require for further flexibility, especially when it comes to building several apps with different environments across your organisation. &lt;/p&gt;

&lt;p&gt;fastlane will identify them as an action regardless due to the fact that fastlane is written in &lt;em&gt;Ruby&lt;/em&gt;.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;In this article, we will write the actions inside a Ruby function. This is so we can promote action re-usability across other lane deployments.&lt;/strong&gt;&lt;/p&gt;

&lt;h1&gt;
  
  
  Action Steps
&lt;/h1&gt;




&lt;p&gt;Before we start writing our functionality in the lanes, let's first list out our action steps:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;
&lt;strong&gt;Setup API Key for App Store Connect (&lt;code&gt;app_store_connect_api_key&lt;/code&gt;)&lt;/strong&gt; - This will allow fastlane to connect to your App Store account to perform other actions that require user authentication&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Setup CI (&lt;code&gt;setup_ci&lt;/code&gt;)&lt;/strong&gt; - This will set up a temporary keychain for the CI pipeline of your choice&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Create and sync provisioning profiles and certificates (&lt;code&gt;match&lt;/code&gt;)&lt;/strong&gt; - This will help us maintain our provisioning profiles and certificates across your teams&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Update code signing settings (&lt;code&gt;update_code_signing_settings&lt;/code&gt;)&lt;/strong&gt; - This is to update the code signing identities to match your profile name and app bundle identifier&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Increment your app build number (&lt;code&gt;increment_build_number&lt;/code&gt;)&lt;/strong&gt; - This will automate your application build number by retrieving the latest TestFlight build number and incrementing it&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Build the app (&lt;code&gt;build_app&lt;/code&gt;)&lt;/strong&gt; - This will build the app for us and generate a binary (IPA) file&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Upload your binary to TestFlight (&lt;code&gt;upload_to_testflight&lt;/code&gt;)&lt;/strong&gt; - This will automate the process of uploading the binary file to TestFlight and informing your testers accordingly&lt;/li&gt;
&lt;/ol&gt;

&lt;h2&gt;
  
  
  Steps
&lt;/h2&gt;

&lt;h3&gt;
  
  
  Step 1: Setup your App Store Connect API key
&lt;/h3&gt;

&lt;ol&gt;
&lt;li&gt;Visit the following &lt;a href="https://developer.apple.com/documentation/appstoreconnectapi/creating_api_keys_for_app_store_connect_api"&gt;page&lt;/a&gt;. It will provide you a step-by-step process in generating an API key&lt;/li&gt;
&lt;li&gt;Once you have generated a key, take note of:

&lt;ul&gt;
&lt;li&gt;Issuer ID&lt;/li&gt;
&lt;li&gt;Key ID&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;Download the generated API key - A &lt;code&gt;.p8&lt;/code&gt; file&lt;/li&gt;
&lt;li&gt;Store it in a secure location where you can easily access it. Avoid storing it in your application repository, as anyone in your organisation would then have access to the company's Apple account. 

&lt;ul&gt;
&lt;li&gt;In this article, we are storing them in another Git repository with limited read and write scopes to specific engineers within our organisation.&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;/ol&gt;

&lt;h3&gt;
  
  
  Step 2: Setup the fastlane script
&lt;/h3&gt;

&lt;ol&gt;
&lt;li&gt;
&lt;strong&gt;Setup your App Store API Key to generate a hash used for JWT authorization&lt;/strong&gt;
&lt;/li&gt;
&lt;/ol&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight ruby"&gt;&lt;code&gt;&lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;setup_api_key&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
  &lt;span class="n"&gt;sh&lt;/span&gt; &lt;span class="s2"&gt;"if [ -d &lt;/span&gt;&lt;span class="se"&gt;\"&lt;/span&gt;&lt;span class="s2"&gt;appstoreapi&lt;/span&gt;&lt;span class="se"&gt;\"&lt;/span&gt;&lt;span class="s2"&gt; ]; then echo 'Folder exist, executing next step'; else git clone &lt;/span&gt;&lt;span class="si"&gt;#{&lt;/span&gt;&lt;span class="no"&gt;ENV&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="s1"&gt;'APPSTORE_API_GIT_URL'&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="s2"&gt; appstoreapi; fi"&lt;/span&gt;
  &lt;span class="n"&gt;app_store_connect_api_key&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;
    &lt;span class="ss"&gt;key_id: &lt;/span&gt;&lt;span class="no"&gt;ENV&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="s1"&gt;'APPSTORE_KEY_ID'&lt;/span&gt;&lt;span class="p"&gt;],&lt;/span&gt;
    &lt;span class="ss"&gt;issuer_id: &lt;/span&gt;&lt;span class="no"&gt;ENV&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="s1"&gt;'APPSTORE_ISSUER_ID'&lt;/span&gt;&lt;span class="p"&gt;],&lt;/span&gt;
    &lt;span class="ss"&gt;key_filepath: &lt;/span&gt;&lt;span class="no"&gt;Dir&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;pwd&lt;/span&gt; &lt;span class="o"&gt;+&lt;/span&gt; &lt;span class="s2"&gt;"/appstoreapi/AuthKey_xxx.p8"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="k"&gt;end&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;In the snippet above, I am cloning a Git repository which contains my App Store Connect API key, followed by utilising the &lt;code&gt;app_store_connect_api_key&lt;/code&gt; action from fastlane. It takes in several parameters, but three are vital:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;code&gt;key_id&lt;/code&gt; - The Key ID from which you took note when you generated the API key&lt;/li&gt;
&lt;li&gt;
&lt;code&gt;issuer_id&lt;/code&gt; - The Issuer ID from which you took note when you generated the API key&lt;/li&gt;
&lt;li&gt;
&lt;code&gt;key_filepath&lt;/code&gt; - The file path to your &lt;code&gt;.p8&lt;/code&gt; file&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;This step will generate a hash that will be used to authenticate with App Store Connect using a JWT.&lt;/p&gt;

&lt;p&gt;I would highly recommend storing sensitive credentials and URLs in an environment variable (&lt;code&gt;.env&lt;/code&gt;) file in the same project folder - &lt;code&gt;fastlane/.env&lt;/code&gt; - and accessing them from there.&lt;/p&gt;

&lt;p&gt;Once you have prepared the file, you can easily access any Environment Variable by simply passing in &lt;code&gt;ENV['ENV_NAME']&lt;/code&gt; into the Fastfile.&lt;/p&gt;
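<p>For illustration, a minimal <code>fastlane/.env</code> file for this setup might look like the following (all values are placeholders):</p>

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;# fastlane/.env - placeholder values; never commit real secrets
APPSTORE_KEY_ID=&amp;lt;your_key_id&amp;gt;
APPSTORE_ISSUER_ID=&amp;lt;your_issuer_id&amp;gt;
APPSTORE_API_GIT_URL=&amp;lt;your_api_key_repo_url&amp;gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;fastlane loads this file automatically via dotenv when you run a lane from the same directory.&lt;/p&gt;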

&lt;p&gt;&lt;strong&gt;2. Define a Ruby function with 2 arguments&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight ruby"&gt;&lt;code&gt;&lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;beta&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;scheme&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;bundle_id&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="k"&gt;end&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Within this function, we specify the action steps we mentioned above&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight ruby"&gt;&lt;code&gt;&lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;beta&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;scheme&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;bundle_id&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="n"&gt;setup_api_key&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt; &lt;span class="c1"&gt;# Import the setup_api_key function&lt;/span&gt;
    &lt;span class="n"&gt;setup_ci&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt; &lt;span class="c1"&gt;# Uses fastlane action to create a temporary keychain access on a CI pipeline&lt;/span&gt;
&lt;span class="k"&gt;end&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;3. Utilise the &lt;code&gt;match&lt;/code&gt; action from fastlane&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;&lt;em&gt;&lt;a href="https://docs.fastlane.tools/actions/match/"&gt;match&lt;/a&gt;&lt;/em&gt;&lt;/strong&gt; is a fastlane action that allows you to easily sync your certificates and provisioning profiles. It takes a new approach to iOS and macOS code signing, where you share one code signing identity across your engineering team to simplify the setup and prevent code signing issues. The foundation of &lt;em&gt;match&lt;/em&gt; was built using the implementation of &lt;a href="https://codesigning.guide/"&gt;codesigning.guide concept&lt;/a&gt;. &lt;/p&gt;

&lt;p&gt;You can store your code signing identities in:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Git repository&lt;/li&gt;
&lt;li&gt;Google Cloud&lt;/li&gt;
&lt;li&gt;Amazon S3 bucket&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;In this article, we have chosen to store it in a Git repository. In that case, however, you need to provide basic authorization credentials so that match can access and clone your repository.&lt;/p&gt;

&lt;p&gt;Whichever Git provider you choose (GitHub, Bitbucket, GitLab or Azure DevOps), you would need to set up a Personal Access Token (PAT), which fastlane will use to clone the repository and sync your code signing identities.&lt;/p&gt;

&lt;p&gt;In your Fastfile, you will need to Base64-encode your PAT&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight ruby"&gt;&lt;code&gt;&lt;span class="n"&gt;authorization_token_str&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="no"&gt;ENV&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="s1"&gt;'GITHUB_TOKEN'&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;
&lt;span class="n"&gt;basic_authorization_token&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="no"&gt;Base64&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;strict_encode64&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;authorization_token_str&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The &lt;code&gt;basic_authorization_token&lt;/code&gt; variable will be used in setting up the match implementation below.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight ruby"&gt;&lt;code&gt;&lt;span class="n"&gt;match&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;
  &lt;span class="ss"&gt;git_url: &lt;/span&gt;&lt;span class="s2"&gt;"&amp;lt;git_url&amp;gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="ss"&gt;git_basic_authorization: &lt;/span&gt;&lt;span class="n"&gt;basic_authorization_token&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="ss"&gt;readonly: &lt;/span&gt;&lt;span class="kp"&gt;true&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="ss"&gt;type: &lt;/span&gt;&lt;span class="s2"&gt;"appstore"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="ss"&gt;app_identifier: &lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;
    &lt;span class="n"&gt;bundle_id&lt;/span&gt;
&lt;span class="p"&gt;])&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;From the snippet above, this is a very simple implementation. Take note of the &lt;code&gt;readonly&lt;/code&gt; parameter. It is crucial when creating and syncing your profiles and certs with your Apple Developer account. If you set &lt;code&gt;readonly&lt;/code&gt; to &lt;code&gt;false&lt;/code&gt;, match is allowed to automatically sync and create new profiles and certs, should it deem your existing ones expired or corrupted. Once it has provisioned the profiles, set &lt;code&gt;readonly&lt;/code&gt; back to &lt;code&gt;true&lt;/code&gt;. This avoids edge cases where it might accidentally create new certs and profiles.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;4. Update code signing identities&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;This step is mainly used to update the Xcode settings on a CI pipeline, due to Xcode's default behaviour of selecting a default cert and profile from the macOS agent.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight ruby"&gt;&lt;code&gt;&lt;span class="n"&gt;update_code_signing_settings&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;
  &lt;span class="ss"&gt;use_automatic_signing: &lt;/span&gt;&lt;span class="kp"&gt;false&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="ss"&gt;path: &lt;/span&gt;&lt;span class="s2"&gt;"../ios/AppName.xcodeproj"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="ss"&gt;code_sign_identity: &lt;/span&gt;&lt;span class="s2"&gt;"iPhone Distribution"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="ss"&gt;profile_name: &lt;/span&gt;&lt;span class="s2"&gt;"match AppStore com.appbundle.id"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="ss"&gt;bundle_identifier: &lt;/span&gt;&lt;span class="s2"&gt;"com.appbundle.id"&lt;/span&gt;
&lt;span class="p"&gt;)&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;From the above snippet, you can retrieve your &lt;code&gt;profile_name&lt;/code&gt; and &lt;code&gt;bundle_identifier&lt;/code&gt; from your Apple Developer account, or by taking note of them in the output of the match step once it has been executed.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;5. Increment Application build number&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;We simply increment the build number by retrieving your latest TestFlight build number and incrementing it by 1.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight ruby"&gt;&lt;code&gt;&lt;span class="n"&gt;increment_build_number&lt;/span&gt;&lt;span class="p"&gt;({&lt;/span&gt;
  &lt;span class="ss"&gt;build_number: &lt;/span&gt;&lt;span class="n"&gt;latest_testflight_build_number&lt;/span&gt; &lt;span class="o"&gt;+&lt;/span&gt; &lt;span class="mi"&gt;1&lt;/span&gt;
&lt;span class="p"&gt;})&lt;/span&gt;

&lt;span class="nb"&gt;puts&lt;/span&gt; &lt;span class="s2"&gt;"BUILD_NUMBER: &lt;/span&gt;&lt;span class="si"&gt;#{&lt;/span&gt;&lt;span class="n"&gt;lane_context&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="no"&gt;SharedValues&lt;/span&gt;&lt;span class="o"&gt;::&lt;/span&gt;&lt;span class="no"&gt;BUILD_NUMBER&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="s2"&gt;"&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The &lt;code&gt;puts&lt;/code&gt; command will print the build number, which is shared across the lane context, to the console.&lt;/p&gt;

&lt;p&gt;You may have other app build versioning models than the one mentioned in this article. Increment the build number however you see fit to your project needs.&lt;/p&gt;
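&lt;p&gt;For example, one common alternative is deriving the build number from the Git commit count using fastlane's built-in &lt;code&gt;number_of_commits&lt;/code&gt; action (a sketch, assuming your branch history makes the count monotonically increasing):&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight ruby"&gt;&lt;code&gt;# Alternative: tie the build number to the Git commit count
increment_build_number({
  build_number: number_of_commits
})
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;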

&lt;p&gt;&lt;strong&gt;6. Build the application&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;This step will involve building your application to generate a binary (IPA) file.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight ruby"&gt;&lt;code&gt;&lt;span class="n"&gt;build_app&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;
  &lt;span class="ss"&gt;workspace: &lt;/span&gt;&lt;span class="s2"&gt;"../ios/AppName.xcworkspace"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="ss"&gt;export_xcargs: &lt;/span&gt;&lt;span class="s2"&gt;"-allowProvisioningUpdates"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="ss"&gt;scheme: &lt;/span&gt;&lt;span class="n"&gt;scheme&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="ss"&gt;clean: &lt;/span&gt;&lt;span class="kp"&gt;true&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="ss"&gt;silent: &lt;/span&gt;&lt;span class="kp"&gt;true&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="ss"&gt;sdk: &lt;/span&gt;&lt;span class="s2"&gt;"iphoneos"&lt;/span&gt;
&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="nb"&gt;puts&lt;/span&gt; &lt;span class="s2"&gt;"IPA: &lt;/span&gt;&lt;span class="si"&gt;#{&lt;/span&gt;&lt;span class="n"&gt;lane_context&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="no"&gt;SharedValues&lt;/span&gt;&lt;span class="o"&gt;::&lt;/span&gt;&lt;span class="no"&gt;IPA_OUTPUT_PATH&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="s2"&gt;"&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The &lt;code&gt;build_app&lt;/code&gt; action is provided by fastlane.&lt;/p&gt;

&lt;p&gt;The &lt;code&gt;puts&lt;/code&gt; command will print out the file path of the IPA file, which you can use to manually upload to TestFlight.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;7.  Upload the binary file to Testflight&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;This step will involve uploading your binary file to your TestFlight account.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight ruby"&gt;&lt;code&gt;&lt;span class="n"&gt;app_identifier&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="s2"&gt;"com.appbunde.id"&lt;/span&gt;
&lt;span class="n"&gt;upload_to_testflight&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="ss"&gt;app_identifier: &lt;/span&gt;&lt;span class="n"&gt;app_identifier&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;code&gt;upload_to_testflight&lt;/code&gt; is an action provided by fastlane.&lt;/p&gt;

&lt;h3&gt;
  
  
  &lt;strong&gt;Putting them all together&lt;/strong&gt;
&lt;/h3&gt;




&lt;p&gt;The final lane should look something like this:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight ruby"&gt;&lt;code&gt;&lt;span class="n"&gt;desc&lt;/span&gt; &lt;span class="s2"&gt;"Build and push a new build to TestFlight"&lt;/span&gt;
  &lt;span class="n"&gt;lane&lt;/span&gt; &lt;span class="ss"&gt;:release_build&lt;/span&gt; &lt;span class="k"&gt;do&lt;/span&gt;
    &lt;span class="n"&gt;app_identifier&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="s2"&gt;"com.appbundle.id"&lt;/span&gt;
    &lt;span class="n"&gt;beta&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s2"&gt;"AppSchemeName"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;app_identifier&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="c1"&gt;# The custom function we wrote earlier&lt;/span&gt;
    &lt;span class="n"&gt;upload_to_testflight&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="ss"&gt;app_identifier: &lt;/span&gt;&lt;span class="n"&gt;app_identifier&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
  &lt;span class="k"&gt;end&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;When you execute &lt;code&gt;fastlane ios release_build&lt;/code&gt; in your terminal, it will run through the steps outlined above. &lt;/p&gt;

&lt;p&gt;As you can see, we utilised a Ruby function to group all common operations under one roof, so as to ensure:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Code re-usability&lt;/li&gt;
&lt;li&gt;Consistency&lt;/li&gt;
&lt;li&gt;Clean code&lt;/li&gt;
&lt;/ul&gt;

&lt;h1&gt;
  
  
  Integration with CI/CD tools
&lt;/h1&gt;

&lt;p&gt;With the fastlane scripts, you can easily integrate it with any CI/CD tools of your choosing:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;GitHub Actions&lt;/li&gt;
&lt;li&gt;Gitlab CI&lt;/li&gt;
&lt;li&gt;Azure DevOps&lt;/li&gt;
&lt;li&gt;Bitbucket Pipelines&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;When setting up your pre-requisite steps in your pipeline YAML file, you can simply include a bash script that executes your fastlane script:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight ruby"&gt;&lt;code&gt;&lt;span class="n"&gt;cd&lt;/span&gt; &lt;span class="n"&gt;ios&lt;/span&gt;&lt;span class="o"&gt;/&lt;/span&gt;&lt;span class="n"&gt;android&lt;/span&gt; &lt;span class="n"&gt;folder&lt;/span&gt;

&lt;span class="n"&gt;fastlane&lt;/span&gt; &lt;span class="n"&gt;ios&lt;/span&gt;&lt;span class="o"&gt;/&lt;/span&gt;&lt;span class="n"&gt;android&lt;/span&gt; &lt;span class="n"&gt;release_build&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The command above will execute your fastlane script, following the steps described earlier.&lt;/p&gt;
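&lt;p&gt;As a minimal sketch, a GitHub Actions workflow wrapping this script might look like the following (the workflow, job and secret names are all hypothetical):&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight yaml"&gt;&lt;code&gt;# .github/workflows/ios-release.yml - hypothetical names throughout
name: iOS Release

on:
  push:
    branches: [main]

jobs:
  release:
    runs-on: macos-latest
    steps:
      - uses: actions/checkout@v4
      - name: Install fastlane
        run: brew install fastlane
      - name: Build and upload to TestFlight
        env:
          APPSTORE_KEY_ID: ${{ secrets.APPSTORE_KEY_ID }}
          APPSTORE_ISSUER_ID: ${{ secrets.APPSTORE_ISSUER_ID }}
          GITHUB_TOKEN: ${{ secrets.MATCH_GIT_TOKEN }}
        run: |
          cd ios
          fastlane ios release_build
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;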

&lt;h1&gt;
  
  
  Conclusion
&lt;/h1&gt;

&lt;p&gt;fastlane is a powerful tool that helps streamline your build process across your organisation. It can be used alongside your existing CI/CD pipelines, or as a standalone pipeline script on your local machines or custom pipeline tools such as Buildkite. Running it standalone requires additional steps, such as cloning the repository and checking out branches; however, these are all readily available from fastlane as actions.&lt;/p&gt;

&lt;p&gt;Spend some time reading through the documentation as it contains &lt;a href="https://docs.fastlane.tools/actions/"&gt;a lot of actions&lt;/a&gt; out of the box that will prove useful for your engineering teams (&lt;a href="https://docs.fastlane.tools/actions/match/"&gt;match&lt;/a&gt;, &lt;a href="https://docs.fastlane.tools/actions/pilot/"&gt;pilot&lt;/a&gt;, &lt;a href="https://docs.fastlane.tools/actions/cert/"&gt;cert&lt;/a&gt;, &lt;a href="https://docs.fastlane.tools/actions/sigh/"&gt;sigh&lt;/a&gt;, &lt;a href="https://docs.fastlane.tools/actions/appium/"&gt;appium&lt;/a&gt;, &lt;a href="https://docs.fastlane.tools/actions/xctool/"&gt;xctool&lt;/a&gt;, &lt;a href="https://docs.fastlane.tools/actions/supply/"&gt;supply&lt;/a&gt;, and &lt;a href="https://docs.fastlane.tools/actions/"&gt;many more&lt;/a&gt;). Furthermore, they provide an &lt;a href="https://docs.fastlane.tools/plugins/available-plugins/"&gt;extensive list of plugins&lt;/a&gt; for third-party integrations such as Firebase, App Center, Yarn, Android Versioning, Dropbox, Slack Upload, S3, Bugsnag and many more. You can supercharge your fastlane scripts by coupling them with a CI/CD tool, such as GitHub Actions, Bitbucket Pipelines or Azure DevOps. The sky's the limit!&lt;/p&gt;

</description>
      <category>devops</category>
      <category>cicd</category>
      <category>tutorials</category>
      <category>productivity</category>
    </item>
    <item>
      <title>Part 2: Automating code quality scanning using Sonar Cloud and GitHub Actions</title>
      <dc:creator>Jeiman Jeya</dc:creator>
      <pubDate>Tue, 31 May 2022 01:46:11 +0000</pubDate>
      <link>https://community.ops.io/jei/part-2-automating-code-quality-scanning-using-sonar-cloud-and-github-actions-2322</link>
      <guid>https://community.ops.io/jei/part-2-automating-code-quality-scanning-using-sonar-cloud-and-github-actions-2322</guid>
<description>&lt;p&gt;&lt;a href="https://dev.to/jeiman/part-1-concepts-of-code-quality-in-sonar-cloud-318j"&gt;Part 1 of this series&lt;/a&gt; covered the fundamentals of code quality with respect to Sonar Cloud, although those are generic concepts that apply to any code quality tool. In this article, we will explain how to use Sonar Cloud to automate code quality scanning of GitHub repositories using GitHub Actions. The goal is to give you a good understanding of how Sonar Cloud performs code analysis, and how the integration features on GitHub (PR decorators, inline commenting, code quality overview widgets) will benefit your engineering teams in the long run.&lt;/p&gt;

&lt;p&gt;Some of the neat features I found that Sonar Cloud provides are:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Enhances your workflow with continuous code quality and code security&lt;/li&gt;
&lt;li&gt;Supports all major programming languages you can think of&lt;/li&gt;
&lt;li&gt;Provides a clear overview of your overall code health in your repository, pull requests and pipelines&lt;/li&gt;
&lt;li&gt;Works with all popular Git providers (GitHub, Bitbucket, Azure DevOps and GitLab)&lt;/li&gt;
&lt;li&gt;Works with all popular CI/CD tools (GitHub Actions, Bitbucket Pipelines, Azure Pipelines, GitLab CI/CD, CircleCI, TravisCI, etc)&lt;/li&gt;
&lt;li&gt;Bug, Vulnerability, and Code Smell detection&lt;/li&gt;
&lt;li&gt;Top-notch coding rules&lt;/li&gt;
&lt;li&gt;In-ALM pull request feedback&lt;/li&gt;
&lt;li&gt;Go/No Go Quality Gate checks&lt;/li&gt;
&lt;li&gt;Automatic analysis (Currently limited to GitHub repositories)&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Preliminaries
&lt;/h2&gt;

&lt;p&gt;You will need the following:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;A GitHub repository with a backend or frontend application&lt;/li&gt;
&lt;li&gt;Access to GitHub Actions&lt;/li&gt;
&lt;li&gt;A Sonar Cloud account&lt;/li&gt;
&lt;li&gt;A Sonar Cloud project&lt;/li&gt;
&lt;li&gt;Some basic knowledge on CI/CD&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Setup Process
&lt;/h2&gt;

&lt;h3&gt;
  
  
  1. Sign up and import repositories
&lt;/h3&gt;

&lt;ol&gt;
&lt;li&gt;Ensure you have a GitHub repository readily available. For this article, we will be using one of my Node.js backend applications.&lt;/li&gt;
&lt;li&gt;Sign up for a Sonar Cloud account and log in using your GitHub account for a seamless experience.&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;Once you have signed up, you will be redirected to this page:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://community.ops.io/images/E9sWeJ_f9qb-dk0VAGscK3hy_AtfrWf8rhdXeG3HgYY/w:880/mb:500000/ar:1/aHR0cHM6Ly9kZXYt/dG8tdXBsb2Fkcy5z/My5hbWF6b25hd3Mu/Y29tL3VwbG9hZHMv/YXJ0aWNsZXMvZTNv/dzZxbHd6eHVtczR4/OHhtYzEucG5n" class="article-body-image-wrapper"&gt;&lt;img src="https://community.ops.io/images/E9sWeJ_f9qb-dk0VAGscK3hy_AtfrWf8rhdXeG3HgYY/w:880/mb:500000/ar:1/aHR0cHM6Ly9kZXYt/dG8tdXBsb2Fkcy5z/My5hbWF6b25hd3Mu/Y29tL3VwbG9hZHMv/YXJ0aWNsZXMvZTNv/dzZxbHd6eHVtczR4/OHhtYzEucG5n" alt="SetupSonarCloud" width="880" height="445"&gt;&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;&lt;p&gt;From this page, you can import an organisation from GitHub.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Click on &lt;strong&gt;Import an organisation from GitHub&lt;/strong&gt;. &lt;/p&gt;&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;You will be redirected to your GitHub account, where it will prompt you to choose an organisation in which to install SonarCloud. In this article, we will be choosing my personal GitHub account.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://community.ops.io/images/GuwebbcuIRu1YKatMsuLV4olSgAXLrwiXev1hb8K0qY/w:880/mb:500000/ar:1/aHR0cHM6Ly9kZXYt/dG8tdXBsb2Fkcy5z/My5hbWF6b25hd3Mu/Y29tL3VwbG9hZHMv/YXJ0aWNsZXMvaDNp/bDI4OTg4dTltMDdl/OHBobnMucG5n" class="article-body-image-wrapper"&gt;&lt;img src="https://community.ops.io/images/GuwebbcuIRu1YKatMsuLV4olSgAXLrwiXev1hb8K0qY/w:880/mb:500000/ar:1/aHR0cHM6Ly9kZXYt/dG8tdXBsb2Fkcy5z/My5hbWF6b25hd3Mu/Y29tL3VwbG9hZHMv/YXJ0aWNsZXMvaDNp/bDI4OTg4dTltMDdl/OHBobnMucG5n" alt="Connect Sonar Cloud App with GitHub" width="596" height="909"&gt;&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;From there, you will be redirected back to Sonar Cloud, where you will create your Sonar Cloud organisation. Please note that at this point, you have already &lt;strong&gt;installed and connected the Sonar Cloud App on your GitHub account&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://community.ops.io/images/cMxa-vA5QZsQgHm2Elvq4AXZV4ZeMQ4bha8LRpw_ghw/w:880/mb:500000/ar:1/aHR0cHM6Ly9kZXYt/dG8tdXBsb2Fkcy5z/My5hbWF6b25hd3Mu/Y29tL3VwbG9hZHMv/YXJ0aWNsZXMvejF4/dmlwOTZhc2FvcGs0/bTU0MGkucG5n" class="article-body-image-wrapper"&gt;&lt;img src="https://community.ops.io/images/cMxa-vA5QZsQgHm2Elvq4AXZV4ZeMQ4bha8LRpw_ghw/w:880/mb:500000/ar:1/aHR0cHM6Ly9kZXYt/dG8tdXBsb2Fkcy5z/My5hbWF6b25hd3Mu/Y29tL3VwbG9hZHMv/YXJ0aWNsZXMvejF4/dmlwOTZhc2FvcGs0/bTU0MGkucG5n" alt="Setup project key" width="880" height="474"&gt;&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;From the image above, you can provide an organisation key that will be used later in GitHub Actions to send your code analysis metrics.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://community.ops.io/images/hb5PiksSIZ0xkZ38_mUJZtQWcN_-TlPIb3BgoUMK2FU/w:880/mb:500000/ar:1/aHR0cHM6Ly9kZXYt/dG8tdXBsb2Fkcy5z/My5hbWF6b25hd3Mu/Y29tL3VwbG9hZHMv/YXJ0aWNsZXMvNzhj/dXB4cGtpbXd1azd0/b2NmNGsucG5n" class="article-body-image-wrapper"&gt;&lt;img src="https://community.ops.io/images/hb5PiksSIZ0xkZ38_mUJZtQWcN_-TlPIb3BgoUMK2FU/w:880/mb:500000/ar:1/aHR0cHM6Ly9kZXYt/dG8tdXBsb2Fkcy5z/My5hbWF6b25hd3Mu/Y29tL3VwbG9hZHMv/YXJ0aWNsZXMvNzhj/dXB4cGtpbXd1azd0/b2NmNGsucG5n" alt="Choose pricing plan" width="880" height="436"&gt;&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Next, you would need to choose a Pricing plan. There are 2 options: &lt;strong&gt;Paid&lt;/strong&gt; and &lt;strong&gt;Free&lt;/strong&gt;. For this article, we will be choosing the &lt;strong&gt;Free&lt;/strong&gt; plan.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Click &lt;strong&gt;Create Organization&lt;/strong&gt;.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;You will now be able to import any &lt;strong&gt;Public repository&lt;/strong&gt;. If you are on the paid Sonar Cloud plan, you may also import your private repositories.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://community.ops.io/images/AjI1YR428K_UW9jixE0wF-1l3M3tAULmtVTW4URspgE/w:880/mb:500000/ar:1/aHR0cHM6Ly9kZXYt/dG8tdXBsb2Fkcy5z/My5hbWF6b25hd3Mu/Y29tL3VwbG9hZHMv/YXJ0aWNsZXMvMnd0/bWY2cWliMTB4b3pk/bG9kcTQucG5n" class="article-body-image-wrapper"&gt;&lt;img src="https://community.ops.io/images/AjI1YR428K_UW9jixE0wF-1l3M3tAULmtVTW4URspgE/w:880/mb:500000/ar:1/aHR0cHM6Ly9kZXYt/dG8tdXBsb2Fkcy5z/My5hbWF6b25hd3Mu/Y29tL3VwbG9hZHMv/YXJ0aWNsZXMvMnd0/bWY2cWliMTB4b3pk/bG9kcTQucG5n" alt="Choose your repos" width="880" height="499"&gt;&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;We have selected our public repository, named &lt;code&gt;nodejs-backend-starter&lt;/code&gt;. This repository contains a very basic Node.js backend application with Unit Testing and Code Coverage setup.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;NOTE: There's also an option to set up a monorepo. Based on the image above, you may see a small section at the bottom right of the image that mentions setting up a monorepo.&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Once you have selected your repositories, click &lt;strong&gt;Set Up&lt;/strong&gt;.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;It will run an Automatic Analysis of your project, providing an initial report.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Although Sonar Cloud offers automatic analysis for GitHub repositories only, we will be &lt;strong&gt;disabling this option&lt;/strong&gt; as we want our CI pipeline to handle the analysis process for us. With the CI pipeline, you may replicate this entire process for any Git provider and any CI/CD tool available.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;Navigate to Administration &amp;gt; Analysis Method &amp;gt; SonarCloud Automatic Analysis. Turn off this option.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://community.ops.io/images/9ql9kY_bi8Ti4b5jPUOPtS4SKlJvIBdbY6EHwoQUzp0/w:880/mb:500000/ar:1/aHR0cHM6Ly9kZXYt/dG8tdXBsb2Fkcy5z/My5hbWF6b25hd3Mu/Y29tL3VwbG9hZHMv/YXJ0aWNsZXMvcjUz/Zm5laW01d2RiNXN0/eGZmZ2YucG5n" class="article-body-image-wrapper"&gt;&lt;img src="https://community.ops.io/images/9ql9kY_bi8Ti4b5jPUOPtS4SKlJvIBdbY6EHwoQUzp0/w:880/mb:500000/ar:1/aHR0cHM6Ly9kZXYt/dG8tdXBsb2Fkcy5z/My5hbWF6b25hd3Mu/Y29tL3VwbG9hZHMv/YXJ0aWNsZXMvcjUz/Zm5laW01d2RiNXN0/eGZmZ2YucG5n" alt="Turn off automatic analysis" width="880" height="188"&gt;&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;Navigate back to your project root page. Your project page should look something like the image below:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://community.ops.io/images/P1lIWVoGsXW8tJHyRI3-YHiDVrHDTIU6ERmMA843PmE/w:880/mb:500000/ar:1/aHR0cHM6Ly9kZXYt/dG8tdXBsb2Fkcy5z/My5hbWF6b25hd3Mu/Y29tL3VwbG9hZHMv/YXJ0aWNsZXMvNGcy/b3F6cTZzbWt5eHJ4/OHo3b2MucG5n" class="article-body-image-wrapper"&gt;&lt;img src="https://community.ops.io/images/P1lIWVoGsXW8tJHyRI3-YHiDVrHDTIU6ERmMA843PmE/w:880/mb:500000/ar:1/aHR0cHM6Ly9kZXYt/dG8tdXBsb2Fkcy5z/My5hbWF6b25hd3Mu/Y29tL3VwbG9hZHMv/YXJ0aWNsZXMvNGcy/b3F6cTZzbWt5eHJ4/OHo3b2MucG5n" alt="Project page" width="880" height="761"&gt;&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;As you can see, it states that they can't display any Quality Gate without a New Code definition. Based on the &lt;a href="https://dev.to/jeiman/part-1-concepts-of-code-quality-in-sonar-cloud-318j"&gt;Part 1 article&lt;/a&gt;, we will be setting up the following:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;A New Code Definition&lt;/li&gt;
&lt;li&gt;A Quality Gate&lt;/li&gt;
&lt;/ol&gt;


&lt;/li&gt;
&lt;/ol&gt;

&lt;h3&gt;
  
  
  2. Sonar Cloud Configuration
&lt;/h3&gt;

&lt;h4&gt;
  
  
  2.1 Set Up New Code definition
&lt;/h4&gt;

&lt;p&gt;There are 2 options available at this moment. You can either set up a New Code definition on:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;A project level&lt;/li&gt;
&lt;li&gt;An organisation level&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;For this article, we will set it up on a project level.&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;
&lt;p&gt;On your project page, click on &lt;strong&gt;Set New Code definition&lt;/strong&gt;. If that is not available, you may navigate to &lt;strong&gt;Administration &amp;gt; New Code&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://community.ops.io/images/m2drj4C0AVfDb13Xgd1SddKYjnaQLv78-Ocn_GTJGoo/w:880/mb:500000/ar:1/aHR0cHM6Ly9kZXYt/dG8tdXBsb2Fkcy5z/My5hbWF6b25hd3Mu/Y29tL3VwbG9hZHMv/YXJ0aWNsZXMvNHp3/bm54anFoa2Fyam1q/Njh0MnMucG5n" class="article-body-image-wrapper"&gt;&lt;img src="https://community.ops.io/images/m2drj4C0AVfDb13Xgd1SddKYjnaQLv78-Ocn_GTJGoo/w:880/mb:500000/ar:1/aHR0cHM6Ly9kZXYt/dG8tdXBsb2Fkcy5z/My5hbWF6b25hd3Mu/Y29tL3VwbG9hZHMv/YXJ0aWNsZXMvNHp3/bm54anFoa2Fyam1q/Njh0MnMucG5n" alt="New Code Definitions" width="880" height="434"&gt;&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;&lt;p&gt;If you have read my previous article, you may know that there are a number of options to choose from. In this case, we will be choosing the &lt;strong&gt;Number of days&lt;/strong&gt; option, which defaults to &lt;strong&gt;30 days&lt;/strong&gt;, because we are not going to maintain a version for the project on Sonar Cloud for now.&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;h4&gt;
  
  
  2.2 Set Up a Quality Gate
&lt;/h4&gt;

&lt;ol&gt;
&lt;li&gt;Navigate to your organisation page (&lt;a href="https://sonarcloud.io/organizations/jeimanjeya/projects"&gt;https://sonarcloud.io/organizations/{username}/projects&lt;/a&gt;).&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;Select &lt;strong&gt;Quality Gates&lt;/strong&gt; from the sub-navigation.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://community.ops.io/images/-1bmsbbezfyanzXfde52TRWG5P85gkYzxJ0E8Zg6YIg/w:880/mb:500000/ar:1/aHR0cHM6Ly9kZXYt/dG8tdXBsb2Fkcy5z/My5hbWF6b25hd3Mu/Y29tL3VwbG9hZHMv/YXJ0aWNsZXMveXUy/ZDlmajZyendjeW12/bjh0Z3MucG5n" class="article-body-image-wrapper"&gt;&lt;img src="https://community.ops.io/images/-1bmsbbezfyanzXfde52TRWG5P85gkYzxJ0E8Zg6YIg/w:880/mb:500000/ar:1/aHR0cHM6Ly9kZXYt/dG8tdXBsb2Fkcy5z/My5hbWF6b25hd3Mu/Y29tL3VwbG9hZHMv/YXJ0aWNsZXMveXUy/ZDlmajZyendjeW12/bjh0Z3MucG5n" alt="Quality Gate Summary" width="880" height="392"&gt;&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;&lt;p&gt;From the image above, there's already a default Quality Gate set up for you. We're going to clone it to create our own. Click &lt;strong&gt;Copy&lt;/strong&gt; on the &lt;strong&gt;Sonar way&lt;/strong&gt; Quality Gate.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Name your new cloned Quality Gate.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;Since there are already New Code conditions, we are simply appending &lt;strong&gt;Overall Code conditions&lt;/strong&gt;. For this article, we have added the following:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Coverage is less than 80%&lt;/li&gt;
&lt;li&gt;Duplicated Lines is greater than 5%&lt;/li&gt;
&lt;li&gt;Maintainability Rating is worse than A&lt;/li&gt;
&lt;li&gt;Reliability Rating is worse than A&lt;/li&gt;
&lt;li&gt;Security Rating is worse than A&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;a href="https://community.ops.io/images/t4OSqzed_FYCAd7FTM89ik-yqJwdBM-K379WomPcOZQ/w:880/mb:500000/ar:1/aHR0cHM6Ly9kZXYt/dG8tdXBsb2Fkcy5z/My5hbWF6b25hd3Mu/Y29tL3VwbG9hZHMv/YXJ0aWNsZXMvMmsz/dXFqZGFuMm43MTR5/NmEzeXcucG5n" class="article-body-image-wrapper"&gt;&lt;img src="https://community.ops.io/images/t4OSqzed_FYCAd7FTM89ik-yqJwdBM-K379WomPcOZQ/w:880/mb:500000/ar:1/aHR0cHM6Ly9kZXYt/dG8tdXBsb2Fkcy5z/My5hbWF6b25hd3Mu/Y29tL3VwbG9hZHMv/YXJ0aWNsZXMvMmsz/dXFqZGFuMm43MTR5/NmEzeXcucG5n" alt="Final Quality Gate" width="880" height="542"&gt;&lt;/a&gt;&lt;/p&gt;


&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Set your new Quality Gate as &lt;strong&gt;Default&lt;/strong&gt;.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Now that we have our project settings in place, we need to set up a project properties file next. &lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;h4&gt;
  
  
  2.3 Set Up a project properties file
&lt;/h4&gt;

&lt;p&gt;This properties file will enable the CI pipeline to perform the necessary configuration and filtering in order to perform the code analysis. Furthermore, this will ensure we have the right settings in place when we push our commit to a branch or create a pull request that triggers the code analysis.&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;In your code repository, create a &lt;code&gt;sonar-project.properties&lt;/code&gt; file.
&lt;/li&gt;
&lt;/ol&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight yaml"&gt;&lt;code&gt;&lt;span class="s"&gt;sonar.exclusions=**/*.bin&lt;/span&gt;
&lt;span class="s"&gt;sonar.organization=jeimanjeya&lt;/span&gt;
&lt;span class="s"&gt;sonar.projectKey=jeiman_nodejs-backend-starter&lt;/span&gt;
&lt;span class="s"&gt;sonar.projectName=jeiman_nodejs-backend-starter&lt;/span&gt;
&lt;span class="s"&gt;sonar.projectVersion=1.0&lt;/span&gt;
&lt;span class="s"&gt;sonar.sourceEncoding=UTF-8&lt;/span&gt;
&lt;span class="s"&gt;sonar.sources=src&lt;/span&gt;
&lt;span class="s"&gt;sonar.exclusions=node_modules/**&lt;/span&gt;
&lt;span class="s"&gt;sonar.test.inclusions=test/**&lt;/span&gt;
&lt;span class="s"&gt;sonar.typescript.lcov.reportPaths=coverage/lcov.info&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;In this file, we're narrowing the scanner's focus: configuring the project source path, unit test inclusion path and coverage report path, and excluding redundant folders (node_modules). Note that this is a Java-style properties file, so a given key should only appear once; later duplicates override earlier ones.&lt;/p&gt;
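&lt;p&gt;For context, the &lt;code&gt;coverage/lcov.info&lt;/code&gt; path referenced above has to be produced by your test runner. A minimal sketch of what that could look like in &lt;code&gt;package.json&lt;/code&gt;, assuming a Jest-based setup (your runner and script names may differ):&lt;/p&gt;

```json
{
  "scripts": {
    "test": "jest --coverage"
  },
  "jest": {
    "coverageDirectory": "coverage",
    "coverageReporters": ["lcov", "text"]
  }
}
```

&lt;p&gt;With a setup like this, running the test script in CI writes the LCOV report where &lt;code&gt;sonar.typescript.lcov.reportPaths&lt;/code&gt; expects it.&lt;/p&gt;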

&lt;p&gt;Commit this file to your:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Root directory&lt;/strong&gt;: if it is a single-application repository (microrepo) architecture (the case for us)&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Monorepo directory&lt;/strong&gt;: if you have a monorepo architecture&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Now that we have set up our project configuration and settings, we can focus on configuring the CI pipeline on GitHub Actions.&lt;/p&gt;

&lt;h3&gt;
  
  
  3. GitHub Actions
&lt;/h3&gt;

&lt;h4&gt;
  
  
  3.1 Pipeline/Workflow Structure
&lt;/h4&gt;

&lt;p&gt;You have several options to choose from:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Store all of your workflows in a single YAML file - &lt;code&gt;ci.yml&lt;/code&gt;
&lt;/li&gt;
&lt;li&gt;Separate them into branch workflows:

&lt;ol&gt;
&lt;li&gt;
&lt;code&gt;ci_develop.yml&lt;/code&gt; - for PR triggers and branch triggers on &lt;code&gt;develop&lt;/code&gt; branch&lt;/li&gt;
&lt;li&gt;
&lt;code&gt;ci_master.yml&lt;/code&gt; - for PR triggers and branch triggers on &lt;code&gt;main&lt;/code&gt; branch&lt;/li&gt;
&lt;/ol&gt;


&lt;/li&gt;
&lt;li&gt;Separate them into trigger workflows:

&lt;ol&gt;
&lt;li&gt;PR triggers - &lt;code&gt;ci_pr.yml&lt;/code&gt;
&lt;/li&gt;
&lt;li&gt;Branch triggers - &lt;code&gt;ci_branch.yml&lt;/code&gt;
&lt;/li&gt;
&lt;/ol&gt;


&lt;/li&gt;
&lt;li&gt;Separate them into monorepo workflows:

&lt;ol&gt;
&lt;li&gt;
&lt;code&gt;spa-frontend.yml&lt;/code&gt; and &lt;code&gt;ms-backend.yml&lt;/code&gt; - contains PR and branch triggers for both &lt;code&gt;develop&lt;/code&gt; and &lt;code&gt;main&lt;/code&gt; branch &lt;/li&gt;
&lt;/ol&gt;


&lt;/li&gt;
&lt;/ol&gt;

&lt;ul&gt;
&lt;li&gt;Option 1 is difficult to maintain in the long run if you follow a monorepo architecture&lt;/li&gt;
&lt;li&gt;Option 2 or 3 is the best way of maintaining your workflows on GitHub&lt;/li&gt;
&lt;li&gt;Option 4 is useful for when you have a monorepo architecture&lt;/li&gt;
&lt;/ul&gt;
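&lt;p&gt;If you do go with Option 4, a monorepo workflow typically adds a &lt;code&gt;paths&lt;/code&gt; filter so each workflow only fires when its package changes. A hedged sketch, using this article's example directory names:&lt;/p&gt;

```yaml
# ms-backend.yml - runs only when files under ms-backend/ change
on:
  push:
    branches:
      - main
      - develop
    paths:
      - "ms-backend/**"
```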

&lt;p&gt;For this article, we are choosing Option 3 for simplicity.&lt;/p&gt;
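&lt;p&gt;Workflow files live under &lt;code&gt;.github/workflows&lt;/code&gt; in your repository. For Option 3, you can scaffold the two files like so (file names are the ones this article uses):&lt;/p&gt;

```shell
# Create the workflows directory and the two trigger-based workflow files
mkdir -p .github/workflows
touch .github/workflows/ci_branch.yml .github/workflows/ci_pr.yml

# Verify the layout
ls .github/workflows
```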

&lt;ol&gt;
&lt;li&gt;Your final GitHub workflow for branch analysis will look like this:
&lt;/li&gt;
&lt;/ol&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight yaml"&gt;&lt;code&gt;&lt;span class="na"&gt;name&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;Sonar Cloud - Branch Analysis&lt;/span&gt;

&lt;span class="c1"&gt;# Controls when the action will run. Triggers the workflow on push&lt;/span&gt;
&lt;span class="c1"&gt;# events but only for the main and release-* branch&lt;/span&gt;
&lt;span class="na"&gt;on&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
  &lt;span class="na"&gt;push&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
    &lt;span class="na"&gt;branches&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; 
      &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="s"&gt;main&lt;/span&gt;
      &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="s"&gt;release-*&lt;/span&gt;

&lt;span class="c1"&gt;# A workflow run is made up of one or more jobs that can run sequentially or in parallel&lt;/span&gt;
&lt;span class="na"&gt;jobs&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
  &lt;span class="na"&gt;sonarcloud&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
    &lt;span class="na"&gt;name&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;Build (Sonar Cloud)&lt;/span&gt;
    &lt;span class="na"&gt;runs-on&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;ubuntu-latest&lt;/span&gt;
    &lt;span class="na"&gt;steps&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
    &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;uses&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;actions/checkout@v2&lt;/span&gt;
      &lt;span class="na"&gt;with&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
        &lt;span class="c1"&gt;# Disabling shallow clone is recommended for improving relevancy of reporting&lt;/span&gt;
        &lt;span class="na"&gt;fetch-depth&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="m"&gt;0&lt;/span&gt;
    &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;uses&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;actions/setup-node@v2&lt;/span&gt;
      &lt;span class="na"&gt;with&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
        &lt;span class="na"&gt;node-version&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;15'&lt;/span&gt;

    &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;name&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;Node install dependencies&lt;/span&gt;
      &lt;span class="na"&gt;run&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;npm install&lt;/span&gt;

    &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;name&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;Run unit tests&lt;/span&gt;
      &lt;span class="na"&gt;run&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;npm run test&lt;/span&gt;

    &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;name&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;SonarCloud Scan&lt;/span&gt;
      &lt;span class="na"&gt;uses&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;sonarsource/sonarcloud-github-action@master&lt;/span&gt;
      &lt;span class="na"&gt;env&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
        &lt;span class="na"&gt;GITHUB_TOKEN&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;${{ secrets.GITHUB_TOKEN }}&lt;/span&gt;
        &lt;span class="na"&gt;SONAR_TOKEN&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;${{ secrets.SONAR_TOKEN }}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;ol&gt;
&lt;li&gt;The workflow includes an action from the GitHub Actions marketplace: the SonarCloud Scan action. &lt;/li&gt;
&lt;li&gt;In order for the SonarCloud scanner to authenticate and upload the analysis reports and metrics, you will need to store the &lt;code&gt;SONAR_TOKEN&lt;/code&gt; secret in your GitHub repository.&lt;/li&gt;
&lt;li&gt;Navigate to your project settings and retrieve the project token: &lt;strong&gt;Project &amp;gt; Administration &amp;gt; Analysis Method &amp;gt; Analyze with a GitHub Action&lt;/strong&gt;. It will present you with a token.&lt;/li&gt;
&lt;li&gt;Navigate back to your repository settings on GitHub: &lt;strong&gt;Repository Settings &amp;gt; Secrets &amp;gt; New repository secret&lt;/strong&gt;. Name it as per the workflow secret name.&lt;/li&gt;
&lt;li&gt;As for the &lt;code&gt;GITHUB_TOKEN&lt;/code&gt;, GitHub automatically provides this token to each workflow run.&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;Commit your code into the &lt;code&gt;main&lt;/code&gt; branch and watch the pipeline run.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://community.ops.io/images/A9hKFIjL13NZ2U9CEo56GPfv-SJAhXt6iLkrsPMKhRA/w:880/mb:500000/ar:1/aHR0cHM6Ly9kZXYt/dG8tdXBsb2Fkcy5z/My5hbWF6b25hd3Mu/Y29tL3VwbG9hZHMv/YXJ0aWNsZXMvNHZh/cTZsNWhxdzQyMmVv/ZnNieHgucG5n" class="article-body-image-wrapper"&gt;&lt;img src="https://community.ops.io/images/A9hKFIjL13NZ2U9CEo56GPfv-SJAhXt6iLkrsPMKhRA/w:880/mb:500000/ar:1/aHR0cHM6Ly9kZXYt/dG8tdXBsb2Fkcy5z/My5hbWF6b25hd3Mu/Y29tL3VwbG9hZHMv/YXJ0aWNsZXMvNHZh/cTZsNWhxdzQyMmVv/ZnNieHgucG5n" alt="Pipeline logs" width="880" height="265"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://community.ops.io/images/oBk0RqF7qbXU4kE7lULJnHpCVFEiYtj8lGn5ih1oPI8/w:880/mb:500000/ar:1/aHR0cHM6Ly9kZXYt/dG8tdXBsb2Fkcy5z/My5hbWF6b25hd3Mu/Y29tL3VwbG9hZHMv/YXJ0aWNsZXMvM3F1/cXZubXJ1d2YyaWV1/ODQ2a3kucG5n" class="article-body-image-wrapper"&gt;&lt;img src="https://community.ops.io/images/oBk0RqF7qbXU4kE7lULJnHpCVFEiYtj8lGn5ih1oPI8/w:880/mb:500000/ar:1/aHR0cHM6Ly9kZXYt/dG8tdXBsb2Fkcy5z/My5hbWF6b25hd3Mu/Y29tL3VwbG9hZHMv/YXJ0aWNsZXMvM3F1/cXZubXJ1d2YyaWV1/ODQ2a3kucG5n" alt="Pipeline summary" width="880" height="429"&gt;&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;&lt;p&gt;The &lt;a href="https://github.com/jeiman/nodejs-backend-starter/runs/3281611957?check_suite_focus=true"&gt;summary for the branch analysis pipeline&lt;/a&gt; has indicated that the analysis was successful.&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;Navigating back to Sonar Cloud, you will notice that your project branch analysis report has been updated.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://community.ops.io/images/NqmaNaaSUHI7GxeCzJVA_bbFx5hhS7v3lMmycZedzF4/w:880/mb:500000/ar:1/aHR0cHM6Ly9kZXYt/dG8tdXBsb2Fkcy5z/My5hbWF6b25hd3Mu/Y29tL3VwbG9hZHMv/YXJ0aWNsZXMveWhp/dTVna3Q5YWQyczZ5/dWVhMDQucG5n" class="article-body-image-wrapper"&gt;&lt;img src="https://community.ops.io/images/NqmaNaaSUHI7GxeCzJVA_bbFx5hhS7v3lMmycZedzF4/w:880/mb:500000/ar:1/aHR0cHM6Ly9kZXYt/dG8tdXBsb2Fkcy5z/My5hbWF6b25hd3Mu/Y29tL3VwbG9hZHMv/YXJ0aWNsZXMveWhp/dTVna3Q5YWQyczZ5/dWVhMDQucG5n" alt="QG Check on branch analysis" width="880" height="914"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;We'll move on to running code analysis on a long-living branch next.&lt;/p&gt;

&lt;h4&gt;
  
  
  3.2 Running code analysis on a long-living branch
&lt;/h4&gt;

&lt;ol&gt;
&lt;li&gt;Simply check out a new branch from &lt;code&gt;main&lt;/code&gt; and name it &lt;code&gt;release-{anyname}&lt;/code&gt;
&lt;/li&gt;
&lt;li&gt;Commit your code and the CI pipeline will pick it up based on the branch trigger logic in the workflow YAML file.&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;The SC Scanner will upload your analysis reports to your project on Sonar Cloud.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://community.ops.io/images/chIDgHUkK2s1nHXieg9X3UB8XduacwKm77u9B8n4Xyo/w:880/mb:500000/ar:1/aHR0cHM6Ly9kZXYt/dG8tdXBsb2Fkcy5z/My5hbWF6b25hd3Mu/Y29tL3VwbG9hZHMv/YXJ0aWNsZXMvNWxs/aTNuaXNyNTBmOXFm/ejJpbmcucG5n" class="article-body-image-wrapper"&gt;&lt;img src="https://community.ops.io/images/chIDgHUkK2s1nHXieg9X3UB8XduacwKm77u9B8n4Xyo/w:880/mb:500000/ar:1/aHR0cHM6Ly9kZXYt/dG8tdXBsb2Fkcy5z/My5hbWF6b25hd3Mu/Y29tL3VwbG9hZHMv/YXJ0aWNsZXMvNWxs/aTNuaXNyNTBmOXFm/ejJpbmcucG5n" alt="Not computed" width="735" height="225"&gt;&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;Don't be alarmed if it shows &lt;strong&gt;Not computed&lt;/strong&gt;, as Sonar Cloud requires a second analysis before it can show a Quality Gate on that long-living branch. It mentions &lt;strong&gt;Next scan will generate a Quality Gate&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://community.ops.io/images/_EE_fxCmpS9lXZwXXDducuDFNW3fCy4iu5AM5juLWFg/w:880/mb:500000/ar:1/aHR0cHM6Ly9kZXYt/dG8tdXBsb2Fkcy5z/My5hbWF6b25hd3Mu/Y29tL3VwbG9hZHMv/YXJ0aWNsZXMvNnM1/Z2NzMDNjb2xmdGly/MG94bTgucG5n" class="article-body-image-wrapper"&gt;&lt;img src="https://community.ops.io/images/_EE_fxCmpS9lXZwXXDducuDFNW3fCy4iu5AM5juLWFg/w:880/mb:500000/ar:1/aHR0cHM6Ly9kZXYt/dG8tdXBsb2Fkcy5z/My5hbWF6b25hd3Mu/Y29tL3VwbG9hZHMv/YXJ0aWNsZXMvNnM1/Z2NzMDNjb2xmdGly/MG94bTgucG5n" alt="Not computed 2" width="635" height="157"&gt;&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;&lt;p&gt;We pushed another commit to the long-living branch to generate the Quality Gate.&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;
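&lt;p&gt;The steps above boil down to ordinary Git commands. A minimal sketch in a throwaway repository (in your real repository you would skip the init and empty commit, and it is the &lt;code&gt;git push&lt;/code&gt; that actually triggers the workflow):&lt;/p&gt;

```shell
set -e
# Throwaway repo purely for demonstration
tmp=$(mktemp -d) && cd "$tmp"
git init -q
git config user.email "demo@example.com"
git config user.name "demo"
git commit -q --allow-empty -m "initial commit"

# Check out a long-living branch whose name matches the release-* trigger
git checkout -q -b release-1.0
git rev-parse --abbrev-ref HEAD   # prints: release-1.0
```

&lt;p&gt;Once pushed, the &lt;code&gt;release-*&lt;/code&gt; pattern in the workflow's branch filter picks the branch up automatically.&lt;/p&gt;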

&lt;p&gt;Your final Quality Gate for branch analysis (on the long-living branch) will produce the following result.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://community.ops.io/images/X5csXeR8f2PU5ms8OG321ikwUbB2Aq2Zo_mzUQbLJXw/w:880/mb:500000/ar:1/aHR0cHM6Ly9kZXYt/dG8tdXBsb2Fkcy5z/My5hbWF6b25hd3Mu/Y29tL3VwbG9hZHMv/YXJ0aWNsZXMvdjh0/bWNrZHltZ3ZqcWd3/c2l2NWoucG5n" class="article-body-image-wrapper"&gt;&lt;img src="https://community.ops.io/images/X5csXeR8f2PU5ms8OG321ikwUbB2Aq2Zo_mzUQbLJXw/w:880/mb:500000/ar:1/aHR0cHM6Ly9kZXYt/dG8tdXBsb2Fkcy5z/My5hbWF6b25hd3Mu/Y29tL3VwbG9hZHMv/YXJ0aWNsZXMvdjh0/bWNrZHltZ3ZqcWd3/c2l2NWoucG5n" alt="Branch analysis on long-living branch" width="880" height="677"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h4&gt;
  
  
  3.3 Running code analysis on pull requests (PR)
&lt;/h4&gt;

&lt;ol&gt;
&lt;li&gt;Clone the &lt;code&gt;ci_branch.yml&lt;/code&gt;. Name it &lt;code&gt;ci_pr.yml&lt;/code&gt;
&lt;/li&gt;
&lt;li&gt;The only section we are changing is the event trigger. Instead of a &lt;code&gt;push&lt;/code&gt; event, we are using the &lt;code&gt;pull_request&lt;/code&gt; event.
&lt;/li&gt;
&lt;/ol&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight yaml"&gt;&lt;code&gt;&lt;span class="na"&gt;on&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
  &lt;span class="na"&gt;pull_request&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
    &lt;span class="na"&gt;branches&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
      &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="s"&gt;main&lt;/span&gt;
      &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="s"&gt;release-*&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;ol&gt;
&lt;li&gt;Commit the code and raise a PR in your code repository.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;Your PR page should show pipeline status checks, with SonarCloud Code Analysis appearing in the merge section. Furthermore, your PR will contain decorations provided by the SonarCloud bot.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://community.ops.io/images/nUHjy2Vrgd36h2O1otaydaqyBqobWX133fClS4r_WbY/w:880/mb:500000/ar:1/aHR0cHM6Ly9kZXYt/dG8tdXBsb2Fkcy5z/My5hbWF6b25hd3Mu/Y29tL3VwbG9hZHMv/YXJ0aWNsZXMvcTNo/cHZlazZzbDl3NmEz/MmI2bnQucG5n" class="article-body-image-wrapper"&gt;&lt;img src="https://community.ops.io/images/nUHjy2Vrgd36h2O1otaydaqyBqobWX133fClS4r_WbY/w:880/mb:500000/ar:1/aHR0cHM6Ly9kZXYt/dG8tdXBsb2Fkcy5z/My5hbWF6b25hd3Mu/Y29tL3VwbG9hZHMv/YXJ0aWNsZXMvcTNo/cHZlazZzbDl3NmEz/MmI2bnQucG5n" alt="PR summary report on GitHub" width="880" height="1100"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Your &lt;a href="https://github.com/jeiman/nodejs-backend-starter/pull/1/checks?check_run_id=3281994837"&gt;PR status checks details page&lt;/a&gt; should give you a better overview of what needs fixing.&lt;/p&gt;

&lt;p&gt;If you navigate back to your Sonar Cloud project and choose your PR analysis report from the branch dropdown menu, you will see similar statistics that were displayed on GitHub.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://community.ops.io/images/w2LWYSlvnIQorUVNyRVm33nbVuOvkecyfBGW7M4KmRc/w:880/mb:500000/ar:1/aHR0cHM6Ly9kZXYt/dG8tdXBsb2Fkcy5z/My5hbWF6b25hd3Mu/Y29tL3VwbG9hZHMv/YXJ0aWNsZXMvaDdx/YWYyNDk2Z28zaWlm/b290aWwucG5n" class="article-body-image-wrapper"&gt;&lt;img src="https://community.ops.io/images/w2LWYSlvnIQorUVNyRVm33nbVuOvkecyfBGW7M4KmRc/w:880/mb:500000/ar:1/aHR0cHM6Ly9kZXYt/dG8tdXBsb2Fkcy5z/My5hbWF6b25hd3Mu/Y29tL3VwbG9hZHMv/YXJ0aWNsZXMvaDdx/YWYyNDk2Z28zaWlm/b290aWwucG5n" alt="PR analysis on Sonar Cloud" width="880" height="442"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Let's go ahead and fix our code to ensure we don't have any code smells, and fine-tune our Quality Gate conditions so the gate reaches a Passed state.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://community.ops.io/images/-ZMHUiXxUz2PEEu8f8bMlYRLVdOnzUx509UXp8E3Ej4/w:880/mb:500000/ar:1/aHR0cHM6Ly9kZXYt/dG8tdXBsb2Fkcy5z/My5hbWF6b25hd3Mu/Y29tL3VwbG9hZHMv/YXJ0aWNsZXMva2p5/cWRxOWdraG5wa3Nt/MHNtbWYucG5n" class="article-body-image-wrapper"&gt;&lt;img src="https://community.ops.io/images/-ZMHUiXxUz2PEEu8f8bMlYRLVdOnzUx509UXp8E3Ej4/w:880/mb:500000/ar:1/aHR0cHM6Ly9kZXYt/dG8tdXBsb2Fkcy5z/My5hbWF6b25hd3Mu/Y29tL3VwbG9hZHMv/YXJ0aWNsZXMva2p5/cWRxOWdraG5wa3Nt/MHNtbWYucG5n" alt="Passed state" width="880" height="520"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Now that our PR has clean code (although the overall code still requires fixes), we can merge it. The pipeline will run another branch analysis on the &lt;code&gt;main&lt;/code&gt; branch, which is our default branch on Sonar Cloud.&lt;/p&gt;

&lt;h2&gt;
  
  
  Piecing it all together
&lt;/h2&gt;

&lt;p&gt;We now have a Sonar Cloud project connected to a GitHub repository, with GitHub Actions workflows in place to automate code analysis on branch and PR triggers. Once these analyses complete on the pipeline, you can navigate back to your Sonar Cloud project and view branch and PR analyses on both your main branch and long-living branches.&lt;/p&gt;

&lt;p&gt;One of the advantages of integrating GitHub with Sonar Cloud is that engineering teams benefit from decorations appearing on their pull requests and repository. This speeds up developers' work in fixing bugs, code smells, and vulnerabilities, and helps avoid accruing further technical debt in their projects.&lt;/p&gt;

&lt;h2&gt;
  
  
  Tips
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;You can use any CI/CD tool to perform your code analysis; it does not need to be GitHub Actions just because your repository resides on GitHub. For consistency, however, you may prefer to keep everything under one umbrella.&lt;/li&gt;
&lt;li&gt;To have a complete end-to-end experience for your code quality scans on your pull requests, ensure you connect and import repositories from a single Git organisation; the features described above largely depend on this. Try not to mix and match repositories from different organisations (even though it's possible to do so).

&lt;ul&gt;
&lt;li&gt;1 GitHub organisation = 1 Sonar Cloud subscription&lt;/li&gt;
&lt;li&gt;1 Azure DevOps organisation = 1 Sonar Cloud subscription&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;
&lt;li&gt;Create long-living branches on Sonar Cloud if you follow the Git flow technique of maintaining &lt;code&gt;develop&lt;/code&gt; and &lt;code&gt;master&lt;/code&gt; branches. You may read their &lt;a href="https://community.sonarsource.com/t/how-to-create-long-living-branches-on-sonarcloud/11386"&gt;community post&lt;/a&gt; on how to achieve this. This ensures that branch analysis is performed on them alongside your pull request analysis.&lt;/li&gt;
&lt;li&gt;Fine-tune your Quality Gate to ensure you get the best experience from it, matching your engineering team's needs.&lt;/li&gt;
&lt;li&gt;Sonar Cloud supports monorepo projects!&lt;/li&gt;
&lt;/ul&gt;
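&lt;p&gt;As a sketch of what the GitHub Actions route can look like, here is a minimal workflow using SonarSource's official action. The trigger branches, filename, and secret names are illustrative; adjust them to your repository:&lt;/p&gt;

```yaml
# .github/workflows/sonarcloud.yml -- illustrative sketch, not a drop-in file
name: SonarCloud analysis
on:
  push:
    branches: [main]
  pull_request:
    types: [opened, synchronize, reopened]
jobs:
  sonarcloud:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
        with:
          fetch-depth: 0  # fetch full history for more accurate analysis
      - uses: SonarSource/sonarcloud-github-action@master
        env:
          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}  # enables PR decoration
          SONAR_TOKEN: ${{ secrets.SONAR_TOKEN }}    # generated in Sonar Cloud
```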

&lt;p&gt;I hope this two-part series has been helpful for your engineering needs. Sonar Cloud is a powerful code analysis tool; do explore its documentation to find out more, or reach out to me for any assistance.&lt;/p&gt;

&lt;h2&gt;
  
  
  Demo links to this article
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;&lt;a href="https://sonarcloud.io/dashboard?id=jeiman_nodejs-backend-starter"&gt;Sonar Cloud project&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://github.com/jeiman/nodejs-backend-starter"&gt;GitHub repository&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://github.com/jeiman/nodejs-backend-starter/actions"&gt;GitHub Actions&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://sonarcloud.io/dashboard?branch=release-testsonarcloud&amp;amp;id=jeiman_nodejs-backend-starter"&gt;Long-living branch analysis&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  References
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;&lt;a href="https://docs.microsoft.com/en-us/power-platform/alm/overview-alm"&gt;https://docs.microsoft.com/en-us/power-platform/alm/overview-alm&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://sonarcloud.io/github"&gt;https://sonarcloud.io/github&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;

</description>
      <category>devops</category>
      <category>cicd</category>
      <category>codequality</category>
    </item>
    <item>
      <title>Part 1: Concepts of Code Quality in Sonar Cloud</title>
      <dc:creator>Jeiman Jeya</dc:creator>
      <pubDate>Tue, 31 May 2022 01:37:13 +0000</pubDate>
      <link>https://community.ops.io/jei/part-1-concepts-of-code-quality-in-sonar-cloud-31l2</link>
      <guid>https://community.ops.io/jei/part-1-concepts-of-code-quality-in-sonar-cloud-31l2</guid>
      <description>&lt;p&gt;&lt;a href="https://community.ops.io/images/s98092Vq0lviksNj6jAceNKzl6vtHANtt3URaEUI97k/w:880/mb:500000/ar:1/aHR0cHM6Ly9kZXYt/dG8tdXBsb2Fkcy5z/My5hbWF6b25hd3Mu/Y29tL3VwbG9hZHMv/YXJ0aWNsZXMvaGxl/dWI5ODB3a2JkdjZz/c3dzMjMucG5n" class="article-body-image-wrapper"&gt;&lt;img src="https://community.ops.io/images/s98092Vq0lviksNj6jAceNKzl6vtHANtt3URaEUI97k/w:880/mb:500000/ar:1/aHR0cHM6Ly9kZXYt/dG8tdXBsb2Fkcy5z/My5hbWF6b25hd3Mu/Y29tL3VwbG9hZHMv/YXJ0aWNsZXMvaGxl/dWI5ODB3a2JkdjZz/c3dzMjMucG5n" alt="Placeholder image" width="880" height="569"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;In engineering teams and tribes, we often find ourselves stuck in the dilemma of choosing a suitable and efficient code analysis tool: one that provides all of the essential features and metrics to analyse codebases with best design practices in mind. Teams want to prevent code problems from being merged by detecting code smells, bugs, and vulnerabilities sooner. Furthermore, they want fast and accurate feedback on their pull requests and code merges in their repositories.&lt;/p&gt;

&lt;p&gt;The primary goals of software quality engineering are:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Process control and oversight&lt;/li&gt;
&lt;li&gt;Implementing standards and metrics&lt;/li&gt;
&lt;li&gt;Data collection and analysis&lt;/li&gt;
&lt;li&gt;Test development&lt;/li&gt;
&lt;li&gt;Identification of issues and solutions&lt;/li&gt;
&lt;li&gt;Follow-up to ensure corrective actions&lt;/li&gt;
&lt;/ul&gt;

&lt;h1&gt;
  
  
  Overview
&lt;/h1&gt;

&lt;p&gt;Over the years, I have worked with many code analysis tools, such as Codacy, Code Climate, DeepScan, and Sonar Cloud, to help developers in various teams. After spending a considerable amount of time experimenting with and setting up Sonar Cloud projects, I realised it stands out from the crowd. It has a comprehensive analysis engine that offers features encompassing all of the aforementioned software quality engineering goals. Some of the features that fascinated me and my team were:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Pull Request decorators

&lt;ul&gt;
&lt;li&gt;Inline commenting through report annotations&lt;/li&gt;
&lt;li&gt;Pull request widgets: Provides overall code quality health on your Pull requests&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;
&lt;li&gt;Repository widgets: Provides overall code quality health on your project&lt;/li&gt;
&lt;li&gt;Scanning Old code vs New code

&lt;ul&gt;
&lt;li&gt;Code Coverage&lt;/li&gt;
&lt;li&gt;Code Duplications&lt;/li&gt;
&lt;li&gt;Maintainability Ratings&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;
&lt;li&gt;Defining custom Quality Gate checks for different projects&lt;/li&gt;
&lt;li&gt;Defining New Code definitions for different projects&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Some of the aforementioned features (Quality Gate, Quality Profile, New Code Definitions) are in fact settings that need to be configured in Sonar Cloud. Pay attention to configuring these settings; otherwise, you will not reap the full benefits of Sonar Cloud.&lt;/p&gt;

&lt;p&gt;We are going to explore the Sonar Cloud ecosystem and how all of its core features come together to provide you a comprehensive code analysis experience.&lt;/p&gt;

&lt;h1&gt;
  
  
  Sonar Cloud Ecosystem
&lt;/h1&gt;

&lt;h2&gt;
  
  
  Projects
&lt;/h2&gt;

&lt;p&gt;In Sonar Cloud, a single repository corresponds to a single project. It is how they maintain a unique set of code quality data and metrics for each repository you have. &lt;/p&gt;

&lt;h3&gt;
  
  
  Monorepo Support
&lt;/h3&gt;

&lt;p&gt;Sonar Cloud does support monorepo projects. You can create multiple projects, each corresponding to a separate sub-project within the monorepo, all bound to the same repository. This allows you to:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Configure one Quality Gate per project&lt;/li&gt;
&lt;li&gt;Receive multiple Quality Gate results&lt;/li&gt;
&lt;li&gt;Read project-labeled messages from SonarCloud&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Each monorepo project must have a unique project key in Sonar Cloud, which your CI tool uses to uniquely identify each project.&lt;/p&gt;

&lt;p&gt;The standard practice is to have the following naming convention: &lt;code&gt;{organisationName-project-monorepoName}&lt;/code&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight yaml"&gt;&lt;code&gt;&lt;span class="na"&gt;Project 1&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;sampleorg-domain-frontend&lt;/span&gt;
&lt;span class="na"&gt;Project 2&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;sampleorg-domain-backend&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This is useful if your organisation maintains a large number of monorepo projects across various engineering tribes, so the projects can be easily identified in Sonar Cloud. However, you can follow any naming convention that suits your needs.&lt;/p&gt;
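&lt;p&gt;As an illustrative sketch, each sub-project in the monorepo would then carry its own scanner configuration with its unique project key (the organisation name and paths below are made-up examples):&lt;/p&gt;

```
# frontend/sonar-project.properties -- illustrative values only
sonar.projectKey=sampleorg-domain-frontend
sonar.organization=sampleorg
sonar.sources=src
```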

&lt;h2&gt;
  
  
  New Code Definitions
&lt;/h2&gt;

&lt;p&gt;Sonar Cloud follows the concept of &lt;a href="https://sonarcloud.io/documentation/improving/clean-as-you-code/"&gt;Clean As You Code&lt;/a&gt;. The core idea is that you focus your attention and effort on new code. As you work on features and improvements, &lt;strong&gt;SonarCloud analyses your code on each new commit&lt;/strong&gt; and alerts you to any code quality problems and vulnerabilities. This allows you to address the issues right away and ensure that all new code added to the project is always clean. You may read their documentation to find out more information.&lt;/p&gt;

&lt;p&gt;Accompanying this are New Code Definitions. Setting the right New Code definition for your project is vital to getting the most out of SonarCloud: it determines which changes to your code are considered recent enough to merit your full focus, allowing you to apply the Clean as You Code methodology when addressing issues. There are several options to consider when configuring your New Code Definitions:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Previous version:&lt;/strong&gt; Issues in the code that have appeared since the most recent version increment of the project&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Specific version:&lt;/strong&gt; Issues that have occurred on a specific version of the project&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Number of days:&lt;/strong&gt; Issues that have appeared in your code within the specified number of days&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Specific date:&lt;/strong&gt; Issues that have appeared on your code since the specified date&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://community.ops.io/images/QA_PYXtysbonSoDFCEG62XJpUZ1LAh0KB-XlT8Zom9M/w:880/mb:500000/ar:1/aHR0cHM6Ly9kZXYt/dG8tdXBsb2Fkcy5z/My5hbWF6b25hd3Mu/Y29tL3VwbG9hZHMv/YXJ0aWNsZXMva2lh/ZnVmeHRhdGZlYjNl/YzB2OHUucG5n" class="article-body-image-wrapper"&gt;&lt;img src="https://community.ops.io/images/QA_PYXtysbonSoDFCEG62XJpUZ1LAh0KB-XlT8Zom9M/w:880/mb:500000/ar:1/aHR0cHM6Ly9kZXYt/dG8tdXBsb2Fkcy5z/My5hbWF6b25hd3Mu/Y29tL3VwbG9hZHMv/YXJ0aWNsZXMva2lh/ZnVmeHRhdGZlYjNl/YzB2OHUucG5n" alt="New Code Definition" width="880" height="904"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The initial code analysis does not provide a "New Code" analysis. Instead, it scans the whole project, providing you with overall code quality health. As you start analysing new commits, Sonar Cloud will present you with both an &lt;strong&gt;Overall&lt;/strong&gt; and a &lt;strong&gt;New&lt;/strong&gt; code quality health analysis (as depicted in the picture above).&lt;/p&gt;

&lt;p&gt;You can set a New Code Definition either on a project level or an organisation level, with the latter providing you a way of automatically applying the definitions to new projects that are created.&lt;/p&gt;

&lt;h2&gt;
  
  
  Quality Gates
&lt;/h2&gt;

&lt;p&gt;A Quality Gate informs you whether your code meets the minimum level of quality required for the project. It consists of a set of conditions that are applied to the results of each analysis; depending on whether the results satisfy those conditions, the analysis is given one of two statuses: &lt;strong&gt;Passed&lt;/strong&gt; or &lt;strong&gt;Failed&lt;/strong&gt;. You can define conditions on &lt;strong&gt;New Code&lt;/strong&gt; and &lt;strong&gt;Overall Code&lt;/strong&gt;. Some examples of failing conditions are:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Coverage is less than 80.0%&lt;/li&gt;
&lt;li&gt;Duplicated Lines is greater than 3.0%&lt;/li&gt;
&lt;li&gt;Maintainability Rating is worse than A&lt;/li&gt;
&lt;li&gt;Reliability Rating is worse than A&lt;/li&gt;
&lt;li&gt;Security Hotspots Reviewed is less than 100%&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Quality Gates are evaluated on the &lt;strong&gt;Main Branch (&lt;code&gt;master&lt;/code&gt; by default)&lt;/strong&gt;, &lt;strong&gt;other branches,&lt;/strong&gt; and &lt;strong&gt;Pull Requests&lt;/strong&gt;. You can create any number of Quality Gates for your projects and enable them per project, or create a default Quality Gate that will be applied to all projects, existing and future.&lt;/p&gt;

&lt;p&gt;An example of a customised Quality Gate is shown below:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://community.ops.io/images/3HeZKb3N1kzDPdgoeckLvuI18acmRqU5AFnkKA_Ro-o/w:880/mb:500000/ar:1/aHR0cHM6Ly9kZXYt/dG8tdXBsb2Fkcy5z/My5hbWF6b25hd3Mu/Y29tL3VwbG9hZHMv/YXJ0aWNsZXMvOHlz/MWUyZTdyN2dxZzJu/azZjb2wucG5n" class="article-body-image-wrapper"&gt;&lt;img src="https://community.ops.io/images/3HeZKb3N1kzDPdgoeckLvuI18acmRqU5AFnkKA_Ro-o/w:880/mb:500000/ar:1/aHR0cHM6Ly9kZXYt/dG8tdXBsb2Fkcy5z/My5hbWF6b25hd3Mu/Y29tL3VwbG9hZHMv/YXJ0aWNsZXMvOHlz/MWUyZTdyN2dxZzJu/azZjb2wucG5n" alt="Quality Gates" width="880" height="675"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Quality Profiles
&lt;/h2&gt;

&lt;p&gt;Quality Profiles, as you may have guessed, are sets of programming-language rules that are applied during code analysis. By default, each programming language supported on Sonar Cloud has a built-in profile called "Sonar way", based on current industry best practices. Although "Sonar way" is best suited for most projects, there are cases where engineering tribes want to customise profiles to best suit their needs.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://community.ops.io/images/hWR1CbnahvJSAfwe1pYAtApZYo6uUaoP1TNwXCNWXyE/w:880/mb:500000/ar:1/aHR0cHM6Ly9kZXYt/dG8tdXBsb2Fkcy5z/My5hbWF6b25hd3Mu/Y29tL3VwbG9hZHMv/YXJ0aWNsZXMvYXJr/cnJkN2sycm1lcm9n/cWZyaW8ucG5n" class="article-body-image-wrapper"&gt;&lt;img src="https://community.ops.io/images/hWR1CbnahvJSAfwe1pYAtApZYo6uUaoP1TNwXCNWXyE/w:880/mb:500000/ar:1/aHR0cHM6Ly9kZXYt/dG8tdXBsb2Fkcy5z/My5hbWF6b25hd3Mu/Y29tL3VwbG9hZHMv/YXJ0aWNsZXMvYXJr/cnJkN2sycm1lcm9n/cWZyaW8ucG5n" alt="Quality Profiles" width="880" height="855"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Branches
&lt;/h2&gt;

&lt;p&gt;In Sonar Cloud, there are two types of branch analysis: &lt;strong&gt;short-lived branches&lt;/strong&gt; and &lt;strong&gt;long-lived branches&lt;/strong&gt;.&lt;/p&gt;

&lt;h3&gt;
  
  
  Short-Living Branches
&lt;/h3&gt;

&lt;p&gt;As the name suggests, these branches are meant for temporary analysis, usually via pull requests. Short-lived branches are deleted automatically after 30 days with no analysis.&lt;/p&gt;

&lt;h3&gt;
  
  
  Long-Living Branches
&lt;/h3&gt;

&lt;p&gt;These branches are useful when tribes follow an Agile methodology, using Git flow techniques to maintain a set of upstream branches (sprint, release). They remain in your Sonar Cloud project history until they are deleted. Some companies maintain upstream branches for a very long time, so this option is extremely useful for conducting code analysis on these branches alongside your main branch (&lt;code&gt;master&lt;/code&gt;).&lt;/p&gt;

&lt;h3&gt;
  
  
  Defining long-living branches
&lt;/h3&gt;

&lt;p&gt;Long-living branches are defined at the project level. Simply navigate to &lt;strong&gt;Project Settings &amp;gt; Administration &amp;gt; Branches &amp;amp; Pull Requests&lt;/strong&gt;. Long-living branches are identified by a regular expression pattern.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://community.ops.io/images/cepDuYi56lLl84xmKIbng4DYRaWwNW7aAW7vd5cSmC4/w:880/mb:500000/ar:1/aHR0cHM6Ly9kZXYt/dG8tdXBsb2Fkcy5z/My5hbWF6b25hd3Mu/Y29tL3VwbG9hZHMv/YXJ0aWNsZXMvd2Jq/N3cxcjFyYjJ1dThy/Nzhhd3UucG5n" class="article-body-image-wrapper"&gt;&lt;img src="https://community.ops.io/images/cepDuYi56lLl84xmKIbng4DYRaWwNW7aAW7vd5cSmC4/w:880/mb:500000/ar:1/aHR0cHM6Ly9kZXYt/dG8tdXBsb2Fkcy5z/My5hbWF6b25hd3Mu/Y29tL3VwbG9hZHMv/YXJ0aWNsZXMvd2Jq/N3cxcjFyYjJ1dThy/Nzhhd3UucG5n" alt="Branch regex" width="348" height="59"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;By default, the pattern is &lt;code&gt;(branch|release)-.*&lt;/code&gt;. This means that when the name of the branch starts with &lt;code&gt;branch-&lt;/code&gt; or &lt;code&gt;release-&lt;/code&gt;, it will be considered a long-living branch.&lt;/p&gt;
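&lt;p&gt;The matching behaviour of the default pattern can be sketched in a few lines (a plain regex check, not Sonar Cloud's internal code):&lt;/p&gt;

```python
import re

# Default long-living branch pattern from the project settings shown above.
LONG_LIVING = re.compile(r"(branch|release)-.*")

def is_long_living(name: str) -> bool:
    """True when the whole branch name matches the long-living pattern."""
    return LONG_LIVING.fullmatch(name) is not None

print(is_long_living("release-testsonarcloud"))  # True
print(is_long_living("feature/login"))           # False -> short-lived
```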

&lt;h3&gt;
  
  
  Project Settings
&lt;/h3&gt;

&lt;p&gt;All of the aforementioned concepts and terminologies are in fact project and organisation settings (Quality Gates, Quality Profiles, long-living branches, New Code Definitions, monorepo support). Alongside these are general settings such as Code Coverage Exclusions, Test File Inclusions and Exclusions, Duplication Inclusions, Source File Exclusions and Inclusions, and many more. You may refer to the &lt;a href="https://docs.sonarqube.org/latest/project-administration/narrowing-the-focus"&gt;documentation&lt;/a&gt; to set these up.&lt;/p&gt;

&lt;h1&gt;
  
  
  Summary
&lt;/h1&gt;

&lt;p&gt;I hope Part 1 of this article has provided you with insights into how Sonar Cloud operates and the software quality engineering principles it follows. Part 2 will walk you through setting up a Sonar Cloud project with a GitHub repository and performing code analysis using GitHub Actions.&lt;/p&gt;

&lt;p&gt;Image Source: &lt;a href="https://alexandrebrisebois.files.wordpress.com/2014/05/2011-09-18_code_reviews.png"&gt;https://alexandrebrisebois.files.wordpress.com/2014/05/2011-09-18_code_reviews.png&lt;/a&gt;&lt;/p&gt;

</description>
      <category>devops</category>
      <category>codequality</category>
    </item>
  </channel>
</rss>
