<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>The Ops Community ⚙️: Bhagya</title>
    <description>The latest articles on The Ops Community ⚙️ by Bhagya (@bhagyaaa).</description>
    <link>https://community.ops.io/bhagyaaa</link>
    <image>
      <url>https://community.ops.io/images/bKUW1UL932VwZ8NKeMhztZ4t7ISLxfHOz01Rompx2xU/rs:fill:90:90/g:sm/mb:500000/ar:1/aHR0cHM6Ly9jb21t/dW5pdHkub3BzLmlv/L3JlbW90ZWltYWdl/cy91cGxvYWRzL3Vz/ZXIvcHJvZmlsZV9p/bWFnZS8yMzcvMDJi/MDdmMWQtODkxMC00/NmQ0LWExYjEtYTU5/ODU4YTAwN2RiLmpw/Zw</url>
      <title>The Ops Community ⚙️: Bhagya</title>
      <link>https://community.ops.io/bhagyaaa</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://community.ops.io/feed/bhagyaaa"/>
    <language>en</language>
    <item>
      <title>Upload an image to S3 using NodeJs 👷🏻🧡</title>
      <dc:creator>Bhagya</dc:creator>
      <pubDate>Sun, 02 Apr 2023 08:54:56 +0000</pubDate>
      <link>https://community.ops.io/bhagyaaa/upload-an-image-to-s3-using-nodejs-5g00</link>
      <guid>https://community.ops.io/bhagyaaa/upload-an-image-to-s3-using-nodejs-5g00</guid>
      <description>&lt;p&gt;In this article, Im going to discuss about how to upload an image to AWS S3, the cloud file hosting solution provided by Amazon Web Services.&lt;/p&gt;

&lt;p&gt;First, install the aws-sdk library:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;npm install aws-sdk
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Then, import it at the top of the file that will contain the S3 upload functionality:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;import AWS from ’aws-idk’
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Now, use the SDK to create an instance of the S3 client. Let's assign it to an s3 variable.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;const s3 = new AWS.S3({
 accessKeyId : process.env.AWS_S3_ACCESS_KEY_ID,
 secretAccessKet: process.env.AWS_S3_SECRET_ACESS_KEY,
})
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Two environment variables are used here:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;AWS_S3_ACCESS_KEY_ID&lt;/li&gt;
&lt;li&gt;AWS_S3_SECRET_ACCESS_KEY&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;Now comes some administrative work. You need to create an IAM user on AWS (these are its credentials) with programmatic access and the &lt;strong&gt;AWSCloudFormationFullAccess&lt;/strong&gt; and &lt;strong&gt;AmazonS3FullAccess&lt;/strong&gt; permissions, plus an S3 bucket that this user has access to.&lt;/p&gt;

&lt;p&gt;Now you need an image blob to upload.&lt;/p&gt;

&lt;p&gt;For instance, you can fetch one from a URL:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;const imageURL = ’https://url-to-image.jpg'
const res = await fetch(imageURL)
const blob = await res.buffer()
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Or you can use an image sent through a file field in a multipart form:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;const imagePath = req.files[0].path
const blob = fs.readFileSync(imagePath)
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Finally, call s3.upload() and chain its .promise() method so you can use await to wait until it finishes and get the uploaded file object:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;const uploadedImage = await s3.upload({
    Bucket: process.env.AWS_S3_Bucket_NAME,
        Key: req.files[0].originalFilename,
    Body: blod,
 }).promise()
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;AWS_S3_BUCKET_NAME, another environment variable, is the name of the S3 bucket.&lt;/p&gt;

&lt;p&gt;You can then get the URL of the uploaded image on S3 by referencing the Location property:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt; uploadedImage.Location
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Make sure the S3 bucket is configured for public read access so that this image URL can be reached.&lt;/p&gt;
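&lt;p&gt;Putting the steps above together, here is a minimal end-to-end sketch. It assumes the aws-sdk package is installed, the AWS_S3_* environment variables are set, and a multipart middleware has populated req.files; the buildUploadParams helper is hypothetical, added only for illustration.&lt;/p&gt;

```javascript
// Hypothetical helper: builds the parameter object passed to s3.upload().
function buildUploadParams(bucketName, key, body) {
  if (!bucketName) throw new Error('bucket name is required')
  return { Bucket: bucketName, Key: key, Body: body }
}

// Sketch of the full flow (assumes aws-sdk is installed and env vars are set).
async function uploadImage(blob, filename) {
  const AWS = require('aws-sdk') // loaded lazily so the helper above stands alone
  const s3 = new AWS.S3({
    accessKeyId: process.env.AWS_S3_ACCESS_KEY_ID,
    secretAccessKey: process.env.AWS_S3_SECRET_ACCESS_KEY,
  })
  const uploaded = await s3
    .upload(buildUploadParams(process.env.AWS_S3_BUCKET_NAME, filename, blob))
    .promise()
  return uploaded.Location // URL of the uploaded object
}
```

In an Express route you would call uploadImage(blob, req.files[0].originalFilename) and return the resulting URL to the client.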

</description>
      <category>aws</category>
      <category>devops</category>
      <category>cloudops</category>
      <category>tutorials</category>
    </item>
    <item>
      <title>Dive into AWS S3 👷🏻‍♀️🧡</title>
      <dc:creator>Bhagya</dc:creator>
      <pubDate>Fri, 31 Mar 2023 11:02:56 +0000</pubDate>
      <link>https://community.ops.io/bhagyaaa/dive-into-aws-s3-571o</link>
      <guid>https://community.ops.io/bhagyaaa/dive-into-aws-s3-571o</guid>
      <description>&lt;p&gt;&lt;strong&gt;What is AWS S3?&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Amazon Simple Storage Service (Amazon S3) is an object storage service that provides industry-leading scalability, data availability, security, and performance.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://community.ops.io/images/1GwICD-zsQROlznmf3c7ttiupR0Gn97Xl_9bTtD-S6g/rt:fit/w:800/g:sm/q:0/mb:500000/ar:1/aHR0cHM6Ly9jb21t/dW5pdHkub3BzLmlv/L3JlbW90ZWltYWdl/cy91cGxvYWRzL2Fy/dGljbGVzLzA2aGMz/NG1pNjczdW84Z205/cWE0LnBuZw" class="article-body-image-wrapper"&gt;&lt;img src="https://community.ops.io/images/1GwICD-zsQROlznmf3c7ttiupR0Gn97Xl_9bTtD-S6g/rt:fit/w:800/g:sm/q:0/mb:500000/ar:1/aHR0cHM6Ly9jb21t/dW5pdHkub3BzLmlv/L3JlbW90ZWltYWdl/cy91cGxvYWRzL2Fy/dGljbGVzLzA2aGMz/NG1pNjczdW84Z205/cWE0LnBuZw" alt="Image description" width="204" height="247"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Organizations of any size in any sector can use this service. Websites, mobile applications, archiving, data backup and restore, Internet of Things (IoT) devices, corporate application storage, and the underpinning storage layer for a data lake are all examples of use cases.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;How it Works ?&lt;/strong&gt; &lt;/p&gt;

&lt;p&gt;Two essential elements, buckets and objects, are at the center of how data is organized, stored, and retrieved in Amazon S3; together they form the storage system. According to AWS, an S3 environment is flat in design: a user creates a bucket, and the bucket stores objects in the cloud.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;1. Amazon S3 Objects&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Objects in the context of Amazon S3 are data files, such as documents, images, and videos. Within the S3 environment, each object has a unique key that distinguishes it from every other stored object. A single object can be up to 5 TB in size; the S3 console limits individual uploads to 160 GB, and various AWS tools are available to help you upload files larger than that.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;2. Amazon S3 Buckets&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Buckets are the basic storage containers for objects in an S3 ecosystem, because objects need a place to go.&lt;/p&gt;

&lt;p&gt;There is no restriction on the number of objects you can store in a bucket, and you can create up to 100 buckets in each of your AWS accounts by default. If necessary, you can file a service limit increase to raise that limit to a maximum of 1,000 buckets.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://community.ops.io/images/J-WtSLN2YELGtNG-pXY_Obtdo00SeFnFj73unlm7fpY/rt:fit/w:800/g:sm/q:0/mb:500000/ar:1/aHR0cHM6Ly9jb21t/dW5pdHkub3BzLmlv/L3JlbW90ZWltYWdl/cy91cGxvYWRzL2Fy/dGljbGVzL3NuNjB4/bzU5NXBnYWc3OGxk/ZTluLnBuZw" class="article-body-image-wrapper"&gt;&lt;img src="https://community.ops.io/images/J-WtSLN2YELGtNG-pXY_Obtdo00SeFnFj73unlm7fpY/rt:fit/w:800/g:sm/q:0/mb:500000/ar:1/aHR0cHM6Ly9jb21t/dW5pdHkub3BzLmlv/L3JlbW90ZWltYWdl/cy91cGxvYWRzL2Fy/dGljbGVzL3NuNjB4/bzU5NXBnYWc3OGxk/ZTluLnBuZw" alt="Image description" width="779" height="664"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;You can select which AWS region a bucket is kept in when you create it. It is best practice to choose a region physically near you in order to reduce costs and latency. Objects stored in a bucket in a particular region stay there unless you move the files elsewhere.&lt;/p&gt;

&lt;p&gt;It is also crucial to understand that Amazon S3 bucket names are globally unique. No other AWS user, in any region, can use the same bucket name as yours unless you first delete your bucket.&lt;/p&gt;
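&lt;p&gt;As a rough illustration, the core bucket-naming rules (3 to 63 characters; lowercase letters, digits, hyphens, and dots; starting and ending with a letter or digit) can be checked with a small regular expression. This is a simplified sketch, not the full AWS validation:&lt;/p&gt;

```javascript
// Simplified check of S3 bucket naming rules (not exhaustive).
function isValidBucketName(name) {
  return /^[a-z0-9][a-z0-9.-]{1,61}[a-z0-9]$/.test(name)
}

console.log(isValidBucketName('my-app-uploads')) // true
console.log(isValidBucketName('My_Bucket'))      // false: uppercase and underscore
```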

&lt;p&gt;&lt;strong&gt;3. Amazon S3 Console&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;You can handle objects and buckets with ease in the Amazon S3 console, located inside the AWS Management Console. The console offers a user-friendly, browser-based interface for working with AWS services.&lt;/p&gt;

&lt;p&gt;Here, you can upload, download, and handle objects in addition to creating, configuring, and managing buckets. By using keyword prefixes and delimiters to create a logical hierarchy, the console enables you to arrange your storage. &lt;/p&gt;
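&lt;p&gt;The prefix-and-delimiter idea can be sketched in a few lines. The function below mirrors, in simplified form, how a listing with a delimiter groups flat object keys into folder-like common prefixes (the helper name is ours, not an AWS API):&lt;/p&gt;

```javascript
// Group flat object keys into folder-like "common prefixes".
function commonPrefixes(keys, prefix, delimiter) {
  const out = new Set()
  for (const key of keys) {
    if (!key.startsWith(prefix)) continue
    const rest = key.slice(prefix.length)
    const i = rest.indexOf(delimiter)
    if (i !== -1) out.add(prefix + rest.slice(0, i + 1))
  }
  return Array.from(out)
}

const keys = ['photos/2023/cat.jpg', 'photos/2024/dog.jpg', 'readme.txt']
console.log(commonPrefixes(keys, 'photos/', '/')) // ['photos/2023/', 'photos/2024/']
```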

&lt;p&gt;Because every Amazon S3 object can be uniquely addressed through the combination of the web service endpoint, bucket name, key, and, optionally, version, objects and buckets form a folder-like structure within the console, making it simple to find files. Within the management console, you can also modify the access permissions for all buckets and objects.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Advantages of AWS S3&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Scalability:&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Similar to how some mobile phone or cable providers bundle data and bandwidth usage, storage providers frequently give predetermined amounts of storage and network transfer capacity. Even if you do not use your entire capacity, you will still pay a flat rate if you remain within your limits. However, if you go over your limit, the provider may impose steep overage charges or even suspend your service until the start of the subsequent billing period. &lt;/p&gt;

&lt;p&gt;Amazon S3, by contrast, charges only for the storage you actually use. The service lets you scale your storage resources up and down without extra costs or overage charges, making it simple to keep up with your organization's changing needs.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Durability and Accessibility&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;The Amazon Web Services website states that Amazon S3 is "designed for 99.999999999% (11 9s) of durability, storing data for millions of applications for companies all over the world." Your data is safe and easily accessible because the service automatically creates and stores copies of your S3 objects across multiple devices and facilities.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Cost-Effective Storage&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Depending on how frequently and instantly you need access to your files, you can use Amazon S3 to keep your data in a variety of "storage classes." &lt;/p&gt;

&lt;p&gt;The price points for storage classes vary from the highest level for immediate access to your mission-critical files to the lowest level for files you rarely use but must keep on hand for regulatory or other long-term requirements. &lt;/p&gt;

&lt;p&gt;AWS offers tools that let you keep an eye on your objects and decide whether they ought to be moved to a less costly storage class. S3 Intelligent-Tiering, for instance, automatically moves your data from more expensive storage classes to less expensive ones based on your ongoing access patterns.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Powerful Security&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Encryption features and access control tools keep the information in your AWS S3 environment secure from unauthorized access. This includes blocking public access to your objects at both the bucket and account levels.&lt;/p&gt;

&lt;p&gt;By default, your organization's users can access only the S3 buckets and objects they create. You can change and tailor access permissions using a number of AWS security management tools, and you can require multi-factor authentication (MFA) before a user can permanently delete an object version or change a bucket's versioning state.&lt;/p&gt;

&lt;p&gt;AWS also provides tools for analyzing your bucket access policies, so you can rapidly identify and correct any inconsistencies that might permit unauthorized or unintended access.&lt;/p&gt;

</description>
      <category>aws</category>
      <category>devops</category>
      <category>cloudops</category>
      <category>productivity</category>
    </item>
    <item>
      <title>Azure Container Instances 🧿 🍃</title>
      <dc:creator>Bhagya</dc:creator>
      <pubDate>Mon, 04 Jul 2022 11:33:09 +0000</pubDate>
      <link>https://community.ops.io/bhagyaaa/azure-container-instances-5bb8</link>
      <guid>https://community.ops.io/bhagyaaa/azure-container-instances-5bb8</guid>
      <description>&lt;p&gt;Based on the open-source Docker Registry 2.0, Azure Container Registry is a managed Docker registry service. You can create, save, and manage images for all kinds of container deployments using the private, Azure-hosted Container Registry. It can be summed up as a private registry akin to Docker Hub. An open-source registry called Docker Hub allows anyone to upload their images, but it is private and only you may view it.&lt;/p&gt;

&lt;p&gt;Without needing to maintain any virtual machines or adopt a higher-level service, Azure Container Instances (ACI) provides the quickest and easiest way to run a container in Azure. Simple applications, task automation, and build jobs are just a few of the situations where ACI is a fantastic option. One advantage of using ACI is that you can deploy your container image without writing any YAML files or keeping any VMs or other resources up to date; ACI takes care of everything, provisioning storage, networking, and much more.&lt;/p&gt;

&lt;p&gt;The container group is the primary resource in Azure Container Instances. A set of containers scheduled to run on the same host machine is known as a container group. The containers in a container group share a lifecycle, resources, a local network, and storage volumes.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Create an Azure resource group:&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;We should create a resource group before creating the container instance and container registry, because everything we create afterwards will live inside this group. If you later delete the group because you no longer need those resources, they are all removed together.&lt;/p&gt;

&lt;p&gt;A resource group can be created using the command line:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt; az group create --name tutorial --location eastus
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;Create an Azure container registry:&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;We should create a variable for the registry name so that we don't have to type it again throughout this tutorial.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;$ACR_NAME = "AzureTutorialForVAC"
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The command below builds the container registry:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;az acr create --resource-group tutorial --name $ACR_NAME --sku Premium
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The --sku option selects one of the service tiers that Azure offers: Basic, Standard, or Premium.&lt;/p&gt;

&lt;p&gt;Let's now create a Dockerfile in our project so that we can use it to build an image.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;FROM nginx:alpine
COPY . /usr/share/nginx/html
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Then open your command prompt. Use the command below to log in to your Azure container registry so that the Azure CLI knows which registry to use:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;az acr login --name azuretutorialforvac
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;With Docker Hub you are usually already logged in, since that is where images are pushed by default. Because we are using a private Azure Container Registry, however, we must also sign Docker in to that registry. You can find the login server URL on the Azure Container Registry page in the Azure portal. Use the following command:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt; docker login azuretutorialforvac.azurecr.io
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Your username and password will be requested. Go to the Azure portal, choose your container registry, then select Access keys in the left sidebar and set the admin user property to enabled; the credentials will then be displayed, and you can copy and paste them into the prompt. You can also accomplish the same thing with the Azure CLI, using the following commands:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;  az acr update -n $ACR_NAME --admin-enabled true
  az acr credential show --name $ACR_NAME
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;After that, you can use the docker build command to create an image. One thing to bear in mind is that the image tag should use the loginserverurl/imagename format:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;  docker build -t azuretutorialforvac.azurecr.io/img:1 .
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Once it is built, push the image to ACR:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;docker push azuretutorialforvac.azurecr.io/img:1
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Once that's done, you can verify the push with the following command:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;  az acr repository list --name $ACR_NAME --output table

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



</description>
      <category>azure</category>
      <category>devops</category>
      <category>tutorials</category>
      <category>productivity</category>
    </item>
    <item>
      <title>Azure Virtual Network IP Services🧵 🍃</title>
      <dc:creator>Bhagya</dc:creator>
      <pubDate>Thu, 30 Jun 2022 20:10:41 +0000</pubDate>
      <link>https://community.ops.io/bhagyaaa/azure-virtual-network-ip-services-469k</link>
      <guid>https://community.ops.io/bhagyaaa/azure-virtual-network-ip-services-469k</guid>
      <description>&lt;p&gt;A group of services connected to IP addresses known as IP services make it possible to communicate within an Azure Virtual Network. Azure uses both public and private IP addresses for resource communication. Both the public Internet and a private Azure Virtual Network can be used for resource communication.&lt;/p&gt;

&lt;p&gt;IP services consist of:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Public IP addresses&lt;/li&gt;
&lt;li&gt;Public IP address prefixes&lt;/li&gt;
&lt;li&gt;Private IP addresses&lt;/li&gt;
&lt;li&gt;Routing preference&lt;/li&gt;
&lt;li&gt;Routing preference unmetered&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;Public IP addresses&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Internet resources communicate with resources in Azure using public IP addresses. An IPv4 or IPv6 address can be used to create public IP addresses. A dual-stack setup with an IPv4 and IPv6 address might be available to you. Standard and Basic SKUs are both available for public IP addresses. Public IP addresses may be assigned either statically or dynamically.&lt;/p&gt;

&lt;p&gt;Public IP addresses are resources with unique characteristics. The following resources can be linked to a public IP address:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Virtual machine network interfaces&lt;/li&gt;
&lt;li&gt;Internet-facing load balancers&lt;/li&gt;
&lt;li&gt;Virtual Network gateways (VPN/ER)&lt;/li&gt;
&lt;li&gt;NAT gateways&lt;/li&gt;
&lt;li&gt;Application gateways&lt;/li&gt;
&lt;li&gt;Azure Firewall&lt;/li&gt;
&lt;li&gt;Bastion Host&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;Public IP address prefixes&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;In Azure, public IP prefixes are reserved blocks of IP addresses. Prefixes for public IP addresses can be IPv4 or IPv6. In regions with availability zones, public IP address prefixes can be created as zone-redundant or linked to a particular availability zone. Once you have created the public IP prefix, you can create public IP addresses from it.&lt;/p&gt;

&lt;p&gt;The following public IP prefix sizes are available:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;/28 (IPv4) or /124 (IPv6) = 16 addresses&lt;/li&gt;
&lt;li&gt;/29 (IPv4) or /125 (IPv6) = 8 addresses&lt;/li&gt;
&lt;li&gt;/30 (IPv4) or /126 (IPv6) = 4 addresses&lt;/li&gt;
&lt;li&gt;/31 (IPv4) or /127 (IPv6) = 2 addresses&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;A Classless Inter-Domain Routing (CIDR) mask size is used to specify the prefix size.&lt;/p&gt;
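&lt;p&gt;The address counts above follow directly from the mask size: a /n prefix leaves (32 - n) free host bits for IPv4, or (128 - n) for IPv6, giving 2^(free bits) addresses. A quick sketch:&lt;/p&gt;

```javascript
// Address count for a CIDR prefix: 2 raised to the number of free host bits.
function addressCount(maskLength, ipVersion) {
  const totalBits = ipVersion === 6 ? 128 : 32
  return 2 ** (totalBits - maskLength)
}

console.log(addressCount(28, 4))  // 16 addresses in an IPv4 /28
console.log(addressCount(124, 6)) // 16 addresses in an IPv6 /124
console.log(addressCount(31, 4))  // 2 addresses in an IPv4 /31
```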

&lt;p&gt;There is no restriction on the number of prefixes that can be created inside a subscription; however, you cannot generate more static public IP addresses in those ranges than your subscription permits.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Private IP addresses&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Private IP addresses allow Azure resources to communicate with one another. Resources on Azure are given private IP addresses from the virtual network subnet where they are located. In Azure, private IP addresses are assigned either statically or dynamically.&lt;/p&gt;

&lt;p&gt;A private IP address can be connected to a variety of resources, including:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Virtual machines&lt;/li&gt;
&lt;li&gt;Internal load balancers&lt;/li&gt;
&lt;li&gt;Application gateways&lt;/li&gt;
&lt;li&gt;Private endpoints&lt;/li&gt;
&lt;/ul&gt;

</description>
      <category>azure</category>
      <category>devops</category>
      <category>tutorials</category>
      <category>productivity</category>
    </item>
    <item>
      <title>Microsoft Azure Power Apps🌸 🚀</title>
      <dc:creator>Bhagya</dc:creator>
      <pubDate>Thu, 30 Jun 2022 09:04:52 +0000</pubDate>
      <link>https://community.ops.io/bhagyaaa/microsoft-azure-power-apps-4nal</link>
      <guid>https://community.ops.io/bhagyaaa/microsoft-azure-power-apps-4nal</guid>
      <description>&lt;p&gt;Power Apps is a fantastic tool for quickening the development of applications. Our firm has benefited greatly from the Microsoft Power Platform (and Power Apps in particular), which allows for rapid process automation and iterative development without compromising user experience, security, or customisation.&lt;/p&gt;

&lt;p&gt;At its core, PowerApps is a platform as a service. You can use it to create mobile apps that work with practically every web browser and operate on Android, iOS, and Windows.&lt;/p&gt;

&lt;p&gt;PowerApps has a mobile app as well! In the past, developing mobile apps meant making them compatible with every operating system (one for iOS, one for Android, one for Windows). This practically triples the amount of development work, support expenses, and development resources required to produce commercial apps.&lt;/p&gt;

&lt;p&gt;All of the PowerApps you develop run through the PowerApps mobile app itself. It handles the variations among operating systems and simply lets you use your apps: it functions as a container that makes running your apps on different mobile platforms much simpler.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;PowerApps: What Do They Do?&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;With PowerApps, you can create mobile apps by adding various controls (such as text fields and choice fields), media (such as photographs, videos, and camera controls for your phone), forms, and screens.&lt;/p&gt;

&lt;p&gt;You can connect to other data sources or save data directly within the app with this feature. The only thing left to do after creating an app is to launch it and distribute it throughout your company.&lt;/p&gt;

&lt;p&gt;PowerApps was created for internal usage and was intended to be used for business mobile apps. You won't make a PowerApp to distribute to the entire planet. These are not intended for general consumer consumption, mostly because of the licensing mechanism and technical barriers to sharing with other users.&lt;/p&gt;

&lt;p&gt;Additionally, PowerApps' entire functionality is "no-code". As a result, your internal developers won't be able to modify the underlying app by adding any custom HTML or JavaScript. If PowerApps cannot reach something outside of itself, neither your users nor your developers will be able to.&lt;/p&gt;

&lt;p&gt;It may sound like a restriction, but it can also be a benefit. Your PowerApps can help the platform keep its long-term stability and usability by excluding malicious code. You can integrate any custom REST API with PowerApps to get the best of both worlds if you require custom business logic that PowerApps just cannot deliver.&lt;/p&gt;

&lt;p&gt;Finally, the capability of PowerApps might not exactly fit your needs for what you want to accomplish with your mobile apps right now. However, given that this service is cloud-based, you can count on Microsoft to frequently roll out new upgrades, features, and improvements.&lt;/p&gt;

</description>
      <category>azure</category>
      <category>devops</category>
      <category>tutorials</category>
      <category>lowcode</category>
    </item>
    <item>
      <title>Low Code Development with Microsoft Azure🌸 ☂️</title>
      <dc:creator>Bhagya</dc:creator>
      <pubDate>Wed, 29 Jun 2022 12:47:16 +0000</pubDate>
      <link>https://community.ops.io/bhagyaaa/low-code-development-with-microsoft-azure-14g9</link>
      <guid>https://community.ops.io/bhagyaaa/low-code-development-with-microsoft-azure-14g9</guid>
      <description>&lt;p&gt;Low-code is a software development technique that promotes faster app deliveries with little to no coding required. Hence, low-code platforms are a collection of software tools that allow the visual development of apps using intuitive modeling with a graphical user interface (GUI). Low-code eliminates or significantly reduces the need for coding, accelerating the process of getting apps to production.&lt;/p&gt;

&lt;p&gt;Model-driven design, automated code generation, and visual programming are all notions that low-code development platforms are built on. As a result, low-code platforms are software programs that provide a graphical user interface (GUI) for coding in order to write code quickly and reduce traditional hand-coding tasks.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Reasons why the low-code is so popular&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;The user interface of low coding tools is simple and straightforward.&lt;/li&gt;
&lt;li&gt;Low-code development reduces the time to market by speeding up development.&lt;/li&gt;
&lt;li&gt;Large projects benefit from low-code development since it simplifies them.&lt;/li&gt;
&lt;li&gt;Scaling programs from testing to production is a breeze with low code.&lt;/li&gt;
&lt;li&gt;Because it comes with various integrations, little code makes development more versatile.&lt;/li&gt;
&lt;li&gt;Low code has security and maintenance built in, which saves money and time.&lt;/li&gt;
&lt;li&gt;Low-code development reduces development risks while also providing a high return on investment.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;Low Code Development with Microsoft Azure&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://community.ops.io/images/9meUCilsMiEPjS27YNhUPvp2BlfwbTpBqtJpGx21IpU/rt:fit/w:800/g:sm/q:0/mb:500000/ar:1/aHR0cHM6Ly9jb21t/dW5pdHkub3BzLmlv/L3JlbW90ZWltYWdl/cy91cGxvYWRzL2Fy/dGljbGVzLzl0YWJs/cm96MjE3cWVwcTFi/d2F0LnBuZw" class="article-body-image-wrapper"&gt;&lt;img src="https://community.ops.io/images/9meUCilsMiEPjS27YNhUPvp2BlfwbTpBqtJpGx21IpU/rt:fit/w:800/g:sm/q:0/mb:500000/ar:1/aHR0cHM6Ly9jb21t/dW5pdHkub3BzLmlv/L3JlbW90ZWltYWdl/cy91cGxvYWRzL2Fy/dGljbGVzLzl0YWJs/cm96MjE3cWVwcTFi/d2F0LnBuZw" alt="Image description" width="800" height="415"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Microsoft Power Apps lets you quickly build and share professional-caliber low-code applications. Give your teams the tools they need to construct the applications they require: time-saving automations, outstanding customer experiences, and seamless interfaces with Microsoft services and products such as Azure and Microsoft Teams. It's quick and easy to design your apps with a drag-and-drop user interface and prebuilt user experience components, and it's straightforward to distribute them across iOS, Android, Windows, and the web.&lt;/p&gt;

&lt;p&gt;The next post will cover how to build a low-code Power App on the Microsoft Azure platform.&lt;/p&gt;

</description>
      <category>azure</category>
      <category>devops</category>
      <category>tutorials</category>
      <category>lowcode</category>
    </item>
    <item>
      <title>Using the Azure DevOps APIs, to copy a work item type-2🧵 🍃</title>
      <dc:creator>Bhagya</dc:creator>
      <pubDate>Tue, 28 Jun 2022 13:24:02 +0000</pubDate>
      <link>https://community.ops.io/bhagyaaa/using-the-azure-devops-apis-to-copy-a-work-item-type-2-3i55</link>
      <guid>https://community.ops.io/bhagyaaa/using-the-azure-devops-apis-to-copy-a-work-item-type-2-3i55</guid>
      <description>&lt;p&gt;I covered the fundamentals of using the Azure DevOps APIs to duplicate a work item type in the previous article. I intend to talk about the further details in this article.&lt;/p&gt;

&lt;p&gt;It can be created using the following steps:&lt;/p&gt;

&lt;p&gt;The field must first be added to the work item type. Once the field has been included, you can add the group to the section as shown in the code below. Let's first walk through what the code does. We loop through the groups of the specified section in the work item we are copying from, using the variable $grp as the control. The field is first added to the new work item type, and the new group is then added to the section. The label in the group, which will serve as the label for the field, is the only thing that is not null, as you can see. In the control we keep a reference to the field we just added to the work item (the control's id field) as well as the control's label.&lt;/p&gt;

&lt;p&gt;As can be seen at the end of this code block, adding the group is therefore as simple as using the API.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;# multi line text fields cannot be inside a group. they are their own group on the UI
if($grp.controls[0].controlType -eq "HtmlFieldControl")
{
    $isMultiLine = $true
    # first add the field to the work item
    $addCtl = @{
       referenceName = $grp.controls[0].id
       order = "$null"
       readOnly = "$false"
       label = $grp.label.Trim()
       visible = "$true"

   # must encapsulate true false in quotes to register
   defaultValue = $(if($fld.type -eq "boolean")
                    {"$false"}
                    else {""})
   required = $(if($fld.type -eq "boolean")
                 {"$true"}
                 else {"$false"})
 }
 $ctlJSON = ConvertTo-Json -InputObject $addCtl

 # add field to work item type
 # https://docs.microsoft.com/en-us/rest/api/azure/devops/processes/fields/add?view=azure-devops-rest-7.1
 # POST https://dev.azure.com/{organization}/_apis/work/processes/{processId}/workItemTypes/{witRefName}/fields?api-version=7.1-preview.2
 $field = $null
 $fieldURL = $userParams.HTTP_preFix + "://dev.azure.com/" + $userParams.VSTSMasterAcct + "/_apis/work/processes/" + $proc.typeId  + "/workitemTypes/" + $newWKItem.referenceName + "/fields?api-version=7.1-preview.2"
 $field = Invoke-RestMethod -Uri $fieldURL -Method Post -ContentType "application/json" -Headers $authorization -Body $ctlJSON
 Write-Host $field

# now add the Multi line field to the page in a group with no name 
$addGroup = @{
          Contribution = "$null"    
          height = "$null"
          id = "$null"
          inherited = "$null"
          isContribution = "$false"
          label = $grp.label.Trim()
          visible = "$true"
          order = "$null"
          overridden = "$null"
               controls = @( @{
                   contribution = "$null"
                   controlType = "$null"
                   height = "$null"
                   id = $grp.controls[0].id
                   inherited = "$null"
                   isContribution = "$false"
                   label = $grp.controls[0].label.Trim()
                   metadata = "$null"
                   order = "$null"
                   overridden = "$null"
                   visible = "$true"
                   watermark = "$null"
               })

    }
    $grpJSON = ConvertTo-Json -InputObject $addGroup
    # POST https://dev.azure.com/{organization}/_apis/work/processes/{processId}/workItemTypes/{witRefName}/layout/pages/{pageId}/sections/{sectionId}/groups?api-version=7.1-preview.1
    $groupURL = $userParams.HTTP_preFix + "://dev.azure.com/" + $userParams.VSTSMasterAcct + "/_apis/work/processes/" + $proc.typeId  + "/workitemtypes/" + $newWKItem.referenceName + "/layout/pages/" + $pgExists.id + "/sections/" + $newSection.id + "/groups?api-version=7.1-preview.1"   
    $group = Invoke-RestMethod -Uri $groupURL -Method Post -ContentType "application/json" -Headers $authorization -Body $grpJSON
    Write-Host "Multi line field " $group
    $newGrp = $group  
}

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;There are other controls you must handle. The first is the custom control extension field, which has a multi-select feature and allows selections to be added. As with all other fields, you must add this type of field to the work item type before adding it to the page. After that, you add the group it belongs to, and then add the control to the group, as shown below. Keep in mind that the control must have a unique id; I used the New-Guid PowerShell command. You must then include a reference name, which is the id of the field you just created. The field name and values are then added again in the contribution section.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;# add control to group. add the field to the control
if($grpCtl.isContribution -eq $true)
{
    $addCtl = @{
        # undocumented: when adding a contribution control it must have an id. it has to be unique, so a guid is used
        id = New-Guid
        # undocumented: a contribution control must also include a reference name - this is the field in the control
        referenceName = $grpCtl.contribution.inputs.FieldName

        isContribution = if($grpCtl.isContribution -eq $true) {"$true"} else {"$false"}
        height = "$null"
        label = $grpCtl.label.Trim()
        metadata = "$null"
        order = "$null"
        overridden = "$null"
        readOnly = if($grpCtl.readOnly -eq $true) {"$true"} else {"$false"}
        visible = if($grpCtl.visible -eq $true) {"$true"} else {"$false"}
        watermark = "$null"
        contribution = @{
            contributionId = $grpCtl.contribution.contributionId
            inputs = @{
                FieldName = $grpCtl.contribution.inputs.FieldName
                Values = $grpCtl.contribution.inputs.Values
            }
        }
    }
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The other field type to be careful with is the Boolean field. Here, the default value is essential and must be present; if you omit it, the default value won't be added to the page. I got around this by always sending a default value: "false" if the field type was Boolean, and a blank string otherwise, which seems to work. Other custom control fields exist, and each one requires adjustment to be added correctly. You can find a URL field custom control in the code and see how I added it as well.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;# url contribution control - need to set added to true so code will not try to add field or control again
# url control has to be added as a control to the group, not like others that need the field added first
# adding a url contribution field is not documented. found the request by using fiddler
if($grpCtl.contribution.contributionId -like "*url-field")
{
    $addCtl = @{
        contributionId = $grpCtl.contribution.contributionId
        isContribution = if($grpCtl.isContribution -eq $true) {"$true"} else {"$false"}
        height = "$null"
        label = $grpCtl.label.Trim()
        metadata = "$null"
        order = "$null"
        overridden = "$null"
        controlType = "$null"
        readOnly = if($grpCtl.readOnly -eq $true) {"$true"} else {"$false"}
        visible = if($grpCtl.visible -eq $true) {"$true"} else {"$false"}
        watermark = "$null"
        contribution = @{
            contributionId = $grpCtl.contribution.contributionId
            inputs = @{
                HideUrlIfEmptyField = $grpCtl.contribution.inputs.HideUrlIfEmptyField
                Title = $grpCtl.contribution.inputs.Title
                Url = $grpCtl.contribution.inputs.Url
            }
        }
    }

    $ctlJSON = ConvertTo-Json -InputObject $addCtl -Depth 10
    # https://docs.microsoft.com/en-us/rest/api/azure/devops/processes/controls/create?view=azure-devops-rest-7.1
    # POST https://dev.azure.com/{organization}/_apis/work/processes/{processId}/workItemTypes/{witRefName}/layout/groups/{groupId}/controls?api-version=7.1-preview.1
    $controlURL = $userParams.HTTP_preFix + "://dev.azure.com/" + $userParams.VSTSMasterAcct + "/_apis/work/processes/" + $proc.typeId + "/workitemtypes/" + $newWKItem.referenceName + "/layout/groups/" + $group.id + "/controls?api-version=7.1-preview.1"
    $control = Invoke-RestMethod -Uri $controlURL -Method Post -ContentType "application/json" -Headers $authorization -Body $ctlJSON
    Write-Host $control
    $added = "$true"
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;I've demonstrated how to make a copy of an existing work item type using this procedure. It should also work if you want to add the new work item type to another process.&lt;/p&gt;

&lt;p&gt;I sincerely hope you will find this information to be useful.&lt;/p&gt;

</description>
      <category>azure</category>
      <category>devops</category>
      <category>tutorials</category>
      <category>productivity</category>
    </item>
    <item>
      <title>Using the Azure DevOps APIs, to copy a work item type.🧵 🍃</title>
      <dc:creator>Bhagya</dc:creator>
      <pubDate>Sun, 26 Jun 2022 18:09:54 +0000</pubDate>
      <link>https://community.ops.io/bhagyaaa/using-the-azure-devops-apis-to-copy-a-work-item-type-38f6</link>
      <guid>https://community.ops.io/bhagyaaa/using-the-azure-devops-apis-to-copy-a-work-item-type-38f6</guid>
      <description>&lt;p&gt;So, we all adore the way that Azure DevOps allows us to manage engagements (ADO). On Kanban boards, we may make Epics, Features, and User Stories and monitor our progress. Work item types can easily be modified to meet the demands of your company and project. What would happen if you wanted to use a work item type you produced for one business case but clone it for another? The existing work item cannot be copied to a different work item type. It's not a big deal if you only have a few fields, but if you have several pages, various groups on each page, and numerous fields in each group, the effort becomes enormous.&lt;/p&gt;

&lt;p&gt;In this article, I'll describe how to use the Azure DevOps APIs to replicate a work item type. The documentation explains the fundamentals of creating a new process, a work item type, groups, and so on; however, a few pieces are missing.&lt;/p&gt;

&lt;p&gt;Before going into the code, it helps to grasp the overall flow. The code is organized as follows:&lt;/p&gt;

&lt;p&gt;1. Create a fresh work item type.&lt;br&gt;
2. Create pages for the new work item type by iterating through the target work item type and adding any missing pages.&lt;br&gt;
3. Create states for the new work item type.&lt;br&gt;
- Remove any default states not present in the work item type being copied from.&lt;br&gt;
4. Loop through each page of the targeted work item type.&lt;br&gt;
- Loop through each page's sections, and through each group in each section.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Repeat the loop for each control in each group.&lt;/li&gt;
&lt;li&gt;Add the field to the new work item type (from what I have seen, a control holds only one field).&lt;/li&gt;
&lt;li&gt;Add a group to the specified section.&lt;/li&gt;
&lt;li&gt;Add the control to the given group.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Once the new work item type has been created, you must add the states from the work item type you are copying from. You then have the basic work item type with the states from the target work item type appended. After creation, the work item type has one page and four sections by default. The first three sections, running left to right, correspond to the page's three columns; the meaning of the fourth section is still a mystery to me. With the work item type in place, adding the fields is largely a matter of iterating over each page in the layout and adding them.&lt;/p&gt;

&lt;p&gt;This puzzle has a few undocumented elements that need to be addressed, and several field types need special consideration. The System.Description field is included on the work item page by default. If for some reason you renamed that field, you must treat it differently from the others: the Description field will be the first group in the first section. The biggest insight was also the most difficult to find. When you look at the UI, the description field does not appear to have a group; the underlying reason is that it is a multi-line (HTML) text box.&lt;/p&gt;

&lt;p&gt;Consider a multi-line text box located elsewhere on the page. Everything must be in a group, yet when you look at this field in the UI, it is not in one. After spending a few hours poring over Fiddler traffic, I discovered what the documentation leaves out about multi-line text fields: the multi-line field must be added as a control to a group that has already been created. This was the missing piece of the puzzle. When a multi-line field appeared on the page, a request was issued, and I was able to identify it and determine where it was going. Granted, the documentation on creating a group shows that a control can be part of the request, but it is silent on the fact that multi-line text fields are an exception.&lt;/p&gt;

&lt;p&gt;I intend to address the remaining details in the next article.&lt;/p&gt;

</description>
      <category>azure</category>
      <category>devops</category>
      <category>tutorials</category>
      <category>productivity</category>
    </item>
    <item>
      <title>Terraform with Azure🍃 🌌</title>
      <dc:creator>Bhagya</dc:creator>
      <pubDate>Sat, 25 Jun 2022 16:45:45 +0000</pubDate>
      <link>https://community.ops.io/bhagyaaa/terraform-with-azure-46go</link>
      <guid>https://community.ops.io/bhagyaaa/terraform-with-azure-46go</guid>
      <description>&lt;p&gt;Terraform is an infrastructure as code tool that allows you to define cloud and on-premises resources in human-readable configuration files that can be versioned, reused, and shared. After that, you can use a standardized workflow to provision and manage all of your infrastructure throughout its lifecycle. Terraform can manage both low-level components such as compute, storage, and networking resources and high-level components such as DNS entries and SaaS features.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;The state file&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;You should keep your Terraform state file in a safe and central location. Safe, because it contains sensitive information; central, because you may want to work on the same project with your entire team. Speaking of Azure, the best place to store it is a Blob Storage account, which Terraform supports out of the box. You must first create the Blob Storage account, which you can easily do with a script like the one below.&lt;/p&gt;
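
&lt;p&gt;As a minimal sketch (the resource group, account, and container names below are placeholders you should replace), the account can be created with the Azure CLI and then referenced from the Terraform backend configuration:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;# create a resource group, storage account, and blob container for the state file
az group create --name tfstate-rg --location westeurope
az storage account create --name tfstate12345 --resource-group tfstate-rg --sku Standard_LRS
az storage container create --name tfstate --account-name tfstate12345

# then point Terraform at it with a backend block in your configuration:
# terraform {
#   backend "azurerm" {
#     resource_group_name  = "tfstate-rg"
#     storage_account_name = "tfstate12345"
#     container_name       = "tfstate"
#     key                  = "terraform.tfstate"
#   }
# }
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;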

&lt;p&gt;&lt;strong&gt;Secrets and IDs&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;If you use a Blob Storage account to store your state file, Terraform will require credentials to access it, whether you run the CLI manually, as a script, or as part of a pipeline. You should not keep the credentials in your code! It is preferable to keep them in a safe place and retrieve them as needed. Since you are working with Azure, Azure Key Vault may be the best option for you.&lt;/p&gt;

&lt;p&gt;Below are some useful snippets for retrieving secrets from the Azure Vault and using them with Terraform.&lt;/p&gt;
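
&lt;p&gt;For example (the vault and secret names are placeholders), you can read the secrets with the Azure CLI and hand them to Terraform through the ARM_* environment variables that the azurerm provider reads:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;# pull service principal credentials out of Key Vault
export ARM_CLIENT_ID=$(az keyvault secret show --vault-name my-vault --name sp-client-id --query value -o tsv)
export ARM_CLIENT_SECRET=$(az keyvault secret show --vault-name my-vault --name sp-client-secret --query value -o tsv)
export ARM_TENANT_ID=$(az keyvault secret show --vault-name my-vault --name tenant-id --query value -o tsv)
export ARM_SUBSCRIPTION_ID=$(az keyvault secret show --vault-name my-vault --name subscription-id --query value -o tsv)

# Terraform picks the variables up automatically
terraform init
terraform plan
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;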

&lt;p&gt;&lt;strong&gt;A service account&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Depending on your needs, you may prefer to run Terraform on behalf of a service account rather than your personal account. You can then limit its access rights to only those required for that specific deployment. Conversely, you can also grant the service account privileges that your personal account should not have.&lt;/p&gt;

&lt;p&gt;This is accomplished by creating a Service Principal and configuring it in your Terraform configuration. Check out the script below to learn how to create a Service Principal.&lt;/p&gt;
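
&lt;p&gt;A sketch of that script (the display name and subscription id are placeholders):&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;# create a Service Principal for Terraform, scoped to one subscription
az ad sp create-for-rbac \
  --name "terraform-deploy" \
  --role "Contributor" \
  --scopes "/subscriptions/00000000-0000-0000-0000-000000000000"
# the output contains the appId (client id), password (client secret),
# and tenant id that the azurerm provider needs
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;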

&lt;p&gt;&lt;strong&gt;Some useful scripts&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;You like the Infrastructure-as-Code principles that allow you to store everything in code, so you use Terraform. Creating the required dependencies manually in the Azure Portal is thus out of the question!&lt;/p&gt;

</description>
      <category>azure</category>
      <category>devops</category>
      <category>tutorials</category>
      <category>productivity</category>
    </item>
    <item>
      <title>Azure Email, What is it?</title>
      <dc:creator>Bhagya</dc:creator>
      <pubDate>Thu, 23 Jun 2022 18:30:26 +0000</pubDate>
      <link>https://community.ops.io/bhagyaaa/azure-email-4jib</link>
      <guid>https://community.ops.io/bhagyaaa/azure-email-4jib</guid>
      <description>&lt;p&gt;Third-party solutions, such as SendGrid, offer email services on Azure that may be incorporated into solutions to address a wide range of use cases. For example, if we need a low-cost, low-maintenance solution and have an Azure subscription, we can use the SendGrid Email Delivery Service on Microsoft Azure.&lt;/p&gt;

&lt;p&gt;Because Azure's shared IP ranges are widely blacklisted, sending emails from Azure using SMTP without a relay does not guarantee that they will be received at the other end. As a result, we discovered that none of our email traffic, both inbound and outbound, was reaching its destination. In general, sending email from Azure depends on a third-party SMTP relay service. SendGrid is the most popular and recommended tool for this, and there are various examples and instructions available on how to send emails using Azure and SendGrid. At the same time, we can use another SMTP relay-compatible service, such as Elastic Email, Mailjet, or SocketLabs.&lt;/p&gt;

&lt;p&gt;SendGrid is the most widely used email provider for sending emails from Azure. SendGrid and Azure grew highly popular together because there used to be a free plan with a monthly limit of 25,000 emails for Azure subscribers. Although the free plan is no longer visible in the Azure interface, Microsoft confirmed that a free subscription with a daily limit of 100 emails is still available.&lt;/p&gt;

&lt;p&gt;Some of SendGrid's most common features and functions are as follows:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Automatic receipt distribution&lt;/li&gt;
&lt;li&gt;Managing mailing list distribution lists&lt;/li&gt;
&lt;li&gt;Obtaining real-time data&lt;/li&gt;
&lt;li&gt;Forwarding customer inquiries&lt;/li&gt;
&lt;li&gt;Managing incoming e-mails&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Similar functionality is provided by two Azure services:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Queue storage is a messaging service in the cloud that allows Azure application components to communicate with one another.&lt;/li&gt;
&lt;li&gt;Service Bus is a robust messaging system for linking applications, services, and devices. By employing the appropriate Service Bus relay, Service Bus can also connect to remotely hosted applications and services.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;SendGrid covers all technical specifications, from architecture to ISP outreach and monitoring to filtering services and real-time analytics. As a result, it is the world's leading cloud-based email delivery company.&lt;/p&gt;

&lt;p&gt;The email services provided by SendGrid can be used in a variety of ways; it fully depends on your needs and objectives. Here's how to set up and use SendGrid on Azure:&lt;/p&gt;

&lt;p&gt;Before you begin, please ensure that you have an active SendGrid account in your Azure subscription.&lt;/p&gt;

&lt;p&gt;By following the instructions below, you can create a SendGrid account and obtain the information you need to send an email using SendGrid.&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;Access your Azure account.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Look for the 'SendGrid Email Delivery' service.&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;The following step is to navigate to the Single Sender Verification page and search for the modified sender address.&lt;/p&gt;

&lt;p&gt;Locate the verification email sent to the sender's address and use the Verify Single Sender button to validate the sender's identity.&lt;/p&gt;

&lt;p&gt;To send email using SendGrid, the following requirements must be met.&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;SendGrid's SMTP server address is smtp.sendgrid.net.&lt;/li&gt;
&lt;li&gt;The SMTP username will always be apikey.&lt;/li&gt;
&lt;li&gt;As your password, use the value of the API key generated in SendGrid.&lt;/li&gt;
&lt;li&gt;Port 25 should be avoided; use port 587 instead.&lt;/li&gt;
&lt;li&gt;The From address must be the sender address verified in SendGrid.&lt;/li&gt;
&lt;/ol&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;$sendGridApiKey = 'SG...........P258'
$SendGridEmail = @{
    From       = 'bagya@lzex.ml'
    To         = 'kithmini@gmail.com'
    Subject    = 'Hello'
    Body       = 'A formal mail from the account'
    SmtpServer = 'smtp.sendgrid.net'
    Port       = 587   # use 587; port 25 should be avoided
    UseSSL     = $true
    Credential = New-Object PSCredential 'apikey', (ConvertTo-SecureString $sendGridApiKey -AsPlainText -Force)
}
Send-MailMessage @SendGridEmail
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;To validate email deliverability, look in the recipient's inbox for the message we sent. As can be seen, the message was delivered from the sender's address via sendgrid.net.&lt;/p&gt;

</description>
      <category>azure</category>
      <category>devops</category>
      <category>tutorials</category>
      <category>productivity</category>
    </item>
    <item>
      <title>Dive into Vagrant-03🌱</title>
      <dc:creator>Bhagya</dc:creator>
      <pubDate>Thu, 23 Jun 2022 17:50:10 +0000</pubDate>
      <link>https://community.ops.io/bhagyaaa/dive-into-vagrant-03-5hl7</link>
      <guid>https://community.ops.io/bhagyaaa/dive-into-vagrant-03-5hl7</guid>
      <description>&lt;p&gt;&lt;strong&gt;How do I get started with Vagrant?&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;The simplest approach to get started with Vagrant is to install it and experiment with it. Aside from that, the official material is invaluable and provides excellent guidance for your first actions. It is also beneficial to be familiar with some of the basic terms used by Vagrant.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Box:&lt;/strong&gt; A box is a pre-packaged Vagrant environment, usually a virtual machine image.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Provider:&lt;/strong&gt; A provider is where the virtual environment is hosted. It can be local (VirtualBox is the default), remote, or even a special case like a Docker container.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Provisioner:&lt;/strong&gt; A provisioner is a tool for configuring the virtual environment. It can be as simple as a shell script, or it can be a more complicated tool like Chef, Puppet, or Ansible.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Your initial virtual machine&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;You will have a fully functional virtual machine with Ubuntu 18.04 LTS 64-bit after running the two commands below.&lt;/p&gt;

&lt;p&gt;&lt;em&gt;Initialize Vagrant&lt;/em&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;vagrant init hashicorp/bionic64
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;





&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;A `Vagrantfile` has been placed in this directory. You are now
ready to `vagrant up` your first virtual environment! Please read
the comments in the Vagrantfile as well as documentation on
`vagrantup.com` for more information on using Vagrant.
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Before proceeding to the next step, make sure Vagrant has created a &lt;code&gt;Vagrantfile&lt;/code&gt;.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;ls -al 
-rw-r--r--   1 kaitlincarter  staff  3024 13:07 Vagrantfile
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;em&gt;Start the virtual machine&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;Now that you have a Vagrantfile that configures your deployment, start the virtual machine.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;vagrant up
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;When the virtual machine is successfully deployed, a notice stating that it is booted and ready will appear.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;==&amp;gt; default: Machine booted and ready!
==&amp;gt; default: Configuring network adapters within the VM...
==&amp;gt; default: Waiting for HGFS to become available...
==&amp;gt; default: Enabling and configuring shared folders...
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Connect to the machine using vagrant ssh and explore the environment.&lt;/p&gt;

&lt;p&gt;Log out to end the SSH session.&lt;/p&gt;

&lt;p&gt;&lt;em&gt;Destroy the virtual machine&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;When you're finished, make sure to tear down the virtual machine. When the CLI prompts you, type y to confirm.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;vagrant destroy
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;





&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;==&amp;gt; default: Stopping the VMware VM...
==&amp;gt; default: Deleting the VM...
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Vagrant allows you to work on any project, install any dependencies required by that project, and set up any networking or synchronized folders so you can continue working from the comfort of your own machine.&lt;/p&gt;

&lt;p&gt;You've just finished creating your first virtual environment with Vagrant. Continue reading to discover more about project setup.&lt;/p&gt;

</description>
      <category>vagrant</category>
      <category>devops</category>
      <category>tutorials</category>
      <category>productivity</category>
    </item>
    <item>
      <title>Dive into Vagrant-02🌱</title>
      <dc:creator>Bhagya</dc:creator>
      <pubDate>Thu, 23 Jun 2022 17:50:06 +0000</pubDate>
      <link>https://community.ops.io/bhagyaaa/dive-into-vagrant-02-3nhm</link>
      <guid>https://community.ops.io/bhagyaaa/dive-into-vagrant-02-3nhm</guid>
      <description>&lt;p&gt;&lt;strong&gt;Why would I use Vagrant?&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;While at its core, Vagrant provides a rather simple function, it may be useful to a wide range of people working on different kinds of tasks.&lt;/p&gt;

&lt;p&gt;For developers, Vagrant makes it easy to create a local environment which mimics the environment upon which your code will eventually be deployed. You can make sure you have the same libraries and dependencies installed, same processes installed, same operating system and version, and many other details without having to sacrifice the way your local machine is set up, and without the lag or cost of creating an external development environment and connecting to it.&lt;/p&gt;

&lt;p&gt;These same benefits make Vagrant appealing to UX and UI designers, who can see how their work will look on a different system, and can even work with their own isolated copy of the system a developer is building without jumping through a lot of hoops.&lt;/p&gt;

&lt;p&gt;It's also an excellent tool for learning a new tool, operating system, or environment without worrying about making a mistake that could jeopardize your present system. Whether you're studying for a certification exam, testing a new deployment script, or simply trying something new, you can be confident that you won't harm anything on your local PC or in a production environment. As an extra plus, once you've mastered the process, you can use the same script and process to deploy to a live environment.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Where can I get Vagrant?&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;There are several ways to obtain Vagrant. It is available from the official download page as a binary package for Linux, Mac, and Windows.&lt;/p&gt;

&lt;p&gt;Vagrant may be found in the default repositories of many Linux distributions and installed like any other piece of software. In Fedora, for example, you can simply run:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;$ sudo dnf install vagrant
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;However, the creators of Vagrant warn that the versions available in some sources are not kept up to date, and using the official installers may result in fewer problems.&lt;/p&gt;

</description>
      <category>vagrant</category>
      <category>devops</category>
      <category>tutorials</category>
      <category>productivity</category>
    </item>
  </channel>
</rss>
