Azure Archives - PowerD365 (https://powerd365.net/category/azure/)
Training platform for Microsoft business applications

PAT Scopes Now Supported By Azure DevOps
https://powerd365.net/pat-scopes-now-supported-by-azure-devops/
Mon, 21 Nov 2022


To give customers more control over the scope of their personal access tokens (PATs), Microsoft has introduced a significant change: the Azure DevOps team now supports granular PAT scopes. The change is part of Microsoft's ongoing investment in security and is intended to reduce the risks linked to leaked PAT credentials. Until now, all Azure DevOps REST APIs accepted PATs, which at times led customers to consume these APIs using full-scoped PATs. The broad permissions of a full-scoped PAT in the wrong hands pose a serious threat, because they grant access to source code, valuable assets, and production infrastructure.

If you currently use full-scoped PATs, consider migrating to PATs with limited scopes to reduce the impact of a potential security breach. This only takes a few simple steps: the supported granular PAT scope(s) for a given REST API can be found in the Security -> Scopes section of the REST API documentation pages:

[Screenshot: REST API documentation pages]

Granular scopes let a PAT carry only the permissions a specific task needs, which benefits customers because it lets them apply the corresponding control-plane policies. Microsoft has promised a full range of further improvements to help customers secure their DevOps environments. If you have any questions regarding the new DevOps capabilities, you can contact us at any time.
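As a sketch of what using a granular PAT looks like in practice, the snippet below builds the Basic-auth header an Azure DevOps REST call uses. The organization/project placeholders and the PAT value are hypothetical; an API such as "list builds" can be called with a PAT created with only the Build (read) scope rather than full access:

```python
import base64

def ado_auth_header(pat: str) -> dict:
    # Azure DevOps REST APIs accept a PAT via HTTP Basic authentication:
    # empty username, the PAT as the password, base64-encoded.
    token = base64.b64encode(f":{pat}".encode("ascii")).decode("ascii")
    return {"Authorization": f"Basic {token}"}

# URL shape per the Azure DevOps REST API docs; organization and project
# are placeholders you would fill in.
url = ("https://dev.azure.com/{organization}/{project}"
       "/_apis/build/builds?api-version=7.0")
headers = ado_auth_header("my-granular-pat")  # placeholder token
```

The same header works for any Azure DevOps REST endpoint; the only thing that changes with granular scopes is which endpoints the token is allowed to call.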

The post PAT Scopes Now Supported By Azure DevOps appeared first on PowerD365.

Microsoft Intune User-Scope Configuration For Azure Virtual Desktop
https://powerd365.net/microsoft-intune-user-scope-configuration-for-azure/
Wed, 16 Nov 2022


You will be pleased to know that Microsoft has recently announced an update to the Microsoft Intune user-scope configuration for Azure Virtual Desktop multi-session VMs running Windows 10 and Windows 11. This update builds on the announcement made in April 2022, when Microsoft made device-scope configuration for Windows 10 and Windows 11 multi-session generally available.

With this update you can now manage these VMs with the Microsoft Intune settings catalog and the endpoint security blade. You can configure certificates and user-scope policies using the settings catalog, and you can also deploy PowerShell scripts.

Overview of New Capabilities

With Microsoft Intune, the following capabilities are now available on Azure Virtual Desktop:

  • You can create new endpoint security policies for the Windows 10, Windows 11, and Windows Server platforms
  • For multi-session, you can manage both device and user scope
  • Moreover, you can manage multi-session VMs created in the Azure public and Azure Government (US GCC High and DoD environments) clouds.

[Screenshots: create profile; settings picker]

Here is an example of configuring user settings in the Microsoft Intune settings catalog.

[Screenshot: endpoint security]

This is an example of creating a security profile in the Intune endpoint security blade.

More to this!

The new update has the following benefits:

  • Allows users to run multiple concurrent sessions
  • Provides a familiar user experience on Windows 10 and Windows 11
  • Requires no extra payment; the new update is available through existing per-user Microsoft 365 licensing.

The post Microsoft Intune User-Scope Configuration For Azure Virtual Desktop appeared first on PowerD365.

Automate Dataverse Solution Deploying Using Azure Pipeline
https://powerd365.net/automate-dataverse-solution-deploying-using-azure-pipeline/
Thu, 13 Oct 2022


This post will help you build a new pipeline in Azure. A pipeline is very simple to set up and enables you to transfer solutions from one environment to another; later on it can also be used to publish your customizations. One more thing: this post is only helpful if you have already configured the connection between the Power Platform environment and the Azure DevOps environment.

How to Create your Pipeline

  • The first step is to go to the Azure DevOps project and choose Pipelines.
  • There you will create a new pipeline and then click "Use the classic editor to create a pipeline without YAML".
  • Next, you will need to tell Azure Pipelines where you have stored your repository. You can use an Azure Repo or any other repo that fits your needs. Azure Repos is your best option because it is easier to configure than the alternatives.
  • The next step is to choose a template; you can also go for the "empty job" option.

[Screenshots: select a source; select a template]

  • The following step is to add tasks to Agent job 1 to build the new pipeline.

You need to add the following tasks to your agent job:

  • Power Platform Tool Installer
  • Power Platform Export Solution
  • Power Platform Import Solution
  • Power Platform Publish Customization
  • Next, you need to find the Variables tab and edit the window to add a new pipeline variable.
  • You can name this variable after the solution, e.g. "SolutionName", and then set its value to the name of your solution in the Dataverse environment.

[Screenshot: Power Platform Export Solution]

  • Following the step above, you need to go back to the Tasks tab and configure the added tasks
  1. Power Platform Tool Installer

No configuration is done in this step; you just need to add the task and move on to the next one!

  2. Power Platform Export Solution

Here you will export your solution from the source Power Platform environment so that it can later be imported into the target environment.

  • You will need to use the "Service Principal/Client secret" authentication type
  • Once you have selected the required option, select the service connection you configured to connect to the source Power Platform environment
  • For the solution name, use the variable you created earlier
  • Next, for the solution output file, use this value: $(Build.ArtifactStagingDirectory)\$(SolutionName).zip
  • Next, check the box to export as a managed solution if you are transferring the solution to a test or production environment; if you are moving it to another development environment, this step is not needed.
  3. Power Platform Import Solution

[Screenshot: Power Platform Import Solution]

This step is very easy, as you will be importing the solution into your target environment. Select the service connection of the target environment and use this value for the solution input file: $(Build.ArtifactStagingDirectory)\$(SolutionName).zip

  4. Power Platform Publish Customization

This step is also very simple; here you just need to select the service connection of the environment the solution was imported into. The task then publishes all of the solution's XML changes in that environment.
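For reference, the same pipeline can also be expressed in YAML instead of the classic editor. The sketch below is an assumption-laden equivalent: the task names and input names are taken from the Power Platform Build Tools extension for Azure DevOps (verify them against the extension's documentation for your installed version), and the service-connection names are placeholders:

```yaml
# Hypothetical YAML equivalent of the classic pipeline described above.
trigger: none

pool:
  vmImage: 'windows-latest'

variables:
  SolutionName: 'MySolution'   # matches the pipeline variable created above

steps:
- task: PowerPlatformToolInstaller@2

- task: PowerPlatformExportSolution@2
  inputs:
    authenticationType: 'PowerPlatformSPN'
    PowerPlatformSPN: 'Dev-ServiceConnection'    # source environment connection
    SolutionName: '$(SolutionName)'
    SolutionOutputFile: '$(Build.ArtifactStagingDirectory)\$(SolutionName).zip'
    Managed: true

- task: PowerPlatformImportSolution@2
  inputs:
    authenticationType: 'PowerPlatformSPN'
    PowerPlatformSPN: 'Test-ServiceConnection'   # target environment connection
    SolutionInputFile: '$(Build.ArtifactStagingDirectory)\$(SolutionName).zip'

- task: PowerPlatformPublishCustomizations@2
  inputs:
    authenticationType: 'PowerPlatformSPN'
    PowerPlatformSPN: 'Test-ServiceConnection'
```

A YAML pipeline like this can be version-controlled alongside your solution source, which makes the deployment steps reviewable in pull requests.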

 

Congratulations, you are done! Make sure you save your pipeline, and do not forget to test it.

The post Automate Dataverse Solution Deploying Using Azure Pipeline appeared first on PowerD365.

What Is Azure Resource Manager (ARM)? Examine Azure Resource Manager In Detail
https://powerd365.net/what-is-azure-resource-manager-arm-examine-azure-resource-manager-in-detail/
Wed, 23 Mar 2022


What is Azure Resource Manager (ARM) and how does it work?

When working with the Microsoft Azure cloud, you must build Azure resources. To manage these resources, Microsoft created Azure Resource Manager, a more advanced, less expensive, and faster way to deploy, configure, and manage resources on Azure. It creates a managed cloud platform by combining resources such as Azure resource groups, resource providers, and the resources themselves. This can be accomplished using the Azure portal, Azure CLI, Azure PowerShell, and SDKs.

But why did Microsoft introduce Azure Resource Manager in the first place? The answer is that, with the prior basic Azure system administration, you could only manage one resource at a time. The new Azure Resource Manager was created because a complex application, such as the ones we use today, consists of various components such as virtual networks, storage accounts, web apps, databases, database servers, virtual machines, and so on. To manage these kinds of complicated applications, Microsoft released Azure management portal 3.0, which introduced the resource group. You can now manage your components as connected, interdependent parts of a single entity rather than as distinct entities.

It wasn't tough to manage all of the Azure resources in the early days of Microsoft Azure, because most firms didn't have that much deployed into a single Azure subscription. As more enterprises began to use Azure and migrate their systems to the cloud, the demand grew for a better way to organize and manage the massive number of Azure resources that make up dozens or even hundreds of apps. This led to the natural evolution of Azure implementing Azure Resource Groups.

While the "groups" aspect of Azure Resource Groups is the most prominent, they offer a variety of other capabilities and advantages. Here's a sample of some of them:

  • Organize all Azure resources for a single app into a single logical group; an Azure Resource Group is a collection of Azure resources.
  • Manage, deploy, and monitor Azure resources as a group, treating them as application building blocks rather than standalone components.
  • Deployments may be defined using declarative ARM templates.
  • Use role-based access control (RBAC) to safeguard and regulate access to all resources inside a group.
  • Link extra metadata to Azure resources by adding tags to each resource inside a resource group.
  • Billing is clearer, now that you can see the charges for a complete resource group or for resources sharing a single tag.
  • The transition from Azure Service Management to Azure Resource Manager is a step toward a more manageable environment.

Azure Automation

Scriptable control and deployment of Azure resources is possible using the Azure PowerShell and Azure CLI (cross-platform command-line interface) tools. This has been the supported route for Azure automation since the launch of Azure Service Management. With the switch to Azure Resource Management, Azure PowerShell and the Xplat-CLI are still supported, but each tool has new commands that support Azure Resource Management.

As under the previous Azure Service Management, the procedural technique of utilising scripts to automate Azure resource management and deployment is still fully supported via the Azure PowerShell and Azure CLI tools. Procedural automation allows you to create scripts that are executed line by line, from beginning to end. This is a perfectly acceptable approach to automation, and it has long been the industry norm. It can also be a very quick way to get started with automation.

Aside from automation, Azure PowerShell and the Azure CLI provide a rapid command-line interface for interacting with, managing, and deploying Azure resources, giving administrators an alternative to using the Azure portal in a web browser.

Deployment of ARM Templates

ARM templates (Azure Resource Manager templates) are one of the fresh new capabilities offered by Azure Resource Management. These templates are written in JSON and declaratively describe the deployment and configuration of all the Azure resources in a single Azure resource group. This allows for a more straightforward declaration of the Azure resources for an application, while letting Azure work out the sequence in which everything has to be deployed based on the declared dependencies.

Because ARM templates are JSON files, they provide a declarative mechanism for defining Azure deployments. Declaratively defining deployments in this way is known as Infrastructure as Code (IaC). Infrastructure as Code not only lets you submit an ARM template to the Azure portal for deployment, but also makes it easier to use in automated build and deployment scenarios. The ARM template for an application can be checked into a source code repository (TFS, Git, GitHub, etc.) alongside the application code or, for purely infrastructure deployments, kept in its own repository.

By storing ARM templates in a source code repository, versioning and change tracking can be controlled more simply, and changes can be rolled back as required; this also eliminates the need for extra documentation that can easily be overlooked over time.
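As a minimal illustration of the declarative style described above, here is a small ARM template that deploys a single storage account. The API version and SKU shown are examples; adjust them for your subscription, and note that the storage account name is supplied as a parameter:

```json
{
  "$schema": "https://schema.management.azure.com/schemas/2019-04-01/deploymentTemplate.json#",
  "contentVersion": "1.0.0.0",
  "parameters": {
    "storageAccountName": {
      "type": "string",
      "metadata": { "description": "Globally unique storage account name" }
    }
  },
  "resources": [
    {
      "type": "Microsoft.Storage/storageAccounts",
      "apiVersion": "2022-09-01",
      "name": "[parameters('storageAccountName')]",
      "location": "[resourceGroup().location]",
      "sku": { "name": "Standard_LRS" },
      "kind": "StorageV2"
    }
  ]
}
```

The template describes only the desired end state; Azure Resource Manager works out whether to create or update the resource when the template is deployed to a resource group.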

DevOps + Azure

Last but not least in this post, but maybe most importantly, all of the above-mentioned capabilities of Azure Resource Manager provide enhanced DevOps integration possibilities. Communication is at the heart of DevOps, and automation is the best approach to convey the setup and deployment of a hosting environment.

The move of Azure administration to Azure Resource Manager, together with the use of ARM templates, has resulted in a significant improvement in Azure's overall DevOps narrative. Microsoft Azure has always offered amazing benefits that lend themselves naturally to DevOps, but Azure Resource Manager completes the Azure + DevOps story in a truly comprehensive way!

If handling Azure resources has been a burden in the past, Azure Resource Manager, resource groups, and ARM templates should be a breath of fresh air. The Microsoft Azure platform just keeps getting better!

The post What Is Azure Resource Manager (ARM)? Examine Azure Resource Manager In Detail appeared first on PowerD365.

A Simple Approach To Get Started With IoT
https://powerd365.net/a-simple-approach-to-get-started-with-iot/
Wed, 23 Mar 2022


An easy way to get started with IoT

The Internet of Things (IoT) uses sensors, devices, connections, services, dashboards, and more to connect the physical and digital worlds. Connecting devices to your apps, as well as managing and monitoring them, can be challenging. Azure IoT Central removes that complexity. It allows you to connect devices to a central location in the cloud with ease, and it comes with industry-specific templates for security, monitoring, and administration. This helps you get up and running quickly while still allowing you to undertake more complex configuration and development if necessary.

In this post, we'll demonstrate how easy it is to get data from a device connected to Azure IoT Central.

Requirements

You’ll need the following things to follow along:

  • A Microsoft Azure subscription (if you don't already have one, sign up for a free account before you start)
  • A smartphone or a phone emulator (iOS or Android) to install the IoT Plug and Play app

Make a new IoT application and link it to a device.

IoT-as-a-service is provided by Azure IoT Central. Let’s give it a go.

1. Sign in to apps.azureiotcentral.com with your Azure account. This is Azure IoT Central's primary portal.

2. Select Build from the left-hand menu.

3. We can create our first IoT application right here. IoT Central offers a variety of application templates to get you started, including templates for certain industries such as retail, healthcare, and energy. We'll make a custom app: click Create app on the Custom app template.

[Screenshot: build your IoT application]

In Azure IoT Central, create a custom app.

4. This brings up the “About your app” tab.

  • IoT Central suggests a name and a URL prefix. You are free to leave them as they are.
  • For the payment plan, select Free.
  • Select your Azure Active Directory under Billing info.
  • Choose the Azure subscription you'd like to use.
  • Choose a location for the application.
  • Click the Create button.

[Screenshot: about your app]

In Azure IoT Central, configure the application information.

Azure IoT Central apps may include device connections, rules, jobs, analytics, and more. We'll keep things simple by only adding a device.

  • Select Devices from the left menu.
  • We don't have any devices linked yet, so select Create a device. This generates a digital representation of a device rather than an actual device or an emulation.
  • You may leave the device information as it is.
  • To create the device in IoT Central, click Create.

[Screenshot: create a new device]

In Azure IoT Central, create a device.

5. Afterwards, in the top menu, click Connect on the device that we just built.

6. This brings up the connection blade.

7. Several forms of authentication can be used to connect a device.

8. The Shared Access Signature (SAS) approach will be used. Click QR Code to see the QR code that we can use to connect our device.

9. On an Android or iOS phone, download the IoT Plug and Play app.

10. On your phone, open the app. It will request that you scan a QR code.

11. Give the app permission to use your camera.

12. Scan the QR code from Azure IoT Central to get started.

13. It will state that it is connected after a brief moment.

[Screenshot: dashboards]

Azure IoT Central is linked to the device.
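Behind the QR code, the app receives the device's connection details and authenticates with a SAS token. As a sketch of what that involves, here is the standard Azure IoT SAS-token construction (sign the URL-encoded resource URI plus an expiry with the base64-decoded device key using HMAC-SHA256). The hub name, device id, and key below are made-up placeholders:

```python
import base64
import hashlib
import hmac
import time
import urllib.parse

def generate_sas_token(resource_uri: str, b64_key: str, ttl_seconds: int = 3600) -> str:
    # Sign "<url-encoded-resource-uri>\n<expiry>" with the base64-decoded
    # device key using HMAC-SHA256, then base64-encode the signature.
    expiry = int(time.time()) + ttl_seconds
    encoded_uri = urllib.parse.quote_plus(resource_uri)
    to_sign = f"{encoded_uri}\n{expiry}".encode("utf-8")
    signature = base64.b64encode(
        hmac.new(base64.b64decode(b64_key), to_sign, hashlib.sha256).digest()
    )
    return (f"SharedAccessSignature sr={encoded_uri}"
            f"&sig={urllib.parse.quote_plus(signature)}&se={expiry}")

# Placeholder hub, device, and key for illustration only.
demo_key = base64.b64encode(b"not-a-real-device-key").decode("ascii")
token = generate_sas_token("myhub.azure-devices.net/devices/mydevice", demo_key)
```

The resulting token is short-lived by design; a device (or the Plug and Play app on its behalf) regenerates it before the expiry time passes.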

Summary

Azure IoT Central is a simple way to get started with IoT without needing to invest in creating and maintaining a solution. Take a look at it!

The post A Simple Approach To Get Started With IoT appeared first on PowerD365.

How To Begin Leveraging Azure Maps
https://powerd365.net/how-to-begin-leveraging-azure-maps/
Wed, 23 Mar 2022


Services for real-time maps

The ability to interact with a map is a common feature in mobile and online applications. That, and more, is possible with Azure Maps.

Many geographic services are available through REST APIs and SDKs in Azure Maps, including:

  • Rendering of map data, including satellite imagery
  • Services that allow you to make bespoke maps, such as indoor maps
  • Elevation services
  • Weather services
  • Transportation services
  • Routing services
  • Mobility services you can use to plan public transport routes
  • Time zone and geolocation services, as well as other features

Azure Maps is a feature-rich service. In this tutorial, we'll create an Azure Maps account and use it to display a simple map on an HTML page.

Basic requirements

You’ll need the following things to follow along:

  • A Microsoft Azure subscription (if you don't already have one, sign up for a free account before you start)

Making a map appear on an HTML page

To begin, we'll use the Azure portal to create an Azure Maps account.

1. Go to the Azure portal

2. To make a resource, click the Create a resource button (the plus-sign in the top left corner)

3. Search for Azure Maps, then click on the "Azure Maps" result and choose Create:

a. Choose a resource group

b. Enter a name

c. Choose a pricing tier. For this exercise, the Gen2 tier is sufficient.

d. Check the box to confirm that you agree to the license and privacy policy.

e. To create the Azure Maps account, click Create.

[Screenshot: create Azure Maps account]

In the Azure portal, create an Azure Maps account.

After the Azure Maps account has been created, navigate to it in the Azure portal. We'll need the shared authentication key to use in an HTML page: go to the Authentication menu and copy the primary key.

[Screenshot: Azure Maps authentication]

Authentication keys for Azure Maps

As an example, we'll use an HTML page that displays a map based on search results.

1. Copy the code from the HTML example page on GitHub.

2. Make a file called index.html on your computer.

3. In the index.html file, paste the HTML code.

4. Replace the GetMap function with the code below and fill in your primary authentication key. This code creates a map with the id myMap inside a <div> element.

function GetMap() {
    //Initialize a map instance.
    map = new atlas.Map('myMap', {
        center: [-118.270293, 34.039737],
        zoom: 14,
        view: 'Auto',

        //Add authentication details for connecting to Azure Maps.
        authOptions: {
            //Use an Azure Maps key. Get an Azure Maps key at https://azure.com/maps.
            //NOTE: The primary key should be used as the key.
            authType: 'subscriptionKey',
            subscriptionKey: '<your-primary-key>'
        }
    });
}

5. Save the HTML file and open it in a browser. You should see a map and a search box. Search for a location or object, then click on a result.

[Screenshot: search for a place]

The HTML page includes a map and a search result.

In the JavaScript search method in the HTML file, the page calls the Azure Maps APIs through the Azure Maps JavaScript Web SDK, which it loads from the references below. It also includes a CSS file that helps with map rendering and image support.

<link rel="stylesheet" href="https://atlas.microsoft.com/sdk/javascript/mapcontrol/2/atlas.min.css" type="text/css" />

The search feature works by invoking the searchPOI method of the Azure Maps search API and showing the results as an HTML list.

function search() {
    //Remove any previous results from the map.
    datasource.clear();
    popup.close();
    resultsPanel.innerHTML = '';

    //Use MapControlCredential to share authentication between a map control and the service module.
    var pipeline = atlas.service.MapsURL.newPipeline(new atlas.service.MapControlCredential(map));

    //Construct the SearchURL object.
    var searchURL = new atlas.service.SearchURL(pipeline);

    var query = document.getElementById("search-input").value;
    searchURL.searchPOI(atlas.service.Aborter.timeout(10000), query, {
        lon: map.getCamera().center[0],
        lat: map.getCamera().center[1],
        maxFuzzyLevel: 4,
        view: 'Auto'
    }).then((results) => {
        //Extract the GeoJSON feature collection from the response and add it to the datasource.
        var data = results.geojson.getFeatures();
        datasource.add(data);

        if (centerMapOnResults) {
            map.setCamera({ bounds: data.bbox });
        }
        console.log(data);

        //Create the HTML for the results list: each result shows its name (or
        //address), its type and address, and, where available, a phone number
        //and website. (Markup simplified here; the original page wraps each
        //result in styled elements.)
        var html = [];
        for (var i = 0; i < data.features.length; i++) {
            var r = data.features[i];
            html.push('<li>');
            if (r.properties.poi && r.properties.poi.name) {
                html.push(r.properties.poi.name);
            } else {
                html.push(r.properties.address.freeformAddress);
            }
            html.push('<br/>', r.properties.type, ': ', r.properties.address.freeformAddress);
            if (r.properties.poi) {
                if (r.properties.poi.phone) {
                    html.push('<br/>phone: ', r.properties.poi.phone);
                }
                if (r.properties.poi.url) {
                    html.push('<br/><a href="http://', r.properties.poi.url, '">website</a>');
                }
            }
            html.push('</li>');
        }
        resultsPanel.innerHTML = html.join('');
    });
}

Conclusion

Azure Maps is a suite of geospatial services that lets you render maps and work with traffic, weather, public transport information, geofencing, and more. Azure Maps services may be accessed via the Web and Android SDKs, as well as REST APIs. Take a look at it!

The post How To Begin Leveraging Azure Maps appeared first on PowerD365.

In Azure Functions, How To Use Dependency Injection
https://powerd365.net/in-azure-functions-how-to-use-dependency-injection/
Tue, 22 Mar 2022


How to use dependency injection in Azure Functions

For code reuse and clarity, employ dependency injection.

When building an application, you'll use several services and tools throughout your classes. Some of them have complicated constructors, while others are interdependent. By using dependency injection, you can make it easier for classes to obtain an instance of a service and use it without having to instantiate it themselves. This improves the readability of your code and makes it easier to reuse services.

The dependency injection pattern can be used in .NET Azure Functions. We'll look at how it works in this post.

Requirements

You'll need the following things to follow along:

  • A Microsoft Azure subscription (if you don't already have one, sign up for a free account before you start)
  • Visual Studio or VS Code
  • Make sure the Azure development workload is enabled in Visual Studio.

Use the dependency injection pattern in .NET Azure Functions

We'll begin by using Visual Studio to create an Azure Function. If you prefer, you can do this with Visual Studio Code instead.

1. Create a new project with Visual Studio.

2. As the project template, select Azure Functions.

3. Assign a name to the project.

4. On the next screen, select Http trigger, and leave the rest of the parameters as they are.

5. Finally, click Create to start working on your project.

[Screenshot: create a new function]

(In Visual Studio, create an Azure Functions project.)

To demonstrate dependency injection, we'll add some files to the Azure Functions project: an interface, a service that implements the interface, and a Startup class that configures dependency injection.

1. Copy and paste the code below into a new file called ITipService.cs.

public interface ITipService
{
    string GetTip();
}

2. After that, create a TipService.cs file and paste the following code into it. This is the implementation of ITipService; the GetTip method returns the URL of a random Azure Tips and Tricks post.

public class TipService : ITipService
{
    public string GetTip()
    {
        Random tipNumber = new Random();
        return "https://microsoft.github.io/AzureTipsAndTricks/blog/tip"
            + tipNumber.Next(1, 335) + ".html";
    }
}

3. Finally, create a new file called Startup.cs and paste the following code into it, replacing references to the namespace "FunctionApp1" with the namespace of your Function App. This code registers the HttpClient and TipService services with the app. By declaring them here, we can use them through dependency injection throughout the rest of the program. There can be only one class that derives from FunctionsStartup in a Function App.

using Microsoft.Azure.Functions.Extensions.DependencyInjection;
using Microsoft.Extensions.DependencyInjection;

[assembly: FunctionsStartup(typeof(FunctionApp1.Startup))]

namespace FunctionApp1
{
    public class Startup : FunctionsStartup
    {
        public override void Configure(IFunctionsHostBuilder builder)
        {
            builder.Services.AddHttpClient();
            builder.Services.AddSingleton<ITipService>((s) => new TipService());
        }
    }
}

The project should now appear as follows:

[Screenshot: Solution Explorer]

(In Visual Studio, the Azure Function project)

4. The Startup class uses types from NuGet packages that you must install. Right-click the project file and select Manage NuGet packages.

5. Go to Browse, search for Microsoft.Azure.Functions.Extensions, and install it.

6. Do the same for Microsoft.Extensions.Http.

7. Next, change the code in the Function class to the following:

using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;
using Microsoft.Extensions.Logging;
using System.Net.Http;
using System.Threading.Tasks;

namespace FunctionApp1
{
    public class Function1
    {
        private readonly HttpClient _client;
        private readonly ITipService _service;

        public Function1(IHttpClientFactory httpClientFactory, ITipService service)
        {
            this._client = httpClientFactory.CreateClient();
            this._service = service;
        }

        [FunctionName("Function1")]
        public async Task<IActionResult> Run(
            [HttpTrigger(AuthorizationLevel.Function, "get", "post", Route = null)] HttpRequest req,
            ILogger log)
        {
            var url = _service.GetTip();
            var response = await _client.GetAsync(url);
            if (response.StatusCode == System.Net.HttpStatusCode.OK)
            {
                return new RedirectResult(url);
            }
            else
            {
                return new NotFoundResult();
            }
        }
    }
}

This code uses a constructor to inject an IHttpClientFactory and an ITipService into the Function1 class. Both arrive fully instantiated and ready to use; that's where dependency injection shines. The TipService's GetTip method is then called to get the URL of an Azure Tips & Tricks post, and the HttpClient is used to check that the URL's webpage exists. If it does, the function returns a redirect result, which sends the browser to the Azure Tips & Tricks article; otherwise it returns a not-found result.
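The constructor-injection pattern itself is language-agnostic. Here is a minimal sketch of it in Python, purely for illustration; the class and function names are hypothetical stand-ins for the C# types above, not part of the Function App:

```python
# Constructor-based dependency injection, sketched in Python for illustration.
# TipService and Function are hypothetical stand-ins for the C# classes above.

class TipService:
    """Returns the URL of a tip post (hard-coded here to keep the sketch small)."""
    def get_tip(self):
        return "https://example.com/azure-tip"

class Function:
    """Receives its dependencies ready-made instead of constructing them itself."""
    def __init__(self, http_get, tip_service):
        self._http_get = http_get        # injected callable, e.g. a real HTTP client
        self._tip_service = tip_service  # injected service

    def run(self):
        url = self._tip_service.get_tip()
        status = self._http_get(url)
        return ("redirect", url) if status == 200 else ("not_found", None)

# Because dependencies are injected, a fake HTTP client makes testing trivial.
fake_http_get = lambda url: 200
print(Function(fake_http_get, TipService()).run()[0])  # redirect
```

Swapping the fake for a real client changes nothing in `Function` itself, which is exactly the benefit the C# version gets from `IHttpClientFactory` and `ITipService`.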

8. Run the Function App.

9. When the Function App starts, it displays the Function URL, which you can use to trigger the function. Copy the URL and paste it into your browser's address bar.

10. You’ll be sent to an article on the Azure Tips & Tricks website.


(The function takes you to a post on Azure Tips & Tricks.)

Summary

Dependency injection is a software development technique that makes code easier to read and reuse. It can be used in .NET Azure Functions by creating a Startup class that derives from the FunctionsStartup class. Take a look at it!

The post In Azure Functions, How To Use Dependency Injection appeared first on PowerD365.

Interacting Azure Bot With A Simple NFT (Tue, 22 Mar 2022)

Non-fungible tokens (NFTs), which represent unique, non-interchangeable pieces of data, are one of the hottest topics in the blockchain ecosphere right now. They are implemented using smart contracts that can interact with any type of application, both blockchain-based (other smart contracts) and not (like web apps for example).

Chat bots are one of the sorts of applications that will be discussed in this article.

Chat bots simulate human conversations by implementing predetermined sets of actions and reactions that are triggered by a real end user interacting with the bot. Azure Bot Service and Microsoft Bot Framework are one such service-and-framework combination for developing chat bots, and we'll use them in this post to build a bot that queries an Ethereum NFT contract, specifically the CryptoPunks NFT contract.

Preconditions

Because we will be developing the bot in C# and deploying it to Azure, the following software needs to be installed on the workstation:

Azure CLI
Bot Framework Emulator — useful for local debugging
Nethereum code generator — to produce C# code for interacting with the Ethereum contract; it can be installed by issuing the following command (requires the .NET SDK):

dotnet tool install -g Nethereum.Generator.Console

Finally, we'll need the project template for an empty C# bot, which can be installed by running the following command (the .NET SDK is required):

dotnet new -i Microsoft.Bot.Framework.CSharp.EmptyBot

The Robot

To begin, we will create a new empty folder for our source code and then run the following command to build the basic bot project:

dotnet new emptybot --name NftBot --output .

Next, we’ll need to get the CryptoPunks contract ABI and bytecode, which can be accessed on Etherscan at the following link: https://etherscan.io/address/0xb47e3cd837ddf8e4c57f05d70ab865de6e193bbb#code

Copy the contents of “Contract ABI” into a file named CryptoPunks.abi and the contents of “Contract Creation Code” into a file called CryptoPunks.bin, then store them in the Contracts subdirectory.

By executing the following command, we can now utilise Nethereum to produce C# code from these two files:

Nethereum.Generator.Console generate from-abi -abi ./Contracts/CryptoPunks.abi -bin ./Contracts/CryptoPunks.bin -cn CryptoPunks -ns NftBot.Contracts -o ./Contracts

We’ll also need to install Nethereum.Web3 in order for the application to have all of the dependencies it needs to use the produced code:

dotnet add package Nethereum.Web3


For brevity, I will only discuss the code for the bot's actual implementation. The exact details of wiring the components together using dependency injection are available in the aforementioned GitHub repository.

The bot will be implemented in a file named NftBot.cs, which will be put in the Bots subdirectory, and it will include two overrides for the base ActivityHandler class:

OnMembersAddedAsync is triggered when a user joins the conversation:

protected override async Task OnMembersAddedAsync(IList<ChannelAccount> membersAdded, ITurnContext<IConversationUpdateActivity> turnContext, CancellationToken cancellationToken)
{
    var welcomeText = "Hello and welcome!";
    foreach (var member in membersAdded)
    {
        if (member.Id != turnContext.Activity.Recipient.Id)
        {
            await turnContext.SendActivityAsync(MessageFactory.Text(welcomeText, welcomeText), cancellationToken);
        }
    }
}

OnMessageActivityAsync is triggered when a user sends a message in the chat. This is where the bot interacts with the CryptoPunks contract:

protected override async Task OnMessageActivityAsync(ITurnContext<IMessageActivity> turnContext, CancellationToken cancellationToken)
{
    var punkIndex = BigInteger.Parse(turnContext.Activity.Text);
    var punkIndexValid = punkIndex >= LowerIndexBound && punkIndex <= UpperIndexBound;
    var replyText = !punkIndexValid
        ? "Please use punk indexes between 0 and 9999"
        : await ProcessSaleOfferResult(punkIndex);
    await turnContext.SendActivityAsync(MessageFactory.Text(replyText), cancellationToken);
}
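The reply-selection logic above is simple enough to sketch outside C#. Here it is in Python, as an illustration only, with a stubbed lookup standing in for the contract call (note that the sketch also guards against non-numeric input, which the C# version's BigInteger.Parse would throw on):

```python
# Mirrors the bot's reply selection: validate the punk index, then either
# ask for a valid index or look up the sale offer. The lookup is a stub here.

LOWER_INDEX_BOUND = 0
UPPER_INDEX_BOUND = 9999

def choose_reply(text, lookup_sale_offer):
    try:
        punk_index = int(text)
    except ValueError:
        # The C# code would need BigInteger.TryParse to handle this case.
        return "Please use punk indexes between 0 and 9999"
    if not (LOWER_INDEX_BOUND <= punk_index <= UPPER_INDEX_BOUND):
        return "Please use punk indexes between 0 and 9999"
    return lookup_sale_offer(punk_index)

# A stub standing in for the contract query:
stub = lambda i: f"Punk {i} is not for sale!"
print(choose_reply("42", stub))     # Punk 42 is not for sale!
print(choose_reply("10000", stub))  # Please use punk indexes between 0 and 9999
```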

In addition, there is a helper method for processing the outcome of the contract call:

private async Task<string> ProcessSaleOfferResult(BigInteger punkIndex)
{
    var result = await _cryptoPunksService.PunksOfferedForSaleQueryAsync(punkIndex);
    if (!result.IsForSale)
    {
        return $"Punk {result.PunkIndex} is not for sale!";
    }
    var replyBuilder = new StringBuilder();
    replyBuilder.AppendLine($"Punk {result.PunkIndex} is for sale!");
    replyBuilder.AppendLine($"By seller {result.Seller}");
    replyBuilder.AppendLine($"With a minimum value of {Web3.Convert.FromWei(result.MinValue)} ETH");
    return replyBuilder.ToString();
}
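Web3.Convert.FromWei turns the contract's raw integer amount into ETH: contracts denominate value in wei, and 1 ETH = 10^18 wei. A minimal Python sketch of that conversion (illustrative only, not the Nethereum implementation):

```python
from decimal import Decimal

WEI_PER_ETH = 10**18  # 1 ETH = 10^18 wei

def from_wei(wei_amount):
    """Convert a raw wei integer into ETH, keeping exact decimal precision."""
    return Decimal(wei_amount) / WEI_PER_ETH

print(from_wei(1_500_000_000_000_000_000))  # 1.5
```

Decimal is used instead of float so that large wei amounts convert without rounding error.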

The entire bot file should look something like this:

using System.Collections.Generic;
using System.Numerics;
using System.Text;
using System.Threading;
using System.Threading.Tasks;
using Microsoft.Bot.Builder;
using Microsoft.Bot.Schema;
using Nethereum.Web3;
using NftBot.Contracts.CryptoPunks;
using NftBot.Contracts.CryptoPunks.ContractDefinition;

namespace NftBot.Bots
{
    public class NftBot : ActivityHandler
    {
        private const int LowerIndexBound = 0;
        private const int UpperIndexBound = 9999;

        private readonly CryptoPunksService _cryptoPunksService;

        public NftBot(CryptoPunksService cryptoPunksService)
        {
            _cryptoPunksService = cryptoPunksService;
        }

        protected override async Task OnMessageActivityAsync(ITurnContext<IMessageActivity> turnContext, CancellationToken cancellationToken)
        {
            var punkIndex = BigInteger.Parse(turnContext.Activity.Text);
            var punkIndexValid = punkIndex >= LowerIndexBound && punkIndex <= UpperIndexBound;
            var replyText = !punkIndexValid
                ? "Please use punk indexes between 0 and 9999"
                : await ProcessSaleOfferResult(punkIndex);
            await turnContext.SendActivityAsync(MessageFactory.Text(replyText), cancellationToken);
        }

        protected override async Task OnMembersAddedAsync(IList<ChannelAccount> membersAdded, ITurnContext<IConversationUpdateActivity> turnContext, CancellationToken cancellationToken)
        {
            var welcomeText = "Hello and welcome!";
            foreach (var member in membersAdded)
            {
                if (member.Id != turnContext.Activity.Recipient.Id)
                {
                    await turnContext.SendActivityAsync(MessageFactory.Text(welcomeText, welcomeText), cancellationToken);
                }
            }
        }

        private async Task<string> ProcessSaleOfferResult(BigInteger punkIndex)
        {
            var result = await _cryptoPunksService.PunksOfferedForSaleQueryAsync(punkIndex);
            if (!result.IsForSale)
            {
                return $"Punk {result.PunkIndex} is not for sale!";
            }
            var replyBuilder = new StringBuilder();
            replyBuilder.AppendLine($"Punk {result.PunkIndex} is for sale!");
            replyBuilder.AppendLine($"By seller {result.Seller}");
            replyBuilder.AppendLine($"With a minimum value of {Web3.Convert.FromWei(result.MinValue)} ETH");
            return replyBuilder.ToString();
        }
    }
}

It is now possible to debug the bot by compiling it, launching it in debug mode, and pointing the Bot Framework Emulator at the bot controller's route, which, if not altered from what the template created, should be at http://localhost:[PORT-NUMBER]/api/messages.

Deployment

To deploy the bot, we will first build it in "Release" mode by running:

dotnet build --configuration Release

Then we'll create the Azure resources we need for deployment, beginning by registering an Azure Active Directory application for the bot:

az ad app create --display-name "NftBotApp" --password "[PASSWORD]" --available-to-other-tenants

This command should return an appId field, which, along with the password, should be used to update the appsettings.json file in the following format:

{
  "MicrosoftAppType": "MultiTenant",
  "MicrosoftAppId": "[APP-ID]",
  "MicrosoftAppPassword": "[PASSWORD]",
  "MicrosoftAppTenantId": ""
}

Then we will use one of the Azure Resource Manager templates that the project template generated under the DeploymentTemplates subfolder to create a resource group, an App Service plan, and an application to host our bot, by running the following command (note that the app ID and password are also required here):

az deployment sub create --template-file "./DeploymentTemplates/template-with-new-rg.json" --location eastus --parameters appType="MultiTenant" appId="[APP-ID]" appSecret="[PASSWORD]" botId="nft-bot" botSku=F0 newAppServicePlanName="asp-nft-bot" newWebAppName="app-nft-bot" groupName="rg-nft-bot" groupLocation="eastus" newAppServicePlanLocation="eastus" --name "nft-bot"

Then, use the following command to prepare the project for deployment:

az bot prepare-deploy --lang Csharp --code-dir "." --proj-file-path "NftBot.csproj"

Once the preparation is finished (a new deployment file is added to the root source folder), a ZIP file of the whole source folder (including the build-related subfolders) must be produced (named NftBot.zip) and then used in the deployment by running:

az webapp deployment source config-zip --resource-group "rg-nft-bot" --name "app-nft-bot" --src "./NftBot.zip"

This procedure may take a few minutes to complete, but once it's finished, the bot is ready for testing.

Time to test the bot.

To test the bot, go to the newly created Azure Bot resource in the Azure Portal. Open the "Test in Web Chat" window and begin interacting with the bot by giving it CryptoPunk indexes (0–9999 are valid); you'll see that some punks are not for sale while others are, with the seller address and minimum value as advertised by the contract.

Conclusion

This is a very basic overview of how to link Ethereum contracts with Azure Bot implementations.

A real-world solution will most likely also involve actions that change the state of the contract (such as purchasing a CryptoPunk). These are significantly more complex, since they require a wallet-management solution, which is outside the scope of this article (to keep things simple).

The post Interacting Azure Bot With A Simple NFT appeared first on PowerD365.
