Azure, Flutter, GraphQL, Vue, NuGet

SEP OCT 2019

Design Patterns for Distributed Systems
codemag.com - THE LEADING INDEPENDENT DEVELOPER MAGAZINE
US $8.95  Can $11.95

Implementing GraphQL APIs * VUE.js for jQuery Developers * Azure Machine Learning

DEVintersection / Azure AI Conference - MGM GRAND, LAS VEGAS, NV - NOVEMBER 18-21, 2019
200+ Sessions * 100+ Microsoft and industry experts * Full-day workshops * Evening events
Topics: ASP.NET * Visual Studio * Azure * Artificial Intelligence * .NET Core * Angular * Architecture * Azure Databricks * Azure IoT * Azure Sphere * Big Data * Blazor * C# 8 * Cloud Security * Cognitive Services * CosmosDB * Data Science & VMs * Deep Learning * DevOps * Docker * IoT * Kubernetes * Machine Learning * Microservices * Node.js * Python * React * Security & Compliance * Scalable Architectures * SignalR Core * SQL Server * Visual Studio * Xamarin * and so much more

SCOTT GUTHRIE, Executive Vice President, Cloud + AI Platform, Microsoft
ERIC BOYD, Corporate Vice President, AI Platform, Microsoft

SCOTT HANSELMAN, Principal Program Manager, Web Platform Team, Microsoft
SCOTT HUNTER, Director of Program Management .NET, Microsoft
RICHARD CAMPBELL, Host, .NET Rocks!, Entrepreneur, Rabid Podcaster
DAN WAHLIN, Google GDE, Developer, Advisor, Wahlin Consulting
MARKUS EGGER, President and Chief Software Architect, EPS Software Corp.

JEFF FRITZ, Senior Program Manager, Microsoft
JOHN PAPA, Principal Developer Advocate, Microsoft
ZOINER TEJADA, CEO & Architect, Solliance
MICHELE L. BUSTAMANTE, CIO & Architect, Solliance
KIMBERLY L. TRIPP, President / Founder, SQLskills

GET THE INSIDER VIEW: REGISTER EARLY for a WORKSHOP PACKAGE and receive a choice of Surface Go, Xbox One X, Xbox One S, Surface Headphones, Cortana-enabled Amazon Echo, or hotel gift card! See website for details.

BOB WARD, Principal Architect, Azure Data/SQL Server Team, Microsoft
KATHLEEN DOLLARD, Principal Program Manager, Microsoft
ANNA THOMAS, Data & Applied Scientist, Microsoft
ROBERT GREEN, Technical Evangelist, DPE, Microsoft

Follow us on: twitch.tv/devintersection
Twitter: @DEVintersection * Facebook.com/DEVintersection * LinkedIn.com/company/devintersectionconference/
Twitter: @AzureAIConf * Facebook.com/MicrosoftAzureAIConference * LinkedIn.com/company/microsoftazureaiconf/

Powered by DEVintersection.com
DEVintersection.com 203-264-8220 M-F, 9-4 EDT
AzureAIConf.com 203-264-8220 M-F, 9-4 EDT


TABLE OF CONTENTS

Features

8 Azure Machine Learning Workspace and MLOps
It's when you're working with lots of data that you start looking around for an easier way to keep track of it all. Machine learning and artificial intelligence are the obvious answers, and Sahil shows you why.
Sahil Malik

16 A Design Pattern for Building WPF Business Apps: Part 3
In the third installment of his WPF series, Paul shows you how to get feedback using an Entity Framework entity class. He also shows you how to start expanding user activities, like adding, editing, or deleting screens.
Paul D. Sheriff

24 Responsible Package Management in Visual Studio
If you use a package management tool, like NuGet, Node Package Manager (NPM) for JavaScript, or Maven for Java, you already know how they simplify and automate library consumption. John shows you how to make sure that the packages you download don't cause more troubles than they solve.
John V. Petersen

30 Moving from jQuery to Vue
Even if you don't need the enormity of a SPA, you don't have to lose the benefits of a framework. Shawn recommends using Vue to simplify the code and make it both more reliable and more testable.
Shawn Wildermuth

36 Intro to GraphQL for .NET Developers: Schema, Resolver, and Query Language
Peter introduces you to GraphQL so your REST API client list can grow and change without a lot of pain. You can use a strongly typed schema, eliminate over- and under-fetching, and you can get analytics about how clients are really using your API.
Peter Mbanugo

42 Design Patterns for Distributed Systems
Stefano explores using containers for reusable components and patterns to simplify making reliable distributed systems. He leans on microservices to place all functionality within a single application.
Stefano Tempesta

46 Nest.js Step-by-Step: Part 2
Bilal continues showing us just how interesting, useful, and easy it is to integrate Nest.js with TypeORM. You'll get to replace mock data from the first article with real data this time, too.
Bilal Haidar

54 Cross-Platform Mobile Development Using Flutter
Using Flutter, Google's latest cross-platform framework for developing iOS and Android apps, Wei-Meng shows you how easy developing mobile apps can be.
Wei-Meng Lee

70 Add File Storage to Azure App Services: The Work-Around
When maintaining the hierarchy of a file system and integrating security limits you to a single point of access, you might have some heavy lifting to do while you wait for Microsoft to supply a tool to automate this task. Mike and his team found a great work-around that will keep you happy until the tool is available.
Mike Yeager

Columns

74 Managed Coder: On Time
Ted Neward

Departments

6 Editorial
38 Advertisers Index
73 Code Compilers

US subscriptions are US $29.99 for one year. Subscriptions outside the US pay US $49.99. Payments should be made in US dollars drawn on a US bank. American Express, MasterCard, Visa, and Discover credit cards are accepted. Bill Me option is available only for US subscriptions. Back issues are available. For subscription information, send e-mail to [email protected] or contact Customer Service at 832-717-4445 ext. 10. Subscribe online at www.code-magazine.com CODE Component Developer Magazine (ISSN # 1547-5166) is published bimonthly by EPS Software Corporation, 6605 Cypresswood Drive, Suite 300, Spring, TX 77379 U.S.A. POSTMASTER: Send address changes to CODE Component Developer Magazine, 6605 Cypresswood Drive, Suite 300, Spring, TX 77379 U.S.A.


EDITORIAL

Code Smells Are Universal

Over the years, I've become fluent in several programming languages: C#, JavaScript, Visual Basic .NET, Ruby, FoxPro, and a few others. Last month, I started the process of adding Python to my repertoire because my development team is currently in the process of building a data processing platform.

This platform pulls data from multiple sources of data and uses Python (with its rich ecosystem of statistical libraries) to run various models over the data. I was tasked with integrating these Python modules into our ETL pipeline, so I asked the data analyst for a copy of the code to determine first, how it works and second, how I was going to integrate this code into our pipeline.

I spent some time with the developer. The smell of the code became apparent rather quickly. When developing the code, the analyst implemented a metadata-driven approach to loading and running modules for each client. The application looked up the client code and used the parameters attached to that client to make it simple to maintain.

At first blush, this was a good sign. This code "smells" rather nice. Upon further digging, I found some code that has a distinctly unpleasant odor. The main program accepted a number of dynamic command arguments. These parameters were read and assigned to different memory variables. Okay, so far so good. Where was the smell? The smell came from a called module that reread the command line arguments:

start_date = "'%s'" % sys.argv[5]
end_date = "'%s'" % sys.argv[6]

It didn't look correct to me. It shouldn't be the job of the called program to reread the command-line parameters from the calling module's argument list. This was a definite smell to me.

I know that THIS is not an interesting story. The interesting part is that I was able to identify a code smell in an unfamiliar programming language. You see: Code Smells are Universal. Let's take a look at some JavaScript code used to validate the format of a date string in Figure 1. For reference, the correct format of the string is as follows: 1977-05-25 01:30 pm

This code has several different smells. First, it has a bit of stinker code in that it uses brute force to validate a date time string. Can you think of better ways to write this validation? The first idea that comes to mind is that this code could probably be handled by a regular expression. So, does this code have a bad or a good smell?

When it comes to code, whether it has a good or bad smell is a subjective thing. This code is probably a mix of both. The bad smell comes from its brutish nature. It basically validates each character one at a time. The good part is the intention of the code; when an error does occur, the code tells the user EXACTLY what’s wrong with the time string.
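The JavaScript from Figure 1 isn't reproduced here, but as a rough sketch of the regular-expression idea, here is what that validation might look like in Python instead. The pattern and the function name are mine, and it checks format only, not calendar validity:

import re

# Matches strings such as "1977-05-25 01:30 pm":
# a YYYY-MM-DD date, a space, a 12-hour time, and an am/pm suffix.
DATE_FORMAT = re.compile(r"^\d{4}-\d{2}-\d{2} (0[1-9]|1[0-2]):[0-5]\d [ap]m$")

def is_valid_date_string(value):
    # Return True if the string matches the expected date/time format.
    return bool(DATE_FORMAT.match(value))

print(is_valid_date_string("1977-05-25 01:30 pm"))  # True
print(is_valid_date_string("1977-5-25 1:30"))       # False

The trade-off is the one described above: the regular expression is far shorter, but it can't tell the user which character is wrong the way the brute-force version can.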

Finally, other smells can be determined by answering the following questions:

• Does the code work as designed?
• Is the code maintainable?
• Is the code understandable?

In my judgement, the answers to these questions for this bit of code are yes. Even if you don't write a lot of JavaScript code, can you decide for yourself whether the code is any good or not? What comments would you make about this code? Tell you what: Ping me at @rodpaddock on Twitter. I'd love to hear your comments about this code, good or bad. Please be kind though.

After spending some time thinking about the Python code, I came to the realization that most programming falls back on the old premise: It's the concept that matters. By spending time mastering concepts, I've been able to master multiple languages. And now I've also found a new superpower: the ability to look at code in unfamiliar languages and determine whether or not it has code smell, both good and bad.

Rod Paddock

Figure 1: Validating the format of a date string.

ONLINE ADVERTORIAL

Screen Grabber Pro: The Best Screen Recorder Record screen activities easily with an all-purpose desktop recorder.

Looking for a simple yet innovative way to capture video demos, gaming activities, and video tutorials from your PC? All you need is AceThinker Screen Grabber Pro. AceThinker Screen Grabber Pro is a premiere screen and audio recording software that's supported by both Windows and macOS. It's designed to provide optimum performance in recording high-quality video and audio, regardless of the type of recording situation. The tool is especially useful for long gaming videos and comprehensive video demonstrations. All of these features are included within a single payment option which varies, depending on the plan that suits the needs of the users. To learn more about digital solutions from AceThinker, please visit AceThinker's website at https://acethinker.com/.

Why Acethinker Screen Grabber Pro?

• Record all desktop activities: Equipped with different recording modes, AceThinker Screen Grabber Pro can record the entire screen area, a specific area, an application window, and more. Aside from the desktop screen, the tool can also capture audio from the system and microphones simultaneously. This is essential for people who make instructional videos as they can incorporate audio directly onto the video.

• Create scheduled task: The tool has a task scheduler option that enables the users to set a specific time to record automatically. This is an efficient way to record live-streams, webinars, or the Internet activity of your kids, and to schedule regular recordings even if you're not around.

• Edit video during and after recording: Annotate while recording with the built-in editing panel of the tool. There are various video enhancement options available that can be added as the recording progresses. This enables you to process the video easily and saves a lot of time and effort in post-editing.

• Save and share screencast: After recording the video, you can convert the recorded videos into desired formats for watching on various devices. You can also upload them to a cloud server or share your videos on sites like YouTube and more.

About AceThinker Software

AceThinker Limited was established in 2015 and continues to provide digital multimedia solutions to many households and businesses. Over the years, Acethinker Limited steadily gained popularity by releasing essential multimedia tools that provide different solutions to various situations. Acethinker Screen Grabber Pro is the premiere offering of AceThinker Limited since its launch. To learn more about the software, please visit https://acethinker.com/desktop-recorder or scan the QR code with your smart phone.

FIND OUT MORE AT ACETHINKER.COM/DESKTOP-RECORDER

ONLINE QUICK ID 1909021

Azure Machine Learning Workspace and MLOps

In my previous article (https://www.codemag.com/Article/1907021/Azure-Machine-Learning-Service), I discussed the Azure Machine Learning Service. The Azure Machine Learning Service is at the core of custom AI. But what really ties it together is the Azure Machine Learning workspace. The process of AI involves working with lots of data, cleaning the data, writing and

running experiments, publishing models, and finally collecting real-world data and improving your models. The machine learning workspace provides you and your co-workers with a collaborative environment where you can manage every aspect of your AI projects. You can also use role-based security to define roles within your teams, you can check historical runs, versions, logs, etc., and you can even tie it to your Azure DevOps repos and fully automate this process via ML Ops.

Sahil Malik
www.winsmarts.com
@sahilmalik
Sahil Malik has been a 15-year Microsoft MVP, INETA speaker, a .NET author, consultant and trainer. Sahil loves interacting with fellow geeks in real time. His talks and trainings are full of humor and practical nuggets. You can find him at @sahilmalik or on his website at https://www.winsmarts.com

In this article, I'll introduce you to all of these and more.

Provision an ML Workspace
Creating an ML workspace is extremely easy. Log into portal.azure.com using an account with a valid Azure subscription, search for Machine Learning Service Workspace, and click on the Create button in the provided blade. You'll be asked to provide a name; for the purposes of this article, choose to create it in a new resource group. The names I picked were sahilWorkspace for the name of the workspace and ML for the name of the resource group. And in just about a minute or so, your Azure Machine Learning service is created.

You may also create an Azure Machine Learning service workspace using the Azure CLI. In order to do so, you first must install the Azure CLI machine learning extension using the command:

az extension add -n azure-cli-ml

You can then create an Azure Machine Learning workspace like this:

az group create -n ML -l eastUS
az ml workspace create -w sahilWorkspace -g ML

Once the workspace is created, you'll notice a number of newly created resources in your subscription, as can be seen in Figure 1.

Figure 1: Newly created resources after you provision an ML workspace

As you can see from Figure 1, the Azure Machine Learning workspace depends on a number of other services in Azure. It needs a storage account where it stores details of runs, experiments, logs, etc. It needs Application Insights to provide you with an inflight recorder. It uses a key vault and managed identities to securely talk to all resources it needs. Behind the scenes, you'll also see service principals backing the managed identities. You shouldn't be changing the permissions of those service principals manually or you'll ruin it all.

As you continue to use your machine learning workspace, you'll notice that new resources get created or removed. You'll especially see loads of resources appear when you provision an AKS cluster to serve your models.

Walkthrough of the ML Workspace
At this time, you've only created a workspace; you haven't yet put anything in it. So before you go much further, let's examine the major components of the ML workspace. I won't dive into every single aspect here, but just focus on the interesting major players. Go ahead and visit the workspace. Within the workspace, you should see a section like that shown in Figure 2.

Figure 2: Left hand navigation of the Azure Machine Learning workspace

As can be seen in Figure 2, the Activity Log is a great place to learn what activities have been performed in the workspace. Remember, you're not the only one using this workspace; it's a collaborative area that you share with your co-workers. When an experiment goes awry and starts giving out awful results, this is where you can go and find out exactly what happened recently.

Remember, AI projects need to be secured just like any other project. Perhaps even more so, because as we move forward in time, we will rely more, not less, on AI. In fact, AI systems will be used to hack non-AI systems, such as your friendly local powerplant. It's crucial that you know and preserve a history of activities going on in your environment.

The second interesting thing you see here is the Access Control (IAM) section. The Azure Machine Learning workspace relies on the usual Azure Identity and Access Management (IAM) to secure resources and provide resources. You can define your own roles as well, but the Azure Machine Learning workspace comes with numerous useful prebuilt roles. For instance, you don't want just anyone to deploy a model, right? Additionally, perhaps you want the log readers, well, to just read (not edit, not even accidentally) the experiment. All of this can be neatly tied down using regular Azure IAM.

Perhaps a superfluous point here is that the Azure Machine Learning workspace is part of the Azure portal. It's therefore protected by your Azure AD and gains all the benefits of Azure AD, such as MFA, advanced threat protection, integration with your corporate on-premises identities, etc.

Listing 1: The regression experiment

from sklearn.datasets import load_diabetes
from sklearn.linear_model import Ridge
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import train_test_split
from azureml.core.run import Run
from sklearn.externals import joblib
import os
import numpy as np
import mylib

os.makedirs('./outputs', exist_ok=True)

X, y = load_diabetes(return_X_y=True)

run = Run.get_context()

X_train, X_test, y_train, y_test = \
    train_test_split(X, y, test_size=0.2, random_state=0)
data = {"train": {"X": X_train, "y": y_train},
        "test": {"X": X_test, "y": y_test}}

alphas = mylib.get_alphas()

for alpha in alphas:
    # Use Ridge algorithm to create a regression model
    reg = Ridge(alpha=alpha)
    reg.fit(data["train"]["X"], data["train"]["y"])

    preds = reg.predict(data["test"]["X"])
    mse = mean_squared_error(preds, data["test"]["y"])
    run.log('alpha', alpha)
    run.log('mse', mse)

    model_file_name = 'ridge_{0:.2f}.pkl'.format(alpha)
    # save model in the outputs folder
    with open(model_file_name, "wb") as file:
        joblib.dump(value=reg,
                    filename=os.path.join('./outputs/', model_file_name))

    print('alpha is {0:.2f}, and mse is {1:0.2f}'.format(alpha, mse))
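Listing 1 imports a small helper module, mylib, that isn't reproduced in the article. A minimal stand-in is sketched below so the listing runs as-is; the exact alpha values are my own assumption, since the helper only needs to return the regularization strengths to sweep:

# mylib.py: stand-in for the helper module referenced by Listing 1
import numpy as np

def get_alphas():
    # Regularization strengths for the Ridge models; values chosen for illustration
    return np.arange(0.05, 1.05, 0.05)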

Publish and Deploy Using Azure CLI
The next important section is the assets section, as can be seen in Figure 3. This area is where you can view and manage your actual work: your experiments, your models, the compute you provision, etc. To understand this section better, let's publish and run an experiment and see the entire process end-to-end.

Figure 3: The assets section of the Azure Machine Learning workspace

Create a Model
Remember that for the purposes of this article, the actual experiment is unimportant. The same instructions apply to any kind of problem you may be attempting to solve. I'll use an openly available diabetes dataset that's available at https://www4.stat.ncsu.edu/~boos/var.select/diabetes.tab.txt. This dataset includes ten baseline variables (age, sex, body mass index, average blood pressure, and six blood serum measurements) that were obtained for each of n = 442 diabetes patients, as well as the response of interest, a quantitative measure of disease progression one year after baseline. Using this data, I can create a simple regression model to predict the progression of the disease in a patient given the ten baseline variables about the patient. The code for this experiment is really straightforward and can be seen in Listing 1.

The next step is to submit this as an experiment run. You can do so easily using the portal, the Azure ML SDK, or via the Azure CLI. I'll show you how to do this using the Azure CLI.

First, attach yourself to the resource group and folder. This command isn't 100% necessary, but it'll help by not requiring you to specify the resource group and folder over and over again every time you wish to execute a command.

az ml folder attach -w sahilWorkspace -g ML

Once you've run the above command, you can now go ahead and request to have an Azure ML compute resource created for you. Note that a compute resource comes in many shapes and sizes. Here, you're creating a standard VM compute with one node. You can create this resource using this command:

az ml computetarget create amlcompute -n mycomputetarget
  --min-nodes 1 --max-nodes 1
  -s STANDARD_D3_V2

It's worth pointing out that the ML workspace gives you full control over virtual network settings, so you can keep this compute resource or associated storage accounts etc. in their own virtual network, away from the prying eyes of the Internet. Your InfoSec team will probably be happy to hear that their valuable and sensitive training data will always be secure.

Once the above command finishes running, you should see a compute resource provisioned for you, as shown in Figure 4.

Figure 4: The newly created compute
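The same compute target can also be provisioned from Python with the Azure ML SDK instead of the CLI. A rough sketch, assuming the config.json written during the workspace step is present:

from azureml.core import Workspace
from azureml.core.compute import AmlCompute, ComputeTarget

ws = Workspace.from_config()

# Mirror the CLI flags: a single STANDARD_D3_V2 node
compute_config = AmlCompute.provisioning_configuration(
    vm_size='STANDARD_D3_V2',
    min_nodes=1,
    max_nodes=1)

compute_target = ComputeTarget.create(ws, 'mycomputetarget', compute_config)
compute_target.wait_for_completion(show_output=True)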

Listing 2: The sklearn.runconfig file

{
    "script": "train-sklearn.py",
    "framework": "Python",
    "communicator": "None",
    "target": "mycomputetarget",
    "environment": {
        "python": {
            "interpreterPath": "python",
            "userManagedDependencies": false,
            "condaDependencies": {
                "dependencies": [
                    "python=3.6.2",
                    "scikit-learn",
                    {
                        "pip": [
                            "azureml-defaults"
                        ]
                    }
                ]
            }
        },
        "docker": {
            "baseImage": "mcr.microsoft.com/azureml/base:0.2.4",
            "enabled": true,
            "gpuSupport": true
        }
    }
}

Listing 3: The dependencies file training-env.yml

name: project_environment
dependencies:
  - python=3.6.2
  - pip:
    - azureml-defaults
    - scikit-learn
    - numpy

The name of the compute resource is important. Now I wish to be able to submit my experiment, and in order to submit it, I need to supply a configuration. This configuration file resides in the .azureml folder in a file called sklearn.runconfig. You can see my sklearn.runconfig in Listing 2. Of special note in Listing 2 is the value of "target". Look familiar? That's the name of the compute target you created earlier.

You also need to provide the necessary dependencies your experiment depends on. I've chosen to provide those in a file called training-env.yml, the contents of which can be seen in Listing 3.

Assuming that you have a config.json in your .azureml folder pointing to the requisite subscription and ML workspace, you can submit an experiment using the following command:

az ml run submit-script
  -c sklearn -e test
  -d training-env.yml
  train-sklearn.py

By running the above command, you'll get a link to a Web view where you can track the status of the submitted run. At this time, you can just wait for this command to finish, or observe the status of the run under the "Experiments" tab under your ML workspace.

Once the run completes, notice that the ML workspace automatically stores a lot of details for the run, as can be seen in Figure 5. Here are some of the details that the Azure ML workspace automatically keeps track of for you.

It stores all the runs, along with who initiated them, when it was run, and whether or not it succeeded. It also plots the metrics as charts for you, so you can visually tell the output of a run.

Under the outputs tab, it stores all logs and outputs. The outputs can be the models, for instance. And finally, as you saw in Figure 5, it stores a snapshot of what was run to produce those outputs, so you have a snapshot in time of what you're about to register and deploy next.

Figure 5: Details of the run
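For completeness, the run you just submitted through the CLI could also have been driven from Python with the Azure ML SDK. A rough sketch follows; it reuses the names from this article and leaves the environment details to the sklearn.runconfig and training-env.yml files already shown:

from azureml.core import Workspace, Experiment, ScriptRunConfig

ws = Workspace.from_config()
experiment = Experiment(workspace=ws, name='test')

# Point the run at the training script and the compute target created earlier
src = ScriptRunConfig(source_directory='.', script='train-sklearn.py')
src.run_config.target = 'mycomputetarget'

run = experiment.submit(src)
run.wait_for_completion(show_output=True)
print(run.get_metrics())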

Register a Model
In the tabs shown in Figure 5, under the Outputs tab, you can find the created models. Go ahead and download any one of the models, which should be a file ending in .pkl. The next thing you need to do is use this file and register the model.

In order to register the model, you can use either the ML SDK, the Azure CLI, or do it directly through the browser UI. If you choose to do this using the Azure CLI, you can simply use the following command:

az ml model register -n mymodel
  -p sklearn_regression_model.pkl -t model.json

This command relies on three inputs. First is the name of the model you're creating, which is mymodel. The model file itself is sklearn_regression_model.pkl. The model.json file is a simple JSON file describing the version and workspace for the model. It can be seen here:

{
  "modelId": "mymodel:2",
  "workspaceName": "sahilWorkspace",
  "resourceGroupName": "ML"
}

Once you run the Azure CLI command successfully, you should see the model registered, as can be seen in Figure 6.

Figure 6: Our newly registered model
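The ML SDK route mentioned above looks roughly like this in Python; a sketch, assuming the downloaded .pkl file sits next to the script:

from azureml.core import Workspace
from azureml.core.model import Model

ws = Workspace.from_config()

model = Model.register(workspace=ws,
                       model_name='mymodel',
                       model_path='sklearn_regression_model.pkl')
print(model.name, model.id, model.version)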

Listing 4: The scoring file

import json
import numpy as np
from sklearn.externals import joblib
from sklearn.linear_model import Ridge
from azureml.core.model import Model
from inference_schema.schema_decorators import input_schema, output_schema
from inference_schema.parameter_types.numpy_parameter_type import NumpyParameterType

def init():
    global model
    model_path = Model.get_model_path('mymodel')
    model = joblib.load(model_path)

input_sample = np.array([[10, 9, 8, 7, 6, 5, 4, 3, 2, 1]])
output_sample = np.array([3726.995])

@input_schema('data', NumpyParameterType(input_sample))
@output_schema(NumpyParameterType(output_sample))
def run(data):
    try:
        result = model.predict(data)
        return result.tolist()
    except Exception as e:
        error = str(e)
        return error

Listing 5: The inference config file

entryScript: score.py
runtime: python
condaFile: scoring-env.yml
extraDockerfileSteps:
schemaFile:
sourceDirectory:
enableGpu: False
baseImage:
baseImageRegistry:

Listing 6: The deployment configuration file

---
containerResourceRequirements:
  cpu: 1
  memoryInGB: 1
computeType: ACI

Deploy a Model
Now that you have a model, you need to convert it into an API so users can call it and make predictions. You can choose to run this model as a local instance for development purposes. Or you can choose to run that container as an Azure container instance for QA testing purposes, or as an AKS cluster for production use.

There are three things you need to deploy your model:

• The entry script, which contains the scoring and monitoring logic. This is simply a Python file with two methods in it. One is to load the model as a global object and the other is to serve predictions. You can see the scoring file entry script in Listing 4.
• The inference config file, which has various configuration information such as: what is the runtime location, what dependencies are you using, etc. You can see the inference configuration I'm using in Listing 5.
• The deployment configuration, which contains information about where you're deploying this endpoint to and under what configuration. For instance, if you're deploying to an Azure container instance or an Azure Kubernetes cluster, you'd include that information here. You can see the deployment configuration I'm using in Listing 6.

The following command will deploy your model to an ACI instance: az ml model deploy -n acicicd -f model.json --ic inferenceConfig.yml --dc aciDeployment.yml --overwrite

Once you run the above command, you should see an image created for you, as you can see in Figure 7.

Figure 7: A newly created image

In each such created image, you're able to see the specific location on which the image resides. This is usually an auto-provisioned Azure container registry, and the workspace authenticates to it using a service principal. You can have more than one deployment per image, and you can track that in the properties of the created image as well.

Additionally, you can find a new deployment created for you, as can be seen in Figure 8.

Figure 8: Newly created deployment

For each deployment, the workspace allows you to track which model the deployment is from and when it was created or updated. This way, you can completely back-trace it to which experiment version and dataset the model came from, and who deployed it. At any point, you can choose to update the deployment, and it will track these changes also.

Finally, as you can see in Figure 9, you can grab the scoring URI for your newly deployed model. It's this scoring URI that your clients can make POST requests to, in order to make predictions against your model.

Figure 9: The scoring URI
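As a quick illustration of such a client call, here is a sketch in Python; the URI below is a placeholder for the one shown in Figure 9, and the input shape matches the sample in the scoring script:

import json
import requests

# Replace with the scoring URI copied from the deployment
scoring_uri = 'http://<your-aci-endpoint>/score'
headers = {'Content-Type': 'application/json'}

payload = json.dumps({'data': [[10, 9, 8, 7, 6, 5, 4, 3, 2, 1]]})
response = requests.post(scoring_uri, data=payload, headers=headers)

print(response.status_code)
print(response.json())  # the predicted disease progression value(s)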

Automating Using ML Ops
So far in this article, I've shown you how to use Azure CLI to run an experiment, create a model, create an image, and deploy a model. In this process, I demonstrated all of the value that Azure Machine Learning workspace adds to the overall process.

But at the center of any AI project is lots of data and algorithms. Data is usually managed in some sort of data store; it could be anything, as long as your code can talk to it. But the brain trust is in the algorithms. The algorithms are written as code, usually Jupyter notebooks. And like any other project, you'll need to source-control them.

Like any other project, you’ll need to source-control algorithms.

A great way to manage any software project is Azure DevOps. It lets you manage all aspects of a software project. Issues are a big part of DevOps, sprint planning is another, and source control is also an important aspect. A rather interesting aspect of DevOps is pipelines. Pipelines let you automate the process of building and releasing your code via steps.

Figure 10: The Azure Resource Manager Service connection

12 Azure Machine Learning Workspace and MLOps codemag.com steps. All of these important facets, code, sprints, issues, SPONSORED SIDEBAR: and pipelines can work together with each other. ® Moving to Azure? An AI project is just like any other software project. It needs CODE Can Help! code, it needs data, it needs issue tracking, it needs testing, it needs automation. And DevOps can help you automate Microsoft Azure is a robust this entire process, end to end. and full-featured cloud platform. Take advantage Instantly Search For AI specifically, you can use MLOps to automate every- of a FREE hour-long CODE thing you’ve seen in this article so far, via a DevOps pipe- Consulting session (yes, Terabytes line. For MLOps to work, there are four main things you need FREE!) to jumpstart your to do. organization’s plans to develop solutions on the First, you need to get your code into the DevOps reposi- Microsoft Azure platform. For more information tory. This is not 100% necessary, because DevOps can work dtSearch’s document filters visit www.codemag.com/ with other source control repositories. However, let’s just support: consulting or email us at say that you get your code in some source code repository [email protected]. • popular file types that DevOps can read from, and because DevOps does come with a pretty good source control repository, perhaps just • emails with multilevel go ahead and use that. attachments • a wide variety of Secondly, install the Machine Learning extension in your DevOps repo from this link https://marketplace.visualstu- • web data dio.com/items?itemName=ms-air-aiagility.vss-services- azureml. Over 25 search options Once this extension is installed, create a new Azure Resource including: Manager Service connection, as can be seen in Figure 10. • efficient multithreaded search Provisioning this connection creates a service principal in • easy multicolor hit-highlighting your Azure tenancy, which has the ability to provision or • forensics options like credit deprovision resources, as needed, in an automated fashion. It’s this service connection, called ML that is used by the card search pipeline.

Finally, create a pipeline with the code as shown in Listing 7. Developers: Let’s walk through what this pipeline is doing. The first thing you note is that it’s using Azure CLI, and it’s doing so using • SDKs for Windows, , the service connection you created earlier. Besides that, it’s macOS running on an Ubuntu agent. • Cross-platform APIs for C++, Java and .NET with It first installs Python 3.6 and then installs all the necessary .NET Standard / .NET Core dependencies that the code depends on. It does so using pip, which is a package installer for python. Then it adds • FAQs on faceted search, the Azure CLI ML extensions. This step is necessary because granular data classification, the agent comes with Azure CLI but doesn’t come with ML Azure and more extensions.

It then attaches itself to the workspace and resource group. This step could be automated further by provisioning and deprovisioning a workspace and resource group as neces- Visit dtSearch.com for sary. • hundreds of reviews and It then creates a compute target, followed by running the case studies experiment, registering the model as an image, and creat- • fully-functional enterprise ing a deployment, and when you’re done, you delete the compute so you don’t have to pay for it. and developer evaluations

All of this is set to trigger automatically if a code change The Smart Choice for Text occurs on the master branch. Retrieval® since 1991 The end result of all this is that as soon as someone commits code into the master, the whole process runs in an auto- 1-800-IT-FINDS mated fashion, and it creates a scoring URI for you to test. You get notified of success and failure, and basically all of www.dtSearch.com the other facilities that Azure DevOps offers.

Listing 7: The DevOps pipeline

trigger:
- master

pool:
  vmImage: 'Ubuntu-16.04'

steps:
- task: UsePythonVersion@0
  displayName: 'Use Python 3.6'
  inputs:
    versionSpec: 3.6

- script: |
    pip install flake8
    pip install flake8_formatter_junit_xml
    flake8 --format junit-xml --output-file $(Build.BinariesDirectory)/flake8_report.xml --exit-zero --ignore E111
  displayName: 'Check code quality'

- task: PublishTestResults@2
  condition: succeededOrFailed()
  inputs:
    testResultsFiles: '$(Build.BinariesDirectory)/*_report.xml'
    testRunTitle: 'Publish test results'

- task: AzureCLI@1
  inputs:
    azureSubscription: 'ML'
    scriptLocation: 'inlineScript'
    inlineScript: 'az extension add -n azure-cli-ml'
    workingDirectory: 'model-training'

- task: AzureCLI@1
  inputs:
    azureSubscription: 'ML'
    scriptLocation: 'inlineScript'
    inlineScript: 'az ml folder attach -w sahilWorkspace -g ML'
    workingDirectory: ''

- task: AzureCLI@1
  inputs:
    azureSubscription: 'ML'
    scriptLocation: 'inlineScript'
    inlineScript: 'az ml computetarget create amlcompute -n mycomputetarget --min-nodes 1 --max-nodes 1 -s STANDARD_D3_V2'
    workingDirectory: 'model-training'

- task: AzureCLI@1
  inputs:
    azureSubscription: 'ML'
    scriptLocation: 'inlineScript'
    inlineScript: 'az ml run submit-script -c sklearn -e test -d training-env.yml train-sklearn.py'
    workingDirectory: 'model-training'

- task: AzureCLI@1
  inputs:
    azureSubscription: 'ML'
    scriptLocation: 'inlineScript'
    inlineScript: 'az ml model register -n mymodel -p sklearn_regression_model.pkl -t model.json'
    workingDirectory: 'model-deployment'

- task: AzureCLI@1
  inputs:
    azureSubscription: 'ML'
    scriptLocation: 'inlineScript'
    inlineScript: 'az ml model deploy -n acicicd -f model.json --ic inferenceConfig.yml --dc aciDeploymentConfig.yml --overwrite'
    workingDirectory: 'model-deployment'

- task: AzureCLI@1
  inputs:
    azureSubscription: 'ML'
    scriptLocation: 'inlineScript'
    inlineScript: 'az ml computetarget delete -n mycomputetarget'
    workingDirectory: ''

Summary
The Azure Machine Learning workspace is an incredible tool for your AI projects. In a real-world AI project, you'll most likely work with multiple collaborators. You will have well-defined roles. Your data will need to be kept secure and you'll have to worry about versions. That's versions not just of your code but also your data, your experiments, details of all your deployments, created models, etc.

The Azure ML workspace automates all of this for you, and it records all of it behind the scenes for you as a part of your normal workflow. Later, if your customers come and ask you a question such as, "Hey, why did you make such prediction at such a time," you can easily trace your steps back to the specific deployment, specific algorithm, specific parameters, and specific input data that caused you to make that prediction.

Did you know that researchers once fooled a Google image recognition algorithm by replacing a single picture of a turtle, so Google would interpret it as a rifle? These kinds of attacks are new to AI. And the ML workspace helps you track all of this kind of thing very well. You still have to put in the work to secure your artifacts end to end, but the ML workspace is a great management tool.

Finally, I showed you how to automate this entire process end to end using an MLOps pipeline like you would do in any other software project.

Until next time!

Sahil Malik


ONLINE QUICK ID 1909031

A Design Pattern for Building WPF Business Applications: Part 3

In parts 1 and 2 of this series on building a WPF business application, you created a new WPF business application using a pre-existing architecture. You added code to display a message while loading resources in the background. You also learned how to load and close user controls on a main window. In part 2 of this series, you displayed a status message by sending a message

from a view model class to the main window. You also displayed informational messages and made them disappear after a specified period. You created a WPF login screen complete with validation.

In part 3 of this series, you'll build a user feedback screen to allow a user to submit feedback about the application. You build a view model and bind an Entity Framework entity class to the screen. The entity class contains data annotations and you learn to display validation messages from any data annotations that fail validation. You also start learning how to build a design pattern for standard add, edit, and delete screens. You build a user list control and a user detail control to display all users in a table, and the detail for each one you click on.

This article is the third in a multi-part series on how to create a WPF business application. Instead of starting completely from scratch, follow along step-by-step with this article. This series of articles is also a Pluralsight.com course you may view at https://bit.ly/2SjwTeb. You can also read the previous articles in the May/June and July/August issues of CODE Magazine (https://www.codemag.com/Magazine/AllIssues).

Paul D. Sheriff
http://www.fairwaytech.com
Paul D. Sheriff is a Business Solutions Architect with Fairway Technologies, Inc. He is also a Pluralsight author. Check out his videos at http://www.pluralsight.com/author/paul-sheriff.

Create a WPF User Feedback Screen
Create a screen for the user to input feedback to your support department about your WPF application, as shown in Figure 1. On this screen, validate the data using the Entity Framework. The rules that fail in EF are going to be converted into validation messages to be displayed in the same manner as presented in the last article.

The user feedback screen (Figure 1) places the labels above each input field. The label style in the StandardStyles.xaml file sets the margin property to 4. However, this would place the labels too far to the right above the input fields. You're going to create a new style just on this screen to move the margin to the left. This style overrides the global Margin setting for labels. Open the UserFeedbackControl.xaml file and locate the UserControl's resources element. Add a new keyed style for labels.

Remove the panel with the text box and button in it that you added in the previous article. There are two columns on this feedback screen; one for the large vertical "Feedback" column, and one for all the input fields. Add the column definitions within the grid as shown in the following code.

Add a Large Vertical Column
You build this column using a border with a linear gradient brush, a label, and an image. Build the large vertical column using the code shown in Listing 1. Add this code just below the closing element. Add 10 row definitions for this new grid, as shown in Listing 2.

Listing 1: Build the large vertical column using a border

Add Labels and Input Fields
Below this new closing element, add the label and text box controls shown in Listing 3. Each of the text box controls is bound to an Entity property that you're going to add to the user feedback view model class later in this post.

Add Buttons
You need a Close button and a Send feedback button just below the input fields. Add a panel element, shown below, in which to place these two buttons. After entering this XAML, create the event procedure for the SendFeedbackButton_Click event by pressing the F12 key while positioned over the "SendFeedbackButton_Click" text in the Click attribute. The CloseButton_Click event procedure was created in a previous article.

"{Binding Path=UserName}" /> Width="Auto"

Listing 9: Load users into an ObservableCollection so any bound control gets notification of changes

using System;
using System.Collections.ObjectModel;
using Common.Library;
using WPF.Sample.DataLayer;

namespace WPF.Sample.ViewModelLayer
{
    public class UserMaintenanceListViewModel : ViewModelBase
    {
        private ObservableCollection<User> _Users =
            new ObservableCollection<User>();

        public ObservableCollection<User> Users
        {
            get { return _Users; }
            set
            {
                _Users = value;
                RaisePropertyChanged("Users");
            }
        }

        public virtual void LoadUsers()
        {
            SampleDbContext db = null;

            try
            {
                db = new SampleDbContext();
                Users = new ObservableCollection<User>(db.Users);
            }
            catch (Exception ex)
            {
                System.Diagnostics.Debug.WriteLine(ex.ToString());
            }
        }
    }
}

Because the UserMaintenanceListViewModel inherits from the ViewModelBase class, you only need to inherit from the UserMaintenanceListViewModel class to get all its functionality as well as that of the ViewModelBase class.

public class UserMaintenanceViewModel
  : UserMaintenanceListViewModel

Modify the User Maintenance User Control
Open the UserMaintenanceControl.xaml file and in the attributes of the UserControl, add a Loaded event.

Loaded="UserControl_Loaded"

Build the solution to ensure that everything compiles correctly. Remove the previous list control from the UserMaintenanceControl.xaml. Open the Toolbox and locate the UserMaintenanceListControl you just created and drag and drop that control within the grid. After dragging the list control onto the maintenance control, add the DataContext for the UserMaintenanceListControl to reference the view model object of the UserMaintenanceControl, as shown in the following code snippet.

DataContext="{StaticResource viewModel}" />

Open the UserMaintenanceControls.xaml.cs file and locate the UserControl_Loaded() event procedure you just created. Call the LoadUsers() method you just added. This method is responsible for loading the users, and because the Users collection is bound to the ListView control, and the user control is bound to the view model on which that Users collection is located, this causes the users to be displayed within the ListView.

private void UserControl_Loaded(object sender,
  System.Windows.RoutedEventArgs e)
{
  _viewModel.LoadUsers();
}

Try It Out
Run the application and click on the Users menu item to see a list of users appear. If you don't see any users appear, check to ensure that there were no errors when loading users. Also, check that you've added some users to the User table in your SQL Server database.

Getting the Sample Code
You can download the sample code for this article by visiting www.CODEMag.com under the issue and article, or by visiting resources.fairwaytech.com/downloads. Select "Fairway/PDSA Articles" from the Category drop-down. Then select "A Design Pattern for Building WPF Business Applications - Part 3" from the Item drop-down.

Display User Detail
In Figure 2, you saw that the bottom of the screen contains the detail for a single user. When you click on a row in the ListView control, you want to display the currently selected user within the details area. Add a new user control named UserMaintenanceDetailControl.xaml within the UserControls folder of the project. Modify the grid element so that it has two columns and six rows, as shown in the code below.

After the closing element, add the label and text box controls shown in Listing 10. In the view model you're going to create in the next section, an Entity property is created of the type User. You can see that the path to the bindings on each text box control is bound to the Entity property followed by the name of a property in the User class.

After the labels and text box controls, add a stack panel (Listing 11) for the Undo and Save buttons. Use Image and TextBlock controls within each Button control to present an image and text to the user for the Save and Undo functionality.

Create a User Detail View Model
In the UserMaintenanceDetailControl user control, you see that you're binding to the properties of an Entity object. This Entity object is going to be in a view model for the details control.

Listing 10: Create the labels and text boxes for the user detail

Listing 11: Bind up all your text boxes so the user can input all their data

Right mouse-click on the ViewModels folder and add a new class named UserMaintenanceDetailViewModel.cs.

After creating this class, inherit from the UserMaintenanceListViewModel from the previous article. This provides you with all the functionality of the UserMaintenanceListViewModel class, plus anything you add to the UserMaintenanceDetailViewModel class. Make the new view model file look like the following.

using WPF.Sample.DataLayer;

namespace WPF.Sample.ViewModelLayer
{
  public class UserMaintenanceDetailViewModel :
    UserMaintenanceListViewModel
  {
  }
}

Override the LoadUsers Method
Open the UserMaintenanceViewModel.cs file and change the inheritance from UserMaintenanceListViewModel to UserMaintenanceDetailViewModel. You now have separate view models for each of the three user controls you've built. Because each view model inherits from the other, from the UserMaintenanceViewModel, you get all of the functionality from the detail and list view models.

public class UserMaintenanceViewModel :
  UserMaintenanceDetailViewModel
{
  ...
}

Modify the User List Control
Open the UserMaintenanceListControl.xaml file and add the SelectedItem attribute to the ListView control. This binds the SelectedItem property to the Entity property in the UserMaintenanceDetailViewModel class. When the user clicks on a new row in the ListView control, this property updates the Entity property. When this property is updated,

After creating this class, inherit from the UserMainte- public class UserMaintenanceViewModel : nanceListViewModel from the previous article. This provides UserMaintenanceDetailViewModel you with all the functionality of the UserMaintenanceList- { ViewModel class, plus anything you add to the UserMainte- ... nanceDetailViewModel class. Make the new view model file } look like the following. Modify the User List Control using WPF.Sample.DataLayer; Open the UserMaintenanceListControl.xaml file and add the SelectedItem attribute to the control. This namespace WPF.Sample.ViewModelLayer binds the SelectedItem property to the Entity property in { the UserMaintenanceDetailViewModel class. When the user public class UserMaintenanceDetailViewModel : clicks on a new row in the ListView control, this property UserMaintenanceListViewModel updates the Entity property. When this property is updated,

22 A Design Pattern for Building WPF Business Applications: Part 3 codemag.com Listing 12: Build a toolbar using buttons and images Images/Trash_Black.png" Images/Edit_Black.png" Images/Save_Black.png" Style="{StaticResource toolbarImage}" />