Azure, Flutter, GraphQL, Vue, NuGet
SEP/OCT 2019
codemag.com - THE LEADING INDEPENDENT DEVELOPER MAGAZINE - US $8.95 CAN $11.95

Design Patterns for Distributed Systems
Implementing GraphQL APIs
Vue.js for jQuery Developers
Azure Machine Learning

ASP.NET * Visual Studio * Azure * Artificial Intelligence * .NET Core * Angular Architecture * Azure Databricks * Azure IoT * Azure Sphere * Big Data * Blazor * C# 8 * Cloud Security * Cognitive Services * CosmosDB * Data Science & VMs * Deep Learning * DevOps * Docker * IoT * Kubernetes * Machine Learning * Microservices * Node.js * Python * React * Security & Compliance * Scalable Architectures * SignalR Core * SQL Server * Visual Studio * Xamarin * and so much more
200+ Sessions * 100+ Microsoft and industry experts * Full-day workshops * Evening events
MGM GRAND, LAS VEGAS, NV * NOVEMBER 18-21, 2019
SCOTT GUTHRIE, Executive Vice President, Cloud + AI Platform, Microsoft
ERIC BOYD, Corporate Vice President, AI Platform, Microsoft
SCOTT HANSELMAN, Principal Program Manager, Web Platform Team, Microsoft
SCOTT HUNTER, Director of Program Management .NET, Microsoft
RICHARD CAMPBELL, Host, .NET Rocks!, Entrepreneur, Rabid Podcaster
DAN WAHLIN, Google GDE, Advisor, Developer, Wahlin Consulting
MARKUS EGGER, President and Chief Software Architect, EPS Software Corp.
JEFF FRITZ, Senior Program Manager, Microsoft
JOHN PAPA, Principal Developer Advocate, Microsoft
ZOINER TEJADA, CEO & Architect, Solliance
MICHELE L. BUSTAMANTE, CIO & Architect, Solliance
KIMBERLY L. TRIPP, President / Founder, SQLskills
BOB WARD, Principal Architect, Azure Data/SQL Server Team, Microsoft
KATHLEEN DOLLARD, Principal Program Manager, Microsoft
ANNA THOMAS, Data & Applied Scientist, Microsoft
ROBERT GREEN, Technical Evangelist, DPE, Microsoft

GET THE INSIDER VIEW
REGISTER EARLY for a WORKSHOP PACKAGE and receive a choice of Surface Go, Xbox One X, Xbox One S, Surface Headphones, Cortana-enabled Amazon Echo or hotel gift card! See website for details.

Follow us on: twitch.tv/devintersection * Twitter: @DEVintersection * Facebook.com/DEVintersection * LinkedIn.com/company/devintersectionconference/
Twitter: @AzureAIConf * Facebook.com/MicrosoftAzureAIConference * LinkedIn.com/company/microsoftazureaiconf/
Powered by DEVintersection.com
DEVintersection.com 203-264-8220 M-F, 9-4 EDT
AzureAIConf.com 203-264-8220 M-F, 9-4 EDT
TABLE OF CONTENTS

Features

8  Azure Machine Learning Workspace and MLOps
   It's when you're working with lots of data that you start looking around for an easier way to keep track of it all. Machine learning and artificial intelligence are the obvious answers, and Sahil shows you why.
   Sahil Malik

16 A Design Pattern for Building WPF Business Apps: Part 3
   In the third installment of his WPF series, Paul shows you how to get feedback using an Entity Framework entity class. He also shows you how to start expanding user activities, like adding, editing, or deleting screens.
   Paul D. Sheriff

24 Responsible Package Management in Visual Studio
   If you use a package management tool, like NuGet, Node Package Manager (NPM) for JavaScript, or Maven for Java, you already know how they simplify and automate library consumption. John shows you how to make sure that the packages you download don't cause more trouble than they solve.
   John V. Petersen

30 Moving from jQuery to Vue
   Even if you don't need the enormity of a SPA, you don't have to lose the benefits of a framework. Shawn recommends using Vue to simplify the code and make it both more reliable and more testable.
   Shawn Wildermuth

36 Intro to GraphQL for .NET Developers: Schema, Resolver, and Query Language
   Peter introduces you to GraphQL so your REST API client list can grow and change without a lot of pain. You can use a strongly typed schema, eliminate over- and under-fetching, and get analytics about how clients are really using your API.
   Peter Mbanugo

42 Design Patterns for Distributed Systems
   Stefano explores using containers for reusable components and patterns to simplify making reliable distributed systems. He leans on microservices rather than placing all functionality within a single application.
   Stefano Tempesta

46 Nest.js Step-by-Step: Part 2
   Bilal continues showing us just how interesting, useful, and easy it is to integrate Nest.js with TypeORM. You'll get to replace mock data from the first article with real data this time, too.
   Bilal Haidar

54 Cross-Platform Mobile Development Using Flutter
   Using Flutter, Google's latest cross-platform framework for developing iOS and Android apps, Wei-Meng shows you how easy developing mobile apps can be.
   Wei-Meng Lee

70 Add File Storage to Azure App Services: The Work-Around
   When maintaining the hierarchy of a file system and integrating security limits you to a single point of access, you might have some heavy lifting to do while you wait for Microsoft to supply a tool to automate this task. Mike and his team found a great work-around that will keep you happy until the tool is available.
   Mike Yeager

Columns

74 Managed Coder: On Time
   Ted Neward

Departments

6  Editorial
38 Advertisers Index
73 Code Compilers
US subscriptions are US $29.99 for one year. Subscriptions outside the US pay US $49.99. Payments should be made in US dollars drawn on a US bank. American Express, MasterCard, Visa, and Discover credit cards are accepted. Bill Me option is available only for US subscriptions. Back issues are available. For subscription information, send e-mail to [email protected] or contact Customer Service at 832-717-4445 ext. 10. Subscribe online at www.code-magazine.com CODE Component Developer Magazine (ISSN # 1547-5166) is published bimonthly by EPS Software Corporation, 6605 Cypresswood Drive, Suite 300, Spring, TX 77379 U.S.A. POSTMASTER: Send address changes to CODE Component Developer Magazine, 6605 Cypresswood Drive, Suite 300, Spring, TX 77379 U.S.A.
4 Table of Contents codemag.com
EDITORIAL Code Smells Are Universal Over the years, I’ve become fluent in several programming languages: C#, JavaScript, Visual Basic .NET, Ruby, FoxPro, and a few others. Last month, I started the process of adding Python to my repertoire because my development team is currently in the process of building a data processing platform.
This platform pulls data from multiple sources and uses Python (with its rich ecosystem of statistical libraries) to run various models over the data. I was tasked with integrating these Python modules into our ETL pipeline, so I asked the data analyst for a copy of the code to determine first, how it works and second, how I was going to integrate this code into our pipeline.

I spent some time with the developer. The smell of the code became apparent rather quickly. When developing the code, the analyst implemented a metadata-driven approach to loading and running modules for each client. The application looked up the client code and used the parameters attached to that client to make it simple to maintain.

At first blush, this was a good sign. This code "smells" rather nice. Upon further digging, I found some code that has a distinctly unpleasant odor. The main program accepted a number of dynamic command arguments. These parameters were read and assigned to different memory variables. Okay, so far so good. Where was the smell? The smell came from a called module that reread the command-line arguments:

start_date = "'%s'" % sys.argv[5]
end_date = "'%s'" % sys.argv[6]

It didn't look correct to me. It shouldn't be the job of the called program to reread the command-line parameters from the calling module's argument list. This was a definite smell to me.

I know that THIS is not an interesting story. The interesting part is that I was able to identify a code smell in an unfamiliar programming language. You see: Code Smells are Universal. Let's take a look at some JavaScript code used to validate the format of a date string in Figure 1. For reference, the correct format of the string is as follows: 1977-05-25 01:30 pm

This code has several different smells. First, it has a bit of stinker code in that it uses brute force to validate a date time string. Can you think of better ways to write this validation? The first idea that comes to mind is that this code could probably be handled by a regular expression. So, does this code have a bad or a good smell?
When it comes to code, whether it has a good or bad smell is a subjective thing. This code is probably a mix of both. The bad smell comes from its brutish nature. It basically validates each character one at a time. The good part is the intention of the code; when an error does occur, the code tells the user EXACTLY what’s wrong with the time string.
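The regex idea can be sketched quickly. Here is a hypothetical Python version (the code in Figure 1 is JavaScript, and the pattern below is my own sketch, not the magazine's):

```python
import re

# Matches strings like "1977-05-25 01:30 pm": a four-digit year,
# a 01-12 month, a 01-31 day, a 01-12 hour, 00-59 minutes, and am/pm.
DATE_FORMAT = re.compile(
    r"^\d{4}-(0[1-9]|1[0-2])-(0[1-9]|[12]\d|3[01]) "
    r"(0[1-9]|1[0-2]):[0-5]\d (am|pm)$"
)

def is_valid(date_string):
    return bool(DATE_FORMAT.match(date_string))

print(is_valid("1977-05-25 01:30 pm"))  # True
print(is_valid("1977-13-25 01:30 pm"))  # False: month out of range
```

The trade-off is exactly the one discussed above: the regex is compact, but a failed match can't tell the user which part of the string was wrong the way the brute-force version can.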
Finally, other smells can be determined by answering the following questions:

• Does the code work as designed?
• Is the code maintainable?
• Is the code understandable?
In my judgement, the answers to these questions for this bit of code are yes. Even if you don't write a lot of JavaScript code, can you decide for yourself whether the code is any good or not? What comments would you make about this code? Tell you what: Ping me at @rodpaddock on Twitter. I'd love to hear your comments about this code, good or bad. Please be kind though.
After spending some time thinking about the Python code, I came to the realization that most programming falls back on the old premise: It's the concept that matters. By spending time mastering concepts, I've been able to master multiple languages. And now I've also found a new superpower: the ability to look at code in unfamiliar languages and determine whether or not it has code smell, both good and bad.
Rod Paddock
Figure 1: Validating the format of a date string.
ONLINE ADVERTORIAL
Screen Grabber Pro: The Best Screen Recorder Record screen activities easily with an all-purpose desktop recorder.
Looking for a simple yet innovative way to capture video demos, gaming activities, and video tutorials from your PC? All you need is AceThinker Screen Grabber Pro. AceThinker Screen Grabber Pro is a premiere screen and audio recording software supported by both Windows and macOS. It's designed to provide optimum performance in recording high-quality video and audio, regardless of the type of recording situation. The tool is especially useful for long gaming videos and comprehensive video demonstrations. All of these features are included within a single payment option, which varies depending on the plan that suits the needs of the user. To learn more about digital solutions from AceThinker, please visit AceThinker's website at https://acethinker.com/.
Why AceThinker Screen Grabber Pro?
• Record all desktop activities: Equipped with different recording modes, AceThinker Screen Grabber Pro can record the entire screen area, a specific area, an application window, and more. Aside from the desktop screen, the tool can also capture audio from the system and microphones simultaneously. This is essential for people who make instructional videos, as they can incorporate audio directly onto the video.
• Create scheduled tasks: The tool has a task scheduler option that enables users to set a specific time to record automatically. This is an efficient way to record live-streams, webinars, or the Internet activity of your kids, and to schedule regular recordings even if you're not around.
• Edit video during and after recording: Annotate while recording with the built-in editing panel of the tool. There are various video enhancement options available that can be added as the recording progresses. This enables you to process the video easily and saves a lot of time and effort in post-editing.
• Save and share screencasts: After recording the video, you can convert the recorded videos into desired formats for watching on various devices. You can also upload them to a cloud server or share your videos on websites like YouTube and more.
About AceThinker Software
AceThinker Limited was established in 2015 and continues to provide digital multimedia solutions to many households and businesses. Over the years, AceThinker Limited has steadily gained popularity by releasing essential multimedia tools that provide different solutions to various situations. AceThinker Screen Grabber Pro has been the premiere offering of AceThinker Limited since its launch. To learn more about the software, please visit https://acethinker.com/desktop-recorder or scan the QR code with your smart phone.
FIND OUT MORE AT ACETHINKER.COM/DESKTOP-RECORDER

Azure Machine Learning Workspace and MLOps

In my previous article (https://www.codemag.com/Article/1907021/Azure-Machine-Learning-Service), I discussed the Azure Machine Learning Service. The Azure Machine Learning Service is at the core of custom AI. But what really ties it together is the Azure Machine Learning workspace. The process of AI involves working with lots of data, cleaning the data, writing and
running experiments, publishing models, and finally collecting real-world data and improving your models. The machine learning workspace provides you and your co-workers with a collaborative environment where you can manage every aspect of your AI projects. You can also use role-based security to define roles within your teams, you can check historical runs, versions, logs, etc., and you can even tie it to your Azure DevOps repos and fully automate this process via MLOps.

In this article, I'll introduce you to all of these and more.

Sahil Malik
www.winsmarts.com
@sahilmalik

Sahil Malik has been a 15-year Microsoft MVP, INETA speaker, a .NET author, consultant and trainer. Sahil loves interacting with fellow geeks in real time. His talks and trainings are full of humor and practical nuggets. You can find him at @sahilmalik or on his website at https://www.winsmarts.com.

Provision an ML Workspace

Creating an ML workspace is extremely easy. Log into portal.azure.com using an account with a valid Azure subscription, search for Machine Learning Service Workspace, and click on the Create button in the provided blade. You'll be asked to provide a name; for the purposes of this article, choose to create it in a new resource group. The names I picked were sahilWorkspace for the name of the workspace and ML for the name of the resource group. And in just about a minute or so, your Azure Machine Learning service is created.

You may also create an Azure Machine Learning service workspace using the Azure CLI. In order to do so, you first must install the Azure CLI machine learning extension using the command:

az extension add -n azure-cli-ml

You can then create an Azure Machine Learning workspace like this:

az group create -n ML -l eastUS
az ml workspace create -w sahilWorkspace -g ML

Once the workspace is created, you'll notice a number of newly created resources in your subscription, as can be seen in Figure 1.

Figure 1: Newly created resources after you provision an ML workspace

As you can see from Figure 1, the Azure Machine Learning workspace depends on a number of other services in Azure. It needs a storage account where it stores details of runs, experiments, logs, etc. It needs Application Insights to provide you with an inflight recorder. It uses a key vault and managed identities to securely talk to all the resources it needs. Behind the scenes, you'll also see service principals backing the managed identities. You shouldn't change the permissions of those service principals manually or you'll ruin it all.

As you continue to use your machine learning workspace, you'll notice that new resources get created or removed. You'll especially see loads of resources appear when you provision an AKS cluster to serve your models.

Walkthrough of the ML Workspace

At this time, you've only created a workspace; you haven't yet put anything in it. So before you go much further, let's examine the major components of the ML workspace. I won't dive into every single aspect here, but just focus on the interesting major players. Go ahead and visit the workspace. Within the workspace, you should see a section like that shown in Figure 2.

Figure 2: Left hand navigation of the Azure Machine Learning workspace

As can be seen in Figure 2, the Activity Log is a great place to learn what activities have been performed in the workspace. Remember, you're not the only one using this workspace—it's a collaborative area that you share with your co-workers. When an experiment goes awry and starts giving out awful results, this is where you can go and find out exactly what happened recently.

Remember, AI projects need to be secured just like any other project. Perhaps even more so, because as we move forward in time, we will rely more, not less, on AI. In fact, AI systems will be used to hack non-AI systems, such as your friendly local powerplant. It's crucial that you know and preserve a history of activities going on in your environment.

The second interesting thing you see here is the Access Control (IAM) section. The Azure Machine Learning workspace relies on the usual Azure Identity and Access Management (IAM) to secure resources and provide access to them. You can define your own roles as well, but the Azure Machine Learning workspace comes with numerous useful prebuilt roles. For instance, you don't want just anyone to deploy a model, right? Additionally, perhaps you want the log readers, well, to just read—not edit, not even accidentally—the experiment. All of this can be neatly tied down using regular Azure IAM.

Perhaps a superfluous point here is that the Azure Machine Learning workspace is part of the Azure portal. It's therefore protected by your Azure AD and gains all the benefits of Azure AD, such as MFA, advanced threat protection, integration with your corporate on-premises identities, etc.

The Azure Machine Learning workspace is part of the Azure portal and therefore protected by your Azure AD.

Publish and Deploy Using Azure CLI

The next important section is the assets section, as can be seen in Figure 3. This area is where you can view and manage your actual work: your experiments, your models, the compute you provision, etc. To understand this section better, let's publish and run an experiment and see the entire process end-to-end.

Figure 3: The assets section of the Azure Machine Learning workspace

Create a Model

Remember that for the purposes of this article, the actual experiment is unimportant. The same instructions apply to any kind of problem you may be attempting to solve. I'll use an openly available diabetes dataset that's available at https://www4.stat.ncsu.edu/~boos/var.select/diabetes.tab.txt. This dataset includes ten baseline variables: age, sex, body mass index, average blood pressure, and six blood serum measurements that were obtained for each of n = 442 diabetes patients, as well as the response of interest, a quantitative measure of disease progression one year after baseline. Using this data, I can create a simple regression model to predict the progression of the disease in a patient given the ten baseline variables about the patient. The code for this experiment is really straightforward and can be seen in Listing 1.

Listing 1: The regression experiment

from sklearn.datasets import load_diabetes
from sklearn.linear_model import Ridge
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import train_test_split
from azureml.core.run import Run
from sklearn.externals import joblib
import os
import numpy as np
import mylib

os.makedirs('./outputs', exist_ok=True)

X, y = load_diabetes(return_X_y=True)

run = Run.get_context()

X_train, X_test, y_train, y_test = \
    train_test_split(X, y, test_size=0.2, random_state=0)
data = {"train": {"X": X_train, "y": y_train},
        "test": {"X": X_test, "y": y_test}}

alphas = mylib.get_alphas()

for alpha in alphas:
    # Use Ridge algorithm to create a regression model
    reg = Ridge(alpha=alpha)
    reg.fit(data["train"]["X"], data["train"]["y"])

    preds = reg.predict(data["test"]["X"])
    mse = mean_squared_error(preds, data["test"]["y"])
    run.log('alpha', alpha)
    run.log('mse', mse)

    model_file_name = 'ridge_{0:.2f}.pkl'.format(alpha)
    # save model in the outputs folder
    with open(model_file_name, "wb") as file:
        joblib.dump(value=reg,
                    filename=os.path.join('./outputs/',
                                          model_file_name))

    print('alpha is {0:.2f}, and mse is {1:0.2f}'.format(alpha, mse))

The next step is to submit this as an experiment run. You can do so easily using the portal, the Azure ML SDK, or the Azure CLI. I'll show you how to do this using the Azure CLI.

First, attach yourself to the resource group and folder. This command isn't 100% necessary, but it'll help by not requiring you to specify the resource group and folder over and over again every time you wish to execute a command:

az ml folder attach -w sahilWorkspace -g ML

Once you've run the above command, you can go ahead and request to have an Azure ML compute resource created for you. Note that a compute resource comes in many shapes and sizes. Here, you're creating a standard VM compute with one node. You can create this resource using this command:

az ml computetarget create amlcompute -n mycomputetarget --min-nodes 1 --max-nodes 1 -s STANDARD_D3_V2

It's worth pointing out that the ML workspace gives you full control over virtual network settings, so you can keep this compute resource or the associated storage accounts, etc. in their own virtual network, away from the prying eyes of the Internet. Your InfoSec team will probably be happy to hear that their valuable and sensitive training data will always be secure.

Once the above command finishes running, you should see a compute resource provisioned for you, as shown in Figure 4.

Figure 4: The newly created compute

The name of the compute resource is important. Now I wish to be able to submit my experiment, and in order to submit it, I need to supply a configuration. This configuration file resides in the .azureml folder in a file called sklearn.runconfig. You can see my sklearn.runconfig in Listing 2. Of special note in Listing 2 is the value of "target". Look familiar? That's the name of the compute target you created earlier.

Listing 2: The sklearn.runconfig file

{
  "script": "train-sklearn.py",
  "framework": "Python",
  "communicator": "None",
  "target": "mycomputetarget",
  "environment": {
    "python": {
      "interpreterPath": "python",
      "userManagedDependencies": false,
      "condaDependencies": {
        "dependencies": [
          "python=3.6.2",
          "scikit-learn",
          {
            "pip": [
              "azureml-defaults"
            ]
          }
        ]
      }
    },
    "docker": {
      "baseImage": "mcr.microsoft.com/azureml/base:0.2.4",
      "enabled": true,
      "gpuSupport": true
    }
  }
}

You also need to provide the necessary dependencies your experiment depends on. I've chosen to provide those in a file called training-env.yml, the contents of which can be seen in Listing 3.

Listing 3: The dependencies file training-env.yml

name: project_environment
dependencies:
- python=3.6.2
- pip:
  - azureml-defaults
  - scikit-learn
  - numpy

Assuming that you have a config.json in your .azureml folder pointing to the requisite subscription and ML workspace, you can submit an experiment using the following command:

az ml run submit-script -c sklearn -e test -d training-env.yml train-sklearn.py

By running the above command, you'll get a link to a Web view where you can track the status of the submitted run. At this time, you can just wait for this command to finish, or observe the status of the run under the "Experiments" tab under your ML workspace.

Once the run completes, notice that the ML workspace automatically stores a lot of details for the run, as can be seen in Figure 5. Here are some of the details that the Azure ML workspace automatically keeps track of for you. It stores all the runs, along with who initiated them, when each was run, and whether or not it succeeded. It also plots the metrics as charts for you, so you can visually tell the output of a run. Under the Outputs tab, it stores all logs and outputs. The outputs can be the models, for instance. And finally, as you saw in Figure 5, it stores a snapshot of what was run to produce those outputs, so you have a snapshot in time of what you're about to register and deploy next.
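To make the math in Listing 1 concrete without any Azure plumbing, here's a small offline sketch of what a ridge regression fit computes. It uses only NumPy and synthetic data; the closed-form solution shown is standard ridge regression, not the article's code:

```python
import numpy as np

# Ridge regression has the closed-form solution
#   w = (X^T X + alpha * I)^-1 X^T y
# which is what Ridge(alpha=alpha).fit(...) in Listing 1 solves for.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 10))    # 100 samples, ten baseline variables
true_w = np.arange(1.0, 11.0)     # made-up "true" coefficients
y = X @ true_w + rng.normal(scale=0.1, size=100)

def ridge_fit(X, y, alpha):
    n_features = X.shape[1]
    return np.linalg.solve(X.T @ X + alpha * np.eye(n_features), X.T @ y)

w = ridge_fit(X, y, alpha=0.1)
mse = float(np.mean((X @ w - y) ** 2))
print(round(mse, 4))  # small, because the synthetic data is nearly linear
```

A larger alpha shrinks the coefficients toward zero, which is why the experiment in Listing 1 sweeps several alpha values and logs the MSE of each one.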
Figure 5: Details of the run

Register a Model

In the tabs shown in Figure 5, under the Outputs tab, you can find the created models. Go ahead and download any one of the models, which should be a file ending in .pkl. The next thing you need to do is use this file and register the model.

In order to register the model, you can use either the ML SDK or the Azure CLI, or do it directly through the browser UI. If you choose to do this using the Azure CLI, you can simply use the following command:

az ml model register -n mymodel -p sklearn_regression_model.pkl -t model.json

This command relies on three inputs. First is the name of the model you're creating, which is mymodel. The model file itself is sklearn_regression_model.pkl. The model.json file is a simple JSON file describing the version and workspace for the model. It can be seen here:

{
  "modelId": "mymodel:2",
  "workspaceName": "sahilWorkspace",
  "resourceGroupName": "ML"
}

Once you run the Azure CLI command successfully, you should see the model registered, as can be seen in Figure 6.

Figure 6: Our newly registered model

Deploy a Model

Now that you have a model, you need to convert it into an API so users can call it and make predictions. You can choose to run this model as a local instance for development purposes. Or you can choose to run that container as an Azure container instance for QA testing purposes, or as an AKS cluster for production use.

There are three things you need to deploy your model:

• The entry script, which contains the scoring and monitoring logic. This is simply a Python file with two methods in it. One is to load the model as a global object and the other is to serve predictions. You can see the scoring file entry script in Listing 4.
• The inference config file, which has various configuration information such as: what is the runtime location, what dependencies are you using, etc. You can see the inference configuration I'm using in Listing 5.
• The deployment configuration, which contains information about where you're deploying this endpoint to and under what configuration. For instance, if you're deploying to an Azure container instance or an Azure Kubernetes cluster, you'd include that information here. You can see the deployment configuration I'm using in Listing 6.

Listing 4: The scoring file

import json
import numpy as np
from sklearn.externals import joblib
from sklearn.linear_model import Ridge
from azureml.core.model import Model

from inference_schema.schema_decorators \
    import input_schema, output_schema
from inference_schema.parameter_types.numpy_parameter_type \
    import NumpyParameterType

def init():
    global model
    model_path = Model.get_model_path('mymodel')
    model = joblib.load(model_path)

input_sample = \
    np.array([[10, 9, 8, 7, 6, 5, 4, 3, 2, 1]])
output_sample = np.array([3726.995])

@input_schema('data', NumpyParameterType(input_sample))
@output_schema(NumpyParameterType(output_sample))
def run(data):
    try:
        result = model.predict(data)
        return result.tolist()
    except Exception as e:
        error = str(e)
        return error

Listing 5: The inference config file

entryScript: score.py
runtime: python
condaFile: scoring-env.yml
extraDockerfileSteps:
schemaFile:
sourceDirectory:
enableGpu: False
baseImage:
baseImageRegistry:

Listing 6: The deployment configuration file

---
containerResourceRequirements:
  cpu: 1
  memoryInGB: 1
computeType: ACI
The following command will deploy your model to an ACI instance:

az ml model deploy -n acicicd -f model.json --ic inferenceConfig.yml --dc aciDeployment.yml --overwrite

Once you run the above command, you should see an image created for you, as you can see in Figure 7.

Figure 7: A newly created image

In each such created image, you're able to see the specific location on which the image resides. This is usually an auto-provisioned Azure container registry, and the workspace authenticates to it using a service principal. You can have more than one deployment per image, and you can track that in the properties of the created image as well.

Additionally, you can find a new deployment created for you, as can be seen in Figure 8.

Figure 8: Newly created deployment

For each deployment, the workspace allows you to track which model the deployment is from and when it was created or updated. This way, you can completely back-trace it to which experiment version and dataset the model came from, and who deployed it. At any point, you can choose to update the deployment, and it will track these changes also.

Finally, as you can see in Figure 9, you can grab the scoring URI for your newly deployed model. It's this scoring URI that your clients can make POST requests to, in order to make predictions against your model.
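As a sketch of what such a POST request looks like from Python (the endpoint URI below is a placeholder; use the scoring URI from your own deployment):

```python
import json

# Hypothetical scoring URI; copy the real one from the deployment blade.
scoring_uri = "http://<your-deployment>.azurecontainer.io/score"

# Listing 4's input_schema expects a JSON body with a "data" key holding
# rows of the ten baseline variables.
payload = json.dumps({"data": [[10, 9, 8, 7, 6, 5, 4, 3, 2, 1]]})

# With a live endpoint, the call itself would look like this
# (requires the `requests` package):
# import requests
# response = requests.post(scoring_uri, data=payload,
#                          headers={"Content-Type": "application/json"})
# print(response.json())  # a list of predictions, one per input row
```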
Automating Using MLOps

So far in this article, I've shown you how to use the Azure CLI to run an experiment, create a model, create an image, and deploy a model. In this process, I demonstrated all of the value that the Azure Machine Learning workspace adds to the overall process.

But at the center of any AI project is lots of data and algorithms. Data is usually managed in some sort of data store; it could be anything, as long as your code can talk to it. But the brain trust is in the algorithms. The algorithms are written as code, usually Jupyter notebooks. And like any other project, you'll need to source-control them.
Like any other project, you’ll need to source-control algorithms.
A great way to manage any software project is Azure DevOps. It lets you manage all aspects of a software project. Issues are a big part of DevOps, sprint planning is another, and source control is also an important aspect. A rather interesting aspect of DevOps is pipelines. Pipelines let you automate the process of building and releasing your code via steps. All of these important facets, code, sprints, issues, and pipelines, can work together with each other.

Figure 9: The scoring URI

An AI project is just like any other software project. It needs code, it needs data, it needs issue tracking, it needs testing, it needs automation. And DevOps can help you automate this entire process, end to end.

For AI specifically, you can use MLOps to automate everything you've seen in this article so far, via a DevOps pipeline. For MLOps to work, there are four main things you need to do.

First, you need to get your code into the DevOps repository. This is not 100% necessary, because DevOps can work with other source control repositories. However, let's just say that you get your code in some source code repository that DevOps can read from, and because DevOps does come with a pretty good source control repository, perhaps just go ahead and use that.

Secondly, install the Machine Learning extension in your DevOps repo from this link: https://marketplace.visualstudio.com/items?itemName=ms-air-aiagility.vss-services-azureml.

Once this extension is installed, create a new Azure Resource Manager Service connection, as can be seen in Figure 10.

Figure 10: The Azure Resource Manager Service connection
Provisioning this connection creates a service principal in your Azure tenancy, which has the ability to provision or deprovision resources, as needed, in an automated fashion. It's this service connection, called ML, that is used by the pipeline.
Finally, create a pipeline with the code as shown in Listing 7. Let's walk through what this pipeline is doing. The first thing you note is that it's using the Azure CLI, and it's doing so using the service connection you created earlier. Besides that, it's running on an Ubuntu agent.

It first installs Python 3.6 and then installs all the necessary dependencies that the code depends on. It does so using pip, which is a package installer for Python. Then it adds the Azure CLI ML extension. This step is necessary because the agent comes with the Azure CLI but doesn't come with the ML extension.
It then attaches itself to the workspace and resource group. This step could be automated further by provisioning and deprovisioning a workspace and resource group as necessary.

It then creates a compute target, followed by running the experiment, registering the model as an image, and creating a deployment, and when you're done, you delete the compute so you don't have to pay for it.
All of this is set to trigger automatically if a code change occurs on the master branch. The end result of all this is that as soon as someone commits code into the master branch, the whole process runs in an automated fashion, and it creates a scoring URI for you to test. You get notified of success and failure, and you get all of the other facilities that Azure DevOps offers.
Listing 7: The DevOps pipeline

trigger:
- master

pool:
  vmImage: 'Ubuntu-16.04'

steps:
- task: UsePythonVersion@0
  displayName: 'Use Python 3.6'
  inputs:
    versionSpec: 3.6

- script: |
    pip install flake8
    pip install flake8_formatter_junit_xml
    flake8 --format junit-xml --output-file $(Build.BinariesDirectory)/flake8_report.xml --exit-zero --ignore E111
  displayName: 'Check code quality'

- task: PublishTestResults@2
  condition: succeededOrFailed()
  inputs:
    testResultsFiles: '$(Build.BinariesDirectory)/*_report.xml'
    testRunTitle: 'Publish test results'

- task: AzureCLI@1
  inputs:
    azureSubscription: 'ML'
    scriptLocation: 'inlineScript'
    inlineScript: 'az extension add -n azure-cli-ml'
    workingDirectory: 'model-training'

- task: AzureCLI@1
  inputs:
    azureSubscription: 'ML'
    scriptLocation: 'inlineScript'
    inlineScript: 'az ml folder attach -w sahilWorkspace -g ML'
    workingDirectory: ''

- task: AzureCLI@1
  inputs:
    azureSubscription: 'ML'
    scriptLocation: 'inlineScript'
    inlineScript: 'az ml computetarget create amlcompute -n mycomputetarget --min-nodes 1 --max-nodes 1 -s STANDARD_D3_V2'
    workingDirectory: 'model-training'

- task: AzureCLI@1
  inputs:
    azureSubscription: 'ML'
    scriptLocation: 'inlineScript'
    inlineScript: 'az ml run submit-script -c sklearn -e test -d training-env.yml train-sklearn.py'
    workingDirectory: 'model-training'

- task: AzureCLI@1
  inputs:
    azureSubscription: 'ML'
    scriptLocation: 'inlineScript'
    inlineScript: 'az ml model register -n mymodel -p sklearn_regression_model.pkl -t model.json'
    workingDirectory: 'model-deployment'

- task: AzureCLI@1
  inputs:
    azureSubscription: 'ML'
    scriptLocation: 'inlineScript'
    inlineScript: 'az ml model deploy -n acicicd -f model.json --ic inferenceConfig.yml --dc aciDeploymentConfig.yml --overwrite'
    workingDirectory: 'model-deployment'

- task: AzureCLI@1
  inputs:
    azureSubscription: 'ML'
    scriptLocation: 'inlineScript'
    inlineScript: 'az ml computetarget delete -n mycomputetarget'
    workingDirectory: ''
Summary

The Azure Machine Learning workspace is an incredible tool for your AI projects. In a real-world AI project, you'll most likely work with multiple collaborators. You will have well-defined roles. Your data will need to be kept secure, and you'll have to worry about versions: versions not just of your code, but also of your data, your experiments, the details of all your deployments, created models, etc.

The Azure ML workspace automates all of this for you, and it records all of it behind the scenes as a part of your normal workflow. Later, if your customers come and ask you a question such as, "Hey, why did you make such a prediction at such a time?" you can easily trace your steps back to the specific deployment, specific algorithm, specific parameters, and specific input data that caused you to make that prediction.

Did you know that researchers once fooled a Google image recognition algorithm by subtly altering a picture of a turtle so that Google would interpret it as a rifle? These kinds of attacks are new to AI. And the ML workspace helps you track all of this kind of thing very well. You still have to put in the work to secure your artifacts end to end, but the ML workspace is a great management tool.

Finally, I showed you how to automate this entire process end to end using an MLOps pipeline, just as you would in any other software project.

Until next time!

Sahil Malik
ONLINE QUICK ID 1909031

A Design Pattern for Building WPF Business Applications: Part 3

In parts 1 and 2 of this series on building a WPF business application, you created a new WPF business application using a pre-existing architecture. You added code to display a message while loading resources in the background. You also learned how to load and close user controls on a main window. In part 2 of this series, you displayed a status message by sending a message
from a view model class to the main window. You also displayed informational messages and made them disappear after a specified period. You created a WPF login screen complete with validation.

In part 3 of this series, you'll build a user feedback screen to allow a user to submit feedback about the application. You build a view model and bind an Entity Framework entity class to the screen. The entity class contains data annotations, and you learn to display validation messages from any data annotations that fail validation. You also start learning how to build a design pattern for standard add, edit, and delete screens. The rules that fail in EF are going to be converted into validation messages to be displayed in the same manner as presented in the last article.

Paul D. Sheriff
http://www.fairwaytech.com

The user feedback screen (Figure 1) places the labels above each input field. The label styles in the StandardStyles.xaml file set the margin property to 4. However, this would place the labels too far to the right above the input fields. You're going to create a new style just on this screen to move the margin to the left. This style overrides the global Margin setting for labels. Open the UserFeedbackControl.xaml file and locate the
Add Labels and Input Fields

Below this new closing element, add the label and text box controls shown in Listing 3. Each of the text box controls is bound to an Entity property that you're going to add to the user feedback view model class later in this article.
Add Buttons

You need a Close button and a Send Feedback button just below the input fields. Add a
Listing 3: Use labels and text boxes to build inputs for the user feedback screen.

Try It Out

Run the application, log in as a valid user, and click on the Feedback menu item to display the screen. If you've done
Listing 4: Create the appropriate input entity class for the user feedback screen.

using System.ComponentModel.DataAnnotations;
using System.ComponentModel.DataAnnotations.Schema;
using Common.Library;

namespace WPF.Sample.DataLayer
{
  [Table("UserFeedback")]
  public class UserFeedback : CommonBase
  {
    private int _UserFeedbackId;
    private string _Name = string.Empty;
    private string _EmailAddress = string.Empty;
    private string _PhoneExtension = string.Empty;
    private string _Message = string.Empty;

    [Required]
    [Key]
    public int UserFeedbackId
    {
      get { return _UserFeedbackId; }
      set {
        _UserFeedbackId = value;
        RaisePropertyChanged("UserFeedbackId");
      }
    }

    [Required(ErrorMessage = "User Name must be filled in.")]
    public string Name
    {
      get { return _Name; }
      set {
        _Name = value;
        RaisePropertyChanged("Name");
      }
    }

    [Required(ErrorMessage = "Email Address must be filled in.")]
    public string EmailAddress
    {
      get { return _EmailAddress; }
      set {
        _EmailAddress = value;
        RaisePropertyChanged("EmailAddress");
      }
    }

    public string PhoneExtension
    {
      get { return _PhoneExtension; }
      set {
        _PhoneExtension = value;
        RaisePropertyChanged("PhoneExtension");
      }
    }

    [Required(ErrorMessage = "Feedback Message must be filled in.")]
    public string Message
    {
      get { return _Message; }
      set {
        _Message = value;
        RaisePropertyChanged("Message");
      }
    }
  }
}
Listing 5: Convert EF validation objects to ValidationMessage objects

• Create an entity class named UserFeedback.

public List
Go back to the Server Explorer window, right mouse-click on the Tables folder, and select the Refresh menu to see the new table.

Add User Feedback to the Data Layer

Once you have the new table created in the database, you need to perform three more steps to interact with this table.

Add a Method to Convert EF Validation Errors to ValidationMessage Objects

The Entity Framework uses data annotation attributes to generate validation errors automatically for you. It raises an error that contains a collection of validation errors. The structure of this collection doesn't lend itself well to data binding on a WPF screen, so write a method to convert these validation errors to a collection of ValidationMessage objects. Add a few using statements at the top of the SampleDbContext.cs file, as shown below.

using System.Collections.Generic;
using System.Data.Entity.Validation;
using System.Linq;
using Common.Library;

Add a method to the SampleDbContext class, Listing 5, to perform the conversion of the EF validation errors into a collection of ValidationMessage objects. This method takes the ErrorMessage and PropertyName properties from the Entity Framework object and assigns them to a new ValidationMessage object. This object is added to a list of ValidationMessage objects that's returned from this method.

This article isn't going to cover writing that code, but use the code shown in Listing 7 to save the data and display an informational message that the feedback message was sent.

Update Code Behind

Open the UserFeedbackControl.xaml.cs file and locate the SendFeedbackButton_Click() event procedure. Call the SendFeedback() method from this event.

private void SendFeedbackButton_Click(
    object sender, RoutedEventArgs e)
{
    // Send/Save Feedback
    _viewModel.SendFeedback();
}
Modify the User Feedback View Model Class

Add a property named Entity to the UserFeedbackViewModel class to hold the data input on the screen. A Save() method is needed to submit the data to the database. You're also going to add a stub of a SendFeedback() method in case you want to email the feedback to your support department.

Add the Entity Property

Open the UserFeedbackViewModel.cs file and add the following using statements at the top of this file.

using System;
using System.Collections.ObjectModel;
using System.Data.Entity.Validation;
using WPF.Sample.DataLayer;

Add the Entity property that's of the type UserFeedback to the UserFeedbackViewModel class, as shown in the code below.

Listing 6: The Save() method adds user feedback and reports validation errors

public bool Save()
{
    bool ret = false;
    SampleDbContext db = null;

    try
    {
        db = new SampleDbContext();

        // Add user feedback to database
        db.UserFeedbacks.Add(Entity);
        db.SaveChanges();
        ret = true;
    }
    catch (DbEntityValidationException ex)
    {
        ValidationMessages = new ObservableCollection

Try It Out

Run the application and click on the Feedback menu item. Click the Send Feedback button without entering any data to ensure that the validation is working. Next, enter some
Display a List of Users

Right mouse-click on the UserControls folder and add a User Control named UserMaintenanceListControl to this project. Remove the
Within the
Listing 8: A ListView allows you to put buttons within any column you want
Listing 9: Load users into an ObservableCollection so any bound control gets notification of changes

using System;
using System.Collections.ObjectModel;
using Common.Library;
using WPF.Sample.DataLayer;

namespace WPF.Sample.ViewModelLayer
{
    public class UserMaintenanceListViewModel
        : ViewModelBase
    {
        private ObservableCollection

        public virtual void LoadUsers()
        {
            SampleDbContext db = null;

            try
            {
                db = new SampleDbContext();
                Users = new
ViewModel. Because the UserMaintenanceListViewModel inherits from the ViewModelBase class, you only need to inherit from the UserMaintenanceListViewModel class to get all of its functionality as well as that of the ViewModelBase class.

public class UserMaintenanceViewModel
    : UserMaintenanceListViewModel

Modify the User Maintenance User Control

Open the UserMaintenanceControl.xaml file and in the attributes of the

Display User Detail

In Figure 2, you saw that the bottom of the screen contains the detail for a single user. When you click on a row in the ListView control, you want to display the currently selected user within the details area. Add a new user control named

check to ensure that there were no errors when loading users. Also, check that you've added some users to the User table in your SQL Server database.

Getting the Sample Code

You can download the sample code for this article by visiting www.CODEMag.com under the issue and article, or by visiting resources.fairwaytech.com/downloads. Select "Fairway/PDSA Articles" from the Category drop-down. Then select "A Design Pattern
Build the solution to ensure that everything compiles correctly.
Open the UserMaintenanceControls.xaml.cs file and locate the UserControl_Loaded() event procedure you just created. Call the LoadUsers() method you just added. This method is responsible for loading the users, and because the Users collection is bound to the ListView control, and the user control is bound to the view model on which that Users collection is located, this causes the users to be displayed within the ListView.

private void UserControl_Loaded(object sender,
    System.Windows.RoutedEventArgs e)
{
    _viewModel.LoadUsers();
}

Try It Out

Run the application and click on the Users menu item to see a list of users appear. If you don't see any users appear,

After the closing element, add the label and text box controls shown in Listing 10. In the view model you're going to create in the next section, an Entity property is created of the type User. You can see that the path to the bindings on each text box control is bound to the Entity property followed by the name of a property in the User class.

After the labels and text box controls, add a stack panel (Listing 11) for the Undo and Save buttons. Use Image and TextBlock controls within each Button control to present an image and text to the user for the Save and Undo functionality.

Create a User Detail View Model

In the UserMaintenanceDetailControl user control, you see that you're binding to the properties of an Entity object. This Entity object is going to be in a view model for the
Listing 10: Create the labels and text boxes for the user detail
Listing 11: Bind up all your text boxes so the user can input all their data

Override the LoadUsers Method
After creating this class, inherit from the UserMaintenanceListViewModel from the previous article. This provides you with all the functionality of the UserMaintenanceListViewModel class, plus anything you add to the UserMaintenanceDetailViewModel class. Make the new view model file look like the following.

using WPF.Sample.DataLayer;

public class UserMaintenanceViewModel :
    UserMaintenanceDetailViewModel
{
    ...
}

Modify the User List Control

Open the UserMaintenanceListControl.xaml file and add the SelectedItem attribute to the
Listing 12: Build a toolbar using buttons and images
ONLINE QUICK ID 1909041

Responsible Package Management in Visual Studio

Almost nine years ago, a new open source project named NuGet (www.NuGet.org) made its debut and, two years after that debut, NuGet began shipping with Microsoft Visual Studio, where it remains today. NuGet is one of several package managers, like Node Package Manager (NPM) for JavaScript and Maven for Java. Package managers simplify and automate library consumption. For example,
if you need a library to implement JavaScript Object Notation (JSON) capabilities in your .NET application, it takes a few clicks of the mouse and, just like that, your application has powerful capabilities that you didn't have to write, free of charge.

Once upon a time, developers built and maintained their own libraries. If you needed a library, chances were, you asked fellow developers in online communities hosted on CompuServe in the giving spirit that was incident to such communities, and chances were good that you could get a code library to meet your needs or, at the very least, you could get guidance on how to build it.

Today, Open Source Software (OSS) has created an unprecedented availability of code, and package management systems make absorbing that code into your applications a nearly friction-free process. That progress has ushered in not only numerous benefits, but new risks and problems as well. One recent example is the November 2018 Event Stream incident involving NPM (https://blog.npmjs.org/post/180565383195/details-about-the-event-stream-incident). This article addresses how to responsibly leverage NuGet in Visual Studio in a way that mitigates risk.

John V. Petersen
[email protected]
linkedin.com/in/johnvpetersen
Based near Philadelphia, Pennsylvania, John is an attorney, information technology developer, consultant, and author.

In Case You're Not Familiar with NuGet

If you're not familiar with NuGet, what it is, and generally how it works, for additional context, you may want to consult the documentation: https://docs.microsoft.com/en-us/NuGet/what-is-NuGet. If you want the comprehensive documentation PDF, you can download it here: http://bit.ly/NuGetPDF. If you're a Pluralsight subscriber, you may want to watch my Introduction to NuGet course: https://www.pluralsight.com/courses/NuGet.

The concepts presented herein do not require an extensive NuGet understanding. The intended audience includes experienced developers as well as directors and managers tasked with implementing a company's security and risk mitigation policies.

Package Managers and Package Sources

Before delving into the basic package manager concepts in .NET/Visual Studio with NuGet, let's get some context on package managers and packages in general. The following are the core definitions you need to understand:

• Package: An archive file (i.e., a zip or tar file) that contains code artifacts and additional metadata used by a package manager that, in turn, is used by a development environment to add a package's contents to a project.
• Package Manager: A tool that an application development environment (i.e., Visual Studio, Eclipse, etc.) uses to gain access to packages contained in a package source. Common package managers are NuGet, Maven, and Node Package Manager (NPM). Not only does a package manager manage access to a specific package, it also manages the access to other packages that the downloaded package depends upon (dependency management).
• Package Source: A collection of packages that, for each package, contains metadata about that package. Such metadata includes the current version number, release history, links to the source code repository (i.e., GitHub), documentation, and licensing information. Common package sources include NuGet.org, MyGet, and npmjs.com.

No production application or build process should ever take a direct dependency on any public package source.

Companies should build and manage their own packages and the dependencies thereof, and create and use their own package source feeds.

The relationship among these three (packages, package managers, and package sources) is simple: Application development environments use package managers to connect to package sources
and obtain packages to be used in an application development project.

What's the Risk?

Of the three elements in the bulleted list above, risk arises from two: packages and package sources. Package sources like npmjs.com and NuGet.org are open environments to the extent that anybody can get an account and upload a package for others to download. For that reason alone, such open package sources are inherently untrustworthy. Does that mean you should avoid such open sources? Of course not. What it does mean is that when taking packages from such sources, you should perform the necessary due diligence to verify that package's contents. If you can't determine a package's provenance and its contents with certainty, you're exposing your firm to risk that could be otherwise mitigated. A real-world example of risk exposure and the consequences thereof was the Event Stream incident discovered in November 2018. That incident involved malicious code in a package that harvested account information from accounts having Bitcoin balances of a certain level. The Register reported (https://www.theregister.co.uk/2018/11/26/npm_repo_bitcoin_stealer/) that the code was part of a popular NPM library that, on average, was downloaded two million times per week.

Companies should build and manage their own packages and the dependencies thereon, and create and use their own package source feeds. If you leverage a package from a public source, you should open the package and evaluate its contents, and add that package to your own source feed or add the contents to your own package.

Doesn't Package Signing Mitigate the Risk?

In a word, yes, but it's a qualified yes. Signing mitigates some risk, but not all risk. Signing wouldn't have prevented the Event Stream incident. The only thing package signing does is validate the package author/contributor. Indeed, in most environments, you can limit which packages you can take to certain authors. If you have the public key, then only those packages signed with the author's certificate can be taken. However, that doesn't mean you can just take any package from that author. What if the author's certificate was compromised?
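Signing validates the author, not the exact bytes you vetted. A complementary, low-tech safeguard is to record a cryptographic hash of every package at the moment you review it and compare that hash before any reuse. The sketch below is illustrative only (the file name and byte string are made up); it is not a NuGet feature, just standard hashing.

```python
import hashlib

def sha512_hex(data: bytes) -> str:
    """Return the SHA-512 hex digest of a package's raw bytes."""
    return hashlib.sha512(data).hexdigest()

# In practice you would read the .nupkg from disk, e.g.:
#   data = open("My.Sample.Package.1.0.0.nupkg", "rb").read()
# Here we hash a stand-in byte string so the sketch is runnable.
package_bytes = b"pretend this is a .nupkg archive"
actual = sha512_hex(package_bytes)

# Compare against the digest recorded when the package was vetted.
expected = sha512_hex(b"pretend this is a .nupkg archive")
print("verified" if actual == expected else "REJECT: contents changed")
```

If a package is re-uploaded with different contents, even by the legitimate author, the digest changes and the comparison fails, which is exactly the gap signing alone does not cover.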
What if the author made an innocent mistake that ends up with your company sustaining some injury?

Now that you have a background on packages, package managers, and package sources, and the associated risks, let's apply that knowledge to NuGet.

Committee of Sponsoring Organizations of the Treadway Commission (COSO)
http://www.coso.org

If you work for a publicly traded corporation, your company, at least on paper, employs the COSO Enterprise Risk Management Framework. Pick any 10-K (annual report) and there will be a section titled MANAGEMENT'S REPORT ON INTERNAL CONTROL OVER FINANCIAL REPORTING. In that section, there is likely to be a mention of COSO. If your company, on one hand, integrates NuGet packages from public feeds into its financial applications without any vetting and, on the other hand, doesn't disclose ineffective internal controls in its 10-K, it may be reasonable to conclude that a required disclosure is missing and your company's 10-K may be in violation of SEC regulations.

If you can't determine a package's provenance and its contents with certainty, you're exposing your firm to risk that could be otherwise mitigated.

NuGet at a Glance: Creating Your Own NuGet Source

As previously stated, this article is not a comprehensive how-to on NuGet. For that, consult the materials introduced at the beginning of this article. Just like packages, package managers, and package sources in general, NuGet follows the same approach. In Visual Studio, there is the NuGet Package Manager, illustrated in Figure 1.

On one hand, open package sources make code easily available. On the other hand, these open package sources DO NOT and, feasibly, CAN'T police submissions for malicious content. Who should be policing packages? The answer is simple: YOU! If you bring a package into your organization, it's your responsibility to verify not only the package's contents, but the contents of every other package that the downloaded package depends upon.

Managing dependencies is another nice feature that a package manager provides. If you're thinking that bringing a malicious package into your organization is like unleashing a virulent virus, you're getting the point.

The fact is, no production application or build process should ever take a direct dependency on any public package source. Setting aside malicious actors, there are many innocuous reasons to not trust public package sources:

• You're leaving everything up to the package owner to manage versions and dependencies. What if the package owner introduces a dependency that makes the package work, but is completely incompatible with your application?
• What if the package owner uploads a new package version that works, but nevertheless introduces a bug into your application? If you set your build process up to automatically upgrade your packages, you've now introduced what might be a costly bug that you'll need to spend real money fixing.

If you leverage a package from a public source, you should open the package, evaluate its contents, and add that package to your own source feed or add the contents to your own package.

Also illustrated in Figure 1 is the package source. Most likely, your active package source is NuGet.org. In my case, it's something labeled Local Package Source. Figure 2 illustrates what that is. As you can see, the Local NuGet Source is just a directory on my development computer. This may be news: Setting up a NuGet Source is as simple as creating a directory! Figure 3 illustrates the NuGet Packages in the directory.

The Anatomy of a NuGet Package

A NuGet Package is just a zip archive with a different extension (.nupkg). Figure 4 illustrates how to open the contents. Figure 5 illustrates the package contents. Let's examine what is arguably the most popular and widely used NuGet package: Newtonsoft.Json.
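The dependency-management point made earlier is worth making concrete: taking one package means taking everything it transitively depends on, and each of those packages needs the same vetting. The graph below is entirely made up (the package names are hypothetical); the traversal shows how a single install fans out.

```python
# Hypothetical dependency graph: package -> packages it depends on.
# In the real world, this metadata lives in each package's nuspec file.
deps = {
    "App.Feature": ["Json.Lib", "Http.Lib"],
    "Json.Lib": ["Text.Lib"],
    "Http.Lib": ["Text.Lib", "Sockets.Lib"],
    "Text.Lib": [],
    "Sockets.Lib": [],
}

def transitive_deps(package, graph):
    """Every package pulled in, directly or indirectly, by `package`."""
    seen = set()
    stack = [package]
    while stack:
        current = stack.pop()
        for dep in graph.get(current, []):
            if dep not in seen:
                seen.add(dep)
                stack.append(dep)
    return seen

# Vetting "App.Feature" really means vetting four more packages.
print(sorted(transitive_deps("App.Feature", deps)))
```

One shallow dependency here drags in four packages; real-world graphs routinely run far deeper, which is why a compromised leaf package (as in the Event Stream incident) can reach so many applications.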
Figure 1: One way of accessing the NuGet Package Manager is via the project or solution context menu.
Figure 2: Within the NuGet Package Manager, Package Source’s priority can be managed.
Figure 3: A NuGet Source can be as simple as a file directory.
Referring to Figure 5, the items of interest are the lib folder and the signature, license, and nuspec files:
• lib folder: This folder contains one or more subfolders that use a naming convention for each supported .NET version. You can learn more about targeting multiple .NET versions here: https://docs.microsoft.com/en-us/NuGet/create-packages/supporting-multiple-target-frameworks.
• .signature.p7s file: As the name implies, this is the signature file signed by the author's certificate. You can find more information on how to sign NuGet Packages here: https://docs.microsoft.com/en-us/NuGet/create-packages/sign-a-package. You can learn how to require that only signed packages be accessible and to limit packages to certain authors here: https://docs.microsoft.com/en-us/NuGet/consume-packages/installing-signed-packages.
• License.md: This is a markdown file that contains the license terms and conditions for your package. Typically, this consists of an open source license such as the MIT, GNU, or Apache 2.0 licenses.
• Nuspec: The nuspec file is the manifest. This is an XML file that is used to create the NuGet Package. This file will be discussed in the next section.

Figure 4: If you have an archive utility like 7-zip, you can simply right-click on a NuGet Package and open the archive.
Figure 5: A NuGet Package contains the metadata, license information, and libraries for each .NET version supported.
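Because a .nupkg is an ordinary zip archive, you can inspect, or even assemble, one with nothing more than a zip library. The sketch below builds a minimal, hypothetical package in memory (a real package would carry real binaries and richer metadata) and lists its entries, mirroring the layout shown in Figure 5.

```python
import io
import zipfile

# Minimal, made-up manifest; see the nuspec reference for the real fields.
NUSPEC = """<?xml version="1.0"?>
<package>
  <metadata>
    <id>My.Sample.Package</id>
    <version>1.0.0</version>
    <authors>someone</authors>
    <description>Illustration only.</description>
  </metadata>
</package>
"""

buffer = io.BytesIO()
with zipfile.ZipFile(buffer, "w") as nupkg:
    nupkg.writestr("My.Sample.Package.nuspec", NUSPEC)
    # One subfolder under lib/ per supported target framework.
    nupkg.writestr("lib/netstandard2.0/My.Sample.dll", b"")

with zipfile.ZipFile(buffer) as nupkg:
    entries = nupkg.namelist()

print(entries)
```

The same read logic works against any real .nupkg you download, which is exactly the kind of contents inspection the vetting process described earlier calls for.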
Creating Your Own NuGet Package

You now understand what Packages, Package Managers, and Package Sources are and have a basic understanding of how NuGet fits into that space. You also understand how to create and reference your own package source with nothing more than a directory or file share. All that's left to get started is to learn how to create your own NuGet Package. To illustrate, I'm going to use the Immutable Class Library I created and wrote about a few issues back (https://www.codemag.com/Article/1905041/Immutability-in-C#).
Figure 6: The NuGet Package structure contains a lib folder with a subfolder for each supported .NET version. The only other required file is the nuspec file (manifest).

There are several approaches you can use to create NuGet Packages. I'm going to show you the method I consider the
Figure 7: The nuspec file is the manifest that drives the package creation process. Most importantly, the nuspec file references the package's dependency.
Figure 8: The information contained in the nuspec file as displayed in the NuGet Package Manager.
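The nuspec shown in Figure 7 isn't reproduced in this text, so for orientation, a minimal nuspec for a hypothetical package might look like the following. The ID, version, and other metadata are invented for illustration; consult the nuspec reference linked in this article for the full schema.

```xml
<?xml version="1.0" encoding="utf-8"?>
<package xmlns="http://schemas.microsoft.com/packaging/2013/05/nuspec.xsd">
  <metadata>
    <id>MyCompany.ImmutableClassLib</id>
    <version>1.0.0</version>
    <authors>Your Name</authors>
    <description>A hypothetical immutable class library.</description>
    <license type="file">License.md</license>
    <dependencies>
      <group targetFramework=".NETStandard2.0" />
    </dependencies>
  </metadata>
</package>
```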
Figure 9: NuGet.exe provides command line access to NuGet's functions, including package creation and download/installation of NuGet Packages in your projects via an automated build server like Jenkins or Team City.
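The command-line workflow that Figures 9 and 10 depict boils down to a pack step and, optionally, a push step. The file names and source name below are placeholders.

```shell
# Generate the .nupkg from the manifest.
nuget pack MyCompany.ImmutableClassLib.nuspec

# Publish the result to a feed. For a plain directory source,
# simply copying the .nupkg into the folder also works.
nuget push MyCompany.ImmutableClassLib.1.0.0.nupkg -Source "LocalPackages"
```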
easiest to use and understand. There are also many options you can apply that I won't cover here. For comprehensive coverage of all you can do with package creation, consult the documentation at NuGet.org.

Step 1: Create a Package Directory Structure and Add Your Binaries
Figure 6 illustrates the directory structure. I added an icon.png file that will be displayed in the Package Manager, as shown in Figure 1. The license text file contains the MIT License language. Finally, there's the nuspec file, which is illustrated in Figure 7.

Step 2: Create Your Nuspec File
The nuspec file illustrated in Figure 7 is very basic. For a complete nuspec reference, you can find that information here: https://docs.microsoft.com/en-us/NuGet/reference/nuspec. The ID you choose for your package must be unique in the context of the source within which it's hosted. Accordingly, if you elect to make your NuGet Package available in the NuGet.org feed, the ID must be unique in that universe. Figure 8 illustrates how the package appears in the NuGet Package Manager.

Step 3: Create Your NuGet Package
In order to create your NuGet Package from the command line, you need the NuGet Command Line Tools. Figure 9 illustrates where you can download NuGet.exe. Figure 10 illustrates how to generate your NuGet Package.

Figure 10: The NuGet.exe pack command, using a nuspec file, generates the NuGet Package.

Step 4: Publish Your Package
Depending on the type of package source you're using, your steps may be slightly different. For a file directory source, the process is as simple as copying the file to the directory. If you're hosting your own NuGet Server (https://docs.microsoft.com/en-us/NuGet/hosting-packages/NuGet-server), you will use one of the methods described here: https://docs.microsoft.com/en-us/NuGet/NuGet-org/publish-a-package.

Other Hosting Options
Instead of self-hosting or using the NuGet.org public feed, you may elect to use a third-party service. For NuGet, there are paid services such as MyGet (myget.org) and Chocolatey (chocolatey.org). If it's so easy to host your own feed, why would you consider a paid service? These paid services have their own DR (Disaster Recovery) infrastructure. If you host your own feed, you need to consider how your server will be backed up and replicated, and how you will recover in the event of a catastrophic event.

Conclusion
Open source has made it easier than ever to add features to your applications. Part of that ease is speed. Speed and ease mean less friction. Once upon a time, before open source as we know it today, before the Internet, and before package management, there was implicit friction in the system, which provided time to assess and evaluate. Developers of another generation, in my opinion, had a better understanding of change management. They understood the discipline and rigor required to mitigate risk. For all the benefits of today's technology and the speed and ease we get with it, it's more important than ever to employ risk mitigation techniques such as those discussed in this article, because if it's easier for us to do good things, it's easier for bad actors to use the same technology. Robust security and risk mitigation aren't free. If there's one negative side effect of free open source, it's the expectation that things that heretofore had a cost no longer have a cost. Consider that the next time a package is introduced into your environment. If your organization is governed by SOX, HIPAA, FINRA, PCI, and the like, being compliant means not letting that situation occur.

John V. Petersen
Moving from jQuery to Vue
ONLINE QUICK ID 1909051

Most of the attention that JavaScript gets is all about creating large, monolithic Single Page Applications (SPAs). But the reality is that a great percentage of websites still use much simpler jQuery and vanilla JavaScript. Without going all-in on moving everything to a SPA, can you gain some of the benefits of using a framework to simplify your code and make it more reliable and testable?
Sure, you can. In many cases, moving to a SPA framework means a complete re-thinking of your application. It's a change in how you approach building applications. I wholeheartedly recommend that you think about it this way if you're building new applications, as it can really change the way you approach Web development, but… In many cases, it's beneficial to ramp up to these technolo-

Although this is pretty easy to remedy, it's a common practice because it's easy to think of an event handler as the main place for code in jQuery.

Next up is changing the UI in jQuery:

$('#ghapidata').html(`
Figure 1: Simple jQuery Page
Listing 1: jQuery Version of the App

$(function () {
  $('#ghsubmitbtn').on('click', function (e) {
    e.preventDefault();
    $('#ghapidata').html(`
      …
        height="80"
        alt="${username}">
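Listing 1 is truncated by the page layout above, but its shape, with all the logic inside an anonymous click handler, illustrates the habit described earlier. One possible remedy, sketched here rather than taken from the article, is to extract the logic into a plain named function that can be unit-tested without a browser (buildProfileUrl is a hypothetical name):

```javascript
// Logic pulled out of the click handler into a plain, testable function.
// No DOM or jQuery is needed to exercise it.
function buildProfileUrl(username) {
  return `https://api.github.com/users/${encodeURIComponent(username.trim())}`;
}

// The jQuery handler then only wires the DOM to that function:
// $('#ghsubmitbtn').on('click', function (e) {
//   e.preventDefault();
//   const url = buildProfileUrl($('#ghusername').val());
//   /* fetch the URL and render the result into #ghapidata ... */
// });

console.log(buildProfileUrl('octocat')); // → https://api.github.com/users/octocat
```

The handler shrinks to glue code, and the part most likely to contain bugs becomes trivially testable, which is the same separation of concerns a framework like Vue encourages.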