
Jaakko Heikkilä Implementing a RESTful API with .NET Core 2.1

Metropolia University of Applied Sciences Bachelor of Engineering

Software Engineering Bachelor’s Thesis 31 January 2021

Abstract

Author Jaakko Heikkilä Title Implementing a RESTful API with .NET Core 2.1

Number of Pages 48 pages Date 31 January 2021

Degree Bachelor of Engineering

Degree Programme Information Technology

Professional Major Software engineering

Instructors Janne Salonen, Principal Lecturer; Tuomas Jokimaa, Supervisor

The objective of this thesis was to design and implement a bespoke backend API using Microsoft’s .NET Core framework. The work included the architectural planning, writing application logic and unit tests, setting up a version control system in GitHub Enterprise and implementing a continuous integration pipeline using TeamCity and Octopus Deploy.

The implemented API is part of a larger group of intranet applications which are used by the end users to manage their sales materials and to partition them for different sales teams. The implemented API works as a facade between the user interface and a SQL Server in which the actual logic was implemented.

The thesis briefly explores the technologies which were used and explains the .NET framework’s history and some of its web development tools. Three external tools, GitHub Enterprise, TeamCity and Octopus Deploy, are also explained. The software development process and the structure of the code are then looked at in more detail.

The thesis was carried out in order for the customer to improve their business-critical operations. The application was made as a part of an already existing set of applications, which dictated the choice of programming technologies.

The following technologies and tools were used in the implementation: Visual Studio 2017 Enterprise Edition, GitHub Enterprise, TeamCity and Octopus Deploy.

Keywords Microsoft .NET Core, ASP.NET Core, REST API

Abstract

Author(s) Jaakko Heikkilä Title Implementing a REST API with the .NET Core 2.1 Framework

Number of Pages 48 pages Date 31 January 2021

Degree Bachelor of Engineering

Degree Programme Information and Communications Technology

Professional Major Software Engineering

Instructors Janne Salonen, Principal Lecturer; Tuomas Jokimaa, Supervisor

The topic of this thesis was to design and program a backend API tailored to the customer’s needs using Microsoft’s .NET Core framework. The work included designing the application architecture, writing the application and test code, and setting up a version control system and continuous integration tools.

The API built in this work is part of a piece of software, an intranet service with which end users can manage sales materials and their distribution to different sales teams. The implemented API acts as a kind of facade between the application containing the business rules and the user interface.

The thesis examines the technologies used and the history of the .NET platform, the programming process and the structure of the program code, as well as the automation tools TeamCity and Octopus Deploy.

The thesis was carried out so that the customer company could manage and monitor the sales of business-critical material. The application was made part of a larger set of .NET Core programs, in which the aim has been to split a larger application into smaller, manageable parts. The .NET Core framework was chosen because it is the continuation of the mature .NET Framework and because of its cross-platform support.

The following Microsoft .NET tools were used in the implementation: Visual Studio 2017, GitHub Enterprise, TeamCity, Octopus Deploy.

Keywords Microsoft .NET Core, ASP.NET Core, REST API

Contents

List of Abbreviations

1 Introduction 1

1.1 Background for the reasoning of the project 1
1.2 Commissioning the work, challenges faced 2
1.3 Outline for the thesis 3
1.4 Key concepts 3

2 Goals for the project 4

2.1 Implementation limitations 4
2.2 Version control, continuous integration and delivery (CI/CD) 5
2.3 Pursued benefits for the related parties 5

3 Overview of .NET Core framework 7

3.1 A brief history of .NET Core 7
3.2 .NET Standard 8
3.3 .NET Core features 8
3.3.1 ASP.NET Core 8
3.3.2 NuGet Package Manager 9

4 The implementation of the project 12

4.1 Application development tools and environment 12
4.1.1 Limitations set by the development environment 12
4.1.2 Docker 13
4.2 CI/CD tools 13
4.2.1 GitHub Enterprise 14
4.2.2 TeamCity 15
4.2.3 Octopus Deploy 16
4.3 Application architecture 16
4.3.1 REST (REpresentational State Transfer) 17
4.4 Overview of the server application architecture 18
4.4.1 The program class and the WebHost 20
4.4.2 Startup class, middleware 21
4.4.3 Routing 25

4.5 Controllers 26
4.5.1 Controller implementation 27
4.5.2 DTO models 28
4.6 Data-access layer 30
4.6.1 Entity classes 30
4.6.2 Updating the database through entity’s methods 33
4.6.3 DbContext 35
4.6.4 Repositories 36
4.6.5 Services 39
4.7 Unit tests and fake implementations 40
4.7.1 Unit test naming convention 40
4.7.2 Unit test frameworks and libraries 41
4.7.3 Structure of the unit tests 42
4.7.4 Fake services and data initializers 43
4.8 Summary of the implementation 44

5 Conclusions 46

5.1 Successes 46
5.2 Further improvements 47

References 48

List of Abbreviations

.NET Microsoft’s leading software framework

.NET Core Modern version of the framework

ASP.NET A software framework for web based applications

On-premises Residing in a local server and not in the cloud

REST API HTTP-based architectural model introduced in 2000 for web-based applications

Docker A tool for running virtualized operating systems in small packages (containers).

Azure Microsoft’s public cloud service.

T-SQL Query language for Microsoft SQL Servers

Entity Name for objects that describe the database rows and columns. Used in Entity Framework.

Environment In software development, software exists in different environments depending on the stage of its version. Development, testing and production are just a few examples.

Pull request In version control systems, code review is done via pull requests: a code change is asked to be pulled into another version of the code.

AAA The Arrange, Act, Assert principle for structuring unit tests.

Appendix 2 1 (49)

1 Introduction

The work carried out for this thesis was commissioned by the employer company Elisa Oyj. Elisa Oyj was founded in 1882 and has since been a pioneer in developing infrastructure and telecommunication technologies in Finland. In 2010 Elisa Oyj acquired Appelsiini Finland Oy, which was later merged into the parent company. Appelsiini Finland Oy had a history of software development with Microsoft's .NET technologies, and many of its clients and their projects were transferred to Elisa Oyj in the merger.

One of these clients had an application critical to their business. This application included a number of different business functions that were programmed in a monolithic codebase with .NET Framework 4 using ASP.NET MVC. The application consisted of a user interface that was tightly coupled with the server code logic. The server application handled requests from a web browser and passed them on to an SQL Server instance where the business logic was handled with parameterized stored procedures. The stored procedures were not in the scope of the project.

The application resided on a single, on-premises server that is accessible only through the client's network, running on Windows Server 2012 in an IIS instance. Starting from 2017, this application was given the green light to be rewritten as smaller individual applications reminiscent of microservices, using .NET Core as the technology stack.

1.1 Background for the reasoning of the project

As software becomes larger and its codebase grows older, the upkeep of the software becomes more time-consuming, difficult and costly. Years of hotfixes and patches can leave software filled with undocumented code that violates many principles of good software craftsmanship. At some point, rewriting the application becomes more efficient in terms of long-term costs.

In 2018, .NET Framework 4 was already on a path to be replaced by .NET Core, an open-source version of Microsoft’s technology stack. For many projects made during or after that time, there was no discussion on whether to use the newer stack. But not all kinds of software could be - or can yet be - ported to the new .NET Core framework. As for simpler web applications that use the ASP.NET framework, the groundwork done in .NET Core already made it feasible for production applications in version 2.

This change in direction for the framework came from Microsoft and subsequently also became a priority for the employer. As the employer is also focusing heavily on Azure cloud competence, the need to build applications with .NET Core increases.

1.2 Commissioning the work, challenges faced

Programming of the new applications replacing the client's application started in late 2017, and by late 2018 three out of the first four applications were ready. The overall software architecture, the initial requirements and the schedule were in place. However, when the last application was to be started, the original team was unavailable, so a new team was tasked with finishing the one remaining application. The team consisted of two frontend developers and two backend developers.

The requirement for the backend application was to be able to pass queries from the new React.js frontend to the database, which had not been changed during the migration to the new architecture. During development, visibility into the database was limited to Excel sheet descriptions, which included the names of the stored procedures and the parameters they used. The database also had views which were queried for the data that was returned to the client application.

The project was to be done in an agile way, in sprints, but the client was only minimally available and was consulted only to clarify requirements. During development, the requirements turned out to be incomplete and had to be revisited many times by the new team, which did not have the time to properly familiarize itself with the existing software. In addition, half of the new team could not commit to the project due to other projects. Despite these challenges, the work was delivered almost on time.


1.3 Outline for the thesis

In the first section, the goals that were set for the RESTful API, the other implemented features and the benefits for all parties are explored. In the second section, the history of .NET Core and its relation to .NET Framework is looked at, and the main features of ASP.NET Core, the web application framework which was used, are introduced. These sections set up the mindset needed to understand the chosen technologies and what they offer. The third section looks at the overall architecture of the application and the guiding principles for the design, and explains some core language features of C# while describing the code through example snippets of the delivered product. The fourth section explains the continuous integration tools, version control and the measures which were implemented to ensure code quality. The last section discusses how the implementation compares to the design principles and what works and what does not.

1.4 Key concepts

RESTful API - A RESTful API is a server application that accepts HTTP requests, processes them and responds with HTTP responses.

Client application - An application that acts as a client to a server. These can be JavaScript-based web applications, native phone applications, console applications and more. They act as an intermediary for the user.

ORM - Object-Relational Mapping is a process where database tables or views are mapped onto objects, such as C# objects in Entity Framework, that represent the database in code.

SOLID principles – A set of principles for object-oriented software design, introduced by Robert C. Martin, that make software more understandable and maintainable.


2 Goals for the project

Numerous requirements were set by the existing architecture and the employer, while other decisions were guided by the code libraries that were available for .NET Core. For long-term maintainability, and to make it easier for other developers to contribute to the application, source control and CI tools had to be used.

2.1 Implementation limitations

For the project, the goal was to create a RESTful API that would act as an intermediary between the user interface and the existing database, and to set up a continuous integration and delivery (CI/CD) pipeline for the application. ASP.NET Core 2.1 was the required technology for the server application. For the CI/CD pipeline, the employer had already chosen the services: GitHub Enterprise, TeamCity, Octopus Deploy and SonarQube.

Figure 1 shows a simplified overview of the different parts of the whole application. The scope of this project was the REST API running on the application server.

Figure 1. Application overview


Most, if not all, of the business logic happens in the stored procedures in the client’s database instance. These stored procedures were not the responsibility of the development team. As such, the server application’s main responsibilities included validation, input sanitization, error logging and handling the HTTP requests and responses.

The requirements for the interaction with the client app were simply to accept its requests, validate and sanitize the queries, and pass the data on to the required stored procedures in the database in the case of an insert or an update. In case of errors, the server application needed to respond with human-readable messages.

For interacting with the database, the application called existing stored procedures for updates and SQL Server views for fetching data. Entity Framework Core was chosen for this purpose as it is the de facto ORM on the .NET platform.
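As a minimal sketch of this pattern (the entity, view and stored procedure names here are hypothetical, not the delivered code), reads can go through a view mapped as an entity while writes call an existing stored procedure with Entity Framework Core 2.1:

```csharp
using System.Collections.Generic;
using System.Threading.Tasks;
using Microsoft.EntityFrameworkCore;

// Hypothetical entity mapped onto a database view for read-only queries
public class SalesMaterial
{
    public int Id { get; set; }
    public string Name { get; set; }
}

public class ApiDbContext : DbContext
{
    public ApiDbContext(DbContextOptions<ApiDbContext> options) : base(options) { }

    // In OnModelCreating this set would be mapped to the view, e.g. ToTable("vw_SalesMaterial")
    public DbSet<SalesMaterial> SalesMaterials { get; set; }
}

public class SalesMaterialRepository
{
    private readonly ApiDbContext _context;

    public SalesMaterialRepository(ApiDbContext context) => _context = context;

    // Reads: query the view through the mapped entity, without change tracking
    public Task<List<SalesMaterial>> GetAllAsync() =>
        _context.SalesMaterials.AsNoTracking().ToListAsync();

    // Writes: call an existing stored procedure with parameterized arguments
    public Task<int> UpdateAsync(SalesMaterial material) =>
        _context.Database.ExecuteSqlCommandAsync(
            "EXEC dbo.usp_UpdateSalesMaterial @p0, @p1", material.Id, material.Name);
}
```

Keeping the raw SQL confined to a repository class like this is what allows the dummy implementations, discussed later, to be swapped in for development.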

2.2 Version control, continuous integration and delivery (CI/CD)

In addition to the application itself, a version control repository and a CI/CD pipeline needed to be created to ensure code quality, better teamwork and faster deployments.

Using GitHub Enterprise, a TeamCity CI server and Octopus Deploy, a pipeline was built. The responsibilities of the pipeline were to enforce code reviews, unit test coverage and the code quality of the main branch, and to handle the deployment of the application to the development, QA and production environments.

2.3 Pursued benefits for the related parties

The commissioned work was ordered by the client and, as such, brings financial benefits to the employer. The modernization of the existing application should help the employer further transform the codebases of its various projects to a more sustainable platform and advance its knowledge of new technologies. The usage of .NET Core technologies is critical in the transformation from on-premises servers to cloud-based applications using Microsoft Azure or other cloud platform providers.


For the client, the modernized application helps with their plans to move their applications from the intranet to an Azure-based environment and can reduce the upkeep costs of their application.

For the development team, the project was an introduction to ASP.NET Core technologies and a React.js frontend.


3 Overview of .NET Core framework

.NET is a free software framework originally developed by Microsoft, but subsequently made open source, with continued support from Microsoft. It supports all major platforms, including desktop and mobile operating systems.[1]

.NET Core is the continuation of .NET Framework, and it can be used to implement various kinds of desktop, server, mobile, IoT and cloud-based software solutions. The main goals for .NET Core were to create a new codebase that was open source, fast, loosely coupled and updated frequently, as opposed to the older .NET Framework, which was not open source and suffered from having to support older code.[1]

.NET Framework has been made available on platforms other than Windows using Mono, but .NET Core has taken this to another level. It has a native UNIX development and runtime environment and can be run with fewer resources than its predecessor.[1]

3.1 A brief history of .NET Core

In 2016, version 4.6 of .NET Framework was followed by the first versions of .NET Core and ASP.NET Core. While the development of .NET Core continued, the main implementation was still .NET Framework, which kept getting updates for the following years. The first .NET Core releases were made available to developers but included only a subset of the .NET platform's features. As such, the releases were not yet recommended for software meant for production. [2]

In 2018, around the time the work discussed in this thesis was done, version 2 had already been released and ASP.NET Core could be used to create web applications. Some parts, such as Entity Framework Core, still had limited features.

Later versions implemented more of the .NET platform's features, and in 2020 .NET 5.0 was released. In this version the name “Core” was dropped because it was going to be the main implementation of the .NET platform in the future. [3]


3.2 .NET Standard

For the purposes of supporting different environments and development kits, a set of standards is defined in .NET Standard. It is the formal specification of the APIs that the .NET platform provides.[4]

The first version of .NET Standard was introduced together with the release of .NET Core, and the final version, .NET Standard 2.1, accompanied the release of .NET Core 3.0.[4]

Other implementations of .NET are Mono, Xamarin, UWP and Unity. These target UNIX, macOS, iOS, Android, the Windows UWP platform and games.[4]

With the release of .NET 5.0, the need for .NET Standard disappeared, as .NET 5.0 has a different approach to multi-platform support.[5]

3.3 .NET Core features

The .NET Core source code is available to the public in a GitHub repository; it can be forked and changed and is open to pull requests and feedback. In addition to the main source code, the Roslyn compiler, the .NET Core SDK and the runtime libraries are all available. [6]

The framework supports languages such as C#, F# and Visual Basic to create different types of applications with different approaches. Web, mobile, desktop and IoT-device applications are supported. The platform also has support for machine learning libraries, cloud services and game development.[7]

3.3.1 ASP.NET Core

Websites, as well as services that consume or expose other services over the web or the HTTP protocol, can be built with ASP.NET Core. It has first-class support from Microsoft and the Visual Studio IDE.[8]


ASP.NET Core supports different frontend frameworks such as Blazor, Razor Pages, Bootstrap, React.js and Angular. For these frameworks there are templates that can be used to scaffold an application rapidly.[9]

It also has numerous libraries for essential web development purposes. These include HTTP clients, consuming REST and SOAP services, gRPC and GraphQL support, data validation, serialization, security, CORS, interacting with different cloud providers and much more.

3.3.2 NuGet Package Manager

A major part of modern platform frameworks is the capability of downloading and sharing software libraries as packages. Many new languages, for example Rust, come with their own package management systems, and Node.js has managers such as npm and Yarn for this purpose. On the .NET platform, these packages are compiled code wrapped in a NuGet package.

NuGet defines how the packages are created, maintained and used, and also provides the tools to create them. These packages can then be pushed to a public or a private repository, from where they are accessible to other code as imports. A package can be created with the dotnet CLI tool that comes with .NET Core, with the nuget.exe CLI or with MSBuild. The packages can then be pushed to the repository and consumed by other code.[10]

During the installation of the Visual Studio IDE, the NuGet package manager can be selected for installation. Figure 2 shows the installation option for the Visual Studio 2019 Community edition. The NuGet package manager is a tool for browsing public code packages and managing them for your own .NET code solutions.


Figure 2. Visual Studio 2019 installation, showing optional components.

When a specific package is added to a project, the .csproj file of said project is updated with a reference to the included package. The reference includes the name and the version of the package as a PackageReference node. Listing 1 shows what these package references look like in the file.

Listing 1. A .NET Core project’s .csproj-file references
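A representative fragment of such references in a .csproj file (the package names and versions here are illustrative, not the project’s actual dependencies):

```xml
<ItemGroup>
  <!-- Versionless metapackage reference, resolved by the .NET Core 2.1 SDK -->
  <PackageReference Include="Microsoft.AspNetCore.App" />
  <!-- A regular NuGet package reference with an explicit version -->
  <PackageReference Include="Swashbuckle.AspNetCore" Version="4.0.1" />
</ItemGroup>
```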

An advantage of using packages is that the location of the .dll files is not an issue for an individual developer, as the packages are only referenced and downloaded when required for the environment. Another advantage is that the packages can be excluded from source control repositories, which can lead to a much smaller repository size for the project. Because the packages are usually downloaded over the internet, an internet connection is required to restore the files for a local development environment. NuGet supports private repositories in case public internet access is not available, as can be the case for high-security software development.

Packages that are available in the public NuGet repository and those installed locally are shown in the NuGet package manager window, as pictured in Figure 3. The view also shows which packages have updates available in the repository.


Figure 3. NuGet package manager window in Visual Studio 2019

The NuGet package manager automatically downloads the packages for each project in a solution during the build process when using either Visual Studio or Visual Studio Code. A manual restoration of the packages can also be triggered using the dotnet restore CLI command.
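For example, the restore and package installation steps can be run from the command line as follows (the project path, package name and version are illustrative; running these requires the .NET Core SDK):

```shell
# Restore all NuGet packages referenced by the projects in the current solution
dotnet restore

# Add a NuGet package reference to a project; the version flag is optional
dotnet add MyApi/MyApi.csproj package Newtonsoft.Json --version 11.0.2
```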

When the packages have been successfully installed for a project, they are available for use and Visual Studio’s IntelliSense works for them.

Many of the libraries used in the implemented REST API were installed using NuGet, while some libraries were included in the .NET Core 2.1 Software Development Kit (SDK).


4 The implementation of the project

The objective of the implementation was a .NET Core based server application that can accept HTTP requests, validate them and create the necessary queries and commands for an underlying SQL database. In short, a RESTful API working as a gateway between the user interface and the database.

In addition to the application, a CI/CD pipeline had to be configured using the technologies which were already in use at the employer. These tools included a TeamCity build server, the Octopus Deploy automated software deployment server and GitHub Enterprise for the source code.

4.1 Application development tools and environment

The code was written and debugged on Windows 10 using Visual Studio (VS) 2017 Enterprise Edition provided by the employer. VS 2017 is an integrated development environment from Microsoft offering tools for software development. It has a wide variety of software development tools for many purposes, such as:

• writing, debugging and refactoring code
• code quality analysis and cleanup
• creating architectural diagrams
• writing and running unit tests

The .NET Core 2.1 SDK, including the ASP.NET Core SDK, was required for web development on the .NET platform. Docker for Windows was installed to enable running the application on other platforms without requiring the installation of the .NET Core runtime.

4.1.1 Limitations set by the development environment

During the entirety of the development there was no access to the actual database, and there was no up-to-date copy of the database available for the development team. This is why a reference Excel sheet was used to model the database structure in the application. The Excel sheet comprised the tables and views with their corresponding columns and datatypes.

This limitation made it necessary to create a dummy interface that returned data which looked like the actual database. The values for the dummy interfaces were hardcoded in the data layer of the backend application. For the QA and production environments these were replaced by the real implementations. In the end, this proved to make the development slightly more difficult.

4.1.2 Docker

The backend software also needed to be made available to frontend developers who were not able to run the server on their machines. A Docker containerized version of the application was therefore made. The only requirement for the developer's machine was to install Docker for their operating system.

Docker is a lightweight OS virtualization layer on top of the native OS. It was originally available only for UNIX-based operating systems, but support for the Windows operating system was added and has been made more robust since. With .NET Core, it is possible to target the application for UNIX or Windows and to make it available for containerization. New Visual Studio releases support this directly from the IDE.

Using Visual Studio 2017’s built-in Docker support, a docker-compose file was created for the backend server to provide a container for frontend developers.
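A docker-compose file for this purpose can be as small as the following sketch (the service name, image tag and port mapping are illustrative, not the project's actual configuration):

```yaml
version: '3.4'

services:
  # Hypothetical service name and image tag for the backend API
  backend-api:
    image: backend-api:dev
    build:
      context: .
      dockerfile: Dockerfile
    ports:
      # Expose the API on the frontend developer's machine
      - "5000:80"
```

With a file like this in place, a frontend developer only needs to run docker-compose up to get a working backend.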

4.2 CI/CD tools

For any software project that is larger in scope, it is recommended to use some form of continuous integration, deployment and delivery. This means processes that can be automated, which reduces the complexity and overhead associated with synchronizing with other developers and stakeholders. Automation also reduces the probability of human error in repetitive tasks. CI/CD tools provide almost any functionality needed for software development. These include, but are not limited to

Appendix 2 14 (49)

• ensuring code quality with automated tests
• checking for failing tests to ensure compatibility, performance, etc.
• creating deployment packages for various deployment scenarios
• installing the software and making it available to end users.

In addition to creating the application, a baseline for CI/CD was needed, as required by the employer. This included code reviews, automated tests, and automated deployment and delivery of the application. The three services in use at the company at the time were GitHub Enterprise, TeamCity and Octopus Deploy. Some form of automation was added to all of these.

4.2.1 GitHub Enterprise

GitHub Enterprise is the on-premises version of the leading source control platform GitHub. It is used in local networks and can be integrated into different organizational authentication systems. Its features are largely the same as those of the public platform, with a focus on private organizations. Some features that require a public repository are only available on GitHub.com and might not be available in the enterprise version.[11]

A new repository was created for the project, and a team was created to limit the people who can change the repository settings. The master branch was set to be protected, while other branches could be used for development. All pull requests had to be reviewed, and a passing unit test report had to be received from the CI pipeline. The connection to TeamCity required an SSH key that was created using personal credentials on GitHub Enterprise and then used on the TeamCity server.

A simple version of trunk-based development was used as the branching strategy for the repository, as shown in Figure 4. Code changes to the master branch had to go through a short-lived branch which was then merged into master. These branches were prefixed with feature or bug depending on their nature. This strategy was chosen for its lightweight nature and because the development team was small. The master branch was then used as the source for release packages for the different environments.


Figure 4. Simplified trunk-based development branching.
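Sketched as git commands, a short-lived branch in this strategy could look as follows (the branch and remote names are illustrative):

```shell
# Start a short-lived branch from the tip of master
git checkout master
git pull origin master
git checkout -b feature/sales-material-endpoints

# Commit work, then publish the branch and open a pull request targeting master
git push -u origin feature/sales-material-endpoints
```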

The master branch was set as a protected branch so that it could not be deleted by accident. For pull requests that targeted the master branch, at least one other developer needed to approve the pull request. The approval requirement was set to enforce code review practices. Code review helps find errors in the code and helps other developers stay up to date with the changes in the repository. Code reviews are neither a fast nor a cheap practice, and they require the reviewer to understand the language, the requirements and the domain of the application.

4.2.2 TeamCity

TeamCity by JetBrains was used as the CI/CD server. Its responsibilities were to run automated unit tests, build the code, run code quality analysis with SonarQube and create a package for Octopus Deploy. The server was running on the employer’s private servers.

Managing different builds was done via build configurations that could be edited through a web-based GUI. A project was created, and then three different build configurations were made.

One configuration was created to run automated tests whenever a pull request was created on GitHub Enterprise. It was set up to listen only to pull requests that targeted the master branch.

Two build configurations were created for deployment: one for the private testing environment and the other for the QA and production environments. The private testing configuration built the application in debug mode to have access to more detailed error messages in the web browser. The other was a build configuration using the release build settings.


The two build configurations largely shared the same steps, with the exceptions of using a debug or release build and where to push the final deployment package.

4.2.3 Octopus Deploy

Octopus Deploy was used as the package delivery tool. It consists of a server application with a web portal to manage deployments, releases, variables and access rights for users, and to create automated deployment processes. Different processes are grouped together in a project that can, in theory, have multiple applications, settings and deployment target environments. The server application resided on the employer’s servers.

Another part of the tool is the Tentacle agent. The agent resides in the target environment and runs the steps that are described in the deployment process.

Octopus Deploy can receive deployable code in multiple ways, but in this work the TeamCity server pushed the application package to an endpoint which was specified in the Octopus Deploy environment.

First, a project was created for the purpose of delivering the REST API to three target environments: internal testing, QA and production.

Because the QA and production environments had been configured to be as similar as possible, they were configured with variables that had different values depending on the environment.

4.3 Application architecture

When an application’s complexity rises, the task of building it becomes more difficult. This complexity can be managed from multiple perspectives, whether by cutting features, using existing software or planning the developed system. The design should be resistant to the changes that happen over time, including new requirements, changes to existing features, hardware or software changes, or even changes in the personnel who maintain the application. As it is, software development can be quite tricky.

Managing complexity and making an application more resistant to change can be done through the architectural planning of the software. The planning can be done on many levels, such as the hardware and software the system runs on, or the structure of the code itself.

In this thesis, architecture means the different parts of the developed application and their relations to each other and to external systems. The application's parts can be classes which model the client's needs, or more general software practices for code craftsmanship. N-layered architecture and the REST architectural style are such general practices.

Architectural models are used to solve specific software development problems. Three-tier architecture, which was used in the thesis, divides the responsibilities of the server code into presentation, service and repository layers. This structure has long been a popular way to divide the responsibilities of the different parts of a web application. It allows the different parts to be developed more independently, but most bespoke software development will run into changes that cut through all layers. For example, a change in the UI will most likely also require changes in the logic layer and the data storage.

4.3.1 REST (REpresentational State Transfer)

A pure RESTful service should follow the guidelines of using the correct HTTP verbs for different use cases. When the verbs are used correctly, all of the requests targeting an entity can be placed behind a single URL.

Table 1 lists the commonly used HTTP methods and how they are used. For example, a PUT or PATCH request to the URL website/product/id can be used to update data, either with the updated fields in the body of the request or with query parameters in the URL itself. [12]


Table 1. Common HTTP methods and their usage.

HTTP verb   Usage                     Payload
GET         Fetch data                URL
POST        Add new entries           URL or request body
PUT         Update an entry           URL or request body
PATCH       Update part of an entry   URL or request body
DELETE      Delete an entry           URL only

A maturity model for REST APIs has been suggested. It is called the Richardson Maturity Model, and it defines levels of maturity (0 to 3) for REST APIs depending on how thoroughly they employ resources, HTTP verbs and hypermedia. [13]

A fully functioning API can be implemented using only POST requests, but then the URL defines the action of the HTTP request. For example, the URL 'website/products/delete/id' would be used for deletion.
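The difference between the two styles can be sketched with a hypothetical controller (not part of the thesis code; the ProductsController name and routes are illustrative only):

```csharp
[ApiController]
public class ProductsController : ControllerBase
{
    // Level-0 style: the verb is always POST and the URL names the action.
    [HttpPost("api/products/delete/{id}")]
    public ActionResult DeleteByPost(int id) => Ok();

    // RESTful style: the HTTP verb itself expresses the action,
    // so the URL only identifies the resource.
    [HttpDelete("api/products/{id}")]
    public ActionResult Delete(int id) => Ok();
}
```

Both endpoints would perform the same deletion; only the way the intent is communicated differs.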

The endpoints in the API implemented in the thesis are mostly POST-requests but a few endpoints responding to HTTP GET requests exist.

4.4 Overview of the server application architecture

The codebase was divided roughly into two projects: one for the API layer facing the client application and another for the data layer. A third, abstract separation exists inside the data layer project, separating the database CRUD (Create, Read, Update, Delete) implementation details from the business rules and validation. Figure 5 depicts the three layers of the server application architecture.


Figure 5. The server application architecture in three layers

Application Core exposes services for the Controllers, repository interfaces that the im- plementations must follow and the entities that match the required data fields. The REST API, or presentation layer uses ASP.NET Core's libraries to handle the common web application needs, while the database facing code uses Entity Framework to access the database and to do parameter sanitation.

The API application was created using a template included with the ASP.NET Core SDK, which was then stripped of unnecessary code and customized to fit the needs of the project.


4.4.1 The program class and the WebHost

The API layer is the entry point to the application for client applications. It exposes HTTP endpoints called controllers that can be called by the client. It is hosted on the ASP.NET Core runtime as a console application that is started by the Main function of a Program class. The application's entry point, the Program class, is shown in Listing 2.

public class Program
{
    public static void Main(string[] args)
    {
        var logger = NLog.LogManager
            .LoadConfiguration("nlog.config")
            .GetCurrentClassLogger();
        try
        {
            logger.Debug("Initializing Thesis API");
            CreateWebHostBuilder(args).Build().Run();
        }
        catch (Exception ex)
        {
            // NLog: catch setup errors
            logger.Error(ex, "Stopped program on exception");
            throw;
        }
        finally
        {
            // Ensure to flush and stop internal
            // timers/threads before application exits
            NLog.LogManager.Shutdown();
        }
    }

    public static IWebHostBuilder CreateWebHostBuilder(string[] args) =>
        WebHost.CreateDefaultBuilder(args)
            .UseStartup<Startup>()
            .ConfigureLogging(logging =>
            {
                logging.ClearProviders();
                logging.SetMinimumLevel(LogLevel.Trace);
            })
            .UseNLog();
}

Listing 2. The Program class of an ASP.NET Core 2.1 Web API application

The Program class has a Main function, which accepts arguments from the executing environment. The code that follows sets up an instance of the NLog logging utility to log any errors that happen during the application's startup phase. The CreateWebHostBuilder function uses the WebHost class to build a runtime host for the application's startup and lifetime management, using the configuration created in the Startup class.


The WebHost.CreateDefaultBuilder function does many things: among others, it sets up the ASP.NET Core web server and the content root folder, and it loads the appsettings.json file for database connection strings and the configuration for authentication and logging. [14]

4.4.2 Startup class, middleware

At the core of the application, the class that sets up the required services and acts as the glue between the different parts of the architecture is the Startup class.

Each request that the client makes goes through the ASP.NET Core middleware pipeline, in which the requests are processed and directed to the designated code block that can handle them. The pipeline is configured by the developer in the Startup class, which is called when the program starts.
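Before looking at the implemented Startup class, the pipeline idea can be illustrated with a minimal, hypothetical sketch (not the thesis code): each component receives the request, may act on it, and decides whether to pass it on to the next component.

```csharp
public void Configure(IApplicationBuilder app)
{
    // Hypothetical inline middleware: logs every request, then passes it on.
    app.Use(async (context, next) =>
    {
        Console.WriteLine($"Incoming: {context.Request.Method} {context.Request.Path}");
        await next(); // call the next component in the pipeline
        Console.WriteLine($"Outgoing status: {context.Response.StatusCode}");
    });

    // Terminal middleware: requests that reach this point get a fixed reply
    // and are not passed any further.
    app.Run(async context =>
    {
        await context.Response.WriteAsync("Handled at the end of the pipeline");
    });
}
```

The ordering of the calls in Configure is exactly the ordering of the pipeline, which is why the order of the statements matters.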

The code for the Startup class is depicted in Listing 3. In its constructor, the class receives the following:

• An IConfiguration object that holds the deserialized contents of the appsettings.json file
• An ILogger object for logging events in Startup.ConfigureServices
• An IHostingEnvironment object that tells where and in which build configuration the application is run; it can be used to configure services based on the environment

public class Startup
{
    private readonly IConfiguration _config;
    private readonly IHostingEnvironment _env;
    private readonly ILogger _logger;

    public Startup(IConfiguration configuration, ILogger logger,
        IHostingEnvironment environment)
    {
        _config = configuration;
        _env = environment;
        _logger = logger;
    }

    // Add services to the .NET Core IoC container.
    public void ConfigureServices(IServiceCollection services) …

    // Configure the HTTP request pipeline.
    public void Configure(IApplicationBuilder appBuilder) …
}

Listing 3. The Startup class and its methods.


In addition to the arguments in the constructor, the class has two methods which are called when the application starts: ConfigureServices and Configure. These methods are looked at next.

While the host provides services via the Startup class's constructor, additional services are configured in the ConfigureServices method using the IServiceCollection object. Listing 4 shows a part of the ConfigureServices method and some of the important services that were configured within it.

// This method gets called by the runtime. Use this method to add services to the container.
public void ConfigureServices(IServiceCollection services)
{
    services.AddMvc().SetCompatibilityVersion(CompatibilityVersion.Version_2_1);

    services.AddSwaggerGen(c =>
    {
        c.SwaggerDoc("v1", new Swashbuckle.AspNetCore.Swagger.Info
        {
            Title = "JaakkoHe API",
            Version = "v1"
        });
    });

    services.AddCors(options =>
    {
        options.AddPolicy("CorsPolicy", builder =>
            builder.AllowAnyMethod()
                   .AllowAnyOrigin()
                   .AllowCredentials()
                   .AllowAnyHeader());
    });

    services.AddDbContext<ThesisDbContext>(options =>
        options.UseSqlServer(_config.GetConnectionString("ThesisConnectionString")));

Listing 4. Important third-party services within the ConfigureServices method

First, AddMvc adds many of the functionalities required by .NET Core 2.1 REST APIs. Then Swagger documentation is added to give developers and users a way to explore the endpoints the API provides. After that an open CORS policy is added, since the application runs on the intranet. Finally, the database connection is configured with the AddDbContext method. The DbContext is explored later in the thesis.

Listing 5 shows the two ways in which the implemented services were injected into the application's runtime using the built-in dependency injection functionality of the ASP.NET Core framework. Firstly, for testing with fake data and to remove the dependency on a real database, a "Debugging" configuration injects fake implementations of the services as singletons. Secondly, when a database connection is used, the services are injected with the AddScoped method. The lifecycle of a class injected this way lasts for one HTTP request.

switch (_env.EnvironmentName)
{
    case "Debugging": // Developing with mock data using fake services
        // add project services
        services.AddSingleton();
        services.AddSingleton();
        services.AddSingleton();
        services.AddSingleton();
        services.AddSingleton();
        services.AddSingleton();
        break;
    default: // Use a database
        // add project services
        services.AddScoped(typeof(IRepository<>), typeof(BaseRepository<>));
        services.AddScoped(typeof(ISprocRepository<>), typeof(SprocRepository<>));
        services.AddScoped(typeof(IListContentRepository<>), typeof(ListContentsRepository<>));
        services.AddScoped(typeof(IUserRepository), typeof(UserRepository));
        services.AddScoped(typeof(IDynamicTableRepository<>), typeof(DynamicTableRepository<>));
        services.AddScoped();
        services.AddScoped();
        break;
}

Listing 5. Dependency injection in the ConfigureServices-method

Services are the components that make up the application's functionality, such as the built-in routing, JSON formatting and controller-class discovery. In addition to the built-in features and third-party libraries, the Inversion of Control container is configured with the application's interfaces and the implementations that will be used. [15]

The Configure method lays out the order in which the different parts of the application are run. The order matters: for example, a user should be authenticated before the request is directed to the endpoint. The implemented pipeline can be seen in Listing 6.


public void Configure(IApplicationBuilder appBuilder)
{
    try
    {
        appBuilder.UseHttpsRedirection();
        appBuilder.UseAuthentication();

        if (_env.IsDevelopment() || _env.EnvironmentName == "Debugging")
        {
            appBuilder.UseDeveloperExceptionPage();
            _env.ConfigureNLog("nlog.Development.config");

            appBuilder.UseCors(corsBuilder => corsBuilder
                .WithOrigins(_config.GetSection("Frontend:Url").Value)
                .AllowAnyHeader());
        }
        else
        {
            _env.ConfigureNLog("nlog.config");
            appBuilder.UseHsts();
        }

        appBuilder.UseSwagger();
        appBuilder.UseSwaggerUI(swaggerBuilder => …);
        appBuilder.UseMvc();
    }
    catch (Exception exc)
    {
        appBuilder.Run(context =>
        {
            _logger.LogError(new EventId(), exc, exc.Message);
            context.Response.StatusCode = 500;
            return Task.CompletedTask;
        });
        throw;
    }
}

Listing 6. Startup.Configure method

Some rules apply whether the application is run in production or debugged locally. First in the pipeline, all requests are redirected to use HTTPS instead of HTTP, after which the configured authentication is run; enforcing HTTPS is the secure way to create APIs. The Swagger endpoints are then registered. Last in the pipeline are the catch-all exception handler, a component enforcing the CORS policy, and the functionalities of the ASP.NET Core 2.1 MVC library. [16]

For the release build, a specific logging configuration is loaded and the app is configured to use HSTS. HSTS is a web security policy against man-in-the-middle attacks and is generally honored by web browsers; phone or desktop applications do not follow the HSTS instruction. [16]


If the application is run with the development configuration, the app shows a developer exception page, which prints the stack trace and exception messages into the response. The logger uses a different configuration for the development environment, and the CORS policy accepts any headers, but only from the origin configured in the value Frontend:Url.

4.4.3 Routing

In web applications, routing is the act of directing requests to their correct endpoints. These requests can come from the address bar of a web browser, from an AJAX call, or from another application that uses HTTP as its protocol. In ASP.NET Core there are two ways to implement routing:

• Middleware-based routing, in which the Startup class is used to define the routes and the rulesets that dictate how requests are directed. This makes it possible to separate the routing logic into its own code and keep it in a single place if necessary.
• Attribute-based routing, which uses the C# attributes feature. In controller classes, both the classes themselves and their methods can be decorated with attributes that specify the route and the HTTP verb the class or method will be tied to.

These two ways are not mutually exclusive and can be used simultaneously. With a mix of both strategies it is possible to create general rules for routing and add exceptions via attributes. When both strategies are used, ASP.NET Core tries to find the best route for each HTTP request. In this thesis, only attribute-based routing was used, on both the classes and their methods.
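Since the thesis only uses attribute-based routing, the middleware-based alternative can be sketched as follows (a hypothetical example, not the thesis code): a single route template in Startup.Configure decides which controller and action each request maps to.

```csharp
public void Configure(IApplicationBuilder app)
{
    // Conventional (middleware-based) routing in ASP.NET Core 2.1:
    // "/products/list/5" maps to ProductsController.List(5),
    // and "/" falls back to HomeController.Index().
    app.UseMvc(routes =>
    {
        routes.MapRoute(
            name: "default",
            template: "{controller=Home}/{action=Index}/{id?}");
    });
}
```

All routing rules live in this one place, which is the main advantage of the middleware-based approach.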

The attributes were used to define the HTTP verbs and, in some cases, specific routes for certain methods. However, only the HttpPost and HttpGet attributes were used.

Listing 7 shows an HTTP endpoint: a method in a controller class that returns an ActionResult-type object. The attribute is shown above the method declaration.


[HttpPost("api/[controller]/[action]")]
public ActionResult Add([FromBody]PartitionRuleAddModel model)
{
    if (!ModelState.IsValid)
    {
        return BadRequest();
    }

    try
    {
        _partitionRulesService.AddPartitionRule(
            model.ListID,
            model.RecipientInfoID,
            model.ProductCode,
            model.PercentShare,
            model.CountShare,
            model.MaxOut,
            model.Priority,
            model.Active);
        return Ok();
    }
    catch (Exception innerException)
    {
        throw new ApplicationException(
            "Failed to add a new PartitionRule.", innerException);
    }
}

Listing 7. An MVC Controller’s endpoint.

ActionResult is the base class for the result of an action method. In the listing above, the two results returned are BadRequest and Ok. These return HTTP responses with their specific HTTP status codes.

4.5 Controllers

In the three-layer architectural model using ASP.NET Core, the implemented API layer consists of controller classes and of models for validation and data transfer. The controllers are responsible for catching specific HTTP requests based on their URL and HTTP verb, and they are largely based on the conventions set by ASP.NET Core.

The controller layer does not have to know what kind of client sends the requests and, in the case of a RESTful API, is not responsible for rendering any views. It takes requests, validates the payload and then returns proper responses, with data in JSON format, to the client application.


In ASP.NET Core all controllers inherit from a base controller class included in the libraries provided by the framework. This class is tied into the ASP.NET Core pipeline, which directs requests to their respective classes.

The ASP.NET Core conventions recommend naming all controllers with a 'Controller' suffix so that they are discoverable by the framework's built-in pipeline. The first part of the name then works in tandem with the routing system to define the class as an endpoint for a request. For example, a HomeController class would respond to requests made to the website/home URL. [17]
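A minimal, hypothetical illustration of this naming convention (not the thesis code):

```csharp
// The 'Controller' suffix is stripped by convention, so with a
// {controller}/{action} route this class serves requests to website/home.
public class HomeController : Controller
{
    // Responds to GET website/home (Index is the default action).
    public IActionResult Index() => Ok("home");
}
```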

4.5.1 Controller implementation

A total of ten controllers were created for the API: most of them represent a domain model, and one is the abstract base class. Because some data was dynamic in its signature, meaning that the returned object could have different properties depending on the state of the database, some controllers were mapped to return a dynamic object.

First a custom base class, inheriting from the framework's controller class, was created. Listing 8 shows the implemented BaseController class.

namespace JaakkoHe.Thesis.Api.Controllers
{
    [ApiController]
    public class BaseController<T> : ControllerBase where T : BaseController<T>
    {
        protected readonly ILogger _logger;

        public BaseController(ILogger<T> logger)
        {
            _logger = logger;
        }
    }
}

Listing 8. BaseController

The BaseController class was created to provide the _logger instance to the controllers. Other controllers inherit from this class and pass a typed logger instance to the BaseController via the type argument T.


4.5.2 DTO models

Classes that model the payload for data requests and responses were created in the API layer. Request-classes were mapped to the parameters expected in HTTP-requests and had validation rules in them using property attributes. Response-classes were mapped to the data that was required for the functionality of the client app. In general, these classes are called data transfer objects (DTO) as their main responsibility is to carry data, but not have any functionality.

An example of a model with validation attributes is shown in Listing 9. The validation required in the application was mostly limited to required fields, integer ranges and the lengths of string values. The requirements were dictated by how the database had been set up.

public class ExampleADDModel
{
    [Required(ErrorMessage = "ObjectId is required.")]
    [Range(1, int.MaxValue, ErrorMessage = "ObjectId must be a positive integer.")]
    public int ObjectId { get; set; }

    public string Code { get; set; }
    public string Name { get; set; }
    public string Description { get; set; }

    [Required(ErrorMessage = "RequiredDays is Required.")]
    [Range(0, int.MaxValue, ErrorMessage = "RequiredDays must be a positive integer.")]
    public int RequiredDays { get; set; }

    [StringLength(512, ErrorMessage = "Info cannot be more than 512 characters long.")]
    public string Info { get; set; }

    public int TotalAvailable { get; set; }
}

Listing 9. A DTO model for adding a new entry.

Part of ASP.NET Core's MVC functionality is the binding of HTTP request data to matching C# objects, and vice versa in the case of HTTP responses. It is part of the MVC filter pipeline, which was configured in the Startup class methods using the AddMvc and UseMvc methods. Model binding matches the property names of a class to the parameter names in the specific HTTP data.

In Listing 10 the method parameter updateModel has been decorated with the FromBody attribute, which tells the application to bind the ExampleUPDModel from the data in the HTTP request's body. For GET requests, the data was passed in the URL of the request.

// POST: api/Example/Update
[HttpPost("api/[controller]/[action]")]
public ActionResult Update([FromBody]ExampleUPDModel updateModel)

Listing 10. An endpoint method using the FromBody attribute.

A total of 15 DTO models were created for the application, as shown in Figure 6. The classes were named using a convention in which the class name tells the reader the purpose of the class. Each class is used either for an HTTP request or for an HTTP response, but never for both.

Figure 6. Implemented DTO models.

A few common types of implemented models and their descriptions are given in Table 2. Some models, such as rows and cells, were used as part of a collection or a batch.


Table 2. Naming conventions and descriptions for the DTO models

Class naming convention   Description                                                  Validation
AddModel                  Used for adding a new entry.                                 Yes
UpdateModel               Used for updating an existing entry.                         Yes
DeleteModel               Used for deleting an existing entry.                         Yes
RevertModel               Special case where an external process had to be reverted.   Yes
BatchAddModel             Special case where multiple entries had to be added.         Yes
                          Consists of Rows which have Cells.
ViewModel                 Used for showing only the required fields for the client.    No

4.6 Data-access layer

Regarding the overall architecture, the most problematic and difficult part ended up being the data-access layer. Its four main parts are the services, the repositories, the DbContext class and the entities. The division between services and repositories, the repository pattern, was somewhat agreed to be a best practice with Entity Framework, but in Entity Framework Core it is seen as an anti-pattern.

Most of the data access layer was written using generics. Generics make it possible to write code that does not know the exact type of the objects it operates on. The declaration of a generic class is written using angle brackets (for example <T>). The consuming code then specifies the concrete type, which is passed on to the generic implementation. [18]
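The mechanism can be shown with a minimal, hypothetical example (the Box class is illustrative only, not part of the thesis code):

```csharp
// A generic container: TItem is a placeholder type that is fixed
// only when the class is used.
public class Box<TItem>
{
    public TItem Value { get; }

    public Box(TItem value) => Value = value;
}

// The concrete type is supplied at the call site:
var numberBox = new Box<int>(42);        // TItem is int here
var textBox = new Box<string>("hello");  // TItem is string here
```

The repository classes presented later use the same mechanism, with the type parameter constrained to the entity classes.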

4.6.1 Entity classes

Entity classes are a central part of Entity Framework; the framework uses them to build the model that is used when accessing a database. They are C# classes whose properties reflect the database tables and columns when a relational database is used. For example, a User class will be mapped to the Users table, and its properties, such as Name, will be mapped to the respective columns of the Users table.


These mappings can be overridden, but Entity Framework Core's default conventions were used in most cases. The entities could be configured either with the Fluent API in the DbContext class or with data annotation attributes in the entities themselves; the latter was used in the thesis. Some exceptions and unconventional solutions were used in the entities. [19]

The model created in the thesis consisted of 14 classes, and an abstract BaseEntity class was created only to tie all entities together. There were three different kinds of entities:

• The simplest entities were used only for fetching data.
• Half of the entities had a dependency on a List table and were subclasses of a ListEntity class.
• Entities that were used in stored procedure calls had to implement an ISprocCallingEntity interface.

Figure 7 shows the first half of the entities, which inherit directly from the BaseEntity. Two of these entities also implement the ISprocCallingEntity interface.

Figure 7. Implemented entities which inherit directly from BaseEntity.


The other half of the entities required a property for a dependency on a table key that was not available to be modeled. These entities inherited from the ListEntity class, as shown in Figure 8. Five ListEntity classes also implemented the ISprocCallingEntity interface.

Figure 8. Implemented entities which inherit from ListEntity-class

There were differences between the names of the entities and the actual SQL database views and their columns, so the Table and Column annotations were used to declare these explicitly. The ObjectId property was annotated as the primary key of the entity, and its value was generated by the database.

An example implementation of an entity that inherits from the BaseEntity and was used only for fetching data is seen in Listing 11. Both the class and its properties are decorated with data annotations from the System.ComponentModel.DataAnnotations namespace.

using System.Collections.Generic;
using System.ComponentModel.DataAnnotations;
using System.ComponentModel.DataAnnotations.Schema;

namespace JaakkoHe.Thesis.Data.Model.Entities
{
    /// <summary>
    /// This class and its properties form the data for the table on the frontpage
    /// </summary>
    [Table("vInterface_Example")]
    public class ExampleEntity : BaseEntity
    {
        [Key]
        [Column("ObjectID")]
        public int ObjectId { get; set; }

        [Column("AccentColor")]
        public string AccentColor { get; set; }

        [Column("RowOrder")]
        public long RowOrder { get; set; }

        [Column("Channel")]
        public string Channel { get; set; }

        [Column("ObjectCode")]
        public string ObjectCode { get; set; }
    }
}

Listing 11. Example Entity implementation

4.6.2 Updating the database through entity’s methods

At the time of the implementation, Entity Framework Core's feature set was only partially complete and was missing the functionality to call SQL Server's stored procedures. The classes used for inserting, updating or deleting rows in the database were therefore made to include the related stored procedure's parameters.

These entities implemented the ISprocCallingEntity interface shown in Listing 12. The interface declares one method for fetching the name of the stored procedure and another for the parameters used by it. As an argument the methods accept a DbOperationTypes enumeration value, which defines the type of the procedure to be called.

namespace JaakkoHe.Thesis.Data.Model.Entities
{
    public enum DbOperationTypes
    {
        Insert,
        Update,
        Delete
    }

    public interface ISprocCallingEntity
    {
        string GetSprocName(DbOperationTypes opType);
        object[] GetSprocParameters(DbOperationTypes opType);
    }
}

Listing 12. Enumeration and interface implementations for entities updated through SQL server’s stored procedures

Listing 13 depicts how the GetSprocName methods were implemented in the thesis. A switch clause on the DbOperationTypes enumeration determines which name the function returns. Not all of the available DbOperationTypes were supported in every case, and in those cases an InvalidOperationException was thrown.

public string GetSprocName(DbOperationTypes opType)
{
    switch (opType)
    {
        case DbOperationTypes.Insert:
            return "InsertListPartitionRule";
        case DbOperationTypes.Update:
            return "UpdateListPartitionRule";
        case DbOperationTypes.Delete:
            return "DeleteListPartitionRule";
        default:
            throw new InvalidOperationException(
                "Operation not supported for Entity type: " + this.GetType().Name);
    }
}

Listing 13. The implemented GetSprocName-method


Listing 14 shows an example implementation of how the parameters were returned by the GetSprocParameters method. A switch clause is again applied to the DbOperationTypes enumeration, and the returned array is built differently for each operation.

public object[] GetSprocParameters(DbOperationTypes opType)
{
    switch (opType)
    {
        case DbOperationTypes.Insert:
            object[] insertParams = new object[8];
            insertParams[0] = this.ListId;
            insertParams[1] = this.RecipientInfoId;
            insertParams[2] = this.ProductCode;
            insertParams[3] = this.PercentShare;
            // the remaining insert parameters are assigned in the same way
            return insertParams;
        case DbOperationTypes.Update:
            object[] updateParams = new object[9];
            updateParams[0] = this.RuleID;
            updateParams[1] = this.ListId;
            updateParams[2] = this.RecipientInfoId;
            updateParams[3] = this.ProductCode;
            // the remaining update parameters are assigned in the same way
            return updateParams;
        case DbOperationTypes.Delete:
            object[] deleteParams = new object[1];
            deleteParams[0] = this.RuleID;
            return deleteParams;
        default:
            throw new InvalidOperationException();
    }
}

Listing 14. The implemented GetSprocParameters-method

The order of the parameters was dictated by how the stored procedure had been declared, so sometimes the used parameters were identical and sometimes they varied only in order. As before, not all operations were supported by all entities that implemented the interface.

4.6.3 DbContext

DbContext is the object through which Entity Framework Core accesses the underlying database. It can be thought of as a session that is used when accessing the database. Typically the DbContext class is inherited, and the accessors to the database are declared in the subclass as properties of type DbSet. These DbSets can then be queried using LINQ. The implemented DbContext is shown in Listing 15.

using JaakkoHe.Thesis.Data.Model.Entities;
using Microsoft.EntityFrameworkCore;

namespace JaakkoHe.Thesis.Data.Model.Db
{
    public class ThesisDbContext : DbContext
    {
        public ThesisDbContext(DbContextOptions options)
            : base(options)
        {
        }

        public virtual DbSet ListAvailability { get; set; }
        public virtual DbSet Lists { get; set; }
        public virtual DbSet ListsDetails { get; set; }
        public virtual DbSet DistributionsReports { get; set; }
        public virtual DbSet MemberinfoParameters { get; set; }
        public virtual DbSet MemberinfoOptions { get; set; }
        public virtual DbSet ActivePartitions { get; set; }
        public virtual DbSet PartitionRules { get; set; }
        public virtual DbSet RecipientInfos { get; set; }
        public virtual DbSet Users { get; set; }
    }
}

Listing 15. The implemented DbContext-class

The properties of the DbContext were declared as virtual in order to use lazy loading. With lazy loading it is possible to fetch data only for the objects that are actually referenced in the accessing code block. In some cases this means that some data is never loaded from the SQL database, which can lead to better performance. [20]
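As a sketch of how lazy loading is typically switched on in EF Core 2.1 (this is an assumption about the setup, not the thesis configuration): the Microsoft.EntityFrameworkCore.Proxies package is added and proxy generation enabled when the context is registered.

```csharp
// Hypothetical registration in Startup.ConfigureServices; the context name
// and connection string key follow the conventions used elsewhere in the text.
services.AddDbContext<ThesisDbContext>(options =>
    options
        .UseLazyLoadingProxies() // generate proxies that load related data on first access
        .UseSqlServer(_config.GetConnectionString("ThesisConnectionString")));
```

The proxies require the overridable (virtual) properties mentioned above, since the generated subclass overrides them to intercept the first access.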

4.6.4 Repositories

To hide the references to Entity Framework Core, a repository layer was written. At first, the repository layer had only a simple interface for fetching data, as shown in Listing 16.

using System.Collections.Generic;
using JaakkoHe.Thesis.Data.Model.Entities;

namespace JaakkoHe.Thesis.Data.Model.Repositories
{
    /// <summary>
    /// Basic repository interface for fetching (add/update maybe) entities from database.
    /// </summary>
    public interface IRepository<TEntity> where TEntity : BaseEntity
    {
        TEntity GetById(int id);
        IList<TEntity> ListAll();
    }
}

Listing 16. The implemented IRepository interface

The IRepository interface has only two methods: one for fetching a single item and another for listing all items of the specified type. The type parameter TEntity is one of the implemented entity classes, and at runtime it defines the behavior of the concrete repository. The other repository interfaces inherit from this one. An overview of the interfaces written for the application core is shown in Figure 9.

Figure 9. Implemented interfaces for the repository-layer.

Because the create, update and delete queries were handled by the database's stored procedures, an interface called ISprocRepository was created. The interfaces mostly used generics to decide which entity or table was to be updated, but in a few special cases generics either could not be used, or the interface was made for one specific purpose.

To reduce code duplication, a base class for the repositories was written using generics, as shown in Listing 17. The type parameter TEntity is constrained so that the concrete implementation's type must inherit from the BaseEntity class; this is the same constraint that was introduced in the IRepository interface.

using JaakkoHe.Thesis.Data.Model.Db;
using JaakkoHe.Thesis.Data.Model.Entities;
using System;
using System.Collections.Generic;
using System.Linq;

namespace JaakkoHe.Thesis.Data.Model.Repositories
{
    public abstract class BaseRepository<TEntity> : IRepository<TEntity>
        where TEntity : BaseEntity
    {
        protected readonly ThesisDbContext _context;

        public BaseRepository(ThesisDbContext context)
        {
            _context = context ?? throw new ArgumentNullException(
                "DbContext was not passed to repository");
        }

        public TEntity GetById(int id)
        {
            return _context.Set<TEntity>().Find(id);
        }

        public IList<TEntity> ListAll()
        {
            return _context.Set<TEntity>().ToList() ?? new List<TEntity>();
        }
    }
}

Listing 17. The implemented abstract BaseRepository-class.

The class receives the Entity Framework ThesisDbContext object in its constructor and stores it in the protected variable _context. The variable is then used in the methods, giving access to the required DbSet properties and their methods. The GetById(int id) method uses the Find method of a DbSet, which accepts the id of the object as a parameter; if no match is found, the method returns null. The ListAll method returns all records of the given entity type.


Classes which inherit from BaseRepository implement the interfaces shown in figure 9 and follow the same conventions for generics that were used in the BaseRepository.

Explicit implementations of the generic TEntity type are declared in the service classes that use the repositories through the interfaces. The repository properties are defined with the explicit type in the declaration of the variable. The services are explored in the following chapter.

4.6.5 Services

The responsibilities of the service layer were validation, error handling and logging. The service classes used the repository classes through their public interfaces. An explicit service was created for each entity that required some logic in the application.

Listing 18 shows a service declaring the type of the repository it uses. The service does not know the repository's implementation; as long as the repository's interface matches the required functionality, any implementation can be used.

using JaakkoHe.Thesis.Data.Model.Entities;
using JaakkoHe.Thesis.Data.Model.Repositories;
using System.Collections.Generic;

namespace JaakkoHe.Thesis.Data.Model.Services
{
    public class ExampleEntityService : BaseService, IExampleService
    {
        private readonly IRepository<ExampleEntity> _exampleEntityRepository;

        public ExampleEntityService(IRepository<ExampleEntity> exampleEntityRepo)
        {
            _exampleEntityRepository = exampleEntityRepo;
        }

        public IList<ExampleEntity> GetExampleEntities()
        {
            return _exampleEntityRepository.ListAll();
        }
    }
}

Listing 18. Example of a service which uses a repository for database interactions.


4.7 Unit tests and fake implementations

The solution also included a testing project, which contained unit tests for certain parts of the code, although a high enough code coverage was not achieved. In addition to the tests, a set of fake implementations of the services was created.

Test automation is a critical part of software development, and various practices, for example Test Driven Development (TDD), revolve around testing. However, testing requires a heavy time investment and is sometimes seen as a nice-to-have feature, resulting in few, hastily written tests.

4.7.1 Unit test naming convention

Unit testing is the process of testing individual functions with predefined data and an expected result. Writing testable code is oftentimes difficult for new and even experienced developers but can be learned over time by following general coding practices. Unit tests are not limited to server application code but can also be written for modern UI frameworks.

There seems to be no single consensus on the naming and structuring of unit tests, but a choice was made by comparing a few key points. Firstly, because Visual Studio's test explorer was used to run unit tests on the development machine, the unit tests needed to be readable in that tool. Secondly, the overall structure had to be straightforward enough to follow the application's architecture.

A naming convention which matched these requirements was found: in 2011, Steve “Ardalis” Smith proposed a style where each tested function has its own test class, and each expected outcome has its own test in that class. Figure 10 shows the chosen style in practice and depicts the results of a test run.
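Sketched in code, the convention groups one tested method per class, with one [Fact] per expected outcome. The class and method names below are illustrative, and a stand-in attribute is declared so the sketch is self-contained without the xunit package:

```csharp
using System;

// Stand-in for Xunit's FactAttribute so this sketch compiles without xunit.
[AttributeUsage(AttributeTargets.Method)]
public class FactAttribute : Attribute { }

// One class per tested method, named "<Method>Should"; each [Fact] names one
// expected outcome, so Test Explorer reads like a sentence:
// GetByListIdShould.ReturnEntitiesGivenValidId, GetByListIdShould.ThrowGivenNegativeId.
public class GetByListIdShould
{
    [Fact]
    public void ReturnEntitiesGivenValidId()
    {
        // arrange, act, assert for the "valid id" outcome
    }

    [Fact]
    public void ThrowGivenNegativeId()
    {
        // arrange, act, assert for the "negative id" outcome
    }
}
```

Grouping tests this way keeps the test tree aligned with the application's class structure, which was the second requirement above.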


Figure 10. Visual Studio 2019 Test Explorer window

4.7.2 Unit test frameworks and libraries

The selection of the unit testing framework was limited to the few frameworks available on the .NET Core platform, the three major ones being MSTest, NUnit and XUnit. XUnit was chosen because, unlike MSTest and NUnit, it follows common testing conventions and uses a syntax similar to testing frameworks available in other languages.

Another library used in the unit tests was Moq, a mocking library that can be used to create mock implementations of objects, which can then be configured to return expected values. The use of mocks is central to unit testing because it allows testing only the code within the required scope without calling the actual nested functions. This way the tests have fewer dependencies on external implementations and can be repeated.
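The idea can be sketched without Moq by hand-writing the kind of object a mock generates: it returns a configured value and records its calls for later verification. (With Moq itself the equivalent would be `new Mock<IExampleRepository>()`, `mock.Setup(r => r.GetByListId(1)).Returns(...)` and `mock.Verify(r => r.GetByListId(1), Times.Once())`; the interface below is illustrative, not from the project.)

```csharp
using System.Collections.Generic;

// Illustrative dependency to be mocked.
public interface IExampleRepository
{
    IList<string> GetByListId(int listId);
}

// Hand-written stand-in showing what a Moq mock provides: a configured
// return value plus a record of calls that the test can verify afterwards.
public class MockExampleRepository : IExampleRepository
{
    public int GetByListIdCallCount { get; private set; }
    public IList<string> ConfiguredResult { get; set; } = new List<string>();

    public IList<string> GetByListId(int listId)
    {
        GetByListIdCallCount++;      // record the call for later verification
        return ConfiguredResult;     // return the configured value; no database needed
    }
}
```

Because the code under test only sees IExampleRepository, the real repository never runs, and the test stays fast and repeatable.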


4.7.3 Structure of the unit tests

The unit tests were written in the Arrange, Act, Assert (AAA) pattern. The pattern is a common, framework-agnostic way to organize code within a single test. The test is divided into three sections:

• In arrange, the objects are initialized, mocks are created and the test data is passed to the method under test.
• In act, the method under test is called.
• In assert, the result of the test is compared to the expected behavior.

Listing 19 shows an example of a simple test class using the aforementioned conventions and libraries.

using JaakkoHe.Thesis.Data.Model.Entities;
using JaakkoHe.Thesis.Data.Model.Repositories;
using JaakkoHe.Thesis.Data.Model.Services;
using Moq;
using System;
using System.Collections.Generic;
using Xunit;

namespace JaakkoHe.Thesis.Data.Test.Unit_Tests.ExampleEntity_Tests
{
    public class GetByListIdShould
    {
        private readonly int _validId = 1;

        private Mock<IExampleEntityRepository> _moqRepository;

        public GetByListIdShould()
        {
            _moqRepository = new Mock<IExampleEntityRepository>();
        }

        [Fact]
        public void CallRepositoryMethod_GetByListId()
        {
            _moqRepository.Setup(p => p.GetByListId(_validId))
                .Returns(new List<ExampleEntity>() { new ExampleEntity() });
            // the service's second dependency; its exact type is not recoverable
            // from the extracted text, ISprocRepository is assumed here
            var exampleEntityService = new ExampleEntityService(
                _moqRepository.Object, Mock.Of<ISprocRepository>());
            IList<ExampleEntity> exampleEntity = null;

            Action act = () => exampleEntity = exampleEntityService.GetByListId(_validId);
            act();
            Assert.NotNull(exampleEntity);
        }
    }
}

Listing 19. An example unit test class using XUnit and Moq libraries.


The class tests that the method GetByListId is called only one time and that the return value is not null with the initialized data. The test method uses a mocked repository, _moqRepository, which is instantiated using the Moq framework's Mock class. The calls to the mocked object's methods can then be verified in the assert section. The Assert.NotNull(Object) method from the XUnit framework was used to test the return value of the tested method.

4.7.4 Fake services and data initializers

Fakes and stubs are another way of writing classes that for various reasons do not work identically to the actual implementations used in production code. They usually take shortcuts, return predefined data, or use in-memory implementations to avoid external dependencies. [21]

The fake service implementations were created because access to an up-to-date database was not available during development, and to support rapid development of the client app without requiring a connection to a database instance. Each fake consisted of a service implementation using an in-memory data store and a data initializer helper class. The data was then the same every time the backend service was started with the fake implementations, and the services behaved predictably.

The data initializing classes were static classes that returned a set of entities as an IQueryable from an Initialize method. The return value of the method was then copied into a private List variable of the fake service. The implementation of a fake service and the usage of a data initializer are shown in listing 20.


using JaakkoHe.Thesis.Data.Model;
using JaakkoHe.Thesis.Data.Model.Entities;
using JaakkoHe.Thesis.Data.Model.Services;
using JaakkoHe.Thesis.Data.Test.TestData;
using System;
using System.Collections.Generic;
using System.Linq;

namespace JaakkoHe.Thesis.Data.Test.Fakes
{
    public class FakeEntityService : IExampleEntityService
    {
        private List<ExampleEntity> _exampleEntities;

        public FakeEntityService()
        {
            _exampleEntities = EntitiesInitializer.Initialize().ToList();
        }

        public DynamicTableResult GetExampleEntities(int listid)
        {
            if (listid < 0)
                throw new ArgumentException("Invalid listid when fetching Entities.");

            return EntitiesInitializer.CreateTableResult(listid);
        }

        public void RevertEntity(int listId, int executedEntityId)
        {
            var targetExampleEntityIndex = _exampleEntities?.FindIndex(a =>
                a.ExecutedEntityId == executedEntityId && a.ListId == listId);

            // FindIndex returns -1 when no match is found, so the value must be
            // checked as well as the null from the conditional access
            if (!targetExampleEntityIndex.HasValue || targetExampleEntityIndex.Value < 0)
            {
                throw new ArgumentException(
                    $"ExampleEntity with ExecutedEntityId {executedEntityId} not found.",
                    nameof(executedEntityId));
            }

            _exampleEntities.RemoveAt(targetExampleEntityIndex.Value);
        }

        private long GetNextRowOrder(List<ExampleEntity> entities)
        {
            return entities.Max(ap => ap.RowOrder) + 1;
        }
    }
}

Listing 20. An example of a fake service using an in-memory list for objects.
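The Initialize method itself is not shown in listing 20. A minimal sketch of such a static data initializer, with an assumed entity shape and seed values, could look like this:

```csharp
using System.Collections.Generic;
using System.Linq;

// Illustrative entity (redeclared here so the sketch is self-contained).
public class ExampleEntity
{
    public int ListId { get; set; }
    public long RowOrder { get; set; }
}

// A static data initializer as described above: it builds a fixed set of
// entities and exposes them as IQueryable<T>, so every start of a fake-backed
// service begins from the same predictable data.
public static class EntitiesInitializer
{
    public static IQueryable<ExampleEntity> Initialize()
    {
        return new List<ExampleEntity>
        {
            new ExampleEntity { ListId = 1, RowOrder = 1 },
            new ExampleEntity { ListId = 1, RowOrder = 2 },
            new ExampleEntity { ListId = 2, RowOrder = 1 }
        }.AsQueryable();
    }
}
```

A fake service then copies the seeded data into its own list in the constructor, as listing 20 does with EntitiesInitializer.Initialize().ToList().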

4.8 Summary of the implementation

Most of the application logic follows the conventions set by the underlying ASP.NET Core and Entity Framework Core frameworks. However, more recent versions of .NET Core introduce changes to how web applications should be written.


The code examples listed in this thesis are just a part of the actual project. The size of the API ended up being a little less than 3700 lines of code. Figure 11 shows how the code is divided between different parts of the application.

The chart in figure 11 breaks the total down as follows: API layer 1098 lines, application core and repository layer 1477 lines, and unit tests and fake implementations 1107 lines.

Figure 11. Lines of code in each code project

The application core and the repository layer ended up being the largest part of the codebase, which is no surprise as they handle most of the critical logic in the application. The cost of writing unit tests can also be seen in the figure: while the tests did not cover most of the written code, the test project still ended up being larger than the API layer.

A GitHub Enterprise repository was set up for source control using a trunk-based branching strategy. The implemented CI/CD pipeline, consisting of GitHub, TeamCity and Octopus Deploy, was configured to automate the build and deployment processes for the three environments: development, QA and production.


5 Conclusions

The objective was to implement a RESTful API with the .NET Core framework and to configure a CI/CD pipeline with TeamCity and Octopus Deploy. This was done at a time when the technologies were not yet familiar, and .NET Core was still unproven as a mature enough choice for production software.

The API was deployed to production use in early 2019 and has remained active since. While there have been changes to the TeamCity and Octopus Deploy configurations, the groundwork done during the implementation was helpful during the early life of the application.

Overall, the project was a success and served as a great introduction to the changes that were coming with .NET Core. As it was largely a one-man project with a tight schedule and limited human resources, a wider look at software engineering was required. It also gave an opportunity to learn more about important new areas of expertise outside of writing code.

5.1 Successes

In 2020, .NET Core seems to be the chosen way forward not only for Microsoft's software development platform but also for the employer. Learning the new features of ASP.NET Core was extremely useful, and the choice to use them was the correct one for all involved parties.

Getting acquainted with the employer's toolsets for CI/CD and creating a containerized application with Docker using Visual Studio proved to be useful skills later during the employment. Docker had already gained a strong foothold, but its importance only grew in the following years, and multiple other projects ended up using it.


5.2 Further improvements

The API part of the application could have been improved by following the guidelines on HTTP verb usage more strictly. The controller endpoints could also have been written to use more of the ASP.NET Core features and to take advantage of third-party libraries.

The reliance on stored procedures and the early state of Entity Framework Core's support for them ended up being a big problem. Although the solution of having entities implement an interface exposing the parameters and names of the stored procedures did in the end work fine for the application, it does not follow the good practices of decoupling implementations or avoiding dependencies between classes. Using data-annotated entities made them tightly coupled to the implementation details; data annotations could have been avoided by using the Fluent API features of Entity Framework Core. These flaws also break the single-responsibility principle of Robert C. Martin's SOLID principles.
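As a sketch of that alternative (assuming Entity Framework Core 2.x; the entity, table and column names below are illustrative, not from the project), the mapping would move out of the entity into a separate configuration class built with the Fluent API:

```csharp
using Microsoft.EntityFrameworkCore;
using Microsoft.EntityFrameworkCore.Metadata.Builders;

// The entity stays a plain class, free of [Table]/[Column] data annotations.
public class ExampleEntity
{
    public int Id { get; set; }
    public long RowOrder { get; set; }
}

// The mapping details live in one dedicated configuration class instead.
public class ExampleEntityConfiguration : IEntityTypeConfiguration<ExampleEntity>
{
    public void Configure(EntityTypeBuilder<ExampleEntity> builder)
    {
        builder.ToTable("ExampleEntities");      // instead of [Table("ExampleEntities")]
        builder.HasKey(e => e.Id);
        builder.Property(e => e.RowOrder)
               .HasColumnName("row_order");      // instead of [Column("row_order")]
    }
}

// Applied once when the model is created:
public class ThesisDbContext : DbContext
{
    protected override void OnModelCreating(ModelBuilder modelBuilder)
    {
        modelBuilder.ApplyConfiguration(new ExampleEntityConfiguration());
    }
}
```

With this arrangement the entity classes have a single responsibility, and the database mapping can change without touching them.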

The use of generics for the repositories and services did speed up the development of some parts of the code, but with no prior experience in using generics the solution took some time to implement. As a learning process, however, it was good.


References

1 What is .NET? An open-source developer platform [internet] Available from: https://dotnet.microsoft.com/learn/dotnet/what-is-dotnet

2 Announcing .NET Core 1.0 | .NET blog [internet] Microsoft.com Available from https://devblogs.microsoft.com/dotnet/announcing-net-core-1-0/

3 What’s new in .NET 5 | Microsoft Docs [internet] docs.microsoft.com Available from https://docs.microsoft.com/en-us/dotnet/core/dotnet-five

4 .NET Standard | Microsoft Docs [internet] docs.microsoft.com Available from https://docs.microsoft.com/en-us/dotnet/standard/net-standard

5 The future of .NET Standard | .NET Blog [internet] microsoft.com Available from https://devblogs.microsoft.com/dotnet/the-future-of-net-standard/

6 .NET Platform GitHub [internet] github.com Available from https://github.com/dotnet

7 .NET | Free. Cross-platform. Open Source [internet] Microsoft.com Available from https://dotnet.microsoft.com/

8 What is ASP.NET Core? A cross-platform web-development framework [internet] Microsoft.com Available from: https://dotnet.microsoft.com/learn/aspnet/what-is-aspnet-core

9 Introduction to ASP.NET Core | Microsoft Docs [internet] Microsoft.com Available from https://docs.microsoft.com/en-us/aspnet/core/introduction-to-aspnet-core?view=aspnetcore-5.0

10 NuGet documentation | Microsoft Docs [internet] Microsoft.com Available from https://docs.microsoft.com/en-us/nuget/

11 GitHub Enterprise · A smarter way to work together [internet] github.com Available from https://github.com/enterprise

12 What is REST – REST API Tutorial [internet] restfulapi.net Available from https://restfulapi.net/

13 Richardson maturity model [internet] MartinFowler.com Available from: https://martinfowler.com/articles/richardsonMaturityModel.html

14 ASP.NET Core Web Host | Microsoft Docs [internet] docs.microsoft.com Available from: https://docs.microsoft.com/en-us/aspnet/core/fundamentals/host/web-host?view=aspnetcore-2.2

15 App startup in ASP.NET Core | Microsoft Docs [internet] docs.microsoft.com Available from: https://docs.microsoft.com/en-us/aspnet/core/fundamentals/startup?view=aspnetcore-2.2


16 Enforce HTTPS in ASP.NET Core | Microsoft Docs [internet] Microsoft.com Available from https://docs.microsoft.com/en-us/aspnet/core/security/enforcing-ssl?view=aspnetcore-2.2&tabs=visual-studio

17 Routing in ASP.NET Core | Microsoft Docs [internet] Microsoft.com Available from https://docs.microsoft.com/en-us/aspnet/core/fundamentals/routing?view=aspnetcore-2.1

18 Generics – C# Programming Guide | Microsoft Docs [internet] Microsoft.com Available from https://docs.microsoft.com/en-us/dotnet/csharp/programming-guide/generics/

19 Creating and configuring a model – EF Core | Microsoft Docs [internet] Microsoft.com Available from https://docs.microsoft.com/en-us/ef/core/modeling/

20 Loading Related Entities – EF6 | Microsoft Docs [internet] Microsoft.com Available from https://docs.microsoft.com/en-us/ef/ef6/querying/related-data#lazy-loading

21 Test Doubles – Fakes, Mocks and Stubs. | by Michal Lipski [internet] blog.pragmatists.com Available from https://blog.pragmatists.com/test-doubles-fakes-mocks-and-stubs-1a7491dfa3da