Welcome! Today I'll show you how to create a continuous integration (CI) pipeline to automatically build and test all changes to your GitHub repository. You will learn how to enable continuous integration, also known as CI, with Azure Pipelines; what a YAML-based pipeline is and why to use it; how to create a pipeline that runs on any change to your GitHub repository; how to diagnose and fix issues detected by the pipeline; and how to report the status of the pipeline on GitHub. Before diving into how to create a pipeline, it is good to understand the typical sequence of steps in Azure Pipelines and how it enables continuous integration scenarios.
It all starts with a software developer who has some code ready to go on their box, ready to be pushed via Git to their remote GitHub repository. The push happens either directly or via a pull request, and at that point GitHub has already been configured to talk to Azure Pipelines, to notify it that such an event has happened. So GitHub goes ahead and notifies your project in Azure Pipelines: a push happened, let's say to the master branch. At that point Azure Pipelines will read and evaluate what we call the pipeline definition, which is a YAML file stored right in your GitHub repository. Azure Pipelines reads it, and the file tells it all the steps to execute for this pipeline, where to execute the pipeline, and any constraints or other configuration related to the execution of the pipeline.
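To give you an idea of the shape of that file, here is a minimal sketch of an azure-pipelines.yml (we build the real one later in this video; the script step here is just an illustrative placeholder):

```yaml
# Minimal Azure Pipelines definition, stored at the root of the repository.
trigger:
- master                     # run on every push to master

pool:
  vmImage: 'ubuntu-latest'   # Microsoft-hosted Linux agent

steps:
- script: echo Hello, pipelines!   # placeholder; real build/test steps come later
```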
Once it reads that file, it will go ahead and queue what we call a run, a series of tasks to execute. It will queue this run in what we call an agent pool. The agent pool is a set of machines that are ready to receive the requests coming from Azure Pipelines. This agent pool can be a series of VMs (virtual machines), it could be physical machines, just normal physical boxes connected to Azure Pipelines, or it could be Docker containers. The most common way to do things these days is via virtual machines, but a more interesting way is via containers, and we will see that in this video in a few moments. Now, these machines can either be hosted by Microsoft, in the Azure DevOps product, or they can be self-hosted. If you don't want to worry too much about how to prepare these machines, or how to connect them to Azure Pipelines so that they can execute your pipeline, you would just go with Microsoft-hosted; in this case they are VMs, and for public projects you get up to 10 concurrent pipelines that can run simultaneously. Of course, while it is easy to just use those, it has its own restrictions: you don't have any control over the software that goes into those machines, or over the specs of the machines themselves, so depending on what you want to do they may or may not be convenient for you. The other option, self-hosted, has the benefit that you can prepare the entire machine yourself with the exact specs that you need, but of course it's an ongoing maintenance task to care for those machines. So it's up to you which one you want to use; we will be using Microsoft-hosted in this video.
These machines can also be configured to use either the Linux, Windows, or macOS operating systems, and it really depends on what you want to do in the pipeline. If, let's say, you want to build a microservice, it can usually run just fine on Linux, so you would pick a Linux VM. But if you want to do things like using the .NET Framework, or you want to build a UWP (Universal Windows Platform) project, you may need to go with Windows-based VMs. And if you want to build something for iOS, most likely you have to go with macOS.
Then, after all that has happened, an agent machine will be selected from the pool, and the agent will go ahead and pull the source code that you have on GitHub into that machine, and the series of tasks configured in your pipeline will execute. One of the most typical and basic tasks is the build step, where we build the code just as you would have built it on your box; in the .NET case that's dotnet build. The agent does that for you, building the code automatically, and once it's built it can do other things, like running the tests, via dotnet test or any other test runner that you have configured; it will run all those tests for you. Finally, it will publish the results into the Azure Pipelines UI, and it can also send all sorts of notifications, like emails, if you want to know what happened with the pipeline. So this is the overall flow of Azure Pipelines; it will vary a lot, especially in terms of the tasks executed, depending on what you have configured in your YAML pipeline. And of course there's another side of this, which is the deployment story, continuous deployment, which we will not cover in this video.
Now, here are a few things that we will be using in this tutorial: first, a couple of .NET Core projects already published in a GitHub repository; second, Git, which we will use to manage changes to the repository; third, the .NET Core 3.0 SDK, which we will need to build and test the code locally; and finally, Visual Studio Code, which we will use as our code editor. You could of course use any other code editor that works best for you.
To illustrate how to enable continuous integration with Azure Pipelines, we're going to use the hello-pipelines repository that I have already published to GitHub. This repository has just a couple of very simple .NET Core 3.0 projects. The first one is a Web API, very similar to the one you would get with dotnet new webapi in the .NET Core CLI, and the main thing about this project is the controller we have here, the weather forecast controller, which has just one API: Get. What it does is return a list, or a collection, of weather forecasts. Each of these weather forecasts has a date, a temperature, and a summary, and that summary is just a random string out of the strings you can see at the top. The other project we have here is a little test project, an xUnit project that has just one test class, and in that class one very simple test that invokes that API and confirms that the expected number of days are being returned.
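As a rough sketch of what those two pieces look like, here is template-style code; the exact code lives in the hello-pipelines repository, so treat the names here as assumptions following the standard dotnet new webapi template, with NullLogger standing in for the logger stub the test passes in:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Extensions.Logging;
using Microsoft.Extensions.Logging.Abstractions;
using Xunit;

public class WeatherForecast
{
    public DateTime Date { get; set; }
    public int TemperatureC { get; set; }
    public string Summary { get; set; }
}

[ApiController]
[Route("[controller]")]
public class WeatherForecastController : ControllerBase
{
    private static readonly string[] Summaries =
        { "Freezing", "Chilly", "Mild", "Warm", "Scorching" };

    private readonly ILogger<WeatherForecastController> logger;

    public WeatherForecastController(ILogger<WeatherForecastController> logger) =>
        this.logger = logger;

    [HttpGet]
    public IEnumerable<WeatherForecast> Get()
    {
        var rng = new Random();
        // Note: Range(1, 5) produces five forecasts; keep an eye on that.
        return Enumerable.Range(1, 5).Select(index => new WeatherForecast
        {
            Date = DateTime.Now.AddDays(index),
            TemperatureC = rng.Next(-20, 55),
            Summary = Summaries[rng.Next(Summaries.Length)]
        }).ToArray();
    }
}

public class WeatherForecastControllerTests
{
    [Fact]
    public void GetReturnsExpectedNumberOfDays()
    {
        // The test stubs out the logger and expects seven forecasts.
        var controller = new WeatherForecastController(
            NullLogger<WeatherForecastController>.Instance);

        Assert.Equal(7, controller.Get().Count());
    }
}
```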
So how do we enable Azure Pipelines for this GitHub project? What you want to do is go to azure.microsoft.com/services/devops/pipelines, and here, depending on whether you already have an Azure DevOps account or not, you may want to click on Start free with Pipelines, or Sign in to Azure DevOps. In this case, let's assume we don't have an account yet and we're starting brand new: Start free with Pipelines. Now we're going to authenticate; in this case I'm going to use my Microsoft account. Here we're asked for a project name. A project is a place that's going to host both your pipelines and any other Azure DevOps artifacts that you want to use across your software development lifecycle. This project we're going to call hello-pipelines. You can choose whether to make it private, meaning that only you and the people you invite can see what's going on in this project, or public, meaning anybody can go ahead and see what's going on here. Since our repository is public, let's go ahead and make the project public too. I'll click Continue, and this is also going to create what they call an Azure DevOps organization, which is an uber-container for the potentially many projects that you can have in Azure DevOps. As you can see, an organization has been created for us with a generated name, and a project has been created: hello-pipelines.
Now we're presented with an interesting choice: we get to choose where to get the code from, and at the same time we're presented with the option of using either YAML-based pipelines or the classic editor to create the pipeline. YAML, by the way, originally stood for Yet Another Markup Language (these days the acronym officially stands for YAML Ain't Markup Language), and it's nothing more than a human-friendly data serialization standard for all programming languages. These days the recommended approach is to just go for the YAML-based pipeline, but why would you want to use it as opposed to the classic editor? The classic editor, which is kind of legacy at this point, lets you use a more UI-friendly approach: you drag and drop tasks and do a bunch of things visually in a designer to create your pipeline. But the main pitfall of that classic designer is that the pipeline definition itself is not checked in alongside your code. The main problem with this, which is not evident as you're starting out, is that months from now, when you want to go back and build some old code again with the same pipeline that you're using today, in many cases you just can't. Why? Because the pipeline has evolved separately from your code. In the past you may have had some other projects, or other binaries, or test code, or artifacts, that today are not there and that the pipeline is not honoring anymore. That is what makes the classic editor, and the pipelines created by it, not ideal for a long-term project. So overall I strongly recommend using the YAML-based pipeline. The other thing, of course, is that new Azure Pipelines features, like deployment jobs, cron-based scheduled jobs, and probably some other things, are only being introduced into YAML-based pipelines; they are just not available in the classic editor. So even if it takes a little bit more effort to learn YAML-based pipelines, I strongly recommend that you go for them.
Right now, at the time we are recording this, there is a feature we want to use that is not yet broadly available, so we have to enable it explicitly. To do that, I'm going to go here to my profile, click on Preview features, and turn on the one called multi-stage pipelines. All right. Now, where is my code? My code is in GitHub, so I'll click GitHub.
At this point you may be prompted to authenticate to GitHub; in my case it's not prompting me because I already did that and it's being remembered. So I'll click on hello-pipelines, and now we're taken to GitHub. Why? Because GitHub is asking us for permission to let Azure DevOps get access to the code. Azure DevOps pretty much wants to get notified any time some code is pushed to GitHub, and for that we need to install what's called the Azure Pipelines application into GitHub, which will be granted the permissions we see here. So we have to say yes, and I will authenticate here, once again with the Microsoft account. This sets up the connection between GitHub and Azure Pipelines, so Azure DevOps, from now on, has access to what's in your GitHub repository.
At this point we're presented with a bunch of options in terms of a template to initialize your YAML file. You can choose among a series of available templates depending on what kind of framework, build tool, or test tool you want to use; there's a bunch of templates for you. But in our case we'll keep it simple and go step by step, so we'll go for the starter pipeline.
Here we are: an initial, very simple pipeline has been generated for us, so let's start exploring what's going on here. I'm going to collapse this section here to have more space, and let's start looking at this. The first thing I'll recommend is to actually go to this link over here, aka.ms/yaml, which I have already opened right here. This page is super useful because it is the entire YAML schema reference: here you can tell exactly how to structure your YAML file and how the pipelines are defined by it, the conventions, the basics, and a bunch of samples, so that you can get to know how to actually build these pipelines. It also has descriptions of all the tasks that are available, plus a bunch of concepts and other things. Super useful page; you should keep it handy whenever you're dealing with a YAML-based pipeline. So now, back to here.
One more thing about the YAML file, by the way, is that it enforces what we call configuration as code, which is the very nice practice of storing your pipeline alongside the code in the repository. This is great because, from here on, you will know exactly what's going on with changes to the pipeline as people keep making changes to it and pushing them to the GitHub repository. Again, this would not be available with the classic pipeline editor. So, configuration as code: great stuff.
First thing here: the trigger. The trigger is what defines when this pipeline is going to get kicked off. What it's saying right now is that any time something is pushed or merged into the master branch, this pipeline has to get kicked off. You can change this: it could be any of the branches you have available in your repository, and there are also options to limit exactly which paths within your branch should trigger a pipeline run. There are other kinds of triggers available too. This one is what's called a CI trigger, but you could create a pull request (PR) trigger, where the pipeline kicks off whenever a new pull request is created in, let's say, GitHub; that's another way to run your pipeline. The other way is a scheduled pipeline, where you can say: every hour, go ahead and kick off the pipeline, or every night, or every morning, or once a week, stuff like that. That's also available.
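Here's a sketch of what those three kinds of triggers look like in YAML (the branch and path names are just examples):

```yaml
# CI trigger: run on pushes to master, but only for changes under src/
trigger:
  branches:
    include:
    - master
  paths:
    include:
    - src/*

# PR trigger: run whenever a pull request targets master
pr:
- master

# Scheduled trigger: run every night at midnight UTC (cron syntax)
schedules:
- cron: "0 0 * * *"
  displayName: Nightly build
  branches:
    include:
    - master
```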
Next is the pool. We talked about virtual machine pools, or agent pools, before, and here is where you define what kind of machine you want to use. By choosing a vmImage you're telling Azure Pipelines, first, that you actually want to use a Microsoft-hosted virtual machine, and second, in this case, by saying ubuntu-latest, that you want to use a Linux-based machine. It really depends on what you want to do: you could use ubuntu-latest, you could use windows-latest if you want a Windows virtual machine, or you could use macOS-latest if you want to build on a macOS device. You can also pick specific versions of these images; you don't have to use latest. Again, if you want to know exactly what's available, go back to that YAML schema reference page, where you will find all the built-in machine images available to you. Like I said, we're going to use a hosted image here; we're not going to be managing our own virtual machine.
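For instance, the pool section can look like any of these (image names as of the time of recording; check the schema reference for the current list):

```yaml
pool:
  vmImage: 'ubuntu-latest'    # Microsoft-hosted Linux agent

# or a Windows agent:
# pool:
#   vmImage: 'windows-latest'

# or macOS, e.g. for iOS builds:
# pool:
#   vmImage: 'macOS-latest'

# or a pinned version instead of latest:
# pool:
#   vmImage: 'ubuntu-16.04'
```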
Now, one thing that I like to recommend is to not run the pipeline directly on the virtual machine. Why? Because usually you don't know exactly what's going on in that build machine. These machines usually have tons and tons of tools, frameworks, compilers, test runners, artifacts, and all sorts of things installed on them, and for the very specific project you want continuous integration for, you may not need all those dozens of things, which could have unintended consequences within your pipeline. One thing you can do to avoid all that is to just use a container. By using a container you can say: you're going to run my pipeline specifically within this container that I am specifying. For instance, in this case we know that we're building and testing a set of .NET Core 3.0 projects, so what we can do, and let me pull this up, is go and find the .NET Core SDK Docker image, which is right here. I'm going to copy it, and I'm going to say: hey, when you run the pipeline, don't just run it directly on the virtual machine; first go ahead and pull the .NET Core 3.0 SDK container image, run it, and run my pipeline within that container. That makes sure that only the things you need for your pipeline will be used as it executes: in this case the .NET Core SDK is all we need, and we don't need all the other tooling that lives in the VM image. In fact, if I just went with the ubuntu-latest virtual machine directly, that image does not have the .NET Core 3.0 SDK, it has a previous version as of the time of this recording, so I would have to add an additional task to this pipeline to make sure I actually get the .NET Core 3.0 SDK. So, containers: great stuff. They may add some seconds to your pipeline, but it's totally worth it.
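In the YAML, that's a single container line next to the pool (the image is the official .NET Core 3.0 SDK image on the Microsoft Container Registry):

```yaml
pool:
  vmImage: 'ubuntu-latest'

# Every step below runs inside this container, not directly on the VM.
container: mcr.microsoft.com/dotnet/core/sdk:3.0
```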
Now, moving on to steps: this is where you actually declare the actions, or steps, that you want to execute. As an example, the starter pipeline is giving us a couple of script steps, but we're just going to remove them, and then what we want to do is add our own steps. There are two ways to add steps. The first way is to just type them, and we can use some IntelliSense here. For instance, the first step we want here is a task, and this task is going to be the .NET Core CLI task, DotNetCoreCLI@2, and we're going to need some inputs. What we want there is just one command, and that command is called build. The way the projects are set up, we just have to do dotnet build and it will go ahead and build all the projects; that's all we need to do, and that is the setup for this task. Now, you may say: well, I don't want to be typing all this stuff all the time, I have no idea what to put here. Again, keep in mind that you can always go back to the YAML schema reference, which has the definitions of all the tasks, and samples, and all these things, so you're not alone there.
But if you really don't want to type this stuff, there's this thing called the assistant on the right side. You just have to click on it, and it opens up a list of all the tasks available in Azure Pipelines; you just have to select the one you care about, and it brings up a bunch of options so that you don't have to type them, you can select them right here. In this case what we want is the test command, because we want to build the code and then test the code, right? And what's the path to the projects? Here we're just going to use a minimatch expression saying we want to scan all the directories in the source and find anything that contains "Tests" in the project name; anything that has "Tests" in the project name will be picked up across the whole repository. Let's also publish those test results and code coverage into Azure Pipelines. I click Add, and as you can see, that immediately adds the task right here. So you can either type it or pick it from here; it's not as fancy as the old designer, but it's a very handy tool, and it lets us build YAML pipelines quickly. So now the pipeline is pretty much ready to go.
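Putting it all together, the azure-pipelines.yml we end up with looks roughly like this (a sketch: the exact glob and inputs the assistant generates may differ slightly):

```yaml
trigger:
- master

pool:
  vmImage: 'ubuntu-latest'

container: mcr.microsoft.com/dotnet/core/sdk:3.0

steps:
- task: DotNetCoreCLI@2
  displayName: Build
  inputs:
    command: build

- task: DotNetCoreCLI@2
  displayName: Test
  inputs:
    command: test
    projects: '**/*Tests/*.csproj'   # pick up any project with "Tests" in its name
    publishTestResults: true
```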
What I'm going to do now is just hit Save and run. At this point you're prompted with the option of either committing this directly to the master branch, or creating a new branch for this commit; you can potentially even create a pull request if you want to get others' reviews and approvals on it. To keep things simple in this video, we'll just go ahead and commit directly to the master branch: Save and run. This has now created the pipeline, and again, remember that the pipeline's YAML is checked in to your repository, so it will live and move forward as your repository moves forward. Here we are on the pipeline monitoring page.
You're now looking at one specific run; it's telling you the duration, and it is right now in the Queued state... and it just changed to Running, so your pipeline is now running. If you want to know what exactly is going on with that pipeline, you can always just click on the job; this opens up this UI here, and we can walk through what's happening. First, it is of course pulling the Docker container image, the .NET Core 3.0 SDK, because, like we said, the whole pipeline is going to execute inside this container. Now it is checking out the code, pulling it from GitHub into the container, and next it will go ahead and build the code, just dotnet build. With that it's building the code... and I think something happened while building the code, so let's wait for it to finish. OK, the pipeline has finished, so let's scroll up a little bit and see what we can find. We actually have an error in the build step, in the weather forecast controller: cannot convert from 'method group' to 'int'. So there's something going on here.
The best thing we can do, I think, is to try to reproduce this locally and see what happens. So let's go back to the GitHub repository, get the clone URL, go to my box, and just do git clone. Then let's go into hello-pipelines and open VS Code to see what's going on. Here we are; let's close this welcome screen and look at what the error pointed to: the weather forecast controller file, line 34. Let's go there: Controllers, WeatherForecastController. Let it restore the packages, and let's look at line 34: indeed, there's something going on here. The problem is that we're trying to use a Count property, which does not really exist, because Summaries is an array, and arrays don't have a Count property. We could use the Count() method if we're using LINQ here, but it's probably more efficient to just use Length, which is an actual property the array has already computed. So let's do that.
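In other words, the change on line 34 looks roughly like this (shown as a sketch; the surrounding code is the controller's Get method):

```csharp
// Before: with System.Linq in scope, Summaries.Count resolves to the
// Count() extension method group, hence "cannot convert from
// 'method group' to 'int'" when passed to rng.Next(int).
// Summary = Summaries[rng.Next(Summaries.Count)]

// After: Length is a property the array already maintains,
// so it's both correct and cheaper than calling LINQ's Count().
Summary = Summaries[rng.Next(Summaries.Length)]
```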
This should fix it, but let's make sure it's actually fixed: let's run the build task, which is going to do dotnet build for both projects, and... yes, indeed, it succeeded. So let's commit this: "Use Length instead of Count in Get". All right, now let's open a new terminal and do git push origin master. This should fix the issue.
Let's go back to our pipelines, and as you can see, just by doing that git push, another build has kicked in. This is what we call continuous integration: any change made to our master branch is immediately exercised by Azure Pipelines, by the continuous integration pipeline. So we have the pipeline running now; let's see if we can get a successful run this time.
All right, the pipeline has completed, and indeed the job has failed. One thing I notice, besides the fact that it failed, is that zero tests passed. First of all, let's review what failed here: it's saying that the dotnet test step failed with an assertion failure, something failed in the test, expecting 7 but getting 5. If we go back to the run and click on the section where it reports the test results, it gives us an overall view of all the tests that failed in this run, and if you click on a failed test, it gives you a very nice view of what happened. Like we saw in the error before, there's a place where we're asserting that we expect 7, and we're getting 5, and that's the only test we have. So let's go back to the test and see what's going on.
Let's see our test over here. OK, so this test goes ahead and creates a controller, passing in a stub of the logger it needs, and it is expecting to receive 7 days of forecasts, but we are not getting 7 days. Let's look at the weather forecast controller... huh, it is generating a range of five days, not seven. That's the issue. So at this point, either the test is wrong or our implementation of the method is wrong. Let's assume that the test is actually right, and let's return the seven days that the test is expecting.
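Assuming the template-style Enumerable.Range call from the earlier sketch, the change is a one-liner:

```csharp
// Before: Enumerable.Range(1, 5) produced only five forecasts.
// After: produce the seven days the test expects.
return Enumerable.Range(1, 7).Select(index => new WeatherForecast
{
    Date = DateTime.Now.AddDays(index),
    TemperatureC = rng.Next(-20, 55),
    Summary = Summaries[rng.Next(Summaries.Length)]
}).ToArray();
```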
now happy with this so let's let's run it and run all the tests expand this a
little bit see what we get uh-huh one out of one test pass so this
fixes it so let's go back here and say
fix that you get to wait turn affected
number of days so yes back to the
terminal and let's just to get push origin master
and back to pipelines let's go here and again use my magic the pipeline gears
runs immediately so let's click here this will go ahead and run the pipeline
again and if you're lucky this time we'll get access full run
And indeed, this time the job succeeded, and we have a 100 percent pass rate. This means we're good: we can click there and see that all tests pass, with no failures. Everything's great, so the pipeline is ready.
Now, one more thing we may want to do, just to reflect the fact that we have a pipeline for our GitHub repository, is to add a status badge to the GitHub page, a badge we can show right on the repository's front page. To enable that, let's go back to the hello-pipelines pipeline page, click on the three dots, and click Status badge. You can copy the sample Markdown here; I'll just copy it, then go back to VS Code and open our README file (by the way, you should always have a README file; it's super useful for future readers of your repo), paste that Markdown, hit save, and commit it as "Add status badge".
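The sample Markdown we pasted looks roughly like this (organization, project, pipeline name, and definition id are placeholders for whatever your own Status badge dialog shows):

```markdown
[![Build Status](https://dev.azure.com/{organization}/{project}/_apis/build/status/{pipeline-name}?branchName=master)](https://dev.azure.com/{organization}/{project}/_build/latest?definitionId={id}&branchName=master)
```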
And let's push this. OK, now if we go to GitHub and refresh the page, we'll see the status badge right here. Anybody who comes to this repository and wants to know the status of this code will see it: in this case it says succeeded, and it would say failed if the last build had failed. And if you click on the badge, you'll see the status of the latest build associated with this repository. So there you go: continuous integration for your GitHub repository, enabled by Azure Pipelines. If this video was useful, please consider hitting the like button, and don't forget to hit subscribe and the notification bell to know right away when I publish new videos. Also, please leave a comment below with any thoughts about this video. Thanks for watching; see you next time!