hello everyone so today I'm going to show you how to build an API and make passive income from it as a developer
follow along as best you can and by the end of this tutorial you will have the knowledge of
how to build an API that you can make money off and sell and we will end with going through the steps of how to list
it on the rapid API Marketplace as a developer when you launch your API on
the rapid API platform you can essentially sell access to it for those that want to utilize what you have made
this access comes in tiers that you can set yourself allowing you to have full control over how you monetize what you
have built and as it is the largest hub for apis out there at this moment in
time the footfall will be in our favor meaning that you could take your API idea from a simple form of passive
income to a full-blown startup depending on how much time you want to dedicate to
it so what are we waiting for let's do it the only prerequisites I ask of you
before starting this video is have a basic understanding of JavaScript or if you're feeling adventurous please follow
along anyway there will not be a huge amount of code involved and you have my permission to take what I have built and
change it to fit your needs but first let's refresh ourselves on what an API
is API stands for application programming interface they allow for
Technologies to essentially talk with each other and are essential to so many services that we rely on today they are
behind most apps we use on a day-to-day basis as they can shape the information passed between one technology to another
and they can even connect things such as our microwaves or cars to the internet
apis are everywhere as a developer you might use TikTok's API to get a live
TikTok feed onto your website or even use them in a two-way stream to get post
or delete data from a movie system for example there's a reason why these words are popping up and that is because these
are the most popular HTTP request methods in fact I can use them to either
get data from this endpoint post new data to that endpoint edit data with the
put request to this particular endpoint or delete all the data at this endpoint if I wish the endpoint is
essentially an address that points to a specific chunk of the data that we are working with and this is exactly what we
will be building today we are going to be deciding what happens and what kind of data returns back to us if we visit a
certain endpoint that we construct ourselves if this is the first time you are seeing
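Before we build our own, here's the idea of those four methods in miniature. This is just a toy in-memory sketch, not a real server yet; the movie titles and the function name are made up for illustration:

```javascript
// A toy in-memory "movie system" showing what the four most common
// HTTP methods usually map to. In a real API these would be routed
// through a server; here they are plain function calls.
const movies = new Map([[1, { title: "An Inconvenient Truth" }]]);

function handleRequest(method, id, body) {
  switch (method) {
    case "GET":    return movies.get(id);             // read data
    case "POST":   movies.set(id, body); return body; // create new data
    case "PUT":    movies.set(id, body); return body; // replace existing data
    case "DELETE": return movies.delete(id);          // remove data
  }
}
```

So `handleRequest("GET", 1)` reads a movie, and `handleRequest("DELETE", 1)` removes it, which is exactly the shape of behavior we will attach to our own endpoints below.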
the word API or have not come across one before I do have a tutorial on them that
will go into much more detail than this refresher explainer that I invite you to try however if you feel comfortable or
would like to carry on let's get to setting up our project okay so I hope you're ready in this
tutorial I'm going to be building an API that tells us climate change news from all the various Publications all over
the world in one place and people can choose to purchase this information from us we can do this with any Publications
I'm going to choose to go on the topic of climate change but if you want to go on the topic of crypto that is
completely up to you now I did mention that we will be using the rapid API platform so please go ahead and sign up
I'm just going to sign up with Google and I'm just going to select my account that I want to associate with rapid API
and just wait for that to complete I'm just going to fill in my name Ania Kubow and my organization and click done
great so here are all the apis at our disposal as I said as you will see
there's lots and lots of apis however I want to create my own API so I'm just
going to go ahead and click here and then leave it at that for now it's time to start coding our API
so here we are I'm just going to start off by creating a blank project using
webstorm please feel free to use whatever code editor you wish and just create an empty directory so just like
this so we can start completely from scratch okay before we get going I also
just want to make sure that everyone watching has node.js installed on their machines node is an open source server
environment we will be using it to create our own server or in other words back end it is free and allows us to use
the JavaScript language in order to create our backend so I am a big fan now I am using a Mac so I would just
click right here in order to download this onto my machine however here are some other options for you for
installing the source code okay so just choose whichever one applies to you
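Once Node is installed you can sanity-check it from a script as well as from the terminal. A tiny sketch (the filename is up to you):

```javascript
// A quick sanity check you can drop in any .js file and run with `node`:
// process.version is the version string of the Node runtime itself.
const [major] = process.version.slice(1).split(".").map(Number);
console.log(`running Node ${process.version} (major version ${major})`);
```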
now I already have this installed so once that is done for you let's get up
our terminals and check that it has worked to check this has worked I am
simply going to type node -v V is for version this will show me the
version of node that I have installed on my computer this version is very important if you
are watching this sometime in the future and for some reason this tutorial is not working one of the reasons could be is
that the dependencies we are going to use and their versions might not be compatible with our node version this is
also the case when working with other projects not just this one so keep that in mind as you progress on your journey
to becoming a web developer okay great you can actually also easily
switch your node versions by installing the version that you need using NVM install and then the version so that is
now done now we are using node version 0.1032 so that is something you can do
if you need to use a different version for now let's go back to using the first
one that we have so I'm going to go nvm use and the version that we had is 14.17.6
and great so once again we are now using this version of node that's the one we have just downloaded now that we have
that we need to install one more thing if we are using Macs and that is Homebrew Homebrew is a free and open source
software package management system that we will be working with and installing packages with in the next section so we
need this in order to do that okay great let's carry on now it's time to get
coding the first thing we need to do is run npm init this will trigger
initialization and spin up a package Json file we are doing this so that we
can install packages or modules into our project to use if you want to have a
look at the thousands and thousands of packages at our disposal as a developer just visit
npmjs.com for example if we Google a popular package let's say react we can
see exactly what it does how to install it and how many times it is being
installed into projects all over the world okay so that is where you can find all
the packages now let's go back to our terminal as a general rule any project
that uses node.js will need to have a package Json file the package.json file does a lot more
than just hold our packages and the versions of them that we need so if you'd like to know more about it please
pause here and Google beginner's guide to using npm however if you're comfortable with that let's carry on so let's go
ahead and create our package Json file that we have been talking so much about using the command npm init making sure
that we are in the directory that we just created okay so in our project so
this is how we do it once again we have just written the command npm init and now we are prompted to answer these
questions so do we want to call our package name climate change API yes I'm
just going to click enter this is the first version that we are building the description for now I'm going to leave
blank the entry point is going to be an index.js file that we create the test command
one we're going to leave blank and I'm just going to leave all these blank for now and click enter
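For reference, the generated package.json will look roughly like this (the exact values depend on your answers, and note that npm requires package names to be lowercase with no spaces):

```json
{
  "name": "climate-change-api",
  "version": "1.0.0",
  "description": "",
  "main": "index.js",
  "scripts": {
    "test": "echo \"Error: no test specified\" && exit 1"
  },
  "author": "",
  "license": "ISC"
}
```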
so now if we look in our climate change API directory we can see a package Json
file has been created with all the keys that we are asked to fill out as you can
see climate change API has been generated version one we left the description blank we have said that our
entry point is going to be index.js file and we have one script the author we left blank we can give it an author name
I'm going to put Ania Kubow and that's it for now this is essentially where we're going to see all
the packages that we're going to need to work with to create our web scraper great so I'm just going to minimize that
and create an index.js file so once again just in the root of my climate
change API project I'm going to create a new file this is going to be a Javascript file I'm going to call it
index so now we have the point of entry for our app okay this is essentially our
server we're going to write our code to create our server in this file now the
first package that we are going to install to use so we're going to install it I'm going to show you the package
that we are going to use by just opening up npm right here the package that I
want to install is called Cheerio so here is the package that we are going to need Cheerio is a package that we will
be using to essentially pick out HTML elements on a web page it works by passing markup and provides an API for
traversing manipulating and resulting data structure Cheerios selector
implementation is nearly identical to jQuery so if you know jQuery this will be familiar to you
so now that we know what we will be using it for let's get to using it to pick out elements from a web page
specifically this webpage right here okay so this is the web page that I'm
going to be scraping and the things I want to scrape are the titles of the
Articles as well as the URLs so if we inspect this page and just
gravitate to here and then pick out for example
click here I would say I'd probably want the div with the class
item header and then looking at the H3 tag with this class name as well as the
URL that comes with it so I can build my climate change API that will give us information about climate change from
various sources all over the world okay great so hopefully that makes sense hopefully I've explained what we are
going to be doing and why we're using Cheerio to do it let's go ahead and install Cheerio so it tells me that to
install I need to use this command npm i i is for install and I'm just going to go
back into webstorm and get out my terminal and I'm just going to type npm i
I could type npm install that is totally up to you I'm going to install the package Cheerio just like it told us to
on npmjs.com and just click enter making sure that we are in the project making
sure we are in the climate change API directory okay that is pretty important
so that has now been installed and there we go we will see that a dependency has
shown up we have just installed Cheerio along with the version of it okay so
this is the version that we are working with for this tutorial if for whatever reason you are getting a different
version and that might be causing issues please go ahead and just replace that here and just
uh fill out the correct version and then once you do that just run npm i again
okay npm i will essentially install all the dependencies that you see in this
object right here and then generate a package-lock.json file from it so in
this package-lock.json file you will see that we have indeed gone to the npmjs
registry and installed Cheerio okay so here we go
we can see Cheerio has been installed for us great so once again a package lock Json
file has been generated because we ran npm install and that installed all
these dependencies and that is what generated the package-lock.json file
correct the next package that we are going to need to install is a package called express.js
so Express JS I'm just going to search that for you in here as well Express is
essentially a backend framework for node.js we are going to install it in
order to listen to paths and listen out on our port to make sure that everything
is working okay what I mean by this is that if we visit a certain path or URL
it will execute some code and it will listen out to the port we Define but
enough talking let me show you what I mean by this so once again I'm just going to copy this command and go back
to my project I'm just going to clear that using command K and run npm install Express
and just wait for that to install and then it should show up in our dependencies just right here
wonderful so again this is the version of Express I am using for this tutorial if for
whatever reason your code isn't working it could be down to the version that you installed
okay we have a few more packages installed the next package I want to install is a
package called axios and axios is a promise based HTTP client for the
browser and node.js axios makes it easy to essentially send HTTP requests to rest endpoints and
perform crud operations this means that we can use it to get post put and delete
data it is a very popular package and one that I use quite a lot on a day-to-day basis as a developer so once
again I'm just going to copy that I'm going to go in here and install it into my project so that it shows up in our
dependencies wonderful and those are the three packages that we need for this project
so hopefully you've got to this point hopefully you understand how to install packages or dependencies into your
project I feel now it's time to carry on and get to some actual coding
so I am just going to start by going into our index.js file
and defining the port we want to open up our server on this can be whatever you
wish I'm going to choose to run this on Port 8000 just like so and then I'm
going to write some code this is just some standard Syntax for listening out to the uh port to make sure it's running
but before we do that we actually need to initialize Express so first off let's
get Express so I'm going to use the package express again this is just standard syntax we need to get the
package I'm going to save it as the const express okay so this is just something that you will see being done
here as well in the setup const axios equals require axios so this is again
something that we will need to do so I'm going to copy that in order to use axios in our back end this is something that
we need to do so I'm going to put that here as well and again for Cheerio const
Cheerio equals require so that's all three of our packages done
now to initialize using Express well I'm going to show you how to do this
so what I am going to do is essentially get Express
and call it so whatever the package comes with I'm saving as Express and then I'm calling it so it releases all
this wonderful energy and all its packages and all the stuff that it comes with I'm Gonna Save this as app so we
can use it further on okay so once again this line calls the express function it
calls it and puts the new express application inside the app variable to
start a new express application so we have called this Express up here and we
are now calling it and saving it as app so that we can use it in the rest of our project as we please with all of the
Express's powers that it comes with things like app.use so app.use so
essentially this is from the Express package or app.get there's a lot that it
comes with essentially it comes with a lot of power great so now that we have that let's get to using Express and
using what we store is as which is app in order to get up Port up and running
so first off we listen out for the ports I'm going to use express to listen out
express listen out for Port and then I'm going to use a callback
and I'm just going to console log out in our back end server running on
port and then it'll show us our port okay
so this is what we need to write in order to just get a message to show us that everything is running fine on our
server however we also need to write a script so I'm just going to go back here
and under scripts I'm going to get rid of this test script actually so let's just get rid of that we don't really
need it I'm going to write start so I'm going to write a script for running the start command and I'm just going to use
nodemon index.js nodemon will essentially listen out for any changes on index.js note that nodemon is its own npm package so install it with npm i nodemon if you don't already have it so now
let's run our back end I'm just going to use npm run start
and there we go server is running on Port 8000. so our backend is working our
app is listening out for any changes made on port 8000. wonderful
so this is looking good we have now officially used Express and express
comes with this listen and we are using it to listen out to any changes on port 8000 or as we defined it we could
have defined it as whatever we wish we chose 8000. so this is now working let's carry on
now the first thing that I want to do is
let's just start scraping our first web page so what I'm going to do is write a
path so I'm gonna do some routing once again I'm going to use express I'm going to use app.get just like so and then I'm
going to pass through a path so if for example I just passed through
the home page like that that is the home page path uh and then this is the Syntax for
routing
and then res Json and then let's just write something welcome
to my climate change
news API and click save remember nodemon is
listening out for any changes okay so nodemon is restarting due
to changes and starting on index.js so now if we visit port 8000
so I'm just going to get rid of this localhost
8000. welcome to my climate change news API so that is working so what we have done
here is listen out to any time we visit the home page and then if we visit this
we get this response this JSON response welcome to my climate change news API
okay we've passed through a request we've passed a response this is just a
helper that um webstorm is giving us so I've not typed this out that is a helper as is this is telling us that this is a
request and this is a response I can change this I can make this Ania is great and that just means let's save
that that just means that now if I visit the home page there's nothing there and
if I visit Ania is great if I spell it correctly
welcome to my climate change news API so that is working I'm just going to
head right back to the homepage hopefully you see now how that works great but this isn't our API we need to
actually get some you know interesting data coming back and I want to scrape the internet to get the data from
certain news articles coming back to me so let's go ahead and do that
so to do this I'm going to keep that as is and once again I'm going to try app get so exactly the same syntax
as above let's pass through a request and response just like we did before and
this time let's say if we visit news well then I'm going to use axios
get and I want to essentially visit this URL okay I want to visit this URL
so let's grab that and let's just paste it I want to visit this URL and I also want
to wait for that to return because this is essentially going to return a promise
it's returning something back to us so once that return comes back I'm going to do some chaining uh if you don't know
much about async JavaScript I do have a course on this that I really recommend it's a five part Series so we're gonna
do some chaining with this so we're visiting this URL and then the response
that comes back to us well I want to save this response so const let's save it
as HTML and I'm just going to save it as response
data okay just like so so now if we
console log HTML and I'm just going to visit
this path be sure to save that page and let's
visit news and now let's go back to here and see
what comes back to us whoops I wrote something that's not meant to
be here so just delete that and save and visit this again enter
okay so now you will see the HTML of this website coming back to us okay so
this is essentially everything from the guardian websites page that we visited this website page to be exact and it's
coming back to us so that's all I've done however this is great but we need
to carry on we actually need to pick out the elements so I'm going to show you how to do that and to do that we're
going to use cheerio and actually we can just get rid of this and Cheerio has
some commands that will help us do this load comes with the package Cheerio and I'm going to pass through the HTML the
response data saved to HTML and I'm passing that through into Cheerio and let's save this as the dollar sign so
now essentially that's allowed us or you will see this is going to allow us to pick out elements
so this is the Syntax for using the Cheerio package we need to get the
dollar sign up or essentially what we have defined here is dollar sign we're now going to use so I'm going to use
that like so and then I am going to look for any a
tags that contain
anything to do with climate so what this means just make
sure that this is contains so what this is doing is looking on this
webpage right here and then finding any elements that have the a tag so for
example this one right here and if it contains anything to do with climate
change for example this one doesn't so for example this a tag consists of
the word climate so that should be picked out so that's exactly what I am
doing I'm going through all the HTML and using this syntax looking for a tags
that contain the word climate Okay and then we just pass through the HTML and
then for each one that comes back I'm assuming there's more than one uh I'm
just gonna write function so I'm going to pass through a function
so essentially a callback function and whatever so for each one that comes back
to us I'm going to grab its text because essentially once again I want to grab
the text that's in the a tag okay I want to grab that
so let's save this as something I'm going to save this as the title so
I'm going to go into this so for each of them and get the text
and I'm going to save this as a title so we have decided this is going to be the title
of the data that comes back to us and next we need the URL so what do we want the URL to be well once again whatever
comes back to us whatever we are saving this I want to grab the href of it so if we
look in here here's the href so again for every a tag that we find I want to grab the href attribute so I would do so
by essentially getting the attribute like so attr href
okay great so that's what we're doing and then let's push it into its own array so
this is some JavaScript work articles is what I'm going to call my array and I'm
just going to save it we can put it up here actually so I'm just going to put it outside of this up get
so const articles so it's global we're going to get the articles I'm gonna push
a new object that we're going to create and the object that I'm going to push in is going to have the title
the URL and then well let's just see what this looks like so I'm just going to display the
articles in the actual browser using res Json so we're pushing this object into
the Articles array and we're going to display the articles in the browser when we visit forward slash news and then
let's also catch errors so again there's some chaining if you know asynchronous JavaScript then this will make sense to
you if not once again I do have a tutorial on this a five part series on asynchronous JavaScript so that is how
we just catch any errors okay so now let's visit
forward slash news again and there we go so we are getting our
data back if it looks like this it's because we need a JSON viewer really so
if you don't have the Chrome extension Json View
here it is essentially just makes everything a lot more readable I'm just going to add that to Chrome
like so okay so that is being added and once it
is finished then the icon will be visible and there we go so that is now a lot more
readable for us this is looking great so every time that we found an a tag that
contained the word climate we essentially created this object we picked out the title of whatever was in
that a tag and we picked out the href of it so now I have an array full of
titles and URLs from the guardian page that we scraped okay so how cool is
that we've officially created our first scraping tool but I really want to make
this a lot meatier I really want to give this a lot more value to anyone who wants to purchase my climate change news
API so I'm going to scrape from a lot of different websites and I'm going to show you how so this is part one we've
essentially learned how to scrape a website to retrieve back an array full of anything that contains news about the
climate crisis or the word climate the title and URL but now I'm going to Loop
that in with a lot of others so let's do it
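To recap the selection logic in isolation first: this toy stand-in does with a regex what Cheerio does properly. Regex parsing is brittle on real HTML, so treat this purely as an illustration of the logic we just built:

```javascript
// Toy stand-in for the Cheerio step: pull out <a href="...">text</a>
// pairs whose text mentions "climate" from a string of HTML.
function extractClimateLinks(html) {
  const articles = [];
  const anchor = /<a[^>]*href="([^"]*)"[^>]*>([^<]*)<\/a>/g;
  let match;
  while ((match = anchor.exec(html)) !== null) {
    const [, url, text] = match;
    if (/climate/i.test(text)) {
      // same shape as the objects we push into our articles array
      articles.push({ title: text.trim(), url });
    }
  }
  return articles;
}
```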
so first off I'm actually going to create an array of newspapers that I
want to scrape so we've got our articles here actually above here I'm going to store the newspapers so const
newspapers and I'm just going to make an array and I'm going to actually put through an array of objects and you'll
see why I'm not just going to pass through the URLs I'm going to pass through the name of the publication so
these are some that I found before let's go with the times
the address that we want to scrape and this is going to be this URL right here and I'm just going
to paste a few more that I have so we already have the guardian let's also have the telegraph so the
times guardian and Telegraph all on specific climate change pages so there's a lot more data for us so we can start
with three let's do three and then we'll add a bunch more later so here are the three newspapers that we want to scrape
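The newspapers array might look something like this, with the climate-section URLs as placeholders; swap in whichever publication pages you actually want to scrape:

```javascript
// Three sources to start with: each entry pairs a short name with the
// address of that paper's climate section. The exact URLs here are
// examples, not guaranteed to be current.
const newspapers = [
  {
    name: "thetimes",
    address: "https://www.thetimes.co.uk/environment/climate-change",
  },
  {
    name: "guardian",
    address: "https://www.theguardian.com/environment/climate-crisis",
  },
  {
    name: "telegraph",
    address: "https://www.telegraph.co.uk/climate-change",
  },
];
```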
now I'm going to essentially edit my um
function right here to Loop over all three Publications so I'm going to show
you how to do this now this time I am going to actually
write this function outside of this so I'm going to actually just
use this as a template I'm going to do this outside here
so maybe let's do it up here actually so this time for each of the newspapers so
every newspaper in my newspaper array newspapers I think that's what we called
it so for each one of these for each newspaper this is why I said JavaScript
was a good prerequisite because if you don't know JavaScript this can be extremely confusing but you know if you
just want to use this code and just want to take it then please just listen to me talking through it for you anyway so for
each newspaper so for each item in my newspaper array there's three items in
there one two three I want to so I want to get axios get so we're just
using what we did before this is a good refresher and then I need to pass through the URL so I'm going to pass
through the newspaper address newspaper singular okay because for
every item for every newspaper in our newspaper array we can call this whatever we wish we can call this dog it
doesn't matter we are just saying that for each item in our newspaper array well I want to get the address
making sure to spell address the same way that we did above so I'm passing through the URL so just like we did here
and passing through the URL however just by looping over the array of newspapers
and then so just like we did before then we get the
response okay and once again we need to get the
response data save that as HTML and then pass it through into Cheerio load and save it as the dollar sign so that's the
same let's carry on now once again because I have looked at all these
different newspapers and I've actually found newspapers that work to this style and I've sort of adjusted this so that
anytime you find an a tag it just so happened that each a tag had some text in it and it also happened to be that
the a tag had the href but that's expected from an a tag so I can actually reuse this
so once again for each of the three here I'm finding an a tag that contains the
word climate let's just make sure to close this off okay
and once again what I am doing is looking at whatever comes back okay
whatever comes back uh so whatever comes back is this and I'm getting the text
from that a tag and then I'm going to save this as title
and once again I'm looking at each of the a tags so this is the a tag and I'm
going in and I'm getting the attribute href and whatever this value is I'm
going to save that as URL equals okay and once we have that I'm going to push it into articles articles push and I'm
going to make a new object which has a title and a URL but this time also a
publication okay so this time I'm actually going to have the source
newspaper address okay so whatever we pass through
whatever newspaper we pass through this time I'm going to get the newspaper address oh actually we should probably
have the newspaper name instead that would make more sense let's have the newspaper name show up
as the source okay so now
instead of having all this I'm just going to delete all that I'm going to
just return the Articles because this function will run and we're collecting all the Articles so what I want to
appear here in the Json and the browser so once again we use our Json is the
articles so now if we visit forward slash news
we get articles from the times we get the source URL the times the title
this is looking great we also get the guardian articles so we scrape the
guardian and we also get Telegraph
okay however this does not look like the correct URL let's just double check why okay so
we are going we are visiting this URL and we are scraping it so once again
let's go here and once again I have decided that I want to search for any a tags in
here
to for example this will probably be an a tag so yes we are an a tag that has a
URL so this URL is incomplete it seems that it doesn't have a base and then I'm
looking inside it for some text so anything inside this parent tag of a I'm looking for some text and there is
our title so this is working fine now we need to make an adjustment as this isn't the full URL because if we click on here
it will not take us to anything so our API is broken people would not like that
so all I'm going to do is go back here and pass through a base
so this doesn't need a base so I'm going to have that empty this
also doesn't need a base the telegraph however needs a base and I'm just going to pass through the base
which is essentially https://www.telegraph
.co.uk okay so now that we have that base it just means that the source we
pass through so the URL well the URL is not going to be just the URL it's going to be the
newspaper base if that exists plus the URL
okay so we're grabbing the URL but we're appending the newspaper base in front of
it so that is looking good let's try that out again let's refresh this and now if
we visit the telegraph section you will see we have created a new URL one that
includes https://www.telegraph.co.uk
wonderful so there we have it we have completed step two we have essentially got three
newspapers and scraped all the information all the articles about climate change from there along with
their titles and the URLs if we want to visit those articles wonderful
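The base fix boils down to a one-line helper like this (a sketch; in the project the concatenation happens inline when we push each article):

```javascript
// Some papers (the telegraph here) link with relative hrefs, so we
// prepend a per-newspaper base before storing the URL. base defaults
// to an empty string for sites whose hrefs are already absolute.
function toAbsoluteUrl(href, base = "") {
  return base + href;
}
```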
now this is looking good however I do want to make another route and that is to get information just from one
newspaper article so I'm going to show you how to do that just under here
so once again we're going to use app get just like so and this time the URL I
want to pass through so I'm going to visit sorry it's going to be news forward slash and if we want to visit a
particular newspaper ID this is the syntax so uh bear with
me because I think this is best explained by showing so you need this colon and then whatever you pass
through I want it to return something so I'm just going to pass through request
response this time making sure to put async right in front of it and then I want to grab whatever
newspaper ID I pass in front of news so I can do so if I console log
request okay and now let's just go ahead and visit forward slash news and Ania is
great once again and click enter nothing will show up here but if we
visit our console here you will see a request
everything that comes back in the request has been essentially console logged for
us and in the params you will see that the newspaper ID is Ania is great okay
so now this is this whole thing right here all this text is the request if I
go dot params that will get this and if I go newspaper ID
because that is what we've called it up here I could call this whatever I wish I could call it dog and in that case it would
come back as dog equals anya is great so now if I console log that and
once again let's visit something else Anya is awesome
and scroll down here and there we go anya is awesome okay so hopefully that makes
sense whatever we pass through this colon just means that it's an ID that we are passing through an identifier it
could be whatever you wish you could call it XXX but essentially it's going to be saved under params when we visit the
page so now that we have that I'm going to use this to my advantage I'm actually going to save this as something so req
dot params dot newspaper ID
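what Express does with that colon placeholder can be sketched as plain matching logic — this is a simplified illustration of the idea, not Express's actual router implementation:

```javascript
// simplified sketch of how a pattern like "/news/:newspaperId" turns a
// visited path into req.params — an illustration, not Express's real code
function extractParams(pattern, path) {
  const patternParts = pattern.split('/');
  const pathParts = path.split('/');
  if (patternParts.length !== pathParts.length) return null;
  const params = {};
  for (let i = 0; i < patternParts.length; i++) {
    if (patternParts[i].startsWith(':')) {
      // a ":name" segment captures whatever the visitor typed there
      params[patternParts[i].slice(1)] = pathParts[i];
    } else if (patternParts[i] !== pathParts[i]) {
      return null; // a literal segment did not match
    }
  }
  return params;
}

console.log(extractParams('/news/:newspaperId', '/news/guardian'));
// → { newspaperId: 'guardian' }
```

so visiting /news/guardian fills req.params.newspaperId with the string guardian exactly as the console log showed.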
and let's save this as the newspaper that we want to visit so I'm just going
to call it newspaper ID
so if we visit let's say forward slash news forward slash the times I just want
information from the times okay that's all I want so I'm just gonna go axios
get and once again pass through the URL well we know that we want everything we
want the URL from the times so we're going to use this array and what I'm
going to do is use some JavaScript so I'm going to get the array and I'm going to filter the array to find the
newspaper okay so once again for each newspaper so each item each of the three
items in my newspaper array I'm going to go through each one and if the newspaper's name
equals the newspaper
ID okay you don't have to make it strict so that
is going to come back so if I pass through News 4 slash the times it will come back and if that matches the times
Well I want to get the address so to get this back well let's save this
as something let's save this as const newspaper
just gonna comment this out for now and if I console log newspaper I expect that if I visit
forward slash the times oops making sure to spell
exactly the same okay so I'm visiting that nothing will happen here at the moment but if we visit here we get back the object from
our array we get by the whole object and we go into this object we want to go into the array so here's the array I
want to go into the array because there's only ever going to be one item hopefully I'm just going to go into the first item and grab the
address so I'm going into this item and grabbing the address and let's save this now as newspaper address
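the lookup just described can be seen in isolation — the entries below are placeholder data standing in for the names, addresses and bases defined earlier in the tutorial:

```javascript
// placeholder data standing in for the newspapers array built earlier
const newspapers = [
  { name: 'thetimes', address: 'https://www.thetimes.co.uk/environment/climate-change', base: '' },
  { name: 'guardian', address: 'https://www.theguardian.com/environment/climate-crisis', base: '' },
  { name: 'telegraph', address: 'https://www.telegraph.co.uk/climate-change', base: 'https://www.telegraph.co.uk' },
];

const newspaperId = 'guardian'; // would come from req.params.newspaperId

// filter returns an array of every match; names are unique, so there is
// only ever one hit and we take the first item and grab its address
const newspaperAddress = newspapers
  .filter(newspaper => newspaper.name === newspaperId)[0]
  .address;

console.log(newspaperAddress);
```

filter keeps the whole matching object, which is why the route then indexes into the array and reads the address property off it.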
okay so we have now hopefully got our newspaper address let's see if that works on another newspaper so let's now
visit guardian
and let's go visit here and great we are getting back the URL of the Guardian if
we pass through the ID of guardian wonderful so now that we have that let's carry on
so we can use axios get to pass through the URL so the newspaper address of
whatever path we visit based on the identifier and then let's do some chaining so once that comes back with a
response we're going to get the response
and once again we need to get the response data and let's save this as the
const HTML so this is going to be Cheerio at work because we're going to get cheerio
load making sure to spell Cheerio correctly like the package and pass through HTML and now let's save
this as the dollar sign so we can use it and once again
I'm actually going to collect all the Articles here again you can call this whatever you wish and that might be
confusing maybe we can go specific articles specific articles just to differentiate from the
other articles array that we have globally so let's say there's a specific articles and empty array and this time
once again we are going to look for any a tag that contains
the word climate on the page that we are visiting
and then HTML we need to pass that through and for each item that comes back to us so for each a tag containing
the word climate that comes back to us we'll run a callback function
making sure that these two are the same okay so there we go
contains a climate and making sure this is a dollar sign
I'm gonna get its text
and save this as the title so const title
I'm also going to get the URL by going into
this and getting the attribute of href
and now we need to push we need to create an object and push it into specific articles so I'm going to use
push create an object which again well I'm going to pass through a title
uh the URL the URL is going to also have a base
I think we need to get a base so if there is a base to be added this is how
you would do it we would go into our newspapers array again and filter
by newspaper so for each of the three items in our newspaper array if the
newspaper name matches the newspaper ID so just like we did before we're going
to go into the object and get its base and what shall we save this as let's save
it as newspaper base just like so
okay so if a newspaper base exists then we're going to add it to the URL so right before the URL and once again the
source well we know that this is just the newspaper ID
great so this is looking good and then of
course we want to display this in the browser so I'm going to use resjson to
display the specific articles okay and once again we're just
going to catch any errors so this is the syntax for doing so console log err
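the .then/.catch chaining pattern used in this route can be seen on its own with a stub standing in for axios.get — in the real route the response body goes to cheerio.load instead of a string check:

```javascript
// stub standing in for axios.get so the chaining pattern is runnable by
// itself; the real route hands response.data to cheerio.load for scraping
function fakeAxiosGet(url) {
  return Promise.resolve({ data: '<a href="/climate-story">climate warning</a>' });
}

fakeAxiosGet('https://example.com')
  .then(response => {
    const html = response.data; // response.data holds the page HTML
    console.log(html.includes('climate')); // true
  })
  .catch(err => console.log(err)); // any request error lands here
```

the important shape is that the scraping work lives inside .then so it only runs once the page has actually come back, and any failure falls through to .catch instead of crashing the server.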
and now visit this page we will get all the news articles from
the guardian and then if we go the times making sure to spell it exactly as we
did in here then we get all the newspaper articles
from the times and if we just go back to all the news articles you get all the news articles and there we go so now
we've done it we've created an API we've figured out a way to get all the newspaper articles here and then if we
go forward slash and then use the ID to get specific articles which we will be explaining on the documentation when we
build our API we get just specific articles great before we move on to put
this on rapid API I'm actually going to add a lot more newspapers in here
so I'm just going to actually just paste a few that I made earlier
okay so now we have a lot more data coming back to us once again I'm just
going to visit here so news and ta-da it's getting lots and lots and
lots more data up to 730 lines of data to be exact
wonderful okay so this is looking good let's carry on
before we move on I'm just going to format this a little bit better just so it looks a lot neater
I'm going to do selected text optimize Imports rearrange code and run and
great okay ah we don't actually need this it turns out so let's get rid of this
and now to prep for deploying onto Heroku well I need to include another package in the project as
right now we have it installed globally on our machines but Heroku doesn't know that so there we go npm install nodemon
and finally I need to change the port options for Heroku so give it an option like this
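the port option looks roughly like this — 8000 is an assumed local default, use whatever port the project listened on before:

```javascript
// Heroku assigns the port at runtime through the PORT environment
// variable, so read it if present and fall back to a local default
// (8000 here is an assumption, not necessarily the tutorial's value)
const PORT = process.env.PORT || 8000;

// with Express this feeds straight into the listen call:
// app.listen(PORT, () => console.log(`server running on port ${PORT}`));
console.log(`server would listen on port ${PORT}`);
```

without this fallback Heroku kills the dyno because the app never binds to the port it was assigned.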
and great so this is looking wonderful again uh you will be able to get this
from my source code I will share it please feel free to use it for educational purposes or to build your
own API okay so now let's go on to the rapid API platform okay so here we are
back on our rapid API dashboard let's actually call our API something I'm
going to call it climate change live let's call it that and then
an API showing all
the latest climate change news
around the world as we do have Australian articles in there and some
American ones so it's quite worldly category I'm gonna say data Maybe or
news let's go with news I think that's probably most appropriate and then the owner of its API will be me that is
correct and I'm just going to keep it as UI and add the API just like so okay so
here we are we need to add a base URL well for this I'm actually going to deploy my app onto Heroku this should be
relatively painless so all I'm going to do is ask you to head over to Heroku and
you just see all the projects I've hosted on this so far and I'm just going to go ahead and create a new one give it
a name let's go at climate change API and choose the region and just click
create app so there we go now we should have a lot of information on how to
deploy this using Heroku okay so if you haven't downloaded the Heroku command
line interface please go ahead and do so now I'm just going to do that with you
here so here we go I'm going to copy this command go into my terminal
let's create a new tab and run brew tap heroku/brew and brew install heroku now
if you are an Apple user like me you're gonna also have to install Homebrew so please go ahead and do this simply by
copying this command here and pasting it in your terminal I already have this installed so I'm ready to continue so
this is only for Apple users if you have another machine please do use the equivalent
okay so that should be fine that is installing and that should be ready to
go so we have the command line interface installed let's carry on with the
instructions given to us so once that has finished downloading I'm just going to run heroku login and then just
press any key to open it up so this is just to make sure that we are logged in so I'm just going to click log in
here just like so and we are now logged in we can close this page and continue
with doing this process so now we're going to run git init
and that has now initialized an empty git repository in my project okay so
making sure that we are in the climate change API project now let's carry on the next command I need to do is take this
command right here and just paste it like so so we are just going through all the commands given to us
great so now before we move on I'm going to actually add a git ignore file so
that we don't upload any node modules before we start adding files so I'm just going to go ahead and write git ignore
and what I am saying is that I want to ignore the node modules from being committed and just save that okay so now
we can carry on I need to add the files
the next thing we need to do is commit the files with a message let's say final commit and
push it to master so just like so
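collected in one place the deploy commands from this section look roughly like this — the app name is an assumption, and on newer git defaults the branch may be main rather than master:

```shell
# one-time setup
heroku login
git init
heroku git:remote -a climate-change-api   # app name is an assumption

# keep dependencies out of the repo
echo "node_modules/" > .gitignore

# commit and deploy
git add .
git commit -m "final commit"
git push heroku master   # or: git push heroku main
```

after the push succeeds Heroku builds the app and serves it at the URL shown in the dashboard.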
and there we go the build has succeeded so let's go ahead and check it out
okay great and there we have it we have deployed our site it is now live all our
data is now live just right here the next thing I'm going to do is go back to Rapid API and continue
so now that we have done that I'm just going to upload an image that I want to
represent my API and in here on the website I'm going to use the URL that we
have just created so we have deployed this URL this is where our app is going to sit I'm going to use that right here
and next I'm just going to go ahead and type that in
the next thing we're going to do is add some endpoints so what shall we call the
first endpoint that we are going to create well I'm going to create my first REST endpoint so just go ahead and
click here and let's call this get all climate change
news and we can give a description saying
that this endpoint will return back all news about
climate change from all over the world
now with the end point here we're just going to specify what we did before so to get all the news articles we had
forward slash news and this is indeed a get request we're going to get that data so that is all we have to do we don't
have any parameters we need to worry about for this occasion and then we have all these options to us we are using
node.js and we are using axios as it has picked out so this is all we really need
to do for now but however if you do want to see the others please go ahead and do that
okay so for this one I'm actually going to start off with the newspaper ID so
I'm just going to go ahead and go forward slash news and then we can't use colon newspaper ID that will not work as
you can see we need to put this in curly braces just like it is prompting us to
do so that's what I'm going to do there we go newspaper ID is already picked out that this is my parameter so that is
looking good and let's go ahead and put an example value so we know that one of them is
Guardian so I'm just going to put in guardian and you can see that has populated right here
and go ahead and save that so we've got that and we can see some
example responses showing up we also have mock responses too if we wish so if
you wish to fill this out please go ahead and do so here so I can actually do this so I'm just going to go ahead so
this is essentially all really useful when people visit rapid API and see our API so they can see what kind of
responses to expect when they visit a certain endpoint it's really useful and it just allows people to have an idea of
what your API can provide and what kind of value it gives and of course we have the plans and
pricing now here are some public and private plans this is sort of what it
looks like and the basic one is free so for example I can say that we have
unlimited requests or we can change the monthly requests so on the free option
to be a thousand per month you can have a Pro Plan so let's go ahead and add a
Pro Plan and once again this is going to be for all endpoints and once again you
can change the quota type and quota limit and then have overages as well and
then let's say that we charge 0.1 extra for overages okay so you can do whatever
you like let's just go ahead and save that so this is a little bit higher it's not super high I'm not going to charge
anyone like an extravagant amount for this but these are the options that you have
okay so there we go this is looking good let's go ahead and
publish this so I'm going to make API visibility public and there we go here
is my API live on rapid API you can see the two endpoints that we have so we can
get all the news and we can also get the news based by newspaper ID or a
newspaper we of course have to give people the newspaper IDs available and that is all done in my API documentation
that has been actually generated for us nicely by rapid API so here you can see
all the example responses and you also have the drop down that we saw earlier so if you want to make this by node.js
it gives you the code available to you so you can just copy this and put it in your project and there's a bunch of
others too so whatever you feel more comfortable with that is available as an option okay so hopefully you've enjoyed
this tutorial hope you now have your own apis that you can sell and pass on to people pass on to friends to make some
money out of it again please feel free to charge as much as you feel comfortable with for now that's it from me
thanks very much all the code and source code will be available in the description as well as links to Rapid
API so please go ahead and do check it out