
Tuesday, November 11, 2025

Build and sell your own API $$$ (super simple!)

 

0:00

hello everyone so today I'm going to show you how to build an API and make passive income from it as a developer

0:07

I'm going to show you how to do this step by step by making a node.js application that uses Express as well as

0:13

the package axios and Cheerio with the beginner approach making it as accessible to as many people to follow

0:20

along as possible by the end of this tutorial you will have the knowledge of

0:25

how to build an API that you can make money off and sell and we will end with going through the steps of how to list

0:31

it on the rapid API Marketplace as a developer when you launch your API on

0:37

the rapid API platform you can essentially sell access to it for those that want to utilize what you have made

0:44

this access comes in tiers that you can set yourself allowing you to have full control over how you monetize what you

0:51

have built and as it is the largest hub for apis out there at this moment in

0:57

time the footfall will be in our favor meaning that you could take your API idea from a simple form of passive

1:04

income to a full-blown startup depending on how much time you want to dedicate to

1:09

it so what are we waiting for let's do it the only prerequisites I ask of you

1:15

before starting this video is to have a basic understanding of JavaScript or if you're feeling adventurous please follow

1:21

along anyway there will not be a huge amount of code involved and you have my permission to take what I have built and

1:29

change it to fit your needs but first let's refresh ourselves on what an API

1:34

is and API stands for application programming interface they allow for

1:41

Technologies to essentially talk with each other and are essential to so many services that we rely on today they are

1:48

behind most apps we use on a day-to-day basis as they can shape the information passed between one technology to another

1:54

and they can even connect things such as our microwaves or cars to the internet

2:01

apis are everywhere as a developer you might use TikTok's API to get a live

2:08

TikTok feed onto your website or even use them in a two-way stream to get post

2:14

or delete data from a movie system for example there's a reason why these words are popping up and that is because these

2:21

are the most popular HTTP request methods in fact I can use them to either

2:27

get data from this endpoint post new data to that endpoint edit data with the

2:34

put request to this particular endpoint or delete all the data at this endpoint if I wish the endpoint is

2:42

essentially an address that points to a specific chunk of the data that we are working with and this is exactly what we

2:50

will be building today we are going to be deciding what happens and what kind of data returns back to us if we visit a

2:58

certain endpoint that we construct ourselves if this is the first time you are seeing

3:03

the word API or have not come across one before I do have a tutorial on them that

3:09

will go into much more detail than this refresher explainer that I invite you to try however if you feel comfortable or

3:17

would like to carry on let's get to setting up our project okay so I hope you're ready in this

3:24

tutorial I'm going to be building an API that tells us climate change news from all the various Publications all over

3:31

the world in one place and people can choose to purchase this information from us we can do this with any Publications

3:39

I'm going to choose to go on the topic of climate change but if you want to go on the topic of crypto that is

3:44

completely up to you now I did mention that we will be using the rapid API platform so please go ahead and sign up

3:51

I'm just going to sign up with Google and I'm just going to select my account that I want to associate with rapid API

3:58

and just wait for that to complete I'm just going to fill in my name Ania Kubow and my organization and click done

4:07

great so here are all the apis at our disposal as I said as you will see

4:13

there's lots and lots of apis however I want to create my own API so I'm just

4:18

going to go ahead and click here and then leave it at this point for now it's time to start coding our API

4:26

so here we are I'm just going to start off by creating a blank project using

4:32

webstorm please feel free to use whatever code editor you wish and just create an empty directory so just like

4:39

this so we can start completely from scratch okay before we get going I also

4:46

just want to make sure that everyone watching has node.js installed on their machines node is an open source server

4:54

environment we will be using it to create our own server or in other words back end it is free and allows us to use

5:01

the JavaScript language in order to create our backend so I am a big fan now I am using a Mac so I would just

5:09

click right here in order to download this onto my machine however here are some other options for you for

5:16

installing the source code okay so just choose whichever one applies to you

5:21

now I already have this installed so once that is done for you let's get up

5:27

our terminals and check that it has worked to check this has worked I am

5:32

simply going to type node -v the V is for version this will show me the

5:37

version of node that I have installed on my computer this version is very important if you

5:43

are watching this sometime in the future and for some reason this tutorial is not working one of the reasons could be is

5:51

that the dependencies we are going to use and their versions might not be compatible with our node version this is

5:58

also the case when working with other projects not just this one so keep that in mind as you progress on your journey

6:04

to becoming a web developer okay great you can actually also easily

6:11

switch your node versions by installing the version that you need using NVM install and then the version so that is

6:19

now done now we are using the newly installed node version so that is something you can do

6:26

if you need to use a different version for now let's go back to using the first

6:31

one that we have so I'm going to go nvm use and the version that we had is 14.17.6
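The version juggling described here looks like this in the terminal (a sketch assuming nvm, the Node Version Manager, is installed on your machine):

```shell
node -v               # print the currently active node version, e.g. v14.17.6
nvm install 14.17.6   # install a specific node version
nvm use 14.17.6       # switch the current shell to that version
```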

6:39

and great so once again we are now using this version of node that's the one we have just downloaded now that we have

6:47

that we need to install one more thing if we are using Macs and that is Homebrew Homebrew is a free and open source

6:54

software package management system that we will be working with and installing packages with in the next section so we

7:01

need this in order to do that okay great let's carry on now it's time to get

7:07

coding the first thing we need to do is run npm init this will trigger

7:13

initialization and spin up a package Json file we are doing this so that we

7:19

can install packages or modules into our project to use if you want to have a

7:24

look at the thousands and thousands of packages at our disposal as a developer just visit

7:30

npmjs.com for example if we Google a popular package let's say react we can

7:37

see exactly what it does how to install it and how many times it is being

7:42

installed into projects all over the world okay so that is where you can find all

7:48

the packages now let's go back to our terminal as a general rule any project

7:53

that uses node.js will need to have a package Json file the package.json file does a lot more

8:00

than just hold our packages and the versions of them that we need so if you'd like to know more about it please

8:06

pause here and Google beginner's guide to using npm however if you're comfortable with that let's carry on so let's go

8:13

ahead and create our package Json file that we have been talking so much about using the command npm init making sure

8:21

that we are in the directory that we just created okay so in our project so

8:27

this is how we do it once again we have just written the command npm init and now we are prompted to answer these

8:33

questions so do we want to call our package name climate change API yes I'm

8:39

just going to click enter this is the first version that we are building the description for now I'm going to leave

8:44

blank the entry point is going to be an index.js file that we create the test

8:49

command we're going to leave blank and I'm just going to leave all these blank for now and click enter
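Accepting those prompts yields a package.json roughly like this (values mirror the choices described; npm package names must be lowercase with no spaces, hence the hyphens, and the author name here is an assumption):

```json
{
  "name": "climate-change-api",
  "version": "1.0.0",
  "description": "",
  "main": "index.js",
  "scripts": {
    "test": "echo \"Error: no test specified\" && exit 1"
  },
  "author": "Ania Kubow",
  "license": "ISC"
}
```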

8:55

so now if we look in our climate change API directory we can see a package Json

9:03

file has been created with all the keys that we are asked to fill out as you can

9:08

see climate change API has been generated version one we left the description blank we have said that our

9:16

entry point is going to be index.js file and we have one script the author we left blank we can give it an author name

9:22

I'm going to put Annie Kubo and that's it for now this is essentially where we're going to see all

9:28

the packages that we're going to need to work with to create our web scraper great so I'm just going to minimize that

9:35

and create an index.js file so once again just in the root of my climate

9:42

change API project I'm going to create a new file this is going to be a Javascript file I'm going to call it

9:47

index so now we have the point of entry for our app okay this is essentially our

9:54

server we're going to write our code to create our server in this file now the

9:59

first package that we are going to install to use so we're going to install it I'm going to show you the package

10:05

that we are going to use by just opening up npm right here the package that I

10:11

want to install is called Cheerio so here is the package that we are going to need Cheerio is a package that we will

10:18

be using to essentially pick out HTML elements on a web page it works by parsing markup and provides an API for

10:25

traversing and manipulating the resulting data structure Cheerio's selector

10:30

implementation is nearly identical to jQuery so if you know jQuery this will be familiar to you

10:36

so now that we know what we will be using it for let's get to using it to pick out elements from a web page

10:43

specifically this webpage right here okay so this is the web page that I'm

10:49

going to be scraping and the things I want to scrape are the titles of the

10:55

Articles as well as the URLs so if we inspect this page and just

11:01

gravitate to here and then pick out for example

11:07

click here we would want the I would say I'd probably want the div with the class

11:14

item header and then looking at the H3 tag with this class name as well as the

11:20

URL that comes with it so I can build my climate change API that will give us information about climate change from

11:27

various sources all over the world okay great so hopefully that makes sense hopefully I've explained what we are

11:33

going to be doing and why we're using Cheerio to do it let's go ahead and install Cheerio so it tells me that to

11:39

install I need to use this command npm i the i is for install and I'm just going to go

11:45

back into webstorm and get out my terminal and I'm just going to type npm i

11:51

I could type npm install that is totally up to you I'm going to install the package Cheerio just like it told us to

11:58

on npmjs.com and just click enter making sure that we are in the project making

12:04

sure we are in the climate change API directory okay that is pretty important

12:10

so that has now been installed and there we go we will see that a dependency has

12:16

shown up we have just installed Cheerio along with the version of it okay so

12:21

this is the version that we are working with for this tutorial if for whatever reason you are getting a different

12:26

version and that might be causing issues please go ahead and just replace that here and just

12:32

uh fill out the correct version and then once you do that just run npmi again

12:39

okay npm i will essentially install all the dependencies that you see in this

12:44

object right here and then generate a package-lock.json file from it so in

12:50

this package-lock.json file you will see that we have indeed gone to the npmjs

12:55

registry and installed Cheerio okay so here we go

13:02

we can see Cheerio has been installed for us great so once again a package-lock.json

13:09

file has been generated because we ran npm install and that installed all

13:15

these dependencies and that is what generated the package-lock.json file

13:20

correct the next package that we are going to need to install is a package called express.js

13:27

so Express JS I'm just going to search that for you in here as well Express is

13:32

essentially a backend framework for node.js we are going to install it in

13:37

order to listen to pass and listen out to our port to make sure that everything

13:43

is working okay what I mean by this is that if we visit a certain path or URL

13:49

it will execute some code and it will listen out to the port we Define but

13:54

enough talking let me show you what I mean by this so once again I'm just going to copy this command and go back

14:00

to my project I'm just going to clear that using command K and run npm install Express

14:08

and just wait for that to install and then it should show up in our dependencies just right here

14:16

wonderful so again this is the version of Express I am using for this tutorial if for

14:22

whatever reason your code isn't working it could be down to the version that you installed

14:27

okay we have a few more packages to install the next package I want to install is a

14:33

package called axios and axios is a promise based HTTP client for the

14:39

browser and node.js axios makes it easy to essentially send HTTP requests to rest endpoints and

14:47

perform crud operations this means that we can use it to get post put and delete

14:53

data it is a very popular package and one that I use quite a lot on a day-to-day basis as a developer so once

15:01

again I'm just going to copy that I'm going to go in here and install it into my project so that it shows up in our

15:08

dependencies wonderful and those are the three packages that we need for this project
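For reference, all three packages can also be installed with a single command (npm i is shorthand for npm install):

```shell
# install cheerio, express, and axios in one go, saving them to package.json
npm i cheerio express axios
```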

15:14

so hopefully you've got to this point hopefully you understand how to install packages or dependencies into your

15:21

project I feel now it's time to carry on and get to some actual coding

15:27

so I am just going to start by going into our index.js file

15:33

and defining the port we want to open up our server on this can be whatever you

15:39

wish I'm going to choose to run this on Port 8000 just like so and then I'm

15:44

going to write some code this is just some standard Syntax for listening out to the uh port to make sure it's running

15:51

but before we do that we actually need to initialize Express so first off let's

15:57

get Express so I'm going to use the package express again this is just standard syntax we need to get the

16:04

package I'm going to save it as the const express okay so this is just something that you will see being done

16:10

here as well in the setup const axios equals require axios so this is again

16:17

something that we will need to do so I'm going to copy that in order to use axios in our back end this is something that

16:24

we need to do so I'm going to put that here as well and again for Cheerio const

16:30

Cheerio equals require so that's all three of our packages done

16:37

now to initialize using Express well I'm going to show you how to do this

16:44

so what I am going to do is essentially get Express

16:50

and call it so whatever the package comes with I'm saving as Express and then I'm calling it so it releases all

16:56

this wonderful energy and all its packages and all the stuff that it comes with I'm Gonna Save this as app so we

17:03

can use it further on okay so once again this line calls the express function it

17:09

calls it and puts the new express application inside the app variable to

17:14

start a new express application so we have called this Express up here and we

17:21

are now calling it and saving it as app so that we can use it in the rest of our project as we please with all of the

17:27

Express's powers that it comes with things like app.use so

17:33

essentially this is from the Express package or app get there's a lot that it

17:39

comes with essentially it comes with a lot of power great so now that we have that let's get to using Express and

17:46

using what we stored it as which is app in order to get our port up and running

17:52

so first off we listen out for the port I'm going to use express to listen out

17:58

app.listen for the port and then I'm going to use a callback

18:05

and I'm just going to console log out in our back end server running on

18:13

port and then it'll show us our port okay

18:19

so this is what we need to write in order to just get a message to show us that everything is running fine on our

18:25

server however we also need to write a script so I'm just going to go back here

18:33

and under scripts I'm going to get rid of this test script actually so let's just get rid of that we don't really

18:38

need it I'm going to write start so I'm going to write a script for running the start command and I'm just going to use

18:45

nodemon index.js nodemon will essentially listen out for any changes on index.js so now

18:54

let's run our back end I'm just going to use npm run start

18:59

and there we go server is running on Port 8000. so our backend is working our

19:06

app is listening out for any changes made on Port 8000. wonderful

19:12

so this is looking good we have now officially used Express and express

19:17

comes with this listen and we are using it to listen out to any changes on port 8000 the port we defined we could

19:25

have defined as whatever we wish we chose 8 000. so this is now working let's carry on

19:32

now the first thing that I want to do is

19:38

let's just start scraping our first web page so what I'm going to do is write a

19:46

path so I'm gonna do some routing once again I'm going to use express I'm going to use app get just like so and then I'm

19:53

going to pass through a path so if for example I just passed through

19:58

the home page like that that is the home page path uh and then this is the Syntax for

20:05

routing

20:10

and then res Json and then let's just write something welcome

20:16

to my climate change

20:22

news API and click save remember nodemon is

20:27

listening out for any changes okay nodemon is restarting due

20:33

to changes it's starting on index.js so now if we visit Port 8000

20:39

so I'm just going to get rid of this localhost

20:46

8000. welcome to my climate change news API so that is working so what we have done

20:54

here is listen out to any time we visit the home page and then if we visit this

21:00

we get this response this response JSON welcome to my climate change news API

21:06

okay we've passed through a request we've passed a response this is just a

21:11

helper that um webstorm is giving us so I've not typed this out that is a helper as is this is telling us that this is a

21:17

request and this is a response I can change this I can make this Ania is great and that just means let's save

21:25

that that just means that now if I visit the home page there's nothing there and

21:30

if I visit Ania is great if I spell it correctly

21:36

welcome to my climate change news API so that is working I'm just going to

21:41

head right back to the homepage hopefully you see now how that works great but this is an API we need to

21:49

actually get some you know interesting data coming back and I want to scrape the internet to get the data from

21:55

certain news articles coming back to me so let's go ahead and do that

22:02

so to do this I'm going to keep that as is and once again I'm going to try app get so exactly the same syntax

22:11

as above let's pass through a request and response just like we did before and

22:18

this time let's say if we visit news well then I'm going to use axios

22:27

get and I want to essentially visit this URL okay I want to visit this URL

22:36

so let's grab that and let's just paste it I want to visit this URL and I also want

22:44

to wait for that to return because this is essentially going to return a promise

22:50

it's returning something back to us so once that return comes back I'm going to do some chaining uh if you don't know

22:55

much about async JavaScript I do have a course on this that I really recommend it's a five part Series so we're gonna

23:02

do some chaining with this so we're visiting this URL and then the response

23:08

that comes back to us well I want to save this response so const let's save it

23:15

as HTML and I'm just going to save it as response

23:22

data okay just like so so now if we

23:27

console log HTML and I'm just going to visit

23:34

this path be sure to save that page and let's

23:39

visit news and now let's go back to here and see

23:46

what comes back to us whoops I wrote something that's not meant to

23:52

be here and just delete that and save and visit news again enter

24:00

okay so now you will see the HTML of this website coming back to us okay so

24:09

this is essentially everything from the guardian websites page that we visited this website page to be exact and it's

24:16

coming back to us so that's all I've done however this is great but we need

24:21

to carry on we actually need to pick out the elements so I'm going to show you how to do that and to do that we're

24:28

going to use cheerio and actually we can just get rid of this and Cheerio has

24:33

some commands that will help us do this load comes with the package Cheerio and I'm going to pass through the HTML the

24:40

response data saved as HTML and I'm passing that through into Cheerio and let's save this as the dollar sign so

24:47

now essentially that's allowed us or you will see this is going to allow us to pick out elements

24:53

so this is the Syntax for using the Cheerio package we need to get the

24:59

dollar sign up or essentially what we have defined here is dollar sign we're now going to use so I'm going to use

25:05

that like so and then I am going to look for any a

25:10

tags that contain

25:16

anything to do with climate so what this means just make

25:21

sure that this is contains so what this is doing is looking on this

25:27

webpage right here and then finding any elements that have the a tag so for

25:32

example this one right here and if it contains anything to do with climate

25:37

change for example this one doesn't so for example this a tag consists of

25:43

the word climate so that should be picked out so that's exactly what I am

25:48

doing I'm going through all the HTML and using this syntax looking for a tags

25:53

that contain the word climate Okay and then we just pass through the HTML and

25:59

then for each one that comes back I'm assuming there's more than one uh I'm

26:04

just gonna write function so I'm going to pass through a function

26:10

so essentially a callback function and whatever so for each one that comes back

26:16

to us I'm going to grab its text because essentially once again I want to grab

26:22

the text that's in the a tag okay I want to grab that

26:27

so let's save this as something I'm going to save this as the title so

26:34

I'm going to go into this so for each of them and get the text

26:40

and I'm going to save this as a title so we have decided this is going to be the title

26:46

of the data that comes back to us and next we need the URL so what do we want the URL to be well once again whatever

26:53

comes back to us whatever we are saving this I want to grab the href of it so if we

27:03

look in here here's the href so again for every a tag that we find I want to grab the href attribute so I would do so

27:11

by essentially getting the attribute like so attribute href

27:19

okay great so that's what we're doing and then let's push it into its own array so

27:26

this is some JavaScript work articles is what I'm going to call my array and I'm

27:31

just going to save it we can put it up here actually so I'm just going to put it outside of this app.get

27:39

so const articles so it's global we're going to get the articles I'm gonna push

27:46

a new object that we're going to create and the object that I'm going to push in is going to have the title

27:54

the URL and then well let's just see what this looks like so I'm just going to display the

28:02

articles in the actual browser using res Json so we're pushing this object into

28:10

the Articles array and we're going to display the articles in the browser when we visit forward slash news and then

28:16

let's also catch errors so again there's some chaining if you know asynchronous JavaScript then this will make sense to

28:23

you if not once again I do have a tutorial on this a five part series on asynchronous JavaScript so that is how

28:29

we just catch any errors okay so now let's visit
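Assembled, the /news route from this section might look like this — the Guardian URL is an example stand-in for the page being scraped, so substitute whichever publication page you chose:

```javascript
const express = require('express');
const axios = require('axios');
const cheerio = require('cheerio');

const app = express();

// scrape one page for climate-related links when /news is visited
app.get('/news', (req, res) => {
    // example URL — substitute the publication page you are scraping
    axios.get('https://www.theguardian.com/environment/climate-crisis')
        .then((response) => {
            const html = response.data;       // raw HTML returned by axios
            const $ = cheerio.load(html);     // let cheerio parse the markup
            const articles = [];

            // pick out every <a> tag whose text contains the word "climate"
            $('a:contains("climate")', html).each(function () {
                const title = $(this).text();      // the link's visible text
                const url = $(this).attr('href');  // the link's target
                articles.push({ title, url });
            });

            res.json(articles);               // send the array back as JSON
        })
        .catch((err) => console.log(err));    // log any request errors
});
```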

28:35

forward slash news again and there we go so we are getting our

28:42

data back if it looks like this is because we need a Json viewer really so

28:47

if you don't have the Chrome extension Json View

28:52

here it is essentially just makes everything a lot more readable I'm just going to add that to Chrome

28:58

like so okay so that is being added and once it

29:03

is finished then the icon will be visible and there we go so that is now a lot more

29:11

readable for us this is looking great so every time that we found an a tag that

29:19

contained the word climate we essentially created this object we picked out the title of whatever was in

29:25

that a tag and we picked out the href of it so now I have an array full of

29:31

titles and URLs from the guardian page that we scraped okay so how cool is

29:39

that we've officially created our first scraping tool but I really want to make

29:45

this a lot meatier I really want to give this a lot more value to anyone who wants to purchase my climate change news

29:51

API so I'm going to scrape from a lot of different websites and I'm going to show you how so this is part one we've

29:58

essentially learned how to scrape a website to retrieve back an array full of anything that contains news about the

30:06

climate crisis or the word climate the title and URL but now I'm going to Loop

30:12

that in with a lot of others so let's do it

30:17

so first off I'm actually going to create an array of newspapers that I

30:23

want to scrape so we've got our articles here actually above here I'm going to store the newspapers so const news

30:31

papers and I'm just going to make an array and I'm going to actually put through an array of objects and you'll

30:38

see why I'm not just going to pass through the URLs I'm going to pass through the name of the publication so

30:44

these are some that I found before let's go over the times

30:51

the address that we want to scrape and this is going to be this URL right here and I'm just going

30:59

to paste a few more that I have so we already have the guardian let's also have the telegraph so the

31:06

times guardian and Telegraph all on specific climate change pages so there's a lot more data for us so we can start

31:12

with three let's do three and then we'll add a bunch more later so here are the three newspapers that we want to scrape
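The array might be sketched like this — the names mirror the three publications mentioned, but the exact addresses are placeholders for whichever climate pages you choose to scrape:

```javascript
// publications to scrape — name identifies the source, address is the page
// to fetch (these URLs are examples; point them at the pages you want)
const newspapers = [
    {
        name: 'thetimes',
        address: 'https://www.thetimes.co.uk/environment/climate-change',
    },
    {
        name: 'guardian',
        address: 'https://www.theguardian.com/environment/climate-crisis',
    },
    {
        name: 'telegraph',
        address: 'https://www.telegraph.co.uk/climate-change',
    },
];
```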

31:20

now I'm going to essentially edit my um

31:28

function right here to Loop over all three Publications so I'm going to show

31:35

you how to do this now this time I am going to actually

31:40

write this function outside of this so I'm going to actually just

31:49

use this as a template I'm going to do this outside here

31:54

so maybe let's do it up here actually so this time for each of the newspapers so

32:00

every newspaper in my newspaper array newspapers I think that's what we called

32:06

it so for each one of these for each newspaper this is why I said JavaScript

32:13

was a good prerequisite because if you don't know JavaScript this can be extremely confusing but you know if you

32:19

just want to use this code and just want to take it then please just listen to me talking through it for you anyway so for

32:25

each newspaper so for each item in my newspaper array there's three items in

32:31

there one two three I want to so I want to get axios get so we're just

32:39

using what we did before this is a good refresher and then I need to pass through the URL so I'm going to pass

32:46

through the newspaper address newspaper singular okay because we for

32:52

every item for every newspaper in our newspaper array we can call this whatever we wish we can call this dog it

32:58

doesn't matter we are just saying that for each item in our newspaper array well I want to get the address

33:05

making sure to spell address the same way that we did above so I'm passing through the URL so just like we did here

33:12

and passing through the URL however just by looping over the array of newspapers

33:17

and then so just like we did before then we get the

33:23

response okay and once again we need to get the

33:33

response data save that as HTML and then pass it through into Cheerio load and save it as the dollar sign so that's the

33:41

same let's carry on now once again because I have looked at all these

33:47

different newspapers and I've actually found newspapers that work to this style and I've sort of adjusted this so that

33:53

anytime you find an a tag it just so happened that each a tag had some text in it and it also happened to be that

34:00

the a tag had the href but that's expected from an a tag so I can actually reuse this

34:06

so once again for each of the three here I'm finding an a tag that contains the

34:12

word climate let's just make sure to close this off okay

34:18

and once again what I am doing is looking at whatever comes back okay

34:23

whatever comes back uh so whatever comes back is this and I'm getting the text

34:29

from that a tag and then I'm going to save this as title

34:36

and once again I'm looking at each of the a tags so this is the a tag and I'm

34:42

going in and I'm getting the attribute href and whatever this value is I'm

34:48

going to save that as URL equals okay and once we have that I'm going to push it into articles articles push and I'm

34:57

going to make a new object which has a title and a URL but this time also a

35:04

publication okay so this time I'm actually going to have the source

35:11

newspaper address okay so whatever we pass through

35:16

whatever newspaper we pass through this time I'm going to get the newspaper address oh actually we should probably

35:21

have the newspaper name instead that would make more sense let's have the newspaper name show up

35:29

as the source okay so now

35:35

instead of having all this I'm just going to delete all that I'm going to

35:41

just return the Articles because this function will run and we're collecting all the Articles so what I want to

35:48

appear here in the JSON in the browser so once again we use res.json is the

35:54

articles so now if we visit forward slash news

36:01

we get articles from the times we get the source URL the times the title

36:07

this is looking great we also get The Guardian articles so we scrape The

36:12

Guardian and we also get the Telegraph
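Each article pushed into the array ends up as a plain object with a title, URL, and source; a minimal sketch of the shape (the values here are illustrative, not real scraped output):

```javascript
// Illustrative shape of one scraped article; the title and URL here
// are made-up examples, not real scraped data.
const article = {
  title: 'Climate change: an example headline',
  url: 'https://www.theguardian.com/environment/example-article',
  source: 'guardian',
};

// the /news route collects many of these into an array
// and sends them back with res.json(articles)
const articles = [article];
console.log(articles.length); // 1
```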

36:18

okay however this does not look like the correct URL let's just double check why okay so

36:27

we are going we are visiting this URL and we are scraping it so once again

36:32

let's go here and once again I have decided that I want to search for any a tags in

36:40

here

36:46

so for example this will probably be an a tag so yes we have an a tag that has a

36:52

URL so this URL is incomplete it seems that it doesn't have a base and then I'm

36:57

looking inside it for some text so anything inside this parent a tag I'm looking for some text and there is

37:03

our title so this is working fine now we need to make an adjustment as this is not the full URL because if we click on here

37:10

it will not take us to anything so our API is broken people would not like that

37:15

so all I'm going to do is go back here and pass through a base

37:21

so this doesn't need a base so I'm going to have that empty this

37:27

also doesn't need a base the Telegraph however needs a base and I'm just going to pass through the base

37:38

which is essentially https://www.telegraph

37:45

.co.uk okay so now that we have that base it just means that the source we

37:50

pass through so the URL well the URL is not going to be just the URL it's going to be the

37:56

newspaper base if that exists plus the URL

38:02

okay so we're grabbing the URL but we're appending the newspaper base in front of

38:09

it so that is looking good let's try that out again let's refresh this and now if

38:15

we visit the Telegraph section you will see we have created a new URL one that

38:20

includes https://www.telegraph.co.uk

38:26

wonderful so there we have it we have completed step two we have essentially got three

38:33

newspapers and scraped all the information all the articles about climate change from there along with

38:39

their titles and the URLs if we want to visit those articles wonderful
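The base fix described above can be sketched as a small helper; the newspapers array below is a trimmed stand-in for the one in the project:

```javascript
// A trimmed stand-in for the newspapers array in the project:
// papers whose scraped hrefs are already absolute get an empty base.
const newspapers = [
  { name: 'guardian', base: '' },
  { name: 'telegraph', base: 'https://www.telegraph.co.uk' },
];

// Prepend the newspaper's base (if any) to the scraped href;
// an empty base leaves an already-absolute URL untouched.
function toFullUrl(newspaper, href) {
  return newspaper.base + href;
}

console.log(toFullUrl(newspapers[1], '/climate-change/example-article'));
// https://www.telegraph.co.uk/climate-change/example-article
```

Simple string concatenation is enough here because each paper's links are consistently either all absolute or all relative.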

38:44

now this is looking good however I do want to make another route and that is to get information just from one

38:51

newspaper article so I'm going to show you how to do that just under here

38:57

so once again we're going to use app.get just like so and this time the URL I

39:02

want to pass through so I'm going to visit sorry it's going to be news forward slash and if we want to visit a

39:09

particular newspaper ID this is the syntax to do it uh bear with

39:15

me because I think this is best explained by showing so you need this colon and then whatever you pass

39:20

through I want it to return something so I'm just going to pass through request

39:27

response this time making sure to put async right in front of it and then I want to grab whatever

39:35

newspaper ID I pass after news so I can do so if I console log

39:43

request okay and now let's just go ahead and visit forward slash news and Anya is

39:50

great once again and click enter nothing will show up here but if we

39:57

visit our console here you will see a request

40:04

everything that comes back in the request has been essentially console logged for

40:09

us and in the params you will see that the newspaper ID is Anya is great okay

40:15

so now this is this whole thing right here all this text is the request if I

40:20

go dot params that will get this and if I go newspaper ID

40:27

because that is what we've called it up here I could call this whatever I wish I could call it dog and in this case they

40:32

also come back as dog equals Anya is great so now if I console log that and

40:37

once again let's visit something else Anya is awesome

40:42

and scroll down here and yeah Anya is awesome okay so hopefully that makes

40:47

sense whatever we pass through this colon just means that you know it's an ID that we are passing through an identifier it

40:53

could be whatever you wish you can call it XXX but essentially it's going to be saved under params when we visit the

40:58

page so now that we have that I'm going to use this to my advantage I'm actually going to save this as something so req

41:11

.params.newspaperId

41:11

and let's save this as the newspaper that we want to visit so I'm just going

41:16

to put const newspaperId

41:22

so if we visit let's say forward slash news forward slash the times I just want

41:27

information from The Times okay that's all I want so I'm just gonna go axios

41:32

.get and once again pass through the URL well we know that we want everything we

41:38

want the URL from the times so we're going to use this array and what I'm

41:44

going to do is use some JavaScript so I'm going to get the array and I'm going to filter the array to find the

41:52

newspaper okay so once again for each newspaper so each item each of the three

41:57

items in my newspaper array I'm going to go through each one and if the newspaper's name

42:04

equals the newspaper

42:09

ID okay you don't have to make it strict so that

42:15

is going to come back so if I pass through news forward slash the times it will come back and if that matches the times

42:22

Well I want to get the address so to get this back well let's save this

42:27

as something let's save this as const newspaper

42:34

just gonna comment this out for now and if I console log newspaper I expect that if I visit

42:42

forward slash the times oops making sure to spell

42:47

exactly the same okay so I'm visiting that nothing will happen here at the moment but if we visit here we get back the object from

42:55

our array we get back the whole object and we go into this object we want to go into the array so here's the array I

43:03

want to go into the array because there's only ever going to be one item hopefully I'm just going to go into the first item and grab the

43:10

address so I'm going into this item and grabbing the address and let's save this now as newspaper address

43:16

okay so we have now hopefully got our newspaper address let's see if that works on another newspaper so let's now

43:23

visit guardian

43:29

and let's go visit here and great we are getting back the URL of the Guardian if

43:35

we pass through the ID of guardian wonderful so now that we have that let's carry on
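The lookup just described, filtering the newspapers array on the ID from the route, can be sketched like this (the array is a trimmed stand-in and the addresses are illustrative):

```javascript
// Trimmed stand-in for the newspapers array; the names double as the
// IDs used in the route, and the addresses are illustrative.
const newspapers = [
  { name: 'thetimes', address: 'https://www.thetimes.co.uk/environment/climate-change' },
  { name: 'guardian', address: 'https://www.theguardian.com/environment/climate-crisis' },
];

// filter returns an array; with unique names there is at most one
// match, so grab the first item and read its address.
function addressFor(newspaperId) {
  const match = newspapers.filter(
    (newspaper) => newspaper.name === newspaperId
  );
  return match[0].address;
}

console.log(addressFor('guardian'));
// https://www.theguardian.com/environment/climate-crisis
```

In the real route, `newspaperId` comes from `req.params.newspaperId`.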

43:41

so we can use axios.get to pass through the URL so the newspaper address of

43:48

whatever path we visit based on the identifier and then let's do some chaining so once that comes back with a

43:55

response we're going to get the response

44:00

and once again we need to get the response data and let's save this as the

44:06

const HTML so this is going to be Cheerio at work because we're going to use cheerio

44:13

.load making sure to spell cheerio correctly like the package and pass through the HTML and now let's save

44:21

this as the dollar sign so we can use it and once again

44:28

I'm actually going to collect all the Articles here again you can call this whatever you wish and that might be

44:33

confusing maybe we can call it specificArticles just to differentiate from the

44:40

other articles array that we have globally so let's make specificArticles an empty array and this time

44:47

once again we are going to look for any a tag that contains

44:53

the word climate on the page that we are visiting

44:58

and then HTML we need to pass that through and for each item that comes back to us so for each a tag containing

45:05

the word climate that comes back to us we write a callback function

45:13

oops we have to make sure that these two are the same okay so there we go

45:19

contains a climate and making sure this is a dollar sign

45:24

I'm gonna get its text

45:31

and save this as the title so const title

45:36

I'm also going to get the URL by going into

45:42

this and getting the attribute of href

45:51

and now we need to push we need to create an object and push it into specific articles so I'm going to use

45:56

push to create an object which again well I'm going to pass through a title

46:03

uh the URL the URL is going to also have a base

46:11

I think we need to get a base so if there is a base to be added this is how

46:16

you would do it we would go into our newspapers array again and filter

46:22

by newspaper so for each of the three items in our newspaper array if the

46:27

newspaper name matches the newspaper ID so just like we did before we're going

46:33

to go into the object and get its base and what should we save this as let's save

46:39

it as newspaperBase just like so

46:48

okay so if a newspaper base exists then we're going to add it to the URL so right before the URL and once again the

46:55

source well we know that this is just the newspaper ID

47:00

great so this is looking good and then of

47:05

course we want to display this in the browser so I'm going to use res.json to

47:11

display the specific articles okay and once again we're just

47:17

going to catch any errors so this is the syntax for doing so we just console.log the err
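The then/catch chaining used here follows the usual promise pattern; a minimal sketch with a stand-in promise in place of the axios call:

```javascript
// fakeGet stands in for axios.get(url): it resolves with a
// response-like object carrying HTML in its data property.
function fakeGet(url) {
  return Promise.resolve({ data: '<html>stub page for ' + url + '</html>' });
}

fakeGet('https://example.com')
  .then((response) => {
    // in the real route this is where cheerio.load(response.data) runs
    const html = response.data;
    console.log(html.includes('stub page')); // true
  })
  .catch((err) => console.log(err));
```

Any error thrown inside the then callback lands in the catch, which is exactly why the route ends with one.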

47:23

and now if we visit this page we will get all the news articles from

47:29

The Guardian and then if we go The Times making sure to spell it exactly as we

47:36

did in here then we get all the newspaper articles

47:42

from the times and if we just go back to all the news articles you get all the news articles and there we go so now

47:51

we've done it we've created an API we've figured out a way to get all the newspaper articles here and then if we

47:58

go forward slash and then use the ID to get specific articles which we will be explaining on the documentation when we

48:04

build our API we get just specific articles great before we move on to put

48:10

this on rapid API I'm actually going to add a lot more newspapers in here

48:15

so I'm just going to actually just paste a few that I made earlier

48:20

okay so now we have a lot more data coming back to us once again I'm just

48:27

going to visit here so news and ta-da it's getting lots and lots and

48:35

lots more data up to 730 lines of data to be exact

48:40

wonderful okay so this is looking good let's carry on

48:48

before we move on I'm just going to format this a little bit better just so it looks a lot neater

48:54

I'm going to do selected text optimize Imports rearrange code and run and

49:00

great okay ah we don't actually need this it turns out so let's get rid of this

49:06

and now to prep for deploying onto Heroku well I need to include one more package in the project as

49:13

right now we have it installed globally on our machines but Heroku doesn't know that so there we go npm install nodemon

49:20

and finally I need to change the port options for Heroku so give it an option like this
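The port change is the standard Heroku pattern: read the port from the environment and fall back to a local default. A sketch (8000 as the fallback is an assumption; use whatever port the project already listens on):

```javascript
// Heroku injects the port it wants via the PORT environment variable;
// locally PORT is unset, so fall back to a development port (8000 here
// is an assumption; use whatever port the project already listens on).
const PORT = process.env.PORT || 8000;

// in the project this value feeds app.listen:
//   app.listen(PORT, () => console.log('server running on PORT ' + PORT));
console.log(PORT !== undefined); // true
```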

49:26

and great so this is looking wonderful again uh you will be able to get this

49:33

from my source code I will share it please feel free to use it for educational purposes or to build your

49:39

own API okay so now let's go on to the rapid API platform okay so here we are

49:46

back on our rapid API dashboard let's actually call our API something I'm

49:51

going to call it climate change live let's call it that and then

49:57

an API showing all

50:03

the latest climate change news

50:09

around the world as we do have Australian articles in there and some

50:14

American ones so it's quite worldly for the category I'm gonna say data maybe or

50:20

news let's go with news I think that's probably most appropriate and then the owner of this API will be me that is

50:28

correct and I'm just going to keep it as UI and add the API just like so okay so

50:35

here we are we need to add a base URL well for this I'm actually going to deploy my app onto Heroku this should be

50:41

relatively painless so all I'm going to do is ask you to head over to Heroku and

50:48

you just see all the projects I've hosted on this so far and I'm just going to go ahead and create a new one give it

50:54

a name let's go with climate change API and choose the region and just click

50:59

create app so there we go now we should have a lot of information on how to

51:06

deploy this using Heroku okay so if you haven't downloaded the Heroku command

51:12

line interface please go ahead and do so now I'm just going to do that with you

51:18

here so here we go I'm going to copy this command go into my terminal

51:26

let's create a new tab and run brew tap heroku/brew and brew install heroku now

51:33

if you are an Apple user like me you're gonna also have to install Homebrew so please go ahead and do this simply by

51:40

copying this command here and pasting it in your terminal I already have this installed so I'm ready to continue so

51:47

this is only for Apple users if you have another machine please do use the equivalent

51:53

okay so that should be fine that is installing and that should be ready to

51:58

go so we have the command line interface installed let's carry on with the

52:04

instructions given to us so once that has finished downloading I'm just going to run heroku login and then just

52:10

click any button to open it up so this is just to make sure that we are logged in so I'm just going to click log in

52:17

here just like so and we are now logged in we can close this page and continue

52:22

with doing this process so now we're going to run git init

52:29

and that has now initialized an empty git repository in my project okay so

52:35

making sure that we are inside climate change API now let's carry on the next command I need to do is take this

52:41

command right here and just paste it like so so we are just going through all the commands that we are

52:49

given great so now before we move on I'm going to actually add a .gitignore file so

52:54

that we don't upload any node modules before we start adding files so I'm just going to go ahead and write .gitignore

53:01

and what I am saying is that I want to ignore the node modules from being committed and just save that okay so now
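The .gitignore he creates needs only a single line to keep the dependencies folder out of the commit (Heroku reinstalls everything from package.json on deploy):

```
node_modules
```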

53:09

we can carry on I need to add the files

53:15

the next thing we need to do is commit the files with a message let's say final commit and

53:23

push it to master so just like so

53:28

and there we go the build has succeeded so let's go ahead and check it out

53:35

okay great and there we have it we have deployed our site it is now live all our

53:40

data is now live just right here the next thing I'm going to do is go back to Rapid API and continue

53:46

so now that we have done that I'm just going to upload an image that I want to

53:52

represent my API and in here on the website I'm going to use the URL that we

53:57

have just created so we have deployed this URL this is where our app is going to sit I'm going to use that right here

54:05

and next I'm just going to go ahead and type that in

54:10

the next thing we're going to do is add some endpoints so what shall we call the

54:16

first endpoint that we are going to create well I'm going to create my first REST endpoint so just go ahead and

54:22

click here and let's call this get all climate change

54:32

news and we can give a description saying

54:37

that this endpoint will return back all news about

54:46

climate change from all over the world

54:55

now with the endpoint here we're just going to specify what we did before so to get all the news articles we had

55:00

forward slash news and this is indeed a get request we're going to get that data so that is all we have to do we don't

55:07

have any parameters we need to worry about for this occasion and then we have all these options to us we are using

55:14

node.js and we are using axios as it has picked out so this is all we really need

55:20

to do for now but if you do want to see the others please go ahead and do that

55:25

okay so for this one I'm actually going to start off with the newspaper ID so

55:32

I'm just going to go ahead and go forward slash news and then we can't use :newspaperId that will not work as

55:40

you can see we need to put this in curly braces just like it is prompting us to

55:46

do so that's what I'm going to do there we go newspaper ID is already picked out that this is my parameter so that is

55:52

looking good and let's go ahead and put an example value so we know that one of them is

55:58

Guardian so I'm just going to put in guardian and you can see that has populated right here

56:04

and go ahead and save that so we've got that and we can see some

56:09

example responses showing up we also have mock responses too if we wish so if

56:16

you wish to fill this out please go ahead and do so here so I can actually do this so I'm just going to go ahead so

56:23

this is essentially all really useful when people visit rapid API and see our API so they can see what kind of

56:29

responses to expect when they visit a certain endpoint it's really useful and it just allows people to have an idea of

56:37

what your API can provide and what kind of value it gives and of course we have the plans and

56:45

pricing now here are some public and private plans this is sort of what it

56:51

looks like and the basic one is free so for example I can say that we have

56:56

unlimited requests or we can change the monthly requests so on the free option

57:02

to be a thousand per month you can have a Pro Plan so let's go ahead and add a

57:08

Pro Plan and once again this is going to be for all endpoints and once again you

57:13

can change the quota type and quota limit and then have overages as well and

57:20

then let's say that we charge 0.1 extra for overages okay so you can do whatever

57:26

you like let's just go ahead and save that so this is a little bit higher it's not super high I'm not going to charge

57:32

anyone like an extravagant amount for this but these are the options that you have

57:38

okay so there we go this is looking good let's go ahead and

57:45

publish this so I'm going to make API visibility public and there we go here

57:52

is my API live on rapid API you can see the two endpoints that we have so we can

57:57

get all the news and we can also get the news based on newspaper ID for a particular

58:03

newspaper we of course have to give people the newspaper IDs available and that is all done in my API documentation

58:11

that has been actually generated for us nicely by rapid API so here you can see

58:16

all the example responses and you also have the drop down that we saw earlier so if you want to make this by node.js

58:23

it gives you the code available to you so you can just copy this and put it in your project and there's a bunch of

58:30

others too so whatever you feel more comfortable with that is available as an option okay so hopefully you've enjoyed

58:36

this tutorial hope you now have your own APIs that you can sell and pass on to people pass on to friends to make some

58:43

money out of it again please feel free to charge as much as you feel comfortable charging and selling your API for for now that's it from me

58:50

thanks very much all the code and source code will be available in the description as well as links to Rapid

58:56

API so please go ahead and do check this out
