Creating a website aggregator with ChatGPT, React, and Node.js 🚀
A website aggregator is a website that collects data from other websites across the internet and puts the information in one place where visitors can access it.
There are many versions of website aggregators; some are search engines such as Google and DuckDuckGo, and some have more of a Product Hunt structure where you can see a picture and a short text.
You will usually scrape the website, take its meta tags and h1-h6 tags, scan its sitemap.xml, and use some pattern to sort the information.
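For context, here is a minimal sketch of that classic approach. It assumes Node.js 18+ (for the built-in fetch) and the cheerio package, neither of which we will need later in this tutorial; classicScrape is just a hypothetical helper name.
//👇🏻 Hypothetical sketch of the classic scraping approach (not used later in this tutorial)
const cheerio = require("cheerio");

async function classicScrape(url) {
  //👇🏻 built-in fetch requires Node.js 18+
  const html = await (await fetch(url)).text();
  const $ = cheerio.load(html);

  return {
    title: $("title").text().trim(),
    description: $('meta[name="description"]').attr("content"),
    headings: $("h1, h2, h3")
      .map((_, el) => $(el).text().trim())
      .get(),
  };
}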
Today I am going to use a different solution 😈
I will take the entire website content, send it to ChatGPT, and ask it to give me the information I need.
It’s kinda crazy to see how ChatGPT parses the website content.
So lettttsss do it 🚀
In this article, you’ll learn how to build a website aggregator which scrapes content from a website and determines the website’s title and description using ChatGPT.
What is ChatGPT?
ChatGPT is an AI language model trained by OpenAI to generate text and interact with users in a human-like conversational manner. It is worth mentioning that ChatGPT is free and open to public use.
Users can submit requests and get information or answers to questions from a wide range of topics such as history, science, mathematics, and current events in just a few seconds.
ChatGPT performs other tasks, such as proofreading, paraphrasing, and translation. It can also help with writing, debugging, and explaining code snippets. Its wide range of capabilities is the reason why ChatGPT has been trending.
ChatGPT is not available as an API yet 🙁 In order to use it, we will have to scrape our way in 😈
Novu – the first open-source notification infrastructure
Just a quick background about us. Novu is the first open-source notification infrastructure. We basically help to manage all the product notifications. It can be In-App (the bell icon like you have in Facebook – Websockets), Emails, SMSs and so on.
I would be super happy if you could give us a star! And let me also know in the comments ❤️
https://github.com/novuhq/novu
Limitations of ChatGPT
As previously mentioned, ChatGPT is not accessible through a public API. Instead, we can use web scraping techniques to access it. This involves automating the process of logging in to the OpenAI website, solving the captcha (you can use 2captcha for this), and sending an API request with the OpenAI cookies. Fortunately, there is a public library that can handle these tasks for us. Keep in mind that this is not a formal API, so you may encounter limitations if you attempt to make a large number of requests. Additionally, it is not suitable for real-time requests. If you want to use it, consider implementing a queue system for background processing.
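To make that last point concrete, here is a minimal sketch of such a queue using a plain in-memory array; handleUrl is a hypothetical function that would scrape the URL and call ChatGPT, and a production app would more likely use a proper job queue such as BullMQ.
//👇🏻 Minimal in-memory queue sketch – handleUrl is a hypothetical handler
const jobs = [];
let working = false;

function enqueue(url) {
  jobs.push(url);
  processNext();
}

async function processNext() {
  if (working || jobs.length === 0) return;
  working = true;
  const url = jobs.shift();
  try {
    //👇🏻 scrape the URL and call the unofficial ChatGPT API here
    await handleUrl(url);
  } catch (err) {
    console.error(err);
  } finally {
    working = false;
    //👇🏻 wait a few seconds between jobs to avoid hitting rate limits
    setTimeout(processNext, 5000);
  }
}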
Project setup
Here, I’ll guide you through creating the project environment for the web application. We’ll use React.js for the front end and Node.js for the backend server.
Create the project folder for the web application by running the code below:
mkdir website-aggregator
cd website-aggregator
mkdir client server
Setting up the Node.js server
Navigate into the server folder and create a package.json file.
cd server && npm init -y
Install Express, Nodemon, and the CORS library.
npm install express cors nodemon
ExpressJS is a fast, minimalist framework that provides several features for building web applications in Node.js; CORS is a Node.js package that allows communication between different domains; and Nodemon is a Node.js tool that automatically restarts the server after detecting file changes.
Create an index.js file – the entry point to the web server.
touch index.js
Set up a Node.js server using ExpressJS. The code snippet below returns a JSON object when you visit http://localhost:4000/api in your browser.
//👇🏻index.js
const express = require("express");
const cors = require("cors");
const app = express();
const PORT = 4000;

app.use(express.urlencoded({ extended: true }));
app.use(express.json());
app.use(cors());

app.get("/api", (req, res) => {
  res.json({
    message: "Hello world",
  });
});

app.listen(PORT, () => {
  console.log(`Server listening on ${PORT}`);
});
Install the unofficial ChatGPT API library and Puppeteer. The ChatGPT API uses Puppeteer as an optional peer dependency to automate bypassing the Cloudflare protections.
npm install chatgpt puppeteer
To use the ChatGPT API within the server/index.js file, you need to configure the file to use both the require and import keywords for importing libraries. Therefore, update the server/package.json file to contain the type keyword.
1{ "type": "module" }
Add the code snippet below at the top of the server/index.js file.
import { createRequire } from "module";
const require = createRequire(import.meta.url);
//...other code statements
Once you have completed the last two steps, you can now use ChatGPT within the index.js file.
Configure Nodemon by adding the start command to the list of scripts in the package.json file. The code snippet below starts the server using Nodemon.
//In server/package.json

"scripts": {
  "test": "echo \"Error: no test specified\" && exit 1",
  "start": "nodemon index.js"
},
Congratulations! You can now start the server by using the command below.
npm start
Setting up the React application
Navigate into the client folder via your terminal and create a new React.js project.
cd client
npx create-react-app ./
Delete the redundant files, such as the logo and the test files from the React app, and update the App.js file to display “Hello World” as below.
function App() {
  return (
    <div>
      <p>Hello World!</p>
    </div>
  );
}
export default App;
Navigate into the src/index.css file and copy the code below. It contains all the CSS required for styling this project.
@import url("https://fonts.googleapis.com/css2?family=Space+Grotesk:wght@300;400;500;600;700&display=swap");
* {
  box-sizing: border-box;
  margin: 0;
  padding: 0;
  font-family: "Space Grotesk", sans-serif;
}
input {
  padding: 10px;
  width: 70%;
  margin: 10px 0;
  outline: none;
}
button {
  width: 200px;
  cursor: pointer;
  padding: 10px 20px;
  outline: none;
  border: none;
  background-color: #6d9886;
}
.home,
.home__form,
.website__item,
.loading {
  display: flex;
  align-items: center;
  justify-content: center;
  flex-direction: column;
}
.home__form > h2,
.home__form {
  margin-bottom: 30px;
  text-align: center;
  width: 100%;
}
.home {
  min-height: 100vh;
  padding: 20px;
  width: 100%;
}
.website__container {
  width: 100%;
  min-height: 50vh;
  border-radius: 5px;
  display: flex;
  align-items: center;
  justify-content: center;
  flex-wrap: wrap;
  padding: 15px;
}
.website__item {
  width: 80%;
  margin: 10px;
  background-color: #f7f7f7;
  border-radius: 5px;
  padding: 30px;
  box-shadow: 0 2px 8px 0 rgba(99, 99, 99, 0.2);
}
.website__item > img {
  width: 70%;
  border-radius: 5px;
}
.website__item > h3 {
  margin: 10px 0;
}
.website__item > p {
  text-align: center;
  opacity: 0.7;
}
.loading {
  height: 100vh;
  background-color: #f2e7d5;
}
Update the App.js file to display an input field that allows you to provide the website’s URL.
import React, { useState } from "react";

const App = () => {
  const [url, setURL] = useState("");

  const handleSubmit = (e) => {
    e.preventDefault();
    console.log({ url });
    setURL("");
  };

  return (
    <div className='home'>
      <form className='home__form'>
        <h2>Website Aggregator</h2>
        <label htmlFor='url'>Provide the website URL</label>
        <input
          type='url'
          name='url'
          id='url'
          value={url}
          onChange={(e) => setURL(e.target.value)}
        />
        <button onClick={handleSubmit}>ADD WEBSITE</button>
      </form>
    </div>
  );
};

export default App;
Congratulations! You’ve successfully created the application’s user interface. In the following sections, I’ll walk you through scraping data from websites using Puppeteer and getting a website’s description and title via ChatGPT.
How to scrape data using Puppeteer in Node.js
Puppeteer is a Node.js library that automates several browser actions such as form submission, crawling single-page applications, UI testing, and in particular, web scraping and generating screenshots of web pages.
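If you haven’t used Puppeteer before, here is a tiny standalone example (separate from the aggregator code) that opens a page, prints its title, and saves a screenshot:
//👇🏻 Standalone Puppeteer example – not part of the aggregator itself
const puppeteer = require("puppeteer");

(async () => {
  const browser = await puppeteer.launch();
  const page = await browser.newPage();
  await page.goto("https://example.com");
  //👇🏻 prints the page title
  console.log(await page.title());
  //👇🏻 saves a screenshot of the page
  await page.screenshot({ path: "example.png" });
  await browser.close();
})();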
Here, I’ll guide you through scraping the website’s content via Puppeteer in Node.js. We’ll send the website URL provided by the user to the Node.js server and scrape the website’s content from that URL.
Create an endpoint on the server that accepts the website’s URL from the React app.
app.post("/api/url", (req, res) => {
  const { url } = req.body;

  //👇🏻 The URL from the React app
  console.log(url);
});
Import the Puppeteer library and scrape the website’s content as done below:
//👇🏻 Import Puppeteer at the top
const puppeteer = require("puppeteer");

app.post("/api/url", (req, res) => {
  const { url } = req.body;

  //👇🏻 Puppeteer webscraping function
  (async () => {
    const browser = await puppeteer.launch();
    const page = await browser.newPage();
    await page.goto(url);

    //👇🏻 returns all the website content
    const websiteContent = await page.evaluate(() => {
      return document.documentElement.innerText.trim();
    });

    //👇🏻 returns the website meta image
    const websiteOgImage = await page.evaluate(() => {
      const metas = document.getElementsByTagName("meta");
      for (let i = 0; i < metas.length; i++) {
        if (metas[i].getAttribute("property") === "og:image") {
          return metas[i].getAttribute("content");
        }
      }
    });

    console.log({ websiteContent, websiteOgImage });
    await browser.close();
  })();
});
Add a function within the React app that sends the URL to the api/url endpoint.
const handleSubmit = (e) => {
  e.preventDefault();
  setLoading(true);
  setURL("");
  //👇🏻 Calls the function.
  sendURL();
};

async function sendURL() {
  try {
    const request = await fetch("http://localhost:4000/api/url", {
      method: "POST",
      body: JSON.stringify({
        url,
      }),
      headers: {
        Accept: "application/json",
        "Content-Type": "application/json",
      },
    });
    const data = await request.json();
    //👇🏻 toggles the loading state if the request is successful
    if (data.message) {
      setLoading(false);
    }
  } catch (err) {
    console.error(err);
  }
}
From the code snippet above, we added a loading state that describes the state of the API request.
const [loading, setLoading] = useState(false);
Create a Loading component that is shown to the users when the request is pending.
import React from "react";

const Loading = () => {
  return (
    <div className='loading'>
      <h1>Loading, please wait...</h1>
    </div>
  );
};

export default Loading;
Display the Loading component whenever the content is yet to be available.
import Loading from "./Loading";
//👇🏻 Add this code within the App.js component

if (loading) {
  return <Loading />;
}
Congratulations! You’ve learnt how to scrape content from websites using Puppeteer. In the upcoming section, you’ll learn how to communicate with ChatGPT in Node.js by generating websites’ descriptions and brand names.
How to communicate with ChatGPT in Node.js
ChatGPT is not yet available as a public API. Therefore, to use it, we have to scrape our way in – meaning we’ll perform a full browser automation that logs in to the OpenAI website, solves the captcha, and sends an API request with the OpenAI cookies.
Fortunately, a public library that does this is available and has been installed as part of the project requirement.
Import the ChatGPT API library and create a function that sends a request to ChatGPT.
import { ChatGPTAPIBrowser } from "chatgpt";

async function chatgptFunction(content) {
  // use puppeteer to bypass cloudflare (headful because of captchas)
  const api = new ChatGPTAPIBrowser({
    email: "<CHATGPT_EMAIL_ADDRESS>",
    password: "<CHATGPT_PASSWORD>",
  });
  await api.initSession();
  //👇🏻 Extracts the brand name from the website content
  const getBrandName = await api.sendMessage(
    `I have a raw text of a website, what is the brand name in a single word? ${content}`
  );
  //👇🏻 Extracts the brand description from the website content
  const getBrandDescription = await api.sendMessage(
    `I have a raw text of a website, can you extract the description of the website from the raw text. I need only the description and nothing else. ${content}`
  );
  //👇🏻 Returns the response from ChatGPT
  return {
    brandName: getBrandName.response,
    brandDescription: getBrandDescription.response,
  };
}
ChatGPT is super intelligent, and it will answer any question we ask it. So basically, we will ask it to write us the brand name and the description based on the complete website content.
The brand name can usually be found in the “og:site_name” meta tag, but to show you how cool it is, we will let ChatGPT extract it. As for the description, it’s pretty crazy: it will tell us what the site is about and summarize everything!
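If you ever want a non-AI fallback for the brand name, a small sketch like the one below (reusing the page object from the scraping code above) could read that meta tag directly; in this tutorial we let ChatGPT handle it instead.
//👇🏻 Optional fallback sketch: read og:site_name directly, mirroring the og:image extraction above
const websiteSiteName = await page.evaluate(() => {
  const meta = document.querySelector('meta[property="og:site_name"]');
  return meta ? meta.getAttribute("content") : null;
});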
Next, update the api/url route as shown below:
//👇🏻 holds all the ChatGPT results
const database = [];
//👇🏻 generates a random string as ID
const generateID = () => Math.random().toString(36).substring(2, 10);

app.post("/api/url", (req, res) => {
  const { url } = req.body;

  (async () => {
    const browser = await puppeteer.launch();
    const page = await browser.newPage();
    await page.goto(url);
    const websiteContent = await page.evaluate(() => {
      return document.documentElement.innerText.trim();
    });
    const websiteOgImage = await page.evaluate(() => {
      const metas = document.getElementsByTagName("meta");
      for (let i = 0; i < metas.length; i++) {
        if (metas[i].getAttribute("property") === "og:image") {
          return metas[i].getAttribute("content");
        }
      }
    });
    //👇🏻 closes the browser once the scraping is done
    await browser.close();
    //👇🏻 accepts the website content as a parameter
    let result = await chatgptFunction(websiteContent);
    //👇🏻 adds the brand image and ID to the result
    result.brandImage = websiteOgImage;
    result.id = generateID();
    //👇🏻 adds the result to the array
    database.push(result);
    //👇🏻 returns the results
    return res.json({
      message: "Request successful!",
      database,
    });
  })();
});
To display the response within the React application, create a state that holds the server’s response.
const [websiteContent, setWebsiteContent] = useState([]);

async function sendURL() {
  try {
    const request = await fetch("http://localhost:4000/api/url", {
      method: "POST",
      body: JSON.stringify({
        url,
      }),
      headers: {
        Accept: "application/json",
        "Content-Type": "application/json",
      },
    });
    const data = await request.json();
    if (data.message) {
      setLoading(false);
      //👇🏻 update the state with the server response
      setWebsiteContent(data.database);
    }
  } catch (err) {
    console.error(err);
  }
}
Lastly, update the App.js layout to display the server’s response to the user.
const App = () => {
  //...other code statements

  //👇🏻 remove the quotation marks around the description
  const trimDescription = (content) =>
    content.match(/(?:"[^"]*"|^[^"]*$)/)[0].replace(/"/g, "");

  return (
    <div className='home'>
      <form className='home__form'>
        <h2>Website Aggregator</h2>
        <label htmlFor='url'>Provide the website URL</label>
        <input
          type='url'
          name='url'
          id='url'
          value={url}
          onChange={(e) => setURL(e.target.value)}
        />
        <button onClick={handleSubmit}>ADD WEBSITE</button>
      </form>
      <main className='website__container'>
        {websiteContent.map((item) => (
          <div className='website__item' key={item.id}>
            <img src={item?.brandImage} alt={item?.brandName} />
            <h3>{item?.brandName}</h3>
            <p>{trimDescription(item?.brandDescription)}</p>
          </div>
        ))}
      </main>
    </div>
  );
};
Congratulations!🎉 You’ve completed the project for this tutorial.
Here is a sample of the result produced by the application:
Conclusion
So far, we have covered:
- what ChatGPT is,
- how to scrape website content using Puppeteer, and
- how to communicate with ChatGPT in a Node.js application
This tutorial walks you through an example of an application you can build using Puppeteer and ChatGPT. ChatGPT can be seen as the ultimate personal assistant, very useful in various fields to enable us to work smarter and better.
The source code for this tutorial is available here:
https://github.com/novuhq/blog/tree/main/website-aggregator-with-chatgpt-react
Thank you for reading!
Help me out!
If you feel like this article helped you build your own website aggregator with ChatGPT, I would be super happy if you could give us a star! And let me also know in the comments ❤️
https://github.com/novuhq/novu