Integrate ChatGPT with Meta (Facebook) Messenger via Messages API

Vonage Dev
10 min read · Mar 6, 2024

This article was written by Beejay Urzo in collaboration with Benjamin Aronov

Introduction

Conversational AI has become pivotal in enhancing user experiences across various platforms. One of the most popular channels for engaging with users is Meta Messenger, previously known as Facebook Messenger. With billions of active users, it presents a unique opportunity for developers to create interactive and intelligent chatbots that can streamline customer support, answer inquiries, and deliver personalized experiences.

In this post, we will harness the potential of ChatGPT, a state-of-the-art language model developed by OpenAI, and seamlessly integrate it with Meta Messenger using the Messages API.

tl;dr You can also find the code and the fully working app shown in this tutorial on GitHub.

Prerequisites

  • An OpenAI Developer Account — Follow the instructions at OpenAI Sign Up
  • Node.js version 18 or higher — Follow the instructions at Node.js
  • Vonage Developer Account — Sign up for free and get a Vonage API account

Set Up

How to Create a Vonage Application

  1. Click on Applications
  2. Click “Create a new application”
  3. Add a friendly name to your application
  4. Under the Capabilities tab, toggle on Messages
  5. Add placeholder URLs for the two fields: Inbound URL and Status URL
  6. While here, let’s also click on “Generate public and private key”. This will automatically download a file called “private.key”. Remember this file, we will need it later.
  7. Press “Generate new application”
  8. You will be taken back to the Application Detail view. Note the Application ID and API Key. We will need these later.

How to Create an OpenAI API Key

  1. Log in to OpenAI and navigate to your API Keys
  2. Click your Account name on the upper right-hand side, select “View API keys”
  3. Click “Create new secret key”
  4. Name your key and click “Create Secret Key”
  5. A message box with your Secret key will be displayed. This key is important: save it, as it will only be shown once. We will need it later in the tutorial.

How to Link Your Facebook Page

  1. In the Vonage Developer Dashboard, click External Accounts
  2. Click on Connect Facebook Pages
  3. Log in to your Facebook account
  4. Once logged in, select the Facebook page you want to connect.
     • This is the page from which messages will be sent.
     • If your Facebook Business page isn’t showing, make sure that you’ve enabled the Vonage API Platform under Business Integrations on the business tools page.
  5. Click on “Complete Setup”
  6. Click on Applications and navigate back to the Vonage application you just created
  7. Click on the Link Social Channels tab
  8. Click “Link” on the Facebook page you want to link to your application

How to Install Dependencies and Add Express Boilerplate

Dependencies

Our project uses just a couple of packages. The openai library gives us our generative AI capabilities. The Vonage Node SDK allows us to access the Messages API and connect with Facebook. And finally, ngrok allows us to create publicly accessible endpoints. Learn more about ngrok and tunnels.

  1. Open your terminal and create a new directory for our project: mkdir chatgpt-messenger-project
  2. Initialize our project: npm init es6
  3. Install Node dependencies: npm install express openai @vonage/server-sdk

Express Boilerplate

The code below is our initial boilerplate. We also add the endpoints we need for our Vonage Application.

  1. Create the file app.js in your project: touch app.js
  2. Add the following code to your app.js:
import express from 'express'

const PORT = 3003
const app = express();

app.use(express.json());
app.use(express.urlencoded({ extended: false }));

// Health-check endpoint
app.get('/', (req, res) => {
  res.status(200).end();
});

// Message status updates from Vonage will arrive here
app.post('/webhooks/callback', async (req, res) => {
  res.status(200).end();
});

// Inbound Messenger messages will arrive here
app.post('/webhooks/inbound-messaging', async (req, res) => {
  res.status(200).end();
});

app.listen(PORT, async () => {
  console.log(`Starting server at port: ${PORT}`)
});

We can now run our project and create our tunnel:

Run node app.js in your main terminal tab. And run ngrok http 3003 in a second tab.

ngrok will print its status output, including a forwarding URL.

We can now use the ngrok tunnel (the forwarding URL that looks like https://SOME_NUMBER.ngrok.io) to create our endpoints. We have two endpoints defined in our Vonage application:

/webhooks/inbound-messaging and /webhooks/callback

Using our ngrok tunnel, we now have these two publicly accessible endpoints:

  1. https://SOME_NUMBER.ngrok.io/webhooks/inbound-messaging
  2. https://SOME_NUMBER.ngrok.io/webhooks/callback

Any further instructions about the terminal refer to the main terminal tab; leave the ngrok tunnel running throughout the tutorial.

Connecting Our Vonage Application via Webhook Endpoints

  1. Navigate back to your application in the Vonage Dashboard, under Applications
  2. Click on Edit
  3. Place your /webhooks/inbound-messaging URL in the Inbound URL field and your /webhooks/callback URL in the Status URL field.
  4. Save changes

How to Receive a Message in Messenger

Now that we have set our webhooks, messages sent to our linked Facebook Page should flow through our app. Let’s add the following code to our Express server.

What we are doing here is simply taking the message we receive through the Facebook Page and printing it out on the console.

app.post('/webhooks/inbound-messaging', async (req, res) => {
  // Acknowledge the webhook immediately
  res.status(200).end();
  const messenger_to = req.body.to
  const messenger_from = req.body.from
  const received_text = req.body.text
  console.log("Received message: ", received_text, "from: ", messenger_from)
});

Let’s test our code. Restart your app by running node app.js. Then let’s try to send a message to the page. How about we say “Hi”?

In your ngrok terminal tab you should now see the request received, similar to:

HTTP Requests                                                                                      
-------------

POST /webhooks/inbound-messaging 200 OK

Now if you check your main terminal tab, you should see:

Received message:  hi from:  1134580194842354942
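For reference, here is roughly the shape of the webhook body our handler reads. The IDs below are illustrative placeholders, and real Vonage inbound payloads carry additional fields beyond these three:

```javascript
// Sketch of an inbound Messenger webhook body (illustrative values;
// only the three fields our handler actually reads are shown).
const body = {
  to: '110224334455667',        // our page's ID (placeholder)
  from: '1134580194842354942',  // the sender's page-scoped ID
  text: 'hi'
};

// These are the same fields our handler pulls off req.body:
const { to: messenger_to, from: messenger_from, text: received_text } = body;
console.log("Received message: ", received_text, "from: ", messenger_from);
```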

How to Add Our Environment Variables and Initialize Our Libraries

In this step, let’s get all the required keys and start to use our dependency libraries OpenAI and Vonage.

Let’s add the following code to our application:

import { Vonage } from '@vonage/server-sdk'
import OpenAI from "openai";

const API_KEY = ''
const APPLICATION_ID = ''
const PRIVATE_KEY = './private.key'
const OPENAI_API_KEY = ''

const vonage = new Vonage({
  apiKey: API_KEY,
  applicationId: APPLICATION_ID,
  privateKey: PRIVATE_KEY
})

const openai = new OpenAI({
  apiKey: OPENAI_API_KEY
});

In this step, we import the OpenAI and @vonage/server-sdk libraries. We also set some configuration variables that we need to initialize our Vonage and OpenAI instances.

  • API_KEY is our Vonage API KEY
  • APPLICATION_ID is our Vonage Application ID
  • PRIVATE_KEY is the path to our private.key we downloaded before
  • You can create a new file called private.key in your project directory, open the previously downloaded private.key in a text editor, and copy its contents into the new file. Or simply copy the downloaded private.key into your project directory.
  • OPENAI_API_KEY is our OpenAI API Key
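As an aside, if you would rather not hardcode secrets in app.js, the same constants can be read from the environment instead. This is an optional variation on the code above (the environment variable names here are our own choice):

```javascript
// Optional: read the secrets from environment variables instead of
// hardcoding them (e.g. start with `OPENAI_API_KEY=... node app.js`).
const API_KEY = process.env.VONAGE_API_KEY ?? '';
const APPLICATION_ID = process.env.VONAGE_APPLICATION_ID ?? '';
const PRIVATE_KEY = process.env.VONAGE_PRIVATE_KEY_PATH ?? './private.key';
const OPENAI_API_KEY = process.env.OPENAI_API_KEY ?? '';
```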

How to Query ChatGPT

Now that we have initialized Vonage and ChatGPT instances, it’s getting more exciting. We can now send our messages to ChatGPT.

Let’s update our /inbound-messaging endpoint and write the sendToChatGPT function. We’ll add the following code:

async function sendToChatGPT(user_message) {
  const chatCompletion = await openai.chat.completions.create({
    messages: [{ role: 'user', content: user_message }],
    model: 'gpt-3.5-turbo',
  });
  console.log("ChatGPT Response", chatCompletion.choices[0].message.content);
}

app.post('/webhooks/inbound-messaging', async (req, res) => {
  // Acknowledge the webhook immediately
  res.status(200).end();
  const messenger_to = req.body.to
  const messenger_from = req.body.from
  const received_text = req.body.text
  sendToChatGPT(received_text);
});

This small bit of code sends the message we receive to ChatGPT.

In this step, we use our ChatGPT openai instance and the openai.chat.completions.create method to send the message to the API. We pass along two pieces of additional information: the messages array which acts as context for the conversation and model which tells openai which AI model to use.

For those of you eagle-eyed people, you can see that we got back an object named chatCompletion. Inside chatCompletion we can retrieve the AI response to our text, inside the choices array.
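To make that concrete, here is a trimmed-down sketch of what a chat completion response looks like. Only the fields we use are shown; real responses also include id, usage, and more:

```javascript
// Trimmed sketch of a chat completion response object.
const chatCompletion = {
  choices: [
    {
      index: 0,
      message: { role: 'assistant', content: 'Époisses de Bourgogne.' },
      finish_reason: 'stop'
    }
  ]
};

// The reply text lives in the first element of the choices array:
const reply = chatCompletion.choices[0].message.content;
console.log(reply); // Époisses de Bourgogne.
```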

Let’s test our code again. Restart your app by running node app.js

You can now send a message to the page. For example, what is the smelliest cheese?

You should see an output in your terminal similar to this:

Received message:  what is the smelliest cheese? from:  2345801948423549411
ChatGPT Response One of the smelliest cheeses is Époisses de Bourgogne, a French cheese known for its pungent aroma.

That’s so cool!

How to Send ChatGPT Responses as a Reply in Messenger

In this step, we will reply to a user message with the ChatGPT response. We will import the MessengerText class from the available message classes in Vonage.

First, we need to import the MessengerText class:

import {MessengerText} from '@vonage/messages'

Then we can use it right after our previous code:

vonage.messages.send(new MessengerText({
  to: messenger_from,
  from: messenger_to,
  text: chatGPTResponse // the reply text we get back from ChatGPT
}));

In our Inbound Messages callback, we were getting the message from the request body (req.body.text). The request body also contains to and from.

To send back a message we will instantiate a MessengerText class and set the parameters as such:

  • from: req.body.to (us, the receiver)
  • to: req.body.from (the user who sent the message)
  • text: the message we get back from ChatGPT

But if we try to run the code as is, we’ll soon find a problem 😟. Facebook Messenger limits us to a maximum message length of 640 characters.

As traditional programmers we might try to split our messages or look for a maximum-length configuration in the API. But we need to think like generative AI engineers! We just need to tell our GPT bot to keep it short!
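That said, a prompt is a suggestion, not a guarantee: the model can still overshoot 640 characters. If you want a defensive fallback on top of the prompt, a small clamp helper works. This helper is our own addition, not part of the Vonage or OpenAI APIs:

```javascript
// Messenger rejects texts over 640 characters, so clamp as a last resort.
const MESSENGER_MAX_LENGTH = 640;

function clampForMessenger(text) {
  if (text.length <= MESSENGER_MAX_LENGTH) return text;
  // Truncate and add an ellipsis so the cut is visible to the user.
  return text.slice(0, MESSENGER_MAX_LENGTH - 1) + '…';
}
```

You could run the ChatGPT reply through clampForMessenger before handing it to MessengerText.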

So let’s rename our sendToChatGPT to queryChatGPT, which now returns the answer from ChatGPT so we can use it in our MessengerText call to Vonage.

Our app.js code now looks like:

async function queryChatGPT(user_message) {
  try {
    const chatCompletion = await openai.chat.completions.create({
      messages: [
        { role: 'system', content: 'You are a helpful assistant. Your response should be less than or equal to 640 characters.' },
        { role: 'user', content: user_message }
      ],
      model: 'gpt-3.5-turbo',
    });
    return chatCompletion.choices[0].message.content;
  } catch (error) {
    console.error("Error querying ChatGPT:", error);
    throw error; // Propagate the error to the caller
  }
}

app.post('/webhooks/inbound-messaging', async (req, res) => {
  // Acknowledge the webhook immediately
  res.status(200).end();
  const messenger_to = req.body.to
  const messenger_from = req.body.from
  const received_text = req.body.text
  queryChatGPT(received_text)
    .then(result => {
      vonage.messages.send(new MessengerText({
        to: messenger_from,
        from: messenger_to,
        text: result
      }));
    })
    .catch(error => {
      console.error("Error:", error);
    });
})

Let’s test again. You know the drill, restart your app by running:

node app.js

Let’s send a message to the page: what is the smelliest cheese?

No longer do we see the output in our terminal; instead, it’s fed directly to the user in Messenger!

So this is pretty good: our app can now pass as a chatbot. But the problem now is that we are not tracking conversations. So if your user wants to ask, What wine can I pair it with?, it will just give a noncontextual answer.

You can’t have a proper conversation without context, so let’s build that!

How to Make Our Bot Context Aware

You saw in the last step the power of adding context to our openai requests. So if we want to give it context of an ongoing conversation, I'm sure you can guess what we need to do, right? Just add our conversation to the messages array. You're right!

So let's move our messages array out into a variable called converstationContext:

const converstationContext = [
  { role: 'system', content: 'You are a helpful assistant. Your response should be less than or equal to 640 characters.' },
]
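To see why this works, here is a standalone sketch of how such an array grows over a two-turn conversation (contents shortened for readability):

```javascript
// Standalone sketch: the context array after two user turns.
const context = [
  { role: 'system', content: 'You are a helpful assistant.' }
];

context.push({ role: 'user', content: 'what is the smelliest cheese?' });
context.push({ role: 'assistant', content: 'Époisses de Bourgogne.' });
context.push({ role: 'user', content: 'What wine can I pair it with?' });

// The next API call receives all four messages, so the model can
// resolve "it" to the cheese from the earlier turn.
console.log(context.length); // 4
```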

Now that our context lives in this variable, let’s update our queryChatGPT function to use it:

async function queryChatGPT() {
  try {
    const chatCompletion = await openai.chat.completions.create({
      messages: converstationContext,
      model: 'gpt-3.5-turbo',
    });
    return chatCompletion.choices[0].message.content;
  } catch (error) {
    console.error("Error querying ChatGPT:", error);
    throw error; // Propagate the error to the caller
  }
}

And now inside our /webhooks/inbound-messaging endpoint we'll add each new interaction into this context array:

app.post('/webhooks/inbound-messaging', async (req, res) => {
  // Acknowledge the webhook immediately
  res.status(200).end();
  const messenger_to = req.body.to
  const messenger_from = req.body.from
  const received_text = req.body.text
  converstationContext.push({ role: 'user', content: received_text });
  queryChatGPT()
    .then(result => {
      converstationContext.push({ role: 'assistant', content: result });
      vonage.messages.send(new MessengerText({
        to: messenger_from,
        from: messenger_to,
        text: result
      }));
    })
    .catch(error => {
      console.error("Error:", error);
      vonage.messages.send(new MessengerText({
        to: messenger_from,
        from: messenger_to,
        text: "Sorry, there was an error."
      }));
    });
});

Let’s test our app one last time. Restart your app by running node app.js

You can now send a message to the page. For example, what is the smelliest cheese?

You should see the previous responses in both your terminal and Messenger. But now, we can continue our conversation! If we ask What wine can you pair it with? it can give us a contextual answer:
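One caveat before we wrap up: the context array grows on every turn, so it will eventually exceed the model's token limit (and in this demo every user shares one global conversation). A simple mitigation, sketched here with a hypothetical helper of our own, is to keep the system prompt plus only the most recent messages:

```javascript
// Hypothetical helper: keep the system prompt plus the last N messages.
const MAX_HISTORY = 20;

function trimContext(context) {
  const [systemPrompt, ...history] = context;
  return [systemPrompt, ...history.slice(-MAX_HISTORY)];
}
```

Calling trimContext on converstationContext before each API call would bound the request size.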

Conclusion

Whoa, wasn’t that a fun app to build? How will you use your new generative AI superpowers? Maybe you can now dig deeper into the OpenAI API and fine-tune your chatbot to best serve your customers. Or you can connect your bot to more channels with the Messages API like SMS, MMS, WhatsApp, or Viber. You can even connect your app to the Vonage Voice API so your customers can talk to an audio agent!

Whatever you decide to do next, we want to know about it! Join us on the Vonage Developer Community Slack or message us on X, formerly known as Twitter.

