

Building the input prompt and education dropdown
LlamaTutor’s core interaction is a text field where the user can enter a topic, and a dropdown that lets the user choose which education level the material should be taught at:
We'll render an <input> and a <select>, and control both using some new React state.
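Here's a minimal sketch of how that might look; the state names (topic, educationLevel), option labels, and placeholder text below are my guesses, not necessarily the app's exact markup:

```jsx
"use client";

import { useState } from "react";

export default function Page() {
  // Controlled state for the topic input and education level dropdown
  const [topic, setTopic] = useState("");
  const [educationLevel, setEducationLevel] = useState("Elementary School");

  async function handleSubmit(e) {
    e.preventDefault();
    // We'll fill this in below
  }

  return (
    <form onSubmit={handleSubmit}>
      <input
        value={topic}
        onChange={(e) => setTopic(e.target.value)}
        placeholder="Teach me about..."
      />

      <select
        value={educationLevel}
        onChange={(e) => setEducationLevel(e.target.value)}
      >
        <option>Elementary School</option>
        <option>Middle School</option>
        <option>High School</option>
        <option>College</option>
      </select>

      <button>Teach me</button>
    </form>
  );
}
```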
When the user submits the form, we want our app to:
- Use the Bing API to fetch six different websites related to the topic
- Parse the text from each website
- Pass all the parsed text, as well as the education level, to Together AI to kick off the tutoring session
Let's start with the first step. In our form's submit handler, we'll make a POST request to a new /getSources endpoint.
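A sketch of what that could look like — the request body shape ({ question }) and the sources state variable are my assumptions:

```jsx
async function handleSubmit(e) {
  e.preventDefault();

  // Ask our new API route for web sources related to the topic
  const response = await fetch("/api/getSources", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ question: topic }),
  });

  const sources = await response.json();
  setSources(sources);
}
```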
Time to implement /getSources.

Getting web sources with Bing
To create our API route, we'll make a new app/api/getSources/route.js file.
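Here's a sketch of the handler, calling Bing's Web Search API directly with fetch. The URL, header, and response shape follow Bing's v7 /search endpoint; error handling is omitted to keep the sketch short:

```js
// app/api/getSources/route.js
export async function POST(request) {
  const { question } = await request.json();

  const params = new URLSearchParams({
    q: question,
    count: "6", // we want six sources
  });

  const response = await fetch(
    `https://api.bing.microsoft.com/v7.0/search?${params}`,
    {
      headers: {
        "Ocp-Apim-Subscription-Key": process.env.BING_API_KEY,
      },
    }
  );

  const { webPages } = await response.json();

  // Return just the name and URL of each result
  const sources = webPages.value.map((result) => ({
    name: result.name,
    url: result.url,
  }));

  return Response.json(sources);
}
```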
The Bing request needs an API key, which we can keep in .env.local.
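For example, with the variable name matching whatever the route reads from process.env (BING_API_KEY in the sketch above):

```
BING_API_KEY=your-bing-search-api-key
```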


Fetching the content from each source
Let’s make a request to a second endpoint called/api/getParsedSources, passing along the sources in the request body:
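Continuing the handleSubmit sketch from before:

```jsx
// Still inside handleSubmit, after fetching the sources
const parsedResponse = await fetch("/api/getParsedSources", {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify({ sources }),
});

const parsedSources = await parsedResponse.json();
```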
Next, we'll create app/api/getParsedSources/route.js for our new route.
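To start, the handler can just pull the sources out of the request body. A bare-bones skeleton might look like this:

```js
// app/api/getParsedSources/route.js
export async function POST(request) {
  const { sources } = await request.json();

  // TODO: fetch each source's page and extract its readable text

  return Response.json(sources);
}
```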
Inside, let's stub out a getTextFromURL function and outline our general approach.
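Something along these lines, with the steps captured as comments:

```js
async function getTextFromURL(url) {
  // 1. Fetch the raw HTML for the page
  // 2. Parse the HTML into a document we can query on the server
  // 3. Extract just the readable article text
  // 4. Return that text so we can feed it to the LLM
}
```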
To parse the text out of each page's HTML, we'll install the jsdom and @mozilla/readability libraries:
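```
npm install jsdom @mozilla/readability
```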
Now we're ready to implement getTextFromURL.
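Here's a sketch of the implementation: jsdom turns the fetched HTML into a DOM, and Readability strips away navigation, ads, and markup to leave the main article text:

```js
import { JSDOM } from "jsdom";
import { Readability } from "@mozilla/readability";

async function getTextFromURL(url) {
  // 1. Fetch the raw HTML for the page
  const response = await fetch(url);
  const html = await response.text();

  // 2. Parse the HTML into a DOM we can query on the server
  const dom = new JSDOM(html, { url });

  // 3. Let Readability pull out the main article content
  const article = new Readability(dom.window.document).parse();

  // 4. Return just the plain text (or an empty string if parsing failed)
  return article?.textContent ?? "";
}
```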

Since each page can be fetched independently, we'll use Promise.all to kick off our functions in parallel.
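Wiring that into the route handler (replacing our earlier placeholder) might look like this:

```js
export async function POST(request) {
  const { sources } = await request.json();

  // Start every fetch at once instead of awaiting them one at a time
  const texts = await Promise.all(
    sources.map((source) => getTextFromURL(source.url))
  );

  // Pair each source back up with its parsed text
  const parsedSources = sources.map((source, i) => ({
    ...source,
    text: texts[i],
  }));

  return Response.json(parsedSources);
}
```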

Using the sources for the chatbot’s initial messages
Back in our React app, we now have the text from each source in our submit handler.
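One way to finish the handler is to fold the parsed text and the education level into a system prompt, seed our messages state with it, and POST the conversation to a /api/chat endpoint. The prompt wording below is just a sketch of mine:

```jsx
// Still inside handleSubmit, after parsedSources comes back
const systemPrompt = `You are a friendly tutor. Teach the user about the topic
they ask about at a ${educationLevel} level, using only these sources:

${parsedSources.map((source) => source.text).join("\n\n")}`;

const initialMessages = [
  { role: "system", content: systemPrompt },
  { role: "user", content: topic },
];
setMessages(initialMessages);

const chatResponse = await fetch("/api/chat", {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify({ messages: initialMessages }),
});
```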
All that's left is to build /chat!
Implementing the chatbot endpoint with Together AI’s SDK
Let’s install Together AI’s node SDK:JSX
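Then we can create the route itself. Here's a sketch of app/api/chat/route.js; it assumes your API key lives in a TOGETHER_API_KEY environment variable (which the SDK reads by default), and the model name is just a placeholder for whichever Llama model you want to use:

```js
// app/api/chat/route.js
import Together from "together-ai";

const together = new Together();

export async function POST(request) {
  const { messages } = await request.json();

  // Forward the conversation to Together AI and stream the reply back
  const stream = await together.chat.completions.create({
    model: "meta-llama/Meta-Llama-3.1-8B-Instruct-Turbo",
    messages,
    stream: true,
  });

  return new Response(stream.toReadableStream());
}
```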
Since our frontend already sends messages in the shape that the chat.completions.create method expects, our API handler is mostly acting as a simple passthrough.
We’re also using the stream: true option so our frontend will be able to show partial updates as soon as the LLM starts its response.
We’re read to display our chatbot’s first message in our React app!
Displaying the chatbot’s response in the UI
Back in our page, we'll use the ChatCompletionStream helper from Together's SDK to update our messages state as our API endpoint streams in text.
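A sketch of how that could look at the end of handleSubmit, with the helper imported at the top of our page file:

```jsx
import { ChatCompletionStream } from "together-ai/lib/ChatCompletionStream";

// ...at the end of handleSubmit, once /api/chat has responded
ChatCompletionStream.fromReadableStream(chatResponse.body).on(
  "content",
  (delta) => {
    setMessages((messages) => {
      const lastMessage = messages[messages.length - 1];

      if (lastMessage?.role !== "assistant") {
        // First chunk: push a new assistant message
        return [...messages, { role: "assistant", content: delta }];
      }

      // Later chunks: append the new text to the assistant's message
      return [
        ...messages.slice(0, -1),
        { ...lastMessage, content: lastMessage.content + delta },
      ];
    });
  }
);
```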
We check the last message's role to determine whether to append the streamed text to it, or push a new object with the assistant's initial text.
Now that our messages React state is ready, let's update our UI to display it.
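Rendering can be as simple as mapping over every non-system message; the markup below is just a sketch:

```jsx
<div>
  {messages
    .filter((message) => message.role !== "system")
    .map((message, i) => (
      <div key={i}>
        <p>{message.role === "user" ? "You" : "Tutor"}</p>
        <p>{message.content}</p>
      </div>
    ))}
</div>
```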
As soon as our /chat endpoint responds with its first chunk, we'll see the answer text start streaming into our UI!

Letting the user ask follow-up questions
To let the user ask our tutor follow-up questions, let's make a new form that only shows up once we have some messages in our React state.
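For example, something along these lines, assuming a new newMessage state variable for the controlled input:

```jsx
{messages.length > 0 && (
  <form onSubmit={handleMessage}>
    <input
      value={newMessage}
      onChange={(e) => setNewMessage(e.target.value)}
      placeholder="Ask a follow-up question..."
    />
    <button>Send</button>
  </form>
)}
```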
The form's submit handler calls a new function, handleMessage, that will look a lot like the end of our first handleSubmit function.
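Here's a sketch of handleMessage, reusing the streaming logic from the earlier sketches:

```jsx
async function handleMessage(e) {
  e.preventDefault();

  // Add the user's follow-up question to the conversation
  const updatedMessages = [...messages, { role: "user", content: newMessage }];
  setMessages(updatedMessages);
  setNewMessage("");

  // Send the whole conversation back to /api/chat
  const response = await fetch("/api/chat", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ messages: updatedMessages }),
  });

  // Stream the assistant's reply into state, just like before
  ChatCompletionStream.fromReadableStream(response.body).on(
    "content",
    (delta) => {
      setMessages((messages) => {
        const lastMessage = messages[messages.length - 1];

        if (lastMessage?.role !== "assistant") {
          return [...messages, { role: "assistant", content: delta }];
        }

        return [
          ...messages.slice(0, -1),
          { ...lastMessage, content: lastMessage.content + delta },
        ];
      });
    }
  );
}
```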
It sends the updated conversation to our /chat endpoint and reuses the same logic to update our app's state as the latest response streams in.
The core features of our app are working great!