From the course: Create Your Own Code Assistant with Llama 2, Node.js, and React.js

Integrating the extension, API, and Llama

- [Professor] It's time to expose our endpoints for chat and ingest. We're here in our code space, in our backend folder. We're going to go over to Explorer, backend, source. We're going to go down to our App Service, and here we're going to add two methods, one called ingest. Our ingest function is going to call our data processing service and ingest our current working directory. So let's add our data processing service the same way we do with all services here. We're going to add a constructor that will take in a private data processing service, which will be of type DataProcessingService. Then we need to give our method a body. And while we'll default right now to the current working directory, we really could make this any directory. When we send our ingest command, we could tell it where we want to do this. But for simplicity's sake, right now we're going to stay with process.cwd. The next method we need to add is also going to be async, and this is going to be our chat method. It's…
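The service described above might be sketched like this in plain TypeScript. This is a hypothetical outline, not the course's exact code: the `DataProcessingService` stub, its `ingest` return value, and the `chat` placeholder body are all assumptions, since the transcript cuts off before the chat method is written. Only the constructor-injection pattern and the `process.cwd()` default come from the lesson.

```typescript
// Hypothetical stub standing in for the real data processing service.
// In the actual course it would walk the directory and index its files.
class DataProcessingService {
  async ingest(directory: string): Promise<string> {
    // Record which directory was requested (placeholder behavior).
    return `ingested: ${directory}`;
  }
}

class AppService {
  // Constructor injection of the data processing service,
  // the same way the other services in the course are wired up.
  constructor(private dataProcessingService: DataProcessingService) {}

  // Defaults to the current working directory, but any path could be
  // passed in when the ingest command is sent.
  async ingest(directory: string = process.cwd()): Promise<string> {
    return this.dataProcessingService.ingest(directory);
  }

  // Also async; the transcript ends before this body is shown,
  // so this echo is purely a placeholder.
  async chat(prompt: string): Promise<string> {
    return `echo: ${prompt}`;
  }
}

async function main(): Promise<void> {
  const service = new AppService(new DataProcessingService());
  console.log(await service.ingest("/workspace/backend"));
  console.log(await service.chat("hello"));
}

void main();
```

A controller layer (not shown) would then expose these two methods as the chat and ingest HTTP endpoints.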
