[Livestream] Building an AI-Powered App for Microsoft Teams
Microsoft Teams is a popular collaboration platform used by many organizations. In this article, we will explore the process of building an AI-powered app for Microsoft Teams. The app we will create is a chatbot that can answer questions about your organization using external data sources, such as code repositories, PDF files, Google Drive, and more. To accomplish this, we will leverage the OpenAI API, LangChain, and a Pinecone vector store to search through the data sources and provide relevant answers.
Getting Started with the Teams AI Library
To start building our app, we will use the Teams AI library built by Microsoft. This library provides a working sample that we can use as a starting point for our app development. The library includes a chatbot called “Chef bot” powered by OpenAI. This chatbot is designed to explain things in a chef’s style, using cooking analogies and examples. By using this template, we can quickly bootstrap our project and customize it according to our requirements.
Creating a New App with Teams Toolkit
To create the app, we will utilize the Teams Toolkit extension for Visual Studio Code. This extension allows us to develop Teams apps efficiently. After installing the Teams Toolkit extension, we can create a new app or explore the available samples. For our app, we will choose the “Teams Chef bot” sample. By clicking on the sample, we can create a new instance of the app locally.
Setting Up the App and Dependencies
Once the app is created, we need to set up the necessary dependencies and configurations. The app uses various Microsoft libraries, SDKs, and authentication mechanisms provided by Teams. We also need to configure the OpenAI API key to enable the chatbot functionality. The configuration files and code are easy to find and modify: the API key goes in the `.env.local.user` file, and there are additional `.env` files for other environments.
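As a rough illustration, the local secrets file looks something like the following. The exact variable name and folder layout depend on the version of the sample, so treat this as a sketch rather than the definitive configuration:

```
# .env.local.user — local secrets, kept out of source control
# (variable name is illustrative; check the sample's README for the exact key)
SECRET_OPENAI_API_KEY=<your OpenAI API key>
```

The Teams Toolkit reads these files per environment, so the same project can carry separate local, dev, and production settings without code changes.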
Running the App
With the dependencies and configurations in place, we can run the app. By pressing F5 in Visual Studio Code, the app will run in debug mode using the Edge browser. The app will prompt us to log in with our Microsoft account and grant necessary permissions. Once logged in, the app will open in Teams and display the description of the “Chef bot” app. We can add the local bot to Teams or chat conversations and interact with it.
Customizing the Chatbot
The chatbot’s behavior and responses can be customized by modifying the code. The app initialization includes the AI configuration, the prompt manager, and the prompt and chat settings. We can change the prompt template to add more context or to modify the way the bot responds. Additionally, we can inject dynamic text into the prompt template using state or registered functions, which gives us a clean way to feed context into the template. That context will typically be something you retrieve by querying your vector store or any other data source you want the chatbot to use.
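To make the idea of injected functions concrete, here is a minimal sketch of a prompt template with registered functions. The `{{name}}` placeholder syntax and the `registerFunction`/`renderPrompt` helpers are illustrative stand-ins, not the Teams AI library's actual API:

```typescript
// A registered function supplies dynamic text at render time.
type PromptFunction = () => string;

const functions = new Map<string, PromptFunction>();

function registerFunction(name: string, fn: PromptFunction): void {
  functions.set(name, fn);
}

// Replace each {{name}} placeholder with the matching function's output;
// unknown placeholders are left untouched.
function renderPrompt(template: string): string {
  return template.replace(/\{\{(\w+)\}\}/g, (match, name) => {
    const fn = functions.get(name);
    return fn ? fn() : match;
  });
}

// Example: inject retrieved context into the system prompt. In the real app,
// this string would come from a vector-store query instead of a constant.
registerFunction("context", () => "Org policy: remote work is allowed on Fridays.");

const finalPrompt = renderPrompt(
  "Answer using only this context:\n{{context}}\nQuestion: {{question}}"
);
```

The key design point is that the template stays static while the registered functions run at render time, so each incoming message can pull fresh context before the prompt is sent to the model.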
Integrating External Data Sources
One of the main objectives of our app is to use external data sources to answer questions about the organization. We can compute embeddings for content from various sources, such as repositories, PDFs, and Google Drive, and store them in a vector database like Pinecone. Pinecone then lets us run a semantic search to gather contextual data for the LLM. Combining that context with OpenAI’s LLM, we can produce precise, natural-language answers grounded in the retrieved data.
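The retrieval step boils down to ranking stored vectors by similarity to the query vector. The sketch below uses cosine similarity over tiny hand-made vectors to illustrate the idea; in the real app, the embeddings would come from the OpenAI API and the search would run inside Pinecone rather than in application code:

```typescript
interface Doc {
  text: string;
  embedding: number[];
}

// Cosine similarity: dot product of the vectors divided by their norms.
function cosineSimilarity(a: number[], b: number[]): number {
  let dot = 0, normA = 0, normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}

// Return the k documents most similar to the query embedding.
function search(query: number[], docs: Doc[], k: number): Doc[] {
  return [...docs]
    .sort(
      (x, y) =>
        cosineSimilarity(query, y.embedding) -
        cosineSimilarity(query, x.embedding)
    )
    .slice(0, k);
}

// Toy corpus with made-up 3-dimensional embeddings.
const docs: Doc[] = [
  { text: "vacation policy", embedding: [1, 0, 0] },
  { text: "expense reports", embedding: [0, 1, 0] },
  { text: "holiday schedule", embedding: [0.9, 0.1, 0] },
];

const results = search([1, 0, 0], docs, 2);
```

The top results would then be concatenated into the prompt as context, which is exactly the dynamic text the prompt template's registered functions inject.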
Conclusion
Building an AI-powered app for Microsoft Teams allows organizations to enhance their collaboration experience with a chatbot that answers questions using context from external data sources. By leveraging the OpenAI API, LangChain, and Pinecone, developers can create intelligent bots that offer accurate, context-aware responses. With the flexibility and extensibility of the Teams platform, the possibilities for creating innovative and useful apps are endless.