Building a token refresh service for the Fitbit API with Container App Jobs

A couple of years back, I built a Fitbit API token refresh service using Azure Functions. The purpose of that function was to refresh an access token that I could then use to call the Fitbit API and extract my own data. This was before Azure Container Apps even existed, so Azure Functions was really the only option. WebJobs was an alternative (after all, Functions are essentially built on top of Azure WebJobs), but that wasn’t attractive since this was just a hobby project, and I didn’t want to break the bank. The problem with Azure Functions is that you’re restricted by the programming model. It’s fine if you just need the plumbing done for you for simple integrations, but if you want flexibility, it’s not great. ...
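Under the hood, the refresh itself is just the standard OAuth 2.0 refresh-token grant against Fitbit's token endpoint. As a rough sketch (the client ID, secret and stored refresh token below are placeholders; in the real service they'd come from configuration or Key Vault), the call looks something like this in C#:

```csharp
using System;
using System.Collections.Generic;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Text;
using System.Text.Json;

// Placeholder values - in the real service these would come from app settings or Key Vault.
var clientId = "<fitbit-client-id>";
var clientSecret = "<fitbit-client-secret>";
var currentRefreshToken = "<stored-refresh-token>";

using var http = new HttpClient();

// Fitbit uses the standard OAuth 2.0 refresh token grant with HTTP Basic authentication.
var basic = Convert.ToBase64String(Encoding.UTF8.GetBytes($"{clientId}:{clientSecret}"));
http.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue("Basic", basic);

var body = new FormUrlEncodedContent(new Dictionary<string, string>
{
    ["grant_type"] = "refresh_token",
    ["refresh_token"] = currentRefreshToken
});

var response = await http.PostAsync("https://api.fitbit.com/oauth2/token", body);
response.EnsureSuccessStatusCode();

// The response contains a new access token and a rotated refresh token to persist for the next run.
using var json = JsonDocument.Parse(await response.Content.ReadAsStringAsync());
var accessToken = json.RootElement.GetProperty("access_token").GetString();
var refreshToken = json.RootElement.GetProperty("refresh_token").GetString();
```

Fitbit rotates the refresh token on every call, so the job has to persist the new refresh token each time it runs.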

December 12, 2024 · 12 min · Will Velida
Giving our AI Agents skills using native functions in the Semantic Kernel SDK

In my last post on Semantic Kernel, I talked about how we interact with large language models (LLMs) through plugins that run natural language prompts. We can also create plugins using native code. We can use our code to integrate with LLMs for calculations, data manipulation, or any other functionality that our applications require. In this article, I’ll talk about how we can use native functions in the Semantic Kernel SDK, how we can create native function plugins, and how we can combine native functions with prompts. ...
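As a rough sketch of what that looks like (assuming the Semantic Kernel 1.x C# API; the plugin class, function name and conversion rate are purely illustrative), a native function is just an annotated C# method that the kernel can invoke, and no model call is needed just to run it directly:

```csharp
using System;
using System.ComponentModel;
using Microsoft.SemanticKernel;

var kernel = Kernel.CreateBuilder().Build();

// Register the native plugin with the kernel under its type name.
kernel.ImportPluginFromType<CurrencyConverterPlugin>();

// Invoke the native function directly through the kernel.
var result = await kernel.InvokeAsync(
    nameof(CurrencyConverterPlugin), "ConvertUsdToAud",
    new KernelArguments { ["amountUsd"] = 100d });

Console.WriteLine(result.GetValue<double>());

// Hypothetical plugin: plain C# code exposed to the kernel via attributes.
public class CurrencyConverterPlugin
{
    [KernelFunction, Description("Converts an amount in US dollars to Australian dollars.")]
    public double ConvertUsdToAud(
        [Description("The amount in US dollars")] double amountUsd)
    {
        const double rate = 1.5; // placeholder rate rather than a live lookup
        return amountUsd * rate;
    }
}
```

Once registered, the same function can also be referenced from prompts, which is where combining native code with prompts comes in.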

March 7, 2024 · 5 min · Will Velida
Creating Plugins with the Semantic Kernel SDK and C#

When we use the Semantic Kernel SDK, we use plugins that act as the building blocks for our AI applications. Plugins essentially define the tasks that the kernel should complete; the kernel interfaces with large language models and runs the plugins we define. Plugins can include native code and natural language prompts, allowing us to use generative AI in our applications. Plugins give us the flexibility to define the desired behavior of our application, and we can create custom prompt plugins to fine-tune our applications precisely as we need. ...
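For a feel of the prompt side of that (a minimal sketch assuming the Semantic Kernel 1.x C# API; the deployment name, endpoint and key are placeholders for your own Azure OpenAI resource), a prompt function is just a templated natural language instruction that the kernel can run:

```csharp
using System;
using Microsoft.SemanticKernel;

// Placeholders for your Azure OpenAI deployment, endpoint and key.
var kernel = Kernel.CreateBuilder()
    .AddAzureOpenAIChatCompletion("<deployment>", "https://<resource>.openai.azure.com/", "<api-key>")
    .Build();

// A prompt function: the behaviour is described in natural language, not code.
var summarise = kernel.CreateFunctionFromPrompt(
    "Summarise the following text in one friendly sentence: {{$input}}");

var result = await kernel.InvokeAsync(summarise, new KernelArguments
{
    ["input"] = "Plugins are the building blocks of Semantic Kernel applications..."
});

Console.WriteLine(result.GetValue<string>());
```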

March 5, 2024 · 14 min · Will Velida
Building AI agents with the Semantic Kernel SDK and Azure OpenAI

The Semantic Kernel SDK is an open-source SDK that allows developers to integrate large language models (also known as LLMs) into their applications. With the Semantic Kernel SDK, developers can send prompts to LLMs and work with the results in their own applications. For example, say we’re developing a booking system for a medical clinic. Instead of creating an LLM from scratch, we can use the Semantic Kernel to work with existing LLMs and create an AI agent that can understand the natural language queries that our patients make, provide recommendations based on their queries, and book them into appointments. ...
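As a rough sketch of where that ends up (assuming the Semantic Kernel 1.x C# API and an Azure OpenAI chat deployment; the endpoint, key and persona prompt are placeholders), the agent is essentially a chat completion service driven by a system prompt:

```csharp
using System;
using Microsoft.SemanticKernel;
using Microsoft.SemanticKernel.ChatCompletion;

// Placeholders for your Azure OpenAI deployment, endpoint and key.
var kernel = Kernel.CreateBuilder()
    .AddAzureOpenAIChatCompletion("<deployment>", "https://<resource>.openai.azure.com/", "<api-key>")
    .Build();

var chat = kernel.GetRequiredService<IChatCompletionService>();

// The system message gives the agent its booking-assistant persona.
var history = new ChatHistory(
    "You are a booking assistant for a medical clinic. " +
    "Help patients describe their needs and book a suitable appointment.");

history.AddUserMessage("I've had a sore throat for three days. Can I see someone tomorrow morning?");

var reply = await chat.GetChatMessageContentAsync(history, kernel: kernel);
Console.WriteLine(reply.Content);

// Keep the conversation going by appending the reply before the next user message.
history.AddAssistantMessage(reply.Content ?? string.Empty);
```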

March 4, 2024 · 6 min · Will Velida
Building NLP applications with Azure OpenAI

Azure OpenAI provides developers with the ability to add AI to their applications using a variety of different models from OpenAI. This includes GPT-4, GPT-4 Turbo with Vision, GPT-3.5-Turbo and Embedding models. We can add AI functionality to our applications using C#, Python, or REST APIs. The generative AI capabilities available in Azure OpenAI are provided through models that belong to different families. This article assumes that you already have access to Azure OpenAI. To use Azure OpenAI, you need to be approved for access. Luckily for me, my work already has a resource for me to use. If you haven’t got one, check out this guide to get started. ...
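To give a sense of what a basic call involves (a minimal sketch assuming the Azure.AI.OpenAI 1.0.0-beta .NET client that was current at the time; the endpoint, key and deployment name are placeholders for your own approved resource), a simple chat completion looks like this:

```csharp
using System;
using Azure;
using Azure.AI.OpenAI;

// Placeholders for your approved Azure OpenAI resource and model deployment.
var client = new OpenAIClient(
    new Uri("https://<resource>.openai.azure.com/"),
    new AzureKeyCredential("<api-key>"));

var options = new ChatCompletionsOptions
{
    DeploymentName = "<gpt-35-turbo-deployment>",
    Messages =
    {
        new ChatRequestSystemMessage("You are a helpful assistant."),
        new ChatRequestUserMessage("Summarise what Azure OpenAI offers in one sentence.")
    }
};

Response<ChatCompletions> response = await client.GetChatCompletionsAsync(options);
Console.WriteLine(response.Value.Choices[0].Message.Content);
```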

February 27, 2024 · 7 min · Will Velida