
The Publisher-Subscriber pattern (pub/sub messaging)

In distributed architectures, different parts of our system will need to provide information to other parts as events happen. One way we could solve this is to provision dedicated queues for each consumer and send messages asynchronously to decouple consumers from message senders. However, this wouldn’t scale if we had to do this for large numbers of consumers, and what if some consumers were only interested in parts of the information that producers send?...
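The core of the pattern the post covers can be sketched in a few lines: publishers send to a topic on a broker, and the broker fans each message out to every subscriber of that topic. This is a minimal in-process sketch of my own (the `Broker` class and its method names are illustrative, not from the post or any real messaging library):

```python
from collections import defaultdict
from typing import Callable

class Broker:
    """Toy in-process pub/sub broker: one input channel fanned out
    to per-topic subscribers."""
    def __init__(self) -> None:
        self._subscribers = defaultdict(list)  # topic -> list of handlers

    def subscribe(self, topic: str, handler: Callable[[dict], None]) -> None:
        self._subscribers[topic].append(handler)

    def publish(self, topic: str, message: dict) -> None:
        # The publisher never knows who (if anyone) is listening;
        # the broker delivers the event to each current subscriber.
        for handler in self._subscribers[topic]:
            handler(message)

broker = Broker()
received = []
broker.subscribe("orders", lambda m: received.append(("billing", m)))
broker.subscribe("orders", lambda m: received.append(("shipping", m)))
broker.subscribe("audit", lambda m: received.append(("audit", m)))

broker.publish("orders", {"id": 1})
# Only the two 'orders' subscribers saw the event; 'audit' did not.
```

Topics give us the selective delivery the teaser hints at: subscribers only receive the parts of the stream they registered interest in.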

January 26, 2024 · 7 min · Will Velida

The Sequential Convoy Pattern

In most cases, we will want our applications to process a sequence of messages in the order that they arrive. This is especially the case when we need to scale out our message processors to handle increased message load. This isn’t always an easy task, because as our application scales out, instances can often pull messages from the queue independently, similar to how this happens when implementing the Competing Consumers pattern....
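The usual way to square ordered processing with scale-out is to group related messages by a session (or partition) key: order is preserved within a group, while different groups can be processed in parallel. A rough sketch of that grouping step, with names of my own choosing (`session_id`, `partition_by_session` are illustrative):

```python
from collections import defaultdict, deque

def partition_by_session(messages):
    """Group messages into per-session FIFO queues (the 'convoys').
    Each convoy preserves arrival order; different convoys can be
    handed to different workers without blocking one another."""
    convoys = defaultdict(deque)
    for msg in messages:
        convoys[msg["session_id"]].append(msg)
    return convoys

messages = [
    {"session_id": "order-1", "step": "created"},
    {"session_id": "order-2", "step": "created"},
    {"session_id": "order-1", "step": "paid"},
    {"session_id": "order-2", "step": "paid"},
    {"session_id": "order-1", "step": "shipped"},
]
convoys = partition_by_session(messages)
steps_for_order_1 = [m["step"] for m in convoys["order-1"]]
# within a session, arrival order survives: created -> paid -> shipped
```

In a real broker this grouping is typically done by the messaging service itself (for example via message sessions), with each session locked to a single consumer at a time.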

January 26, 2024 · 5 min · Will Velida

The Asynchronous Request-Reply pattern

Client-side applications often rely on APIs to provide data and functionality, which are either under your control or provided by a third party. We usually communicate with these APIs over HTTP(S) using REST. Usually, these APIs are designed to respond quickly to requests. However, a variety of factors can affect the response latency of these APIs, including network infrastructure, how many clients are attempting to call the API, the difference in location between the caller and the API itself, etc....
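The pattern's usual shape is: the API accepts the request with HTTP 202, returns a status URL, and the client polls that URL until it redirects to the finished result. A toy sketch of that flow, with in-memory dictionaries standing in for real HTTP endpoints (all function and route names here are my own, for illustration):

```python
import uuid

jobs = {}  # in-memory stand-in for server-side job state

def accept_request(payload):
    """POST handler: enqueue the work, reply 202 with a status URL."""
    job_id = str(uuid.uuid4())
    jobs[job_id] = {"state": "Running", "result": None, "payload": payload}
    return 202, {"Location": f"/status/{job_id}"}, job_id

def complete_job(job_id):
    """A backend worker finishing the job out-of-band."""
    job = jobs[job_id]
    job["state"] = "Succeeded"
    job["result"] = {"echo": job["payload"]}

def poll_status(job_id):
    """GET /status/{id}: 200 while running, 302 to the result when done."""
    job = jobs[job_id]
    if job["state"] == "Running":
        return 200, {"status": "Running"}
    return 302, {"Location": f"/results/{job_id}"}

status_code, headers, job_id = accept_request({"report": "q1"})
first_poll = poll_status(job_id)   # still running -> 200
complete_job(job_id)
second_poll = poll_status(job_id)  # done -> redirect to the result
```

The key point is that the slow work never blocks the initial HTTP call; the client gets an immediate 202 and a well-known place to check back.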

January 24, 2024 · 5 min · Will Velida

The Queue-Based Load Leveling Pattern

Applications in the cloud can be subjected to intermittent heavy peaks of traffic. If our applications can’t handle these peaks, this can lead to performance, availability and reliability issues. For example, imagine we have an application that stores state temporarily in a cache. We could have a single task within our application that performs this for us, and we could have a good idea of how many times a single instance of our application would perform this task....
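The idea behind the pattern is easy to show numerically: bursts land in a queue, and the downstream service drains the queue at its own steady rate, so it never sees the spike directly. A tiny simulation of my own (the tick model and numbers are illustrative assumptions, not from the post):

```python
from collections import deque

def simulate(arrivals_per_tick, service_rate):
    """Bursty arrivals go into a queue; the service drains at most
    `service_rate` requests per tick, levelling the load it sees."""
    queue = deque()
    processed_per_tick = []
    for tick, arrivals in enumerate(arrivals_per_tick):
        queue.extend(f"req-{tick}-{i}" for i in range(arrivals))
        done = 0
        while queue and done < service_rate:
            queue.popleft()
            done += 1
        processed_per_tick.append(done)
    return processed_per_tick, len(queue)

# A spike of 10 requests in one tick, then quiet; service handles 3/tick.
processed, backlog = simulate([10, 0, 0, 0], service_rate=3)
# processed == [3, 3, 3, 1]: the service never exceeds its steady rate,
# and the backlog is fully drained by the fourth tick.
```

The trade-off, which the full article presumably explores, is latency: requests arriving during a spike wait in the queue rather than failing outright.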

January 22, 2024 · 5 min · Will Velida

The Priority Queue Pattern

The Priority Queue pattern allows us to prioritize requests sent to services so that higher-priority requests are received and processed faster than lower-priority requests. This is useful when we are building applications where we need to prioritize some clients over others, or where some requests need to be processed faster than usual for business or compliance reasons. In this article, I’ll talk about what the Priority Queue pattern is, what advantages priority queues can provide, what we need to consider when implementing the Priority Queue pattern, and how we can implement priority queues in Azure (spoiler alert: it’s a little hacky!...

January 17, 2024 · 4 min · Will Velida