SSE Streaming with Angular, Spring Boot, and LLM Streams

VerticalServe Blogs


In this blog post, we’ll explore how to implement Server-Sent Events (SSE) streaming using Angular 16 on the frontend and Spring Boot on the backend, and how to integrate LLM streams into the flow. We’ll also cover testing the implementation using Postman.

Setting Up the Backend with Spring Boot

First, let’s set up a Spring Boot server to handle SSE streams. We’ll use the SseEmitter class to facilitate SSE in Spring Boot.

Install Dependencies: Ensure you have the necessary dependencies in your pom.xml:

<dependencies>
    <dependency>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-web</artifactId>
    </dependency>
    <dependency>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-webflux</artifactId>
    </dependency>
    <!-- Add other dependencies as needed -->
</dependencies>

Create the Spring Boot Application:

// Application.java
package com.example.sse;

import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;

@SpringBootApplication
public class Application {

    public static void main(String[] args) {
        SpringApplication.run(Application.class, args);
    }
}

Integrating LLM Streams

Assuming you already have LangChain (or similar) code set up to interact with an LLM, we can incorporate it into our Spring Boot application to stream responses. The controller below simulates a streamed LLM response so the SSE plumbing can be verified end to end.

SSE Controller:

// LangChainController.java
package com.example.sse;

import org.springframework.web.bind.annotation.PostMapping;
import org.springframework.web.bind.annotation.RequestParam;
import org.springframework.web.bind.annotation.RestController;
import org.springframework.web.servlet.mvc.method.annotation.SseEmitter;

import java.io.IOException;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

@RestController
public class LangChainController {

    // Reuse one executor instead of creating a new one per request
    private final ExecutorService executor = Executors.newSingleThreadExecutor();

    @PostMapping("/query")
    public SseEmitter query(@RequestParam String query) {
        SseEmitter emitter = new SseEmitter();
        executor.execute(() -> {
            try {
                // Simulate a LangChain LLM stream response.
                // SseEmitter adds the "data:" prefix and trailing newlines itself,
                // so we send only the payload.
                for (int i = 0; i < 5; i++) {
                    emitter.send("Response part " + (i + 1));
                    Thread.sleep(1000);
                }
                emitter.complete();
            } catch (IOException | InterruptedException e) {
                emitter.completeWithError(e);
            }
        });
        return emitter;
    }
}
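
In a real integration, the simulated loop above would be replaced by calls to your LLM client. The sketch below assumes a hypothetical LlmStreamingService that exposes the model output as a reactive Flux<String> of chunks (for example, backed by LangChain4j or a WebClient call to an OpenAI-compatible endpoint); the service name, its streamAnswer method, and the /stream-query path are illustrative, not part of any specific library.

// StreamingQueryController.java (illustrative sketch)
package com.example.sse;

import org.springframework.web.bind.annotation.PostMapping;
import org.springframework.web.bind.annotation.RequestParam;
import org.springframework.web.bind.annotation.RestController;
import org.springframework.web.servlet.mvc.method.annotation.SseEmitter;

import reactor.core.publisher.Flux;

import java.io.IOException;

// Hypothetical abstraction over your LLM client; not part of any specific library.
interface LlmStreamingService {
    Flux<String> streamAnswer(String query);
}

@RestController
public class StreamingQueryController {

    private final LlmStreamingService llmService;

    public StreamingQueryController(LlmStreamingService llmService) {
        this.llmService = llmService;
    }

    @PostMapping("/stream-query")
    public SseEmitter streamQuery(@RequestParam String query) {
        // Effectively no timeout, so long generations are not cut off by the async timeout
        SseEmitter emitter = new SseEmitter(Long.MAX_VALUE);

        llmService.streamAnswer(query).subscribe(
            chunk -> {
                try {
                    emitter.send(chunk);          // each chunk becomes one SSE "data:" event
                } catch (IOException e) {
                    emitter.completeWithError(e); // client disconnected or the write failed
                }
            },
            emitter::completeWithError,           // propagate upstream errors to the SSE stream
            emitter::complete                     // close the stream when the Flux completes
        );

        return emitter;
    }
}

Forwarding each chunk with emitter.send keeps the MVC controller unchanged while the reactive client drives the pacing of the stream.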

Angular Changes & Postman Testing

Follow the companion blog post for the Angular 16 changes needed to consume the SSE stream and for how to test the endpoint using Postman.
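
If you want a quick local check without Postman, the sketch below consumes the /query endpoint using Spring WebFlux’s WebClient, which is already on the classpath via spring-boot-starter-webflux. The localhost base URL, port 8080, and the sample query value are assumptions for illustration.

// SseClientCheck.java (illustrative sketch for local verification)
package com.example.sse;

import org.springframework.core.ParameterizedTypeReference;
import org.springframework.http.MediaType;
import org.springframework.http.codec.ServerSentEvent;
import org.springframework.web.reactive.function.client.WebClient;

import reactor.core.publisher.Flux;

public class SseClientCheck {

    public static void main(String[] args) {
        WebClient client = WebClient.create("http://localhost:8080");

        // POST /query and decode the response as a stream of Server-Sent Events
        Flux<ServerSentEvent<String>> events = client.post()
                .uri(uriBuilder -> uriBuilder.path("/query").queryParam("query", "What is SSE?").build())
                .accept(MediaType.TEXT_EVENT_STREAM)
                .retrieve()
                .bodyToFlux(new ParameterizedTypeReference<ServerSentEvent<String>>() {});

        // Print each chunk as it arrives; blockLast keeps the JVM alive until the stream completes
        events.doOnNext(event -> System.out.println("Received: " + event.data()))
              .blockLast();
    }
}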

About:

VerticalServe Inc is a niche Cloud, Data & AI/ML premier consulting company, partnered with Google Cloud, Confluent, AWS, and Azure, with 60+ customers and many success stories.

Website: http://www.VerticalServe.com

Contact: contact@verticalserve.com

Successful Case Studies: http://verticalserve.com/success-stories.html

InsightLake Solutions (our pre-built solutions): http://www.InsightLake.com
