Getting Started with LangChain4j - A Complete Guide to LangChain for Java
Table of Contents
- Introduction to LangChain4j
- Environment Setup
- Quick Start
- Core Concepts
- ChatModel in Depth
- ChatMemory: Conversation Memory
- AI Services: The High-Level API
- RAG: Retrieval-Augmented Generation
- Vector Stores and Embeddings
- Tool Calling (Function Calling)
- Spring Boot Integration
- Practical Examples
- Best Practices and Optimization
- Summary and Next Steps
1. Introduction to LangChain4j

LangChain4j is an open-source framework designed for Java developers to build applications powered by large language models (LLMs). As the Java counterpart of LangChain, it provides a rich set of tools and abstractions that simplify LLM application development.

Key features:
✅ Native Java support: built entirely in Java and integrates seamlessly with the Java ecosystem
✅ Multiple LLM providers: supports OpenAI, Anthropic, Ollama, and more
✅ RAG support: built-in retrieval-augmented generation (RAG)
✅ Vector stores: integrations with a variety of vector databases
✅ Conversation memory: built-in chat history management
✅ Tool calling: supports function calling to extend LLM capabilities
✅ Spring Boot integration: a Spring Boot starter that works out of the box

Typical use cases:
- Knowledge-base Q&A: build question-answering systems over your own documents
- AI chatbots: develop multi-turn conversational bots
- Document analysis: automatically analyze and summarize documents
- Code generation: generate code from natural-language descriptions
- Content generation: produce articles, summaries, and other content automatically
- Data analysis: query and analyze data using natural language
| Feature | LangChain (Python) | LangChain4j (Java) |
|---|---|---|
| Language | Python | Java |
| Ecosystem | Python ecosystem | Java ecosystem |
| Spring integration | No | Yes |
| Execution model | Interpreted | Compiled |
| Enterprise applications | Suitable | Even better suited |
2. Environment Setup

Requirements:
- JDK: JDK 17 or later (JDK 21 recommended)
- Build tool: Maven 3.6+ or Gradle 7.0+
- IDE: IntelliJ IDEA, Eclipse, etc.

Verify your Java version:

```bash
java -version
# should print java version "17" or higher
```

Create a Maven project:

```bash
mvn archetype:generate -DgroupId=com.example -DartifactId=langchain4j-demo -DarchetypeArtifactId=maven-archetype-quickstart -DinteractiveMode=false
```

Adding dependencies
Add the LangChain4j dependencies to pom.xml (note that FileSystemDocumentLoader and the other basic document loaders ship with the core artifact, so no separate loader dependency is needed):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0
         http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>
    <groupId>com.example</groupId>
    <artifactId>langchain4j-demo</artifactId>
    <version>1.0-SNAPSHOT</version>
    <properties>
        <maven.compiler.source>17</maven.compiler.source>
        <maven.compiler.target>17</maven.compiler.target>
        <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
        <langchain4j.version>0.35.0</langchain4j.version>
    </properties>
    <dependencies>
        <!-- LangChain4j core -->
        <dependency>
            <groupId>dev.langchain4j</groupId>
            <artifactId>langchain4j</artifactId>
            <version>${langchain4j.version}</version>
        </dependency>
        <!-- OpenAI integration -->
        <dependency>
            <groupId>dev.langchain4j</groupId>
            <artifactId>langchain4j-open-ai</artifactId>
            <version>${langchain4j.version}</version>
        </dependency>
        <!-- Local models via Ollama -->
        <dependency>
            <groupId>dev.langchain4j</groupId>
            <artifactId>langchain4j-ollama</artifactId>
            <version>${langchain4j.version}</version>
        </dependency>
        <!-- Local embedding model (all-MiniLM-L6-v2) -->
        <dependency>
            <groupId>dev.langchain4j</groupId>
            <artifactId>langchain4j-embeddings-all-minilm-l6-v2</artifactId>
            <version>${langchain4j.version}</version>
        </dependency>
        <!-- Easy RAG (document parsing and ingestion helpers) -->
        <dependency>
            <groupId>dev.langchain4j</groupId>
            <artifactId>langchain4j-easy-rag</artifactId>
            <version>${langchain4j.version}</version>
        </dependency>
        <!-- Logging -->
        <dependency>
            <groupId>org.slf4j</groupId>
            <artifactId>slf4j-api</artifactId>
            <version>2.0.9</version>
        </dependency>
        <dependency>
            <groupId>ch.qos.logback</groupId>
            <artifactId>logback-classic</artifactId>
            <version>1.4.14</version>
        </dependency>
        <!-- JUnit for testing -->
        <dependency>
            <groupId>junit</groupId>
            <artifactId>junit</artifactId>
            <version>4.13.2</version>
            <scope>test</scope>
        </dependency>
    </dependencies>
</project>
```

For Gradle, add the same dependencies to build.gradle:
```groovy
plugins {
    id 'java'
}

repositories {
    mavenCentral()
}

dependencies {
    implementation 'dev.langchain4j:langchain4j:0.35.0'
    implementation 'dev.langchain4j:langchain4j-open-ai:0.35.0'
    implementation 'dev.langchain4j:langchain4j-ollama:0.35.0'
    implementation 'dev.langchain4j:langchain4j-embeddings-all-minilm-l6-v2:0.35.0'
    implementation 'dev.langchain4j:langchain4j-easy-rag:0.35.0'
    implementation 'org.slf4j:slf4j-api:2.0.9'
    implementation 'ch.qos.logback:logback-classic:1.4.14'
    testImplementation 'junit:junit:4.13.2'
}

java {
    sourceCompatibility = JavaVersion.VERSION_17
    targetCompatibility = JavaVersion.VERSION_17
}
```

Configuring the OpenAI API key
Option 1: environment variable (recommended)

```bash
# Linux/macOS
export OPENAI_API_KEY=your-api-key-here

# Windows
set OPENAI_API_KEY=your-api-key-here
```

Option 2: JVM system property (read it with System.getProperty rather than System.getenv)

```bash
java -DOPENAI_API_KEY=your-api-key-here YourApp
```

Option 3: configuration file

Create application.properties:

```properties
openai.api.key=your-api-key-here
```

Ollama for local models

To run models locally through Ollama, install Ollama first:

```bash
# Install Ollama
curl -fsSL https://ollama.com/install.sh | sh

# Download a model
ollama pull llama2
```

3. Quick Start
Create HelloLangChain4j.java:

```java
import dev.langchain4j.model.chat.ChatLanguageModel;
import dev.langchain4j.model.openai.OpenAiChatModel;

public class HelloLangChain4j {
    public static void main(String[] args) {
        // Create a ChatModel instance
        ChatLanguageModel model = OpenAiChatModel.builder()
                .apiKey(System.getenv("OPENAI_API_KEY"))
                .modelName("gpt-4o-mini")
                .temperature(0.7)
                .build();

        // Send a message
        String response = model.generate("Say 'Hello, LangChain4j!'");
        System.out.println(response);
    }
}
```

Or use a local Ollama model:

```java
import dev.langchain4j.model.chat.ChatLanguageModel;
import dev.langchain4j.model.ollama.OllamaChatModel;

public class HelloOllama {
    public static void main(String[] args) {
        // Use a local Ollama model
        ChatLanguageModel model = OllamaChatModel.builder()
                .baseUrl("http://localhost:11434")
                .modelName("llama2")
                .build();

        String response = model.generate("Hello, how are you?");
        System.out.println(response);
    }
}
```

Compile and run:

```bash
# Compile
mvn compile

# Run (make sure OPENAI_API_KEY is set)
mvn exec:java -Dexec.mainClass="HelloLangChain4j"
```

4. Core Concepts
ChatModel (the ChatLanguageModel interface) is the central LangChain4j abstraction for interacting with an LLM.

Main implementations:
- OpenAiChatModel: OpenAI models
- OllamaChatModel: local Ollama models
- AnthropicChatModel: Anthropic Claude models

ChatMemory stores and manages conversation history, enabling multi-turn dialogue.

EmbeddingModel converts text into vectors for similarity search.

Document represents a document: its text content plus metadata.

EmbeddingStore stores and retrieves document vectors.

AI Services provide a high-level, declarative API that simplifies interaction with the LLM.
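To build intuition for how these pieces compose before diving into the real APIs, here is a framework-free sketch. ToyChatModel, ToyChatMemory, and ConceptDemo are illustrative names invented for this example, not LangChain4j types:

```java
import java.util.ArrayList;
import java.util.List;

// Toy stand-ins for the abstractions above (illustrative only, not LangChain4j types)
interface ToyChatModel {
    String generate(List<String> messages);      // plays the role of ChatLanguageModel
}

class ToyChatMemory {                            // plays the role of ChatMemory
    private final List<String> messages = new ArrayList<>();
    void add(String message) { messages.add(message); }
    List<String> messages() { return messages; }
}

public class ConceptDemo {
    public static void main(String[] args) {
        // A fake model that just reports how much context it received
        ToyChatModel model = messages -> "reply #" + messages.size();
        ToyChatMemory memory = new ToyChatMemory();

        memory.add("user: hi");
        String reply = model.generate(memory.messages()); // the model sees the full history
        memory.add("ai: " + reply);

        System.out.println(memory.messages()); // [user: hi, ai: reply #1]
    }
}
```

The point of the sketch: the model is stateless; every turn, the memory hands it the accumulated message list, which is exactly the contract between ChatLanguageModel and ChatMemory.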
5. ChatModel in Depth

Basic usage:

```java
import dev.langchain4j.model.chat.ChatLanguageModel;
import dev.langchain4j.model.openai.OpenAiChatModel;

ChatLanguageModel model = OpenAiChatModel.builder()
        .apiKey(System.getenv("OPENAI_API_KEY"))
        .modelName("gpt-4o-mini")
        .temperature(0.7)
        .maxTokens(500)
        .build();

// Single-turn chat
String response = model.generate("What is Java?");
System.out.println(response);
```

Multi-turn conversation: pass the accumulated message history on every call. Note that generate(List<ChatMessage>) returns a Response<AiMessage>, not a String:

```java
import dev.langchain4j.data.message.AiMessage;
import dev.langchain4j.data.message.UserMessage;
import dev.langchain4j.memory.ChatMemory;
import dev.langchain4j.memory.chat.MessageWindowChatMemory;
import dev.langchain4j.model.chat.ChatLanguageModel;
import dev.langchain4j.model.openai.OpenAiChatModel;
import dev.langchain4j.model.output.Response;

ChatLanguageModel model = OpenAiChatModel.builder()
        .apiKey(System.getenv("OPENAI_API_KEY"))
        .modelName("gpt-4o-mini")
        .build();

// Create conversation memory (keep the 10 most recent messages)
ChatMemory memory = MessageWindowChatMemory.withMaxMessages(10);

// First turn
memory.add(UserMessage.from("My name is Alice"));
Response<AiMessage> response1 = model.generate(memory.messages());
memory.add(response1.content());
System.out.println("AI: " + response1.content().text());

// Second turn (with context)
memory.add(UserMessage.from("What is my name?"));
Response<AiMessage> response2 = model.generate(memory.messages());
memory.add(response2.content());
System.out.println("AI: " + response2.content().text()); // the AI remembers the name Alice
```

Streaming responses require the streaming model variant:

```java
import dev.langchain4j.data.message.AiMessage;
import dev.langchain4j.model.StreamingResponseHandler;
import dev.langchain4j.model.chat.StreamingChatLanguageModel;
import dev.langchain4j.model.openai.OpenAiStreamingChatModel;
import dev.langchain4j.model.output.Response;

StreamingChatLanguageModel model = OpenAiStreamingChatModel.builder()
        .apiKey(System.getenv("OPENAI_API_KEY"))
        .modelName("gpt-4o-mini")
        .build();

// Generate token by token
model.generate("Tell me a story", new StreamingResponseHandler<AiMessage>() {
    @Override
    public void onNext(String token) {
        System.out.print(token);
    }

    @Override
    public void onComplete(Response<AiMessage> response) {
        System.out.println("\n[Complete]");
    }

    @Override
    public void onError(Throwable error) {
        System.err.println("Error: " + error.getMessage());
    }
});
```

Switching providers:

OpenAI

```java
ChatLanguageModel openAiModel = OpenAiChatModel.builder()
        .apiKey(System.getenv("OPENAI_API_KEY"))
        .modelName("gpt-4o-mini")
        .temperature(0.7)
        .build();
```

Ollama (local)

```java
ChatLanguageModel ollamaModel = OllamaChatModel.builder()
        .baseUrl("http://localhost:11434")
        .modelName("llama2")
        .temperature(0.7)
        .build();
```

Anthropic Claude

```java
import dev.langchain4j.model.anthropic.AnthropicChatModel;

ChatLanguageModel claudeModel = AnthropicChatModel.builder()
        .apiKey(System.getenv("ANTHROPIC_API_KEY"))
        .modelName("claude-3-sonnet-20240229")
        .temperature(0.7)
        .build();
```

6. ChatMemory: Conversation Memory
ChatMemory stores conversation history so the AI can remember earlier turns and hold a multi-turn dialogue.
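The core idea behind the most common memory type is simple: keep a bounded window of messages and evict the oldest when the window overflows. A framework-free sketch (SimpleMessageWindow and WindowDemo are illustrative names, not the LangChain4j implementation):

```java
import java.util.ArrayDeque;
import java.util.ArrayList;
import java.util.Deque;
import java.util.List;

// Minimal message-window memory: keeps only the most recent N messages
class SimpleMessageWindow {
    private final int maxMessages;
    private final Deque<String> window = new ArrayDeque<>();

    SimpleMessageWindow(int maxMessages) { this.maxMessages = maxMessages; }

    void add(String message) {
        window.addLast(message);
        while (window.size() > maxMessages) {
            window.removeFirst();                // evict the oldest message first
        }
    }

    List<String> messages() { return new ArrayList<>(window); }
}

public class WindowDemo {
    public static void main(String[] args) {
        SimpleMessageWindow memory = new SimpleMessageWindow(2);
        memory.add("user: My name is Alice");
        memory.add("ai: Hi Alice!");
        memory.add("user: What is my name?");
        System.out.println(memory.messages()); // the first message has been evicted
    }
}
```

This eviction behavior is why a too-small window makes the AI "forget" facts from early in the conversation.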
Basic usage:

```java
import dev.langchain4j.data.message.AiMessage;
import dev.langchain4j.data.message.UserMessage;
import dev.langchain4j.memory.chat.MessageWindowChatMemory;
import dev.langchain4j.model.chat.ChatLanguageModel;
import dev.langchain4j.model.openai.OpenAiChatModel;
import dev.langchain4j.model.output.Response;

ChatLanguageModel model = OpenAiChatModel.builder()
        .apiKey(System.getenv("OPENAI_API_KEY"))
        .modelName("gpt-4o-mini")
        .build();

// Create conversation memory (keep the 10 most recent messages)
MessageWindowChatMemory memory = MessageWindowChatMemory.withMaxMessages(10);

// First turn
memory.add(UserMessage.from("I like Java programming"));
Response<AiMessage> response1 = model.generate(memory.messages());
memory.add(response1.content());
System.out.println("AI: " + response1.content().text());

// Second turn (the AI remembers the earlier conversation)
memory.add(UserMessage.from("What programming language do I like?"));
Response<AiMessage> response2 = model.generate(memory.messages());
memory.add(response2.content());
System.out.println("AI: " + response2.content().text()); // the AI answers Java
```

MessageWindowChatMemory (message window)

```java
// Keep the most recent N messages
MessageWindowChatMemory memory = MessageWindowChatMemory.withMaxMessages(20);
```

TokenWindowChatMemory (token window)

```java
import dev.langchain4j.memory.chat.TokenWindowChatMemory;
import dev.langchain4j.model.openai.OpenAiTokenizer;

// Keep the most recent N tokens (a Tokenizer is required to count them)
TokenWindowChatMemory memory = TokenWindowChatMemory.withMaxTokens(1000, new OpenAiTokenizer("gpt-4o-mini"));
```

Persisting memory in a store:

```java
import dev.langchain4j.memory.ChatMemory;
import dev.langchain4j.memory.chat.MessageWindowChatMemory;
import dev.langchain4j.store.memory.chat.InMemoryChatMemoryStore;

// An in-memory store; swap in your own ChatMemoryStore implementation for real persistence
InMemoryChatMemoryStore store = new InMemoryChatMemoryStore();

// Build a memory bound to a conversation id inside the store
ChatMemory memory = MessageWindowChatMemory.builder()
        .id("conversation-1")
        .maxMessages(10)
        .chatMemoryStore(store)
        .build();
```

7. AI Services: The High-Level API
AI Services offer a declarative API: you define a plain interface, and LangChain4j implements it on top of the LLM.

Basic usage:

```java
import dev.langchain4j.model.chat.ChatLanguageModel;
import dev.langchain4j.model.openai.OpenAiChatModel;
import dev.langchain4j.service.AiServices;

// Define the AI Service interface
interface Assistant {
    String chat(String userMessage);
}

// Create the AI Service instance
ChatLanguageModel model = OpenAiChatModel.builder()
        .apiKey(System.getenv("OPENAI_API_KEY"))
        .modelName("gpt-4o-mini")
        .build();

Assistant assistant = AiServices.create(Assistant.class, model);

// Use it
String response = assistant.chat("What is Java?");
System.out.println(response);
```

System messages:

```java
import dev.langchain4j.service.SystemMessage;

interface Translator {
    @SystemMessage("You are a professional translator. Translate the given text to Chinese.")
    String translate(String text);
}

Translator translator = AiServices.create(Translator.class, model);
String chinese = translator.translate("Hello, World!");
System.out.println(chinese); // 你好,世界!
```

Structured extraction:

```java
import dev.langchain4j.service.UserMessage;

// Define the data class
record Person(String name, int age, String city) {}

interface PersonExtractor {
    @UserMessage("Extract information about a person from: {{it}}")
    Person extractPerson(String text);
}

PersonExtractor extractor = AiServices.create(PersonExtractor.class, model);
Person person = extractor.extractPerson("John is 30 years old and lives in New York");
System.out.println(person); // Person[name=John, age=30, city=New York]
```

Adding conversation memory to an AI Service:

```java
import dev.langchain4j.memory.chat.MessageWindowChatMemory;
import dev.langchain4j.service.AiServices;

interface ChatBot {
    String chat(String userMessage);
}

ChatBot bot = AiServices.builder(ChatBot.class)
        .chatLanguageModel(model)
        .chatMemory(MessageWindowChatMemory.withMaxMessages(10))
        .build();

String response1 = bot.chat("My name is Alice");
String response2 = bot.chat("What is my name?"); // the AI remembers the name
```

8. RAG: Retrieval-Augmented Generation
RAG (retrieval-augmented generation) strengthens the LLM's answers by retrieving relevant documents and feeding them to the model as context.

A typical RAG pipeline:
- Load: load documents into the system
- Split: break each document into small chunks
- Embed: convert each chunk into a vector
- Store: write the vectors to a vector database
- Retrieve: fetch the chunks most relevant to a query
- Generate: answer the question using the retrieved chunks as context
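The "split" step above can be illustrated without any framework: cut the text into chunks of at most N characters, with a configurable overlap between neighbors. This toy splitter (NaiveSplitter is an invented name) is only a sketch; LangChain4j's DocumentSplitters.recursive is smarter about sentence and word boundaries:

```java
import java.util.ArrayList;
import java.util.List;

public class NaiveSplitter {

    /** Splits text into chunks of at most maxChars, with `overlap` characters shared between neighbors. */
    public static List<String> split(String text, int maxChars, int overlap) {
        List<String> chunks = new ArrayList<>();
        int step = maxChars - overlap;              // how far the window advances each time
        for (int start = 0; start < text.length(); start += step) {
            int end = Math.min(start + maxChars, text.length());
            chunks.add(text.substring(start, end));
            if (end == text.length()) break;        // last chunk reached
        }
        return chunks;
    }

    public static void main(String[] args) {
        List<String> chunks = split("abcdefghij", 4, 1);
        System.out.println(chunks); // [abcd, defg, ghij]
    }
}
```

Overlap matters in practice: a fact sitting exactly on a chunk boundary would otherwise be split across two chunks and might never be retrieved whole.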
The full pipeline in code. Note that a ContentRetriever takes a Query, not a raw String:

```java
import dev.langchain4j.data.document.Document;
import dev.langchain4j.data.document.loader.FileSystemDocumentLoader;
import dev.langchain4j.data.document.splitter.DocumentSplitters;
import dev.langchain4j.data.embedding.Embedding;
import dev.langchain4j.data.segment.TextSegment;
import dev.langchain4j.model.chat.ChatLanguageModel;
import dev.langchain4j.model.embedding.EmbeddingModel;
import dev.langchain4j.model.embedding.onnx.allminilml6v2.AllMiniLmL6V2EmbeddingModel;
import dev.langchain4j.model.openai.OpenAiChatModel;
import dev.langchain4j.rag.content.Content;
import dev.langchain4j.rag.content.retriever.EmbeddingStoreContentRetriever;
import dev.langchain4j.rag.query.Query;
import dev.langchain4j.store.embedding.EmbeddingStore;
import dev.langchain4j.store.embedding.inmemory.InMemoryEmbeddingStore;
import java.util.List;
import java.util.stream.Collectors;

// 1. Load the document
Document document = FileSystemDocumentLoader.loadDocument("path/to/document.txt");

// 2. Split it into segments
List<TextSegment> segments = DocumentSplitters.recursive(300, 0).split(document);

// 3. Create the embedding model
EmbeddingModel embeddingModel = new AllMiniLmL6V2EmbeddingModel();

// 4. Embed the segments
List<Embedding> embeddings = embeddingModel.embedAll(segments).content();

// 5. Store them in a vector store
EmbeddingStore<TextSegment> embeddingStore = new InMemoryEmbeddingStore<>();
embeddingStore.addAll(embeddings, segments);

// 6. Create the retriever
EmbeddingStoreContentRetriever retriever = EmbeddingStoreContentRetriever.builder()
        .embeddingStore(embeddingStore)
        .embeddingModel(embeddingModel)
        .maxResults(2)
        .build();

// 7. Create the chat model
ChatLanguageModel chatModel = OpenAiChatModel.builder()
        .apiKey(System.getenv("OPENAI_API_KEY"))
        .modelName("gpt-4o-mini")
        .build();

// 8. Query
String question = "What is the document about?";
List<Content> relevantContents = retriever.retrieve(Query.from(question));

// Build the context from the retrieved segments
String context = relevantContents.stream()
        .map(Content::textSegment)
        .map(TextSegment::text)
        .collect(Collectors.joining("\n\n"));

String answer = chatModel.generate(
        "Answer the following question based on the context:\n\n" +
        "Context:\n" + context + "\n\n" +
        "Question: " + question
);
System.out.println(answer);
```

The langchain4j-easy-rag module shrinks this to a few lines: EmbeddingStoreIngestor supplies sensible defaults for parsing, splitting, and embedding, and a content retriever plugs straight into an AI Service:

```java
import dev.langchain4j.data.document.Document;
import dev.langchain4j.data.document.loader.FileSystemDocumentLoader;
import dev.langchain4j.data.segment.TextSegment;
import dev.langchain4j.model.openai.OpenAiChatModel;
import dev.langchain4j.rag.content.retriever.EmbeddingStoreContentRetriever;
import dev.langchain4j.service.AiServices;
import dev.langchain4j.store.embedding.EmbeddingStoreIngestor;
import dev.langchain4j.store.embedding.inmemory.InMemoryEmbeddingStore;
import java.util.List;

interface Assistant {
    String chat(String question);
}

// Load and ingest documents (Easy RAG provides a default parser, splitter, and embedding model)
List<Document> documents = FileSystemDocumentLoader.loadDocuments("path/to/docs");
InMemoryEmbeddingStore<TextSegment> embeddingStore = new InMemoryEmbeddingStore<>();
EmbeddingStoreIngestor.ingest(documents, embeddingStore);

// Wire retrieval into an AI Service
Assistant assistant = AiServices.builder(Assistant.class)
        .chatLanguageModel(OpenAiChatModel.builder()
                .apiKey(System.getenv("OPENAI_API_KEY"))
                .modelName("gpt-4o-mini")
                .build())
        .contentRetriever(EmbeddingStoreContentRetriever.from(embeddingStore))
        .build();

// Query
String answer = assistant.chat("What is the document about?");
System.out.println(answer);
```

9. Vector Stores and Embeddings
Using a local embedding model (note that embedAll takes a list of TextSegment, not raw strings):

```java
import dev.langchain4j.data.embedding.Embedding;
import dev.langchain4j.data.segment.TextSegment;
import dev.langchain4j.model.embedding.EmbeddingModel;
import dev.langchain4j.model.embedding.onnx.allminilml6v2.AllMiniLmL6V2EmbeddingModel;
import java.util.List;

EmbeddingModel embeddingModel = new AllMiniLmL6V2EmbeddingModel();

// Embed a single string
Embedding embedding = embeddingModel.embed("Hello, World!").content();

// Embed a batch
List<Embedding> embeddings = embeddingModel.embedAll(List.of(
        TextSegment.from("Text 1"),
        TextSegment.from("Text 2"),
        TextSegment.from("Text 3")
)).content();
```

Using the OpenAI embedding model:

```java
import dev.langchain4j.model.openai.OpenAiEmbeddingModel;

EmbeddingModel embeddingModel = OpenAiEmbeddingModel.builder()
        .apiKey(System.getenv("OPENAI_API_KEY"))
        .modelName("text-embedding-3-small")
        .build();
```

InMemoryEmbeddingStore (in-memory storage):

```java
import dev.langchain4j.store.embedding.inmemory.InMemoryEmbeddingStore;

EmbeddingStore<TextSegment> store = new InMemoryEmbeddingStore<>();
```

Other vector databases

LangChain4j also supports:
- Chroma
- Pinecone
- Weaviate
- Qdrant
- Milvus
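Under the hood, similarity search ranks stored vectors by cosine similarity to the query vector. A minimal, framework-free version of that ranking (CosineSearch is an invented name; real stores use optimized index structures rather than a full scan):

```java
import java.util.Comparator;
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

public class CosineSearch {

    /** Cosine similarity between two vectors of equal length. */
    static double cosine(double[] a, double[] b) {
        double dot = 0, normA = 0, normB = 0;
        for (int i = 0; i < a.length; i++) {
            dot += a[i] * b[i];
            normA += a[i] * a[i];
            normB += b[i] * b[i];
        }
        return dot / (Math.sqrt(normA) * Math.sqrt(normB));
    }

    /** Returns the keys of the k stored vectors most similar to the query. */
    static List<String> topK(Map<String, double[]> store, double[] query, int k) {
        return store.entrySet().stream()
                .sorted(Comparator.comparingDouble(
                        (Map.Entry<String, double[]> e) -> cosine(e.getValue(), query)).reversed())
                .limit(k)
                .map(Map.Entry::getKey)
                .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        Map<String, double[]> store = Map.of(
                "java", new double[]{1.0, 0.1},
                "coffee", new double[]{0.1, 1.0});
        System.out.println(topK(store, new double[]{0.9, 0.2}, 1)); // [java]
    }
}
```

Real embedding vectors have hundreds of dimensions, but the ranking logic is exactly this: score every candidate against the query and keep the top k.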
Similarity search:

```java
import dev.langchain4j.store.embedding.EmbeddingMatch;
import java.util.List;

// Embed the query
Embedding queryEmbedding = embeddingModel.embed("What is Java?").content();

// Search for similar segments
List<EmbeddingMatch<TextSegment>> matches = store.findRelevant(
        queryEmbedding,
        5 // return the 5 most similar results
);

// Print the results
for (EmbeddingMatch<TextSegment> match : matches) {
    System.out.println("Score: " + match.score());
    System.out.println("Text: " + match.embedded().text());
}
```

10. Tool Calling (Function Calling)
Tool calling lets the LLM invoke external functions, extending what the AI can do.

Define tools by annotating methods on an ordinary class with @Tool (the annotation belongs on the tool class, not on the AI Service interface):

```java
import dev.langchain4j.agent.tool.Tool;

public class Calculator {

    @Tool("Calculates the sum of two numbers")
    public int add(int a, int b) {
        return a + b;
    }

    @Tool("Calculates the product of two numbers")
    public int multiply(int a, int b) {
        return a * b;
    }
}
```

Then register the tool object with an AI Service:

```java
import dev.langchain4j.service.AiServices;

interface MathAssistant {
    String chat(String userMessage);
}

Calculator calculator = new Calculator();

MathAssistant assistant = AiServices.builder(MathAssistant.class)
        .chatLanguageModel(model)
        .tools(calculator)
        .build();

String result = assistant.chat("What is 10 + 20?");
System.out.println(result); // the AI calls add() to compute the result
```

A richer example:

```java
import dev.langchain4j.agent.tool.Tool;
import java.time.LocalDateTime;
import java.time.format.DateTimeFormatter;

public class WeatherService {

    @Tool("Gets the current weather for a given location")
    public String getWeather(String location) {
        // Simulated weather service
        return "The weather in " + location + " is sunny, 25°C";
    }

    @Tool("Gets the current date and time")
    public String getCurrentDateTime() {
        return LocalDateTime.now().format(
                DateTimeFormatter.ofPattern("yyyy-MM-dd HH:mm:ss")
        );
    }
}

interface Assistant {
    String chat(String userMessage);
}

WeatherService weatherService = new WeatherService();

Assistant assistant = AiServices.builder(Assistant.class)
        .chatLanguageModel(model)
        .tools(weatherService)
        .build();

String response = assistant.chat("What's the weather in Beijing?");
System.out.println(response);
```

11. Spring Boot Integration
Add the Spring Boot and LangChain4j starter dependencies (the OpenAI-specific starter is what auto-configures a ChatLanguageModel bean from properties):

```xml
<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter</artifactId>
    <version>3.2.0</version>
</dependency>
<dependency>
    <groupId>dev.langchain4j</groupId>
    <artifactId>langchain4j-open-ai-spring-boot-starter</artifactId>
    <version>0.35.0</version>
</dependency>
```

application.yml:

```yaml
langchain4j:
  open-ai:
    chat-model:
      api-key: ${OPENAI_API_KEY}
      model-name: gpt-4o-mini
      temperature: 0.7
```

The configured model can then be injected like any other bean:

```java
import dev.langchain4j.model.chat.ChatLanguageModel;
import org.springframework.stereotype.Service;

@Service
public class ChatService {

    private final ChatLanguageModel chatModel;

    public ChatService(ChatLanguageModel chatModel) {
        this.chatModel = chatModel;
    }

    public String chat(String message) {
        return chatModel.generate(message);
    }
}
```

```java
import org.springframework.web.bind.annotation.*;

@RestController
@RequestMapping("/api/chat")
public class ChatController {

    private final ChatService chatService;

    public ChatController(ChatService chatService) {
        this.chatService = chatService;
    }

    @PostMapping
    public String chat(@RequestBody String message) {
        return chatService.chat(message);
    }
}
```

12. Practical Examples
Example 1: a knowledge-base Q&A system, built from the Easy RAG building blocks shown earlier:

```java
import dev.langchain4j.data.document.Document;
import dev.langchain4j.data.document.loader.FileSystemDocumentLoader;
import dev.langchain4j.data.segment.TextSegment;
import dev.langchain4j.model.openai.OpenAiChatModel;
import dev.langchain4j.rag.content.retriever.EmbeddingStoreContentRetriever;
import dev.langchain4j.service.AiServices;
import dev.langchain4j.store.embedding.EmbeddingStoreIngestor;
import dev.langchain4j.store.embedding.inmemory.InMemoryEmbeddingStore;

public class QASystem {

    interface Assistant {
        String answer(String question);
    }

    private final InMemoryEmbeddingStore<TextSegment> store = new InMemoryEmbeddingStore<>();
    private final Assistant assistant;

    public QASystem() {
        this.assistant = AiServices.builder(Assistant.class)
                .chatLanguageModel(OpenAiChatModel.builder()
                        .apiKey(System.getenv("OPENAI_API_KEY"))
                        .modelName("gpt-4o-mini")
                        .build())
                .contentRetriever(EmbeddingStoreContentRetriever.from(store))
                .build();
    }

    public void addDocument(String filePath) {
        Document document = FileSystemDocumentLoader.loadDocument(filePath);
        EmbeddingStoreIngestor.ingest(document, store);
    }

    public String answer(String question) {
        return assistant.answer(question);
    }

    public static void main(String[] args) {
        QASystem qa = new QASystem();
        qa.addDocument("knowledge-base.txt");
        String answer = qa.answer("What is the main topic?");
        System.out.println(answer);
    }
}
```

Example 2: a multi-user chatbot. @MemoryId on a parameter routes each user to their own memory:

```java
import dev.langchain4j.memory.chat.MessageWindowChatMemory;
import dev.langchain4j.model.chat.ChatLanguageModel;
import dev.langchain4j.model.openai.OpenAiChatModel;
import dev.langchain4j.service.AiServices;
import dev.langchain4j.service.MemoryId;
import dev.langchain4j.service.UserMessage;

interface ChatBot {
    String chat(@MemoryId String userId, @UserMessage String userMessage);
}

public class ChatBotService {

    private final ChatBot bot;

    public ChatBotService() {
        ChatLanguageModel model = OpenAiChatModel.builder()
                .apiKey(System.getenv("OPENAI_API_KEY"))
                .modelName("gpt-4o-mini")
                .build();

        this.bot = AiServices.builder(ChatBot.class)
                .chatLanguageModel(model)
                // One separate 20-message window per user id
                .chatMemoryProvider(userId -> MessageWindowChatMemory.withMaxMessages(20))
                .build();
    }

    public String chat(String userId, String message) {
        return bot.chat(userId, message);
    }
}
```

Example 3: a document summarizer:

```java
import dev.langchain4j.data.document.Document;
import dev.langchain4j.data.document.loader.FileSystemDocumentLoader;
import dev.langchain4j.model.chat.ChatLanguageModel;
import dev.langchain4j.model.openai.OpenAiChatModel;

public class DocumentSummarizer {

    private final ChatLanguageModel model;

    public DocumentSummarizer() {
        this.model = OpenAiChatModel.builder()
                .apiKey(System.getenv("OPENAI_API_KEY"))
                .modelName("gpt-4o-mini")
                .build();
    }

    public String summarize(String filePath) {
        Document document = FileSystemDocumentLoader.loadDocument(filePath);
        String prompt = "Please summarize the following document:\n\n" +
                document.text() +
                "\n\nSummary:";
        return model.generate(prompt);
    }
}
```

13. Best Practices and Optimization
Performance optimization:
- Use a local embedding model: cuts API-call costs
- Batch processing: generate embeddings in batches
- Cache results: cache answers to common queries

Cost control:
- Choose the right model: pick the model size your use case needs; smaller models cost less
- Use local models: Ollama and similar local models are free to run
- Use the API sensibly: avoid unnecessary API calls
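The "cache results" advice above can be as simple as an LRU map keyed by prompt. This is an illustrative sketch (ResponseCache and CacheDemo are invented names); in production a library cache such as Caffeine would be a more robust choice:

```java
import java.util.LinkedHashMap;
import java.util.Map;
import java.util.function.Function;

// A tiny LRU cache for model responses, keyed by prompt
class ResponseCache {
    private final Map<String, String> cache;

    ResponseCache(int maxEntries) {
        // An access-ordered LinkedHashMap evicts the least recently used entry
        this.cache = new LinkedHashMap<>(16, 0.75f, true) {
            @Override
            protected boolean removeEldestEntry(Map.Entry<String, String> eldest) {
                return size() > maxEntries;
            }
        };
    }

    /** Returns the cached answer, calling the model only on a cache miss. */
    synchronized String answer(String prompt, Function<String, String> model) {
        return cache.computeIfAbsent(prompt, model);
    }

    synchronized int size() { return cache.size(); }
}

public class CacheDemo {
    public static void main(String[] args) {
        int[] calls = {0};
        ResponseCache cache = new ResponseCache(100);
        Function<String, String> fakeModel = p -> { calls[0]++; return "answer to " + p; };

        cache.answer("What is Java?", fakeModel);
        cache.answer("What is Java?", fakeModel);   // served from the cache
        System.out.println("model calls: " + calls[0]); // model calls: 1
    }
}
```

Caching only pays off for deterministic, repeated prompts (FAQ-style queries); for creative generation with a nonzero temperature, identical prompts are usually expected to produce different answers.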
Error handling:

```java
try {
    String response = model.generate(prompt);
} catch (Exception e) {
    // Handle the failure
    logger.error("Error generating response", e);
    // Return a fallback response or retry
}
```

Security:
- Protect API keys: use environment variables or a secrets-management service
- Validate input: sanity-check user input
- Filter output: strip sensitive information from responses
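Beyond a bare try/catch, transient API failures (rate limits, timeouts) are usually handled with retries and exponential backoff. A framework-free sketch (Retry is an invented helper name, not a LangChain4j class):

```java
import java.util.function.Supplier;

public class Retry {

    /** Calls action, retrying up to maxAttempts times with exponential backoff. */
    public static <T> T withBackoff(Supplier<T> action, int maxAttempts, long initialDelayMillis) {
        long delay = initialDelayMillis;
        RuntimeException last = null;
        for (int attempt = 1; attempt <= maxAttempts; attempt++) {
            try {
                return action.get();
            } catch (RuntimeException e) {
                last = e;                              // remember the failure
                if (attempt < maxAttempts) {
                    try {
                        Thread.sleep(delay);           // wait before the next attempt
                    } catch (InterruptedException ie) {
                        Thread.currentThread().interrupt();
                        break;                         // stop retrying if interrupted
                    }
                    delay *= 2;                        // double the delay each time
                }
            }
        }
        throw last;                                    // all attempts failed
    }

    public static void main(String[] args) {
        int[] attempts = {0};
        String result = withBackoff(() -> {
            attempts[0]++;
            if (attempts[0] < 3) throw new RuntimeException("transient failure");
            return "ok";
        }, 5, 1);
        System.out.println(result + " after " + attempts[0] + " attempts"); // ok after 3 attempts
    }
}
```

In real code, retry only on errors that can actually succeed later (HTTP 429 or 5xx style failures), and fail fast on authentication or validation errors.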
14. Summary and Next Steps

Through this tutorial you have learned:
- ✅ LangChain4j fundamentals and basic usage
- ✅ How to use ChatModel
- ✅ Conversation memory with ChatMemory
- ✅ The high-level AI Services API
- ✅ Retrieval-augmented generation (RAG)
- ✅ Vector stores and embeddings
- ✅ Tool calling
- ✅ Spring Boot integration

Where to go next:
- Go deeper into RAG: tune and optimize your retrieval strategy
- Vector databases: integrate a dedicated vector database
- Multimodality: work with images, audio, and other modalities
- Agents: build intelligent agent systems
- Performance: optimize your system's performance

Resources:
- Official documentation: https://langchain4j.github.io/langchain4j/
- GitHub: https://github.com/langchain4j/langchain4j
- Sample code: the examples in the official GitHub repository

Tips:
- Practice: learn through real projects
- Read the source: understand how things are implemented
- Join the community: take part in open-source discussions

Closing Words

LangChain4j gives Java developers powerful capabilities for building LLM applications. With this tutorial behind you, you should have a solid grasp of its core features and how to use them.

Remember:
- Practice: pair theory with hands-on coding
- Understand the principles: grasp core concepts such as RAG and vector stores
- Watch performance: mind API-call costs and optimize where needed
- Keep learning: follow the latest developments in AI and LLMs

Happy learning and happy coding! 🚀

This tutorial was written by the Java 突击队 learning community; feedback is welcome.