Guillaume Laforge

An ADK Java GitHub template for your first Java AI agent

With the unveiling of the Java version of the Agent Development Kit (ADK), I recently covered how to get started developing your first AI agent in Java.

The installation and quickstart documentation also helps with the first steps, but I realized it would be handy to provide a template project, to further accelerate your time-to-first-conversation with your Java agents! This led me to play with GitHub’s template repository feature, which lets you create a copy of the template project in your own account or organization. The template comes with a ready-made project structure, a configured pom.xml file, and a first Java agent you can customize at will, and run either from the command line or from the ADK Dev UI.

Read more...

Beyond the chatbot or AI sparkle: a seamless AI integration

When I talk about Generative AI, whether it’s with developers at conferences or with customers, I often find myself saying the same thing: chatbots are just one way to use Large Language Models (LLMs).

Unfortunately, I see many articles and presentations that focus solely on demonstrating LLMs at work within the context of chatbots. I’m guilty of showing the traditional chat interfaces too. But there’s so much more to it!

Read more...

Write AI agents in Java – Agent Development Kit getting started guide

At Google Cloud Next ’25, last April, Google released the Agent Development Kit (ADK) for Python, a flexible and modular framework for developing and deploying AI agents.

Now at Google I/O, a Java version of ADK has been made available! And I’m glad to have had the chance to participate in its launch, via code samples, documentation, and helping shape the API so it’s idiomatic for Java developers.

In this article, my goal is to give you the basics to get started with the ADK framework in Java, using the Gemini model, and to run your first Java agents locally.
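To give a flavour of what that looks like, here is a minimal sketch of an agent definition with ADK Java, assuming the builder-style LlmAgent API and a Gemini model name from the quickstart; the agent name and instruction below are purely illustrative.

```java
import com.google.adk.agents.LlmAgent;

public class ScienceTeacherAgent {

    // A single LLM-backed agent, ready to be picked up by the ADK Dev UI or a Runner.
    public static final LlmAgent ROOT_AGENT = LlmAgent.builder()
            .name("science-teacher")          // illustrative agent name
            .model("gemini-2.0-flash")        // any Gemini model supported by ADK
            .description("Agent that answers science questions in simple terms")
            .instruction("You are a friendly science teacher. Explain concepts simply.")
            .build();
}
```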

Read more...

Vibe coding an MCP server with Micronaut, LangChain4j, and Gemini

Unlike Quarkus and Spring Boot, Micronaut doesn’t (yet?) provide a module to facilitate the implementation of MCP (Model Context Protocol) servers. But since it’s my favorite framework, I decided to see what it takes to build a quick implementation, by vibe coding it with the help of Gemini!

In a recent article, I explored how to use the MCP reference implementation for Java to implement an MCP server, served as a servlet via Jetty, and how to call that server from LangChain4j’s great MCP support. One approach with Micronaut could have been to integrate the servlet I had built via Micronaut’s servlet support, but that didn’t really feel like a genuine, native way to implement such a server, so I decided to do it from scratch.
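As a rough illustration of the “from scratch” idea (a simplified sketch, not the code from the article), a bare-bones Micronaut controller could accept MCP-style JSON-RPC messages over HTTP; the endpoint path and the dispatch logic below are hypothetical placeholders.

```java
import io.micronaut.http.annotation.Body;
import io.micronaut.http.annotation.Controller;
import io.micronaut.http.annotation.Post;

import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

// Hypothetical skeleton: a plain Micronaut HTTP endpoint accepting MCP-style
// JSON-RPC messages. The real protocol handling (initialize, tools/list,
// tools/call, the SSE transport, etc.) is what a full implementation adds.
@Controller("/mcp")
public class McpController {

    @Post(uri = "/message", consumes = "application/json", produces = "application/json")
    public Map<String, Object> handle(@Body Map<String, Object> request) {
        String method = String.valueOf(request.get("method"));

        // Dispatch on the JSON-RPC method name; placeholder responses only.
        Object result = switch (method) {
            case "tools/list" -> Map.of("tools", List.of());
            default -> Map.of("message", "method not implemented: " + method);
        };

        Map<String, Object> response = new LinkedHashMap<>();
        response.put("jsonrpc", "2.0");
        response.put("id", request.get("id"));
        response.put("result", result);
        return response;
    }
}
```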

Read more...

MCP Client and Server with the Java MCP SDK and LangChain4j

MCP (Model Context Protocol) is generating a lot of buzz these days! It is a protocol introduced last November by Anthropic, integrated into Claude Desktop and into more and more tools and frameworks, to expand the capabilities of LLMs by giving them access to various external tools and functions.

My colleague Philipp Schmid gave a great introduction to MCP recently, so if you want to learn more about MCP, this is the place for you.

In this article, I’d like to guide you through the implementation of an MCP server, and an MCP client, in Java. As I’m contributing to LangChain4j, I’ll be using LangChain4j’s mcp module for the client.
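As a small taste of the client side, here is a sketch using LangChain4j’s mcp module, assuming its HTTP/SSE transport and tool provider builders; the server URL is just a placeholder.

```java
import dev.langchain4j.mcp.McpToolProvider;
import dev.langchain4j.mcp.client.DefaultMcpClient;
import dev.langchain4j.mcp.client.McpClient;
import dev.langchain4j.mcp.client.transport.McpTransport;
import dev.langchain4j.mcp.client.transport.http.HttpMcpTransport;
import dev.langchain4j.service.tool.ToolProvider;

import java.util.List;

public class McpClientSketch {

    public static void main(String[] args) {
        // Transport pointing at an MCP server exposed over SSE (URL is a placeholder).
        McpTransport transport = new HttpMcpTransport.Builder()
                .sseUrl("http://localhost:8080/sse")
                .logRequests(true)
                .logResponses(true)
                .build();

        // The MCP client speaks the protocol over that transport.
        McpClient mcpClient = new DefaultMcpClient.Builder()
                .transport(transport)
                .build();

        // Expose the server's tools to a LangChain4j AI service via a tool provider.
        ToolProvider toolProvider = McpToolProvider.builder()
                .mcpClients(List.of(mcpClient))
                .build();

        // ... then plug toolProvider into AiServices.builder(...).toolProvider(toolProvider)
    }
}
```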

Read more...

Quick Tip: Clearing disk space in Cloud Shell

Right in the middle of a workshop I was delivering, as I was launching Google Cloud console’s Cloud Shell environment, I received the dreaded warning message: no space left on device.

And indeed, I didn’t have much space left, and Cloud Shell was reminding me it was high time I cleaned up the mess! Fortunately, the shell gives a nice hint, with a pointer to this documentation page with advice on how to reclaim space.

Read more...

LLMs.txt to help LLMs grok your content

Since I started my career, I’ve been sharing what I’ve learned along the way on this blog. It makes me happy when developers find solutions to their problems, or discover new things, thanks to articles I’ve written here. So it’s important for me that readers are able to find those posts. Of course, my blog is indexed by search engines, and people usually find out about it from Google or other engines, or discover it via the links I share on social media. But you can also make your content more easily grokkable by LLM-powered tools (like Gemini, ChatGPT, Claude, etc.).

Read more...

Pretty-print Markdown on the console

With Large Language Models loving to output Markdown responses, I’ve been wanting to display those Markdown snippets nicely in the console, when developing some LLM-powered apps and experiments. At first, I thought I could use a Markdown parser library, and implement some kind of output formatter to display the text nicely, taking advantage of ANSI color codes and formats. However, it felt a bit over-engineered, so I thought “hey, why not just use some simple regular expressions!” (and now you’ll tell me I have a second problem with regexes)
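Here is a minimal sketch of that regex-plus-ANSI idea (a simplified illustration, not the code from the article), handling just a few common Markdown constructs.

```java
public class MarkdownToAnsi {

    // A few ANSI escape codes for console styling.
    private static final String BOLD = "\u001B[1m";
    private static final String ITALIC = "\u001B[3m";
    private static final String REVERSED = "\u001B[7m";
    private static final String RESET = "\u001B[0m";

    // Naive regex-based "rendering": good enough for quick console output,
    // definitely not a full Markdown parser.
    public static String render(String markdown) {
        return markdown
                .replaceAll("(?m)^#{1,6}\\s*(.+)$", BOLD + "$1" + RESET)            // # headings
                .replaceAll("\\*\\*(.+?)\\*\\*", BOLD + "$1" + RESET)               // **bold**
                .replaceAll("(?<!\\*)\\*([^*]+)\\*(?!\\*)", ITALIC + "$1" + RESET)  // *italic*
                .replaceAll("`([^`]+)`", REVERSED + "$1" + RESET);                  // `inline code`
    }

    public static void main(String[] args) {
        System.out.println(render("# Hello\nThis is **bold**, *italic*, and `code`."));
    }
}
```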

Read more...

Advanced RAG – Sentence Window Retrieval

Retrieval Augmented Generation (RAG) is a great way to expand the knowledge of Large Language Models to let them know about your own data and documents. With RAG, LLMs can ground their answers on the information you provide, which reduces the chances of hallucinations.

Implementing RAG is fairly trivial with a framework like LangChain4j. However, the results may not be on par with your quality expectations. Often, you’ll need to further tweak different aspects of the RAG pipeline, like the document preparation phase (in particular, document chunking), or the retrieval phase, to find the best information in your vector database.
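For context, here is a baseline LangChain4j ingestion sketch showing where the chunking parameters live; it uses plain recursive splitting rather than the sentence-window technique the article explores, and the chunk size and overlap values are arbitrary starting points to tune.

```java
import dev.langchain4j.data.document.Document;
import dev.langchain4j.data.document.splitter.DocumentSplitters;
import dev.langchain4j.data.segment.TextSegment;
import dev.langchain4j.model.embedding.EmbeddingModel;
import dev.langchain4j.store.embedding.EmbeddingStore;
import dev.langchain4j.store.embedding.EmbeddingStoreIngestor;

public class RagIngestion {

    // Chunking happens at ingestion time: the splitter's max segment size and
    // overlap (in characters here) are typical knobs to tweak for better retrieval.
    static void ingest(Document document,
                       EmbeddingModel embeddingModel,
                       EmbeddingStore<TextSegment> embeddingStore) {
        EmbeddingStoreIngestor.builder()
                .documentSplitter(DocumentSplitters.recursive(300, 30)) // max chars, overlap
                .embeddingModel(embeddingModel)
                .embeddingStore(embeddingStore)
                .build()
                .ingest(document);
    }
}
```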

Read more...

The power of large context windows for your documentation efforts

My colleague Jaana Dogan pointed at Anthropic’s MCP (Model Context Protocol) documentation pages, which describe how to build MCP servers and clients. The interesting twist was about preparing the documentation so that Claude could assist you in building those MCP servers and clients, rather than simply documenting how to do so.