Creating an LLM Translation App with LangChain

Apr 22, 2025

Building a Simple LLM Application with LangChain

Overview

  • Objective: Create a simple application to translate text from English into another language using LangChain.
  • Components:
    • Language Models
    • Prompt Templates
    • Debugging/Tracing with LangSmith

Setup

Installation

  • Install LangChain using package managers:
    • npm
    • yarn
    • pnpm
  • Command: yarn add langchain @langchain/core
  • See the Installation Guide for details.
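The equivalent install commands for each of the three package managers (the yarn form is the one listed above):

```shell
# npm
npm install langchain @langchain/core

# yarn
yarn add langchain @langchain/core

# pnpm
pnpm add langchain @langchain/core
```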

LangSmith Setup

  • For complex applications with multiple LLM calls, use LangSmith for tracing.
  • Set environment variables to enable logging:

export LANGSMITH_TRACING="true"
export LANGSMITH_API_KEY="..."
# Optional: Reduce tracing latency
export LANGCHAIN_CALLBACKS_BACKGROUND=true

Using Language Models

  • LangChain supports various language models:
    • Groq, OpenAI, Anthropic, FireworksAI, MistralAI, VertexAI

Example with Groq Model

Installation

  • Add dependency: yarn add @langchain/groq

Environment Variables

  • Set API key: GROQ_API_KEY=your-api-key

Model Instantiation

import { ChatGroq } from "@langchain/groq";

const model = new ChatGroq({
  model: "llama-3.3-70b-versatile",
  temperature: 0,
});

Using the Model

  • Models are Runnables.
  • Use the .invoke method with a list of messages to interact with the model.

import { HumanMessage, SystemMessage } from "@langchain/core/messages";

const messages = [
  new SystemMessage("Translate the following from English into Italian"),
  new HumanMessage("hi!"),
];

await model.invoke(messages);
  • View token usage and other details via LangSmith if enabled.

Streaming

  • Chat models can stream tokens as they are generated; iterate over the stream with for await:

const stream = await model.stream(messages);

const chunks = [];
for await (const chunk of stream) {
  chunks.push(chunk);
  console.log(`${chunk.content}|`);
}
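The async-iteration pattern above does not depend on a live model: any async iterable of content-bearing chunks can be consumed and reassembled the same way. As a sketch, the hypothetical `fakeStream` generator below stands in for `model.stream(...)`, yielding objects with a `content` field like the chunks a chat model returns:

```typescript
// Hypothetical stand-in for model.stream(): an async generator
// yielding chunks with a `content` field (assumed shape, for illustration).
async function* fakeStream(): AsyncGenerator<{ content: string }> {
  for (const token of ["Ciao", "!"]) {
    yield { content: token };
  }
}

async function collect(): Promise<string> {
  const chunks: { content: string }[] = [];
  for await (const chunk of fakeStream()) {
    chunks.push(chunk);
  }
  // Concatenate chunk contents to recover the full response text.
  return chunks.map((c) => c.content).join("");
}

collect().then((text) => console.log(text)); // prints "Ciao!"
```

The same collect-and-join step works on real chunks, which is a common way to recover the complete response after streaming it to the user token by token.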

Prompt Templates

  • Prompt templates construct the messages sent to the model from raw user input combined with application logic.
  • They transform that raw input into a prompt ready to pass to a language model.

Creating a Prompt Template

  • Use variables for language and text.
import { ChatPromptTemplate } from "@langchain/core/prompts";
  • Define the system message:
const systemTemplate = "Translate the following from English into {language}";
  • Create the PromptTemplate:
const promptTemplate = ChatPromptTemplate.fromMessages([
  ["system", systemTemplate],
  ["user", "{text}"],
]);
  • Invoke the prompt template:
const promptValue = await promptTemplate.invoke({
  language: "italian",
  text: "hi!",
});
  • Access messages:
promptValue.toChatMessages();
  • Invoke the model:
const response = await model.invoke(promptValue);
console.log(`${response.content}`);
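Conceptually, the template step is plain placeholder substitution: each `{variable}` in a message string is replaced by the value passed to invoke. The sketch below mimics that behavior with a small helper (an illustration only, not LangChain's actual implementation):

```typescript
type MessageSpec = [role: string, template: string];

// Minimal placeholder substitution, mimicking how a template like
// "Translate the following from English into {language}" is filled in.
// Illustrative sketch only -- not LangChain's implementation.
function formatMessages(
  specs: MessageSpec[],
  values: Record<string, string>
): { role: string; content: string }[] {
  return specs.map(([role, template]) => ({
    role,
    content: template.replace(/\{(\w+)\}/g, (_m: string, key: string) => values[key] ?? ""),
  }));
}

const formatted = formatMessages(
  [
    ["system", "Translate the following from English into {language}"],
    ["user", "{text}"],
  ],
  { language: "italian", text: "hi!" }
);
console.log(formatted);
```

Running this produces a system message ending in "into italian" and a user message containing "hi!", mirroring what `promptTemplate.invoke(...)` builds before the model is called.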

Conclusion

  • You learned how to work with language models, create prompt templates, and use LangSmith for observability.
  • To go deeper, explore LangChain's Conceptual Guides and How-to Guides.