Getting Started With Valkey Using JavaScript
Valkey is an open-source alternative to Redis. It's a community-driven, Linux Foundation project created to keep the project available for use and distribution under the open-source Berkeley Software Distribution (BSD) 3-clause license after the Redis license changes.
I think the path to Valkey was well summarised in the project's inaugural blog post.
I will walk through how to use Valkey for JavaScript applications using existing clients in the Redis ecosystem, as well as iovalkey (a friendly fork of ioredis).
Using Valkey With node-redis
node-redis is a popular and widely used client. Here is a simple program that uses the Subscriber component of the PubSub API to subscribe to a channel.
import redis from 'redis';

const client = redis.createClient();
const channelName = 'valkey-channel';

(async () => {
  try {
    await client.connect();
    console.log('Connected to Redis server');

    await client.subscribe(channelName, (message, channel) => {
      console.log(`message "${message}" received from channel "${channel}"`);
    });

    console.log('Waiting for messages...');
  } catch (err) {
    console.error('Error:', err);
  }
})();
To try this with Valkey, let's start an instance using the Valkey Docker image:
docker run --rm -p 6379:6379 valkey/valkey
Also, head here to get an OS-specific distribution, or use Homebrew (on Mac): brew install valkey. You should now be able to use the Valkey CLI (valkey-cli).
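For a quick sanity check, you can ping the server; it should respond with PONG:
valkey-cli ping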
Get the code from the GitHub repo:
git clone https://github.com/abhirockzz/valkey-javascript
cd valkey-javascript
npm install
Start the subscriber app:
node subscriber.js
Publish a message and ensure that the subscriber is able to receive it:
valkey-cli PUBLISH valkey-channel 'hello valkey'
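You should see a log like this in the subscriber application:
message "hello valkey" received from channel "valkey-channel"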
Nice! We were able to write a simple application with an existing Redis client and run it using Valkey (instead of Redis). Sure, this is an oversimplified example, but there were no code changes required.
Use Valkey With ioredis Client
ioredis is another popular client. To be doubly sure, let's try ioredis with Valkey as well. Let's write a publisher application:
import Redis from 'ioredis';

const redisClient = new Redis();
const channelName = 'valkey-channel';

const message = process.argv[2];
if (!message) {
  console.error('Please provide a message to publish.');
  process.exit(1);
}

async function publishMessage() {
  try {
    const receivedCount = await redisClient.publish(channelName, message);
    console.log(`Message "${message}" published to channel "${channelName}". Received by ${receivedCount} subscriber(s).`);
  } catch (err) {
    console.error('Error publishing message:', err);
  } finally {
    // Close the client connection
    await redisClient.quit();
  }
}

publishMessage();
Run the publisher, and confirm that the subscriber app is able to receive it:
node publisher.js 'hello1'
node publisher.js 'hello2'
You should see these logs in the subscriber application:
message "hello1" received from channel "valkey-channel"
message "hello2" received from channel "valkey-channel"
Switch to iovalkey Client
As mentioned, iovalkey is a fork of ioredis. I made the following changes to port the publisher code to use iovalkey:
- Commented out import Redis from 'ioredis';
- Added import Redis from 'iovalkey';
- Installed iovalkey: npm install iovalkey
Here is the updated version — yes, this was all I needed to change (at least for this simple application):
// import Redis from 'ioredis';
import Redis from 'iovalkey';
Run the new iovalkey-based publisher, and confirm that the subscriber is able to receive it:
node publisher.js 'hello from iovalkey'
You should see these logs in the subscriber application:
message "hello from iovalkey" received from channel "valkey-channel"
Awesome, this is going well. We are ready to sprinkle some generative AI now!
Use Valkey With LangChainJS
Along with Python, JavaScript/TypeScript is also being used in the generative AI ecosystem. LangChain is a popular framework for developing applications powered by large language models (LLMs). LangChain has JS/TS support in the form of LangchainJS.
Having worked a lot with the Go port (langchaingo), as well as Python, I wanted to try LangchainJS.
One of the common use cases is to use Redis as a chat history component in generative AI apps. LangchainJS has this built-in, so let's try it out with Valkey.
Using Valkey as Chat History in LangChain
To install LangchainJS:
npm install langchain
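The chat app below also imports a few additional packages (@langchain/community for the Bedrock chat model, @langchain/redis for the Redis-backed chat history, @langchain/core for prompt templates, and prompt for reading console input), so you will likely need to install those as well:
npm install @langchain/community @langchain/redis @langchain/core prompt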
For the LLM, I will be using Amazon Bedrock (it's supported natively with LangchainJS), but feel free to use others.
For Amazon Bedrock, you will need to configure and set up Amazon Bedrock, including requesting access to the Foundation Model(s).
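One way to provide credentials (assuming the default AWS credential provider chain) is via environment variables before running the app:
export AWS_ACCESS_KEY_ID=<your access key>
export AWS_SECRET_ACCESS_KEY=<your secret key>
export AWS_REGION=us-east-1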
Here is the chat application. As you can see, it uses the RedisChatMessageHistory component.
import { BedrockChat } from "@langchain/community/chat_models/bedrock";
import { RedisChatMessageHistory } from "@langchain/redis";
import { ConversationChain } from "langchain/chains";
import { BufferMemory } from "langchain/memory";
import prompt from "prompt";
import {
  ChatPromptTemplate,
  MessagesPlaceholder,
} from "@langchain/core/prompts";

const chatPrompt = ChatPromptTemplate.fromMessages([
  [
    "system",
    "The following is a friendly conversation between a human and an AI.",
  ],
  new MessagesPlaceholder("chat_history"),
  ["human", "{input}"],
]);

const memory = new BufferMemory({
  chatHistory: new RedisChatMessageHistory({
    sessionId: new Date().toISOString(),
    sessionTTL: 300,
    host: "localhost",
    port: 6379,
  }),
  returnMessages: true,
  memoryKey: "chat_history",
});

const model = "anthropic.claude-3-sonnet-20240229-v1:0";
const region = "us-east-1";

const langchainBedrockChatModel = new BedrockChat({
  model: model,
  region: region,
  modelKwargs: {
    anthropic_version: "bedrock-2023-05-31",
  },
});

const chain = new ConversationChain({
  llm: langchainBedrockChatModel,
  memory: memory,
  prompt: chatPrompt,
});

// Keep prompting for user input and send it through the conversation chain
while (true) {
  prompt.start({ noHandleSIGINT: true });
  const { message } = await prompt.get(['message']);
  const response = await chain.invoke({
    input: message,
  });
  console.log(response);
}
Run the application:
node chat.js
Start a conversation:
If you peek into Valkey, notice that the conversations are saved in a List:
valkey-cli keys *
valkey-cli LRANGE <enter list name> 0 -1
Don't run keys * in production; it's just for demo purposes.
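In a real deployment, you would typically iterate keys incrementally with SCAN instead, for example:
valkey-cli --scan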
Using iovalkey Implementation for Chat History
The current implementation uses the node-redis client, but I wanted to try out the iovalkey client. I am not a JS/TS expert, but it was simple enough to port the existing implementation. You can refer to the code on GitHub.
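To give you an idea of what that port involves, here is a minimal sketch of a ValkeyChatMessageHistory class, assuming messages are stored as JSON entries in a List keyed by session ID (an illustrative sketch, not the exact code from the repo):

// valkey_chat_history.js (illustrative sketch)
import Redis from 'iovalkey';
import { BaseListChatMessageHistory } from "@langchain/core/chat_history";
import {
  mapChatMessagesToStoredMessages,
  mapStoredMessagesToChatMessages,
} from "@langchain/core/messages";

export class ValkeyChatMessageHistory extends BaseListChatMessageHistory {
  lc_namespace = ["langchain", "stores", "message", "valkey"];

  constructor({ sessionId, sessionTTL, host, port }) {
    super();
    this.sessionId = sessionId;
    this.sessionTTL = sessionTTL;
    this.client = new Redis({ host, port });
  }

  // Read all stored entries for this session and convert them back to chat messages
  async getMessages() {
    const rawMessages = await this.client.lrange(this.sessionId, 0, -1);
    const storedMessages = rawMessages.map((item) => JSON.parse(item));
    return mapStoredMessagesToChatMessages(storedMessages);
  }

  // Append a message to the session's List and refresh the TTL
  async addMessage(message) {
    const [storedMessage] = mapChatMessagesToStoredMessages([message]);
    await this.client.rpush(this.sessionId, JSON.stringify(storedMessage));
    if (this.sessionTTL) {
      await this.client.expire(this.sessionId, this.sessionTTL);
    }
  }

  // Delete the session's history
  async clear() {
    await this.client.del(this.sessionId);
  }
}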
As far as the client (chat) app is concerned, I only had to make a few changes to switch the implementation (see the snippet after this list):
- Comment out import { RedisChatMessageHistory } from "@langchain/redis";
- Add import { ValkeyChatMessageHistory } from "./valkey_chat_history.js";
- Replace RedisChatMessageHistory with ValkeyChatMessageHistory (while creating the memory instance)
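With those changes in place, the memory setup looks roughly like this (same options as before):

// import { RedisChatMessageHistory } from "@langchain/redis";
import { ValkeyChatMessageHistory } from "./valkey_chat_history.js";

const memory = new BufferMemory({
  chatHistory: new ValkeyChatMessageHistory({
    sessionId: new Date().toISOString(),
    sessionTTL: 300,
    host: "localhost",
    port: 6379,
  }),
  returnMessages: true,
  memoryKey: "chat_history",
});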
It worked the same way as above. Feel free to give it a try!
Wrapping Up
It's still early days for Valkey (at the time of writing), and there is a long way to go. I'm interested in how the project evolves, and also in the client ecosystem for Valkey.
Happy Building!