Aiven Part 2: Redis and AWS Cloudwatch Integrations

Published: Nov 29, 2022 by Isaac Johnson

In our first post we explored Aiven.io to create and manage a hosted Kafka cluster as well as a MirrorMaker instance. We explored sending metrics to Datadog and using MirrorMaker to sync with an on-prem cluster as well as Azure Event Hub.

Today we will explore using Aiven for Redis and sending logs from Kafka and Redis to GCP Cloud Logging. We’ll then look at AWS CloudWatch for both Metrics and Logs.

Aiven Redis

The first step is to go into services and create a Redis Service

/content/images/2022/11/aiventwo-01.png

Our next step will be to try and connect via a Redis CLI

Since I really don’t care to install a local redis server, I’ll just build from source:

builder@DESKTOP-72D2D9T:~/Workspaces$ git clone https://github.com/redis/redis.git
Cloning into 'redis'...
remote: Enumerating objects: 85618, done.
remote: Counting objects: 100% (31/31), done.
remote: Compressing objects: 100% (31/31), done.
remote: Total 85618 (delta 9), reused 9 (delta 0), pack-reused 85587
Receiving objects: 100% (85618/85618), 121.71 MiB | 7.64 MiB/s, done.
Resolving deltas: 100% (62172/62172), done.
builder@DESKTOP-72D2D9T:~/Workspaces$ cd redis/
builder@DESKTOP-72D2D9T:~/Workspaces/redis$ git checkout 3.0
Branch '3.0' set up to track remote branch '3.0' from 'origin'.
Switched to a new branch '3.0'
builder@DESKTOP-72D2D9T:~/Workspaces/redis$ make redis-cli
cd src && make redis-cli
make[1]: Entering directory '/home/builder/Workspaces/redis/src'
... snip

Then fix the perms and link it. We can use ‘-v’ to test

builder@DESKTOP-72D2D9T:~/Workspaces/redis$ chmod 755 src/redis-cli
builder@DESKTOP-72D2D9T:~/Workspaces/redis$ ln -s /home/builder/Workspaces/redis/src/redis-cli /usr/local/bin/redis-cli
builder@DESKTOP-72D2D9T:~/Workspaces/redis$ redis-cli -v
redis-cli 3.0.7 (git:48e24d54)

I actually ran into trouble connecting with the 3.0 and 6.0.16 versions of redis-cli, but the latest 7.x seems to work fine

builder@DESKTOP-72D2D9T:~/Workspaces/redis$ redis-cli -u 'rediss://default:AVNS_yvSws8sfsXLZd9ywo8o@redis-3a248ca5-isaac-1040.aivencloud.com:11997'
Warning: Using a password with '-a' or '-u' option on the command line interface may not be safe.
redis-3a248ca5-isaac-1040.aivencloud.com:11997>

Testing

builder@DESKTOP-72D2D9T:~/Workspaces/redis$ redis-cli -u 'rediss://default:AVNS_yvSws8sfsXLZd9ywo8o@redis-3a248ca5-isaac-1040.aivencloud.com:11997'
Warning: Using a password with '-a' or '-u' option on the command line interface may not be safe.
redis-3a248ca5-isaac-1040.aivencloud.com:11997> set TEST VAL1234
OK
(3.96s)
redis-3a248ca5-isaac-1040.aivencloud.com:11997> get TEST
"VAL1234"
(2.10s)
redis-3a248ca5-isaac-1040.aivencloud.com:11997> exit
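For scripting, redis-cli can also run commands non-interactively. A quick sketch against the same hosted service — the URL mirrors the endpoint above with the password elided, so substitute your own, and note it requires a TLS-capable redis-cli (7.x, or 6.x built with TLS):

```shell
# Non-interactive smoke tests against the Aiven Redis service.
# <password> is a placeholder -- use the service password from the Aiven console.
REDIS_URL='rediss://default:<password>@redis-3a248ca5-isaac-1040.aivencloud.com:11997'

redis-cli -u "$REDIS_URL" SETEX session:123 60 'some-value'   # key auto-expires in 60s
redis-cli -u "$REDIS_URL" TTL session:123                     # seconds remaining
redis-cli -u "$REDIS_URL" DEL session:123
```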

Next, I’ll send Metrics to Datadog

/content/images/2022/11/aiventwo-02.png

GCP Logging

We need to ensure the Logging API is enabled

/content/images/2022/11/aiventwo-03.png

Then create a service account

/content/images/2022/11/aiventwo-04.png

This is likely overkill, but I’m going to add the Logging Admin and Editor roles. I really want roles/logging.logWriter and roles/editor, but the GCP UI is rather lacking

/content/images/2022/11/aiventwo-05.png

I’ll want to get the service key JSON next

/content/images/2022/11/aiventwo-06.png

from there I can create a new key

/content/images/2022/11/aiventwo-07.png

choose JSON

/content/images/2022/11/aiventwo-08.png

This then downloads it locally. We can now use that in the Aiven.io Integrations setup window

/content/images/2022/11/aiventwo-09.png
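If you prefer the CLI to clicking through the console, the service account, role grant, and key download can be sketched with gcloud — `my-project` and `aiven-logger` below are placeholder names, and this narrows the grant to just roles/logging.logWriter:

```shell
# Create the service account that Aiven will use to write logs
gcloud iam service-accounts create aiven-logger \
  --project my-project \
  --display-name "Aiven log writer"

# Grant only the log-writer role (narrower than Logging Admin + Editor)
gcloud projects add-iam-policy-binding my-project \
  --member "serviceAccount:aiven-logger@my-project.iam.gserviceaccount.com" \
  --role "roles/logging.logWriter"

# Create and download the JSON key that the Aiven integration setup needs
gcloud iam service-accounts keys create aiven-logger-key.json \
  --iam-account aiven-logger@my-project.iam.gserviceaccount.com
```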

We can now pick that in our Integrations section of the service

/content/images/2022/11/aiventwo-10.png

We can now see we are sending Logs to GCP and Metrics to DD for Redis

/content/images/2022/11/aiventwo-11.png

and I can do the same for Kafka

/content/images/2022/11/aiventwo-12.png

We can almost immediately see the logs show up in StackDriver

/content/images/2022/11/aiventwo-13.png

We can see an example of the kind of data being sent into GCP Logging

/content/images/2022/11/aiventwo-14.png

One thing you may want to do to avoid costs building up is to set a log storage retention policy.

We can go to Logs Storage in Logging (the popup upselling advert is GCP’s, not mine)

/content/images/2022/11/aiventwo-15.png

Then we can “edit bucket” from the 3-dot menu

/content/images/2022/11/aiventwo-16.png

and set a value lower if you need to

/content/images/2022/11/aiventwo-17.png
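The same retention change can be made from the gcloud CLI — a sketch assuming the default `_Default` bucket in the global location and a 30-day target:

```shell
# Lower log retention on the default bucket to cap storage costs
gcloud logging buckets update _Default \
  --location=global \
  --retention-days=30
```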

We can also create a Sink from GCP Logging if we want to send to pub/sub (to send to Datadog) or directly into Splunk

/content/images/2022/11/aiventwo-18.png

Scaling a service

We saw how to create a service in the UI, but what if we want to scale up or down our instance?

We just need to change the plan

/content/images/2022/11/aiventwo-19.png

For instance, I can move from the US$200/mo Redis plan down to the smallest Business plan at US$70/mo

/content/images/2022/11/aiventwo-20.png

The Redis plans, at the moment, start at US$19/mo

/content/images/2022/11/aiventwo-21.png

Up to $8150/mo for the largest in GCP

/content/images/2022/11/aiventwo-22.png

And again, these prices are listed per month, and I believe (but have not confirmed) that billing is actually hourly. Thus, if we wanted to fire up the largest Redis plan for a test run, an hour would cost roughly US$11.16, which isn’t bad for testing a system under load.
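A quick back-of-the-envelope check of that hourly figure, assuming hourly billing and an average month of roughly 730 hours (365 days / 12):

```shell
# US$8150/mo divided by ~730 hours in an average month
monthly=8150
awk -v m="$monthly" 'BEGIN { printf "US$%.2f/hr\n", m / 730 }'
# -> US$11.16/hr
```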

Cloudwatch logs

Before we can send logs to Cloudwatch, we need an IAM user set up

In AWS IAM, I’ll create a user with only programmatic access

/content/images/2022/11/aiventwo-23.png

The policies you pick largely depend on what you plan to do with this IAM user. If you want to just send logs, CloudWatchLogsFullAccess would do it. If you want Metrics, use CloudWatchEventsFullAccess. If you think you may want both, just use CloudWatchFullAccess.

/content/images/2022/11/aiventwo-24.png

I’ll just use all three and create my user

/content/images/2022/11/aiventwo-25.png

You can then get your IAM ID and Key

/content/images/2022/11/aiventwo-26.png
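The same IAM setup can be sketched with the AWS CLI — `aiven-cw` is a placeholder user name, and this assumes the CLI is already configured with credentials that can manage IAM:

```shell
# Create a programmatic-access-only user for Aiven
aws iam create-user --user-name aiven-cw

# Attach the broad CloudWatch policy (covers both Logs and Metrics)
aws iam attach-user-policy --user-name aiven-cw \
  --policy-arn arn:aws:iam::aws:policy/CloudWatchFullAccess

# Generate the AccessKeyId / SecretAccessKey pair that Aiven needs
aws iam create-access-key --user-name aiven-cw
```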

Lastly, you’ll want to determine the region. For no particular reason, I tend to use us-east-1

/content/images/2022/11/aiventwo-27.png

Back in Aiven.io, we’ll add the AWS Cloudwatch Logs Endpoint

/content/images/2022/11/aiventwo-28.png

Since I’m here, I’ll also handle the CW Metrics

/content/images/2022/11/aiventwo-29.png

I’ll pick a service, such as Kafka, and click Manage Integrations

/content/images/2022/11/aiventwo-30.png

I’ll then add Cloudwatch Logs and Metrics

/content/images/2022/11/aiventwo-31.png

When adding Cloudwatch Metrics, you can choose which Metrics you want to send to AWS Cloudwatch (by default, all are enabled)

/content/images/2022/11/aiventwo-32.png

Immediately, I saw Aiven logs appear as a new Group in AWS

/content/images/2022/11/aiventwo-33.png

You’ll see that log streams have been added retroactively for past logs - not just logs going forward

/content/images/2022/11/aiventwo-34.png

I can explore one to see the messages being delivered

/content/images/2022/11/aiventwo-35.png

That means I can use Logs Insights to parse the logs and find certain types of messages

/content/images/2022/11/aiventwo-36.png
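The same kind of query can be run from the CLI. A sketch, where `aiven-kafka-logs` is a placeholder for the log group Aiven created in your account (and `date -d` assumes GNU date):

```shell
# Kick off a Logs Insights query for ERROR lines in the last hour
QUERY_ID=$(aws logs start-query \
  --log-group-name "aiven-kafka-logs" \
  --start-time "$(date -d '1 hour ago' +%s)" \
  --end-time "$(date +%s)" \
  --query-string 'fields @timestamp, @message | filter @message like /ERROR/ | sort @timestamp desc | limit 20' \
  --query queryId --output text)

# Queries run asynchronously; poll for results
aws logs get-query-results --query-id "$QUERY_ID"
```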

As always, I like to point out that logs stored indefinitely can really start to add up in cost over time. I recommend going to the retention settings:

/content/images/2022/11/aiventwo-37.png

and reducing to the minimum duration you care about

/content/images/2022/11/aiventwo-38.png
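Retention can also be set from the CLI — again, `aiven-kafka-logs` is a placeholder for your actual log group name:

```shell
# Keep only the last 7 days of logs in this group
aws logs put-retention-policy \
  --log-group-name "aiven-kafka-logs" \
  --retention-in-days 7
```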

AWS CloudWatch Metrics

As you’ll recall, we sent Metrics to AWS along with the Logs

We can see those in CloudWatch Metrics

/content/images/2022/11/aiventwo-39.png

With Metrics, I can create graphs, such as bytes received by Kafka

/content/images/2022/11/aiventwo-40.png
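The same data is reachable from the CLI. I don’t know offhand which namespace Aiven publishes under, so the sketch below discovers it first; `<aiven-namespace>` and `<metric>` are placeholders to fill in from that listing:

```shell
# Discover the namespaces and metric names Aiven pushed into CloudWatch
aws cloudwatch list-metrics --output table

# Then pull datapoints for one of them, e.g. averaged over 5-minute windows
aws cloudwatch get-metric-statistics \
  --namespace "<aiven-namespace>" \
  --metric-name "<metric>" \
  --start-time "$(date -u -d '1 hour ago' +%Y-%m-%dT%H:%M:%SZ)" \
  --end-time "$(date -u +%Y-%m-%dT%H:%M:%SZ)" \
  --period 300 --statistics Average
```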

Perhaps tweaking the style or adding annotations

/content/images/2022/11/aiventwo-41.png

In time, we can see plenty of data gathered from Aiven

/content/images/2022/11/aiventwo-98.png

Summary

Today we looked at setting up a hosted Redis instance in Aiven.io. We set up an IAM user and showed how to configure AWS CloudWatch for Metrics and Logs. We demonstrated integrating Aiven.io logs and metrics into AWS and also showed how to minimize costs.

We can now use CloudWatch with SNS to trigger alerts or Lambdas.

In our next blog, we’ll wrap up the series by looking at Grafana and InfluxDB for monitors and alerts.

aiven redis aws cloudwatch


Isaac Johnson

Cloud Solutions Architect

Isaac is a CSA and DevOps engineer who focuses on cloud migrations and devops processes. He also is a dad to three wonderful daughters (hence the references to Princess King sprinkled throughout the blog).
