How we achieved a 5X performance gain in real-time email tracking with #GOLANG #REDIS #LUA


At Pepipost, we help customers stay in complete control of their email program and track email events like sends, opens, clicks, bounces, and spam complaints in real time.

Real-time data provides marketers with a goldmine of information.

If you can track this data in real time, you can make changes that improve deliverability and the overall performance of your email program.

Every day we handle around 20k–50k email events per second. That’s huge: all of these events are processed in real time, every single second. We used the magical combination of Redis and Perl to deal with these complex data structures, and it seemed to work just fine thanks to Perl’s high text-processing efficiency, powerful regular expressions, fast development, and easy-to-learn functionality.

Redis is an open source (BSD licensed), in-memory data structure store, used as a database, cache, and message broker. It supports lists, hashes, sets, etc. You can get more info at Redis.io.

We were getting satisfactory performance with Perl + Redis.

But, can we optimize further?

As developers, we are always looking for new ways to improve performance and functionality. One of the trickiest problems we face is choosing the right algorithms and data structures to build speed into the project and optimize performance.

Since every use case calls for a different approach and not every technology fits every use case, we decided to evaluate newer technologies and languages, and found that the #Go + #Redis + #Lua combination gives great performance. We ran a benchmark on a small part of the whole system.

Let’s have a look at the below flow.

Old Flow:

1. A Perl daemon continuously fetches events one at a time using BLPOP from a Redis queue (named ‘pepi_queue’); a sketch of this loop follows the list.
2. It does some data processing.
3. It inserts the records into MySQL.
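
For comparison, here is a minimal sketch of that one-event-at-a-time loop. The production daemon was Perl, but to keep a single language in this post we sketch it in Go with go-redis (v6-style API assumed; names and addresses are placeholders):

package main

import "github.com/go-redis/redis"

func main() {
	rd := redis.NewClient(&redis.Options{Addr: "localhost:6379"})

	for {
		// BLPOP blocks until a single event is available,
		// then returns a [key, value] pair.
		res, err := rd.BLPop(0, "pepi_queue").Result()
		if err != nil {
			continue
		}
		event := res[1]
		// ... process the single event and insert it into MySQL ...
		_ = event
	}
}

Note that each event costs one full round trip to Redis; this is exactly the cost the Lua batch script below removes.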

Benchmark 1 : PERL + REDIS

Processes: 10
Number of Events: 1 million
Time: 50 sec

Above is the old flow with the old benchmark. After that, we tried something new, based on our learnings.

PART 1

1. Why Lua?

Lua is a lightweight, multi-paradigm programming language designed primarily for embedded use in applications. You can find more info at Lua.org.

Why did we choose Lua? It lets you create your own scripted extensions to the Redis database: Redis can execute Lua scripts directly, and you can run a cached script with the Redis command EVALSHA.

2. Build REDIS-LUA script extension:

We have created our own scripted extension to the Redis using Lua script.

File: lrange.lua

This is a sample Lua script file. It runs two Redis commands, LRANGE and LTRIM, which are plain Redis queries.

-- Fetch the first KEYS[2] elements of the list KEYS[1]
local result = redis.call('lrange',KEYS[1],0,KEYS[2]-1)
-- Remove those elements, keeping the rest of the list
redis.call('ltrim',KEYS[1],KEYS[2],-1)
-- Return the fetched batch to the caller
return result

Here, the redis.call() function is used to invoke a Redis command from within Lua.

A Redis Lua script receives two tables, KEYS and ARGV, of which we use only KEYS here. You can explore Lua KEYS further at https://redis.io/commands/eval.

Evaluate/Compile Lua script with EVAL command:

redis-cli --eval lrange.lua pepi_queue 100

It will give you records from pepi_queue, or throw an error if something’s wrong. Here, pepi_queue is KEYS[1] and 100 is KEYS[2]; note that the Lua KEYS index starts from 1. When we use a Lua extension, we don’t need to use EVAL every time: you store your script in Redis once and then run it with the EVALSHA command.
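
The --eval flag is a redis-cli convenience that reads the script from a file. The underlying EVAL command takes the script body, then the number of keys, then the keys themselves, which is the same shape EVALSHA uses below:

redis-cli eval "$(cat lrange.lua)" 2 pepi_queue 100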

Cache/Store script in Redis with SCRIPT LOAD command:

Load your Lua script into Redis; Redis will cache the script in memory. Note that SCRIPT LOAD takes the script text itself, not a file name:

redis-cli script load "$(cat lrange.lua)"

It will return the sha1 digest of the script, like this: 785c5ff1ad6fcc0c3f5387e0a951765d2d644a22.

The script stays in the script cache until the server restarts or you remove it. You can remove scripts from the cache by calling SCRIPT FLUSH; note that this command flushes all scripts stored in Redis.
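
If you need to verify that the script is still cached (for example, after a restart), SCRIPT EXISTS returns 1 for each sha1 digest that is in the cache and 0 otherwise:

redis-cli script exists 785c5ff1ad6fcc0c3f5387e0a951765d2d644a22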

Use EVALSHA command:

redis-cli evalsha 785c5ff1ad6fcc0c3f5387e0a951765d2d644a22 2 'pepi_queue' 100

Here, the number of keys we provide is 2, i.e., ‘pepi_queue’ and 100. ‘pepi_queue’ is the list name, and 100 is the batch size: how many records we want to extract.

With the above command, we extract 100 records at a time from ‘pepi_queue’. Looking back at lrange.lua: it performs LRANGE first, which returns a list of 100 records into the ‘result’ variable; it then performs LTRIM, which trims those same 100 records off the list; finally, it returns the ‘result’ variable holding the 100 records.
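
To see the script in action, here’s a tiny worked example against a throwaway list (demo_queue is just an illustrative name; we assume lrange.lua was loaded as above):

redis-cli rpush demo_queue a b c d e
redis-cli evalsha 785c5ff1ad6fcc0c3f5387e0a951765d2d644a22 2 demo_queue 3

The single evalsha call returns a, b, and c, and leaves d and e on the list.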

How the new approach helps:

Redis guarantees that a script is executed atomically: no other script or Redis command runs while the script is executing, so there’s no chance of data loss. In our scenario we get 100 records with a single Redis command, whereas in the old approach we needed 100 BLPOP commands to fetch the same data!

You can refer to https://redis.io/commands/eval for more info. While at it, I also recommend going through this nice article I found, which explains everything.

We have successfully created our own scripted extension. Now for the next part.

PART 2

1. Why Go?

Go offers efficient concurrency, on par with languages like Java, C, and C++; concurrency in Go is well explained by Rob Pike (watch the video here). It is syntactically easy: we can launch goroutines with the keyword “go”. Go automatically performs memory management and also provides built-in testing and profiling frameworks. You can check here for more advantages of Go.
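
As a quick, self-contained illustration (not part of our pipeline), launching concurrent work takes one keyword, and a channel carries the result back:

package main

import "fmt"

func main() {
	done := make(chan string)

	// The "go" keyword runs this function on its own goroutine.
	go func() {
		done <- "processed"
	}()

	// Receiving from the channel blocks until the goroutine sends.
	fmt.Println(<-done)
}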

2. GOROUTINE Approach:

We converted the old flow to a goroutine approach in GOLANG; here’s how –

1. We used the go-redis (https://godoc.org/github.com/go-redis/redis) client for Redis, with the EVALSHA command to fetch data in batches.

// Both keys are passed as strings: the list name and the batch size.
qbatch := []string{"pepi_queue", "100"}
records := rd.EvalSha("785c5ff1ad6fcc0c3f5387e0a951765d2d644a22", qbatch).Val()

Here “rd” is the Redis connection; a minimal sketch of creating it follows.
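
In sketch form, assuming the go-redis v6-style API (the address is a placeholder), the client can be created like this:

import "github.com/go-redis/redis"

// Create the "rd" client used in the snippets above.
rd := redis.NewClient(&redis.Options{
	Addr: "localhost:6379",
})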

2. We open a goroutine that does the data processing and sends the batched insert query to the ‘batchInsert’ channel. (See the Go documentation for more on goroutines and channels.)

go func(redisBatchRecords []string) {

	// ... some data processing on the batch ...

	// Build one multi-row INSERT for the whole batch (values shown are samples).
	q := "insert into pepi_event (id, event_name) values (1,'sent'),(2,'clicked'),(3,'open'),(4,'bounce')"

	batchInsert <- q

}(redisBatchRecords)

The whole process runs asynchronously.

3. We open one goroutine dedicated to listening on the batchInsert channel; it simply executes each query and inserts the data into MySQL.

func BatchInsertTagData(db *sql.DB, batchInsert chan string) {
    for {
        // Block until a batched query arrives, then execute it against MySQL.
        q := <-batchInsert
        db.Exec(q)
    }
}

All three steps above run concurrently, without interdependencies and without waiting for one another.

So in PART 2 we used goroutines, and all three processes run concurrently; a minimal end-to-end sketch of the wiring is below.
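
For completeness, here is a minimal, self-contained sketch of how the three pieces can be wired together. It assumes a go-redis v6-style client and a local MySQL instance; the DSN, channel size, and the processing step are placeholders, not our production code:

package main

import (
	"database/sql"
	"log"

	"github.com/go-redis/redis"
	_ "github.com/go-sql-driver/mysql"
)

const lrangeSHA = "785c5ff1ad6fcc0c3f5387e0a951765d2d644a22"

// Step 3: one dedicated goroutine draining the batchInsert channel into MySQL.
func BatchInsertTagData(db *sql.DB, batchInsert chan string) {
	for {
		q := <-batchInsert
		db.Exec(q)
	}
}

func main() {
	rd := redis.NewClient(&redis.Options{Addr: "localhost:6379"})

	// Placeholder DSN; adjust for your environment.
	db, err := sql.Open("mysql", "user:pass@tcp(localhost:3306)/pepi")
	if err != nil {
		log.Fatal(err)
	}

	batchInsert := make(chan string, 100)
	go BatchInsertTagData(db, batchInsert)

	for {
		// Step 1: atomically pull a batch of up to 100 events via the cached script.
		records, err := rd.EvalSha(lrangeSHA, []string{"pepi_queue", "100"}).Result()
		if err != nil {
			// Production code would back off here (and fall back to EVAL on NOSCRIPT).
			log.Println("evalsha:", err)
			continue
		}

		// Step 2: process each batch on its own goroutine and hand off the query.
		go func(batch interface{}) {
			// ... build the multi-row INSERT from the batch (processing elided) ...
			q := "insert into pepi_event (id, event_name) values (1,'sent'),(2,'clicked')"
			batchInsert <- q
		}(records)
	}
}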

Benchmark 2 : GO + REDIS + LUA

Goroutines: 10
Number of Events: 1 million
Time: 10 sec

Concluding…

This is just one example of how we’ve handled a huge volume of incoming requests efficiently and improved performance; there is a long workflow behind the curtain. While we’ve used this combination for processing email events, it finds wide application in other areas as well. That said, every application behaves differently and calls for a different approach. The one above proved perfect for our use case, but that doesn’t necessarily mean it will fit all cases, and note that this is not about #GO versus #PERL. Every technology is built for particular use cases; you need to evaluate which one is right for your application.

Have you worked on any such interesting approaches and technologies? I am eager to know; please share in the comments section.
