How we achieved 5X performance of real-time email tracking with #GOLANG #REDIS #LUA

Posted under Developer on May 30, 2018

At Pepipost, we help customers stay in complete control of their email program and track email events like sent, opens, clicks, bounces and spam in real-time.

Real-time data provides marketers with a goldmine of information.

If you can track this data in real time, you may be able to make changes that improve deliverability and the overall performance of your email program.

Every day we deal with around 20k–50k email events per second, and all of them have to be processed in real time. We use the magical combination of Redis and Perl to handle these complex data structures, and it seemed to work just fine, thanks to Perl's high text-processing efficiency, powerful regular expressions, fast development and easy-to-learn functionality.

Redis is an open source (BSD licensed), in-memory data structure store, used as a database, cache and message broker. It supports data structures such as lists, hashes and sets. You can get more info on the Redis website.

We are getting satisfactory performance with PERL+REDIS.

But, can we optimize further?

As developers, we are always looking for new ways to improve performance and functionality. One of the trickiest problems we face is in choosing the right algorithms and data structures to build speed into the project and optimise performance.

Since every use case demands a different approach and not all technologies fit every use case, we decided to evaluate newer technologies and languages, and found that the #Go #Redis #Lua combination gives great performance. We ran the benchmark on a small part of the whole system.

Let's have a look at the flow below.

Old Flow:

1. A Perl daemon continuously fetches data using BLPOP from a Redis queue (named ‘pepi_queue’), as sketched below.
2. It does some data processing.
3. It inserts the records into MySQL.
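
For illustration only, here is a minimal sketch of that per-event pattern. The original daemon was written in Perl; this sketch uses the go-redis client instead so it is directly comparable with the new flow shown later, and processEvent and insertIntoMySQL are hypothetical placeholders for the real processing and MySQL insert.

package main

import (
    "github.com/go-redis/redis"
)

func main() {
    rd := redis.NewClient(&redis.Options{Addr: "localhost:6379"})

    for {
        // BLPOP blocks until one element is available, so every single
        // event costs at least one round trip to Redis
        res, err := rd.BLPop(0, "pepi_queue").Result()
        if err != nil {
            continue
        }
        event := res[1] // res[0] is the list name, res[1] is the value

        processEvent(event)    // placeholder for "some data processing"
        insertIntoMySQL(event) // placeholder for the per-event MySQL insert
    }
}

func processEvent(e string)    {}
func insertIntoMySQL(e string) {}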

Benchmark 1 : PERL + REDIS

Processes: 10
Number of Events: 1 million
Time: 50 sec

Above is the old flow with its benchmark. After that, we tried something new based on our learnings.

PART 1

1. Why Lua?

Lua is a lightweight, multi-paradigm programming language designed primarily for embedded use in applications. You can find more info on the official Lua website.

The reason we chose Lua is that it lets you create your own scripted extensions to the Redis database; in other words, Redis can execute Lua scripts. You can use Redis's EVALSHA command to execute a cached Lua script.

2. Build REDIS-LUA script extension:

We created our own scripted extension to Redis using a Lua script.

File: lrange.lua

This is a sample Lua script. It uses two Redis commands, LRANGE and LTRIM:

-- fetch the first KEYS[2] records from the list KEYS[1]
local result = redis.call('lrange',KEYS[1],0,KEYS[2]-1)
-- remove those records from the list, keeping the rest
redis.call('ltrim',KEYS[1],KEYS[2],-1)
return result

Here, the redis.call() function is used to invoke Redis commands from within the script.

A Lua script in Redis receives two tables, KEYS and ARGV; here we only use KEYS. You can explore more about KEYS in the EVAL documentation (https://redis.io/commands/eval).

Evaluate/Compile Lua script with EVAL command:

redis-cli --eval lrange.lua pepi_queue 100

It will return records from pepi_queue, or throw an error if something is wrong. Here pepi_queue is treated as KEYS[1] and 100 as KEYS[2]; note that the Lua KEYS index starts from 1. When we use a Lua extension, we don't need to use EVAL every time: you just need to store your script in Redis once and start using it with the EVALSHA command.

Cache/Store script in Redis with SCRIPT LOAD command:

Load your Lua script into Redis; Redis will cache the script in memory. Note that SCRIPT LOAD takes the script body, not a filename:

redis-cli script load "$(cat lrange.lua)"

It will return the SHA1 digest of the script, like this: 785c5ff1ad6fcc0c3f5387e0a951765d2d644a22.

The script is guaranteed to stay in the script cache forever. You can remove it from the cache by calling SCRIPT FLUSH, which flushes all scripts stored in Redis.

Use EVALSHA command:

redis-cli evalsha 785c5ff1ad6fcc0c3f5387e0a951765d2d644a22 2 'pepi_queue' 100

Here, the number of keys we are providing is 2, i.e. ‘pepi_queue’ and 100. ‘pepi_queue’ is the list name and 100 is the batch size, i.e. how many records we want to extract at a time.

With the above command, we extract 100 records at a time from ‘pepi_queue’. Looking at the lrange.lua file, it first performs LRANGE, which returns a list of 100 records stored in the ‘result’ variable. It then performs LTRIM, which trims those same 100 records from the list, and finally returns ‘result’ with the 100 records.

How the new approach helps:

Redis guarantees that a script is executed atomically: no other script or Redis command will run while the script is executing, so there is no chance of data loss. In our scenario we get 100 records in a single Redis command, whereas in the old approach we needed 100 BLPOP commands to fetch the same data!

You can refer to https://redis.io/commands/eval for more info.

So we're done with the first part, in which we successfully created our own scripted extension. Now for the next part.

PART 2

1. Why Go?

Go has efficient concurrency support, comparable to languages like Java, C and C++. Concurrency is well explained by Rob Pike in his talks. Syntactically Go is easy: we can open goroutines with the keyword “go”. Go has garbage collection, which performs memory management automatically, and it also provides a built-in testing and profiling framework. There are many more advantages of Go beyond these.
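
As a quick primer on the two concurrency primitives used below (this example is ours, not part of the original flow): a goroutine is started with the go keyword, and a channel passes data between goroutines.

package main

import "fmt"

func main() {
    results := make(chan string)

    // the "go" keyword starts a goroutine that runs concurrently with main
    go func() {
        results <- "processed event" // send the result over the channel
    }()

    // receiving from the channel blocks until the goroutine has sent
    fmt.Println(<-results)
}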

2. GOROUTINE Approach:

We converted the old flow to a goroutine-based approach in Golang. Here's how:

1. We used the go-redis client (https://godoc.org/github.com/go-redis/redis) for Redis, and its EvalSha command to fetch data in batches.

// both KEYS go in the keys slice: the list name and the batch size (as strings)
var qbatch = []string{"pepi_queue", "100"}
records := rd.EvalSha("785c5ff1ad6fcc0c3f5387e0a951765d2d644a22", qbatch).Val()

Here “rd” is the Redis connection; a minimal sketch of how such a go-redis connection can be created is shown below.
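
This is a minimal sketch, assuming a go-redis (v6-style) API and a local Redis instance; the address and options are illustrative, not our production settings.

package main

import (
    "log"

    "github.com/go-redis/redis"
)

func main() {
    // "rd" is the Redis connection used with EvalSha above;
    // host, port and DB are illustrative values
    rd := redis.NewClient(&redis.Options{
        Addr: "localhost:6379",
        DB:   0,
    })

    if err := rd.Ping().Err(); err != nil {
        log.Fatalf("cannot connect to redis: %v", err)
    }
    log.Println("connected to redis")
}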

2. We opened a goroutine which does the data processing and sends the batch insert query to the ‘batchInsert’ channel. You can read up more on goroutines and channels.

go func(redis_batch_records []string) {
    // some data processing

    q := "insert into pepi_event (id, event_name) values (1,'sent'),(2,'clicked'),(1,'open'),(3,'bounce')"

    batchInsert <- q
}(redis_batch_records)

The whole process runs asynchronously.

3. We opened one goroutine which is dedicated to listening on the batchInsert channel; it simply executes the query and inserts the data into MySQL.

func BatchInsertTagData(db *sql.DB, batchInsert chan string) {
    for {
        q := <-batchInsert
        db.Exec(q)
    }
}

All three steps above run concurrently, without any interdependency and without waiting for each other.

So in PART 2 we used goroutines, because of which all three of our processes run concurrently.
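
To make the flow concrete, here is a rough end-to-end sketch of how the three pieces can be wired together. It is illustrative rather than our production code: it assumes go-redis v6-style APIs and the go-sql-driver/mysql driver, buildInsertQuery is a hypothetical placeholder for the real data processing, and the Redis address and MySQL DSN are dummy values.

package main

import (
    "database/sql"
    "log"
    "time"

    "github.com/go-redis/redis"
    _ "github.com/go-sql-driver/mysql"
)

// luaScript is the same lrange.lua batching script shown in PART 1.
const luaScript = `
local result = redis.call('lrange', KEYS[1], 0, KEYS[2] - 1)
redis.call('ltrim', KEYS[1], KEYS[2], -1)
return result
`

// buildInsertQuery is a hypothetical placeholder for the real data
// processing step that turns raw events into one batch INSERT query.
func buildInsertQuery(records []string) string {
    return "insert into pepi_event (id, event_name) values (1,'sent'),(2,'clicked')"
}

// BatchInsertTagData listens on the batchInsert channel and writes
// each batch query into MySQL (step 3).
func BatchInsertTagData(db *sql.DB, batchInsert chan string) {
    for {
        q := <-batchInsert
        if _, err := db.Exec(q); err != nil {
            log.Printf("insert failed: %v", err)
        }
    }
}

func main() {
    rd := redis.NewClient(&redis.Options{Addr: "localhost:6379"}) // illustrative address

    // cache the Lua script once; its SHA1 digest is then used with EvalSha
    sha, err := rd.ScriptLoad(luaScript).Result()
    if err != nil {
        log.Fatal(err)
    }

    db, err := sql.Open("mysql", "user:pass@tcp(127.0.0.1:3306)/pepi") // dummy DSN
    if err != nil {
        log.Fatal(err)
    }

    batchInsert := make(chan string, 100)
    go BatchInsertTagData(db, batchInsert) // step 3 runs in its own goroutine

    qbatch := []string{"pepi_queue", "100"}
    for {
        // step 1: atomically fetch up to 100 records via the Lua script
        res, err := rd.EvalSha(sha, qbatch).Result()
        if err != nil {
            log.Printf("evalsha failed: %v", err)
            time.Sleep(time.Second)
            continue
        }

        batch, ok := res.([]interface{})
        if !ok || len(batch) == 0 {
            time.Sleep(100 * time.Millisecond) // queue empty, back off briefly
            continue
        }

        rows := make([]string, 0, len(batch))
        for _, r := range batch {
            if s, ok := r.(string); ok {
                rows = append(rows, s)
            }
        }

        // step 2: process the batch in its own goroutine and hand the
        // resulting query to the insert goroutine via the channel
        go func(redisBatchRecords []string) {
            batchInsert <- buildInsertQuery(redisBatchRecords)
        }(rows)
    }
}

The batchInsert channel is the only coordination point: the fetch loop, the processing goroutines and the insert goroutine never wait on each other beyond it.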

Benchmark 2 : GO + REDIS + LUA

Goroutines: 10
Number of Events: 1 million
Time: 10 sec

Concluding…

This is just one example of how we have handled huge volumes of incoming requests efficiently and improved performance; there is a long workflow behind the scenes. While we have used this combination for processing email events, it finds wide application in other areas as well. However, please note that every application behaves differently, and this approach, while perfect for our use case, won't necessarily fit all of them. Also note that this is not about #GO versus #PERL: every technology is built for specific use cases, and you need to evaluate which one is right for your application.

Have you worked on any such interesting approaches and technologies? I am eager to know; please share them in the comments section.


Ashish Tiwari | Developer Evangelist
