Golang Concurrency And Memory Management

Sudeep Dasgupta
4 min read · Mar 14, 2020

This blog is about Golang concurrency patterns

In this blog, we will learn about Golang concurrency, channels, buffered channels, mutexes, sync, context, and the issues that can occur if we don't synchronize goroutines.

Concurrency is structuring a program so that multiple tasks make progress over the same period, often while sharing resources. Concurrency is not parallelism; parallelism is a separate concept that we will not discuss in this blog.

Concurrency in Golang is achieved using goroutines. A goroutine is started by placing the "go" keyword before a function call.
go func(){
}()

Concurrency helps complete tasks faster than a single thread, but too much concurrency leads to high memory consumption and CPU utilization.
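
One common way to bound concurrency is to use a buffered channel as a semaphore; here is a minimal sketch (the maxWorkers limit and the task body are illustrative):

package main

import (
    "fmt"
    "sync"
)

func main() {
    const maxWorkers = 3                   // illustrative limit on concurrent goroutines
    sem := make(chan struct{}, maxWorkers) // buffered channel used as a semaphore
    var wg sync.WaitGroup

    for i := 1; i <= 10; i++ {
        wg.Add(1)
        sem <- struct{}{} // blocks when maxWorkers goroutines are already running
        go func(id int) {
            defer wg.Done()
            defer func() { <-sem }() // release the slot when this task finishes
            fmt.Println("working on task", id)
        }(i)
    }
    wg.Wait()
}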

Sometimes concurrency leads to deadlocks or race conditions. When multiple goroutines write to the same memory location, the order of the writes is not guaranteed.
For example, suppose one goroutine writes 10 and a later one writes 20, and you expect the final value to be 20 because it was triggered after 10. Under heavy concurrency the write of 20 may be overwritten by the write of 10, leaving the value at 10. This situation is called a race condition.
The way to get rid of it is simply to synchronize the goroutines.
You can use a mutex, channels, or sync.WaitGroup to synchronize goroutines.
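
To make the problem concrete, here is a minimal sketch of an unsynchronized counter that the race detector (go run -race) would flag; the examples below show how to fix it:

package main

import (
    "fmt"
    "time"
)

func main() {
    x := 0
    for i := 0; i < 5; i++ {
        go func() {
            x++ // unsynchronized write: a data race
        }()
    }
    time.Sleep(100 * time.Millisecond) // crude wait, only for demonstration
    fmt.Println(x)                     // result is unpredictable
}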

We will demonstrate each of them one by one.

  1. sync.WaitGroup: this prints 5 because main waits for each goroutine to finish before starting the next, so the writes to number never overlap.
package main

import (
    "fmt"
    "sync"
)

var number int

func main() {
    // create a sync.WaitGroup
    var wg sync.WaitGroup

    for i := 1; i <= 5; i++ {
        // register one goroutine with the wait group
        wg.Add(1)
        // start the worker goroutine, passing the wait group
        go worker(i, &wg)
        // wait for this goroutine to complete before starting the next one
        wg.Wait()
    }

    fmt.Println(number) // 5
}

func worker(id int, wg *sync.WaitGroup) {
    number = id
    // mark this goroutine as done
    wg.Done()
}

2. Mutex: it locks a critical section, and other goroutines wait until it is unlocked.

package main

import (
    "fmt"
    "sync"
)

var x = 0

func worker(wg *sync.WaitGroup, m *sync.Mutex) {
    // lock the shared variable so only one goroutine updates it at a time
    m.Lock()
    x += 1
    m.Unlock()
    wg.Done()
}

func main() {
    var w sync.WaitGroup
    var m sync.Mutex
    for i := 1; i <= 5; i++ {
        w.Add(1)
        go worker(&w, &m)
    }
    w.Wait()
    fmt.Println("final value of x", x) // 5
}

3. Channels, the best feature in Golang for passing messages and synchronizing goroutines. Do not send to and receive from the same unbuffered channel in a single goroutine with no other goroutine on the other side; it will cause a deadlock.

package main

import (
    "fmt"
)

var x = 0

func worker(chanStatus chan bool) {
    x += 1
    // signal completion to main
    chanStatus <- true
}

func main() {
    var chanStatus = make(chan bool)

    for i := 1; i <= 5; i++ {
        go worker(chanStatus)
        // receive the signal before starting the next worker,
        // so the writes to x never overlap
        <-chanStatus
    }

    fmt.Println("final value of x", x) // 5
}

A mutex should be used when multiple goroutines read from and write to the same memory; without synchronization a race condition might occur, and careless locking can even lead to deadlock. A mutex locks a section so other goroutines cannot access it until it is unlocked again. Golang also provides separate read locks and write locks through sync.RWMutex. For a plain race condition, goroutine synchronization is enough.
A mutex should be used in cases where multiple goroutines are deleting from the same hashmap, writing to the same socket, and so on.
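
For example, here is a minimal sketch of guarding map writes and deletes with a mutex (the cache type and its methods are illustrative, not from the original post); unsynchronized concurrent map writes would otherwise race or panic:

package main

import (
    "fmt"
    "sync"
)

// cache is an illustrative map guarded by a mutex
type cache struct {
    mu   sync.Mutex
    data map[string]int
}

func (c *cache) set(key string, value int) {
    c.mu.Lock()
    defer c.mu.Unlock()
    c.data[key] = value
}

func (c *cache) delete(key string) {
    c.mu.Lock()
    defer c.mu.Unlock()
    delete(c.data, key)
}

func main() {
    c := &cache{data: make(map[string]int)}
    var wg sync.WaitGroup
    for i := 0; i < 5; i++ {
        wg.Add(1)
        go func(i int) {
            defer wg.Done()
            c.set("key", i)
            c.delete("key")
        }(i)
    }
    wg.Wait()
    fmt.Println("done")
}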

Now we will talk about memory management. Golang has a garbage collector, but when you write a lot of data to a socket and receive simultaneously, memory consumption can grow until the application hits OOM issues. To find out why the application is consuming too much memory, you can use the Golang profiler.
Profiling is useful for identifying expensive or frequently called sections of code. The Go runtime provides profiling data in the format expected by the pprof visualization tool. The profiling data can be collected during testing via go test or endpoints made available from the net/http/pprof package. Users need to collect the profiling data and use pprof tools to filter and visualize the top code paths.
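
For a long-running service, the net/http/pprof package can expose those endpoints over HTTP; here is a minimal sketch (the listen address is illustrative):

package main

import (
    "net/http"
    _ "net/http/pprof" // registers the /debug/pprof handlers on the default mux
)

func main() {
    // profiles are then available at http://localhost:6060/debug/pprof/
    // e.g. go tool pprof http://localhost:6060/debug/pprof/heap
    http.ListenAndServe("localhost:6060", nil)
}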

Predefined profiles provided by the runtime/pprof package (a short sketch of writing one of them to a file follows this list):

  • cpu: CPU profile determines where a program spends its time while actively consuming CPU cycles (as opposed to while sleeping or waiting for I/O).
  • heap: Heap profile reports memory allocation samples; used to monitor current and historical memory usage, and to check for memory leaks.
  • threadcreate: Thread creation profile reports the sections of the program that lead to the creation of new OS threads.
  • goroutine: Goroutine profile reports the stack traces of all current goroutines.
  • block: Block profile shows where goroutines block waiting on synchronization primitives (including timer channels). The block profile is not enabled by default; use runtime.SetBlockProfileRate to enable it.
  • mutex: Mutex profile reports lock contentions. When you think your CPU is not fully utilized due to mutex contention, use this profile. The mutex profile is not enabled by default; see runtime.SetMutexProfileFraction to enable it.
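
For instance, the heap profile can be written to a file with runtime/pprof and inspected with go tool pprof; here is a minimal sketch (the file name is illustrative):

package main

import (
    "os"
    "runtime/pprof"
)

func main() {
    // create the output file for the heap profile (name is illustrative)
    f, err := os.Create("heap.prof")
    if err != nil {
        panic(err)
    }
    defer f.Close()

    // write the current heap profile; inspect it with: go tool pprof heap.prof
    if err := pprof.WriteHeapProfile(f); err != nil {
        panic(err)
    }
}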

If you use channels in Golang, consider buffered channels where they fit: a send on a buffered channel does not block until the buffer is full, so senders spend less time blocked than with unbuffered channels; a minimal sketch of the difference is below. Also, try to use "select" to avoid blocking on I/O, as in the example that follows.
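
A minimal sketch of buffered versus unbuffered sends (the buffer size is illustrative):

package main

import "fmt"

func main() {
    // buffered: sends do not block until the buffer (size 3 here) is full
    buffered := make(chan int, 3)
    buffered <- 1
    buffered <- 2
    buffered <- 3 // still no receiver needed; a fourth send would block

    fmt.Println(<-buffered, <-buffered, <-buffered)

    // unbuffered: a send blocks until another goroutine is ready to receive
    unbuffered := make(chan int)
    go func() { unbuffered <- 42 }()
    fmt.Println(<-unbuffered)
}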

select {
case message, ok := <-BucketData:
    if ok {
        // some task using message
        _ = message
    }
default:
    // no data available right now; back off briefly instead of spinning
    <-time.After(1 * time.Nanosecond)
}

One more package helps to terminate goroutines and channel operations: context. With the help of context, you can set a timeout and stop waiting if a channel does not deliver data within the configured time.

var someChannel = make(chan bool)
// timeoutTime is assumed to be defined elsewhere (in milliseconds)
ctx, cancel := context.WithTimeout(context.Background(), time.Duration(timeoutTime)*time.Millisecond)
defer cancel()

go someWork(someChannel)

select {
case <-ctx.Done():
    // the context timed out before the work finished
case callback := <-someChannel:
    // callback received from the channel
    _ = callback
}

I hope this blog gives you a good understanding of channels, mutex, context, sync and goroutines.
