Parallel TCP/IP Socket Server With Multithreading and Multiprocessing in C

The primary goal of this article is to show how a server can handle multiple client requests in parallel. You are going to create a TCP/IP server that accepts multiple client connections at the same time and serves each one concurrently, so no client has to wait while the server finishes with another. Most TCP/IP server and client examples you find online are not capable of processing multiple client requests in parallel.

In the first example, the TCP/IP server uses multithreading for parallel processing; in the second example, I implement the server with multiprocessing to accomplish the same goal.
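Before the full examples, here is a minimal sketch of the threaded approach (not the article's exact code; the port number and the echo logic are placeholders): the main thread accepts connections in a loop and hands each accepted socket to a detached pthread, so a slow client never blocks new connections.

/*
 * Minimal sketch of a multithreaded TCP echo server.
 * Build with: gcc server.c -o server -lpthread
 */
#include <stdio.h>
#include <stdlib.h>
#include <unistd.h>
#include <pthread.h>
#include <arpa/inet.h>
#include <sys/socket.h>

#define PORT 8080          /* example port, adjust as needed */
#define BACKLOG 16

/* Each client connection is served entirely inside this thread. */
static void *handle_client(void *arg)
{
    int client_fd = *(int *)arg;
    free(arg);

    char buf[1024];
    ssize_t n;
    while ((n = recv(client_fd, buf, sizeof(buf), 0)) > 0)
        send(client_fd, buf, (size_t)n, 0);   /* simple echo back to the client */

    close(client_fd);
    return NULL;
}

int main(void)
{
    int server_fd = socket(AF_INET, SOCK_STREAM, 0);
    if (server_fd < 0) { perror("socket"); return 1; }

    struct sockaddr_in addr = {0};
    addr.sin_family = AF_INET;
    addr.sin_addr.s_addr = htonl(INADDR_ANY);
    addr.sin_port = htons(PORT);

    if (bind(server_fd, (struct sockaddr *)&addr, sizeof(addr)) < 0) {
        perror("bind");
        return 1;
    }
    if (listen(server_fd, BACKLOG) < 0) { perror("listen"); return 1; }

    for (;;) {
        /* Heap-allocate the fd so each thread gets its own copy. */
        int *client_fd = malloc(sizeof(int));
        if (!client_fd) continue;
        *client_fd = accept(server_fd, NULL, NULL);
        if (*client_fd < 0) { free(client_fd); continue; }

        pthread_t tid;
        if (pthread_create(&tid, NULL, handle_client, client_fd) == 0)
            pthread_detach(tid);   /* no join needed; thread cleans up after itself */
        else {
            close(*client_fd);
            free(client_fd);
        }
    }
}

The multiprocessing version keeps the same accept loop but calls fork() instead of pthread_create(): the child process serves the client and exits, while the parent closes its copy of the client socket and continues accepting new connections.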

Thoughts on Server-Sent Events, HTTP/2, and Envoy

In a distributed system, moving data efficiently between services is no small task. It can be especially tricky for a frontend web application that relies on polling data from many backend services.

I recently explored solutions to this problem for Grey Matter, specifically how we could reduce traffic to some of the most requested services in our network. Our web app had the following characteristics: