Need to run schedulers concurrently

I have a public website that processes data and runs different schedulers for that data, e.g. saving data into the DB and sending notifications to users.

All these processes run independently through the schedulers, and they run concurrently.

I am trying to run these schedulers over HTTP, but the problem with HTTP was running out of available TCP ports on the system, because I have to process a huge amount of data, which can launch millions of schedulers at a given time. I have implemented rate limiting as well.

I have also tried running the schedulers using curl, but it starts giving the error

too many open files

even though I have increased the open file limit on my system to 1 million. curl also occupies too many resources, so I am avoiding it.
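
A common cause of the "too many open files" error when firing lots of HTTP requests is opening a fresh connection (or a fresh curl process) per request. Purely as a sketch, and assuming the schedulers are ordinary HTTP endpoints, a single shared net/http client with a capped Transport lets descriptors be pooled and reused (the URL is hypothetical):

```go
package main

import (
	"fmt"
	"io"
	"net/http"
	"time"
)

// One shared client for the whole process: connections (and their file
// descriptors) are pooled and reused instead of opened per request.
var client = &http.Client{
	Timeout: 10 * time.Second,
	Transport: &http.Transport{
		MaxIdleConns:        100,              // total idle connections kept open
		MaxIdleConnsPerHost: 100,              // idle connections kept per target host
		MaxConnsPerHost:     200,              // hard cap on connections per host
		IdleConnTimeout:     90 * time.Second, // close idle connections after this
	},
}

func callScheduler(url string) error {
	resp, err := client.Get(url)
	if err != nil {
		return err
	}
	defer resp.Body.Close()
	// Drain the body so the connection can go back into the pool.
	_, err = io.Copy(io.Discard, resp.Body)
	return err
}

func main() {
	// Hypothetical endpoint, just for the example.
	if err := callScheduler("http://localhost:8080/run"); err != nil {
		fmt.Println(err)
	}
}
```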

For more clarification on the data: let’s say I have 10,000 schedulers running concurrently, and inside each of these schedulers, 10-20 more schedulers run in parallel to send notifications. I am thinking of running these internal schedulers by some method other than HTTP or curl.

Note: I have to pass different data to each scheduler, and the data is processed through cronjobs written in Go.

I am thinking of running these schedulers internally; can I do that?

Is there a better way to run schedulers than HTTP or curl?

Hi, Sahil, can you clarify what a “scheduler” is?

Basically, they’re cronjobs.

When you mention “schedulers” and “processes,” do you mean actual instances of /usr/sbin/cron (or the Windows Task Scheduler, if this is a Windows server; what OS are you running?) spawning multiple OS processes of your executable, or are these goroutines in a single process? Depending on the OS, and on multiple processes vs. a single process, there are different settings you can try changing to increase the number of open connections the server will accept. The server’s memory could play a potentially significant role in that, too.

Assuming you’re running a relatively recent Linux distro, Baeldung has a good article on the various settings for per-user/session/global maximum connections.
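
As a quick sanity check, the process can also report the descriptor limit it actually sees, since a raised ulimit doesn’t always propagate to services. A minimal sketch using the standard syscall package on Linux:

```go
package main

import (
	"fmt"
	"syscall"
)

func main() {
	// RLIMIT_NOFILE is the per-process limit on open file descriptors;
	// every TCP connection consumes one.
	var rl syscall.Rlimit
	if err := syscall.Getrlimit(syscall.RLIMIT_NOFILE, &rl); err != nil {
		panic(err)
	}
	fmt.Printf("open files: soft=%d, hard=%d\n", rl.Cur, rl.Max)
}
```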

The schedulers are net/http requests that are called concurrently, and we can’t put those calls on cron, because the net/http calls are decided at the runtime of each cron.
For example, one user has set a reminder to be sent at 11 AM, and another user set a reminder for 1 PM, so these calls are decided at runtime.
So I need to call the net/http requests concurrently for parallel execution.
If I call these net/http requests concurrently 50,000 times, it starts giving errors.
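
To illustrate what I mean by running them internally: each runtime-decided reminder could become an in-process timer instead of an HTTP call. A rough sketch, where sendNotification is a hypothetical stand-in for the real work:

```go
package main

import (
	"fmt"
	"time"
)

type Reminder struct {
	UserID string
	At     time.Time
	Msg    string
}

// schedule arms an in-process timer for one reminder. time.AfterFunc
// runs the callback in its own goroutine when the time arrives, so no
// OS process, TCP port, or file descriptor is involved.
func schedule(r Reminder) *time.Timer {
	return time.AfterFunc(time.Until(r.At), func() {
		sendNotification(r.UserID, r.Msg)
	})
}

// sendNotification is a hypothetical stand-in for the real push/email.
func sendNotification(userID, msg string) {
	fmt.Printf("notify %s: %s\n", userID, msg)
}

func main() {
	t := schedule(Reminder{UserID: "u1", At: time.Now().Add(2 * time.Second), Msg: "11 AM reminder"})
	defer t.Stop()
	time.Sleep(3 * time.Second) // keep the demo process alive
}
```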

My system configuration is:
RAM: 16 GB
CPU: Intel(R) Core™ i5-10400 CPU @ 2.90GHz
Cores per socket: 6
Threads per core: 2
OS: Ubuntu 20.04.5 LTS

Then use fewer workers!

I wouldn’t do more than roughly 1k requests concurrently. Remember that not only does your localhost have limited resources; so does the server you are hammering with requests.

Not to mention the hosts involved in routing your requests.
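
For example, a buffered channel used as a semaphore keeps the number of in-flight requests bounded; the 1000 below matches the rough figure above, and doRequest is a hypothetical stand-in for the actual HTTP call:

```go
package main

import "sync"

// fetchAll caps concurrency with a buffered channel acting as a
// semaphore: at most cap(sem) requests are in flight at once, and the
// remaining URLs simply wait for a slot instead of exhausting
// descriptors or ports.
func fetchAll(urls []string) {
	sem := make(chan struct{}, 1000) // ~1k concurrent requests at most
	var wg sync.WaitGroup
	for _, u := range urls {
		wg.Add(1)
		sem <- struct{}{} // acquire a slot (blocks when full)
		go func(u string) {
			defer wg.Done()
			defer func() { <-sem }() // release the slot
			doRequest(u)
		}(u)
	}
	wg.Wait()
}

// doRequest is a hypothetical stand-in for the actual HTTP call.
func doRequest(url string) { /* client.Get(url), etc. */ }

func main() {
	fetchAll([]string{"http://example.com/a", "http://example.com/b"})
}
```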
