I have the following server specifications:
I currently have two services running on this server:
Recently, I ran a simulation using a CSV file containing 280,000 records, simulating real-time conditions at an average rate of 25 inputs per second. Over the course of three days, the simulator sent approximately 2 million records to the input. When I halted the simulation, the input held the expected 2 million records, but the two services had processed only about 1 million of them (half of what the simulator sent). Despite this, the services are still operational, receiving input and producing output data.
The services' processing rate never exceeded 5 records per second.
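To make the numbers above concrete, here is the rough back-of-envelope arithmetic (a minimal Python sketch based only on the figures quoted, assuming the simulator ran continuously for the full three days; the 5/sec figure is the maximum service rate I observed):

```python
# Rough throughput check using the figures from the post above
# (assumes a continuous three-day run).

SECONDS_PER_DAY = 24 * 60 * 60
duration_s = 3 * SECONDS_PER_DAY               # ~3 days of simulation

records_sent = 2_000_000                       # delivered by the simulator
records_processed = 1_000_000                  # consumed by the two services
max_service_rate = 5                           # observed cap, records/sec

effective_input_rate = records_sent / duration_s
effective_processing_rate = records_processed / duration_s

print(f"effective input rate:       {effective_input_rate:.1f} records/s")
print(f"effective processing rate:  {effective_processing_rate:.1f} records/s")
print(f"ceiling at {max_service_rate}/s over the run: "
      f"{max_service_rate * duration_s:,} records")
```

Running this prints an effective input rate of roughly 7.7 records/s against a processing rate of roughly 3.9 records/s, with a ceiling of about 1.3 million records at 5/s over three days, which is why I am asking about the gap between what was sent and what was processed.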
I have two questions regarding this scenario:
Your guidance on addressing these concerns would be greatly appreciated. Thank you.