I have a console application that uses two ServerSocket listeners, one on port 80 and one on port 443. Both initially request 20 sockets via the SocketRequested event, and I do not keep any references to the sockets until a Connected event occurs. The ServerSocket on 443 uses secure connections; I've verified the certificate works and the connections are secure. The ServerSocket on port 80 does not use secure connections and also works as expected.

The problem I am noticing is that the ServerSocket on 443 requests additional sockets at a much higher rate than the one on 80. I've seen the Error event trigger on those sockets (code 102) even though a Connected event never occurs. These seem to happen when a client connects to 443 without using SSL, which is understandable. When a Connected event does occur, I keep a reference to the socket and make sure it is released before the socket is disposed; this also works as expected.

What I'm seeing, though, is that the ServerSocket on 443 does not release all of the sockets in its pool and continues to request more via SocketRequested. This builds to the point where the application's CPU usage climbs. I track the number of sockets using the SocketRequested, Destructor, and Error events. The counts for the ServerSocket on 80 (SSLSockets not secured) stay consistent, but the one on 443 (SSLSockets secured) keeps incrementing and requesting more sockets, which eventually drives CPU usage to 100%. I call App.DoEvents(-1) to yield time to the events, but the application still uses more and more CPU as time goes on. Both ServerSockets, 80 and 443, use identical code.
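Roughly, the bookkeeping looks like this (a simplified sketch; LiveSocketCount is a placeholder name for the module-level counter I use, and my real handlers differ only in logging):

```xojo
' In the ServerSocket subclass:
Function SocketRequested() As TCPSocket
  LiveSocketCount = LiveSocketCount + 1
  Return New ConnectionSocket   ' my SSLSocket subclass (hypothetical name)
End Function

' In the SSLSocket subclass:
Sub Error()
  ' On 443 this fires with Me.LastErrorCode = 102 (lost connection)
  ' even when Connected never occurred, e.g. a plain-HTTP client.
  System.DebugLog("Socket error: " + Str(Me.LastErrorCode))
End Sub

Sub Destructor()
  LiveSocketCount = LiveSocketCount - 1
End Sub
```

On port 80 the increments and decrements balance out; on 443 the count only ever grows.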
As a note, the SSLConnectionType for the port 443 SSLSockets is SSLv23.
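For reference, the 443 sockets are configured like this inside SocketRequested (CertFile is a placeholder for the FolderItem pointing at my verified certificate):

```xojo
Dim s As New SSLSocket
s.Secure = True
s.ConnectionType = SSLSocket.SSLv23
s.CertificateFile = CertFile  ' hypothetical FolderItem; the cert itself checks out
Return s
```

The port 80 listener returns the same subclass with Secure left off.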
I am presently using 2019r2, but I have gone as far as rolling back to 2018r4, and this still occurs. Any insight would be appreciated.