ServerSocket with SSLSockets Not Releasing

I have a console application that uses a couple of ServerSocket listeners, one on port 80 and the other on 443. Both initially request 20 sockets via the SocketRequested event. I do not keep any references to the sockets until a Connected event occurs. The ServerSocket on 443 uses secure connections, and I have verified the certificate works and the connections are secure. The ServerSocket on port 80 does not use secure connections and also works as expected.

The problem is that the ServerSocket on 443 requests additional sockets at a higher rate than the one on 80. I have seen the Error event fire on its sockets (code 102) even though a Connected event never occurs. These seem to happen when a client connects to 443 without using SSL, which is understandable. When a Connected event occurs, I do keep a reference to the socket, and I make sure it is released before the socket is disposed. This also works as expected.

What I am noticing, though, is that the ServerSocket on 443 is not releasing all of the sockets in its pool and continues to request more via the SocketRequested event. This builds to the point where the application's CPU usage climbs. I monitor the number of sockets using the SocketRequested, Destructor, and Error events. The counts for the ServerSocket on 80 (SSLSockets not secured) are always consistent; however, the count for the one on port 443 (SSLSockets secured) keeps climbing as it requests more sockets, which eventually drives CPU usage up to 100%. I use App.DoEvents(-1) to yield time for the events, but the application uses more and more CPU as time goes on. Both ServerSockets, 80 and 443, use identical code.

As a note, the SSL connection type for the SSLSockets on port 443 is SSLv23.
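For context, the 443 listener hands out sockets roughly like this. This is a simplified sketch: the handler is the socket-request event described above (AddSocket in the classic API), and App.CertFile is just a placeholder for wherever the certificate FolderItem comes from in my project.

Function SocketRequested() As TCPSocket
  ' Runs in my ServerSocket subclass for the 443 listener whenever the pool wants a new socket.
  Dim s As New SSLSocket
  s.Secure = True
  s.ConnectionType = SSLSocket.SSLv23    ' the SSLv23 setting noted above
  s.CertificateFile = App.CertFile       ' placeholder FolderItem for the certificate
  Return s
End Function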

I am presently using 2019r2, but I have gone as far as rolling back to 2018r4 and this still occurs. Any insight would be appreciated.

How are your maximum and minimum sockets set up?

Maximum is 64, minimum is 10.
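In code it is essentially this for both listeners. PlainServer and SecureServer are placeholder names for my two ServerSocket subclasses; the 443 one returns the SSLSocket shown in the sketch above.

Dim plainServer As New PlainServer      ' ServerSocket subclass handing out plain TCPSockets
plainServer.Port = 80
plainServer.MaximumSocketsConnected = 64
plainServer.MinimumSocketsAvailable = 10
plainServer.Listen

Dim secureServer As New SecureServer    ' ServerSocket subclass handing out SSLSockets
secureServer.Port = 443
secureServer.MaximumSocketsConnected = 64
secureServer.MinimumSocketsAvailable = 10
secureServer.Listen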

As another note, I have actually disposed of the ServerSocket and re-created the instance. This does not help; the CPU usage remains high. The only thing that brings it back down is restarting the application. Once restarted, it returns to a comfortable 3% on average until the sockets begin to accumulate again. I have also seen it request additional sockets with only 3 to 5 connections, not at the minimum of 10.

You are probably keeping references to sockets somewhere?
The ServerSocket has an ActiveConnections array you should use locally instead.
Otherwise, could this be a loop somewhere, perhaps?
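Something along these lines, untested (secureServer stands for whatever your 443 ServerSocket instance is called):

Dim conns() As TCPSocket = secureServer.ActiveConnections
For Each s As TCPSocket In conns
  If s.IsConnected Then
    ' work with the connection here instead of storing it in your own array
  End If
Next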

It appears you were correct. I did have a reference which I released, but it was being released at the wrong time. I have to remove the reference before requesting the disconnect; if I don't, the socket never gets released. Thank you for your help!
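For anyone who runs into the same thing, the change was essentially this (mConnections and sock are simplified names from my project):

Dim idx As Integer = mConnections.IndexOf(sock)
If idx > -1 Then
  mConnections.Remove(idx)   ' drop my reference first so the pool can release the socket
End If
sock.Close                   ' then request the disconnect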