Unexpected behavior from ServerSocket?

I am working on a data logger console app. It reads meters either through a serial (RS-485) port or through a TCP socket. It was working fine until I added a ServerSocket so a user could request an extra read between schedules or reset the reading schedule. My initial tests were all with a serial port connection. When I added a meter with a TCP connection, it reads once, then the program keeps indicating that a read is still in progress (two reads at the same time will crash the Ethernet/RS-485 adapter connected to the meter). I finally tracked the problem down to the TCPSocket port changing to something like 51205 instead of the adapter’s assigned port of 2106. The ServerSocket is supposed to be listening on 17751.
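
A minimal way to confirm which port is actually being logged, assuming Java-style sockets (the adapter address below is a placeholder of mine): Socket.getPort() reports the remote end (the adapter’s 2106), while Socket.getLocalPort() reports the OS-assigned ephemeral port on the logger’s side, which will be something like 51205.

import java.io.IOException;
import java.net.Socket;

public class PortCheck {
    public static void main(String[] args) throws IOException {
        // Placeholder address for the Ethernet/RS-485 adapter.
        try (Socket meter = new Socket("192.168.1.50", 2106)) {
            System.out.println("remote port = " + meter.getPort());      // adapter's port, 2106
            System.out.println("local port  = " + meter.getLocalPort()); // ephemeral, e.g. 51205
        }
    }
}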

My earlier testing showed that with only one Ethernet adapter for reading meters, every meter would read properly every time until a request came in through the ServerSocket. I finally gave up on that configuration without solving it and added a serial meter.
Now the problem happens every time, because the connection changes from TCP to serial and back to TCP for the second read. I believe I can work around the problem quickly by resetting the port every time I call an Ethernet meter. Is this expected behavior from a ServerSocket, or did I miss something in the setup?
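
A sketch of that workaround, assuming Java-style sockets (the helper name and timeouts are mine): open a fresh connection for each Ethernet read and close it when the read finishes, so the next read can never overlap a stale connection.

import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.net.InetSocketAddress;
import java.net.Socket;
import java.util.Arrays;

public class MeterReader {
    // Hypothetical helper: one fresh connection per read.
    static byte[] readMeter(String host, int port, byte[] request) throws IOException {
        try (Socket s = new Socket()) {
            s.connect(new InetSocketAddress(host, port), 5000); // 5 s connect timeout
            s.setSoTimeout(5000);                               // 5 s read timeout
            OutputStream out = s.getOutputStream();
            out.write(request);
            out.flush();
            InputStream in = s.getInputStream();
            byte[] buf = new byte[256];   // assumes one response frame under 256 bytes
            int n = in.read(buf);
            if (n < 0) throw new IOException("adapter closed the connection");
            return Arrays.copyOf(buf, n);
        } // socket is closed here, so the adapter never sees two simultaneous reads
    }
}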

That sounds like the correct behavior.

A ServerSocket listens on the designated port.
When a client connects, the server accepts a new TCP socket to service that connection, spins it off to handle that one client, and then goes back to listening on the designated port.
The listener itself stays on that port the whole time.

The accepted socket doesn’t actually move to a different port on the server’s side; it shares the listening port, and the OS tells connections apart by the client’s address and ephemeral port. An ephemeral port like 51205 is the client end of a connection, and your own app gets one assigned locally whenever it connects out to the adapter on 2106.
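
A minimal sketch of that accept loop (my own illustration, assuming Java; port 17751 is from your post). The printout shows the accepted socket’s local port is still 17751, while the ephemeral port belongs to the client:

import java.io.IOException;
import java.net.ServerSocket;
import java.net.Socket;

public class CommandListener {
    public static void main(String[] args) throws IOException {
        try (ServerSocket server = new ServerSocket(17751)) {
            while (true) {
                Socket client = server.accept();   // blocks until a user connects
                // Local side keeps the listening port; the ephemeral port is the client's.
                System.out.println("local " + client.getLocalPort()
                        + ", remote " + client.getPort());
                new Thread(() -> handle(client)).start(); // spin off, resume listening
            }
        }
    }

    static void handle(Socket client) {
        try (Socket c = client) {
            // ... parse the user's request (extra read / schedule reset) here ...
        } catch (IOException ignored) {
        }
    }
}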

I’m afraid I jumped to a conclusion here. I just saved a new version of the program with the ServerSocket and its associated client sockets removed, and it still does the same thing. Back to debugging.