How, exactly? Show us a snippet of the code. Chr(), for example, is the wrong call here; ChrByte() should be used instead. Other mistakes are possible too… So post the code of your “string composition” that gets sent, and let’s see whether Xojo really has a bug.
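To illustrate the Chr() vs ChrByte() point, here is a minimal sketch (the byte value 130 is just an example): Chr() builds a Unicode character, which becomes more than one byte in UTF-8 once the value goes past 127, while ChrB() — what the newer API calls ChrByte() — builds a single raw byte.

```
' Chr() creates a character (UTF-8 above 127), ChrB() creates one raw byte.
Var withChr As String = Chr(130)     ' U+0082 -> two UTF-8 bytes
Var withChrB As String = ChrB(130)   ' one raw byte, no encoding

System.DebugLog(EncodeHex(withChr))  ' logs "C282"
System.DebugLog(EncodeHex(withChrB)) ' logs "82"
```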
If you see “the string” composed correctly in the debugger, but the wrong bytes arrive on the other side, then you may well be right and a bug could exist at the level of the Xojo socket classes.
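One way to tell the two apart is to log the exact bytes immediately before handing them to the socket, so you can compare what the debugger shows with what is actually written. A sketch, assuming a TCPSocket property named mySocket (the name and the STX/ETX framing are only illustrative):

```
' Dump the exact bytes just before they go to the socket.
Var payload As String = ChrB(&h02) + "HELLO" + ChrB(&h03)  ' e.g. STX ... ETX framing
System.DebugLog("Sending " + Str(payload.Bytes) + " bytes: " + EncodeHex(payload))
mySocket.Write(payload)
```

If the hex logged here is already wrong, the problem is in the composition, not in the socket.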
As for the encoding: for a bag of bytes, a Nil encoding is the one that fits.
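A quick sketch of that (the hex content is arbitrary): declaring a Nil encoding tells Xojo the string is raw data and must not be reinterpreted as text.

```
' Mark a byte buffer as having no text encoding at all.
Var raw As String = DecodeHex("01FF80FE")  ' arbitrary bytes
raw = raw.DefineEncoding(Nil)              ' a "bag of bytes" carries a Nil encoding
System.DebugLog(EncodeHex(raw))            ' still "01FF80FE"
```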
What follows is not related to the question itself, only to its title and to making sure we all mean the same thing.
PS: even if you find pages whose authors display a so-called “ASCII Table” with values above 127, that beast does not exist (above 127 the mapping differs, for example between macOS and Windows).
ASCII stands for American Standard Code for Information Interchange. It is a 7-bit character code in which each code value represents a unique character. What such pages usually show is the “extended ASCII” table based on the Windows-1252 character set: an 8-bit table with 256 characters and symbols. It includes all codes from standard ASCII and is a superset of ISO 8859-1 in terms of printable characters. In the range 128 to 159 (hex 80 to 9F), ISO/IEC 8859-1 has invisible control characters, while Windows-1252 has printable characters. Windows-1252 is probably the most widely used 8-bit character encoding in the world.
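You can see that difference directly in Xojo: the same byte means different things depending on which encoding you declare for it, which is exactly why an “8-bit ASCII” table is really Windows-1252 (or some other code page), not ASCII. A small sketch using byte &h80:

```
' The same raw byte interpreted under two different 8-bit encodings.
Var b As String = ChrB(&h80)                                        ' one raw byte, no encoding
Var asWin1252 As String = b.DefineEncoding(Encodings.WindowsANSI)   ' Windows-1252: Euro sign
Var asLatin1 As String = b.DefineEncoding(Encodings.ISOLatin1)      ' ISO 8859-1: a control character

System.DebugLog(EncodeHex(asWin1252.ConvertEncoding(Encodings.UTF8))) ' "E282AC" (U+20AC)
System.DebugLog(EncodeHex(asLatin1.ConvertEncoding(Encodings.UTF8)))  ' "C280"   (U+0080)
```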