On a typical interactive console, CTRL+S sends XOFF (DC3, 0x13), which pauses the output stream, and CTRL+Q sends XON (DC1, 0x11), which resumes it.
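If the link is a serial port, a minimal sketch like the following (assuming the third-party pyserial package and a placeholder port name) shows how software flow control can be switched off so that the DC1/DC3 bytes travel through as ordinary data instead of being swallowed by the driver:

import serial

port = serial.Serial(
    "/dev/ttyUSB0",      # placeholder device name; use e.g. "COM3" on Windows
    baudrate=9600,
    xonxoff=False,       # disable software (XON/XOFF) flow control
    rtscts=False,        # disable hardware (RTS/CTS) flow control
)
port.write(bytes([0x11, 0x13]))  # now sent as plain data bytes, not flow-control commands
port.close()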
There's really no difference between "binary" and "ASCII": binary is just one way to write out a number, and ASCII is just one way of interpreting what that number means. Whether you send binary 00010001, hex 0x11, octal \021, or chr(17), it is all the same byte once it gets out on the wire.
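A small Python illustration of the point, showing that all of these notations name the same byte value:

b1 = 0b00010001        # binary literal
b2 = 0x11              # hexadecimal literal
b3 = 0o21              # octal literal (written "\021" in C-style strings)
b4 = ord(chr(17))      # the character with code point 17
assert b1 == b2 == b3 == b4 == 17
print(bytes([b1]))     # b'\x11' -- the single byte that actually goes on the wire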
But actually it sounds like the underlying problem is that you're sending data in some encoding that the receiver isn't interpreting correctly. On Windows there are both "raw" and "cooked" I/O modes available; in "cooked" (text) mode, LF gets replaced with the CR+LF character sequence on output. Something similar is probably happening in your situation: you're sending raw binary, but some protocol layer in between is intercepting or translating it.
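As a rough illustration of that "cooked" translation, here is a Python sketch; the CR insertion described in the comments only happens when this runs on Windows:

data = b"\x11\x0A\x13"               # contains an LF (0x0A) in the middle

with open("cooked.txt", "w") as f:   # text ("cooked") mode
    f.write(data.decode("latin-1"))  # on Windows, the LF is written as CR+LF

with open("raw.bin", "wb") as f:     # binary ("raw") mode
    f.write(data)                    # bytes are written exactly as given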
A more robust way to transmit arbitrary binary information to "any" device is to use an encoding scheme that can be expressed using only printable ASCII characters. Preferably, don't invent your own encoding scheme; use one that is already established and likely to be supported on the target system.
One commonly used binary encoding scheme is base64, used in email systems to convert arbitrary binary data into a string of printable characters. There are existing libraries for encoding and decoding base64 strings.
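For example, with Python's standard base64 module (just one of many such libraries):

import base64

payload = bytes([0x00, 0x11, 0x13, 0xFF])    # bytes a text channel might mangle
encoded = base64.b64encode(payload)          # b'ABET/w==' -- printable ASCII only
decoded = base64.b64decode(encoded)
assert decoded == payload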
Another commonly used binary encoding scheme is the Intel HEX format, which uses only the characters ':', 0-9, A-F, and any type of line ending (CR, LF, or CR+LF). Each line begins with ':' and is followed by an even number of hex digits, each pair encoding one byte. The first byte is the byte count for that line, the next two bytes are the target address, then comes a record type (00 for data, 01 for end of file), then the actual data (which may be 0 or more bytes), and finally a checksum byte. This is one of the commonly used formats for delivering firmware to embedded systems, so it's widely used and should have existing library code available.