
Which ASCII characters could interrupt, terminate, or otherwise disturb communication on the receiver side? I'm not asking about acknowledgments, handshakes, identification, or other programmed protocols, just about characters that could inadvertently interrupt the data stream because they were inserted among normal characters. Data is sent in 8N1 format. In most of the known cases, the receiver is:

  • a Windows OS computer
  • a non-Windows OS computer
  • any kind of microcontroller receiving these data, etc.

(Sorry for not being more specific, but that's really what it is: multipurpose.)

The reason I ask is that I want to send bit strings, but for various reasons I'm sending the ASCII equivalent of the binary string. I imagine some of these characters could cause an "End of Transmission", but I'm not sure which ones.


Fredled
  • EOT? Break? CAN? It really depends on your OS, code or protocol specifications. – Lior Bilia Mar 08 '20 at 00:26
  • @Lior Bilia Of course, but it's not "my" code, protocol or OS. It's assumed that it can be connected to any device. – Fredled Mar 08 '20 at 00:40
  • `It's assumed that it can be connected to any device` Then this question is too vague. There are many different ways digital code can HALT itself depending on the protocol. –  Mar 08 '20 at 01:07
  • OK. So which are the most commonly used characters that could interfere? Or which characters are likely not to interfere? I know it's vague, but what can I do? – Fredled Mar 08 '20 at 01:19
  • I don't think a UART will react to any of these characters - the UART will simply pass the character on to the software in the receiving device. That software may get upset with any character or string of characters. I once had to deal with a device that crashed if it received any unexpected character. – Peter Bennett Mar 08 '20 at 01:49
  • There is a common "out of band" signal used in asynchronous serial communications called a "break signal." This happens whenever the transmitter goes into a "space condition" for longer than some unspecified duration -- but so far as I know always more than a character time. This causes a framing error in the receiver UART. But it can also be separately detected as a "break signal" if the receiver can uniquely detect a long "space" as such. (Separately from just complaining about it being some kind of framing error, as while all "breaks" are framing errors, not all framing errors are "breaks.") – jonk Mar 08 '20 at 02:36
  • I think that your question is misguided ... the interrupt character is one that you choose and implement in your software – jsotola Mar 08 '20 at 03:13
  • @jonk, @Peter Bennett Thanks for the comments. Indeed, it's more a software question, except for the long break conditions, which could cause an error in a UART. – Fredled Mar 10 '20 at 14:38
  • @jsotola In this case I'm not writing the code; others wrote it. And so the question is which characters (or binary strings) are most often used to interrupt or to do something other than simply reading and accepting the data. – Fredled Mar 10 '20 at 14:40

1 Answer


On a typical interactive console, CTRL+S = XOFF (DC3) pauses the output stream, and CTRL+Q = XON (DC1) resumes it.
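If software flow control is enabled on either end, those two bytes get swallowed by the driver instead of being delivered as data. Here is a minimal sketch using pyserial (not necessarily what your devices run; the port name is just a placeholder) showing how to open an 8N1 port with XON/XOFF handling turned off, so DC1/DC3 pass through as ordinary bytes:

```python
import serial  # pyserial

# 8N1, with software flow control disabled so DC1 (0x11) and DC3 (0x13)
# are treated as ordinary data bytes rather than pausing/resuming the stream.
port = serial.Serial(
    '/dev/ttyUSB0', 9600,          # placeholder port name and baud rate
    bytesize=serial.EIGHTBITS,
    parity=serial.PARITY_NONE,
    stopbits=serial.STOPBITS_ONE,
    xonxoff=False,                 # software flow control off
    rtscts=False,                  # hardware flow control off
)
port.write(bytes([0x11, 0x13]))    # these go out as data, not as flow control
```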

There's really no difference between "binary" and "ASCII": binary is just one way to write out the number, and ASCII is just a way of interpreting the meaning of that number. Whether you send binary 00010001, hex 0x11, octal \021, or chr(17), it is all the same thing once it gets out on the wire.
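As a quick illustration in Python (purely to show the equivalence; nothing here is specific to any protocol), all of these spellings produce exactly the same byte value:

```python
# Different notations, same single byte on the wire.
assert 0b00010001 == 0x11 == 0o21 == 17
assert bytes([0b00010001]) == b"\x11" == chr(17).encode("latin-1")
print(bytes([17]))  # b'\x11' -- this is DC1/XON no matter how you wrote it
```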

But actually it sounds like the underlying problem is that you're sending data in some sort of encoding that the receiver isn't receiving correctly. In Windows there are both "raw" and "cooked" I/O options available (binary vs. text mode); "cooked" means that LF gets replaced with the CR + LF character sequence. Something similar is probably happening in your situation: you're sending raw binary, but some protocol layer intercepts it.
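As a sketch of how that kind of interception corrupts raw binary (the file name is made up, and the CR insertion shown in the comment is the Windows text-mode behaviour; a serial driver or terminal layer can do the analogous thing to a byte stream):

```python
data = bytes([0x01, 0x0A, 0x02])        # raw binary that happens to contain LF (0x0A)

# "Cooked"/text mode: LF is translated to the platform line ending on write.
with open("demo.bin", "w") as f:        # text mode -> newline translation enabled
    f.write(data.decode("latin-1"))
with open("demo.bin", "rb") as f:
    print(f.read())                     # on Windows: b'\x01\r\n\x02' -- a CR appeared

# "Raw"/binary mode: bytes pass through untouched.
with open("demo.bin", "wb") as f:
    f.write(data)
with open("demo.bin", "rb") as f:
    print(f.read())                     # b'\x01\n\x02' on every platform
```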

A more robust way to transmit arbitrary binary information to "any" device would be to use an encoding scheme that can be expressed using only printable ASCII characters. Preferably, don't invent your own encoding scheme, but use one that is already established and likely to be supported on the target system.

One commonly used binary encoding scheme is base64, used in email systems to convert arbitrary binary data into a string of printable characters. There are existing libraries for encoding and decoding base64 strings.
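A minimal sketch using Python's standard base64 module (the six example bytes are made up, chosen to include some of the control codes discussed above):

```python
import base64

payload = bytes([0x02, 0x11, 0x13, 0x04, 0xFF, 0x00])  # contains STX, DC1, DC3, EOT
encoded = base64.b64encode(payload)                     # b'AhETBP8A' -- printable ASCII only
decoded = base64.b64decode(encoded)
assert decoded == payload
```

Only letters, digits, '+', '/', and '=' ever appear in the encoded form, so none of the control characters can end up on the wire.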

Another commonly used binary encoding scheme is the Intel HEX format, which uses only the characters :, 0-9, A-F, and any type of line ending (CR, LF, or CR+LF). Each line begins with : and has an even number of hex digits, each pair encoding one byte. The first byte is the byte count for that line, the next two bytes are the target address, then a record type (00 for data, 01 for end of file), then the actual data (which may be 0 or more bytes), and a checksum byte. This is one of the commonly used formats for delivering firmware to embedded systems, so it's widely used and should have existing library code available.
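As a sketch of how little code the encoding side takes (the address and data bytes below are made up; the decoding side is similarly short or available as a library):

```python
def ihex_record(address, data, record_type=0x00):
    """Build one Intel HEX record: ':' + count + address + type + data + checksum."""
    body = bytes([len(data), (address >> 8) & 0xFF, address & 0xFF, record_type]) + bytes(data)
    checksum = (-sum(body)) & 0xFF                  # two's complement of the byte sum
    return ":" + body.hex().upper() + format(checksum, "02X")

print(ihex_record(0x0010, [0x01, 0x02, 0x03, 0x04, 0x05, 0x06]))  # :06001000010203040506D5
print(ihex_record(0x0000, [], record_type=0x01))                   # :00000001FF (end of file)
```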

MarkU
  • @MarkU Thanks for the reply. To simplify, I'm thinking of not using any binary string starting with 0000 or 0001 (the first 32 ASCII codes). But if there were only 2 or not many more "dangerous" characters, I could do otherwise. I still haven't experienced the problem; I asked the question proactively. The data, only six bytes including the address, is too short to use encoding. – Fredled Mar 10 '20 at 14:48