
My question is simply: where did the UART system originate? Is there any standard covering it?

I can only find a similar question here, but the only answer just says "Wikipedia"!

I can see many chip vendors providing UART chips, but based on what? What is the standard describing the UART and its framing?

moibrahim
  • Begin with the RS232 standard. – Andy aka Jun 29 '20 at 14:26
  • Reading the RS-232 standard is useless. It defines the physical interface only, not what protocol, framing, or speed is used over it, so there will be no reference to a "UART" at all. – Justme Jun 29 '20 at 14:36
  • What part of the Wikipedia article did you not understand? It covers the general concept fairly thoroughly. It should be clear that there are many choices within the protocol (baud rate, numbers of bits, etc.), and [as Ilya says](https://electronics.stackexchange.com/a/508018/11683), you need to read a chip's datasheet in order to understand which choices it supports. – Dave Tweed Jun 29 '20 at 14:52
  • My question is simple: where is the standard? Please don't tell me that Wikipedia is a standard! – moibrahim Jun 29 '20 at 14:57
  • There is no standard for a UART. UART is a class of devices: Universal Asynchronous Receiver/Transmitter. Many manufacturers have different kinds of UARTs that implement the asynchronous start-stop bit framing protocol. – Justme Jun 29 '20 at 15:02
  • This is what I am asking about: where did they get this asynchronous start-stop bit framing protocol from? – moibrahim Jun 29 '20 at 15:07
  • This is a very valid question, in particular as the RS232 standard defines many things but not fundamental concepts like start and stop bits, parity, etc. I guess it started as a proprietary protocol and has evolved into a de-facto standard. As different UART implementations can easily communicate with each other, there is obviously a common, interoperable protocol (sketched just after these comments). So where is the best description of that protocol (even if it's not an official standard)? – Codo Jun 29 '20 at 15:16
  • @Codo, yes, this is exactly what I meant. Thanks! – moibrahim Jun 29 '20 at 15:20
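
To make that de-facto framing concrete, here is a minimal C sketch. The `frame_char` helper is hypothetical, written only for illustration (no standard defines such an API); it lays out one character the way an asynchronous start-stop transmitter puts it on the wire: the line idles high, a start bit (0) marks the frame, data bits follow LSB first, then an optional even-parity bit and one or more stop bits (1).

```c
#include <stdio.h>
#include <stdint.h>

/* Build the on-the-wire bit sequence for one character using the
 * de-facto asynchronous start-stop framing: the line idles high,
 * a start bit (0) marks the frame, data bits follow LSB first,
 * then an optional parity bit and one or more stop bits (1).
 * Returns the number of bits written into out[]. */
static int frame_char(uint8_t ch, int data_bits, int even_parity,
                      int stop_bits, uint8_t out[16])
{
    int n = 0, ones = 0;

    out[n++] = 0;                       /* start bit: line goes low      */
    for (int i = 0; i < data_bits; i++) {
        uint8_t bit = (ch >> i) & 1u;   /* LSB is transmitted first      */
        ones += bit;
        out[n++] = bit;
    }
    if (even_parity)
        out[n++] = (uint8_t)(ones & 1); /* make the count of 1s even     */
    for (int i = 0; i < stop_bits; i++)
        out[n++] = 1;                   /* stop bit(s): line idles high  */
    return n;
}

int main(void)
{
    uint8_t bits[16];
    int n = frame_char('A', 8, 0, 1, bits);  /* classic "8N1" frame */

    printf("'A' as 8N1: ");
    for (int i = 0; i < n; i++)
        printf("%d", bits[i]);
    printf("\n");
    return 0;
}
```

Running this for 'A' in the classic "8N1" format prints `0100000101`: start bit, 0x41 sent LSB first, stop bit. A receiver recovers the frame purely from the agreed bit timing, which is why both ends must be configured for the same format and baud rate.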

3 Answers


I think the origin of the current UART protocol can be traced back to mechanical teletype machines using the 5-bit Baudot code. Some of the timing requirements are based on mechanical considerations: the stop bit(s) allow time for the mechanics of the machine to get ready to receive the next character.

Peter Bennett
  • To the extent that there is a protocol at all, it would be "asynchronous serial". A UART is a *device* which implements that: Universal Asynchronous Receiver/Transmitter. There is an important point in this answer, though: the early things interoperable with modern UARTs were *not* UARTs. First they were electromechanical, then they were special-purpose assemblies of logic functions. Putting it all in a configurable chip and inventing the acronym UART came later. – Chris Stratton Jun 29 '20 at 16:24
  • Best answer so far. It does go back to 1910, when the US1286351 patent was applied for. – Justme Jun 29 '20 at 16:35

The exact spec can be individual to each chip. If you open the IC's datasheet, you will see exactly which specs the chip supports. For example, some chips support high-speed I2C, some only standard speed. Both will be labeled "I2C", yet high-speed mode is not compatible with standard speed; you have to find that in the datasheet. Or some chips support specific SPI modes and some don't. It's also highly individual.

Thus, all you can do is open the datasheet of every IC with the word "UART" on it, scroll down to the exact UART specs, and check whether they're compatible with what you're designing. Unfortunately, there is no shortcut. On the plus side, though, chips usually use default settings for these protocols, and those are listed in the datasheets too; you can't implement anything without looking there anyway. So whether you like it or not, you're in for some reading.

Ilya
  • I have never read the details of the UART specification. Yet all the chips (MCUs or dedicated UART solutions) I have used so far have had no problem communicating with each other. So I disagree with "*it's also highly individual*". It's obviously not. UART is clearly a widely agreed-on, interoperable protocol. – Codo Jun 29 '20 at 15:20
  • UART is a *device*, not a *protocol*. – Chris Stratton Jun 29 '20 at 16:22
  • @Codo True, it's easy to establish communication because, as I said, most chips support default settings. But then you need a library to talk to the IC. And if there is none, which is likely for a randomly picked chip, you'll have to dive into the datasheet and see that sending 10010011 means "turn on" and 10010000 means "turn off", etc. – Ilya Jun 30 '20 at 08:39
  • I guess the OP is really looking for an authoritative description of the serial protocol, not of its implementation in a UART. The question's title is a bit misleading. And I probably should have used *serial protocol* instead of *UART* as well. – Codo Jun 30 '20 at 11:34

If you are asking whether there is a written standard (e.g. EIA/TIA-xxx) for asynchronous communications, then the answer is "there isn't". You can read the history in this Wikipedia page.

The format has essentially been used and agreed upon by equipment and chip makers. There are programmable aspects to the protocol, such as the number of data bits (5/7/8), the type of parity (even/odd/none), and the number of stop bits (1/2), which means you may need to change settings in software (or via DIP switches) on two pieces of equipment to make them operate together.

Likewise, there is a set of "standard" (but not officially documented in an industry spec) baud rates that must be matched for equipment to talk to each other. Getting both the bit settings and the baud rate matched is required for correct operation.
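
As a concrete (non-normative) illustration of those programmable aspects, here is a minimal sketch using the POSIX termios API to configure a port for the common "9600 8N1" combination. The device path `/dev/ttyUSB0` is just an example, and `cfmakeraw()`, while widespread, is a BSD/Linux extension rather than strict POSIX:

```c
#include <fcntl.h>
#include <stdio.h>
#include <termios.h>
#include <unistd.h>

/* Configure a serial port for 9600 baud, 8 data bits, no parity,
 * one stop bit ("9600 8N1"). Both ends must agree on all of these
 * settings, since no industry spec mandates them. */
int main(void)
{
    int fd = open("/dev/ttyUSB0", O_RDWR | O_NOCTTY); /* example path */
    if (fd < 0) { perror("open"); return 1; }

    struct termios tio;
    if (tcgetattr(fd, &tio) != 0) { perror("tcgetattr"); return 1; }

    cfmakeraw(&tio);                /* raw mode: no echo/translation    */
    cfsetispeed(&tio, B9600);       /* one of the "standard" baud rates */
    cfsetospeed(&tio, B9600);

    tio.c_cflag &= ~CSIZE;
    tio.c_cflag |= CS8;             /* 8 data bits (CS5..CS8 exist)      */
    tio.c_cflag &= ~PARENB;         /* no parity (PARENB/PARODD for E/O) */
    tio.c_cflag &= ~CSTOPB;         /* 1 stop bit (CSTOPB selects 2)     */
    tio.c_cflag |= CLOCAL | CREAD;  /* ignore modem lines, enable RX     */

    if (tcsetattr(fd, TCSANOW, &tio) != 0) { perror("tcsetattr"); return 1; }

    write(fd, "hello\r\n", 7);      /* 7 bytes, each framed as 8N1 */
    close(fd);
    return 0;
}
```

If the two ends disagree on any of these choices, the typical symptom is framing errors or garbage bytes rather than a clean failure, which is exactly why matching the settings matters.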


BrianB