I know there are many questions and answers related to this topic, but I have not yet been able to find clear, distinct answers to my questions. In addition, I found some answers that seem to contradict each other.
I know that baud is the number of symbols per second in a transmitted signal, as mentioned here: https://electronics.stackexchange.com/a/282382/254779. So its unit cannot be "bits per second", right?
But these two answers below say that the unit of baud rate is "bits per second": https://electronics.stackexchange.com/a/273817/254779, https://electronics.stackexchange.com/a/117245/254779.
Furthermore, I am confused about how baud rate is measured. Some websites say it is the number of times the line changes per second, while others say it is the number of signal units per second. I guess these are not exactly the same, right? Because the first definition only matches the second if every signal unit corresponds to exactly one line change.
So, for example, how is baud rate measured for Manchester encoding? I guess that in this encoding the number of line changes per second is not the same as the number of signal units per second, right?
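To make this part of my question concrete, here is a small sketch of my current understanding (I am assuming the IEEE 802.3 convention, where a 0 bit is sent as a high-to-low transition and a 1 bit as low-to-high; the bit pattern is just an arbitrary example):

```python
def manchester(bits):
    """Encode data bits as pairs of half-bit line levels.

    IEEE 802.3 convention assumed: 0 -> high-to-low, 1 -> low-to-high.
    """
    levels = []
    for b in bits:
        levels += [1, 0] if b == 0 else [0, 1]
    return levels

def count_transitions(levels):
    """Count how many times the line level actually changes."""
    return sum(1 for a, b in zip(levels, levels[1:]) if a != b)

bits = [1, 0, 1, 1, 0]
line = manchester(bits)

# One symbol per data bit, but the line can change once or twice per bit
# period, so "line changes per second" != "symbols per second" here.
print("symbols:", len(bits))                    # 5
print("line transitions:", count_transitions(line))  # 6
```

If this sketch is right, then 5 symbols produce 6 line transitions, which is exactly why the two definitions I quoted above look inconsistent to me.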
And the last one: I found some answers saying that metadata bits, such as start and stop bits, are not counted when measuring bit rate, yet elsewhere I saw that all bits are counted. Which of these is correct in the end? For example, is baud rate / bit rate = 1 in the UART protocol?
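To show the UART part of my confusion with numbers, here is the calculation as I currently understand it (8N1 framing and the 9600 baud figure are just assumptions for illustration):

```python
# UART 8N1 framing: 1 start bit + 8 data bits + 1 stop bit = 10 symbols
# per byte. UART uses two-level (binary) signalling, so each symbol
# carries exactly 1 bit and the gross bit rate equals the baud rate.
baud = 9600             # symbols per second (assumed example value)
bits_per_symbol = 1     # binary line levels: 1 bit per symbol
frame_bits = 1 + 8 + 1  # start + data + stop bits in one 8N1 frame

gross_bit_rate = baud * bits_per_symbol             # counts every bit
payload_bit_rate = gross_bit_rate * 8 / frame_bits  # counts data bits only

print("gross:", gross_bit_rate)      # 9600 bit/s
print("payload:", payload_bit_rate)  # 7680.0 bit/s
```

So depending on whether start/stop bits count, the "bit rate" is either 9600 or 7680 bit/s, and only the first of these equals the baud rate; that is exactly the ambiguity I am asking about.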
I have had these questions in my head for a long time, and I decided to ask them once and for all.
Thanks for reading.