Let's look at a fragment of machine code. (This is from stoned-b, but executable code for most processors looks similar.)
...
1000:0148 BB0002 MOV BX,0200
1000:014B 33C9 XOR CX,CX
1000:014D 8BD1 MOV DX,CX
1000:014F 41 INC CX
...
In most computers, a "byte" is a single addressable unit of memory containing 8 bits.
This example shows that the byte value "BB" is stored at location 1000:0148, the byte value "00" at 1000:0149, the byte "02" at 1000:014A, the byte "33" at 1000:014B, and so on.
Programs are stored in memory.
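To make this concrete, here is a minimal C sketch (the array, the variable names, and the printed format are mine, not part of the original program) that holds the same byte values in consecutive memory locations and prints each one next to the offset it occupies in the listing above:

#include <stdio.h>

int main(void)
{
    /* The same byte values as in the listing, in the same order. */
    unsigned char code[] = {
        0xBB, 0x00, 0x02,   /* MOV BX,0200 */
        0x33, 0xC9,         /* XOR CX,CX   */
        0x8B, 0xD1,         /* MOV DX,CX   */
        0x41                /* INC CX      */
    };
    unsigned offset = 0x0148;   /* offset of the first byte in the listing */

    /* Print each byte next to the location it occupies. */
    for (unsigned i = 0; i < sizeof code; i++)
        printf("1000:%04X  %02X\n", offset + i, (unsigned)code[i]);
    return 0;
}

Each line of output pairs one byte with one address, which is exactly the mapping described above.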
Some processors have variable-length instructions -- as you can see in this example, the first instruction occupies 3 bytes, the next two instructions occupy 2 bytes each, and the last instruction requires only 1 byte.
The "byte 1" of the first instruction in this example is "BB", "byte 2" is "00", and "byte 3" is "02".
The "byte 1" of the next instruction in this example is "33", "byte 2" is "C9", and it doesn't have a "byte 3".
Variable-length instruction sets, such as the Intel 8085 and the x86 in this example, typically encode the opcode and the format of the instruction in the first byte. By examining the first byte (and, on some processors, a byte or two after it), the CPU can decode how long the entire instruction is, what operation it performs, and the "meaning" of the bytes that follow.
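As a rough illustration of that idea, here is a small C sketch covering only the four encodings that appear in the listing above (on a real x86 the first byte is not always enough by itself; prefixes and some addressing modes add bytes); the function name and the tiny opcode table are mine:

#include <stdio.h>

/* Return the total instruction length implied by the first (opcode) byte,
   for the four opcodes in the listing above; 0 means "not handled here". */
static unsigned length_from_first_byte(unsigned char opcode)
{
    switch (opcode) {
    case 0xBB: return 3;  /* MOV BX,imm16: opcode + 2 immediate bytes */
    case 0x33: return 2;  /* XOR r16,r/m16: opcode + ModRM byte       */
    case 0x8B: return 2;  /* MOV r16,r/m16: opcode + ModRM byte       */
    case 0x41: return 1;  /* INC CX: the opcode byte alone            */
    default:   return 0;
    }
}

int main(void)
{
    unsigned char code[] = { 0xBB, 0x00, 0x02, 0x33, 0xC9, 0x8B, 0xD1, 0x41 };
    unsigned i = 0;

    /* Walk the byte stream the way a disassembler would: look at the
       first byte of each instruction to find where the next one starts. */
    while (i < sizeof code) {
        unsigned len = length_from_first_byte(code[i]);
        if (len == 0) break;   /* opcode not in our tiny table */
        printf("instruction at offset %u is %u byte(s) long\n", i, len);
        i += len;
    }
    return 0;
}

Run over the eight bytes of the listing, it reports lengths of 3, 2, 2, and 1, matching the disassembly above.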
I hope that you and others will keep improving assembly language books, such as the "x86 Assembly" book, until they give clear, easy-to-understand answers to this and related questions.