Assembly to Machine Code


javalilly
Hi,

I am slightly confused about how the conversion happens, so I am trying to work through some examples by hand to understand how the assembler works.

From my understanding, each line of assembly is getting converted to a 16 bit binary number.
In chapter 6, all the predefined symbols are given hex values that convert to 16-bit binary numbers. So I can put these symbols and their corresponding values in my symbol table and look them up when I read them in.
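That lookup step can be sketched roughly like this (a minimal sketch in Python; the function name and structure are my own, but the symbol values are the ones chapter 6 defines):

```python
# Predefined Hack symbols from chapter 6, mapped to their addresses.
PREDEFINED = {"SP": 0, "LCL": 1, "ARG": 2, "THIS": 3, "THAT": 4,
              "SCREEN": 0x4000, "KBD": 0x6000}
PREDEFINED.update({f"R{i}": i for i in range(16)})  # R0..R15

def a_instruction(token, table=PREDEFINED):
    """Encode an @-instruction operand as a 16-bit binary string."""
    value = table.get(token)
    if value is None:
        value = int(token)  # a plain constant, e.g. @21
    return format(value, "016b")

print(a_instruction("SCREEN"))  # -> 0100000000000000
print(a_instruction("21"))      # -> 0000000000010101
```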

Here is where I get confused. The destination and jump fields are only given as 3-bit binary numbers. The comp is 6 bits. That leaves 3 bits at the front; if it is a C-instruction, those are all ones.

So if I have D;JMP, that should be: prefix 111, comp 001100, dest 000, jump 111.
That leaves me one bit short, and I can't figure out where it comes from. At first I thought it was a 1 or 0 for the semicolon, but that should be ignored. Any suggestions, or is there something I am overlooking in the book?

Thanks

Re: Assembly to Machine Code

cadet1620
Administrator
Look at the C-Instruction definition in section 4.2.3. There are 7 bits in the comp field.

--Mark

Re: Assembly to Machine Code

javalilly
I was missing the a bit, right?
So if I have D;JMP, the comp D is the a = 0 case, so the leftmost bit of the comp field is 0.

The entire thing would be 1110 0011 0000 0111
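Putting the whole thing together, the encoding can be sketched like this (a minimal sketch; the tables below contain only the entries needed for D;JMP, but the 111 prefix, 7-bit comp, 3-bit dest, and 3-bit jump layout are as specified in the book):

```python
# Comp field is 7 bits: the 'a' bit followed by six c bits.
COMP = {"D": "0001100"}   # a=0, cccccc=001100
DEST = {None: "000"}      # no destination
JUMP = {"JMP": "111"}     # unconditional jump

def c_instruction(dest, comp, jump):
    """Encode a C-instruction as a 16-bit binary string: 111 a cccccc ddd jjj."""
    return "111" + COMP[comp] + DEST[dest] + JUMP[jump]

print(c_instruction(None, "D", "JMP"))  # -> 1110001100000111
```

Grouped in nibbles, that is 1110 0011 0000 0111, matching the hand-worked result above.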

Thanks