Please help. I'm dying of confusion.
This is all guessing; I'm not one of the professors.
1. The Hack computer is simple enough that it was probably designed on paper. (OK, maybe a word processor and a graphics program.) At some point it was probably simulated to prove the viability of the instruction set as something students could be expected to learn well enough to use as the target language for the Jack compiler back-end (VM Translator). This would likely have been a primitive, non-GUI version of the CPUEmulator.
"Chip API" is in quotes because there is no "application" nor "programming" involved. When I was designing 35 years ago, we just called it "pinout", and that term came from the vacuum tube era. The chip manufacturers published books with all the pinouts and electrical timing parameters and requirements in them. I assume the profs used API because that is how combined interface and action is taught in (software) programming class these days.
A Hardware/Software/System Architect is like a building architect: they design what the overall building looks like and the internal layout so that it is easy to live and work in. Engineers are the people who ensure that the hardware works and the buildings don't fall down. These days they use simulation software; 50 years ago they did lots of hand calculations.
The hardware simulation tools that engineers use are generally commercially available. VHDL and Verilog are the most popular digital logic description languages, and the FPGA manufacturers supply tools that convert VHDL/Verilog into the files that can be loaded into their devices.
IC engineers also need to use analog simulation tools that can accurately simulate the continuous nature of the real world.
2. At all the places I worked (none of them IC companies), there was no "architect" position. The architect varied from product to product and was one of the senior/lead engineers. They often collaborated with the other engineers if part of the design was awkward, but functioned more or less as benign dictators. There were several times when I offered design improvements that were accepted.
For the companies I worked for, cost was the major driving factor since we were selling to end users. We generally optimized for cost, then speed, then power, in that order. (Computers in the '70s-'80s were expected to have lots of fan noise and eat lots of electricity.)
We had to have at least an overview of the entire system. For example, the spec for a bus that an I/O card plugs into was usually well specified for signal electrical and timing requirements, but the command set was a negotiation between the card designer and the software guys, because it depended on which specific I/O chips were used on the card. The engineer designing the I/O card was responsible for his part selection and for ensuring that his design met the timing specs. He was also responsible for the bill of materials for the board, which tells purchasing and manufacturing which parts are used where.
3. HDL is basically a blueprint for a circuit board or for the silicon in an IC.
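For a concrete taste, here is roughly what a chip description looks like in the course's own HDL. This is a sketch of one common way to build Xor from And/Or/Not parts; your project file may differ:

    // Xor.hdl -- Xor built from And/Or/Not parts.
    // The PARTS lines are wiring, not statements: each one names
    // a chip and says which pin connects to which signal.
    CHIP Xor {
        IN a, b;
        OUT out;

        PARTS:
        Not (in=a, out=nota);
        Not (in=b, out=notb);
        And (a=a, b=notb, out=aAndNotb);
        And (a=nota, b=b, out=notaAndb);
        Or  (a=aAndNotb, b=notaAndb, out=out);
    }

Like a blueprint, it only says what is connected to what; the Hardware Simulator (or, for VHDL/Verilog, the synthesis tools) works out the behavior from the connections.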
Machine Language: the 1's and 0's the computer reads and executes.
Assembly Language: human-readable version of machine language with useful additions like labels and symbol names. (There is a short Hack example of both right after these definitions.)
Instruction Set Architecture: the modern term for the definition of the machine language and a common assembly language representation of the instructions. It includes the register descriptions, which can be rather complex in modern processors.
Microcode: a specialized internal machine language that the chip designers use to implement the instructions in the ISA. Sort of like subroutines that implement the actual instructions.
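To make the first three concrete, here is the same two-instruction fragment in Hack assembly and in Hack machine language (encodings per the book's A- and C-instruction formats; double-check them against chapter 4):

    @21      // 0000000000010101  A-instruction: load the constant 21 into A
    D=D+A    // 1110000010010000  C-instruction: comp=D+A, dest=D, no jump

The Hack ISA is the specification that defines those two bit formats and the roles of the A, D, and M registers. (The Hack CPU has no microcode, by the way; each instruction is decoded directly by combinational logic.)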
Part of the problem is that there is a lot of history and lore in the development of computers, and early manufacturers did not have a commonly agreed-upon vocabulary for the new ideas. (I worked with an old computer that had BRING, HOLD, JERK, and KEEP instructions for what we now call load and store!)
4. I'll write something here later...