At my school and every other school I'm remotely familiar with, the knowledge needed to emulate a CPU would be covered in upper division computer engineering courses, and not covered by CS undergrads at all, certainly not in their first year.
To be fair, I did drop out a year later :)
[1] http://www.unibo.it/en/teaching/degree-programmes/study-plan...
I know this is mostly gone at this point, but it shouldn't be. The fundamentals are essential. I often meet people who have CS degrees yet are totally clueless about how a computer works.
"How can you still use C? It's so old. Surely we've invented faster languages by now. Computers have changed so much recently." Sure... binary is now expressed in emoji.
All that to say that sometimes, having real industry veterans as teachers really influences the teaching point of view.
https://www.youtube.com/watch?v=Z8qEz4DwFIg
It was challenging, but it was awesome (and finding that video to illustrate my comment is a blast of nostalgia). It wasn't any harder than most other CS or science courses. And after digital logic, the other CS topics aren't really prerequisites or especially helpful in learning how simple processors work. I really appreciated getting straight to the foundations of how computers work and building up from there.
It wasn't like emulating x86 in javascript, but it was CPU internals. Up until I read your comment, I just assumed this was standard CS stuff.
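For anyone who hasn't seen it, the core of a simple CPU emulator really is just a fetch-decode-execute loop. Here's a minimal sketch in Python for a made-up toy ISA (the opcodes and single-accumulator design are my own invention for illustration, not any real architecture or coursework):

```python
# Minimal fetch-decode-execute loop for a hypothetical toy ISA.
LOAD, ADD, STORE, HALT = 0, 1, 2, 3  # made-up opcodes

def run(program, memory):
    """Execute a list of (opcode, operand) pairs against memory."""
    acc = 0  # single accumulator register
    pc = 0   # program counter
    while pc < len(program):
        op, arg = program[pc]   # fetch + decode
        pc += 1
        if op == LOAD:          # acc <- memory[arg]
            acc = memory[arg]
        elif op == ADD:         # acc <- acc + memory[arg]
            acc += memory[arg]
        elif op == STORE:       # memory[arg] <- acc
            memory[arg] = acc
        elif op == HALT:
            break
    return memory

# Compute memory[0] + memory[1] and store the result in memory[2].
mem = {0: 2, 1: 3, 2: 0}
run([(LOAD, 0), (ADD, 1), (STORE, 2), (HALT, 0)], mem)
print(mem[2])  # prints 5
```

A real emulator adds flags, addressing modes, a binary instruction decoder, and so on, but the loop itself doesn't get more complicated, which is why it makes such a good first-year exercise.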
The mathematician's version was half as long but covered the material in more depth: i.e., they proved every result. The CS version was dumbed down and full of fluff. (And even those CS people did operating systems and compilers as undergrads.)