Vagabond, I love that you went and wrote a paper on this. I feel inspired.
Actually, writing a paper to explain something is probably a really good way to learn it. I've heard it said that if you really want to learn something well, you should try teaching it to someone else. I recently came across mention of some research with a similar conclusion: you learn best when you struggle to explain a concept.
I had a few thoughts reading through the paper. Some of it is additional knowledge, which you might find interesting.
The fastest 2-input logic gate is NAND. The second fastest is NOR. An AND gate is typically implemented as a NAND gate followed by a NOT gate. An OR gate is typically implemented as a NOR gate followed by a NOT gate. I'm willing to bet that was unexpected. If you're curious, I could give a considerable explanation (read "rant") as to why. =)
But wait, there's more! NAND is functionally complete: any boolean expression can be represented entirely with NAND. NOR is also functionally complete. NAND and NOR are known as universal gates. Neither AND, nor OR, is functionally complete. (See how I expressed that there?) Even together, the set {AND, OR} is still not functionally complete. The set {NOT, AND, OR} is functionally complete; however, it's not minimal. You can remove an operation and still have a functionally complete set. The sets {NOT, AND} and {NOT, OR} are both functionally complete (and minimal).
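To make the functional completeness claim concrete, here is a minimal C++ sketch (my own throwaway code, with made-up function names) that builds NOT, AND, and OR out of nothing but a two-input NAND, including the AND-as-NAND-followed-by-NOT construction mentioned above:

#include <cassert>

// NAND is the only primitive; everything else is built from it.
bool nand_(bool a, bool b) { return !(a && b); }

bool not_(bool a)         { return nand_(a, a); }             // NAND(a, a) = NOT a
bool and_(bool a, bool b) { return not_(nand_(a, b)); }       // a NAND gate followed by a NOT gate
bool or_(bool a, bool b)  { return nand_(not_(a), not_(b)); } // De Morgan's Laws at work

int main() {
    for (int ai = 0; ai < 2; ++ai) {
        for (int bi = 0; bi < 2; ++bi) {
            bool a = ai, b = bi;
            assert(not_(a)    == !a);
            assert(and_(a, b) == (a && b));
            assert(or_(a, b)  == (a || b));
        }
    }
}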
Though that last part shouldn't be surprising due to De Morgan's Laws, which express AND and OR in terms of each other and NOT. Using fairly typical digital logic notation, these laws can be expressed as:
(AB)' = (A' + B') (Ambiguously read: "A (and) B not equals A not or B not")
(A + B)' = (A'B') (Ambiguously read: "A or B not equals A not (and) B not")
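If you'd rather not take those on faith, a quick brute-force check in C++ (again, just my own scratch code) confirms both laws over all four input combinations:

#include <cassert>

int main() {
    for (int ai = 0; ai < 2; ++ai) {
        for (int bi = 0; bi < 2; ++bi) {
            bool A = ai, B = bi;
            assert(!(A && B) == (!A || !B)); // (AB)'    = A' + B'
            assert(!(A || B) == (!A && !B)); // (A + B)' = A'B'
        }
    }
}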
Which brings me to notation. There are a lot of variants, and it often depends on context, such as mathematically founded boolean algebra, digital logic textbooks, or programming languages. Here are a few common symbols:
NOT: ¬, x̄ (overbar), ', !, ~
AND: ∧, ·, (no symbol), &&, &
OR: ∨, +, ||, |
Examples:
C++:
(x && y) || !z (logical)
(x & y) | ~z (bitwise)
Mathematical Boolean Algebra:
(x ∧ y) ∨ ¬z = (x AND y) OR NOT(z)
Digital Logic textbooks:
AB + C̅ = (A AND B) OR NOT(C)
AB + C' = (A AND B) OR NOT(C)
You'll sometimes see a dot used for AND, much like for multiplication in Algebra: · (Unicode U+22C5)
Like multiplication, AND is the default assumed operation when the symbol is omitted.
ab = a · b = a * b ("a times b") (Math)
AB = A · B = A AND B (Digital Logic)
There's some fun stuff where, given a truth table, you can implement a corresponding boolean expression, minimize the number of gates using Karnaugh maps, and convert it to NAND form. Though perhaps I'll save that for another time.
Feedback specific to your paper:
16-bit Word 0 to 65,536 -32,767 to 32,767
The signed range should reach down to -32,768, not -32,767. (Likewise, the unsigned range tops out at 65,535, not 65,536: a 16-bit word has 65,536 distinct values, but they run from 0 to 65,535.)
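If you want the compiler's word for those bounds, std::numeric_limits on the fixed-width 16-bit types prints them directly (a small sketch, assuming the usual int16_t/uint16_t types):

#include <cstdint>
#include <iostream>
#include <limits>

int main() {
    std::cout << std::numeric_limits<std::uint16_t>::max() << "\n"; // 65535
    std::cout << std::numeric_limits<std::int16_t>::min()  << "\n"; // -32768
    std::cout << std::numeric_limits<std::int16_t>::max()  << "\n"; // 32767
}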
0 ^ 3 = 0 + 2 ^ 2 = 4 + 0 ^ 1 = 0 + 2 ^ 0 = 1 Decimal 5
The notation looks a bit off. I assume you mean something like:
0*2^3 + 1*2^2 + 0*2^1 + 1*2^0
= 0 + 4 + 0 + 1 = 5
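For what it's worth, that place-value expansion is exactly what a simple loop over the bits computes; here's a tiny sketch of the same sum (my own illustration, nothing from your paper):

#include <iostream>

int main() {
    int bits[] = {0, 1, 0, 1}; // the coefficients of 2^3, 2^2, 2^1, 2^0
    int place = 8;             // start at 2^3 for a 4-bit value
    int value = 0;
    for (int bit : bits) {
        value += bit * place;  // 0*8 + 1*4 + 0*2 + 1*1
        place /= 2;
    }
    std::cout << value << "\n"; // prints 5
}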
XNOR ~(a ^ b ) OR a == b
Not wrong, but slightly odd. XNOR does represent equivalence of boolean values. As C++ code, though, the two expressions behave quite differently for non-boolean operands; given int values they produce different results. The left side uses bitwise operators, which work on corresponding bits in parallel. The right side is a comparison that works on the values as a whole and yields a single bool. It's the same distinction as & versus &&, or | versus ||: the single-character forms are bitwise operations, while the double-character forms are logical operations.
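To make that concrete, here's a short C++ sketch (my own example values) showing the two expressions disagreeing on int operands, while the single-bit intent still lines up:

#include <iostream>

int main() {
    int a = 5, b = 3;             // 0101 and 0011 in binary

    int  bitwise = ~(a ^ b);      // a ^ b = 0110 = 6, so ~6 = -7 in two's complement
    bool logical = (a == b);      // 5 != 3, so false

    std::cout << bitwise << "\n"; // -7
    std::cout << logical << "\n"; // 0

    // Restricted to a single bit, the bitwise form does match the comparison:
    int x = 1, y = 1;
    std::cout << (~(x ^ y) & 1) << " " << (x == y) << "\n"; // 1 1
}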
You're correct that the size of a WORD is the register size for a particular processor. In the case of x86, the original processor in that family had 16-bit registers. When the architecture was extended in a backwards-compatible way to 32 bits, and later to 64 bits, they kept referring to WORD as a 16-bit datatype and began calling 32-bit values DWORDs (double words) and 64-bit values QWORDs (quad words). Other architectures may not support operations smaller than the WORD size, relying on bitmasks and AND to work on individual bytes or other sub-word chunks.
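As a rough illustration of those sizes, and of pulling a smaller value out of a wider one with a shift and an AND mask (just a sketch using the C++ fixed-width types; WORD/DWORD/QWORD are the x86 naming convention):

#include <cstdint>
#include <iostream>

int main() {
    std::uint16_t word  = 0xBEEF;                 // WORD:  16 bits
    std::uint32_t dword = 0xDEADBEEF;             // DWORD: 32 bits
    std::uint64_t qword = 0x0123456789ABCDEFULL;  // QWORD: 64 bits

    // Extracting individual bytes from a wider value with shifts and AND masks,
    // the way an architecture without sub-word operations would have to:
    unsigned low  =  dword       & 0xFF;          // 0xEF
    unsigned next = (dword >> 8) & 0xFF;          // 0xBE

    std::cout << std::hex << low << " " << next << "\n";          // ef be
    std::cout << std::dec << sizeof(word) << " " << sizeof(dword)
              << " " << sizeof(qword) << "\n";                    // 2 4 8
}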
This one is kind of a nitpick, especially considering there isn't some obvious inherent ordering of the operations, but I find it odd that your tables keep AND and OR next to each other yet split up and rearrange NAND and NOR.
Leeor, there's usually a way to use references.