Mathematical equation for the bitwise AND operation?


In a shift left operation for example,

5 << 1 = 10

10 << 1 = 20

then a mathematical equation can be made,

n << 1 = n * 2.

If there is an equation for a shift left operation, is it possible that there is also a mathematical equation for an AND operation, or for any of the other bitwise operators?


There is no straightforward single operation that maps to every bitwise operation. However, they can all be simulated through iterative means (or one really long formula).

(a & b)

can be done with:

(((a/1 % 2) * (b/1 % 2)) * 1) +
(((a/2 % 2) * (b/2 % 2)) * 2) +
(((a/4 % 2) * (b/4 % 2)) * 4) +
...
(((a/n % 2) * (b/n % 2)) * n)

Where n is 2 raised to one less than the number of bits in a and b (the place value of the highest bit). This assumes integer division (the remainder is discarded).
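
As a sketch, here is that sum written out in Python, assuming non-negative integers and an illustrative 32-bit width (the function name is made up for this example); it uses only integer division, modulo, multiplication and addition:

    def and_via_arithmetic(a, b, bits=32):
        """Rebuild a & b from integer division, modulo, multiplication
        and addition only, following the sum above."""
        result = 0
        weight = 1                      # 1, 2, 4, ..., 2**(bits - 1)
        for _ in range(bits):
            result += ((a // weight) % 2) * ((b // weight) % 2) * weight
            weight *= 2
        return result

    # e.g. 12 & 10 == 8
    assert and_via_arithmetic(12, 10) == (12 & 10)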


That depends on what you mean by "mathematical equation". There is no easy arithmetic one.

If you look at it from a formal number-theoretic standpoint you can describe bitwise "and" (and "or" and "xor") using only addition, multiplication and -- and this is a rather big "and" from the lay perspective -- first-order predicate logic. But that is most certainly not what you meant, not least because these tools are enough to describe anything a computer can do at all.


Except in specific circumstances, it is not possible to describe bitwise operations in terms of other mathematical operations.

An AND with 2^n - 1 is the same as taking the value modulo 2^n. An AND with the bitwise complement of 2^n - 1 can be seen as a division by 2^n, a truncation, and a multiplication by 2^n again.
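
Both identities can be checked quickly in Python; this is only a sketch, with n = 4 and the sample value chosen arbitrarily (Python's ~ gives the bitwise complement):

    n = 4
    mask = 2**n - 1                  # 0b1111, i.e. 2^n - 1

    x = 0b1011011                    # arbitrary sample value

    # AND with 2^n - 1 is the same as reducing modulo 2^n ...
    assert x & mask == x % 2**n

    # ... and AND with the complement of 2^n - 1 divides by 2^n,
    # truncates, and multiplies back up by 2^n.
    assert x & ~mask == (x // 2**n) * 2**n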


It depends on what you mean by “mathematical”. If you are looking for simple school algebra, then the answer is no. But mathematics is not sacred; mathematicians define new operations and concepts all the time.

For example, you can represent 32-bit numbers as vectors of 32 booleans, and then define an “AND” operation on them that applies the standard boolean “and” to the corresponding elements.
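
A minimal sketch of that viewpoint in Python (the helper names and the 32-bit width are illustrative assumptions, not anything standard):

    def to_bits(x, width=32):
        """View a non-negative integer as a vector of booleans, least significant bit first."""
        return [bool((x >> i) & 1) for i in range(width)]

    def from_bits(bits):
        """Reassemble the integer from its boolean vector."""
        return sum(int(b) << i for i, b in enumerate(bits))

    def vector_and(a, b):
        """AND defined element-wise on the boolean vectors."""
        return from_bits([p and q for p, q in zip(to_bits(a), to_bits(b))])

    assert vector_and(12, 10) == 12 & 10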


Yes, they are sums. Consider a binary word of n+1 bits. It can be written as

A = a0*2^0 + a1*2^1 + a2*2^2 + ... + an*2^n

where each ai is an element of {0, 1}.

Therefore, if ai is a bit of A and bi is a bit of B, then

A AND B = a0*b0*2^0 + a1*b1*2^1 + ... + an*bn*2^n

and similarly

A XOR B = ((a0+b0) mod 2)*2^0 + ((a1+b1) mod 2)*2^1 + ... + ((an+bn) mod 2)*2^n

Consider now the identity that XORing a bit with 1 flips it, so A XOR 11...1 (all ones) = NOT A.

We now have the three operators we need (bitwise AND, bitwise XOR and bitwise NOT).

From these we can make anything we want.

For example, bitwise OR:

NOT[(NOT A) AND (NOT B)] = NOT[NOT(A OR B)] = A OR B

It's not guaranteed to be pretty, though.
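
Here is a sketch of those sums in Python (the function names and the 8-bit width are just illustrative choices), with NOT built from XOR against the all-ones word and OR built via De Morgan:

    def bits(x, width):
        """The base-2 digits a_i of x, least significant first."""
        return [(x >> i) & 1 for i in range(width)]

    def AND(a, b, width=8):
        # sum of a_i * b_i * 2^i
        return sum(ai * bi * 2**i
                   for i, (ai, bi) in enumerate(zip(bits(a, width), bits(b, width))))

    def XOR(a, b, width=8):
        # sum of ((a_i + b_i) mod 2) * 2^i
        return sum(((ai + bi) % 2) * 2**i
                   for i, (ai, bi) in enumerate(zip(bits(a, width), bits(b, width))))

    def NOT(a, width=8):
        # XOR with the all-ones word flips every bit
        return XOR(a, 2**width - 1, width)

    def OR(a, b, width=8):
        # De Morgan: A OR B = NOT(NOT A AND NOT B)
        return NOT(AND(NOT(a, width), NOT(b, width), width), width)

    assert AND(12, 10) == 12 & 10
    assert XOR(12, 10) == 12 ^ 10
    assert NOT(12) == (~12) & 0xFF
    assert OR(12, 10) == 12 | 10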

In response to the comment that mod 2 arithmetic is not very basic: that's true in a sense. However, while it is common nowadays because of the prevalence of computers, the entire subject we are touching on here is not particularly "basic". The OP has grasped something fundamental. There are finite algebraic structures, studied in the mathematical field known as abstract algebra, such as addition and multiplication modulo n (where n is some number such as 2, 8 or 2^32). There are other structures built from binary operations (addition is a binary operation: it takes two operands and produces a result, as do multiplication and XOR), such as XOR, AND, bit shifts and so on, that are "isomorphic" to addition and multiplication over the integers mod n. That means they act the same way: they are associative, distributive, etc. (although they may or may not be commutative; think of matrix multiplication).

It's hard to tell someone where to start looking for more information. I guess the best way would be to start with a book on formal mathematics (mathematical proofs); you need that to understand any advanced mathematics text. Then a text on abstract algebra. If you're a computer science major, you will get a lot of this in your classes. If you're a mathematics major, you will study these things in depth all in good time. If you're a history major (I'm not knocking history; I'm a History Channel junkie), you should switch majors, because you're wasting your talents!


Here is a proof that, for 2-bit operands, you cannot describe & with just +, - and * (check this; I just came up with it now, so, who knows):

The question is, can we find a polynomial

x & y == P(x, y)

where

P(x, y) = a0_0 + a1_0*x + a0_1*y + a2_0*x^2 + ...

Here's what it would have to look like:

   0 1 2 3
  --------
0| 0 0 0 0
1| 0 1 0 1
2| 0 0 2 2
3| 0 1 2 3

First, clearly a0_0 == 0. Next you can see that if P is rewritten:

                     |------- Q(x, y) --------|
P(x, y) = xy*R(x,y) + a1_0*x + a0_1*y + ...

Hold y at 0 while x varies over 0, 1, 2, 3: since x & 0 == 0, Q(x, 0) must be 0 for each of those values. Likewise hold x at 0 and vary y. Since Q(x, y) is just the pure-x terms plus the pure-y terms, it therefore contributes 0 at every entry of the table, and may be set to 0 without loss of generality.

But now P(2, 2) would have to equal 2 & 2 == 2, while what remains of P is xy*R(x, y), whose value at (2, 2) is a multiple of 2 * 2, and 2 * 2 == 0 in 2-bit (mod 4) arithmetic. So the polynomial P cannot exist.

And, I think this would generalize to more bits, too.

So the answer is: if you're looking for just +, * and -, no, you can't do it.
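
As a partial sanity check of the argument (not a replacement for it), one can brute-force every polynomial with per-variable degree at most 2 and coefficients mod 4, and confirm that none reproduces the table; the degree bound and the mod-4 ("2-bit") arithmetic are assumptions of this particular sketch:

    from itertools import product

    MOD = 4
    points = [(x, y) for x in range(MOD) for y in range(MOD)]
    target = tuple(x & y for x, y in points)

    # Monomials x^i * y^j with i, j in {0, 1, 2}, evaluated at every point.
    exponents = [(i, j) for i in range(3) for j in range(3)]
    monomials = [tuple(x**i * y**j % MOD for x, y in points)
                 for i, j in exponents]

    # Try every coefficient assignment mod 4 (4^9 candidates; takes a while).
    found = any(
        tuple(sum(c * m[k] for c, m in zip(coeffs, monomials)) % MOD
              for k in range(len(points))) == target
        for coeffs in product(range(MOD), repeat=len(exponents))
    )
    print("matching polynomial found?", found)   # expected: False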
