Interpreting a uint16_t as an int16_t

Is there a portable and safe way to interpret the bit pattern made by a boost::uint16_t as a boost::int16_t? I have a uint16_t which I know represents a signed 16-bit integer encoded as little-endian. I need to do some signed arithmetic on this value, so is there any way to convince the compiler that it already is a signed value?

If I am not mistaken, a static_cast<int16_t> would convert the value, perhaps changing its bit pattern.


If you are looking for something other than a cast, copy its memory representation into a boost::int16_t, since that is what it represents to begin with.

Edit: If you have to make it work on a big-endian machine, simply copy the bytes in reverse order. Use std::copy and std::reverse.
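
A minimal sketch of that idea, assuming the uint16_t's in-memory bytes are exactly the little-endian bytes read from the source, and that host_is_big_endian() here is a hypothetical stand-in for whatever endianness test your platform provides (boost::uint16_t/boost::int16_t are typedefs for the same fixed-width types, so it applies unchanged):

#include <algorithm>   // std::reverse
#include <cstdint>
#include <cstring>     // std::memcpy

// Hypothetical helper: replace with your platform's own endianness test.
bool host_is_big_endian() {
    const std::uint16_t probe = 1;
    unsigned char first_byte;
    std::memcpy(&first_byte, &probe, 1);
    return first_byte == 0;
}

std::int16_t decode_le_int16(std::uint16_t raw) {
    unsigned char bytes[sizeof raw];
    std::memcpy(bytes, &raw, sizeof raw);           // copy the memory representation
    if (host_is_big_endian())
        std::reverse(bytes, bytes + sizeof raw);    // copy the bytes backwards on big-endian hosts
    std::int16_t result;
    std::memcpy(&result, bytes, sizeof result);     // same bits, now typed as signed
    return result;
}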


Just use the static cast. Changing the bit pattern happens to be exactly what you want if you are on a platform that represents signed values differently.

reinterpret_cast, or any equivalent pointer cast, is undefined behavior (not implementation-defined). That means the compiler is free to do nasty things, such as caching the value in a register and missing the update. Besides, if you were on a platform where the bit patterns differed, bypassing the conversion would leave you with garbage (just like pretending a float is an int and adding 1 to it).

More info is at Signed to unsigned conversion in C - is it always safe?, but the summary is that C, in a roundabout way, defines the static cast (an ordinary C cast, actually) as exactly what you get by treating the bits the same on x86 (which uses two's complement).
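
A minimal sketch of that advice, assuming a two's-complement target (true of x86 and of essentially every mainstream platform); note that before C++20 the out-of-range case is implementation-defined rather than undefined:

#include <cstdint>

// On a two's-complement platform this conversion preserves the bit pattern:
// 0x8000..0xFFFF map to -32768..-1, everything else is unchanged.
std::int16_t as_signed(std::uint16_t raw) {
    return static_cast<std::int16_t>(raw);
}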

Don't play chicken with the compiler ("this has always worked on this compiler, so surely they won't break everybody's code by changing it"). History has proven that attitude wrong.


Mask off the sign bit, store the remaining bits in a signed int, then apply the sign using the sign bit.
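
One portable reading of that suggestion, assuming the source value is a two's-complement encoding: take the low 15 bits, then subtract the weight of the sign bit when it is set, so no reinterpretation of storage is needed at all:

#include <cstdint>

std::int16_t decode_twos_complement(std::uint16_t raw) {
    // Low 15 bits minus the weight of the sign bit (0x8000) gives the
    // two's-complement value; the intermediate math is done in 32 bits.
    std::int32_t value = static_cast<std::int32_t>(raw & 0x7FFFu)
                       - static_cast<std::int32_t>(raw & 0x8000u);
    return static_cast<std::int16_t>(value);   // always in range, so well-defined
}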


I guess *(boost::int16_t*)(&signedvalue) would work, unless your system architecture is not little-endian by default. Endianness will change the behavior, since after the above operation the CPU will treat the value as an architecture-specific boost::int16_t (meaning that if your architecture is big-endian it will go wrong).


Edit
To avoid controversy over *(int16_t*)(&input_value), I changed the last statement in the code block to memcpy and added *(int16_t*)(&input_value) as an addendum. (It was the other way around.)

On a big-endian machine you will need to do a byte swap and then interpret the result as a signed integer:

// big_endian() stands in for whatever endianness test your platform provides.
if (big_endian()) {
  // Swap the bytes so the little-endian input ends up with the right value here.
  input_value = (uint16_t)((input_value & 0xff00u) >> 8) |
                (uint16_t)((input_value & 0x00ffu) << 8);
}
int16_t signed_value;
std::memcpy(&signed_value, &input_value, sizeof(int16_t));  // needs <cstring>

On most computers you can change the call to memcpy to signed_value = *(int16_t*)(&input_value);. This is, strictly speaking, undefined behavior. It is also an extremely widely used idiom. Almost all compilers do the "right thing" with this statement. But, as is always the case with extensions to the language, YMMV.


As a different tack, the best way to maximize (but not ensure) portability is to store those signed 16-bit integers as signed 16-bit integers in network order rather than as unsigned 16-bit integers in little-endian order. This puts the burden on the target machine to translate 16-bit network-order signed integers into 16-bit signed integers in its own native form. Not every machine supports this capability, but most machines that can connect to a network do. After all, that file has to get to the target machine by some mechanism, so the odds are pretty good that it will understand network order.
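
A minimal sketch of the consuming side under that scheme, assuming the value was written as a signed 16-bit integer in network (big-endian) order and that the target provides the usual ntohs() from <arpa/inet.h> (Winsock offers the same function):

#include <cstdint>
#include <cstring>
#include <arpa/inet.h>   // ntohs(); on Windows include <winsock2.h> instead

// raw points at two bytes in network (big-endian) order.
std::int16_t read_net_int16(const unsigned char* raw) {
    std::uint16_t net = 0;
    std::memcpy(&net, raw, sizeof net);        // avoids alignment and aliasing problems
    std::uint16_t host = ntohs(net);           // network order -> host order
    return static_cast<std::int16_t>(host);    // reinterpret as signed (two's complement)
}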

On the other hand, if you are zapping that binary file to some embedded machine via some proprietary serial interface, the answer to the portability question is the same answer you'll get when you tell your doctor "it hurts when I do this."
