
How to convert wchar_t to NSString?

Developer https://www.devze.com 2023-04-13 05:49 Source: Web
I have wchar_t buffer[100]. Sometimes it is needed for Unicode letters, sometimes it is not.


I need to convert it to NSString.

I'm using NSString *str = [NSString string:(char *)buffer]; to convert it.

When I NSLog my NSString, sometimes I get the right result, but sometimes I don't.

Did I miss something?

Thanks!


My converter for "char", "wchar_t", and "NSString". Use and enjoy.

//-=(W)=-

+(NSString *)stringFromChar:(const char *)charText
{
    return [NSString stringWithUTF8String:charText];
}

+(const char *)charFromString:(NSString *)string
{
    return [string cStringUsingEncoding:NSUTF8StringEncoding];
}

+(NSString *)stringFromWchar:(const wchar_t *)charText
{
    //used ARC
    return [[NSString alloc] initWithBytes:charText length:wcslen(charText)*sizeof(*charText) encoding:NSUTF32LittleEndianStringEncoding];
}

+(const wchar_t *)wcharFromString:(NSString *)string
{
    // NSUTF32LittleEndianStringEncoding matches the 32-bit wchar_t layout on
    // little-endian Apple platforms; the NSUTF8StringEncoding used previously
    // returned narrow UTF-8 chars, not wchar_t. Note the returned buffer is
    // autoreleased and only valid for the lifetime of the string.
    return (const wchar_t *)[string cStringUsingEncoding:NSUTF32LittleEndianStringEncoding];
}


Everything is as Totumus Maximus has said, but additionally you need to know how the characters in your buffer are encoded. As wchar_t is 32 bits you probably have some 32 bit encoding of which UTF32-LE is the most likely. What you want to do to get your NSString is:

NSString* result = [[NSString alloc] initWithBytes: (const void*)buffer 
                                            length: sizeof(wchar_t) * numberOfCharsInBuffer
                                          encoding: someEncoding];

where:

  • numberOfCharsInBuffer is the number of wchar_ts in the buffer that you want to decode. The method above does not assume that the string is null terminated and will happily try to put nulls into the NSString if they appear before the length you specify (note that with wchar_t "null" means a 32 bit value that is zero).
  • someEncoding is the encoding used by the string in the buffer. Try NSUTF32StringEncoding, NSUTF32LittleEndianStringEncoding, NSUTF32BigEndianStringEncoding.


Maybe this will clear things up.

C89 introduced a new integer type, wchar_t. This is similar to a char, but typically "wider". On many systems, including Windows, a wchar_t is 16 bits. This is typical of systems that implemented their Unicode support using earlier versions of the Unicode standard, which originally defined fewer than 65,535 characters. Unicode was later expanded to support historical and special purpose character sets, so on some systems, including Mac OS X and iOS, the wchar_t type is 32 bits in size. This is often poorly documented, but you can use a simple test like this to find out:

// how big is wchar_t?
NSLog(@"wchar_t is %zu bits wide", 8 * sizeof(wchar_t));

On a Mac or iPhone, this will print "wchar_t is 32 bits wide". Additionally, in C wchar_t is a typedef for another integer type, while in C++ it is a built-in type. In practice, this means you need to #include <wchar.h> in C when using wide characters.

Ref: http://blog.ablepear.com/2010/07/objective-c-tuesdays-wide-character.html

