_T( ) macro changes for UNICODE character data


You can't - not without C++0x support. C++0x defines the following ways of declaring string literals (a short sketch of them follows the list):

  • "string of char characters in some implementation defined encoding" - char
  • u8"String of utf8 chars" - char
  • u"string of utf16 chars" - char16_t
  • U"string of utf32 chars" - char32_t
  • L"string of wchar_t in some implementation defined encoding" - wchar_t

Until C++0x is widely supported, the only way to encode a UTF-16 string in a cross-platform way is to break it up into individual characters:

// make a char16_t type to stand in until MSVC/GCC/etc. support
// C++0x UTF string literals
#ifndef CHAR16_T_DEFINED
#define CHAR16_T_DEFINED
typedef unsigned short char16_t;
#endif

const char16_t strABC[] = { 'a', 'b', 'c', '\0' };
// the same declaration would work for a type that changes from 8 to 16 bits:

#ifdef _UNICODE
typedef char16_t TCHAR;
#else
typedef char TCHAR;
#endif
const TCHAR strABC2[] = { 'a', 'b', 'c', '\0' };

The _T macro can only deliver the goods on platforms where wchar_t is 16 bits wide. Even then, the alternative is still not truly cross-platform: the encoding of char and wchar_t is implementation defined, so 'a' does not necessarily encode the Unicode code point for 'a' (0x61). Thus, to be strictly accurate, this is the only way of writing the string:

const TCHAR strABC[] = { '\x61', '\x62', '\x63', '\0' };

Which is just horrible.
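
If you accept the simpler { 'a', 'b', 'c' } spelling anyway, a compile-time guard can at least document the assumptions it relies on. A minimal C++03-compatible sketch (the typedef names are purely illustrative):

// Compilation fails (negative array size) if either assumption breaks:
// the stand-in char16_t must be 2 bytes wide, and the execution character
// set must place 'a' at 0x61 (i.e. be ASCII-compatible for these letters).
typedef int assert_char16_is_2_bytes[sizeof(char16_t) == 2 ? 1 : -1];
typedef int assert_charset_is_ascii_like['a' == 0x61 ? 1 : -1];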

Author: Nagasai Sowmya

Updated on July 13, 2022

Comments

  • Nagasai Sowmya almost 2 years

    I have a UNICODE application in which we use _T(x), which is defined as follows.

    #if defined(_UNICODE)
    #define _T(x) L ##x
    #else
    #define _T(x) x
    #endif
    

    I understand that L gets defined to wchar_t, which will be 4 bytes on any platform. Please correct me if I am wrong. My requirement is that I need L to be 2 bytes. So, as a compiler hack, I started using the -fshort-wchar gcc flag. But now I need to move my application to zSeries, where the -fshort-wchar flag does not seem to have any effect.

    In order to port my application to zSeries, I need to modify the _T( ) macro in such a way that, even when using L ##x and without the -fshort-wchar flag, I get 2-byte wide character data. Can someone tell me how I can change the definition of L so that it is always 2 bytes in my application?

    • nothrow over 13 years
      AFAIK, wchar_t is 2 bytes wide on Windows, so the size of wchar_t is implementation dependent.
    • MSalters over 13 years
      wchar_t is normally used as the base type for WCHAR, which certainly is 2 bytes wide. Functions like MessageBoxW have WCHAR* arguments, so having WCHAR and wchar_t identical makes Windows programming a lot easier.
    • josesuero over 13 years
      L is just the character 'L'. It doesn't get defined to be anything. In C++, L"hello world" just defines a wide string literal. But the L doesn't get replaced by anything.
  • MSalters over 13 years
    Doesn't solve the problem - you still don't have a way to create const uint_least16_t[] literals.
  • MSalters over 13 years
    Mind you, on an IBM zSeries 'a' is still equal to 0x61, but 'j' is not 0x6a.
  • Nagasai Sowmya over 13 years
    I am using the GCC compiler. Is there any GCC compiler flag other than -fshort-wchar to change the size of wchar_t?
  • pmg over 13 years
    @MSalters: const uint_least16_t data[] = { 'f', 'o', 'o', 'b', 'a', 'r', '\0' };
  • MSalters over 13 years
    Did you spot the "I need to modify _T( ) macro" part of the question? How does _T("foobar") expand to const uint_least16_t data[] = { 'f', 'o', 'o', 'b', 'a', 'r', '\0' }; ?
  • pmg over 13 years
    You want two incompatible things: at least one of them has to compromise, and the computer is much more stubborn than you are.
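
Pulling the thread together: the closest thing to a portable 16-bit string today is pmg's uint_least16_t suggestion, with the caveat MSalters raises that no _T()-style macro can expand a single "abc" literal into that initializer list, so every string has to be written out by hand. A minimal sketch of that approach (the typedef and array names are illustrative):

#include <stdint.h> /* uint_least16_t */

/* An at-least-16-bit character type that does not depend on C++0x
 * literals or on -fshort-wchar. */
typedef uint_least16_t u16char;

/* "foobar" spelled out character by character; the values are only the
 * expected Unicode code points where the execution character set is
 * ASCII-compatible, as noted in the answer above. */
static const u16char kFoobar[] = { 'f', 'o', 'o', 'b', 'a', 'r', '\0' };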