Quote:
Originally Posted by Dave Flowerday
Just FYI, this is because standard ANSI strings (char*) use 8 bits per character, but Unicode strings (WCHAR*) use 16 bits per character. The compiler correctly stopped you from passing a string of 16-bit characters to a function expecting 8-bit characters. If it hadn't, your program would have behaved very unexpectedly.
To add a little more information to this: WCHAR itself is always 16 bits on Windows. It is TCHAR that switches; it expands to WCHAR (16 bits) when UNICODE is defined and to a plain char (8 bits) when it is not. This is why the cast was failing.
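
A minimal sketch of the idea, assuming a Win32 build and a hypothetical TakesAnsiString function (not from the original thread) that expects an 8-bit string; it shows why the compiler rejects a WCHAR* and how TCHAR follows the UNICODE define:

Code:
#include <windows.h>

// Hypothetical stand-in for a function that expects an 8-bit (ANSI) string.
void TakesAnsiString(const char* s)
{
    (void)s; // does nothing; only here so the example is self-contained
}

void Example()
{
    WCHAR wide[] = L"hello";      // WCHAR is always 16 bits on Windows
    TCHAR text[] = TEXT("hello"); // TCHAR is WCHAR if UNICODE is defined, char otherwise

    // TakesAnsiString(wide);     // error: cannot convert WCHAR* to const char*

#ifndef UNICODE
    TakesAnsiString(text);        // compiles only in an ANSI (non-UNICODE) build
#endif

    (void)wide;
    (void)text;
}

In a UNICODE build you would call the W version of an API (or convert with WideCharToMultiByte) instead of casting, since a cast cannot change the width of the characters.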
Matt