Quote:
|
Originally Posted by sciguy125
Apparently, the compiler doesn't like that. I'm not really sure why it's defined or why it works without it, but taking it out fixed the problem.
|
Just FYI, this is because standard ASCII strings (char*) use 8 bits per character, while Unicode strings (WCHAR*) on Windows use 16 bits per character (UTF-16). The compiler correctly stopped you from passing a string of 16-bit characters to a function expecting 8-bit characters. If it hadn't, your program would have behaved very unexpectedly.
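Here's a minimal sketch of the mismatch, assuming a function that takes a plain char* (the names are made up for illustration, not from your code):

    #include <stdio.h>
    #include <wchar.h>

    /* A function expecting a narrow (8-bit) string. */
    void print_narrow(const char *s)
    {
        printf("%s\n", s);
    }

    int main(void)
    {
        const char    *narrow = "hello";  /* 8 bits per character             */
        const wchar_t *wide   = L"hello"; /* 16 bits per character on Windows */

        print_narrow(narrow);             /* OK: types match                   */
        /* print_narrow(wide); */         /* compile error: wchar_t* isn't char* */

        wprintf(L"%ls\n", wide);          /* use the wide-string API instead   */
        return 0;
    }

So the fix is either to keep the string narrow (drop the L prefix / UNICODE define) or to call the wide version of the function, depending on which one the API actually wants.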