Ftypes.h incorrectly enables Unicode (on MSVC/others?)

There’s an issue with this part in VST3_SDK\pluginterfaces\base\ftypes.h:

//#define UNICODE_OFF 	// disable / enable unicode

#ifdef UNICODE_OFF
	#ifdef UNICODE
	#undef UNICODE
	#endif
#else
	#define UNICODE 1
#endif

#ifdef UNICODE
#define _UNICODE 1
#endif

On MSVC (not sure about other compilers), _UNICODE is automatically defined by VC when the Project Properties are set to a Unicode character set. The iPlug2 sample projects are all set to Multi-Byte (i.e. non-Unicode) character sets and don't use wide strings, yet the code above enables _UNICODE anyway.

That means that any Windows API calls (which come in Unicode and non-Unicode versions, automatically resolved via defines) will use Unicode strings, and all the wrapper defines for all string functions and the respective char type (used via <tchar.h>) also incorrectly resolve to Unicode.
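For anyone not familiar with the mechanism, here's a minimal sketch (nothing to do with iPlug2 itself) of how both defines are consumed at include time; with the SDK forcing them on while the project is set to Multi-Byte, every one of these lines silently switches to the wide-character variant:

#include <windows.h> // MessageBox resolves to MessageBoxW if UNICODE is defined, else MessageBoxA
#include <tchar.h>   // TCHAR, _T() and _tcslen resolve to wide or narrow based on _UNICODE

int main()
{
	const TCHAR* msg = _T("hello");                  // wchar_t* / L"hello" or char* / "hello"
	size_t len = _tcslen(msg);                       // wcslen or strlen
	MessageBox(nullptr, msg, _T("example"), MB_OK);  // MessageBoxW or MessageBoxA
	return (int)len;
}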

Adding UNICODE_OFF manually to the VST3_DEFS gives a compilation error in Steinberg's tstrlen implementation:

1>n:\_projects\_3rd party\iplug2\dependencies\iplug\vst3_sdk\public.sdk\source\vst\vstparameters.cpp(131): error C2664: 'Steinberg::int32 Steinberg::tstrlen(const Steinberg::tchar *)': cannot convert argument 1 from 'const Steinberg::Vst::TChar *' to 'const Steinberg::tchar *'
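For context, the mismatch comes from Vst::TChar always being UTF-16 while tchar follows the UNICODE define (simplified from memory; the exact typedefs vary by SDK version):

// simplified sketch of the relevant SDK typedefs, not the verbatim headers
// pluginterfaces/base/ftypes.h:
typedef char     char8;
typedef char16_t char16;
#ifdef UNICODE
typedef char16 tchar;   // tchar tracks the UNICODE define...
#else
typedef char8 tchar;
#endif

// pluginterfaces/vst/vsttypes.h:
typedef char16 TChar;   // ...but Vst::TChar is always UTF-16

// so with UNICODE_OFF, tstrlen(const tchar*) expects char8*, and the
// Vst::TChar* (char16*) argument in vstparameters.cpp no longer converts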

They acknowledged this in 2017 but haven’t updated their SDK?
https://sdk.steinberg.net/viewtopic.php?t=355

(EDIT: just realised that you reported it Oli :slight_smile: )

They give a workaround on that link which I’ve applied to their fstrdefs.h. Can we apply this to the version the IPlug2 script downloads?

#ifndef UNICODE
inline int32 tstrlen (const char16* str) {return _tstrlen (str);}
#else
inline int32 tstrlen (const tchar* str) {return _tstrlen (str);}
#endif

But they should detect _UNICODE rather than messing with it, at least on Windows. If forcing it makes sense on other platforms, they should add an MSVC-specific codepath.
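Something along these lines (just an illustration of that suggestion, not Steinberg's code) would respect the MSVC project settings while leaving other platforms as they are:

// hypothetical replacement for the ftypes.h block quoted above
#ifdef _MSC_VER
	// on MSVC, follow the character set the project already defines
	#if defined(_UNICODE) && !defined(UNICODE)
	#define UNICODE 1
	#endif
#else
	// elsewhere, keep the existing opt-out behaviour
	#ifndef UNICODE_OFF
	#define UNICODE 1
	#endif
#endif

#if defined(UNICODE) && !defined(_UNICODE)
	#define _UNICODE 1
#endif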

Can you describe the actual issue that arises for you with VST3? We're aware of this aspect of the VST3 SDK, and as far as I can remember (having done quite a bit of the character stuff in iplug2) we always explicitly call the versions of the Windows string/character functions we want (rather than using the macros), so everything iplug2-related should be fine.

So is the problem:
A - that something in iplug2 doesn't work as expected?
B - that you can't call Windows-specific string stuff in your code?

So the issue is that I’m a Windows-only programmer. I’ve built up masses of library code over the years, and all of it is based on the assumption that _UNICODE is correctly and automatically defined, based on MSVC project settings. This is standard practice for Windows development.

I use <tchar.h> (like most Windows C/C++ programmers) to automatically wrap all the standard string functions too, again this requires the correct definition of _UNICODE.

So the automatic _UNICODE define is a major aspect of Windows programming, and it should not be redefined by 3rd-party code, else everything breaks.

To be clear, this is not just for convenience. Writing string-type agnostic code (using all these techniques) allows you to write library code, and apps, that can work with either string type just by changing the project settings. So a Windows programmer should never explicitly call a char-type-specific function.

That means that, for example, all strings are wrapped like this:

#include <tchar.h> // provides string function type wrappers, as well as the _T() macro

const TCHAR* string = _T("my string");

With _UNICODE defined, this resolves to:

const wchar_t* string = L"my string";

Without _UNICODE it resolves to

const char* string = "my string";

So as you can see this compiles correctly in either scenario.
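The function wrappers work the same way, so (a trivial sketch, with a function name made up for illustration) a single body covers both builds:

#include <stdio.h>
#include <tchar.h>

// _UNICODE defined:   _tcslen -> wcslen, _tprintf -> wprintf
// _UNICODE undefined: _tcslen -> strlen, _tprintf -> printf
size_t PrintLength(const TCHAR* s)
{
	size_t n = _tcslen(s);
	_tprintf(_T("length = %u\n"), (unsigned)n);
	return n;
}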

Thanks for explaining the problem so clearly. I just wanted to make sure that this was an issue affecting Windows-specific code outside of iplug2 - that's not to say it isn't important, but obviously it's a slightly different scenario from something in iplug2 functioning incorrectly by default.

It is also important to note that this is a VST3 SDK bug, and so there are three possible routes:

  • convince Steinberg to fix it
  • maintain a workaround in iplug2 for all iplug2 users
  • leave it to individual developers to work around

To my mind the first of those is the best solution, but that may not happen. I suggest (given that you have a workaround) that the best way forward is to make an issue or PR on GitHub for this, referencing this post. I've just checked, and currently we include the VST3 SDK without modification - if this were something we could apply in our own code I'd be happy to do it, but there will need to be a discussion with @olilarkin about how he would feel about maintaining a patch.

Yes, these are VST3 SDK issues. In general, there is no problem with an app/library sticking to multi-byte, or even using string-type-specific Windows calls; that won't break existing code like mine. It does mean that you can't just change the project settings to change string type.

But Steinberg should not be re-defining _UNICODE (on Windows at least).

Both issues can be worked around without modifying the VST3 SDK, by adding this to IPlugVST3.h before the VST3 header includes:

//  fix VST3 SDK handling the _UNICODE define incorrectly on Windows
#ifdef _MSC_VER       // Visual Studio
# ifndef _UNICODE     //  pre-defined by VC based on project settings
#  define UNICODE_OFF //   disable Unicode in the VST3 SDK
# endif
#endif
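The ordering matters, since ftypes.h acts on the define the first time it is parsed - the block has to sit above the first VST3 SDK include in IPlugVST3.h (the include below is only illustrative, not the actual iPlug2 include list):

//  fix VST3 SDK handling the _UNICODE define incorrectly on Windows
#ifdef _MSC_VER
# ifndef _UNICODE
#  define UNICODE_OFF
# endif
#endif

#include "pluginterfaces/base/ftypes.h" // now respects the project's character set
// ... remaining VST3 SDK / iPlug2 VST3 includes ...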

… actually both issues can be fixed in the same iPlug2 header without modifying the VST3 SDK (I’ve updated the above post).

If you could make a PR for this that would be great.

We’ve just merged some changes (Unicode improvements and fixes by AlexHarker · Pull Request #1044 · iPlug2/iPlug2 · GitHub) which will impact this issue. @_gl do you have time to check if it is still a problem on master?