I deliberately used the character literal ‘A’, not a UTF-8 string. I think you are mistaken to conflate a character with a string. Is this wrong?
You can have a Unicode character literal -- and depending on the language, there may be no distinction between a character and a string at the type level: a character is just a string of length 1.
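For instance, Python is one such language: it has no separate character type at all, and indexing a string yields another string. A quick sketch:

```python
# Python has no char type: a "character" is just a str of length 1.
ch = 'A'
print(type(ch).__name__)  # str
print(len(ch))            # 1

# Indexing a string gives back another str, not a character type:
first = 'ABC'[0]
print(type(first).__name__)  # str
print(first == 'A')          # True
```

By contrast, languages like C or Rust do distinguish them at the type level (`'A'` is a `char`, `"A"` is a string), so whether the distinction exists really does depend on the language.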