Many languages support an enumeration type which is backed by an integer but has friendly scoped names in the language proper.
C#, like many others, calls theirs `enum`. Whenever you use one, make sure you only work with the raw numeric value at the boundary of your code, if at all, and be careful not to change the order or values of the enumerands if you have persistent data represented with the enum.
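For example, pinning explicit values and converting only at the edges might look like this (a minimal sketch; `OrderStatus` and the helper names are illustrative, not from any particular codebase):

```csharp
using System;

// Explicit values: members can be reordered or renamed later without
// corrupting persisted data, because the numbers are pinned.
public enum OrderStatus
{
    Pending   = 0,
    Shipped   = 1,
    Delivered = 2,
    Cancelled = 3,
}

public static class OrderStatusBoundary
{
    // Convert from the raw value exactly once, at the boundary
    // (e.g. when reading from a database or a wire format).
    public static OrderStatus FromRaw(int raw) =>
        Enum.IsDefined(typeof(OrderStatus), raw)
            ? (OrderStatus)raw
            : throw new ArgumentOutOfRangeException(nameof(raw));

    // And back to the raw value only when writing it out again.
    public static int ToRaw(OrderStatus status) => (int)status;
}
```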
Even if your value is boolean-like (true/false, yes/no), consider a small enum anyway for clarity in function signatures, particularly if it's unclear which meaning maps to true and which to false. The same holds for assigning meaning to numeric values.
No-one likes to see `f(true, 0, 0, null, false, false, 3)`, not even with IDE functionality revealing the signature on hover or button press.
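Compare that to a signature where every argument names itself (a sketch; the enums and the `CreateReport` function are made up for illustration):

```csharp
public enum IncludeHeader { No, Yes }
public enum OutputFormat { PlainText = 0, Html = 1, Pdf = 2 }

public static class Reports
{
    // Each parameter documents itself at the call site,
    // unlike a bare bool or a magic number.
    public static void CreateReport(IncludeHeader header, OutputFormat format)
    {
        // ...
    }
}

// The call site reads without hovering for the signature:
// Reports.CreateReport(IncludeHeader.Yes, OutputFormat.Pdf);
```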
Performance-wise, you're not going to see any meaningful difference between a char and an integer. Strings incur whatever (small) costs come with reference types and with producing a known string literal. Either will likely be negligible; optimize for readability and resistance to bugs instead.
If your question is actually about databases, consider a foreign key into a separate lookup table; that way you can ensure that only valid values end up in the column, and it's reasonably extensible.
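A minimal sketch of that idea (the table and column names are illustrative, and the dialect here is SQL Server's):

```sql
-- The lookup table holds one row per valid status,
-- and the foreign key rejects anything else.
CREATE TABLE OrderStatus (
    Id   INT          NOT NULL PRIMARY KEY,
    Name NVARCHAR(50) NOT NULL UNIQUE
);

INSERT INTO OrderStatus (Id, Name) VALUES
    (0, 'Pending'),
    (1, 'Shipped'),
    (2, 'Delivered');

CREATE TABLE Orders (
    Id       INT NOT NULL PRIMARY KEY,
    StatusId INT NOT NULL
        REFERENCES OrderStatus (Id)  -- only valid values can be inserted
);

-- Extending the set later is a single INSERT, no schema change:
-- INSERT INTO OrderStatus (Id, Name) VALUES (3, 'Cancelled');
```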
MSDN on enum