This may be a naive question and maybe I'm missing something, but here it is: I'm trying to convert a char to an int to get the ASCII value of that char. In most cases I get the correct/expected ASCII code for a particular char, but in some cases I don't. Can someone explain why?
Examples:
// Example 1:
Console.WriteLine((int)'a');
// prints 97 - perfect!

// Example 2:
Console.WriteLine((char)1);
// prints ☺
// but then
Console.WriteLine((int)'☺');
// I expected this to print 1; instead it prints 9786. Why?
This seems to happen for characters whose ASCII value is greater than 127 or less than 32.
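
For reference, here is a minimal, complete version of what I'm running (a plain C# console app using top-level statements; I'm assuming the exact glyph shown for (char)1 can depend on the console font/code page, but I'm not sure about that):

using System;

Console.WriteLine((int)'a');      // prints 97, as expected
Console.WriteLine((char)1);       // the console shows ☺ here on my machine
Console.WriteLine((int)(char)1);  // prints 1, so the char/int cast itself round-trips
Console.WriteLine((int)'☺');      // prints 9786, not 1

If I'm reading this right, the numeric value survives the casts, and the mismatch only shows up when I take the ☺ the console displayed and paste it back into my source as a char literal.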