[lldb][TypeSystemClang] Deduce lldb::eEncodingUint for unsigned enum types
The motivating issue was the following:
```
$ cat main.cpp
#include <cstdint>

enum class EnumVals : uint16_t {
  VAL1 = 0
};

struct Foo {
  EnumVals b1 : 4;
};

int main() {
  // Assign a value that would be out of range
  // if the bit-field were signed
  Foo f{.b1 = (EnumVals)8};
  return 0; // Break here
}

(lldb) script
>>> lldb.frame.FindVariable("f").GetChildMemberWithName("b1").GetValueAsUnsigned()
4294967288
```
In the above example we observe an unsigned integer wrap-around
because we sign-extended the bit-field's underlying Scalar value
before casting it to an unsigned integer. The sign extension occurs
because we don't correctly mark APSInt::IsUnsigned == true when
extracting the value from memory (in Value::ResolveValue). Sign
extension causes the wrap-around because the value we assign to the
bit-field is out of range for a signed 4-bit bit-field, which makes
`Scalar::sext` left-fill the Scalar with 1s.
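For illustration, the same wrap-around can be reproduced with plain
integer bit-fields (a minimal standalone sketch, not LLDB code; the
`Bits` struct and variable names are made up for this example):
```
#include <cstdint>
#include <iostream>

// Two 4-bit fields holding the same bit pattern 0b1000. Reading the
// signed field back sign-extends (left-fills with 1s); reading the
// unsigned field back zero-extends.
struct Bits {
  int32_t as_signed : 4;
  uint32_t as_unsigned : 4;
};

int main() {
  int value = 8; // out of range for a signed 4-bit field, mirroring
                 // the (EnumVals)8 assignment above
  Bits b{};
  b.as_signed = value;   // stored as 0b1000, reads back as -8
  b.as_unsigned = value; // stored as 0b1000, reads back as 8

  // Casting the sign-extended value to an unsigned integer wraps
  // around, matching the 4294967288 that lldb printed above.
  std::cout << static_cast<uint32_t>(b.as_signed) << "\n"; // 4294967288
  std::cout << b.as_unsigned << "\n";                      // 8
}
```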
This patch corrects GetEncoding to account for unsigned enum types.
With this change the Scalar gets zero-extended instead.
This is mainly a convenience fix; well-formed code wouldn't run into
this issue in the first place.
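For reference, the shape of the change is roughly the following (a
simplified sketch of the enum handling in GetEncoding, not the
verbatim patch; the helper name GetEncodingForEnum is made up for
illustration):
```
#include "clang/AST/Type.h"
#include "lldb/lldb-enumerations.h"

// Sketch: an enum whose underlying integer type is unsigned gets
// eEncodingUint; everything else keeps the previous eEncodingSint.
static lldb::Encoding GetEncodingForEnum(clang::QualType qual_type) {
  if (qual_type->isUnsignedIntegerOrEnumerationType())
    return lldb::eEncodingUint;
  return lldb::eEncodingSint;
}
```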
rdar://99785324
Differential Revision: https://reviews.llvm.org/D134493