
Int to UInt (and vice versa) bit casting in Swift


You can do:

let unsigned = UInt8(bitPattern: Int8(-1))   // -> 255
let signed   = Int8(bitPattern: UInt8(0xff)) // -> -1

Many similar initializers exist:

extension Int8 {
    init(_ v: UInt8)
    init(_ v: UInt16)
    init(truncatingBitPattern: UInt16)
    init(_ v: Int16)
    init(truncatingBitPattern: Int16)
    init(_ v: UInt32)
    init(truncatingBitPattern: UInt32)
    init(_ v: Int32)
    init(truncatingBitPattern: Int32)
    init(_ v: UInt64)
    init(truncatingBitPattern: UInt64)
    init(_ v: Int64)
    init(truncatingBitPattern: Int64)
    init(_ v: UInt)
    init(truncatingBitPattern: UInt)
    init(_ v: Int)
    init(truncatingBitPattern: Int)
    init(bitPattern: UInt8)
}
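
For the word-sized Int/UInt pair the question asks about, matching bitPattern initializers exist as well. A minimal sketch (note that later Swift versions renamed the truncatingBitPattern family to init(truncatingIfNeeded:)):

let s = Int(bitPattern: UInt.max)                    // -> -1
let u = UInt(bitPattern: -1 as Int)                  // -> 18446744073709551615 on 64-bit
let t = Int8(truncatingBitPattern: 0x1FF as UInt16)  // -> -1, keeps only the low 8 bits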


I took the algebra route. Testing has been a pain because it is easy to get an overflow, and the strong typing breaks execution: the Playground returned a negative value from the toUint function, and it kept crashing or giving funny errors when doing a double cast (I opened a bug report). Anyway, this is what I ended up with:

func toUint(signed: Int) -> UInt {
    let unsigned = signed >= 0 ?
        UInt(signed) :
        UInt(signed - Int.min) + UInt(Int.max) + 1
    return unsigned
}

func toInt(unsigned: UInt) -> Int {
    let signed = (unsigned <= UInt(Int.max)) ?
        Int(unsigned) :
        Int(unsigned - UInt(Int.max) - 1) + Int.min
    return signed
}
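
A sketch of the extreme-value checks described below (Swift 2-era call syntax, where the first parameter label is dropped):

assert(toUint(0) == 0)
assert(toUint(Int.max) == UInt(Int.max))
assert(toUint(Int.min) == UInt(Int.max) + 1)  // the bit pattern of Int.min
assert(toInt(UInt.min) == 0)
assert(toInt(UInt(Int.max)) == Int.max)
assert(toInt(UInt.max) == -1)                 // the bit pattern of -1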

I tested them with all the extreme values (UInt.min, UInt.max, Int.min, Int.max) and, when Xcode doesn't go crazy, they seem to work, but it all looks overly complicated. Bizarrely enough, the UInt-to-Int bit cast could be achieved simply with the hashValue property, as in:

let signed = UInt.max.hashValue // signed is -1

But obviously it isn't guaranteed to always work (it should, but I'd rather not take the chance).

Any other ideas would be appreciated.


numericCast(...) is what you're looking for. It's a set of generic functions that convert between different number types, picking the correct implementation based on the type of its argument and the expected return type.
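
A minimal sketch of how the expected type drives the conversion (one caveat worth knowing: numericCast is range-checked, so it traps on values that don't fit the destination type rather than reinterpreting bits):

let u: UInt = 42
let i: Int = numericCast(u)             // picks the UInt -> Int conversion from the annotation
let b: UInt8 = numericCast(200 as Int)  // picks Int -> UInt8; 200 fits, so this is fine
// let bad: Int8 = numericCast(200 as Int)  // would trap at runtime: 200 doesn't fit in Int8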