Ask Erica: Why can’t I initialize my UInt?

CPow asks: “Why does this give an error? This makes no sense. I smell a radar…?”

[Screenshot: the failing UInt initialization and its compiler error]
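(The screenshot itself isn't preserved here; judging from the error discussed below, the failing line was presumably something like this:)

    // Reconstruction of the failing case: the literal is inferred as Int,
    // and 0x8000000000000000 (2^63) is one more than Int.max on 64-bit.
    let a = UInt(0x8000000000000000)
    // error (roughly): integer literal overflows when stored into 'Int'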

Answer: To the best of my understanding, the compiler types that literal as an Int by default, and 0x8000000000000000 (that is, 2^63) is one more than Int.max on a 64-bit platform, so the literal overflows before any UInt initializer gets involved. If you explicitly make sure the compiler understands that the literal is a UInt and not an Int, the problem disappears.

[Screenshot: the same initialization with the literal explicitly typed as UInt]
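(Again reconstructed from context rather than the original screenshot; with the literal explicitly typed as UInt, lines along these lines compile:)

    // Reconstruction: typing the literal as UInt avoids the Int inference.
    let b = 0x8000000000000000 as UInt
    let c: UInt = UInt(0x8000000000000000 as UInt)  // the UInt() wrapper is redundant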

(As Joe Groff points out, you obviously don’t need the UInt() in the “c” example, but I was trying to build off CPow’s question.)

Got a better explanation for CPow? Please let me know!

2 Comments

  • Seems as if Swift is unable to identify the correct initializer, maybe because 0x8000000000000000 is not an integer literal but rather a hexadecimal floating-point literal. UInt(integerLiteral: 0x8000000000000000) works, though.

    • It’s not a float. Looking at the docs, a 0x literal is always interpreted as a signed Int (which is what shows in the error). If you use a Double, it can represent that number, so the conversion happens. Hexadecimal float literals use the 0x…p… form (see the short example after the playground).

      Here’s the playground I created just messing about. 🙂 I like that you can use _ in the literal, which helps group bytes.


      import Foundation  // needed for String(format:)

      // Int.max: the largest literal that still fits in a signed Int
      let numberInt = 0x7FFF_FFFF_FFFF_FFFF

      // Explicit unsigned initializers sidestep the default Int inference
      let number: UInt64 = UInt64(integerLiteral: 0xffff_ffff_ffff_ffff)
      let ex2: UInt = UInt(integerLiteral: 0x8000_0000_0000_0000)
      let maxSignedLiteral: UInt = UInt(0x7FFF_FFFF_FFFF_FFFF)

      let max = UInt.max

      // Double can hold these magnitudes (2^63 is exactly representable),
      // so the conversions succeed
      let floatNumber: Double = 0x7FFF_FFFF_FFFF_FFFF
      let floatNumber2: Double = 0x8000_0000_0000_0000
      let floatNumber3: UInt = UInt(floatNumber2)

      print(String(format: "max %lu 0x%lx", max, max))
      print(String(format: "ex2 %lu 0x%lx", ex2, ex2))
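On that last point: a hexadecimal floating-point literal always carries a binary exponent, written with p (a power of two), so the 0x…p… form looks like this:

    // Hex float literals require a `p` binary exponent
    let eight = 0x1p3          // 1 x 2^3 == 8.0
    let threeHalves = 0x1.8p0  // 1.5 x 2^0 == 1.5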