Archive for the ‘Swift’ Category

Tests that don’t crash

Give a big hi to Tim V! He’ll be posting here when a topic inspires him and today, he’s going to talk about how to write tests that fail gracefully.

Most people who have been writing Swift code for a while try to limit their use of optional force-unwraps and try! as much as possible. Test code, on the other hand, is often still littered with unsafe calls. It’s true that crashes in tests aren’t nearly as undesirable as crashes in production code, but it’s fairly straightforward to write tests that fail gracefully when an unexpected nil is encountered or when an error is thrown unexpectedly.

Consider the following unit tests:

class Tests: XCTestCase {
    func testFoo() {
        let value = try! foo()
        XCTAssertEqual(value, 5)
    }

    func testBar() {
        let value = bar()!
        XCTAssertEqual(value, "bar")
    }
}

What a lot of people don’t seem to know is that individual tests can be marked with throws to automatically handle thrown errors:

func testFoo() throws {
    let value = try foo()
    XCTAssertEqual(value, 5)
}

Much better. To write testBar in a safer way, we’ll need to throw an error when the output of bar is nil. We could declare a separate error for each optional value we want to unwrap in one of our tests, but that requires writing a lot of extra code. Instead, we can throw a more general Optional.Error.unexpectedNil each time an unexpected nil value is encountered:

extension Optional {
    enum Error: Swift.Error {
        case unexpectedNil
    }

    func unwrap() throws -> Wrapped {
        guard let value = self else { throw Error.unexpectedNil }
        return value
    }
}

Note: Swift 3.0 and below do not support nesting types inside a generic type, so if you’re not yet using Swift 3.1, you’ll have to declare a separate enum OptionalError instead.
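For reference, a pre-3.1 variant might look like the following sketch, which simply moves the error type to the top level (the OptionalError name and the sample values are my own):

```swift
// Pre-Swift 3.1 sketch: generics couldn't nest types, so the
// error type lives at the top level instead
enum OptionalError: Error {
    case unexpectedNil
}

extension Optional {
    /// Returns the wrapped value or throws on nil
    func unwrap() throws -> Wrapped {
        guard let value = self else { throw OptionalError.unexpectedNil }
        return value
    }
}

let present: Int? = 5
let missing: Int? = nil
```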

Now we can rewrite testBar as follows:

func testBar() throws {
    let value = try bar().unwrap()
    XCTAssertEqual(value, "bar")
}

After these changes, whenever a test would normally crash, it now simply fails. And as a bonus, all tests are guaranteed to be executed, where previously a single crash would prevent the remaining tests from being run.

5 Easy Dispatch Tricks

Swift Dispatch offers a great way to schedule and control concurrent code. Here are five easy ways to improve your Dispatch experience.

1. Adding Seconds

Swift makes it easy to fetch the current time as a DispatchTime:

let timeNow = DispatchTime.now()

You can easily add seconds to now (in Double increments) using the + operator. As you can see from its signature, the RHS of the + operator expects a Double value of seconds:

public func +(time: DispatchTime, seconds: Double) -> DispatchTime

Here are a couple of examples that show how easy it is to do this:

let timeInFive: DispatchTime = .now() + 5
let timeInSevenAndHalf: DispatchTime = .now() + 7.5

DispatchTime also supports the minus operator, although subtraction is rarely used in dispatch:

public func -(time: DispatchTime, seconds: Double) -> DispatchTime
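As a quick illustration (the variable names here are my own), subtraction backdates a time, and DispatchTime’s Comparable conformance lets you confirm the ordering:

```swift
import Dispatch

// Backdate a deadline by five seconds using the minus operator
let fiveSecondsAgo: DispatchTime = .now() - 5.0
let timeNow = DispatchTime.now()
// Earlier DispatchTime values compare as smaller
```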

2. Adding Unit Time

DispatchTimeInterval offers handy units for microseconds, milliseconds, nanoseconds, and seconds. They all take integer arguments.
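For instance (a minimal sketch of my own; the counts are arbitrary), each unit case wraps an Int, and any of them can offset a deadline:

```swift
import Dispatch

// Each DispatchTimeInterval case takes an integer count of its unit
let twoSeconds: DispatchTimeInterval = .seconds(2)
let quarterSecond: DispatchTimeInterval = .milliseconds(250)
let pause: DispatchTimeInterval = .microseconds(50)
let tick: DispatchTimeInterval = .nanoseconds(1)

// Any interval can be added to a DispatchTime deadline
let deadline: DispatchTime = .now() + twoSeconds
```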

For example, you might dispatch five print commands, each offset by an increasing number of seconds:

(1 ... 5).forEach {
    DispatchQueue.main.asyncAfter(
        deadline: .now() + .seconds($0)) {
        print("Hi there!")
    }
}

This approach works by mixing and matching DispatchTime and DispatchTimeInterval math. These operators, too, are baked into Swift:

public func +(time: DispatchTime, interval: DispatchTimeInterval) -> DispatchTime
public func -(time: DispatchTime, interval: DispatchTimeInterval) -> DispatchTime

This version is arguably more readable and maintainable than the raw + 5 from trick #1:

let timeInFive: DispatchTime = .now() + .seconds(5)

3. Going Floating Point with Dispatch Time Intervals

DispatchTimeInterval case initializers are limited to integer arguments. You can extend the type to add support for Double values, returning a nanosecond instance with the equivalent value for a Double number of seconds:

// Built-in enumeration
public enum DispatchTimeInterval {
    case seconds(Int)
    case milliseconds(Int)
    case microseconds(Int)
    case nanoseconds(Int)
}

// Custom factory using a `Double` value
extension DispatchTimeInterval {
    public static func seconds(_ amount: Double) -> DispatchTimeInterval {
        let delay = Double(NSEC_PER_SEC) * amount
        return DispatchTimeInterval.nanoseconds(Int(delay))
    }
}

Like the earlier use of DispatchTimeInterval, this convenience constructor improves code readability while bypassing the Int-only case initializers:

let timeInEightAndHalf: DispatchTime = .now() + .seconds(8.5)

4. Async Forwards

Although DispatchTime math is convenient, it’s super-common to schedule code with respect to the current time. Why not let DispatchTime do the heavy lifting for you? Instead of saying .now() + some interval, consider extending DispatchTime to incorporate now into the call. Here’s an example that introduces a secondsFromNow(_:) dispatch time:

extension DispatchTime {
    public static func secondsFromNow(_ amount: Double) -> DispatchTime {
        return DispatchTime.now() + amount
    }
}

stride(from: 1.0, through: 5.0, by: 1.0).forEach {
    DispatchQueue.main.asyncAfter(deadline: .secondsFromNow($0)) {
        print("Hi there!")
    }
}

5. Testing in Playgrounds

When working with dispatch in playgrounds, ensure that execution continues until your work is done. Use PlaygroundPage’s indefinite execution and halting to control that work.

import PlaygroundSupport

PlaygroundPage.current.needsIndefiniteExecution = true
DispatchQueue.main.asyncAfter(deadline: .now() + .seconds(10)) {
    print("Ending Execution")
    PlaygroundPage.current.finishExecution()
}

Got any other favorite dispatch tricks? See something I need to fix? Drop me a comment, a tweet, or an email and let me know.


Like this post? Consider buying a book. Thank you!

Randomness and Portability

If you’ve written cross-platform code between Apple and Linux, you may have run into the “missing arc4random” issue. Part of BSD, and therefore automatically distributed with Darwin, arc4random uses “the key stream generator employed by the arc4 cipher, which uses 8*8 8 bit S-Boxes”.

On Linux, you can fall back to rand()/random() using conditional compilation, but that’s not a great solution for anyone looking for quality pseudo-randomness. Matt Gallagher has a lovely write-up about native Swift RNGs at Cocoa with Love, and links to his implementation over at the CwlUtils repository.

Jens Persson, on the swift-users list, offers a native Swift generator as well, a “stripped down version” of the Xoroshiro128+ PRNG.
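For a flavor of how small such a generator can be, here’s a minimal xoroshiro128+ sketch of my own (not Matt’s or Jens’s actual code); the state must be seeded with something other than all zeros:

```swift
// Minimal xoroshiro128+ sketch; the seed must not be all zeros
struct Xoroshiro128Plus {
    private var s0: UInt64
    private var s1: UInt64

    init(seed: (UInt64, UInt64)) {
        precondition(seed.0 != 0 || seed.1 != 0, "seed must be nonzero")
        s0 = seed.0
        s1 = seed.1
    }

    private static func rotl(_ x: UInt64, _ k: UInt64) -> UInt64 {
        return (x << k) | (x >> (64 - k))
    }

    mutating func next() -> UInt64 {
        let result = s0 &+ s1
        let t = s1 ^ s0
        // Constants 55, 14, 36 are the published xoroshiro128+ parameters
        s0 = Xoroshiro128Plus.rotl(s0, 55) ^ t ^ (t << 14)
        s1 = Xoroshiro128Plus.rotl(t, 36)
        return result
    }
}
```

The same seed always reproduces the same sequence, which makes this style of generator easy to test deterministically.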

Xcode Autocomplete Frustrations

A year after it debuted, Xcode’s enhanced autocomplete features continue to struggle with overly liberal matches:

In this example, several of the matching text results display few commonalities with my search phrase. There’s really no reason that “fatale” should match CFDataGetLength(theData: CFData!).

It shouldn’t be hard to create heuristics that count the number of matched chunks and measure their distance from each other, building a score that reflects whether the match is chunky (a good thing for keywords) and singular (another good thing for discerning developer intent).

Successful autocompletion promotes good matches and discards inappropriate ones. “upper” should score high on CFStringUppercase and low on CGScreenUpdateOperation and CSSMERR_TP_INVALID_CERTGROUP_POINTER.
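To make the idea concrete, here’s a toy sketch of chunk-based scoring (my own construction, certainly not Xcode’s actual heuristic): greedily match the query as a subsequence, count the contiguous chunks it lands in, and reward full matches that use fewer chunks:

```swift
// Toy fuzzy-match scorer: full subsequence matches in fewer,
// longer chunks score higher; failed matches score zero
func matchScore(query: String, candidate: String) -> Double {
    let q = Array(query.lowercased())
    guard !q.isEmpty else { return 0 }
    var qi = 0, chunks = 0, previousMatched = false
    for ch in candidate.lowercased() {
        if qi < q.count && ch == q[qi] {
            if !previousMatched { chunks += 1 } // new contiguous chunk
            previousMatched = true
            qi += 1
        } else {
            previousMatched = false
        }
    }
    guard qi == q.count else { return 0 } // query not fully matched
    return Double(q.count) / Double(chunks)
}

// "upper" matches CFStringUppercase in one tight chunk,
// so it outranks the scattered alternatives
print(matchScore(query: "upper", candidate: "CFStringUppercase"))       // 5.0
print(matchScore(query: "upper", candidate: "CGScreenUpdateOperation")) // 2.5
```

Even this crude scorer ranks “upper” exactly as the post suggests: high on CFStringUppercase, low on CGScreenUpdateOperation and CSSMERR_TP_INVALID_CERTGROUP_POINTER.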

That’s not the only problem with autocomplete. Image literal completion is a big problem: Xcode often prioritizes images over code APIs. When you start typing “picker”, Xcode should not suggest “picture-of-lovely-cat”. Here are some real-world examples of this issue:

One developer told me that while typing in for closures, eighty percent of the time he gets a random autocompleted image literal instead of the keyword he’s shooting for:

Surely, this is an obvious place to introduce autocomplete preferences that allow you to exclude literals from the API list. Autocomplete for image literals should act more like colors, offering an Image Literal entry point to an image picker instead of clogging the API namespace:

It would certainly get rid of those inappropriate matches.

Thanks Olivier Halligon, Andrew Campoli, and everyone else who gave me feedback and direction for this post.

Fun with Unicode Names

I don’t use Unicode all that often, but when I do, I tend to use character-picker copypasta or hex codes:

var ghoti = "🐟" // from character picker
print(ghoti) // 🐟
ghoti = String(UnicodeScalar(0x1F41F)!)
print(ghoti) // 🐟

Cocoa also supports loading unicode characters by name using the \N{UNICODE CHARACTER NAME} escape sequence. You can use patterns to construct unicode characters as in the following example:

// Make sure to escape the backslash with a second
// backslash to allow proper string construction
let fish = "\\N{FISH}"
    .applyingTransform(.toUnicodeName, reverse: true) // 🐟
let constructed = "I want to eat a \\N{FISH} sandwich"
    .applyingTransform(.toUnicodeName, reverse: true)
print(constructed!) // "I want to eat a 🐟 sandwich"

This is a reverse transform, in that it converts from escaped names to the symbol it represents. The forward transform takes a string and inserts name escape sequences in place of unicode characters:

let transformed = "🐶🐮💩"
    .applyingTransform(.toUnicodeName, reverse: false)

Unicode escapes are also usable in Cocoa regex matching. This example searches for the little blue fish in a string, printing out the results from that point:

let fishPattern = "\\N{FISH}"
let regex = try! NSRegularExpression(pattern: fishPattern, options: [])

let string = "I wish I had a 🐟 to eat"

// You have to use Cocoa-style ranges. Ugh.
let range = NSRange(location: 0, length: string.utf16.count)

// There's a fair degree of turbulence between 
// the Cocoa API and Swift here, especially with
// the Boolean stop pointer
regex.enumerateMatches(in: string, options: [], range: range) {
    (result, flags, stopBoolPtr) in
    guard let result = result
        else { print("Missing text checking result"); return }
    let substring = string.substring(from:
        string.index(string.startIndex, offsetBy: result.range.location))
    print(substring) // "🐟 to eat"
}

It’s hard going back from Swift’s string indexing model to Cocoa’s NSRange system. Native regex can’t arrive soon enough.

You can also break down unicode scalars to components:

var utf16View = UnicodeScalar("🐟")!.utf16
print(utf16View[0], utf16View[1]) // 55357 56351
print(String(utf16View[0], radix:16), 
    String(utf16View[1], radix: 16)) // d83d dc1f

This scalar approach goes boom when you try to push into highly composed characters:


let utf16View = "👨‍👩‍👦‍👦".utf16
for c in utf16View {
    print(c, "\t", String(c, radix: 16))
}
// 55357 	 d83d
// 56424 	 dc68
// 8205 	 200d
// 55357 	 d83d
// 56425 	 dc69
// 8205 	 200d
// 55357 	 d83d
// 56422 	 dc66
// 8205 	 200d
// 55357 	 d83d
// 56422 	 dc66

It’s interesting to see the four d83d components in there.
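That repetition falls out of the surrogate-pair math. As a small sketch (the helper function is my own), scalars above U+FFFF are split by subtracting 0x10000 and dividing the remaining 20 bits between a high and a low surrogate; the people emoji sit close enough together that they all share the high surrogate d83d:

```swift
// Split a supplementary-plane scalar into its UTF-16 surrogate pair
func surrogates(for scalar: UInt32) -> (high: UInt16, low: UInt16) {
    precondition(scalar > 0xFFFF, "BMP scalars need no surrogates")
    let v = scalar - 0x10000
    let high = UInt16(0xD800 + (v >> 10))   // top 10 bits
    let low  = UInt16(0xDC00 + (v & 0x3FF)) // bottom 10 bits
    return (high, low)
}

let fish = surrogates(for: 0x1F41F) // (high: 0xD83D, low: 0xDC1F)
let man  = surrogates(for: 0x1F468) // (high: 0xD83D, low: 0xDC68)
```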

Got any fun little Unicode tricks? Drop a comment, a tweet, or an email and let me know.


Tuple assignments

Do you have any good examples of when it would be useful to have a tuple but be doing complicated enough stuff with them?

Here are some examples I grepped out of a local folder, including some from third parties:

var (x, y) = (7.5, 7.5)
let (controlPoint1θ, controlPoint2θ) = (dθ / 3.0, 2.0 * dθ / 3.0)
var (_, sceneWidth) = boundingNode.boundingSphere
let (vMin, vMax) = label.boundingBox
let (duration, _) = cameraController?.performFlyover(toFace: mainActor.rotation) ?? (0, 0)
struct Point { var (x, y) : (Double, Double) }
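And for a self-contained flavor (a toy example of my own rather than grepped code), tuple destructuring works anywhere a pattern does, including loops:

```swift
// Accumulate x and y components via tuple destructuring
let points = [(1.0, 2.0), (3.0, 4.0)]
var (sumX, sumY) = (0.0, 0.0)
for (x, y) in points {
    sumX += x
    sumY += y
}
// sumX == 4.0, sumY == 6.0
```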

Hope that helps.

Holy War: This seems wrong

This works:

["23"].map({ Int($0) }) // works

But this doesn’t, presumably because of the defaulted radix argument for public init?(_ text: String, radix: Int = default):

["23"].map(Int.init) // nope
["23"].map(Int.init(_:)) // nope

But this does work:

[("23", 10)].map(Int.init(_:radix:)) // works

and this:

zip(["23"], repeatElement(10, count: .max))
    .map(Int.init(_:radix:)) // works

What do you think? Should this splattage be splermissable or splorbidden?


For those who “wish for this every day”, you can always extend Int:

extension Int {
    /// Provides map-specific `String` initialization,
    /// e.g. `["23"].map(Int.init(string:))`
    public init?(string value: String) {
        self.init(value, radix: 10)
    }
}

Evolving Label-Directed Tuple Assignments

You may be familiar with standard “tuple shuffles”. In the most common form, you use tuples to rearrange data without intermediate values:

var item1 = "x"
var item2 = "y"

// swap values
(item1, item2) = (item2, item1)
print("\(item1), \(item2)") // "y, x"

You streamline this approach by incorporating tuples into the declaration as well as the value swap:

// declare `item1` and `item2`
var (item1, item2) = ("x", "y")

// swap values
(item1, item2) = (item2, item1)
print("\(item1), \(item2)") // "y, x"

However, you can also use tuple labels to mix and match declarations with values. Right now on Swift Evolution, this is being referred to (incorrectly in my opinion) as a “tuple shuffle”:

// declare `value`
let value = (x: "x", y: "y")

// declare `item1` and `item2`
var (y: item1, x: item2) = value // y, x
print("\(item1), \(item2)") // "y, x"

// use label-based re-assignment
(y: item1, x: item2) = (x: item1, y: item2) // (y: "x", x: "y")
print("\(item1), \(item2)") // "x, y"

This Swift language feature, regardless of its name, is both obscure and, according to a recent evolution-list discussion, a source of unnecessary compiler complexity.

Robert Widmann has introduced a draft proposal to deprecate label-based declaration:

[This] is an undocumented feature of Swift in which one can re-order the indices of a tuple by writing a pattern that describes a permutation in a syntax reminiscent of adding type-annotations to a parameter list…it can be used to simultaneously [deconstruct] and reorder a tuple…(and)…map parameter labels out of order in a call expression.

Their inclusion in the language complicates every part of the compiler stack, uses a syntax that can be confused for type annotations, contradicts the goals of earlier SE’s (see SE-0060), and is applied inconsistently in the language in general.

There’s been a lively discussion on the Swift Evolution list about the proposal. I thought it was a fairly obvious and simple win. The draft has generated dozens and dozens of replies. Members have discussed both the nuances and possible side effects of the proposed change.

I agree with T.J. Usiyan in disliking the following pattern:

// Declare using labels
let rgba: (r: Int, g: Int, b: Int, a: Int) = (255, 255, 255, 0)

// Declare using re-ordered labels
let argb: (a: Int, r: Int, g: Int, b: Int) = rgba 
// This is the line I have issue with

print(argb.a) // "0"

This unintuitive approach runs counter to Swift’s philosophy of clarity and simplicity. Consider Joe Groff’s defaulted-parameter proposal SE-0060 and Austin Zheng’s SE-0111, which addressed the type significance of function argument labels. Why not ignore label order and focus on type checking? I think the “correct” behavior should act like this instead:


// Declare using re-ordered labels
let argb: (a: Int, r: Int, g: Int, b: Int) = rgba 
// (Int, Int, Int, Int) assigned to (Int, Int, Int, Int)

print(argb.a) // "255"

I’d rename the proposal to “Removing the Ordering Significance of Tuple Argument Labels in Declarations”, and ensure that tuple argument labels aren’t considered when typing the new constant in the previous example.

What do you think?

Swift Terms: arguments, parameters, and labels

Help me refine some terminology.

Start with this code snippet:

func foo(with a: Int) -> Int { return a }

You use arguments at call sites and parameters in declarations. This example function defines one parameter that accepts one argument from the call site, e.g. foo(with: 2). 

Apple’s Swift Programming Language book uses this approach:

Use func to declare a function. Call a function by following its name with a list of arguments in parentheses. Use -> to separate the parameter names and types from the function’s return type.

The mighty Wikipedia writes:

The term parameter (sometimes called formal parameter) is often used to refer to the variable as found in the function definition, while argument (sometimes called actual parameter) refers to the actual input passed.

The Swift grammar lays out the external and local differentiation available to parameter declarations:

parameter → external-parameter-name local-parameter-name type-annotation default-argument-clause

Like Apple, I use label instead of name to refer to a parameter’s API-facing component outside of the grammar. The Swift Programming Language refers to these as a custom argument label, naming it from the consumption POV. Apple writes:

By default, functions use their parameter names as labels for their arguments. Write a custom argument label before the parameter name, or write _ to use no argument label.

I generally call it a label or an external label instead. I often use parameter here (external parameter label, for example), especially when talking about the declaration. I don’t think there’s any real harm in doing so.

In this example, the parameter’s local or internal name is a. Some developers also call it the local or internal variable name. I find that word overloaded in meaning. Swift differentiates constants and variables and does not permit variables in function signatures.

I don’t have any problem calling it an internal argument name either because it’s the name given to the argument passed to the function or method. This seems slightly out of sync with SPL standards and practices. What do you think?