Archive for the ‘Xcode’ Category

The Unexpected Joy of Vector Images in iOS 11

A few years ago, I griped about That Vector Thing: the way Xcode 6 handled PDF vector assets without extending them to arbitrary use in UIImage instances.

Enter WWDC 2017. In Session 201 “What’s New in Cocoa Touch”, Apple described Asset Catalogs with PDF-backed vector images. All you have to do is tick the “Preserve Vector Data” checkbox.

After chatting with some colleagues about whether this would actually work as promised, I dragged up a 29×29 vector PDF image.

And I added it as a 1x image in an asset catalog. Notice the Resizing “Preserve Vector Data” checkbox to the right; it is unchecked by default.

I then built a super-simple single view testbed to test things out.

override func viewDidLoad() {
    super.viewDidLoad()

    let imageView = UIImageView()
    imageView.translatesAutoresizingMaskIntoConstraints = false
    imageView.contentMode = .scaleAspectFit
    view.addSubview(imageView)

    ["H:|[v]|", "V:|[v]|"].forEach { format in
        NSLayoutConstraint
            .constraints(withVisualFormat: format,
                         options: [], metrics: nil,
                         views: ["v": imageView])
            .forEach { $0.isActive = true }
    }

    let image = UIImage(named: "Biff")
    imageView.image = image
}

Here’s what that 29×29 image looks like running on an iPhone 7+ in the simulator. The 1x image is being rendered on a 3x destination, at a greatly magnified size. Its vector data ensures the image renders without losing detail or clarity. Compare it to the same asset that does not preserve vector data and my original test from 2014:

  

Click above to see full size screenshot originals. Below is a comparison shot from the simulator at the largest size.

I expect there are minor performance hits in scaling and rendering the vector image compared to loading a standard PNG or JPEG, but I didn’t get around to measuring the costs.
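If you want to eyeball those costs yourself, a quick sketch along these lines (the helper is mine, not part of the original test, and Instruments or signposts would give better numbers) times repeated offscreen renders at a target size, so you can compare the vector-backed asset against a plain bitmap:

import UIKit

func renderTime(for image: UIImage, size: CGSize, iterations: Int = 100) -> TimeInterval {
    let renderer = UIGraphicsImageRenderer(size: size)
    let start = CFAbsoluteTimeGetCurrent()
    for _ in 0 ..< iterations {
        // Each pass draws the image into an offscreen bitmap at the target size
        _ = renderer.image { _ in
            image.draw(in: CGRect(origin: .zero, size: size))
        }
    }
    return CFAbsoluteTimeGetCurrent() - start
}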

If you like my write-ups, please consider buying a book.

The problem with Swift Playground localization

Starting in Swift Playgrounds 2, you can now use localized strings to guide the narration of your interactive lessons. As the screenshot above demonstrates, you can use localizable markup to provide the most appropriate text for titles, introductory text, and feedback.

However, what you can’t do is localize Swift members. Your French and Chinese consumers must tell Byte to moveForward(), not avancer() or 向前移动().

One of the guiding principles of the Swift language is demonstrated in its embrace of unicode for identifier symbols. This approach accommodates programmers and programming styles from many languages and cultures.

Xcode 9 has introduced major advances in code refactoring. It seems an obvious win to allow that technology to be applied to Swift Playgrounds 2, enabling identifier localization.

That’s because identifiers play such a key role in Swift Playgrounds. Unlike standard development tasks, where there’s no need to create idiomatic APIs like IUContrôleurDeNavigation, the point of Swift Playgrounds is to teach and instruct. It uses small, limited, controlled API exposure, nearly all custom and in support of the teaching story.

The anthropomorphized Byte character acts as a stand-in for the learner coder. And in doing so, it should communicate with commands that this coder identifies with, turnLeft and moveForward, not incomprehensibleForeignPhrase1 and evenMoreConfusingForeignPhrase2.

I think this is an opportunity waiting to happen, and I can’t imagine it would be all that hard to implement given the expansive identifier suite and the limited API visibility presented in a typical playground book.

What do you think? Is it too much to ask for a localizable.Members.plist?

Simulating a second finger during drag

You can drag and drop in the iOS simulator by clicking and holding an item. The item “pops” and you can then drag it to a destination. Today, an Apple engineer shared a neat way to free up a “second finger” during this process.

Pause and press the control key. This pins an item mid-drag, letting you use the Mac cursor as another touch. You can then resume the drag by grabbing the paused item and concluding your drop.

Apple open sources key file-level transformation Xcode components

Ted Kremenek writes on the swift-dev list:

This afternoon at WWDC we announced a new refactoring feature in Xcode 9 that supports Swift, C, Objective-C, and C++.  We also announced we will be open sourcing the key parts of the engine that support file-level transformations, as well as the compiler pieces for the new index-while-building feature in Xcode.

We will be releasing the sources in stages, likely over the next few weeks:

– For the refactoring support for Swift, there are some cleanups we’d like to do as well as some documentation we’d like to author before we push these sources back.  Argyrios Kyrtzidis and his team from Apple will be handling that effort.

– For the refactoring support for C/C++/Objective-C, these are changes we’d like to work with the LLVM community to upstream to the LLVM project.  These will likely be first staged to the swift-clang repository on GitHub, but that is not their intended final destination.  Duncan Exon Smith and his team from Apple will be handling that effort.

– We’ll also be open sourcing the compiler support for indexing-while-building, which include changes to both Clang and Swift.   Argyrios and his team will be driving that effort.  For the clang changes they will likely be first staged to swift-clang, and then discussed with the LLVM community to upstream them to mainline Clang.

– Finally, we will be open sourcing the remaining pieces of the Swift migrator.  Argyrios and his team will be handling the push back of changes there, and those changes will only be impacting the swift repository.

As usual, we’ll also be pushing back changes to have Swift work with the latest Apple SDKs. We’re expecting that push back to happen early next week. When that happens we will temporarily lock commit access to the repositories. Details about that will be sent out in a later email. Until then, the downloadable toolchains from Swift.org will continue to work with Xcode 8.3.2. After we do the push back, the downloadable toolchains will be moved to be baselined on the Xcode 9.0 betas. This shift is necessary as changes to the overlays depend on the latest SDKs.

Xcode Autocomplete Frustrations

A year after it debuted, Xcode’s enhanced autocomplete features continue to struggle with overly liberal matches:

In this example, several of the matching text results display few commonalities with my search phrase. There’s really no reason that “fatale” should match CFDataGetLength(theData: CFData!).

It shouldn’t be hard to create heuristics that count the number of matched chunks and their distance from each other to build a score reflecting whether a match is chunky (a good thing for keywords) and singular (another good thing for discerning developer intent).

Successful autocompletion promotes good matches and discards inappropriate ones. “upper” should score high on CFStringUppercase and low on CGScreenUpdateOperation and CSSMERR_TP_INVALID_CERTGROUP_POINTER.
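To make that concrete, here’s a toy sketch of such a heuristic, purely illustrative and certainly not Xcode’s actual algorithm. It rewards query letters that land in a few tight chunks of the candidate and penalizes matches that scatter across it:

func matchScore(query: String, candidate: String) -> Double {
    let queryChars = Array(query.lowercased())
    let candidateChars = Array(candidate.lowercased())
    guard !queryChars.isEmpty else { return 0 }

    // Greedily locate each query character, in order, within the candidate
    var positions: [Int] = []
    var searchStart = 0
    for character in queryChars {
        guard let index = candidateChars[searchStart...].firstIndex(of: character)
            else { return 0 } // not even a subsequence: discard the match entirely
        positions.append(index)
        searchStart = index + 1
    }

    // Count chunks (runs of adjacent matches) and the overall spread
    var chunks = 1
    for (previous, current) in zip(positions, positions.dropFirst())
        where current != previous + 1 { chunks += 1 }
    let spread = Double(positions.last! - positions.first! + 1)

    // Fewer chunks and a tighter spread look more like developer intent
    return Double(queryChars.count) / (Double(chunks) * spread)
}

matchScore(query: "upper", candidate: "CFStringUppercase")       // one tight chunk: scores higher
matchScore(query: "upper", candidate: "CGScreenUpdateOperation") // scattered chunks: scores lower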

That’s not the only issue with autocomplete. Image literal completion is a big problem too. Xcode often prioritizes images over code APIs. When starting to type “picker”, Xcode should not suggest “picture-of-lovely-cat”. Here are some real world examples of this issue:

One developer told me that while typing “in” for closures, eighty percent of the time he gets a random autocompleted image literal instead of the keyword he’s shooting for:

Surely, this is an obvious place to introduce autocomplete preferences that allow you to exclude literals from the API list. Autocomplete for image literals should act more like colors, offering an Image Literal entry point to an image picker instead of clogging the API namespace:

It would certainly get rid of those inappropriate “in” matches.

Thanks Olivier Halligon, Andrew Campoli, and everyone else who gave me feedback and direction for this post.

Xcode Tricks: API Changes

Here’s a quick trick that helps you review changes between SDK releases, including beta SDKs. In Xcode, select Help > API Changes.

Xcode automatically navigates your web browser to https://developer.apple.com/api-changes/. (You can also visit this link directly.) At the web site, you’ll find a list of current and archival API deltas:

For example, when I clicked on the latest Beta release, here’s the screen that popped up today:

Color-coded overviews at the top left reflect the number of changes (modified, added, deprecated). A pop-up at the right enables you to switch between specific betas and releases.

Select individual modules to view changes in-place, annotated with corresponding color highlights.

As you navigate down to finer details, the change highlights trickle down to deeper and deeper levels.

Select an older version of Xcode from the top-right pop-up to view a current-to-previous comparison. In this case, the type constraint has moved to a trailing where clause.
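For a purely illustrative example of that kind of declaration change (a generic pattern, not the specific API in my screenshot), compare the pre-SE-0081 spelling, which no longer compiles, with the current trailing-where spelling:

// Older listings placed constraints inside the generic parameter list:
//     func makeString<S: Sequence where S.Iterator.Element == Character>(_ source: S) -> String

// Current listings move the constraint to a trailing where clause:
func makeString<S: Sequence>(_ source: S) -> String where S.Element == Character {
    return String(source)
}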

It’s a fascinating way to present and navigate API changes.

Bluetooth Lessons II: Characteristics

Yesterday, I wrote about the basics involved in setting up a Bluetooth manager and scanning for available peripherals. The sample code left off after finding announced devices.

Last evening, I expanded this functionality to find a Mi wristband and execute a repeating vibration pattern. Today’s code is about four times longer and involves a lot more direct interaction with a BLE peripheral.

The changes start with finding a desired device. I looked at its peripheral.name, an optional value associated with a CBPeripheral. This approach is essentially the same as looking at SSIDs for Wi-Fi: there’s not a lot of sophistication involved, and no pairing sequence. Once found, my code instructs the central manager to stop its scan (centralManager.stopScan()) and connect() to the matched peripheral.

At this point it’s really important that you do a few things:

  • Create a strong reference to the target peripheral. I used a property. Without this, the peripheral reference deallocates, as I found to my dismay.
  • Set the peripheral’s delegate, so you can monitor its callback routines once the central manager reports a connection in centralManager(_:didConnect:). You won’t be able to start the communication chain otherwise.

For the next step, request service discovery from the delegate callback. After the peripheral connects, call discoverServices(), and listen in peripheral(_:didDiscoverServices:). At this point, the chain of command passes from the central manager, which finds you the right peripheral, to the peripheral itself. Both components must establish delegation.

Today’s sample code uses a brute-force approach to find services and then, in the services callback, discovers specific service characteristics (discoverCharacteristics(_:for:)) for the device.

In production code, you’d limit these calls: supply a list of only those services and the characteristics (the actual API call points) you’re interested in. My code passes nil, because there aren’t that many calls for the target device and I don’t mind the extra overhead.
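Here’s a rough sketch of how those pieces fit together. The class shape, the property names, and the device-name check are mine rather than the playground’s actual code; it just shows where each delegate call lands:

import CoreBluetooth

class BandFinder: NSObject, CBCentralManagerDelegate, CBPeripheralDelegate {
    var centralManager: CBCentralManager!
    var band: CBPeripheral?                           // strong reference, or it deallocates
    var vibrationCharacteristic: CBCharacteristic?

    override init() {
        super.init()
        centralManager = CBCentralManager(delegate: self, queue: nil)
    }

    func centralManagerDidUpdateState(_ central: CBCentralManager) {
        guard central.state == .poweredOn else { return }
        central.scanForPeripherals(withServices: nil, options: nil)
    }

    func centralManager(_ central: CBCentralManager, didDiscover peripheral: CBPeripheral,
                        advertisementData: [String: Any], rssi RSSI: NSNumber) {
        guard peripheral.name == "MI" else { return } // hypothetical advertised name
        band = peripheral                             // keep the strong reference
        peripheral.delegate = self                    // so peripheral callbacks arrive here
        central.stopScan()
        central.connect(peripheral, options: nil)
    }

    func centralManager(_ central: CBCentralManager, didConnect peripheral: CBPeripheral) {
        peripheral.discoverServices(nil)              // nil = brute force; limit this in production
    }

    func peripheral(_ peripheral: CBPeripheral, didDiscoverServices error: Error?) {
        for service in peripheral.services ?? [] {
            peripheral.discoverCharacteristics(nil, for: service)
        }
    }
}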

Like previous steps, characteristics have their own delegate method (peripheral(_:didDiscoverCharacteristicsFor:error:)). Here’s where you can access and initialize specific actions that support those characteristics. I decided to use notifications to respond to the discovery of the vibration characteristic (“2A06”, an industry standard).

Notifications are short and sweet, especially for playgrounds. You won’t have to invest in designing protocols or implementing delegates. Just listen for the notification and then start doing whatever you need.

In this case, my open-ended observer (it lives forever, so I don’t bother trying to save and release it) starts a vibration pattern by calling a custom type method, startVibrating(degree:delay:). This method writes request data to the peripheral’s characteristic, producing each vibration pattern on demand. The delay allows the vibration to repeat after an arbitrary number of seconds.

Like the peripheral, store your characteristics locally. Although they are characterized by a UUID, they aren’t meant to be built on the fly with raw UUID values — or at least not that I could find. Saving each characteristic for later reference appears to be a vital part of the set-up process.
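Pulled together, the last leg of the chain might look something like this sketch, continuing the hypothetical class above. The payload byte and the repeat logic are illustrative only; the playground’s real startVibrating is a type method with its own request data:

extension BandFinder {
    func peripheral(_ peripheral: CBPeripheral,
                    didDiscoverCharacteristicsFor service: CBService, error: Error?) {
        for characteristic in service.characteristics ?? []
            where characteristic.uuid == CBUUID(string: "2A06") {
            vibrationCharacteristic = characteristic   // save it for later writes
            NotificationCenter.default.post(
                name: Notification.Name("VibrationReady"), object: nil)
        }
    }

    func startVibrating(degree: UInt8, delay: TimeInterval) {
        guard let band = band,
              let characteristic = vibrationCharacteristic else { return }
        // Write a single request byte to the alert characteristic
        band.writeValue(Data([degree]), for: characteristic, type: .withoutResponse)
        // Schedule the next buzz so the pattern repeats after the given delay
        DispatchQueue.main.asyncAfter(deadline: .now() + delay) { [weak self] in
            self?.startVibrating(degree: degree, delay: delay)
        }
    }
}

// Elsewhere, an observer kicks off the pattern once the characteristic appears
// (finder stands in for a hypothetical instance of the class above):
// NotificationCenter.default.addObserver(forName: Notification.Name("VibrationReady"),
//     object: nil, queue: .main) { _ in finder.startVibrating(degree: 2, delay: 30) }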

At this point, I’ve pretty much taken this project where I need to. I can produce short and long vibration patterns, I’ve discovered the characteristic for direct vibration control (“FF05”, which appears to take a start and stop value), for testing the device (“FF0D”), and if I ever go that far, for pairing (“FF0F”).

I’m handing it off to my friend. Hopefully there’s enough functionality that he can perform buzzing-like therapy without further financial outlay.

Apparently, controlled buzzing has utility beyond (unconventional) autism therapy. There are any number of ADHD products built on the principle of re-focusing students back on-task every n minutes.

I suppose you could expand this as well with calendar integration for “take your meds” reminders or “get up and move around” ones, at a much lower price point than, for example, the full Apple Watch.

As for me, I discovered that I hate having anything on my wrists and that buzzing really gets on my nerves.  I’m glad I had the opportunity to play around with this, though.

If you end up building something interesting, please drop me a line and tell me about it.

Bluetooth Lessons I: Manager and Scanning

Last June, Izzy inspired me to do something with Bluetooth and playgrounds, but honestly, I haven’t had the time, and I couldn’t afford a Sphero. I’ve wrapped up Swift Style. Attempting to write meaningfully about drawing while the Denver Public School system has, for reasons I cannot begin to comprehend, released my child to my recognizance for two entire weeks seems unlikely. (Another child has half days. Fun.)

To prepare, I purchased one of the cheapest BLE devices I could find, a Mi wristband (under $20 shipped on Amazon), which has a reverse-engineered API that lets you control vibration. A friend of mine just purchased the hugely expensive Buzzies for Autism bands. I’m hoping I can mimic some of that functionality with a playground, a low-rent BLE device, and a full-price child.

Have I mentioned recently how awesome playgrounds are for playing around with and learning about new tech? They really are, especially because you can integrate just one concept at a time, and then test it live before expanding to the next.

I decided to go with Cocoa for my BLE exploration instead of iOS, although the tech is more or less the same on both platforms. When you work in Cocoa, using a macOS playground, the startup speed is phenomenal because you don’t have to work with a simulator.

My first project simply sets up a central manager (CBCentralManager), monitors its state, and lists any devices it finds. I’m pretty happy with this as a first-day, not-many-hours-to-spend-on-it, playing-around-and-doing-something-marginally-useful result.

The CoreBluetooth documentation is pretty dire. For example, this is the Swift documentation for CBManagerStatePoweredOn. After SE-0005, the constant is actually .poweredOn, as you see in the following sample code, not CBManagerStatePoweredOn. And there’s no documentation in that documentation.

Nonetheless, I persevered, and my first child-full day produced a basic helper class. You really need to work in NSObject land for this because of all the delegation. So I set up an objc-friendly class, set it as the manager’s delegate, and implemented the one required callback method, which follows the manager state.

Try sticking the Bluetooth icon in your system menu bar.  (System Preferences > Bluetooth > Show Bluetooth in menu bar.) It’s a lot of fun to toggle it on and off and watch your playground keep tabs on that state.

Next, I added a basic peripheral scan. You need to scan only when the manager achieves poweredOn state.

Apple writes, “Before you call CBCentralManager methods, the state of the central manager object must be powered on, as indicated by the CBCentralManagerStatePoweredOn constant. This state indicates that the central device (your iPhone or iPad, for instance) supports Bluetooth low energy and that Bluetooth is on and available to use.”

That’s why I added the scan to the playground’s “update state” callback. You’ll want to stop scans when Bluetooth powers off.

Finally, I implemented one more callback, which asynchronously lists discovered peripherals. It picked up my Apple TV nicely, and noticed when I enabled and disabled a hotspot on my iPhone. Great fun.
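The shape of that helper class is roughly the following. This is a condensed sketch with a made-up class name; the complete playground is in the gist below:

import CoreBluetooth

class BLEWatcher: NSObject, CBCentralManagerDelegate {
    var centralManager: CBCentralManager!

    override init() {
        super.init()
        centralManager = CBCentralManager(delegate: self, queue: nil)
    }

    // The one required callback: scan while powered on, stop otherwise
    func centralManagerDidUpdateState(_ central: CBCentralManager) {
        switch central.state {
        case .poweredOn:
            central.scanForPeripherals(withServices: nil, options: nil)
        default:
            central.stopScan()
        }
    }

    // Asynchronously lists whatever advertises nearby
    func centralManager(_ central: CBCentralManager, didDiscover peripheral: CBPeripheral,
                        advertisementData: [String: Any], rssi RSSI: NSNumber) {
        print("Discovered:", peripheral.name ?? "unnamed", RSSI)
    }
}

let watcher = BLEWatcher()   // keep a reference alive in the playground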

Here’s the code involved. You can see how very short it is. The struggle wasn’t in the lines of code or complexity; it was mostly about how badly documented almost everything seems to be.

I’ll post more as time allows.

https://gist.github.com/erica/d249ff13aec353e8a8d72a1f5e77d3f8

Dear Erica: Playground Support Folder

“N” asks: “Hey, is the ‘shared playground folder’ long gone, or does it still exist?”

Still there, still useful.

The big difference for long-time playground users is that it moved into the PlaygroundSupport module from the XCPlayground module. The latter was deprecated in Xcode 7. It’s a tiny module that supports playground-specific features. This constant (playgroundSharedDataDirectory) gives you a well-defined sandboxed folder that’s shared between all playgrounds.

This is, by the way, a terrible symbol name (take note!), as it returns a URL. It used to return a string but the name never got updated:

public let playgroundSharedDataDirectory: URL

I often build playground-specific subfolders so my directory doesn’t get all messy.
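For example, something along these lines, with arbitrary folder and file names:

import Foundation
import PlaygroundSupport

// Carve out a per-playground scratch folder inside the shared directory
let scratchFolder = playgroundSharedDataDirectory
    .appendingPathComponent("MyScratchPage", isDirectory: true)
try? FileManager.default.createDirectory(
    at: scratchFolder, withIntermediateDirectories: true, attributes: nil)

// Write a small file into the sandboxed, shared location
let noteURL = scratchFolder.appendingPathComponent("notes.txt")
try? "Hello from this playground".write(to: noteURL, atomically: true, encoding: .utf8)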

Another valuable feature is indefinite execution support (needsIndefiniteExecution) for playground pages that have to perform asynchronous work before completion. You can use this support to build little playground-based utilities instead of writing shell scripts.

I have some pages that work with Imgur, Google search, Wolfram queries, etc. A nice thing about building in playgrounds vs shell is that you can integrate audio and visual elements rather than having to save them to files and open them in helper applications.
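A minimal sketch of that pattern, with a placeholder URL standing in for whichever API you’re hitting:

import Foundation
import PlaygroundSupport

// Keep the page alive until the asynchronous work finishes
PlaygroundPage.current.needsIndefiniteExecution = true

let query = URL(string: "https://example.com/")!   // placeholder endpoint
URLSession.shared.dataTask(with: query) { data, _, _ in
    if let data = data, let body = String(data: data, encoding: .utf8) {
        print(body.prefix(200))
    }
    PlaygroundPage.current.finishExecution()       // stop the page once the work completes
}.resume()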

If you’re writing API utilities, enable manual execution. Constant reloads can almost immediately deplete, for example, your Gist API query count for the day. Oops.

In Xcode, the shared data folder is available for iOS, macOS, and tvOS playgrounds. It is not available in iOS’s Swift Playgrounds app; that policy discourages custom local storage and access beyond standard media library locations.

There are some further protocols and types under the PlaygroundSupport umbrella in Swift Playgrounds. These aren’t available for Xcode playgrounds because they’re meant for use in Playground Books.

The extra functionality is part of Playground Book support, which underlies the tech in “Learn to Code”, etc. These additional APIs include items like a key-value data store, message passing between the live view and the primary playground page, and more.

If you want to learn more about playgrounds, I have a book. It discusses the features you use in Xcode and offers an overview of how to use iOS Playgrounds. I quite deliberately did not include much about Playground Book authoring, as the topics are somewhat orthogonal.

I’ll probably be revising both Playground Secrets and Power Tips and my Swift Documentation Markup after WWDC. There’s also a three-book bundle available with Swift from Two to Three.