Removing trailing white space

Your linter might find it, but did you know there’s an easy regex approach to removing trailing whitespace from lines?

Andrew Wagner reminds me that there’s also a built-in setting:

I have this set, but for whatever reason (pasting or re-indenting), I always seem to end up with a few scattered around.

If you like, you can start by visualizing the spaces by enabling Invisibles. This switches Xcode’s editor display to show all characters, including whitespace. It’s also a great way to track down invisible extra characters you may have entered accidentally while coding; that happens to me regularly enough that I reach for this mode whenever it does.

If you don’t like Invisibles mode or you want to go back once you don’t need it anymore, just switch it off in the menu. This menu is also helpful for getting rid of the minimap and listing commit authors.

Next, do a Search/Replace. Make sure to set the match to Regular Expression. The pop-up is towards the right:

Replace one or more (+) horizontal-only whitespace characters (\h) that extend to the end of the line ($) with an empty replacement. Matching horizontal whitespace ensures the pattern won’t gobble up the empty lines within your code along with the trailing spaces.
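In other words, the find field is just:

\h+$

and the replace field is left completely empty.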

Reader Rob adds: “I use ‘;$’ to remove the semis from swift code that a lifetime of C and Objective C (and other langs) have caused me to insert unconsciously.”

Helpful? Or did I mess something up? Let me know.

Coloring SVG assets in SwiftUI

Update: Huge thanks to Justin.

Retain the same code as the system image but use the asset inspector to change the SVG resource to a template image! So much easier and better. Thank you, Justin!
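With the asset rendering as a template, the tinting code matches the SF Symbols version shown below. A quick sketch, assuming the SVG asset is named "Seal":

Image("Seal")
  .resizable()
  .aspectRatio(contentMode: .fit)
  .foregroundColor(.red)
  .padding()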


Coloring an SF Symbols image is simple. Here’s the default rendering:

struct ContentView: View {
  var body: some View {
    Image(systemName: "bandage")
      .resizable()
      .aspectRatio(contentMode: .fit)
      .padding()
  }
}

And here’s the same using a red tint:

Image(systemName: "bandage")
  .resizable()
  .aspectRatio(contentMode: .fit)
  .foregroundColor(.red)
  .padding()

But what about the new SVG image support? (The seal image is by mungang kim, via the Noun Project):

SVGs carry their own color information. I edited the seal in Adobe Illustrator CS4 (I have a computer dedicated to Mojave) to add intrinsic colors:

Again, the foregroundColor(.red) modifier is ignored and the native colors are shown. As a developer, what I want is the ability to apply normal SwiftUI coloring to the SVG asset. So the first thing I did was create a ZStack and use blending modes to set my foreground color.

I finally got my first hint of victory by using a content blend with .colorDodge:

I discovered, though, that dodge wasn’t a great choice for non-B&W assets. I needed a better blending mode.

When I tried to layer images in a ZStack, I discovered the color blend would bleed through:

I needed to:

  • Use a better blend mode that wouldn’t be affected by the SVG Image source colors, whether they were native Colors (like Color.red) or system ones (like Color(.red), which uses UIColor/NSColor).
  • Isolate the blend mode so it wouldn’t affect other Views.
  • Move that functionality to a simple modifier, allowing an SVG Image to blend with a color.

I soon discovered that the sourceAtop blend mode got me the coloring I needed, whether I used the B&W or colorized asset:

ZStack {
  content
  color.blendMode(.sourceAtop)
}

Then, I needed to isolate the blend. I first turned to .drawingGroup(opaque: false), but it kept failing to produce the result I was aiming for until I discovered that isolating the group in its own VStack kept the blend from affecting ZStack elements at the same level:

VStack {
  ZStack {
    content
    color.blendMode(.sourceAtop)
  }
  .drawingGroup(opaque: false)
}

I then moved this into a custom View modifier:

public struct ColorBlended: ViewModifier {
  fileprivate var color: Color
  
  public func body(content: Content) -> some View {
    VStack {
      ZStack {
        content
        color.blendMode(.sourceAtop)
      }
      .drawingGroup(opaque: false)
    }
  }
}

extension View {
  public func blending(color: Color) -> some View {
    modifier(ColorBlended(color: color))
  }
}

This allowed me to create a standard SwiftUI ZStack that used the modifier in a normal cascade:

struct ContentView: View {
  var body: some View {
    ZStack {
      Image(systemName: "bandage.fill").resizable()
        .aspectRatio(contentMode: .fit)
      
      Image("ColorizedSeal")
        .resizable()
        .aspectRatio(contentMode: .fit)
        .padding(100)
        .blending(color: Color(.red))
    }
  }
}

Here’s how that renders:

You’ll want to make sure the blending happens after the image’s resizable and aspectRatio calls, but other than that, it can appear before or after the padding.

What I got out of this was a way to use Xcode 12’s new SVG asset support, standard SwiftUI layout, and flexibility when applying my color blend to assets that might not just be black and white.

I hope this helps others. If you have thoughts, corrections, or suggestions, let me know.

App Clips: when is an app an app and when should it be a webpage

Apple’s new App Clip technology lets people load transient mini-apps without installing them through the App Store. Users don’t have to authenticate or authorize the mini-app. It just downloads and works. Whether by scanning a code (think QR code) or detecting an NFC tag, iOS users can download and run these pre-vetted packages that represent a light, typically transactional, view of a larger app experience. I went through some write-ups and videos today and thought I’d share a mental dump of my thoughts.

All App Clips are accessed via URLs and limited to at most 10 MB in size. Their job is to move a user through a quick transaction and then either return control to the user or solicit the user to download the full application. So if you’re selling cupcakes, you can “upsell” the experience from a single purchase to a loyalty-program app.

App Clips are designed for transactions. As apps, rather than web pages, they integrate seamlessly with the store ecosystem, allowing users to purchase goods and services from an instant menu, with support for features like Apple Pay.

You can also use App Clips to decorate a museum or a city with points of interest, or a bus stop with upcoming travel information. You could do the same with the phone’s built-in QR recognition for URLs, landing on a web page instead, but you’d miss the charm of the Apple “concierge” guiding you through the process.

Honestly, there’s nothing an App Clip can do (maybe other than something like Apple Pay) that a reasonably designed web page cannot, but it’s that charm, and an eye towards lowering the transaction barrier, that makes Clips so compelling. With Clips, end users can point, pick, and pay with an absolute minimum of effort. If this works as promised, many of the typical web hurdles (I speak as someone who has ordered a lot of MadGreens pick-up food over the last few days) disappear.

A Clip’s data is only transient if it isn’t transferred to a client app. Since the Clip must be developed in tandem with that app, an impulse purchase can be applied directly to loyalty points, putting the user on the path to a redemption reward. The side-by-side development, the tandem review (both are reviewed together), and the ability to share assets and re-use code make this a promising area for developing commercial transaction apps that don’t put a full traditional app between the user’s desire (and impatience) and the purchase.

I use the Safeway app regularly and I utterly loathe it. Safeway has three tiers of user prices: general rip-off, loyalty-program, and expensive-but-you-can-live-with-it Just4U. Swiping a card only gets you to the middle tier. To get the Just4U prices, you must spend a half-hour before each visit entering all the coupons you think you might need, or try to scan the teeny-tiny, often folded, ripped, or dirty QR codes on the shelves. Then you hope that the store’s remarkably bad WiFi and cell service doesn’t spawn dozens of error alerts (each of which must be manually dismissed), keeping you from actually being awarded those better prices (and the occasional $5 or $10 off a $100 purchase).

If the App Clip experience shows how to smooth the pathway from desire to acquisition, then Safeway’s approach shows how to tick off customers and actively convince them not to purchase certain items when the coupons fail to work.

However, I’m not only excited about the transactional nature of App Clips. I can easily see how a well-backed municipal or organizational effort could provide “more information”, “deeper information”, “found facts”, and “inspiration” using the same technology. Again, the key lies in reducing turbulence, steering the user, and completing the goal in the shortest period of time.

In terms of designing your own apps for App Clips, remember that simple is always better. The more choices, options, and features you present in your Clip, the more work the user must do. Instead, consider pushing your top sellers as “Quick Buys”. That doesn’t mean you can’t offer deeper choices, but if you do, remember the people queuing up in line, impatiently waiting for your customer to pick their deep-menthe chocolate-macchiato with half-skim/half-soy, two-and-three-quarters squirts of Pumpkin Spice, and low-fat whipped cream.

I’d imagine that the best transactional clips will capitalize not only on desire but also on the flow and customer-to-customer dynamics that exist within the store experience.

For App Clips outside the sphere of purchase, keep the same kind of locational awareness. A visitor who has just discovered the fascinating history of your city’s bell tower probably shouldn’t have to step back into traffic to look up and get a better view of the little dancing people on the clockface.

When I first saw this feature, I wasn’t all that excited. Now that I’ve dived in a little more, I’m much more impressed by the thought, care, and clever delivery mechanism Apple has put together.

What do you think? Too clever for its own good or a tech we’ll be seeing for years to come?

Swift Packages and the need for metadata

Right now, a Swift Package defines the sources and dependencies for successful compilation. The PackageDescription specifies items like the supported Swift version, linker settings, and so forth.

What it does not do is offer metadata. You won’t find an email address for the active project manager, a list of major authors, descriptive tags, an abstract or discussion of the package, a link to documentation, deprecation information, or links to superseding packages upon deprecation.

For me, tags are especially important, as they can drive discoverability on aggregators such as SwiftPackageIndex.com or SwiftPackageRegistry.com, among others, just as they do in the various App Stores. However, all the other information I’ve mentioned can be equally valuable. The question is how this information should be stored and how it should travel.

Extending SwiftPM’s PackageDescription is the most obvious way, but the one with the greatest hurdles. Extending a specification means review, bikeshedding, and approval, but it’s the approach that would produce the most rigorous and widely applicable outcome:

let package = Package(
    name: "now",
    platforms: [
      .macOS(.v10_12)
    ],
    metadata: [
      .tags(["dates", "calendar", "scheduling", "time", "appointments"]),
      .maintainer("erica@ericasadun.com"),
    ],
...

Freeform tags are, as Mattt of the three t’s pointed out to me, a folksonomy: a user-specified list that can be organized or freeform, sensible or not. Anyone familiar with the App Store will recall both how its tags have benefited developers and how they can be abused to drive traffic.

Of course, updating the PackageDescription spec is not the only approach. The same information could be packaged into a second file co-hosted with Package.swift. Perhaps it could be called Package.metadata (if stored as JSON, for example) or PackageMetadata.swift (if the information is SwiftPM-like, with its own Metadata type and a supporting package to better enable automated validation and consumption).

Plain JSON has many advantages. It requires no secondary code development, as PackageMetadata.swift would. It has an obvious place to live and can be just as easily omitted. The standard for its contents could be community-sourced, with a Decodable type developed specifically for it, and the JSON could be validated against that standard.
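To make this concrete, here’s a rough sketch of what a community-standardized Package.metadata and its matching Decodable type might look like. Every field name here is my own invention, drawn from the wish list above; none of this exists in SwiftPM today:

import Foundation

// Hypothetical type: field names are assumptions based on the
// metadata wish list, not any existing SwiftPM API.
struct PackageMetadata: Decodable {
    let maintainer: String      // a contact email
    let authors: [String]
    let tags: [String]          // drives discoverability on package indexes
    let abstract: String
    let documentation: URL?
    let supersededBy: URL?      // populated upon deprecation
}

let url = URL(fileURLWithPath: "Package.metadata")
let metadata = try? JSONDecoder().decode(
    PackageMetadata.self, from: Data(contentsOf: url))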

I am least enthused by PackageMetadata.swift, with its high development overhead and its mimicking of Package.swift, but it would certainly fit the existing design and approach and lower the overhead for consumption.

What do you think? How would you design metadata delivery for Swift packages? The “one file to rule them all” expansion of SwiftPM itself, the simplicity of a Package.metadata JSON file, or something else? Let me know.

The silly delight of the Xcode document opening `xed`

Julian Kahnert recently introduced me to xed, a command-line built-in that opens individual documents in Xcode. You give xed the name of a file (or files) and presto, they open in my favorite IDE.

There’s some intriguing flexibility to xed. You can request that Xcode remain in the --background or specify a --line number to select once the file is open. The line-number request affects the last file in the invocation. In addition to passing it files to open, you can --create new files with the contents of stdin: promising for scripting and code-gen utilities.
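For example (the file names here are invented; the flags behave as described above):

# Open a file, select line 25, and keep Xcode in the background
xed --background --line 25 main.swift

# Create a new file whose contents come from stdin
echo 'print("hello")' | xed --create Scratch.swift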

If you pass --launch, xed promises to create a “new empty unsaved file”, without involving stdin. Unfortunately, in my experiments over the past week, I’ve found that while it’s very good at opening a single file, complex commands can sometimes confuse the poor geriatric utility. xed was introduced way back in Xcode 3.0, and it kind of shows its age at times.

Even just limiting my xed use to opening a single file for editing, it’s a neat little discovery, so thank you, Julian!

Our discussion arose out of my own little utility xcopen, which I put up on GitHub. Mine looks in the working directory for an xcproj and then opens it. (I got tired of either working through file completion, since there’s always a subdirectory with the same name as the xcproj, or having to open Finder and then launch from there.)

I really like the idea behind xed, and I’m half tempted to extend the feature set in xcopen using AppleScript to get the promise of xed with a more reliable delivery platform.

Swift Process and osascript: so much easier than the command line

I use osascript a lot to automate things that need automating, and until recently I did so almost exclusively from shell scripts. Then, while developing some sample code the other day, I discovered how much easier it is to call osascript from a command-line utility, even for the most trivial of scripts.

let script = """
  set appName to "\(appName)"
  tell application "System Events"
    if visible of application process appName is true then
      set visible of application process appName to false
    else
      set visible of application process appName to true
    end if
  end tell
  """

With Swift’s triple-quoted multiline strings, you need little if any escaping, and your script can basically be copy/pasted from the Script Editor app. Plus, string interpolation enables you to programmatically customize your AppleScript code.

For your Process, set the launchPath to "/usr/bin/osascript" and the arguments to ["-e", script]. The setup is just a few lines (a minimal sketch, reusing the script string from above):
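let process = Process()
process.launchPath = "/usr/bin/osascript"  // osascript interprets the AppleScript source
process.arguments = ["-e", script]         // -e supplies the script text inline

Then launch and wait until exit: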

if #available(macOS 10.13, *) {
    guard (try? process.run()) != nil
        else { throw RuntimeError.operationError }
} else {
    process.launch()
}
process.waitUntilExit()

Why is it worth creating an entire Xcode project to deal with a little osascript work? For me, it’s a big win when it’s part of a larger utility. With the Swift Argument Parser, I’ve been porting a lot of my scripts, both shell scripts and Swift scripts, to compiled utilities, and I find that I lean on osascript a lot more than I thought I did to automate my system.

SwiftPM and Tagging

A couple of days ago, I was having the oddest issues fetching a SwiftPM package: the package resolution/version solving failed. I still don’t know why this was happening, as my package was tagged “0.1.0” at the time, as confirmed over at GitHub.

In any case, I found that removing the tag locally and remotely, then retagging and re-pushing, resolved the issue. The package resolved correctly afterwards.
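Done by hand, that fix amounts to something like this, using the "0.1.0" tag from above:

git tag -d 0.1.0                   # delete the tag locally
git push origin --delete 0.1.0     # delete the tag on the remote
git tag 0.1.0                      # retag the current commit
git push --tags                    # push the new tag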

While I was at it, I decided to update my .gitconfig to add tagging aliases. Here’s what I came up with. They work, but they’re not pretty, and I’m sure there are better ways to approach this:

####
# Tagging
retag = "!f(){ name=`git tag | tail -1`; \
	git tag -d \"$name\"; echo \"Retagged $name\"; \
	remote=`git remote -v`; \
	if [ \"$remote\" != \"\" ]; \
	then git push origin --delete \"$name\"; fi; \
	git tag \"$name\"; \
	if [ \"$remote\" != \"\" ]; \
	then git push --tags; \
	else echo \"No remote\"; fi; };f"
untag = "!f(){ name=`git tag | tail -1`; git tag -d \"$name\"; \
	remote=`git remote -v`; \
	if [ \"$remote\" != \"\" ]; \
	then git push origin --delete \"$name\"; \
	else echo \"No remote\"; fi; };f"
tagit = "!f(){ git tag \"$@\"; echo \"Tagged $1\"; \
	remote=`git remote -v`; \
	if [ \"$remote\" != \"\" ]; \
	then git push --tags; \
	else echo \"No remote\"; fi; };f"

retag, which solved my issue, grabs the name of the current tag, removes it, and retags locally and remotely. untag removes the tag and tagit adds a new tag to the current commit.

Feel free to suggest improvements.

p.s. If you’re looking for it, I took down the time machine rant post at the suggestion of several commenters.

Building a silly WatchKit App

WatchKit apps are easy to build and draining to debug. I don’t know what it is about extensions, but they’re never fun to work with. The other day, my daughter asked for an app to play some favorite sounds. I promised I’d write up a post about this, so here it is.

The entirety of the code took maybe a couple of minutes. This is one of the many places where using SwiftUI is a sheer delight. I did have to play with renderingMode to override the default B&W scheme for three of my buttons:
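In sketch form, each button looked something like this. The asset name, sound name, and play(_:) helper are invented stand-ins for my actual code:

Button(action: { play("quack") }) {
  Image("QuackArt")
    .renderingMode(.original) // keep the art's colors instead of the B&W template tint
    .resizable()
    .aspectRatio(contentMode: .fit)
}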

I decided to go with bundle-stored audio rather than deal with the complexities of adding audio assets to an xcassets item. Quick and easy. I limited the daughter to just 4 sounds from her long list of requests to simplify my life. She returned with her list grudgingly prioritized.

Trimming and converting the audio to WAV took seconds, thanks to command-line ffmpeg. Creating the art, too, was simple: I just used Preview to create all the required App Icons. Call it another 5 minutes to get that done. It was a great time to use Preview’s magic selection wand and its support for layers and transparency.
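For the curious, each clip needed just a one-liner along these lines (the file names and trim points are invented):

# Take 3 seconds starting at the 1-second mark and convert to WAV
ffmpeg -i quack-original.m4a -ss 00:00:01 -t 3 quack.wav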

Where all the trouble lay was getting the watch app to consistently deploy to hardware for testing. Although this was a watch-only app, the phone still plays a big role. I finally discovered that keeping my phone tethered to the computer when deploying to the watch made for a much smoother and more consistent experience. But the frustrations of failed deploys cost me over an hour.

Here’s a quick summary of where my development time was spent. Area corresponds to the level of effort:

The actual coding was nothing. Asset prep is always tedious, but it’s O(n): it’s pretty clear from the start how much effort is needed, and it varies with the number of assets used. You don’t add complexity when you’re working with the right formats and tools.

The vast majority of the headache was getting the Xcode tooling and the hardware to sync, install, and test. This really should be the easiest part, but it was endlessly frustrating. I don’t know if my tether breakthrough is a universal solution or if my next project will be just as frustrating.

In the end, what should have been a half hour project stretched to several hours. I hope the next one will be way shorter.

The surprising joys of Preview

For years, I’ve been saying how surprising macOS’s Preview app is. Just about everyone knows it is _there_, and lots of people use it to look at pictures, to crop, and occasionally to annotate. But there’s so much more that Preview can do.

Did you know that you can use Preview to scan and fill out forms using a nearby iPhone, so you can get paperwork done and submitted? That’s just one underused feature. Preview syncs with the phone’s document scanner, which finds the outline of the paper and performs geometry correction, so you can fill out paperwork, whether it’s for your next group hike or your kid’s camp outing. Fill, sign, send, and you’re done.

Preview can also be your new lightweight drawing program, whether you want to work with shapes and text or with freeform lines:

Preview can also adjust photo levels for basic color corrections. The right-hand side is the original. The left shows the picture after I applied auto levels, upped the contrast, and warmed the picture slightly. (Yes, that’s me on the left, and my son on the right in the Powerade reflection.)

I’m not sure who gave the Preview team the go-ahead to add lots of silly and delightful features, or whether this is just a dogfood target that somehow got shared with the public, but there’s just so much you can do with it. The other evening, my daughter had begged for a WatchKit app, and I used Preview to populate my XCAssets for the watch app icon.

If I can get enough people to sign up, I’ll be giving a workshop this week on Preview through Try Swift World, although it’s a bit of a hard sell given how weird a topic it is for an audience of developers.

What’s your favorite hidden app feature that few people know about?

“mint” (a simple SwiftPM installer) has improved my command line life

SwiftPM’s lack of an easy install feature has long been an issue for me, and for other people too. As the linked forums thread suggests, accomplishing this for a general audience requires some careful thinking: adding to /usr/local/bin is not always the best solution for every user.

Still, it’s a feature whose absence is notable. To fill the gap, I’ve discovered Yonas Kolb’s mint. Thank you to Leo Dion, who introduced it to me. It is ridiculously simple to use. Could it get easier than mint install erica/now? Admittedly, my GitHub ID is short and so is my project name, but I think my point stands…

Yesterday, I hastily added a SwiftPM project to remind so it too could be installed via mint. Suddenly, adding the project specification is no longer an afterthought but a driving feature. It’s made SwiftPM support far more valuable to me for executables.

Make sure you name your primary file main.swift, set your product type to .executable, and, if you’re developing in Xcode, override your path to point to the Xcode-project-style folder for the source files, usually the same name as the project itself. I mention this because Xcode by default (File > New > Swift Package) builds library projects, not executable ones. At the command line, use swift package init --type executable.

// swift-tools-version:5.2
// The swift-tools-version declares the minimum version of Swift required to build this package.

import PackageDescription

let package = Package(
    name: "remind",
    platforms: [
        .macOS(.v10_12)
    ],
    products: [
        .executable(
            name: "remind",
            targets: ["remind"]),
    ],
    dependencies: [
        .package(url: "https://github.com/apple/swift-argument-parser", from: "0.0.6"),
    ],
    targets: [
        .target(
            name: "remind",
            dependencies: [.product(name: "ArgumentParser", package: "swift-argument-parser")],
            path: "remind/"
        ),
    ],
    swiftLanguageVersions: [
        .v5
    ]
)

As for mint itself, you can build it from source or install it via Homebrew: brew install mint.

If you have any suggestions on how I can improve my SwiftPM work, better integrate tagging, or any other tips, please let me know. Thanks!