Falling back to an older MBP

I just recently switched from a 2018 15″ MBP back to my 2015 13″ and I thought I’d share some of my reactions to the change. My meandering and unstructured thoughts follow.

Unexpectedly, on returning to the 2015 MBP, I didn’t suddenly go “omgomg the keyboard is amazing again”. The older style was never amazing compared to any good mechanical keyboard. I adapted to the new keyboard just fine, even though my hands were never really big enough for it to be truly comfortable. Yes, the older keyboard’s keys really are less of a stretch, but it wasn’t that much of a hardship either way.

The basic truth for me is that both keyboards are fully usable and that having the dedicated escape key (or not) was never a big deal. The virtual one did the job just fine. That’s something I never expected to admit but it’s true. I may not love MBP keyboards but they work.

I will admit the older dedicated function keys are slightly better, especially given how I’ve rebound them using Keyboard Maestro, but not an order of magnitude more usable. Just…better. Not way better.

The trackpad feels old and quaint compared to the newer one. I miss the 2018 trackpad more than I expected, especially given that I’m an admitted trackpad hater. Moving documents has once again become slightly more difficult on the older MBP, so there’s that.

My 13″ screen now feels cramped and tiny, as expected. But I once again appreciate how clean and beautiful the Retina display is when I compare it to my previous purchase, the non-Retina MBA. The 2015 13″ is a great laptop, even if it feels chunkier and less streamlined than the 2018. You can tangibly feel the design differences across the three years, as well as the battery improvements.

On the other hand, moving back from USB-C to all these wonderful ports is delightful. My 2018 was always an octopus, and I had to carry around a bag of hubs and adapters. (I even have one on my keychain, which I probably don’t need anymore.)

Yes, I still use some adapters like my HDMI to Thunderbolt 2, so I can have two monitors running from the 2015 laptop, but that lives on the HDMI cord, not with my laptop.

Between the built-in SD card reader and my computer-flush card adapter (it looks built in), I have a bunch of extra file space. The two standard USB ports are so convenient. The entire bag of USB-C gizmos I used to carry around with the 2018 machine has been dumped into my USB box-of-everything for now.

On the other hand, I do miss the fingerprint reader. My experiences unlocking with my watch are hit and miss. Sometimes it works great. Sometimes it doesn’t. I can’t really figure out when it will and won’t: it’s not just after restarts or a long time between use. In contrast, the fingerprint reader absolutely rocks on the 2018, even if I had to remove a bunch of items from the touch bar because sometimes I’d turn on Siri or mute my audio when I thought I was still unlocking.

Speaking of the touch bar, I don’t miss it at all. As a touch typist I was always a little stunned to discover that it had relevant information on it that I never looked at. If the touch bar had important stuff, it should have been on the screen and my screen should have been touchable all over. As someone or other said (I forget exactly who — sorry!), it’s a keyboard when it shouldn’t be and a touch screen where it shouldn’t be. I’m paraphrasing from memory.

To every one of you who develops content for the touch bar, bravo. I’m glad you’re helping users, especially those who can see the touch bar and interact with it. I don’t want you to change a thing. Instead, I want Apple to step up and get that material integrated with the main display so I can take part too.

I’m holding off on upgrading a lot of things right now. I’m still using my iPhone 6+, my 2012 Mac mini, my 2015 MBP. They all work and get the job done, and nothing yet has really given me the motivation to push forward to new hardware. The iPhone 11 is lovely but I don’t really take many pictures. The new mini isn’t self-serviceable (at least to a klutz like me) and I honestly don’t want to leave Mojave behind and lose my 32-bit apps. The 2015 (running Catalina) is still a really great laptop.

I’m waiting to fall in love again, the way I did with the 5th gen iPad mini, which swept me off my feet early this year. I adore the mini. Revised and now supporting the Pencil, it gave me new ways to work, better user experiences, and a solid, beautiful form factor, making it a natural upgrade from the 2nd gen.

I want my next hardware purchases to inspire that same passion instead of offering mere incremental utility.

What are you buying and what are you holding onto waiting for the right moment? Let me know.

Xcode: Basics of the four-block wonder

The official name is “Navigate to Related Items” but to me, it’s the four-block wonder, a menu button that sits at the top left of the Xcode editor. With this menu, you can hop between file counterparts (for example, .m/.h, or .swift/generated interface, which is what a lot of people use it for). But there’s so much more.

Set the cursor on a type, and you can view or navigate its superclass, subclasses, or siblings, as well as the protocols, extensions, and categories it connects to. A single click navigates you to the interface or implementation in question:

Use the cursor to establish the context for the menu; otherwise, you’ll only see a smaller subset of menu options, such as recent files. The callers option shows you your clients, and callees the items your code is calling — all specific to the current cursor context.

One of my favorite tools in the four-block menu is Generated Interfaces, which allows you to view an item’s Swift interface or see how it translates to Objective-C. For example, if you use an obvious preposition label, the ObjC generated interface subsumes it into its selector:
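
For example, given a method along these lines (the type and method here are made up for illustration):

import Foundation

@objc class Mover: NSObject {
  @objc func move(to destination: String) { }
}

The generated Objective-C interface folds the preposition label right into the selector:

- (void)moveTo:(NSString *)destination;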

With this, you don’t have to wonder whether your selector is specified right, and you don’t have to override the selector with an explicit @objc attribute. You just look up the definition and you’re good to call.

In addition to contextual helpers, the four-block lets you select recently viewed files and, when using version control, recent files that have been locally modified and not yet committed (basically the ones showing the “M”-for-modified).

The four little squares may be tiny, but they’re a powerful tool in your Xcode arsenal.

Update: Lilly reminds me that you can bind the related items menu. I have mine bound to ^1, but I don’t remember whether that’s something I set up or the default. If you want to add or change the binding, visit Preferences > Key Bindings and search for “related”.

Mac Dictation 101

Dictating text is one of the great features macOS gained a few years ago. In both Mojave and Catalina, you enable dictation in System Preferences > Keyboard > Dictation.

I use the “double-command” shortcut to enable dictation, but I also find it helpful to set up the Mac version of “Hey Siri”. To start, hop over to Accessibility > Dictation (Mojave) or Accessibility > Voice Control (Catalina) to extend your interaction. Enable dictation or voice control, as appropriate. On Catalina, you may need to download additional components, which takes a moment or two.

In Mojave, you can set a dictation keyword. I go with “Computer”, because it sounds very Scotty from Star Trek TOS. I prefer to enable sound feedback so I know when my command has been picked up properly.

To ask Siri about the weather, I say, “Computer. <beat> Open Siri. <wait for tone> What is the weather for today?”. There’s a definite pause needed after “Computer”.

Catalina offers an always-on version when you enable voice control.

This little control panel lets you sleep or wake your mike:

Once in place, you can say “Open Siri”. Confirm that you want to enable Ask Siri, knowing that employees or contractors somewhere (I believe it was Ireland) will be laughing at you; privacy is an illusion these days. Search for articles like “Apple Resumes Human Reviews of Siri Audio With iOS 13.2 Update” for more details. As with Mojave, you’ll need to develop separate dialects for iOS and Mac for controlling your system, with significant pauses to let the OS catch up with you.

Always-on dictation seems to send my MBP into windtunnel spasms, so you may want to use those keyboard shortcuts instead.

Once it’s on, you can request “What can I say” and a list of commands pops up. It’s pretty basic and uninspiring as support docs go, but it’s a start, and the commands are quite extensive.

For example, say “Open TextEdit <pause> New item” (assuming you don’t have TextEdit set up to create a document on launch).

Next, try dictating. I recommend opening a browser tab alongside TextEdit and speaking from a reference document. For example:

Listen my children and you shall hear
Of the midnight ride of Paul Revere,
On the eighteenth of April, in Seventy-five;
Hardly a man is now alive
Who remembers that famous day and year.

You’ll immediately see some of the weaknesses involved in dictation.

Starting over, I applied some dictation-fu. For example, when “hear” came out as “here”, I used “Correct that” and “Choose 1”.

“New line” gets me to the second line. “Select word” highlights the most recent word, and “Capitalize that” changes “revere” to “Revere”. The next line after that is just ridiculously hard; I’ll leave it as an exercise for the reader. Finish it off with “semicolon. press Return key”.

The rest is easy, particularly because TextEdit did the uppercasing on my behalf. Don’t forget to insert a new line and say “period” (Or, I suppose, full stop. Hi Paul.) at the end of the sentence.

As with iOS, Catalina appears to be using a scaled-down version of Dragon Dictation, so it’s always helpful to be able to lean on the Dragon documentation, even when you run up against some pretty hard edge cases.

It’s honestly not the worst dictation system but I prefer the one on iOS.

Fun with Xcode Search Domains: Excluding match text

Most Xcode users quickly become familiar with the basics of the Find Navigator panel.

With it, you can find text, regular expressions, and perform search-and-replace, whether matching or ignoring case. But that’s just scratching the surface of the Find Navigator.

I thought I’d drop a few words today about search scopes. Controlled from the bottom left, under the search field, you can create narrowed searches. This enables you to, for example, search only in Swift files or exclude files containing the word Test.

To get started, click the icon (two lines with three squares on a line between them) and then New Scope (the plus icon). Here, you can name the scope, limit the search extent, and add criteria for exactly which files should be included or not.

The logic is straightforward. You choose where to look (the project, a folder, or through the entire SDK), and whether to include all conditions or some conditions:

Each condition is based on the file name, path, extension, UTI (the kind of file, like image, which is useful for finding vector assets), workspace location (namely groups), or source control status (handy for finding newly applied changes).

Most of my conditions are file-name-based. And for those, you get the following matching conditions. The “ends with” is an obvious win for extensions (although you can also use UTIs for that), and “starts with” can help for projects organized in hierarchical ways.

Now, interestingly enough, this list fails to offer “does not contain”, but that’s fairly easy to work around. Since Xcode supports regex matching, you can easily replicate “does not contain” with an appropriate regex:
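
For example, this negative-lookahead pattern (using the Test example from above) matches only file names that don’t contain that string:

^((?!Test).)*$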

Change the file name to a path to exclude source file directories.

You can create as many search domains as you like. At least, I haven’t found an upper bound yet. I haven’t found a way to reorder the find scopes either, although if you’re really controlling about this, you can pop into your workspace (ProjectName.xcodeproj/project.xcworkspace/xcuserdata/username.xcuserdatad), convert your UserInterfaceState.xcuserstate to xml (plutil -convert xml1), and hand-edit it the way you need.
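
If you do go spelunking, the conversion step looks something like this (the project and user names here are placeholders):

cd MyProject.xcodeproj/project.xcworkspace/xcuserdata/myname.xcuserdatad
plutil -convert xml1 UserInterfaceState.xcuserstate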

There are lots of wonderful little Xcode tweaks like these throughout this monster of an IDE. What are some of your faves? If I have time this week, I’ll share some of mine, such as the four-square — another of my favorite tools — and a few great ways to connect your editor to the navigator.

How I got Rust working in Xcode

A while ago, I posted about how I set up Xcode to work with Python. Yesterday, I was taking a class on Rust and decided to use my friendly neighborhood (sp)IDE(rman) coding environment, namely Xcode.

I’m not going to say it was a stunning success but there was enough interest that I thought I’d share the steps so you too could embrace Rust through Xcode.

Install Rust. You start, as one does, by installing Rust. Hop over to https://www.rust-lang.org/tools/install to grab a copy of the tools. They install to ~/.cargo, for whatever reason. I put a link in to /usr/local/bin.
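
For the record, the install and the link went something like this (the curl command comes from the install page; the symlink is my own addition):

curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh
ln -s ~/.cargo/bin/rustc /usr/local/bin/rustc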

Create a Project. Create an external build system Xcode project by choosing File > New > Project > Cross-platform > External Build System > Next. Enter a product name (I called mine “Rust” because that’s exactly how creative I am.) and set your build tool (in my case, /usr/local/bin/rustc because of the link). Save it somewhere convenient.

Create a source file. Apparently “rs” (rust source?) is the proper extension. I went with “test” as my name. File > New > Empty > test.rs

fn main() {
    println!("hello world");
}

Don’t forget to add some code.

Compile. Edit your scheme. Choose Run > Info > Build Executable > Other and select your compiler. Adding it to /usr/local/bin made it easier to select rustc for me. Then uncheck Debug executable, because you’re not debugging the Rust compiler.

At this point you can click Run, and you’ll see the standard options message because you haven’t specified what rustc should compile.

Back in the scheme editor, select Run > Arguments and add the source file and output file. Unfortunately, I could not get this to work with SRCROOT at all, so here it is in all its glory, with complete paths.
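
It looked roughly like this (placeholder paths stand in for my real ones):

/Users/yourname/Projects/Rust/test.rs -o /Users/yourname/Projects/Rust/test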

The Pre-action removes any build product from a previous run:
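
It’s a one-liner, something like this, again with a placeholder path:

rm -f /Users/yourname/Projects/Rust/test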

So here we are. With luck, it compiles. If not, the errors appear in pretty horrible form in the Xcode console, where curses is what we do, not how the console interprets pretty text output.

You can get slightly less horrible feedback by adding the launch argument: --error-format=json

Yeah, it’s wordy but it’s slightly less awful.

Pick a path. Unlike Python, Rust is just a compiler. If you build and then add a step to execute the result, the execution output (unlike compiler errors) won’t normally print to the Xcode console. The challenge is to get that information into some form where you can access it.

At first I went with a little post-action osascript and threw up the output in a separate window:

But I really wanted to make it work with the console. So back I went to AppleScript. Instead of rustc, I changed my build tool to osascript:

I added this to my run scheme arguments instead:
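
It boils down to a single do shell script invocation, roughly like this (placeholder paths again):

-e 'do shell script "/usr/local/bin/rustc /Users/yourname/Projects/Rust/test.rs -o /Users/yourname/Projects/Rust/test && /Users/yourname/Projects/Rust/test"'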

Yep, I’m using osascript to run a shell script that compiles with rustc and then runs the result, passing the output back through to Xcode.

I know this is bad. I know I should be ashamed. I hang my head.

But you know what? It works. Stray osascript-crud and all:

I’m not sure how much this makes me a programming outcast but it was kind of fun to figure out how far I could push my beloved enemy Xcode.

Repost: September 11, 2006

I originally wrote this on September 11, 2006, five years after the WTC attacks. I’m reposting it on September 11, 2019, 18 years after.

During high school, I spent nearly every waking hour in the company of the Klitzman twins. We took biology together, English, physics, chemistry, social studies, lunch and band. I’d walk from class to class to class and they’d always be there. After school, we’d carpool together to after-school activities. I believe I spent more time with the twins than their parents did.

We had little in common. I was a computer geek, into science fiction and programming. They were athletic. They played tennis and were well liked. The band-twin was excellent at her instrument. I just played along and tried not to hit too many sour notes. Socially, we lived in very separate worlds and I never got to know them. We co-existed rather than interacted. I am the poorer for that.

They both became adults of great accomplishment. They went, I believe, to Princeton. From what I have googled, Karen did graduate work at Columbia and became the vice president of research for the New York Mercantile Exchange. Donna attended medical school and now practices medicine in New Jersey. It sounds like they were amazing people.

Five years ago today, a plane flew into the office of Cantor Fitzgerald and vaporized Karen. From what I can tell, her body was never found. Along with her at the World Trade Center died Edward Fergus and Thomas Collins and Christopher Panatier, who attended High School East at the same time we were at West and Martin Lizzul who graduated West a few years after we did. I don’t think I ever met or knew them, but they were from home.

The minutes of the Half Hollow Hills school district board list parents, uncles, aunts, cousins, and friends. A couple of teachers at West Hollow lost nearly a dozen friends all at once. Friends and acquaintances spent months going to memorial service after memorial service.

Today, all the cable channels will be replaying memories of that time. And tomorrow, Apple is going to introduce some new iPods and iMacs and life will go back to normal.

Life is short and unpredictable. We all have many missed opportunities and people of value that we never got to really know. Rather than focus on the obsessive hatred and corrosive philosophy that motivated the events of 9/11/2001, today I’m going to take a moment to appreciate and better get to know the people in my life.

We are surrounded by good people. Sometimes we forget about that.

SwiftUI: Modified Content, Type Erasure, and Type Debugging

When working with declarative views, you should be able to reach for a full suite of functional tools. In a typesafe language like Swift, that can prove more difficult than you might first think. Consider the following code:
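
The example built up a rotated Text, reconstructed here to match the reduce example later in this post:

let core = Text("👭")
  .font(.largeTitle)
  .rotationEffect(Angle(radians: .pi))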

What is core‘s type? It isn’t Text. It’s actually an application of modified content, specifically Text passed through a rotation effect:
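
It’s something like this; the modifier type is a SwiftUI internal, so the exact name may vary by version:

ModifiedContent<Text, _RotationEffect>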

Just add a background color and a shadow and the type jumps to this:
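
Again, roughly, with the same caveat about internal names:

ModifiedContent<ModifiedContent<ModifiedContent<Text, _RotationEffect>, _BackgroundModifier<Color>>, _ShadowEffect>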

You might ask: why is this a problem? After all, Swift is doing all the heavy lifting, right? In my case, the answer lies in my struggle to incorporate this core view into a multi-stage bit of text art using reduce. Paul Hudson tweeted a step-by-step approach to this and I was sure I could make it simpler and more elegant.

And that’s where I started throwing myself against what at first seemed like an impenetrable wall for a couple of hours. Between SwiftUI’s stroke-style dysarthria error messages and the typesafe system, my attempt at creating a solution along these lines felt doomed:

[Color.red, .orange, .yellow, .green, .blue, .purple].reduce(core) { view, color in
  view.padding()
    .background(color)
    .rotationEffect(theta)
}

The code wouldn’t compile, and the error messages couldn’t tell me why. The problem? Each stage created a new layer of modified content, changing the type and rendering reduce unable to do the work. It was only with the help of some deep dives into the docs and advice from colleagues that I was able to arrive at a solution.

Type erasure, using SwiftUI’s AnyView struct, enables you to change the type of a given view, hiding the nested modifier hierarchy. Importantly, it creates a single uniform type, allowing a reduce operation to proceed.

At first, I used AnyView the way you’d typecast in Swift, namely:

AnyView(view.padding()
  .background(color)
  .rotationEffect(theta))

But I hated that. It sticks out as so un-SwiftUI-y, with the parentheses spanning multiple lines and throwing off the clear logical flow. If you’re going to go fluent, just go fluent. So, eventually, I decided to create a View extension to handle this:

extension View {
  /// Returns a type-erased version of the view.
  public var typeErased: AnyView { AnyView(self) }
}

The result looks, instead, like this:

view.padding()
  .background(color)
  .rotationEffect(Angle(radians: .pi / 6))
  .typeErased

And yes, I went with a property and not a function as I felt this was expressing a core characteristic inherent to each View. I can probably argue it the other way as well.

From there, it wasn’t much of a leap to ask “what other fluent interface tricks can I apply?”, and I ended up putting together this little View extension for inline peeks:

extension View {
  /// Passes the view through with debugging output
  public func passthrough() -> Self {
    print("\(Self.self): \(self)")
    return self
  }
}

This prints an instance’s type and a rendering of the instance (which varies depending on whether there’s a custom representation), then passes the actual instance through to whatever the next stage of chaining is. I don’t use it much, but when I do, it’s been pretty handy for taking a peek where Xcode’s normal QuickLook features hit the edge.

In any case, I thought I’d share these in case they’re of use to anyone else. Drop me a note or a tweet or a comment if they help. Cheers!

Update: It suddenly occurred to me that I could make this a lot more general:

extension View {
  /// Passes-through the view with customizable side effects
  public func passthrough(applying closure: (_ instance: Self) -> ()) -> Self {
    closure(self)
    return self
  }
}

Isn’t that nicer? The equivalent is now:

struct MyView: View {
  var body: some View {
    [Color.red, .orange, .yellow, .green, .blue, .purple]
      .reduce(Text("👭")
        .font(.largeTitle)
        .rotationEffect(Angle(radians: .pi))
        .typeErased)
      { view, color in
        view.padding()
          .background(color)
          .rotationEffect(Angle(radians: .pi / 6))
          .passthrough { print("\(type(of: $0)), \($0)") }
          .typeErased
      }
  }
}

And I can put any behavior in from printouts to timing to any other side effect I desire. To all the functional purists out there, I sincerely apologize. 🙂

SwiftUI: Handling optionals

A friend recently asked me if I’d write a few words about SwiftUI and optionals. Like nearly everything in SwiftUI, you have to rewire your brain a little bit when thinking about this because SwiftUI is perfectly happy working with optional views, such as Image? and Text?.

The tl;dr of this post is going to be “use map” but before I get there, let me dive in a little deeper. And, of course, whatever I got wrong, please let me know so I can learn more and correct this.

You can feed SwiftUI an optional view, such as Text?, with the understanding that the system will only render non-nil values. Here are some screenshots that show the output in both cases:

But what happens when you want to work with optional data that’s driving your view layout? You don’t want to use nil-coalescing (unless you have some compelling backup view case). Instead, if you want to render without a backup value, you have to dig a little deeper. Don’t automatically reach for the familiarity of conditional binding. You can’t if-let in SwiftUI the way you expect to:

My “clever” workarounds really weren’t very clever:

Although SwiftUI supports the if statement, prefer map as your first line of attack:
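
Here’s the shape of it, assuming a name property of type String? drives the view:

VStack {
  Text("Top")
  name.map { Text($0) } // Text? renders only when non-nil
  Text("Bottom")
}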

You can see how much more elegant the map version is in comparison. Force-unwraps make unicorns cry and contribute to overall levels of human misery. That’s not to say that if isn’t useful; rather, it’s just not my preferred approach for optionals in SwiftUI:

VStack {
  Text("Top")
  if name != nil {
    Text(name!)
  }
  Text("Bottom")
}

(Note: I’m exploring @ViewBuilder closures right now and there’s some really cool stuff including buildEither and buildIf content that I haven’t dived deep into yet.)

Be especially careful and read the documentation when you think you’re going to be working with failable initializers because sometimes you won’t be. For example, SwiftUI’s Image does not use a failable initializer.

I can’t tell, given the current stability of the system, whether Image(systemName: "notarealname") returns an empty image, which I guess wouldn’t be too bad, or always crashes (I’ve had a bunch of bad crashes), but my most common outcome is a frozen playground with a severe emotional breakdown, cowering in the corner and hugging itself.

I emphasize this gotcha because you might not catch the potential meltdown if you only pass it well-behaved strings during testing (as in the following case). It’s important because it can bite:
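
That is, something like this, where the string just happens to be a valid symbol name:

Image(systemName: "star.fill") // fine in testing; nothing guards against a bad name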

In contrast, UIImage uses a failable initializer and returns an optional, which you can map through an Image with consistent good outcomes at each point:
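
A sketch of that pipeline:

UIImage(systemName: "star.fill")
  .map { Image(uiImage: $0) } // yields Image?, which renders only when the UIImage exists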

If you want to get really OCD about all this stuff, you could add an extension on Optional that allows you to include a visual error instead of omitting the view, but I’m not entirely sure that’s tremendously useful:
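
For what it’s worth, a minimal sketch of that idea might look like this (orError and its default message are made-up names):

extension Optional where Wrapped: View {
  /// Renders the wrapped view, or a visible error placeholder when nil
  func orError(_ message: String = "missing view") -> AnyView {
    map { AnyView($0) } ?? AnyView(Text(message).foregroundColor(.red))
  }
}

Use it as name.map { Text($0) }.orError() wherever a silently omitted view would be confusing.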

I’m out of time and have to head back to work. Thanks for having lunch with me.