Archive for the ‘WWDC’ Category

Xcode: Vimpocalypse Now

This has appeared at the bottom of Xcode’s Text Editing > Editing settings, sure to make many vi users very happy indeed:

The change is heralded in the Release Notes:

Using the new vim features is super easy. When the Vim keybindings are enabled, the bottom of your source editor gets this, showing current state, plus a few reminders for those whose muscle memory isn’t quite current. It’s up to you to know your hjkluybn stuff.

You can tell if you’re navigating or editing by the shape of the cursor. The cursor is a large block when you’re navigating or using any of the commands listed here. It is a vertical pipe when directly entering text.

You can tap i to insert text, switching to text-entry mode, and press ESC to return to navigation. Using a colon : doesn’t do anything here, so no :wq, which is not a huge surprise, as neither w nor q makes a ton of sense in Xcode.

Anyway, if you’re a vimficionado, congratulations and enjoy the new toy.

App Clips: when is an app an app and when should it be a webpage

Apple’s new App Clip technology lets people load transient mini-apps without installing them through the App Store. Users don’t have to authenticate or authorize the mini-app. It just downloads and works. Whether scanning a code (think QR code) or detecting an NFC tag, iOS users can download and run these pre-vetted packages that represent a light, typically transactional, view of a larger app experience. I went through some write-ups and videos today and thought I’d share a mental dump of my thoughts.

All App Clips are accessed via URLs and limited to 10MB in size. Their job is to move a user through a quick transaction and then either return control to the user or solicit the user to download the full application. So if you’re selling cupcakes, you can “upsell” the experience from a single purchase to a loyalty program app.

App Clips are designed for transactions. As apps, rather than web pages, they integrate seamlessly with the store ecosystem, allowing users to purchase goods and services from an instant menu and to integrate with features like Apple Pay.

You can also use App Clips to decorate a museum or a city with points of interest, or a bus stop with upcoming travel information. You could do this instead with the phone’s built-in QR recognition for URLs, landing on a web page, but you’d miss the charm of the Apple “concierge” guiding you through the process.

Honestly, there’s nothing an App Clip can do (maybe other than something like Apple Pay) that a reasonably designed web page cannot, but it’s that charm and an eye towards lowering the transaction barrier that make the Clips so compelling. With Clips, end users can point, pick, and pay with an absolute minimum of effort. If this works as promised, many of the typical web hurdles (I speak as someone who has ordered a lot of MadGreens pick-up food over the last few days) disappear.

A Clip’s data is only transient if it is not transferred to a client app. Since the Clip must be developed in tandem with that app, an impulse purchase can be applied directly to loyalty points, putting the user on the path to a redemption reward. The side-by-side development and tandem review, plus the ability to share assets and re-use code, make this a promising area for developing commercial transaction apps that don’t interpose a more traditional app between the user’s desire (and impatience) and the purchase.

I use the Safeway app regularly and I utterly loathe it. Safeway has three tiers of user prices: general rip-off, loyalty-program, and expensive-but-you-can-live-with-it Just4U. Swiping a card only gets you to the middle tier. To get the Just4U prices, you must spend a half-hour before each visit entering all the coupons you think you might need, or try to scan the teeny-tiny, often folded, ripped, or dirty QR codes on the shelves and then hope that the store’s remarkably bad WiFi and cell service doesn’t produce dozens of error alerts (which must each be manually dismissed) that keep you from actually being awarded those better prices (and the occasional $5 or $10 off a $100 purchase).

If the App Clip experience shows how to smooth the pathway from desire to acquisition, then Safeway’s approach shows how to tick off customers and actively convince them not to purchase certain items when the coupons aren’t working.

However, I’m not only excited about the transactional nature of App Clips. I can easily see how a well-backed municipal or organizational effort could provide “more information”, “deeper information”, “found facts”, and “inspiration” using the same technology. Again, the key lies in reducing turbulence, steering the user, and completing the goal in the shortest period of time.

In terms of designing your own apps for App Clips, remember that simple is always better. The more choices, options, and features you present in your Clip, the more work the user must do. Instead, consider pushing your top sellers as “Quick Buys”. That doesn’t mean you can’t offer deeper choices, but if you do, remember the people queuing up behind, impatiently waiting for your customer to pick their deep-menthe chocolate-macchiato with half-skim/half-soy, two-and-three-quarters squirts of Pumpkin Spice, and low-fat whipped cream.

I’d imagine that the best transactional Clips will capitalize not only on desire but also on the flow and customer-to-customer dynamics that exist within the store experience.

For App Clips outside the sphere of purchase, keep the same kind of locational awareness. A visitor who has just discovered the fascinating history of your city’s belltower probably shouldn’t have to step back into traffic to look up and view the little dancing people on the clockface.

When I first saw this feature, I wasn’t all that excited. Now that I’ve dived in a little more, I’m much more impressed by the thought, care, and clever delivery mechanism Apple has put together.

What do you think? Too clever for its own good or a tech we’ll be seeing for years to come?

SwiftUI: Handling optionals

A friend recently asked me if I’d write a few words about SwiftUI and optionals. As with nearly everything in SwiftUI, you have to rewire your brain a little bit when thinking about this, because SwiftUI is perfectly happy working with optional views, such as Image? and Text?.

The tl;dr of this post is going to be “use map” but before I get there, let me dive in a little deeper. And, of course, whatever I got wrong, please let me know so I can learn more and correct this.

You can feed SwiftUI an optional view, such as Text?, with the understanding that the system renders only non-nil values. Here are some screenshots that show the output in both cases:
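The setup behind those screenshots looks something like this minimal sketch (maybeText is a hypothetical property):

struct OptionalTextView: View {
  let maybeText: Text? // e.g. Text("Hello") or nil

  var body: some View {
    VStack {
      Text("Always shown")
      maybeText // rendered only when non-nil; omitted otherwise
    }
  }
}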

But what happens when you want to work with optional data that’s driving your view layout? You don’t want to use nil-coalescing (unless you have some compelling backup view case). Instead, if you want to render without a backup value, you have to dig a little deeper. Don’t automatically reach for the familiarity of conditional binding. You can’t if-let in SwiftUI the way you expect to:
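For illustration, here’s a sketch of the attempt (name is a hypothetical String? in scope, and the compiler message is paraphrased):

// VStack {
//   if let name = name {  // error: closure containing control flow statement
//     Text(name)          // cannot be used with function builder 'ViewBuilder'
//   }
// }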

My “clever” workarounds really weren’t very clever:

Although SwiftUI supports the if statement, prefer map as your first line of attack:
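Here’s what that looks like, again assuming a hypothetical name: String? in scope:

VStack {
  Text("Top")
  name.map { Text($0) } // produces Text?, rendered only when name is non-nil
  Text("Bottom")
}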

You can see how much more elegant the map version is in comparison. Force-unwraps make unicorns cry and contribute to overall levels of human misery. That’s not to say that if isn’t useful; rather, it’s just not my preferred approach for optionals in SwiftUI:

VStack {
  Text("Top")
  if name != nil {
    Text(name!) // force-unwrap is safe only because of the preceding nil check
  }
  Text("Bottom")
}

(Note: I’m exploring @ViewBuilder closures right now and there’s some really cool stuff, including buildEither and buildIf, that I haven’t dived deep into yet.)

Be especially careful and read the documentation when you think you’re going to be working with failable initializers because sometimes you won’t be. For example, SwiftUI’s Image does not use a failable initializer.

Given the current stability of the system, I can’t tell whether Image(systemName: "notarealname") returns an empty image, which I guess wouldn’t be too bad, or always crashes (I’ve had a bunch of bad crashes), but my most common outcome is a frozen playground with a severe emotional breakdown, cowering in the corner and hugging itself.

I emphasize this gotcha because you might not catch the potential meltdown if you only pass it well-behaved strings during testing (as in the following case). It’s important because it can bite:

In contrast, UIImage uses a failable initializer and returns an optional, which you can map into an Image with consistently good outcomes at each point:
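A minimal sketch of that approach (resourceName is a hypothetical asset name): UIImage(named:) returns nil for a missing resource, and map carries that safety through to the Image:

import SwiftUI
import UIKit

struct SafeImageView: View {
  let resourceName: String // hypothetical asset catalog name

  var body: some View {
    VStack {
      Text("Above")
      // UIImage? maps to Image?, which SwiftUI omits when nil.
      UIImage(named: resourceName).map { Image(uiImage: $0) }
      Text("Below")
    }
  }
}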

If you want to get really OCD about all this stuff, you could add an extension on Optional that allows you to include a visual error instead of omitting the view, but I’m not entirely sure that’s tremendously useful:
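Something along these lines, a hypothetical sketch rather than a recommendation:

extension Optional where Wrapped: View {
  // Renders the wrapped view when present, or a visible error placeholder otherwise.
  @ViewBuilder func orErrorView() -> some View {
    if self != nil {
      self // Optional is itself a View; nil never reaches this branch
    } else {
      Text("Missing content").foregroundColor(.red)
    }
  }
}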

I’m out of time and have to head back to work. Thanks for having lunch with me.

SwiftUI: Modal presentation

I have regrettably little time to devote to SwiftUI. I explore when I can, although I wish I were a lot further in that journey.

Here’s my latest go, where I’m looking to build a modal presentation. Today is the first time I’ve been able to play with Modal, the storage type for a modal presentation. I tied it together with an isPresented state, but I’m wondering if I’ve done this all wrong.

I can’t help but think there’s a better way to do this. I’m using a text button for “Done” instead of a system-supplied item, so it won’t be automatically internationalized. Nor can I find any specialty “Done” item in SF Symbols. When looking at Apple’s samples, such as Working with UI Controls, I see the same Text("Done"). While I know that Text elements are automatically localized should resources be available, is SwiftUI providing us with any core dictionary of terms?

I think using the isPresented state in the code below may be too clunky. I’d think that there would be a more direct way to coordinate a modal item. Any advice and guidance will be greatly welcomed.
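For reference, here’s a minimal sketch of the pattern I’m describing, written against the sheet(isPresented:) formulation rather than the beta-era Modal type (all names are hypothetical):

import SwiftUI

struct ModalDemo: View {
  @State private var isPresented = false

  var body: some View {
    Button("Show Details") { self.isPresented = true }
      .sheet(isPresented: $isPresented) {
        VStack {
          Text("Modal content goes here")
          // A plain text button, not a system-supplied "Done" item.
          Button("Done") { self.isPresented = false }
        }
      }
  }
}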

I remain stuck on Mojave for most of my work, although I put an install of Catalina on a laptop. Although you can build proper SwiftUI apps using the beta Xcode, the lack of previews (and I’ve had no luck finding a secret default to enable them under Mojave) makes the experience way slower than working in a playground.

I’m hoping to dive next into Interfacing with UIKit.

SwiftUI: Boing!

Source: here

Note that you add the animation to the View object and update the view’s state in the gesture state handlers. The onEnded action passes a summary of the velocity, offset, and location of the gesture but I ignored it because I didn’t need it.
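Here’s a minimal sketch of the pattern (not the original source, which lives at the link above): the spring animation is attached to the view, and the gesture handlers update the state:

import SwiftUI

struct BoingView: View {
  @State private var offset: CGSize = .zero

  var body: some View {
    Circle()
      .fill(Color.orange)
      .frame(width: 100, height: 100)
      .offset(offset)
      .gesture(DragGesture()
        .onChanged { self.offset = $0.translation } // track the drag
        .onEnded { _ in self.offset = .zero })      // ignore the summary; spring home
      .animation(.spring()) // the animation lives on the view itself
  }
}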

SwiftUI: Embracing the nonobvious?

This is going to be another day where I get to play with SwiftUI because I can’t get any real work done right now and am dealing with lots of interruptions.

This morning, I returned to yesterday’s mouse inventory sample to try to get my rounded corners working. Several people suggested that I implement my interface using a ZStack and a Rectangle, so I tried that first.

To my surprise, the Rectangle expanded my VStack, and I haven’t yet figured out how to constrain its size to be no larger than its sibling’s. I wanted the rectangle to grow weakly instead of pushing the title and total items towards the edge of the parent view, the way it did in this screenshot:

Here’s what it looks like without the monster-sized Rectangle, which I think is a much more appealing layout:

So instead, after messing around a bit, it occurred to me that everything is a view, or at least everything is kind of view-ish, and if so, I could apply my corner rounding to Color, which I did:

}.padding()
.background(Color.white.cornerRadius(8))

And surprise, this is what I got:

Isn’t that cool?
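In context, a hypothetical version of that card might look something like this, with Color.white.cornerRadius(8) supplying the rounded background:

struct InventoryCard: View {
  var body: some View {
    VStack {
      Text("Mouse Inventory").font(.headline)
      Text("Total items: 7")
    }
    .padding()
    .background(Color.white.cornerRadius(8))
  }
}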

Although the final layout is exactly what I wanted, if you think about it, it’s not that intuitive that the system uses tight placement for this version and lax spacing for the one with the Rectangle.

In fact, as a developer, I’m not happy about not having direct control over the tightness of either layout or an obvious way to relate ZStack siblings. If there’s a way to describe how much content hugging I want in a ZStack layout and how to prioritize which item in that layout should guide the others, I haven’t discovered it. If you have, please let me know!

I’m still trying to learn how best to use the deeply mysterious Length (and, no, don’t tell me “it’s just CGFloat“, because clearly it isn’t “just” that, with all the Angle, Anchor, GeometryProxy, UnitPoint stuff, and so forth) and apply layout relationships. Today, time allowing, I’d certainly like to learn more about the mysterious TupleView, a View created from a Swift tuple of View values, and see where it is used; ForEach, which computes views on demand; Group; EquatableView; and so forth.

SwiftUI: A little state

I wish I had more time to play. Here’s a little SwiftUI thing I threw together in the few moments I had free today. The source code is here.

Interestingly, not including Color for backgrounds seems to kill my poor little sample. I suspect an overload where the type cannot be unambiguously inferred. Adding corner radiuses (shown here on the outside) destroys user interactivity. I have it commented out in the gist.

Originally, I tried to control state extrema (no negative inventory) in my model object, but that led to a disconnect with the steppers. Instead, I finally found an initializer that allowed me to specify the valid range (in: range), which sanitizes the user input and disables the minus button for zero values.
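The range-limited formulation looks roughly like this sketch (the names are hypothetical); the in: parameter clamps the value, and the minus button disables itself at zero:

import SwiftUI

struct InventoryRow: View {
  @State private var count = 0

  var body: some View {
    Stepper("Mice: \(count)", value: $count, in: 0...100)
  }
}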

A lot of the time I spent putting this together ended up with “helpful” results that looked like this:

That is to say, it’s really hard to provide a fluent functional framework in a type-safe language; it feels like you’re constructing things into type-erased collections even though you never actually are…if that makes sense.

So far this week, I’ve managed to watch one video (the keynote) and about 20 minutes of another (the first bits introducing SwiftUI). I hope I have a chance to catch up. I’ll try to keep notes here on the website as I work through some of this stuff. It feels weird this year how far behind I am due to work commitments.

I spent today out of the office due to personal commitments and it’s been the first time I could really dive in (well, “dive” meaning for 10-20 minutes at a time here and there during the day). Loving this stuff, can’t wait to do more.

Good Things: SwiftUI on Mojave in iOS Playgrounds

Yes, you can upgrade to the Catalina Beta. Or, you can keep getting work done on Mojave and still enjoy the glory of SwiftUI exploration.

  1. Install Xcode 11 beta alongside your Xcode 10.2.x.
  2. Using the beta, create an iOS playground. (This won’t work with macOS, which attempts to use the native frameworks, which don’t yet support SwiftUI.)
  3. Import both SwiftUI and PlaygroundSupport.
  4. Set the live view to a UIHostingController instance whose rootView conforms to View.

Here’s an outline of the basic “Hello World” setup:
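Something like this minimal sketch:

import SwiftUI
import PlaygroundSupport

struct ContentView: View {
  var body: some View {
    Text("Hello World")
  }
}

// Host the SwiftUI view in the playground's live view.
PlaygroundPage.current.liveView = UIHostingController(rootView: ContentView())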

From there, you can create pages for each of your SwiftUI experiments, allowing you to build short, targeted applets to expand your exploration and understanding of the new technology.

Happy WWDC: Watch the keynote live

Apple’s keynote (aka its Special Event) will be streamed live at 10AM PDT (11AM Mountain, 12PM Central, 1PM Eastern). If you didn’t score a golden ticket, you can watch along on the Apple event web page or on your Apple TV.

I’ll update this post with thoughts and reactions as the event unfolds.

  • iOS 12 announced as a free update. A new measurement app for supported devices will allow you to measure and scan physical objects using an AR experience. Lego’s demo shows that multiplayer AR can be a fun and exciting experience. Sample code is going home with developers today.
  • Enhancements to photo search and sharing put the focus on the user-product experience. Today’s keynote, more than ever, seems to be focused on selling their existing products.
  • I love extensible Siri (aka “Siri Shortcuts”, with a dedicated drag-and-drop Shortcuts app), with custom end-user phrases to perform app integration and tasks. Siri will suggest these custom items. Create a macro for common tasks you group together and launch with a custom phrase. A great way to perform everything you need to get to work, hit your commute, or head off to lunch.
  • Several redesigned apps: News, Stocks (with an integrated Apple News business section, adding iPad support), Voice Memos (also with iPad support), and iBooks, which is renamed Apple Books.
  • CarPlay will now support third party navigation apps, so you can Waze your way home.
  • Lifestyle improvements continue with “Goodnight Moon, Goodnight Watch” do-not-disturb mode for bedtime and other naturally concluding cycles. Enhanced notification control, including grouped notifications, diminishes notification burnout.
  • Screen Time lets you analyze how addicted you are to your devices, with full activity reports that summarize how often you pick up your device and which apps you spend the most time in. App Limits provide an alternative to 12-step Appholic programs.
  • Animoji enhancements add tongue detection to improve language and emotional expressiveness. Memoji adds an iOS “mii”-style customization to Animoji.
  • Group FaceTime now supports up to 32 participants. With Animoji and special effects support, it’s like you’re doing drugs while sober.
  • Apple Watch gets some enhancements for health and fitness, including group competitions to motivate participants. Apple has added Clippy for automatic workout detection: “It looks like you’re starting an exercise routine”. New watch-to-watch walkie-talkie feature adds a little fun for kids and anyone who wants to harass the cook in the kitchen. “Whaaaaats for diiiiiiinnnner?”
  • GymKit sounds like fun.
  • Apple TV adds Dolby Atmos sound and is Dolby Vision certified. Better integration with cable TV partners including Charter Spectrum in the US. “Zero Sign-on” means that all the channels you pay for are automatically configured on your behalf. Beautiful new whole-Earth aerial joins the lineup.
  • macOS Mojave: Inspired by the desert at night. Desktop stacks allow you to group material together to save space. Preview markup moves to the Finder! Markup augments screenshots. You can also capture video demos from your screen using a built-in video recorder. Integration between your Mac and your phone allows you to use the phone camera to capture pictures for use in iWork. (Apple Work?) Voice Memos, Home, News, and Stocks are now on macOS.
  • New privacy measures are in place. Today I learned: you can be tracked by a “fingerprint” including the fonts you have installed and other “public” customizations. Scary.
  • The App Store has been redesigned. Pretty.
  • CreateML lets you develop assets and train models for Machine Learning. CoreML 2 is 30% faster and quantization reduces model size by up to 75%.
  • 2019: iOS apps come to the Mac. A multi-year roadmap converges the two designs, and a whole new generation of developers floods the platform, to the benefit of the Mac as well as iOS.