Archive for December, 2016

Swift Style vs ProseLint: The Smackdown

ProseLint is great. As I’m writing a book about style and linting, it’s natural to try to lint the book that lints your programming. In using this tool, I’ve encountered some amusing “lint fails” that I thought I’d share.

ProseLint vs Nil Coalescing: “hyperbolic.misc ‘`??`’ is hyperbolic.” Winner: Swift Style.

ProseLint vs discussion of Forced Unwrapping: “leonard.exclamation.30ppm More than 30 ppm of exclamations. Keep them under control.” Winner: ProseLint. Any forced unwrapping, even in a discussion about forced unwrapping, is an obvious fail. Save the kittens, drop the !’s.

ProseLint vs Meaningful Variable Names: “typography.symbols.multiplication_symbol Use the multiplication symbol ×, not the letter x” Winner: Swift Style. As Freud said, sometimes a letter “x” is just a letter “x”. (Or was that Groucho Marx? I forget.)

ProseLint vs “Use American English Spelling” rule: “consistency.spelling Inconsistent spelling of ‘color’ (vs. ‘colour’)” Winner: Swift Style. When writing for a global audience, prefer “color” to “colour”. (See? I did it again. — B. Spears)

Winner? Forget the points. It’s ProseLint. This summary doesn’t include the many catches it made that I fixed, like excessive use of “very” and repeated words. Great tool; check it out.

Reducing to Swift sets

A friend asked me, “Is there a better way to reduce to a set than .reduce(Set<String>()) { $0.union(CollectionOfOne($1)) }?” He was fetching results from an external source and wanted to feed them into a set.

We kicked some ideas back and forth about how this could be implemented. Would he need to query the set before fetching all the items (no)? Would his data set be so large that it would be impractical to store the intermediate results in an array before creating a set (also no)?

I built a suite of tests, trying out his reduce method, normal insertion, and so on. I assumed that using a Set initializer would probably be the best approach for a pre-computed array, but it turns out that union and insertion performed better in repeated tests:

timetest("initializer") { //  0.652348856034223
    var x: Set<String> = []
    (1 ... 5_000).forEach { _ in
        x = Set(letters)
    }
}

timetest("union") { // 0.524669112986885
    var x: Set<String> = []
    (1 ... 5_000).forEach { _ in
        x = x.union(letters)
    }
}

timetest("insert") { // 0.572339564969297
    var x: Set<String> = []
    (1 ... 5_000).forEach { _ in
        x = []
        letters.forEach { x.insert($0) }
    }
}

timetest("reduce") { //  0.762973523989785
    (1 ... 5_000).forEach { _ in
        var x = letters.reduce(Set<String>()) {
            $0.union(CollectionOfOne($1))
        }
    }
}
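The timetest helper and the letters sample data aren’t shown above. As a rough sketch of the harness (the actual helper I used may differ), something like this would do:

import Foundation

// Hypothetical sample data: a modest array of strings to build sets from
let letters = ["alpha", "bravo", "charlie", "delta", "echo", "foxtrot", "golf", "hotel"]

// Minimal timing helper: run a closure and print the elapsed seconds
func timetest(_ label: String, block: () -> Void) {
    let start = Date()
    block()
    print(label, Date().timeIntervalSince(start))
}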

That surprised me since you’d imagine that init<Source : Sequence where Source.Iterator.Element == Element>(_ sequence: Source) and func union<S : Sequence where S.Iterator.Element == Element>(_ other: S) -> Set<Element> would have similar performance characteristics.

What didn’t surprise me was that trying to build the set on the fly cost more than storing intermediate results in an array and then building a set out of them. So long as the array is reasonably limited in size (that is, large enough to be non-trivial but not so large that it puts a burden on application memory), an intermediate array seems to be the better approach. Set(collectedResults) outperformed insert, formUnion, and reduce/union for non-trivial result sizes.
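In other words, assuming the results arrive incrementally from his external source, the winning pattern looks roughly like this (fetchBatch is a hypothetical stand-in for his actual fetch call):

// Hypothetical stand-in for fetching results from an external source
func fetchBatch() -> [String]? { return nil }

// Accumulate into an array first...
var collectedResults: [String] = []
while let batch = fetchBatch() {
    collectedResults.append(contentsOf: batch)
}

// ...then build the set once at the end
let uniqueResults = Set(collectedResults)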

Unexpected precedence issues with try? and as?

Tim Vermeulen recently wrote on the Swift Evolution list that try? precedence can be unexpected:

if let int = try? mightReturnInt() as? Int {
    print(int) // => Optional(3)
}

Specifically, he discovered that try?’s precedence is lower than as?’s precedence, so you may have to add parentheses to get the right result.

if let int = (try? mightReturnInt()) as? Int {
    print(int) // => 3
}

He also found a similar issue with using try? on a throwing-optional-returning scenario:

if let int = try? mightReturnInt() {
    print(int) // => Optional(3)
}

if let int = (try? mightReturnInt()) ?? nil {
    print(int) // => 3
}

There’s some magic baked into if let item = item as? T that automatically lifts optionals, which doesn’t yet seem to extend to try?. If you’re running into these situations, consider adding parentheses and nil-coalescing as demonstrated in these examples.

In case you think a throwing-optional scenario is too “out there”, think of a file system request that would throw on an unreadable directory and return nil if a specific file does not exist. Although obscure, it is not unthinkable to combine the two approaches.
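A minimal sketch of that scenario might look like the following, where fileContents is a hypothetical helper (not an API from the post) that throws when the directory is unreadable and returns nil when the file simply isn’t there:

import Foundation

enum FetchError: Error { case unreadableDirectory }

// Hypothetical: throws if the directory can't be listed,
// returns nil if the named file isn't present
func fileContents(named name: String, in directory: String) throws -> String? {
    guard let listing = try? FileManager.default.contentsOfDirectory(atPath: directory) else {
        throw FetchError.unreadableDirectory
    }
    guard listing.contains(name) else { return nil }
    let url = URL(fileURLWithPath: directory).appendingPathComponent(name)
    return try String(contentsOf: url)
}

// As above, parenthesize try? and coalesce the doubled optional
if let contents = (try? fileContents(named: "notes.txt", in: "/tmp")) ?? nil {
    print(contents)
}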

My first Cozmo game: Hide and Seek

In my continuing exploration of the Cozmo SDK, I’ve written my first game. “Hide and Seek” is not particularly complex or challenging. I’m still fairly new to both Python and the SDK tools, and my wrapper code is still pretty basic.

“Hide and Seek” goes like this. You place a light cube near Cozmo, out of his direct line of sight. He searches until he finds it, picks it up, moves it to a new location, and drops it. He then turns around, with his back to the cube, and says, “Let’s play again.”

Although this is a first take, I think my wrapper approach remains pretty simple and readable. I’m trying to emphasize learning programming concepts, using Python, which means that Cozmo access needs to be encapsulated and procedural:

# run, cozmo, run
def actions(cozmoLink):
    '''Specify actions for cozmo to run.'''
    
    # Fetch robot
    coz = Cozmo.robot(cozmoLink)
    coz.say("Hide and seek!")

    # Look for a cube
    if not coz.findACube():
        coz.say("No cube")
        return

    # Found one!
    coz.say("I found a cube")
    coz.takeCube() # Pick it up
    coz.say("I have the cube")
    coz.drive(time = 3, direction = Direction.forward) # Drive
    coz.dropCube() # Place the cube
    coz.turn(degrees = 180) # Turn around
    coz.say("Let's play again")

Cozmo.startUp(actions)

The most complex concept here is searching for a cube (coz.findACube()) and acting on a Boolean return value. I’m not completely in love with how I established this notion. Maybe something more along the lines of “ask to look and find” instead of just “find” would better indicate conditionality.

Under the covers, the Cozmo class now stores both a cube of interest and a list of cubes within view. I’m not sure I’m going to stay with this specific design, but this new feature is what allows you to omit mentioning the cube instance in the latter half of the game. I think I probably need to step back and refactor toward a Cozmo “world” class that describes what he sees and better mirrors the world in the underlying APIs.

Here’s a video of the “Hide and Seek” gameplay:

My immediate goals are to encapsulate all the asynchronous and exception-handling code into very simple call styles. I want to model the world, the robot, and the interactions in a more human-based way, to support simple programming concepts: procedure, state, condition, iteration, and eventually functions.

Even though my immediate interest is teaching, at the back of my mind I want to eventually get to the point where I can introduce some emotion programming, which I think is a perfect fit for Cozmo.

I wouldn’t have to start from scratch. There’s some GPL-licensed work done in the Facemoji project. Facemoji harvests an emotion dataset, classifies the emotions, and then matches incoming video against the data set. Wouldn’t it be great if Cozmo could react to your face beyond recognition, playing off happiness, sadness, and so on?

Buy a book: Swift Style


I’m happy to announce that Swift Style has gone into beta release at the Pragmatic Bookshelf. My book is now ready for sale as part of their beta program. This program gives you early access to the book’s material as I work on it.

Be part of the writing process. Beta access enables you to offer feedback as I finish writing:

Before a book gets to the final, ready-to-publish state, it normally looks quite rough. It will have hundreds of typos and grammatical errors. It’s likely to have technical errors that would normally get corrected in a final read-through by reviewers. And it’ll certainly look fairly ugly—we don’t do any layout work until just prior to sending a book to the printer, so there will be widows, orphans, text split across page turns and so on.

If you find mistakes or technical errors, want to argue for or against a style rule, or would like to submit an enhancement suggestion, click the Report Erratum link on the book’s home page. If you have any in-depth feedback (either positive or negative) that goes beyond the scope of the erratum page, drop me an email. Enable notifications so you receive an email whenever the book updates.

Swift Style: Beta Book

Self-Published Books

Thank you for your support!

Apple TV, Home Sharing, and Missing Movies

I rented Hunt for the Wilderpeople last week, while it was the $0.99 featured rental. I’d heard good things about this Kiwi movie (I’m a bit of a kiwiholic) and couldn’t wait to watch it.


So today, with a draft of Swift Style pushed up to Pragmatic, I thought I’d set it up for a nice family watch tonight. I opened the Computers > Rentals section on Apple TV and saw this:


I wasted about 20 minutes googling things like “why doesn’t my rental show up on my Apple TV” and checking my iTunes accounts and home sharing setup, when I suddenly remembered this had happened to me before.

With that spark of inspiration lingering in my mind, I went to iTunes on my computer (where I had rented it) and sure enough, it was still up in the cloud. I clicked the download button and got it down to my computer:


About 1.41 GB later (and several pauses and resumes when the download speed got slow; seriously, at one point the ETA jumped from over 40 minutes to under 3), I returned to Apple TV and hopped into my home-sharing library.


Tada.

So if you’re looking for a lost movie or can’t find your rental on Apple TV, and you rented it on your home computer, make sure you’ve downloaded it from the cloud before attempting to play it from the Apple TV.

Programming Cozmo

Anki has been kind enough to let me play with their new Cozmo unit and explore their SDK. Cozmo is a wonderful device, developed by people who understand a lot of core principles about human interaction and engagement.

Cozmo is adorable. When it recognizes your face, it wriggles with happiness. It explores its environment. When it’s bored, it sets up a game to play with you. It can get “upset” and demand attention. It’s one of the most personable and delightful robots I’ve played with.

At its heart is a well-chosen collection of minimal elements. The unit can move around the room with a 4-wheel/2-tread system. It includes an onboard forklift that can rise and fall, an OLED “face” that expresses emotion, and a camera that ties into a computer vision system. (I had assumed that system was based on PIL, the Python Imaging Library, but Anki tells me that Cozmo’s vision system “does not use PIL or Python in any way, though the Python SDK interface uses PIL for decoding jpegs, drawing animations, etc.”)

Three lightweight blocks with easily identified markings, which Cozmo can tap, lift, stack, and roll, complete the package.

Between its remarkable cuteness and its vision-based API, it’s a perfect system for introducing kids to programming. I was really excited to jump into the SDK and see how far I could push it.

Here is Anki’s “Hello World” code (more or less, I’ve tweaked it a little) from their first developer tutorial:

import sys
import cozmo

'''
Hello Human
Make Cozmo say 'Hello Human' in this simple
Cozmo SDK example program.
'''

def run(sdk_conn):
    robot = sdk_conn.wait_for_robot()
    robot.say_text("Hello Human").wait_for_completed()
    print("Success")

if __name__ == '__main__':
    cozmo.setup_basic_logging()    
    try:
        cozmo.connect(run)
    except cozmo.ConnectionError as err:
        sys.exit("Connection error: %s" % err)

Although simple, this “Hello World” includes quite a lot of implementation details that can scare off young learners. For comparison, here’s the start of Apple’s tutorial on Swift “Learn to Code”:


There’s such a huge difference here. In Apple’s case, everything that Byte (the main character) does is limited to easy-to-understand, simple calls. The entire implementation is abstracted away, and all that’s left are instructions and very directed calls, which the student can put together, re-order, and explore with immediate feedback.

In Anki’s code, you’re presented with material that deals with set-up, exceptions, asynchronous calls, and more. That is a huge amount of information to put in front of a learner and then say “ignore all of this”. Cozmo is underserved by this approach. Real-life robots are always going to be a lot more fun to work with than on-screen animations. Cozmo deserved as simple a vocabulary as Byte. That difference set me on the road to create a proof of concept.

In this effort, I’ve tried to develop a more engaging system of interaction that better mirrors the way kids learn. By creating high-level abstractions, I wanted to support the same kind of learning as “Learn to Code”: it begins with procedural calls, then conditional ones, and moves on to iteration, functional abstraction, and so forth.

My yardstick of success has been, “can my son use these building blocks to express goals and master basic procedural and conditional code?” (I haven’t gotten him up to iteration yet.) So far, so good, actually. Here is what my updated “Hello World” looks like for Cozmo, after creating a more structured entry into robot control functionality:

from Cozmo import *

# run, cozmo, run
def actions(cozmoLink):
    '''Specify actions for cozmo to run.'''
    
    # Fetch robot
    coz = Cozmo.robot(cozmoLink)

    # Say something
    coz.say("Hello Human")

Cozmo.startUp(actions)

Not quite as clean as “Learn to Code” but I think it’s a vast improvement on the original. Calls now go through a central Cozmo class. I’ve chunked together common behavior and I’ve abstracted away most implementation details, which are not of immediate interest to a student learner.

Although I haven’t had the time to really take this as far as I want, my Cozmo system can now talk, drive, turn, and engage (a little) with light cubes. What follows is a slightly more involved example. Cozmo runs several actions in sequence, and then conditionally responds to an interaction:

from Cozmo import *
from Colors import *

# Run, Cozmo, run
def actions(cozmoLink):
    '''Specify actions for cozmo to run.'''
    
    # Fetch robot
    coz = Cozmo.robot(cozmoLink)

    # Say something
    coz.say("Hello")

    # Drive a little
    coz.drive(time = 3, direction = Direction.forward)
    
    # Turn
    coz.turn(degrees = 180)
    
    # Drive a little more
    coz.drive(time = 3, direction = Direction.forward)

    # Light up a cube
    cube = coz.cube(0)
    cube.setColor(colorLime)
    
    # Tap it!
    coz.say("Tap it")    
    if cube.waitForTap():
        coz.say("You tapped it")
    else:
        coz.say("Why no tap?")
    cube.switchOff()

Cozmo.startUp(actions)

And here is a video showing Cozmo executing this code:

If you’d like to explore this a little further:

  • Here is a video showing the SDK feedback during that execution. You can see how the commands translate to base Cozmo directives.
  • I’ve left a bit of source code over at GitHub if you have a Cozmo or are just interested in my approach.

As you might expect, creating a usable student-focused learning system is time consuming and exhausting. Beyond providing controlled functionality, what’s missing here is a lesson plan and a list of skills to master, framed as “Let’s learn Python with Cozmo”. What’s here is just a sense of how that functionality might look when directed into more manageable chunks.

Given my time frame, I’ve focused more on “can this device be made student friendly” than on producing an actual product. I believe my proof of concept shows that the right kind of engagement can support this kind of learning with this real-world robot.

The thing that has appealed most to me about Cozmo from the start is its rich computer vision capabilities. What I haven’t had a chance to really touch on yet are its high-level features like “search for a cube” and “lift it and place it on another cube”, all of which are provided as building blocks in its existing API, and all of which are terrific touch points for a lesson plan.

I can easily see where I’d want to develop some new games with the robot, like lowering reaction time (it gets really hard under about three quarters of a second to tap that darn cube) and creating cube-to-cube sequences of light. I’d also love to discover whether I can extend detection to some leftovers my son brought home from our library’s 3D printer reject bin.

Cozmo does not offer a voice input SDK. Its only real way to interact is through its cameras (and vision system) and through taps on its cubes. Even so, there’s a pretty rich basis for crafting new ways to interact.

As for Anki’s built-ins, they’re quite rich. Cozmo can flip cubes, pull wheelies, and interact in a respectably rich range of physical and (via its face screen) emotional ways.

Even if you’re not programming the system, it’s a delightful toy. Add in the SDK though, and there’s a fantastic basis for learning.

Enter the Python: Peeking at a language

Last week, I wrote about how I set up Xcode to run Python. It’s been working great. Xcode may not be everyone’s cup of tea, but I love it. Syntax highlighting, familiar keybindings, symbol completion. I couldn’t be happier. A lot of people pushed me to use Pycharm community edition, but while I’ve installed it and tried it a few times, I keep going back to Xcode. Warts and all.

I haven’t logged many hours in Python but it’s been a fascinating language experience. Let me go all metaphor on you. Way back in the ’90s there was this show called “Sliders”, about a bunch of people moving between parallel worlds. Almost everything was the same from world to world — normal humans, trees, buildings, whatever — but there were always fundamental differences in the culture and the people that reminded you that you weren’t home.

Python is the Sliders version of Swift, the one where Chris Lattner was never born. Everything is eerily familiar and nothing is quite right. Where are my value types? My generics? My type extensions? Let me throw out another metaphor, one that will probably resonate with even fewer people: Python is the language version of the Nethack Rogue Level, where you enter “what seems to be an older, more primitive world.” It’s all familiar. Nothing is exactly the same.

This morning, I attempted to extend a type. I’m working with Anki’s Cozmo robot SDK, which is written for Python 3.5.1 or later. I’m trying to reconfigure many of the basic calls into more appropriate chunks suitable for teaching kids some programming basics.

Instead of focusing on asynchronous callbacks and exceptions, I want to provide really simple blocks that extend the robot type API in a way that hides nearly all the implementation details. I’m trying to build, in a way, a Python version of Swift Playgrounds but with a real robot. (And it’s going well, but more about that in another post.)

What I found was that Python really doesn’t want to extend types. You can subclass. You can compose. But so far, I haven’t found a way to add an extension that services an existing type. When I asked around, the Python gurus on freenode recommended I stop worrying about polluting the global namespace and embrace freestanding functions as needed.
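For comparison, here’s the sort of thing I take for granted in Swift, where an extension retrofits behavior onto a type you don’t own (String stands in here for the SDK’s robot type):

// Swift: extend a type you didn't define, no subclassing or wrapping required
extension String {
    func shouted() -> String {
        return uppercased() + "!"
    }
}

print("hello human".shouted()) // HELLO HUMAN!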

Oh, my delicate Swift sensibilities! Adding global functions and constants? Cluttering the global namespace? I find myself clinging to Swift conventions. I create enumerations and type my arguments:

import cozmo
from enum import IntEnum

class Direction(IntEnum):
    '''Permitted driving directions.'''
    forward = 1
    backward = -1

def drive(robot: cozmo.robot.Robot,
          direction: Direction = Direction.forward): ...

The Cozmo SDK defines its constants like this:

LEFT = 1
RIGHT = 2
TOP = 4
BOTTOM = 8

I don’t think I’m in Swift-land anymore.

A lot of the things I like most about Python appear to be fairly new, like the ability to type arguments. I’m assured by some Pythonistas that this is almost entirely syntactic sugar, and there appears to be no type-checking, inference, or casting applied to calls.

I thought I would really hate the indentation-based scoping but I don’t. It’s easy to use (start a scope with a colon, indent 4 spaces for that scope). It reads well. It’s clean. Non-braced scoping ended up being a complete non-issue for me, and I mildly admire its clean look.

I’m less excited by Python’s take on structured documentation. The standard is outlined in PEP 257. Unlike Apple’s Swift Documentation Markup, Python’s markup doesn’t seem to support specific inline tool use in addition to document generation. I’m sensitive to how much better Swift does here: it creates a structured system for detailing parameters, error conditions, return types, and descriptions; it scales from types to functions and methods to individual instances; and it provides Xcode integration.
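To show what I mean, here’s the flavor of structured comment Swift supports (a hypothetical robot call, not the actual Cozmo wrapper), which Xcode surfaces directly in Quick Help:

enum RobotError: Error { case disconnected }
enum Direction { case forward, backward }

/// Drives the robot for a fixed amount of time.
///
/// - Parameters:
///   - seconds: How long to drive, in seconds.
///   - direction: The direction to travel.
/// - Returns: `true` if the drive completed.
/// - Throws: `RobotError.disconnected` if the connection drops.
func drive(for seconds: Double, direction: Direction) throws -> Bool {
    return true // stub body; the markup above is the point
}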

So much in Python is very similar to Swift, but with a slight twist to it. Closures? Lambdas are in there. Mapping? That’s there too. Partial application? Seems to be. Most times I reach for a tool from my existing proficiencies, I can find a Python equivalent, such as list comprehensions, which are basically mapping across sequences and collections.

I’m sorely missing my value types. One of the first things I did when trying to work through some tutorials was to try to create a skeleton dictionary rather than type out full dictionaries for each instance. I quickly learned Python uses reference types:

# the original dict was more complicated
studentDict = {"name" : "", "tests" : []} 

joe = studentDict # create joe
joe["name"] = "joe"
bob = studentDict # create bob
bob["name"] = "bob"

# reference type
print(joe) # {'tests': [], 'name': 'bob'}
print(bob) # {'tests': [], 'name': 'bob'}

Oops.
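For contrast, here is roughly what my Swift instincts expected, since Dictionary there is a value type and assignment copies:

// Swift dictionaries are value types: each assignment makes an independent copy
var studentDict: [String: Any] = ["name": "", "tests": [Int]()]

var joe = studentDict // create joe (a copy)
joe["name"] = "joe"
var bob = studentDict // create bob (another copy)
bob["name"] = "bob"

print(joe["name"]!, bob["name"]!) // joe bob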

In any case, I’m still really really new to the language given my full-court-press on finishing Swift Style. As much as I wish I were writing this code in Swift, I’m glad that I have the opportunity to explore Python and hope I get to spend some time with Scala in the near future. This project is offering me a lot of valuable insights about where Swift came from and increased appreciation for the work the core Swift team put in to give us the language we have now.

Cozmo: The Unboxening

I recently reached out to Anki, creators of the Cozmo robot, to ask if I could explore the device from a developer’s perspective. Shipping with a Python SDK, Cozmo offers some surprisingly sophisticated image processing and recognition features, analogous to Apple’s Core Image.

Before jumping into the programming side of things, let me acknowledge that I am primarily an Apple developer. Therefore I must categorically kick off by evaluating package design.

I didn’t get to start unboxing last night. When my unit arrived during a cold snap, the package window was all fogged up. I opened the top, added a desiccant unit, and let the package dry out and warm up overnight.


This morning, I finally could explore. As you’d expect with high end consumer goods (Cozmo retails for $180), the box used satisfyingly thick cardboard, was easy to open, and presented the product nicely while hiding the user manual, power cord, and accessories.

Unexpectedly, my favorite part of the entire boxing system was Cozmo’s perch. Made of quality yellow plastic, this industrial presentation feature is practically a toy in itself.


Flip it over, and instead of expected wire wraps, there’s an ingenious system to release the Cozmo unit. Pull up the yellow spacer and pinch the two white tabs. It all comes apart, allowing you to remove Cozmo, and start him charging.


Then while he charges, you can spend time putting the pieces back together and taking them apart over and over. It’s practically an extra free toy that ships with the robot. Beautiful design and completely unexpected.


The full complement of parts includes the Cozmo robot, a USB charging dock with a separate wall adapter, a trio of play blocks (“Interactive Power Cubes”), and a welcome packet.


The instructions say to place him on an open table in a well lit room, with room to move around. Charging from empty to full is specified at 10-12 minutes, with a rated play time of 1-2 hours. (I wouldn’t be surprised if that number drops once you start putting extra demands on his processors.)

That 10-12 minutes subjectively lasts approximately 3-4 months after you first finish unboxing. Child and I got into a heated debate as to whether we’d name him after Cosimo de’ Medici, founder of the Medici political dynasty, or Cosmog, nebula Pokemon and opener of “Ultra Wormholes”.

I don’t want to spoil the instruction booklet for you so let me just say, the writeup is adorable, clever, and simple. The consumer warnings in particular made me laugh out loud. It’s a great example of technical writing and communication, focusing on simplicity and clarity. It’s really well done.

You must install a separate Cozmo application on an iOS or Android device. It just shipped a 1.1.0 update, so make sure you’re running the latest version. It helps if you watch this video before trying to set up.

Open Settings and connect to the Cozmo WiFi network. Lift Cozmo’s front arm to display the password, and then use Settings to log in, making sure to type everything using exact upper casing. The password is a random mix of upper-case letters and numbers, and the iPhone’s keypad doesn’t remember casing when moving between number input and text input.

Anki recommends typing the password into your Notes app, and I endorse this suggestion, especially given how many times I tried to get it to connect. That’s because at first I didn’t realize Cloak VPN was trying to secure the Cozmo WiFi connection. It really, really helps if you set it up to trust the Cozmo network. This one detail put a huge crimp into my set-up, causing immeasurable pain and frustration. I finally found this support write-up, which mentions VPN issues at the very bottom.

I haven’t spent much time playing with Cozmo yet but speaking as an iOS dev, there are quite a few things Anki could do to tighten up their app. There’s great content and some terrific games but the app reads as “cross platform with compromise”. (It seems to have Unity under its cover.)

To give just a couple of examples, the cursor controls in a custom text field don’t follow iOS standards. When you use cursor arrows during typing, the active cursor position does not update. That’s iOS 101.

In terms of user interface flow, it’s missing iOS’s inherent “deference to the user”. For example, there’s no “try again” button when a connection attempt fails; instead, you have to go back through the help screen over and over again. (Which I did, over and over and over, through an hour or so of set-up until I found that VPN advice.) When you’re using games and other features, the ability to quickly switch tasks is somewhat limited.

The app would benefit from a HIG/GUI once-over for usability but that’s really the only weak spot in the big package that I’ve encountered.

As for Cozmo himself, he’s a delight. As part of set-up, he learned my face and that of my daughter. Who cannot love a robot that wakes up from charging, sees your face, recognizes you, greets you by name and then wiggles with happiness?!? It’s phenomenally adorable.

Of course, my interest lies primarily in the SDK, and that exploration will have to wait for another write-up. For now, a summary to date:

  • Adorable robot with amazing humanizing affect display.
  • Top notch Apple-worthy packaging.
  • Great starter games to inspire development possibilities.
  • There’s an app.

Cozmo is right now exploring the floor of my office, making random offhand comments in robotese about what he’s finding and generally having a ball. I think I’ll stop writing for a short while and join him in having fun.