Archive for the ‘Misc’ Category

Why we develop

From my inbox:

I have been steadily using folderol in order to help me define which folders are important, as well as which ones are not.

Folderol has also been handy to me in developing subfolders within folders and having those subfolders be different colors, which helps me find the information inside of them quicker.

Thank you for developing such a great product. If the folderol app is any indication of other products that you might have developed or that are in the development stage, I look forward to seeing what other applications you have.

Respectfully,

Demetrius Moyston

Folderol at the Mac App Store

My first Cozmo game: Hide and Seek

In my continuing exploration of the Cozmo SDK, I’ve written my first game. “Hide and Seek” is not particularly complex or challenging. I’m still pretty new to both Python and the SDK tools, and my wrapper code is still fairly basic.

“Hide and Seek” goes like this. You place a light cube near Cozmo, out of his direct line of sight. He searches until he finds it, picks it up, moves it to a new location, and drops it. He then turns around, with his back to the cube, and says “Let’s play again.”

Although this is a first take, I think my wrapper approach is pretty simple and readable. I’m trying to emphasize learning programming concepts in Python, which means that Cozmo access needs to be encapsulated and procedural:

# run, cozmo, run
def actions(cozmoLink):
    '''Specify actions for cozmo to run.'''
    
    # Fetch robot
    coz = Cozmo.robot(cozmoLink)
    coz.say("Hide and seek!")

    # Look for a cube
    if not coz.findACube():
        coz.say("No cube")
        return

    # Found one!
    coz.say("I found a cube")
    coz.takeCube() # Pick it up
    coz.say("I have the cube")
    coz.drive(time = 3, direction = Direction.forward) # Drive
    coz.dropCube() # Place the cube
    coz.turn(degrees = 180) # Turn around
    coz.say("Let's play again")

Cozmo.startUp(actions)

The most complex concept here is searching for a block (coz.findACube()) and acting on a Boolean return value. I’m not completely in love with how I established this notion. Maybe something more along the lines of “ask to look and find” instead of just “find” would better indicate conditionality.
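
Something like this, perhaps, where the name itself signals that the call may fail (askToFindACube is a hypothetical rename, not something my wrapper implements yet):

# hypothetical rename of the current findACube
if coz.askToFindACube():
    coz.say("I found a cube")
else:
    coz.say("No cube")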

Under the covers, the Cozmo class now stores both a cube of interest and a list of cubes within view. I’m not sure I’m going to stay with this specific design, but this new feature is what allows you to omit mentioning the cube instance in the latter half of the game. I probably need to step back and refactor toward a Cozmo “world” class that describes what he sees and better mirrors that world in the direct APIs.
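
If I do evolve that world class, it might start as small as this sketch (every name here belongs to my wrapper, not the SDK):

class CozmoWorld:
    '''Track what Cozmo currently knows about his surroundings.'''
    def __init__(self):
        self.cubeOfInterest = None  # the cube that actions currently target
        self.cubesInView = []       # every cube the camera has spotted

    def noteCube(self, cube):
        '''Record a newly seen cube, adopting it if none is selected yet.'''
        if cube not in self.cubesInView:
            self.cubesInView.append(cube)
        if self.cubeOfInterest is None:
            self.cubeOfInterest = cube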

Here’s a video of the “Hide and Seek” gameplay.

My immediate goals are to encapsulate all the asynchronous and exception-handling code into very simple call styles. I want to model the world, the robot, and the interactions in a more human-based way, to support simple programming concepts: procedure, state, condition, iteration, and eventually functions.
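
Here’s the shape of what I mean; this is a minimal sketch rather than my full wrapper, though say_text and wait_for_completed are genuine SDK calls:

import cozmo

class Cozmo:
    '''A teaching wrapper that hides the SDK's asynchronous call style.'''
    def __init__(self, robot):
        self.robot = robot

    def say(self, text):
        '''Speak a phrase, blocking until Cozmo finishes talking.'''
        try:
            # say_text returns an action; waiting on it makes the call synchronous
            self.robot.say_text(text).wait_for_completed()
        except Exception:
            pass  # keep SDK exceptions out of beginner-facing code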

Even though my immediate interest is teaching, at the back of my mind I eventually want to introduce some emotion programming, which I think is a perfect fit for Cozmo.

I wouldn’t have to start from scratch. There’s some GPL-licensed work done in the Facemoji project. Facemoji harvests an emotion dataset, classifies the emotions, and then matches incoming video against that dataset. Wouldn’t it be great if Cozmo could react to your face beyond simple recognition, playing off happiness, sadness, etc.?

Enter the Python: Peeking at a language

Last week, I wrote about how I set up Xcode to run Python. It’s been working great. Xcode may not be everyone’s cup of tea, but I love it. Syntax highlighting, familiar keybindings, symbol completion. I couldn’t be happier. A lot of people pushed me to use the PyCharm Community Edition, but while I’ve installed it and tried it a few times, I keep going back to Xcode. Warts and all.

I haven’t logged many hours in Python but it’s been a fascinating language experience. Let me go all metaphor on you. Way back in the ’90s there was this show called “Sliders”, about a bunch of people moving between parallel worlds. Almost everything was the same from world to world (normal humans, trees, buildings, whatever) but there were always fundamental differences in the culture and the people that reminded you that you weren’t home.

Python is the Sliders version of Swift, the one where Chris Lattner was never born. Everything is eerily familiar and nothing is quite right. Where are my value types? My generics? My type extensions? Let me throw out another metaphor, one that will probably resonate with even fewer people: Python is the language version of the Nethack Rogue Level, where you enter “what seems to be an older, more primitive world.” It’s all familiar. Nothing is exactly the same.

This morning, I attempted to extend a type. I’m working with Anki’s Cozmo robot SDK, which is written for Python 3.5.1 or later. I’m trying to reconfigure many of the basic calls into more appropriate chunks suitable for teaching kids some programming basics.

Instead of focusing on asynchronous callbacks and exceptions, I want to provide really simple blocks that extend the robot type API in a way that hides nearly all the implementation details. I’m trying to build, in a way, a Python version of Swift Playgrounds but with a real robot. (And it’s going well, but more about that in another post.)

What I found was that Python really doesn’t want you to extend types. You can subclass. You can compose. But so far, I haven’t found a way to graft an extension onto an existing type. When I asked around, the Python gurus on freenode recommended I stop worrying about polluting the global namespace and embrace freestanding functions as needed.
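
A minimal illustration with a toy type (nothing Cozmo-specific here):

class Robot:
    def say(self, text):
        print(text)

# Subclassing creates a new type; it doesn't extend Robot itself
class ChattyRobot(Robot):
    def greet(self):
        self.say("Hello!")

# The recommended idiom: a freestanding function in the global namespace
def greet(robot):
    robot.say("Hello!")

greet(Robot())  # works on the existing type, no subclass required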

Oh, my delicate Swift sensibilities! Adding global functions and constants? Cluttering the global namespace? I find myself clinging to Swift conventions. I create enumerations and type my arguments:

from enum import IntEnum
import cozmo

class Direction(IntEnum):
    '''Permitted driving directions.'''
    forward = 1
    backward = -1

def drive(robot: cozmo.robot.Robot,
          direction: Direction = Direction.forward): ...

The Cozmo SDK defines its constants like this:

LEFT = 1
RIGHT = 2
TOP = 4
BOTTOM = 8

I don’t think I’m in Swift-land anymore.

A lot of the things I like most about Python appear to be fairly new, like the ability to type arguments. I’m assured by some Pythonistas that this is almost entirely syntactic sugar: there appears to be no type checking, inference, or casting applied to calls.
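
A quick test bears this out; the annotations are recorded on the function but never enforced at runtime:

def double(value: int) -> int:
    return value * 2

print(double("abc"))           # "abcabc": the int annotation is ignored
print(double.__annotations__)  # {'value': <class 'int'>, 'return': <class 'int'>}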

I thought I would really hate the indentation-based scoping, but I don’t. It’s easy to use (start a scope with a colon, indent four spaces for that scope), it reads well, and non-braced scoping ended up being a complete non-issue for me. I mildly admire its clean look.

I’m less excited by Python’s take on structured documentation. The standard is outlined in PEP-257. Unlike Apple’s Swift Documentation Markup, Python’s markup doesn’t seem to support specific in-line tool use beyond document generation. I’m sensitive to how much better Swift’s structured system details parameters, error conditions, return types, and descriptions; how it scales from types to functions, methods, and individual instances; and how it provides Xcode integration.

So much in Python is very similar to Swift but with a slight twist to it. Closures? Lambdas are in there. Mapping? That’s there too. Partial application? Seems to be. Most times I reach for a tool from my existing proficiencies, I can find a Python equivalent, such as list comprehensions, which are basically mapping over sequences and collections.
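
For example, here’s how a few Swift staples translate, using nothing beyond the standard library:

from functools import partial

doubled = list(map(lambda x: x * 2, [1, 2, 3]))  # closure plus mapping: [2, 4, 6]
add5 = partial(lambda x, y: x + y, 5)            # partial application
print(add5(10))                                  # 15
squares = [x * x for x in range(4)]              # list comprehension: [0, 1, 4, 9]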

I’m sorely missing my value types. One of the first things I did when trying to work through some tutorials was to try to create a skeleton dictionary rather than type out full dictionaries for each instance. I quickly learned Python uses reference types:

# the original dict was more complicated
studentDict = {"name" : "", "tests" : []} 

joe = studentDict # create joe
joe["name"] = "joe"
bob = studentDict # create bob
bob["name"] = "bob"

# reference type
print(joe) # {'tests': [], 'name': 'bob'}
print(bob) # {'tests': [], 'name': 'bob'}

Oops.
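
The fix, once you realize that assignment merely binds another name to the same object, is to copy the skeleton instead; copy.deepcopy duplicates the nested tests list as well:

import copy

studentDict = {"name": "", "tests": []}

joe = copy.deepcopy(studentDict)  # an independent copy
joe["name"] = "joe"
bob = copy.deepcopy(studentDict)
bob["name"] = "bob"

print(joe)  # {'name': 'joe', 'tests': []}
print(bob)  # {'name': 'bob', 'tests': []}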

In any case, I’m still really, really new to the language, given my full-court press on finishing Swift Style. As much as I wish I were writing this code in Swift, I’m glad I have the opportunity to explore Python, and I hope to spend some time with Scala in the near future. This project is offering me a lot of valuable insights into where Swift came from and an increased appreciation for the work the core Swift team put in to give us the language we have now.

Cocoapods and Copyright claims

So what do I do about this? Or more particularly about this: “Copyright (c) 2016 tangzhentao <tangzhentaoxl@sina.cn>”? I don’t mind people using sources from my repos, but I do mind them claiming copyright. I would appreciate advice and thoughts.

Update: On Jeremy Tregunna’s advice, I sent an email: “You are welcome to create Cocoapods using my repositories, but you are not welcome to claim copyright ownership, change the license (BSD attribution in this case), or otherwise misattribute my code.” I asked tangzhentao to correct the matter immediately.

Update: Tangzhentao responds: “I just saw this problem today in Github and then I went to check my email. Thank everybody to piont out this mistake, especially Erica Sadun. Now I have corrected this mistake. But I don’t know if I really correct this mistake. If not, please remind me, thanks.” There are still no changes to the authorship or copyrights; I have asked him/her to update within 24 hours or I will contact GitHub.

Solving Mathieu’s Phone: The mystery of disappearing gigs

The other day, Mathieu’s 16 GB phone suddenly had no space. Even after rebooting, even after reformatting (and not restoring from backup), all his spare bytes were being sucked into a black hole.

He had no songs, few apps, and a modest number of photos, yet under a gigabyte of space available, leaving him unable to compile, load, and test his apps.


Each time he deleted one of his apps, the space would mysteriously fill back up within a few minutes, adding to the ever-increasing “Other” bar in iTunes.


This delete-then-lose-space behavior made me think that iCloud was trying to store files locally on his phone to reduce cloud access. I suggested that he disable iCloud and sync just the bare essentials, like contacts, calendars, and notes. (Mathieu has a paid 300 GB iCloud plan.) Sure enough, once he logged out and rebooted, over 7 GB of space was freed up and he was able to use his phone again.

I’m not super-familiar with iCloud, so if anyone can further explain how this works, and how to set up the phone to keep it from glomming space, I’d sure appreciate being able to pass that along. Thanks!

Announcing tmdiff

For all I know this already exists and I just wasn’t able to google it up. Assuming it doesn’t, tmdiff lets you perform a command-line diff of a text file against a Time Machine version.

Repo: https://github.com/erica/tmdiff

Usage:

Usage: tmdiff (--list)
       tmdiff [offset: 1] path

The --list option lists the dates of the backups in reverse chronological order. To diff, supply a path, e.g.

tmdiff Style600-Control\ Flow.md

It defaults to an offset of 1, the backup just before the most recent one. To use the most recent backup, pass 0 instead; larger numbers move further back in time:

tmdiff 0 Style600-Control\ Flow.md
tmdiff 3 Style600-Control\ Flow.md
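
For the curious, the core trick is small enough to sketch in a few lines of Python. This isn’t the repo’s actual source; it assumes tmutil’s listbackups output sorts by date and that the boot volume is named “Macintosh HD”:

import subprocess
from pathlib import Path

def backups():
    '''Return Time Machine backup paths, newest first (via tmutil).'''
    out = subprocess.run(["tmutil", "listbackups"],
                         capture_output=True, text=True, check=True).stdout
    return sorted(filter(None, out.splitlines()), reverse=True)

def tm_diff(path, offset=1):
    '''Diff a local file against its copy in the offset-th newest backup.'''
    local = Path(path).resolve()
    root = Path(backups()[offset])
    # Each backup mirrors the volume layout beneath "<date>/<volume name>"
    snapshot = root / "Macintosh HD" / local.relative_to("/")
    subprocess.run(["diff", str(snapshot), str(local)])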

I hope this is handy for someone out there on the opposite side of the Intertube, especially since version control is baked into stuff like TextEdit. Do let me know if you use it.

Update: See also github.com/erica/tmls and github.com/erica/tmcp. The former runs ls, complete with arguments. The latter performs a nondestructive copy with the Time Machine date appended.

When did Ikea ditch the sunshine, rainbows, and unicorns?

The new Ikea catalog arrived yesterday. Is it me or have they turned over their design to some crazed Swedish goth intern? My new catalog feels more Hitchcock and “Vogue Editorial” than “Affordable purchases for people who wish they could fix their out-of-control lives.”

Ikea’s gone from cute girls in a colorful apartment (top, 2015) to psychotic butcher knives that think they’re actually vegetables (check out that shadow) and this recurring weird backdrop thing, which makes me think they couldn’t afford an editor to crop the photos properly (bottom, 2016).

Suddenly, they’ve transitioned from simple product images inspiring you to simplify and organize your life to a kind of nightmare clutter scenario where all reason has fled and you apparently must buy every product available from the company and store them in the open without drawers, cupboard doors, or any break in sanity.


Look at that poor woman standing at that kitchen island. Her entire body communicates the tension of barely having a spare inch of counter space, banging her knees against all the junk on the two shelves, the shame of putting her dishes out for public viewing. Inside, she’s screaming “I will never get my life under control and it’s all IKEA’s fault! For just $499!”

(By the way, I love the LED light at the middle of the right page of the 2015 catalog. Mine is black, not red, and it’s perfect between my two computer monitors. Folds up out of the way when not in use.)


Apparently 2016 is the year of dark spaces, drawn blinds, and Carmen cosplay. You can pretend to die of consumption in the gloomy shadows of your living room, while dressed in red and practicing ballroom in the (perhaps) 2 square meters of space between couches.

And can you think of anything scarier than your sofa actually being your home? Last year: a beautiful, open-plan living room, with a family happily getting work done on the laptop and reading to a kid. They seem happy, their plants seem happy, and the lightness and brightness no doubt make them feel free and open and relaxed. Compare that to this year.

No, Ikea, a sofa is not the home. And who are all those strange people who wandered into this poor woman’s life just to stare at and harass her?

Here’s Elsa. Elsa thought she’d have a lovely, relaxing time putting up her feet before picking up the kids and stopping by an organic, locally sourced market for takeout to eat while perched on a variety of ottomans and sleeper couches.

Who would ever have expected an entire gang from Twitter to take up residence on the other side of her monster sofa, laughing at her, mocking her, and critiquing her lounging style? That gang of four sure thinks Elsa is a hoot. And all the while, creepy Helmut from down the road just stares at Elsa with unrequited longing. I think perhaps he’s humming ska songs from the 1980s to her.

Poor Elsa. This is what comes of living in the middle of a photographic studio, without doors to keep out strangers, no storage for clothing, a ragtag group of floating sofas for the young ones to sleep upon, and three mysterious remote controls to remind her of a time when she had a real house to call home.

Oh, Ikea. It’s time to say goodbye to 2015, with its misty bright hopes for a world of knotty pine. 2016 has arrived with its dark, bleak, dystopian furniture and a bookshelf that looks like an insurance liability court case waiting to happen.


(As a side note, I had no idea that sleeper sofas crept out of their homes while we were at work to embrace that secret 24-hour life. It must get crowded at the bowling alley and at the local microbrewery when affordable furniture sits around, drinking lager, and sharing the stories you thought were kept secret.)

Nostalgia Tuesday: By request, my 2012 Siri Post

Well, if anything does happen on Monday, we can play “How badly did she get it wrong”, right? And to add some icing, here’s a what-if post about Siri controlling your Apple TV and a proof-of-concept Siri-style dictation used in-app.

(There’s a comment on the video that I particularly love: “Scammer watch the mouse across the screen at the end.” I hate to destroy the tinfoil, but I was feeding the Apple TV output through EyeTV and recording the output on my Mac. Bless that person’s conspiratorial heart.)

How 3rd Party apps might integrate with Siri

Third-party integration into Siri remains at the top of many of our TUAW wish lists. Imagine being able to say “Play something from Queen on Spotify” or even “I want to hear a local police scanner,” and Siri replying, “OK, you have two apps that have local police scanners. Do you want ScannerPro or Wunder Radio?”

So why doesn’t Siri do that?

Well, first of all, there are no third-party APIs. Second, it’s a challenging problem to implement. And third, it could open Siri to a lot of potential exploitation (think of an app that opens every time you say “Wake me up tomorrow at 7:00 AM” instead of deferring to the built-in timer).

That’s why we sat down and brainstormed how Apple might accomplish all of this safely using technologies already in use. What follows is our thought experiment on how Apple could add these APIs to the iOS ecosystem and really allow Siri to explode with possibility.

Ace Object Schema. For anyone who thinks I just sneezed while typing that, please let me explain. Ace objects are the assistant command requests used by the underlying iOS frameworks to represent user utterances and their semantic meanings. They offer a context for describing what users have said and what the OS needs to do in response.

The APIs for these are private, but they seem to consist of property dictionaries, similar to the property lists used throughout Apple’s OS X and iOS systems. It wouldn’t be hard to declare support for Ace Object commands in an application’s Info.plist property list, just as developers now specify what kinds of file types they open and what kinds of URL addresses they respond to.
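
To make the thought experiment concrete, here’s how such a declaration might look. Every key in this snippet is invented for illustration; nothing here is actual Apple API:

<key>AceCommands</key>   <!-- hypothetical key, for illustration only -->
<array>
    <dict>
        <key>CommandIdentifier</key>
        <string>com.spotify.client.play-request</string>
        <key>CommandTemplates</key>
        <array>
            <string>Play %@ on Spotify</string>
            <string>I want to hear %@ on Spotify</string>
        </array>
    </dict>
</array>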

Practicality. If you think of Siri support as a kind of extended URL scheme with a larger vocabulary and some grammatical elements, developers could tie into standard command structures (with full strings-file localizations, of course, for international deployment).

Leaving the request style of these commands to Apple would limit the kinds of requests initially rolled out to devs, but it would maintain the highly flexible way Siri users can communicate with the technology.

There’s no reason for devs to have to think of a hundred ways to say “Please play” and “I want to hear”. Let Apple handle that — just as it handled the initial multitasking rollout with a limited task set — and let devs tie onto it, with the understanding that these items will grow over time and that devs could eventually supply specific localized phonemes that are critical to their tasks.

Handling. Each kind of command would be delineated by reverse domain notation, e.g. com.spotify.client.play-request. When matched to a user utterance, iOS could then launch the app and include the Ace dictionary as a standard payload. Developers are already well acquainted with responding to external launches through local and remote notifications, URL requests, “Open file in” events, and more. Building onto these would let Siri activations use the same APIs and approaches that devs already handle.

Security. I’d imagine that Apple should treat Siri enhancement requests from apps the same way it currently treats in-app purchases. Developers would submit individual requests for each identified command (again, e.g. com.spotify.client.play-request) along with a description of the feature and the Siri specifications (XML or plist, and so forth). The commands could then be tested directly by review team members or automated for compliance checks.

In-App Use. What all of this adds up to is an iterative way to grow third-party involvement in the standard Siri voice assistant using current technologies. But that’s not the end of the story. The suggestions you just read through leave a big hole in the Siri/Dictation story: in-app use of the technology.

For that, hopefully Apple will allow more flexible tie-ins to dictation features outside of the standard keyboard, with app-specific parsing of any results. Imagine a button with the Siri microphone that developers could add directly, no keyboard involved.

I presented a simple dictation-only demonstration of those possibilities late last year. To do so, I had to hack my way into the commands that started and stopped dictation. It would be incredibly easy for Apple to expand that kind of interaction option so that spoken in-app commands were not limited to text-field and text-view entry but could be used in place of touch-driven interaction as well.