It should not come as a shock to you that Apple TV apps are not touchable.
Sure, you can use the remote and slide your finger around, but there’s no compelling real-world correspondence between the blank touch pad and the large screen it relates to.
When you design, you must design for relative interactions. Think “directions” instead of “points”. For example, “this button is to the right of that button, so shift the focus to the right” or “I want my toy to move up, so I’ll swipe in that direction”.
This is where gesture recognizers show great value and direct touch handling (such as touchesEnded) falls short.
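To make the "directions, not points" idea concrete, here is a minimal sketch of direction-based movement in Swift. The names (`MoveDirection`, `GridPosition`, `move`) are my own illustration, not Apple API; on tvOS you'd feed this from four `UISwipeGestureRecognizer`s (one per `direction`) rather than from raw touch coordinates, which is exactly why recognizers fit the Siri Remote better than `touchesEnded` does.

```swift
// Hypothetical direction type — in a real app, each case would be
// triggered by a UISwipeGestureRecognizer configured for that direction.
enum MoveDirection {
    case up, down, left, right

    // Grid delta for each swipe direction (positive y is up).
    var delta: (dx: Int, dy: Int) {
        switch self {
        case .up:    return (0, 1)
        case .down:  return (0, -1)
        case .left:  return (-1, 0)
        case .right: return (1, 0)
        }
    }
}

struct GridPosition {
    var x: Int
    var y: Int

    // Apply a swipe: step one cell in the swiped direction.
    // Note there's no absolute touch point anywhere — only a direction.
    mutating func move(_ direction: MoveDirection) {
        x += direction.delta.dx
        y += direction.delta.dy
    }
}

var player = GridPosition(x: 2, y: 2)
player.move(.right)   // (3, 2)
player.move(.up)      // (3, 3)
print(player.x, player.y)
```

The point of the sketch: nothing in the model needs to know where on the blank touch pad the finger was, only which way it moved.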
Today, I decided to push the limit and see how much I could convey using just the blank touch region on the remote. The video shows my interactions. It’s fine for experimental messing around but kind of awful for any practical purpose.
It helped reinforce that creating Frogger-like games (such as the one shown at the event) is going to be an easier task than anything that involves indirect input like drawing or setting goal points. If you’re kicking around ideas of “what might be a good game to create”, this is going to be a huge design point to consider.
Think more GameBoy-style games, side scrollers, and the like than Drawing with Friends.
I’m also thinking this might be a good time to bring back the Graffiti-style text input code of yesteryear (not to mention a good time to file radars for Siri text input).
There’s some good discussion about UI focus on buttons, tables, etc. in the docs. Although my primary interest is SpriteKit, I think it’s time to kick the UIKit tires a little.