Last post I talked a bit about my new app Dandelion Breeze at a high level. I’m going to spend this post going over what it has been like to develop for the Apple Watch. This is an incredibly simple app, but it’s been informative enough to help me plan for future Apple Watch apps. First off, pretty screenshots.
Coming from iOS and UIKit, the watch felt familiar, but it is different in many ways. For instance, rather than UIImageView and UIView, the watch’s counterparts are WKInterfaceImage and WKInterfaceGroup. The first adjustment I had to make was having no way to layer views on top of each other: two images can’t occupy the same space. You can use the background image of a WKInterfaceGroup to simulate layering, but you won’t have anywhere near the control you might be used to with UIImageView and UIKit. In the screenshot above, the gradient is the background image of the group and the dandelion is a WKInterfaceImage within that group. A great (quick, dirty, and a bit crazy) use of groups and images is this Equalizer.
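For concreteness, here’s a minimal sketch of that layering trick. The outlet names (backgroundGroup, dandelionImage) and asset names are made up for illustration; the outlets would be wired in the storyboard, with the image placed inside the group:

```objc
// In a WKInterfaceController subclass (properties declared in the class extension).
@property (weak, nonatomic) IBOutlet WKInterfaceGroup *backgroundGroup;
@property (weak, nonatomic) IBOutlet WKInterfaceImage *dandelionImage;

- (void)awakeWithContext:(id)context {
    [super awakeWithContext:context];
    // The gradient sits "behind" as the group's background image…
    [self.backgroundGroup setBackgroundImageNamed:@"gradient"];
    // …and the dandelion is a WKInterfaceImage placed inside the group.
    [self.dandelionImage setImageNamed:@"dandelion"];
}
```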
The next thing I was not really ready for was having no [UIView animate] methods. You can use [InstanceOf_WKInterfaceGroup startAnimating] to animate the currently set image sequence, but you can’t, say, animate the size of the group, or of a button within it. WatchKit is separate from UIKit; though it has similarities, it’s its own monster. I didn’t realize how often I was calling [UIView animateWithDuration…] in my code until I no longer had access to it. So often you want subtle animations to give your UI life, but on the watch they’ll need to be image sequences.
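A sketch of the image-sequence approach, assuming the asset catalog holds frames named pulse0 through pulse29 and an outlet called pulseImage (both names are mine):

```objc
// WatchKit treats frames named "pulse0"…"pulse29" as the animated sequence "pulse".
[self.pulseImage setImageNamed:@"pulse"];
[self.pulseImage startAnimatingWithImagesInRange:NSMakeRange(0, 30)
                                        duration:1.0
                                     repeatCount:0]; // 0 means repeat forever
```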
This segues into the last item I wanted to discuss: the device image cache. Within your watch app you have two main places to put your content, the watch app bundle and the device image cache. If you know the assets at build time, say a series of images in an asset catalog, then you package those into the app bundle. At the time of this writing you have 50 MB of storage for all your stuff. That’s not much, but considering you’re not building large experiences for the watch it should be enough for now; plus, limitations can be fun. The purpose of the cache is for images that you don’t know at build time. Remember, there are no [UIView animate] methods, so what if we want to animate a graph of data or visualize something else with some motion? This is where the cache comes in. The image cache gives us 5 MB of storage, and we can put our generated image sequences there as key-value pairs (a name string keyed to the image). A small project demonstrating how to use the cache and animate images from it can be found on my GitHub page.
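Adding a generated image to the cache might look like the sketch below; renderGraphImage and the name "graph-frame0" are hypothetical stand-ins for your own drawing code and naming scheme:

```objc
WKInterfaceDevice *device = [WKInterfaceDevice currentDevice];
UIImage *frame = [self renderGraphImage]; // your own drawing code
// addCachedImage:name: returns NO when the 5 MB cache is full.
if (![device addCachedImage:frame name:@"graph-frame0"]) {
    [device removeAllCachedImages]; // crude eviction; then retry once
    [device addCachedImage:frame name:@"graph-frame0"];
}
// Later, reference the cached image by name, just like a bundled one.
[self.graphImage setImageNamed:@"graph-frame0"];
```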
5 MB is not a lot of storage and it goes quickly. More importantly, you will need to wait for the Bluetooth transfer to finish. In the code sample, after the call to [device addCachedImage], the image is “available”, but it needs to wait until the transfer is finished before it will display. If you try to display an image that hasn’t yet transferred, you will see a white spinning indicator letting you know it’s not ready. The transfer time can be slow, so don’t expect this to be viable for sequences that are a megabyte or more in size. Small, procedurally generated data, transferred and accessed by key, works pretty well.
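One way to keep an eye on that 5 MB budget is the cachedImages dictionary on WKInterfaceDevice, which maps each cached name to its size in bytes:

```objc
NSDictionary *cached = [WKInterfaceDevice currentDevice].cachedImages;
// Sum the byte counts of every cached image via KVC collection operators.
NSNumber *totalBytes = [cached.allValues valueForKeyPath:@"@sum.self"];
NSLog(@"Cache usage: %@ bytes across %lu images",
      totalBytes, (unsigned long)cached.count);
```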