Damian Mehers' Blog Evernote and Wearable devices. All opinions my own.


Android 5.0 Media Browser APIs

When I read the release notes for the Android 5.0 APIs I was delighted to see this:

Android 5.0 introduces the ability for apps to browse the media content library of another app, through the new android.media.browse API.

I set out to try to browse the media in a variety of apps I had installed on my phone.

First I listed the apps that supported the MediaBrowserService:

  private void discoverBrowseableMediaApps(Context context) {
    PackageManager packageManager = context.getPackageManager();
    Intent intent = new Intent(MediaBrowserService.SERVICE_INTERFACE);
    List<ResolveInfo> services = packageManager.queryIntentServices(intent, 0);
    for (ResolveInfo resolveInfo : services) {
      if (resolveInfo.serviceInfo != null && resolveInfo.serviceInfo.applicationInfo != null) {
        ApplicationInfo applicationInfo = resolveInfo.serviceInfo.applicationInfo;
        String label = (String) packageManager.getApplicationLabel(applicationInfo);
        Drawable icon = packageManager.getApplicationIcon(applicationInfo);
        String packageName = resolveInfo.serviceInfo.packageName;
        String className = resolveInfo.serviceInfo.name;
        publishProgress(new AudioApp(label, packageName, className, icon));
      }
    }
  }

The publishProgress method updated the UI and soon I had a list of apps that supported the MediaBrowserService:

Apps that support MediaBrowserService

Next, I wanted to browse the media they exposed using the MediaBrowser classes:

public class BrowseAppMediaActivity extends ListActivity {
  private static final String TAG = "BrowseAppMediaActivity";
  private final MediaBrowserConnectionListener mMediaBrowserListener =
      new MediaBrowserConnectionListener();
  private MediaBrowser mMediaBrowser;

  public void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);
    Log.d(TAG, "Connecting to " + packageName + " / " + className);
    ComponentName componentName = new ComponentName(packageName, className);

    Log.d(TAG, "Creating media browser …");
    mMediaBrowser = new MediaBrowser(this, componentName, mMediaBrowserListener, null);

    Log.d(TAG, "Connecting …");
    mMediaBrowser.connect();
  }

  private final class MediaBrowserConnectionListener extends MediaBrowser.ConnectionCallback {
    @Override
    public void onConnected() {
      Log.d(TAG, "onConnected");
      String root = mMediaBrowser.getRoot();
      Log.d(TAG, "Have root: " + root);
    }

    @Override
    public void onConnectionSuspended() {
      Log.d(TAG, "onConnectionSuspended");
    }

    @Override
    public void onConnectionFailed() {
      Log.d(TAG, "onConnectionFailed");
    }
  }
}

I’ve cut some code, but assume that the packageName and className are as they were when queried above. No matter what I did, and which app I queried, the onConnectionFailed method was invoked.

Here is the log from when I tried to query the Google Music App:

29195-29195/testapp D/BrowseAppMediaActivity﹕ Connecting to com.google.android.music / com.google.android.music.browse.MediaBrowserService
29195-29195/testapp D/BrowseAppMediaActivity﹕ Creating media browser …
29195-29195/testapp D/BrowseAppMediaActivity﹕ Connecting …
16030-16030/? I/MusicPlaybackService﹕ onStartCommand null / null
16030-16030/? D/MediaBrowserService﹕ Bound to music playback service
16030-16030/? D/MediaBrowserService﹕ onGetRoot fortestapp
16030-16030/? E/MediaBrowserService﹕ package testapp is not signed by Google
16030-16030/? I/MediaBrowserService﹕ No root for client testapp from service android.service.media.MediaBrowserService$ServiceBinder$1
724-819/? I/ActivityManager﹕ Displayed testapp/.BrowseAppMediaActivity: +185ms
29195-29195/testapp E/MediaBrowser﹕ onConnectFailed for ComponentInfo{com.google.android.music/com.google.android.music.browse.MediaBrowserService}
29195-29195/testapp D/BrowseAppMediaActivity﹕ onConnectionFailed

Notice the message about my app not being signed by Google on line 7?

I’m assuming that only authorized apps are allowed to browse Google’s music app, such as Google apps supporting Android Wear and Android Auto, but not arbitrary third-party apps. Indeed, the documentation for people implementing MediaBrowserService.onGetRoot indicates that:

The implementation should verify that the client package has permission to access browse media information before returning the root id; it should return null if the client is not allowed to access this information.
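Concretely, a guarded onGetRoot implementation on the service side looks something like this (a sketch; the validation check is illustrative, not Google's actual logic):

```java
// Sketch of a MediaBrowserService that only exposes its tree to known callers.
@Override
public BrowserRoot onGetRoot(String clientPackageName, int clientUid, Bundle rootHints) {
  // isAllowedToBrowse is hypothetical: e.g. check the caller's package signature
  if (!isAllowedToBrowse(clientPackageName, clientUid)) {
    return null; // the client's ConnectionCallback gets onConnectionFailed
  }
  return new BrowserRoot("root", null);
}
```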

This makes sense, but it is disappointing. Just as users can grant specific apps access to notifications, it would be nice if they could also grant specific apps the right to browse other apps’ media.

Please let me know if you discover I am wrong!

Filed under: Android No Comments

Using Android Wear to control Google Cardboard Unity VR

Using a VR headset, even one as simple as Google Cardboard, can be mind-blowing.  Nevertheless, it is the little things that can be disconcerting: for example, looking down and seeing you have no arms, despite the fact that they still very much feel as though they exist.

I’m convinced that VR experiences are going to transform not just games, but interaction with computers in general, and I’ve been experimenting with some ideas I have about how to create truly useful VR experiences.

As I was working to implement one of my ideas, it occurred to me that I might be able to use the orientation sensors in the Android Wear device I was wearing.  Why not use them as input into the VR experience I was creating?  What if I could bring part of my body from the real world into the VR world?  How about an arm?

I decided to try to find out, and this was the answer:

The experience is nowhere near good enough for games.  But I don’t care about games.  I want to create genuinely useful VR experiences for interacting with computers in general, and I think this is good enough.  I can point to objects, and have them light up.  I can wear smart watches on both wrists (because I really am that cool) and have two arms available in the VR world. 

By tapping and swiping on the wearable screens I can activate in-world functionality, without being taken out of it.  It sure beats sliding a magnet on the side of my face, because it is my arm I am seeing moving in the virtual world.

In the rest of this article I’m going to describe some of the technical challenges behind implementing this, how I overcame them, and some of the resources I used along the way.

The tools

This is part of my workspace: Android Studio on the left, Unity on the top-right and MonoDevelop on the bottom-right:

my workspace

I had many reference browser windows open on other screens (obviously), and creating this solution required me to be very comfortable with Android, Java and C#.  I’m relatively new to Unity.

Creating a Unity Java Plugin by overriding the Google Cardboard Plugin

The Unity Android Plugin documentation describes how you can create plugins by extending the UnityPlayerActivity Java class, and I experimented with this a little.  I created an Android Library using Android Studio, and implemented my own UnityPlayerActivity derived class.

After a little hassle, I discovered that Unity now supports the “aar” files generated when compiling libraries in Android Studio, although I found the documentation a little out of date on the matter in places.  It was simply a question of copying my generated “aar” file into Unity under Assets|Plugins|Android.



What I discovered when it came to a Google Cardboard Unity project, though, is that Google had got there first.  They had created their own UnityPlayerActivity called GoogleUnityActivity.  What I needed to do was override Google’s override:
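The screenshot of the class is missing here, but in outline the override looked something like this (a sketch; names other than GoogleUnityActivity are illustrative):

```java
// An activity extending Google's Cardboard activity, which itself
// extends UnityPlayerActivity.
import android.os.Bundle;
import android.util.Log;

import com.google.unity.GoogleUnityActivity;

public class WearArmUnityActivity extends GoogleUnityActivity {
  private static final String TAG = "WearArmUnityActivity";

  @Override
  protected void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);
    Log.d(TAG, "Cooey"); // confirms the override is actually running
  }
}
```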


I included Google’s unity classes as dependencies in my library project:


Once I’d copied the aar file into the Unity Android Plugins folder and ran the test app, I was delighted to see my activity say “Cooey” in the log.


Receiving the watch’s orientation on the phone

The next step was to receive Android Wear messages, containing orientation readings, from the watch.

I recreated my project, this time including support for Android Wear:


I made the Unity activity I’d created do a little more than say “Cooey”. 

First I used the Capabilities mechanism to tell other Android Wear devices that this device (the phone) was interested in arm orientation messages:
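The screenshot is gone, but the standard way to advertise a capability is a wear.xml resource in the phone app; the capability name below is illustrative:

```xml
<!-- res/values/wear.xml in the phone app: advertise the capability that
     watch nodes will search for. "arm_orientation_sink" is a made-up name. -->
<resources>
    <string-array name="android_wear_capabilities">
        <item>arm_orientation_sink</item>
    </string-array>
</resources>
```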


… and I set it up to receive Android Wear messages and pass them over to Unity using UnitySendMessage:
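The code screenshot is missing; the shape of it, assuming the activity implements MessageApi.MessageListener and registers itself via Wearable.MessageApi.addListener once its GoogleApiClient connects (the message path, GameObject name and method name here are illustrative):

```java
// Sketch: receive a Wear message in the UnityPlayerActivity subclass
// and hand it to a Unity GameObject.
@Override
public void onMessageReceived(MessageEvent messageEvent) {
  if ("/arm_orientation".equals(messageEvent.getPath())) {
    String angles = new String(messageEvent.getData()); // e.g. "12.5,-3.0,90.0"
    // Calls OnOrientationChanged(string) on a script attached to "Wrist"
    UnityPlayer.UnitySendMessage("Wrist", "OnOrientationChanged", angles);
  }
}
```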


Sending the watch’s orientation to the phone

This was simply a question of looking out for Android Wear nodes that supported the right capability, listening for orientation sensor changes, and sending Android Wear messages to the right node.  This is the watch code:
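The watch code screenshot is gone; in outline it amounted to something like this (a sketch: names and the message path are illustrative, and capability/node discovery is assumed to have happened already):

```java
// Watch-side sketch: listen to the rotation-vector sensor and send the
// orientation to the phone node that advertised the capability.
public class ArmSensorListener implements SensorEventListener {
  private final GoogleApiClient mClient;
  private final String mPhoneNodeId; // discovered via the CapabilityApi

  public ArmSensorListener(GoogleApiClient client, String phoneNodeId) {
    mClient = client;
    mPhoneNodeId = phoneNodeId;
  }

  @Override
  public void onSensorChanged(SensorEvent event) {
    float[] rotationMatrix = new float[9];
    float[] orientation = new float[3];
    SensorManager.getRotationMatrixFromVector(rotationMatrix, event.values);
    SensorManager.getOrientation(rotationMatrix, orientation); // radians
    byte[] payload = encode(orientation); // serialization helper not shown
    Wearable.MessageApi.sendMessage(mClient, mPhoneNodeId, "/arm_orientation", payload);
  }

  @Override
  public void onAccuracyChanged(Sensor sensor, int accuracy) {}
}
```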


I did discover that some wearables don’t support the required sensors, although I imagine more modern ones will.
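However the watch reads its sensors, the angles have to be serialized for transport: the Wear message payload is a byte array, and UnitySendMessage carries strings. One simple option is a comma-separated string (illustrative, and Locale.US matters so the decimal separator is always a point):

```java
import java.util.Locale;

// Round-trips three orientation angles through a plain string payload.
final class OrientationCodec {
  private OrientationCodec() {}

  // Encode azimuth/pitch/roll (degrees) as e.g. "12.500000,-3.000000,90.000000"
  static String encode(float azimuth, float pitch, float roll) {
    return String.format(Locale.US, "%f,%f,%f", azimuth, pitch, roll);
  }

  static float[] decode(String message) {
    String[] parts = message.split(",");
    return new float[] {
      Float.parseFloat(parts[0]),
      Float.parseFloat(parts[1]),
      Float.parseFloat(parts[2])
    };
  }
}
```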

Using the watch’s orientation to animate a block on the screen

Inside Unity I created a cube, which I tweaked into a rectangle, and made it a child of the CardboardMain camera so that it moved when I moved:


See the “Script” field on the bottom right-hand side?  I have a script called “WristController” that is attached to the “wrist” (white blob).  This is where I receive orientation messages sent from the watch, via the UnityPlayerActivity derived Java class I’d created.

I started off by simply assigning the received orientation to the block’s transform.eulerAngles:
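The original script screenshot is missing; the essence was something like this (a sketch, assuming the angles arrive as a comma-separated string from the Java plugin):

```csharp
// WristController sketch: apply each new watch reading directly.
using UnityEngine;

public class WristController : MonoBehaviour {
    private Vector3 watchEulerAngles; // latest reading from the watch

    // Called via UnitySendMessage from the Android side (illustrative format;
    // mapping sensor axes onto Unity axes needs care in practice)
    void OnOrientationChanged(string message) {
        string[] parts = message.Split(',');
        watchEulerAngles = new Vector3(float.Parse(parts[0]),
                                       float.Parse(parts[1]),
                                       float.Parse(parts[2]));
    }

    void Update() {
        transform.eulerAngles = watchEulerAngles; // jumps straight to each reading
    }
}
```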


This worked, but was super-jerky.  I went searching and discovered Lerps and Slerps for smoothly moving from one rotation to another.  My updated code:
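The updated script's screenshot is also gone; the smoothed version looked roughly like this (a sketch; the smoothing factor is illustrative):

```csharp
// WristController sketch with Slerp smoothing between readings.
using UnityEngine;

public class WristController : MonoBehaviour {
    private Quaternion targetRotation = Quaternion.identity;
    public float smoothing = 5f; // higher = snappier; tune to taste

    // Called via UnitySendMessage from the Android side (illustrative format)
    void OnOrientationChanged(string message) {
        string[] parts = message.Split(',');
        targetRotation = Quaternion.Euler(float.Parse(parts[0]),
                                          float.Parse(parts[1]),
                                          float.Parse(parts[2]));
    }

    void Update() {
        // Each frame, rotate a fraction of the way toward the latest reading
        transform.rotation = Quaternion.Slerp(transform.rotation, targetRotation,
                                              smoothing * Time.deltaTime);
    }
}
```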


Animating an arm instead of a block

I was pleased to be smoothly animating a block, but my arm doesn’t quite look like that.  It is more armish.  I went looking for a model of an arm that I could import and use instead.  I found a YouTube Unity video called ADDING ARMS by Asbjørn Thirslund, in which he explains how to import and use a free arms model by eafg.

It was simply a question of sizing and positioning the arms properly as a child of the Cardboard main camera, and then adding the script I’d used to animate the block.

I also removed the right-hand arm, since it looked a little weird to have a zombie arm doing nothing.


The ArmController script you see in this screen capture has the same contents as the WristController I’d used to move the block.

Final Thoughts

There is enough of a lag to make this technique impractical for games, but not enough to make it impractical for the kinds of non-game experiences I have in mind. 

I’d also need to add calibration, since the watch may be pointing in any direction initially – if I assume it always starts straight out, that would be good enough.  Detecting where the arm is pointing shouldn’t be too hard, since the cardboard code already does gaze detection – so many possibilities, but so little time for side-projects such as this!

This has been a fun interlude on my way to creating what I hope to be a genuinely useful VR experience based around browsing books your friends have read … more on that later.


Updating the Pebble Emulator python code

I recently wanted to make some changes to the Pebble emulator, which uses the PyV8 Python-JavaScript bridge to emulate the phone environment running your phone-based JavaScript companion app.

These are some notes on how I did this, mainly so that I remember if I need to do it again, and also just in case it helps anyone else.

The first thing I did was to clone the Pebble Python PebbleKit JS implementation, used in the emulator. The original is at https://github.com/pebble/pypkjs and mine is at https://github.com/DamianMehers/pypkjs

Once I'd done that, I cloned my fork locally onto my Mac and followed the instructions to build it.

It needs a copy of Pebble's open-source qemu emulator to talk to, and I started off trying to clone the Pebble qemu and build it locally.  Half-way through, it occurred to me that I already had a perfectly good qemu locally, since I already had the Pebble Dev Kit installed.

By running a pbw in the emulator, with the debug switch enabled, I was able to determine the magic command to start the emulator locally:

Screenshot 2015-05-29 12.51.32

I copied the command, added some quotes around parameters that needed them, and was able to launch the emulator in one window.

The phone simulator in another window:

Screenshot 2015-05-29 12.54.04

And then my app in another:

Screenshot 2015-05-29 12.55.08

Once I was up and running I started making changes to the Python code.  Since I'd never written a line of Python before, I made liberal use of existing code to make the changes I needed.

It all ended well when my pull request containing my changes to support sending binary data was accepted into the official Pebble codebase, meaning that Evernote now runs in the emulator.

Filed under: Pebble No Comments

Capture your Mac screen activity into daily videos

I know I'm not alone in wishing there was a TimeSnapper equivalent for the Mac.  Among many things, it lets you look back in time at what you were doing on your computer minutes, hours or days ago.

Perfect for remembering what you were doing yesterday, and even to recover stuff that was displayed on your screen.

Inspired by TimeSnapper, I've created a small bash script that I've called MacBlackBox, which takes regular screenshots every few seconds. Every hour it combines the screenshots into an mp4 video, and every day it combines the hourly videos into daily videos, one per screen.
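In outline, the approach looks something like this (an illustrative sketch, not the actual script; paths and intervals are made up):

```bash
#!/bin/bash
# Sketch of the MacBlackBox idea: periodic screenshots, combined into mp4.
DIR="$HOME/MacBlackBox/$(date +%Y-%m-%d)/$(date +%H)"
mkdir -p "$DIR"

# Take a screenshot every few seconds (screencapture is the macOS built-in;
# -x suppresses the shutter sound)
while true; do
  screencapture -x "$DIR/$(date +%H-%M-%S).png"
  sleep 5
done

# Run hourly (e.g. from launchd/cron) to combine an hour's screenshots:
# ffmpeg -framerate 2 -pattern_type glob -i "$DIR/*.png" -pix_fmt yuv420p "$DIR.mp4"
```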

It is available on GitHub here.  Happy to accept improvement suggestions.

Filed under: Uncategorized No Comments

Keeping your Moto 360 alive while charging


If you are developing using the Moto 360 and debugging over bluetooth, you'll notice the battery plummeting quickly.

If you put the watch on a Qi charging pad, the Moto 360's charging screen kicks in, and you can no longer do anything on the watch, although if you launch your app via Android Studio, it will run.

If you still want to use your watch while it is charging, root it, and disable Motorola Connect on the watch using:

adb -s 'localhost:4444' shell
$ su
# pm disable com.motorola.targetnotif

This works for me, although I am sure it stops plenty of other things from working, so only do this on a development device, and at your own risk.

Filed under: Uncategorized No Comments

On Pulse: Why your basal ganglia and wearables were made for each other

I just posted Why your basal ganglia and wearables were made for each other

Filed under: Wearables No Comments

On Pulse – How I got my dream job: My wearables journey at Evernote

I just wrote on LinkedIn's Pulse about How I got my dream job: My wearables journey at Evernote

Filed under: Uncategorized No Comments

Scrolling long Pebble menu items

This is a technical blog post.  Warning: contains code.

We recently pushed version 1.2 of Evernote for the Pebble to the Pebble App Store.  It is a minor release, with one bug fix, and one new feature.

The bug fix is related to support for the additional character sets that Pebble can now display.

The enhancement is what this blog post is about.  Since we released the first version of the app, which was generally well received, we’ve received emails from people complaining that their note titles, notebook names, tag names etc. don’t fit on the Pebble screen.  They are cut off, and hard to read.  People asked if we could make menu items scroll horizontally if they didn’t fit.

My response was generally something along the lines of “sorry, but we use the Pebble’s built-in menuing system, and until they support scrolling menu items horizontally, we can’t do anything”.  I never felt great about this response, but it was the genuine situation.  However before I pushed the 1.2 release with the character-set bug-fix, I thought I’d take a look at scrolling the menu items.  Turns out, it was surprisingly easy.

You can see what I’m talking about here:


The funny thing about the Evernote Pebble watch app is that it knows almost nothing about Evernote.  The Evernote intelligence is all delegated to the companion app that runs on the Phone.  The watch app knows how to display massive menus (paging items in and out as necessary), checkboxes, images, text etc. 

When the user scrolls to a new menu item, we kick off a wait timer using app_timer_register, waiting for one second.  If the user scrolls to another menu item before the timer has expired, we wait for another second, this time using app_timer_reschedule:

static void selection_changed_callback(Layer *cell_layer, MenuIndex new_index, MenuIndex old_index,
                                       void *data) {
  WindowData* window_data = (WindowData*)data;
  window_data->moving_forwards_in_menu = new_index.row >= old_index.row;
  if(!window_data->menu_reloading_to_scroll) {
    initiate_menu_scroll_timer(window_data);
  } else {
    window_data->menu_reloading_to_scroll = false;
  }
}

The above method is called by the Pebble framework when the user scrolls to a new menu item.  The check for menu_reloading_to_scroll works around some behavior I’ve seen.  This callback invokes the following method:

static void initiate_menu_scroll_timer(WindowData* window_data) {
  // If there is already a timer then reschedule it, otherwise create one
  bool need_to_create_timer = true;
  window_data->scrolling_still_required = true;
  window_data->menu_scroll_offset = 0;
  window_data->menu_reloading_to_scroll = false;
  if(window_data->menu_scroll_timer) {
    // APP_LOG(APP_LOG_LEVEL_DEBUG, "Rescheduling timer");
    need_to_create_timer = !app_timer_reschedule(window_data->menu_scroll_timer,
                                                 SCROLL_MENU_ITEM_WAIT_TIMER);
  }
  if(need_to_create_timer) {
    // APP_LOG(APP_LOG_LEVEL_DEBUG, "Creating timer");
    window_data->menu_scroll_timer = app_timer_register(SCROLL_MENU_ITEM_WAIT_TIMER,
                                                        scroll_menu_callback, window_data);
  }
}

As you can see, it uses a WindowData structure, which is a custom structure associated with the current window via window_set_user_data.  Once the timer expires it calls scroll_menu_callback:

static void scroll_menu_callback(void* data) {
  WindowData* window_data = (WindowData*)data;
  if(!window_data->menu) {
    return;
  }
  window_data->menu_scroll_timer = NULL;
  if(!window_data->scrolling_still_required) {
    return;
  }

  // Redraw the menu with this scroll offset
  window_data->menu_scroll_offset++;
  MenuIndex menuIndex = menu_layer_get_selected_index(window_data->menu);
  if(menuIndex.row != 0) {
    window_data->menu_reloading_to_scroll = true;
  }
  window_data->scrolling_still_required = false;
  menu_layer_reload_data(window_data->menu);

  // Keep scrolling on a shorter interval until the whole item has been shown
  window_data->menu_scroll_timer = app_timer_register(SCROLL_MENU_ITEM_TIMER, scroll_menu_callback,
                                                      window_data);
}

This code is called once when the timer initiated by initiate_menu_scroll_timer expires (after the one second delay), and then it invokes itself repeatedly using a shorter delay (a fifth of a second), until the menu item is fully scrolled.  The call to menu_layer_reload_data is what causes the menu to be redrawn, using the menu_scroll_offset to indicate how much to scroll the text by.

This is the method that gets called by the draw_row_callback to get the text to be displayed for each menu item:

void get_menu_text(WindowData* window_data, int index, char** text, char** subtext) {
  MenuItem* menu_item = getMenuItem(window_data, index);
  *text = menu_item ? menu_item->text : NULL;
  *subtext = menu_item && menu_item->flags & ITEM_FLAG_TWO_LINER ?
      menu_item->text + strlen(menu_item->text) + 1 : NULL;
  if(*subtext != NULL && strlen(*subtext) == 0) {
    *subtext = NULL;
  }

  MenuIndex menuIndex = menu_layer_get_selected_index(window_data->menu);
  if(*text && menuIndex.row == index) {
    int len = strlen(*text);
    if(len - MENU_CHARS_VISIBLE - window_data->menu_scroll_offset > 0) {
      *text += window_data->menu_scroll_offset;
      window_data->scrolling_still_required = true;
    }
  }
}

The code in the final if block “scrolls” the text when the row corresponds to the currently selected item, by indexing into the text to be displayed and indicating that scrolling is still required.  I’m not happy with using the fixed size MENU_CHARS_VISIBLE to decide whether or not to scroll – it would be much nicer to measure the text and see if it fits.  If you know of a simple way to do this, please comment!

The final thing I needed to do was to actually send longer menu item text from the phone to the watch.  Since Pebble now supports sending more than the 120 or so bytes it used to, this was much easier.  I’m sending up to 32 characters now.

In summary I’m simply using a timer to redisplay the menu, each time scrolling the current menu item’s text by indexing into the character array, and I stop the timer once it has all been displayed.

Filed under: Pebble, Wearables 1 Comment

WatchKit Error – unable to instantiate row controller class

Trying to create a simple WatchKit table, I hit the error shown in this blog post title.

Your mileage may vary, but the eventual cause was that when I added my custom RowController class I accidentally added it to the wrong module … I added it to the main iOS app (WatchTest) instead of the Watch extension:


The first hint of this was when I was trying to reference the RowController when calling rowControllerAtIndex, and my custom row controller class could not be found:

var rootRow = rootTable.rowControllerAtIndex(0) as RootRowController

By this time I’d already set it as the RowController class for my table’s row in the storyboard, and had inadvertently referenced the wrong module:


I fixed the compilation error by adding my custom RowController to the Watch extension module, but accidentally added it to both modules:


Everything compiled, but when I ran it the log showed the error from the title: Error - unable to instantiate row controller class


I eventually figured out my mistake, and made sure that the row controller only belonged to the extension module:


And I made sure the correct module was referenced when defining the RowController in the storyboard:


It would be nice if the Watch App’s storyboard only saw classes in the Watch Extension’s module.

Filed under: Apple Watch, Swift 6 Comments

Using the Evernote API from Swift

There is a fine Evernote iOS SDK, complete with extensive Objective-C examples.  In this blog post I want to share what I did to get it working with Swift.

First I created a new Swift iOS app (called “orgr” below), then I copied the ENSDKResources.bundle and evernote-sdk-ios sources ….


… into the new project, and added references to MobileCoreServices and libxml2 per the SDK instructions.


In order for the Swift code to see the Evernote Objective C SDK, I enabled the compatibility header and pointed it to a header in the SDK that included all the other headers I needed.


I also found (YMMV) that I needed to add a reference to the libxml2 path under Header Search Paths.


Once I’d done this, I was able to build.  Next it was simply a question of translating the Objective-C example code to Swift.  This is the minimal example I came up with:
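The screenshot of the code is missing; in essence it amounted to this (a sketch in the Swift 1.x of the time; check the SDK headers for the exact method signatures):

```swift
// Sketch: authenticate with a developer token and log notebook names.
// Replace "token" and "url" with your own values (see below).
ENSession.setSharedSessionDeveloperToken("token", noteStoreUrl: "url")
ENSession.sharedSession().listNotebooksWithHandler { notebooks, error in
    if let notebooks = notebooks as? [ENNotebook] {
        for notebook in notebooks {
            println("Notebook: \(notebook.name)")
        }
    }
}
```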


You’ll need to replace “token” and “url” parameters with the values you can obtain using the developer token page. This simple example just logs my notebooks.  Next steps are for you …

Filed under: Evernote, iOS 1 Comment