Damian Mehers' Blog Evernote and Wearable devices. All opinions my own.

16 Mar 2015

On Pulse: Why your basal ganglia and wearables were made for each other

I just posted Why your basal ganglia and wearables were made for each other on LinkedIn's Pulse.

Filed under: Wearables No Comments
22 Feb 2015

On Pulse – How I got my dream job: My wearables journey at Evernote

I just wrote on LinkedIn's Pulse about How I got my dream job: My wearables journey at Evernote.

Filed under: Uncategorized No Comments
22 Jan 2015

Scrolling long Pebble menu items

This is a technical blog post.  Warning: contains code.

We recently pushed version 1.2 of Evernote for the Pebble to the Pebble App Store.  It is a minor release, with one bug fix, and one new feature.

The bug fix is related to support for the additional character sets that Pebble can now display.

The enhancement is what this blog post is about.  Since we released the first version of the app, which was generally well received, we’ve received emails from people complaining that their note titles, notebook names, tag names, etc. don’t fit on the Pebble screen: they are cut off and hard to read.  People asked if we could make menu items scroll horizontally when they don’t fit.

My response was generally something along the lines of “sorry, but we use the Pebble’s built-in menuing system, and until it supports scrolling menu items horizontally, we can’t do anything”.  I never felt great about this response, but it genuinely was the situation.  However, before I pushed the 1.2 release with the character-set bug fix, I thought I’d take a look at scrolling the menu items myself.  It turned out to be surprisingly easy.

You can see what I’m talking about here:

 

The funny thing about the Evernote Pebble watch app is that it knows almost nothing about Evernote.  The Evernote intelligence is all delegated to the companion app that runs on the phone.  The watch app knows how to display massive menus (paging items in and out as necessary), checkboxes, images, text, etc.

When the user scrolls to a new menu item, we kick off a wait timer using app_timer_register, waiting for one second.  If the user scrolls to another menu item before the timer has expired, we restart the one-second wait, this time using app_timer_reschedule:

static void selection_changed_callback(Layer *cell_layer, MenuIndex new_index, MenuIndex old_index,
                                       void *data) {
  WindowData* window_data = (WindowData*)data;
  window_data->moving_forwards_in_menu = new_index.row >= old_index.row;
  if(!window_data->menu_reloading_to_scroll) {
    initiate_menu_scroll_timer(window_data);
  } else {
    window_data->menu_reloading_to_scroll = false;
  }
}

The above method is called by the Pebble framework when the user scrolls to a new menu item.  The check for menu_reloading_to_scroll is there to work around some behavior I’ve seen.  This callback invokes the following method:

static void initiate_menu_scroll_timer(WindowData* window_data) {
  // If there is already a timer then reschedule it, otherwise create one
  bool need_to_create_timer = true;
  window_data->scrolling_still_required = true;
  window_data->menu_scroll_offset = 0;
  window_data->menu_reloading_to_scroll = false;
  if(window_data->menu_scroll_timer) {
    // APP_LOG(APP_LOG_LEVEL_DEBUG, "Rescheduling timer");
    need_to_create_timer = !app_timer_reschedule(window_data->menu_scroll_timer,
                                                 SCROLL_MENU_ITEM_WAIT_TIMER);
  }
  if(need_to_create_timer) {
    // APP_LOG(APP_LOG_LEVEL_DEBUG, "Creating timer");
    window_data->menu_scroll_timer = app_timer_register(SCROLL_MENU_ITEM_WAIT_TIMER,
                                                        scroll_menu_callback, window_data);
  }
}

As you can see, it uses a WindowData structure, which is a custom structure associated with the current window via window_set_user_data.  Once the timer expires it calls scroll_menu_callback:

static void scroll_menu_callback(void* data) {
  WindowData* window_data = (WindowData*)data;
  if(!window_data->menu) {
    return;
  }
  window_data->menu_scroll_timer = NULL;
  window_data->menu_scroll_offset++;
  if(!window_data->scrolling_still_required) {
    return;
  }

  // Redraw the menu with this scroll offset
  MenuIndex menuIndex = menu_layer_get_selected_index(window_data->menu);
  if(menuIndex.row != 0) {
    window_data->menu_reloading_to_scroll = true;
  }
  window_data->scrolling_still_required = false;
  menu_layer_reload_data(window_data->menu);
  window_data->menu_scroll_timer = app_timer_register(SCROLL_MENU_ITEM_TIMER, scroll_menu_callback,
                                                      window_data);
}

This code is called once when the timer initiated by initiate_menu_scroll_timer expires (after the one-second delay), and then it invokes itself repeatedly using a shorter delay (a fifth of a second) until the menu item is fully scrolled.  The call to menu_layer_reload_data is what causes the menu to be redrawn, using menu_scroll_offset to indicate how much to scroll the text by.
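For completeness, here is a sketch of the pieces the post doesn’t show: the two timer constants (the millisecond values are my inference from “one second” and “a fifth of a second” above) and the WindowData fields the callbacks use (the field names come from the code; the types are my guesses, since the real declaration isn’t shown):

```c
#include <stdbool.h>

// Assumed values, inferred from the delays described in the text.
#define SCROLL_MENU_ITEM_WAIT_TIMER 1000  // ms before scrolling starts
#define SCROLL_MENU_ITEM_TIMER       200  // ms between scroll steps

// Opaque stand-ins for the Pebble SDK types.
typedef struct MenuLayer MenuLayer;
typedef struct AppTimer AppTimer;

// Field names are the ones the callbacks reference; the real struct
// (associated with the window via window_set_user_data) may hold more.
typedef struct {
  MenuLayer *menu;                     // menu being displayed
  AppTimer  *menu_scroll_timer;        // pending scroll timer, if any
  int        menu_scroll_offset;       // characters scrolled so far
  bool       scrolling_still_required; // set while text is still cut off
  bool       menu_reloading_to_scroll; // guards the reload-triggered callback
  bool       moving_forwards_in_menu;  // direction of last selection change
} WindowData;
```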

This is the method that gets called by the draw_row_callback to get the text to be displayed for each menu item:

void get_menu_text(WindowData* window_data, int index, char** text, char** subtext) {
  MenuItem* menu_item = getMenuItem(window_data, index);
  *text = menu_item ? menu_item->text : NULL;
  *subtext = menu_item && menu_item->flags & ITEM_FLAG_TWO_LINER ?
      menu_item->text + strlen(menu_item->text) + 1 : NULL;
  if(*subtext != NULL && strlen(*subtext) == 0) {
    *subtext = NULL;
  }

  MenuIndex menuIndex = menu_layer_get_selected_index(window_data->menu);
  if(*text && menuIndex.row == index) {
    int len = strlen(*text);
    if(len - MENU_CHARS_VISIBLE - window_data->menu_scroll_offset > 0) {
      *text += window_data->menu_scroll_offset;
      window_data->scrolling_still_required = true;
    }
  }
}

The final if block “scrolls” the text when the row corresponds to the currently selected item, by indexing into the text to be displayed and flagging that scrolling is still required.  I’m not happy with using the fixed size MENU_CHARS_VISIBLE to decide whether or not to scroll – it would be much nicer to measure the text and see if it fits.  If you know of a simple way to do this, please comment!
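To make the trick concrete outside the Pebble SDK, here is the same pointer-offset logic isolated into a plain C function (the value of MENU_CHARS_VISIBLE is an assumption; the post doesn’t give it):

```c
#include <string.h>
#include <stdbool.h>

#define MENU_CHARS_VISIBLE 9  // assumed number of characters per row

// Returns the text to draw for a given scroll offset and sets *more
// if a further scroll step is still needed -- the same logic as the
// if block at the end of get_menu_text.
static const char *scrolled_text(const char *text, int offset, bool *more) {
  *more = false;
  int len = (int)strlen(text);
  if (len - MENU_CHARS_VISIBLE - offset > 0) {
    text += offset;  // "scroll" by skipping characters already shown
    *more = true;    // still truncated: keep the timer running
  }
  return text;
}
```

For example, with an 18-character title and an offset of 5, the function returns the title starting at its sixth character and reports that more scrolling is needed; a short title is returned untouched.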

The final thing I needed to do was to actually send longer menu item text from the phone to the watch.  Since Pebble now supports sending more than 120 or so bytes per message, this was much easier.  I’m sending up to 32 characters now.

In summary I’m simply using a timer to redisplay the menu, each time scrolling the current menu item’s text by indexing into the character array, and I stop the timer once it has all been displayed.
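The whole loop can be simulated on the desktop with no timers or Pebble SDK at all; in this sketch each loop iteration stands in for one firing of the timer (again assuming a 9-character row):

```c
#include <stdio.h>
#include <string.h>

#define MENU_CHARS_VISIBLE 9  // assumed number of characters per row

// Prints the visible window of the title once per simulated timer tick,
// advancing one character each time, and returns how many extra ticks
// were needed before the remainder of the title fit.
static int simulate_scroll(const char *title) {
  int len = (int)strlen(title);
  int offset = 0;
  while (len - MENU_CHARS_VISIBLE - offset > 0) {
    printf("%.*s\n", MENU_CHARS_VISIBLE, title + offset);
    offset++;  // what each timer callback does
  }
  printf("%.*s\n", MENU_CHARS_VISIBLE, title + offset);  // final frame
  return offset;
}
```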

Filed under: Pebble, Wearables 1 Comment
19 Nov 2014

WatchKit Error – unable to instantiate row controller class

Trying to create a simple WatchKit table, I hit the error shown in this blog post title.

Your mileage may vary, but the eventual cause was that when I added my custom RowController class I accidentally added it to the wrong module … I added it to the main iOS app (WatchTest) instead of the Watch extension:

image

The first hint of this came when I tried to reference the RowController when calling rowControllerAtIndex, and my custom row controller class could not be found:

var rootRow = rootTable.rowControllerAtIndex(0) as RootRowController

By this time I’d already set it as the RowController class for my table’s row in the storyboard, and had inadvertently referenced the wrong module:

image

I fixed the compilation error by adding my custom RowController to the Watch extension module, but accidentally added it to both modules:

image

Everything compiled, but when I ran it the log showed the error from the title: Error - unable to instantiate row controller class

image

I eventually figured out my mistake, and made sure that the row controller only belonged to the extension module:

image

And I made sure the correct module was referenced when defining the RowController in the storyboard:

image

It would be nice if the Watch App’s storyboard only saw classes in the Watch Extension’s module.

Filed under: Apple Watch, Swift 3 Comments
2 Nov 2014

Using the Evernote API from Swift

There is a fine Evernote iOS SDK, complete with extensive Objective-C examples.  In this blog post I want to share what I did to get it working with Swift.

First I created a new Swift iOS app (called “orgr” below), then I copied the ENSDKResources.bundle and evernote-sdk-ios sources ….

image

… into the new project, and added references to MobileCoreServices and libxml2 per the SDK instructions.

image

In order for the Swift code to see the Evernote Objective C SDK, I enabled the compatibility header and pointed it to a header in the SDK that included all the other headers I needed.

image

I also found (YMMV) that I needed to add a reference to the libxml2 path under Header Search Paths.

image

Once I’d done this, I was able to build.  Next it was simply a question of translating the Objective-C example code to Swift.  This is the minimal example I came up with:

image

You’ll need to replace the “token” and “url” parameters with the values you can obtain from the developer token page.  This simple example just logs my notebooks.  The next steps are up to you …

Filed under: Evernote, iOS 1 Comment
20 Oct 2014

Eyeglasses are broken

My eyeglasses are broken, and I want them fixed.

I vividly remember the morning I woke up, and could no longer read.

Everything was blurry, and no matter how much I blinked away the night, I still could not read.  I could see things further off, and if I moved my phone well back past my normal reading distance, I could still just about focus.

Eventually my eyes could focus as normal, and I put the experience down to tiredness.  But soon the blurriness came back, and didn't leave.  I was being abruptly welcomed into late middle age.  I needed reading glasses.

I picked up a pair of cheap glasses from the local supermarket, and miracle of miracles, I could read again.  Everything was fine and crisp, even when I used the smallest font in the Kindle app.

There was, however, still an issue.  When I was wearing my reading glasses, and I was looking at something that wasn't a book, that was further away, say a person's face, or a stop sign, everything was blurry.  I had to take my glasses off to see beyond the page in front of me.

So, in this age of miniaturized sensors, 3D printers, and new materials science, why can I not buy a pair of glasses that sense how far away the objects they are pointed at are, and physically deform the lenses appropriately to bring those objects into focus for the wearer?

For me the lenses would become clear glass when looking at something in the distance, and would deform to +0.5 reading glasses when looking at a page in front of me.

There have been similar attempts in the past, but as technology advances, sensors become smaller, and motors become miniaturized, I think it's time to look once again at eyeglasses.  The way they work now is broken.  If Google invested a fraction of the money they have put into Google Glass, I'm convinced they could bring this kind of glasses to the world, benefitting hundreds of millions.  And just perhaps, by bringing Glass-like functionality along for the ride, they could bring Glass to the masses.

Filed under: Product-Ideas No Comments
12 Oct 2014

The inevitable evolution from wearables to embeddables

The inevitable evolution from wearables to embeddables is at once both exciting and horrifying.

Let's think about Bluetooth headsets. They are already becoming smaller, and will soon be invisible.

I believe that Bluetooth headsets will miniaturize to the point of being so tiny that they will be embedded subdermally, perhaps behind your ear. We'll solve the battery issues by harvesting the body's own heat, or its motion.

What will this give us? Only telepathy. You'll be able to communicate mind-to-mind with anyone on the planet through this device that is part of you, initially by voicing words sub-vocally, but perhaps one day through splicing directly into nerves.

It is as exciting as it is inevitable.

What is also inevitable is that a despotic regime somewhere will use such capabilities to pipe its propaganda directly into its citizens' minds. Can you imagine, from birth, having this incessant stream of brainwashing beamed directly to your brain? It's horrifying.

So, along with the best-case scenarios we imagine when dreaming up new technologies, let's also think of the nightmare worst-case scenarios, and make sure we do what we can to mitigate them. In this case, let's start with a physical off-switch.

Filed under: Product-Ideas No Comments
11 Oct 2014

A useful in-car app experience

OK, I admit it: I can't help it. Whenever I hit a problem in the real world, I automatically seek to solve it, often through the hammer in my virtual toolbox, which is creating apps.

So what does this have to do with driving my kids home from school? There are traffic lights on the route. Lots and lots of them. Like all of you, I am sure, I never look at my phone screen when I am driving and the car is in motion, but when the car is stopped in front of traffic lights, it is often hard to resist quickly checking my email, or twitter, or whatever.

Of course that is a trap. Before I know it I've been sucked into my digital world, and am oblivious to the real world, until I am rudely and abruptly pulled out of it by the honking horn of the person behind me.

So what I want is this: an app that lets me use my phone as normal but, in the background, uses the camera on my phone to lock on to the red light of the traffic light, detect when the orange light appears next to it, and alert me both audibly and visually that the lights are changing.

I'd even use it when I'm not looking at my phone, but instead lost in dreamy reverie, lost in my own thoughts, and equally oblivious to the lights changing.

But this is only part of my master plan. Oh no, it is not all.

Sometimes I'm stopped while in the car, and it isn't a traffic light that has stopped me. Instead it is a traffic jam. Like all of you, I am sure, I dream of being able to launch a small drone from my car to fly overhead to the front of the jam, to understand what is happening, and how long I will be stuck for. The drone would be paired with my phone, letting me control it from my phone, and beam back images to my phone.

It occurs to me that the whole drone thing is unnecessary, potentially dangerous, and more than likely illegal. Instead, all I need is an app that everyone in the traffic jam uses to broadcast the scene in front of them live. Then people far back from the front of the jam can zoom through the cameras, rushing forward car by car to the front, to understand what is happening.

With appropriate anonymizing safeguards in place (number plate blurring) it could also be used by news organizations and the emergency services.

Filed under: Product-Ideas No Comments
28 May 2014

Interview for Connectedly on Evernote and Wearables

I recently gave a brief interview about Evernote and Wearables, with special focus on the Pebble, for Adam Zeis at Connectedly, part of the Mobile Nations group (Android Central, iMore, etc).

More here.

Filed under: Uncategorized No Comments
1 May 2014

Evernote on your Pebble: your desktop duplicated?

At first glance it might look as though Evernote on the Pebble is simply a clone of Evernote for the desktop.  [screenshots]

That would make absolutely no sense whatsoever, given that the Pebble has an entirely different form factor, with very different uses.

I’d like to share some of the ways in which Evernote on the Pebble has been tailored to the wrist-based experience, and what you can do to get the most out of it.   But first …

A step back … why wearables?

Earlier this year at the MCE conference I presented a hierarchy of uses for wearable devices:

  • Notifications, especially smart notifications based on your context, for example based on your current location, or who you are with, such as those provided by Google Now;
  • Sensors, especially health sensors, but also environmental sensors. Very soon we will examine the devices of someone who just died, as a kind of black box to determine what happened.
  • Control of the environment around you, such as the music playing on your phone or your house lights. The key is that you have to be able to do it without thinking about it … maybe gesture-based controls.
  • Capture of information, such as taking audio notes, or photos from your watch or Glass.
  • Consumption of information, such as viewing Evernote notes.  The key to this being useful is that the effort to view the information on your watch must be significantly lower than the effort to pull out your phone, unlock it, start the appropriate app, and navigate/search for the information.  Ideally the information should be pre-prepared for easy consumption based on your context, such as where you are, or what you are doing.

How does Evernote fit in?

Notifications work without the Evernote Pebble app

The Pebble already provides notifications from apps, so that when an Evernote reminder notification fires on your Phone …

[screenshot]  … you’ll see that notification on your watch …  [screenshot]

As the Evernote phone apps become more sophisticated about providing smarter, context-based notifications, you’ll get that for free on your watch. 

The Evernote app for the Pebble is very much focused on the last item in that list: consumption.

Easy access to your most important information: Your Shortcuts

On the desktop and mobile versions of Evernote, you use Shortcuts to give you easy, instant access to your most important information. Perhaps it’s information that you always need to have at your fingertips, or that you are working on right now.

[screenshot]

It stands to reason that on the Pebble we’d give you an easy way to access those Shortcuts, and we do:

[screenshots]

But wouldn’t it be cool if you could access your most important information, your shortcuts, as soon as you start Evernote? 

[screenshots]

We thought so too, which is why you can put your Shortcuts at the top level menu, before all the other Evernote menu items, so that you can see your most important stuff instantly:

[screenshots]

Context-sensitive information: nearby notes

If you are about to walk into a meeting, or into a restaurant, then nearby notes are your friend:

[screenshot]

This shows the notes that you created closest to your current location (yes, you can toggle between miles and kilometers), so that if you are about to go into a meeting with someone …

[screenshot]

… you can quickly remind yourself about the person you are about to meet:

[screenshots]

Activity-sensitive information: a custom checklist experience

Ideally Evernote for the Pebble would automatically detect that you are in the supermarket, and present you with your shopping list.  It doesn’t do that yet, but it does make it easy for you to check and uncheck checkboxes.

Specifically, it looks for all your notes that have unchecked checkboxes in them, and presents them as a list.  If you choose one, it displays just the checkboxes from that note, and lets you check and uncheck them.

This makes for a super-convenient shopping experience.  If you’ve ever had to juggle a small child in one hand, a supermarket trolley in the other hand, and a mobile phone in the other hand, you’ll really appreciate being able to quickly and easily check items off, as you buy them:

[screenshots]

What’s more, if you remembered to use Evernote on your phone to take a photo of the yoghurt pot back home, because you knew that you were likely to be overwhelmed when faced with a vast array of dairy produce at the shop …

[screenshot]

… then you can navigate to that note on your watch, and glance at the photo:

[screenshot]

The Pebble’s screen is quite small, and black-and-white, so you may need to squint a little to make out the photo!

Easy access to your most important notes: Reminders

If you don’t make much use of Reminders, then you might be a little puzzled to see a dedicated Reminders menu item on the Pebble:

[screenshot]

The reason is that many many people use Reminders as a way of “pinning” important notes to the top of their notes list.  Reminders are always shown at the top of the note list on the desktop apps:

[screenshot]

On your Pebble you have quick and easy access to these important notes:

[screenshots]

You can view a reminder:

[screenshot]

And you can mark it as “done” by long-pressing:

[screenshot]

Information at a glance.  When is it a chore, and when is it a glance?

The ideal Evernote experience on your watch gives you instant access to your most important information.  Evernote on the Pebble does this by giving you quick and easy access to your shortcuts, nearby notes, checklists and reminders.

But sometimes, that isn’t enough.  Then you have a choice: do you pull out your phone, unlock it, start Evernote, and search or navigate to the information you want? Or, if it is a small text note, might it be easier to navigate to it on your watch?

Depending on what kind of a person you are, and on how you use Evernote, the idea of navigating to your notes on your watch, by drilling down using Tags (for example) might seem either laughably complex, or super-cool and powerful.  If you are an early-adopter of wearable technology, for example if you were a Pebble Kickstarter backer, then chances are you fall into the second camp.

This is the reason for the other menu items I have not discussed above: Notebooks, Tags, and Saved Searches.  For some people, it would be much easier to quickly drill down to a note on their watch, than to pull out their phone.

[screenshots]

Glancability may not be a real word, but if it were, it would be in the eye of the beholder.

The future of Evernote on wearables

By providing you with a customized experience on the Pebble, Evernote serves you information based on what is most important to you (shortcuts and reminders), what makes sense based on your current context (nearby notes, checklist notes) as well as the more traditional ways of accessing your notes (notebooks, tags, saved searches).

These are very early days for wearable technologies.  Evernote for the Pebble is a start … as the capabilities of wearable devices evolve, so will your Evernote wearable experience.  Evernote is very much about working in symbiosis with you, completing your thoughts for you, providing information to you before you even know you need it.  There is so much more to come.

Filed under: Evernote, Pebble 2 Comments