Damian Mehers' Blog: Xamarin from Geneva, Switzerland.

25 Jan 2014

Evernote tip 9: Index the physical

Like a lot of people I rely on Evernote as my external brain, but my use of Evernote extends beyond the digital realm to the physical realm too.

But first, a quick quiz. Can you identify this?

(photo of an unidentifiable piece of plastic)

No? You know what? Neither can I. But somehow this weird piece of plastic turned up in my home office one day, and I was left with a dilemma with which I am sure you too are familiar.

Throw it away?

On the one hand, I could throw it away. The thing is, if I did that, you can be sure that within a week or so it would turn out that the piece of plastic was absolutely vital to the functioning of some critical piece of household equipment.

“File” it?

On the other hand, I could decide to "file" it in that drawer I have. You don't know which drawer I mean? Oh yes you do, it’s the same one you have, filled with cables for phones you no longer own, remote controls, batteries that may or may not be charged, and yes, nameless pieces of plastic.

If I decided to put it in that drawer then I can be equally sure I would never need it, and the only time I might touch it again is when I move house, although even that isn't a sure thing. There is a fair chance it might follow me to my grave...

Evernote to the rescue

So what do I do? I choose the second option, BUT before I "file" it in that drawer or box, I also file it in Evernote: I take a photo, put it in a note, and tag it with where it is. This means that whenever I find I need that piece of plastic, all I need to do is scan through my "real world" notes and I can quickly and easily retrieve it.

Of course, this use of Evernote isn't restricted to anonymous pieces of plastic. I also use it to file other small objects that I can't quite bring myself to recycle, but which I know I won't be needing in the near future.

Filed under: Evernote
12 Jan 2014

Sony SmartWatch2 scrollable text

Sony have adopted an intriguing approach to development for their SmartWatch 2. Unlike other smartwatches, you don’t write code that runs on the watch itself. Instead, all your code runs on an Android phone. You define the watch UI using standard Android layouts, and that UI is remoted onto the watch.

Your app can respond to events such as touches, since these events are sent from the watch to your phone, and then delivered to your app.
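
Touch handling, for example, ends up as an ordinary callback in the extension class running on the phone. Something like this minimal sketch (the onTouch callback, ControlTouchEvent, and the Control.Intents touch-action constant are my assumptions about the Sony control API, not code lifted from this app):

  @Override
  public void onTouch(final ControlTouchEvent event) {
    // Sketch only: the watch forwards the touch, and we react to it here on the phone.
    if (event.getAction() == Control.Intents.TOUCH_ACTION_RELEASE) {
      Log.d(TAG, "Watch tapped at " + event.getX() + ", " + event.getY());
      // Do whatever the tap means, then push any UI change back to the watch.
    }
  }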

This is kind of cool, in that you don’t have to debug on the watch.  It is simpler, and I think it works well for relatively simple apps.

There are, however, limitations on the UI elements that you can display. Lists work well, but it isn’t currently possible to create a scrollable text area.

For the experimental app I was working on, this was a big issue. I needed to display text that went on for more than one screen.

I eventually found a way around this restriction.  I render the text into a bitmap in memory on the phone, then I split the bitmap up into watch-screen sized chunks, and I use each chunk as an element in a list.  This works.  You can scroll through your text, albeit a page at a time.

My list is derived from ManagedControlExtension, and in onResume I render the text to a bitmap member variable by calling renderTextToCanvas:

  private void renderTextToCanvas() {
    // Create a bitmap tall enough to hold SCREEN_PAGES watch screens of text,
    // and a canvas to draw into it.
    mBitmap = Bitmap.createBitmap(mScreenWidth, mScreenHeight * SCREEN_PAGES, Bitmap.Config.ARGB_8888);
    mBitmap.setDensity(DisplayMetrics.DENSITY_DEFAULT);
    mCanvas = new Canvas(mBitmap);
    mCanvas.setDensity(DisplayMetrics.DENSITY_DEFAULT);

    TextPaint tp = new TextPaint();
    tp.setColor(Color.WHITE);
    tp.setTextSize(18);

    String text = mNote.textContent;

    if (text == null) {
      // No content: show a placeholder message, skewed to look italic.
      Log.d(TAG, "Empty text ...");
      text = mContext.getString(R.string.empty_note);
      tp.setTextSkewX(-0.25f); // Italics
    }

    // Lay the text out at the watch-screen width and draw it into the tall bitmap.
    StaticLayout sl = new StaticLayout(text, tp, mScreenWidth, Layout.Alignment.ALIGN_NORMAL, 1.2f,
                                       0f, false);

    mCanvas.save();
    sl.draw(mCanvas);
    mCanvas.restore();
  }
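
One thing the code above glosses over is the fixed SCREEN_PAGES constant: if the note needs more pages than that, the end of the text is simply cut off. A variation I haven't adopted here, sketched below using only standard Android classes, is to build the StaticLayout first and derive the page count from its measured height:

    // Sketch only: measure the text first, then allocate exactly enough pages.
    StaticLayout sl = new StaticLayout(text, tp, mScreenWidth,
                                       Layout.Alignment.ALIGN_NORMAL, 1.2f, 0f, false);

    // Round the measured height up to a whole number of watch screens.
    int pages = Math.max(1, (sl.getHeight() + mScreenHeight - 1) / mScreenHeight);

    mBitmap = Bitmap.createBitmap(mScreenWidth, mScreenHeight * pages, Bitmap.Config.ARGB_8888);
    mCanvas = new Canvas(mBitmap);
    sl.draw(mCanvas);

The computed page count would then also be used as the list count, so the number of list items always matches the rendered text.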

Then, when a list item is requested, I cut the appropriate chunk out of the bitmap and send it to the watch:

  @Override
  public void onRequestListItem(final int layoutReference, final int listItemPosition) {
    Log.d(TAG, "onRequestListItem() - position " + listItemPosition);
    if (layoutReference != -1 && listItemPosition != -1 && layoutReference == R.id.listView) {
      ControlListItem item = createControlListItem(listItemPosition);
      if (item != null) {
        sendListItem(item);
      }
    }
  }

  protected ControlListItem createControlListItem(int position) {
    // Cut the page-sized chunk for this list position out of the tall bitmap.
    Bitmap bitmap = Bitmap.createBitmap(mBitmap, 0, mScreenHeight * position,
                                        mScreenWidth, mScreenHeight);
    ByteArrayOutputStream byteArrayOutputStream = new ByteArrayOutputStream();
    bitmap.compress(Bitmap.CompressFormat.PNG, 100, byteArrayOutputStream);

    ControlListItem item = new ControlListItem();
    item.layoutReference = R.id.listView;
    item.dataXmlLayout = R.layout.note_content_item;
    item.listItemPosition = position;
    item.listItemId = position;

    // Send the PNG bytes to the watch as the content of the item's ImageView.
    Bundle imageBundle = new Bundle();
    imageBundle.putInt(Control.Intents.EXTRA_LAYOUT_REFERENCE, R.id.imageView);
    imageBundle.putByteArray(Control.Intents.EXTRA_DATA, byteArrayOutputStream.toByteArray());

    item.layoutData = new Bundle[] { imageBundle };

    return item;
  }
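
For completeness, the list itself still has to be shown and told how many pages it contains. In onResume, after renderTextToCanvas, that comes down to a couple of calls on the control extension, roughly like the sketch below (R.layout.note_content_list is a stand-in name for the layout hosting the ListView, and showLayout/sendListCount are the ControlExtension methods I understand the Sony SDK provides for this):

  // Sketch: show the layout containing the ListView on the watch, then tell the
  // watch how many pages (list items) to expect. The layout name is a stand-in.
  showLayout(R.layout.note_content_list, null);
  sendListCount(R.id.listView, SCREEN_PAGES);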

All this leads to scrollable text on the watch.

Filed under: Wearables
12 Jan 2014

So, how did you die? Wearables as the human black box.

Yesterday I gave a presentation at the excellent Mobile Central Europe conference in Warsaw, Poland, on Evernote and wearable devices.

When talking about the convergence of activity-monitoring devices and smartwatches, I voiced a sudden thought.

How long will it be before someone dies of a heart attack, and we use the health sensors being incorporated into smart wearable devices to look at what was happening in the moments before they died?

We might then look for the same signs in others and warn them: “Lie down! Lie down! Medical assistance is on its way” … the human equivalent of the cockpit’s “Terrain! Terrain! Pull up!”

(image: a flight recorder)

Filed under: Wearables
5 Jan 2014

Word for Mac Focus keyboard shortcut: here’s how

You may well be wondering why I am writing a blog post on how to enter Focus view in Word 2011 on the Mac, when clearly I should be focusing on writing something… let’s not go there.

There is no built-in keyboard shortcut, and it wasn’t obvious to me how to add one, but I got there eventually.

Use the Tools|Customize Keyboard menu item.

Then go to the View menu on the left, find ToggleFull on the right, and enter the shortcut you wish to use. This was the key for me: I’d never have guessed that Full meant Focus, since there is also a full-screen mode.

OK, so now you have no excuse not to focus!

Filed under: Fluff