No, let’s not use that date format

Doing the rounds is XKCD’s endorsement of the ISO 8601 date format. Let’s avoid that, because as another XKCD reminds us, you don’t just invent new standards in the hope of wiping out the old ones.

I don’t know how serious the proposal is, but I’ll bite:

  • 2013-02-27 is used by almost no-one, so it will confuse everyone.
  • Real people don’t use leading zeroes.
  • It’s still ambiguous. Given date formats are already messed up, there’s no reason for a reader to assume 2013-02-03 follows the logical MM-DD order.

No, the real answer is either to include the month name (or an abbreviation), or, in a digital context, to use the “N days ago” idiom. (Note that “N days ago” does suffer from one major issue: it goes stale if the content is cached.)

Sure, if the context is filenames or something else technical, use this format, or just plain old 20130227 (it will sort nicely); I often use that form for backups. But for humans, stick to what they know.
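For what it’s worth, Python’s standard library covers all three options; a quick sketch:

```python
from datetime import date

d = date(2013, 2, 27)

# Compact form for filenames and backups: lexicographic order == chronological order
print(d.strftime("%Y%m%d"))    # 20130227

# For humans, include the month name so the order is unambiguous
print(d.strftime("%d %b %Y"))  # 27 Feb 2013

# The "N days ago" idiom for digital contexts (remember: it goes stale if cached)
def days_ago(then, today):
    return f"{(today - then).days} days ago"

print(days_ago(d, date(2013, 3, 2)))  # 3 days ago
```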

Discovering Users’ Social Path with the New Google+ API

Google announced a slew of identity and social updates today; most excitingly, the ability to browse users’ social paths. This comes after similar services recently blocked some folks from doing just that, which tells you Google gave it due consideration and is committed to supporting the feature indefinitely.

Here’s how the authentication looks:

Now there’s a whole set of widgets and JavaScript APIs, but I was interested in the regular scenario for apps already using the “traditional” OAuth 2 dance. After asking on the G+ APIs community, I was able to get this running and I’ll explain how below.

Step 1. Visit the API doc:

Step 2. Scroll to the interactive part below and turn on OAuth 2.0 on the top-right switch.

Step 3. To the default scope, add a new one; that’s the magic scope that lets your app pull in social graphs.

Step 4. For userID, enter “me”. For collection, enter “visible”. (This collection property, representing circles/people the user can identify to the app, only has that one value at present.)

Step 5. Now hit execute and (as a test user) you’ll see the dialog shown at the top of this article. Then hit accept.

Step 6. I got a confirmation dialog saying “Clicking Confirm will let Google APIs Explorer know who is in your circles (but not the circle names). This includes some circles that are not public on your profile.” That’s surprising, as I believe circles are always private (for now), so I guess users will always see that message. Accept it.

Step 7. The JSON response will now be shown below the form. It includes a top-level field called “items”, which is the list of your (the authenticated user’s) G+ people. If the list is too long, there will also be a “nextPageToken” field so the app can page through the list.
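An app consuming this response would loop on “nextPageToken” until it runs out. Here’s a hypothetical Python sketch of that paging logic; `fetch_page` stands in for the real authenticated HTTP call, and the stub names and data are made up for illustration:

```python
def fetch_all_people(fetch_page):
    """Collect every person from a paged response.

    `fetch_page` is any callable that takes an optional page token and
    returns a parsed JSON dict with "items" and, while more pages
    remain, "nextPageToken" (the fields described in Step 7).
    """
    people, token = [], None
    while True:
        page = fetch_page(token)
        people.extend(page.get("items", []))
        token = page.get("nextPageToken")
        if not token:
            return people

# A stubbed fetch_page standing in for the real HTTP request:
def fake_fetch(token):
    pages = {
        None: {"items": [{"displayName": "Ada"}], "nextPageToken": "p2"},
        "p2": {"items": [{"displayName": "Grace"}]},
    }
    return pages[token]

print([p["displayName"] for p in fetch_all_people(fake_fetch)])  # ['Ada', 'Grace']
```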

So that’s an overview of the new G+ social API. It’s a straightforward OAuth implementation and should be easy for anyone with a Google login to adopt. I’ve been looking forward to adding this functionality on Player FM so people can see what their friends are listening to … I think it’s a nice model where users can choose how much of their social graph they share with any app.

Glass Surrogates

Google Glass rolls out later this year. Discussion of applications has focused on receiving timely notifications and recording first-person video, but in the hands of developers, many more ideas will emerge. One possibility I haven’t seen discussed is surrogates: like all things Glass, a potentially transformative and empowering idea teetering right on the creepy line.

A Glass surrogate is best exemplified by Larry Mittleman in Arrested Development’s third season (“Middleman”, get it? Portrayed to comedy perfection by Bob Einstein).

While bedroom-bound, George Bluth recruits the surrogate to walk through the world on his command, say what he says, do what he commands.

Being equipped with a streaming mic and camera, the surrogate is starting to look awfully familiar in a world of Glass.

One area where this will likely happen is “commodified outsourcing”, an industry that’s already moved on from desk-bound eLance/oDesk type work to real-world delegation, a la Exec and TaskBunny.

These services let you post errands to be conducted in fleshspace (delivery, cleaning, lining up for tickets, etc.). The contractors already use their phones to send photos and converse with their clients. I’d be surprised if these workers weren’t issued with Glass devices as standard in a year or two. A busy person could send the surrogate to the store and provide instructions once the surrogate is there, so the busy person is only engaged during the 10 minutes of actual shopping, instead of the hour it takes to visit the store and return.

On a grander scale, a well-located surrogate might save someone a time-consuming overseas trip.

It’s more than just saving time. Some people are physically immobile or find it impractical to travel any distance. For them, a surrogate would be the closest thing to being physically present.

This will also take shape in professions where telepresence is emerging. Medicine, for example. A surrogate medical specialist (maybe doctor, maybe not) would perform procedures on behalf of a remote doctor.

You can also see how a manager could flip between workers’ points of view like flipping between CCTV cameras, even when said workers are in the field. They might be physically labouring on the factory floor, or juniors in a business meeting the big boss jumps in and out of. This is definitely a potentially creepy scenario, but one that would have immense training and feedback benefits.

Far-fetched? Consider that one of the services I mentioned above already does this in its own way: oDesk lets providers view periodic screenshots of contractors. When I’ve hired contractors this way, I haven’t used the facility much, because I hire motivated workers and manage workflow in other ways (e.g. Trello), but it can be useful in a remote-working context to check a contractor is on the right track. Glass would take all that out into the real world.

There are many ethical and well-being concerns here. I can imagine this quickly becoming a scenario where managers view all their workers’ perspectives and chime in as a “voice of God” to direct their work. These scenarios will need to be ironed out, and as with other areas of Glass, etiquette and conventions will emerge.

Side note: The movie Surrogates is another example, but unlike Arrested Development, these surrogates are humanoid robots which is one or two AI generations removed from the imminent Glass scenario.

Order Of Magnitude Improvement: 3.16x

Disclaimer: Largely waffle.

A common principle in tech is that changes are only adopted on a grand scale when there is an order-of-magnitude improvement. That is, it’s not good enough to add a couple of new features to make the product 10% better; that will only bring a niche audience. You have to make it radically, qualitatively, better.

It’s easy to see examples of this: Google’s search was blatantly more useful to anyone acquainted with AltaVista and friends; windowed UIs were blatantly friendlier to casual users than a text terminal; etc.

An interesting question is: how much is an order-of-magnitude improvement? Well, the real answer is that it doesn’t much matter. It’s a principle, and it’s more about a disruptive, qualitatively different change than something you can measure.

But that said, it’s often equated with a 10x improvement. That’s the literal reading of the phrase, anyway.

But actually, I always think of it as a 3.16x improvement: the square root of 10. The reason is that “orders of magnitude” implies a discrete scale, and jumping to the “next” order of magnitude means going one up. So you might say it’s anything more than 5x. But orders of magnitude grow exponentially by definition, so if you can improve something by 3.16x, you’re halfway there. (It would be halfway on a log scale.)
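The arithmetic, for what it’s worth:

```python
import math

half_order = math.sqrt(10)  # one "half" order of magnitude
print(round(half_order, 2))  # 3.16

# Two such improvements compound to a full order of magnitude:
print(round(half_order * half_order))  # 10

# On a log10 scale, 3.16x is exactly halfway to 10x (which sits at 1.0):
print(round(math.log10(half_order), 10))  # 0.5
```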

This is all a very silly calculation because, like I say, the whole concept is wishy-washy. What are we even measuring, anyway? Utility? Something else? And if someone came up with a 3.16x innovation right after the last one, then by this definition we’d have jumped only 10x in total, yet two orders of magnitude. Just thought I’d mention it anyway.

Bitten By Significant Whitespace

I’ve come to love significant whitespace since using it in CoffeeScript. (I’d dismissed it due to generally not getting on with Python, but really that’s for other reasons.) By eliminating the need for { }, code is more to the point.

However, significant whitespace is playing with fire and I just got burned.

The code in question tailors sidemenu behaviour for a touch device. Anyway, the final false was wrongly indented: it should have been indented by two more characters, to sit directly under the other three lines.

It must have been a quick edit or something, but the net effect was forms couldn’t be submitted when on a touch interface. I couldn’t quickly track it down, so made some workarounds to get things working, but then I realised it was happening on all forms, so looked into it more.
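The original CoffeeScript isn’t reproduced here, but a minimal Python analogue shows the class of bug (significant whitespace works the same way in both languages); the handler names are made up, and returning False stands for cancelling the form submission:

```python
def on_submit_intended(touch_device):
    if touch_device:
        pass          # touch-specific sidemenu tweaks would go here
        return False  # indented under the if: only touch devices block the submit
    # non-touch: falls through, returns None, form submits normally

def on_submit_buggy(touch_device):
    if touch_device:
        pass          # touch-specific sidemenu tweaks would go here
    return False      # de-dented by two spaces: EVERY submission is now cancelled

print(on_submit_intended(False))  # None  (non-touch form submits normally)
print(on_submit_buggy(False))     # False (submission blocked everywhere)
```

Two characters of whitespace, and the return value silently moves from one branch to the whole handler.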

Lessons:

  • Be very careful changing any Coffee indents.
  • Modernizr.touch would be a good starting point when searching for the cause of bugs like this.