Audio/Video Tag: Don’t Forget to load() before you play()

This is a gotcha that will catch a lot of people out, because it goes against the expectation that, as with an image, you just set “src” and you’re done. It’s also a deviation from the more general convention that you change DOM properties declaratively to make stuff happen. Probably for good reason, given the temporal nature of audio and video playback, but just beware it’s a different model.

Bottom line: call load() between changing src and play()ing:

javascript

  function play(url) {
    var audio = document.querySelector("audio");
    audio.src = url;
    audio.load(); // !HUL|_O! PAY ATTENTI0N!
    audio.play();
  }

A lot of online sources talk about play() and omit to mention this vital fact, because they’re only play()ing a single track…and the first track works fine. This is an oddity – you can just call play() the first time and it will automatically load. This happens whether you set “src” in the initial HTML’s “audio” tag or whether you set it programmatically. I did some fiddling and it looks like the behaviour is the same in Chrome and Firefox. It’s like the track is auto-load()ed the first time you set its source. (And note that Firefox doesn’t auto-buffer, so that’s nothing to do with it.)

devx has you covered on this topic, and points out the subtle issues which you would need to look at for a quality production:

However, media by its very nature is a time-spanning process. You’re working with both the need to load and cache audio and video files and the network connection, which itself can prove to be a challenge that’s usually not a factor for most non-temporal resources. For these reasons, when dealing with video and audio on the web, if you want to take control of the process in any way, you have to work asynchronously, tying into the various events that are passed back to the client from the media-serving website. Table 4 (taken directly from the HTML 5 documentation) provides a full listing of the various events and when they are dispatched. … Of all the events in Table 4, perhaps the most useful are the canplay and canplaythrough events. The first, canplay, will fire when enough data has been loaded for the video player to start actually rendering content constructively, even if not all of the data has been loaded. The canplaythrough event, on the other hand, will fire when the data has essentially completely loaded into the browser’s buffer, making it possible to play the video all the way through without the need to pause and retrieve more content.
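
Putting the devx advice together with the gotcha above, here’s a minimal sketch of waiting for canplaythrough before starting playback, rather than calling play() immediately; playWhenReady is just an illustrative name, and the selector assumes a single audio element on the page:

javascript

  function playWhenReady(url) {
    var audio = document.querySelector("audio");
    audio.src = url;
    audio.load();
    // Wait until the browser reckons it can play the whole track
    // through without pausing to buffer, then start playback.
    function onReady() {
      audio.removeEventListener("canplaythrough", onReady, false);
      audio.play();
    }
    audio.addEventListener("canplaythrough", onReady, false);
  }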

Photoshop Tamagotchi

… is a wicked expression @psd used in his hardboiled hCards talk at the GTUG event on Chrome and HTML5 the other night.

What’s cool about playing with CSS3 is you can radically muck around with appearance without resorting to graphic editor tweaks, which has always been painfully slow. Partly that’s because of the editors themselves, which make it painful to keep editing-and-saving; i.e. every time you save a jpeg, most editors will keep coming up with the same dialog to set quality level, progressiveness, etc., and then they’ll get you to confirm you want to overwrite the previous file. Oh…and they separate “exporting” to jpeg from saving in their native format, instead of just doing both.

But mostly it’s because all the mouse-driven tweaking in a graphical editor is just plain slow. And then there’s dealing with the file system – saving to the right place and linking to it from the HTML/stylesheet. Lots of overhead.

Some things can be automated by server-side image generation scripts: tweaking the URL tweaks the gradient, or rounded corners, or angle, whatever. But flexibility is lost, not to mention the massive performance and bandwidth drain.

CSS3 does away with all that. You can make all kinds of radical changes in one or two lines. Tweaking from the comfort of your favourite text editor (==vim). Some would say it’s “too easy”. (They are the same people who think spreadsheets shouldn’t include macros because non-programmers might be able to do useful things with them.) Not every project can justify a dedicated visual designer. Most can’t. So now that we have all this power, there’s a need for simple guidance on how to design visually with all these elements. To create passable designs, not masterpieces, which is good enough for most sites. Let’s see that kind of advice from the visual design world!

Joining Google

I mentioned a few weeks ago that I’m moving on from Osmosoft, and after 12 days and a week of doing much less than one could ever have imagined, it’s time to announce where I’m going: Google. I’ll be joining Google as a Chrome developer advocate, which means focusing on HTML5, Javascript, the Chrome browser, ChromeOS, and related technologies.

I’m very excited about the role, for reasons best explained in terms of a timeline for the web. I see the history of web technologies divided into three phases and I believe we’re currently in the transition from the second to the third phase. Which is the starting context for this role.

Dark Ages (Web as a Black Art)

Rich web apps were hard work. Black art even. During this phase, I spent most of my time on server-side languages – Perl, PHP, Java/J2EE – and Javascript was a weird sideline. Taking an interest in the user experience, I always wanted to know more about this black art, since it’s the web technologies that touch the user and have the most direct influence over their experience with any application. But what paid the bills in the late ’90s and early noughties was form submissions and page refreshes. And most Javascript content was odd little ticker tape scripts you could download, not too much to learn from, and always difficult to achieve in one’s spare time. I think this was a frustrating experience for many web developers: not enough time, and hard work. I did get by and pick up a few tricks here and there, but things only took off when the A-word was unleashed …

Ajax Era (Web as an Established Practice)

Everything changed five years ago, on February 18, 2005. Jesse James Garrett hopped in the shower, came up with the term “Ajax”, and published his thoughts on this rising architectural pattern. Many people bristle at the notion that a simple term, for something people were already doing, can make such an impact. “BUT I already knew how to do that!” they cry. Good for them. But the fact is, Ajax allowed the whole developer community to rally around this style of application. Events, blogs, books, libraries, tools, the whole thing flourished. I got seriously interested, recorded a podcast explaining the topic, and started diving into all the existing applications, which ultimately led to the Ajax Patterns wiki. All those applications, and many featured in the book, were created in the Dark Ages by brilliant pioneers who had scant resources to learn from. Today, such applications can be created at exponentially faster rates. This is thanks to improved knowledge, improved libraries, and improved tools.

There are historical analogies to this era, e.g. the rapid rise of the automobile industry once all the parts were in place. In that case, it was more a matter of discovering new physical and engineering principles, whereas with Ajax, it was a superficial act. The browsers were already there, with a killer feature like IE’s XMLHttpRequest sitting largely unnoticed for five years. Few people had taken the time to really understand how it all worked. It took the Ajax term to trigger the boom.
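
For the record, the core of that long-unnoticed feature is tiny. Here’s a rough sketch of the kind of asynchronous request underlying Ajax; the function name and callback are placeholders, and older IE would need the ActiveX fallback noted in the comment:

javascript

  function fetchData(url, callback) {
    var xhr = new XMLHttpRequest(); // older IE: new ActiveXObject("Microsoft.XMLHTTP")
    xhr.open("GET", url, true);     // true = asynchronous
    xhr.onreadystatechange = function() {
      if (xhr.readyState === 4 && xhr.status === 200) {
        callback(xhr.responseText); // update the page without a full refresh
      }
    };
    xhr.send(null);
  }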

HTML5 Era (Web on an Improved Platform)

(I hesitate to call this the era of HTML5, as many of the things happening are outside HTML5, notably improvements to CSS and the core Javascript/ECMAScript language. But HTML5 is increasingly becoming a brand, much like Ajax in its heyday, a term that’s familiar to anyone with a passing interest in technology.)

The starting point for this era is the success of open web technologies. While we continue to learn more each day, we have the basics in place today. We know how to harness HTML, CSS, and Javascript to make powerful libraries and applications on the existing browsers. This is where the Ajax Era landed us. We have seen some genius hacks. In fact, we have seen MANY ingenious hacks. On-Demand Javascript and JSONP, for example, are completely fundamental to the way business is done on the web these days, and they are utterly hacks. Full credit to those who came up with them, but these hacks can only go so far. They can increase programming complexity, hurt performance, and open up security threats for those not fully versed in their capabilities. And that’s just for the things we can do. The bigger problem is that we are heavily limited in our functionality with Ajax-Era technologies. We can’t do rich graphics or video, for example. So the next stage is all about improving the underlying platform. It’s not something an application programmer, however heroic, can do on their own. It’s something that has to involve improvements from the platform providers, i.e. the browser manufacturers and others.
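
To illustrate the kind of hack in question, here’s a rough sketch of JSONP: cross-domain data fetched by injecting a script tag, with the callback name agreed by convention. The endpoint URL and callback parameter here are made up, and the details vary from API to API:

javascript

  // The remote server is expected to wrap its JSON response in a call to the
  // named callback, e.g. handleData({"items": []}), which the browser runs as
  // an ordinary script, sidestepping the same-origin restriction on XMLHttpRequest.
  function handleData(data) {
    console.log("Got cross-domain data:", data);
  }
  var script = document.createElement("script");
  script.src = "http://api.example.com/items?callback=handleData"; // hypothetical endpoint
  document.getElementsByTagName("head")[0].appendChild(script);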

To draw a line in the sand, take IE6. Although it was created back in 2001, we could do far more with it in 2004 (e.g. Google Maps) than we could in 2001, and we can do far more with it in 2010 than we could in 2004. This is because of all the knowledge, libraries, and tools the Ajax Era has provided. However, we’re in serious plateau mode now. We can continue to squeeze out 1-percenters, but the low-hanging fruit was plucked and digested a long time ago. The next stage has to be an improvement to the underlying platform, and that’s what’s going on today with modern web browsers.

Once you say “okay, we’re now going to build the next-gen web platform”, you’re not just going to add support for rounded corners. So there’s been a truckload of effort going on, in HTML5 and beyond: graphics in the form of canvas, SVG, WebGL, and CSS3; media in the form of audio and video tags; semantic markup improvements; new form controls; offline storage; support for location-awareness. And much more.
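
As a taste of how little code some of these capabilities need, here’s a sketch combining two of them, storage and location-awareness, via the localStorage and geolocation APIs; error handling is omitted and the key names are arbitrary:

javascript

  // Remember the user's last known position across visits.
  if (navigator.geolocation && window.localStorage) {
    var lastLat = localStorage.getItem("lastLat");
    if (lastLat !== null) {
      console.log("Welcome back! Last seen near latitude " + lastLat);
    }
    navigator.geolocation.getCurrentPosition(function(position) {
      localStorage.setItem("lastLat", String(position.coords.latitude));
      localStorage.setItem("lastLng", String(position.coords.longitude));
    });
  }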

We have the new platform, and alongside it, we have the activity of understanding how to use the new platform. Not just understanding the parts, but the whole. How do you put all these things together in meaningful ways? Can we show custom-font text on a canvas? If so, how? And when does it make sense? And how do we avoid performance problems? Or, to take another example, what kind of CSS transforms can we use to render videos? And since we can grab image data from the videos, what are some cool things we can do with that data? And all the while, where’s the graceful degradation story here? How do these apps look in older browsers? Most of the talk so far has been on the capabilities of the platform, not patterns of usage.
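
As one concrete example of that “putting the parts together” question, here’s a sketch of grabbing pixel data from a playing video by drawing frames onto a canvas; the element IDs are placeholders, and in practice you’d also need to think about cross-origin restrictions and fallbacks for older browsers:

javascript

  var video = document.getElementById("myVideo");   // hypothetical <video> element
  var canvas = document.getElementById("myCanvas"); // hypothetical <canvas> element
  var ctx = canvas.getContext("2d");
  var timer = null;

  video.addEventListener("play", function() {
    if (timer) { return; } // don't stack up intervals on repeated plays
    timer = setInterval(function() {
      // Copy the current video frame onto the canvas...
      ctx.drawImage(video, 0, 0, canvas.width, canvas.height);
      // ...then the pixels are ours to inspect or manipulate:
      // frame.data holds RGBA values, four bytes per pixel.
      var frame = ctx.getImageData(0, 0, canvas.width, canvas.height);
      console.log(frame.data.length + " bytes of pixel data captured");
    }, 40); // roughly 25 frames per second
  }, false);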

All these new capabilities take us to the place we’ve been aiming for all along: true applications deluxe, not just “html++”. Of course, there’s plenty of room for both, but we’re now in a place where we can really build true applications with all the capabilities of a traditional native experience. We’re seeing it in a big way in the mobile space. The new model is all about sandboxing, doing away with the filesystem, and explicit permissions. Exactly what the web’s good for. Web technologies are not only a good fit technically; they’ve become the ubiquitous lingua franca among the programming community. The technologies are familiar, as are the patterns of UI design and code composition. Those patterns will continue to evolve with HTML5 and associated technologies.

We’re just scratching the surface here; there are exciting times ahead!


I’m not an official Googler yet, nor even a Noogler, as I’ve taken time off to chill, smell the roses, and pursue a few projects (like dusting off the homepage and … watch this space. Actually, watch that space). I’m starting in about a month. And of course, even when I am official, this blog continues to be my own opinions, caveat emptor yadda.

The role covers EMEA and I’m still based in London. Despite not starting just yet, I’m looking forward to participating in tonight’s Chrome/Chromium OS and HTML5 London event. See you there!

(I’m not the only one to announce a move to Google at this time. I’m looking forward to meeting and working with many of my talented future colleagues.)

Events Last Week: Web Fonts, Social Design Patterns, BT Dev Day, Real-Time Javascript

Last week saw a confluence of excellent events. Coming in the same week as a house move, it proved to be a week of much learning and little sleep. I’d hoped to do a better write-up, but it never happened – a combination of being too busy and the fact that new MAC BATTERIES SUCK, meaning the lappy couldn’t last through a whole session. But fortunately, I have some links to point to. Here’s a quick summary anyway, along with the linkage.

Web Fonts at London Web Standards

@otrops captured the live notes in glorious detail, as did Rob Crowther.

Ben is ideally placed to cover the brave new world of web fonts, being a web maker who studied fonts at uni. He walked us through the evolution of font hacks on the web: image with alt tag; CSS background image with text pushed off the page; rendering with Flash (sIFR); rendering with Canvas or SVG (Cufon, Typeface.js), using JSON-based font spec data. It all leads up to the holy grail: @font-face.

Great, so we have @font-face, but issues remain:

  • The foundries – Mark Pilgrim, in no uncertain terms, complains the font vendors are stuck in the dark ages of the printing press, in their resistance to support @font-face. This seems to be changing with WOFF, a web-only format that seems to placate the foundries, who worry their fonts will be stolen. It seems more like a symbolic gesture, since the data could still be converted and in any event the print fonts could still be appropriated, but the foundries are feeling more reassured and making signs they will go along with it.
  • Performance – bandwidth issues, plus Paul Irish’s “flash of unstyled text”, where the user notices the font change once the fancy @font-face font has been downloaded.
  • Compatibility – IE has long supported @font-face, but with EOT format fonts, and that remains the case. You therefore need both types of fonts, and licenses will generally not give you both.

Social Design Patterns

I was, needless to say, psyched about this. Yahoo! has been the closest thing to a realisation of the inspiring design pattern vision of the mid-to-late ’90s: patterns on the web, for both its own employees and the wider community to learn from and evolve. These are social design patterns mined by Christian Crumlish (@mediajunkie); in many respects, they are the closest thing software has to an analogy of building architecture, where design patterns originally came from.

There are 96 patterns in all and I’m looking forward to poring through them. In these patterns are hundreds of people-years’ experience of observing real-world social systems. In my own pattern work, I’ve found it necessary to articulate the overarching design principles behind the patterns. Pattern languages should be opinionated, so it’s a good thing to make explicit your criteria for filtering design features. Christian has followed this model too, and identified 5 overarching principles:

  • Paving the cowpaths. Facilitating the patterns that are already happening, rather than imposing your own invented process. Also means evolving with your users, e.g. Dogster started as photo sharing but evolved into a social network.
  • Talk like a person.
  • Play well with others. Open standards, mashups, etc. “if you love something, set it free”
  • Learn from games: game UI concepts, and designing the rules, but ultimately letting the people who come into the space finish the experience themselves.
  • Respect the ethical dimension.

See the wiki or the book for more details.

BT Developer Day

This was an internal conference for BTers in London covering a range of general tech trends, and also being a chance to get together and talk shop. The agenda included talks on Scala, Rails, Kanban, iPhone development, and even a lightning talk from @psd on the horrors and delights of APL.

I gave a talk on Embracing the Web, emphasising open standards and the supreme primacy of Konami Cornification.

Real-Time Javascript

At the Javascript meetup, there was a great talk on NodeJS and WebSockets. NodeJS is coming on thick and fast, and Makoto Inoue showed how the technology plays nicely with WebSockets. WebSockets are all about Comet-style interaction, so expect to see a lot more of this combo in the next couple of years.
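
I won’t try to reproduce the Node side of the demo here, but for flavour, this is roughly what the browser half of a WebSocket conversation looks like; the endpoint URL is a placeholder, and browsers without WebSocket support would need a Comet or Flash fallback:

javascript

  var socket = new WebSocket("ws://example.com/scores"); // hypothetical endpoint
  socket.onopen = function() {
    socket.send("hello");                      // push a message up to the server
  };
  socket.onmessage = function(event) {
    console.log("Server says: " + event.data); // updates arrive in real time, no polling
  };
  socket.onclose = function() {
    console.log("Connection closed");
  };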

Luis Ciprian, visiting from Brazil, gave us an overview of XMPP and talked us through a real-time web app – a basketball score and conversation tracker – using XMPP.

Fin.

Live Blogging Open Source Show and Tell (OSSAT) at TheTeam, November, 2009 (Part 1)

standard live blogging warning

I’m here at TheTeam offices in London Bridge, with a number of my Osmosoft colleagues for this open source event.

Event Announcement

The agenda:

  • Iain Farrell, Canonical – Ubuntu Update
  • Julien Fourgeaud, Symbian – Managing the Symbian community
  • Jeremy Ruston, Osmosoft – HTML5 and the slow death of Flash
  • Leisa Reichelt, Disambiguity.com – Drupal 7 Update
  • Phil Hawksworth, The Team – Playing with each other’s toys: Developing with open technologies
  • Rain Ashford, BBC – Open Source Gaming for Handhelds
  • Robbie Clutton, BT – iPhone development using web technologies

Canonical: The Ubuntu Developer Summit

Iain explains the process of updating Ubuntu. Happily enough, the release cycle is 6 months, coinciding nicely with these OSSAT events, hence the trend is set for an update each OSSAT.

Every 6 months, there’s a big Ubuntu shindig; this is (I think) just after the last release ships. All ~240 employees are invited, along with ~30 invited people from the community. There are rooms for talks – the schedule keeps evolving even during the conference – and spaces for just hacking. It’s very online too, for those who can’t make it: stereo recording in every room. All the audio, mixing, publishing, and so on is powered by the magic of open source. Naturally.

The design process is generally speaking 6 months ahead of the development process, i.e. in cycle t, they’re thinking about what they’ll be developing in cycle t+1. For users, a 6 month cycle means lots of incremental improvements, as opposed to …

Julien Fourgeaud: Symbian’s Community Programme

Symbian started as an alliance, and Nokia then purchased Symbian Ltd. When that happened, there was a move to open source the code. Julien explains the ecosystem – 90 people making up the foundation (in the US, UK, Finland, Japan, China, and elsewhere) – working with a large community of consumers, ODMs, OEMs, and operators. They often run BOFs embedded in other conferences.

What’s Symbian doing to get consumers what they want? They’re creating initiatives to get agile and get a dialogue going with consumers. [I would have liked to see more on this, but it sounds like they’re just at the start of the path here.]

Of possible relevance to the next talk, Symbian’s portal is a Flash-heavy representation of standard web technologies such as hyperlinks …

Jeremy Ruston on HTML5 and the Slow Death of Flash

Jeremy points out he’s dogfooding by presenting in Cecily, his 3D zooming and panning interface powered by open web technologies. He notes the anti-Flash vibe isn’t new, with Jakob Nielsen having declared Flash “99% bad” back in 2000. A few of the problems with Flash are:

  • Battery drain
  • Crashy
  • Can’t exploit hardware acceleration (except video and certain blitting operations) – the operations of HTML5 have been designed to take advantage of it
  • Big opaque ball of bits – can’t reach in and get at the data/content inside
  • Controlled by media interests (e.g. to ensure ads are watched)
  • Breaks the web UI, e.g. browser navigation, text selection, accessibility
  • Not properly open – the license stipulates for what purposes you can use it (restricts uses such as downloading video)
  • Not compatible with open source – can’t ship as part of open source; so a pain for users as it requires a separate download.

Jeremy shows Chrome Experiments and I Can’t Believe It’s Not Flash (the site I made with @philhawksworth which should really be launched at some point. ahem.)

He walks through the history of Flash. He explains there was a time when Flash looked quite innovative, but it’s become more about closed systems and walled gardens, with silly license fees. So a young person who wants to get into the industry is required to pay 500 quid just to get started.

Where does HTML5 fit in? A couple of weeks ago, When Can I Use launched, and the meta-message of that site – when you look at the complicated matrix of HTML feature support – can be interpreted as “Damn, HTML is complicated”. This led some to say: why would I want to use HTML5 when I can get it up and running, cross-browser compatible, with Flash right away? So Jeremy’s call to action is to take a few steps back and build some killer authoring tools that let us do similar things in Javascript.

In Q&A, @psd points out that Flash lets you make nice toys, but with HTML5 you can build something better than Flash, because you can get at the data inside.

Drupal 7 Update

Leisa Reichelt explains the design work going on with Drupal 7. Where most open source (including Drupal) is done bottom-up, grassroots stuff, there’s an element of top-down design here. Leisa’s talking about “open source design” (#d7ux), where she’s spent around 85% of her time engaging the community to help with design, rather than just doing it herself. The community includes end-users, agencies whose customers are using the CMS, and also other designers and usability studies. (Reminds me of the message in the community session at OWF – don’t just engage programmers, but also translators, designers, tech writers, etc.)

They set up a WordPress blog – yes, WordPress – to act as the community portal, because (a) with Drupal it takes too long to get a very nicely designed blog; and (b) it sent out a message to the Drupal community and got them engaged.

She also shows a nice eye-tracking study, something you don’t see much in open source land.