Yesterday, Marty McGuire published an audio version of the IndieWeb "This Week" newsletter, and shared the link in our chat. Aside from being a fantastic production, it also sparked some discussions about Webmentions. His podcast uses a few of my 100Days songs, which he cites in the description. Since he included a link to my posts about those songs, his podcast now appears as a comment on those pages!
In the chat, Tantek posed the question of how a post like Marty's podcast could link to the audio it uses and indicate the range where that piece appears. There is already a Media Fragments URI spec, published in 2012, but it was at first unclear how widely it had been implemented, if at all. The spec is also kind of complicated. It starts out simple, but then piles on a bunch of features that seem to come out of nowhere. The simple part is based on YouTube's fragment syntax for linking to specific times in a video.
In the Media Fragments spec, if you visit a URL of a media file with a fragment of "#t=10", the browser should immediately skip to 10 seconds into the file. There is also a syntax for specifying a range, such as "#t=10,15" which indicates the range from 10 to 15 seconds.
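To make that syntax concrete, here's a rough sketch (the media URL is made up); in a browser that supports Media Fragments, the fragment is applied when the URL is loaded into a media element:

```javascript
// The two basic forms of the temporal fragment (hypothetical media URL):
var startOnly = "https://example.com/audio.mp3#t=10";    // start playback at 10 seconds
var range     = "https://example.com/audio.mp3#t=10,15"; // the range from 10 to 15 seconds

// A supporting browser applies the fragment when the URL is used as a media source
var audio = new Audio(startOnly);
audio.play(); // playback should begin at the 10 second mark
```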
Thankfully, the spec links to an implementation report, on which I was somewhat surprised to see that both Firefox and WebKit have implemented at least basic support! I quickly tried this out on Chrome, Firefox and Safari on my computer, and sure enough, all three browsers have support! If you want to check it out, click on this link, which should start playing the video at 30 seconds: https://aaronparecki.com/2017/02/17/9/video.mp4#t=30
I was curious if there was a way to take advantage of this browser support to quickly implement the feature like YouTube has, where I could send time-offset links to my HTML pages that include the media. I figured if I could update the <video> and <audio> src attribute with JavaScript, then I could make it work. This led me down a path of learning about the HTML5 media tags, where I learned that changing the src attribute of a <video>'s inner <source> tag doesn't do anything. As the HTML spec puts it:
> Dynamically modifying a source element and its attribute when the element is already inserted in a video or audio element will have no effect. To change what is playing, just use the src attribute on the media element directly.
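Here's a minimal sketch of what that means in practice, assuming a page with a <video> element that has an inner <source> tag:

```javascript
var video = document.querySelector("video");
var source = video.querySelector("source");

// Has no effect: the <source> element is only consulted when the
// media element first selects its resource
source.src = source.src + "#t=30";

// Works: setting src on the media element itself loads the new URL
video.src = video.currentSrc + "#t=30";
```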
So here's the original JavaScript snippet I came up with to put this together.
```javascript
document.addEventListener("DOMContentLoaded", function(event) {
  // Apply time offset from this page's URL to any media on the page
  if(window.location.hash && window.location.hash.match(/^#t=/)) {
    document.querySelectorAll("video,audio").forEach(function(el){
      el.src = el.currentSrc + window.location.hash;
    });
  }
});
```
Summary: If the page URL has a fragment, and that fragment starts with t=, then find all the video and audio tags on the page, and update their URL to include that fragment.
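For example (URLs hypothetical): if this page's URL were https://example.com/day59#t=30 and a video on it had a currentSrc of https://example.com/video.mp4, the snippet would set that element's src to https://example.com/video.mp4#t=30, and a supporting browser would seek to 30 seconds.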
This actually works great, but I wanted to take it further. I left this example here because it's easier to read than the snippet that follows. To round out the whole experience, I wanted a few things:

- If you pause the video, the page URL fragment should update to the time offset at which you paused. This makes generating these fragment URLs easier.
- If you change the fragment in the page URL manually, it should be copied to the media again. This also makes it easier to generate these fragment URLs.
Here is the code that accomplishes all of these.
```javascript
function cloneMediaFragment() {
  // Check that the fragment is a Media Fragment (starts with t=)
  if(window.location.hash && window.location.hash.match(/^#t=/)) {
    // Find any video and audio tags on the page
    document.querySelectorAll("video,audio").forEach(function(el){
      // Create a virtual element to use as a URI parser
      var parser = document.createElement('a');
      parser.href = el.currentSrc;
      // Replace the hash
      parser.hash = window.location.hash;
      // Set the src of the video/audio tag to the full URL
      el.src = parser.href;
    });
  }
}

document.addEventListener("DOMContentLoaded", function() {
  cloneMediaFragment();

  // When the media is paused, update the fragment of the page
  document.querySelectorAll("video,audio").forEach(function(el) {
    el.addEventListener("pause", function(event) {
      // Update the media fragment to the current time
      // Use replaceState to avoid triggering the "hashchange" listener below
      history.replaceState(null, null, "#t=" + Math.round(event.target.currentTime));
    });
  });
});

// If the user changes the hash manually, clone the fragment to the media URLs
window.addEventListener("hashchange", cloneMediaFragment);
```
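One note on the pause handler: it uses history.replaceState rather than assigning window.location.hash directly, because setting the hash would both fire the hashchange listener (re-cloning the fragment and reloading the media) and push a new history entry every time you pause.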
As you can see, it's not quite as simple as the first snippet, but it accomplishes all of the goals. It's written in plain JavaScript, so you can easily drop it into your own site on any pages that have video or audio players, and it will do the rest!
Now when you visit any of my pages that include audio or video, you can quickly get links to specific time offsets!
Try it out! Jump to 36 seconds into this video: https://aaronparecki.com/2017/02/17/9/day59#t=36
I've published this on GitHub so you can keep up with the latest version of the code there!