Double-clicking On The Web



Mattias Geniar, April 17, 2015


Here’s a usability feature for the web: disable double-clicks on links and form submits.

Before you think I’m a complete idiot, allow me to talk some sense into the idea.

The Double-click Outside The Web

Everywhere in the operating system, whether it’s Windows or Mac OS X, the default way to navigate between directories is to double-click them. We’re trained to double-click everything.

Want to open an application? Double-click the icon. Want to open an e-mail in your mail client? Double-click the subject. Double-clicks everywhere.

Except on the web. The web is a single-click place.

Double The Click, Twice The Fun

We know we should only single-click a link. We know we should only click a form submit once. But sometimes, we double-click. Not because we do so intentionally, but because our brains are just hardwired to double-click everything.

For techies like us, a double-click happens by accident. It’s an automated double-click, one we don’t really think about. One we didn’t mean to do.

For lesser-techies, also known as the common man or woman, double-clicks happen all the time. The user doesn’t have a technical background, so they don’t know the web works with single-clicks. Or perhaps they do, and don’t see the harm in double-clicking.

But the default browser behaviour is to accept user input, however foolish it may be.

If you accidentally double-click a form submit, you submit it twice. It’s that simple.

10.0.1.1 - - [18/Apr/2015:00:37:06 +0400] "POST /index.php HTTP/1.1" 200 0 
10.0.1.1 - - [18/Apr/2015:00:37:07 +0400] "POST /index.php HTTP/1.1" 200 0

If you double-click a link, it opens twice.

10.0.1.1 - - [18/Apr/2015:00:37:06 +0400] "GET /index.php HTTP/1.1" 200 9105 
10.0.1.1 - - [18/Apr/2015:00:37:07 +0400] "GET /index.php HTTP/1.1" 200 9104

The problem is sort of solved with fast servers. If the page loads fast enough, the next page may already be downloading/rendering, so the second click of that double-click is hitting some kind of void, the limbo in between the current and the next page.

For slower servers, that just take more time to generate a response, a double-click would still happen and re-submit or re-open a link.

Workarounds

I recently filed a feature request at our devs for a similar problem.

If you accidentally (and we’ve all done this) double-click a form submit, you submit it twice. That means whatever action was requested, is executed by the server twice.

The client-side fix is relatively simple: disable the form submit button after the first submit has been registered. There’s a simple jQuery snippet that can solve this for you.

$(document).ready(function() {
    $("form").submit(function() {
        // Delay slightly, so the clicked submit button's value is still
        // included in the request before the inputs get disabled
        setTimeout(function() {
            $('input').prop('disabled', true);
        }, 50);
        // Note: the 'disabled' attribute has no effect on <a> elements,
        // so links would need a click handler instead
    });
});

Server-side, a fix could be to implement some kind of rate limiting or double-submit protection within a particular timeframe, but that’s a much harder problem to solve.
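As a rough sketch of what such server-side protection could look like (the helper below is hypothetical, not from any framework): keep a timestamp per client-and-form key, and reject a repeat submit that arrives within a short window.

```javascript
// Hypothetical double-submit guard: reject a repeated submit of the
// same form by the same client within a short timeframe.
const recentSubmits = new Map(); // key -> timestamp of last accepted submit

function acceptSubmit(key, now, windowMs = 2000) {
    const last = recentSubmits.get(key);
    if (last !== undefined && now - last < windowMs) {
        return false; // looks like an accidental double-click: ignore it
    }
    recentSubmits.set(key, now);
    return true;
}

// Example: two POSTs one second apart, as in the access log above
console.log(acceptSubmit('10.0.1.1:/index.php', 0));    // true
console.log(acceptSubmit('10.0.1.1:/index.php', 1000)); // false, inside the window
```

In a real application the key would combine something like the session ID and the form’s action URL, and the map would need expiry to avoid growing forever.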

It’s 2015, why is this even a thing to consider?

Proposed Solution

I cannot think of a single reason why something like a form submit should have to be executed twice as a result of a double-click.

For a slow responding server, it’s reasonable for a user to hit the submit again after more than a few seconds have passed and no feedback has been given. Because of the lack of visual feedback that the request is still being processed, the expectation has been raised that the form submit did not work.

So the user submits again, thinking the first attempt must have failed. If the same form submit has been registered by the browser in less than 2 seconds, surely that must have been a mistake and would count as an accidental double-click?

Why should every web service implement a double-click protection, either client-side or server-side, and reinvent the wheel? Wouldn’t this make for a great browser feature?

What if a double-click is blocked by default, and can be enabled again by setting a new attribute on the form?

<form action="/something.php" allowmultiplesubmits>
...
</form>

Setting the allowmultiplesubmits attribute causes the browser to allow multiple submits to the same form in the same page, and by default the browser has some kind of flood/repeat/double-click protection to prevent this.
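Until browsers do something like this natively, the behaviour could be emulated with a page-wide submit handler. The `allowmultiplesubmits` attribute is the proposal above; everything else in this sketch is a hypothetical polyfill, not a real API.

```javascript
// Hypothetical polyfill for the proposed default: block repeat submits
// of a form unless it carries the allowmultiplesubmits attribute.
function shouldBlockSubmit(alreadySubmitted, allowMultiple) {
    // Block the second and later submits unless the form opts out
    return alreadySubmitted && !allowMultiple;
}

if (typeof document !== 'undefined') {
    document.addEventListener('submit', function (event) {
        var form = event.target;
        var allow = form.hasAttribute('allowmultiplesubmits');
        if (shouldBlockSubmit(form.dataset.submitted === 'true', allow)) {
            event.preventDefault(); // swallow the accidental double-click
            return;
        }
        form.dataset.submitted = 'true';
    }, true);
}
```

Capturing the event at the document level means every form on the page gets the protection without extra markup, which is roughly what a browser default would feel like.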

Maybe I’m over thinking it and this isn’t an issue. But anyone who’s active on the web has, at one point, accidentally double-clicked. And I think we’ve got all the technology available to fix that, once and for all.


