Double-clicking On The Web

Mattias Geniar, Friday, April 17, 2015 - last modified: Saturday, April 18, 2015

Here's a usability feature for the web: disable double-clicks on links and form submits.

Before you think I'm a complete idiot, allow me to talk some sense into the idea.

The Double-click Outside The Web

Everywhere in the Operating System, whether it's Windows or Mac OSX, the default behaviour to navigate between directories is by double-clicking them. We're trained to double-click anything.

Want to open an application? Double-click the icon. Want to open an e-mail in your mail client? Double-click the subject. Double-clicks everywhere.

Except on the web. The web is a single-click place.

Double The Click, Twice The Fun

We know we should only single-click a link. We know we should only click a form submit once. But sometimes, we double-click. Not because we do so intentionally, but because our brains are just hardwired to double-click everything.

For techies like us, a double-click happens by accident. It's an automated double-click, one we don't really think about. One we didn't mean to do.

For lesser-techies, also known as the common man or woman, double-clicks happen all the time. The user doesn't have a technical background, so they don't know the web works with single-clicks. Or perhaps they do, and simply don't see the harm in double-clicking.

But default browser behaviour is to accept user input. However foolish it may be.

If you accidentally double-click a form submit, you submit it twice. It's that simple.

10.0.1.1 - - [18/Apr/2015:00:37:06 +0400] "POST /index.php HTTP/1.1" 200 0 
10.0.1.1 - - [18/Apr/2015:00:37:07 +0400] "POST /index.php HTTP/1.1" 200 0

If you double-click a link, it opens twice.

10.0.1.1 - - [18/Apr/2015:00:37:06 +0400] "GET /index.php HTTP/1.1" 200 9105 
10.0.1.1 - - [18/Apr/2015:00:37:07 +0400] "GET /index.php HTTP/1.1" 200 9104

The problem is sort of solved by fast servers. If the page loads quickly enough, the next page may already be downloading or rendering, so the second click of that double-click hits a kind of void: the limbo between the current page and the next.

On slower servers that take more time to generate a response, the double-click still goes through and re-submits the form or re-opens the link.

Workarounds

I recently filed a feature request with our devs for a similar problem.

If you accidentally (and we've all done this) double-click a form submit, you submit it twice. That means whatever action was requested is executed by the server twice.

The client-side fix is relatively simple: disable the form's submit button after the first submit is registered. A simple jQuery snippet can handle this for you.

$(document).ready(function(){
    $("form").submit(function(){
        // Disable inputs shortly *after* the submit fires, so the POST itself
        // still goes through (see the discussion of this timeout in the comments).
        setTimeout(function() {
            $('input').attr('disabled', 'disabled');
            $('a').attr('disabled', 'disabled');
        }, 50);
    });
});

Server-side, a fix could be some kind of rate limiting or double-submit protection within a particular timeframe, but server-side this is a much harder problem to solve.
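
As a rough illustration (this sketch is mine, not something from the feature request), such protection could look like the following in a hypothetical Node.js/Express app. The in-memory object and the IP-based key are simplifications; real code would need shared storage and smarter keying.

var express = require('express');
var app = express();

var lastSubmit = {};   // key -> timestamp of the last accepted POST
var WINDOW_MS = 2000;  // ignore repeats within 2 seconds

app.post('/index.php', function (req, res) {
    var key = req.ip + ':' + req.path; // hypothetical keying scheme
    var now = Date.now();

    if (lastSubmit[key] && now - lastSubmit[key] < WINDOW_MS) {
        // Treat a repeat inside the window as an accidental double-submit.
        return res.status(429).send('Duplicate submit ignored');
    }

    lastSubmit[key] = now;
    // ... process the form normally ...
    res.send('OK');
});

app.listen(3000);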

It's 2015, why is this even a thing to consider?

Proposed Solution

I cannot think of a single reason why something like a form submit should ever be executed twice as the result of a double-click.

For a slow-responding server, it's reasonable for a user to hit submit again after more than a few seconds have passed without feedback. In the absence of any visual indication that the request is still being processed, the user assumes the form submit did not work.

So the user submits again, thinking he must have made a mistake on the first attempt. But if the browser registers the same form submit twice in less than 2 seconds, surely that's a mistake and should count as an accidental double-click?

Why should every web service implement a double-click protection, either client-side or server-side, and reinvent the wheel? Wouldn't this make for a great browser feature?

What if a double-click is blocked by default, and can be enabled again by setting a new attribute on the form?

<form action="/something.php" allowmultiplesubmits>
...
</form>

Setting the allowmultiplesubmits attribute would tell the browser to allow multiple submits of the same form on the same page; by default, without the attribute, the browser would apply some kind of flood/repeat/double-click protection.
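
There's no such attribute today, of course, but the intended behaviour can be approximated with a small page-level script. A minimal sketch, assuming the hypothetical allowmultiplesubmits attribute from the proposal above and a 2-second window:

document.addEventListener('submit', function (ev) {
    var form = ev.target;
    // The attribute is hypothetical: forms that set it opt out of protection.
    if (form.hasAttribute('allowmultiplesubmits')) return;

    var now = Date.now();
    var last = Number(form.dataset.lastSubmit || 0);
    if (now - last < 2000) {
        // Swallow what is most likely an accidental double-submit.
        ev.preventDefault();
        return;
    }
    form.dataset.lastSubmit = String(now);
}, true); // capture phase, so it runs before other submit handlers

A browser implementation would do the same natively, with the attribute as the opt-out.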

Maybe I'm overthinking it and this isn't an issue. But anyone who's active on the web has, at one point, accidentally double-clicked. And I think we've got all the technology available to fix that, once and for all.




Comments

Ed Saturday, April 18, 2015 at 11:11 -

The problem with disabling the submit button is that if the connection goes down (especially in async submissions) or the submit doesn’t work in some other way, the user can’t do anything. That means your simple double-click prevention now needs to re-enable the button after x seconds.


    Alex Saturday, April 18, 2015 at 11:30 -

    The simple solution would be to re-enable it after a few hundred milliseconds:
    it takes less than a second to double-click. And you’d have the best of both worlds: no double-clicking, and if you do need to resubmit a form, you can do so without adding extra logic or waiting for callbacks or server responses.


    Hilton Saturday, April 18, 2015 at 18:57 -

    On my sites, I simply save a timestamp when a form is submitted. If they try to submit again within 10 seconds, it simply ignores it.


Michal Saturday, April 18, 2015 at 11:27 -

Can you explain why setTimeout 50? On a good day my double click can happen below that threshold :)


    Mattias Geniar Saturday, April 18, 2015 at 23:38 -

    Something went wrong when I didn’t set a timeout, like the submit never happened in the first place. The timeout ensured a submit happened, the POST was fired to the server and the form was only disabled shortly afterwards.


      General Kafka Thursday, April 30, 2015 at 13:05 -

      Looks to me like you should disable the a and input elements immediately and re-enable them 5 seconds later instead…


Mathias Bynens Saturday, April 18, 2015 at 11:38 -

The problem with your proposal is that it breaks backwards compatibility — all existing forms that lack the allowmultiplesubmits attribute would suddenly not allow double submits anymore. And on the Web, even a seemingly logical change like that is bound to break some content. We can’t break the Web.


    Mattias Geniar Saturday, April 18, 2015 at 23:39 -

    I thought about this before I made my suggestion of making it allowmultiplesubmits.

    Is there ever a case when a form (and I’m only talking forms here, not links, not anchors, not google maps navigations, just forms) would have a legit case of being submitted twice, as an intent? I couldn’t think of a single reason that should ever happen.


      Daniel Sunday, April 19, 2015 at 02:00 -

      I actually thought of a case… Sometimes a form is submitted to another frame on the same page, like a button that runs a repeatable action. I think it’s not too uncommon, though increasingly less common.


      Peter Tuesday, April 21, 2015 at 13:01 -

      For example an “Add to shopping cart” button might (intentionally) add the product multiple times to the shopping cart if the button is pressed multiple times.

      That would be a lot easier than having to go into the shopping cart page and change the requested amount there.


        Mattias Geniar Tuesday, April 21, 2015 at 13:09 -

        Peter, Daniel,

        True, that’s a use case where multiple submits are valid and to be expected. Maybe the double-click protection should be time-based, so that any additional click within a 2-3s timeframe (to be configured?) is ignored?

        Something like a value to the attribute

        <form allowmultiplesubmits="2s">
        ...
        

          Daniel Tuesday, April 21, 2015 at 17:19 -

          Multiple submits can definitely be a huge problem, I’ve personally had to patch some forms to disable submits a number of times. It’s hard to say what an ideal solution would be. I guess for now we should build safeguards tailored to our applications… Your suggestion of adding an attribute could work well for a jQuery plugin that would parse the attributes…


Dblclicker Saturday, April 18, 2015 at 11:46 -

I don’t get why browsers don’t bind double click to the “Open in new tab” action. I’m using an extension to achieve this, and it’s glorious on touchpads.


njy Saturday, April 18, 2015 at 11:49 -

To avoid backward compatibility issues I propose the opposite attribute, and a slightly more precise name: disable-cast multiplesubmits on forms.
And even a disable-dblclick on any clickable element: that way no more click delay with full backward compatibility.


Matt Saturday, April 18, 2015 at 11:51 -

Double-clicking must die. My parents still double-click on almost everything they see, all because their first computer experiences were on my Mac SE. (Mac OS 5 or 6 or 3?)


njy Saturday, April 18, 2015 at 11:55 -

Sorry, friggin’ autocorrect :-\

To avoid backward compatibility issues I propose the opposite attribute, and a sligthly more precise name: disable-fastmultisubmits on forms.

And even more so a disable-dblclick on any element, with inheritance (so maybe via css, like the new pointer events stuff?): that way no more click delay with full backward compatibility, while avoiding pollution of dom with explicit attributes everywhere.


Andrea Giammarchi Saturday, April 18, 2015 at 12:03 -

I agree with Michal that the timeout is superfluous, plus I agree with Mathias that you cannot make a backward-hostile change like that, so I’ve written a snippet that will solve the problem without needing j-bloody-Query :D

Enjoy: https://gist.github.com/WebReflection/15fc0a2bbdd5afc7a669


Philipp Bock Saturday, April 18, 2015 at 12:58 -

Good idea in principle, although you still need to guard against unintentional/malicious submissions on the server side, which means that you should include a server-generated one-time token with any sensitive form.

There are some issues with your code though: It doesn’t work on hyperlinks (they don’t support the disabled attribute), it doesn’t work on elements that have been added to the page dynamically, and it indefinitely disables all inputs. Here’s a solution that addresses this:

(function() {
  function stopClickEvent (ev) {
    ev.preventDefault();
    ev.stopPropagation();
  }
  document.body.addEventListener('click', function (ev) {
    // getAttribute('type') can return null (e.g. for clicks on plain
    // elements), so guard before lowercasing.
    var type = (ev.target.getAttribute('type') || '').toLowerCase();
    if (ev.target.tagName === 'A' || type === 'submit') {
      setTimeout(function () {
        // Needs to happen _after_ the request goes through, hence the timeout
        ev.target.addEventListener('click', stopClickEvent);
      }, 0);
      setTimeout(function () {
        ev.target.removeEventListener('click', stopClickEvent);
      }, 500);
    }
  });
})();

This only disables the link or submit button that has been clicked on and re-enables it after half a second, which should be long enough to prevent double clicks but short enough to allow intentional resubmissions. It also attaches a single listener on the body instead of one for each form element.

You can try it out here: http://codepen.io/anon/pen/xGKdLX


    Mattias Geniar Saturday, April 18, 2015 at 23:43 -

    Hi Philipp,

    My proposal is indeed not a permanent solution. Or at least, it can’t be the only prevention against accidental double-clicks.

    The server would still need to handle duplicate submissions, either via unique tokens, unique database constraints, check-values, rate limiting, …

    Having it implemented in the browser would just facilitate the fix: a lot fewer CPU cycles would be wasted on handling duplicate requests.


Riot Saturday, April 18, 2015 at 13:43 -

There are a few exceptions where i wouldn’t agree. E.g. if you have a (Leaflet) map embedded on a page, a double click on that zooms to the clicked position. I would seriously miss that ;)

So in essence, a few elements would need to be kept out of the no-double-click scheme.


    Mattias Geniar Saturday, April 18, 2015 at 23:44 -

    Agreed, map navigation is “allowed” to have double-clicks as part of its controls.

    Perhaps it’s better to have the prevent-doubleclick attribute be set specifically by webdevs on elements they don’t want to have double-clicked, instead of a general implementation …


njy Saturday, April 18, 2015 at 13:53 -

“There are a few exceptions where i wouldn’t agree. E.g. if you have a (Leaflet) map embedded on a page, a double click on that zooms to the clicked position. I would seriously miss that ;)”

Agree, and my proposal of an enable-by-default (for backward compat) and disabled-via-attribute/css with inheritance would play nice with that, don’t you think?
If done via css, I’m thinking something like:

body{dblclick: disabled;}
.map{dblclick:enabled;}
.map a{dblclick:disabled;}

That would be nice, I think.


njy Saturday, April 18, 2015 at 14:07 -

Or, following along with the CSS proposal, we may even go as far as avoiding a simple enable/disable switch, instead going for something more customizable: allow either enable/disable or a number of ms, with “enable” meaning 300 (the default value, for backward compat) and “disable” meaning 0 (zero).
Something like this:

body{dblclick-timeout:disabled;} // means 0
.map{dblclick-timeout:300;} // default browsers behaviour, same as "enable"
.map a{dblclick-timeout:disabled;}

I’ll add that this would (possibly?) pose some problems with high values and internal timers in browsers, and realistically we would just need an on/off switch, not a numeric value.
But, anyway, just sayin’ :)

What do you think?


Tim Locke Saturday, April 18, 2015 at 17:22 -

Double-click should not exist in any user interface.

Double-click (1) was invented by Bill Atkinson at Apple while working on the Lisa, because it was thought a two button mouse would be too complicated for users (2).

I think they should have used a meta-key (Shift, Control, Alt?), combined with a single-click, to emulate right-click. Techies could buy a two button mouse. This would have eliminated the need for the difficult and confusing double-click.

(1) https://en.wikipedia.org/wiki/Double-click
(2) http://www.gearlive.com/index.php/news/article/why-apple-makes-a-one-buttoned-mouse-01280820/


Frank Saturday, April 18, 2015 at 18:55 -

The older generation is dying off, problem will fix itself.

Before you respond, I’m a fool, I know. To dismiss such a pressing problem is startling evidence of my foolishness. But taking a macro view of the world, and our individual lives, is it really worth the energy to solve a problem like the inconvenience a misplaced click can result in?

We can approach problems a multitude of ways. A solution or several is useful. But what about what happens after the solution? What does addressing double-clicks lead to? No no no, you can’t back down from the discussion without invalidating its premise. If discussing what happens after the solution is overthinking it, then the solution was overthinking the problem.

Haha, nevermind, I just sound like a dick. Great article!


whoever Saturday, April 18, 2015 at 20:09 -

I don’t get it. It’s trivial to arrange your server-side code so that if, for instance, the user tries to pay for the same OrderID twice, then you can see that they’ve already paid for it and you don’t actually ding their credit card twice. This is a basic, common sense thing to do for more reasons than double clicks, e.g., refreshes, spotty mobile connections, etc.


Cactus Joe Saturday, April 18, 2015 at 20:44 -

For critical forms that should not be submitted twice, I use a nonce: a unique hidden value in the form. Once that nonce has been accepted on the server side, form submissions with the same nonce are not saved again. The response from the second form submission is the same as what would have come from the first response.
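
As an illustration of the nonce technique described above, here is a minimal sketch in Node.js; every name is hypothetical, since the comment doesn't name an implementation.

var crypto = require('crypto');

var seenNonces = {}; // nonce -> response of the first accepted submission

function renderForm() {
    // Issue a fresh nonce and embed it as a hidden field.
    var nonce = crypto.randomBytes(16).toString('hex');
    seenNonces[nonce] = null; // issued, not yet used
    return '<form method="post">' +
           '<input type="hidden" name="nonce" value="' + nonce + '">' +
           '<input type="submit"></form>';
}

function handleSubmit(fields) {
    if (!(fields.nonce in seenNonces)) return 'Invalid or expired nonce';
    if (seenNonces[fields.nonce] !== null) {
        // Duplicate submission: replay the first response, don't redo the action.
        return seenNonces[fields.nonce];
    }
    var response = 'Saved!'; // ... perform the real action exactly once ...
    seenNonces[fields.nonce] = response;
    return response;
}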


Jon Saturday, April 18, 2015 at 22:54 -

“Everywhere in the Operating System, whether it’s Windows or Mac OSX, the default behaviour to navigate between directories is by double-clicking them. We’re trained to double-click anything.”

If you started using a GUI before 2000, this is obviously true. If you started using a GUI after 2010, it’s obviously false.

In OS X, I rarely double-click anything. You don’t need to double-click in the Dock or in Spotlight, and most people (not software developers) don’t actually use the Finder much.

I haven’t used Windows recently, but from what I’ve seen of the Surface and Windows 8, it looks like they’re moving in the same direction.

And of course if you have a smartphone or tablet, they basically don’t use double-tap anywhere for anything.

This problem is solving itself. In another 5 years, I don’t think it will even make sense to ask this question.


    Peter Tuesday, April 21, 2015 at 13:10 -

    Not entirely true. Double-clicks may still happen by accident, or (with a considerable delay) because the server doesn’t seem to respond.

    The fact that it won’t be common practice anymore in a few years does not mean it will stop happening.


GregW Sunday, April 19, 2015 at 06:07 -

The problem you are worried about is double-POSTs, not double-clicks, and the bad news is that your solution will not solve the former problem. Take this as hard-won wisdom from a web programmer who has spent days poring through packet dumps of both client and server behavior (but verify for yourself!). You will never get rid of 100% of double-POSTs hitting the server no matter what you do in or from the browser. You can reduce them, but you cannot eliminate them, for the simple reason that such double-POSTs are not all caused by double-clicks from the browser.

It turns out that TCP/IP (implemented by both client and server operating systems) has a system where every packet sent generates a corresponding ACK packet back to the sender. If the sender, the (browser-side) client OS, doesn’t receive the ACK back from the webserver in a moderately short time (~1 second?), it will retransmit the packet. All it takes is a slightly buggy cheap home router, or general network delays or queuing, and you get a logjam where the user clicked once, but the client OS sends multiple POSTs because it didn’t get a timely TCP ACK back. Often both POSTs then hit the server at the same time when the logjam clears.

You have to write idempotent server-side code; there is no way around it.


Robi Monday, April 20, 2015 at 15:26 -

Btw, links are still clickable even if you set a “disabled” attribute on them.


Rui Saturday, April 25, 2015 at 11:49 -

On a side note, this week I was building a drag-and-drop UI. When testing it, I found it convenient to double-click elements to add them quickly to the dropzone without dragging all the way. So I think there can also be usability value on the positive side.


Jibone Friday, August 14, 2015 at 10:24 -

So what would happen if I click this submit form twice?

