How do I stop duplicate POST requests?

Duplicate POST requests can happen for a variety of reasons and lead to unintended consequences such as duplicate orders or duplicate database entries. Here are some common causes of duplicate POST requests and methods to prevent them in your web application.

What Causes Duplicate POST Requests?

There are a few common causes of duplicate POST requests:

  • Users double-clicking submit buttons: If a user accidentally double-clicks a submit button, two requests are sent. This is common with forms, login screens, checkout flows, etc.
  • Retry logic: Developers sometimes implement logic that retries failed network requests. If the retries overlap or are not de-duplicated properly, the same request can be sent multiple times.
  • Browser prefetching: Browsers prefetch links on the assumption that they are safe GET requests. If an endpoint performs side effects when prefetched and again on the real click, the action effectively fires twice.
  • Page refresh: If a user refreshes the page after submitting a form, the browser offers to resend the POST request – and many users click through that warning.
  • CDNs and proxies: CDNs and proxies sometimes replay requests, which can cause duplicates.
  • Browser bugs: Occasionally, browser bugs themselves cause duplicate requests to be sent.

Solutions for Preventing Duplicate POST Requests

Here are some techniques you can use to prevent duplicate POST requests:

1. Use Post/Redirect/Get

The Post/Redirect/Get pattern involves returning a 303 redirect after a successful POST request. This redirect sends the browser to a new page, usually a “success” page. The benefits are:

  • Avoids resending the request if the user refreshes the page
  • Changes the browser behavior from resending the POST to a safe GET request
  • Refreshing after the redirect repeats only the GET, so the POST cannot be retriggered from that page

For example, after a user submits a contact form, you would return a 303 redirect to a “Message Sent” success page rather than rendering the form again. Refreshing or double-submitting from the success page only re-requests that page with a safe GET.
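As a rough, framework-agnostic sketch of the pattern (the route names, handler shapes, and in-memory store below are hypothetical, not from any particular framework):

```python
import uuid

# Hypothetical in-memory "database" of saved messages.
messages = []

def handle_contact_post(form_data):
    """Process the POST, then answer with a 303 redirect (the PRG step)."""
    messages.append({"id": str(uuid.uuid4()), "body": form_data["message"]})
    # 303 See Other tells the browser to follow up with a safe GET.
    return 303, {"Location": "/message-sent"}

def handle_success_get():
    """The redirect target: a plain page that is safe to refresh."""
    return 200, "Message sent!"
```

In a real application, the redirect would come from your framework’s redirect helper with an explicit 303 status, since some frameworks default to 302.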

2. Send No-Cache Headers

Cache-Control headers will not suppress the browser’s “resubmit form?” prompt on refresh – only Post/Redirect/Get avoids that – but they do stop the POSTed page from being served out of cache, for example when the user navigates back to it.

To do this, send the following header in your POST response:

Cache-Control: no-cache, no-store, must-revalidate

This tells the browser not to reuse its cached copy of the page that resulted from the POST.

3. Use Unique Tokens

Include a unique token with each POST request that is validated by the server. This token can be set to only allow one use.

For example, you can generate a unique UUID on page load and include it as a hidden form field. On the server, check that this token has not been used before accepting the request. The used tokens can be stored in the user session or a database to verify uniqueness.
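A minimal sketch of one-time token validation, assuming an in-memory set stands in for the session or database store:

```python
import uuid

# Hypothetical store of tokens that have already been consumed.
# In production this would live in the user session or a database.
used_tokens = set()

def issue_token():
    """Generate a fresh token to embed as a hidden form field."""
    return str(uuid.uuid4())

def consume_token(token):
    """Accept the request only if this token has never been seen before."""
    if token in used_tokens:
        return False  # duplicate submission: reject
    used_tokens.add(token)
    return True
```

In production, the check-and-mark step must be atomic (for example, a conditional database insert), otherwise two simultaneous requests could both pass the check before either records the token.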

4. Don’t Rely on autocomplete="off"

It is sometimes suggested that setting autocomplete to off on the form element disables the browser behavior that resubmits POST requests when the page is refreshed:

<form autocomplete="off">

In practice, this attribute only controls form autofill; it does not reliably prevent resubmission, and browser support is inconsistent. Prefer Post/Redirect/Get or unique tokens instead.

5. Use a Sequence Number or Nonce

Keep track of the sequence of requests on the server and reject any out-of-sequence repeats. This involves:

  • Server stores a sequence number or nonce for each client session
  • Include the sequence number in each POST request
  • Increment the stored sequence number on each accepted request
  • Reject requests that are below the expected sequence number

This ensures each POST request is unique and hasn’t been sent before. It does require storing some state on the server regarding the sequence.
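The steps above can be sketched like this, assuming a simple in-memory map of per-session counters:

```python
# Hypothetical per-session counters: session id -> next expected sequence number.
expected_seq = {}

def accept_request(session_id, seq):
    """Accept the request only if its sequence number has not been used yet."""
    current = expected_seq.get(session_id, 0)
    if seq < current:
        return False  # replayed or duplicate request: reject
    # Advance the stored counter past the accepted number.
    expected_seq[session_id] = seq + 1
    return True
```

In a multi-server deployment, the counter store would need to be shared (for example, in the session backend) so every server sees the same expected number.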

6. Disable Caching and Prefetching

Discourage caching and prefetching behavior in the browser by sending the appropriate Cache-Control and X-DNS-Prefetch-Control headers in your POST responses. For example:

Cache-Control: no-cache, no-store, must-revalidate
X-DNS-Prefetch-Control: off

Cache-Control prevents a cached copy of the POSTed page from being replayed; X-DNS-Prefetch-Control only disables speculative DNS lookups, so its effect on duplicate requests is indirect at best.

7. Use a CAPTCHA

Adding a CAPTCHA to forms can prevent duplicate submissions from automated scripts. This likely won’t prevent all user-induced duplicates (like double clicks), but does cut down on issues from bots.

8. Enforce Single Submission

Once a POST request is received on the server, you can mark it as “processed” in your system and reject additional submissions for that data. For example:

  • Mark form submissions as processed in the database
  • Check already processed status before accepting additional submissions
  • After a short period, allow submission again (blocks rapid retries while still permitting a genuine later resubmission)

This requires some state management and coordination between servers if running more than one, but can virtually eliminate duplicates.
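A rough sketch of that bookkeeping, with an in-memory map standing in for a shared database and an injectable clock for testability:

```python
import time

# Hypothetical record of when each submission key was last processed.
processed_at = {}
RESUBMIT_WINDOW = 60.0  # seconds before the same data may be submitted again

def try_process(submission_key, now=None):
    """Process a submission unless an identical one arrived recently."""
    now = time.monotonic() if now is None else now
    last = processed_at.get(submission_key)
    if last is not None and now - last < RESUBMIT_WINDOW:
        return False  # duplicate within the window: reject
    processed_at[submission_key] = now
    return True
```

The submission key would typically be a hash of the form contents plus the user ID, so identical data from the same user is treated as a repeat.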

9. Alert Users

Provide visual feedback in the browser that the form has already been submitted. This gives the user a clue not to resubmit accidentally.

For example, you can:

  • Change submit button text to “Submitted!”
  • Disable submit button after click
  • Show message “Your form has been submitted successfully”

This provides a good user experience and cuts down on accidental duplicate clicking.

10. Handle Duplicates Gracefully

Despite best efforts, duplicates may still occur. Often it’s a good idea to handle duplicates gracefully in code:

  • Idempotency checks before processing
  • Require clients to attach an idempotency key and reuse the same key on retries
  • Return the original success response for duplicate requests
  • Have duplicate detection in processing code
  • Apply “last write wins” when duplicates conflict

This minimizes negative impacts if a duplicate does make it through.
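For example, an idempotency-key cache can replay the original success response instead of re-processing. The payment function here is a hypothetical stand-in for the real side effect:

```python
# Hypothetical idempotency cache: key -> response returned the first time.
responses = {}

def process_payment(amount):
    """Stand-in for the real side effect (charging a card, etc.)."""
    return {"status": "charged", "amount": amount}

def handle(idempotency_key, amount):
    """Replay the stored response for a repeated key instead of re-processing."""
    if idempotency_key in responses:
        return responses[idempotency_key]  # duplicate: cached success, no new charge
    result = process_payment(amount)
    responses[idempotency_key] = result
    return result
```

Because the duplicate receives the same success response as the original, a retrying client sees a normal result while the side effect runs only once.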

Client-Side Techniques

In addition to server-side fixes, there are also useful client-side techniques for preventing duplicate submissions:

  • Disable the submit button after the first click via JavaScript
  • Intercept the form’s submit event and ignore repeat submissions
  • Show a progress indicator after submit to discourage re-clicks
  • Use a JavaScript flag to allow only one submit per page load
  • Check on the client whether the form has already been submitted before sending again

These techniques improve the user experience and cut down on accidental duplicate interactions.

When Are Duplicates Allowed?

There are some cases where duplicate requests are expected and allowed:

  • Idempotent requests – GET is safe, and PUT and DELETE are defined as idempotent, so repeating them produces no additional effect.
  • Retries – Clients may retry failed requests, which appear to the server as duplicates.
  • Multi-page forms – A form may POST intermediate updates across multiple pages before final submission.
  • Event handlers – Some event-handler code legitimately fires the same submission more than once.

In these cases, the server-side logic needs to handle the duplicates appropriately.

Other Considerations

  • Duplicate detection can be difficult in distributed systems – often requires centralized coordination.
  • Duplicates degrade performance and efficiency – so it is important to minimize them.
  • Duplicates can lead to over-billing or other unintended business impacts.
  • User perception is important – duplicates make the system seem broken.
  • Have monitoring to detect duplicate problems early.
  • Duplicate prevention needs coordinated efforts on both client and server side.

Conclusion

Duplicate POST requests can be surprisingly common and lead to significant problems in web applications. With a combination of server-side and client-side techniques, duplicates can be minimized for a smooth user experience. Key methods include Post/Redirect/Get, unique request tokens, sequence numbers, no-cache headers, and graceful duplicate handling in processing code.

What techniques have you found effective for preventing duplicate requests? Are there any other common causes of duplicates not mentioned here? Please share your thoughts and experiences dealing with duplicate POST requests.