Thoroughly validate form data and allow users to review their submissions to ensure they do not submit unintended data.



Filling out forms presents several challenges for users, from ensuring they have specified all required fields to ensuring all fields have been filled in correctly. When fields are missed or errors occur, it is not always easy for users with disabilities to locate the problems or determine what needs to be fixed.

When to Validate

When user input is used solely by a digital publication, data validation occurs on the page that is requesting the data.

It is only when the author attempts to send data to an external web application that server-side validation may occur. Even in these cases, the author will most likely use a scripting API like XMLHttpRequest and report errors without refreshing the page. Forms typically cannot be submitted directly to a web server without spawning a new window, as the return page will not be recognized as a resource of the publication.

This document consequently only reviews methods for validating and reporting errors from within a digital publication. For more information about handling errors in a page returned from a web application, see the Web Accessibility Tutorial on User Notifications.

Built-in Checks

The best option for validating input is to use the mechanisms built into HTML.

The required attribute, for example, is available on most form input elements. It provides a programmatic means of identifying which fields a user must fill in (sighted users will still require a visual cue).

Example — Identifying a required field

   <select required>
      <option value="">Month</option>
      …
   </select>

When the required attribute is set, users cannot submit the form in reading systems that support forms until the fields are completed. The reading system will alert them that required information is missing if they try.

The pattern attribute allows authors to specify a regular expression to validate the input against. This attribute is useful when the input must match a specific form.

Example — Pattern matching two comma-separated numbers in parentheses

    <input type="text"
        pattern="\([0-9]{1,4}, *[0-9]{1,4}\)"
        … />
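
Because pattern expressions are easy to get wrong, it can help to sanity-check them outside the page. A minimal sketch in JavaScript (note that HTML applies the pattern to the whole value, so the equivalent expression is wrapped in `^` and `$`):

```javascript
// HTML's pattern attribute is implicitly anchored, so the equivalent
// JavaScript expression wraps the pattern in ^ and $. The escaped
// parentheses match literal "(" and ")" characters.
const coordinate = /^\([0-9]{1,4}, *[0-9]{1,4}\)$/;

console.log(coordinate.test("(12, 345)"));  // true – a valid pair
console.log(coordinate.test("12, 345"));    // false – parentheses missing
console.log(coordinate.test("(12345, 6)")); // false – first number too long
```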

Authors can also use the placeholder attribute to provide a hint about the expected value, which is especially useful when specific patterns are expected. Note, however, that this value disappears once the user begins to input their data, so it should not be used in place of a proper label or description for the field.

Example — Hinting at the expected input

    <input type="text"
        placeholder="(x, y)"
        pattern="\([0-9]{1,4}, *[0-9]{1,4}\)"
        aria-label="Starting coordinate"
        … />

The autocomplete attribute provides another potential aid for ensuring that users input the right information. Setting this attribute allows users to automatically insert saved form data.

Example — Identifying an autocomplete field

    <input type="text"
        autocomplete="name"
        … />

The drawbacks of this attribute for publishing are that most forms do not collect user information (they are generally for testing knowledge), and application-based reading systems do not store previously submitted data for reuse. The attribute may consequently only work for limited information, and only in reading systems that run in web browsers.

There are also attributes for specifying minimum and maximum values, minimum and maximum numbers of characters to input, and for using a regular expression to verify the text input by a user.

Example — Specifying minimum and maximum values

    <input type="number"
        min="1"
        max="10"
        … />

The type attribute on the input element is often overlooked for validation, but it allows authors to identify that the input represents a common field like a phone number or email address.

Example — Email input field

    <input type="email"
        … />

These specialized input types often include their own built-in validation. For example, user agents will check that the user has input a valid email address when an email field is specified. Using these built-in types avoids the need to write complex and redundant pattern checks for generic text input fields.

Locating Input Errors

Although using the built-in validation mechanisms allows reading systems to handle informing users when there are errors, it is not possible to use these mechanisms for all data validation. When validating custom data fields, it is necessary to make the process of finding and fixing the errors as easy as possible for users.

A common approach to manual validation is to write a list of input errors in a visually distinct, and clearly labelled, box at the top of the form. While this technique is generally fine, to help users of assistive technologies, the element containing the errors should be marked as a live region using the ARIA role alert. This allows assistive technologies to announce the new text when it is written to the element.

Example — Creating a live region

   <div role="alert">
      <h6 id="errors">
         Errors Processing Form
      </h6>
      <p>The following items did not validate:</p>
      …
   </div>

JavaScript alert boxes should be avoided as the sole means for listing errors, as once the dialog is closed the user typically loses access to the list. For users with cognitive and learning disabilities, remembering multiple fields that need to be corrected can be difficult.

A better option would be to limit the alert to notifying the user that the submission contained errors that need fixing and direct them to the list on the page when the dialog closes.

When listing the errors, it helps to provide links to the invalid fields.

Example — Linking to invalid field

<p>The following items did not validate:</p>
<ul>
   <li><a href="#postcode">Invalid postal
      code entered</a></li>
   …
</ul>
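
When the list is generated by a script, the same links can be built directly from the validation results. A hypothetical helper (the field ids and messages here are only illustrative):

```javascript
// Hypothetical helper: turns validation results into the list items
// written into the error list, each linking to the invalid field.
function formatErrorList(errors) {
  return errors
    .map(err => `<li><a href="#${err.fieldId}">${err.message}</a></li>`)
    .join("\n");
}

console.log(formatErrorList([
  { fieldId: "postcode", message: "Invalid postal code entered" }
]));
```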

For digital publications, it is often a better option to validate user input as it is entered. This will avoid having to add and remove dynamic lists of errors.

While invalid fields should, of course, be visually identified so that sighted users can locate them as they review the data, ARIA includes the aria-invalid attribute for programmatically marking the invalid fields.

Example — Using ARIA to identify errors

    <input type="text"
        id="postcode"
        aria-invalid="true"
        … />

Setting the aria-invalid attribute allows assistive technologies to easily move users to the fields that are in error so they are not reliant on a list of errors or having to manually traverse every field in the form to find out what failed.

When users fix the errors, make sure to update or remove the aria-invalid attribute. (Be aware that setting aria-invalid="" means that the field is not invalid.)
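
Validating as the user types can be sketched as a pure check wired to the field's input event. The Canadian-style postal code rule below is only illustrative, and the event wiring assumes a browser context:

```javascript
// Illustrative check for a Canadian-style postal code (A1A 1A1).
const postalCode = /^[A-Za-z][0-9][A-Za-z] ?[0-9][A-Za-z][0-9]$/;

function isValidPostalCode(value) {
  return postalCode.test(value.trim());
}

// In the publication, the check would be attached to the field's input
// event so the aria-invalid state tracks the value as the user types:
//
//   field.addEventListener("input", () => {
//     field.setAttribute("aria-invalid",
//        String(!isValidPostalCode(field.value)));
//   });
```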

Describing Errors

In addition to locating errors, users also benefit from explanations of why fields are marked invalid. The ARIA aria-errormessage attribute allows authors to link a description of the issue to the invalid field.

Example — Using ARIA to identify error messages

<div class="error">
   <label for="postcode">
      Postal Code:
   </label>
   <input type="text"
       id="postcode"
       aria-invalid="true"
       aria-errormessage="post-err"
       … />
   <span id="post-err"
       aria-live="assertive">
      Postal code must be of the
      form A1A 1A1.
   </span>
</div>

The aria-errormessage attribute is similar in function to aria-describedby but more clearly indicates that the attached description explains the issue. (Note that aria-errormessage is only allowed on an element when aria-invalid is also set on it.)

The aria-live attribute can also be set on the error message so that it gets announced to users when it is added (see the preceding example). If individual error fields are identified this way, it is not recommended to also alert users when adding a full list of issues (i.e., it will cause users to hear all the errors twice).

While it may only be possible to tell the user that the data they input is invalid in some cases, if the specific reason their data is invalid can be determined it is better to be as precise as possible in the error message. The more information you can provide users, the easier it will be for them to correct the error.
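
As a sketch of this principle, a message function can distinguish the common failure cases instead of returning one generic error (the checks and wording here are only examples):

```javascript
// Returns a specific message for each way the value can fail, or null
// when the value is acceptable. The checks and wording are examples.
function describeError(value) {
  if (value.trim() === "") {
    return "This field is required.";
  }
  if (!/^\([0-9]{1,4}, *[0-9]{1,4}\)$/.test(value)) {
    return "Enter a coordinate pair such as (12, 34).";
  }
  return null;
}
```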

Confirming Submissions

Ensuring that form data validates to the author's expectations does not mean users have input the information they intended to. A user may not notice a typo when inputting information, for example, or a field may accidentally change without their noticing (e.g., hitting arrow keys can sometimes accidentally change selection box and radio button choices).

Providing users the option to review their data before submitting it is a helpful way to ensure that the information matches their expectations, especially as a detailed review of form data in its raw form can be challenging for all users.
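
A review step does not need to be elaborate; pairing each field's label with the entered value is often enough for users to spot mistakes. A hypothetical summary builder:

```javascript
// Hypothetical summary builder for a review step: pairs each field's
// label with the value the user entered so the submission can be shown
// back in plain language before it is sent.
function summarizeSubmission(fields) {
  return fields.map(f => `${f.label}: ${f.value || "(not filled in)"}`);
}

console.log(summarizeSubmission([
  { label: "Name", value: "Ada" },
  { label: "Postal Code", value: "" }
]));
```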

Providing confirmation is one way to meet the Level AA requirement for test data, so educational publishers need to be aware of when the success criterion applies to tests and quizzes they embed in a digital publication. The alternative is having to fix data for users, or allowing them to undo and resubmit, which is covered in the next section.

It is often difficult to provide this sort of review within a digital publication, however, because it requires dynamically writing the information into the same page as the form, which not all reading systems support well. Similarly, a JavaScript alert may work for very small data sets, but it quickly becomes unwieldy to read for large data submissions.

If the author knows users will be able to submit a form directly to their server (i.e., the reading system will open a new browser window to submit the data rather than send the data through a JavaScript API), a confirmation page could be provided on the server side. This option is generally only available in select cases where the range of reading systems users have access to is limited.


Data that stays exclusively within the digital publication is often less critical to review than data sent to an external source, as the potential harm to users from mistakes tends to be less significant. For example, it is less critical to review a practice exam marked within a digital publication than to review course exercises submitted to a school's servers.

Reverting Submissions

As mentioned in the last section, it is much better to allow users to review their data before submitting it than to provide options for reverting and correcting.

This is especially true with digital publications, as the means of retrieving and fixing data are complicated by the restrictions of the format. Persisting information within a digital publication is not easy, for example, as storage options are limited (i.e., users cannot easily come back later to undo a form submission if they close the publication). Similarly, network issues can lead to data being submitted without the publication receiving confirmation, leaving users unable to undo the submission.

Authors should strongly consider other options for collecting information if it is necessary to allow users to revert submissions. For example, whenever possible, direct users to a form outside the publication to collect information rather than embed the form.

If submissions must occur through a digital publication, reverting and correcting them may only realistically be possible through an external web site that can confirm the user's identity. For example, a course portal could provide the option to revert or fix mistakes rather than trying to accomplish this through the publication itself.

Related Links