
Labelling forms, images and links: screen reader compatibility

Last updated: April 7, 2019

Screen reader compatibility test results for labelling, showing how labelling techniques and common labelling failures behave in different screen reader / browser combinations.

The results include two types of test: techniques using conformant HTML or WCAG sufficient techniques that should work, and authoring errors that should fail.

Reliability by user agent

The solid area in the graph shows the percentage of tests that pass in all tested interaction modes. The cross-hatched area shows partial passes that only work in some interaction modes. An example of a partial pass is a form label that is read when tabbing to the control, but ignored in browse mode.

Combo | Versions | Reliability
JAWS IE | JAWS 2018.18.11.2 with IE11 | 96%
JAWS Firefox | JAWS 2018.18.11.2 with FF60 | 87%
NVDA IE | NVDA 2018.4 with IE11 | 74%
NVDA Firefox | NVDA 2018.4 with FF60 | 92%
VoiceOver Mac | VoiceOver macOS 10.13 with Safari 12.1 | 91%
VoiceOver iOS | VoiceOver iOS 11.4 with Safari iOS 11.4 | 79%
WindowEyes IE | WindowEyes 9.2 with IE11 | 94%
Dolphin IE | Dolphin SR 15.05 with IE11 | 77%
SaToGo IE | SaToGo 3.4.96.0 with IE11 | 53%
Average | Including older versions | 83%

The average includes all versions, but some browser/AT combinations have tests for multiple versions (NVDA / JAWS / VoiceOver), while others only have tests for a single version (SaToGo and Dolphin).

Reliability trend

Reliability by year: 2013: 82%, 2014: 82%, 2015: 85%, 2016: 85%, 2017: 87%, 2018: 87%.

Works as expected

These tests use conformant HTML or WCAG sufficient techniques, and work in all tested browser / screen reader combinations.

Every tested combination (NVDA and JAWS with IE11 and Firefox, VoiceOver on macOS and iOS, and Window-Eyes, Dolphin and SaToGo with IE11) rated the following techniques Good, working in 100% of cases:

A link containing only an IMG with ALT
A link containing only an IMG with TITLE
BUTTON with TITLE containing only an IMG with null ALT
IMG with ALT
IMG with TITLE
IMG with null ALT attribute
INPUT type=image with ALT attribute
INPUT type=text inside LABEL with text before control
INPUT type=text with LABEL FOR
INPUT type=text with aria-labelledby attribute
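For instance, the universally supported patterns above look like the following markup (file names, ids and label text are illustrative, not from the test suite):

```html
<!-- A link containing only an IMG with ALT: the alt text becomes the link name -->
<a href="/home"><img src="logo.png" alt="Home"></a>

<!-- IMG with null ALT: marks a decorative image, correctly skipped by screen readers -->
<img src="divider.png" alt="">

<!-- INPUT type=text with LABEL FOR: explicit association via a matching id -->
<label for="email">Email address</label>
<input type="text" id="email" name="email">

<!-- INPUT type=text with aria-labelledby: the accessible name comes from the referenced element -->
<span id="phone-label">Phone number</span>
<input type="text" aria-labelledby="phone-label">
```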

Expected to work

These tests use conformant HTML or WCAG sufficient techniques and might be expected to work in screen readers. This doesn't always happen.

Test (should work) | Fails in | NVDA IE | NVDA FF | JAWS IE | JAWS FF | VoiceOver Mac | VoiceOver iOS | Win-Eyes IE | Dolphin IE | SaToGo IE
A "click here" link with TITLE attribute | 1% - 25% | Bad | Good | Better | Better | Better | Worse | Bad | Bad | Good
A "click here" link with aria-describedby attribute | 1% - 25% | Good | Good | Better | Better | Better | Good | Bad | Bad | Good
ABBR with title | 76% - 100% | Bad | Bad | Bad | Bad | Bad | Bad | – | Bad | Bad
AREA and IMG with ALT attributes | 26% - 50% | Bad | Better | Good | Bad | Better | Bad | Better | Good | Good
AREA with ALT attribute and IMG with null ALT | 1% - 25% | Bad | Good | Good | Bad | Good | Good | – | – | –
AREA with TITLE attribute | 51% - 75% | Bad | Bad | Good | Bad | Better | Bad | Better | Bad | Good
AREA with aria-label attribute | 26% - 50% | Bad | Better | Better | Bad | Better | Bad | Better | Bad | Bad
AREA with aria-labelledby attribute | 51% - 75% | Bad | Bad | Better | Bad | Bad | Bad | Better | Bad | Bad
BUTTON containing only an IMG with TITLE attribute | 26% - 50% | Better | Good | Better | Good | Bad | Bad | Better | Good | Good
BUTTON containing only an IMG with aria-label | 1% - 25% | Better | Good | Better | Good | Good | Good | Better | Good | Bad
BUTTON containing only an IMG with aria-labelledby | 26% - 50% | Bad | Good | Better | Bad | Good | Good | Better | Good | Bad
BUTTON with aria-label containing only an IMG with null ALT | 1% - 25% | Good | Good | Better | Good | Good | Good | Good | Good | Bad
FIELDSET containing links | 26% - 50% | Bad | Better | Bad | Good | Better | Bad | Bad | Bad | Bad
IFRAME with fallback content | 1% - 25% | Good | Bad | Good | Better | Bad | Good | Better | Good | Good
IMG with FIGCAPTION | 26% - 50% | Good | Good | Good | Better | Bad | Bad | Good | Good | Good
IMG with aria-label | 1% - 25% | Better | Good | Good | Good | Good | Good | Good | Bad | Bad
IMG with aria-labelledby | 1% - 25% | Bad | Good | Good | Good | Good | Good | Better | Bad | Bad
INPUT type=image with TITLE attribute | 1% - 25% | Good | Good | Good | Good | Good | Good | Good | Good | Bad
INPUT type=text inside LABEL with text after control | 1% - 25% | Good | Good | Good | Good | Good | Good | Good | Good | Bad
INPUT type=text inside LABEL with text before and after control | 1% - 25% | Good | Good | Good | Good | Good | Good | Good | Good | Bad
INPUT type=text with aria-describedby attribute | 1% - 25% | Good | Good | Better | Better | Worse | Good | Better | Bad | Good
Link text replaced by `aria-label` attribute | 1% - 25% | Better | Better | Good | Bad | Good | Good | Better | Good | Bad
Link text replaced by `aria-labelledby` attribute | 1% - 25% | Bad | Better | Better | Good | Good | Good | Good | Good | Bad
Yes/No radio buttons inside `fieldset` element | 1% - 25% | Good | Good | Good | Good | Better | Better | Better | Bad | Bad
`button` element containing only an `img` with an `alt` attribute | 1% - 25% | Better | Good | Better | Good | Good | Good | Better | Good | Bad
`iframe` with `title` attribute | 1% - 25% | Bad | Bad | Good | Better | Good | Good | Good | Good | Good
`input type=image` with `aria-label` attribute | 1% - 25% | Good | Good | Better | Good | Good | Good | Good | Good | Bad
`input type=image` with `aria-labelledby` attribute | 1% - 25% | Bad | Good | Better | Good | Good | Good | Better | Good | Bad
`input type=text` with `aria-label` attribute | 1% - 25% | Bad | Better | Good | Good | Good | Good | Good | Good | Bad
`input type=text` with `title` attribute | 1% - 25% | Bad | Better | Good | Good | Good | Good | Good | Good | Bad
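A few of these partially supported patterns are sketched below; the markup is illustrative (file names, ids and text are not from the test suite):

```html
<!-- AREA with TITLE attribute: conformant, but failed in over half of the combinations -->
<img src="map.png" alt="Office locations" usemap="#offices">
<map name="offices">
  <area shape="rect" coords="0,0,50,50" href="/london" title="London office">
</map>

<!-- input type=text with title attribute: a sufficient technique, but rated Bad in NVDA with IE -->
<input type="text" name="search" title="Search terms">

<!-- Link text replaced by aria-label: the accessible name overrides "click here" -->
<a href="/report" aria-label="Download the annual report">click here</a>
```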

Expected to fail

These tests use non-conformant HTML or WCAG failures and are expected to fail in screen readers.

Test (should fail) | Fails in | NVDA IE | NVDA FF | JAWS IE | JAWS FF | VoiceOver Mac | VoiceOver iOS | Win-Eyes IE | Dolphin IE | SaToGo IE
A link containing only an IMG with null ALT | 76% - 100% | Bad | Bad | Bad | Bad | Bad | Bad | Bad | Bad | Bad
A link containing only an IMG without ALT | 76% - 100% | Bad | Bad | Bad | Bad | Bad | Bad | Bad | Bad | Bad
A link with TITLE containing only an IMG with no ALT | 1% - 25% | Good | Better | Good | Worse | Better | Good | Better | Good | Good
A link with aria-labelledby containing only an IMG with no ALT | 26% - 50% | Bad | Better | Better | Bad | Better | Good | Better | Good | Bad
ACRONYM with title | 76% - 100% | Bad | Bad | Bad | Bad | Bad | Bad | Bad | Bad | Bad
AREA with no ALT attribute | 76% - 100% | Bad | Bad | Bad | Bad | Bad | Bad | Bad | Bad | Bad
AREA with null ALT attribute | 76% - 100% | Bad | Bad | Bad | Bad | Bad | Bad | Bad | Bad | Bad
BUTTON containing only an IMG with no ALT | 76% - 100% | Bad | Bad | Bad | Bad | Bad | Bad | Bad | Bad | Bad
BUTTON containing only an IMG with null ALT attribute | 76% - 100% | Bad | Bad | Bad | Bad | Bad | Bad | Bad | Bad | Bad
BUTTON with TITLE containing only an IMG with no ALT | 1% - 25% | Good | Good | Bad | Good | Good | Good | Good | Good | Good
BUTTON with aria-label containing only an IMG with no ALT | 1% - 25% | Good | Good | Good | Good | Good | Good | Good | Good | Bad
FIELDSET containing no controls | 1% - 25% | Good | Good | Worse | Worse | Better | Better | Better | Good | Good
FIELDSET used to put border round text | 1% - 25% | Good | Good | Good | Good | Good | Better | Better | Good | Good
FIELDSET with blank LEGEND | 51% - 75% | Bad | Bad | Bad | Bad | Bad | Better | Bad | Bad | Bad
FIELDSET with no LEGEND | 26% - 50% | Bad | Bad | Bad | Bad | Bad | Better | Bad | Bad | Bad
IFRAME with blank title | 1% - 25% | Good | Bad | Good | Better | Bad | Good | Better | Good | Good
IFRAME with no alt content and no title | 1% - 25% | Good | Bad | Good | Better | Bad | Good | Better | Good | Good
IFRAME with title matching frame filename | 26% - 50% | Bad | Bad | Good | Bad | Bad | Good | Bad | Good | Good
IMG with ALT set to ASCII art smiley | 51% - 75% | Bad | Bad | Bad | Bad | Good | Good | Bad | Bad | Good
IMG with ALT set to SRC filename | 76% - 100% | Bad | Bad | Bad | Bad | Bad | Bad | Bad | Bad | Bad
IMG with aria-describedby | 76% - 100% | Bad | Bad | Bad | Bad | Good | Bad | Better | Bad | Bad
IMG with null ALT and non-null TITLE attributes | 76% - 100% | Bad | Bad | Bad | Bad | Bad | Bad | Bad | Bad | –
IMG with null ALT and non-null aria-label attributes | 76% - 100% | Bad | Bad | Bad | Bad | Bad | Bad | Bad | Bad | –
IMG with null ALT and non-null aria-labelledby attributes | 76% - 100% | Bad | Bad | Bad | Bad | Bad | Bad | Bad | Bad | –
IMG with server side image map | 76% - 100% | Bad | Bad | Bad | Bad | Bad | Bad | Bad | Bad | Bad
IMG without ALT attribute | 76% - 100% | Bad | Bad | Bad | Bad | Bad | Bad | Bad | Bad | Bad
INPUT type=image with no ALT attribute | 76% - 100% | Bad | Bad | Bad | Bad | Bad | Bad | Bad | Bad | Bad
INPUT type=image with null ALT | 76% - 100% | Bad | Bad | Bad | Bad | Bad | Bad | Bad | Bad | Bad
INPUT type=text inside blank LABEL | 76% - 100% | Bad | Bad | Bad | Bad | Bad | Bad | Bad | Bad | Bad
INPUT type=text with blank LABEL FOR | 76% - 100% | Bad | Bad | Bad | Bad | Bad | Bad | Bad | Bad | Bad
INPUT with aria-labelledby pointing to role=presentation element | 1% - 25% | Good | Good | Better | Good | Good | Good | Bad | Good | –
Image MAP with no NAME attribute | 26% - 50% | Bad | Better | Good | Bad | Bad | Bad | Better | Good | Good
LABELs reference controls with duplicate ids | 51% - 75% | Bad | Bad | Bad | Bad | Bad | Bad | Bad | Bad | Bad
Link with an `aria-label`, and containing only an `img` with no `alt` | 1% - 25% | Good | Better | Better | Bad | Better | Good | Better | Good | Bad
Yes/No radio buttons without FIELDSET | 51% - 75% | Bad | Bad | Bad | Bad | Bad | Better | Bad | Bad | Bad
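Illustrative markup for a few of the failing patterns above (file names and text are hypothetical, not from the test suite):

```html
<!-- A link containing only an IMG with null ALT: the link has no accessible name at all -->
<a href="/home"><img src="logo.png" alt=""></a>

<!-- IMG with ALT set to the SRC filename: announced as a meaningless string -->
<img src="img0001.png" alt="img0001.png">

<!-- IMG with null ALT plus aria-label: alt="" marks the image as decorative,
     so the aria-label is not announced in most combinations -->
<img src="chart.png" alt="" aria-label="Sales chart">

<!-- FIELDSET with no LEGEND: the grouping is announced without a group name -->
<fieldset>
  <label><input type="radio" name="answer" value="yes"> Yes</label>
  <label><input type="radio" name="answer" value="no"> No</label>
</fieldset>
```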

Key

Tests expected to fail (due to authoring errors) are marked with Expected to Fail.

Test notes

All tests were carried out with screen reader factory settings. JAWS in particular has a wide variety of settings controlling exactly what gets spoken.

Screen readers allow users to interact in different modes, and can produce very different results in each mode. The modes used in these tests are:

In the "What the user hears" column: