Color blindness, or color vision deficiency, is "an anomaly in color vision that affects the ability to distinguish certain hues, most commonly red and green" (source: INSERM, French National Institute of Health and Medical Research). This condition affects approximately 8% of men and 0.5% of women in populations of European descent — roughly 300 million people worldwide.
But reducing the contrast issue to color blindness alone would be a mistake. Insufficient contrast affects everyone: people checking their phone in bright sunlight, tired users at the end of the day, seniors with diminishing visual acuity, and even developers testing their site on a perfectly calibrated screen without realizing that 90% of their users don't have the same display.
Contrast is not a functional problem. Your button works, your link is clickable, your form submits data. But if no one can read the text or distinguish the button from the background, the functionality is useless. It's a visual problem, and it's solved with visual tools.
What Color Blindness Actually Changes in How a Website Is Perceived
In the majority of cases, color blindness is not a complete absence of color vision. Several types exist, each affecting perception differently.
Deuteranomaly is the most common form. It affects approximately 6% of men and reduces green sensitivity. Greens and reds blend together, and oranges and yellows become difficult to distinguish. This is the type most designers are vaguely aware of, without always measuring its real consequences.
Protanomaly (approximately 1% of men) reduces red sensitivity. Reds appear darker and blend with browns and dark greens. An error message in red on a dark background can literally disappear for these users.
Tritanomaly (less than 0.01% of the population) affects perception of blue and yellow. It's rare but it exists.
Achromatopsia (grayscale vision) is extremely rare but represents the extreme case where only luminance contrast matters.
For a website, the consequences are concrete and numerous.
A status indicator that uses only red and green (success/error) is unusable for a deuteranope. A colored link without underlining may be invisible if its color blends with the surrounding text. An action button whose background color has sufficient contrast for normal vision may become illegible under deuteranomaly.
And these problems are not edge cases. With 8% of men affected, in a team of 25 developers, there are statistically 1 to 2 color-blind people. In your user base of 100,000 people, approximately 4,000 men perceive your site differently from what you designed.
WCAG Contrast: The Numbers You Need to Know
The WCAG (Web Content Accessibility Guidelines) define minimum contrast ratios between text and its background. These ratios are calculated from the relative luminance of the two colors, on a scale from 1:1 (no contrast) to 21:1 (maximum contrast, black on white).
For level AA, the most commonly required by regulations:
Normal text (under 18 points or under 14 points in bold) must have a contrast ratio of at least 4.5:1. Large text (18 points and above, or 14 points bold and above) requires a ratio of at least 3:1. Non-text UI elements (icons, field borders, status indicators) must also meet a ratio of 3:1 (criterion 1.4.11).
For level AAA, requirements are stricter: 7:1 for normal text and 4.5:1 for large text.
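These thresholds are easy to compute yourself. Below is a minimal sketch of the WCAG 2.x formula: each 8-bit sRGB channel is linearized, the channels are weighted into a relative luminance, and the ratio is taken between the lighter and darker color (function names are illustrative).

```python
def _linearize(channel):
    """Convert one 8-bit sRGB channel to linear light (WCAG 2.x formula)."""
    c = channel / 255.0
    return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4

def relative_luminance(rgb):
    """Relative luminance of an (r, g, b) color, 0.0 (black) to 1.0 (white)."""
    r, g, b = (_linearize(v) for v in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg, bg):
    """WCAG contrast ratio, from 1.0 (identical colors) to 21.0 (black on white)."""
    lighter, darker = sorted(
        (relative_luminance(fg), relative_luminance(bg)), reverse=True
    )
    return (lighter + 0.05) / (darker + 0.05)

# Black on white hits the maximum; #767676 on white sits right at the
# 4.5:1 AA boundary for normal text.
print(round(contrast_ratio((0, 0, 0), (255, 255, 255)), 2))        # 21.0
print(round(contrast_ratio((118, 118, 118), (255, 255, 255)), 2))  # 4.54
```

The `+ 0.05` terms in the ratio model ambient light reflected off the screen, which is why even pure black on pure white tops out at 21:1 rather than infinity.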
These numbers are not arbitrary. They are derived from research on visual perception and account for age-related loss of contrast sensitivity. An 80-year-old person needs approximately three times more contrast than a 20-year-old to read the same text under the same conditions.
The problem is that these ratios are calculated for typical color vision. A red/green color pair can satisfy the 4.5:1 criterion on luminance alone yet remain nearly impossible to tell apart by hue for a deuteranope: the text stays readable, but any meaning carried by the hue difference is lost. The WCAG acknowledge this in criterion 1.4.1 (Use of Color): information must not rely solely on color.
That's why checking the contrast ratio alone is not enough. You also need to verify contrast as perceived by different types of color blindness.
Why Traditional Tools Fall Short
The most commonly used contrast analysis tools (Chrome DevTools' built-in contrast checker, extensions like WAVE or axe DevTools, Lighthouse audits) are excellent at what they do: calculating the contrast ratio between two specific colors.
But they have several limitations.
They check declared contrast, not rendered contrast. If a background image, gradient, or transparency effect reduces the effective contrast, these tools may not detect it. White text on a background that fades from light blue to dark blue may have ample contrast over the dark end of the gradient but insufficient contrast over the light end. The tool measures a single color pair, not the reality of the rendering.
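To make that concrete, here is a small sketch (illustrative colors, pure Python) that checks white text against both endpoints of a light-to-dark blue gradient using the WCAG luminance formula. A single-pair checker fed only one of these background colors would report only half the story.

```python
def luminance(rgb):
    """WCAG 2.x relative luminance of an 8-bit sRGB color."""
    def lin(v):
        c = v / 255.0
        return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
    r, g, b = (lin(v) for v in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast(fg, bg):
    """WCAG contrast ratio between a foreground and a background color."""
    hi, lo = sorted((luminance(fg), luminance(bg)), reverse=True)
    return (hi + 0.05) / (lo + 0.05)

white = (255, 255, 255)
light_blue, dark_blue = (173, 216, 230), (25, 25, 112)  # gradient endpoints

# White text passes AAA over the dark end but fails AA over the light end:
# worst-case sampling of the rendered pixels, not a single declared pair,
# is what decides readability.
print(round(contrast(white, dark_blue), 2))
print(round(contrast(white, light_blue), 2))
```

A screenshot-based check samples the actual pixels behind the text, so the worst-case region of the gradient is what gets measured.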
They don't simulate color blindness in an automated testing context. Chrome DevTools allows manual color blindness simulation, but this check cannot be automated in a CI/CD pipeline. You can simulate deuteranomaly for one page at a time, but not for 200 pages with every deployment.
They don't detect regressions. A Lighthouse audit gives you a score at a point in time. It doesn't alert you when a developer changes a background color that degrades contrast. You only discover it during the next audit — if someone remembers to run it.
They analyze elements individually, not the overall rendering. An element's contrast depends on its visual context. Medium gray text may be perfectly readable on a white background but become illegible if a colored panel is placed behind it following a layout change. DOM analysis tools don't capture these visual interactions.
Visual Testing with Color Blindness Simulation
Automated visual testing brings a fundamentally different approach. Instead of analyzing CSS code, it captures the actual rendering of the page as it appears in the browser, then compares that rendering to a reference to detect any change.
Applied to contrast and color blindness, this mechanism offers powerful capabilities.
Capturing the actual rendering. Visual testing takes a screenshot of what the browser actually displays. Gradients, background images, transparency, drop shadows — everything is accounted for. The measured contrast is the real contrast, not the theoretical contrast calculated from CSS.
Integrated color blindness simulation. By applying color blindness simulation filters (colorimetric transformation matrices for deuteranopia, protanopia, and tritanopia) to captured screenshots, visual testing can verify that contrast remains sufficient for each type of color vision. What was a one-off manual test becomes a systematic automated test.
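Mechanically, such a filter is just a 3×3 matrix multiplied against every pixel. The sketch below uses a commonly cited simplified deuteranopia matrix as an illustration; production-grade simulation typically uses perceptually derived matrices (e.g. Machado et al., 2009) applied in linear RGB rather than directly on sRGB values as done here.

```python
# Simplified deuteranopia matrix (illustrative approximation only; real
# tools use perceptually derived matrices and work in linear RGB).
DEUTERANOPIA = (
    (0.625, 0.375, 0.0),
    (0.700, 0.300, 0.0),
    (0.000, 0.300, 0.7),
)

def simulate(rgb, matrix):
    """Apply a color-vision-deficiency matrix to one 8-bit sRGB pixel."""
    return tuple(
        max(0, min(255, round(sum(m * c for m, c in zip(row, rgb)))))
        for row in matrix
    )

# A saturated red and a saturated green both lose their red-green
# distinction: the red shifts toward olive, the green toward gray-brown.
red, green = (200, 30, 30), (30, 160, 30)
print(simulate(red, DEUTERANOPIA), simulate(green, DEUTERANOPIA))
```

Running this transform over an entire screenshot, then re-measuring contrast on the result, is exactly the "systematic automated test" described above.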
Regression detection. If a developer changes a color, a background, or a gradient, visual testing detects the difference. If this difference is significant (exceeding the configured tolerance threshold), the test fails and blocks deployment. The contrast regression is caught before reaching production.
Cross-browser verification. Browsers do not render colors identically. Differences in font rendering, antialiasing, and color profile management can affect perceived contrast. Cross-browser visual testing captures these differences and ensures sufficient contrast across all target browsers.
Exhaustive coverage. Unlike a manual audit that checks a few representative pages, automated visual regression testing can cover your entire site with every deployment. Every page, every component, under every simulation condition.
How to Integrate Contrast Verification into Your Workflow
Setting up contrast verification through visual testing follows a structured approach.
Identify your critical components. Forms, action buttons, error and success messages, status indicators, navigation links — these are the elements where a contrast issue has the most impact on user experience. Start with those.
Create baselines under normal conditions. Capture your pages and components in their reference state. These screenshots serve as the comparison point for detecting future regressions.
Add baselines with color blindness simulation. For each critical page or component, create additional baselines with simulation filters: deuteranopia, protanopia, and tritanopia. This gives you three additional baseline sets that reflect what your color-blind users actually see.
Set strict tolerance thresholds. For contrast tests, the tolerance for differences must be low. A color change of a few shades can push a contrast ratio below the WCAG threshold. A difference threshold of 1 to 2% is appropriate for these tests.
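As a rough illustration of what a strict threshold means in practice, here is a pure-Python sketch of a pixel-diff ratio; real visual testing tools operate on decoded screenshots and use more sophisticated comparison, and the names here are illustrative.

```python
THRESHOLD = 0.02  # fail the check when more than 2% of pixels changed

def diff_ratio(baseline, candidate, per_channel_tol=0):
    """Fraction of pixels that differ between two equally sized captures.

    Each capture is a flat list of (r, g, b) tuples; a pixel counts as
    changed when any channel differs by more than per_channel_tol.
    """
    changed = sum(
        1 for a, b in zip(baseline, candidate)
        if any(abs(x - y) > per_channel_tol for x, y in zip(a, b))
    )
    return changed / len(baseline)

baseline  = [(255, 255, 255)] * 100
candidate = [(255, 255, 255)] * 97 + [(40, 40, 40)] * 3  # 3% of pixels shifted
assert diff_ratio(baseline, candidate) > THRESHOLD  # would fail the test
```

The `per_channel_tol` parameter absorbs antialiasing noise, while the global `THRESHOLD` stays low enough that a genuine color change of a few shades still trips the test.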
Integrate into CI/CD. Configure your tests to run automatically with every pull request. If a contrast test fails, the pull request is blocked until correction. This is the only way to guarantee that contrast regressions don't make it to production.
Educate your team. The biggest obstacle to color accessibility is not technical — it's ignorance. Share the results of color blindness simulation tests with your designers and developers. Show them concretely what color-blind users see. Awareness changes behaviors permanently.
A tool like Delta-QA allows you to set up this monitoring without advanced technical skills. You configure your pages, test conditions, and tolerance thresholds in a visual interface. Captures and comparisons are automatic.
Beyond Color Blindness: Contrast as a Universal Issue
Color blindness is the most documented case, but contrast affects a much larger population.
Lighting conditions. According to a Google study (2018), 70% of time spent on mobile is outside optimal lighting conditions. What's readable in an air-conditioned office becomes illegible in bright sunlight.
Aging. At age 60, the amount of light reaching the retina is reduced by approximately one-third compared to age 20. Seniors are increasingly active digital users.
Eye strain. After hours in front of a screen, the ability to distinguish low contrasts decreases. This is a documented phenomenon affecting virtually all office workers.
Screen quality. Cheap screens, conference room projectors, interactive kiosks — actual contrast varies considerably depending on the hardware.
Designing for contrast isn't just about complying with WCAG to avoid a lawsuit. It's recognizing that readability is a prerequisite for any interaction, and that optimal reading conditions are the exception, not the rule.
FAQ
How can I tell if my website has contrast issues for color-blind users?
The most direct way is to simulate different types of color blindness and visually check the result. Chrome DevTools offers this functionality manually (Rendering tab, Emulate vision deficiencies). For systematic and automated verification, visual testing with integrated simulation filters covers the entire site with every deployment.
Does the WCAG contrast ratio guarantee readability for color-blind users?
Not necessarily. The WCAG contrast ratio is based on relative luminance as perceived by typical vision. Under protanopia, for example, reds appear darker than the formula assumes, so a technically compliant red can lose effective contrast; and two compliant colors that differ mainly in hue can still blend together. That's why WCAG criterion 1.4.1 additionally requires that color not be the sole means of conveying information.
Which types of color blindness should be tested first?
Deuteranomaly and deuteranopia (red-green confusion related to the green channel) affect approximately 6% of men and are the priority. Protanomaly and protanopia (red-green confusion related to the red channel) affect approximately 2% of men. Tritanopia (blue-yellow confusion) is very rare and represents a supplementary test. Testing at minimum for deuteranopia covers the majority of cases.
Can visual testing replace a manual contrast audit?
Visual testing detects contrast regressions (a change that degrades contrast relative to the baseline). It does not perform a contrast ratio calculation for every text/background pair. The two approaches are complementary: manual or automated auditing (via axe-core) identifies initial violations, and visual testing prevents future regressions.
How can contrast issues be avoided from the design phase?
Three fundamental rules: never use color as the sole indicator (always add text, an icon, or a pattern), use color palettes tested for color blindness (tools like Sim Daltonism or Color Oracle allow real-time verification), and always check the contrast ratio before approving a color choice. Modern design systems integrate these checks into their color tokens.
How many people are actually affected by contrast issues?
Beyond the 300 million color-blind people worldwide, contrast issues potentially affect everyone depending on usage conditions. The WHO estimates that 2.2 billion people have a visual impairment, including 1 billion cases of uncorrected presbyopia. And any user viewing a screen under poor lighting conditions is temporarily affected by insufficient contrast.
Conclusion
Contrast is a visual problem. Not a functional problem, not a structural problem, not a code problem. It's a problem of what people see when they look at your site, under their real usage conditions, with their real vision.
Code analysis tools detect contrast ratio violations in CSS. That's necessary but insufficient. Visual testing verifies contrast in the actual rendering, with color blindness simulation, across all browsers and resolutions, in an automated and continuous manner.
8% of men are color blind. 100% of users are affected by lighting conditions, eye strain, and variable screen quality. Contrast is not a niche topic. It's the foundation of your interface's readability.