How Our Design Audit Tool Works: Academic Foundations, Reviews, and Our Versioning Approach

Design Audit is a free tool we built at Hitit Medya. It loads a public URL in a secure server session, samples the interface (typography, color, controls, spacing, imagery, motion hints), and returns numeric indicators plus readable findings. This article explains how it works, which scientific and normative sources anchor it, how it connects to our Design Reviews, and our approach to versioning—in a way that stays honest for people and helpful for E-E-A-T (Experience, Expertise, Authoritativeness, Trust).
Why we built it
As a younger agency we care about proof, not only polish. We ship client work on Next.js and React with performance in mind. Design Audit extends the same discipline: a fast health check for any public page. The tool does not claim “AI magic.” It derives signals from the DOM, applies well-known formulas and literature-aligned heuristics, and always needs human interpretation.
How it works (short)
Capture: We open the URL you provide and collect structure and style signals from the rendered page.
Color & accessibility: We compute relative luminance and contrast ratios consistent with WCAG 2.1. For dichromacy-oriented insight we use the widely cited deuteranopia simulation matrix of Viénot, Brettel & Mollon (1999). We also flag near-collisions between simulated colors using a CIE76 ΔE-style distance check.
Typography & layout: Font stacks, modular rhythm, and spacing cues inform readability and hierarchy assessments.
Interaction & Gestalt: Touch-target sizing and diversity feed Fitts-inspired heuristics; consistency checks draw on practical Gestalt principles such as similarity and proximity among controls.
Content & platform hints: Text vs. image balance and light CMS/heuristic signals add context beyond pixels alone.
Audit log: Short, page-specific findings complement headline scores for internal QA and client conversations.
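The contrast and color-distance checks above follow published formulas. Here is a minimal TypeScript sketch of those formulas; the function names are illustrative, not the tool's actual API, and the RGB-to-CIELAB conversion step is omitted:

```typescript
type RGB = [number, number, number];
type Lab = [number, number, number];

// WCAG 2.1 relative luminance: linearize each sRGB channel (values 0–255),
// then apply the standard luminance weights.
function relativeLuminance([r, g, b]: RGB): number {
  const lin = (c: number): number => {
    const s = c / 255;
    return s <= 0.03928 ? s / 12.92 : Math.pow((s + 0.055) / 1.055, 2.4);
  };
  return 0.2126 * lin(r) + 0.7152 * lin(g) + 0.0722 * lin(b);
}

// WCAG contrast ratio: (L_lighter + 0.05) / (L_darker + 0.05), in [1, 21].
function contrastRatio(a: RGB, b: RGB): number {
  const la = relativeLuminance(a);
  const lb = relativeLuminance(b);
  return (Math.max(la, lb) + 0.05) / (Math.min(la, lb) + 0.05);
}

// CIE76 ΔE: Euclidean distance between two CIELAB colors. A real pipeline
// converts simulated RGB values to Lab before measuring this distance.
function deltaE76(a: Lab, b: Lab): number {
  return Math.hypot(a[0] - b[0], a[1] - b[1], a[2] - b[2]);
}

// Black on white reaches the maximum contrast ratio of 21:1.
console.log(contrastRatio([0, 0, 0], [255, 255, 255])); // ≈ 21
```

Small ΔE values between two simulated colors mean a dichromat viewer may not be able to tell the corresponding UI elements apart, which is exactly what the near-collision flag reports.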
Try it here: Design Audit (EN).
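The Fitts-inspired heuristics can be illustrated with the Shannon formulation of the index of difficulty. The sketch below is our own illustration, and the 44 px threshold is a common platform touch-target guideline assumed for the example, not the tool's published cutoff:

```typescript
// Shannon formulation of Fitts' index of difficulty (in bits):
// ID = log2(D / W + 1), where D is the distance to the target and
// W is its width along the axis of motion. Larger ID means the target
// is harder to acquire.
function indexOfDifficulty(distancePx: number, widthPx: number): number {
  return Math.log2(distancePx / widthPx + 1);
}

// Screening rule: collect targets narrower than an assumed minimum size.
function flagSmallTargets(targetWidthsPx: number[], minPx = 44): number[] {
  return targetWidthsPx.filter((w) => w < minPx);
}

console.log(indexOfDifficulty(300, 100)); // 2 bits: log2(300/100 + 1)
console.log(flagSmallTargets([28, 44, 60])); // only the 28 px target is flagged
```

Doubling a target's width roughly halves the pointing difficulty at a given distance, which is why undersized controls surface in the findings even when contrast and color pass.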
Scientific and normative foundations (selected)
Viénot, Brettel & Mollon (1999): Digital video colourmaps for dichromats — basis for our simulation pathway. DOI.
W3C WCAG 2.1: Contrast and accessibility criteria. W3C Recommendation.
Fitts, P. M. (1954): Information capacity of the motor system — conceptual grounding for target-acquisition heuristics. Journal of Experimental Psychology.
CIE76 color difference: Practical collision checks between simulated tones.
Further readings (also linked in-product): Lai et al. (2023, arXiv); Seckler et al. (2015); Michailidou et al. on visual complexity; Zheng et al. (2011) on aesthetics metrics.
These references inform and bound an agency tool—they do not turn it into a peer-reviewed clinical instrument or a substitute for your design team.
Design Reviews: grounding in real products
We do not leave the metrics orphaned. In Design Reviews we publish long-form audits for selected properties.
That layer matters for E-E-A-T: observable work on real sites, transparent scoring language, and citable external sources.
Versions and roadmap
Design Audit is a living product. We expect iterations on detection quality, explainable weights, localization, resilience, and performance. We plan to document meaningful changes in release notes as the tool evolves.
E-E-A-T and who we are
Experience: We use the tool and reviews inside our own delivery loop.
Expertise: Terminology maps to real domains (WCAG, ΔE, Fitts, Gestalt).
Authoritativeness: Our team’s research-adjacent work appears elsewhere on this blog—e.g. the TLS handshake Zenodo announcement.
Trust: We state limits and avoid hype.
Brand age is not the same as engineering maturity: Hitit Medya may be a newer name, but our stack choices, research literacy, and open-ecosystem alignment are deliberate and demonstrable.
Note: This article describes the tool at publication time; metrics and UI may change across releases. Run the tool for the latest behavior.
