UX 4 DATAVIZ METHODOLOGY
My methodology combines a user-centered UX process (research → design → evaluation) with the data visualization lifecycle (data → design → insights).
The goal is simple: create visual systems that are useful, understandable, and trustworthy, not just visually appealing.
Across projects, AI acts as a supporting layer, helping speed up analysis, synthesize feedback, and explore alternatives — always guided by human judgment and UX principles.
Goal: Define the real problem space and understand how data will be used in context.
This phase ensures we are solving the right problem before designing anything. Many visualization issues come from misaligned assumptions between stakeholders, data, and users.
Key activities:
Stakeholder interviews and service blueprinting to understand goals, constraints, and decision processes.
Data inventory and source evaluation, assessing structure, quality, limitations, and potential bias.
User research to identify personas, journeys, and pain points in data interpretation.
Definition of business KPIs and visualization goals (monitoring, explanation, persuasion, exploration).
Use of AI to cluster user feedback or summarize qualitative interviews, accelerating sense-making without replacing analysis.
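The feedback-clustering step above can be sketched with off-the-shelf tools. This is a minimal example using scikit-learn's TF-IDF vectorizer and KMeans; the feedback strings are invented stand-ins for real interview excerpts, and in practice the number of clusters would be chosen from the data rather than fixed in advance.

```python
# Sketch: grouping short user-feedback snippets into themes.
# The feedback strings below are hypothetical examples.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans

feedback = [
    "The filters are hard to find",
    "I could not locate the date filter",
    "Colors make the trend unclear",
    "The color palette hides the key line",
    "Export to CSV worked great",
    "Downloading the data was easy",
]

# Represent each snippet as a TF-IDF vector, then cluster.
vectors = TfidfVectorizer(stop_words="english").fit_transform(feedback)
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(vectors)

# Print snippets grouped by cluster label for human review.
for theme, text in sorted(zip(labels, feedback)):
    print(theme, text)
```

The output is a starting point for affinity mapping, not a finished analysis: a researcher still names the themes and checks that the groupings make sense.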
Deliverables:
Research report, data audit, empathy maps, user journey maps, AI-summarized insights.
Why it matters:
This phase reduces rework later and ensures that visualizations support real decisions, not abstract metrics.
Goal: Align user needs with what the data can realistically support.
Here, data exploration and UX thinking come together. Instead of visualizing everything, the focus is on identifying what actually matters.
Key activities:
Exploratory Data Analysis (EDA) to detect patterns, outliers, and correlations.
Formulation of insight hypotheses grounded in both data and user needs.
Selection and prioritization of metrics and variables to visualize.
Use of LLMs or AutoML tools to support pattern discovery or variable selection when datasets are large or complex.
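The EDA step above can be sketched in a few lines of pandas: summary statistics, a correlation check, and outlier flagging with the 1.5 × IQR rule. The dataset and column names (`sessions`, `conversions`) are hypothetical.

```python
# Sketch of a minimal EDA pass over a small hypothetical dataset.
import pandas as pd

df = pd.DataFrame({
    "sessions":    [120, 135, 128, 540, 131, 126, 133],
    "conversions": [12, 14, 13, 15, 13, 12, 14],
})

summary = df.describe()                        # central tendency and spread
corr = df["sessions"].corr(df["conversions"])  # linear association

# Flag outliers with the 1.5 * IQR rule.
q1, q3 = df["sessions"].quantile([0.25, 0.75])
iqr = q3 - q1
outliers = df[(df["sessions"] < q1 - 1.5 * iqr) |
              (df["sessions"] > q3 + 1.5 * iqr)]
print(outliers)
```

Even this small pass surfaces the questions that matter for the design phase: which variables move together, and which values would mislead users if plotted without context.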
Deliverables:
EDA report, metric hierarchy, data–user matrix.
Why it matters:
This step prevents cognitive overload and ensures that dashboards communicate insights instead of noise.
Goal: Transform insights into clear visualization concepts and UX flows.
This phase focuses on how information will be perceived, explored, and understood by users.
Key activities:
Low-fidelity sketches and wireframes for charts and layout structure.
Definition of interaction models (filters, drill-downs, storytelling flows).
AI assistance for idea generation, automatic layout suggestions, and heuristic checking.
Deliverables:
Concept maps, wireframes, annotated prototypes.
Why it matters:
Early visualization concepts allow fast validation of ideas before committing to complex implementations.
Goal: Build interactive prototypes and prepare data pipelines. Measure usability, comprehension, and trust.
At this stage, ideas become tangible and testable. This ensures that visualizations are not only usable, but also correctly understood.
Key activities:
Creation of visualization components using tools such as D3.js, Plotly, or design tokens in Figma.
Application of accessibility and perceptual best practices (color, contrast, visual encoding).
AI support to generate variants, check color contrast, or test accessibility constraints.
Cognitive walkthroughs and A/B testing.
User testing focused on comprehension, efficiency, and confidence in the data.
Advanced testing methods such as biometric measures or eye-tracking when needed.
AI support for sentiment analysis of feedback and clustering of user behavior logs.
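The color-contrast check mentioned above can be automated. This sketch implements the WCAG 2.1 relative-luminance and contrast-ratio formulas in plain Python, a common baseline for verifying that dashboard text remains readable against its background.

```python
# Sketch: WCAG 2.1 contrast ratio between two sRGB colors.

def _channel(c: float) -> float:
    # Linearize one sRGB channel (0-255) per the WCAG 2.1 formula.
    c /= 255
    return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4

def luminance(rgb) -> float:
    # Relative luminance from linearized R, G, B channels.
    r, g, b = (_channel(v) for v in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg, bg) -> float:
    # Ratio of the lighter to the darker luminance, offset by 0.05.
    l1, l2 = sorted((luminance(fg), luminance(bg)), reverse=True)
    return (l1 + 0.05) / (l2 + 0.05)

# Black text on a white background: maximum contrast, 21:1.
print(round(contrast_ratio((0, 0, 0), (255, 255, 255)), 1))  # → 21.0
```

WCAG AA requires a ratio of at least 4.5:1 for normal text; a check like this can run automatically over a chart's design tokens before any user testing.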
Deliverables:
Interactive MVP, chart design system, prototype documentation.
Usability report, data comprehension metrics, iteration backlog.
Why it matters:
A strong prototype allows teams to test understanding early and creates a reusable foundation for scaling. Testing reveals gaps between intended meaning and perceived meaning — a critical risk in data visualization.
Goal: Launch, monitor, and evolve the solution over time.
Visualization systems improve through use, not only design.
Key activities:
Collection of behavioral analytics (interaction patterns, dwell time, usage frequency).
Evaluation of whether visualizations actually drive insight and decision-making.
Use of AI for anomaly detection, insight recommendation, or adaptive dashboards.
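A minimal version of the anomaly-detection idea above: flag days whose dashboard usage deviates from the mean by more than two standard deviations. The daily view counts are invented for illustration; a production pipeline would use a more robust method over real analytics data.

```python
# Sketch: z-score anomaly flagging on hypothetical daily view counts.
import statistics

daily_views = [210, 198, 205, 220, 201, 15, 208]  # day 5 looks anomalous

mean = statistics.mean(daily_views)
stdev = statistics.pstdev(daily_views)  # population standard deviation

# Flag days more than 2 standard deviations from the mean.
anomalies = [day for day, views in enumerate(daily_views)
             if abs(views - mean) / stdev > 2]
print(anomalies)  # → [5]
```

A flagged day is a prompt for investigation (an outage, a broken filter, a seasonal dip), which is exactly the feedback loop this phase exists to create.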
Deliverables:
Dashboard performance report, continuous learning pipeline.
Why it matters:
This phase closes the loop between design intent and real-world impact.