ABOUT COMPANY:
Waters Corporation is a B2B eCommerce and SaaS company that unlocks the potential of health science by providing the tools, technology and insights that enable scientific breakthroughs and advance human health and well-being. It is a leading provider of analytical instruments, software and services that allow scientists, lab managers and procurement teams worldwide to source instruments, consumables, software licenses and specialty chemical products. Waters significantly outperforms its direct competitors on UX benchmarks, scoring 28.6 overall against Thermo Fisher's 10.0 and Sigma-Aldrich's 3.5 on the Baymard Institute's industry UX comparison.
MY ROLE:
Led and managed end-to-end research for the Checkout 2.0 experience: carried out stakeholder alignment and secondary research, conducted the internal Baymard UX audit, designed and deployed the post-checkout survey (Qualtrics), analyzed user sessions in Decibel for behavioral analytics, and collaborated with engineers, UX designers, content designers, product owners and product managers. I analyzed the findings, prepared the final report and delivered high-priority design recommendations.
PROJECT BACKGROUND:
The Checkout 2.0 experience was needed because, despite performing well relative to competitors, the legacy checkout on Waters.com had accumulated friction over time, as each new feature added its own inconsistencies. A large-scale analysis of 3,648 customer responses (pre-launch) and internal order data exposed three systemic problems that demanded a holistic solution:
(i) Too Much Friction - 18% of users who initiated checkout never completed their order. The legacy multi-step checkout was difficult to navigate when users needed to modify saved information, too many optional fields confused users who wanted a basic checkout, and region-specific requirements made global optimization nearly impossible.
(ii) Too Much Manual Intervention - 25% of all US orders were placed on hold for manual review, delaying shipping. Users frequently changed their address post-checkout (often just adding building/room numbers), causing delays of up to 24 hours.
(iii) Misuse of the 'Special Instructions' Field - Customers used the 'Special Instructions' field as a workaround for missing form fields, signaling a significant process failure.
The internal Baymard UX audit, the post-checkout survey (Qualtrics) and the Decibel session analysis together empowered product teams to make data-informed decisions to redesign the checkout experience. The result was Checkout 2.0, which increased conversion and ultimately added value to both the end-user experience and business growth.
OBJECTIVES:
To evaluate UX Compliance for Checkout 2.0 against Baymard Institute's guidelines to identify compliance gaps and areas of excellence before launch.
To measure post-launch satisfaction via Qualtrics, quantifying improvements in NPS, ease of use, payment satisfaction and checkout confidence after launch.
To understand Passive and Detractor behaviors through Decibel session analytics to empathize with Passives (7-8 NPS) and Detractors (0-6 NPS), uncovering the real-world friction points that might have driven negative scores.
To identify abandonment causes (exact moments and errors) through Decibel session analysis, specifically for users who reached checkout but did not place an order.
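The NPS segmentation used throughout this study (Promoters 9-10, Passives 7-8, Detractors 0-6, with NPS = % Promoters minus % Detractors) can be sketched as a small helper. This is the standard NPS formula, not project-specific code:

```python
def nps_segment(score: int) -> str:
    """Standard NPS banding: Detractor 0-6, Passive 7-8, Promoter 9-10."""
    if not 0 <= score <= 10:
        raise ValueError("NPS scores must be between 0 and 10")
    if score <= 6:
        return "Detractor"
    if score <= 8:
        return "Passive"
    return "Promoter"

def nps(scores: list[int]) -> float:
    """NPS = % Promoters - % Detractors, on a -100 to +100 scale."""
    segments = [nps_segment(s) for s in scores]
    promoters = segments.count("Promoter") / len(segments)
    detractors = segments.count("Detractor") / len(segments)
    return round((promoters - detractors) * 100, 1)
```

Segmenting respondents this way is what later lets the behavioral analysis target only Passive and Detractor sessions.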
RESEARCH PROCESS:
1 - Stakeholder Alignment and Secondary Research
I started with stakeholder alignment involving the Engineer, Product Owner, Product Manager, Content Designer, UX Designer and others working on the project to understand the strategic intent behind Checkout 2.0. The goal was to ensure my research agenda directly addressed business-critical questions around friction, manual intervention and the 3rd-party payment integration, so I had them talk about the background of the project, covering facts, opinions, guesses, etc. I further determined our target user segments and defined research goals and objectives and discussed success metrics.
I also analyzed existing internal data - 3,648 pre-launch customer survey responses, US order data from the past 3 months ('Special Instructions' field analysis) and funnel analysis showing the 18% checkout non-completion rate. This helped me triangulate where to focus more and prepare the most rigorous research methodology.
2 - Internal Baymard UX Audit (Pre-Launch)
I then conducted a systematic internal audit of the Checkout 2.0 experience against Baymard Institute's 130 guidelines spanning 16 categories. Baymard's catalog is built on 49,000+ hours of large-scale e-commerce UX research, making it the gold standard for checkout UX evaluation. Each guideline was assessed against our prototype/staging environment and scored on a Poor → Mediocre → Decent → Good → Perfect scale.
The audit revealed that Checkout 2.0 scored Good (68.7), a significant jump from our legacy checkout audit score of Decent (47.6), representing a 21.1-point UX improvement. All outstanding issues identified were logged into backlogs for prioritization.
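The aggregation from per-guideline ratings to a single audit score can be sketched as below. The numeric mapping and band cutoffs here are illustrative assumptions; Baymard's actual methodology weights guidelines by severity:

```python
# Hypothetical numeric mapping for the Poor -> Perfect rating scale;
# Baymard's real scoring is weighted and more nuanced.
RATING_POINTS = {"Poor": 0, "Mediocre": 25, "Decent": 50, "Good": 75, "Perfect": 100}

def audit_score(ratings: dict[str, str]) -> float:
    """Average per-guideline ratings into a single 0-100 audit score."""
    points = [RATING_POINTS[r] for r in ratings.values()]
    return round(sum(points) / len(points), 1)

def score_band(score: float) -> str:
    """Map an aggregate score back onto the qualitative bands (illustrative cutoffs)."""
    for cutoff, band in [(90, "Perfect"), (62.5, "Good"), (37.5, "Decent"), (12.5, "Mediocre")]:
        if score >= cutoff:
            return band
    return "Poor"
```

Under these assumed cutoffs, 68.7 lands in the "Good" band and 47.6 in "Decent", matching the audit results reported above.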
3 - Post-Checkout Survey Design and Deployment (Qualtrics)
I also designed a lean, branched post-checkout survey (iterated across two versions) that appeared as a modal on the order confirmation page. The survey was carefully constructed to minimize respondent burden while capturing both quantitative NPS data and qualitative sentiment. The new Checkout 2.0 design, with all its enhancements, was rolled out on the live website along with the survey.
The survey used conditional branching - first-time buyers saw a neutral version of the Likert statements, while returning customers saw statements explicitly framed around the "improvements we recently made." This allowed me to compare sentiment across user cohorts without introducing bias for new users who had no prior experience to benchmark against.
The survey deployed included: (1) an NPS score question (0–10), (2) an open-text NPS reason, (3) a first-order flag (yes/no branching), (4) five Likert-scale statements on ease of use, address entry, payment availability, detail completeness and confirmation confidence, and (5) an open-text additional feedback field. In total, 172 responses were captured.
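The branching logic described above can be sketched as follows. The question wording here is illustrative; the production Qualtrics survey's exact copy and field names differed:

```python
from dataclasses import dataclass

# Illustrative statement wording (not the production Qualtrics copy).
NEUTRAL = "The checkout was easy to use."
FRAMED = "The improvements we recently made have made checkout easier to use."

@dataclass
class Respondent:
    nps_score: int    # question (1): 0-10
    nps_reason: str   # question (2): open text
    first_order: bool # question (3): yes/no branching flag

def likert_statement(r: Respondent) -> str:
    """Conditional branching: first-time buyers see neutral wording, while
    returning customers see wording framed around the recent improvements."""
    return NEUTRAL if r.first_order else FRAMED
```

Branching on the first-order flag is what keeps new buyers, who have no prior experience to benchmark against, from being asked about "improvements".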
4 - Deep Dive Into Behavioral Analytics (Decibel Session Analysis)
After Qualtrics analysis, I wanted to empathize with users who gave low NPS scores (Passives: 7-8 and Detractors: 0-6), so I analyzed their Decibel session recordings. Decibel captures behavioral signals, including rage clicks, scroll depth, dead clicks and hesitation, giving each session a DXS score. I filtered sessions from Passive and Detractor survey respondents to trace what actually went wrong in their experience.
I also ran a targeted analysis of sessions where users reached the cart but did not proceed to checkout and separately, sessions where users reached the final "Place Order" step but abandoned. This triangulation of behavioral data with survey sentiment analysis gave me both the "what" and the "why" behind conversion failure.
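The session-triage logic described in this step can be sketched as a join between session records and survey scores. The record shapes below are hypothetical; real Decibel exports carry many more behavioral fields:

```python
# Hypothetical session and survey records (real Decibel exports are richer).
sessions = [
    {"user_id": "u1", "dxs": 4.2, "rage_clicks": 3, "reached_place_order": True, "ordered": False},
    {"user_id": "u2", "dxs": 8.9, "rage_clicks": 0, "reached_place_order": True, "ordered": True},
]
survey = {"u1": 5, "u2": 10}  # user_id -> NPS score

def sessions_to_review(sessions: list[dict], survey: dict) -> list[str]:
    """Flag sessions from Passive/Detractor respondents (NPS <= 8), plus
    sessions that reached 'Place Order' but never converted."""
    flagged = []
    for s in sessions:
        low_nps = survey.get(s["user_id"], 10) <= 8
        abandoned = s["reached_place_order"] and not s["ordered"]
        if low_nps or abandoned:
            flagged.append(s["user_id"])
    return flagged
```

Reviewing only these flagged sessions is what made the manual recording review tractable while still covering both low-sentiment and abandonment cohorts.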
5 - Analysis and Reporting
I collated all data streams, including Baymard audit findings, survey quantitative data (NPS, Likert), survey verbatims (sentiment analysis of 70 open-text responses), Decibel session patterns and drop-off session analyses, and synthesized them into a single findings framework. I also categorized qualitative responses into positive themes (ease, efficiency, intuitiveness, etc.) and constructive themes (covering all issues identified during analysis).
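A first-pass tagging of verbatims into positive vs. constructive themes can be sketched with simple keyword buckets. These buckets are illustrative; the actual coding of the 70 open-text responses was done by hand:

```python
# Illustrative keyword buckets (the real thematic coding was manual).
POSITIVE = {"easy", "fast", "simple", "intuitive", "smooth"}
CONSTRUCTIVE = {"error", "slow", "confusing", "failed", "stuck"}

def tag_verbatim(text: str) -> str:
    """Rough first-pass theme tag for a single open-text response.
    Constructive signals take precedence so problems are never masked."""
    words = set(text.lower().split())
    if words & CONSTRUCTIVE:
        return "constructive"
    if words & POSITIVE:
        return "positive"
    return "neutral"
```

A pass like this is only a triage aid for grouping responses before manual review, not a substitute for reading each verbatim.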
I then organized findings by severity and actionability and prepared an engaging final report discussing design recommendations and potential bugs. The engineering and product teams were debriefed over a Microsoft Teams call focused on how to refine the Checkout 2.0 experience and further increase conversion. Outstanding issues tied to specific Baymard guidelines and to user behaviors observed in Decibel were logged in the backlog as Jira tickets.
KEY INSIGHTS:
NPS rose 9 points (62 → 71) from the pre-launch baseline. An NPS over 70 indicates customers love the experience and are generating positive word-of-mouth referrals. 16% of survey respondents were completing their very first order on Waters.com.
Most users, both new and returning, found Checkout 2.0 easy to use: their preferred payment method was available, they felt confident when confirming the order, and they could easily provide shipping information and all details needed for the order.
Critical Error at the Final Step: When some users reached the Review Order page and clicked "Place Order," the system returned a generic error: "Something went wrong. Please try again." Users were given no indication of what went wrong or how to fix it. After multiple failed attempts, they abandoned the checkout entirely.
Cart Loading Failure and Data Loss: A few users encountered a cart that took too long to load. The system displayed: "Something went wrong. Please try again." When the user switched browser tabs, the cart appeared empty (the system failed to persist cart state). Clicking back into the cart produced the same error again.
'Add to Cart' Failure (Global Regions): Users in European and Chinese markets clicked 'Add to Cart,' but the cart icon in the top-right navigation failed to update, showing no item was added. Users were left confused as they could not confirm whether their action was registered. This was a critical trust-breaking moment in the early purchase journey.
RECOMMENDATIONS:
Fix Critical Error Handling on the "Place Order" CTA: Users who reach the Review Order page are high-intent buyers. Replace the generic error with specific, actionable messages that diagnose the issue (e.g., a payment timeout or an address validation failure) and provide a clear next step.
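The recommendation above amounts to mapping backend failure codes to actionable copy instead of one generic string. The error codes and messages below are hypothetical, since the actual backend codes were not part of the research output:

```python
# Hypothetical error codes and copy; the real checkout backend's codes differ.
ERROR_MESSAGES = {
    "PAYMENT_TIMEOUT": "Your payment provider took too long to respond. "
                       "No charge was made - please try placing the order again.",
    "ADDRESS_VALIDATION": "We couldn't verify your shipping address. "
                          "Please check the street and postal code, then retry.",
    "INVENTORY_CHANGED": "An item in your cart is no longer available in the "
                         "requested quantity. Review your cart to continue.",
}
GENERIC = "Something went wrong. Please try again."  # the message being replaced

def place_order_error(code: str) -> str:
    """Return a specific, actionable message; fall back to the generic
    copy only when the failure code is genuinely unknown."""
    return ERROR_MESSAGES.get(code, GENERIC)
```

The design point is that the generic message becomes the last resort rather than the default, so high-intent buyers always get a diagnosis and a next step when one is known.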
Implement Cart State Persistence Across Sessions and Tab Switches: Cart data should be server-side persisted and survive browser tab changes, page refreshes and temporary navigation away from the checkout flow. Additionally, the checkout form should persist all entered data (especially Special Instructions) when users navigate away and return. This directly addresses one of the top Decibel-observed pain points.
Fix 'Add to Cart' Confirmation for International Regions: The "Add to Cart" button failure in European and Chinese markets needs immediate fixes, including a visible cart badge increment animation upon successful addition, a persistent toast notification confirming the addition and an inline overlay with product thumbnail, name and price.
RESEARCH IMPACT:
My research insights fed directly into Checkout 2.0's further development. It is now live and providing a seamless checkout experience: a 10% uplift in conversion rate over a 3-month period, a +9-point NPS increase from the pre-launch baseline, 76% NPS Promoters post-launch (indicating strong customer loyalty and referral likelihood), and a Baymard UX audit score of 68.7 (Good), up from 47.6 (Decent).
The audit findings were integrated directly into engineering backlogs with ticket references, making research outputs immediately actionable for engineering sprints. Engineers recognized me as a trusted partner and came to me for further clarification on the end-user experience.
By integrating Decibel session analysis into the post-launch research workflow alongside traditional surveys and audits, this project demonstrated the power of behavioral triangulation. The combination of "what users say" (survey verbatims), "what users score" (NPS and Likert) and "what users do" (Decibel sessions) gave my team a complete picture of the checkout experience. This mixed-methods approach has since become the model for ongoing Waters.com feature research.
Through close collaboration, designers were able to quickly pick up research feedback and incorporate it into the prototype, keeping the process Agile.
The 'Special Instructions' field analysis directly informed structural changes to the Checkout 2.0 form, adding dedicated fields for Attention To, Building/Department, Tax Exemption, etc. Each of these additions was justified by quantitative evidence (instance counts from real order data) and qualitative evidence (Decibel sessions showing users struggling to find the right field). This made the business case for every new form element unambiguous and defensible to stakeholders.
KEY LEARNINGS FOR ME:
Behavioral data reveals what surveys cannot (survey responses gave me the "what" - users scoring 6 or 7 and saying address entry was difficult; Decibel sessions gave me the "why").
Tying research to backlog tickets wherever possible drives adoption (Framing audit findings with a specific Jira ticket reference transformed research from a document into an engineering input).
Survey branching preserves data integrity (Showing the same Likert statements to first-time buyers and returning customers without branching would have contaminated the data).
Global regions behave differently (The 'Add to Cart' failure surfaced specifically in European and Chinese sessions and not in the US cohort. This taught me to always segment behavioral analytics by region).
Thank you for your time
To discuss more about this project, you can reach out to me at shoryasaxena96@gmail.com or +91 9784088400