A Q&A with Digital Design Expert Priyankaa Krishnan
For this installment of our Spotlight series, we sat down with digital product designer and change-management leader Priyankaa Krishnan. Trained in product design and instructional design, Priyankaa now works on complex, internal-facing tools within the supply-chain organization at a large West Coast-based tech company—one known for building its own systems from the ground up. Her path from international student to designer and change leader is defined by curiosity, systems thinking, and a relentless focus on user adoption.

In this conversation with Interwoven founder Rebeccah Pailes-Friedman, Priyankaa shares how she weaves user insights into strategy, balances business priorities with human needs, and runs rigorous testing cycles that turn ideas into durable, real-world solutions. She also offers candid, actionable advice for emerging designers.
Priyankaa Krishnan is a program manager who pairs digital product design with change management to launch and scale internal supply-chain tools. She earned a master’s in Industrial Design from Iowa State University and is pursuing a PhD in Human–Computer Interaction at her alma mater. An international student turned design leader, she blends UX strategy, analytics-driven decision-making, and Prosci-informed change practices—translating business requirements documents (BRDs) into measurable outcomes and guiding cross-functional teams through rigorous testing (E2E, QA, SIT, UAT) so new processes become everyday habits.
Q:
Could you start by sharing a bit about your career path—how you found your way into digital product design and change management?
A:
I came to the U.S. as an international student, which meant a lot of my early choices were constrained by visas and timing. I studied product design at Iowa State and discovered I loved teaching, so I completed a graduate certificate in instructional design. That opened the door to my first role—building learning experiences and micro-learnings inside a large enterprise.
So many of my early decisions weren’t really choices. Because of my visa status, I often had to pick from whatever options were available and run with them, and that shaped me. That instructional design credential led to the offer for the job I hold today. At the same time, I had an entry-level product design offer from Ford and was waiting to hear back from Sac State (California State University, Sacramento). I even asked my interviewers for a week so I could wait on Sac State. On my mentor Ana Luz’s advice—“take the tech job!”—I accepted. I call myself an “accidental” instructional designer: I didn’t know the corporate toolset at first, but I learned by doing, practiced hard, and that’s how I landed and grew in the role.
Q:
So, what exactly is an instructional designer?
A:
An instructional designer builds learning experiences—often micro-learnings—for a learning management system, guiding people from not knowing to knowing. In corporate settings that also means unlearning and relearning. The courses I built focused on changes to systems, processes, people, and tools. That’s where “change management” comes in: when something changes, you have to educate people so they can adopt it. Early on, my job was to create courses, publish them to Cornerstone (our LMS), and track participation. But I kept wondering about the bigger picture: I can design beautiful courses because I’m a designer—but are people actually adopting the change, and to what purpose? Leadership was asking the same question.
A mentor at work, Mikki Lee, introduced me to Prosci and change management. I took the workshop, earned my certification, and transitioned from instructional designer to change manager. I came to see instructional design as a subset of change management. Knowledge is just one part of it; you also build awareness, gauge desire, develop knowledge, assess ability, and ensure reinforcement—from the top down and the bottom up—so adoption actually sticks.
While I was driving end-user adoption, I also became a critical part of designing the solution itself. I work in the supply-chain space with internal users, making sure our products are ready for customers while our systems, processes, and tools continually improve for employees. That only works if our teams stay current with the changes we’re designing for them.
Q:
You specialize in Design Thinking and UX strategy—how do you weave user insights into your process from the very beginning?
A:
In an established corporate environment, we don’t go hunting for problems—the business brings them to us through Business Requirements Documents (BRDs) that outline use cases and current pain points. In parallel, we run periodic audits of existing tools, collect in-product quick-feedback surveys, and use a “bug nub” (a small bug icon in the UI) so anyone can file bugs or request features. Because the ecosystem is mature, we’re usually redesigning or improving something rather than building from scratch; about once a year we do ship a completely new tool for a team and then manage onboarding.
When a BRD comes in, we map the pain points and triage them: what’s a nice-to-have vs. what’s P0. P0 means the business cannot function without it—if a P0 breaks, the business breaks—so we tackle those first.
From there, the design process is standard: we identify the stakeholders who submitted the requirements and run quick interviews to confirm the five W’s and H (who, what, when, where, why, how) and close any gaps. Then we publish a plan: the three things we will accomplish this half, clearly separating what must happen immediately to keep the business running from what belongs in the long-term roadmap.
Q:
When you’re working on change management, how do you balance business goals with the human experience of design?
A:
Change is tricky. When we roll out new ways of working, the business often resists—people are used to doing things one way, and we’re asking them to do it another. The most effective lever is top-down alignment. We take a proposed solution to leadership and, when it maps to organizational goals, secure a clear mandate. That makes the change mandatory, not optional, and leaders can communicate, “This change is coming and we need to adopt it.”
Not every rollout is mandatory, though. I distinguish between mandatory tools and reference tools (nice to use, not required). For reference tools, we run a “rally” approach: we visit teams during their regular meetings and give a focused 15-minute demo—“Here’s the tool; here’s exactly how it benefits your team.” Crucially, we aim to have the team’s manager deliver and reinforce the message. If the instruction comes from a manager, people do it; if I pop into a room and say, “Stop using this button, use that one,” it doesn’t land the same way. When the message comes from the business, via the right manager, adoption accelerates.
A recent example (kept high-level due to sensitivity): shifting government-driven costs have impacted pricing. We built a reference tool to visualize pricing changes, but some teams still rely on spreadsheets out of habit. In these cases, I partner with the group that’s accountable for those expenses and have them champion the tool to the affected teams—so they can answer to leadership with confidence and the organization gets consistent, accurate data.
All of this is deeply cross-functional. I sit between engineering, design, the business, and end users—coordinating requirements, aligning incentives, and making sure the right people carry the right messages at the right time.
Q:
What role does collaboration play, and how do you bring cross-functional teams into the process?
A:
It’s highly cross-functional—solution leads, engineers, analytics, QA, and business stakeholders. Honestly, it’s “a lot more talking than doing” at the start—on purpose. We align on use cases, metrics, and constraints before we build. Weekly feedback loops let us present options (never a blank page) and tradeoffs. I don’t ask, “What do you want to see?”—I bring two or three viable concepts and facilitate a decision.

Q:
In digital product design, how do you gather and interpret insights to keep a concept on track?
A:
Gathering insights: I lean strongly on analytics. Because we’re often improving existing systems rather than starting from scratch, we begin by measuring current performance and identifying gaps. We partner closely with our (very large) Analytics team.
From each BRD we produce two artifacts: (1) product development requirements, and (2) an analytics requirements spec. Working with solution leads for the supply chain, we define which metrics matter and the exact definition of each (numerators/denominators, business rules, segments). All relevant event and system data is stored in a central data repository.
Analytics uses the spec to build dashboards that map gaps to requirements. They’ll literally chart, “this metric is low here, that one is high there,” against our targets. If the organization says the goal is 95%, and we’re at 87%, we run root-cause analysis using taxonomy slices (flow, geography, vendor, tool, milestone, etc.). For example, we might see in-transit delays spike and discover a missing milestone between a supplier and our system.
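To make the slicing idea concrete, here is a minimal sketch of what slice-level gap analysis can look like—not Priyankaa’s actual tooling, just an illustration in Python/pandas assuming a flat event table with hypothetical columns (flow, geography, vendor) and an on-time flag as the metric numerator:

```python
import pandas as pd

TARGET = 0.95  # hypothetical org-level goal (e.g., on-time milestone rate)

# Hypothetical event table; real schemas and metric definitions would live
# in the analytics requirements spec (numerators/denominators, rules, segments).
events = pd.DataFrame({
    "flow":      ["inbound", "inbound", "outbound", "outbound", "inbound"],
    "geography": ["NA", "EU", "NA", "EU", "EU"],
    "vendor":    ["v1", "v2", "v1", "v2", "v2"],
    "on_time":   [1, 0, 1, 1, 0],  # 1 = event hit its milestone on time
})

def gap_by_slice(df: pd.DataFrame, dimension: str) -> pd.DataFrame:
    """Compute the metric per slice of one taxonomy dimension, with gap vs. target."""
    out = (
        df.groupby(dimension)["on_time"]
          .agg(rate="mean", n="size")
          .reset_index()
    )
    out["gap"] = out["rate"] - TARGET
    return out.sort_values("gap")  # worst slices first: root-cause candidates

for dim in ["flow", "geography", "vendor"]:
    print(gap_by_slice(events, dim), "\n")
```

Sorting slices by their gap to target surfaces the likeliest root-cause candidates first—the same “this metric is low here” reading the dashboards provide.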
Q:
Physical product design leans on iterative prototyping. How do digital testing methods compare or complement that process?
A:
In our ecosystem, the UI kits and brand patterns reduce surface-level design decisions, so we can focus on solution design. For new tools, we still iterate: weekly reviews, side-by-side concepts, test environments, and structured test scripts. We run multiple layers before production: end-to-end team testing, QA, systems-integration testing (SIT) across upstream/downstream tools, and user acceptance testing (UAT). It’s iterative—just like physical prototyping—only the fidelity is software builds instead of foam and muslin.
Q:
Can you share an example where testing or validation revealed something surprising and shifted direction?
A:
We built a data-collection feature and tested single-record flows successfully—then discovered we hadn’t prepared the environment for multi-record scenarios. Everything failed, and we had to push the project by three months. Another time, notification settings weren’t validated in the dev environment; when we went to prod, thousands of emails failed over a weekend. Painful—but those misses made us tighten our checklists and guardrails.
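The multi-record miss maps to a simple guardrail any team can adopt: parametrize tests over batch sizes so multi-record paths run by default, not just the single-record happy path. A minimal sketch with pytest, using a hypothetical save_records stand-in for the real persistence call:

```python
import pytest

def save_records(records: list[dict]) -> int:
    """Hypothetical stand-in for the real data-collection save; returns rows written."""
    return len(records)

# Guardrail: never validate only the single-record case. Parametrizing over
# batch sizes forces multi-record scenarios into every test run.
@pytest.mark.parametrize("batch_size", [1, 2, 50, 1000])
def test_save_handles_batches(batch_size):
    records = [{"id": i, "value": f"item-{i}"} for i in range(batch_size)]
    assert save_records(records) == batch_size
```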
Q:
What advice would you give designers who want to strengthen their testing and validation skills—digital, physical, or hybrid?
A:
Communicate early. Build awareness that change is coming and invite feedback while it still matters.
Detach your ego. Don’t treat the work like your baby—critique is data.
Timebox and prioritize. Fix the P0 issues immediately; queue lower-priority requests for later.
Make it real. Whether it’s a clickable prototype or a rough build, tangible artifacts unlock better conversations than slides.
Q:
Anything else you’d like to share with young designers getting started?
A:
Apply steadily—even 15 minutes a night. You never know who’s looking. And practice showing unfinished work. Confidence grows when you iterate in public, learn fast, and move on.

—
Check out the rest of our Spotlight series to hear more from leaders in the design industry. Sign up for our newsletter and follow us on Instagram and LinkedIn for design news, multimedia recommendations, and more on product design and development!