Understanding User Experience

Explore top LinkedIn content from expert professionals.

  • Alex Wang (Influencer)

    Learn AI Together - I share my learning journey into AI & Data Science here, 90% buzzword-free. Follow me and let's grow together!

    1,127,651 followers

    Static textbooks might be outdated soon. Learn Your Way, Google's AI-powered learning tool, is one of the most thoughtful experiments I’ve seen in AI + education. It can turn a simple PDF into five personalized learning formats in one click.

    And instead of one-size-fits-all lessons, it adapts to you. Pick your grade level + interests → the content reshapes itself. Into space? Physics comes with rocket examples. Learning to code? It adjusts to your experience level.

    Where it’s at now:
    • Live in Google Labs as an official research experiment
    • Built on Google’s LearnLM + Gemini (pedagogy-first AI stack)

    💡What it already does well:
    🔹 Converts content into multiple formats (read, listen, slides, mind maps)
    🔹 Built-in quizzes + adaptive feedback
    🔹 Contextual examples that actually feel relevant
    🔹 Low-effort learning modes (like audio on commutes)

    It’s still early-stage (you can’t upload your own materials yet, and it’s in a tester phase), but what’s there already shows what AI + education could look like. Fun to explore. Link’s in the comments.

    For more on AI and learning materials, please check my previous posts. #education #ai #generativeai #edtech

  • Felix Haas

    Design at Lovable, Angel Investor

    93,000 followers

    Invisible UX is coming 🔥 And it’s going to change how we design products, forever.

    For decades, UX design has been about guiding users through an experience. We’ve done that with visible interfaces: menus, buttons, cards, sliders. We’ve obsessed over layouts, states, and transitions. But with AI, a new kind of interface is emerging: one that’s invisible. One that’s driven by intent, not interaction.

    Think about it. You used to:
    → Open Spotify
    → Scroll through genres
    → Click into “Focus”
    → Pick a playlist
    Now you just say: “Play deep focus music.” No menus. No tapping. No UI. Just intent → output.

    You used to:
    → Search on Airbnb
    → Pick dates, guests, filters
    → Scroll through 50+ listings
    Now we’re entering a world where you guide with words: “Find me a cabin near Oslo with a sauna, available next weekend.”

    So the best UX becomes barely visible. Why does this matter? Because traditional UX gives users options. AI-native UX gives users outcomes. Old UX: “Here are 12 ways to get what you want.” New UX: “Just tell me what you want and we’ll handle the rest.”

    And this goes way beyond voice or chat. It’s about reducing friction: designing systems that understand intent, respond instantly, and get out of the way. The UI isn’t disappearing. It’s mainly dissolving into the background.

    So what should designers do? Rethink your role. Going forward, you won’t just lay out screens. You’ll design interactions without interfaces. That means:
    → Understanding how people express goals
    → Guiding model behavior through prompt architecture
    → Creating invisible guardrails for trust, speed, and clarity

    You are, in essence, designing for understanding. The future of UX won’t be seen. It will be felt. Welcome to the age of invisible UX. Ready for it?
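    The "intent → output" contract above can be sketched in code. The toy, rule-based parser below is purely illustrative (a real product would use an LLM or NLU model, not keyword matching); every name in it is hypothetical.

```python
# Toy sketch of intent-driven UX: free text in, structured query out.
# Rule-based on purpose, so the *shape* of the contract is visible;
# a production system would delegate parsing to a language model.

def parse_stay_intent(utterance: str) -> dict:
    """Map a free-text request to the filters a booking backend expects."""
    text = utterance.lower()
    intent = {"type": None, "location": None, "amenities": []}

    for stay_type in ("cabin", "apartment", "villa"):
        if stay_type in text:
            intent["type"] = stay_type

    if "near " in text:
        # Take the word right after "near" as a rough location hint.
        after = text.split("near ", 1)[1]
        intent["location"] = after.split()[0].strip(",.")

    for amenity in ("sauna", "hot tub", "wifi"):
        if amenity in text:
            intent["amenities"].append(amenity)

    return intent

query = parse_stay_intent(
    "Find me a cabin near Oslo with a sauna, available next weekend")
print(query)
# {'type': 'cabin', 'location': 'oslo', 'amenities': ['sauna']}
```

    The user never sees menus or filters; the interface work moves into how reliably intent becomes structure.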

  • Vitaly Friedman (Influencer)

    Practical insights for better UX • Running “Measure UX” and “Design Patterns For AI” • Founder of SmashingMag • Speaker • Loves writing, checklists and running workshops on UX. 🍣

    222,847 followers

    ⏱️ How To Measure UX (https://lnkd.in/e5ueDtZY), a practical guide on using UX benchmarking, SUS, SUPR-Q, UMUX-LITE, CES, and UEQ to eliminate bias and gather statistically reliable results — with useful templates and resources. By Roman Videnov.

    Measuring UX is mostly about showing cause and effect. Of course, management wants to do more of what has already worked — and it typically wants to see ROI > 5%. But the return is more than just increased revenue. It’s also reduced costs and expenses, and mitigated risk. And UX is an incredibly affordable yet impactful way to achieve it.

    Good design decisions are intentional. They aren’t guesses or personal preferences. They are deliberate and measurable. Over the last years, I’ve been setting up design KPIs in teams to inform and guide design decisions. Here are some examples:

    1. Top tasks success > 80% (for critical tasks)
    2. Time to complete top tasks < 60s (for critical tasks)
    3. Time to first success < 90s (for onboarding)
    4. Time to candidates < 120s (nav + filtering in eCommerce)
    5. Time to top candidate < 120s (for feature comparison)
    6. Time to hit the limit of free tier < 7d (for upgrades)
    7. Presets/templates usage > 80% per user (to boost efficiency)
    8. Filters used per session > 5 per user (quality of filtering)
    9. Feature adoption rate > 80% (usage of a new feature per user)
    10. Time to pricing quote < 2 weeks (for B2B systems)
    11. Application processing time < 2 weeks (online banking)
    12. Default settings correction < 10% (quality of defaults)
    13. Search results quality > 80% (for top 100 most popular queries)
    14. Service desk inquiries < 35/week (poor design → more inquiries)
    15. Form input accuracy ≈ 100% (user input in forms)
    16. Time to final price < 45s (for eCommerce)
    17. Password recovery frequency < 5% per user (for auth)
    18. Fake email frequency < 2% (for email newsletters)
    19. First contact resolution < 85% (quality of service desk replies)
    20. “Turn-around” score < 1 week (frustrated users → happy users)
    21. Environmental impact < 0.3g/page request (sustainability)
    22. Frustration score < 5% (AUS + SUS/SUPR-Q + Lighthouse)
    23. System Usability Scale > 75 (overall usability)
    24. Accessible Usability Scale (AUS) > 75 (accessibility)
    25. Core Web Vitals ≈ 100% (performance)

    Each team works with 3–4 local design KPIs that reflect the impact of their work, and 3–4 global design KPIs mapped against touchpoints in a customer journey. The search team works with a search quality score, the onboarding team with time to success, the authentication team with password recovery rate.

    What gets measured gets better. And it gives you the data you need to monitor and visualize the impact of your design work. Once it becomes second nature in your process, not only will you have an easier time getting buy-in, you’ll also build enough trust to boost UX in a company with low UX maturity. [more in the comments ↓] #ux #metrics
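    A quick sketch of how thresholds like these can be monitored in practice. The KPI names, targets, and snapshot numbers below are illustrative stand-ins, not real product data.

```python
# Minimal KPI monitor: each metric has a target and a direction,
# since for some KPIs higher is better and for others lower is better.

KPI_TARGETS = {
    # metric: (target, direction); "min" = value must be at least target,
    # "max" = value must be at most target.
    "top_task_success_pct":  (80, "min"),
    "time_to_complete_s":    (60, "max"),
    "sus_score":             (75, "min"),
    "password_recovery_pct": (5,  "max"),
}

def evaluate_kpis(measured: dict) -> dict:
    """Return pass/fail per KPI so design impact can be tracked over time."""
    results = {}
    for name, value in measured.items():
        target, direction = KPI_TARGETS[name]
        results[name] = value >= target if direction == "min" else value <= target
    return results

snapshot = {"top_task_success_pct": 84, "time_to_complete_s": 72,
            "sus_score": 78, "password_recovery_pct": 3}
print(evaluate_kpis(snapshot))
# time_to_complete_s fails (72s > 60s target); the other three pass.
```

    Wiring a check like this into a dashboard is what turns "what gets measured, gets better" from a slogan into a weekly ritual.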

  • Brij kishore Pandey (Influencer)

    AI Architect | AI Engineer | Generative AI | Agentic AI

    710,087 followers

    Over the last year, I’ve seen many people fall into the same trap: they launch an AI-powered agent (chatbot, assistant, support tool, etc.), but only track surface-level KPIs — like response time or number of users. That’s not enough. To create AI systems that actually deliver value, we need holistic, human-centric metrics that reflect:
    • User trust
    • Task success
    • Business impact
    • Experience quality

    This infographic highlights 15 essential dimensions to consider:
    ↳ Response Accuracy — Are your AI answers actually useful and correct?
    ↳ Task Completion Rate — Can the agent complete full workflows, not just answer trivia?
    ↳ Latency — Response speed still matters, especially in production.
    ↳ User Engagement — How often are users returning or interacting meaningfully?
    ↳ Success Rate — Did the user achieve their goal? This is your north star.
    ↳ Error Rate — Irrelevant or wrong responses? That’s friction.
    ↳ Session Duration — Longer isn’t always better; it depends on the goal.
    ↳ User Retention — Are users coming back after the first experience?
    ↳ Cost per Interaction — Especially critical at scale. Budget-wise agents win.
    ↳ Conversation Depth — Can the agent handle follow-ups and multi-turn dialogue?
    ↳ User Satisfaction Score — Feedback from actual users is gold.
    ↳ Contextual Understanding — Can your AI remember and refer to earlier inputs?
    ↳ Scalability — Can it handle volume without degrading performance?
    ↳ Knowledge Retrieval Efficiency — This is key for RAG-based agents.
    ↳ Adaptability Score — Is your AI learning and improving over time?

    If you're building or managing AI agents — bookmark this. Whether it's a support bot, a GenAI assistant, or a multi-agent system — these are the metrics that will shape real-world success.

    Did I miss any critical ones you use in your projects? Let’s make this list even stronger — drop your thoughts 👇
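    A hedged sketch of computing three of these dimensions (task completion rate, error rate, latency) from raw interaction logs. The log schema here, with `task_completed`, `error`, and `latency_ms` fields, is hypothetical; adapt the field names to whatever your stack actually records.

```python
# Aggregate per-interaction logs into the human-centric metrics above.

def summarize_agent_metrics(logs: list[dict]) -> dict:
    """Roll raw agent interaction logs up into summary metrics."""
    total = len(logs)
    completed = sum(1 for e in logs if e["task_completed"])
    errors = sum(1 for e in logs if e["error"])
    avg_latency = sum(e["latency_ms"] for e in logs) / total
    return {
        "task_completion_rate": completed / total,
        "error_rate": errors / total,
        "avg_latency_ms": avg_latency,
    }

logs = [
    {"task_completed": True,  "error": False, "latency_ms": 420},
    {"task_completed": True,  "error": False, "latency_ms": 380},
    {"task_completed": False, "error": True,  "latency_ms": 910},
    {"task_completed": True,  "error": False, "latency_ms": 450},
]
print(summarize_agent_metrics(logs))
# {'task_completion_rate': 0.75, 'error_rate': 0.25, 'avg_latency_ms': 540.0}
```

    The softer dimensions (trust, satisfaction, contextual understanding) need surveys or human/LLM rating rather than log counting, which is exactly why surface-level KPIs alone fall short.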

  • Balchandra Kemkar

    LinkedIn Top Voice ‘24, ‘25 | Product Management | Financial Services | AI | Digital Banking Transformation | Industry Mentor | Views my own

    3,768 followers

    When Style Disrupts Safety: A Lesson in Product Design

    Today, while driving behind the sleek Mahindra BE.6E, a futuristic and stylish EV, I experienced something unexpected. The car in front of me braked suddenly. I had maintained a safe distance, so I stopped comfortably. But something felt off. Why did the braking catch me off guard?

    Then I realized: the brake lights were too subtle. The taillights are designed as a thin rectangular LED strip, stunning to look at, no doubt. But the brake lights occupy only a tiny section on the top edge of that strip. Visually stylish, but functionally weak. In real traffic conditions, where immediacy and clarity are critical, this design doesn’t help other drivers react intuitively.

    This reminded me of a fundamental product design principle: aesthetics must never come at the cost of usability. A good product delights not just by how it looks but by how well it works.

    Whether we’re designing:
    • A mobile app
    • A banking interface
    • Or a car’s tail lights
    …it’s our job as product managers and designers to make sure the experience is not just elegant, but intuitive, accessible, and safe.

    Lesson for us in Product Management: design for the user’s reality, not just the brand’s imagination. Functionality and clarity should never be hidden behind a glossy UI, whether it’s a screen or an LED strip on a car.

    #ProductManagement #UXDesign #Usability #AutomotiveDesign #DesignThinking #BuildWithEmpathy

  • Jordan McMorris

    Level Designer @ Apogee Entertainment

    4,509 followers

    The Secret Language of Level Designers

    What if I told you that level designers have a secret language we use to communicate with players? Ever wonder how you always know where to go, even without a map? Maybe you felt tension without a single word of dialogue. That’s not luck. That’s the hidden language of level design at work.

    As level designers, our job goes far beyond just building spaces. We craft experiences. And we do it through a subtle, intentional language made up of light, shape, rhythm, and space. Most players don’t consciously see level design, but they feel it. Every hallway, staircase, shadow, and prop is carefully placed to guide behavior, evoke emotion, and support the narrative. We don’t issue commands. We suggest, nudge, and invite.

    Here are just a few of the techniques we use to communicate through the world itself:

    Landmarking – Large structures, unique shapes, or color contrast that help players orient themselves and build mental maps.

    Lighting Cues – Warm, soft lighting signals safety or narrative importance. Harsh or dark areas introduce tension and uncertainty.

    Framing – Using geometry to subtly direct the player’s eye toward points of interest, similar to how cinematographers guide attention in film.

    Breadcrumbing – Placing pickups, enemies, or environmental details in patterns that subconsciously guide players toward their goal.

    Affordance – Designing elements to suggest their function: a waist-high ledge invites traversal, while a flickering exit sign implies urgency.

    Echoing – Repeating familiar layouts or motifs (like U-shaped corridors or blocked paths) to build rhythm, recognition, or suspense.

    Forced Perspective – Aligning objects and environmental elements to lead the eye, encouraging movement or curiosity.

    Even the smallest details — the tilt of a camera, the curve of a hallway, the placement of clutter — all contribute to an unspoken conversation with the player. We’re not just designing gameplay. We’re shaping emotion, behavior, and storytelling — all through space.

    And if you’re interested in learning these techniques and more: the next cohort of the Game Design Skills level design course, taught by Nathan Kellman and yours truly, starts in a few weeks. Now’s the time to reach out!

    #LevelDesign #GameDesign #NarrativeDesign #EnvironmentArt #UXDesign #GameDev #SpatialDesign #PlayerExperience

  • Hiten Shah

    CEO @ Crazy Egg (est. 2005), building tools teams use to make marketing decisions.

    43,694 followers

    I just got off the phone with a founder. It was an early Sunday morning call, and they were distraught. The company had launched with a breakout AI feature. That one worked. It delivered. But every new release since then? Nothing’s sticking.

    The team is moving fast. They’re adding features. The roadmap looks full. But adoption is flat. Internal momentum is fading. Users are trying things once, then never again. No one’s saying it out loud, but the trust is gone.

    This is how AI features fail. Because they teach the user a quiet lesson: don’t rely on this. The damage isn’t logged. It’s not visible in dashboards. But it shows up everywhere. In how slowly people engage. In how quickly they stop. In how support teams start hedging every answer with “It should work.” Once belief slips, no amount of capability wins it back.

    What makes this worse is how often teams move on. A new demo. A new integration. A new pitch. But the scar tissue remains. Users carry it forward. They stop expecting the product to help them. And eventually, they stop expecting anything at all.

    This is the hidden cost of broken AI. Beyond failing to deliver, it inevitably also subtracts confidence. And that subtraction compounds. You’re shaping expectation, whether you know it or not. Every moment it works, belief grows. Every moment it doesn’t, belief drains out. That’s the real game.

    The teams that win build trust. They ship carefully. They instrument for confidence. They treat the user’s first interaction like a reputation test, because it is. And they fix the smallest failures fast. Because even one broken output can define the entire relationship.

    Here’s the upside: very few teams are doing this. Most are still chasing the next “AI-powered” moment. They’re selling potential instead of building reliability. If you get this right, you become the product people defend in meetings. You become the platform they route their workflow through. You become hard to replace. Trust compounds. And when it does, it turns belief into lock-in.

  • Shewali Tiwari

    marketer under metamorphosis: creative. content-led. writer.

    22,981 followers

    At Airtel, I ran an iPhone giveaway marketing campaign three times, and to my surprise, none of them performed. Logically, you’d think offering a prize as attractive as an iPhone, especially during the launch period, would drive massive engagement. The assumption is that everyone would rush to participate, download your app, engage with the campaign, and complete the required actions. But what actually happened was the opposite. Engagement was shockingly, embarrassingly low.

    In contrast, campaigns that offered much smaller rewards—like a ₹1,000 or ₹500 voucher, or even just a free mobile recharge—generated higher participation rates, more app downloads, and greater overall engagement. But why does this happen?

    This outcome can be largely attributed to consumer psychology. When the reward seems too large or unattainable, people instinctively doubt their chances of winning. The concept of *perceived probability* comes into play here. When the prize is something as high-value as an iPhone, people immediately think, "What are the odds that I’ll actually win?" This skepticism causes them to disengage and not even bother trying, as they don't see the reward as realistically achievable.

    On the other hand, smaller, more attainable rewards feel within reach. A ₹1,000 voucher or a free recharge doesn’t carry the same sense of improbability. People feel like they have a real shot at winning something smaller, which encourages them to take the necessary actions, leading to better campaign results.

    In essence, psychology plays a far more critical role in shaping consumer behavior than we give it credit for.
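    The perceived-probability argument can be put in rough numbers. The prize values and probabilities below are invented purely for illustration; the point is only that participants weigh the prize by the odds they *believe* they have, not the odds on paper.

```python
# Back-of-the-envelope model: a participant's motivation scales with
# prize_value * perceived_win_probability, not prize_value alone.

def perceived_expected_value(prize_value: float, perceived_win_prob: float) -> float:
    """Subjective worth of entering, as the participant sees it."""
    return prize_value * perceived_win_prob

# Hypothetical numbers: a big prize is mentally discounted far more
# steeply than a small, attainable one.
iphone = perceived_expected_value(80_000, 0.0001)  # "I'll never win this"
voucher = perceived_expected_value(500, 0.05)      # "I might actually win"

print(round(iphone, 2), round(voucher, 2))
# The ₹500 voucher's perceived value (25.0) beats the iPhone's (8.0).
```

    Even though the iPhone is 160× more valuable, a steep enough perceived-odds discount makes the small reward the stronger motivator, matching what the campaigns showed.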

  • Anu Bharadwaj (Influencer)

    Builder, Advisor, Board director

    35,036 followers

    The most fascinating piece of research I've read this week: Anthropic's study on "disempowerment patterns" in AI—cases where AI distorts users' beliefs, values, or actions rather than informing them. Two findings stand out:

    1️⃣ Users actively seek these outcomes—asking "what should I do?" and accepting answers without pushback. The disempowerment comes not from AI overriding human agency, but from people voluntarily ceding it 🤯

    2️⃣ Users rate these harmful interactions more favorably in the moment. Satisfaction only drops after they've acted on the advice 🤦‍♀️

    We've seen 1️⃣ in productivity software before. Features that empower sophisticated users can create mindless dependency in others. The difference lies in whether the tool teaches you the "why" or just handles the "what". The most empowering products build capability over time, not just provide quick hacks for grunt work.

    We've also seen 2️⃣ before. For a decade, social platforms optimized for engagement that users validated through clicks and shares. Content that felt compelling in the moment—validation, tribal reinforcement—often left people worse off. The gap between what users wanted and what was good for them became a central tension.

    Designing AI products and experiences continues to be an incredibly creative and human endeavor... https://lnkd.in/g8K7A_4P

  • Diana Khalipina

    WCAG & RGAA web accessibility expert | Frontend developer | MSc Bioengineering

    13,558 followers

    15 activities to test mobile accessibility

    In the last 15 years, the internet has gone mobile. Every major platform — from news to shopping to social media — has invested in sleek mobile versions because that’s where people spend their time. 📊 In fact, more than 60% of web traffic now comes from mobile devices (source: https://lnkd.in/eeSrdHx4)

    We optimized for speed, performance, and design. But there’s one area where many mobile experiences still fall short: accessibility. And yet, mobile accessibility isn’t a niche concern. It affects everyone — whether you’re navigating with one hand while holding a coffee, trying to read in bright sunlight, or relying on a screen reader every single day.

    The good news is that you don’t need special tools to understand these challenges: your phone is already the perfect testing lab. That’s why I put together 15 quick activities to test mobile accessibility. Each one reveals how real people experience barriers and how small design choices can make a huge difference. Try these activities:

    1. Turn on VoiceOver (iOS) or TalkBack (Android) → Navigate your favorite app. Every unlabeled button or image will suddenly become invisible. Study: the WebAIM Screen Reader User Survey 9 shows that over 70% of users rely on mobile screen readers daily (https://lnkd.in/e9JeHsMx)

    2. Increase text size to maximum in settings → Does your layout adjust gracefully? Do words overlap and buttons disappear? WCAG criterion: 1.4.4 Resize Text (https://lnkd.in/eDaYZ8wS)

    3. Test color contrast outdoors → Step into bright sunlight. Can you still read the buttons? Fact: poor contrast is one of the most common accessibility issues.

    4. Switch your phone to grayscale → Do instructions still make sense without color cues? (“Click the green button” won’t work.) Study by WHO: around 300 million people worldwide have some form of color vision deficiency (https://lnkd.in/eD9PkQk7)

    5. Try captions on videos → Turn sound off. Are captions accurate, synced, and complete? Fact: 80% of caption users are not deaf or hard of hearing.

    6. Enable Dark Mode → Is content still clear, or do logos/icons disappear into the background?

    7. Try high-contrast mode (Android) or Smart Invert (iOS) → Does the app break visually?

    8. Test with one hand only → Can you still reach all main actions (especially on large phones)?

    9. Rotate the phone (portrait ↔ landscape) → Does the app adapt, or do important features vanish?

    10. Check hit targets → Can you tap small buttons without misclicking? WCAG requires a minimum 44×44px target size (https://lnkd.in/eNuZidir)

    Accessibility on mobile isn’t about edge cases, it’s about real-world design for real-world humans.

    #WebAccessibility #Inclusion #a11y #MobileAccessibility #WCAG
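    The contrast check in activity 3 can also be done numerically rather than by squinting in sunlight. The sketch below follows the WCAG 2.x relative-luminance and contrast-ratio formulas; the sample colors are arbitrary.

```python
# WCAG 2.x contrast check: compute relative luminance of each color,
# then the contrast ratio (L_lighter + 0.05) / (L_darker + 0.05).

def relative_luminance(rgb: tuple) -> float:
    """WCAG relative luminance for an 8-bit sRGB color."""
    def channel(c8: int) -> float:
        c = c8 / 255
        # Linearize the sRGB-encoded channel value.
        return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
    r, g, b = (channel(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg: tuple, bg: tuple) -> float:
    """Contrast ratio between two colors, from 1:1 up to 21:1."""
    lighter, darker = sorted(
        (relative_luminance(fg), relative_luminance(bg)), reverse=True)
    return (lighter + 0.05) / (darker + 0.05)

# Black on white is the maximum possible contrast, 21:1.
print(round(contrast_ratio((0, 0, 0), (255, 255, 255)), 1))  # 21.0

# WCAG 1.4.3 (AA) requires at least 4.5:1 for normal-size body text;
# a #767676 gray on white just clears that bar.
print(round(contrast_ratio((118, 118, 118), (255, 255, 255)), 2))
```

    Automated checkers (Lighthouse, axe) run essentially this computation across every text node, but knowing the formula helps when you need to justify a specific color pick to a designer.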
