AI Data Scientist
Role: Product Designer
Industry: B2B, SaaS, GTM, AI
Timeline: Summer 2025
Team: 1 Designer, 1 PM, 3 Engineers
Context
Amoeba AI is a neuro-symbolic AI platform that acts like a data scientist embedded inside your go-to-market (GTM) team.
Amoeba AI is designed for businesses and professionals who need actionable insights from their data without relying on a dedicated data science team.
Problem
There are currently too many ways to create an exploration, and there is no clear distinction between explorations led by Amoeba and purely custom explorations built by the user. Most marketers are not technical and are unsure how to start exploring their data.
Solution
Data Lab: a space where users create data explorations to uncover insights in their data by chatting with Amoeba, which behaves like a data scientist. Key deliverables:
2 types of explorations: Amoeba-driven and user-driven
Chat field as the landing page, emphasizing Amoeba-driven explorations
Latest Explorations section with easily accessible exploration projects
Conversational chat interface and features for users to engage with Amoeba and capture insights
Amoeba suggestions of relevant sources based on neuro-symbolic processes
Results
Metrics in progress. Acquired 3 new enterprise customers.
Launched Product
Understanding the Data Lab & My Role
The purpose of Data Lab is to let users easily design, refine, and test marketing experiments without requiring a data science background: they can validate strategies and creative ideas in a controlled, risk-free environment to see what works before committing resources.
My role spanned product discovery through designing the flows for creating an exploration, bookmarking insights, and monitoring specific explorations.
Competitive Analysis
To combine core AI chat functionality with marketing and data-discovery features, I reviewed leading AI analytics and task management tools.
Key takeaways:
Analytical AI tools use side panels with tabs (insights, notes, outline, etc.)
Conversational vs. research-focused AI agents (e.g., Gemini vs. Perplexity)
Encouraging users to write a custom query vs. using a suggested prompt
User Flow
Product Discovery & Brainstorming
After creating low-fidelity wireframes and critiquing them with the product manager and engineers, I iterated and created mid-fidelity wireframes to prepare for user interviews.
Iterations of Chat Interface Page with Insights, Bookmarks, and Sources
Iterations of Data Lab Landing Page
Customer Interview
To understand the core pain points, I interviewed two customers who hold different roles.
Daniel: head of the marketing team
Appreciated the open, spacious design for managing complex data.
Requested better organization of recent explorations, suggesting categorical grouping.
Proposed a monitoring feature for timely updates.
Alex: marketing operations manager
Wanted clear differentiation between Amoeba-led and custom explorations.
Preferred a chat-first interface.
Recommended color-coded pills to visually separate prompt types.
Suggested organizing explorations with custom tags.
These insights directly informed design priorities: clarity between exploration types, streamlined chat-based workflows, and enhanced organization and recall of insights.
Insights -> Features
After rounds of interviews and iterations, I consolidated these key features for the Data Lab.
Monitoring
Subscribe to explorations and receive a daily digest on your preferred platform.
Bookmarking
Save important responses from Amoeba within the chat and jump back to them later.
Summary & insights
Amoeba generates summaries and insights based on the conversation, with support for a manual refresh.
Auto sources integration
Amoeba intelligently selects sources for the user and provides its reasoning when asked.
Final Designs:
Data Lab Landing page
Chat Interface
Deep Dive into the Landing Page Challenge: Prioritizing Guided Prompts Over Custom Chat
The Data Lab's landing page presented a core design tension. While most AI applications feature a prominent custom chat field, Amoeba's primary value lies in its "Amoeba-led prompts"—curated starting points for exploration.
Our challenge was to steer users toward these valuable prompts without completely hiding the familiar chat function.
During user testing, I gathered 2 pivotal insights:
Users preferred simple, concise phrasing for the prompt pills. On a data-intensive product, they favored scannability over descriptive, lengthy prompts.
Initially, 3 users suggested moving the prompts below the custom chat field, aligning with conventional layouts. However, after I explained that "Amoeba-led" explorations are the core feature, they immediately grasped the value and agreed with the top placement. This validated my decision to prioritize the guided prompts.
Iterations below.
Final Decision: I went with V3 because of its emphasis on Amoeba prompts, its clean interface for a data-heavy product, and its clear information hierarchy, validated by customer interviews.
Launched Product
Takeaways
If I had more time, I would work on prototyping interactions and spend more time with engineers to implement smoother transitions between tabs and pop-ups. Since the Data Lab is still in beta, I also want to continue gathering user feedback from real usage (misclicks, time it takes to start an exploration, etc.).
Nonetheless, I appreciated the experience of working on an AI product as well as the close collaboration with engineers. It was rewarding to see my designs go from Figma to staging and ultimately to production.
Below are some key areas of reflection.
Designing an AI Product
Designing for AI required more than just a chat box. Research showed a blank prompt intimidated users, so my focus became designing the entire conversation. By using guided starters, query examples, and helpful empty states, I onboarded users to the AI's capabilities. The key learning: a successful AI interface must actively guide the user and manage expectations from the very first interaction.
Designing a B2B SaaS Product
The core challenge was designing a data-heavy product for a non-technical GTM audience who needed answers, not raw data. My design approach prioritized actionable insights over complexity, using goal-oriented dashboards and clear AI responses. This solidified a key principle: effective B2B design must translate complex system data into clear, strategic actions for the user.
Product Discovery & Testing with Engineers
Many bugs and issues were discovered during user acceptance testing sessions and sprints with engineers. For example, sorting explorations was more complicated than simply adding tags to each card; it required the data in the backend to be categorized first, as sketched below.
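To illustrate the kind of backend dependency this uncovered, here is a minimal TypeScript sketch, assuming a hypothetical Exploration record and groupByCategory helper (neither is Amoeba's actual data model): grouping cards only works once the backend populates a category field on each record, whereas tags alone live on the frontend card.

```typescript
// Hypothetical sketch (not Amoeba's actual schema): grouping exploration cards
// requires the backend record to carry a category, not just a UI-level tag.
interface Exploration {
  id: string;
  title: string;
  createdAt: string; // ISO timestamp
  category?: "Amoeba-led" | "Custom"; // must be populated by the backend
  tags: string[]; // free-form labels the user adds on the frontend
}

// Group explorations for a "Latest Explorations" view. Records without a
// backend category fall into an "Uncategorized" bucket, which is exactly
// the gap we uncovered during testing.
function groupByCategory(explorations: Exploration[]): Map<string, Exploration[]> {
  const groups = new Map<string, Exploration[]>();
  for (const exploration of explorations) {
    const key = exploration.category ?? "Uncategorized";
    const bucket = groups.get(key) ?? [];
    bucket.push(exploration);
    groups.set(key, bucket);
  }
  return groups;
}
```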
Value of Customer Interviews
In interviews, users often requested out-of-scope features. Instead of dismissing these tangents, I learned to see them as signals of deeper pain points. The main takeaway was to always listen for the underlying "why" behind a user's requested "what." This practice was crucial for maintaining a focused product vision while still solving the right core problems.