
Finding Practical Applications of AI in Spreadsheets

April 28, 2025

Executive Summary

This market research project tasks you with finding and validating a practical application of AI within spreadsheets. You will:

  1. Explore core AI spreadsheet functions and demonstrate their use with personal examples.
  2. Research potential real-world spreadsheet problems where AI could offer a solution.
  3. Define one specific problem, the proposed AI solution, the target user, and develop a spreadsheet use-case to illustrate the application.
  4. Develop and conduct a survey for potential users to validate the problem’s prevalence and the solution’s value.
  5. Submit a summary report, use-case spreadsheet, and survey findings.

Timeline: 7 weeks of phased work (May 5 - June 20, 2025). Due Date: Final deliverables due June 27, 2025.

Overview

Spreadsheets are ubiquitous in business, yet many tasks remain manual and time-consuming, especially those involving text analysis, data cleaning, or pattern recognition. AI integrated into spreadsheets offers powerful automation possibilities.

The goal of this project is to identify a specific, real-world spreadsheet challenge, propose a practical AI-driven solution, illustrate how it would work, and validate its potential value with target users. Each team member will focus on developing and validating one such application, while collaborating to share insights and feedback with the rest of the team.

Ensure you have a Microsoft account (signup.live.com) for access to Excel Online and to resources on the project SharePoint site, where project workbooks will be stored and shared.

Market Research Process

This is a rough outline of the process. We can adjust as we go along based on how we are performing and what issues people are encountering.

Phase 1: AI Function Exploration & Demo (Week 1)

  • Objective: Familiarize yourself with core AI functions in Excel and demonstrate understanding through practical examples.
  • Activities:
    • Review the core AI functions available in Excel:
      • AI_ASK: Querying data, generating text, summarizing.
      • AI_EXTRACT: Pulling info from text.
      • AI_TABLE: Generating tables.
      • AI_LIST: Creating lists.
      • AI_FILL: Completing patterns.
      • AI_FORMAT: Standardizing data.
      • AI_CHOICE: Categorizing data.
    • Review examples at Applications of AI in a Spreadsheet.
    • Create a Demo Workbook (Excel file stored on the team SharePoint). In this workbook, create separate sheets demonstrating how at least 4 of the above AI functions could be applied to simple tasks or data relevant to your personal experience (e.g., organizing personal finances, planning a trip, managing schoolwork, analyzing hobby data). Clearly label the input data, the AI function used, and the output.
  • Deliverable: A Demo Workbook illustrating the use of at least 4 different AI functions with personal examples.
  • Weekly Update: Share your Demo Workbook. Discuss which functions were easiest/most challenging to apply and initial thoughts on their potential usefulness for broader spreadsheet tasks.
  • Due Date: May 9, 2025

Phase 2: Idea Generation & Research (Weeks 2-3)

  • Objective: Identify potential spreadsheet problems AI could solve based on real-world usage.
  • Activities:
    • Talk to people in your network (school, work, clubs, family, friends) about how they use spreadsheets (Excel).
    • Ask open-ended questions to uncover time-consuming, repetitive, or complex tasks, especially those involving text, data cleaning, pattern matching, or categorization.
    • Focus on understanding the context and pain points associated with these tasks.
    • Document potential problems and the types of users experiencing them.
  • Deliverable: For each person you speak with, include their contact information and notes from your conversation covering their situation and how the problem arose. Also provide a documented list of 3-5 potential spreadsheet problems identified, including context, pain points, and user types for each.
  • Weekly Update: Share 1-2 promising problems discovered, discuss research methods (who you talked to, key questions asked), and initial thoughts on potential AI applicability, potentially linking back to functions explored in Phase 1.
  • Due Date: May 23, 2025

Phase 3: Application Definition (Weeks 3-4)

  • Objective: Select one promising problem, define a specific AI application, and create a tangible example.
  • Activities:
    • Choose the problem/application you find most compelling and potentially valuable from your Phase 2 list.
    • Write a clear Problem Statement: What is the specific task, who does it, why is it difficult/time-consuming currently?
    • Define the Proposed AI Solution: How would AI help? Which AI capabilities (e.g., specific functions like AI_EXTRACT, AI_CHOICE, or general concepts like classification, summarization) would be used? Describe conceptually how the user would interact with it in the spreadsheet.
    • Identify the Target Market/User: Who specifically would benefit most from this solution?
    • Develop Use-Case Spreadsheet: Create a simple Excel sheet that illustrates the problem and how your proposed AI solution would work. Include sample data (anonymized) and show where the AI function/feature would be used and what its output would look like. This is distinct from the Phase 1 Demo Workbook; it focuses on the specific problem you defined. A fully working AI implementation is optional but valuable if feasible.
  • Deliverables:
    • A draft document containing the Problem Statement, Proposed AI Solution description, and Target Market/User definition.
    • Use-Case Spreadsheet illustrating the defined application.
  • Weekly Update: Present your chosen problem, the draft definition document, and the draft use-case spreadsheet. Seek feedback from the team on clarity, feasibility, potential value, and the illustration’s effectiveness. Discuss any challenges.
  • Due Date: May 30, 2025

Phase 4: Validation Survey (Weeks 5-7)

  • Objective: Gather user feedback on the defined problem’s prevalence and the proposed solution’s value.
  • Activities:
    • Develop Survey: Create a short survey (5-10 questions) for people who fit your target user profile defined in Phase 3. Questions should aim to:
      • Confirm they experience the problem you’ve defined.
      • Understand how often they face it and how much time it takes.
      • Gauge their interest in your proposed AI solution (potentially showing a screenshot or description from your Use-Case Spreadsheet).
      • Assess the perceived value (e.g., “How helpful would this be on a scale of 1-5?”, “Would this save you significant time?”).
    • Conduct Survey: Distribute your survey to relevant individuals (aim for 10-15 responses if possible) and collect their feedback.
  • Deliverables:
    • Survey questionnaire.
    • Collected survey responses.
  • Weekly Update:
    • (Week 5): Share draft survey questions for feedback. Discuss target audience for the survey and distribution plan.
    • (Week 6): Report on survey distribution progress and initial responses/feedback received.
    • (Week 7): Share preliminary analysis of survey results. Discuss key findings regarding problem validation and perceived solution value.
  • Due Dates:
    • Survey Questions: June 6, 2025
    • Survey Distribution: June 13, 2025
    • Final Survey Results & Analysis: June 20, 2025

Team Collaboration & Progress

While each student develops their own application, teamwork is essential. Use weekly team meetings to cover the “Weekly Update” items for the current phase and:

  • Share research findings and interesting problems discovered.
  • Discuss potential AI solutions and get feedback on chosen application definitions.
  • Share progress on use-case spreadsheets.
  • Collaborate on survey design and share best practices for reaching target respondents.
  • Discuss survey results and interpretation.
  • Help each other overcome roadblocks.

Keep your own notes and documentation tracking your research, application definition, spreadsheet design, and survey results.

Final Deliverables

Due June 27, 2025, each student will submit finalized versions of:

  1. One-Page Application Summary: A concise document covering:
    • The Problem Statement
    • The Proposed AI Solution
    • The Target Market/User
    • Key Findings from your Validation Survey
  2. Use-Case Spreadsheet: The Excel file illustrating the problem and the conceptual AI solution (from Phase 3).
  3. Survey Results Summary: A brief summary or appendix showing the questions asked and the aggregated, anonymized responses.
  4. (Optional but Recommended): Link to your Phase 1 Demo Workbook on SharePoint.

What you will get out of this project

This project provides hands-on experience in market research, user interviewing, problem definition, conceptual solution design, and user validation within the context of emerging AI technology. You can add the following experience to your resume:

Spring 2025 - AI Spreadsheet Application Researcher (Project-Based), Boardflare, Remote

  • Familiarized with core spreadsheet AI functions (e.g., AI_ASK, AI_EXTRACT, AI_CHOICE) and demonstrated their application.
  • Conducted market research and user interviews to identify real-world spreadsheet challenges suitable for AI solutions.
  • Defined specific user problems and designed conceptual AI-driven spreadsheet applications to address them.
  • Developed illustrative spreadsheet use-cases demonstrating proposed AI functionality (e.g., data extraction, categorization, pattern filling).
  • Created and executed user surveys to validate problem prevalence and gauge the perceived value of proposed AI solutions.
  • Analyzed survey data to assess market need for specific AI spreadsheet applications.
  • Collaborated with a team to share research findings and refine application concepts.

Frequently Asked Questions

Q: Do I need to build a fully working AI solution in the spreadsheet?
A: Not necessarily for the final Use-Case Spreadsheet (Phase 3). The primary goal there is to illustrate the concept. However, in Phase 1, you are expected to use the actual AI functions in your Demo Workbook to demonstrate your understanding of how they work with simple, personal examples. A working solution in the Phase 3 spreadsheet is encouraged if feasible but not the main requirement.

Q: What’s the difference between the Phase 1 Demo Workbook and the Phase 3 Use-Case Spreadsheet?
A: The Phase 1 Demo Workbook is for exploring and demonstrating multiple AI functions using your own simple data/examples. The Phase 3 Use-Case Spreadsheet focuses on one specific problem identified during your research, illustrating how a specific proposed AI solution (which might use one or more AI functions) would address that problem using representative (often anonymized) data relevant to the target user.

Q: How should we collaborate as a team?
A: Use the weekly meetings to share progress (including Phase 1 demos), give feedback on each other’s deliverables (problem lists, definitions, spreadsheets, surveys), help refine survey questions, and share strategies for finding survey participants. However, each student is responsible for their own Phase 1 Demo, and for defining, illustrating, and validating their own unique application in subsequent phases.

Q: What if I’m struggling to find a good application idea?
A: Discuss potential areas with your teammates during weekly updates. Think broadly about different industries or roles. Revisit the AI function examples – could they solve a problem you or someone you know has encountered? Talk to more people about their spreadsheet tasks.

Q: Do I need actual spreadsheet samples from users?
A: Getting anonymized samples illustrating the problem you’re focusing on is very helpful for designing your use-case spreadsheet accurately. However, if users can clearly describe the problem and data structure, you can create realistic sample data yourself. Always emphasize anonymity if requesting samples.

Q: How detailed should my solution definition be?
A: It should clearly explain the user’s pain point, how the proposed AI feature would work within the spreadsheet interface (conceptually), which AI capability it uses (e.g., extraction, classification), and who would benefit. The accompanying spreadsheet should visually represent this.

Q: How many people should I survey?
A: Aim for 10-15 responses from people who genuinely experience the problem you’ve defined. This provides a reasonable basis for validating interest and perceived value.

Example Deliverables

Here are examples of what the deliverables for each phase and the final submission might look like, focusing on business use-cases.

Phase 1: Demo Workbook

(Illustrates understanding of AI functions with personal/simple examples)

Example 1: AI_LIST for Business Trip Packing

| Input Prompt (Cell A1) | AI Function Used (Cell B1) | Output List (Cell C1 onwards) |
| --- | --- | --- |
| “Create a packing list for a 3-day business conference” | =AI_LIST(A1) | Laptop & Charger; Business Attire (x3); Casual Outfit; Toiletries; Notebook & Pen; Business Cards; Phone & Charger |

Example 2: AI_EXTRACT for SKU Extraction

| Input Text (Cell A2) | AI Function Used (Cell B2) | Extracted SKU (Cell C2) |
| --- | --- | --- |
| “Order #1234 includes item WIDGET-BLUE-XL quantity 2 and GADGET-RED quantity 1.” | =AI_EXTRACT(A2, "product SKU") | WIDGET-BLUE-XL |
| “Please ship 3 units of PART-9987 and 1 unit of ASSEMBLY-ABC to the main office.” | =AI_EXTRACT(A3, "product SKU") | PART-9987 |
| (More rows with different order descriptions) | (Formula applied down column) | (Expected AI Output) |

(This workbook would contain multiple sheets, each demonstrating a different AI function applied to simple business or personal productivity data.)

Phase 2: Potential Problem List

(Documents potential spreadsheet problems identified through research. All information for each contact should be in a single table. Instead of contact info, use the company name. Include a column for whether an example spreadsheet was provided, and the filename if so.)

Example Format:

| Name | Company | Notes from Conversation | Problem | Context | Pain Points | User Type | Example Spreadsheet Provided |
| --- | --- | --- | --- | --- | --- | --- | --- |
| Jane Smith | Acme Events | Jane uses Excel to track event leads. She spends 1-2 hours weekly copying contact info from emails. | Manually extracting contact info (name, email, phone) from inquiries | Small business owners/event planners receiving leads via text/email. | Time-consuming copy-pasting, prone to errors, delays follow-up. | Independent consultants, small agency staff. | event-leads-sample.xlsx |
| Bob Lee | WidgetCo | Bob manages customer feedback in spreadsheets and finds manual categorization slow and inconsistent. | Categorizing customer feedback comments from surveys/reviews | Product teams analyzing qualitative feedback. | Tedious manual reading/tagging, slow trend identification. | Product Managers, UX Researchers, Support Leads. | (none) |

(List would contain 3-5 such problem descriptions based on student research)

Phase 3: Application Definition Document & Use-Case Spreadsheet

(Defines the chosen problem, proposed AI solution, target user, and illustrates it)

Application Definition Document Example 1: Contact Extraction

Problem Statement

Independent professionals and small agencies, particularly those in high-touch, client-facing roles (e.g., event planners, consultants, therapists, freelance creatives, small law firms), are constantly managing inbound communication across a fragmented digital landscape. Potential client inquiries arrive via email threads, website contact forms with varying fields, direct messages on platforms like LinkedIn, Instagram, or Facebook, and occasionally even SMS or transcribed voicemails. Within these diverse, often informal text streams, critical contact details—names, specific email addresses (sometimes multiple per contact), direct phone lines or mobile numbers (in numerous formats like (555) 123-4567, 555.123.4567, +1-555-123-4567), and company affiliations—are embedded non-uniformly.

Example Problem Data:

| Source Text | Platform | Received Date |
| --- | --- | --- |
| “Hi, I’m interested in your services for a corporate event in October. My name is Sarah Chen from Acme Corp. You can reach me at [email protected] or call 555-123-4567. Thanks!” | Email | 2025-05-01 |
| “Inquiry about wedding planning - John Doe. Best email is [email protected], phone is 555-987-6543.” | Web Form | 2025-05-02 |
| “From LinkedIn DM: Maria Garcia, VP at Globex Inc. [email protected] / +1-555-555-1212 wants to discuss consulting.” | LinkedIn | 2025-05-03 |

Currently, the dominant workflow involves a human meticulously reading each message, visually scanning for these data points, highlighting them (mentally or physically), and then performing manual copy-and-paste operations into a central tracking system, most commonly an Excel or Google Sheets spreadsheet. This multi-step process is not merely tedious; it represents a significant drain on productive time, often consuming several hours per week per employee. The cognitive load of switching between communication platforms and the spreadsheet, coupled with the repetitive nature of the task, contributes to fatigue and increases the likelihood of errors.

Proposed AI Solution

The proposed solution applies Excel’s existing AI_EXTRACT function directly within the user’s spreadsheet workflow. The user pastes the raw text block from an email, message, or form submission into a designated cell (e.g., A2). In adjacent columns, they utilize the AI_EXTRACT function multiple times, specifying the desired piece of information for each column:

  • In B2: =AI_EXTRACT(A2, "person's name") -> Extracts the primary person’s name.
  • In C2: =AI_EXTRACT(A2, "email address") -> Extracts the most likely primary email.
  • In D2: =AI_EXTRACT(A2, "phone number") -> Extracts the primary phone number, potentially recognizing various formats.
  • In E2: =AI_EXTRACT(A2, "company name") -> Extracts the associated company name.

Each function call runs when the formula is entered or the sheet recalculates, instructing the AI to parse the unstructured text in cell A2 and populate the respective cell with the identified data point based on the provided description (e.g., “email address”).

Example AI Transformation:

| Source Text (Input Cell A2) | Extracted Name (Output Cell B2) | Extracted Email (Output Cell C2) | Extracted Phone (Output Cell D2) | Extracted Company (Output Cell E2) |
| --- | --- | --- | --- | --- |
| “Hi, I’m interested in your services for a corporate event in October. My name is Sarah Chen from Acme Corp. You can reach me at [email protected] or call 555-123-4567. Thanks!” | Sarah Chen | [email protected] | 555-123-4567 | Acme Corp |
| “Inquiry about wedding planning - John Doe. Best email is [email protected], phone is 555-987-6543.” | John Doe | [email protected] | 555-987-6543 | (blank or N/A) |
| “From LinkedIn DM: Maria Garcia, VP at Globex Inc. [email protected] / +1-555-555-1212 wants to discuss consulting.” | Maria Garcia | [email protected] | +1-555-555-1212 | Globex Inc |

This approach relies on the underlying AI’s ability to understand the context and patterns associated with different types of contact information within natural language text. The AI models powering AI_EXTRACT are trained on diverse datasets, enabling them to identify names, emails, phone numbers (even with varied formatting like (555) 123-4567 or +1-555-123-4567), and company names embedded within sentences, signatures, or lists. While a single function call extracts one piece of information, applying it across multiple columns provides a structured way to deconstruct the contact details from the source text.

Target Market/User

The primary target users are resource-constrained individuals and micro-to-small teams (typically <10 people) operating in service-based industries where personalized client interaction is key. This includes, but is not limited to: independent event planners, freelance consultants (marketing, business, IT), virtual assistants managing client communications, small marketing/creative agencies, solo practitioners (therapists, coaches, financial advisors), real estate agents, small non-profit donor relations coordinators, and customer intake staff at small professional service firms (law, accounting).

These users almost universally rely on spreadsheets (Excel primarily, but also Google Sheets) as their central hub for lead management, client tracking, project status, and basic operational organization due to familiarity, accessibility, and cost-effectiveness. They often lack the budget, time, or technical inclination to implement and maintain dedicated CRM platforms, which may feel overly complex or expensive for their needs. Their technical proficiency typically ranges from basic to intermediate spreadsheet skills; they are comfortable with formulas but wary of complex macros or external integrations requiring significant setup.

Key motivations driving their work include maximizing personal productivity, minimizing non-billable administrative time, maintaining accuracy in client communications, appearing professional and responsive, and ultimately, growing their business or serving their clients more effectively. They value practical, tangible solutions that provide immediate time savings and reduce friction in their daily workflows. While increasingly aware of AI, they may harbor skepticism regarding its accuracy and reliability, demanding demonstrable performance and ease of use. Data privacy is paramount, especially for those handling sensitive client information. A secondary user group includes managers or business owners who rely on the accuracy of the data collected by their team for reporting, forecasting, and strategic decision-making.

Application Definition Document Example 2: Feedback Categorization

Problem Statement

Teams dedicated to enhancing products, services, and customer experiences—spanning Product Management, User Experience (UX) Research, Customer Success, Support Operations, and Market Analysis—are confronted with a rapidly growing deluge of qualitative customer feedback. This vital data flows in from an expanding array of channels: open-ended questions in NPS or CSAT surveys, detailed reviews on app stores (Google Play, Apple App Store) or third-party sites (G2, Capterra), transcripts from customer support chats and calls, comments on social media posts and forums, direct emails, and submissions via in-app feedback widgets. While this feedback contains invaluable insights into user needs, pain points, and desires, its sheer volume and unstructured nature create a formidable analytical challenge.

Example Problem Data:

| Feedback Text | Source | Date | User ID |
| --- | --- | --- | --- |
| “The checkout process was confusing and took too long.” | Survey | 2025-04-15 | user123 |
| “I love the new feature, but the price seems a bit high compared to competitors.” | App Review | 2025-04-16 | user456 |
| “It would be great if you could integrate with our accounting software.” | Email | 2025-04-18 | user789 |
| “The mobile app keeps crashing whenever I try to access my reports.” | Support Chat | 2025-04-20 | user101 |

The standard practice involves analysts or team members manually reading every single piece of feedback, interpreting the core sentiment and topics discussed, and then assigning one or more predefined category tags (e.g., “Usability Issue,” “Login Problem,” “Pricing Concern,” “Feature Request - Reporting,” “API Bug,” “Positive Sentiment - Support”). This manual classification process is exceptionally labor-intensive, often consuming days or weeks for large datasets, creating a significant bottleneck that delays the identification of critical issues and emerging trends. A severe bug mentioned in multiple reviews might go unnoticed for days, impacting user retention. A popular feature request might be slow to bubble up, causing the product to lag behind competitors.

Proposed AI Solution

We propose utilizing the standard AI_CHOICE function in Excel to automate and standardize the feedback classification process directly within the user’s spreadsheet analysis environment. A user would paste or import their raw feedback text into one column (e.g., Column A). In a separate range (either on the same sheet or a dedicated configuration sheet, e.g., Config!A1:A10), they would list their predefined, relevant category labels (e.g., “Usability,” “Pricing,” “Bug,” “Feature Request”). In an adjacent column (e.g., Column B), the user would apply the formula: =AI_CHOICE(A2, Config!$A$1:$A$10).

This function instructs the AI to analyze the feedback text in cell A2 and select the single most appropriate category from the provided list (Config!$A$1:$A$10). The AI evaluates the semantic meaning and context of the feedback against the meaning of each category label to make the best match.

Example AI Transformation:

| Feedback Text (Input Cell A2) | Assigned Category (Output Cell B2, using =AI_CHOICE(A2, Config!$A$1:$A$10)) |
| --- | --- |
| “The checkout process was confusing and took too long.” | User Experience |
| “I love the new feature, but the price seems a bit high compared to competitors.” | Pricing |
| “It would be great if you could integrate with our accounting software.” | Feature Request |
| “The mobile app keeps crashing whenever I try to access my reports.” | Bug Report |

This leverages the AI’s text classification capabilities, built upon models trained to understand natural language. By providing a constrained list of choices, the user guides the AI to categorize feedback according to their specific, custom-defined taxonomy.
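For intuition about what AI_CHOICE replaces, here is a hypothetical Python sketch that approximates constrained categorization with keyword scoring (the category names mirror the example taxonomy; the keyword lists are invented). The gap between this and AI_CHOICE is the point: a substring matcher misses paraphrases such as “the app dies on launch,” while a language model matches on meaning.

```python
# Hypothetical keyword scorer standing in for AI_CHOICE's semantic matching.
# A real language model matches meaning; this matches literal substrings only,
# so paraphrased feedback would be missed.
CATEGORY_KEYWORDS = {
    "Pricing": ("price", "cost", "expensive", "cheaper"),
    "Feature Request": ("integrate", "would be great", "wish", "add support"),
    "User Experience": ("confusing", "slow", "checkout", "hard to"),
    "Bug Report": ("crash", "error", "broken", "doesn't work"),
}

def choose_category(feedback, categories=CATEGORY_KEYWORDS):
    """Pick the category whose keywords appear most often in the feedback text."""
    text = feedback.lower()
    scores = {cat: sum(kw in text for kw in kws) for cat, kws in categories.items()}
    best = max(scores, key=scores.get)
    return best if scores[best] > 0 else "Uncategorized"

print(choose_category("The mobile app keeps crashing whenever I try to access my reports."))
```

Because the user supplies the category list in both cases, switching taxonomies requires no retraining: editing the Config range (or, in the sketch, the dictionary) is enough.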

Target Market/User

The target users are professionals and teams whose core responsibilities involve understanding and acting upon the voice of the customer, particularly those who currently utilize spreadsheets for organizing, analyzing, or reporting on qualitative feedback data. This encompasses a range of roles across different departments and company sizes: Product Managers prioritizing backlogs and defining roadmaps, UX Researchers identifying usability friction points, Customer Support/Success Managers tracking recurring issues and customer sentiment, Market Analysts monitoring brand perception and competitive positioning, and Data Analysts tasked with synthesizing qualitative insights for business intelligence.

These users operate in diverse environments, from tech startups (SaaS, mobile apps) and e-commerce businesses analyzing reviews, to larger enterprises gathering feedback on internal tools or services, to game development studios monitoring player forums. While some may have access to specialized Voice-of-Customer (VoC) platforms or text analytics software, spreadsheets often remain the tool of choice for initial data aggregation, ad-hoc analysis, sharing findings with specific teams, or handling feedback from sources not integrated into primary platforms. They value flexibility and control, particularly the ability to define and refine their own categorization taxonomies that are specific to their product, industry, and analytical goals, rather than being constrained by predefined schemas.

Their analytical objectives are concrete: quickly surface critical bugs, quantify the frequency of specific feature requests, track sentiment shifts related to product launches or policy changes, identify key drivers of customer dissatisfaction or delight, and provide evidence-based recommendations to stakeholders. They require tools that are not only powerful but also intuitive, integrating seamlessly into their workflow without requiring extensive training or complex setup. Accuracy, reliability, the ability to customize categories, and performance capable of handling potentially large datasets (tens of thousands of rows) within the spreadsheet environment are key requirements. The value proposition centers on efficiency gains (automating a tedious task), speed-to-insight (identifying issues and trends faster), and improved data quality (consistency and objectivity) to support better decision-making.

Use-Case Spreadsheet Example 1: Contact Extraction for Event Planners (Illustrating Solution for Problem #1)

| Input Text (Cell A2) | Extracted Name (Formula: =AI_EXTRACT(A2, "person's name")) | Extracted Email (Formula: =AI_EXTRACT(A2, "email address")) | Extracted Phone (Formula: =AI_EXTRACT(A2, "phone number")) |
| --- | --- | --- | --- |
| “Hi, I’m interested in your services for a corporate event in October. My name is Sarah Chen from Acme Corp. You can reach me at [email protected] or call 555-123-4567. Thanks!” | Sarah Chen | [email protected] | 555-123-4567 |
| “Inquiry about wedding planning - John Doe. Best email is [email protected], phone is 555-987-6543.” | John Doe | [email protected] | 555-987-6543 |
| (More rows with different examples) | (Expected AI Output) | (Expected AI Output) | (Expected AI Output) |

Use-Case Spreadsheet Example 2: Customer Feedback Categorization (Illustrating Solution for Problem #2)

(Assume categories “Pricing”, “Feature Request”, “User Experience”, “Bug Report” are listed in cells C1:C4)

| Customer Feedback (Cell A2) | Assigned Category (Formula: =AI_CHOICE(A2, C$1:C$4)) |
| --- | --- |
| “The checkout process was confusing and took too long.” | User Experience |
| “I love the new feature, but the price seems a bit high compared to competitors.” | Pricing |
| “It would be great if you could integrate with our accounting software.” | Feature Request |
| “The mobile app keeps crashing whenever I try to access my reports.” | Bug Report |
| (More rows with different feedback comments) | (Expected AI Output) |

(The Application Definition Document defines the problem/solution/user. The Use-Case Spreadsheet visually demonstrates the specific business problem, the input data, the application of existing AI functions like AI_EXTRACT and AI_CHOICE, and the desired output for the application defined.)

Phase 4: Survey Questionnaire & Results Summary

(Develops survey questions and summarizes findings from target users)

Example Survey Questions (for Contact Extraction Tool):

  1. What is your primary role? (e.g., Event Planner, VA, Agency Staff)
  2. How often do you receive inquiries with contact info embedded in text? (Daily, Weekly, Monthly, Rarely)
  3. How do you currently extract this information? (Manual copy/paste, Other tool, N/A)
  4. Estimate how much time you spend weekly on this task. (e.g., <30min, 30-60min, 1-2hrs, >2hrs)
  5. (Show a concept/screenshot of the proposed contact-extraction solution from your Use-Case Spreadsheet) On a scale of 1 (Not helpful) to 5 (Extremely helpful), how helpful would this AI feature be for you?
  6. Do you believe this feature would save you significant time? (Yes/No/Unsure)
  7. Any concerns about using AI for this? (e.g., Accuracy, Privacy)
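Once responses are collected, the aggregation itself is straightforward. A short Python sketch (using invented, hypothetical responses, not the figures from the example summaries) shows how the helpfulness and time-savings questions roll up into the kinds of percentages reported:

```python
from collections import Counter

# Hypothetical responses (not real survey data) to Q5 (helpfulness, 1-5)
# and Q6 (would this save you significant time?).
helpfulness = [5, 4, 5, 3, 4, 5, 4, 5, 2, 4, 5, 4, 5, 4, 5]
saves_time = ["Yes"] * 14 + ["No"]

# Share rating 4/5 or 5/5, and share answering "Yes".
top_box = sum(1 for r in helpfulness if r >= 4) / len(helpfulness)
agree = saves_time.count("Yes") / len(saves_time)

print(Counter(helpfulness))
print(f"Rated 4/5 or 5/5 for helpfulness: {top_box:.0%}")
print(f"Would save significant time: {agree:.0%}")
```

With only 10-15 respondents, report raw counts alongside percentages (e.g., 13 of 15) so readers can judge the sample size for themselves.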

Example Survey Results Summary (Contact Extraction Tool):

| Aspect | Details |
| --- | --- |
| Target Audience | Event Planners |
| Number of Respondents | 15 |
| Key Finding 1 | 80% spend >1 hour weekly manually extracting contacts. |
| Key Finding 2 | 90% rated the proposed AI tool 4/5 or 5/5 for helpfulness. |
| Key Finding 3 | 93% agreed the tool would save significant time. |
| Qualitative Insight | High accuracy needed, especially for varied phone formats and informal names. |

Example Survey Results Summary (Feedback Categorization Tool):

| Aspect | Details |
| --- | --- |
| Target Audience | Product Managers, UX Researchers, Support Managers |
| Number of Respondents | 12 |
| Key Finding 1 | 75% manually categorize >50 feedback items weekly (avg 2-3 hrs). |
| Key Finding 2 | 85% rated the proposed AI tool 4/5 or 5/5 for helpfulness. |
| Key Finding 3 | Perceived benefits: time savings, faster trend ID, consistency. |
| Qualitative Insight | Need for custom categories and handling multiple categories per feedback item. |