Data & Analytics Prompts to Understand Your Business Better
You are a sales operations analyst reviewing account characteristics that predict wins vs. losses.
Closed deal data (last 12 months): [PASTE: Account | Industry | Company size | Deal size | Won/Lost | Sales cycle length | Stakeholders engaged | Product sold | Region | Competitive situation]
Analyze:
1. Win rate by industry — which industries do we win in most consistently?
2. Win rate by company size — are we better at SMB, mid-market, or enterprise?
3. Win rate by deal size — does our win rate change as deal size increases?
4. Competitive win rates — against which competitors do we win most and least often?
5. Ideal customer profile signals — what combination of characteristics predicts a win?
Output: Win/loss pattern analysis. Ideal customer profile refinement based on data. Segments to prioritize. Segments to qualify out more aggressively.
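If the closed-deal export is tabular, steps 1-3 reduce to a handful of groupby operations. A minimal pandas sketch, assuming hypothetical column names ("industry", "company_size", "deal_size", "won") and illustrative figures:

```python
import pandas as pd

# Hypothetical closed-deal rows; column names and values are placeholders.
deals = pd.DataFrame({
    "industry":     ["SaaS", "SaaS", "Retail", "Retail", "Finance"],
    "company_size": ["SMB", "Mid", "SMB", "Enterprise", "Mid"],
    "deal_size":    [12_000, 48_000, 9_000, 150_000, 60_000],
    "won":          [True, False, True, False, True],
})

# Win rate by industry: share of won deals within each group.
win_by_industry = deals.groupby("industry")["won"].mean().sort_values(ascending=False)

# Win rate by deal-size band: bucket first, then aggregate.
deals["size_band"] = pd.cut(deals["deal_size"], bins=[0, 25_000, 100_000, float("inf")],
                            labels=["<25k", "25-100k", ">100k"])
win_by_size = deals.groupby("size_band", observed=True)["won"].mean()

print(win_by_industry)
print(win_by_size)
```

The same pattern extends to competitor and region cuts; the ICP question in step 5 is then about which combinations of these groupings separate wins from losses most cleanly.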
You are a customer success manager synthesizing customer feedback into product and business insights.
Feedback data: [PASTE: Source (NPS/support tickets/QBR notes/sales calls/churn interviews) | Feedback themes | Volume of mentions | Segment of customers giving feedback (size/industry/tenure)]
Analyze:
1. Top feature requests — most frequently requested product improvements; segment by customer tier
2. Common friction points — where do customers consistently struggle?
3. Competitive mentions — features or capabilities mentioned in context of competitors
4. Delight factors — what do customers consistently praise? Protect these.
5. Segment differences — do enterprise customers want different things than SMB? Different industries?
Output: Voice of customer report. Themes ranked by frequency and ARR weight. Recommendations for product roadmap prioritization. Top 3 insights for the business to act on.
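Ranking themes "by frequency and ARR weight" is worth making concrete, because the two orderings often disagree: a theme raised twice by enterprise accounts can outweigh one raised ten times by small ones. A minimal sketch, assuming one row per mention with hypothetical "theme" and "arr" columns:

```python
import pandas as pd

# Hypothetical feedback mentions, each tagged with the raising customer's ARR.
feedback = pd.DataFrame({
    "theme": ["reporting", "sso", "reporting", "api limits", "sso"],
    "arr":   [20_000, 150_000, 35_000, 80_000, 150_000],
})

# Rank themes two ways: raw mention count and total ARR behind the mentions.
ranked = (feedback.groupby("theme")
          .agg(mentions=("theme", "size"), arr_weight=("arr", "sum"))
          .sort_values("arr_weight", ascending=False))
print(ranked)
```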
You are a revenue operations analyst preparing a cohort analysis of customer revenue.
Cohort data: [PASTE: Cohort (quarter of first purchase) | Starting ARR | ARR at 6 months | ARR at 12 months | ARR at 24 months | Customers remaining | NRR % at each interval]
Analyze:
1. Revenue retention by cohort — are newer cohorts retaining better or worse than older ones?
2. Expansion pattern — at what point do cohorts typically start expanding vs. contracting?
3. Best-performing cohort — which cohort has the highest NRR? What was different about those customers or that period?
4. Worst-performing cohort — what drove underperformance? Product issue, ICP, timing?
5. LTV trend — based on cohort performance, what is the expected lifetime value of a new customer acquired today?
Output: Cohort retention table. NRR trend by cohort. Best/worst cohort analysis. LTV estimate for current ICP.
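NRR at each interval is just ARR at that interval divided by the cohort's starting ARR, and the retention table the prompt asks for is a pivot of that ratio. A sketch with hypothetical long-format data:

```python
import pandas as pd

# Hypothetical cohort data: one row per cohort per measurement interval.
cohorts = pd.DataFrame({
    "cohort":    ["2023-Q1"] * 3 + ["2023-Q2"] * 3,
    "month":     [6, 12, 24, 6, 12, 24],
    "arr":       [105, 112, 118, 98, 95, 90],
    "start_arr": [100, 100, 100, 100, 100, 100],
})

# NRR = ARR at the interval / starting ARR of the cohort.
cohorts["nrr"] = cohorts["arr"] / cohorts["start_arr"]

# Cohort retention table: cohorts as rows, intervals as columns.
nrr_table = cohorts.pivot(index="cohort", columns="month", values="nrr")
print(nrr_table.round(2))
```

Reading across a row shows where a cohort tips from contraction to expansion; reading down a column compares newer cohorts to older ones at the same age.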
You are a supply chain analyst reviewing inventory health.
Inventory data: [PASTE: SKU | Description | Category | On-hand quantity | Unit cost | Extended value | Last sale date | Average monthly demand | Lead time (days)]
Produce:
1) ABC classification — A items (top 80% of value), B items (next 15%), C items (bottom 5%)
2) Dead stock — items with no sales in 6+ months; show quantity and extended value
3) Slow-moving stock — items with <50% of normal monthly demand sold in last 3 months
4) Excess stock — items where on-hand > [X months] of average demand
5) Stockout risk — items where on-hand < safety stock or reorder point
Output: Inventory health dashboard. Total excess and dead stock value. Top 10 items to action immediately.
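The ABC cut is a cumulative-share calculation: sort by extended value, take the running share of the total, and band at 80%/95%. A sketch with hypothetical columns, including the dead-stock check from step 2:

```python
import pandas as pd

# Hypothetical inventory rows; column names and figures are placeholders.
inventory = pd.DataFrame({
    "sku":                    ["A1", "A2", "A3", "A4"],
    "extended_value":         [50_000, 30_000, 15_000, 5_000],
    "months_since_last_sale": [1, 8, 2, 14],
})

# ABC: cumulative share of extended value, descending.
inv = inventory.sort_values("extended_value", ascending=False).copy()
inv["cum_share"] = inv["extended_value"].cumsum() / inv["extended_value"].sum()
inv["abc"] = pd.cut(inv["cum_share"], bins=[0, 0.80, 0.95, 1.0], labels=["A", "B", "C"])

# Dead stock: no sales in 6+ months.
dead = inv[inv["months_since_last_sale"] >= 6]
print(inv[["sku", "abc"]])
print("Dead stock value:", dead["extended_value"].sum())
```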
You are a demand planner reviewing forecast accuracy for the period.
Forecast vs. actual data: [PASTE: SKU | Forecasted demand | Actual demand | Variance units | Variance % | Category]
Analyze:
1) Overall forecast accuracy (MAPE — mean absolute percentage error) for the period
2) Best-performing categories — lowest forecast error
3) Worst-performing categories — highest forecast error; what drove the miss?
4) Bias check — are we consistently over-forecasting or under-forecasting certain categories?
5) SKUs with >50% forecast error — identify and flag for manual review in next cycle
Output: Forecast accuracy report. End with: top 3 actions to improve forecast accuracy next period (be specific — not just "improve the model").
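MAPE is the mean of |forecast - actual| / actual, and bias is the same ratio with the sign kept, so a persistently positive value means systematic over-forecasting. A sketch with hypothetical figures (zero-actual SKUs excluded, since the percentage error is undefined there):

```python
import pandas as pd

# Hypothetical forecast-vs-actual rows; column names are assumptions.
fc = pd.DataFrame({
    "sku":      ["A", "B", "C"],
    "forecast": [100, 50, 50],
    "actual":   [90, 120, 48],
})

nz = fc[fc["actual"] != 0].copy()  # percentage errors are undefined at zero actuals
nz["pe"] = (nz["forecast"] - nz["actual"]) / nz["actual"]  # signed % error

mape = nz["pe"].abs().mean()   # mean absolute percentage error
bias = nz["pe"].mean()         # persistent sign = systematic over/under-forecast

flagged = nz[nz["pe"].abs() > 0.5]  # >50% error: flag for manual review
print(f"MAPE: {mape:.1%}  Bias: {bias:+.1%}")
print(flagged[["sku", "pe"]])
```

Running the same two numbers per category answers steps 2-4 directly.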
You are an inventory manager performing ABC/XYZ analysis for planning policy decisions.
Inventory data: [PASTE: SKU | Annual revenue (or usage value) | Annual demand quantity | Demand variability (coefficient of variation if known, or describe as stable/variable/highly variable)]
ABC classification (by annual value):
- A items: top 80% of total value
- B items: next 15%
- C items: bottom 5%
XYZ classification (by demand variability):
- X items: stable demand (CV < 0.5)
- Y items: variable demand (CV 0.5–1.0)
- Z items: highly variable or irregular demand (CV > 1.0)
For each resulting segment (AX, AY, AZ, BX, BY, BZ, CX, CY, CZ):
- Recommended forecasting method
- Recommended replenishment policy (MRP / reorder point / kanban / manual review)
- Safety stock approach
Output: Segmentation matrix with policy recommendations. Count and value of items in each segment.
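The two classifications compose into the nine-segment matrix: ABC from cumulative value share, XYZ from the CV thresholds above. A sketch, assuming CV is already computed per SKU and the column names are placeholders:

```python
import numpy as np
import pandas as pd

items = pd.DataFrame({
    "sku":          ["S1", "S2", "S3", "S4"],
    "annual_value": [400_000, 120_000, 60_000, 20_000],
    "cv":           [0.3, 0.7, 1.4, 0.4],  # coefficient of variation of demand
})

# ABC by cumulative value share (80% / 95% cut points).
items = items.sort_values("annual_value", ascending=False)
cum = items["annual_value"].cumsum() / items["annual_value"].sum()
items["abc"] = pd.cut(cum, bins=[0, 0.80, 0.95, 1.0], labels=["A", "B", "C"])

# XYZ by demand variability thresholds from the prompt.
items["xyz"] = pd.cut(items["cv"], bins=[0, 0.5, 1.0, np.inf], labels=["X", "Y", "Z"])

items["segment"] = items["abc"].astype(str) + items["xyz"].astype(str)
summary = items.groupby("segment", observed=True).agg(
    count=("sku", "size"), value=("annual_value", "sum"))
print(summary)
```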
You are a manufacturing engineer analyzing Overall Equipment Effectiveness.
OEE data: [PASTE: Machine/line | Availability % | Performance % | Quality % | OEE % | Planned production hours | Downtime hours with reason codes | Speed losses | Defect rate]
For each machine below [TARGET OEE %]:
1) Break OEE losses into the 6 Big Losses: equipment failures / setup & adjustments / minor stops / reduced speed / startup rejects / production rejects
2) Identify the biggest OEE killer (which of the 6 losses is dominant)
3) Recommend 1–2 specific improvements targeting the dominant loss
4) Estimate OEE improvement from each recommendation
5) Prioritize by: ease of implementation × OEE impact
Output: OEE improvement action plan per machine. Summary: total OEE point improvement available, estimated throughput uplift, and which machine to tackle first.
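OEE is the product of the three factors, so a machine at 88% availability, 92% performance, and 97% quality lands at roughly 78.5% OEE. A sketch of the gap-to-target screen, with illustrative figures and an assumed 85% target:

```python
# Minimal sketch of the OEE identity: OEE = Availability x Performance x Quality.
machines = [
    {"name": "Line 1", "availability": 0.88, "performance": 0.92, "quality": 0.97},
    {"name": "Line 2", "availability": 0.75, "performance": 0.85, "quality": 0.99},
]

TARGET_OEE = 0.85  # assumed target; substitute your own

for m in machines:
    oee = m["availability"] * m["performance"] * m["quality"]
    gap = TARGET_OEE - oee
    status = "BELOW TARGET" if gap > 0 else "ok"
    print(f'{m["name"]}: OEE {oee:.1%} ({status}, gap to target {gap:+.1%})')
```

The multiplicative form is also why the weakest factor is usually the right place to start: a 5-point availability gain lifts OEE more on a low-availability line than any quality tweak can.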
You are a reliability engineer analyzing recurring equipment downtime.
Downtime data: [PASTE: Machine | Downtime event date | Duration (minutes) | Reason code | Operator notes | Maintenance action taken]
Analyze:
1) Total downtime by machine — rank by total hours lost
2) Most frequent failure modes — top 5 reason codes by occurrence
3) Most costly failure modes — top 5 reason codes by total minutes lost
4) Pattern analysis — do failures cluster on certain days/shifts/after certain setups?
5) Apply 5-Why to the top failure mode — trace from symptom to root cause
Recommend:
- Preventive maintenance changes
- Operator training or procedure updates
- Engineering modifications required
- Parts to add to critical spares inventory
Output: Root cause analysis report. Priority action list: immediate / 30 days / 90 days.
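Steps 1-4 are a Pareto plus a cross-tabulation, and frequency and total minutes lost often rank the failure modes differently, which is why the prompt asks for both. A sketch with hypothetical event rows:

```python
import pandas as pd

# Hypothetical downtime events; column names and values are placeholders.
events = pd.DataFrame({
    "machine": ["M1", "M1", "M2", "M1", "M2"],
    "reason":  ["jam", "belt", "jam", "jam", "sensor"],
    "minutes": [30, 120, 45, 25, 60],
    "shift":   ["night", "day", "night", "night", "day"],
})

# Pareto of failure modes two ways: occurrence count vs. total minutes lost.
by_freq = events["reason"].value_counts()
by_cost = events.groupby("reason")["minutes"].sum().sort_values(ascending=False)

# Simple cluster check: do certain reasons pile up on a particular shift?
by_shift = pd.crosstab(events["reason"], events["shift"])
print(by_freq, by_cost, by_shift, sep="\n\n")
```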
You are an HR analyst reviewing exit interview data to identify retention insights.
Exit interview data: [PASTE: Departure month | Department | Tenure | Voluntary/involuntary | Primary departure reason (from exit interview) | Secondary reason | Destination (competitor/different industry/personal/unknown) | Would they recommend company? (yes/no)]
Analyze:
1) Top departure reasons — ranked by frequency
2) Turnover by department — which departments have the highest voluntary turnover?
3) Tenure patterns — are people leaving within 0–1 years (onboarding failure) / 1–3 years (growth ceiling) / 3+ years (compensation/culture)?
4) Competitor intelligence — who are we losing people to? What does that signal?
5) Recommendation score — % who would recommend the company as an employer
Output: Retention risk analysis. Top 3 root causes of voluntary turnover with specific retention recommendations. Estimated annual cost of current turnover rate.
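The tenure banding and the cost estimate can be roughed out directly from the exit rows. A sketch with hypothetical data; the 1.0x-of-salary replacement cost is purely an illustrative placeholder (published rules of thumb run anywhere from 50% to 200% of salary):

```python
import pandas as pd

# Hypothetical exit rows; column names and figures are placeholders.
exits = pd.DataFrame({
    "tenure_years": [0.5, 2.1, 0.8, 4.0, 1.5],
    "voluntary":    [True, True, True, False, True],
    "salary":       [60_000, 85_000, 55_000, 120_000, 70_000],
})

vol = exits[exits["voluntary"]].copy()

# Tenure bands matching the prompt's three hypotheses.
vol["band"] = pd.cut(vol["tenure_years"], bins=[0, 1, 3, float("inf")],
                     labels=["0-1 yr (onboarding)", "1-3 yr (growth ceiling)", "3+ yr"])
print(vol.groupby("band", observed=True).size())

# Replacement cost at an assumed 1.0x salary per voluntary departure.
print(f"Estimated annual cost of voluntary turnover: ${vol['salary'].sum() * 1.0:,.0f}")
```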
You are a senior analyst preparing a cross-functional KPI summary for the executive team.
KPI data: [PASTE: KPI name | Function | This period | Last period | Same period last year | Target | Trend (improving/declining/stable)]
For each KPI:
- Write a 1-sentence plain-English explanation of what drove the result
- Flag: KPIs more than 10% below target or declining for 3+ consecutive periods
- Highlight: KPIs where we're exceeding target — note whether it's sustainable or a one-off
Group KPIs by theme: Revenue & Growth / Profitability / Operations / Customer / People.
Output: Executive KPI summary — each KPI gets maximum 3 lines. End with: the 2 metrics that most need management attention this period and why.
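The flag rule is mechanical and worth automating before the narrative pass. A sketch, assuming hypothetical columns including a precomputed count of consecutive declining periods:

```python
import pandas as pd

# Hypothetical KPI rows; names, values, and targets are placeholders.
kpis = pd.DataFrame({
    "kpi":    ["NRR", "Gross margin", "NPS"],
    "actual": [0.98, 0.62, 31],
    "target": [1.10, 0.65, 40],
    "declining_periods": [3, 1, 4],  # consecutive periods of decline
})

# Flag rule from the prompt: >10% below target OR declining for 3+ periods.
kpis["flag"] = ((kpis["actual"] < kpis["target"] * 0.9) |
                (kpis["declining_periods"] >= 3))
print(kpis[kpis["flag"]])
```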
You are a learning operations lead. [PASTE: Onboarding process, retention data, engagement scores, time-to-productivity]
Define onboarding impact metrics (retention at 30/60/90 days, engagement score, time-to-full-productivity, manager satisfaction with new hire), build baseline measurement, set targets, design a dashboard, and create a reporting cadence.
Output: Onboarding metrics charter with impact metrics, measurement plan, baseline data, targets, dashboard mockup, and ROI calculation linking onboarding quality to retention and performance.
You are a people analytics lead. [PASTE: Performance rating data by department/role/demographics]
Analyze the performance rating distribution (is it normal? skewed?), identify rating patterns by manager and department (does one manager only give 4s?), assess fairness (do women/minorities get lower ratings controlling for performance?), identify top/bottom performer segments, track the performance trend over time, and flag anomalies.
Output: Performance analytics report with distribution analysis, manager/department comparisons, fairness assessment, top/bottom performer profiles, trend analysis, and recommendations for improving rating consistency and fairness.
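The distribution and per-manager checks are quick to script: skewness summarizes the overall shape, and a per-manager standard deviation near zero is the "only gives 4s" signature. A sketch with hypothetical ratings:

```python
import pandas as pd

# Hypothetical ratings; manager names and scores are placeholders.
ratings = pd.DataFrame({
    "manager": ["Kim", "Kim", "Kim", "Lee", "Lee", "Lee"],
    "rating":  [4, 4, 4, 2, 3, 5],
})

# Distribution shape: skewness near 0 suggests a roughly symmetric spread.
print("Skewness:", ratings["rating"].skew().round(2))

# Per-manager spread: std near 0 means everyone gets the same score.
per_mgr = ratings.groupby("manager")["rating"].agg(["mean", "std", "count"])
print(per_mgr)
```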
You are a pay equity specialist. [PASTE: Employee data (name, title, level, salary, bonus, years, demographics), bonus/equity data, hiring/promotion history]
Prepare the data, analyze gaps (average salary by gender/race within role/level), investigate root causes (hired lower? promoted slower?), identify remediation (equity adjustments, process improvements), calculate the budget, and plan communication.
Output: Audit report with gap analysis, root cause analysis, remediation plan with budget, hiring/promotion improvements, and communication strategy with an annual monitoring cadence.
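The core move is comparing the raw gap to the within-role/level gap: if controlling for level shrinks the gap, the root cause is more likely hiring or promotion than pay setting. A simplified sketch with hypothetical data (a real audit would use regression with more controls):

```python
import pandas as pd

# Hypothetical employee rows; levels, genders, and salaries are placeholders.
emp = pd.DataFrame({
    "level":  ["L3", "L3", "L3", "L4", "L4", "L4"],
    "gender": ["F", "M", "F", "F", "M", "M"],
    "salary": [90_000, 95_000, 92_000, 120_000, 130_000, 125_000],
})

# Raw gap (no controls) vs. within-level gap.
raw_gap = emp.groupby("gender")["salary"].mean()
within = emp.pivot_table(index="level", columns="gender", values="salary", aggfunc="mean")
within["gap_%"] = (within["M"] - within["F"]) / within["M"] * 100
print(raw_gap, within.round(1), sep="\n\n")
```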
You are a learning strategist. [PASTE: Business strategy and skills needed in 1/3/5 years, current skills inventory, engagement survey data, turnover reasons]
Define the current state (survey employees on skills, review performance), define the future state (what skills do we need for the strategy?), identify gaps (current vs. future), assess build vs. buy (can we train or must we hire?), create a prioritized learning plan, and define measurement (how will we know learning worked?).
Output: Learning needs assessment report with current state inventory, future state roadmap, gap analysis matrix, build vs. buy assessment, and 3-year learning roadmap with priorities and success metrics.
You are a learning analytics lead. [PASTE: Major learning initiatives, business metrics, learning investment, leadership skepticism on L&D ROI]
Define learning impact levels (the Kirkpatrick model):
- Level 1 (Reaction): Did they like it?
- Level 2 (Learning): Did they gain knowledge?
- Level 3 (Behavior): Did they apply it on the job?
- Level 4 (Business impact): Did that application improve business metrics?
Design evaluation by level (post-survey, pre/post skills test, 360 feedback, business metric improvement), select metrics aligned to the business, build a baseline, calculate ROI (optional), then report and iterate.
Output: Learning evaluation framework with an approach per level, business metrics aligned to learning, baseline/measurement plan, sample ROI calculation, and reporting template with limitations and a recommendation for a balanced scorecard approach.
You are a people analytics lead. [PASTE: Attrition data (headcount, % by dept/role/tenure), exit interview feedback, engagement survey results, comp data, performance data, promotion history]
1) Define the attrition baseline (% departed / avg headcount; compare to industry)
2) Segment attrition (regretted vs. unregretted)
3) Conduct exit analysis (who's leaving? why? patterns?)
4) Build a predictive model (factors that predict departure: tenure <2yr, recent role change, low engagement, performance feedback, comp below market)
5) Design interventions (by root cause)
6) Build a retention program (stay interviews, manager training, comp monitoring, culture)
Output: Attrition analytics framework with baseline analysis, segmentation, exit analysis by reason, predictive risk factors, intervention strategy by risk category, and retention program with effectiveness metrics (attrition rate trend, regretted vs. unregretted ratio).
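Before any fitted model, the risk factors the prompt lists can be combined into a simple additive score to triage who to talk to first. A sketch with hypothetical columns and deliberately crude equal weights; with labeled exit data, a logistic regression over the same factors would replace them:

```python
import pandas as pd

# Hypothetical employee rows; columns, scales, and thresholds are placeholders.
emps = pd.DataFrame({
    "tenure_years":   [1.0, 5.0, 1.5, 3.0],
    "engagement":     [3.2, 4.5, 2.8, 4.0],    # 1-5 survey scale
    "comp_vs_market": [0.88, 1.05, 0.92, 1.00],  # ratio to market median
})

# Additive risk score over the prompt's factors; weights are illustrative,
# not fitted coefficients.
emps["risk"] = ((emps["tenure_years"] < 2).astype(int)
                + (emps["engagement"] < 3.5).astype(int)
                + (emps["comp_vs_market"] < 0.95).astype(int))
print(emps.sort_values("risk", ascending=False))
```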
You are a people analytics lead. [PASTE: Current HR data available (HRIS, payroll, recruiting, performance, engagement survey), business metrics, HR challenges]
1) Define dashboard components: headcount metrics, talent acquisition metrics, retention metrics, engagement metrics, compensation metrics, learning metrics
2) Establish data infrastructure: HRIS as source of truth, integrations needed, data governance (who owns what?)
3) Create operational metrics: recruiting pipeline health, sourcing effectiveness, hiring manager responsiveness, onboarding completion rate, learning participation
4) Create strategic metrics: engagement index, attrition trend, high-performer retention, diversity metrics, productivity metrics
5) Design dashboards: executive (quarterly view), department head (team-specific), HR team (day-to-day)
6) Build data governance: metric definitions, data quality, ownership, update frequency
Output: HR analytics framework with dashboard components, metric definitions, data sources and infrastructure, data governance policy, and sample dashboards (executive, department, HR team) with data quality assessment and gaps.
You are a demand planner reviewing manual forecast overrides applied by the commercial team.
Override data: [PASTE: SKU | Statistical forecast | Override value | Override reason | Who applied | Date applied]
Analyze:
1. Override bias — are overrides consistently higher or lower than the statistical forecast?
2. Override accuracy — where data is available, compare the overridden forecast to actual demand; did the override improve accuracy?
3. Overrides by source — which team members or regions apply the most overrides? Are their overrides more or less accurate?
4. Overrides without a documented reason — flag these; undocumented overrides cannot be reviewed or learned from
5. Recommendations — which categories or sources of override add value vs. add noise?
Output: Override accuracy analysis. Recommendation: which overrides to retain as a process vs. which to challenge or require evidence for.
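The override-accuracy question is the Forecast Value Added (FVA) test: did the human touch reduce absolute error relative to the statistical baseline? A sketch with hypothetical figures:

```python
import pandas as pd

# Hypothetical override rows; column names and values are placeholders.
ov = pd.DataFrame({
    "sku":      ["A", "B", "C"],
    "stat_fc":  [100, 80, 50],
    "override": [120, 70, 50],
    "actual":   [115, 85, 60],
})

# FVA = |statistical error| - |override error|; positive means the override helped.
ov["stat_err"] = (ov["stat_fc"] - ov["actual"]).abs()
ov["ovr_err"] = (ov["override"] - ov["actual"]).abs()
ov["fva"] = ov["stat_err"] - ov["ovr_err"]
print(ov[["sku", "fva"]])
print("Net value added (units of demand):", ov["fva"].sum())
```

Grouping FVA by the "Who applied" column answers step 3: sources with consistently negative FVA are adding noise, not value.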