✏️Prompts

Data & Analytics Prompts to Build or Create Something

10 prompts

You are a data visualization strategist translating complex data into clear, actionable infographics that communicate insights at a glance. Your role is to identify when visualization will actually improve understanding and story impact.

Provide:
- [PASTE: Data you want to visualize (attach or describe)]
- [PASTE: Key insight you want the infographic to communicate]
- [PASTE: Audience (technical, C-suite, industry peers, etc.)]
- [PASTE: Where this will live (blog post, pitch deck, social, print, etc.)]
- [PASTE: Context or story around the data]

Brief includes:
1. Data analysis:
   - What the data actually shows (separate from what you hope it shows)
   - Relevant comparisons (vs. industry, vs. time, vs. segments)
   - Most compelling data points
   - What data might be misleading or need context
2. Visualization recommendation:
   - Best chart type for this data (bar, pie, line, map, scatter, etc.)
   - Why this visualization type works better than alternatives
   - Data points to highlight vs. omit
3. Design approach:
   - Visual metaphor or theme (if applicable)
   - Color strategy (use of brand colors, differentiating data sets)
   - Simplified vs. detailed version (if for different audiences)
4. Story/narrative:
   - What's the headline/key takeaway
   - How to lead viewers through the visualization
   - Supporting text or annotations
   - Call-to-action or next step
5. Technical specs:
   - Dimensions and aspect ratio (vertical for mobile social, horizontal for web, etc.)
   - File format (vector for scaling, raster for web)
   - Animation or interactivity needs
6. Validation:
   - Does this visualization actually clarify the data or confuse it?
   - Can someone understand the insight in 5 seconds?
   - Any data accuracy issues to fact-check?

Provide the brief in a way designers can execute without data analysis expertise.
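The chart-type decision in step 2 can be sketched as a rule of thumb. The mapping below is an illustrative assumption to adapt, not part of the prompt itself:

```python
# Hypothetical helper for step 2: suggest a chart type from the shape of the
# data. The rules are illustrative assumptions, not a complete taxonomy.

def recommend_chart(kind: str, n_categories: int = 0) -> str:
    """Return a chart-type suggestion for a given data shape."""
    if kind == "trend_over_time":
        return "line"                      # continuity over time matters
    if kind == "part_of_whole":
        # pies get hard to read past a handful of slices
        return "pie" if n_categories <= 5 else "bar"
    if kind == "comparison_across_categories":
        return "bar"
    if kind == "relationship_between_variables":
        return "scatter"
    return "table"                         # fall back to showing the numbers

print(recommend_chart("part_of_whole", n_categories=8))  # bar
```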

Designer · Data Analyst · Marketer

You are a product manager designing the product metrics dashboard.

Business context: [DESCRIBE: Product type (B2B SaaS / consumer app / marketplace / platform), stage of company (early / growth / scale), current metrics being tracked, key decisions the dashboard should support, stakeholder audience (product team / leadership / board)]

Design the dashboard:
1) North star metric — the single metric that best captures the value customers get from the product; everything else should lead to this
2) Input metrics — leading indicators that drive the north star (activation rate / feature adoption / engagement frequency)
3) Health metrics — lagging indicators of product health (retention / NPS / support ticket volume)
4) Business metrics — revenue impact (MRR / expansion / churn) connected to product performance
5) Alerting — which metrics should trigger an alert if they move significantly in either direction?

Output: Dashboard design. North star metric defined. Input, health, and business metrics. Alert thresholds. Review cadence recommendation.
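The alerting rule in point 5 can be sketched as a simple threshold check. The 15% threshold below is an assumed example, not a recommendation:

```python
# Illustrative sketch of point 5 (alerting): flag a metric when it moves more
# than a set percentage in either direction vs. the prior period.
# threshold_pct = 15.0 is an assumed default, to be tuned per metric.

def should_alert(current: float, prior: float, threshold_pct: float = 15.0) -> bool:
    if prior == 0:
        return current != 0  # any movement away from zero is notable
    change_pct = abs(current - prior) / abs(prior) * 100
    return change_pct > threshold_pct

print(should_alert(current=420, prior=500))  # True: a 16% drop
```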

Data Analyst · Executive

You are a data analyst designing a BI dashboard for a SaaS business.

Dashboard context: [DESCRIBE: Audience (exec/sales/product/CS/finance), decisions the dashboard should support, data sources available, BI tool in use, current reporting gaps, any known data quality issues]

Design the dashboard:
1) Purpose statement — what decision does this dashboard support? One primary purpose per dashboard.
2) Key metrics — the 5–8 most important metrics for this audience; resist the urge to show everything
3) Trend view — every key metric should show current vs. prior period and vs. target
4) Drill-down capability — ability to segment by dimension (customer segment / product / geography / rep)
5) Alerting — which metric movements should trigger a notification to the dashboard owner?

Output: Dashboard design specification. Metric list with definitions. Dimension/drill-down map. Alert thresholds. Data source requirements.
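The trend view in point 3 reduces to two deltas per metric. A minimal sketch, with metric names and numbers invented for illustration:

```python
# Sketch of point 3 (trend view): each key metric rendered with its change
# vs. the prior period and vs. target. "MRR" and the figures are examples.

def trend_row(name: str, current: float, prior: float, target: float) -> str:
    vs_prior = (current - prior) / prior * 100
    vs_target = (current - target) / target * 100
    return f"{name}: {current:,.0f} ({vs_prior:+.1f}% vs prior, {vs_target:+.1f}% vs target)"

print(trend_row("MRR", current=105_000, prior=100_000, target=110_000))
# MRR: 105,000 (+5.0% vs prior, -4.5% vs target)
```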

Data Analyst · Executive

You are a product analyst building the analytics instrumentation plan for a new product feature.

Feature data: [DESCRIBE: Feature name, user flow (step by step), key behaviors to track, business questions the data should answer, current instrumentation (if any), analytics tool in use (Mixpanel/Amplitude/Heap/custom)]

Build the instrumentation plan:
1) Events to track — for each user action in the feature flow, define the event name and properties to capture
2) Event naming convention — consistent naming schema (noun_verb: "file_uploaded" / "report_exported") for maintainability
3) User properties — what user attributes should be available for segmentation? (plan tier / account size / days active)
4) Funnel definition — the ordered sequence of events that defines feature adoption
5) Success metrics — what data will confirm the feature is performing as intended?

Output: Instrumentation plan. Event dictionary. User property list. Funnel definition. Success metrics. QA testing checklist to confirm tracking is working.
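The noun_verb convention in point 2 can be enforced mechanically before events ship. The regex below is a naive sketch (lowercase snake_case, past-tense "-ed" verb), assumed for illustration rather than a complete validator:

```python
import re

# Sketch of point 2: validate a noun_verb event naming schema such as
# "file_uploaded" / "report_exported". The "-ed" suffix check is a naive
# heuristic (it would reject e.g. "user_signed_up") assumed for illustration.

EVENT_NAME = re.compile(r"^[a-z]+(_[a-z]+)*_[a-z]+ed$")

def valid_event_name(name: str) -> bool:
    return bool(EVENT_NAME.match(name))

print(valid_event_name("file_uploaded"))   # True
print(valid_event_name("UploadFile"))      # False
```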

Data Analyst · Developer

You are a data leader building the data governance policy.

Context: [DESCRIBE: Company stage, data sensitivity (customer PII/financial/health), current data management practices, regulatory environment, team responsible for data governance, any prior data incidents or compliance findings]

Write the policy:
1) Data classification — levels (public / internal / confidential / restricted) with definition and handling requirements for each
2) Data ownership — each data domain has an owner responsible for quality, access, and usage decisions
3) Access controls — who can access what data? Approval process for sensitive data access
4) Data retention — how long is data kept? When and how is it deleted?
5) Acceptable use — what is data allowed to be used for? What is prohibited?

Output: Data governance policy. Classification schema. Ownership matrix. Access control procedure. Retention schedule. Acceptable use statement.
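The classification schema in point 1 is most useful when expressed as data that access tooling can consume. A minimal sketch, with generic example requirements (not a compliance recommendation):

```python
# Sketch of point 1 (data classification): the four levels with machine-
# readable handling requirements. The specific requirements are invented
# examples, not legal or compliance guidance.

CLASSIFICATION = {
    "public":       {"encryption_at_rest": False, "approval_to_access": None},
    "internal":     {"encryption_at_rest": False, "approval_to_access": None},
    "confidential": {"encryption_at_rest": True,  "approval_to_access": "manager"},
    "restricted":   {"encryption_at_rest": True,  "approval_to_access": "data_owner"},
}

def handling(level: str) -> dict:
    """Look up handling requirements for a classification level."""
    return CLASSIFICATION[level]

print(handling("restricted")["approval_to_access"])  # data_owner
```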

Data Analyst · IT & Ops · Executive

You are a product analyst building the A/B testing framework.

Business context: [DESCRIBE: Traffic volume (needed to determine feasible test durations), current testing tool (Optimizely/LaunchDarkly/custom), types of tests run (pricing/UX/onboarding/messaging), decision-making process for shipping winners, any current testing anti-patterns observed]

Build the framework:
1) Hypothesis format — "We believe [change] will cause [outcome] because [rationale]"
2) Success metrics — primary metric (what the test is trying to move) + guardrail metrics (what must not get worse)
3) Sample size and duration — minimum detectable effect / statistical significance level / power; calculate test duration needed
4) Segmentation — who is included in the test? Confirm randomization is correct.
5) Decision criteria — exactly what results trigger a "ship it" vs. "iterate" vs. "revert" decision

Output: A/B testing framework. Hypothesis template. Sample size calculator guidance. Decision criteria table. Anti-pattern list to avoid.
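The sample-size calculation in point 3 can be sketched with the standard normal-approximation formula for comparing two proportions. The z-values below are hardcoded for the common 95% significance / 80% power case; both are assumptions to adjust per test:

```python
from math import sqrt

# Sketch of point 3: per-arm sample size for a two-proportion A/B test,
# using the standard normal-approximation formula. Z_ALPHA and Z_BETA are
# fixed here for two-sided alpha = 0.05 and power = 0.80 (assumptions).

Z_ALPHA = 1.95996  # two-sided significance level 0.05
Z_BETA = 0.84162   # power 0.80

def sample_size_per_arm(p_base: float, mde_abs: float) -> int:
    """p_base: baseline conversion rate; mde_abs: minimum detectable absolute lift."""
    p1, p2 = p_base, p_base + mde_abs
    p_bar = (p1 + p2) / 2
    num = (Z_ALPHA * sqrt(2 * p_bar * (1 - p_bar))
           + Z_BETA * sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
    return int(num / (p1 - p2) ** 2) + 1  # round up to a whole user

# e.g. baseline 10% conversion, hoping to detect a 2-point absolute lift
print(sample_size_per_arm(0.10, 0.02))  # roughly 3,800 users per arm
```

Dividing the per-arm size by daily traffic per arm then gives the test duration the prompt asks for.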

Data Analyst · Marketer

Design a business review template.

Business type: [describe]
Cadence: [monthly / quarterly]
Audience: [leadership / investors / board / dept heads]
Key metrics: [list main KPIs]
Current format: [describe reviews today]

Template with:
1. Executive summary (3-5 takeaways, conclusions only — no data)
2. Performance vs targets: 5-8 key metrics with vs-prior and vs-target
3. Deep dive section: rotating focus each review
4. What's working: 2-3 drivers of positive results
5. What's not: 2-3 problems with root cause and proposed action
6. Decisions needed: specific asks
7. Forward look: what to watch next period

Executive · Founder · Data Analyst

Help me design a KPI dashboard for [FUNCTION — e.g. sales team, marketing department, customer support].

Audience: [who will look at this dashboard? What decisions do they make?]
Update frequency: [daily, weekly, monthly?]
Data sources available: [list what you have access to]

Please define:
1. The 5–8 most important metrics this dashboard should show (with definitions)
2. For each metric: how to calculate it, what 'good' looks like, and what action it should trigger
3. Recommended layout (what goes at the top, what's secondary)
4. What to cut — common metrics that look good but don't drive decisions

Data Analyst · Executive · IT & Ops

Write a specification for a custom dashboard.

Purpose: [describe]
Users: [role, how often, decisions they make]
Data sources: [databases, tools, APIs]
Update frequency: [real-time / daily / weekly]

For each metric or visualization:
1. Name and definition
2. How to calculate
3. Data source and field names
4. Visualization type (number, line, bar, table, heat map)
5. Time dimension (today / last 7 days / MTD / rolling 30)
6. Filters needed
7. Alert threshold

Also specify: layout and which metrics are most prominent.
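The seven per-metric fields can be captured as a structured record so specs are easy to validate and version. A minimal sketch; the field names mirror the template and the sample MRR metric is invented for illustration:

```python
from dataclasses import dataclass
from typing import List, Optional

# Sketch of the per-metric spec: one record per dashboard metric, with
# fields matching items 1-7 of the template. The MRR example is hypothetical.

@dataclass
class MetricSpec:
    name: str                  # 1. name and definition
    definition: str
    calculation: str           # 2. how to calculate
    source_field: str          # 3. data source and field names
    viz_type: str              # 4. number | line | bar | table | heat_map
    time_dimension: str        # 5. today | last_7_days | mtd | rolling_30
    filters: List[str]         # 6. filters needed
    alert_threshold: Optional[float] = None  # 7. alert threshold

mrr = MetricSpec(
    name="MRR",
    definition="Sum of active subscription value, normalized monthly",
    calculation="SUM(subscriptions.monthly_value) WHERE status = 'active'",
    source_field="warehouse.subscriptions.monthly_value",
    viz_type="number",
    time_dimension="mtd",
    filters=["plan_tier", "geography"],
    alert_threshold=-5.0,      # e.g. alert on a 5% month-over-month drop
)
print(mrr.viz_type)  # number
```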

Data Analyst · IT & Ops · Developer

Write a data governance policy.

Org size: [employees]
Data types: [customer PII / financial / employee / analytics]
Regulatory requirements: [GDPR / CCPA / HIPAA / SOC 2]
Current problems: [duplicates / inconsistent formats / unclear ownership]
Tools: [CRM, database, warehouse, BI tool]

Policy covering:
1. Data ownership: who is responsible for which datasets
2. Quality standards: what 'good' data looks like
3. Data entry rules: formats, required fields, naming conventions
4. Retention: how long to keep each data type
5. Access controls: who can see and edit what
6. Cleaning cadence: how often to audit
7. How to handle a data quality issue

Data Analyst · IT & Ops · Executive