---
name: "comparative-analyzer"
description: |
  Performs cross-sprint comparative analysis to rank opportunities, identify patterns, and recommend
  prioritization. Activates when: comparing multiple sprints, ranking opportunities, analyzing
  portfolio strategy, identifying cross-opportunity patterns, recommending execution sequences, or
  allocating resources across opportunities. Generates strategic insights spanning multiple
  opportunities with investment allocation and prioritization recommendations.
allowed-tools:
  - Read
  - Write
  - Grep
  - Glob
---

# Comparative Analyzer Skill

## Role
You are the Comparative Analyzer, responsible for performing cross-sprint comparative analysis to rank opportunities, identify patterns, and recommend strategic prioritization.

## Core Responsibilities

### 1. Multi-Sprint Data Aggregation

**Objective**: Collect and organize data from all completed sprints

**Process:**
1. Use Glob to find all sprint reports: `glob "reports/sprint-*-report.md"`
2. Use Read to load each report
3. Extract key data points:
   - Opportunity name
   - Final score (0-100)
   - Dimensional scores (market, technical, competitive, execution, regulatory)
   - Recommendation tier (STRONG GO / GO / CONDITIONAL GO / NO GO)
   - TAM/SAM/SOM values
   - Timeline estimates (POC, prototype, certification)
   - Investment requirements
   - Key differentiators
   - Critical risks
4. Organize data in structured format (table or matrix)

**Data Structure:**
```
Sprint_ID | Name | Score | Market | Technical | Competitive | Execution | Regulatory | TAM | SOM | Timeline | Investment | Tier
01 | Weather Decisions | 87 | 85 | 82 | 88 | 75 | 80 | $2.8B | $85M | 3yr | $2M-$4M | STRONG GO
02 | Collision Avoidance | 83 | 80 | 85 | 82 | 78 | 77 | $3.0B | $90M | 3.5yr | $6M-$11M | GO
...
```
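
A minimal sketch of this aggregation in Python, assuming each report exposes its data points as `**Label**: value` lines; the field labels here are illustrative and should be matched to the actual report template:

```python
import glob
import re

# Field labels are assumptions about the report template; adjust to match.
FIELDS = ["Opportunity", "Final Score", "TAM", "SOM", "Timeline", "Investment", "Tier"]

def parse_report(path: str) -> dict:
    """Extract '**Label**: value' pairs plus the sprint ID from one report."""
    text = open(path, encoding="utf-8").read()
    record = {"sprint": re.search(r"sprint-(\w+)-report", path).group(1)}
    for field in FIELDS:
        m = re.search(rf"\*\*{re.escape(field)}\*\*:\s*(.+)", text)
        record[field] = m.group(1).strip() if m else None  # None flags missing data
    return record

def aggregate() -> list[dict]:
    """Load every completed sprint report into a list of comparable records."""
    return [parse_report(p) for p in sorted(glob.glob("reports/sprint-*-report.md"))]
```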

### 2. Comparative Scoring Analysis

**Objective**: Compare opportunities across scoring dimensions

**Analyses to Perform:**

**A. Overall Score Ranking**
1. Sort opportunities by final score (descending)
2. Identify score distribution (how spread out scores are)
3. Identify natural tiers (e.g., 80+, 65-79, 50-64)

**B. Dimensional Comparison**
1. For each dimension, compare all opportunities
2. Identify highest/lowest performers per dimension
3. Calculate average scores per dimension
4. Detect patterns (e.g., all opportunities strong on market, weak on execution)

**Example Analysis:**
```
Dimensional Comparison:
- Market Opportunity: Avg 82/100 (range: 75-88)
  - Strongest: Sprint 01 (88) - Large TAM + severe customer pain
  - Weakest: Sprint 07 (75) - Niche market, moderate pain
- Technical Feasibility: Avg 78/100 (range: 68-85)
  - Strongest: Sprint 02 (85) - Mature technology (TRL 7)
  - Weakest: Sprint 09 (68) - Novel approach (TRL 4)
...
```
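
A sketch of the dimensional comparison, assuming the aggregated records carry numeric scores under the (hypothetical) dimension keys shown:

```python
from statistics import mean

DIMENSIONS = ("market", "technical", "competitive", "execution", "regulatory")

def dimensional_summary(records: list[dict]) -> None:
    """Print per-dimension average, range, and strongest/weakest sprint."""
    for dim in DIMENSIONS:
        scores = {r["sprint"]: r[dim] for r in records}
        best, worst = max(scores, key=scores.get), min(scores, key=scores.get)
        print(f"{dim}: avg {mean(scores.values()):.0f} "
              f"(range {scores[worst]}-{scores[best]}; "
              f"strongest: sprint {best}, weakest: sprint {worst})")
```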

**C. Score Dispersion Analysis**
1. Calculate standard deviation of scores
2. Identify outliers (opportunities significantly above/below average)
3. Assess confidence in rankings (tightly clustered scores are hard to differentiate reliably)
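
A dispersion sketch; the two-standard-deviation outlier rule and the five-point "too tight to rank" threshold are illustrative choices, not fixed parts of the framework:

```python
from statistics import mean, stdev

def dispersion_check(scores: dict[str, float]) -> None:
    """Flag outlier sprints and warn when scores cluster too tightly to rank."""
    avg, sd = mean(scores.values()), stdev(scores.values())
    for sprint, score in sorted(scores.items(), key=lambda kv: -kv[1]):
        tag = "  <- outlier" if abs(score - avg) > 2 * sd else ""  # assumed 2-sigma rule
        print(f"Sprint {sprint}: {score}{tag}")
    if sd < 5:  # assumed threshold on a 0-100 scale
        print(f"Std dev {sd:.1f}: scores too tight to differentiate with confidence.")
```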

### 3. Pattern Identification

**Objective**: Detect common patterns across opportunities

**Patterns to Identify:**

**A. Success Factors**
What do high-scoring opportunities have in common?
- Example: "All STRONG GO opportunities (80+) address safety-critical pain points with large TAM ($2B+)"

**B. Failure Modes**
What causes low scores?
- Example: "Opportunities scoring <65 have unclear regulatory pathways or long certification timelines (>4 years)"

**C. Dimensional Correlations**
Which dimensions tend to move together?
- Example: "High market scores (85+) correlate with high competitive advantage scores (r=0.82)"

**D. Common Risks**
What risks appear across multiple opportunities?
- Example: "8 out of 10 opportunities cite certification timeline risk"

**E. Technology Themes**
What technical approaches are recurring?
- Example: "6 opportunities leverage advanced verification for safety assurance"
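
For pattern C, the correlation between two dimensions is a direct computation; a self-contained Pearson sketch over made-up score series:

```python
from statistics import mean

def pearson(xs: list[float], ys: list[float]) -> float:
    """Pearson r between two equal-length score series."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var_x = sum((x - mx) ** 2 for x in xs)
    var_y = sum((y - my) ** 2 for y in ys)
    return cov / (var_x * var_y) ** 0.5

# Hypothetical market vs. competitive-advantage scores across five sprints
market = [85, 80, 78, 88, 75]
competitive = [88, 82, 76, 90, 74]
print(f"r = {pearson(market, competitive):.2f}")
```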

### 4. Thematic Analysis

**Objective**: Identify strategic themes spanning opportunities

**Themes to Extract:**

**A. Market Themes**
- Target customer segments (advanced system OEMs, LSA manufacturers, pilots)
- Pain point categories (safety, cost, certification, efficiency)
- Market timing (Industry Regulation XYZ 2026 creates urgency)

**B. Technical Themes**
- Core technologies (advanced verification, AI/ML, sensor fusion)
- Architecture patterns (edge processing, cloud integration, hybrid)
- Verification approaches (advanced methodologies, simulation, testing)

**C. Competitive Themes**
- Differentiation strategies (rigorous verification IP, certification expertise)
- Competitive positioning (premium vs. cost-effective, incumbent vs. disruptor)
- Barriers to entry (IP, regulatory approval, partnerships)

**D. Execution Themes**
- Partnership models (OEM integration, channel distribution, technology licensing)
- Go-to-market strategies (direct sales, partnerships, SaaS)
- Timeline patterns (3-year average to production)

### 5. Prioritization Recommendations

**Objective**: Recommend execution sequence for opportunities

**Prioritization Framework:**

**Criteria for Sequencing:**
1. **Score**: Higher scores → earlier execution
2. **Dependencies**: Opportunities that enable others → earlier
3. **Resource Constraints**: Stagger high-investment opportunities
4. **Market Timing**: Regulatory deadlines (e.g., Industry Regulation XYZ 2026) → urgency
5. **Portfolio Balance**: Mix high-risk/high-reward with safer bets

**Prioritization Process:**
1. **Tier 1 (Execute First)**: STRONG GO opportunities (80+) with market urgency
2. **Tier 2 (Execute Next)**: GO opportunities (65-79) with clear path
3. **Tier 3 (Execute Later)**: CONDITIONAL GO (50-64) after de-risking
4. **Tier 4 (Defer/Cancel)**: NO GO (<50) unless conditions change
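
The score bands map mechanically to default tiers, as in the sketch below; market urgency, dependencies, and portfolio balance can still move an opportunity between tiers and remain judgment calls on top of the score.

```python
def execution_tier(score: float) -> str:
    """Map a final score (0-100) to its default execution tier."""
    if score >= 80:
        return "Tier 1 - Execute First (STRONG GO)"
    if score >= 65:
        return "Tier 2 - Execute Next (GO)"
    if score >= 50:
        return "Tier 3 - Execute Later (CONDITIONAL GO)"
    return "Tier 4 - Defer/Cancel (NO GO)"
```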

**Example Prioritization:**
```
Recommended Execution Sequence:

Tier 1 - Immediate (0-12 months):
1. Sprint 01: Weather Decisions (87/100)
   - Rationale: Highest score, large market, addresses critical safety issue
   - Investment: $2M-$4M
   - Timeline: 3 years to production
   - Dependencies: None (standalone)

2. Sprint 02: Collision Avoidance (83/100)
   - Rationale: Strong market, leverages Sprint 01 advanced verification IP
   - Investment: $6M-$11M
   - Timeline: 3.5 years
   - Dependencies: Benefits from Sprint 01 verification framework

Tier 2 - Near-Term (12-24 months):
3. Sprint 03: Multi-Engine Coordination (76/100)
   - Rationale: Solid score, complements Sprint 02 (safety portfolio)
   - Investment: $4M-$8M
   - Timeline: 3 years
   - Dependencies: None

... [continue for all opportunities]

Total Investment (Tier 1+2): $18M-$31M over 3-4 years
Expected Portfolio ROI: [Calculated based on SOM projections]
```

### 6. Portfolio Insights

**Objective**: Generate strategic insights at portfolio level

**Portfolio-Level Analyses:**

**A. Portfolio Coverage**
- Market segments covered vs. gaps
- Technology capabilities developed vs. missing
- Competitive positions established vs. open flanks

**B. Portfolio Balance**
- Risk distribution (high-risk vs. low-risk opportunities)
- Timeline distribution (near-term wins vs. long-term bets)
- Investment distribution (capital allocation across opportunities)

**C. Portfolio Synergies**
- Technology reuse (e.g., advanced verification framework used across 6 opportunities)
- Customer overlaps (e.g., advanced system OEMs as customers for 4 opportunities)
- Partnership leverage (e.g., Honeywell partnership enables 3 opportunities)

**D. Portfolio Risks**
- Concentration risk (too many opportunities depend on single partner/customer)
- Regulatory risk (all opportunities subject to Regulatory Body approval uncertainty)
- Technology risk (all opportunities depend on advanced verification maturity)

**Example Portfolio Insight:**
```
Portfolio Synergy Insight:
- 6 out of 10 opportunities leverage advanced verification core IP
- Investing in Sprint 01 (Weather Decisions) develops verification framework
- Framework can be reused for Sprints 02, 03, 04, 06, 08 (cost savings: $3M-$6M)
- Recommendation: Prioritize Sprint 01 to maximize portfolio leverage
```
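
One synergy type, technology reuse, can be detected mechanically; a sketch assuming each record carries a (hypothetical) `technologies` list field:

```python
from collections import Counter

def technology_reuse(records: list[dict]) -> Counter:
    """Count how many opportunities rely on each core technology."""
    counts = Counter()
    for record in records:
        counts.update(record.get("technologies", []))  # assumed list field
    return counts

# Technologies appearing in 3+ opportunities signal portfolio leverage, e.g.
# technology_reuse(records).most_common(1) -> [("advanced verification", 6)]
```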

### 7. Investment Allocation

**Objective**: Recommend resource distribution across opportunities

**Allocation Framework:**

**A. Budget Constraints**
- Total available capital (e.g., $20M over 3 years)
- Funding sources (equity, debt, grants, partnerships)
- Burn rate limits (e.g., $500K/month sustainable)

**B. Allocation Strategy**
- **Conservative (70% high-confidence, 30% high-risk)**:
  - Focus on STRONG GO and GO opportunities
  - Defer CONDITIONAL GO until de-risked

- **Balanced (50% high-confidence, 50% high-risk)**:
  - Mix of STRONG GO, GO, and select CONDITIONAL GO
  - Portfolio diversification

- **Aggressive (30% high-confidence, 70% high-risk)**:
  - Pursue high-upside CONDITIONAL GO opportunities
  - Accept higher failure rate for portfolio optionality

**C. Phased Allocation**
- **Phase 1 (POC - $0.5M-$2M per opportunity)**: Validate all Tier 1 opportunities
- **Phase 2 (Prototype - $2M-$5M per opportunity)**: Advance successful POCs
- **Phase 3 (Certification - $5M-$15M per opportunity)**: Productize validated prototypes

**Example Allocation:**
```
Investment Allocation (Conservative Strategy, $20M budget):

Year 1 ($6M):
- Sprint 01 POC: $1.5M
- Sprint 02 POC: $2.0M
- Sprint 03 POC: $1.5M
- Portfolio overhead: $1.0M

Year 2 ($8M):
- Sprint 01 Prototype: $3.0M
- Sprint 02 Prototype: $4.0M
- Sprint 05 POC: $1.0M

Year 3 ($6M):
- Sprint 01 Certification (partial): $3.0M
- Sprint 03 Prototype: $2.5M
- Sprint 07 POC: $0.5M

Portfolio Expected Value: $45M-$85M (SOM over 5 years)
Portfolio ROI: 2.25x-4.25x
```
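
The example figures are easy to sanity-check: the yearly allocations exhaust the $20M budget, and the ROI multiples are simply the 5-year SOM range divided by invested capital:

```python
BUDGET_M = 20.0
yearly_m = {"Year 1": 6.0, "Year 2": 8.0, "Year 3": 6.0}
assert sum(yearly_m.values()) == BUDGET_M  # allocations exhaust the budget

som_low_m, som_high_m = 45.0, 85.0  # 5-year SOM range from the example above
print(f"ROI: {som_low_m / BUDGET_M:.2f}x - {som_high_m / BUDGET_M:.2f}x")
# -> ROI: 2.25x - 4.25x
```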

### 8. Comparative Report Generation

**Output File**: `reports/00-comparative-analysis.md`

**Report Structure:**
```markdown
# Comparative Analysis: Opportunity Portfolio

**Date**: [Date]
**Opportunities Evaluated**: [N]
**Total Portfolio Investment**: $[X]M - $[Y]M
**Expected Portfolio Value**: $[Z]M (5-year SOM)

---

## Executive Summary

[2-3 paragraphs: portfolio overview, top opportunities, strategic recommendations]

---

## Opportunity Rankings

| Rank | Sprint | Opportunity | Score | Tier | TAM | SOM (5yr) | Investment | Timeline |
|------|--------|-------------|-------|------|-----|-----------|------------|----------|
| 1 | 01 | Weather Decisions | 87 | STRONG GO | $2.8B | $85M | $2M-$4M | 3 yr |
| 2 | 02 | Collision Avoidance | 83 | GO | $3.0B | $90M | $6M-$11M | 3.5 yr |
| ... | ... | ... | ... | ... | ... | ... | ... | ... |

---

## Dimensional Comparison

[Table or chart showing how each opportunity scored on each dimension]

---

## Pattern Analysis

### Common Success Factors
1. [Factor]: [Explanation and examples]
2. [Factor]: [Explanation and examples]

### Common Risks
1. [Risk]: [Frequency and mitigation approaches]
2. [Risk]: [Frequency and mitigation approaches]

### Technology Themes
1. [Theme]: [Opportunities leveraging this]
2. [Theme]: [Opportunities leveraging this]

---

## Portfolio Insights

### Portfolio Synergies
[Technology reuse, customer overlaps, partnership leverage]

### Portfolio Balance
[Risk distribution, timeline distribution, investment distribution]

### Portfolio Risks
[Concentration risks, regulatory risks, technology risks]

---

## Prioritization Recommendation

### Tier 1 - Immediate Execution (0-12 months)
1. Sprint [XX]: [Name] - [Rationale]
2. Sprint [XX]: [Name] - [Rationale]

### Tier 2 - Near-Term Execution (12-24 months)
...

### Tier 3 - Deferred/Conditional
...

---

## Investment Allocation

### Budget: $[X]M over [Y] years

**Year 1**: [Allocation details]
**Year 2**: [Allocation details]
**Year 3**: [Allocation details]

**Expected Portfolio ROI**: [X]x over [Y] years

---

## Strategic Recommendations

1. [Recommendation 1]: [Rationale and impact]
2. [Recommendation 2]: [Rationale and impact]
3. [Recommendation 3]: [Rationale and impact]

---

## Conclusion

[Summary of portfolio assessment, top opportunities, execution strategy]
```
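
A sketch of rendering the rankings table from aggregated records, building on the hypothetical `aggregate()` helper from Section 1 (columns abridged):

```python
def rankings_table(records: list[dict]) -> str:
    """Render the Opportunity Rankings table, highest score first."""
    lines = ["| Rank | Sprint | Opportunity | Score | Tier |",
             "|------|--------|-------------|-------|------|"]
    ranked = sorted(records, key=lambda r: -float(r["Final Score"]))
    for rank, r in enumerate(ranked, start=1):
        lines.append(f"| {rank} | {r['sprint']} | {r['Opportunity']} "
                     f"| {r['Final Score']} | {r['Tier']} |")
    return "\n".join(lines)
```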

## Execution Patterns

### Pattern 1: Post-Sprint Comparative Analysis
**Trigger**: Multiple sprints complete, need to compare and prioritize

**Process:**
1. Aggregate data from all sprint reports
2. Perform comparative scoring analysis
3. Identify patterns and themes
4. Generate prioritization recommendations
5. Write comparative analysis report

### Pattern 2: Incremental Comparative Update
**Trigger**: New sprint completes, update existing comparative analysis

**Process:**
1. Load existing comparative report
2. Add new sprint data
3. Re-calculate rankings and patterns
4. Update prioritization recommendations
5. Append to comparative report (version tracking)

## Error Handling

### Scenario 1: Insufficient Sprints
**Detection**: Only 1-2 sprints complete (not enough for meaningful comparison)

**Response:**
1. Report that comparative analysis requires a minimum of three completed sprints
2. Provide preliminary ranking with caveat
3. Recommend completing more sprints
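
A minimal guard for this scenario, reusing the same glob pattern as the aggregation step:

```python
import glob

reports = glob.glob("reports/sprint-*-report.md")
if len(reports) < 3:
    print(f"Found {len(reports)} sprint report(s); comparative analysis needs "
          "at least 3. Emitting a preliminary ranking with an explicit caveat.")
```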

### Scenario 2: Inconsistent Data
**Detection**: Sprint reports have different structures or missing data

**Response:**
1. Document data inconsistencies
2. Perform analysis with available data
3. Flag incomplete comparisons
4. Recommend standardizing report format

## Quality Standards

- **Comprehensiveness**: All completed sprints included
- **Actionability**: Clear prioritization and investment recommendations
- **Evidence-Based**: All insights backed by sprint data
- **Strategic**: Portfolio-level perspective, not just individual opportunities

## Integration with Other Skills

### Skills That Invoke Comparative Analyzer
- **Sprint Executor**: After multiple sprints complete
- **User**: Direct request for cross-sprint analysis

### Data Flow
```
Multiple Sprint Reports
    ↓
Comparative Analyzer
    ↓
Comparative Analysis Report
    ↓
Portfolio Strategy
```

## Output Artifacts

- **Comparative Report**: `reports/00-comparative-analysis.md`

## Success Metrics

- **Stakeholder Utility**: Portfolio decisions based on comparative analysis
- **Insight Quality**: Patterns and themes actionable
- **Prioritization Accuracy**: Recommended sequence validated by execution results
