# Technical Research Skill

Structured research workflow for gathering, validating, and synthesizing information from multiple sources.
## Core Principles
- Tool-agnostic: Use available search tools (WebSearch, WebFetch, or alternatives)
- Multi-source validation: Cross-reference claims across sources
- Quality over quantity: Focus on authoritative, recent sources
- Complete citations: All claims need clickable sources
- Bias awareness: Note vendor bias, sponsored content
## Available Tools

Research uses these capabilities (implementation varies by environment):

| Capability | Primary Tool | Fallback |
|------------|--------------|----------|
| Web Search | WebSearch | codex --enable web_search_request |
| Page Fetch | WebFetch | codex with URL |
| Link Validation | WebFetch (HEAD) | Manual verification |
Tool Selection:
- Use built-in WebSearch/WebFetch when available (faster, no external dependencies)
- Fall back to the codex skill when built-in tools are unavailable (a fallback sketch follows this list)
- Let the user verify simple URLs by clicking (fastest)
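To make the fallback concrete, here is a minimal Python sketch; the names are hypothetical stand-ins for whatever bindings the environment actually exposes, not a real API:

```python
from typing import Callable, Optional

SearchFn = Callable[[str], list[str]]

def codex_search(query: str) -> list[str]:
    # Hypothetical wrapper that would shell out to:
    #   codex --enable web_search_request
    raise NotImplementedError

def pick_search_tool(builtin: Optional[SearchFn]) -> SearchFn:
    """Prefer the built-in WebSearch binding when present;
    otherwise fall back to the codex wrapper."""
    return builtin if builtin is not None else codex_search
```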
## Research Depth Levels

### Quick Research (3-5 sources)
- Use when: Simple factual questions, quick comparisons
- Output: 1-2 paragraphs with key links

### Standard Research (8-12 sources)
- Use when: Technical comparisons, feature analysis
- Output: Structured report with sections

### Deep Research (15+ sources)
- Use when: Architecture decisions, comprehensive analysis
- Output: Full report with cross-references, trade-off analysis
- Additional steps: Multiple search iterations, source triangulation, expert opinion synthesis
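These levels can be treated as a small configuration table so a workflow can branch on them. A sketch; the key and field names are assumptions for illustration, not a prescribed schema:

```python
# Depth levels from this section, encoded as data. "sources" is the
# (min, max) source count; None means no upper bound.
DEPTH_LEVELS = {
    "quick":    {"sources": (3, 5),    "output": "1-2 paragraphs with key links"},
    "standard": {"sources": (8, 12),   "output": "structured report with sections"},
    "deep":     {"sources": (15, None), "output": "full report with trade-off analysis"},
}
```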
## Research Workflow

### Phase 1: Scoping

#### 1. Clarify Research Goals
Confirm with user:
- What's the core question?
- What needs comparison?
- Which aspects matter? (architecture/performance/use cases/cost)
- Depth level: Quick / Standard / Deep
#### 2. Define Search Strategy

Plan before searching (a query-builder sketch follows this list):
- Primary keywords + synonyms
- Year constraints (default: current year - 1 to current)
- Domain restrictions (official docs, academic, community)
- Negative keywords (exclude irrelevant results)
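As a minimal sketch of this strategy (the function and its parameters are illustrative; `site:` and `-term` operators are widely supported but exact behavior varies by search backend):

```python
def build_query(topic: str, year_from: int, year_to: int,
                site: str | None = None,
                exclude: list[str] | None = None) -> str:
    """Compose one search query from keywords, year constraints,
    an optional domain restriction, and negative keywords."""
    parts = [f'"{topic}"', f"{year_from} {year_to}"]
    if site:
        parts.append(f"site:{site}")
    for term in exclude or []:
        parts.append(f"-{term}")
    return " ".join(parts)

# build_query("OpenSearch architecture", 2024, 2025, site="opensearch.org")
# -> '"OpenSearch architecture" 2024 2025 site:opensearch.org'
```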
### Phase 2: Collection

#### 3. Multi-Angle Search

Execute parallel searches across different angles:

| Angle | Query Pattern | Example |
|-------|---------------|---------|
| Factual | "What is X", "X definition" | "What is OpenSearch" |
| Comparative | "X vs Y", "X alternatives" | "OpenSearch vs Elasticsearch differences" |
| Technical | "X architecture", "X implementation" | "OpenSearch architecture internals" |
| Practical | "X tutorial", "X best practices" | "OpenSearch best practices 2024" |
| Recent | "X 2024 2025", "X latest" | "OpenSearch new features 2024 2025" |
Query Tips:
- Add year constraints for recent info
- Include the exact product name in the query
- Focus each query on a single topic (an angle-expansion sketch follows)
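One way to mechanize the table above is to expand a single topic into one focused query per angle. A sketch, with patterns adapted from the table (exact phrasing is adjustable):

```python
# Query patterns per angle, mirroring the Multi-Angle Search table.
ANGLE_PATTERNS = {
    "factual":     "What is {x}",
    "comparative": "{x} alternatives",
    "technical":   "{x} architecture",
    "practical":   "{x} best practices",
    "recent":      "{x} new features 2024 2025",
}

def multi_angle_queries(topic: str) -> dict[str, str]:
    """Return one focused query per angle for the given topic."""
    return {angle: p.format(x=topic) for angle, p in ANGLE_PATTERNS.items()}

# multi_angle_queries("OpenSearch")["recent"]
# -> 'OpenSearch new features 2024 2025'
```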
#### 4. Source Diversification

Ensure coverage across source types (a coverage check is sketched after the list):
- [ ] Official documentation (at least 2 sources)
- [ ] Official blogs/announcements
- [ ] Independent technical analysis
- [ ] Community discussions (GitHub issues, Stack Overflow)
- [ ] Academic papers (if applicable)
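The checklist can be enforced mechanically. A sketch, assuming each collected source is tagged with a type label (the labels and record shape are illustrative):

```python
# Source types required by the diversification checklist above.
REQUIRED_TYPES = {"official_docs", "official_blog", "independent_analysis", "community"}

def coverage_gaps(sources: list[dict]) -> set[str]:
    """Return the source types still missing from the collection.
    Each source is e.g. {"url": "...", "type": "official_docs"}."""
    return REQUIRED_TYPES - {s["type"] for s in sources}
```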
### Phase 3: Validation

#### 5. Cross-Reference Verification

For each key claim:
- Find at least 2 independent sources that confirm it
- Note conflicting information
- Identify primary vs. secondary sources
Triangulation Method (a bookkeeping sketch follows this list):
- Official source (docs, blog)
- Independent analysis
- Community validation (issues, discussions)
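A minimal sketch of the two-source rule; the record shape is an assumption for illustration:

```python
from collections import defaultdict

# Each evidence record: (claim_id, source_url). Distinct URLs stand in
# for independent sources; true independence still needs human judgment.
def under_supported(evidence: list[tuple[str, str]], minimum: int = 2) -> list[str]:
    """Flag claims backed by fewer than `minimum` distinct sources."""
    per_claim: dict[str, set[str]] = defaultdict(set)
    for claim_id, url in evidence:
        per_claim[claim_id].add(url)
    return [c for c, urls in per_claim.items() if len(urls) < minimum]
```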
#### 6. Link Validation

Before finalizing the report:
- Verify all URLs are accessible (use WebFetch or manual check)
- Replace 404 links with alternatives
- Ensure reference names match page content
Validation Rules:
- Let the user verify simple URLs by clicking (faster)
- Use WebFetch for batch verification when needed (a batch-check sketch follows)
- Search for replacement URLs for broken links
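For batch verification outside the built-in tools, here is a standalone sketch using the `requests` library (in-tool, WebFetch plays this role):

```python
import requests

def check_links(urls: list[str], timeout: float = 10.0) -> dict[str, bool]:
    """Best-effort reachability check via HEAD requests. Some servers
    reject HEAD, so False here means "re-check", not "dead link"."""
    results: dict[str, bool] = {}
    for url in urls:
        try:
            resp = requests.head(url, allow_redirects=True, timeout=timeout)
            results[url] = resp.status_code < 400
        except requests.RequestException:
            results[url] = False
    return results
```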
### Phase 4: Synthesis

#### 7. Information Quality Assessment

Rate each source:
| Criteria | Weight | Scoring |
|----------|--------|---------|
| Authority | 30% | Official docs (5) > Official blog (4) > Tech publication (3) > Community (2) > Anonymous (1) |
| Recency | 25% | <6mo (5) > 6-12mo (4) > 1-2yr (3) > 2-3yr (2) > >3yr (1) |
| Specificity | 25% | Detailed with examples (5) > General overview (3) > Vague (1) |
| Independence | 20% | Unbiased (5) > Slight bias (3) > Vendor content (1) |
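The weighted score falls out directly from the table. A sketch:

```python
# Weights from the assessment table; ratings are the 1-5 scores above.
WEIGHTS = {"authority": 0.30, "recency": 0.25, "specificity": 0.25, "independence": 0.20}

def source_score(ratings: dict[str, int]) -> float:
    """Weighted quality score on a 1-5 scale."""
    return sum(WEIGHTS[k] * ratings[k] for k in WEIGHTS)

# source_score({"authority": 5, "recency": 4, "specificity": 3, "independence": 5})
# -> 0.30*5 + 0.25*4 + 0.25*3 + 0.20*5 = 4.25
```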
#### 8. Conflict Resolution

When sources disagree:
- Prefer official documentation
- Check publication dates (newer often wins for tech)
- Note the disagreement in report
- Provide both perspectives if unresolved
Conflict Template:
> **Conflicting Information**
> - Source A claims: [X]
> - Source B claims: [Y]
> - Resolution: [Your analysis or "Both perspectives included"]
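The resolution rules above reduce to a small heuristic. A sketch; the field names are illustrative:

```python
from datetime import date  # claims carry a datetime.date in "published"

def resolve_conflict(a: dict, b: dict) -> dict | None:
    """Each claim: {"text": str, "official": bool, "published": date}.
    Prefer official documentation, then the newer source; return None
    when unresolved, in which case present both perspectives."""
    if a["official"] != b["official"]:
        return a if a["official"] else b
    if a["published"] != b["published"]:
        return max(a, b, key=lambda c: c["published"])
    return None
```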
#### 9. Organize and Analyze
- Filter out low-value information
- Structure by the user's priorities
- Add analysis and insights
- Unify the citation format
### Phase 5: Delivery

#### 10. Final Review Checklist
- [ ] All claims have citations
- [ ] All links validated
- [ ] No fabricated data
- [ ] Balanced perspective (vendor bias noted)
- [ ] Matches user's depth requirement
- [ ] Version information included where relevant
## Output Format

### Citation Format (Clickable Links)

Inline citation:

```markdown
OpenSearch forked from Elasticsearch 7.10 in 2021 (source: [AWS OpenSearch Blog]).
```

Link definitions at the end of the report:

```markdown
[AWS OpenSearch Blog]: https://aws.amazon.com/blogs/opensource/...
```
### Report Structure

```markdown
# [Topic] Research Report

## 1. Overview
What it is, what problem it solves

## 2. Core Features/Architecture
Key technical points with citations

## 3. Comparison (if applicable)
Table comparison, each item with source

## 4. Recommendations
Conclusions based on research

## References
[Link Name 1]: URL1
[Link Name 2]: URL2
...
```
## Source Priority

1. Official documentation - most authoritative; prefer when available
2. Official blogs/announcements - for news, releases, roadmaps
3. Third-party tech blogs - only if official docs lack detail; verify quality first
4. Independent benchmarks - for performance data (note: vendor benchmarks may be biased)
Note: Third-party blogs vary widely in quality; always verify content accuracy before citing them.
## Guidelines
- Don't fabricate data: No performance numbers without sources
- Trim sections: Only keep what users care about
- Valid links: Prefer official docs, reputable tech blogs
- Declarative titles: Don't use questions as headings
- Reference name accuracy: Ensure `[Reference Name]` matches the actual page content
- Version awareness: Note software versions, flag deprecated features
## Common Pitfalls

| Pitfall | Solution |
|---------|----------|
| Search returns wrong product | Always include exact product name in query |
| 404 links in final report | Validate all links before finalizing |
| Reference name doesn't match content | Verify page content matches reference name |
| Using vendor benchmarks as neutral | Note the source bias in report |
| Overly broad search queries | Focus each query on single topic |
| Missing year constraints | Add current/recent years for tech info |
| Single source for key claims | Cross-reference with at least 2 sources |
| Outdated information | Check publication date, prefer recent sources |
## Example Research Task

User: Research differences between OpenSearch and Elasticsearch

Steps:
1. Clarify the depth level with the user (Quick/Standard/Deep)
2. Search for OpenSearch's unique features (include "OpenSearch" in the query)
3. Search for architecture differences
4. Search for licensing and governance differences
5. Search for performance comparisons (with sources; note vendor bias)
6. Cross-reference key claims across sources
7. Assess source quality (authority, recency, bias)
8. Organize into a report, adding links to all citations
9. Validate all links with WebFetch or user verification
10. Replace any 404 links