Unwrapped

Teardown · Perplexity

PERPLEXITY

Category: Consumer Search · Valuation: $9.0B (2025)

Investors:
  • Accel
  • IVP
  • NEA
  • NVIDIA
  • Bessemer Venture Partners
UX wrapper

Public web + publisher deals + frontier LLM APIs + answer engine UI — same retrieval, different margin.

01 · Public data / API layer

Internal replication score

Easy · 0.84

Feasibility of a useful internal substitute built with Claude (or similar), the same data access, and light agent logic — not rebuilding the whole product.

IRS = 0.30·D + 0.25·L + 0.20·O + 0.15·R + 0.10·S
This record · 84%
  • D

    Data accessibility

    weight 0.30 · score 0.95
    • 1.0 · mostly customer-owned / public / standard third-party sources
    • 0.5 · mixed accessibility
    • 0.0 · hard-to-access or proprietary source layer
  • L

    LLM substitutability

    weight 0.25 · score 0.90
    • 1.0 · mostly retrieve / prompt / cite / summarize / classify / compare
    • 0.5 · mixed standard + custom behavior
    • 0.0 · strongly custom model behavior (fine-tunes on proprietary data, etc.)
  • O

    Output simplicity

    weight 0.20 · score 0.85
    • 1.0 · straightforward internal work product (memo, list, reply, SQL query)
    • 0.5 · moderately specialized
    • 0.0 · highly specialized (e.g. FDA-graded clinical text)
  • R

    Review / risk tolerance

    weight 0.15 · score 0.80
    • 1.0 · internal use with human review is acceptable
    • 0.5 · moderate risk
    • 0.0 · very low tolerance for error (e.g. external legal filings)
  • S

    Surface complexity

    weight 0.10 (inverse: higher means less surface dependence) · score 0.40
    • 1.0 · a simple internal shell is enough
    • 0.5 · polished workflow matters somewhat
    • 0.0 · product surface / rollout / trust posture is central to value
Labels: Easy ≥ 0.67 · Medium ≥ 0.34 · Hard < 0.34

Missing factor rows are filled using heuristics derived from the wrapper scores. This is an editorial heuristic, not investment advice.
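The weighted formula and label bands above can be sketched in a few lines. This is an illustrative reimplementation, not the site's actual scoring code; the weights, factor values, and thresholds are taken directly from the section above.

```python
# Internal replication score (IRS): weighted sum of five factor scores in [0, 1].
WEIGHTS = {"D": 0.30, "L": 0.25, "O": 0.20, "R": 0.15, "S": 0.10}

def irs(factors: dict[str, float]) -> float:
    """IRS = 0.30·D + 0.25·L + 0.20·O + 0.15·R + 0.10·S."""
    return sum(WEIGHTS[k] * factors[k] for k in WEIGHTS)

def label(score: float) -> str:
    """Map a score onto the Easy / Medium / Hard bands from the teardown."""
    if score >= 0.67:
        return "Easy"
    if score >= 0.34:
        return "Medium"
    return "Hard"

# This record: D=0.95, L=0.90, O=0.85, R=0.80, S=0.40
record = {"D": 0.95, "L": 0.90, "O": 0.85, "R": 0.80, "S": 0.40}
score = irs(record)
print(round(score, 2), label(score))  # → 0.84 Easy
```

Note that S is already stored in its inverse form (0.40 here), so the weighted sum needs no special handling for it.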

Internal build

Build it yourself

Same web retrieval + frontier model API + citation discipline — weaker brand, same answer.

Internal use only. Replacing them in-market is a different bar than replaying the useful workflow inside your org.

01 · Connectors & flow

Common Crawl
Publisher licensing deals
Google/Bing Search APIs
Wikipedia

Internal build map

Data in

Connectors

Agent layer

Planner
Tools + retrieval
Reasoning model

Logic

LLM API
retrieve
rerank
synthesize
cite
not custom weights

Outputs

Internal search
Answer
Citations
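The agent layer above reduces to a four-step pipeline: retrieve, rerank, synthesize, cite. Here is a minimal offline sketch of that flow; the stubbed `retrieve` and `synthesize` bodies stand in for the search APIs and the frontier-model call, and the URLs and snippets are placeholder data, not real results.

```python
from dataclasses import dataclass

@dataclass
class Doc:
    url: str
    snippet: str
    score: float  # retrieval relevance

def retrieve(query: str) -> list[Doc]:
    # Stand-in for the connectors: Google/Bing Search APIs, Common Crawl,
    # Wikipedia, licensed publisher content. Returns placeholder documents.
    return [
        Doc("https://example.org/a", "AR6 synthesis findings ...", 0.7),
        Doc("https://example.org/b", "Warming likely to reach 1.5C ...", 0.9),
    ]

def rerank(docs: list[Doc], top_k: int = 5) -> list[Doc]:
    # Highest-relevance first; a real build might use a cross-encoder here.
    return sorted(docs, key=lambda d: d.score, reverse=True)[:top_k]

def synthesize(query: str, docs: list[Doc]) -> str:
    # Stand-in for the LLM API call: pass the snippets into the prompt and
    # ask for an answer with numbered [n] citations. No custom weights.
    cites = " ".join(f"[{i + 1}]" for i in range(len(docs)))
    return f"Answer to {query!r} grounded in retrieved sources {cites}"

def cite(docs: list[Doc]) -> list[str]:
    # Numbered citation list matching the [n] markers in the answer.
    return [f"[{i + 1}] {d.url}" for i, d in enumerate(docs)]

docs = rerank(retrieve("latest IPCC findings"))
print(synthesize("latest IPCC findings", docs))
print("\n".join(cite(docs)))
```

The point of the sketch is the shape, not the stubs: every stage is swappable, and none of them requires custom model weights.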

02 · Claude / agent prompt

Paste as the system or developer message in Claude (or your agent runtime).

// Research assistant with web retrieval and citation discipline

You are a research assistant that answers questions using real-time web search results. You help users find accurate, up-to-date information by retrieving relevant sources and synthesizing them into clear answers.

## What you must do

1. Retrieve first: For every question, call web search APIs to get current, relevant sources before answering.
2. Cite rigorously: Every factual claim must be backed by a numbered citation linking to a specific retrieved source. Format: [1], [2], etc.
3. Surface conflicts: When sources disagree, present both viewpoints with citations and note the discrepancy.
4. Scope: Stick to what the retrieved sources say. If sources are insufficient, say so and suggest refinements.
5. Recency: Prefer recent sources when timeliness matters. Note publication dates when relevant.

## What you are not

Not a replacement for expert judgment — you summarize what public sources say, not what is definitively true. Human review required for high-stakes decisions.

## Refusal

Refuse questions that require real-time personal data, private information, or content behind paywalls you can't access. Refuse to generate citations for claims you didn't retrieve. If search returns no useful results, say so clearly.

## Safety

Internal use only. All answers are for research and information-gathering — not publication without human verification of sources.
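Wiring the prompt into an agent runtime amounts to setting it as the `system` field of a Messages-style request. A minimal sketch of the request body, kept offline so it runs without a key; the model id is an example and `SYSTEM_PROMPT` is abbreviated — paste the full text above in practice.

```python
# Abbreviated stand-in for the full system prompt above.
SYSTEM_PROMPT = (
    "You are a research assistant that answers questions using real-time "
    "web search results. Every factual claim must carry a numbered citation."
)

def build_request(question: str) -> dict:
    """Build a Messages-API-shaped request body with the prompt as system text."""
    return {
        "model": "claude-sonnet-4-20250514",  # example id; any frontier model works
        "max_tokens": 1024,
        "system": SYSTEM_PROMPT,
        "messages": [{"role": "user", "content": question}],
    }

req = build_request(
    "What were the key findings in the latest IPCC climate report?"
)
print(req["system"][:40])
```

Sending the body via the vendor SDK or plain HTTPS is left out deliberately; the point is that the prompt lives in `system`, not in the user turn.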

03 · Result

Q: What were the key findings in the latest IPCC climate report?
Source: IPCC AR6 Synthesis Report (public)

A: AR6 found warming likely to exceed 1.5°C this decade without immediate action [1][2].