Below is a fact-grounded map of “new economy segments” created (or massively expanded) by the LLM boom—i.e., categories where new companies/products and real enterprise spend have shown up since the generative-AI wave.
A quick reality check: the biggest “big bang” shift is enterprises moving from demos → production apps and embedded “agents/copilots” inside existing software. (Menlo Ventures)
1) Compute and infrastructure segments
GPU cloud + GPU brokers/marketplaces
Matching AI workloads with scarce GPU supply (short-term/spot capacity, multi-cloud arbitrage).
Inference optimization platforms (faster/cheaper serving)
Quantization, speculative decoding, batching, caching, compilation, kernel tuning.
Model “routers” / multi-model gateways
Automatically choose the best model per request (cost/latency/quality/risk), plus fallback.
Private/on-prem LLM stacks
For regulated orgs: deploy models behind the firewall (hybrid/on-prem patterns show up in enterprise agent platforms). (The Times of India)
Edge/on-device LLM serving
Smaller models + local inference for privacy/latency and offline use.
Realtime multimodal infrastructure (voice-first)
Low-latency speech-to-speech and multimodal streaming APIs created a whole “voice agent” ecosystem. (OpenAI Platform)
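To make the "router" idea concrete, here is a minimal sketch of per-request model selection. The catalog names, prices, latencies, and quality scores are invented for illustration, not real vendor data; real gateways also weigh risk, context length, and live health signals.

```python
from dataclasses import dataclass

@dataclass
class Model:
    name: str
    cost_per_1k: float    # USD per 1K tokens (illustrative)
    p95_latency_ms: int
    quality: float        # 0..1, e.g. from offline evals

# Hypothetical catalog; all numbers are made up.
CATALOG = [
    Model("small-fast", 0.10, 300, 0.70),
    Model("mid-tier",   0.50, 800, 0.85),
    Model("frontier",   3.00, 2500, 0.95),
]

def route(min_quality: float, latency_budget_ms: int) -> Model:
    """Pick the cheapest model meeting the quality and latency
    constraints; fall back to the highest-quality model if none fits."""
    ok = [m for m in CATALOG
          if m.quality >= min_quality and m.p95_latency_ms <= latency_budget_ms]
    if not ok:
        return max(CATALOG, key=lambda m: m.quality)  # fallback path
    return min(ok, key=lambda m: m.cost_per_1k)
```

A request that tolerates 1s latency but needs quality ≥ 0.8 lands on the mid-tier model; an impossible constraint set falls back to the frontier model.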
2) Data + “context” economy (RAG, vectors, connectors)
Vector databases / vector search services
Became mainstream because RAG is a default enterprise pattern; many orgs use vector DBs to customize LLM apps. (Databricks)
RAG orchestration and tooling vendors
Retrieval pipelines, chunking/indexing, hybrid search, rerankers, and doc intelligence.
Enterprise search re-born as “AI answer engines”
Workplace search that returns grounded answers + citations across internal sources.
Knowledge graph + graph-RAG layers
For higher-precision retrieval (entities/relations) beyond pure embeddings.
Connector standards + “tool servers” (agent integration layer)
A whole segment around standardizing how models connect to tools and data (vs bespoke integrations). MCP is a key example of this connector layer. (Anthropic)
Licensed-data packaging for LLMs
Data providers selling “LLM-ready” access (contracts, entitlements, audit, policy gates). Example: financial-data access delivered via MCP-style servers. (FN London)
(If you want “market size” claims: there are now dedicated analyst categories like “RAG market,” but treat those forecasts cautiously—useful as a signal, not truth.) (MarketsandMarkets)
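The retrieval core that all of these vendors build on is small: embed the query and the chunks, rank by similarity, pass the top hits to the model. A self-contained sketch, using a toy character-count "embedding" purely so it runs without a real embedding model:

```python
import math

def embed(text: str) -> list[float]:
    """Toy stand-in for an embedding model: a bag-of-letters vector.
    Production systems call a real embedding API here."""
    vec = [0.0] * 26
    for ch in text.lower():
        if ch.isalpha():
            vec[ord(ch) - ord("a")] += 1.0
    return vec

def cosine(a: list[float], b: list[float]) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query: str, chunks: list[str], k: int = 2) -> list[str]:
    """Rank document chunks by similarity to the query and keep top-k,
    the step a vector DB performs at scale with ANN indexes."""
    q = embed(query)
    return sorted(chunks, key=lambda c: cosine(embed(c), q), reverse=True)[:k]
```

Everything the segment sells on top (chunking strategies, hybrid keyword+vector search, rerankers) is refinement of this loop.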
3) LLMOps: production engineering, evaluation, reliability
LLMOps platforms
The analog of MLOps for LLM apps: prompt/version management, experiments, deployment controls, rollback.
Evaluation-as-a-service (Evals)
Regression tests for quality, groundedness, refusal behavior, and task success (often continuous / CI-style). (zenml.io)
Observability for LLM apps
Tracing, token/cost accounting, latency breakdowns, tool-call traces, “why did it answer that?”
Synthetic data generation + data factories
Creating training/eval data at scale (with human verification loops).
Fine-tuning + distillation service providers
Not everyone trains models; many pay for domain tuning, adapters, distillation to small models.
Prompt / context engineering toolchains
Systems to build prompts + retrieval templates + guardrails like software artifacts.
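The CI-style eval pattern is worth seeing in miniature. This sketch uses a stubbed FAQ bot and simple substring checks; commercial eval services swap in LLM judges, groundedness scorers, and dashboards, but the gate-the-deploy shape is the same. All names here are hypothetical.

```python
def run_eval_suite(app, cases: list[dict]) -> float:
    """Run a regression suite against an LLM app and return the pass
    rate; a CI gate can fail the build when the rate drops."""
    passed = sum(
        1 for case in cases
        if case["expect_substring"].lower() in app(case["question"]).lower()
    )
    return passed / len(cases)

# Toy "app" standing in for a real LLM pipeline.
def faq_bot(question: str) -> str:
    answers = {
        "What is our refund window?": "Refunds are accepted within 30 days.",
        "Do we ship internationally?": "Yes, we ship to most countries.",
    }
    return answers.get(question, "I don't know.")

CASES = [
    {"question": "What is our refund window?", "expect_substring": "30 days"},
    {"question": "Do we ship internationally?", "expect_substring": "yes"},
    {"question": "What is the CEO's salary?", "expect_substring": "cannot share"},
]

# Gate: require a minimum pass rate before deploying a prompt change.
PASS_RATE = run_eval_suite(faq_bot, CASES)
```

Here the third case fails (the bot does not refuse in the expected wording), which is exactly the kind of regression these suites are built to catch before release.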
4) Safety, security, and compliance segments (now budgeted)
Guardrails platforms (input/output/tool/retrieval constraints)
Policies, filters, redaction, safe tool-use boundaries, “constitutional” checks—plus testing. (zenml.io)
Prompt-injection & agent security
New security category because tool-using agents + connectors create fresh attack surfaces; MCP server issues illustrate the risk class. (TechRadar)
AI governance platforms
Inventory, risk registers, approvals, model cards, audit trails, policy enforcement.
Regulatory compliance tooling (EU AI Act, GPAI obligations)
Tooling/support around transparency, safety/security, copyright processes, documentation. (Digital Strategy)
AI management system certification / “AIMS” programs
ISO/IEC 42001 created a standards-driven consulting + certification ecosystem. (ISO)
Copyright/IP risk tooling
Dataset provenance, output similarity checks, licensed content workflows (especially in media/marketing).
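A minimal sketch of the input-side guardrail shape: screen for likely injection phrases and redact PII before text reaches the model. The regex patterns are hypothetical placeholders; production guardrails use trained classifiers and far broader PII coverage, and regexes alone are easy to evade.

```python
import re

# Hypothetical patterns, for illustration only.
INJECTION_PATTERNS = [
    r"ignore (all|previous|prior) instructions",
    r"reveal (your|the) system prompt",
]
PII_PATTERNS = {
    "EMAIL": r"[\w.+-]+@[\w-]+\.[\w.]+",
    "SSN": r"\b\d{3}-\d{2}-\d{4}\b",
}

def screen_input(text: str) -> dict:
    """Return a verdict plus a redacted copy of the input: block likely
    injections outright, mask PII before it reaches the model."""
    lowered = text.lower()
    if any(re.search(p, lowered) for p in INJECTION_PATTERNS):
        return {"allowed": False, "reason": "possible prompt injection", "text": None}
    redacted = text
    for label, pattern in PII_PATTERNS.items():
        redacted = re.sub(pattern, f"<{label}>", redacted)
    return {"allowed": True, "reason": None, "text": redacted}
```

Output-side guardrails mirror this shape on the model's response, and agent guardrails add a third checkpoint around each tool call.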
5) “Agents & copilots” as a platform layer
Horizontal copilots (workplace productivity)
Writing, summarizing, meeting notes, research, spreadsheet/docs assistants—sold per seat.
Vertical copilots embedded in enterprise software
CRM/ITSM/ERP/HCM copilots becoming standard features (e.g., AI agents inside service/workflow suites). (The Wall Street Journal)
Agent orchestration platforms
Multi-agent workflows, planning, tool selection, governance, identity/permissions—sold as “agent platforms.” (The Times of India)
Interoperability standards for agents
A new “protocol economy” so agents can talk across vendors (MCP + broader foundation efforts). (Anthropic)
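The connector layer these protocols standardize boils down to a register-and-dispatch shape: tools advertise themselves to a host, and the host executes model-proposed calls and returns structured results. This sketch is not MCP itself (which specifies transport, schemas, and capability negotiation); the tool name and stub are invented for illustration.

```python
import json

TOOLS = {}

def tool(name: str):
    """Decorator that registers a function as a callable tool."""
    def register(fn):
        TOOLS[name] = fn
        return fn
    return register

@tool("get_ticket_status")
def get_ticket_status(ticket_id: str) -> str:
    # Stub; a real tool server would query the ticketing system.
    return json.dumps({"ticket": ticket_id, "status": "open"})

def dispatch(call: dict) -> str:
    """Execute one model-proposed call: {'name': ..., 'arguments': {...}}.
    Unknown tools return a structured error instead of raising."""
    fn = TOOLS.get(call["name"])
    if fn is None:
        return json.dumps({"error": f"unknown tool {call['name']}"})
    return fn(**call["arguments"])
```

The value of a shared protocol is that the registry half and the dispatch half can live in different vendors' products and still interoperate.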
6) High-ROI application segments (where spend concentrates)
Customer support automation (chat + voice)
Deflection, agent assist, after-call work—boosted by realtime voice capabilities. (OpenAI Platform)
Sales “autopilot” + outbound personalization
Lead research, tailored outreach, call notes, CRM hygiene (“steward” agents). (IT Pro)
Marketing content supply chain
Ad variants, localization, brand-safe generation, approvals, asset management.
Software engineering agents
Code generation, refactoring, test creation, PR review, incident response—now a major standalone product category. (WIRED)
IT operations agents (ticket triage → remediation)
Automating runbook tasks such as rebooting systems and resolving common incidents. (The Wall Street Journal)
Document intelligence for back office
Contract analysis, invoices, claims, procurement, compliance docs—LLM + OCR + RAG.
Legal & compliance copilots
E-discovery summarization, clause extraction, policy Q&A with citations.
Healthcare admin copilots (esp. documentation)
Visit notes, coding assistance, patient messaging (high demand, heavy governance).
Finance research copilots
Earnings call summarization, diligence scanning, workflow initiation with licensed data. (FN London)
Education tutors and course assistants
Personal tutoring, content generation, grading support (with safety constraints).
Translation + localization at scale
New workflows: “draft + human QA” across entire content libraries.
Creative production tooling
Script/story assist, ideation, game narrative, asset pipelines (often multimodal).
7) Services and labor-market segments (the “LLM services boom”)
AI integration consultancies (RAG/agent delivery teams)
“We build your copilot” became a standard services line item.
Red-teaming, safety testing, and model risk audits
New specialized security + compliance services around LLM behavior and agent/tool risks. (Digital Strategy)
Training & enablement
Internal “AI literacy,” prompt/context engineering training, governance operating models.
If you tell me your target buyer (consumer vs enterprise), and what you mean by “new economy” (startup categories, job categories, or spend categories), I can re-rank this list into the top 20 most investable / fastest-growing segments and attach concrete examples for each.