The knowledge layer
for podcasts.

Real-time synchronization between conversation and research.

// PARSING AUDIO STREAMS
// VERIFYING FACTS WITH GEMINI 3
// GENERATING KNOWLEDGE GRAPH...
noeron.app — Research Stream

Research Stream

Claims Extracted As You Listen

Mechanism
31:10

Evolution reprograms bodies by changing signals, not physical hardware

"But much of the time it's not by changing the hardware, it's by changing the signals that the cells give to each other... It's doing what we as engineers do, which is try to convince the cells to do various things..."

CONFIDENCE: 80%
Theoretical
30:24 - 1 MIN AGO

Every biological level, from molecules to organs, pursues its own goals

"Biology uses a multi scale competency architecture, meaning that every level has goals. So molecular networks have goals, cells have goals..."

CONFIDENCE: 95%

POWERED BY GEMINI 3

Why Gemini 3 makes this possible

COST REDUCTION: 25x
1M Token Context
150+ Papers

Load entire podcast transcripts alongside the full research corpus in a single context window
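As a rough sanity check that this fits (every figure below is a back-of-the-envelope assumption, not a measured value), a full multi-hour transcript plus a 150-paper corpus lands comfortably under the 1M-token limit:

```python
# Back-of-the-envelope token budget (all figures are rough assumptions).
TOKENS_PER_WORD = 1.3                 # common rule of thumb for English text

transcript_words = 3 * 60 * 150       # ~3 h episode at ~150 words/min
transcript_tokens = int(transcript_words * TOKENS_PER_WORD)

papers = 150
tokens_per_paper = 6_000              # assumed average for a parsed paper body
corpus_tokens = papers * tokens_per_paper

total = transcript_tokens + corpus_tokens
assert total < 1_000_000              # fits in a single 1M-token context
```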

Context Caching
$2 / 1K queries

Process the paper corpus once, then query thousands of times — making real-time responses economically viable
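A break-even sketch of why this works (the per-token prices below are hypothetical placeholders, not real Gemini rates, and cache storage fees are omitted):

```python
# Break-even sketch for context caching. Prices are hypothetical placeholders,
# expressed per million input tokens; cache storage fees are omitted.
def corpus_cost(queries: int, corpus_tokens: int, query_tokens: int,
                price_per_mtok: float, cached_price_per_mtok: float,
                use_cache: bool) -> float:
    """Total input cost in dollars for `queries` calls over a shared corpus."""
    if use_cache:
        # Corpus billed once at the full rate, then at the cached rate per query.
        return (corpus_tokens * price_per_mtok
                + queries * (corpus_tokens * cached_price_per_mtok
                             + query_tokens * price_per_mtok)) / 1e6
    # Without caching the whole corpus is re-sent at full price on every call.
    return queries * (corpus_tokens + query_tokens) * price_per_mtok / 1e6

# 900k-token corpus, 200-token queries, cached input assumed 10x cheaper.
no_cache = corpus_cost(1_000, 900_000, 200, 2.0, 0.2, use_cache=False)
cached = corpus_cost(1_000, 900_000, 200, 2.0, 0.2, use_cache=True)
assert cached < no_cache   # caching wins once the corpus is queried repeatedly
```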

Thinking Levels
Medium -> High

Adaptive reasoning depth: fast claim detection with medium thinking, deep synthesis with high thinking

Structured Outputs
JSON Schema

Generate context cards with proper citations, confidence scores, and provenance tracking automatically
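A minimal sketch of the kind of schema a context card could be constrained to (the field names and enum values here are illustrative assumptions; in the Gemini API such a schema would be supplied as the structured-output response schema):

```python
# Illustrative JSON Schema for a context card. Field names and enum values
# are assumptions, not the project's actual schema.
CONTEXT_CARD_SCHEMA = {
    "type": "object",
    "properties": {
        "claim": {"type": "string"},
        "confidence": {"type": "number", "minimum": 0, "maximum": 1},
        "tag": {"type": "string",
                "enum": ["Mechanism", "Theoretical", "Empirical"]},
        "citations": {
            "type": "array",
            "items": {
                "type": "object",
                "properties": {
                    "paper_id": {"type": "string"},   # provenance tracking
                    "quote": {"type": "string"},
                },
                "required": ["paper_id"],
            },
        },
        "timestamp": {"type": "string"},              # e.g. "31:10" in-episode
    },
    "required": ["claim", "confidence", "citations"],
}

def is_valid_card(card: dict) -> bool:
    """Minimal local check of the required fields (not a full validator)."""
    return all(k in card for k in CONTEXT_CARD_SCHEMA["required"])

assert is_valid_card({"claim": "...", "confidence": 0.95, "citations": []})
```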

TWO-PASS GEMINI ARCHITECTURE
01
Claim Detection
gemini-3-flash

thinking: 'medium'

input: 60s window

output: claims+tags

02
Context Synthesis
gemini-3-pro + cache

thinking: 'high'

cached: 150+ papers

output: cards+cites
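The two passes above can be sketched as a small orchestration loop. The model names and thinking levels come from the architecture diagram; the call bodies are stubs standing in for real SDK calls, and the signatures are assumptions:

```python
# Two-pass sketch with stubbed model calls. Model names come from the
# architecture above; the function bodies are placeholders, not a real SDK.
from dataclasses import dataclass

@dataclass
class Claim:
    text: str
    tag: str
    timestamp: str

def detect_claims(window_text: str, window_start: str) -> list[Claim]:
    """Pass 1: gemini-3-flash, thinking='medium', over a 60 s transcript window."""
    # Stub: a real call would send `window_text` to the flash model.
    if "competency architecture" in window_text:
        return [Claim("Every biological level pursues its own goals",
                      "Theoretical", window_start)]
    return []

def synthesize_card(claim: Claim) -> dict:
    """Pass 2: gemini-3-pro, thinking='high', with the 150+ paper corpus
    held in a context cache; returns a context card with citations."""
    # Stub: a real call would reference the cached corpus instead of re-sending it.
    return {"claim": claim.text, "tag": claim.tag,
            "timestamp": claim.timestamp, "citations": [], "confidence": 0.95}

window = "Biology uses a multi scale competency architecture ..."
cards = [synthesize_card(c) for c in detect_claims(window, "30:24")]
assert cards and cards[0]["tag"] == "Theoretical"
```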

TECHNICAL STACK

How it all fits together

MCP PROTOCOL
DATA PIPELINE
01
INGEST: Semantic Scholar + arXiv
02
EXTRACT: GROBID TEI Processing
03
TRANSCRIBE: AssemblyAI + Diarization
04
CHUNK: 400 tokens / 50 overlap
05
EMBED: Gemini text-embedding-004
06
INDEX: Supabase pgvector
07
DETECT: Gemini Claim Extraction
08
VERIFY: RAG + Citation Scoring
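Step 04 of the pipeline (400-token chunks, 50-token overlap) can be sketched as a sliding window. Here whitespace "tokens" stand in for the real tokenizer:

```python
# Sliding-window chunking: 400-token chunks with 50-token overlap.
# Whitespace-split "tokens" stand in for the real tokenizer here.
def chunk(tokens: list[str], size: int = 400, overlap: int = 50) -> list[list[str]]:
    step = size - overlap          # advance 350 tokens per chunk
    chunks = []
    for start in range(0, len(tokens), step):
        chunks.append(tokens[start:start + size])
        if start + size >= len(tokens):
            break                  # last window already covers the tail
    return chunks

tokens = [f"t{i}" for i in range(1000)]
chunks = chunk(tokens)
assert len(chunks) == 3                    # spans 0-400, 350-750, 700-1000
assert chunks[1][:50] == chunks[0][-50:]   # 50-token overlap preserved
```

The overlap keeps a claim that straddles a chunk boundary retrievable from at least one chunk.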
Backend
Python + FastMCP
MCP protocol server with HTTP adapter
Frontend
Next.js + React
Real-time sync via API proxy routes
Vector Store
Supabase pgvector
Persistent embeddings + metadata
AI Engine
Gemini 3 Pro
Claim detection + synthesis
500+
Papers Indexed
< 3s
Query Latency
768
Embedding Dims
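The vector store side can be sketched as plain pgvector SQL, matching the 768-dimension embeddings above (table and column names are assumptions, not the project's actual schema):

```python
# Hypothetical pgvector schema and similarity query. Table and column names
# are assumptions; the 768 dimension matches text-embedding-004 output.
DDL = """
CREATE TABLE IF NOT EXISTS paper_chunks (
    id        bigserial PRIMARY KEY,
    paper_id  text NOT NULL,
    content   text NOT NULL,
    embedding vector(768)
);
"""

QUERY = """
SELECT paper_id, content, 1 - (embedding <=> %(q)s) AS similarity
FROM paper_chunks
ORDER BY embedding <=> %(q)s   -- pgvector cosine-distance operator
LIMIT %(k)s;
"""
```

With an appropriate index on the embedding column, a query like this is how sub-3-second retrieval over hundreds of papers stays feasible.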
EXPOSED MCP TOOLS
rag_search
save_paper
save_author_papers
get_saved_paper
list_saved_papers
rag_stats
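In the backend these tools are registered with FastMCP; the sketch below uses a stdlib-only stand-in for the decorator so the shape is visible. Tool names match the list above, but the signatures and stub bodies are assumptions:

```python
# Stdlib-only sketch of the MCP tool surface. Tool names match the exposed
# list; signatures and bodies are assumptions (real versions hit pgvector).
from typing import Callable

TOOLS: dict[str, Callable] = {}

def tool(fn: Callable) -> Callable:
    """Minimal stand-in for an MCP server's tool-registration decorator."""
    TOOLS[fn.__name__] = fn
    return fn

@tool
def rag_search(query: str, top_k: int = 5) -> list[dict]:
    """Semantic search over indexed paper chunks."""
    return []  # stub: real version embeds the query and searches pgvector

@tool
def save_paper(paper_id: str) -> dict:
    """Fetch, extract, chunk, embed, and index a single paper."""
    return {"paper_id": paper_id, "status": "saved"}  # stub

@tool
def list_saved_papers() -> list[str]:
    """List identifiers of papers already in the corpus."""
    return []  # stub

assert set(TOOLS) == {"rag_search", "save_paper", "list_saved_papers"}
```

`save_author_papers`, `get_saved_paper`, and `rag_stats` would follow the same registration pattern.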

SEE IT IN ACTION

3-minute walkthrough

DEMO VIDEO
Noeron Research Platform
00:00 Podcast begins
00:45 First claim detected
01:30 Research surfaces
02:15 Knowledge graph generates
03:00 Cross-episode connections

</> Built for Gemini 3 Global Hackathon