Research & Papers

[D] Two college students built a prototype that tries to detect contradictions between research papers — curious if this would actually be useful

Prototype uses LLMs to extract causal claims and flag opposing findings researchers might miss.

Deep Dive

Two college students have developed a prototype AI system designed to automatically detect contradictions between research papers, addressing a common frustration in academic literature review. Their tool uses LLMs to extract causal claims (like "X improves Y" or "X reduces Y") from papers, builds relationship graphs in Neo4j, and flags when different papers make opposing assertions about the same relationships. In initial testing on approximately 50 papers from one professor's publication list, the system successfully surfaced conflicting findings that might have been missed through traditional abstract reading alone.
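The core flagging logic described above can be sketched in plain Python. This is a minimal illustration, not the students' actual code: the claim tuple shape, the `POSITIVE`/`NEGATIVE` verb lists, and the `find_contradictions` helper are all assumptions made for the example. The idea is simply to normalize causal verbs to a direction and flag any (subject, object) pair asserted in both directions.

```python
from collections import defaultdict

# Hypothetical normalized claim: (paper_id, subject, verb, object).
# The verb lists below are illustrative, not from the original post.
POSITIVE = {"improves", "increases", "boosts"}
NEGATIVE = {"reduces", "decreases", "impairs"}

def normalize(verb):
    """Map a causal verb to a direction: +1, -1, or 0 (unknown, skipped)."""
    if verb in POSITIVE:
        return +1
    if verb in NEGATIVE:
        return -1
    return 0

def find_contradictions(claims):
    """Group claims by (subject, object) and flag pairs asserted in both directions."""
    by_pair = defaultdict(list)
    for paper_id, subj, verb, obj in claims:
        direction = normalize(verb)
        if direction:
            by_pair[(subj, obj)].append((paper_id, direction))
    flags = []
    for (subj, obj), entries in by_pair.items():
        directions = {d for _, d in entries}
        if {+1, -1} <= directions:  # both a positive and a negative assertion exist
            flags.append((subj, obj, entries))
    return flags

claims = [
    ("paperA", "exercise", "improves", "sleep quality"),
    ("paperB", "exercise", "reduces", "sleep quality"),
    ("paperC", "caffeine", "reduces", "sleep quality"),
]
# Flags the exercise -> sleep quality disagreement; the caffeine claim is unopposed.
contradictions = find_contradictions(claims)
```

In practice the hard part is upstream of this: getting the LLM extraction to produce consistently normalized subjects and objects so that two papers' claims land on the same (subject, object) key at all.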

The technical stack combines a Python/FastAPI backend, a React frontend, a Neo4j graph database, OpenAlex for paper metadata, and LLMs for claim extraction. While promising, the prototype has known limitations: extraction occasionally drops the conditions attached to a claim, and domain filtering is still coarse. The students are now seeking feedback from researchers on whether contradiction detection would be valuable in real workflows, how researchers currently identify disagreements between papers, and what would build trust in automated tools. Their openly shared prototype is an early but promising approach to literature synthesis that could save researchers significant time in identifying conflicting evidence across growing publication volumes.
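With claims stored in Neo4j, the flagging step reduces to a graph query. The schema below is a guess for illustration only (the post does not describe the actual node labels or properties): each `(:Paper)` node `ASSERTS` `(:Claim)` nodes carrying `subject`, `object`, and `direction` properties.

```cypher
// Hypothetical schema: (:Paper)-[:ASSERTS]->(:Claim {subject, object, direction})
MATCH (p1:Paper)-[:ASSERTS]->(c1:Claim),
      (p2:Paper)-[:ASSERTS]->(c2:Claim)
WHERE c1.subject = c2.subject
  AND c1.object  = c2.object
  AND c1.direction = 'positive'
  AND c2.direction = 'negative'
RETURN p1.title, p2.title, c1.subject, c1.object
```

Representing claims as graph nodes rather than rows makes this kind of cross-paper pattern matching a one-query operation, which is presumably why the students reached for a graph database.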

Key Points
  • Uses LLMs to extract causal claims like "X increases Y" from research papers
  • Built a relationship graph from roughly 50 papers and flagged opposing findings that might otherwise be missed
  • Tech stack includes Python/FastAPI, React, Neo4j, OpenAlex, and LLMs for extraction
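On the data side, OpenAlex exposes a free REST API for fetching a researcher's works. The sketch below shows one plausible way to pull papers and reconstruct their abstracts from OpenAlex's inverted-index format; the author ID is a placeholder, and `works_url`/`fetch_abstracts` are names invented for this example, not part of the students' code.

```python
import json
from urllib.parse import urlencode
from urllib.request import urlopen

OPENALEX = "https://api.openalex.org/works"

def works_url(author_id, per_page=50):
    """Build an OpenAlex query URL for one author's works.

    author_id is an OpenAlex author ID; "A5023888391" below is a placeholder.
    """
    params = {"filter": f"author.id:{author_id}", "per-page": per_page}
    return f"{OPENALEX}?{urlencode(params)}"

def fetch_abstracts(author_id):
    """Download works and rebuild abstracts from the abstract_inverted_index field.

    OpenAlex stores abstracts as {word: [positions]}; sorting by position
    recovers the original word order. Requires network access.
    """
    with urlopen(works_url(author_id)) as resp:
        works = json.load(resp)["results"]
    out = []
    for w in works:
        inverted = w.get("abstract_inverted_index") or {}
        positions = {p: word for word, ps in inverted.items() for p in ps}
        abstract = " ".join(word for _, word in sorted(positions.items()))
        out.append((w["id"], w["display_name"], abstract))
    return out
```

Reconstructed abstracts (or full texts, where available) would then be fed to the LLM extraction step to produce the causal claims.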

Why It Matters

Could save researchers hours by automatically surfacing contradictory evidence they might otherwise miss in literature reviews.