Project SynFo
Designing Systems to Counter Misinformation
Nikhil Khedkar · UX Research & Systems Design
Project SynFo
Project SynFo began as a research and design project during my postgraduate studies in Interactive Media Management at George Brown College.
The project explored how misinformation spreads through social media ecosystems and how design systems might help users evaluate information credibility rather than passively consume content.
Shortly after completing this research, the COVID-19 pandemic dramatically shifted the nature of work. As industries transitioned toward remote collaboration, the insights from this project led to opportunities to contribute to research and design work under NDA.
While those projects cannot be publicly shared, the SynFo research represents the foundation of that work and reflects the direction in which my thinking has continued to evolve.
Today, the same questions have become even more urgent with the rise of generative AI and synthetic media.
The Problem
False Narratives Travel Faster Than Facts
Social media platforms have fundamentally transformed the speed and scale at which information circulates. While this has enabled unprecedented access to knowledge and real-time events, it has also created a powerful accelerant for misinformation. False narratives routinely outpace verified reporting — not because they are more accurate, but because they are more emotionally compelling, structurally simplified, and deeply resonant within closed digital communities.
Platforms optimized for engagement inadvertently reward content that provokes outrage, confirms bias, and spreads virally — regardless of its factual grounding.
How might we design digital systems that help users evaluate information credibility rather than simply consume it?
This core design question drives every decision in Project SynFo — from research methodology to system architecture.
Research Question
How Can System Design Reduce Misinformation's Influence?
The central research question: How do users interact with information on social media platforms, and how can system design reduce the spread and influence of misinformation?
Platform Design
How interface choices shape what users see, trust, and share.
User Behavior
The psychological and social drivers behind information sharing.
Information Credibility
The signals users use — consciously or not — to assess truth.
Digital Literacy
How education and system prompts can improve critical evaluation.
Research Context
A Period of Heightened Information Conflict
This research was grounded in real-world events where the consequences of misinformation were measurable, documented, and severe. Two pivotal moments provided the empirical context for studying how digital ecosystems amplify false narratives at scale.
January 6, 2021 — U.S. Capitol Attack
The events of January 6th were preceded by months of coordinated misinformation campaigns across Facebook, Twitter, and fringe platforms. False claims about election fraud spread rapidly through algorithmically amplified echo chambers, demonstrating how platform design can directly enable real-world harm.
2022 — Canadian Convoy Protests
The Canadian convoy protests revealed how misinformation can mobilize communities across borders. Coordinated messaging — often stripped of context and amplified by foreign actors — spread through encrypted groups and public feeds alike, showing that misinformation is now a cross-platform, transnational design challenge.
Methodology
Research Methods: Secondary & Primary Investigation
Secondary Research
The secondary research phase built a structural understanding of misinformation ecosystems by analyzing how false narratives behave across major platforms. Key sources included behavioral analyses of viral misinformation, media reports, and public platform documentation.
Facebook — closed group dynamics and echo chamber formation
Twitter — viral amplification and retweet mechanics
YouTube — algorithmic recommendation pathways
Primary Research
Primary research involved direct observational participation inside active Facebook communities during key misinformation events. Rather than studying data from a distance, this approach captured the lived experience of how misinformation circulates and gains perceived legitimacy in real time.
Monitoring group conversations as they unfolded
Tracking how unverified claims gained social reinforcement
Identifying repeating patterns in credibility signaling
Key Insights
Why Misinformation Spreads: Behavioral Patterns
Across both research phases, consistent behavioral patterns emerged that explain why false narratives outperform verified ones on engagement metrics. These findings point directly toward design intervention opportunities.
Group Identity Alignment
Information that reinforces a community's existing worldview spreads faster and faces less scrutiny, regardless of factual accuracy.
Emotional Triggering
Content that provokes fear, anger, or moral outrage is shared more reflexively — before critical evaluation occurs.
Repetition Effect
Repeated exposure to a claim — even a false one — increases perceived truthfulness. This "illusory truth effect" is amplified by algorithmic feeds.
Social Validation
Users trust information that appears endorsed by community members, substituting social proof for factual verification.

Key Insight: Users consistently trust information that appears socially reinforced over information that is factually verified — a design challenge, not just a literacy problem.
Design Opportunity
Beyond Moderation: A Systems Design Approach
Traditional responses to misinformation rely on reactive moderation — flagging, removing, or demoting content after it has already spread. This approach is inherently slow, prone to bias, and scales poorly against the volume of content generated daily across global platforms.
A systems design approach shifts the intervention upstream — building credibility evaluation directly into the information experience, before a user clicks share. This is the design philosophy behind SynFo — Synergy of Information.
Design Interventions
1. Visualize Information Sources: show users the network of sources behind a claim.
2. Highlight Credibility Indicators: surface signals that distinguish reliable from unreliable sources.
3. Provide Contextual Verification: present supporting and contradicting evidence in context.
4. Enable Collaborative Truth-Seeking: design systems where communities collectively evaluate claims.
The Concept
SynFo — Synergy of Information
SynFo is built on a single foundational premise: information should never be consumed in isolation. Every claim exists within a web of sources, contexts, and counter-narratives — and users deserve to see that web before forming a judgment.
1. Source Relationships
Users can see how claims connect to — or diverge from — a network of primary and secondary sources, making the information ecosystem visible rather than hidden.
2. Credibility Signals
The system surfaces structured credibility indicators — source type, publication history, fact-checker assessments — directly within the reading experience.
3. Narrative Comparison
SynFo allows users to compare how the same event or claim is being reported across different sources and ideological perspectives simultaneously.
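The three components above can be sketched as a simple data model: every claim carries its source network and the structured signals surfaced to the reader. This is a minimal illustrative sketch; all class and field names are assumptions, not a real SynFo implementation.

```python
from dataclasses import dataclass, field

# Hypothetical SynFo claim model: a claim is never shown in isolation.
# It travels with its sources, credibility signals, and counter-narratives.
# Names and scoring here are illustrative assumptions only.

@dataclass
class Source:
    name: str
    source_type: str          # e.g. "primary", "secondary", "fact-checker"
    credibility_score: float  # 0.0 (unverified) to 1.0 (well-established)

@dataclass
class Claim:
    text: str
    sources: list[Source] = field(default_factory=list)
    counter_narratives: list[str] = field(default_factory=list)

    def credibility_summary(self) -> dict:
        """Aggregate the structured signals shown alongside the claim."""
        if not self.sources:
            return {"sources": 0, "avg_score": 0.0, "has_primary": False}
        return {
            "sources": len(self.sources),
            "avg_score": round(
                sum(s.credibility_score for s in self.sources) / len(self.sources), 2
            ),
            "has_primary": any(s.source_type == "primary" for s in self.sources),
        }

claim = Claim(
    text="Event X occurred at location Y.",
    sources=[Source("Wire Agency", "primary", 0.9),
             Source("Anonymous Blog", "secondary", 0.2)],
    counter_narratives=["Outlet Z reports a different timeline."],
)
print(claim.credibility_summary())
# {'sources': 2, 'avg_score': 0.55, 'has_primary': True}
```

The design point the sketch makes concrete: the credibility summary and counter-narratives are first-class fields of the claim itself, so the interface can always render the web of sources alongside the content.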
Educational Layer
Spot Fake — Making Media Literacy Engaging
Why Gamification?
Passive media literacy education has proven ineffective at scale. Users don't read warnings — they click past them. Spot Fake transforms critical evaluation into an active, rewarding practice by embedding it in a game-like experience that builds genuine skill.
When users practice identifying manipulation in a low-stakes environment, they develop pattern recognition that transfers to real-world content consumption.
Misinformation Signals Users Practice Identifying
Emotional Framing
Recognizing when language is designed to provoke rather than inform
Manipulated Images
Spotting doctored visuals and out-of-context photography
Misleading Headlines
Identifying clickbait that contradicts or distorts the article body
Unverifiable Claims
Detecting assertions that lack traceable sourcing or evidence
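The four signals above translate directly into a practice loop: the player is shown an item and must name the manipulation signal it exhibits. A minimal sketch of one Spot Fake round, with illustrative placeholder items rather than real game content:

```python
# Hypothetical Spot Fake round: each item exhibits one of the four
# manipulation signals; the player earns a point for naming it correctly.
# Item texts and signal labels are illustrative assumptions.

ITEMS = [
    ("Headline promises outrage the article never supports", "misleading_headline"),
    ("Photo reused from an unrelated 2015 event", "manipulated_image"),
    ("'Everyone knows' claim with no traceable source", "unverifiable_claim"),
    ("Language engineered to provoke fear, not inform", "emotional_framing"),
]

def play_round(answers):
    """Score a round: one point per correctly identified signal."""
    score = 0
    for (prompt, signal), guess in zip(ITEMS, answers):
        if guess == signal:
            score += 1
    return score

# A player who recognizes all four manipulation signals scores 4/4.
perfect = [signal for _, signal in ITEMS]
print(play_round(perfect))  # 4
```

The low-stakes scoring loop is the point: repeated, immediate feedback on pattern recognition is what transfers to real-world content consumption.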
Design Principles
The Four Pillars of SynFo's Design Philosophy
Every design decision in SynFo is anchored to a core set of principles that prioritize user empowerment over platform control. These principles reflect a fundamental belief: that the solution to misinformation must respect user agency while building structural safeguards into the system.
1. Transparency
Users should always be able to see where information originates — who published it, when, and through what channels. Opacity enables misinformation; transparency disrupts it.
2. Context
No claim should appear in a vacuum. Supporting evidence, opposing viewpoints, and source metadata should be presented alongside content as a standard feature, not an optional add-on.
3. Education
Systems should actively teach users how to evaluate credibility — not just flag bad content, but build the cognitive skills to recognize it independently over time.
4. Collaboration
Communities can be powerful forces for truth when given the right tools. Collective identification and annotation of misinformation distributes the verification workload and builds community trust.
Design System
Visual Identity: Clarity as a Design Value
The visual design language of SynFo is intentional in its restraint. Misinformation often exploits visual manipulation — alarming color palettes, dense layouts, and anxiety-inducing design patterns — to bypass critical thinking. SynFo's design actively counters this by prioritizing clarity, legibility, and calm structure.
Every visual choice — from typography to spacing — is made in service of cognitive ease and transparent communication. The system is designed to feel trustworthy before a single word is read.
Jura
Primary typeface — geometric, precise, and technical. Signals structured thinking and systematic credibility.
Avenir Next
Secondary typeface — humanist, readable, and warm. Balances the system's technical rigor with approachability.
Design Principles in Practice
High Contrast Legibility
Color choices prioritize accessibility and reading clarity across device types and lighting conditions.
Structured Information Hierarchy
Visual hierarchy guides users toward credibility signals before emotional content, reversing typical social feed patterns.
No Dark Patterns
The system explicitly avoids manipulative UI patterns — infinite scroll, variable reward mechanics, or anxiety-driven notifications.
Trustworthy Aesthetic
Clean layouts and restrained color usage evoke the editorial standards of quality journalism rather than the chaos of viral content.
Future Potential
SynFo as a Platform: The Road Ahead
The current SynFo concept establishes the foundational architecture for a far more powerful system. As AI capabilities mature and information infrastructure evolves, SynFo is positioned to integrate with emerging technologies that would dramatically expand its effectiveness and reach.
1. AI-Assisted Fact Verification
Large language models and specialized fact-checking AI could automatically cross-reference claims against verified databases in real time, surfacing contradictions before content is shared.
2. Information Provenance Tracking
Blockchain or distributed ledger technologies could create immutable audit trails for digital content, allowing users to trace the full origin and transformation history of any claim.
3. Knowledge Graph Systems
Structured knowledge graphs would enable SynFo to map relationships between entities, events, and claims at scale — creating a living, queryable network of verified information.
4. Real-Time Detection
Coordinated inauthentic behavior and emerging misinformation campaigns could be identified and flagged as they develop, rather than after viral spread has occurred.
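The knowledge-graph idea in the list above can be made concrete with a toy example: structured claims are checked against a small graph of verified (entity, relation, value) triples. This is a heavily simplified sketch under stated assumptions; the triples and matching logic are illustrative, far simpler than any production verification system.

```python
# Toy knowledge graph: a set of verified (entity, relation, value) triples.
# All triples here are illustrative placeholders.
VERIFIED_TRIPLES = {
    ("election_2020", "certified_by", "all_50_states"),
    ("vaccine_a", "approved_by", "regulator_x"),
}

def check_claim(entity: str, relation: str, value: str) -> str:
    """Return a credibility flag for a structured claim."""
    if (entity, relation, value) in VERIFIED_TRIPLES:
        return "supported"
    # Same entity and relation, but a different value, contradicts the graph.
    if any(e == entity and r == relation for e, r, _ in VERIFIED_TRIPLES):
        return "contradicted"
    return "unverified"

print(check_claim("election_2020", "certified_by", "all_50_states"))  # supported
print(check_claim("election_2020", "certified_by", "contested"))      # contradicted
print(check_claim("event_z", "occurred_on", "2022-02-01"))            # unverified
```

Even this toy version shows the design payoff: a claim can be flagged as contradicting verified knowledge before it is shared, which is the upstream intervention SynFo argues for.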
Reflection
Misinformation Is a Design Problem
Project SynFo arrives at a conclusion that is both sobering and empowering: the misinformation crisis is not simply a content moderation problem, a political problem, or even a media literacy problem in isolation. At its core, it is a design problem.
The systems that amplify false narratives were built by designers and engineers — often with the best intentions — optimizing for engagement, growth, and retention. The unintended consequences of those design decisions now affect democratic institutions, public health, and social cohesion globally.
If design decisions created these conditions, then better design decisions can help correct them. SynFo is a proof of concept that UX research and systems thinking can be applied with genuine rigor to one of the most consequential challenges of the digital age — empowering users to navigate information with clarity, critical skill, and confidence.
"By designing better information systems, we can empower users to make more informed decisions — not through restriction, but through radical transparency."
— Project SynFo, Nikhil Khedkar

4 Core Design Principles · 2 Real-World Case Studies · 3 Target Platforms Analyzed