Contributing to Events Data

Welcome! This guide explains how you can help improve the p(Doom)1 event timeline by contributing real quotes, fixing metadata, and curating event quality.

📊 Quick Overview

The p(Doom)1 timeline contains 1028 events tracking the history of AI safety research, funding crises, technical breakthroughs, and policy developments. Many events currently have placeholder reactions that need to be replaced with real quotes from researchers and media.

  - 1028 total events
  - ~949 need real quotes
  - ~28 have summaries
  - 0 real quotes (yet!)

✅ Your contribution matters because real quotes turn these placeholder reactions into a sourced, verifiable historical record.

🎯 What Needs Help?

🟠 Placeholder Quotes (~949 events)

Most events show this badge: ⚠️ Placeholder - Needs Real Quote

Example placeholder quote:

"Critical insights for the field"

What we need:

Real quotes from LessWrong, EA Forum, Twitter, blogs, or news articles

🔵 Human Summaries (~28 events)

Some events show: ℹ️ Summary (Not Direct Quote)

These are better than placeholders but could still be upgraded to real sourced quotes when available.

🟢 Real Quotes (0 events - we're just starting!)

Our goal: ✓ Verified Quote

Example of what we want:

"This is the worst funding crisis in EA history"
— Scott Alexander (2022-11-16) (source)

🚀 How to Contribute

Option 1: Suggest a Real Quote (Easiest!)

  1. Browse the Events Timeline
  2. Click on an event you know about
  3. Look for the placeholder/summary badges in the Reactions section
  4. Click "💡 Found a Real Quote? Suggest it here"
  5. Fill out the form with the quote, author, source link, and date
  6. Submit - it creates a GitHub issue for review

✅ Good sources to check: LessWrong, EA Forum, Twitter/X, personal blogs, and news archives (see "Where to Find Quotes" below).

Option 2: Fix Event Metadata

If you notice errors in categories, tags, rarities, or game impacts:

  1. Open the event's detail page
  2. Scroll to the "🏷️ Event Metadata" section
  3. Click "→ Suggest different [category/rarity/tags]"
  4. A GitHub issue opens; explain what should change and why

Option 3: Bulk Review (Advanced)

If you're familiar with the AI safety landscape:

  1. Use the Events Browser table view
  2. Filter by category (e.g., "funding_catastrophe")
  3. Select multiple events with checkboxes
  4. Click "Bulk Suggest Metadata"
  5. Review events systematically by topic area
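The category filter in step 2 is easy to picture as a simple match over event records. Here is a minimal Python sketch using hypothetical records; real pdoom-data events have more fields, and "capability_advance" is an invented category name for illustration:

```python
# Hypothetical event records for illustration; real pdoom-data events carry
# more fields, and "capability_advance" is an assumed category name.
events = [
    {"title": "FTX Future Fund Collapse", "category": "funding_catastrophe"},
    {"title": "GPT-4 Release", "category": "capability_advance"},
]

def by_category(events, category):
    """Filter events to a single category, mirroring the table-view filter."""
    return [e for e in events if e["category"] == category]

funding = by_category(events, "funding_catastrophe")
```

Filtering like this is why reviewing by topic area works well: each pass keeps you inside one category you know deeply.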

✅ Quote Quality Guidelines

✅ Good Quotes

Safety Researcher Reactions: direct, attributable statements from researchers responding to the event.

Media Reactions: quotes from journalists or outlets covering the event.

❌ Bad Quotes

Don't submit: paraphrases presented as direct quotes, quotes without a verifiable source link, or quotes whose date you can't establish.

📋 Required Information

When suggesting a quote, you must provide:

  - The quote text, verbatim
  - The author's name
  - A source link
  - The date of the quote

Optional but helpful: your name, so we can credit your contribution.
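The required fields map naturally to a small validation check. A Python sketch, assuming illustrative field names (`source_url` and friends are not the actual pdoom-data schema):

```python
# Sketch of the fields a quote suggestion needs; field names are
# illustrative, not the actual pdoom-data schema.
REQUIRED_FIELDS = {"quote", "author", "source_url", "date"}

def missing_fields(suggestion: dict) -> list:
    """Return required fields absent from a suggestion, sorted for stable output."""
    return sorted(REQUIRED_FIELDS - suggestion.keys())

example = {
    "quote": "This is the worst funding crisis in EA history",
    "author": "Scott Alexander",
    "source_url": "https://example.com/post",  # placeholder URL for illustration
    "date": "2022-11-16",
}
assert missing_fields(example) == []
```

A submission that passes this check has everything a maintainer needs to verify the quote against its source.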

🎯 High-Priority Events

Top 10 Priority Events

These events have significant historical importance and would benefit most from real quotes:

  1. FTX Future Fund Collapse (2022) - Massive AI safety funding loss
  2. OpenAI Fires Sam Altman (2023) - Leadership crisis
  3. GPT-4 Release (2023) - Major capability advance
  4. Anthropic's Constitutional AI (2022) - Technical breakthrough
  5. AI Safety Camp Events - Community building
  6. DeepMind's AlphaFold (2020) - Capability demonstration
  7. EU AI Act Passage (2024) - Policy milestone
  8. MIRI Research Program Changes - Institutional shifts
  9. Pause Giant AI Experiments Letter (2023) - Public awareness
  10. Superintelligence Book Release (2014) - Field foundation

📍 Where to Find Quotes

  - Scott Alexander (ACX): covers major AI safety events in blog posts. Check archives for event dates ± 2 weeks.
  - LessWrong: search for the event title or organization name. Check the "Alignment Forum" tag for technical events.
  - Twitter/X: search for the event name + "AI safety". Check @elonmusk, @sama, @ESYudkowsky, @DarioAmodei.
  - News Archives: Google News search with a date range. Fortune, MIT Tech Review, TechCrunch AI sections.
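The "± 2 weeks" heuristic for ACX archives generalizes to any source with dated posts: compute a window around the event date and scan everything inside it. A small Python sketch (the FTX bankruptcy date used here is approximate):

```python
from datetime import date, timedelta

def search_window(event_date, days=14):
    """Archive-search range implementing the 'event date +/- 2 weeks' heuristic."""
    return event_date - timedelta(days=days), event_date + timedelta(days=days)

# FTX collapsed in mid-November 2022; the window comfortably contains
# the 2022-11-16 Scott Alexander quote shown earlier in this guide.
start, end = search_window(date(2022, 11, 11))
```

Widen `days` for slow-burn events (policy processes, institutional shifts) where reactions trail the event by a month or more.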

📝 Review Process

After You Submit

  1. GitHub issue created with your quote suggestion
  2. Maintainer reviews for accuracy and quality
  3. Quote verified - checks source is valid and quote is accurate
  4. Metadata updated in pdoom-data repository
  5. Website regenerated - your contribution goes live!
  6. Credit given (if you provided your name)

Timeline

📈 Statistics & Progress

  - 1028 total events
  - 0 real quotes
  - ~28 summaries
  - ~949 placeholders

🎯 Goals

❓ Questions

Why are there so many placeholders?

The 949 alignment research events were imported from a database that didn't include reaction quotes. We're now working with the community to source real reactions from the literature.

Can I suggest multiple quotes for one event?

Yes! Major events often have multiple notable reactions. Submit them separately and we'll track them all. In the future, we may implement a quote carousel for events with multiple high-quality reactions.

What if I don't have a GitHub account?

You can email quotes to team@pdoom1.com with the subject "Event Quote Suggestion". Include all the required information (quote, author, source, date).

Will my contributions appear in the game?

Yes! Real quotes will be used in the game's event reactions and on the events website timeline.

🙏 Recognition

We deeply appreciate all contributors who help build this historical record of AI safety. Quality contributions will be credited by name when your quote goes live (if you provide one).

Thank you for helping preserve the history of AI safety research!

📅 Browse Events Timeline 💡 Suggest a Quote 📂 View Source Data

Technical docs for developers: See Quote Database Schema