AthenaX Roundtable: EP.1 Exploring the Future of Work with AI and DAOs

Mar 4, 2026

Essay

4 min read

Guest Introduction

Alex Murray is a Professor at the University of Oregon and the Director of the Intelligent Futures Lab. His research focuses on how emerging technologies—particularly blockchain, DAOs, and AI-based autonomous agents—are reshaping how humans organize, govern, and coordinate at scale.

Key Takeaways

  • DAOs struggle primarily due to human governance challenges, not technical flaws
  • Psychological ownership is the foundation of strong communities
  • Raising more capital can increase failure risk through over-promising
  • GoverNoun aims to augment DAO governance with AI, not replace humans
  • The future depends on decentralized, transparent AI, not corporate control

Introduction

Decentralized Autonomous Organizations were supposed to fix governance.

But despite smart contracts, on-chain voting, and automated execution, DAOs still struggle—with low voter turnout, governance capture, and even multi-million-dollar failures.

In this AthenaX Roundtable, Alex Murray breaks down why DAOs fail not because of technology, but because of human behavior, and how AI agents like GoverNoun could augment—not replace—human governance.

Psychological Ownership and the Beginning of DAOs

Alex’s early research focused on crowdfunding, which he describes simply as:

“The pursuit of small amounts of financial capital from a large number of individuals in exchange for some future good or service.”

More importantly, crowdfunding removes gatekeepers, allowing capital to flow directly from communities to ideas.

But what surprised him wasn’t the entrepreneur—it was the crowd.

One of Alex’s key findings was the importance of psychological ownership.

“Humans are more likely to care deeply about something when we feel a sense of ownership over it, even in the absence of legal ownership.

They want to see it succeed. They want to shape it.”

For Web3 builders, this means acknowledging feedback, even when it isn’t implemented.

Successful crowdfunding campaigns:

  • Made backers feel invested in the mission
  • Created social identification among contributors
  • Encouraged discussion, feedback, and shared imagination

This same dynamic now plays out in the birth of DAOs in Web3 communities.

AI in DAO Governance: Augmentation, Not Automation

GoverNoun is not meant to fully automate governance.

“It’s still just a tool. Humans must monitor it, engage with it, and question it.”

Trust remains critical, and delegation should be reversible and observable, much like delegating to another human.

Can the AI Be Lobbied?

Yes—and that’s intentional.

“Lobbying GoverNoun is part of the democratic process.”

As long as lobbying is transparent and equally accessible, it mirrors real-world governance dynamics.

Decentralizing AI Itself

Alex highlights a broader challenge:

“Right now, AI is controlled by a small number of corporations.”

Decentralized AI would require:

  • On-chain training data ownership
  • Transparent decision-making
  • Tokenized, traceable outputs
  • Public accountability

Without this, AI risks becoming a tool for automation and surveillance, rather than augmentation.

What Should Humans Do to Prepare for This Future?

Alex’s answer is surprisingly simple—and difficult.

“Read. Read deeply.”

Not summaries. Not skims.

“Read philosophy. Read fiction. Read essays. Engage in debate.”

Critical thinking, empathy, and dialogue are the skills that cannot be automated.

Q&A Highlights

Q1: Why do projects that raise more money sometimes fail more often?

Alex Murray:

“The ones that raised more made more promises.”

Projects that raised ~$1M often suffered from scope creep, adding features they hadn’t fully researched or validated.

Meanwhile, smaller raises forced founders to stay focused:

“That can be version 2.0.”


Q2: What is a DAO?

Alex Murray:

“A DAO moves code to the center and humans to the periphery.”

Smart contracts execute decisions automatically, but humans still propose and vote.


Q3: Why do DAOs still fail despite automation?

Alex Murray:

“The bigger issue isn’t the code—it’s governance.”

Failures often stem from:

  • Voter apathy
  • Token concentration
  • Weak governance processes
  • Human disengagement

“It’s the social side that fails.”


Q4: What is GoverNoun?

Alex Murray:

“GoverNoun is a contextualized AI agent trained on governance theory and the collective memory of the DAO.”

Rather than replacing humans, it can:

  • Vote as a scoped delegate
  • Act as a knowledge repository
  • Synthesize discussions
  • Improve proposal flow
  • Serve as a memetic mascot for the DAO