Long-Term Future Fund

We make grants that address global catastrophic risks, especially potential risks from advanced artificial intelligence and pandemics. We also seek to promote longtermist ideas and to increase the likelihood that future generations will flourish.

Impact

The Long-Term Future Fund has recommended several million dollars' worth of grants to a range of organizations and individuals. Among other things, these grants have:

Created an instruction-generalization benchmark for LLMs

Built and maintained digital infrastructure for the AI safety ecosystem

Conducted public and expert surveys on AI governance and forecasting

Ran an AI safety independent research program

About the fund

The Fund has historically supported researchers in areas such as cause prioritization, existential risk identification and mitigation, and technical research on the development of safe and secure artificial intelligence; in several of these areas, it was among the first funders. Most of our fund managers have built their careers working full time in areas directly relevant to the Fund's mission.
The Fund managers can be contacted at longtermfuture[at]effectivealtruismfunds.org.

Focus areas

The Fund has a broad remit to make grants that promote, implement, and advocate for longtermist ideas. Many of our grants aim to address potential risks from advanced artificial intelligence and to build infrastructure and advocate for longtermist projects. However, we welcome applications related to long-term institutional reform or other global catastrophic risks (e.g., pandemics or nuclear conflict). We intend to support:
  • Projects that directly contribute to reducing existential risks through technical research, policy analysis, advocacy, and/or demonstration projects
  • Training for researchers and practitioners working to mitigate existential risks, recruitment efforts in these areas, and infrastructure for people working on longtermist projects

Why donate to this fund?

The future could include a large number of flourishing humans (or other beings). However, it is possible that certain risks could make the future much worse, or wipe out human civilization altogether. Actions taken to reduce these risks today might have large positive returns over long periods of time, greatly benefiting future people by making their lives much better, or by ensuring that there are many more of them. Donations to this fund might help to fund some of these actions and increase the chance of a positive long-term future.
Many people believe that we should care about the welfare of others, even if they are separated from us by distance, nationality, or culture. The argument for the long-term future extends this concern to those who are separated from us in time: most people who will ever exist will exist in the future.
However, the emergence of new and powerful technologies puts the potential of these future people at risk. Of particular concern are global catastrophic risks. These are risks that could affect humanity on a global scale and could significantly curtail its potential, either by reducing human civilization to a point where it could not recover, or by completely wiping out humanity.
For example, tech companies are pouring money into the development of advanced artificial intelligence systems; while the upside could be enormous, there are significant potential risks if humanity ends up creating AI systems that are many times smarter than we are, but that do not share our goals.
As another example, previous disease epidemics, such as the bubonic plague in Europe or the introduction of smallpox into the Americas, were responsible for many millions of deaths. A genetically engineered pathogen to which few humans had immune resistance could be devastating on a global scale, especially in today's hyper-connected world.
In addition to supporting direct work, it’s also important to advocate for the long-term future among key stakeholders. Promoting concern for the long-term future of humanity — within academia, government, industry, and elsewhere — means that more people will be aware of these issues, and can act to safeguard and improve the lives of future generations.

Why you might choose not to donate to this fund

Donors might conclude that improving the long-term future is not sufficiently tractable to be worth supporting. It is very difficult to know whether actions taken now are actually likely to improve the long-term future. To gain feedback on their work, organizations must rely on proxy measures of success: Has the public become more supportive of their ideas? Are their researchers making progress on relevant questions? Unfortunately, there is no robust way of knowing whether succeeding on these proxy measures will cause an improvement to the long-term future. Donors who prefer tractable causes with strong feedback loops should consider giving to the Global Health and Development Fund.

Fund managers

Lawrence Chan
Fund Manager

Daniel Eth
Fund Manager (Independent)

Eli Lifland
Guest Fund Manager

Fund advisors

Jonas Vollmer
Fund Advisor at Effective Altruism Infrastructure and Long-Term Future Fund

Frequently asked questions

How do I make a donation to an EA Fund?

What is the risk profile of the Long-Term Future Fund?

Why donate to the Long-Term Future Fund instead of donating directly to individual organizations?

Can I apply for funding to the Long-Term Future Fund?