What are the most pressing world problems?
We aim to list issues where each additional person can have the most positive impact. So we focus on problems that others neglect, which are solvable, and which are unusually big in scale, often because they could affect many future generations — such as existential risks. This makes our list different from those you might find elsewhere.
It’s also a constant work in progress, doubtless incomplete and mistaken in some ways, and may not align with your worldview — so we also provide a guide to making your own list. To learn why we listed a specific issue and how you can help tackle it, click through to the profiles below and see our FAQ further down the page.
Our list of the most pressing world problems
These areas are ranked roughly by our guess at the expected impact of an additional person working on them, assuming your ability to contribute to solving each is similar (though there’s a lot of variation in the impact of work within each issue as well).
The development of AI is likely to greatly influence the course we take as a society. We think that if it goes badly, it could pose an existential threat.
Biotechnological developments threaten to make much deadlier pandemics possible, due to accidental leaks or malicious use of engineered pathogens.
Nuclear weapons were the first genuine man-made existential threat. Despite some progress, we have not reduced the threat of nuclear war enough.
There’s a significant chance that a new great power war occurs this century — and this seems like a major risk factor for existential catastrophe. However, it seems hard to reduce this risk.
Beyond the suffering it’s already causing, worse climate change could increase existential risks from other causes and affect standards of living far into the future.
Building capacity to solve problems
We also prioritise issues that enable others to have a greater impact regardless of which problems turn out to be most pressing: building community and infrastructure, doing research, and improving decision-making.
We are part of effective altruism, so we might be biased — but we think growing and improving this network of people working on solving the world’s most pressing problems is one way to do a lot of good.
Rigorously investigating how to prioritise global problems and best address them will make the efforts of people aiming to do good more effective.
Can the decision-making processes of the most powerful institutions be improved so that they make better decisions across a range of important areas?
We think all these issues present many opportunities to have a big positive impact. If you want to help tackle them, check out our page on high-impact careers.
Similarly pressing but less developed areas
We’d be equally excited to see some of our readers (say, 10–20%) pursue some of the issues below — both because you could do a lot of good, and because many of them are especially neglected or under-explored, so you might discover they are even more pressing than the issues in our top list.
There are fewer high-impact opportunities working on these issues — so you need to have especially good personal fit and be more entrepreneurial to make progress.
If we make it more likely that the world’s population could eventually recover from a catastrophic collapse, we could preserve the possibility of a flourishing future even if a catastrophe does occur.
We may soon create machines capable of experiencing happiness and suffering, whose wellbeing will matter just like our own. But our understanding of consciousness is so incomplete that we might not even realise when this becomes possible.
If we could effectively spread positive values — like (we think!) caring about the wellbeing of all sentient beings impartially — that could be one of the broadest ways to help with a range of problems.
If a totalitarian regime ever becomes technologically advanced enough and gains enough global control, might it persist more or less indefinitely?
Even as investment in space increases, we have little in the way of a plan for how nations, companies, and individuals will interact there fairly and peacefully.
There are many ‘public goods’ problems, where no one is incentivised to do what would be best for everyone. Can we design mechanisms and institutions to mitigate this issue?
The ability to control the creation of molecules would plausibly have large impacts and could be crucial in many of the worst — and best — case scenarios for advanced AI.
Some of the worst possible futures might be less likely if we better understood why some people intentionally cause great harm (and how that harm could be limited).
The world’s most pressing problems pose immense intellectual challenges. Better reasoning by researchers and decision-makers could give us a better shot at solving them.
More world problems we think are important and underinvested in
We’d also love to see more people working on the following issues, even though, given our worldview and our understanding of each issue, we’d guess many of our readers could do even more good by focusing on the problems listed above.
Problems many of our readers prioritise
Factory farming and global health are common focus areas in the effective altruism community. These are important issues on which we could make a lot more progress.
Every year, billions of animals suffer on factory farms, where standards of humane treatment generally range from low to nonexistent.
Preventable diseases like malaria kill hundreds of thousands of people each year. We can improve global healthcare and reduce extreme poverty with more funding and more effective organisations.
Other underrated issues
There are many more issues we think society at large doesn’t prioritise enough, where more initiatives could have a substantial positive impact. But they seem either less neglected and tractable than factory farming or global health, or smaller in the expected scale of their impact.
Digitally running specific human brains — ‘mind uploading’ — might be a safer way to get some of the benefits of artificial intelligence, but might also pose its own risks.
There is an unfathomable number of wild animals. If many of them suffer in their daily lives and if we can find a (safe) way to help them, that would do a lot of good.
Liberal democracies seem more conducive to innovation, freedom, and possibly peace. There’s a lot of effort already going into this area, but there may be some ways to add more value.
Keeping people from moving to where they would have better lives and careers can have big negative humanitarian, intellectual, cultural, and economic effects.
The algorithms that social media companies employ to curate content may be contributing to harmful instability and erosion of trust in many societies.
Incentives shaped by universities and journals affect scientific progress. Can we improve them, e.g. to speed up development of beneficial technologies (and limit the proliferation of risky ones)?
Faster economic growth could improve global standards of living and cooperation, and might help future generations flourish.
Depression, anxiety, and other conditions directly affect people’s wellbeing. Finding effective and scalable ways to improve mental health worldwide could deliver large benefits.
Frequently asked questions
Our aim is to find the problems where an additional person can have the greatest social impact — given how effort is already allocated in society. The primary way we do that is by trying to compare global issues based on their scale, neglectedness, and tractability. To learn about this framework, see our introductory article on prioritising world problems.
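For readers who want a sense of how those three factors fit together, here is a minimal sketch of one common way to combine them, with each factor defined so that the units cancel into ‘good done per additional person working on the problem’. The definitions and units shown are illustrative assumptions for this sketch, not a precise scoring method we apply mechanically.

```latex
% Minimal sketch, not an official scoring formula: defining the three factors
% so that their units cancel into "good done per additional person".
\documentclass{article}
\usepackage{amsmath}
\begin{document}
\[
  \text{impact of one more person}
  \;\approx\;
  \underbrace{\frac{\text{good done}}{\text{\% of problem solved}}}_{\text{scale}}
  \times
  \underbrace{\frac{\text{\% of problem solved}}{\text{\% increase in resources}}}_{\text{tractability}}
  \times
  \underbrace{\frac{\text{\% increase in resources}}{\text{additional person}}}_{\text{neglectedness}}
\]
\end{document}
```

On this decomposition, the neglectedness factor is inversely related to the resources already going into a problem, which is why, all else equal, an additional person can expect to achieve more in an area that currently receives little attention.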
To assess the problems based on this framework, we mainly draw upon research and advice from subject-matter experts and advisors in the effective altruism research community, including the Global Priorities Institute, Rethink Priorities, and Open Philanthropy, though we also make our own judgement calls, especially for areas we’ve researched more, like artificial intelligence and building the effective altruism community. See more on our research process and principles.
To see why we listed each individual problem, click through to the full profiles.
To be clear, comparing global issues is very messy and uncertain. We are far from confident that the exact ordering presented on this page is correct — in fact, it’s likely incorrect in at least some ways. (More on this below.)
Moreover, assessments of the scale and tractability of different global issues depend on your values and worldview. You can see some of the most important aspects of our worldview in our advanced series, especially our article on how we define social impact.
Given our values and worldview as an organisation, we’ve found some heuristics that help guide our prioritisation:
- Emerging technologies and global catastrophic risks. New transformative technologies may promise a radically better future, but also pose catastrophic risks. We think that mitigating these risks, while increasing the chance these technologies allow future generations to flourish, may be the crucial challenge of this century. Though there is a growing movement working to address these issues, work on mitigating many risks remains remarkably neglected — in some cases receiving attention from only a handful of researchers. So you’ll see many issues dealing with technology on the lists above.
- Building capacity to explore problems. Comparing global issues involves lots of uncertainty and difficult judgement calls, and there have been surprisingly few serious attempts to make such big-picture comparisons. So we’re strongly in favour of work that might help resolve some of this uncertainty — whether through research or through trying to see what works in more speculative areas. The importance of this information value is reflected in our recommendation of speculative areas, as well as of global priorities research itself.
- Building communities to solve problems. We think it can be extremely valuable to invest in organisations and communities of people who are trying to do good as effectively as possible. We’re especially keen to build the effective altruism community, because it explicitly aims to work on whichever global challenges will be most pressing in the future. We count ourselves as part of this community because we share this aim.
We think some problems are much bigger and more neglected than others, such that by choosing carefully, an additional person can have a far greater impact.
Holding all else equal, we think that additional work on the most pressing global problems can be between 100 and 100,000(!) times more valuable in expectation than additional work on many more familiar social causes like developed world education, where your impact is typically limited by the smaller scale of the problem (e.g. because it only affects people in one or a few countries), or the best opportunities for improving the situation are already being taken by others. Moreover, it seems like some of the issues in the world that are biggest in scale — especially those that could affect the entire future of humanity, like mitigating risks from AI or biorisks — are also among the most neglected. This combination means you can have an outsized impact by helping tackle them.
For this reason, we think our most important advice for people who want to make a big positive difference with their careers is to choose a very pressing problem to work on. This page is meant to help readers make that choice. Read more about the importance of choosing the right problem.
A key consideration for where to work is how society is currently allocating resources. If an important problem is already widely recognised, then it is likely that a lot of people are already trying to solve it, in which case it will usually be harder for a few extra people who decide to work on the issue to have a very large impact. All else equal, you are likely to be able to do far more good in an area that is not getting the attention it deserves.
One way to think about this is in terms of a ‘world portfolio’: What would the ideal allocation of resources be for all social causes? And in which causes are we farthest from that ideal allocation?
This is why our list looks a bit surprising: we purposefully want to highlight global issues that we think are furthest from getting the attention they need — such as the risk of a catastrophic engineered pandemic, which currently receives $1–2 billion of funding per year, only around 1/500th of the funding devoted to a more widely recognised problem like climate change (which also needs more work).
To learn more about why we prioritise more neglected issues, see our article on comparing global problems in terms of scale, neglectedness, and solvability, and our advanced series.
Another reason our list might look different from others’ lists is we think it makes sense morally to value the interests of all sentient beings equally — regardless of where they live, when they live, or even what species they are — which is uncommon. One upshot is that if it seems like something could impact a huge number of future lives, we think that’s a very big deal.
Some find it objectionable to say one problem is more pressing than another — perhaps because they think it’s impossible to make such determinations, or because they think we should try to tackle everything at once.
We agree that it’s difficult to determine which issues will affect lives the most, as well as how tractable and neglected different problems are. The field of global priorities research exists because these questions are so complicated, and we are far from certain about our views (see below). But we think with careful thought and research people can make educated guesses — and do better than random.
We also agree that we can make progress on different issues at the same time, and that advocating for more people to work on helping others can increase the total amount of work done to solve all problems.
However, resources are still very much limited, so we can’t do everything at once. And we think that given the seriousness of the many challenges humanity faces, we have to prioritise among issues and use our resources effectively to solve them as much as we can, given our limitations.
Refusing to compare problems to one another doesn’t get you out of prioritising — it just means you’ll be prioritising something with your time without thinking much about what it should be.
No — though we’ve put a lot of work into thinking about how to prioritise global issues, ultimately we are drawing on a modest amount of research to address an unbelievably large and complex question, so we are very likely to be wrong in some ways (see below). You might be able to catch some of our mistakes.
Moreover, it’s very useful for people trying to make the world a better place with their careers to develop their own views about what to prioritise — you’ll be more motivated and more able to help solve an issue if you understand the case and have chosen the problem for yourself.
To help you form your own views, below we suggest a rough process for creating your own list of problems.
The most important and unusual driver of our lists is probably that we especially focus on the impact different issues can have on all future generations, an idea called longtermism. This increases the importance we place on reducing existential risks and on shaping other events that could affect the long-run future.
If we were to reject longtermism, issues that contribute to existential risk would stand out much less (including most of our top-recommended issues), while issues like ending factory farming, improving global health, speeding up economic growth, improving science, and migration reform would all be boosted.
That said, even if we rejected longtermism, we still think positively shaping AI and reducing the chance of a catastrophic pandemic would be top problems for more people to work on due to their large near-term and medium-term effects, as well as their neglectedness.
You can read about some counterarguments to longtermism on our page about it and in the second half of this article.
Of course there are other parts of our broad worldview that could be badly wrong — you can read about some of them in the articles in our advanced series.
Another major worry we have about the lists is that there’s an important issue we haven’t even thought of that should be among our top-ranked issues. We sometimes call this the possibility of finding a ‘Cause X.’ This possibility is one reason why we rate further research and capacity-building so highly.
Finally, we could easily be wrong about any of the particular issues we list — maybe some are much bigger or smaller than we think, or turn out to be more or less tractable. For example, perhaps the development of AI will be largely safe by default. You can see some of our key uncertainties about each individual issue by clicking through to the individual profiles, and we invite you to investigate these questions for yourself.
“What are the most pressing problems in the world?” is a crucial question to take on — and one that’s rarely asked! But it’s also difficult, and we’re uncertain about our answers. So it’s essentially unavoidable that this list is seriously mistaken in some important ways.
In particular, we might include problems on the list that don’t belong, the list might be missing some crucial problems, and it may be presenting some issues in misguided ways.
But despite the limits on our knowledge, working to understand the world’s most pressing problems is worth serious effort. And we need to have some working answers (or assumptions!) — otherwise we can’t decide where to put our efforts. Most people just make assumptions that inform their prioritisation; we want to be transparent about the tentative conclusions we’ve come to, as they continue to inform our decisions.
By sharing this list, we hope to encourage some people to work on pressing world problems and inspire others to think about these questions more deeply — and potentially come up with different, better answers.
Previous versions of this list have already prompted thoughtful feedback that led us to adjust our views. We anticipate this list will continue to evolve, both as the circumstances of the world change and as we come to understand them better.
No — we don’t think everyone in our audience — let alone everyone in the world — should work on our top list of problems (even if everyone totally agreed with our views).
First, the pressingness of a problem is only one aspect — though a very important one — of our framework for comparing careers.
Different people will find different opportunities within each problem, and will have different degrees of personal fit for those opportunities. These other factors also really matter — you may well be able to have 100 times the impact in an opportunity that’s a better fit, and this can easily make it higher impact to work on an issue you think is less pressing in general.
Moreover, as our audience expands, we need to think more in terms of a ‘portfolio’ of effort by our readers, which creates additional reasons for members to spread out (we cover this in more detail in our article on coordination). Two of the most important such reasons are:
- As more people work on an issue, it gets less neglected, and there are diminishing returns to additional work. This means that a group of people that’s large compared to the capacity of an issue to absorb people will start to run out of fruitful opportunities to make progress on that issue, making it better for new people to spread out into other areas.
- If you work with others, there is information value in exploring new world problems — if you explore an area and find out that it’s promising, other people can enter it as well.
Among people who follow our advice, we aim to help a majority shoot for one of the top world problems we list above, but we’d also like 10–20% to work on the second, longer list, and perhaps another 10–20% to work on the others.
If we consider the world as a whole, not just our readers, it’s even more obvious they shouldn’t all work on our top-ranked issues. The world wouldn’t function if everyone tried to work on AI safety and preventing pandemics. Clearly, we need people working on a wide range of issues, as well as keeping society running and taking care of themselves and their families.
However, in practice it’s safe to assume that what most of the world will do will remain unaffected by what we say. (If that changes, we’ll change our advice accordingly!) So we focus on finding the biggest gaps in what the world is currently doing, to enable our readers to have as much impact as they can.
Much of our effort is allocated to resources that are useful no matter which issue you want to work on (such as our advice on career planning), but some are only relevant to a particular issue.
When it comes to these, we roughly try to allocate our efforts in line with what we think are the most pressing issues. This means we aim to spend most of our time learning, writing, and thinking about our highest-priority issues and less time on issues we think are less pressing.
However, the distribution of our effort does not exactly match our views about which issues are most pressing. This is because we are a small team, there are returns to focusing on and really learning about a particular set of issues, and we are better positioned to help people work on some issues than others.
This means we tend to put more effort into the issues at the very top of our priorities, and into those where our advice can add the most compared to other sources. In practice, that means we put the most effort into providing great advice and support for AI safety, biorisk, and building the effective altruism community.
We’d love to have the resources to spend substantial effort on all the issues in the lists above. But given the size of our team, we aren’t able to do much more than write an article or two on many of the topics and point readers in the direction of more informed groups.
We are so glad you are interested! It can seem daunting, but we’ve seen lots of people make real contributions to these problems, including people who didn’t think they could when they first came across them.
The short answer is that the individual problem profiles each have a section on how to help tackle that problem, so click through to read the full profiles.
Also, see our career reviews page and job board to get ideas for specific jobs and careers that can help.
If you want to think about what to do in more depth, see our materials on career planning.
This includes a career planning worksheet that takes you through a step-by-step process for creating your plan. In summary:
The first step is to learn more about the issues you are considering, as well as what is important to you in your career.
Then you’ll want to brainstorm longer-term career paths that will let you contribute the most — we have some ideas for these on our career reviews page. Note that each issue requires a lot of different kinds of work, from advocacy to research to helping build organisations, so you’ll have many paths to consider.
Don’t rule something out too early because it doesn’t sound at first like it’d be a fit for your skills — this is a mistake we see a lot. For example, you can help with AI safety using a variety of non-technical skills (see some suggestions for work in governance as well as supporting roles here).
Next, you’ll gain more information about your career options — either by talking to people, reading, or trying things out — and then start narrowing them down.
It may be best to first focus on building career capital — skills, knowledge, connections, and credentials that put you in a better position to have an impact in the future.
It’s much more important to maximise the impact you can have over the course of your career than it is to have a big impact next year — which often means starting by investing in yourself. So, early in your career, we often recommend focusing on building skills — see our list of the most valuable skills to learn.
Finally, you’ll figure out your next career step. You may still be very uncertain where to aim longer-term, but that’s OK so long as you can find a next step that puts you in a better position (by improving your career capital or teaching you more about where to aim longer-term) or that is impactful in itself.
We have guides to particular career paths that contain common early steps as well as pointers on how to eventually put all your experience and skill to the best use.
You can also apply for free one-on-one career advice from our advisors, who can help you compare options and connect you with mentors and other opportunities.
Check out our full list of career planning resources for more.
The only thing you can control is contributing as much as you can — and that’s a matter not just of what the world needs, but also of your own motivation and abilities.
That said, we think people can often enjoy more kinds of things than they intuitively expect — motivation can come from working with great colleagues on something you think is really important, even if the area isn’t immediately interesting or captivating to you. People’s interests also develop over time. So it’s worth keeping an open mind about what you might be able to get motivated to work on.
But if you try to get motivated and it doesn’t work, you can try working on something else. In the answer to the next question, you’ll find a few other lists of issues you can investigate besides ours.
If you really want to help with these issues but don’t feel motivated to help with them directly, you could try helping by donating to organisations that work on them. If you do this as a primary aim of your career, we call it ‘earning to give.’ You can also just donate 10% of your income or however much you’re comfortable with.
Read more about how to have a positive impact in any job.
We have an article that leads you through a process for comparing global problems for yourself. In brief, we recommend doing the following steps:
Clarify your broad worldview and values
What do you think is important? And how do you come to know the answers to that? Your answers to these questions are part of your worldview.
We discuss our worldview in our advanced series. Our article on comparing problems lists some alternative views that are common among our readers and some articles discussing their pros and cons.
Spending some time thinking about the big picture can make a big difference to where you end up focusing — just keep in mind that you’ll never have a complete and fully confident answer.
Learn more about frameworks for comparing issues
For example, we often use the importance, neglectedness, and tractability framework, in which how you assess the importance and tractability of problems is partially determined by your worldview. Also, see the article on this framework in our advanced series.
Start generating ideas
Once you have frameworks and your worldview clarified to some extent, you can start generating ideas for pressing problems, perhaps using other people’s lists to get started (like ours or others’ listed just below).
Compare
Now that you have your list of issues, compare them according to your worldview and using the frameworks you learned about above.
Identify key uncertainties about your list, work out what research you might do to resolve those uncertainties, then go ahead and do it, and then reassess and repeat. If some of the issues on your list overlap with ours, you can use our problem profiles as a jumping-off point for learning more.
Again, it would take a lifetime to get totally confident and make your list complete, so aim for action-relevant information instead: is further investigation likely to change what you actually work on next? If your best guesses aren’t changing, it’s probably time to stop thinking about it and focus on something else.
Moreover, you can (and should) continue to think about which issues you think are most pressing throughout your career.
Other lists of pressing global issues, for inspiration:
- The United Nations’ sustainable development goals
- The United Nations’ more general list of global issues
- The Global Challenges Foundation’s list of global risks
- A big list of cause candidates from the Effective Altruism Forum
Different problems need different skills and expertise, so people’s ability to contribute to solving them can vary dramatically. That said, there are also many ways to contribute to solving a single problem, so you also shouldn’t assume you can’t help with something just because you don’t have some salient qualification.
To learn more about what’s most needed to address different world problems, click through to read the profiles above.
To explore your own skills and other aspects of your personal fit (especially early in your career) and find your comparative advantage, we encourage you to make a list of career ideas, rank them, identify key uncertainties about your ranking, and then try to do low-cost tests to resolve those uncertainties. After that, we often recommend planning to explore several paths if you’re able to.
You can find more thorough guidance in our resources on career planning.
You can also look at our list of the most valuable skills you can develop and apply to a variety of issues, and how to assess your fit with each one.
If you already have experience in a particular area, see our article about how you might best be able to apply it.