Usually, people get thrown into something – say, a cousin getting cancer – and so they start dedicating resources to some almost random cause. But an attempt to distance oneself from that type of behavior – an attempt to do things more intentionally and strategically – can backfire.
I don’t have insight into many EA organizations, but among those I can observe, there is little seizing of unplanned opportunities going on. Actions are usually intentionally selected, planned, and started by the organization at the time it deems optimal. That is better than seizing any random opportunity and probably better than just waiting for good opportunities (because there are likely too few of them), but a mixed strategy may be better still.
In the past, I implemented planned opportunism (in a news context) as a three-step process.
The first step is to brainstorm what classes of opportunities may be interesting. Where I worked (or volunteered) at the time, these were trademark filings, leaked photos from B2B fairs, SEC filings, investor reports, presentations and conference calls of certain public companies, TV schedules (as far in advance as possible), unused assets and names of assets in browser games, and sometimes revealing comments from insiders on social media (even just the information that they were on vacation at a certain time, since their jobs only allowed that under certain circumstances).
A lot of these types and sources of information were highly nonobvious to us, and it took us one or two years to collect these ideas and to automate the retrieval and alerting without too many false positives. Often Google Alerts was not enough (and it didn’t work very well at the time), so I developed some custom software to gather and classify the data.
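The core of such custom alerting software is a watchlist of trigger keywords combined with exclusion patterns that suppress the routine noise causing false positives. A minimal sketch of that idea, with entirely hypothetical rules and example text:

```python
import re

# Hypothetical watchlist: each rule pairs trigger keywords with
# exclusion keywords that suppress common false positives.
ALERT_RULES = {
    "trademark filing": {
        "triggers": ["trademark", "filing"],
        "exclusions": ["renewal", "dispute"],  # routine noise, not news
    },
    "tv schedule": {
        "triggers": ["schedule", "premiere"],
        "exclusions": ["rerun"],
    },
}

def classify(text):
    """Return the names of all alert rules the text matches."""
    words = set(re.findall(r"[a-z]+", text.lower()))
    hits = []
    for name, rule in ALERT_RULES.items():
        if all(t in words for t in rule["triggers"]) and not any(
            e in words for e in rule["exclusions"]
        ):
            hits.append(name)
    return hits

print(classify("New trademark filing spotted for an unannounced console"))
```

In a real deployment, the classifier would run over freshly fetched feeds or scraped pages rather than hardcoded strings, and the rules would be tuned over time, which is where the one-to-two-year collection effort goes.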
Depending on what an organization does, the relevant information will be very different, and it will probably take a long time to come up with the right ideas and gather the right sources.
In my experience in the news context, seizing these opportunities was almost zero cost, and we had too little to report on anyway, so we didn’t apply a strong filter. But that’s probably unusual. What I’ve heard increasingly often from EA organizations is that they decline more and more requests for interviews and the like because press coverage has been of highly variable value and disvalue. So I think that in general, evaluating whether an opportunity is good enough is probably just as hard as noticing opportunities, and the “Hell, yeah!” heuristic may be a good guide to which ones to accept: decline unless your evaluation is nothing short of “Hell, yeah!”
I read an anecdote somewhere (maybe in The Intelligent Investor or a similar book) where someone was given a bunch of money to invest in some type of business. That was way back when you had to travel around to invest in businesses, so this investor/advisor did that. Eventually he returned and gave the money back because, in his opinion, none of the opportunities had been good enough.
This seemed like an impressive feat to me because the investor compared the current opportunities not only to each other but also to a wider reference class of past and potential opportunities. That’s what makes this step hard.
This is where the preparedness comes in. When you’re waiting for an unknown opportunity from a certain reference class to arise, you want to prepare for it as well as possible. In the news context, I prepared article templates where I’d only have to fill in some titles and dates, and then had an article ready to publish within a minute. (If you have a tool that wakes you up in the middle of the night when an opportunity happens, you’ll be glad not to have to do more thinking than that.) These templates often contained some bit of historical or otherwise topical trivia that few people knew and that was likely to fit with whatever I’d fill in later. And if not, I could still delete it.
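Such fill-in-the-blanks templates can be as simple as a string with named slots. A minimal sketch, with a hypothetical article template and slot names of my own invention:

```python
from datetime import date
from string import Template

# Hypothetical pre-written article template: only the $slots need
# to be filled in when the opportunity actually occurs. The trivia
# line is prepared in advance and can be deleted if it doesn't fit.
ARTICLE = Template(
    "$title\n\n"
    "On $date, $company confirmed $product.\n"
    "$trivia"
)

def fill_template(title, company, product, trivia):
    """Produce a publishable article from the prepared template."""
    return ARTICLE.substitute(
        title=title,
        date=date.today().isoformat(),
        company=company,
        product=product,
        trivia=trivia,
    )
```

The point is not the mechanism but the division of labor: all the slow thinking (structure, background, trivia) happens before the opportunity, and only trivial substitution remains for the minute after it.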
Similarly, an activist could prepare by writing op-eds on various reference classes of big events that tend to occur around once a year or so and lend themselves to their activist work, and then have them ready to go out (and in mostly copyedited shape too) within an hour. Address sufficiently many of these reference classes, and you can send out an essay every month.
Or one could have a list of contact details for certain politicians ready, and when a discussion of risks from AI starts in some government body, immediately (say, within the day) send all relevant politicians, or all who may be receptive, a prepared letter offering advice on the topic.
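The prepared-contact-list idea can be sketched in a few lines. Everything here is hypothetical: the contact records, the "receptive" flag marking politicians who have previously shown interest in the topic, and the letter text.

```python
# Hypothetical contact list: each entry records whether the politician
# has previously shown interest in AI risk, so receptive ones can be
# addressed the moment a relevant discussion starts.
CONTACTS = [
    {"name": "A. Example", "email": "a@example.gov", "receptive": True},
    {"name": "B. Sample", "email": "b@example.gov", "receptive": False},
]

# Prepared letter with a single slot for the recipient's name.
LETTER = (
    "Dear {name},\n\n"
    "Regarding the current discussion on risks from AI, "
    "we would be glad to offer advice on the topic.\n"
)

def prepare_letters(contacts):
    """Render the prepared letter for every receptive contact."""
    return [
        (c["email"], LETTER.format(name=c["name"]))
        for c in contacts
        if c["receptive"]
    ]
```

Again, the machinery is trivial; the preparedness lies in having the list curated and the letter copyedited before the discussion starts.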
Or, more speculatively, one could use databases of mergers and acquisitions of VC-backed companies to quickly offer donation advice to founders at the time of their exit.
REG Fund Management
The EA Foundation wants to run EA workshops tailored to wealthy people who may become potential clients for a fund akin to the REG Fund. (Not a perfect match for the Intelligence stage because it’s a very active process, but there’s no reason to value purity when applying such a strategy, since the stages are also just something I made up.)
If a funder is ready to invest, they will need to decide whether the person is sufficiently cause-neutral that the cooperation will be worthwhile.
If that all works out, they’ll be able to seize the opportunity immediately: thanks to their experience managing the REG Fund, the advisors can recommend large grants right away.
Local Groups in CEA’s Community Building Funnel
In the case of the community building funnel, local EA groups are there, waiting for interested, value-aligned people to join them. (Again, this is a bit too active to be a perfect fit.)
The groups are ideally structured such that unaligned or exploitative people find them uninteresting.
They draw on a body of talks, seminars, etc. that they have planned out in advance to educate interested newcomers. (At least once the local group is relatively established and knows how it wants to structure its events.)