Ask X other people to evaluate whether it's an info hazard, and only move forward if at least Y of them agree it's net good
"Even if you independently do a thorough analysis and decide that the info-benefits outweigh the info-hazards of publishing a particular piece of information, that shouldn't be considered sufficient to justify publication. At the very least, you should privately discuss with several others and see if you can reach a consensus." https://forum.effectivealtruism.org/posts/KPwgmDyHaceoEFSPm/informational-hazards-and-the-cost-effectiveness-of-open
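The threshold rule above can be sketched as a small decision helper. This is a hypothetical illustration; the function name and parameters are not from the source:

```python
def approve_publication(votes, min_reviewers, min_approvals):
    """Return True only if enough independent reviewers judged
    publication to be net-positive.

    votes: list of booleans (True = reviewer thinks sharing is net good).
    """
    if len(votes) < min_reviewers:
        raise ValueError("not enough independent reviews to decide")
    # sum() counts the True votes
    return sum(votes) >= min_approvals

# Require at least 3 of 5 reviewers to agree before publishing.
approve_publication([True, True, False, True, False], 5, 3)  # True
```

The point of the `min_reviewers` check is that a consensus among too few reviewers shouldn't count as consensus at all.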
Talk about the idea in dry, technical terms so that it doesn't spread too widely
Don’t publicise a vulnerability until there’s a fix in place
"There is a norm in computer security of ‘don’t publicise a vulnerability until there’s a fix in place’" https://forum.effectivealtruism.org/posts/KPwgmDyHaceoEFSPm/informational-hazards-and-the-cost-effectiveness-of-open
Model the benefits and harms of sharing
Build an explicit model of how much benefit you expect from sharing and how much harm you risk.
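One crude way to make such a model concrete is a simple expected-value comparison. This is a sketch with made-up parameter names, not a method the source prescribes:

```python
def sharing_balance(p_benefit, benefit, p_harm, harm):
    """Crude expected-value balance for sharing a piece of information.

    A positive result means the modelled expected benefit outweighs the
    modelled expected harm; a negative result argues against sharing.
    """
    return p_benefit * benefit - p_harm * harm

# Modest likely benefit weighed against a smaller chance of larger harm.
sharing_balance(0.5, 10, 0.25, 8)
```

In practice the hard part is estimating the inputs, but even rough numbers force the trade-off to be stated explicitly.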
Gradual rollout
“In this FLI podcast episode, Andrew Critch suggested handling a potentially dangerous idea like a software update rollout procedure, in which the update is distributed gradually rather than to all customers at once” https://futureoflife.org/2017/07/31/transcript-art-predicting/ https://forum.effectivealtruism.org/posts/KPwgmDyHaceoEFSPm/informational-hazards-and-the-cost-effectiveness-of-open?commentId=dTnDBDfoNHNqWZeRP
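The rollout analogy can be sketched as a staged-release loop. These are hypothetical names; `share` and `harm_observed` stand in for whatever disclosure and monitoring steps apply:

```python
def gradual_rollout(audiences, share, harm_observed):
    """Disclose to successively larger audiences, halting if harm appears.

    audiences: ordered list, smallest/most-trusted group first.
    share: callable that discloses the information to one audience.
    harm_observed: callable returning True if misuse has been detected.
    """
    for audience in audiences:
        share(audience)
        if harm_observed():
            return False  # halt: do not widen the release further
    return True  # full rollout completed without observed harm

# e.g. trusted experts first, then the relevant field, then the public.
```

As with software updates, the value of staging comes from checking for damage between stages, not from the delay itself.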
Disclose information in the way that maximally disadvantages bad actors versus good ones
Gregory Lewis https://forum.effectivealtruism.org/posts/KPwgmDyHaceoEFSPm/informational-hazards-and-the-cost-effectiveness-of-open?commentId=Fe6zi4v2MTYf9rHSK
Develop and/or share the information
When you decide that the risks are low enough that it's fine to treat the information normally
Delay sharing
"Develop the information but don't (yet) share it": continue developing the information, but decide either that it shouldn't be shared or that it would be better to decide later
Think more about the risks
Spend more time deciding what to do. Consider pausing any development/sharing of the information that might already be occurring.
Frame the information to reduce risks
Use language to present the information in a way that makes it less likely to be misused
Share a subset of the information
Selectively share some of the information that would be net-beneficial to spread
Share the information with a subset of people
Figure out who it would be net-beneficial to share the information with, either to get their opinion on wider sharing or so they can develop defenses against the hazards raised
Avoid developing and/or sharing the information
If the risk is too high, don't develop or share the information at all.
Monitor others
"Monitor whether others may develop and/or share the information": check if others might be discovering similar things and try to ensure that they don't misuse or irresponsibly share the information
Remove your work and decrease the odds of rediscovery
"Decrease the likelihood of others developing and/or sharing the information": destroy your work that led you to the discovery and avoid spurring any others towards the same path of research
No-undercut principle
"The export control regime known as the Australia Group incorporates a “no-undercut” principle, whereby countries are expected not to permit exports that another country has rejected, without consulting that other country first. Similar norms could be applied to funders, regulators (for example, institutional biosafety committees and institutional review boards), and journals. These collaborations could also consolidate decision making among a smaller number of better-resourced groups." https://thebulletin.org/2018/02/horsepox-synthesis-a-case-of-the-unilateralists-curse/
Try to avoid coming up with (and don’t publish) things which are novel and potentially dangerous
"With the standard of novelty being a something that a relatively uninformed bad actor [wouldn't come up with] rather than an expert (e.g. highlighting/elaborating something dangerous which can be found buried in the scientific literature should be avoided)." Gregory Lewis https://forum.effectivealtruism.org/posts/KPwgmDyHaceoEFSPm/informational-hazards-and-the-cost-effectiveness-of-open?commentId=dWJjTEiXY2aNk5Hc6
Check whether it has already been discussed publicly; if so, talk about it publicly but dryly, so as not to increase the attention hazard
Turchin https://forum.effectivealtruism.org/posts/KPwgmDyHaceoEFSPm/informational-hazards-and-the-cost-effectiveness-of-open
Info hazard committee
Some kind of committee for info-hazard assessment: a group of trusted people who a) take responsibility for deciding whether an idea should be published, b) read all incoming submissions in a timely manner, and c) whose contact details (though perhaps not all their identities) are publicly known. https://forum.effectivealtruism.org/posts/KPwgmDyHaceoEFSPm/informational-hazards-and-the-cost-effectiveness-of-open?commentId=5tRaRokJXt89wWYet