hazn
@hazn_com
emphasis not mine | https://t.co/SX27gE1DEC

In 2013, this Turkish man locked his head in a cage to quit smoking, with his wife only opening it for meals.

Interesting! I certainly think "take all the power" pretty heavily implies they will do it by force. "We will cede power to them because they are useful" is a much different-feeling claim. I'm excited to hear more of your thoughts on this!

There's a thread here which is less relevant to my own interests (I'm interrogating this because you can plausibly give me a signal I can turn into an experiment that reduces the likelihood of loss of control): the claim that humans straightforwardly shouldn't cede power to systems smarter than them. I mostly think this is not a great frame. We have already ceded control to systems (government, capital) whose sum is smarter than any one of us, and this is basically fine. The issue for me is whether we can *trust* that the system, once deployed, will act with our best interests in mind.

I think there is a set of signals one can collect that would offer peace of mind that a system will act in one's best interests, even if that system is much smarter than you. I don't think I have all of those signals, but I have some, and I'm always looking for more. I think a world is possible where we build systems smarter than us and still maintain human autonomy over human-shaped things, and I think this world is even likely when extrapolating from current trajectories. I am very obsessed with making it more likely, and open to actionable insights that can be turned into experiments which make it more likely (much more than I am interested in things like debating x-risk).
