Stacey S10a (🌎/accelebrate)
@metaphdor
climate&alignment; 12+yrs AI eng research/product; ex-W&B (founding MLE/User0); AOI (muse/ex-CTO) convergelady; timelines are broad, make ceremonial-grade data



I did the math a couple weeks ago and it turns out a vegan prompting a frontier LLM *every second, 24/7* consumes less water than the average omnivore who never uses AI.


Interesting! I certainly think "take all the power" pretty heavily implies they will do it by force. "We will cede power to them because they are useful" is a very different-sounding claim. I am excited to hear more of your thoughts on this!

There's a thread here which is less relevant to my own interests (I am interrogating this because you plausibly can give me a signal I can turn into an experiment that reduces the likelihood of loss of control), which is the claim that humans straightforwardly shouldn't cede power to systems smarter than them. I mostly think this is not a great frame: we have already ceded control to systems (government, capital) whose sum is smarter than any one of us, and this is basically fine. The issue for me is whether we can *trust* that the system, once deployed, will act with our best interests in mind.

I think there is a set of signals one can collect that would offer peace of mind that a system will act in one's best interest, even if that system is much smarter than you. I do not think I have all of those signals, but I think I have some, and I am always looking for more. I think a world is possible where we build systems smarter than us and still maintain human autonomy over human-shaped things. I think this world is even likely when extrapolating from current trajectories. I am very obsessed with making this more likely, and open to actionable insights which can be turned into experiments that make it more likely (much more than I am interested in things like debating x-risk).



At the risk of seeming like the crazy person suggesting that you seriously consider ceasing all in-person meetings in February 2020 “just as a precaution”… I suggest you seriously consider ceasing all interaction with LLMs released after September 2024, just as a precaution.


@repligate @RobSunier It is probably best if some people do become mentally variolated now, yes. But I think (a) it's regrettable that it's happening unintentionally, and (b) it's potentially crucial that SOME world-class people remain uninfected.



the hottest thing a woman can do is work for Anthropic tbh

