

Andrew Certain
@tacertain
Former VP/Distinguished Engineer in AWS. See the blog for what you'll get if you follow (plus transit/housing quips) He/his https://t.co/FGTsM2Xy1K

This would make me believe that there's something profound going on: if you could train a model on two different languages, without giving it any translation examples, and have the model translate between them. If it's really inferring meaning, that should be no problem. Humans can do this.

Literally the first thing I tried on GPT-4, a replica of a jq question I had asked on 3.5. No - there's still no "understanding" here, just fancy autocomplete. It doesn't "know" that it told me it was showing a |= example - it's all just symbols without meaning.

@monkchips Since it usually follows an 80/20 rule, that 80% of the code the AI writes for you is only 20% of the work, leaving you the other 80%: debugging, maintaining, etc. So you're looking at a 25% speedup, not 5x. Unless you skip that 80% of the work by just shipping whatever the AI spits out, which I think will happen
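
The arithmetic in that tweet is just Amdahl's law. A minimal sketch, assuming the AI fully automates whatever fraction of the *total* effort it covers (the function name and numbers are illustrative, not from the tweet):

```python
# Amdahl's-law view of the 80/20 claim above.
# Assumption: the automated fraction of total work drops to zero time;
# everything else (debugging, maintaining, etc.) is unchanged.

def speedup(automated_fraction: float) -> float:
    """Overall speedup when a fraction of the total work takes zero time."""
    return 1.0 / (1.0 - automated_fraction)

# If the "80% of the code" the AI writes is only 20% of the total effort:
print(speedup(0.20))  # 1.25 -> a 25% speedup, not 5x

# A 5x speedup would require automating 80% of the *total* work:
print(speedup(0.80))  # ~5.0
```

The asymmetry is the whole point: automating a small slice of the total effort caps the overall gain, no matter how impressive that slice looks in isolation.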
