
Carl Jung made a point long ago that both foreshadows fractured entangled representation (FER) and offers a thought-provoking critique of modern ML in general: "Beware of unearned wisdom." (I'd update it to "unearned knowledge" for AI today.) If the way you acquire knowledge shapes your facility for applying it later, through the underlying representation that acquisition produces, then what price do you pay for unnaturally vacuuming up vast swaths of knowledge in one giant disorganized batch? Unearned knowledge carries a cost that's rarely, if ever, discussed in AI or ML. Thank you to @jakobmrees, an undergrad at NYU, for perceptively bringing this quote to my attention!

















