I absolutely agree with this, especially the bit about personal intuition. The only thing I'd point out is that GPTs don't use logic: they look at billions of examples and incrementally refine associations between patterns. So in that way, it's actually more similar to the symbolic/associative thinking you're trying to draw a distinction from.
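To make what I mean by "refining associations" concrete, here's a toy Python sketch (my own illustration, nothing like the real architecture or scale): the model never applies a rule of logic, it just bumps up association strengths between patterns it has seen and then recalls the strongest one.

```python
# Toy sketch only -- not the real GPT training loop, just the general idea.
from collections import defaultdict

assoc = defaultdict(lambda: defaultdict(float))  # word -> {next word -> association strength}

def train(corpus_lines, lr=1.0):
    """Incrementally strengthen associations between adjacent words in the examples."""
    for line in corpus_lines:
        words = line.lower().split()
        for a, b in zip(words, words[1:]):
            assoc[a][b] += lr  # refine the pattern association; no symbolic rules involved

def predict_next(word):
    """Recall the most strongly associated follower -- pure pattern completion."""
    followers = assoc[word]
    return max(followers, key=followers.get) if followers else None

train([
    "as above so below",
    "as within so without",
])
print(predict_next("so"))  # -> 'below' (ties fall to whichever association was seen first)
```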
Doesn't make using a language model for magic practice any less impotent, just thought I'd mention it.