GPT-4 is 500 times more powerful and boasts 100 trillion machine-learning parameters.
It sounds most excellent. But does the technology scale?
I don't know that expanding the number of learning parameters can improve contextual, chain-based associations. That appears to be the missing link for GPT and soft AI generally. The open question is how many contextual points of reference an artificial intelligence requires to produce creative works that track human cognitive function in context. It is possible that we have not yet developed an optimal data structure or algorithm for producing, or even emulating, that mechanism in a software engineering context.
This issue runs parallel to tech corps like Tesla having difficulty with self-driving software. Translating the contextual cues of driving on a road into a format that an AI can comprehend and extrapolate to the correct response could well be a vastly understated problem for machine learning and AI.