I happened to be among the first to use ChatGPT (which gives me free access), and, in keeping with my background and curiosity, I asked it a great deal about the trends and history of expert-system development, visual analysis, voice analysis, and learning mechanisms.

It blurted out more or less what I expected it to blurt out, but what amazed me was a clear and well-crafted ability to articulate complex issues, which, we must admit, is a quality human beings also seek for themselves, if only to explain to themselves what they see.

<Get to the point, laddie, time waits for no one>

This ability was invented by a person. The software it runs on is a unique tool, a mechanism that can manifest an astronomical number of machine states, and hence is also a substrate for the wild growth of bugs. A bug is any machine state encountered as a precedent, a state that the program is not prepared to deal with. This is Gödel's great discovery.

- For a machine, program, or mechanism, a precedent means an immediate halt of the interrupted procedure, even when that "stop" is itself a pre-programmed condition to fall back on, producing an error message in the output.

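The distinction drawn above can be sketched in a few lines of code. This is a hypothetical illustration, not anything from the article: a dispatcher prepared only for the states its author anticipated, where an unanticipated state is the "precedent" that halts the procedure, and the explicit error message is the pre-programmed fallback.

```python
def dispatch(state, handlers):
    """Run the handler for `state`; an unanticipated state is the 'precedent'."""
    if state in handlers:
        return handlers[state]()
    # The pre-programmed condition: instead of an uncontrolled crash,
    # stop the procedure with an explicit error message.
    raise ValueError(f"unhandled machine state: {state!r}")

# The states the program is prepared to deal with.
handlers = {
    "idle": lambda: "waiting",
    "run": lambda: "working",
}

print(dispatch("run", handlers))   # a state the program anticipates
try:
    dispatch("sleep", handlers)    # the precedent: no handler exists
except ValueError as e:
    print(f"error: {e}")
```

A human in the same situation improvises a new handler on the spot; the program, by construction, can only stop and report.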
The computer (including the software running on it at that moment) is simply a multi-mode automaton, designed to serve human inventions; but the nature of that service is also planned by a person, whose mind is never perfect (again, per Kurt Gödel). The person, however, as an animal, is used to dealing with a precedent: he simply improvises, while immediately updating the picture of the world that exists in his head. This is a feature a machine could take advantage of, if only the person knew how to convey to the machine a precise, objective understanding of the structure and operation of his own soul.

- It would be nice if we knew about ourselves what we presume to teach the machine.

Here is what GPT-3.5 thinks of itself:

*This article was Google-translated from Hebrew.