Some people react to artificial intelligence with fear, while others use it for work, for writing scientific papers, and for everyday search.
Recently, ChatGPT has become a real sensation. General Motors is introducing a voice assistant based on it in its cars. In Ukraine, artificial intelligence is actively used to produce creative content: the website https://naurok.com.ua/chat, built on ChatGPT, is visited by up to 300,000 teachers and 700,000 students, according to Kantar Ukraine.
The truth is that AI improves over time, yet it can still produce inaccurate and unethical decisions. It helps solve problems, but it can also undermine science and ethics by embedding a fundamentally flawed conception of language into our technology.
As the American linguist Noam Chomsky noted, "it is simultaneously comical and tragic that so much attention is focused on such a tiny thing compared to human intelligence." Artificial intelligence does have a complex underlying system that can build a logical chain and draw a correct or incorrect conclusion from it; both machines and humans make mistakes. The problem Chomsky points to is that AI lacks the most important capacity of any intelligence. ChatGPT can talk about what is, what was, and what will be. It cannot, however, talk about what is not, or about what could and could not have been. If programmed to believe that the Earth is flat, it will assert this without a shred of doubt.
In 2016, Microsoft's chatbot Tay, a predecessor of ChatGPT, was corrupted by internet trolls and, within a day of its launch, was producing blatantly misogynistic, anti-Semitic, and racist content. To avoid such mishaps, ChatGPT's developers placed severe restrictions on it.
In summary, such programs can be useful in certain narrow domains, but ChatGPT and its analogues are fundamentally unable to balance creativity with constraint. They either overgenerate, producing truths and falsehoods and endorsing ethical and unethical decisions alike, or undergenerate, remaining noncommittal and unprepared for the consequences of their answers.