Six Already Creepy Uses of ChatGPT AI

Artificial intelligence is making a lot of noise today, especially ChatGPT. But this chatbot can also be misused. Here are six use cases that are scary, to say the least.

ChatGPT, OpenAI’s text generator, does not always produce accurate or interesting writing, but it can create reasonably appropriate text on almost any topic, almost instantly. That is quite remarkable. Even with many safeguards in place, though, the system can also be quite dangerous.

We are only beginning to discover its less-than-ideal uses. Anything that can create content out of thin air can create something dangerous; it’s just a matter of time. Below are six scary, or at least dubious, uses that people have already observed. And all of this before ChatGPT became truly popular, while the application was still in its infancy.

1. Creating malware

ChatGPT creating malware is frightening, and rightly so. Not because malware is new, but because ChatGPT can churn it out endlessly. AIs don’t sleep. As Infosecurity Magazine explains, “cybersecurity researchers have been able to create a polymorphic program that is very complex and difficult to detect”. Researchers could use their creation to craft malware, then use the app to generate variations on that code, making it hard to detect or stop.

2. Cheating at school

Less scary, but perhaps more predictable. A tool that can generate text on any subject is perfect for a student who wants to cheat at school. Teachers have already said they have caught students in the act, and schools have banned the app. The trend is unlikely to slow down; if anything, the opposite. AI will almost certainly become another tool that younger children learn to master in school.

3. Spamming in dating apps

Spam may be the wrong word, but people are using ChatGPT to chat with their matches on Tinder. It’s not necessarily scary, but knowing that you might be talking to a computer program rather than a potential partner can be unsettling.

4. Taking the jobs of reporters and other writers

Should I be worried about my own job?

5. Phishing and other scams

This one is hard to prove, but a tool like ChatGPT would be perfect for phishing. Phishing messages are often easy to recognize precisely because the language is flawed. With ChatGPT, that would no longer be the case. In any case, experts have already warned that this is a very plausible use case.

6. Fooling recruiters

Everyone knows how hard it can be to find a job. It’s a long, often demoralizing process, with sometimes a good job at the end. But if you’re looking for work, you might be losing opportunities to artificial intelligence. A company recently found that application answers written by ChatGPT performed better than those of 80% of human candidates. ChatGPT may more easily use the keywords recruiters expect and thus get through the various filters put in place by human resources.
