The AI tool ChatGPT has recently been gaining attention for its potential use in nefarious activities.

Cybersecurity research firm Check Point Research reported three incidents in which bad actors used ChatGPT to create malicious tools.

These tools included information stealers, encryption tools, and tools designed to facilitate fraud.

Cybercriminals have been drawn to ChatGPT, and discussions on underground forums indicate that they have begun using it to compose malicious code.

A thread on a popular underground hacking forum revealed that its author was experimenting with ChatGPT to recreate malware strains.

One threat actor, known as USDoD, posted a Python script on December 21, 2022, claiming it was the first script he had ever created.

The script is believed to have been created with assistance from OpenAI's tool, a development that could hand technical capabilities to cybercriminals with little to no coding skill.

Another cybercriminal developed scripts for a Dark Web marketplace that could be used to facilitate the trading of illicit goods.

All payments in these transactions were made in cryptocurrencies.

ChatGPT has the potential to make it easier for hackers to write code; while it can be used beneficially, it can also be exploited for malicious ends.
