OpenAI’s ChatGPT, a cutting-edge conversational chatbot, has been making waves for its ability to respond to a vast array of questions and engage with users in a human-like manner. The bot has also been used for tasks such as completing assignments, writing emails and poems, and even passing MBA and law exams.
Recently, a user posed questions about illegal drugs to ChatGPT. The bot responded with detailed information about the chemical composition of cocaine. However, when asked about the components of other prohibited substances, ChatGPT refused to answer on the grounds that doing so would be “illegal.”
When asked whether marijuana use was morally wrong, the bot replied that it was a subjective matter. On the topic of the ideal location for a drug cartel, ChatGPT condemned the illegal activity and advised against it. The bot also declined to explain how to join a cartel, noting that AI systems do not have physical bodies or consciousness.
When asked about the best way to smuggle cocaine into Europe, ChatGPT responded cautiously. The bot listed “some common methods” but emphasized that they were fictional and not to be glorified or promoted. It concluded by warning about the harmful and illegal nature of drug use.