
Chatbots can be manipulated through flattery and peer pressure
via bloomberg.com
Short excerpt below. Read at the original source.
Generally, AI chatbots are not supposed to do things like call you names or tell you how to make controlled substances. But, just like a person, at least some LLMs can apparently be talked into breaking their own rules with the right psychological tactics. Researchers from the University of Pennsylvania deployed tactics described […]