LLMs have the ability to "fake alignment": making it appear that they are following instructions whilst, like humans, actually avoiding ...