Practical security guides for LLM applications — attacks, defenses, and best practices
Understanding prompt injection attacks and how to defend against them
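To make the defense side concrete, here is a minimal sketch of two common mitigations: a heuristic filter that flags well-known injection phrases, and delimiter wrapping that separates untrusted user data from system instructions. The pattern list, function names, and `<user_data>` tags are illustrative assumptions, not part of any specific library or the guide itself.

```python
import re

# Illustrative phrases commonly seen in prompt-injection attempts.
# A real deployment would use a broader, maintained list or a classifier.
INJECTION_PATTERNS = [
    r"ignore (all |any )?(previous|prior|above) instructions",
    r"disregard (the )?system prompt",
    r"reveal (your|the) (system )?prompt",
]

def looks_like_injection(user_input: str) -> bool:
    """Return True if the input matches a known injection phrase."""
    lowered = user_input.lower()
    return any(re.search(p, lowered) for p in INJECTION_PATTERNS)

def build_prompt(system: str, user_input: str) -> str:
    """Reject suspicious input, then wrap the rest in clear delimiters
    so the model can distinguish instructions from untrusted data."""
    if looks_like_injection(user_input):
        raise ValueError("possible prompt injection detected")
    return f"{system}\n\n<user_data>\n{user_input}\n</user_data>"
```

Note that phrase matching alone is easy to evade (paraphrasing, encoding tricks), so it is best treated as one layer among several, alongside delimiting, output filtering, and least-privilege tool access.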