ExamTopics

AWS Certified AI Practitioner
  • Topic 1 Question 95

    Which prompting technique can protect against prompt injection attacks?

    • Adversarial prompting

    • Zero-shot prompting

    • Least-to-most prompting

    • Chain-of-thought prompting

