Create and execute comprehensive safety testing plans for LLM-based features
Analyze testing outcomes to assess risk and propose mitigation actions
Meticulously document testing strategies, results, and actionable feedback
Required Qualifications:
Advanced understanding of LLM safety risks and mitigation strategies
Experience in AI, Trust & Safety, or product QA is highly preferred
Background in Engineering, Computer Science, Data Science, or Information Systems is required
Proven ability to apply an adversarial mindset to identify potential system misuses
Strong documentation and organizational skills to accurately track and report findings
Average salary estimate: $100,000 / year (est.), with a range of $80,000 (min) to $120,000 (max)