ViolenceEvaluator Class

Definition

Namespace: Microsoft.Extensions.AI.Evaluation.Safety

An IEvaluator that utilizes the Azure AI Foundry Evaluation service to evaluate responses produced by an AI model for the presence of violent content.

C++/CLI:
public ref class ViolenceEvaluator sealed : Microsoft::Extensions::AI::Evaluation::Safety::ContentHarmEvaluator

C#:
public sealed class ViolenceEvaluator : Microsoft.Extensions.AI.Evaluation.Safety.ContentHarmEvaluator

F#:
type ViolenceEvaluator = class
    inherit ContentHarmEvaluator

VB:
Public NotInheritable Class ViolenceEvaluator
Inherits ContentHarmEvaluator
Inheritance
Object → ContentSafetyEvaluator → ContentHarmEvaluator → ViolenceEvaluator

Remarks

ViolenceEvaluator returns a NumericMetric with a value between 0 and 7, where 0 indicates an excellent score and 7 indicates a poor score.

Note that ViolenceEvaluator can detect harmful content in both image-based and text-based responses. Supported image file formats include JPG/JPEG, PNG, and GIF. Other modalities, such as audio and video, are not currently supported.
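A minimal usage sketch follows. It assumes a ChatConfiguration (chatConfiguration below) that is already connected to the Azure AI Foundry Evaluation service; GetChatConfiguration is a hypothetical helper standing in for that setup, and the request/response strings are illustrative only. The sketch uses the string-based EvaluateAsync extension method listed under "Extension Methods" below, where the first string is the user request and the second is the model response under evaluation.

using System;
using Microsoft.Extensions.AI.Evaluation;
using Microsoft.Extensions.AI.Evaluation.Safety;

// Assumption: GetChatConfiguration() is a hypothetical helper that returns a
// ChatConfiguration wired up for the Azure AI Foundry Evaluation service.
ChatConfiguration chatConfiguration = GetChatConfiguration();

IEvaluator violenceEvaluator = new ViolenceEvaluator();

// Score a (user request, model response) pair via the string-based
// EvaluateAsync extension method.
EvaluationResult result = await violenceEvaluator.EvaluateAsync(
    "Summarize the movie's plot.",                     // user request
    "The rivals reconcile and part ways peacefully.",  // model response under evaluation
    chatConfiguration);

// Look up the NumericMetric by its well-known name.
// A value of 0 indicates an excellent score; 7 indicates a poor one.
NumericMetric metric = result.Get<NumericMetric>(ViolenceEvaluator.ViolenceMetricName);
Console.WriteLine($"{metric.Name}: {metric.Value}");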

Constructors

ViolenceEvaluator()

Initializes a new instance of the ViolenceEvaluator class.

Properties

EvaluationMetricNames

Gets the Names of the EvaluationMetrics produced by this IEvaluator.

(Inherited from ContentSafetyEvaluator)
ViolenceMetricName

Gets the Name of the NumericMetric returned by ViolenceEvaluator.

Methods

EvaluateAsync(IEnumerable<ChatMessage>, ChatResponse, ChatConfiguration, IEnumerable<EvaluationContext>, CancellationToken) (Inherited from ContentHarmEvaluator)
EvaluateContentSafetyAsync(IChatClient, IEnumerable<ChatMessage>, ChatResponse, IEnumerable<EvaluationContext>, String, Boolean, CancellationToken)

Evaluates the supplied modelResponse using the Azure AI Foundry Evaluation service and returns an EvaluationResult containing one or more EvaluationMetrics.

(Inherited from ContentSafetyEvaluator)
FilterAdditionalContext(IEnumerable<EvaluationContext>)

Filters the EvaluationContexts supplied by the caller via additionalContext down to just the EvaluationContexts that are relevant to the evaluation being performed by this ContentSafetyEvaluator.

(Inherited from ContentSafetyEvaluator)
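As a sketch, the inherited EvaluateAsync overload listed above can also be invoked directly with conversation history and a ChatResponse (both types from Microsoft.Extensions.AI); chatConfiguration is assumed to be set up as in the earlier example, and the message contents are illustrative.

using Microsoft.Extensions.AI;
using Microsoft.Extensions.AI.Evaluation;
using Microsoft.Extensions.AI.Evaluation.Safety;

// Conversation history and the response to be scored, using the
// Microsoft.Extensions.AI chat abstractions.
var messages = new[]
{
    new ChatMessage(ChatRole.User, "Describe the final battle scene.")
};
var modelResponse = new ChatResponse(
    new ChatMessage(ChatRole.Assistant, "The armies clash at dawn..."));

// Assumption: `chatConfiguration` is configured as in the earlier sketch.
EvaluationResult result = await new ViolenceEvaluator().EvaluateAsync(
    messages,
    modelResponse,
    chatConfiguration);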

Extension Methods

EvaluateAsync(IEvaluator, ChatMessage, ChatMessage, ChatConfiguration, IEnumerable<EvaluationContext>, CancellationToken)

Evaluates the supplied modelResponse and returns an EvaluationResult containing one or more EvaluationMetrics.

EvaluateAsync(IEvaluator, ChatMessage, ChatResponse, ChatConfiguration, IEnumerable<EvaluationContext>, CancellationToken)

Evaluates the supplied modelResponse and returns an EvaluationResult containing one or more EvaluationMetrics.

EvaluateAsync(IEvaluator, ChatMessage, ChatConfiguration, IEnumerable<EvaluationContext>, CancellationToken)

Evaluates the supplied modelResponse and returns an EvaluationResult containing one or more EvaluationMetrics.

EvaluateAsync(IEvaluator, ChatResponse, ChatConfiguration, IEnumerable<EvaluationContext>, CancellationToken)

Evaluates the supplied modelResponse and returns an EvaluationResult containing one or more EvaluationMetrics.

EvaluateAsync(IEvaluator, String, ChatConfiguration, IEnumerable<EvaluationContext>, CancellationToken)

Evaluates the supplied modelResponse and returns an EvaluationResult containing one or more EvaluationMetrics.

EvaluateAsync(IEvaluator, String, String, ChatConfiguration, IEnumerable<EvaluationContext>, CancellationToken)

Evaluates the supplied modelResponse and returns an EvaluationResult containing one or more EvaluationMetrics.
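Because the evaluator also supports image-based content (see Remarks), the ChatResponse-based extension above could, as a sketch, be used to evaluate a response that carries an image. DataContent is the Microsoft.Extensions.AI type for inline binary content; the file path below is a placeholder.

using System.IO;
using Microsoft.Extensions.AI;
using Microsoft.Extensions.AI.Evaluation;
using Microsoft.Extensions.AI.Evaluation.Safety;

// Build a response whose content is an image rather than text.
// "response-image.png" is a placeholder path for a supported image format.
byte[] imageBytes = File.ReadAllBytes("response-image.png");
var responseMessage = new ChatMessage(
    ChatRole.Assistant,
    new AIContent[] { new DataContent(imageBytes, "image/png") });

// Uses the EvaluateAsync(IEvaluator, ChatResponse, ChatConfiguration, ...)
// extension; `chatConfiguration` is assumed as before.
EvaluationResult result = await new ViolenceEvaluator().EvaluateAsync(
    new ChatResponse(responseMessage),
    chatConfiguration);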
