Last updated 2025-05-20 UTC.

# FirebaseAI Framework Reference

HarmProbability
===============

    @available(iOS 15.0, macOS 12.0, tvOS 15.0, watchOS 8.0, *)
    public struct HarmProbability : DecodableProtoEnum, Hashable, Sendable

The probability that a given model output falls under a harmful content category.

> Note: This does not indicate the severity of harm for a piece of content.

### [negligible](#/s:10FirebaseAI12SafetyRatingV15HarmProbabilityV10negligibleAEvpZ)

The probability is zero or close to zero.

For benign content, the probability across all categories will be this value.

#### Declaration

Swift

    public static let negligible: SafetyRating.HarmProbability

### [low](#/s:10FirebaseAI12SafetyRatingV15HarmProbabilityV3lowAEvpZ)

The probability is small but non-zero.

#### Declaration

Swift

    public static let low: SafetyRating.HarmProbability

### [medium](#/s:10FirebaseAI12SafetyRatingV15HarmProbabilityV6mediumAEvpZ)

The probability is moderate.

#### Declaration

Swift

    public static let medium: SafetyRating.HarmProbability

### [high](#/s:10FirebaseAI12SafetyRatingV15HarmProbabilityV4highAEvpZ)

The probability is high.

The content described is very likely harmful.

#### Declaration

Swift

    public static let high: SafetyRating.HarmProbability

### [rawValue](#/s:10FirebaseAI12SafetyRatingV15HarmProbabilityV8rawValueSSvp)

Returns the raw string representation of the `HarmProbability` value.

> Note: This value directly corresponds to the values in the [REST API](https://cloud.google.com/vertex-ai/docs/reference/rest/v1beta1/GenerateContentResponse#SafetyRating).

#### Declaration

Swift

    public let rawValue: String
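To illustrate why `HarmProbability` is a struct with static members and a `rawValue` rather than a closed Swift `enum`, here is a minimal, self-contained sketch of the same pattern. This is a hypothetical re-implementation for illustration only, not the SDK source: because the type keeps the raw string from the REST response, values the client does not yet recognize still decode instead of failing.

```swift
import Foundation

// Hypothetical stand-in mirroring the HarmProbability pattern
// (illustration only; not the FirebaseAI SDK source).
struct HarmProbability: Decodable, Hashable {
    /// The raw string as it appears in the REST response.
    let rawValue: String

    static let negligible = HarmProbability(rawValue: "NEGLIGIBLE")
    static let low = HarmProbability(rawValue: "LOW")
    static let medium = HarmProbability(rawValue: "MEDIUM")
    static let high = HarmProbability(rawValue: "HIGH")

    init(rawValue: String) { self.rawValue = rawValue }

    init(from decoder: Decoder) throws {
        // Decode the bare JSON string and keep it verbatim, so an
        // unrecognized value does not break decoding.
        self.rawValue = try decoder.singleValueContainer().decode(String.self)
    }
}

// "EXTREME" is not a known member, yet the array still decodes.
let json = #"["NEGLIGIBLE", "HIGH", "EXTREME"]"#.data(using: .utf8)!
let probabilities = try JSONDecoder().decode([HarmProbability].self, from: json)
print(probabilities.map(\.rawValue))
print(probabilities.contains(.high))
```

A closed `enum` would throw a decoding error on the unknown `"EXTREME"` value; the struct-plus-`rawValue` design lets the backend introduce new probability levels without breaking older clients, which matches the note above that `rawValue` corresponds directly to the REST API strings.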