Content Moderation using AWS Machine Learning

Why use AWS Machine Learning for content moderation?

When managing User Generated Content (UGC), moderating video and images can be time-consuming and expensive. AWS Machine Learning content moderation can save time and money, screening for a number of different types of objectionable content, including:

  • nudity or suggestive content
  • violence or visually disturbing images
  • drug, tobacco, or alcohol use
  • gambling
  • rude gestures or hate symbols

Amazon Rekognition content moderation

Amazon Rekognition can automatically recognize content that needs to be moderated.
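As a sketch of what this looks like in practice, the snippet below calls Rekognition's DetectModerationLabels API with boto3 (the bucket and object names are hypothetical placeholders, and AWS credentials are assumed to be configured):

```python
# Sketch: image moderation via Amazon Rekognition's DetectModerationLabels.
# Bucket/key names are placeholders; requires configured AWS credentials.

def moderate_image(bucket, key, min_confidence=60):
    """Return the moderation labels Rekognition finds in an S3-hosted image."""
    import boto3  # AWS SDK for Python; imported lazily so the helper below works without it
    client = boto3.client("rekognition")
    response = client.detect_moderation_labels(
        Image={"S3Object": {"Bucket": bucket, "Name": key}},
        MinConfidence=min_confidence,
    )
    return response["ModerationLabels"]

def needs_review(labels, threshold=80.0):
    """Flag an image for human review if any label meets the confidence threshold."""
    return any(label["Confidence"] >= threshold for label in labels)
```

Each returned label carries a Name, a Confidence score, and a ParentName that ties subcategories back to a top-level category.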

Architectural Diagram

[Image: AWS Machine Learning content moderation architectural diagram]

How much does AWS content moderation cost?

  • The primary cost factors come from using Amazon Rekognition:
    • Video processing for content moderation costs $0.10 per minute.
    • Image processing for content moderation costs $0.001 per image (up to 1 million images).
  • The recurring costs for data storage, search, and other services amount to about $3.00 per day. 
  • Professional setup typically ranges from about $5,000 to $10,000; self-setup is also an option.
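Using the figures above, a rough monthly estimate can be sketched as follows (the volumes are hypothetical examples, and Rekognition's actual tiered pricing may differ at higher volumes):

```python
# Minimal cost sketch based on the figures above; volumes are hypothetical.
VIDEO_RATE_PER_MINUTE = 0.10   # Rekognition video moderation, per minute
IMAGE_RATE = 0.001             # per image, up to 1 million images
DAILY_FIXED = 3.00             # data storage, search, and other services

def monthly_estimate(video_minutes, images, days=30):
    """Estimate one month of recurring moderation costs in USD."""
    return (video_minutes * VIDEO_RATE_PER_MINUTE
            + images * IMAGE_RATE
            + days * DAILY_FIXED)
```

For example, 1,000 minutes of video and 50,000 images in a 30-day month would come to about $240 in recurring costs.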

How long does it take to set up?

Kicking off the CloudFormation template in an existing AWS account only takes about 20 minutes, but full integration into a video processing workflow can take one to three weeks.
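For reference, kicking off a CloudFormation stack programmatically looks roughly like the sketch below (the stack name and template URL are hypothetical placeholders; in practice you would use the template supplied with the solution, and AWS credentials must be configured):

```python
# Sketch: launching a CloudFormation stack with boto3.
# Stack name and template URL are hypothetical placeholders.

def launch_moderation_stack(stack_name, template_url):
    """Create the content moderation stack and wait for it to finish."""
    import boto3  # AWS SDK for Python; requires configured credentials to run
    cfn = boto3.client("cloudformation")
    cfn.create_stack(
        StackName=stack_name,
        TemplateURL=template_url,
        Capabilities=["CAPABILITY_IAM"],  # the stack creates IAM roles
    )
    # Block until stack creation completes (roughly the ~20 minutes noted above).
    waiter = cfn.get_waiter("stack_create_complete")
    waiter.wait(StackName=stack_name)
```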

Digital Resources

*These digital resources include additional services such as transcription and translation, though these features can be disabled.

Want help?

Our talented AWS solution architects can help you get an automated content moderation solution up and running using AWS machine learning. Send us an email, chat with us, or call us at (310) 507-0606 for a free consultation.

Full list of content moderation categories & subcategories

  • Explicit Nudity
    • Nudity
    • Graphic Male Nudity
    • Graphic Female Nudity
    • Sexual Activity
    • Illustrated Explicit Nudity
    • Adult Toys
  • Suggestive
    • Female Swimwear Or Underwear
    • Male Swimwear Or Underwear
    • Partial Nudity
    • Barechested Male
    • Revealing Clothes
    • Sexual Situations
  • Violence
    • Graphic Violence Or Gore
    • Physical Violence
    • Weapon Violence
    • Weapons
    • Self Injury
  • Visually Disturbing
    • Emaciated Bodies
    • Corpses
    • Hanging
    • Air Crash
    • Explosions And Blasts
  • Rude Gestures
    • Middle Finger
  • Drugs
    • Drug Products
    • Drug Use
    • Pills
    • Drug Paraphernalia
  • Tobacco
    • Tobacco Products
    • Smoking
  • Alcohol
    • Drinking
    • Alcoholic Beverages
  • Gambling
    • Gambling
  • Hate Symbols
    • Nazi Party
    • White Supremacy
    • Extremist
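When processing Rekognition results, the ParentName field on each label maps a subcategory back to its top-level category in the list above (top-level labels have an empty ParentName). A minimal grouping sketch:

```python
# Sketch: grouping Rekognition moderation labels by top-level category
# using the ParentName field from the API response.
from collections import defaultdict

def group_by_category(labels):
    """Map each top-level category to the label names detected under it."""
    grouped = defaultdict(list)
    for label in labels:
        top = label["ParentName"] or label["Name"]  # top-level labels have no parent
        grouped[top].append(label["Name"])
    return dict(grouped)
```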

Learn more about how we've set our clients up for success by reading our case studies.

Have questions?