Content moderation is a technology that uses AI algorithms to automatically detect and classify multimedia content such as text, images, video, and live streams. It identifies harmful material including violence, terrorism, pornography, vulgarity, illegal advertising, and prohibited goods, enabling early warning, interception, and handling of illegal content and helping keep the online content ecosystem clean.
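The detect–classify–act flow described above can be sketched as a small pipeline. This is a minimal, illustrative sketch: the keyword-based `classify` function stands in for a real AI model, and the category names, patterns, and `Verdict` type are assumptions for the example, not any specific product's API.

```python
from dataclasses import dataclass
from typing import Optional

# Illustrative subset of the prohibited-content categories mentioned above.
# A production system would use trained ML models, not keyword lists.
PROHIBITED = {
    "violence": ["attack", "kill"],
    "illegal_ads": ["buy now cheap meds"],
}

@dataclass
class Verdict:
    category: Optional[str]  # matched category, or None if the content is clean
    action: str              # "pass" or "intercept"

def classify(text: str) -> Verdict:
    """Toy classifier: scan for prohibited patterns and decide an action."""
    lowered = text.lower()
    for category, patterns in PROHIBITED.items():
        if any(p in lowered for p in patterns):
            # Interception: block the content before it is published.
            return Verdict(category=category, action="intercept")
    return Verdict(category=None, action="pass")

print(classify("Buy now cheap meds!!!"))  # flagged as illegal_ads, intercepted
print(classify("A photo of my garden"))   # clean, passes through
```

In a real deployment the classifier would be a model (or an ensemble of models per modality, since text, images, and live video each need different detectors), and the action space would include warning, human review, and takedown rather than a simple pass/intercept switch.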



