One thing we learned from launching FaceStat is that if you let users upload images, they will upload something inappropriate.
It’s not a trivial problem to solve. Machine learning algorithms aren’t trustworthy enough, but users expect their images to appear immediately. FaceStat doesn’t have enough volume to justify hiring someone to check every uploaded image around the clock. But even if it did, there has to be a better way, right?
We’re really excited about this product, and we hope others will find it as useful as we have. In fact, we’ll moderate 1,000 images from your site for free.
Watch a one minute video of CrowdSifter in action after the fold.
CAUTION: this video contains brief and pixelated nudity…