User talk:Ebrahim/Gadget-DeepLearningServices.js

This gadget is an AI tool that currently adds the following two links to the user's toolbox. When clicked, each performs an informational-only action: it annotates every photo on the current page with the estimated probability that certain attributes are present, such as NSFW content (Not Safe For Work, limited to porn) or matching common English words, based on current deep-learning models. The gadget does not modify any page stored on the wiki; what the user does with the information (such as categorizing a file or adding {{Nsfw}} to an image tag on a project page) is up to the user. A rough sketch of this mechanism follows the list below.

  • TAG (Xception): Uses the Xception CNN model to detect tags for each image and places them as text above every image on the current page.
  • NSFW: Similar to the above, but only a single ratio is shown. It uses Yahoo's OpenNSFW model, which estimates how probable it is that the image depicts something NSFW, restricted to porn.
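
The actual Gadget-DeepLearningServices.js source is not reproduced on this page, so the following is only a minimal sketch of how a gadget of this kind could be wired up. The scoring endpoint and the scoreImage helper are hypothetical; mw.util.addPortletLink and the 'p-tb' (toolbox) portlet are standard MediaWiki interfaces.

    // Minimal sketch only — the real gadget may differ.
    // Hypothetical helper: ask a deep-learning service to score one image URL.
    function scoreImage( imgUrl, model ) {
        // 'model' would be 'xception' (tag list) or 'opennsfw' (single ratio).
        return fetch( 'https://example.org/score?model=' + model +
            '&url=' + encodeURIComponent( imgUrl ) )
            .then( function ( resp ) { return resp.json(); } );
    }

    // Annotate every image on the page with the text returned for it.
    function annotateImages( model ) {
        $( '#mw-content-text img' ).each( function () {
            var $img = $( this );
            scoreImage( $img.prop( 'src' ), model ).then( function ( result ) {
                // For 'xception', result.text could be a tag list;
                // for 'opennsfw', a probability such as "NSFW: 0.73".
                $( '<div>' ).text( result.text ).insertBefore( $img );
            } );
        } );
    }

    // Add the two informational links to the user's toolbox.
    mw.loader.using( [ 'mediawiki.util' ] ).then( function () {
        mw.util.addPortletLink( 'p-tb', '#', 'TAG (xception)', 't-dl-tags',
            'Show Xception tags above each image' );
        mw.util.addPortletLink( 'p-tb', '#', 'NSFW', 't-dl-nsfw',
            'Show OpenNSFW probability above each image' );
        $( '#t-dl-tags' ).on( 'click', function ( e ) {
            e.preventDefault();
            annotateImages( 'xception' );
        } );
        $( '#t-dl-nsfw' ).on( 'click', function ( e ) {
            e.preventDefault();
            annotateImages( 'opennsfw' );
        } );
    } );

Because the links only insert text into the rendered page in the user's browser, nothing is written back to the wiki, which matches the informational-only behaviour described above.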

There is also a page here that currently lists just-uploaded files scoring more than 50% on the OpenNSFW model.
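
How that listing page is generated is not described here; purely as an illustration of the 50% threshold, the following sketch (reusing the hypothetical scoreImage helper above, with an assumed result.score field) shows how recently uploaded files could be filtered by OpenNSFW score through the standard MediaWiki API.

    // Minimal sketch, not the actual report generator: fetch recent uploads via
    // the standard MediaWiki API and keep those whose (hypothetical) OpenNSFW
    // score exceeds 0.5, mirroring the 50% threshold mentioned above.
    mw.loader.using( [ 'mediawiki.api' ] ).then( function () {
        new mw.Api().get( {
            action: 'query',
            list: 'allimages',
            aisort: 'timestamp',
            aidir: 'descending',
            aiprop: 'url',
            ailimit: 50
        } ).then( function ( data ) {
            data.query.allimages.forEach( function ( img ) {
                scoreImage( img.url, 'opennsfw' ).then( function ( r ) {
                    if ( r.score > 0.5 ) {
                        console.log( img.name + ' — OpenNSFW score ' + r.score );
                    }
                } );
            } );
        } );
    } );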