Hi there!
You asked specifically to be included in new feature feedback, and here’s an example of something we’re hoping to launch soon.
Below is a draft explanation of this new feature that would be shared with all creators.
I am particularly interested in the following:
- What questions about this feature do you have after reading this?
- Do you have any concerns about this feature?
- What about this makes you happy?
- What else do you want to know?
- Anything else?
I’ll do my best to share as much as possible if the explanation is insufficient; this will help us make certain we’re communicating things clearly before we share more broadly.
Patreon is planning to roll out a better way for people to file reports when they believe a creator is not meeting Patreon’s community guidelines (https://www.patreon.com/guidelines). This new in-product reporting tool will help our Trust and Safety team filter out frivolous reports while providing guidance and support to creators on a one-to-one basis.
Key Improvements
Only logged-in Patreon users can make reports: Right now, anyone (yes, anyone) can claim a creator is violating our community guidelines. This sometimes leads to frivolous reports from people outside the Patreon community. Moving forward, only people who have a Patreon account will be able to report content.
Individual posts can be reported (rather than a creator’s entire account): Often, people want to report a single piece of content, not an entire creator. Now that they can flag just that piece of content, reports are more accurate, which makes it faster for us to evaluate them and work with creators on updates (if changes are needed at all).
Better reports from users help us make better decisions about content: With the old system, we often didn’t get enough information. The new reporting system gathers more information for content like DMCA takedown requests, which follow a different evaluation process than something like doxing. The new system also points the person filing the report to relevant FAQs, which we hope will alleviate concerns and improve reporting accuracy. [For example, one of the most common patron complaints is that creators are not delivering on what was promised. That’s not an issue for Trust and Safety, and we have robust FAQs on how to handle it that we can surface directly in the reporting tool.]
Quicker resolution for content in question: Over the long term, we’ll be able to see patterns in reporting, which will help our Trust and Safety team as they review each case, again leading to quicker resolutions for creators.
How It Works
In both the web and mobile versions, you’ll now see a button with three dots and the word “more.” (Please note: since you can’t report yourself, you won’t see the three dots on your own profile when you’re logged in.)
Tapping it opens a dialog box that walks the person through the process of filing a report. Logged-in users can file a report via the website, the mobile website and, eventually, the app. Once a report is submitted, we confirm that we’ve received it. Behind the scenes, the report is routed to a member of the Trust and Safety team for review. The person filing the report does not hear back about the outcome of their claim, and we only contact creators if there is a need for discussion.