As more players register and become members on the forum, more people have the ability to see what we create. These people could have either good or bad intentions for our place as a whole. Someone can say one thing and do another, and actions speak louder than words.
To help keep the bad out, we're adopting a Valve-style model of security barriers for the content we distribute, whether that's boards, information, servers, or anything else. We need to keep a safe environment for our members as we grow further.
To see a visual of what these barriers look like, simply google "trust visualization" and one of the results should look like this. The dashed lines are what keep out the bad actors, and the rainbow of trust is based on a statistic such as how likely someone is to cheat. The longer they play, the less likely they are to cheat, and so they get into better-quality matches. You can see this at play in GTA's "Good behavior" system or CS:GO's "Trust factor" system.
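To make the idea a bit more concrete, here's a rough sketch of what a barrier check could look like behind the scenes. The signals and weights (account age, post count, reports) are made-up placeholders for illustration only, not the actual barriers we'll use:

```python
from dataclasses import dataclass

# Hypothetical signals; the real barriers are still up for discussion.
@dataclass
class Member:
    days_active: int   # how long the account has been around
    posts: int         # posts made on the forum
    reports: int       # times other members reported them

def trust_score(m: Member) -> float:
    """Rough 0..1 trust estimate: longer, cleaner participation scores higher."""
    tenure = min(m.days_active / 365, 1.0)   # caps out after a year
    activity = min(m.posts / 500, 1.0)       # caps out at 500 posts
    penalty = min(m.reports * 0.2, 1.0)      # each report hurts the score
    return max(0.0, 0.6 * tenure + 0.4 * activity - penalty)

def passes_barrier(m: Member, threshold: float = 0.5) -> bool:
    """The 'dashed line': only members above the threshold get through."""
    return trust_score(m) >= threshold

# Example: an established, clean account gets through; a fresh one doesn't.
print(passes_barrier(Member(days_active=400, posts=300, reports=0)))  # True
print(passes_barrier(Member(days_active=3, posts=5, reports=0)))      # False
```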
This isn't anything new; it's been here for a while, even since the start. But the most noticeable changes will come in the future, when new boards and limits are rolled out to everyone. Please reply if you have any suggestions for what the barriers should be based on. Time spent on the website? Posts made on the website? Anyone can just AFK or spam posts and bypass those barriers easily, so try to think of a creative solution to that problem.