Microsoft have released an article further detailing their recently revealed reputation system for the Xbox One. Michael Dunn, the program manager for Xbox Live, outlined in an Xbox Wire post how the new system will work and how it affects gamers.
“If you don’t want to play with cheats or jerks, you shouldn’t have to,” writes Dunn. “Our new reputation model helps expose people that aren’t fun to be around and creates real consequences for trouble-makers that harass our good players.”
“We are simplifying the mechanism for Xbox One – moving from a survey option to more direct feedback, including things like “block” or “mute player” actions into the feedback model. The new model will take all of the feedback from a player’s online flow, put it in the system with a crazy algorithm we created and validated with an MSR PhD to make sure things are fair for everyone.”
Your reputation score basically slots you into one of three colour-coded groups: “Green = Good Player,” “Yellow = Needs Improvement” or “Red = Avoid Me”. Where you sit on this spectrum will be reflected on your gamer card.
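As a rough illustration of that tiering, the mapping from score to colour group could look something like the sketch below. Microsoft has not published any actual score ranges, so the numeric thresholds here are invented purely for demonstration; only the three tier names come from the article.

```python
# Illustrative sketch only: the 0-100 scale and the threshold values
# are assumptions, not Microsoft's published numbers.

def reputation_tier(score: float) -> str:
    """Map a hypothetical 0-100 reputation score to the three
    colour-coded groups described in the article."""
    if score >= 70:   # assumed cutoff for good standing
        return "Green = Good Player"
    if score >= 40:   # assumed cutoff for the warning band
        return "Yellow = Needs Improvement"
    return "Red = Avoid Me"

print(reputation_tier(85))  # Green = Good Player
print(reputation_tier(20))  # Red = Avoid Me
```

The point of the banding is that most players, as Dunn notes, would sit comfortably in the green tier, with only repeat offenders sliding down through yellow into red.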
“The more hours you play online without being a jerk, the better your reputation will be; similar to the more hours you drive without an accident, the better your driving record and insurance rates will be,” explains Dunn. “Most players will have good reputations and be seen as a “Good Player.” The algorithm is looking to identify players that are repeatedly disruptive on Xbox Live.”
“We’ll identify those players with a lower reputation score and in the worst cases they will earn the “Avoid Me” reputation. Before a player ends up with the “Avoid Me” reputation level we will have sent many different alerts to the “Needs Improvement” player reminding them how their social gaming conduct is affecting lots of other gamers.”
Dunn went on to offer assurance that the algorithm is sophisticated enough to know when it is being abused, meaning that you won’t be penalized for a few bad reports.
“Even good players might receive a few player feedback reports each month and that is OK,” writes Dunn. “The algorithm weighs the data collected, so if a dozen people suddenly report a single user, the system will look at a variety of factors before docking their reputation.”
“We’ll verify if those people actually played in an online game with the person reported – if not, all of those players’ feedback won’t matter as much as a single person who spent 15 minutes playing with the reported person. The system also looks at the reputation of the person reporting and the alleged offender, frequency of reports from a single user and a number of other factors.”
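The weighting Dunn describes can be sketched in code. To be clear, this is not Microsoft’s algorithm – the exact formula is unpublished, and every number below is an invented assumption. It only illustrates the factors the article names: whether the reporter actually shared a session with the reported player, how long they played together, the reporter’s own habits, and how a dozen drive-by reports can count for less than one report from a genuine teammate.

```python
from dataclasses import dataclass

@dataclass
class Report:
    reporter_reputation: float    # 0.0-1.0, reporter's own standing (assumed scale)
    minutes_played_together: int  # 0 if the reporter never played with the person
    reports_this_month: int       # how often this reporter files reports

def report_weight(r: Report) -> float:
    """Weight one feedback report. The structure follows the factors
    Dunn lists; the specific numbers are illustrative assumptions."""
    if r.minutes_played_together == 0:
        # Reports from people who never played with the person count far less.
        return 0.05
    # Cap the time credit at 15 minutes of shared play, per the article's example.
    time_factor = min(r.minutes_played_together / 15, 1.0)
    # Discount reporters who file reports constantly.
    spam_penalty = 1.0 / max(r.reports_this_month, 1)
    return r.reporter_reputation * time_factor * spam_penalty

# A dozen sudden reports from people who never played with the user...
drive_by = [Report(reporter_reputation=0.8, minutes_played_together=0,
                   reports_this_month=1) for _ in range(12)]
# ...versus one report from someone who spent 15 minutes with them.
genuine = Report(reporter_reputation=0.8, minutes_played_together=15,
                 reports_this_month=1)

print(sum(report_weight(r) for r in drive_by))  # dozen drive-by reports combined
print(report_weight(genuine))                   # single genuine report
```

Under these made-up weights, the twelve drive-by reports together carry less weight than the one report from an actual teammate, which is the behaviour the quote describes.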
“This system will continue to evolve and get better as we track the feedback we get from players and titles, plus add more consequences for the jerks.”