Thursday, December 5, 2013

Practice 2013: Understanding Players Panel

·         Davin Pavlas (Riot Games), Morgan Kennedy (Ubisoft, Assassin’s Creed), Naomi Clark (GameLab, FreshPlanet, Rebel Monkey)
(by Davin Pavlas)
·         Research at Riot Games (Senior User Researcher)
·         Research supports player focus: being player-focused requires understanding players (we’re players too, but many kinds of players play our games). Research equips us to understand their wants, needs, and behaviors.
·         League of Legends – MOBA game with lots of players, competitive 5v5
·         Textbook definition of research = propose a model of the world and test it against data without bias.
·         Definition in practice = test design assumptions to ensure player experiences are positive and intentional.
·         Data-informed approach to design = informed but not driven (use knowledge and data alongside intuition; work with goals, assumptions, and intended experiences). Players are collaborators, not just data points. Research isn’t a gate or a box to check; it’s integrated into the design process iteratively.
·         Interdisciplinary inquiry = telemetric data, modeling, surveys, community management
·         Case Study #1
o   Lucian the Demon Hunter with two guns (Equilibrium style). Sleek, speedy, serious ranged damage character with finesse gameplay.
o   Research flow = concept testing, 3 lab studies, release, post-launch
o   Concept test = simple read of concept art, survey & interview (fantasy, mood, personality, role, fit into the fiction), high appeal scores, good theme conveyance. People thought his two different colored guns meant something.
o   Lab test = moderated one-on-one with players, think aloud protocol, streamed and recorded for designers and team
o   Lab study #1 = movements felt stiff, not what the concept promised; he moved like a turret; abilities didn’t fit the player’s model from the concept art; the visual theme was compelling but his gameplay was problematic. Solution = remove the stiff movements and confusing abilities.
o   Lab study #2 = aesthetics still a mixed bag, high base speed but players feel he’s still slow, ultimate was cool but frustrating to use.
o   Why does he feel slow?  Animations
o   Lab study #3 = he finally felt fast; gameplay and aesthetics fit the concept; a mix of animation and frame-speed changes made Lucian feel fast; the ultimate was easier to use.
o   Post-launch = fiction quality, theme quality, and default aesthetics were considered high. Visual quality was okay and gameplay quality was low.
o   Summary = a difficult champion to get right; the theme was very compelling, but its mechanics needed to be matched with experience design
·         Case Study #2
o   The champion select screen is where many intense interactions happen. There’s very little time and little information to help players make these tough decisions.
o   Champion select is not an effective setting for collaboration. Solution = remake champion select to be more collaborative and have intent-based queuing that does not set the meta-game.
o   Externalize all the assumptions and test them.
o   Research flow = concept testing, global testing, prototype testing, lab studies
o   Initial survey = random sample of 1,000 NA/EU players covering role preferences (each role had at least a 15% favorite rate), feature disposition, and reactions to team compositions (see the sketch at the end of this section)
o   Global testing = initial survey revealed lack of role terminology consensus (fighter, carry, tank)
o   Prototype testing = an idea with merit still needs to be executed correctly. Results = positive disposition, Captain vs. Solo experience, usability flow
o   Onboarding study = test with brand new players, Team Builder vs. Champion Select
o   Public beta testing = more testing with the player base
·         Summary = system and character testing are similar (test assumptions, validate and ensure intended experiences are being made)
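A rough sketch of the kind of role-preference check described in the initial survey bullet above, assuming a CSV of responses with a favorite_role column (hypothetical names and input, not Riot’s actual tooling):

# Rough sketch of the "each role has at least a 15% favorite rate" check.
# The CSV input and the favorite_role column are assumptions for illustration.
from collections import Counter
import csv

FAVORITE_RATE_FLOOR = 0.15

def role_favorite_rates(path="survey_responses.csv"):
    with open(path, newline="") as f:
        favorites = [row["favorite_role"] for row in csv.DictReader(f)]
    counts = Counter(favorites)
    total = sum(counts.values())
    return {role: n / total for role, n in counts.items()}

if __name__ == "__main__":
    for role, rate in sorted(role_favorite_rates().items(), key=lambda kv: -kv[1]):
        flag = "" if rate >= FAVORITE_RATE_FLOOR else "  <-- below 15% floor"
        print(f"{role:10s} {rate:5.1%}{flag}")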
(by Morgan Kennedy)
·         Understanding Players (Assassin’s Creed 4 Silhouette and Behavior Differentiation)
·         Was a manager at GameStop before working at Ubisoft. Worked on the Assassin’s Creed series, Ghost Recon, Michael Jackson: The Experience, Your Shape, and Just Dance 3.
·         Affordances in AC3 = a railing means that the player can free-run there.
·         Mental models = the idea starts in the player’s head; players learn things about your game each time they play it
·         Ship identification = players had problems telling the different ships apart; they could usually identify the gunboat (smallest ship) and the man of war (biggest ship), but couldn’t tell the three middle-sized ships (Schooner, Frigate, Brig) apart.
·         Perceived differences = small ships were faster and tended to flee, big ships were slower and more aggressive, the gunboat was the exception
·         Players identify ships by differences in behavior (speed, aggression, size)
·         Ship archetypes should be optimized around familiar affordances (the man of war kept its large size; the medium long-ranged ship was made less aggressive so it was better differentiated from the other, short-ranged medium ship)
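The actual AC4 tuning isn’t public, but here is a hypothetical sketch, with invented numbers, of what behavior-differentiated archetypes could look like; the helper just flags which two ships sit closest together on the axes players actually perceive (size, speed, aggression, range):

# Hypothetical ship archetype table (invented values, not Ubisoft data).
from dataclasses import dataclass
from itertools import combinations

@dataclass(frozen=True)
class ShipArchetype:
    name: str
    size: float        # silhouette read: hull size, arbitrary units
    speed: float       # cruise speed, arbitrary units
    aggression: float  # 0 = always flees, 1 = always engages
    attack_range: float

ARCHETYPES = [
    ShipArchetype("Gunboat",    1.0, 0.9, 0.8, 0.3),
    ShipArchetype("Schooner",   2.0, 1.0, 0.2, 0.4),
    ShipArchetype("Brig",       3.0, 0.7, 0.8, 0.4),
    ShipArchetype("Frigate",    3.5, 0.6, 0.4, 0.9),
    ShipArchetype("Man of War", 5.0, 0.4, 0.9, 0.8),
]

def behavior_distance(a, b):
    """How far apart two archetypes are on the perceived-behavior axes."""
    return ((a.size - b.size) ** 2 + (a.speed - b.speed) ** 2 +
            (a.aggression - b.aggression) ** 2 +
            (a.attack_range - b.attack_range) ** 2) ** 0.5

# The pair with the smallest distance is the one players are most likely to confuse.
closest = min(combinations(ARCHETYPES, 2), key=lambda p: behavior_distance(*p))
print(closest[0].name, "vs", closest[1].name)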
(by Naomi Clark)
·         How Indies Playtest: enlisted over 20 indie developers to share their playtesting processes
·         halfrobot.com/indiesplaytest.pdf
·         “Playtest before you think you are ready. Is it too early for you to playtest? If the answer is yes, then playtest anyway.” – Eric Zimmerman
·         “Don’t playtest very early on.” – Dan Cook
·         Methodical time-slice testing = know your questions, know what your build is supposed to deliver, hit your targets
·         Doug Wilson doesn’t even like the word playtesting. Conventional playtesting was good for small UI tweaks, but not mechanics design. Get key suggestions from influential people.
·         Techniques = don’t talk while players are playing; ask players to think out loud; listen to what players say about the problems they had, not the solutions they come up with; construct a heat map (see the sketch at the end of this section); do a more targeted demo (depending on the audience, tell them what to do instead of letting them go through the coded tutorial)
·         “Playtesting is for masochists.” You need thick skin; it’ll upset your expectations; allow a day or two for the ideas to swim around in your head without thinking about them too consciously
·         Eric Zimmerman’s under-the-desk creative meditation
·         It can be very bad to show your game to other developers, because they will tell you how they would make the game instead; still, other devs can help you see your own weak points. Kid testing is great because kids are honest and blunt, but don’t necessarily listen to everything they say.
·         Maybe games aren’t meant to be played (Lose/Lose), or will never be played (Jason Rohrer’s A Game for Someone), or players refuse to play them (Train)
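For the heat-map technique mentioned above, a minimal sketch assuming playtest logs of (x, y) player positions (the log format and the ASCII rendering are placeholders, not anyone’s actual pipeline):

# Bin logged player positions into a coarse grid to see where testers spend
# time, die, or get stuck.
def position_heatmap(positions, world_w, world_h, cols=32, rows=32):
    grid = [[0] * cols for _ in range(rows)]
    for x, y in positions:
        cx = max(0, min(int(x / world_w * cols), cols - 1))
        cy = max(0, min(int(y / world_h * rows), rows - 1))
        grid[cy][cx] += 1
    return grid

def render_ascii(grid):
    """Crude ASCII rendering so hot spots are visible without a plotting library."""
    peak = max(max(row) for row in grid) or 1
    shades = " .:-=+*#%@"
    return "\n".join(
        "".join(shades[int(v / peak * (len(shades) - 1))] for v in row)
        for row in grid
    )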
QUESTIONS AND ANSWERS
·         Lucian’s iterative design took about 3 weeks per cycle. The designer’s intention for the character can sometimes be quite different from players’ perception of it.
·         Playtesting at Ubisoft is a separate service, so testers are not considered game designers; they just provide a mirror of how players are playing the game. The Assassin’s Creed 4: Black Flag team is about 300 people, roughly 20 of whom are game designers. The testers write an 80-page report every two weeks with lots of empirical data (telemetry, player quotations, and feedback).
·         “Data is fiction. Data cannot tell you anything. Only you checking your own assumptions and intuition will tell you something.” – Davin
·         The community you see on the internet is not representative of your full community. Everybody on the team engages with the community. You need to learn how to treat that data qualitatively.
·         Adam Saltsman’s Grave has complete publicity and openness to the community.
·         Be wary of changing your entire design, which can happen if you listen to too much feedback.
·         Bad research is bad research, and it can come from anywhere.
·         The commonality between all designers is that playtesting is extremely useful for seeing how well your UI works. This is something that is not shared with films because films don’t need to test the viewer’s usage of the product.
·         There is more than just raw playability testing. Testing is also useful for gauging the emotional resonance and connection that players have with the space and characters.
·         In Assassin’s Creed 4, players complained that they hated the fog, but sailors hate fog too, so the complaint matched the intended experience.
·         Set is a game that Davin loves and cannot play because he is red-green color-blind. That is a game that wasn’t playtested enough.
·         It’s not about whether a course of action is correct; it’s about your goals and being able to meet them. Lucian was meant to be a fast, very high-skill-ceiling character, and the playtesting and design choices should work towards that.
·         There’s a tight loop in game design when professional players become designers of the game; at least in the triple-A space, there’s a natural progression from player to designer. You don’t need to be an expert to give data and insight, but you do if you want to be an effective collaborator.
·         David Sirlin’s games have a very developer-y like feeling to them.
·         People know they are compensated to test, but for League of Legends and Assassin’s Creed, the playtesters are so passionate and excited to test the games that they give biased information.
·         What kind of affordances can you create with Team Builder to not dictate strategy? Team Builder decoupled roles and position, it focused on how players expressed themselves, not on how the character is meant to be played.
·         Riot uses a lot of designer intuition to make decisions, so they run into failures far less often. Failures are usually not failures of testing and research, but of interpreting the data.
·         There are people who are very excited to use data to dictate decisions. We need stewards to guide the data and interpret them correctly.
·         Self-selection bias – compare opt-in feedback to a random sample (see the sketch at the end of these notes).
·         In Assassin’s Creed 4’s fiction, you are actually a game developer that’s making the simulation that you experience in the Caribbean. At the end of each mission, you will rate each level from 1 to 5, and it will actually be sent to Ubisoft. It’s gamified gamification.
·         Riot uses modeling to look at impact, temper risks, and find the parts to sweat over and be agile about. You want to be able both to react quickly and to develop longitudinally.
·         Was Dark Souls playtested at all? It’s a beautiful disaster that makes you feel alone, afraid, and frustrated… so it clearly has been tested and achieved its goals. It has a very good tutorial too.
·         Riot has a whole team devoted to improving the community and encouraging positive behavior; Team Builder was made by that team. In general, the online player base is not very negative, and the negative behavior you see in games mostly doesn’t come from dedicated trolls. Negative behavior usually comes from someone having a bad incident. This makes it easier for the developers, because that’s a point where they can intervene and try to change things for the better.
·         Testers are injected midway into the process and collaboratively co-design. It’s not a waterfall process.
·         Recommendations = Katherine Isbister’s book, Jakob Nielsen’s Usability, Pedagogy of the Oppressed
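For the self-selection point above, a hypothetical sketch of comparing opt-in playtesters to a random sample using a simple two-proportion z-test (the counts are made up, purely for illustration):

from math import sqrt

def two_proportion_z(yes_a, n_a, yes_b, n_b):
    """z-statistic for the difference between two observed proportions."""
    p_a, p_b = yes_a / n_a, yes_b / n_b
    pooled = (yes_a + yes_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    return (p_a - p_b) / se

# Opt-in volunteers vs. a random sample answering "would you recommend this feature?"
z = two_proportion_z(yes_a=180, n_a=200, yes_b=620, n_b=1000)
print(f"z = {z:.2f}  (|z| > 1.96 suggests the opt-in group really is different)")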
