When I joined Tripwire a year ago to “do User Experience” (a.k.a. UX), literally right off the road from designing mobile freight logistics management tools for truck drivers, I knew I’d be jumping into something different and challenging. But I really had no idea. What I did not fully realize is that in the IT security and compliance business there is not just “lots of data to make sense of.” Oh, no: there is a constant broadcast of sound and fury in the form of event logs that may or may not signify something, plus a ton-and-a-half of other stuff (devices, platforms, applications, policies, owners, geographies, compliance requirements... *phew*) that administrators have to keep track of. No wonder the people responsible for securing their IT infrastructure can get totally hassled and overwhelmed.
[Image: Really sweet UX Lab Rat shirt]
I’ve learned a lot this past year about IT security and how people interact with massive amounts of information in that context. The upcoming release will mark a product I started working on last June, one informed by a consistent, methodical UX practice from design through release. The product has been iteratively designed and tested with people who are, or might become, users, in sessions focused on meeting their IT security management and administration needs through a (dare I say it?) pleasurable user experience. The fruit of our efforts has sprung from the seeds of genius planted by our product team in research and development soil cultivated through constant user validation.
At the core of our Agile UX design philosophy is the assumption that our initial design specifications are hypotheses: they will be partially wrong and need to be validated. The program we have put in place to consistently evaluate our designs is called “revolving door usability.” We work closely with a group of real customer users and user surrogates (from professional services and support) to understand their requirements, and we use feedback from those sessions to iteratively design solutions that we continue testing as they are rendered in code. These structured interviews pair a moderator with a participant who interacts with a prototype in a goal- or task-oriented way. We have found that this approach produces qualitative and quantitative data that can be analyzed across user sessions, getting us to a better overall experience design.
Here’s how we prepare for a session:
- Establish some high-level goals for what we want to test.
- Write a “happy path” story.
- Build screens that mimic the actual behavior of the application.
- Put together a “happy path” click-through that matches the scenario.
- Invite people to participate.
Sessions are not meant to present our customers with fully-baked or nearly-baked features just to gauge their interest. We’re looking to make sure we’re actually addressing their core problems in a way that makes sense. We present the concepts as “wireframe” mock-ups in cognitive-walkthrough fashion, soliciting feedback on how participants would use them to meet their goals and on whether the design solves a problem or relieves some pain. Our goal during these sessions is to ascertain whether people can easily figure out how to accomplish what they have set out to accomplish, and whether they can grok the interface without a lot of coaching. Every session starts with these three questions:
- Where are you?
- What can you do here?
- What would you do here?
We then ask the participant to perform some tasks, watching for trouble spots and gathering ideas for how we can make the design better. It’s important that we actually watch people use the interface; listening to people explain what they think they would do simply does not work as well.
We are very grateful for all the customers and potential customers who have taught us so much about what it is to ensure the security of IT assets through their participation in our user experience studies. We think people will agree that the results are positive.
Note: This post was originally published on the Tripwire blog, The State of Security: http://www.tripwire.com/state-of-security/off-topic/ux-lab-rats-dont-race/