Don't underestimate the difficulty of structuring that user test. When I say you want to ask some open-ended questions, I want you to be careful not to lead the witness. I want you to give them the opportunity to take the conversation where they would like, as it relates to the value you're providing. But I don't want you to go to them and say, "Here, what do you think?" That is not going to be terribly effective. Instead, expect the best, but prepare for the worst. Prepare a script, talk about what you're going to do, talk about what you hope to gain from the experience. Think about what they're going to ask, anticipate how you'll respond and how they'll respond. You really want to plan out this activity. You also want to think about what you really want to know. Don't ask them questions that you're not going to take action on, and be sure that you ask the right questions so you accomplish your goal. You don't want to spend your time and their time to get nowhere in the end. So you want to ask the right questions, you also want to think about the ordering of the questions, and have some time for Q&A at the end.

Practice is important. Many of us are not trained interviewers, many of us have not been through experiences where we've learned how to do this before, so conduct some pilot tests; you might ask a friend to role-play this with you. Time yourself as well, and get a sense of how long you think you're going to need to get through the questions and to reach the goals that you set forth. I would suggest 20 minutes, maybe 30 to 40 minutes on the outside, for a user test. Some people will spend an hour with you, and that's okay, as long as you get through what it is you need to do. Again, practice, be prepared, have a script, have some notes. You also need some consistency from one user to another. You don't want to go to a hundred different people and have a hundred different questions that you ask everybody; you're not going to be able to really analyze that and see areas of consensus or divergence. So in that sense, be sure that you've got a structured approach that you have planned and practiced.

Also, don't overlook the need for building rapport and emphasizing honesty. If you build that rapport, it's more likely that they're going to tell you the truth anyway. You also want to set the expectation with them that you want the critique. You're not looking for an endorsement, you're not looking for a cheerleader; you're looking for an honest reviewer who's going to represent the voice of the customer that you're trying to serve. Let them know that the more critical the feedback, the more it's going to help you, that you want to know what you could do better, and that's the way they're going to help you. It's not just by telling you what you want to hear or being overly complimentary; that's not valuable.

Also ask them to think out loud as they get the product. As they start using the product, as they're clicking around on the screen, as they're pressing buttons, ask them to talk about what they're doing, what they're trying to do, whether they like what they see, whether they can find what they want. If not, you want to know that, and you can take some notes in real time, so you're kind of a reporter as well. You might video record the session, you might audio record it, you might do a screen capture, you might record the video conference with their permission. Again, you want to go back, being a bit of an investigator, and begin to analyze what you're finding as well.
Also, once it's done, assess how it went. After each user test, grade yourself. Did you learn what you had hoped for? Was the person you interviewed a valuable profile? Could you structure the questions a bit differently? Could your screener change to identify the people you want to talk to versus those you don't? Again, don't overlook the element of planning the session, and also see how you did and be honest with yourself afterwards. That way you can get better incrementally as you go.