Have you heard of Peeple? It’s a newly announced app that is described as “Yelp for people.” This post is not meant to pass judgement on the idea but simply to spitball how you could test an app so tied to personality, judgement, and privacy.
Testing a controversial application presents a unique challenge to a testing team. The pressure to ensure that the delivered app is as close to flawless as possible is intense. Making that controversial application mobile multiplies the stress, adding the complications inherent in mobile testing.
Adding to that pressure is the sensitive nature of the app and how it affects your team dynamics. Because the builds you are getting are not necessarily production ready, rules should be put in place regarding what information can be entered and who on the team can view it. In other words, you must put safeguards in place to protect your team from each other. Should you allow testers to post about other team members or office mates? Will team members post about others using different names? What is acceptable behavior for your test team?
An app that has such potential to be detrimental to its users and prospective users needs to have safeguards in place. These safeguards should be the highest priority in testing, because the viability of the app is tied directly to its ability to protect its users’ privacy. One of the announced features is that negative reviews will not post for users who are not enrolled members. That should be a high priority test case that is attacked from multiple angles. As a matter of fact, the majority of test cases should revolve around a user “sneaking” a negative review through the system, whether it’s by attaching a review to a similar name or by giving a three-star review that contains negative comments or information that violates the Peeple terms and conditions.
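Those “sneak a negative review through” scenarios lend themselves to table-driven test cases. The sketch below is purely illustrative: `should_publish`, the enrollment flag, and the toy negative-word check are all placeholders I invented for this post, not Peeple’s actual API or rules.

```python
# Hypothetical sketch: a publish gate that blocks negative reviews
# aimed at non-enrolled subjects. All names and rules are placeholders,
# not Peeple's actual implementation.

NEGATIVE_MARKERS = {"rude", "dishonest", "avoid"}  # toy stand-in for real sentiment analysis

def is_negative(review):
    """A review counts as 'negative' if it is rated below 3 stars, or
    if its text contains a negative marker even at 3+ stars (the
    'three-star review with negative comments' sneak)."""
    if review["stars"] < 3:
        return True
    words = review["text"].lower().split()
    return any(w in NEGATIVE_MARKERS for w in words)

def should_publish(review, subject_enrolled):
    """Negative reviews of non-enrolled subjects must not publish."""
    return subject_enrolled or not is_negative(review)

# Table-driven cases, including the sneak attempts described above:
cases = [
    # (review, subject_enrolled, expected_to_publish)
    ({"stars": 1, "text": "rude and dishonest"}, False, False),        # plain negative
    ({"stars": 3, "text": "nice enough but avoid hiring"}, False, False),  # 3-star sneak
    ({"stars": 5, "text": "great neighbor"}, False, True),             # positive is fine
    ({"stars": 1, "text": "rude"}, True, True),                        # enrolled: may post
]

for review, enrolled, expected in cases:
    assert should_publish(review, enrolled) == expected
print("all publish-gate cases pass")
```

A real test suite would attack the same gate from many more angles (similar-name matching, unicode tricks, edited reviews), but the table structure makes those cheap to add.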
Additional safeguards that relate to privacy are more directly tied to security. Depending on the policies around handling successfully contested reviews, testing would need to occur regarding the security of those unpublished negative reviews. If they are not completely removed from the system, are they sitting out for an unauthorized user to collect and publish?
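One way to frame that unauthorized-access risk as a test: assert that an ordinary caller can never retrieve an unpublished, contested review from the store. The in-memory store and the roles below are stand-ins I made up for illustration, assuming contested reviews are retained rather than deleted.

```python
# Hypothetical sketch: contested (unpublished) reviews must not be
# retrievable by ordinary callers. Store shape and roles are placeholders.

class ReviewStore:
    def __init__(self):
        self._reviews = []  # each review: {"text": ..., "published": ...}

    def add(self, text, published=True):
        self._reviews.append({"text": text, "published": published})

    def fetch(self, requester_role="user"):
        """Ordinary users see only published reviews; only a trusted
        'auditor' role may see contested, unpublished ones."""
        if requester_role == "auditor":
            return list(self._reviews)
        return [r for r in self._reviews if r["published"]]

store = ReviewStore()
store.add("great coworker")
store.add("contested negative review", published=False)

# A regular user must never receive the contested review:
visible = store.fetch(requester_role="user")
assert all(r["published"] for r in visible)
assert len(visible) == 1
print("unpublished review is not exposed to ordinary users")
```

The security testing itself would go after the real API and storage layer, not a model like this, but the assertion ("no unpublished review ever crosses the authorization boundary") is the invariant to verify.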
Another facet of the security testing is validating that real people are posting reviews. The perception out in the wild is that users must have Facebook accounts and real names. Serious testing effort should go into validating that users are real people and that their (true) age meets the Peeple terms and conditions.
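A registration-validation check along those lines might be tested like the sketch below. The field names, the linked-account requirement, and the 21-year minimum are all placeholders standing in for whatever the terms and conditions actually require.

```python
# Hypothetical sketch: registration requires a linked social account,
# a non-empty real name, and a minimum age. Threshold and field names
# are placeholders, not Peeple's actual rules.
from datetime import date

MIN_AGE = 21  # placeholder threshold

def age_on(birth_date, today):
    """Whole years of age on a given date (birthday-aware)."""
    years = today.year - birth_date.year
    if (today.month, today.day) < (birth_date.month, birth_date.day):
        years -= 1
    return years

def can_register(profile, today):
    """Reject profiles with no linked account, an empty name, or an
    age below the minimum."""
    if not profile.get("linked_account_id"):
        return False
    if not profile.get("real_name", "").strip():
        return False
    return age_on(profile["birth_date"], today) >= MIN_AGE

today = date(2015, 10, 8)
# No linked account -> rejected:
assert not can_register({"real_name": "A. Person",
                         "birth_date": date(1990, 1, 1)}, today)
# Linked account but under the placeholder age -> rejected:
assert not can_register({"linked_account_id": "acct123", "real_name": "A. Person",
                         "birth_date": date(2000, 1, 1)}, today)
# Linked account, real name, old enough -> accepted:
assert can_register({"linked_account_id": "acct123", "real_name": "A. Person",
                     "birth_date": date(1990, 1, 1)}, today)
print("registration checks pass")
```

Tests here should also probe the client-reported side: whether a fake linked account or a falsified birth date can slip through, which is where the real validation effort lives.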
As apps become more intertwined in private life and the public becomes more comfortable putting information into them, security and privacy will continue to be the biggest risk alongside app functionality. The boundaries of what has been acceptable functionality in the past are being worn away. Sensitive information has gone from just payment information and medical records to also include critiques or more personally damaging information. And in the case of Peeple, definitively false information could be damaging and potentially libelous if released publicly. As these changes occur, testing teams need to be aware and prepared to change right along with the apps they’re testing.