Lucid's Experience with Crowdsourced Testing
See how Lucid used crowdsourced testing to gain a stronger foothold on its path toward automated testing.
Lucid realized that crowdsourced testing could be helpful in our transition towards a more automated testing process. As a result, Lucid introduced Rainforest QA, a crowdsourced testing service, into our testing process last year. We now have over 200 active tests in our suite, which has freed up quality resources to do more exploratory testing and allowed Lucid to perform more frequent releases to production. Here are some of the pros and cons that we've seen as we've used crowdsourced testing.
Pros of Crowdsourced Testing
Test results come fast
A good crowdsourced testing system can make scheduling tests and getting results very convenient. At Lucid, for example, it only takes an average of 20 minutes to get results from a single test. Running multiple tests at once will take longer, but even when we run all 200+ of our tests at once, we get results for all of them in about 90 minutes. Test results from Rainforest include what step a test failed at, testers' notes on why it failed, and a video recording of the test session. The video is especially useful for diagnosing test case failures.
Everyone on the QA team can author crowdsourced tests
One of the strengths of most crowdsourcing platforms is that there is no programming language testers have to learn to use the platform: every test is written in plain English. This means that anyone who is familiar with the test cases can contribute to the suite. The web app also facilitates ease of use by allowing users to access and run any of your company's tests.
Crowd testers can use visual instructions
It's often easier to describe a test case through visual demonstration, especially when the test's success criteria involve visual validation. The Rainforest QA platform facilitates the inclusion of visuals by allowing users to include screenshots and animated GIFs with each test step. Providing both visual and text-based instruction saves time in both test authoring and maintenance and adds a level of redundancy that helps testers understand instructions.
Excels at basic tests and visual tests
Crowdsourced testing can often be useful for tasks that would be tedious for your quality assurance team to perform week after week. Testing repetitious features can be wearying for QA testers, but it's less of an issue for crowd testers.
Crowdsourcing is also well-suited for visual and drag-and-drop tests. Automation tools struggle with visual validation and complex mousing interactions, but humans can understand these tasks more easily. This is another benefit of using screenshots in tests: visual tasks are even easier to understand when testers have visual instructions.
Cons of Crowdsourced Testing
Rainforest tests must be written in layman's terms
At Lucid, we have seen a 25:1 ratio between the time needed to author a crowdsourced test script and the time needed to execute the test case manually. So if a test case takes one minute to execute manually, it takes 25 minutes to write a script that a crowdsourced tester could understand.
The reason this ratio is higher than Lucid expected is that we can't just export our test cases into Rainforest and call it a day. Because crowd testers are unfamiliar with our product, every test case has to be rewritten specifically for Rainforest QA. A phrase like "use the import tool to verify a CSV populates the widget correctly" works for members of a QA team. Crowdsourced testers, however, require step-by-step instructions. Balancing necessary detail and brevity can be the biggest challenge of authoring crowdsourced tests, but mastering it will improve the reliability of your suite.
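For illustration, the terse phrase above might expand into crowd-ready steps like the following. The specific wording, UI labels, and file name here are hypothetical, not Lucid's actual test; the action/question pairing is the general shape such instructions take:

```
Step 1: Click the "Import" button in the toolbar.
Question: Does a file-selection dialog appear?

Step 2: Select the provided file "sample.csv" and click "Open".
Question: Does a widget appear on the canvas showing rows of data?

Step 3: Compare the widget's contents to the first three rows of the CSV file.
Question: Do the values in the widget match the values in the file?
```

Each step pairs a single action with a yes/no question, so a tester who has never seen the product can still judge success unambiguously.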
Bugs are caught later in the dev cycle
Since crowdsourced test suites limit your test runs based on a subscription or payment plan, Lucid only runs its testing suite on a stable release candidate. The downside to this approach is that test results come later in the dev cycle. This has never delayed a scheduled release at Lucid, but it's an important tradeoff to consider when integrating a crowdsourced system into your software QA process.
Not all tests are suitable for crowdsourcing
Crowdsourced services can provide results quickly, but not all test cases are well-suited for crowdsourcing. Lucid has found that test cases become too complex when the instructions run longer than 15 steps (with an ideal length of 10 steps or fewer); tests beyond that threshold failed more often due to tester error. Other tests have features that can't be tested in Rainforest or are judged critical enough to test manually. Redesigning and splitting up test cases mitigated these issues in many cases, but we've recognized that crowd testing isn't always the best way to test a feature.
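As a rough sketch, the step-count thresholds above could be enforced with a small check before tests are submitted for crowdsourcing. The data structure and function here are illustrative assumptions, not part of Rainforest's tooling; only the numeric limits come from Lucid's experience:

```python
# Illustrative sketch: flag test cases whose step count exceeds the
# reliability thresholds observed at Lucid (15 steps hard limit,
# 10 steps ideal). The suite structure below is hypothetical.

IDEAL_STEPS = 10  # tests at or under this length were most reliable
MAX_STEPS = 15    # tests beyond this failed more often from tester error

def review_test_length(name, steps):
    """Return a short verdict for a test case based on its step count."""
    count = len(steps)
    if count > MAX_STEPS:
        return f"{name}: {count} steps - split before crowdsourcing"
    if count > IDEAL_STEPS:
        return f"{name}: {count} steps - consider shortening"
    return f"{name}: {count} steps - ok"

if __name__ == "__main__":
    suite = {
        "CSV import populates widget": ["step"] * 8,
        "Full document lifecycle": ["step"] * 18,
    }
    for test_name, test_steps in suite.items():
        print(review_test_length(test_name, test_steps))
```

A check like this could run in CI on the test-case source, catching overlong tests before they burn a paid crowdsourced run.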
It's also worth noting that giving testers access to internal systems raises security concerns. Lucid was able to mitigate this risk successfully through proper network and account security practices, but security should always be considered when exposing test environments that may contain sensitive information to testers.
Crowdsourced testing can be a powerful testing tool in a company's overall QA strategy. It may not solve all of your testing problems, but it has helped Lucid develop its testing suite and complements our company's agile approach to testing.
Published at DZone with permission of Chandler Wakefield, DZone MVB.
Opinions expressed by DZone contributors are their own.