Let’s assume you’ve heard about crowdtesting and are considering giving it a try.
Now what? How do you get started? Which crowdsourced testing platform should you choose?
A series of questions will pop up in your head. I can’t guarantee to answer them all, but I am willing to share my personal experience, hoping it will be useful.
To be honest, at the beginning I wasn’t at all optimistic. Still, being a tester (quite a stubborn one) I had at least to give it a try and test the available bug hunting platforms.
Fully committed, I started my Odyssey across the ocean of uncertainty (Googling any relevant information), looking for islands of hope (trustworthy testing platforms).
I didn’t really have any system. After learning about a platform, I would search the Internet for reviews and then register.
That’s how I got a nice short list of crowd testing platforms:
As you may suspect, I got infected by the “I can manage it all” virus. Luckily it didn’t take long to dismiss my illusions.
Choose and focus. That’s what I did. I opted for 4 platforms:
Yeah, I know… But 4 is not 14 – I was getting better.
Looking back, I am now aware of how much stress this strategy cost me.
So here is my first piece of advice. If you are a beginner in crowdtesting, or in software testing overall, start with one platform and learn its workflow. Otherwise you will get overwhelmed. Even if you are an experienced tester and think you are an amazing bug hunter, you may still want to reconsider rushing head over heels into multiple crowdtesting sites.
Although all crowdsourced testing platforms seem similar, their requirements may differ just enough to frustrate you.
Differences between crowd testing platforms may be in:
Criteria concerning bug classification. Roughly the same defect may be accepted on one platform and rejected on another. The trickiest part is usability/functional triage. Given that usability is out of scope in many test cycles, this distinction is crucial. Every platform has its own standards, which determine whether a defect is accepted.
Level of detail in bug reports. You may be required to explicitly describe the steps, actual result, and expected result in the desired format. For example, the expected result “Search should work” gets rejected at Test.io as it’s too generic. They would expect something like “After inputting the search term ‘hat’, I expect the available products to be shown in the search results”. It all depends on the platform, though, so you should familiarize yourself with the examples and requirements each crowdtesting company provides.
Platform-specific expectations and bug attachment policies. If you got used to recording screencasts with Jing (whose default output format is .swf), you should know that anything other than .mp4 video is likely to get rejected. Every platform sets its own rules regarding defect attachments. For example, at Test.io you are required to show the date and time in each and every defect screencast, the length of the video must not exceed 60 seconds, and the file size limit for attachments is 25 MB. Forgot to show the date while testing on mobile? Defect screencast is 63 seconds long? You guessed it: rejected.
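The attachment rules above can be checked before you hit submit. Here is a minimal sketch of such a pre-submission check in Python, assuming the Test.io limits mentioned above (.mp4 only, at most 60 seconds, at most 25 MB); the function name, file paths, and the way you obtain the duration (normally from a tool like ffprobe) are all illustrative, not anything a platform actually provides:

```python
import os

# Assumed limits, taken from the Test.io rules described in the text.
MAX_SIZE_BYTES = 25 * 1024 * 1024  # 25 MB attachment cap
MAX_DURATION_S = 60                # 60-second screencast cap

def attachment_problems(path, duration_s):
    """Return a list of reasons the attachment would likely be rejected.

    `duration_s` is supplied by the caller; in practice you would read it
    from the video file with a tool such as ffprobe.
    """
    problems = []
    if not path.lower().endswith(".mp4"):
        problems.append("not an .mp4 video")
    # Only check the size if the file actually exists on disk.
    if os.path.exists(path) and os.path.getsize(path) > MAX_SIZE_BYTES:
        problems.append("file larger than 25 MB")
    if duration_s > MAX_DURATION_S:
        problems.append("screencast longer than 60 seconds")
    return problems

print(attachment_problems("bug_repro.swf", duration_s=45))
print(attachment_problems("bug_repro.mp4", duration_s=63))
```

Running it flags the .swf format in the first case and the 63-second length in the second. The point is not the script itself but the habit: know each platform’s attachment rules and verify them before submitting.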
Therefore I suggest starting with just one testing platform. You may register on several and then choose the one that seems most appealing. After a certain amount of practice, you will be more comfortable switching between projects on different crowdsourced testing platforms.
At the same time here is the next advice:
Don’t bite off more than you can chew; in other words, don’t get involved in too many projects.
You may be thrilled to receive multiple test invitations as your inbox fills with several e-mails an hour. Still, that’s no reason to sign up for each and every one. If you aren’t sure about a project, you can join it and leave afterward. Some platforms (like Test.io) offer an easy way to quit any test cycle: simply click one button and specify a reason. At TesterWork you can e-mail support with a request to be removed from the cycle. At Bugfinders, joining a cycle and not taking part in it doesn’t affect your rating in any negative way. Why not take advantage of that?
After you figure out the workload you can handle, you’ll be able to choose your projects accordingly.
As I’ve previously stated, I didn’t work on all 14 platforms I initially registered on.
Nevertheless, I have no intention of disparaging them based purely on my limited experience. I never followed up with the platforms I gave up on at the initial stage, so I can’t draw any definite conclusions about them.
I can, however, note the most common issues that discouraged me from working on those platforms:
- I didn’t receive any invitations to test software products
- I got very few invitations and was unable to join any project
- The platform’s own interface was buggy
- I was unable to download the product under test
Issues like these are time and motivation killers for testers. Nevertheless, the last thing I would want is to tag any platform as unprofessional. These teams work hard to transform the way classical testing procedures are performed, and they deserve credit for that.
Let’s summarize the workflow for getting started with the crowd:
- Define your objectives. You should know what you need the crowd for (more experience, diversity, financial remuneration?).
- Identify the workload you can handle.
- Choose the projects based on the objectives defined in step 1.
- Analyze the outcomes and adapt accordingly.