As with the other two applications in this comparison, we carried out two separate social validation tests: on the one hand, Andrés (a user with no vision) on an iPhone running iOS, and on the other, Domo (a user with reduced vision) on a Samsung Android terminal.
iOS tests carried out by Andrés
Andrés, our volunteer who specializes in testing products for users with no vision, has tested the three applications in this comparison; in the following lines we summarize his experience.
Andrés installed the application on his iPhone 5S with great ease, aided by the VoiceOver screen reader.
A message appears warning that TapTapSee wants to access the camera, which must be accepted.
It is important to note that, unlike Aipoly Vision, this application does not work in real time but from photographs, so it delivers only one result per object analyzed.
The additional information is translated into Spanish, which makes configuration with the screen reader easier.
The option to share photos is welcome. Andrés tried to share a photo on WhatsApp, but an error caused the app to close, so he ended up sharing it by email.
The application is very good with text, identifying text printed on objects, such as brand names and other details. It has somewhat more difficulty identifying colors.
The following images are part of the list of objects used to test the application.
We wanted to test the same objects as with the other compared applications; these are the results:
- Pen: identified as "Image 2 is bluish click on the pen on a white surface"; the identification is good, but the phrase read to the user does not make much sense. On a second attempt: "image 1 is blue and white click pen".
- Mug: "image 2 is a black ceramic mug"; the mug is actually purple, but the object is correctly identified.
- Paper cup: "image 3 is black and white, disposable cup on a blank surface"; again the color is wrong, but the object is right.
- Chair: "white plastic armchair image with gray metal base", which is a good identification.
- Smartphone: "Black Samsung smartphone on table", a very good identification that even includes the brand.
- Computer mouse: "HP black mouse", again a good identification; the app is very good at identifying brands and text.
- Water bottle: "Clear plastic water bottle", correct.
- Keys: "Assorted keys in carabiner on white background", correct.
- Wallet: "black bifold leather wallet on black table", a correct identification.
- Notebook: "Blue and black spiral on white wooden surface"; the type of object is left unidentified.
General impressions:
- This application has no dedicated color-analysis feature; color is simply included in the overall description, and it does not always match the real color.
- Because it works from photographs, it is slower than Aipoly but does not generate as much repetitive information; it identifies colors worse, but text, especially brand names, better.
- It is easy to handle, and it is appreciated that the explanations are in Spanish.
- Being able to share the photos by email makes it easier to get a second opinion from a sighted person.
Android tests carried out by Domo
Juan Carlos Domonte (Domo) is our volunteer who specializes in testing products for people with reduced vision. Domo's field of vision is very small, so he can only see things up close, in the central area.
Domo uses an Android terminal, getting very close to the screen to see it. He uses the screen zoom function and handles his Samsung S8 very well.
Domo installed the application quickly after finding it on Google Play.
The first thing the application does is warn us, in a message, that TalkBack is deactivated and ask whether we want to activate it.
In this case Domo does not use his terminal with a screen reader, so he decides not to activate TalkBack, and here a problem appears: with TalkBack off, the audio description of the identified images cannot be heard. Domo suggests fixing this problem for the benefit of people with reduced vision who do not yet use a screen reader.
The application takes longer than Aipoly to process each image, but gives more information in each analysis.
- Pen: "blue and white click the pen on the surface" well identified but the phrase is somewhat abstract.
- Mug: "black ceramic mug on wooden table" well identified the object, wrong color.
- Paper cup: “Black and white coffee cup” well identified the object and wrong the color.
- Chair: "White plastic armchair with gray metal frame" correct analysis.
- Smartphone Phone: "Black Samsung Android Smartphone" correct analysis.
- Computer mouse: "HP wired mouse black" analysis correct.
- Water bottle: "Transparent plastic water container" well identified, although it remains to be specified that it is a bottle as in the IPhone analysis.
- Keys: "assorted keys on white surface" correct identification.
- Wallet: “black bi-fold wallet” correct identification.
- Notebook: "Blue notebook on white background" correct identification, in this case better than in the test with IPhone.
General impressions:
- The application offers very detailed information.
- The app should let the analysis be heard without requiring the screen reader to be activated.
The following table compares the three applications against the social validation criteria.
Comparison of social validation criteria
| Criteria | Aipoly Vision | TapTapSee | CamFind |
| --- | --- | --- | --- |
| USABILITY | . | . | . |
| COST | . | . | . |
| RELIABILITY | . | . | . |
| IMPACT | . | . | . |
| UTILITY | . | . | . |
| UNDERSTANDING | . | . | . |
| ACCESSIBILITY | . | . | . |
| IMAGE IDENTIFICATION | . | . | . |