USER INTERFACE MECHANISMS FOR PEOPLE WITH VISUAL IMPAIRMENT TO ACCESS MOBILE PHONES Tuhin Chakraborty




USER INTERFACE MECHANISMS FOR PEOPLE WITH VISUAL IMPAIRMENT TO ACCESS MOBILE PHONES

Tuhin Chakraborty


USER INTERFACE MECHANISMS FOR PEOPLE WITH VISUAL IMPAIRMENT TO ACCESS MOBILE PHONES

Thesis submitted to the
Indian Institute of Technology Kharagpur

for award of the degree

of

Master of Science (by Research)

by

Tuhin Chakraborty

Under the guidance of

Dr. Debasis Samanta and

Dr. Sudip Misra

School of Information Technology
Indian Institute of Technology Kharagpur

Kharagpur - 721302, India
December 2015

©2015 Tuhin Chakraborty. All rights reserved.


Approval of the Viva-Voce Board

/ /2015

Certified that the thesis entitled User Interface Mechanisms for People with Visual Impairment to Access Mobile Phones, submitted by Tuhin Chakraborty to the Indian Institute of Technology Kharagpur for the award of the degree of Master of Science, has been accepted by the external examiners and that the student has successfully defended the thesis in the viva-voce examination held today.

(Member of DAC) (Member of DAC) (Member of DAC)

(Supervisor) (Co-Supervisor)

(Internal Examiner) (Chairman)


CERTIFICATE

This is to certify that the thesis entitled User Interface Mechanisms for People with Visual Impairment to Access Mobile Phones, submitted by Tuhin Chakraborty to the Indian Institute of Technology Kharagpur, is a record of bona fide research work under our supervision, and we consider it worthy of consideration for the award of the degree of Master of Science (by Research) of the Institute.

Dr. Debasis Samanta
Associate Professor
School of Information Technology
Indian Institute of Technology Kharagpur
Kharagpur - 721 302, India

Dr. Sudip Misra
Associate Professor
School of Information Technology
Indian Institute of Technology Kharagpur
Kharagpur - 721 302, India


DECLARATION

I certify that

a. The work contained in the thesis is original and has been done by myself under the general supervision of my supervisors.

b. The work has not been submitted to any other Institute for any degree or diploma.

c. I have followed the guidelines provided by the Institute in writing the thesis.

d. I have conformed to the norms and guidelines given in the Ethical Code of Conduct of the Institute.

e. Whenever I have used materials (data, theoretical analysis, images, and text) from other sources, I have given due credit to them by citing them in the text of the thesis and giving their details in the references.

f. Whenever I have quoted written materials from other sources, I have put them under quotation marks and given due credit to the sources by citing them and giving required details in the references.

Tuhin Chakraborty


Dedicated To,
My beloved Parents,

Tapan Kumar Chakraborty

Godhuli Chakraborty


ACKNOWLEDGMENT

This thesis would not have seen the light of day had it not been for the help and support of many people. Foremost, I would like to thank my advisors, Dr. Debasis Samanta and Dr. Sudip Misra, who have supported me throughout with their patience and knowledge while allowing me the room to work in my own way. Their insistence on quality work has helped immensely in shaping this thesis. They have been ever encouraging and supportive whenever I explored new research directions or pursued new research ideas, continually inspiring me to face fresh challenges each day.

It gives me immense pleasure to thank the head of the department, Prof. Rajib Mal, for extending to me all possible facilities to carry out the research work. I am grateful to Dr. Krothapalli Sreenivasa Rao, Dr. Pabitra Mitra, and Dr. M. Manjunatha for serving on my Departmental Academic Committee and for their valuable advice, suggestions, and encouragement.

In my daily work, I have been blessed with a friendly and cheerful group of fellow students, researchers, and lab-mates who have all been a strong pillar of support, academically, emotionally, and psychologically. I owe it to them for having given me the encouragement and strength to slowly pave my way through my journey at IIT Kharagpur. A special vote of thanks to Dr. Debasish Kundu, Dr. Aditi Roy, Sankar Narayan Das, Santa Maiti, Goutam Mali, Sujata Das, Sayan Sarcar, Soumalya Ghosh, Manoj Kumar Sharma, Pradipta Saha, Arindam DasGupta, Chandan Misra, Barun Kumar Saha, Tamaghno, Pratik, Arijit, Kumar Raja, Raj Kumar Dutta, Kadiyala Sai Praveen, Gaurang Panchal, Anupam Mandal, Arun Patra, Tirthankar Das Gupta, Manjira Sinha, C. Ghanthimala, Rajeev, Late Siddharth Bharti, Harika, and Jyothi for being such wonderful friends and lab-mates. I also cherished the company of a special group of friends from IIT Kharagpur – Kanchan, Ayan, Sandip, Asish, Arunava, Dipanjan, Biswajit, Sanku, Amit, Manas, Srikanth, Kiran, and Avirup – and also from Kolkata – Dipen, Siddhartha, Aritra, Sougata da, Bapi da, and Pradip da.

My greatest strength comes from my mother. She has continued to be my greatest source of inspiration. Her unconditional love and support have given me the strength to continue my research work unhindered, even during the most difficult phases of my life. She has been the wind beneath my wings and the silver lining to each weary day.

Tuhin Chakraborty


List of Symbols and Abbreviations

List of Symbols

∠ angle

·° degree

× times

% percent

> greater than

< less than

| · | Absolute value

|T| Length of an entered string

S Time taken to enter the string, in seconds

WPM Words per minute

C All correct characters in transcribed text

INF All incorrect characters in transcribed text

IF All characters backspaced during entry
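For reference, the symbols above are conventionally combined into the standard text-entry performance metrics, where one word is counted as five characters. The formulations below are the common ones from the text-entry literature and are stated here as an assumption; the thesis's exact formulas may differ.

```latex
% Entry rate in words per minute: |T| characters entered in S seconds.
\mathrm{WPM} = \frac{|T| - 1}{S} \times 60 \times \frac{1}{5}

% Total error rate combines uncorrected (INF) and corrected (IF) errors.
\text{Total Error Rate} = \frac{\mathrm{INF} + \mathrm{IF}}{C + \mathrm{INF} + \mathrm{IF}} \times 100\%
```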

List of Abbreviations

ITU International Telecommunication Union

MPO Mobile Phone Organizer


NAB National Association for the Blind

GSV Google Street View

MB Megabyte

ROM Read Only Memory

RAM Random Access Memory

GB Gigabyte

SMS Short Message Service

VI Visual Impairment


Abstract

Of late, there has been rapid advancement in mobile technologies, and the mobile phone has become an indispensable personal digital gadget in modern life. Besides voice and video calling over the phone, communication through the Short Message Service (SMS) is also gaining popularity day by day. However, these advantages are not accessible to people with visual impairment, because human interaction with this essential device relies heavily on visual perception, both for issuing commands (input) and for accessing information from the output screen.

Aiming to find a suitable mobile device interaction in the absence of vision, we explored different modes of interaction: voice based interaction, tactile feature based interaction, and gesture based interaction. We found that keypad based interaction can provide good tactile support to assist our target users, but the presence of too many keys on the keypad hampers different accessibility attributes. Based on this observation, we propose a keypad based interaction mechanism with fewer keys compared to the existing keypad setup. Experiments with users reveal that our approach provides better reachability of a particular key in less time, with enhanced tactile features.

We design a text entry procedure with this keypad setup, based on design decisions that build on users' capabilities and avoid demands on their inabilities. Our system restricts users' finger movement during the text entry task to a four-way navigation pad and a selection key, avoiding the need to perceive changes in direction and distance while moving from one key to another on the keypad, which is a highly visually demanding process. We conducted a user study which establishes that our design decisions provide a fast, accurate, and very friendly text entry mechanism.
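The restriction to four-way navigation plus a selection key described above can be sketched as follows. This is an illustrative sketch only: the grid of letter groups, the class name, and the clamping policy at the edges are assumptions for illustration, not the thesis's actual implementation.

```python
# Sketch of keypad focus traversal limited to four-way navigation plus a
# selection key. The 3x4 grid of letter groups mirrors a traditional
# telephone keypad; the layout here is an assumption for illustration.
GROUPS = [["1", "abc", "def"],
          ["ghi", "jkl", "mno"],
          ["pqrs", "tuv", "wxyz"],
          ["*", "0 _", "#"]]

MOVES = {"up": (-1, 0), "down": (1, 0), "left": (0, -1), "right": (0, 1)}

class FocusNavigator:
    def __init__(self, rows=4, cols=3):
        self.rows, self.cols = rows, cols
        self.r, self.c = 1, 1  # start at a central, easily located group

    def move(self, direction):
        """Shift focus one cell; edges are clamped so the user can never
        'fall off' the grid, keeping every move predictable by touch."""
        dr, dc = MOVES[direction]
        self.r = min(max(self.r + dr, 0), self.rows - 1)
        self.c = min(max(self.c + dc, 0), self.cols - 1)
        return GROUPS[self.r][self.c]  # would be announced via TTS

    def select(self):
        """Return the currently focused group for entry."""
        return GROUPS[self.r][self.c]
```

Because every action is one of only five keys, the user never needs to judge direction or distance to a distant key, which is the visually demanding step the design removes.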

We also investigated the prospects of touch screen based mobile devices for blind users. We inspected how our target users can transfer their sense of direction to perform directional movement gestures on a flat device screen. We found the directional flick gesture, based on a "think once, move once" strategy, to be a fast and suitable interaction for them. We then propose a text entry mechanism based on this interaction procedure for touch enabled mobile devices. A user study shows that our system achieves a good text entry rate with fewer errors, and is also easy for users to use.
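A directional flick of the kind described above can be recognized from the flick's displacement alone. The sketch below is a minimal, assumed classifier (the function name, sector scheme, and 20-pixel threshold are illustrative choices, not the thesis's implementation): each of eight directions owns a 45-degree sector, so only a rough heading is needed, which matches the "think once, move once" idea.

```python
import math

# Directions in counter-clockwise order starting at "right" (0 degrees).
# Assumes mathematical axes (y increases upward); touch screens usually
# invert the y-axis, so flip dy before calling in that case.
DIRECTIONS = ["right", "up-right", "up", "up-left",
              "left", "down-left", "down", "down-right"]

def classify_flick(dx, dy, min_dist=20):
    """Map a flick displacement (dx, dy) to one of 8 directions.

    Returns None for movements shorter than min_dist pixels, so small
    accidental touches are not treated as deliberate flicks.
    """
    if math.hypot(dx, dy) < min_dist:
        return None
    angle = math.degrees(math.atan2(dy, dx)) % 360.0
    sector = int((angle + 22.5) // 45) % 8  # 45-degree sector per direction
    return DIRECTIONS[sector]
```

A two-step entry in this style would then chain two such classifications: one flick to select a letter group, a second to select the letter within it.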

Keywords: User-centric design; Text entry mechanism; Mobile device interaction forpeople with visual impairment; Rehabilitation engineering.


Contents

Approval i

Certificate iii

Declaration v

Dedication vii

Acknowledgment ix

List of Symbols and Abbreviations xi

Abstract xiii

Contents xv

List of Figures xix

List of Tables xxiii

1 Introduction 1
  1.1 Accessing Hand-Held Mobile Devices by Blind People 2
  1.2 Motivation of the Work 3
  1.3 Research Objectives 4
  1.4 Proposed Methodologies 5
    1.4.1 Exploring Suitable Interaction Modes 6
    1.4.2 Easy and Efficient Keypad-Based Text Entry Mechanism 7
    1.4.3 Gesture-Based Text Entry for Touch Screen Devices 8
  1.5 Contributions of the Thesis 10
  1.6 Organization of the Thesis 11


2 State of the Art 13
  2.1 Tactile Feature-Based Solutions 13
  2.2 Voice-Based Interaction 14
  2.3 Gesture-Based Solutions 15
  2.4 Transformed Physical Keypad Design Based Solutions 16
  2.5 Transformed Braille Keypad Based Solutions 17
  2.6 Miscellaneous Work 19
  2.7 Summary 21

3 Interaction Mode Exploration 23
  3.1 Introduction 24
  3.2 Analysis of the Existing Styles of Interactions 25
    3.2.1 Lacuna with the Existing Mode of Interactions 25
    3.2.2 Specific Needs of Users 27
  3.3 User Study to Realize the Difficulties with Existing Keypad Conditions 28
    3.3.1 Keypad Conditions as Test Cases 29
    3.3.2 Participants 29
    3.3.3 Task 30
    3.3.4 Apparatus 31
    3.3.5 Experimental Design 32
    3.3.6 Results 32
    3.3.7 Discussion 34
  3.4 Current Boom in Touch-Based Smart Phones 35
  3.5 Summary 36

4 Solution with the Presence of Tactile Sense 39
  4.1 Introduction 39
  4.2 BlindGuide: A Design of a Caller Guide with Less Number of Keys 41
    4.2.1 Formative Interviews 42
    4.2.2 Design of Eyes-Free Interaction 43
      4.2.2.1 Design Rationale 43
      4.2.2.2 Menu Organization 45
      4.2.2.3 Name Entry Mechanism 47
      4.2.2.4 Number Entry Mechanism 47
    4.2.3 Discussion 49
  4.3 Efficacy of Hot Keys Setup 50
    4.3.1 Task, Apparatus, Participants 50


    4.3.2 Experimental Design and Results 50
    4.3.3 Discussion 53
  4.4 EasyTap: Efficient Text Entry System for Blind Users 54
    4.4.1 Specify Users' Demands 54
    4.4.2 Iterative Design of EasyTap 57
    4.4.3 User Evaluation 59
      4.4.3.1 Participants 59
      4.4.3.2 Task 60
      4.4.3.3 Experimental Design 60
      4.4.3.4 Results 61
      4.4.3.5 Subjective Preferences 63
    4.4.4 Discussion 63
  4.5 An Example Prototype Utilizing EasyTap 64
    4.5.1 Context 64
    4.5.2 Functional Organization 65
    4.5.3 Contact Entry Mechanism 67
    4.5.4 Text Message Composing Using EasyTap 69
    4.5.5 Evaluation 70
    4.5.6 Discussion 72
  4.6 Summary 72

5 Gesture-Based Text Entry for Touch Screen Devices 75
  5.1 Introduction 75
  5.2 Scopes and Objective of the Work 77
  5.3 To Study the Performance of Guided Directional Movements 78
    5.3.1 Participants 79
    5.3.2 Apparatus 79
    5.3.3 Design of Experiments 80
      5.3.3.1 Procedure for a Unit Entry 80
      5.3.3.2 Experiment 81
    5.3.4 Observation, Results and Discussion 82
    5.3.5 Motivation to Go for Further Study 88
  5.4 To Study the Performance of Unguided Flick Gestures 89
    5.4.1 Apparatus 89
    5.4.2 Participants 90
    5.4.3 Procedure and Experimental Design 90


      5.4.3.1 Input Mechanism for Unit Entry 90
      5.4.3.2 Design of the Experiment 91
    5.4.4 Observation, Results and Discussions 91
      5.4.4.1 Detection of Gestures 91
      5.4.4.2 Discussion 96
  5.5 VectorEntry: a Text Entry Mechanism by Directional Flicks 97
    5.5.1 Entry Mechanism: a Two-step Procedure 97
      5.5.1.1 Group Selection Mechanism 97
      5.5.1.2 Character Selection Mechanism 100
    5.5.2 User Evaluation 100
      5.5.2.1 Deciding Prototype Mechanism as Test Cases 100
      5.5.2.2 Experiment 102
      5.5.2.3 Results 104
    5.5.3 Discussion 112
  5.6 Summary 113

6 Conclusion and Future Research 115
  6.1 Contributions of the Thesis 118
  6.2 Significance of our Research 119
  6.3 Future Scope of Work 119

Publications 121

References 123


List of Figures

2.1 Physical keypad based solutions with the presence of tactile features. 15
  (a) Physical QWERTY keypad layout with tactile identifier at 'G' key.
  (b) Traditional 4×3 telephonic keypad with tactile identifier at '5' key.
  (c) ALVA's Mobile Phone Organizer (MPO) (Denham 2004) that uses Braille keys.
  (d) Braille dots placed on the physical keys in the Intex Vision phone set (mobigyaan.com).

2.2 Modified glyph set for single-stroke implementation of the alphabets. 16

2.3 Text entry mechanisms on touch-based mobile devices that are influenced by physical keypad layouts. 18
  (a) Text entry with QWERTY method identical to VoiceOver as presented in (Oliveira et al. 2011a).
  (b) Text entry in MultiTap as presented in (Oliveira et al. 2011a).
  (c) Text entry with No-Look Notes (Bonner et al. 2010). At left, group selection mechanism; at right, letter selection mechanism from the selected group.

2.4 Braille based text entry mechanisms for touch enabled mobile devices. 19
  (a) Entering texts with BrailleType as presented in (Oliveira et al. 2011b).
  (b) Entering texts using BrailleTouch as presented in (Frey et al. 2011).

3.1 Regions of a mobile phone shown by Paul et al. 30

3.2 The keypad conditions QWERTY and Traditional 4×3 used in the experiment. 31
  (a) QWERTY keypad condition (Test Case 1).
  (b) Traditional 4×3 keypad condition (Test Case 2).

3.3 Selection time of each participant in both keypad conditions, QWERTY and Traditional 4×3. 33

3.4 Error rate of each participant in both keypad conditions, QWERTY and Traditional 4×3. 34


3.5 Exploring different interaction modes and analyzing their suitability for accessing mobile phones, we found two prominent directions of solution required for people with blindness. We discuss those solution strategies in Chapter 4 and Chapter 5 in the remainder of the thesis. 37

4.1 Focus is at 'call by number' widget. 44

4.2 Keys required to access BlindGuide. 45

4.3 An interface with nine button widgets where focus is on R2C2. 46

4.4 Menu organization of BlindGuide. 46

4.5 Proposed interfaces of BlindGuide. 48
  (a) Screenshot for name entry.
  (b) Screenshot for calling by matched name.
  (c) Screenshot for calling by number.

4.6 Selection time of each participant in three keypad conditions: QWERTY, Traditional 4×3 (Ambiguous), and Hot Keys. 51

4.7 Error rate of each participant in three keypad conditions: QWERTY, Traditional 4×3 (Ambiguous), and Hot Keys. 52

4.8 Blind users with traditional MultiTap keypad. 55
  (a) A traditional 4×3 MultiTap text-entry keypad.
  (b) A typical user's finger movement on a traditional 4×3 MultiTap keypad.

4.9 EasyTap interaction mechanism. 57
  (a) Virtual interface with focus at group number 5.
  (b) 4-way navigation key and a selector.

4.10 Mapping of letters after selection of group number 7. 58

4.11 Text entry rate registered in all sessions with 2 techniques. 61

4.12 Error rate registered in all sessions with 2 techniques. 62

4.13 Menu organization for the Texting (Text Message Sending) activity. 66

4.14 Screenshot of options that appear to the users to enter a contact for the Text Message Sending (Texting) activity. 67

4.15 Screenshots of the contact entry mechanism for the Text Message Sending (Texting) activity. 68
  (a) Screenshot of entering the name phrase of the Text Message Sending (Texting) activity using EasyTap. A first-time selection of the right-bottom widget reads out the entered name phrase to be searched from the entire contact list; a consecutive second-time selection of this widget initiates the searching task and redirects the system to present the list of matched contacts.


  (b) Screenshot of selecting the contact name from the matched list that appears for the Text Message Sending (Texting) activity.

4.16 Screenshot of composing the texts for the message body of the Text Message Sending (Texting) activity. A first-time selection of the right-bottom widget reads out the composed texts for the body of the message; a consecutive second-time selection of this widget performs the sending task. 69

5.1 Different positions of a phone set as it appears to the participants. 82
  (a) Height dominant position.
  (b) Width dominant position.

5.2 Angle of tilt and time taken to perform simple directional movements, measured for the participants in the height dominant and width dominant orientations of the phone set. 83
  (a) Angle of tilt measured in the two device orientations.
  (b) Time taken to perform simple directional movements in the two device orientations.

5.3 Measurement procedures of angular movements in different device orientations. 84
  (a) Measurement of angles for the diagonal movements in the height dominant position of the device.
  (b) Measurement of angles for the diagonal movements in the width dominant position of the device, considering the directions similar to how they were measured in the height dominant device orientation.
  (c) Measurement of angles for the diagonal movements in the width dominant position of the device, considering the diagonal movements similar with respect to a fixed corner-zone, as in the height dominant device orientation.

5.4 A quick look at the movement times taken by the participants in the height dominant and width dominant orientations of the phone set given to them. 87
  (a) Time taken to perform diagonal movements in the two device orientations. H denotes the height dominant device position and W denotes the width dominant device position.
  (b) Time taken to perform simple directional movements versus diagonal movements in the height dominant device orientation.
  (c) Time taken to perform simple directional movements versus diagonal movements in the width dominant device orientation.


5.5 Comparison of performance for diagonal gestures used in the Initial Study and Study 2, in terms of average time taken per gesture and the number of misses that occurred during the experiments. 93
  (a) Average time taken per gesture in both experiments.
  (b) Number of misses that occurred during the entry of diagonal gestures in both experiments.

5.6 A quick look at the movement times taken by the participants in the height dominant and width dominant orientations of the phone set. 95
  (a) Time taken to perform simple directional flicks versus diagonal flicks in the height dominant device orientation.
  (b) Time taken to perform simple directional flicks versus diagonal flicks in the width dominant device orientation.
  (c) Average time taken to perform flicks in the height dominant and width dominant device orientations.

5.7 An example scenario of the two-step entry mechanism of VectorEntry. 'Step 1' describes group entry; 'Step 2' describes letter entry in the selected group. (a) shows the letter distribution in the telephonic keypad. (b) shows that a move towards left-down (the direction from key 3 to key 7 in (a)) selects group 'PQRS'. (c) a left flick after group selection selects 'P'. (d) an up flick after group selection selects 'Q'. (e) a right flick after group selection selects 'R'. (f) a down flick after group selection selects 'S'. 99

5.8 Two-step text entry mechanism in No-Look Notes. 101
  (a) Eight angular slices of No-Look Notes; each slice represents a set of characters.
  (b) Linear representation of characters in No-Look Notes after selection of a set.

5.9 Comparison between two touch based two-step text entry mechanisms, No-Look Notes and VectorEntry, in terms of entry speed and errors during entry. 106
  (a) Average of the text entry rates in wpm for all the participants with No-Look Notes and VectorEntry.
  (b) Day-by-day text entry rates averaged across the participants with No-Look Notes and VectorEntry.
  (c) Error rates in the study for all the participants with No-Look Notes and VectorEntry.


List of Tables

2.1 Summary of the existing work. . . . . . . . . . . . . . . . . . . . . . . . . 22

3.1 Different interaction modes. . . . . . . . . . . . . . . . . . . . . . . . . . . 26

4.1 Functions and related sub-functions of BlindGuide. . . . . . . . . . . . . . 46

4.2 Basic characterization of the participants. . . . . . . . . . . . . . . . . . . 59

4.3 Subjective ratings on EasyTap (ET) and MultiTap (MT) on a 5-point Likert scale. Strongly Agree = 5, Agree = 4, Neutral = 3, Disagree = 2, Strongly Disagree = 1, NA = Not Applicable. . . . . . . . 63

4.4 Functions and related sub-functions to perform the text message sending (Texting) task. . . . . . . . 65

4.5 Agreements settled by the participants on the query statements placed to test the Design Assumption. Agreements are on a 5-point Likert scale. Strongly Agree (SA) = 5, Agree (A) = 4, Neutral (N) = 3, Disagree (D) = 2, Strongly Disagree (SD) = 1. . . . . . . . 72

5.1 Distribution of the alphabet by groups and associated flicks. NA = Not Applicable. . . . . . . . 98

5.2 Basic characterization of the participants. . . . . . . . . . . . . . . . . . . 102

5.3 Summary of text entry rates and error rates for No-Look Notes and VectorEntry. . . . . . . . 104

5.4 Subjective ratings on No-Look Notes (NN) and VectorEntry (VE) on a 5-point Likert scale. Strongly Agree (SA) = 5, Agree (A) = 4, Neutral (N) = 3, Disagree (D) = 2, Strongly Disagree (SD) = 1. ∗ s. indicates the difference is significant and n.s. indicates the difference is not significant. . . . . . . . 107


5.5 Responses from the participants to the query statements for Design Assumption 1. Agreements are on a 5-point Likert scale. Strongly Agree (SA) = 5, Agree (A) = 4, Neutral (N) = 3, Disagree (D) = 2, Strongly Disagree (SD) = 1. . . . . . . . 109

5.6 Agreements settled by the participants on the query statements placed to test Design Assumption 3. Agreements are on a 5-point Likert scale. Strongly Agree (SA) = 5, Agree (A) = 4, Neutral (N) = 3, Disagree (D) = 2, Strongly Disagree (SD) = 1. . . . . . . . 111

5.7 Agreements settled by the participants on the query statements placed to test Design Assumption 4. Agreements are on a 5-point Likert scale. Strongly Agree (SA) = 5, Agree (A) = 4, Neutral (N) = 3, Disagree (D) = 2, Strongly Disagree (SD) = 1, NR = Not replied. . . . . . . . 112


Chapter 1

Introduction

The mobile phone is one of the most influential digital gadgets in human life. It keeps a person connected with the world around the globe. This portable, handheld communication device opens up many possibilities in our day-to-day lives. Besides communication tasks such as placing calls and sending short message service (SMS) messages, it is also used to store and manage important data such as personal notes and calendar entries. Text entry is required for most of these tasks.

Recently, touch-enabled mobile phones have entered the scene; they eliminate peripherals such as the keyboard and have become popular among the younger population [1], [2]. Touch-sensitive screens have boosted the popularity of the technology and are set to occupy a very large market share. Nevertheless, these advantages are still not accessible to people with visual impairment, because interaction with this essential device relies heavily on visual perception, both for issuing commands (input) and for accessing information (from the output screen). There is therefore a need to bridge the gap between mobile devices and people with visual impairment. This thesis takes up this issue as a research investigation and aims to develop interface mechanisms for people with visual impairment (VI) to access hand-held mobile devices. By people with visual impairment, we refer to people with blindness,


and we will use these two terms interchangeably throughout.

1.1 Accessing Hand-Held Mobile Devices by Blind People

As far as special needs are concerned, about 285 million people in the world are visually impaired; among them, roughly 39 million are blind [3]. Developing countries are the worst affected: about 90% of the world's visually impaired live there [3]. About 65% of visually impaired people are aged 50 years and above, although this age group comprises only about 20% of the world's population. Approximately 19 million children aged 15 years and below are visually impaired, and among them 1.4 million have lost their vision permanently [3]. They urgently require vision rehabilitation interventions for their psychological and personal development [3].

It is possible for blind people to perform tasks that require entering text, but this demands changes in the text entry mechanism. Several voice-based techniques have been introduced to add accessibility features to mobile devices for blind users. For example, Google introduced VoiceAction [4] for Android devices [5]; Apple introduced SIRI [6], a voice-enabled application on the iPhone 4s [7] and higher versions; and Apple's VoiceOver [8] provides audio feedback on touching a widget on the screen. Nuance TALK [9] is audio-based application software for blind people to access many functionalities of a feature phone. Intex Vision [10] is a product in the market launched particularly for people with visual impairment: in this phone set, Braille dots [11] are placed on the keys so that each key in the 4×3 telephonic keypad can be recognized.

Hardware-based solutions such as Braille keyboards and displays compromise the portability of the device, cannot be used in mobile scenarios, and are very expensive [12]. The sense of touch is strongly related to the accessing capabilities of blind people [13], and keypad-based systems can provide tactile differences between keys. The traditional 4×3 telephonic keypad is one way to achieve this, where three or four consecutive letters are


grouped under one key. In a mini-QWERTY keypad, each character is assigned to a single key, so the keypad contains more than 30 keys. To meet the demands of portable hand-held devices with such a large number of keys, manufacturers are compelled to make the keys small and leave very little space between them. This makes it hard to pick out the right key from the jumble of keys on the keypad in the absence of vision.
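The letter grouping just described is the basis of MultiTap entry. As a rough illustration (our own toy sketch, not any product's implementation), repeated presses of the same key cycle through the three or four letters grouped under it:

```python
# Toy sketch of MultiTap entry on a 4x3 telephonic keypad (illustrative only).
# Each digit key carries a group of letters; pressing a key n times in a row
# selects the n-th letter of its group, wrapping around.

KEY_GROUPS = {
    '2': 'abc', '3': 'def', '4': 'ghi', '5': 'jkl',
    '6': 'mno', '7': 'pqrs', '8': 'tuv', '9': 'wxyz',
}

def multitap_letter(key: str, presses: int) -> str:
    """Return the letter selected by pressing `key` `presses` times."""
    group = KEY_GROUPS[key]
    return group[(presses - 1) % len(group)]

# Example: spelling "cab" needs 3 + 1 + 2 = 6 presses on the same key.
word = multitap_letter('2', 3) + multitap_letter('2', 1) + multitap_letter('2', 2)
print(word)  # -> cab
```

This also makes the accessibility trade-off concrete: only 12 keys need to be located by touch, at the cost of multiple presses per letter.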

Several initiatives have been taken to make touch-screen-enabled mobile devices accessible to the blind community [14], [15], [16], [17], [18]. No-Look Notes [14] is a two-step, eyes-free text entry mechanism built for the touch-screen environment. Kane et al. [19] proposed different gestures on multi-touch-enabled devices and assigned different functionalities to those gestures. BrailleTouch [15], [20], [21] is a text entry mechanism based on the six-dot Braille [11] script; it needs six fingers to operate, each finger is assigned to a single dot, and different combinations of fingers represent different letters.

Nevertheless, in spite of these attempts, the initiatives have not achieved the ultimate goal; rather, each category of solution comes with its own limitations and drawbacks. We briefly report these facts and their consequences in Section 1.2.

1.2 Motivation of the Work

Although several attempts, mentioned earlier, have been made, they are not free from limitations. Voice input suffers from various situational impairments and diminishes the privacy of blind users. It also comes with practical challenges, such as use in noisy environments and handling the accents and intonations of individuals [20]. Touch-sensitive screens, on the other hand, cannot provide tactile differences between objects on the flat surface of the display, whereas the sense of touch is strongly related to the accessing capabilities of blind people [13]. So, there is scope to explore suitable interaction strategies that can help our target


users to access touch-screen mobile devices despite the absence of tactile differences between the on-screen objects. Again, as the tactile sense is a leading factor in accessing capability, interaction through physical keys may be an efficient technique for mobile text entry, letting our target users touch and feel a key among the jumble of keys on a keypad. Any keypad-based system, such as the 4×3 telephonic or mini-QWERTY keypad, can provide tactile differences between keys, but these keypads contain a large number of keys, which may impede reaching a particular key and in turn hamper accessibility. So, there is scope to find a better solution that utilizes tactile features with physical keys.

Motivated by these issues and limitations, our objective is to investigate suitable interaction modes for blind people to access mobile phones. Our main aim is to develop a system that builds on users' capabilities, keeping their inabilities in mind, so that they can access mobile phones in an easy and interactive manner.

1.3 Research Objectives

Motivated by the limitations found in the existing access mechanisms and related text entry techniques developed for visually impaired people, in this thesis we set out to explore effective interaction procedures for such people to access mobile devices and, based on these, to design suitable keypad-based and gesture-based text entry techniques that can enable many basic functionalities of a mobile phone. More precisely, our research objectives in this dissertation are the following.

1. Design easy interaction

• To explore an effective interaction mode for blind mobile users.

• To design interaction that provides more tactile sensitivity, through which blind users can easily access a mobile phone device.

• To explore interaction on the touch-based screen, considering the fact that the


trend of future mobile phone interaction is indeed moving towards touch-based devices.

2. Design interface with tactile features

• To introduce a text entry mechanism that operates through interaction with enhanced tactile features.

• As text entry enables most of the basic functionalities of a mobile phone, to choose this task and build an interface for it that can be handled via our proposed easy interaction mechanism with only a few physical keys.

• To show that the proposed entry mechanism is faster, more accurate, and more user-friendly than the existing state of the art.

• To design a basic use case scenario that shows how the basic phone functionalities can be performed using our proposed entry mechanism.

3. Design interface for touch screen devices

• To explore the competency of our target users on touch-screen-based mobile devices and, accordingly, to propose a suitable way of interaction for them.

• To design a user-friendly mobile text entry mechanism for blind people on touch-enabled mobile devices, aiming to make the touch screen accessible to them.

1.4 Proposed Methodologies

Here, we summarize our main findings on providing suitable interaction, in line with current trends, for our target users to access mobile devices, followed by two efficient text entry techniques: one assisted by enhanced tactile features, and the other using directional gestures on touch-screen devices.


1.4.1 Exploring Suitable Interaction Modes

This work aims to find an effective interaction mode for visually impaired people in India to access mobile phones. We discuss three types of interaction procedures, namely voice-based interaction, touch- and gesture-based interaction for touch-screen devices, and tactile-feature-based interaction for keypad-based devices, and explore how each can provide better accessibility. We discuss the pros and cons of all three modes. Considering the current trend and the future direction of technological advancement, we acknowledge the need for an easy and efficient mechanism to make touch-screen-based devices accessible to our target users, which we discuss at the end of this section. First, however, we conduct an in-depth exploration of the interaction issues present in popular keypad-based handsets, despite our target users' familiarity with key-based interaction.

In a keypad-based system, differences between the keys can be perceived by touch and feel, which helps users detect and reach a particular key on the keypad. However, too many keys fitted into a small keypad area defeats this procedure.

We explored how the detection time for a particular key in a jumble of keys varies across keypad conditions when target users search for a key through the tactile sense alone. We considered two popular keypad conditions, the QWERTY condition and the traditional 4×3 keypad condition (also known as the ambiguous keypad or MultiTap system; we use these names interchangeably), to see how detection time and accuracy vary between them. We conducted an experiment with eight expert participants in which we measured the average time to reach a particular key and the accuracy of detecting it in both keypad conditions.

The results show that detection of a particular key is significantly faster on a keypad with relatively fewer keys. The average detection time was higher for all participants in the QWERTY keypad condition,


which contains more keys. Detection of a particular key was also far less error-prone on the keypad with fewer keys, that is, the traditional 4×3 keypad.

Next, we discuss the current boom in touch-enabled interaction technology and the issues it brings for the blind community. We recognize the immediate need for good solutions to protect our target community from the digital divide created by this recent change in interaction technology.

Finally, we establish the immediate need for two prominent directions of solutions for people with blindness to access hand-held mobile devices. The first is a keypad-based solution with tactile features, and the second is making the touch-enabled screen accessible to our target users.

1.4.2 Easy and Efficient Keypad-Based Text Entry Mechanism

In line with our objectives, we find a way to implement basic functionalities using fewer keys, which provides better tactile features compared to the traditional 4×3 telephonic keypad condition.

Our target users need an easy mobile text entry mechanism designed around their capabilities while working around their inabilities. More precisely, our goal is a mobile text entry mechanism that utilizes their usage habits and mental map of the letter distribution on the keypad, yet avoids movements along long paths guided only by the tactile sense. In addition, users should be informed about a character before accepting it, so that they hesitate less and get more control over the system.

To fulfill these requirements, we propose a text entry technique named EasyTap, built on our target users' capabilities while keeping their inabilities in mind, to meet the needs identified above. The key idea behind our approach is to use the popular character distribution map of the traditional 4×3 telephone keypad. We also want to avoid the need to traverse a long path to reach a key on


the keypad, since lengthy traversal of a keypad was identified as one of our target users' inabilities.

We conducted a user study with five participants to evaluate our prototype system EasyTap in terms of text entry rate, errors during entry, users' control over the system, and ease of use. The entry rate was higher and errors were significantly fewer with EasyTap for all participants across all six days of testing, compared to the traditional system. We also conducted a subjective evaluation; according to the scores given, participants were more confident and felt more control, with less hesitation, when using EasyTap rather than the traditional system. Finally, we present a basic example prototype of a Text Message Sending (Texting) use case scenario, through which we show how basic phone functionality can be implemented using our proposed entry mechanism.

1.4.3 Gesture-Based Text Entry for Touch Screen Devices

At the beginning of this work, we conducted an initial study with five blind participants to better understand their potential to perform directional movements in the eight basic directions on the flat, touch-enabled display of a mobile device, guided by spatial landmarks such as the edges and corners of the device. Although they preferred gestures guided by landmarks, the diagonal moves were troublesome: there was no assistance along the path to help them accurately reach a fixed corner location starting from another corner zone. They had to scrawl along this path relying only on their own spatial perception, which included their perception of the screen size and of directional and distance changes while scrawling freely on the screen. This suggested a possibility: if they could perform short directional moves accurately without the assistance of landmarks like edges or corners, those gestures would demand less location accuracy to reach a particular zone or edge and would not require scrawling along a long, unassisted path, which in turn might lower the time to complete a gesture. In our second study, we


explored this possibility.

Through this study, we wanted to gauge the participants' proficiency at the eight basic directional moves. The final aim was to answer the following: without the help of edges, corners, and other landmarks, would blind people prefer to perform short directional moves (flicks in the eight basic directions)? Could they perform them with accuracy similar to the earlier study? If so, which approach would be more time-efficient? To answer these questions, we conducted our second study.

In study 2, the directional flick gestures followed the strategy of think once, move once, which was simpler in nature and reduced the demand for spatial perception along the scrawling path. It also greatly reduced the demand for location accuracy, as the directional flicks could be performed anywhere on the active screen area of the device. In addition, the directional flicks in this study took significantly less time to complete than the gestures in the initial study, and demanded less perception of the screen size. Hence, having found the directional flick gestures of study 2 to be an efficient gesture set, in the next section we propose a text entry mechanism that uses this gesture set and reproduces the knowledge of using the 4×3 telephone keypad.
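One reason such flicks are location-independent is that they can be classified from just the start and end touch points. The sketch below is our own illustration of the idea, not the study software: it snaps the movement angle to the nearest of eight 45° sectors. (On most touch screens the y axis grows downward; here we assume mathematical y-up coordinates for simplicity.)

```python
import math

# Eight basic directions, ordered counter-clockwise from "right" (0 degrees).
# Assumes y grows upward; flip the sign of dy for typical screen coordinates.
DIRECTIONS = ['right', 'up-right', 'up', 'up-left',
              'left', 'down-left', 'down', 'down-right']

def classify_flick(x0, y0, x1, y1):
    """Snap the start->end movement to the nearest of 8 directions."""
    angle = math.degrees(math.atan2(y1 - y0, x1 - x0)) % 360.0
    sector = int((angle + 22.5) // 45) % 8  # each sector spans 45 degrees
    return DIRECTIONS[sector]

print(classify_flick(100, 100, 180, 100))  # -> right
print(classify_flick(100, 100, 40, 160))   # -> up-left
```

Note that only the direction matters, not where on the screen the flick starts, which is what removes the demand for location accuracy.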

We propose VectorEntry, a text entry mechanism with touch-based interaction on mobile devices for people with visual impairment. The core idea behind this approach is to reproduce or reuse the popular character distribution of the 4×3 telephone keypad, accessed through an efficient set of directional flick gestures.
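The two-step idea can be sketched in Python. This is an assumption-laden toy based on the description and the 'PQRS' example of Figure 5.7, not the actual VectorEntry implementation: the first flick selects a letter group by that key's direction from the central '5' key of the 4×3 layout, and a second flick (left, up, right, or down) picks a letter within the group. The within-group ordering, and the handling of the central 'JKL' group (key 5 itself, which has no direction from the centre and would need some separate action), are our own illustrative assumptions.

```python
# Illustrative sketch of two-step, flick-based entry (not the thesis code).
# Step 1: a flick in the direction of a key, as seen from the central '5'
# key, selects that key's letter group. Step 2: left/up/right/down selects
# the 1st/2nd/3rd/4th letter of the group (assumed order; 'PQRS' example
# from Figure 5.7). The central 'jkl' group is omitted from this toy.

GROUP_BY_DIRECTION = {
    'up': 'abc', 'up-right': 'def',
    'left': 'ghi', 'right': 'mno',
    'down-left': 'pqrs', 'down': 'tuv', 'down-right': 'wxyz',
}
LETTER_BY_FLICK = ['left', 'up', 'right', 'down']  # assumed within-group order

def vector_entry(group_flick: str, letter_flick: str) -> str:
    group = GROUP_BY_DIRECTION[group_flick]
    index = LETTER_BY_FLICK.index(letter_flick)
    if index >= len(group):
        raise ValueError('no letter on that flick for this group')
    return group[index]

print(vector_entry('down-left', 'left'))  # -> p  (Fig. 5.7: 'P')
print(vector_entry('down-left', 'down'))  # -> s  (Fig. 5.7: 'S')
```

The point of the design is visible in the table itself: every lookup reuses the spatial layout of the telephone keypad that the user already knows.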

Based on the observations from the user studies, we present a text entry mechanism for blind people on touch-based mobile devices that imports their knowledge of the traditional 4×3 telephone keypad to the touch-based device. The participants were able to transfer their existing skill in typing on the alphanumeric keypad when using our newly proposed entry mechanism on a touch-screen device. We report the text entry rate, the error rate, the improvement over No-Look Notes [14], and the lessons learned in the design process. We conducted a study with eight participants to compare its performance with


No-Look Notes in terms of text entry rate, errors during entry, users' control, and ease of use for our target users. We found an 83.3% improvement in the entry rate in favor of VectorEntry. The mean error rate was slightly higher with VectorEntry, but this difference was not significant (p = 0.15) according to a paired t-test. The scores given by the participants in the subjective evaluation show the acceptance of VectorEntry as an easily interactive and friendly text entry mechanism on touch-based mobile devices for our target users. Finally, we present a basic use case scenario to show how basic phone functionalities can be performed using this text entry mechanism.
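Entry rates like those above are conventionally reported in words per minute (wpm), where a "word" is defined as five characters including spaces. The sketch below shows this standard computation plus a deliberately naive character-wise error rate; published studies typically use edit-distance-based error measures instead, and both function names here are our own illustrative choices.

```python
# Standard text-entry speed metric: (characters / 5) words per minute.

def words_per_minute(transcribed: str, seconds: float) -> float:
    return (len(transcribed) / 5.0) / (seconds / 60.0)

def error_rate(presented: str, transcribed: str) -> float:
    """Naive character-level error rate: mismatches / presented length.

    Real studies usually use minimum-string-distance (edit distance)
    based rates; this positional comparison is only for illustration.
    """
    wrong = sum(1 for a, b in zip(presented, transcribed) if a != b)
    wrong += abs(len(presented) - len(transcribed))  # length mismatch counts
    return wrong / len(presented)

print(round(words_per_minute('hello world', 30.0), 1))  # 11 chars in 30 s -> 4.4
print(error_rate('hello', 'hxllo'))                     # -> 0.2
```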

1.5 Contributions of the Thesis

We summarize the major contributions of our thesis in the following.

• Exploring suitable modes of interaction: We provide an in-depth report that analyzes the pros and cons of the various existing modes of interaction used by blind people to access mobile devices. We inspect their usage habits, interaction suitability, and so on, and accordingly specify their requirements. We propose a suitable keypad condition that benefits them in different respects. We present an example prototype of a Calling use case that explains how basic phone functionality can be handled using the proposed keypad condition.

• Keypad-Based Solution: We present a suitable keypad-based interaction mechanism with enhanced tactile features for blind users. We propose a fast and less error-prone keypad-based text entry mechanism for blind people that utilizes our target users' experience with the 4×3 telephone keypad while alleviating the accessibility issues present in the telephonic keypad. We design prototype use cases to show how our proposed interaction mechanism can be utilized to implement basic phone functionalities.


• Gesture-Based Solution: We explore directional flick gestures as a potential input technique for blind users to access touch-screen-based mobile devices. We design a user interface for blind people to enter text on touch-based mobile devices utilizing their knowledge of the traditional alphanumeric 4×3 telephone keypad.

1.6 Organization of the Thesis

Chapter 1: This chapter provides the background, motivation, objectives, contributions, and organization of the thesis.

Chapter 2: In this chapter, we survey the state of the art in the field. We report major initiatives that enhance accessibility and provide assistive support to blind users of mobile phone sets. A detailed literature survey of text entry mechanisms, the key task enabling all the basic phone functionalities for our target users, is also included.

Chapter 3: In this chapter, we explore different interaction modes and their potential for blind mobile phone users in order to decide on suitable interaction mechanisms for them.

Chapter 4: In this chapter, we propose an efficient keypad-based text entry mechanism with enhanced tactile features for blind users, and we describe how the basic phone functionalities can be implemented using this proposed entry mechanism.

Chapter 5: In this chapter, we propose an efficient way for users with blindness to interact with touch-enabled mobile devices. We design an efficient directional-flick-gesture-based text entry mechanism for them


on touch-enabled mobile devices.

Chapter 6: In this chapter, we summarize the findings of this research as conclusions and discuss the future scope of this work.


Chapter 2

State of the Art

Of late, a number of research works have been initiated on accessibility support for blind people. These works address different challenges from various directions and provide assistive support and opportunities [22], [23]. In this chapter, we present a brief survey related to our work, more precisely, text entry techniques on mobile devices for people with visual impairment. These works can broadly be classified into four categories: a) tactile-feature-based solutions, b) voice-based interaction, c) gesture-based solutions, and d) transformed-physical-keypad-design-based solutions. The major works in these categories are briefly surveyed in the following.

2.1 Tactile Feature-Based Solutions

Several initiatives have been taken to support tactile-based interaction with mobile devices (see Fig. 2.1). The tactile feel of a mobile phone's controls plays an important role in accessing the device. With this aim, mobile phone sets with physical keypads and audio feedback support have been widely explored. Both the QWERTY layout and the 4×3 telephonic layout are provided with a tactile identifier to assist blind and visually impaired people. The marked key can be identified in the jumble by the sense of touch, which in turn helps identify the other keys by realizing the associated


relative placement of the surrounding keys on the keypad. These identifiers are placed either on the 'F', 'K', or 'G' key in the QWERTY layout, and on the '5' key in the telephonic keypad. The dimensions and placement of the raised identifier on a key are also recommended by the International Telecommunication Union (ITU) [24], so that the tactile identifier unambiguously identifies the marked key. The possibility of hardware Braille keyboards attached to a phone set has also been investigated. In a Braille keyboard, each key represents a dot in Braille, and thus a particular combination of keys is equivalent to a combination of dots, which in turn represents a character. ALVA introduced the Mobile Phone Organizer (MPO) [25], based on a Braille keyboard, which combined a personal digital assistant with telephonic activities. It provides calling, contact management, and SMS support with a personal note editing facility. This product features eight Braille input keys with a space bar and two menu keys, plus a 20-cell refreshable Braille display with synthetic speech output. Intex Vision [10] is another product in the market launched particularly for people with visual impairment. Braille dots [11] are placed on the keys to provide tactile support that helps recognize each key in the telephonic 4×3 keypad setup.
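The dot-combination principle behind such Braille keyboards can be illustrated with a small sketch. This is our own toy, assuming standard six-dot literary Braille for a few letters (dots 1–3 run down the left column of the cell, dots 4–6 down the right): each chord of simultaneously pressed keys corresponds to a set of dots, which maps to one character.

```python
# Toy mapping from six-dot Braille chords to letters. Standard literary
# Braille for a-e is shown; a real keyboard covers the full alphabet.
BRAILLE = {
    frozenset({1}): 'a',
    frozenset({1, 2}): 'b',
    frozenset({1, 4}): 'c',
    frozenset({1, 4, 5}): 'd',
    frozenset({1, 5}): 'e',
}

def decode_chord(pressed_keys) -> str:
    """Decode a simultaneous key chord (one key per dot) into a letter."""
    return BRAILLE.get(frozenset(pressed_keys), '?')

print(decode_chord([1, 4]))     # -> c
print(decode_chord([1, 4, 5]))  # -> d
```

Using a frozenset makes key order irrelevant, matching the fact that the dots of a Braille cell are pressed together as one chord.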

2.2 Voice-Based Interaction

Several voice-based solutions are also known to provide accessibility features on mobile devices for blind users. For example, Google introduced VoiceAction [4] for Android devices [5]; Apple introduced SIRI [6], a voice-enabled application on the iPhone 4s [7] and higher; and Apple's VoiceOver [8] provides audio feedback on touching a widget on the screen. Nuance TALK [9] is audio-based application software for blind people to access many functionalities of a feature phone. Azenkot et al. have explored the scope and future challenges of speech input by blind people on mobile devices [26].


(a) Physical QWERTY keypad layout with tactile identifier at the 'G' key.

(b) Traditional 4×3 telephonic keypad with tactile identifier at the '5' key.

(c) ALVA's Mobile Phone Organizer (MPO) (Denham 2004) that uses Braille keys.

(d) Braille dots are placed on the physical keys in the Intex Vision phone set (mobigyaan.com).

Figure 2.1: Physical keypad based solutions with the presence of tactile features.

2.3 Gesture-Based Solutions

With the rapid advancement of touch-screen technology and its growing popularity, a number of gesture-based interaction techniques, including several text-composing mechanisms, have been introduced. Gestures resembling the glyphs of the letters drawn on the screen have also been reported. Graffiti [27], [28] and Unistrokes [29] are two such text entry systems [30] (see Fig. 2.2). The Roman letters are reshaped to allow a single-stroke implementation. Single-stroke means a continuous gesture that can be performed


2. State of the Art

Figure 2.2: Modified glyph set for single-stroke implementation of the alphabets.

in one action. Users perform such a continuous gesture with a stylus or finger on the touch-screen surface of the device. The stroke is considered to begin with the first touch point detected on the screen and to end when the touch is released. On completion of the gesture, the system detects the matching letter and enters it. SlideRule [19] is another solution, proposed by Kane et al., to make touch screens accessible to blind people. They proposed gestures such as One-finger Scan, Second-finger Tap Selection, Flick, L-Select, and Double-Tap, explored them on multi-touch enabled devices with blind users, and then assigned different functionalities to those gestures to perform several activities.
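A unistroke-style recognizer of this kind can be approximated by resampling the drawn stroke and comparing it against stored templates. The sketch below follows the general template-matching idea behind simple stroke recognizers; the two-letter templates are hypothetical stand-ins, not the actual Graffiti or Unistrokes alphabets.

```python
import math

def resample(points, n=16):
    """Resample a stroke to n points evenly spaced along its path length."""
    pts = [tuple(p) for p in points]
    total = sum(math.dist(pts[i], pts[i + 1]) for i in range(len(pts) - 1))
    step = total / (n - 1)
    out, acc, i = [pts[0]], 0.0, 0
    while len(out) < n - 1 and i < len(pts) - 1:
        d = math.dist(pts[i], pts[i + 1])
        if acc + d >= step and d > 0:
            t = (step - acc) / d
            q = (pts[i][0] + t * (pts[i + 1][0] - pts[i][0]),
                 pts[i][1] + t * (pts[i + 1][1] - pts[i][1]))
            out.append(q)
            pts[i] = q      # keep walking from the interpolated point
            acc = 0.0
        else:
            acc += d
            i += 1
    while len(out) < n:     # guard against float rounding at the stroke end
        out.append(pts[-1])
    return out

def normalize(points):
    """Translate to the origin and scale the bounding box to the unit square."""
    xs, ys = [p[0] for p in points], [p[1] for p in points]
    w, h = (max(xs) - min(xs)) or 1.0, (max(ys) - min(ys)) or 1.0
    return [((x - min(xs)) / w, (y - min(ys)) / h) for x, y in points]

def recognize(stroke, templates, n=16):
    """Return the template name with the smallest mean point-to-point distance."""
    s = normalize(resample(stroke, n))
    best, best_d = None, float("inf")
    for name, tpl in templates.items():
        t = normalize(resample(tpl, n))
        d = sum(math.dist(p, q) for p, q in zip(s, t)) / n
        if d < best_d:
            best, best_d = name, d
    return best

# Hypothetical templates: a vertical bar for 'i', an L-shape for 'l'.
TEMPLATES = {"i": [(0, 0), (0, 10)], "l": [(0, 0), (0, 10), (10, 10)]}
print(recognize([(1, 0), (1, 9), (9, 10)], TEMPLATES))  # l
```

On completion of the stroke (touch release), the best-matching template gives the letter to enter, mirroring the begin-on-touch, end-on-release model described above.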

2.4 Transformed Physical Keypad Design Based Solutions

The popular QWERTY physical keypad layout has been carried over to touch-screen devices, assisted with voice feedback and scrawling-gesture support (Fig. 2.3a). Apple [8] introduced the VoiceOver app, in which an on-screen QWERTY keypad is presented with voice feedback support: it speaks the item currently being touched. When a user slides her finger on the screen from one object to another, sound feedback is given at every focus change. On hearing the sound of the desired letter, the user taps a second finger anywhere on the screen to enter it. A similar scrawling and voice-feedback based solution has also been proposed for the layout of the traditional 4×3 telephonic keypad [17], [16] (Fig. 2.3b). In this type of soft keyboard, three or more


letters are grouped under one button. The system provides voice feedback for the group currently being touched. As before, when the user slides a finger on the screen from one object to another, sound feedback is given at every focus change. On hearing the desired group containing the target letter, the user taps a second finger anywhere on the screen multiple times, according to the sequential position of the target character within the group, to enter the letter. The letter distribution of the traditional 4×3 telephonic keypad is also used by Bonner et al. [14], but in a different manner (Fig. 2.3c). In their No-Look Notes [14], the screen interaction area is partitioned into 8 segments, each representing a group of letters following the telephonic keypad's letter distribution. Audio feedback is given when a user touches the corresponding segment of the screen, and a second-finger tap selects that group. After a group is selected, its letters appear in a linear top-to-bottom layout on the screen, and the user selects the target letter in the same manner.
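The group-then-tap-count selection used by these 4×3 layouts can be sketched as follows. The letter grouping is the standard telephonic keypad assignment; the `select_letter` helper itself is a hypothetical illustration, not code from the cited systems.

```python
# Letter groups of the traditional 4x3 telephonic keypad.
KEY_GROUPS = {
    "2": "abc", "3": "def", "4": "ghi", "5": "jkl",
    "6": "mno", "7": "pqrs", "8": "tuv", "9": "wxyz",
}

def select_letter(group_key, tap_count):
    """Return the letter chosen by tapping a second finger `tap_count`
    times after voice feedback announces the group under the finger."""
    letters = KEY_GROUPS[group_key]
    return letters[(tap_count - 1) % len(letters)]

print(select_letter("5", 2))  # k
print(select_letter("7", 4))  # s
```

The modulo wrap-around lets a user who overshoots the target position keep tapping until the desired letter comes around again, a common convenience in multi-tap designs.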

2.5 Transformed Braille Keypad Based Solutions

Researchers have also tried to transfer the notion of Braille onto touch-screen devices (Fig. 2.4). Oliveira et al. presented BrailleType [18] (Fig. 2.4a), where the six dots of Braille are presented as six cells on the device screen. The Braille cells are large and mapped to the corners and edges of the screen so that they are spatially easy to find. To enter a letter, users select all the Braille cells that compose it and then confirm the entry: the user touches a desired cell and waits for an audio confirmation cue for the cell to be selected, and after selecting the required cells, a double tap anywhere on the screen enters the corresponding letter. An empty Braille matrix is entered for a space. Milne et al. presented game-based learning procedures to promote Braille literacy in BraillePlay [31] and VBGhost [32]. DigiTaps, introduced by Azenkot et al., is an eyes-free number entry method for touch-screen devices that requires little auditory attention [33]. BrailleTouch [15], [20], [21]


(a) Text entry with the QWERTY method, identical to VoiceOver, as presented in (Oliveira et al. 2011a).

(b) Text entry in MultiTap, as presented in (Oliveira et al. 2011a).

(c) Text entry with No-Look Notes (Bonner et al. 2010). At left, the group selection mechanism; at right, the letter selection mechanism within the selected group.

Figure 2.3: Text entry mechanisms on touch-based mobile devices that are influenced by physical keypad layouts.

is another recently introduced text-entry mechanism, based on the six-dot Braille [11] script (Fig. 2.4b). It needs six fingers to operate: each finger is assigned a single dot, and different combinations of fingers are used to represent


different letters. Users hold the device with the display facing away from them, gripping it with both hands so that their fingers rest on the screen. The combination of fingertips touching the screen represents the corresponding letter, and that letter is entered by releasing the fingertips together from the screen.
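Release-together chord entry of this kind can be modelled by accumulating the set of touched dots and decoding the chord on release. This is a minimal sketch, not BrailleTouch's actual implementation; the `ChordEntry` class and its tiny letter table are hypothetical, though the dot codes shown are standard Braille.

```python
# Dots 1-6 of six-dot Braille; only three letters shown for illustration.
BRAILLE = {
    frozenset({1}): "a",
    frozenset({1, 2}): "b",
    frozenset({2, 3, 4, 5}): "t",
}

class ChordEntry:
    def __init__(self):
        self.down = set()   # dots whose fingers currently touch the screen
        self.text = []

    def touch_down(self, dot):
        self.down.add(dot)

    def release_all(self):
        # Releasing all fingertips together commits the chord as one letter.
        if self.down:
            self.text.append(BRAILLE.get(frozenset(self.down), "?"))
            self.down.clear()

entry = ChordEntry()
for dot in (2, 3, 4, 5):
    entry.touch_down(dot)
entry.release_all()
print("".join(entry.text))  # t
```

Because only the set of simultaneously held fingers matters, the order in which the fingers land is irrelevant, which suits eyes-free two-handed operation.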

(a) Entering text with BrailleType, as presented in (Oliveira et al. 2011b).

(b) Entering text using BrailleTouch, as presented in (Frey et al. 2011).

Figure 2.4: Braille based text entry mechanisms for touch-enabled mobile devices.

2.6 Miscellaneous Work

Pirhonen et al. discussed the use of gesture and non-speech audio to improve the user interface of a mobile music player [34]. The key advantage is that users can operate the player without looking at its controls while on the move. Two different evaluations of the player took place: one based on a standard usability experiment (comparing the new player to a standard design) and the other a video analysis of the


player in use. Both evaluations reveal significant usability improvements for the gesture/audio-based interface over a standard visual/pen-based display. The authors also discussed the similarities and differences between the results produced by the two studies.

Direct-touch interaction on mobile phones revolves around screens that compete for visual attention with users' real-world tasks and activities. Bragdon et al. investigated the impact of these situational impairments on touch-screen interaction [35]. They probed several design factors for touch-screen gestures under various levels of environmental demand on attention, in comparison to the status-quo approach of soft buttons. They observed that in the presence of environmental distractions, gestures can offer significant performance gains and reduced attentional load, while performing as well as soft buttons when the user's attention is focused on the phone. In fact, the speed and accuracy of bezel gestures did not appear to be significantly affected by the environment, and some gestures could be articulated eyes-free, with one hand. Bezel-initiated gestures offered the fastest performance, and mark-based gestures were the most accurate. Bezel-initiated marks therefore may offer a promising approach for mobile touch-screen interaction that is less demanding of the user's attention.

Azenkot et al. introduced a robust non-visual authentication method named PassChord for blind smart-phone users on touch surfaces [36]. Users enter a PassChord by tapping several times on a touch surface with one or more fingers. Hara et al. introduced a scalable method for collecting bus stop locations and landmark descriptions by combining online crowdsourcing and Google Street View (GSV) [37].
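A PassChord-style passcode can be represented as the ordered sequence of finger counts across taps, with authentication reduced to sequence comparison. This is a hedged sketch of the idea only, not Azenkot et al.'s implementation; the `authenticate` helper is hypothetical.

```python
def authenticate(entered_taps, stored_passchord):
    """Each element is the number of fingers in one tap; the passcode is
    the ordered sequence of those counts, so both order and counts must match."""
    return list(entered_taps) == list(stored_passchord)

# e.g. a two-finger tap, then a one-finger tap, then a three-finger tap
stored = [2, 1, 3]
print(authenticate([2, 1, 3], stored))  # True
print(authenticate([1, 2, 3], stored))  # False
```

Because the secret is a pattern of finger counts rather than on-screen target positions, it can be entered without visual feedback and is resistant to shoulder-surfing of screen locations.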

Yfantidis et al. described a new technique for touch-screen interaction based on a single gesture-driven adaptive software button [38]. The button is intended to substitute for the software keyboard and provides text-entry functionality. Input is accomplished through recognition of finger gestures comprising movement in the eight basic directions, starting from any position. The target user group of this interaction technique is primarily blind people, who benefit significantly. The


adaptability of the button provides complementary help and follows the style of interaction in a natural way. Analysis of the results, collected from twelve blindfolded subjects, revealed an encouraging tendency: during blind manipulation on the touch screen, three of the subjects achieved a maximal typing speed of about 12 wpm after five trials. This suggests that the technique is reliable and robust enough to be applied to diverse application platforms, including personal digital assistants.
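Classifying a free-position flick into one of the eight basic directions reduces to binning the movement angle of the touch vector. A minimal sketch; the direction labels and the `direction` helper are my own, not from [38].

```python
import math

DIRS = ["E", "NE", "N", "NW", "W", "SW", "S", "SE"]

def direction(start, end):
    """Bin the movement vector into one of eight 45-degree sectors.
    Screen y grows downward, so dy is negated to keep N pointing up."""
    dx, dy = end[0] - start[0], start[1] - end[1]
    angle = math.degrees(math.atan2(dy, dx)) % 360
    return DIRS[int((angle + 22.5) // 45) % 8]

print(direction((0, 0), (10, 0)))    # E
print(direction((0, 0), (10, -10)))  # NE (up and to the right on screen)
```

Since only the angle of the vector matters, the gesture can start anywhere on the screen, which is exactly what makes the technique position-independent and usable without vision.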

Mobile text entry methods have traditionally been designed under the assumption that users can devote full visual and mental attention to the device, though this is not always possible. Banovic et al. presented an iterative design and evaluation of Escape-Keyboard, a sight-free text entry method for mobile touch-screen devices [39]. Escape-Keyboard allows the user to type letters with one hand by pressing the thumb on different areas of the screen and performing a flick gesture. They examined the performance of Escape-Keyboard in a study of 16 sessions in which participants typed in sighted and sight-free conditions. Qualitative results from this study highlight the importance of reducing the mental load of using Escape-Keyboard to improve user performance over time, and the authors explored features to mitigate this learnability issue. Finally, they investigated the upper bound on sight-free performance with Escape-Keyboard through a theoretical analysis of expert peak performance.

2.7 Summary

To conclude, we present a brief summary of the existing state of the art covering the notable initiatives taken by researchers to provide enhanced accessibility support for mobile phone access by people with visual impairment. The different categories of solutions, and the notable work under each category, are summarized in Table 2.1.


Category of solution | Name of the initiative | Solution process | Effectiveness in use

Tactile feature based:
• QWERTY layout used in mobile phones — Physical keys accessed by tactile sense. — Keys can be perceived by touch, but the presence of many keys hampers the finding process.
• 4×3 telephonic layout used in mobile phones — Physical keys accessed by tactile sense. — Keys can be perceived by touch, but the presence of many keys hampers the finding process.
• ALVA's Mobile Phone Organizer — Physical keys equivalent to Braille dots, with some functional keys. — Keys can be perceived by touch easily, but the size of the device is not portable.
• Tactile identifier on special keys — Dots placed on special keys of the mobile device's keypad, according to the ITU rule. — Special keys can be identified at a chance; provides more control in use.
• Intex Vision 4×3 layout — Braille tactile identifier on the physical keys. — Specially designed for VI users; the jumble of dots on all keys sometimes causes confusion.

Voice based interaction:
• Apple's VoiceOver — On-screen QWERTY layout with voice feedback. — Keeping the touch ellipse of the input finger on the small button area of the touch-enabled screen is difficult.
• Google's VoiceAction — Accepts pre-defined voice commands. — Started by a button tap on the touch-enabled screen; finding that control is troublesome.
• Nuance TALK — Gives sound feedback for all the actions. — Widely used, but no hints of any control before making it active.

Gesture based solution to access the touch screen:
• Unistrokes — The glyph of each character has been changed to make it a single-stroke curve. — Realizing the transformed glyph of every character without vision is difficult.
• Graffiti — The glyph of each character has been changed to make it a single-stroke curve, keeping similarity with the original shape. — Keeping the shape in mind without any visual memory is difficult.
• SlideRule — Many familiar types of gestures on the touch screen have been proposed. — Requires control of the touch finger to perform the gestures, but satisfactory performance was traced in a controlled environment.

Transformed physical keypad based:
• Touch versions of QWERTY and traditional 4×3 layouts — Both QWERTY and the traditional 4×3 layout are transformed to touch versions with sound feedback and multi-touch features; the required character is found by scrawling on the touch screen and the desired one is selected by a second-finger tap one or more times. — Exhaustive searching on the flat display screen without vision.
• No-Look Notes — The screen region is divided into 8 angular segments, each representing a group of characters; a group is selected by a second-finger tap on the screen, then the characters appear in a linear way and can be similarly selected. — Evaluation of this technique produced satisfactory results.

Braille keypad based solution:
• BrailleType — The region of the screen adjacent to the longer edge is divided into blocks, one block assigned to one dot of Braille; users have to select blocks and confirm at the mid-screen. — Needs a high level of positional accuracy to access, which is difficult without vision.
• BrailleTouch — Requires six fingers together to use; each finger is mapped to a Braille dot, and the combination of input fingers released together from the screen selects the associated character. — Experiments are ongoing on this work; available for MAC devices; users have to follow a particular device-holding pattern to access it.

Table 2.1: Summary of the existing work.


Chapter 3

Interaction Mode Exploration

Interpersonal communication through aids such as mobile phones is also possible for people with blindness, but it requires several changes in interaction mechanisms. In the previous chapter, we discussed the existing techniques for providing assistive support to blind users of mobile phone sets. Although several techniques have been reported, these approaches are not necessarily free from limitations. In this chapter, we first summarize our findings on the different interaction mechanisms, with their benefits and shortcomings, and then discuss the possibilities for further exploration. Our objective in this chapter is thus to explore the suitability of different interaction mechanisms for accessing mobile phones by our target users. To this end, we first analyze the state-of-the-art mobile interaction mechanisms in terms of their benefits and shortcomings, pertinent to the usability issues faced by our target community. We then conduct a usability study to understand users' competence with the existing designs and their usage habits. Finally, we summarize the chapter, clearly indicating two directions for solutions.


3.1 Introduction

The existing solutions for blind people to interact with mobile phones can be broadly divided into three categories: (i) voice-based interaction, (ii) touch-based interaction, and (iii) keypad-based interaction. Nevertheless, all these solutions have their own benefits and shortcomings, which are highlighted in the following.

Interaction through audio is one technique already used in many products. However, interaction through voice suffers from some inherent drawbacks, such as poor performance in noisy environments and sensitivity to individual accents and intonations [20]. It may also increase the mental load of users, who must memorize voice instructions. Further, it may hamper users' privacy and may lead to different situational impairments. Auditory feedback and overlaid virtual controls are used to enhance the accessibility of touch screens, but overlays degrade the touch-sensing capability of the screen [40], and auditory feedback has its own merits and demerits [41, 42].

Most mobile phones today come with a touch-enabled screen, where differences in tactile sense are missing, so it is very hard to trace a widget on the uniform surface of the display unit [43]. Several research works have been reported that use auditory feedback, overlaid virtual controls, multi-touch events, gesture-based events, etc. to enhance the accessibility of touch screens for blind people [43]. Indeed, the sense of touch matters in the accessing capabilities of blind users. Keypad-based mobile phones can provide tactile differences between keys, and keys can be identified by Braille dots placed on each key [10].

In a keypad-based system, it is possible to provide tactile differences between the keys, and this tactile feeling is an important factor in the accessing capabilities of blind people. Besides, people in the Indian subcontinent are well habituated to keypad-based handsets, which may leverage users' spatial ability. An effective system designed for blind users should therefore include physical keys so that they can touch


and feel the controls. However, the presence of too many keys on the keypad may affect the accessibility of a mobile phone. In this chapter, we explore how the presence of a jumble of keys affects the process of finding a desired key under different keypad conditions. We conducted a user study to understand how the jumble of keys on a mobile phone affects both the search time and the errors in detecting a particular key on the keypad.

The rest of this chapter is organized as follows. In Section 3.2, we report an in-depth analysis of the existing styles of interaction for blind users of mobile phones; it contains a detailed discussion of the lacunae of the present modes of interaction, followed by a detailed specification of the demands of our target users along several dimensions. In Section 3.3, we describe the details of the user study conducted to decide on a suitable keypad condition, with tactile features, for our target users. In Section 3.4, we address the current boom of touch-enabled interaction technology and the issues it brings for the blind community; here, we acknowledge the immediate need for good solutions to protect our target community from the digital separation caused by this recent change in interaction technology. In Section 3.5, we report a summary of our findings.

3.2 Analysis of the Existing Styles of Interactions

Table 3.1 summarizes the three major interaction mechanisms that have been considered to support blind users of mobile phones.

3.2.1 Lacunae in the Existing Modes of Interaction

Many approaches have been reported to make mobile phone devices accessible to blind people. However, the existing approaches are not necessarily free from limitations. We summarize our findings on the existing interaction modes in Table 3.1.


Existing interactions and comments:

Voice-based interaction:
1. Talking is not appropriate in many contexts (situational impairment) [20].
2. No privacy for the user [20].
3. Performance degrades in noisy environments [20].
4. Sensitive to individual accents and intonations [20].
5. Computationally expensive.
6. Users need to keep voice instructions in mind, which increases their memory load.

Touch-based interaction:
1. The difference in tactile sense is missing [43].
2. Overlaying virtual controls degrades the touch-sensing capability [40].
3. The gesture preferences of blind users are different [44].

Keypad-based interaction:
1. Harder to identify the desired key in a jungle of keys.
2. The presence of many keys on the keypad leads to small key sizes.
3. The presence of many keys on the keypad allows less space between two keys.
4. Keys are too small to fit a finger without looking at the keypad [20].
5. Tends to compromise the portability of the device.

Table 3.1: Different interaction modes.

We have seen that interaction techniques using voice-based input compromise users' privacy [20]. Interacting with a mobile phone through voice input may lead to different situational impairments; for example, talking is not appropriate in many contexts, such as when users are in a meeting room or conference hall. Voice input in real-life environments also suffers from practical challenges such as noisy surroundings and the different accents and intonations of individuals [20]. Moreover, when interacting through voice, users need to keep voice instructions in mind, which increases the memory load. On the other hand, with a touch screen, as the difference in tactile sense is missing, it is very hard to trace widgets on the uniform surface of the display unit [43]. An overlay on the touch screen helps increase tactile sensitivity but degrades the touch-sensing capability of the screen [40]. With gesture-based interaction techniques [14, 19, 34, 45] on touch screens, blind people exhibit gesture preferences different from those of sighted people [44].


3.2.2 Specific Needs of Users

We have pointed out the limitations of the existing interaction styles. If we further analyze the users' perspective, they have some specific needs to be fulfilled, which we identify as follows.

• The need for a solution that supports existing usage habits and lets users leverage their previous experience.

• The need for a mobile phone with easily perceivable keys that blind and visually impaired users can access quickly and with few errors.

In the following, we elaborate on these needs in detail.

Usage habits of blind people in India: We conducted informative interviews with 150 informants in order to learn the usage habits and to understand the demands of our target users. We found that about 5% of the informants (drawn from two well-established organizations for the blind at the centre of a metropolitan city, Kolkata, India) use mobile phones with touch screens. All of our informants preferred a phone with tactile features (even those who owned a touch-screen phone), that is, a handset with a keypad containing clearly differentiable keys that can be easily felt and perceived without vision. About 9% of the informants use handsets with a QWERTY keypad; they preferred it because they were experienced with the QWERTY layout. All of our informants were experienced with the traditional 4×3 keypad. Further, a prominent longitudinal effect was observed for keypad-based mobile phone users.

Realization of user friendliness: Another observation involves two blind users, each with more than six years of experience training and teaching students with visual impairment and more than eight years of experience using mobile phones. During informal interviews, both expressed a strong preference for the keypad-based mobile phone due to the presence of tactilely differentiable keys, although they raised some issues with the keypad-based phones popularly used by


blind users in India; both interviewees also had previous experience with smart (voice-instruction and touch-screen enabled) phones as well as keypad-based phones. They informed us that most of the blind persons they have trained use keypad-based phones, are well habituated with them, and, despite facing some issues, still prefer them for their tactile features. They raised a few points. First, while performing a task they become confused about which key on the keypad is currently under their finger; once confused, they restart the search from the starting finger position. To learn their usage behaviour and common habits, we asked both of them to call a known number. We observed that both held the device in the left hand and felt the keys with the right-hand index finger; both interviewees were right-handed. They started navigating the keys from the top-left corner of the keypad, and whenever they became confused in the jumble of keys, they restarted the finding procedure from the top-left corner. The second issue raised by the interviewees was that, in the current keypad-based system, it is not possible to be sure before pressing a key that it is the right one; as a result, they pressed wrong keys several times.

3.3 User Study to Realize the Difficulties with Existing Keypad Conditions

According to the information gathered from the informants and consultants, the usage habits of our target users lean towards the keypad-based system. Although they have this usage habit, the consultants pointed out several issues with the current keypad-based system. We conducted a user study to quantify the identified difficulties during use, in terms of performance efficiency. The main aim of this study is to understand the effect of the presence of a jumble of keys on the process of finding a desired key on different popular keypads. We assess this in


terms of the detection time and the errors in detecting a key on different keypad types, and analyze them in order to determine suitable attributes for our target users. The details of this study follow.

3.3.1 Keypad Conditions as Test Cases

This section discusses the different regions of a keypad and their purposes, followed by the keypad conditions considered in our experiment and the motivating factors behind this choice.

Paul et al. described the different regions of a keypad-based mobile phone in [46]. Generally, a mobile phone has three regions for interacting with its users, namely (a) the text entry region, which contains keys to enter text; (b) the hot key region, which contains functional keys; and (c) the display region, which contains the display screen of the mobile phone (see Fig. 3.1).

As we are experimenting with people with blindness, the display region of a mobile phone set offers no functionality to our target users. The hot key region (see Fig. 3.1) of a keypad contains only a few keys, all of them functional. On the other hand, entering text is a necessary task that enables many basic functionalities of a phone set. The keys for entering text are placed in the text entry region (Fig. 3.1) of a keypad, and this region contains a large set of keys.

For the text entry region of a keypad, either the QWERTY or the traditional 4×3 organization is most popularly used. We therefore consider these two types as our test cases to understand how the presence of many keys affects the performance of our target users.

3.3.2 Participants

Eight volunteers (3 female), aged 17 to 45 years (median 35), participated in our experiment. All participants were experienced with, and owned, keypad-based mobile phones. All of them had QWERTY keypad based mobile phones and were also experienced with traditional 4×3 keypad based phones. They all had more than two


Figure 3.1: Regions of a mobile phone, as described by Paul et al. [46].

years of experience in using both QWERTY and traditional 4×3 keypad based mobile

phones.

3.3.3 Task

We measured the discoverability of a key on the keypad using a simple key-pressing task. In each trial, participants had to perceive and enter a randomly generated key name. When a participant was ready to enter a key, a sound was played speaking the name of the key to be entered, chosen randomly from the set of keys present in the test keypad condition. On hearing the key name, the participant had to search for the key and press it. A sound repeating the entered key name was played to acknowledge each correct entry, while incorrect key presses were signalled by an error sound. This procedure for testing a single key entry is termed a unit entry. After completing one unit entry, participants had to get ready again to enter another key; they were given sufficient time (an adjusted gap of 5 seconds for all participants) to do so. After a participant became ready for a unit entry, there were no additional stimuli. A pair of speakers was plugged into the headset jack of the phone to ensure that participants could hear the spoken key name and the feedback sounds. The task time for each unit entry was measured from the beginning of the sound prompt to the moment a key was pressed. Only the correct entries

30

Page 59: USER INTERFACE MECHANISMS FOR PEOPLE WITH VISUAL ...cse.iitkgp.ac.in/~dsamanta/resources/thesis/Tuhin-Chakraborty-Thes… · Aritra, Sougata da, Bapi da, and Pradip da. My greatest

3.3. User Study to Realize the Difficulties with Existing Keypad Conditions

(a) QWERTY keypad condition(Test Case1).

(b) Traditional 4×3 keypad condi-tion (Test Case2).

Figure 3.2: The keypad conditions QWERTY, Traditional 4×3 are used in the experiment.

are considered to calculate the task time for unit entry.
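The unit-entry procedure above can be sketched as follows. This is an illustrative reconstruction, not the study's actual software: `announce` and `get_keypress` are hypothetical stand-ins for the phone's audio output and key input.

```python
import random
import time

def run_unit_entry(key_set, get_keypress, announce, rng=random.Random(0)):
    """One 'unit entry': prompt a random key name, time the response,
    and give auditory-style feedback. Returns (target, correct, elapsed)."""
    target = rng.choice(key_set)           # key name drawn randomly from the test keypad
    announce(f"enter {target}")            # sound prompt: name of the key to enter
    start = time.monotonic()               # task time starts with the prompt
    pressed = get_keypress()               # participant searches by touch and presses a key
    elapsed = time.monotonic() - start     # and ends the moment a key is pressed
    if pressed == target:
        announce(f"{pressed} is entered")  # confirmation repeats the entered key name
        return target, True, elapsed
    announce("error")                      # incorrect presses get an error sound
    return target, False, elapsed
```

Only the entries for which `correct` is true would contribute to the task-time analysis, matching the procedure described above.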

3.3.4 Apparatus

Our study was conducted on a Java-based phone, and the experiment was implemented in Java. Two test cases were taken, namely the QWERTY and traditional 4×3 keypads; the test layouts are shown in Figure 3.2. We customized the keypad condition on the same phone, covering keys with tape for the traditional 4×3 keypad condition (see Fig. 3.2b). Our experimental device offered 256 MB of RAM and 512 MB of ROM.

We had to use the same phone for the different keypad conditions to avoid other influencing factors, such as tactile sensibility (which is related to keypad attributes like key size and the spacing between keys), key organization, and the processing speed of the phone. As our target was to measure the effect of the presence of a jumble of keys on the keypad, only this factor varied across the test cases. The keys were well spaced and clearly distinguishable by the sense of touch alone.

Before starting the experiment, all participants assured us that they could clearly perceive each key on the keypad and that they knew the name of every key present on it. We carefully verified both facts for all participants before starting. No participant reported any difficulty either in distinguishing a key or in recalling the name of any key on the test keypad.

3.3.5 Experimental Design

Our experimental design included the QWERTY and traditional 4×3 keypad layouts, counterbalanced across participants. Each participant performed the two keypad conditions in two separate sessions, with a one-day gap between them, and 5-minute breaks were allowed between trials. Each trial comprised 30 unit entries. We informed all participants that the experiment was carried out to test the system, not them; nonetheless, they were instructed to perform each unit entry as correctly and quickly as they could. In all keypad conditions, when participants became ready to enter a key, they held the device in the non-dominant palm with the index finger of the dominant hand placed at the top-left of the keypad region. We did not require participants to hold the phone or use their fingers in any particular pattern, so the interaction was completely free-style, subject only to the instruction to perform the entry task quickly and correctly.

In summary, the experimental design was: 2 keypad conditions (QWERTY, Traditional 4×3) × 5 trials × 30 unit entries per trial = 300 entries per participant. In total, 2400 entries were attempted.

3.3.6 Results

The effects of different keypad types on selection time and error rates were measured.

Task time: The selection time for each key press was measured from when the sound prompt of the key name started playing to when the key was pressed down. The time instances were stored in a file during the experiment and collected from it afterwards. To measure performance in the two keypad conditions, we averaged the selection times of the unit entries over all trials, per keypad condition and per participant; only correct entries were used in computing the average selection time of a key. The results are shown in Fig. 3.3. For all participants, the average time taken to pick the desired key from the jumble of keys was much shorter in the traditional 4×3 keypad condition than in the QWERTY condition, and a paired t-test showed this difference to be significant. The trend was consistent through all the trials for each participant.

Figure 3.3: Selection time of each participant in both keypad conditions, QWERTY and Traditional 4×3.
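The analysis just described, per-participant mean selection times over correct entries followed by a paired t-test, can be sketched as below. The numeric values are illustrative placeholders, not the study's measurements; the p-value would be read from a t distribution with n−1 degrees of freedom.

```python
from math import sqrt
from statistics import mean, stdev

def mean_correct_time(entries):
    # entries: (selection_time_seconds, was_correct) per unit entry;
    # only correct entries contribute to the average selection time
    return mean(t for t, correct in entries if correct)

def paired_t(a, b):
    # paired t statistic over matched per-participant means (df = len(a) - 1)
    diffs = [x - y for x, y in zip(a, b)]
    return mean(diffs) / (stdev(diffs) / sqrt(len(diffs)))

# one mean selection time per participant per condition (illustrative values only)
qwerty = [1.9, 2.1, 2.4, 2.0, 2.2, 2.5, 1.8, 2.3]
trad43 = [1.0, 1.2, 1.1, 0.9, 1.3, 1.2, 1.0, 1.1]
t_stat = paired_t(qwerty, trad43)
```

A positive `t_stat` here indicates longer average selection times under QWERTY than under the 4×3 condition, which is the direction of the reported result.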

Error rate: The error rate was measured as the number of incorrect key presses divided by the number of required key presses (150) for each keypad condition. No option was provided to correct an entry once a mistake was made. Error rates for each participant in each keypad condition are shown in Figure 3.4; they are much lower for the traditional 4×3 condition than for the widely used QWERTY system.

Figure 3.4: Error rate of each participant in both keypad conditions, QWERTY and Traditional 4×3.
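As a minimal sketch of the stated formula (5 trials × 30 unit entries = 150 required presses per condition):

```python
def error_rate(incorrect_presses, required_presses=150):
    # incorrect key presses divided by required presses per keypad condition
    return incorrect_presses / required_presses
```

For example, a participant with 12 incorrect presses in a condition has an error rate of 12/150 = 0.08.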

3.3.7 Discussion

As we ran the experiment on the same mobile phone set, key attributes such as size, spacing, and height did not vary across the two keypad conditions (traditional 4×3, QWERTY); only the number of keys in the search set varied, and this variation affected the search time for a key in both test keypads for all participants. The keys were also sufficiently distinguishable by touch, which we confirmed with the participants before starting the experiment. From the above discussion and the results, it is clear that keys are more discoverable, and key detection more accurate, in the traditional 4×3 scenario than in QWERTY, and that this difference is significant. As these two factors are key to efficient mobile phone access, and as the traditional 4×3 keypad condition performs nearly twice as well as QWERTY on both, we should pursue this system further and explore how it can be improved in terms of accessibility features and performance. The QWERTY system contains considerably more keys than the traditional 4×3 keypad, so, in the absence of vision, whenever one tries to pick a key out of the jumble guided only by the tactile sense, there is a high chance of losing one's bearings among the keys. As a result, accessibility suffers badly for our target users. These scenarios establish the right direction for deciding on a suitable interaction mechanism for blind mobile phone users in India, and strongly call for more research in this direction.

3.4 Current Boom in Touch-Based Smart Phones

The research direction presented in Section 3.3 clearly pointed towards deciding on a suitable keypad interaction for blind users. The current trend of use, however, clearly points towards touch-enabled smart hand-held devices, and this new interaction technology boosts the popularity of smart phones day by day.

The total number of smart phone users in the USA was about 62.6 million in 2010 and rose to about 144.5 million in 2013 [47]; it is forecast to reach about 220 million in 2018 [47]. These numbers have increased rapidly over the last decade. Worldwide, the number of smart phone users is expected to grow from around 1311.2 million in 2013 to about 2561.8 million by 2018 [48].

In the last few years, the smart phone market has grown consistently in first-world countries, and the developing countries are on the same track. In India, the popularity of smart phones has increased rapidly in recent years.

The total number of smart phone users in India was about 76 million in 2013 and is projected to rise to about 279.2 million by 2018 [48], a several-fold increase over the years. Around 22 percent of the population in India is projected to use a smart phone by 2018 [49]. India is expected to exceed 200 million smart phone users by 2016, becoming the country with the second highest number of smart phone users; China and the USA are expected to hold the first and third positions, with about 624.7 million and about 198.5 million smart phone users respectively, by 2018 [48].

At the same time, a remarkable portion of blind people live in developing countries: about 90% of the world's blind population is from the developing world. As these smart devices feature touch-enabled screens, the procedure for interacting with them is highly visually demanding. We should not neglect this recent paradigm shift in mobile interaction; rather, we must provide solutions that let blind people adopt the technology and use the new interaction mode efficiently. Otherwise, this innovation in interaction technology will push the blind community to the wrong side of the digital divide.

3.5 Summary

In this chapter, we reported the different interaction modes that exist for blind people to access a mobile phone. We discussed usage habits and the presence of tactile features as factors in the adoption of any solution by the target users, and found a preference for keypad-based mobile phones. We investigated keypad interaction for the popularly used keypad types and presented results from a user study. The results show that the detection time for a particular key is significantly shorter, and detection far less error-prone, with a traditional 4×3 keypad than with the widely used QWERTY layout; hence a keypad with a small number of keys is strongly preferable and recommended. Thus the needs of our target users are well explored.

From this work, we are encouraged to explore quick and easy mechanisms to access information using only a keypad with a small number of keys. A clear direction of research thus opens up: to investigate the questions below.

• Is a design possible that contains fewer keys than the traditional 4×3 system, yet can perform the basic activities that rely on text entry?

• If such a design is possible, can it perform better than the existing traditional 4×3 system in terms of speed and accuracy?

After this investigation, we turned to a burning issue that is hugely driven by market forces.

Figure 3.5: Exploring different interaction modes and analyzing their suitability for accessing mobile phones, we found two prominent directions of solution required for people with blindness. These solution strategies are discussed in Chapter 4 and Chapter 5 in the remainder of the thesis.

In the last decade, the growth of the smart phone market has been remarkable, increasing several fold. Most such phones use touch-enabled screens for interaction, and this technology has driven the popularity of smart hand-held devices. The access mechanism of such devices is highly visually demanding, which makes it very challenging for blind people to cope with this technology shift. Yet we cannot ignore the most popular interaction technology of the near future, so we must also focus on bridging this gap.

With the aim of investigating both directions of research addressed in this chapter (Fig. 3.5), we continued our investigation; the details are reported in the remaining chapters of this thesis. We propose a solution for keypad-based systems in the presence of tactile differences, as well as a solution for people with blindness to access the touch-enabled screens of mobile phone devices. Both solutions must additionally be consistent with the usage habits and familiarity of our target users. In the next chapter (Chapter 4), we provide the keypad-based solution, and in Chapter 5 we propose the solution for accessing touch-enabled mobile devices.


Chapter 4

Solution with the Presence of Tactile Sense

In the previous chapter, we established the immediate need for two prominent directions of solution for accessing hand-held mobile devices by people with blindness. The first was a solution built on the tactile features available to our target users. This chapter describes the solution procedure, the related experimental details, and the outcome of the experiment with results.

4.1 Introduction

Hard-wired solutions such as Braille keyboards and screens compromise the portability of the device, cannot be used in mobile scenarios, and are very expensive [12]. A text entry mechanism can provide tactile differences between keys, but accessing a particular key within a jumble of keys is genuinely difficult without visual assistance. Perceiving the directional and distance changes (together, the positional change) while moving from one key to another on a keypad is a highly visually demanding process: human visual perception continuously guides the movements that reposition the controlling fingers over the keypad. Interaction amid a jumble of keys is therefore not well suited to these users' requirements, and it leads to erroneous entries, reduced performance, and less control over the system. If, instead, we can implement the basic activities of a phone using a smaller number of keys, the interaction will involve comparatively less movement from one key to another, which in turn will make the otherwise visually demanding access procedure much easier. At this point, we can lay out our further investigation as a sequence of queries, as follows.

Query 1: Is it possible to implement the basic functionalities of a phone using fewer keys than the traditional 4×3 keypad? If so, this has to be presented in detail with an example prototype.

Query 2: If the solution for Query 1 can be designed and presented, is the keypad condition with fewer keys used in that solution better than the QWERTY or traditional 4×3 telephone keypad in terms of speed, accuracy, and ease of reaching the desired key among the jumble of keys?

Query 3: If Query 2 is properly answered, so that with the fewer-key keypad condition users can find their target key faster and more accurately with less effort, then we have to investigate how users can enter text with this setup (as text entry enables many activities of a mobile phone set). If such a text entry mechanism is presented, will it be better than its old, existing competitor in terms of text entry rate, errors during entry, ease of use, and so on?


The rest of this chapter is organized to provide the remedies for these queries in order of appearance. In Section 4.2, we present a way to implement basic functionalities using a small number of keys. In Section 4.3, we establish that, in a keypad condition with fewer keys, users can reach their target key more quickly and accurately, and with less effort, than with the popularly used 4×3 telephone keypad system. In Section 4.4, we present an iterative design process for a suitable text entry technique based on the keypad condition identified in the previous sections, and then evaluate our entry mechanism with participants in terms of entry rate, accuracy, and ease of use. In Section 4.5, we present an example prototype of the text message sending activity (texting), implemented with our newly proposed text entry technique. Finally, in Section 4.6, we report a summary of our findings on the work presented in this chapter.

4.2 BlindGuide: A Design of a Caller Guide with Less Number of Keys

In this section, we discuss the remedy for Query 1 posed in the previous section (Section 4.1), and accordingly present a technique that implements basic functionalities using a small number of keys, illustrated through a proper example: the calling use case.

We designed a caller system for mobile phone devices, named BlindGuide, that provides sightless access for blind people. People with blindness can enter any phone number, or search for any name in the contact list, to call another person. Users interact with the system using five physical keys present on the phone keypad. We provide an audio guideline for using BlindGuide and auditory feedback for each unit interaction to assure users of their inputs. BlindGuide requires fewer physical keys than a QWERTY or traditional telephone keypad, which favors higher tactile sensitivity. We followed a design procedure informed by information gathered from participants with blindness; accordingly, we present our design rationale, followed by the detailed access mechanism of BlindGuide. The rest of this section is organized accordingly.

4.2.1 Formative Interviews

We conducted formative interviews in order to inform the design process. We interviewed 6 blind mobile phone users (4 male, 2 female) to collect information about their calling habits. Our basic aim was to understand the ways of calling preferred by the users, so as to fix the organization of our prototype application, BlindGuide, as well as the hardware design (mainly the number of physical keys required on the keypad).

Informants were recruited from the West Bengal State Branch of the National Association for the Blind, India [68]. Interviews lasted about 30 minutes per informant. The average age of the informants was 23, ranging from 14 to 35 (SD = 7.66). Informants reported making approximately 150 calls per month on average.

Results: On average, informants reported that more than 90% of the calls they made in the last month were to contacts whose phone numbers were already saved in the directory; only rarely did they enter a number to make a call.

Our interviews were very helpful in identifying the key issues related to hardware design. Some informants preferred easy and familiar physical layouts such as the QWERTY keyboard or the 4×3 telephone keyboard. All informants wanted the physical keys on the keyboard to be clearly distinguishable by touch. The tactile aptitude of users is, in turn, related to keypad characteristics such as key size and the spacing between keys [13], and the portability of mobile phone devices is highly demanded. Giving preference to the physical characteristics of a keypad while favoring the portability of the device argues against physical keyboard layouts containing many keys, such as the QWERTY keyboard with its roughly 30 physical keys. In other words, a keyboard with fewer physical keys favors two vital constraints: 1. the keypad's physical characteristics and 2. the portability of the mobile device.

In this work, we provide a design that requires only five physical keys but can serve all 26 letters (a-z) and 10 numerals (0-9). We use a guided scanning mechanism to access our interfaces: users can enter all letters and numerals with the 4-way navigation key and a selector present on the physical keypad. The details of our design principles and methodologies are given in the next sections.

4.2.2 Design of Eyes-Free Interaction

Our prototype application BlindGuide provides eyes-free access to the mobile phone. Users interact with BlindGuide by pressing physical keys, and BlindGuide helps them with a complete auditory guideline.

4.2.2.1 Design Rationale

In this section, we present the design principles behind the auditory guideline [8], [4] used to control the application, the auditory feedback [9], [41] for actions (key presses), the menu organization, and the input mechanism for letters and numerals.

• Auditory guideline: BlindGuide provides auditory guidelines that help users take control of the interfaces by pressing the appropriate physical keys. Whenever a new interface appears, BlindGuide tells users which activities they can perform on it, and it announces the role of each widget in the current interface when that widget gains focus on the active screen. For example, to explain the second widget in the interface given in Fig. 4.1, BlindGuide says to its users “focus is at call by number”.

• Feedback in auditory form: BlindGuide gives users audio feedback for each action they perform. For example, suppose a user wants to make a call by entering a phone number, say ‘8016155690’. From the number input interface, the user enters the digits one by one, and the system confirms each entry: if ‘8’ is entered, the system says ‘8 is entered’. Before placing the call, the system also reads back all the digits of the phone number.

Figure 4.1: Focus is at the ‘call by number’ widget.

• Brevity: BlindGuide addresses its users in a very brief and specific manner. When it explains an interface, it often says only a single phrase, or speaks in the terse style once used in telegram messages.

• Discoverability: BlindGuide requires only a few physical keys (see Fig. 4.2), all of them functional. Blind users need not search for the desired key in a jungle of physical keys, as on a mobile phone with a QWERTY keypad or a standard 12-key telephone keypad; this enriches the discoverability of keys for blind users, especially when they sense the physical keys only through touch. Users can input all text with only a 4-way navigation key and a selector key.

• Guided scanning mechanism: To enrich the accessibility of mobile phone devices, we use a guided scanning mechanism [69] to access all the widgets present on the active screen. In this mechanism, the widgets are aligned into rows, with several widgets in each row. When the screen is activated, control rests on the first row. Control can be moved from one row to another with the up-down navigation keys on the physical keypad, and a widget is selected by pressing the selector key. Control can be transferred from one item to another within the same row with the left-right navigation keys, and within the same column with the up-down navigation keys.

Figure 4.2: Keys required to access BlindGuide.

Pressing the selector triggers the focused item, and the respective function is fired. The system dictates to its user the item on which the control rests, immediately after each transfer of control. Figure 4.2 shows the physical keys, a 4-way navigation key and a selector, that need to be present on the physical keypad to run BlindGuide. Figure 4.3 illustrates the transfer of control among the items on an active screen: if the focus is on item R2C2 and the left, right, up, or down key is pressed, the focus moves to R2C1, R2C3, R1C2, or R3C2 respectively; that is, the focus goes to the immediate neighbouring item in that direction.
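The focus-movement rules above can be sketched as a small model. This is an illustrative reconstruction, not BlindGuide's actual implementation; `speak` is a placeholder for the phone's text-to-speech output.

```python
class GuidedScanner:
    """Guided scanning over a row-major grid of widget labels."""

    MOVES = {"up": (-1, 0), "down": (1, 0), "left": (0, -1), "right": (0, 1)}

    def __init__(self, grid, speak=print):
        self.grid = grid             # e.g. [["R1C1", "R1C2", ...], ["R2C1", ...], ...]
        self.row, self.col = 0, 0    # control starts on the first row
        self.speak = speak

    def move(self, direction):
        # move focus to the immediate neighbour in the given direction,
        # staying put at the grid edges, then announce the focused item
        dr, dc = self.MOVES[direction]
        r, c = self.row + dr, self.col + dc
        if 0 <= r < len(self.grid) and 0 <= c < len(self.grid[r]):
            self.row, self.col = r, c
        self.speak(f"focus is at {self.grid[self.row][self.col]}")

    def select(self):
        # the selector key fires the focused widget
        return self.grid[self.row][self.col]
```

For instance, with focus on R2C2, pressing left moves the focus to R2C1, matching Fig. 4.3.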

4.2.2.2 Menu Organization

The calling activity includes two sub-functions: the first is entering a contact (either entering a number or selecting a name from the contact list), and the second is the actual calling instruction for that name or number. So, for the first sub-function, users need to choose one of these two procedures, while the second (the calling task) is obvious (see Table 4.1).

Figure 4.3: An interface with nine button widgets, where focus is on R2C2.

Table 4.1: Functions and related sub-functions of BlindGuide.
Calling → Entering contact → Entering name → (Enter name phrase; Search for match)
Calling → Entering contact → Entering number
Calling → Calling task

We take care of this in a slightly different way. At the start of BlindGuide, we provide two options to the users: 1) calling by selecting a name from the contacts and 2) calling by entering a number. Users need to select one of these options at the beginning. Figure 4.4 shows the menu organization of BlindGuide.

Figure 4.4: Menu organization of BlindGuide (Call by number → Enter number → Call to the number; Call from contact → Enter name phrase → Find all matches → Select one to call).

Users can place the control on either option, change it from one to the other using the up-down navigation keys on the keypad, and select an option with the selector on the keypad. Selecting the first option (see Fig. 4.1) redirects users to the name entry screen (see Fig. 4.5a), while the second option redirects them to the phone number entry screen (see Fig. 4.5c).

After the entry is finished (phone number or contact name), the system automatically calls the entered number (from the phone number entry screen) or the contact person matching the entered information (from the name entry screen). The next two sections discuss the name entry mechanism and the phone number entry mechanism (Fig. 4.5).

4.2.2.3 Name Entry Mechanism

We use the guided scanning mechanism to enter name text in BlindGuide. The characters are arranged in six rows, in sequential order, as shown in Figure 4.5a. Each row holds five characters, except the last row, which contains the character ‘z’ and the ‘Clear’, ‘Space’, and ‘Finish’ buttons. How this guided scanning mechanism works has already been discussed in the design rationale section. In Figure 4.5a, the ‘Clear’ button deletes the last entry and the ‘Finish’ button ends the name entry task.

When the user selects the ‘Finish’ button, the system searches the contacts for matches and redirects the user to a screen containing the list of all matches in alphabetical order (see Fig. 4.5b). The user can browse this list with the up-down navigation keys and, on hearing the desired candidate, select the contact name by pressing the selector. The system then calls the selected candidate.
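The ‘Finish’ step, searching the contacts and listing matches alphabetically, might look like the following sketch. The prefix-matching rule is an assumption on our part, since the matching criterion is not specified here.

```python
def find_matches(contacts, name_phrase):
    # return contact names matching the entered phrase, in alphabetical order,
    # assuming a case-insensitive prefix match on the name (an assumption)
    phrase = name_phrase.strip().lower()
    return sorted((n for n in contacts if n.lower().startswith(phrase)), key=str.lower)
```

The returned list corresponds to the screen of Fig. 4.5b, which the user then browses with the up-down navigation keys.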

4.2.2.4 Number Entry Mechanism

Here also we are using guided scanning mechanism to take the inputs from user. We use

4×3 matrix structure for phone number entry screen. The first three rows contain 0-2,

3-5, 6-8 respectively and the last row contain ‘9’, ‘Clear’ and ‘Finish’ buttons. Figure 4.5c

47

Page 76: USER INTERFACE MECHANISMS FOR PEOPLE WITH VISUAL ...cse.iitkgp.ac.in/~dsamanta/resources/thesis/Tuhin-Chakraborty-Thes… · Aritra, Sougata da, Bapi da, and Pradip da. My greatest

4. Solution with the Presence of Tactile Sense

(a) Screenshot for nameentry.

(b) Screenshot for callingby match name.

(c) Screenshot for callingby number.

Figure 4.5: Proposed Interfaces of BlindGuide.


shows the arrangement. The ‘Clear’ button deletes the last single entry and the ‘Finish’ button completes the phone number entry task. When the user selects ‘Finish’, the system reads the entered number back to the user and calls that number.

4.2.3 Discussion

Here, we have presented BlindGuide, a prototype application designed to place calls to others’ phone numbers through a hierarchy of steps with auditory feedback.

Our system needs only five physical keys: a 4-way navigation key with a selector, plus three more keys (‘back’, ‘home’, and ‘end’) on the physical keypad (see Fig. 4.2). The ‘back’ key returns the user to the most recently visited screen, the ‘home’ key redirects the user to the home screen of our prototype application, and the ‘end’ key ends the initiated activity. So it utilizes only the keys from the Hot Keys region (Fig. 3.1). Moreover, tactile sensitivity is an important influencing factor for blind users accessing mobile phones, and fewer physical keys on the keyboard allows bigger keys and more space between them, which in turn provides better tactile discriminability.

In this sub-section (Section 4.2), we have discussed the remedy for Query 1 posed in the previous sub-section (Section 4.1): a solution that provides the basic functionality of a mobile phone to blind people using fewer keys (only the keys from the Hot Keys region) compared to a QWERTY or traditional 4×3 keypad.

Now, is the keypad condition with fewer keys, as used in the solution just discussed, better than the QWERTY or traditional 4×3 telephonic keypad conditions in terms of speed, accuracy, and ease of reaching the desired key among the jumble of keys? Our approach to answering this question is described in the next sub-section (Section 4.3).


4.3 Efficacy of Hot Keys Setup

In Section 4.2, we showed through a prototype example of the Calling activity that the functionalities can be implemented using only the keys from the Hot Keys region, which contains only functional keys. But when our target users search for a particular key in this keypad setup in the absence of vision, how are search speed and accuracy affected? To find out, we performed a user study like the one described in Section 3.3 and present its results in comparison with the outcomes reported in Section 3.3.6. The details of this study are described below.

4.3.1 Task, Apparatus, Participants

We measured the time taken to reach a particular key on the Hot Keys keypad setup, as we did in Section 3.3. The participants were given the same task as in Section 3.3.3, under identical conditions. The same apparatus was used, with the text entry region of the keypad covered. The participants were also the same as in the previous study, and in the gap between the two studies they did not undergo any technical training, practice, or habit changes.

4.3.2 Experimental Design and Results

The experimental design is similar to that described in Section 3.3.5, but here the experiment used only a single keypad condition.

In summary, the experimental design was: a single keypad condition (Hot Keys) × 5 trials × 30 unit entries per trial = 150 entries per participant. In total, 1200 entries were attempted across all 8 participants.

Results: The selection time to find a key on the keypad and the associated error rate were measured. The measurement procedure was identical to that in Section 3.3.6.


Task Time: The selection time for each unit entry was measured from when the sound prompt of the key name started playing to when the key was pressed down. The time instants were stored in a file during the experiment and collected at the end. To measure performance, we averaged the selection times of the unit entries of all trials per participant; only correct entries were used to calculate the average selection time. The results are shown in Figure 4.6 alongside the results of the other two keypad conditions (presented earlier in Figure 3.3 in Section 3.3.6 of Chapter 3) to highlight the difference obtained with the current keypad condition.

Figure 4.6: Selection time of each participant in three keypad conditions: QWERTY, traditional 4×3 (Ambiguous), and Hot Keys.

For all participants, the average time taken to pick the desired key from the jumble of keys in the Hot Keys setup was much shorter than in the QWERTY and traditional 4×3 keypad conditions, and the difference is significant by a paired t-test. This trend was consistent across all trials for each participant.
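The averaging described above (mean selection time per participant over correct entries only) can be sketched as follows. The log-record layout is an assumption for illustration, not the thesis logging format.

```python
# Sketch: average selection time per participant, counting correct entries only.
from collections import defaultdict

def mean_selection_times(records):
    """records: iterable of (participant, seconds, correct) tuples."""
    per_user = defaultdict(list)
    for participant, seconds, correct in records:
        if correct:                      # only right entries are averaged
            per_user[participant].append(seconds)
    return {p: sum(ts) / len(ts) for p, ts in per_user.items()}

log = [("P1", 1.0, True), ("P1", 2.0, True), ("P1", 5.0, False)]
print(mean_selection_times(log))  # {'P1': 1.5}
```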

Error rate: Error rate is measured as the number of incorrect key presses divided by


the number of required key presses (150). No option was provided to correct an entry if a participant made a mistake. Error rates for each participant are shown in Figure 4.7 alongside the results of the other two keypad conditions (presented earlier in Figure 3.4 in Section 3.3.6 of Chapter 3) to highlight the difference obtained with the current keypad condition.

The error rate is much lower for the Hot Keys setup than for the widely used traditional 4×3 and QWERTY conditions.

Figure 4.7: Error rate of each participant in three keypad conditions: QWERTY, traditional 4×3 (Ambiguous), and Hot Keys.

Subjective Preferences: We asked all participants to compare the three keypad conditions, and as we expected most of them (7 out of 8) preferred the keypad interaction of the Hot Keys condition. Despite their experience with the QWERTY keypad, they reported that finding a key on QWERTY is relatively hard, especially when the target key lies in the middle portion of the keypad, whereas in the Hot Keys condition all keys are easily and unambiguously traceable. One participant preferred the traditional 4×3 keypad even though he was also experienced with QWERTY; he reported that he was well habituated to the 4×3 keypad and did not wish to change, though he also noted that he did not send text messages to others. One of the participants commented: “If the tasks (basic functionalities like calling, messaging, etc.) are possible with this (Hot Keys condition), then of course I’ll use this one only.”

4.3.3 Discussion

Since we ran our experiment on the same mobile phone set, key properties such as size, spacing, and height did not vary across the three keypad conditions (Hot Keys, 4×3, QWERTY); only the number of keys in the search set varied, and this variation affected the search time for a key in all three test keypads for all participants. The keys were also sufficiently discernible by touch, which the participants confirmed before the experiment started. From the above discussion and the results, it is clear that for our target users, who sense the keys without any visual assistance, the keys are more discoverable and key detection is more accurate in the Hot Keys scenario than in the other two (traditional 4×3, QWERTY). The participants also expressed their subjective preference for this setup. This key setup is therefore naturally preferable for blind people interacting with a mobile phone.

This keypad setup contains only the functional keys and no keys in the text entry region, so we have to investigate how an efficient text entry mechanism can be implemented with it. If such a mechanism is implemented, will it be better than its old, existing competitor in terms of text entry rate, errors during entry, ease of use, and so on?

We start this investigation from the basic usability issues in the traditional 4×3 keypad system. The traditional 4×3 keypad is alternatively called a MultiTap system, since entering a character may require pressing a key multiple times; we use the two terms interchangeably. Our aim is to propose an efficient text entry technique that utilizes our target users’ capabilities and usage habits, keeping their limitations in mind, to fulfill the needs identified by the users themselves. We present this work in the next section.

4.4 EasyTap: Efficient Text Entry System for Blind Users

Our objective here is to propose an efficient text entry technique for blind people to access mobile phones. We have already found a suitable keypad type for our target users; our goal now is to design an efficient text entry technique that can be implemented using only the keys of that setup. To do so, we began our investigation from the baseline of understanding our users: we had to identify their capabilities, usage habits, and the issues they face with the traditional 4×3 MultiTap system, and then propose an efficient text entry mechanism accordingly, followed by a detailed evaluation.

The remainder of this section is organized as follows. In Section 4.4.1, we identify the accessibility issues and accordingly specify users’ demands. In Section 4.4.2, we describe the iterative design process of our proposed text entry method, followed by a detailed user evaluation in Section 4.4.3.

4.4.1 Specify Users’ Demands

To better understand users’ needs and their access capabilities, we interviewed 5 visually impaired persons. Two of them were consultants for blind and visually impaired people, each with over 2 years of experience, and were themselves blind; they used traditional 4×3 keypad (MultiTap) phones on a regular basis. The two consultants and two other interviewees were well experienced with the MultiTap system both with and without voice-feedback support. The remaining interviewee used the MultiTap system without any voice-feedback support, but used his mobile phone only to receive and make calls. We verified the following facts about the interviewees.


Figure 4.8: Blind users with the traditional MultiTap keypad. (a) A traditional 4×3 MultiTap text-entry keypad. (b) A typical user’s finger movement on a traditional 4×3 MultiTap keypad.

1. They knew the assignment of letters on each key (ensured through questionnaires).

2. They knew the sequence of the letters assigned to a key, for all the keys on keypad

(ensured through questionnaires).

3. They knew the position of a particular key on the keypad (ensured by asking several

times to find keys arbitrarily).

4. They knew the relative position of a key from another key in the keypad.

We verified the above facts by giving each interviewee a small task, 10 times each: going from one arbitrary key position to another. We asked them to give a commentary on their steps and observed them to cross-check the scenario. For example, at one instance an interviewee’s acting finger was on key 8 and she was asked to reach key 3. According to her commentary (also observed and cross-checked), her path started from key 8 and traversed through key 5 and key 2 before arriving at key 3 (see Fig. 4.8b). Notably, movements were limited to up, down, left, and right in all cases for all interviewees. We did not instruct them to move in any particular manner; the interaction was completely free movement, yet no angular movement was reported. The interviewees also told us about a few issues that helped us understand the limitations of blind mobile users. Here is a summary of our understanding.

Despite knowing the basic facts mentioned above, what were the exact difficulties they faced when entering text?

1. Despite knowing the direction from one key to another (i.e., having a good mental map), completing the movement still requires considerable visual assistance. The movement demands full control over the changes of direction and distance along the path from the source key to the destination key. Without vision, only the tactile sense assists users in confirming their movements, so the procedure becomes slow and users become less confident.

2. Since users have no way to know whether they have reached the target key before selecting it, in many cases they hesitate to select the key, which may also cause more errors.

So the users’ needs, which build on their capabilities and work around their limitations, are as follows:

1. A mobile text entry mechanism that utilizes their usage habits and mental map of the letter distribution on the keypad, but avoids movements along a long path assisted only by the tactile sense.

2. A mobile text entry mechanism that informs users about a letter before accepting it, so that users hesitate less, have more control over the system, and make fewer errors.

In the following, we explore the mechanism that can fulfill these needs of blind users.


Figure 4.9: EasyTap interaction mechanism. (a) Virtual interface with focus at group number 5. (b) 4-way navigation key and a selector.

4.4.2 Iterative Design of EasyTap

EasyTap is a mobile text entry mechanism built on our target users’ capabilities, keeping their limitations in mind, to fulfill the needs identified in the previous section.

The key idea behind our approach is to reuse the popular letter distribution map of the MultiTap system while avoiding the long path needed to reach a key (identified above as a limitation of users without vision). We built a text entry interface (see Fig. 4.9a) in which a group of 3-4 letters is assigned to each object.

The groups are numbered 1 to 9, matching the key numbers of a MultiTap keypad. When the text entry interface shown in Fig. 4.9a is loaded, the focus is initially on group number 5. Users interact with the interface using only a 4-way navigation key and a selector (shown in Fig. 4.9b).

Users change the focus from one letter group to another using the 4-way navigation scheme and receive sound feedback whenever the focus moves between groups. With the selector, users select one group at a time. Suppose a user has selected group number 2, which consists of the letters [a, b, c]. In our first prototype design, the focus could then be moved over the individual letters through left-right or up-down navigation; through sound feedback the user can easily perceive the currently focused letter and select it with the selector. After a letter is selected, the system automatically returns to the first interface, with group number 5 in focus (Fig. 4.9a). Later we changed the scheme slightly for selecting an individual letter after a successful group selection (excluding group number 1, as it represents the punctuation group). We made this change after discussing it with the interviewees whom we had contacted to understand our users’ capabilities and limitations and to specify their needs; all of them strongly favored the change, and our prototype EasyTap took its final shape. We explain the scheme through an example.

Suppose that at some instance the user has successfully selected group number 7. He can then enter P, Q, R, or S through left, up, right, and down navigation respectively. That is, after selecting group number 7 the key-letter mapping is: Left - P (1st letter of the selected group), Up - Q (2nd letter), Right - R (3rd letter), and Down - S (the 4th or last letter, which exists only for groups 7 and 9). The rule is a clockwise assignment starting from Left. The mapping of keys to letters after selecting group number 7 is illustrated in Fig. 4.10.

Figure 4.10: Mapping of letters after selection of group number 7.

Group number 1 contains the punctuation marks. After selecting group 1, users can move the focus over the punctuation options with the left-right navigation keys, with sound feedback, and enter the currently focused punctuation mark with the selector. Down navigation on any group that contains only 3 characters enters a ‘space’ character. The ‘*’ group toggles between numerals and alphabets, and the ‘#’ group allows the user to select different writing styles. We conducted a user study to evaluate our prototype system EasyTap, discussed in the next section.
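The two-step selection scheme above can be sketched as follows. The group contents follow the standard MultiTap layout described in the text; group 1 (punctuation) and the ‘*’/‘#’ groups are handled separately and omitted from this sketch.

```python
# Sketch of EasyTap's group-then-direction letter selection.
GROUPS = {2: "abc", 3: "def", 4: "ghi", 5: "jkl",
          6: "mno", 7: "pqrs", 8: "tuv", 9: "wxyz"}
DIRECTIONS = ("Left", "Up", "Right", "Down")  # assigned clockwise from Left

def letter_for(group, direction):
    """Letter entered by pressing `direction` after selecting `group`."""
    letters = GROUPS[group]
    index = DIRECTIONS.index(direction)
    # Down on a 3-letter group enters a space, as described above.
    return letters[index] if index < len(letters) else " "

print(letter_for(7, "Down"))  # 's'
print(letter_for(2, "Down"))  # ' ' (space)
```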

4.4.3 User Evaluation

Our study aims to assess the benefit of our prototype system EasyTap for our target users, in terms of text entry rate, errors during entry, and users’ control and comfort while performing the task.

4.4.3.1 Participants

Five volunteers (2 female), aged 16 to 38 years, participated in our experiment. All participants owned and were experienced with keypad-based mobile phones: all had traditional 4×3 keypad (MultiTap) phones, which they used on a regular basis, with more than six months of experience. They regularly used their phones to place and receive calls; in most cases they remembered the phone number and entered it directly to place a call. All participants were well aware of screen readers on mobile devices or personal computers, and two of them regularly used such software. Details about the participants are given in Table 4.2.

User  Age  Gender  Education    Impairment period
P01   20   Female  12th Grade   10 years
P02   35   Male    BA           29 years
P03   38   Male    BA           32 years
P04   16   Male    10th Grade   11 years
P05   18   Female  11th Grade   6 years

Table 4.2: Basic characterization of the participants.


4.4.3.2 Task

To evaluate EasyTap, we ran several trials with the target population in a calm and quiet environment. Each participant performed both input techniques: MultiTap (the traditional 4×3 input mechanism) and EasyTap. Although the MultiTap system was already known to all participants, we still allowed a 30-minute practice session for it; the training time for EasyTap was likewise 30 minutes per participant. These practice and training sessions ran for 6 days before we started logging the participants’ performance.

We then held daily evaluation sessions for 6 days in a controlled environment. Each day, every participant entered text in two sessions with a one-hour gap between them. In these sessions, participants were asked to enter 3 phrases taken from MacKenzie’s phrase set [70] with both the MultiTap and EasyTap techniques, counterbalanced across participants. The text entry rate in words per minute (wpm) was measured in all sessions over the 6-day period for all participants.

Our study was conducted on a Java-based handset. A pair of speakers was plugged into the headset jack of the phone to ensure clear sound feedback for the participants during the experiment.

4.4.3.3 Experimental Design

The experimental study involved text entry tasks with the MultiTap and EasyTap systems. After a one-week practice and training session, we started logging our participants’ performance. The experiment ran over 6 days; each day, participants had to enter 3 phrases (varying across days) with each of MultiTap and EasyTap, with a one-hour gap between the two systems. All participants were instructed to enter text as correctly and quickly as they could. There were no additional stimuli.

In summary, the experimental design was: 2 mechanisms (MultiTap and EasyTap) in two sessions a day × 3 phrases per session × 6 days = 36 phrases over the 6 days for each of the 5 participants.

4.4.3.4 Results

The text entry speed and the errors during the entry were measured.

4.4.3.4.1 Entry rate: The text entry rate in words per minute (WPM) was measured in each session for all participants to assess speed. WPM [71] is calculated as:

WPM = ((|T| − 1) / S) × 60 × (1/5)

where |T| is the length of the transcribed string, S is the time in seconds taken to enter the text, and the factor 1/5 reflects the convention of 5 characters per word [71].

Figure 4.11: Text entry rate registered in all sessions with 2 techniques.

Each day, every participant performed both techniques in two different sessions, and the WPM of both sessions was calculated. Figure 4.11 shows the WPMs for all 5 participants in the two daily sessions over all 6 days. For all participants the entry rate (WPM) was higher


for EasyTap than for the traditional 4×3 MultiTap system.

4.4.3.4.2 Error rate: The error rate was higher for the MultiTap system than for the EasyTap system. The error rate E is measured as [71]:

E = (IF + INF) / (C + IF + INF)

where Correct (C) denotes all correct characters in the transcribed text, Incorrect-fixed (IF) denotes all characters backspaced during entry, and Incorrect-not-fixed (INF) denotes all incorrect characters in the transcribed text. The error rate [71] was measured in all sessions for all participants over all 6 days for both systems, and its average is shown as a percentage in Figure 4.12.

Figure 4.12: Error rate registered in all sessions with 2 techniques.
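The two metrics defined above can be computed directly from their formulas; a minimal sketch follows, with invented sample values.

```python
# WPM and error rate per the formulas above (following [71]).
def wpm(transcribed, seconds):
    """WPM = ((|T| - 1) / S) * 60 * (1/5)."""
    return (len(transcribed) - 1) * 60 / seconds / 5

def error_rate(c, inc_fixed, inc_not_fixed):
    """E = (IF + INF) / (C + IF + INF)."""
    return (inc_fixed + inc_not_fixed) / (c + inc_fixed + inc_not_fixed)

print(wpm("x" * 31, 90.0))   # 4.0  (a 31-character phrase entered in 90 s)
print(error_rate(45, 3, 2))  # 0.1  (i.e., 10%)
```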


4.4.3.5 Subjective Preferences

We performed a subjective evaluation of EasyTap. Participants were given a set of query statements and rated their agreement on a 5-point Likert scale. Table 4.3 shows the results for all 5 participants. They indicate that our target users accepted EasyTap as a genuinely easy and friendly text entry mechanism for mobile devices. The results clearly show that perceiving their finger movements during entry, which was difficult before, became much easier with EasyTap; the feedback on the first query statement reveals this clearly. Participants received audio support before selecting a group in EasyTap, and as a result did not hesitate before selecting one, a property clearly missing in the traditional 4×3 MultiTap system. They also had more control and confidence when using EasyTap.

Query statement                        P1      P2      P3      P4      P5
                                      ET MT   ET MT   ET MT   ET MT   ET MT
Easy to reach a particular letter      5  2    5  2    5  2    5  3    5  3
Hesitation before selecting a key      2  4    1  3    1  4    2  4    1  3
Lack of control over the system        1  4    1  4    1  4    2  3    1  4
Feel confident when using              4  3    5  2    5  2    4  2    5  3
Prior usage habit with MultiTap
keypad was helpful                     5  NA   5  NA   5  NA   5  NA   5  NA

Table 4.3: Subjective ratings on EasyTap (ET) and MultiTap (MT) on a 5-point Likert scale. Strongly Agree = 5, Agree = 4, Neutral = 3, Disagree = 2, Strongly Disagree = 1, NA = Not Applicable.

4.4.4 Discussion

In Section 4.4, we have presented a fast and accurate text entry mechanism that can be accessed using only the keys from the Hot Keys region. With this technique, users can find their target key faster and more accurately with less effort; moreover, they can enter text faster and more accurately, with more control over the system, than with the traditional 4×3 MultiTap (also known as Ambiguous keypad) system. Although the participants in this study were experienced with the 4×3 MultiTap system, our proposed approach still proved better. By design policy, we worked around our users’ limitations while building on their capabilities: we deliberately utilized users’ habits from the 4×3 system while discarding its usability issues from our solution.

The next question is how this effective text entry technique can be integrated to implement a real-life use case scenario, that is, a basic functionality of a mobile phone. We have to establish this with a proper example prototype, which we present in the next section.

4.5 An Example Prototype Utilizing EasyTap

In the previous section (Section 4.4), we discussed the text entry procedure with the functional keys in detail, and also reported the efficacy and advantages of our proposed method. We now describe our study with an example use case that shows how our proposed entry method can be utilized in a practical prototype. We explain this through Text Message Sending, that is, the Texting activity.

4.5.1 Context

Text entry has recently become the most popular communication technique on mobile phones. Nowadays, texting through a mobile device is an integral part of daily phone use. Texting enables many basic functionalities in a phone, such as the short message service (SMS) and contact management. SMS in particular has gained top popularity in recent years and has become the number one communication medium on a cell phone. For example, about 54% of American teenagers communicate through text messages, compared to face-to-face interaction (33%) or talking over the phone (30%) [20]. Nearly 6.1 trillion SMS text messages were sent in 2010 alone [2], and this rate was predicted to keep increasing over the years.

In this section, we present an example prototype through which we show how our proposed text entry technique EasyTap can be utilized to implement a basic activity. We describe this through the example of the Text Message Sending activity, popularly known as Texting, and design an interface to perform it that is consistent with the EasyTap text entry mechanism.

4.5.2 Functional Organization

The Texting activity includes three sub-functions. The first is to enter the contact information of the person to whom the user wants to send the text message; this can be done either by entering a number or by selecting a name already saved in the contact list. The second subtask is to enter the body of the text message, a simple text entry task. The last task is the actual send instruction to that name or number. So for the first sub-function users need to choose one of the two procedures, while the second (text composition) and last (sending) are fixed (see Table 4.4).

Function   Sub-function            Sub-function       Sub-function
Texting    Entering contact        Entering name      Enter name phrase
                                                      Search for match
                                   Entering number
           Entering message text
           Sending task

Table 4.4: Functions and related sub-functions to perform the text message sending (Texting) task.

We handle the choice of the first sub-function as follows. At the start of the application, we present two options to the user: (1) sending a message by selecting a name from the contact list, and (2) sending a message by entering a number. The user must select one of these options at the beginning. Figure 4.13 shows the menu organization of the Texting activity.
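The sub-function decomposition above can be sketched as a simple data structure. This is purely illustrative; the option labels are paraphrased, not exact UI strings from the prototype.

```python
# Illustrative sketch of the Texting menu organization (cf. Table 4.4):
# the user picks one top-level branch, then the remaining steps follow.
TEXTING_MENU = {
    "Send by contact name": ["Enter name phrase", "Search for match",
                             "Enter message text", "Send"],
    "Send by phone number": ["Enter number",
                             "Enter message text", "Send"],
}

print(list(TEXTING_MENU))  # ['Send by contact name', 'Send by phone number']
```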

Figure 4.13: Menu organization for the Texting (Text Message Sending) Activity.

Users can move the focus from one option to another using the up-down navigation keys on the keypad and select an option through the selector. Selecting the first option (see Fig. 4.14) redirects users to the name entry screen (see Fig. 4.15a); the second option redirects to the phone number entry screen, which is the same as Fig. 4.5c of the Calling use case already discussed in Section 4.2 of this chapter. After the entry (phone number or contact name) is finished, the system moves on to the text message entry screen.

The phone number entry procedure is the same as in the previous example described in Section 4.2.2.4 (Fig. 4.5c). In the Calling use case, selecting the 'Finish' widget in the interface of Fig. 4.5c initiates a call to the entered


4.5. An Example Prototype Utilizing EasyTap

Figure 4.14: Screenshot of the options that appear to users for entering a contact in the Text Message Sending (Texting) activity.

number. Here, in contrast, selecting the 'Finish' widget redirects users to the screen for composing the message text. The functionalities of all other widgets remain the same as before.

The contact name entry mechanism is built on the EasyTap entry procedure. The contact entry and text message entry mechanisms for this example prototype are discussed in the next two sections (Sections 4.5.3 and 4.5.4).

4.5.3 Contact Entry Mechanism

Contact entry can be done in two ways: users either enter the phone number of the target recipient (a procedure already discussed) or pick a contact already saved in the contact list. To search for a name in the contact list, users first input one or more characters using the EasyTap text entry mechanism (Fig. 4.15a). Selecting the bottom-right button of the interface once reads out the entered name phrase; selecting the same button a second consecutive time with the 'selector' key redirects to the


(a) Screenshot for entering the name phrase of the Text Message Sending (Texting) activity using EasyTap. A first-time selection of the bottom-right widget reads out the entered name phrase to be searched in the contact list; a consecutive second selection of this widget initiates the search and presents the list of matched contacts.

(b) Screenshot for selecting a contact name from the matched list for the Text Message Sending (Texting) activity.

Figure 4.15: Screenshots of the contact entry mechanism for the Text Message Sending (Texting) activity.


Figure 4.16: Screenshot for composing the message body of the Text Message Sending (Texting) activity. A first-time selection of the bottom-right widget reads out the composed message body; a consecutive second selection of this widget performs the sending task.

interface where all the matched contacts from the contact list appear in a linear order (Fig. 4.15b), which can be traversed with the up and down navigation keys. Users hear the matched contact names as the focus changes through the up-down navigation keys, and select the target contact by pressing the 'selector' key after hearing its name. Once a contact name is selected, the system automatically redirects to the text entry interface for entering the body of the message (Fig. 4.16). As in all the previous examples, the focus on a button can be changed with the 4-way navigation keys, and audio feedback accompanies each change of focus, so users can locate their target button on the screen by listening. The next section explains the text entry interface in the context of the Texting activity.
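The contact search step described above amounts to a prefix match of the entered name phrase against the saved contacts. A minimal sketch, with hypothetical contact names and a `speak()` stand-in for the prototype's voice feedback:

```python
# Sketch of the contact search step (Fig. 4.15). All names are illustrative;
# speak() stands in for the prototype's TTS feedback.
def speak(text):
    print(f"[TTS] {text}")

def search_contacts(contacts, name_phrase):
    """Return contacts whose names start with the entered phrase (case-insensitive)."""
    phrase = name_phrase.strip().lower()
    return [c for c in contacts if c.lower().startswith(phrase)]

contacts = ["Amit", "Anita", "Rahul", "Sima"]
matches = search_contacts(contacts, "a")
for name in matches:        # the user traverses this list with the up/down keys
    speak(name)
```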

4.5.4 Text Message Composing Using EasyTap

Our proposed text entry technique, EasyTap, was discussed in Section 4.4. To incorporate this entry technique into our prototype message sending example, we have only


changed the functionality of one button. The bottom-right widget in Fig. 4.16 reads out the entered message body when selected once with the 'selector' key. After hearing the message body, if users press the 'selector' again consecutively, the composed text message is sent to the previously entered phone number or selected contact name. All other functionalities of EasyTap, as explained in Section 4.4.2, remain the same.
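The read-then-send behaviour of the bottom-right widget can be modelled as a tiny two-state machine: the first 'selector' press reads the message back, and a consecutive second press sends it. The sketch below is illustrative; `speak` and `send_sms` stand in for the prototype's TTS and messaging calls:

```python
# Sketch of the bottom-right widget: first 'selector' press reads the message
# body, a consecutive second press sends it. speak/send_sms are hypothetical.
class SendButton:
    def __init__(self, speak, send_sms):
        self.speak, self.send_sms = speak, send_sms
        self.armed = False           # True after the message has been read out

    def on_other_key(self):
        self.armed = False           # any other key breaks the consecutive pair

    def on_selector(self, recipient, body):
        if not self.armed:
            self.speak(body)         # first press: read the composed body
            self.armed = True
        else:
            self.send_sms(recipient, body)   # consecutive second press: send
            self.armed = False

log = []
btn = SendButton(speak=lambda t: log.append(("speak", t)),
                 send_sms=lambda r, b: log.append(("send", r, b)))
btn.on_selector("9876543210", "hello")   # reads the body back
btn.on_selector("9876543210", "hello")   # sends the message
```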

4.5.5 Evaluation

We have already reported the performance of EasyTap with a set of participants. Our aim in this section was to determine whether those participants could perform the message sending activity once the EasyTap method was known to them. The same set of participants was given the text message sending task on the same device as used in Section 4.4.3. We prepared ten phrases to use as message bodies for the experiment: good morning, i am fine, good to see you, thank you, good evening, happy birthday, all the best, glad to meet you, looking fresh, best of luck.

We explained the whole message sending procedure to them. As practice, each participant sent the message text hello to a phone number they knew and to a contact already saved in their contact book.

Each participant was then given four phrases selected randomly from the phrase set, one phrase to send per trial, repeated for four trials in total. In two of the four trials they sent the phrase to two different phone numbers, and in the other two to two contacts previously stored in the address book. Before the experiment, each participant supplied two phone numbers they knew from memory and two contact names already stored in their address book; we cross-checked that each participant could indeed recall the respective phone numbers and contact names. Counterbalancing was maintained between these two types (texting to a phone number and texting to a contact name) of


message sending scenario.
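The trial assignment described above (four random phrases per participant, two sent to phone numbers and two to stored contacts, with the order of the two types counterbalanced across participants) can be sketched as follows; function names are illustrative:

```python
import random

# Sketch of the counterbalanced trial assignment used in the evaluation.
PHRASES = ["good morning", "i am fine", "good to see you", "thank you",
           "good evening", "happy birthday", "all the best",
           "glad to meet you", "looking fresh", "best of luck"]

def assign_trials(participant_index, rng):
    """Four distinct random phrases, two per recipient type, order alternated."""
    phrases = rng.sample(PHRASES, 4)
    types = ["number", "number", "contact", "contact"]
    if participant_index % 2:            # counterbalance order across participants
        types = types[::-1]
    return list(zip(types, phrases))

rng = random.Random(7)
for p in range(2):
    print(p, assign_trials(p, rng))
```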

In the earlier study, we measured the performance of the text entry technique EasyTap in terms of speed and accuracy. Here we need to check how well participants can handle the text message sending activity when it is implemented using EasyTap. A usability study was conducted with the participants to verify how well they could operate the current prototype interface, and to capture their comfort or any difficulties they might face during operation. Some assumptions were made during the design phase of the prototype; these are now transformed into a set of testable criteria. The design assumption and the related criteria are as follows.

Design Assumption: Users will comprehend the hierarchy of the prototype interface and the arrangement of system commands on the interface, and will be able to execute the commands efficiently.

Criterion 1: While performing a task, users can perceive the current stage of the interface presented to them at any given point of time.

Criterion 2: Users can easily select the correct steps to perform the message sending activity they are given.

Criterion 3: After completing the training, users can perceive, from the associated system feedback, the changes that occur in the interface in response to their system commands.

Criterion 4: Users can easily find the appropriate command on an interface.

The queries we framed to test the criteria, and the responses from the participants, are given in Table 4.5. Participants rated their agreement with each query statement on a 5-point Likert scale. The responses collected from the participants indicate acceptance in favor of our assumption.


Query statement                                                      P1  P2  P3  P4  P5
I clearly understand from the system feedback which screen
appears to me whenever it changes through the hierarchy to
complete the activity                                                 5   5   4   4   5
I clearly understand the next step I have to follow from the
current state of the activity                                         5   5   4   4   5
I was able to understand the changes that occurred in the
interface after each of my inputs to the system                       5   5   4   5   5
I was able to get whichever control I required at any state
of accessing the system                                               5   5   5   5   5
The overall interaction procedure with the system was easy            5   5   4   4   5
The overall sequence of activities towards completing the
message sending activity was easy to understand                       5   5   4   4   5

Table 4.5: Participants' agreement with the query statements used to test the Design Assumption, on a 5-point Likert scale: Strongly Agree (SA) = 5, Agree (A) = 4, Neutral (N) = 3, Disagree (D) = 2, Strongly Disagree (SD) = 1.
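A quick summary of Table 4.5 by per-statement mean (statement labels abbreviated here for readability) confirms that every statement averaged between Agree and Strongly Agree:

```python
# Mean Likert rating per query statement from Table 4.5 (5 = Strongly Agree).
# The short keys are abbreviations of the full statements, for readability.
ratings = {
    "screen awareness":  [5, 5, 4, 4, 5],
    "next step":         [5, 5, 4, 4, 5],
    "change perception": [5, 5, 4, 5, 5],
    "control access":    [5, 5, 5, 5, 5],
    "interaction ease":  [5, 5, 4, 4, 5],
    "sequence clarity":  [5, 5, 4, 4, 5],
}
means = {k: sum(v) / len(v) for k, v in ratings.items()}
for k, m in means.items():
    print(f"{k}: {m:.1f}")
```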

4.5.6 Discussion

In this Section 4.5, we have established that our proposed text entry mechanism, EasyTap, can be used to implement basic tasks on a phone. In support of this claim, we presented, as an example, a prototype system for the very popular Text Message Sending activity. Because our system is accessed only through the function keys, the proposed design is free of the overhead of dealing with a jumble of keys in the absence of vision. This procedure can therefore be claimed as a viable alternative system for blind mobile phone users.

4.6 Summary

The mobile phone is a very common tool used widely in everyday life, but it relies heavily on the vision of its users. Text entry is a task that enables many of the


basic functionalities of the mobile phone, yet people with visual impairment have limited access to the device. In this chapter, we described the difficulties blind people face when entering text with the MultiTap system: the inability to perceive the direction and distance of finger movements in the absence of vision, even with a good mental map of the keypad layout and the character distribution among keys, undermines accessibility. Considering their capabilities, we proposed EasyTap, a simple text entry technique that exploits users' experience and mental map of the character organization of the MultiTap system. EasyTap requires very little finger movement, and so depends far less on users perceiving their own directional and distance changes.

In this research, we also evaluated our system with blind participants and observed an improved text entry rate with fewer errors. Participants were able to reach a particular character more easily, could draw on their previous experience, had full control over the system, and were fully confident in its use.

Our proposed mechanism, EasyTap, can also be used to implement basic phone functionality. As an example, we took the popular Texting activity and showed how our entry system can be used to implement it. We conducted a study in which blind participants performed the Text Message Sending activity with our example prototype. The participants' responses establish that our system is easy to understand, that the overall interaction with it is easy and friendly, and that users have good control over the system. The keypad-based solution presented in this chapter thus has strong potential as an easy and friendly mobile phone access mechanism for people with blindness.

In addition, the navigation scheme, with proper voice guidance, can be explored for accessing many other applications, such as operating a music player, playing games, or setting an alarm. To implement this, we must specify all the functions and related sub-functions of each application, and then choose a suitable hierarchy of activities for the allied functions so that users can perform them with fewer


interaction steps, quickly, and with lower cognitive load.


Chapter 5

Gesture-Based Text Entry for Touch

Screen Devices

In Section 3.6 of Chapter 3, we identified the need for two kinds of solution approaches: a keypad-based solution and one for accessing touch screen based devices. Chapter 4 discussed the first, a suitable keypad-based solution approach that relies on the presence of tactile differences. In this chapter, we present the proposed gesture-based access mechanism on touch screen enabled mobile devices for blind people.

5.1 Introduction

Of late, touch-enabled mobile phones have hit the market; they remove the need for peripherals such as a keyboard and have become a trend among the population [1]. Touch-sensitive screens have given a new thrust to the general popularity of the technology and are set to occupy a very large market space. Note, however, that accessing devices through touch screens is a highly visually demanding procedure. Although these devices are gaining popularity day by day, the blind community faces great difficulty due to accessibility issues


in coping with touch-enabled devices. To bridge this gap, in this chapter we present VectorEntry, a text entry mechanism consistent with the character distribution and layout of the commonly used traditional 4×3 telephone keypad, but designed for touch-enabled mobile devices for blind people.

In this chapter, we present a user-friendly mobile text entry mechanism for blind people on touch-enabled mobile devices, aiming to make the touch screen accessible to them. Our research investigation is as follows. First, we explored how blind users' directional sense can be transferred to performing directional movement gestures on a flat device screen. We did this through two consecutive user studies; the second study was motivated by the observations of the first. Our investigation reveals that the directional flick gesture, based on a think once, move once strategy, is a fast and suitable interaction for them. Based on this observation, we propose a text entry mechanism and design an interface consistent with the character distribution of the traditional 4×3 telephone keypad layout. We observe that blind participants are able to use their knowledge of the telephone keypad layout to access applications on touch-based devices.

We compare our proposed technique with Bonner et al.'s No-Look Notes for the blind community, which is also built on a similar character distribution scheme. We found that our system achieves a better text entry rate with no significant difference in errors, and it also provides more comfort to users.

The organization of this chapter is as follows. The scope of this research and our objectives are discussed in Section 5.2. We conducted two user studies in this work to decide on a suitable directional gesture for our target users; the details of these two studies and the observed results are discussed in Sections 5.3 and 5.4, respectively. The first study explored the performance of guided directional movements on touch screen mobile devices, assisted by the device's landmarks such as its edges and corners, and the second study explored the performance of unguided flick gestures on a similar device. We propose a text entry mechanism based on the interaction procedure decided from these studies, presented in Section 5.5. Finally, Section 5.6 summarizes our findings from the work presented in this chapter.

5.2 Scopes and Objective of the Work

We have reported several initiatives taken by researchers to assist blind people, but these solutions are not free of limitations. In this section, we highlight the scope of this research in terms of the current issues our target users face with touch-enabled mobile devices, and the immediate urgency created for them by the recent boom in the market for touch-enabled smartphones.

The recent trend in mobile devices is clearly towards touch-enabled interaction. Several initiatives to make the touch screen accessible to blind users have been reported, but these too are not free of limitations. The major concern with touch screen devices is the lack of tactile features in the interaction. We have reported single-stroke gesture schemes such as Graffiti and Unistrokes for entering text on touch-based mobile devices, but performing these gestures again requires a high degree of perception of distance and directional changes during entry, which is very difficult in the absence of vision; moreover, a new set of gestures for the alphabet carries a heavy initial learning load. As an alternative, researchers have tried to carry knowledge of Braille over to touch screen devices, but the Braille literacy rate is significantly low within the blind community. Keeping this limitation in mind, researchers have tried to carry popular physical keypad designs over to touch-enabled devices. Apple's VoiceOver [8] uses a QWERTY layout on the touch screen, but the controlled scrawling over the screen required to locate a small widget makes the process very slow [14], and the demand for location accuracy on the screen hurts accessibility. Similar shortcomings exist for the touch-enabled implementation of the MultiTap [16] system and


also for No-Look Notes [14].

Touch-enabled mobile phones remove the need for peripherals such as a keyboard and have become a craze among the younger generation; touch-sensitive screens have added a new spark to the popularity of the technology and are set to occupy a very large market space. Given this technological shift, there is great scope to provide better accessibility for the blind community on touch-based devices, where the interaction procedure relies heavily on visual perception, both for issuing commands (input) and for accessing information (from the output screen). Considering the limitations of existing methodologies and the recent trend towards touch-enabled handheld devices, we set our objective in this work to identify users' capabilities and accordingly arrive at a suitable design for accessing mobile devices through gesture-based interaction for blind people. The plan of our research is as follows.

• To understand the efficacy of two types of gesture-based interaction, namely guided directional movements and unguided flick gestures. To do this, we perform two user studies.

• Based on the lessons learned, to develop a suitable interaction mechanism.

The main objective of our interaction mechanism is to provide a text entry system with the required functionalities. How we carried out the above tasks is discussed in the following sections.

5.3 To Study the Performance of Guided Directional Movements

We made careful observations to decide on the interaction mechanism, especially the directional gestures. Kane et al. [44] reported that the perception for a


particular position on the flat display surface of a mobile device is very difficult for blind users, but that landmarks such as the edges and corners of the device can be leveraged for better perception. These landmarks can thus be used to support better interaction: suitable gestures can be designed for blind people that are accomplished guided by these landmarks. In addition, blind people have developed a notion of directions from their miscellaneous day-to-day activities. We organized an initial study to better gauge their ability to perform directional movements on the flat, touch-enabled display of a mobile device, guided by spatial landmarks such as the device's edges and corners. Our experiment and the observed results are discussed in detail in the following.

5.3.1 Participants

Six participants (four males and two females) were recruited, with ages ranging from 32 to 56 (average age = 40.8). All of them were blind. They used computers, assisted by screen reader software, on a regular basis, and reported using mobile phones regularly with voice synthesis software. All were well acquainted with the traditional 4×3 telephone keypad layout as well as the QWERTY layout. Two of the participants were recruited from a local blind organization via word of mouth and the others came through them. Before giving any task, we ensured the following.

1. They knew the sides, that is, the edges (top, bottom, left, and right) of a device in use.

2. They also knew the corners in terms of top/bottom and left/right.

5.3.2 Apparatus

The participants were given a Samsung Galaxy Grand 2 to orient themselves with the device (its screen and bezels). It has a quad-core 1.2 GHz processor, 1.5 GB of RAM, and a 5.25-inch screen with 720×1280 resolution, and runs on the Android platform. As there was no tactile difference between the touch-enabled display area and the non-active outer


surface of the device's interacting face, it was difficult for our participants to confine their movements to the active touch screen area rather than the whole interacting face of the device. To provide tactile guidance distinguishing the active touch screen area from the non-sensitive outer surface on the same plane, we fixed a strip 3 mm wide and 1 mm high around the active screen area, following the tactile identifier guideline in [24]. A custom-made application ran on the device, capturing and logging all movements on the screen.

5.3.3 Design of Experiments

In this section, we discuss the procedure for performing the directional movements and define the unit entry used in the current experiment. We explain the details of the unit entry, including the procedure to perform it, followed by the details of the experimental design.

5.3.3.1 Procedure for a Unit Entry

Participants were instructed to move in two specific manners. The first type was from one side (edge) to the opposite side (we term this a simple directional move), and the second was from one corner to its opposite corner (we term this a diagonal move). Instructions were limited to the basic eight directions: four simple directional moves (bottom to top, top to bottom, right to left, and left to right) and four diagonal moves (bottom-right to top-left, bottom-left to top-right, top-left to bottom-right, and top-right to bottom-left) in each trial. When a participant was ready to perform a gesture, a directional instruction was conveyed by playing a sound saying the name of the direction (say, bottom to top) to be performed. On hearing the direction, the participant performed the corresponding movement. A sound saying ok acknowledged each correct entry; unmatched entries were signalled by an error sound. We term this process of entering a single directional movement gesture


a unit entry in this experiment. After completing a unit entry, participants readied themselves for the next directional movement gesture; we allowed a sufficient time gap (adjusted to 10 seconds for all participants) to get ready for the next entry after completing one unit. To ensure that participants could hear all the sound prompts, a pair of speakers was plugged into the headset jack of the phone. Task time for each unit entry was measured from the start of the sound prompt saying the direction name to the release of the finger from the active screen area on completion of the gesture. As soon as the finger was released from the screen, the outcome of the gesture was decided and either an error sound or a sound saying ok was prompted by the system accordingly.
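A unit entry amounts to classifying the swipe, from first to last touch point, into one of the eight prompted directions. The sketch below illustrates this; the 22.5° tolerance is an arbitrary illustrative choice, not a parameter taken from the experiment:

```python
import math

# Sketch of unit-entry checking: classify a swipe by the angle of the segment
# from first to last touch point, then compare with the prompted direction.
# The 22.5-degree tolerance is an illustrative assumption.
DIRECTIONS = {                      # direction name -> ideal angle (degrees)
    "left to right": 0, "bottom-left to top-right": 45,
    "bottom to top": 90, "bottom-right to top-left": 135,
    "right to left": 180, "top-right to bottom-left": 225,
    "top to bottom": 270, "top-left to bottom-right": 315,
}

def check_unit_entry(prompted, start, end, tol=22.5):
    """Return True ('ok' sound) if the swipe matches the prompted direction."""
    dx, dy = end[0] - start[0], start[1] - end[1]   # screen y grows downwards
    angle = math.degrees(math.atan2(dy, dx)) % 360
    diff = abs(angle - DIRECTIONS[prompted])
    return min(diff, 360 - diff) <= tol

print(check_unit_entry("bottom to top", (300, 1200), (310, 100)))  # near-vertical swipe up
```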

5.3.3.2 Experiment

To decide how the directions would be presented to the participants, we constructed octuples (eight-tuples) of trial sequences from the eight directions. We made 50 such tuples; each tuple contained all eight directions, each appearing exactly once, and the sequence of directions was unique across all 50 octuples. In each trial, a participant was assigned a tuple, which fixed the order in which directions were presented. The experiment was run in two device orientations, height dominant and width dominant (see Fig. 5.1), counterbalanced across participants. The device was kept fixed on a table in front of the participant.

In summary, the experiment comprised 6 participants × 2 device orientations × 5 trials × 8 directional moves per trial = 480 directional moves, of which half were simple directional moves and half diagonal moves.
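Generating the 50 unique octuples amounts to drawing 50 distinct permutations of the eight directions; a minimal sketch:

```python
import random

# Sketch of building 50 unique octuples: 50 distinct permutations of the
# eight directions, each direction appearing exactly once per tuple.
DIRECTIONS = ["bottom to top", "top to bottom", "right to left", "left to right",
              "bottom-right to top-left", "bottom-left to top-right",
              "top-left to bottom-right", "top-right to bottom-left"]

def make_octuples(n=50, seed=1):
    rng = random.Random(seed)
    seen, tuples = set(), []
    while len(tuples) < n:
        perm = tuple(rng.sample(DIRECTIONS, len(DIRECTIONS)))
        if perm not in seen:            # enforce uniqueness across tuples
            seen.add(perm)
            tuples.append(perm)
    return tuples

octuples = make_octuples()
```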

Before starting the experiment, we arranged a short training session in which participants were briefed about the system and their task. They were given the device with the strip fixed around the screen, so they could feel it physically and get familiar with the setup. They were allowed to practise the directional movements for 20 minutes and to raise any


(a) Height dominant position.

(b) Width dominant position.

Figure 5.1: Different positions of a phone set as presented to the participants.

queries regarding the experiment and system. In total, the session lasted 40 minutes per participant. We cross-checked that all participants were familiar with the basic eight directions and could distinguish the active screen area surrounded by the strip. No one reported any difficulty in distinguishing any edge or corner of the test device, nor any trouble perceiving the sounds prompted by the system.

5.3.4 Observation, Results and Discussion

We logged their moves on the screen and observed the following.

1. Simple directional movements: Simple directional moves were performed successfully. For each logged move we measured the tilt of the line segment joining the first and last touch points. The average tilt angles for horizontal and vertical moves, pooled over both device orientations, were 4.28° and 3.96°, respectively. For horizontal moves, the amount of tilt in the two device orientations showed no significant difference (paired t-test, p = 0.4, n.s.), and likewise for vertical moves (paired t-test, p = 0.71, n.s.); we used a 95% confidence interval for the t-tests. On average, the simple directional moves in the two device positions did not


(a) Angle of tilt measured in the two device orientations.

(b) Time taken to perform simple directional movements in the two device orientations.

Figure 5.2: Angle of tilt and time taken to perform simple directional movements, measured for the participants in the height dominant and width dominant orientations of the phone.

have a significant difference in tilt angle (paired t-test, p = 0.58, n.s.; Fig. 5.2a). The average time taken for the simple directional moves in the two device positions also showed no significant difference (paired t-test, p = 0.9, n.s., 95% CI; Fig. 5.2b). For simple directional moves in the portrait (height dominant) orientation, the variances for P1-P6 were 0.1936, 0.1, 0.1576, 0.1336, 0.1416, and 0.1096, respectively; in the landscape (width dominant) orientation they were 0.1336, 0.148, 0.1, 0.112, 0.1436, and 0.092, respectively.
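The tilt measurement reported above can be sketched as the angular deviation of the segment joining the first and last touch points from the ideal axis of the prompted move:

```python
import math

# Sketch of the tilt measurement: deviation of the segment joining the first
# and last touch points from the ideal horizontal (or vertical) axis.
# The example swipe coordinates are invented for illustration.
def tilt_angle(first, last, axis):
    """Tilt in degrees of the swipe relative to 'horizontal' or 'vertical'."""
    dx, dy = last[0] - first[0], last[1] - first[1]
    angle = abs(math.degrees(math.atan2(dy, dx)))      # 0..180
    if axis == "horizontal":
        return min(angle, 180 - angle)                 # distance from 0/180 degrees
    return abs(90 - angle)                             # distance from 90 degrees

# a left-to-right swipe that drifts 50 px downward over 700 px
print(round(tilt_angle((100, 500), (800, 550), "horizontal"), 2))
```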

2. Diagonal movements: We begin this point with a detailed description of the measurement procedure for the diagonal movements, followed


(a) Measurement of angles for the diagonal movements in the height dominant position of the device.

(b) Measurement of angles for the diagonal movements in the width dominant position, treating the directions as in the height dominant orientation.

(c) Measurement of angles for the diagonal movements in the width dominant position, treating the diagonal movements as similar with respect to a fixed corner zone, as in the height dominant orientation.

Figure 5.3: Measurement procedures for angular movements in the different device orientations.

by the results for the same. In case of diagonal moves, starting from a corner, they

first reached to the opposite nearer edge then move to the respective corner by

following the edge. It was a general tendency captured during the experiment. We

treated the finger tip is at the corner if it touches with-in 1cm2 (152pixel2) square

bounded box of the respective corner. We considered the finger tip touches an edge

when it comes within the 76 pixel (5mm) distant from the screen’s actual edge and


we named this line the edge line. We measured the angle of the line segment from the first touch point in the corner region to its first crossing of the edge line, and took this as the angle of the diagonal move. Figure 5.3 shows, with a symbolic example, how the diagonal moves and the respective angles are measured in both the height dominant and width dominant device orientations.

This is further explained as follows. The bounding box ABCD represents a device screen in the height dominant position (Fig. 5.3a). Let L1 represent the line segment of a left-bottom to top-right diagonal movement that starts in corner-zone A and ends at the BC edge line. Then ∠a is the angle measured for L1, that is, for the corresponding diagonal move. Similarly, if L2, L3, and L4 represent right-bottom to top-left, right-top to left-bottom, and left-top to right-bottom moves, respectively, then ∠b, ∠c, and ∠d are the associated angles. Likewise, when the device appears to the participants in the width dominant position (Fig. 5.3b), ∠a, ∠b, ∠c, and ∠d measure the angles of the line segments L1, L2, L3, and L4, which represent the diagonal moves from left-bottom to top-right, right-bottom to top-left, right-top to left-bottom, and left-top to right-bottom, respectively. We measured the angles of the movements for both device orientations.

Averaged over all successfully entered diagonal movements, all diagonal directions, and all participants, the angle was 56.8° in the height dominant position and 34.1° in the width dominant position. A diagonal movement gesture was treated as successfully entered when it started accurately in the instructed corner-zone and also ended in the instructed corner-zone at the first release of the touch point. Releasing the touch point before reaching the target corner-zone was the commonly observed miss. The difference in angle values between the two device orientations was probably due to the participants' attempts to perceive the size of the device screen and perform accordingly. This becomes apparent again when the angles are measured with respect


to fixed device corners. Unlike the angle-measurement style shown in Fig. 5.3b, if we measure the angles as shown in Fig. 5.3c, the same effect, namely the participants' effort to fit their moves to the device size, is reflected. In Fig. 5.3a, ∠a is measured for the move from corner-zone A to corner-zone C; in Fig. 5.3c, ∠a is measured in the same way for the move from corner-zone A to corner-zone C. We call this style of angle measurement the measurement with respect to the fixed corners; the angles ∠b, ∠c, and ∠d are the corresponding measures in Fig. 5.3a and Fig. 5.3c. Note that Fig. 5.3b and Fig. 5.3c show the same device position, obtained by rotating the Fig. 5.3a position 90° clockwise. To avoid confusion between them: Fig. 5.3b and Fig. 5.3c present the same device orientation (width dominant); only the style of measuring the angles of the diagonal moves differs. The pair Fig. 5.3a and Fig. 5.3b explains the measurement style with respect to fixed directions, whereas the pair Fig. 5.3a and Fig. 5.3c describes the measurement style with respect to fixed device corners. When the angles were measured in the Fig. 5.3c style, the average angle of the diagonal moves, over all diagonal directions and all participants, was 55.9°. This value is much closer to the average angle measured as in Fig. 5.3a, and the observation further strengthens the conclusion that our participants tried to perceive the size of the device screen presented to them and to fit their movements accordingly.

3. Time for movements: We also measured the time taken for the diagonal movements, for all moves of each participant, in both device orientations. The average time over all successfully entered diagonal movements showed no significant difference between the two device orientations (paired t-test, p = 0.52, n.s.; Fig. 5.4a). The simple directional moves took less time than the diagonal moves for all participants, and the difference was significant in both device

(a) Time taken to perform diagonal movements in the two device orientations. H denotes the height dominant device position and W the width dominant device position.

(b) Time taken to perform simple directional movements versus diagonal movements in the height dominant device orientation.

(c) Time taken to perform simple directional movements versus diagonal movements in the width dominant device orientation.

Figure 5.4: Movement times taken by the participants in the height dominant and width dominant orientations of the phone given to them.

orientations (for the height dominant position: paired t-test, p < 0.05, s., Fig. 5.4b; for the width dominant position: paired t-test, p < 0.05, s., Fig. 5.4c). T-tests were run at a 95% CI.

Discussion: The results indicate that participants performed the simple moves easily, with little tilt with respect to the corresponding axes. The diagonal movements raised a few concerns. We observed that the difference between the angles measured in the two device orientations was much larger when the angles were measured in terms of fixed directions, but much smaller when they were measured with respect to the device corners. This clearly indicates that the participants tried to perceive the screen size and to fit their movements to it in the different device orientations. Although blind people strongly prefer gestures that use landmarks such as the edges and corners of the screen, rather than going directly from one corner to its diagonally opposite corner they found a more elementary way to get there, by taking the help of the opposite edge as mentioned earlier. In other words, our users failed to take advantage of a corner-to-corner movement guided only by the corner landmarks (instead they performed a corner-edge-corner movement), and as a result they had to traverse a relatively long path that took more time. There may also have been extra load on the participants in perceiving the screen dimensions and coping with the different orientations of the device; this may be one cause of the diagonal movements being much slower than the simple directional movements.

5.3.5 Motivation to Go for Further Study

Although blind people strongly prefer gestures that use landmarks such as the corners and edges of the screen, the diagonal moves demanded the accuracy of reaching a fixed corner location starting from another corner-zone. Participants had to scrawl a certain portion of the path, from the starting corner until reaching the opposite edge. In this middle portion of the path, participants did not get any


assistance from any edges, corners, or other landmarks. They scrawled this path relying only on their own spatial perception, which included the perception of the screen size in the two (height dominant and width dominant) orientations and the perception of directional and distance changes during free scrawling on the screen.

As an alternative, it would be interesting to explore a different gesture type: if participants can perform short directional moves accurately without the assistance of landmarks such as edges or corners, then such gestures place less demand on the location accuracy needed to reach a particular zone or edge and, at the same time, do not require scrawling through a long path without assistance, which in turn may lower the time to complete a gesture. In the next study, we explored this possibility.

5.4 To Study the Performance of Unguided Flick Gestures

In this study, we investigated the participants' proficiency in performing the eight basic directional moves. The aim was to test the following questions: without the support of edges, corners, and other landmarks, would blind people be willing to perform short directional moves (flicks in the eight basic directions)? Could they perform them with accuracy similar to the previous study? If so, which approach would be more time efficient? Our experiment and the observed results are as follows.

5.4.1 Apparatus

The apparatus setup was similar to that of the previous study (Section 5.3.2). The only difference was that the border strip along the screen boundary was withdrawn. The study was again conducted in both device orientations, as before.


5.4.2 Participants

We continued the experiment with the same participants as in the previous study (Section 5.3.1). The participants experienced no technical changes in the period between the end of the first study and the start of the second. The second study started three days after the end of the first.

5.4.3 Procedure and Experimental Design

In this section, we describe the procedure for performing the unguided flick gestures and define a unit entry for the current experiment. We explain the details of the unit entry, including the procedure to perform it, followed by the details of the experimental design.

5.4.3.1 Input Mechanism for Unit Entry

In this study, participants had to perform the eight basic directional flick gestures. The experimental procedure was similar to that of the previous study, but instead of being guided by edges and corners, and instead of starting and ending at fixed landmarks, participants were instructed to perform the eight basic directional flicks freely, anywhere on the touch-enabled device screen. They were aware that the tilt of a performed flick gesture would be measured as the angle of tilt of the line segment joining the first and last touch points. We instructed them to perform the flicks based on a think once, move once strategy, rather than adjusting the direction of movement in the middle of scrawling. Once a participant was ready, the name of a direction was played, and on hearing it the participant performed the respective flick. Sound feedback saying ok, or an error sound, was generated for each correct and incorrect entry, respectively. This process was termed a unit entry for this study. Participants had to become ready for the next move immediately after completing a unit entry, and we maintained a sufficient gap (10 seconds for all participants) between two unit entries. The task time for each unit entry was measured from the starting time stamp


of the sound prompt of the direction name to the time stamp of the touch release from the active screen area. At the moment the touch is released from the screen, the performed flick gesture is judged correct or incorrect, and the corresponding sound feedback is played.
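The task-time measurement just described reduces to a difference of two logged timestamps per unit entry. A minimal sketch; the log-record field names here are ours:

```python
# Task time for a unit entry: from the timestamp of the direction prompt
# to the timestamp of the touch release (field names are illustrative).
def task_time(entry):
    return entry["release_ts"] - entry["prompt_ts"]

log = [
    {"prompt_ts": 0.0,  "release_ts": 1.8,  "correct": True},
    {"prompt_ts": 11.8, "release_ts": 14.1, "correct": True},
    {"prompt_ts": 24.1, "release_ts": 25.6, "correct": False},
]
# Only correctly entered moves count towards the averages reported later.
correct_times = [task_time(e) for e in log if e["correct"]]
print(round(sum(correct_times) / len(correct_times), 2))
```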

5.4.3.2 Design of the Experiment

We followed a procedure similar to the previous study to build the trial sets for the participants, and we maintained the same scheme for presenting the tuples of trial sequences. This experiment was also run in the two device orientations, the height dominant and width dominant positions (Fig. 5.1), counterbalanced across participants as in the previous study.

The experiment was similar to the previous one. In brief, it comprised 6 participants × 2 device orientations × 5 trials × 8 directional moves per trial = 480 flick gestures, half of which were simple directional flicks and half diagonal flicks. As in the previous study, before starting the experiment we ran a brief training session in which participants were briefed about the experiment and their task. They were given the system to become familiar with it and were allowed to ask any questions about it, and they were given 15 minutes to practice the directional flicks. We again cross-checked our system and ensured that no one had any difficulty hearing its sounds.

5.4.4 Observation, Results and Discussions

5.4.4.1 Detection of Gestures

We logged all moves on the screen and measured the angle of tilt of the line segment joining the first and last touch points. Participants were free to place their gestures at any location on the screen, which was intended to reduce the need for location accuracy. To detect the direction of the line segments obtained from the gesture input, we pursued the following scheme. The whole two-dimensional space can be considered as the


full 360°, and can be divided into eight sectors of 45° each. If the tilt of the line segment is within the 45° range around the vertical line (±22.5° of vertical), the gesture is considered a vertical flick: if its direction is upwards it is an up flick, and if downwards a down flick. The left flick and right flick are detected similarly, with the tilt measured with respect to the horizontal: if the tilt of the line segment is within the 45° range (±22.5°) around the horizontal line, the gesture is treated as a horizontal flick, towards the left a left flick and towards the right a right flick. The remaining conjugate angle zones are clearly separated by these zones, and the diagonal flicks (up-right flick, up-left flick, down-left flick, down-right flick) were detected in the same way from the angle of tilt and the direction. We observed the following:

1. Simple directional flicks: The angles of all the simple directional flicks were measured, and each flick was accordingly judged as rightly or wrongly performed. Our main concern was the accuracy of the flicks, that is, whether they were performed correctly, rather than their exact angles of tilt. We found that all simple directional flicks were performed successfully. Whether these flicks can serve as a successful alternative, however, is decided by the next criteria.

2. Diagonal flicks: In the case of diagonal flicks, most of the moves were correctly entered. The angles of the diagonal flicks in both device orientations were measured in the way described in the previous study (Fig. 5.3). Averaged over all successful diagonal flicks, all diagonal directions, and all participants, the angle was 47.2° in the height dominant position and 41.4° in the width dominant position. A diagonal flick was treated as a successful entry when its angle range and direction matched those the participant was


(a) Average time taken per gesture in both experiments.

(b) Number of misses occurred during the entry of diagonal gestures in both experiments.

Figure 5.5: Comparison of performance for the diagonal gestures used in the Initial Study and Study 2, in terms of average time taken per gesture and the number of misses that occurred during the experiments.

instructed to perform. The commonly observed miss was an angle that did not fall into the range of the instructed directional flick. Figure 5.5b shows the total number of misses when performing diagonal flicks in this study, as well as the total number of misses when performing diagonal movements in the previous experiment. The difference in the total number of misses between the two studies was not significant (paired t-test, p = 0.36, n.s., 95% CI; Fig. 5.5b). However, the difference between the captured angle values (47.2° and 41.4°) was much smaller than in the previous study (where the values were 56.8° and 34.1°, respectively). Moreover, in the width dominant position, when the respective angles were measured with


respect to the fixed device corners, as described for Fig. 5.3a and Fig. 5.3c in the previous study, the average angle was 48.6°, which is not far from the average angle (47.2°) measured in the height dominant position. So the participants did not try to refit their moves when the device orientation changed, which was also not required in this case.

3. Time for flicks: We also measured the time taken to perform the flick gestures for all moves of each participant. The average time over all correct moves differed significantly between the simple directional flicks and the diagonal flicks in both device orientations (for the height dominant position: paired t-test, p < 0.05, s., Fig. 5.6a; for the width dominant position: paired t-test, p < 0.05, s., Fig. 5.6b), but the average time taken for the flicks in the two device orientations did not differ significantly (paired t-test, p = 0.19, n.s.; Fig. 5.6c). The most important finding, however, is that the average time per gesture was significantly lower in the current study than in the previous experiment (paired t-test, p < 0.001; Fig. 5.5a). Tests were run at a 95% CI. The mean time to complete one gesture was 3.2 s in the previous study, whereas in the current study it was 1.9 s. We considered only the correctly entered moves when computing the average gesture times for both studies.
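The direction-detection scheme described at the start of Section 5.4.4.1 — eight 45° sectors, ±22.5° around each basic direction — can be sketched as follows; the function name and the y-grows-upwards convention are ours:

```python
import math

# Eight basic flick directions, counter-clockwise starting from "right".
DIRECTIONS = ["right", "up-right", "up", "up-left",
              "left", "down-left", "down", "down-right"]

def classify_flick(first, last):
    """Classify a flick by the tilt of the segment joining the first and
    last touch points (y grows upwards in this sketch)."""
    dx, dy = last[0] - first[0], last[1] - first[1]
    angle = math.degrees(math.atan2(dy, dx)) % 360.0   # 0 deg = right, CCW
    sector = int(((angle + 22.5) % 360.0) // 45.0)     # nearest 45 deg sector
    return DIRECTIONS[sector]

print(classify_flick((0, 0), (100, 10)))    # shallow tilt, within +/-22.5 deg
print(classify_flick((50, 50), (10, 90)))   # diagonal towards up-left
```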


(a) Time taken to perform simple directional flicks versus diagonal flicks in the height dominant device orientation.

(b) Time taken to perform simple directional flicks versus diagonal flicks in the width dominant device orientation.

(c) Average time taken to perform flicks in the height dominant and width dominant device orientations.

Figure 5.6: Movement times taken by the participants in the height dominant and width dominant orientations of the phone presented to them.


5.4.4.2 Discussion

In spite of their strong preference for gestures that use landmarks such as the edges and corners of the device, we found that the blind participants performed the eight basic directional flick gestures, which were not assisted by any edges or corners, faster and with similar accuracy compared with the long directional scrawls from one corner to its diagonally opposite corner or from one edge to its opposite edge. In landmark-to-landmark scrawling, especially from corner to corner, participants had to scrawl a certain portion of the path from the starting corner until they reached the opposite edge. In this middle portion of the path, they received no assistance from edges, corners, or other landmarks, and relied only on their own spatial perception, which included the perception of the screen size in the two orientations and of directional and distance changes during free scrawling on the screen. In contrast, in the second study the directional flick gestures followed the think once, move once strategy, which is simpler in nature and reduces the demand for spatial perception along the scrawling path; it also greatly reduces the demand for location accuracy, since the directional flicks can be performed anywhere on the active screen area of the device. In addition, the directional flicks took significantly less time per gesture than in the guided-gestures study and placed less demand on perceiving the screen size and performing accordingly. Hence, we consider the directional flick gestures investigated in the second study to be the better gesture set. In the next section, we propose a text entry mechanism that uses the unguided flick gesture set and reuses users' knowledge of the 4×3 telephone keypad.


5.5 VectorEntry: a Text Entry Mechanism by Directional Flicks

In this section, we propose VectorEntry, a text entry mechanism for people with visual impairment, accessed through touch-based interaction on mobile devices. The principle behind this approach is to reuse the popular character distribution of the 4×3 telephone keypad, accessed through an efficient set of directional flick gestures. We now discuss the entire procedure of our proposed mechanism, followed by a detailed user evaluation.

5.5.1 Entry Mechanism: a Two-step Procedure

VectorEntry follows a two-step entry mechanism: the first step is the selection of a letter group, and the second is the selection of a letter from the selected group. Next, we elaborate on both the group and letter selection procedures.

5.5.1.1 Group Selection Mechanism

In the traditional 4×3 telephone keypad, a group of three or four letters is assigned to each key. We keep this distribution of letters the same; in our proposed VectorEntry mechanism, each group is assigned either to one directional flick or to a double tap on the screen. We reproduce the mapping of character sets to the keys of the 4×3 telephone keypad so that users can exploit their previous knowledge of that layout. We coined the term VectorEntry because the direction of a flick gesture decides which group it selects. The mapping from the gesture set to the character groups is given in Table 5.1. The basis of the group mapping is as follows.

First we consider the selection of 'Group 1' to 'Group 9', except 'Group 5'. An up-left flick gesture selects 'Group 1'. This flick may be thought of as a scrawl from


Alphabets      Group    Flick direction/type
Punctuation    Group 1  up-left
[a,b,c]        Group 2  up
[d,e,f]        Group 3  up-right
[g,h,i]        Group 4  left
[j,k,l]        Group 5  double tap
[m,n,o]        Group 6  right
[p,q,r,s]      Group 7  down-left
[t,u,v]        Group 8  down
[w,x,y,z]      Group 9  down-right
['space']      NA       right then down
['backspace']  NA       left then down

Table 5.1: Distribution of alphabets by groups and associated flicks. NA = Not Applicable.

key 9 to key 1 on the traditional 4×3 telephone keypad, a factor that may help users keep the gesture association in mind. The assumption that the direction of a gesture can be remembered by thinking of it as a kinetic scrawl from one key to another was verified and is reported in the evaluation section (see Table 5.7). Similarly, an up flick may be thought of as a kinetic scrawl from key 8 to key 2, and selects 'Group 2'. In the same way, the up-right, right, down-right, down, down-left, and left flicks may be thought of as kinetic scrawls from key 7 to key 3, key 4 to key 6, key 1 to key 9, key 2 to key 8, key 3 to key 7, and key 6 to key 4, respectively. Figure 5.7 presents a typical example of selecting 'Group 7', that is, [P,Q,R,S], by a flick towards down-left, whose direction can be remembered as a move from key 3 to key 7 (see Fig. 5.7(a) and Fig. 5.7(b)). The 'Step 2' portion of this example shows how each character can be selected after group selection; the letter selection procedure is described in the next section. Here, the direction towards a key selects the associated group. This policy is adopted only to keep the mapping from flick direction to key-group in mind. For example, the tilt of the up-right and down-left flick gestures is similar, but the direction of the up-right flick is towards key 3 and hence selects 'Group 3', whereas the direction of the down-left flick is towards key 7 and hence will


Figure 5.7: An example scenario of the two-step entry mechanism of VectorEntry. 'Step 1' describes group entry; 'Step 2' describes letter entry within the selected group. (a) shows the letter distribution on the telephone keypad. (b) shows that a move towards down-left (the direction from key 3 to key 7 in (a)) selects the group 'PQRS'. (c) a left flick after group selection selects 'P'. (d) an up flick after group selection selects 'Q'. (e) a right flick after group selection selects 'R'. (f) a down flick after group selection selects 'S'.

select 'Group 7'. 'Group 5' is selected by a double tap anywhere on the screen. A right flick followed by a down flick enters a 'space' character; a left flick followed by a down flick deletes the last entered character. The system provides sound feedback announcing the group name just after the gesture is performed. Detection of the angle of tilt of a gesture


and the decision on its direction were exactly the same as in the study in Section 5.4. As in that study, users could place their gestures anywhere on the active touch screen area, so the demand for location accuracy is reduced. After detecting a gesture, the system announces the group name again and moves to the next screen for letter selection.

5.5.1.2 Character Selection Mechanism

After a particular group has been selected, a character is selected using only the up, down, left, and right flick gestures. For example, after Group 7 is selected through a down-left flick (see Fig. 5.7(b)), a second flick enters 'P', 'Q', 'R', or 'S' when its direction is left, up, right, or down, respectively (see Fig. 5.7(c), Fig. 5.7(d), Fig. 5.7(e), and Fig. 5.7(f) in the 'Step 2' portion of Fig. 5.7). That is, after a group is selected, a left flick selects the first character of the group, up the second, right the third, and down the fourth (if it exists). Automatic sound feedback announcing the entered character is delivered to the user to confirm the entry.
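The two-step mechanism can be sketched as a small state machine driven by already-classified gestures (per Table 5.1). This is an illustrative sketch, not the thesis implementation: the class and names are ours, and the punctuation group and the compound space/backspace flicks are omitted for brevity:

```python
GROUP_BY_GESTURE = {                 # step 1: first gesture -> letter group
    "up": "abc", "up-right": "def", "left": "ghi", "double-tap": "jkl",
    "right": "mno", "down-left": "pqrs", "down": "tuv", "down-right": "wxyz",
}
LETTER_INDEX = {"left": 0, "up": 1, "right": 2, "down": 3}  # step 2

class VectorEntry:
    def __init__(self):
        self.group = None            # pending group between step 1 and step 2
        self.text = []

    def gesture(self, g):
        """Feed one detected gesture; return the entered letter, or None."""
        if self.group is None:                      # step 1: group selection
            self.group = GROUP_BY_GESTURE.get(g)
            return None
        letters, self.group = self.group, None      # step 2: letter selection
        i = LETTER_INDEX.get(g)
        if i is None or i >= len(letters):
            return None                             # miss: nothing entered
        self.text.append(letters[i])
        return letters[i]

ve = VectorEntry()
for g in ["up", "right", "up", "left", "up", "up"]:   # enters 'c', 'a', 'b'
    ve.gesture(g)
print("".join(ve.text))
```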

5.5.2 User Evaluation

We conducted a user study, discussed in this section, to evaluate our prototype system VectorEntry. The study aims to assess the benefit of VectorEntry for our target users in terms of text-entry rate, errors occurring during entry, and users' control and ease while performing the task.

5.5.2.1 Deciding Prototype Mechanism as Test Cases

We have utilized the character distribution of the traditional 4×3 telephone keypad layout with touch-based interaction on mobile devices. The most closely related existing technique, No-Look Notes [14], proposed by Bonner et al., is built on


a similar character distribution strategy, with gesture-based interaction on touch-based mobile devices, so we chose it for comparison with our proposed prototype mechanism. In No-Look Notes (Fig. 5.8), the total screen area is split into eight angular pie slices and each slice is assigned a group of letters (Fig. 5.8a). Touching the screen announces the group being touched, and a split-tap while resting that finger selects the group; the system then moves to a secondary screen where the characters of that segment appear in a linear layout (Fig. 5.8b). Users select their target character in a similar fashion, when it is announced, via a second-finger tap on the screen. Because entry happens via a second-finger tap, users can drag and explore around the screen, guided by the screen edges, without any unwanted entries. A quick one-finger swipe to the left is a 'backspace' and a quick one-finger swipe to the right is a 'space'. A backspace on the character entry screen returns the user to the group selection screen.

(a) Eight angular slices of No-Look Notes. Each slice represents a set of characters.

(b) Linear representation of characters in No-Look Notes after selection of a set.

Figure 5.8: Two-step text entry mechanism in No-Look Notes.
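The group-selection step of No-Look Notes can be sketched as a mapping from a touch point to one of the eight angular slices. The slice boundaries and the order of the groups around the circle are assumptions for illustration; the source only states that the screen is divided into eight pie slices, each holding a group of letters.

```python
import math

# Illustrative sketch of No-Look Notes' group selection: map a touch
# point to one of eight 45-degree pie slices around the screen centre.
# Slice boundaries and group order are ASSUMED; screen coordinates are
# taken with y growing downward, as on most touch devices.

SLICE_GROUPS = ["ABC", "DEF", "GHI", "JKL", "MNO", "PQRS", "TUV", "WXYZ"]

def group_under_touch(x: float, y: float, cx: float, cy: float) -> str:
    """Return the letter group of the slice containing touch (x, y)."""
    angle = math.atan2(y - cy, x - cx)             # -pi..pi, 0 = along +x
    angle = (angle + 2 * math.pi) % (2 * math.pi)  # normalize to 0..2*pi
    index = int(angle // (math.pi / 4)) % 8        # eight 45-degree slices
    return SLICE_GROUPS[index]

# A touch directly to the right of the centre falls in the first slice.
assert group_under_touch(200, 100, 100, 100) == "ABC"
```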

We compare No-Look Notes with VectorEntry in terms of entry speed, accuracy, and user friendliness. Both prototypes contain only the letters and the space character, along


5. Gesture-Based Text Entry for Touch Screen Devices

with a backspace option. Each prototype also announces the group and character selected by the user; backspace is signalled with a 'beep' sound.

5.5.2.2 Experiment

In this section, we describe the details about the experiment.

5.5.2.2.1 Participants Eight volunteers (five males, three females), aged 20 to 48 years, participated in our experiment. All participants were experienced users of keypad-based mobile phones: each owned a traditional 4×3 keypad phone and used it on a regular basis, with more than four years of experience of using such a phone with voice support. None of them, however, had any experience of operating a touch-based device. Details about the participants are given in Table 5.2.

User  Age  Gender  Education   Impairment period  Previous phone set used
P1    20   Female  12th Grade  10 years           Nokia E5
P2    35   Male    Graduate    29 years           Nokia N73 Talk
P3    38   Female  Graduate    32 years           Samsung Hero
P4    36   Male    Graduate    11 years           Nokia 1110i, Nokia E5
P5    29   Male    Graduate    26 years           Nokia N92
P6    48   Male    Graduate    31 years           Nokia N73
P7    36   Male    Graduate    32 years           Nokia E5
P8    46   Female  Graduate    41 years           Nokia N92

Table 5.2: Basic characterization of the participants.

5.5.2.2.2 Task To evaluate VectorEntry, we performed several trials with the target population in a calm and quiet environment. Each participant performed the entry task with both input prototypes, No-Look Notes and VectorEntry, using a dictation-based entry procedure. Before the experiment, a five-day practice and training session was run. At the beginning of the first day of training, participants were briefed about the experiment. On each day of the practice session, all participants spent a 45-minute session with


each mechanism to get familiar with the system; they were allowed to ask any question about the systems and practiced a few phrases from an alternative corpus, the Children's Phrase Set [72]. After all participants confirmed strong familiarity with both prototype systems, the testing sessions began and were logged for each participant. For the experiment we used MacKenzie and Soukoreff's phrase set [70], which contains 500 phrases with a correlation of 0.954 with the letter frequencies of English. As in typical text entry studies, our study used only lower-case characters, space, and backspace.

5.5.2.2.3 Design of Experiment: The experimental study consisted of text entry tasks on both prototypes, No-Look Notes and VectorEntry. After the five-day practice and training session, we started logging the participants' performance. The experiment ran over 12 days, and each day participants had to enter 5 phrases (varying across days) with each of No-Look Notes and VectorEntry. At the beginning of each day, we presented the five phrases to them and ensured that no one had difficulty understanding any phrase and that everyone knew the spelling of each word of every phrase. We allowed a one-hour gap between the two systems: during the test phase, each participant performed two sessions per day, one for each prototype. The order in which the prototype systems were presented was counterbalanced across participants. In each session, five phrases taken from the phrase set [70] had to be entered with one prototype. Participants were able to rest between two phrases. The timer started after the user entered the first character and continued until the transcription matched the target phrase completely. Incorrect entry of a character caused an error sound, and the participant needed to backspace it and enter the correct one. On completion of a phrase, the timer paused and the system prompted the next phrase. Participants were allowed to take a short break here, after which the current phrase to be entered was spoken again as a reminder. Participants were instructed to perform the entry task as correctly and quickly as they could.


In summary, the experimental design was: 2 prototypes (No-Look Notes, VectorEntry) in two sessions per day × 5 phrases per session × 12 days = 120 phrases per participant; with 8 participants, a total of 960 phrase entries were logged during the experiment.
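The totals above follow directly from the design parameters and can be checked mechanically:

```python
# The experimental-design bookkeeping quoted above, checked explicitly.
prototypes = 2           # No-Look Notes, VectorEntry
phrases_per_session = 5
days = 12
participants = 8

phrases_per_participant = prototypes * phrases_per_session * days
total_phrases = phrases_per_participant * participants

assert phrases_per_participant == 120
assert total_phrases == 960
```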

5.5.2.3 Results

The eight participants entered a total of 960 phrases in typing tests on the two text entry prototypes. We discuss: 1) text entry speed; 2) errors in typing; and 3) subjective comparisons between the two systems. Table 5.3 and Table 5.4 give a brief summary of our findings. The details of text entry rates, errors during entry, and subjective comparisons are reported below.

Measurement parameter                                                  No-Look Notes   VectorEntry
Lowest text entry rate (wpm) by a single participant,
  averaged over all sessions                                           0.7             1.9
Highest text entry rate (wpm) by a single participant,
  averaged over all sessions                                           3.1             4.9
Average text entry rate (wpm) over all sessions and
  all participants                                                     1.8             3.3
Overall increase in text entry rate in favor of VectorEntry                  83.3%
Mean error rate captured during entries                                0.17            0.19

Table 5.3: Summary of text entry rates and error rates for No-Look Notes and VectorEntry.

5.5.2.3.1 Text Entry Rate: The text entry rate in words per minute (wpm) was measured in each session for all participants to assess the speed. The words per minute


(WPM) is calculated according to [71] as:

    WPM = ((|T| − 1) / S) × 60 × (1 / 5)

where |T| is the length of the transcribed string, S is the time taken to enter the text in seconds, and the factor 1/5 counts 5 characters per word, following [71]. In a day, each

participant performed both techniques in two different sessions, and the wpm in every session was calculated. Figure 5.9a shows the overall text entry rates for all participants on both systems. The average text entry rates achieved by the participants with the two mechanisms ranged from 0.7 to 4.9 wpm. The highest individual text entry rate was recorded with VectorEntry and the lowest with No-Look Notes: individual rates ranged from 0.7 to 3.1 wpm with No-Look Notes and from 1.9 to 4.9 wpm with VectorEntry. The overall mean text entry rates across participants were 1.8 wpm for No-Look Notes and 3.3 wpm for VectorEntry, an 83.3% increase in favor of VectorEntry. This difference in entry rate was significant (95% CI) by a paired t-test (p<0.05). Figure 5.9b shows the day-wise average performance of the participants.
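As a concrete illustration of the wpm measure defined above (our own sketch, not the study's logging code):

```python
def words_per_minute(transcribed: str, seconds: float) -> float:
    """WPM = ((|T| - 1) / S) * 60 * (1/5): every 5 characters count as
    one word, and the first character is excluded because timing starts
    with it (MacKenzie's convention, [71])."""
    return (len(transcribed) - 1) * 60.0 / seconds / 5.0

# A 26-character phrase entered in 60 seconds gives 5 wpm.
assert abs(words_per_minute("the quick brown fox jumped", 60.0) - 5.0) < 1e-9
```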

5.5.2.3.2 Error Rate: We calculate the error rate as

    Error Rate (ER) = Incorrect Characters Entered (IC) / Correct Characters in Target Phrase (CT).

The same measure was used by Wigdor and Balakrishnan in [73] and by Bonner et al. in No-Look Notes [14]. Figure 5.9c shows the error rate in typing for all participants with each technique. The overall mean error rate across participants was higher for VectorEntry: the mean error rates were 0.17 for No-Look Notes and 0.19 for VectorEntry. This difference, however, was not significant (95% CI, p=0.15) according to a paired t-test.
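The error-rate measure and the paired t statistic used in this analysis can be sketched as follows. The t statistic is the standard paired formulation; the numbers in the sample calls are made up for illustration, not the study's data.

```python
import math

def error_rate(incorrect_entered: int, correct_in_target: int) -> float:
    """ER = IC / CT, the measure defined above."""
    return incorrect_entered / correct_in_target

def paired_t(xs, ys):
    """Paired t statistic for matched samples, e.g. per-participant
    rates under the two prototypes."""
    diffs = [x - y for x, y in zip(xs, ys)]
    n = len(diffs)
    mean = sum(diffs) / n
    var = sum((d - mean) ** 2 for d in diffs) / (n - 1)  # sample variance
    return mean / math.sqrt(var / n)

assert error_rate(19, 100) == 0.19          # illustrative values
assert abs(paired_t([2, 4, 6, 8], [1, 2, 3, 4]) - 3.873) < 0.01
```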


(a) Average of the text entry rates in wpm for all the participants with No-Look Notes and VectorEntry.

(b) Day-by-day text entry rates averaged across the participants with No-Look Notes and VectorEntry.

(c) Error rates in the study for all the participants with No-Look Notes and VectorEntry.

Figure 5.9: Comparison between the two touch-based two-step text entry mechanisms, No-Look Notes and VectorEntry, in terms of entry speed and errors during entry.


Query statement                                      NN ratings (P1-P8)   VE ratings (P1-P8)   NN (mean, sd)   VE (mean, sd)   Paired t-test
Easy to reach or select a particular group           4 3 2 3 3 4 3 2      4 3 2 3 4 5 3 3      (3.00, 0.57)    (3.37, 0.84)    n.s., p=0.08
Easy to reach or select a particular letter*         4 3 3 3 4 5 4 3      5 4 3 4 5 5 5 4      (3.62, 0.55)    (4.34, 0.55)    s., p<0.05
Prior usage habit with traditional 4×3 telephone
  keypad was helpful                                 5 4 3 4 5 5 5 4      5 4 3 5 5 5 5 4      (4.38, 0.55)    (4.50, 0.57)    n.s., p=0.35
Control over the prototype system*                   3 2 2 2 3 4 3 2      5 3 3 3 4 5 4 3      (2.60, 0.55)    (3.75, 0.79)    s., p<0.05
Feel confident when using it                         4 3 2 3 3 4 3 2      4 3 2 3 4 4 4 3      (3.00, 0.57)    (3.37, 0.55)    n.s., p=0.08
Fun to use                                           4 2 2 2 3 4 3 3      4 3 2 3 4 4 3 3      (2.87, 0.70)    (3.25, 0.50)    n.s., p=0.08
Overall interaction with the prototype system
  was easy to use                                    3 2 2 2 3 3 3 2      5 4 3 4 5 5 5 4      (2.50, 0.29)    (4.50, 0.57)    s., p<0.05
Fast to use*                                         3 2 2 2 3 4 3 3      5 3 2 2 4 5 4 3      (2.75, 0.50)    (3.50, 1.43)    s., p<0.05
Practice may improve performance*                    5 3 3 3 4 4 4 3      5 4 3 4 4 5 4 4      (3.60, 0.55)    (4.12, 0.41)    s., p<0.05

Table 5.4: Subjective ratings on No-Look Notes (NN) and VectorEntry (VE) on a 5-point Likert scale, per participant P1-P8. Strongly Agree (SA) = 5, Agree (A) = 4, Neutral (N) = 3, Disagree (D) = 2, Strongly Disagree (SD) = 1. s. indicates the difference (paired t-test) is significant and n.s. indicates the difference is not significant; significant statements are marked with *.


Subjective preferences: We performed a subjective evaluation of the two test prototypes. A set of query statements was put to the participants, and each participant recorded agreement on a 5-point Likert scale. Table 5.4 presents the results of this subjective evaluation for all participants. The results indicate that our target users accepted VectorEntry as an easily operable and friendly text entry mechanism on a touch-based mobile device. They also clearly show that the difficulty of perceiving one's finger movement during entry with No-Look Notes was much reduced with VectorEntry. Participants were able to gain control over the VectorEntry mechanism and to transfer their knowledge of the 4×3 telephone keypad to the touch-based interaction technology, which is a real challenge without visual perception. Hence, the results support VectorEntry as a potential, effective, and better alternative.

5.5.2.3.3 Usability study for VectorEntry A usability study was conducted with the participants to verify how well users can operate the current prototype interface and to identify the difficulties they faced during operation. Several assumptions were made during the design phase of the prototype; these assumptions were then transformed into a set of testable criteria. The design assumptions and their related criteria are as follows.

Design Assumption 1: Users would comprehend the two-step entry process of the prototype interface, and their previous experience of using a phone with a traditional 4×3 telephone keypad would help them understand the process. The respective criteria for this assumption are as follows.

Criterion 1.a: After being trained, users are able to understand that the first step (group selection) is similar to reaching a particular key on a traditional 4×3 telephone keypad before pressing it.

Criterion 1.b: After being trained, users are able to understand that the second step (letter selection) is similar to deciding the number of presses needed on a particular key in a traditional


4×3 telephone keypad system. In the current prototype, they instead have only to decide the associated direction of the flick.

Criterion 1.c: Users perceive that after every two successful flick operations exactly one letter is entered into the system. While performing, users are able to perceive which stage of the interface they are in at any given point of time.

Queries we made to test the criteria and the responses from the participants are given in Table 5.5.

If a letter entry on the traditional 4×3 telephone keypad system is considered as a combination of two stages, the first being to reach the particular key through which the target letter can be accessed and the second being consecutive presses of that key until the target letter appears, then the above results clearly indicate that users were able to grasp the two-step entry system of the presented prototype. They were able to match the entry procedure with their previously used traditional 4×3

Query statement                                                    P1      P2      P3      P4      P5      P6      P7      P8
Stage 1 of the traditional 4×3 telephone keypad system matches
  step 1 or step 2 of VectorEntry?                                 Step 1  Step 1  Step 1  Step 1  Step 1  Step 1  Step 1  Step 1
Step 2 of VectorEntry matches stage 1 or stage 2 of the
  traditional 4×3 telephone keypad system?                         Stage 2 Stage 2 Stage 2 Stage 2 Stage 2 Stage 2 Stage 2 Stage 2
Subjective rating: previous experience of using the traditional
  4×3 telephone keypad was helpful to perceive the two-step
  entry model of VectorEntry                                       5       4       4       5       5       5       5       5

Table 5.5: Responses from the participants to the query statements for Design Assumption 1. Agreements are on a 5-point Likert scale. Strongly Agree (SA) = 5, Agree (A) = 4, Neutral (N) = 3, Disagree (D) = 2, Strongly Disagree (SD) = 1.

keypad and hence that experience was helpful to understand the presented prototype.


For the first two queries participants had to give a direct answer, and for the third query statement they had to record their agreement on a 5-point Likert scale.

Design Assumption 2: Users would understand the arrangement of the letters into groups in the presented prototype. The respective criteria for this assumption are as follows.

Criterion 2.a: If a letter is presented to the users, its related group can be easily identified by them.

Criterion 2.b: If a group is presented to the users, all the letters under that group can be named by them.

Just after describing our system, we put two types of queries to the participants: we named a letter and they had to answer which group it was assigned to, and vice versa. This question-answering took place after the current prototype was described to the participants and before any text entry started. Each participant had to answer five queries of each type. All queries were answered correctly, and hence this assumption was satisfied.

Design Assumption 3: Users would understand the system commands and feedback of the interface and could execute the commands efficiently. The respective criteria for this assumption are as follows.

Criterion 3.a: After completion of the training, users are able to issue the appropriate command (directional flick gesture) to enter a target letter.

Criterion 3.b: After completion of the training, users are able to perceive, from the associated system feedback, the changes that occur in the interface as a result of their commands.

Queries we made to test the criteria and the responses from the participants are given in Table 5.6.

Participants had to record their agreement on a 5-point Likert scale against each query


Query statement                                                    P1  P2  P3  P4  P5  P6  P7  P8
I clearly understood when I was going to select a group,
  not a letter                                                     5   5   4   4   5   5   5   5
I clearly understood when I was going to select a letter,
  not a group                                                      5   5   4   4   5   5   5   5
Just after completing a letter entry, I understood from the
  system feedback which letter I had just entered                  5   5   4   5   5   5   5   5
Just after selecting a group, I understood from the system
  feedback which group I had just selected                         5   5   5   5   5   5   5   5

Table 5.6: Agreements recorded by the participants for the query statements placed to test Design Assumption 3. Agreements are on a 5-point Likert scale. Strongly Agree (SA) = 5, Agree (A) = 4, Neutral (N) = 3, Disagree (D) = 2, Strongly Disagree (SD) = 1.

statement. The responses collected from the participants support our assumption.

Design Assumption 4: Users will be able to transfer their knowledge of using a phone with the traditional 4×3 telephone keypad when they use VectorEntry. The respective criteria for this assumption are as follows.

Criterion 4.a: The experience gained with the traditional 4×3 telephone keypad helps users memorize the distribution of letters within the groups.

Criterion 4.b: The experience gained with the traditional 4×3 telephone keypad helps users memorize the directions of the flick gestures and their related groups in the current prototype.

Queries we made to test the criteria and the responses from the participants are given in Table 5.7. From the results it is clear that previous knowledge of the 4×3 telephone keypad helped our participants keep the letter-group mapping in mind and that, except for two of them, participants successfully utilized that experience to recall the flick directions for the associated groups of letters.

Query statement                                                    P1  P2  P3  P4  P5  P6  P7  P8
Previous knowledge of using a phone with the traditional 4×3
  telephone keypad system was helpful to understand the
  letter-group mapping in the current prototype                    5   5   5   5   5   5   5   5
Memorizing the direction of the flick needed to select a
  particular group was easy                                        5   4   3   4   5   5   5   5
Memorizing the direction of the flick to select a group was
  easy because it can be seen as moving from one key to
  another on the traditional 4×3 keypad (give your remarks
  only if you found it easy, otherwise no need to reply)           5   3   NR  4   5   5   5   4

Table 5.7: Agreements recorded by the participants for the query statements placed to test Design Assumption 4. Agreements are on a 5-point Likert scale. Strongly Agree (SA) = 5, Agree (A) = 4, Neutral (N) = 3, Disagree (D) = 2, Strongly Disagree (SD) = 1, NR = Not replied.
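The keypad analogy the participants describe can be made concrete: reading the flick direction of a group as the direction from the centre key (5) to the group's key on the 4×3 grid reproduces, for instance, the down-left flick for group 7 mentioned earlier. This is our interpretation of the analogy, not code from the thesis, and how key 5's own group is selected is not covered.

```python
# Sketch of the keypad analogy: the flick direction for a group is read
# as the direction from the centre key (5) to the group's key on the
# 4x3 grid (our interpretation, not the thesis implementation).

KEY_POS = {k: divmod(k - 1, 3) for k in range(1, 10)}  # key -> (row, col)

def flick_direction(target_key: int) -> str:
    """Direction from key 5 to target_key on the 3x3 letter-key grid."""
    r0, c0 = KEY_POS[5]
    r1, c1 = KEY_POS[target_key]
    vert = {-1: "up", 0: "", 1: "down"}[r1 - r0]
    horiz = {-1: "left", 0: "", 1: "right"}[c1 - c0]
    return "-".join(d for d in (vert, horiz) if d)

assert flick_direction(7) == "down-left"  # group 7 ('PQRS'), as in the text
assert flick_direction(2) == "up"
```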

5.5.3 Discussion

Through our user experiment, we compared the two text entry systems in terms of entry speed, errors during entry, and subjective usability. The results show that the entry rate is considerably higher with VectorEntry than with No-Look Notes. The error rate is slightly higher with VectorEntry, but despite that, users type significantly faster with VectorEntry than with No-Look Notes. VectorEntry not only beats No-Look Notes in entry rate but also stands as the easier and more friendly interactive method. Participants found both the group selection and the character selection steps easier in VectorEntry than in No-Look Notes. The main reason raised by the participants is that the demand for location accuracy was much lower in both the group and character selection


steps of the VectorEntry technique compared with No-Look Notes. Thus, lower dependence on location accuracy, together with directional gestures based on a think-once, move-once strategy, wins over the kinetic-scrawl strategy of reaching a region on the screen.

5.6 Summary

In this work, we analyzed the directional sense of blind people on the small screens of mobile devices and proposed a text entry technique driven by directional flick gestures, based on a think-once, move-once strategy, performed anywhere on the device screen. Spatial landmarks on the device, such as the edges and corners of the screen, play an important role in perceiving and interacting with a device in the absence of visual assistance. We analyzed the blind participants' ability to make directional movements assisted by these on-screen spatial landmarks and found that participants were able to perform such movements. They were able to sense the screen size and tried to fit their movements accordingly. Most strikingly, they were also able to transfer their sense of direction from day-to-day activities to perform short directional flick gestures successfully on the touch-enabled screens of mobile devices. Their success was not limited to this: when we analyzed the results, we found that performance was better when participants performed directional flicks anywhere on the screen, following the think-once, move-once policy, than when they made directional movements with the support of the device's spatial landmarks. We therefore utilized this ability of our users to design an efficient text entry interface, and as an outcome we proposed a prototype named VectorEntry. The design of VectorEntry uses the character distribution of the 4×3 telephone keypad, and access depends on users' ability to make directional flicks freely, anywhere on the device screen. Blind participants were able to apply their knowledge of the telephone keypad layout to successfully grasp the design of VectorEntry. In the end, VectorEntry proved more efficient in terms of entry speed, errors, and ease of use than No-Look Notes, which is also developed for touch screen devices and


based on a similar character distribution to the telephone keypad.

Additionally, the flick gestures adopted in our approach can be used to implement many other applications, such as controlling a music player, playing games, or setting an alarm. This requires exploring a suitable hierarchy of activities through which an application can be built; the hierarchy should be easily understandable and require few interaction steps to access.


Chapter 6

Conclusion and Future Research

In this thesis, we present user interface mechanisms for people with visual impairment to access mobile phones. At the beginning, we surveyed the existing work that supports enhanced accessibility of mobile devices for people with visual impairment and categorized it with respect to the different solution platforms. We then analyzed the existing interaction modalities through which people with visual impairment access hand-held mobile devices, and traced the different usability issues faced by our target users. We addressed our research in two specific directions. The first is towards finding a solution in the presence of enhanced tactile features, since the sense of touch plays a remarkable role in non-visual interaction. The second is towards finding a suitable access mechanism for touch-enabled mobile devices, where blind users cannot rely on any tactile differences on the device.

Keypad-based solution in the presence of tactile sense: Following the first direction, we found that widely used keypad-based mobile phones provide tactile differences between keys, but the presence of a jumble of keys hampers finding the target key on the keypad. Accordingly, we investigated a design that uses fewer keys while still letting users perform


their functionalities. We presented such a design together with an example prototype. We then conducted a user study to show that finding a particular key on the keypad without vision is less time-consuming and easier in our new keypad setup. Next, we concentrated on performance in terms of text entry rate and errors during entry with the new setup, and redesigned the access procedure accordingly. We proposed a text entry mechanism that demands very little movement on the keypad but fully exploits users' habits and experience from previous mobile phone use. A further user study found that our proposed technique is faster and more accurate than the widely used traditional system. Finally, we discussed a way to incorporate our proposed text entry technique into any basic functionality, demonstrating it with a prototype of a text-message-sending activity.

From our investigation and the related experiments and user studies, we can offer some guidelines to future contributors in this research direction:

• Tactile differences among the control points are a key factor of accessibility for people with visual impairment.

• A smaller number of physical keys can provide better tactile features on a mobile phone device.

• Avoid design solutions that demand long-path movements to access any part of the system.

• Acquiring knowledge of device use in the absence of vision is very difficult, so provide solutions that are consistent with users' previous usage habits.

Solution to access touch-based devices: The second direction of our investigation concerns access to touch-enabled mobile devices by visually impaired people. To investigate their performance on such devices, we conducted a user study in which we checked their performance both when relying on their own directional sense and when using the spatial landmarks (e.g. corners and edges) of the device.


We found that they were able to sense the size (that is, the height and width) of the screen when moving from one corner to its diagonally opposite corner, and adjusted their movement path accordingly. However, any interaction that demands long-path traversal on the flat display screen is not suitable for them, whereas they can perform short directional movement gestures very efficiently. We integrated this ability of our target users with their usage habits from the traditional keypad layout, and accordingly proposed a new text entry procedure for touch-enabled mobile devices. We performed a user study and established that our technique is faster and easier to use than No-Look Notes, which was built on a touch-and-explore technique.

From our investigation and the related experiments and user studies, we can offer some guidelines to future contributors in this research direction:

• Avoid designs that require long movements on the screen to perceive positional changes.

• Users with visual impairment try to perceive the screen size while performing tasks on the screen.

• Users with visual impairment can perform directional movements on the flat display screen of a device.

• Common knowledge from daily life can be utilized in designing solutions for accessing devices.

• Knowledge from previous device use with one interaction modality can be utilized to provide an access mechanism for another interaction modality.

• Performing a familiar gesture anywhere on the screen is preferable to demanding spatial accuracy to access the device.


6.1 Contributions of the Thesis

We summarize the major contributions of this thesis as follows.

• Exploring a Suitable Mode of Interaction: We provide an in-depth report that analyzes the pros and cons of the various existing modes of interaction used by blind people to access mobile devices. We inspected their usage habits, interaction suitability, and related factors, and accordingly specified their requirements. We propose a keypad condition that benefits them in several respects, and we present an example prototype of a calling use case that explains how basic phone functionality can be handled using the proposed keypad condition.

• Keypad-based solution: We present a keypad-based interaction mechanism for blind users that incorporates enhanced tactile features. We propose a fast and less error-prone keypad-based text entry mechanism for blind people that utilizes our target users' experience with the 4×3 telephone keypad while alleviating the accessibility issues present in that keypad.

We design prototype use cases to show how our proposed interaction mechanism can be utilized to implement basic phone functionalities.
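For context, the traditional multi-tap scheme that our target users already know from the 4×3 telephone keypad can be sketched as follows. The letter groups are the standard ITU E.161 arrangement; the function itself is our own illustrative code, not the thesis's improved entry mechanism:

```python
# Standard ITU E.161 letter groups on the 4x3 telephone keypad.
KEYPAD = {
    "2": "abc", "3": "def", "4": "ghi", "5": "jkl",
    "6": "mno", "7": "pqrs", "8": "tuv", "9": "wxyz",
}

def multitap(key, presses):
    """Return the letter produced by pressing `key` `presses` times in
    a row, cycling through that key's letter group."""
    letters = KEYPAD[key]
    return letters[(presses - 1) % len(letters)]
```

It is precisely this familiarity (e.g., pressing "7" four times yields "s") that our keypad-based mechanism builds on while removing the keypad's accessibility barriers.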

• Gesture-based solution: We explore directional flick gestures as a potential input technique for blind users to access touch-screen mobile devices.

We design a user interface that lets blind people enter text on touch-based mobile devices by drawing on their knowledge of the traditional alphanumeric 4×3 telephone keypad.
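One hypothetical way to pair keypad familiarity with location-independent flicks (this pairing, the flick order, and all names below are our own illustration, not the exact interface the thesis proposes) is to let a short flick select a letter within the currently announced key's group:

```python
# Standard ITU E.161 letter groups on the 4x3 telephone keypad.
KEYPAD = {
    "2": "abc", "3": "def", "4": "ghi", "5": "jkl",
    "6": "mno", "7": "pqrs", "8": "tuv", "9": "wxyz",
}

# Hypothetical convention: flick direction picks a letter within the
# announced key's group; "down" is only needed on the 4-letter keys.
FLICK_ORDER = ("left", "up", "right", "down")

def select_letter(key, flick):
    """Map a (key, flick-direction) pair to a letter, or None when the
    key's group has no letter in that slot."""
    letters = KEYPAD[key]
    index = FLICK_ORDER.index(flick)
    return letters[index] if index < len(letters) else None
```

Because each selection is a single short directional gesture, such a design avoids both long path traversals and the demand for spatial accuracy.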



6.2 Significance of our Research

• We analyzed in depth the different modes of interaction and their practicality for visually impaired people accessing mobile devices. From that analysis, we clearly specified the demands along two research directions.

• Our research provides a direction for building a real-life usable mechanism, with tactile features, by which people with visual impairment can access mobile phones.

• Knowledge of simple daily-life activities of blind people can be utilized to give them access to touch-based mobile devices. We present a novel text entry mechanism for people with visual impairment to access mobile phones.

6.3 Future Scope of Work

This work, however, leaves a number of issues open and problems unaddressed, giving scope for further extension of this research. We mention a few such issues below.

• Present the information on a mobile phone in a manner such that users need to traverse a minimum number of steps to perform any activity.

• Incorporate our proposed text entry method into complete use cases on touch-enabled mobile devices.

• Employ multi-modal interaction techniques for visually impaired users to access mobile phones.

• Implement various mobile applications, such as a music player, games, an alarm, and a calculator, in a manner such that users need to traverse a minimum number of steps to perform an activity using our proposed interaction techniques.


Dissemination of Research Work

• Tuhin Chakraborty and Debasis Samanta, “A Text Entry Mechanism with Touch Enabled Mobile Devices for Blind Users”, ACM Transactions on Accessible Computing (TACCESS). [Under Revision]

• Tuhin Chakraborty and Debasis Samanta, “EasyTap: An Easy Text Entry Method for Blind Mobile Phone Users”, Australian Journal of Intelligent Information Processing Systems (AJIIPS), Vol. 13, No. 4, pp. 41–47, 2014.

• Tuhin Chakraborty and Debasis Samanta, “Exploring an effective interaction mode for blind mobile users in India”, Proceedings of the Asia Pacific Conference on Computer Human Interaction (APCHI), ACM, pp. 371–378, 2013, Bangalore, India.

• Tuhin Chakraborty and Debasis Samanta, “BlindGuide: An audio based eyes-free caller guide for people with visual impairment”, Proceedings of the International Conference on Intelligent Human Computer Interaction (IHCI), IEEE Xplore, pp. 1–6, 2012, Kharagpur, India.

Others

• Tuhin Chakraborty, Sayan Sarcar and Debasis Samanta, “Design and Evaluation of a Dwell-free Eye Typing Technique”, Proceedings of the CHI ’14 Extended Abstracts on Human Factors in Computing Systems, ACM, pp. 1573–1578, 2014, Toronto, Canada.

• Sayan Sarcar, Prateek Panwar and Tuhin Chakraborty, “EyeK: An Efficient Dwell-Free Eye Gaze-Based Text Entry System”, Proceedings of the Asia Pacific Conference on Computer Human Interaction (APCHI), ACM, pp. 215–220, 2013, Bangalore, India.
