
Interaction Devices: Output Devices

Lecture 12, Date: 9th March


Overview of Lecture

Visual output – screens: CRT, LCD, projector

Sound output

- suitable for specific application scenarios

- speech synthesis – uses concatenation or synthesis-by-rule

- speech applications


Interaction Device Introduction

[Diagram: Interaction Devices, divided into Input Devices and Output Devices]


Output Devices

• Output devices are those devices that convert the computer's response into a form perceptible by a human, known as output

• Wide range of output techniques, each of which has advantages and disadvantages depending upon the context in which they are used

• Output devices can be categorised into:
• Visual output devices
• Sound output devices


Visual Output Devices

• Visual output using a screen is by far the most common form of output today

• The display has become the primary source of feedback to the user from the computer

• The display has many important features, including:
• Physical dimensions (usually the diagonal dimension and depth)
• Resolution (the number of pixels available)
• Number of available colors, color correctness
• Luminance, contrast, and glare
• Power consumption
• Refresh rates (sufficient to allow animation and video)
• Cost
• Reliability



Visual Output Devices

•Raster-scan cathode-ray tube (CRT)

•Liquid-crystal displays (LCDs)

•Projectors

•Printers



Visual Output Devices - CRTs

•The most common screen technology is the Cathode Ray Tube (CRT) screen
•+ Cheap
•+ Can support rapid animation
•+ High colour capability
•- Bulky device
•- Has associated health risks – radiation concerns. Flicker, poor legibility and low contrast can cause eyestrain and fatigue



Visual Output Devices - LCDs

•Light, flat plastic screens
•+ Smaller (than CRTs)
•+ Lighter (than CRTs)
•+ Use less power (than CRTs)
•+ No radiation problems
•+ Less tiring than CRT displays, and reduce eye-strain, because the light is reflected rather than emitted
•Typically used in personal organisers or notebook computers



Visual Output Devices - Projectors


• Projects output from computer onto another medium

• Useful where output is required for a group rather than an individual

• Wide range in quality. High-end projectors can cost up to £10,000


Visual Output Devices - Printers


• A popular printing technology builds up characters on the page, as on the screen, as a series of dots:

• dot-matrix printers
• inkjet printers
• thermal printers or fax machines
• laser printers
• color printers
• photographic printers


Sound Output

•Sound Output devices

•Application scenarios

•Speech synthesis – uses concatenation or synthesis-by-rule

•Speech applications



Sound Output Devices

•Speakers

•Various types of speakers, e.g. common coil-and-cone, electrostatic, and direct digital speakers

•Headphones



Application Scenarios

•Applications where sound complements a visual interface (e.g. computer games)

•Applications where alerting and feedback are required (e.g. incorrect action alert)

•Applications where eyes are engaged in another task (e.g. flight decks)

•Applications where background processes need to be monitored (e.g. disk access noise when saving a file)

•Applications for blind or partially sighted users

•Applications for illiterate users

•Applications for users with limited mobility



Speech Synthesis

•Speech Synthesis: the process of automatic generation of speech output from data input, which may include plain text, formatted text or binary objects. Text-to-speech generation is an example of speech synthesis (W3C)

•Two methods of speech synthesis:
•Concatenation
•Synthesis-by-rule



Speech Synthesis

•Concatenation:
•Digital recordings of real human speech are stored by the computer
•May be stored as sentence, phrase or word segments
•Usually applied to applications with small vocabularies of 200 words or less
•Example: "the number you require is …" (a sketch of this approach follows below)
•Demo – personal organiser demo
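To illustrate the concatenation approach, here is a minimal sketch in Java that joins two pre-recorded clips into one utterance using the standard javax.sound.sampled API. The clip file names and the phrase being assembled ("the number you require is" + "five") are hypothetical, and the clips are assumed to share the same audio format; a real system would manage a library of such segments.

import java.io.File;
import java.io.SequenceInputStream;
import javax.sound.sampled.AudioFileFormat;
import javax.sound.sampled.AudioInputStream;
import javax.sound.sampled.AudioSystem;

public class ConcatenationSketch {
    public static void main(String[] args) throws Exception {
        // Two pre-recorded segments of human speech (hypothetical file names)
        AudioInputStream phrase = AudioSystem.getAudioInputStream(new File("number-you-require.wav"));
        AudioInputStream digit  = AudioSystem.getAudioInputStream(new File("five.wav"));

        // Join the raw audio streams back to back (both clips must share one format)
        AudioInputStream joined = new AudioInputStream(
                new SequenceInputStream(phrase, digit),
                phrase.getFormat(),
                phrase.getFrameLength() + digit.getFrameLength());

        // Write the assembled utterance out as a single WAV file for playback
        AudioSystem.write(joined, AudioFileFormat.Type.WAVE, new File("utterance.wav"));
    }
}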



Sound Output Devices

•Synthesis-by-rule
•Does not use recorded human speech
•Is commonly used in text-to-speech
•A set of rules about phonemes, and rules relating to the context of a sentence or phrase, is used to synthesise words and sentences
•When used with databases, has the potential to produce a much larger range of responses than speech produced by concatenation
•Can sound artificial
•Useful where larger vocabularies are required



Sound Output Devices

•Synthesis-by-rule
•Phonemes
•Are used for speech generation using synthesis-by-rule
•Are used where the words needed for the application cannot be predicted in advance
•~40 phonemes in English, e.g. the words "bad" and "had" differ in their initial phoneme (a toy illustration follows below)
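To give a feel for how letter-to-phoneme rules can drive synthesis-by-rule, below is a deliberately tiny, hypothetical Java sketch that maps spellings to an informal phoneme string. Real systems use hundreds of context-sensitive rules plus prosody modelling; this is only a toy, and the phoneme symbols are informal rather than a real inventory.

import java.util.LinkedHashMap;
import java.util.Map;

public class TinyLetterToSound {
    // A few illustrative spelling-to-phoneme rules (multi-letter rules listed first)
    private static final Map<String, String> RULES = new LinkedHashMap<>();
    static {
        RULES.put("sh", "SH");
        RULES.put("th", "TH");
        RULES.put("a",  "AE");
        RULES.put("b",  "B");
        RULES.put("d",  "D");
        RULES.put("h",  "HH");
        RULES.put("e",  "EH");
    }

    // Convert a word to a space-separated phoneme string by matching rules left to right
    public static String toPhonemes(String word) {
        StringBuilder out = new StringBuilder();
        int i = 0;
        while (i < word.length()) {
            boolean matched = false;
            for (Map.Entry<String, String> rule : RULES.entrySet()) {
                if (word.startsWith(rule.getKey(), i)) {
                    out.append(rule.getValue()).append(' ');
                    i += rule.getKey().length();
                    matched = true;
                    break;
                }
            }
            if (!matched) {
                i++; // no rule for this letter: skip it
            }
        }
        return out.toString().trim();
    }

    public static void main(String[] args) {
        System.out.println(toPhonemes("bad")); // prints: B AE D
        System.out.println(toPhonemes("had")); // prints: HH AE D
    }
}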



Speech Applications

•So far, we have discussed speech as an input (speech recognition) and speech as a type of sound output (using speech synthesis)

•Applications that use speech recognition and/or speech synthesis are called speech applications



Speech Applications – Hardware Requirements

•In order to use a speech application, the computer must have:

•Microphone (for audio input)

•Sound card (for audio reproduction)

•Digital signal processor card (for some speech recognition systems)

•Speaker (for audio output)

•Headphones (for audio output)

•Adequate processing power

•Adequate memory


Speech Applications – Specific Software Requirements

•Speech recognition engine

•Speech synthesis (e.g. text-to-speech engine)


Screen Readers

Screen readers - software that works together with a speech synthesizer to read aloud everything contained on a computer screen, including icons, menus, text, punctuation, and control buttons.

•Useful for visually impaired or blind users

•Useful for illiterate users

•Useful for users with impaired mobility

•Sample screen readers:
•JAWS by Henter-Joyce (http://www.hj.com/)
•Microsoft Narrator – supplied with the Windows operating system
•Home Page Reader by IBM (http://www-3.ibm.com/able/hpr.htm)
•ReadPlease (http://www.readplease.com/)



Speech Application Technologies

•There are a range of mark-up languages available that enable speech applications to be used in combination with the web

•Examples:

VoiceXML

CallXML

SSML

SALT

•There is also the Java Speech API for incorporating speech into Java applications, including those built for the WWW.


VoiceXML

•Allows users to interact with a web application by speech, using speech recognition and a microphone

•Also allows applications that are accessed by telephone

•VoiceXML-enabled applications read Web pages to the user, and allow the user to respond by talking

•Applications require a VoiceXML interpreter on the server side and a VoiceXML browser on the client

•More at http://www.w3.org/TR/voicexml/


VoiceXML example

The system asks for two pieces of information, the month of birth and the day of birth:

<?xml version="1.0"?>
<vxml version="2.0">
  <form>
    <field name="birth-month">
      <prompt>In which month were you born?</prompt>
      <option>January</option>
      <option>February</option>
      <option>March</option>
      <option>April</option>
      <option>May</option>
      <option>June</option>
      <option>July</option>
      <option>August</option>
      <option>September</option>
      <option>October</option>
      <option>November</option>
      <option>December</option>
    </field>

Next page…


VoiceXML example (continued)

    <field name="birth-date" type="number">
      <prompt>On what day of <value expr="birth-month"/> were you born?</prompt>
    </field>
    <block>
      <submit next="http://www.voicexml-example.com/birth-info.vxml"/>
    </block>
  </form>
</vxml>


SALT

•Speech Application Language Tags

•The tags extend other markup languages, e.g. (X)HTML and WML; SALT tags cannot be used on their own.

•Can enable existing Web applications for speech

•Users can use a telephone or multimodal devices such as PCs, notebooks, tablets, cell phones and wireless PDAs.

•Simple and fast to implement

•Seven tags in total, including (a sketch follows after this list):
•<PROMPT> – plays speech output (speech synthesis)
•<LISTEN> – "takes in" speech (i.e. the speech recognition tag)
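As a rough illustration, SALT tags might be embedded in an HTML page along the lines of the sketch below. The salt namespace, element and attribute names follow my recollection of the SALT 1.0 specification and should be treated as assumptions to verify; the grammar file, element ids and form field are hypothetical.

<html xmlns:salt="http://www.saltforum.org/2002/SALT">
  <body>
    <form id="travel">
      <input type="text" name="city" />
    </form>

    <!-- Speech synthesis: play a question to the user -->
    <salt:prompt id="askCity">Which city are you flying to?</salt:prompt>

    <!-- Speech recognition: listen for an answer and copy it into the form field -->
    <salt:listen id="getCity">
      <salt:grammar src="cities.grxml" />
      <salt:bind targetelement="city" value="//city" />
    </salt:listen>

    <script type="text/javascript">
      // Start the dialogue: speak the prompt, then start listening
      askCity.Start();
      getCity.Start();
    </script>
  </body>
</html>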



CallXML

•Created by Voxeo

•Assists visually impaired users by providing access to web pages from the telephone

•XML-based mark-up language

•Example applications: voicemail, interactive voice response systems (a brief sketch follows below)
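For flavour, a very small CallXML document might look roughly like the sketch below; the element names (answer, text, hangup) are recalled from Voxeo's CallXML documentation and are an assumption to check against the current reference.

<callxml>
  <!-- Answer the incoming call -->
  <answer/>
  <!-- Read a short message to the caller using text-to-speech -->
  <text>Welcome. You have no new voicemail messages.</text>
  <!-- End the call -->
  <hangup/>
</callxml>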


Speech Synthesis Mark-up Language (SSML)

•Currently under review by the World Wide Web Consortium (W3C)

•An XML-based markup language for assisting in the generation of text-to-speech in Web and other applications

•Provides standard ways of controlling and specifying speech characteristics such as pronunciation, volume, rate, tone

•Can be embedded within VoiceXML and SALT

More at http://www.w3.org/TR/speech-synthesis/
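As an illustration, a small SSML fragment that adjusts rate and volume might look like the sketch below; the sentence content and the prosody values are made up for illustration.

<?xml version="1.0"?>
<speak version="1.0" xmlns="http://www.w3.org/2001/10/synthesis" xml:lang="en-GB">
  <p>
    <s>Your flight departs at half past two.</s>
    <!-- Slow down and raise the volume for the important part -->
    <s><prosody rate="slow" volume="loud">Please go to gate <emphasis>twelve</emphasis> now.</prosody></s>
  </p>
</speak>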


Java Speech API (JSAPI)

• Version 1 released in 1998

• API to allow Java applications and applets to use speech at the user interface – supports both speech recognition and speech synthesis – javax.speech

• Speech applications can be written completely in java

• Includes a mark-up language, the Java Speech API Markup Language (JSML), to adjust speech synthesis (e.g. tone, volume, etc.)

• The example overleaf shows the method required to generate speech.


JSAPI – code sample

import java.util.Locale;
import javax.speech.Central;
import javax.speech.synthesis.Synthesizer;
import javax.speech.synthesis.SynthesizerModeDesc;

public void MySpeech(String SpeakText) {
    try {
        // Create a synthesizer for English
        Synthesizer synth = Central.createSynthesizer(
                new SynthesizerModeDesc(Locale.ENGLISH));

        // Get it ready to speak
        synth.allocate();
        synth.resume();

        // Speak now...
        synth.speakPlainText(SpeakText, null);

        // Wait till speaking is done
        synth.waitEngineState(Synthesizer.QUEUE_EMPTY);

        // Clean up
        synth.deallocate();
    } catch (Exception e1) {
        System.out.println("EXCEPTION in MySpeech: " + e1);
    }
}

Later, in the application, to "speak out" text just call the method with the text to be spoken:

MySpeech(customerName);


Summary of Lecture


•Input Devices
•Text entry devices
•Keyboards – QWERTY, Alphabetic, Chord, Dvorak, Braille
•Handwriting Recognition
•Speech Recognition
•Positioning and pointing devices
•Direct control devices
•Indirect control devices
•Devices for disabled

•Output Devices
•Visual output – screens: CRT, LCD, projector
•Sound output
•suitable for specific application scenarios
•speech synthesis – uses concatenation or synthesis-by-rule
•speech applications


Terms of Reference

• Shneiderman, B. & Plaisant, C. (2005) Designing the User Interface
• Preece, J. et al. (2002) Interaction Design
• Benyon, D. et al. (2005) Designing Interactive Systems
• Helander, M. et al. (1997) Handbook of Human-Computer Interaction
• Norman, D. (1990) The Design of Everyday Things
