Security and Usability. Designing Secure Systems that People Can Use
ISBN: 978-0-596-55385-2
Pages: 740, Format: ebook
Publication date: 2008-07-14
Bookstore: Helion
Book price: 143.65 zł (previously: 167.03 zł)
You save: 14% (-23.38 zł)
Human factors and usability issues have traditionally played a limited role in security research and secure systems development. Security experts have largely ignored usability issues--both because they often failed to recognize the importance of human factors and because they lacked the expertise to address them.
But there is a growing recognition that today's security problems can be solved only by addressing issues of usability and human factors. Increasingly, well-publicized security breaches are attributed to human errors that might have been prevented through more usable software. Indeed, the world's future cyber-security depends upon the deployment of security technology that can be broadly used by untrained computer users.
Still, many people believe there is an inherent tradeoff between computer security and usability. It's true that a computer without passwords is usable, but not very secure. A computer that makes you authenticate every five minutes with a password and a fresh drop of blood might be very secure, but nobody would use it. Clearly, people need computers, and if they can't use one that's secure, they'll use one that isn't. Unfortunately, unsecured systems aren't usable for long, either. They get hacked, compromised, and otherwise rendered useless.
There is increasing agreement that we need to design secure systems that people can actually use, but less agreement about how to reach this goal. Security & Usability is the first book-length work describing the current state of the art in this emerging field. Edited by security experts Dr. Lorrie Faith Cranor and Dr. Simson Garfinkel, and authored by cutting-edge security and human-computer interaction (HCI) researchers worldwide, this volume is expected to become both a classic reference and an inspiration for future research.
Security & Usability groups 34 essays into six parts:
- Realigning Usability and Security--with careful attention to user-centered design principles, security and usability can be synergistic.
- Authentication Mechanisms--techniques for identifying and authenticating computer users.
- Secure Systems--how system software can deliver or destroy a secure user experience.
- Privacy and Anonymity Systems--methods for allowing people to control the release of personal information.
- Commercializing Usability: The Vendor Perspective--specific experiences of security and software vendors (e.g., IBM, Microsoft, Lotus, Firefox, and Zone Labs) in addressing usability.
- The Classics--groundbreaking papers that sparked the field of security and usability.
This book is expected to start an avalanche of discussion, new ideas, and further advances in this important field.
Table of Contents
- Security and Usability
- Preface
- Goals of This Book
- Audience for This Book
- Structure of This Book
- Conventions Used in This Book
- Safari Enabled
- How to Contact Us
- Acknowledgments
- I. Realigning Usability and Security
- One. Psychological Acceptability Revisited
- 1.1. Passwords
- 1.2. Patching
- 1.3. Configuration
- 1.4. Conclusion
- 1.5. About the Author
- Two. Why Do We Need It? How Do We Get It?
- 2.1. Introduction
- 2.2. Product: Human Factors, Policies, and Security Mechanisms
- 2.2.1. Impossible Demands
- 2.2.2. Awkward Behaviors
- 2.2.3. Beyond the User Interface
- 2.3. Process: Applying Human Factors Knowledge and User-Centered Approaches to Security Design
- 2.3.1. Security Is a Supporting Task
- 2.3.2. A Process for Designing Usable Secure Systems
- 2.4. Panorama: Understanding the Importance of the Environment
- 2.4.1. The Role of Education, Training, Motivation, and Persuasion
- 2.4.2. Building a Security Culture
- 2.5. Conclusion
- 2.6. About the Authors
- Three. Design for Usability
- 3.1. Death by Security
- 3.2. Balance Security and Usability
- 3.2.1. Exploit Differences Between Users and Bad Guys
- 3.2.2. Exploit Differences in Physical Location
- 3.2.3. Vary Security with the Task
- 3.2.4. Increase Your Partnership with Users
- 3.2.4.1. Trust the user
- 3.2.4.2. Exploit the special skills of users
- 3.2.4.3. Remove or reduce the user's burden
- 3.2.5. Achieve Balanced Authentication Design
- 3.2.5.1. Remove unnecessary password restrictions
- 3.2.5.2. The Doctor and password madness
- 3.2.6. Balance Resource Allocation
- 3.3. Balance Privacy and Security
- 3.4. Build a Secure Internet
- 3.4.1. Ringworld
- 3.4.1.1. Within the Castle Keep
- 3.4.1.2. Within the Ramparts
- 3.4.1.3. The Town Wall
- 3.4.1.4. Beyond the Town Wall
- 3.4.2. Ringworld Interface
- 3.5. Conclusion
- 3.6. About the Author
- Four. Usability Design and Evaluation for Privacy and Security Solutions
- 4.1. Usability in the Software and Hardware Life Cycle
- 4.1.1. Unique Aspects of HCI and Usability in the Privacy and Security Domain
- 4.1.2. Usability in Requirements
- 4.1.3. Usability in Design and Development
- 4.1.4. Usability in Postrelease
- 4.2. Case Study: Usability Involvement in a Security Application
- 4.2.1. The Field Study
- 4.2.2. The User Tests
- 4.2.2.1. Test 1
- 4.2.2.2. Test 2
- 4.2.2.3. Test 3
- 4.2.3. The Return on Investment (ROI) Analysis
- 4.3. Case Study: Usability Involvement in the Development of a Privacy Policy Management Tool
- 4.3.1. Step One: Identifying Privacy Needs
- 4.3.2. Step Two: Performing In-Depth Interview Research
- 4.3.3. Step Three: Designing and Evaluating a Privacy Policy Prototype
- 4.3.4. Step Four: Evaluating Policy Authoring
- 4.4. Conclusion
- 4.5. About the Authors
- Five. Designing Systems That People Will Trust
- 5.1. Introduction
- 5.1.1. Definitions of Trust
- 5.1.2. The Nature of Trust in the Digital Sphere
- 5.2. The Trust-Risk Relationship
- 5.2.1. Technology Factors
- 5.2.2. Trust and Credibility
- 5.3. The Time-Course of Trust
- 5.4. Models of Trust
- 5.4.1. Early Work on Modeling Trust
- 5.4.2. Bhattacherjee's Model of Trust
- 5.4.3. Lee, Kim, and Moon's Model of Trust
- 5.4.4. Corritore's Model of Trust
- 5.4.5. Egger's Model of Trust
- 5.4.6. McKnight's Model of Trust
- 5.4.7. Riegelsberger's Model of Trust
- 5.4.8. Looking at the Models
- 5.5. Trust Designs
- 5.6. Future Research Directions
- 5.7. About the Authors
- II. Authentication Mechanisms
- Six. Evaluating Authentication Mechanisms
- 6.1. Authentication
- 6.1.1. Accessibility Barriers
- 6.1.2. Human Factors
- 6.1.3. Security
- 6.1.4. Context and Environment
- 6.2. Authentication Mechanisms
- 6.2.1. What the User Is--Biometrics
- 6.2.2. What the User Knows--Memometrics
- 6.2.2.1. Random passwords (uncued recall)
- 6.2.2.2. Cultural passwords (cued recall)
- 6.2.3. What the User Recognizes--Cognometrics
- 6.2.3.1. Recognition-based systems
- 6.2.3.2. Position-based systems
- 6.2.4. What the User Holds
- 6.2.5. Two-Factor Authentication
- 6.3. Quality Criteria
- 6.3.1. Accessibility
- 6.3.2. Memorability
- 6.3.3. Security
- 6.3.4. Cost
- 6.4. Environmental Considerations
- 6.4.1.1. Accessibility
- 6.4.1.2. Memorability
- 6.4.1.3. Security
- 6.4.1.4. Cost
- 6.5. Choosing a Mechanism
- 6.5.1. An Online Banking Example
- 6.4.1.5. The critical criterion: accessibility
- 6.4.1.6. The vital criterion: security
- 6.4.1.7. The significant criteria: memorability and cost
- 6.4.1.8. The incidental criterion: nothing
- 6.6. Conclusion
- 6.7. About the Author
- Seven. The Memorability and Security of Passwords
- 7.1. Introduction
- 7.2. Existing Advice on Password Selection
- 7.3. Experimental Study
- 7.4. Method
- 7.5. Results
- 7.6. Discussion
- 7.7. Acknowledgments
- 7.8. About the Authors
- Eight. Designing Authentication Systems with Challenge Questions
- 8.1. Challenge Questions as a Form of Authentication
- 8.1.1. Using Challenge Questions for Credential Recovery
- 8.1.2. Using Challenge Questions for Routine Authentication
- 8.2. Criteria for Building and Evaluating a Challenge Question System
- 8.2.1. Privacy Criteria
- 8.2.2. Security Criteria
- 8.2.3. Usability Criteria
- 8.3. Types of Questions and Answers
- 8.3.1. Question Types
- 8.3.2. Answer Types
- 8.4. Designing a Challenge Question Authentication System
- 8.4.1. Determining the Number of Questions to Use
- 8.4.2. Determining the Types of Questions and Answers to Use
- 8.4.2.1. Determining the appropriate question type
- 8.4.2.2. Determining the appropriate answer type
- 8.4.3. Complementary Security Techniques
- 8.5. Some Examples of Current Practice
- 8.5.1. About the Author
- Nine. Graphical Passwords
- 9.1. Introduction
- 9.2. A Picture Is Worth a Thousand Words
- 9.2.1. Image Recognition
- 9.2.2. Tapping or Drawing
- 9.2.3. Image Interpretation
- 9.3. Picture Perfect?
- 9.3.1. Security
- 9.3.1.1. Key generation
- 9.3.1.2. Authentication
- 9.3.2. Usability
- 9.3.3. Discussion
- 9.4. Let's Face It
- 9.5. About the Authors
- Ten. Usable Biometrics
- 10.1. Introduction
- 10.1.1. Biometrics Types
- 10.1.2. Issues of Biometrics Specificity
- 10.1.3. The Fingerprint Example
- 10.2. Where Are Biometrics Used?
- 10.2.1. Physical Access Control
- 10.2.2. Immigration and Border Control
- 10.2.3. Law and Order
- 10.2.4. Transaction Security
- 10.3. Biometrics and Public Technology: The ATM Example
- 10.3.1. ATM Fingerprint Verification
- 10.3.2. ATM Face Verification
- 10.3.3. ATM Iris Verification
- 10.3.4. ATM Retina Verification
- 10.3.5. ATM Hand Verification
- 10.3.6. ATM Speaker Verification
- 10.3.7. ATM Signature Verification
- 10.3.8. ATM Typing Verification
- 10.4. Evaluating Biometrics
- 10.4.1. Performance Metrics
- 10.5. Incorporating User Factors into Testing
- 10.5.1. Size of User Base
- 10.5.2. Designing a Biometrics Solution to Maximize the User Experience
- 10.5.3. Enrollment
- 10.5.4. Biometrics Capture
- 10.5.5. Outliers and Fallback Strategies
- 10.5.5.1. Exception handling of outliers
- 10.5.5.2. Exception handling of temporary exclusions
- 10.5.5.3. Exception handling of aging
- 10.5.6. User Acceptance
- 10.5.6.1. Promoting user acceptance
- 10.5.6.2. Privacy
- 10.6. Conclusion
- 10.7. About the Author
- Eleven. Identifying Users from Their Typing Patterns
- 11.1. Typing Pattern Biometrics
- 11.2. Applications
- 11.2.1. Authentication
- 11.2.2. Identification and Monitoring
- 11.2.3. Password Hardening
- 11.2.4. Beyond Keyboards
- 11.3. Overview of Previous Research
- 11.4. Evaluating Previous Research
- 11.4.1. Classifier Accuracy
- 11.4.2. Usability
- 11.4.3. Confidence in Reported Results
- 11.5. Privacy and Security Issues
- 11.6. Conclusion
- 11.7. About the Authors
- Twelve. The Usability of Security Devices
- 12.1. Introduction
- 12.2. Overview of Security Devices
- 12.2.1. OTP Tokens
- 12.2.2. Smart Cards
- 12.2.3. USB Tokens
- 12.2.4. Biometrics Devices
- 12.3. Usability Testing of Security Devices
- 12.3.1. Setting Up the Test
- 12.3.2. Related Work
- 12.3.3. Usability Testing Methodology
- 12.4. A Usability Study of Cryptographic Smart Cards
- 12.4.1. Aim and Scope
- 12.4.2. Context and Roles Definition
- 12.4.3. User Selection
- 12.4.4. Task Definition
- 12.4.5. Measurement Apparatus
- 12.4.6. Processing for Statistical Significance
- 12.4.7. Computation of the Quality Attributes Scores
- 12.4.8. Results and Interpretation
- 12.4.9. Some Initial Conclusions
- 12.5. Recommendations and Open Research Questions
- 12.6. Conclusion
- 12.7. Acknowledgments
- 12.8. About the Authors
- III. Secure Systems
- Thirteen. Guidelines and Strategies for Secure Interaction Design
- 13.1. Introduction
- 13.1.1. Mental Models
- 13.1.2. Sources of Conflict
- 13.1.3. Iterative Design
- 13.1.4. Permission and Authority
- 13.2. Design Guidelines
- 13.2.1. Authorization
- 13.2.1.1. 1. Match the most comfortable way to do tasks with the least granting of authority.
- 13.2.1.2. 2. Grant authority to others in accordance with user actions indicating consent.
- 13.2.1.3. 3. Offer the user ways to reduce others' authority to access the user's resources.
- 13.2.1.4. 4. Maintain accurate awareness of others' authority as relevant to user decisions.
- 13.2.1.5. 5. Maintain accurate awareness of the user's own authority to access resources.
- 13.2.2. Communication
- 13.2.2.1. 6. Protect the user's channels to agents that manipulate authority on the user's behalf.
- 13.2.2.2. 7. Enable the user to express safe security policies in terms that fit the user's task.
- 13.2.2.3. 8. Draw distinctions among objects and actions along boundaries relevant to the task.
- 13.2.2.4. 9. Present objects and actions using distinguishable, truthful appearances.
- 13.2.2.5. 10. Indicate clearly the consequences of decisions that the user is expected to make.
- 13.3. Design Strategies
- 13.3.1. Security by Admonition and Security by Designation
- 13.3.1.1. Security by admonition
- 13.3.1.2. Security by designation
- 13.3.1.3. Advantages of designation
- 13.3.1.4. Implementing security by designation
- 13.3.1.5. Implementing security by admonition
- 13.3.2. User-Assigned Identifiers
- 13.3.3. Applying the Strategies to Everyday Security Problems
- 13.3.3.1. Email viruses
- 13.3.3.2. Other viruses and spyware
- 13.3.3.3. Securing file access
- 13.3.3.4. Securing email access
- 13.3.3.5. Cookie management
- 13.3.3.6. Phishing attacks
- 13.3.3.7. Real implementations
- 13.4. Conclusion
- 13.5. Acknowledgments
- 13.6. About the Author
- Fourteen. Fighting Phishing at the User Interface
- 14.1. Introduction
- 14.1.1. Anatomy of a Phishing Attack
- 14.1.2. Phishing as a Semantic Attack
- 14.2. Attack Techniques
- 14.3. Defenses
- 14.3.1. Message Retrieval
- 14.3.1.1. Identity of the sender
- 14.3.1.2. Textual content of the message
- 14.3.2. Presentation
- 14.3.3. Action
- 14.3.4. System Operation
- 14.3.5. Case Study: SpoofGuard
- 14.4. Looking Ahead
- 14.5. About the Authors
- Fifteen. Sanitization and Usability
- 15.1. Introduction
- 15.2. The Remembrance of Data Passed Study
- 15.2.1. Other Anecdotal Information
- 15.2.2. Study Methodology
- 15.2.3. FORMAT Doesn't Format
- 15.2.4. DELETE Doesn't Delete
- 15.2.5. A Taxonomy of Sanitized Recovered Data
- 15.3. Related Work: Sanitization Standards, Software, and Practices
- 15.3.1. DoD 5220.22-M
- 15.3.2. Add-On Software
- 15.3.3. Operating System Modifications
- 15.4. Moving Forward: A Plan for Clean Computing
- 15.5. Acknowledgments
- 15.6. About the Author
- Sixteen. Making the Impossible Easy: Usable PKI
- 16.1. Public Key Infrastructures
- 16.2. Problems with Public Key Infrastructures
- 16.3. Making PKI Usable
- 16.3.1. Case Study: Network-in-a-Box
- 16.3.2. Case Study: Casca
- 16.3.3. Case Study: Usable Access Control for the World Wide Web
- 16.3.4. Instant PKIs
- 16.3.5. What Makes a PKI Instant?
- 16.4. About the Authors
- Seventeen. Simple Desktop Security with Chameleon
- 17.1. Introduction
- 17.1.1. File Organization in Chameleon
- 17.1.2. Interrole Communication and Network Access
- 17.1.3. Advanced Role Features
- 17.2. Chameleon User Interface
- 17.3. Chameleon Interface Development
- 17.3.1. Study 1: Paper Prototype (Security in Context)
- 17.3.2. Study 2: Paper Prototype (Security Mechanisms)
- 17.3.3. Study 3: Visual Basic Prototype
- 17.4. Chameleon Implementation
- 17.4.1. Window System Partitioning
- 17.4.2. Filesystem Security
- 17.4.3. Network Security and Interprocess Communication
- 17.4.4. Software Architecture for Usability and Security
- 17.5. Conclusion
- 17.6. Acknowledgments
- 17.7. About the Authors
- Eighteen. Security Administration Tools and Practices
- 18.1. Introduction
- 18.2. Attacks, Detection, and Prevention
- 18.3. Security Administrators
- 18.3.1. Profile of a Security Manager--Joe
- 18.3.2. Profile of a Security Engineer--Aaron
- 18.4. Security Administration: Cases from the Field
- 18.4.1. Security Checkup
- 18.4.1.1. Case 1: MyDoom
- 18.4.1.2. Case 2: Intrusion alert--false alarm
- 18.4.1.3. Case 3: Real-time network monitoring
- 18.4.1.4. Case 4: Security scan
- 18.4.2. Attack Analysis
- 18.4.2.1. Case 5: Persistent hackers
- 18.4.3. The Need for Security Administration Tools
- 18.5. Conclusion
- 18.6. Acknowledgments
- 18.7. About the Authors
- IV. Privacy and Anonymity Systems
- Nineteen. Privacy Issues and Human-Computer Interaction
- 19.1. Introduction
- 19.2. Privacy and HCI
- 19.3. Relevant HCI Research Streams
- 19.3.1. Usability Engineering
- 19.3.2. Computer-Supported Cooperative Work
- 19.3.3. Individual Differences
- 19.3.4. Ubiquitous Computing (Ubicomp)
- 19.4. Conclusion
- 19.5. About the Authors
- Twenty. A User-Centric Privacy Space Framework
- 20.1. Introduction
- 20.1.1. Privacy
- 20.1.2. Exoinformation
- 20.2. Security and Privacy Frameworks
- 20.2.1. Codes of Fair Information Practice
- 20.2.2. The ISTPA Privacy Framework
- 20.2.3. Schneier's Security Processes Framework
- 20.2.4. The Privacy Space Framework
- 20.3. Researching the Privacy Space
- 20.3.1. Feature Analysis
- 20.3.1.1. Example 1: PGP Freeware
- 20.3.1.2. Example 2: WebWasher
- 20.3.1.3. Example 3: ZoneAlarm
- 20.3.1.4. Phase one results
- 20.3.2. Validation
- 20.4. Privacy as a Process
- 20.5. Conclusion
- 20.6. About the Author
- Twenty One. Five Pitfalls in the Design for Privacy
- 21.1. Introduction
- 21.1.1. Understanding
- 21.1.2. Action
- 21.2. Faces: (Mis)Managing Ubicomp Privacy
- 21.2.1. Faces Design
- 21.2.2. Formative Evaluation
- 21.3. Five Pitfalls to Heed When Designing for Privacy
- 21.3.1. Concerning Understanding
- 21.3.1.1. Pitfall 1: Obscuring potential information flow
- 21.3.1.2. Evidence: Falling into the pitfall
- 21.3.1.3. Evidence: Avoiding the pitfall
- 21.3.1.4. Pitfall 2: Obscuring actual information flow
- 21.3.1.5. Evidence: Falling into the pitfall
- 21.3.1.6. Evidence: Avoiding the pitfall
- 21.3.2. Concerning Action
- 21.3.2.1. Pitfall 3: Emphasizing configuration over action
- 21.3.2.2. Evidence: Falling into the pitfall
- 21.3.2.3. Evidence: Avoiding the pitfall
- 21.3.2.4. Pitfall 4: Lacking coarse-grained control
- 21.3.2.5. Evidence: Falling into the pitfall
- 21.3.2.6. Evidence: Avoiding the pitfall
- 21.3.2.7. Pitfall 5: Inhibiting established practice
- 21.3.2.8. Evidence: Falling into the pitfall
- 21.3.2.9. Evidence: Avoiding the pitfall
- 21.4. Discussion
- 21.4.1. Mental Models of Information Flow
- 21.4.2. Opportunities for Understanding and Action
- 21.4.3. Negative Case Study: Faces
- 21.4.4. Positive Case Study: Instant Messaging and Mobile Telephony
- 21.5. Conclusion
- 21.6. Acknowledgments
- 21.7. About the Authors
- Twenty Two. Privacy Policies and Privacy Preferences
- 22.1. Introduction
- 22.2. The Platform for Privacy Preferences (P3P)
- 22.2.1. How P3P Works
- 22.2.2. P3P User Agents
- 22.3. Privacy Bird Design
- 22.3.1. Capturing User Privacy Preferences
- 22.3.2. Communicating with Users About Web Site Privacy Policies
- 22.3.3. Privacy Icons
- 22.4. Privacy Bird Evaluation
- 22.4.1. User Survey
- 22.4.2. Laboratory Study
- 22.5. Beyond the Browser
- 22.6. About the Author
- Twenty Three. Privacy Analysis for the Casual User with Bugnosis
- 23.1. Introduction
- 23.2. The Audience for Bugnosis
- 23.3. Cookies, Web Bugs, and User Tracking
- 23.3.1. Tracing Alice Through the Web
- 23.3.1.1. Visiting multiple sites
- 23.3.1.2. Unique identification with referrers and third-party cookies
- 23.3.2. Using Web Bugs to Enable Clickstream Tracking
- 23.3.2.1. The web bug: a definition
- 23.3.2.2. What about second-party transactions?
- 23.3.3. Bugnosis: Theory of Operation
- 23.3.3.1. One-sided errors
- 23.3.3.2. Detecting but not blocking web bugs
- 23.3.4. Presenting the Analysis
- 23.3.5. Alerting the User
- 23.4. The Graphic Identity
- 23.5. Making It Simple Is Complicated
- 23.5.1. Using Browser Helper Objects and the Document Object Model
- 23.5.2. The Event Model
- 23.5.2.1. Provisional analysis
- 23.5.2.2. Rescanning, refreshing, and Old Paint
- 23.5.3. The Analysis Pane and Toolbar
- 23.5.3.1. The discovery dance
- 23.5.3.2. Making the Analysis Pane feel natural
- 23.5.3.3. Analyzing pop-up windows
- 23.5.4. Installation and Uninstallation
- 23.5.4.1. Installation/uninstallation problems
- 23.5.4.2. Windows XP Service Pack 2
- 23.6. Looking Ahead
- 23.6.1. Exposing Email Tracking
- 23.6.2. Platform for Privacy Preferences Project
- 23.6.3. Further Privacy Awareness Tools and Research
- 23.7. Acknowledgments
- 23.8. About the Author
- Twenty Four. Informed Consent by Design
- 24.1. Introduction
- 24.2. A Model of Informed Consent for Information Systems
- 24.2.1. Disclosure
- 24.2.2. Comprehension
- 24.2.3. Voluntariness
- 24.2.4. Competence
- 24.2.5. Agreement
- 24.2.6. Minimal Distraction
- 24.3. Possibilities and Limitations for Informed Consent: Redesigning Cookie Handling in a Web Browser
- 24.3.1. What Are Cookies and How Are They Used?
- 24.3.2. Web Browser as Gatekeeper to Informed Consent
- 24.3.3. Web Browser Development and Progress for Informed Consent: 1995-1999
- 24.3.4. Redesigning the Browser
- 24.3.5. Technical Limitations to Redesigning for Informed Consent
- 24.3.6. Reflections
- 24.4. Informing Through Interaction Design: What Users Understand About Secure Connections Through Their Web Browsing
- 24.4.1. Participants
- 24.4.2. Users' Conceptions of Secure Connections
- 24.4.2.1. Definition of a secure connection
- 24.4.2.2. Recognition of a connection as secure or not secure
- 24.4.2.3. Visual portrayal of a secure connection
- 24.4.3. Reflections
- 24.5. The Scope of Informed Consent: Questions Motivated by Gmail
- 24.5.1. What Is Gmail?
- 24.5.2. How Gmail Advertisements Work
- 24.5.3. Gmail and the Six Components of Informed Consent
- 24.5.3.1. Disclosure
- 24.5.3.2. Comprehension
- 24.5.3.3. Voluntariness
- 24.5.3.4. Competence
- 24.5.3.5. Agreement
- 24.5.3.6. Minimal distraction
- 24.5.4. Two Questions Related to Informed Consent
- 24.5.4.1. The question of machines reading personal content
- 24.5.4.2. The question of indirect stakeholders
- 24.5.5. Reflections
- 24.5.6. Design Principles for Informed Consent for Information Systems
- 24.6. Acknowledgments
- 24.7. About the Authors
- Twenty Five. Social Approaches to End-User Privacy Management
- 25.1. A Concrete Privacy Problem
- 25.2. Acumen: A Solution Using Social Processes
- 25.2.1. Acumen Overview
- 25.2.2. The Acumen User Interface
- 25.3. Supporting Privacy Management Activities with Social Processes
- 25.3.1. Awareness and Motivation
- 25.3.1.1. Awareness and motivation in Acumen
- 25.3.2. Learning and Education
- 25.3.2.1. Learning and education in Acumen
- 25.3.3. Decision Making
- 25.3.3.1. Decision making and herd behavior
- 25.4. Deployment, Adoption, and Evaluation
- 25.4.1. Deployment and Adoption
- 25.4.1.1. Deployment and adoption in Acumen
- 25.4.2. User Needs Evaluation
- 25.4.2.1. User needs evaluation in Acumen
- 25.4.3. Technological Evaluation
- 25.4.3.1. Technological evaluation in Acumen
- 25.5. Gaming and Anti-gaming
- 25.5.1. Anti-gaming Techniques in Acumen
- 25.6. Generalizing Our Approach
- 25.6.1. Four Key Questions for a Privacy Management System
- 25.6.2. Sketching a System Design
- 25.7. Conclusion
- 25.8. About the Authors
- Twenty Six. Anonymity Loves Company: Usability and the Network Effect
- 26.1. Usability for Others Impacts Your Security
- 26.2. Usability Is Even More Important for Privacy
- 26.2.1. Case Study: Usability Means Users, Users Mean Security
- 26.2.2. Case Study: Against Options
- 26.2.3. Case Study: Mixminion and MIME
- 26.2.4. Case Study: Tor Installation, Marketing, and GUI
- 26.2.5. Case Study: JAP and its Anonym-o-meter
- 26.3. Bootstrapping, Confidence, and Reputability
- 26.4. Technical Challenges to Guessing the Number of Users in a Network
- 26.5. Conclusion
- 26.6. About the Authors
- V. Commercializing Usability: The Vendor Perspective
- Twenty Seven. ZoneAlarm: Creating Usable Security Products for Consumers
- 27.1. About ZoneAlarm
- 27.2. Design Principles
- 27.2.1. Know Your Audience
- 27.2.2. Think Like Your Audience
- 27.2.3. Eliminate Clutter
- 27.2.4. Eliminate Complexity
- 27.2.5. Create Just Enough Feedback
- 27.2.6. Be a Customer Advocate When Usability and Competitive Pressure Collide
- 27.3. Efficient Production for a Fast Market
- 27.4. Conclusion
- 27.5. About the Author
- Twenty Eight. Firefox and the Worry-Free Web
- 28.1. Usability and Security: Bridging the Gap
- 28.2. The Five Golden Rules
- 28.2.1. Identifying the User
- 28.2.2. 1. Enforce the Officer/Citizen Model
- 28.2.3. 2. Don't Overwhelm the User
- 28.2.4. 3. Earn Your Users' Trust
- 28.2.5. 4. Put Out Fires Quickly and Responsibly
- 28.2.6. 5. Teach Your Users Simple Tricks
- 28.3. Conclusion
- 28.4. About the Author
- Twenty Nine. Users and Trust: A Microsoft Case Study
- 29.1. Users and Trust
- 29.1.1. Users' Reactions to Trust Questions
- 29.1.2. Users' Behavior in Trust Situations
- 29.1.3. Security Versus Convenience
- 29.1.4. Making Decisions Versus Supporting Decisions
- 29.2. Consent Dialogs
- 29.2.1. Consent Dialog Redesign
- 29.3. Windows XP Service Pack 2--A Case Study
- 29.3.1. ActiveX Dialogs
- 29.3.2. File Download Dialogs
- 29.4. Pop-Up Blocking
- 29.5. The Ideal
- 29.6. Conclusion
- 29.7. About the Author
- Thirty. IBM Lotus Notes/Domino: Embedding Security in Collaborative Applications
- 30.1. Usable Secure Collaboration
- 30.2. Embedding and Simplifying Public Key Security
- 30.2.1. Signing and Decrypting Email
- 30.2.2. Encrypting Email
- 30.3. Designing Security Displays
- 30.3.1. User Security Panel
- 30.3.1.1. Displaying public key certificates
- 30.3.1.2. Limitations and results
- 30.3.2. Database Access Control Information
- 30.3.2.1. Adding power and complexity
- 30.4. User Control of Active Content Security
- 30.4.1. Deployment Study
- 30.4.2. Solutions and Challenges
- 30.5. Conclusion
- 30.6. About the Author
- Thirty One. Achieving Usable Security in Groove Virtual Office
- 31.1. About Groove Virtual Office
- 31.2. Groove Virtual Office Design
- 31.2.1. The Weakest Link
- 31.2.2. Do the Right Thing
- 31.2.3. Is That You, Alice?
- 31.2.4. Colorful Security
- 31.3. Administrators' Strengths and Weaknesses
- 31.4. Security and Usability
- 31.5. About the Authors
- VI. The Classics
- Thirty Two. Users Are Not the Enemy
- 32.1. The Study
- 32.2. Users Lack Security Knowledge
- 32.3. Security Needs User-Centered Design
- 32.4. Motivating Users
- 32.5. Users and Password Behavior
- 32.6. About the Authors
- Thirty Three. Usability and Privacy: A Study of KaZaA P2P File Sharing
- 33.1. Introduction
- 33.1.1. Abuses on KaZaA Today
- 33.1.2. Unintended File Sharing Among KaZaA Users
- 33.1.3. Users Downloading Others' Private Files
- 33.2. Usability Guidelines
- 33.3. Results of the Cognitive Walkthrough
- 33.3.1. Changing the Download File Directory
- 33.3.2. Sharing Files
- 33.3.3. Adding Files to the My Media Folder
- 33.3.4. Uploading Files
- 33.3.5. Summary of Usability Guidelines
- 33.4. A Two-Part User Study
- 33.4.1. Parts of the Study
- 33.4.1.1. KaZaA sharing comprehension questions
- 33.4.1.2. Current sharing settings discovery task
- 33.4.2. Results
- 33.4.2.1. KaZaA sharing comprehension questions
- 33.4.2.2. Current sharing settings discovery task
- 33.4.3. Suggested Design Improvements
- 33.5. Conclusion
- 33.6. Acknowledgments
- 33.7. About the Authors
- Thirty Four. Why Johnny Can't Encrypt
- 34.1. Introduction
- 34.2. Understanding the Problem
- 34.2.1. Defining Usability for Security
- 34.2.2. Problematic Properties of Security
- 34.2.3. A Usability Standard for PGP
- 34.3. Evaluation Methods
- 34.4. Cognitive Walkthrough
- 34.4.1. Visual Metaphors
- 34.4.2. Different Key Types
- 34.4.3. Key Server
- 34.4.4. Key Management Policy
- 34.4.5. Irreversible Actions
- 34.4.6. Consistency
- 34.4.7. Too Much Information
- 34.5. User Test
- 34.5.1. Purpose
- 34.5.2. Description
- 34.5.2.1. Test design
- 34.5.2.2. Participants
- 34.5.3. Results
- 34.5.3.1. Avoiding dangerous errors
- 34.5.3.2. Figuring out how to encrypt with any key
- 34.5.3.3. Figuring out the correct key to encrypt with
- 34.5.3.4. Decrypting an email message
- 34.5.3.5. Publishing the public key
- 34.5.3.6. Getting other people's public keys
- 34.5.3.7. Handling the mixed key types problem
- 34.5.3.8. Signing an email message
- 34.5.3.9. Verifying a signature on an email message
- 34.5.3.10. Creating a backup revocation certificate
- 34.5.3.11. Deciding whether to trust keys from the key server
- 34.6. Conclusion
- 34.6.1. Failure of Standard Interface Design
- 34.6.2. Usability Evaluation for Security
- 34.6.3. Toward Better Design Strategies
- 34.7. Related Work
- 34.8. Acknowledgments
- 34.9. About the Authors
- Index
- About the Authors
- Colophon
- Copyright