Template: Task Abstraction Log
Purpose
Use this template to systematically document the process of abstracting tasks from user requirements and domain analysis for your visualization project.
Project Information
Project: [Project Name]
Domain: [Application Domain]
Date Started: [Date]
Last Updated: [Date]
Team Members: [Names and roles]
Task Abstraction Framework
Domain Problem Characterization
What: [Clear description of the domain problem you’re addressing]
Who: [Description of target users - their roles, expertise, typical work context]
Why: [Goals and objectives - what users hope to accomplish]
Where: [Context of use - where and when this tool would be used]
When: [Temporal context - frequency of use, time constraints, deadlines]
How (Currently): [Current methods and tools users employ to address this problem]
Data Characterization
Dataset Types:
- Tabular Data - [Description if applicable]
- Network Data - [Description if applicable]
- Spatial Data - [Description if applicable]
- Temporal Data - [Description if applicable]
- Hierarchical Data - [Description if applicable]
- Other: [Specify type and description]
Data Attributes:
| Attribute Name | Type | Description | Importance |
|---|---|---|---|
| [attribute1] | [Quantitative/Ordinal/Categorical] | [What it represents] | [High/Medium/Low] |
| [attribute2] | [Quantitative/Ordinal/Categorical] | [What it represents] | [High/Medium/Low] |
| [attribute3] | [Quantitative/Ordinal/Categorical] | [What it represents] | [High/Medium/Low] |
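Teams that keep the attribute inventory in code as well as in the table above can check incoming datasets against it automatically. A minimal sketch in Python (the attribute names and values are hypothetical placeholders, not prescribed by this template):

```python
from dataclasses import dataclass
from enum import Enum

class AttrType(Enum):
    QUANTITATIVE = "quantitative"
    ORDINAL = "ordinal"
    CATEGORICAL = "categorical"

@dataclass(frozen=True)
class Attribute:
    name: str
    attr_type: AttrType
    description: str
    importance: str  # "High" / "Medium" / "Low"

# Hypothetical inventory mirroring the attribute table.
ATTRIBUTES = [
    Attribute("revenue", AttrType.QUANTITATIVE, "Monthly revenue in USD", "High"),
    Attribute("region", AttrType.CATEGORICAL, "Sales region", "Medium"),
]

def missing_high_importance(record: dict) -> list:
    """Return names of high-importance attributes absent from a data record."""
    return [a.name for a in ATTRIBUTES
            if a.importance == "High" and a.name not in record]
```

Keeping the inventory machine-readable makes it easy to flag gaps early, before they surface as blank marks in a prototype.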
Data Relationships:
- Relationship 1: [Description of how data elements relate]
- Relationship 2: [Description of how data elements relate]
Task Identification
High-Level Tasks
These are the overarching goals users want to accomplish
Task 1: [Task Name]
- Description: [What the user wants to accomplish at a high level]
- Current Method: [How users currently do this task]
- Pain Points: [What makes this task difficult or time-consuming]
- Success Criteria: [How users know they’ve completed the task successfully]
- Frequency: [How often users perform this task]
- Priority: [High/Medium/Low based on user needs]
Task 2: [Task Name]
- Description: [What the user wants to accomplish at a high level]
- Current Method: [How users currently do this task]
- Pain Points: [What makes this task difficult or time-consuming]
- Success Criteria: [How users know they’ve completed the task successfully]
- Frequency: [How often users perform this task]
- Priority: [High/Medium/Low based on user needs]
Task 3: [Task Name]
[Continue with same structure…]
Mid-Level Tasks
Breaking down high-level tasks into more specific actions
For Task 1: [High-level task name]
- Subtask 1.1: [Specific action or step]
- Input: [What information or data is needed]
- Output: [What the user expects to get]
- Challenges: [What makes this subtask difficult]
- Subtask 1.2: [Specific action or step]
- Input: [What information or data is needed]
- Output: [What the user expects to get]
- Challenges: [What makes this subtask difficult]
For Task 2: [High-level task name]
[Similar breakdown…]
Low-Level Tasks
Detailed interaction-level tasks using established taxonomies
Visual Analysis Tasks (Based on Amar, Eagan, & Stasko, 2005):
- Retrieve Value: Find specific data values
- Context: [When users need this in your domain]
- Example: [Specific example from your domain]
- Filter: Focus on data meeting certain criteria
- Context: [When users need this in your domain]
- Example: [Specific example from your domain]
- Compute Derived Value: Calculate aggregations or transformations
- Context: [When users need this in your domain]
- Example: [Specific example from your domain]
- Find Extremum: Identify minimum or maximum values
- Context: [When users need this in your domain]
- Example: [Specific example from your domain]
- Sort: Order data by specific attributes
- Context: [When users need this in your domain]
- Example: [Specific example from your domain]
- Determine Range: Find span of values
- Context: [When users need this in your domain]
- Example: [Specific example from your domain]
- Characterize Distribution: Understand data patterns
- Context: [When users need this in your domain]
- Example: [Specific example from your domain]
- Find Anomalies: Detect outliers or exceptions
- Context: [When users need this in your domain]
- Example: [Specific example from your domain]
- Cluster: Group similar data points
- Context: [When users need this in your domain]
- Example: [Specific example from your domain]
- Correlate: Identify relationships between variables
- Context: [When users need this in your domain]
- Example: [Specific example from your domain]
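These low-level tasks map directly onto ordinary data operations, which can make them concrete during validation sessions with users. An illustrative sketch in Python over a made-up dataset (all names and values are placeholders, not part of the template):

```python
from statistics import mean, pstdev

# Hypothetical records standing in for your domain data.
records = [
    {"name": "A", "value": 10}, {"name": "B", "value": 42},
    {"name": "C", "value": 7},  {"name": "D", "value": 99},
]

# Retrieve Value: find a specific data value.
value_b = next(r["value"] for r in records if r["name"] == "B")

# Filter: focus on data meeting certain criteria.
above_ten = [r for r in records if r["value"] > 10]

# Compute Derived Value: aggregate across records.
avg = mean(r["value"] for r in records)

# Find Extremum: identify the maximum.
top = max(records, key=lambda r: r["value"])

# Sort: order data by a specific attribute.
ranked = sorted(records, key=lambda r: r["value"], reverse=True)

# Determine Range: find the span of values.
values = [r["value"] for r in records]
span = (min(values), max(values))

# Find Anomalies: flag values beyond two standard deviations from the mean.
sd = pstdev(values)
anomalies = [r for r in records if abs(r["value"] - avg) > 2 * sd]
```

Walking through snippets like this with domain experts can surface which low-level tasks actually matter and which examples ring true.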
Task Validation
Validation Methods Used:
- Follow-up user interviews - [Date(s) conducted]
- Expert review - [Expert consulted and date]
- Literature validation - [Relevant papers consulted]
- Observation studies - [Context and date]
- Task walk-through - [With whom and when]
Validation Results:
| Task | Validated | Notes | Revisions Needed |
|---|---|---|---|
| [Task 1] | ✓ Yes / ⚠ Partially / ✗ No | [Feedback received] | [Changes to make] |
| [Task 2] | ✓ Yes / ⚠ Partially / ✗ No | [Feedback received] | [Changes to make] |
| [Task 3] | ✓ Yes / ⚠ Partially / ✗ No | [Feedback received] | [Changes to make] |
Key Insights:
- Insight 1: [What you learned about user needs]
- Insight 2: [What you learned about task priorities]
- Insight 3: [What you learned about task relationships]
Task Prioritization
Essential (Must Have):
- Task: [Why this is essential]
- Task: [Why this is essential]
Important (Should Have):
- Task: [Why this is important]
- Task: [Why this is important]
Desirable (Could Have):
- Task: [Why this would be valuable]
- Task: [Why this would be valuable]
Out of Scope (Won’t Have):
- Task: [Why this is deferred]
- Task: [Why this is deferred]
Design Implications
Visualization Requirements:
Based on the task analysis, the visualization needs:
- Requirement 1: [Specific visual encoding or feature needed]
- Requirement 2: [Specific visual encoding or feature needed]
- Requirement 3: [Specific visual encoding or feature needed]
Interaction Requirements:
Based on the task analysis, the interface needs:
- Interaction 1: [Specific interaction capability needed]
- Interaction 2: [Specific interaction capability needed]
- Interaction 3: [Specific interaction capability needed]
Information Architecture:
Based on the task analysis, the information should be organized as follows:
- Organization principle 1: [How to structure information]
- Organization principle 2: [How to structure information]
Task Relationships and Dependencies
Sequential Task Flows:
- [Task A] → [Task B] → [Task C]
- Rationale: [Why these tasks happen in sequence]
- Design Implication: [How interface should support this flow]
- [Task D] → [Task E]
- Rationale: [Why these tasks happen in sequence]
- Design Implication: [How interface should support this flow]
Parallel Tasks:
- [Task F] and [Task G] (can be done simultaneously)
- Design Implication: [How to support parallel execution]
Alternative Paths:
- [Task H] OR [Task I] (different approaches to same goal)
- Design Implication: [How to accommodate different approaches]
Iteration Log
Iteration 1: [Date]
Trigger: [What prompted this revision - new user feedback, expert input, etc.]
Changes Made:
- Change 1: [Description of task revision]
- Change 2: [Description of task revision]
Rationale: [Why these changes were needed]
Impact: [How this affected other tasks or design decisions]
Iteration 2: [Date]
Trigger: [What prompted this revision]
Changes Made:
- Change 1: [Description of task revision]
- Change 2: [Description of task revision]
Rationale: [Why these changes were needed]
Impact: [How this affected other tasks or design decisions]
Domain-Specific Considerations
[Your Domain] Specific Patterns:
- Pattern 1: [Common task pattern in your domain]
- Pattern 2: [Common task pattern in your domain]
Domain Constraints:
- Constraint 1: [Limitation that affects task performance]
- Constraint 2: [Limitation that affects task performance]
Domain Opportunities:
- Opportunity 1: [Unique advantage for visualization in this domain]
- Opportunity 2: [Unique advantage for visualization in this domain]
Validation Against Literature
Similar Studies Reviewed:
| Study | Domain | Similar Tasks | Key Differences | Insights Applied |
|---|---|---|---|---|
| [Study 1] | [Domain] | [Tasks in common] | [How your context differs] | [What you learned] |
| [Study 2] | [Domain] | [Tasks in common] | [How your context differs] | [What you learned] |
Task Taxonomy Alignment:
- Brehmer & Munzner (2013): [How your tasks map to their taxonomy]
- Amar, Eagan, & Stasko (2005): [How your tasks map to their low-level analysis tasks]
- Domain-specific taxonomies: [Any field-specific task classifications]
Next Steps
Immediate Actions:
- [Action 1 - e.g., validate remaining tasks with users]
- [Action 2 - e.g., begin sketching designs for priority tasks]
- [Action 3 - e.g., document task-to-design mapping]
Questions for Design Phase:
- [Question about how to visually support specific tasks]
- [Question about interaction design for task flows]
- [Question about information prioritization]
Handoff to Design:
- Task analysis complete and validated
- Priority tasks clearly identified
- Design implications documented
- Design team briefed on findings
References
- Amar, R., Eagan, J., & Stasko, J. (2005). Low-level components of analytic activity in information visualization. IEEE Symposium on Information Visualization (InfoVis).
- Amar, R., & Stasko, J. (2004). A knowledge task-based framework for design and evaluation of information visualizations. IEEE Symposium on Information Visualization (InfoVis).
- Brehmer, M., & Munzner, T. (2013). A multi-level typology of abstract visualization tasks. IEEE Transactions on Visualization and Computer Graphics, 19(12).
- [Add other relevant references for your domain]
Customize this template by removing sections that aren’t relevant to your project and adding domain-specific considerations as needed.