High-Level Insights

  • Iteration is Everything: The most significant improvements came from iterative feedback sessions. The design evolved significantly from the first sketch to the final product.
  • The Power of Low-Fidelity: Hand-drawn sketches were crucial. They offered a fast, non-committal way to explore ideas, and domain experts felt more comfortable critiquing a rough sketch than a polished visualization.
  • Documentation is Key: The logs for data cleaning, sketches, and build decisions were invaluable. They created a clear and traceable record of the entire process, which was essential for our final report and for understanding the design rationale.
  • Tacit Knowledge is Gold: The “think-aloud” sessions and observations were more informative than the formal interviews. They helped us uncover pain points that the domain expert took for granted.

Detailed Stage Reflections

Stage 1: Abstract Phase

What Worked Well:

  • Semi-structured Interviews: Open-ended questions revealed unexpected pain points about manual data correlation
  • Task Observation: Watching the domain expert work revealed the “Wednesday problem” pattern
  • Iterative Abstraction: Multiple rounds of task refinement led to the three core tasks (compare, find anomaly, locate hotspot)

What Could Be Improved:

  • Initial Scope: Started too broad - should have focused on specific use cases earlier
  • Technical Feasibility: Didn’t consider data limitations early enough, leading to some redesign

Key Insights:

  • Task Abstraction: Real tasks emerged from observation rather than interviews - people don’t always know what they need
  • User Interviews: Follow-up questions were crucial - initial answers were often surface-level
  • Domain Understanding: Transportation domain has hidden complexities (weather, time-of-day effects) that only emerged through deep engagement

Time Allocation:

  • Planned: 1 week
  • Actual: 1.5 weeks
  • Variance: The extra time needed for task refinement was a valuable investment

Stage 2: Design Phase

What Worked Well:

  • Hand-drawn Sketching: Rapid iteration without tool constraints - generated 8 different layout concepts
  • Multiple Views Early: Decided on coordinated views approach early, which guided all subsequent design decisions
  • Color Strategy: Delay-based color coding (red-yellow-green) was intuitive and tested well
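
To make the color strategy concrete, here is a minimal sketch of a delay-based scale; the 2- and 5-minute thresholds and the hex values are illustrative assumptions, not the tuned values from the project:

```js
import * as d3 from "d3";

// Map delay (minutes) to a traffic-light color.
// The 2- and 5-minute cut-offs are illustrative placeholders.
const delayColor = d3.scaleThreshold()
  .domain([2, 5])                              // <2 min | 2-5 min | >5 min
  .range(["#2ca25f", "#fec44f", "#d73027"]);   // green, yellow, red

delayColor(0.5); // "#2ca25f" (on time)
delayColor(3);   // "#fec44f" (minor delay)
delayColor(12);  // "#d73027" (major delay)
```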

What Could Be Improved:

  • Data Volume Planning: Didn’t consider how design would scale with larger datasets
  • Responsive Design: Late addition - should have been considered from the start

Key Insights:

  • Sketching Process: Physical sketching encouraged broader exploration than jumping to digital tools
  • Digital Tools: Figma was excellent for layout refinement, while VegaLite prototyping caught interaction issues early (see the sketch after this list)
  • Design Validation: Quick digital prototypes prevented major implementation problems
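
As an example of the kind of throwaway VegaLite prototype used for interaction testing, the sketch below wires a click-to-select highlight; the field names (route, avgDelay), the sample values, and the #prototype container are all hypothetical:

```js
import vegaEmbed from "vega-embed";

// Click a bar to pick a route; unselected bars fade to gray.
const spec = {
  $schema: "https://vega.github.io/schema/vega-lite/v5.json",
  data: { values: [
    { route: "10", avgDelay: 4.2 },
    { route: "22", avgDelay: 1.1 }
  ]},
  mark: "bar",
  params: [{ name: "pick", select: { type: "point", fields: ["route"] } }],
  encoding: {
    x: { field: "route", type: "nominal" },
    y: { field: "avgDelay", type: "quantitative", title: "Avg delay (min)" },
    color: {
      condition: { param: "pick", value: "#d73027" },
      value: "#cccccc"
    }
  }
};

vegaEmbed("#prototype", spec);
```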

Iteration Effectiveness:

  • Number of Iterations: 5 major design iterations
  • Most Valuable Iteration: Iteration 3 - added geographic view after realizing spatial patterns were crucial
  • Iteration Drivers: User feedback and technical constraints drove most changes

Stage 3: Build Phase

What Worked Well:

  • Incremental Development: Building one chart at a time allowed for early testing and refinement
  • D3.js Choice: Provided the flexibility needed for custom interactions and coordinated views (a minimal linking sketch follows this list)
  • Responsive Framework: CSS Grid made layout adaptation straightforward
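
A minimal sketch of the coordinated-views wiring in D3.js; the event name, the CSS class, and the updateMapHighlight/updateTimeline functions are hypothetical stand-ins for the project's actual chart modules:

```js
import * as d3 from "d3";

// One shared dispatcher links the views: selecting a route anywhere
// re-renders every chart that listens for the event.
const bus = d3.dispatch("routeSelected");

// Each view registers a namespaced listener.
bus.on("routeSelected.map",      route => updateMapHighlight(route));
bus.on("routeSelected.timeline", route => updateTimeline(route));

// Any view can fire the event when the user clicks a route.
d3.selectAll(".route-bar").on("click", (event, d) => {
  bus.call("routeSelected", null, d.route);
});
```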

What Could Be Improved:

  • Error Handling: Added too late in the process - should be built in from the start (a defensive-loading sketch follows this list)
  • Performance Testing: Didn’t test with larger datasets until late - found some bottlenecks
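
In hindsight, defensive loading like the sketch below is cheap to add from day one; the file name and column names are assumptions for illustration:

```js
import * as d3 from "d3";

// Validate rows at load time instead of letting bad records
// surface later as rendering bugs.
async function loadDelays() {
  try {
    const rows = await d3.csv("delays.csv", d => ({
      route: d.route,
      delay: +d.delay,              // coerce; NaN flags a bad row
      time: new Date(d.timestamp)
    }));
    return rows.filter(r => r.route && !isNaN(r.delay) && !isNaN(+r.time));
  } catch (err) {
    console.error("Data load failed:", err);
    return [];                      // render an empty state, don't crash
  }
}
```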

Key Insights:

  • Technology Choices: D3.js learning curve was worth it for the custom interaction requirements
  • Performance Considerations: Animation and transitions significantly improved user experience (illustrated in the sketch after this list)
  • Design-to-Code Translation: Having detailed Figma mockups made implementation much smoother
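
A minimal sketch of the kind of animated update meant above, assuming a band scale x, a linear scale y, and rows keyed by route; none of these names come from the actual codebase:

```js
import * as d3 from "d3";

// Animate bars to their new positions instead of jumping,
// so users can track which routes changed between updates.
function updateBars(svg, data, x, y) {
  svg.selectAll("rect")
    .data(data, d => d.route)       // key by route for stable joins
    .join("rect")
    .transition()
    .duration(400)
    .attr("x", d => x(d.route))
    .attr("width", x.bandwidth())
    .attr("y", d => y(d.delay))
    .attr("height", d => y(0) - y(d.delay));
}
```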

Technical Debt:

  • Accumulated Debt: [Technical shortcuts that created future problems]
  • Time Pressure Impact: [How time constraints affected code quality]
  • Lessons for Future: [How to better manage technical debt]

Stage 4: Evaluate Phase

What Worked Well:

  • Structured Tasks: The three-task evaluation protocol effectively tested core functionality
  • Think-Aloud Protocol: Captured reasoning process and revealed usability insights
  • Time-boxed Tasks: 3-minute limit encouraged focused interaction and revealed efficiency issues

What Could Be Improved:

  • Single User: Testing with only one domain expert limited feedback diversity
  • Task Realism: Could have included more complex, real-world scenarios

Key Insights:

  • Coordinated Views: Proved to be the most valuable feature - users immediately understood the connections
  • Learning Curve: Minimal training was needed - the interface proved intuitive
  • Performance Expectations: Users expect instant responsiveness in modern web tools

Process and Team Insights

Time Management:

  • Stage 1 (Abstract): 1.5 weeks - Longer than expected but crucial for foundation
  • Stage 2 (Design): 2 weeks - Hand sketching saved significant time later
  • Stage 3 (Build): 3 weeks - Incremental development prevented major setbacks
  • Stage 4 (Evaluate): 1 week - Could have been longer for multiple users
  • Total: 7.5 weeks vs. the 6-week target

Role Clarity:

  • Researcher/Designer: Dual role worked well for small project, clear division of abstract vs. design thinking
  • Developer Role: Having design background made implementation smoother
  • Domain Expert: Regular check-ins prevented development of unusable features

Stakeholder Engagement:

  • Engagement Strategies: Weekly demos kept domain expert engaged and provided regular course correction
  • Feedback Integration: Prioritizing feedback based on core task support worked well
  • Expectation Management: Early prototypes helped set realistic expectations for final product

Technical Learnings

Tool Effectiveness:

| Tool/Technology | Effectiveness (1-5) | Notes |
| --- | --- | --- |
| D3.js | 5 | Learning curve steep but flexibility was essential |
| Figma | 4 | Excellent for layout design, good collaboration features |
| VegaLite | 3 | Good for rapid prototyping, limited for complex interactions |
| HTML/CSS Grid | 5 | Modern responsive design approach, very effective |
| VS Code | 5 | Excellent development environment with good extensions |

Data Challenges:

  • Data Quality Issues: Transport data had inconsistent formatting - required significant cleaning (a normalization sketch follows this list)
  • Data Access: Mock data worked well for prototype, but real API integration would add complexity
  • Data Processing: D3’s data manipulation functions were powerful but required JavaScript expertise
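
A sketch of the normalization this required; the two timestamp formats and the decimal-comma fix are illustrative of the inconsistencies, not an exhaustive list:

```js
import * as d3 from "d3";

// The feed mixed timestamp formats; try each known parser in turn.
const parsers = [
  d3.timeParse("%Y-%m-%d %H:%M:%S"),
  d3.timeParse("%d/%m/%Y %H:%M")
];

function cleanRow(d) {
  const time = parsers.map(p => p(d.timestamp)).find(t => t !== null);
  const delay = parseFloat(String(d.delay).replace(",", ".")); // "3,5" -> 3.5
  if (!d.route || !time || isNaN(delay)) return null;          // drop bad rows
  return { route: d.route.trim(), time, delay };
}
```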

User-Centered Design Insights

User Understanding:

  • User Model Evolution: Initial assumption of “data analyst” user evolved to “operations manager” with different needs
  • Persona Validation: Domain expert matched our target persona well, but broader user base would have different priorities
  • Task Understanding: Real tasks were more exploratory than we initially assumed - needed support for discovery

Design Validation:

  • Early Validation Benefits: Hand-drawn sketches prevented weeks of development in wrong direction
  • Iteration Impact: User feedback in Stage 2 completely changed our chart selection approach
  • Design Blindspots: Assumed users would want complex filtering - simple route selection was sufficient

Methodological Contributions

Process Improvements:

  1. Integrated Sketching-Prototyping: Combining hand sketches with rapid digital prototyping
    • Problem Addressed: Gap between conceptual design and technical implementation
    • Implementation: Use sketches for broad exploration, VegaLite for interaction testing, then build
  2. Continuous Domain Expert Engagement: Weekly check-ins throughout development
    • Problem Addressed: Building features that don’t match real-world needs
    • Implementation: Schedule brief weekly demos rather than waiting for major milestones

Tool Recommendations:

  • Essential Tools: Hand sketching materials, Figma (or similar), D3.js (for flexibility), basic web hosting
  • Helpful Additions: VegaLite for prototyping, color picker tools, browser dev tools for debugging
  • Tools to Avoid: Overly complex frameworks early on - start simple and add complexity only when needed

Template Improvements:

  • Documentation Templates: Add specific time tracking templates - actual time vs. estimates revealed valuable insights
  • Process Checklists: Need checklist for data quality assessment early in the process
  • Evaluation Frameworks: Single-user evaluation sufficient for initial validation, but multi-user framework needed for robustness

Domain-Specific Insights

Transportation Domain Characteristics:

  • Unique Challenges: Time-sensitive decision making, geographic constraints, weather dependencies, public accountability
  • Opportunities: Rich spatial and temporal data, clear performance metrics, strong user motivation for efficiency
  • User Behaviors: Domain experts think in terms of routes and schedules, prefer actionable insights over exploratory analysis

Design Implications:

  • Spatial Context Critical: Any transportation visualization needs geographic reference
  • Time Patterns Essential: Day-of-week and time-of-day patterns are fundamental to domain understanding (see the aggregation sketch after this list)
  • Performance Focus: Users primarily interested in identifying and fixing problems, not general exploration
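
The time-pattern aggregation referenced above is small enough to sketch; it assumes cleaned rows with a time Date and a delay in minutes, as produced by a loader like the one sketched earlier:

```js
import * as d3 from "d3";

// Average delay per day of week (0 = Sunday ... 6 = Saturday):
// the kind of aggregate that surfaced the "Wednesday problem".
const byWeekday = d3.rollup(
  rows,                             // cleaned rows, assumed loaded earlier
  v => d3.mean(v, d => d.delay),
  d => d.time.getDay()
);

byWeekday.get(3); // average delay on Wednesdays
```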

Recommendations for Future Projects

For Similar Projects:

  1. Start with Real Data: Even if limited, real data reveals domain complexities that mock data doesn’t
  2. Prioritize Geographic Views: Spatial context is critical in transportation domain
  3. Focus on Actionability: Users want to identify problems and understand causes, not just see patterns

For Other Domains:

  1. Invest in Task Abstraction: Understanding real tasks (not stated tasks) is critical foundation
  2. Design Study Process Works: The 5-stage process provided good structure and prevented major missteps
  3. Document Everything: The documentation proved invaluable for understanding design decisions and writing this reflection

For Process Improvement:

  1. Expand Evaluation: Single domain expert was sufficient for initial validation but more users needed for robust findings
  2. Consider Long-term Use: Our study focused on initial usability - long-term adoption would require different considerations
  3. Technical Sustainability: Consider maintenance and data pipeline requirements for deployed tools

Transferable Insights:

  • Insight 1: [Learning that could apply to other domains]
  • Insight 2: [Learning that could apply to other domains]

Domain Expertise Development:

  • Learning Curve: [How long it took to develop domain understanding]
  • Key Learning Resources: [Most valuable sources of domain knowledge]
  • Knowledge Gaps: [Areas where deeper expertise would have helped]

Impact Assessment

Immediate Impact:

  • User Adoption: [How the final tool is being used]
  • Workflow Changes: [How work processes have changed]
  • Efficiency Gains: [Measurable improvements in user efficiency]

Long-term Potential:

  • Scalability: [Potential for broader deployment]
  • Evolution: [How the tool might evolve over time]
  • Replication: [Potential for similar tools in related domains]

Research Contributions:

  • Novel Techniques: [New visualization or interaction techniques developed]
  • Validation Results: [Insights about visualization effectiveness]
  • Methodology Advances: [Contributions to design study methodology]

Additional Recommendations for Future Projects

Project Setup:

  • Team Composition: [Ideal team structure and skills]
  • Timeline Planning: [More realistic time allocation recommendations]
  • Resource Requirements: [Essential resources and tools]

Process Adaptations:

  • Stage Modifications: [How to adapt stages for different contexts]
  • Checkpoint Additions: [Additional validation points to include]
  • Documentation Practices: [Better ways to capture process learning]

Risk Mitigation:

  • Common Pitfalls: [Mistakes to avoid in future projects]
  • Contingency Planning: [How to handle common setbacks]
  • Quality Assurance: [Additional quality checks to implement]

Knowledge Sharing

Publications and Presentations:

  • Publication 1: [Title, venue, status]
  • Presentation 1: [Title, venue, date]

Open Source Contributions:

  • Contribution 1: [What was shared with the community]
  • Contribution 2: [What was shared with the community]

Internal Documentation:

  • Document 1: [Internal knowledge sharing artifact]
  • Document 2: [Internal knowledge sharing artifact]

Personal and Professional Growth

Skills Developed:

  • Technical Skills: [New technical capabilities gained]
  • Design Skills: [Design competencies developed]
  • Research Skills: [Research methodologies learned]
  • Communication Skills: [Presentation and writing improvements]

Career Impact:

  • Role Clarification: [How this project affected career direction]
  • Network Building: [Professional relationships developed]
  • Portfolio Development: [How this contributes to professional portfolio]

Areas for Continued Learning:

Skills for Future Development:

  • Skill 1: [Area for future development]
  • Skill 2: [Area for future development]

Final Reflections

Most Valuable Learning:

“[Single most important insight from the entire project]”

Biggest Surprise:

“[Most unexpected discovery or outcome]”

Would Do Differently:

“[Major change you would make if starting over]”

Advice for Others:

“[Key advice for someone starting a similar project]”

Project Satisfaction:

  • Overall Rating: [1-5 scale]
  • Most Rewarding Aspect: [What was most fulfilling]
  • Most Challenging Aspect: [What was most difficult]

Appendices

A. Timeline Comparison

[Actual vs. planned timeline visualization]

B. Resource Utilization

[Analysis of time, budget, and resource usage]

C. Stakeholder Feedback Summary

[Key quotes and feedback from project stakeholders]

D. Future Research Questions

[Questions that emerged during the project that could drive future research]