Needs & perceived risk factors
Users expressed concerns about inaccurate AI outputs and data security, highlighting the need for transparency, user control over outputs, and adaptable language complexity.
Output comprehension
Comprehension of legal AI outputs depends on the complexity of responses, including their structure and reference linking, which shape readability and accessibility.
Decision-making & reasoning
Users’ understanding of legal AI outputs significantly affects their decision-making and reasoning. Low trust, high perceived risk, and the need to verify responses reflect a cautious approach to relying on AI advice.
Design’s role in legal AI
Multidisciplinary teams and legal design thinking are essential to bridging the gap between users and legal experts. Human-centered design helps manage complexity, improve comprehension, and structure outputs, enhancing the effectiveness of legal AI tools.