Abstract
People in the industrialized world increasingly rely on Artificial Intelligence (AI) to obtain information, get personalized recommendations, or make decisions in their day-to-day lives. In professional contexts, for example, doctors now use intelligent image analysis tools for diagnoses (Davenport and Kalakota, 2019; Kaplan et al., 2021) and HR managers at companies let algorithms preselect who should be invited to a job interview (Liem et al., 2018; Houser, 2019). In personal life, data-driven AI systems recommend movies according to the user's preferences (Lawrence, 2015; Floegel, 2020), monitor sleeping patterns (Alqassim et al., 2012; Lee and Finkelstein, 2015; Kolla et al., 2016), let you chat and receive emotional support as you would with a friend (e.g., Replika, replika.ai), manage your home via smart home technologies (Robles and Kim, 2010; Wilson et al., 2017; Gram-Hanssen and Darby, 2018; Marikyan et al., 2019), and support your everyday banking tasks (Letheren and Dootson, 2017; Li et al., 2020). Through the use of machine intelligence, data are analyzed faster than ever before, decision processes are accelerated, monotonous tasks can be handed over to the computer, and, in the case of chatbots and speech assistants, sociable connections are possible without a real human dialog partner being involved.
As new technologies emerge, we quickly adapt to them and integrate them into our personal and professional life. Yet, from a psychological point of view, we need to question how they impact human experience, motivation, and well-being. According to the Basic Psychological Needs Theory (BPNT; Deci and Ryan, 2000; Ryan and Deci, 2000), which has received a lot of attention in fields other than technology use, motivation to engage in a task and subsequent well-being can be achieved through (i) personal autonomy, (ii) the feeling of being competent, and (iii) relatedness to other people.
Looking at current developments and new applications offered in AI, which often involve competent and autonomous decision-making or building social connections, it can be argued that AI systems may target the three spheres addressed by BPNT. Nevertheless, empirical research to date has hardly investigated the association between behavioral intentions to use AI-based applications and the perceived fulfillment of the three Basic Psychological Needs (BPNs), nor how variations in system designs or user-specific factors play a role here.
The present study is therefore dedicated to the question of need fulfillment in the interaction with an AI-based smartphone assistant, as a function of (a) more vs. less agency of the assistant, (b) perceived female vs. male design features of the assistant, and (c) user gender. In doing so, we aim not only to inspire greater consideration of BPNs in the future design of AI systems, but also to take up the current discourse around the gender-stereotypical design of AI-based voice assistants (West et al., 2019).
| Original language | English |
| --- | --- |
| Article number | 855091 |
| Number of pages | 20 |
| Journal | Frontiers in Psychology |
| DOIs | |
| Publication status | Published - 14 Jun 2022 |
Fields of science
- 102013 Human-computer interaction
- 501002 Applied psychology
- 501012 Media psychology
- 202035 Robotics
JKU Focus areas
- Digital Transformation