

Exam Preview

Valuing Human Relationships in an Era of “Tech Rules”: The Ethics of PIE

Please note: exam questions are subject to change.


1.  Which of the following intervention approaches fits best with the value of human relationships?
  1. Give clients the answers to the problems they are experiencing.
  2. Let clients solve their problems on their own.
  3. Avoid all direct contact with clients.
  4. Engage clients as partners in the helping process.
2.  Which of the following assessment approaches fits best with the value of human relationships?
  1. Focus only on the client’s intrapsychic processes.
  2. Explore the client’s interactions with their social environments.
  3. Diagnose the client’s biological problems (e.g., physical illnesses and disabilities).
  4. Identify relevant ethical standards and laws.
3.  When applying a person-in-environment approach to practice, which of the following are human relationships that behavioral health providers should consider?
  1. The client’s family, friends, coworkers, neighbors, and other individuals and groups that interact with the client.
  2. The physical environment, including climate change and pollutants.
  3. The client’s relationship with their higher power.
  4. The client’s income level and expenses.
4.  How might the use of chatbots in behavioral health practice interfere with the notion of “person-centered care”?
  1. It is completely unethical and illegal to ask a client to interact with a chatbot.
  2. Clients may not feel that the chatbot is personalizing its communication for their needs, wishes, and preferences.
  3. Behavioral health chatbots always provide the same level of person-centered care as a behavioral health provider.
  4. Ethically, it is better not to tell clients whether they are speaking to a person or a chatbot.
5.  When behavioral health professionals use videoconferencing to interact with clients, clients may feel a lack of human connection because:
  1. behavioral health providers are unable to convey empathy through videoconferencing.
  2. behavioral health providers are unable to see and respond to the client’s facial expressions.
  3. the clients are unable to see and respond to the behavioral health providers’ facial expressions.
  4. the clients are not physically in the same room as the behavioral health providers.
6.  When behavioral health providers use AI programs to write psychosocial assessments, they should use AI as:
  1. a tool, not a replacement for their own role in the assessment process.
  2. a replacement for their own role in the assessment process.
  3. a way to completely delegate human interaction and clinical judgment to AI.
  4. a way to eliminate the need for professional training and expertise in writing assessments.
7.  Which of the following strategies is useful for enhancing “therapeutic alliance” and trust with AI bots in behavioral health services?
  1. Not informing clients about how AI bots actually work.
  2. Ensuring the AI bot demonstrates empathy accurately and respectfully.
  3. Programming AI bots to be monotone (no expressions of human-sounding emotions).
  4. Allowing open access (by anyone) to listen to conversations between AI bots and behavioral health clients.
8.  Which of the following strategies is useful for fostering a positive human relationship between behavioral health providers and clients when using videoconferencing?
  1. Turn off the video and use audio (sound) only.
  2. Get down to work immediately and do not engage in informal discussion while videoconferencing.
  3. Ensure that the client and behavioral health provider sit so close to the camera that only their heads (not their bodies) are visible.
  4. Provide time for informal discussion and trust building.
9.  To foster the value of human relationships when using technology with clients, behavioral health providers should:
  1. use technology as a replacement for behavioral health providers, not as a tool.
  2. use technology as a tool, not as a replacement for behavioral health providers.
  3. avoid technology that tries to simulate respect, caring, or empathy.
  4. avoid technology that assesses a client’s interactions with their social environment.
10.  When designing AI bots for behavioral health services, providers can focus on fostering trust and therapeutic alliance by ensuring that the AI bots:
  1. convey emotional empathy only.
  2. convey cognitive empathy only.
  3. convey both cognitive and emotional empathy.
  4. do not convey any type of empathy.
