
Tip of the Week: Leveraging Chatbots for BITs

By Joseph Allen Ed.D., PMP, NABITA Advisory Board Member 

Have you leveraged your institution’s chatbot to support your BIT’s mission? Most institutions have a chatbot on their website as a first point of contact for students and visitors. The tool responds to questions with pre-populated answers and connects users with the proper resources, information, and departments. Ideally, it provides a scalable way to handle the high volume of inquiries a front desk employee previously managed by phone and with walk-in visitors. As a key point of contact for your institution and your BIT, it’s essential to ensure your chatbot is set up properly to best serve your community. Here are a few factors to be aware of and to consider:

  1. Ensure your chatbot identifies and flags the key terms the BIT determines to be of concern. Red-flag terms and phrases may include, but are not limited to, references to killing, murder, suicide, hurting themselves or others, guns, and threats. Your BIT should develop this list of critical terms based on your institution’s culture and prior BIT referrals.
  2. After identifying the key terms and phrases, ensure you create default auto-responses that provide current, accurate, and specific information for the text submitted to your system. Be mindful of whether responses should be adjusted during institutional breaks or the summer, when access to resources may differ. If someone tells the chatbot they are thinking of harming themselves or others, will your auto-response provide a caring and compassionate reply that includes campus and community mental health resources, campus police, and other resources to support them?
  3. Ensure your chatbot initiates an email notification to your BIT leadership for review and awareness when concerning language is entered. Gaining visibility into concerning chats, threats, and help-seeking behaviors submitted to the chatbot is paramount for a proper BIT response, and it is vital that critical information, disclosures, and statements are not overlooked. BIT members can then seek additional relevant information, context, and data to make informed decisions on the most appropriate support or response. A minimal flagging-and-escalation sketch follows this list.
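
As a rough illustration of how these three pieces might fit together, here is a minimal Python sketch of keyword flagging, a caring auto-response, and an escalation email. The term list, response wording, phone numbers, email addresses, and SMTP relay are all hypothetical placeholders rather than a reference to any particular chatbot vendor’s API; your BIT and your chatbot administrator would adapt the idea to your platform’s own configuration tools.

```python
import smtplib
from email.message import EmailMessage

# Example red-flag terms; replace with the list your BIT develops based on
# your institution's culture and prior referrals.
RED_FLAG_TERMS = [
    "kill", "murder", "suicide", "hurt myself", "hurt others", "gun", "threat",
]

# Example caring auto-response; review it before breaks or the summer, when
# access to resources may differ. The 555 numbers are placeholders.
CRISIS_RESPONSE = (
    "We care about you and want to help. You can reach campus counseling at "
    "555-0100, campus police at 555-0911, or the 988 Suicide & Crisis "
    "Lifeline by calling or texting 988."
)

BIT_ESCALATION_EMAIL = "bit-leadership@example.edu"  # hypothetical address
CHATBOT_SENDER = "chatbot@example.edu"               # hypothetical address
SMTP_RELAY = "smtp.example.edu"                      # hypothetical relay


def find_red_flags(message: str) -> list[str]:
    """Return the red-flag terms found in a chat message (case-insensitive)."""
    lowered = message.lower()
    return [term for term in RED_FLAG_TERMS if term in lowered]


def escalate_to_bit(message: str, flags: list[str]) -> None:
    """Email BIT leadership the flagged chat text for review and awareness."""
    note = EmailMessage()
    note["To"] = BIT_ESCALATION_EMAIL
    note["From"] = CHATBOT_SENDER
    note["Subject"] = "Chatbot red-flag alert: " + ", ".join(flags)
    note.set_content(f"Flagged terms: {', '.join(flags)}\n\nChat text:\n{message}")
    with smtplib.SMTP(SMTP_RELAY) as smtp:
        smtp.send_message(note)


def handle_chat_message(message: str) -> str:
    """Return the chatbot reply; escalate to BIT leadership on a match."""
    flags = find_red_flags(message)
    if flags:
        escalate_to_bit(message, flags)
        return CRISIS_RESPONSE
    return "Thanks for reaching out! How can we help you today?"
```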

Here are next steps to help ensure your BIT is leveraging the technology available through your chatbot:

  1. Understand chatbot management.
  • Who has administrative access to the tool for updating links, setting accurate auto responses, and establishing escalation emails?
  • Who will train BIT members to familiarize them with your chatbot’s functionality and responses?
  • Are there regular intervals for content review and updating?
  • What information does the chatbot collect about the end-user? Who has access to that information?
  2. Test your chatbot system. (A spot-check sketch follows this list.)
  • What chatbot response is provided when you enter a pre-identified, critical term?
  • How accurate is the informational response when you input a critical term?
  • Is an escalation email set up in your system to send notice to the BIT leadership when a critical term is entered?
  • Who receives the escalation email and are they trained and aware of how to evaluate and respond to this information?
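
To make the testing step concrete, here is a small spot-check sketch that feeds sample inputs through the flagging logic from the earlier example, assumed here to be saved as chatbot_flags.py (a hypothetical module name), and reports whether each input is flagged as expected.

```python
# Spot-check sketch: routine questions should pass through normally, while
# pre-identified critical terms should be flagged and, in production,
# trigger the escalation email to BIT leadership.
from chatbot_flags import find_red_flags  # hypothetical module name

SAMPLE_INPUTS = {
    "Where do I pay my tuition bill?": False,
    "I am thinking about suicide": True,
    "My roommate said he wants to hurt others": True,
}


def run_spot_checks() -> None:
    """Print OK/REVIEW for each sample input against the expected outcome."""
    for text, should_flag in SAMPLE_INPUTS.items():
        flagged = bool(find_red_flags(text))
        status = "OK" if flagged == should_flag else "REVIEW"
        print(f"[{status}] expected={should_flag}, flagged={flagged} | {text}")


if __name__ == "__main__":
    run_spot_checks()
```

A full test should also confirm that the escalation email actually arrives in the BIT leadership inbox and that the people who receive it know how to evaluate and respond to it.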

These are great places to start as your BIT initiates intentional conversations with your chatbot administrators about leveraging this technology to support its mission. Thank you for the work you do every day for your students and your community.

For more BIT insights, check out the NABITA Blog.