
Considerations for self-selecting external tools

At times, a course calls for an external or third-party tool that has not been vetted through TMU’s assessment process. If you opt to use a tool that has not been approved by the university, any TMU-related information shared with it is held to the same standards and policies as information in approved tools, even when the tool is not directly integrated into TMU’s systems (such as D2L).

This guide will walk you through the key considerations, from understanding a tool's purpose and data practices to respecting student choice and complying with university policies.

By carefully evaluating these factors before your course begins, you can select and use external tools to your students’ benefit, while aligning with TMU’s values.

Selecting a third-party tool

Choosing a tool that suits your intended use is an important first step; learning about the tool will help guide your decision.

Begin by clearly identifying how the tool will be used in your course:  

  • Specific tasks that the tool will be used to complete
  • Learning-focused benefits of the activity
  • Alignment with your course and assessment’s learning outcomes
  • Whether the tool is free to use or requires payment

Keeping this in mind will help with the steps that follow: your understanding of the tool and its use will allow you to be transparent with students and engage in meaningful conversations about the tool.

Identify types of data

Based on the expected tasks, identify what data you, your students, and your teaching team (TAs, graders, etc.) will share with the tool. Is any of it sensitive, or Personal Information? Consult CCS’s How to Classify the Data You Work With resource if needed.

Check documentation

With this in mind, carefully examine the tool and its parent company’s Privacy Policy, End-User License Agreement (EULA), and/or Terms of Use. Pay close attention to:

  • How personal information is handled and shared by the company. Is it shared with third parties? Does the company take ownership of it once it is input by users?
  • Where is the data stored, and under what jurisdiction? Storage within Canada is best.
  • How long is the data stored, and how can it be removed?

Apps that can be accessed via a Google account

What data from the students’ Google accounts will be shared with the app?

  • Examples include: personal information/profile, contacts, Google Drive files, calendar
  • Is this information required? If not, consider advising students to access or log in with a different method.

For assistance with these questions, we recommend contacting the Privacy Office (privacy@torontomu.ca).

Ensure the tool's use aligns with university policies. These policies not only outline what is confidential, but also the responsibilities you take on when asking students to use a tool.

Relevant policies include but are not limited to: 

Additional student-specific policies are suggested in the "Communicating with students" section below.

Weigh the benefits of the tool against potential risks. While eliminating all risk is impossible, ask yourself: do the benefits of the tool outweigh the possible risks?

If they do, and you choose to use the tool, the next stage is planning its use within your course!

The steps in the sections that follow allow you to apply your understanding of the tool to its use in your course.

Planning your course use

After selecting your tool, including it in your course planning is essential. When planning, there are two core values to keep in mind:

  • Transparency, i.e. letting students know what they are getting into, and
  • Consent, i.e. letting students decide whether they want to share their information with this tool

After considering the classification of the data students will share and any intellectual property concerns, determine clear limitations of what should and should not be shared with this tool.

Design activities and instructions to minimize what is shared

  • Specific instructions about formatting the content shared with the tool (e.g. excluding student personal information that is the norm in assessments but may not be appropriate for external tools, such as the combination of full names and student numbers)
  • Encouraging specific account security settings within the tool (e.g. private accounts)
  • Suggesting that students choose an anonymous username
  • Suggesting that students set their contributions to be viewable only by the instructor or members of the course

Offer opt-outs

Design and plan alternatives (such as a different assignment or a workaround) for students who choose not to use the tool for valid reasons, allowing them to achieve the same learning outcomes.

If the use of the tool contributes to a graded component of the course, students who opt out should be provided with an alternative way to achieve the full grade.

Plan how to maintain academic integrity and communicate expectations to students when using the tool for graded activities. This will be important in your communication with them.

Identify questions you anticipate being raised by students about what is and is not permitted in their use of the tool for the course or assessment.

Because the tool falls outside of TMU's areas of support, it is important to find relevant resources and ensure you are knowledgeable about its use:
  • Use the tool yourself to evaluate its user-friendliness. Make notes about difficulties you encounter, and your methods of addressing them.
  • Research its accessibility (including AODA compliance), and check for records of comparable accessibility certifications.
  • Gather resources to share with students and teaching team members to provide training and support.
  • Consider providing students with additional time and training to familiarize themselves with the tool – you may want to include this in the course schedule, so planning early is important!

You will want to ensure everyone involved in the course is aware of your approach to the tool, and is acting within the decisions you have made.

Create instructions and resources for TAs and graders, with clear guidance for:

  • Expectations for their use of the tool
  • Responsible data handling (including data minimization)
  • Obtaining student consent before inputting their work into the third-party system
  • The use of any automated grading features (including when their use is permitted), and the requirement to be transparent with students about the decision-making that resulted in their grade

Communicating with students

Sharing information in advance and having conversations with your students about the tool will help to inform their understanding.

Explicitly mention the tool in your syllabus, including:

  • Third-party nature of the tool 
  • Any costs associated with its use
  • Tool's purpose in the course, including any specific technical requirements
  • Account creation requirements (if any)
  • Opt-out procedures and alternatives

Specific requirements can be found in Section 7: Course Outlines – Required Information of Policy 166 (Course Management).

Not all details will fit into your course outline, but additional key information should be shared with students prior to their use of the tool.

As a result of your preparation, you should already have gathered the information and resources needed to be transparent with students, including:

Intended use

  • More details of the tool's purpose in the course.
  • Potential risks and communication channels for concerns.
  • Any preferred methods of login/access, and approaches to avoid (such as logging in directly with a Google account, or reusing the same username/password as their TMU account).

Data Privacy

  • Type of data collected (including clear examples), and any rules around what should be excluded.
  • If students will input each other's data, provide guidance on responsible data use and privacy concerns.
  • Data minimization and destruction options available to students.

Copyright and Intellectual Property

  • Copyright rules and acceptable use of materials within the tool.
  • Ownership of work students may create or modify within the tool.

Conduct

Encourage students to think critically before engaging with the system, and ask themselves:

  • Is this information mine to share? 
  • Is anything within it private, personally identifiable, or potentially harmful to me, others, or the university, if made public?
  • Is my use of this tool in line with the Acceptable Use of Information Technology Policy? Actions such as sharing passwords to TMU accounts, or allowing someone else (including an automated system) to use your electronic identity, are violations.

Be sure to emphasize the importance of maintaining appropriate conduct to avoid academic misconduct, and of being civil and professional, in alignment with the university’s student code of non-academic conduct and its policy on discrimination and harassment.

Support

Provide a list of available support resources, including but not limited to:

  • Links to online documentation, and contact information for support.
  • Opportunities for Q&A and addressing concerns.

Generative AI (if applicable)

  • Make any use of automated grading features (where the system creates a score/mark/grade without the instructor evaluating the results) known to students. Be sure to transparently communicate the decision-making that resulted in their grade, ensuring that students know an AI system produced it.
  • If the system can be used without contributing to the training data set, strongly encourage students to make use of the option. Provide step-by-step instructions to enable the setting, if relevant. 
  • Direct students to the Academic Integrity Office’s Artificial Intelligence FAQs, which provide details related to Policy 60: Academic Integrity and the use of AI in assessments.
  • Establish clear guidelines on authorship and attribution for content generated by GenAI tools. See the TMU Libraries’ Citing Artificial Intelligence resource for details.
  • As with all other types of tools, establish communication channels for questions or concerns.

Tools that leverage Generative AI

Before deciding to use a Generative AI (GenAI) tool in a course, consider TMU’s Principles and Guidelines on Generative Artificial Intelligence in Learning and Teaching.

As with all other tools, the use of GenAI needs to be considered within the context of university policy. In addition to the guidance for learning about all self-selected tools (above), these tools require additional research:

  • User data: Understand how user and user-provided data is leveraged by the company and third parties, including for system training or commercial purposes.
  • Transparency: Choose tools that provide transparency about the training process, the data used, and how decisions are made to produce the output. This is essential, as it will allow you and your students to establish a more robust understanding of what leads to the generated content.
  • Accuracy: Is what is generated frequently incorrect? Are hallucinations common?

Applying what you’ve learned should include considerations for:

  • Informed Consent: Obtain explicit informed consent from students before using GenAI tools that involve their personal information or work. This includes grading tools that may provide automatic feedback on student work.
  • Use-specific risk assessment: Assess the potential risks associated with using the tool, including privacy, ethical, and legal risks.
  • Grading: If the AI system generates grades or assessment feedback, transparency with students around the use of the system and how grading decisions are made is important – they should always know when these systems are used, and how to ask questions about the results.
  • Contractual obligations: Ensure that the use of GenAI tools does not violate any contracts, such as those related to grading. Consider checking your Collective Agreement, or that of your TAs, to confirm that the tool is not replacing grading work that is intended to be completed by TAs.

Discussions around the use of GenAI cannot separate the technical choices from the ethical considerations and impact, as they are deeply intertwined. 

  • Intellectual Property: What are the ownership and intellectual property rights of content used to train the system? 
  • Environmental Impact: What are the environmental impacts of the use of the tool?
  • Limitations and mindful use: Be aware of the system's limitations, barriers to use, and potential for misuse.
  • Bias awareness: Be aware of potential biases in the training data and in the organization/company that created the tool/system, and how they may impact outputs. These biases may be a result of what was included in and excluded from the training data.

Bias mitigation strategies

Approaches to mitigate bias can include: 

  • Select your tool(s) with potential biases in mind.
  • Develop personal awareness of possible biases of the system through testing and research. Consider how these biases may impact the accuracy of what is generated, or reinforce dominant perspectives and ideologies.
  • Initiate and encourage discussions with students about common causes of bias in GenAI systems, as well as details related to the system they are being asked to use.
  • Design activities in the course that involve students identifying and analyzing outputs across systems for biases, to develop their own awareness of bias in outputs.
  • Create a clear and simple process for students to raise concerns regarding the impact of an AI-generated output on them or their work in the course.

Across the university, resources to guide decisions related to GenAI in teaching and learning are being developed:

If you wish to complete a guided self-assessment, an effective resource is the Rubric for Evaluating AI Tools: Fundamental Criteria (PDF file, external link), from the Paul R MacPherson Institute for Leadership, Innovation and Excellence in Teaching, McMaster University. Note: this rubric does not include considerations for TMU’s policies and best practices.