Privacy and Algorithmic Impact Assessments
Privacy Risk Assessment includes a range of activities necessary to support continuity in operations and innovation at McMaster. The Privacy Impact Assessment (PIA) and Algorithmic Impact Assessment (AIA) processes are conducted collaboratively under the leadership of the privacy office to identify privacy risks arising from new projects, initiatives, systems, processes, strategies, policies, applications, or services, and to develop mitigation plans.
A Privacy Impact Assessment (PIA) is a mandatory process that helps organizations identify and manage the privacy risks arising from new projects, initiatives, systems, processes, strategies, policies, business relationships, and more. PIAs focus on how personal information (PI) will be collected, used, disclosed, stored, retained and protected.
If information security is also at risk, a slightly adapted process called a Privacy and Information Security Impact Assessment is conducted in collaboration with McMaster’s Information Security office.
What is personal information?
Personal information (PI) is recorded information that can be used to identify you.
- It reveals something of a personal nature about you.
- If you can be identified from the information (either alone or by combining it with other information), it is your PI.
How long does a PIA take to complete?
The time needed to complete a PIA can depend on a number of factors, including:
- the complexity of the application or service; more complex technology requires more time to assess privacy risks
- the level of vendor transparency; a PIA is completed more quickly when vendors provide excellent depth of detail in their privacy and service policies
- the availability of internal partners for consultation, including the PIA requester and other relevant risk management specialists
- whether a previous PIA exists, which may allow us to streamline the process
- the current capacity of the privacy office
At McMaster University, privacy risk assessment is led by the Privacy Office (University Secretariat) and often involves collaboration with valued partners in the Information Security Office and Legal Services.
The PIA Process includes four stages:
Stage 1: Confirm need for PIA, and initiate the PIA/AIA process
Stage 2: Begin PIA self-assessment or facilitated assessment, research vendor practices
Stage 3: Review privacy risk elements, consult with partner units (e.g. Information Security, Legal Services, Enterprise Risk Management, Vendor) to develop a risk mitigation plan.
Stage 4: Implement recommendations and monitor the service for changes that affect the privacy risk score and mitigation planning.
A Privacy Impact Assessment is mandatory when an activity involves the collection, use, retention, or disclosure of personal information. This could include:
- an employee or office of the university will disclose personal information to a vendor’s application (e.g. by uploading it);
- a vendor, service, or application will collect personal information directly on behalf of the university; or
- an existing agreement with a third party is affected by changes to the third party’s privacy policy, terms of service, or internal information security practices and processes.
It is also a best practice to review the PIA for a service prior to a contract renewal. This should take place at least six months before the expiry date to ensure that the PIA remains up to date and that mitigation planning is reflected in any new agreement.
The flow chart below may help you determine whether you need to complete a PIA.
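The mandatory-PIA triggers listed above reduce to an any-of check, which can be sketched in a few lines. This is an illustrative sketch only; the function and parameter names are assumptions for demonstration, not an official McMaster tool, and the flow chart remains the authoritative guide.

```python
def pia_required(
    discloses_pi_to_vendor: bool,     # PI disclosed to a vendor's application (e.g. uploaded)
    vendor_collects_pi: bool,         # vendor collects PI directly on the university's behalf
    third_party_terms_changed: bool,  # existing agreement affected by policy/terms changes
) -> bool:
    """Return True if any trigger for a mandatory PIA applies."""
    return any([discloses_pi_to_vendor, vendor_collects_pi, third_party_terms_changed])
```

For example, `pia_required(False, True, False)` returns `True`, since a vendor collecting personal information on the university's behalf is itself sufficient to require a PIA.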

Once you have determined that a PIA is needed, it is important to begin the process at the early stages of a new project or initiative. This helps to ensure that there is sufficient time to conduct the PIA and avoids disruption to the project plan.
The first step in the PIA process is to check the Privacy Risk Assessment Tracker (internal only) to see whether the privacy office has already completed or begun a PIA for the service or application, or for a similar one. If you find the application listed in the Tracker, contact the privacy office to obtain the PIA report and ensure that your use of the technology complies with privacy legislation.
If the service or application is not listed in the Tracker, use this flow chart to determine whether you need to complete a PIA.
If there is no PIA in place, you may initiate the PIA process by completing the Early Privacy Risk Check form. Once submitted, the Privacy Office will review the information and schedule a short meeting for an initial consultation.
During the initial consultation, we will confirm and document the purpose for using the application/service, the range of personal information involved, and an initial privacy risk score. This initial score will factor into the method used in the next stage.
There are four levels of privacy risk: low, moderate, high and non-compliant. The initial score is based on the highest classification level of the personal information involved in the use of the application or tool. The chart below provides rationale for determining privacy risk score levels:

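The highest-classification rule can be expressed as a small lookup. Note that the classification-to-risk mapping below is an assumption for illustration only; the privacy office's chart above governs actual scoring.

```python
# Assumed mapping from information classification to initial privacy risk.
# These pairings are illustrative, not the official McMaster chart.
CLASSIFICATION_RISK = {
    "public": "low",
    "internal": "moderate",
    "confidential": "high",
    "restricted": "high",
}
RISK_ORDER = ["low", "moderate", "high", "non-compliant"]

def initial_risk_score(classifications: list[str]) -> str:
    """The score is driven by the highest classification among the PI involved."""
    risks = [CLASSIFICATION_RISK[c] for c in classifications]
    return max(risks, key=RISK_ORDER.index)
```

Under this sketch, a tool handling both public and confidential information would score `high`, because the single highest classification sets the initial risk.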
Self-Assessment: In instances where the initial privacy risk is low, the privacy office will provide the requester with the PIA Self-Assessment workbook. The workbook should be completed using the application/service privacy policy (including related security policies) and terms of service documentation. We have developed a resource to help assess vendor policies – Guideline – How to Review a Privacy Policy. The completed worksheet must be provided to the privacy office for review, and ongoing retention.
Facilitated Assessment: Where the privacy risk is moderate, high, or potentially non-compliant (for instance, where the lawful authority is unknown, the personal information involved is classified as confidential or restricted, or the application/service includes AI functionality), the privacy office will review the vendor’s privacy policy (including related security policies) and terms of service documentation to identify risk factors and determine a preliminary risk score.
The risk factors identified in this stage support the development of privacy risk mitigation recommendations for stage 3.
The privacy office will consult with the requester to discuss the privacy risk score and the identified privacy risk elements, and to develop a risk mitigation strategy. This consultation often includes identifying whether there is a need to engage the vendor for more information about their practices, and discussing the intended use and scope of the application/service.
PIA requesters may need to consult with the vendor to seek clarification, or in cases where recommendations include variations to the terms of service or agreement. Clarifications provided by the vendor better inform the risk score and provide a clearer picture of the privacy risks. Once this is complete, please update the Privacy Office on the results of the vendor consultation and your contract agreement.
Additional consultations with internal partners may be necessary, which may include Legal Services, Information Security, UTS Integration Services, or Procurement Services.
Once the privacy impact assessment is complete, the privacy office will communicate the final privacy risk score and recommendations for mitigating identified privacy risks. We will also update the status of the PIA in the Privacy Risk Assessment Tracker. The privacy office will also retain a summary report, which includes these details:
- The purpose for which the personal information is intended to be collected, used and disclosed, and an explanation of why the personal information is necessary to achieve the purpose.
- The legal authority for the intended collection, use and disclosure of the personal information.
- The types of personal information that are intended to be collected and, for each type of personal information collected, an indication of how that type of personal information is intended to be used or disclosed.
- The sources of the personal information that is intended to be collected.
- A list of university employees, consultants or agents of the university who will have access to the personal information.
- Any limitations or restrictions imposed on the collection, use or disclosure of the personal information.
- The period of time that the personal information would be retained by the university (e.g. retention schedule).
- An explanation of the administrative, technical and physical safeguards and practices that will be used to protect the personal information and a summary of any risks to individuals in the event of a theft, loss or unauthorized use or disclosure of the personal information.
- The recommended steps necessary to prevent or reduce the likelihood of a theft, loss or unauthorized use or disclosure of personal information from occurring, and to mitigate the risks to individuals in the event of such an occurrence.
Ongoing Monitoring
The university is responsible for ensuring that our identification of privacy risks and mitigation planning remain accurate as we continue to use applications and services. As the direct client of the application/service, the requester will be responsible for implementing the recommendations. Please continue monitoring the tool for any changes to service features or how the applications are used, and notify the privacy office of these changes. We will review the most recent PIA results, and make relevant updates to support ongoing use and continuity in institutional operations.
For tools that include automated functions, including artificial intelligence, we will include an algorithmic impact assessment in the privacy risk assessment.
An algorithmic impact assessment (AIA) is a risk assessment process that determines the impact level of an automated decision system on individuals whose personal information is collected, used or disclosed in the use of an AI-enabled application or system. This includes generative AI, agentic AI, and image recognition AI. An AIA is composed of 51 risk criteria and 34 mitigation criteria, and assessment scores are based on many factors, including the system’s design, algorithm, decision type, impact, and data. By evaluating these risks, the tool helps ensure that the AI systems we use are ethical and transparent, and that they protect our students, departments, and institutional integrity.
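The risk and mitigation criteria combine into an overall impact level. The sketch below is hypothetical: the discount rule, thresholds, and four-level banding are assumptions for illustration, not McMaster's or the Treasury Board's actual scoring rules.

```python
def aia_impact_level(risk_points: int, max_risk: int,
                     mitigation_points: int, max_mitigation: int) -> int:
    """Return an impact level from 1 (lowest) to 4 (highest), with the raw
    risk score discounted when mitigation measures are substantially in place.
    All thresholds here are illustrative assumptions."""
    score = float(risk_points)
    if mitigation_points >= 0.8 * max_mitigation:  # assumed discount for strong mitigation
        score *= 0.85
    pct = score / max_risk
    for level, ceiling in enumerate((0.25, 0.50, 0.75, 1.00), start=1):
        if pct <= ceiling:
            return level
    return 4
```

In this sketch, a system scoring 52 of 100 risk points with strong mitigation answers lands one level lower than the same system with no mitigation, which is the general shape of mitigation-aware scoring.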
The McMaster privacy office has developed an AIA tool tailored to the use of AI tools in higher education. This tool is based on the Algorithmic Impact Assessment tool produced by the Treasury Board of Canada Secretariat.