DPIA
A Data Protection Impact Assessment (DPIA) is a process to help you identify and minimise the data protection risks of a project. You must do a DPIA for processing that is likely to result in a high risk to individuals. This includes some specified types of processing.
Under data protection legislation, GitLab has an obligation to consider the impact on an individual's privacy during all processing activities. This includes implementing appropriate technical and organisational measures to minimise risks to personal data.
It is particularly important to consider privacy issues when considering new processing activities or setting up new procedures or systems that involve personal data. The GDPR imposes a specific 'privacy by design' requirement, emphasising the need to implement appropriate technical and organisational measures during the design stages of a process and throughout the life cycle of the relevant data processing to ensure that privacy and protection of data is not an afterthought.
For some projects the GDPR requires that a Data Protection Impact Assessment (DPIA) is carried out. The circumstances in which this is required include:

- processing of large amounts of personal data
- automated processing or profiling
- processing of special categories of personal data
- monitoring of publicly accessible areas (e.g. CCTV)
The DPIA is a mechanism for identifying and examining the impact of new initiatives and putting in place measures to minimise or reduce risks.
GitLab is a 'data controller' and is ultimately responsible for compliance with current data protection legislation. GitLab will take appropriate measures to ensure privacy by design and to protect data subjects' rights under the legislation.

Information Users

All members of the GitLab team are responsible for complying with all relevant data protection legislation and this policy. Where a concern about a data asset is identified, it should be raised with the Data Protection Officer and Privacy Officer so that an assessment can take place.
Any project that involves processing of personal data requires a DPIA. Where a Project Manager is unknown, undefined, or unable to complete a DPIA, the highest-level admin for the respective application must complete the documentation. Ultimately, the highest-level admin is responsible and accountable for ensuring that the DPIA is submitted and completed. Any admin of the respective technology should sign off on the DPIA to signify understanding of, and accountability for, the risks in that technology. Product Managers/Admins should ensure that the DPO and Privacy Officer are consulted, in a timely manner, on all issues relating to the protection of personal data.
Researchers should ensure that a data management plan incorporating a DPIA is completed for any project that involves processing of personal data. Where additional advice is required, they should contact the Legal and Compliance team in the first instance, who will liaise with the DPO and Privacy Officer as required.
The Data Protection Officer and the Privacy Officer are responsible for ensuring that security measures implemented in response to DPIAs comply with this policy and relevant data protection legislation. They will be involved throughout the DPIA process.
In accordance with the GDPR, GitLab has appointed Data Protection Officers to carry out the DPO role as defined in the legislation. The DPO and Privacy Officer assist GitLab by informing and advising on data protection obligations and providing advice regarding DPIAs.
DPOs can delegate DPIA assessments to Security Compliance Analysts where appropriate.
A DPIA is a process to help identify and minimise the data protection risks of a project, system, or application. There are a number of criteria that determine when a DPIA should be carried out within GitLab.
A DPIA must be done before beginning any type of processing which is “likely to result in a high risk”. This means that although the actual level of risk has not been assessed, screening for factors that point to the potential for a widespread or serious impact on individuals must take place.
The GDPR requires a DPIA if we plan to:

- use systematic and extensive profiling with significant effects
- process special category or criminal offence data on a large scale
- systematically monitor publicly accessible places on a large scale
The ICO also requires a DPIA if we plan to:
- use new technologies
- use profiling or special category data to decide on access to services
- profile individuals on a large scale
- process biometric data
- process genetic data
- match data or combine datasets from different sources
- collect personal data from a source other than the individual without providing them with a privacy notice ('invisible processing')
- track individuals' location or behaviour
- profile children or target marketing or online services at them
- process data that might endanger the individual's physical health or safety in the event of a security breach
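The screening step described above can be sketched in code. This is a purely illustrative checklist, not an official GitLab tool: the trigger names below are assumptions that mirror the GDPR and ICO criteria listed in this policy, and a real screening decision still rests with the DPO and Privacy Officer.

```python
# Hypothetical DPIA screening checklist. Trigger names are illustrative
# labels for the GDPR and ICO criteria listed in this policy.
GDPR_TRIGGERS = {
    "systematic_extensive_profiling",
    "large_scale_special_category_data",
    "large_scale_public_monitoring",
}

ICO_TRIGGERS = {
    "new_technologies",
    "profiling_for_service_access",
    "large_scale_profiling",
    "biometric_data",
    "genetic_data",
    "data_matching",
    "invisible_processing",
    "location_or_behaviour_tracking",
    "targeting_children",
    "risk_of_physical_harm",
}

def dpia_required(project_flags: set) -> bool:
    """Return True if any screening trigger applies to the project,
    meaning a DPIA must be completed before processing begins."""
    return bool(project_flags & (GDPR_TRIGGERS | ICO_TRIGGERS))
```

For example, a project flagged with `{"biometric_data"}` would require a DPIA, while one flagged only with internal labels matching no trigger would not; note that screening answers only whether a DPIA is needed, not what its outcome will be.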