PrivacyEngine Podcast
The PrivacyEngine Podcast is your practical briefing on modern privacy and data protection. Each episode turns complex requirements into clear actions across GDPR, CCPA, HIPAA, India’s DPDPA, ISO standards, and emerging AI governance. Join privacy, security, legal, and product leaders as we unpack global regulations, real-world compliance challenges, and the controls that build trust at scale.
Understanding Qatar’s PDPPL: Data Rights, Compliance and AI
In this episode of the International Data Protection Regulation Podcast, Dr. Maria Maloney and Katie Birch examine Qatar’s Personal Data Privacy Protection Law, Law No. 13 of 2016, and its growing relevance for privacy, cybersecurity, and AI governance.
The discussion covers the core structure of Qatar’s privacy framework, including how personal data is defined, how sensitive data is treated, and why consent plays such a central role in lawful processing. The episode also explores the rights available to individuals, such as access, correction, objection, erasure, and withdrawal of consent, as well as the practical obligations placed on organisations that handle personal data.
Maria and Katie also look at breach notification, enforcement, cross-border transfers, online privacy, direct marketing, and the higher standard applied to children’s data. They then turn to AI, considering how Qatar’s privacy framework may evolve as AI governance becomes more formalised and as global regulatory expectations continue to shift.
This episode is a practical overview for privacy professionals, legal teams, compliance leaders, and anyone interested in how data protection law is developing in Qatar and across the Gulf region.
Enjoyed this episode? Subscribe for more practical insights on global privacy and data protection. If you have questions about India's DPDPA or about KSA and UAE PDPL compliance, get in touch. We would love to hear from you.
Hello and welcome back to the International Data Protection Regulation Podcast, where we explore privacy laws around the world and how they shape our digital lives. I'm Dr. Maria Maloney, and today I'm joined again by my wonderful co-host, Katie Birch.
SPEAKER_01: Thanks, Maria. Hi everyone, it's great to be back.
SPEAKER_02: In today's episode, we're focusing on Qatar, I hope I said that right, a jurisdiction that was one of the first in the Gulf region to introduce a comprehensive national data protection law. The law, Law No. 13 of 2016 concerning personal data privacy protection, reflects many global privacy principles while also responding to Qatar's own governance and cybersecurity context. Together, Katie and I will walk through how Qatar defines personal data, what rights individuals have, what obligations fall on organizations, and how these rules may indirectly shape emerging AI systems. So I'll get straight to the point. The first question goes to you, Katie: to start us off, what is the core data protection legislation in Qatar, and when was it introduced?
SPEAKER_01: Thank you, Maria. So the primary legislation in Qatar is, as you said, Law No. 13 of 2016 concerning personal data privacy protection, often abbreviated to the PDPPL. It was issued on the 3rd of November 2016, and Qatar was actually the first nation within the Gulf Cooperation Council to implement a dedicated national privacy law. The law established a comprehensive framework for the protection of individuals' privacy rights by regulating how personal data is collected, processed, and stored by individuals and organizations. It has strict requirements for data controllers: they have to process data fairly and transparently. There are specific protections for special category data, which we'll come to later. They have their authority, the NCSA, and they have also issued a lot of guidelines to help companies with the practical implementation of the law, which is quite good. Additionally, they have touched on AI a little bit, which I suppose we'll talk about later too: in 2021, the Ministry of Communications and Information Technology released some regulatory guidelines about the use and development of AI. So that's the general outlook, and I'll move on.
SPEAKER_02: Yeah, they seem fairly advanced, like you say. Back in 2016, before the GDPR had even come into force, they had their national data protection law rolled out across the country, which is good.
SPEAKER_01: Yeah, I would agree they're quite on top of things there. So my question for you, Maria: how does Qatar's law define personal data, and does it include sensitive categories?
SPEAKER_02: Thanks, Katie. So in Qatar, the law defines personal data as any information relating to a natural person whose identity is clearly identified or can be reasonably identified. And I think what they mean by that is an identity that can be inferred by putting pieces of information together. Identification can occur either directly through the data itself, such as your name, your national ID number, or your passport, or, like I said, by combining information with any other details or data points that, when combined, could identify who you are. The legislation does include a specific, more rigorous category known as personal data of a special nature, which would be Qatar's equivalent of what we call special category data or sensitive data. This includes categories such as ethnic origin, interestingly, children's data, physical or mental health information, and religious creeds. Another interesting thing is that it can also extend to your marital status or marital relations, so, for example, if you got married, that would then become personal data of a special nature. And if you obtained a criminal record, that would also be contained within that category. The reason they have this category, just like in the GDPR, is that the misuse of data of this special nature is considered in Qatar more likely to cause serious damage to an individual, so it cannot be processed as freely as standard personal data. Under Article 16 of the law, any organization wishing to process these sensitive categories must first obtain explicit permission from the National Cybersecurity Agency, or the NCSA. And once permission is received, organizations, or what we call data controllers, which is the same term used in the GDPR, must implement additional safeguards for this type of data.
So this type of data is considered special, and it needs special treatment under the law. This makes Qatar's approach particularly strict, as it shifts from the self-compliance model we have in the GDPR to a more permit-based model for sensitive data. The Minister of Transport and Communications also reserves the right to add other types of data to this category if it is deemed that misuse of that data would cause serious damage to the individual, so it's the Minister of Transport and Communications that can expand that category in the future. So yeah, they take personal data and special category data very seriously. And we see as well, with the NCSA, the National Cybersecurity Agency, that if you want to process what we call under the GDPR special category data, you actually have to get permission to process that data. So that's interesting. That is different. So moving on, Katie, to question three: what lawful basis does Qatar rely on for processing personal data? Is it consent-central, like we saw with various other legislation in the Gulf region? Or is it similar to the GDPR?
SPEAKER_01: Thanks, Maria. It is consent-central, I would say, yes. Consent seems to be at the core of lawful processing. Article 4 of the PDPPL says that the controller shall process personal data only after obtaining the consent of the individual, unless there is a certain lawful purpose. Individuals have the right to withdraw their consent at any time, and, as per usual, the process for withdrawing consent needs to be as easy as the process of giving it, so that's quite good. So I'm just going to go back to those lawful purposes, because these are more exceptions to consent, situations where you can process data without consent. They're set out in chapter five of the act, and I'm just going to go through them here. We have legal obligations, where you're required by Qatari law or a court order to process data. I'll just mention here that I'm not sure whether contractual necessity would come under legal obligation, because they haven't explicitly listed contract as a basis in the same way the GDPR has; so I assume that, to the extent a contract is legally enforceable, it would come under this lawful purpose. They have public interest, where processing is necessary for a task carried out in the public interest; vital interest; where processing serves a legitimate business purpose, as usual; scientific research carried out in the public interest; and investigating a crime. And as you just touched on, they do have strict requirements for processing sensitive data lawfully, and the bar is much higher. For those categories, having a lawful basis like legitimate interest is not enough: they need the explicit permission of the National Cybersecurity Agency, and they need explicit consent.
SPEAKER_02: Okay, and we saw there as well that personal data of a sensitive nature extends to your criminal record, so you would have to get permission from the National Cybersecurity Agency to process that data. But like you say, it is very much consent-driven, as opposed to the GDPR.
SPEAKER_01: Yeah. It is consent at the forefront, for sure.
SPEAKER_02: Yeah.
SPEAKER_01: So, on to question four for you, and it's about data subject rights. What rights does the law provide to the individual, and how do they compare to the GDPR?
SPEAKER_02: Well, again, I think if you know the data subject rights in the GDPR, you'll feel quite comfortable with the data subject rights in Qatar's Law No. 13, because it grants individuals a specific set of rights and has the same philosophy as the GDPR, fundamentally the protection of individuals' fundamental rights, although the scope in Qatar is somewhat narrower and more focused on the core principles of transparency and control. The law explicitly provides individuals with the following rights regarding their personal information. You have what we call the right to withdraw consent, again similar to what we see in the GDPR. You have the right to object to processing if you think the processing is outside the original purpose for which you gave your data, or it has become excessive, or it's being used in a discriminatory or illegal way. You also have the right to erasure, which is Qatar's version of the right to be forgotten, allowing individuals to request the deletion of their data once the purpose for the processing has ended, or if the data can be proven to have been collected unfairly. There's also a right to access and review, so the individual can view their stored data and verify that it's accurate, which leads to the next right, the right to correction: if you review your data and see that it's inaccurate, you have the right to have it updated. So again, very similar to the GDPR. You have the right to be notified as well, so individuals must be informed of processing activities, the purposes behind them, and any instances where their inaccurate data has been disclosed to third parties. So overall, you'd be comfortable coming from the GDPR to the Qatari set of data subject rights.
In terms of the right to portability, it's not explicitly mentioned in the 2016 law in Qatar, and they don't really provide protection around automated decision-making either, like the GDPR does. As well, it's generally understood that the response time is 30 days, whereas it's more explicit in the GDPR. And of course, like we said, sensitive category data needs a permit. So those are the differences between the GDPR and Qatar, but again, very similar overall. So moving on then to question five, Katie: what are the main organizational obligations under the PDPPL?
SPEAKER_01: Thanks, Maria. So organizations, which as usual are referred to as controllers, are bound by a set of strict operational and technical obligations. They're designed to shift the burden of protection from individuals to the controller. There are a few specific conditions directly from the act, and then a few expansions found within the guidelines. What we find in the act is that they need to review privacy protection measures before entering new processing operations, which is expanded on in the guidelines, which talk about the need for DPIAs before starting any processing activity. They have a couple of guidelines about how to assess that risk and how to carry out the assessment. When they engage processors, they need to make sure the processors will meet their data protection obligations; again, we've seen that before. They're also responsible for training and raising awareness on data protection. They need to develop internal systems to receive and investigate complaints and any rights requests. They need to develop internal systems for the management of personal data, so they basically need a formal framework to manage data lifecycles and document compliance. There's no explicit requirement for a DPO, it's more of a 'may', but it's still recommended, to coordinate and carry out all of these requirements at the end of the day. And they need to use appropriate technologies to enable individuals to exercise their rights. That just reminds me of web forms and the like.
SPEAKER_02: Like we would have in PrivacyEngine, yeah. So facilitating data subjects' ability to exercise their rights or request information. Yeah.
SPEAKER_01: And they also need to conduct ongoing audits and reviews of their compliance with the requirements, and verify processors' compliance, taking appropriate precautions to protect personal data. So a data sharing agreement wouldn't be out of the question there. Some of these things aren't explicitly stated, and as I said, the guidelines go a lot more into it, but it's really a matter of those requirements making your life easier if you abide by them.
SPEAKER_02: Absolutely. So it's the kind of standard approach we've seen with the GDPR: assessments of third parties and vendors, making sure you have a contract in place to ensure that you're protected if a breach happens with the processor, and so on. Yeah.
SPEAKER_01: Yeah, for sure. So they also have their notification obligations and their transparency obligations, which we know quite well.
SPEAKER_02: So they'd all align quite closely with the GDPR, basically.
SPEAKER_01: Yeah. They also have their principles of data minimization and accuracy. They have their breach reporting, but we're going to talk about that a bit more later. So a lot of it is GDPR-related, and you don't need a DPO, but it's easier to be compliant with one, especially if you're handling a lot of data. So we'll move on now to what I was touching on there at the end. Maria, could you tell us a bit about their breach notification requirements and what is expected in practice?
SPEAKER_02: Absolutely, yeah. So in Qatar, the PDPPL explicitly requires breach notification, and as of 2026 the regulatory expectations are strictly enforced by the NCSA, the National Cybersecurity Agency. We've seen guidelines in 2026 and in the recent past that make the law more specific. Unlike some jurisdictions where every minor leak must be reported, and we've seen that more and more with the GDPR, where it seems like you always have to report breaches, Qatar's law, and more specifically Article 14, triggers the notification requirement only if the breach is likely to cause what they call serious damage to the personal data or the privacy of the individuals. How the regulators interpret serious damage usually means something like identity theft or fraudulent use of personal data in some way; the potential of financial loss for the individuals, or unauthorized access to an individual's bank accounts, and so on. Physical or psychological harm as well, particularly involving sensitive data like health or children's data. And we saw that in the Kingdom of Saudi Arabia's law too, remember, which is quite protective of the individual: psychological harm would be considered serious damage. And reputational damage, where there's a significant loss of privacy, would be seen as serious damage as well. So that's quite interesting, the interpretation of what they see as serious damage. We've seen that quite a lot in the Gulf region, where the culture impacts how the law is interpreted, which is an interesting thing.
In terms of the 72-hour rule that we see in the GDPR, the original law back in 2016 was somewhat vague on timing, but, like I said, supplemental guidelines issued by the regulator since then have introduced a clear 72-hour window. The organization must notify the NCSA, the National Cybersecurity Agency, within 72 hours of becoming aware of the breach, and it must notify affected individuals within 72 hours as well if the serious damage threshold is met. This ensures that individuals can take immediate steps, such as changing passwords or freezing credit cards, to protect themselves. So in practice, the Qatari regulator expects a high level of accountability and proactivity when it comes to breach reporting. They expect internal management: organizations are expected to have an internal personal data management system, as you mentioned previously, Katie, that can detect and triage breaches very quickly. If an organization cannot prove that it has the tools to detect a breach, it can be fined even if it eventually reports it; if its processes are seen to be inadequate, it can be fined. Transparency is also a very big thing: when notifying individuals, the organization must use clear, non-technical language to explain what happened, what data was lost, and what steps the individual should take to protect themselves. If the breach happens with a processor or a third party, they are legally required to notify the main organization, what they call the data controller, immediately, so that measures can be put in place to protect the individual and to notify the regulator. In terms of penalties, the expectation for reporting is backed up by significant financial deterrence. Failure to report a breach that can cause serious damage to individuals can result in a fine of up to one million Qatari riyals, which is about $275,000.
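As a rough illustration of the 72-hour window discussed above, here is a minimal Python sketch. The function name, parameters, and the serious-damage flag are hypothetical, invented purely for illustration; they don't come from any official Qatari compliance toolkit, and real breach workflows involve far more than a timestamp calculation.

```python
from datetime import datetime, timedelta, timezone

# 72-hour notification window from the moment the controller
# becomes aware of the breach (per the supplemental guidance).
NOTIFICATION_WINDOW = timedelta(hours=72)

def breach_deadlines(aware_at: datetime, serious_damage_likely: bool):
    """Return (NCSA deadline, individuals deadline or None).

    Individuals only need to be notified when the 'serious damage'
    threshold is met; the NCSA deadline always applies once the
    notification duty is triggered.
    """
    ncsa_deadline = aware_at + NOTIFICATION_WINDOW
    individuals_deadline = ncsa_deadline if serious_damage_likely else None
    return ncsa_deadline, individuals_deadline

# Example: breach discovered 09:00 UTC on 1 March 2026
aware = datetime(2026, 3, 1, 9, 0, tzinfo=timezone.utc)
ncsa, individuals = breach_deadlines(aware, serious_damage_likely=True)
print(ncsa.isoformat())  # 2026-03-04T09:00:00+00:00
```

The point of the sketch is simply that the clock runs from awareness, not from the breach itself, and that the individual-notification duty is conditional on the serious-damage assessment.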
And if the breach happened because the company failed to put appropriate precautions in place, that fine can rise to five million Qatari riyals. That's a significant increase, over a million dollars. So yeah, they take breaches very seriously, and overall, from everything we've looked at so far, we can say that Qatar takes data protection quite seriously as well. Moving on then to question seven, Katie, and we're moving through these quite quickly this week, because I think we're getting more experienced at these questions and answers. Who enforces Qatar's national data protection law, and what penalties exist? I've touched on that in the previous question, but let's look into it in more detail now.
SPEAKER_01: Yeah, thanks, Maria. So as we said, they have the overseeing authority, the National Cybersecurity Agency, but they also have a National Data Protection Office within the cybersecurity agency that's focused on data protection specifically. The agency itself has the power to conduct investigations and audits and to give orders, while the office issues decisions on organizations following those investigations. The PDPPL carries some of the steepest financial penalties in the region. There's failure to report a breach, which you already mentioned, but a fine for lack of proper safeguards could be up to five million Qatari riyals, which is about $1.37 million. Wow. Yeah. So it's significant, and there's also the reputational damage of having a fine and a decision made against you because you didn't have technical measures in place. I feel like sometimes the reputational damage can be worse, especially for big companies.
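For anyone checking the dollar figures mentioned in the episode, here is a quick back-of-the-envelope conversion. It assumes the Qatari riyal's long-standing peg of 3.64 QAR per US dollar; that rate is our assumption for illustration, not an official rate source, and actual converted values will vary with fees and any future changes to the peg.

```python
# Assumed peg: 3.64 QAR per USD (illustrative assumption only).
QAR_PER_USD = 3.64

def qar_to_usd(amount_qar: float) -> float:
    """Convert Qatari riyals to US dollars at the assumed pegged rate."""
    return amount_qar / QAR_PER_USD

# The two fine ceilings discussed in the episode:
print(round(qar_to_usd(1_000_000)))  # ~274,725 — roughly the $275,000 cited
print(round(qar_to_usd(5_000_000)))  # ~1,373,626 — roughly the $1.37 million cited
```

The arithmetic lines up with the figures quoted in the discussion: one million riyals is about $275,000, and five million riyals is about $1.37 million.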
SPEAKER_02: I agree, yeah. And it's interesting there as well that processing sensitive data without a permit can get you the same kind of fine, over a million dollars. That's interesting.
SPEAKER_01: It is, yeah. I don't know what the process of going through that would be. I wonder how much leeway companies who were already processing sensitive data had between the introduction of the act and the permit process.
SPEAKER_02: Yeah, that's true, yeah.
SPEAKER_01: I don't know. But especially in the medical industry, you know. Even religious organizations for marriages, prisons, there are loads. So I'll just touch on an enforcement decision, as we did last week. We found that quite helpful.
SPEAKER_02: Oh yeah, that's true.
SPEAKER_01: Yeah. They don't seem to give the names of the companies involved when they publish enforcement decisions, which is an interesting choice in terms of transparency. But basically, the NDPO issued a ruling for a company in the ICT sector around its compliance program, prompted by an investigation following a formal complaint submitted by a data subject related to a request. We've seen that pattern before: one request that wasn't fulfilled satisfactorily has led to investigations in two jurisdictions so far that we've seen, which then highlighted bigger deficiencies in the privacy programs, and the companies got decisions from the authority saying this is not adequate.
SPEAKER_02: So the finding was more about their privacy program than about the data subject access request. Yeah.
SPEAKER_01: Yeah, exactly. In this case, for the NDPO decision, according to the source, the company's level of cooperation with the authority helped their case. That was apparently noted, that they were quite forthcoming and cooperative. So that's another point to remember, I think: even though those fines exist, once the authority is at your doorstep, cooperate with them and be transparent, rather than developing a defensive mentality of "oh no, I'm going to get a fine".
SPEAKER_02: Yeah, and I think that's a natural response as well. We've seen that with the GDPR: people are quite defensive, of course, and they want to hide things, but the more you cooperate, and the more the authorities or the regulator sees that you're willing to cooperate and open to change, the less strict they're going to be when it comes to fining you.
SPEAKER_01: Yeah, for sure. That's evidenced there anyway. So we'll move on to question eight for you, Maria: what does Qatar's law say about cross-border transfers of personal data?
SPEAKER_02: Thanks, Katie. Well, this is interesting, actually, because Qatar's approach to cross-border transfers under the PDPPL is unique compared to other international frameworks like the GDPR. While most laws start by prohibiting transfers unless specific safeguards are met, Qatar's law begins with a principle of openness. Under Article 15, the law establishes a general rule that a data controller should not take measures or adopt procedures that may curb transborder data flows. So they're almost encouraging it. This means that by default, the law supports the international exchange of data to facilitate global business and technological innovation. That's a different perspective from the GDPR. The GDPR essentially says: if you can't provide the same level of protection for the individual's personal data when transferring it, then you're not allowed to transfer it at all. Whereas here it's saying business and innovation shouldn't be hindered by stopping the international transfer of personal data. This freedom, however, is not absolute. The regulator is empowered to restrict or block the transfer of data if the transfer would violate the provisions of the PDPPL, or if the processing of that data in the destination country is likely to cause gross damage to the individual or their personal data. So there are restrictions. And while the law itself is permissive, the National Cybersecurity Agency has issued guidelines that clarify how organizations should handle these transfers in a professional setting. Again, they go back to explicit consent: for many high-risk transfers, obtaining the individual's clear and explicit consent remains the safest gold standard for compliance.
So even though it's a permissive approach to transfers, you still have to get explicit consent from the individual whose data you're processing. They also look at contractual safeguards. In practice, the regulator expects written contracts between the Qatari entity, the exporter of the data, and the international recipient, who would be the importer. These contracts must ensure that the recipient provides a level of protection at least equivalent to Qatar's law, and that's very similar to the GDPR's idea of adequacy. DPIAs, or data protection impact assessments, can be undertaken in such scenarios as well. For a university or a tech company, the NCSA typically requires a DPIA to be done before starting any kind of large-scale international transfer. You must document that you have analysed the risks of the destination country and implemented mitigation measures, for example, encrypting data when transferring it. The law also provides the government with the right to restrict transfers for specific reasons such as national security, international relations, or the economic interest of the state. In 2026, this often means that certain sovereign data related to citizens or critical infrastructure may be required to stay within Qatar's borders. We're seeing that happen quite a lot across the globe: keeping data within your borders for national safety, a practice known as data sovereignty or data localization. So it looks like they're on the same path as many countries across the globe in terms of protecting personal data, protecting citizens' data, and protecting national security by keeping the data within their own borders.
So question nine, Katie. How is sensitive personal data treated differently under Qatari law?
SPEAKER_01: Thanks, Maria. Sensitive personal data is treated more strictly under Qatar's law. I know we've defined this previously, but I'm just going to go over the definition again: it's data related to ethnic origin, children, health, physical or psychological condition, religious creeds, marital relations, and criminal offenses. You need to get the permission of the authority if you're going to process this data. To get the permission, they also need it paired with a purpose for processing, and explicit consent, or parental consent if it's a minor's data. Two of the grounds under which they can process sensitive data are if the data has been manifestly made public or if it's for an employment obligation, and we've seen that again with the GDPR. They also need to carry out a DPIA to assess the risks and mitigate them. In most cases, organizations are going to need to obtain explicit consent before processing this type of data, and they're always going to need to implement enhanced safeguards to prevent misuse or harm. It really reflects the principle that sensitive data carries greater risk for individuals, and stronger conditions on accountability are needed when handling this kind of data. The guidelines on AI from the Ministry of Communications, which we'll talk about later, also mention that the definition of personal data of a special nature should be interpreted broadly. They mention that explicitly, which is good; it helps when they say whether you should take a strict textual interpretation or understand something in a broad sense.
SPEAKER_02: Yeah, because it encompasses more, basically. It protects more. Yeah.
SPEAKER_01: So yeah, I'll move on to my next question for you, Maria. How does the law address online privacy, marketing, and children's data?
SPEAKER_02: So again, the PDPPL contains robust provisions specifically targeting online environments, the commercial use of data for marketing, and the protection of children, who are afforded a heightened level of care under the law, which I really like. Because children are so vulnerable, and we're seeing more and more how the online environment can damage children, putting extra protections in place for them is a really good idea. The law is primarily designed for the digital age, applying to all personal data that is processed electronically. In 2026, the regulator, the NCSA, expects any organization with a digital presence to adhere to privacy by design, and we've seen that with the GDPR and with a lot of the other legislation we've looked at in this series as well. This means that websites and apps must have accessible privacy notices: clear, non-technical explanations of what data is being collected via cookies, tracking pixels, and so on, and for what purpose. Technical mechanisms that allow users to exercise their rights, such as buttons to request data deletion or to withdraw consent directly through the interface, are also recommended, and that makes withdrawing consent much easier if those things are in place already. With regard to direct marketing, Qatar takes a very firm stance against direct marketing via electronic channels like email, SMS, and automated calls. Under Article 22, organizations are strictly prohibited from sending marketing materials unless they have obtained the individual's prior explicit consent. Then there's the opt-out requirement: every marketing communication must include the sender's identity and a clear, valid address or link that allows the recipient to opt out and stop future communications immediately. That's very similar to the GDPR.
Thankfully, we're now seeing lots of emails where you can literally just unsubscribe right at the top of the email, which makes it really easy. And as well, they will not accept implied consent. Unlike some jurisdictions that allow soft opt-ins for existing customers, the Qatari regulator generally expects a hard opt-in for any electronic marketing activity, so that's quite strict as well. Now, when it comes to children, children's data is specifically identified as being of a special nature, so special category data. This means that the law treats their data with the same level of sensitivity as health or religious information, which is great. So they require the guardian's consent: any website or service targeting children must obtain the explicit consent of the child's parent or legal guardian before processing any of the child's data. Operators are expected to take reasonable steps to verify that the person providing consent is indeed the legal guardian. And upon request, a website operator must provide a guardian with a full description of the data being processed, the reason for the processing, and a copy of the actual data collected about the child. So that is really good, you know, all the data that has been collected about their child can be provided to them upon request. So we're seeing more and more that this data protection law, the PDPPL, has teeth; it takes the protection of data very seriously. So now, Katie, we're going to look at more of the AI angle in terms of data protection. Does Qatar currently have AI-specific legislation? And how does the PDPPL apply to AI systems?
SPEAKER_01Thanks, Maria. This is a somewhat interesting question. Right now, as we're recording this, Qatar doesn't have a standalone AI law like the EU AI Act, but it does have soft law for AI through the national AI strategy and related guidelines. And as usual, the PDPPL applies to AI to the extent that personal data is being used. So where AI models or AI developers are using personal data, they're still subject to transparency, data minimization, accuracy, storage limitation, purpose limitation, and accountability, and I feel like that makes a significant positive impact. Beyond that, in the absence of hard law, the Ministry of Communications and Information Technology has released principles and guidelines for the ethical development and deployment of AI, and likewise principles and guidelines for the ethical use of AI. They're not statutes, but they're considered best practice if you're developing or using AI in Qatar. I find these quite interesting. So the development and deployment principles are: do no harm, ensure system robustness and security, reduce bias and avoid discrimination, protect the environment, safeguard privacy, promote transparency, develop a human-centered approach, and assign the ultimate accountability to humans.
SPEAKER_02That's really good. It's nice to see they're similar to the EU AI Act principles as well.
SPEAKER_01That's really good. But they also have principles for the use of these systems, for anyone who's using them, and they even mention students. There's a big debate about whether students in college should be using AI, and I suppose the ultimate answer is you can't stop them, so you may as well give them guidelines.
SPEAKER_02Absolutely totally great.
SPEAKER_01So the principles for the use of AI are: minimize bias and promote fairness, safeguard personal and organizational data, understand the AI's capabilities and limitations, and assume accountability. That one is entitled Assume Accountability, but it's explained as: gain an understanding, practice informed decision making, and consider the ethical implications when you're using this technology. And there are two more principles: comply with relevant laws, and ensure the well-being of individuals and society.
SPEAKER_02I like those. Especially assume accountability, because you really do have to assume accountability when you're using AI. You can ask AI a question and take the response verbatim, but if it comes out as garbage, then you're responsible for that garbage if you proliferate it. Do you know what I mean? So you have to take ownership of what you're ready to get behind when it comes to responses from things like ChatGPT. Because at the end of the day, you can't blame an AI system; you have to blame yourself for believing the AI system.
SPEAKER_01I personally really like understand its capabilities and limitations.
SPEAKER_02Yeah.
SPEAKER_01You know, I think that's a crucial one. They're all very good.
SPEAKER_02They're very well thought out, I think. Yeah.
SPEAKER_01Yeah. Yeah.
SPEAKER_02And that one kind of blends with assume accountability; understand its limitations and assume accountability go hand in hand. If you understand the AI's limitations, then you have to be ready to assume accountability if you're going to use it.
SPEAKER_01I think you'll like to hear this too, then, on assume accountability. Qatar's International Court and Dispute Resolution Centre has issued a practice direction with guidelines for the use of artificial intelligence in court proceedings. The reason they did this stems from a judgment related to the Qatar Financial Centre Court, where somebody referenced case law that did not exist. That goes beyond mere poor research; it's professional negligence, you know. And the lawyer was found to be in contempt of court for putting forward an argument that was based on an AI hallucination, basically. So these guys are on it.
SPEAKER_02Yeah, it's funny as well, because I was just on LinkedIn today and I saw a post from a friend who said that the Pope has actually come out and asked priests not to use AI when creating sermons, would you believe? That's mental. Really interesting. So everybody's using AI now, even the priests.
SPEAKER_01So yeah, I really like those use guidelines; they can apply anywhere, you know.
SPEAKER_02Absolutely. If you take them into account and you really consider them, yeah, yeah.
SPEAKER_01So we'll move on to our final question here for yourself, Maria. Looking ahead, do you expect Qatar's privacy framework to evolve in response to AI and global trends?
SPEAKER_02Thanks, Katie. Well, this is the last question, and my answer is: I think it has to. Like every country, Qatar has to accept that AI is going to be in our future. Qatar's privacy framework is expected to undergo a significant shift from passive compliance to proactive AI governance, and we're seeing that everywhere lately. The GDPR era was quite passive compliance: writing documents, writing policies, demonstrating that you've looked into things and have documentation to show for it. Whereas now, in the AI era, evidence is key to compliance. You can no longer just write a policy; you really have to show the auditors that you're actually doing what you say you're doing. Hence, in Qatar, proactive AI governance. The PDPPL remains the core statute, but there are several clear trends that indicate how the framework is evolving to keep pace with global movements like the EU AI Act and the surge in agentic AI, which everybody's talking about at the moment. For the past few years, Qatar has relied on soft law, as you mentioned, Katie, in the form of ethical AI guidelines issued by the Ministry of Communications and Information Technology. However, as of early 2026, there is a clear move towards making these principles legally binding. Experts expect an amendment to the PDPPL or a new executive decree that formally mandates algorithmic auditing and bias assessments for high-risk AI systems used in public services, healthcare, and finance. This would move Qatar from simply suggesting ethical AI to requiring it, with the National Cybersecurity Agency acting as the primary enforcer. In 2026, Qatar is also doubling down on the data localization we referenced earlier under data sovereignty, making sure that certain types of personal data and sensitive data stay within the borders of Qatar.
As the country builds out its own infrastructure, like the recently expanded Oracle and Azure cloud regions in Doha, the government is increasingly requiring that sensitive data about Qatari citizens and critical national infrastructure stays within the country's borders. This mirrors global trends in sovereign clouds, where nations want to ensure that AI models are trained on local data that never leaves their jurisdiction, protecting against foreign surveillance and ensuring cultural alignment. We are also seeing a trend of regional convergence. Countries like Saudi Arabia and Oman have recently updated their laws to be even more GDPR-like, introducing mandatory data protection officers and stricter portability rights. Qatar is likely to update its 2016 law to include missing modern rights like data portability and a more explicit right to contest automated decision making. This would ensure that Qatar remains a GDPR-aligned pioneer in the region, facilitating smoother trade and data flows between Europe and Qatar. The regulator is also moving beyond just policies on paper: the NCSA's 2026 agenda places a massive emphasis on technical enforcement, and organizations will soon be expected to prove that they are using techniques like synthetic data or differential privacy when training AI models. I think that's where it's going in terms of data protection and AI: very closely aligned with AI compliance, as we're seeing with the GDPR as well. The data protection law is fundamental, and any AI laws coming in would have to sit on top of that data protection law, again, similar to Europe. So that concludes our podcast for today. I think we've been quite efficient; usually we take an hour, but we're definitely within the hour this time. So this is the end of today's episode on Qatar's Personal Data Privacy Protection Law, Law No. 13 of 2016.
Katie and I have explored the foundations of Qatar's national privacy regime, the rights it provides, and how it may increasingly intersect with AI governance as technology evolves. But there's a lot more to be learned and understood about the law, so I would encourage you all to go and read it and interpret it for yourselves. Thanks for listening, and until next time.
SPEAKER_01Thanks everyone. Take care.