{"id":12736,"date":"2026-05-05T05:25:15","date_gmt":"2026-05-05T05:25:15","guid":{"rendered":"https:\/\/negd.gov.in\/?post_type=blog&#038;p=12736"},"modified":"2026-05-05T06:33:10","modified_gmt":"2026-05-05T06:33:10","slug":"implementing-privacy-by-design-in-government-technology-after-the-dpdp-act","status":"publish","type":"blog","link":"https:\/\/negd.gov.in\/hi\/blog\/implementing-privacy-by-design-in-government-technology-after-the-dpdp-act\/","title":{"rendered":"Implementing Privacy by Design in Government Technology After the DPDP Act"},"content":{"rendered":"<h4>\u00a0\u00a0\u00a0\u00a0 I.\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0 Why privacy can no longer be treated as an afterthought<\/h4>\n<p>India\u2019s digital governance model has been driven by scale, interoperability, and service delivery. In trying to make systems faster and more connected, how does the State make sure it is not asking for, storing, or using more personal data than it really needs? After the Digital Personal Data Protection Act, 2023, that is no longer avoidable. The Act expects the Data Fiduciary to put in place reasonable technical and organisational safeguards to protect personal data. The Act also mandates the erasure of personal data upon the expiry of its specified purpose or when retention is no longer required, unless continued retention is necessary for compliance with other applicable legal mandates (Digital Personal Data Protection Act, 2023, 2023). These statutory duties must also be read in the broader constitutional context of informational privacy, dignity, autonomy, and proportionality recognised by the Supreme Court in Justice K. S. Puttaswamy (Retd.) v. Union of India (2017) (Justice K S Puttaswamy (Retd.), And Anr. vs. Union of India And Ors., 2017)<\/p>\n<p>Privacy is most effectively integrated at the architecture stage rather than as a subsequent review. Put simply, privacy has to be built into the system at the beginning i.e. 
at the stage where decisions are made regarding what data is to be collected, who may access it, how long it is required to be retained, how breaches are handled, and how vendors or processors are regulated. Once a system is already built, privacy measures are often bolted on at the end instead of functioning as real, built\u2011in protections (Ann Cavoukian, n.d.).<\/p>\n<p>This is particularly relevant for large-scale digital public infrastructure. While Aadhaar and DigiLocker operate under distinct statutory and architectural frameworks, they illustrate how identity-linked services, authentication and verification layers, digital documents, and service delivery increasingly intersect within India\u2019s government technology landscape. This convergence underscores the need for robust, platform-specific privacy-by-design and governance approaches to ensure trust, accountability, and the protection of citizen data as digital public infrastructure scales (IAPP, 2025). In such systems, privacy risks do not arise only from a data breach. They may also arise from excessive collection, unclear retention, unnecessary access, weak auditability, or data-sharing arrangements that are not sufficiently tied to a defined public purpose. This makes privacy-by-design particularly important in Government technology, because the architecture of the system often determines how citizens\u2019 data will be accessed, shared, retained, and protected in practice.<\/p>\n<h4>\u00a0 II.\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0 Privacy by Design in the Government context<\/h4>\n<p>Government systems often sit at the intersection of welfare delivery, regulatory decision-making, authentication, and public records. 
A commercial app with weak privacy practices may create consumer harm; a public platform with weak privacy practices may affect legal entitlements, expose sensitive records, or undermine trust in the State itself.<\/p>\n<p>For Government systems, privacy is not ensured only through policies set out on paper. It also depends on system design and operation. This includes making sure that only necessary data is collected, access is confined to those who require it for their functions, records are maintained for later review, retention and deletion form part of routine processes, and the department is in a position to explain why each category of data is being collected and where responsibility lies when outside vendors are involved (Digital Personal Data Protection Act, 2023, 2023).<\/p>\n<div id=\"attachment_12737\" style=\"width: 660px\" class=\"wp-caption alignnone\"><img loading=\"lazy\" decoding=\"async\" aria-describedby=\"caption-attachment-12737\" class=\"wp-image-12737\" src=\"https:\/\/negd.gov.in\/wp-content\/uploads\/2026\/05\/figure-1.png\" alt=\"\" width=\"650\" height=\"434\" \/><p id=\"caption-attachment-12737\" class=\"wp-caption-text\">Figure 1. Generated with AI- Seven foundational principles of Privacy by Design, adapted from Ann Cavoukian, Information and Privacy Commissioner of Ontario (<a href=\"https:\/\/www.ipc.on.ca\/sites\/default\/files\/legacy\/2018\/01\/pbd-1.pdf\">https:\/\/www.ipc.on.ca\/sites\/default\/files\/legacy\/2018\/01\/pbd-1.pdf<\/a>).<\/p><\/div>\n<p>As shown in Figure 1, Privacy by Design provides a structured way to translate privacy obligations into administrative, technical, and procurement decisions. 
In the Government context, its seven principles may be understood as follows (Ann Cavoukian, n.d.).<\/p>\n<p>(i) Privacy should be proactive, not reactive- Departments should identify privacy risks before a system is launched, rather than addressing them only after complaints, breaches, or audit observations arise.<\/p>\n<p>(ii) Privacy should be the default setting- A citizen should not have to take extra steps to avoid unnecessary collection, disclosure, or retention of personal data. Government systems should be designed to collect and display only what is necessary for the service.<\/p>\n<p>(iii) Privacy should be embedded into design- It should be visible in the real\u2011world workings of the system: how data travels, who is allowed to see it, the records of who accessed it, how long it is stored, whether it is hidden or encrypted, and how people can report problems. It should not exist only in a privacy policy that no one reads.<\/p>\n<p>(iv) Privacy by Design seeks full functionality- It does not require departments to choose between service delivery and privacy. Instead, it encourages systems that are efficient while also being proportionate and privacy-preserving.<\/p>\n<p>(v) End-to-End Security- There should be end-to-end security across the data lifecycle. Personal data should be protected at the stages of collection, storage, use, sharing, archival, and deletion.<\/p>\n<p>(vi) Visibility and Transparency- Departments should be able to explain what data is collected, for what purpose, where it is stored, with whom it is shared, and when it will be deleted.<\/p>\n<p>(vii) Respect for User Privacy- The approach should remain user-centric. 
Citizens should receive clear information, meaningful control where applicable, accessible grievance mechanisms, and assurance that their data will be handled only in accordance with law.<\/p>\n<p>These principles are useful because they convert privacy from an abstract legal requirement into a practical design and governance discipline.<\/p>\n<h4>III.\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0 Procurement is where privacy really begins<\/h4>\n<p>Once the seven principles are translated into Government practice, procurement becomes the first formal point at which they can be made enforceable. Privacy is often discussed only after the system has already been designed, yet in government projects the design choices that shape privacy usually begin at procurement. When RFPs provide specific guidance on data lifecycle management, vendor responses are better positioned to reflect clear compliance standards. Detailed procurement criteria help avoid ambiguity regarding personal data treatment.<\/p>\n<p>An effective procurement framework should call for precise disclosure from vendors on the treatment of personal data within the proposed system, i.e., how such data will circulate, which data fields are indispensable, who will be authorised to access them, how audit records will be maintained, and what controls will govern the involvement of subcontractors or processors. It is also the point at which the State decides whether privacy will function as an enforceable design principle or as a statement of intent.<\/p>\n<p>Privacy by Design is not just about collecting data properly at the start. It also requires attention to what happens afterwards, i.e., how the data is stored, used, shared, retained, and eventually deleted. The DPDP Act takes that broader approach. 
It focuses not only on the first point of collection but also on safeguards, responsibility within the organisation, erasure, and accountability, even where processing is carried out through Data Processors (Digital Personal Data Protection Act, 2023, 2023).<\/p>\n<h4>IV.\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0 Privacy by Design and Security by Design as Complementary Frameworks<\/h4>\n<p>Privacy by Design and Security by Design are closely connected and mutually reinforcing, but they serve different functions. Privacy by Design focuses on whether personal data is collected, used, retained, shared, and erased in a lawful, necessary, and proportionate manner. Security by Design focuses on system integrity, availability, resilience, threat management, incident response, and protection against unauthorised access. In practice, both frameworks must operate together. A system that is not secure cannot meaningfully protect privacy; at the same time, a secure system must also ensure that it does not collect or retain more personal data than is necessary for the stated purpose (Ann Cavoukian, n.d.; Digital Personal Data Protection Act, 2023, 2023).<\/p>\n<p>Privacy and security are increasingly managed as complementary, integrated requirements within a unified governance framework. One useful reference point for Security by Design is zero trust architecture, which NIST describes not as a single product but as a set of principles for workflow, system design, and operations, aimed at reducing breaches and limiting internal lateral movement (Rose et al., 2020).<br \/>\nIn public digital systems, access should not be given just because someone is already on the network. It should depend on role, need, and regular checks. Citizen data should also not be left widely visible within the system. Segmentation, restricted access, and audit logs help reduce unnecessary access to personal data.<\/p>\n<p>This distinction is useful for Government implementation. 
Security reviews, vulnerability assessments, and incident-response mechanisms are essential, but they should be complemented by a separate privacy review that examines necessity, purpose limitation, retention, access, processor governance, and citizen-facing rights. The legal, programme, procurement, privacy, and information-security functions should, therefore, work in coordination, while retaining clarity on their respective responsibilities (CERT-In, 2022; Digital Personal Data Protection Act, 2023, 2023).<\/p>\n<p>Departments cannot afford to treat encryption, audit records, separation of environments, or deletion practices as back-end matters for vendors alone to settle. These are part of the basic conditions on which the lawful handling of personal data will later be judged. In the public sector, privacy means little unless it is reflected in how the system is built and run.<\/p>\n<h4>\u00a0\u00a0 V.\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0 Incident response and reporting obligations<\/h4>\n<p>A well-drafted policy is of little value if an organisation cannot respond quickly when something goes wrong. In India, this is reinforced by CERT-In\u2019s binding directions of April 2022, which require regulated entities to report specified cyber incidents to CERT-In within six hours of becoming aware of them (CERT-In, 2022).<\/p>\n<p>CERT-In has also issued regular guidance. As the Press Information Bureau noted on 23.01.2026, this included reports, advisories, white papers, and technical guidelines on subjects such as smart city infrastructure, ransomware, audit policy, and software and hardware bills of materials (PIB, 2026). 
Along with the 2022 directions, this suggests a growing focus on cyber preparedness in public systems.<\/p>\n<div id=\"attachment_12739\" style=\"width: 660px\" class=\"wp-caption alignnone\"><img loading=\"lazy\" decoding=\"async\" aria-describedby=\"caption-attachment-12739\" class=\"wp-image-12739\" src=\"https:\/\/negd.gov.in\/wp-content\/uploads\/2026\/05\/fig-2.png\" alt=\"\" width=\"650\" height=\"434\" \/><p id=\"caption-attachment-12739\" class=\"wp-caption-text\">Figure 2. Generated with AI- Privacy embedded across procurement, architecture, operations, and accountability in government technology, as mandated by the DPDP Act, 2023 and Rules.<\/p><\/div>\n<h4>VI.\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0 The AI layer in public digital systems<\/h4>\n<p>The use of AI in public digital systems can support efficiency, better service delivery, and improved decision-support. At the same time, it introduces additional privacy and governance considerations that should be addressed at the design and procurement stage itself. This is consistent with India\u2019s responsible AI approach, which emphasises safety, reliability, accountability, transparency, and human-centric deployment of AI systems (NITI Aayog August 2021, n.d.; NITI Aayog February 2021, n.d.).<\/p>\n<p>For public authorities, the first step is to distinguish between different kinds of AI use. AI may be used for internal productivity, document summarisation, translation, citizen-service support, risk analysis, or decision-support. Each use case carries a different level of privacy risk. 
A low-risk internal productivity tool may require one set of controls, while an AI system that influences eligibility, scrutiny, prioritisation, or service delivery may require stronger safeguards, human oversight, and auditability.<\/p>\n<p>Departments should also examine whether personal data is being processed through third-party AI tools, whether prompts or outputs are retained, whether the provider is permitted to reuse the data, whether the system is hosted in a controlled environment, and whether adequate access, logging, retention, and deletion controls are available. Sensitive citizen data should not be entered into AI tools unless the department has clarity on these issues and has authorised such use.<\/p>\n<p>To ensure that AI-driven support remains accountable, systems should be designed to complement, rather than substitute for, administrative judgment, incorporating clear interpretability and human-in-the-loop review mechanisms. Human oversight, recorded reasons, and grievance pathways should be built into the governance framework, particularly where the outcome may affect access to benefits, services, entitlements, or regulatory treatment (NITI Aayog February 2021, n.d.).<\/p>\n<p>Purpose limitation is equally important. If personal data collected for one Government service is later proposed to be used for AI model training, fine-tuning, or analytics, the department should separately examine whether such use is legally permissible, necessary, proportionate, and consistent with the original purpose. Wherever feasible, anonymised, aggregated, synthetic, or test data should be preferred over identifiable citizen data (Digital Personal Data Protection Act, 2023, 2023).<\/p>\n<p>AI systems may also raise concerns relating to bias, explainability, and accountability. If datasets are incomplete or historically skewed, automated systems may produce outcomes that require careful review. 
For Government use, procurement documents may, therefore, require dataset documentation, bias testing, explainability standards, periodic audit, and clear allocation of responsibility between the department and the technology provider (NITI Aayog August 2021, n.d.; NITI Aayog February 2021, n.d.).<\/p>\n<p>The right of erasure may also require special attention in AI contexts. In conventional databases, deletion may be relatively straightforward. In AI systems, particularly where data has been used for model training or fine-tuning, it may be more difficult to identify and remove the influence of specific data points. This does not mean that AI should be avoided. It means that departments should adopt a cautious approach before introducing identifiable personal data into AI training pipelines and should consider safeguards such as anonymisation, exclusion of sensitive datasets, restricted training environments, contractual limitations on reuse, and documented deletion or mitigation processes (Digital Personal Data Protection Act, 2023, 2023).<\/p>\n<p>Accordingly, AI governance in Government should be built around clear use-case approval, data classification, restrictions on use of public or commercial AI tools, hosting and retention requirements, model-training controls, human oversight, audit rights, and accountability for errors or harms. Privacy by Design in AI therefore requires not only a privacy notice or contractual clause, but a considered assessment of how the system will actually process, retain, and influence personal data.<\/p>\n<h4>VII.\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0 Conclusion<\/h4>\n<p>In Government technology, privacy has to shape procurement, system design, operations, AI governance, and accountability. 
The DPDP Act reflects this through its focus on organisational measures, reasonable security safeguards, erasure, and continuing responsibility, including where processing is carried out through processors (Digital Personal Data Protection Act, 2023, 2023). CERT-In\u2019s reporting framework and wider cybersecurity guidance also point towards the need for risk-aware public digital systems (CERT-In, 2022; PIB, 2026).<\/p>\n<p>Viewed that way, Privacy by Design is not just a phrase. It is a practical method for ensuring that digital governance remains lawful, proportionate, secure, and worthy of public trust. The strength of the State\u2019s digital infrastructure will depend not only on what it is able to deliver, but also on whether it handles personal data with care at every stage, from collection to eventual deletion.<\/p>\n<h4>VIII.\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0 References<\/h4>\n<p>a. Ann Cavoukian. (n.d.). Privacy by design. Information and Privacy Commissioner of Ontario. Retrieved April 27, 2026, from https:\/\/www.ipc.on.ca\/sites\/default\/files\/legacy\/2018\/01\/pbd-1.pdf<\/p>\n<p>b. CERT-In. (2022, April 28). Directions for Safe &amp; Trusted Internet. https:\/\/www.cert-in.org.in\/PDF\/CERT-In_Directions_70B_28.04.2022.pdf<\/p>\n<p>c. Digital Personal Data Protection Act, 2023. (2023). https:\/\/www.meity.gov.in\/static\/uploads\/2024\/06\/2bf1f0e9f04e6fb4f8fef35e82c42aa5.pdf<\/p>\n<p>d. IAPP. (2025, October). The DigiLocker Story. https:\/\/dial.global\/wp-content\/uploads\/2025\/10\/The-DigiLocker-Story.pdf<\/p>\n<p>e. Justice K S Puttaswamy (Retd.), And Anr. v. Union of India And Ors. (2017).<\/p>\n<p>f. NITI Aayog August 2021. (n.d.). Responsible-AI. Retrieved April 27, 2026, from https:\/\/www.niti.gov.in\/sites\/default\/files\/2021-08\/Part2-Responsible-AI-12082021.pdf<\/p>\n<p>g. NITI Aayog February 2021. (n.d.). Responsible-AI. 
Retrieved April 27, 2026, from https:\/\/www.niti.gov.in\/sites\/default\/files\/2021-02\/Responsible-AI-22022021.pdf<\/p>\n<p>h. PIB. (2026, January 23). CERT-In: India\u2019s frontline defender against cyber threats. Government of India. https:\/\/static.pib.gov.in\/WriteReadData\/specificdocs\/documents\/2026\/jan\/doc2026123764501.pdf<\/p>\n<p>i. Rose, S., Borchert, O., Mitchell, S., &amp; Connelly, S. (2020). Zero Trust Architecture. National Institute of Standards and Technology. https:\/\/doi.org\/10.6028\/NIST.SP.800-207<\/p>\n<p>(This article has been written by Astha Ojha, TAU, National e-Governance Division, MeitY. For any comments or feedback, please write to <a href=\"mailto:astha.ojha@digitalindia.gov.in\">astha.ojha@digitalindia.gov.in<\/a> and <a 
href=\"mailto:negdcb@digitalindia.gov.in\">negdcb@digitalindia.gov.in<\/a>.)<\/p>\n<p>&nbsp;<\/p>","protected":false},"author":10,"featured_media":12740,"comment_status":"closed","ping_status":"closed","template":"","blog_category":[351],"class_list":["post-12736","blog","type-blog","status-publish","has-post-thumbnail","hentry","blog_category-digital-infrastructure-cyber-security"],"acf":[],"_links":{"self":[{"href":"https:\/\/negd.gov.in\/hi\/wp-json\/wp\/v2\/blog\/12736","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/negd.gov.in\/hi\/wp-json\/wp\/v2\/blog"}],"about":[{"href":"https:\/\/negd.gov.in\/hi\/wp-json\/wp\/v2\/types\/blog"}],"author":[{"embeddable":true,"href":"https:\/\/negd.gov.in\/hi\/wp-json\/wp\/v2\/users\/10"}],"replies":[{"embeddable":true,"href":"https:\/\/negd.gov.in\/hi\/wp-json\/wp\/v2\/comments?post=12736"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/negd.gov.in\/hi\/wp-json\/wp\/v2\/media\/12740"}],"wp:attachment":[{"href":"https:\/\/negd.gov.in\/hi\/wp-json\/wp\/v2\/media?parent=12736"}],"wp:term":[{"taxonomy":"blog_category","embeddable":true,"href":"https:\/\/negd.gov.in\/hi\/wp-json\/wp\/v2\/blog_category?post=12736"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}