{"id":9661,"date":"2025-10-01T07:32:34","date_gmt":"2025-10-01T07:32:34","guid":{"rendered":"https:\/\/negd.gov.in\/?post_type=blog&#038;p=9661"},"modified":"2026-03-18T10:37:57","modified_gmt":"2026-03-18T10:37:57","slug":"from-ethics-to-enforcement-an-integrated-playbook-for-cybersecurity-law-risk-and-policy","status":"publish","type":"blog","link":"https:\/\/negd.gov.in\/hi\/blog\/from-ethics-to-enforcement-an-integrated-playbook-for-cybersecurity-law-risk-and-policy\/","title":{"rendered":"From Ethics to Enforcement: An Integrated Playbook for Cybersecurity Law, Risk, and Policy"},"content":{"rendered":"<p><span style=\"text-decoration: underline\"><strong>Part I \u2014 Foundations and Principles<\/strong><\/span><\/p>\n<p><strong>Foundations of Cyberethics<\/strong><\/p>\n<p>Cyberspace is not a single technology but a living mesh of networks, devices, software, and control systems that now underpin everything from public health and payments to education, energy, and critical national infrastructure. Within this environment, routine choices about identity, logging, encryption, telemetry, and disclosure can profoundly shape privacy, safety, markets, and democratic participation, often affecting people who never had a realistic opportunity to understand or negotiate the risks. This playbook starts from that reality and sets out a disciplined way to align what should be done (ethics), what must be done (law), and what is usually done in practice (policy), so day\u2011to\u2011day engineering and governance protect people by design and not by accident.<\/p>\n<p>At its core, the analysis uses three anchors. First, the triad of ethics\u2013law\u2013policy is positioned as a practical architecture: ethics supplies direction, law provides enforceable guardrails and remedies, and policy operationalises both through roles, processes, and technical controls. 
Second, the relationship between privacy and security is presented as overlapping but not identical aims: privacy centres on individual control over information life\u2011cycles; security centres on keeping systems and data protected against unauthorised access, alteration, disclosure, and disruption; and the overlap is where trust is either made or lost. Third, established norms\u2014the Internet Architecture Board\u2019s unacceptable activities[i] and the Ten Commandments of Computer Ethics[ii]\u2014are treated as design\u2011time guidance, not wall art, and are woven into concrete choices about least privilege, patching, logging, key management, vulnerability disclosure, and breach notification.<\/p>\n<p>Two U.S. anchors\u2014the Morris Worm prosecution[iii] and the Lori Drew case[iv]\u2014are retained to show, respectively, how small acts can propagate into large harms at Internet scale, and how platform deception that foreseeably causes severe harm raises limits\u2011of\u2011law questions that cannot be answered by terms of service alone. Indian scenarios are incorporated to situate the same ethical problems within the Information Technology Act\u2019s offence and enforcement framework, the Supreme Court\u2019s speech (Shreya Singhal[v]) and privacy (Puttaswamy[vi]) guardrails, CERT\u2011In\u2019s incident\u2011response role and reporting expectations, the Digital Personal Data Protection Act\u2019s[vii] consent\u2011centric baseline, and the intermediary\u2011due\u2011diligence regime, alongside practical case studies (for example, Suhas Katti[viii], \u201cSulli Deals,\u201d and \u201cBulli Bai\u201d[ix]) that demonstrate gendered online harms and the operational importance of platform cooperation and timely takedowns.<\/p>\n<p>The audience is deliberately broad. Legal counsel will find a translation of doctrine into artefacts of proof and remedy. Policy owners and risk managers will find decision rubrics that are short enough to use and strong enough to defend. 
Engineers and operators will find patterns for secure\u2011by\u2011design and privacy\u2011by\u2011default implementation that do not break under load. Executives and board members will find a strategic vocabulary for resilience that treats transparency as a capability, not a liability. Students and educators will find accessible scaffolding\u2014the IAB list[i] and the Ten Commandments[ii]\u2014tied to concrete practices rather than abstract ideals. Across all roles, the aim is one habit: to treat ethics, policy, and law as a single system that makes responsible behaviour the path of least resistance.<\/p>\n<p>Practically, the introduction also previews how the document should be read. Foundations present the ethics\u2013policy\u2013law triad and the privacy\u2013security relationship; cross\u2011cutting issues frame recurring dilemmas and role conflicts; domain chapters show how those dilemmas appear in networks, embedded systems, health care, finance, cloud, blockchain, IoT, AI, intelligence, biometrics, and enterprise life; and the legal section converts the \u201cshould\u201d into \u201cmust\u201d by mapping offences, procedures, sector statutes, and institutional roles to controls, evidence, and remedies. The case anchors and Indian scenarios are not detours; they are the proof points that keep the guidance honest. 
The result is a playbook designed to travel\u2014from classroom to control room, from board pack to breach call\u2014without losing coherence or moral clarity.<\/p>\n<p><strong>Ethics, policy, and law: should, usually, must<\/strong><\/p>\n<p>Ethics articulates what should be done, law codifies a subset of those expectations into duties that must be followed, and policy translates both into rules and controls that are usually followed in day\u2011to\u2011day operations, aligning aspiration, enforcement, and routine practice in one governance system.<\/p>\n<p>In practice, ethics provides direction, law supplies enforceable limits and remedies, and policy operationalises both through roles, processes, and technical measures\u2014authentication, least privilege, logging, encryption, disclosure\u2014so ordinary engineering choices reflect shared obligations rather than convenience.<\/p>\n<p>The three\u2011circle Venn below captures \u201cEthics (should), Policy (usually), Law (must),\u201d with overlaps labelled for codified moral duties, operational mandates, organisational values, and a central, aligned governance zone.<\/p>\n<p>As the Part I Venn illustrates, privacy and security overlap without becoming identical, which is why proportionality and purpose limitation matter whenever telemetry expands.<\/p>\n<p>This alignment matters because ethics without policy does not scale, policy without law can falter under pressure, and law without ethical support can yield formal compliance without substantive protection for people and systems. 
(see Figure 1.1 above) In Part II the same design discipline is applied to cross\u2011cutting dilemmas and domain contexts where the privacy\u2013security tension is most acute.<\/p>\n<p><strong>Privacy versus security: difference and common ground<\/strong><\/p>\n<p>Privacy centres on a person\u2019s authority to control the collection, use, disclosure, retention, and downstream flow of information about them, while security protects information and systems against unauthorised access, alteration, disclosure, or disruption to preserve confidentiality, integrity, and availability.<\/p>\n<p>These aims reinforce each other but can diverge when added monitoring, logging, or retention enhances protection while intruding on autonomy, which is why proportionality, data minimisation, purpose limitation, and transparency are critical to reconcile protection with control in live operations.<\/p>\n<p>The Venn below presents distinctions and shared aims: privacy foregrounds control and ethics of handling personal data, security foregrounds defence against misuse or disruption, and the overlap emphasises that both sustain trust and that failures in either undermine the other.<\/p>\n<p>A simple way to remember the split is that privacy asks who decides and why, and security asks how to keep agreed decisions effective under benign and hostile conditions. 
(see Figure 1.2 above)<\/p>\n<p><strong>Principles into practice: accepted norms that guide decisions<\/strong><\/p>\n<p>Normative touchstones only matter when they shape concrete choices about identity, privilege, logging, patching cadence, encryption, key lifecycles, vulnerability disclosure processes, and breach notification timing, documentation, and scope.<br \/>\nEthically mature teams anticipate harm, choose safer alternatives, document reasons, and remediate with transparency that protects stakeholders while building organisational memory, thereby aligning engineering quality with moral responsibility.<\/p>\n<p><strong>Internet Architecture Board activities deemed unethical[x]<\/strong><br \/>\ni. Activities that seek unauthorised access to Internet resources.<br \/>\nii. Activities that disrupt intended use of the Internet.<br \/>\niii. Activities that waste resources, whether people, capacity, or computing.<br \/>\niv. Activities that destroy the integrity of computer\u2011based information.<br \/>\nv. Activities that compromise the privacy of users.<\/p>\n<p><strong>Ten Commandments of Computer Ethics[xi]<\/strong><br \/>\nI. A computer shall not be used to harm other people.<br \/>\nII. One must not interfere with other people\u2019s computer work.<br \/>\nIII. It is unethical to snoop around in other people\u2019s computer files.<br \/>\nIV. A computer must not be used for stealing.<br \/>\nV. A computer must not be used for bearing false witness.<br \/>\nVI. It is unethical to copy or use proprietary software without permission or paying for it.<br \/>\nVII. It is unethical to access an individual\u2019s computer resources without authorisation.<br \/>\nVIII. It is unethical to appropriate other people\u2019s intellectual output.<br \/>\nIX. When writing a programme or designing a system, one must think about the social consequences.<br \/>\nX. 
A computer shall always be used in ways that ensure consideration and respect for fellow humans.<\/p>\n<p>The next part applies these foundations to recurring dilemmas and to sector contexts where stakes, constraints, and trade\u2011offs are clearest.<\/p>\n<p><strong>Part II \u2014 Practice and Domains<\/strong><\/p>\n<p><strong>Case anchor 1 (U.S.): the Morris Worm prosecution[xii]<\/strong><\/p>\n<p>The Morris Worm prosecution under the Computer Fraud and Abuse Act (CFAA) is widely treated as a turning point that demonstrated how a single code defect and ill\u2011judged experiment could propagate quickly and impose real operational and economic harms, catalysing the growth of network security as a serious engineering discipline.<\/p>\n<p>Its enduring lesson is that \u201cbenign intent\u201d does not neutralise foreseeable harm at Internet scale and that secure\u2011by\u2011design and test discipline are ethical as well as technical duties when third parties bear the risk of failure.<\/p>\n<p><strong>Case anchor 2 (U.S.): United States v. Lori Drew[xiii]<\/strong><\/p>\n<p>United States v. 
Lori Drew arose from the creation of a fictitious social\u2011media profile used to manipulate and humiliate a minor who later died by suicide, raising difficult questions about the fit between platform terms, intent, and federal anti\u2011hacking statutes and ultimately prompting sharper debate over proportionality and the limits of criminal law for platform\u2011rule violations.<\/p>\n<p>The practical lesson is that deception which foreseeably inflicts severe emotional harm can cross ethical and legal boundaries even when the immediate acts are framed as \u201crule breaking\u201d rather than \u201chacking,\u201d underscoring the need for calibrated rules, education, and early\u2011warning interventions around teen safety online.<\/p>\n<p><strong>Indian statutory baselines and institutions<\/strong><\/p>\n<p>India\u2019s Information Technology Act, 2000[xiv] (IT Act) is the primary statute governing electronic records, digital signatures, cybercrime, and powers to investigate and remedy offences, and it applies nationwide with extraterritorial reach where Indian systems are implicated.<br \/>\nSignificant 2008 amendments added new offences and powers, including the Section 66 series for unauthorised acts, Section 69 for interception and decryption powers, and provisions addressing obscenity, child sexual abuse material, and cyber\u2011terrorism.<\/p>\n<p>In Shreya Singhal v. Union of India (2015), the Supreme Court recalibrated the IT Act\u2019s free-speech interface. By invalidating Section 66A, the Court removed overbroad and vague restrictions on digital expression. Simultaneously, it upheld Section 69A, affirming that its procedural safeguards provide a constitutional and rule-governed framework for protecting public order. 
This ruling ensured that all digital safety measures remain subject to judicial oversight and respect due process.<\/p>\n<p>Organisationally, the Indian Computer Emergency Response Team (CERT\u2011In) is designated under Section 70B[xiv] as the national nodal agency for incident response, advisories, coordination, and mandatory reporting directions, including 2022 requirements for rapid incident reporting and log retention to strengthen collective defence.<\/p>\n<p>The Digital Personal Data Protection Act, 2023 (DPDP Act)[vii] establishes consent\u2011centric duties for \u201cdata fiduciaries,\u201d individual rights (access, correction, erasure, grievance), and a Data Protection Board for enforcement, with scope for specified exemptions and application to processing linked to the Indian market, moving the privacy baseline closer to a comprehensive data\u2011protection model.<\/p>\n<p>In parallel, the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021, as amended in 2023[xv], impose due\u2011diligence obligations on intermediaries, significant social\u2011media platforms, and online\u2011gaming intermediaries, with risks to safe\u2011harbour protection if obligations are not met, especially in relation to specified unlawful or State\u2011identified false content.<\/p>\n<p><strong>Indian case anchors: cyberstalking and gendered online harassment<\/strong><\/p>\n<p>In \u201cState of Tamil Nadu v. 
Suhas Katti (2004)\u201d, a Metropolitan Magistrate\u2019s Court at Egmore delivered one of India\u2019s earliest convictions for online harassment and obscene postings, applying Section 67 of the IT Act alongside Indian Penal Code provisions, which established an enforcement pathway for cyberstalking and reputational harm via online fora[viii].<\/p>\n<p>The case demonstrated practical tracing, charge-sheeting, and concurrent sentencing for digital harassment, and it has been cited repeatedly as a precedent for addressing misogynistic and defamatory campaigns online through combined statutory tools.<\/p>\n<p><strong>Mapping the U.S. and India: shared aims, different instruments<\/strong><\/p>\n<p>Both legal systems criminalise unauthorised access and harmful content while building sectoral and cross\u2011cutting regimes for privacy and security, but the instruments differ in shape, with the United States relying on a patchwork of sector laws and cooperation statutes[xvi], and India combining the IT Act with rules for intermediaries and a new, horizontal data\u2011protection statute[xvii].<br \/>\nInstitutionally, the United States leans on sector regulators and information\u2011sharing frameworks, while India vests incident\u2011coordination and advisories in CERT\u2011In with time\u2011bound reporting and logging directives, showing distinct but convergent approaches to operational governance and collective defence.<\/p>\n<p><strong>Operationalising values: controls, playbooks, and evidence<\/strong><\/p>\n<p>Programmes translate values into secure\u2011by\u2011design architecture[xviii], privacy\u2011by\u2011default[xix] stewardship, role\u2011based access and multi\u2011factor authentication, encryption and key management, vulnerability disclosure and breach\u2011notification playbooks, third\u2011party risk contracts, and routine exercises so that protection does not depend on heroics.<\/p>\n<p><strong>Educational scaffolding: sources of 
cyberethics<\/strong><\/p>\n<p>Teaching and enforcement both benefit from distinguishing seven sources of cyberethics\u2014law, philosophy, societal norms, environmental ethics, political ethics, economic ethics, and religious ethics\u2014because many disagreements are really cross\u2011source conflicts rather than factual disputes.<\/p>\n<p>Explicitly tagging a rule or control to its source clarifies why it matters, which exceptions are legitimate, and how it should evolve as technologies and social expectations shift. (see Figure 1.3 above)<br \/>\nThe final part turns to enforceable duties, case anchors, and Indian equivalents that translate \u201cshould\u201d and \u201cusually\u201d into \u201cmust\u201d and remedy.<\/p>\n<p><strong>Part III \u2014 Law, Cases, and India<\/strong><\/p>\n<p><strong>Legal timelines and institutional context<\/strong><\/p>\n<p>A historical view contextualises duties: anti\u2011hacking and communications\u2011privacy laws in the United States matured alongside sectoral privacy regimes, corporate\u2011records governance, federal security standards, and information\u2011sharing statutes, which together signal that accountability has expanded from access control to ecosystem collaboration over time.<br \/>\nIndia\u2019s trajectory shows the consolidation of cybercrime and electronic records under the IT Act, judicial calibration of speech and intermediary liability, institutionalisation of incident response via CERT\u2011In, and a new comprehensive privacy baseline through the DPDP Act, accompanied by evolving intermediary due\u2011diligence obligations.<\/p>\n<p><strong>Indian doctrinal guardrails that shape ethics in practice<\/strong><\/p>\n<p>Shreya Singhal[v] clarifies that vague speech offences cannot be a catch\u2011all for cyber conduct, steering enforcement towards well\u2011defined offences and due\u2011process\u2011compliant blocking and takedown measures with safeguards and record\u2011keeping for accountability.<br 
\/>\nPuttaswamy[vi] constitutionalises privacy as a fundamental right, implying that surveillance, data retention, and intrusive analytics must pass tests of legality, necessity, and proportionality, which should be reflected in design reviews, data minimisation policies, and audit trails within organisations.<\/p>\n<p><strong>Institutional pathways and compliance expectations in India<\/strong><\/p>\n<p>CERT\u2011In\u2019s 2022 directions[xx] on six\u2011hour incident reporting, 180\u2011day log retention, and clock synchronisation push organisations to maintain evidence readiness and timing discipline, which are operational expressions of ethical transparency and shared defence.<br \/>\nIntermediary Rules 2021 (as amended)[xv] push content moderation, grievance handling, and due\u2011diligence mechanisms upstream, with consequences for safe\u2011harbour protection if obligations are not met, aligning platform operations with public\u2011law expectations about safety and integrity online.<\/p>\n<p><strong>Sector snapshots to ground abstract values<\/strong><\/p>\n<p>In healthcare, patient safety stakes elevate the duty of necessity and proportionality in access and monitoring, favouring pragmatic controls such as least-privilege access, multi-factor authentication, encryption, network segmentation, device management, continuous auditing, and rehearsed breach notification that are auditable and explainable to patients and regulators.<br \/>\nIn finance, institutional trust relies on continuous monitoring, disciplined vulnerability management, phishing resilience, third\u2011party controls, and client education under a visible \u201ctone from the top,\u201d which together translate abstract duties into high\u2011signal practices that customers can recognise and reward.<\/p>\n<p><strong>Conclusion: alignment that earns trust<\/strong><\/p>\n<p>The argument of this playbook is simple to state and demanding to live: responsible digital governance is not a choice between ethics, policy, 
and law, but a disciplined alignment of all three, so that what should be done, what must be done, and what is usually done point in the same direction. Achieving that alignment begins with design. Secure\u2011by\u2011design and privacy\u2011by\u2011default are not slogans; they are specific architectural commitments\u2014least privilege as the default, encryption in transit and at rest, key lifecycle hygiene, comprehensive logging with integrity, patch pipelines that privilege safety over novelty, and segregation controls that fail safely\u2014expressed in code, configuration, and change control. The ethical burden of those choices is carried by everyday artefacts: access reviews, threat models, data\u2011flow diagrams, DPIAs[xxi], SBOMs[xxii], runbooks[xxiii], and post\u2011incident reviews that are honest enough to change behaviour next time.<\/p>\n<p>The legal turn is equally practical. Duties are real only when mapped to controls and evidence that stand up to scrutiny. That means knowing which offences and procedures apply, what sectoral regimes demand, where institutional roles begin and end, and how to preserve evidence and privilege while being transparent enough to maintain trust with regulators, partners, and the public. In the United States, the path from anti\u2011hacking and communications privacy to sector privacy, corporate governance, federal standards, and information sharing shows a system learning to coordinate across silos at speed. In India, the consolidation under the IT Act, the free\u2011speech recalibration of Shreya Singhal, the privacy baseline of Puttaswamy and the DPDP Act, CERT\u2011In\u2019s operational role, and intermediary due\u2011diligence expectations show a different but convergent trajectory: from isolated offences to ecosystem governance.<\/p>\n<p>Culture binds the system. Transparency, proportionality, and reversibility are not abstract virtues; they are operational stances. 
Transparency means explaining decisions at a level that affected groups can understand, while protecting investigations and due process. Proportionality means matching controls and data use to risk and necessity, not to appetite or fashion. Reversibility means preferring decisions that can be undone when uncertainty is high, especially where harm would be intimate or irreversible. These stances turn values into speed, because teams that know how they decide, and why, decide faster and make fewer unforced errors.<\/p>\n<p>The case anchors underline the stakes. The Morris Worm reminds engineers and leaders that benign intent does not neutralise foreseeable harm at scale; code that self\u2011propagates is not a prank but a system\u2011level risk multiplier. The Lori Drew case warns that deception which foreseeably inflicts severe emotional harm cannot be excused as mere \u201crule breaking\u201d; platform terms cannot do the work of calibrated criminal law, and safety by design for minors is a duty, not a public\u2011relations initiative.<\/p>\n<p>___________<\/p>\n<p style=\"text-align: left\">i Internet Architecture Board. (1989). Ethics and the Internet (RFC 1087). https:\/\/datatracker.ietf.org\/doc\/html\/rfc1087<br \/>\nii Barquin, R. C. (1992). In Pursuit of a \u2018Ten Commandments\u2019 for Computer Ethics. Computer Ethics Institute. 
http:\/\/computerethicsinstitute.org\/publications\/tencommandments.html<br \/>\niii https:\/\/www.fbi.gov\/history\/famous-cases\/morris-worm<br \/>\niv https:\/\/www.justice.gov\/sites\/default\/files\/oip\/legacy\/2014\/07\/23\/11-17-2009-us-v-lori-drew.pdf<br \/>\nv https:\/\/www.thehindu.com\/opinion\/lead\/The-judgment-that-silenced-Section-66A\/article59870557.ece<br \/>\nvi https:\/\/www.scobserver.in\/wp-content\/uploads\/2021\/10\/1-266Right_to_Privacy__Puttaswamy_Judgment-Chandrachud.pdf<br \/>\nvii https:\/\/www.meity.gov.in\/static\/uploads\/2024\/06\/2bf1f0e9f04e6fb4f8fef35e82c42aa5.pdf<br \/>\nviii https:\/\/papers.ssrn.com\/sol3\/papers.cfm?abstract_id=3776961<br \/>\nix https:\/\/www.thehindu.com\/news\/national\/sulli-deals-bulli-bai-and-the-young-and-educated-hatemongers\/article38305009.ece<br \/>\nx Internet Architecture Board. (1989). Ethics and the Internet (RFC 1087). https:\/\/datatracker.ietf.org\/doc\/html\/rfc1087<br \/>\nxi Barquin, R. C. (1992). In Pursuit of a \u2018Ten Commandments\u2019 for Computer Ethics. Computer Ethics Institute. 
http:\/\/computerethicsinstitute.org\/publications\/tencommandments.html<br \/>\nxii https:\/\/www.fbi.gov\/history\/famous-cases\/morris-worm<br \/>\nxiii https:\/\/www.justice.gov\/sites\/default\/files\/oip\/legacy\/2014\/07\/23\/11-17-2009-us-v-lori-drew.pdf<br \/>\nxiv https:\/\/www.indiacode.nic.in\/bitstream\/123456789\/13116\/1\/it_act_2000_updated.pdf<br \/>\nxv https:\/\/www.meity.gov.in\/static\/uploads\/2024\/02\/Information-Technology-Intermediary-Guidelines-and-Digital-Media-Ethics-Code-Rules-2021-updated-06.04.2023-.pdf<br \/>\nxvi https:\/\/ogletree.com\/insights-resources\/blog-posts\/u-s-continues-patchwork-of-comprehensive-data-privacy-requirements-new-laws-set-to-take-effect-over-next-2-years\/<br \/>\nxvii https:\/\/nishithdesai.com\/fileadmin\/user_upload\/pdfs\/Research_Papers\/Privacy_&amp;_Data_in_India.pdf<br \/>\nxviii https:\/\/www.cisa.gov\/securebydesign<br \/>\nxix https:\/\/commission.europa.eu\/law\/law-topic\/data-protection\/rules-business-and-organisations\/obligations\/what-does-data-protection-design-and-default-mean_en<br \/>\nxx https:\/\/www.cert-in.org.in\/PDF\/CERT-In_Directions_70B_28.04.2022.pdf<br \/>\nxxi http:\/\/www.dataprotection.ie\/en\/organisations\/know-your-obligations\/data-protection-impact-assessments<br \/>\nxxii https:\/\/www.cisa.gov\/sbom<br \/>\nxxiii 
https:\/\/wa.aws.amazon.com\/wellarchitected\/2020-07-02T19-33-23\/wat.concept.runbook.en.html<\/p>","protected":false},"author":2,"featured_media":9662,"comment_status":"open","ping_status":"closed","template":"","blog_category":[351],"class_list":["post-9661","blog","type-blog","status-publish","has-post-thumbnail","hentry","blog_category-digital-infrastructure-cyber-security"],"acf":[],"_links":{"self":[{"href":"https:\/\/negd.gov.in\/hi\/wp-json\/wp\/v2\/blog\/9661","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/negd.gov.in\/hi\/wp-json\/wp\/v2\/blog"}],"about":[{"href":"https:\/\/negd.gov.in\/hi\/wp-json\/wp\/v2\/types\/blog"}],"author":[{"embeddable":true,"href":"https:\/\/negd.gov.in\/hi\/wp-json\/wp\/v2\/users\/2"}],"replies":[{"embeddable":true,"href":"https:\/\/negd.gov.in\/hi\/wp-json\/wp\/v2\/comments?post=9661"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/negd.gov.in\/hi\/wp-json\/wp\/v2\/media\/9662"}],"wp:attachment":[{"href":"https:\/\/negd.gov.in\/hi\/wp-json\/wp\/v2\/media?parent=9661"}],"wp:term":[{"taxonomy":"blog_category","embeddable":true,"href":"https:\/\/negd.gov.in\/hi\/wp-json\/wp\/v2\/blog_category?post=9661"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}