{"id":8305,"date":"2026-03-12T17:27:04","date_gmt":"2026-03-12T17:27:04","guid":{"rendered":"https:\/\/eprodat.wpenginepowered.com\/ai-denmark-draws-a-red-line-ai-will-not-be-allowed-to-interpret-emotions-in-the-workplace-or-the-classroom\/"},"modified":"2026-03-12T17:32:21","modified_gmt":"2026-03-12T17:32:21","slug":"ai-denmark-draws-a-red-line-ai-will-not-be-allowed-to-interpret-emotions-in-the-workplace-or-the-classroom","status":"publish","type":"post","link":"https:\/\/eprodat.com\/en\/ai-denmark-draws-a-red-line-ai-will-not-be-allowed-to-interpret-emotions-in-the-workplace-or-the-classroom\/","title":{"rendered":"(AI) Denmark Draws a Red Line: AI Will Not Be Allowed to Interpret Emotions in the Workplace or the Classroom"},"content":{"rendered":"\t\t<div data-elementor-type=\"wp-post\" data-elementor-id=\"8305\" class=\"elementor elementor-8305 elementor-8304\">\n\t\t\t\t<div class=\"elementor-element elementor-element-a1ce55e e-flex e-con-boxed e-con e-parent\" data-id=\"a1ce55e\" data-element_type=\"container\" data-e-type=\"container\">\n\t\t\t\t\t<div class=\"e-con-inner\">\n\t\t\t\t<div class=\"elementor-element elementor-element-765c0a0 elementor-widget elementor-widget-heading\" data-id=\"765c0a0\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"heading.default\">\n\t\t\t\t\t<h1 class=\"elementor-heading-title elementor-size-default\">(AI) Denmark Draws a Red Line: AI Will Not Be Allowed to Interpret Emotions in the Workplace or the Classroom <\/h1>\t\t\t\t<\/div>\n\t\t\r\n\t\t<div class=\"elementor-element elementor-element-8555408 elementor-widget elementor-widget-text-editor\" data-id=\"8555408\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t\t\t\t\t\t<p>As the new European framework for artificial intelligence is being rolled out, one issue is capturing the attention of legal professionals, compliance officers, and technical teams alike: what happens to systems that 
\u201cread\u201d emotions from faces, voices, or gestures? The Guidance on the prohibition of AI that infers emotions in workplaces and educational institutions, published by the Danish Agency for Digitalisation, offers one of the first interpretations of Article 5(1)(f) of Regulation (EU) 2024\/1689. Its message is clear: it is prohibited to place on the market, put into service, or use AI systems intended to infer the emotions of natural persons in workplace or educational settings, except for medical or safety reasons. This is not merely a declaratory statement; it draws a legal boundary that will shape purchasing decisions, product design, and governance processes across public and private organisations throughout Europe. <br>The guidance focuses first and foremost on why a red line has been drawn here. Human emotions are contextual, cultural, and individual realities that cannot be reliably \u201cobjectified\u201d through mathematical rules. A smile does not always mean happiness; a raised voice does not prove anger. For this reason, even when a system appears to be correct, its reliability is low, and its use can lead to bias and unfavourable treatment. When such use occurs within asymmetric power relationships (employer\u2013employee, teacher\u2013student), the risks to fundamental rights are multiplied. This underpins the specific prohibition in these contexts.
<\/p>\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-ec5d170 elementor-widget elementor-widget-text-editor\" data-id=\"ec5d170\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t\t\t\t\t\t<p><span data-contrast=\"auto\">From a practical standpoint, the guidance structures the analysis around three cumulative conditions. First, there must be the placing on the market, putting into service, or use of an AI system with the specific purpose of inferring emotions. Second, the system must analyse biometric data (facial expressions, body language, voice tone or cadence) and, based on that observation, infer an identifiable emotional state (happiness, sadness, boredom, stress, enthusiasm, etc.). Third, the use must take place in a workplace or educational environment. If all three conditions are met, the practice is prohibited throughout the EU. If any one of them is missing, the case falls outside the scope of the prohibition, although it may still be unlawful for other reasons.<\/span><\/p><p><span data-contrast=\"auto\">The first condition concerns the material scope of the rule. For a practice to be prohibited, the system must have been placed on the market, put into service, or used within the European Union with the specific purpose of inferring emotions. This means that the mere existence of a technology capable of doing so is not sufficient: it must have been deployed or made available on the market for that specific purpose. In this way, the prohibition applies both to those who design and sell such systems and to the organisations that implement them.
<\/span><\/p>\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-db85516 elementor-widget elementor-widget-text-editor\" data-id=\"db85516\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t\t\t\t\t\t<p><span data-contrast=\"auto\">The second condition is the technical hinge of the analysis and deserves closer attention. It is not enough for a system to capture traits; it must explicitly infer an emotion. The guidance stresses that the concept of \u201cinferring\u201d requires interpretation: transforming physical or behavioural traits into an emotional conclusion such as \u201cdisinterest,\u201d \u201cfrustration,\u201d or \u201csatisfaction.\u201d It also clarifies that not all behavioural analysis falls within this scope: the mere \u201cdetection\u201d of gestures or the counting of smiles, without emotional inference, does not by itself trigger the prohibition. What is prohibited is attributing an emotional state to a person based on biometric data. The third condition defines the contexts. \u201cWorkplace\u201d is interpreted broadly: offices, factories, warehouses, virtual environments such as Teams or Zoom, remote work from home, public spaces where an employment relationship exists, and recruitment and selection situations (interviews, tests). \u201cEducational institution\u201d is also understood expansively: primary and secondary education, universities, vocational training, adult education, and e-learning platforms when their use is mandatory. The focus is on individuals who are in a relationship of subordination or dependence vis-\u00e0-vis the organisation.<\/span><\/p><p><span data-contrast=\"auto\">To ground the analysis, the guidance provides two illustrative scenarios that are readily recognisable today. In the first, a company installs AI-powered cameras in meeting rooms to analyse employees\u2019 voices and faces in order to measure \u201centhusiasm\u201d during presentations; the authority concludes that this case is prohibited: there is an AI system, emotional inference based on biometrics, and use in the workplace, with no medical or safety justification. In the second, schools use technology that monitors students\u2019 faces in the classroom to inform teachers in real time whether a student is bored, tired, or frustrated; this is also prohibited, as all three conditions are met, with an additional imbalance given that minors are involved.<\/span><\/p><p><span data-contrast=\"auto\">The guidance does not ignore the existence of exceptions. The AI Act allows the inference of emotions only where the purpose is medical or related to safety, only where that objective is clearly justified and documented, and only where no equally effective, less intrusive alternatives exist. Examples mentioned include therapeutic uses (support for people with autism) or accessibility uses (assistance for blind or deaf individuals). By contrast, providing management with a \u201cthermometer\u201d of employee satisfaction or measuring students\u2019 \u201cattention\u201d is neither a safety nor a medical purpose and does not fall within the exception.<\/span><\/p>\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-8a3ac13 elementor-widget elementor-widget-heading\" data-id=\"8a3ac13\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"heading.default\">\n\t\t\t\t\t<h2 class=\"elementor-heading-title elementor-size-default\">How should organisations respond?<\/h2>\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-39c1d9e elementor-widget elementor-widget-text-editor\" data-id=\"39c1d9e\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t\t\t\t\t\t<p><span data-contrast=\"auto\">The first step is to review any AI initiative that touches on voice, image, or gestures in HR, recruitment, and performance evaluation; monitoring of customer service teams; learning analytics; or classroom surveillance. If the real purpose is to infer the emotional states of staff or students, the legal conclusion is straightforward: stop, do not acquire, or do not deploy the system. The prohibition applies from the outset: the practice cannot be made lawful as long as the intention to analyse emotions in these contexts remains. <br>Second, organisations should rethink their requirements for manufacturers and suppliers. The guidance emphasises that a system may fall outside the prohibition if it does not infer emotions but instead limits itself, for example, to measuring objective indicators of interaction in meetings (speaking time, turn-taking, interruptions) or non-emotional pedagogical metrics in the classroom (submissions, observable participation without emotional labelling). This requires product redesign and contractual and technical guarantees that no covert emotional inference takes place.
This nuance matters: even if the emotional classification is never displayed to the user, the fact that it is computed at all means that its use in work or education remains prohibited. <br>Third, it is essential to distinguish emotions from physical states. A system that detects drowsiness in professional drivers based on blinking and vehicle behaviour may qualify for the safety exception, provided it can be justified as protecting life and health and that no less intrusive alternative exists. By contrast, \u201cdetecting stress\u201d in a call-centre operator through voice analysis in order to adjust their script is neither a safety measure nor a medical intervention; it is a prohibited use. Purpose and proportionality are not presumed; they must be demonstrated. <br>Fourth, at the governance level, the guidance aligns with the principle of proactive accountability: it is not enough to remove a reference to \u201cmood\u201d; organisations must audit functionalities and document that the system does not perform emotional inference in these contexts. In procurement, contracts should include clauses that explicitly prohibit the present or future activation of emotion-recognition modules in workplace or educational settings and that provide for technical audits. Internal compliance programmes should update AI policies, catalogues of prohibited practices, and mechanisms for rejecting use cases that may be attractive from a business perspective but are legally untenable. <br>Fifth, it is important to remember that the AI Act coexists with the GDPR and with labour and education law. A case that does not fall within the prohibition of Article 5(1)(f) may still be unlawful for other reasons: processing biometric data without a legal basis, lack of transparency or data minimisation, or disproportionate impacts on equality or non-discrimination. The guidance makes this clear: its focus is solely on the category of prohibited emotional inference in work and education; it does not prejudge compliance with other applicable laws.
<\/span><\/p><p><span data-contrast=\"auto\">In conclusion, the Danish guidance on the prohibition of AI systems that infer emotions may serve as an early reference point for the application of the European AI Act. Its interpretation clarifies the boundaries of a practice that, given its potential impact on privacy and dignity, is prohibited in workplace and educational environments except for medical or safety reasons.<\/span><\/p><p><span data-contrast=\"auto\">The document reinforces the idea that organisations must review any technology that uses biometric data to infer emotions, ensuring that its uses strictly comply with the law and do not create power imbalances or intrusive forms of processing. In doing so, the guidance helps to consolidate a framework of trust in the development and deployment of artificial intelligence within the European Union.<\/span><\/p>\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t","protected":false},"excerpt":{"rendered":"<p>Denmark\u2019s Agency for Digitalisation has published one of the first interpretations of Article 5(1)(f) of the AI Act: AI systems that infer emotions are prohibited in workplaces and educational institutions, except for medical or safety reasons.
<\/p>\n","protected":false},"author":2,"featured_media":8298,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[44],"tags":[],"class_list":["post-8305","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-construction"],"featured_image_src":"https:\/\/eprodat.com\/wp-content\/uploads\/2026\/02\/eprodat-blog-600x400.jpg","featured_image_src_square":"https:\/\/eprodat.com\/wp-content\/uploads\/2026\/02\/eprodat-blog-600x600.jpg","author_info":{"display_name":"Jaime S\u00e1nchez","author_link":"https:\/\/eprodat.com\/en\/author\/jaimepsfgmail-com\/"},"_links":{"self":[{"href":"https:\/\/eprodat.com\/en\/wp-json\/wp\/v2\/posts\/8305","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/eprodat.com\/en\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/eprodat.com\/en\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/eprodat.com\/en\/wp-json\/wp\/v2\/users\/2"}],"replies":[{"embeddable":true,"href":"https:\/\/eprodat.com\/en\/wp-json\/wp\/v2\/comments?post=8305"}],"version-history":[{"count":0,"href":"https:\/\/eprodat.com\/en\/wp-json\/wp\/v2\/posts\/8305\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/eprodat.com\/en\/wp-json\/wp\/v2\/media\/8298"}],"wp:attachment":[{"href":"https:\/\/eprodat.com\/en\/wp-json\/wp\/v2\/media?parent=8305"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/eprodat.com\/en\/wp-json\/wp\/v2\/categories?post=8305"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/eprodat.com\/en\/wp-json\/wp\/v2\/tags?post=8305"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}