{"id":831,"date":"2023-08-29T11:17:55","date_gmt":"2023-08-29T09:17:55","guid":{"rendered":"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/?p=831"},"modified":"2026-02-03T12:12:32","modified_gmt":"2026-02-03T11:12:32","slug":"affective-computing-101","status":"publish","type":"post","link":"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/affective-computing-101\/","title":{"rendered":"Affective Computing 101"},"content":{"rendered":"\n<p>When the human brain is processing information, it takes millions of neurons and an intricate cognitive network to decode whatever it is that we are simultaneously hearing, seeing, feeling. In every day social situations, we try (and sometimes fail) to interpret the behavior and affective reactions of others by decoding verbal and non-verbal signals. When it comes to affective reactions, non-verbal signals can even say more than words, as they transmit both conscious and unconscious emotions. Just think of somebody blushing or breaking into a sweat during a flight-or-fight situation. Recognizing and interpreting affective reactions, therefore, is a complex, multimodal perception process. <\/p>\n\n\n\n<p>Since the late 1990s, the field of affective computing has been aiming to make human emotions \u2013 in all of their complexity \u2013 recognizable and measurable for algorithms and intelligent systems. Consequently, affective computing is an interdisciplinary field, operating at the interface of computer science (AI and machine learning, among others), cognitive science, and psychology. But why do we (us humans) need machines to analyze our emotions in the first place? By perceiving, processing, and \u2013 in the next step \u2013 by simulating human emotions, affective computing applications can help individuals who face challenges when communicating or interpreting affective reactions, as we will later see. 
Furthermore, they aim to give machines emotional intelligence in order to improve the human-machine interface \u2013 which, by the way, is not as futuristic as it sounds: it\u2019s an everyday experience, for instance when driving a vehicle.<\/p>\n\n\n\n<h3 class=\"wp-block-heading\">From emotion to data<\/h3>\n\n\n\n<p>At Fraunhofer IIS, we understand and apply affective computing, also known as Emotion AI, by combining technical expertise with know-how in psychology and physiology (read more in our <a href=\"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/emotion-analysis-101\/\">Emotion Analysis 101<\/a>). In our experience, the analysis of both conscious and unconscious affective reactions can be an asset in subject studies and other use cases \u2013 however challenging the latter may be to detect. For it is not just our conscious statements, but the whole array of unconscious, psycho-physical reactions that decides how we perceive our environment and how we react and interact: with consumer goods, in social situations, or while driving and in different traffic situations, for example.<\/p>\n\n\n\n<p>Data acquisition and data analysis for Emotion AI, therefore, are based on multimodal study designs, using speech signals, facial expressions, and bodily reactions (measured by sensor solutions and\/or image analysis). There are two approaches to acquiring data, both involving experiments with human test subjects: either by establishing how the subjects feel and labeling the data accordingly, or by putting them in specific situations that trigger various emotional reactions. An example of the latter method is to trigger different emotional states in an <a href=\"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/acquiring-multimodal-data-in-the-emotionai-box\/\">exposure cabin<\/a> or in a driving simulator. 
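As an illustration of the first approach, attaching labels to recorded signal windows might look like the following minimal sketch. All names, the data layout, and the self-report mechanism are hypothetical assumptions for illustration, not the actual Fraunhofer IIS pipeline:

```python
# Hypothetical sketch: labeling multimodal signal windows with the
# subjects' reported emotional state. Field names and the windowed
# layout are illustrative assumptions only.
from dataclasses import dataclass

@dataclass
class LabeledWindow:
    subject_id: str
    t_start_s: float          # window start, seconds into the session
    t_end_s: float            # window end
    heart_rate_bpm: list      # per-second heart-rate samples
    breathing_rate_bpm: list  # per-second breathing-rate samples
    self_report: str          # label from the subject, e.g. "calm", "stressed"

def label_session(windows, reports):
    """Attach the most recent self-report to each signal window.

    windows: list of dicts with the LabeledWindow fields except self_report
    reports: list of (time_s, label) pairs in chronological order
    """
    labeled = []
    for w in windows:
        # pick the last report given at or before the window start
        applicable = [label for t, label in reports if t <= w["t_start_s"]]
        if not applicable:
            continue  # no label known yet, so skip the window
        labeled.append(LabeledWindow(self_report=applicable[-1], **w))
    return labeled
```

The labeled windows could then serve directly as supervised training examples for a classifier of emotional states.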
Just like an exposure cabin, our driving simulator is fully equipped with cameras, lighting, and systems for multimodal biosignal acquisition (e.g., heart and breathing rate), which is useful, for example, when <a href=\"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/cognitive-load-in-autonomous-driving\/\">assessing cognitive overload while driving<\/a>. During post-processing, the multimodal data are fused and analyzed: algorithms intelligently evaluate the data, meaning they select, weight, and combine the various signals for each test subject. Compared to including discrete parameters only, the multimodal approach makes the results more robust and gives a more holistic and accurate picture of the subject\u2019s affective state. Finally, the algorithm trained to classify emotional states is implemented in product-ready software.<\/p>\n\n\n\n<h3 class=\"wp-block-heading\">Affective computing: A tool of great power \u2013 and great responsibility<\/h3>\n\n\n\n<p>Another field where intelligent data analysis and Emotion AI are widely applied is the healthcare sector. One example of an application in this field is <a href=\"https:\/\/www.iis.fraunhofer.de\/en\/ff\/sse\/machine-learning\/affective-sensing\/emotionssensitive-robotik.html\">ERIK<\/a>, a joint research project aimed at helping autistic children recognize and interpret emotions, and respond accordingly. The project involves a humanoid robot that interacts with the children in real time while performing sensor-based and software-supported analysis of facial expressions. In another research project, the monitoring system <a href=\"https:\/\/www.iis.fraunhofer.de\/en\/ff\/sse\/affective-computing\/facial-analysis-solutions\/shore-medicine\/ils-painfacereader.html\">PainFaceReader<\/a> automatically detects pain in patients who are unable to communicate, for example due to dementia, during acute post-operative care, or in palliative care. 
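The fusion step described above \u2013 selecting, weighting, and combining signals from several modalities \u2013 can be sketched as a simple weighted late fusion of per-modality class probabilities. The weights and modality names are assumptions for illustration, not the IIS implementation:

```python
# Minimal late-fusion sketch (an illustrative assumption, not the actual
# IIS algorithm): each modality yields class probabilities, which are
# combined with per-subject weights, e.g. reflecting signal quality.

def fuse(predictions, weights):
    """Weighted average of per-modality class-probability dicts.

    predictions: e.g. {"face": {"calm": 0.7, "stressed": 0.3}, ...}
    weights:     e.g. {"face": 0.5, "voice": 0.3, "heart": 0.2}
    """
    total = sum(weights[m] for m in predictions)
    fused = {}
    for modality, probs in predictions.items():
        w = weights[modality] / total  # normalize so the used weights sum to 1
        for label, p in probs.items():
            fused[label] = fused.get(label, 0.0) + w * p
    return fused

def classify(predictions, weights):
    """Return the emotional state with the highest fused probability."""
    fused = fuse(predictions, weights)
    return max(fused, key=fused.get)
```

Because the weights are normalized over whichever modalities are present, the same sketch also covers subjects for whom one signal (say, the speech channel) is missing or unreliable.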
These patients may not be able to communicate their pain episodes, which can lead to an undertreatment of pain symptoms.<\/p>\n\n\n\n<p>Both affective computing applications improve the understanding of and response to emotions in cases where communication barriers exist. They are designed for vulnerable people, which is why the potential misuse of these technologies needs to be part of the public discourse \u2013 just like their potential usefulness. Emotion AI is by design based on sensitive, private, and very personal data. In the context of healthcare applications, patients\u2019 privacy rights are also part of the equation. A high level of data security, therefore, must be a prerequisite for affective computing technologies and must be monitored continuously. <\/p>\n\n\n\n<h3 class=\"wp-block-heading\">From black box to white box<\/h3>\n\n\n\n<p>Another core issue for Emotion AI, as with all machine learning methods, is the quality of the data (not just its quantity, to which machine learning experts will always respond with \u201cmore, more, more\u201d). With skewed data, AI algorithms can reproduce biases, meaning they exhibit higher failure rates and inaccuracies for certain groups, which can lead to unfairness in real-world applications. In facial expression recognition, for example, the bias can relate to age, with the algorithm performing best for young test subjects and less accurately for older ones (cf. <a href=\"https:\/\/dl.acm.org\/doi\/fullHtml\/10.1145\/3531146.3533159\">Pahl et al. 2022<\/a>). Especially when applied in the context of healthcare solutions and patient care, age as a source of bias (just like gender and ethnicity) has to be identified and remedied as early as possible (see <a href=\"https:\/\/www.iis.fraunhofer.de\/content\/dam\/iis\/en\/doc\/il\/bmt\/fisba50304.pdf\">Deuschel et al. 2021<\/a> for a more detailed analysis). 
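A basic per-group error analysis of the kind described above can be sketched as follows; the group names and record layout are hypothetical, and a real fairness audit would of course go well beyond a single accuracy gap:

```python
# Illustrative sketch of a per-group fairness check: compare a
# classifier's accuracy across demographic groups (here, age groups)
# to surface biases of the kind reported by Pahl et al. 2022.
from collections import defaultdict

def accuracy_by_group(records):
    """records: iterable of (group, true_label, predicted_label) tuples."""
    correct = defaultdict(int)
    total = defaultdict(int)
    for group, truth, pred in records:
        total[group] += 1
        correct[group] += int(truth == pred)
    return {g: correct[g] / total[g] for g in total}

def max_accuracy_gap(records):
    """Largest accuracy difference between any two groups."""
    acc = accuracy_by_group(records)
    return max(acc.values()) - min(acc.values())
```

The same comparison can be repeated for any protected attribute (gender, ethnicity) and for error types beyond plain accuracy.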
Algorithms need to be evaluated for fairness, considering that they and their training data are the basis of all real-world applications and of potentially high-impact decisions.<\/p>\n\n\n\n<p>In light of the digitization of everyday life, and with respect to the acceptance of machine learning methods, the inner workings of these technologies \u2013 how they generate their knowledge, how they draw their conclusions, and what their limitations are \u2013 need to be part not only of the professional discourse, but also of general knowledge. Or, to use the metaphor of the \u201c<a href=\"https:\/\/www.iis.fraunhofer.de\/en\/ff\/sse\/affective-computing\/cai.html\">Comprehensible Artificial Intelligence<\/a>\u201d project group: AI systems should evolve from being a black box to being a white box \u2013 not just for experts, but with regard to public acceptance, strategic decisions, and their application in industry (learn more on the issue of <a href=\"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/trustworthy-ai\/\">Trustworthy AI<\/a>).<\/p>\n\n\n\n<p>Image copyright: Fraunhofer IIS \/ Bianca M\u00f6ller<\/p>\n","protected":false},"excerpt":{"rendered":"<p>When the human brain processes information, it takes millions of neurons and an intricate cognitive network to decode whatever we are simultaneously hearing, seeing, and feeling. In everyday social situations, we try (and sometimes fail) to interpret the behavior and affective reactions of others by decoding verbal and non-verbal signals. 
When [&hellip;]<\/p>\n","protected":false},"author":9,"featured_media":832,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[41],"tags":[37,62],"coauthors":[51],"class_list":["post-831","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-affective-computing","tag-ai","tag-science-101"],"yoast_head":"<!-- This site is optimized with the Yoast SEO plugin v27.6 - https:\/\/yoast.com\/product\/yoast-seo-wordpress\/ -->\n<title>Affective Computing 101 - SMART SENSING insights<\/title>\n<meta name=\"description\" content=\"Have AI understand your emotions? Explore our perspectives on the potentials of affective computing and discover real-world use cases.\" \/>\n<meta name=\"robots\" content=\"index, follow, max-snippet:-1, max-image-preview:large, max-video-preview:-1\" \/>\n<link rel=\"canonical\" href=\"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/affective-computing-101\/\" \/>\n<meta property=\"og:locale\" content=\"en_US\" \/>\n<meta property=\"og:type\" content=\"article\" \/>\n<meta property=\"og:title\" content=\"Affective Computing 101 - SMART SENSING insights\" \/>\n<meta property=\"og:description\" content=\"Have AI understand your emotions? 
Explore our perspectives on the potentials of affective computing and discover real-world use cases.\" \/>\n<meta property=\"og:url\" content=\"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/affective-computing-101\/\" \/>\n<meta property=\"og:site_name\" content=\"SMART SENSING insights\" \/>\n<meta property=\"article:publisher\" content=\"https:\/\/www.facebook.com\/FraunhoferIIS\" \/>\n<meta property=\"article:published_time\" content=\"2023-08-29T09:17:55+00:00\" \/>\n<meta property=\"article:modified_time\" content=\"2026-02-03T11:12:32+00:00\" \/>\n<meta property=\"og:image\" content=\"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2023\/08\/HSA_multimodale_Zustandserkennung-scaled.jpg\" \/>\n\t<meta property=\"og:image:width\" content=\"2560\" \/>\n\t<meta property=\"og:image:height\" content=\"1413\" \/>\n\t<meta property=\"og:image:type\" content=\"image\/jpeg\" \/>\n<meta name=\"author\" content=\"Grit Nickel\" \/>\n<meta name=\"twitter:card\" content=\"summary_large_image\" \/>\n<meta name=\"twitter:label1\" content=\"Written by\" \/>\n\t<meta name=\"twitter:data1\" content=\"Grit Nickel\" \/>\n\t<meta name=\"twitter:label2\" content=\"Est. 
reading time\" \/>\n\t<meta name=\"twitter:data2\" content=\"5 minutes\" \/>\n<script type=\"application\/ld+json\" class=\"yoast-schema-graph\">{\"@context\":\"https:\\\/\\\/schema.org\",\"@graph\":[{\"@type\":\"Article\",\"@id\":\"https:\\\/\\\/websites.fraunhofer.de\\\/smart-sensing-insights\\\/affective-computing-101\\\/#article\",\"isPartOf\":{\"@id\":\"https:\\\/\\\/websites.fraunhofer.de\\\/smart-sensing-insights\\\/affective-computing-101\\\/\"},\"author\":{\"name\":\"Grit Nickel\",\"@id\":\"https:\\\/\\\/websites.fraunhofer.de\\\/smart-sensing-insights\\\/#\\\/schema\\\/person\\\/fc55925f8da111629c277bcedf848c5e\"},\"headline\":\"Affective Computing 101\",\"datePublished\":\"2023-08-29T09:17:55+00:00\",\"dateModified\":\"2026-02-03T11:12:32+00:00\",\"mainEntityOfPage\":{\"@id\":\"https:\\\/\\\/websites.fraunhofer.de\\\/smart-sensing-insights\\\/affective-computing-101\\\/\"},\"wordCount\":1045,\"commentCount\":0,\"publisher\":{\"@id\":\"https:\\\/\\\/websites.fraunhofer.de\\\/smart-sensing-insights\\\/#organization\"},\"image\":{\"@id\":\"https:\\\/\\\/websites.fraunhofer.de\\\/smart-sensing-insights\\\/affective-computing-101\\\/#primaryimage\"},\"thumbnailUrl\":\"https:\\\/\\\/websites.fraunhofer.de\\\/smart-sensing-insights\\\/wp-content\\\/uploads\\\/2023\\\/08\\\/HSA_multimodale_Zustandserkennung-scaled.jpg\",\"keywords\":[\"AI\",\"Science 101\"],\"articleSection\":[\"Affective Computing\"],\"inLanguage\":\"en-US\",\"potentialAction\":[{\"@type\":\"CommentAction\",\"name\":\"Comment\",\"target\":[\"https:\\\/\\\/websites.fraunhofer.de\\\/smart-sensing-insights\\\/affective-computing-101\\\/#respond\"]}]},{\"@type\":\"WebPage\",\"@id\":\"https:\\\/\\\/websites.fraunhofer.de\\\/smart-sensing-insights\\\/affective-computing-101\\\/\",\"url\":\"https:\\\/\\\/websites.fraunhofer.de\\\/smart-sensing-insights\\\/affective-computing-101\\\/\",\"name\":\"Affective Computing 101 - SMART SENSING 
insights\",\"isPartOf\":{\"@id\":\"https:\\\/\\\/websites.fraunhofer.de\\\/smart-sensing-insights\\\/#website\"},\"primaryImageOfPage\":{\"@id\":\"https:\\\/\\\/websites.fraunhofer.de\\\/smart-sensing-insights\\\/affective-computing-101\\\/#primaryimage\"},\"image\":{\"@id\":\"https:\\\/\\\/websites.fraunhofer.de\\\/smart-sensing-insights\\\/affective-computing-101\\\/#primaryimage\"},\"thumbnailUrl\":\"https:\\\/\\\/websites.fraunhofer.de\\\/smart-sensing-insights\\\/wp-content\\\/uploads\\\/2023\\\/08\\\/HSA_multimodale_Zustandserkennung-scaled.jpg\",\"datePublished\":\"2023-08-29T09:17:55+00:00\",\"dateModified\":\"2026-02-03T11:12:32+00:00\",\"description\":\"Have AI understand your emotions? Explore our perspectives on the potentials of affective computing and discover real-world use cases.\",\"breadcrumb\":{\"@id\":\"https:\\\/\\\/websites.fraunhofer.de\\\/smart-sensing-insights\\\/affective-computing-101\\\/#breadcrumb\"},\"inLanguage\":\"en-US\",\"potentialAction\":[{\"@type\":\"ReadAction\",\"target\":[\"https:\\\/\\\/websites.fraunhofer.de\\\/smart-sensing-insights\\\/affective-computing-101\\\/\"]}]},{\"@type\":\"ImageObject\",\"inLanguage\":\"en-US\",\"@id\":\"https:\\\/\\\/websites.fraunhofer.de\\\/smart-sensing-insights\\\/affective-computing-101\\\/#primaryimage\",\"url\":\"https:\\\/\\\/websites.fraunhofer.de\\\/smart-sensing-insights\\\/wp-content\\\/uploads\\\/2023\\\/08\\\/HSA_multimodale_Zustandserkennung-scaled.jpg\",\"contentUrl\":\"https:\\\/\\\/websites.fraunhofer.de\\\/smart-sensing-insights\\\/wp-content\\\/uploads\\\/2023\\\/08\\\/HSA_multimodale_Zustandserkennung-scaled.jpg\",\"width\":2560,\"height\":1413,\"caption\":\"Emotion Detection 
Software\"},{\"@type\":\"BreadcrumbList\",\"@id\":\"https:\\\/\\\/websites.fraunhofer.de\\\/smart-sensing-insights\\\/affective-computing-101\\\/#breadcrumb\",\"itemListElement\":[{\"@type\":\"ListItem\",\"position\":1,\"name\":\"Home\",\"item\":\"https:\\\/\\\/websites.fraunhofer.de\\\/smart-sensing-insights\\\/\"},{\"@type\":\"ListItem\",\"position\":2,\"name\":\"Affective Computing 101\"}]},{\"@type\":\"WebSite\",\"@id\":\"https:\\\/\\\/websites.fraunhofer.de\\\/smart-sensing-insights\\\/#website\",\"url\":\"https:\\\/\\\/websites.fraunhofer.de\\\/smart-sensing-insights\\\/\",\"name\":\"SMART SENSING insights\",\"description\":\"learn more about our focus research areas sensor technology, electronics, and artificial intelligence\",\"publisher\":{\"@id\":\"https:\\\/\\\/websites.fraunhofer.de\\\/smart-sensing-insights\\\/#organization\"},\"potentialAction\":[{\"@type\":\"SearchAction\",\"target\":{\"@type\":\"EntryPoint\",\"urlTemplate\":\"https:\\\/\\\/websites.fraunhofer.de\\\/smart-sensing-insights\\\/?s={search_term_string}\"},\"query-input\":{\"@type\":\"PropertyValueSpecification\",\"valueRequired\":true,\"valueName\":\"search_term_string\"}}],\"inLanguage\":\"en-US\"},{\"@type\":\"Organization\",\"@id\":\"https:\\\/\\\/websites.fraunhofer.de\\\/smart-sensing-insights\\\/#organization\",\"name\":\"Fraunhofer IIS\",\"url\":\"https:\\\/\\\/websites.fraunhofer.de\\\/smart-sensing-insights\\\/\",\"logo\":{\"@type\":\"ImageObject\",\"inLanguage\":\"en-US\",\"@id\":\"https:\\\/\\\/websites.fraunhofer.de\\\/smart-sensing-insights\\\/#\\\/schema\\\/logo\\\/image\\\/\",\"url\":\"https:\\\/\\\/websites.fraunhofer.de\\\/smart-sensing-insights\\\/wp-content\\\/uploads\\\/2023\\\/06\\\/Fraunhofer-IIS-1.png\",\"contentUrl\":\"https:\\\/\\\/websites.fraunhofer.de\\\/smart-sensing-insights\\\/wp-content\\\/uploads\\\/2023\\\/06\\\/Fraunhofer-IIS-1.png\",\"width\":826,\"height\":299,\"caption\":\"Fraunhofer 
IIS\"},\"image\":{\"@id\":\"https:\\\/\\\/websites.fraunhofer.de\\\/smart-sensing-insights\\\/#\\\/schema\\\/logo\\\/image\\\/\"},\"sameAs\":[\"https:\\\/\\\/www.facebook.com\\\/FraunhoferIIS\",\"https:\\\/\\\/www.linkedin.com\\\/company\\\/fraunhofer-iis\"]},{\"@type\":\"Person\",\"@id\":\"https:\\\/\\\/websites.fraunhofer.de\\\/smart-sensing-insights\\\/#\\\/schema\\\/person\\\/fc55925f8da111629c277bcedf848c5e\",\"name\":\"Grit Nickel\",\"image\":{\"@type\":\"ImageObject\",\"inLanguage\":\"en-US\",\"@id\":\"https:\\\/\\\/websites.fraunhofer.de\\\/smart-sensing-insights\\\/wp-content\\\/uploads\\\/2023\\\/08\\\/cropped-Grit_Nickel-glasowfotografie_1zu1-96x96.jpgb9e2641f671ece8d5eef3a78bdbb8b8e\",\"url\":\"https:\\\/\\\/websites.fraunhofer.de\\\/smart-sensing-insights\\\/wp-content\\\/uploads\\\/2023\\\/08\\\/cropped-Grit_Nickel-glasowfotografie_1zu1-96x96.jpg\",\"contentUrl\":\"https:\\\/\\\/websites.fraunhofer.de\\\/smart-sensing-insights\\\/wp-content\\\/uploads\\\/2023\\\/08\\\/cropped-Grit_Nickel-glasowfotografie_1zu1-96x96.jpg\",\"caption\":\"Grit Nickel\"},\"description\":\"Grit is a content writer at Fraunhofer IIS and a science communication specialist. She has 6+ years of experience in research and holds a PhD in German linguistics.\",\"sameAs\":[\"https:\\\/\\\/www.linkedin.com\\\/in\\\/grit-nickel\\\/\"],\"url\":\"https:\\\/\\\/websites.fraunhofer.de\\\/smart-sensing-insights\\\/author\\\/grit-nickeliis-fraunhofer-de\\\/\"}]}<\/script>\n<!-- \/ Yoast SEO plugin. -->","yoast_head_json":{"title":"Affective Computing 101 - SMART SENSING insights","description":"Have AI understand your emotions? 
Explore our perspectives on the potentials of affective computing and discover real-world use cases.","robots":{"index":"index","follow":"follow","max-snippet":"max-snippet:-1","max-image-preview":"max-image-preview:large","max-video-preview":"max-video-preview:-1"},"canonical":"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/affective-computing-101\/","og_locale":"en_US","og_type":"article","og_title":"Affective Computing 101 - SMART SENSING insights","og_description":"Have AI understand your emotions? Explore our perspectives on the potentials of affective computing and discover real-world use cases.","og_url":"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/affective-computing-101\/","og_site_name":"SMART SENSING insights","article_publisher":"https:\/\/www.facebook.com\/FraunhoferIIS","article_published_time":"2023-08-29T09:17:55+00:00","article_modified_time":"2026-02-03T11:12:32+00:00","og_image":[{"width":2560,"height":1413,"url":"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2023\/08\/HSA_multimodale_Zustandserkennung-scaled.jpg","type":"image\/jpeg"}],"author":"Grit Nickel","twitter_card":"summary_large_image","twitter_misc":{"Written by":"Grit Nickel","Est. 
reading time":"5 minutes"},"schema":{"@context":"https:\/\/schema.org","@graph":[{"@type":"Article","@id":"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/affective-computing-101\/#article","isPartOf":{"@id":"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/affective-computing-101\/"},"author":{"name":"Grit Nickel","@id":"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/#\/schema\/person\/fc55925f8da111629c277bcedf848c5e"},"headline":"Affective Computing 101","datePublished":"2023-08-29T09:17:55+00:00","dateModified":"2026-02-03T11:12:32+00:00","mainEntityOfPage":{"@id":"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/affective-computing-101\/"},"wordCount":1045,"commentCount":0,"publisher":{"@id":"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/#organization"},"image":{"@id":"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/affective-computing-101\/#primaryimage"},"thumbnailUrl":"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2023\/08\/HSA_multimodale_Zustandserkennung-scaled.jpg","keywords":["AI","Science 101"],"articleSection":["Affective Computing"],"inLanguage":"en-US","potentialAction":[{"@type":"CommentAction","name":"Comment","target":["https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/affective-computing-101\/#respond"]}]},{"@type":"WebPage","@id":"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/affective-computing-101\/","url":"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/affective-computing-101\/","name":"Affective Computing 101 - SMART SENSING 
insights","isPartOf":{"@id":"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/#website"},"primaryImageOfPage":{"@id":"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/affective-computing-101\/#primaryimage"},"image":{"@id":"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/affective-computing-101\/#primaryimage"},"thumbnailUrl":"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2023\/08\/HSA_multimodale_Zustandserkennung-scaled.jpg","datePublished":"2023-08-29T09:17:55+00:00","dateModified":"2026-02-03T11:12:32+00:00","description":"Have AI understand your emotions? Explore our perspectives on the potentials of affective computing and discover real-world use cases.","breadcrumb":{"@id":"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/affective-computing-101\/#breadcrumb"},"inLanguage":"en-US","potentialAction":[{"@type":"ReadAction","target":["https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/affective-computing-101\/"]}]},{"@type":"ImageObject","inLanguage":"en-US","@id":"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/affective-computing-101\/#primaryimage","url":"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2023\/08\/HSA_multimodale_Zustandserkennung-scaled.jpg","contentUrl":"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2023\/08\/HSA_multimodale_Zustandserkennung-scaled.jpg","width":2560,"height":1413,"caption":"Emotion Detection Software"},{"@type":"BreadcrumbList","@id":"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/affective-computing-101\/#breadcrumb","itemListElement":[{"@type":"ListItem","position":1,"name":"Home","item":"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/"},{"@type":"ListItem","position":2,"name":"Affective Computing 
101"}]},{"@type":"WebSite","@id":"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/#website","url":"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/","name":"SMART SENSING insights","description":"learn more about our focus research areas sensor technology, electronics, and artificial intelligence","publisher":{"@id":"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/#organization"},"potentialAction":[{"@type":"SearchAction","target":{"@type":"EntryPoint","urlTemplate":"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/?s={search_term_string}"},"query-input":{"@type":"PropertyValueSpecification","valueRequired":true,"valueName":"search_term_string"}}],"inLanguage":"en-US"},{"@type":"Organization","@id":"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/#organization","name":"Fraunhofer IIS","url":"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/","logo":{"@type":"ImageObject","inLanguage":"en-US","@id":"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/#\/schema\/logo\/image\/","url":"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2023\/06\/Fraunhofer-IIS-1.png","contentUrl":"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2023\/06\/Fraunhofer-IIS-1.png","width":826,"height":299,"caption":"Fraunhofer IIS"},"image":{"@id":"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/#\/schema\/logo\/image\/"},"sameAs":["https:\/\/www.facebook.com\/FraunhoferIIS","https:\/\/www.linkedin.com\/company\/fraunhofer-iis"]},{"@type":"Person","@id":"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/#\/schema\/person\/fc55925f8da111629c277bcedf848c5e","name":"Grit 
Nickel","image":{"@type":"ImageObject","inLanguage":"en-US","@id":"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2023\/08\/cropped-Grit_Nickel-glasowfotografie_1zu1-96x96.jpgb9e2641f671ece8d5eef3a78bdbb8b8e","url":"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2023\/08\/cropped-Grit_Nickel-glasowfotografie_1zu1-96x96.jpg","contentUrl":"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2023\/08\/cropped-Grit_Nickel-glasowfotografie_1zu1-96x96.jpg","caption":"Grit Nickel"},"description":"Grit is a content writer at Fraunhofer IIS and a science communication specialist. She has 6+ years of experience in research and holds a PhD in German linguistics.","sameAs":["https:\/\/www.linkedin.com\/in\/grit-nickel\/"],"url":"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/author\/grit-nickeliis-fraunhofer-de\/"}]}},"_links":{"self":[{"href":"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-json\/wp\/v2\/posts\/831","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-json\/wp\/v2\/users\/9"}],"replies":[{"embeddable":true,"href":"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-json\/wp\/v2\/comments?post=831"}],"version-history":[{"count":19,"href":"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-json\/wp\/v2\/posts\/831\/revisions"}],"predecessor-version":[{"id":5100,"href":"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-json\/wp\/v2\/posts\/831\/revisions\/5100"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-json\/wp\/v2\/media\/832"}],"wp:attachment":[{"href":"https:\/\/
websites.fraunhofer.de\/smart-sensing-insights\/wp-json\/wp\/v2\/media?parent=831"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-json\/wp\/v2\/categories?post=831"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-json\/wp\/v2\/tags?post=831"},{"taxonomy":"author","embeddable":true,"href":"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-json\/wp\/v2\/coauthors?post=831"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}