{"id":1702,"date":"2024-05-21T11:34:00","date_gmt":"2024-05-21T09:34:00","guid":{"rendered":"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/?p=1702"},"modified":"2025-08-25T11:29:01","modified_gmt":"2025-08-25T09:29:01","slug":"acquiring-multimodal-data-in-the-emotionai-box","status":"publish","type":"post","link":"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/acquiring-multimodal-data-in-the-emotionai-box\/","title":{"rendered":"Acquiring Multimodal Data in the Exposure Cabin"},"content":{"rendered":"\n<p>When acquiring data for <a href=\"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/affective-computing-101\/\">affective computing<\/a>, we aim for a multimodal mix of data as it allows for a more comprehensive and accurate understanding of human emotions and behaviors. By collecting multimodal data, such as physiological signals or behavioral changes, we can capture a more nuanced and holistic picture of an individual&#8217;s affective states. This, of course, leads to more reliable and robust results and high-quality data, which in turn is the prerequisite for the sound classification of emotions with AI-based algorithms.<\/p>\n\n\n\n<h3 class=\"wp-block-heading\">Main challenges<\/h3>\n\n\n\n<p>What sounds good in theory, though, poses two major challenges to any (applied) research endeavor aiming to obtain state-of-the-art data on affective states:<\/p>\n\n\n\n<p><strong>Challenge no. 1<\/strong>: There is no one-size-fits-all solution when it comes to measurement modalities. Different study designs and use cases may have individual requirements to measurement modalities and protocols.<\/p>\n\n\n\n<p><strong>Challenge no. 2<\/strong>: While researchers \u2013 at least in their heart \u2013 wish for the optimum when it comes to study design and data, we are all constrained by feasibility and budgetary considerations.<\/p>\n\n\n\n<p>And that\u2019s where our exposure cabin comes in: We developed a closed, interference-free measurement cabin that facilitates the combination of various modalities \u2013 customized to individual needs. It provides researchers with<\/p>\n\n\n\n<ul class=\"wp-block-list\">\n<li>a controlled environment for data acquisition and precise detection of affective states, e.g., through predefined lighting conditions and noise environment<\/li>\n\n\n\n<li>simple and affordable data acquisition, e.g., through wearables and extensive sensor and camera equipment<\/li>\n\n\n\n<li>flexible and customized selection of measurement modalities, tailored to the specific content, purpose, and needs of each study<\/li>\n\n\n\n<li>fast and trouble-free study setup<\/li>\n\n\n\n<li>easy data synchronization and accurate mapping of stimulus and response<\/li>\n\n\n\n<li>high level of data security during the acquisition and analysis phase, taking into account all necessary ethical guidelines and data protection regulations.<\/li>\n<\/ul>\n\n\n\n<div class=\"wp-block-columns is-layout-flex wp-container-core-columns-is-layout-9d6595d7 wp-block-columns-is-layout-flex\">\n<div class=\"wp-block-column is-layout-flow wp-block-column-is-layout-flow\" style=\"flex-basis:100%\">\n<figure class=\"wp-block-gallery has-nested-images columns-default is-cropped wp-block-gallery-1 is-layout-flex wp-block-gallery-is-layout-flex\">\n<figure data-wp-context=\"{&quot;imageId&quot;:&quot;6a047bc7a4f84&quot;}\" data-wp-interactive=\"core\/image\" data-wp-key=\"6a047bc7a4f84\" class=\"wp-block-image size-large wp-lightbox-container\"><img decoding=\"async\" width=\"1024\" height=\"678\" data-wp-class--hide=\"state.isContentHidden\" data-wp-class--show=\"state.isContentVisible\" data-wp-init=\"callbacks.setButtonStyles\" data-wp-on--click=\"actions.showLightbox\" data-wp-on--load=\"callbacks.setButtonStyles\" data-wp-on-window--resize=\"callbacks.setButtonStyles\" data-id=\"1710\" src=\"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2024\/01\/ExpoBox-1_aussen_web-1024x678.png\" alt=\"\" class=\"wp-image-1710\" srcset=\"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2024\/01\/ExpoBox-1_aussen_web-1024x678.png 1024w, https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2024\/01\/ExpoBox-1_aussen_web-300x199.png 300w, https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2024\/01\/ExpoBox-1_aussen_web-768x509.png 768w, https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2024\/01\/ExpoBox-1_aussen_web-1536x1017.png 1536w, https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2024\/01\/ExpoBox-1_aussen_web-370x245.png 370w, https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2024\/01\/ExpoBox-1_aussen_web-270x179.png 270w, https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2024\/01\/ExpoBox-1_aussen_web-570x377.png 570w, https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2024\/01\/ExpoBox-1_aussen_web-740x490.png 740w, https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2024\/01\/ExpoBox-1_aussen_web.png 2048w\" sizes=\"(max-width: 1024px) 100vw, 1024px\" \/><button\n\t\t\tclass=\"lightbox-trigger\"\n\t\t\ttype=\"button\"\n\t\t\taria-haspopup=\"dialog\"\n\t\t\taria-label=\"Enlarge\"\n\t\t\tdata-wp-init=\"callbacks.initTriggerButton\"\n\t\t\tdata-wp-on--click=\"actions.showLightbox\"\n\t\t\tdata-wp-style--right=\"state.imageButtonRight\"\n\t\t\tdata-wp-style--top=\"state.imageButtonTop\"\n\t\t>\n\t\t\t<svg xmlns=\"http:\/\/www.w3.org\/2000\/svg\" width=\"12\" height=\"12\" fill=\"none\" viewBox=\"0 0 12 12\">\n\t\t\t\t<path fill=\"#fff\" d=\"M2 0a2 2 0 0 0-2 2v2h1.5V2a.5.5 0 0 1 .5-.5h2V0H2Zm2 10.5H2a.5.5 0 0 1-.5-.5V8H0v2a2 2 0 0 0 2 2h2v-1.5ZM8 12v-1.5h2a.5.5 0 0 0 .5-.5V8H12v2a2 2 0 0 1-2 2H8Zm2-12a2 2 0 0 1 2 2v2h-1.5V2a.5.5 0 0 0-.5-.5H8V0h2Z\" \/>\n\t\t\t<\/svg>\n\t\t<\/button><figcaption class=\"wp-element-caption\">Just a closed box from the outside &#8230;<\/figcaption><\/figure>\n\n\n\n<figure class=\"wp-block-image size-large\"><img decoding=\"async\" width=\"683\" height=\"1024\" data-id=\"1706\" src=\"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2024\/01\/Arousalstudie_ExpoBox_ambient_red-2_web-683x1024.png\" alt=\"\" class=\"wp-image-1706\" srcset=\"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2024\/01\/Arousalstudie_ExpoBox_ambient_red-2_web-683x1024.png 683w, https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2024\/01\/Arousalstudie_ExpoBox_ambient_red-2_web-200x300.png 200w, https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2024\/01\/Arousalstudie_ExpoBox_ambient_red-2_web-768x1152.png 768w, https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2024\/01\/Arousalstudie_ExpoBox_ambient_red-2_web-1024x1536.png 1024w, https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2024\/01\/Arousalstudie_ExpoBox_ambient_red-2_web-1365x2048.png 1365w, https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2024\/01\/Arousalstudie_ExpoBox_ambient_red-2_web-370x555.png 370w, https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2024\/01\/Arousalstudie_ExpoBox_ambient_red-2_web-270x405.png 270w, https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2024\/01\/Arousalstudie_ExpoBox_ambient_red-2_web-570x855.png 570w, https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2024\/01\/Arousalstudie_ExpoBox_ambient_red-2_web-740x1110.png 740w, https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2024\/01\/Arousalstudie_ExpoBox_ambient_red-2_web.png 2048w\" sizes=\"(max-width: 683px) 100vw, 683px\" \/><figcaption class=\"wp-element-caption\">&#8230; but plenty of configurations inside, e.g., predefined lighting<\/figcaption><\/figure>\n<\/figure>\n<\/div>\n<\/div>\n\n\n\n<h2 class=\"wp-block-heading\">Inside the box: Multimodal data acquisition<\/h2>\n\n\n\n<p>The main benefit of an exposure cabin for affective computing is the combination of a controlled environment to present stimuli and the sensor and monitoring equipment. Inside the exposure box, the study participants are wired up to measure physiological signals ranging from respiration, pulse, and blood oxygen saturation to intestinal activity. Additionally, they can be equipped with an EEG cap. Outside the box, a study monitor carefully observes and monitors the study participants and the measurement protocols.<\/p>\n\n\n\n<p>The configuration of our affective computing lab for specific study setups basically follows the modular principle: Which of the measurement modalities are chosen depends on the study\u2019s content, purpose, and on budgetary constraints (see challenge no. 2). Including an EEG into the analysis, for example, generates great data on brain activity and attentiveness, but it also generates higher costs on account of the labor-intensive data evaluation. The automotive industry, on the other hand, is always looking to make as many of these analyses contactless (for instance, <a href=\"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/cognitive-load-in-autonomous-driving\/?preview_id=1635&amp;preview_nonce=617d648b15&amp;preview=true&amp;_thumbnail_id=1637\">when assessing drivers\u2019 cognitive states<\/a>) \u2013 not for cost considerations, but in order to closely resemble real-world applications and measurement modalities in vehicles.<\/p>\n\n\n\n<div class=\"wp-block-group\"><div class=\"wp-block-group__inner-container is-layout-constrained wp-block-group-is-layout-constrained\">\n<figure class=\"wp-block-gallery has-nested-images columns-default is-cropped wp-block-gallery-2 is-layout-flex wp-block-gallery-is-layout-flex\">\n<figure data-wp-context=\"{&quot;imageId&quot;:&quot;6a047bc7a5e12&quot;}\" data-wp-interactive=\"core\/image\" data-wp-key=\"6a047bc7a5e12\" class=\"wp-block-image size-large wp-lightbox-container\"><img decoding=\"async\" width=\"1024\" height=\"704\" data-wp-class--hide=\"state.isContentHidden\" data-wp-class--show=\"state.isContentVisible\" data-wp-init=\"callbacks.setButtonStyles\" data-wp-on--click=\"actions.showLightbox\" data-wp-on--load=\"callbacks.setButtonStyles\" data-wp-on-window--resize=\"callbacks.setButtonStyles\" data-id=\"1705\" src=\"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2024\/01\/Arousal-Studie_Expobox_11-2023-2-1024x704.jpg\" alt=\"Multimodal Data acquisition and monitoring in- and outside the EmotionAI Box\" class=\"wp-image-1705\" srcset=\"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2024\/01\/Arousal-Studie_Expobox_11-2023-2-1024x704.jpg 1024w, https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2024\/01\/Arousal-Studie_Expobox_11-2023-2-300x206.jpg 300w, https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2024\/01\/Arousal-Studie_Expobox_11-2023-2-768x528.jpg 768w, https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2024\/01\/Arousal-Studie_Expobox_11-2023-2-1536x1056.jpg 1536w, https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2024\/01\/Arousal-Studie_Expobox_11-2023-2-2048x1408.jpg 2048w, https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2024\/01\/Arousal-Studie_Expobox_11-2023-2-370x254.jpg 370w, https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2024\/01\/Arousal-Studie_Expobox_11-2023-2-270x186.jpg 270w, https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2024\/01\/Arousal-Studie_Expobox_11-2023-2-435x300.jpg 435w, https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2024\/01\/Arousal-Studie_Expobox_11-2023-2-570x392.jpg 570w, https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2024\/01\/Arousal-Studie_Expobox_11-2023-2-740x509.jpg 740w\" sizes=\"(max-width: 1024px) 100vw, 1024px\" \/><button\n\t\t\tclass=\"lightbox-trigger\"\n\t\t\ttype=\"button\"\n\t\t\taria-haspopup=\"dialog\"\n\t\t\taria-label=\"Enlarge\"\n\t\t\tdata-wp-init=\"callbacks.initTriggerButton\"\n\t\t\tdata-wp-on--click=\"actions.showLightbox\"\n\t\t\tdata-wp-style--right=\"state.imageButtonRight\"\n\t\t\tdata-wp-style--top=\"state.imageButtonTop\"\n\t\t>\n\t\t\t<svg xmlns=\"http:\/\/www.w3.org\/2000\/svg\" width=\"12\" height=\"12\" fill=\"none\" viewBox=\"0 0 12 12\">\n\t\t\t\t<path fill=\"#fff\" d=\"M2 0a2 2 0 0 0-2 2v2h1.5V2a.5.5 0 0 1 .5-.5h2V0H2Zm2 10.5H2a.5.5 0 0 1-.5-.5V8H0v2a2 2 0 0 0 2 2h2v-1.5ZM8 12v-1.5h2a.5.5 0 0 0 .5-.5V8H12v2a2 2 0 0 1-2 2H8Zm2-12a2 2 0 0 1 2 2v2h-1.5V2a.5.5 0 0 0-.5-.5H8V0h2Z\" \/>\n\t\t\t<\/svg>\n\t\t<\/button><figcaption class=\"wp-element-caption\">Multimodal data acquisition inside the box<\/figcaption><\/figure>\n\n\n\n<figure class=\"wp-block-image size-large is-style-default\"><img decoding=\"async\" width=\"1024\" height=\"683\" data-id=\"1707\" src=\"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2024\/01\/Arousalstudie_ExpoBox_Garmin_Biopacs-4_web-1024x683.png\" alt=\"\" class=\"wp-image-1707\" srcset=\"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2024\/01\/Arousalstudie_ExpoBox_Garmin_Biopacs-4_web-1024x683.png 1024w, https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2024\/01\/Arousalstudie_ExpoBox_Garmin_Biopacs-4_web-300x200.png 300w, https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2024\/01\/Arousalstudie_ExpoBox_Garmin_Biopacs-4_web-768x512.png 768w, https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2024\/01\/Arousalstudie_ExpoBox_Garmin_Biopacs-4_web-1536x1024.png 1536w, https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2024\/01\/Arousalstudie_ExpoBox_Garmin_Biopacs-4_web-370x247.png 370w, https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2024\/01\/Arousalstudie_ExpoBox_Garmin_Biopacs-4_web-270x180.png 270w, https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2024\/01\/Arousalstudie_ExpoBox_Garmin_Biopacs-4_web-570x380.png 570w, https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2024\/01\/Arousalstudie_ExpoBox_Garmin_Biopacs-4_web-740x493.png 740w, https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2024\/01\/Arousalstudie_ExpoBox_Garmin_Biopacs-4_web.png 2048w\" sizes=\"(max-width: 1024px) 100vw, 1024px\" \/><figcaption class=\"wp-element-caption\">Wearables to measure bio-signals<\/figcaption><\/figure>\n\n\n\n<figure class=\"wp-block-image size-large\"><img decoding=\"async\" width=\"1024\" height=\"683\" data-id=\"1704\" src=\"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2024\/01\/Arousal-Studie_ExpoBox_11-23-7_web-1024x683.png\" alt=\"\" class=\"wp-image-1704\" srcset=\"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2024\/01\/Arousal-Studie_ExpoBox_11-23-7_web-1024x683.png 1024w, https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2024\/01\/Arousal-Studie_ExpoBox_11-23-7_web-300x200.png 300w, https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2024\/01\/Arousal-Studie_ExpoBox_11-23-7_web-768x512.png 768w, https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2024\/01\/Arousal-Studie_ExpoBox_11-23-7_web-1536x1025.png 1536w, https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2024\/01\/Arousal-Studie_ExpoBox_11-23-7_web-370x247.png 370w, https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2024\/01\/Arousal-Studie_ExpoBox_11-23-7_web-270x180.png 270w, https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2024\/01\/Arousal-Studie_ExpoBox_11-23-7_web-570x380.png 570w, https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2024\/01\/Arousal-Studie_ExpoBox_11-23-7_web-740x494.png 740w, https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2024\/01\/Arousal-Studie_ExpoBox_11-23-7_web.png 2048w\" sizes=\"(max-width: 1024px) 100vw, 1024px\" \/><figcaption class=\"wp-element-caption\">Multi-monitor setup<\/figcaption><\/figure>\n\n\n\n<figure class=\"wp-block-image size-large\"><img decoding=\"async\" width=\"1024\" height=\"728\" data-id=\"1703\" src=\"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2024\/01\/Arousal-Studie_ExpoBox_11-23-5_web-1024x728.png\" alt=\"\" class=\"wp-image-1703\" srcset=\"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2024\/01\/Arousal-Studie_ExpoBox_11-23-5_web-1024x728.png 1024w, https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2024\/01\/Arousal-Studie_ExpoBox_11-23-5_web-300x213.png 300w, https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2024\/01\/Arousal-Studie_ExpoBox_11-23-5_web-768x546.png 768w, https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2024\/01\/Arousal-Studie_ExpoBox_11-23-5_web-1536x1091.png 1536w, https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2024\/01\/Arousal-Studie_ExpoBox_11-23-5_web-370x263.png 370w, https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2024\/01\/Arousal-Studie_ExpoBox_11-23-5_web-270x192.png 270w, https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2024\/01\/Arousal-Studie_ExpoBox_11-23-5_web-570x405.png 570w, https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2024\/01\/Arousal-Studie_ExpoBox_11-23-5_web-740x526.png 740w, https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2024\/01\/Arousal-Studie_ExpoBox_11-23-5_web.png 2048w\" sizes=\"(max-width: 1024px) 100vw, 1024px\" \/><figcaption class=\"wp-element-caption\">Monitoring outside the box<\/figcaption><\/figure>\n<\/figure>\n<\/div><\/div>\n\n\n\n<h2 class=\"wp-block-heading\">Why choose one over the other?<\/h2>\n\n\n\n<p>Once the study setup, data synchronization protocols, and the sample population are defined, the data is recorded in the exposure cabin. The study participants are exposed to various stimuli: visual images, videos, or audio clips, tasks they are asked to complete, or products and commercials they are asked to evaluate.<\/p>\n\n\n\n<p>But why combine subjective assessment and multimodals data analysis in the first place? The benefit of 360\u00b0 multimodal data lies in the objectivity and robustness of the data. Especially when it comes to analyzing affective reactions to products or online services, this can be off an advantage:<\/p>\n\n\n\n<ol class=\"wp-block-list\">\n<li>Physiological and behavioral data capture unfiltered reactions. Study participants may not want to admit in an interview or questionnaire that they were struggling with a task or overwhelmed while driving. (Yep, even in scientific contexts it can all be about the art of saving face.)<\/li>\n\n\n\n<li>Affective reactions often occur unconsciously and are thus hard to access through retrospective questioning. Real-time analysis can capture more of both the conscious and unconscious facet of a subject\u2019s affective reactions \u2013 at least more than any post-hoc user feedback analysis ever could.<\/li>\n\n\n\n<li>Real-time analysis is more precise. In retrospect, a test subjective may not only have trouble motivating why they got stressed, but also when exactly this occurred. By accurately mapping stimulus and response, multimodal data acquisition techniques can precisely determine these aspects.<\/li>\n<\/ol>\n\n\n\n<p>Are you interested in knowing whether a 360\u00b0 analysis can benefit your project? <a href=\"mailto:affective-computing@iis.fraunhofer.de\">Just reach out to us!<\/a><\/p>\n\n\n\n<hr class=\"wp-block-separator has-alpha-channel-opacity\"\/>\n\n\n\n<p>Image copyright: Fraunhofer IIS \/ Paul Pulkert<\/p>\n","protected":false},"excerpt":{"rendered":"<p>When acquiring data for affective computing, we aim for a multimodal mix of data as it allows for a more comprehensive and accurate understanding of human emotions and behaviors. By collecting multimodal data, such as physiological signals or behavioral changes, we can capture a more nuanced and holistic picture of an individual&#8217;s affective states. This, [&hellip;]<\/p>\n","protected":false},"author":9,"featured_media":1709,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[41,98],"tags":[37,66,67],"coauthors":[51],"class_list":["post-1702","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-affective-computing","category-infrastructure","tag-ai","tag-multimodal-data","tag-wearables"],"yoast_head":"<!-- This site is optimized with the Yoast SEO plugin v27.6 - https:\/\/yoast.com\/product\/yoast-seo-wordpress\/ -->\n<title>Acquiring Multimodal Data in the Exposure Cabin - SMART SENSING insights<\/title>\n<meta name=\"description\" content=\"To get a comprehensive understanding of human emotions, we acquire multimodal data. One way to do just that is our exposure cabin.\" \/>\n<meta name=\"robots\" content=\"index, follow, max-snippet:-1, max-image-preview:large, max-video-preview:-1\" \/>\n<link rel=\"canonical\" href=\"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/acquiring-multimodal-data-in-the-emotionai-box\/\" \/>\n<meta property=\"og:locale\" content=\"en_US\" \/>\n<meta property=\"og:type\" content=\"article\" \/>\n<meta property=\"og:title\" content=\"Acquiring Multimodal Data in the Exposure Cabin - SMART SENSING insights\" \/>\n<meta property=\"og:description\" content=\"To get a comprehensive understanding of human emotions, we acquire multimodal data. One way to do just that is our exposure cabin.\" \/>\n<meta property=\"og:url\" content=\"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/acquiring-multimodal-data-in-the-emotionai-box\/\" \/>\n<meta property=\"og:site_name\" content=\"SMART SENSING insights\" \/>\n<meta property=\"article:publisher\" content=\"https:\/\/www.facebook.com\/FraunhoferIIS\" \/>\n<meta property=\"article:published_time\" content=\"2024-05-21T09:34:00+00:00\" \/>\n<meta property=\"article:modified_time\" content=\"2025-08-25T09:29:01+00:00\" \/>\n<meta property=\"og:image\" content=\"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2024\/01\/Arousal-Studie_Expobox_inoutside_11-2023_web.png\" \/>\n\t<meta property=\"og:image:width\" content=\"2048\" \/>\n\t<meta property=\"og:image:height\" content=\"946\" \/>\n\t<meta property=\"og:image:type\" content=\"image\/png\" \/>\n<meta name=\"author\" content=\"Grit Nickel\" \/>\n<meta name=\"twitter:card\" content=\"summary_large_image\" \/>\n<meta name=\"twitter:label1\" content=\"Written by\" \/>\n\t<meta name=\"twitter:data1\" content=\"Grit Nickel\" \/>\n\t<meta name=\"twitter:label2\" content=\"Est. reading time\" \/>\n\t<meta name=\"twitter:data2\" content=\"5 minutes\" \/>\n<script type=\"application\/ld+json\" class=\"yoast-schema-graph\">{\"@context\":\"https:\\\/\\\/schema.org\",\"@graph\":[{\"@type\":\"Article\",\"@id\":\"https:\\\/\\\/websites.fraunhofer.de\\\/smart-sensing-insights\\\/acquiring-multimodal-data-in-the-emotionai-box\\\/#article\",\"isPartOf\":{\"@id\":\"https:\\\/\\\/websites.fraunhofer.de\\\/smart-sensing-insights\\\/acquiring-multimodal-data-in-the-emotionai-box\\\/\"},\"author\":{\"name\":\"Grit Nickel\",\"@id\":\"https:\\\/\\\/websites.fraunhofer.de\\\/smart-sensing-insights\\\/#\\\/schema\\\/person\\\/fc55925f8da111629c277bcedf848c5e\"},\"headline\":\"Acquiring Multimodal Data in the Exposure Cabin\",\"datePublished\":\"2024-05-21T09:34:00+00:00\",\"dateModified\":\"2025-08-25T09:29:01+00:00\",\"mainEntityOfPage\":{\"@id\":\"https:\\\/\\\/websites.fraunhofer.de\\\/smart-sensing-insights\\\/acquiring-multimodal-data-in-the-emotionai-box\\\/\"},\"wordCount\":772,\"commentCount\":0,\"publisher\":{\"@id\":\"https:\\\/\\\/websites.fraunhofer.de\\\/smart-sensing-insights\\\/#organization\"},\"image\":{\"@id\":\"https:\\\/\\\/websites.fraunhofer.de\\\/smart-sensing-insights\\\/acquiring-multimodal-data-in-the-emotionai-box\\\/#primaryimage\"},\"thumbnailUrl\":\"https:\\\/\\\/websites.fraunhofer.de\\\/smart-sensing-insights\\\/wp-content\\\/uploads\\\/2024\\\/01\\\/Arousal-Studie_Expobox_inoutside_11-2023_web.png\",\"keywords\":[\"AI\",\"Multimodal Data\",\"Wearables\"],\"articleSection\":[\"Affective Computing\",\"Infrastructure\"],\"inLanguage\":\"en-US\",\"potentialAction\":[{\"@type\":\"CommentAction\",\"name\":\"Comment\",\"target\":[\"https:\\\/\\\/websites.fraunhofer.de\\\/smart-sensing-insights\\\/acquiring-multimodal-data-in-the-emotionai-box\\\/#respond\"]}]},{\"@type\":\"WebPage\",\"@id\":\"https:\\\/\\\/websites.fraunhofer.de\\\/smart-sensing-insights\\\/acquiring-multimodal-data-in-the-emotionai-box\\\/\",\"url\":\"https:\\\/\\\/websites.fraunhofer.de\\\/smart-sensing-insights\\\/acquiring-multimodal-data-in-the-emotionai-box\\\/\",\"name\":\"Acquiring Multimodal Data in the Exposure Cabin - SMART SENSING insights\",\"isPartOf\":{\"@id\":\"https:\\\/\\\/websites.fraunhofer.de\\\/smart-sensing-insights\\\/#website\"},\"primaryImageOfPage\":{\"@id\":\"https:\\\/\\\/websites.fraunhofer.de\\\/smart-sensing-insights\\\/acquiring-multimodal-data-in-the-emotionai-box\\\/#primaryimage\"},\"image\":{\"@id\":\"https:\\\/\\\/websites.fraunhofer.de\\\/smart-sensing-insights\\\/acquiring-multimodal-data-in-the-emotionai-box\\\/#primaryimage\"},\"thumbnailUrl\":\"https:\\\/\\\/websites.fraunhofer.de\\\/smart-sensing-insights\\\/wp-content\\\/uploads\\\/2024\\\/01\\\/Arousal-Studie_Expobox_inoutside_11-2023_web.png\",\"datePublished\":\"2024-05-21T09:34:00+00:00\",\"dateModified\":\"2025-08-25T09:29:01+00:00\",\"description\":\"To get a comprehensive understanding of human emotions, we acquire multimodal data. One way to do just that is our exposure cabin.\",\"breadcrumb\":{\"@id\":\"https:\\\/\\\/websites.fraunhofer.de\\\/smart-sensing-insights\\\/acquiring-multimodal-data-in-the-emotionai-box\\\/#breadcrumb\"},\"inLanguage\":\"en-US\",\"potentialAction\":[{\"@type\":\"ReadAction\",\"target\":[\"https:\\\/\\\/websites.fraunhofer.de\\\/smart-sensing-insights\\\/acquiring-multimodal-data-in-the-emotionai-box\\\/\"]}]},{\"@type\":\"ImageObject\",\"inLanguage\":\"en-US\",\"@id\":\"https:\\\/\\\/websites.fraunhofer.de\\\/smart-sensing-insights\\\/acquiring-multimodal-data-in-the-emotionai-box\\\/#primaryimage\",\"url\":\"https:\\\/\\\/websites.fraunhofer.de\\\/smart-sensing-insights\\\/wp-content\\\/uploads\\\/2024\\\/01\\\/Arousal-Studie_Expobox_inoutside_11-2023_web.png\",\"contentUrl\":\"https:\\\/\\\/websites.fraunhofer.de\\\/smart-sensing-insights\\\/wp-content\\\/uploads\\\/2024\\\/01\\\/Arousal-Studie_Expobox_inoutside_11-2023_web.png\",\"width\":2048,\"height\":946,\"caption\":\"Copyright: Fraunhofer IIS\\\/Paul Pulkert\"},{\"@type\":\"BreadcrumbList\",\"@id\":\"https:\\\/\\\/websites.fraunhofer.de\\\/smart-sensing-insights\\\/acquiring-multimodal-data-in-the-emotionai-box\\\/#breadcrumb\",\"itemListElement\":[{\"@type\":\"ListItem\",\"position\":1,\"name\":\"Home\",\"item\":\"https:\\\/\\\/websites.fraunhofer.de\\\/smart-sensing-insights\\\/\"},{\"@type\":\"ListItem\",\"position\":2,\"name\":\"Acquiring Multimodal Data in the Exposure Cabin\"}]},{\"@type\":\"WebSite\",\"@id\":\"https:\\\/\\\/websites.fraunhofer.de\\\/smart-sensing-insights\\\/#website\",\"url\":\"https:\\\/\\\/websites.fraunhofer.de\\\/smart-sensing-insights\\\/\",\"name\":\"SMART SENSING insights\",\"description\":\"learn more about our focus research areas sensor technology, electronics, and artificial intelligence\",\"publisher\":{\"@id\":\"https:\\\/\\\/websites.fraunhofer.de\\\/smart-sensing-insights\\\/#organization\"},\"potentialAction\":[{\"@type\":\"SearchAction\",\"target\":{\"@type\":\"EntryPoint\",\"urlTemplate\":\"https:\\\/\\\/websites.fraunhofer.de\\\/smart-sensing-insights\\\/?s={search_term_string}\"},\"query-input\":{\"@type\":\"PropertyValueSpecification\",\"valueRequired\":true,\"valueName\":\"search_term_string\"}}],\"inLanguage\":\"en-US\"},{\"@type\":\"Organization\",\"@id\":\"https:\\\/\\\/websites.fraunhofer.de\\\/smart-sensing-insights\\\/#organization\",\"name\":\"Fraunhofer IIS\",\"url\":\"https:\\\/\\\/websites.fraunhofer.de\\\/smart-sensing-insights\\\/\",\"logo\":{\"@type\":\"ImageObject\",\"inLanguage\":\"en-US\",\"@id\":\"https:\\\/\\\/websites.fraunhofer.de\\\/smart-sensing-insights\\\/#\\\/schema\\\/logo\\\/image\\\/\",\"url\":\"https:\\\/\\\/websites.fraunhofer.de\\\/smart-sensing-insights\\\/wp-content\\\/uploads\\\/2023\\\/06\\\/Fraunhofer-IIS-1.png\",\"contentUrl\":\"https:\\\/\\\/websites.fraunhofer.de\\\/smart-sensing-insights\\\/wp-content\\\/uploads\\\/2023\\\/06\\\/Fraunhofer-IIS-1.png\",\"width\":826,\"height\":299,\"caption\":\"Fraunhofer IIS\"},\"image\":{\"@id\":\"https:\\\/\\\/websites.fraunhofer.de\\\/smart-sensing-insights\\\/#\\\/schema\\\/logo\\\/image\\\/\"},\"sameAs\":[\"https:\\\/\\\/www.facebook.com\\\/FraunhoferIIS\",\"https:\\\/\\\/www.linkedin.com\\\/company\\\/fraunhofer-iis\"]},{\"@type\":\"Person\",\"@id\":\"https:\\\/\\\/websites.fraunhofer.de\\\/smart-sensing-insights\\\/#\\\/schema\\\/person\\\/fc55925f8da111629c277bcedf848c5e\",\"name\":\"Grit Nickel\",\"image\":{\"@type\":\"ImageObject\",\"inLanguage\":\"en-US\",\"@id\":\"https:\\\/\\\/websites.fraunhofer.de\\\/smart-sensing-insights\\\/wp-content\\\/uploads\\\/2023\\\/08\\\/cropped-Grit_Nickel-glasowfotografie_1zu1-96x96.jpgb9e2641f671ece8d5eef3a78bdbb8b8e\",\"url\":\"https:\\\/\\\/websites.fraunhofer.de\\\/smart-sensing-insights\\\/wp-content\\\/uploads\\\/2023\\\/08\\\/cropped-Grit_Nickel-glasowfotografie_1zu1-96x96.jpg\",\"contentUrl\":\"https:\\\/\\\/websites.fraunhofer.de\\\/smart-sensing-insights\\\/wp-content\\\/uploads\\\/2023\\\/08\\\/cropped-Grit_Nickel-glasowfotografie_1zu1-96x96.jpg\",\"caption\":\"Grit Nickel\"},\"description\":\"Grit is a content writer at Fraunhofer IIS and a science communication specialist. She has 6+ years of experience in research and holds a PhD in German linguistics.\",\"sameAs\":[\"https:\\\/\\\/www.linkedin.com\\\/in\\\/grit-nickel\\\/\"],\"url\":\"https:\\\/\\\/websites.fraunhofer.de\\\/smart-sensing-insights\\\/author\\\/grit-nickeliis-fraunhofer-de\\\/\"}]}<\/script>\n<!-- \/ Yoast SEO plugin. -->","yoast_head_json":{"title":"Acquiring Multimodal Data in the Exposure Cabin - SMART SENSING insights","description":"To get a comprehensive understanding of human emotions, we acquire multimodal data. One way to do just that is our exposure cabin.","robots":{"index":"index","follow":"follow","max-snippet":"max-snippet:-1","max-image-preview":"max-image-preview:large","max-video-preview":"max-video-preview:-1"},"canonical":"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/acquiring-multimodal-data-in-the-emotionai-box\/","og_locale":"en_US","og_type":"article","og_title":"Acquiring Multimodal Data in the Exposure Cabin - SMART SENSING insights","og_description":"To get a comprehensive understanding of human emotions, we acquire multimodal data. One way to do just that is our exposure cabin.","og_url":"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/acquiring-multimodal-data-in-the-emotionai-box\/","og_site_name":"SMART SENSING insights","article_publisher":"https:\/\/www.facebook.com\/FraunhoferIIS","article_published_time":"2024-05-21T09:34:00+00:00","article_modified_time":"2025-08-25T09:29:01+00:00","og_image":[{"width":2048,"height":946,"url":"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2024\/01\/Arousal-Studie_Expobox_inoutside_11-2023_web.png","type":"image\/png"}],"author":"Grit Nickel","twitter_card":"summary_large_image","twitter_misc":{"Written by":"Grit Nickel","Est. reading time":"5 minutes"},"schema":{"@context":"https:\/\/schema.org","@graph":[{"@type":"Article","@id":"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/acquiring-multimodal-data-in-the-emotionai-box\/#article","isPartOf":{"@id":"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/acquiring-multimodal-data-in-the-emotionai-box\/"},"author":{"name":"Grit Nickel","@id":"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/#\/schema\/person\/fc55925f8da111629c277bcedf848c5e"},"headline":"Acquiring Multimodal Data in the Exposure Cabin","datePublished":"2024-05-21T09:34:00+00:00","dateModified":"2025-08-25T09:29:01+00:00","mainEntityOfPage":{"@id":"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/acquiring-multimodal-data-in-the-emotionai-box\/"},"wordCount":772,"commentCount":0,"publisher":{"@id":"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/#organization"},"image":{"@id":"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/acquiring-multimodal-data-in-the-emotionai-box\/#primaryimage"},"thumbnailUrl":"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2024\/01\/Arousal-Studie_Expobox_inoutside_11-2023_web.png","keywords":["AI","Multimodal Data","Wearables"],"articleSection":["Affective Computing","Infrastructure"],"inLanguage":"en-US","potentialAction":[{"@type":"CommentAction","name":"Comment","target":["https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/acquiring-multimodal-data-in-the-emotionai-box\/#respond"]}]},{"@type":"WebPage","@id":"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/acquiring-multimodal-data-in-the-emotionai-box\/","url":"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/acquiring-multimodal-data-in-the-emotionai-box\/","name":"Acquiring Multimodal Data in the Exposure Cabin - SMART SENSING insights","isPartOf":{"@id":"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/#website"},"primaryImageOfPage":{"@id":"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/acquiring-multimodal-data-in-the-emotionai-box\/#primaryimage"},"image":{"@id":"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/acquiring-multimodal-data-in-the-emotionai-box\/#primaryimage"},"thumbnailUrl":"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2024\/01\/Arousal-Studie_Expobox_inoutside_11-2023_web.png","datePublished":"2024-05-21T09:34:00+00:00","dateModified":"2025-08-25T09:29:01+00:00","description":"To get a comprehensive understanding of human emotions, we acquire multimodal data. One way to do just that is our exposure cabin.","breadcrumb":{"@id":"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/acquiring-multimodal-data-in-the-emotionai-box\/#breadcrumb"},"inLanguage":"en-US","potentialAction":[{"@type":"ReadAction","target":["https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/acquiring-multimodal-data-in-the-emotionai-box\/"]}]},{"@type":"ImageObject","inLanguage":"en-US","@id":"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/acquiring-multimodal-data-in-the-emotionai-box\/#primaryimage","url":"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2024\/01\/Arousal-Studie_Expobox_inoutside_11-2023_web.png","contentUrl":"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2024\/01\/Arousal-Studie_Expobox_inoutside_11-2023_web.png","width":2048,"height":946,"caption":"Copyright: Fraunhofer IIS\/Paul Pulkert"},{"@type":"BreadcrumbList","@id":"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/acquiring-multimodal-data-in-the-emotionai-box\/#breadcrumb","itemListElement":[{"@type":"ListItem","position":1,"name":"Home","item":"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/"},{"@type":"ListItem","position":2,"name":"Acquiring Multimodal Data in the Exposure Cabin"}]},{"@type":"WebSite","@id":"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/#website","url":"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/","name":"SMART SENSING insights","description":"learn more about our focus research areas sensor technology, electronics, and artificial intelligence","publisher":{"@id":"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/#organization"},"potentialAction":[{"@type":"SearchAction","target":{"@type":"EntryPoint","urlTemplate":"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/?s={search_term_string}"},"query-input":{"@type":"PropertyValueSpecification","valueRequired":true,"valueName":"search_term_string"}}],"inLanguage":"en-US"},{"@type":"Organization","@id":"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/#organization","name":"Fraunhofer IIS","url":"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/","logo":{"@type":"ImageObject","inLanguage":"en-US","@id":"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/#\/schema\/logo\/image\/","url":"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2023\/06\/Fraunhofer-IIS-1.png","contentUrl":"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2023\/06\/Fraunhofer-IIS-1.png","width":826,"height":299,"caption":"Fraunhofer IIS"},"image":{"@id":"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/#\/schema\/logo\/image\/"},"sameAs":["https:\/\/www.facebook.com\/FraunhoferIIS","https:\/\/www.linkedin.com\/company\/fraunhofer-iis"]},{"@type":"Person","@id":"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/#\/schema\/person\/fc55925f8da111629c277bcedf848c5e","name":"Grit Nickel","image":{"@type":"ImageObject","inLanguage":"en-US","@id":"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2023\/08\/cropped-Grit_Nickel-glasowfotografie_1zu1-96x96.jpgb9e2641f671ece8d5eef3a78bdbb8b8e","url":"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2023\/08\/cropped-Grit_Nickel-glasowfotografie_1zu1-96x96.jpg","contentUrl":"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2023\/08\/cropped-Grit_Nickel-glasowfotografie_1zu1-96x96.jpg","caption":"Grit Nickel"},"description":"Grit is a content writer at Fraunhofer IIS and a science communication specialist. She has 6+ years of experience in research and holds a PhD in German linguistics.","sameAs":["https:\/\/www.linkedin.com\/in\/grit-nickel\/"],"url":"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/author\/grit-nickeliis-fraunhofer-de\/"}]}},"_links":{"self":[{"href":"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-json\/wp\/v2\/posts\/1702","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-json\/wp\/v2\/users\/9"}],"replies":[{"embeddable":true,"href":"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-json\/wp\/v2\/comments?post=1702"}],"version-history":[{"count":15,"href":"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-json\/wp\/v2\/posts\/1702\/revisions"}],"predecessor-version":[{"id":4552,"href":"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-json\/wp\/v2\/posts\/1702\/revisions\/4552"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-json\/wp\/v2\/media\/1709"}],"wp:attachment":[{"href":"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-json\/wp\/v2\/media?parent=1702"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-json\/wp\/v2\/categories?post=1702"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-json\/wp\/v2\/tags?post=1702"},{"taxonomy":"author","embeddable":true,"href":"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-json\/wp\/v2\/coauthors?post=1702"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}