{"id":386,"date":"2023-06-13T21:45:09","date_gmt":"2023-06-13T19:45:09","guid":{"rendered":"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/?p=386"},"modified":"2025-10-14T11:16:59","modified_gmt":"2025-10-14T09:16:59","slug":"mikaia-ai-authoring-app","status":"publish","type":"post","link":"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/mikaia-ai-authoring-app\/","title":{"rendered":"MIKAIA AI Authoring App"},"content":{"rendered":"\n<h2 class=\"wp-block-heading\"><a href=\"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/mikaia-university\/\">MIKAIA<sup>\u00ae<\/sup> University<\/a> Application Note: Interactively training an AI with the AI Authoring App &#8212; programming-free and from just a few example annotations<\/h2>\n\n\n\n<p>By design, researchers often have unique questions that they want to answer as quickly and efficiently as possible. The <a href=\"http:\/\/www.mikaia.ai\">MIKAIA<sup>\u00ae<\/sup> <\/a>team will never be able to create dedicated AI Apps for all these unique research questions. Instead, we have developed the <strong>AI Author App<\/strong> that empowers users to create an AI themselves on their own data. Since the app is based on \u201cFew Shot Learning\u201d, meaning the underlying backbone-AI that was already pre-trained on millions of histology image patches can be adapted to new use cases based on only very few examples (and in fact is not actually changed), the \u201ctraining\u201d process is very fast. Simply by defining a set of distinct tissue classes and outlining prototypical areas for each class, a new AI can be trained within minutes.<\/p>\n\n\n\n<figure class=\"wp-block-table\"><table><tbody><tr><td><strong><strong>Input<\/strong><\/strong><\/td><td><strong>&nbsp;<\/strong><\/td><\/tr><tr><td>Staining<\/td><td>The features produced by the pre-trained backbone <br>work optimally on HE-stains. 
<br>While we have also observed good performance on other stains, <br>such as IHC, the app cannot currently <br>be used with multiplexed <br>immunofluorescence (mIF) slides.<\/td><\/tr><tr><td>Microscopy <br>mode<\/td><td>Brightfield<\/td><\/tr><tr><td>Magnification<\/td><td>Typically 20x or 40x.<\/td><\/tr><tr><td>Supported <br>Analysis Modes<\/td><td>ROI,&nbsp;FoV, Slide, Batch<\/td><\/tr><tr><td><strong>Outputs<\/strong><\/td><td><strong>&nbsp;<\/strong><\/td><\/tr><tr><td>Graphical <br>Outputs<\/td><td>&#8211; Polygons (with holes) that outline tissue classes<br>&#8211; When pre-clustering is disabled, tissue outlines are rectangular<br>&#8211; Optionally, a rejection class (\u201cunsure\u201d) can be activated. <br>When activated, and depending on the sensitivity (1-100), <br>areas that look unlike any of the training annotations <br>will be assigned to the rejection class.&nbsp;<\/td><\/tr><tr><td>Slide-level <br>output metrics<\/td><td>&#8211; mm\u00b2 and % per tissue class<br>&#8211; t-SNE\/UMAP\/PCA plot of patches (helps locate outliers)<br>&#8211; Tissue area in mm\u00b2<\/td><\/tr><tr><td><strong>Description<\/strong><\/td><td><strong>&nbsp;<\/strong><\/td><\/tr><tr><td>Pre-processing<\/td><td>Tissue detection<br>[Optionally] pre-clustering of similar-looking regions. <br>AI analysis is then used to classify each cluster. <br>The average cluster side length (\u00b5m) is configurable.<\/td><\/tr><tr><td>Post-<br>processing \/ <br>additional <br>options<\/td><td>Adjacent clusters\/patches of identical tissue type can be fused<\/td><\/tr><tr><td>Technology \/ <br>Algorithm<\/td><td>The app conducts a patch-wise classification, <br>i.e., each patch (~50 x 50 \u00b5m) is assigned to <br>a tissue class. An optional pre-clustering step <br>ensures that predicted tissue areas align smoothly <br>with the image contents (no checkerboard). 
<br>This approach allows an entire slide <br>to be analyzed quickly, typically in less <br>than 5 minutes (depending on the GPU), <br>but it also means the app is not well suited <br>to segmenting fine structures such as vessels <br>or cells that are smaller than a single patch. &nbsp;<br><br>The app is AI-based and utilizes \u201cFew Shot <br>Learning\u201d, more precisely ProtoNets. The <br>ProtoNet backbone-AI has already been <br>pre-trained on millions of H&amp;E histology <br>patches. The training was conducted in a <br>special way, improving generalizability to <br>other use cases.<\/td><\/tr><tr><td>Speed<\/td><td>Seconds per field of view. <br>Typically &lt; 5 minutes per whole slide <br>(depending on the GPU). <br>When no CUDA-enabled GPU is <br>available, the AI will run on the CPU, <br>which is slower.<\/td><\/tr><tr><td>Use cases<\/td><td>&#8211; Detect and measure areas of interest, <br>such as tumors or metastases<br>&#8211; Create regions of interest (e.g., tumor) <br>for downstream analysis (e.g., cell <br>detection within tumor)\u2026<\/td><\/tr><\/tbody><\/table><\/figure>\n\n\n\n<p><strong>DIY &#8212; do it yourself!<\/strong> Train your own classifier on your data in four simple steps.<\/p>\n\n\n\n<ol style=\"list-style-type:1\" class=\"wp-block-list\">\n<li>Define the names of the tissue classes you want to distinguish.<\/li>\n\n\n\n<li>Annotate some typical regions for these classes in one or multiple slides. Since the AI has been pre-trained on histology images, this step does not require many annotations. Then adapt the pre-trained AI to your use case by analyzing the annotated regions.<\/li>\n\n\n\n<li>Now apply your own classifier to new, unseen regions or slides. If you are not happy with the accuracy yet, go back to step 2 and annotate falsely classified regions. The app will iteratively guide you to where annotations are needed.<\/li>\n\n\n\n<li>Enter a description text, pick an image or icon, and add the trained AI as a new app to the app center. 
It can now be used by you or your colleagues. You can even share it with other MIKAIA<sup>\u00ae<\/sup> users.<\/li>\n<\/ol>\n\n\n\n<h2 class=\"wp-block-heading\">Video<\/h2>\n\n\n\n<figure class=\"wp-block-embed is-type-video is-provider-youtube wp-block-embed-youtube wp-embed-aspect-16-9 wp-has-aspect-ratio\"><div class=\"wp-block-embed__wrapper\">\n<iframe title=\"MICAIA\u00ae - AI Authoring App\" width=\"770\" height=\"433\" src=\"https:\/\/www.youtube.com\/embed\/7KzxUABvmoU?list=PL1xbox4kZP0OL7PskwG_xYgmu9iV_N5Fd\" frameborder=\"0\" allow=\"accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share\" referrerpolicy=\"strict-origin-when-cross-origin\" allowfullscreen><\/iframe>\n<\/div><\/figure>\n\n\n\n<h2 class=\"wp-block-heading\">Step by Step Usage<\/h2>\n\n\n\n<p>In this application note, we demonstrate how the <strong>AI Author App<\/strong> can be employed in a common use case. <\/p>\n\n\n\n<h3 class=\"wp-block-heading\">Configuration<\/h3>\n\n\n\n<p>The workflow starts by selecting the <strong>AI Author App<\/strong> in the app center and creating a new AI. 
Then define a number of tissue classes according to your needs, e.g., three different classes that distinguish blood (red), inflammation (blue), and muscle and connective tissue (yellow), respectively.<\/p>\n\n\n\n<figure class=\"wp-block-image size-full\"><img decoding=\"async\" width=\"1675\" height=\"908\" src=\"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2023\/06\/1.png\" alt=\"\" class=\"wp-image-404\" srcset=\"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2023\/06\/1.png 1675w, https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2023\/06\/1-300x163.png 300w, https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2023\/06\/1-1024x555.png 1024w, https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2023\/06\/1-768x416.png 768w, https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2023\/06\/1-1536x833.png 1536w, https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2023\/06\/1-370x201.png 370w, https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2023\/06\/1-270x146.png 270w, https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2023\/06\/1-570x309.png 570w, https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2023\/06\/1-740x401.png 740w\" sizes=\"(max-width: 1675px) 100vw, 1675px\" \/><\/figure>\n\n\n\n<h3 class=\"wp-block-heading\">AI Training<\/h3>\n\n\n\n<p>Next, train the new AI by creating some training annotations: outline a few prototypical regions for each class. These do not necessarily have to be very accurate, but they should be specific to the tissue and must contain only the targeted tissue type. If the tissue varies in appearance, create multiple annotations for this class to train the AI sufficiently. 
It is best not to combine multiple appearances of a single tissue class in a single annotation. Finally, click \u201cTrain\u201d to trigger the patch-by-patch analysis of your training annotations. This step is fast, since it does not actually involve re-training: the AI learns a mathematical representation of each tissue type (i.e., one or multiple &#8220;prototypes&#8221; per class that represent the class in the feature space, a.k.a. the embedding).<\/p>\n\n\n\n<figure class=\"wp-block-image size-full\"><img decoding=\"async\" width=\"1674\" height=\"910\" src=\"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2023\/06\/2-3.png\" alt=\"Screenshot of the MICAIA AI Authoring App\" class=\"wp-image-405\" srcset=\"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2023\/06\/2-3.png 1674w, https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2023\/06\/2-3-300x163.png 300w, https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2023\/06\/2-3-1024x557.png 1024w, https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2023\/06\/2-3-768x417.png 768w, https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2023\/06\/2-3-1536x835.png 1536w, https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2023\/06\/2-3-370x201.png 370w, https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2023\/06\/2-3-270x147.png 270w, https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2023\/06\/2-3-570x310.png 570w, https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2023\/06\/2-3-740x402.png 740w\" sizes=\"(max-width: 1674px) 100vw, 1674px\" \/><\/figure>\n\n\n\n<p>To test the performance of the new AI, manually define an ROI that contains all tissue types but was not annotated before. 
<\/p>\n\n\n\n<figure class=\"wp-block-image size-full\"><img decoding=\"async\" width=\"1675\" height=\"910\" src=\"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2023\/06\/3-1.png\" alt=\"Screenshot of the MICAIA AI Authoring App\" class=\"wp-image-406\" srcset=\"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2023\/06\/3-1.png 1675w, https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2023\/06\/3-1-300x163.png 300w, https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2023\/06\/3-1-1024x556.png 1024w, https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2023\/06\/3-1-768x417.png 768w, https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2023\/06\/3-1-1536x834.png 1536w, https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2023\/06\/3-1-370x201.png 370w, https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2023\/06\/3-1-270x147.png 270w, https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2023\/06\/3-1-570x310.png 570w, https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2023\/06\/3-1-740x402.png 740w\" sizes=\"(max-width: 1675px) 100vw, 1675px\" \/><\/figure>\n\n\n\n<p>After the ROI is analyzed patch by patch, evaluate the quality of the result. Since we trained the AI to cope with color differences induced by different scanners, staining protocols, or tissue thickness, it pays more attention to tissue texture than to tissue color. In the example below, a misclassification occurs: a region is classified as the red class (blood) but should be assigned to the yellow class (muscle and connective tissue). 
<\/p>\n\n\n<div id='gallery-1' class='gallery galleryid-386 gallery-columns-3 gallery-size-gridlove-single'><figure class='gallery-item'>\n\t\t\t<div class='gallery-icon landscape'>\n\t\t\t\t<a class=\"gridlove-popup\" href='https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2023\/06\/4-1.png'><img decoding=\"async\" width=\"740\" height=\"401\" src=\"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2023\/06\/4-1-740x401.png\" class=\"attachment-gridlove-single size-gridlove-single\" alt=\"\" srcset=\"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2023\/06\/4-1-740x401.png 740w, https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2023\/06\/4-1-300x162.png 300w, https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2023\/06\/4-1-1024x554.png 1024w, https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2023\/06\/4-1-768x416.png 768w, https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2023\/06\/4-1-1536x832.png 1536w, https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2023\/06\/4-1-370x200.png 370w, https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2023\/06\/4-1-270x146.png 270w, https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2023\/06\/4-1-570x309.png 570w, https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2023\/06\/4-1.png 1675w\" sizes=\"(max-width: 740px) 100vw, 740px\" \/><\/a>\n\t\t\t<\/div><\/figure><figure class='gallery-item'>\n\t\t\t<div class='gallery-icon landscape'>\n\t\t\t\t<a class=\"gridlove-popup\" href='https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2023\/06\/5-1.png'><img decoding=\"async\" width=\"740\" height=\"402\" 
src=\"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2023\/06\/5-1-740x402.png\" class=\"attachment-gridlove-single size-gridlove-single\" alt=\"\" srcset=\"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2023\/06\/5-1-740x402.png 740w, https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2023\/06\/5-1-300x163.png 300w, https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2023\/06\/5-1-1024x556.png 1024w, https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2023\/06\/5-1-768x417.png 768w, https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2023\/06\/5-1-1536x834.png 1536w, https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2023\/06\/5-1-370x201.png 370w, https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2023\/06\/5-1-270x147.png 270w, https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2023\/06\/5-1-570x309.png 570w, https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2023\/06\/5-1.png 1673w\" sizes=\"(max-width: 740px) 100vw, 740px\" \/><\/a>\n\t\t\t<\/div><\/figure>\n\t\t<\/div>\n\n\n\n<p>This is understandable, since the misclassified region has a similar texture to the <em>blood<\/em> class and none of the yellow class\u2019s training annotations contained a similar-looking region. Also, the AI backbone was trained to pay more attention to texture than to color in order to better cope with real-world data heterogeneity caused by (1) unstandardized staining protocols, (2) varying tissue thickness, and (3) different scanners with different optics, cameras, and color post-processing. <\/p>\n\n\n\n<p>If the analysis result contains mistakes (mispredictions), do not worry. 
Simply add an additional \u201ccorrective\u201d training annotation to teach the AI that the tissue in the misclassified area actually belongs to the yellow class.<\/p>\n\n\n\n<figure class=\"wp-block-image size-full\"><img decoding=\"async\" width=\"1674\" height=\"912\" src=\"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2023\/06\/6-2.png\" alt=\"Screenshot of performance evaluation of the MICAIA AI Authoring App\" class=\"wp-image-410\" srcset=\"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2023\/06\/6-2.png 1674w, https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2023\/06\/6-2-300x163.png 300w, https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2023\/06\/6-2-1024x558.png 1024w, https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2023\/06\/6-2-768x418.png 768w, https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2023\/06\/6-2-1536x837.png 1536w, https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2023\/06\/6-2-370x202.png 370w, https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2023\/06\/6-2-270x147.png 270w, https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2023\/06\/6-2-570x311.png 570w, https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2023\/06\/6-2-740x403.png 740w\" sizes=\"(max-width: 1674px) 100vw, 1674px\" \/><\/figure>\n\n\n\n<p>Afterwards, evaluate the performance again. This way, you can iteratively add new training annotations. 
<\/p>\n\n\n\n<p>In fact, the AI Author App can be regarded as a <strong>guided annotation tool<\/strong> (similar to Active Learning): a user can spot misclassifications early on and then add annotations where they are necessary and actually help the model, instead of wasting time adding many annotations in places that do not really help the AI.<\/p>\n\n\n\n<figure class=\"wp-block-image size-full\"><img decoding=\"async\" width=\"1675\" height=\"910\" src=\"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2023\/06\/7-2.png\" alt=\"\" class=\"wp-image-411\" srcset=\"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2023\/06\/7-2.png 1675w, https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2023\/06\/7-2-300x163.png 300w, https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2023\/06\/7-2-1024x556.png 1024w, https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2023\/06\/7-2-768x417.png 768w, https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2023\/06\/7-2-1536x834.png 1536w, https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2023\/06\/7-2-370x201.png 370w, https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2023\/06\/7-2-270x147.png 270w, https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2023\/06\/7-2-570x310.png 570w, https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2023\/06\/7-2-740x402.png 740w\" sizes=\"(max-width: 1675px) 100vw, 1675px\" \/><\/figure>\n\n\n\n<h3 class=\"wp-block-heading\">Add your AI to the App Center<\/h3>\n\n\n\n<p>When you are happy with the result, the trained AI is ready to be used.<\/p>\n\n\n\n<p>After selecting an icon and adding a suitable description text, toggle the \u201cShow in App Center\u201d switch to show your custom AI as a new app in 
the MIKAIA<sup>\u00ae<\/sup> App Center. You can even export your AI model and share it with other MIKAIA<sup>\u00ae<\/sup> users.<\/p>\n\n\n<div id='gallery-2' class='gallery galleryid-386 gallery-columns-3 gallery-size-gridlove-single'><figure class='gallery-item'>\n\t\t\t<div class='gallery-icon landscape'>\n\t\t\t\t<a class=\"gridlove-popup\" href='https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2023\/06\/8-1.png'><img decoding=\"async\" width=\"740\" height=\"402\" src=\"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2023\/06\/8-1-740x402.png\" class=\"attachment-gridlove-single size-gridlove-single\" alt=\"\" srcset=\"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2023\/06\/8-1-740x402.png 740w, https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2023\/06\/8-1-300x163.png 300w, https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2023\/06\/8-1-1024x556.png 1024w, https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2023\/06\/8-1-768x417.png 768w, https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2023\/06\/8-1-1536x833.png 1536w, https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2023\/06\/8-1-370x201.png 370w, https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2023\/06\/8-1-270x147.png 270w, https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2023\/06\/8-1-570x309.png 570w, https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2023\/06\/8-1.png 1677w\" sizes=\"(max-width: 740px) 100vw, 740px\" \/><\/a>\n\t\t\t<\/div><\/figure><figure class='gallery-item'>\n\t\t\t<div class='gallery-icon landscape'>\n\t\t\t\t<a class=\"gridlove-popup\" href='https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2023\/06\/9.png'><img 
decoding=\"async\" width=\"740\" height=\"403\" src=\"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2023\/06\/9-740x403.png\" class=\"attachment-gridlove-single size-gridlove-single\" alt=\"\" srcset=\"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2023\/06\/9-740x403.png 740w, https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2023\/06\/9-300x163.png 300w, https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2023\/06\/9-1024x557.png 1024w, https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2023\/06\/9-768x418.png 768w, https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2023\/06\/9-1536x836.png 1536w, https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2023\/06\/9-370x201.png 370w, https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2023\/06\/9-270x147.png 270w, https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2023\/06\/9-570x310.png 570w, https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2023\/06\/9.png 1674w\" sizes=\"(max-width: 740px) 100vw, 740px\" \/><\/a>\n\t\t\t<\/div><\/figure>\n\t\t<\/div>\n\n\n\n<h3 class=\"wp-block-heading\">Batch Analysis<\/h3>\n\n\n\n<p>As with most other apps, the <strong>AI Author App<\/strong> supports analyzing single ROIs or slides as well as entire datasets. Simply import the folder (or folders) containing your dataset into the \u201cSlides\u201d pane. Then multi-select some or all slides. The analyze \u201cSlide\u201d button will then turn into an analyze \u201cBatch\u201d button. Click it to submit an analysis job to the job queue for each selected slide. 
Now lean back and watch while MIKAIA<sup>\u00ae<\/sup> loads and analyzes one slide after the other.<\/p>\n\n\n\n<p>Also read a separate app note that focuses on running an AI batch analysis: <a href=\"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/digital-pathology-batch-analysis-with-mikaia\/\">MIKAIA<sup>\u00ae<\/sup><\/a> <a href=\"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/digital-pathology-batch-analysis-with-mikaia\/\">AI Author: Create new AI and batch-analyze entire dataset<\/a>. Here is the video tutorial from that app note:<\/p>\n\n\n\n<figure class=\"wp-block-video\"><video height=\"1080\" style=\"aspect-ratio: 1920 \/ 1080;\" width=\"1920\" autoplay controls muted src=\"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2024\/12\/AI-Author-batch-analysis-no-sound.mp4\"><\/video><\/figure>\n","protected":false},"excerpt":{"rendered":"<p>MIKAIA\u00ae University Application Note: Interactively training an AI with the AI Authoring App &#8212; programming-free and from just a few example annotations By design, researchers often have unique questions that they want to answer as quickly and efficiently as possible. 
The MIKAIA\u00ae team will never be able to create dedicated AI Apps for all these [&hellip;]<\/p>\n","protected":false},"author":6,"featured_media":402,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[3,28],"tags":[37,86,7,29,111],"coauthors":[54],"class_list":["post-386","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-digital-pathology","category-mikaia-university","tag-ai","tag-he","tag-mikaia","tag-mikaia-app-note","tag-workflow"],"yoast_head":"<!-- This site is optimized with the Yoast SEO plugin v27.6 - https:\/\/yoast.com\/product\/yoast-seo-wordpress\/ -->\n<title>MIKAIA AI Authoring App - SMART SENSING insights<\/title>\n<meta name=\"description\" content=\"Discover easy AI training using MIKAIA AI Authoring App. No programming required, just a few example annotations are all you need.\" \/>\n<meta name=\"robots\" content=\"index, follow, max-snippet:-1, max-image-preview:large, max-video-preview:-1\" \/>\n<link rel=\"canonical\" href=\"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/mikaia-ai-authoring-app\/\" \/>\n<meta property=\"og:locale\" content=\"en_US\" \/>\n<meta property=\"og:type\" content=\"article\" \/>\n<meta property=\"og:title\" content=\"MIKAIA AI Authoring App - SMART SENSING insights\" \/>\n<meta property=\"og:description\" content=\"Discover easy AI training using MIKAIA AI Authoring App. 
No programming required, just a few example annotations are all you need.\" \/>\n<meta property=\"og:url\" content=\"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/mikaia-ai-authoring-app\/\" \/>\n<meta property=\"og:site_name\" content=\"SMART SENSING insights\" \/>\n<meta property=\"article:publisher\" content=\"https:\/\/www.facebook.com\/FraunhoferIIS\" \/>\n<meta property=\"article:published_time\" content=\"2023-06-13T19:45:09+00:00\" \/>\n<meta property=\"article:modified_time\" content=\"2025-10-14T09:16:59+00:00\" \/>\n<meta property=\"og:image\" content=\"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2023\/06\/8-1.png\" \/>\n\t<meta property=\"og:image:width\" content=\"1677\" \/>\n\t<meta property=\"og:image:height\" content=\"910\" \/>\n\t<meta property=\"og:image:type\" content=\"image\/png\" \/>\n<meta name=\"author\" content=\"Nathalie Falk\" \/>\n<meta name=\"twitter:card\" content=\"summary_large_image\" \/>\n<meta name=\"twitter:label1\" content=\"Written by\" \/>\n\t<meta name=\"twitter:data1\" content=\"Nathalie Falk\" \/>\n\t<meta name=\"twitter:label2\" content=\"Est. 
reading time\" \/>\n\t<meta name=\"twitter:data2\" content=\"8 minutes\" \/>\n<script type=\"application\/ld+json\" class=\"yoast-schema-graph\">{\"@context\":\"https:\\\/\\\/schema.org\",\"@graph\":[{\"@type\":\"Article\",\"@id\":\"https:\\\/\\\/websites.fraunhofer.de\\\/smart-sensing-insights\\\/mikaia-ai-authoring-app\\\/#article\",\"isPartOf\":{\"@id\":\"https:\\\/\\\/websites.fraunhofer.de\\\/smart-sensing-insights\\\/mikaia-ai-authoring-app\\\/\"},\"author\":{\"name\":\"Nathalie Falk\",\"@id\":\"https:\\\/\\\/websites.fraunhofer.de\\\/smart-sensing-insights\\\/#\\\/schema\\\/person\\\/bc04708adb211585b9566ee709c2bcec\"},\"headline\":\"MIKAIA AI Authoring App\",\"datePublished\":\"2023-06-13T19:45:09+00:00\",\"dateModified\":\"2025-10-14T09:16:59+00:00\",\"mainEntityOfPage\":{\"@id\":\"https:\\\/\\\/websites.fraunhofer.de\\\/smart-sensing-insights\\\/mikaia-ai-authoring-app\\\/\"},\"wordCount\":1344,\"commentCount\":0,\"publisher\":{\"@id\":\"https:\\\/\\\/websites.fraunhofer.de\\\/smart-sensing-insights\\\/#organization\"},\"image\":{\"@id\":\"https:\\\/\\\/websites.fraunhofer.de\\\/smart-sensing-insights\\\/mikaia-ai-authoring-app\\\/#primaryimage\"},\"thumbnailUrl\":\"https:\\\/\\\/websites.fraunhofer.de\\\/smart-sensing-insights\\\/wp-content\\\/uploads\\\/2023\\\/06\\\/8-1.png\",\"keywords\":[\"AI\",\"H&amp;E\",\"MIKAIA\u00ae\",\"MIKAIA\u00ae App Note\",\"Workflow\"],\"articleSection\":[\"Digital Pathology\",\"MIKAIA University\"],\"inLanguage\":\"en-US\",\"potentialAction\":[{\"@type\":\"CommentAction\",\"name\":\"Comment\",\"target\":[\"https:\\\/\\\/websites.fraunhofer.de\\\/smart-sensing-insights\\\/mikaia-ai-authoring-app\\\/#respond\"]}]},{\"@type\":\"WebPage\",\"@id\":\"https:\\\/\\\/websites.fraunhofer.de\\\/smart-sensing-insights\\\/mikaia-ai-authoring-app\\\/\",\"url\":\"https:\\\/\\\/websites.fraunhofer.de\\\/smart-sensing-insights\\\/mikaia-ai-authoring-app\\\/\",\"name\":\"MIKAIA AI Authoring App - SMART SENSING 
insights\",\"isPartOf\":{\"@id\":\"https:\\\/\\\/websites.fraunhofer.de\\\/smart-sensing-insights\\\/#website\"},\"primaryImageOfPage\":{\"@id\":\"https:\\\/\\\/websites.fraunhofer.de\\\/smart-sensing-insights\\\/mikaia-ai-authoring-app\\\/#primaryimage\"},\"image\":{\"@id\":\"https:\\\/\\\/websites.fraunhofer.de\\\/smart-sensing-insights\\\/mikaia-ai-authoring-app\\\/#primaryimage\"},\"thumbnailUrl\":\"https:\\\/\\\/websites.fraunhofer.de\\\/smart-sensing-insights\\\/wp-content\\\/uploads\\\/2023\\\/06\\\/8-1.png\",\"datePublished\":\"2023-06-13T19:45:09+00:00\",\"dateModified\":\"2025-10-14T09:16:59+00:00\",\"description\":\"Discover easy AI training using MIKAIA AI Authoring App. No programming required, just a few example annotations are all you need.\",\"breadcrumb\":{\"@id\":\"https:\\\/\\\/websites.fraunhofer.de\\\/smart-sensing-insights\\\/mikaia-ai-authoring-app\\\/#breadcrumb\"},\"inLanguage\":\"en-US\",\"potentialAction\":[{\"@type\":\"ReadAction\",\"target\":[\"https:\\\/\\\/websites.fraunhofer.de\\\/smart-sensing-insights\\\/mikaia-ai-authoring-app\\\/\"]}]},{\"@type\":\"ImageObject\",\"inLanguage\":\"en-US\",\"@id\":\"https:\\\/\\\/websites.fraunhofer.de\\\/smart-sensing-insights\\\/mikaia-ai-authoring-app\\\/#primaryimage\",\"url\":\"https:\\\/\\\/websites.fraunhofer.de\\\/smart-sensing-insights\\\/wp-content\\\/uploads\\\/2023\\\/06\\\/8-1.png\",\"contentUrl\":\"https:\\\/\\\/websites.fraunhofer.de\\\/smart-sensing-insights\\\/wp-content\\\/uploads\\\/2023\\\/06\\\/8-1.png\",\"width\":1677,\"height\":910},{\"@type\":\"BreadcrumbList\",\"@id\":\"https:\\\/\\\/websites.fraunhofer.de\\\/smart-sensing-insights\\\/mikaia-ai-authoring-app\\\/#breadcrumb\",\"itemListElement\":[{\"@type\":\"ListItem\",\"position\":1,\"name\":\"Home\",\"item\":\"https:\\\/\\\/websites.fraunhofer.de\\\/smart-sensing-insights\\\/\"},{\"@type\":\"ListItem\",\"position\":2,\"name\":\"MIKAIA AI Authoring 
App\"}]},{\"@type\":\"WebSite\",\"@id\":\"https:\\\/\\\/websites.fraunhofer.de\\\/smart-sensing-insights\\\/#website\",\"url\":\"https:\\\/\\\/websites.fraunhofer.de\\\/smart-sensing-insights\\\/\",\"name\":\"SMART SENSING insights\",\"description\":\"learn more about our focus research areas sensor technology, electronics, and artificial intelligence\",\"publisher\":{\"@id\":\"https:\\\/\\\/websites.fraunhofer.de\\\/smart-sensing-insights\\\/#organization\"},\"potentialAction\":[{\"@type\":\"SearchAction\",\"target\":{\"@type\":\"EntryPoint\",\"urlTemplate\":\"https:\\\/\\\/websites.fraunhofer.de\\\/smart-sensing-insights\\\/?s={search_term_string}\"},\"query-input\":{\"@type\":\"PropertyValueSpecification\",\"valueRequired\":true,\"valueName\":\"search_term_string\"}}],\"inLanguage\":\"en-US\"},{\"@type\":\"Organization\",\"@id\":\"https:\\\/\\\/websites.fraunhofer.de\\\/smart-sensing-insights\\\/#organization\",\"name\":\"Fraunhofer IIS\",\"url\":\"https:\\\/\\\/websites.fraunhofer.de\\\/smart-sensing-insights\\\/\",\"logo\":{\"@type\":\"ImageObject\",\"inLanguage\":\"en-US\",\"@id\":\"https:\\\/\\\/websites.fraunhofer.de\\\/smart-sensing-insights\\\/#\\\/schema\\\/logo\\\/image\\\/\",\"url\":\"https:\\\/\\\/websites.fraunhofer.de\\\/smart-sensing-insights\\\/wp-content\\\/uploads\\\/2023\\\/06\\\/Fraunhofer-IIS-1.png\",\"contentUrl\":\"https:\\\/\\\/websites.fraunhofer.de\\\/smart-sensing-insights\\\/wp-content\\\/uploads\\\/2023\\\/06\\\/Fraunhofer-IIS-1.png\",\"width\":826,\"height\":299,\"caption\":\"Fraunhofer IIS\"},\"image\":{\"@id\":\"https:\\\/\\\/websites.fraunhofer.de\\\/smart-sensing-insights\\\/#\\\/schema\\\/logo\\\/image\\\/\"},\"sameAs\":[\"https:\\\/\\\/www.facebook.com\\\/FraunhoferIIS\",\"https:\\\/\\\/www.linkedin.com\\\/company\\\/fraunhofer-iis\"]},{\"@type\":\"Person\",\"@id\":\"https:\\\/\\\/websites.fraunhofer.de\\\/smart-sensing-insights\\\/#\\\/schema\\\/person\\\/bc04708adb211585b9566ee709c2bcec\",\"name\":\"Nathalie 
Falk\",\"image\":{\"@type\":\"ImageObject\",\"inLanguage\":\"en-US\",\"@id\":\"https:\\\/\\\/websites.fraunhofer.de\\\/smart-sensing-insights\\\/wp-content\\\/uploads\\\/2023\\\/10\\\/cropped-J43A0807-scaled-1-96x96.jpg0eaa6499419f8eebfa260a5a88fd13b6\",\"url\":\"https:\\\/\\\/websites.fraunhofer.de\\\/smart-sensing-insights\\\/wp-content\\\/uploads\\\/2023\\\/10\\\/cropped-J43A0807-scaled-1-96x96.jpg\",\"contentUrl\":\"https:\\\/\\\/websites.fraunhofer.de\\\/smart-sensing-insights\\\/wp-content\\\/uploads\\\/2023\\\/10\\\/cropped-J43A0807-scaled-1-96x96.jpg\",\"caption\":\"Nathalie Falk\"},\"description\":\"Nathalie is a molecular biologist with a PhD in neurobiology and postdoctoral experience in immunology and nephrology. After changing fields, she is currently studying psychology and gaining experience in content creation at Fraunhofer IIS.\",\"url\":\"https:\\\/\\\/websites.fraunhofer.de\\\/smart-sensing-insights\\\/author\\\/nathalie-falkiis-fraunhofer-de\\\/\"}]}<\/script>\n<!-- \/ Yoast SEO plugin. -->","yoast_head_json":{"title":"MIKAIA AI Authoring App - SMART SENSING insights","description":"Discover easy AI training using MIKAIA AI Authoring App. No programming required, just a few example annotations are all you need.","robots":{"index":"index","follow":"follow","max-snippet":"max-snippet:-1","max-image-preview":"max-image-preview:large","max-video-preview":"max-video-preview:-1"},"canonical":"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/mikaia-ai-authoring-app\/","og_locale":"en_US","og_type":"article","og_title":"MIKAIA AI Authoring App - SMART SENSING insights","og_description":"Discover easy AI training using MIKAIA AI Authoring App. 
No programming required, just a few example annotations are all you need.","og_url":"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/mikaia-ai-authoring-app\/","og_site_name":"SMART SENSING insights","article_publisher":"https:\/\/www.facebook.com\/FraunhoferIIS","article_published_time":"2023-06-13T19:45:09+00:00","article_modified_time":"2025-10-14T09:16:59+00:00","og_image":[{"width":1677,"height":910,"url":"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2023\/06\/8-1.png","type":"image\/png"}],"author":"Nathalie Falk","twitter_card":"summary_large_image","twitter_misc":{"Written by":"Nathalie Falk","Est. reading time":"8 minutes"},"schema":{"@context":"https:\/\/schema.org","@graph":[{"@type":"Article","@id":"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/mikaia-ai-authoring-app\/#article","isPartOf":{"@id":"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/mikaia-ai-authoring-app\/"},"author":{"name":"Nathalie Falk","@id":"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/#\/schema\/person\/bc04708adb211585b9566ee709c2bcec"},"headline":"MIKAIA AI Authoring App","datePublished":"2023-06-13T19:45:09+00:00","dateModified":"2025-10-14T09:16:59+00:00","mainEntityOfPage":{"@id":"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/mikaia-ai-authoring-app\/"},"wordCount":1344,"commentCount":0,"publisher":{"@id":"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/#organization"},"image":{"@id":"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/mikaia-ai-authoring-app\/#primaryimage"},"thumbnailUrl":"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2023\/06\/8-1.png","keywords":["AI","H&amp;E","MIKAIA\u00ae","MIKAIA\u00ae App Note","Workflow"],"articleSection":["Digital Pathology","MIKAIA 
University"],"inLanguage":"en-US","potentialAction":[{"@type":"CommentAction","name":"Comment","target":["https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/mikaia-ai-authoring-app\/#respond"]}]},{"@type":"WebPage","@id":"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/mikaia-ai-authoring-app\/","url":"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/mikaia-ai-authoring-app\/","name":"MIKAIA AI Authoring App - SMART SENSING insights","isPartOf":{"@id":"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/#website"},"primaryImageOfPage":{"@id":"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/mikaia-ai-authoring-app\/#primaryimage"},"image":{"@id":"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/mikaia-ai-authoring-app\/#primaryimage"},"thumbnailUrl":"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2023\/06\/8-1.png","datePublished":"2023-06-13T19:45:09+00:00","dateModified":"2025-10-14T09:16:59+00:00","description":"Discover easy AI training using MIKAIA AI Authoring App. 
No programming required, just a few example annotations are all you need.","breadcrumb":{"@id":"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/mikaia-ai-authoring-app\/#breadcrumb"},"inLanguage":"en-US","potentialAction":[{"@type":"ReadAction","target":["https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/mikaia-ai-authoring-app\/"]}]},{"@type":"ImageObject","inLanguage":"en-US","@id":"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/mikaia-ai-authoring-app\/#primaryimage","url":"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2023\/06\/8-1.png","contentUrl":"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2023\/06\/8-1.png","width":1677,"height":910},{"@type":"BreadcrumbList","@id":"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/mikaia-ai-authoring-app\/#breadcrumb","itemListElement":[{"@type":"ListItem","position":1,"name":"Home","item":"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/"},{"@type":"ListItem","position":2,"name":"MIKAIA AI Authoring App"}]},{"@type":"WebSite","@id":"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/#website","url":"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/","name":"SMART SENSING insights","description":"learn more about our focus research areas sensor technology, electronics, and artificial intelligence","publisher":{"@id":"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/#organization"},"potentialAction":[{"@type":"SearchAction","target":{"@type":"EntryPoint","urlTemplate":"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/?s={search_term_string}"},"query-input":{"@type":"PropertyValueSpecification","valueRequired":true,"valueName":"search_term_string"}}],"inLanguage":"en-US"},{"@type":"Organization","@id":"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/#organization","name":"Fraunhofer 
IIS","url":"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/","logo":{"@type":"ImageObject","inLanguage":"en-US","@id":"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/#\/schema\/logo\/image\/","url":"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2023\/06\/Fraunhofer-IIS-1.png","contentUrl":"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2023\/06\/Fraunhofer-IIS-1.png","width":826,"height":299,"caption":"Fraunhofer IIS"},"image":{"@id":"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/#\/schema\/logo\/image\/"},"sameAs":["https:\/\/www.facebook.com\/FraunhoferIIS","https:\/\/www.linkedin.com\/company\/fraunhofer-iis"]},{"@type":"Person","@id":"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/#\/schema\/person\/bc04708adb211585b9566ee709c2bcec","name":"Nathalie Falk","image":{"@type":"ImageObject","inLanguage":"en-US","@id":"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2023\/10\/cropped-J43A0807-scaled-1-96x96.jpg0eaa6499419f8eebfa260a5a88fd13b6","url":"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2023\/10\/cropped-J43A0807-scaled-1-96x96.jpg","contentUrl":"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-content\/uploads\/2023\/10\/cropped-J43A0807-scaled-1-96x96.jpg","caption":"Nathalie Falk"},"description":"Nathalie is a molecular biologist with a PhD in neurobiology and postdoctoral experience in immunology and nephrology. 
After changing fields, she is currently studying psychology and gaining experience in content creation at Fraunhofer IIS.","url":"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/author\/nathalie-falkiis-fraunhofer-de\/"}]}},"_links":{"self":[{"href":"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-json\/wp\/v2\/posts\/386","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-json\/wp\/v2\/users\/6"}],"replies":[{"embeddable":true,"href":"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-json\/wp\/v2\/comments?post=386"}],"version-history":[{"count":15,"href":"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-json\/wp\/v2\/posts\/386\/revisions"}],"predecessor-version":[{"id":3098,"href":"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-json\/wp\/v2\/posts\/386\/revisions\/3098"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-json\/wp\/v2\/media\/402"}],"wp:attachment":[{"href":"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-json\/wp\/v2\/media?parent=386"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-json\/wp\/v2\/categories?post=386"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-json\/wp\/v2\/tags?post=386"},{"taxonomy":"author","embeddable":true,"href":"https:\/\/websites.fraunhofer.de\/smart-sensing-insights\/wp-json\/wp\/v2\/coauthors?post=386"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}