
Artificial Intelligence
Research Group
The Junior Research Group on Correctable Hybrid Artificial Intelligence focuses on making AI systems more verifiable, explainable, and correctable, especially in safety-critical settings. It develops methods to combine symbolic, human-readable knowledge (such as rules) with data-driven models such as deep neural networks, so that the integrated system can be checked and adjusted reliably. The group works on explainable AI techniques, formal guarantees, and hybrid approaches that embed symbolic knowledge into machine-learned models. It is led by Dr. Gesina Schwalbe and includes PhD researchers and assistants working on different points of intervention between neural networks and symbolic knowledge representation.
Group Members
| Name | Email | Phone |
|------|-------|-------|
| Diedrich Wolter | Diedrich.Wolter@isp.uni-luebeck.de | +49 451 310 16505 |
| Martin Leucker | leucker@isp.uni-luebeck.de | +49 451 310 16500 |
| Max Winkler | max@mxwinkler.de | |
Associated Projects
Group Information
- Members: 3
- Projects: 1