Trust and Machine Learning
The trustworthiness of ML methods has recently come into focus, not only because of the ongoing ChatGPT hype, but also because of algorithmic mishaps of self-driving cars, legislative attempts by the EU, and the general sense that there has been a breakthrough in the field.
Trust plays a central role in evaluating the possible consequences of ML methods.
This summer school offers three one-day sessions, each engaging with a different aspect of trust and ML.
The first session is a hands-on tutorial in the HLRS training center, where participants will learn how to code a simple ML application. This session will be taught by the HLRS training staff.
The second session is dedicated to getting a grasp of the ethical and normative aspects of trust in ML contexts. There is a rich literature on trust concepts to be navigated, and this session gives participants the means to select and apply a fitting trust concept for their ML problem. It is taught by Andreas Kaminski and Sebastian Hallensleben.
The third session discusses epistemic aspects of trust. It connects trust to inductive problems and shows how solutions to problems of epistemic trust in inductive methods transfer to ML contexts. The session is taught by Nic Fillion and Tom Sterkenburg.
Furthermore, there will be an opportunity for participants to present their current work on trust and ML in an evening session. Should you want to present your work, please indicate this in your application and attach an abstract of at most 500 words.
The three-day summer school will take place from July 26th to July 28th at HLRS in Stuttgart. There are no fees.
To apply, please send an email with a short statement of why you want to join and a CV to [email protected] no later than June 1, 2023 (12:00am CET). Notifications will be sent by mid-June.
This is a student event (e.g. a graduate conference).