The System and Software Security (S3) group (https://www.cs.manchester.ac.uk/research/expertise/systems-and-software-security/) at the University of Manchester (UK) is looking for a Post-Doc in Secure and Verifiable AI Models to join our ambitious EPSRC-funded EnnCore project (https://enncore.github.io/).
The successful candidate will enjoy designing, developing, and evaluating novel AI models that are secure and robust against attacks. The project involves continuous interaction with experts in explainable AI, software testing, and formal software verification. In addition, you will have the opportunity to collaborate with researchers from our Arm Centre of Excellence (https://www.cs.manchester.ac.uk/arm-coe/).
More information about the position and application guidelines can be found here: https://www.jobs.manchester.ac.uk/displayjob.aspx?jobid=22435
The S3 group conducts world-leading research in explainable AI, automated software verification, and software testing. It also develops award-winning software verification and testing tools and regularly wins prizes at international competitions.