[TYPES/announce] Postdoc positions on ERC project fun2model in 'strong' AI/verification at Oxford

Marta Kwiatkowska marta.kwiatkowska at cs.ox.ac.uk
Thu Jun 6 13:26:11 EDT 2019


[Please forward to anyone interested. Apologies for multiple mailings.]

An exciting opportunity has arisen at the intersection of AI and 
verification: *three postdoctoral positions and two fully funded 
doctoral (DPhil) studentships* are available under the supervision of 
Professor Marta Kwiatkowska on the ERC Advanced Grant project FUN2MODEL 
(www.fun2model.org), to commence in October 2019 or as soon as possible 
thereafter.

The FUN2MODEL "From FUNction-based TO MOdel-based automated 
probabilistic reasoning for DEep Learning" project (www.fun2model.org) 
aims to make advances towards provably robust 'strong' Artificial 
Intelligence. In contrast to 'narrow' AI perception tasks realised by 
deep learning, which are limited to learning data associations, and 
sometimes referred to as function-based, 'strong' AI aims to match human 
intelligence and requires model-based reasoning about causality and 
'what if' scenarios, incorporation of cognitive aspects such as beliefs 
and goals, and probabilistic reasoning frameworks that combine logic 
with statistical machine learning.

The objectives of FUN2MODEL are to develop novel probabilistic 
verification and synthesis techniques to guarantee safety, robustness 
and fairness for complex decisions based on machine learning, formulate 
a comprehensive, compositional game-based modelling framework for 
reasoning about systems of autonomous agents and their interactions, and 
evaluate the techniques on a variety of case studies.

The positions are briefly described below; please follow the links for 
*information about the selection criteria and how to apply*.

*Senior Research Associate on FUN2MODEL, fixed term for 3 years from 
1st October 2019, with the possibility of extension*
*Grade 8: Salary £40,792 – £48,677 p.a. (note: post may be under-filled 
at grade 7: £32,236 – £39,609 p.a.)*

http://www.cs.ox.ac.uk/news/1684-full.html

The successful appointee will be expected to provide overall leadership 
for the development of probabilistic verification and synthesis methods, 
including software implementation and PRISM extensions, with emphasis on 
data-centric modelling, coordination and reasoning for autonomous 
multi-agent systems, capturing cognitive and affective aspects. This 
includes causal reasoning based on Bayesian networks; game-theoretic 
methods and algorithmic schemes for coordination and collaboration; 
formalisation of provably robust and beneficial collaboration; 
extensions of the PRISM modelling language and software; and relevant case 
studies.
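
For readers less familiar with probabilistic verification, here is a 
minimal, self-contained sketch (illustrative only, not project code) 
of the computation at the heart of tools such as PRISM: the 
probability of reaching a target state in a small discrete-time Markov 
chain, obtained by value iteration, as in queries of the form 
P=? [ F fail ]. The model and its numbers are invented.

# Minimal sketch: probabilistic reachability in a discrete-time Markov
# chain, the computation behind PRISM queries of the form P=? [ F fail ].
# Toy model; states and probabilities are invented for illustration.

dtmc = {                                  # state -> [(successor, prob)]
    "init":  [("safe", 0.7), ("risky", 0.3)],
    "risky": [("safe", 0.5), ("fail", 0.5)],
    "safe":  [("safe", 1.0)],
    "fail":  [("fail", 1.0)],
}
target = {"fail"}

def reachability(dtmc, target, max_iters=10000, tol=1e-10):
    """Value iteration for x[s] = P(eventually reach target from s):
    x[s] = 1 if s is a target, else sum over t of P(s, t) * x[t]."""
    x = {s: (1.0 if s in target else 0.0) for s in dtmc}
    for _ in range(max_iters):
        delta = 0.0
        for s in dtmc:
            if s in target:
                continue
            new = sum(p * x[t] for t, p in dtmc[s])
            delta = max(delta, abs(new - x[s]))
            x[s] = new
        if delta < tol:
            break
    return x

print(reachability(dtmc, target)["init"])  # 0.15 = 0.3 * 0.5

PRISM performs such analyses at scale, for richer models (MDPs, 
stochastic games) and full temporal logics; the sketch only conveys 
the flavour of the underlying fixed-point computation.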

*Research Associate post 1 on FUN2MODEL, fixed term for 3 years from 
1st October 2019, with the possibility of extension*
*Grade 7: Salary £32,236 – £39,609 p.a.*

http://www.cs.ox.ac.uk/news/1683-full.html

The successful appointee will be expected to contribute to the 
development of probabilistic verification and synthesis methods, with 
emphasis on developing automated probabilistic verification and 
synthesis methods for machine learning components. This includes 
Bayesian interpretation; provable probabilistic robustness guarantees 
for neural networks; provably correct synthesis for neural networks; 
complex correctness properties for machine learning decisions; software 
implementation; and relevant case studies.
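
As an invented illustration of what a provable robustness guarantee 
for a neural network can look like, here is a minimal sketch using 
interval bound propagation, one standard sound-but-incomplete 
certification technique (chosen for brevity; not necessarily the 
method the project will pursue). The tiny network and its weights are 
made up.

import numpy as np

# Minimal sketch: interval bound propagation (IBP) through a tiny ReLU
# network. Sound but incomplete: if the output interval excludes zero,
# the classification provably cannot flip under the given perturbation.

W1 = np.array([[1.0, -1.0], [0.5, 2.0]]); b1 = np.array([0.0, -0.5])
W2 = np.array([[1.0, -1.0]]);             b2 = np.array([0.2])

def ibp_affine(lo, hi, W, b):
    # Interval arithmetic for an affine layer: split W into +/- parts.
    Wp, Wn = np.maximum(W, 0.0), np.minimum(W, 0.0)
    return Wp @ lo + Wn @ hi + b, Wp @ hi + Wn @ lo + b

def certify(x, eps):
    lo, hi = x - eps, x + eps                          # L-inf ball
    lo, hi = ibp_affine(lo, hi, W1, b1)
    lo, hi = np.maximum(lo, 0.0), np.maximum(hi, 0.0)  # ReLU is monotone
    lo, hi = ibp_affine(lo, hi, W2, b2)
    return lo[0] > 0.0 or hi[0] < 0.0   # output sign provably fixed?

x = np.array([1.0, 0.5])
print(certify(x, 0.05))   # True: robust within radius 0.05
print(certify(x, 1.00))   # False: IBP cannot certify this radius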

*Research Associate post 2 on FUN2MODEL, fixed term for 3 years from 
1st October 2019, with the possibility of extension*
*Grade 7: Salary £32,236 – £39,609 p.a.*

http://www.cs.ox.ac.uk/news/1682-full.html

The successful appointee will be expected to contribute to the 
development of probabilistic verification and synthesis methods, with 
emphasis on developing an algebraic theory of probabilistic components 
amenable to machine learning (ML). This includes study of interfaces and 
algebraic operations for ML components; contract-based probabilistic 
reasoning for ML components; reasoning about complex ML decisions; 
integration with autonomous multi-agent system models and reasoning 
tools; and relevant case studies.
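
Since the algebraic theory is yet to be developed, the following is a 
deliberately rough, invented sketch of one possible reading of 
'probabilistic components with contracts': components as maps from 
inputs to finite output distributions, sequential composition by 
marginalising over the intermediate value, and a contract asserting a 
lower bound on the probability of a desired output. All names and 
numbers are hypothetical.

# Rough illustrative sketch (invented; not the project's theory): a
# probabilistic component maps an input to a distribution {output: prob}.

def classify(x):
    # Component 1: a noisy ML classifier (toy numbers).
    return {"obstacle": 0.9, "clear": 0.1} if x == "blob" else {"clear": 1.0}

def plan(label):
    # Component 2: a planner reacting to the classifier's label.
    return {"brake": 0.95, "cruise": 0.05} if label == "obstacle" else {"cruise": 1.0}

def compose(c1, c2):
    """Sequential composition: run c1, feed its output to c2,
    and marginalise over the intermediate value."""
    def composed(x):
        out = {}
        for mid, p in c1(x).items():
            for y, q in c2(mid).items():
                out[y] = out.get(y, 0.0) + p * q
        return out
    return composed

def meets_contract(component, x, event, bound):
    """Probabilistic contract: P(output in event | input x) >= bound."""
    return sum(p for y, p in component(x).items() if y in event) >= bound

pipeline = compose(classify, plan)
print(pipeline("blob"))                                  # {'brake': 0.855, 'cruise': 0.145}
print(meets_contract(pipeline, "blob", {"brake"}, 0.8))  # True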

The division of responsibilities between the three research posts may be 
adapted following interview depending on the qualifications and 
experience of the candidates.

*2 x Doctoral (DPhil) Studentships on FUN2MODEL, 3.5 years from 1st 
October 2019, with the possibility of extension*
*Stipend of at least £15,600 per annum, including fees at EU/home 
level, a laptop and conference travel*

*For more information about the studentships, selection criteria and how 
to apply, see http://www.cs.ox.ac.uk/news/1681-full.html*

*Studentship 1: Fairness and bias in multi-agent interactions*

Fairness of algorithmic decisions, and the avoidance of bias, is 
critical to their acceptance in society, but has been lacking in 
recently deployed AI software, for example Microsoft’s bot Tay. As a 
result, a variety of definitions of algorithmic fairness and 
corresponding verification approaches have been developed. However, 
these do not capture the influence of the cognitive and affective 
aspects of complex decisions made by autonomous agents, such as 
preferences and emotional state, which are essential to achieve 
effective collaboration between human and artificial agents. This 
project aims to develop a probabilistic, Bayesian framework based on 
causal inference for reasoning about fairness and bias in multi-agent 
collaborations, together with demonstrator case studies and associated 
software tools.
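
As a small, invented illustration of the causal flavour of such 
questions (not the project's intended framework), the sketch below 
contrasts groups via interventional queries P(D=1 | do(A=a)) on a toy 
structural model in which a sensitive attribute A feeds a decision D 
both directly and through a qualification proxy Q; a non-zero gap 
despite equal qualifications exposes a direct, unfair use of A.

# Toy sketch (invented numbers and names, not project code): a sensitive
# attribute A influences decision D directly and via qualification Q.
# Comparing P(D=1 | do(A=a)) across groups exposes the direct effect of A.

P_Q_given_A = {0: 0.6, 1: 0.6}               # P(Q=1 | A): equal qualifications
P_D_given_AQ = {(0, 0): 0.2, (0, 1): 0.8,    # P(D=1 | A, Q): the decision
                (1, 0): 0.1, (1, 1): 0.6}    # rule also uses A directly

def p_decision_do(a):
    """P(D=1 | do(A=a)): under intervention, Q follows its mechanism P(Q | A=a)."""
    p_q1 = P_Q_given_A[a]
    return (1 - p_q1) * P_D_given_AQ[(a, 0)] + p_q1 * P_D_given_AQ[(a, 1)]

gap = abs(p_decision_do(0) - p_decision_do(1))
print(p_decision_do(0), p_decision_do(1))   # 0.56 vs 0.40
print("interventional fairness gap:", gap)  # 0.16, despite equal Q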

*Studentship 2: Causal reasoning about accountability and blame*

While deep learning is able to discern data associations, Bayesian 
networks are capable of reasoning about counterfactual and 
interventional scenarios, for example “What if the car had swerved when 
the child stepped onto the road?”. However, in order to model realistic 
human behaviours, Bayesian priors and inference must additionally 
account for cognitive goals and intentions, such as inferring the 
pedestrian’s intent. This project aims to develop a framework for 
probabilistic causal reasoning with cognitive aspects to study 
accountability and blame in autonomous scenarios, together with 
demonstrator case studies and associated software tools.
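
To make the observational/interventional distinction concrete, here is 
a minimal, invented sketch for the swerving example (toy numbers, not 
project code). Observing a swerve is evidence that a child is on the 
road, so collisions look more likely; intervening to force a swerve 
cuts the child-to-swerve edge and reveals that swerving in fact lowers 
the collision probability. Full counterfactuals ("what if the car had 
swerved") additionally condition on the actually observed world, which 
this sketch omits.

# Minimal sketch (toy numbers): why interventional queries on a Bayesian
# network differ from observational ones, for the swerving example.
# C: child steps onto the road; S: car swerves; X: collision.
# C causally influences both S and X.

P_C = {1: 0.1, 0: 0.9}
P_S_given_C = {1: 0.9, 0: 0.05}              # P(S=1 | C)
P_X_given_CS = {(1, 0): 0.8, (1, 1): 0.2,    # P(X=1 | C, S)
                (0, 0): 0.01, (0, 1): 0.05}

def p_collision_observe(s):
    """P(X=1 | S=s): seeing a swerve changes our belief about C."""
    joint = {c: P_C[c] * (P_S_given_C[c] if s else 1 - P_S_given_C[c])
             for c in (0, 1)}
    z = sum(joint.values())
    return sum(joint[c] / z * P_X_given_CS[(c, s)] for c in (0, 1))

def p_collision_do(s):
    """P(X=1 | do(S=s)): forcing a swerve cuts the C -> S edge,
    so C keeps its prior distribution."""
    return sum(P_C[c] * P_X_given_CS[(c, s)] for c in (0, 1))

print(p_collision_observe(1), p_collision_observe(0))  # ~0.150 vs ~0.019
print(p_collision_do(1), p_collision_do(0))            # 0.065 vs 0.089
# Observationally, swerving correlates with collisions (children cause
# both); causally, do(swerve) lowers the collision probability.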

The successful applicants will join the internationally leading 
research group of Professor Marta Kwiatkowska, who has an extensive 
track record in probabilistic verification and has pioneered research 
on safety verification for neural networks and trust in human-robot 
collaborations. More information about Professor Kwiatkowska’s research 
and the PRISM model checker can be found here:

http://www.cs.ox.ac.uk/marta.kwiatkowska/
https://royalsociety.org/science-events-and-lectures/2018/11/milner-lecture/ 

https://www.prismmodelchecker.org/

The *closing date* for all applications is *8 July 2019* (note different 
procedures for postdocs and studentships).
*Interviews* are expected to be held on *23–24 July 2019*.

*Enquiries* to Professor Marta Kwiatkowska 
(marta.kwiatkowska at cs.ox.ac.uk) are welcome.

Our staff and students come from all over the world and we proudly 
promote a friendly and inclusive culture. Diversity is positively 
encouraged, through diversity groups and champions, for example 
http://www.cs.ox.ac.uk/aboutus/women-cs-oxford/index.html, as well as a 
number of family-friendly policies, such as the right to apply for 
flexible working and support for staff returning from periods of 
extended absence, for example maternity leave.

-- 
Professor Marta Kwiatkowska FRS
Fellow of Trinity College
Department of Computer Science
University of Oxford
Wolfson Building, Parks Road
Oxford, OX1 3QD

Tel: +44 (0)1865 283509
Email: Marta.Kwiatkowska at cs.ox.ac.uk
URL: http://www.cs.ox.ac.uk/people/marta.kwiatkowska/
