CABERNET-EVENTS Archives

CABERNET-EVENTS@JISCMAIL.AC.UK



Subject: CfP: RV2019 - Runtime Verification
From: Martin Leucker <[log in to unmask]>
Reply-To: Martin Leucker <[log in to unmask]>
Date: Thu, 21 Mar 2019 19:55:49 +0100
Content-Type: text/plain
Parts/Attachments: text/plain (1 line)



                  [ Apologies for Multiple Copies ]

Call for Papers

RV 2019

19th International Conference on Runtime Verification
Porto, Portugal
October 8-11, 2019

Abstract deadline: April 25, 2019
Paper and tutorial deadline: April 30, 2019

NEW IN 2019: Benchmark Papers Track

https://www.react.uni-saarland.de/rv2019/

# Scope

Runtime verification is concerned with the monitoring and analysis of
the runtime behaviour of software and hardware systems. Runtime
verification techniques are crucial for system correctness,
reliability, and robustness; they provide an additional level of rigor
and effectiveness compared to conventional testing, and are generally
more practical than exhaustive formal verification. Runtime
verification can be used prior to deployment for testing,
verification, and debugging purposes, and after deployment for
ensuring reliability, safety, and security, and for providing fault
containment, recovery, and online system repair.
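
As a brief, purely illustrative aside (not part of this call), the
core idea of online monitoring can be sketched in a few lines of
Python: a monitor consumes a trace of events one at a time and
reports a verdict as soon as a property is violated. The property,
event names, and code below are hypothetical examples, not a
reference implementation.

    # Illustrative sketch only: an online monitor for the hypothetical
    # safety property "no 'write' event may occur after a 'close' event".
    def make_monitor():
        closed = False
        def step(event):
            nonlocal closed
            if event == "close":
                closed = True
            elif event == "write" and closed:
                return False   # property violated on the trace seen so far
            return True        # property still holds
        return step

    monitor = make_monitor()
    for e in ["open", "write", "close", "write"]:
        if not monitor(e):
            print("property violated at event:", e)
            break

Running this prints the violation at the final "write" event; in
practice such monitors are typically generated from a formal
specification rather than written by hand.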

Topics of interest to the conference include, but are not limited to:

* specification languages for monitoring
* monitor construction techniques
* program instrumentation
* logging, recording, and replay
* combination of static and dynamic analysis
* specification mining and machine learning over runtime traces
* monitoring techniques for concurrent and distributed systems
* runtime checking of privacy and security policies
* metrics and statistical information gathering
* program/system execution visualization
* fault localization, containment, recovery and repair
* dynamic type checking

Application areas of runtime verification include cyber-physical
systems, safety/mission critical systems, enterprise and systems
software, cloud systems, autonomous and reactive control systems,
health management and diagnosis systems, and system security and
privacy.

An overview of previous RV conferences and earlier workshops can be
found at: http://www.runtime-verification.org.

# Submissions

All papers and tutorials will appear in the conference proceedings in
an LNCS volume. Submitted papers and tutorials must use the
LNCS/Springer style detailed here:

http://www.springer.de/comp/lncs/authors.html

Papers must be original work and not be submitted for publication
elsewhere. Papers must be written in English and submitted
electronically (in PDF format) using the EasyChair submission page
here:

https://easychair.org/conferences/?conf=rv19

The page limits mentioned below include all text and figures but
exclude references. Additional details omitted due to space
limitations may be included in a clearly marked appendix, which will
be reviewed at the discretion of the reviewers but will not be
included in the proceedings.

At least one author of each accepted paper and tutorial must attend RV
2019 to present.

# Papers

There are four categories of papers that can be submitted: regular,
short, tool demonstration, and benchmark papers. Papers in each
category will be reviewed by at least three members of the Program
Committee.

* Regular Papers (up to 15 pages, not including references) should
present original unpublished results. We welcome theoretical papers,
system papers, papers describing domain-specific variants of RV, and
case studies on runtime verification.

* Short Papers (up to 6 pages, not including references) may present
novel but not necessarily thoroughly worked out ideas, for example
emerging runtime verification techniques and applications, or
techniques and applications that establish relationships between
runtime verification and other domains.

* Tool Demonstration Papers (up to 8 pages, not including references)
should present a new tool, a new tool component, or novel extensions
to existing tools supporting runtime verification. The paper must
include information on tool availability, maturity, and selected
experimental results, and it should provide a link to a website
containing the theoretical background and a user guide. Furthermore,
we strongly encourage authors to make their tools and benchmarks
available with their submission.

* Benchmark Papers (up to 10 pages, not including references, NEW IN
2019) should describe a benchmark, suite of benchmarks, or benchmark
generator useful for evaluating RV tools. Papers should include
information on what the benchmark consists of and its purpose (i.e.
its domain), how to obtain and use the benchmark, and an argument for
the usefulness of the benchmark to the broader RV community; they may
also include any existing results produced using the benchmark. We
are interested both in benchmarks pertaining to real-world scenarios
and in those containing synthetic data designed to achieve
interesting properties. Broader notions of benchmark, e.g. for
generating specifications from data or diagnosing faults, are within
scope. Finally, we encourage but do not require benchmarks that are
tool agnostic (especially those that have been used to evaluate
multiple tools), labelled benchmarks with rigorous arguments for the
correctness of labels, and benchmarks that are demonstrably
challenging with respect to state-of-the-art tools. Benchmark papers
must be accompanied by an easily accessible and usable benchmark
submission. Papers will be evaluated by a separate benchmark
evaluation panel, which will assess the benchmark's relevance,
clarity, and utility as communicated by the submitted paper.

The Program Committee of RV 2019 will give a best paper award, and a
selection of accepted regular papers will be invited to appear in a
special journal issue.

# Tutorial Track

Tutorials are two-to-three-hour presentations on a selected
topic. Additionally, tutorial presenters will be given the
opportunity to publish a paper of up to 20 pages in the LNCS
conference proceedings. A proposal for a tutorial must contain the
subject of the tutorial, a proposed timeline, a note on previous
similar tutorials (if applicable) and how this one differs from them,
and a biography of the presenter. The proposal must not exceed 2
pages. Tutorial proposals will be reviewed by the Program Committee.

# Website

https://www.react.uni-saarland.de/rv2019/

# Important Dates

Abstract deadline:                April 25, 2019
Paper and tutorial deadline:      April 30, 2019
Paper and tutorial notification:  June 14, 2019
Camera-ready deadline:            July 14, 2019
Conference:                       October 8-11, 2019

# Program Committee

Wolfgang Ahrendt, Chalmers University of Technology
Howard Barringer, The University of Manchester
Ezio Bartocci, Vienna University of Technology
Andreas Bauer, KUKA
Eric Bodden, Paderborn University and Fraunhofer IEM
Borzoo Bonakdarpour, Iowa State University
Christian Colombo, University of Malta
Ylies Falcone, Univ. Grenoble Alpes, CNRS, Inria
Lu Feng, University of Virginia
Bernd Finkbeiner, Saarland University
Adrian Francalanza, University of Malta
Radu Grosu, TU Vienna
Sylvain Hallé, Université du Québec à Chicoutimi
Klaus Havelund, Jet Propulsion Laboratory
Catalin Hritcu, INRIA
Felix Klaedtke, NEC Labs Europe
Axel Legay, UCLouvain
David Lo, Singapore Management University
Leonardo Mariani, University of Milano Bicocca
Viviana Mascardi, DIBRIS, University of Genova
Dejan Nickovic, Austrian Institute of Technology AIT
Ayoub Nouri, Verimag
Gordon Pace, University of Malta
Doron Peled, Bar Ilan University
Ka I Pun, Western Norway University of Applied Sciences
Jorge A. Pérez, University of Groningen
Giles Reger, The University of Manchester
Grigore Rosu, University of Illinois at Urbana-Champaign
Kristin Yvonne Rozier, Iowa State University
Cesar Sanchez, IMDEA Software Institute
Gerardo Schneider, University of Gothenburg
Nastaran Shafiei, NASA Ames Research Center/SGT
Julien Signoles, CEA LIST
Scott Smolka, Stony Brook University
Oleg Sokolsky, University of Pennsylvania
Bernhard Steffen, Univ Dortmund
Scott Stoller, Stony Brook University
Volker Stolz, Høgskulen på Vestlandet
Neil Walkinshaw, The University of Sheffield
Chao Wang, University of Southern California
Xiangyu Zhang, Purdue University





