About SBST

Search-Based Software Testing (SBST) is the application of optimizing search techniques (for example, Genetic Algorithms) to solve problems in software testing. SBST is used to generate test data, prioritize test cases, minimize test suites, optimize software test oracles, reduce human oracle cost, verify software models, test service-oriented architectures, construct test suites for interaction testing, and validate real-time properties (among others).

The objectives of this workshop are to bring together researchers and industrial practitioners both from SBST and the wider software engineering community to collaborate, to share experience, to provide directions for future research, and to encourage the use of search techniques in novel aspects of software testing in combination with other aspects of the software engineering lifecycle.

Important Dates

The workshop will adhere to the general ICSE workshop dates (all deadlines are Anywhere on Earth, AoE):

Paper Submission
Friday 1 Feb 2019 (extended to Sunday 10 Feb 2019)

Notification to Authors
Friday 1 Mar 2019

Camera Ready Due
Friday 15 Mar 2019

Submission Guidelines

All submissions must conform to the ICSE 2019 formatting and submission instructions. All submissions must be anonymized, in PDF format, and submitted electronically through EasyChair.

Submission site: https://easychair.org/my/conference.cgi?conf=sbst2019

Call for Papers

Researchers and practitioners are invited to submit:

  • Full papers (maximum of 8 pages, including references): original research in SBST, whether empirical, theoretical, or reporting practical experience with SBST techniques and/or SBST tools.
  • Short papers (maximum of 4 pages, including references): work describing novel techniques, ideas, and positions that have yet to be fully developed; or a discussion of the importance of a recently published SBST result by another author in setting a direction for the SBST community, and/or the potential applicability (or not) of that result in an industrial context.
  • Position papers (maximum of 2 pages, including references): analyses of trends in SBST that raise issues of importance. Position papers are intended to seed discussion and debate at the workshop, and will therefore be reviewed with respect to relevance and their potential to spark discussion.
  • Tool Competition entries (maximum of 4 pages, including references): we invite researchers, students, and tool developers to design innovative new approaches to software test generation.

In all cases, papers should address a problem in the software testing/verification/validation domain or combine elements of those domains with other concerns in the software engineering lifecycle. Examples of problems in the software testing/verification/validation domain include (but are not limited to) generating test data, prioritizing test cases, constructing test oracles, minimizing test suites, verifying software models, testing service-oriented architectures, constructing test suites for interaction testing, and validating real-time properties.

The solution should apply a metaheuristic search strategy such as (but not limited to) random search, local search (e.g. hill climbing, simulated annealing, and tabu search), evolutionary algorithms (e.g. genetic algorithms, evolution strategies, and genetic programming), ant colony optimization, and particle swarm optimization.
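To illustrate the flavor of such a solution, the sketch below shows one of the simplest local-search strategies listed above, hill climbing, applied to a toy test-data generation problem. The program under test, the branch condition `x == 100`, and the branch-distance fitness function are all hypothetical examples chosen for illustration, not part of any specific SBST tool.

```python
import random

def branch_distance(x: int, target: int = 100) -> int:
    """Fitness for a hypothetical branch condition `x == target`:
    the absolute distance from satisfying it. Zero means covered."""
    return abs(x - target)

def hill_climb(seed_input: int, max_steps: int = 10_000) -> int:
    """Local search: repeatedly move to a strictly better neighbour
    until the branch is covered, a local optimum is reached, or the
    step budget is exhausted."""
    current = seed_input
    for _ in range(max_steps):
        if branch_distance(current) == 0:
            break  # branch covered: test input found
        # Neighbourhood: the input incremented or decremented by one.
        best_neighbour = min((current - 1, current + 1), key=branch_distance)
        if branch_distance(best_neighbour) < branch_distance(current):
            current = best_neighbour
        else:
            break  # local optimum
    return current

random.seed(0)
start = random.randint(-1000, 1000)
result = hill_climb(start)
print(result, branch_distance(result))  # converges to 100, fitness 0
```

Because this fitness landscape is unimodal, plain hill climbing always reaches the target; on realistic programs the landscape typically has plateaus and local optima, which is what motivates the restarts, annealing schedules, and population-based evolutionary algorithms mentioned above.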