NAME
spark - examines SPARK programs and generates verification conditions
SYNOPSIS
spark [OPTIONS] [ FILE_LIST or @METAFILE ]
DESCRIPTION
The Examiner for SPARK, spark, analyses the given source files for errors and violations of the SPARK subset and (optionally) generates the verification conditions and dead path conjectures necessary for proof of absence of run-time exceptions and of partial correctness.

This manual page only summarises the spark command-line flags; please refer to the full Examiner manual for further information.
OPTIONS
These options do not quite follow the usual GNU command-line syntax. All options start with a single dash instead of the usual two, and they can also be abbreviated, as long as the abbreviation is unique. For example, -help can be abbreviated to -he, but not to -h, as that abbreviation is not unique.
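As a sketch of the invocation forms from the synopsis, sources can be listed directly or collected in a metafile passed with an '@' prefix. The file names below are hypothetical, and the spark invocations are shown commented out in case the Examiner is not installed:

```shell
# Collect the sources to analyse in a metafile, one file name per line.
printf '%s\n' stack.ads stack.adb main.adb > project.smf
cat project.smf

# The Examiner would then be invoked as either of the following
# (equivalent; commented out since the tool may not be on PATH):
#   spark @project.smf
#   spark stack.ads stack.adb main.adb
```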
Specifies source file extension (Default 'ada')
Specifies the index file. By default no index file is used.
Specifies the warning control file. By default no warning control file is used.
Specifies the Examiner configuration file. By default no configuration file is used.
Ignore the spark.sw file, if it exists.
By default all listing files have the 'lst' extension. These options can be used to disable listing file generation or to change the default extension.
By default the report will be named 'SPARK.REP'. These options can be used to change the default name or to disable report generation.
Generate HTML listings and report file.
Generate report, listing and proof files in the specified directory.
No dates, line or error numbers will appear in the output files.
This can be one of 83, 95 (the default) or 2005.
Choose between the sequential (the default) or ravenscar language profile.
Do not predefine Standard.Duration.
Perform syntax check only. No semantic checks.
Choose between information flow and data flow analysis.
Select security or safety policy for flow analysis.
-vcg Generate VCs.
-dpc Generate DPCs.
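The -vcg and -dpc flags above can be combined in a single run. A minimal sketch, echoed rather than executed, since it assumes the Examiner is installed and uses a hypothetical source file stack.adb:

```shell
# Generate both VCs and DPCs for one unit, skipping SLI file output.
# Echoed for illustration; drop the echo to run it for real.
echo spark -vcg -dpc -nosli stack.adb
```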
Select policy for generation of composite constant proof rules. Can be one of none (the default), lazy, keen or all.
Select alternative annotation character. The default is '#'.
Suppress screen output.
-nosli Don't generate SLI files.
Use the standard SPARK library.
Append Examiner table usage statistics to the report file. This is not done by default.
Control treatment of FDL identifiers when found in SPARK source. Can be one of 'reject' (the default) or 'accept' or <string>.
Print Examiner banner and statistics and then exit.
-help Print command line summary and options.
Print information flow errors in original, less compact, format.
Print explanations after error messages. Settings can be off (the default), first_occurrence or every_occurrence.
Select policy for justification of errors. Values can be full (the default), brief or ignore.
Check the casing of identifier references and of package Standard identifiers. It is also possible to specify i or s to check only one of these.
SEE ALSO
sparksimp(1), spadesimp(1), zombiescope(1), victor(1), pogs(1)
AUTHOR
This manual page was written by Florian Schanda <firstname.lastname@example.org> for the Debian GNU/Linux system (but may be used by others). Permission is granted to copy, distribute and/or modify this document under the terms of the GNU Free Documentation License, Version 1.3 or any later version published by the Free Software Foundation; with no Invariant Sections, no Front-Cover Texts and no Back-Cover Texts.
22 March 2011 spark(1)