
SFOOD-CHECKER(1)					      General Commands Manual						  SFOOD-CHECKER(1)

NAME
       sfood-checker - check for superfluous import statements in Python
       source code

SYNOPSIS
       sfood-checker [options] files...

DESCRIPTION
       This script detects forgotten imports that are no longer used. When
       writing Python code (which happens so fast), it is easy to forget to
       remove imports that have become useless. The check is implemented as
       a search over the AST, so the module does not need to be imported in
       order to be checked. This is a major advantage over other
       lint/checker programs, and the main reason for taking the time to
       write this one. As input it accepts either files or directories; if
       no argument is passed, it parses the current directory recursively.
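
       To illustrate the technique (a minimal sketch using only the
       standard library's ast module; this is not sfood-checker's actual
       implementation), an AST-based search for unused imports can be
       written as follows:

              import ast
              import sys

              def unused_imports(source):
                  """Return (name, lineno) pairs for imports never used."""
                  tree = ast.parse(source)
                  imported = {}   # bound name -> line of the import
                  used = set()    # every name referenced in the module
                  for node in ast.walk(tree):
                      if isinstance(node, ast.Import):
                          for alias in node.names:
                              # 'import a.b' binds the name 'a'.
                              name = alias.asname or alias.name.split('.')[0]
                              imported[name] = node.lineno
                      elif isinstance(node, ast.ImportFrom):
                          for alias in node.names:
                              if alias.name == '*':
                                  continue  # star imports bind no single name
                              name = alias.asname or alias.name
                              imported[name] = node.lineno
                      elif isinstance(node, ast.Name):
                          used.add(node.id)
                  return sorted((n, l) for n, l in imported.items()
                                if n not in used)

              if __name__ == '__main__':
                  for filename in sys.argv[1:]:
                      with open(filename) as f:
                          for name, lineno in unused_imports(f.read()):
                              print('%s:%d: unused import %r'
                                    % (filename, lineno, name))

       Because the source is only parsed, never executed, a file can be
       checked without importing it or any of its dependencies.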
OPTIONS
       -h, --help
              Show the help message and exit.

       --debug
              Debugging output.

       -I IGNORES, --ignore=IGNORES
              Add the given directory name to the list to be ignored.

       -d, --disable-pragmas
              Disable processing of pragma directives as strings after
              imports.

       -D, --duplicates, --enable-duplicates
              Enable experimental heuristic for finding duplicate imports.

       -M, --missing, --enable-missing
              Enable experimental heuristic for finding missing imports.
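
       For example, to check a source tree recursively while ignoring a
       build directory (the directory names here are illustrative):

              sfood-checker -I build src/

       or to check a single file with both experimental heuristics
       enabled:

              sfood-checker -D -M myprogram.py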
SEE ALSO
       sfood(1), sfood-cluster(1), sfood-copy(1), sfood-flatten(1),
       sfood-graph(1), sfood-imports(1).
AUTHOR
       sfood-checker was written by Martin Blais <blais@furius.ca> and is
       part of the snakefood suite. This manual page was written by Sandro
       Tosi <morph@debian.org>, for the Debian project (and may be used by
       others).

							  January 2, 2009						  SFOOD-CHECKER(1)