The Network UPS Tools project code base is managed by the GNU Build System,
colloquially known as "The Autotools", which includes `autoconf`, `automake`
and many other components. Among their important roles are to generate the
portable shell `configure` script that detects build environment capabilities
and other nuances, and to help actually build the project with `Makefile`
recipes (supported by many implementations of the standard POSIX `make` tool)
generated from the `Makefile.am` templates by the `automake` tool.
Among the many standard recipes promulgated by the Autotools, `make dist`
handles creation of archive files with a release (or other) snapshot of a
software project, which "distribute" all the original or generated files
needed to build and install that project on a minimally capable end-user
system (one that should have a compiler, `make`, and dependency
libraries/headers, but is not required to have the Autotools, manual page
generation tools, etc.)

The `make distcheck` goal validates that the constructed archive is in fact
sufficient for such a build (includes all required files), and also that the
code structure and its recipes properly support out-of-tree builds (as used
in multi-platform and cross-build environments) without contaminating the
source code directory structure.
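The out-of-tree pattern that `distcheck` validates can be illustrated with a
minimal sketch (not NUT's actual recipes; the stub file name is made up):
configure and build in a scratch directory, leaving the source tree untouched.

```shell
# Sketch: an out-of-tree build keeps all products in a separate build dir.
srcdir=$(mktemp -d)
builddir=$(mktemp -d)
touch "$srcdir/configure.stub"           # stand-in for the real configure script
# A real run would do: ( cd "$builddir" && "$srcdir/configure" && make )
( cd "$builddir" && : "would run $srcdir/configure.stub && make here" )
leftovers=$(ls "$srcdir")                # the source tree must stay pristine
rm -rf "$srcdir" "$builddir"
echo "source tree still contains only: $leftovers"
```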
NUT's root `Makefile.am` defines the `DISTCHECK_FLAGS` eventually passed
to the `configure` script executed as part of `distcheck` validation, and
the default set of flags requires building everything. This in turn
constrains the set of systems where this validation can be performed to
build environments that have all dependency projects installed, have the
documentation generation tools, etc. The point is to make sure that for all
files which are compiled or otherwise processed by the build routine, we
actually distribute the sources (implicitly as calculated from programs'
listed sources, or via explicit `EXTRA_DIST` and similar statements),
regardless of which features were or were not enabled in the original run.
To avoid this constraint and allow the relevant `distcheck`-like validation
to happen on environments without "everything and a kitchen sink" installed,
NUT defines further recipes, such as:

* `distcheck-light`: does not require the optional features to be built,
  but merely allows them (using `--with-all=auto --with-ssl=auto
  --with-doc=auto` etc. flags);
* `distcheck-light-man`: similar to the above, but requires validation that
  all manual pages can be built (it does not build PDF or HTML formats,
  though);
* `distcheck-fake-man`: for formal validation on systems without the
  documentation processing tools used by NUT recipes, populates the
  distribution archive with "PLACEHOLDER" contents for missing pre-generated
  manual pages (such an archive SHOULD NOT be delivered to end-users as a
  fully functional release), so the validation of recipes around pre-built
  documentation installation can be performed;
* `distcheck-ci`: based on current build circumstances, dispatches to the
  standard strict `distcheck` or to `distcheck-fake-man`.
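The kind of dispatch `distcheck-ci` performs can be sketched roughly as
follows; this is a hypothetical illustration, not NUT's actual recipe, and
the probe for `asciidoc` merely stands in for whatever tool detection the
real logic uses:

```shell
# Hypothetical sketch: pick the strict goal if documentation tools exist,
# otherwise fall back to the placeholder-manpage variant.
if command -v asciidoc >/dev/null 2>&1 ; then
    goal="distcheck"
else
    goal="distcheck-fake-man"
fi
echo "would run: make $goal"
```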
Other recipes based on this concept are also defined, including:

* `distcheck-valgrind`: builds whatever code we can, does not waste time on
  documentation processing (`--with-all=auto --with-ssl=auto
  --with-doc=skip`), and runs the NUT test programs (`make check` in the
  built environment) through the Valgrind memory-checking tool.
NUT sources include a helper script and a suppression file which allow
developers and CI alike to easily run built programs through the popular
Valgrind tool and check for memory leaks, un-closed file descriptors, and
more. One use-case, covering the population of NUT self-test programs (and
the common code they pull in from NUT libraries and drivers), is automated
as the `make distcheck-valgrind` goal.
Example use-case:

    :; make -ks -j && LD_LIBRARY_PATH=`pwd`/clients/.libs \
        ./scripts/valgrind/valgrind.sh ./tools/nut-scanner/nut-scanner -DDDDDD -m auto
Note that the script is generated under `${top_builddir}` by `configure`
from a template file located in
`${top_srcdir}/scripts/valgrind/valgrind.sh.in`. You might be able to run
it directly, falling back to a `valgrind` program in your `PATH`, if any.
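The fallback described above can be sketched like this; `${top_builddir}`
here is an assumption standing in for your actual build tree, and the logic
is an illustration rather than the generated script's exact contents:

```shell
# Sketch: prefer the configure-generated wrapper if it exists and is
# executable, else fall back to any valgrind found in PATH.
wrapper="${top_builddir:-.}/scripts/valgrind/valgrind.sh"
if [ -x "$wrapper" ] ; then
    runner="$wrapper"
elif command -v valgrind >/dev/null 2>&1 ; then
    runner="valgrind"
else
    runner=""                            # no checker available; run unchecked
fi
echo "memory-check runner: ${runner:-none}"
```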
See also:

* Valgrind Suppression File How-to
* adding `--gen-suppressions=all --error-limit=no` to `valgrind` program
  options to generate suppression snippets
The root `Makefile.am` includes a recipe to run a special build of NUT
analyzed by the `cppcheck` tool (if detected by the `configure` script) and
produce a `cppcheck.xml` report for further tools to use, e.g. to visualize
it with the Jenkins Warnings plugin.
As compilers like GCC and LLVM/CLANG evolve, so do their built-in code
analyzers and warnings. In fact, this is a large part of the reasoning
behind using a vast array of systems along with the compilers they provide
(many points of view on the same code discover different issues in it), and
also behind a certain complexity pattern in NUT's own code base, where code
recommended by one compiler seems offensive to another (so stacks of
`pragma` expressions are used to quiesce certain warnings around certain
lines).
The options chosen into pre-sets selectable by `configure` script options
are the ones we use for different layers of CI tests. Values to note
include:

* `--enable-Werror(=yes/no)`: make warnings fatal;
* `--enable-warnings(=.../no)`: enable certain warning presets:
** `gcc-hard`, `clang-hard`, `gcc-medium`, `clang-medium`, `gcc-minimal`,
   `clang-minimal`, `all`: actual definitions that are compiler-dependent
   (the latter just adds `-Wall`, which may be relatively portable);
** `hard`, `medium` or `minimal`: if the current compiler is detected as
   CLANG or GCC, apply the corresponding setting from above (or `all`
   otherwise);
** `gcc` or `clang`: apply that set of options (regardless of detected
   compiler) with a default "difficulty" hard-coded in the `configure`
   script, to be tweaked as our codebase becomes cleaner;
** `yes`/`auto` (also takes effect if `--enable-warnings` is requested
   without an `=ARG` part): if the current compiler is detected as CLANG
   or GCC, apply the corresponding setting with the default "difficulty"
   from above (or `all` otherwise).
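The preset resolution above can be sketched as a simple mapping; this is a
guess at the shape of the logic, not `configure`'s actual code, and the
`detected` value simply simulates compiler detection:

```shell
# Hypothetical sketch: a preset name either carries an explicit compiler
# prefix, names a "difficulty" for the detected compiler, or falls back to
# the portable "all" (-Wall) set.
preset="medium"
detected="gcc"                           # as if configure detected GCC
case "$preset" in
    gcc-*|clang-*)        chosen="$preset" ;;
    hard|medium|minimal)  chosen="$detected-$preset" ;;
    yes|auto)             chosen="$detected-default" ;;
    *)                    chosen="all" ;;
esac
echo "warnings preset resolved to: $chosen"
```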
Note that for backwards-compatibility reasons, and to help filter out the
introduction of blatant errors, builds with compilers that claim GCC
compatibility can enable a few easy warning presets by default. This can be
avoided with an explicit argument to `--disable-warnings` (or
`--enable-warnings=no`).
All levels of warnings pre-sets for GCC in particular do not enforce the
`-pedantic` mode for builds with the C89/C90/ANSI standard revision (as
guesstimated from `CFLAGS` content), because nowadays it complains more
about the system and third-party library headers than about NUT codebase
quality (and "our offenses" are mostly something not worth fixing in this
era, such as the use of `__func__` in debug commands). If there still are
practical use-cases that require builds of NUT on pre-C99 compiler
toolkits, pull requests are of course welcome, but the maintainer team does
not intend to spend much time on that.
Hopefully this warnings pre-set mechanism is extensible enough, should we
need to add more compilers and/or "difficulty levels" in the future.
Finally, note that such pre-set warnings can be mixed with options passed
through `CFLAGS` or `CXXFLAGS` values to your local `configure` run, but it
is up to your compiler how it interprets the resulting mix.
The `make shellcheck` recipe finds files which the `file` tool determines
to be POSIX or Bourne-Again shell scripts, and runs them through the
respective interpreter's (`bash` or system `/bin/sh`) test mode to validate
that the syntax works.

Given that the `/bin/sh` implementation varies wildly on different systems
(e.g. Korn shell, BASH, DASH and many others), this goal, performed by CI
on a large number of available platforms, makes sure that the
lowest-common-denominator syntax we use is actually understood everywhere.
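The underlying "test mode" is the interpreter's no-execute switch; a
minimal, self-contained illustration (the sample script is made up):

```shell
# Sketch: `sh -n` parses a script without executing it, which is how
# syntax-only validation of shell scripts works.
script=$(mktemp)
printf '#!/bin/sh\necho "hello"\n' > "$script"
if sh -n "$script" ; then
    result="syntax OK"
else
    result="syntax error"
fi
rm -f "$script"
echo "$result"
```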
At a later time additional tests, perhaps using the `shellcheck` tool, can
be introduced into the stack.
The `make shellcheck-nde` recipe calls `tests/nut-driver-enumerator-test.sh`
to self-test the `scripts/upsdrvsvcctl/nut-driver-enumerator.sh.in` script
against an array of `SHELL_PROGS` (e.g. a list of interpreters provided by
specific CI agents), and makes sure that shell-script based processing of
`ups.conf` in various interpreters produces the exact spelling of expected
results.
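Running one check under several interpreters can be sketched as a loop; the
interpreter list and the trivial check below are illustrative, not the
actual `SHELL_PROGS` contents or test logic:

```shell
# Hypothetical sketch: run the same snippet under every interpreter
# available on this system and count how many agree on the result.
passed=0
for interp in sh bash dash ksh ; do
    command -v "$interp" >/dev/null 2>&1 || continue
    "$interp" -c 'test "$(echo ok)" = ok' && passed=$((passed + 1))
done
echo "interpreters that agreed: $passed"
```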
NUT recipes rely on the `aspell` tool (with the `aspell-en` dictionary,
which is the default but differs across numerous platforms), and a custom
maintained dictionary file (specified in `Makefile` variables as
`NUT_SPELL_DICT`; by default, it is `${top_srcdir}/docs/nut.dict`) for
additional words either unique to NUT or quite common but absent from older
standard dictionaries on some systems.

Operations are done according to `LANG` and `LC_ALL` values, both specified
in `Makefile` variables as `ASPELL_ENV_LANG`, by default `en.UTF-8`.
The "nut-website" generation has similar recipes and relies on integration with those provided by the main NUT code base, but maintains its own custom dictionary for words only present in the website sources.
The root `Makefile.am` includes recipes which allow developers and
maintainers to check spelling of all documentation (and/or update the
custom dictionary), while recipes in numerous subdirectories (where
`README.adoc` or other specific documentation files exist) have similar
goals to check just their files.

The actual implementation of these goals is in `docs/Makefile.am`, which
either calls the tool if it was detected by the `configure` script, or
skips the work. For each checked file, a `*-spellchecked` touch-file is
created in the respective `${builddir}`, so it is not re-checked until the
source document, the custom dictionary, or the `Makefile` recipe is
updated.
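The touch-file freshness idea can be shown in miniature; file names here are
illustrative stand-ins for a document and its `*-spellchecked` stamp:

```shell
# Sketch: re-check a document only when it is newer than its stamp file.
doc=$(mktemp)
stamp="$doc-spellchecked"
touch "$stamp"                           # pretend we already spellchecked it
if [ "$doc" -nt "$stamp" ] ; then
    action="re-check"
else
    action="skip (up to date)"
fi
rm -f "$doc" "$stamp"
echo "$action"
```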
The ecosystem of `Makefile.am` files includes the following useful recipes:

* `spellcheck`: passively check that all used words are in some dictionary
  known to this system, or report errors for unknown words;
* `spellcheck-interactive`: actively check the documents, and for unknown
  words start the interactive mode of `aspell`, so you can either edit the
  source text (replace typos with a suggested correct spelling), update the
  custom dictionary, or ignore the hit (to rephrase the paragraph later,
  etc.) This recipe can update the timestamp of the custom dictionary file,
  causing all documents to become fair game for re-checks of their
  spelling;
* `spellcheck-sortdict`: make sure the custom dictionary file is sorted
  alphanumerically (helpful in case of manual edits) and that the word
  count in the heading line is correct (helpful in case of manual edits or
  git branch merges).
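What a sortdict-style cleanup amounts to can be sketched as follows; aspell
personal dictionaries begin with a `personal_ws-1.1 <lang> <count>` header
line, and the sample words below are made up:

```shell
# Sketch: sort the word list and recompute the count in the header line.
dict=$(mktemp)
printf 'personal_ws-1.1 en 3\nzebra\nupsd\nalpha\n' > "$dict"
words=$(tail -n +2 "$dict" | sort)           # words, alphanumerically sorted
count=$(printf '%s\n' "$words" | wc -l | tr -d '[:space:]')
fixed=$(printf '%s\n%s\n' "personal_ws-1.1 en $count" "$words")
rm -f "$dict"
echo "$fixed"
```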
The root `Makefile.am` also provides some aids for maintainers:

* `spellcheck-interactive-quick`: runs the "passive" `spellcheck` in
  parallel `make` mode, and only if it errors out, runs
  `spellcheck-interactive`;
* `spellcheck-report-dict-usage`: prepares a `nut.dict.usage-report` file
  to validate that words used in the custom dictionary are actually present
  in any NUT documentation source file.