Re: gnucobol-3.1-rc1 and -U_FORTIFY_SOURCE
Tue, 14 Jul 2020 11:31:20 +0100
Thanks for bringing this up; this is an interesting topic.
Do you think it would be worth hardening builds by default? The Debian and Red Hat articles you link don't mention many downsides (other than maybe performance). It shouldn't be too hard to extend configure to check for the hardening options. But I'm not sure if package maintainers have any expectations for what should be in default CFLAGS/LDFLAGS.
On Mon, Jul 13, 2020 at 12:52 PM James K. Lowden
> On Sun, 12 Jul 2020 09:25:34 -0400
> Jeffrey Walton <email@example.com> wrote:
> > Disabling _FORTIFY_SOURCE will cause GNU Cobol to fail audits and
> > acceptance testing. For example, checksec
> > (https://github.com/slimm609/checksec.sh) will generate findings on
> > the output artifacts.
> That's impressive. My gcc manual mentions _FORTIFY_SOURCE exactly 4
> times, 3 of which vaguely describe how it's affected by -O2 on Ubuntu
> 8.10. It never once lists which functions are checked, nor what
> compile-time or run-time errors are produced.
Yeah, Fortify Sources is not well documented. I think it is actually
part of "object size checking" in the compiler tools.
When Fortify Source is in effect, the compiler inserts calls to checked
functions wherever it can determine the destination buffer size. This is
essentially the compiler providing what ISO/IEC TR 24731-1 and TR
24731-2 provide. That is, when the destination buffer size can be
deduced at compile time, the compiler substitutes a checked version of
memcpy (and friends), similar in spirit to TR 24731-1's
memcpy_s(dptr, dsize, sptr, cnt); in glibc the actual replacement is
__memcpy_chk. If cnt > dsize at runtime, a call to abort() takes place.
> Yet its use causes the software to fail audits and acceptance testing??
Hardening the toolchain is part of any thorough SDLC program. Many
organizations don't have an SDLC in place, but some do, and some have
only a partial plan.
Debian and Red Hat have a partial plan in place: they specify a
hardened toolchain, but it is not mandatory, so a program or library
that does not meet the requirements is still distributed.
I know for certain that the US DoD, US Financial, and US Entertainment
sectors have an SDLC in place. Findings have to be fixed, or the risk
has to be accepted.
There's a lot to be said about "risk has to be accepted". Some
organizations have a Risk Committee. They turn up in regulated
environments like US Financial. When a risk is encountered that is not
fixed, then the committee must approve the deviation from company
policy and standard operating procedures. The process of getting the
risk committee involved is called Risk Acceptance.
Dealing with the risk committee is usually a fair amount of work.
Organizations usually just fix the problem rather than sending
something into risk acceptance.
(I can tell you some stories about risk acceptance that would make you
laugh at some of the things an organization was willing to tolerate to
onboard a new system).
> Do these same auditors and acceptors scan the source code for
> #undef _FORTIFY_SOURCE
> or intentional fakery to quell spurious warnings? If they do, is its
> mere presence cause for rejection, or does someone analyze the source
> and form some kind of reasoned judgement? Let me guess....
Yes. Folks have tools and scripts to audit source code and binaries.
And yes, we have to spend time on each finding, and that time adds up
quickly.
Fixing the problem once at the source can save a lot of folks a lot of
time. It also puts an end to mailing list messages like this.
> While it is, shall we say, "shallow" to rely on compiler settings as a
> standard for software quality, your question is nevertheless well put.
From my experience, a hardened toolchain is part of a standard SDLC.
The processes are put in place to promote consistency, increase
reliability and reduce risk.
> It's not clear to me why _FORTIFY_SOURCE is being disabled globally.
> Potentially, there's some specialized code in which it triggers
> undesirable effects. Those should be isolated, so that _FORTIFY_SOURCE
> can otherwise do whatever good it does.
I believe _FORTIFY_SOURCE needs -O1 or above (or related levels like
-Os or -Ofast). Object size checking is tied to compiler analysis.
As I understand things, debug builds at -O0 need to disable Fortify
Source because the compiler analysis is mostly (completely?) disabled.
No analysis, no object size checking. I think it is an unfortunate
coupling.