gdb mailing list

How does gdb simulate I/O?


From: Adam Beneschan
Subject: How does gdb simulate I/O?
Date: Wed, 29 Feb 2012 08:42:20 -0800

Hi, everyone,

This is my first time posting to this list...

When GDB is simulating a program for a cross target, how does it
determine when to do I/O on standard input and output?

Let me be more specific.  I'm assuming that when a program is being
simulated, GDB recognizes, or is capable of recognizing, a certain
instruction sequence as an attempt to write a character (or string) to
standard output.  This could be a particular instruction, such as some
sort of "trap" or "breakpoint" instruction with certain parameters, or
it could be a call to a routine with a specific name (e.g. if there's
a _write() routine in the program, GDB could recognize a call to it).
When GDB encounters such an instruction or call, it determines what
the program is trying to write and dumps it to standard output itself,
rather than executing the instruction or the routine.  Similarly for
standard input.
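To make the mechanism I'm imagining concrete, here is a minimal sketch
(not GDB's actual code) of a simulator step loop that intercepts a
"trap"-style instruction and performs the I/O on the host instead of
simulating it.  The opcode value, register names, and memory layout
are all invented for illustration:

```python
import sys

# Invented trap opcode: "write r1 bytes starting at address r0 to stdout".
TRAP_WRITE = 0xF0

def step(memory, regs, out=sys.stdout):
    """Simulate one instruction, intercepting the hypothetical I/O trap."""
    opcode = memory[regs["pc"]]
    if opcode == TRAP_WRITE:
        # Instead of simulating the instruction, the host performs the I/O:
        # read the buffer out of simulated memory and write it to stdout.
        addr, length = regs["r0"], regs["r1"]
        data = bytes(memory[addr:addr + length])
        out.write(data.decode("ascii"))
    # ... other opcodes would be simulated normally here ...
    regs["pc"] += 1

# Tiny demo: the trap opcode at pc=0, and "hi\n" stored at address 0x10.
mem = bytearray(0x20)
mem[0] = TRAP_WRITE
mem[0x10:0x13] = b"hi\n"
regs = {"pc": 0, "r0": 0x10, "r1": 3}
step(mem, regs)
```

The question, of course, is what real instruction or routine name
plays the role of TRAP_WRITE here.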

What does GDB recognize as a standard I/O call?  Where is this
documented, or is there a way to get GDB to tell me?

Is this configurable?  That is, how would I tell GDB what instruction,
or what routine, to look for?

I'm interested in answers both for the general simulation case, and
specifically for simulating ARM.  I skimmed through the manual but
couldn't find anything that seemed related.

My motivation here is that I'm trying to make sure a runtime library
is built correctly so that programs using it can be simulated with GDB
and do standard input/output.

Thanks for any help you can provide.







