From: Jim Meyering
Subject: [Savannah-hackers-public] excessively high load avg, again (daily?)
Date: Thu, 08 Dec 2011 13:18:13 +0100

This high load may be due to the "nightly" rsync you see below:

  top - 12:04:09 up 29 days, 18:43,  2 users,  load average: 20.80, 20.24, 18.41
  Tasks: 267 total,   1 running, 263 sleeping,   2 stopped,   1 zombie
  Cpu(s):  0.7%us,  0.4%sy,  0.0%ni, 76.9%id, 22.0%wa,  0.0%hi,  0.0%si,  0.0%st
  Mem:   6281876k total,  3307120k used,  2974756k free,   380964k buffers
  Swap: 12582904k total,   215660k used, 12367244k free,  1656744k cached

    PID USER      PR  NI  VIRT  RES  SHR S %CPU %MEM    TIME+  COMMAND
   1927 www-data  39  19 1090m 740m 3596 S    0 12.1 425:43.54 python
   2691 nobody    20   0  109m  58m  33m S    0  1.0   0:02.24 git
  10840 root      20   0  107m  48m 1556 D    1  0.8   1:19.83 rsync
  24475 www-data  20   0  267m  35m 4532 S    0  0.6   0:01.81 apache2
  23259 www-data  20   0  265m  33m 4496 S    0  0.5   0:01.52 apache2
  30805 www-data  20   0  250m  17m 4472 S    0  0.3   0:00.72 apache2
   2961 www-data  20   0 47364  16m 8680 D    0  0.3   0:06.41 cgit.cgi
  21151 root      20   0 11512 8420 2724 D    0  0.1   0:02.12 generate_log_ac
   2550 www-data  20   0  8356 6772 1772 S    0  0.1   0:00.56 index.cgi
   3023 www-data  20   0  7928 6292 1724 S    3  0.1   0:00.09 index.cgi
   3027 www-data  20   0  7808 6288 1724 S    3  0.1   0:00.10 index.cgi
   3001 www-data  39  19  9572 6200 2488 D    0  0.1   0:00.10 viewvc.cgi
  10838 root      20   0 13164 6076 2808 S    0  0.1   0:12.87 sshd
   1866 nobody    20   0 29544 3840 1720 S    1  0.1   0:00.22 rsync
   1980 root      20   0 22264 3828 2904 S    0  0.1   2:09.75 apache2
    896 root      20   0 10604 3472 2760 S    0  0.1   0:00.03 sshd
   2481 root      20   0 10604 3468 2760 S    0  0.1   0:00.02 sshd
  29941 root      20   0 10604 3468 2756 S    0  0.1   0:00.02 sshd
   2276 root      20   0 10604 3464 2756 S    0  0.1   0:00.02 sshd
   2367 root      20   0 10604 3464 2760 S    0  0.1   0:00.02 sshd
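Note the shape of that load: the CPU is ~77% idle with 22% I/O wait, and several of the processes above (rsync, cgit.cgi, generate_log_ac, viewvc.cgi) are in "D" state, i.e. uninterruptible disk sleep, which Linux counts toward the load average. A quick way to list such processes (just a sketch, not what I actually ran):

```shell
# Print pid and command of processes in uninterruptible sleep ("D" state);
# on Linux these inflate the load average even while the CPU sits idle.
# Input format: "STATE PID COMM", one process per line.
dstate() { awk '$1 ~ /^D/ { print $2, $3 }'; }

# Feed it live data like so:
ps -eo state=,pid=,comm= | dstate
```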

Or, maybe it's related to this:

Several times over the course of maybe 15 minutes, I could not
push to coreutils:

    $ git push
    Timeout, server git.sv.gnu.org not responding.
    fatal: The remote end hung up unexpectedly

Since there were some git-daemon processes dating back to November,
and since we have a limit on the number of those (at least I think
that's what Michael said), I've just killed the November git-daemon
processes. Obviously that's just a stop-gap, since it does not address
the underlying cause, but at least we can git-push again.
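For the record, roughly how one might pick off those stale daemons; this is a sketch only, the PIDs are hypothetical, and the actual kill is left commented out:

```shell
# Select PIDs of processes whose start month is November, parsing the
# full start timestamp from ps's "lstart" column,
# e.g. "Tue Nov 15 04:12:01 2011": field 1 is the PID, field 3 the month.
stale_pids() { awk '$3 == "Nov" { print $1 }'; }

# In practice, something along these lines:
#   ps -C git-daemon -o pid=,lstart= | stale_pids | xargs -r kill

# Demonstration on captured sample output (hypothetical PIDs):
stale_pids <<'EOF'
 1201 Tue Nov 15 04:12:01 2011
 9984 Thu Dec  8 09:00:17 2011
EOF
# → prints "1201"
```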

Perhaps related, there are lots of these in dmesg.
We get from 1 to ~6 per hour:

[2572324.224898] cgit.cgi[29777]: segfault at 863d000 ip b768c810 sp bfe9da2c error 6 in libc-2.11.2.so[b7619000+140000]
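To put a number on that rate, something like this would do; a sketch, assuming (as above) that the bracketed dmesg timestamps are seconds since boot:

```shell
# Count cgit.cgi segfaults per hour of uptime, keying on the
# [seconds-since-boot] timestamp at the start of each dmesg line.
segfaults_per_hour() {
  grep 'cgit\.cgi.*segfault' |
    awk '{ gsub(/[][]/, "", $1); print int($1 / 3600) }' |
    sort -n | uniq -c
}

# Demonstration on sample dmesg lines (2nd and 3rd timestamps hypothetical);
# output is "count hour-of-uptime" pairs, as from uniq -c:
segfaults_per_hour <<'EOF'
[2572324.224898] cgit.cgi[29777]: segfault at 863d000 ip b768c810 sp bfe9da2c error 6 in libc-2.11.2.so[b7619000+140000]
[2572400.100000] cgit.cgi[29801]: segfault at 863d000 ip b768c810 sp bfe9da2c error 6 in libc-2.11.2.so[b7619000+140000]
[2576000.000000] cgit.cgi[29990]: segfault at 863d000 ip b768c810 sp bfe9da2c error 6 in libc-2.11.2.so[b7619000+140000]
EOF
```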


