[httperf] httperf-0.9.0 segmentation fault on Mac 10.6.7

Michael Irwin mdirwin78 at att.net
Mon May 2 07:32:40 PDT 2011


Running httperf 0.9.0 on Mac OS X 10.6.7, I get segmentation faults with certain command-line options. It seems to depend on the values of --rate and/or --num-conn. For example, the following command shows the problem:

httperf --server clubland.heroku.com --uri "/" --num-conn 100 --num-call 1 --timeout 5 --port 80 --verbose --rate 30

Any rate of 30 or higher triggers the same crash.

These usually do not segfault:
httperf --server clubland.heroku.com --uri "/" --num-conn 100 --num-call 1 --timeout 5 --port 80 --verbose --rate 10
httperf --server clubland.heroku.com --uri "/" --num-conn 100 --num-call 1 --timeout 5 --port 80 --verbose --rate 20
httperf --server clubland.heroku.com --uri "/" --num-conn 50 --num-call 1 --timeout 5 --port 80 --verbose --rate 30

When I get the segmentation fault, the output is this:

httperf --server clubland.heroku.com --uri "/" --num-conn 100 --num-call 1 --timeout 5 --port 80 --verbose --rate 30
httperf --verbose --timeout=5 --client=0/1 --server=clubland.heroku.com --port=80 --uri=/ --rate=30 --send-buffer=4096 --recv-buffer=16384 --num-conns=100 --num-calls=1
httperf: warning: open file limit > FD_SETSIZE; limiting max. # of open files to FD_SETSIZE
httperf: maximum number of open descriptors = 1024
Segmentation fault

Not sure if it matters, but `ulimit -n` is 256.
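In case the descriptor limit is relevant, here is how I checked and raised the soft limit for the current shell before running httperf (a sketch; whether this affects the crash is an assumption on my part — 1024 is just the FD_SETSIZE cap that httperf reports in its warning):

```shell
# Show the soft and hard per-process open-file limits for this shell.
ulimit -Sn
ulimit -Hn

# Raise the soft limit for this session (allowed up to the hard limit);
# 1024 matches the FD_SETSIZE maximum httperf prints above.
ulimit -Sn 1024

# Confirm the new soft limit.
ulimit -Sn
```

The change only lasts for the current shell session, so it has to be re-run (or put in a shell profile) before each test.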

How can I solve this issue? Thanks!

Mike Irwin
