[httperf] httperf-0.9.0 segmentation fault on Mac 10.6.7
Michael Irwin
mdirwin78 at att.net
Mon May 2 08:37:35 PDT 2011
Thanks for the response. Setting ulimit -n to 1024 doesn't seem to make a difference.
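For context on why raising the shell limit may not help: the warning quoted below ("open file limit > FD_SETSIZE") suggests httperf raises its own soft descriptor limit toward the hard limit at startup and then caps it at FD_SETSIZE (1024 on Mac OS X), so whatever `ulimit -n` reports in the shell is largely overridden. The standalone C program below is only a minimal sketch of that setup pattern under that assumption (it is not httperf's source, and the file name fd_limit_check.c is made up); it prints the limits a select()-based tool would end up with.

/* fd_limit_check.c -- minimal sketch (not httperf's actual code) of the
 * descriptor-limit setup a select()-based tool typically performs:
 * read RLIMIT_NOFILE, try to raise the soft limit to the hard limit,
 * and cap the usable range at FD_SETSIZE, since select() cannot watch
 * descriptors >= FD_SETSIZE. */
#include <stdio.h>
#include <sys/resource.h>
#include <sys/select.h>

int main(void)
{
    struct rlimit rl;

    if (getrlimit(RLIMIT_NOFILE, &rl) < 0) {
        perror("getrlimit");
        return 1;
    }
    printf("soft limit = %llu, hard limit = %llu, FD_SETSIZE = %d\n",
           (unsigned long long) rl.rlim_cur,
           (unsigned long long) rl.rlim_max, FD_SETSIZE);

    /* Try to raise the soft limit to the hard limit; this may fail
     * (e.g. if the hard limit is unlimited), in which case the old
     * soft limit stays in effect. */
    rl.rlim_cur = rl.rlim_max;
    if (setrlimit(RLIMIT_NOFILE, &rl) < 0)
        perror("setrlimit");

    /* A select()-based program can only use fds below FD_SETSIZE, so
     * the effective maximum is the smaller of the two values. */
    unsigned long long max_fds = rl.rlim_cur;
    if (max_fds > FD_SETSIZE)
        max_fds = FD_SETSIZE;
    printf("effective maximum descriptors = %llu\n", max_fds);
    return 0;
}

Compiling and running this in the same shell (cc fd_limit_check.c -o fd_limit_check && ./fd_limit_check) should show the limits httperf starts from, regardless of what ulimit -n was set to.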
On May 2, 2011, at 11:32 AM, Arlitt, Martin wrote:
> Hi Mike
>
> I do not have a Mac environment to explore this on, but the fact that 'ulimit -n' is 256 and httperf thinks it can open 1,024 file descriptors looks like a problem.
>
> Thanks
> Martin
>
>> -----Original Message-----
>> From: httperf-bounces at linux.hpl.hp.com [mailto:httperf-bounces at linux.hpl.hp.com] On Behalf Of Michael Irwin
>> Sent: Monday, May 02, 2011 7:33 AM
>> To: httperf at linux.hpl.hp.com
>> Subject: [httperf] httperf-0.9.0 segmentation fault on Mac 10.6.7
>>
>> Running httperf 0.9.0 on Mac 10.6.7, I get segmentation faults with certain
>> command line options. It seems to depend on what --rate and/or --num-conn
>> is set to. For example, the following shows the problem:
>>
>> httperf --server clubland.heroku.com --uri "/" --num-conn 100 --num-call 1 --timeout 5 --port 80 --verbose --rate 30
>>
>> Any rate >= 30 shows the same problem.
>>
>> These do not seg fault (usually):
>>
>> httperf --server clubland.heroku.com --uri "/" --num-conn 100 --num-call 1 --timeout 5 --port 80 --verbose --rate 10
>> httperf --server clubland.heroku.com --uri "/" --num-conn 100 --num-call 1 --timeout 5 --port 80 --verbose --rate 20
>> httperf --server clubland.heroku.com --uri "/" --num-conn 50 --num-call 1 --timeout 5 --port 80 --verbose --rate 30
>>
>> When I get the segmentation fault, the output is this:
>>
>> httperf --server clubland.heroku.com --uri "/" --num-conn 100 --num-call 1 --timeout 5 --port 80 --verbose --rate 30
>> httperf --verbose --timeout=5 --client=0/1 --server=clubland.heroku.com --port=80 --uri=/ --rate=30 --send-buffer=4096 --recv-buffer=16384 --num-conns=100 --num-calls=1
>> httperf: warning: open file limit > FD_SETSIZE; limiting max. # of open files to FD_SETSIZE
>> httperf: maximum number of open descriptors = 1024
>> Segmentation fault
>>
>> Not sure if it matters, but `ulimit -n` is 256.
>>
>> How can I solve this issue? Thanks!
>>
>> Mike Irwin
>>
>>
>
> _______________________________________________
> httperf mailing list
> httperf at linux.hpl.hp.com
> http://www.hpl.hp.com/hosted/linux/mail-archives/httperf/