[httperf] Newbee question
arlitt at hpl.hp.com
Tue Jun 14 13:56:03 PDT 2005
if I understand your question correctly, you want to have 380 concurrent
sessions (users). in order to do that, you will need to do some planning,
as the command you included does not (as you correctly observed) maintain
380 concurrent sessions.
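as a back-of-the-envelope check (the 20-second session duration below is
an assumption for illustration, not a number from this thread): the
steady-state number of concurrent sessions is roughly the session start
rate times the mean session duration, so --rate=1 can only sustain about
as many concurrent sessions as one session lasts in seconds.

```shell
# hypothetical numbers: to hold ~380 sessions open at once when a
# session lasts ~20 s, sessions must start at 380/20 = 19 per second.
target=380        # desired concurrent sessions
session_secs=20   # assumed mean session duration (seconds)
rate=$(( (target + session_secs - 1) / session_secs ))  # ceiling division
echo "required --rate is about $rate sessions/sec"
```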
I suggest you review a message I sent to the httperf list in March 2004;
it is available from the mail archive;
just click on the 'Date' column for March 2004, then click on the message
I sent entitled "Httperf concurrent/rate implementation". that will give
you a detailed description of how to set up a more elaborate experiment
that maintains a desired number of concurrent sessions.
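to make that concrete, a sketch of such an invocation (the server name,
rate, and session file name here are placeholders I chose, not values
from the referenced message):

```shell
# sketch: pick --rate so that rate * mean-session-duration is close to
# the desired concurrency (e.g. 19/s * ~20 s per session = ~380).
httperf --hog --server www.example.com \
        --wsesslog=380,0,sfile \
        --rate=19 --timeout=10
```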
if you plan to test a large number of concurrent sessions, you may need to
apply a simple patch to httperf. that patch is described in a message I
sent in February 2004 ("httperf help required (fwd)"), which is also
available from the mail archive.
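one symptom to watch for with many concurrent sessions is running out of
file descriptors; a quick sanity check of the per-process limit (note
that raising this shell limit alone may not be enough, since the patch
referenced above also addresses httperf's compile-time descriptor limit):

```shell
# show the current per-process open-file limit; each concurrent
# connection consumes one descriptor, so this bounds concurrency.
ulimit -n
```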
if you still have questions after reviewing these messages, please let me
know.
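for reference, a minimal --wsesslog session file might look like the
sketch below (the URIs and think time are made up; per the httperf
documentation, a blank line ends a session, and indented lines after a
URI form a burst that is fetched together, much like a browser fetching
a page plus its images):

```
/index.html
  /style.css
  /logo.png

/page2.html think=2.0
  /banner.png
```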
> Thanks for your answer! It seems that --wsesslog is indeed more useful
> for what I intend.
> I tried:
> --hog --wsesslog=380,1,sfile --max-connections=2 --rate=1 --timeout=10
> with sfile:
> I thought that should do it, but then the 380 will never be reached,
> right? It adds 1 session per second and many will have finished before
> the 380 is reached. Or will each session loop through the session file?
> Can I force the program to loop through the session file (as with --wlog)?
> Also: what is the meaning of --max-connections in this context?
> Martin Arlitt wrote:
> >if you use the --wsesslog option (instead of --wlog) you can simulate
> >multiple simultaneous users. with this option httperf will use one
> >connection per user, assuming persistent connections are not disabled.
> >On Thu, 9 Jun 2005, Mark Pors wrote:
> >>I have been playing with httperf and it looks very promising for my
> >>purpose, but I am not sure how to interpret some of the options.
> >>What I try to accomplish is the following:
> >>- generate a load of x simultaneous 'users'
> >>- where a 'user' is simulation of a web browser, loading a single HTML
> >>page including all images, style sheets etc.
> >>- the 'user' has never more than 2 simultaneous connections with the server
> >>So I could make a file containing the URL of the web page + all URLs of
> >>the images etc., and call that file with --wlog=y,file.txt
> >>Using --max-connections 2 should ensure that each session doesn't use
> >>more than 2 simultaneous connections (am I correct?).
> >>Now is the question: how do I get 'x' simultaneous 'users'? The --rate
> >>option doesn't seem to have any influence...
> >>Any help is highly appreciated!
> >>Kind regards,
> >>httperf mailing list
> >>httperf at linux.hpl.hp.com