In ASP 2.0 we use Response.Buffer = True in order to make all the HTML arrive at once. <BR>Let me explain.<BR>When the server reaches an .asp page it interprets the page on the server using asp.dll and outputs some HTML into a stream; normally that HTML is sent to the browser as it is produced. BUT if the ASP engine sees the instruction Response.Buffer = True it **waits** until it has finished building the output stream and **then** sends it to the browser.<BR><BR>In ASP 2.0, if you don't use the instruction, the default is False.<BR><BR>Another example:<BR>If I have a query and I know it returns a lot of results, say 100,000 records, the question is: do I want to wait until all 100,000 records are returned to me and **then** show them to the user?<BR><BR>OR<BR><BR>Each time it gets a couple of records, should it show them to the user bit by bit?<BR><BR>In ASP 3.0 the default value is True, so you don't have to set it.<BR><BR>Hope it helps, 'cause I'm not good at explaining stuff... :)<BR>
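A minimal sketch of the buffering behaviour described above (note that Response.Buffer must be set before any output is written to the stream):

```asp
<%
' Buffer the whole response: nothing is sent to the browser
' until the page finishes (or Response.Flush / Response.End is called).
Response.Buffer = True
%>
<html>
<body>
<%
Dim i
For i = 1 To 10
    Response.Write "Row " & i & "<br>"
Next
' With buffering on, the entire page is now sent in one go.
' With Response.Buffer = False (the ASP 2.0 default), each
' Response.Write could go out to the browser as it happens.
%>
</body>
</html>
```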
In my opinion this is the answer to WHAT Response.Buffer is, not WHY it's used. Anyone wanna give their thoughts on this? For my part I'd think I might rather just stream stuff to the user as it's available.
It seems to me that:-<BR>It's (generally) more efficient to send a single large packet of data across a network (especially a low-bandwidth phone line) than to send loads of small packets, which is what could happen if you set ASP to NOT buffer the output from the engine, i.e. just send me stuff as you process each step.<BR>But of course there's a trade-off. You may want your client to see some output as you build up the page(s); I guess that's what the Flush method is for.
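The trade-off above can be sketched like this: keep buffering on for efficiency, but call Response.Flush periodically so the user sees partial output on a long query. This is only a sketch; the DSN and the Orders table are hypothetical placeholders, not anything from the thread.

```asp
<%
' Buffered page that still shows partial output: flush every 100 rows.
Response.Buffer = True

Dim conn, rs, count
Set conn = Server.CreateObject("ADODB.Connection")
conn.Open "DSN=MyDatabase"   ' hypothetical connection string
Set rs = conn.Execute("SELECT Name FROM Orders")   ' hypothetical query

count = 0
Do While Not rs.EOF
    Response.Write Server.HTMLEncode(rs("Name")) & "<br>"
    count = count + 1
    If count Mod 100 = 0 Then
        Response.Flush   ' push the buffered output to the browser so far
    End If
    rs.MoveNext
Loop

rs.Close
conn.Close
%>
```

This way you still send reasonably large packets (100 rows at a time) instead of one per record, but the user isn't staring at a blank page until all 100,000 rows are done.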