Paging thru a JavaScript array... limitations


  1. #1
    sohail Shah Guest

    Paging thru a JavaScript array... limitations

    I have a recordset that can typically contain 50,000 or more records, and I have to page through it. Currently I am using the paging properties of the ADO recordset, but performance suffers because each page request is another hit on the database.

    Would it be a problem to store the recordset in a client-side JavaScript array instead? That is, can JavaScript handle arrays that large?

    I am also thinking of converting the recordset to an XML file and navigating it with data-bound controls, but does Netscape Navigator support this? I want to keep this application browser-neutral. Is there a browser-neutral way to navigate the XML file?

    I am using IIS 4.0 with SQL Server 7.0 as the back end and ASP 2.0.

    Thanks in advance,
    Sohail Shah
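    As a rough illustration of the client-side idea (a minimal sketch; the array contents, names, and page size here are made up, and the server would have to write the data into the page when it renders it):

        // One element per record, written into the HTML by the ASP page.
        var rows = ["record 1", "record 2", "record 3" /* ...50,000 of these... */];
        var pageSize = 20;

        // Returns the records for a zero-based page number; how the
        // returned slice gets rendered is left to the page.
        function getPage(page) {
            var start = page * pageSize;
            return rows.slice(start, start + pageSize);
        }

    The array itself is usually not the limit; the real cost is making the browser download and parse all 50,000 elements before the first record can be shown.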

  2. #2
    stevew Guest

    not the answer, but another question...

    Does anyone really NEED to page through 50,000 records?

  3. #3
    sohail shah Guest

    RE: not the answer, but another question...

    True, in a real scenario this will not happen... but if the user leaves all the fields on the search screen empty and presses Search, every record in the database gets fetched. In such a case you need to show the results even though the user may never page through them all...

  4. #4

    Are you SERIOUS???

    50,000 records at, say, 100 bytes of data each. Put that into XML format and it's liable to TRIPLE in size (really! XML is a *very* wasteful format!). 300 times 50,000 = 15 MEGABYTES of information!

    So your user would have to DOWNLOAD ALL 15 MEGABYTES before he/she could see the first record! Let's see... on a 56K connection... Oh, just forget it!

    How about an alternative plan?

    How about SHARING the load? Download 100 or 200 or even 1,000 records to the browser. Use JS code to page within those records, but then go back to the server to get the next super-page of 100, 200, or whatever. This gets more reasonable: 1,000 records at 300 bytes each is still only 300KB, so about one minute over a 56K modem. Maybe you adjust how many records you get per super-page based on the connection speed?

    But, really, there is something TERRIBLY wrong with any database-viewing system that requires the user to scroll through 50,000 records! There *MUST* be a better way to organize the data so that the user pre-selects a reasonably sized subset of those records! I'd RE-THINK this one... a *LOT*!!!
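    To make the super-page idea concrete, here is a hedged sketch in plain JavaScript; the variable names and the getpage.asp URL are illustrative, not something the thread specifies:

        // The server fills these in when it renders the page.
        var chunk = [];        // one super-page of records (say 1,000)
        var chunkStart = 0;    // absolute position of chunk[0] in the full result set
        var pageSize = 20;     // how many records the user sees at once
        var offset = 0;        // current page's offset within the chunk

        function nextPage() {
            if (offset + pageSize < chunk.length) {
                offset += pageSize;   // still inside the downloaded chunk:
                // ...redraw from chunk.slice(offset, offset + pageSize)...
            } else {
                // Off the end of the chunk: round-trip for the next super-page.
                location.href = "getpage.asp?start=" + (chunkStart + chunk.length);
            }
        }

    Reloading the page for each new chunk keeps the scheme workable in both IE and Netscape of that era, since it needs nothing beyond plain JavaScript arrays.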

  5. #5

    LOL, true, I would rather work at Burger King..

    ..where at most I would have to scroll through 25-30 food items to take an order. 50,000 recs... geez.

  6. #6

    Magazine Article?

    He may be referring to an article in Visual Basic Programmer's Journal... I just got the issue yesterday, and it described exactly what he's doing...

    Although they did thoroughly point out that it doesn't work for large recordsets... basically, page the database 100 records at a time, drop that onto the client as XML (so the client can "page" locally), and then go back to the server for the next 100, etc.
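    For the server side of that scheme, ADO's built-in paging properties can carve a recordset into fixed-size pages. A hedged sketch in ASP JScript follows; the DSN, query, and field access are placeholders, and it writes plain HTML rows rather than the article's XML:

        <%@ Language=JScript %>
        <%
        var page = parseInt(String(Request.QueryString("page")), 10);
        if (isNaN(page) || page < 1) page = 1;

        var rs = Server.CreateObject("ADODB.Recordset");
        rs.CursorLocation = 3;                        // adUseClient, so AbsolutePage works
        rs.PageSize = 100;                            // one super-page = 100 records
        rs.Open("SELECT * FROM orders", "DSN=mydb");  // placeholder query and DSN

        if (!rs.EOF) {
            if (page > rs.PageCount) page = rs.PageCount;
            rs.AbsolutePage = page;                   // jump straight to that page
            for (var i = 0; i < rs.PageSize && !rs.EOF; i++) {
                Response.Write(rs.Fields.Item(0).Value + "<BR>");
                rs.MoveNext();
            }
        }
        rs.Close();
        %>

    Only the 100 records of the requested page cross the wire, so the download stays small no matter how big the full result set is.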
