This is not really a question, though I hope some of the answers will answer some of mine.

I have the following scenario: I am developing an intranet for my company. The solution may well grow to 500 pages. The hard part is building a dynamic, easily updatable, and efficient navigation system. My current approach is that every page has an ID and a reference to a parent ID; all of this, along with page titles, URLs, etc., is stored in a database, and each page contains an <%ID=0234%> tag.

The weakness of this solution is that URLs are already unique IDs, so I am effectively storing some data twice. Another problem is that whenever I add a new page I have to look up the ID of its parent page, and there is no real way of getting a complete, tidy structure of all the information in one place.

That is why I was considering converting the structure to an XML tree without IDs for each page. The problem is that finding the entry for the page you are on would require parsing the whole file looking for its URL. Since I haven't used XML much, I don't know what kind of performance hit this would create. Parsing up to 500 tags and building menus on every request (I only show links to top-level pages, same-level pages, and an "up" button per page, so I never render the whole menu at once) has to cost some CPU. The site would probably get hit 1,500-2,000 times a day, and each hit means rebuilding the "top level", "same level", and "up" links every time (this is supposed to be dynamic).

If there is a better solution, like some smart caching, I would be interested in hearing that as well.