8beta2 Appreciable decline of productivity

Ritchie (Groupie, Russian Federation, joined 14 June 2005)
Posted: 09 March 2006 at 6:58pm
Greetings Borg! I would like to suggest a more optimal method of fetching data than the one used now, the so-called "page-by-page SQL fetch" method. I have not tested it on WWF 8 beta 2, but I can say that it is less demanding on resources and returns only the amount of data that is actually needed for each individual fetch.

The proposed method does not claim to be the best. It may not be better than the others, but it is certainly no worse.
 
1. The search for a solution was based on the following wishes: requesting a new page of the result set should not re-run the whole query.

2. A page request should return an ordinary, familiar recordset. Initialisation should not take excessively long, so that a query against a table of 1,000,000 rows does not wear you out with waiting.

3. It should be possible at any moment to pick any page of the result set, including one already chosen earlier.

4. A page may contain any chosen number of rows.

5. A page may start at any chosen row number.

6. A solution satisfying all of these wishes at once was found.
 
 
 
1. Here it is. Suppose we have a query we want to fetch page by page:

select * from BigTable

We do not run it directly; instead we move on to step 2.

2. We initialise like this:

declare @handle int, @rows int
exec sp_cursoropen @handle OUT, 'select * from BigTable', 1, 1, @rows OUT
select @handle, @rows

This returns an empty recordset containing metadata (descriptions of the columns), which can be used to obtain the field names and types. After NextRecordset we also receive the handle of the resulting cursor and the number of rows in the whole result set. We need the handle to pass into the following calls, so it should be kept on the client in a global variable; the row count can be used to work out the number of pages.
 
3. We fetch the page we need from the result set:

exec sp_cursorfetch @handle, 16, @rowid, @rowcount

where @handle receives the saved handle value, @rowid the number of the row at which the page of interest starts, and @rowcount the number of rows on that page.

Step 3 is repeated as many times as necessary.

4. We release the resources once no more pages will be needed:
 
exec sp_cursorclose @handle
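The open / fetch-page / close lifecycle above can be sketched in Python with the standard sqlite3 module, where cursor.fetchmany plays the role of sp_cursorfetch. This is only an analogy under simplified assumptions: SQL Server's API cursor keeps its position server-side and can jump to any absolute row (fetch type 16), while a plain sqlite3 cursor only reads forward.

```python
import sqlite3

# Build a small stand-in for BigTable (hypothetical toy data).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE BigTable (id INTEGER PRIMARY KEY, payload TEXT)")
conn.executemany("INSERT INTO BigTable (payload) VALUES (?)",
                 [(f"row {i}",) for i in range(1, 101)])

# "sp_cursoropen": issue the query once and keep the cursor handle around.
cur = conn.execute("SELECT id, payload FROM BigTable ORDER BY id")

# "sp_cursorfetch": pull one page of rows at a time from the open cursor,
# instead of materialising the whole result set on the client.
page1 = cur.fetchmany(10)   # rows 1-10
page2 = cur.fetchmany(10)   # rows 11-20

# "sp_cursorclose": release the cursor once no more pages are needed.
cur.close()
```

The point the analogy preserves is that each fetch moves only one page of rows to the client, rather than the whole million-row result.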
 
 
In our case, the following SQL code is executed in Query Analyzer:
 
 
-----------------------------------------------------------------------------
Declare @handle int,@rows int
EXEC sp_cursoropen @handle OUT, 'SELECT TOP 1000000 tblTopic.Topic_ID, tblTopic.Poll_ID, tblTopic.Moved_ID,
tblTopic.Subject, tblTopic.Icon, tblTopic.Start_Thread_ID, tblTopic.Last_Thread_ID,
tblTopic.No_of_replies, tblTopic.No_of_views, tblTopic.Locked, tblTopic.Priority,
tblTopic.Hide, tblThread.Message_date, tblThread.Message, tblThread.Author_ID,
tblAuthor.Username, LastThread.Message_date, LastThread.Author_ID, LastAuthor.Username,
tblTopic.Event_date
FROM tblTopic WITH (NOLOCK) , tblThread WITH (NOLOCK) , tblThread AS LastThread WITH (NOLOCK) , tblAuthor WITH (NOLOCK) , tblAuthor AS LastAuthor WITH (NOLOCK)
WHERE ( tblThread.Author_ID = tblAuthor.Author_ID AND LastThread.Author_ID = LastAuthor.Author_ID AND tblTopic.Start_Thread_ID = tblThread.Thread_ID AND tblTopic.Last_Thread_ID = LastThread.Thread_ID AND (tblTopic.Priority = 3 OR tblTopic.Moved_ID = 2 OR tblTopic.Forum_ID = 2) ) AND (tblTopic.Hide = 0 )
ORDER BY tblTopic.Priority DESC, LastThread.Message_date DESC;',1, 1, @rows OUT
SELECT @handle, @rows
 
Declare @rowid int,@rowcount int
SET @rowid = 2000
SET @rowcount = 20
 
exec sp_cursorfetch @handle,16,@rowid,@rowcount
------------------------------------------------------------------------------------------
 
The required set of records comes back practically instantly, and this works well even with a large number of records in the table. ;)
 
 


Edited by Ritchie - 09 March 2006 at 7:06pm
WebWiz-Bruce (Admin Group, Web Wiz Developer, Bournemouth, joined 03 September 2001)
Posted: 10 March 2006 at 12:47pm
Thanks Ritchie for the suggestion, but do you have a version that doesn't use stored procedures? Most of the supported database types don't support them, and because of other issues with stored procedures, version 8 won't be using them.
DeadMeat (Newbie, joined 24 January 2006)
Posted: 11 March 2006 at 6:38am
Hi Borg
 
I had an idea that would speed the search up, but I would not recommend it unless you are left with no choice. It is what I would call a really serious hack of sorts, and I would only ever do it in a DB where time-critical transactions are taking place. I have mainly used it for financial transactions and the like, where time is more important than a perfect DB design. I am still not sure that 0.1 of a second is that important, but others may disagree. So, one way to simplify the query is to alter the TOPICS table by adding 4 extra fields, these being:
Start_Author_ID      : int
Start_Thread_Date : datetime
Last_Author_ID       : int
Last_Thread_Date  : datetime
These fields would then be updated at the same time that you update the fields Start_Thread_ID and Last_Thread_ID. This means that you do not have to include the THREAD table in the search.

I would not recommend this option, as it somewhat undoes the normalisation of the DB, but it would definitely speed up the search, as it becomes a simple 2-layer join as opposed to the 3-layer join that is currently in place.
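As a rough sketch of the idea (using sqlite3 and simplified column names as stand-ins, not the forum's actual schema), the denormalised columns are refreshed in the same operation that sets Last_Thread_ID, so listing topics afterwards needs no join to the thread table:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE tblThread (Thread_ID INTEGER PRIMARY KEY,
                            Author_ID INTEGER, Message_date TEXT);
    -- Topics carry denormalised copies of the last reply's author/date.
    CREATE TABLE tblTopic  (Topic_ID INTEGER PRIMARY KEY,
                            Last_Thread_ID INTEGER,
                            Last_Author_ID INTEGER,
                            Last_Thread_Date TEXT);
    INSERT INTO tblTopic VALUES (1, NULL, NULL, NULL);
""")

def add_reply(topic_id, author_id, message_date):
    # Insert the thread row, then refresh the topic's denormalised
    # columns in the same step that updates Last_Thread_ID.
    cur = conn.execute(
        "INSERT INTO tblThread (Author_ID, Message_date) VALUES (?, ?)",
        (author_id, message_date))
    conn.execute(
        """UPDATE tblTopic SET Last_Thread_ID = ?, Last_Author_ID = ?,
                               Last_Thread_Date = ? WHERE Topic_ID = ?""",
        (cur.lastrowid, author_id, message_date, topic_id))
    conn.commit()

add_reply(1, 42, "2006-03-11 06:38")
# Listing topics now reads everything from tblTopic alone:
row = conn.execute("SELECT Last_Author_ID, Last_Thread_Date "
                   "FROM tblTopic WHERE Topic_ID = 1").fetchone()
```

The trade-off is exactly the one described: a write now touches two tables, in exchange for a join-free read on the hot topics-list path.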
 
Anyway, just another option to think about.
 
Thanks
DM
WebWiz-Bruce (Admin Group, Web Wiz Developer, Bournemouth, joined 03 September 2001)
Posted: 11 March 2006 at 3:06pm
I have already thought of this and have already denormalised the tblTopics table to an extent, but I didn't want to go too far with it, as storing loads of extra data would increase the size of the database.

It is certainly something worth considering for very large forums. However, to get around the speed issues I decided to employ a different tactic of limiting the search results to the first 1000 (I can't see anyone reading through more results than that). The search then runs very fast, as only a maximum of 1000 records require the join.

To be honest, I'm wondering if it is worth limiting this amount further.


Ritchie (Groupie, Russian Federation, joined 14 June 2005)
Posted: 12 March 2006 at 9:45am
I think the basic problem all the same remains the excessive amount of data that the recordset returns. As the tblTopics table grows it will become necessary to limit it further (SELECT TOP 1000, SELECT TOP 500, SELECT TOP 200 and so on). The recordset has to be limited in every possible way, and I have suggested one of the ways to do so in MS SQL. It should work well with a huge number of messages in the tblTopics table; naturally it will not work with MS Access or MySQL, but MS Access cannot be used as a data source for large forums anyway. Therefore I consider the use of a single stored procedure to optimise speed in the MS SQL version quite justified. WWF is the best forum written in ASP and has no competitors, and a forum engine should be designed to work with large databases. ;)
The deeper into the forest, the fatter the partisans.
DeadMeat (Newbie, joined 24 January 2006)
Posted: 13 March 2006 at 5:15am
Hi Borg
 
I know what you mean. Another approach would be to do a double hit on the database: the first to get the count of records that would be returned (this would be a simpler join, as you do not need the author tables), so that you know the number of pages, and the second to do the actual select. Since you already know how many topics per page each user has, I would limit the TOP portion of the select statement to 5 times that limit, which will make it really fast; then if the user bounces onto page 6 or 11 etc., you would increase the select statement by a further 5 times the topic count.
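The two-hit approach can be sketched with sqlite3 (toy table, and LIMIT/OFFSET standing in for SQL Server's TOP): hit one is a cheap count used to size the pager, hit two fetches only the rows for the requested page.

```python
import math
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE topics (id INTEGER PRIMARY KEY, subject TEXT)")
conn.executemany("INSERT INTO topics (subject) VALUES (?)",
                 [(f"topic {i}",) for i in range(1, 48)])  # 47 topics

def fetch_page(per_page, page):
    # Hit 1: a cheap count (no author joins needed) to size the pager.
    (total,) = conn.execute("SELECT COUNT(*) FROM topics").fetchone()
    pages = math.ceil(total / per_page)
    # Hit 2: the real select, returning only the requested page's rows.
    rows = conn.execute(
        "SELECT id, subject FROM topics ORDER BY id LIMIT ? OFFSET ?",
        (per_page, (page - 1) * per_page)).fetchall()
    return rows, pages

rows, pages = fetch_page(per_page=10, page=3)
```

Here 47 topics at 10 per page yield a 5-page pager, and page 3 carries topics 21 through 30; only those 10 rows cross the wire on the second hit.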
 
There is another approach that might also work, which is to do the same as Yahoo and MSN do with their search results: they display only pages 1-7, and when you hit page 5 the number of pages increases, etc.
 
From my standpoint I prefer the first approach, as it makes things a lot easier to handle. The only problem is that you end up with 2 hits against the DB, but this might be able to be converted to one hit. I just need to test some things under MySQL; I know how it works under MSSQL, and I will feed back my findings.
 
Thanks
DM
DeadMeat (Newbie, joined 24 January 2006)
Posted: 13 March 2006 at 1:41pm

Hi Borg

I just thought of a simpler approach that would not require the DB to be hit twice: do a TOP select of 5 times the number of topics per page, plus 1. Then in the code, check whether the number of returned records is equal to this; if so, the page numbers at the top and bottom would end up being 1 2 3 4 5 more... When the user selects page 4, you increase the select amount to 10 times the number of topics per page, and so on. This might cause a slight delay in the release of the new V8, but it would sort out your speed issue on the topics page, as you would normally only be loading 5 times the topic count. I know that it adds to the code, but it would make the topics page display quicker.
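The single-hit idea boils down to paging arithmetic, sketched below with illustrative names (growing the TOP window in blocks of 5 pages is one possible choice of trigger; the post's exact "page 4" trigger is a design detail this sketch does not pin down):

```python
def select_limit(per_page, page, block=5):
    # Ask for whole blocks of `block` pages, plus 1 extra row so we can
    # tell whether a "more..." link is needed without a second query.
    blocks_needed = (page - 1) // block + 1
    return blocks_needed * block * per_page + 1

def page_of(all_rows, per_page, page, block=5):
    limit = select_limit(per_page, page, block)
    fetched = all_rows[:limit]          # stands in for SELECT TOP <limit> ...
    has_more = len(fetched) == limit    # the extra row came back
    start = (page - 1) * per_page
    return fetched[start:start + per_page], has_more

rows = list(range(1, 251))              # pretend the table holds 250 topics
page1, more1 = page_of(rows, per_page=20, page=1)   # asks for TOP 101
page6, more6 = page_of(rows, per_page=20, page=6)   # window grows to TOP 201
```

A single query per request suffices: the "+1" sentinel row replaces the separate COUNT(*) hit, at the cost of the pager only knowing about pages inside the current window.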

Anyway, just some ideas.

Thanks
DM

WebWiz-Bruce (Admin Group, Web Wiz Developer, Bournemouth, joined 03 September 2001)
Posted: 13 March 2006 at 3:15pm
Some nice ideas, I will certainly have to look into them and see what can be done.
Forum Software by Web Wiz Forums® version 12.08
Copyright ©2001-2026 Web Wiz Ltd.

