One of the cubes in my AS 2005 database has about 24 million records and a bunch of dimensions. I have been watching its processing time for the past week or two. I run a Full Process every morning, and on average it takes about 23 minutes to finish. Starting yesterday, however, it suddenly took 34 minutes, and today it is crawling: it has been running about 68 minutes and has only read about 7.55 million of the 24 million records. Could anyone give me some pointers on what is going on?
I have 8 dual-core processors and 24 GB of memory, running 64-bit SQL Server 2005 Standard on Windows Server 2003 64-bit, with the database engine, AS, RS, and IS all on one server.
Hello. A sudden increase in cube processing time can have many possible explanations.
Please provide more information about applications competing for resources on the same box, such as an OLTP application. Are you running only a BI solution, or other applications as well?
You should not have these problems on a server with that capacity.
Also, SQL Server 2005 / Analysis Services Standard Edition only supports 4 GB of RAM; the Enterprise Edition does not have that limit.
HTH
Thomas Ivarsson
|||Check your data source; indexes may have been dropped, etc. Narrow the problem down to either SSAS or your data source. Since you said it had been fine for a while, a sudden hit in performance usually occurs at the data source.

|||Thanks Thomas and Zach! That's what happens when you have all the eggs in one box: it turns out one of the backup jobs had been scheduled to run too frequently and was taking all the I/O. I do have a couple of questions:
1) Right now my SSIS process-cube task is set up to do a Full Process of the whole AS database (around 15 cubes), and half of the time is spent processing the 24-million-record cube. Do you think it would be better to process this cube separately? And would a "ProcessIncremental" with "affected objects" be enough to replace "ProcessFull"? I know that if you choose "ProcessData" you have to run "ProcessIndex" afterwards.
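For what it's worth, a single cube can be processed on its own by sending an XMLA Batch command (for example from SSMS or an SSIS Analysis Services Execute DDL task). This is only a sketch: "SalesDB" and "BigCube" are hypothetical placeholder IDs, not your actual object names.

```xml
<!-- Full process of one cube only, instead of the whole AS database.
     "SalesDB" and "BigCube" are placeholders; substitute your own
     DatabaseID and CubeID. -->
<Batch xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
  <Process>
    <Object>
      <DatabaseID>SalesDB</DatabaseID>
      <CubeID>BigCube</CubeID>
    </Object>
    <Type>ProcessFull</Type>
  </Process>
</Batch>
```

To split data and index processing, you would send two Process elements against the same object, the first with `<Type>ProcessData</Type>` and the second with `<Type>ProcessIndexes</Type>`.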
2) I changed two of the AS default memory settings: Memory\LowMemoryLimit from 75 to 50, and Memory\TotalMemoryLimit from 80 to 55. Do you think this will affect cube processing performance?
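For context, these two properties live in the Memory section of msmdsrv.ini on the AS server (they can also be set through SSMS server properties). This fragment is only a sketch showing the values described above, not a recommendation; in AS 2005, values of 100 or less are interpreted as a percentage of physical memory.

```xml
<!-- Fragment of msmdsrv.ini (Memory section). Values <= 100 are
     read as a percentage of physical memory. These are the changed
     values described above (defaults were 75 and 80). -->
<Memory>
  <LowMemoryLimit>50</LowMemoryLimit>
  <TotalMemoryLimit>55</TotalMemoryLimit>
</Memory>
```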
|||Are these cubes time-related, i.e. partitioned by week, if partitioned at all? Is your system set up to use Process Update, letting your dimension updates touch only the affected partition data instead of reprocessing the entire database?

|||I most often recommend a full process.
Your options are a bit limited, since the Standard Edition of Analysis Services 2005 does not include partitioning as a feature.
With partitions you can process only parts of the fact table and save some processing time.
Given the hardware capacity you have available, you could consider the Enterprise Edition instead.
Suitable memory settings are covered in the Performance Guide linked at the top of this newsgroup.
HTH
Thomas Ivarsson