
Size limit for Crystal Report

Printed From: Crystal Reports Book
Category: Crystal Reports 9 through 2020
Forum Name: Technical Questions
Forum Description: Formulas, charting data, Crystal syntax, etc.
URL: http://www.crystalreportsbook.com/forum/forum_posts.asp?TID=18070


Topic: Size limit for Crystal Report
Posted By: Crystalreport
Subject: Size limit for Crystal Report
Date Posted: 26 Nov 2012 at 4:05am
Does anyone know if there is a data set "size limit" for Crystal Reports?
Here is what is going on. We run a series of DSO reports each period that carry 12 months of historical data. They had been running fine until period 10.

The size of the largest recurring DSO report that has run successfully is 29.6 MB.

I have tried running this report for period 10 both through Business Intelligence and through Crystal itself, but it gives me the dreaded blank page, which I have seen before.

Does anyone know if this is a BW limitation? Something in Crystal itself? BI?

If anyone has come across this "size" problem before, please let me know how you got around it. I do not want to split up the reports if I do not have to. Thank you...

A.


Replies:
Posted By: kevlray
Date Posted: 27 Nov 2012 at 5:30am
The only 'size' limit I have ever run into was the available memory on the computer that ran the report. I have successfully crashed CR on a Windows XP machine with 1.5 GB of RAM with a report that was returning well over 100 million rows.


Posted By: lockwelle
Date Posted: 27 Nov 2012 at 7:23am
Can't say I've ever hit a limit... I have printed thousands of pages.

How many pages/records are you printing/selecting?

Not that it really matters; I just have a dislike of large reports, as I don't see anyone actually reading through that many pages... but that is just me.

Not that I haven't written them, I just don't understand creating anything that big that actually gets used/printed...
so just wondering.


Posted By: kevlray
Date Posted: 27 Nov 2012 at 8:36am
I work for a local government agency. You would be surprised at the amount of data they request (making sure that all the clients' insurance(s) are billed properly for an entire year).


Posted By: comatt1
Date Posted: 27 Nov 2012 at 9:16am
Originally posted by kevlray:

I work for a local government agency. You would be surprised at the amount of data they request (making sure that all the clients' insurance(s) are billed properly for an entire year).



I hear you, Kev. Working within the city, you would be surprised what newspapers want with open-records requests. They want every little piece of information they can get, and then they go through and nitpick to find information that can validate their stories.

Not only newspapers but also special interest groups; they are big proponents of finding any and all dirt on government employees.

Health Department reports are usually very extensive since they have so many guidelines. Newspapers just want EVERYTHING and pick out what looks best for the next news story.


Posted By: lockwelle
Date Posted: 27 Nov 2012 at 9:38am
Well, never mind, there are crazies who love big reports with lots of detail, the kind that would burn reams and reams to print.

I can understand financials... but once you know where you're out of balance on the summaries, you can zero in on the details to find the difference...

Oh well, to each their own.


Posted By: Crystalreport
Date Posted: 27 Nov 2012 at 10:18am
So I have found out that the reports were failing because of a lack of memory while processing the report itself. With 4 GB of memory, most reports were taking up 3.67 GB to process. I assumed the process was timing out (past 60 minutes), but as it turns out the reports were "breaking" because of the lack of memory. So my assumption is that if I can convince the IT department to upgrade their hardware, the problem should go away. Hopefully.....


Posted By: kevlray
Date Posted: 27 Nov 2012 at 1:10pm
Also, if you are using linked tables vs. a Command, that will eat up a lot of memory. The Command will (normally) run faster as well.
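
For illustration only (the table, column, and parameter names below are made up), the idea is that a Command pushes the join, the date filter, and the summarizing down to the database, so Crystal only has to hold the summarized rows in memory instead of joining the full tables itself. A minimal sketch of such a Command might look like this:

    -- Hypothetical Command for a 12-month DSO report: the database does the
    -- join, the period filter, and the aggregation, so Crystal receives one
    -- summarized row per customer/period instead of every invoice detail row.
    SELECT c.customer_id,
           c.customer_name,
           i.fiscal_period,
           SUM(i.open_amount)    AS open_ar,
           SUM(i.invoice_amount) AS total_sales
    FROM   invoices  i
    JOIN   customers c ON c.customer_id = i.customer_id
    WHERE  i.invoice_date >= {?StartDate}   -- Crystal Command parameters
      AND  i.invoice_date <  {?EndDate}
    GROUP BY c.customer_id, c.customer_name, i.fiscal_period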


Posted By: lockwelle
Date Posted: 28 Nov 2012 at 8:59am
Thanks, kevlray...
Never thought of that, since all my reports run from a stored proc.

Something to keep in mind if I ever go somewhere else.


