What’s The Biggest…

Ok, an interesting question to put to everybody…

What is the biggest Lotus Notes .nsf file that you have ever seen? Personally, the biggest I’ve seen was about 5GB in size. I know there are bigger out there, but I’m interested in knowing what the biggest you have ever come across is. Please post a comment with the biggest you have ever seen, even if it doesn’t beat the current top size, as I’d like to get a good idea of an average .nsf size also…

15 comments on “What’s The Biggest…”
  1. Ed Brill says:

    Oh come on Dec… size isn’t everything. There’s a VP at Lotus with a 9 GB mailfile; that I’ve seen and heard about. Not surprisingly from where I sit, I’ve heard reports of databases that are way, way big, bigger than 64 GB… but I’ve never seen one myself.

  2. Adam says:

    The db that I’m currently working on was about 7.5 gigs up until a few weeks ago, when we implemented a new nightly archiving process that should stabilize it at around 3.5 gigs. While 7.5 is large, the db is used by hundreds, if not thousands, of users concurrently and was performing generally well while processing thousands of transactions and new records daily. Also, the application uses another db that is currently over 8 gigs. We have, however, stripped down the design of that db to its bare essentials to maximize performance.
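    (For the curious, a rough sketch of what such a nightly archiving agent might look like in LotusScript follows. The 90-day cutoff and the archive.nsf filename are illustrative guesses, not Adam’s actual setup: old documents are copied to the archive database, then the originals are deleted.)

        Sub Initialize
            ' Hypothetical nightly archiving agent: move documents older
            ' than 90 days into archive.nsf on the same server.
            Dim session As New NotesSession
            Dim db As NotesDatabase
            Dim archive As NotesDatabase
            Dim coll As NotesDocumentCollection
            Dim doc As NotesDocument
            Dim nextDoc As NotesDocument

            Set db = session.CurrentDatabase
            Set archive = session.GetDatabase(db.Server, "archive.nsf")

            ' Select everything created more than 90 days ago
            Set coll = db.Search("@Created < @Adjust(@Today; 0; 0; -90; 0; 0; 0)", Nothing, 0)

            Set doc = coll.GetFirstDocument
            Do While Not doc Is Nothing
                Set nextDoc = coll.GetNextDocument(doc)
                Call doc.CopyToDatabase(archive)   ' copy to the archive first...
                Call doc.Remove(True)              ' ...then delete the original
                Set doc = nextDoc
            Loop
        End Sub

    (Running compact afterwards is what actually shrinks the file on disk.)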

    Declan – Thanks so much for the Gmail account. Looking forward to using it often.

  3. There was a posting on Notes.net today or yesterday where someone said they had a 39.7GB mail file at their location. I want to know what the heck is in that file!

  4. We’ve got a 4.4GB CRM database here. Performance was abysmal when I joined a couple of years ago, but a lot of work on setting the view indexes up properly helped resolve that.
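    (For anyone fighting the same battle, the overnight part is easy to script. A minimal sketch of a scheduled LotusScript agent that forces every view index in the current database up to date, assuming it runs on the server:)

        Sub Initialize
            ' Hypothetical overnight agent: refresh every view index so
            ' daytime users never pay the rebuild cost themselves.
            Dim session As New NotesSession
            Dim db As NotesDatabase

            Set db = session.CurrentDatabase
            Forall v In db.Views
                v.AutoUpdate = False   ' no need to track live updates here
                Call v.Refresh         ' bring this view's index up to date
            End Forall
        End Sub

    (The same job can be done without an agent by running updall against the database with the -R switch from the server console.)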

  5. Paul Mooney says:

    11.5 GB CRM system. All done in Notes 5. That will be my first DB2 conversion project with ND7…

  6. Jens Bruntt says:

    10,000,000 documents but only 4GB.
    I deleted it as soon as I saw it. It was one of those system databases – events4.nsf or admin4.nsf, can’t remember which.

    I heard Domino has been tested up to 65 GB in DB size. What a nightmare to support! I have supported an application of around 9 GB before. When you have large databases, managing view indexes is a must! Currently dealing with 3-5 GB mail file sizes at my new gig. I can’t even imagine a 40 GB mail file. Doesn’t IBM have some archiving software in its portfolio somewhere?

  8. Curt Carlson says:

    We have a large application that consists of 10 separate .nsfs, the biggest of which is now at 32 GB. Some of this is due to the large number of views, but mostly it’s because of attachments and the customer’s unwillingness to archive anything. Amazingly, the performance is still good (only 102,000 docs in the 32 GB DB), although every day I ask myself: is this the day we will hit the wall?

  9. I had to make some modifications to a report archive database a few years ago because it was 60 GB and still growing.

    Most of the size was due to the report file attachments that were on each document. I think there were about 700,000 docs in there at the time. I changed it so the attachments were stored in a directory structure on the filesystem, and the database size got knocked down to around 5 GB.
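    (The detach-and-shrink approach can be sketched in a few lines of LotusScript. The “Body” item name, the FilePath field and the D:\reports\ directory below are illustrative guesses, not Julian’s actual code: each attachment is written to disk, its new path is recorded on the document, and the embedded copy is removed.)

        Sub Initialize
            ' Hypothetical sketch: extract attachments to the filesystem
            ' and strip them from the documents to shrink the database.
            ' Assumes attachments live in a rich text item called Body.
            Dim session As New NotesSession
            Dim db As NotesDatabase
            Dim coll As NotesDocumentCollection
            Dim doc As NotesDocument
            Dim rtItem As Variant
            Dim exportDir As String

            exportDir = "D:\reports\"
            Set db = session.CurrentDatabase
            Set coll = db.AllDocuments

            Set doc = coll.GetFirstDocument
            Do While Not doc Is Nothing
                If doc.HasEmbedded Then
                    Set rtItem = doc.GetFirstItem("Body")
                    If Not rtItem Is Nothing Then
                        If Not Isempty(rtItem.EmbeddedObjects) Then
                            Forall obj In rtItem.EmbeddedObjects
                                If obj.Type = EMBED_ATTACHMENT Then
                                    Call obj.ExtractFile(exportDir & obj.Name)   ' write to disk
                                    Call doc.ReplaceItemValue("FilePath", exportDir & obj.Name)
                                    Call obj.Remove                              ' drop the embedded copy
                                End If
                            End Forall
                            Call doc.Save(True, False)
                        End If
                    End If
                End If
                Set doc = coll.GetNextDocument(doc)
            Loop
        End Sub

    (As with any space reclaim, the file only shrinks after a compact -B.)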

    Performance was good as long as the view indexes were built every night. That was R5 on an NT box.

    – Julian

  10. A company I worked for had a production system running R5.0.9a (when I left, IIRC) that dealt with intellectual property stuff.

    The main database had around one and a quarter million documents in it, and was somewhere between 10 and 12 GB (depending on whether you wanted views or not!).

    Given the time – over two years ago – it was pretty big. And, for various reasons, it had no transaction logging. Mainly because we exported data from that large central database into smaller databases that we needed to copy off the server at the filesystem level – so transaction logging was more hassle than it was worth to maintain, to be honest.

    If I’d still been there for the R6 upgrade to that server, I’d have almost certainly used transaction logging, though. The benefits of using transaction logging with views outweigh the minor operational hassles that it brings, really.

    (OK, yes – I know I could exclude databases from transaction logging. But there were lots of them, and maintaining it would be a nightmare. Plus it also complicated backups – we didn’t use a version of the backup software that had a Domino agent and were just doing filesystem level backups, so we couldn’t really attempt transaction logging. As I said, with R5 it probably wasn’t worth it – but R6 would have been compelling enough to change the backup system and our working methods, I feel.)

    More recently, I’ve seen a mail database grow to be 59GB. This was due to a silly user who decided that the best way to work from home was to set up an agent that forwarded mail to his home ISP account. Except someone sent him an email with a large attachment, which his ISP’s SMTP server rejected. The rejection notice – complete with attachment – went back to his mail database. Which forwarded it back to his ISP. Which rejected it. I think you can see how his mail database got to be 59GB…
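    (For what it’s worth, a couple of guard clauses would have broken that loop. This is not the user’s actual agent, just a sketch: skip anything that looks like a bounce, and tag documents we’ve already forwarded so nothing goes round twice. The address and the AlreadyForwarded field name are made up.)

        Sub Initialize
            ' Hypothetical mail-forwarding agent with loop guards.
            Dim session As New NotesSession
            Dim db As NotesDatabase
            Dim coll As NotesDocumentCollection
            Dim doc As NotesDocument

            Set db = session.CurrentDatabase
            Set coll = db.UnprocessedDocuments   ' new mail since the last run

            Set doc = coll.GetFirstDocument
            Do While Not doc Is Nothing
                ' Skip delivery failures and anything we have already sent on
                If Instr(Lcase$(doc.Subject(0)), "delivery failure") = 0 And _
                        doc.GetItemValue("AlreadyForwarded")(0) <> "1" Then
                    Call doc.ReplaceItemValue("AlreadyForwarded", "1")
                    Call doc.Save(True, False)
                    Call doc.Send(False, "user@homeisp.example")   ' hypothetical address
                End If
                Set doc = coll.GetNextDocument(doc)
            Loop
        End Sub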

    Suffice it to say that I’ve only just arrived at this job, and mail quotas/router restrictions are on my task list. However, that day they moved up the task list several places. To just below a task that says “Kill users who make stupid agents”, in fact…

    As for average size – I can have a look on my Domino network if you like. The Domain Catalog should help me find you a quick enough figure… But I suspect it’s probably going to be in the high tens or low hundreds, really…
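    (If anyone wants to play along without a Domain Catalog, it’s a short LotusScript job: walk a server’s databases with NotesDbDirectory and average the Size property, which reports bytes. A sketch, run locally here for simplicity:)

        Sub Initialize
            ' Hypothetical sketch: average database size on one server.
            Dim dbdir As New NotesDbDirectory("")   ' "" = local; use a server name in practice
            Dim db As NotesDatabase
            Dim total As Double
            Dim count As Long

            Set db = dbdir.GetFirstDatabase(DATABASE)
            Do While Not db Is Nothing
                total = total + db.Size   ' Size is available without opening the db
                count = count + 1
                Set db = dbdir.GetNextDatabase
            Loop

            If count > 0 Then
                Print "Databases: " & count & ", average size: " & _
                    Format$(total / count / 1048576, "0.0") & " MB"
            End If
        End Sub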

    I think it’s worth our while to quickly mention performance – a reason people often shy away from large databases in Domino/Notes. I have to say that in both cases I’ve mentioned (and all the ones I haven’t!), I was quite impressed. It wasn’t exactly SQL database performance, but it was certainly nippy considering the data sets. R4.6 used to scale horribly – you didn’t need to check how large a database was, you could feel it! (Usually as you counted the seconds^Wminutes whilst waiting for your views to build etc…) R5 on the same hardware did far better – especially on scalability. I’m happy to work with large databases, in fact I’d even say I’m confident about it – I wasn’t ever quite so confident with previous releases.

    In fact, the other day I was talking to a consultant we brought in about consolidating some of our Domino.Doc file cabinet databases together. (We currently split into new DBs based on number of binders, and we’d rather split based on size so that things are more consistent.) They suggested 500MB. I looked at the largest file cabinet we have and saw that it was about 7GB. “I’m happy to have them split into 8GB chunks, personally”, I replied. They made some doubtful noises… So, in a jaunty tone, I responded:
    “Yes, you’re right. Make it a nice round 10GB. No sense in skimping, is there?”…

  11. Declan Lynch says:

    Philip, thanks for the long reply and thanks to everybody else’s replies also.

    The reason I asked this question is that I recently heard about a Notes application that was 30GB in size. Hearing this made my jaw drop; I just couldn’t believe it, as the biggest I’d ever seen was about 5GB or so.

    I think from reading all the replies there are a few important points that have been made: if you’re going to run a big database then the newer versions of Domino are the better choice, and you need to make sure that you have the resources on the server for the indexer to run properly.

    Maybe it’s time to increase those mailfile quotas from 500MB to something a little higher.
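    (For anyone who wants to do that in bulk: NotesDatabase exposes a SizeQuota property, measured in KB and writable with manager access. A hedged sketch that raises anything under 1GB, where the mail\ path check and the 1GB figure are purely illustrative:)

        Sub Initialize
            ' Hypothetical sketch: bump mail file quotas to at least 1GB.
            ' SizeQuota is in kilobytes; 1048576 KB = 1 GB.
            Dim dbdir As New NotesDbDirectory("")   ' run on the mail server itself
            Dim db As NotesDatabase

            Set db = dbdir.GetFirstDatabase(DATABASE)
            Do While Not db Is Nothing
                If Lcase$(Left$(db.FilePath, 5)) = "mail\" Then
                    Call db.Open("", "")
                    If db.SizeQuota < 1048576 Then
                        db.SizeQuota = 1048576
                    End If
                End If
                Set db = dbdir.GetNextDatabase
            Loop
        End Sub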

  12. Heh – that’s the first time someone’s THANKED me for a long reply.

  13. Paul Mooney says:

    500MB! Luxury!!

  14. Wild Bill says:

    Well, we built one that hit 4 gig. However, it had over 4 MILLION documents in it. The views (that I designed *cough*) weren’t really intended for scalability, with the net effect that it took 8 DAYS to rebuild indexes… on a beastie AIX box…
    Now *that* is a candidate for DB2…!

    So – physical database size itself – not that bad. I’ve seen 12.5 gig mail files (at a big Pharma) and 12 gig applications. But with *dumb* views and over a million documents – well, just start to wait a while…

    —* Bill

  15. Dan Holzrichter says:

    I have been testing with Domino 6.5.4 on Windows Server 2003 and seem to have hit a wall at around 64 GB. My database has approx. 1.2 million documents in it, and I have only one view, which is sorted but has just one column with a text value. The performance on a single-CPU server (around 2.8 GHz, hyperthreaded) is surprisingly good. Indexing does take a while, but not nearly as long as I expected.

    Has anyone actually seen a database larger than 64 GB? I don’t think the file size limit on the OS is an issue any longer, so maybe there is some sort of limit in Domino still?

    Dan
