[nycphp-talk] DB for large datasets
Matthew Terenzio
webmaster at localnotion.com
Thu Aug 12 09:51:21 EDT 2004
Postgres or Oracle... that should work ;)
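
To answer the merge-table question in the quote below: PostgreSQL has
no MERGE storage engine, but table inheritance gives you the same
effect, since a query against a parent table automatically scans all
of its children. A minimal sketch (the traffic columns here are
invented; substitute your real schema):

    -- Parent table defines the shape; children inherit its columns.
    CREATE TABLE traffic (
        ts     timestamp NOT NULL,
        src_ip inet      NOT NULL,
        dst_ip inet      NOT NULL,
        bytes  bigint    NOT NULL
    );

    -- One child per month, like your per-month MyISAM tables.
    CREATE TABLE traffic_2004_07 (
        CHECK (ts >= '2004-07-01' AND ts < '2004-08-01')
    ) INHERITS (traffic);

    CREATE TABLE traffic_2004_08 (
        CHECK (ts >= '2004-08-01' AND ts < '2004-09-01')
    ) INHERITS (traffic);

    -- Querying the parent behaves like querying the merge table:
    SELECT count(*) FROM traffic WHERE ts >= '2004-08-01';

    -- Retiring a month is a DROP of one small child table,
    -- not a huge row-by-row DELETE:
    DROP TABLE traffic_2004_07;

Have the Perl loader insert straight into the current month's child,
and let the PHP side only ever query the parent.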
On Aug 12, 2004, at 9:38 AM, Tim Sailer wrote:
> I'm developing an internal application that takes information about
> network traffic and stores it in tables, currently MySQL, one table
> per month. A merge table gets queried instead of having to look
> through each table. Now, the problem is, we're looking at more than
> 30M records/month, and MySQL is just barfing. I'm getting the
> notorious error 127 from the table handler, and bad corruption if I'm
> foolish enough to try to delete from the tables. The backend feeding
> the database is Perl, and the frontend, of course, is PHP. My only
> alternative at this point is to move to another 'robust' database
> like PostgreSQL or Oracle. My inclination is PostgreSQL. Not having
> any recent experience with PostgreSQL, I'm turning to the collective
> brainpower of this group. Porting the code from MySQL to PostgreSQL
> seems straightforward. Does PostgreSQL have something like merge
> tables? What about performance? What does anyone think the
> performance loss/gain on the move will be? It's on a 2.8GHz P4 with
> 2GB RAM.
>
> Thanks,
> Tim
>
> --
> Tim Sailer <sailer at bnl.gov>
> Information and Special Technologies Program
> Office of CounterIntelligence
> Brookhaven National Laboratory (631) 344-3001
> _______________________________________________
> New York PHP Talk
> Supporting AMP Technology (Apache/MySQL/PHP)
> http://lists.nyphp.org/mailman/listinfo/talk
> http://www.newyorkphp.org
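
As for the error 127: that's MyISAM's "record file is crashed" code,
so the underlying monthly tables are corrupt, not just the merge
wrapper over them. You can usually get the data readable again with a
repair pass while you plan the port (table name invented):

    -- From the mysql client, verify and then repair the MyISAM table:
    CHECK TABLE traffic_2004_07;
    REPAIR TABLE traffic_2004_07;

or run myisamchk -r against the table's files with the server shut
down. That won't fix the scaling problem, but it should stop the
bleeding while you migrate.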