This is more of a MySQL question, but I'm hoping there are enough experts here to help me.
Usually, when I put together a database, I create one table per data type. I've never had a problem when a table contained 100, 500, or even 3,000 entries. This new system, however, might have to hold up to a million entries.
I suspect that many rows in one table might be pushing it, especially under heavy user traffic. I was thinking of splitting the dataset across several tables and routing lookups through a hash function, but I'm concerned that this would make accessing the entries rather complicated.
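To be concrete, here is a minimal sketch of the routing I had in mind (in Python for illustration; the table count and the `entries_N` naming scheme are just placeholders I made up):

```python
import zlib

NUM_TABLES = 16  # hypothetical number of shard tables

def table_for(key: str) -> str:
    """Map a record key to the shard table that should hold it."""
    # zlib.crc32 is stable across runs, unlike Python's built-in hash()
    bucket = zlib.crc32(key.encode("utf-8")) % NUM_TABLES
    return f"entries_{bucket}"
```

Every read and write would first call something like `table_for(key)` to pick the right table, which is exactly the extra indirection I'm worried about.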
Any suggestions?
