I've got a bunch of MySQL tables with the same structure, each storing an MD5 hash as BINARY(16) (the hex digest unhexed).
However, because I add data randomly across these different MySQL tables (to spread write load across servers and tables), I can't enforce uniqueness without either a centralized table with a UNIQUE/PRIMARY index or an external key-value optimized storage system.
MySQL is TOO SLOW at enforcing uniqueness here: insert time grows steadily as the unique table accumulates more and more records.
I need a sample, working model using an external system such as MongoDB, CouchDB, Tokyo Cabinet, Tokyo Tyrant, Voldemort, or Cassandra: before an MD5 hash is inserted into a MySQL table, it is checked in batch against one of these systems.
My requirement is that the system can check and insert 1 million unique hashes into the key-value store in under 5 seconds, while the store holds fewer than 100M unique keys and all values are null.
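For reference, here is a minimal sketch of the check-then-insert flow I have in mind, using an in-memory dict as a stand-in for the external store (the `InMemoryKV` class and its method names are hypothetical; a real deployment would swap in something like Tokyo Tyrant's `putkeep`, Redis `SETNX` in a pipeline, or a MongoDB collection with a unique index and unordered bulk inserts):

```python
import hashlib

class InMemoryKV:
    """Stand-in for an external key-value store (Tokyo Tyrant, Redis,
    MongoDB with a unique index, ...). Keys are the 16-byte MD5
    digests; values are null/None, matching the requirement above."""

    def __init__(self):
        self._store = {}

    def check_and_insert_batch(self, digests):
        """Insert every digest not already present.
        Returns the subset of digests that were genuinely new, i.e.
        the rows that are then safe to write to a MySQL shard."""
        new = [d for d in digests if d not in self._store]
        for d in new:
            self._store[d] = None
        return new

kv = InMemoryKV()
batch = [hashlib.md5(str(i).encode()).digest() for i in range(5)]
first = kv.check_and_insert_batch(batch)   # all 5 are new
second = kv.check_and_insert_batch(batch)  # all 5 are now duplicates
```

Note that with a real networked store the per-key check-and-insert must be atomic (e.g. a conditional "put if absent" operation) so that two concurrent writers cannot both see a hash as new.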