Hi,
What is the best way to handle the following:
* List of 1M unique phrases
* These are "whitelist" phrases, meaning that if an input string
matches one of these phrases it is "GOOD"; if it does not match any of
them it is "BAD".
* The process gets hit 200-500 times per SECOND
* I want to avoid making this many look up requests to a database
with
clustered index that stores these phrases
* I want to somehow detect a BAD phrase with an in-memory lookup
* The problem is that in my environment (ASP.NET) the regular
hashtable takes up too much space in memory for these 1M records
(since it actually stores the values of each record).
* I do not care about the value of the record; I only want to store
its hash, in the most minimal way, so that the lookups that happen
200-500 times per second can use the in-memory structure.
Any suggestions for C#/ASP.NET environment?
Thanks,
AV
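One way to sketch the "store only the hash" idea: keep a sorted array of 64-bit hashes (about 8 MB for 1M phrases, versus tens of MB for a Hashtable holding the strings themselves) and answer each lookup with a binary search. This is only an illustrative sketch, not from the original post; the class and method names are made up, and it assumes a 64-bit hash is acceptable (a 32-bit String.GetHashCode would very likely collide at 1M entries, while expected collisions at 64 bits are negligible). It also accepts that a hash collision would misclassify a BAD phrase as GOOD in rare cases.

```csharp
using System;
using System.Security.Cryptography;
using System.Text;

// Hypothetical sketch: whitelist membership via sorted 64-bit hashes.
public static class PhraseWhitelist
{
    // Sorted 64-bit hashes of the whitelist phrases (8 bytes per phrase).
    static ulong[] _hashes;

    // Derive a 64-bit hash from the first 8 bytes of an MD5 digest.
    public static ulong ComputeHash64(string phrase)
    {
        using (var md5 = MD5.Create())
        {
            byte[] digest = md5.ComputeHash(Encoding.UTF8.GetBytes(phrase));
            return BitConverter.ToUInt64(digest, 0);
        }
    }

    // Build the structure once, e.g. at application start.
    public static void Load(string[] phrases)
    {
        _hashes = new ulong[phrases.Length];
        for (int i = 0; i < phrases.Length; i++)
            _hashes[i] = ComputeHash64(phrases[i]);
        Array.Sort(_hashes);  // enables O(log n) binary search
    }

    // GOOD if the input's hash is found, BAD otherwise.
    public static bool IsGood(string input)
    {
        return Array.BinarySearch(_hashes, ComputeHash64(input)) >= 0;
    }
}
```

Loaded once (e.g. in Application_Start), the sorted array is read-only afterwards, so concurrent lookups from ASP.NET worker threads need no locking, and a few hundred binary searches per second over 1M entries is trivial CPU-wise. A Bloom filter could shrink memory further (a few MB of bits) at the cost of a tunable false-positive rate.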