Hello, I am in the process of studying the code and determining whether I can use it in an application I am developing.
A challenge I foresee, based on my initial search, has to do with running this as a web service.
In short, loading a large database into memory on each call would be a performance killer as the data size grows.
Also, as the data volume grows, the RAM needed to cache the data could become an issue in a cloud environment.
I am going to take a look at how the system does the matching and see if I can find any way to use a SQL or NoSQL database to balance how it works.
Granted, I may not find a better way, but I will take a look.
I saw mention of prefiltering. That can help in some ways, but it still tends not to scale very well, and it does not help the general case.
Say you have 100 places where users may be asked to authenticate.
Say you have 300,000 users with just one finger each to match.
A normal web service would need to load that data on any given call, and multiple concurrent calls would (by default) load the same data into yet more RAM.
A cache could allow n callers to share a common data pool, which helps somewhat, but it then adds the problem of reloading the data when the backing store changes.
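The "n callers share a common data pool" idea could be sketched roughly as below. This is a minimal illustration, not anything from the SourceAFIS codebase: templates are treated as opaque byte arrays, and the snapshot-swap approach is just one way to handle reloads. All request threads read one immutable snapshot, and a reload swaps in a fresh snapshot atomically when the backing store changes.

```java
import java.util.Map;
import java.util.concurrent.atomic.AtomicReference;

// Sketch of a shared template pool: every request thread reads from a
// single immutable snapshot instead of loading data per call, and a
// reload publishes a fresh snapshot atomically. Illustrative only; the
// byte[] template type stands in for whatever the matcher actually uses.
public class SharedTemplatePool {
    private final AtomicReference<Map<String, byte[]>> snapshot;

    public SharedTemplatePool(Map<String, byte[]> initial) {
        this.snapshot = new AtomicReference<>(Map.copyOf(initial));
    }

    // Called by every request thread; no locking, no per-call database load.
    public Map<String, byte[]> current() {
        return snapshot.get();
    }

    // Called when the backing store changes. Requests already in flight
    // keep the old snapshot; new requests see the fresh one.
    public void reload(Map<String, byte[]> fresh) {
        snapshot.set(Map.copyOf(fresh));
    }
}
```

The immutable-snapshot design sidesteps locking on the read path, at the cost of briefly holding two copies in RAM during a reload.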
Well, I am going to start learning the code and see what I find.
Definitely cache all templates in RAM. SourceAFIS was designed to work this way. AFAIK, most if not all fingerprint matchers need some form of in-memory storage. 300K templates can still fit in a mid-sized cloud instance.
As for keeping the cache up to date, write-through caching is the easiest to implement, but it doesn't let you run multiple servers concurrently. Write-back caching is more complicated and requires some change notification mechanism, but it will work in a distributed system.
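Write-through caching, as mentioned above, might look something like this sketch. It is an assumption-laden illustration, not SourceAFIS code: the `Database` interface and its `save`/`loadAll` methods are hypothetical stand-ins for whatever persistence layer is used, and templates are again treated as opaque byte arrays.

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

// Minimal write-through template cache. Every write persists to the
// backing store first, then updates the in-memory copy, so the cache
// never serves data the store does not yet have. Single-server only:
// another server's cache would not see these writes.
public class WriteThroughTemplateCache {
    // Hypothetical backing store; stands in for any SQL/NoSQL access layer.
    public interface Database {
        void save(String userId, byte[] template);
        Map<String, byte[]> loadAll();
    }

    private final Database db;
    private final ConcurrentHashMap<String, byte[]> cache;

    public WriteThroughTemplateCache(Database db) {
        this.db = db;
        // Warm the cache once at startup instead of once per request.
        this.cache = new ConcurrentHashMap<>(db.loadAll());
    }

    public void enroll(String userId, byte[] template) {
        db.save(userId, template);   // write-through: persist first
        cache.put(userId, template); // then update the shared copy
    }

    public byte[] get(String userId) {
        return cache.get(userId);    // reads never touch the database
    }

    public int size() {
        return cache.size();
    }
}
```

For the distributed case, the enroll path would additionally have to publish a change notification so other servers can refresh their copies, which is where the extra complexity comes in.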
Understood. I have to do a lot of study before I draw any conclusions; I just wanted to start a conversation and see what anyone else has found or tried.
I agree that we have to use some kind of in-memory approach.
I need to see the actual matching code before I can say anything of real meaning.
I got the files, but it will take time to learn what is going on.
Thanks for any input.