I have a directory containing files for users of a program of mine; there are around 70k JSON files in it. The current search method uses glob and a foreach loop. It has become quite slow and is hogging the server. Is there a good way to search through these files more efficiently? I'm running this on an Ubuntu 16.04 machine, and I can use exec if needed.
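Since `exec` is available, one option is to hand the scan to `grep` instead of opening every file in PHP; a single recursive `grep` over a directory is usually far faster than a glob-and-fopen loop. A minimal sketch, using a throwaway demo directory (`/tmp/users_demo`) and query (`alice`) as stand-ins for the real path and search term:

```shell
# Demo setup: two sample user files standing in for the real directory.
mkdir -p /tmp/users_demo
printf '{"name":"alice","city":"Oslo"}'   > /tmp/users_demo/u1.json
printf '{"name":"bob","city":"Berlin"}'   > /tmp/users_demo/u2.json

# -r recurses into the directory, -l prints only the names of matching
# files, and -F treats the query as a literal string rather than a regex.
grep -rlF "alice" /tmp/users_demo --include='*.json'
# prints: /tmp/users_demo/u1.json
```

From PHP this could be invoked via `exec()`, passing the query through `escapeshellarg()` so user input cannot inject shell syntax. Note that `grep` matches raw text, not JSON structure, so a query can also match key names or substrings of other values.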
Update:
These are JSON files, and each file needs to be opened to check whether it contains the search query. Looping over the file names is quite fast, but opening each file takes quite a while.
They cannot be indexed with SQL or memcached, as I'm already using memcached for other things.
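If a single sequential `grep` is still too slow for 70k files, the scan can be spread across CPU cores with `find` piped into GNU `xargs -P`. This is a sketch under the same assumptions as above (`/tmp/users_demo2` and `alice` are placeholders for the real directory and query):

```shell
# Demo setup: stand-in for the real 70k-file directory.
mkdir -p /tmp/users_demo2
printf '{"name":"alice"}' > /tmp/users_demo2/u1.json
printf '{"name":"bob"}'   > /tmp/users_demo2/u2.json

# find streams file names NUL-separated (-print0) so names with spaces
# survive; xargs batches them and runs up to 4 grep processes in
# parallel (-P 4). -l keeps the output to matching file names only.
find /tmp/users_demo2 -name '*.json' -print0 \
  | xargs -0 -P 4 grep -lF "alice"
# prints: /tmp/users_demo2/u1.json
```

With parallel workers the output order is not guaranteed, so sort the result if the caller depends on ordering. This only speeds up the scan itself; if searches are frequent, a small on-disk index (even a flat file mapping terms to file names) would avoid rereading all 70k files on every query.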