A high-performance sensitive word (illegal word / profanity) detection and filtering component for C#. It supports traditional/simplified Chinese conversion, full-width/half-width conversion, Chinese characters to pinyin, fuzzy matching, and other features.

Using StringSearchEx2.Replace for filtering against a dictionary of 48,000 sensitive words, it processes over 300 million characters per second (CPU: i7-8750H). C#'s built-in regular expressions are very slow by comparison: StringSearchEx2.ContainsAny is more than 88,000 times faster than Regex.IsMatch, though the exact ratio depends on the number of keywords. In the "Find All" test, the sensitive words found in the test text are not displayed; you can inspect them yourself in a debugger.

A dedicated class is provided for filtering illegal words (sensitive words). It lets you set the allowed skip-character distance and, by default, handles full-width to half-width conversion, case-insensitive matching, skip characters, repeated characters, a blacklist, and more. A minimal usage sketch follows.
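The sketch below shows how the keyword-search API described above might be used. Only StringSearchEx2.ContainsAny and StringSearchEx2.Replace are named in this section; the namespace, the SetKeywords initializer, and the mask-character parameter are assumptions about a typical setup and may differ from the actual API.

```csharp
// Minimal sketch, assuming the ToolGood.Words namespace and a
// SetKeywords(ICollection<string>) initializer; only ContainsAny and
// Replace are named above, the rest is illustrative.
using System;
using System.Collections.Generic;
using ToolGood.Words;   // assumed namespace for StringSearchEx2

class SensitiveWordDemo
{
    static void Main()
    {
        // Sensitive-word dictionary (the benchmark above uses ~48k entries).
        var keywords = new List<string> { "badword1", "badword2" };

        var search = new StringSearchEx2();
        search.SetKeywords(keywords);   // assumed initialization call

        string text = "some user input containing badword1 ...";

        // Fast containment check (compared against Regex.IsMatch above).
        bool hit = search.ContainsAny(text);

        // Filtering: replace every matched keyword with a mask character.
        string filtered = search.Replace(text, '*');

        Console.WriteLine($"hit={hit}, filtered={filtered}");
    }
}
```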