Igor: BLOB is not supported by LDI because there is no way to know the encoding form...
To index a BLOB column I added a CLOB column which is always null, and made a PL/SQL function...
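The workaround described above can be sketched as follows. This is a minimal, illustrative sketch, not the poster's actual code: the table, column, and function names are hypothetical, and the BLOB is assumed to hold AL32UTF8-encoded text.

```sql
-- Hypothetical sketch: expose BLOB text as CLOB so it can be indexed.
ALTER TABLE docs ADD (body_clob CLOB);  -- stays NULL, used only as index target

CREATE OR REPLACE FUNCTION blob_to_clob(p_blob BLOB) RETURN CLOB IS
  l_clob     CLOB;
  l_dest_off INTEGER := 1;
  l_src_off  INTEGER := 1;
  l_warn     INTEGER;
  l_lang     INTEGER := DBMS_LOB.DEFAULT_LANG_CTX;
BEGIN
  IF p_blob IS NULL THEN
    RETURN NULL;
  END IF;
  DBMS_LOB.CREATETEMPORARY(l_clob, TRUE);
  DBMS_LOB.CONVERTTOCLOB(l_clob, p_blob,
                         DBMS_LOB.LOBMAXSIZE,
                         l_dest_off, l_src_off,
                         NLS_CHARSET_ID('AL32UTF8'),  -- assumed source encoding
                         l_lang, l_warn);
  RETURN l_clob;
END;
/
```

The always-null CLOB column gives the domain index a CLOB target while the function supplies the decoded content. Note that DBMS_LOB.CONVERTTOCLOB requires the source character set ID, which is exactly the information a raw BLOB cannot provide on its own.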
Thank you, Marcelo. This third-party Analyzer really supports Russian morphology better...
Igor: If you have a .jar file with the new Analyzers compiled using Java 1.5...
Install Instructions
I tested org.apache.lucene.analysis.ru.RussianAnalyzer. It supports searching for words...
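A hedged sketch of how such an analyzer is typically wired into a Lucene Domain Index and queried. The parameter syntax and the lcontains/lscore operators are assumed from LDI's usual usage, and the table name is illustrative:

```sql
-- Assumed LDI parameter syntax; table and index names are hypothetical.
CREATE INDEX comments_lidx ON test(comments)
INDEXTYPE IS LUCENE.LUCENEINDEX
PARAMETERS ('Analyzer:org.apache.lucene.analysis.ru.RussianAnalyzer');

-- Morphological search via LDI's domain-index operator:
SELECT lscore(1) sc, comments
  FROM test
 WHERE lcontains(comments, 'слово', 1) > 0
 ORDER BY sc DESC;
```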
Documentation
Updated test suite
Only test with 1000 rows
Fixed spell checker new API
Updated to 3.6.2 dist
Change Log
Readme
Hi Igor: I think that the problem is that the analyzer is used during indexing and...
Below test: create table test_en (comments clob) / insert into test_en values ('comments')...
Marcelo, I can't understand what's wrong with the VARCHAR2-to-CLOB conversion in my test....
Igor: The problem is not with LDI/Lucene/CLOB encoding functionality. IMO the problem...
I made a test on an Oracle 11.2.0.3 database with NLS_CHARACTERSET AL32UTF8. New line...
Thank you, Marcelo. I tried your example with the function converting to UTF8. All works...
Igor: Lucene is always using UTF8 as character encoding for streams. My DB charset...
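The database and national character sets can be confirmed with a query against the NLS dictionary view:

```sql
SELECT parameter, value
  FROM nls_database_parameters
 WHERE parameter IN ('NLS_CHARACTERSET', 'NLS_NCHAR_CHARACTERSET');
```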
I tried entering the text with SQL Developer - the result is the same. I am using character...
Hi Igor: The problem is the conversion of the chr() function's result to CLOB. If you replace...
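One way this chr() pitfall commonly shows up, sketched here as an assumption about the scenario rather than the poster's exact fix: chr(n) produces bytes in the database character set, so a multi-byte UTF-8 sequence built with chr() only survives conversion to CLOB when the database character set is already AL32UTF8. UNISTR, which takes Unicode code points, avoids that dependency:

```sql
-- chr(208)||chr(176) is the UTF-8 byte pair for 'а' and only spells it
-- when the database character set is AL32UTF8. UNISTR('\0430') names the
-- same character by Unicode code point and converts correctly anywhere.
INSERT INTO test (id, comments)
VALUES (1, TO_CLOB(UNISTR('\0430\0431\0432')));  -- Russian 'абв'
```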
Below is the test table creation script: create table test ( id number unique, comments...
Hi Igor: IndexUpdateServ and IndexScanServ processes are used to increase LDI cache...
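If you want to check whether these background processes are alive, one hedged approach (assuming they run as database sessions under the LUCENE schema, which is an assumption about this install, not something the thread confirms) is:

```sql
-- Look for LDI background sessions; filter is an assumption.
SELECT username, program, status
  FROM v$session
 WHERE username = 'LUCENE';
```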
Hello. I installed Lucene Domain Index on Oracle 11.2.0.3. I see that LUCENE.IndexUpdateServ...