Re: [cx-oracle-users] Python 2: unicode vs str param causing slowness
From: Dayton G. <Day...@sy...> - 2015-02-04 21:33:54
Sadly, still pretty stuck on this issue. Was looking at the EXPLAIN PLAN
for the slow query. Used...
select plan_table_output
  from table(dbms_xplan.display_cursor('id', null, 'basic +PEEKED_BINDS'));
and
select plan_table_output
  from table(dbms_xplan.display_cursor('id', null, '+ALLSTATS'));
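[For the archives: the same plan fetch can be scripted from Python. A minimal sketch; `fetch_plan` is my own illustration, not anything from the thread, and it should work with any DB-API cursor (e.g. cx_Oracle's) given the real SQL_ID:]

```python
def fetch_plan(cursor, sql_id, fmt="basic +PEEKED_BINDS"):
    """Pull dbms_xplan.display_cursor output for a cached statement.

    cursor: any DB-API cursor (e.g. from a cx_Oracle connection).
    sql_id: the V$SQL identifier of the statement whose plan you want.
    """
    cursor.execute(
        "select plan_table_output from "
        "table(dbms_xplan.display_cursor(:sql_id, null, :fmt))",
        {"sql_id": sql_id, "fmt": fmt},
    )
    # Each row is a one-column line of the formatted plan.
    return [row[0] for row in cursor.fetchall()]
```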
I get the following (not gonna obfuscate the full query anymore. Too
stuck). Please correct me if I'm wrong, but I don't see anything
immediately out of place.
SELECT COUNT(*)
  FROM "APPSTORE_DEVICE"
 INNER JOIN "AUTH_USER" ON ("APPSTORE_DEVICE"."USER_ID" = "AUTH_USER"."ID")
 WHERE ("APPSTORE_DEVICE"."IS_ACTIVE" = 1
   AND "APPSTORE_DEVICE"."TENANT_ID" = 9
   AND "AUTH_USER"."IS_ACTIVE" = 1
   AND "APPSTORE_DEVICE"."USER_ID" IS NOT NULL
   AND "APPSTORE_DEVICE"."IS_CORPORATE_OWNED" = 0
   AND :arg0 = "APPSTORE_DEVICE"."PLATFORM")
PLAN_TABLE_OUTPUT
----------------------------------------------------------------------------
| Id | Operation                    | Name                     | E-Rows |
----------------------------------------------------------------------------
|  0 | SELECT STATEMENT             |                          |        |
|  1 |  SORT AGGREGATE              |                          |      1 |
|  2 |   NESTED LOOPS               |                          |        |
|  3 |    NESTED LOOPS              |                          |      1 |
|* 4 | TABLE ACCESS BY INDEX ROWID  | APPSTORE_DEVICE          |      1 |
|* 5 | INDEX RANGE SCAN             | APPSTORE_DEVICE_E3D75FEF |     55 |
|* 6 | INDEX RANGE SCAN             | AUTH_USER_55F56498       |  44270 |
|* 7 | TABLE ACCESS BY INDEX ROWID  | AUTH_USER                |      1 |
----------------------------------------------------------------------------
Predicate Information (identified by operation id):
---------------------------------------------------
4 - filter(("APPSTORE_DEVICE"."IS_ACTIVE"=1 AND
            "APPSTORE_DEVICE"."TENANT_ID"=9 AND
            "APPSTORE_DEVICE"."IS_CORPORATE_OWNED"=0 AND
            "APPSTORE_DEVICE"."USER_ID" IS NOT NULL))
5 - access("APPSTORE_DEVICE"."PLATFORM"=:ARG0)
6 - access("AUTH_USER"."IS_ACTIVE"=1)
7 - filter("APPSTORE_DEVICE"."USER_ID"="AUTH_USER"."ID")
Peeked Binds (identified by position):
--------------------------------------
1 - :ARG0 (NVARCHAR2(30), CSID=2000): 'ios'
Again, the device table has...
PLATFORM NVARCHAR2(20)
Does the size difference matter?
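[Editor's aside: the Len=6 in the peeked bind is just 'ios' encoded in AL16UTF16, which is big-endian UTF-16 (two bytes per ASCII character). A throwaway check, using Python's utf-16-be codec as a stand-in for AL16UTF16:]

```python
# AL16UTF16 is UTF-16 big-endian: every ASCII character costs two bytes.
raw = u"ios".encode("utf-16-be")
print(len(raw))              # 6, matching Len=6 in the peeked bind
print(list(bytearray(raw)))  # [0, 105, 0, 111, 0, 115] -> ^@,i,^@,o,^@,s
```

[As far as I know, the declared lengths (NVARCHAR2(30) bind vs NVARCHAR2(20) column) describe capacity, not encoding, so by itself that mismatch should not introduce a character-set conversion.]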
Thanks
--
On 1/26/15 1:06 PM, "Dayton Gomez" <Day...@sy...> wrote:
>>>> cursor.execute("""select :platform a, dump(:platform, 1017) b from
>>>>dual""", {'platform': 'ios'}).fetchall()
>[(u'ios', 'Typ=1 Len=6 CharacterSet=AL16UTF16: ^@,i,^@,o,^@,s')]
>
>>>> cursor.execute("""select :platform a, dump(:platform, 1017) b from
>>>>dual""", {'platform': u'ios'}).fetchall()
>[(u'ios', 'Typ=1 Len=6 CharacterSet=AL16UTF16: ^@,i,^@,o,^@,s')]
>
>The slow query with a dump (SELECT COUNT(*), dump(:platform, 1017)...)
>outputs the same encoding:
>(27195, 'Typ=1 Len=6 CharacterSet=AL16UTF16: ^@,i,^@,o,^@,s')
>
>They're the same. If what you said was correct, shouldn't they be
>different?
>--
>
>
>On 1/26/15 12:05 PM, "Doug Henderson" <djn...@gm...> wrote:
>
>>On 26 January 2015 at 10:34, Dayton Gomez wrote:
>>
>>>
>>> NLS_CHARACTERSET WE8MSWIN1252
>>> NLS_NCHAR_CHARACTERSET AL16UTF16 - is this what you requested?
>>> NLS_LANG = '.UTF8' (set by Django), the database has... NLS_LANGUAGE
>>> AMERICAN
>>>
>>This clearly shows that you will have a character set conversion
>>between VARCHAR2 and NVARCHAR2 strings.
>>
>>
>>Try issuing the following queries from django, to try to see this.
>>
>>cursor.execute("""select platform, dump(platform, 1017) from device""")
>>
>>cursor.execute("""select :platform a, dump(:platform, 1017) b from
>>dual""", {'platform': 'ios'})
>>
>>cursor.execute("""select :platform a, dump(:platform, 1017) b from
>>dual""", {'platform': u'ios'})
>>
>>I suspect you will see that the bound variable has a different charset
>>for the last two queries.
>>
>>In the slow query, I suspect that there will be an implicit character
>>set conversion that prevents using the index. If you can get the
>>execution plan for the fast and slow versions, they will be very
>>different.
>>
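[Editor's aside on Doug's implicit-conversion hypothesis: the two charsets in play have different widths, so the same value is a different byte string as VARCHAR2 than as NVARCHAR2 even for plain ASCII. A rough analog, using Python codec names as stand-ins (cp1252 for WE8MSWIN1252, utf-16-be for AL16UTF16):]

```python
value = u"ios"
db_bytes = value.encode("cp1252")       # database charset WE8MSWIN1252
nat_bytes = value.encode("utf-16-be")   # national charset AL16UTF16
print(len(db_bytes), len(nat_bytes))    # 3 6
# Comparing a VARCHAR2 column to an NVARCHAR2 bind means Oracle must
# convert one side at run time, which can disable an index on the column.
print(db_bytes.decode("cp1252") == nat_bytes.decode("utf-16-be"))  # True
```

[In this thread, though, PLATFORM is itself NVARCHAR2 and the peeked bind is NVARCHAR2, so both sides are already AL16UTF16 — consistent with Dayton's dumps coming out identical and with the plan still using the index.]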