From: Gerhard <ger...@gm...> - 2002-06-05 17:56:57
* pa...@bo... <pa...@bo...> [2002-06-05 17:08 -0000]:
> On Wed, 5 Jun 2002 18:40:11 +0200 Gerhard Häring wrote:
> >CC-ing to pypgsql-users list, maybe other users have comments on this
> >issue, too.
>
> [Arrays vs. 'IN']
>
> >QUERY: DECLARE PgSQL_08120A6C CURSOR FOR select * from cages where name
> >       in '{"mammals","apes"}'
> >
> >... which obviously cannot work. The reason is that if pyPgSQL encounters
> >a tuple or a list in the statement parameters, it will quote them suitably
> >for use as an ARRAY. Unfortunately, the ARRAY syntax is different from
> >what you would need for the IN operator in SQL.
>
> Right, and there's no way of knowing what the database engine expects for
> that parameter until it has parsed the statement.

Unless you also parse at the client side, which I'd rather not ;-) There
might be a solution with controlling the behaviour using a flag in the
cursor object, for example. (One possible client-side workaround is
sketched below, after the signature.)

> I suppose there isn't a way of getting type information from PostgreSQL
> about parameters - indeed, does PostgreSQL actually support parameters in
> the way that other database systems do?

You mean prepared statements? No, PostgreSQL doesn't currently support
these. All it gets is a query string.

From what I hear on IRC and on postgresql-hackers, people are working on
prepared statements. Unfortunately, we'll have a problem with them and
pyPgSQL because, as far as I can see, prepared statements don't play nice
with the DB-API 2.0 'format' and 'pyformat' parameter quoting styles. I
believe that the 'qmark' or 'numeric' styles are designed exactly for
prepared statement use. It might be possible to rewrite the query and
continue using our 'pyformat' style, but I haven't tried yet. (A rough
sketch of such a rewrite also follows below.)

> [Unicode patch]
> That would be great. Where XML data and PostgreSQL come together, it
> quickly becomes a requirement to handle the data gracefully or
> transparently, although I suspect that many database systems still
> struggle with Unicode.

Hopefully, pyPgSQL with PostgreSQL works fine. PostgreSQL can even use the
UTF-8 encoding at the server side. With the Unicode patch and appropriate
parameters for the connect() call, you should be able to use full Unicode:

>>> from pyPgSQL import PgSQL
>>> conn = PgSQL.connect(client_encoding="utf-8", unicode_results=1)
>>> cursor = conn.cursor()
>>> cursor.execute("insert into test(name) values (%s)", unicode("Österreich", "latin1"))
>>> cursor.execute("select * from test")
>>> res = cursor.fetchone()
>>> print repr(res.name)
u'\xd6sterreich'

Btw. the Unicode support that PySQLite recently gained is based on the same
patch, and it includes an example for mass-importing Freshmeat project data
from XML into a database. As you said, XML is Unicode-based, so this is a
relatively practical example, I hope.

Gerhard
--
mail:   gerhard <at> bigfoot <dot> de       registered Linux user #64239
web:    http://www.cs.fhm.edu/~ifw00065/    OpenPGP public key id 86AB43C0
public key fingerprint: DEC1 1D02 5743 1159  CD20 A4B6 7B22 6575 86AB 43C0
reduce(lambda x,y:x+y,map(lambda x:chr(ord(x)^42),tuple('zS^BED\nX_FOY\x0b')))
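
[Sketch 1 -- possible client-side workaround for the IN operator]

One way around the ARRAY quoting, purely on the client side, is to expand
the sequence into individual %s placeholders before calling execute(). This
is only a sketch, assuming format-style positional parameters work as in
the Unicode example above; select_in() is a hypothetical helper, not part
of pyPgSQL:

def select_in(cursor, table, column, values):
    # Build "IN (%s, %s, ...)" with one placeholder per value, so each
    # value is quoted individually instead of as a PostgreSQL ARRAY literal.
    placeholders = ", ".join(["%s"] * len(values))
    query = "select * from %s where %s in (%s)" % (table, column, placeholders)
    # NOTE: table and column are interpolated directly and must be trusted.
    cursor.execute(query, tuple(values))
    return cursor.fetchall()

# e.g. select_in(cursor, "cages", "name", ["mammals", "apes"])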
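
[Sketch 2 -- rewriting 'pyformat' to numeric placeholders]

If prepared statements do arrive with $1, $2, ... style parameters,
rewriting a 'pyformat' query might look roughly like this. This is an
assumption-laden sketch: pyformat_to_numeric() is hypothetical, not pyPgSQL
code, and it ignores complications such as '%%' escapes and markers inside
string literals:

import re

_PYFORMAT = re.compile(r"%\((\w+)\)s")

def pyformat_to_numeric(query, params):
    # Replace each '%(name)s' marker with $1, $2, ... and collect the
    # values in positional order; repeated names reuse the same number.
    positions = {}   # name -> placeholder number
    values = []      # values in $1, $2, ... order
    def repl(match):
        name = match.group(1)
        if name not in positions:
            values.append(params[name])
            positions[name] = len(values)
        return "$%d" % positions[name]
    return _PYFORMAT.sub(repl, query), values

# e.g. pyformat_to_numeric(
#          "select * from cages where name = %(name)s or keeper = %(name)s",
#          {"name": "apes"})
#      -> ('select * from cages where name = $1 or keeper = $1', ['apes'])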