From: Daniel J S. <dan...@ie...> - 2004-06-02 02:06:02
Daniel J Sebald wrote:
>> No. Look at the actual definition of these functions, still conserved
>> from the K&R days, through both ANSI/ISO C standards, and still valid
>> for a current Linux box:
>>
>>     int putchar(int c);
>>
>> That argument is an int, not a char, for exactly the reasons we're
>> discussing here.
>
> Really?! A function called "putchar" whose argument is an "int"
> rather than "char"? Nice.

Would this be a tolerable hack? Define a special type in some header for
char's as function arguments:

    /* Throwback to K&R */
    #ifdef __STDC__
    #define ACHAR char
    #else
    #define ACHAR int
    #endif

    do_enh_writec(ACHAR c)

Hopefully there aren't too many, and it might be fairly easy to search them
out, e.g.,

    grep char */* > junk
    xemacs junk

(in xemacs search for '\nchar' because function arguments start in the first
column)

Anyway, is gnuplot.texi a generated file? (I see something called
doc2texi.el)  If so, should it be in the cvsignore list so that it isn't
checked out when getting the latest CVS version?

Dan

From: Daniel J S. <dan...@ie...> - 2004-06-01 22:13:17
Ethan Merritt wrote:
>> So, the question is whether the PostScript standard requires that image
>> line data end on a byte boundary.
>
> I don't know if this answers your question, but the standard says:
>
> "Each row of the source image begins on a character boundary.
>  If the number of data bits per row is not a multiple of 8, the end of the
>  row must be padded with extra bits to fill up the last character.
>  The PostScript interpreter ignores these bits."

Where did you find that? In the following book?

    @book{AdobeSystems:1999,
      author    = {Adobe Systems},
      title     = {{PostScript} language reference},
      edition   = {3rd},
      publisher = {Addison-Wesley},
      year      = 1999,
      address   = {Reading, MA},
      comment   = {The language definition. LZWEncode, FlateEncode, DCTEncode,
                   RunLengthEncode, CCITTFaxEncode filter.},
      callno    = {QA76.73 P67 P67 1999},
      location  = {Wendt}
    }

In any case, that is the answer I'm looking for. Thanks.

> p.s.
> Please do not send Email as HTML.
> Your mailer apparently doesn't intermingle the PNG images.

OK... I cc'd to the list; I'll hear about that one. :)

Dan

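[A minimal C sketch of the padding rule quoted above, for illustration only; the helper name is made up and is not gnuplot or PostScript code. Each image row is padded up to a whole number of 8-bit characters, and the interpreter ignores the fill bits.]

    #include <stdio.h>

    /* Bytes needed for one image row: width pixels at bits_per_sample each,
     * rounded up to the next whole 8-bit character, as the quoted rule requires. */
    static unsigned row_bytes(unsigned width, unsigned bits_per_sample)
    {
        unsigned row_bits = width * bits_per_sample;
        return (row_bits + 7) / 8;      /* pad the partial byte at the end of the row */
    }

    int main(void)
    {
        /* 10 pixels at 4 bits = 40 bits = 5 bytes (no padding);
         * 10 pixels at 12 bits = 120 bits = 15 bytes;
         * 3 pixels at 12 bits = 36 bits, padded to 5 bytes. */
        printf("%u %u %u\n", row_bytes(10, 4), row_bytes(10, 12), row_bytes(3, 12));
        return 0;
    }
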
From: Daniel J S. <dan...@ie...> - 2004-06-01 18:47:52
Hans-Bernhard Broeker wrote:
> On Tue, 1 Jun 2004, Daniel J Sebald wrote:
>> Hans-Bernhard Broeker wrote:
>
> [...]
>
>>> So the choice is this: either we finally and officially discontinue all
>>> support for K&R compilers, or we make such parameters int everywhere.
>
>> I'm beginning to follow this thread... My first thought is to not
>> discard the K&R convention without a compelling argument, given that I'm
>> sure a lot of people put a lot of effort into making it compatible.
>
> Well, let's just say that lack of people with regular exposition to
> platforms to test K&R compatibility on is about as compelling an argument
> as they come. Only after the release, when it was altogether too late,
> did anybody even think of testing this on an actual K&R compiler. And
> they failed.
>
>> The TBOOLEAN stuff didn't seem like all that much trouble.
>
> Trouble enough that it broke in rather subtle ways on some platforms
> before I changed it in 3.8k.2 (?), a couple of weeks before the 4.0
> release. Now, after that change, it's broken in other, not quite as
> subtle ways, on other platforms.
>
>> What are the issues with char in K&R C functions?
>
> The issue is not so much with char itself, but with the lack of function
> prototypes. This means that all argument types of functions external to a
> source module, and optionally those from inside, too, are implicitly
> defined only,

That's the problem I'm experiencing.

> and one of the effects of that is that all integer types
> smaller than int (i.e. char, and maybe short, too) get casted to int
> before being passed to a function.
>
>> Certainly, that is almost all they worked with at the time of its
>> writing, i.e., putchar, getchar.
>
> No. Look at the actual definition of these functions, still conserved
> from the K&R days, through both ANSI/ISO C standards, and still valid
> for a current Linux box:
>
>     int putchar(int c);
>
> That argument is an int, not a char, for exactly the reasons we're
> discussing here.

Really?! A function called "putchar" whose argument is an "int" rather than
"char"? Nice.

Dan

From: Hans-Bernhard B. <br...@ph...> - 2004-06-01 18:03:17
On Tue, 1 Jun 2004, Daniel J Sebald wrote:
> Hans-Bernhard Broeker wrote:

[...]

> > So the choice is this: either we finally and officially discontinue all
> > support for K&R compilers, or we make such parameters int everywhere.

> I'm beginning to follow this thread... My first thought is to not
> discard the K&R convention without a compelling argument, given that I'm
> sure a lot of people put a lot of effort into making it compatible.

Well, let's just say that lack of people with regular exposition to
platforms to test K&R compatibility on is about as compelling an argument
as they come. Only after the release, when it was altogether too late,
did anybody even think of testing this on an actual K&R compiler. And
they failed.

> The TBOOLEAN stuff didn't seem like all that much trouble.

Trouble enough that it broke in rather subtle ways on some platforms
before I changed it in 3.8k.2 (?), a couple of weeks before the 4.0
release. Now, after that change, it's broken in other, not quite as
subtle ways, on other platforms.

> What are the issues with char in K&R C functions?

The issue is not so much with char itself, but with the lack of function
prototypes. This means that all argument types of functions external to a
source module, and optionally those from inside, too, are implicitly
defined only, and one of the effects of that is that all integer types
smaller than int (i.e. char, and maybe short, too) get casted to int
before being passed to a function.

> Certainly, that is almost all they worked with at the time of its
> writing, i.e., putchar, getchar.

No. Look at the actual definition of these functions, still conserved
from the K&R days, through both ANSI/ISO C standards, and still valid
for a current Linux box:

    int putchar(int c);

That argument is an int, not a char, for exactly the reasons we're
discussing here.

--
Hans-Bernhard Broeker (br...@ph...)
Even if all the snow were burnt, ashes would remain.

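[A small standalone illustration of the promotion rule described above; this is not gnuplot code, and it uses the old-style definition syntax that pre-C23 compilers still accept. With no prototype in scope, a char argument undergoes default argument promotion to int, so the callee must declare the parameter as int, exactly as putchar() does.]

    #include <stdio.h>

    /* Old-style (K&R) definition: the caller, seeing no prototype, promotes a
     * char argument to int, so the parameter is declared int here -- the same
     * reasoning behind int putchar(int c). */
    static int write_one(c)
        int c;
    {
        return putchar(c);
    }

    int main(void)
    {
        char ch = 'x';

        write_one(ch);      /* ch undergoes default argument promotion to int */
        putchar('\n');
        return 0;
    }
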
From: Daniel J S. <dan...@ie...> - 2004-06-01 17:29:20
Hans-Bernhard Broeker wrote:
> [Taken over to the list, because of the fundamental issue at the bottom.]
>
> On Wed, 26 May 2004, Ethan Merritt wrote:
>> I thought it looked wrong at the time, but I haven't had time to look at
>> it in detail.
>
> Feels familiar. I handed out my PhD thesis to the institution yesterday,
> 5 minutes before the deadline at noon. Phew ;-)

(Feels familiar, the deadline thing.)

>> In particular the change that broke things was in term.c:
>>     static void do_enh_writec __PROTO((int c));
>>     /* note: c is char, but must be declared int due to an old K&R ANSI-C strict HP cc */
>
> Ah, now I see more clearly.
>
> But actually, as long as gnuplot is supposed to be compilable under K&R C,
> which it is(!), Petr's change did go in the right direction. It just
> didn't go far enough. In K&R C, function arguments indeed can't be char
> without causing all kinds of problems. But neither can you cast function
> pointers to functions of a different signature and expect the resulting
> code to work.
>
> So the choice is this: either we finally and officially discontinue all
> support for K&R compilers, or we make such parameters int everywhere.
>
> Given all the troubles with TBOOLEAN (which are actually essentially the
> same thing, since TBOOLEAN ends up as char on K&R compilers), I tend
> towards declaring K&R C dead. But if we do that, we should at least split
> off a development branch (4.1) first, and do it there.

I'm beginning to follow this thread... My first thought is to not discard
the K&R convention without a compelling argument, given that I'm sure a lot
of people put a lot of effort into making it compatible. (And I'm not K&R
biased. Having learned C on PCs, I've come to toss software convention to
the wind.)

The TBOOLEAN stuff didn't seem like all that much trouble. What are the
issues with char in K&R C functions? Certainly, that is almost all they
worked with at the time of its writing, i.e., putchar, getchar. That is,
do_enh_writec() is so similar to putchar() that I can't see where the
trouble arises?

What's the deal with the old HP compiler? Is that the thing causing
problems? What do some of its standard headers look like for compatibility,
e.g., putchar()?

Dan

From: Ethan M. <merritt@u.washington.edu> - 2004-06-01 17:25:19
On Tuesday 01 June 2004 09:38 am, Hans-Bernhard Broeker wrote:
> No. This should apply to *all* editions of the documentation integrated
> with the program itself. In a nutshell: if you type 'help pdf' in your
> gnuplot, it should display the help chunk from term/pdf.trm if and only
> if 'set term pdf', typed at that same prompt, turns on the pdf terminal.
> E.g. only if the PDF terminal was actually built into the program.

What is so bad about giving complete help even if your local version is
incomplete? If nothing else, it allows the user to think "That looks nice,
I should rebuild gnuplot to include support for PDF".

To pick a random parallel case from the standard linux man page set, if I
type "man sched_setscheduler" it tells about a bunch of commands to control
what CPU my jobs run on, whether or not I actually *have* more than one CPU.
Similarly if you type "man gcc" you will get compiler options for
architectures you don't have, and "man groff" will describe options for
fancy type-setting devices you don't have.

What I do think might help is if the error message from 'set term <whatever>'
were a bit more informative. E.g.:

    gnuplot> set term foo
             unknown or ambiguous terminal type; type just 'set terminal' for a list
    gnuplot> set term pdf
             support for the pdf terminal was not built into your local copy of gnuplot

In fact I would prefer that all conditionally compiled options produce this
sort of error message:

    gnuplot> set pm3d
             support for pm3d mode was not built into your local copy of gnuplot

and so on.

--
Ethan A Merritt  merritt@u.washington.edu
Biomolecular Structure Center  Mailstop 357742
University of Washington, Seattle, WA 98195

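[A minimal sketch of the pattern being proposed above. The names report_missing_feature and HAVE_PM3D are hypothetical, not gnuplot's actual symbols: the option parser always recognizes the keyword, but when the feature was configured out it prints a message saying so instead of a generic "unknown option" error.]

    #include <stdio.h>

    /* Hypothetical helper: the keyword is real, but the feature was left
     * out of this particular build. */
    static void report_missing_feature(const char *what)
    {
        fprintf(stderr,
                "support for %s was not built into your local copy of gnuplot\n",
                what);
    }

    static void set_pm3d(void)
    {
    #ifdef HAVE_PM3D                    /* illustrative configure macro */
        puts("pm3d enabled");           /* real option handling would go here */
    #else
        report_missing_feature("pm3d mode");
    #endif
    }

    int main(void)
    {
        set_pm3d();
        return 0;
    }
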
From: Hans-Bernhard B. <br...@ph...> - 2004-06-01 16:49:59
On Tue, 1 Jun 2004, Petr Mikulik wrote:
> In summary: it is obviously only gnuplot.gih which likes to be created
> with only those terminals compiled into the gnuplot executable, right?

No. This should apply to *all* editions of the documentation integrated
with the program itself. In a nutshell: if you type 'help pdf' in your
gnuplot, it should display the help chunk from term/pdf.trm if and only
if 'set term pdf', typed at that same prompt, turns on the pdf terminal.
E.g. only if the PDF terminal was actually built into the program. Same
for all other terminal drivers.

The crucial distinction is between documentation closely linked to an
individual binary, and documentation distributed independent of it.

> Then, the particular piece of Makefile, should use the original method of
> creating its own xxxterm.h, and delete it after creation.

I don't think anything like that was ever the original method. The original
method, IIRC, was to have no private term.h at all, but just use src/term.h
for that purpose, because that file contains the definitive selection of
drivers.

I experimented with this, replacing all occurrences of "term.h" in
docs/Makefile.in with $(TERM_H), and setting that to ../src/term.h. Both
the gih and the "check" target worked nicely.

--
Hans-Bernhard Broeker (br...@ph...)
Even if all the snow were burnt, ashes would remain.

From: Ethan M. <merritt@u.washington.edu> - 2004-06-01 16:33:59
On Tuesday 01 June 2004 09:22 am, Petr Mikulik wrote:
> > So: why can't the doc2*.o files just refer to ../src/term.h? What is
> > docs/term.h needed for? I guess this question goes mainly to Lars,
> > but any insight from others would be valuable.
>
> In summary: it is obviously only gnuplot.gih which likes to be created
> with only those terminals compiled into the gnuplot executable, right?

There is still a vestigial target in the docs Makefile called "allgih",
which was originally there to provide the option of listing all terminals
in the on-line help. Intentionally or not, this has now become the default.

I have no objection to restoring the original behavior of "make gih". On
the other hand I see little harm in including all the terminals in the
on-line help, so long as "set term" provides a correct list of which
terminals are actually available.

--
Ethan A Merritt  merritt@u.washington.edu
Biomolecular Structure Center  Mailstop 357742
University of Washington, Seattle, WA 98195

From: Petr M. <mi...@ph...> - 2004-06-01 16:22:55
> But Lars introduced a change (revision 1.44 of docs/Makefile.in) that
> relies on a separate file, docs/term.h, built by concatenating and
> preprocessing the config.h file and a selection of terminal driver
> sources listed in docs/Makefile.in, $(CORETERM):
>
>     term.h: $(CORETERM)
>             @echo "Building term.h"
>             @cat ../config.h $(CORETERM) > term.c
>             $(CPP) $(DEFS) $(DEFAULT_INCLUDES) $(INCLUDES) $(CPPFLAGS) \
>                 -DTERM_DRIVER_H -DTERM_HELP term.c | \
>                 sed '/^ *$$/d;/^#/d' >$@
>             @rm -f term.c
>
> The problem with this construction is that the logical connection between
> config.h and the decision which terminal drivers to include exists *only*
> inside src/term.h, so this method completely fails to screen out the
> unwanted drivers.
>
> So: why can't the doc2*.o files just refer to ../src/term.h? What is
> docs/term.h needed for? I guess this question goes mainly to Lars, but
> any insight from others would be valuable.

In summary: it is obviously only gnuplot.gih which likes to be created with
only those terminals compiled into the gnuplot executable, right?

Then the particular piece of Makefile should use the original method of
creating its own xxxterm.h, and delete it after creation. All other output
formats of the gnuplot documentation will list all terminals. Should it be
like that?

---
PM

From: Lars H. <lhe...@us...> - 2004-06-01 16:09:22
> But Lars introduced a change (revision 1.44 of docs/Makefile.in) that
> relies on a separate file, docs/term.h, built by concatenating and
> preprocessing the config.h file and a selection of terminal driver
> sources listed in docs/Makefile.in, $(CORETERM):
>
>     term.h: $(CORETERM)
>             @echo "Building term.h"
>             @cat ../config.h $(CORETERM) > term.c
>             $(CPP) $(DEFS) $(DEFAULT_INCLUDES) $(INCLUDES) $(CPPFLAGS) \
>                 -DTERM_DRIVER_H -DTERM_HELP term.c | \
>                 sed '/^ *$$/d;/^#/d' >$@
>             @rm -f term.c
>
> The problem with this construction is that the logical connection between
> config.h and the decision which terminal drivers to include exists *only*
> inside src/term.h, so this method completely fails to screen out the
> unwanted drivers.
>
> So: why can't the doc2*.o files just refer to ../src/term.h? What is
> docs/term.h needed for? I guess this question goes mainly to Lars, but
> any insight from others would be valuable.

Had to go back to the list archive ... There was a discussion here,
Subject: "LaTeX docs, and docs cleanup", involving Petr, Ethan, and myself
wrt alphabetical order of help entries. That was solved for allterm.h,
where it matters.

I think the idea behind introducing docs/term.h was to simplify the makefile
somewhat, but the problem is indeed that config.h alone does not mask out
unwanted drivers, which was the assumption.

From: Hans-Bernhard B. <br...@ph...> - 2004-06-01 15:18:09
Hi, folks,

to my considerable surprise, I've just discovered that, as of December 2002,
the build of online help no longer masks out terminal drivers. Our Debian
package maintainer just forwarded this to us, as a bug report.

This used to be done by having docs/termdoc.c #include docs/allterm.h for
printable doc files, or src/term.h for online versions that should match
what is actually present in term.h.

But Lars introduced a change (revision 1.44 of docs/Makefile.in) that relies
on a separate file, docs/term.h, built by concatenating and preprocessing
the config.h file and a selection of terminal driver sources listed in
docs/Makefile.in, $(CORETERM):

    term.h: $(CORETERM)
            @echo "Building term.h"
            @cat ../config.h $(CORETERM) > term.c
            $(CPP) $(DEFS) $(DEFAULT_INCLUDES) $(INCLUDES) $(CPPFLAGS) \
                -DTERM_DRIVER_H -DTERM_HELP term.c | \
                sed '/^ *$$/d;/^#/d' >$@
            @rm -f term.c

The problem with this construction is that the logical connection between
config.h and the decision which terminal drivers to include exists *only*
inside src/term.h, so this method completely fails to screen out the
unwanted drivers.

So: why can't the doc2*.o files just refer to ../src/term.h? What is
docs/term.h needed for? I guess this question goes mainly to Lars, but any
insight from others would be valuable.

--
Hans-Bernhard Broeker (br...@ph...)
Even if all the snow were burnt, ashes would remain.

From: Petr M. <mi...@ph...> - 2004-05-28 15:00:03
Currently, interactive terminals map keycodes to ascii values of 'a' .. 'Z',
or to values above 1000, like:

    enum {
        GP_FIRST_KEY = 1000,
        GP_BackSpace,
        GP_Tab,
        GP_Linefeed,
        ...
    }

The enclosed patch splits this group of "gnuplot" keycodes into two groups:

1. group with "usual keycodes",
2. group with "other keycodes" (= what remains after removing "usual keycodes").

There are the following "usual" keycodes:

    GP_BackSpace = 0x08,
    GP_Tab       = 0x09,
    GP_Return    = 0x0D,
    GP_Escape    = 0x1B,
    GP_Delete    = 127,

Do you notice yet another keycode is missing?

The patch does not affect any current functionality, but helps together with
Ethan's "pause mouse key" implementation such as "mousekeys_17may2004.patch",
where the command "pause mouse key; show var" returns something you expect
from DOS keycodes, and mainly for compatibility with ginput.m in Octave.

---
PM

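[A rough sketch of the split described above, with the "usual" keys pinned to their ASCII control codes and the remaining special keys kept above 1000. The member list is abbreviated and illustrative, not taken from the actual patch.]

    /* Sketch of the proposed keycode layout: keys with conventional ASCII
     * control codes keep those values, everything else stays >= 1000. */
    enum gp_key {
        /* "usual" keycodes, mapped to their ASCII values */
        GP_BackSpace = 0x08,
        GP_Tab       = 0x09,
        GP_Return    = 0x0D,
        GP_Escape    = 0x1B,
        GP_Delete    = 127,

        /* "other" keycodes, kept out of the ASCII range */
        GP_FIRST_KEY = 1000,
        GP_Linefeed,
        GP_Home,
        GP_Left                 /* ... further special keys would follow ... */
    };
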
From: Daniel J S. <dan...@ie...> - 2004-05-27 21:28:20
Daniel J Sebald wrote:
> Anyone know if 12 bits for color channels and/or gray scale of images
> is allowed in PostScript?

I walked over to the library. 1, 2, 4, 8 and 12 are part of the standard.

Dan

From: Ethan M. <merritt@u.washington.edu> - 2004-05-27 19:48:22
On Thursday 27 May 2004 12:43 pm, Ethan Merritt wrote:
> The PostScript Language Reference Manual says (p. 220):
>
> BitsPerComponent    integer    (Required)
> "Specifies the number of bits used to represent each color component.
>  The number must be 1, 2, 3, 8, or 12. Only a single number may be
                            ^^^^
Pardon the typo. That should of course be "4" rather than "3"

> specified. BitsPerComponent must be 1 in an image dictionary used
>  with imagemask."

--
Ethan A Merritt  merritt@u.washington.edu
Biomolecular Structure Center  Mailstop 357742
University of Washington, Seattle, WA 98195

From: Ethan M. <merritt@u.washington.edu> - 2004-05-27 19:43:38
On Thursday 27 May 2004 12:30 pm, Daniel J Sebald wrote:
> Well, I've never gotten wise about TrueColor. I think it is a trade
> name, not sure.

It's a visual display mode in X11, as distinct from DirectColor or
PseudoColor, which both use color maps.

> Don't know. I think Ethan put some commands for TrueColor images in his
> gd.trm mods.

Maybe I should have used a different keyword. I just meant 8-bits per
channel, rather than a color map.

> Anyone know if 12 bits for color channels and/or gray scale of images is
> allowed in PostScript?

The PostScript Language Reference Manual says (p. 220):

    BitsPerComponent    integer    (Required)
    "Specifies the number of bits used to represent each color component.
     The number must be 1, 2, 3, 8, or 12. Only a single number may be
     specified. BitsPerComponent must be 1 in an image dictionary used
     with imagemask."

--
Ethan A Merritt  merritt@u.washington.edu
Biomolecular Structure Center  Mailstop 357742
University of Washington, Seattle, WA 98195

From: Daniel J S. <dan...@ie...> - 2004-05-27 19:09:35
Petr Mikulik wrote:
> Hi Daniel,
>
>> Well yes, that would be a logical thing to do. Limit to the nearest
>> power of two. I'm seeing a few things in Ghostview notes, and also PNG
>> notes:
>>
>> Keys recognized in PNG filter algorithms
>> ------------------------------------------------------------------------
>> Key                            Range                    Default
>> ------------------------------------------------------------------------
>> Colors            <integer>    1 to 16                  16
>> BitsPerComponent  <integer>    1, 2, 4, 8, or 16        8
>>
>> and
>>
>> Implements the Predictor=2 pixel-differencing option of the LZW filters.
>> Recognized keys are:
>> Colors            <integer>    (1 to 4, default=1)
>> BitsPerComponent  <integer>    (1, 2, 4, or 8, default=8)
>>
>> But, I don't recall reading of any limitations in the PostScript
>> standard when I was at the library. I will have to go there again.
>> Anyway, even though I see some things that say 1,2,4,8, for both color
>> and gray scale, GhostView seems to handle 1,2,4,8,12. So should I make
>> it those restrictions?
>
> gnuplot graphs should print OK on usual (PS Level 2) printers, and should be
> viewable by ghostview.

Yes, I think we can use that as a good benchmark. I will rewrite the driver
so that it changes 3,5,6,7,9,10,11 bits to 4,8,12 appropriately. I am going
to make 12 bits an option. Although I see most references on PostScript say
1,2,4,8, I'm almost certain I read in the Adobe standard text that up to 12
bits is allowed. You'd think that Adobe would have allowed some room for
high res graphics.

> Actually, what does BitsPerComponent mean? If it is 8, then R,G,B components
> of the image will be restricted to 0..255, right?

Correct. Each channel, whether it be a single gray scale, or a color channel
triple, has 256 levels. For color, that is a lot. I know for the color
images in X11, my graphics card has true color with 5 bits per channel. (Of
course, one day I'll work on a machine with 8 bits per channel and be
spoiled.)

> I guess it won't be much problem -- gnuplot is not the tool to process
> TrueColor photographs.

Well, I've never gotten wise about TrueColor. I think it is a trade name,
not sure. But the general strategy is that instead of the lookup table,
there is independent control of the channels. So PostScript's image scheme
of 256 levels per color channel is essentially the same as TrueColor. In
fact, here is a good reference which states that PostScript has TrueColor
color images, but not TrueColor color plots:

http://www.astro.princeton.edu/~esirko/idl_html_help/devices13.html

> BTW, what happens if you convert a TrueColor photo to postscript, by
> "convert" or another tool? The palette goes to 256 for each component?

Don't know. I think Ethan put some commands for TrueColor images in his
gd.trm mods.

> Please ask about 8 and 12 at gnuplot mailing list. There are postscript
> experts there too.

Anyone know if 12 bits for color channels and/or gray scale of images is
allowed in PostScript?

Thanks,

Dan

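[A minimal sketch of the rounding described above: unsupported bit depths are bumped up to the nearest depth a PostScript image accepts, with 12 bits optional. The helper name is illustrative, not the actual driver code; clamping to 8 when 12-bit output is disabled is an assumption.]

    #include <stdio.h>

    /* Round a requested bit depth up to 1, 2, 4, 8, or (optionally) 12. */
    static int ps_bits_per_component(int requested, int allow_12bit)
    {
        if (requested <= 1) return 1;
        if (requested <= 2) return 2;
        if (requested <= 4) return 4;
        if (requested <= 8) return 8;
        return allow_12bit ? 12 : 8;    /* 9..11 bits: use 12 if allowed, else clamp */
    }

    int main(void)
    {
        int depths[] = { 3, 5, 6, 7, 9, 10, 11 };
        int i;

        for (i = 0; i < 7; i++)
            printf("%2d -> %2d\n", depths[i], ps_bits_per_component(depths[i], 1));
        return 0;
    }
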
From: Hans-Bernhard B. <br...@ph...> - 2004-05-27 14:05:11
On Wed, 28 Apr 2004, Grigory Rubtsov wrote:
> Dear developers.
> I have always used gnuplot for scientific purposes. Recently I have
> installed gnuplot 4.0. I often use commands plot "very_long_filename.dat",
> but in 4.0 the <tab> key doesn't automatically insert the file name, but
> in previous versions it worked.

That's not a change in version 4.0; it's the way you compiled it that broke
that. <tab> completion is a part of the GNU readline features. If you don't
compile gnuplot to use that library, it won't have <tab> completion.

--
Hans-Bernhard Broeker (br...@ph...)
Even if all the snow were burnt, ashes would remain.

From: Hans-Bernhard B. <br...@ph...> - 2004-05-27 12:16:54
On Wed, 26 May 2004, Ethan Merritt wrote:
> On Wednesday 26 May 2004 10:03 am, Hans-Bernhard Broeker wrote:
> > On Wed, 26 May 2004, Ethan Merritt wrote:
> > > But what does that mean in terms of handling the current problem?
> >
> > It means that we revert Petr's patch iff we decide to go ANSI C now, or
> > extend it if we want to stick with K&R compatibility.
>
> How about if we define a new type in gp_types.h
>
>     #undef CHAROUT
>     #ifdef KR /* I don't know what the proper test is for K&R */
>     #typedef int CHAROUT
>     #else
>     #typedef char CHAROUT
>     #endif

Nah, that's terminally ugly. ;-)

If the change to 'int' for K&R can work (needs a bit of testing), then the
same 'int' will work for ANSI C, too. So making this distinction would
serve no useful purpose.

--
Hans-Bernhard Broeker (br...@ph...)
Even if all the snow were burnt, ashes would remain.

From: Petr M. <mi...@ph...> - 2004-05-27 08:01:35
> > Please somebody fix X11 terminal so that:
> > - it maps correct number of colors,
>
> fixed.

Works OK now -- oh no, there is still a bug in gradient palette, see below.

> > - the color mapping is exactly as in png.
>
> Do you have an example that shows a difference in color mapping?

Get CVS from two days ago and try

    set pal maxcolor 5
    set pm3d map
    splot x
    set out 'a.png'; set term png; replot
    set out 'a.ps';  set term post color; replot
    set out 'a.pdf'; set term pdf; replot
    set out 'a.svg'; set term svg; replot
    set out 'a.fig'; set term fig; replot

wrong mapping is for x11, pdf, svg, aquaterm, win, post. I've patched most
of the above, and still remaining are "fig" and "x11" for this case of a
gradient palette:

    set palette maxcolors 3
    set palette defined ( 0 "black", 1 "red" )
    set term x11 persist
    set pm3d map
    set tics out
    set cbrange [-2:2]
    set xtics -10,1
    set mxtics 2
    set cbtics -10,1
    set mcbtics 2
    splot x
    set term post color; set out 'a.ps';  replot; set out; set term pop
    set term png;        set out 'a.png'; replot; set out; set term pop
    set term fig color;  set out 'a.fig'; replot; set out; set term pop
    set term pdf;        set out 'a.pdf'; replot; set out; set term pop
    set term svg;        set out 'a.svg'; replot; set out; set term pop

---
PM

From: Ethan M. <merritt@u.washington.edu> - 2004-05-26 19:23:46
On Wednesday 26 May 2004 04:28 am, Petr Mikulik wrote:
> Please somebody fix X11 terminal so that:
> - it maps correct number of colors,

fixed.

> - the color mapping is exactly as in png.

Do you have an example that shows a difference in color mapping?

--
Ethan A Merritt  merritt@u.washington.edu
Biomolecular Structure Center  Mailstop 357742
University of Washington, Seattle, WA 98195

From: Ethan M. <merritt@u.washington.edu> - 2004-05-26 17:59:35
On Wednesday 26 May 2004 10:03 am, Hans-Bernhard Broeker wrote:
> On Wed, 26 May 2004, Ethan Merritt wrote:
> > But what does that mean in terms of handling the current problem?
>
> It means that we revert Petr's patch iff we decide to go ANSI C now, or
> extend it if we want to stick with K&R compatibility.

How about if we define a new type in gp_types.h

    #undef CHAROUT
    #ifdef KR /* I don't know what the proper test is for K&R */
    #typedef int CHAROUT
    #else
    #typedef char CHAROUT
    #endif

and then change all the enhanced text functions with a char argument to have
a CHAROUT argument instead. That reduces the chance of breaking compilation
on platforms for which the existing code works, and still allows us to
accommodate K&R's dislike of char arguments. e.g.

    static void do_enh_writec __PROTO((CHAROUT c));

--
Ethan A Merritt  merritt@u.washington.edu
Biomolecular Structure Center  Mailstop 357742
University of Washington, Seattle, WA 98195

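[As written, the #typedef lines above mix preprocessor and C syntax; a compilable form of the same idea would use plain typedefs. This is only a sketch, and the KR guard macro is still hypothetical, as the message itself notes.]

    /* Sketch of the proposed gp_types.h addition; "KR" is a placeholder for
     * whatever macro actually identifies a K&R-only compiler. */
    #ifdef KR
    typedef int CHAROUT;    /* K&R: char parameters promote to int anyway */
    #else
    typedef char CHAROUT;   /* ANSI C: keep the natural type */
    #endif
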
From: Hans-Bernhard B. <br...@ph...> - 2004-05-26 17:05:57
On Wed, 26 May 2004, Ethan Merritt wrote:
> On Wednesday 26 May 2004 08:54 am, Hans-Bernhard Broeker wrote:
> > So the choice is this: either we finally and officially discontinue all
> > support for K&R compilers, or we make such parameters int everywhere.
> >
> > Given all the troubles with TBOOLEAN (which are actually essentially the
> > same thing, since TBOOLEAN ends up as char on K&R compilers), I tend
> > towards declaring K&R C dead. But if we do that, we should at least
> > split off a development branch (4.1) first, and do it there.
>
> But what does that mean in terms of handling the current problem?

It means that we revert Petr's patch iff we decide to go ANSI C now, or
extend it if we want to stick with K&R compatibility.

> I assume that there will be a version 4.0 patch level 1 that corresponds
> to the state of the CVS tree at the time of the split, and any known
> problems in it should be at the least documented.

Tagging/releasing a 4.0.1 would not necessarily have anything to do with the
split-off of a development branch. Right now, I don't see a pressing need
for a 4.0.1. The post-release bug fixes are still quite small.

> My inclination would be to leave the 4.0 code the way it was before
> Petr's change, and simply add a "known bugs" note that the HP compiler
> mis-handles the enhanced-text code.

That statement would be a misrepresentation of fact. The problem really is
in our code, not in their compiler. Lars tried to build 4.0 on SunOS 4's
K&R compiler, and observed similar problems.

> There may be other K&R compilers that also complain, but we haven't
> heard bug reports from them yet.

As I said: SunOS has complained, if only in private.

> We would still need something like the TBOOLEAN hack for K&R
> compilers suggested in bug report #953887.

Right. And that could probably be built into the TBOOLEAN macro block (at
the end of syscfg.h) which I changed shortly before the release, triggering
all this. We could typedef _Bool to int if we're on K&R C. And we would
have to change over to ANSI prototypes for all functions with TBOOLEAN
arguments to pacify the pickier ANSI C compilers, while at it. Let ansi2knr
handle the needs of K&R compilers.

--
Hans-Bernhard Broeker (br...@ph...)
Even if all the snow were burnt, ashes would remain.

From: Ethan M. <merritt@u.washington.edu> - 2004-05-26 16:30:03
On Wednesday 26 May 2004 08:54 am, Hans-Bernhard Broeker wrote:
> So the choice is this: either we finally and officially discontinue all
> support for K&R compilers, or we make such parameters int everywhere.
>
> Given all the troubles with TBOOLEAN (which are actually essentially the
> same thing, since TBOOLEAN ends up as char on K&R compilers), I tend
> towards declaring K&R C dead. But if we do that, we should at least
> split off a development branch (4.1) first, and do it there.

But what does that mean in terms of handling the current problem?

I assume that there will be a version 4.0 patch level 1 that corresponds to
the state of the CVS tree at the time of the split, and any known problems
in it should be at the least documented.

My inclination would be to leave the 4.0 code the way it was before Petr's
change, and simply add a "known bugs" note that the HP compiler mis-handles
the enhanced-text code. There may be other K&R compilers that also complain,
but we haven't heard bug reports from them yet.

We would still need something like the TBOOLEAN hack for K&R compilers
suggested in bug report #953887.

--
Ethan A Merritt  merritt@u.washington.edu
Biomolecular Structure Center  Mailstop 357742
University of Washington, Seattle, WA 98195

From: Hans-Bernhard B. <br...@ph...> - 2004-05-26 16:01:10
[Taken over to the list, because of the fundamental issue at the bottom.]

On Wed, 26 May 2004, Ethan Merritt wrote:
> I thought it looked wrong at the time, but I haven't had time to look at
> it in detail.

Feels familiar. I handed out my PhD thesis to the institution yesterday,
5 minutes before the deadline at noon. Phew ;-)

> In particular the change that broke things was in term.c:
>     static void do_enh_writec __PROTO((int c));
>     /* note: c is char, but must be declared int due to an old K&R ANSI-C strict HP cc */

Ah, now I see more clearly.

But actually, as long as gnuplot is supposed to be compilable under K&R C,
which it is(!), Petr's change did go in the right direction. It just didn't
go far enough. In K&R C, function arguments indeed can't be char without
causing all kinds of problems. But neither can you cast function pointers
to functions of a different signature and expect the resulting code to work.

So the choice is this: either we finally and officially discontinue all
support for K&R compilers, or we make such parameters int everywhere.

Given all the troubles with TBOOLEAN (which are actually essentially the
same thing, since TBOOLEAN ends up as char on K&R compilers), I tend
towards declaring K&R C dead. But if we do that, we should at least split
off a development branch (4.1) first, and do it there.

--
Hans-Bernhard Broeker (br...@ph...)
Even if all the snow were burnt, ashes would remain.

From: <man...@il...> - 2004-05-26 15:42:29
On Wednesday 26 May 2004 16:59, Lars Hecking wrote:
>> My question is: do you think it could be possible to get the C parser
>> used in the source of gnuplot, run swig on it, add some little code
>> around, and get a usable parser? I mean, does the C parser of gnuplot
>> store the result of a command line in a structure which can be used
>> easily?
>
> The gnuplot parser is essentially a full-custom job, and it's spread
> across several source files. E.g. every single .trm file has its own
> parser for terminal options, and in general every source module has its
> own parser for the options it provides.
>
> A number of years ago, I started to move towards a table-driven parser,
> but got stuck somewhere along the line (the plot command is
> particularly nasty in this regard ...).
>
> What I would like to have is a lex/yacc based scanner/parser for
> gnuplot. This would allow to put it into its own module, and let other
> apps use it. But I think no-one here has the skills to write one.

Well, I was afraid of getting such an answer :o( You're right, lex/yacc
could be the solution, but it really needs someone speaking this strange
language instead of English ;o)

Anyway, thank you very much for your help,

--
Frédéric
