The pre-compiled 64-bit version of GnuCOBOL at https://www.arnoldtrembley.com/GnuCOBOL.htm is mislabeled. The website says it's 3.1 rc1, but when I check it with cobc -V, it reports 3.0 rc1.
Anonymous - 2020-07-08
Why do you need 64-bit?
Do you have a COBOL suite that exceeds 2,147,483,648 bytes of addressability?
The 32-bit GnuCOBOL lives in the x86 framework.
32-bit GnuCOBOL works fine with 64-bit Windows (Win7, Win10).
SQL Server, LUW (DB2) et al. work fine in the 32-bit x86 world.
Is there a suite of programs that only works with 64-bit addresses?
Curious as to why the need for 64-bit.
In general, 64-bit takes up more disk space and more memory, and runs slower than 32-bit. I don't see the need.
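(A side note, not from the thread: you can see which address width a given process was built for by checking its pointer size. Shown here in Python only because it is easy to run; the same idea applies to any cobc/libcob build.)

```python
import struct

# The size of a C pointer ("P") in the current process reveals the
# address width: 4 bytes on a 32-bit build, 8 on a 64-bit build.
bits = struct.calcsize("P") * 8
print(f"running in a {bits}-bit process")
```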
The speed of 32 vs 64 will be wildly variable, dependent on workload and chip. If you can rip through register manipulations 8 bytes at a time, that can be a huge win. I've never looked, but I'm assuming the low-level IO writers take advantage of the r-named registers. If you are chewing through individual 8-, 16-, 32-bit values with no locality of reference, then it'll be a dog's breakfast of slower performance.
COBOL doesn't hold that many addresses, in general, so size may not be of much concern either, but yes, each slot is doubled in space along with the integers.
Can't say I disagree in general, though, cdg, anon op.
Keeping datasets above 16 or 24 bits gets way harder on the brain, just in the effort to encapsulate a mental image. ;-) Grokking a 16-meg memory map is way easier than 4 gig, or hundreds of quintillions, for spatial reference compaction in the mind. Cognitive leaks that eat into mental stamina and productive use of time.
Call it the push of progress. Apple is shutting down 32-bit, and GNU/Linux distros are moving the same way, dropping the 32-bit compatibility layers. Ubuntu recently made some concessions for the Wine project, if I remember correctly (I don't). Big tech wants everyone to buy new; young movers and shakers help without really realizing it as they build the new things, never having experienced the old. Once that approaches 51% new, then old can be discarded at will. ;-)
IBM caved and released a 64-bit COBOL. With, I'm assuming, a fair amount of chagrin for some, a slowly realized "whoa, do we really need to try and think at this scale, all day, every day?" :-) And some super happy programmers running with the wind.
I'm pretty sure we're safe in both camps for the time being with libcob/cobc, but distros might play the hand at any time. Partly because the maintainers are likely 35-and-under and won't even notice or care that it's a thing. -m32 on x86-64 relies on some nifty hack overlays at the binary level, at the expense of doubling system detail complexity. Especially for those critical distro build maintainers, possibly playing follow-the-Apple.
All happily typed on a completely satisfying 32-bit laptop.
Have good,
Blue
Last edit: Brian Tiffin 2020-07-08
I have both 32-bit 3.1 rc1 and 64-bit 3.0, and the 64-bit version is faster on my system. It also handles 1 TB databases better.
You can only memory map up to a 4-gig file in 32-bit, but up to a 16-tera file in 64-bit.
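As an illustration of that limit (a generic Python sketch, nothing GnuCOBOL-specific): mapping a file consumes contiguous address space, which is why a 32-bit process tops out around the 2-4 GiB mark while a 64-bit one can map terabyte-scale files. The call itself is the same either way:

```python
import mmap
import os
import tempfile

# Map a small scratch file into memory. On a 32-bit process this same
# call starts failing once the file no longer fits in the remaining
# 2-4 GiB of address space; a 64-bit process has room to spare.
fd, path = tempfile.mkstemp()
try:
    os.write(fd, b"GnuCOBOL" * 512)  # 4096 bytes
    with mmap.mmap(fd, 0) as view:   # length 0 = map the whole file
        print(len(view), view[:8])   # 4096 b'GnuCOBOL'
finally:
    os.close(fd)
    os.remove(path)
```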
Anonymous - 2020-07-08
I think the high-order bit is a sign bit.
32-bit is actually 31-bit (2^31) in terms of addressability (2,147,483,648 in decimal).
64-bit is actually 63-bit (2^63) in terms of addressability.
I would think the DBMS is solely responsible for the 1 TB database access being better.
If the 1 TB database is "memory resident", then there are likely thousands of paging dataset segments allocated to the database.
My circa-2020 perception of 64-bit is simply "a solution looking for a problem".
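The arithmetic behind that correction, spelled out (plain Python, just to check the poster's figures): a signed n-bit word keeps one bit for the sign, leaving 2^(n-1) addressable bytes.

```python
def signed_limit(word_bits):
    """Bytes addressable with a signed word: one sign bit, n-1 magnitude bits."""
    return 2 ** (word_bits - 1)

print(signed_limit(32))  # 2147483648 -- the 2 GiB figure quoted above
print(signed_limit(64))  # 9223372036854775808
```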
Yes, the DBMS is responsible. I write DBMS systems and they're much more efficient in 64-bit than in 32-bit.
Exactly. I write database software in COBOL and, for large databases, 64-bit runs rings around 32-bit. P.S. If you're a retired COBOL programmer and are looking to make a ton of cash, check out insurance companies. They hire maintainers for their billions of lines of COBOL code and can't find young programmers to do it.
@toml_12953 - Please verify:
Thank you,
Simon
Of course I can only speak for my own system and the programs I write. The 32-bit version (3.1 rc1) is about 5% slower than the latest 64-bit version I can find (3.0) when running the programs I write, which access large databases. As for the environment, the older MinGW environment seems faster than the newer msys64. The difference in the speed of compiled COBOL programs may not be enough for most people to switch to the 64-bit version, but I'll take all the speed I can get.
I'd guess most of that performance is because of the different "bundled packages" (including an up-to-date GCC), not because of 32/64-bit.
We'll see if/when @arn79 provides both architectures of the msys64 environment.
As those are binaries downloaded from @arn79, you may want to contact him instead of posting to the forum.
Note: the MSYS2 binaries seem to have been rolled back (for whatever reason); all packages I've updated some days ago want to be downgraded, including GnuCOBOL:
So maybe Arnold created those after they were rolled back?
In any case, you should be able to build it yourself via pacman if you need it, using the official package definition, which is at 3.1-rc1.
It looks like the "official package definition" has been rolled back to 2017 and GnuCOBOL 2.2.
How do we get 3.1 as the official package definition?
Hm, no? If you update the packages via pacman -Syuu, you end up with the package "gnucobol-3.0rc1-2".
The official package definition (see my link above) is "gnucobol-3.1rc1-1"; if you want to (also see my link above), you can download the definition (the complete directory) and build it yourself.
@arn79, whenever you prepare the next msys64-based package: please provide both the 32-bit and the 64-bit variants, to allow people to use it as needed (the msys64 packages will always be a quite different version because of their different "bundled" packages, compared to the mingw variant). Thank you.
I still can't build MSYS2 GC31-rc1. Apparently it is not available:
So, where can I get the MSYS2 package?
Hi Arnold,
https://packages.msys2.org/package/mingw-w64-x86_64-gnucobol?repo=mingw64
I think this is the one, according to the instructions.
Last edit: thomas 2020-07-08
The https link shows GC31-rc1, but when I do the install in MSYS2, I still get GC30-rc1-2:
So I don't understand why I cannot get GC31-rc1. It's on the web site, but the pacman install always gets GC30-rc1-2.
Why isn't it working?
Because that is the definition that will be used when the build server does its next build, not the version the build server currently provides.
But you can use that definition and build the package on your own at any time; see the link I provided earlier.
Can I use this link with MSYS2 pacman?
http://repo.msys2.org/mingw/x86_64/mingw-w64-x86_64-gnucobol-3.1rc1-1-any.pkg.tar.zst
I am not sure what kind of archive a "tar.zst" file is.
Last edit: Arnold Trembley 2020-07-09
I have no idea either...
So I've just redone the "build manual" procedure as MSYS2 documents it:
This creates binary install packages that can be installed with pacman -U mingw-w64-gnucobol*.pkg.tar.xz. Can I suggest you try it?
Thanks for the reply. I tried it and got this:
==> Validating source files with sha256sums...
gnucobol-3.1-rc1.tar.xz ... Passed
gnucobol-3.1-rc1.tar.xz.sig ... Skipped
001-mingw-gnucobol-2.2-fixformatwarnings.patch ... Passed
cobenv.sh ... Passed
cobenv.cmd ... Passed
==> Verifying source file signatures with gpg...
gnucobol-3.1-rc1.tar.xz ... FAILED (unknown public key 13E96B53C005604E)
==> ERROR: One or more PGP signatures could not be verified!
Tom L
Last edit: Simon Sobisch 2020-07-09
You either have to import the GNU keyring (@lazka, is this something that maybe should be part of MSYS2?), or ignore the pgp signature, or disable the check by adding --skippgpcheck to the makepkg options (I've only seen this as I've used makepkg --help just now; guess that should work).
But as lazka has said below, you can also directly install the pre-built binary pkg.tar.zst from the repo if you want to.
Either use --skippgpcheck or import the key first: gpg --recv 13E96B53C005604E. makepkg uses the user keyring, so we can't change that. But I agree that makepkg should ask to import the key if it is missing.