Discussion:
Microsoft PowerShell is now open source
Stephen Hoffman
2016-08-18 19:15:21 UTC
As Microsoft PowerShell was discussed in the context of overhauling and
updating and potentially replacing DCL and the OpenVMS CLI support,
Microsoft has open-sourced PowerShell, and has ported it.

https://github.com/PowerShell/PowerShell

C# code, an MIT-style license, and Microsoft ports to macOS and Linux.

I don't expect to see this happen on OpenVMS any time
soon, but it's now much more of an option than it used to be. Once
C# and probably Mono are ported and available on OpenVMS, that is.

Not that I (personally) see the availability of PowerShell being a big
draw for folks to migrate to OpenVMS, though.
--
Pure Personal Opinion | HoffmanLabs LLC
Arne Vajhøj
2016-08-19 00:38:33 UTC
Post by Stephen Hoffman
As Microsoft PowerShell was discussed in the context of overhauling and
updating and potentially replacing DCL and the OpenVMS CLI support,
Microsoft has open-sourced PowerShell, and has ported it.
https://github.com/PowerShell/PowerShell
C# code, an MIT-style license, and Microsoft ports to macOS and Linux.
MS is open sourcing a lot these days.
Post by Stephen Hoffman
I don't expect to see this availability happen on OpenVMS any time soon,
but it's now much more of an option than it used to be. Once C# and
probably Mono is ported and available on OpenVMS, that is.
A mono port would be interesting. The hardest parts are probably
the JIT compiler and GTK#.

But given how few benefits HP managed to get out of Java,
I am slightly skeptical about how much it would actually be used.
Post by Stephen Hoffman
Not that I (personally) see the availability of PowerShell being a big
draw for folks to migrate to OpenVMS, though.
I don't like PS.

I would prefer Python.

YMMV

Arne
John E. Malmberg
2016-08-19 22:48:18 UTC
Post by Stephen Hoffman
As Microsoft PowerShell was discussed in the context of overhauling and
updating and potentially replacing DCL and the OpenVMS CLI support,
Microsoft has open-sourced PowerShell, and has ported it.
https://github.com/PowerShell/PowerShell
C# code, an MIT-style license, and Microsoft ports to macOS and Linux.
I don't expect to see this availability happen on OpenVMS any time soon,
but it's now much more of an option than it used to be. Once C# and
probably Mono is ported and available on OpenVMS, that is.
Not that I (personally) see the availability of PowerShell being a big
draw for folks to migrate to OpenVMS, though.
More important would be to get OpenWsman Server ported to OpenVMS.

OpenWsman, and possibly also a CIFS server, have the potential to make
OpenVMS systems show up as manageable systems for sites that have
standardized on Microsoft-based management tools.

A check-the-box item to keep a manager happy.

A powershell port may help with that.

Regards,
-John
***@qsl.net_work
seasoned_geek
2016-12-30 15:51:38 UTC
Post by John E. Malmberg
The OpenWsman and possibly also a CIFS server has the potential to make
the OpenVMS systems show up as a manageable system for sites that have
standardized on Microsoft based management tools.
But those sites will be out of business in a couple of years, so are they really worth pursuing? Nobody planning to still be in business 5 years from now is using Microsoft products for anything. That is why you are seeing so many open-source and free-license items from them. Microsoft is Dead Tech and everyone in Redmond knows it. They can't even get their own cloud service to work using their own products, so it is hosted using Linux. Their bug-ridden MS Office package had to be ported to Linux so it could be hosted on their cloud.

Yes, they are trying to come up with a flavor of DOT-NOT-ANYWHERE but it will be as big a pig as the first attempt, MONO.
IanD
2016-12-22 12:07:46 UTC
Post by Stephen Hoffman
As Microsoft PowerShell was discussed in the context of overhauling and
updating and potentially replacing DCL and the OpenVMS CLI support,
Microsoft has open-sourced PowerShell, and has ported it.
https://github.com/PowerShell/PowerShell
C# code, an MIT-style license, and Microsoft ports to macOS and Linux.
I don't expect to see this availability happen on OpenVMS any time
soon, but it's now much more of an option than it used to be. Once
C# and probably Mono is ported and available on OpenVMS, that is.
Not that I (personally) see the availability of PowerShell being a big
draw for folks to migrate to OpenVMS, though.
--
Pure Personal Opinion | HoffmanLabs LLC
I'm seeing Powershell grow in significance for automated tasks in the cloud space

DevOps people are liking it more and more as well

It has object piping under the hood, something that is fairly foreign to VMS :-( but enables a lot of goodness if one has the libraries to back it up with

Object piping has been recommended for a while now for linux as a way forward towards more robust and fully featured glue between processes and systems. The desire to move beyond simplistic text piping is there and Powershell can do it out of the box

I know the x86-64 port and now Alpha support is taking the lion's share of the VSI efforts, but I do wonder how much further thought is being given to what will become the glue language to hold it all together in the looming future of VMS. Surely not DCL, or even DCL++ in whatever form that could take

There are so few people around who know VMS these days that I don't see the value in enhancing DCL, as it's not going to win new converts. Better to grandfather it, guarantee compatibility, and bring something new and far more capable to the new VMS IMO. Something at least along the lines of modern scripting languages

What to do and how to implement it and how much cross interaction does one support between the old and the new?

I've been reading about the Bash shell on windows and how they did it. It's quite separate but there are still ways to share the file system at least. I expect MS to continue to bring Bash into windows more fully as time goes on

How will VMS approach the need to modernise its scripting language?

I'd love to see things like awk added by default in VMS instead of resorting to clunky DCL to do stuff for which it is woefully inadequate. If we had some of the really useful tools available on Linux installed by default on VMS, then people might start to use them more and be more familiar with them

i.e sum a column of numbers in a file using

awk "-F" "," "{sum+=$1} END{print sum;}" file.csv

is much easier than stepping through a file in DCL, even if one has to quote every damn parameter in VMS to get the awk command to work :-( (why is VMS so annoying like this for command execution?)
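For comparison, the same column sum in Python (a language more than one poster here prefers) sidesteps the quoting problem entirely. A minimal sketch, assuming a file.csv whose first column holds plain numbers:

```python
import csv

def sum_first_column(path):
    """Sum the first column of a CSV file, like the awk one-liner above."""
    total = 0.0
    with open(path, newline="") as f:
        for row in csv.reader(f):
            if row:  # skip blank lines
                total += float(row[0])
    return total
```

Calling sum_first_column("file.csv") replaces the awk command, with no DCL quoting gymnastics.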

Installing these sorts of helper tools separately is somewhat annoying in a production environment, as there is change control, which can sometimes mean months just to get something in, and these tools seem to be stepping away from single EXEs / a simple directory install towards full-blown GNV-integrated offerings, which is a step away from what I want, especially when dealing with production environments.
If only they could be distributed as LD files it might make things easier, or better still, included with the OS...
John E. Malmberg
2016-12-22 14:47:12 UTC
Post by IanD
I'm seeing Powershell grow in significance for automated tasks in the cloud space
DevOps are liking it more and more as well too
Powershell is missing many features that DevOps need. Try installing an
Exchange server via Powershell. The last time I tried, the install
script tested to see that it was running from the GUI console for one
step. So Powershell works fine for a logged-in user. But DevOps
needs unattended lights-out installs.

There are quite a few gotchas in using powershell for some tasks. I was
trying to get it to be used instead of Cygwin for test automation and
ran into a lot of interesting features.

In Powershell, an object can be a reference to an in-memory object that
is running like a lightweight subprocess. This is mainly for backward
compatibility with previous Microsoft efforts: it is needed to work
with stuff that does not yet have a Powershell API, or where the
Powershell API is incomplete. And there are dragons there.
Post by IanD
It has object piping under the hood, something that is fairly well
foreign to VMS :-( but enables a lot of goodness if one has the
libraries to back it up with
Object piping has been recommended for a while now for linux as a
way forward towards more robust and fully featured glue between
processes and systems. The desire to move beyond simplistic text
piping is there and Powershell can do it out of the box
As of the last time I worked with it, there was quite a lot that
Powershell could not do out of the box.
Post by IanD
I know the x86-64 port and now Alpha support is taking the lions
share of the VSI efforts but I do wonder how much further thought
is being given to what will become the glue language to hold it
all together in the looming future of vms. Surely not DCL or even
DCL++ in whatever form that could take
There's so few people around who know VMS these days that I don't
see the value in enhancing DCL as it's not going to win new
converts. Better to grandfather it and guarantee compatibility
and bring something new and far more capable to the new VMS IMO.
Something at least along the lines of modern scripting languages
What to do and how to implement it and how much cross interaction
does one support between the old and the new?
That could be added to utilities by something similar to
/(out|in)format=(xml|json), plus a set process/pipe_default=formatted
so that utilities writing to a MBX default to the formatted behaviour.

That does not break compatibility and provides the needed functionality.

Essentially that is what powershell is doing to pass objects. I think
it is serializing them to XML, but do not remember. That is why all the
native powershell commands have options for XML input / output.
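The serialize-at-the-pipe idea can be illustrated with JSON Lines between a producer and a consumer. This is a hypothetical sketch of the general technique, not PowerShell's actual wire format (PowerShell uses its own .NET-based object serialization):

```python
import json
import sys

def emit(records, out=sys.stdout):
    """Producer side: write one JSON object per line instead of raw text."""
    for rec in records:
        out.write(json.dumps(rec) + "\n")

def consume(lines):
    """Consumer side: parse each line back into a structured record."""
    return [json.loads(line) for line in lines if line.strip()]
```

The consumer gets fields back by name instead of re-parsing columns out of free-form text, which is the whole point of object piping.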
Post by IanD
I've been reading about the Bash shell on windows and how they did
it. It's quite separate but there are still ways to share the file
system at least. I expect MS to continue to bring Bash into windows
more fully as time goes on
Does it explain how they are dealing with the lack of a fork? It would
be nice to be able to pull out the massive hack needed to get around
that on VMS.
Post by IanD
How will VMS approach the need to modernise it's scripting language and how?
Could either IRB (Interactive Ruby) or interactive Python just be used?

Why write a new interpreter if you can use an existing one and just
import modules for it?
Post by IanD
I'd love to see things like awk added by default in VMS instead of
resorting to clunky DCL to do stuff for which it is woefully inadequate
for. If we had some of the really useful tools like what's available on
linux, installed by default on vms then people might start to use them
more and be more familiar with them
i.e sum a column of numbers in a file using
awk "-F" "," "{sum+=$1} END{print sum;}" file.csv
is much easier than stepping through a file in DCL even if one has
to quote every damn parameter in VMS to get the awk command to work
:-( (why is vms so annoying like this for command execution?)
Because VMS uses a defined syntax that is different from Linux's.

GAWK on VMS has a CLD, which is included in the GNV kit and can be
installed as a DCL verb. It supports DCL-style options. The syntax has
been supported for a long time. The .CLD for the DCL verb is a GNV
addition to make the kit more complete.
Post by IanD
Installing these sort of helper tools separately is somewhat
annoying in a production environment as there is change control
which can mean sometimes months just to get something in and these
tools seem to be stepping away from single EXE's / a simple directory
The tools are stepping away from an incomplete implementation installed
in a random location, where other tools do not know how to find them,
to using the vendor-supplied installation tool, PCSI, and a standard
directory.

The PCSI tool allows getting a list of what packages are installed.
Post by IanD
install to full blown GNV integrated offerings which is a step away
from what I want especially when dealing with production environments.
GNV is being broken up into individual packages that can work together
and work separately.

The GNV projects are also being structured so that it takes minimal
effort to keep them up to date with the upstream projects, and even
allow builds against the actual pre-release repository code of the
upstream projects.

If I were doing this full time, the methods that I am using for the new
packages would allow releasing kits within one work day of the upstream
announcing an official release.

Bash 4.4 is being tested now, as is a more complete bzip2.

GAWK on VMS source is not at the GNV site at all; it is only in the GNU
repository. However, there is still a lot of functionality in GAWK that
is missing or needs to be improved.
Post by IanD
If only they could distributed as LD files perhaps it might make
things easier or better still, included with the OS...
The GNV project is very much short of the number of people needed.

* Needed - testers, proofreaders, and such, to catch documentation
errors and omissions, make the web site and documentation easier to
use, and make the documentation for the packages more consistent in
look and feel.

* Needed also - people to look at, or even adopt, maintenance of the
existing repackaged stuff, make the porting more efficient, and start
building against the pre-release branches.

Things are improving slowly.

All updated packages are now being built under Jenkins control. This
has found (and required fixing) some bugs in the building and kitting
procedure, particularly where files were missed being checked into the
source repository or included in the source kit.

Some of the packages now are running self tests to validate their
functionality and bug fixes.

Unfortunately my Jenkins is stuck behind a NAT firewall, so you cannot
see the graphs plotting the test results or the current build status of
a project. That would take hosting a "dashboard" Jenkins on an
Internet-facing server that other Jenkins instances can publish job
results to.

Regards,
-John
***@qsl.network
V***@SendSpamHere.ORG
2016-12-22 14:51:23 UTC
{...snip...}
I'd love to see things like awk added by default in VMS instead of resorting
to clunky DCL to do stuff for which it is woefully inadequate for. If we had
some of the really useful tools like what's available on linux, installed by
default on vms then people might start to use them more and be more familiar
with them
i.e sum a column of numbers in a file using
awk "-F" "," "{sum+=$1} END{print sum;}" file.csv
is much easier than stepping through a file in DCL even if one has to quote
every damn parameter in VMS to get the awk command to work :-( (why is VMS so
annoying like this for command execution?)
Awk is not part of any shell on unix/linux/etc. It's a utility. If you want
awk, port it!

If it's a .csv file, pull it into a spreadsheet program and sum the columns!
--
VAXman- A Bored Certified VMS Kernel Mode Hacker VAXman(at)TMESIS(dot)ORG

I speak to machines with the voice of humanity.
Paul Sture
2016-12-22 18:57:14 UTC
Post by V***@SendSpamHere.ORG
{...snip...}
I'd love to see things like awk added by default in VMS instead of resorting
to clunky DCL to do stuff for which it is woefully inadequate for. If we had
some of the really useful tools like what's available on linux, installed by
default on vms then people might start to use them more and be more familiar
with them
i.e sum a column of numbers in a file using
awk "-F" "," "{sum+=$1} END{print sum;}" file.csv
is much easier than stepping through a file in DCL even if one has to quote
every dam parameter in vms to get the awk command to work :-( (why is vms so
annoying like this for command execution?)
Awk is not part of any shell on unix/linux/etc. It's a utility. If you want
awk, port it!
If it's a .csv file, pull it into a spreadsheet program and sum the columns!
Importing .csv files into a spreadsheet can quickly become labour intensive.
Anything repetitive is a candidate for automation via other means, for
example using Python to import directly into an SQLite or PostgreSQL
database.
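That Python route can be sketched with nothing but the standard library. A minimal, hypothetical example (table and column names come straight from the CSV header; there is no type inference or input validation, so treat it as an illustration rather than production code):

```python
import csv
import sqlite3

def load_csv(db_path, table, csv_path):
    """Create a table from the CSV header row and bulk-insert the data rows."""
    con = sqlite3.connect(db_path)
    with open(csv_path, newline="") as f:
        rows = csv.reader(f)
        header = next(rows)
        cols = ", ".join('"%s"' % c for c in header)
        marks = ", ".join("?" for _ in header)
        con.execute('CREATE TABLE IF NOT EXISTS "%s" (%s)' % (table, cols))
        # executemany streams the remaining rows without loading the whole file
        con.executemany('INSERT INTO "%s" VALUES (%s)' % (table, marks), rows)
    con.commit()
    return con
```

Once the data is in SQLite, the column-sum question becomes a one-line SELECT instead of a spreadsheet session.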
--
"History does not repeat itself, but it does rhyme" -- Mark Twain
Stephen Hoffman
2016-12-22 22:27:10 UTC
Post by Paul Sture
Importing .csv files into a spreadsheet can quickly become labour intensive.
As anyone that's worked with CSV will recognize, there are issues with
that format. The definition is less than universal, particularly
around the encoding of the edge cases. Having dealt with CSV in
various environments and contexts, I usually recommend avoidance. If
you do have to deal with CSV, libcsv is portable. Given the choice,
JSON or XML is preferable to CSV.
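Those edge cases are easy to demonstrate. A naive split on commas breaks as soon as a field contains a quoted comma, an embedded newline, or a doubled quotation mark, which is exactly where a real parser (Python's csv module here, or libcsv in C) earns its keep. The sample record is invented for illustration:

```python
import csv
import io

record = '"Smith, John","123 Main St.\nApt 4","said ""hi"""\n'

# Naive split: wrong field count, quote characters left in place.
naive = record.split(",")

# A real CSV parser handles quoted commas, embedded newlines,
# and doubled quotation marks (the common RFC 4180-ish rules).
parsed = next(csv.reader(io.StringIO(record)))
```

naive yields four mangled pieces with the quotes still attached; parsed yields the three fields that were intended.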
Post by Paul Sture
Anything repetitive is a candidate for automation via other means, for
example using Python to import directly to a SQLite or PostgreSQL.
Ayup. libcsv, or maybe luacsv or ftcsv, or other such tools are
available. There's an OpenVMS port of libcsv, too. But for
passing around objects within an application or when designing an API,
CSV just isn't an approach I'd select. Even for exporting, as there
are tools that'll directly write out Microsoft Excel files or other
formats, or XML.
--
Pure Personal Opinion | HoffmanLabs LLC
V***@SendSpamHere.ORG
2016-12-22 23:42:22 UTC
Post by Stephen Hoffman
Post by Paul Sture
Importing .csv files into a spreadsheet can quickly become labour intensive.
As anyone that's worked with CSV will recognize, there are issues with
that format. The definition is less than universal, particularly
around the encoding of the edge cases. Having dealt with CSV in
various environments and contexts, I usually recommend avoidance. If
you do have to deal with CSV, libcsv is portable. Given the choice,
JSON or XML is preferable to CSV.
What? CSV is not a standard?
--
VAXman- A Bored Certified VMS Kernel Mode Hacker VAXman(at)TMESIS(dot)ORG

I speak to machines with the voice of humanity.
Paul Sture
2016-12-23 13:44:42 UTC
Post by V***@SendSpamHere.ORG
Post by Stephen Hoffman
Post by Paul Sture
Importing .csv files into a spreadsheet can quickly become labour intensive.
As anyone that's worked with CSV will recognize, there are issues with
that format. The definition is less than universal, particularly
around the encoding of the edge cases. Having dealt with CSV in
various environments and contexts, I usually recommend avoidance. If
you do have to deal with CSV, libcsv is portable. Given the choice,
JSON or XML is preferable to CSV.
What? CSV is not a standard?
Yes. Nope. Not really. Sort of. Maybe.

RFC 4180 is dated as late as 2005, and it was so well publicised that I
never heard of it before this year.

Wiki:
<https://en.wikipedia.org/wiki/Comma-separated_values>

"The CSV file format is not standardized. The basic idea of separating
fields with a comma is clear, but that idea gets complicated when the
field data may also contain commas or even embedded line-breaks. CSV
implementations may not handle such field data, or they may use
quotation marks to surround the field. Quotation does not solve
everything: some fields may need embedded quotation marks, so a CSV
implementation may include escape characters or escape sequences."

FWIW I have preferred tab-separated files for about 20 years now,
because in any environment where tab is used to navigate input forms,
the tab character doesn't actually get into data fields. Tab-delimited
files have long been easier to import into databases than
comma-separated values, and indeed SQLite3 only got CSV capability quite
recently (and that broke some tab-delimited imports here that had worked
flawlessly for several years).

From the Wiki article, here comes the rub with tab-separated files:

"In addition, the term "CSV" also denotes some closely related
delimiter-separated formats that use different field delimiters. These
include tab-separated values and space-separated values. A delimiter
that is not present in the field data (such as tab) keeps the format
parsing simple. *These alternate delimiter-separated files are often even
given a .csv extension despite the use of a non-comma field separator.*[1]
This loose terminology can cause problems in data exchange. Many
applications that accept CSV files have options to select the delimiter
character and the quotation character."

Both OpenOffice and LibreOffice suffer from an insistence that
tab-separated files have a .csv extension. Both exhibit what to me is
perverse behaviour when saving as a .csv file - they close the spreadsheet
and instead open the newly created .csv file. WTF? They also used to[2]
put a lock on that .csv file so that you had to close the wretched thing
before being able to use it as input to e.g. a database import.


[1] My emphasis
[2] I couldn't reproduce this on my most recent attempt (on macOS with
LibreOffice), but it was a real pain when I last used Windows.

Next episode: Form over function. Why do modern spreadsheets try their
best to look like a ransom note when you copy and paste in data from
other sources? ?-)
--
"History does not repeat itself, but it does rhyme" -- Mark Twain
Paul Sture
2016-12-23 14:10:17 UTC
Post by Paul Sture
<https://en.wikipedia.org/wiki/Comma-separated_values>
Historical note: Apparently CSV files originated with Fortran77's list
directed input and output:

<https://en.wikipedia.org/wiki/Comma-separated_values#History>

"Comma-separated values is a data format that pre-dates personal
computers by more than a decade: the IBM Fortran (level H extended)
compiler under OS/360 supported them in 1972.[5] List-directed ("free
form") input/output was defined in FORTRAN 77, approved in 1978.
List-directed input used commas and/or spaces for delimiters, so
unquoted character strings could not contain commas or spaces."

and

"Comma-separated value lists are easier to type (for example into
punched cards) than fixed-column-aligned data, and were less prone to
producing incorrect results if a value was punched one column off from
its intended location."

Harumph. Should've had some decent punch operators then, though
according to recent comments elsewhere on Usenet, the punch operators
found in universities were clearly not up to the standard I enjoyed in
the commercial world.
--
"History does not repeat itself, but it does rhyme" -- Mark Twain
Simon Clubley
2016-12-24 01:37:36 UTC
Post by Paul Sture
Post by Paul Sture
<https://en.wikipedia.org/wiki/Comma-separated_values>
Historical note: Apparently CSV files originated with Fortran77's list
<https://en.wikipedia.org/wiki/Comma-separated_values#History>
"Comma-separated values is a data format that pre-dates personal
computers by more than a decade: the IBM Fortran (level H extended)
compiler under OS/360 supported them in 1972.[5] List-directed ("free
form") input/output was defined in FORTRAN 77, approved in 1978.
List-directed input used commas and/or spaces for delimiters, so
unquoted character strings could not contain commas or spaces."
Fortran is also the language which decided that Hollerith constants
were a good idea. Are you beginning to see a pattern here ? :-)

Simon.
--
Simon Clubley, ***@remove_me.eisner.decus.org-Earth.UFP
Microsoft: Bringing you 1980s technology to a 21st century world
o***@gmail.com
2016-12-25 23:13:27 UTC
Post by Simon Clubley
Fortran is also the language which decided that Hollerith constants
were a good idea. Are you beginning to see a pattern here ? :-)
Every time I hear of an SQL injection vulnerability in a web script I think maybe Hollerith constants in SQL wouldn't be a bad idea.
Stephen Hoffman
2016-12-26 17:03:27 UTC
Post by Simon Clubley
Fortran is also the language which decided that Hollerith constants
were a good idea. Are you beginning to see a pattern here ? :-)
https://fortran.io
--
Pure Personal Opinion | HoffmanLabs LLC
Simon Clubley
2016-12-29 18:51:12 UTC
Post by Stephen Hoffman
Post by Simon Clubley
Fortran is also the language which decided that Hollerith constants
were a good idea. Are you beginning to see a pattern here ? :-)
https://fortran.io
That's just wrong. :-)

(Oh, and I thought it was COBOL which ran today's mainframes, not
Fortran. :-))

Next, someone's going to invent a BASIC Web Framework...

Simon.
--
Simon Clubley, ***@remove_me.eisner.decus.org-Earth.UFP
Microsoft: Bringing you 1980s technology to a 21st century world
Paul Sture
2016-12-29 22:13:46 UTC
Post by Simon Clubley
Post by Stephen Hoffman
Post by Simon Clubley
Fortran is also the language which decided that Hollerith constants
were a good idea. Are you beginning to see a pattern here ? :-)
https://fortran.io
That's just wrong. :-)
Hmm. Fortran is still in use by the HPC folks, so it's not quite as
strange as it looks at first sight.

Snippet from my notes on a Scientific Computing course I did a while
ago:

Fortran 77 still widely used:
- millions of lines of legacy code
- faster for some things

Note: In general, adding more high-level programming features
to a language makes it harder for the compiler to optimise into
fast-running code
Post by Simon Clubley
(Oh, and I thought it was COBOL which ran today's mainframes, not
Fortran. :-))
Bill Gunshannon said he was looking for a COBOL project. Perhaps he'd
consider a Cobol web framework :-)
Post by Simon Clubley
Next, someone's going to invent a BASIC Web Framework...
David F. has probably got the building blocks for that.
--
"History does not repeat itself, but it does rhyme" -- Mark Twain
Bill Gunshannon
2016-12-29 22:57:55 UTC
Post by Paul Sture
Post by Simon Clubley
Post by Stephen Hoffman
Post by Simon Clubley
Fortran is also the language which decided that Hollerith constants
were a good idea. Are you beginning to see a pattern here ? :-)
https://fortran.io
That's just wrong. :-)
Hmm. Fortran is still in use by the HPC folks, so it's not quite as
strange as at first sight.
Snippet from my notes on a Scientific Computing course I did a while
- millions of lines of legacy code
- faster for some things
Note: In general, adding more high-level programming features
to a language makes it harder for the compiler to optimise into
fast-running code
Post by Simon Clubley
(Oh, and I thought it was COBOL which ran today's mainframes, not
Fortran. :-))
Bill Gunshannon said he was looking for a COBOL project. Perhaps he'd
consider a Cobol web framework :-)
Post by Simon Clubley
Next, someone's going to invent a BASIC Web Framework...
David F. has probably got the building blocks for that.
I've done web with COBOL. No big deal. I've considered making a
library but I was never sure anyone would care. Maybe some day.
At the moment, I am working on a set of COBOL modules to simplify
using postgres with GnuCOBOL. I have enough done right now to
handle most of the common stuff but haven't thrown it out to the
wolves yet for comment. Maybe this week.

I wonder if GnuCOBOL could be made to work on VMS using the VMS
C compiler? Might be fun to try.

bill
Jan-Erik Soderholm
2016-12-30 09:23:59 UTC
Post by Bill Gunshannon
On 2016-12-29, Simon Clubley
Post by Simon Clubley
Post by Stephen Hoffman
Post by Simon Clubley
Fortran is also the language which decided that Hollerith constants
were a good idea. Are you beginning to see a pattern here ? :-)
https://fortran.io
That's just wrong. :-)
Hmm. Fortran is still in use by the HPC folks, so it's not quite as
strange as at first sight.
Snippet from my notes on a Scientific Computing course I did a while
- millions of lines of legacy code
- faster for some things
Note: In general, adding more high-level programming features
to a language makes it harder for the compiler to optimise into
fast-running code
Post by Simon Clubley
(Oh, and I thought it was COBOL which ran today's mainframes, not
Fortran. :-))
Bill Gunshannon said he was looking for a COBOL project. Perhaps he'd
consider a Cobol web framework :-)
Post by Simon Clubley
Next, someone's going to invent a BASIC Web Framework...
David F. has probably got the building blocks for that.
I've done web with COBOL. No big deal.
No it isn't. I did some COBOL programming for web pages
served through "CICS Web Services" on an MVS mainframe.
Full production use for central applications at Ericsson.
Bill Gunshannon
2016-12-30 13:52:00 UTC
Post by Jan-Erik Soderholm
Post by Bill Gunshannon
On 2016-12-29, Simon Clubley
Post by Simon Clubley
Post by Stephen Hoffman
Post by Simon Clubley
Fortran is also the language which decided that Hollerith constants
were a good idea. Are you beginning to see a pattern here ? :-)
https://fortran.io
That's just wrong. :-)
Hmm. Fortran is still in use by the HPC folks, so it's not quite as
strange as at first sight.
Snippet from my notes on a Scientific Computing course I did a while
- millions of lines of legacy code
- faster for some things
Note: In general, adding more high-level programming features
to a language makes it harder for the compiler to optimise into
fast-running code
Post by Simon Clubley
(Oh, and I thought it was COBOL which ran today's mainframes, not
Fortran. :-))
Bill Gunshannon said he was looking for a COBOL project. Perhaps he'd
consider a Cobol web framework :-)
Post by Simon Clubley
Next, someone's going to invent a BASIC Web Framework...
David F. has probably got the building blocks for that.
I've done web with COBOL. No big deal.
No it isn't. I did some COBOL programming for web pages
served through "CICS Web Services" on an MVS mainframe.
Full production use for central applications at Ericsson.
Yeah, but CICS isn't COBOL.

bill
Jan-Erik Soderholm
2016-12-30 14:16:51 UTC
Post by Bill Gunshannon
Post by Jan-Erik Soderholm
Post by Bill Gunshannon
On 2016-12-29, Simon Clubley
Post by Simon Clubley
Post by Stephen Hoffman
Post by Simon Clubley
Fortran is also the language which decided that Hollerith constants
were a good idea. Are you beginning to see a pattern here ? :-)
https://fortran.io
That's just wrong. :-)
Hmm. Fortran is still in use by the HPC folks, so it's not quite as
strange as at first sight.
Snippet from my notes on a Scientific Computing course I did a while
- millions of lines of legacy code
- faster for some things
Note: In general, adding more high-level programming features
to a language makes it harder for the compiler to optimise into
fast-running code
Post by Simon Clubley
(Oh, and I thought it was COBOL which ran today's mainframes, not
Fortran. :-))
Bill Gunshannon said he was looking for a COBOL project. Perhaps he'd
consider a Cobol web framework :-)
Post by Simon Clubley
Next, someone's going to invent a BASIC Web Framework...
David F. has probably got the building blocks for that.
I've done web with COBOL. No big deal.
No it isn't. I did some COBOL programming for web pages
served through "CICS Web Services" on an MVS mainframe.
Full production use for central applications at Ericsson.
Yeah, but CICS isn't COBOL.
bill
CICS is CICS, of course. It has very little to do with the applications
as such, apart from the management. Like having Cobol
applications running under ACMS on VMS.

Or are you talking about writing the web server as such in Cobol?
I don't see any meaning in that. Partly because there are already
web servers available, and partly because there are better tools
(than Cobol) to write web servers in.
seasoned_geek
2016-12-30 16:30:16 UTC
Post by Jan-Erik Soderholm
CICS is CICS, of course. It has very little to do with the
applications as such, apart from managing them. Like having Cobol
applications running under ACMS on VMS.
Or are you talking about writing the web server as such in Cobol?
I don't see any point in that. Partly because there are already
web servers available, and partly because there are better tools
(than Cobol) to write web servers in.
It sounded like they were talking about writing Web pages in COBOL that browsers parsed and displayed via some kind of JIT.

Since no Web page should _ever_, under _any_ circumstances, connect directly to a production database, I don't see what worry the choice of language causes. As I documented here: http://theminimumyouneedtoknow.com/soa_book.html

XML used from outside world to Web page
XML converted to fixed length proprietary message passing from Web to socket service.
Proprietary fixed length message sent to actual back end for processing
Proprietary outbound message format response sent to socket service
Proprietary outbound message format converted to XML response and sent out through Web page to big bad outside world.

Running XML straight in, or allowing a Web service to directly connect to any production database, is simply IT malpractice. SQL Injection and other breach techniques cannot work when there is a layer in between converting from National Security Risk XML to a quote bad quote proprietary message format which will then be processed by something that actually _has_ access to the various databases.

XML is used for the evil outside world and proprietary fixed for the inside world. Never the twain shall meet. That 32767-character SQL Injection trick is going to get chopped off at the N characters you allowed the field to be. The man in the middle mindlessly parses to fixed, then converts fixed to XML on the way back out. He doesn't look anything up, nor does he have access to anything. He knows about Y message formats and throws everything else away.
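The gateway described above can be sketched in a few lines. This is a minimal illustration, not code from the book; the field names and widths are invented for the example:

```python
# Minimal sketch of the XML -> fixed-length gateway described above.
# The LAYOUT field names and widths are hypothetical, not from the book.
import xml.etree.ElementTree as ET

LAYOUT = [("ACCT", 10), ("NAME", 20), ("AMT", 8)]  # (field, fixed width)

def xml_to_fixed(xml_text):
    """Convert inbound XML to one fixed-length record.

    Values longer than the declared width are truncated, so an
    oversized injection payload is chopped off at the gateway, and
    elements not in the layout are simply thrown away.
    """
    root = ET.fromstring(xml_text)
    return "".join(root.findtext(name, default="")[:width].ljust(width)
                   for name, width in LAYOUT)

msg = xml_to_fixed(
    "<REQ><ACCT>1234567890123</ACCT><NAME>SMITH</NAME>"
    "<AMT>99.95</AMT><EVIL>'; DROP TABLE--</EVIL></REQ>")
```

Note that the unknown EVIL element never reaches the back end, and the over-long account number is cut at the declared field width.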
seasoned_geek
2016-12-30 17:05:29 UTC
Post by Bill Gunshannon
I wonder if GnuCOBOL could be made to work on VMS using the VMS
C compiler? Might be fun to try.
I like you Bill and I've been a C programmer since PCs only came with 5 1/4 floppy drives, but, I gotta say, a COBOL "compiler" which translates to C is sacrilege of the highest order. One who does that cannot be buried on hallowed ground.

Even if one could fake the COBOL SORT verb it would be wretchedly slow without the full processing power. I also wonder how things like these snippets could be implemented.

SELECT DRAW-IDX
ASSIGN TO 'MY_MEGA_FILE'
ORGANIZATION IS INDEXED
ACCESS MODE IS SEQUENTIAL
RECORD KEY IS DRAW_DT IN DRAWING_RECORD ASCENDING
LOCK MODE IS AUTOMATIC
FILE STATUS IS IN-STAT.

FD DRAW-IDX
IS GLOBAL
LABEL RECORDS ARE STANDARD.

COPY 'CDD_RECORDS.DRAWING_RECORD' FROM DICTIONARY.

FD DRAW-STATS
IS GLOBAL
LABEL RECORDS ARE STANDARD.

COPY 'CDD_RECORDS.ZILLIONARE_STATS_RECORD' FROM DICTIONARY
REPLACING ZILLIONARE_STATS_RECORD BY DSTATS-REC.



FD RPT-FILE
LABEL RECORDS ARE OMITTED
REPORT IS REPORT-LISTING.



REPORT SECTION.

RD REPORT-LISTING
PAGE LIMIT IS 60 LINES
HEADING 1
FIRST DETAIL 6.


01 TYPE IS PAGE HEADING.
05 LINE NUMBER IS 1.
10 COLUMN NUMBER IS 1 PIC X(20) SOURCE TODAYS-DATE-FORMATTED.
10 COLUMN NUMBER IS 30 PIC X(21)
VALUE IS 'Drawing Number Report'.
10 COLUMN NUMBER IS 70 PIC X(5) VALUE IS 'Page:'.
10 COLUMN NUMBER IS 77 PIC ZZZ SOURCE PAGE-COUNTER.

05 LINE NUMBER IS 3.
10 COLUMN NUMBER IS 5 PIC X(7) VALUE IS 'Drawing'.
10 COLUMN NUMBER IS 18 PIC X(2) VALUE IS 'No'.
10 COLUMN NUMBER IS 23 PIC X(2) VALUE IS 'No'.
10 COLUMN NUMBER IS 28 PIC X(2) VALUE IS 'No'.
10 COLUMN NUMBER IS 33 PIC X(2) VALUE IS 'No'.
10 COLUMN NUMBER IS 38 PIC X(2) VALUE IS 'No'.
10 COLUMN NUMBER IS 45 PIC X(4) VALUE IS 'Mega'.

05 LINE NUMBER IS 4.
10 COLUMN NUMBER IS 6 PIC X(4) VALUE IS 'Date'.
10 COLUMN NUMBER IS 19 PIC X(1) VALUE IS '1'.
10 COLUMN NUMBER IS 24 PIC X(1) VALUE IS '2'.
10 COLUMN NUMBER IS 29 PIC X(1) VALUE IS '3'.
10 COLUMN NUMBER IS 34 PIC X(1) VALUE IS '4'.
10 COLUMN NUMBER IS 39 PIC X(1) VALUE IS '5'.
10 COLUMN NUMBER IS 46 PIC X(2) VALUE IS 'No'.

05 LINE NUMBER IS 5.
10 COLUMN NUMBER IS 3 PIC X(12) VALUE IS '------------'.
10 COLUMN NUMBER IS 18 PIC X(2) VALUE IS '--'.
10 COLUMN NUMBER IS 23 PIC X(2) VALUE IS '--'.
10 COLUMN NUMBER IS 28 PIC X(2) VALUE IS '--'.
10 COLUMN NUMBER IS 33 PIC X(2) VALUE IS '--'.
10 COLUMN NUMBER IS 38 PIC X(2) VALUE IS '--'.
10 COLUMN NUMBER IS 44 PIC X(4) VALUE IS '----'.

01 DETAIL-LINE TYPE IS DETAIL.
05 LINE NUMBER IS PLUS 1.
10 COLUMN NUMBER IS 4 PIC X(10) SOURCE DRAWING-DATE-FORMATTED.
10 COLUMN NUMBER IS 18 PIC Z9 SOURCE NO_1.
10 COLUMN NUMBER IS 23 PIC Z9 SOURCE NO_2.
10 COLUMN NUMBER IS 28 PIC Z9 SOURCE NO_3.
10 COLUMN NUMBER IS 33 PIC Z9 SOURCE NO_4.
10 COLUMN NUMBER IS 38 PIC Z9 SOURCE NO_5.
10 COLUMN NUMBER IS 46 PIC Z9 SOURCE MEGA_NO.


INITIATE REPORT-LISTING.

* forcing sequential access on indexed file
SELECT DRAW-STATS
ASSIGN TO 'DRAWING_STATS'
ORGANIZATION IS INDEXED
ACCESS MODE IS SEQUENTIAL
RECORD KEY IS ELM_NO IN DSTATS-REC
LOCK MODE IS AUTOMATIC
FILE STATUS IS D-STAT.


* CDD really comes into its own with COBOL doesn't it?
SD SORT-FILE.

COPY 'CDD_RECORDS.ZILLIONARE_STATS_RECORD' FROM DICTIONARY
REPLACING ZILLIONARE_STATS_RECORD BY SORT-REC.

FD SORTED-FILE
VALUE OF ID IS SORTED-FILE-NAME.

COPY 'CDD_RECORDS.ZILLIONARE_STATS_RECORD' FROM DICTIONARY
REPLACING ZILLIONARE_STATS_RECORD BY SORTED-REC.


SORT SORT-FILE
ON DESCENDING KEY SINCE_LAST IN SORT-REC
INPUT PROCEDURE IS S000-DSTAT-INPUT
GIVING SORTED-FILE.

Just some scraps I had lying around. I realize the OpenSource community tries to funnel everything through C, but, it's not right. Both COBOL and FORTRAN have things which simply cannot be done in C or at least done well. I'm not a fan of the GNU Wal-mart approach of reducing the quality of everything until it can all be made by peasants in North Korea.

One thing worthy of note: it's been a while since I looked at this, but when a VAX was actually a VAX, the different section headers generated different PSECTs with different attributes/protections. We used to view this with LINK/MAP listings. Besides putting 88 levels into non-writable PSECTs, there were other things I don't fully remember. I know this capability was incredibly important when it came to passing linkage sections around, and it was for a reason other than protection, but I cannot remember why off-hand; I just remember that COBOL was the only language of the day which had the feature... and yes, I had been programming in C a while by then.


Completely changing topics. That SORT with awk comment in the Powerless Shell thread.

Nearly every VMS COBOL shop intelligent enough to use CDD has a communal SORT source file which specs an input and output file whose definition is pulled from CDD. It then has one sort statement and the INPUT paragraph. Every time a developer is forced to consume something non-standard they make a local copy of this utility source to consume the file.

One-offs become production jobs after they've been in place for years. Start with a definition in CDD along with compiled code so you don't end up forever paying the interpreter price.
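That communal SORT pattern, an input procedure feeding records to a sort on a descending key, can be sketched outside COBOL as well. A minimal Python illustration; the record layout here is invented, where a real shop would pull it from CDD:

```python
# Sketch of the communal SORT utility pattern described above: an input
# procedure releases the records to be sorted, which are then sorted
# ON DESCENDING KEY and handed to the output file.  The record layout
# is invented for illustration; real shops pull it from CDD.

def input_procedure(raw_records):
    """Like a SORT INPUT PROCEDURE: RELEASE only usable records."""
    for rec in raw_records:
        if rec.get("since_last") is not None:
            yield rec

def sort_descending(raw_records, key="since_last"):
    """Rough equivalent of:
       SORT SORT-FILE ON DESCENDING KEY SINCE_LAST
            INPUT PROCEDURE IS S000-DSTAT-INPUT GIVING SORTED-FILE."""
    return sorted(input_procedure(raw_records),
                  key=lambda r: r[key], reverse=True)

records = [
    {"elm_no": 7, "since_last": 3},
    {"elm_no": 2, "since_last": None},  # dropped by the input procedure
    {"elm_no": 5, "since_last": 12},
]
result = sort_descending(records)
```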
Bill Gunshannon
2016-12-31 01:42:11 UTC
Post by seasoned_geek
Post by Bill Gunshannon
I wonder if GnuCOBOL could be made to work on VMS using the VMS
C compiler? Might be fun to try.
I like you Bill and I've been a C programmer since PCs only came with 5 1/4 floppy drives, but, I gotta say, a COBOL "compiler" which translates to C is sacrilege of the highest order. One who does that cannot be buried on hallowed ground.
I've been a C programmer since before the PC existed and floppies were
8". :-)

The most popular Ada compiler in use today started out translating Ada
into C. And it stayed that way for quite a few versions.

It has been a long time since I saw a compiler that went from source
to machine language. All of them today use some form of intermediate
language during the translation. GnuCOBOL just uses C as its
intermediate language. The first C++ compiler I ever saw did the
same.
Post by seasoned_geek
Even if one could fake the COBOL SORT verb it would be wretchedly slow without the full processing power. I also wonder how things like these snippets could be implemented.
{...snip...}
Just some scraps I had lying around. I realize the OpenSource community tries to funnel everything through C, but, it's not right. Both COBOL and FORTRAN have things which simply cannot be done in C or at least done well. I'm not a fan of the GNU Wal-mart approach of reducing the quality of everything until it can all be made by peasants in North Korea.
If you want to see how it is done, the source is available. :-)

bill
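The C-as-intermediate-language idea is easy to illustrate with a toy: translate one COBOL verb into a line of C source. This is only a sketch of the concept, not how GnuCOBOL actually works internally:

```python
# Toy illustration of using C as an intermediate language: translate a
# single COBOL verb into C source text.  This is NOT how GnuCOBOL works
# internally; it only shows the idea of compiling by emitting C.

def translate_display(cobol_line):
    """Turn  DISPLAY 'TEXT'.  into  printf("TEXT\\n");  """
    stmt = cobol_line.strip().rstrip(".")
    verb, _, arg = stmt.partition(" ")
    if verb.upper() != "DISPLAY":
        raise ValueError("only the DISPLAY verb is supported in this toy")
    text = arg.strip().strip("'\"")  # drop the COBOL string quotes
    return 'printf("%s\\n");' % text

c_line = translate_display("    DISPLAY 'HELLO, WORLD'.")
```

A real translator would of course carry the whole language, which is exactly the objection raised elsewhere in this thread.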
Bob Koehler
2017-01-03 14:28:31 UTC
Post by Bill Gunshannon
It has been a long time since I saw a compiler that went from source
to machine language. All of them today use some form of intermediate
language during the translation. GnuCOBOL just uses C as its
intermediate language. The first C++ compiler I ever saw did the
same.
Before there were C++ compilers, there was cfront, which did exactly
that. Not really a compiler.
seasoned_geek
2017-01-06 15:56:47 UTC
Post by Bob Koehler
Post by Bill Gunshannon
It has been a long time since I saw a compiler that went from source
to machine language. All of them today use some form of intermediate
language during the translation. GnuCOBOL just uses C as its
intermediate language. The first C++ compiler I ever saw did the
same.
Before there were C++ compilers, there was cfront, which did exactly
that. Not really a compiler.
And cfront was abandoned when C++ expanded to the point where the C language could not emulate it in any reasonable fashion. This is what I'm saying. While the typical half-arsed Linux approach may be taken via Gnu of translating to C or some other intermediary language, the intermediary language would have to be able to support each and every feature of the original source language. In the case of COBOL this would include the _complete_ COBOL-74 and COBOL-85 standards, including copylibs. It would also need to support dictionary extensions per platform. On VMS we had CDD (Common Data Dictionary), which I don't think was ever fully ported to another platform but which a butt-load of code on VMS uses. IBM had several others:

http://stackoverflow.com/questions/29358238/how-to-use-data-dictionary-on-creating-physical-file-for-db2-for-ibm-i-as400

I don't remember what Unisys had, but someone here probably will.
seasoned_geek
2017-01-06 15:48:19 UTC
Post by Bill Gunshannon
Post by seasoned_geek
Post by Bill Gunshannon
I wonder if GnuCOBOL could be made to work on VMS using the VMS
C compiler? Might be fun to try.
I like you Bill and I've been a C programmer since PCs only came with 5 1/4 floppy drives, but, I gotta say, a COBOL "compiler" which translates to C is sacrilege of the highest order. One who does that cannot be buried on hallowed ground.
I've been a C programmer since before the PC existed and floppies were
8". :-)
The most popular Ada compiler in use today started out translating Ada
into C. And it stayed that way for quite few versions.
It has been a long time since I saw a compiler that went from source
to machine language. All of them today use some form of intermediate
language during the translation. GnuCOBOL just uses C as its
intermediate language. The first C++ compiler I ever saw did the
same.
Post by seasoned_geek
Even if one could fake the COBOL SORT verb it would be wretchedly slow without the full processing power. I also wonder how things like these snippets could be implemented.
{...snip...}
Just some scraps I had lying around. I realize the OpenSource community tries to funnel everything through C, but, it's not right. Both COBOL and FORTRAN have things which simply cannot be done in C or at least done well. I'm not a fan of the GNU Wal-mart approach of reducing the quality of everything until it can all be made by peasants in North Korea.
If you want to see how it is done, the source is available. :-)
bill
When did the VAX/DEC COBOL and VAX/DEC BASIC compilers go open source? I'm not talking about how things are done on a wanna-be operating system such as Linux but on a real operating system such as VMS.

Last time I looked at gnuCOBOL, which has been a while, it didn't fully implement either COBOL-74 or COBOL-85, just kind of cherry picked a few things. One couldn't run older "card formatted" COBOL with line numbers through it.

Complain all you want about that old line number format, several payroll systems for factories were written with it and probably still are, at least they were as of a few years ago. It allowed them to distribute updates in source form so customized sites could check the line number range for customization conflicts. It was really quite efficient.

I also do not know how one would get the C compiler to create the correct PSECTs since the language typically doesn't work that way. I don't remember seeing anything I didn't expect when I tossed the /MAP qualifier onto the sample code for http://theminimumyouneedtoknow.com/app_book.html
Bill Gunshannon
2017-01-06 15:57:37 UTC
Post by seasoned_geek
Post by Bill Gunshannon
Post by seasoned_geek
Post by Bill Gunshannon
I wonder if GnuCOBOL could be made to work on VMS using the VMS
C compiler? Might be fun to try.
I like you Bill and I've been a C programmer since PCs only came with 5 1/4 floppy drives, but, I gotta say, a COBOL "compiler" which translates to C is sacrilege of the highest order. One who does that cannot be buried on hallowed ground.
I've been a C programmer since before the PC existed and floppies were
8". :-)
The most popular Ada compiler in use today started out translating Ada
into C. And it stayed that way for quite few versions.
It has been a long time since I saw a compiler that went from source
to machine language. All of them today use some form of intermediate
language during the translation. GnuCOBOL just uses C as its
intermediate language. The first C++ compiler I ever saw did the
same.
Post by seasoned_geek
Even if one could fake the COBOL SORT verb it would be wretchedly slow without the full processing power. I also wonder how things like these snippets could be implemented.
{...snip...}
Just some scraps I had lying around. I realize the OpenSource community tries to funnel everything through C, but, it's not right. Both COBOL and FORTRAN have things which simply cannot be done in C or at least done well. I'm not a fan of the GNU Wal-mart approach of reducing the quality of everything until it can all be made by peasants in North Korea.
If you want to see how it is done, the source is available. :-)
bill
When did VAX/DEC COBOL and VAX/DEC BASIC compilers go OpenSource? I'm not talking about how things are done on a wanna-be operating system such as Linux but on a real operating system such as VMS.
No one said they did. But this thread started out talking about
GnuCOBOL which is.
Post by seasoned_geek
Last time I looked at gnuCOBOL, which has been a while, it didn't fully implement either COBOL-74 or COBOL-85, just kind of cherry picked a few things. One couldn't run older "card formatted" COBOL with line numbers through it.
Times have changed. The only real lack at this point is OO, and there
aren't that many COBOL shops who care. It handles both old-fashioned
formatted COBOL source and the more modern free-form source. I still
prefer the old format if for no other reason than the greater ease in
reading it. But then, I am a dinosaur.
Post by seasoned_geek
Complain all you want about that old line number format, several payroll systems for factories were written with it and probably still are, at least they were as of a few years ago. It allowed them to distribute updates in source form so customized sites could check the line number range for customization conflicts. It was really quite efficient.
Well, being pedantic, those are sequence numbers, not line numbers. Most
programmers didn't number them sequentially to make inserting
new cards easier.
Post by seasoned_geek
I also do not know how one would get the C compiler to create the correct PSECTs since the language typically doesn't work that way. I don't remember seeing anything I didn't expect when I tossed the /MAP qualifier onto the sample code for http://theminimumyouneedtoknow.com/app_book.html
Is it really necessary for functioning COBOL programs? And
function is the ultimate target, right?

bill

David Froble
2016-12-30 07:26:08 UTC
Post by Paul Sture
Post by Simon Clubley
Post by Stephen Hoffman
Post by Simon Clubley
Fortran is also the language which decided that Hollerith constants
were a good idea. Are you beginning to see a pattern here ? :-)
https://fortran.io
That's just wrong. :-)
Hmm. Fortran is still in use by the HPC folks, so it's not quite as
strange as at first sight.
Snippet from my notes on a Scientific Computing course I did a while
- millions of lines of legacy code
- faster for some things
Note: In general, adding more high-level programming features
to a language makes it harder for the compiler to optimise into
fast-running code
Post by Simon Clubley
(Oh, and I thought it was COBOL which ran today's mainframes, not
Fortran. :-))
Bill Gunshannon said he was looking for a COBOL project. Perhaps he'd
consider a Cobol web framework :-)
Post by Simon Clubley
Next, someone's going to invent a BASIC Web Framework...
David F. has probably got the building blocks for that.
Well, first I'd have to know what a "Web Framework" is ....

I'd say I don't have a clue about maybe half of the terminology I see.

In my world, real world problems are defined, solutions are designed, and
implemented, where possible. Don't worry too much about names.

Yeah, got stuff that listens for socket connection requests, though TLS V1.2 on
VMS is rather questionable at this time. Really, that's all a web server or web
service is, a listener and some capabilities to handle specific requests, right?
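That "listener plus request handling" view can be sketched; the handling half starts with parsing the request line. A minimal illustration, not a production HTTP parser:

```python
# Minimal sketch of the "some capabilities to handle specific requests"
# half of a web server: parse an HTTP/1.x request line.  Not a full
# parser -- just enough to decide how to route a request.

def parse_request_line(line):
    """Split 'GET /path HTTP/1.1' into (method, path, version)."""
    parts = line.strip().split()
    if len(parts) != 3 or not parts[2].startswith("HTTP/"):
        raise ValueError("malformed request line: %r" % line)
    method, path, version = parts
    return method.upper(), path, version

req = parse_request_line("GET /status HTTP/1.1\r\n")
```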
Jan-Erik Soderholm
2016-12-30 09:27:47 UTC
Post by David Froble
On 2016-12-29, Simon Clubley
Post by Simon Clubley
Post by Stephen Hoffman
Post by Simon Clubley
Fortran is also the language which decided that Hollerith constants
were a good idea. Are you beginning to see a pattern here ? :-)
https://fortran.io
That's just wrong. :-)
Hmm. Fortran is still in use by the HPC folks, so it's not quite as
strange as at first sight.
Snippet from my notes on a Scientific Computing course I did a while
- millions of lines of legacy code
- faster for some things
Note: In general, adding more high-level programming features
to a language makes it harder for the compiler to optimise into
fast-running code
Post by Simon Clubley
(Oh, and I thought it was COBOL which ran today's mainframes, not
Fortran. :-))
Bill Gunshannon said he was looking for a COBOL project. Perhaps he'd
consider a Cobol web framework :-)
Post by Simon Clubley
Next, someone's going to invent a BASIC Web Framework...
David F. has probably got the building blocks for that.
Well, first I'd have to know what a "Web Framework" is ....
I'd say I don't have a clue about maybe half of the terminology I see.
In my world, real world problems are defined, solutions are designed, and
implemented, where possible. Don't worry too much about names.
Yeah, got stuff that listens for socket connection requests, though TLS
V1.2 on VMS is rather questionable at this time. Really, that's all a web
server or web service is, a listener and some capabilities to handle
specific requests, right?
Yes, if it follows the standards and protocols for "the web". If
it doesn't, it is just another application-specific communication
solution, and has nothing to do with "the web" as such.
seasoned_geek
2016-12-30 16:13:07 UTC
Post by Simon Clubley
Fortran is also the language which decided that Hollerith constants
were a good idea. Are you beginning to see a pattern here ? :-)
For those who never had the nightmare ...errr I mean pleasure.

https://docs.oracle.com/cd/E19957-01/805-4939/z40007365eaa/index.html

http://sc.tamu.edu/IBM.Tutorial/docs/Compilers/xlf_8.1/html/lr33.HTM
V***@SendSpamHere.ORG
2016-12-23 15:58:55 UTC
Post by Paul Sture
Post by V***@SendSpamHere.ORG
Post by Stephen Hoffman
Post by Paul Sture
Importing .csv files into a spreadsheet can quickly become labour intensive.
As anyone that's worked with CSV will recognize, there are issues with
that format. The definition is less than universal, particularly
around the encoding of the edge cases. Having dealt with CSV in
various environments and contexts, I usually recommend avoidance. If
you do have to deal with CSV, libcsv is portable. Given the choice,
JSON or XML is preferable to CSV.
What? CSV is not a standard?
Yes. Nope. Not really. Sort of. Maybe.
It was a flippant comment and not really intended to be answered. ;)
--
VAXman- A Bored Certified VMS Kernel Mode Hacker VAXman(at)TMESIS(dot)ORG

I speak to machines with the voice of humanity.
Johnny Billquist
2016-12-27 16:32:05 UTC
Post by V***@SendSpamHere.ORG
Post by Paul Sture
Post by V***@SendSpamHere.ORG
Post by Stephen Hoffman
Post by Paul Sture
Importing .csv files into a spreadsheet can quickly become labour intensive.
As anyone that's worked with CSV will recognize\, there are issues with
that format. The definition is less than universal\, particularly
around the encoding of the edge cases. Having dealt with CSV in
various environments and contexts\, I usually recommend avoidance. If
you do have to deal with CSV\, libcsv is portable. Given the choice\,
JSON or XML is preferable to CSV.
What? CSV is not a standard?
Yes. Nope. Not really. Sort of. Maybe.
It was a flippant comment and not really intended to be answered. ;)
I'm not sure he really answered it either... :-)

Johnny
--
Johnny Billquist || "I'm on a bus
|| on a psychedelic trip
email: ***@softjar.se || Reading murder books
pdp is alive! || tryin' to stay hip" - B. Idol
Hans Vlems
2016-12-25 11:38:52 UTC
CSV has an S in it that cannot possibly mean Standard :-)
Hans
Johnny Billquist
2016-12-27 16:33:18 UTC
Post by Hans Vlems
CSV has an S in it that cannot possibly mean Standard :-)
Correct. The "S" most definitely does not stand for "Standard". Glad to
hear you knew that. :-)

Johnny
--
Johnny Billquist || "I'm on a bus
|| on a psychedelic trip
email: ***@softjar.se || Reading murder books
pdp is alive! || tryin' to stay hip" - B. Idol
V***@SendSpamHere.ORG
2016-12-22 23:40:22 UTC
Post by Paul Sture
Post by V***@SendSpamHere.ORG
{...snip...}
I'd love to see things like awk added by default in VMS instead of resorting
to clunky DCL to do stuff for which it is woefully inadequate. If we had
some of the really useful tools like what's available on linux, installed by
default on vms then people might start to use them more and be more familiar
with them
i.e sum a column of numbers in a file using
awk "-F" "," "{sum+=$1} END{print sum;}" file.csv
is much easier than stepping through a file in DCL even if one has to quote
every damn parameter in vms to get the awk command to work :-( (why is vms so
annoying like this for command execution?)
Awk is not part of any shell on unix/linux/etc. It's a utility. If you want
awk, port it!
If it's a .csv file, pull it into a spreadsheet program and sum the columns!
Importing .csv files into a spreadsheet can quickly become labour intensive.
Anything repetitive is a candidate for automation via other means, for
example using Python to import directly into an SQLite or PostgreSQL database.
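[A minimal sketch of the Python-to-SQLite route mentioned above. The file name, schema, and sample data are invented for the illustration:]

```python
import csv
import sqlite3

# Invented two-column sample data (number, label) for the sketch.
with open("x.csv", "w", newline="") as f:
    f.write('1,"whatever"\n2,"the contents"\n17,"may be"\n')

with open("x.csv", newline="") as f:
    rows = list(csv.reader(f))

con = sqlite3.connect(":memory:")   # in-memory database for the sketch
con.execute("CREATE TABLE t (n INTEGER, label TEXT)")
con.executemany("INSERT INTO t VALUES (?, ?)", rows)

# Summing the column becomes one SQL statement instead of a read loop.
(total,) = con.execute("SELECT SUM(n) FROM t").fetchone()
print(total)   # 20
```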
Where in the example given did it state it was repetitive?

Anyway, without an invoked application, show me how it's easier in 'bash'
than with DCL.
--
VAXman- A Bored Certified VMS Kernel Mode Hacker VAXman(at)TMESIS(dot)ORG

I speak to machines with the voice of humanity.
Stephen Hoffman
2016-12-23 00:30:10 UTC
Post by V***@SendSpamHere.ORG
Post by Paul Sture
Post by V***@SendSpamHere.ORG
{...snip...}
I'd love to see things like awk added by default in VMS instead of
resorting to clunky DCL to do stuff for which it is woefully inadequate.
If we had some of the really useful tools like what's available on
linux, installed by default on vms then people might start to use them
more and be more familiar with them
i.e sum a column of numbers in a file using
awk "-F" "," "{sum+=$1} END{print sum;}" file.csv
is much easier than stepping through a file in DCL even if one has to
quote every damn parameter in vms to get the awk command to work :-(
(why is vms so annoying like this for command execution?)
Awk is not part of any shell on unix/linux/etc. It's a utility. If
you want awk, port it!
If it's a .csv file, pull it into a spreadsheet program and sum the columns!
Importing .csv files into a spreadsheet can quickly become labour intensive.
Anything repetitive is a candidate for automation via other means, for
example using Python to import directly into an SQLite or PostgreSQL database.
Where in the example given did it state it was repetitive?
One might then wonder where in the question was the distinction between
a shell built-in and an external utility relevant?
Post by V***@SendSpamHere.ORG
Anyway, without an invoked application, show me how it's easier in
'bash' than with DCL.
Can't say I've thought about the distinction between a DCL built-in and
an external utility either, other than the fallout around the image
rundown — and in retrospect, that distinction was a pretty dumb idea,
too. It's a command, make it work consistently without having to know
whether the thing is built into the interpreter or is external and
invoked separately or whether the support is dynamically downloaded
from the core DCL server in Bolton or whatever.

Processing CSV or other modern file formats in DCL is a slog. At best.
Yes, DCL can be useful and even fun for some, and it's a good choice
for developers who are fond of writing piles of glue code and
expending more than a little development effort for what is a common
and solved task on other platforms — via awk or libcsv or otherwise —
but not even remotely feature-competitive with other systems. That's
before discussing the OO capabilities available within PowerShell and
other such shells, as IanD had referenced in the post that was snipped.
DCL can't invoke a user-written lexical, short of patch hackery. And
the concept of loading the contents of an arbitrary file and then
treating that whole file as a variable — as an object — that can be
processed or filtered or otherwise modified just doesn't exist in DCL.
With DCL, we're still working in the Fortran or BASIC or C era here;
open, read a record, process it, write, loop, close. Without good
tools for even those tasks.
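[The "whole file as an object" style contrasted above with DCL's open/read/loop/close pattern can be sketched in a few lines of Python; the file name and contents here are invented for the illustration:]

```python
from pathlib import Path

# Create a small sample file so the sketch is self-contained.
p = Path("notes.txt")
p.write_text("alpha\nbeta\ngamma\nbeta\n")

text = p.read_text()                      # whole file in one variable
unique = sorted(set(text.splitlines()))   # filter it like any other object
p.write_text("\n".join(unique) + "\n")    # write the modified whole back
print(unique)   # ['alpha', 'beta', 'gamma']
```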
--
Pure Personal Opinion | HoffmanLabs LLC
hb
2016-12-23 08:27:43 UTC
Post by V***@SendSpamHere.ORG
Anyway, without an invoked application, show me how it's easier in 'bash'
than with DCL.
As you probably know, in bash there are more flow control commands.
Anything else is just different, in my point of view:

$ ty x.csv
1,"whatever"
2,"the contents"
17,"may be"
$
$ ty x.com
$ open/read in x.csv
$ sum=0
$ loop:
$ read/end=close/err=close in line
$ sum = sum + f$extract(0,f$locate(",",line),line)
$ goto loop
$ close:
$ close in
$ write sys$output "sum: ''sum'"
$
$ @x
sum: 20
$

$ cat x.sh
#!/bin/bash
sum=0
while read line
do sum=$((${sum}+${line%%,*}))
done < x.csv
echo sum: $sum
$
$ ./x.sh
sum: 20
$
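[For comparison with the DCL and bash versions above, the same column sum in Python, with the file contents reproduced inline so the sketch is self-contained. Using csv.reader rather than a manual split means a quoted first field containing an embedded comma would still parse, which neither the f$locate nor the ${line%%,*} approach would survive:]

```python
import csv

# Recreate the x.csv from the post above.
with open("x.csv", "w", newline="") as f:
    f.write('1,"whatever"\n2,"the contents"\n17,"may be"\n')

# Sum the first column.
with open("x.csv", newline="") as f:
    total = sum(int(row[0]) for row in csv.reader(f))
print("sum:", total)   # sum: 20
```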
Stephen Hoffman
2016-12-22 17:39:46 UTC
On 2016-12-22 12:07:46 +0000, IanD said:

Dredging up a discussion from last summer?
Post by IanD
I'm seeing Powershell grow in significance for automated tasks in the cloud space
DevOps are liking it more and more as well too
It has object piping under the hood, something that is fairly well
foreign to VMS :-( but enables a lot of goodness if one has the
libraries to back it up with...
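[The "object piping" idea mentioned above can be loosely illustrated with Python generators: each pipeline stage receives and yields structured objects rather than flat text lines. This is only an analogy to PowerShell's model, with invented sample data:]

```python
# Invented sample records standing in for process objects.
procs = [
    {"name": "init", "cpu": 0.1},
    {"name": "editor", "cpu": 2.5},
    {"name": "build", "cpu": 7.9},
]

def where(items, pred):   # filter stage: pass objects matching a predicate
    return (p for p in items if pred(p))

def select(items, key):   # projection stage: extract one property
    return (p[key] for p in items)

# Roughly "procs | where cpu > 1 | select name", pipeline-style:
busy = list(select(where(procs, lambda p: p["cpu"] > 1), "name"))
print(busy)   # ['editor', 'build']
```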
As for the suggestion around OO APIs and better scripting capabilities
and tools for OpenVMS, that's all a given for hauling the platform
forward. OpenVMS needs that. Integrated, not as an add-on.

But I'm skeptical around Microsoft PowerShell and .NET being the best
path forward for OpenVMS, though perfectly willing to borrow good ideas
from those and from other tools and other platforms. I don't see a
preponderance of folks considering or migrating server applications
from Windows Server boxes to OpenVMS. For most that might or do want
to migrate off of Microsoft Windows Server, it'll likely be to Linux
for the foreseeable future.

It does appear that Microsoft needs PowerShell and other Windows Server
capabilities ported over as part of their efforts to offer SQL Server
and other products on Linux, though. How long they'll be in that
market, against Oracle and PostgreSQL and other packages? But I
digress.

There was an investment in this as part of the DEC Windows NT Affinity
era. Among previous Affinity-related projects, the COM/DCOM support
never really saw much use or acceptance among OpenVMS application
developers, and the OpenVMS registry database is useful but AFAIK
remains entirely underutilized. Whether that outcome was the result of
marketing problems or whatever else, or if enough of the ISVs and
end-users just weren't interested?

If there is an opportunity here, it's in borrowing the best of ideas
from Microsoft and elsewhere and keeping the best ideas of traditional
OpenVMS, of better compatibility with Unix platforms and tools, and of
hauling OpenVMS ISV and end-user applications forward, while resolving
and removing the worst of the accrued problems latent within current
OpenVMS.

That, and getting the pricing sorted out, as the current pricing model
and accoutrements do little to encourage wholly new development and
deployments — the x86-64 port will help here with the server hardware
costs.

Perl, Python, Lua, OODCL, whatever. There are options. The bash
copyrights make its inclusion directly into the base distribution
impractical given the inability to open-source the OpenVMS platform to
comply with GPL requirements, though there are other shells available.
But various offerings and becoming agnostic toward the scripting
language is probably the most tractable path forward, with the
inclusion of various choices into the base distro — scripts as both
objects and tools, etc. Some folks can and will even choose to use DCL
here, after all.

VSI has not made any particular statements around changing the
traditional APIs and related design approaches. The effort of
adopting OO is not small for VSI, ISVs, and end-users, too. It's a
high-risk decade-long gamble — or decade-long investment, if you're in
marketing — in the future of the platform. If PowerShell on Linux
does pick up enough interest? But VSI has more than enough to do,
ahead of that potential outcome...
--
Pure Personal Opinion | HoffmanLabs LLC
seasoned_geek
2016-12-30 16:05:33 UTC
Post by IanD
i.e sum a column of numbers in a file using
awk "-F" "," "{sum+=$1} END{print sum;}" file.csv
is much easier than stepping through a file in DCL even if one has to quote every damn parameter in vms to get the awk command to work :-( (why is vms so annoying like this for command execution?)
Ummm in a VMS world we wouldn't generate a CSV for re-use. Those go to and from undesirable x86-based platforms. Had it been a legitimate data file with fixed column widths one could use the DCL SORT command and its numerous options. Those who are curious about it and don't have access to their terminal right now can see some examples and discussion of it starting on page 11-4 of this book: http://theminimumyouneedtoknow.com/app_book.html

<Grin>

Sorry, couldn't resist.