So... a client has a system: the data folks key in their data and, when
they're done, they click an icon on their desktops.
The icon invokes an ftp session and sends the data to a server. After the
data are sent, a .txt file of JCL (filetype=JES) follows.
The big iron sweeps the server (CONTROL-M), grabs the data and then sends
the JES to the INTRDR. This kicks off procs that run programs that I
wrote back when years began with 19 and all goes well.
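For readers who haven't seen this pattern: the flow above leans on the z/OS FTP server's JES interface, where `SITE FILETYPE=JES` makes an uploaded file go to the internal reader as a job instead of into a data set. A rough sketch of such a session — host, credentials, and file names are all made up for illustration:

```shell
# Hypothetical unattended ftp session; mainframe.example.com,
# MYUSER/MYPASS, and the file names are placeholders.
ftp -n mainframe.example.com <<'EOF'
quote USER MYUSER
quote PASS MYPASS
ascii
put payroll.dat 'PROD.PAYROLL.INPUT'
quote SITE FILETYPE=JES
put runjob.txt
quit
EOF
```

In FILETYPE=JES mode the server hands the uploaded JCL to JES itself, which is why nobody ever has to log on and submit anything by hand.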
The supplier of mainframe services has said 'ftp? horrors, port 21 must be closed and everything needs to go to port 22'...
... and the supplier can't get it right. Every time they send a file from the server to a dataset on the mainframe 'it comes out garbage'.
I sat in on a conference call yesterday and heard 'the best us suppliers
can do is turn a process that's run with no intervention for 30 years into
something that needs someone to manually log on and submit jobs.'
My unspoken response was 'Hogwash. Someone's done this before and
their work has been incorporated into documentation, somewhere.'
I've been slogging through the z/OS UNIX manuals and I've found the OGET
documentation. I'm starting to get the feeling that a certificate or a
key or the like needs to be installed.
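On the 'key needs to be installed' hunch: unattended sftp normally authenticates with an SSH key pair instead of a password — the private half stays on the sending server, and the public half goes into the z/OS user's `authorized_keys`. A minimal sketch, with hypothetical host, user, and paths:

```shell
# One-time setup: generate a key pair with no passphrase so the
# transfer can run with nobody at the keyboard.
ssh-keygen -t rsa -b 4096 -N "" -f ~/.ssh/xfer_key
# (Copy ~/.ssh/xfer_key.pub into the z/OS user's ~/.ssh/authorized_keys.)

# Recurring transfer: batch-mode sftp, no prompts.
cat > batch.txt <<'EOF'
put payroll.dat /u/ftpdrop/payroll.dat
put runjob.txt /u/ftpdrop/runjob.txt
quit
EOF
sftp -b batch.txt -i ~/.ssh/xfer_key myuser@mainframe.example.com
```

Note that unlike FTP's JES interface, a stock sftp server just writes files into the z/OS UNIX file system; something on the mainframe side still has to pick them up, convert them, and submit the job.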
If someone's kind enough to offer a suggestion, or so generous as to say
'Why, I remember when we did that at Drexel Burnham Lambert, it went kind
of like this...', I'd be much obliged.
DD
On 9/8/2023 9:08 AM, docdwarf@panix.com wrote:
> [...]
> The supplier of mainframe services has said 'ftp? horrors, port 21 must be
> closed and everything needs to go to port 22'...
> [...]
SFTP?
In article <km0m1oF8emlU4@mid.individual.net>,
bill <bill.gunshannon@gmail.com> wrote:
> SFTP?
No, sftp. Case sensitive.
DD
On 9/8/2023 3:27 PM, docdwarf@panix.com wrote:
> I sat in on a conference call yesterday and heard 'the best us suppliers
> can do is turn a process that's run with no intervention for 30 years into
> something that needs someone to manually log on and submit jobs.'
Your supplier should really provide an upgrade path if they require you
to discontinue using ftp.
You're probably doing your own googling, but there should be lots of
information available on these subjects:
https://www.ibm.com/docs/en/zos/2.2.0?topic=systems-specifying-ftp-sftp-server-settings
https://kinsta.com/knowledgebase/ftp-vs-sftp/
https://techdifferences.com/difference-between-ftp-and-sftp.html
https://www.ibm.com/docs/en/zos/2.2.0?topic=settings-enabling-data-file-transfer-between-systems
If the only problem is that the data is corrupted, it's possible the
transfer is being done in binary (with no ASCII to EBCDIC translation).
Fixing that should be just a setting somewhere.
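To make the 'garbage' concrete: sftp always moves bytes verbatim, and the same characters have different byte values in ASCII and EBCDIC, so untranslated text is unreadable on the other side. A quick demonstration with `iconv` (code page names as in glibc; your system's tables may differ), plus the general shape of the fix:

```shell
# The same text in ASCII and in EBCDIC (IBM-1047) is different bytes.
printf '//MYJOB JOB' > job.txt
iconv -f ISO8859-1 -t IBM-1047 job.txt > job.ebcdic
cmp -s job.txt job.ebcdic && echo same || echo different   # prints "different"

# So after an sftp upload lands in the z/OS UNIX file system, one
# extra translation step makes it readable EBCDIC -- e.g. iconv on
# z/OS UNIX, or TSO OGET with its TEXT/CONVERT options when copying
# into an MVS data set (check your site's manuals for exact operands):
#   tsocmd "OGET '/u/ftpdrop/runjob.txt' 'PROD.JCL.INPUT' TEXT CONVERT(YES)"
```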
Good luck!