JCLcnv1demo.htm - DEMO conversions, sample JCL, scripts, executions
JCLcnv2real.htm - comprehensive instructions for REAL conversions *THIS DOC*
JCLcnv3aids.htm - conversion AIDS (cross-refs, tips, mass changes, etc)
JCLcnv4gdg.htm  - GDG file handler from UV Software
DATAcnv1.htm    - Data file conversion - Original Documentation
DATAcnv2.htm    - Data File Conversion - Comprehensive & Complete
Part_1 - Comprehensive conversion guide for JCL & COBOL
  - the prior JCLcnv1demo.htm demonstrated simplistic JCL/script conversions
  - this JCLcnv2real.htm is your guide for your more complex JCL & COBOL
  - a step by step guide with illustrations of I/O files
    to help you understand the process
  - COBOL conversions 1st to create control file info for JCL conversions
  - JCL conversions will be repeated in Part_3 with enhanced control-files
  - Essential files report shows files required to begin operations on Unix
    - all JCL is analyzed to eliminate the many temporary & intermediate files
Part_2 - Extra conversions depending on site JCL contents
  - JCL conversion for CA7 scheduler
    - DGOTO &C_L2JN PROC code reduction
    - CA7 commands &C_ variables reduced by JCL PROC expander
    - table analysis for keywords desired
  - CA7 cross-references
    - generating CA7 cross-reference reports
    - converting cross-refs to .csv, summary & detail, 4-up & 1-up
Part_3 - comprehensive guide to creating the control files that supply file
    info (record-size, file-type, key locations, etc) to assist
    conversion of DATA files & improve JCL conversion
  - LISTCAT reports transferred from the mainframe to extract file info
    that may be missing in the JCL
  - re-converting JCL with more control-file info to improve conversions
Part_4 - comprehensive instructions to convert mainframe data files
  - allows for the complexities of real data conversion vs the demos
    in DATAcnv1.htm#Part_3, which were quicker & easier, but
    converted 1 file at a time & renamed datafiles same as copybooks
  - Part_4 converts all files in the directory and inserts the
    actual data filenames in the generated jobs (vs copybooknames)
  - we will create a control file to relate data files to copybooks
    - the control file allows for multiple datafiles using the same copybook
  - FTP files to Unix/Linux via JCL operating on the Mainframe
    using the 'LOCSITE RDW' option to FTP variable length files
    - optional SORT option 'FTOV' to convert Fixed to Variable before FTP RDW
Part_5 - Test/Debug Tips & Techniques
  - Test Environment: RUNLIBS & RUNDATA superdirs defined in your profile
    - allows different programmers to have their own set of Libraries & Data
  - use these tips for the Demo JCL/scripts or for your own JCL/scripts
  - Converting Your Files - brief review of DATAcnv1.htm
  - run 'testdatainit' before JCL/scripts
    - to clear output files & make it easier to see outputs of the current test
  - use 'joblog' to capture the log from your JCL/script
  - Test/Debug for difficult JCL/scripts
    - save all data1/* in datasave/ & load data1/ with files for the difficult job
  - iterations of test/investigate/modify as required
    - Animator/Debugger for Micro Focus COBOL
  - check results of test/debug, investigate output files (use uvhd if packed)
  - print-outs using 'uvlp' scripts assist test/debug
  - modifying the number of generations in the GDG control file & reloading the Indexed file
  - GDG files & step Restart
  - activating 'console-logging' vs 'job-logging'
  - File Comparison for files with packed/binary &/or no LineFeeds
    - unix 'diff' does not work for these types of files
    - UV Software provides the 'uvcmp' utilities
      - uvcopy jobs uvcmp1,2,3 & several scripts (uvcmpFA1, uvcmpFE1, etc)
        to make the uvcopy jobs easier to run
      - uvcmp prints mismatched record pairs in vertical hexadecimal,
        flagging differences with '*'s; see the sample report comparing
        2 generations of gl.account.master_000001 & _000002
Part_11 - Conversion support scripts & utilities
  - summary of scripts, C programs,& uvcopy jobs used for JCL conversion
  - listings of a few important scripts (jcl2ksh51A, jcldata51A, cnvMF51A)
  - if Vancouver Utilities installed, see all in the following directories:
      /home/uvadm/sf/IBM/ - scripts used for JCL/COBOL/DATA conversions
      /home/uvadm/pf/IBM/ - uvcopy jobs used for JCL/COBOL/DATA conversions
      /home/uvadm/src/    - C programs used for JCL/COBOL/DATA conversions
Part_12 - Control-Files used for JCL, COBOL,& DATA conversions
  - jclunixop51 - control file for JCL conversion if Micro Focus COBOL
  - jclunixop53 - control file for JCL conversion if AIX COBOL
Part_13 - Optional Procedures for JCL Conversion
  - storing parm files in multiple subdirs (vs 1 combined directory)
    - required if parm-names are not unique
      (if parm-names in different PDS libraries have different contents)
  - converting JCL for the Micro Focus JCL emulator on Windows
    - expands procs, converts CA7 statements (DGOTO, etc)
1A1. Conversion Strategy
1A2. Initial pilot conversion & execution recommended
1A3. Directories for the Comprehensive Conversions vs Demo Conversions
1A4. Setup Userid & Directories for Real Conversion
1A5. Setup User profiles for Real Conversion
1A6. testlibs for mainframe COBOL & JCL conversion to unix/linux
1B1. Setup Directories for the Real conversion (vs earlier demos)
1B2. Directory illustrations of testlibs, testdata,& cnvdata superdirs
1B3. All subdirs in testlibs for conversion of JCL, COBOL,& Parms
1B4. Alternative for Re-Conversion - new user & new testlibs,testdata,cnvdata
     - so we can copy desired files & compare re-conversion results
1B5. Alternative for Re-Conversion - same user & just new testlibs
1C1. jclunixop51 - options control-file for the JCL converter
     - only a few site-dependent customizable options shown here
       (see the entire control file listed later in Part_12)
     - example updates to jclunixop51 options depending on site requirements:
       - changing lower case defaults to UPPER for filenames, programnames, parmnames
       - changing default COBOL calls to JAVA
       - changing parms default 1 combined library to multiple parmsds/subdirs
         (if your parmnames are not unique)
       - changing the COBOL call from default Micro Focus .ints to Windows
         executables, unikix, natural, etc.
1C2. search/replace tables for the JCL converter
     - only 1 example shown here, replacing hard-coded IP#s with $variables
       (see the entire control file listed later in Part_12)
     - example updates to jclunixop51 search/replace tables:
       - replace hard-coded IP#s with $variables
       - $variable definitions in profiles of scheduler/operators & programmers
         - so programmer testing can FTP to test sites vs production sites
1D1. Transfer JCL, COBOL,& Parms from mainframe to Unix/Linux
1D2. unzip COBOL, JCL, etc files & copy to the appropriate subdirs
1D3. Determine if parms can be combined into 1 subdir (default)
     or if multiple subdirs are required because parm-names are not unique
     - see Part_13 if not unique & multiple subdirs required
1F1. Converting Mainframe copybooks & COBOL to Unix/Linux
1F2. convert/compile 1 program or copybook at a time
     - re-compiling all COBOL programs
1F3. COBOL data-file info report
1G1. JCL conversion to Korn shell scripts for Micro Focus COBOL
1G2. Alternative JCL conversions for AIX COBOL
     - converting 1 JCL at a time (for Micro Focus or AIX COBOL)
1G3. converting JCL without re-converting PROCs & Parms
1G4. Re-convert JCL when IDCAMS/LISTCAT info available
1H1. Run the Cross-References
1H2. samples of most relevant cross-ref reports
1H3. consolidating COBOL, DB2,& Easytrieves in cross-ref reports
1H4. crossref summary (total references) to highlight utility usage
1I1. converting cross-refs to .csv, summary & detail, 4-up & 1-up
     - generate .csv's summary & detail, 4-up & 1-up
1I2. Samples: xkshprog2a.dtl, xkshprog2a_dtl.csv, xkshprog2a_dtl1.csv
1M1. create missing files reports
     - sample missing files reports
1N1. Not Used reports for parms, procs, copybooks,& programs
     - sample reports - for copybooks
1N2. generate Not Used reports for copybooks
     - viewing/printing reports - example for copybooks
1N3. generate Not Used reports for parms
     - extra steps required if multiple subdirs for duplicate names
1O1. COBOL data-file info report
1P1. Determine Essential INPUT files for JCL/scripts
     - mvsfiles5A - Missing Files Report for All JCLs
1P2. mvsfiles51 - Missing Files Report for 1 JCL at a time
1P3. Essential file reports - mvsfiles3,5,6,7
1P4. Convert mvsfiles6 to data conversion control file for Part_3 & Part_4
This JCLcnv2real.htm is the comprehensive conversion guide for JCL & COBOL vs JCLcnv1demo.htm which illustrated conversions for simplistic JCL/scripts. Part 1 intends to provide the complete details required to convert the more complex JCL & COBOL that you may have at your site. This is a step by step guide with illustrations of input/output files to help you understand the process.
I recommend a small pilot conversion before you attempt to convert all data files: it trains the conversion team on a small, manageable project & lets you discover & correct any conversion problems before much effort is expended.
JCLcnv1demo.htm suggested setting up conversion superdirs (testlibs,testdata, cnvdata) in your homedir, because initial profiles define RUNLIBS,RUNDATA,CNVDATA as $HOME/testlibs, $HOME/testdata, $HOME/cnvdata, which is convenient for Demos & initial training - each team member can have their own set of libraries & data.
export RUNLIBS=$HOME/testlibs   # personal libs (in homedir) for training
export RUNDATA=$HOME/testdata   # personal data for testing in homedir
export CNVDATA=$HOME/cnvdata    # personal data conversion in homedir
For the REAL conversion, you should setup the conversion superdirs in some other file system (other than /home/...) with lots of space. The names of file systems depend on the sysadmin who set them up. Using /p1 for example:
export RUNLIBS=/p1/cnv/cnv1/testlibs   # common libs for all team members
export RUNDATA=/p1/cnv/cnv1/testdata   # common data
export CNVDATA=/p1/cnv/cnv1/cnvdata    # common data conversion
You could use separate file systems /p1, /p2, /p3 if desired. It is better to have at least 2 levels between the mount point & the working directories (ie /p1/cnv/cnv1/testlibs - not /p1/testlibs). This is more flexible and allows multiple sets of libraries & data. For example, when you get a fresh set of JCL/COBOL/Parms from the mainframe, you can save the current testlibs as testlibs.old & make a new testlibs to convert the fresh set, in case you want to compare results & copy control/option files from old to new.
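For example, a minimal sketch of that save/re-convert cycle (the directory names simply follow the example above; adjust for your site):

 cd /p1/cnv/cnv1                  # superdir holding the current testlibs
 mv testlibs testlibs.old         # save the current set of libraries
 mkdir testlibs                   # new empty set for the fresh JCL/COBOL/Parms
 cp -r testlibs.old/ctl testlibs  # carry forward your customized control/option files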
We have been calling our superdirs for libraries & data 'testlibs' & 'testdata'. When you get to production, you could call them 'prodlibs' & 'proddata', or just 'libs' & 'data'.
For the REAL conversion, there should be only 1 set of libraries, but programmers can still have their own set of testdata files - just change the definition of RUNDATA in their profile, for example:
export RUNLIBS=/p1/cnv/cnv1/libs       # common libs for all team members
export RUNDATA=$HOME/testdata          # private testdata for programmers
export CNVDATA=/p1/cnv/cnv1/cnvdata    # common data conversion
We recommend setting up a separate common userid & directories for your REAL conversions (vs demo/test conversions that might be done in homedirs). Our suggested common userid is 'cnv1' & its homedir will be /p1/cnv/cnv1/... This allows you the potential of setting up other userids for progressive phases of your conversion (cnv2, cnv3, etc). The intermediate dir /cnv/ follows good unix principles & gives you more flexibility (a place to store related files).
#1. login as 'root'
#2a. fdisk /dev/sdb        <-- separate partition for conversions
#2b. mkfs /dev/sdb1        <-- make file system
#2c. mkdir /p1             <-- setup dir for mount point
#2d. mount /dev/sdb1 /p1   <-- mount partition
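If you want the /p1 mount re-established automatically at boot, you could also add an entry to /etc/fstab - shown here only as an assumption for a Linux ext4 file system; check the conventions used by your sysadmin:

 /dev/sdb1   /p1   ext4   defaults   0 2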
#3. mkdir /p1/cnv <-- make superdir for cnv1 ============= & possible future cnv2, cnv3, etc
#4. groupadd apps <-- setup group 'apps', IF NOT ALREADY SETUP ============= - https://uvsoftware.ca/jclcnv1demo.htm#1D8
#5. useradd -m -g apps -s /bin/bash cnv1 <-- setup user 'cnv1' ====================================== - option -g specifies group 'apps' - use option '-s' to specify login shell as 'bash' - could specify '-s /bin/ksh' if preferred - JCL/scripts code 1st line '#!/bin/ksh' because only the Korn shell has '$FPATH' to functions $APPSADM/sfun/... (jobset51,exportgen0,etc) & allows 'autoload' at beginning of JCL/scripts to declare functions that may be called within that script
#5a. useradd -m -d /export/home/cnv1 -g apps -s /bin/ksh cnv1 ============================================================ - must specify '-d ...' homedir option for SUN Solaris - homedir defaults to /home/cnv1 if '-d' not specified
#6. passwd cnv1 <-- setup password desired ===========
#7. chmod 755 /home/cnv1 <-- allow file copy between userids ====================
See the supplied user & common profiles listed at JCLcnv1demo.htm#1C1 & 1C2. These are distributed in $UV/env/... but you are instructed to copy to $APPSADM/env & modify as required for your site. See instructions at https://uvsoftware.ca/install.htm#B4. We expect you would have copied the _uv profiles & renamed for your company, for example:
cp $UV/env/stub_profile_uv   $APPSADM/env/stub_profile_ABC
cp $UV/env/common_profile_uv $APPSADM/env/common_profile_ABC
vi $APPSADM/env/stub_profile_ABC    <-- modify as required for your site
vi $APPSADM/env/common_profile_ABC  <-- modify as required for your site
#7. cp $APPSADM/env/stub_profile_ABC /p1/cnv/cnv1/.profile ======================================================== - copy stub_profile to conversion homedir & rename for Korn shell
#7a. cp $APPSADM/env/stub_profile_ABC /p1/cnv/cnv1/.bash_profile ============================================================ - OR copy & rename for Bash shell
#8. Change RUNLIBS/RUNDATA/CNVDATA profile definitions from $HOME defaults to the /p1/cnv/cnv1 directory (more space than homedirs)
#8a. vi /p1/cnv/cnv1/.profile        <-- modify for Korn shell profile
     ========================
#8b. vi /p1/cnv/cnv1/.bash_profile   <-- OR modify for Bash shell
     =============================
     - change definitions from $HOME to the new (larger) conversion directory
export RUNLIBS=$HOME/testlibs   # personal libs (in homedir) for training
export RUNDATA=$HOME/testdata   # personal data for testing in homedir
export CNVDATA=$HOME/cnvdata    # personal data conversion in homedir
--- change from above to as below ---
export RUNLIBS=/p1/cnv/cnv1/testlibs   # common libs for all team members
export RUNDATA=/p1/cnv/cnv1/testdata   # common data
export CNVDATA=/p1/cnv/cnv1/cnvdata    # common data conversion
#9. Logoff/Logon to make new RUNLIBS/RUNDATA/CNVDATA defs effective
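A quick way to confirm the new definitions after logging back on (standard shell, illustration only):

 echo $RUNLIBS $RUNDATA $CNVDATA    # should display the /p1/cnv/cnv1/... paths
 ls -d $RUNLIBS $RUNDATA $CNVDATA   # confirms the superdirs exist (once created in '1B1')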
On page '1A4', we setup user 'cnv1' with homedir /p1/cnv/cnv1, to hold the directories to perform the Real conversions. The Real conversion directories should be in a file system larger than the /home directories used for the Demo conversions.
#1. login cnv1 --> /p1/cnv/cnv1
#2a. mkdir testlibs   <-- make superdir for JCL/COBOL subdirs
#2b. mkdir testdata   <-- make superdir for DATA subdirs
#2c. mkdir cnvdata    <-- make superdir for DATA conversion
#3a. cdl <-- alias cdl='cd $RUNLIBS' --> cd testlibs ===
#3b. mvslibsdirs <-- setup 30 subdirs for JCL & COBOL conversions ===========
#4a. cdd <-- alias cdd='cd $RUNDATA' --> cd testdata ====
#4b. mvsdatadirs <-- setup 12 DATA subdirs for later execution JCL/scripts ===========
#5a. cdc <-- alias cdc='cd $CNVDATA' --> cd cnvdata ====
#5b. cnvdatadirs <-- setup 30 subdirs for DATA conversion EBCDIC to ASCII ===========
#6a. cdl <-- alias cdl='cd $RUNLIBS' --> cd testlibs ===
#6b. copymvsctls <-- script to copy control files from /home/uvadm/ctl =========== to $RUNLIBS/ctl/... - see copymvsctls listed at JCLcnv2real.htm#11C0
#7a. vi ctl/jclunixop51 <-- customize JCL converter options ? ================== - for Micro Focus COBOL
#7b. vi ctl/jclunixop53 <-- customize JCL converter options ? ================== - for AIX COBOL
We are illustrating the directories for JCL,COBOL,& DATA conversions & testing on one page - showing only a few of the actual subdirs required. See the next page to see most of the actual subdirs required for the JCL & COBOL conversions.
/p1/cnv/cnv1 <-- Real conversion in separate file system /home/userxx - NOT in home dirs as for demo conversions :-----testlibs <-- libraries for JCL & COBOL ($RUNLIBS) : :--*--cbl0 - COBOL programs, '*' indicates files from mainframe : :-----cbls - copy here (standard source library) before compiling : :--*--cpy0 - for COBOL copybooks : :-----cpys - copy here (standard copybook library) : :--*--jcl0 - test/demo JCLs supplied : :-----jcl3 - JCLs converted to Korn shell scripts : :-----maps - copybooks converted to record layouts : :--*--parm0 - parms,control cards,includes (SORT FIELDS, etc) : :-----parms - parmdir if parmnames unique (all parms in 1 dir) : :--*--proc0 - test/demo PROCs supplied : :-----procs - will be merged with jcl1, output to jcl2
/p1/cnv/cnv1 <-- Real conversion in separate file system :-----testdata <-- data dirs (defined as $RUNDATA in JCL/scripts) : :------data1 - datafiles (data1 for future flexibility) : :------ctl - GDG control file : :------joblog - programmer debug log files : :------jobmsgs - status msgs from JCL/scripts : :------jobtimes - job/step times date stamped history files : :------jobtmp - temporary files for SYSIN instream data : :------sysout - SYSOUT printer files : :------tmp - tmp subdir for uvsort & misc use
/p1/cnv/cnv1 <-- Real conversion in separate file system :-----cnvdata <-- data conversion superdir $CNVDATA : :--*--d1ebc - EBCDIC files from mainframe for conversion : :-----d2asc - files converted to ASCII with same record layout : :-----d3ebc - files converted back to EBCDIC for return to mainframe : :-----d4pipe - data files converted to '|' pipe delimited format : :-----cpys - COBOL copybooks : :-----maps - cobmaps (record layouts) generated from copybooks : :-----pfx1 - uvcopy jobs to convert EBCDIC to ASCII (gen from cobmaps) : :-----pfp1 - uvcopy jobs to convert to pipe delimited (from copybooks)
/p1/cnv/cnv1 <-- Real conversion in separate file system :-----testlibs : :--*--cbl0 - COBOL programs, '*' indicates files from mainframe : :-----cbl1 - cleaned up, cols 1-6 & 73-80 cleared, etc : :-----cbl2 - cnvMF5 converts mainframe COBOL to MicroFocus COBOL : :-----cbls - copy here (standard source library) before compiling : :-----cblst - cobol source listings from compiles : :-----cblx - compiled COBOL programs (.int's) : :--*--cpy0 - for COBOL copybooks : :-----cpy1 - cleaned up, cols 1-6 & 73-80 cleared, etc : :-----cpy2 - cnvMF5 converts mainframe COBOL to MicroFocus COBOL : :-----cpys - copy here (standard copybook library) : :--UV-ctl - conversion control files (jclunixop51,cobdirectives) : :--*--ezt0 - easytrieve programs : :--*--jcl0 - test/demo JCLs supplied : :-----jcl1 - intermediate conversion 73-80 cleared : :-----jcl2 - PROCs expanded from procs : :-----jcl3 - JCLs converted to Korn shell scripts : :-----jcls - copy here manually 1 by 1 during test/debug : :-----maps - copybooks converted to record layouts : :-----mapsql - sql copybooks converted to record layouts : :--*--parm0 - parms,control cards,includes (SORT FIELDS, etc) : :-----parms - parmdir if parmnames unique (all parms in 1 dir) : :--*--parmsd0 - parm superdir, when parmnames not unique : : :-----PGAPO_CDG_GMUVCFC_XNM <-- multiple subdirs : : :-----PGAPO_CDG_MCVCNQI_XNM : :-----parmsds - parms cleaned up, lowercased, IP#s $variables : : :-----pgapo_cdg_gmuvcfc_xnm <-- multiple subdirs : : :-----pgapo_cdg_mcvcnqi_xnm <-- multiple subdirs : :-----parmsdx - used to determine if parmnames unique : :-----parmdiffs - diff outputs from parmdiff1 uvcopy job : :--*--proc0 - test/demo PROCs supplied : :-----procs - will be merged with jcl1, output to jcl2 : :--*--sql0 - SQL includes & DB2 scripts to create/load tables : :-----sql1 - cleaned up, cols 1-6 & 73-80 cleared, etc : :-----sql2 - cnvMF5 converts mainframe COBOL to MicroFocus COBOL : :-----sqls - copy here (standard copybook library) : :-----tmp - tmp dir for miscellaneous output files : :-----xmvs - essential file reports for each JCL/script : :-----xmvsA - essential file reports forAll JCL/scripts : :-----xnu - Not Used reports (copybooks,programs,parms,procs) : :-----xref - cross-references (see XREFjobs.htm) : :-----xrefdtl - prepare any detail crossref for conversion to .csv : :-----xrefdtlcsv - convert any detail crossref to .csv : :-----xrefsum - prepare any summary crossref for conversion to .csv : :-----xrefsumcsv - convert any summary crossref to .csv : :-----xrefmissing - missing file reports (extracted from cross-refs)
During conversion, you may need to re-transfer & re-convert the JCL/COBOL/DATA several times (since changes continue on the mainframe).
You should setup a new set of directories for these re-conversions, so you can copy over desired files from the prior set (or compare to prior conversions).
Our 1st alternative is to setup a new userid 'cnv2' (vs cnv1) & a new set of testlibs, testdata,& cnvdata. Here are just a few of the directories:
/p1/cnv/cnv2 <-- new userid 'cnv2' with new set testlibs,testdata,cnvdata :-----testlibs <-- libraries for JCL & COBOL ($RUNLIBS) : :--*--cbl0 - COBOL programs : :--*--cpy0 - copybooks : :--*--jcl0 - JCLs
/p1/cnv/cnv2 <-- new userid 'cnv2' :-----testdata <-- data dirs (defined as $RUNDATA in JCL/scripts) : :------data1 - datafiles (data1 for future flexibility) : :------ctl - GDG control file : :------joblog - programmer debug log files
/p1/cnv/cnv2 <-- new userid 'cnv2' :-----cnvdata <-- data conversion superdir $CNVDATA : :--*--d1ebc - EBCDIC files from mainframe for conversion : :-----d2asc - files converted to ASCII with same record layout : :-----d3ebc - files converted back to EBCDIC for return to mainframe
If you only need to re-transfer & re-convert the JCL & COBOL, assuming you will continue to use the existing testdata & cnvdata directories, you could just setup a new set of testlibs in existing user 'cnv1'.
We could rename the existing testlibs as testlibs.old & setup a new testlibs, OR, better, setup testlibs2 & rename the existing testlibs as testlibs1, to make it clearer when you compare or copy files to the new testlibs2. Here is an illustration showing only a few subdirs.
/p1/cnv/cnv1 <-- will use existing userid 'cnv1' :-----testlibs1 <-- OLD libraries for JCL & COBOL : :--*--cbl0 - cobol programs : :--*--cpy0 - copybooks : :--*--jcl0 - JCLs : : :-----testlibs2 <-- NEW libraries for JCL & COBOL <--**NOTE** : :--*--cbl0 - cobol programs : :--*--cpy0 - copybooks : :--*--jcl0 - JCLs
/p1/cnv/cnv1 <-- use existing cnv1/testdata for new testlibs2 :-----testdata <-- data dirs (defined as $RUNDATA in JCL/scripts) : :------data1 - datafiles (data1 for future flexibility) : :------ctl - GDG control file : :------joblog - programmer debug log files
/p1/cnv/cnv1 <-- use existing cnv1/cnvdata for new testlibs2 :-----cnvdata <-- data conversion superdir $CNVDATA : :--*--d1ebc - EBCDIC files from mainframe for conversion : :-----d2asc - files converted to ASCII with same record layout : :-----d3ebc - files converted back to EBCDIC for return to mainframe
2. mkdir testlibs2 ===============
2a. mv testlibs testlibs1 <-- optional, but clearer =====================
3. vi .bash_profile   <-- edit profile, change RUNLIBS as shown below
   ================
   export RUNLIBS=/p1/cnv/cnv1/testlibs2
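To confirm the change took effect (illustration only, standard shell):

 . ~/.bash_profile    # or log off/on as usual
 echo $RUNLIBS        # should now display /p1/cnv/cnv1/testlibs2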
Please see the full listing of the jclunixop51 file at '12A1', which is followed by more extensive explanations (than given here) of the options most likely to need changes for your site.
Here is the 'jclunixop51' options string, followed by explanations of a few of the options that you might need to modify for your site.
jclunixop51:a2b0c8d0e2f1g1h0i0j0k15l1999m1n3o8p0q0r0s0t15u0v0w0x0y6z0 #<- mvstest options
#           ====*=====*=======*=========*=========*==================

# *c0 - all filenames lower case
#  c1 - program names UPPER case
#  c2 - filenames UPPER case (also uop=l2 jcldata52)
#  c4 - control card modulenames UPPER case
#  c8 - UPPERcase filenames to lookup datactl53I

#  f0 - default file typ=RSF (allows packed/binary)
# *f1 - default uvsort/uvcp file typ=LST
#  f2 - over-ride file typ by ctl/datactl53I (if filename match)
#  f3 - f1+f2 force typ=LST regardless datactl53I

# *j0 - default COBOL calls as per option 'r'
#  j1 - generate JAVA calls (vs COBOL) for COBOLs called directly
#  j2 - include -DDDNAME=$DDNAME for each file in step
#  j4 - generate JAVA calls for COBOLs called by IKJEFT01
#  j8 - change output script file extension from .ksh to .java
#  j16 - insert # elif ((S0020 == 4)); then & # alias goto ...

#  m0 - generate exportfile modules original filename
# *m1 - $RUNLIBS/parms/... (or $RUNLIBS/parmsds/parmlib/...)
#  m2 - parms subdirs $RUNLIBS/parmsds/subdir/parmnames
#  m4 - generate --> sed -f $SEDSCRIPT $SYSIN_P >$SYSIN
#  m8 - assign parms to $RUNDATA vs $RUNLIBS default

# *r0 - gen cobrun, executes .int's & allows animation
#  r1 - gen runw for .ints NetExpress/Windows
#  r2 - assume executable programs either unix/windows
#  r4 - assume executables in PATH (progname only)
#  r8 - generate 'findexec' Multi-Level program search
#  r16 - generate 'unikixvsam $RLX/COBOLX'
#  r32 - generate Natural call (natsec batch ...)
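For illustration only (a hypothetical edit, not a recommendation): if your compiled programs were to be run with 'runw' under NetExpress/Windows (option r1 above) rather than the default 'cobrun' for .ints (r0), the only change to the options string would be r0 --> r1:

 jclunixop51:a2b0c8d0e2f1g1h0i0j0k15l1999m1n3o8p0q0r1s0t15u0v0w0x0y6z0 #<- r0 changed to r1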
$RUNLIBS/ctl/jclunixop51 (the options control-file used by the JCL converter) contains several search/replace tables that you may modify & re-convert. See '12A1' for a listing of the complete jclunixop51 file.
Here we will list just 1 of the search/replace tables, as an example of a problem that may be solved with this feature.
# REPTBL2 - replace any pattern on any OUTPUT (after conversion to script)
#         - entries must be tilde filled & table ended by a line of all tildes
#         - 01-30=search pattern, 31-60=replace pattern, 61-80=qualifier pattern
:REPTBL2: search/replace table for output UNIX script
219.68.193.1~~~~~~~~~~~~~~~~~~$IP_SITE_AAA~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
219.68.193.2~~~~~~~~~~~~~~~~~~$IP_SITE_BBB~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
219.68.193.3~~~~~~~~~~~~~~~~~~$IP_SITE_CCC~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
Then you would define the values for the IP $variables in the profiles. You could define the values desired for production in the common_profile used by operators & the job scheduler.
export IP_SITE_AAA=219.68.193.1
export IP_SITE_BBB=219.68.193.2
export IP_SITE_CCC=219.68.193.3
Programmers could define alternates for testing in their stub_profile, that would override those in the common_profile, for example:
export IP_SITE_AAA=220.120.77.7
export IP_SITE_BBB=220.120.77.8
export IP_SITE_CCC=220.120.77.9
open 219.68.193.1    <-- JCL before conversion
open $IP_SITE_AAA    <-- JCL/script after conversion

open 219.68.193.1    <-- JCL/script executed in production
open 220.120.77.7    <-- JCL/script executed by programmer testing
We assume that all mainframe library components (JCL, COBOL, PARMS) were zipped into 1 archive named BATCH_CONVERT_yymmdd.zip & FTP'd to unix.
We assume the directory structure illustrated on page '1B3'. Here are just a few of the subdirs where we will store the unzipped modules to be converted. We are omitting the many more subdirs required for conversions (see '1B3').
/p1/cnv/cnv1 <-- Real conversion in separate file system :-----testlibs <-- libraries for JCL & COBOL ($RUNLIBS) : :--*--cbl0 - COBOL programs, '*' indicates files from mainframe : :--*--cpy0 - for COBOL copybooks : :--*--sql0 - for SQL copybooks : :--*--ezt0 - for Easytrieve programs : :--*--jcl0 - test/demo JCLs supplied : :--*--parm0 - parms,control cards,includes (SORT FIELDS, etc) : :--*--proc0 - test/demo PROCs supplied
#0a. Login cnv1 --> /p1/cnv/cnv1 #0b. cdl (alias cd $RUNLIBS) --> /p1/cnv/cnv1/testlibs
#1. mkdir unzip <-- make subdir to unzip new base zip file ===========
#2. cp ../BATCH_CONVERT_yymmdd.zip unzip ====================================
#3. cd unzip ========
#4. unzip BATCH_CONVERT_yymmdd.zip <-- unzip new base files ==============================
#5. cdl <-- back up to testlibs ===
#6. cfdd unzip <-- script to Count Files in Directories ========== - results might be something like the following:
cfdd - Count Files in all sub-Directories of a super-Directory
00310 files in subdir #00001 unzip/COBOL
00391 files in subdir #00002 unzip/COPY
00091 files in subdir #00003 unzip/COPYSQL
00052 files in subdir #00004 unzip/EZTPROGRAM
00342 files in subdir #00005 unzip/EZTMACRO
01871 files in subdir #00006 unzip/JCL
01390 files in subdir #00007 unzip/JPROC
00069 files in subdir #00008 unzip/PARM
04516 Total files in 00008 subdirs of unzip  20160402:1535
#0a. Login #0b. cdl (alias cd $RUNLIBS) --> /p1/cnv/cnv1/testlibs/ (or your testlibs)
#01. cp unzip/COBOL/* cbl0
#02. cp unzip/COPY/* cpy0
#03. cp unzip/COPYSQL/* sql0
#04. cp unzip/EZTPROGRAM/* ezt0
#05. cp unzip/EZTMACRO/* ezm0
#06. cp unzip/JCL/* jcl0
#07. cp unzip/JPROC/* proc0
#08. cp -r unzip/PARM/* parm0
#09. cfdd . >tmp/testlibs_counts <-- create report, file counts in each subdir =========================== in testlibs ('.' = current directory)
#10. uvlp12 tmp/testlibs_counts <-- print out for comparison to unzip counts ========================== - see unzip counts on previous page
#11. Compare file counts in testlibs subdirs to file counts in unzip subdirs to ensure you have copied unzip/subdirs to correct testlibs/subdirs
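If you prefer to compare from a printout of both reports, a sketch only (tmp/unzip_counts is a hypothetical filename):

 cfdd unzip >tmp/unzip_counts   # capture the unzip counts shown on the previous page
 uvlp12 tmp/unzip_counts        # print for manual comparison with tmp/testlibs_counts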
Mainframe parms are usually stored in PDS libraries. Here are 3 examples of how the libraries might be defined in the JCL.
//SYSIN DD DSN=PGAPO.CDG.GMUVCFC.XNM(CTVITR)
//SYSIN DD DSN=PGAPO.CDG.MCVCNQI.XNM(CTVITR)
//SYSIN DD DSN=PGAPO.CDG.UVCOOF.XNM(CTVKMGN)
Each library may have multiple PDS members, for example:
parmsd0/PGAPO_CDG_GMUVCFC_XNM:
-rw-rw-r--. 1 cnv1 apps 16974 Feb  4 17:22 AIRGRP
-rw-rw-r--. 1 cnv1 apps 16974 Feb  4 17:22 AIRGW
-rw-rw-r--. 1 cnv1 apps 19106 Feb  4 17:22 AIRIKEL
-rw-rw-r--. 1 cnv1 apps  1148 Feb  4 17:22 EMAIL2

parmsd0/PGAPO_CDG_MCVCNQI_XNM:
-rw-rw-r--. 1 cnv1 apps   820 Feb  4 17:22 AIRGRP
-rw-rw-r--. 1 cnv1 apps   820 Feb  4 17:22 BAKDF
-rw-rw-r--. 1 cnv1 apps  9594 Feb  4 17:22 MECH
-rw-rw-r--. 1 cnv1 apps 11316 Feb  4 17:22 AIRIKEL

parmsd0/PGAPO_CDG_UVCOOF_XNM:
-rw-rw-r--. 1 cnv1 apps 17466 Feb  4 17:22 AIRIKEL
-rw-rw-r--. 1 cnv1 apps  1230 Feb  4 17:22 EMAIL2
-rw-rw-r--. 1 cnv1 apps  6560 Feb  4 17:22 MECH
Note that when we transferred the PDS libraries to unix, we have chosen to convert the DSN's to unix directories (stored within parmsd0/...)
For some mainframe sites, there may be only 1 PDS library & if multiple, the PDS member names may be unique, which means we could store all the members in 1 directory on unix - which is $RUNLIBS/parms/...
But for more complex sites, there may be many PDS libraries & a given member name may have different contents in different libraries. In this case (parm-names not unique), we need to store the parms in multiple subdirs. Please see the procedures in 'Part_13'.
The following instructions assume that parm-names are unique and can be stored in 1 directory $RUNLIBS/parms/... If parm-names are not unique, they need to be stored in multiple subdirs in $RUNLIBS/parmsd0/.../...
See Part_13 to determine if your parm-names are unique or not and if not follow the procedures given in Part_13, before returning here to '1F1' to convert the COBOL & '1G1' to convert the JCL.
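As a quick preliminary check using standard shell tools only (a sketch - the Part_13 procedures are the supported method), you could list any member names that appear in more than 1 PDS subdir:

 ls parmsd0/*/* | awk -F/ '{print $NF}' | sort | uniq -d   # no output means the parm-names are unique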
You should perform the COBOL conversions BEFORE the JCL conversions, because COBOL conversions create control files used by JCL conversions, to supply record-sizes for SORTs, etc in the JCL/scripts.
We will 1st convert the COBOL copybooks, followed by the COBOL programs. 'cnvMF51Acpy' will perform all steps of the copybook conversion.
If you have SQL copybooks in a separate copybook library (sql0), 'cnvMF51Asql' will be used in addition.
'cnvMF51A' will perform all steps of the COBOL program conversion. Here is an illustration of the conversion steps thru multiple subdirs.
cpy0 ---------> cpy1 ---------> cpy2 ----------> cpys cleanup convert copy(cp)
sql0 ---------> sql1 ---------> sql2 ----------> sqls cleanup convert copy(cp)
cbl0-------->cbl1-------->cbl2-------->cbls--------->cblx cleanup convert copy compile
#0a. Login #0b. cdl (alias cd $RUNLIBS) --> /p1/cnv/cnv1/testlibs/ (or your testlibs)
#1. cnvMF51Acpy all <-- convert copybooks thru all steps, as illustrated above =============== - reply null (take defaults) or reply 'y' if y/n demanded
#2. cnvMF51Asql all <-- convert SQL copybooks thru all steps, as illustrated above =============== - reply null (take defaults) or reply 'y' if y/n demanded
#3. cnvMF51A all <-- convert COBOL programs thru all steps, as illustrated above ============ - reply null (take defaults) or reply 'y' if y/n demanded - prompts to compile all programs (after cleanup/convert)
If in doubt about any prompt/replies - see console log displays of demo conversions:
https://uvsoftware.ca/jclcnv1demo.htm#3E2 - console log for copybook conversions
https://uvsoftware.ca/jclcnv1demo.htm#3E3 - console log for COBOL program conversions
cnvMF51 program.cbl - convert 1 COBOL program thru all stages & compile =================== - cleanup,convert,compile: cbl0 -> cbl1 -> cbl2 -> cbls
cnvMF51cpy copybook.cpy - convert 1 copybook (cleanup & convert) ======================= cpy0 --> cpy1 --> cpy2 --> cpys
mfcbl1 program.cbl <-- compile 1 COBOL program, expects ================== - source in cbls/, copybooks in cpys/, writes to cblx/
mfcblA all <-- compile all programs, input from subdir cbls, output to ========== subdir cblx: .int, .idy, .err,& .cbl (copy for animation)
Script 'mfcblA' is provided to compile all programs in the COBOL source subdir 'cbls' (using copybooks in 'cpys'), storing compiled executable programs in 'cblx'.
#1a. cnvAIXcpyA all <-- convert copybooks for AIX COBOL ===============
#2a. cnvAIXcblA all <-- convert COBOL programs for AIX COBOL ==============
aixcblA all               - compile ALL programs in directory (for AIX COBOL)
aixcbl1 program.cbl       - compile 1 program at a time (for AIX COBOL)

aixcblADB2 all            - compile ALL programs in directory for DB2
aixcbl1DB2 program.cbl    - compile 1 program at a time for AIX COBOL with DB2

aixcblAsub all            - compile ALL (called) programs in directory
                          - add to archive for linking with calling programs
aixcbl1sub program.cbl    - compile 1 called program

aixcblAsubDB2 all         - compile ALL (called) programs in directory for DB2
                          - add to archive for linking with calling programs
aixcbl1subDB2 program.cbl - compile 1 called program for DB2
'cobfiles5A' is a script provided to generate a "COBOL data-files report". It reads all programs in the cbls/* directory & writes a report including filenames, Input/Output, Organization, Access method, record size,& copybook. Programmers should have a copy before starting test/debug. You can run this any time after you have converted the copybooks & COBOL programs.
#0a. Login #0b. cdl (alias cd $RUNLIBS) --> /p1/cnv/cnv1/testlibs/ (or your testlibs)
#1. cobfiles5A cbls cpys maps <-- creates xref/cobfiles report =========================
cobfil51  ** COBOL Files Report **  Dir=cbls  2016/02/11_10:48:07
progname.cbl DDname  OAM open recsz pb copybook.cpy FDname   Key  lines
================================================================================
car100.cbl   custmas SS_ I___  256  p  custmas.cpy  custmas
car100.cbl   nalist  L__ O___  120     nalist                     48
car200.cbl   saledtl SS_ I___   64     saledtl.cpy  saledtl
car200.cbl   custmas IR_ I___  256  p  custmas.cpy  custmas
car200.cbl                                          key-> cm-cust
car200.cbl   salelst L__ O___  120     sdline.cpy   salelst       60
cgl100.cbl   acctmas SS_ I___  128  p  acctmas
cgl100.cbl   actlist L__ O___  120     actlist                    53
cgl200.cbl   glmsold SS_ I___  128  p  glmsold
cgl200.cbl   glmsnew SS_ O___  128  p  glmsnew
cgl200.cbl   gltrans LS_ I___   80     gltrans                    64
Total programs = 4, total files = 10
You should perform the COBOL conversions BEFORE the JCL conversions, because COBOL conversions create control files used by JCL conversions, to supply record-sizes for SORTs, etc in the JCL/scripts.
Script 'jcl2ksh51A' performs all steps for conversion of all files in the directories of JCLs, PROCs,& Parms. Here is an illustration of the conversions thru several subdirs. Initial subdirs of mainframe files are jcl0,proc0,& parm0.
/p1/cnv/cnv1 :-----testlibs : :--*--jcl0 - JCL from mainframe : :-----jcl1 - intermediate conversion 73-80 cleared : :-----jcl2 - PROCs expanded to pure JCL : :-----jcl2fix - JCL copied converting CA7 &symbols to %%symbols : :-----jcl3 - JCLs converted to Korn shell scripts : :--*--proc0 - PROCs from mainframe : :-----procs - will be merged with jcl1, output to jcl2 : :-----tmp1 - test jclfixCA7 inputs jclfixtest1 & jclfixftp1 : :-----tmp2 - test jclfixCA7 outputs
proc0 -------> procs parm0 -------> parms cleanup cleanup
jcl0 -----> jcl1 ---------> jcl2 ---------> jcl3 ------> jcls cleanup PROC expand convert to ksh copy 1 at a time to test
--- Alternative for Micro Focus JCL Emulator ---
jcl0 -----> jcl1 ---------> jcl2 ---------> jcl2fix -------> Windows cleanup PROC expand jclfixCA7 JCL engine
Script 'jcl2ksh51A' does everything. If you have problems, you could enter the various commands manually. See listing of jcl2ksh51A on page '11A1'.
#1. jcl2ksh51A all <-- convert ALL JCL thru all steps, as illustrated above ============== - reply null (take default) at all prompts - or reply 'y' for prompts that demand a y/n response
If in doubt about any prompt/replies - see console log displays of demo conversions: https://uvsoftware.ca/jclcnv1demo.htm#3F3
#2. uvcopy jclfixCA7,fild1=jcl2,fild2=jcl2fix ========================================= - convert CA7 &symbols to Micro Focus %%symbols for all JCLs in jcl2/... while copying to jcl2fix/... - also convert FTP steps to run under Windows
See details & examples of Micro Focus JCL conversion on pages '1L1' - '1L5'
#1a. jcl2ksh53A all <-- convert JCL thru all steps for AIX COBOL ==============
Use 'jcl2ksh53A' vs 'jcl2ksh51A' if the JCL being converted is intended to execute AIX COBOL vs Micro Focus COBOL. For AIX COBOL, we generate a call to the linked program vs the .int for Micro Focus. The GDG file handler is different for AIX, and some file types require a TYPE- prefix on the exportfile.
jcl2ksh53 jcl0/jar100.jcl <-- example, convert jar100.jcl ========================= from jcl0 --> jcl1 --> jcl2 --> jcl3
Use script 'jcl2ksh51' for 1 at a time vs 'jcl2ksh51A' for All in directory.
jcl2ksh51 jcl0/jar100.jcl <-- example, convert jar100.jcl ========================= from jcl0 --> jcl1 --> jcl2 --> jcl3
jcl0 -----> jcl1 ---------> jcl2 ---------> jcl3 ------> jcls cleanup PROC expand convert to ksh copy 1 at a time to test
You must 1st use 'jcl2ksh51A' to convert All because it also converts the PROCs and PARMs. Use 'jcl2ksh51' for later additions from the mainframe.
These instructions can save you time when you need to re-convert because you are changing options or control-files or UV Software has enhanced the PROC or JCL converters - and NOT because your input source JCL/COBOL/PARMs have changed.
You can use jclpx51 to re-expand the PROCs in all jcl1/* --> jcl2/... Use if you modify the jclprocop51 options in the JCL converter control file ($RUNLIBS/ctl/jclunixop51) OR if UV Software has enhanced the PROC expansion utility to fix problems you have reported.
You can skip the PROC expansion if it is only the JCL options, control file, or the JCL converter program has changed.
jclpx51 jcl1 jcl2 <-- expand PROCs in all jcl1/* --> jcl2/... =================
Note that there is no separate AIX version of the PROC expander, as there is for the JCL converter - PROC expansion is the same for both.
You can use jclxx51/jclxx53 to reconvert JCL to scripts when you do not need to repeat the JCL cleanup & PROC expand (jcl0 --> jcl1 --> jcl2). Use these if you modify the JCL converter control file OR if UV Software has enhanced the JCL converter to fix problems you have reported.
jclxx51 jcl2 jcl3 <-- convert all jcl2/* --> jcl3/... (Micro Focus) =================
jclxx53 jcl2 jcl3 <-- convert all jcl2/* --> jcl3/... (AIX) =================
You can improve the JCL conversions by transferring IDCAMS/LISTCAT reports from the mainframe. See Part_3 for the detailed instructions.
The LISTCAT reports can supply data file information such as record-sizes, file-type, Indexed key location/length, no of generations for GDG files, etc.
The 1st JCL conversion without the LISTCAT info will probably result in some missing record-sizes on 'uvsort's (converted from SORTs) & 'uvcp's (converted from IDCAMS). Missing record sizes are indicated by rcs=99999.
If LISTCAT files are not immediately available, you might choose to continue with the following cross-refs, etc, and re-convert the JCL when the LISTCAT files are available.
See 'Part_3' for detailed instructions on converting the mainframe LISTCAT reports into the control files that will improve the JCL conversions.
JCL conversion script 'jcl2ksh51A' calls 'jcldata51A' which creates control file ctl/jcldata53I.dat/.idx looked-up by JCL converter to get file info.
ctl/jcldata53I is created from several sources.
It is a good idea to run the cross-references now, because they can help to resolve various problems you may encounter during testing/debugging.
#0a. Login #0b. cdl (alias cd $RUNLIBS) --> /p1/cnv/cnv1/testlibs/ (or your testlibs)
#2a. renameL ezt0 <-- convert to lower-case if not already ============ #2b. rename+X ezt0 .ez <-- append extension '.ez' for cross-refs ================= ONLY IF no .ez_ extension present
#3. xrefall cbls jcl3 <-- create all cross-ref reports in subdir xref/... ================= - cross-ref reports listed below:
xcobcopy1, xcobcopy2
xcobcall1, xcobcall2
xkshfile1, xkshfile2
xkshparm1, xkshparm2
xkshparmsd1, xkshparmsd2
xkshprog1, xkshprog2, xkshprog2a
xjclproc1, xjclproc2
You can see samples of all cross-references in JCLcnv3aids.htm, but here are a few examples of the most useful cross-references that will help you test/debug by identifying missing called-programs, copybooks,& programs. We are selecting only a few lines from the reports to illustrate missing modules.
xcobcopy2  ** crossref all PROGRAMS using each COPYBOOK **
=======================================================2015/09/20_11:06:10
custmas.cpy   car100.cbl car102.cbl car105.cbl car110.cbl
___________06 car115.cbl car120.cbl
*paymas.cpy   cpy100.cbl
*sqlca.cpy    sqlora2.cbl
***Total Missing CopyBooks 2 ***
xcobcall2  ** crossref to show all PROGRAMS calling each CALLED-PROGRAM **
=======================================================2015/09/20_11:06:10
getdate    car115.cbl
*getparm   car130.cbl car140.cbl
***Total Missing called programs = 1 ***
xkshprog1  ** list all PROGRAMs executed in each ksh SCRIPT **
=======================================================2015/09/20_11:26:51
jgl230.ksh  cgl100 cgl200 sort
jgl300.ksh  idcams iebgener
jpy200.ksh  cpy200 *ppy299 sort
jpy300.ksh  cpy200 *cpy300 *ppy299 sort
***Total Missing Programs 3 ***
xkshprog2  ** crossref to show all ksh SCRIPTS executing each PROGRAM **
=======================================================2015/09/20_11:27:00
cgl100   jgl100.ksh jgl105.ksh jgl230.ksh
cgl200   jgl200.ksh jgl210.ksh jgl211.ksh jgl212.ksh
*cpy300  jpy300.ksh
*ppy299  jpy200.ksh jpy300.ksh
***Total Missing Programs 2 ***
'xkshprog2a' alternate version of 'xkshprog2' to consolidate COBOL,DB2,& Easytrieve. Compare 'xkshprog2' (1st below) & 'xkshprog2a' (2nd below) and Note consolidation of all COBOLs into 1 group, all DB2 '+' flags, & all Easytrieve '^' flags.
#xkshprog2 ** crossref to show all ksh SCRIPTS executing each PROGRAM ** #======================================================2016/02/11_08:41:01 a12621a gapocece.ksh <--
a25651 gapochch.ksh gapoanfe.ksh <-- COBOLs will be combined
aadr460_3 gapocdhg.ksh gapoacfe.ksh <--
^aadr725_2 gapoxlcd.ksh gapovjac.ksh
adrdssu gapocdcz.ksh gapoabbl.ksh gapoabbv.ksh gapoabde.ksh_2 _______ -------------- many lines removed ----------------- ________938 gapoxvcb.ksh gapovzmm.ksh
^batr230_2 gapooab5.ksh gapomyz7.ksh
+dexm0409_2 gapocd32.ksh gapoac32.ksh
+dexm1055_3 gapocam3.ksh gapoayk8.ksh gapoaykc.ksh
idcams gapocd23.ksh_25 gapoab31.ksh_3 gapoab32.ksh_2 gapoab33.ksh_2 _______ -------------- many lines removed ----------------- _______5453 gapoxboo.ksh_5 gapovzmy.ksh_2 gapovzno.ksh_4
#xkshprog2a ** crossref to show all ksh SCRIPTS executing each PROGRAM ** #======================================================2016/02/11_08:40:06 adrdssu gapocdcz.ksh gapoabbl.ksh gapoabbv.ksh gapoabde.ksh_2 _______ -------------- many lines removed ----------------- ________938 gapoxvcb.ksh gapovzmm.ksh
cobol gapocece.ksh gapoabfe.ksh gapoacfe.ksh gapoafaf.ksh _______6 gapochhg.ksh <-- Note COBOLs combined
+coboldb2 gapocd32.ksh gapoac32.ksh gapoayk3.ksh gapoayk8.ksh __________5 gapocame.ksh <-- Note DB2 COBOLs combined
^easytrv gapoxocq.ksh gapovjab.ksh gapovjac.ksh gapomyz5.ksh __________5 gapocame.ksh <-- Note Easytrieves combined
idcams gapocd23.ksh_25 gapoab31.ksh_3 gapoab32.ksh_2 gapoab33.ksh_2 _______ -------------- many lines removed ----------------- _______5453 gapoxboo.ksh_5 gapovzmy.ksh_2 gapovzno.ksh_4
Consolidating COBOL, DB2,& Easytrieve programs highlights the Utilities (adrdssu & idcams in the examples above). The JCL converter handles many utilities but you may have some 3rd party software not known to the converter. The converter assumes COBOL if the program is unrecognized.
The 'xkshprog2a' report could be 2000 lines long if you have over 1500 JCL/scripts. We provide uvcopy job 'xrefdrop1' to summarize the xkshprog2a report to highlight mainframe 3rd party & home-grown utilities in your JCL that you will need to find replacements for in the unix/linux environment.
#0a. Login #0b. cdl (alias cd $RUNLIBS) --> /p1/cnv/cnv1/testlibs/ (or your testlibs)
#1. uvcopy xrefdrop1,fili1=xref/xkshprog2a,filo1=xref/xkshprog2a_sum ================================================================ - drop xref lines, retaining only the 1st & last (total line) for each program
adrdssu gapo30fg.ksh gapo30f4.ksh gapo30fd.ksh gapo30p1.ksh ------ many lines dropped except 1st & last ------ ________983 gapoxvcp.ksh gapovtay.ksh gapovtaz.ksh gapovzmm.ksh
ca7bti gapohlcd.ksh_2 gapofjac.ksh_2 gapofjad.ksh_2 gapofjan.ksh_2 _______52 gapoooec.ksh_2 gapommwa.ksh gapotrmd.ksh
*cobol gapo30d4.ksh_4 gapo30b5.ksh_2 gapo30fs.ksh gapo30l9.ksh ______2761 gapoxnoo.ksh <-- All COBOL programs consolidated
*coboldb2 gapo30d1.ksh_6 gapo30b4.ksh_6 gapo30bn.ksh_6 gapo30ff.ksh_2 _________455 gapopqmd.ksh_4 gaponokc.ksh_4 gaponox2.ksh gapour07.ksh
dsntiaul gapo4032.ksh gapo40fo.ksh gapo41fo.ksh gapo42fo.ksh _________490 gapoxboa.ksh gapovzno.ksh_2
eztpa00 gapo30a3.ksh gapo30y5.ksh_3 gapo40bl.ksh_2 gapo40y3.ksh ________202 gapopqu1.ksh gaponox3.ksh gaponoy3.ksh
ftp gapo30hh.ksh gapo40bl.ksh_2 gapoabbl.ksh_4 gapoabcq.ksh_3 ____1177 gapopqnv.ksh_3 gaponoob.ksh_3 gaponoof.ksh gaponozd.ksh_4
idcams gapo30d1.ksh_13 gapo30b4.ksh_16 gapo30b5.ksh_2 gapo30bi.ksh _______6420 gapoxboo.ksh_5 gapovzmy.ksh_2 gapovzno.ksh_4
iebgener gapo30hh.ksh gapo30l9.ksh gapo30lj.ksh gapo30p2.ksh_5 _________1304 gapoxoij.ksh gapovmgv.ksh
sort gapo30d1.ksh_6 gapo30b4.ksh_9 gapo30bn.ksh_7 gapo30fs.ksh_4 _____4372 gapoxvcp.ksh gapovtay.ksh_5 gapovtaz.ksh_4
txt2pdf gapocemw.ksh_14 gapoafku.ksh gapoanku.ksh_2 gapoaobl.ksh_3 ________117 gapooabr.ksh_4 gapomyzq.ksh_2 gaponoft.ksh gaponoku.ksh_14
xmitip gapocd23.ksh gapoabam.ksh gapoabbl.ksh gapoabbo.ksh _______410 gapopqa3.ksh_2 gapoping.ksh gapovmao.ksh
Script 'xref2csvall' is provided to convert any of the xref/... reports to .csv format, summary & detail, 4-up & 1-up. You can see the 'xref2csvall' script listed later in this documentation on page '11D3' or at $UV/sf/util/xref2csvall. Here we will demonstrate using the following sample of the xref/xrefprog2a report.
#xkshprog2a ** crossref to show all ksh SCRIPTS executing each PROGRAM **
#======================================================2016/02/11_08:05:33
cobol       jar100.ksh jar200.ksh jgl100.ksh jgl200.ksh
______6     jgl230.ksh_2
eztpa00     eztlist.ksh
ftp         ftpput1.ksh
idcams      jgl320.ksh
iebgener_2  jgl320.ksh mailsmtp.ksh
quikjob     qjtlist.ksh
sort_3      jar200.ksh jgl200.ksh jgl230.ksh
#0a. Login #0b. cdl (alias cd $RUNLIBS) --> /p1/cnv/cnv1/testlibs/ (or your testlibs)
#1. xrefprog2a jcl3 <-- must 1st generate the crossref input file =============== - xrefprog2a is part of xrefall (already run)
#2. mkdir xrefdtl xrefdtlcsv xrefsum xrefsumcsv =========================================== - make subdirs to receive the 6 outputs from the 'xref2csvall' script
#3. xref2csvall xref/xkshprog2a ============================ - generates 6 different versions of the crossref as follows:
#2a. xrefdtl/xrefprog2a.dtl - detail 4-up #2b. xrefdtlcsv/xrefprog2a_dtl.csv - detail 4-up .csv #2c. xrefdtlcsv/xrefprog2a_dtl1.csv - detail 1-up .csv #2d. xrefsum/xrefprog2a.sum - summary 4-up #2e. xrefsumcsv/xrefprog2a_sum.csv - summary 4-up .csv #2f. xrefsumcsv/xrefprog2a_sum1.csv - summary 1-up .csv
cobol     jar100  1  jar200  1  jgl100  1  jgl200  1    6
cobol     jgl230  2                                     6
eztpa00   eztlist 1                                     1
ftp       ftpput1 1                                     1
idcams    jgl320  1                                     1
iebgener  jgl320  1  mailsmtp 1                         2
quikjob   qjtlist 1                                     1
sort      jar200  1  jgl200  1  jgl230  1               3
cobol,jar100,1,jar200,1,jgl100,1,jgl200,1,6,
cobol,jgl230,2,,,,,,,6,
eztpa00,eztlist,1,,,,,,,1,
ftp,ftpput1,1,,,,,,,1,
idcams,jgl320,1,,,,,,,1,
iebgener,jgl320,1,mailsmtp,1,,,,,2,
quikjob,qjtlist,1,,,,,,,1,
sort,jar200,1,jgl200,1,jgl230,1,,,3,
cobol,jar100,1,,
cobol,jar200,1,,
cobol,jgl100,1,,
cobol,jgl200,1,,
cobol,jgl230,2,,
cobol,,,6,
eztpa00,eztlist,1,,
eztpa00,,,1,
ftp,ftpput1,1,,
ftp,,,1,
idcams,jgl320,1,,
idcams,,,1,
iebgener,jgl320,1,,
iebgener,mailsmtp,1,,
iebgener,,,2,
quikjob,qjtlist,1,,
quikjob,,,1,
sort,jar200,1,,
sort,jgl200,1,,
sort,jgl230,1,,
sort,,,3,
The cross-references '*' flag missing files that can be detected. We can extract the '*' flagged lines from the following cross-references:
xcobcall2, xcobcopy2, xkshparm2, xkshproc2, xkshprog2
#0a. Login #0b. cdl (alias cd $RUNLIBS) --> /p1/cnv/cnv1/testlibs/ (or your testlibs)
#1. mkdir xrefmissing <-- make subdir for missing files reports =================
#2. uvcopy select2d,fild1=xref,fild2=xrefmissing,arg1=*,uop=a0b1 ============================================================
#3. vi xrefmissing/* <-- investigate the missing files reports ================
*getparm_2 car130.cbl car140.cbl --------------------> 1 selections from xref/xcobcall2
*paymas.cpy cpy100.cbl --------------------> 1 selections from xref/xcobcopy2
*ppy200s1_2 jpy200.ksh jpy300.ksh --------------------> 1 selections from xref/xkshparm2
*ppy299_4 jpy200.ksh_2 jpy300.ksh_2 --------------------> 1 selections from xref/xkshproc2
*cobol401_2 jgl400.ksh jgl420.ksh *cobol402_2 jgl400.ksh jgl420.ksh *cpy300 jpy300.ksh *mailtoln maildemo.ksh *ppy299_4 jpy200.ksh_2 jpy300.ksh_2 --------------------> 5 selections from xref/xkshprog2
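If you want a quick count of the '*' flagged (missing) items in each extract, a sketch using standard grep to count the lines beginning with '*':

 grep -c '^\*' xrefmissing/*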
The 'xnu1' script makes it easy to run the uvcopy jobs (xnuload1 & xnuload1) required to create the "Not Used" reports for JCL & COBOL items such as parms, procs, copybooks,& COBOL programs.
#1. xnu1 xref/xcobcopy2 cpys <-- create Not Used reports for cpys ======================== xnu/cpysUsed, xnu/cpysNU, xnu/cpysAll
#2. xnu1 xref/xkshprog2 cbls <-- create Not Used reports for COBOL programs ======================== xnu/cblsUsed, xnu/cblsNU, xnu/cblsAll
#3. xnu1 xref/xjclproc2 procs <-- create Not Used reports for procs ========================= xnu/procsUsed, xnu/procsNU, xnu/procsAll
#4a. xnu1 xref/xkshparm2 parms <-- create Not Used reports for parms ========================= xnu/parmsUsed, xnu/parmsNU, xnu/parmsAll - for sites with All parms in 1 subdir parms/...
#4b. xnu1 xref/xkshparmsd2 parms <-- create Not Used reports for parms =========================== xnu/parmsUsed, xnu/parmsNU, xnu/parmsAll - for sites with duplicate parmnames in multiple subdirs parmsds/subdir/... - see special instructions for parms on page '1M3'
acntmas.cpy* <-- xnu/cpysAll - all copybooks with '*' flag Not Used acnttran.cpy* citytax1.cpy custmas1.cpy* custmas255.cpy custmas.cpy saledtl.cpy
citytax1.cpy <-- xnu/cpysUsed - Used copybooks (no '*' flags) custmas255.cpy custmas.cpy saledtl.cpy
acntmas.cpy* <-- xnu/cpysNU - Not Used copybooks with '*' flag acnttran.cpy* custmas1.cpy*
#0a. Login #0b. cdl (alias cd $RUNLIBS) --> /p1/cnv/cnv1/testlibs/ (or your testlibs)
#1. xnu1 xref/xcobcopy2 cpys <-- create Not Used reports for cpys ======================== xnu/cpysUsed, xnu/cpysNU, xnu/cpysAll
2a. vi xnu/cpysUsed <-- view cpys Used 2b. vi xnu/cpysNU <-- view cpys NotUsed '*' flag on right 2c. vi xnu/cpysAll <-- view All cpys with NU flag '*' if Not Used
The 5-up reports save a lot of paper if you wish to print the reports. Here is an example just for the cpysAll report showing All copybooks with an '*' flag if Not Used.
spread1 File: xnu/cpysAll Options: q1a0b0c15s1n5 2016/02/10_17:17:27 ================================================================================ acntmas.cpy* acnttran.cpy* citytax1.cpy cust1.cpy* custmas1.cpy* custmas255.cpy custmas.cpy saledtl.cpy sdline.cpy sqlca.cpy stline.cpy unixproc1.cpy* unixwork1.cpy* 00013 total items in 5up/cpysAll
We showed the detailed operating instructions to generate the "Not Used" reports only for the COBOL copybooks. You can perform similar instructions for the other items (COBOL programs, JCL PROCs,& JCL PARMs).
2.  xnu1 xref/xkshprog2 cbls     <-- create Not Used reports for COBOL programs
    ========================         xnu/cblsUsed, xnu/cblsNU, xnu/cblsAll

3.  xnu1 xref/xkshproc2 procs    <-- create Not Used reports for procs
    =========================        xnu/procsUsed, xnu/procsNU, xnu/procsAll

4a. xnu1 xref/xkshparm2 parms    <-- create Not Used reports for parms
    =========================        xnu/parmsUsed, xnu/parmsNU, xnu/parmsAll
                                     - for sites with All parms in 1 subdir parms/...

4b. xnu1 xref/xkshparmsd2 parms  <-- create Not Used reports for parms
    ===========================      xnu/parmsUsed, xnu/parmsNU, xnu/parmsAll
                                     - for sites with duplicate parmnames in
                                       multiple subdirs parmsds/subdir/...
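If you want to (re)generate all four report sets with one command, a trivial wrapper of the commands above is enough. This is only a convenience sketch & it assumes the xref/... crossref reports already exist from the earlier steps:

 xnu1 xref/xcobcopy2   cpys     # copybooks
 xnu1 xref/xkshprog2   cbls     # COBOL programs
 xnu1 xref/xkshproc2   procs    # procs
 xnu1 xref/xkshparm2   parms    # parms (use xref/xkshparmsd2 for multi-subdir parms - see 4b)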
Goto: Begin this doc , End this doc , Index this doc , Contents this library , UVSI Home-Page
You will need special operating instructions to generate Not Used reports for parms if your parmnames are not unique & you have multiple parm subdirs within $RUNLIBS/parmsds/subdirs/... as explained on pages '1E1' - '1E8'.
$RUNLIBS
:-----parmsds
:     :-----pgapo_cdg_gmuvcfc_xnm            <-- multi subdirs
:     :     :--------------------artgrp
:     :     :--------------------artgw       <-- parm modules
:     :-----pgapo_cdg_mcvcnqi_xnm
:     :     :--------------------adwh400
:     :     :--------------------artgw       <-- may be duplicate names in different subdirs
But the non-unique parms in multiple subdirs do not stop us from creating 'Not Used' reports --> we can simply copy/combine them into 1 subdir and use that as input to the Not Used report procedure.
#0a. Login #0b. cdl (alias cd $RUNLIBS) --> /p1/cnv/cnv1/testlibs/ (or your testlibs)
#2. mkdir parms <-- make subdir for combined parms =========== - if not already existing
#3. \cp parmsds/*/* parms <-- copy/combine all parms into 1 subdir ===================== - need '\cp' to disable the alias cp='cp -i'
4b. xnu1 xref/xkshparmsd2 parms <-- create Not Used reports for parms =========================== xnu/parmsUsed, xnu/parmsNU, xnu/parmsAll
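Before combining, you may want to know which parm names actually are duplicated across the parmsds subdirs, since later copies silently overwrite earlier ones in the combined parms/ subdir. A sketch using only standard tools:

 find parmsds -type f | awk -F/ '{ print $NF }' | sort | uniq -d   # list duplicated parm names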
Goto: Begin this doc , End this doc , Index this doc , Contents this library , UVSI Home-Page
'cobfiles5A' is a script provided to generate a "COBOL data-files report". It reads all programs in the cbls/* directory & writes a report including filenames, Input/Output, Organization, Access method, record size,& copybook. Programmers should have a copy before starting test/debug. You can run this any time after you have converted the copybooks & COBOL programs.
#0a. Login #0b. cdl (alias cd $RUNLIBS) --> /p1/cnv/cnv1/testlibs/ (or your testlibs)
#1. cobfiles5A cbls cpys maps <-- creates xref/cobfiles report =========================
cobfil51      ** COBOL Files Report **  Dir=cbls  2016/02/11_10:48:07
progname.cbl DDname  OAM open recsz pb copybook.cpy FDname   Key       lines
================================================================================
car100.cbl   custmas SS_ I___   256 p  custmas.cpy  custmas
car100.cbl   nalist  L__ O___   120                 nalist              48
car200.cbl   saledtl SS_ I___    64    saledtl.cpy  saledtl
car200.cbl   custmas IR_ I___   256 p  custmas.cpy  custmas
car200.cbl                                                   key-> cm-cust
car200.cbl   salelst L__ O___   120    sdline.cpy   salelst             60
cgl100.cbl   acctmas SS_ I___   128 p               acctmas
cgl100.cbl   actlist L__ O___   120                 actlist             53
cgl200.cbl   glmsold SS_ I___   128 p               glmsold
cgl200.cbl   glmsnew SS_ O___   128 p               glmsnew
cgl200.cbl   gltrans LS_ I___    80                 gltrans             64
Total programs = 4, total files = 10
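Once the report exists you can also query it with standard tools. These are just convenience sketches, assuming the report was written to xref/cobfiles (step #1) with the column layout shown above (program in column 1, DDname in column 2):

 awk '$2 == "custmas"'    xref/cobfiles    # every program/file line for DDname custmas
 awk '$1 == "car200.cbl"' xref/cobfiles    # all files used by one program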
Note |
|
Goto: Begin this doc , End this doc , Index this doc , Contents this library , UVSI Home-Page
Scripts 'mvsfiles51/mvsfiles5A/mvsfiles5B' create "data file trace" reports from jcl2 (MVS JCLs with procs expanded). There are 7 reports created mvsfiles1-mvsfiles7 but the most useful report might be 'xmvs/mvsfiles5' which '*' flags the INPUT files actually needed to run any 1 job or all jobs in subdir jcl2/*.
This report eliminates duplicates & intermediate files (outputs for later inputs) & identifies the essential files required to be transferred from the mainframe. There are 3 scripts as follows:
mvsfiles51 |
|
mvsfiles5B |
|
mvsfiles5A |
|
#0a. Login #0b. cdl (alias cd $RUNLIBS) --> /p1/cnv/cnv1/testlibs/ (or your testlibs)
#1. mkdir xmvsA <-- make subdir for mvsfiles5A reports (if not existing) ===========
#2. mvsfiles5A jcl2 xmvsA <-- generate Essential files reports for All JCL =====================
#3. vi xmvsA/* <-- inspect reports mvsfiles1,2,3,4,5,6,7 ==========
#3a. vi xmvsA/mvsfiles6 <-- the most useful report ================== - essential files only
The xmvsA/... reports could be reasonably accurate if jobnames were assigned in alpha sequence, so that filenames appearing in later jobs were created by earlier jobs, or are in fact new files for the whole system.
But the xmvs/... reports generated for 1 JCL at a time are 100% accurate, since there is no confusion whether a filename is the 1st instance in the JCL or is a subsequent instance (input from a previous step).
Please see the essential file reports for 1 job at a time on the next 2 pages.
Goto: Begin this doc , End this doc , Index this doc , Contents this library , UVSI Home-Page
I think mvsfiles51 (for 1 JCL at a time) will be the most useful. Run this script before you attempt to debug each JCL/script to determine if you have all INPUT files required for a particular job. Here is an example for jcl2/jgl230.jcl
#0a. Login #0b. cdl (alias cd $RUNLIBS) --> /p1/cnv/cnv1/testlibs/ (or your testlibs)
#1. mkdir xmvs <-- make subdir for reports (if not existing) ==========
#2. mvsfiles51 jcl2/jgl230.jcl xmvs <-- generate reports in xmvs/jgl230/... ===============================
#3. vi xmvs/jgl230/mvsfiles3 <-- inspect report of all files from each job ======================== (in jobname sequence)
#4. vi xmvs/jgl230/mvsfiles5 <-- inspect report of all files SORTED by FILENAME ======================== - more useful to determine INPUT files required - '*' flags the essential INPUT files required
#5. vi xmvs/jgl230/mvsfiles6 <-- inspect report of ONLY the essential INPUT files ======================== (only the '*' flagged files from mvsfiles5)
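If you prefer to create the per-job reports for every JCL up front (rather than one at a time as you debug), a simple loop over jcl2/* will do it. A sketch only, assuming the jcl2 members carry a .jcl suffix as in the example above:

 for j in jcl2/*.jcl
 do mvsfiles51 $j xmvs        # same command as step #2 - reports go to xmvs/jobname/...
 done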
See sample reports on the next page:
Goto: Begin this doc , End this doc , Index this doc , Contents this library , UVSI Home-Page
mvsfiles3 - Add Record-sizes & Packed field indicators - 2015/09/20 16:35:34
Jobname Step Program DDname   MOD  Gen  Rcsz pb From To * <-----DSNname------>
===============================================================================
jgl230  001  SORT    SORTIN   O                           GL.ACCOUNT.TRAN1
jgl230  001  SORT    SORTOUT  NCD  +1     80        002   GL.ACCOUNT.TRANS_
jgl230  002  CGL200  GLTRANS  O    +1   0080        001   GL.ACCOUNT.TRANS_
jgl230  002  CGL200  GLMSOLD  O    0    0128 p      003   GL.ACCOUNT.MASTER_
jgl230  002  CGL200  GLMSNEW  NCD  +1    128 p      003   GL.ACCOUNT.MASTER_
jgl230  003  CGL100  ACCTMAS  S    0    0128 p      002   GL.ACCOUNT.MASTER_
jgl230  003  CGL100  ACTLIST  MKD  +1    133              GL.ACCOUNT.ACNTLIST_

mvsfiles5 - Insert '*' Flags beside Essential Input files - 2015/09/20 16:35:34
Jobname Step Program DDname   MOD  Gen  Rcsz pb From To * <-----DSNname------>
===============================================================================
jgl230  003  CGL100  ACTLIST  MKD- +1    133              GL.ACCOUNT.ACNTLIST_

jgl230  002  CGL200  GLMSOLD  O    0    0128 p      003 * GL.ACCOUNT.MASTER_
jgl230  002  CGL200  GLMSNEW  NCD- +1    128 p      003   GL.ACCOUNT.MASTER_
jgl230  003  CGL100  ACCTMAS  S    0    0128 p      002   GL.ACCOUNT.MASTER_

jgl230  001  SORT    SORTIN   O                         * GL.ACCOUNT.TRAN1

jgl230  001  SORT    SORTOUT  NCD- +1     80        002   GL.ACCOUNT.TRANS_
jgl230  002  CGL200  GLTRANS  O  - +1   0080        001   GL.ACCOUNT.TRANS_

# mvsfiles6 - only the essential input files to run jgl230.ksh
jgl230  002  CGL200  GLMSOLD  O    0    0128 p          * GL.ACCOUNT.MASTER_
jgl230  001  SORT    SORTIN   O                         * GL.ACCOUNT.TRAN1

# mvsfiles7 - copy essential files from $CNVDATA/d2asc/... to $RUNDATA/data/...
cp $CNVDATA/d2asc/gl.account.master_* $RUNDATA/data
cp $CNVDATA/d2asc/gl.account.tran1    $RUNDATA/data
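Since the mvsfiles7 report is just a list of 'cp' commands (as shown above), one option is to review it & then execute it with the shell. A sketch only - it assumes $CNVDATA & $RUNDATA are set in your profile & that the converted data files already exist in $CNVDATA/d2asc/...:

 vi xmvs/jgl230/mvsfiles7     # review (& edit if necessary) the generated cp commands
 sh xmvs/jgl230/mvsfiles7     # then execute them to copy the essential input files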
Goto: Begin this doc , End this doc , Index this doc , Contents this library , UVSI Home-Page
We can use the Essential files report to create the data conversion control file that we will need in Part_4 to convert mainframe files to unix/linux/windows.
'uvcopy xmvs2ctl1' will convert xmvsA/mvsfiles6 to ctl/mvsfiles8. It will extract the datafilenames from columns 54-97. It will look up the Indexed file ctl/datactl53I (used for JCL conversion) for matches on filename and append the information available (record-size, file-type, Indexed keyloc/keylen).
#0a. Login #0b. cdl (alias cd $RUNLIBS) --> /p1/cnv/cnv1/testlibs/ (or your testlibs)
#1. uvcopy xmvs2ctl1,fili1=xmvsA/mvsfiles6,filr2=ctl/datactl53I,filo1=ctl/mvsfiles8 =============================================================================== - create control file for data conversion from essential files report - appending file info from ctl/datactl53I (Indexed file from JCL conversion)
#2. cp ctl/mvsfiles8 $CNVDATA/ctl/datacpy52 ======================================= - copy over to $CNVDATA for data conversion in Part_4
gapoaoo1 001 IDCAMS   OUTDAT   SKK  0400  * NGAPO.GAPUF.EAO.AUSKUNFT
gapoaoo1 005 AOPV110  OTABACT  SKK        * VGAPO.GAPCI.SAG.ORGSTAM
gapoaoo3 032 AOPV910  TABSTAM  SKK        * VGAPO.GAPCI.SAG.TABSTAM
gapoaoo2 010 AOPV140  TABSTAM  S          * VGAPO.GAPCI.YAG.YAGTABS
gapoaoo1 005 AOPV110  ARTSTAM  SKK  4000  * VGAPO.GAPCI.EAO.EAOART
gapoaoo2 034 SORT     SORTIN   SKK  0350  * VGAPO.GAPCI.EAO.EAOBUCH

ngapo.gapuf.eao.auskunft    cpy=____________ rs1=00400 rca=00400 rcs=00400 gdg=___ typ=RSF   key=___,___
vgapo.gapci.sag.orgstam     cpy=____________ rs1=04000 rca=00120 rcs=04000 gdg=___ typ=IDXf1 key=000,028
vgapo.gapci.sag.tabstam     cpy=____________ rs1=00204 rca=00120 rcs=00204 gdg=___ typ=IDXf1 key=000,016
vgapo.gapci.agy.agytabs     cpy=ctabstam.cpy rs1=00204 rca=00300 rcs=00204 gdg=___ typ=IDXf1 key=000,016
vgapo.gapci.eao.eaoart      cpy=cpdaart.cpy  rs1=04000 rca=01500 rcs=04000 gdg=___ typ=IDXf1 key=002,015
vgapo.gapci.eao.eaobuch     cpy=copvbel.cpy  rs1=04089 rca=00350 rcs=04089 gdg=___ typ=IDXf1 key=000,028
We show a few copybooks (from the JCL converter Indexed file ctl/datactl53I), which is possible if you have already performed some datafile conversion. But most likely the copybooks are all blank at this point & you may have to edit them in manually - later in Part_4.
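A quick way to see how much manual editing remains is to look for the blank 'cpy=____________' placeholders shown above. A sketch, assuming the control file was written to ctl/mvsfiles8 as in step #1:

 grep    'cpy=____' ctl/mvsfiles8    # list entries with no copybook assigned yet
 grep -c 'cpy=____' ctl/mvsfiles8    # count of entries still to be edited in Part_4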
Goto: Begin this doc , End this doc , Index this doc , Contents this library , UVSI Home-Page
2A1. | Introduction |
2B1. | JCL conversion for CA7 scheduler |
2B2. | illustrating DGOTO &C_L2JN PROC code reduction |
2B3. | CA7 commands &C_ variables reduced by new JCL PROC expander |
2B5. | table analysis for keywords desired |
2C1. | CA7 cross-references |
- generating CA7 cross-reference reports | |
2C2. | converting cross-refs to .csv, summary & detail, 4-up & 1-up |
2C3. | xrefsumcsv/xrefCA7jcl2_sum.csv - summary csv 4-up |
- xrefdtlcsv/xrefCA7jcl2_dtl.csv - detail csv 4-up | |
2C4. | xrefsumcsv/xrefCA7jcl2_sum1.csv - summary csv 1-up |
- xrefdtlcsv/xrefCA7jcl2_dtl1.csv - detail csv 1-up | |
2C5. | xrefCA7 script listing |
Goto: Begin this doc , End this doc , Index this doc , Contents this library , UVSI Home-Page
Part_1 | has documented the basic JCL & COBOL conversions applying to all sites. |
Part_2 | will document some extra conversions depending on the site. |
The first extra conversion documented here is for the CA7 scheduler which is often found in mainframe JCL.
Goto: Begin this doc , End this doc , Index this doc , Contents this library , UVSI Home-Page
In Jan 2016 the PROC expander (jclproc51) was modified to process CA7 commands. The number of output lines can be dramatically reduced when DGOTO's are used with the '&C_L2JN' (jobname) variable.
        cleanup         PROC expand        convert to ksh       copy to test
jcl0 --------> jcl1 ------------> jcl2 --------------> jcl3 -----------> jcls
       cleanup1        *jclproc51*          jclunix51               cp
We will illustrate code reduction using demo PROC PAYAM2 which defines 3 SORTOUTs & uses DGOTO &C_L2JN to select the file corresponding to the current job (&C_L2JN). We will show only 3 SORTOUT files in the demo version (vs 12 in the actual PROC).
------------------- PROC called by gapoceco.jcl1 (demo version) -----------------
//PAYAM2   DPROC DAT
//* PAYAM2 is reduced version of PAYAM1 with only 3 SORTOUT's vs 12
//* - to demo new version of PROC expander that removes extraneous DGOTO choices
//* - comments out DGOTO, DSTEP
//* - called from demo version of gapoceco.jcl - will drop 1st & 3rd SORTOUT here
//*   but would reduce actual PAYAM1 proc from 12 SORTOUTs to only the 1 appropriate
//*
//         DSET DAT=YMD(&C_JDATE)
//         DGOTO &C_L2JN
//*
//GAPOCDCO DSTEP
//SORTOUT  DD DSN=PGAPO.FGJFV.ICRQCDCO.COMK452.F&DAT..T&C_TIME,   <-- will be dropped
//         DISP=(NEW,CATLG,DELETE),
//         SPACE=(TRK,(150,150),RLSE),
//         DCB=(RECFM=FB,LRECL=500,BLKSIZE=5000)
//         DGOTO ENDE
//*
//GAPOCECO DSTEP
//SORTOUT  DD DSN=PGAPO.FGJFV.ICRQCECO.COMK452.F&DAT..T&C_TIME,   <-- will be selected
//         DISP=(NEW,CATLG,DELETE),
//         SPACE=(TRK,(150,150),RLSE),
//         DCB=(RECFM=FB,LRECL=500,BLKSIZE=5000)
//         DGOTO ENDE
//*
//GAPOCHCO DSTEP
//SORTOUT  DD DSN=PGAPO.FGJFV.ICRQCHCO.COMK452.F&DAT..T&C_TIME,   <-- will be dropped
//         DISP=(NEW,CATLG,DELETE),       + 10 more if actual PAYAM1.prc
//         SPACE=(TRK,(150,150),RLSE),
//         DCB=(RECFM=FB,LRECL=500,BLKSIZE=5000)
//         DGOTO ENDE
//*
//ENDE     DSTEP
The demo PROC retains code for 3 jobs (GAPOCDCO, GAPOCECO,& GAPOCHCO). The next page shows the results when this PROC is expanded in the GAPOCECO JCL. The 1st & 3rd SORTOUTs are dropped here, but the same logic would reduce the original PROC from 12 SORTOUTs to only the 1 appropriate to the current jobname.
Goto: Begin this doc , End this doc , Index this doc , Contents this library , UVSI Home-Page
------------------- demo JCL input to PROC expander --------------------
//GAPOCECO JOB  test DGOTO in proc payam2
//*
//JS020C   EXEC PGM=SORT,COND=(0,NE)
//SORTIN   DD  DSN=PGAPO.FGJFV.ICRQCECO.COMK452.CWUICDG,DISP=SHR
//*
//CARPROC  EXEC PAYAM2
------------------- demo JCL output from PROC expander --------------------
//GAPOCECO JOB  test DGOTO in proc payam2
//*
//JS020C   EXEC PGM=SORT,COND=(0,NE)
//SORTIN   DD  DSN=PGAPO.FGJFV.ICRQCECO.COMK452.CWUICDG,DISP=SHR
//*
//*<CARPROC EXEC PAYAM2    #<-PROC1call
//*PAYAM2  DPROC DAT
//         SET DAT=YMD(&C_JDATE)
//*        DGOTO &C_L2JN
//*GAPOCECO DSTEP
//SORTOUT  DD DSN=PGAPO.FGJFV.ICRQCECO.COMK452.F&DAT..T&C_TIME,   <-- selected by DGOTO jobname
//         DISP=(NEW,CATLG,DELETE),
//         SPACE=(TRK,(150,150),RLSE),
//         DCB=(RECFM=FB,LRECL=500,BLKSIZE=5000)
//*        DGOTO ENDE
//*ENDE    DSTEP
Goto: Begin this doc , End this doc , Index this doc , Contents this library , UVSI Home-Page
The new JCL PROC expander (jclproc51) dramatically reduces the number of CA7 commands & &C_... variables. For example, here are the numbers for a recent conversion - from running xrefCA7 on 3 different directories & options:
#1. xrefCA7 jcl2old <-- old proc expansion before Jan2016 changes =============== #2. xrefCA7 jcl2 <-- new proc expansion ============ #3. xrefCA7 jcl2 j0 <-- new proc expansion with option 'j0' =============== to process //* comment statements
                       #1              #2              #3
#  CA7 symbol          old jclproc51   new jclproc51   new with option j0
============================================================================
1. &C_DAY                    4               4               4
2. &C_JDATE                430             430             430
3. &C_L2JN                 274               0             274
4. &C_TIME                3458             421             421
5. DGOTO                  3938               8             555
6. DIF                       4               4               4
7. DPROC                   472               0             472
8. DSET                   1760               0               0
9. DSTEP                  3938               0             555

1/2. &C_DAY and &C_JDATE are the same on all 3 because they do not appear on statements that are //* commented out or dropped.
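A rough independent confirmation of the code reduction (without running xrefCA7) is simply to compare the total line counts of the old & new PROC expansions:

 echo "jcl2old: $(cat jcl2old/* | wc -l) lines"    # old PROC expander output
 echo "jcl2   : $(cat jcl2/*    | wc -l) lines"    # new PROC expander output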
Goto: Begin this doc , End this doc , Index this doc , Contents this library , UVSI Home-Page
As previously mentioned, in Jan 2016 the PROC expander (jclproc51) was modified to process CA7 commands dramatically reducing the code when DGOTO's are used with the '&C_L2JN' (jobname) variable.
But the old version of the PROC expander was saved in case somebody wanted to see the results of the old PROC expansion which would retain all the CA7 commands & '&C_...' variables.
The old proc expander program was saved as $UV/src/jclproc51j16.c (compiled into $UV/bin/jclproc51j16). To run it I created script $UV/sf/IBM/jclpx51j16 (vs $UV/sf/jclpx51 which calls new version).
So you could run the old proc expander as follows:
#0a. Login #0b. cdl (alias cd $RUNLIBS) --> /p1/cnv/cnv1/testlibs/ (or your testlibs)
#1. mkdir jcl2old <-- make subdir to receive output of jclpx51j16 =============
#2. jclpx51j16 jcl1 jcl2old <-- run proc expansion with old version ======================= - not processing DGOTO's etc
#3. xrefCA7 jcl2old <-- could then run CA7 crossrefs on old versions ===============
#4. vi xref/xrefCA7jcl2old <-- view CA7 crossrefs (old versions) ======================
#5. xref2csvall xref/xrefCA7jcl2old <-- could convert to .csv if desired ===============================
#6. View as follows =============== #6a. xrefdtl/xrefCA7jcl2old.dtl - detail 4-up #6b. xrefdtlcsv/xrefCA7jcl2old_dtl.csv - detail 4-up .csv #6c. xrefdtlcsv/xrefCA7jcl2old_dtl1.csv - detail 1-up .csv #6d. xrefsum/xrefCA7jcl2old.sum - summary 4-up #6e. xrefsumcsv/xrefCA7jcl2old_sum.csv - summary 4-up .csv #6f. xrefsumcsv/xrefCA7jcl2old_sum1.csv - summary 1-up .csv
Goto: Begin this doc , End this doc , Index this doc , Contents this library , UVSI Home-Page
'table3d' is a general purpose uvcopy job that you can use to create table summary counts of various items of interest to you. Our example will create table summary counts of all '&C_' symbols in the jcl2 directory (JCLs after PROC expansion, but prior to script conversion).
#0a. Login #0b. cdl (alias cd $RUNLIBS) --> /p1/cnv/cnv1/testlibs/ (or your testlibs)
#1. mkdir stats <-- make dir for output files =========== - will be stats/directory_keyword - any punctuation in keyword replaced by '_' underscores
#2. uvcopy table3d,fild1=jcl2old,arg1='&C_',arg4=plrq,uop=w0b0j1 ============================================================ - create table of '&C_' variables in jcl2old/... (old PROC expander) - re: options 'uop=w0b0j1' - you will see explanations when you run the job - reply null to all other prompts - output to stats/jcl2old__C_
Job: table3d  Dir: jcl2old  Keyword(s): &C_  Qual1:  Qual2:
Blanked: .()'  Userops: q1b3c0f0j0l0m0p1w1w0b0j1  table3d  2016/02/11_16:41:09
Counts by Targetword following specified Keyword   tbl#0001 tblt1f7 e0(48)
line#     count   %  1strec#  target-word
    1         4   0       23  &C_DAY
    2       430  10      186  &C_JDATE
    3       274   6      187  &C_L2JN
    4       191   4       35  &C_TIME
    5     3,268  78      189  &C_TIME,
          4,167*100           *TOTAL*
#3. uvcopy table3d,fild1=jcl2,arg1='&C_',arg4=plrq,uop=w0b0j1 ============================================================ - create table of '&C_' variables in jcl2/... (new proc expander) - output to stats/jcl2__C_
Job: table3d  Dir: jcl2  Keyword(s): &C_  Qual1:  Qual2:
Blanked: .()'  Userops: q1b3c0f0j0l0m0p1w1w0b0j1  table3d  2016/02/11_16:41:42
Counts by Targetword following specified Keyword   tbl#0001 tblt1f7 e0(48)
line#     count   %  1strec#  target-word
    1         4   0       43  &C_DAY
    2       430  38      184  &C_JDATE
    3       274  24      185  &C_L2JN
    4       191  16       34  &C_TIME
    5       231  20      187  &C_TIME,
          1,130*100           *TOTAL*
Goto: Begin this doc , End this doc , Index this doc , Contents this library , UVSI Home-Page
#4. uvcopy table3d,fild1=jcl2old,arg1='DGOTO:DIF:DPROC:DSET:DSTEP',arg4=plrq,uop=w0 =============================================================================== - create table of DGOTO:etc in jcl2old/... (old proc expander) - note options 'uop=w0' <-- different than the '&C_' tables above - output to stats/jcl2old_DGOTO:DIF:DPROC:DSET:DSTEP
Job: table3d  Dir: jcl2old  Keyword(s): DGOTO:DIF:DPROC:DSET:DSTEP  Qual1:  Qual2:
Blanked: .()'  Userops: q1b3c0f0j0l0m0p1w1w0  table3d  2016/02/11_17:03:57
Counts by Targetword following specified Keyword   tbl#0001 tblt1f7 e0(48)
line#     count   %  1strec#  target-word
    1     3,934  38      187  DGOTO
    2         4   0       24  DIF
    3       552   5      185  DPROC
    4     1,760  17      186  DSET
    5     3,938  38      188  DSTEP
         10,188*100           *TOTAL*
#5. uvcopy table3d,fild1=jcl2,arg1='DGOTO:DIF:DPROC:DSET:DSTEP',arg4=plrq,uop=w0 ============================================================================ - create table of DGOTO:etc variables in jcl2/... (new proc expander) - output to stats/jcl2_DGOTO:DIF:DPROC:DSET:DSTEP
Job: table3d  Dir: jcl2  Keyword(s): DGOTO:DIF:DPROC:DSET:DSTEP  Qual1:  Qual2:
Blanked: .()'  Userops: q1b3c0f0j0l0m0p1w1w0  table3d  2016/02/11_17:03:37
Counts by Targetword following specified Keyword   tbl#0001 tblt1f7 e0(48)
line#     count   %  1strec#  target-word
    1       551  33      185  DGOTO
    2         4   0       44  DIF
    3       552  33      183  DPROC
    4       554  33      186  DSTEP
          1,661*100           *TOTAL*
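If you just want a quick sanity check of the '&C_' counts without uvcopy, grep & uniq can give similar numbers. This sketch assumes a grep that supports -o & -h (GNU grep); note that trailing punctuation is stripped, so '&C_TIME' & '&C_TIME,' are counted together (unlike table3d above):

 grep -oh '&C_[A-Za-z0-9_]*' jcl2/* | sort | uniq -c | sort -rn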
Goto: Begin this doc , End this doc , Index this doc , Contents this library , UVSI Home-Page
Script 'xrefCA7' is provided to cross-reference CA7 commands & &C_ variables. CA7 commands are DGOTO,DSTEP,DSET,DIF,DPROC,DEND,DABORT, and &C_ variables are &C_DAY, &C_JDATE, &C_L2JN, &C_TIME, etc.
This script is different from the previously documented crossref scripts (xcobcopy1/2,xcobcall1/2,xkshparm1/2,xkshprog1/2,xmvsproc1/2), in that the xrefCA7 script contains the list of target-words to cross-reference, whereas the previous scripts cross-referenced whatever word followed a specified keyword (such as copy, call, EXEC, etc).
You can see the 'xrefCA7' script listed later in this documentation on page '2C5' or at $UV/sf/util/xrefCA7.
#0a. Login #0b. cdl (alias cd $RUNLIBS) --> /p1/cnv/cnv1/testlibs/ (or your testlibs)
#1. xrefCA7 jcl2 <-- generate CA7 crossref in xref/xrefCA7jcl2 ============
#2. vi xref/xrefCA7jcl2 <-- view report, sample shown below ===================
#xrefCA7  ** crossref JCL for CA7 DGOTO,DSTEP,etc And &C_... variables **
#Keyword=&C_:DIF:DGOTO:DSTEP:DSET:DPROC:DEND:DABORT Exclude=~~ Include=~~ Skip=    page# 1
#Directory=/home3/cnv1/testlibs_old/jclt2 Options=q1a16b16c4e0d0g0h1j0l1p0s2w0a16b16c4j1w0
#======================================================2016/02/09_17:41:33
&c_day_4 gapocahd.jcl gapofjmm.jcl gapofmmm.jcl gapoftmm.jcl
&c_jdate gapocdco.jcl gapoabcq.jcl gapoabcr.jcl gapoabdw.jcl_3 _________10 gapocdm3.jcl_2 gapoabk5.jcl_2
&c_l2jn gapocdco.jcl
&c_time gapocdco.jcl_12 gapoabcq.jcl gapoabcr.jcl gapoabdw.jcl_3 ________21 gapocdm3.jcl_2 gapoabk5.jcl_2
dgoto gapocdco.jcl_13 gapoayfb.jcl_2 gapofjmm.jcl_2 gapofmmm.jcl_2 ______21 gapohvoo.jcl_2
dif_4 gapocahd.jcl gapofjmm.jcl gapofmmm.jcl gapoftmm.jcl
dproc_3 gapocdco.jcl gapoayfb.jcl gapofjmm.jcl
dset_3 gapocdco.jcl gapoayfb.jcl gapofjmm.jcl
dstep_17 gapocdco.jcl_13 gapoayfb.jcl_2 gapofjmm.jcl_2
#**TotalWords: 9, TotalFiles=35, TotalRefs: 84
Goto: Begin this doc , End this doc , Index this doc , Contents this library , UVSI Home-Page
Script 'xref2csvall' is provided to convert any of the xref/... reports to .csv format, summary & detail, 4-up & 1-up. You can see the 'xref2csvall' script listed later in this documentation on page '11D3' or at $UV/sf/util/xref2csvall. Here we will demonstrate using the xref/xrefCA7 crossref report listed on the previous page.
#0a. Login #0b. cdl (alias cd $RUNLIBS) --> /p1/cnv/cnv1/testlibs/ (or your testlibs)
#1. xrefCA7 jcl2 <-- must 1st generate the crossref input file ============ for the 'xref2csvall' script
#2. xref2csvall xref/xrefCA7jcl2 ============================ - generates 6 different versions of the CA7 crossref - samples listed below
** xrefsum/xrefCA7jcl2.sum - summary formatted for conversion to csv
&c_day    gapocahd  1  gapofjmm  1  gapofmmm  1  gapoftmm  1   4
&c_jdate  gapocdco  1  gapoabcq  1  gapoabcr  1  gapoabdw  3  10
&c_l2jn   gapocdco  1                                          1
&c_time   gapocdco 12  gapoabcq  1  gapoabcr  1  gapoabdw  3  21
dgoto     gapocdco 13  gapoayfb  2  gapofjmm  2  gapofmmm  2  21
dif       gapocahd  1  gapofjmm  1  gapofmmm  1  gapoftmm  1   4
dproc     gapocdco  1  gapoayfb  1  gapofjmm  1                3
dset      gapocdco  1  gapoayfb  1  gapofjmm  1                3
dstep     gapocdco 13  gapoayfb  2  gapofjmm  2               17
** xrefdtl/xrefCA7jcl2.dtl - detail formatted for conversion to csv
&c_day gapocahd 1 gapofjmm 1 gapofmmm 1 gapoftmm 1 4
&c_jdate  gapocdco  1  gapoabcq  1  gapoabcr  1  gapoabdw  3  10
&c_jdate  gapocdm3  2  gapoabk5  2                            10
&c_l2jn gapocdco 1 1
&c_time   gapocdco 12  gapoabcq  1  gapoabcr  1  gapoabdw  3  21
&c_time   gapocdm3  2  gapoabk5  2                            21

dgoto     gapocdco 13  gapoayfb  2  gapofjmm  2  gapofmmm  2  21
dgoto     gapohvoo  2                                         21
dif gapocahd 1 gapofjmm 1 gapofmmm 1 gapoftmm 1 4
dproc gapocdco 1 gapoayfb 1 gapofjmm 1 3
dset gapocdco 1 gapoayfb 1 gapofjmm 1 3
dstep gapocdco 13 gapoayfb 2 gapofjmm 2 17
Goto: Begin this doc , End this doc , Index this doc , Contents this library , UVSI Home-Page
&c_day,gapocahd,1,gapofjmm,1,gapofmmm,1,gapoftmm,1,4,
&c_jdate,gapocdco,1,gapoabcq,1,gapoabcr,1,gapoabdw,3,10,
&c_l2jn,gapocdco,1,,,,,,,1,
&c_time,gapocdco,12,gapoabcq,1,gapoabcr,1,gapoabdw,3,21,
dgoto,gapocdco,13,gapoayfb,2,gapofjmm,2,gapofmmm,2,21,
dif,gapocahd,1,gapofjmm,1,gapofmmm,1,gapoftmm,1,4,
dproc,gapocdco,1,gapoayfb,1,gapofjmm,1,,,3,
dset,gapocdco,1,gapoayfb,1,gapofjmm,1,,,3,
dstep,gapocdco,13,gapoayfb,2,gapofjmm,2,,,17,
&c_day,gapocahd,1,, &c_day,gapohloo,1,, &c_day,gapohooo,1,, &c_day,gapohvoo,1,, &c_day,,,4, &c_jdate,gapocdco,1,, &c_jdate,gapocdes,1,, &c_jdate,gapocdet,1,, &c_jdate,gapocdfy,3,, &c_jdate,,,10, &c_l2jn,gapocdco,1,, &c_l2jn,,,1, &c_time,gapocdco,12,, &c_time,gapocdes,1,, &c_time,gapocdet,1,, &c_time,gapocdfy,3,, &c_time,,,21, dgoto,gapocdco,13,, dgoto,gapocahd,2,, dgoto,gapohloo,2,, dgoto,gapohooo,2,, dgoto,,,21, dif,gapocahd,1,, dif,gapohloo,1,, dif,gapohooo,1,, dif,gapohvoo,1,, dif,,,4, dproc,gapocdco,1,, dproc,gapocahd,1,, dproc,gapohloo,1,, dproc,,,3, dset,gapocdco,1,, dset,gapocahd,1,, dset,gapohloo,1,, dset,,,3, dstep,gapocdco,13,, dstep,gapocahd,2,, dstep,gapohloo,2,, dstep,,,17,
Goto: Begin this doc , End this doc , Index this doc , Contents this library , UVSI Home-Page
&c_day,gapocahd,1,gapofjmm,1,gapofmmm,1,gapoftmm,1,4, ,,,,,,,,,, &c_jdate,gapocdco,1,gapoabcq,1,gapoabcr,1,gapoabdw,3,10, &c_jdate,gapocdm3,2,gapoabk5,2,,,,,10, ,,,,,,,,,, &c_l2jn,gapocdco,1,,,,,,,1, ,,,,,,,,,, &c_time,gapocdco,12,gapoabcq,1,gapoabcr,1,gapoabdw,3,21, &c_time,gapocdm3,2,gapoabk5,2,,,,,21, ,,,,,,,,,, dgoto,gapocdco,13,gapoayfb,2,gapofjmm,2,gapofmmm,2,21, dgoto,gapohvoo,2,,,,,,,21, ,,,,,,,,,, dif,gapocahd,1,gapofjmm,1,gapofmmm,1,gapoftmm,1,4, ,,,,,,,,,, dproc,gapocdco,1,gapoayfb,1,gapofjmm,1,,,3, ,,,,,,,,,, dset,gapocdco,1,gapoayfb,1,gapofjmm,1,,,3, ,,,,,,,,,, dstep,gapocdco,13,gapoayfb,2,gapofjmm,2,,,17, ,,,,,,,,,,
&c_day,gapocahd,1,, &c_day,gapohloo,1,, &c_day,gapohooo,1,, &c_day,gapohvoo,1,, &c_day,,,4, &c_jdate,gapocdco,1,, &c_jdate,gapocdes,1,, &c_jdate,gapocdet,1,, --- 10 lines removed to save space --- &c_time,gapocdm3,2,, &c_time,gapocdm5,2,, &c_time,,,21, dgoto,gapocdco,13,, dgoto,gapocahd,2,, dgoto,gapohloo,2,, dgoto,gapohooo,2,, dgoto,gapohvoo,2,, dgoto,,,21, dif,gapocahd,1,, dif,gapohloo,1,, dif,gapohooo,1,, dif,gapohvoo,1,, dif,,,4, dproc,gapocdco,1,, --- 10 lines removed to save space --- dstep,,,17,
Goto: Begin this doc , End this doc , Index this doc , Contents this library , UVSI Home-Page
#!/bin/ksh
# xrefCA7  - cross-ref JCL for CA7 DGOTO,DSTEP,etc and &C_... variables
# xrefCA71 - alternate for only CA7 commands DGOTO,DSTEP,etc
# xrefCA72 - alternate for only CA7 &C_... variables
echo "xrefCA7 - crossref JCL for CA7 commands DGOTO,DSTEP,etc And &C_ variables"
echo " - calls uvcopy job 'xref3' to create the cross-ref report"
echo " - see doc in xref3 re following jobs to create summary reports & .csv files"
export JOBID="xrefCA7"
# xrefCA7 jcl2 a16b16c4j1   <- 2 sets of options
#   a16 - column to begin references
#   b16 - width allowed for each reference
#   c4  - number of references per line
#   j1  - JCL (bypass if * col1 or col2)
dir="$1"                    # arg1 must be a directory
if [[ ! -d "$dir" ]]; then
   echo "usage: xrefCA7 directory options - arg1 must be directory"
   echo "       =========================="
   echo "sample: xrefCA7 jcl2 a16b16c4j1w1 - arg2 options default as shown"
   echo "        =========================="
   echo "options: a16=cols for selected pattern, b16=cols for JOBnames"
   echo "         c4=JOBnames/line, j1=JCL (bypass * col1 or col2)"
   exit 1; fi
# setup default options, if not specified - use defaults
ops=a16b16c4j1w0$2;         # append user option overrides onto defaults
if [[ ! -d tmp ]]; then mkdir tmp; fi   #make tmp if not already present
# init output file, will append grep output for each program
>tmp/grep3
for i in $dir/*
do grep -nH '&C_' $i /dev/null     >>tmp/grep3
   grep -nH ' DIF ' $i /dev/null   >>tmp/grep3
   grep -nH ' DGOTO ' $i /dev/null >>tmp/grep3
   grep -nH ' DSTEP' $i /dev/null  >>tmp/grep3
   grep -nH ' DSET ' $i /dev/null  >>tmp/grep3
   grep -nH ' DPROC ' $i /dev/null >>tmp/grep3
   grep -nH ' DEND' $i /dev/null   >>tmp/grep3
   grep -nH ' DABORT' $i /dev/null >>tmp/grep3
done
sort -u -o tmp/grep3a tmp/grep3    #Jan31/16 - sort drop dups
# create full path name of directory, if not already
if [[ $dir = /* ]]; then DIR=$dir; else DIR=${PWD}/$dir; fi
export DIR=$DIR
export TITLE="crossref JCL for CA7 DGOTO,DSTEP,etc And &C_... variables"
uvcopy xref3,fili1=tmp/grep3a,filo1=xref/xrefCA7$dir\
 ,arg1='&C_:DIF:DGOTO:DSTEP:DSET:DPROC:DEND:DABORT',arg2=~~,arg3=~~,arg5='&C_',uop=$ops
# inhibit prompt for vi/uvlp if batch run (batch will print all xref)
if [[ "$XREFALL" != "Y" ]]; then
   echo "report generated = xref/xrefCA7"
   echo " - use uvlp12,uvlp14,uvlp16 to laser print at 12,14,16 cpi"
   echo "--> enter command (vi,cat,more,uvlp12,uvlp14,uvlp16,etc)"
   read ans
   if [[ ! "$ans" = "" ]]; then $ans xref/xrefCA7$dir; fi
fi
exit 0
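One usage note on the listing above: the final 'if' only prompts when XREFALL is not 'Y', so you could batch the script over several directories without being prompted, for example:

 for d in jcl2 jcl2old
 do XREFALL=Y xrefCA7 $d      # XREFALL=Y suppresses the vi/uvlp prompt (see script above)
 done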
Goto: Begin this doc , End this doc , Index this doc , Contents this library , UVSI Home-Page
3A1. | Control-Files for JCL conversion to supply data-file-info |
- that may be missing in JCL (record-sizes,file-types,etc) | |
- required for script conversions of SORTs,IDCAMs,IEBGENERs,etc |
3A2. | Control-Files copied from $UV/ctl/... by copymvsctls setup |
3A3. | control files AFTER FIRST COBOL/JCL conversion + cobfiles5A |
3A4. | control files after mainframe LISTCAT files added |
3B1. | 1st conversion of COBOL & JCL (before updating control files) |
- operating instructions (review of instructions in Part_1). |
3B2. | capturing COBOL info for JCL conversion (record-sizes, file-types) |
3B3. | control files created by 1st conversion (in subdir ctl/...) |
- datajcl51, datajcl52 extracted from the JCL |
3B4. | JCL converter script 'jcl2ksh51A' does everything |
- calls 'jcldata51A' to create datajcl51 & datajcl52 | |
& load into Indexed file datactl53I.dat/.idx for JCL converter | |
- calls 'jclxx51' for JCL conversion to Korn shell scripts | |
- updates datactl53I with datafile info from COBOL conversions |
3B5. | datactl53I control file before & after update by 1st JCL conversion |
3C1. | using LISTCAT mainframe reports to improve JCL conversion |
- transfer, translate, store multiple reports in subdir cat0/... |
3C2. | LISTCAT report sample (mock-up for VU demo files) |
3C3. | control files extracted from LISTCAT reports into $RUNLIBS/cat0/... |
- processed from cat0/... to subdirs cat1/... & cat2/... | |
- summarized to 1 report in ctl/datacat51 | |
- loaded into Indexed file ctl/datacat52I.dat/.idx | |
- merged into ctl/datactl53I.dat/.idx by jcldata51A |
3D1. | Problem - record sizes missing from LISTCAT report |
Solution - an extra LRECL report from mainframe to be converted | |
to keyword format & combined with other control files |
Goto: Begin this doc , End this doc , Index this doc , Contents this library , UVSI Home-Page
3E1. | manual coded control file for missing record-sizes,file-types,etc |
- may create/edit ctl/add/datamisc51 info to be merged with | |
other control files into final ctl/datactl53I.dat/.idx | |
used by JCL converter. |
3E2. | converting Excel spreadsheets of datafile info to control files |
- store ctl/add/dataxls51 info to be merged with other control files |
3F1. | Re-Converting JCL with LISTCAT, datamisc,& dataxls |
- using 'alldiff2' script to see differences made by re-conversion | |
- confirm changes as expected & no unintended changes
3F2. | dropping unwanted lines on alldiff2 report with vi & diffdrop2 |
3F3. | - alldiff2 report before & after vi & diffdrop2 |
3F4. | re-convert alternative to save time if hundreds of JCLs,PROCs,PARMs |
- use jcldata51A & jclxx51 (vs do everything jcl2ksh51A) |
3F5. | JCL converter control file ctl/datactl53I |
- before & after 2nd conversion |
3G1. | JCL converter control file summary |
This JCLcnv2real.htm#Part_3 is the comprehensive guide to creating the control files that provide the data file information (record-size, file-type, key locations, etc) to assist conversion of the JCL & DATA files. Part 2 explains how to transfer the LISTCAT reports from the mainframe & extract data file information into the keyword control files used by Vancouver Utility JCL & DATA conversions.
The initial JCL conversion in Part 1 did create a control file using the information available in the JCL, which may not be complete. The JCL conversion script captures datafile info from any OUTPUT file definition but record-sizes etc may be missing for INPUT files that have never been created anywhere in any job in the entire JCL. The LISTCAT information can supply the record-sizes, file types, key locations that may be missing in the JCL.
After capturing the LISTCAT info for data file conversion, you should also re-convert the JCL to take advantage of the improved data file info available.
Goto: Begin this doc , End this doc , Index this doc , Contents this library , UVSI Home-Page
Here are the more relevant subdirs we will use to illustrate customizing JCL conversions by modifying control files & re-converting. See the jclunixop51 JCL converter options control file listed at '12A1'.
/home/uvadm/mvstest/testlibs  <-- test/demo libraries supplied in /home/uvadm/...
/home/userxx                  <-- could copy to your homedir, BUT, we recommend following:
/p1/cnv/cnv1                  <-- separate filesystem & userid for real conversion vs demo/test
:-----testlibs
:     :------cat0    - LISTCAT reports from mainframe (see '3C1')
:     :------cat1    - LISTCAT reports converted to unix control files
:     :------cat2    - LISTCAT reports translated to lower case
:     :------cbl0    - mainframe COBOL programs
:     :------cbl1/2  - intermediate subdirs used in conversion
:     :------cbls    - converted COBOL programs
:     :------cblx    - compiled COBOL programs
:     :------cpy0    - mainframe COBOL copybooks
:     :------cpy1/2  - intermediate subdirs used in conversion
:     :------cpys    - copy here (standard copybook library)

:     :------ctl     <-- CONTROL FILES
:     :      :-----...   <-- see details on next page --->

:     :------jcl0    - mainframe JCL
:     :------jcl1    - intermediate conversion 73-80 cleared
:     :------jcl2    - PROCs expanded from procs
:     :------jcl3    - JCLs converted to Korn shell scripts
:     :------jcls    - copy here manually 1 by 1 during test/debug
:     :------parm0   - control cards & includes (SORT FIELDS, etc)
:     :------parms   - control cards with 73-80 cleared
:     :------proc0   - test/demo PROCs supplied
:     :------procs   - will be merged with jcl1, output to jcl2
:     :------tmp     - tmp subdir used by uvsort & various conversions
:     :------xref    - cross-references
You can modify various options in ctl/jclunixop51 & re-convert. The complete conversion of all JCL components can be pictured as follows:
jcl0 ------> jcl1 ---------> jcl2 -----------> jcl3 ------> jcls
     cleanup      PROC expand     convert to ksh    copy 1 at a time for test/debug
jcl2ksh51A all <-- script to convert ALL JCL thru all steps ============== jcl0 --> jcl1 --> jcl2 --> jcl3
jcl2ksh53A all <-- alternative for AIX COBOL ==============
Use 'jcl2ksh53A' vs 'jcl2ksh51A' if the JCL being converted is intended to execute AIX COBOL vs Micro Focus COBOL. For AIX COBOL, we generate a call to the linked program vs the .int for Micro Focus. The GDG file handler is different for AIX, some file types require a TYPE- prefix on the exportfile.
Goto: Begin this doc , End this doc , Index this doc , Contents this library , UVSI Home-Page
Here are the control files after 'copymvsctls' (see '1C1') has copied initial control files from $UV/ctl/... BEFORE FIRST COBOL & JCL conversion. See copymvsctls listed at '11C0'.
/p1/cnv/cnv1/testlibs/   <-- separate filesystem & userid for real conversion
/home/userxx/testlibs/     - Not your homedir as prior test/demo/training
:-----ctl/
:     :----add/
:     :    :-----dummy_readme
:     :-----cnvcob5.tbl      <-- COBOL conversion search/replace table
:     :-----cobdirectives    <-- Micro Focus COBOL compile Directives
:   0 :-----cobfil55bI.dat   <-- COBOL datafile info (initially empty)
:   0 :-----cobfil55bI.idx
:   0 :-----datacat52I.dat   <-- LISTCAT datafile info (initially empty)
:   0 :-----datacat52I.idx
:   0 :-----datamisc51       <-- you can edit with missing record sizes
:   0 :-----dataxls51.csv    <-- file of record sizes from Excel spread sheet
:     :-----extfh.cfg        <-- Micro Focus COBOL Extended File Handler config
:     :--*--jclunixop51      <-- JCL converter control-file
:   0 :-----lrecl0           <-- record-sizes for NonVsam files (missing on LISTCAT rpt)
:     :-----utilities
The '0' indicates empty files intentionally created so that following optional scripts & uvcopy jobs will not fail if you do not create those control files.
Compare this to the control files AFTER First conversion - listed on next page -->
Goto: Begin this doc , End this doc , Index this doc , Contents this library , UVSI Home-Page
/p1/cnv/cnv1/testlibs/   <-- separate filesystem & userid for real conversion
:-----ctl/
:     :-----add/
:     :     :-----datactl53a   <-- copied/renamed from ../datactl53I.dat
:     :     :-----dummy_readme
:     :-----cnvcob5.tbl
:     :-----cobdirectives
:     :-----cobfil55a          <-- COBOL datafile info stored by cnvMF51A
:     :-----cobfil55aI.dat         - to Identify ORG Line Seqntl (print) files
:     :-----cobfil55aI.idx
:     :-----cobfil55b          <-- COBOL datafile info stored by cnvMF51A
:     :-----cobfil55bI.dat         - to supply record-sizes to JCL converter
:     :-----cobfil55bI.idx
:     :-----cobfiles           <-- COBOL files report (from cobfiles5A)
:     :-----cobfilesI.dat          - loaded into Indexed file
:     :-----cobfilesI.idx
:     :-----cpyrcs1            <-- record-sizes from COBOL copybooks
:     :-----cpyrcs1I.dat           - loaded into Indexed file
:     :-----cpyrcs1I.idx
:   0 :-----dataadd51          <-- datafile info from all ctlfiles in add/...
:   0 :-----dataadd52I.dat         - added to datactl53I used by JCL converter
:   0 :-----dataadd52I.idx
:   0 :-----datacat51          <-- datafile info from LISTCAT cat2/...
:     :-----datacat52I.dat         - added to datactl53I used by JCL converter
:     :-----datacat52I.idx
:     :-----datactl53          <-- datafile info for JCL converter (sequential)
:     :-----datactl53I.dat         - Indexed file version used by JCL converter
:     :-----datactl53I.idx
:     :-----datajcl51          <-- datafile info extracted from all jcl2/...
:     :-----datajcl52              - 1st input to create datactl53 for JCL converter
:     :-----datamisc51         <-- you can edit with missing record sizes
:     :-----dataxls51.csv      <-- file of record sizes from Excel spread sheet
:     :-----extfh.cfg
:     :-----gdgctl51           <-- initial GDG control file created by jcl2ksh51A
:     :                            - copy to $RUNDATA/ctl, modify no of generations
:     :                              & load Indexed file for JCL/script executions
:     :-----jclunixop51
:   0 :-----lrecl0             <-- record-sizes for NonVsam files (missing on LISTCAT rpt)
:     :-----utilities
Goto: Begin this doc , End this doc , Index this doc , Contents this library , UVSI Home-Page
The mainframe LISTCAT reports can be transferred to unix/linux & data-file info (record-sizes,etc) can be extracted to control files to be merged with the control files extracted from the JCL (which may be incomplete).
/p1/cnv/cnv1/testlibs/   <-- separate filesystem & userid for real conversion
:-----cat0   <-- LISTCAT reports from mainframe (see '3C1')
:-----cat1   <-- LISTCAT reports converted to unix control files
:-----cat2   <-- LISTCAT reports translated to lower case
:
:-----ctl/
:     :-----add/
:     :     :-----datactl53a   <-- copied/renamed from ../datactl53I.dat
:     :     :-----dummy_readme
:     :-----cnvcob5.tbl
:     :-----cobdirectives
:     :-----cobfil55a          <-- COBOL datafile info stored by cnvMF51A
:     :-----cobfil55aI.dat         - to Identify ORG Line Seqntl (print) files
:     :-----cobfil55aI.idx
:     :-----cobfil55b          <-- COBOL datafile info stored by cnvMF51A
:     :-----cobfil55bI.dat         - to supply record-sizes to JCL converter
:     :-----cobfil55bI.idx
:     :-----cobfiles           <-- COBOL files report (from cobfiles5A)
:     :-----cobfilesI.dat          - loaded into Indexed file
:     :-----cobfilesI.idx
:     :-----cpyrcs1            <-- record-sizes from COBOL copybooks
:     :-----cpyrcs1I.dat           - loaded into Indexed file
:     :-----cpyrcs1I.idx
:   0 :-----dataadd51          <-- datafile info from all ctlfiles in add/...
:   0 :-----dataadd52I.dat         - added to datactl53I used by JCL converter
:   0 :-----dataadd52I.idx
:   0 :-----datacat51          <-- datafile info from LISTCAT cat2/...
:     :-----datacat52I.dat         - added to datactl53I used by JCL converter
:     :-----datacat52I.idx
:     :-----datactl53          <-- datafile info for JCL converter (sequential)
:     :-----datactl53I.dat         - Indexed file version used by JCL converter
:     :-----datactl53I.idx
:     :-----datajcl51          <-- datafile info extracted from all jcl2/...
:     :-----datajcl52              - 1st input to create datactl53 for JCL converter
:     :-----datamisc51         <-- you can edit with missing record sizes
:     :-----dataxls51.csv      <-- file of record sizes from Excel spread sheet
:     :-----extfh.cfg
:     :-----gdgctl51           <-- initial GDG control file created by jcl2ksh51A
:     :                            - copy to $RUNDATA/ctl, modify no of generations
:     :                              & load Indexed file for JCL/script executions
:     :-----jclunixop51
:   0 :-----lrecl0             <-- record-sizes for NonVsam files (missing on LISTCAT rpt)
:     :-----utilities
Goto: Begin this doc , End this doc , Index this doc , Contents this library , UVSI Home-Page
No need to execute these instructions here in Part 3 - because they have already been executed in Part 1. This is a brief review of the setup & 1st conversion of COBOL & JCL. We will re-convert COBOL & JCL after we create additional control files from LISTCAT reports (to supply record-sizes that may be missing from JCL).
#0a. Login #0b. cdl (alias cd $RUNLIBS) --> /p1/cnv/cnv1/testlibs/ (or your testlibs)
#1. mkdir testlibs <-- make superdir for JCL/COBOL subdirs
#2. cdl <-- alias cdl='cd $RUNLIBS' --> cd testlibs
#3. mvslibsdirs <-- setup 30 subdirs for JCL & COBOL conversions ===========
#4. copymvsctls <-- script to copy control files from /home/uvadm/ctl =========== to $RUNLIBS/ctl/... (see copymvsctls listed at '11C0')
Note |
|
#5. cnvMF51Acpy all <-- convert copybooks thru all steps ============= - cpy0 --> cpy1 --> cpy2 --> cpys
#5a. cnvAIXcpyA all <-- alternative for AIX COBOL ==============
#6. cnvMF51A all <-- convert COBOL programs thru all steps ============ - cbl0 --> cbl1 --> cbl2 --> cbls --> cblx
#6a. cnvAIXcblA all <-- alternative for AIX COBOL ==============
#7. jcl2ksh51A all <-- convert ALL JCL thru all steps ============== proc0 --> procs, parm0 --> parms jcl0 --> jcl1 --> jcl2 --> jcl3
#7a. jcl2ksh53A all <-- alternative for AIX COBOL ==============
#8. jcldata51A <-- create control file ctl/datactl53 ========== - helps convert JCL to scripts, jcl2 --> jcl3 - supplies record-sizes missing from JCL
Note |
|
#8. cp jcl3/* jcls <-- copy converted JCL/scripts to jcls (in $PATH) ============== - only for our demos, not for your conversions (will copy 1 at a time as you test/debug)
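For the demo libraries, the review steps above could be driven from one small wrapper script. This is only a convenience sketch for the Micro Focus path (substitute the AIX alternates from steps #5a/#6a/#7a as required); the commands are exactly those shown in steps #3 - #8:

 #!/bin/ksh
 cd $RUNLIBS || exit 1     # testlibs superdir ('cdl' alias does the same interactively)
 mvslibsdirs               # setup subdirs for JCL & COBOL conversions
 copymvsctls               # copy initial control files from /home/uvadm/ctl
 cnvMF51Acpy all           # convert copybooks       cpy0 --> cpys
 cnvMF51A    all           # convert COBOL programs  cbl0 --> cbls --> cblx
 jcl2ksh51A  all           # convert ALL JCL         jcl0 --> jcl1 --> jcl2 --> jcl3
 jcldata51A                # (re)create ctl/datactl53 control file
 cp jcl3/* jcls            # demos only - for real conversions copy 1 at a time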
Goto: Begin this doc , End this doc , Index this doc , Contents this library , UVSI Home-Page
You must perform COBOL conversion prior to JCL conversion, because it stores control file ctl/cobfil55b with record size info for the JCL converter. The COBOL conversion was performed by a do everything script 'cnvMF51A' which was 1st documented on page '1F1' & most recently on page '3B1'.
cnvMF51Acpy all <-- convert COBOL copybooks thru all steps ============= - cpy0 --> cpy1 --> cpy2 --> cpys
cnvMF51A all <-- convert COBOL programs thru all steps ============ - cbl0 --> cbl1 --> cbl2 --> cbls --> cblx
'cnvMF51A' - includes the following 6 steps: (see listing on page '11B2')
Here is a sample of cobfil55b, created from 4 of the COBOL programs used for the test/demo conversions in this documentation. You will later see some of these record-sizes used to update ctl/datactl53I.
*cobfil55  COBOL File Info  Dir=cbl1  2015/03/08_17:48:23
*progname.cbl DDname   OAL open recsz copybook.cpy FDname   Writes Advances
*         1         2         3         4         5         6         7
*123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890
*==========================================================================================
 car200.cbl   saledtl  SS  I      64 saledtl.cpy  saledtl
 car200.cbl   custmas  IR  I     256 custmas.cpy  custmas
 car200.cbl   salelst  L L O     120 sdline.cpy   salelst   2      2
 cgl100.cbl   acctmas  SS  I     128              acctmas
 cgl100.cbl   actlist  L L O     120              actlist   1      1
 cgl200.cbl   glmsold  RS  I     128              glmsold
 cgl200.cbl   glmsnew  RS  O     128              glmsnew   1      *
 cgl200.cbl   gltrans  RS  I      80              gltrans
 car100.cbl   custmas  SS  I     256 custmas.cpy  custmas
 car100.cbl   nalist   S L O     120              nalist    2      2
Goto: Begin this doc , End this doc , Index this doc , Contents this library , UVSI Home-Page
#3. jcl2ksh51A all <-- convert ALL JCL components thru all steps ==============
'jcl2ksh51A' - includes the following functions: (see listing at '11A1')
#3a. cleanup      proc0 --> procs, parm0 --> parms, jcl0 --> jcl1
#3b. proc expand  jcl1 --> jcl2
#3c. jcldata51A        <-- create ctl/datactl53 control file
     ==========            combining info from JCL, COBOL, LISTCAT, etc
#3d. jclxx51 jcl2 jcl3 <-- convert JCL to Korn shell scripts
     =================     jcl2 --> jcl3
The do everything script 'jcl2ksh51A' calls script 'jcldata51A' to create control file ctl/datactl53I.dat/.idx for use by 'jclxx51' (JCL to script).
'jcldata51A' - includes the following 6 steps: (see listing at '11C1')
AR.CUSTOMER.MASTER            rca=_____ rcs=_____ typ=RSF key=___,___ job=jar100 prg=CAR100
AR.CUSTOMER.MASTER.INDEXED    rca=_____ rcs=_____ typ=RSF key=___,___ job=jar200 prg=CAR200
AR.CUSTOMER.NAMEADRS.LIST100  rca=00133 rcs=00133 typ=RSF key=___,___ job=jar100 prg=CAR100
AR.SALES.ITEMS                rca=_____ rcs=_____ typ=RSF key=___,___ job=jar200 prg=SORT
AR.SALES.LIST                 rca=00133 rcs=00133 typ=RSF key=___,___ job=jar200 prg=CAR200
GL.ACCOUNT.ACNTLIST(+1)       rca=00133 rcs=00133 typ=RSF key=___,___ job=jgl100 prg=CGL100
GL.ACCOUNT.MASTER(+1)         rca=00128 rcs=00128 typ=RSF key=___,___ job=jgl320 prg=IDCAMS
GL.ACCOUNT.MASTER(0)          rca=_____ rcs=_____ typ=RSF key=___,___ job=jgl200 prg=CGL200
GL.ACCOUNT.TRAN1              rca=_____ rcs=_____ typ=RSF key=___,___ job=jgl200 prg=SORT
GL.ACCOUNT.TRANS(+1)          rca=00080 rcs=00080 typ=RSF key=___,___ job=jgl230 prg=SORT
ar.customer.master            rca=_____ rcs=_____ typ=RSF key=___,___ job=jar100 prg=CAR100
ar.customer.master.indexed    rca=_____ rcs=_____ typ=RSF key=___,___ job=jar200 prg=CAR200
ar.customer.nameadrs.list100  rca=00133 rcs=00133 typ=RSF key=___,___ job=jar100 prg=CAR100
ar.sales.items                rca=_____ rcs=_____ typ=RSF key=___,___ job=jar200 prg=SORT
ar.sales.list                 rca=00133 rcs=00133 typ=RSF key=___,___ job=jar200 prg=CAR200
gl.account.acntlist_          rca=00133 rcs=00133 typ=RSF key=___,___ job=jgl100 prg=CGL100
gl.account.master_            rca=00128 rcs=00128 typ=RSF key=___,___ job=jgl320 prg=IDCAMS
gl.account.master_            rca=_____ rcs=_____ typ=RSF key=___,___ job=jgl200 prg=CGL200
gl.account.tran1              rca=_____ rcs=_____ typ=RSF key=___,___ job=jgl200 prg=SORT
gl.account.trans_             rca=00080 rcs=00080 typ=RSF key=___,___ job=jgl230 prg=SORT
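The translation between the two listings above (DSN lower-cased, GDG suffix '(+1)'/'(0)' replaced by a trailing '_') can be pictured with a small awk sketch. This is illustration only - the real work is done inside jcldata51A/ctldata53 - & it assumes the first listing corresponds to ctl/datajcl51; it also loses the column alignment:

 awk '{ dsn = tolower($1)                  # DSN to lower case for unix
        sub(/\([+0-9]+\)$/, "_", dsn)      # GDG suffix (+1),(0) --> trailing "_"
        $1 = dsn; print }' ctl/datajcl51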
Goto: Begin this doc , End this doc , Index this doc , Contents this library , UVSI Home-Page
The JCL conversion was previously documented on page '1G1' & recently on page '3B1'.
#3. jcl2ksh51A all <-- convert ALL JCL thru multiple steps ============== proc0 --> procs, parm0 --> parms, jcl0 --> jcl1 --> jcl2
Note |
|
#3d. jclxx51 jcl2 jcl3 <-- convert all jcl2/* to Korn shell scripts in jcl3/... ================= - no need to jclxx51 if you just ran jcl2ksh51a - might run jclxx51 to reconvert only without repeating cleanup procs,parms,jcls,& proc expansion
'jcl2ksh51A' calls 'jclxx51' which in turn calls the actual JCL converter C program 'jclunix51' as follows:
jclunix51 jcl2/xxx.jcl jcl3/xxx.ksh ctl/datactl53I.dat/.idx ctl/cobfil55b.dat/.idx ===================================================================================
jclunix51 reads both ctl/datactl53I & cobfil55bI to get record-sizes & file-types (if missing in JCL SORTs,IDCAMS,IEBGENERs,etc). It also updates ctl/datactl53I if it finds info in cobfil55bI that was not in datactl53I.
Here is the BEFORE & AFTER listing of ctl/datactl53I showing record-size info transferred from cobfil55bI (listed on previous page) & used to update ctl/datactl53I.
Goto: Begin this doc , End this doc , Index this doc , Contents this library , UVSI Home-Page
Here are listings of the 2 versions of the JCL converter control file (before & after running the 1st JCL conversion).
ctl/datactl53 - BEFORE the 1st JCL conversion (record-sizes & file-types from the JCL only)

ar.customer.master            rca=_____ rcs=_____ gdg=    typ=RSF  key=___,___ job=jar100 prg=CAR100
ar.customer.master.indexed    rca=_____ rcs=_____ gdg=    typ=RSF  key=___,___ job=jar200 prg=CAR200
ar.customer.nameadrs.list100  rca=00133 rcs=00133 gdg=    typ=RSF  key=___,___ job=jar100 prg=CAR100
ar.sales.items                rca=_____ rcs=_____ gdg=    typ=RSF  key=___,___ job=jar200 prg=SORT
ar.sales.list                 rca=00133 rcs=00133 gdg=    typ=RSF  key=___,___ job=jar200 prg=CAR200
gl.account.acntlist_          rca=00133 rcs=00133 gdg=    typ=RSF  key=___,___ job=jgl100 prg=CGL100
gl.account.master_            rca=00128 rcs=00128 gdg=    typ=RSF  key=___,___ job=jgl320 prg=CGL200
gl.account.tran1              rca=_____ rcs=_____ gdg=    typ=RSF  key=___,___ job=jgl200 prg=SORT
gl.account.trans_             rca=00080 rcs=00080 gdg=    typ=RSF  key=___,___ job=jgl230 prg=SORT
ctl/datactl53I.dat - AFTER the 1st JCL conversion (record-sizes & file-types updated from cobfil55bI)

ar.customer.master            rca=00256 rcs=00256 gdg=    typ=RSF  key=___,___ job=jar100 prg=CAR100
ar.customer.master.indexed    rca=00256 rcs=00256 gdg=    typ=RSF  key=___,___ job=jar200 prg=CAR200
ar.customer.nameadrs.list100  rca=00120 rcs=00120 gdg=    typ=LSTt key=___,___ job=jar100 prg=CAR100
ar.sales.items                rca=00064 rcs=00064 gdg=    typ=RSF  key=___,___ job=jar200 prg=SORT
ar.sales.list                 rca=00120 rcs=00120 gdg=    typ=LSTt key=___,___ job=jar200 prg=CAR200
gl.account.acntlist_          rca=00120 rcs=00120 gdg=    typ=LSTt key=___,___ job=jgl100 prg=CGL100
gl.account.master_            rca=00128 rcs=00128 gdg=    typ=RSF  key=___,___ job=jgl320 prg=CGL200
gl.account.tran1              rca=00080 rcs=00080 gdg=    typ=RSF  key=___,___ job=jgl200 prg=SORT
gl.account.trans_             rca=00080 rcs=00080 gdg=    typ=RSF  key=___,___ job=jgl230 prg=SORT
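In the demo above every record-size was filled in from the COBOL info, but at a real site some entries will remain blank. A simple grep shows what the LISTCAT info in '3C1' still needs to supply:

 grep 'rca=_____' ctl/datactl53I.dat    # entries still missing record-sizes
 grep 'typ=RSF'   ctl/datactl53I.dat    # files still typed RSF - some may really be Indexed
                                        #   (LISTCAT will supply typ= & key= where applicable)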
Goto: Begin this doc , End this doc , Index this doc , Contents this library , UVSI Home-Page
If available, the mainframe LISTCAT reports can be transferred to unix & the data file info (record-sizes, file-types, Indexed keys) can be extracted & merged into the JCL converter control file to greatly improve JCL conversions.
#0a. Login #0b. cdl (alias cd $RUNLIBS) --> /p1/cnv/cnv1/testlibs/ (or your testlibs)
#1. mkdir cat0 cat1 cat2 <-- make subdirs for LISTCAT files processing
#2. cd cat0
#3. FTP LISTCAT reports from mainframe to unix - may be multiple files, names do not matter, but must be valid for unix
#4. cdl <-- change back to working directory (above cat0,cbl0,jcl0,etc)
#5. <-- change names as necessary for unix
    #5a. renameL    cat0   - rename to Lower case for unix
    #5b. renameB2_  cat0   - change any embedded blanks to underscores
    #5c. renameD2_  cat0   - change any '$' dollar signs to '_' underscores
    #5d. rename-QQ  cat0   - remove any 'Quotes'
    #5e. rename-PP  cat0   - remove any (Parenthesis)
#6. catdata50 all <-- extract datafile info from LISTCAT reports =============
cat0 -------> cat1 ---------> cat2 --------> ctl/datacat51 --> ctl/datacat52I
     catdata51     catdata52       catcat51                uvcp
Note |
|
We had a problem at 1 site - the record sizes were missing from the LISTCAT report for NON-VSAM files - but the mainframe person was able to create an additional report with LRECLs which we converted to the keyword format & combined with the other files - see details on page '3D1'.
Goto: Begin this doc , End this doc , Index this doc , Contents this library , UVSI Home-Page
# listcat001 - test file for Vancouver Utilities JCL & DATA conversion
#            - see www.uvsoftware.ca/datacnv1.htm#Part_6 & Part_3 here.
# 'CLUSTER's for a few files extracted from mainframe LISTCAT report
#  - omitting many items not relevant to desired information
#  - selecting: KEYLEN, RKP, AVGLRECL, MAXLRECL, RECTOTAL, LIMIT (for GDGs)
0CLUSTER ------- AR.CUSTOMER.MASTER.INDEXED ASSOCIATIONS DATA-----AR.CUSTOMER.MASTER.INDEXED.DATA INDEX----AR.CUSTOMER.MASTER.INDEXED.INDEX KEYLEN-----------------6 AVGLRECL-----------256 BUFSPACE---------14366 RKP--------------------0 MAXLRECL-----------256 EXCPEXIT--------(NULL) REC-TOTAL-------------32 SPLITS-CI------------0 EXCPS--------------495
0CLUSTER ------- AR.SALES.ITEMS KEYLEN-------0 AVGLRECL--------64 BUFSPACE------4096 CISIZE--------2048 RKP----------0 MAXLRECL--------64 EXCPEXIT------(NULL) CI/CA----------270 REC-TOTAL---20 SPLITS-CI------0 EXCPS----------8
0CLUSTER ------- GL.ACCOUNT.MASTER KEYLEN-------0 AVGLRECL-------128 BUFSPACE------4096 CISIZE--------2048 RKP----------0 MAXLRECL-------128 EXCPEXIT------(NULL) CI/CA----------270
0GDG BASE ------GL.ACCOUNT.TRANS IN-CAT --- SYS3C.ICFCAT.UCATY03 HISTORY --- lines omitted --- ATTRIBUTES LIMIT------------------3 SCRATCH NOEMPTY ASSOCIATIONS NONVSAM--GL.ACCOUNT.TRANS.G1201V00 NONVSAM--GL.ACCOUNT.TRANS.G1202V00 NONVSAM--GL.ACCOUNT.TRANS.G1203V00 0NONVSAM ------- GL.ACCOUNT.TRANS.G1201V00 IN-CAT --- SYS3C.ICFCAT.UCATY03 HISTORY --- lines omitted --- VOLUMES --- lines omitted --- ASSOCIATIONS GDG------GL.ACCOUNT.TRANS ATTRIBUTES 0NONVSAM ------- GL.ACCOUNT.TRANS.G1202V00 --- lines omitted -- 0NONVSAM ------- GL.ACCOUNT.TRANS.G1203V00 --- lines omitted --
If you know the LISTCAT parameter to include record-sizes for NONVSAM files - please let me know.
See workaround on page '3D1'. We created an additional LRECL listing on the mainframe, transferred it to unix ctl/lrecl0,& converted it to our keyword style control file in cat2/lrecl2 to be merged with the other LISTCAT info.
Goto: Begin this doc , End this doc , Index this doc , Contents this library , UVSI Home-Page
AR.CUSTOMER.MASTER            rca=00256 rcs=00256 typ=RSF   cnt=00000032
AR.CUSTOMER.MASTER.INDEXED    rca=00256 rcs=00256 typ=IDXf1 cnt=00000032 key=000,006
AR.SALES.ITEMS                rca=00064 rcs=00064 typ=RSF   cnt=00000020
GL.ACCOUNT.MASTER_            rca=00000 rcs=00000 typ=RSF   gdg=003
GL.ACCOUNT.TRANS              rca=00080 rcs=00080 typ=RSF

ar.customer.master            rca=00256 rcs=00256 typ=RSF   cnt=00000032
ar.customer.master.indexed    rca=00256 rcs=00256 typ=IDXf1 cnt=00000032 key=000,006
ar.sales.items                rca=00064 rcs=00064 typ=RSF   cnt=00000020
gl.account.master_            rca=00000 rcs=00000 typ=RSF   gdg=003
gl.account.trans              rca=00080 rcs=00080 typ=RSF

ar.customer.master            rca=00256 rcs=00256 gdg=    typ=RSF
ar.customer.master.indexed    rca=00256 rcs=00256 gdg=    typ=IDXf1 key=000,006
ar.sales.items                rca=00064 rcs=00064 gdg=    typ=RSF
gl.account.master_            rca=_____ rcs=_____ gdg=003 typ=RSF
gl.account.trans              rca=00080 rcs=00080 gdg=    typ=RSF
Goto: Begin this doc , End this doc , Index this doc , Contents this library , UVSI Home-Page
We had a problem at 1 site - the record sizes were missing from the LISTCAT report for NON-VSAM files - but the mainframe person was able to create an additional report which showed LRECL's as follows:
DATA SET NAME                               VOLUME  ORG RECFM LRECL BLKSZ
PGAPO.ABE.EKSTADA.VLK                       SAG102  PO  FB       80  6160
PGAPO.GAPUF.ABE.ABEBUCH.ONLINE              SAG039+ PS  FB      350  5950
PGAPO.GAPJW.GAPOMYEB.MGYART                 SAG073+ PS  FB     4000  8000
PGAPO.ESDKW.GAPO41F2.KSAG416.LISTE2.RE      SAG043+ PS  FBA     133  6118
PGAPO.ESDKW.GAPO41F2.KSAG420.AUSGABE.RE     SAG102+ PS  FB    14150 14150
PGAPO.GAPBU.GAPOAYK2.FTP.KRCOPY.G0398V00    SAG065+ PS  FB       80  1600
I then wrote a uvcopy job to convert the LRECL listing into the keyword format used by the Vancouver Utilities JCL & DATA conversion jobs.
uvcopy lrecl0cat2,fili1=ctl/lrecl0,filo1=cat2/lrecl2 ===================================================== - convert mainframe LRECL listing to VU keyword format
pgapo.abe.ekstada.vlk                       rca=00080 rcs=00080 typ=RSF
pgapo.gapuf.abe.abebuch.online              rca=00350 rcs=00350 typ=RSF
pgapo.gapjw.gapomyeb.mgyart                 rca=04000 rcs=04000 typ=RSF
pgapo.gappd.gapoaff5.ksag416.liste2         rca=00133 rcs=00133 typ=RSF
pgapo.esdkw.gapoayf2.ksag420.ausgabe.re     rca=14150 rcs=14150 typ=RSF
pgapo.gapbu.gapoayk2.ftp.krcopy_            rca=00080 rcs=00080 typ=RSF gdg=___
Note - GDG files recognized by last node '.G####V00' - replaced by trailing '_' ID for UV GDG file & gdg=___ appended
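The conversion that lrecl0cat2 performs can be pictured with a rough awk sketch (illustration only - the supplied uvcopy job does the real work & handles details this sketch ignores). It assumes the ctl/lrecl0 layout shown above, with the DSN in column 1 & LRECL in column 5:

 awk 'NR > 1 { dsn = tolower($1)                                   # DSN to lower case
               gdg = ""
               if (sub(/\.g[0-9]+v00$/, "_", dsn)) gdg=" gdg=___"  # GDG last node --> trailing "_"
               printf "%-42s rca=%05d rcs=%05d typ=RSF%s\n", dsn, $5, $5, gdg
             }' ctl/lrecl0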
This job is part of script 'catdata50' which calls the following jobs: catdata51, catdata52, lrecl0cat2 (this job), cp, catcat51,& uvcp
catdata51   - convert mainframe LISTCAT report to keyword format
catdata52   - translate filenames to lower case & convert GDG files to VU format
*lrecl0cat2 - converts LRECL mainframe report to keyword format cat2/lrecl2
cp          - copies cat2/lrecl2 to ctl/add for JCL converter control files
catcat51    - combines all files in cat2/... into ctl/datacat51
uvcp        - loads ctl/datacat52I, to assist JCL conversion
              - ctl/datacat52I also input to DATA conversion job 'ctlcat12'
JCL converter script jcl2ksh51A calls script jcldata51A, which calls uvcopy job ctldata53, which loads ctl/datactl53I (for the JCL converter) from various inputs including ctl/datacat52I
Goto: Begin this doc , End this doc , Index this doc , Contents this library , UVSI Home-Page
If, after the first conversions, you notice missing record-sizes or incorrect file-types for some files, you can add your own manually coded control file in the ctl/add/ subdir, knowing that the keyword info from all files in ctl/add/* is merged into the JCL converter control file on the next conversion. Here is the sample/demo supplied in $UV/ctl/datamisc51.
# datamisc51 - miscellaneous data file info for JCL conversion
#            - demo for https://www.uvsoftware.ca/jclcnv2real.htm#Part_3
#
# Must be stored in $RUNLIBS/ctl/add/...
# But could have any name desired BECAUSE:
# - uvcopy catcat51 (part of jcldata51A script, called by jcl2ksh51A)
#   - will combine all files found in ctl/add/... into ctl/dataadd51
#     which is loaded into Indexed file ctl/dataadd51I.dat/.idx
#     which is looked up by uvcopy ctldata53 to merge any keyword data found
#     into ctl/datactl53, loaded into ctl/datactl53I.dat/.idx for JCL converter
# These comment lines ('#' in column 1) will be dropped by catcat51
# Here are a few lines keyword info for the demo in JCLcnv1demo.htm
#
ar.sales.items        rca=00064 rcs=00064 typ=RST
gl.account.acntlist_  rca=00132 rcs=00132 typ=LSTt2 gdg=003
gl.account.trans_     rca=00080 rcs=00080 typ=RST gdg=007
For our JCL conversion #2, we will copy the demo file into ctl/add/... where it will be combined with other control files on the next JCL conversion.
#0a. Login #0b. cdl (alias cd $RUNLIBS) --> /p1/cnv/cnv1/testlibs/ (or your testlibs)
#1. cp ctl/datamisc51 ctl/add/ ========================== - copy demo manually prepared control file into ctl/add/... to make it effective on next JCL conversion
Goto: Begin this doc , End this doc , Index this doc , Contents this library , UVSI Home-Page
Some users might prefer to create an Excel spreadsheet to code missing record sizes, file types, Indexed keys, gdgs, etc. OR they may already have Excel spreadsheets that were used for the mainframe.
Here is a sample Excel spreadsheet supplied in $UV/ctl/dataxls51.csv. This is the comma-delimited version after being exported from the Excel spreadsheet.
DSN,Volume,Organization,Reclen,Block,RecFormat,Allocated,USED,Extents,created,Referenced
ar.customer.master,VOL001,PS,256,25600,FB,800,50,1,9/30/2014,3/14/2015,
ar.sales.items,VOL002,PS,64,6400,FB,750,25,1,8/25/2014,2/17/2015,
ar.customer.nameadrs.list100,VOL003,PS,132,1320,FB,750,25,1,8/25/2014,2/17/2015,
For this spreadsheet, the only item applicable to unix was the record-size. We made a uvcopy job 'xlsdata51' to extract the data into our required format. For example:
ar.customer.master            rca=00256 rcs=00256 data=ps__fb_
ar.sales.items                rca=00064 rcs=00064 data=ps__fb_
ar.customer.nameadrs.list100  rca=00132 rcs=00132 data=ps__fb_
The output file will be written to $RUNLIBS/ctl/add/, where it will be combined with any other control files on the next JCL conversion.
See the uvcopy job 'xlsdata51' listed on page '11C7'. You might modify it depending on the format of your Excel spreadsheet (a rough stand-alone sketch also follows the run steps below). You can run as follows:
#0a. Login #0b. cdl (alias cd $RUNLIBS) --> /p1/cnv/cnv1/testlibs/ (or your testlibs)
#1. uvcopy xlsdata51,fili1=ctl/dataxls51.csv,filo1=ctl/add/xlsdata51 ================================================================
#1a. uvcopy xlsdata51 <-- same as above, files default as shown ================
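If your spreadsheet layout differs from the sample, here is a hedged Python sketch of the same extraction, to help you see what needs changing. The column positions are taken from the sample header above,& the name 'xls2kw.py' is hypothetical - it is not a supplied utility.

  # xls2kw.py - hedged sketch of the xlsdata51 idea (column layout from the sample above)
  import csv, sys

  with open(sys.argv[1], newline='') as f:          # e.g. ctl/dataxls51.csv
      rows = csv.reader(f)
      next(rows)                                    # skip the heading row
      for row in rows:
          if len(row) < 6 or not row[3].isdigit():
              continue
          dsn, org, reclen, recfm = row[0].lower(), row[2].lower(), int(row[3]), row[5].lower()
          data = (org + '____')[:4] + (recfm + '___')[:3]   # e.g. 'ps__' + 'fb_'
          print('%-40s rca=%05d rcs=%05d data=%s' % (dsn, reclen, reclen, data))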
Goto: Begin this doc , End this doc , Index this doc , Contents this library , UVSI Home-Page
#0a. Login #0b. cdl (alias cd $RUNLIBS) --> /p1/cnv/cnv1/testlibs/ (or your testlibs)
#1a. mv jcl3 jcl3.old <-- save prior conversions by changing dirname ================ - so we can use alldiff2 below to see differences
#1b. mkdir jcl3 <-- make new subdir for re-conversions ==========
#2. jcl2ksh51A all <-- convert ALL JCL thru all steps ============== proc0 --> procs, parm0 --> parms jcl0 --> jcl1 --> jcl2 --> jcl3
I highly recommend this procedure to see the differences that re-converting makes - to ensure you changed what you intended & did not cause any unintended changes.
#3a. alldiff2 jcl3.old jcl3 <-- create diff report in tmp/jcl3.dif ======================
#3b. vi tmp/jcl3.dif <-- investigate diff report =============== --> :g/ver:/d <-- drop diffs caused by JCL converter version diff --> :g/logmsg/d <-- drop diffs caused by logmsg lines
Note |
|
Dropping the version, logmsg,& other non-relevant diffs lets you concentrate on the important differences, but it can leave many empty difference blocks (where the only diff was the version). I suggest you use the 'uvcopy diffdrop2' job to eliminate these leftover lines (a stand-alone sketch of the same cleanup follows the commands below):
#4. uvcopy diffdrop2,fili1=tmp/jcl3.dif,filo1=tmp/jcl3.dif2 =======================================================
#4a. uvcopy diffdrop2 <-- same as above (files default as shown) ================
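If you want a quick stand-alone filter (for example on a machine without the uvcopy jobs), here is a hedged sketch of the same cleanup: drop the 'ver:' & 'logmsg' noise, then drop any change block left with no '<'/'>' lines. The block-dropping rule is my assumption about what diffdrop2 does,& the name 'diffclean.py' is hypothetical.

  # diffclean.py - hedged sketch: drop 'ver:' & 'logmsg' noise, then drop change blocks
  # that are left with no '<' or '>' lines (an approximation of diffdrop2)
  import re, sys

  hunk_hdr = re.compile(r'^\d+(,\d+)?[acd]\d+(,\d+)?$')     # e.g. 39c39

  def clean(lines):
      lines = [l for l in lines if 'ver:' not in l and 'logmsg' not in l]
      out, block = [], []
      for l in lines:
          if hunk_hdr.match(l.strip()):
              if any(b.startswith(('<', '>')) for b in block):
                  out.extend(block)                 # keep blocks with real differences
              block = [l]
          else:
              block.append(l)
      if any(b.startswith(('<', '>')) for b in block):
          out.extend(block)
      return out

  with open(sys.argv[1]) as f:                      # e.g. tmp/jcl3.dif
      for line in clean(f.read().splitlines()):
          print(line)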
#5. vi tmp/jcl3.dif2 <-- investigate shorter report ================
#6. cp jcl3/... jcls/ <-- copy to jcls/... for test/debug ================= - only jcls/ is in the PATH - do not copy all jcl3/* to jcls/... - recommend copy each JCL/script when ready to test/debug
Goto: Begin this doc , End this doc , Index this doc , Contents this library , UVSI Home-Page
The previous page gave the instructions to run script 'alldiff2' (after reconverting the JCL) to create a differences report to prove that the reconversions changed what was intended with no unintended consequences. Here we will illustrate 3 versions of the alldiff2 report:
To save space, we will show results for only 4 of the 9 demo JCL/scripts.
39c39
< exit 0 #ver:20150313 a2b0c0d1e2f0g1h1i15j0k15l1m4n3o8p0q0r0s2t1u1v0w0x0y1z1
---
> exit 0 #ver:20150314 a2b0c0d1e2f0g1h1i15j0k15l1m4n3o8p0q0r0s2t1u1v0w0x0y1z1
diff file# 2 - jcl3.old/... vs jcl3/jar100.ksh

31c31
< logmsg2 "Executing--> uvsort \"fili1=$SORTIN,typ=RSF,rcs=64,filo1=$SORTO..."
---
> logmsg2 "Executing--> uvsort \"fili1=$SORTIN,typ=RST,rcs=64,filo1=$SORTO..."
33c33
< uvsort "fili1=$SORTIN,typ=RSF,rcs=64,filo1=$SORTOUT,typ=RSF,rcs=64\
---
> uvsort "fili1=$SORTIN,typ=RST,rcs=64,filo1=$SORTOUT,typ=RSF,rcs=64\
70c70
< exit 0 #ver:20150313 a2b0c0d1e2f0g1h1i15j0k15l1m4n3o8p0q0r0s2t1u1v0w0x0y1z1
---
> exit 0 #ver:20150314 a2b0c0d1e2f0g1h1i15j0k15l1m4n3o8p0q0r0s2t1u1v0w0x0y1z1
diff file# 3 - jcl3.old/... vs jcl3/jar200.ksh

47c47
< exit 0 #ver:20150313 a2b0c0d1e2f0g1h1i15j0k15l1m4n3o8p0q0r0s2t1u1v0w0x0y1z1
---
> exit 0 #ver:20150314 a2b0c0d1e2f0g1h1i15j0k15l1m4n3o8p0q0r0s2t1u1v0w0x0y1z1
diff file# 4 - jcl3.old/... vs jcl3/jgl100.ksh

38c38
< uvsort "fili1=$SORTIN,typ=RSF,rcs=80,filo1=$SORTOUT,typ=RSF,rcs=80\
---
> uvsort "fili1=$SORTIN,typ=RSF,rcs=80,filo1=$SORTOUT,typ=RST,rcs=80\
76c76
< exit 0 #ver:20150313 a2b0c0d1e2f0g1h1i15j0k15l1m4n3o8p0q0r0s2t1u1v0w0x0y1z1
---
> exit 0 #ver:20150314 a2b0c0d1e2f0g1h1i15j0k15l1m4n3o8p0q0r0s2t1u1v0w0x0y1z1
diff file# 5 - jcl3.old/... vs jcl3/jgl200.ksh

9 different of 9 files compared jcl3.old to jcl3
Goto: Begin this doc , End this doc , Index this doc , Contents this library , UVSI Home-Page
#5b. vi tmp/jcl3.dif <-- investigate diff report =============== --> :g/ver:/d <-- drop diffs caused by JCL converter version diff --> :g/logmsg/d <-- drop diffs caused by logmsg lines
39c39
---
diff file# 2 - jcl3.old/... vs jcl3/jar100.ksh

31c31
---
33c33
< uvsort "fili1=$SORTIN,typ=RSF,rcs=64,filo1=$SORTOUT,typ=RSF,rcs=64\
---
> uvsort "fili1=$SORTIN,typ=RST,rcs=64,filo1=$SORTOUT,typ=RSF,rcs=64\
70c70
---
diff file# 3 - jcl3.old/... vs jcl3/jar200.ksh

47c47
---
diff file# 4 - jcl3.old/... vs jcl3/jgl100.ksh

38c38
< uvsort "fili1=$SORTIN,typ=RSF,rcs=80,filo1=$SORTOUT,typ=RSF,rcs=80\
---
> uvsort "fili1=$SORTIN,typ=RSF,rcs=80,filo1=$SORTOUT,typ=RST,rcs=80\
76c76
---
diff file# 5 - jcl3.old/... vs jcl3/jgl200.ksh

9 different of 9 files compared jcl3.old to jcl3
33c33
< uvsort "fili1=$SORTIN,typ=RSF,rcs=64,filo1=$SORTOUT,typ=RSF,rcs=64\
---
> uvsort "fili1=$SORTIN,typ=RST,rcs=64,filo1=$SORTOUT,typ=RSF,rcs=64\
diff file# 3 - jcl3.old/... vs jcl3/jar200.ksh

38c38
< uvsort "fili1=$SORTIN,typ=RSF,rcs=80,filo1=$SORTOUT,typ=RSF,rcs=80\
---
> uvsort "fili1=$SORTIN,typ=RSF,rcs=80,filo1=$SORTOUT,typ=RST,rcs=80\
diff file# 5 - jcl3.old/... vs jcl3/jgl200.ksh

33c33
< uvsort "fili1=$SORTIN,typ=RSF,rcs=80,filo1=$SORTOUT,typ=RSF,rcs=80\
---
> uvsort "fili1=$SORTIN,typ=RSF,rcs=80,filo1=$SORTOUT,typ=RST,rcs=80\
diff file# 6 - jcl3.old/... vs jcl3/jgl230.ksh

9 different of 9 files compared jcl3.old to jcl3
Goto: Begin this doc , End this doc , Index this doc , Contents this library , UVSI Home-Page
Page '3F1' re-converted using 'jcl2ksh51A' (do everything script).
#2. jcl2ksh51A all <-- convert ALL JCL thru all steps ============== proc0 --> procs, parm0 --> parms jcl0 --> jcl1 --> jcl2 --> jcl3
But jcl2ksh51A repeats cleanups & proc expands that are unnecessary if we are ONLY CHANGING CONTROL FILES. If you have hundreds of JCLs, PROCs,& PARMs, you can save time by replacing jcl2ksh51A on page '3F1' with the following 3 steps (#2a,#2b,#2c).
#2a. jcldata51A <-- re-create ctl/datactl53 control file ========== including LISTCAT info
#2b. jclxx51 jcl2 jcl3 <-- re-convert JCL to Korn shell scripts =================
#2c. cp ctl/datactl53I.dat ctl/add/datactl53a ======================================== - save updated JCL converter control file for future re-conversions
Running jcldata51A (vs jcl2ksh51A) avoids repeating the cleanups & proc expansions (unnecessary if ONLY CHANGING CONTROL-FILES), but we then need to run #2b 'jclxx51' (JCL to Korn shell script conversion) separately plus a few other tasks as shown below.
We also need #2c to copy ctl/datactl53I.dat to ctl/add/datactl53a because the JCL converter updates the Indexed file datactl53I.dat/.idx with any datafile info supplied by other sources such as COBOL, LISTCAT, dataxls (spreadsheets), datamisc (misc edited info). We copy the file into ctl/add/... because all files in ctl/add/... are combined for following re-conversions.
The jcl2ksh51A script renames as shown above on #2c, but there might be some advantage in renaming it as follows (using suffix 'b' vs 'a').
#2d. cp ctl/datactl53I.dat ctl/add/datactl53b =======================================*
This allows you to make manual updates to ctl/add/datactl53b & not lose them on reconversion - because ctl/add/datactl53a is overwritten by reconversion, whereas any info added in ctl/add/datactl53b will be merged into the new control files & especially 'datactl53I.dat/.idx' used by the JCL converter.
#2e. vi ctl/add/datactl53b <-- manual update ===================== - add record-sizes, change file-types, etc as desired
Then, to make the changes effective, you would need to re-convert (using jcl2ksh51A, OR jcldata51A, jclxx51,& cp as in #2a,#2b,#2c above).
Goto: Begin this doc , End this doc , Index this doc , Contents this library , UVSI Home-Page
ar.customer.master           rca=00256 rcs=00256 gdg=___ typ=RSF   key=___,___ job=jar100 prg=CAR100
ar.customer.master.indexed   rca=00256 rcs=00256 gdg=___ typ=RSF   key=___,___ job=jar200 prg=CAR200
ar.customer.nameadrs.list100 rca=00120 rcs=00120 gdg=___ typ=LSTt  key=___,___ job=jar100 prg=CAR100
ar.sales.items               rca=00064 rcs=00064 gdg=___ typ=RSF   key=___,___ job=jar200 prg=SORT
ar.sales.list                rca=00120 rcs=00120 gdg=___ typ=LSTt  key=___,___ job=jar200 prg=CAR200
gl.account.acntlist_         rca=00120 rcs=00120 gdg=___ typ=LSTt  key=___,___ job=jgl100 prg=CGL100
gl.account.master_           rca=00128 rcs=00128 gdg=___ typ=RSF   key=___,___ job=jgl320 prg=CGL200
gl.account.tran1             rca=00080 rcs=00080 gdg=___ typ=RSF   key=___,___ job=jgl200 prg=SORT
gl.account.trans_            rca=00080 rcs=00080 gdg=___ typ=RSF   key=___,___ job=jgl230 prg=SORT
ar.customer.master           rca=00256 rcs=00256 gdg=___ typ=RSF   key=___,___ job=jar100 prg=CAR100
ar.customer.master.indexed   rca=00256 rcs=00256 gdg=___ typ=IDXf1 key=000,006 job=jar200 prg=CAR200
ar.customer.nameadrs.list100 rca=00120 rcs=00120 gdg=___ typ=LSTt  key=___,___ job=jar100 prg=CAR100
ar.sales.items               rca=00064 rcs=00064 gdg=___ typ=RST   key=___,___ job=jar200 prg=SORT
ar.sales.list                rca=00120 rcs=00120 gdg=___ typ=LSTt  key=___,___ job=jar200 prg=CAR200
gl.account.acntlist_         rca=00120 rcs=00120 gdg=003 typ=LSTt  key=___,___ job=jgl100 prg=CGL100
gl.account.master_           rca=00128 rcs=00128 gdg=005 typ=RSF   key=___,___ job=jgl320 prg=CGL200
gl.account.tran1             rca=00080 rcs=00080 gdg=___ typ=RSF   key=___,___ job=jgl200 prg=SORT
gl.account.trans_            rca=00080 rcs=00080 gdg=007 typ=RST   key=___,___ job=jgl230 prg=SORT
1a. ar.customer.master.indexed is now coded correctly as typ=IDXf1 & key=000,006 - picked up from the LISTCAT reports (used on the 2nd convert but not the 1st).
1b. 3 of the GDG files now have the number of generations coded - picked up from the LISTCAT reports.
1c. ar.sales.items & gl.account.trans changed 'typ=RSF' to 'typ=RST'. - picked up from ctl/datamisc51 (used on 2nd convert but not 1st).
Goto: Begin this doc , End this doc , Index this doc , Contents this library , UVSI Home-Page
Here is a summary of the data control files used by the JCL converter super- script 'jcl2ksh51A'. This should help you understand the process.
jcl2ksh51A |
|
jcldata51A |
|
3a. ctl/datacat52I - Indexed file created by script 'catdata50' from multiple mainframe LISTCAT reports - empty file created by copymvsctls (in case no LISTCAT)
3b. ctl/add/... - may be multiple sequential files in ctl/add/... subdir - will be combined by uvcopy catcat51 & loaded into Indexed file ctl/dataadd52I
3c. ctl/dataadd52I - Indexed file created by this script (jcldata51A) from all additional info files in ctl/add/...
3d. $UV/ctl/add/dummy_readme supplied if no additional info - copied to ctl/add by copymvsctls (in case no other files)
3e. additional files that you may create, name as you like, for example:
    ctl/datacpy52 - datafilenames & copybooknames
    ctl/dataedt52 - make with editor to supply missing info as desired
    ctl/datagdg52 - make with editor to supply missing GDG generations
3f. ctl/add/datactl53a - input from any prior jcl2ksh51A, converts all JCL jcl0 --> jcl1 --> jcl2 --> jcl3 - saves updates to datactl53I made by JCL converter for reconversions by jcl2ksh51A
Goto: Begin this doc , End this doc , Index this doc , Contents this library , UVSI Home-Page
4A1. | Generating & Executing ALL jobs to convert all DATA files (vs JCL & COBOL) |
- data conversion jobs generated from COBOL copybooks | |
- need to replace copybooknames with datafilenames | |
- using a control file of datafilenames coded with copybooknames |
4A2. | 3 ways to create the control-file |
- 1. edit control file manually | |
- 2. create control file from Essential files report | |
- 3. create from mainframe LISTCAT/VTOC reports |
4A3. | Creating control file manually |
- control file created from directory of datafilenames | |
- control file after inserting copybooknames |
4B1. | xmvs2ctl1 - create control file from Essential files report |
- Operating Instructions, sample input, sample output |
4C1. | Create control file from mainframe catalogs via Excel spreadsheet |
- Operating Instructions, sample input, sample output |
4C2. | adding LISTCAT & JCL file info to control file from excel |
ctlcat12 - sample input & sample output |
4C3. | ctlcat12 Op. Instrns. add info from datacat52I & datactl53I |
- sort control-file on group-code |
4C4. | DROP records without record-size |
- SELECT records without record-size | |
- converting Variable length to Fixed length |
4C5. | control-file review |
- selecting a group of files for conversion | |
- copy control file from testlibs to cnvdata |
4D1. | Replicate Mainframe FTP JCL to transfer files |
genftpjcl1 - Operating Instructions - sample input |
4D2. | genftpjcl1 - sample output for Fixed length files |
4D3. | genftpjcl1 - sample output - Variable length file |
'LOCSITE RDW' option to FTP variable length files | |
4D4. | genftpjcl1 - sample output - GDG file (Fixed Length) |
4D5. | SORT option 'FTOV' convert Fixed to Variable before FTP RDW |
4D6. | FTP mainframe files to unix/linux |
Goto: Begin this doc , End this doc , Index this doc , Contents this library , UVSI Home-Page
4E1. | copy mainframe FTP'd files to $CNVDATA for conversion |
- copy control file to $CNVDATA/ctl/datacpy52 |
4E2. | convert mainframe Variable Length files to Fixed Lengths |
- create table summary of variable record lengths | |
for all files in directory | |
- sample report |
4E3. | convert variable length files to fixed lengths |
- sample control file |
4F1. | generate jobs to convert EBCDIC to ASCII |
4F2. | sample job with record-type test code added |
4F3. | replacing copybook-names with datafile-names |
- allowing for files without copybooks |
4F4. | Generate conversion jobs with datafile names inserted from control-file |
- Execute All jobs to convert All datafiles in the directory | |
- copy files from $CNVDATA to $RUNDATA for testing |
Goto: Begin this doc , End this doc , Index this doc , Contents this library , UVSI Home-Page
This Part_4 presents comprehensive operating instructions to convert your mainframe data files for use on Unix/Linux/Windows. Here we have tried to allow for the complexities of a real data conversion vs the demo conversions previously illustrated in DATAcnv1.htm#Part_3 & DATAcnv1.htm#Part_6.
The DATAcnv1.htm#Part_3 conversions were quicker & easier, but they converted 1 file at a time & required renaming the datafiles the same as the copybooks to simplify the procedures to generate data conversion jobs from the copybooks.
Using the same name for the copybook, datafile,& all generated jobs is possible because we have different subdirs for different file types (cpys/..., maps/..., pfx1/..., pfp1/... d1ebc/..., d2asc/..., d4pipe/...).
This JCLcnv2real.htm#Part_4 will show you how to perform these conversions for all files in the directory (vs one at a time documented previously), and to insert the actual data filenames in the generated jobs (vs copybooknames).
A key part of this plan is to use a control file to relate the data file names to the copybook names. The control file method allows for the distinct possibility of having multiple datafiles using the same copybook.
ar.customer.armaster  cpy=____________ rcs=_____ typ=_____
ar.sales.arsales      cpy=____________ rcs=_____ typ=_____
gl.account.glmaster_  cpy=____________ rcs=_____ typ=_____ gdg=003

ar.customer.armaster  cpy=armaster.cpy rcs=_____ typ=_____
ar.sales.arsales      cpy=arsales.cpy  rcs=_____ typ=_____
gl.account.glmaster_  cpy=glmaster.cpy rcs=_____ typ=_____ gdg=003
You may have to insert the copybooknames manually with the editor, but after that the conversion jobs are generated automatically from the copybooks & they can all be executed with 1 command. Exceptions would be files with multiple record types (redefined records in the copybook). In those usually rare instances you need to add code to the generated jobs to select the appropriate bank of auto-generated conversion instructions.
Goto: Begin this doc , End this doc , Index this doc , Contents this library , UVSI Home-Page
** creating a control-file for data-file conversion**
The control-file will contain a list of the data-files we need to transfer from the mainframe to unix/linux/windows & convert from EBCDIC to ASCII. The control file must be coded with the copybook-name that describes the record layout of the data-file. The control-file might contain some other items such as record-size, but this can be determined from the copybook.
We will describe 3 ways to create the control-file
ar.customer.armaster  cpy=armaster.cpy rcs=_____ typ=_____
ar.sales.arsales      cpy=arsales.cpy  rcs=_____ typ=_____
gl.account.glmaster_  cpy=glmaster.cpy rcs=_____ typ=_____ gdg=003
ESSENTIAL file lists created by 'mvsfiles5A', which analyzes all JCLs to determine the essential input files, ignoring intermediate, sort, work,& output files. See '1P1' - '1P4'.
Filenames can be extracted from mainframe LISTCAT/VTOC reports transferred to Windows &/or Linux, perhaps into an Excel spreadsheet. These reports would list many files that do NOT need to be transferred because they are intermediate work files, sort files, etc.
We will show how a knowledgeable person could edit the Excel spreadsheet, coding each datafile with the corresponding COBOL copybook, an indicator for files known to need transfer,& perhaps a group code so that the files can be transferred & converted in logical groups.
Goto: Begin this doc , End this doc , Index this doc , Contents this library , UVSI Home-Page
#0a. Login #0b. cdl (alias cd $CNVDATA) --> /p1/cnv/cnv1/cnvdata (or your cnvdata)
#1. vi ctl/datacpy51 <-- use vi to create control file ================ - from your knowledge of datafiles & copybooks - sample as follows
ar.customer.armaster  cpy=armaster.cpy
ar.sales.arsales      cpy=arsales.cpy
gl.account.glmaster_  cpy=glmaster.cpy
If you have already transferred the datafiles you intend to convert, you can use the 'datacpy51' utility to generate the control file with all data file names in the directory & with 'fill-in-the-blanks' spaces for the copybookname, record-size override for copybook size, and file type override for default typ=RSF. Assuming data files in d1ebc/...
#1a. uvcopy datacpy51,fild1=d1ebc,filo2=ctl/datacpy51 ================================================
ar.customer.armaster  cpy=____________ rcs=_____ typ=_____
ar.sales.arsales      cpy=____________ rcs=_____ typ=_____
gl.account.glmaster_  cpy=____________ rcs=_____ typ=_____ gdg=003

ar.customer.armaster  cpy=armaster.cpy rcs=_____ typ=_____
ar.sales.arsales      cpy=arsales.cpy  rcs=_____ typ=_____
gl.account.glmaster_  cpy=glmaster.cpy rcs=_____ typ=_____ gdg=003
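A minimal Python sketch of the same fill-in-the-blanks generation follows, in case you want to adapt it. The supplied job is 'uvcopy datacpy51' (fild1=d1ebc, filo2=ctl/datacpy51); the rule that GDG generations end in '_000001' etc & that the gdg= count equals the generations found in the directory are assumptions here.

  # gencpyctl.py - hedged sketch of the datacpy51 idea: list datafiles in d1ebc/ &
  # write fill-in-the-blanks keyword lines, collapsing GDG generations to 'name_'
  import os, re

  gen = re.compile(r'_(\d{6})$')                    # UV GDG generations end in _000001 etc.
  names = {}
  for f in sorted(os.listdir('d1ebc')):
      m = gen.search(f)
      base = f[:m.start()] + '_' if m else f        # collapse generations to the 'name_' base
      names[base] = names.get(base, 0) + (1 if m else 0)

  with open('ctl/datacpy51', 'w') as out:
      for base in sorted(names):
          gdg = ' gdg=%03d' % names[base] if names[base] else ''
          out.write('%-40s cpy=____________ rcs=_____ typ=_____%s\n' % (base, gdg))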
Goto: Begin this doc , End this doc , Index this doc , Contents this library , UVSI Home-Page
We can use the Essential files report to create the data conversion control file that we will need to convert mainframe files to unix/linux/windows. See ESSENTIAL files reports at '1P1' - '1P4'
'uvcopy xmvs2ctl1' will convert xmvsA/mvsfiles6 to ctl/mvsfiles8 & you can change it to ctl/datacpy51, or copy/rename after running xmvs2ctl1 with defaults. It will extract the datafilenames from columns 54-97. It will look up Indexed file ctl/datactl53I (used for JCL conversion) for matches on filename and append the information available (record-size, file-type, Indexed keyloc/keylen).
#0a. Login #0b. cdl (alias cd $RUNLIBS) --> /p1/cnv/cnv1/testlibs/ (or your testlibs)
#1. uvcopy xmvs2ctl1,fili1=xmvsX/mvsfiles6,filr2=ctl/datactl53I,filo1=ctl/datacpy51 =============================================================================== - create control file for data conversion from essential files report - appending file info from ctl/datactl53I (Indexed file from JCL conversion)
#2. cp ctl/datacpy51 $CNVDATA/ctl/datacpy52 ======================================= - copy over to $CNVDATA for data conversion later at '4E1' & '4F4'
gapoaoo1 001 IDCAMS  OUTDAT  SKK 0400 * NGAPO.GAPUF.EAO.AUSKUNFT
gapoaoo1 005 AOPV110 OTABACT SKK      * VGAPO.GAPCI.SAG.ORGSTAM
gapoaoo3 032 AOPV910 TABSTAM SKK      * VGAPO.GAPCI.SAG.TABSTAM
gapoaoo2 010 AOPV140 TABSTAM S        * VGAPO.GAPCI.YAG.YAGTABS
gapoaoo1 005 AOPV110 ARTSTAM SKK 4000 * VGAPO.GAPCI.EAO.EAOART
gapoaoo2 034 SORT    SORTIN  SKK 0350 * VGAPO.GAPCI.EAO.EAOBUCH
ngapo.gapuf.eao.auskunft cpy=____________ rs1=00400 rca=00400 rcs=00400 gdg=___ typ=RSF   key=___,___
vgapo.gapci.sag.orgstam  cpy=____________ rs1=04000 rca=00120 rcs=04000 gdg=___ typ=IDXf1 key=000,028
vgapo.gapci.sag.tabstam  cpy=____________ rs1=00204 rca=00120 rcs=00204 gdg=___ typ=IDXf1 key=000,016
vgapo.gapci.agy.agytabs  cpy=ctabstam.cpy rs1=00204 rca=00300 rcs=00204 gdg=___ typ=IDXf1 key=000,016
vgapo.gapci.eao.eaoart   cpy=cpdaart.cpy  rs1=04000 rca=01500 rcs=04000 gdg=___ typ=IDXf1 key=002,015
vgapo.gapci.eao.eaobuch  cpy=copvbel.cpy  rs1=04089 rca=00350 rcs=04089 gdg=___ typ=IDXf1 key=000,028
We show a few copybooks (from the JCL converter Indexed file ctl/datactl53I), which is possible if you have already performed some datafile conversion, but most likely the copybooks are all blank at this point and you may have to edit them in manually.
Goto: Begin this doc , End this doc , Index this doc , Contents this library , UVSI Home-Page
This is the 1st step in a series of jobs to create control files for generating FTP JCLs to transfer required data files and to convert them from EBCDIC to ASCII preserving packed/binary.
This assumes a knowledgeable person has created a control file with mainframe filenames & coded a copybookname for files needing to be transferred. This file was created from mainframe catalogues & imported to an Excel spreadsheet.
NGAP2.GAPUF.AFI.BELDAT,SAG061,PS,350,5950,FB,15,0,28.11.2013,25.08.2015\ ,NGAP2,GAPUF,AFI,BELDAT,,,,DAT,26.08.2015,,Y,AFI,,,,,,MOF,FOPVBEL,,,
NGAP2.GAPUF.ACH.KUMAUSK.G0060V00,SAG030,PS,400,6000,FB,657,55,25.08.2015,25.08.2015\ ,NGAP2,GAPUF,ACH,KUMAUSK,G0060V00,,,AUSK,26.08.2015,,Y,AFI,,,,,,MOF,FPDAAUF,,,
----------- filename ----------  group recsize Y/N copybook
NGAPO.GAPUF.AFI.BELDAT            AFI     350   Y  FOPVBEL
NGAPO.GAPUF.AFI.KUMAUSK.G5589V00  AFI     400   Y  FPDAAUF
NGAPO.GAPUF.AFI.KUMAUSK.G5590V00  AFI     400   Y  FPDAAUF
NGAPO.GAPUF.AFI.KUMAUSK.G5591V00  AFI     400   Y  FPDAAUF
VGAPO.GAPCI.EAO.EAOART.DATA       EAO       0   Y  FPDAART
VGAPO.GAPCI.EAO.EAOAUF.DATA       EAO       0   Y  FPDAAUF
VGAPO.GAPCI.EAO.EAOKND.DATA       EAO       0   Y  FWSKDN
This assumes the Excel spreadsheet has been exported to a .csv file stored in $RUNLIBS/ctl/catctl0.csv for processing by uvcopy.
#0a. Login #0b. cdl (alias cd $RUNLIBS) --> /p1/cnv/cnv1/testlibs/ (or your testlibs)
#1. uvcopy catcsv2ctl1,fili1=ctl/catctl0.csv,filo1=ctl/catctl1 ,filo2=tmp/catctl0.rpt ========================================================== - selects records if 'Y' in cell 21 or copybook(nonblank) in cell 29 - filo1 catctl1 is primary output desired - filo2 catctl0.rpt is vertical listing if you want to check cell#s
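If you want to verify the cell numbers before (or after) running catcsv2ctl1, here is a hedged Python sketch of the same selection. The cell numbers (1=DSN, 4=recsize, 21=Y/N, 22=group, 29=copybook) are inferred from the samples above - check them against the vertical listing in tmp/catctl0.rpt. 'csv2fixed.py' is a hypothetical name, not a supplied job.

  # csv2fixed.py - hedged sketch of the catcsv2ctl1 selection; cell numbers are inferred
  # from the samples above (1=DSN, 4=recsize, 21=Y/N, 22=group, 29=copybook)
  import csv, sys

  with open(sys.argv[1], newline='') as f:          # e.g. ctl/catctl0.csv
      for row in csv.reader(f):
          row += [''] * (30 - len(row))             # pad short rows
          dsn, recsize, yn, grp, cpy = row[0], row[3], row[20], row[21], row[28]
          if yn.strip().upper() != 'Y' and not cpy.strip():
              continue                              # keep rows flagged Y or with a copybook
          print('%-44s %-6s %6s %s %s' % (dsn, grp, recsize, yn or ' ', cpy))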
Goto: Begin this doc , End this doc , Index this doc , Contents this library , UVSI Home-Page
This assumes a knowledgeable person has created a control file with all mainframe filenames & coded a copybookname for files needing to be transferred. This file was created from mainframe catalogues & imported to an Excel spreadsheet - coded with 'Y' in cell 21 if the file is to be migrated, copybooks coded in cell 29.
Prior job 'catcsv2ctl1' has extracted the desired info from the Excel .csv file into the fixed format you see below (filename, group, recsize, Y/N, copybook).
This job 'ctlcat12' will convert the fixed column format to the VU keyword control file format and add data file info from Indexed files datacat52I & datactl53I that were created for the JCL conversion in Part_1.
The most important info transferred from the Excel file to the VU control file is the 'COPYBOOK-name' that we need to auto generate the data conversion job for each 'DATAFILE-name'. Note that we convert it from UPPER to lower, appending '.cpy' & coding in keyword format 'cpy=copybook.cpy'.
Also note that we convert the mainframe GDG file format '.G####V00' to the VU GDG format (trailing '_' underscore on filename) & code the number of generations on the keyword gdg=###.
We also drop filenames ending with .VLK, .GJ####, .VOR####, .VOR#####, and remove the '.DATA' from the end of filenames to match filenames in dataadd52I & datactl53I. This was site-specific & will be removed for other sites.
----------- filename ----------  group recsize Y/N copybook
NGAPO.GAPUF.AFI.BELDAT            AFI     350   Y  FOPVBEL
NGAPO.GAPUF.AFI.KUMAUSK.G5589V00  AFI     400   Y  FPDAAUF
NGAPO.GAPUF.AFI.KUMAUSK.G5590V00  AFI     400   Y  FPDAAUF
NGAPO.GAPUF.AFI.KUMAUSK.G5591V00  AFI     400   Y  FPDAAUF
VGAPO.GAPCI.EAO.EAOART.DATA       EAO       0   Y  FPDAART
VGAPO.GAPCI.EAO.EAOAUF.DATA       EAO       0   Y  FPDAAUF
VGAPO.GAPCI.EAO.EAOKND.DATA       EAO       0   Y  FWSKDN

ngapo.gapuf.afi.beldat   grp=afi___ cpy=copvbel.cpy rs1=00350 rca=00350 rcs=00350 key=___,___ gdg=___ typ=RSF
ngapo.gapuf.afi.kumausk_ grp=afi___ cpy=cpdaauf.cpy rs1=00400 rca=00400 rcs=00400 key=___,___ gdg=003 typ=RSF
vgapo.gapci.afi.afiart   grp=afi___ cpy=cpdaart.cpy rs1=_____ rca=01500 rcs=04000 key=002,015 gdg=___ typ=IDXf1
vgapo.gapci.eao.eaoauf   grp=eao___ cpy=cpdaauf.cpy rs1=00512 rca=00250 rcs=00512 key=000,021 gdg=___ typ=IDXf1
vgapo.gapci.eao.eaoknd   grp=eao___ cpy=cwskdn.cpy  rs1=_____ rca=00700 rcs=04000 key=002,012 gdg=___ typ=IDXf1
We are not showing Indexed file inputs datactl53I & datacat52I that contribute much of the keyword info to the output file. See datactl53I sample at '3B5' and datacat52I at '3C2' & '3C3'.
Goto: Begin this doc , End this doc , Index this doc , Contents this library , UVSI Home-Page
Note |
|
#0a. Login #0b. cdl (alias cd $RUNLIBS) --> /p1/cnv/cnv1/testlibs/ (or your testlibs)
#1. uvcopy catcsv2ctl1,fili1=ctl/catctl0.csv,filo1=ctl/catctl1,filo2=tmp/catctl0.rpt ================================================================================ - ALREADY run on '4C1'
#2. uvcopy ctlcat12,fili1=ctl/catctl1,filo1=ctl/catctl2 ,filr2=ctl/datacat52I,filr3=ctl/datactl53I ========================================================== - copy ctl/catctl1 to ctl/catctl2, adding info from Indexed files: ctl/datacat52I - info from LISTCAT, record sizes, keys, gdgs ctl/datactl53I - info from JCL (from output files definitions)
#3. uvsort "fili1=ctl/catctl2,rcs=200,typ=LSTt,filo1=ctl/catctl3\ ,key1=45(7),key2=0(44)" ============================================================== - sort on group-code (primary) & filename (secondary)
ngapo.gapuf.afi.beldat   grp=afi___ cpy=copvbel.cpy rs1=00350 rca=00350 rcs=00350 key=___,___ gdg=___ typ=RSF
ngapo.gapuf.afi.kumausk_ grp=afi___ cpy=cpdaauf.cpy rs1=00400 rca=00400 rcs=00400 key=___,___ gdg=003 typ=RSF
vgapo.gapci.afi.afiart   grp=afi___ cpy=cpdaart.cpy rs1=_____ rca=01500 rcs=04000 key=002,015 gdg=___ typ=IDXf1
vgapo.gapci.eao.eaoauf   grp=eao___ cpy=cpdaauf.cpy rs1=00512 rca=00250 rcs=00512 key=000,021 gdg=___ typ=IDXf1
vgapo.gapci.eao.eaoknd   grp=eao___ cpy=cwskdn.cpy  rs1=_____ rca=00700 rcs=04000 key=002,012 gdg=___ typ=IDXf1
The Excel spread sheet of all mainframe files was coded with a group-code that will be used to transfer & convert mainframe files in logical groups. The uvsort 'key1=45(7)' presumes that ctlcat12 has created 'grp=...' in cols 46-52.
Goto: Begin this doc , End this doc , Index this doc , Contents this library , UVSI Home-Page
We cannot convert files without knowing the record-size. We decided to select them out to a separate file for later research & conversion - in order not to delay converting the majority of our files for which we do have a record size.
#4a. uvcopy drop1,fili1=ctl/catctl3,filo1=ctl/catctl4,arg1=rcs=_____ ===============================================================
ngapo.gapuf.afi.beldat   grp=afi___ cpy=copvbel.cpy rs1=00350 rca=00350 rcs=00350 key=___,___ gdg=___ typ=RSF
ngapo.gapuf.afi.kumausk_ grp=afi___ cpy=cpdaauf.cpy rs1=00400 rca=00400 rcs=00400 key=___,___ gdg=003 typ=RSF
vgapo.gapci.afi.afiauf   grp=afi___ cpy=cpdaauf.cpy rs1=00512 rca=00250 rcs=00512 key=000,021 gdg=___ typ=IDXf1
#4b. uvcopy keep1,fili1=ctl/catctl3,filo1=ctl/catctl4rcs0,arg1=rcs=_____ ===================================================================
vgapo.gapci.eao.eaoart   grp=eao___ cpy=cpdaart.cpy rs1=_____ rca=01500 rcs=04000 key=002,015 gdg=___ typ=IDXf1
vgapo.gapci.eao.eaoknd   grp=eao___ cpy=cwskdn.cpy  rs1=_____ rca=00700 rcs=04000 key=002,012 gdg=___ typ=IDXf1
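The drop1/keep1 pair is essentially a split on the blank record-size keyword. A minimal sketch follows (filenames & the default pattern are taken from the commands above; the real drop1/keep1 jobs may test differently):

  # splitrcs.py - hedged sketch of the drop1/keep1 pair: records still showing the
  # blank record-size go to ctl/catctl4rcs0, the rest go to ctl/catctl4
  import sys
  pattern = sys.argv[1] if len(sys.argv) > 1 else 'rcs=_____'
  with open('ctl/catctl3') as f, \
       open('ctl/catctl4', 'w') as kept, \
       open('ctl/catctl4rcs0', 'w') as dropped:
      for line in f:
          (dropped if pattern in line else kept).write(line)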
We decided to convert all mainframe VARIABLE length files to FIXED length on the unix/linux/windows system to simplify procedures & allow for some software such as Easytrieve that does not handle variable length files on unix/linux/windows.
Many mainframe files defined as VARIABLE length were in fact FIXED length.
After we transfer the files, we will run 'varstat1' to determine the largest record-size, and use that as our fixed record-size on unix/linux/windows.
Goto: Begin this doc , End this doc , Index this doc , Contents this library , UVSI Home-Page
Here is a review of the control files created from the Excel spreadsheet of all mainframe data-files coded with the copybook, Y/N flag, and a group code (logical group for conversion)
ctl/catctl0.csv - the Excel file converted to .csv
   NGAP2.GAPUF.AFI.BELDAT,SAG061,PS,350,5950,FB,etc,,,,,

ctl/catctl1 - desired fields extracted from .csv to fixed field format
   NGAPO.GAPUF.AFI.BELDAT            AFI     350   Y  FOPVBEL

ctl/catctl2 - convert to keyword format, adding file info from datacat52I
              & datactl53I (LISTCAT & JCL conversions)
   ngapo.gapuf.afi.beldat grp=afi___ cpy=copvbel.cpy rs1=00350 rca=00350 rcs=00350 key=___,___ gdg=___ typ=RSF
ctl/catctl3 |
|
ctl/catctl4 |
|
ctl/catctl4rcs0 |
|
There may be hundreds or thousands of files to be converted, but we will convert in logical groups (using the group-code in original Excel file). For example, we will select files with group code 'afi'.
#5. uvcopy keep1,fili1=ctl/catctl4,filo1=ctl/catctl4afi,arg1=grp=afi ================================================================ - select group-code 'afi' to separate control-file
#5a. uvcp "fili1=ctl/catctl4,rcs=128,typ=LSTt,filo1=ctl/catctl4afi\ ,sel=45(7):grp=afi" =============================================================== - alternative using 'uvcp' vs 'uvcopy keep1'
#6. cp $RUNLIBS/ctl/catctl4afi $CNVDATA/ctl/ ========================================
Goto: Begin this doc , End this doc , Index this doc , Contents this library , UVSI Home-Page
'genftpjcl1' is a uvcopy job to replicate mainframe FTP JCL to transfer files to unix/linux. It inserts filenames & record-sizes from a control-file into templates of FTP JCL to unload files & FTP them to unix/linux. The control file can be created from various sources:
A. Filenames extracted from LISTCAT reports transferred to linux. But LISTCAT reports would list many files that do NOT need to be transferred because they are intermediate work files, sort files, etc.
B. ESSENTIAL file lists created by 'mvsfiles5A', which analyzes all JCLs to determine the essential input files, ignoring intermediate, sort, work,& output files. See '1P1' - '1P4'.
C. Manual selection by a knowledgeable person - create an Excel spreadsheet from the mainframe catalogues of all files and add an indicator for files thought to need transfer - also add a cell for the corresponding copybook for each datafile, since EBCDIC to ASCII data conversions are generated from copybooks.
#0a. Login #0b. cdl (alias cd $RUNLIBS) --> /p1/cnv/cnv1/testlibs/ (or your testlibs)
#1. uvcopy genftpjcl1,fili1=ctl/catctl4,fild2=jclftp1 ================================================= - replicate mainframe FTP JCL to transfer datafiles to linux - FTP JCL templates are embedded in the 'genftpjcl1' uvcopy job
The generated FTP JCLs will be transferred back to the mainframe for execution. Mainframe files will be FTP'd to /p1/cnv/ftp/ & later copied to /p1/cnv/cnv1/d0ebc for conversion to /p1/cnv/cnv1/d2asc/... (use $CNVDATA/d1ebc & $CNVDATA/d2asc)
ngapo.gapuf.abe.beldat   cpy=copvbel rs1=00350 rca=00350 rcs=00350 key=___,___ gdg=___
vgapo.gapci.abe.abeknd   cpy=cwskdn  rs1=_____ rca=00700 rcs=04000 key=002,012 gdg=___
ngapo.gapuf.abe.artre(0) cpy=cpdaare rs1=00120 rca=00120 rcs=00120 key=___,___ gdg=003
We have FTP templates for the 3 types of mainframe files that we need to transfer - Fixed record length files, Variable record-length,& GDG files (fixed length).
The 3 templates are listed on the next 3 pages:
Goto: Begin this doc , End this doc , Index this doc , Contents this library , UVSI Home-Page
//VZSQ8M01 JOB (6LAF5N,A0000),'ECKERT',
//         CLASS=X,MSGLEVEL=(1,1),NOTIFY=&SYSUID,MSGCLASS=Y
/*JOBPARM S=RY00,F=F072
//*******************************************************************
//* Transfer files for Rehost migration incl. preparation steps etc.
//* - create a temporary file via sort
//* - transfer the file via ftp
//* - remove the temporary file
//*******************************************************************
//*------------------------------------------------------------------
//* Create a temporary file via sort (here for fixed length files)
//*------------------------------------------------------------------
//S1M0001F EXEC PGM=SORT
//SYSUDUMP DD SYSOUT=K,HOLD=YES
//SYSOUT   DD SYSOUT=*
//SYSPRINT DD SYSOUT=*
//SORTIN   DD DSN=NGAPO.GAPBU.ABE.BELEGE.GESAMT.D141118,
//            DISP=(SHR,KEEP,KEEP)
//SORTOUT  DD DSN=NGAPO.GAPBU.ABE.BELEGE.GESAMT.D141118.X,
//            DISP=(NEW,CATLG,DELETE),UNIT=SYSDA,
//            SPACE=(TRK,(0500,0500),RLSE),
//            DCB=*.SORTIN
//SYSIN    DD *
  SORT FIELDS=COPY
//*
//*------------------------------------------------------------------
//* Transfer the file via ftp (here for fixed length files, no RDW)
//*------------------------------------------------------------------
//S2M0001F EXEC PGM=FTP,PARM='(TRAN TRTAUGE1'
//OUTPUT   DD SYSOUT=*
//SYSPRINT DD SYSOUT=*
//INPUT    DD *
31.253.96.12 21 (exit
ftpuser
ftppassword
cd /p1/cnv/ftp
pwd
type i
PUT 'NGAPO.GAPBU.ABE.BELEGE.GESAMT.D141118.X' +
    ngapo.gapbu.abe.belege.gesamt.d141118
QUIT
/*
//*------------------------------------------------------------------
//* Remove the temporary file
//*------------------------------------------------------------------
//S3M0001F EXEC PGM=IDCAMS
//SYSPRINT DD SYSOUT=*
//SYSIN    DD *
  DELETE NGAPO.GAPBU.ABE.BELEGE.GESAMT.D141118.X
  SET MAXCC = 0
//*
Goto: Begin this doc , End this doc , Index this doc , Contents this library , UVSI Home-Page
//*------------------------------------------------------------------
//* Create a temporary file via sort (here for var. length files)
//*------------------------------------------------------------------
//S1M0002V EXEC PGM=SORT
//SYSUDUMP DD SYSOUT=K,HOLD=YES
//SYSOUT   DD SYSOUT=*
//SYSPRINT DD SYSOUT=*
//SORTIN   DD DSN=VGAPO.GAPCI.ABE.ABEACL,
//            DISP=(SHR,KEEP,KEEP)
//SORTOUT  DD DSN=NGAPO.GAPCI.ABE.ABEACL.X,
//            DISP=(NEW,CATLG,DELETE),UNIT=SYSDA,
//            SPACE=(TRK,(0500,0500),RLSE),
//            DCB=(RECFM=VB,LRECL=4000)
//SYSIN    DD *
  SORT FIELDS=COPY
//*
//*------------------------------------------------------------------
//* Transfer the file via ftp (here for var. length files, with RDW)
//*------------------------------------------------------------------
//S2M0002V EXEC PGM=FTP,PARM='(TRAN TRTAUGE1'
//OUTPUT   DD SYSOUT=*
//SYSPRINT DD SYSOUT=*
//INPUT    DD *
31.253.96.12 21 (exit
ftpuser
ftppassword
cd /p1/cnv/ftp
pwd
type i
LOCSITE RDW
PUT 'NGAPO.GAPCI.ABE.ABEACL.X' +
    vgapo.gapci.abe.abeacl
QUIT
/*
//*------------------------------------------------------------------
//* Remove the temporary file
//*------------------------------------------------------------------
//S3M0002V EXEC PGM=IDCAMS
//SYSPRINT DD SYSOUT=*
//SYSIN    DD *
  DELETE NGAPO.GAPCI.ABE.ABEACL.X
  SET MAXCC = 0
//*
Goto: Begin this doc , End this doc , Index this doc , Contents this library , UVSI Home-Page
//*------------------------------------------------------------------
//* Create a temporary file via sort (here for fixed length files)
//*------------------------------------------------------------------
//S1M0003F EXEC PGM=SORT
//SYSUDUMP DD SYSOUT=K,HOLD=YES
//SYSOUT   DD SYSOUT=*
//SYSPRINT DD SYSOUT=*
//SORTIN   DD DSN=NGAPO.GAPUF.ABE.ARTRESO(0),
//            DISP=(SHR,KEEP,KEEP)
//SORTOUT  DD DSN=NGAPO.GAPUF.ABE.ARTRESO.X,
//            DISP=(NEW,CATLG,DELETE),UNIT=SYSDA,
//            SPACE=(TRK,(0500,0500),RLSE),
//            DCB=(RECFM=FB,LRECL=0120)
//SYSIN    DD *
  SORT FIELDS=COPY
//*
//*------------------------------------------------------------------
//* Transfer the file via ftp (here for fixed length files, no RDW)
//*------------------------------------------------------------------
//S2M0003F EXEC PGM=FTP,PARM='(TRAN TRTAUGE1'
//OUTPUT   DD SYSOUT=*
//SYSPRINT DD SYSOUT=*
//INPUT    DD *
31.253.96.12 21 (exit
ftpuser
ftppassword
cd /p1/cnv/ftp
pwd
type i
PUT 'NGAPO.GAPUF.ABE.ARTRESO.X' +
    ngapo.gapuf.abe.artreso_000001
QUIT
/*
//*------------------------------------------------------------------
//* Remove the temporary file
//*------------------------------------------------------------------
//S3M0003F EXEC PGM=IDCAMS
//SYSPRINT DD SYSOUT=*
//SYSIN    DD *
  DELETE NGAPO.GAPUF.ABE.ARTRESO.X
  SET MAXCC = 0
//*
Goto: Begin this doc , End this doc , Index this doc , Contents this library , UVSI Home-Page
Here is an optional procedure to FTP mainframe files to Unix/Linux. It would be nice if all files could be FTP'd with the RDW option, so we can know the record-sizes immediately on unix/linux, but the RDW option will not convert Fixed-Length files to Variable-Length files during the FTP.
But we could convert FLR to VLR prior to the FTP, and then FTP with the RDW option for all files. When we receive the files on unix/linux, we will know which files were Fixed-Length on the mainframe, because all records will have the same size in the RDW prefix. Here is a skeleton JCL that could be used to convert Fixed to Variable.
//* JCL to convert Fixed-Length to Variable-Length
//SORT001  EXEC PGM=SORT
//SYSUDUMP DD SYSOUT=K,HOLD=YES
//SYSOUT   DD SYSOUT=*
//SYSPRINT DD SYSOUT=*
//SORTIN   DD DSN=AAAAA.BBBBB.CCC.DDDDD,
//            DISP=(SHR,KEEP,KEEP)
//SORTOUT  DD DSN=AAAAA.BBBBB.CCC.DDDDD.X,
//            DISP=(NEW,CATLG,DELETE),UNIT=SYSDA,
//            SPACE=(TRK,(0500,0500),RLSE),
//            DCB=(RECFM=VB,LRECL=32000)
//SYSIN    DD *
  SORT FIELDS=COPY
  OUTFIL FNAMES=SORTOUT,FTOV
/*
Note |
|
//* JCL to FTP Variable-Length with RDW to Unix/Linux
//ACHP01B  EXEC PGM=FTP,PARM='(TRAN TRTAUGE1'
//OUTPUT   DD SYSOUT=*
//SYSPRINT DD SYSOUT=*
//INPUT    DD *
30.252.99.11 21 (exit
userid
password
cd /p2/cnv1/cnvdata/d0ebc/
pwd
type i
LOCSITE RDW
PUT 'AAAAA.BBBBB.CCC.DDDDD.X' AAAAA.BBBBB.CCC.DDDDD
QUIT
/*
Goto: Begin this doc , End this doc , Index this doc , Contents this library , UVSI Home-Page
The generated FTP JCLs will be transferred back to the mainframe for execution. Mainframe files will be FTP'd to /p1/cnv/ftp/ & then copied to $CNVDATA/d0ebc/ for conversion to $CNVDATA/d2asc.
#2. rm /p1/cnv/ftp/* <-- clear output directory of any prior FTP's ================
#3. Execute FTP JCL on the mainframe ================================ - FTP mainframe files to /p1/cnv/ftp/...
Goto: Begin this doc , End this doc , Index this doc , Contents this library , UVSI Home-Page
We will now change to $CNVDATA (cnvdata/) vs $RUNLIBS (testlibs/) used above.
#0a. Login on unix/linux #0b. cdl (alias cd $CNVDATA) --> /p1/cnv/cnv1/cnvdata (or your cnvdata)
/p1/cnv/ftp/... <-- mainframe files have been FTP'd here
/p1/cnv/cnv1/
:-----cnvdata
:     :-----cpys   <-- copybooks
:     :-----ctl    <-- control files
:     :-----d0ebc  <-- EBCDIC files from mainframe (some Variable length)
:     :-----d1ebc  <-- EBCDIC files from mainframe (all Fixed length)
:     :-----d2asc  <-- converted to ASCII here
:     :-----maps   <-- cobmaps (record layouts)
:     :-----pfx1   <-- uvcopy jobs generated from copybooks
:     :-----pfx2   <-- with record type tests for variable length files
:     :-----pfx3   <-- with datafile-names replacing copybook-names
:     :-----stats  <-- table summaries of recsizes in variable length files
#1a. rm -f d0ebc/* d1ebc/* d2asc/* ============================= - clear data-conversion directories from prior groups
#1b. cp /p1/cnv/ftp/* d0ebc ====================== - copy FTP'd files to $CNVDATA/d0ebc for conversion
For the following procedures (generating data conversion jobs) we will standardize our control file names as $CNVDATA/ctl/datacpy52.
Previous documentation described 3 ways of creating the control file & the name may have been different for some of those methods. You should now use 1 of the following copy commands:
#2a. cp $RUNLIBS/ctl/datacpy52 $CNVDATA/ctl/datacpy52 ================================================ - if you created the control file manually as per '4A3'
#2b. cp $RUNLIBS/ctl/datacpy52 $CNVDATA/ctl/datacpy52 ================================================ - if you created the control file from the Essential Files report as per '4B1'
#2c. cp $RUNLIBS/ctl/catctl4 $CNVDATA/ctl/datacpy52 =============================================== - if you created the control file from an Excel spreadsheet of mainframe filenames, edited with copybooknames by a knowledgeable person as per '4C1' - '4C5'
Goto: Begin this doc , End this doc , Index this doc , Contents this library , UVSI Home-Page
We decided to convert all mainframe VARIABLE length files to FIXED length on the unix/linux/windows system to simplify procedures & allow for some software such as Easytrieve that does not handle variable length files on unix/linux/windows. Many mainframe files defined as VARIABLE length were in fact FIXED length.
We can now run 'varstat1' to determine the largest record-size in the variable length files & use that as our fixed record-size on unix/linux/windows. 'varstat1' will create a table summary of record-sizes used in variable length files - Ftp'd from mainframe with 'RDW' option (4 byte hdrs with record size).
#3. uvcopyx varstat1 d0ebc stats uop=q0i7,rop=r0 ============================================ - create table summary recsize counts for all files in d0ebc subdir - output reports in stats/... with same names as datafiles in d0ebc
varstat1  2015/12/16_13:39:14  record-sizes in d0ebc/vgapo.gapci.afi.afiahst
tbl#0001 tblt1f1 c0(5)
line#      count   %  1strec#  record-sizes
    1      4,632 100        1  00500
           4,632*100  *TOTAL*

varstat1  2015/12/16_13:39:54  record-sizes in d0ebc/vgapo.gapci.afi.afipn
tbl#0001 tblt1f1 c0(5)
line#      count   %  1strec#  record-sizes
    1          1   0        1  00080
    2      8,339  11        2  00150
    3     66,240  88        3  00300
          74,580*100  *TOTAL*

varstat1  2015/12/16_13:39:56  record-sizes in d0ebc/vgapo.gapci.afi.afispl
tbl#0001 tblt1f1 c0(5)
line#      count   %  1strec#  record-sizes
    1          1   0        1  00080
    2         31  28        4  00438
    3          5   4       21  00467
    4         11  10        2  00469
    5          4   3        7  00561
    6         48  44        3  00615
    7          1   0       37  00689
    8          1   0       83  00691
    9          1   0       60  00694
   10          5   4       16  00700
             108*100  *TOTAL*
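For reference, here is a hedged Python sketch of the varstat1 idea - tally the record sizes found in the 4-byte RDW prefixes of a file FTP'd with 'LOCSITE RDW'. It assumes the first 2 RDW bytes are a big-endian length that includes the 4 prefix bytes (as on MVS); adjust if your transfer delivers something different. 'rdwstat.py' is a hypothetical name, not a supplied job.

  # rdwstat.py - hedged sketch of the varstat1 idea: tally record sizes in a file
  # FTP'd with 'LOCSITE RDW' (4-byte prefix, big-endian length including the prefix)
  import struct, sys
  from collections import Counter

  counts = Counter()
  with open(sys.argv[1], 'rb') as f:                # e.g. d0ebc/vgapo.gapci.afi.afipn
      while True:
          rdw = f.read(4)
          if len(rdw) < 4:
              break
          ll = struct.unpack('>H', rdw[:2])[0]      # record length including the 4-byte RDW
          if ll < 4:
              break                                 # not RDW data - stop rather than loop
          f.seek(ll - 4, 1)                         # skip the data portion
          counts[ll - 4] += 1

  total = sum(counts.values())
  for size in sorted(counts):
      print('%05d %10d %3d%%' % (size, counts[size], 100 * counts[size] // total))
  print('%d records, largest data length %d' % (total, max(counts) if counts else 0))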
Goto: Begin this doc , End this doc , Index this doc , Contents this library , UVSI Home-Page
Please see the table summaries of record sizes in variable length files on the previous page. Many files defined as variable length on the mainframe had in fact only 1 record-length. Many other files had only 2 or 3 record-lengths due to header &/or trailer records. Only a few were truly variable.
We can now pick the largest record-length in each file & enter it in our control file ctl/datacpy52 on keyword rs1=... You can round up odd numbers if desired.
We will then load control file ctl/datacpy52 into Indexed file ctl/datacpy52I and run 'uvcopy varfix01' which will copy the variable length files from d0ebc/... to d1ebc/... converting variable lengths to fixed lengths as dictated by the rs1=... keyword in the Indexed file ctl/datacpy52I.
#4a. vi stats/* <-- investigate the variable length summaries ========== - to determine a fixed length for each file
--- OR better - list all summary tables for easier investigation - if you have a printer available
#4b. listall1 stats <-- print all reports in tmp subdir ============== - you can mark up the report with the desired record lengths
#5. vi ctl/datacpy52 <-- update the rs1=... keyword in the control file ================
#6. uvcopy loadctlI,fili1=ctl/datacpy52,filo1=datacpy52I ====================================================
ngapo.gapuf.afi.artreso_ grp=afi cpy=cpdaare.cpy  rs1=00120   rca=00120 rcs=00120 key=___,___ gdg=003 typ=RSF
ngapo.gapuf.afi.beldat   grp=afi cpy=copvbel.cpy  rs1=00350   rca=00350 rcs=00350 key=___,___ gdg=___ typ=RSF
ngapo.gapuf.afi.kumausk_ grp=afi cpy=cpdaauf.cpy  rs1=00400   rca=00400 rcs=00400 key=___,___ gdg=003 typ=RSF
vgapo.gapci.afi.afiadz   grp=afi cpy=ctbvstam.cpy rs1=00400 a rca=00120 rcs=04000 key=000,028 gdg=___ typ=IDXf1
vgapo.gapci.afi.afiahst  grp=afi cpy=ctbvstam.cpy rs1=00500 a rca=00120 rcs=04000 key=000,028 gdg=___ typ=IDXf1
vgapo.gapci.afi.afiart   grp=afi cpy=cpdaart.cpy  rs1=01350 a rca=01500 rcs=04000 key=000,015 gdg=___ typ=IDXf1
vgapo.gapci.afi.afiauf   grp=afi cpy=cpdaauf.cpy  rs1=00512   rca=00250 rcs=00512 key=000,021 gdg=___ typ=IDXf1
vgapo.gapci.afi.afibuch  grp=afi cpy=copvbel.cpy  rs1=00350   rca=00349 rcs=00350 key=000,028 gdg=___ typ=IDXf1
#7. uvcopyxr1 varfix01 d0ebc d1ebc ctl/datacpy52I uop=q0i7 ======================================================= - convert all files from variable length in d0ebc/... to Fixed length in d1ebc/...
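For reference, a hedged sketch of the variable-to-fixed copy that #7 performs for each file. The real varfix01 is driven by the rs1=... keyword in ctl/datacpy52I; padding short records with EBCDIC spaces (X'40') & truncating long ones are assumptions here, and 'var2fix.py' is a hypothetical name.

  # var2fix.py - hedged sketch: copy an RDW variable-length file to fixed-length records
  # of 'fixlen' bytes (truncate long records, pad short ones with EBCDIC spaces X'40')
  import struct, sys

  infile, outfile, fixlen = sys.argv[1], sys.argv[2], int(sys.argv[3])
  with open(infile, 'rb') as fi, open(outfile, 'wb') as fo:
      while True:
          rdw = fi.read(4)
          if len(rdw) < 4:
              break
          ll = struct.unpack('>H', rdw[:2])[0] - 4  # data length = RDW length minus the prefix
          rec = fi.read(max(ll, 0))
          fo.write(rec[:fixlen].ljust(fixlen, b'\x40'))

For example 'python3 var2fix.py d0ebc/vgapo.gapci.afi.afipn d1ebc/vgapo.gapci.afi.afipn 300' would fix that file at the 300-byte length chosen from its varstat1 summary.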
Goto: Begin this doc , End this doc , Index this doc , Contents this library , UVSI Home-Page
The COBOL copybooks will be used to generate uvcopy jobs to convert EBCDIC datafiles to ASCII in character fields, but preserving packed/binary fields which must remain the same for unix COBOL as for mainframe COBOL.
We will copy the COBOL copybooks from $RUNLIBS/cpys/* to $CNVDATA/cpys/..., convert them to cobmaps (record layouts) in maps/...,& convert the cobmaps to data conversion jobs in pfx1/...
If desired you could copy over only the copybooks required for the current group of datafiles (defined in ctl/datacpy52) using 'uvcopy ctlsfcpy1'.
But, we might as well copy over all copybooks & convert all to cobmaps, since the control file ctl/datacpy52 will write data conversion jobs into pfx3/... only for the datafiles defined in the control file.
#0a. Login #0b. cdc (alias cd $CNVDATA) --> /p1/cnv/cnv1/cnvdata (or your cnvdata)
#1. cp $RUNLIBS/cpys/* cpys ======================= - copy all copybooks from $RUNLIBS to $CNVDATA/cpys
#2. uvcopyx cobmap1 cpys maps uop=q0i7 ================================== - convert all cobol copybooks to cobmaps (record layouts)
#3. uvcopyx uvdata51 maps pfx1 uop=q0i7 =================================== - convert all cobmaps to data conversion jobs - names the data conversion jobs in pfx1/... same as copybook-names - codes I/O files (fili1=... & filo1=...) same as copybook-names
#4. cp pfx1/* pfx2/ =============== - copy all data conversion jobs to pfx2/... to be edited with record-type test code for files with multiple types (redefined records)
#5. vi pfx2/... <-- edit files with multiple record types with record-type =========== test code for files with multiple types (redefined records)
Note |
|
Note |
|
Goto: Begin this doc , End this doc , Index this doc , Contents this library , UVSI Home-Page
Here we will show you only the edited result of the citytax2 example which is documented in detail at https://uvsoftware.ca/datacnv1.htm#5A1 - 5B3.
# uvcopy job to translate EBCDIC to ASCII, preserve packed, fix zoned signs
opr='jobname=citytax2 - pfx2 name=datafilename'
opr='copybook=citytax2 - pfx1 name=copybookname'
uop=q0,was=a64000b64000
fili1=?d1ebc/citytax2,rcs=00128,typ=RSF
filo1=?d2asc/citytax2,rcs=00128,typ=RSF
@run
        opn    all
loop    get    fili1,a0
        skp>   eof
        mvc    b0(00128),a0           move rec to outarea before field prcsng
        tra    b0(00128)              translate entire outarea to ASCII
# ---                                 <-- insert R/T tests here for redefined records
# Next 9 instructions inserted manually to test multi-record-types
        cmc    b8(1),'H'              Header record
        skp=   put1
        cmc    b8(1),'T'              Tax record
        skp=   typT
        cmc    b8(1),'P'              Payment record
        skp=   typP
        msg    b0(64)                 Error - show 1st 64 bytes
        msgw   'Invalid record type byte 8 not H,T,P - enter to bypass'
        skp    loop
# ---
typT    mvc    b88(4),a88 bns         post-date
        mvc    b92(15),a92 pns        land-value:face-value
        trt    b107(9),$trtsea ns     maint-tax
        skp    put1
# ---
typP    mvc    b20(60),a20 pns        mthpayments
        skp    put1
# ---
put1    put    filo1,b0
        skp    loop
eof     cls    all
        eoj
In this small example the 9 instructions we inserted outnumber the instructions automatically generated to preserve the packed/binary/signed fields.
You should understand that there could be hundreds of auto-generated instructions for the various record types, in which case inserting the record type code is a minor effort in comparison.
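If you want to see the principle outside uvcopy, here is a hedged Python sketch of the same field-by-field approach: translate character fields EBCDIC to ASCII, copy packed/binary fields unchanged. The field layout & the cp037 code page are made-up assumptions - the real layouts come from your cobmaps,& a German site would more likely use cp273 or cp1141.

  # ebc2asc.py - hedged sketch: translate character fields EBCDIC->ASCII & copy
  # packed/binary fields unchanged. RECLEN & FIELDS below are made-up examples -
  # the real layouts come from the cobmaps generated from your copybooks.
  import codecs, sys

  RECLEN = 128
  FIELDS = [(0, 88, 'char'),    # character data - translate
            (88, 4, 'keep'),    # e.g. packed date - copy as-is
            (92, 15, 'keep'),   # e.g. packed amounts - copy as-is
            (107, 21, 'char')]  # character data - translate

  def convert(rec):
      out = bytearray(rec)
      for off, length, kind in FIELDS:
          if kind == 'char':
              text = codecs.decode(rec[off:off + length], 'cp037')   # code page is an assumption
              out[off:off + length] = text.encode('ascii', 'replace')
      return bytes(out)

  with open(sys.argv[1], 'rb') as fi, open(sys.argv[2], 'wb') as fo:
      while True:
          rec = fi.read(RECLEN)
          if len(rec) < RECLEN:
              break
          fo.write(convert(rec))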
Goto: Begin this doc , End this doc , Index this doc , Contents this library , UVSI Home-Page
/p1/cnv/cnv1
:-----cnvdata1
:     :-----cpys   <-- copybooks
:     :-----ctl    <-- control files
:     :-----d0ebc  <-- EBCDIC files from mainframe (some Variable length)
:     :-----d1ebc  <-- EBCDIC files from mainframe (all Fixed length)
:     :-----d2asc  <-- converted to ASCII here
:     :-----maps   <-- cobmaps (record layouts)
:     :-----pfx1   <-- uvcopy jobs generated from copybooks
:     :-----pfx2   <-- with record type tests for variable length files
:     :-----pfx3   <-- with datafile-names replacing copybook-names
The data conversion jobs in pfx1/... & pfx2/... are named from the copybooks and the internal I/O files (fili1=... & filo1=...) are also named the same as the copybook.
'uvcopy uvdata52' will replace the copybook-names with the datafile-names, guided by the control file ctl/datacpy52 which matches datafiles & copybooks. Here are a few lines from ctl/datacpy52:
ngapo.gapuf.afi.beldat   grp=afi cpy=copvbel.cpy  rs1=00350   rca=00350 rcs=00350 key=___,___ gdg=___ typ=RSF
ngapo.gapuf.afi.kumausk_ grp=afi cpy=___________  rs1=00400   rca=00400 rcs=00400 key=___,___ gdg=003 typ=RSF
vgapo.gapci.afi.afiadz   grp=afi cpy=ctbvstam.cpy rs1=00400 a rca=00120 rcs=04000 key=000,028 gdg=___ typ=IDXf1
vgapo.gapci.afi.afiahst  grp=afi cpy=ctbvstam.cpy rs1=00500 a rca=00120 rcs=04000 key=000,028 gdg=___ typ=IDXf1
vgapo.gapci.afi.afiart   grp=afi cpy=cpdaart.cpy  rs1=01350 a rca=01500 rcs=04000 key=000,015 gdg=___ typ=IDXf1
vgapo.gapci.afi.afiauf   grp=afi cpy=cpdaauf.cpy  rs1=00512   rca=00250 rcs=00512 key=000,021 gdg=___ typ=IDXf1
We will allow for files without copybooks by copying a supplied skeleton job from $UV/IBM/skeleton2 into pfx2/... Then any datafile without a copybook assigned in the control file will use this skeleton job to convert EBCDIC to ASCII. This assumes that datafiles without copybooks do not have any packed/binary/signed fields - which is often the case.
#6. cp $UV/IBM/skeleton2 pfx1 ========================= - copy uvcopy job skeleton for datafiles without copybooks - OK for files that have no packed/binary fields - could modify control file typ=LST (LineFeeds OK if no packed/binary)
Goto: Begin this doc , End this doc , Index this doc , Contents this library , UVSI Home-Page
Finally, we are ready to generate the data conversion jobs with the actual datafile names (vs the copybook names).
#7. uvcopy uvdata52,fili1=ctl/datacpy52,fild2=pfx2,fild3=pfx3,uop=q0i7 ================================================================== - generate data conversion uvcopy job for each datafile - inserting datafilenames (vs copybooknames) on fili1=... filo1=... and renaming the job from copybookname to the datafilename
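For reference, here is a hedged sketch of what #7 does. It assumes the pfx2 jobs are named like the copybook without '.cpy', that a simple text replace of that name covers the fili1=/filo1= lines & the job name,& that files without a copybook fall back to the 'skeleton2' job copied in #6 - the real uvdata52 may differ in detail.

  # gendatajobs.py - hedged sketch of the uvdata52 idea: for each datafile in the
  # control file, copy the copybook-named job from pfx2/ to pfx3/<datafile>,
  # replacing the copybook name (on fili1=/filo1= & the job name) with the datafile name
  import os, re

  cpy_kw = re.compile(r'cpy=(\S+)\.cpy')
  os.makedirs('pfx3', exist_ok=True)

  for line in open('ctl/datacpy52'):
      parts = line.split()
      if not parts or parts[0].startswith('#'):
          continue
      datafile = parts[0]
      m = cpy_kw.search(line)
      jobname = m.group(1) if m else 'skeleton2'    # no copybook -> use the skeleton job
      src = os.path.join('pfx2', jobname)
      if not os.path.exists(src):
          continue                                  # no generated job for this copybook
      text = open(src).read().replace(jobname, datafile)
      open(os.path.join('pfx3', datafile), 'w').write(text)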
#8. uvcopyxx 'pfx3/*' <-- execute all data conversion jobs ================= - to convert d1ebc/... to d2asc/...
'uvcopyxx' is a script that executes all the uvcopy jobs in pfx3/... If desired you can execute individual jobs for individual files, for example:
#8a. uvcopy pfx3/ngapo.gapuf.afi.beldat ================================== - execute job to convert EBCDIC file d1ebc/ngapo.gapuf.afi.beldat to ASCII in d2asc/ngapo.gapuf.afi.beldat
#9. cp d2asc/* $RUNDATA/data1/ ========================== - copy all converted files to .../.../testdata/data1/ for testing
In case you have accumulated a lot of datafiles in $CNVDATA/d2asc, we provide 'ctlsfdata1', which will generate a script to copy only the datafiles defined in control file ctl/datacpy52.
#9a. uvcopy ctlsfdata1,fili1=datacpy52,filo1=sf/copydata1 ==================================================== - generate script to copy datafiles from $CNVDATA/d2asc to $RUNDATA/data1
#9b. sf/copydata1 <-- execute script to copy datafiles ============= from $CNVDATA/d2asc/... to $RUNDATA/data1/...
Goto: Begin this doc , End this doc , Index this doc , Contents this library , UVSI Home-Page
5A1. | Introduction/Overview |
5B1. | Test Environment, RUNLIBS & RUNDATA superdirs defined in your profile |
- allows different programmers to have their own set of Libraries & Data | |
- best to have common set of Libraries, but different Data dirs | |
- aliases cdl/cdd allow you to switch easily between Libs/Data dirs |
5B2. | JCL converted to scripts in Part_1 |
- this Part_5 presents Tips & Techniques for testing & debugging |
5C1. | Use these tips for the Demo JCL/scripts or for your own JCL/scripts |
See DATA file conversions in https://uvsoftware.ca/datacnv1.htm | |
to convert your data files & FTP data from Mainframe to Unix/Linux |
5C2. | Converting Your files - brief review of DATAcnv1.htm. |
Datafile Conversion Directories Required |
5C3. | FTP data from Mainframe to Unix/Linux - binary & option RDW |
5C4. | Re-Naming files as required |
5D1. | run 'testdatainit' before JCL/scripts |
- to clear output files & make it easier to see outputs of current test | |
- use 'joblog' to capture the log from your JCL/script | |
- subdirs cleared by testdatainit |
5E1. | Test/Debug for difficult JCL/scripts |
- save all data1/* in datasave/ & load data1/ with files for difficult job | |
5F2. | Iterations of test/investigate/modify as required |
- Animator/Debugger for Micro Focus COBOL | |
5F3. | Check results of test/debug, investigate output files (use uvhd if packed) |
Print-outs using 'uvlp' scripts assist test/debug |
5G1. | Re-Creating the GDG control file - when new JCLs added with new GDG files |
- alternate location for gdgctl51I in $APPSADSM/ctl vs default $RUNDATA/ctl |
5G2. | Editing the GDG control file with no of generations desired (vs default 7) |
- modifying text file version & reloading Indexed file used by functions | |
in JCL/scripts |
5H1. | GDG files & step Restart |
- VU JCL conversions write new GDGs into a jobtmp subdir & restore to | |
the data subdir only if JobEnd=Normal. This allows you to rerun scripts | |
without worrying about any new GDGs created in the failing run. | |
- If there are steps prior to the failing step with updates in place, | |
you can use "step Restart" at the failed step to prevent double updates. | |
5H2. | RERUN after failure without restart |
5H3. | RERUN after failure WITH RESTART |
Goto: Begin this doc , End this doc , Index this doc , Contents this library , UVSI Home-Page
5I1. | jobstop & job restart |
- activate stop at begin each step to examine files from 2nd screen |
5J1. | Activating 'console-logging' vs 'job-logging' for 1 JCL/script debug |
- use console logging for long sessions running multiple jobs | |
- captures any operator inputs (job-logging does not) |
5K1. | uvcmp___ - file comparison for files with packed/binary &/or no LineFeeds |
- unix 'diff' does not work for these types of files | |
- UV Software provides the 'uvcmp' utilities | |
- uvcopy jobs uvcmp1,2,3 & several scripts uvcmpFA1,uvcmpFE1,etc | |
to make the uvcopy jobs easier to run. | |
- uvcmp prints mismatched record pairs in vertical hexadecimal | |
flagging differences with '*'s, see sample report comparing | |
2 generations of gl.account.master_000001 & _000002 |
Goto: Begin this doc , End this doc , Index this doc , Contents this library , UVSI Home-Page
Here are some tips that will help you to test/debug your JCL/scripts & COBOL programs in applications migrated from the mainframe.
Most jobs can be debugged by comparing the output files & sysout reports to the mainframe files & reports. Here are some tips for the more difficult cases. We assume that your profile has RUNDATA=$HOME/testdata (debugging in your homedir), so your testing does not affect the other programmers. You might also have your own set of libraries (RUNLIBS=$HOME/testlibs), OR RUNLIBS might point to a common set of libraries (JCL/scripts, COBOL programs, parms, quikjobs, etc).
Goto: Begin this doc , End this doc , Index this doc , Contents this library , UVSI Home-Page
For initial testing & self-education, you might set up testlibs, testdata,& cnvdata in your homedir. Assuming your login is userxx, here is the directory tree showing only the most essential subdirs. See all testlibs subdirs at '1B3'.
/home/userxx                <--- your homedir
:-----testlibs              <-- $RUNLIBS - LIBRARIEs for JCL & COBOL
:     :-----cbls            - COBOL program source
:     :-----cblx            - compiled COBOL programs
:     :-----jcls            - JCL/scripts (Korn shells converted from JCLs)
:
:-----testdata              <-- $RUNDATA - DATA directories & subdirs
:     :-----data1           - datafiles
:     :-----joblog          - programmer debug log files
:     :-----jobtmp          - temp files for new GDGs, SYSINs, etc
:     :-----sysout          - SYSOUT printer files (from COBOL DISPLAYs)
:
:-----cnvdata               <-- $CNVDATA - data CONVERSION superdir
:     :-----d1ebc           - EBCDIC files from mainframe for conversion
:     :-----d2asc           - files converted to ASCII
:     :-----cpys            - COBOL copybooks
:     :-----maps            - cobmaps (record layouts) generated from copybooks
:     :-----pfx1            - uvcopy jobs convert EBCDIC to ASCII (gen from cobmaps)
Note that RUNLIBS, RUNDATA,& CNVDATA are defined in your stub_profile, before calling the common_profile (in /home/appsadm/env). See profiles listed on pages '1C1' & '1C2'. The supplied stub_profile defines these variables as follows (for initial testing, will change for production).
export RUNLIBS=$HOME/testlibs
export RUNDATA=$HOME/testdata
export CNVDATA=$HOME/cnvdata
This means that each programmer would have their own set of libraries & data, which is OK for early testing, especially if there is only 1 programmer on the project. For multi-programmer teams, you would probably want to redefine RUNLIBS to point to a common set of libraries (not in homedirs), but you might well leave RUNDATA in $HOME/testdata, so programmers would not conflict with each other.
Aliases cdl/cdd/cdc are supplied (in the common_profile), to make it easy for you to quickly switch to testlibs/testdata/cnvdata as desired.
alias cdl='cd $RUNLIBS'
alias cdd='cd $RUNDATA'
alias cdc='cd $CNVDATA'
These are even more useful after you change testlibs/testdata/cnvdata to common locations for the programmer team, which might have longer paths that you would no longer have to remember.
Goto: Begin this doc , End this doc , Index this doc , Contents this library , UVSI Home-Page
We assume you have already converted all COBOL & JCL as documented on pages '1F1' & '1G1' using super scripts that convert all copybooks & COBOL programs, and all JCLs with PROCs & PARMs.
cnvMF51Acpy all     <-- convert copybooks thru all steps
=============           - cpy0 --> cpy1 --> cpy2 --> cpys

cnvMF51A all        <-- convert COBOL programs thru all steps
============            - cbl0 --> cbl1 --> cbl2 --> cbls --> cblx
jcl2ksh51A all      <-- convert ALL JCL thru all steps
==============          jcl0 --> jcl1 --> jcl2 --> jcl3
--- OR for AIX, the conversion scripts would be:
cnvAIXcpyA all      <-- convert copybooks for AIX COBOL
==============

cnvAIXcblA all      <-- convert COBOL programs for AIX COBOL
==============
jcl2ksh53A all <-- convert JCL for AIX COBOL ==============
The converted JCL/scripts are now in jcl3/..., but need to be copied to jcls/... for execution since only jcls is in $PATH (defined in the common_profile).
cp jcl3/* jcls      <-- copy All scripts to execution subdir jcls/...
==============          - we will do this for our test/demos
                        - BUT do NOT copy ALL for your "real conversion"
cp jcl3/jobnamex.ksh jcls   <-- copy 1 script at a time before test/debug
=========================       - recommended for your "real conversion"
You should NOT copy all jcl3/* to jcls/ for your "real conversion" project. For your real conversion project, you should copy 1 script at a time when you are ready to start test/debug on that script. This is an easy way to keep track of your progress (jcls/... are debugged, jcl3/... not yet).
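If you want a quick progress check, a few lines of ksh can list which converted scripts have not yet been copied to jcls/... This is only an illustrative sketch; it assumes the converted scripts carry a .ksh extension as in the example above.

   cdl                                  # cd $RUNLIBS
   for f in jcl3/*.ksh; do
      b=${f##*/}                        # basename of the converted script
      [[ -f jcls/$b ]] || print "not yet copied/debugged: $b"
   done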
Another reason NOT to copy all jcl3/* jcls/ when JCL/scripts re-converted:
Goto: Begin this doc , End this doc , Index this doc , Contents this library , UVSI Home-Page
jcl0 ------> jcl1 ---------> jcl2 -----------> jcl3 ------> jcls
    cleanup      PROC expand     convert to ksh    copy 1 at a time
                                                   for test/debug
jcl2ksh51A all      <-- script to convert ALL JCL thru all steps
==============          jcl0 --> jcl1 --> jcl2 --> jcl3
You only need to rerun the 'jcl2 --> jcl3' step (using 'jclxx51') when the only changes are enhancements to the JCL converter or changes to options & control files.
jclxx51 jcl2 jcl3   <-- script to reconvert jcl2/* --> jcl3/...
=================
To convert 1 JCL at a time, use the 'jcl2ksh51' script (vs 'jcl2ksh51A').
jcl2ksh51 jcl0/jobxx.jcl <-- convert 1 JCL from jcl0->jcl1->jcl2->jcl3->jcls ========================
'jcl2ksh51' converts thru all stages to jcl3/... & prompts for copy to jcls/... It prompts before overwriting jcls/... in case you need to save any extensive editing you have already done on some steps of the JCL/script.
JCL conversion scripts for AIX are coded with '53' vs '51' for Micro Focus.
jcl2ksh53A all      <-- script to convert ALL JCL thru all steps
==============          jcl0 --> jcl1 --> jcl2 --> jcl3
jclxx53 jcl2 jcl3   <-- script to convert just the JCL jcl2/* --> jcl3/...
=================       - omitting cleanups & proc expansion prior to JCL convert
                        - saves time when the only changes are enhancements to the
                          JCL converter or changes to options & control files
jcl2ksh53 jcl0/jobxx.jcl <-- convert 1 JCL from jcl0->jcl1->jcl2->jcl3->jcls ========================
Goto: Begin this doc , End this doc , Index this doc , Contents this library , UVSI Home-Page
The cross-reference reports can help to resolve various problems you may encounter during testing/debugging.
xrefall cbls jcl3 ksh       <-- create all cross-ref reports
=====================           - in subdir xref/...
cobfiles5A cbls cpys maps   <-- creates xref/cobfiles report
=========================
See samples of cross-refs in JCLcnv3aids.htm#Part_1. Here are some of the most useful cross-references:
xkshparm1 / xkshparm2
xkshprog1 / xkshprog2
xcobcopy1 / xcobcopy2
xcobcall1 / xcobcall2
cobfil51
Goto: Begin this doc , End this doc , Index this doc , Contents this library , UVSI Home-Page
For documentation, we are using testfiles supplied with Vancouver Utilities, but for your conversion, you will need to FTP files from the mainframe & convert as illustrated with the demo files in this documentation.
See comprehensive DATA file conversions in 'Part_4' of this JCLcnv2real.doc.
For simpler data file conversions see https://uvsoftware.ca/datacnv1.htm - especially following parts:
https://uvsoftware.ca/datacnv1.htm#Part_3
https://uvsoftware.ca/datacnv1.htm#Part_6
If you had no packed/binary fields you could FTP in TEXT MODE, which automatically translates to ASCII, but the CR/LF makes records 2 bytes longer,& you either have to remove them to match the COBOL programs, or modify the program SELECTs to 'ORGANIZATION LINE SEQUENTIAL', which allows for the CR/LF & also allows you to use unix system utilities (vi,more,lp,cat,etc) that require LineFeeds.
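For the no-packed-fields case, here is a minimal sketch of the two choices (the filenames are placeholders; the VU utilities provide more complete options):

   tr -d '\r' < custmas.ftp > custmas.txt       # keep LineFeeds (for ORGANIZATION LINE SEQUENTIAL)
   tr -d '\r\n' < custmas.ftp > custmas.fixed   # drop both CR & LF to restore the original
                                                #   fixed record length (RECORD SEQUENTIAL)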
But most mainframes have a lot of files with packed/binary fields & it is probably best to FTP all files in BINARY MODE & use the VU conversion methods, which generate data conversion jobs automatically from the COBOL copybooks, to preserve the packed/binary fields (same for unix COBOLs as mainframe COBOLs). Note that unpacked signed fields require zoned sign conversion. Mainframe sign over-punches (in the units byte) are coded '{ABCDEFGHI' for positives & '}JKLMNOPQR' for negatives. Unix COBOL positives are coded just '0123456789' & negatives 'pqrstuvwxy'. The VU methods allow for this,& an alternate version could be used with FTP text/ASCII files.
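To illustrate the zoned-sign difference only (this is NOT the recommended method - the generated VU jobs convert just the sign fields identified by the copybooks), a one-for-one character translation on an ASCII text file would look like this:

   tr '{ABCDEFGHI}JKLMNOPQR' '0123456789pqrstuvwxy' < zoned.mf.ascii > zoned.unix.ascii
   #  - caution: tr translates EVERY occurrence of these characters in the record,
   #    not just the sign byte, so it is only safe when you know the affected
   #    characters can appear only in sign positions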
Goto: Begin this doc , End this doc , Index this doc , Contents this library , UVSI Home-Page
As explained above, we are using VU test/demo files for documentation, but for your conversion, you will need to FTP files from the mainframe & convert as documented in https://uvsoftware.ca/datacnv1.htm#Part_6. Here (on pages 5C2-5C4) is a brief overview of DATA file conversion, but you must use the detailed procedures in DATAcnv1.htm.
/home/userxx                <-- your homedir
:-----cnvdata1              <-- cnvdata superdir for DATA conversion
:     :
:     :-----cpys     <-- copybooks <--- copy from $UV/mvstest/cnvdata1/cpys/*
:     :-----ctl      - control files
:     :-----d0ebc    <-- EBCDIC data files <-- copy $UV/mvstest/cnvdata1/d0ebc/*
:     :-----d1ebc    - EBCDIC data files renamed to VU conventions
:     :-----d2asc    - ASCII data files (converted by uvcopy uvdata51)
:     :-----d3ebc    - convert back to EBCDIC by uvdata31
:     :-----d4pipe   - pipe delimited to load DB tables (by genpipe1)
:     :-----maps     - copybooks with record layouts on right side
:     :-----pfx1     - uvcopy jobs to convert EBCDIC to ASCII (gen by uvdata51)
:     :-----pfx2     - with data filenames inserted (vs copybook names)
:     :-----pfx3     - copied here for modify/execute
:     :-----pfp1     - uvcopy jobs to create pipe delimited (gen by genpipe1)
:     :-----pfp2     - pipe delim jobs with datafilenames inserted
:     :-----pfp3     - copy here before editing for multi-record-type files
:     :-----sqlTC1   - scripts to Create database tables (gen from copybooks)
:     :-----sqlTL1   - scripts to Load database tables (gen from copybooks)
:     :-----tmp      - temp files (keep working directory clean)
cpys -------> maps ----------> pfx1 --------------> pfx2 -----------> pfx3 --------->
     cobmap1       uvdata51         uvdata52             copy/edit         execute
     convert       to uvcopy        change copybooknames
     copybooks     jobs             to datafilenames
                 Variable->fixed       EBCDIC->ASCII           copy/rename
MAINFRAME -----> d0ebc ------> d1ebc ----------> d2asc ---------> $RUNDATA/data1
FTP/BINARY                          ASCII to d2asc        cp/renameL
                                    pfx1/uvcopy jobs      rename GDG files
Goto: Begin this doc , End this doc , Index this doc , Contents this library , UVSI Home-Page
Here (pages 5C1-5C4) is a brief overview of how you would FTP & convert YOUR datafiles (vs demo files). You must use detailed procedures in DATAcnv1.htm.
#1. Login       --> your homedir
=========
#2. cdc         --> $CNVDATA
===
#3. cd d0ebc    <-- change into the data subdir
========
#4. ftp xxx.xxx.xxx.xxx     <-- FTP to mainframe IP#
===================             - binary & option RDW
#4a. userid--> .....        <-- enter userid
#4b. passwd--> .....        <-- enter password
#4c. ascii                  <-- translate EBCDIC to ASCII & insert CR/LF
     ---OR---                   (ascii is usually the default)
#4c. binary                 <-- ensure binary transfer
#4d. quote site RDW         <-- required if any variable length files
#4e. cd ...                 <-- change to data files directory ??
#4f. get 'XXX.XXX.XXX'      <-- get desired data files
#4g. get '...etc....'           - 'single quotes' may be required
#5. Might need to change permissions on files FTP'd from mainframe
#5a. chmod 664 * <-- files should be 664 ===========
Note 'LITERAL SITE RDW' could be used as well as 'QUOTE SITE RDW'
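The interactive FTP steps #4a-#4g above could also be driven from a small script with a here-document. This is only a sketch - the IP address, userid/password,& dataset name are placeholders, and storing a password in a script may not be acceptable at your site:

   ftp -n xxx.xxx.xxx.xxx <<'EOF'
   user MYUSERID MYPASSWORD
   binary
   quote site RDW
   cd 'PROD.DATA'
   get 'AR.CUSTOMER.MASTER'
   bye
   EOF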
Goto: Begin this doc , End this doc , Index this doc , Contents this library , UVSI Home-Page
On some systems FTP'd files may be enclosed in single quotes, may be UPPER case, may have embedded blanks or '$' signs. On unix we need to remove the single quotes, remove any embedded blanks or '$' signs,& lower case names are strongly recommended.
#6. mv \'XXX.XXX.XXX\' xxx.xxx.xxx     <-- the hard way
================================           - see #7. the easy way
Imagine how awkward manual renaming via 'mv' commands would be if you have hundreds of files to be renamed. Vancouver Utilities has many 'rename' scripts to make mass changes to filenames easy.
#7a. renameQQ data1     <-- easy way to remove Quotes from All files in subdir
==============
#7b. renameL data1      <-- easy way to translate names to Lower case
=============
#7c. renameBU data1     <-- easy way to convert Blanks to '_' Underscores
==============
#7d. renameDU data1     <-- easy way to convert '$' signs to '_' Underscores
==============
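For illustration, here is the kind of loop such a rename script performs (a sketch only - the actual renameQQ/renameL scripts are more general); it assumes you are in the subdir holding the FTP'd files:

   for f in \'*\'; do                  # filenames still wrapped in single quotes
      [[ -e "$f" ]] || continue        # skip if nothing matched the pattern
      new=${f#\'}; new=${new%\'}       # strip the leading & trailing quote
      typeset -l new="$new"            # & translate the name to lower case
      mv -- "$f" "$new"
   done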
After converting GDG files, you must rename them using the VU GDG file naming conventions - appending '_000001' for the 1st generation. For example:
#8. cp $CNVDATA/d2asc/AR.CUSTOMER.MASTER(0) $RUNDATA/data1/ar.customer.master_000001
====================================================================================
    - the HARD way of renaming mainframe GDG files to VU GDG standards
#8a. renameGDG d1ebc    <-- script to rename all files in directory for VU GDGs
===============             (the EASY way)
Goto: Begin this doc , End this doc , Index this doc , Contents this library , UVSI Home-Page
When you are testing, it is a good idea to run 'testdatainit', before you run the JCL/script being debugged. 'testdatainit' clears several subdirs, which makes it much easier to inspect the results.
Here are the testdata subdirs flagged with '0' to indicate subdirs that are cleared by testdatainit & '*' to indicate subdirs with your data files that you would not want to clear.
/home/userxx                  <--- userid for testing OR your homedir
:-----testdata                - data files superdir
:     :--*--data1             - data files
:     :--*--datasave          - backup, allow clear data1/.. to files for 1 job debug
:     :--*--dataMF            - datafiles from MainFrame for comparison
:     :--*--ctl               - GDG control file
:     :--0--jobctl            - jobstop control files to debug JCL/script
:     :--0--joblog            - programmer debug log files
:     :--0--jobmsgs           - status msgs from JCL/scripts (step begin/end)
:     :--0--jobtimes          - job/step times date stamped history files
:     :--0--jobtmp            - temporary files for SYSIN data & GDG temp files
:     :     :----JOBXX        - jobtmp subdir created by jobset51 (script line 10)
:     :     :     :-----GDG
:     :     :     :     :-----data1  - new GDG files restored to data1/... at EOJ
:     :--0--sysout            - SYSOUT printer files
:     :     :----yymmdd       - date stamped subdir for print files
:     :     :     :-----JOBXX_S0010_SYSPRINT  - named by job+step+DDname
:     :--0--tmp               - tmp subdir for uvsort & misc use
:     :--0--tape1             - tape files become GDG files on disc
Goto: Begin this doc , End this doc , Index this doc , Contents this library , UVSI Home-Page
1st run 'testdatainit' to clear output subdirs & make it easy to check results. Then run your JCL/script using 'joblog', which captures the log into the 'joblog' subdir for review if messages roll off the screen. Here is an example:
#1. Login userxx --> $HOME (/home/userxx) ============
#2. cdd --> $HOME/testdata ======================
#3. testdatainit <-- clear subdirs of prior run data ============ - also prompts to reload the GDG control file
#4. joblog jobxx.ksh <-- use joblog to run script to be tested ================ - saves console messages in joblog/...
The subdirs cleared are: jobctl, joblog, jobmsgs, jobtmp, sysout,& tmp. For example, after a run:
llr sysout      <-- will show only new files from your latest run
==========          - because testdatainit has cleared outputs of prior runs
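For illustration, the clearing that 'testdatainit' performs amounts to something like the following sketch (the real script also prompts before clearing & offers to reload the GDG control file):

   cdd                            # cd $RUNDATA
   for d in jobctl joblog jobmsgs jobtmp sysout tmp; do
      rm -rf $d/*                 # clear outputs of prior tests (the '0' subdirs above)
   done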
Goto: Begin this doc , End this doc , Index this doc , Contents this library , UVSI Home-Page
jobxx.ksh start=S0030 stop=S0030
================================
 - would run only step 3
 - omit the stop=... to run all remaining steps
 - would use to restart production jobs after failing step
#7. llr jobtmp <-- list all filenames in jobtmp/ such as GDG files ==========
#8. l jobtmp/JOBXX/GDG/data1/   <-- list any GDG files left in jobtmp
========================
    - JobEnd=Normal moves new GDG files back to data1/... subdir
    - JobEnd=AbTerm leaves new GDG files in jobtmp/JOBXX/GDG/data1/...
      so you can fix problem & rerun without removing new GDGs from data1/...
Goto: Begin this doc , End this doc , Index this doc , Contents this library , UVSI Home-Page
We assume that you have been test/debugging many jobs & then find a difficult job that requires these exceptional procedures. The data1/... subdir could have hundreds of files from prior tests; we can save them to datasave/..., clear all data1/*,& copy back only the files required for the difficult job being debugged.
#1. Login userxx --> /home/userxx (or whatever Login desired)
#2. cdd --> $HOME/testdata ======================
#3. mkdir datajobxx <-- mkdir for files required in jobxx =============== (problem job to be debugged)
#3a. cp data1/fileA datajobxx/     <-- select files required for jobxx
=========================
#3b. cp data1/fileB datajobxx/     ... etc ...
=========================
#4. mkdir datasave <-- make backup subdir for existing datafiles ============== (if not already existing)
#5. cp data1/* datasave <-- backup all existing data1/* files ===================
#7a. cp datasave/* data1 ===================
Goto: Begin this doc , End this doc , Index this doc , Contents this library , UVSI Home-Page
This assumes that you have completed the preparations suggested on the previous pages, especially the backup of data1/* files to datasave/..., clearing all data1/*,& copying back only the files required for the difficult job being debugged. You could now repeat the following steps for each cycle of test/debug that might be required for a difficult job.
#1. cdd --> ensure we are in $RUNDATA
#2. rm -f data1/* <-- remove all files from data1/... =============
#3. cp datajobxx/* data1 <-- store data1/ files for jobxx only ====================
#4. testdatainit <-- clear files from all temporary subdirs ============ (jobctl,joblog,jobmsgs,jobtmp,sysout,tmp,wrk)
#5. export ANIM="+A" <-- activate Micro Focus COBOL animator (debugger) ================
#6. joblog jobxx.ksh <-- test/debug jobxx ================
#7. reply to animation prompts ==========================
The Micro Focus Animator is a marvelous debugging tool for COBOL programs. It stops at the 1st instruction & displays a menu of commands. Here are a few:
--> s     <-- Step (execute current high-lighted statement)
              - for Perform: go into paragraph or section
--> ps    <-- Perform Statement - do not animate paragraph/section
              (appears to step over it)
--> qc    <-- Query the fieldname under the Cursor
              - move cursor (with arrow keys) to the desired fieldname
--> lc    <-- Locate definition of fieldname under the Cursor
--> g     <-- Go, steps continuously (but slowly) thru instructions
--> 9     <-- (while Go is progressing), enter digit 2-9 to speed up
--> z     <-- Zoom thru remainder of program (without animating)
--> Esc   <-- Escape to exit program
--> y     <-- requires 'y' response to actually exit
Goto: Begin this doc , End this doc , Index this doc , Contents this library , UVSI Home-Page
#8. vi joblog/jobxx.log <-- view the joblog ===================
#9a. vi data1/... <-- check data files with 'vi' ============ - IF LineFeeds present (no Packed fields)
#9b. uvhd data1/... <-- use 'uvhd' if no LF's ============== - &/or if Packed/Binary fields present
#10a. llr jobtmp <-- list jobtmp files created ========== - intermediate files & new GDG's (new GDGs NOT restored to data1/ if Abterm)
#10b. vi jobtmp/JOBXX/GDG/... <-- investigate GDG files =======================
#11a. llr sysout <-- list sysout files created ========== - from COBOL 'DISPLAY's
#11b. vi sysout/JOBXX/... <-- investigate sysout files as desired ===================
#11c. uvlp18 sysout/JOBXX/... <-- print sysout files as desired ======================= - uvlp18 prints 18 cpi (132 cols on 8" wide)
#11d. uvlp14L sysout/JOBXX/...     - uvlp14L prints Landscape, 14 cpi (150 cols on 11" wide)
========================
#12. cdl --> $RUNLIBS <-- switch to $RUNLIBS
#13a. vi jcls/jobxx.ksh <-- modify the JCL/script if required =================
#14a. vi cbls/cblxx.cbl <-- modify COBOL programs if required =================
#14b. mfcbl1 cblxx.cbl <-- recompile the COBOL program ================
-------------> repeat steps 1 - 14 as required <---------------
#15. cp datasave/* data1 <-- restore all data1/* files when jobxx debugged ===================
#16. export ANIM="" <-- de-activate Micro Focus COBOL animator ==============
Goto: Begin this doc , End this doc , Index this doc , Contents this library , UVSI Home-Page
Hard-copy listings can help you debug difficult jobs & document the problems for other team members. Vancouver Utilities provides several scripts (based on the uvlist utility documented at uvlist.htm). These scripts are great for JCL/script & program listings since the page headings capture vital information (filename, date, userid, page#, line# of 1st line on page,etc). Here are a few of the most useful scripts:
uvlp12 / uvlp12D / uvlp14 / uvlp14L / etc
  - the digits give the print size in cpi & the 'L' suffix = Landscape (see uvlist.htm)
#1. uvlp14 joblog/jobxx.log <-- print the joblog for better study ======================= - see 'uvlp' scripts in uvlist.htm
#2. lslp data1 <-- print data1/... to study with jobxx.log ========== - 'lslp' script does ls -l to tmp/ & uvlp12 tmp file
#3. llr sysout <-- display printer filenames in sysout/yyyymmdd/... ==========
#4. uvlp14 sysout/yyyymmdd/JOBXX_S0010_DDname ========================================= - print sysout files
Goto: Begin this doc , End this doc , Index this doc , Contents this library , UVSI Home-Page
If you have not yet edited gdgctl51 & reloaded gdgctl51I, then you could re-create gdgctl51 from all JCL by running 'uvcopy jclgdgctl51' separately from script 'jcl2ksh51A'. You might do this if you had added a few JCLs via jcl2ksh51 (vs jcl2ksh51A).
#1. Login userxx   --> /home/userxx
#2. cdl            --> /home/userxx/testlibs (alias cdl='cd $RUNLIBS')
#3. uvcopy jclgdgctl51,fild1=jcl3,filo1=ctl/gdgctl51 ================================================ - extract all GDG file-names from converted JCL/scripts
#4. cp ctl/gdgctl51 $GDGCTL/     <-- copy to GDG control directory
============================         - GDGCTL = $RUNDATA/ctl or $APPSADM/ctl
#5. uvcopy gdgload51,fili1=$GDGCTL/gdgctl51,filo1=$GDGCTL/gdgctl51I
===============================================================
#5a. uvcopy gdgload51    <-- same but easier (files default as shown above)
================
If you reconvert all JCL (using script 'jcl2ksh51A'), you will NOT overwrite the GDG control file & lose any manual edits to modify the number of generations, because the script tests for existing $GDGCTL/gdgctl51 & will not overwrite.
If you only needed to convert a few additional JCL's you could use script 'jcl2ksh51' which converts 1 at a time & does not recreate the GDG control file. You could then edit the existing gdgctl51 adding any new GDG files & reload the Indexed file by rerunning 'uvcopy gdgload51'.
The location of the GDG control file is determined by the common_profile (stored at /home/appsadm/env/common_profile & called by .profile or .bash_profile). Here are lines 61-62 of $APPSADM/env/common_profile.
export GDGCTL=$RUNDATA/ctl     #<-- default location
# export GDGCTL=$APPSADM/ctl   #<-- could change to this ?
If you have multiple RUNDATA directories you might want to activate 'export GDGCTL=$APPSADM/ctl' & #comment out 'export GDGCTL=$RUNDATA/ctl' so you would have only 1 GDG control file vs multiple.
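If you are unsure which definition is active, note that a profile (or script) can supply a fallback with the usual ksh default-value idiom - shown here only as an illustration, not as the exact common_profile code:

   export GDGCTL=${GDGCTL:-$RUNDATA/ctl}   # use GDGCTL if already exported, else default to $RUNDATA/ctl
   echo $GDGCTL                            # confirm which GDG control directory is in effect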
Goto: Begin this doc , End this doc , Index this doc , Contents this library , UVSI Home-Page
The GDG control file is 1st created as part of the 'jcl2ksh51A' script that converts all JCL to Korn shell scripts. All GDG filenames (identified by suffixes (0),(+1),etc) are extracted from all JCL, sorted dropping duplicates, & loaded into an Indexed file for lookup when the JCL/scripts are executed - to determine latest & next generation,& also to determine how many generations to maintain. These lookups & determinations are performed by functions 'exportgen0' & 'exportgen1'.
Here is the GDG file created from the demo JCL/scripts listed in part_2 of JCLcnv1demo.htm#Part_2. This is the text file that can be edited to modify the number of generations before loading into the Indexed file 'gdgctl51I' (looked up by the exportgen0/exportgen1 functions in the JCL/scripts).
gl.account.acntlist_ gdg=07
gl.account.master_   gdg=07
gl.account.trans2_   gdg=07
gl.account.trans_    gdg=07
As shown above, the number of generations defaults to 7. You can edit the file to set the generations as desired for each file. Note that the location of the GDG control file defaults to $RUNDATA/ctl/gdgctl51 but is configurable via "export GDGCTL=..." (discussed later).
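Conceptually, the generation lookups work like the sketch below: find the newest existing generation & add 1 to its 6-digit suffix. This is only an illustration - the real exportgen0/exportgen1 functions read the Indexed file gdgctl51I & also enforce the generation limits; gl.account.trans_ is just a base name taken from the sample control file above.

   gdgbase=gl.account.trans_
   newest=$(ls data1/${gdgbase}[0-9]* 2>/dev/null | sort | tail -1)
   lastno=${newest##*_}                               # e.g. 000001 (null if no generations yet)
   next=$(printf "%06d" $(( 10#${lastno:-0} + 1 )))   # e.g. 000002
   print "gen0  (newest existing) = ${newest:-none}"
   print "gen+1 (next to create)  = data1/${gdgbase}${next}"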
#1. Login userxx --> /home/userxx
#2. cdd --> /home/userxx/testdata (alias cdd='cd $RUNDATA') ===
#3. vi ctl/gdgctl51 <-- edit to modify no of generations ===============
#4. uvcopy gdgload51 ================ - reload Indexed file for use by exportgen1, exportgen0,& jobend51
I/O files are defaulted within the 'gdgload51' uvcopy job as follows:
#4a. uvcopy gdgload51,fili1=$GDGCTL/gdgctl51,filo1=$GDGCTL/gdgctl51I ===============================================================
Since the common_profile defines "export GDGCTL=$RUNDATA/ctl", the result is:
#4b. uvcopy gdgload51,fili1=$RUNDATA/ctl/gdgctl51,filo1=$RUNDATA/ctl/gdgctl51I =========================================================================
Goto: Begin this doc , End this doc , Index this doc , Contents this library , UVSI Home-Page
The VU JCL conversion writes new GDGs into $RUNDATA/jobtmp/$JOBID2/GDG/data1/... and restores the new GDGs to $RUNDATA/data1/... at S9000 JobEnd=Normal. If any step fails, the job ends at S9900 JobEnd=AbTerm & any new GDG files in jobtmp/... are not restored to data1/...
The logic above means that you can simply rerun JCL/scripts with GDG files and not worry about any new GDGs created in the failing run. However you do need to know if there are any non-GDG files that are updated in place (vs updated by copying to a newfile). If there are updates in place prior to the failing step, then you can use "step Restart" to prevent double updates. This assumes it was not the failing step doing an update in place, in which case you would need to restore that file from a backup made before the job started.
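The JobEnd handling described above amounts to something like this sketch (illustrative only, not the actual VU code; JOBEND here is a hypothetical variable - the real scripts do this in the S9000/S9900 end-of-job logic):

   if [[ "$JOBEND" = "Normal" ]]; then
      cp jobtmp/$JOBID2/GDG/data1/* data1/ 2>/dev/null  # promote the new generations
      rm -rf jobtmp/$JOBID2                             # & clear the job's staging subdir
   else
      print "JobEnd=AbTerm - new GDGs left in jobtmp/$JOBID2/GDG/data1/ for rerun or restart"
   fi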
We will demo new GDG file creation & step restart using JCL/script 'jgl200.ksh' (listed at jclcnv1demo.htm#2D1 & normal joblog at jclcnv1demo.htm#4F4). Step1 sorts a transaction file to a GDG file, which then updates the master file (also a GDG) in the step2 COBOL program. We can cause an 'AbTerm' by moving the COBOL executable from $RUNLIBS/cblx to tmp/... After the failed run, we can restore the executable to demo recovery.
#1. Login userxx --> /home/userxx (or whatever Login desired)
#2. cdl --> $HOME/testlibs (alias cdl='cd $HOME/testlibs')
#3. mv cblx/cgl200.int tmp <-- move step2 program out to tmp/... ====================== - to cause failure on step2
#4. cdd --> $HOME/testdata
#5. l data1 <-- list existing files in $RUNDATA/testdata/data1 ======= - will show only files for jgl200.ksh - only 1 generation of each file existing
-rw-rw-r-- 1 userxx apps 13952 Oct 18 15:11 gl.account.master_000001
-rw-rw-r-- 1 userxx apps  1600 Oct 18 15:11 gl.account.trans_000001
#6. joblog jgl200.ksh <-- execute the job (step2 will fail) ================= - joblog listed below (just step2 & AbTerm msgs)
******** Begin Step S0020 cgl200 (#2) ********
gen+1 GLTRANS=jobtmp/JGL200/GDG/data1/gl.account.trans_000002 gens=07
gen0  GLMSOLD=data1/gl.account.master_000001 insize=4.0K
gen+1 GLMSNEW=jobtmp/JGL200/GDG/data1/gl.account.master_000002 gens=07
Executing--> cobrun -F /home/userxx/testlibs/cblx/cgl200
Load error : file '/home/userxx/testlibs/cblx/cgl200'
error code: 173, pc=0, call=1, seg=0
173 Called program file not found in drive/directory
ERR: step#S0020 cgl200 abterm 255
JobEnd=AbTerm, JCC=255,StepsX/L=2/S0020
GDG files NOT moved from jobtmp/subdirs to /home/userxx/testdata/subdirs
Goto: Begin this doc , End this doc , Index this doc , Contents this library , UVSI Home-Page
#7. cdl --> $HOME/testlibs
#8. mv tmp/cgl200.int cblx <-- move step2 program back to execution library ======================
#9. cdd --> $HOME/testdata
#10. joblog jgl200.ksh <-- re-execute the job, will get messages ================= about GDG files in jobtmp from prior failure
WARN: files in jobtmp/GDG subdirs (from prior AbTerm ERR?)
----  files in jobtmp/JGL200/GDG/*/* listed below:
      jobtmp/JGL200/GDG/data1/gl.account.trans_000002
If NO restart by step#, GDG files in jobtmp/... will be cleared
 - allows rerun from begin job with no worry about GDGs
If RESTARTing by step#, example--> jobname.ksh start=S0050
 - GDG files in jobtmp/... will NOT be cleared
 - will be available to steps after restart step#
 - will be restored to data1/... subdir at JobEnd=Normal
--> <-- null entry to continue
 - - - lines removed to JobEnd=Normal - - -
JobEnd=Normal, StepsExecuted=1, LastStep=S0020
#11. l data1 <-- list existing files in $RUNDATA/testdata/data1 ======= - will show only files for jgl200.ksh - now 2 generations each
-rw-rw-r-- 1 userxx apps 13952 Oct 18 15:11 gl.account.master_000001
-rw-rw-r-- 1 userxx apps 13952 Nov 26 12:15 gl.account.master_000002
-rw-rw-r-- 1 userxx apps  1600 Oct 18 15:11 gl.account.trans_000001
-rw-rw-r-- 1 userxx apps  1600 Nov 26 12:15 gl.account.trans_000002
Goto: Begin this doc , End this doc , Index this doc , Contents this library , UVSI Home-Page
#10. joblog jgl200.ksh start=S0020     <-- re-execute job, with RESTART at step2
==================***********              - will get same msgs re GDG files
                                            - BUT, notice the 'WARN: **START**' msg now appears
WARN: files in jobtmp/GDG subdirs (from prior AbTerm ERR?)
----  files in jobtmp/JGL200/GDG/*/* listed below:
      jobtmp/JGL200/GDG/data1/gl.account.trans_000002
If NO restart by step#, GDG files in jobtmp/... will be cleared
 - allows rerun from begin job with no worry about GDGs
If RESTARTing by step#, example--> jobname.ksh start=S0050
 - GDG files in jobtmp/... will NOT be cleared
 - will be available to steps after restart step#
 - will be restored to data1/... subdir at JobEnd=Normal
--> <-- null entry to continue
WARN: **START** at start=S0020
 - - - lines removed to JobEnd=Normal - - -
JobEnd=Normal, StepsExecuted=1, LastStep=S0020
#11. l data1 <-- list existing files in $RUNDATA/testdata/data1 ======= - will show only files for jgl200.ksh - still 2 generations each
-rw-rw-r-- 1 userxx apps 13952 Oct 18 15:11 gl.account.master_000001
-rw-rw-r-- 1 userxx apps 13952 Nov 26 12:15 gl.account.master_000002
-rw-rw-r-- 1 userxx apps  1600 Oct 18 15:11 gl.account.trans_000001
-rw-rw-r-- 1 userxx apps  1600 Nov 26 12:15 gl.account.trans_000002
See complete documentation re GDG files at JCLcnv4gdg.htm.
Goto: Begin this doc , End this doc , Index this doc , Contents this library , UVSI Home-Page
The JCL converter inserts a call to function 'stepctl51' at the beginning of each step. This function normally does nothing, but can be activated by 'jobstop' to stop at the beginning of each step (or a specified step) & wait for the operator to enter 'go' or 'clear'.
stepctl51     # test oprtr jobstop/jobclear
#========
# inserted at beginning of each step by JCL converter option 'b2'
Pausing at the begining of each step allows you to investigate the files in data1/... & jobtmp/JOBXX/GDG/data1/... that might be deleted at the end of the step. Also see JCLcnv3aids.htm#8B1 which suggested you might use the JCL converter option to #comment out 'rm's (converted from original 'DELETE's) & then later uncomment with mass change scripts.
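Conceptually the pause works like the sketch below (illustrative only - the real stepctl51 also handles step numbers, the 'clear' cleanup,& logging; JOBNAME/STEP are hypothetical variable names here):

   if [[ -f jobctl/$JOBNAME.ctl ]]; then
      print "$JOBNAME paused at step $STEP - reply 'go' to continue or 'clear' to remove the stop"
      read reply
      [[ "$reply" = "clear" ]] && rm -f jobctl/$JOBNAME.ctl
   fi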
#1. Login    --> /home/userxx
#2. cdd      --> $HOME/testdata
---> instructions omitted here (see page '5H2')
#6a. jobstop jobxx.ksh <-- activate stop at begin each step ================= - stores control record in jobctl/jobxx.ctl
#6b. joblog jobxx.ksh <-- execute the job ================ - will get following display at begin each step
--> jobxx.ksh paused by job control file: jobctl/jobxx.ctl
    - job control record: jobxx.ksh S0000 111125_124512
    - waiting until reply 'go' or 'clear'
#7. Login on a 2nd screen    --> $HOME
#8. cdd                      --> $HOME/testdata
#9a. l data1 <-- investigate data files on screen#2 ======= - while jobxx paused on screen#1
#9b. l jobtmp/JOBXX/GDG/data1/ <-- investigate GDG files in jobtmp =========================
The 'jobstop' script may also specify the 1st step# to stop at (at end of step):
#6a. jobstop jobxx.ksh stop=S0030 <-- store jobctl file with stop step# ============================
#6b. joblog jobxx.ksh <-- execute the job ================ - will stop at end of step S0030 & subsequent steps if you reply 'cont'inue
Goto: Begin this doc , End this doc , Index this doc , Contents this library , UVSI Home-Page
As of Feb 2012, the 'job restart' facility has been enhanced to include the 'job stop' facility. This means you no longer have to precede your job execution (with or without joblog) with the 'jobstop' script to store the jobctl/ file. You can restart a job at a specified step# & stop it at the END of a step#. Here are some examples:
#1. jgl310.ksh <-- runs all 6 steps of this demo job ==========
#1a. joblog jgl310.ksh <-- same with joblogging ================= - will omit 'joblog' from following examples - BUT you should use joblog for test/debug
#2. jgl310.ksh start=S0030 <-- rerun job starting at step 3 ====================== - will run to end of job with no stops
#3. jgl310.ksh start=S0030 stop=S0040 ================================= - rerun job starting at step 3 & stopping at END of step 4 - displays following choices:
120210:155447:JGL310: jgl310.ksh paused by job control file: jobctl/jgl310.ctl
120210:155447:JGL310: - job control record: jgl310.ksh S0030 120210_155447
120210:155447:JGL310: - waiting for reply: cont(inue), clear, endok, endab
cont  - continue (execute) to next step, stop again,& reprompt as above
clear - clear the jobctl/jgl310.ctl file, will execute to end job with no stops
endok - goto Normal end of job, will restore any new GDG's
endab - goto AbNormal end of job, will NOT restore any new GDG's
Goto: Begin this doc , End this doc , Index this doc , Contents this library , UVSI Home-Page
If desired you could activate console-logging & rerun the demo jobs.
Console-Logging will capture the entire login session from login to logout vs the 'joblog' script intended for programmer test/debug to capture the log for 1 job at a time.
Note that console-logging captures all screen I/O, including any responses to prompts & commands the operator might do between running jobs.
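Conceptually, console-logging is similar to running your whole session under the standard unix 'script' command, as sketched below. The actual VU implementation lives in the profile & also post-processes the captured logs at logout, so treat this only as an illustration (the directory & filename pattern follow the conventions used on this page):

   script $APPSADM/log1/userxx/$(date +%y%m%d_%H%M%S)   # start capturing the session
   #  ... run your jobs, list files, reply to prompts, etc ...
   exit                                                 # 'exit' ends the capture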
See complete details of console logging at ADMjobs.htm#Part_6 but here is the short version of activating. ADMjobs.htm#6D1 describes the subdirs required in /home/appsadm where logs are collected. We will assume that you are 'userxx'.
#1. login --> your homedir
#2a. mkdir $APPSADM/log1/userxx     <-- subdir current log being created
#2b. mkdir $APPSADM/log2/userxx     <-- subdir current months processed logs
#2c. mkdir $APPSADM/log3/userxx     <-- subdir last months logs
#3. vi .profile     <-- edit your profile
===========
#3a. uncomment the 9 '##' lines near the end of your profile
     - see profile listing on page '1C1'
#3b. :wq
#4. Logout & Log back in to start console logging
#5. Run some jobs (jar100.ksh for example)
#5a. cdd            - change to $RUNDATA directory
#5b. l              - list subdirs in $RUNDATA
#5c. jar100.ksh     - run 1st job (COBOL program car100.cbl)
#5d. l data1        - list data subdir
#6. Logout/Login to process the log
    - copies from $APPSADM/log1/userxx/date_time to $APPSADM/log2/userxx/.
      removing screen control codes that would make the log unreadable
#7. logview     <-- script lists your available log filenames
=======             - prompts for file# to view (#1 is latest)
#7a. --> 1      <-- enter '1' to see latest log
#7b. --> 0      <-- enter '0' to quit
#8. l $APPSADM/log2/userxx/ <-- can list your log files directly ======================= vs logview script
#8a. vi $APPSADM/log2/userxx/yymmdd_HHMMSS <-- inspect latest log file =====================================
Goto: Begin this doc , End this doc , Index this doc , Contents this library , UVSI Home-Page
'job-logging' is good for test/debug (captures the log for 1 job at a time), but 'console-logging' is better for production, because it captures everything, inputs as well as outputs, including the commands the operator entered between jobs. Here is a sample - I have numbered the operator's commands on the right side:
#1. cdd            <-- change to $RUNDATA
#2. l              <-- list files in $RUNDATA
#3. l data1        <-- list files in data1/ (before running job)
#4. jar100.ksh     <-- run JCL/script
#5. l data1        <-- list files after running job
<@uvsoft4:userxx:/home/userxx> cdd                        <-- #1.
<@uvsoft4:userxx:/home/userxx/testdata> l                 <-- #2.
drwxrwxr-x 2 userxx apps 4096 Oct 17 10:49 ctl
drwxrwxr-x 2 userxx apps 4096 Oct 17 09:48 data1          <-- data files subdir
drwxrwxr-x 2 userxx apps 4096 Oct 17 09:27 joblog
drwxrwxr-x 2 userxx apps 4096 Oct 17 09:27 jobmsgs
drwxrwxr-x 2 userxx apps 4096 Oct 17 09:27 jobtmp
drwxrwxr-x 2 userxx apps 4096 Oct 17 09:27 sysout
drwxrwxr-x 2 userxx apps 4096 Oct 17 09:27 tmp
drwxrwxr-x 2 userxx apps 4096 Oct 17 09:27 wrk
<@uvsoft4:userxx:/home/userxx/testdata> l data1           <-- #3.
-rw-rw-r-- 1 userxx apps  8192 Oct 17 09:48 ar.customer.master
-rw-rw-r-- 1 userxx apps  1280 Oct 17 09:48 ar.sales.items
-rw-rw-r-- 1 userxx apps 13952 Oct 17 09:48 gl.account.master_000001
-rw-rw-r-- 1 userxx apps  1600 Oct 17 09:48 gl.account.trans_000001
<@uvsoft4:userxx:/home/userxx/testdata> jar100.ksh        <-- #4.
==========
111017:105233:JAR100: Begin Job=JAR100
111017:105233:JAR100: /home/userxx/testlibs/jcls/jar100.ksh
111017:105233:JAR100: RUNLIBS=/home/userxx/testlibs
111017:105233:JAR100: RUNDATA=/home/userxx/testdata
111017:105233:JAR100: JTMP=jobtmp/JAR100 SYOT=sysout/JAR100
111017:105233:JAR100: RUNDATE=20111017
111017:105233:JAR100: ******** Begin Step S0010 car100 (#1) ********
111017:105233:JAR100: file: CUSTMAS=data1/ar.customer.master fsize=8.0K
111017:105233:JAR100: file: NALIST=data1/ar.customer.nameadrs.list100 fsize=
111017:105233:JAR100: file: SYSOUT=sysout/JAR100/S0010_SYSOUT fsize=
111017:105233:JAR100: Executing--> cobrun -F /home/userxx/testlibs/cblx/car100
111017:105233:JAR100: Job Times: Begun=10:52:33 End=10:52:33 Elapsed=00:00:00
111017:105233:JAR100: EOF filr01 rds=3 size=6144: /home/userxx/testdata/ctl/gdgctl51I
111017:105233:JAR100: JobEnd=Normal, StepsExecuted=1, LastStep=S0010
<@uvsoft4:userxx:/home/userxx/testdata> l data1           <-- #5.
-rw-rw-r-- 1 userxx apps  8192 Oct 17 09:48 ar.customer.master
-rw-rw-r-- 1 userxx apps  2858 Oct 17 10:52 ar.customer.nameadrs.list100
-rw-rw-r-- 1 userxx apps  1280 Oct 17 09:48 ar.sales.items
-rw-rw-r-- 1 userxx apps 13952 Oct 17 09:48 gl.account.master_000001
-rw-rw-r-- 1 userxx apps  1600 Oct 17 09:48 gl.account.trans_000001
Goto: Begin this doc , End this doc , Index this doc , Contents this library , UVSI Home-Page
'diff' is a marvelous utility, but does not work for files without lineFeeds & with packed/binary fields present. For these files UV Software provides the 'uvcmp' utilities - uvcopy jobs uvcmp1,2,3 & several scripts uvcmpFA1, uvcmpFE1, etc to make the uvcopy jobs easier to run.
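If you just want a quick confirmation that two such files differ before running uvcmp, standard unix tools can help a little (they give byte offsets or a raw dump, but none of the record-oriented context that uvcmp/uvhd provide):

   cmp -l data1/gl.account.master_000001 data1/gl.account.master_000002 | head
   #  - lists each differing byte offset & the two byte values (in octal)
   od -A d -t x1 data1/gl.account.master_000002 | less
   #  - plain hex dump of one file (no record layout, no packed-field interpretation)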
Here is an example you can run using JCL/script 'jgl200', (listed at jclcnv1demo.htm#2D1 & normal joblog at jclcnv1demo.htm#4F4) jgl200 updates the account.master file with a transaction file while copying to a new output file. Since the account.master file is a GDG file, it is easy to compare the old & new master files without having to save the input file.
#1. Login mvstest1              --> /home/mvstest1
#1. or Login yourself (userxx)  --> your homedir (/home/userxx)
#2. cdd --> $HOME/testdata (alias cdd='cd $HOME/testdata')
#3. joblog jgl200.ksh <-- execute account.master update ================= - joblog at jclcnv1demo.htm#4F4
#4. l data1 <-- list data1/... subdir after execution =======
-rw-rw-r-- 1 userxx apps 13952 Oct 18 15:11 gl.account.master_000001
-rw-rw-r-- 1 userxx apps 13952 Dec 13 11:31 gl.account.master_000002
-rw-rw-r-- 1 userxx apps  1600 Oct 18 15:11 gl.account.tran1
-rw-rw-r-- 1 userxx apps  1600 Oct 18 15:11 gl.account.trans_000001
#5. mkdir rptcmp <-- make subdir for comparison report ============ (if not already present)
#6. uvcmpFA1 data1/gl.account.master_000001 data1/gl.account.master_000002 r128
===============================================================================
    - compare the before & after update files, 'r128' specifies the record-size
    - may omit option 'r' on the command line & specify when prompted
    - writes difference report to rptcmp/gl.account.master_000001
      (ie - named same as 1st file compared)
Goto: Begin this doc , End this doc , Index this doc , Contents this library , UVSI Home-Page
#7. vi rptcmp/gl.account.master_000001 <-- inspect difference report ================================== - 1st 2 record pairs listed below:
uvcmp1 - compare 2 files, print mismatched records, '*' flag diffs  2011/12/13_11:32:34
uop=q1p30r256s6t500000u3x2y0q1r128
     recsize reccount file-size typ
     Report=rptcmp/gl.account.master_000001
  1:     128      109    13,952 RSF  File1=data1/gl.account.master_000001
  2:     128      109    13,952 RSF  File2=data1/gl.account.master_000002
                        1         2         3         4         5         6
f#record#byte#  0123456789012345678901234567890123456789012345678901234567890123
===============================================================================

 1      1    0   11100       11100Royal Bank Lynn Valley        ....b.20090131
                 2333332222222333335676624666247662566667222222220010683333333322
                 0111000000000111002F91C021EB0C9EE061CC590000000000972C2009013100
                                                                      ******

 2      1        11100       11100Royal Bank Lynn Valley        ....b.20111213
                 2333332222222333335676624666247662566667222222220010683333333322
                 0111000000000111002F91C021EB0C9EE061CC590000000000972C2011121300
                                                                      ******
       - - - 5 record pairs omitted - - -

==================== EOF or StopPrint/StopRead count reached ==============
F1Count=109, F2Count=109, StopPrint=6, StopRead=500000
F1Reads=109, MisMatches=109, MisMatsPrinted=6, Recsize=128
Goto: Begin this doc , End this doc , Index this doc , Contents this library , UVSI Home-Page
Here is a summary of scripts, C programs,& uvcopy jobs used for JCL conversion, followed by listings of a few important scripts (active links vs dead links). Active links have a trailing period '11A1.' vs a trailing underscore '11A2_'. If you have the Vancouver Utilities installed, you can see them all as follows:
/home/uvadm/sf/IBM/   - scripts used for JCL/COBOL/DATA conversions
/home/uvadm/pf/IBM/   - uvcopy jobs used for JCL/COBOL/DATA conversions
/home/uvadm/src/      - C programs used for JCL/COBOL/DATA conversions
11A1. | jcl2ksh51A - convert all JCL, Procs,& Parms (do everything script) |
       proc0 -------> procs        parm0 -------> parms
             cleanup                     cleanup
       jcl0 ----> jcl1 ---------> jcl2 ---------> jcl3 ------> jcls
           cleanup     PROC expand     convert to ksh  copy 1 at a time
11C1. | jcldata51A - called by jcl2ksh51A to create control files |
- see next page |
11A2_ jclpx51 - expand all PROCs while copying from jcl1 to jcl2 - called by jcl2ksh51A
11A3_ jclxx51 - convert all JCLs from jcl2 --> jcl3 - use to reconvert for changes in options or utilities
11A4_ jcl2ksh51 - convert 1 MVS JCL thru all stages - cleanup, PROC expand, JCL convert, copy to jcls - jcl0 ---> jcl1 ---> jcl2 ---> jcl3 ---> jcls
11A5_ jclproc51.c - C program to expand PROCs while copying jcl1 --> jcl2
11A6_ jclunix51.c - C program to convert JCL to Korn shell scripts jcl2 --> jcl3
11B1_ cnvMF51Acpy - convert COBOL copybooks thru all steps cpy0 ------> cpy1 ------> cpy2 ------> cpys cleanup convert copy
11B2. | cnvMF51A - convert COBOL programs thru all steps |
       cbl0 ------> cbl1 ------> cbl2 ------> cbls ------> cblx
            cleanup      convert      copy         compile
11B3_ cnvMF51 - convert 1 COBOL program thru all stages & optionally compile - cleanup, conversion,& compile: cbl0 --> cbl1 --> cbl2 --> cbls - see COBOL conversion & compile scripts listed in MVSCOBOL.htm.
11B4. | mfcbl1 - compile 1 COBOL program |
11B5_ mfcblA - compile All COBOL programs cbls ---> cblx
Goto: Begin this doc , End this doc , Index this doc , Contents this library , UVSI Home-Page
11C0. | copymvsctls - copy Vancouver Utility control files from $UV/ctl/... |
required to convert mainframe COBOL & JCL |
11C1. | jcldata51A - creates ctl/datajcl51, ctl/datajcl52 & combines other info |
(datacat51 + dataxls51 + datamisc51 + cobfil55b) | |
- creates datactl53 & loads indexed file datactl53I.dat/.idx | |
- to supply record-sizes, keys, file types to the JCL converter | |
- called by jcl2ksh51A |
11C2_ jcldata51 - uvcopy job to extract filenames from JCL
11C3_ jcldata52 - uvcopy job to translate filenames to lowercase & convert GDG filenames to VU convention (trailing underscore)
11C4_ catcat51 - uvcopy job to combine all control files in $RUNLIBS/ctl/add/... collecting significant keyword info on 1 entry per filename. - uvcp used to load Indexed file ctl/dataadd52I.dat/.idx
11C5_ ctldata53 - uvcopy job to combine keyword info & load ctl/datactl53I.dat/.idx - looks up dataadd52I & datacat52I to get keyword info for matching files
11C6. | catdata50 - create control file of data file info from LISTCAT reports |
- to supply record-sizes, file-types, keys, gdgs | |
that may be missing from JCL |
11C7. | xlsdata51 - uvcopy job to extract data file info from Excel spreadsheet |
- sample uvcopy job written for an existing spreadsheet format |
11C8_ jclgdgctl51 - uvcopy job to extract GDG filenames from all JCL - to create ctl/gdgctl51 control file for no of generations
11C9_ gdgload51 - uvcopy job to load Indexed file ctl/gdgctl51I.dat/.idx - looked up by exportgen0 & exportgen1 functions in JCL scripts to determine next generation for reading or writing
11D1. | xrefall - script to generate most cross-references |
- xcobcall1/2, xcobcopy1/2, xkshfile1/2, xkshparm1/2, | |
xkshprog1/2, xkshproc1/2 |
11D2. | xkshprog2 - sample, 1 of the most useful cross-references |
11D3. | xref2csvall - convert cross-references to .csv's |
Goto: Begin this doc , End this doc , Index this doc , Contents this library , UVSI Home-Page
#!/bin/ksh
# jcl2ksh51A - convert All MVS JCL to Korn shell scripts
#            - by Owen Townsend, UV Software, Feb2018
#            - jcl2ksh51A updated from jcl2ksh54A
#            - dropping extra conversion (jcl3-->jcl4) required at some java sites
# jcl0 ---> jcl1 ---> jcl2 ---> jcl3 [--> jcls] --> jcls (1 at a time)
# Assumptions:
# - have run UV scripts mvslibsdirs,mvsdatadirs,cnvdata to create directories
# - have run mvscopyctls to copy initial control-files to testlibs/ctl/...
# - mainframe JCL stored in testlibs/subdirs/jcl0/...
#   PROC's in proc0/, INCLUDES in include0/, control cards in parm0/
#
if [[ -d jcl0 && -d jcl1 && -d jcl2 && -d jcl3 && -d proc0\
 && -d procs && -d parm0 && -d parm1 && -d parms && -d include0 && -d includes\
 && "$1" = "all" && -f ctl/jclunixop51 ]]; then :
else echo "usage: jcl2ksh51A all"
     echo "       =============="
     echo "subdirs required: jcl0/1/2/3/4,proc0/s,parm0/1,include0/s"
     echo "- requires control file ctl/jclunixop51"
     echo "use script \$UV/sf/IBM/mvslibsdirs to create required subdirs"
     echo "see www.uvsoftware.ca/jclcnv1demo.htm#3F1 for prompt responses"
     exit 91; fi
#
echo " "
echo "\$RUNLIBS/ctl/jclunixop51 - JCL converter options file must be present"
echo "datafile info control files required (recsize, type, keys, gdgs)"
echo "1. ctl/datacat52I - datafile info created from mainframe LISTCATs"
echo "   OR--> makeISF0 ctl/datacat52I 191 0,44  <-- create emty file"
echo "2. ctl/add/... <-- additional control files of datafile info"
echo "   \$UV/ctl/add/dummy_readme - use this if no additional files"
echo "Run 'copymvsctls' (\$UV/sf/IBM/copymvsctls) to create empty/dummy files"
echo " - if you have not transferred LISTCAT reports & created additional files"
echo " - see script \$UV/sf/IBM/catdata50 to extract datafile info from LISTCATs"
echo " "
reply=x; until [[ "$reply" = "y" || "$reply" = "n" ]]
do echo "Have you run 'copymvsctls' to create initial control files"
   echo "&/or created manually, may add LISTCAT files to ctl/cat0/..."
   echo " - proceed y/n ?"; read reply; done
if [[ "$reply" = "n" ]]; then exit 92; fi
#
#eject
rm -f errs/*    # clear errors directory
echo "jcl2ksh51A to perform all steps of JCL conversion to Korn shell scripts"
echo "jcl0------->jcl1---------->jcl2--------->jcl3--------->jcl4--------->jcls"
echo "    cleanup    Proc-Expand     convert ksh   convert java  copy1/test"
echo "see www.uvsoftware.ca/jclcnv1demo.htm#3F1 for operating instructions"
echo "can SKIP CLEANUP parms/procs/includes for RE-CONVERSIONS if no changes those dirs"
reply=x; until [[ "$reply" = "YES" || "$reply" = "n" ]]
do echo "cleanup proc0-->procs, parm0-->parm1, & include0-->includes YES/n ?"
   echo "--> REPLY 'n' for RE-CONVERSIONS (usually)"
   echo "- do not need to reconvert procs,parms,includes unless you have changed them"
   echo "- do need to reconvert them if re-transferred from mainframe or changed"
   echo "- critical, so must reply 'YES' to cleanup procs/parms/includes"
   read reply; done
#
if [[ "$reply" = "YES" ]]; then
 reply=x; until [[ "$reply" = "y" || "$reply" = "n" ]]
 do echo "OK to remove all files (outputs) from procs, parm1,& includes - ? y/n"; read reply; done
 if [[ "$reply" = "y" ]]; then rm -f procs/*; rm -f parm1/*; rm -f includes/*; fi
 #
 reply=x; g=g8;   # default to clear cols 73-80
 until [[ "$reply" = "y" || "$reply" = "n" ]]
 do echo "clear cols 73-80 of procs, parms, & includes y/n ?"; read reply; done
 if [[ "$reply" = "n" ]]; then g=g0; fi
 #
 uvcopyx cleanup proc0 procs uop=q0i7n1$g,arg1=.prc   #<-- cleanup PROCs
 #=================================================
 echo "cleanup proc0-->procs complete, enter to cleanup include0-->includes"; read reply
 #
 uvcopyx cleanup include0 includes uop=q0i7n5$g       #<-- cleanup includes
 #=============================================
 echo "cleanup include0-->includes complete, enter to cleanup parm0-->parm1"; read reply
 #
 uvcopyx cleanup parm0 parm1 uop=q0i7n5$g             #<-- cleanup parms
 #=============================================
 echo "- cleanup parm0-->parm1 complete "
 echo "Now run db2exportselect1 to possibly insert 'export to \$SYSRTMP of del ...'"
 echo "- while copying parm1-->parms, which makes no changes by default"
 echo "Users who require inserting 'export to \$SYSRTMP of del ...'"
 echo "- should copy this job from \$UV/pf/IBM/... to \$RUNLIBS/pf/sf/..."
 echo "  & uncomment the code to insert the 'export to \$SYSRTMP of del ...'"
 echo "- should also copy job to \$RUNLIBS/sf/... because you would lose uncommented"
 echo "  when a new version of Vancovuer Utilities (uvadm) installed"
 echo "enter to proceed (or cancel, uncomment code in job,& rerun)"; read reply
 uvcopy db2exportselect1,fild1=parm1,fild2=parms
 #==============================================
fi
#
Goto: Begin this doc , End this doc , Index this doc , Contents this library , UVSI Home-Page
echo "now cleanup jcl0--->jcl1 & expand PROCs jcl1--->jcl2" reply=x; until [[ "$reply" = "y" || "$reply" = "n" ]] do echo "OK to remove all files from jcl1,jcl2,jcl3 ? y/n"; read reply; done if [[ "$reply" = "y" ]]; then rm -f jcl1/*; rm -f jcl2/*; rm -f jcl3/*; fi # # 1. cleanup JCL, strip CR's, clear cols 73-80,& shorten to last nonblank uvcopyx cleanup jcl0 jcl1 uop=q0i7g8n1,arg1=.jcl #<-- cleanup JCL #=============================================== echo "cleanup complete, enter to perform PROC expansion"; read reply # # 2. expand procs & includes as we copy JCL from subdir jcl1 to subdir jcl2 jclpx51 jcl1 jcl2 procs includes #=============================== echo "PROC expansion complete, enter to continue"; read reply # # 3. generate data info file (ctl/datactl53I) for JCL converter # - major changes July 2014, see explanations lines 25-40 above reply=x; until [[ "$reply" = "y" || "$reply" = "n" ]] do echo "create data conversion info file datactl53I y/n ?" echo "----> reply 'y' if ctl/add/masterctl51 updated or new files in JCL" echo "----> reply 'n' if re-converting with no ctlfile updates & no new files" echo "----> (after JCL converter enhancements)" read reply; done # if [[ "$reply" = "y" ]]; then jcldata51A #<-- script to create datactl53I (4 uvcopy jobs) #========= if (($?)); then echo "jcldata51I failed, missing files ? (make null files if N/A)" echo "ctl/datajcl52 + ctl/datacat52I + ctl/add/... --> ctl/datactl53I" exit 93; fi fi # echo "enter to convert JCL to ksh "; read reply # 4. convert expanded JCL to Korn shell scripts (jclxx51/jclunix51 July2006) jclxx51 jcl2 jcl3 #================ # # 5. prompt for extra conversion jcl3 --> jcl4 for schedulers, etc #Sep2019 - extra step makes no changes yet reply=x; until [[ "$reply" = "y" || "$reply" = "n" ]] do echo "perform extra conversion for schedulers, etc" echo "extra step jcl3-->jcl4 copies all unchanged (as of Sep2019) OK y/n ?" read reply; done if [[ "$reply" = "y" ]]; then uvcopy jclcnv34,fild1=jcl3,fild2=jcl4 #==================================== # - jclcnv34 to be modified/replaced when extra conversion determined fi #Extra conversion jcl3 --> jcl4 (for prior projects #cmntd out) # uvcopy jclautomic2,fild1=jcl3,fild2=jcl4 # for Finland Automic scheduler # uvcopy jcljava3,fild1=jcl3,fild2=jcl4 # for Shanghai JAVA # #======================================= # # see jcl2ksh54A for some java sites that need extra step jcljava3 # echo "uvlp12 errs/procNF.rpt <-- print summary report of PROCs Not Found" echo "uvlp12 errs/includeNF.rpt <-- print summary report INCLUDEs Not Found" echo "uvlp12 errs/parmNF.rpt <-- print summary report of PARMs Not Found" echo "all JCLs converted from jcl0 --> jcl1 --> jcl2 --> jcl3 (Not to jcls)" echo "copy scripts from jcl3 to jcls (in PATH) 1 by 1 as you test/debug" #
Goto: Begin this doc , End this doc , Index this doc , Contents this library , UVSI Home-Page
#Aug19/14 - code added here at end jcl2ksh51A to load gdgctl51I.dat/.idx
#         - with GDG files extracted from JCL & default generations gdg=07
#         - should be later updated manually or from LISTCAT reports
#         - see $UV/pf/catdata51 & catdata52
echo "End JCL conversion - create GDG control file"
echo "Extract GDG files from JCL/scripts & Load \$GDGCTL/gdgctl51I"
echo "\$GDGCTL=$GDGCTL"
echo "\$GDGCTL defined in profile as \$RUNDATA/ctl or \$APPSADM/ctl"
echo "- generations default to 7, later update gdgctl51 & reload gdgctl51I"
echo "- OR extract from LISTCAT (see $UV/pf/datacat51)"
#
uvcopy jclgdgctl51,fild1=jcl3,filo2=ctl/gdgctl51,rop=r0
#======================================================
# - extract exportgen0/1 files from all JCL/scripts into ctl/...
reply=x; until [[ "$reply" = "y" || "$reply" = "n" ]]
do echo "answer 'y' below to overwrite/reload any existing \$GDGCTL/gdgctl51I ?"
   echo "answer 'n' below if you edited gdgctl51 with desired generations & reloaded gdgctl51I"
   echo "- will save any existing \$GDGCTL/gdgctl51 file with date/time stamp"
   echo "overwrite/reload any existing \$GDGCTL/gdgctl51I y/n ?"; read reply; done
if [[ "$reply" == "y" ]]; then
## mv $GDGCTL/gdgctl51 $GDGCTL/gdgctl51_$(date +%Y%m%d_%H%M%S)   #<-- disable Jan31/2018
   cp ctl/gdgctl51 $GDGCTL/
   #=======================
   # copy from $RUNLIBS/ctl to $GDGCTL ($RUNDATA/ctl or $APPSADM/ctl)
   uvcopy gdgload51,fili1=$GDGCTL/gdgctl51,filo1=$GDGCTL/gdgctl51I,fili2=ctl/datactl54symbols
   #=========================================================================================
   # - load Indexed file GDG generations to JCL/scripts function exportgen1
fi
echo "Recommend running folllowing scripts before testing"
echo "testdatainit           #<-- init various testdata subdirs & reload gdgctl51I"
echo "xrefall cbls jcl1 jcl3 #<-- recommend running JCL & COBOL cross-references"
echo "runjclstats jcl2 jcl3  #<-- create statistics table summary counts in stats/..."
echo "runsysinrpts jcl3 rpts #<-- create various reports for DFSORT complex functions,etc"
echo " "
exit 0
Goto: Begin this doc , End this doc , Index this doc , Contents this library , UVSI Home-Page
#!/bin/ksh # cnvMF51A - Korn shell script from UVSI stored in: /home/uvadm/sf/IBM/ # cnvMF51A - convert all MVS mainframe COBOL programs to Micro Focus COBOL # - see op. instrns at www.uvsoftware.ca/mvscobol.htm#Part_2 # - also see cnvMF51 for 1 program at a time #Dec05/15 - change OLS option defaults to k6l3 for MOPS #Aug07/14 - cobfil55I changed to cobfil55aI (see uvcopy job $UV/pf/IBM/cobfil55) #Sep14/15 - chg clop1 k1 to k0 do not insert quotes on CALL names if not existing # if [[ "$1" != "all" ]]; then echo "usage: cnvMF51A all <--arg1 'all' to convert programs cbl0->cbl1->cbl2" echo " ============" exit 1; fi # echo "convert all COBOL programs " echo "cbl0------->cbl1------->ctl-------->cbl2 ------->cbls--------->cblx" echo " cleanup identify OLS convert copy ? compile ?" reply=x; until [[ "$reply" = "y" || "$reply" = "n" ]] do echo "convert all programs: cbl0-->cbl1-->cbl2-->cbls-->cblx y/n ?" read reply; done if [[ "$reply" = "n" ]]; then echo "reply=$reply, exiting"; exit 1; fi # if [[ -d cbl0 && -d cbl1 && -d cbl2 && -d cbls && -d ctl && -d tmp ]]; then : else echo "must have subdirs: cbl0,cbl1,cbl2,cbl2,cbls,ctl,tmp";exit 9;fi # #Jul0714 - clear output subdirs cbl1 & cbl1, later prompt copy to cbls rm -f cbl1/* cbl2/* # echo "Next step calls CLEANUP job with options defaulted for most sites" echo "- see cleanup options at: www.uvsoftware.ca/cnvaids.htm#2D1" echo " OR enter 'uvcopy cleanup' on 2nd screen (then cancel)" clop1="q0i7c5e15g8j1k0l1n1s8t1" echo "cleanup default options=$clop1 (may enter overrides or null)" echo "enter cleanup option overrides (null default) --> " read clop2 # uvcopyx cleanup cbl0 cbl1 arg1=.cbl,uop=$clop1$clop2 #=================================================== # - cleanup mainframe code (clear 1-6, 73-80, lower case except in quotes) # reply=x; until [[ "$reply" = "y" || "$reply" = "n" ]] do echo "generate ctl/cobfil55aI to determine ORG Line Seqntl y/n ?" echo "- then run cnvMF5 with option 'k4' to lookup ctl/cobfil55aI" echo " to Identify ORG LINE SEQNTL by cnvMF51 option k4" echo " or use 'k1/k2' to also id OLS files by recsize 80 or 132/133" echo " or select option 'l3' to ID OLS by keywords" echo " see: https://www.uvsoftware.ca/mvscobol.htm#6G2" echo "Note - later screen will prompt for override options" echo "- default options 'k6l3' to use ctl/cobfil55aI (recommended)" echo "- or use options 'k0l3' to use only keywords" echo "generate ctl/cobfil55aI to determine Line Seqntl files y/n ?" 
read reply; done; # if [[ $reply = "y" ]]; then uvcopy cobfil55,fild1=cbl1,fild2=maps,filo1=ctl/cobfil55a,filo2=ctl/cobfil55b,uop=q0i7,rop=r0 #============================================================================================ # - create sequential file of cobol file info # uvsort "fili1=ctl/cobfil55a,rcs=127,typ=LST,filo1=ctl/cobfil55aI,typ=ISF\ ,key1=0(44),isk1=0(44),del1=0(1):*" #===================================================================== # - load sequential info into indexed file for lookup by cnvMF5 echo "created ctl/cobfil55aI to Identify ORG Line Seqntl files" # uvsort "fili1=ctl/cobfil55b,rcs=127,typ=LST,filo1=ctl/cobfil55bI,typ=ISF\ ,key1=0(44),isk1=0(44),del1=0(1):*" #===================================================================== # - load Indexed file for lookup by jclunix53.c echo "created ctl/cobfil55bI to supply record-sizes to JCL converter" fi echo "Next screen will prompt for COBOL CONVERTER options to override defaults" echo "- here are some options that some sites may want to change" echo "y1 (default) - clear cols 1-6 (cols 73-80 always cleared)" echo "y0 - retain cols 1-6 (do NOT clear)" # echo "t1 - translate to lowercase except in quotes (recommended)" # echo "t2 - translate to UPPERcase, t0 (default) leave as is from cleanup" echo "l_ - controls ORGANIZATION LINE SEQUENTIAL (vs RECORD SEQUENTIAL)" echo "l0 - do NOT set OLS via keywords in card & printer filenames" echo "l1 - set by keywords in filenames (card,parm,sysin,etc)" echo "l2 - set by keywords in filenames (printer,sys011,report,etc)" echo "l8 - force all sequential files ORGANIZATION LINE SEQUENTIAL" echo "k_ - also use ctl/cobfil55aI to set ORGANIZATION LINE SEQUENTIAL files" echo "k0 - do NOT use ctl/cobfil55aI to determine OLS" echo "k1 - set OLS if recsize 80 on Input files" echo "k2 - set OLS if recsize 132/133 on Output files" echo "k4 - set OLS if matching cobfil55aI entry has 'L' in byte 46" echo "k6l3 - default/RECOMMENDED, unless problems creating cobfil55aI" echo "m0 - do NOT insert unixwork1/unixproc1, m2 - do not insert in copybooks" echo "m1 - insert unixwork1/unixproc1 to get PARM data from JCL" echo "Enter override options here or next screen (null accept default l3k6m0)" read ops2 # uvcopy cnvMF5,fild1=cbl1,fild2=cbl2,uop=q1i3l3k6m0$ops2\ ,fili3=ctl/cnvcob5.tbl,fili4=ctl/cobfil55aI #======================================================= # - convert mainframe COBOL to Micro Focus COBOL # echo "cnvMF51A COBOL conversion complete (results in cbl2 subdir)" echo "- copy to cbls AND compile all programs n/y/g ? " echo "- n=no, y=compile to .ints(animation), g=compile to .gnts" echo "cp -f cbl2/* cbls <-- OR you could copy manually like this" echo "=================" echo "mfcblA all <-- AND compile all programs like this" echo "========== (or use mfcblAg for .gnts)" reply="" until [[ "$reply" = "n" || "$reply" = "y" || "$reply" = "g" ]] do echo "copy to cbls & compile ? (n=No, y=compile .ints, g=compile .gnts)" read reply; done if [[ "$reply" = "y" ]] then cp cbl2/* cbls #<-- copy programs to compile subdir mfcblA all #<-- compile all programs to .int's elif [[ "$reply" = "g" ]]; then cp cbl2/* cbls #<-- copy programs to compile subdir mfcblAg all #<-- compile all programs to .gnt's fi exit 0
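After cnvMF51A finishes, a quick check that every program made it through is worthwhile. A minimal sketch, assuming the conversion keeps the original .cbl filenames from cbl0 through cbl2:

# sketch only - confirm every program in cbl0 has a converted copy in cbl2
ls cbl0 | wc -l                     # programs in
ls cbl2 | wc -l                     # programs out - counts should match
for p in $(ls cbl0); do
    [[ -f cbl2/$p ]] || echo "not converted: $p"
done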
Goto: Begin this doc , End this doc , Index this doc , Contents this library , UVSI Home-Page
#!/bin/ksh # mfcbl1 - Korn shell script from UVSI stored in: /home/uvadm/sf/IBM/ # mfcbl1 - compile 1 program to .int for animation # - copies source to outdir cblx (required for animation) # - must be in library superdir with following subdirs: # (cbls=cobolsource, cpys=copybooks, cblx= output) # - see MVSCOBOL.doc/VSECOBOL.doc for Operating Instructions pgm="$1"; cbls="cbls"; cblx="cblx"; # capture progname, default input subdir test -n "$2" && cbls="$2"; # if arg2 subdir spcfd - use it (vs dflt cbls) test -n "$3" && cblx="$3"; # if arg2 subdir spcfd - use it (vs dflt cbls) if [[ ! -f "$cbls/$pgm" ]]; then echo "USAGE: mfcbl1 progname.cbl [cbls] [cblx]" echo " =================================" echo " - arg1 progname mandatory, arg2 default cbls/, arg3 default cblx/" exit 1; fi # cwd=$(pwd) # capture Current Working Directory # specify copybook searchpath for MF COBOL export COBCPY=$cwd/cpys:$cwd/sqls:$COBDIR/cpylib # - may need to add other copybook dirs, example for ORACLE: # export COBCPY=$cwd/cpys:$ORACLE_HOME/precomp/public # # establish COBOL options for Micro Focus COBOL compile if [[ -f $cwd/ctl/cobdirectives ]]; then export COBOPT=$cwd/ctl/cobdirectives # directives (-C options) export EXTFH=$cwd/ctl/extfh.cfg # COBOL File Handler Configuration else export COBOPT=$UV/ctl/cobdirectives # directives (-C options) export EXTFH=$UV/ctl/extfh.cfg # COBOL File Handler Configuration fi # convert any UPPER case progname to lower & remove any .ext (.cbl .cbl, etc) typeset -l ps=$pgm # convert UPPER case progname to lowercase px=${ps%%.*} # remove any extension (.cbl, etc) rm -f $cblx/$px.* # remove old versions of this program cp $cbls/$pgm $cblx # copy source to outdir (for animation) cd $cblx # change to outdir to receive output files integer psl=$(wc -l < $pgm) # capture line count in program cat >$px.err <<EOF # init .err file w progname, will append errs # #compile: $pgm Lines=$psl EOF cob -a -P -We -k$pgm -o $px >>$px.err 2>&1 #========================================= coberr=$? cat $px.err | head # pxl1=$(wc -l $px.err); pxl2=${pxl1% *}; pxl3=${pxl2##* }; # if [[ $pxl3 -gt 3 ]]; then #Jan11/10 - cob -Ws default, need -We to return non-0 for fail test # - alternate workaround above to test err rpt lines > 3 if [[ $coberr -ne 0 ]]; then echo "#compile: $ps - *FAILED*" rm -f $px.cbl $px.int $px.idy else rm -f $px.err; fi if [ -f $px.o ]; then rm -f $px.o ; fi cd $cwd # change back up to CWD when compile began # if .lst was created (-P lst() option), move it to cblst directory if [ -f $cblx/$px.lst ]; then mv -f $cblx/$px.lst cblst; fi exit 0
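Because mfcbl1 deletes the .err file on a clean compile and keeps it on a failure (see above), a short sketch to compile a few programs and list any failures; the program names are hypothetical:

# sketch only - compile selected programs & report failures
for p in cgl100.cbl cgl200.cbl; do      # hypothetical program names in cbls/
    mfcbl1 $p
done
if ls cblx/*.err >/dev/null 2>&1; then
    echo "compile FAILURES:"; ls cblx/*.err
else echo "all compiles clean"; fi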
Goto: Begin this doc , End this doc , Index this doc , Contents this library , UVSI Home-Page
# copymvsctls - copy Vancouver Utility control files
#             - required to convert mainframe COBOL & JCL
#             - by Owen Townsend, UV Software, 2008, update 2018
#
# Run this script after creating testlibs/subdirs for JCL/COBOL conversion
# 1. login           <-- login (to your homedir, or as mvstest, or ?)
# 2. mvstestdirs     <-- setup directories required for JCL/COBOL conversion
# 2a. mkdir testlibs <-- superdir libraries for JCL conversions
# 2b. mkdir testdata <-- superdir data for JCL/script executions
# 2c. mkdir cnvdata  <-- superdir for data conversions EBCDIC to ASCII
# 3. cdl             <-- alias for 'cd $RUNLIBS' or 'cd $TESTLIBS'
# 4. mvslibsdirs     <-- make about 25 subdirs in $RUNLIBS
# 5. cdd             <-- alias for 'cd $RUNDATA' or 'cd $TESTDATA'
# 6. mvsdatadirs     <-- make about 10 subdirs in $RUNDATA
# 7. copymvsctls     <-- copy control files & functions for conversion & execution
#    ===========         * This Script *
#
echo "copymvsctls - copy VU control files required to convert JCL & COBOL"
echo "- will copy from \$UV/ctl/... to \$RUNLIBS=$RUNLIBS"
echo "- are you currently in \$RUNLIBS y/n ?"; read reply
if [[ "$reply" != "y" ]]; then echo "aborted, not in \$RUNLIBS"; exit 99; fi
cp $UV/ctl/jclunixop51 $RUNLIBS/ctl       #options for JCL converter
cp $UV/ctl/jclunixop53 $RUNLIBS/ctl       #options for JCL converter for AIX COBOL
cp $UV/ctl/aixcblrw.tbl $RUNLIBS/ctl      #search/replace table AIX COBOL
cp $UV/ctl/aixhvcopybks $RUNLIBS/ctl      #search/replace table AIX COBOL
cp $UV/ctl/cnvcob5.tbl $RUNLIBS/ctl       #search/replace table Micro Focus COBOL
cp $UV/ctl/cobdirectives $RUNLIBS/ctl     #COBOL DIRECTIVES for Micro Focus
cp $UV/ctl/datactl54symbols $RUNLIBS/ctl  #convert some $symbols in data ctlfiles
cp $UV/ctl/extfh.cfg $RUNLIBS/ctl         #Micro Focus file handler configuration
cp $UV/ctl/utilities $RUNLIBS/ctl         #avoid *flag utilities on cross-refs
cp $UV/sfun/* $RUNLIBS/sfun               #functions
cp $UV/ctl/add/masterctl51* $RUNLIBS/ctl/add/     # for jcl2ksh51A
cp $UV/ctl/add/masterctl51* $RUNLIBS/ctl/addold/  # backups in ctl/addold
# cp -r $UV/ctl/cat* $RUNLIBS/ctl   #subdirs cat0,cat1,cat2 LISTCAT JCL info
# cp -r $UV/ctl/cat* $CNVDATA/ctl   #subdirs cat0,cat1,cat2 LISTCAT DATA info
cp $UV/ctl/lrecl0 $RUNLIBS/ctl            #mainframe record sizes for JCL convert
#
# init control files for JCL converter (add data later if/when required)
touch ctl/coboljava2      #<-- cobol to java name conversion
touch ctl/jclscripts      #<-- JCL PROGRAMs to be replaced by unix scripts
touch ctl/rexxprograms    #<-- REXX program-names (for JCL converter to recognize)
touch ctl/uvcopyprograms  #<-- JCL PROGRAMs to be replaced by unix scripts
touch ctl/ezt0            #<-- empty file reqd for xkshprog2
#
# init Indexed control files for JCL converter (add data later if/when required)
makeISF0 ctl/datacat52I 191 0,70  #<-- init LISTCAT info for jcldata51A & jcl2ksh51A
makeISF0 ctl/cobfil55bI 127 0,44  #<-- init recsize info for cnvMF51A COBOL conversion
makeISF0 ctl/datarcf51I 191 0,70  #<-- init recsize info for some JAVA conversions
exit 0
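A quick sanity check after running copymvsctls (a sketch; the .dat suffix on the Indexed files is assumed from the .dat/.idx convention noted elsewhere in this document):

# sketch only - confirm the conversion control files are in place in $RUNLIBS
cd $RUNLIBS
for f in ctl/jclunixop51 ctl/cnvcob5.tbl ctl/cobdirectives ctl/extfh.cfg \
         ctl/utilities ctl/lrecl0 ctl/ezt0 ctl/datacat52I.dat ctl/cobfil55bI.dat
do [[ -f $f ]] || echo "missing: $f - rerun copymvsctls"
done
ls sfun | wc -l       # functions copied from $UV/sfun (should be non-zero)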
Goto: Begin this doc , End this doc , Index this doc , Contents this library , UVSI Home-Page
# jcldata51A - extract datafile info (record-sizes,etc) from all JCL # & re-create 'datactl53I' Indexed file for JCL converter # - see http://uvsoftware.ca/jclcnv2real.htm#2B3 # - by Owen Townsend, updated Jan 2015 # jcldata51A - (this script) called by script jcl2ksh51A (for Micro Focus COBOL) # - also called by jcl2ksh53A for AIX COBOL # #Nov15/18 - change 'catcat51 to 'adddata51' (updated with ctldata53 keyword logic) #Nov10/18 - added jcldata51a to assist transfer keyword values between files # - drops topnode, could be only diff preventing transfer # - makes Indexed file for jcldata52 to lookup independent of topnode # - will transfer record sizes & typ=RDW (vs default typ=RSF) #Sep27/18 - filenames in control files forced to UPPER case by option t2 # in jcldata52 & loadctlI70, still allows lower case filenames in JCL/scripts, # since jclunix51 UPPER cases the script filenames to match the control-files # # ** uvcopy jobs & datafiles used by this script ** # # 1. uvcopy jcldata51 extracts DSNs from all JCLs in jcl2/... # & writes control file ctl/datajcl51 # # 2. uvcopy jcldata52 copies ctl/datajcl51 to ctl/datajcl52, converting the # filenames from mainframe to unix/linux script & VU standards # # 3. uvcopy ctldata53 copies ctl/datajcl52 to ctl/ctldata53, updating keyword # info by lookup Indexed files ctl/datacat52I & ctl/dataadd52I # Following files must/may exist before this script is run # # 3a. ctl/datacat52I - Indexed file created by script catdata50 # from multiple mainframe LISTCAT reports # - empty file created by copymvsctls (in case no LISTCAT) # # 3b. ctl/add/... - may be multiple sequential files in ctl/add/... subdir # - will be combined by uvcopy adddata51 & loaded into # Indexed file ctl/dataadd52I # # 3c. ctl/dataadd52I - Indexed file (interim) created by this script # from all additional info files in ctl/add/... # # 3d. $UV/ctl/add/masterctl51 - dummy file # - copied to ctl/add by copymvsctls (in case no other files) # # 3e. additional files that you may create, name as you like, for example: # ctl/datacpy52 - datafilenames & copybooknames # ctl/dataedt52 - make with editor to supply missing info as desired # ctl/datagdg52 - make with editor to supply missing GDG gnerations # # 3f. ctl/add/datactl53 - input from any prior jcl2ksh51A # - saves updates by JCL converter for reconversions by jcl2ksh51A # # 4. ctl/datactl53I - Indexed file (final result) created by this script # to supply datafile info to JCL converter # (record sizes, Indexed keylocs/keylens, file types, etc) #
Goto: Begin this doc , End this doc , Index this doc , Contents this library , UVSI Home-Page
# ** Notes ** # # 1. all uvcopy jobs stored at $UV/pf/IBM/... ($UV usually /home/uvadm) # all scripts are stored at $UV/sf/IBM/... # # 2. This script is called by jcl2ksh51A which performs all steps # of JCL conversion # # 3. You may run this script separately anytime you want to update # the JCL conversion control file (ctl/datactl53 & ctl/datactl53I) # when any 1 of the source files has been updated # # 4. You may then run script 'jclxx51 jcl2 jcl3' to reconvert all JCL # (if no new JCLs have been added to jcl0,jcl1,jcl2) # # 5. If new JCLs have been added, use 'jcl2ksh51A' (which calls this jcldata51A) # before it reconverts all JCL to ksh scripts # # 6. How To change record-sizes & file-types in the control files: # ============================================================= # - use $RUNLIBS/ctl/add/masterctl51 & $UV/sf/IBM/masterctlup51 # - see doc at begin sf/IBM/masterctlup51 #
Goto: Begin this doc , End this doc , Index this doc , Contents this library , UVSI Home-Page
# ** sources for data info files **
#
# 1. JCL, all DSNs extracted, sorted, & reduced to 1 record
#    per unique datafilename with available file info keyworded on right side
#    (rca=avg-recsize, rcm=max-recsize, key=..., etc)
#    uvcopy job#1: jcldata51 reads all JCL, writes ctl/datajcl51
#    uvcopy job#2: jcldata52 reads ctl/datajcl51, writes ctl/datajcl52
#
# 2. LISTCAT report from mainframe transferred to unix/linux,
#    - reformatted similarly to the JCL info
#    - see script 'catdata50' to extract LISTCAT info & load Indexed file
#      (runs uvcopy catdata51,catdata52,catcat51,uvcp to load Indexed file)
#    - must be executed BEFORE JCL conversion
#    OPTIONAL, if not available make empty file so JCL conversion will run
#    --------> makeISF0 ctl/datacat52I 191 0,44
#
# 3. Additional file info ctl/dataadd52I
#    uvcopy job catcat51 - combines all files in ctl/add/...
#    - creates 1 record per filename collecting significant info from multi records
#    Info files from various sources, filename on left, keyword info on right
#
#    Control filenames in ctl/add/... anything you like, such as:
#
# 3a. ctl/add/dataedt52 - create with editor, add info not found in JCL
#     - record sizes of input only files
# 3b. ctl/add/datagdg52 - number of generations for GDG files
# 3c. ctl/add/datacpy52 - copybook names, record sizes, filetypes
#     - copybooks not required for JCL convert, required for DATA convert
# 3d. ctl/add/datafile52 - Data conversion control file (created from data-files)
#     uvcopy job filedata51 - reads ctl/datafile.txt, writes ctl/datafile51
#     uvcopy job filedata52 - reads ctl/datafile51, writes ctl/datafile52
#     - this file indicates which files have packed or binary fields
#     - 1st 5000 bytes of EBCDIC datafiles are scanned for x'0C' & x'00'
#     - result coded on src=... as 'Dp' or 'Db'
#Mar08/2017 - pf/IBM/datafile51,2,3 new job names for datacnv51,2,3
#           - filedata51,2,3 new data names for filedata51,2,3
#           - but no longer used in this jcldata51A script
#
# 4. Additional file info - OPTIONAL
#    If not available make at least 1 dummy file so JCL conversion will run
#    - $UV/ctl/add/dummy_readme supplied, copied by copymvsctls script to
#      $RUNLIBS/ctl/add/dummy_readme
#
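#    For illustration, a hypothetical ctl/add/dataedt52 made with an editor
#    (filenames & values invented; keywords rcs=/typ=/key= as in the datajcl52
#     sample below, filenames UPPER case, GDG files with trailing '_'):
#
#       GL.ACCOUNT.HISTORY_   rcs=00250 typ=RSF
#       AR.CUSTOMER.ARMASTER  rcs=00256 typ=ISF key=(0,6)
#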
Goto: Begin this doc , End this doc , Index this doc , Contents this library , UVSI Home-Page
# ** sample input#1 - new datajcl52 ** # # gl.account.acntlist_ rcs=00133 typ=RSF data=____ job=jgl230 prg=CGL100 # gl.account.master2 rcs=00080 typ=ISF data=____ job=jgl360 prg=IDCAMS key=(0,6) # gl.account.master_ rcs=00080 typ=RSF data=____ job=jgl230 prg=CGL200 # gl.account.master_ rcs=00000 typ=RSF data=____ job=jgl230 prg=CGL200 # gl.account.tran1 rcs=00000 typ=RSF data=____ job=jgl230 prg=SORT # gl.account.trans_ rcs=00080 typ=RSF data=____ job=jgl230 prg=SORT # #Note - see samples of all control files at beginning of $UV/pf/IBM/ctldata53 # - filenames in control files must match filenames output by JCL converter # - mainframe '$' keyed as '_' underscores # - GDG files Identified by trailing '_' underscores #Sep27/18 - filenames in control files forced to UPPER case # by option t2 on jcldata52 & loadctlI70 # echo "jcldata51A - create 'datactl53I' data file info for JCL converter" echo " - data file info may be supplied from 3 sources" echo "JCL + LISTCAT + ADDitional sources (combined into 1 Indexed file)" echo "datajcl52 + datacat52I + dataadd52I" echo " - combined to ctl/datactl53 & loaded Indexed file ctl/datactl53I" echo " 1. this script runs jcldata51 & jcldata52 to create ctl/datajcl52" echo " 2. optional catdata51,52,catcat51 to create ctl/datacat52I" echo " - see script \$UV/sf/IBM/catdata50 extract datafile info from LISTCATs" echo " - or create empty file --> makeISF0 ctl/datacat52I 191 0,44" echo " 3. ctl/dataadd52I will be created by combining all files in ctl/add/..." echo " ctl/add/... <-- additional control files of datafile info" echo " \$UV/ctl/add/dummy_readme - use this if no additional files" reply=x; until [[ "$reply" == "y" || "$reply" == "n" ]] do echo "Have you created the dependent files described above" echo "and have already run initial 'copymvsctls' to setup empty files y/n ?" read reply; done if [[ "$reply" = "n" ]]; then echo "reply=$reply, Aborting"; exit 1; fi #
Goto: Begin this doc , End this doc , Index this doc , Contents this library , UVSI Home-Page
uvcopy jcldata51,fild1=jcl2,filo2=ctl/datajcl51,uop=q0i7r0,rop=r0
#================================================================
# - extract DSN from all JCL, sort & reduce to 1 entry with file info on right
#
uvcopy jcldata51a,fili1=ctl/datajcl51,filo1=ctl/datajcl51a,uop=q0i7,rop=r0
#=========================================================================
# - copy ctl/datajcl51 to ctl/datajcl51a, dropping the top-node
#   prior to loading the Indexed file for jcldata52
#
uvcopy loadctlI70,fili1=ctl/datajcl51a,filo1=ctl/datajcl51aI,uop=q0i7t2
#=======================================================================
# - sort filenames & load Indexed file for lookup by jcldata52
#
uvcopy jcldata52,fili1=ctl/datajcl51,filo2=ctl/datajcl52,filr2=ctl/datajcl51aI,uop=q0i7l2
#========================================================================================
# - modify filenames for scripts, GDG (0),(+1) converted to trailing '_'
# - convert embedded '$' & '#' to '_'s, leading '&&' to '__'
# - filenames in control files forced to UPPER case by option l2
#   still allows lower case filenames in JCL/scripts,
#   since jclunix51 UPPER cases script filenames to match the control-files
# - JCL converter control file 'jclunixop51' option c0/c2 determines lower/UPPER case
#Nov10/18 - new, using Indexed file, if rcs=_____ or typ=RSF lookup ctl/jcldata51aI
#         - to add keyword values if present to keyword=... missing in jcldata51
#
echo "datafilenames extracted from jcl2/... to make control file of recsize,etc"
echo "enter to combine any ADDitional info in ctl/add/... & load Indexed file"
read reply
#
uvcopy adddata51,fild1=ctl/add,filo1=ctl/dataadd51,uop=q0i7,rop=r0
#=================================================================
#Nov15/18 - change 'catcat51' to 'adddata51' (updated with ctldata53 keyword logic)
# combine all additional files of datafile info & load Indexed file
# - sort/reduce any duplicate filenames to 1 entry
#   - combining significant keyword=... values on right side
# uvcp "fili1=ctl/dataadd51,rcs=191,typ=LST,key1=0(44),filo1=ctl/dataadd52I\
#      ,typ=ISF,isk1=0(44),trl=0(44)"
#Apr05/17 - replace uvcp with loadctlI70
#
uvcopy loadctlI70,fili1=ctl/dataadd51,filo1=ctl/dataadd52I,uop=q0i7t2
#====================================================================
#Mar26/18 - loadctlI70 case option default t2=UPPER, t1=lower
#Jun26/17 - uop=t2 UPPERcase for dataadd52I to match JCL filenames
# Load Indexed file of additional info to combine with JCL & LISTCAT info
#
Goto: Begin this doc , End this doc , Index this doc , Contents this library , UVSI Home-Page
echo "enter to combine info from JCL, LISTCAT,& ADDitional & load Indexed file" echo "- Indexed file ctl/datactl53I.dat/.idx used by JCL converter" read reply # uvcopy ctldata53,fili1=ctl/datajcl52,filo1=ctl/datactl53\ ,filr2=ctl/datacat52I,filr3=ctl/dataadd52I,rop=r0,uop=q0i7 #========================================================= # - copy ctl/datajcl52 to ctl/ctldata53, updating keyword info # by lookup Indexed files ctl/datacat52I & ctl/dataadd52I # uvcopy loadctlI70,fili1=ctl/datactl53,filo1=ctl/datactl53I,uop=q0i7t2 #==================================================================== #Mar26/18 - loadctlI70 case option default t2=UPPER(Acerta), t1=lower # Load Indexed file used by JCL converter to get recsize, keys, etc #Jun26/17 - uop=t2 UPPER case for JCL converter Indexed file # ##Oct08/17 - remove jclfiles54I, replaced by ctl/add/masterctl51 for cpy & rcs,etc updates ## Additions in jcl2ksh54A for Rexx & Java (vs jcl2ksh51A COBOL sites) ## - to recreate ctl/datactl53I with copybook name & record-size ## - may comment out here in jcldata51 (clients without jclfiles54I) or make empty jclfiles54I ## uvcopy jclfiles54,fili1=ctl/datactl53,filo1=ctl/datactl53a,fili2=rpts/jclfiles/jclfiles54I\ ## ,fili3=ctl/dataadd52I,uop=q0i7,rop=r0 ##========================================================================================= ## - copy adding copybook & recsize from rpts/jclfiles/jclfiles54I if not already present ## - then override from ctl/dataadd52I ## uvcopy loadctlI70,fili1=ctl/datactl53a,filo1=ctl/datactl53I,uop=q0i7t2 ##======================================================================= #Mar26/18 - loadctlI70 case option default t2=UPPER(Acerta), t1=lower ##Oct08/17 - above jclfiles54 & datactl53a now obsolete # uvcopy ctldata54,fili1=ctl/datactl53,fili2=ctl/datactl54symbols,filo1=ctl/datactl54,uop=q0i7,rop=r0 #=================================================================================================== #Oct08/17 - now using ctl/datactl53 (vs ctl/datactl53a obsoleted by ctl/add/masterctl51) # - copy to ctl/datactl54 resolving $SYMBOLS # uvcopy loadctlI70,fili1=ctl/datactl54,filo1=ctl/datactl54I,uop=q0i7t2 #==================================================================== #Mar26/18 - loadctlI70 case option default t2=UPPER(Acerta), t1=lower # - load Indexed file ctl/datactl54I for expfileini function & uvcopy fileini2 # #Aug13/17 - ctldata54 & loadctlI70 to load ctl/datactl54I UNcommented # - so ctldata54/loadctlI70 can be removed from jcl2ksh54A (were following jcldata51A) # - and ctldata54A could be eliminated since both call this jcldata51A exit 0 #------------------------ end of $UV/sf/jcldata51A ----------------------
Goto: Begin this doc , End this doc , Index this doc , Contents this library , UVSI Home-Page
#!/bin/ksh # catdata50 - script to extract info from MVS LISTCAT reports # - relevant info used to automate JCL & DATA file conversions # - by Owen Townsend, UV Software, July 22/2014 # - runs uvcopy jobs catdata51, catdata52, & sort/load # Indexed file of LISTCAT info for JCL & DATA conversions # - see www.uvsoftware.ca/jclcnv2real.htm#2C1 # # MVS LISTCAT reports FTP'd to the unix & stored in subdir ctl/cat0/... # - may be multiple reports, named for various mainframe discs # catdata51 extracts desired items & writes to subdir ctl/cat1/... # - AVGLRECL, MAXLRECL, RKP, KEYLEN, REC-TOTAL, GDG LIMIT # - code file info as keywords=... on right side of filename # if [[ "$1" == "all" && -d ctl/cat0 && -d ctl/cat1 && -d ctl/cat2 && -d ctl ]]; then : else echo "usage: catdata50 all" echo " =============" echo "- arg1 must be 'all' & subdirs ctl,ctl/cat0,ctl/cat1,ctl/cat2 must be present" exit 91; fi # export UVCOPYROP=i30 # inhibit prompts for uvcopy jobs # uvcopyx catdata51 ctl/cat0 ctl/cat1 uop=q0i7,rop=r0 #================================================== # - extract file info from MVS LISTCAT reports (recsize, key loc/len, etc) # - script uvcopyx repeats catdata51 for all files in directory # uvcopyx catdata52 ctl/cat1 ctl/cat2 uop=q0i7,rop=r0 #================================================== # - translate filenames to lower case, append trailing '_' for GDG files # - convert any embedded '$ '#' to '_'s,& sort by filename # - script uvcopyx repeats catdata52 for all files in directory # uvcopy lrecl0cat2,fili1=ctl/lrecl0,filo1=ctl/cat2/lrecl2 #======================================================= # - convert mainframe LRECL listing to LISTCAT UV format # - to be merged with other ctl/cat2/... files # - lrecl2cat1 job added to this script sf/IBM/catdata50 Dec01/2015 # - copymvsctls creates empty ctl/lrecl1 in case LRECL listing unavailable # uvcopy catcat51,fild1=ctl/cat2,filo1=ctl/datacat51,uop=q0i7,rop=r0 #================================================================= # sort/reduce any duplicate filenames to 1 entry # - combining significant keyword=... values on right side # uvcp "fili1=ctl/datacat51,rcs=191,typ=LST,filo1=ctl/datacat52I,typ=ISF,isk1=0(44)" #================================================================================= # - load indexed file, keyed by filename # - to lookup & transfer info to JCL & DATA conversion jobs exit 0
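A typical run might look like the following sketch; the report name staged into ctl/cat0/ and the dataset grepped at the end are hypothetical:

# sketch only - stage the transferred LISTCAT reports & extract the file info
cp /tmp/listcat_sysda1.rpt ctl/cat0/        # one or more transferred LISTCAT reports
catdata50 all                               # runs catdata51, catdata52, catcat51 & load
grep -i 'gl.account.master' ctl/datacat51   # confirm recsize/key info was captured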
Goto: Begin this doc , End this doc , Index this doc , Contents this library , UVSI Home-Page
# xlsdata51 - convert spreadsheet .csv to control-file for data conversion # - by Owen Townsend, UV Software, March 10/2015 # - updated for data conversions vs JCL conversions March 2017 # # uvcopy xlsdata51,fili1=ctl/xlsdata51.csv,filo1=ctl/cat2/dataxls51 # ================================================================= # # Input file ---> ctl/xlsdata51.csv # - named as shown above & in ctl/... subdir, or enter alternate at prompt # - must be in comma-delimited format (extension .csv) # # Output file ---> ctl/cat2/xlsdata51 # - same name but '.csv' suffix dropped (actually does not matter) # - could have any name desired BECAUSE: # - uvcopy catcat51 (part of jcldata51A script) # - will combine all files found in ctl/caat2/... into ctl/datacat51 # which is loaded into Indexed file ctl/datacat51I.dat/.idx # # ** sample input ** # # DSN,copybook,Organization,Reclen,Block,RecFormat,Allocated,USED,Extents,created,Referenced # ar.customer.armaster,armaster,PS,256,25600,FB,800,50,1,9/30/2014,3/14/2015, # ar.customer.arsales,arsales,PS,64,6400,FB,750,25,1,8/25/2014,2/17/2015, # gl.account.master,glmaster,PS,128,12800,FB,750,25,1,8/25/2014,2/17/2015, # gl.account.gltran,gltran,PS,80,8000,FB,750,25,1,8/25/2014,2/17/2015, # vendor.master.names,vendormas,PS,100,10000,VB,750,25,1,8/25/2014,2/17/2015, # vendor.master.payments,vendorpaymas,PS,100,10000,VB,750,25,1,8/25/2014,2/17/2015, # # ** sample outut ** # # ar.customer.armaster cpy=armaster rca=00256 rcm=00256 rcs=00256 data=ps # ar.customer.arsales cpy=arsales rca=00064 rcm=00064 rcs=00064 data=ps # gl.account.master cpy=glmaster rca=00128 rcm=00128 rcs=00128 data=ps # gl.account.gltran cpy=gltran rca=00080 rcm=00080 rcs=00080 data=ps # vendor.master.names cpy=vendormas rca=00100 rcm=00100 rcs=00100 data=ps # vendor.master.payments cpy=vendorpaymas rca=00100 rcm=00100 rcs=00100 data=ps # opr='$jobname - convert spreadsheet .csv to control-file for data conversion' rop=r1 # prompt for file disposition at EOF fili1=?ctl/dataxls51.csv,typ=LST,rcs=256 filo1=?ctl/cat2/dataxls51,typ=LSTt,rcs=100 # load keyword template # 1 2 3 4 5 6 7 8 #12345678901234567890123456789012345678901234567890123456789012345678901234567890 lod=h0(100) cpy=____________ rca=_____ rcm=_____ rcs=_____ data=_______ cpy=____________ rca=_____ rcm=_____ rcs=_____ data=_______ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
Goto: Begin this doc , End this doc , Index this doc , Contents this library , UVSI Home-Page
@run opn all get fili1,a0 bypass 1st record (header) # # begin loop to get/process/put records until EOF man20 get fili1,a0 get next record skp> man90 (cc set > at EOF) cmc a0(1),'#' #cmt line to bypass ? skp= man20 # drop #cmt lines disabled so demo file #cmts retained for doc add $ca1,1 count records in trl a0(100) translate to lower case # # convert csv to fixed fields 100 bytes apart fix b100(100),a0(100),15,',' fix .csv fields 100 apart # # refresh keyword template & insert desired data # store recsize in rca=... & rcs=..., store organization & format in typ=org_fm_ man30 mvc h0(80),h100 refresh template scn h04(20),' ' find length of _________ clr h04($rx20),' ' clear __________ mvu h04(20),b200,' ' store copybook til ending blank mvn $ca3,b400(5) capture record size mvn h21(5),$ca3 rca=... mvn h31(5),$ca3 rcm=... mvn h41(5),$ca3 rcs=... mvc h52(4),b300 organization mvc h56(3),b600 record format # # bypass records without rec-size, filenames ending in .data or .index, # & file organization 'pds' man50 cmn $ca3,0 any record size ? skp= man20 scn b100(50),'.data ' skp= man20 scn b100(50),'.index ' skp= man20 cmc h52(3),'pds' skp= man20 # # store filename in output area followed by keywords man60 mvc c0(50),b100 filename to output area mvc c46(80),h0 append keywords put filo1,c0(100) add $ca2,1 count records out skp man20 # # EOF - close files & exit man90 cls all eoj
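A small usage sketch, assuming the spreadsheet was exported as ctl/dataxls51.csv (the fili1 default prompted above):

# sketch only - convert the spreadsheet & check the keyword output
uvcopy xlsdata51,fili1=ctl/dataxls51.csv,filo1=ctl/cat2/dataxls51
#================================================================
head -5 ctl/cat2/dataxls51   # expect: filename cpy=... rca=... rcm=... rcs=... data=ps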
Goto: Begin this doc , End this doc , Index this doc , Contents this library , UVSI Home-Page
#!/bin/ksh # xrefall - Korn shell script from UVSI stored in: /home/uvadm/sf/util # xrefall - script to generate all xref reports in batch mode # --> cd $RUNLIBS - change above dirs to be xrefd (cbls & jcls) echo "xrefall - generate COBOL & JCL cross-ref reports in subdir xref" if [[ -d "$1" && -d "$2" && -d "$3" ]] then : else echo "usage: xrefall COBdir JCLdir KSHdir" echo " ============================" echo "example: xrefall cbls jcl1 jcl3/jcl4 <-- jcl3 or jcl4 during conversion" echo " =========================== - scripts moved to jcls as debugged" echo "example: xrefall cbls jcl1 jcls <-- use jcls after conversion" echo " ======================" exit 1 fi cdir="$1"; jdir="$2"; kdir="$3"; #setup $symbols for coboldir & jcldir reply=x until [[ "$reply" = "y" || "$reply" = "n" ]] do echo "OK to remove all old reports from xref/ & recreate new y/n ?" read reply;done; if [[ "$reply" == "y" ]]; then rm -f xref/*; fi # copy utilities from uvadm/ctl in case copymvsctls not already run cp $UV/ctl/utilities ctl # used by xkshprog1 & xkshprog2 # code cobols for xrefprog1/chkprog1/2 to +flag EXEC SQL, @flag EXEC CICS uvcopy cblcode1,fild1=$cdir,filo1=ctl/cblscoded,filo2=sf/cbl0copycode # ls ezt0 >ctl/ezt0 # Easytrieve programs for ^flags # ezt0/... files must be lower case with .ezt extension # export XREFALL=Y # set env-var to stop prompts for vi/uvlp12 xcobcall1 $cdir q0i7 # list calls in any 1 program xcobcall2 $cdir q0i7 # programs calling any 1 called-program xcobcopy1 $cdir q0i7 # list copybooks in any 1 program xcobcopy2 $cdir q0i7 # programs copying any 1 copybook #xcobfile2 $cdir q0i7 # programs using any 1 external filename xjclproc1 $jdir q0i7 # list PROCs in any 1 jcl/script xjclproc2 $jdir q0i7 # jcl/scripts calling any 1 PROC # xkshfile1 $kdir q0i7 # list datafiles in any 1 jcl/script xkshfile2 $kdir q0i7 # jcl/scripts using any 1 datafile xkshparm1 $kdir q0i7 # list parms/... in any 1 jcl/script xkshparm2 $kdir q0i7 # jcl/scripts referencing any 1 parms/... # xkshparmsd1 $kdir q0i7 # list parmsds/.../... in any 1 jcl/script # xkshparmsd2 $kdir q0i7 # jcl/scripts referencing any 1 parmsds/.../... xkshprog1 $kdir q0i7 # list programs in any 1 jcl/script xkshprog2 $kdir q0i7 # jcl/scripts executing any 1 program # xkshprog2a $kdir q0i7 # jcl/scripts executing any 1 program #Apr10/2017 - xkshprog2a (sum COBOLs) replaced by uvcopy xkshdrop1 & various options # #Jan12/2019 - xkshprog3 # XREFID=JAVA:prog,job & convert :/_, to blanks xkshprog3 $kdir q0i7 # jcl/scripts executing any 1 program #Jan14/2019 - xkshprog3a # XREFID=REXX/prog & dsnutilb_LOAD xkshprog3a $kdir q0i7 # jcl/scripts executing any 1 program #
Goto: Begin this doc , End this doc , Index this doc , Contents this library , UVSI Home-Page
echo " " echo "summarize PROC xrefs ? - xrefdrop1 will solicit options to drop high volume refs" echo " - default uop=m1n1 2 lines per PROC, 1st 4 xrefs & 2nd line with total count" echo " - suggest m2n1 to drop PROCs with only 1 reference" uvcopy xrefdrop1,fili1=xref/xjclproc2,filo1=xref/xjclproc2_,uop=m1n1 #=================================================================== # - summarize PROC xrefs - drop less than 2 refs, keep 1st & last lines of group # echo " " echo "select UTILITY PROGRAM xrefs ? - default uop=u1 will select all utilities" echo " - uses table of known utilities in $RUNLIBS/ctl/utilities" echo " - enter option u3 to be prompted for utilities to omit (example - iefbr14)" echo " - may enter multiples with ':' colon delimiters (ex - iefbr14:sort)" echo " - enter option u4 to be prompted for utilities to select" echo " - enter option u8 to drop all utilities & show only COBOL/JAVA programs" uvcopy xrefdrop1,fili1=xref/xkshprog2,filo1=xref/xkshprog2_,uop=u1 #================================================================= uvcopy xrefdrop1,fili1=xref/xkshprog3a,filo1=xref/xkshprog3a_,uop=u1 #=================================================================== # - select PROGRAM xrefs UTILITIES only - default all lines of all utilities # echo " " echo "select UTILITY PROGRAMs AND minimize xref lines per utiltiiy" echo " - default uop=m1n1u1 2 lines per utility, 1st 4 xrefs & 2nd line with total count" echo " - enter option u3 to be prompted for utilities to omit (example - iefbr14)" echo " - may enter multiples with ':' colon delimiters (ex - iefbr14:sort)" echo " - enter option u4 to be prompted for utilities to select" echo " - enter option u8 to drop all utilities & show only COBOL/JAVA programs" uvcopy xrefdrop1,fili1=xref/xkshprog2,filo1=xref/xkshprog2_,uop=m1n1u1 #===================================================================== uvcopy xrefdrop1,fili1=xref/xkshprog3a,filo1=xref/xkshprog3a_,uop=m1n1u0 #======================================================================= # - summarize PROGRAM xrefs for UTILITIES only - 1st line + total # echo " " uvcopy xrefselect1,fili1=xref/xkshprog1,filo1=xref/xkshprog1_,arg1=db2_dsn_dbt #============================================================================= # - select refs by up to 3 prefixes ex: arg1=db2_dsn_dbt # # create Missing Files reports (selects '*' col1 from all xrefs) echo "will mkdir xrefmissing &/or remove all files from xrefmissing OK" test -d xrefmissing || mkdir xrefmissing rm -f xrefmissing/* uvcopy selectrmp3f,fild1=xref,fild2=xrefmissing,uop=a0b1,arg1='*' #================================================================ #
Goto: Begin this doc , End this doc , Index this doc , Contents this library , UVSI Home-Page
# show Op. Instrns. to print reports from xref subdir (laser print at 12 cpi) echo " " echo "---------------------------------------------------------------" echo "reports generated in xref subdir, view/print command examples:" echo "ls -l xref <-- list filenames of reports in xref subdir" echo "uvlp12 xref/??? <-- print 1 specified report" echo "uvlp12D xref/??? <-- print 1 report DUPLEX" echo "uvlpd12 xref <-- print ALL reports in xref subdir Simplex" echo "uvlpd12D xref <-- print ALL reports in xref subdir DUPLEX" echo "more xref/* <-- display all reports in xref subdir" echo "--> enter to see follow-on reports based on crossrefs"; read reply; echo " " echo "--> suggest 'Not Used' reports for copybooks, programs, parms, procs" echo "mkdir xnu <-- make subdir to receive Not Used reports" echo "xnu1 xref/xcobcopy2 cpys <--Not Used for copybooks" echo "xnu1 xref/xcobprog2 cbls <--Not Used for programs" echo "xnu1 xref/xkshparm2 parms <--Not Used for parms" echo "xnu1 xref/xkshproc2 procs <--Not Used for procs" echo " " echo "--> suggest also run cobfiles5A to report data-files in all COBOL programs" echo "cobfiles5A cbls cpys maps <-- creates ctl/cobfiles & xref/cobfiles" echo "========================= - also required by JCL conversions" echo "uvlp12 xref/cobfiles <-- report useful during test/debug" echo "mvsfiles5A jcl2 xmvsA <-- report data files req'd for ALL JCL/scripts" echo "mvsfiles51 jcl2/jobname.jcl xmvs <-- files for any 1 JCL/script" echo "mvsfiles5B jcl2 xmvs <-- files for ALL JCLs (separately)" echo "================================ - reports in xmvs/jobname/..." echo " " echo "--> run xrefdrop1 to drop low frequency programs from crossref rpts" echo "--- to highlight high volume programs & utilities" echo "uvcopy xrefdrop1,fili1=xref/xkshprog2,filo1=xref/xkshprog2_,uop=m1n1u1" echo "======================================================================" echo " - m1n1u1 - u1=utilities summary, 1st line only + total (ran above)" echo " - suggest uop=m10,m50,or m100 to drop more low-freqs, highlight high-freqs" echo " - combine with 'uop=n1', to show just 1st line+total for each program" echo " " echo "uvcopy xrefdrop1,fili1=xref/xjclproc2,filo1=xref/xjclproc2_,uop=m1n1" echo "====================================================================" echo " - summarize PROC xrefs - 1st line + total for each proc (ran above)" echo " " echo "--> run xrefselect1 to investigate specified programs by up to 3 prefixes" echo "uvcopy xrefselect1,fili1=xref/xkshprog1,filo1=xref/xkshprog1_db2,arg1=db2_dsn_dbt" echo "=================================================================================" echo "- select refs to 'db2' from xkshprog1, programs for each JCL (ran above)" echo " " echo "Missing File reports created in xrefmissing/..." echo "uvcopy select2d,fild1=xref,fild2=xrefmissing,uop=a0b1,arg1='*'" echo "==============================================================" echo "ls -l xrefmissing/* <-- list missing file reports" echo "vi xrefmissing/* <-- inspect Missing Files reports" echo "uvlpd13D <-- print all if no big files (separate reports)" echo "listall1 xrefmissing <-- combine multiple small files on 1 report" echo " " exit 0
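A typical batch run during conversion, following the usage and follow-on suggestions shown above (jcl3 holding the converted scripts):

# sketch only - generate all crossrefs, then the 'Not Used' copybook report
xrefall cbls jcl1 jcl3       # COBOL dir, original JCL dir, converted scripts dir
mkdir xnu                    # subdir to receive Not Used reports (as suggested above)
xnu1 xref/xcobcopy2 cpys     # copybooks not copied by any program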
Goto: Begin this doc , End this doc , Index this doc , Contents this library , UVSI Home-Page
#!/bin/ksh # xkshprog2 - crossref ksh JCL/scripts executing each PROGRAM # - uvcopy chkprog2 codes '*' missing & ALSO '+' EXEC SQL # xkshprog2a - alternate version to consolidate COBOL & Easytrieve to cobol, coboldb2, easytrv echo "xkshprog2 - crossref to show all ksh SCRIPTS executing each PROGRAM" echo "- codes DB2 cobols '+', CICS cobols '@', codes DFSRRC00 with '^'" echo "- must have run uvcopy cblcode1 --> ctl/cblscoded .cbl/.cdb/.cic extnsns" export JOBID="xkshprog2" # xkshprog2 jcls [options] <- default indir & options as follows # xkshprog2 jcls a18b16c4j4l1w1 <- 2 sets of options for 2 reports # a18 - column to begin references # b16 - width allowed for each reference # c4 - number of references per line # j4 - ksh script (bypass # in col1) # l1 - lower case, progid= & program( same # w1 - progname is 1s word after PROGID dir="$1" if [[ ! -d "$dir" ]]; then echo "usage: xkshprog2 directory options - arg1 must be directory" echo " ============================" echo "sample: xkshprog2 jcls a18b16c4j4 - arg2 options default as shown" echo " ==========================" echo "options: a18=cols for filenames, b16=cols for programnames" echo " c4=programnames/line, j4=ksh scripts (ignore # col1)" echo " w1=programname is 1st word following PROGID=" exit 1; fi # setup default options, if not specified - use defaults ops=a18b16c4d8j4l1$2; # append any user options to defaults #Oct02/07 - option l1 & lower case progid:program for IKJEFT01 same if [[ ! -d tmp ]]; then mkdir tmp; fi #make tmp if not already present # init output file, will append grep output for each program >tmp/grep2 for i in $dir/* do \grep -i ' PROGID=' $i /dev/null >>tmp/grep2 done # \grep disables any alias grep='grep -n' would create 2 ':'s vs 1 # create full path name of directory, if not already if [[ $dir = /* ]]; then DIR=$dir; else DIR=${PWD}/$dir; fi export DIR=$DIR export TITLE="crossref to show all ksh SCRIPTS executing each PROGRAM" uvcopy xref2,fili1=tmp/grep2,filo1=tmp/xkshprog2,arg1=progid,arg2=~~,arg3=~~,uop=$ops #Jan12/19 - arg1=progid for xkshprog2, and arg1=xrefid for xkshprog3 & convert :/_, to blanks # run chkprog1 to *flag missing COBOL programs & +flag database programs #============================================================================================= # run chkprog2 to *flag missing COBOL programs # - xref2 above wrote to tmp/xkshprog2, now copy to xref/xkshprog2 #Sep05/15 - uvcopy chkprog2 to code '*' missing, '+' EXEC SQL, '@' EXEC CICS, '^' DFSRRC00 #Feb16/18 - remove mkdir ezt0, just assume ctl/ezt0 exists # test -d ezt0 || mkdir ezt0; ls ezt0 >ctl/ezt0 # allow for sites with no easytrieve programs uvcopy chkprog2,fili1=tmp/xkshprog2,filo1=xref/xkshprog2,fili2=ctl/cblscoded,fili3=ctl/ezt0,fili4=ctl/utilities,rop=r0 #============================================================================================== # inhibit prompt for vi/uvlp if batch run (batch will print all xref) if [[ "$XREFALL" != "Y" ]]; then echo "report generated = xref/xkshprog2" echo " - use uvlp12,uvlp14,uvlp16 to laser print at 12,14,16 cpi" echo "--> enter command (vi,cat,more,uvlp12,uvlp14,uvlp16,etc)" read ans; if [[ ! "$ans" = "" ]]; then $ans xref/xkshprog2; fi fi exit 0
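The raw material for this crossref is simply every 'PROGID=' line collected from the scripts; a one-line sketch to preview what will be cross-referenced (the /dev/null forces grep to prefix each hit with the script name, exactly as in the loop above):

# sketch only - preview the PROGID= lines xkshprog2 will collect from jcl3/...
\grep -i ' PROGID=' jcl3/* /dev/null | head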
Goto: Begin this doc , End this doc , Index this doc , Contents this library , UVSI Home-Page
# xref2csvall - uvcopy Parameter File from UVSI stored in: /home/uvadm/pf/util/ # xref2csvall - convert crossrefs to .csv files , summary & detail, 4-up & 1-up # # xref2csvall xref/xrefCA7jcl2 <-- sample usage for CA7 # ============================ - may specify any crossref in the xref/... subdir # # Must have already run the cross-ref script to create the desired cross-ref # input file for this script - various crossref scripts available, for example: # # --crossref-- dir --crossref report-- # xrefCA7 jcl2 --> xref/xrefCA7jcl2 # - crossref CA7 commands DGOTO,etc and &C_ variables # xmvsproc2 jcl1 --> xref/xmvsproc2 # - crossref PROC calls in jcl1/... # xkshprog2 jcl3 --> xref/xkshprog2 # - crossref programs called by JCL/Korn shell scripts # # xref2csvall xref/xrefCA7jcl2 <-- sample usage for CA7 crossrefs # ============================ - may specify any crossref in the xref/... subdir # ----- output multiple files as follows ----- # -jobname- ----- input file ----- ----- output file ----- # xrefsumdtl xref/xrefCA7jcl2 xrefsum/xrefCA7jcl2.sum # xrefdtl/xrefCA7jcl2.dtl # xref2csv4 xrefsum/xrefCA7jcl2.sum xrefsumcsv/xrefCA7jcl2_sum.csv # xref2csv4 xrefdtl/xrefCA7jcl2.dtl xrefdtlcsv/xrefCA7jcl2_dtl.csv # xrefcsv421 xrefsumcsv/xrefCA7jcl2_sum.csv xrefsumcsv/xrefCA7jcl2_sum1.csv # xrefcsv421 xrefdtlcsv/xrefCA7jcl2_dtl.csv xrefdtlcsv/xrefCA7jcl2_dtl1.csv # # xrefCA7 jcl2 <-- must have already run appropriate script for desired crossref # ============ - sample usage for CA7 crossrefs # - may specify any crossref in the xref/... subdir # echo "xref2csvall - convert crossrefs to .csv files , summary & detail, 4-up & 1-up" echo " - crossref input must be in xref/... (xrefCA7jcl2, xkshprog2, xmvsproc2, etc)" df="$1" if [[ ! -f "$df" ]]; then echo "usage: xref2csvall xref/xrefCA7jcl2 <-- sample for CA7 crossrefs" echo " ============================" echo " - arg1 must exist in xref/... already created by appropriate crossref script" exit 91; fi #
Goto: Begin this doc , End this doc , Index this doc , Contents this library , UVSI Home-Page
f=$(basename $df) #<-- extract base filename for xref/... # uvcopy xrefsumdtl,fili1=xref/${f},filo1=xrefsum/${f}.sum,filo2=xrefdtl/${f}.dtl,rop=r0q0 #======================================================================================= # - create both summary & detail reports prior to conversion to .csv below # uvcopy xref2csv4,fili1=xrefsum/${f}.sum,filo1=xrefsumcsv/${f}_sum.csv,uop=n10,rop=r0q0 #===================================================================================== # - convert CA7 xref crossref SUMMARY to .csv (4 references/line) # uvcopy xref2csv4,fili1=xrefdtl/${f}.dtl,filo1=xrefdtlcsv/${f}_dtl.csv,uop=n10,rop=r0q0 #===================================================================================== # - convert CA7 xref crossref DETAIL to .csv (4 references/line) # uvcopy xrefcsv421,fili1=xrefsumcsv/${f}_sum.csv,filo1=xrefsumcsv/${f}_sum1.csv,rop=r0q0 #====================================================================================== #- convert CA7 xref crossref SUMMARY .csv from 4 refs/line to 1 ref/line # uvcopy xrefcsv421,fili1=xrefdtlcsv/${f}_dtl.csv,filo1=xrefdtlcsv/${f}_dtl1.csv,rop=r0q0 #====================================================================================== # - convert CA7 xref crossref DETAIL .csv from 4 refs/line to 1 ref/line # echo "files created by this script $0 are:" ls -l xref*/${f}* #================
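For example, for the CA7 crossrefs (the first usage shown above):

# sketch only - build the CA7 crossref, then the summary/detail .csv files
xrefCA7 jcl2                     # creates xref/xrefCA7jcl2 (CA7 commands & &C_ variables)
xref2csvall xref/xrefCA7jcl2     # writes the .sum/.dtl reports & 4-up/1-up .csv files
ls -l xref*/xrefCA7jcl2*         # same listing the script prints at end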
Goto: Begin this doc , End this doc , Index this doc , Contents this library , UVSI Home-Page
This part lists the more important control-files that you may need to update for your conversion preferences. Each entry is identified as an active link or a dead link: active links end with a trailing period ('12A1.'), dead links with a trailing underscore ('12A2_').
If you have the Vancouver utilities installed, you can see all control files in /home/uvadm/ctl/...
12A1. | jclunixop51 - control-file for JCL conversion to Micro Focus COBOL |
- JCL converter is 'jclunix51.c' |
12B1_ | jclunixop53 - control-file for JCL conversion to AIX COBOL |
- JCL converter is 'jclunix53.c' |
12C1. | Example updates to jclunixop51 options depending on site requirements |
- changing lower case defaults to UPPER for filenames,programnames,parmnames | |
- changing default COBOL calls to JAVA | |
- changing parms default 1 combined library to multiple parmsds/subdirs | |
(if your parmnames are not unique) | |
- changing the COBOL call from default Micro Focus .ints to Windows, | |
executables, unikix, natural, etc. |
12C2. | Example updates to jclunixop51 - search/replace tables |
- replace hard-coded IP#s with $variables | |
- $variable definitions in profiles of scheduler/operators & programmers | |
- so programmer testing can FTP to test sites vs production sites |
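For illustration, a hypothetical entry of the kind 12C2 describes; the IP#, the $FTPHOST name, and the choice of the :REPTBL2: output table are all assumptions (the table layouts appear in the jclunixop51 listing below):

# sketch only - replace a hard-coded production IP# with a profile $variable
# entry for :REPTBL2: in ctl/jclunixop51, tilde filled to the column widths shown below
# (01-30=search, 31-80=replace, 81-100=qualifier)
192.168.10.20~~~~~~~~~~~~~~~~~$FTPHOST~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
# and in the scheduler/operator & programmer profiles, for example:
export FTPHOST=192.168.99.20     # test site IP# for programmers (production differs)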
Goto: Begin this doc , End this doc , Index this doc , Contents this library , UVSI Home-Page
jclprocop51:d0h0i0q5y4 #<-- options for jclproc51 (MF & AIX COBOL) (q1->q5 Apr2020) # ==================== --> this is '*jclunixop51' for Linux vs 'jclunixop53' for AIX COBOL #*$UV/ctl/jclunixop51 - Vancouver Utilities JCL conversions for Linux COBOL/JAVA # $UV/ctl/jclunixop53 - Vancouver Utilities JCL conversions for AIX COBOL # *jclprocop51 options (line#01) used by jclproc51 (PROC expand) # jclunixop53 options (line#20) used by jclunix53 (JCL converter for AIX COBOL) # ----------------- jclprocop51 options for jclproc51 PROC expansion ------------------ # d0 - DD overrides inhibited - default OK if you do not use overrides # d1 - DD overrides activated - change d0 to d1 if you use DD overrides # d4 - debug dump jct1 table before & aafter marksteps1 & after marksteps2 # h0 - do NOT replace &symbols in :NOREPLACE: table (near end this file) # h1 - replace all &SYMBOLs if value declared (will be $SYMBOLs in ksh) # i0 - default includes directory is $RUNLIBS/includes (Apr01/2017) # i2 - change includes directory to $RUNLIBS/parms (Apr01/2017) # q1 - remove //* comments from PROCs <-- activated Feb02/18 for Asite # q2 - remove //* comments from JCLs (will be #comments in scripts) # q3 - remove //* comments from Both PROCs & JCLs # q5 - remove //* comments from PROCs if < 4 alphas (drop all '*' lines) # y4 - replace all &symbols with values (vs only on filenames) # jclunixop51:a2b0c8d0e2f1g1h0i0j0k15l1999m1n3o8p0q0r0s0t15u0v0w0x0y6z0 #<- mvstest options # ========================================================= # Apr29/20 - file type option changes f1 typ=LST, f3(f1+f2) force typ=LST regardless datactl53I # Sep12/19 - j5 to convert COBOL calls to JAVA calls # May20/18 - c8 UPPERcase filenames to lookup datactl53I req'd for lowercase filenames # a1 - EXEC *ABEND* will goto S9900 Abterm # a2 - minimize ABEND step code (Aug18/13) # a4 - inhibit ABEND steps (EXEC #commented) # a8 - inhibit IEFBR14 steps with no DD's (Jun01/12) # b1 - remove '$$' from '$$SYMBOL$$'s in filenames # b2 - assign &&/__temp files to data1/__... vs $JTMP/__... 
# c0 - all filenames lower case # c1 - program names UPPER case # c2 - filenames UPPER case (also uop=l2 jcldata52) # c4 - control card modulenames UPPER case # c8 - UPPERcase filenames to lookup datactl53I # d1 - drop DDNAMES that begin/end with LOAD/LIB # d2 - drop DSNAMES that begin/end with load/lib # d4 - used for debug dump in jclproc51 # d8 - inhibit sort field -4 VB/RDW for diff report # e1 - echo run command to test script w/o programs # e2 - option '-v' on cp & mv echo to console log # f0 - default file typ=RSF (allows packed/binary) # f1 - default uvsort/uvcp file typ=LST # f2 - over-ride file typ by ctl/datactl53I (if filename match) # f3 - f1+f2 force typ=LST regardless datactl53I # g0 - new GDGs data/, use exportgen2 vs exportgen1 # g1 - new GDGs jobtmp/, move to data/ at Normal EOJ # g2 - new GDGs $RUNDATA/jobtmp/, move to data at EOJ # - g0g1g2 warnings only, see jobset51/exportgen1 # h0 - NO replace for &SYMBOLs in :NOREPLACE: table # all other &SYMBOLS replaced if value available # h1 - DO replace the &SYMBOLs in :NOREPLACE: table # generate data ctl file with expanded filenames # i0 - lookup datactl53I for rcs,typ,key (always 2014+) # i1 - update datactl53I input recsizes from outsizes # i2 - lookup cobfil55bI for recsize, updt datactl53I if 0 # i4 - update datactl53I w cobfil55bI recsize even if non-zero # i8 - update datactl53I typ=LSTt if L/46 cobfil55bI # i16 - update datactl53I input rcs from out even if non-zero # i32 - set typ=LSTt if recsize 132/133 on SORT,IEBGENER,etc # i64 - updt data=pb,typ=RSF if input pb & output not (or vv) # j0 - default COBOL calls as per option 'r' # j1 - generate JAVA calls (vs COBOL) for COBOLs called directly # j2 - include -DDDNAME=$DDNAME for each file in step # j4 - generate JAVA calls for COBOLs called by IKJEFT01 # j8 - change output script file extension from .ksh to .java # j16 - insert # elif ((S0020 == 4)); then & # alias goto ... # k1 - FTP: insert 'open' prior to 1st line if IP# # k2 - FTP: insert 'user' prior 1st word if not cmd # k4 - copy FTP commands from lib module to instream # k8 - use FTP option '-u' to inhibit authorization # l1999 - recsize default for typ=LSTt option f5/f6 # m0 - generate exportfile modules original filename # m1 - $RUNLIBS/parms/... (or $RUNLIBS/parmsds/parmlib/...) # m2 - parms subdirs $RUNLIBS/parmsds/subdir/parmnames # m4 - generate --> sed -f $SEDSCRIPT $SYSIN_P >$SYSIN # m8 - assign parms to $RUNDATA vs $RUNLIBS default # n0 - unrecognized JCL passed thru as is # n1 - unrecognized JCL cmntd out with '#?' cols 1-2 # n2 - if any '=' present, convert any ','s to ';'s # n3 - if n2 & no '=', cmnt out with '#?' cols 1-2 # n4 - convert unrecognized JCL to instream data # o8 - expand all &/$symbols in EXEC keyword=values # p1 - use topnode as subdir data1/topnode/file.name.nodes # q1 - drop //* comments in code from PROCs (Dec2014) # q2 - drop //* comments in code from JCL (not yet) # r0 - gen cobrun, executes .int's & allows animation # r1 - gen runw for .ints NetExpress/Windows # r2 - assume executable programs either unix/windows # r4 - assume executables in PATH (progname only) # r8 - generate 'findexec' Multi-Level program search # r16 - generate 'unikixvsam $RLX/COBOLX' # r32 - generate Natural call (natsec batch ...) 
# s1 - convert SORT for SYNCSORT (not uvsort)
# s2 - convert sort field 'c' to 'e' (Ebcdic seq)
# t1 - insert clear alias goto for COND & stepctl51
# t2 - insert stepctl51 for stepstops & steptimes
# t4 - insert goto after stepctl51 &/or COND test
# t8 - insert jobend51 (jobabend51 obsoleted by testcc)
# u1 - make topnode subdir Apr30/2018
# v1 - comment #rm DELETE via IDCAMS DELETE
# v2 - comment #rm DELETE via DISP=(OLD/SHR/MOD,DEL)
# w1 - generate 'unikixbld -i -d $KIXDATA/filename'
# w2 - insert $RUNDATA on exportfile/gen for unikix
# w4 - unikixvsam $RLX/dsnutilb vs dsnutilb <$SYSIN
# x1 - Tsite traceback SYSIN files to parm on prior step
# x4 - Tsite chg TABLEname @@CHG1.___ to DB2ADMIN.___
# x8 - Tsite insert "$PKG." java -cp $CLASS_PATH $PKG.prgmname
# y2 - Asite dbtload/dbtunload by LOAD in procname
# y4 - jclproc51, replace &symbols on all lines vs DSNs
# z1 - append 1 on COBOL reserved word DDNAMES
# z2 - 2 blanks sep before comments vs 1 (in parjcl)
# z8 - Tsite debug: show sysin search back files to parms
# ----------------------------------------------------------------------------
# Aug01/19 - use option x_ for all Tsite options as follows:                 */
#          - disable existing x_ for Indexed file type IDXf_ (assume IDXf1)  */
#          - option x1 to make Tsite parms traceback optional                */
#          - option y1 to x4 TABLEname @@CHG1._ to DB2ADMIN._                */
#          - option y8 to x8 insert $PKG. on java calls                      */
# ----------------------------------------------------------------------------
# Jul31/19 - Csite parmsds option m to not generate SYSIN_P & grep
#          - disable m16 parmfile@member vs parmfile/member (not used)
#          - disable current m1 $RUNDATA vs $RUNLIBS & m8 original filename
#          - Re-assign as follows:
# m1 - $RUNLIBS/parms/... (or $RUNLIBS/parmsds/parmlib/...)
# m2 - parms subdirs $RUNLIBS/parmsds/subdir/parmnames
# m4 - generate --> sed -f $SEDSCRIPT $SYSIN_P >$SYSIN
# m8 - assign parms to $RUNDATA vs $RUNLIBS default
# ----------------------------------------------------------------------------
# uvadm/ctl/jclunixop51 - control file for MVS JCL to Korn shell converter
#  - must copy/modify to ctl/ subdir where jclunix converter executed
# ------------------------------------------------------------------------
# Following lines will be inserted in scripts until :ENDINSERT
#  - except for '# ' comment (col1 '#', col2 ' ' blank)
# '#!/.......' is 1st line output - usually '#!/bin/ksh' or '#!/bin/ksh93'
#!/bin/ksh
export JOBIDX=$(basename $0)
JOBID=${JOBIDX%.*}
scriptpath="$0"
args="$*"
integer JCC=0 SCC=0 LCC=0 SMAX=0   # init step status return codes
autoload jobset51 jobend51 jobabend51 logmsg1 logmsg2 stepctl51 testcc evalcp1
autoload exportfile exportgen0 exportgen1 exportgen2 exportgenall exportgenx
jobset51 "$args"   # call function for JCL/script initialization
goto   #optional restart at any step, goto step default S0000=A in jobset51
S0000=A
:ENDINSERT: - above non #comment lines inserted at begin output scripts
# jobset51 inserted at begin each script
#  - sets up job environment: export subdirs, setup printer subdir/date
#  - RUNDATA & RUNLIBS must have been exported (in .profile or in shell)
#  - .profile must "export FPATH=$RUNLIBS/sfun" to find functions
#  - autoload declares functions called by jcl/scripts
#  - stores any restart step# in alias 'goto' (last line inserted above)
#  - #export start=S0000 #<-- insert above, users uncomment + step# for schedulers
#
#Mar2020 - 7 lines above updated for election3 uvdemos part9 & future JCL conversions
#
#------------------ general purpose search/replace tables ------------------
# REPTBL1 - replace any pattern on any INPUT (JCL before conversion to script)
#  - entries must be tilde filled & table ended by a line of all tildes
#  01-30=search pattern, 31-60=replace pattern, 61-80=qualifier pattern
:REPTBL1: search/replace table for input IBM JCL
oldpattern~~~~~~~~~~~~~~~~~~~~newpattern~~~~~~~~~~~~~~~~~~~~qualifier~~~~~~~~~~~
.DSN=~~~~~~~~~~~~~~~~~~~~~~~~~.,DSN=~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
$ORTPARM~~~~~~~~~~~~~~~~~~~~~~SORTPARM~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
<<HIQUAL>>.<<ENVIRON>>~~~~~~~~app_files~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
HIQUAL.ENVIRON~~~~~~~~~~~~~~~~app_files~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
# REPTBL2 - replace any pattern on any OUTPUT (after conversion to script)
#Jun03/2017 - modified JCL converter to allow replacement 50 bytes,
#  - moving qualifier to 81-100, allowing 100 byte lines in ctl/jclunixop51
#  - was 01-30=search, 31-60=replace, 61-80=qualifier
#  - now 01-30=search, 31-80=replace, 81-100=qualifier
:REPTBL2: search/replace table for output UNIX script
rcs=99999~~~~~~~~~~~~~~~~~~~~~rcs=99999~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
# could change missing recsizes rcs=99999 to rcs=3999 OK if all files typ=LST
#rcs=99999~~~~~~~~~~~~~~~~~~~~~rcs=3999~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
#
:REPTBL0: search/replace table for jclproc51 input JCL Jul16/2019
oldpattern~~~~~~~~~~~~~~~~~~~~newpattern~~~~~~~~~~~~~~~~~~~~qualifier~~~~~~~~~~~
<<BTCHSCHED>>~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
#
#--------------------- filename search/replace table ---------------------
#  01-30=search pattern, 31-60=replace pattern, 61-80=qualifier pattern
#  - entries must be tilde filled & table ended by a line of all tildes
:FILEREP1: filename search/replace table for input JCL DSNs
oldpattern~~~~~~~~~~~~~~~~~~~~newpattern~~~~~~~~~~~~~~~~~~~~qualifier~~~~~~~~~~~
<<HIQUAL>>.<<ENVIRON>>~~~~~~~~app_files~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
HIQUAL.ENVIRON~~~~~~~~~~~~~~~~app_files~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
#
#----------------------- control table --------------------------------
:CTLTBL:
TOPDIRDEFAULT~~~~~~~~~~~~~~~~~data1~~~~~~~~~~~~~~~~~~~~~~~~~
TOPDIRINSERT~~~~~~~~~~~~~~~~~~data1~~~~~~~~~~~~~~~~~~~~~~~~~
TOPDIRINSTAPE~~~~~~~~~~~~~~~~~tape1~~~~~~~~~~~~~~~~~~~~~~~~~
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
# TOPDIRDEFAULT - inserted if only 1 node (no High Level Qualifier) in DSN
# TOPDIRINSERT  - if specified, will be inserted above HLQ, retaining all '.'s
#               - TOPNODES table would be irrelevant if TOPDIRINSERT specified
# TOPDIRINSTAPE - added June11/14, used if UNIT=CTAPE found on output DD
#
# If TOPDIRs not specified, existing top-node used as a subdir within $RUNDATA
#   use TOPNODES table to allow desired topnodes & convert unwanteds to 'misc'
#   - could use FILEREP1 table to convert any part of filenames before above
#   OR see option 'u1' to make topnode subdir
#
#------------------------ TOPNODES table ------------------------------
# table of TOPNODES allowed & replacements (optional)
#  01-30  - topnodes allowed
#  31-60  - topnode replacements (if col 31 not '~')
#  - if 01-30 of last entry 'ALLOTHER', any others replaced by 31-60
#  - use statmvsjcl1 to create summary table of topnodes used in all JCL
#  - code in lower case since applied at output time(script) vs input(JCL)
:TOPNODES:
data1~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
tape1~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
ALLOTHERS~~~~~~~~~~~~~~~~~~~~~data1~~~~~~~~~~~~~~~~~~~~~~~~~
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
#
# comment lines in INput script by search patterns & qualifiers
#  - inserts '# ' in cols 1 & 2 if search & qualifier patterns present
:CMTTBL1:
cmtsearch~~~~~~~~~~~~~~~~~~~~~cmtqualifier~~~~~~~~~~~~~~~~~~
<<BTCHSCHED>>~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
#
# comment lines in OUTput script by search patterns & qualifiers
#  - inserts '# ' in cols 1 & 2 if search & qualifier patterns present
:CMTTBL2:
cmtsearch~~~~~~~~~~~~~~~~~~~~~cmtqualifier~~~~~~~~~~~~~~~~~~
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
#
# CMTTBL0 - effective for jclproc51 vs jclunix51
#  - for PROC expansion before JCL conversion to script
:CMTTBL0:
cmtsearch~~~~~~~~~~~~~~~~~~~~~cmtqualifier~~~~~~~~~~~~~~~~~~
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
#
:DROPDDN: drop DDnames on input //DDname DD
CEEDUMP~~~~~~~~~~~~~~~~~~~~~~~
DFSRESLB~~~~~~~~~~~~~~~~~~~~~~
DFSESL~~~~~~~~~~~~~~~~~~~~~~~~
DFSVSAMP~~~~~~~~~~~~~~~~~~~~~~
DMMSGFIL~~~~~~~~~~~~~~~~~~~~~~
DMNETMAP~~~~~~~~~~~~~~~~~~~~~~
DMPRINT~~~~~~~~~~~~~~~~~~~~~~~
DMPUBLIB~~~~~~~~~~~~~~~~~~~~~~
ESYLIB~~~~~~~~~~~~~~~~~~~~~~~~
IEFRDER~~~~~~~~~~~~~~~~~~~~~~~
IEFRDER2~~~~~~~~~~~~~~~~~~~~~~
IMS~~~~~~~~~~~~~~~~~~~~~~~~~~~
JOBLIB~~~~~~~~~~~~~~~~~~~~~~~~
PROCLIB~~~~~~~~~~~~~~~~~~~~~~~
RESLIB~~~~~~~~~~~~~~~~~~~~~~~~
SORTLIB~~~~~~~~~~~~~~~~~~~~~~~
SORTWK*~~~~~~~~~~~~~~~~~~~~~~~
STEPLIB~~~~~~~~~~~~~~~~~~~~~~~
SYSABEND~~~~~~~~~~~~~~~~~~~~~~
SYSABOUT~~~~~~~~~~~~~~~~~~~~~~
SYSEXEC~~~~~~~~~~~~~~~~~~~~~~~
SYSDBOUT~~~~~~~~~~~~~~~~~~~~~~
SYSLIB~~~~~~~~~~~~~~~~~~~~~~~~
SYSPRINT~~~~~~~~~~~~~~~~~~~~~~
SYSTSPRT~~~~~~~~~~~~~~~~~~~~~~
SYSOUX~~~~~~~~~~~~~~~~~~~~~~~~
SYSUDUMP~~~~~~~~~~~~~~~~~~~~~~
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
#Oct20/15 - need SYSPROC to ID REXX steps for Tsite
#          - if DDN SYSPROC with 'rexx' in the DSN
# SYSPROC~~~~~~~~~~~~~~~~~~~~~~~
#
:DROPDDNSAS: drop DDnames on SAS... steps
DSN=$(ssid2Dsn DSNC) UID=$DB2UID PWD="$DB2PWD" SCHEMA=TUPROD~~~~~~~~~~~~~~~~~~~~
SASAUTOS~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
SASCLOG~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
SASLOG~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
SASLIST~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
SASMSG~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
SASPARM~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
TKMVSENV~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
#Oct05/2015 1st line in DROPDDNSAS table replaces SSID=... on LIBNAME ... in SYSIN data
#
# Table of DSNnames to drop if pattern matched anywhere in filename
#  - optionally qualified by EXECname in cols 31-38
:DROPDSN:
CATALOG~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
COBOLMVS~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
EMER.LOAD~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
JCL.LIB~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
JCLLIB~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
LINKLIB~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
LOADLIB~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
LOADRUN~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
PROD.JCL~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
PROD.LOAD~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
RUNLIB~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
SORTWK~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
STREAMW~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
TEST.LOAD~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
USERCAT~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
# Table of STEP names of utilities
#  - jclunix54 considers anything else COBOL
#    & appends cft=QSAM etc on exportfile,gen0,gen1,genall
:UTILITIES:
ABEND~~~~~~~~~~~~~~~~~~~~~~~~~
ACPMAIN~~~~~~~~~~~~~~~~~~~~~~~
ADUUMAIN~~~~~~~~~~~~~~~~~~~~~~
AMUUMAIN~~~~~~~~~~~~~~~~~~~~~~
ADRDSSU~~~~~~~~~~~~~~~~~~~~~~~
BTQMAIN~~~~~~~~~~~~~~~~~~~~~~~
CFCONDRV~~~~~~~~~~~~~~~~~~~~~~
CFPRSDRV~~~~~~~~~~~~~~~~~~~~~~
D2P4PROC~~~~~~~~~~~~~~~~~~~~~~
D2T4PROC~~~~~~~~~~~~~~~~~~~~~~
DB2~~~~~~~~~~~~~~~~~~~~~~~~~~~
DFSRRC00~~~~~~~~~~~~~~~~~~~~~~
DFSUCUM0~~~~~~~~~~~~~~~~~~~~~~
DFSULTR0~~~~~~~~~~~~~~~~~~~~~~
DMBATCH~~~~~~~~~~~~~~~~~~~~~~~
DMS~~~~~~~~~~~~~~~~~~~~~~~~~~~
DSNUPROC~~~~~~~~~~~~~~~~~~~~~~
DSNTEP2~~~~~~~~~~~~~~~~~~~~~~~
DSNTIAD~~~~~~~~~~~~~~~~~~~~~~~
DSNTIAUL~~~~~~~~~~~~~~~~~~~~~~
DSNUPROC~~~~~~~~~~~~~~~~~~~~~~
DSNUTILB~~~~~~~~~~~~~~~~~~~~~~
EZTPA~~~~~~~~~~~~~~~UV~~~~~~~~
FASTLOAD~~~~~~~~~~~~~~~~~~~~~~
FDRABR~~~~~~~~~~~~~~~~~~~~~~~~
FILEAID~~~~~~~~~~~~~UV~~~~~~~~
FTP~~~~~~~~~~~~~~~~~~~~~~~~~~~
HRNBATCH~~~~~~~~~~~~~~~~~~~~~~
ICEGENER~~~~~~~~~~~~UV~~~~~~~~
ICEMAN~~~~~~~~~~~~~~UV~~~~~~~~
ICETOOL~~~~~~~~~~~~~UV~~~~~~~~
IDCAMS~~~~~~~~~~~~~~UV~~~~~~~~
IEBCOPY~~~~~~~~~~~~~UV~~~~~~~~
IEBGENER~~~~~~~~~~~~UV~~~~~~~~
IEFBR14~~~~~~~~~~~~~UV~~~~~~~~
IKJEFT~~~~~~~~~~~~~~~~~~~~~~~~
ISRSUPC~~~~~~~~~~~~~UV~~~~~~~~
KWIKKEY~~~~~~~~~~~~~~~~~~~~~~~
KWIKLOD~~~~~~~~~~~~~~~~~~~~~~~
MAIL~~~~~~~~~~~~~~~~~~~~~~~~~~
MLOAD~~~~~~~~~~~~~~~~~~~~~~~~~
NDMWAIT~~~~~~~~~~~~~~~~~~~~~~~
OCOPY~~~~~~~~~~~~~~~~~~~~~~~~~
OIVBE~~~~~~~~~~~~~~~~~~~~~~~~~
ORA~~~~~~~~~~~~~~~~~~~~~~~~~~~
PMBMTCH~~~~~~~~~~~~~~~~~~~~~~~
QMFBATCH~~~~~~~~~~~~~~~~~~~~~~
QUIKJOB~~~~~~~~~~~~~UV~~~~~~~~
REXX~~~~~~~~~~~~~~~~~~~~~~~~~~
SAS81~~~~~~~~~~~~~~~~~~~~~~~~~
REXX~~~~~~~~~~~~~~~~~~~~~~~~~~
SORT~~~~~~~~~~~~~~~~UV~~~~~~~~
SYNCGENR~~~~~~~~~~~~~~~~~~~~~~
TPQUTIL~~~~~~~~~~~~~~~~~~~~~~~
TSLIST~~~~~~~~~~~~~~~~~~~~~~~~
TXT2PDF~~~~~~~~~~~~~~~~~~~~~~~
VSAMINIT~~~~~~~~~~~~~~~~~~~~~~
WTO~~~~~~~~~~~~~~~~~~~~~~~~~~~
XMITIP~~~~~~~~~~~~~~~~~~~~~~~~
XPORT~~~~~~~~~~~~~~~~~~~~~~~~~
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
# table of DDN's to drop if on UV UTILITY step
#  - see 'UV' coded on some entries in :UTILITIES: table
:UTILDDNDROP:
SYSPRINT~~~~~~~~~~~~~~~~~~~~~~
SYSTSPRT~~~~~~~~~~~~~~~~~~~~~~
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
# table of PROC names to be dropped by jclproc51.c
:DROPPROC:
ABNDTST~~~~~~~~~~~~~~~~~~~~~~~
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
#Jun06/17 - ABNDTST added for Tsite
#
# table of INCLUDE names to be dropped by jclproc51.c
:DROPINCLUDE:
NOdropsasofJul2019~~~~~~~~~~~~
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
#
#Feb02/18 - table of //* comments to be dropped
#  - also see option q1 to drop all //* cmts in PROCs (default)
:DROPCOMMENTS:
NOdropsasofFeb2018~~~~~~~~~~~~
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
#
# Table of $SYMBOLs not to be substituted
#   controlled by option h0/h1 on line 1 above
#   h1 - all &SYMBOLs replaced with values if available
#        (ie - this table ignored)
#      - 1st convert to generate data file control table
#   h0 - 2nd+ convert for ksh scripts with $SYMBOLs (not replaced)
#      - for expansion when script executed
#Jun11/14 - &ENV for IMMD
#Jun29/17 - ZZZZCHKP for Tsite & set h0
:NOREPLACE:
ZZZZCHKP~~~~~~~~~~~~~~~~~~~~~~
SYM4~~~~~~~~~~~~~~~~~~~~~~~~~~
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
#
# Table of uvsort jobname_step#s to be supplemented by uvcopy
#  - for complex SORT operations beyond uvsort capability
#  - such as JOINKEYS, PARSE, IFTHEN
:UVSORTUVCOPY:
bvc57_S0030
bvc57_S0050
bv4x8_S0020
bv4x8_S0040
~~~~~~~~~~~~~~~~~~~~
#
# Table of symbols whose value is to be translated to lower case
:LOWERCASE:
APPLID~~~~~~~~~~~~~~~~~~~~~~~~
FLEV~~~~~~~~~~~~~~~~~~~~~~~~~~
DPC~~~~~~~~~~~~~~~~~~~~~~~~~~~
PE~~~~~~~~~~~~~~~~~~~~~~~~~~~~
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
#
# Table of patterns to insert '/' replacing '.' at 1st node
#  - applied in putstmt() only if option 'u1' (Asite)
:TOPNODEDIR:
DSN=~~~~~~~~~~~~~~~~~~~~~~~~~~
DSN1=~~~~~~~~~~~~~~~~~~~~~~~~~
export PARM=~~~~~~~~~~~~~~~~~~
LIST=~~~~~~~~~~~~~~~~~~~~~~~~~
LOCFILE=~~~~~~~~~~~~~~~~~~~~~~
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
#
#--------------------------------------------------------------
#Oct28/2010 - add table of Connect:Direct code
#  export NDMAPICFG=$CDNDM/cfg/cliapi/ndmapi.cfg
#  - above NDMAPICFG will be defined in profile
#  - will replace PROCESSNAME with $jobid2_$STEP_CD
#  - Inode --> pnode if copy from mainframe(unix) to NT, else snode
#  - Onode --> snode if copy from mainframe(unix) to NT, else pnode
#Nov09/10 - direct option '-s' removed for debug ?
:CONNECTDIRECT:
logmsg1 "/search for any CD ERROR codes in \$CDNDM/cfg/msgfile.cfg"
$CDNDM/bin/direct $CDOPTNS << EOJ
submit maxdelay=0 PROCESSNAME process snode=$CDSERVER
step1 copy from (file=$INFILE Inode) to (file=$OUTFILE Onode)
pend ;
EOJ
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
#------------------------ end jclunixop51 ---------------------------
Please see '12A1' for the full listing of the jclunixop51 file, followed by more extensive explanations than given here for the options most likely to need changes at your site.
Here is the 'jclunixop51' options string, followed by explanations for a few of the options that you might need to modify for your site.
jclunixop51:a2b0c8d0e2f1g1h0i0j0k15l1999m1n3o8p0q0r0s0t15u0v0w0x0y6z0  #<- mvstest options
#           ====*=====*=======*=========*=========*==================
# *c0 - all filenames lower case
#  c1 - program names UPPER case
#  c2 - filenames UPPER case (also uop=l2 jcldata52)
#  c4 - control card modulenames UPPER case
#  c8 - UPPERcase filenames to lookup datactl53I
#  f0 - default file typ=RSF (allows packed/binary)
# *f1 - default uvsort/uvcp file typ=LST
#  f2 - over-ride file typ by ctl/datactl53I (if filename match)
#  f3 - f1+f2 force typ=LST regardless datactl53I
# *j0 - default COBOL calls as per option 'r'
#  j1 - generate JAVA calls (vs COBOL) for COBOLs called directly
#  j2 - include -DDDNAME=$DDNAME for each file in step
#  j4 - generate JAVA calls for COBOLs called by IKJEFT01
#  j8 - change output script file extension from .ksh to .java
#  j16 - insert # elif ((S0020 == 4)); then  &  # alias goto ...
#  m0 - generate exportfile modules original filename
# *m1 - $RUNLIBS/parms/... (or $RUNLIBS/parmsds/parmlib/...)
#  m2 - parms subdirs $RUNLIBS/parmsds/subdir/parmnames
#  m4 - generate --> sed -f $SEDSCRIPT $SYSIN_P >$SYSIN
#  m8 - assign parms to $RUNDATA vs $RUNLIBS default
# *r0 - gen cobrun, executes .int's & allows animation
#  r1 - gen runw for .ints NetExpress/Windows
#  r2 - assume executable programs either unix/windows
#  r4 - assume executables in PATH (progname only)
#  r8 - generate 'findexec' Multi-Level program search
#  r16 - generate 'unikixvsam $RLX/COBOLX'
#  r32 - generate Natural call (natsec batch ...)
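Each option is the letter followed by its value, and values within a letter are additive (for example f3 = f1+f2, and m6 = m2+m4 later in this doc). If you want to see the value coded for each letter at a glance, a one-liner such as the following can split the string for you (a sketch assuming GNU grep; ctl/jclunixop51 is the copy in the directory where you run the converter):

   grep '^jclunixop51:' ctl/jclunixop51 | sed 's/^jclunixop51://; s/ .*//' | grep -o '[a-z][0-9]*'
       # prints one option per line: a2, b0, c8, d0, e2, f1, ... y6, z0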
Note |
|
$RUNLIBS/ctl/jclunixop51 (the options control-file used by the JCL converter) contains several search/replace tables that you may modify before re-converting. See '12A1' for a listing of the complete jclunixop51 file.
Here we will list just 1 of the search/replace tables, as an example of a problem that may be solved with this feature.
# REPTBL2 - replace any pattern on any OUTPUT (after conversion to script)
#  - entries must be tilde filled & table ended by a line of all tildes
#  01-30=search pattern, 31-60=replace pattern, 61-80=qualifier pattern
:REPTBL2: search/replace table for output UNIX script
219.68.193.1~~~~~~~~~~~~~~~~~~$IP_SITE_AAA~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
219.68.193.2~~~~~~~~~~~~~~~~~~$IP_SITE_BBB~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
219.68.193.3~~~~~~~~~~~~~~~~~~$IP_SITE_CCC~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
Then you would define the values for the IP $variables in the profiles. You could define the values desired for production in the common_profile used by operators & the job scheduler.
export IP_SITE_AAA=219.68.193.1
export IP_SITE_BBB=219.68.193.2
export IP_SITE_CCC=219.68.193.3
Programmers could define alternates for testing in their stub_profile, which would override those in the common_profile, for example:
export IP_SITE_AAA=220.120.77.7
export IP_SITE_BBB=220.120.77.8
export IP_SITE_CCC=220.120.77.9
open 219.68.193.1     <-- JCL before conversion
open $IP_SITE_AAA     <-- JCL/script after conversion
open 219.68.193.1     <-- JCL/script executed in production
open 220.120.77.7     <-- JCL/script executed by programmer testing
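After re-converting, you may want to confirm that no hard-coded IP#s slipped past REPTBL2. A quick check could look like this (a sketch; 'jclscripts' is a placeholder for whichever directory holds your converted scripts - substitute your own):

   grep -rn '219\.68\.193\.' jclscripts/ && echo 'hard-coded IP#s remain' \
     || echo 'all IP#s replaced by $IP variables'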
13A1. | parms in multiple subdirs OR 1 combined subdir |
Mainframe parms are usually stored in PDS libraries. Here are 3 examples of how the libraries might be defined in the JCL.
//SYSIN DD DSN=PGAPO.CDG.GMUVCFC.XNM(CTVITR)
//SYSIN DD DSN=PGAPO.CDG.MCVCNQI.XNM(CTVITR)
//SYSIN DD DSN=PGAPO.CDG.UVCOOF.XNM(CTVKMGN)
Each library may have multiple PDS members, for example:
parmsd0/PGAPO_CDG_GMUVCFC_XNM:
-rw-rw-r--. 1 cnv1 apps 16974 Feb 4 17:22 AIRGRP
-rw-rw-r--. 1 cnv1 apps 16974 Feb 4 17:22 AIRGW
-rw-rw-r--. 1 cnv1 apps 19106 Feb 4 17:22 AIRIKEL
-rw-rw-r--. 1 cnv1 apps  1148 Feb 4 17:22 EMAIL2
parmsd0/PGAPO_CDG_MCVCNQI_XNM:
-rw-rw-r--. 1 cnv1 apps   820 Feb 4 17:22 AIRGRP
-rw-rw-r--. 1 cnv1 apps   820 Feb 4 17:22 BAKDF
-rw-rw-r--. 1 cnv1 apps  9594 Feb 4 17:22 MECH
-rw-rw-r--. 1 cnv1 apps 11316 Feb 4 17:22 AIRIKEL
parmsd0/PGAPO_CDG_UVCOOF_XNM:
-rw-rw-r--. 1 cnv1 apps 17466 Feb 4 17:22 AIRIKEL
-rw-rw-r--. 1 cnv1 apps  1230 Feb 4 17:22 EMAIL2
-rw-rw-r--. 1 cnv1 apps  6560 Feb 4 17:22 MECH
Note that when we transferred the PDS libraries to unix, we converted the DSN's to unix directories (stored within parmsd0/...).

Some mainframe sites may have only 1 PDS library, or if there are several, the PDS member names may be unique across them; in that case we could store all the members in 1 directory on unix - $RUNLIBS/parms/...
But for more complex sites, there may be many PDS libraries & a given member name may have different contents in different libraries. In this case (parm-names not unique), we need to store the parms in multiple subdirs. Please see the procedures in 'Part_13'.
The default conversion instructions beginning at '1D1' assume that parm-names are unique and can be stored in 1 directory $RUNLIBS/parms/... If not unique, they need to be stored in multiple subdirs $RUNLIBS/parmsd0/.../...
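Before deciding, you can get a quick list of member names that appear in more than one PDS subdir (a sketch using standard unix tools; the parmdiff1 procedure below then tells you whether those duplicates actually differ in content):

   ls parmsd0/*/* | awk -F/ '{print $NF}' | sort | uniq -d    # member names present in 2+ subdirs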
#0a. Login
#0b. cdl (alias cd $RUNLIBS)  --> /p1/cnv/cnv1/testlibs/ (or your testlibs)
#1. mkdir parmsdx     <-- make subdir for outputs of following commands
    =============
#2. renamedf2fd1 parmsd0 parmsdx/parmnames1
    ========================================
    - extract subdir-names & module-names from parmsd0 (see prior page)
    - reverse the names (module-names prior to subdir-names)
    - sample output as follows (for subdirs/modules shown on prior page)
AIRGRP----PGAPO_CDG_GMUVCFC_XNM
AIRGW-----PGAPO_CDG_GMUVCFC_XNM
AIRIKEL---PGAPO_CDG_GMUVCFC_XNM
EMAIL2----PGAPO_CDG_GMUVCFC_XNM
AIRGRP----PGAPO_CDG_MCVCNQI_XNM
MECH------PGAPO_CDG_MCVCNQI_XNM
AIRIKEL---PGAPO_CDG_MCVCNQI_XNM
AIRIKEL---PGAPO_CDG_UVCOOF_XNM
EMAIL2----PGAPO_CDG_UVCOOF_XNM
MECH------PGAPO_CDG_UVCOOF_XNM
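If you do not have renamedf2fd1 handy, the same 'member----subdir' list could be produced with a plain shell loop (an illustration of the transformation only, not the actual UV script; it assumes member names fit in the 10-character dash-padded field shown above):

   for f in parmsd0/*/*; do
     mbr=$(basename $f); sub=$(basename $(dirname $f))
     printf '%-10s%s\n' "$mbr" "$sub" | tr ' ' '-'    # pad member name to 10 with dashes
   done >parmsdx/parmnames1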
#3. sort parmsdx/parmnames1 -o parmsdx/parmnames2
    =============================================
    - sort to bring all same module-names together
    - sample output as follows:
AIRGRP----PGAPO_CDG_GMUVCFC_XNM
AIRGRP----PGAPO_CDG_MCVCNQI_XNM
AIRGW-----PGAPO_CDG_GMUVCFC_XNM
AIRIKEL---PGAPO_CDG_GMUVCFC_XNM
AIRIKEL---PGAPO_CDG_MCVCNQI_XNM
AIRIKEL---PGAPO_CDG_UVCOOF_XNM
EMAIL2----PGAPO_CDG_GMUVCFC_XNM
EMAIL2----PGAPO_CDG_UVCOOF_XNM
MECH------PGAPO_CDG_MCVCNQI_XNM
MECH------PGAPO_CDG_UVCOOF_XNM
#4. uvcopy parmdiff1,fili1=parmsdx/parmnames2,filo1=parmsdx/parmnames3\
    ,fild1=parmsd0,fild2=parmdiffs
    ====================================================================
    - compare parms with same names in different sub-directories
    - output file of parmnames flagged '=' if contents match, or '!' if mismatch
----------- sample output -------------
1256 total mismatches in 4100 total files
AIRGRP----PGAPO_FGJWH_CFG_XNM -
AIRMAIL---PGAPO_FGJWH_CFG_XNM 1 .
AIRMAIL---PGAPO_FGJWH_CIA_XNM 2 =
BADUPD----PGAPO_FGJWH_CDG_XNM 1 ..
BADUPD----PGAPO_FGJWH_CEJ_XNM 2 =
BADUPD----PGAPO_FGJWH_CFG_XNM 3 =
BADUPD----PGAPO_FGJWH_CHK_XNM 4 !.
BADUPD----PGAPO_FGJWH_CHT_XNM 5 =
CPWH500---PGAPO_FGJWH_CFG_XNM 1 ..
CPWH500---PGAPO_FGJWH_CIA_XNM 2 =
CPWH500---PGAPO_FGJWH_OIA_XNM 3 !.
DKAF105---PGAPO_FGJWH_CDG_XNM 1 .
DKAF105---PGAPO_FGJWH_CEJ_XNM 2 =
DKAF105---PGAPO_FGJWH_CFG_XNM 3 =
DKAF105---PGAPO_FGJWH_CHK_XNM 4 !
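parmdiff1 is comparing the contents of each group of same-named members; you can spot-check any flagged pair yourself with cmp (a sketch using the sample subdir & member names shown earlier):

   cmp -s parmsd0/PGAPO_CDG_GMUVCFC_XNM/AIRGRP parmsd0/PGAPO_CDG_MCVCNQI_XNM/AIRGRP \
     && echo '= contents match' || echo '! contents differ'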
#5. uvcp 'fili1=parmsdx/parmnames3,rcs=80,typ=LSTt,filo1=parmsdx/parmnames4\
    ,sel1=57(1):!,sel1=57(2):..'
    =========================================================================
    - copy file selecting mismatches '!' byte 57 (column 58)
    - also select 1st in parmname group used for comparison '..' bytes 57-58
    - must use single quote (vs double) to enclose uvcp command string (due to '!')
------- sample output ---------
BADUPD----PGAPO_FGJWH_CDG_XNM 1 ..
BADUPD----PGAPO_FGJWH_CHK_XNM 4 !.
CPWH500---PGAPO_FGJWH_CFG_XNM 1 ..
CPWH500---PGAPO_FGJWH_OIA_XNM 3 !.
DKAF105---PGAPO_FGJWH_CDG_XNM 1 .
DKAF105---PGAPO_FGJWH_CHK_XNM 4 !
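If uvcp is not convenient, an awk one-liner could make a similar selection (a sketch only; it assumes the flag starts in column 58, i.e. uvcp offset 57 counted from zero - verify the column against your own parmnames3 before relying on it):

   awk 'substr($0,58,1)=="!" || substr($0,58,2)==".."' parmsdx/parmnames3 >parmsdx/parmnames4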
If parmnames are unique, all parms can be stored in 1 directory parm0/... & the JCL conversion script (jcl2ksh51A) will clean up & lowercase them into parms/...
The JCL converter options default to expecting all parms in 1 dir parms/... If your site's parm names are not unique, you must modify option 'm' in the JCL converter control file. The setup script copymvsctls copies the supplied control file from $UV/ctl/jclunixop51 to $RUNLIBS/ctl/jclunixop51. Here is line 10 with the options coded, & lines 55-56 explaining options m2 & m4:
jclunixop51:a2b0c0d1e2f0g1h1i15j0k15l1m4n3o8p0q0r0s2t1u1v0w0x0y1z1  # default options
#           ==========================**==========================
#                                     ^^ - 'm4' default option, 'm6' for multi subdirs
#  m2 - parms subdirs $RUNLIBS/parmsds/file_name/parmnames
#  m4 - parms in 1 dir $RUNLIBS/parms/parmnames
#  m6 - must add m2 to m4 = m6
#0a. Login
#0b. cdl (alias cd $RUNLIBS)  --> /p1/cnv/cnv1/testlibs/ (or your testlibs)
#1. vi ctl/jclunixop51
    ==================
    - change default option (on line 10) from m4 to m6 if your parmnames are
      not unique & parms must be stored in parmsd0 superdir with multiple subdirs
      (using JCL names with '.'s changed to '_' underscores)
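If you prefer a scripted change over editing with vi, a sed one-liner can flip the option (a sketch using GNU sed's -i; it assumes the default options string shown above, so confirm the 'l1m4n3' context in your copy first):

   cp ctl/jclunixop51 ctl/jclunixop51.bak          # keep a backup
   sed -i 's/l1m4n3/l1m6n3/' ctl/jclunixop51       # m4 --> m6 (m6 = m2+m4)
   grep '^jclunixop51:' ctl/jclunixop51            # verify the change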
If you determine (via parmdiff1) that your parmnames are unique, then you can combine all parms into 1 directory $RUNLIBS/parms/... and use option m4 in jclunixop51.
But if your parmnames are not unique (different contents in same parmnames in different subdirs of parmsd0/subdirs/parms), then you must maintain the multi-subdirs and use option m6 in jclunixop51.
Use these procedures if parmdiff1 on page '13A3' proved that your parmnames are unique (no different contents for same parmnames in different PDS libraries).
#0a. Login
#0b. cdl (alias cd $RUNLIBS)  --> /p1/cnv/cnv1/testlibs/ (or your testlibs)
#1. cp parmsd0/*/* parm0
    ====================
    - copy all parms from all subdirs into parm0/...
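Note that cp silently overwrites any duplicate member names during this copy, so a quick count before & after is a cheap sanity check (a sketch; if the 2 counts differ, some names collided & you should re-run the parmdiff1 procedure):

   ls parmsd0/*/* | wc -l      # members in all PDS subdirs
   ls parm0 | wc -l            # should match if no duplicate names were overwritten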
Note |
|
See the discussion on the following page for parmsd0/multi-subdirs/parms. If you wish to do this for the 1 parms/ dir with unique parmnames, you could do as follows - but run it after jcl2ksh51A (as per the Note above).
uvcopy parmdirs1,fild1=parm0,fild2=parms,fili4=ctl/ipnum2var
============================================================
Use these procedures if parmdiff1 on page '13A3' proved that your parmnames are NOT unique (some different contents for same parmnames in different PDS libraries).
We will use script 'copyparmdirs1' which calls uvcopy job 'parmdirs1' to copy each subdir in superdir parmsd0/subdir/... to output superdir parmsds/subdir/...
Optionally, we can replace hard-coded IP#s in parmfiles with $variables. uvcopy job 'parmdirs1' reads a search/replace table, which you may update to the IP#s used at your site. You can copy a sample table from $UV/ctl as follows:
#0a. Login
#0b. cdl (alias cd $RUNLIBS)  --> /p1/cnv/cnv1/testlibs/ (or your testlibs)
#1. cp $UV/ctl/ipnumvar ctl/     <-- copy demo IP# table from uvadm
    ========================
#2. vi ctl/ipnumvar     <-- update table for your site
    ===============
# ipnum2var - table of IP#s & $IP variables to replace hard-coded IP#s in JCLs & parms
219.68.193.1~~~~~~~~~~~~~~~~~~$IP_SITE_AAA~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
219.68.193.2~~~~~~~~~~~~~~~~~~$IP_SITE_BBB~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
219.68.193.3~~~~~~~~~~~~~~~~~~$IP_SITE_CCC~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
#3. copyparmdirs1 parmsd0 parmsds
    =============================
    - copy parms from parmsd0/.../... to parmsds/.../... with cleanup,
      lowercasing subdir & parm filenames, & optionally replacing
      hard-coded IP#s with $IP variables.
Note |
|
export IP_SITE_AAA=219.68.193.1
export IP_SITE_BBB=219.68.193.2
export IP_SITE_CCC=219.68.193.3
sed -f $IPV2NSED $SYSIN_P >$SYSIN
=================================
export IPV2NSED=$RUNLIBS/ctl/ipv2nsed_prod     <-- production (as above)
==========================================
See more explanation in $RUNLIBS/ctl/ipv2nsed_prod
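We have not listed ipv2nsed_prod here; conceptually it is a sed script that maps each $IP variable token in the parm ($SYSIN_P) back to the desired IP# at run time. A minimal sketch of what such a file might contain, assuming the sample $IP variables & production IP#s above (check the supplied $RUNLIBS/ctl/ipv2nsed_prod for the real contents & format):

   # hypothetical ipv2nsed_prod - convert $IP variables in $SYSIN_P to production IP#s
   s/\$IP_SITE_AAA/219.68.193.1/g
   s/\$IP_SITE_BBB/219.68.193.2/g
   s/\$IP_SITE_CCC/219.68.193.3/g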
#0a. Login
#0b. cdl (alias cd $RUNLIBS)  --> /p1/cnv/cnv1/testlibs/ (or your testlibs)
#1. copyparmdirs1 parmsd0 parmsds
    =============================
    - copy parms from parmsd0/.../... to parmsds/.../... with cleanup,
      lowercasing subdir & parm filenames, & optionally replacing
      hard-coded IP#s with $IP variables.
To understand what 'copyparmdirs1' is doing, please refer back to the directory diagrams 2 pages above; a portion is reproduced here:
:  :--*--parmsd0  - parm superdir, when parmnames not unique
:  :     :-----PGAPO_CDG_GMUVCFC_XNM          <-- multi subdirs
:  :     :     :--------------------AIRGRP
:  :     :     :--------------------AIRGW     <-- parm modules
:  :     :-----PGAPO_CDG_MCVCNQI_XNM
:  :     :     :--------------------ADHW200
:  :     :     :--------------------AIRGW
:  :     :----------- etc ----------
:  :-----parmsds  - parms cleaned, lower-cased, IP#s $variables
:  :     :-----pgapo_cdg_gmuvcfc_xnm          <-- multi subdirs
:  :     :     :--------------------airgrp
:  :     :     :--------------------airgw     <-- parm modules
:  :     :-----pgapo_cdg_mcvcnqi_xnm
:  :     :     :--------------------adhw200
:  :     :     :--------------------airgw
:  :     :----------- etc ----------
Note that 'copyparmdirs1' calls uvcopy job 'parmdirs1' for each subdir. For example, here is the 1st call to uvcopy parmdirs1 for the above subdirs:
uvcopy parmdirs1,fild1=parmsd0/PGAPO_CDG_GMUVCFC_XNM,fild2=pgapo_cdg_gmuvcfc_xnm
================================================================================
But you do not need to run the 'uvcopy parmdirs1' jobs yourself, because the 'copyparmdirs1' script does it for all subdirs it finds in the parmsd0/... superdir.
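In other words, copyparmdirs1 behaves roughly like the following loop, mirroring the sample call above (an illustration only, not the actual script - the real one also handles the cleanup & the optional IP# table described above):

   for d in parmsd0/*/; do
     sub=$(basename $d)                                # e.g. PGAPO_CDG_GMUVCFC_XNM
     low=$(echo $sub | tr '[:upper:]' '[:lower:]')     # e.g. pgapo_cdg_gmuvcfc_xnm
     uvcopy parmdirs1,fild1=parmsd0/$sub,fild2=$low
   done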