What is GDKUTIL?

GDKUTIL is a utility provided in the base z/OS operating system. It was introduced in z/OS 2.5 by APAR OA62318 and is a DFSMS batch utility, similar in spirit to IEBGENER, that can be used to share data between z/OS and the cloud without any additional hardware or software. This game-changing utility can upload or download sequential data to and from the cloud.

GDKUTIL use cases:

  • GDKUTIL facilitates the upload and download of sequential data sets, z/OS UNIX files, PDS/PDSE members, and GDG versions to and from cloud storage, enabling seamless data sharing between mainframe and cloud environments.
  • It integrates with cloud object storage, providing a hybrid storage architecture that leverages the strengths of both on-premises and cloud storage for backup, archive, and managing unstructured data.
  • By incorporating GDKUTIL into JCL, you can automate batch processes involving cloud storage, reducing manual intervention and potential errors (see the sketch after this list).
  • Efficient backup and recovery strategies can be built around this tool.
  • After mainframe data is sent to an S3 bucket, several AWS tools can be used for data analytics. Examples include:
    • Amazon Redshift: Load data from S3 into Amazon Redshift and run queries to generate business insights.
    • Amazon QuickSight: For visualization, Amazon QuickSight can create dashboards and reports based on data stored in S3, providing real-time insights to stakeholders.
    • Amazon Q: You can link Amazon Q Business to an S3 data source and use a chat-based interface to interact with your mainframe data.
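As a minimal sketch of that kind of automation (the bucket path and data set names here are placeholders, and the full setup is described below), a GDKUTIL upload step can be chained with ordinary JCL conditional logic so that a follow-on step runs only when the upload succeeds:

//* Upload step; the bucket path and data set name are placeholders
//UPLOAD   EXEC PGM=GDKUTIL,REGION=0M
//SYSPRINT DD SYSOUT=*
//SYSOUT   DD SYSOUT=*
//SYSIN    DD *
  UPLOAD PROVIDER(S3CLOUD) CONVERT
/*
//OBJNAME  DD *
  /mybucket/daily/report.txt
/*
//LOCNAME  DD DISP=SHR,DSN=ASCSXS.UPLOAD.NAMES
//* Follow-on step (IEFBR14 as a stand-in) runs only if UPLOAD ended RC=0
//CLEANUP  EXEC PGM=IEFBR14,COND=(0,NE,UPLOAD)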

Cloud Data Access configuration:

There are two parts to the Cloud Data Access (CDA) configuration:

  1. System administrator configuration
  2. User configuration

This document is key to implementing GDKUTIL successfully on your mainframe:

https://www.ibm.com/docs/en/SSLTBW_2.5.0/pdf/ieac100_v2r5.pdf

(MVS Programming: Callable Services for High-Level Languages – Version 2.5)

For the system administrator configuration, you need to work with your mainframe security administrator. If you are not a RACF shop, consult Broadcom for the equivalent ACF2 commands. See Chapter 24 of the above document and implement all the steps in “System administrator configuration quick-start”.

For the user configuration, as a developer you can do it yourself. Follow Chapter 24 of the same document and implement all the steps in “User configuration quick-start”.

Keep in mind that you need to add the Amazon CA certificate (Amazon RSA 2048 M01) to the virtual keyring; this certificate is required for successful TLS communication with S3 buckets.
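If you are a RACF shop, a batch job along the following lines can add the certificate as a trusted CERTAUTH certificate, which makes it available through the CERTAUTH virtual keyring. This is a minimal sketch, not a definitive procedure: ASCSXS.AMAZON.M01.CERT is a placeholder for a data set that already holds the downloaded certificate, the job card is illustrative, and your security administrator may prefer a different approach (ACF2 shops need the equivalent Broadcom commands).

//ADDCERT  JOB (T,ASC,ASCSXS),'ADD CERT',MSGLEVEL=(1,1),
//   CLASS=R,MSGCLASS=Q,NOTIFY=&SYSUID
//* ASCSXS.AMAZON.M01.CERT is a placeholder data set that already
//* contains the downloaded Amazon RSA 2048 M01 CA certificate.
//* The SETROPTS refresh is only needed if DIGTCERT is RACLISTed.
//TSOBATCH EXEC PGM=IKJEFT01
//SYSTSPRT DD SYSOUT=*
//SYSTSIN  DD *
  RACDCERT CERTAUTH ADD('ASCSXS.AMAZON.M01.CERT') +
           WITHLABEL('Amazon RSA 2048 M01') TRUST
  SETROPTS RACLIST(DIGTCERT) REFRESH
/*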

OMVS directory structure after completing the user configuration steps:

Create a gdk directory in your home directory, then add the files shown below.

gdk files
gdk cloud provider files

CDA uses these files during its processing. (Just copy these JSON files and change them as needed.)
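As a rough sketch of the layout, assuming a home directory of /u/ascsxs and that the provider definition lives in a providers subdirectory (check the “Cloud data access files” link in the references below for the exact layout on your release):

/u/ascsxs/gdk/
    gdkconfig.json          <- logging configuration
    gdkkeyf.json            <- encrypted cloud credentials
    providers/
        S3CLOUD.json        <- cloud provider definition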

The gdkkeyf.json file contains the encrypted cloud credentials for the cloud provider. See the example below:

{
  "Credentials": [
    {
      "MVSUserID": "ASCSXS",
      "cloud provider": {
        "S3CLOUD": [
          {
            "keylabelID": "A00000",
            "name": "/",
            "key": "",
            "secretkey": "",
            "timestamp": "2024-05-21 14:07:05"
          }
        ]
      }
    }
  ]
}

The gdkconfig.json file contains the logging level configuration (INFO, WARN, ERROR). See the example below:

{
  "log-level": "ERROR",
  "web-toolkit-logging": false,
  "translation": true
}

The S3CLOUD.json file contains the cloud provider configuration (for example, the endpoint and region for your S3 storage). For the layout of the provider file, see the “Cloud data access files” link in the references below.

z/OS Cloud Data Access Authorization Utility

After you have successfully finished the system and user configuration steps, you can add your cloud credentials so they can be accessed by CDA services; see the screenshots below:

EX 'SYS1.SAXREXEC(GDKAUTHP)' – brings up the ISPF panel for the Cloud Data Access authorization utility

Select one of the configured cloud providers
Securely store your AWS access key and secret key

After you successfully save the cloud user credentials, the gdkkeyf.json file is updated with the encrypted key and secret key values.
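As a rough illustration (the bracketed values below are placeholders, not real encrypted strings; your keylabelID and timestamp will differ), the updated entry looks something like this:

{
  "Credentials": [
    {
      "MVSUserID": "ASCSXS",
      "cloud provider": {
        "S3CLOUD": [
          {
            "keylabelID": "A00000",
            "name": "/",
            "key": "<encrypted access key>",
            "secretkey": "<encrypted secret key>",
            "timestamp": "2024-05-21 14:07:05"
          }
        ]
      }
    }
  ]
}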

Using GDKUTIL:

Here are two example JCL jobs that you can use in your environment after successfully completing your setup.

Uploading a sequential (PS) data set to an AWS S3 bucket:

//ASCSXSU JOB (T,ASC,ASCSXS),'UPLOAD',MSGLEVEL=(1,1),    
//   CLASS=R,MSGCLASS=Q,REGION=0K,NOTIFY=&SYSUID         
//*                                                      
//UPLOADD EXEC PGM=GDKUTIL,REGION=0M                     
//SYSPRINT DD SYSOUT=*                                   
//SYSOUT DD SYSOUT=*                                     
//SYSIN DD *                                             
  UPLOAD PROVIDER(S3CLOUD) CONVERT                       
/*                                                       
//OBJNAME DD *                                           
     /testinggdkutil/INPUT.TXT                           
/*                                                       
//LOCNAME DD DISP=SHR,DSN=ASCSXS.INPUT.FTP               

Hint: ASCSXS.INPUT.FTP is not the data file itself; it contains the name of the data set to upload. Make sure you note this; I wasted some time here :)
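For example, ASCSXS.INPUT.FTP would contain a single record naming the data set to upload, something like this (ASCSXS.TEST.DATA is a hypothetical name; see the GDKUTIL examples link at the end of this post for the exact conventions):

ASCSXS.TEST.DATA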

Downloading a file from an S3 bucket to OMVS (z/OS UNIX):
//ASCSXSU JOB (T,ASC,ASCSXS),'DOWNLOAD',MSGLEVEL=(1,1),
//   CLASS=R,MSGCLASS=Q,REGION=0K,NOTIFY=&SYSUID          
//*                                                       
//DOWNLOAD EXEC PGM=GDKUTIL,REGION=0M
//SYSPRINT DD SYSOUT=*                                    
//SYSOUT DD SYSOUT=*                                      
//SYSIN DD *                                              
  DOWNLOAD PROVIDER(S3CLOUD) CONVERT                      
/*                                                        
//OBJNAME DD *                                            
     /testinggdkutil/INPUT.TXT                            
/*                                                        
//LOCNAME DD *
 /u/ascsxs/download/cloud/input.txt
/*

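For completeness, here is a minimal sketch of uploading a z/OS UNIX file (the bucket path and file names are hypothetical, and the CONVERT option is simply carried over from the examples above): the UNIX path is supplied in-stream on LOCNAME, just as in the download example.

//ASCSXSU JOB (T,ASC,ASCSXS),'UPLOAD',MSGLEVEL=(1,1),
//   CLASS=R,MSGCLASS=Q,REGION=0K,NOTIFY=&SYSUID
//*
//UPLOADU EXEC PGM=GDKUTIL,REGION=0M
//SYSPRINT DD SYSOUT=*
//SYSOUT DD SYSOUT=*
//SYSIN DD *
  UPLOAD PROVIDER(S3CLOUD) CONVERT
/*
//OBJNAME DD *
     /testinggdkutil/report.txt
/*
//LOCNAME DD *
 /u/ascsxs/upload/report.txt
/*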
Using Amazon Q to create a web experience URL to chat with your mainframe data:

I explored Amazon Q, which allows seamless connections to S3 data sources. I created a web experience URL that enabled users to interact with test data stored in the S3 bucket, providing insights that were previously unattainable.

This is really easy to do: just use Amazon Q Business, and the steps are pretty self-explanatory. Ask me in the comments if you have any issues doing this.

The web experience URL worked great; I was able to get useful insights from my mainframe data with a few effective prompts.

Follow this URL:

https://docs.aws.amazon.com/amazonq/latest/qbusiness-ug/s3-connector.html

Useful links for configuring and using GDKUTIL:

 IBM doc for GDKUTIL:

 https://www.ibm.com/docs/en/zos/2.5.0?topic=utilities-gdkutil-cloud-object-utility-program

 IBM community doc for GDKUTIL:

Sharing z/OS data in the cloud using base z/OS utility GDKUTIL – IBM Z and LinuxONE Community

IBM z/OS 2.5 CDA configuration for cloud storage:

https://www.ibm.com/docs/en/zos/2.5.0?topic=storage-2-cda-configuration-cloud

 Cloud data access cloud credentials storage:

https://www.ibm.com/docs/en/zos/2.5.0?topic=services-cloud-data-access-cloud-credential-storage

 Cloud data access files:

https://www.ibm.com/docs/en/zos/2.5.0?topic=services-cloud-data-access-files  

 GDKUTIL examples:

https://www.ibm.com/docs/en/zos/2.5.0?topic=program-gdkutil-examples

I hope you can successfully implement GDKUTIL on your mainframe by following these steps. Please feel free to ask your questions in the comments.
