Something about DataStage, DataStage Administration, Job Designing and Developing, DataStage Troubleshooting, DataStage Installation & Configuration, ETL, Data Warehousing, DB2, Teradata, Oracle and Scripting.
Showing posts with label Project.
Wednesday, July 15, 2015
Error while creating new jobs in DataStage
An error similar to the following can occur when trying to save a newly created job:
Error On CREATE.FILE command
Creating file "RT_CONFIG4817" as type 30, mkdbfile: cannot create file
RT_CONFIG4817. Unable to create operating system file "RT_CONFIG4817"
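This usually indicates an operating-system-level problem in the project directory rather than a problem with the job design itself. A minimal sketch of some first checks, run on the DataStage server as the user that owns the engine (the project path below is an assumption; substitute your own):

    # free disk space on the filesystem holding the project (hypothetical path)
    df -k /opt/IBM/InformationServer/Server/Projects/MyProject
    # free inodes on the same filesystem
    df -i /opt/IBM/InformationServer/Server/Projects/MyProject
    # write permission on the project directory for the DataStage user
    ls -ld /opt/IBM/InformationServer/Server/Projects/MyProject

If the filesystem is full, out of inodes, or not writable by the DataStage user, the RT_CONFIG file cannot be created and the save fails.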
Monday, January 19, 2015
Some Musts To Avoid When Developing Your Data Warehouse Project
Here’s a list of things to avoid when developing a data warehouse. These are not necessarily in priority order. The priorities depend on your project.
1. Avoid technology for technology’s sake. Focus on business requirements and goals.
2. Avoid proceeding without a high-ranking business sponsor. If you lose your current sponsor, immediately find a replacement in the business side of your organization.
3. Avoid trying to implement the entire data warehouse all at once. Approach the data warehouse project as a series of integrated sub-projects (data marts), and deliver each sub-project as it’s completed.
4. Avoid expending excess energy and budget on structuring the data in the warehouse. Do not over-normalize (for example, into a starflake schema). Your focus should be delivering the best query performance you can, along with a quality, easy-to-use set of user interfaces.
Thursday, March 13, 2014
DOS Batch Script to Export DataStage Jobs Automatically from a Project
Copy the following script into a text file and save it with a ".bat" extension.
This batch script is used to export all jobs from a project.
The script must be run from a DataStage client machine, and the following parameters should be supplied:
Host is the DataStage server name
User is the DataStage username
Password is the DataStage password
Imp Location is the directory where the exported DataStage .dsx files are stored
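A minimal sketch of such a batch file, assuming the DataStage Windows client (including the dscmdexport command-line tool) is installed; the client install path, the project parameter, and the option syntax are assumptions and may differ from the original post's script, so verify them for your client version:

    @echo off
    REM Usage (hypothetical): export_project.bat <Host> <User> <Password> <Project> <ExportDir>
    set HOST=%1
    set USER=%2
    set PASSWORD=%3
    set PROJECT=%4
    set EXPDIR=%5
    REM dscmdexport writes every job in the project to a single .dsx file
    "C:\IBM\InformationServer\Clients\Classic\dscmdexport.exe" /H=%HOST% /U=%USER% /P=%PASSWORD% %PROJECT% "%EXPDIR%\%PROJECT%.dsx"

To export jobs individually instead of the whole project, the dsexport.exe tool in the same directory accepts a job-level option; check its syntax for your release.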
Thursday, September 19, 2013
DataStage compile error 65280; Couldn't change directory to /tmp
Error when trying to compile parallel (PX) jobs in a project.
The compilation errors include the following:
Subprocess command failed with exit status 65280.
Output from subprocess: Couldn't change directory to /tmp : No such file or directory
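As the message says, the compile subprocess cannot reach /tmp, which normally means the directory is missing or not accessible to the user running the compile. A minimal sketch of the usual fix, run as root on the DataStage server (verify against your own environment before changing permissions):

    # recreate /tmp only if it is really missing, then make it world-writable with the sticky bit
    mkdir /tmp
    chmod 1777 /tmp

After the directory exists with the correct permissions, recompile the affected parallel jobs.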
Tuesday, September 03, 2013
Commands to delete files in &PH& directories in IBM InfoSphere DataStage
How should the &PH& directories for DataStage projects be cleaned up? Can the process be automated?
Files in the &PH& directories under DataStage project directories store runtime information when jobs are run and need to be cleared out periodically.
Steps:
To clear the &PH& directory from within DataStage:
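A minimal sketch of the usual approach (make sure no jobs are running in the project before clearing the file):

    1. In the DataStage Administrator client, select the project and click the Command button.
    2. Enter the following command and execute it:
       CLEAR.FILE &PH&

This empties the &PH& hashed file for that project; repeat it for each project as needed.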
Friday, August 23, 2013
DataStage BASIC functions
These functions can be used in a job control routine, which is defined as part of a job's properties and allows other jobs to be run and controlled from the first job. Some of the functions can also be used for getting status information on the current job; these are useful in active stage expressions and before- and after-stage subroutines.
Specify the job you want to control
DSAttachJob
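As a brief illustration, a minimal job control sketch using some of these functions (the job name "HypotheticalJob" is an assumption, and error handling is reduced to a single status check):

    * Attach to the job, run it, wait for completion, then check its status
    hJob = DSAttachJob("HypotheticalJob", DSJ.ERRFATAL)
    ErrCode = DSRunJob(hJob, DSJ.RUNNORMAL)
    ErrCode = DSWaitForJob(hJob)
    Status = DSGetJobInfo(hJob, DSJ.JOBSTATUS)
    If Status = DSJS.RUNFAILED Or Status = DSJS.CRASHED Then
       Call DSLogWarn("HypotheticalJob did not finish OK", "JobControl")
    End
    ErrCode = DSDetachJob(hJob)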
Tuesday, August 13, 2013
Error 39202 when logging into IBM DataStage
DataStage Client programs (Designer, Manager, Director) experience a connection failure, and the message displayed to the user is:
Failed to connect to host: xxxxx, project: UV
(Internal Error (39202))
Labels: Administration, checkpoint, client, commands, connection, DataStage, dsenv, Errors, Logging, orphan, phantom, process, Project, Server, Troubleshoot, UV
Friday, August 09, 2013
DataStage Project Name With Space
Wednesday, June 05, 2013
DataStage Macros Example
Labels: Administration, DataStage, Host, Job, macro, names, programming, Project, Start, Tutorial
DataStage Macros
They are built from DataStage functions and do not require arguments. A number of macros are provided in the JOBCONTROL.H file to facilitate getting information about the current job, and about links and stages belonging to the current job. These can be used in expressions, job control routines, filenames and table names, and before/after subroutines.
Some parallel Transformer macros are listed here; as noted, these can be used anywhere in a job.
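As a brief illustration, a Transformer output column derivation that stamps each row with run information might concatenate a few of these macros (the column this derivation feeds is hypothetical):

    DSProjectName : "|" : DSJobName : "|" : DSJobStartTimestamp

Because the macros take no arguments, they can be dropped straight into derivations, stage variables, or before/after subroutine parameter values.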
Labels: Administration, DataStage, Host, Job, macro, names, programming, Project, Start, Tutorial
Friday, April 05, 2013
Something about DataStage Phantom(&PH& - Universe DB file) directory
The &PH& directory, which is a UniData/UniVerse/U2 file structure in DataStage, contains the files that get created during a job's execution.
The two letters in the name of the directory hint to the word "phantom" which is the naming convention in DataStage for background processes. Every time a job runs, it generates these logs which contain the status of the different phantom processes generated during the run.
There are two types of the files that can be seen in &PH& directory one which starts from DSD.RUN and another one which starts from DSD.OshMonitor.
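For the operating-system side of the cleanup, a minimal shell sketch that removes &PH& files older than a week from one project (the project path and the 7-day retention are assumptions; make sure no jobs are running in the project before deleting):

    # the & characters must be quoted so the shell does not interpret them
    cd /opt/IBM/InformationServer/Server/Projects/MyProject/'&PH&'
    find . -name 'DSD.*' -mtime +7 -type f -exec rm {} \;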
Wednesday, February 20, 2013
Changing Logs method in DataStage
DataStage logging was changed at release 8.1 to log job run detail records into the operational repository, (xmeta) rather than the local project level log files, (RT_LOGxxx).
If XMETA repository logging causes problems (for example, log entries are not visible from the Designer or Director), the logging method can be changed back to local project log files.
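One commonly referenced way to switch back is to edit the DSParams file in the project directory; treat the following as a sketch and confirm the exact procedure for your release against IBM support documentation:

    RTLogging=1
    ORLogging=0

RTLogging=1 re-enables the local RT_LOGxxx project files, while ORLogging=0 turns off logging to the operational (xmeta) repository.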
Thursday, January 10, 2013
How to delete DataStage jobs at the command line
1. Log in to the DataStage Administrator. Select the project and click the Command button. Then execute the following command:
LIST DS_JOBS <job_name>
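The LIST command returns the job's internal number. A commonly cited continuation, run from the same Command window, is sketched below; verify it against IBM support documentation for your release and take a project backup first, since it modifies repository tables directly:

    DELETE FROM DS_JOBOBJECTS WHERE OBJIDNO = <job number>;
    DELETE FROM DS_JOBS WHERE NAME = '<job_name>';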
Thursday, July 26, 2012