IDUG DB2 Tech Conference Sydney, Australia, September 12 - 14, 2012

Going Barefoot with UNIX for the DB2 DBA

Robert Mala, Database Administrator Session Code: 1089 Date and Time: 13th September 2012, 10.30am to 11.30am Platform: DB2 for LUW

Abstract: Using UNIX can be bewildering until you have grasped the basic concepts, capabilities and features. DBAs with a z/OS background can find UNIX to be arcane, esoteric, and pernickety. Getting the UNIX “mindset” takes effort, but once obtained UNIX becomes a capable, powerful and productive environment in which to work. Stepping through tailored examples using DB2 LUW, this presentation is for the DB2 DBA wanting to go barefoot with UNIX.

Objectives: - Introduce UNIX concepts, capabilities and features. - Understand and cultivate a UNIX "mindset". - Provide tailored examples using DB2 LUW with UNIX. - Getting the most from the interactive Korn Shell.

Speaker Biography: Robert has worked with DB2 and SQL for over 20 years and has worked in a variety of database centric roles for organisations in Australia and Europe. Robert is IBM Certified for DB2 UDB Database Administration for LUW & z/OS, and DB2 UDB Family Application Development.

"Going Barefoot with UNIX for the DB2 DBA" by Robert Mala Page 1 IDUG DB2 Tech Conference Sydney, Australia, September 12 - 14, 2012


Agenda • UNIX Software Tools Philosophy • Simple Commands • Compound Commands • awk • Secure Shell • Interactive Korn Shell • GNU screen

Regardless of the platform or product, the activities of a Database Administrator are broadly the same: from checking that housekeeping has completed successfully to promoting changes, from deploying stored procedures to tuning SQL. The working practices and methods may vary but the basic activities of a Database Administrator remain the same.

GUIs can give a large productivity boost for certain types of activities. They can also be very confining. GUIs restrict you to what the developers of the GUI think is needed. This can result in having to do activities in a less than optimal way. The tooling can also leave large gaps in functionality, leaving you to find alternatives. GUIs shield you from the underlying commands and insulate you from server side information which at times is critical to doing the job effectively. If you are responsible for looking after many instances of db2 over many servers, GUIs can be unwieldy when trying to apply changes globally.

CLIs give the freedom to do just about anything. That freedom comes at a cost of a steep learning curve. Scripts can be developed for common activities so that you need not reinvent the wheel each time. However the command line can be a harsh and unforgiving place. It requires a good level of typing skills and time to build the knowledge and skills to be able to drive the CLI proficiently.

Driving the CLI via UNIX is a potent mix. It gives the freedom and flexibility of the CLI coupled with an environment that can supercharge your productivity. Going barefoot with UNIX for the DB2 DBA requires study and practice to establish working practices and finger habits, which once established will enable you to develop command line scripts that help you do your job quickly and efficiently.



UNIX Software Tools Philosophy • It's a philosophy, not a standardised methodology • Based on pragmatic and practical experience • Programs designed to do one thing well • Create new programs rather than extend • Text stream as the universal interface • Output of one program can be used as input to another • Sum is greater than the parts

There are a number of guiding principles used when developing UNIX commands and programs in general. Collectively they are referred to as the UNIX Software Tools Philosophy. The philosophy is not a standard or methodology developed by a committee or from academic ponderings. Instead, it is based on the pragmatic and practical experience of a number of individuals who contributed to the development of UNIX. There are several interpretations of the philosophy but the points of interest for this presentation are:

• Design programs to do one thing well.
• Create new programs rather than extend existing ones.
• Use text as the universal interface, making the output of one program usable as input to another.

Creating new programs rather than extending existing ones goes some way in explaining the overlap of functionality between programs. The idea is that programs become overly complex when they are extended. As additional options and features are added the program becomes difficult to use. This philosophy is complementary to the "do one thing well" guideline. The downside to creating new programs is that UNIX appears to have a number of programs that have overlapping uses. In part this has given UNIX a reputation as being a disorderly collection of tools. The benefit of creating new programs is that tools are small, lightweight programs that can be coupled together in various ways.



Simple Commands

Name Description
awk Pattern scanning and processing language
cat Concatenate and print files
cut Cut out selected fields of each line of a file
date Display the date and time
df Report free disk space
find Find files
grep Search text for a pattern
nl Line numbering filter
printf Print formatted output
sed Stream editor
sort Sort, merge, or sequence check text
tr Translate characters
uniq Report or filter out repeated lines
wc Line, word, byte or character count
xargs Constructs parameter lists and runs commands

While there are many unix utilities there are some that are used very frequently. Of all the utilities, grep, sed, and awk are the “work horses”. The list of commands above is by no means extensive or exhaustive, though my experience has been that the above utilities will cover 95% of my activities. I haven’t included editors, even though these can be driven from the command line.

What makes unix so powerful is when commands are combined. This requires that programs are developed using the unix software tools philosophy. Programs can then be combined in many different ways. This leads to the whole becoming greater than the sum of the parts. Complex results can be achieved from simple steps.

Combining unix utilities requires a similar mindset to writing complex sql statements. In unix, the biggest hurdle is to not think in terms of iterative processing of records within files. Doing so will lead to overly complicated solutions that go against the unix way. Instead you need to think of commands processing files and data streams. Once you have mastered this way of thinking the utilities provided by unix will start to work for you rather than against you.
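As a small sketch of this way of thinking ( the file names and data here are made up ), counting error lines across several output files needs no explicit loop over records:

```shell
# Hypothetical sample data: two job output files with mixed messages.
mkdir -p /tmp/barefoot_demo
printf 'ok\nerror: disk full\nok\n' > /tmp/barefoot_demo/job1.out
printf 'error: network\nerror: disk full\n' > /tmp/barefoot_demo/job2.out

# Stream thinking: concatenate the files and let grep count matching
# lines across the whole stream in a single pass.
cat /tmp/barefoot_demo/*.out | grep -c '^error'   # prints 3
```

An iterative solution would read each file and each line in a shell loop; the stream solution lets the utilities do that work.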



Compound Commands • Pipeline Command • List Commands • And , Or , Sequential , Compound List , Background Process • Command Grouping • Subshell , Brace • Iteration Commands • for , while , until

Compound commands are used to combine commands. There are various ways to combine commands including pipelines, list commands, command grouping and iteration commands.



Pipeline Command • Pipelines are the easiest way to work with UNIX • The | ( pipeline ) operator is used to combine commands

command [ | [ newline… ] command ]…

• A pipeline is a sequence of one or more simple or compound commands. • Stdin for a command comes from the keyboard or the previous command • Stdout for a command goes to the terminal or the next command

The pipe is arguably the most important construct in unix. Pipes are used to flow data from one command to another. It's named as an analogy to a physical pipe. Several unix philosophies make pipes very effective, namely: text as the universal interface, the output of a command reflecting the format of the input, and doing one thing well. When programs follow these philosophies, pipes allow commands to work together.

The shell does not run the commands in a pipe directly. Child processes are created for each command. While you can think of the sequence of execution going left to right, in reality all commands are running concurrently. Though they might not be doing anything if waiting on input from the previous command. To improve the efficiency and performance of a pipeline, previous commands should filter as much of the data as quickly as possible. Commands that must process the entire data-stream before passing data to the next command, such as sorting, are costly. This should all sound very familiar if you tune sql statements.
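A minimal illustration of filtering early ( the data is invented ): grep cheaply reduces the stream before the costly sort stage, which must buffer its entire input, sees it.

```shell
# Four hypothetical log lines; only the ERROR lines are of interest.
printf 'INFO a\nERROR c\nINFO b\nERROR a\n' |
grep '^ERROR' |   # filter first: only two lines reach the next stage
sort              # the expensive stage now sorts a much smaller stream
```

Putting the sort first would force it to buffer all four lines for the same result.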

One or more newlines can be entered after the | character. From the command line you will be given the secondary prompt ( PS2 is normally set to a > character) which indicates that ksh is waiting for the rest of the pipeline to be entered. Adding newlines to a pipeline is useful as it makes reading and understanding the pipeline easier.


Pipeline Command
Unix tools solution to show word frequency
$ man grep |
> tr -cs '[:alnum:]' '[\n*]' |
> tr '[:upper:]' '[:lower:]' |
> sort |
> uniq -c |
> sort -k1,1nr -k2 |
> head -3
108 the
47 a
25 to

The pipeline in the slide demonstrates the power of pipelines ( examples using db2 coming up shortly ). The pipeline prints the 3 most frequently occurring words in descending order from the manpage on the grep command. The pipeline is based on one originally developed by Doug McIlroy ( Bell Labs researcher and inventor of the unix pipe ) in response to a challenge posed by Jon Bentley ( also a Bell Labs researcher ) to print the 25 most frequently occurring words in descending order from a file.

The pipeline approach breaks the requirement down into simpler tasks. The contents of the text file are broken into a list of words, one per line. The words are changed to lower case, sorted, then reduced to a list of unique words with a count of the number of times each word appeared. This is then sorted by the count in descending order. Finally the first 3 lines are printed.

Unix commands generally operate at the file level. While the command syntax is very terse, the unix approach of doing one thing well means that standard idioms to perform common operations are used. The unix approach is 4GL-like in that you specify what needs to be done, not how to do it. Using a 3GL approach would require iterating over records, which would require much more effort to develop. Of course the utilities are programs that do exactly this, though as a user of unix utilities this is black-boxed from view.



Pipeline Command
Unix tools solution to show sqlcode frequency
$ db2 -tvf script.db2 |
> tee script.db2.out |
> grep -E '^DB2|^SQL' |
> sort |
> uniq -c |
> sort -k1n
1 DB20000I The UPDATE COMMAND OPTIONS command
1 DB21034E The command was processed as an SQL
1 SQL0100W No row was found for FETCH, UPDATE
15 DB20000I The SQL command completed successfu

The above shows an example of a pipeline that is more relevant to a DB2 DBA. It parses the output of the db2 command line processor and shows a count of DB2 or SQL codes produced, in ascending order.

The first thing to note is the tee command. I like to keep a record of the output when running scripts as it allows me to go back at any time and check to see what was done. However using the db2 clp in a pipeline means that the output isn't kept. The tee command writes to file and stdout. Used in a pipeline it means input from the previous command can be written to file while also being used as input to the next command. The db2 clp has switches for keeping output as well, though data is appended to the file ( the tee command has the -a switch for appending data to a file ).
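A small sketch of tee in a pipeline ( the file name is illustrative ): the stream is copied to a file while still flowing on to the next command.

```shell
# tee writes its stdin to the named file AND to stdout, so wc -l
# still receives the stream after a copy has been kept on disk.
printf 'one\ntwo\n' | tee /tmp/tee_copy.out | wc -l
cat /tmp/tee_copy.out   # the saved copy is intact
```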

The pipeline uses the common idiom to filter, count and remove duplicates, and sort on the count. The results of the script are filtered by the grep command so that only records that start with DB2 or SQL are returned. This is then sorted and then processed by the uniq command to count and remove duplicates. The output is then sorted by the count.


List Commands • List can be a pipeline or a combination of the following: • And List ( list [ && [ newline…] pipeline ]… ) $ db2 connect to sample && db2 values current timestamp • Or List ( list [ || [ newline…] pipeline ]… ) $ db2 connect to sample || print 'connect failed' • Sequential List ( list [ ; pipeline ]… ) $ db2 connect to sample ; db2 values current timestamp • Compound List ( [ newline ] list [ newline ] ) $ for i in *.del ; do > db2 import from $i of del insert into t1 > done • Background Process ( list & [ pipeline & ]… ) $ db2 -tvf file.db2 > file.db2.out & • Co-process ( list |& )

There are several list operators that can be used with pipelines including: and list, or list, sequential list, compound list, background process and co-process.

Commands provide a return code on completion. A value of 0 indicates success. A value between 1 and 125 indicates failure. A value of 126 indicates that a command was found but could not be executed. A value of 127 indicates a command could not be found. A value of 128 or more indicates the command was terminated by a signal.

ksh has operators for conditionally combining commands based on the return code. This is more compact than using the if command. && runs the command following the operator if the preceding command returns true ( return code 0 ). || runs the command following the operator if the preceding command returns false ( return code <> 0 ). These binary operators allow for conditional execution of subsequent commands in the pipeline. The return code of the previous command is used to determine true or false values. If the preceding command is a pipeline, the return code of the last command in the pipeline is used.
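A minimal sketch of the list operators ( the variable names are arbitrary ), using true and false to stand in for commands returning 0 and non-zero:

```shell
# && runs the right-hand command only if the left returned 0 ( true ).
true && after_true=ran
# || runs the right-hand command only if the left returned non-zero.
false || after_false=ran
# Here the assignment is skipped because false returns non-zero;
# the trailing || : keeps the overall list's return code at 0.
false && skipped=ran || :
echo "${after_true} ${after_false} ${skipped:-skipped}"   # prints: ran ran skipped
```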



Mail error or severe db2 diagnostic messages

$ db2diag -H 1d -l 'Error,Severe' -o $$.out &&
> uuencode $$.out db2diag.txt |
> mail -s 'db2 message alert' rob &&
> print 'email sent' &&
> rm $$.out

The above example mails error or severe db2 diagnostic messages reported in the last 24 hours. The db2diag command is used to write error or severe messages reported in the last 24 hours to an output file. ( The output file $$.out uses the $$ variable which contains the process id or PID of the current shell. )

The return code from the db2diag command is 0 if any messages were found. This will cause the next command to be executed ( due to the and list operator && ). The uuencode command encodes the output file $$.out as db2diag.txt which is piped to the mail command.

The mail command sends the encoded file to rob ( this is a mail alias that has been set up in the ~/.mailrc file ) with a subject of 'db2 message alert'. The string 'email sent' is displayed if the mail was sent successfully. Finally the output file $$.out is removed.



Command Grouping • Commands can be grouped together • Brace grouping runs the commands in the current shell { command ; command ; } | command • Subshell grouping runs the commands in a subshell ( command ; command ) | command

$ ( uuencode f1 f1 ; uuencode f2 f2 ) | > mail -s 'files attached' rob

While commands are typically programs such as grep or sort, commands can also be a grouping of other commands, or a loop construct ( more on iterative commands coming up ). The benefit of grouping is that it allows the output of all of the grouped commands to be used as input to the next pipeline command.

Grouping commands is achieved by using braces or parentheses to surround commands. The difference is that brace grouping runs the commands in the current shell whereas subshell grouping runs the commands in a subshell environment.
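A small sketch contrasting the two groupings: a cd inside a subshell does not affect the current shell, whereas inside a brace group it does.

```shell
start_dir=$(pwd)

# Subshell grouping: the cd happens in a child process only.
( cd / ; ls > /dev/null )
[ "$(pwd)" = "$start_dir" ] && echo "subshell: directory unchanged"

# Brace grouping: runs in the current shell. Note that the spaces
# around the braces and the ; before } are required.
{ cd / ; }
[ "$(pwd)" = "/" ] && echo "brace: now in /"
```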

I've used command grouping in a pipeline to calculate delta values between two invocations of a command where awk was then used to process the output of the command group. More on awk coming up in the presentation.


Iteration Commands - for loops • for loop

for var [ in word ] do compound list done

• c-style for loop ( ksh93 only )

for (( [expr1] ; [expr2]; [expr3] )) do compound list done


Iteration Commands - for loop Process several files using the CLP $ for f in create_objects_*.db2 ; do > print $f > db2 -tvf $f | > tee $f.out | > grep -E '^DB2|^SQL' > done create_objects_1.db2 DB20000I The SQL command completed successfully. DB20000I The SQL command completed successfully. create_objects_2.db2 DB20000I The SQL command completed successfully.

The above slide shows an example of a for loop to process files through the db2 command line processor. ksh patterns can be used in a for loop. ksh will perform pathname expansion and matching files are processed by the for loop.

ksh93 extends patterns to include regular expressions called pattern lists ( though the syntax is different from grep and sed commands ). For example, para*([0-9]) matches the string para and para followed by any number of digits.

More complex pattern lists can be developed that incorporate 'or' and 'and' patterns. For example, @(*.db2&!(para*)) matches any string ending in .db2 and not beginning with para.


Iteration Commands - for loop Switching db2 instance owners and Here documents $ for i in $( grep -v "^#" db2list ) ; do > sudo su - $i <<- 'EOF' > grep -li 'error' $HOME/output/cleanup* > EOF > done /var/db2/db2inst1/output/cleanup_20120825.out /var/db2/db2inst2/output/cleanup_20120826.out /var/db2/db2inst3/output/cleanup_20120826.out /var/db2/db2inst3/output/cleanup_20120827.out

It's common to have several db2 instances running on the one server. Being able to perform tasks that span all instances at one time is a major time saver. For instance, checking cleanup output for errors from several db2 instances can be a time consuming task. However, this time can be dramatically reduced using a for loop to switch db2 instance owner accounts and grep for errors.

Command substitution can be used to produce a word list used by a for loop. In the slide above, the file db2list contains a list of db2 instance owner account names. This list is filtered by the grep command. The -v switch is used to return records that do not match the pattern. In this case, records that do not start with the # character are returned ( this allows accounts to be commented out depending on your needs ). The variable i will be assigned words from the results of the grep command.
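The db2list idiom above can be sketched with made-up instance names:

```shell
# A hypothetical db2list: one account name per line; a leading #
# comments an entry out.
printf 'db2inst1\n#db2inst2\ndb2inst3\n' > /tmp/db2list

# Command substitution turns grep's output into the for loop's word list.
for i in $( grep -v '^#' /tmp/db2list ) ; do
    echo "would switch to: $i"
done
```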

The example above uses sudo to switch to the db2 instance owner account. A Here document is used to provide inlined data to the su ( switch user ) command. The inlined data is a grep command that will return only the names of files containing the pattern 'error' ( case insensitive ) for files named cleanup* in the output directory under the home directory.

Here documents are specified using the << operator and are used to provide inlined data to a command. A user supplied label follows the << and is used to signify the end of the inlined data. The label must appear on a separate line after the inlined data. The label must start in the first character position. However, if a - character is appended, as in <<- , the label can have one or more tab characters before it. This is used for readability. Variables in the inlined data are evaluated before being processed. However, if the label is quoted, variables are not evaluated ( though they may be evaluated by the consuming process ). In the example above the label has been quoted as the $HOME variable needs to be evaluated after switching to the db2 instance owner account.
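A minimal sketch of label quoting, feeding the same text to cat with an unquoted and then a quoted label:

```shell
# Unquoted label: $HOME is expanded by the current shell before the
# data reaches cat.
cat << EOF
home is $HOME
EOF

# Quoted label: the data is passed through literally, so cat sees
# the text $HOME itself.
cat << 'EOF'
home is $HOME
EOF
```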

Here documents are useful for ad-hoc scripting. For example, when I want to prototype db2 commands I sometimes use Here documents to feed the db2 command line processor:

$ db2 +p << EOF > connect to sample > drop table t1 > create table t1 ( c1 int not null , c2 int ) > create unique index i1 on t1 ( c1 ) > alter table t1 add primary key ( c1 ) > EOF


Iteration Commands - c-style for loop $ db2 list indoubt transactions with prompting

1. originator: XA appl_id: xx.xx.x.xx.xxxxx.xxxxxxxxxxxx timestamp: 07/09/2012 16:26:33 auth_id: XXXXX log_full: n type: RM xid: 5741534400000024 0000003600000138 6A6AFD8200000001

2. originator: XA appl_id: xx.xx.x.xx. xxxxx.xxxxxxxxxxxx timestamp: 07/09/2012 16:26:33 auth_id: XXXXX log_full: n type: RM xid: 5741534400000024 0000003600000138 6A6B005F00000001

Enter in-doubt transaction command or 'q' to quit. e.g. 'c 1' heuristically commits transaction 1. c/r/f/l/q:


Iteration Commands - c-style for loop $ for (( i = 1 ; i <= 2 ; i++ )); do > s="${s}r ${i}\ny\nf ${i}\ny\n" > done $ print $s r 1 y f 1 y r 2 y f 2 y $ print $s | > db2 list indoubt transactions \ > with prompting

The above slide shows a c-style for loop to build a sequence of characters that are used to feed the db2 list indoubt transactions with prompting command. Within double quotes, variables are expanded and \n will embed a newline character.

Note that commands can be continued over several lines by using the \ character. As with a pipeline or compound list, the secondary prompt is displayed indicating that the shell is waiting for further input.


Iteration Commands - while and until loops

• while loop

while compound list do compound list done

• until loop

until compound list do compound list done


while loop Monitor temporary space disk usage $ while true > do > df -Pg /var/db2tmp* > sleep 10 > done Filesystem GB blks Used Avail Cap Mounted on /dev/db2tmp1 22.00 1.00 21.00 5% /var/db2tmp1 /dev/db2tmp2 22.00 1.00 21.00 5% /var/db2tmp2 /dev/db2tmp3 22.00 1.00 21.00 5% /var/db2tmp3 /dev/db2tmp4 22.00 1.00 21.00 5% /var/db2tmp4 /dev/db2tmp5 22.00 1.00 21.00 5% /var/db2tmp5

The above example uses an infinite while loop to execute the df command ( disk free ) to monitor temporary space usage. In this situation several file systems have been setup for db2 temporary space. When a db2 sort spills to disk these file systems are used. The above command line script allows you to monitor disk usage in real time.


while loop Monitor index reorgs $ while true ; do > db2pd -d sample -reorgs index | > grep -p 'Status: In Progress' || break > sleep 10 > done Retrieval Time: 08/07/2012 06:16:57 TbspaceID: 8 TableID: 10 Schema: SCHEMA_1 TableName: TABLE_1 Access: Allow write Status: In Progress Start Time: 08/07/2012 06:12:08 End Time: -

The above uses an infinite while loop to execute a compound list. The pipeline uses db2pd to get information about index reorgs and greps out only those reorgs that are currently in progress. If the grep command fails to find the pattern the Or list breaks out of the infinite loop.

When developing command line scripts, I start with a simple template and then add to it ( and test ) several times until I get the result I'm after. For one-off requirements, the command line script can be discarded. For the more useful or complex commands, I save them. There are several options for using saved command line scripts including aliases, functions and scripts.



Pipe Mills $ lsvg -l db2data1 | > tail +3 | > while read lv dummy ; do > lslv -l $lv > done db2data01:/var/db2/db2data01 PV COPIES IN BAND hdisk10 585:000:000 20% db2data02:/var/db2/db2data02 PV COPIES IN BAND hdisk11 585:000:000 20%

The above example shows a pipeline that feeds a while loop. This type of pipeline is referred to as a pipe mill. Pipe mills are needed when commands can't be used directly in a pipeline because they do not use stdin to get their input. In the example above the lslv command requires a value be given to the -l switch. This argument value can't be provided via a pipeline or the xargs command. The lslv command has to be run for each logical volume name.

The pipeline lists the logical volumes for the volume group db2data1. The first 2 lines of the command output are ignored using the tail command. Using a positive number as an argument causes tail to write from that line onwards. This is then processed by the while loop. The while loop uses the read command to populate two variables. lv will contain the first word and dummy the remainder of the input. The lv variable is then specified for the lslv command.
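The pipe-mill shape can be sketched with invented data standing in for the lsvg output, and echo standing in for lslv:

```shell
# Stand-in data: a logical volume name followed by other columns.
printf 'lv_data01 jfs2 10\nlv_data02 jfs2 20\n' |
while read lv dummy ; do
    # read splits each line: lv gets the first word, dummy the rest.
    # echo stands in for a command such as lslv -l that takes its
    # input as an argument rather than on stdin.
    echo "processing $lv"
done
```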



awk - overview • A simple programming model and language syntax $ awk 'BEGIN{ print "hello world" }' • Applies conditions and actions over an input stream • Splits input records into fields automatically $ print hello world | awk '{ print $1 , $2 }' • Can define record or field delimiters • Used for one-liners in pipelines through to complete applications • Various built-in functions and variables available • Good support for regular expressions • Supports user defined functions and locally scoped variables

awk is a general purpose scripting language that can be used for anything from one-liners in pipelines ( effectively replacing grep and sed in many instances ) to complete applications. It has a simple programming model where an input stream is processed by applying actions when conditions are met.

The input stream is automatically opened and read ( though you can explicitly open and read files if you want ). Record fields are automatically split into internal variables, though what constitutes a record and a field can be changed. awk is a very powerful language that enables you to do a host of text parsing and manipulation tasks. A presentation devoted just to awk would be needed to do it justice.

The following outlines the awk programming model.

BEGIN { statement ; statement }
condition1 { statement ; statement }
condition2 { statement ; statement }
condition3 { statement ; statement }
END { statement ; statement }
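A tiny instance of the model ( the instance names are invented ): BEGIN and END bracket the stream, patterns select records, and fields are split automatically.

```shell
printf 'db2inst1 up\ndb2inst2 down\ndb2inst3 up\n' |
awk '
BEGIN        { print "checking instances" }   # runs before any input
$2 == "up"   { up++ }                         # condition and action
$2 == "down" { print $1, "needs attention" }
END          { print up, "instances up" }'    # runs after all input
```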



awk • Using UNIX commands to list database aliases $ db2 list db directory | > grep '^ Database alias' | > tr -s ' ' | > cut -d ' ' -f5

• Using awk to list database aliases $ db2 list db directory | > awk '$2 == "alias" { print $4 }'

The output of db2 commands is textual. However, the format is often verbose and isn't usable as input to another command. For example, the output of the 'list database directory' command isn't in a format that is readily usable in a pipeline.

awk has a simple and elegant programming model and language. It is also a very versatile language that can be used for simple one line scripts or entire applications. In many instances awk can be used to replace several commands in a pipeline.



awk Unix tools generating reports $ db2 list history archive log since \ > 20120731 for sample | > awk '/Start Time:/ { > print substr( $3 , 1 , 10 ) > }' | > uniq -c 5 2012073103 8 2012073104 3 2012073105 2 2012073107

The above pipeline produces a report on the number of archive logs cut per hour. awk is used to parse the output from the db2 list history archive log command. For any record that contains the string 'Start Time:', the first 10 characters of field 3 are printed. Field 3 contains the start timestamp for the log archive operation. The uniq -c command is then used to count and remove duplicates. This produces a report of the number of archive logs cut for given hours.



awk Unix tools generating db2 commands

$ db2 "call reorgchk_ix_stats( 'S' , 'S1' )" | > awk '/\*/ && $NF != "*----" { print $2 }' | > sort -u | > xargs \ > printf "reorg indexes all for table s1.%s\n" reorg indexes all for table s1.TABLE1 reorg indexes all for table s1.TABLE2 reorg indexes all for table s1.TABLE3 reorg indexes all for table s1.TABLE4 reorg indexes all for table s1.TABLE5

The example above generates db2 reorg index commands. The reorgchk_ix_stats administrative procedure is used to return information about indexes in the S1 schema. awk is used to parse this information and print table names that have indexes needing to be reorganised ( other than having the cluster ratio flag set ). To do this the awk condition looks for the '*' character anywhere in the record and checks the last record field isn't "*----". Duplicate table names are removed. db2 reorg index commands are then constructed. The xargs command takes an input stream and converts it to a parameter list. This is needed as the printf command takes its arguments from the command line rather than from the standard input stream.

Don't lose sight of the fact that a lot of the information from db2 commands can be obtained from administrative views, functions or procedures. Why go to the trouble of developing a unix tool solution to parse db2 command output when you can use sql and its capabilities?

For example, the reorgchk command produces output that isn't readily usable. While it could be parsed using unix tools, why bother when the reorgchk_tb_stats and reorgchk_ix_stats procedures can be used to give the same information in a form that is much easier to handle. Then there are also times when you need to use db2 commands over the administrative sql, as the command output contains more information.

"Going Barefoot with UNIX for the DB2 DBA" by Robert Mala Page 24 IDUG DB2 Tech Conference Sydney, Australia, September 12 - 14, 2012

awk Monitor temporary space disk usage revisited.

$ while true ; do
> df -Pg /var/db2tmp* |
> awk '{ a += $2 ; u += $3 ; f += $4 }
> END{
> "date +%H.%M.%S" | getline t
> printf("%s %.2f %.2f %.2f\n",t,a,u,f)
> }'
> sleep 10
> done
16.19.00 110.00 5.00 105.00
16.19.10 110.00 5.10 104.90

The above uses an infinite while loop to execute a compound list. The pipeline gets disk free information for the file systems used for db2 temporary storage. As db2 temporary storage is spread over several file systems, awk is used to aggregate the total, used and free space. At the end of the input stream, the system time is obtained and the aggregated figures printed. The loop sleeps for 10 seconds before executing the pipeline again.
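The aggregation logic can be tested on its own by feeding awk some fabricated df style rows ( the device names and sizes are made up ):

```shell
# Sum columns 2-4 ( total, used, free ) across all rows, then in the END
# block obtain the time by piping the date command into getline and print
# the aggregated figures.
printf '/dev/tmp1 60 3 57\n/dev/tmp2 50 2 48\n' |
awk '{ a += $2 ; u += $3 ; f += $4 }
END{
    "date +%H.%M.%S" | getline t
    printf("%s %.2f %.2f %.2f\n", t, a, u, f)
}'
```

The output line ends with 110.00 5.00 105.00, matching the totals in the slide. The "cmd" | getline idiom reads one line of a command's output into a variable, which is how the timestamp gets into the report without leaving awk.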

When developing things like the above, I start with a simple template and then add to it ( and test ) several times until I get the result I'm after. For one-off requirements, the commands can be discarded. For more useful pipelines I save them by defining an alias, function or script.

"Going Barefoot with UNIX for the DB2 DBA" by Robert Mala Page 25 IDUG DB2 Tech Conference Sydney, Australia, September 12 - 14, 2012

awk

$ db2 list db directory |
> sed -e '/^$/d' \
> -e 's/^Database \([0-9]*\) entry:/@ \1/'
System Database Directory
Number of entries in the directory = 2
@ 1
Database alias = SAMPLE
Database name = SAMPLE
Local database directory = /var/db2/db2i
Database release level = d.00
...
@ 2
Database alias = SAMPLE2
Database name = SAMPLE
Node name = SERVER2
Database release level = d.00
...

Parsing db2 command output sometimes requires more effort to get it into a usable form. A strength of awk is the ability to define the delimiters that separate records and fields. By default a new line is the record delimiter and white space the field delimiter. When dealing with complex formats it is sometimes easier to parse the data using different record and/or field delimiters, though doing so might require some manipulation of the data stream before it is processed by awk.

For example, the output from the list database directory command can be manipulated so that the pattern '^Database [0-9]* entry:' is replaced with a '@' character. awk can now be told to use the '@' character as the record delimiter.

"Going Barefoot with UNIX for the DB2 DBA" by Robert Mala Page 26 IDUG DB2 Tech Conference Sydney, Australia, September 12 - 14, 2012


awk

$ db2 list db directory |
> sed -e '/^$/d' \
> -e 's/^Database \([0-9]*\) entry:/@ \1/' |
> awk '
> BEGIN{ RS="@" ; FS="[=\n]" }
> NR > 1 {
> printf("%s %-10s %-10s %-20s\n",$1,$3,$5,$7)
> }'
1 SAMPLE     SAMPLE     /var/db2/db2ccp2
2 SAMPLE2    SAMPLE     SERVER2
3 SAMPLE3    SAMPLE     SERVER3

awk has several internal variables that control its operation. The RS variable ( record separator ) defines the record delimiter ( a single character only ). The FS variable ( field separator ) defines the field delimiter ( a single character or a pattern ). These variables are usually set in the BEGIN stanza before processing of the input stream starts.

In the above example RS is set to the @ character and FS to the pattern [=\n] ( equals character or new line ). The awk condition NR > 1 is used to print records other than the first ( the internal variable NR holds the current record number ). Due to the setting of the FS variable, the odd numbered fields contain the values to be printed.
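The RS/FS technique can be tried against a fabricated directory entry ( the entry below is made up, and trimmed to one database ):

```shell
# sed deletes blank lines and marks each entry with '@'; awk then treats
# '@' as the record separator and '=' or newline as field separators, so
# the attribute values land in the odd numbered fields of each record.
printf 'Database 1 entry:\n\n Database alias = SAMPLE\n Database name = SAMPLE\n Local database directory = /var/db2/db2i\n' |
sed -e '/^$/d' -e 's/^Database \([0-9]*\) entry:/@ \1/' |
awk '
BEGIN{ RS="@" ; FS="[=\n]" }
NR > 1 {
    printf("%s %-10s %-10s %-20s\n", $1, $3, $5, $7)
}'
```

With one entry this prints a single line containing the entry number, the alias, the name and the local directory. The NR > 1 condition skips the ( empty ) text before the first '@'.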

"Going Barefoot with UNIX for the DB2 DBA" by Robert Mala Page 27 IDUG DB2 Tech Conference Sydney, Australia, September 12 - 14, 2012

Secure Shell

$ ssh-keygen
Generating public/private rsa key pair.
Enter file in which to save the key (/home/user1/.ssh/id_rsa):
Enter passphrase (empty for no passphrase):
Enter same passphrase again:
Your identification has been saved in /home/user1/.ssh/id_rsa.
Your public key has been saved in /home/user1/.ssh/id_rsa.pub.
The key fingerprint is:
xx:xx:xx:xx:xx:xx:xx:xx:xx:xx:xx:xx:xx:xx:xx:xx user1@server1
The key's randomart image is:
+--[ RSA 2048]----+
|          o+     |
|       . o + o.  |
|E.      = + .    |
|+.     + =       |
|=     . * . S    |
|        *.+o     |
+-----------------+

Most sites have several test and production environments. As a DB2 DBA you need to be able to manage these environments. The Secure Shell ( ssh ) provides the ability to access servers without the need to provide a password. This can dramatically reduce the time and effort needed to manage environments.

To be able to access a remote server without having to provide a password requires some initial setup work. You need to create a public/private key pair and then add the public key ( that's the file with the .pub extension ) to the ~/.ssh/authorized_keys file on all the remote servers you want to access, making sure the permissions for the directory are set to 700 and the file to 600. Once the public key has been added, access the remote server via ssh and verify you're prompted for the passphrase. The last step is to load the private key into memory ( see next slide ).
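The one-time setup on each remote server amounts to something like the following sketch ( the key file name assumes the default id_rsa; adjust to taste ):

```shell
# Create the .ssh directory and authorized_keys file with the
# permissions sshd insists on ( 700 and 600 respectively ).
mkdir -p ~/.ssh
chmod 700 ~/.ssh
touch ~/.ssh/authorized_keys    # then append the contents of your id_rsa.pub
chmod 600 ~/.ssh/authorized_keys
```

If the permissions are looser than this, sshd will normally refuse to use the key and silently fall back to password authentication, which is a common cause of "it still asks for a password" problems.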

Being able to access servers without having to provide a password each time you login is great. But don't think you're completely free of your passwords. Most sites have a policy of expiring account passwords after so many weeks. While you can still access your account if you have set up a key pair, it's good practice to keep changing account passwords, as once in a while you'll be doing something that actually requires your password.

"Going Barefoot with UNIX for the DB2 DBA" by Robert Mala Page 28 IDUG DB2 Tech Conference Sydney, Australia, September 12 - 14, 2012

Secure Shell

$ eval $( ssh-agent -s )
Agent pid 60555396
$ ssh-add
Enter passphrase for /home/user1/.ssh/id_rsa:
Identity added: /home/user1/.ssh/id_rsa
$ for s in server2 server3 server4 ; do
> ssh $s 'hostname ; lsuser -a user2'
> done
server2
user2
server3
3004-687 User "user2" does not exist.
server4
user2

The next thing you need to do is load your private key into memory. When doing this you'll be prompted to enter your passphrase. This need only be done once per session. Once loaded you can ssh to the remote server without providing a password.

The following shows the commands to add to your .profile so that your private key is loaded when you create a new session. The trap command is executed when you exit from the session. It tests to see if the $SSH_AGENT_PID has been set and if so kills the running ssh-agent.

trap 'test -n "$SSH_AGENT_PID" && eval $( ssh-agent -k )' EXIT
eval $( ssh-agent -s )
ssh-add

"Going Barefoot with UNIX for the DB2 DBA" by Robert Mala Page 29 IDUG DB2 Tech Conference Sydney, Australia, September 12 - 14, 2012


Secure Shell

$ for s in server1 server2 server3 ; do
> ssh $s 'sudo su - db2inst1' << EOF
> db2 +o connect to sample &&
> db2 -x values current timestamp
> EOF
> done
2012-08-20-16.57.27.443525
2012-08-20-16.57.28.845883
2012-08-20-16.57.30.551746

Some accounts don't allow remote logins. The only way to access these accounts is to login using another account ( usually your own user account ) and then switch user. This is done so that access to privileged accounts can be traced. If you could login directly to a privileged account it would be difficult to determine who was actually using the account.

Using such accounts is more of a challenge as you need to pass in the commands to be executed. The above example uses a Here document. Here documents allow text entered at the shell to be passed to a program as if it had come from a file ( so called as the data is right "here" ). Here documents are terminated using a label specified with the << operator.

In the slide a Here document is used to provide data to ssh, which is passed through to the su command. The label EOF was used with the << operator. EOF on a separate line terminates the Here document.
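The mechanics of a Here document can be tried locally, with cat standing in for the ssh/su pipeline:

```shell
# cat reads the Here document from standard input as if it were a file;
# the EOF label on a line of its own terminates it.
cat << EOF
connect to sample
values current timestamp
EOF
```

This prints the two lines between the labels. Substituting 'ssh server1 ...' for cat is all the slide's example adds: the same text is simply delivered to the remote command's standard input.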

"Going Barefoot with UNIX for the DB2 DBA" by Robert Mala Page 30 IDUG DB2 Tech Conference Sydney, Australia, September 12 - 14, 2012


Interactive Korn shell
• Aliases
  • Save useful or commonly used commands or pipelines
    $ db2gsfdo sample > snap_db_$(datetime).out
  • Reduce typing and typing errors
  • Command line interaction via expansion macros
• Functions
  • Easier to manage than aliases
  • Extend the Korn shell functionality
  • Stepping stone to writing shell scripts
• vi Command Line Editor
  • Editing directives

"Going Barefoot with UNIX for the DB2 DBA" by Robert Mala Page 31 IDUG DB2 Tech Conference Sydney, Australia, September 12 - 14, 2012


aliases • Add aliases to an Environment file ~/.kshrc

alias datetime='date +%Y%m%d.%H%M%S' alias db2ad='db2 activate database' alias db2at='db2 attach to' alias db2cr='db2 connect reset' alias db2ct='db2 connect to' alias db2d='db2 detach' alias db2dd='db2 deactivate database'

• Define the Environment file in ~/.profile

export ENV=~/.kshrc

Aliases are one way of creating your own commands to run commonly used commands with preferred options. For example, if you want the verbose option enabled when using the db2 command line processor you could define an alias for the db2 command, eg. alias db2='db2 -v'. Aliases are a great way of reducing the amount of typing when entering long db2 commands. I have a number of aliases defined just for this purpose.

I start all my aliases with 'db2' followed by the first letter of each word of the command. For example, the alias for the 'activate database' command is db2ad. This approach allows me to remember an alias name by reciting the command to myself. Over time finger habits are established and I no longer need to recite the db2 command.

"Going Barefoot with UNIX for the DB2 DBA" by Robert Mala Page 32 IDUG DB2 Tech Conference Sydney, Australia, September 12 - 14, 2012


macro expansion aliases

$ export esc=<Ctrl-v><Esc>

$ export esc=^[█

$ alias _f="ccdb2 -tvf${esc}_"

$ vi myfile.db2

<Esc> @f

$ db2 -tvf myfile.db2█

There is a facility called 'macro expansion aliases'. These aliases can include editing directives. The alias names must start with an '_' ( underscore ) followed by a single character. The string you assign to the alias can include editing directives and literal text. To represent the escape key you need to enter a character escape sequence.

Character escape sequences are entered by using ctrl-v followed by the key being escaped, in this case the escape key. On the terminal the two characters ^[ are displayed. This is actually a single character; moving the cursor over it causes the cursor to jump an extra character. When defining macro expansion aliases you are initially in edit mode. To run a macro expansion alias, use the '@' edit command followed by the single character used to define the alias.

"Going Barefoot with UNIX for the DB2 DBA" by Robert Mala Page 33 IDUG DB2 Tech Conference Sydney, Australia, September 12 - 14, 2012


vi Command Line Editor • Use the command line like a text editor • Reuse vi finger habits • Distinct modes of operation • Directives used to perform edits

"Going Barefoot with UNIX for the DB2 DBA" by Robert Mala Page 34 IDUG DB2 Tech Conference Sydney, Australia, September 12 - 14, 2012

vi Command Line Editor
• Enable vi Editing Mode ( put in ~/.kshrc )
set -o vi
• Use <Esc> for Command mode

$ db2 list aplications█
<Esc> Fp a p <Enter>
( <Esc> for command mode, Fp find previous 'p', a append, type 'p', <Enter> to run )
<Esc> k A  for database sample <Enter>
( <Esc> for command mode, k previous command, A append at end of line, <Enter> to run )

"Going Barefoot with UNIX for the DB2 DBA" by Robert Mala Page 35 IDUG DB2 Tech Conference Sydney, Australia, September 12 - 14, 2012

vi Command Line Editor

$ vi myfile.db2

$ db2 -tvf <Esc> _

$ db2 -tvf myfile.db2█

$ db2 -tvf myfile.db2 <Esc> #

$ #db2 -tvf myfile.db2

$ #db2 -tvf myfile.db2 <Esc> #

$ db2 -tvf myfile.db2

"Going Barefoot with UNIX for the DB2 DBA" by Robert Mala Page 36 IDUG DB2 Tech Conference Sydney, Australia, September 12 - 14, 2012

vi Command Line Editor

Moving the Cursor Directives ( command / mirror )
[n] h / [n] l   Move left / right one character
[n] b / [n] w   Move left / right one word
[n] B / [n] W   Move left / right one non-blank word
[n] e / [n] E   Move to end of word / non-blank word
0 / $           Move to first / last character of line
^ / [n] |       Move to first displayable character / to column n

Delete and Change Directives ( command / mirror )
D / C           Delete to end of line / and enter input mode
dd / cc         Delete entire line / and enter input mode
[n] X / [n] x   Delete character backward / forward

Yank and Put Directives ( command / mirror )
[n] y m / Y     Yank from cursor to motion directive / to end of line
yy              Yank entire line
[n] p / [n] P   Put previously yanked or deleted text right / left of cursor

This slide is for reference. It details vi Command Line Editor directives. It’s well worth learning the various directives as they will make your time at the command line much more productive ( and enjoyable ! ). However it takes time and effort for finger habits to form and usage become second nature.

"Going Barefoot with UNIX for the DB2 DBA" by Robert Mala Page 37 IDUG DB2 Tech Conference Sydney, Australia, September 12 - 14, 2012


vi Command Line Editor

Previous Command Directives ( command / mirror )
[n] k, [n] - / [n] j, [n] +   Scroll history backward / forward
?[^]string / /[^]string       Search history backward / forward
n / N                         Repeat search in same / opposite direction

Adding and Changing Directives ( command / mirror )
i / a           Insert / append at current position
I / A           Insert at beginning of line / append at end of line
r / R           Replace existing character / characters

Moving to Character Directives ( command / mirror )
[n] fx / [n] Fx   Find next x character right / left ( inclusive )
[n] tx / [n] Tx   Find next x character right / left ( exclusive )
[n] ; / [n] ,     Repeat find in same / opposite direction


"Going Barefoot with UNIX for the DB2 DBA" by Robert Mala Page 38 IDUG DB2 Tech Conference Sydney, Australia, September 12 - 14, 2012


vi Command Line Editor

Miscellaneous Directives
[n] .    Repeat the last change directive
~        Invert case of current character
_        Append last word of previous command and enter input mode
#        Prepend comment character and send to history file;
         on a commented line, remove the comment character and remain in command mode
=        List pathnames that match the current word
\        Pathname completion
*        Pathname expansion
%        Move cursor to balancing parenthesis, brace or bracket
@x       Insert expansion of macro expansion alias _x
[n] v    Edit current line using vi
ctrl-l   Redraw current line on a new line


"Going Barefoot with UNIX for the DB2 DBA" by Robert Mala Page 39 IDUG DB2 Tech Conference Sydney, Australia, September 12 - 14, 2012


GNU screen
• GNU screen is a window manager
• Create and manage multiple windows
• Name, list, split, move between windows
• Cut'n'paste lines or blocks between windows
• Share screen sessions
• Detach and reattach to screen sessions

GNU screen is a window manager. It allows you to have multiple windows from a single session. You can create and manage windows including giving windows names, list existing windows and split the window into regions. You can also cut'n'paste lines or blocks between windows. If screen is configured to support it, you can share screen sessions amongst several users.

One of the main benefits of screen is the ability to detach from a session and then reattach to it later. This allows you to carry on working from where you were as if you had never logged off. Processes continue to run after you detach from a session, which means background jobs do not need to be run using nohup.

Screen can be set up in the .profile to automatically reattach to a detached screen session if there is one to attach to, otherwise a new screen session is created.
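One way to sketch that ( this snippet is my own, not from the slides ) is to test the STY variable, which screen sets inside its windows, and replace the login shell with screen when it is absent:

```shell
# Hypothetical .profile fragment: if not already inside a screen window
# and this is an interactive terminal, reattach to a detached session
# or start a new one ( -D -R: reattach, detaching elsewhere if needed ).
if [ -z "$STY" ] && [ -t 0 ]; then
    exec screen -D -R
fi
```

The exec means exiting screen also logs you out, which keeps stray login shells from accumulating; drop the exec if you would rather fall back to a plain shell.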

"Going Barefoot with UNIX for the DB2 DBA" by Robert Mala Page 40 IDUG DB2 Tech Conference Sydney, Australia, September 12 - 14, 2012


GNU screen • Use Ctrl-a ? to display the list of screen commands

Screen key bindings, page 1 of 2.

Command key: ^A Literal ^A: a

break ^B b license , removebuf = clear C lockscreen ^X x reset Z colon : log H screen ^C c copy ^[ [ login L select " ' detach ^D d meta a silence _ digraph ^V monitor M split S displays * next ^@ ^N sp n suspend ^Z z dumptermcap . number N time ^T t fit F only Q title A flow ^F f other ^A vbell ^G focus ^I pow_break B version v hardcopy h pow_detach D width W help ? prev ^H ^P p ^? windows ^W w history { } quit \ wrap ^R r info i readbuf < writebuf > kill K k redisplay ^L l xoff ^S s lastmsg ^M m remove X xon ^Q q

[Press Space for next page; Return to end.]

Of all the screen commands, ? is the one not to forget, as it displays all the screen commands and the key bindings that invoke them. To enter a screen command you must press Ctrl-a first before entering the required command. So to display the screen key bindings you enter Ctrl-a then ?.

"Going Barefoot with UNIX for the DB2 DBA" by Robert Mala Page 41 IDUG DB2 Tech Conference Sydney, Australia, September 12 - 14, 2012


GNU screen • Use .screenrc to initialise screen setup

$ cat .screenrc
startup_message off
hardstatus on
hardstatus alwayslastline
hardstatus string "%w"
term xterm
defscrollback 2000
screen -t db2inst1@server1 ssh -t user1@server1 'sudo su - db2inst1'
screen -t db2inst1@server2 ssh -t user1@server2 'sudo su - db2inst1'
screen -t db2inst1@server3 ssh -t user1@server3 'sudo su - db2inst1'

Screen uses an initialisation file called .screenrc. This file can be used to set up screen just the way you want each time it starts. An example .screenrc file is shown above. It disables the startup message; the hard status line is always shown at the bottom of the window, listing the windows created; the scroll back buffer is set to 2,000 lines. Three windows are created, each accessing a db2 instance owner on a different server.

"Going Barefoot with UNIX for the DB2 DBA" by Robert Mala Page 42 IDUG DB2 Tech Conference Sydney, Australia, September 12 - 14, 2012


GNU screen • Use Ctrl-a S to split the screen into regions

$
0 db2inst1@server1
$
1 db2inst1@server2
$
2 db2inst1@server3
0* db2inst1@server1  1 db2inst1@server2  2 db2inst1@server3

"Going Barefoot with UNIX for the DB2 DBA" by Robert Mala Page 43 IDUG DB2 Tech Conference Sydney, Australia, September 12 - 14, 2012

GNU screen
Ctrl-a c to create a new window
Ctrl-a k to kill the current window
Ctrl-a A to title the current window
Ctrl-a S to add a new region
Ctrl-a @ to switch region
Ctrl-a n to switch to the next window
Ctrl-a p to switch to the previous window
Ctrl-a 1 to switch to window 1
Ctrl-a Ctrl-a to switch to the other window
Ctrl-a d to detach from screen ( use screen -D -R to attach to a detached session )
Ctrl-a ? to show help

There are a number of commands used to control screen. To enter a screen command you must prefix it with Ctrl-a. By default each screen command is also bound to Ctrl plus the command key. For instance, the command to create a new window is c; to enter it you can use either Ctrl-a c or Ctrl-a Ctrl-c.

"Going Barefoot with UNIX for the DB2 DBA" by Robert Mala Page 44 IDUG DB2 Tech Conference Sydney, Australia, September 12 - 14, 2012

References
• Raymond, Eric S. The Art of Unix Programming. Boston: Addison-Wesley, 2004. Print.

• Bolsky, Morris I., and David G. Korn. The New KornShell: Command and Programming Language. Upper Saddle River, NJ: Prentice Hall PTR, 1995. Print.

• Rosenblatt, Bill, and Arnold Robbins. Learning the Korn Shell. Sebastopol, CA: O'Reilly, 2002. Print.

• Robbins, Arnold, and Nelson H. F. Beebe. Classic Shell Scripting. Sebastopol, CA: O'Reilly, 2005. Print.

• Robbins, Arnold. Effective Awk Programming. Beijing: O'Reilly, 2001. Print.

• Screen User's Manual.

"Going Barefoot with UNIX for the DB2 DBA" by Robert Mala Page 45 IDUG DB2 Tech Conference Sydney, Australia, September 12 - 14, 2012

Robert Mala Database Administrator [email protected]

Session Code: 1089 Going Barefoot with UNIX for the DB2 DBA

"Going Barefoot with UNIX for the DB2 DBA" by Robert Mala Page 46