Last update on October 8, 1997
Send questions, comments and corrections to MDMS
dbq is a command line utility used to send commands to a database and to retrieve result sets and status information. Result data sets are written using the STDOUT file descriptor. Status messages are written using the STDERR file descriptor. By default, both of these descriptors send output to your terminal.
While dbq can be used directly from a system prompt in a terminal screen, it was designed mainly for use in command line interpreter script files, offering a method to gain access to database information from within the script. If your database work requires interactive access, see the MDMS utilities dbView and dbWindow; they will serve you better. Note: dbView-specific commands cannot be used with dbq. dbq can only execute SQL/Sybase ISQL commands.
dbq offers two types of connections. You can supply a user name and password on the command line, in which case those parameters are used to make the connection. Or, if you are registered with an MDMS password server, the user name and password will be supplied by the server. The connection mode that uses a password server is called secure dbq. To find out more about the password server, refer to the Password Server Administration Guide.
To get started, let's look at a simple dbq command that returns results after executing the Sybase stored procedure sp_help. Along with the command, we must supply parameters to dbq that allow it to connect to the appropriate database server and database. The entire set of parameters needed is:
The dbq command looks like this:
> dbq -u franklin -p 'not*a*pswd' -s MIPSDB1 -d catalog sp_help
The password is in single quotes to prevent the Unix C Shell from interpreting the special characters. If you are using a different interpreter, it may or may not be necessary to quote special characters.
A partial listing returned after this command is executed looks like this:
Name                     Owner           Object_type
------------------------ --------------- ----------------------
groupAccounts            dbo             view
groupCharges             dbo             view
groupEmp                 dbo             view
missionObjective         dbo             view
myAccounts               dbo             view
myCharges                dbo             view
myEmp                    dbo             view
accounts                 dbo             user table
attrValues               dbo             user table
attributes               dbo             user table
...
The same command using the password server to provide the user name and password looks like this:
> dbq -s MIPSDB1 -d catalog sp_help
Before you can run dbq, you must set up its environment by doing the following:
> which dbq /usr/local/bin/dbq
The syntax for dbq is shown below.
dbq [-u <userName>] [-p <password>] -s <DataserverName> -d <DataBaseName> [-c {<commandFile> | -}] [-h {yes,no}] [-o <outputFile>] [-f {table, list, export}] [-b <bufferSize>] <command>
All arguments except <command> can be placed in any order on the command line. The database command must be the last argument. If you place your database command in a file, using the -c <commandFile> argument, you should not include a command on the command line.
If you use secure dbq, only the database server name and a command (or command file) are mandatory parameters. Otherwise you must also include a user name and password.
The arguments for the dbq command are defined below:
> cat $SYBASE/interfaces
/* This is a single line comment */

/*
** This is a
** multi-line comment
*/

If you use this option, do not include a database command as a dbq command line argument.
<field name> = <value>
If dbq can connect to a database server and execute its commands without error, it returns a value of 0. If one or more errors occur, it returns a value of 1.
Even though a query may execute properly, it may return no rows, because no rows qualified. In this case, dbq still exits with a value of 0, because the query was executed properly. For queries, after checking the exit status of dbq, check the number of rows returned before proceeding.
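In a shell script, that two-step check can be sketched as follows; the printf is a stand-in for the real dbq call, so the example is self-contained:

```shell
# Check dbq's exit status first, then the number of rows returned.
# The printf stands in for: rows=$(dbq -s MIPSDB1 -h no '<query>')
rows=$(printf '2\tGLL\t35\n')
status=$?
if [ "$status" -ne 0 ]; then
    echo "dbq failed" >&2
    exit 1
fi
count=$(printf '%s\n' "$rows" | grep -c .)
echo "$count row(s) returned"
```

Only after both checks pass should the script go on to process the rows.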
Information associated with results returned, including headers, data, and status lines, is written to STDOUT. Error messages returned from the database server or generated by dbq are written to STDERR.
For Unix environments, if you send dbq a SIGINT (2) or SIGTERM (15) signal, dbq will finish its current command before exiting. All other signals will cause dbq to exit immediately.
Signals are normally sent to a process with the Unix kill command, but typing <Ctrl-C> in the terminal where dbq is running also sends it a SIGINT.
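The behavior can be sketched with a shell stand-in that, like dbq, catches SIGTERM, finishes its current work, and then exits cleanly:

```shell
# A stand-in process that handles SIGTERM the way dbq does: finish the
# current command, then exit cleanly instead of dying immediately.
sh -c 'trap "echo finishing current command; exit 0" TERM
       while :; do sleep 1; done' &
pid=$!
sleep 1              # give the child time to install its trap
kill -TERM "$pid"    # equivalent to: kill -15 <pid>
wait "$pid"
status=$?
echo "stand-in exited with status $status"
```

Any signal other than SIGINT or SIGTERM would bypass this cleanup and terminate dbq immediately.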
dbq represents single precision floating point numbers (4 bytes) in decimal format if the number can be expressed in 6 characters or fewer. Otherwise, scientific notation is used. For double precision numbers (8 bytes), decimal format is used if the number can be expressed in 12 characters or fewer. Otherwise, scientific notation is used.
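The width rule can be illustrated with a small awk helper. This mimics the test dbq applies; it is not dbq's own code, and the exact digits printed may differ from dbq's output:

```shell
# Print x in plain decimal when the decimal form fits in w characters,
# otherwise fall back to scientific notation -- the same idea dbq applies
# with w=6 for 4-byte floats and w=12 for 8-byte doubles.
fmt() {
    awk -v x="$1" -v w="$2" 'BEGIN {
        s = sprintf("%f", x)
        sub(/0+$/, "", s); sub(/\.$/, "", s)   # trim trailing zeros
        if (length(s) <= w) print s
        else                printf "%e\n", x
    }'
}
fmt 123.45 6       # fits in 6 characters: decimal
fmt 1234567.8 6    # too wide: scientific notation
```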
dbq is primarily meant to be used in scripts executed by a command line interpreter. But before looking at that type of application, it's a good idea to become familiar with dbq itself and ways of applying it. So in this section we'll look at dbq examples that can be executed directly from a terminal command line. The examples were run from the Unix C Shell, but similar results can be obtained in other environments.
The query results include the field names, the line of dashes separating the field names from the data, the data itself, and a status line telling what was accomplished. In the example, the status line tells us that 4 rows were affected, or retrieved.
Notice that the entire command must be typed on a single line. If the line is long, just let it wrap to the next line, but don't press the Return key until the entire command has been typed.
To see what can happen when the command is not properly encapsulated by quotes, let's add a WHERE clause to our command containing the value 'GLL':
In our last set of examples, we included the password on the command line. This is not a good idea. We strongly recommend that you use secure dbq instead.
Since secure dbq uses Kerberos to authenticate the user, we must run the Kerberos utility kinit first to get a Kerberos ticket granting ticket that can be used by dbq for authentication to the password server. Then, we specify a dbq command without the user name and password on the command line:
When dbq commands are embedded in a script, we often just want the data returned, and not the header, status information or blank lines. dbq does this if we include the -h no option as an argument. Repeating one of our earlier examples with the header option set to 'no', we get the following:
If we have several long queries that we use on a regular basis, we could put each one in a file and then reference each by a file name. We could also create an alias to include the -c option. First, let's create two files, each with a query in it. Commands in a file are executed when the word "go" appears, so we add that after the command in each file:
Since dbq writes its output using the file descriptor STDOUT, we can save the output to a file by redirecting STDOUT. To redirect output from the Unix C Shell, use something like this: >myFile; or, to append to an existing file, use: >>myFile. Let's save the results from the last example to the file named queryResults:
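For instance, with echo standing in for the dbq commands, overwriting and appending look like this:

```shell
# '>' creates or overwrites the file; '>>' appends to it. Each echo is a
# stand-in for a dbq command such as: dbq -s MIPSDB1 -c myCommands.sql
echo "first result set"  > queryResults
echo "second result set" >> queryResults
cat queryResults
```

After these two commands, queryResults holds both result sets, in order.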
Putting the -o option in our last example would look like this and achieve the same result:
For example, if we can't connect to a server for some reason-it may not be running, or we may use incorrect connection information-we get a message. In the following example, we've specified the database server name incorrectly:
If we specify a database command incorrectly, dbq will also send an error message. In the next example, the column named missions should have been mission, without the final 's'.
The basic dbq command takes the same set of arguments each time. In Unix environments we can shorten the amount of typing necessary and make the command easier to remember by creating an alias. For example:
As we've mentioned before, dbq is meant to be used in command line interpreter scripts. Most interactive work is done in dbView or dbWindow. But there can be commands that you want to get at rapidly without starting a database session. Here's one example that could be useful; it shows the amount of database activity by displaying the list of users:
In this section, we'll show how dbq commands can be embedded in script files. For our examples, we'll use Perl, an interpretive language that combines the functionality of Unix shells, awk, sed, grep and, to some extent, C. Perl is available for most operating system environments, so the scripts are portable; it's free; and the language is quite powerful and well suited to the kind of work we want to do.
Whether or not you choose to use Perl, the examples in this section will show you how dbq can be embedded in a command interpreter script. After seeing the possibilities, you can go off and create your own scripts for the interpreter of your choice.
Since these examples tend to be lengthy, you can find copies of them in the same directory where you found this user's guide. Feel free to modify the scripts. The scripts are unsupported, so you are on your own if you use them. However, you can send bug reports and suggestions to the email address listed in the on-line examples. If changes are made as a result of your message, we'll send you an updated copy of the script if you request it.
Before you can run any of these examples, you'll have to make them executable with the command:
The basic Perl script for dbq has the following structure:
The parameters supplied with the dbq command are:
We'll also get the database parameter for the dbq statement at run time, passing it on the Perl script's command line. We could use this technique to alter any of the dbq arguments at run time, generalizing our script.
The "-h no" argument is included so dbq will not print header and status information for the query. Normally in a Perl script, we're only interested in the rows returned by a query; if header and status information were included, we'd have to parse the output to extract just the rows.
The -f export argument is supplied because we don't want values returned to be padded with blanks, as they are when using table format. If dbq put in extra spaces, we'd just have to take them out again. There is a second reason for using export mode. In this mode, fields are delimited by the tab character (\t). Since each row is returned as a string, we have to split the fields out of the row using some field delimiter. We don't want to use a space character, because fields in the row can contain spaces. For example, a date, like the ert field in the query, contains spaces in it: "May 7 1995 1:30:00PM". The tab delimiter allows us to easily separate the entire field value from the row.
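The effect of the tab delimiter can be seen outside Perl, too. In this sketch, cut (whose default delimiter is the tab) pulls the fields out of one sample export-format row; the row itself is made-up data, not real dbq output:

```shell
# One export-format row; fields are separated by tabs, so a date with
# embedded spaces stays in one piece when the row is split.
row="$(printf 'file125.edr\tC\tMay 27 1995  1:45:47PM')"
fileName=$(printf '%s\n' "$row" | cut -f1)
filter=$(printf '%s\n' "$row" | cut -f2)
ert=$(printf '%s\n' "$row" | cut -f3)
echo "file name: $fileName, filter: $filter, ERT: $ert"
```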
Now let's look at the code segment that describes the dbq command:
The line following the assignment of $dbq:
The Perl script we're creating is named basicDbq.p. When we run the script, we place the database name on the command line as the argument for the script:
Later, once we have the password, we'll substitute the value of $targetDb into the dbq command string.
The following lines of code call a subroutine to get the password. Once we have it, we substitute it and the target database name into the dbq statement.
Here is the code for the password subroutine. The chomp function removes the newline character supplied with the user input string.
You should include this subroutine, or one like it, in any script that requires a password for dbq:
Executing the command and getting the rows returned is easy. Here are the three lines of code that do it:
The second line of code:
To find the number of rows returned, we look at the number of elements in the array @rows. If there are no elements, no rows were returned, so we exit with a value of 0, meaning the program executed properly, but there's nothing left to do. The number of elements in the array is stored in the Perl variable $#rows. Since the first element of an array is 0, we add one to the value to get the row count. This is done in the third line of code:
The loop is surrounded by two newlines so that the output is isolated from other information appearing on the screen.
Now let's run the script and see what happens.
Here's the entire script. Just change the database command and the output lines in the foreach loop and you've got your own script.
The script appears much longer than the actual code warrants because it is so heavily commented. In your version, you may want to eliminate some of the comments.
In this section we'll alter basicDbq.p so that the parameters returned for each row are supplied to a program which is then run from the Perl script. We'll look at three cases:
For our examples, we'll use a test program named testProc that just prints out its name and its argument list.
Now when we run our Perl script, the program testProc prints its arguments to the screen:
Here's what the output looks like:
If the program we want to pass database information to reads a list of file names from a file (or any set of information, for that matter), we can easily do that, too.
We'll use the array of file names as we did in the last example. Again, we'll replace the foreach loop in basicDbq.p. The replacement code is shown below:
Using the script basicDbq.p and these simple modifications will allow you to build scripts to handle many common processing tasks that involve getting meta-data from a database and then using it to perform processing steps.
In our previous examples, we returned the row set from the database to an array. When we don't have a large number of rows to return, that's a good method; but if we're concerned about the amount of memory taken up by the array, we can use an alternate method that returns one row at a time to our program, much as though we were reading lines from a file.
To do this, we'll pipe the output from dbq into our Perl script, which we call readRows.p. To start with, we'll open the pipe from dbq and associate it with a file descriptor named DBQ in our Perl script:
We're using the same definition of $dbq that we used in basicDbq.p. The vertical bar associated with the variable: "$dbq |" indicates that we want the output of dbq piped directly to the Perl script's DBQ file descriptor. This form of the Perl open function also implicitly runs the dbq command for us.
The rest of the line:
The open function replaces the following line in the script basicDbq.p:
The while loop replaces the following foreach loop in the script basicDbq.p:
After making these few simple changes to basicDbq.p to create readRows.p, we can run the script. The output is the same, but we didn't use an array to capture the rows returned by the query this time.
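A shell sketch of the same row-at-a-time pattern looks like this; the printf stands in for the dbq pipeline (something like dbq -s MIPSDB1 -h no -f export '<query>' | ...):

```shell
# Read tab-delimited rows from a pipe one at a time instead of collecting
# them all in an array first. Output goes to a file so we can inspect it.
printf '1\tCassini\t72\n2\tGLL\t35\n' |
while IFS="$(printf '\t')" read -r id mission scId; do
    echo "id=$id, mission=$mission, scId=$scId"
done > rowOutput
cat rowOutput
```

Because each row is consumed as it arrives, memory use stays constant no matter how many rows the query returns.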
Here's the entire Perl script. The code changes from basicDbq.p are marked in bold type. We've removed the tutorial comments in this script so you can better see the script's structure.
In this example, we create a batch program that meets the following requirements:
We'll look at a solution to each of these requirements in turn. At the end, we integrate the solutions into a program named batch.p which is listed at the end of this example.
Since our batch process must run continually, until signaled, we create a main loop that continues to execute until the variable notDone is set to 0. Here is some skeletal code to meet the requirement:
The subroutine sigHandler is called when the program receives the signal SIGINT or SIGTERM. It sets the value of $notDone to 0, so the program exits the main loop the next time through.
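In shell form, the skeleton looks like this (a sketch: the loop body clears the flag itself so the example terminates, where the real batch.p keeps looping until it is signaled):

```shell
# Main loop that runs until SIGINT or SIGTERM flips notDone to 0,
# mirroring batch.p's sigHandler subroutine.
notDone=1
trap 'notDone=0' INT TERM
passes=0
while [ "$notDone" -eq 1 ]; do
    passes=$((passes + 1))
    # ... sleep until batch time, run the query, post-process ...
    notDone=0    # stand-in exit so this sketch finishes on its own
done
echo "exited main loop after $passes pass(es)"
```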
Putting this code in batch.p and invoking the program as a detached process, satisfies the first requirement. Using the Unix C Shell, we would invoke the program like this:
To execute a query at a specified time, we include the time as one of batch.p's arguments. The time is specified to the minute and uses a 24 hour clock, so a time value of 2:00 means two o'clock in the morning and 14:00 means two o'clock in the afternoon.
We want to sleep at the beginning of the main loop. This code fragment will do it:
The unshift call puts the directory specification at the front of the array @INC so that directory is always searched first. The require statement reads and executes the code in the file sleepTill.pl ('pl' for Perl library).
The third bold line, the one in the while loop, causes the program to sleep until the time specified by the variable $batchTime. The string &sleepTill'tomorrow ($ARGV[0]) has four components:
We don't have to know what tomorrow actually does; we just have to know how to call it. The code for the sleepTill package is in the appendix to this guide.
The bold line in the code calls the subroutine tomorrow with the variable $batchTime. $batchTime holds the time value we passed to the program.
Now that the program sleeps as intended, we want to get data from the database to complete our second requirement. We do that with the following code segment:
To get the data, we referenced three subroutines and two variables from the package named dbqServer. We use a require statement to load the package just as we did for sleepTill. Again, we don't need to know what dbqServer does, just how to use it. (The code for the package dbqServer is in the guide's appendix.) Here's a discussion of how each dbqServer subroutine and variable is used by our program:
These steps take care of the second requirement.
Once we have a row of data back from the database and the values split out of the row, we want to use the data for a post-processing step. Normally we'd do some real work here, but for our example, we just pass the database values to a process called testProc, which writes the values along with a time stamp to the file named testProc.out. The code that does this is shown by the single line in bold below:
The next three lines test the return code of testProc. If it's not 0, a message is mailed using the package mail.
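The logic can be sketched in shell form, with 'false' playing the part of a failing testProc and an echo standing in for the mail package's message routine:

```shell
# Test the post-processing step's return code; on failure, report it
# (batch.p mails the message instead of echoing it).
false    # stand-in for: testProc <file> <arguments>
rc=$?
if [ "$rc" -ne 0 ]; then
    echo "would mail: post-processing failed with status $rc"
fi
```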
We mail messages to franklin@hardKnocks.edu with the subroutine message in the package mail. Each time a row is retrieved, a mail message is sent with the following lines of code:
That completes our discussion of batch.p. The diagram below shows the processes involved. We changed testProc in the diagram to show that it is intended to read a file named as one of its command line arguments and to write a new file after a processing step.
An example batch.p command looks like this:
The script for batch.p can be found in the appendix to this document.
# Usage:
# sleepTill
#
# Description:
# Each time the subroutine tomorrow is called, the program will sleep
# until the time defined by the variable $batchTime. $batchTime is
# passed as the first parameter to the subroutine tomorrow.
# The syntax for the $batchTime value is:
#
#
Representation Of Floating Point Numbers
Command Line Examples
Simple Commands
> dbq -u franklin -p 'not*a*pswd' -s MIPSDB1 -d catalog Select id, mission, scId from
missions order by mission
id mission scId
----------- --------------- -----------
1 Cassini 72
2 GLL 35
3 MO 91
4 VGR 0
(4 row(s) affected)
Notice that the password parameter: 'not*a*pswd' is quoted. That prevents the command line interpreter from trying to expand the string into a set of file names. The parameter in quotes is left uninterpreted.
> dbq -u franklin -p 'not*a*pswd' -s MIPSDB1 -d catalog Select id, mission, scId
from missions where mission = 'GLL'
MDMS DBS WARNING judith::dbq Wed May 24 15:22:02 1995
(Db: jar, MsgNo: 207, Svr: 16, St: 2)
Invalid column name 'GLL'.
The interpreter did not send dbq what we wanted it to. Now let's execute the command again, but this time we'll place single quotes around the entire command and double quotes around the value in the WHERE clause:
> dbq -u franklin -p 'not*a*pswd' -s MIPSDB1 -d catalog 'Select id, mission, scId
from missions where mission = "GLL"'
id mission scId
----------- --------------- -----------
2 GLL 35
(1 row(s) affected)
> dbq -u franklin -p 'not*a*pswd' -s MIPSDB1 -d catalog 'showMissions GLL'
MISSION INFORMATION
mission scId objective
--------------- ----------- ---------------
GLL 35 Jupiter
(1 row(s) affected, return status = 0)
The status information now displays the status returned by the stored procedure as well as the number of rows affected.
> dbq -u franklin -p 'not*a*pswd' -s MIPSDB1 'showMissions GLL'
MISSION INFORMATION
mission scId objective
--------------- ----------- ---------------
GLL 35 Jupiter
Secure dbq
> kinit jake
SunOS (milano)
Kerberos Initialization for "jake"
Password:
> dbq -s MIPSDB1 "Select id, mission, scId from missions order by mission"
Not only is this command secure (the password no longer appears on the command line), but it's also shorter to type in, and we don't have to worry about quoting our password if special characters appear in it. (Note: In the rest of the User's Guide examples, we'll assume that secure dbq is used.)
Suppressing Headers
> dbq -s MIPSDB1 -h no 'Select id, mission, scId from missions order by mission'
1 Cassini 72
2 GLL 35
3 MO 91
4 VGR 0
Using Command Files
If we want to execute several commands at a time, or if we have "canned" commands that we can easily reference, we can place them in a file and have dbq read the file. Commands in a file are terminated with the word 'go' on a line by itself. The file myCommands.sql contains two SQL queries:
myCommands.sql
select id, mission, objective
from missions
where mission in ("GLL", "Cassini")
go
select number, name, lgtYrsFromSun
from planets
order by number
go
We execute the commands in the file by including the name of the file in the dbq command:
dbq -s MIPSDB1 -c myCommands.sql
1> select id, mission, objective
2> from missions
3> where mission in ("GLL", "Cassini")
4> go
id mission objective
----------- --------------- ---------------
1 Cassini Saturn
2 GLL Jupiter
(2 row(s) affected)
1> select number, name, lgtYrsFromSun
2> from planets
3> order by number
4> go
number name lgtYrsFromSun
------ --------------- --------------
1 Mercury 0.00
2 Venus 0.00
3 Earth 0.00
4 Mars 0.00
5 Jupiter 0.00
6 Saturn 0.00
7 Uranus 0.00
8 Neptune 0.00
9 Pluto 0.00
(9 row(s) affected)
Since the SQL commands are in a file, they are not passed to the command line interpreter, so there's no need to place them in quotes to prevent special character interpretation.
missions.sql
Select id, mission, scId from missions order by mission
go
gll.sql
Select id, mission, scId from missions where mission = "GLL"
go
Now a new alias:
> alias query "dbq -s MIPSDB1 -c"
And we can run commands like this:
> query missions.sql <- our input
1> Select id, mission, scId from missions order by mission
2> go
id mission scId
----------- --------------- -----------
1 Cassini 72
2 GLL 35
3 MO 91
4 VGR 0
(4 row(s) affected)
> query gll.sql <- our input
1> Select id, mission, scId from missions where mission = "GLL"
2> go
id mission scId
----------- --------------- -----------
2 GLL 35
(1 row(s) affected)
Saving Output To A File
> dbq -s MIPSDB1 -c myCommands.sql >queryResults
To look at the file's contents, we'll use the Unix cat command:
> cat queryResults
id mission objective
----------- --------------- ---------------
1 Cassini Saturn
2 GLL Jupiter
(2 row(s) affected)
number name lgtYrsFromSun
------ --------------- --------------
1 Mercury 0.00
2 Venus 0.00
3 Earth 0.00
4 Mars 0.00
5 Jupiter 0.00
6 Saturn 0.00
7 Uranus 0.00
8 Neptune 0.00
9 Pluto 0.00
(9 row(s) affected)
If we're using an interpreter that can't redirect dbq's output, we can use the -o option to obtain the same effect. Note: files created with the -o option always overwrite a file with the same name if the file already exists. You can't append data to a file with this option.
> dbq -s MIPSDB1 -c myCommands.sql -o queryResults
Error Messages
If we issue a database command that can't be executed for some reason, dbq returns an error message. All such messages are written using the STDERR file descriptor, so we can redirect them to a file. Since all data is returned to STDOUT, any data would be displayed, unless STDOUT was redirected to a file too.
dbq -s MIPDDB1 -d catalog 'select id, mission from missions'
MDMS DBLIB MSGFAILED judith::General Delivery Thu May 25 08:24:01 1995
MsgNo: 20012, Svr: 2
Server name not found in interface file.
The first line of the message tells us where the message originated and at what time. The second line gives us the message number and its severity. The third line is the one we want to pay attention to initially: it tells us what went wrong. In this case the server name was not in the Sybase interfaces file, so dbq could not find the information it needed to connect to a server.
dbq -s MIPSDB1 'select id, missions, description from missions'
MDMS DBS WARNING judith::dbq Thu May 25 08:15:23 1995
(Db: jar, MsgNo: 207, Svr: 16, St: 2)
Invalid column name 'missions'.
Using Aliases With dbq
> alias query "dbq -s MDM1 -d catalog"
Now we can specify a database command using just the alias "query" followed by the command:
> query 'Select id, mission, scId from missions order by mission'
id mission scId
----------- --------------- -----------
1 Cassini 72
2 GLL 35
3 MO 91
4 VGR 0
(4 row(s) affected)
Or, specifying a command as a quoted string:
query 'Select id, mission, scId from missions where mission = "GLL"'
id mission scId
----------- --------------- -----------
2 GLL 35
(1 row(s) affected)
Note: As much as you might like to, don't save your alias in a file if the alias contains your password. That is a security violation. Take a moment in a new session to recreate the alias. If you're using secure dbq, then you can save the alias.
> alias dbwho "dbq -s MDM1 sp_who"
Whenever we execute dbwho, we see something like this:
> dbwho
spid status loginame hostname blk dbname cmd
------ ------------ ------------ ---------- ----- ---------- ----------------
1 recv sleep devServer venatia 0 fei2_0 AWAITING COMMAND
2 sleeping NULL NULL 0 master NETWORK HANDLER
3 sleeping NULL NULL 0 master MIRROR HANDLER
4 sleeping NULL NULL 0 master CHECKPOINT SLEEP
5 running franklin milano 0 franklin SELECT
(5 row(s) affected, return status = 0)
Embedding dbq In Perl Scripts
Making A Perl Script Executable
> chmod u+x
or for groups of users
> chmod g+rx
To make the examples available to everyone, you should place the scripts in a directory available to all users. For Unix environments, make sure the directory is included in the standard PATH definition of each user so users can run them.
Basic Perl Constructs Used With dbq
For our first Perl example, we show how to retrieve rows from a database and manipulate them within a Perl script. We'll focus on basic steps in this example and then go on to show how the example can be modified to start a process, passing it parameters derived from database data.
Define The dbq Command
For our example, we'll execute the SQL command:
Select fileName, filter, ert
from gllSsiEdr
where receivedAt > "5/27/95 12:00"
The query states that we want the file name, filter value and ert date (Earth Receive Time) for every Galileo SSI ERT file received later than May 27, 1995 at noon.
dbq -u franklin -p '%s' -s MIPSDB1 -d %s -h no -f export
The password value is defined as '%s' for two reasons. First, the single quotes tell the command line interpreter not to evaluate the string contained by the quotes when the command is executed. This allows us to use special characters in passwords, as in the string 'not*a*pswd'. Second, we don't want to hard-code the password in the script, where it could easily be discovered by someone else. Instead, we'll prompt for the password from a function that won't echo the password as we type it in. The value we supply will be substituted for the symbol '%s' in the dbq command string.
$dbq = "dbq -u franklin -p '%s' -s MIPSDB1 -d %s -h no -f export
'Select fileName, filter, ert
from gllSsiEdr
where receivedAt > \"5/27/95 12:00\"'";
$dbq =~ s/\n/ /g;
The dbq command arguments appear on the first line. The query follows, broken into several lines so that we can easily read it. Let's pay careful attention to how quotes are used in this command:
$dbq =~ s/\n/ /g;
does a global substitution in the string assigned to $dbq, replacing the newline character (\n) with a space. Why is this necessary? The dbq command can't contain newlines; a newline character signals the end of the command, so we take them out. But then, why did we put them in the first place? We did it to make the database command easier to read and to make replacing the command easier if we were to use this code in another script.
Getting The Target Database Name
> basicDbq.p catalog <- the target database name
When the script is run, we check to see if the argument was supplied. If not, we exit with a message explaining what the command requires. If the argument is supplied, we assign it to the variable $targetDb. (This assignment isn't strictly necessary. We do it to make the script more readable.)
unless (defined $ARGV[0]) {
print "You forgot to include the target database name.\n";
print "Usage: $0 targetDatabase\n";
exit 1;
}
$targetDb = $ARGV[0];
($0 is the Perl variable that contains the name of the script. We use it rather than the hard-coded name so the name of the script can be changed and the message remains accurate.)
Getting The Password
&pswd;
$dbq = sprintf ($dbq, $password, $targetDb);
After the substitutions are made, the dbq command string looks like this:
$dbq = "dbq -u franklin -p 'not*a*pswd' -s MIPSDB1 -d catalog -h no -f export
'Select fileName, filter, ert
from gllSsiEdr
where receivedAt > \"5/27/95 12:00\"'";
The subroutine that gets the password does two special things. First, it turns off terminal echo of keyboard events while the password is typed in, so the password can't be seen. Second, it ignores the signal SIGINT while echo is turned off. SIGINT is generated when we type <Ctrl-C>; ignoring it keeps the script from being interrupted while the terminal is left with echo disabled.
sub pswd {
    $SIG{'INT'} = 'IGNORE';
    system 'stty', '-echo';
    print "password: ";
    chomp ($password = <STDIN>);
    system 'stty', 'echo';
    print "\n";
    $SIG{'INT'} = 'DEFAULT';
}
Executing The Command
@rows = `$dbq`;
if ($? >> 8) { exit 1; }
unless ($#rows +1) {exit 0;}
The dbq command variable is "backquoted". This tells Perl to execute the command and return the output of the command as a string. This string is assigned to the array @rows. Since dbq's output contains newline characters at the end of each row, each row becomes an element in the array. So now we have the data back in an array of rows. Before processing the rows, we have to look at two things: the return value of dbq and the number of rows returned. If the command failed or there were no rows returned, there's nothing left to do, so we exit.
if ($? >> 8) { exit 1; }
tests the exit value of the command. The value is stored in the top byte of the two-byte Perl status variable, $?. To test it, we shift the bits right by 8. If the result is not 0, we exit with a value of 1, indicating an error.
unless ($#rows +1) {exit 0;}
Processing The Rows Returned
If rows were returned, we display the fields for each row with the following code:
print "\n";
foreach $row (@rows) {
($fileName, $filter, $ert) = split ('\t', $row);
print "file name: $fileName, filter: $filter, ERT: $ert\n";
}
print "\n";
The foreach loop assigns an element of the array @rows to the scalar variable $row each time through the loop. The split function splits the string into fields; the character used to denote a split is the tab (\t), because that is what separates fields in dbq's export format. We then print the field values for each row.
Executing A Perl Script
Before we can run a Perl script, we must change its permissions to make it executable:
> chmod u+x basicDbq.p
And to do it for every member in our group, we use this:
> chmod g+rx basicDbq.p
If we want a group of people to be able to execute the command, we put the Perl scripts in a directory that is referenced by the PATH variable for our group.
> basicDbq.p catalog
password:
file name: file125.edr, filter: C, ERT: May 27 1995 1:45:47:166PM
file name: file126.edr, filter: A, ERT: May 27 1995 2:10:32:203PM
file name: file127.edr, filter: B, ERT: May 27 1995 2:40:46:220PM
file name: file128.edr, filter: C, ERT: May 27 1995 3:08:12:236PM
We typed in the command, supplying the target database name on the command line. Once the script runs, we're prompted for the password, which we enter. The dbq command is executed and the data returned from the database server is displayed.
Perl Script basicDbq.p
#! /usr/local/bin/perl
#
# Usage:
# basicDbq.p targetDatabase
#
# Description:
# An example showing the basic Perl script constructs used when
# executing a dbq command and processing the rows returned by a
# query.
#
# Written by:
# John Rector
# Jet Propulsion Laboratory
# jar@next1.jpl.nasa.gov
#
# History:
# June, 1995 1.0 - initial release
#
# Version:
# 1.0
# Check to make sure that the target database name was supplied on the
# command line. If not exit with a message.
unless (defined $ARGV[0]) {
print "You forgot to include the target database name.\n";
print "Usage: $0 targetDatabase\n";
exit 1;
}
$targetDb = $ARGV[0];
# Define the basic parameters for the dbq command used in a Perl script.
# Follow that with the database command. We use multiple lines here to
# make the command more readable. Do a global substitution to remove
# the newlines we used to make the command readable. The Perl interpreter
# expects the dbq command to be on a single line.
$dbq = "dbq -u franklin -p '%s' -s MIPSDB1 -d %s -h no -f export
'Select fileName, filter, ert
from gllSsiEdr
where receivedAt > \"5/27/95 12:00\"'";
$dbq =~ s/\n/ /g;
# Subroutine to get the password for the command. Don't put passwords in
# a script file; it's a security violation. Use this method or use secure
# dbq which gets the password from a password server. Once we have the
# password, insert it into the dbq command, replacing %s. Note that the
# password string is placed in single quotes so special characters in
# the string won't be interpreted.
&pswd;
$dbq = sprintf ($dbq, $password, $targetDb);
# Execute the command using the Perl backquote invocation method. This
# will cause Perl to treat the value printed by the command (a set of
# database rows) as a string. The rows returned are assigned to the
# array @rows. The newline character at the end of each row returned by
# dbq causes Perl to enter the row as an element in the array.
#
# Test to see if the command was executed correctly. The return value is
# stored in the high byte of the Perl variable '$?'. Shift the value by 8
# bits to look at the dbq exit value. If everything went OK, dbq's exit
# value will be 0; otherwise it's 1.
#
# Then test that rows were returned. '$#rows' is the number of the last
# element in the array, so the number of rows is one more than that.
# (The first element in the array is number 0.)
@rows = `$dbq`;
#if ($? >> 8) { exit 1;};
unless ($#rows +1) {exit 0;}
# Go through the array, one element, or row, at a time. Split each row
# into fields and print the fields. Since we're using dbq's export format,
# we want to split on tabs (\t). Spaces in fields are preserved using
# this technique.
# Newlines are placed around the printout so that the rows are isolated
# on the screen.
print "\n";
foreach $row (@rows) {
($fileName, $filter, $ert) = split ('\t', $row);
print "file name: $fileName, filter: $filter, ERT: $ert\n";
}
print "\n";
#################
# Subroutine pswd
#################
sub pswd {
# Prompt for a password. Turn off terminal echo so we don't see the password.
# Ignore SIGINT during this time so an interrupt can't leave the terminal
# with echo turned off.
Supplying Database Values At Run Time
Supplying Scalar Values On The Command Line
The foreach loop in basicDbq.p is altered to look like this:
foreach $row (@rows) {
($fileName, $filter, $ert) = split ('\t', $row);
system ("testProc", "$fileName", "$filter", "$ert");
}
We substitute the system call for the print statement in basicDbq.p. The system function runs the program testProc, passing it the values of $fileName, $filter and $ert on each pass through the loop.
./testProc
Argument 0: file125.edr
Argument 1: C
Argument 2: May 27 1995 1:45:47:166PM
./testProc
Argument 0: file126.edr
Argument 1: A
Argument 2: May 27 1995 2:10:32:203PM
./testProc
Argument 0: file127.edr
Argument 1: B
Argument 2: May 27 1995 2:40:46:220PM
./testProc
Argument 0: file128.edr
Argument 1: C
Argument 2: May 27 1995 3:08:12:236PM
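testProc itself is not listed in this document; a hypothetical stand-in that produces this numbered-argument output might be nothing more than:

```perl
#! /usr/local/bin/perl
# Hypothetical stand-in for testProc: print each command line
# argument with its position.
for ($i = 0; $i <= $#ARGV; $i++) {
    print "Argument $i: $ARGV[$i]\n";
}
```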
Using An Array Of Arguments
It's not uncommon for a program to take a list of file names as arguments. If we're using such a program, we can modify basicDbq.p to handle this case efficiently. First, we modify the SQL statement to return only the file name:
$dbq = "dbq -u jar -p '%s' -s MDM1 -d %s -h no -f export
'Select fileName
from gllSsiEdr
where receivedAt > \"5/27/95 12:00\"'";
And then we replace the foreach loop with the following single line of code:
system ("testProc", @rows);
Now when we run our Perl script each of the file names returned from the database is contained in the array @rows, which we pass as a list to the program testProc. Since we only need to call testProc once with the list of file names, our design is more efficient.
./testProc
Argument 0: file125.edr
Argument 1: file126.edr
Argument 2: file127.edr
Argument 3: file128.edr
Writing Arguments To A File
$" = "\n";
open (OUT, ">inputFile");
print OUT @rows;
close OUT;
system ("testProc", "inputFile");
The first line changes the array element separator from its default value of a space to the newline character. In the next three lines, we open the file inputFile, write the array elements to it, and close the file. The file now contains the file names returned by dbq, each separated by a newline character. Finally, we call testProc and pass it the name of the file containing the file names, which it reads, one record at a time:
./testProc
File name: file125.edr
File name: file126.edr
File name: file127.edr
File name: file128.edr
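Again, the testProc used here is not listed in the document; a hypothetical file-reading variant consistent with this output could look like:

```perl
#! /usr/local/bin/perl
# Hypothetical file-reading variant of testProc: the single argument
# names a file holding one file name per line.
open (IN, $ARGV[0]) || die "Can't open $ARGV[0]: $!\n";
while ($name = <IN>) {
    chop $name;                     # strip the trailing newline
    print "File name: $name\n";
}
close IN;
```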
Summary
A Script That Returns A Large Number Of Rows
open (DBQ, "$dbq |") || die "Can't open $dbq: $!\n";
|| die "Can't open $dbq: $!\n";
This clause tells the script to exit if the command cannot be executed and to print out the quoted string following the word die. $! is a Perl variable that holds the message associated with the system errno value; this too is printed.
@rows = `$dbq`;
To read the rows one at a time, we make an assignment to $row using the newly defined file descriptor, DBQ. Each time through the while loop another row is read from dbq. When there are no more rows to be read, the loop exits.
$count = 0;
while ($row = <DBQ>) {
We initialize a count variable to 0 before entering the loop and then increment it each time a row is read. Once we exit the loop, we execute the following command:
print "No rows returned\n" unless $count;
This prints out "No rows returned" if the count is still 0.
foreach $row (@rows) {
($id, $mission) = split ('\t', $row);
print "id: $id, mission: $mission\n";
}
Executing readRows.p
> readRows.p catalog
password:
file name: file125.edr, filter: C, ERT: May 27 1995 1:45:47:166PM
file name: file126.edr, filter: A, ERT: May 27 1995 2:10:32:203PM
file name: file127.edr, filter: B, ERT: May 27 1995 2:40:46:220PM
file name: file128.edr, filter: C, ERT: May 27 1995 3:08:12:236PM
Perl Script readRows.p
#! /usr/local/bin/perl
#
# Usage:
# readRows.p targetDatabase
#
# Description:
# A basic Perl script for returning a large number of rows from a dbq query.
# Rows are piped from dbq's STDOUT file descriptor to Perl.
#
# Written by:
# John Rector
# Jet Propulsion Laboratory
# jar@next1.jpl.nasa.gov
#
# History:
# June, 1995 1.0 - initial release
#
# Version:
# 1.0
# Check to make sure that the target database name was supplied on the
# command line. If not exit with a message.
unless (defined $ARGV[0]) {
print "You forgot to include the target database name.\n";
print "Usage: $0 targetDatabase\n";
exit 1;
}
$targetDb = $ARGV[0];
# Dbq command with query. We remove the newlines after the query is defined.
$dbq = "dbq -u franklin -p '%s' -s MIPSDB1 -d %s -h no -f export
'Select fileName, filter, ert
from gllSsiEdr
where receivedAt > \"5/27/95 12:00\"'";
$dbq =~ s/\n/ /g;
# Get the password and substitute it into the dbq command.
&pswd;
$dbq = sprintf ($dbq, $password, $targetDb);
# Execute the dbq command and pipe results back to the file descriptor DBQ.
# Check for errors.
open (DBQ, "$dbq |") || die "Can't open $dbq: $!\n";
#if ($? >> 8) { exit 1;};
# Read each row and print out the field values. Keep a count of the
# number of rows.
print "\n";
$count = 0;
while ($row = <DBQ>) {
$count++;
($fileName, $filter, $ert) = split ('\t', $row);
print "file name: $fileName, filter: $filter, ERT: $ert\n";
}
print "No rows returned\n" unless $count;
print "\n";
dbq In A Batch Processing Script
Creating A Background Process
$SIG{'INT'} = 'sigHandler';
$SIG{'TERM'} = 'sigHandler';
$notDone = 1;
while ($notDone) {
get information from a database
Invoke a process using the values found in a row
of database data.
}
exit 0;
#######################
# subroutine sigHandler
#######################
# Captures SIGINT and SIGTERM. Sets the main loop variable, so the program
# exits on the next pass through the loop.
sub sigHandler {
$notDone = 0;
}
We begin by defining a subroutine to be called when either a SIGINT or SIGTERM signal is received. Then we go into a loop that executes the work needed to be done as long as the variable $notDone is some value other than 0. When we exit the loop, the program exits.
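A minimal demonstration of the %SIG mechanism: the script delivers a SIGTERM to its own process ($$ is the process id) and the handler flips the loop flag before execution continues.

```perl
# Install a handler, then deliver SIGTERM to this process.
sub sigHandler { $notDone = 0; }
$SIG{'TERM'} = 'sigHandler';
$notDone = 1;
kill ('TERM', $$);
print "notDone is now $notDone\n";    # the handler has set it to 0
```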
> batch.p <argument list> &
Executing A Query At A Specified Time
unshift (@INC, "/home/jar/Perl/Dbq");
require "sleepTill.pl" || die "Can't locate sleepTill.pl package\n";
...
$notDone = 1;
while ($notDone) {
&sleepTill'tomarrow ($ARGV[1]);
# Get information from a database.
# Invoke a process using the values found in a row of database data.
}
The first two lines bring in a Perl package that contains a subroutine causing the program to resume execution at a specified time each day. The first line tells Perl where to find the file that contains the package we want.
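The sleepTill package itself appears later only by title; as a hedged sketch, a subroutine like tomarrow typically computes the number of seconds until the requested wall-clock time and sleeps for that long. The details below (an "HH:MM" argument format, the arithmetic) are assumptions, not the package's actual code:

```perl
# Hypothetical sketch of a sleep-until subroutine. The argument is a
# time of day like "2:00"; sleep until that time next comes around.
sub tomarrow {
    ($hour, $min) = split (':', $_[0]);
    ($nowSec, $nowMin, $nowHour) = localtime (time);
    $wait = ($hour * 3600 + $min * 60)
          - ($nowHour * 3600 + $nowMin * 60 + $nowSec);
    $wait += 24 * 3600 if ($wait <= 0);   # already past today: wait for tomorrow
    sleep ($wait);
}
```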
$sql =
"Select fileName, filter, ert, receivedAt
from gllSsiEdr
where receivedAt > \"$filesLaterThan\"";
...
&dbqServer'connect ("MIPSDB1", $ARGV[0], $logFile);
$notDone = 1;
while ($notDone) {
&dbqServer'cmd ($sql);
if (defined @dbqServer'rows) {
foreach $row (@dbqServer'rows) {
($fileName, $filter, $ert, $filesLaterThan) = split ('\t', $row);
}
}
elsif (defined @dbqServer'error) {
print LOG @dbqServer'error;
}
}
$dbqServer'disconnect;
The code begins with the definition of the query to perform. The query's WHERE clause gets rows where the receivedAt value is later than a specified date and time; the value for the date and time is held in the variable $filesLaterThan. The first time through the loop, $filesLaterThan is set to a default value or to the value supplied on the command line as the third argument, $ARGV[2]. After that, the value is reset each time we get a row back from the database, because $filesLaterThan is one of the values split out from $row. The next time through the main loop, $filesLaterThan will hold the value returned by the last row of the query executed on the previous day.
Post-Processing Using Database Data
while ($notDone) {
...
&dbqServer'cmd ($cmd);
if (defined @dbqServer'rows) {
foreach $row (@dbqServer'rows) {
($fileName, $filter, $ert, $filesLaterThan) = split ('\t', $row);
system ("testProc", "$fileName", "$filter", "$ert");
if ($? >> 8) {
$message = "testProc failed with status: " . ($? >> 8);
&mail'message ($message);
}
}
}
}
...
The first parameter to the system function is the program name; the next three are the database values passed as command line arguments. We don't have to worry about error messages getting mixed in the shared log file, because testProc's writes are flushed when it exits and batch.p flushes its error messages immediately.
Mailing Results
foreach $row (@dbqServer'rows) {
($fileName, $filter, $ert, $filesLaterThan) = split ('\t', $row);
$message = "batch.p: Processing file: $fileName\n";
&mail'message ($message);
system ("testProc", "$fileName", "$filter", "$ert");
}
Summary
> batch.p catalog 2:00 5/27/1995 &
This runs the process in the background with the parameters:
Appendix: Perl Scripts
Perl Script: dbqServer
#! /usr/local/bin/perl
#
# Usage:
# dbqServer
#
# Description:
# The package contains the following subroutines:
#
# connect - Does a fork and exec on dbq with appropriate parameters
# for a secure connection. Two pipes are established. Commands are sent
# on CMD_OUT. Data is read on DATA_IN.
#
# cmd - Takes a database command and sends it to dbq using CMD_OUT.
# Results are received on DATA_IN. Error detection is included.
#
# disconnect - Closes the pipes and kills dbq.
#
# Each time cmd is executed, the rows returned are left in the array
# @dbqServer'rows. If there are no rows or an error occurred,
# @dbqServer'rows is undefined. Also, if an error occurred, $dbqServer'error
# has a positive value.
#
# Written by:
# John Rector
# Jet Propulsion Laboratory
# jar@next1.jpl.nasa.gov
#
# History:
# June, 1995 1.0 - initial release
#
# Version:
# 1.0
package dbqServer;
# Create a dbq process that can talk to its parent. Commands are sent
# from the parent to dbq with CMD_OUT. Data is received with DATA_IN.
sub connect {
if ($#_ < 2) {
print "Too few arguments passed to dbqServer'connect.\n";
return;
}
$dbSrv = $_[0];
$db = $_[1];
$logFile = $_[2];
# Open two pipes: one for piping commands from the parent process to dbq
# and a second for piping data from dbq to the parent.
pipe (CMD_IN, CMD_OUT);
pipe (DATA_IN, DATA_OUT);
# Fork and exec dbq. The dbq's pid is returned to the parent. The
# dbq process's pid is defined as 0.
$dbqPid = fork ();
if ($dbqPid != 0) { # parent
# Close the ends of the pipe not used and force non-buffered I/O
# on the others. Reselect STDOUT as the default file descriptor
# used by the print function.
close (CMD_IN);
select (CMD_OUT); $| = 1;
select (DATA_IN); $| = 1;
close (DATA_OUT);
select (STDIN); $| = 1;
select (STDOUT);
}
elsif (defined $dbqPid) { #child: dbq
# Close the ends of the pipe not used and force non-buffered I/O
# on the pipes used.
select (CMD_IN); $| = 1;
close (CMD_OUT);
close (DATA_IN);
select (DATA_OUT); $| = 1;
# Redirect STDIN, STDERR and STDOUT to the pipe descriptors
# so they'll be used when we exec dbq. Again, non-buffered IO.
close (STDIN);
open (STDIN, "<&CMD_IN");
select (STDIN); $| = 1;
close (STDERR);
open (STDERR, ">&DATA_OUT");
select (STDERR); $| = 1;
close (STDOUT);
open (STDOUT, ">&DATA_OUT");
select (STDOUT); $| = 1;
# Exec the dbq command with the following command line arguments.
$dbqCmd = "dbq -s $dbSrv -d $db -h no -c stdin >>$logFile";
exec "$dbqCmd" || die "Can't exec dbq: $!\n";
exit 0;
}
else { # weird fork error
die "Can't fork: $!\n";
}
}
# First argument is the command to send to dbq. Result rows and error
# messages are returned in the array @rows. The result set ends with an
# empty row, terminated with a newline character.
sub cmd {
local ($row);
$error = 0;
@rows = ();
# Remove any newlines from the command and send it as a single
# string. Follow that with the dbq end-of-command signal.
$_[0] =~ s/\n/ /g;
print CMD_OUT $_[0];
print CMD_OUT "\ngo\n";
# Get a row. Check for an error - string beginning with MDMS.
# If no error, push the row onto the stack until last-row indicator -
# an empty row. If error, print the message and undef @rows; there's
# no data to access. Test for error is on the value of $error; 1 if
# error.
while (1) {
$row = <DATA_IN>;
Perl Script: sleepTill
Perl Script: batch
#! /usr/local/bin/perl
#
# Usage:
# batch targetDatabase startAt [filesLaterThan]
#
# Description:
# Retrieves file names, filter values and ert times from gllSsiEdr
# for all files with a receivedAt time value greater than the value
# of filesLaterThan. To retrieve the database information the
# package dbqServer is used. For each row returned, the program testProc
# is invoked with the row's parameters as testProc command line arguments.
# Once processing is complete, batch sleeps until the startAt time arrives
# the following day. The sleep is done in the package sleepTill.
#
# The format for the argument startAt is:
#
#