I recently got a request from a colleague to help them schedule some reports for delivery, with the destination being a folder on a unix machine. I am aware that iBots can call a Java program to write to disk, and that BI Publisher's scheduling engine can write directly to disk, but I did not want to write a large amount of code or even use OBIEE's scheduler. Instead, I wanted to see whether I could make it a purely pull-based job from the unix machine's side. To this end, I wrote a shell script to accomplish it.
Note: This works for OBIEE 10g. In OBIEE 11g, additional steps are needed to get this to work, including setting the User-Agent to one supported by OBIEE 11g and issuing a separate Login command.
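For reference, here is a minimal sketch of what the 11g flow might look like, reusing the variables defined in the script below. The saw.dll Login action and the cookie handling are my assumptions; verify them against your own environment before relying on this:

# 1. Log in first and save the session cookie (assumed saw.dll Login action)
wget --save-cookies cookies.txt --keep-session-cookies --user-agent="$USERAGENTVAL" --output-document=/dev/null "http://localhost:9704/analytics/saw.dll?Login&NQUser=$BIUSER&NQPassword=$BIPASS"
# 2. Download the report reusing the saved session cookie (no NQUser/NQPassword needed now)
wget --load-cookies cookies.txt --user-agent="$USERAGENTVAL" --referer="$REFERVAL" --output-document="$TARGETFILE" "$URL1$URL2$URL5"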
#Custom Variables
export BIUSER=Administrator   # Username which can access the report
export BIPASS=scbdev          # Password for the user
export REPORTPATH="/shared/Cocoa Global Reports/Global Reports/Reports/Global Commodities Report By Location"   # Location of the report
export TARGETFILE=GlobalCommoditiesReportByLocation.xls   # Name of the target file
export BACKUPDIR="/obieee/SavedReports"   # Location where the exports will be saved
#Period Variables
export YEAR=$(date +"%Y")
export MONTH=$(date +"%m")
export DAY=$(date +"%d")
#Reporting Variables
export URL1="http://localhost:9704/analytics/saw.dll?Go&Path="   # Assumes the BI server is running locally; if not, replace localhost with the host/IP of the BI server
export URL2=$(echo $REPORTPATH | sed -e 's/ /%20/g')   # Replaces spaces in the report path with %20 so the path is URL-safe
export URL3="&NQUser="
export URL4="&NQPassword="
export URL5="&Format=mht&Extension=.xls&Action=Download"   # For PDF output, use URL5="&Format=pdf&Action=Download"; note that PDFs have default row-count limits
export REFERVAL="http://localhost:9704/analytics/saw.dll?Dashboard"
export USERAGENTVAL="Mozilla/5.0 (Windows NT 5.1; rv:14.0) Gecko/20100101 Firefox/14.0.1"
export FULLURL=$URL1$URL2$URL3$BIUSER$URL4$BIPASS$URL5
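With the sample values above, FULLURL expands to:

# http://localhost:9704/analytics/saw.dll?Go&Path=/shared/Cocoa%20Global%20Reports/Global%20Reports/Reports/Global%20Commodities%20Report%20By%20Location&NQUser=Administrator&NQPassword=scbdev&Format=mht&Extension=.xls&Action=Download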
# Create the dated backup directory structure and move into it
cd "$BACKUPDIR"
if [ ! -d YEAR_$YEAR ]; then
    mkdir YEAR_$YEAR
fi
cd YEAR_$YEAR
if [ ! -d MON_$MONTH ]; then
    mkdir MON_$MONTH
fi
cd MON_$MONTH
if [ ! -d D_$DAY ]; then
    mkdir D_$DAY
fi
cd D_$DAY
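As an aside, the same YEAR/MON/D hierarchy can be created and entered in one step with mkdir -p, which does nothing if the directories already exist:

cd "$BACKUPDIR"
mkdir -p YEAR_$YEAR/MON_$MONTH/D_$DAY
cd YEAR_$YEAR/MON_$MONTH/D_$DAY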
wget --output-document="$TARGETFILE" --referer="$REFERVAL" --user-agent="$USERAGENTVAL" "$FULLURL"   # This is the actual command which retrieves the file
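Finally, to make this a true scheduled pull job, run the script from cron on the unix machine. A sample crontab entry that pulls the report every day at 6 AM (the script and log paths here are hypothetical placeholders):

0 6 * * * /obieee/scripts/pull_report.sh >> /obieee/SavedReports/pull_report.log 2>&1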