Please note that VisualCron support is not actively monitoring this community forum. Please use our contact page to contact VisualCron support directly.


dpulley
2019-01-30T14:34:52Z
Hello,

I'm having an issue with the SQL task's standard output. It looks like there is some internal limit on the size of the output itself, and if the output exceeds that limit it won't show at all; the task just says no output. I have a fairly large table I need to set up a nightly data extract from. When I pull the whole table, the SQL task simply reports no output and my process fails, but when I put a limit on the SQL query itself, like TOP 200,000 rows, I can get the output, which indicates to me there is some sort of hard limit on the output of a task and I am simply exceeding it. My question is twofold: first, where can I find out exactly what the limit on a task's output is, in either bytes or characters? And second, is there a way for me to increase this limit?

For now I have a workaround, which is to loop through the data and extract it to the file in chunks, but ideally I'd like to be able to dump the whole table to a CSV in one step.
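For what it's worth, each chunk is just a paged query; a rough sketch of how the per-chunk statements can be generated is below (the table, key column, and row counts are placeholders, and OFFSET/FETCH assumes a database that supports it, e.g. SQL Server 2012+):

# Rough sketch only - table, key column and row counts are placeholders.
# OFFSET/FETCH paging assumes SQL Server 2012+ or another database that supports it.
$chunkSize = 200000            # rows per chunk, kept under the task output limit
$totalRows = 1000000           # e.g. taken from a prior SELECT COUNT(*) task
$table     = 'dbo.MyTable'
$orderBy   = 'Id'              # a stable key column so the paging is deterministic

for ($offset = 0; $offset -lt $totalRows; $offset += $chunkSize) {
    # each generated statement becomes the query for one loop iteration's SQL task
    "SELECT * FROM $table ORDER BY $orderBy OFFSET $offset ROWS FETCH NEXT $chunkSize ROWS ONLY;"
}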

I've looked through all of the server settings without success. On the Log Setting tab I tried raising the output character limit to its maximum possible value, but that had no effect; it seems to affect prior runs in the log rather than the actual output limit of the current run.

Thank you for your time!
thordw
2019-01-30T15:21:33Z
Hi

Since it's that much data, couldn't you export it directly via the query?
Say, with something like this:

SELECT * from myTable
INTO OUTFILE '/tmp/querydump.csv'
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n'


Just a thought.
dpulley
2019-01-30T16:02:04Z
Thank you for the suggestion. Normally I would do something similar if I had VisualCron installed on my database server, but the issue in this case is that VisualCron is running on our integration server and the database is running on a separate server. The integration server already has all of the connectivity I need to the particular blob storage account I have to drop the file onto, and I'm trying to avoid making those modifications to the database server itself just to drop the file there.

If I dropped the file onto the database server, I'd have to expose the folder it dropped into as a network share so VisualCron could reach it and move it to the mounted blob storage, and I'd like to avoid sharing out a folder directly on our database server if possible.

Thank you for your time!
thordw
2019-01-30T16:50:46Z
Hi

Any possibility then to create a share on the VC server that only the SQL server has access to, and have the SQL export write to that share?

Otherwise my solution would be to create a program that does the export, and let VC handle the execution and error handling.
That's what I would do when the data output is large.
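Just to illustrate the idea, here is a minimal sketch of such an export program in PowerShell, assuming SQL Server and the built-in System.Data.SqlClient classes (the connection string, query and output path are placeholders, and values are written unquoted, so treat it only as a starting point):

# Minimal sketch - connection string, query and output path are placeholders.
$connStr = "Server=dbserver;Database=MyDb;Integrated Security=True"
$query   = "SELECT * FROM dbo.MyTable"
$outFile = "C:\exports\mytable.csv"

$conn   = New-Object System.Data.SqlClient.SqlConnection($connStr)
$cmd    = New-Object System.Data.SqlClient.SqlCommand($query, $conn)
$cmd.CommandTimeout = 0            # large tables can exceed the default 30 second timeout
$conn.Open()
$reader = $cmd.ExecuteReader()
$writer = New-Object System.IO.StreamWriter($outFile)
try {
    # header row from the column names
    $cols = 0..($reader.FieldCount - 1) | ForEach-Object { $reader.GetName($_) }
    $writer.WriteLine(($cols -join ','))
    # stream one row at a time so the full result set never has to fit in memory
    while ($reader.Read()) {
        $vals = 0..($reader.FieldCount - 1) | ForEach-Object { $reader.GetValue($_) }
        $writer.WriteLine(($vals -join ','))
    }
}
finally {
    $writer.Close()
    $reader.Close()
    $conn.Close()
}

VC would then just run the script in a PowerShell task and react to the exit code.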
Gary_W
2019-01-30T20:08:45Z
dpulley, what RDBMS do you use? I had a similar issue where I didn't want to risk large SQL result sets causing problems and wanted to write straight to a file.

We use Oracle, with stored procedures that return sys_refcursors as OUT parameters for capturing query output to write to files (they can also be used as source objects for a Crystal report). A sys_refcursor is basically a pointer to a result set. The stored procedure is called from a job containing a PowerShell script task: I pass in the procedure name and other parameters, and the PowerShell script connects to the database, runs the procedure, removes double quotes, and writes to the path passed in, using a StreamWriter for efficiency. You could create a stored procedure that just selects all rows from the table and use the same mechanism. I tried to make it as generic as possible, using parameters for everything.

Caveat: I know enough PowerShell to be dangerous, and I cobbled this together from various examples, so I'm sure there are improvements that could be made.

Powershell script:
######################################################################################################
##
## Generic powershell code to run the passed-in stored procedure and write the results to a file.
##
## The stored procedure should simply open a cursor as (select ...)
##
## Call this job from a VC task, passing all the parameters as job variables.
##
## This is needed as it seems VC does not handle data sets coming back from stored procedures in
## sys_refcursors. The stored procedure can be used as a data source in a Crystal report.
##
######################################################################################################
Param(
    [string] $STORED_PROCEDURE,  # Name of the stored procedure, with schema
    [string] $REF_CURSOR_NAME,   # Name of the SYS_REFCURSOR argument to the above
    [string] $OUT_FILENAME,      # Full path to the output file
    [string] $DELIMITER,         # Delimiter character
    [int]    $SKIP,              # Rows to skip, as column headers are included by default
    [string] $DATASRC,           # Data source (TNS) name
    [string] $CONNSTR            # Connection string with a {0} placeholder for the data source
)

# Load ODP.NET and build the connection string from the data source name.
[void][System.Reflection.Assembly]::LoadWithPartialName("Oracle.DataAccess")
$connectionstring = ($CONNSTR -f $DATASRC)

# Set up the stored procedure call; the result set comes back in the OUT ref cursor parameter.
$dt = New-Object System.Data.DataTable
$connection = New-Object Oracle.DataAccess.Client.OracleConnection($connectionstring)
$command = New-Object Oracle.DataAccess.Client.OracleCommand($STORED_PROCEDURE, $connection)
$command.CommandType = [System.Data.CommandType]::StoredProcedure
$params = New-Object Oracle.DataAccess.Client.OracleParameter($REF_CURSOR_NAME, [Oracle.DataAccess.Client.OracleDbType]::RefCursor, 0) `
    -Property @{Direction = [System.Data.ParameterDirection]::Output; Value = $null}
$command.Parameters.Add($params)

# Fill a DataTable from the ref cursor (the adapter opens and closes the connection itself).
$dataAdapter = New-Object Oracle.DataAccess.Client.OracleDataAdapter($command)
[void]$dataAdapter.Fill($dt)

# To export to csv
$writer = New-Object System.IO.StreamWriter($OUT_FILENAME)
$dt | ConvertTo-Csv -Delimiter "$DELIMITER" -NoTypeInformation | Select-Object -Skip $SKIP |
    ForEach-Object {
        $writer.WriteLine($_ -replace '"', "")  # Remove double-quotes. Data better not contain any!
    }
$writer.Close()

$dt.Dispose()
# End powershell script

Params:
datasource is from the TNSNAMES.ora file.
[screenshot of the job variables not preserved]

Example call:
[screenshot of the example call not preserved]
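In text form, a call would look roughly like this (the script name and all values here are just placeholders, not the ones from the screenshot):

# Hypothetical call - script path, procedure name, credentials and paths are placeholders.
.\Export-RefCursor.ps1 `
    -STORED_PROCEDURE "MYSCHEMA.MYPKG.GET_SQL_OUTPUT" `
    -REF_CURSOR_NAME  "prc" `
    -OUT_FILENAME     "C:\exports\output.csv" `
    -DELIMITER        "," `
    -SKIP             0 `
    -DATASRC          "MYDB" `
    -CONNSTR          "User Id=myuser;Password=mypass;Data Source={0}"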

Example stored procedure that is inside a package:
PROCEDURE get_SQL_output(prc OUT SYS_REFCURSOR) IS
BEGIN
  OPEN prc FOR
    select level
    from dual
    connect by level <= 10;
END get_SQL_output;
David Finkelstein
2021-07-16T15:53:42Z
These are rather excessive solutions when we should be able to simply dump the output of a SQL query to a file right in VisualCron. It seems to have this feature, but once the output of the query gets too large it fails. How can we increase the amount of data that is allowed to be written to the file in the SQL task's output option?

Thank you!
Support
2021-07-28T14:27:30Z
Originally Posted by: David Finkelstein 

These are rather excessive solutions when we should be able to simply dump the output of a SQL query to a file right in VisualCron. It seems to have this feature, but once the output of the query gets too large it fails. How can we increase the amount of data that is allowed to be written to the file in the SQL task's output option?

Thank you!



Hi David,

Are you still experiencing issues with this or did you find an alternate solution that works? If not, please send us an email to support@visualcron.com
Michael
Support
http://www.visualcron.com 

Please like VisualCron on facebook!
David Finkelstein
2021-07-28T15:36:45Z
No, I do not have a solution for this yet, and I would like one that is easy to implement and does not require these crazy workarounds. Dumping to a temporary file automatically would be fine.

Thank you,
David Finkelstein
Brinks Home
Support
2021-08-11T12:25:02Z
Originally Posted by: David Finkelstein 

No, I do not have a solution for this yet, and I would like one that is easy to implement and does not require these crazy workarounds. Dumping to a temporary file automatically would be fine.

Thank you,
David Finkelstein
Brinks Home



Unfortunately there is no easier way to handle this - we went through this internally. The only options would be to use the PowerShell workaround or keep looping like you're doing right now.
Michael
Support
http://www.visualcron.com 

Please like VisualCron on facebook!