
[ VBA big SQL-Query - Method 'CopyFromRecordset' of object 'Range' failed ]

The code posted below connects to an Oracle database, runs an SQL query and saves the resulting table in a new workbook. It works well up to about 200,000 rows. With larger datasets, however, the error "Method 'CopyFromRecordset' of object 'Range' failed" occurs when I try to copy the data from the recordset object to the workbook:

dataWs.Range("A2").CopyFromRecordset dataset 

Is there any solution to this? I tried looping over all elements of the dataset and copying them into the worksheet one by one, but that takes extremely long for big datasets. Do you have any ideas? I appreciate your help! Here's the code:

Sub QueryExecute(sqlString As String, userPW As String, userID As String, serverName As String)
'Connect to database <serverName> using user name <userID> and 
'password <userPW> to process SQL query <sqlString> and save the
'query result in a new workbook

   Dim ConnStr As String
   Dim Cn As ADODB.Connection
   Dim dataset As ADODB.Recordset
   Dim dataWs As Worksheet
   Dim dataWb As Workbook
   Dim icols As Integer

   'Create new workbook that will hold the query result/table:
   Set dataWb = Excel.Application.Workbooks.Add
   Set dataWs = dataWb.Sheets(1)

   Application.Calculation = xlCalculationManual

   'Trim trailing/leading blanks from sqlString:
   sqlString = Trim(sqlString)

   'Create string for database connection:
   ConnStr = "UID=" & userID & ";PWD=" & userPW & ";DRIVER={Microsoft ODBC for Oracle};" _
                    & "SERVER=" & serverName & ";"

   'Connect to database:
   Set Cn = New ADODB.Connection

   On Error Resume Next 'Error handling in case connection does not work

   With Cn
     .ConnectionString = ConnStr
     .CursorLocation = adUseClient
     .Open 'Open the connection; without this, Cn.Execute below fails
   End With

   'Error handling for failed connection:
   If Err.Number <> 0 Then

     MsgBox "Connection to database failed. Check username and password."
     Exit Sub

   End If

   'Send SQL query to database:
   Set dataset = Cn.Execute(sqlString)

   'Error handling for failed query:
   If Err.Number <> 0 Then

     MsgBox "SQL-query could not be processed."
     Exit Sub

   End If

   On Error GoTo 0

   'Copy column names in first row of table worksheet:
   For icols = 0 To dataset.Fields.Count - 1
     dataWs.Cells(1, icols + 1).Value = dataset.Fields(icols).Name
   Next icols

   dataWs.Range(dataWs.Cells(1, 1), _
   dataWs.Cells(1, dataset.Fields.Count)).Font.Bold = True 'Format column names

   'Copy data to workbook:
   dataWs.Range("A2").CopyFromRecordset dataset

   'Clean up and restore calculation mode:
   dataset.Close
   Cn.Close

   Application.Calculation = xlCalculationAutomatic

   MsgBox "Query successful."

End Sub

Answer 1

According to Microsoft's documentation, an Excel worksheet is limited to 1,048,576 rows by 16,384 columns. Given that it is unrealistic to manipulate or scrutinize a million rows, can we assume the spreadsheet is summarizing the rows? If that is the case, you should always look to minimize the size of the recordset being returned to Excel. And to do this you would off-load the processing/summarizing of the data onto the database.

This could be done in the SQL query or in a database procedure returning a SYS_REFCURSOR, which is essentially a pointer to the result set.
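As a rough illustration of the SQL-query route (the table and column names here are made up, not from the question), the aggregation can live entirely in the SQL string that the question's QueryExecute routine already accepts, so Excel only ever receives the summary rows:

```vba
'Hypothetical example: collapse millions of order rows down to
'one row per customer on the database side before returning them.
Dim sqlString As String
sqlString = "SELECT customer_id, COUNT(*) AS order_count, SUM(amount) AS total_amount " & _
            "FROM orders " & _
            "GROUP BY customer_id " & _
            "ORDER BY total_amount DESC"
QueryExecute sqlString, userPW, userID, serverName
```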

Answer 2

Like @OraNob says, minimise the volume of data that you're returning by filtering, aggregating and sorting at the database end. If you must retrieve large datasets (to reduce multiple calls), you could consider keeping the recordset open and just populating the worksheet with the data you need for various subsets of the data. If your recordset has more than the million-or-so rows, you can write out the results to multiple worksheets.
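One way to sketch the multiple-worksheets idea (untested, and assuming the `dataset` and `dataWb` variables from the question's code are in scope) is to lean on CopyFromRecordset's optional MaxRows argument and keep copying batches until the recordset reaches EOF:

```vba
'Sketch: spill a large recordset across several worksheets,
'one batch of up to 1,000,000 data rows per sheet.
Const BATCH_ROWS As Long = 1000000
Dim ws As Worksheet
Do Until dataset.EOF
    Set ws = dataWb.Worksheets.Add(After:=dataWb.Worksheets(dataWb.Worksheets.Count))
    'CopyFromRecordset advances the cursor past the copied rows,
    'so the next iteration continues where this batch left off.
    ws.Range("A1").CopyFromRecordset dataset, BATCH_ROWS
Loop
```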

I also recommend using the GetRows method; you will need to transpose its result, because the GetRows array is dimensioned as column then row, while Excel works best with row then column.

Also, given the size of your datasets, and assuming 32-bit Office, you won't be able to rely on Application.WorksheetFunction.Transpose to do the transpose, as you'll probably run out of memory; you may need to be careful with memory anyway if doing the transpose yourself. Consider breaking the transposes and insertions up into batches.

And finally, remember to do the insertions into the worksheet as ranges, as it will be much faster than cell-by-cell. e.g.:

Dim aRecordset As Variant, aData() As Variant, r As Long, c As Long
aRecordset = rst.GetRows(10000)   'GetRows returns a (field, row) array
ReDim aData(1 To UBound(aRecordset, 2) + 1, 1 To UBound(aRecordset, 1) + 1)
For r = 1 To UBound(aData, 1)     'transpose into row-then-column
    For c = 1 To UBound(aData, 2)
        aData(r, c) = aRecordset(c - 1, r - 1)
    Next c
Next r
Sheet1.Range(Sheet1.Cells(1, 1), Sheet1.Cells(UBound(aData, 1), UBound(aData, 2))).Value = aData