API_ImportFromCSV is a powerful developer's tool and one of my personal favorites to use via Webhooks and custom scripting. This API call is synonymous with mass importing: sending multiple records to add or update in Quick Base at one time via the API. The use cases for an import are the same as those for API_AddRecord or API_EditRecord; the difference is that ImportFromCSV can do multiple adds, multiple edits, or both, all in a single call.
So for example, if I were setting up a project and wanted a pre-defined list of tasks to be generated with it, let's say 15 tasks, one option would be to string together 15 AddRecord calls to accomplish your goal. Or, create all 15 at once using ImportFromCSV.
A simple example of a basic import call might be forecasting out a calendar year when a new project is created. A new project is created, and I want a financial forecast created in a 'Forecasts' module for Months 1-12.
If we're utilizing Webhooks in this example, then what I'm hoping to have happen is that when a new record is created, I automatically create 12 forecast records so I can plan my spend. Our example POST, then, might look something like below:
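The original example body isn't reproduced here, so the payload below is a sketch of what such a call might look like, assuming placeholder values for the user token and field IDs 6 and 7 (the request would be POSTed to your realm's /db/&lt;table dbid&gt; URL with a QUICKBASE-ACTION: API_ImportFromCSV header; your own table's details will differ):

```xml
<qdbapi>
   <usertoken>your_user_token_here</usertoken>
   <records_csv>Project #, Month
1, 1
1, 2
1, 3
1, 4
1, 5
1, 6
1, 7
1, 8
1, 9
1, 10
1, 11
1, 12</records_csv>
   <clist>6.7</clist>
   <skipfirst>1</skipfirst>
</qdbapi>
```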
To break down the above, let's look at the individual tags themselves:
<qdbapi></qdbapi>- The opening tag for any Quick Base API call
<usertoken></usertoken> - Where you put your User Token
<records_csv></records_csv> - This is where you actually put your data. Much like the name suggests, API_ImportFromCSV interprets data like a CSV file. So think about the way you might do a standard Quick Base import: you organize your data into a file structure with rows and columns, Quick Base interprets that, you confirm which field(s) you want to map your data to, and then you import it. The same thing applies here; your <records_csv> is basically your import file, so it should have a similar row/column structure. In the example above:
Project #, Month represents our column header row.
Each line after that represents a new 'row' in our data, still organized in the same column format.
Note that each new row is on a new line, and each data point is separated by a comma. Again, the name ImportFromCSV means you are structuring your data like a CSV file, where each new row is interpreted from a line break and each data point is separated by a delimiter, in this case a comma (CSV = Comma-Separated Values).
If you had additional data, say an 'owner' of this forecast, such as the PM assigned to the project, then you would add another value to each line after the month number, like so:
Project #, Month, Owner
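Extending the earlier data, the contents of <records_csv> might then look like this (the owner name here is a hypothetical placeholder):

```
Project #, Month, Owner
1, 1, Jane Doe
1, 2, Jane Doe
```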
<clist></clist> - Your clist corresponds to the actual fields in Quick Base that you want to write data to. Each field ID is separated by a period delimiter, so in our example above, I want to write to fields 6 and 7 (Project # and Month), so my clist would be 6.7
For each additional field you want to write to, add another period and then the pertinent field ID. It's important to note that the number of fields present in your clist should match the number of columns in your data set.
<skipfirst></skipfirst> - With this, you are telling Quick Base to skip the first line of your data. This is common when you have column headers. A value of '1' means the first line of your data is ignored.
When evaluating when to use ImportFromCSV, the biggest driver in my experience is not knowing how many records might be created. In the 15-task example above, you could get away with setting up a QB Automation and chaining 15 Add requests together. But what if you need 18 this time? Or 30? That unknown is where this import call comes in handy. You can utilize custom scripting, creative formulas, etc. to account for that variability and achieve your desired result.
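As a sketch of what that custom scripting might look like, the snippet below builds a records_csv body for an arbitrary number of forecast months and wraps it in the qdbapi XML body. The project number, user token, and field IDs (6.7) are placeholder example values, and the actual POST to Quick Base is omitted:

```python
# Build an API_ImportFromCSV payload for a variable number of forecast rows.
# Project number, token, and clist values are placeholders for illustration.

def build_records_csv(project_num, num_months):
    """Return CSV text: a header row plus one row per month."""
    lines = ["Project #, Month"]
    for month in range(1, num_months + 1):
        lines.append(f"{project_num}, {month}")
    return "\n".join(lines)

def build_import_xml(usertoken, records_csv, clist="6.7"):
    """Wrap the CSV data in the qdbapi body expected by API_ImportFromCSV.

    skipfirst is set to 1 because build_records_csv includes a header row.
    """
    return (
        "<qdbapi>"
        f"<usertoken>{usertoken}</usertoken>"
        f"<records_csv>{records_csv}</records_csv>"
        f"<clist>{clist}</clist>"
        "<skipfirst>1</skipfirst>"
        "</qdbapi>"
    )

# Whether the project needs 12 rows, 18, or 30, the same code handles it:
csv_body = build_records_csv(1, 12)
xml_body = build_import_xml("your_user_token_here", csv_body)
```

Because the row count is just a function argument, the same script covers the "what if you need 18 this time?" case without restructuring anything.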
For an example of how to use formulas to achieve dynamic record creation, I suggest taking a look at Formulas + Webhooks = Dynamic Record Creation to explore further.
Author: Chayce Duncan (email@example.com)
Date Submitted: 1/16/19