It is fairly easy to import JSON collections of documents into SQL Server if there is an underlying 'explicit' table schema available to them. If each of the documents has a different schema, then you have little chance. Fortunately, schema-less data collections are rare.

In any relational database, we can take two approaches to JSON data: we can accommodate it, meaning we treat it as an 'atomic' unit and store the JSON unprocessed, or we can assimilate it, meaning that we turn the data into a relational format that can be easily indexed and accessed. To accommodate JSON, we store it as a CLOB, usually NVARCHAR(MAX), with extra columns containing the extracted values for the data fields on which you would want to index the data. This is fine where all the database has to do is store an application object without understanding it. To assimilate JSON, we need to extract all the JSON data and store it in relational form.

In this article we'll start simply and work through a couple of sample examples before ending by creating a SQL Server database schema with ten tables, constraints and keys. Once those are in place, we'll import a single JSON document, filling the ten tables with the data of 70,000 fake records from it.

Let's start this gently, putting simple collections into strings which we will insert into a table. We'll then try slightly trickier JSON documents with embedded arrays and so on. We'll begin with the example of sheep-counting words, collected from many different parts of Great Britain and Brittany. You will need access to SQL Server 2016 or later, or Azure SQL Database or SQL Data Warehouse, to play along, and you can download the data and code from GitHub.

Converting Simple JSON Arrays of Objects to Table-sources

I don't use sheep-counting words because they are of general importance, but because they can represent whatever data you are trying to import. The simple aim is to put them into a table. Just as a side-note, this data was collected for this article in various places on the internet, but mainly from Yan Tan Tethera. Each table was pasted into Excel and tidied up. The JSON code was created by using three simple functions: one for the cell-level value, one for the row value, and a final summation. The technique is only suitable where columns are of fixed length, but it allowed simple adding, editing and deleting of data items. We will start off by creating a simple table that we want to import into.

Importing a More Complex JSON Data Collection into a SQL Server Database

We have successfully imported the very simplest JSON files into SQL Server. Now we need to consider those cases where the JSON document or collection represents more than one table. Our example represents a very simple customer database with ten linked tables.
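As a sketch of the two approaches, the T-SQL below first accommodates a document in an NVARCHAR(MAX) column, exposing one field for indexing through a persisted computed column, and then assimilates a simple JSON array of objects into a table-source with OPENJSON ... WITH (SQL Server 2016 and later). All table, column and field names here are illustrative assumptions, not the schema from the article's downloadable code.

```sql
-- Accommodate: store the whole document, validate it is JSON,
-- and index one extracted field via a persisted computed column.
CREATE TABLE dbo.CustomerDocument
  (
  Document_ID int IDENTITY PRIMARY KEY,
  TheDocument nvarchar(MAX) NOT NULL
    CONSTRAINT IsValidJSON CHECK (IsJson(TheDocument) = 1),
  CustomerName AS Json_Value(TheDocument, N'$.name') PERSISTED
  );
CREATE INDEX CustomerName ON dbo.CustomerDocument (CustomerName);

-- Assimilate: shred a simple JSON array of objects into rows
-- and columns that can be inserted into a relational table.
DECLARE @json nvarchar(MAX) =
  N'[{"region":"Wensleydale","one":"Yan","two":"Tan","three":"Tethera"},
     {"region":"Derbyshire","one":"Yain","two":"Tan","three":"Tethera"}]';

SELECT region, one, two, three
  FROM OpenJson(@json)
  WITH
    (
    region nvarchar(40) N'$.region',
    one nvarchar(30) N'$.one',
    two nvarchar(30) N'$.two',
    three nvarchar(30) N'$.three'
    );
```

The computed-column trick is the standard way to get conventional index support over accommodated JSON, since SQL Server has no native JSON index; JSON_VALUE is deterministic, so the column can be persisted and indexed.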