Loading data through generated code snippets

You can add automatically generated code to a notebook cell to load data from project data assets. The asset type can be a file or a database connection.

To use this feature, click in an empty code cell in your notebook, click the Code snippets icon on the notebook toolbar, select Read data, and then select an asset from the project. You can then:

  • Insert the data source access credentials. This capability is available for all data assets that are added to a project. With the credentials, you can write your own code to access the asset and load the data into data structures of your choice.

  • Generate code that is added to the notebook cell. The inserted code serves as a quick start for working with a data set or connection. For production systems, carefully review the inserted code to decide whether you should write your own code that better meets your needs.

    When you run the code cell, the data is accessed and loaded into the data structure you selected.

    Notes:

    1. The ability to provide generated code is disabled for some connections if:
      • The connection credentials are personal credentials
      • The connection uses a secure gateway link
      • The connection credentials are stored in vaults
    2. If the file type or database connection that you are using doesn't appear in the following lists, you can choose to generate generic code. For Python, this is a StreamingBody object; for R, a textConnection object.
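As an illustration of what a generated Python snippet typically does for a CSV asset, the sketch below obtains the file's bytes and loads them into a pandas DataFrame. The in-memory buffer stands in for the project asset, and names such as `body` are illustrative, not the exact identifiers the snippet generator emits.

```python
import io

import pandas as pd

# Stand-in for the file body that a generated snippet would fetch from
# project storage (for example, via an object-storage client call).
body = io.BytesIO(b"id,name,score\n1,alice,0.9\n2,bob,0.7\n")

# Load the bytes into a pandas DataFrame, as the generated
# "pandasDataFrame" option does for CSV assets.
df = pd.read_csv(body)
```

When you run the cell, `df` holds the parsed table and is ready for analysis, which is the same end state the generated snippet produces.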

The following tables show you which data source connections (file types and database connections) support the option to generate code. The options for generating code vary depending on the data source, the notebook coding language, and the notebook runtime compute.

Supported file types

Table 1. Supported file types
For each data source, the entries below list the notebook coding language, the compute engine type, and the available support to load data.

CSV files
  • Python
    • Anaconda Python distribution: Load data into pandasDataFrame
    • With Spark: Load data into pandasDataFrame and sparkSessionDataFrame
    • With Hadoop: Load data into pandasDataFrame and sparkSessionDataFrame
  • R
    • Anaconda R distribution: Load data into R data frame
    • With Spark: Load data into R data frame and sparkSessionDataFrame
    • With Hadoop: Load data into R data frame and sparkSessionDataFrame

Python Script
  • Python
    • Anaconda Python distribution: Load data into pandasStreamingBody
    • With Spark: Load data into pandasStreamingBody
    • With Hadoop: Load data into pandasStreamingBody
  • R
    • Anaconda R distribution: Load data into rRawObject
    • With Spark: Load data into rRawObject
    • With Hadoop: Load data into rRawObject

JSON files
  • Python
    • Anaconda Python distribution: Load data into pandasDataFrame
    • With Spark: Load data into pandasDataFrame and sparkSessionDataFrame
    • With Hadoop: Load data into pandasDataFrame and sparkSessionDataFrame
  • R
    • Anaconda R distribution: Load data into R data frame
    • With Spark: Load data into R data frame, rRawObject, and sparkSessionDataFrame
    • With Hadoop: Load data into R data frame, rRawObject, and sparkSessionDataFrame

.xlsx and .xls files
  • Python
    • Anaconda Python distribution: Load data into pandasDataFrame
    • With Spark: Load data into pandasDataFrame
    • With Hadoop: Load data into pandasDataFrame
  • R
    • Anaconda R distribution: Load data into rRawObject
    • With Spark: No data load support
    • With Hadoop: No data load support

Octet-stream file types
  • Python
    • Anaconda Python distribution: Load data into pandasStreamingBody
    • With Spark: Load data into pandasStreamingBody
  • R
    • Anaconda R distribution: Load data into rRawObject
    • With Spark: Load data into rDataObject

PDF file type
  • Python
    • Anaconda Python distribution: Load data into pandasStreamingBody
    • With Spark: Load data into pandasStreamingBody
    • With Hadoop: Load data into pandasStreamingBody
  • R
    • Anaconda R distribution: Load data into rRawObject
    • With Spark: Load data into rDataObject
    • With Hadoop: Load data into rRawData

ZIP file type
  • Python
    • Anaconda Python distribution: Load data into pandasStreamingBody
    • With Spark: Load data into pandasStreamingBody
  • R
    • Anaconda R distribution: Load data into rRawObject
    • With Spark: Load data into rDataObject

JPEG, PNG image files
  • Python
    • Anaconda Python distribution: Load data into pandasStreamingBody
    • With Spark: Load data into pandasStreamingBody
    • With Hadoop: Load data into pandasStreamingBody
  • R
    • Anaconda R distribution: Load data into rRawObject
    • With Spark: Load data into rDataObject
    • With Hadoop: Load data into rDataObject

Binary files
  • Python
    • Anaconda Python distribution: Load data into pandasStreamingBody
    • With Spark: Load data into pandasStreamingBody
    • With Hadoop: No data load support
  • R
    • Anaconda R distribution: Load data into rRawObject
    • With Spark: Load data into rRawObject
    • With Hadoop: Load data into rDataObject
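Several of the entries above load data into a pandasStreamingBody rather than a parsed DataFrame. A StreamingBody is a file-like object with a read() method, so parsing is up to you. The sketch below uses io.BytesIO as a hedged stand-in for the streaming object returned by generated code; the parsing step is the part you would add yourself.

```python
import io

import pandas as pd

# io.BytesIO stands in for the StreamingBody that generated code
# returns; like a StreamingBody, it supports read().
streaming_body = io.BytesIO(b"a,b\n1,2\n3,4\n")

# Option 1: read the raw bytes (useful for binary, image, or PDF content).
raw = streaming_body.read()

# Option 2: for delimited text, rewind and parse into a DataFrame.
streaming_body.seek(0)
df = pd.read_csv(streaming_body)
```

A real StreamingBody may not support seek(), so in practice you often read the bytes once and wrap them in a fresh buffer before parsing.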

Supported database connections

Table 2. Supported database connections
For each data source, the entries below list the notebook coding language, the compute engine type, and the available support to load data.

- Db2 Warehouse on Cloud
- IBM Db2 on Cloud
- IBM Db2 Database
  • Python
    • Anaconda Python distribution: Load data into ibmdbpyIda and ibmdbpyPandas
    • With Spark: Load data into ibmdbpyIda, ibmdbpyPandas, and sparkSessionDataFrame
    • With Hadoop: Load data into ibmdbpyIda, ibmdbpyPandas, and sparkSessionDataFrame
  • R
    • Anaconda R distribution: Load data into ibmdbrIda and ibmdbrDataFrame
    • With Spark: Load data into ibmdbrIda, ibmdbrDataFrame, and sparkSessionDataFrame
    • With Hadoop: Load data into ibmdbrIda, ibmdbrDataFrame, and sparkSessionDataFrame

- Db2 for z/OS
  • Python
    • Anaconda Python distribution: Load data into ibmdbpyIda and ibmdbpyPandas
    • With Spark: No data load support

- Amazon Simple Storage Service (S3)
- Amazon Simple Storage Service (S3) with an IAM access policy
  • Python
    • Anaconda Python distribution: Load data into pandasStreamingBody
    • With Hadoop: Load data into pandasStreamingBody and sparkSessionSetup
  • R
    • Anaconda R distribution: Load data into rRawObject
    • With Hadoop: Load data into rRawObject and sparkSessionSetup

- IBM Cloud Databases for PostgreSQL
- Microsoft SQL Server
  • Python
    • Anaconda Python distribution: Load data into pandasDataFrame
    • With Spark: Load data into pandasDataFrame
  • R
    • Anaconda R distribution: Load data into R data frame
    • With Spark: Load data into R data frame and sparkSessionDataFrame

- (data source name missing in source)
  • Python
    • Anaconda Python distribution: Load data into pandasDataFrame. In the generated code, edit the path parameter in the last line of code and remove the comment tagging. To read data, see Reading data from a data source. To search data, see Searching for data objects. To write data, see Writing data to a data source.
    • With Spark: No data load support
  • R
    • Anaconda R distribution: Load data into R data frame. In the generated code, edit the path parameter in the last line of code and remove the comment tagging. To read data, see Reading data from a data source. To search data, see Searching for data objects. To write data, see Writing data to a data source.
    • With Spark: No data load support

- (data source name missing in source)
  • Python
    • Anaconda Python distribution: Load data into pandasDataFrame
    • With Spark: Load data into pandasDataFrame
  • R
    • Anaconda R distribution: No data load support
    • With Spark: No data load support

- Amazon RDS for MySQL
  • Python
    • Anaconda Python distribution: Load data into pandasDataFrame
    • With Spark: Load data into pandasDataFrame
  • R
    • Anaconda R distribution: Load data into R data frame and sparkSessionDataFrame
    • With Spark: No data load support

- HTTP
- Apache Cassandra
- Amazon RDS for PostgreSQL
  • Python
    • Anaconda Python distribution: Load data into pandasDataFrame
    • With Spark: Load data into pandasDataFrame
  • R
    • Anaconda R distribution: Load data into R data frame
    • With Spark: Load data into R data frame
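For database connections, the generated snippet ultimately opens a connection with the asset's credentials, runs a SQL query, and materializes the result, for example as a pandasDataFrame. The sketch below shows that pattern against an in-memory SQLite database; SQLite and the `sales` table are stand-ins for the driver-specific connection (Db2, PostgreSQL, SQL Server) and table that the generated code would configure.

```python
import sqlite3

import pandas as pd

# In-memory SQLite stands in for the real database connection that
# generated code would open with the data asset's credentials.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?)",
    [("east", 100.0), ("west", 250.0)],
)

# Load the query result into a pandas DataFrame, mirroring the
# "Load data into pandasDataFrame" option for database sources.
df = pd.read_sql_query("SELECT region, amount FROM sales", conn)
conn.close()
```

Swapping the SQLite connection for the connection object your snippet creates leaves the `read_sql_query` step unchanged, which is why reviewing and adapting the query is usually the only edit needed.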

Parent topic: Loading and accessing data in a notebook
