Testing AI services
Last updated: Nov 08, 2024

After creating your AI service, you can test its logic locally by using the watsonx.ai Python client library.

Testing AI services with Python client library

To test the logic of your AI service locally by using the RuntimeContext class of the watsonx.ai Python client library, follow these steps:

  1. Use the RuntimeContext class of the Python client library to test your AI service locally:

    from ibm_watsonx_ai.deployments import RuntimeContext
    
    context = RuntimeContext(
        api_client=client, request_payload_json={}
    )
    
    # custom_object is an optional argument that you specify when you create the deployment
    custom_object = {"space_id": space_id}
    
    generate, generate_stream, generate_batch = basic_generate_demo(context, **custom_object)
    
    

    For more information, see the watsonx.ai Python client library documentation on using RuntimeContext for AI services.

  2. Depending on your use case, test the generate(), generate_stream(), or generate_batch() function as follows:

    • To test the generate() function:

      context.request_payload_json = {"test": "ai_service inference payload"}
      print(generate(context))
      
    • To test the generate_stream() function:

      context.request_payload_json = {"sse": ["ai_service_stream", "inference", "test"]}
      for data in generate_stream(context):
          print(data)
      
    • To test the generate_batch() function:

      input_data_references = [
          {
              "type": "connection_asset",
              "connection": {"id": "2d07a6b4-8fa9-43ab-91c8-befcd9dab8d2"},
              "location": {
                  "bucket": "wml-v4-fvt-batch-pytorch-connection-input",
                  "file_name": "testing-123",
              },
          }
      ]
      output_data_reference = {
          "type": "data_asset",
          "location": {"name": "nb-pytorch_output.zip"},
      }
      
      generate_batch(input_data_references, output_data_reference)
      
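Step 1 unpacks the three inference functions from basic_generate_demo, the AI service function that you wrote and deployed; its definition is not shown on this page. The following is a minimal sketch of the shape such a function is expected to have, assuming a context object that exposes get_json() for reading the request payload (as RuntimeContext does). The payload fields and the space_id custom argument are illustrative, not a fixed API.

```python
# Minimal sketch of an AI service function with the shape that
# basic_generate_demo is expected to have. The payload fields and the
# space_id custom argument are illustrative assumptions, not a fixed API.

def basic_generate_demo(context, **custom):
    # Arguments captured here, such as an optional space_id passed in
    # through the custom keyword arguments, remain available to the
    # inner inference functions via the closure.
    space_id = custom.get("space_id")

    def generate(context):
        # Synchronous inference: read the request payload and return a response.
        payload = context.get_json()
        return {"body": {"echo": payload, "space_id": space_id}}

    def generate_stream(context):
        # Streaming inference: yield one event per chunk of output.
        payload = context.get_json()
        for item in payload.get("sse", []):
            yield {"data": item}

    def generate_batch(input_data_references, output_data_reference):
        # Batch inference: read inputs from and write results to the
        # given data references (a no-op in this sketch).
        pass

    return generate, generate_stream, generate_batch
```

Returning the three functions as a tuple is what lets the test code in step 1 unpack them and call each one with a locally constructed context.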


Parent topic: Deploying AI services with direct coding
