
API Call History

In this release, we updated the design and function of the API Call History page. Search and filtering options make it easy to narrow down the list of calls, now including drop-down menus for call purpose and user. We also made the table easier to navigate by rearranging columns, adding Correlation ID and Call Purpose columns, and pinning the checkbox and actions columns so they stay in view without unnecessary scrolling. Finally, we made some backend enhancements, resulting in improved performance for larger history sizes!

[Screenshot: Spark API Call History page showing multiple search and filter options above a list of five calls.]

 

Other changes

  • Recent Activity Log - When a user restores a service to an earlier version, this is now reported in the Recent Activity Log.

  • Testing Center - Testbed tables no longer include input columns for sub-services that are not selected. Previously the columns were present but blank.

  • Testing Center - Spark now displays a warning when uploading a testbed template whose inputs/outputs do not match those of the Service.

  • Compare versions - After requesting a service Version Comparison Report, if a user minimizes the progress modal window, they may now re-display it via the Background Activity dropdown (located in the top-right of the Spark interface) to view the comparison summary before downloading the report.

  • Service Download - To make it more obvious when a user is working on a “Configured Service” as opposed to an “Original Service”, Spark now injects an additional tab into the file when the user opts to download a Configured Service. This tab includes details on the version downloaded.

  • Insights - Some adjustments have been made to how services are counted in Insights. New service uploads are no longer counted as updates; only new versions of an existing service are counted as updates.

 

Security

 

Single Sign On

Customers can now set up Single Sign On (SSO) for their Spark tenants and control access and permissions automatically. This allows network administrators to easily set up single-click access to Spark for their organization and to use existing user groups. For more details, see our instructions on using Azure AD as an identity provider (IdP). Instructions for other IdPs can be added on request.

 

Features Permissions

We have improved some of our backend implementations around Features permissions. The Features permissions screen gives tenant administrators highly granular control over API Key access to microservices in the Spark backend. This enables useful integrations with backend APIs, for example to download converted logic, download the API Call History, orchestrate some of the testing capabilities outside of Spark, and more.

To use Features Permissions:

  1. Create a new user group for the integration.

  2. Then create a new API Key Group. Include the newly created user group as well as all user groups needed to access the target services. This will automatically create a new API Key.

  3. Go to Options > Tenant configuration > Features permissions and find a feature that you would like to call with the API Key, for example Spark.DownloadCsvLog.json, which downloads the API Call History as a CSV file (see the sketch after this list).

[Screenshot: Features permissions section of the Tenant configuration screen, showing a list of features and the numbers of users and APIs they encompass. Each row also has a button to view more details.]

  4. View the feature and add the user group created in Step 1.

[Screenshot: Feature details box including a description, the list of user groups added and a dialog to add others, followed by a list of the API endpoints made accessible.]

  5. The API Key created in Step 2 can now call the feature selected in Step 3.
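
For illustration, here is a minimal sketch of calling such a feature with the API Key from a script. The tenant URL, endpoint path, and authentication header name below are placeholders we have assumed for the example, not the documented Spark values; please check the Spark API documentation for your tenant's actual request details.

```ts
// Minimal sketch (TypeScript, Node 18+): downloading the API Call History as CSV
// using the API Key created in Step 2. BASE_URL, the endpoint path, and the
// header name are hypothetical placeholders, not documented Spark values.
const BASE_URL = "https://your-tenant.example.com"; // hypothetical tenant URL
const API_KEY = process.env.SPARK_API_KEY ?? "";    // the API Key from Step 2

async function downloadCallHistoryCsv(): Promise<string> {
  const response = await fetch(`${BASE_URL}/api/v1/log/download-csv`, {
    headers: { "x-api-key": API_KEY },              // hypothetical auth header
  });
  if (!response.ok) {
    throw new Error(`Download failed: ${response.status} ${response.statusText}`);
  }
  return response.text();                           // CSV contents as text
}

downloadCallHistoryCsv()
  .then((csv) => console.log(csv.split("\n").slice(0, 5).join("\n"))) // preview first rows
  .catch(console.error);
```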

For help, please reach out to our Customer Success team!

 

Spark Forms

Frontend apps can now display a service's name when calling the service by ID or by version, since folder_name and service_name are now included in the response_meta section of getFormSpec API responses. Previously, this was only possible when calling the service by name.
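
As an illustration, a minimal sketch of how a frontend might read these fields is shown below. The interface only models the two fields mentioned above; the real getFormSpec payload contains more, and the shape shown is an assumption rather than the documented schema.

```ts
// Minimal sketch: pulling folder_name and service_name out of a getFormSpec
// response. Only the fields discussed above are modelled; everything else in
// the real payload is left as an index signature.
interface GetFormSpecResponse {
  response_meta: {
    folder_name: string;
    service_name: string;
    [key: string]: unknown;
  };
  [key: string]: unknown;
}

// Build a display title for the form, e.g. "MyFolder / MyService".
function formTitle(spec: GetFormSpecResponse): string {
  const { folder_name, service_name } = spec.response_meta;
  return `${folder_name} / ${service_name}`;
}
```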

 

Spark Assistant

In this release of Spark Assistant, we have prefixed all previous Coherent functions with C., for example =C.SPARK_XCALL or =C.SPARK_XMLTOJSON. This helps better distinguish Coherent functions from other add-ins. Any services uploaded to Spark with the older convention, e.g. SPARK_XCALL, will still work in the Spark UI and APIs; if they are being used in Spark Assistant, however, we recommend adding the prefix to existing functions in order to interact with the calculated outputs in Excel.
