
Custom Branding

Spark now supports custom branding! Tenants may be configured to display a different logo in place of the Coherent logo. With your own logo in place, you can present Spark as an integral part of your technology solution and give your employees and customers confidence that they are using an internally approved and vetted application. For now, Coherent must enable custom branding for you via tenant configuration; soon, however, tenant admins will be able to do so on their own. Once enabled, tenant admins simply upload the logo and it will be displayed the next time they log in.


Webhook Configuration

In March we introduced the ability to create webhooks in Spark; now we've added the ability to configure and enable them via the new Webhook configuration section of Tenant configuration.

Webhooks allow you to export event data (which is typically seen in the Recent activity panel within the Spark UI) and process or log it within your own systems. You can then set up workflows based on these events, for example sending requests to Spark APIs or to external systems. One example might be to deactivate a service if requests exceed a certain threshold or if the service is executed by users other than those in a pre-defined list.


How to set up a webhook:

  1. As a Tenant Admin, open the User menu and click or tap Options.
  2. When the Tenant configuration page opens, navigate to the Webhook configuration tab.
  3. Check the Enable webhooks option.
  4. Click the New webhook button.
  5. Complete the fields in the modal window:
    • Add a descriptive Name

    • Specify the Endpoint URL to be called when an event occurs. The event details will be passed to this endpoint in the request payload.

    • Add Request headers as required.

    • Add Query string parameters as required. These static parameters and values will be appended to the Endpoint URL before it's called.

    • Optionally, enter a Description.

  6. Click Add to save your changes. The webhook will now be triggered whenever any event occurs.
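As an illustration of the threshold example above, a webhook receiver might decide whether to deactivate a service based on an incoming event. This is a minimal sketch: the event field names (type, user, execution_count), the allowed-user list, and the threshold are assumptions for illustration, not Spark's actual payload schema.

```python
# Illustrative values; a real receiver would load these from configuration.
ALLOWED_USERS = {"alice@example.com", "bob@example.com"}
EXECUTION_THRESHOLD = 1000

def should_deactivate(event: dict) -> bool:
    """Return True if this event should trigger deactivating the service."""
    if event.get("type") != "service.executed":
        return False
    if event.get("user") not in ALLOWED_USERS:
        # Executed by a user outside the pre-defined list.
        return True
    # Deactivate if requests exceed the configured threshold.
    return event.get("execution_count", 0) > EXECUTION_THRESHOLD
```

The receiving endpoint would call this for each delivered event and, when it returns True, send a deactivation request back to the Spark APIs.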


Neuron

Performance improvements

In our latest release, we're excited to announce a significant performance improvement for our SaaS platform users. By leveraging the power of Neuron technology, we've optimized the execution of our APIs, resulting in a speed increase of up to 30%. This enhancement will enable more efficient and seamless interactions with the platform, improving the overall user experience and accelerating the completion of critical tasks. Stay ahead with our faster and more reliable SaaS platform, and continue to enjoy the benefits of our ongoing commitment to innovation and excellence.


Neuron version targeting improvements

When a new version of a service is created or recompiled in Neuron and the default compiler version for updates is set to MaintainVersion, Spark will first verify that the resolved Neuron version exists. If it does not, Spark will fall back to StableLatest for recompilation.

Please note: If the tenant default compiler for new services is set to Release Candidate, it is best to have the same configuration for service version updates as well.
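The fallback above can be sketched as a small decision function. The version names match those in the note; the function itself is illustrative, not Spark's implementation.

```python
def resolve_compiler_version(default_for_updates: str,
                             resolved_version_exists: bool) -> str:
    """Sketch of the fallback rule: MaintainVersion falls back to
    StableLatest when the resolved Neuron version no longer exists."""
    if default_for_updates == "MaintainVersion" and not resolved_version_exists:
        return "StableLatest"
    return default_for_updates
```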


Recompile process improvements

Now when recompiling a service with a different version of Neuron, Spark shows a status bar and provides access to the full upload log.


Security


Access expiration

Spark now provides the ability to define deactivation dates for users. On the user creation and user editing screens, tenant admins will find a new optional field called Access expiration. The date and time defined here determine when the user will lose access to Spark.

When the expiration date has passed, the expired user will be sent to a redirect page when they try to log in. If they try to access any Spark APIs with their bearer token, the API will return a 401: NO AUTHORIZATION error.

Tenant admins can now be confident that users will only have access for as long as they need it.


Other security improvements

  • The way Spark handles Deactivated users has also changed. They will no longer simply be disabled; instead, the date of their deactivation will be added as a custom attribute, account_end_date. When a user attempts to log in, this attribute is checked: if the user's token carries the account_end_date claim and the defined date has passed, a 401: NO AUTHORIZATION message will be returned.

  • Spark now includes a nonce parameter in authentication requests to prevent ID token replay attacks and enhance the security of Spark's authorization flow.

  • Spark now validates audience claims when a client accesses an API using an access or ID token. Access will not be granted to any client using an ID token.
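The nonce improvement above can be illustrated with a standard OpenID Connect-style authentication request. The parameter names follow OIDC conventions, and the values here are placeholders rather than Spark's actual configuration.

```python
import secrets

def build_auth_request_params(client_id: str, redirect_uri: str) -> dict:
    """Sketch of including a nonce in an authentication request to prevent
    ID token replay: the server echoes the nonce back inside the ID token,
    so a replayed token from an earlier flow can be detected and rejected."""
    nonce = secrets.token_urlsafe(32)  # unguessable, single-use value
    return {
        "client_id": client_id,
        "redirect_uri": redirect_uri,
        "response_type": "code",
        "scope": "openid",
        "nonce": nonce,
    }
```

The client stores the nonce it sent and verifies that the nonce claim in the returned ID token matches before accepting the token.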


Spark Forms

Previously, users who edited or customized their Spark Forms found that changes to the underlying model were not incorporated into the edited form; they then had to manually synchronize the edited form with the updated service inputs. The only viable option had been to delete the service, upload it afresh, and then redo their previous edits and customizations. With this release, Spark intelligently merges changes from newly uploaded service versions into the customized "FormSpec".

If a new service version includes new inputs or outputs, these will be inserted into the customized FormSpec within a subsection labeled "New control". For existing controls, properties such as control type and metadata will be updated as necessary. If a new service version removes inputs or outputs, their corresponding control definitions in the FormSpec will also be removed; if that leaves an empty section or subsection, it will be removed as well.


Execute API enhancements

  • Direct addressing to submit inputs and request outputs directly from the file; see request_data under "Direct cell reference". Tenant administrators can control access to directly referenced outputs using the Direct addressing outputs enabled toggle in Service Documentation > Service Details.

  • We have added the ability to download a copy of the original Excel file with the API response; see request_meta under excel_file.
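A hedged sketch of what such a request body might look like. Only the request_data and request_meta keys come from the notes above; the nested field names and the cell-reference format are assumptions for illustration.

```python
# Illustrative Execute API request body, not Spark's exact schema.
execute_request = {
    "request_data": {
        # Direct cell reference: request outputs straight from the file.
        # The "Sheet!Cell" notation is an assumed format.
        "outputs": ["Sheet1!B2", "Sheet1!C10"],
    },
    "request_meta": {
        # Ask for a copy of the original Excel file with the response.
        "excel_file": True,
    },
}
```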


Validation API

As requested by several customers, the Validation API now includes the following additional information:

  • input_message_title

  • input_message

  • error_style

  • error_title

  • error_message
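For illustration, a single validation entry carrying the new fields might look like this; the values are placeholders, and the response structure surrounding these fields is not shown in the notes.

```python
# Hypothetical entry from a Validation API response, showing the new fields.
validation_entry = {
    "input_message_title": "Interest rate",
    "input_message": "Enter a rate between 0 and 1.",    # cell input hint
    "error_style": "stop",                               # validation style
    "error_title": "Invalid rate",
    "error_message": "The rate must be between 0 and 1.",
}
```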


API Call History

A JSON-formatted API Call History log provides more flexibility and options for integration with other systems and allows consumers to filter and include only the relevant information more easily. In addition to the existing Excel and CSV format options, Spark now offers JSON format download via the UI and a new API. Within Spark, navigate to the API call history page for your service, click Download all API calls, then click Download in JSON format to download the .zip archive. See the User Guide for more details.
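Once downloaded, the JSON log is straightforward to filter programmatically. A minimal sketch, assuming the log is a JSON array of call records and using a hypothetical duration_ms field (the actual record fields are described in the User Guide):

```python
import json

def filter_calls(log_json: str, min_duration_ms: int) -> list[dict]:
    """Keep only call records at or above a duration threshold.
    The "duration_ms" field name is an assumption for this sketch."""
    records = json.loads(log_json)
    return [r for r in records if r.get("duration_ms", 0) >= min_duration_ms]
```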


Xcall


C.SPARK_XCALL() UDF improvements

Previously, if the Xinput and Xoutput tables allowed a maximum of 100 rows and 100 columns but the file uploaded by a user contained data in only 80 rows, the C.SPARK_XCALL() UDF (User Defined Function) would generate only 80 rows. With this release, Xcall first calls the Service Info API and stores the response in memory; this definition is then used to generate the inputs and outputs instead of the default. Xcall's memory is refreshed whenever the user logs in or the SPARK_XCALL function is recalculated. If a user already has Excel and Spark Assistant open and a new version of a service used within an Xcall is published, Spark Assistant prompts the user to sync with the latest version by clicking Sync.
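The caching behavior described above can be sketched as follows; the class and method names are illustrative, not part of Spark or Spark Assistant.

```python
class XcallCache:
    """Sketch of caching a Service Info API response so Xcall can size its
    inputs and outputs from the service definition instead of the default."""

    def __init__(self, fetch_service_info):
        self._fetch = fetch_service_info  # callable hitting the Service Info API
        self._cache: dict[str, dict] = {}

    def get_definition(self, service_id: str) -> dict:
        """Return the cached definition, fetching it on first use."""
        if service_id not in self._cache:
            self._cache[service_id] = self._fetch(service_id)
        return self._cache[service_id]

    def refresh(self) -> None:
        """Clear the cache, as happens on login or recalculation."""
        self._cache.clear()
```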


Other improvements

  • Previously, Spark truncated file names longer than 50 characters, which occasionally led to the creation of new services instead of new versions of existing services. Spark now stores the full names of uploaded files and uses them when determining whether an upload should be treated as a new service or a new version.

  • XReport now supports the Montserrat (Bold/Regular) and Halant (Regular) fonts.

  • The Upload Log now displays Spark-related messages before Neuron-related messages, for all categories: Info, Warning, Error, and Tips.

  • The Active Service counter on the Insights page is now displayed only if the EnableActiveService flag has been set to TRUE for the tenant. Since most tenants do not use this feature, displaying the counter by default proved confusing.

  • A link to our recently launched Ignitors Community has been added to the Spark User Menu.
