
APIs

 

What is an API?

An API (Application Programming Interface) is, at its simplest, software that allows two computers or programs to connect and exchange data.

 

What options does Spark provide to view the API request?

The API request can be viewed in several ways:

  • The Field view provides an interactive experience for easily changing values and validating the results

  • The JSON view shows a syntax-highlighted structure of the JSON request body

  • The Raw view is a plain-text version of the request, which can be modified to test the request structure

 

What options does Spark provide to support integrations with other systems?

  • cURL command: for any Excel service added to Spark, users can download a cURL command that can be used to send data to, or receive data from, the model from an external system using standard URL syntax

  • Swagger: Spark can also generate a downloadable Swagger (OpenAPI) specification describing the API for a particular model. This file can be used both to share the API definition across teams and to drive various API processes

  • HTML or PDF documentation: Spark automatically generates a document for each API that describes the input and output parameters and the Spark API response, along with sample integration methods

 

What does the API URL look like? And is the URL generated live?

The default endpoint is a URL containing the tenant name, folder name and Spark service name. Alternative URL structures are also available, including a custom proxy URL endpoint and referencing the Spark service by UUID. Once a model is uploaded to Spark, the generated API endpoint is live and can be accessed with an API key.
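
As an illustration, here is a minimal Python sketch of calling a live Spark endpoint. All names in it are placeholders - the host, tenant, folder, service, header and input fields are assumptions for this example - and the exact URL and headers for your service appear in the cURL command and documentation Spark generates.

```python
import requests

# Placeholder values: copy the real URL and header names from the cURL
# command or documentation that Spark generates for your service.
ENDPOINT = "https://spark.example.com/mytenant/api/folders/pricing/services/quote/execute"
API_KEY = "your-api-key"  # issued by Spark

# Inputs correspond to the cells tagged as inputs in the uploaded workbook
payload = {"inputs": {"loan_amount": 250000, "term_years": 30}}

resp = requests.post(ENDPOINT, json=payload, headers={"x-api-key": API_KEY})
resp.raise_for_status()

# The response carries the values of the cells tagged as outputs
print(resp.json())
```

The same request can equally be issued with the downloaded cURL command; the Python form is shown only because it is easy to embed in a calling application.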

 

Who is the typical end user for Spark?

Spark is aimed at both business users and IT users/programmers, depending on the use case. The end user can also be an application that supports APIs (e.g. front-end or back-end systems).

 

SLAs​

 

What is Spark’s typical response time? Does Spark have performance benchmarks?

  • Previous performance tests of the Spark platform show it can achieve more than 30,000 transactions per minute, with 99% of response times under 500 milliseconds

  • Coherent uses AWS Web Application Firewall (WAF) to filter out rogue requests before they reach our system​

  • Coherent has monitoring and alerting on system resource utilization​

 

What is the standard response time of a service?​

Performance will depend on the complexity of the customer's model. Simple models will typically respond in around 2 ms (although this will also depend on network paths).

 

What does Support look like for Spark?​

We maintain a support team that operates during local business hours and responds based on agreed SLAs. Customer success teams are also available in each region to assist with general support queries.

 

Will Spark have scheduled outages?​

We don't anticipate scheduled outages (last year there were two, both related to infrastructure upgrades), but if we do need to schedule one we provide three weeks' notice.

 

General​

 

What does an “API Endpoint” refer to?​

An API endpoint is the URL for a (Spark) service. APIs operate through requests and responses: one application makes a request, and the (Spark) API endpoint sends a response based on the input provided in the request. In Spark, each uploaded spreadsheet (Excel workbook) generates an active API endpoint.

 

Will subsequent versions of the same Excel file uploaded continue to produce the same API / API endpoint?

A new upload to Spark can be treated either as an updated version of an existing API endpoint or as a new API endpoint. If there are no changes to the tagged inputs and outputs of the Excel file, the endpoint will continue to work without any change to the calling application. It is also possible to call previous versions of the API based on rules such as effective dates, and previous versions of services can be reinstated if required.
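
As a rough sketch, a caller could pin a request to an earlier version along these lines. The selection mechanism shown (a query parameter) is purely hypothetical; the real way to target a version or effective date is described in the documentation Spark generates for your service.

```python
import requests

ENDPOINT = "https://spark.example.com/mytenant/api/folders/pricing/services/quote/execute"

# "version" is a hypothetical selector used only for illustration;
# Spark's actual mechanism (e.g. effective-date rules) is defined in
# your service's generated documentation.
resp = requests.post(
    ENDPOINT,
    params={"version": "2023-01-01"},
    json={"inputs": {"loan_amount": 250000, "term_years": 30}},
    headers={"x-api-key": "your-api-key"},  # placeholder key and header
)
print(resp.json())
```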

 

Which Excel formulas does Spark support?​

Spark supports the vast majority of Excel's most commonly used functions, including the LAMBDA function, which spreadsheet authors can use to create custom, reusable functions within a workbook. A full list of the Excel functions Spark supports is available on request, as new functions are added over time.

 

Does Spark support VBA/Macros?​

  1. Spark has helped hundreds of companies migrate tens of thousands of spreadsheets to APIs. Many of those projects included migrating away from old, insecure VBA technology whose extended use Microsoft now actively discourages

  2. Addressing the VBA depends on the direction you are seeking to go:

  • In more than 50% of the cases Coherent has worked on with clients, the VBA has been made obsolete, either by Spark's out-of-the-box functionality or because the system that consumes the Spark-produced API has taken care of it

  • For some spreadsheets, companies just want to control the calculations and improve auditability. In these cases Spark Shell can be an effective, low-touch solution

  • For others, Coherent provides data connectivity and function modules that centralize the work done by the VBA code in a modern way

  3. For any remaining edge cases or advanced scenarios, our Field Engineering team can help you explore options. To date, we have not run into a scenario we could not accommodate.

How does Spark deal with formulas using/leveraging Excel Add-Ins?​

This is a more advanced use case - i.e. bringing in custom code that someone else has built - but we have done this before in a number of instances. Depending on how the add-in is built, Spark will either convert the custom code into a LAMBDA function or turn it into an API.

Note: such conversions need to be evaluated by Coherent first and may come with an additional cost

 

How does Spark cope with different versions of Microsoft Office / Excel?

Our objective is to match the same mechanics and behaviour as Excel, which extends to file compatibility, i.e. Spark will support your old file formats in the same way that Excel does.

 

What can Spark not do?​

Spark does not support certain other Excel features, such as:

  • Data Types, PivotTables, Microsoft Query or Microsoft Power Query connections​

  • Scenario Manager and Forecast sheets​

 

What is the maximum Excel file size that Spark will convert?​

Spark can support files in excess of 100MB, and once converted these files execute significantly faster than in Excel.

 

How does Spark handle errors/warnings in the model or input parameters?

  • Data validations that exist in Excel spreadsheets are processed in Spark to enable appropriate validation before data is accepted (a sketch of handling a validation failure follows this list)

  • Users can also define custom error messages that are driven using Excel formulas​

  • Spark captures model metadata to allow a deeper dive into the monitoring and management of data issues
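
As a sketch of how a calling application might surface those validations, consider the following. The "errors" field and all URL/header names here are hypothetical; the real shape of a validation failure is described in the documentation Spark generates for each service.

```python
import requests

ENDPOINT = "https://spark.example.com/mytenant/api/folders/pricing/services/quote/execute"

# Deliberately send an input that violates a data validation in the workbook
resp = requests.post(
    ENDPOINT,
    json={"inputs": {"term_years": -5}},
    headers={"x-api-key": "your-api-key"},  # placeholder key and header
)
body = resp.json()

# "errors" is a hypothetical response field shown only for illustration
if resp.status_code != 200 or body.get("errors"):
    print("Validation failed:", body.get("errors"))
else:
    print(body)
```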

 

What native connectors do you have & what capability exists to make Spark easy to connect to other platforms?​

Spark currently has the following native connectors: Salesforce and Snowflake; connectors for Android, iOS, JavaScript, .NET, Python and R are in development.

These connectors can easily be used to connect Spark to other platforms with the following benefits:​

  • Run Excel-based calculation logic in the target/connected application, especially complex logic that is not easy to do with simple UIs or scripting languages​

  • Enables business experts to define and own the business logic (in Excel) without needing developers with expertise in those target/connected applications (these developers can be difficult to find at scale/cheaply)​

  • SDKs improve the speed at which developers can leverage the power of Coherent Spark in a programming language they are familiar with

 

Can Spark connect to external data sources?​

Spark supports multiple methods for connecting with external data sources, such as:

  • Establishing an API connection between spreadsheet models and external data sources like Bloomberg (a sketch follows this list)

  • Using Excel files as a means of passing inputs to another Spark service
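
For example, a thin orchestration layer could fetch a value from an external source and pass it to a Spark service as a tagged input. Everything below (the feed URL, field names, endpoint and header) is a placeholder sketch, not Spark's actual connector API.

```python
import requests

# Hypothetical external data feed (standing in for a source like Bloomberg)
market_rate = requests.get("https://rates.example.com/usd").json()["rate"]

# Pass the fetched value to a Spark service as one of its tagged inputs
resp = requests.post(
    "https://spark.example.com/mytenant/api/folders/pricing/services/quote/execute",
    json={"inputs": {"market_rate": market_rate, "notional": 1000000}},
    headers={"x-api-key": "your-api-key"},  # placeholder key and header
)
print(resp.json())
```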

 

Does Spark support calculations linked across multiple spreadsheets, or must everything be stored in one spreadsheet?

Yes, Spark offers different methods to link spreadsheets depending on the use case, including calling external APIs/data sources, referencing tables/ranges in other Excel files, etc. When Spark services are connected together, each individual model follows the appropriate access rights, and the collective inputs and outputs are logged and audited just like any other Spark service.
