Xignite, Inc., a cloud-based market data distribution and management solutions provider for financial services and technology companies, announced a new Vendor of Record service for clients subscribing to real-time and delayed market data. The new service vastly simplifies the administration and reporting required by exchanges and often eliminates the need to pay redistribution fees, potentially saving clients thousands of dollars a month.
As an approved Vendor of Record, also called a Service Facilitator, Xignite can redistribute real-time and delayed equities and options pricing data from Nasdaq, New York Stock Exchange (NYSE), Options Price Reporting Authority (OPRA), OTC Markets (OTCM), and the Toronto Stock Exchange (TSX).
Adhering to the complex compliance guidelines required by exchanges is extremely difficult for investment advisers, financial advisers, or order management software providers that need to display real-time or delayed data. Each exchange has its own unique set of regulations and compliance requirements, and clients need to prove that they have control over who receives the data, in what format, and for what use case. Xignite’s Vendor of Record service eliminates the administrative burden of tracking these complex compliance requirements.
The new service utilizes Xignite’s cloud-native Entitlements and Usage Microservices to give firms complete control and transparency of their data consumption and usage. Xignite provides data entitlements, usage tracking, and exchange reporting across various data sets, users, and applications to ensure exchange compliance. Xignite’s new service sometimes eliminates the need to pay expensive redistribution fees. Exchange fees for display data, regardless of the number of users, can cost upwards of $10,000 per month. These high fees are especially difficult for smaller financial firms with just a few real-time data users.
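The mechanics behind an entitlements-and-usage service of this kind can be illustrated with a minimal, in-memory sketch. All names here are hypothetical; this is not Xignite's actual microservice API, just an illustration of entitlement checks, usage tracking, and the kind of per-user summary an exchange audit expects:

```python
from collections import defaultdict
from datetime import datetime, timezone

class EntitlementService:
    """Minimal in-memory sketch of entitlement checks and usage logging."""

    def __init__(self):
        self._grants = defaultdict(set)   # user_id -> set of (exchange, data_class)
        self._usage = []                  # audit trail used for exchange reporting

    def grant(self, user_id, exchange, data_class):
        """Entitle a user to one exchange's data class (e.g. real-time, delayed)."""
        self._grants[user_id].add((exchange, data_class))

    def is_entitled(self, user_id, exchange, data_class):
        return (exchange, data_class) in self._grants[user_id]

    def record_access(self, user_id, exchange, data_class):
        """Log every access attempt, allowed or not, then enforce the entitlement."""
        allowed = self.is_entitled(user_id, exchange, data_class)
        self._usage.append({
            "user": user_id, "exchange": exchange, "class": data_class,
            "allowed": allowed, "at": datetime.now(timezone.utc).isoformat(),
        })
        return allowed

    def monthly_report(self, exchange):
        """Per-user access counts: the summary shape exchange reporting asks for."""
        counts = defaultdict(int)
        for entry in self._usage:
            if entry["exchange"] == exchange and entry["allowed"]:
                counts[entry["user"]] += 1
        return dict(counts)
```

The point of the sketch is that entitlement, usage tracking, and reporting all hang off one audit trail, which is why centralizing them removes most of the administrative burden described above.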
“Maneuvering through the maze of required compliance policies, entitlements, usage tracking, and reporting requirements, and being subjected to frequent audits is no easy feat,” said Vijay Choudhary, Head of Product for Xignite. “Xignite’s mission is to ‘Make Market Data Easy.’ Today’s announcement is another step towards this. We are taking away the administrative burdens and complexity of licensing market data and allowing our clients the freedom to focus on their investment and trading strategies and building innovative products.”
Xignite’s Vendor of Record service is available for professional users with internal and display-only use cases. It is available now as an add-on for subscribers of Xignite’s real-time and delayed equities and options pricing data APIs.
Xignite has been disrupting the financial and market data industry from its Silicon Valley headquarters since 2003 when it introduced the first commercial REST API. Since then, Xignite has continually refined its technology to help Fintech and financial institutions get the most value from their data. Today, more than 700 clients access over 500 cloud-native APIs to build efficient and cost-effective enterprise data management solutions. Visit xignite.com or follow on Twitter @xignite.
In the global metals market and the world of international rates, currencies play the crucial role of medium of exchange in the transactions that take place.
Currencies such as the United States dollar, the Euro, and the British Pound are commonly used around the world to quote metal rates. Some companies that offer live and historical precious metal rates have exposed their APIs (Application Programming Interfaces) to allow developers to integrate current and historical metal rates, currency conversion, and other capabilities into their applications.
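As a rough illustration of the arithmetic such APIs perform behind the scenes, here is a small sketch. Function names are hypothetical; the troy-ounce-to-gram factor is the standard one:

```python
def convert_metal_rate(price: float, fx_rate: float) -> float:
    """Convert a metal price from one currency to another via an FX rate.

    fx_rate is quoted as target currency per one unit of source currency.
    """
    return price * fx_rate

def usd_per_ounce_to_eur_per_gram(usd_per_ounce: float, eur_per_usd: float,
                                  grams_per_troy_ounce: float = 31.1034768) -> float:
    """Re-quote a USD-per-troy-ounce gold price as EUR per gram."""
    eur_per_ounce = usd_per_ounce * eur_per_usd
    return eur_per_ounce / grams_per_troy_ounce
```

For example, gold at 1,800 USD per troy ounce with an exchange rate of 0.90 EUR per USD converts to 1,620 EUR per ounce; the API's job is supplying the live rates that feed this arithmetic.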
If you want to find live and historical precious metals rates, there are many APIs available online, and if you want to try one, Barchart is going to be one of your first options. But if you take a look at what else is in the market, you’ll find many great alternatives:
Xignite Market Data as a Service was one of the first market data services built to run in AWS, and Xignite is one of the few vendors that is an AWS Advanced Technology Partner with a Financial Services Competency.
With more than a decade of cloud expertise in building, scaling and operating cloud-based market data technology, it is no surprise that leading financial services and capital markets firms rely on this company to empower their journey to the cloud. Their Metals API Service offers real-time prices and quotes for metals including Gold, Silver, Palladium, Platinum and other base metals. In addition to real-time precious metals prices, the service provides daily London Fixing prices as well as historical precious metal prices and metal news.
Xignite Cloud APIs are sourced from leading providers such as FactSet and Morningstar as well as Xignite’s own curated, high-quality data.
Read the article Top 3 Alternatives for Barchart Precious Metals Rates
Each year, Bobsguide asks the market to vote for fintech companies they believe stand out from the competition – those who have gone the extra mile in terms of development and servicing their clients. Xignite is proud to be listed as the "Best API Management" vendor on the Bobsguide 2020 Rankings.
Web services data provider Xignite captured the AFTAs judges’ attention on the infrastructure front with its release of Xignite Enterprise Microservices in July 2020, a suite of cloud-based microservices for data management, storage and distribution, designed to help financial firms migrate from monolithic legacy data architectures to more agile and less expensive cloud services and data sources.
A subscription is required to read the article on WatersTechnology.
Xignite, Inc., a provider of market data distribution and management solutions for financial services and technology companies, today revealed the results of its collaboration with StockCharts, a leading technical analysis and financial charting platform for online retail investors. The collaboration involved a move from an on-premise market data provider to Xignite’s cloud-native technology hosted in Amazon Web Services (AWS). Download the case study containing the full results.
StockCharts requires vast quantities of financial data to power its visualization, charting and tracking tools, which investors use to analyze the markets and inform investment decisions. The company was frustrated by the limits of its on-premise market data center, which forced the team to make architectural decisions based on what the data center could handle in terms of speed and storage rather than on what its technology required. Its previous market data provider was just starting to build out cloud offerings, but those were far from what the business required. StockCharts decided to migrate its infrastructure to the AWS cloud and partner with Xignite to gain access to endlessly scalable market and financial data delivered through innovative cloud APIs.
The collaboration made an immediate impact as StockCharts was able to expand its offerings and customer base by pursuing growth strategies enabled by Xignite’s cloud-based approach, which provides easy access to data and eliminates architectural limits on storage and speed.
The pandemic provided further validation. Seattle-based StockCharts was in one of the first areas hit by COVID-19 and was forced to quickly shut down its office. Pandemic-driven market volatility followed and StockCharts customers wanted to visualize what was happening. The company’s ability to scale quickly and accommodate a high volume of new requests would not have been possible without Xignite.
“The move to the AWS cloud and Xignite has unlocked tremendous new potential for us in a lot of architectural ways, and has given us a lot of data options that we could not even consider before,” said Grayson Roze, Vice President of Operations at StockCharts. “It relieved us of the burden of figuring out how to source things. Instead, we know exactly where we need to go to get the data and can access it instantly. That is a huge, huge benefit for our business.”
“We are proud to have played a role in transforming how StockCharts approaches data,” said Stephane Dubois, CEO and Founder of Xignite. “The events of this year unleashed a massive spike in retail trading and a host of other unexpected forces that reinforced the need for financial services firms to leverage the cloud. Despite the disruption of this year, StockCharts was positioned for success, and we look forward to continuing to deliver our financial and market data solutions to the industry at large.”
Xignite, Inc., a provider of cloud-based market data distribution and management solutions for financial services and technology companies, today announced that its Market Data Management-as-a-Service solution has been named “Best New Technology Introduced over the last 12 months – Infrastructure” at the 2020 WatersTechnology American Financial Technology Awards (AFTAs). Selected by the editors of WatersTechnology, the AFTAs recognize excellence in the deployment and management of financial technology within the asset management and investment banking communities.
Xignite’s Market Data Management-as-a-Service (MDMaaS) solution enables buy- and sell-side firms to centralize the management of vendor data feeds into their own cloud environment. The solution is built around the cloud microservice-based architecture and technology stack Xignite has been refining and scaling for more than 10 years. Xignite’s technology platform has been the backbone of the company’s Data-as-a-Service business, supporting 12 billion API requests for financial data daily across its 700 fintech and financial services clients. Now Xignite is leveraging this battle-tested cloud-native data management architecture to offer buy- and sell-side firms a vendor-agnostic market data offering, with connectors available for firms to load data they license from numerous market data providers.
The MDMaaS solution includes a suite of loosely-coupled modules that enable market data user firms to control their data usage, automate entitlements, optimize their data spend and minimize liabilities by simplifying data governance and ensuring regulatory compliance.
The functionality is delivered via microservices, an architectural approach in which core functionality is handled by loosely coupled, independently deployable modules that can work together or separately. Microservices architecture stands in stark contrast to monolithic platforms that require expensive on-premises technology, which is especially hard to maintain in the context of a pandemic.
The MDMaaS microservice-delivered modules introduced in 2020 include:
Xignite Entitlements and Usage - Manage the entitlement of vendor data to users and applications to ensure compliance and eliminate excess spend.
Xignite Optimization - Streamline data consumption to avoid duplicated vendor requests, leverage cached bulk data and get recommendations to reduce data costs.
Xignite Data Lake - Centralize, catalog and connect data shapes to enable frictionless integration by consumers via unified cloud APIs.
Xignite Reference - Aggregate, normalize, store and index vendor reference data to centralize enterprise-wide access.
Xignite Historical - Provide centralized access to normalized, stitched and adjusted historical data via cloud APIs.
Xignite Real-Time - Distribute real-time vendor data via cloud APIs, eliminating on-premise infrastructure.
Xignite Fundamentals - Make simple and complex time-series data structures available via cloud APIs.
“Xignite has pioneered market data in the cloud for more than 10 years now, so we are very excited to announce – and be recognized for – our Market Data Management-as-a-Service solution,” said Stephane Dubois, CEO and founder of Xignite. “The pandemic has reinforced the need for financial services firms to migrate to the cloud as a means of navigating disruption and enabling scalability, among other benefits. We are proud to spearhead that effort and help the industry modernize its approach to financial and market data.”
Data, data management, the cloud, it's all here in this edition of Forecast 2021. Learn what emerging trends in data are in store for the capital markets. Here you'll read analysis from industry pros at Crux Informatics, Knoema, MayStreet, S&P Global Market Intelligence and Xignite in this article by Sam Belden of Forefront Communications.
Read the article on TABB Forum
Interview with Xignite:
Have the events of 2020 offered any lessons as to what market participants want in terms of data?
2020 brought several lessons. Financial retail markets have gone crazy and consumer demand has put significant pressure on the whole market data infrastructure. Many data providers and brokers experienced issues during periods of heavy market activity, showing that non-elastic, non-cloud-native infrastructures have a hard time dealing with these periods of peak activity. In addition, market data quality is more important than ever, as many brokers were impacted by issues with the AAPL and TSLA splits.
The cloud has obviously fundamentally changed the way data is created and consumed. What’s the next big thing for data providers who leverage the cloud?
The cloud was at first a way to deliver data easily and efficiently, while the consumers and consuming apps remained mostly off cloud. Now most of the applications consuming market data are becoming cloud-native themselves, so delivery directly into those cloud-native applications will be the next big thing.
Aside from the cloud, what are the most important data trends to keep an eye on in the years ahead?
Markets are fueled by millennials and millennials care about the environment and social causes. This is driving demand for ESG data. In addition, COVID has beaten the hell out of quant models, which were never designed or tuned for major events like this. Expect a reinvention of machine trading in the years to come.
The Department of Justice has officially sued Visa to block its $5.3 billion acquisition of Plaid — and the fintech world is scrutinizing what this might mean for the industry.
Business Insider spoke with Xignite's CEO and Founder Stephane Dubois, and other legal and industry experts on how they see the DoJ's lawsuit shaking out — and what this means for the fintech world.
If the Justice Department wins in court, the merger could be scuttled
Stephane Dubois, the CEO of financial data provider Xignite, thinks the DOJ's decision to sue suggests it probably has a solid legal basis for its allegations.
Unless Visa, which has been represented by powerhouse law firm Skadden in connection with the deal, can fight the DOJ's lawsuit on legal grounds and successfully argue that the government's case is too speculative and that the deal is not anticompetitive, he doesn't think the acquisition will go through.
Otherwise, Visa would need to comply with conditions set by the DOJ, such as lowering credit card fees or breaking up parts of its business, to address the competition concerns. But he's not sure Visa would be willing to do that.
Dubois said such a lawsuit could be a "cold shower" for fintechs that are considering mergers and acquisitions given the massive $200 million Plaid paid for its API competitor, Quovo, in January 2019, not to mention the $5.3 billion price tag of Visa's acquisition of Plaid.
The DOJ's lawsuit could fail and Visa's acquisition could go through, but with diverging possible outcomes for Plaid and other fintechs
Dubois sees several possible outcomes playing out should the DOJ's lawsuit fail. The acquisition would go through and Visa could continue to make Plaid available to fintechs, but in a way that it doesn't "cannibalize" its own business — for example, by charging 3% fees to competitor services that Plaid enables.
It's also possible that Visa shuts down Plaid after a successful acquisition, essentially squashing competition for the market, something Dubois called a "worst case scenario."
Yugabyte, the leader in open-source distributed SQL databases, today announced that market data distribution and management solutions provider Xignite has selected YugabyteDB as its database of choice to power its cloud-native financial data distribution and management solutions. Xignite selected Yugabyte’s distributed SQL database based on YugabyteDB’s high performance, on-demand scalability, and operational ease.
“Due to the nature of our business, performance and scalability are the two most important factors we look for in a database solution,” said Dr. Qin Yu, VP of Engineering, Xignite. “Financial data is ever-changing and we need to capitalize on that data to give our customers the most accurate, real-time view of the markets. The performance and scalability of YugabyteDB allow us to provide granular data in real time to our high-profile clientele, and the Yugabyte Platform greatly simplifies operations and management. In addition, we have come to rely on Yugabyte as a key partner, providing us with a best-in-class distributed SQL platform and support.”
Xignite provides customers with a scalable way to manage, control, and optimize real-time and reference data across traditional systems and cloud applications. It does this through its cloud-native market data platform that unifies financial data consumption and market data management—delivering clients a real-time view of market activity as a service via the cloud. However, serving financial services and fintech customers like Robinhood, SoFi, Investopedia, and BlackRock requires scaling as their data requirements change and grow, while still providing the high availability and high performance they need and expect.
“When you’re building a leading market data management platform like Xignite, data accuracy and availability are absolutely imperative,” said Karthik Ranganathan, CTO and Co-Founder, Yugabyte. “Making sure customers have always-on access to real-time and reference data in a market with high, and continuously growing, volumes, sources, and types of data puts extensive demands on the scalability and performance of a database and the teams that support it. We are thrilled to be a partner to Xignite, eliminating their database pain points and enabling the Xignite team to invest more time and money in building new features for their customers.”
As Xignite’s business grows, so does the amount and granularity of data, creating the need to quickly scale the database tier. Scaling Microsoft SQL Server on AWS with Amazon RDS was very challenging, requiring manual partitioning of data at the application layer, which was time-consuming and increased complexity. After trying MySQL and considering NoSQL solutions, Xignite turned to Yugabyte to address its need for a database provider that could easily scale on demand, future-proofing the company for continued growth. Yugabyte has seamlessly handled Xignite’s performance requirements for both reads and writes, and enabled the company to add capacity and scale quickly, with operational ease and no downtime.
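To see why application-layer partitioning is painful, consider a minimal sketch of the hash-routing logic an application must maintain when the database cannot shard itself. The shard names and routing scheme here are hypothetical illustrations, not Xignite's actual code:

```python
import hashlib

# The application, not the database, owns this list and must keep it in sync
# with the physical shards that actually exist.
SHARDS = ["shard_0", "shard_1", "shard_2", "shard_3"]

def shard_for(symbol: str) -> str:
    """Route a ticker symbol to a shard by hashing its name.

    md5 is used only as a stable, well-distributed hash, not for security.
    """
    digest = hashlib.md5(symbol.encode()).hexdigest()
    return SHARDS[int(digest, 16) % len(SHARDS)]
```

The weakness is the modulus: adding a fifth shard changes `len(SHARDS)`, so most symbols remap and their data must be migrated by hand. Distributed SQL databases move this sharding and rebalancing below the SQL layer, which is the "scale on demand" property described above.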
Moving to YugabyteDB has enabled Xignite to scale to more than 11 terabytes of data, unlock new use cases that would not have been possible with the older technology stack, and achieve an overall cost savings of approximately 50% compared to SQL Server.
For further information on Xignite’s work with Yugabyte visit www.yugabyte.com/success-stories/xignite/
Yugabyte is the company behind YugabyteDB, the open source, high-performance distributed SQL database for building global, internet-scale applications. YugabyteDB serves business-critical applications with SQL query flexibility, high performance and cloud native agility, thus allowing enterprises to focus on business growth instead of complex data infrastructure management. It is trusted by companies in cybersecurity, financial markets, IoT, retail and e-commerce verticals. Founded in 2016 by former Facebook and Oracle engineers, Yugabyte is backed by Lightspeed Venture Partners and Dell Technologies Capital. www.yugabyte.com.
By Mike O’Hara, Special Correspondent
Cloud-delivered market data was once ‘over my dead body’ territory for institutional market data managers, who understandably fretted aloud about performance, security and licence compliance issues. But Covid-19 has forced those same data managers to confront the fact that many of their professional market data users are able to work from home (WFH), in turn driving financial firms to question whether the pandemic could be the catalyst for a rethink of their expensive-to-maintain market data infrastructures, with cloud part of the data delivery solution.
For many financial firms, today’s cloud delivery and hosting capabilities offer a viable solution for supporting trading and investment teams and their support staff, accelerating demand for cloud-based market data delivery infrastructures. The thinking is that cloud may help firms with their broader aim of reducing their on-premises technology and equipment footprint, a trend that was emerging even before the Coronavirus struck.
But embracing cloud delivery introduces new challenges for market data and trading technology professionals. While WFH will doubtless continue in some form, it’s far from clear that all market data delivery can be migrated to the cloud. Essential market data functions will remain on-premise. High-performance trading applications and low-latency market data connectivity, for example, will continue to rely on state-of-the-art colocation and proximity hosting data centres.
For many financial institutions, the challenge will be how to manage these several tiers of market data delivery and consumption. Going forward, practitioners will face a three-way hybrid of on-premises, cloud-based (private/public) and collocated market data services in order to support a range of users: from work-from-home traders and support staff to trading-room-based traders, analysts and quants, to collocated electronic applications like algorithms, smart order routers and FIX engines.
Indeed, A-Team will be discussing the infrastructure, connectivity and market data delivery challenges associated with cloud adoption in a webinar panel session on November 3. The webinar will offer a ‘reality check’ that discusses best practices for embracing cloud, colo and on-prem to support this new mix of user types, with emphasis on capacity, orchestration, licensing, entitlements and system / usage monitoring.
With firms’ appetite for exploring the potential of the cloud piqued, data managers are now looking at whether they can hope to take advantage of some of the more widely recognised benefits of the cloud – flexibility, agility, speed-to-market, scalability, elasticity, interoperability and so on – as they grapple with the future market data delivery landscape.
“Market data infrastructure, in terms of data vendor contracts, servers, and data centre space, typically represents a large, lumpy, cap ex expenditure”, says independent consultant Nick Morrison. “And so having the ability to transition that to something with costs that are more elastic, is highly attractive”.
Of course, every firm has its own unique requirements and nuances in this regard. Proprietary trading firms, asset managers, hedge funds, brokers and investment banks are all heavy consumers of market data. But the volume, breadth, depth and speed of the data they need in order to operate is highly diverse. Which means that there is no ‘one size fits all’ when it comes to sourcing and distribution mechanisms (including the cloud).
Market data and the cloud – what’s applicable?
As they consider their options for including cloud in their overall data delivery plans, data managers need to assess whether and how specific data types could be migrated to a hybrid environment: Level 1 (best bid/offer), level 2 (order book with aggregated depth at each price level) or level 3 (full order book)? Historic, end of day, delayed or real-time? Streaming or on-demand? This all has a bearing on the feasibility of cloud as a delivery mechanism.
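The difference between these levels can be made concrete with a small sketch. Prices, sizes, and field names are hypothetical; the structure follows the definitions above, with level 2 as aggregated depth per price and level 1 as its top of book:

```python
from dataclasses import dataclass

@dataclass
class Level1:
    """Level 1: best bid/offer only."""
    bid: float
    bid_size: int
    ask: float
    ask_size: int

# Level 2: aggregated depth at each price level, best price first.
book_l2 = {
    "bids": [(100.10, 500), (100.05, 1200)],   # (price, aggregate size)
    "asks": [(100.12, 300), (100.15, 900)],
}

def top_of_book(l2: dict) -> Level1:
    """Collapse a level 2 book into its level 1 best bid/offer."""
    bid_px, bid_sz = l2["bids"][0]
    ask_px, ask_sz = l2["asks"][0]
    return Level1(bid_px, bid_sz, ask_px, ask_sz)
```

Level 3 would go one step further, keeping every individual order at each price rather than the aggregate, which is why its volume, and thus its cloud-delivery feasibility, differs so sharply from level 1.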
Firms also need to consider what mix of public and private cloud, or which hybrid cloud solution, best fits their needs. What about virtualisation? Or internal use of cloud architecture, such as building a market data infrastructure around microservices and containers?
The marketplace already has identified at least one workable use-case: the use of historical, tick or time-series market data, usually to drive some form of analytics. A growing number of trading venues (such as ICE and CME) and service providers (Refinitiv, BMLL and others) now offer full level 3 tick data on a T+1 basis, delivered via the cloud. Plenty more providers can offer historic level 1 & 2 data.
This kind of capability can be used for critical use-cases, such as back-testing trading models for signal generation and alpha capture, performing transaction cost analysis (TCA), developing and testing smart order routers (SORs), or fine-tuning trading algos to better source liquidity. In all of these cases, cloud-hosted historical tick databases can reduce on-premises footprint and cost, while offering flexible access to vast computing resource on demand, and many are finding this compelling. “When churning through such vast quantities of data, having access to a cloud environment enables you to scale up horizontally to process that data”, says Elliot Banks, Chief Product Officer at BMLL.
Where things start to get more complicated, though, is with real-time market data, where two of the biggest hurdles from a cloud delivery perspective are speed and complexity.
From a trading standpoint, speed is always going to be a significant factor. Nobody, regardless of whether they’re an ultra-low latency high-frequency trading firm or a human trader dealing from a vendor or broker screen, wants to trade on stale prices. The tolerances may be different but the principle applies across the board.
It’s a safe bet that any firm currently receiving market data directly from a trading venue into a trading server (collocated at the venue’s data centre or hosted at a specialized proximity hosting centre operated by the likes of Interxion) relies on deterministic low latency, and is therefore unlikely to consider cloud as an alternative delivery mechanism.
Clearly, HFT firms with trading platforms that require microsecond-level data delivery won’t be replacing their direct exchange feeds and often hardware-accelerated infrastructure with the cloud, as the performance just isn’t there, for now at least. This, of course, could change if and when the trading venues themselves migrate to cloud platforms, creating a new kind of colocation environment, but that’s likely some way off. “But these guys only have a few applications that really need ultra-low latency data”, says Bill Fenick, VP Enterprise at Interxion. “Most of their applications, be they middle office, settlements or risk, they’re perfectly happy to take low-millisecond latency”.
And what about other market participants? Particularly those that currently make use of consolidated feeds from market data vendors, where speed is perhaps a secondary consideration? This is where cloud delivery may have some real potential. But it’s also where the issue of complexity rears its head.
Navigating the complexity
To deal with the myriad of sources, delivery frequencies, formats and vendor connections used to feed real-time market data into their trading, risk, pricing and analytics systems, many financial firms have built up a complex mesh of infrastructure that ensures the right data gets delivered to the right place at the right time. The integration layer required to handle these data inputs may be delivered as part of the data service or may stand alone as a discrete entity. In either case, it’s unrealistic to expect that all of this infrastructure can just be stripped out and replicated in a cloud environment.
To address this challenge, some service providers are starting to offer solutions where the source of the data is decoupled from the distribution mechanism, aiming for the holy grail where either, or both, can be cloud-based.
By building individual cloud-hosted microservices for sourcing market data, processing that data in a variety of ways, and delivering it into end-user applications, such solutions can help firms migrate their market data infrastructure incrementally from legacy to cloud-based platforms. Refinitiv is starting to shift much of its infrastructure onto AWS, and other specialist cloud-centric vendors such as Xignite and BCC Group also enable internal systems to be decoupled from data sources, thus facilitating a shift towards cloud-based infrastructure. “We believe the customer should be able to easily move from source to source and get as many sources as they want. The cloud enables this kind of flexibility”, says Bill Bierds, President & Chief Business Development Officer at BCC Group.
Firms have long wanted to become more vendor-agnostic by decoupling their data integration capability from the primary data source. One investment bank in London, for example, was able to decouple Refinitiv’s TREP platform from its Elektron data feed and switch to Bloomberg’s B-Pipe for its data, delivered via the TREP framework. From a market data perspective, this has given the bank more negotiating power and less vendor lock-in, opening up greater opportunities to utilise cloud-based market data sources in the future.
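The decoupling pattern described here amounts to programming against a vendor-neutral interface rather than against a specific feed. A minimal sketch, with hypothetical vendor names and stubbed prices standing in for real feed calls:

```python
from abc import ABC, abstractmethod

class QuoteSource(ABC):
    """Vendor-neutral interface; downstream apps depend on this, not on a vendor."""

    @abstractmethod
    def last_price(self, symbol: str) -> float:
        ...

class VendorAFeed(QuoteSource):
    """Hypothetical adapter for vendor A; a real one would call that vendor's API."""
    def last_price(self, symbol: str) -> float:
        return 101.5  # stubbed price for illustration

class VendorBFeed(QuoteSource):
    """Hypothetical adapter for vendor B, exposing the same interface."""
    def last_price(self, symbol: str) -> float:
        return 101.5  # stubbed price for illustration

def mid_office_report(source: QuoteSource, symbols: list) -> dict:
    # Downstream code is unchanged when the vendor behind `source` is swapped.
    return {s: source.last_price(s) for s in symbols}
```

Swapping `VendorAFeed` for `VendorBFeed` touches only the adapter, which is the source of the negotiating power and reduced lock-in described above.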
Permissioning and entitlements
Perhaps one of the toughest challenges that firms face around real-time market data on the cloud is that of entitlements and usage authorisation. Firms sourcing data from the two main data vendors, Refinitiv and Bloomberg, will generally be tied into their respective DACS and EMRS entitlements systems, often augmented by data inventory and contract management platforms like MDSL’s MDM or TRG Screen’s FITS and InfoMatch.
Entitlements can be a thorny subject when it comes to cloud-based distribution of market data. Firms are wary of falling foul of their licence agreements with their various data vendors, all of whom have different commercial considerations and penalties for non-compliance. This is why accurate tracking and reporting of market data access and usage is crucial.
The cloud can be a double-edged sword in this regard. On the one hand, transitioning from a dedicated infrastructure to the cloud might trigger extra licensing costs for what is effectively an additional data centre, so firms may need to go through a period of paying twice for the same data. Indeed, firms may already be facing this situation as they entitle staff to operate from home while holding enterprise licences covering only their headquarters and regional offices.
On the other hand, cloud-based services such as those offered by Xignite and others can make it easier for firms to manage entitlements across multiple data vendors from a central source via a UI. “Our entitlements microservice is integrated with our real time microservice, to make sure that any distribution and any consumption of data is authenticated and entitled properly, so that only the right users have access to the data,” says Stephane Dubois, CEO of Xignite, whose microservices suite is supporting NICE Actimize’s cloud-based market data delivery infrastructure.
With new products, services and technologies emerging all the time, firms can be optimistic about the growing opportunities that the cloud can offer for managing market data. One particularly interesting development worth watching is the rise of Low Code Application Platforms (LCAPs), such as that offered by Genesis, which provides a cloud-based microservices framework that can be used for rapidly developing and delivering applications around real-time market data. One example is on-demand margining. “A prime broker can link to all of its customers and know exactly what their risk positions are based on real-time market data, so within minutes, they can be sending out margin calls”, says Felipe Oliviera, Head of Sales and Marketing at Genesis.
Industry behemoths such as Refinitiv, SIX and FactSet are also embracing the cloud. Refinitiv has now launched delivery of market data via AWS, is making its tick history data available on Google Cloud and has also recently announced a partnership with Microsoft Azure. FactSet has launched a cloud-based ticker plant on Amazon EC2. And SIX is partnering with Xignite for real-time market data delivery via the cloud. Bloomberg is also partnering with AWS to make its B-Pipe data feed available through the cloud. And the main cloud vendors themselves – Amazon, Google and Microsoft – have established dedicated teams to develop these markets.
In conclusion, it’s clear that there are a number of challenges that firms still face when transitioning any part of their market data infrastructure to the cloud. (To register for A-Team’s free webinar on the topic, click here.) And in many cases, particularly where ultra-low latency is required, cloud is not the answer. But equally, by migrating certain elements of their market data infrastructure to the cloud, cost savings can be achieved, efficiencies can be gained and firms can potentially do more with less.