In this regular column, we’ll bring you all the latest industry news centered around our main topics of focus: big data, data science, machine learning, AI, and deep learning. Our industry is constantly accelerating, with new products and services announced every day. Fortunately, we’re in close touch with vendors from this vast ecosystem, so we’re in a unique position to inform you about all that’s new and exciting. Our massive industry database is growing all the time, so stay tuned for the latest news items describing technology that may make you and your organization more competitive.
A Product for ML-based Customer Behavior Prediction and Personalization
Provectus, a Silicon Valley artificial intelligence (AI) consultancy, has launched a new product, Crystal Engine, designed to solve the problem of customer behavior prediction and personalization for startups, SMBs, and enterprise-scale businesses. The product is now available in its closed beta version. Crystal Engine provides an easy-to-use interface for non-technical users, enabling them to make customer predictions without the need for coding or installation. In addition to the simplicity of the product, Provectus emphasizes explainability, empowering businesses with insights into customer behavior to help them prove or disprove hypotheses, adjust products, and develop marketing strategies.
“Finding the right balance between versatility and flexibility of an AI solution is always a challenge,” said Stepan Pushkarev, CEO of Provectus. “We strive to bring maximum value to our partners, so we created Crystal as a perfect fit to address two needs at the same time — minimizing time to market and meeting flexibility requirements for each particular business and product we work with. Our partners can benefit by gaining maximum accuracy of ML models while preserving ownership and security of their data and maintaining low TCO.”
NICE Announces New AI-Powered Robotic Process Automation, Accelerating Organizations’ Digital-First CX Strategy
NICE (Nasdaq: NICE) introduced new AI-powered capabilities that enable organizations to maximize the benefits of Robotic Process Automation (RPA) for their business. Included in version 7.6, NICE RPA’s new capabilities include document digitization, ROI-based recommendation of ideal processes to automate, and a complimentary resource center with ready-made low-code/no-code resources for sharing. In addition to reducing process analysis time and automating manual tasks, the innovative new capabilities also help organizations boost ROI and maximize the value of automation projects for the business.
“The digital age is powering productivity, improving service experiences, and accelerating ROI,” said Barry Cooper, President, NICE Workforce and Customer Experience Group. “By digitizing processes and prioritizing automations that drive maximum business value, our latest RPA capabilities are accelerating the path to a digital-first strategy.”
VAST Data Plays the End Game with Enterprise Hard Drives
VAST Data, the storage software company breaking decades-old tradeoffs, announced that it has doubled the storage density of the hardware platforms supported by its Universal Storage offering, giving customers an even more cost-effective and power-efficient all-flash data platform for enterprise and cloud data centers. Based on Intel’s 30 terabyte (TB) quad-level cell (QLC) solid state drives (SSDs), VAST-supported NVMe enclosures double data center density to over a petabyte (PB) of effective capacity per rack unit (RU). VAST’s approach to data management and its industry-leading Disaggregated Shared Everything (DASE) architecture enable enterprise customers to deploy cutting-edge hyperscale hardware, unlocking greater power efficiency, greater physical density, and unprecedented flash capacity per total cost of acquisition.
“With this announcement, we are eliminating all of the arguments for HDD-based infrastructure and making it even easier for customers to reach the all-silicon data center destination we first charted back in 2018,” said VAST Co-Founder Jeff Denworth. “Since our inception, we’ve set out to change the economics of all-flash infrastructure ownership so that companies can confidently and efficiently scale capacity and performance as their datasets grow. We’ve optimized Universal Storage to deliver the performance, capacity and cost profile in a single solution for all data.”
DataRobot Core Unveiled, Complete with Capabilities for the Expert Data Scientist
DataRobot announced DataRobot Core, a comprehensive offering that broadens its AI Cloud platform for code-first data science experts. DataRobot also announced its latest platform release, extending the capabilities of AI Cloud for all users with broader and more sophisticated analytical capabilities for data scientists, enhanced decision intelligence, and new features to manage and scale operations in production.
“For organizations today, translating data and AI into tangible outcomes is critical in order to remain competitive and thrive,” said Nenshad Bardoliwalla, Chief Product Officer at DataRobot. “DataRobot Core and 7.3 are designed to meet increasing demand and scale, and empower the largest number of AI creators, from code-centric data science teams to business analysts and decision makers, to experiment fast and collaborate effectively on the same platform. Together, these solutions provide the much-needed flexibility, speed and control that brings trustworthy AI solutions to life for every organization.”
Neural Network 2.0: A Game Changer
Uniquify, a Silicon Valley neural network technology and AI edge computing company, is introducing a transformative neural network fabric technology at CES 2022. Today’s neural network technology utilizes multiply-accumulate (MAC)-based operations to realize visual, audio, data, and natural language processing AI models. Uniquify’s Neural Network 2.0 technology shrinks the neuron in the neural network by replacing MAC operations with proprietary AI processing element (AIPE) technology. The area and cost reduction from AIPE technology enables AI edge computing anywhere and everywhere, including consumer markets, data centers and the cloud, enterprises, manufacturing, health and medicine, and government applications.
“Neural network technology is a new paradigm and complements the conventional software technology that has permeated all aspects of our lives for the last several decades. The expensive MAC hardware used to implement advanced but bulky neural network models often inhibits many edge computing application markets from realizing their full potential,” says Josh Lee, CEO of Uniquify. “Our Neural Network 2.0 technology shrinks the neuron in neural networks using AIPE technology to implement the most complex and advanced AI visual, audio, and natural language processing models using a fraction of real estate required by MAC-based neural networks. It’s a game changer in the AI edge computing arena.”
Hasura Innovations Enable Transition from REST to GraphQL APIs to Overcome Increasing Data Silo Complexity and Accelerate Software to Market
GraphQL innovation leader Hasura announced a new Data Hub, bi-directional REST API Connectors and support for Google Cloud, further reducing the time needed to ship software and providing easy onramps to GraphQL for organizations of all types. These innovations build atop existing industry-first capabilities including full-stack application previews and cross-database joins to enable companies to make more efficient, effective and impactful use of their data and services. Hasura has been downloaded more than 400 million times since its introduction in 2018.
“The new innovations announced today further enhance Hasura’s ability to make data access self-serve for teams and organizations of all sizes, regardless of where the data resides, reducing development time and accelerating software delivery to users,” said Tanmai Gopal, CEO of Hasura. “It eliminates data silos and provides much-needed security, lowering the bar to universal GraphQL API use and modernizing the developer experience for applications of all types.”
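For readers new to GraphQL, the mechanics of what Hasura exposes are simple: a GraphQL API is just an HTTP POST carrying a JSON body with a `query` string. The sketch below builds such a request with the Python standard library; the endpoint URL, admin secret, and `users` table are hypothetical placeholders, though the `/v1/graphql` path and `x-hasura-admin-secret` header follow Hasura’s documented conventions.

```python
import json
import urllib.request

# A GraphQL-over-HTTP call is a POST whose JSON body holds a "query" string
# plus optional "variables". Hasura auto-generates query fields from tracked
# tables; the "users" table below is a hypothetical example.
query = """
query RecentUsers($limit: Int!) {
  users(order_by: {created_at: desc}, limit: $limit) {
    id
    name
  }
}
"""

payload = json.dumps({"query": query, "variables": {"limit": 5}}).encode("utf-8")

# Endpoint and secret are placeholders for a real Hasura deployment.
req = urllib.request.Request(
    "https://my-hasura.example.com/v1/graphql",  # hypothetical endpoint
    data=payload,
    headers={
        "Content-Type": "application/json",
        "x-hasura-admin-secret": "<secret>",
    },
)
# urllib.request.urlopen(req) would execute the query against a live endpoint.
```

The same payload shape works from any HTTP client, which is what makes the REST-to-GraphQL transition incremental rather than all-or-nothing.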
Sisense Unveils Sisense Notebooks for Advanced Analysis Using SQL & Python
Sisense, a leading AI-driven platform for infusing analytics everywhere, announced Sisense Notebooks, a new code-first functionality within Sisense Fusion Analytics that empowers data analysts with the tools they need to conduct advanced analysis using Structured Query Language (SQL) and Python.
“The market is full of tools that offer fragmented workflows and manual procedures, which compromise productivity, accuracy, and security and slow down the time-to-insight for anyone looking to make a data-driven decision at the organization,” said Ashley Kramer, Chief Product and Marketing Officer at Sisense. “With Notebooks, we’re taking a code-first approach to help infuse insights and scale the decision making across the enterprise, creating a powerful partnership between business users and analysts.”
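The workflow Sisense is targeting — SQL for the heavy aggregation, Python for the follow-on analysis — looks roughly like the generic notebook sketch below. This uses toy in-memory data and the standard library, not Sisense’s own APIs, purely to illustrate the SQL-then-Python pattern.

```python
import sqlite3
import statistics

# Notebook-style workflow: pull an aggregate with SQL, then refine it in
# Python. Generic sketch with toy data; not Sisense's API.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?)",
    [("east", 120.0), ("east", 80.0), ("west", 200.0), ("west", 40.0)],
)

# Step 1: SQL does the grouping and aggregation.
rows = conn.execute(
    "SELECT region, SUM(amount) FROM orders GROUP BY region ORDER BY region"
).fetchall()

# Step 2: Python handles analysis the SQL dialect makes awkward.
totals = {region: total for region, total in rows}
spread = statistics.pstdev(totals.values())

print(totals)  # {'east': 200.0, 'west': 240.0}
print(spread)  # 20.0
```

Keeping both steps in one notebook is what removes the fragmented, copy-between-tools workflow the announcement describes.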
Open Source Tamper-Proof Database immudb Now Capable of Serving as Main Transactional Database
Now, with full transactional support for everyday business applications, the open source immudb tamper-proof database can serve as an enterprise’s main transactional database. The first tamper-proof database, immudb 1.2 adds the ability to roll back changes and to expire data. Unlike other databases, immudb is built on a zero-trust model: history is preserved and can’t be changed. Data in immudb comes with cryptographic verification at every transaction to ensure that tampering is impossible. immudb can be deployed in cluster configurations for demanding applications that require high availability and high scalability, up to billions of transactions per day. Support for Amazon’s S3 storage cloud ensures immudb will never run out of disk space.
“There is no need to have immudb running next to a traditional database anymore, as immudb now has full ACID transactional integrity compliance,” said Jerónimo Irázabal, co-founder of immudb and lead architect at Codenotary. The company is the primary contributor to the open source project. “immudb provides full integrity of data, as well as compatibility with SQL and key/value making it possible to move data to immudb without having to make changes to applications.”
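The core idea behind cryptographic verification at every transaction can be illustrated with a toy hash chain: each entry’s hash covers the previous entry’s hash, so rewriting history invalidates everything that follows. This is a minimal sketch of the principle only, not immudb’s actual data structure (immudb uses Merkle-tree-based proofs).

```python
import hashlib

# Toy tamper-evidence sketch: each entry's hash covers the previous hash,
# so altering any earlier entry breaks verification of the whole chain.
# Illustrative only; not immudb's real internals.
def append(log, payload: bytes):
    prev = log[-1]["hash"] if log else b"\x00" * 32
    log.append({"payload": payload,
                "hash": hashlib.sha256(prev + payload).digest()})

def verify(log) -> bool:
    prev = b"\x00" * 32
    for entry in log:
        if hashlib.sha256(prev + entry["payload"]).digest() != entry["hash"]:
            return False
        prev = entry["hash"]
    return True

log = []
append(log, b"SET balance=100")
append(log, b"SET balance=90")
assert verify(log)

# Tampering with an earlier entry is detected.
log[0]["payload"] = b"SET balance=1000000"
assert not verify(log)
```

A client that remembers only the latest hash can detect any rewrite of history, which is what “zero trust” means in this context.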
Tellius Announces Live Insights for Cloud Data Warehouses to Unlock the Value of the Modern Data Stack
Tellius, the AI-driven decision intelligence platform, today announced Live Insights, a new capability for users to quickly generate powerful data analysis within their cloud data warehouses. This new feature leverages the compute power of cloud data platforms for automated generation of advanced insights without requiring data to be extracted from these systems, fueling faster, better decision making.
“The modern data stack is not just about re-tooling for the cloud; it presents an opportunity to transform how organizations approach analytics by moving beyond decades-old processes of limiting business users to pre-built dashboards and data specialists to manually querying data,” said Ajay Khanna, CEO and Founder of Tellius. “Live Insights is another step in modernizing the analytics experience with AI-driven automation and natural language interfaces that allow our customers to stand up a complete modern data stack and go from data-to-insights in just a few minutes.”
Ascend.io Brings Autopilot to Snowflake with Data Automation Cloud
Ascend.io, the Data Automation Cloud, announced that customers of Snowflake, the Data Cloud Company, can now put workloads on autopilot with a single platform. Ascend’s Data Automation Cloud unifies the core capabilities a data engineer needs—data ingestion, transformation, delivery, orchestration, and observability—into a seamless experience. Backed by Ascend’s powerful DataAware™ intelligence, Ascend’s Data Automation Cloud analyzes and monitors end-to-end workflows, tracking and optimizing the movement of up to trillions of records, and dynamically responds to changes in data, schema, and code within seconds, giving Snowflake customers the most advanced autopilot system available for the data and analytics workloads that power their business.
“Data is the bedrock of today’s enterprises, and it is no surprise why Snowflake is an integral part of that foundation for so many organizations. We at Ascend—and our customers—love Snowflake for its remarkable performance and ability to support such a wide array of data workloads,” said Sean Knapp, Founder and CEO at Ascend.io. “Expanding the Ascend Platform to run natively on Snowflake was the clear next step for us to deliver industry-leading capabilities, and we are excited to take our partnership to the next level. In doing so, Ascend will enable more users than ever before to automate—and unify—their data and analytics workloads on Snowflake.”
MLCommons™ Unveils Open Datasets and Tools to Drive Democratization of Machine Learning
MLCommons, an open engineering consortium dedicated to improving machine learning for everyone, announced the general availability of the People’s Speech Dataset and the Multilingual Spoken Words Corpus (MSWC). The trail-blazing and permissively licensed datasets advance innovation in machine learning research and commercial applications.
“Speech technology can empower billions of people across the planet, but there’s a real need for large, open, and diverse datasets to catalyze innovation,” said David Kanter, MLCommons co-founder and executive director. “The People’s Speech is a large scale dataset in English, while MSWC offers a tremendous breadth of languages. I’m excited for these datasets to improve everyday experiences like voice-enabled consumer devices and speech recognition.”
Instaclustr Advances PostgreSQL to General Availability on the Instaclustr Platform for Open Source Data Infrastructure Technologies
Instaclustr, which helps organizations deliver applications at scale by operating and supporting their open source data infrastructure, announced the general availability of PostgreSQL on the Instaclustr Platform. Backed by full-service PostgreSQL management that includes secure and end-to-end database monitoring, optimizing, and scaling, customers gain the advantages of open source Postgres without the risk of proprietary lock-in or expensive licensing. The general availability of Postgres on the Instaclustr Platform comes after the completed public preview of the service for existing Instaclustr customers earlier this year. Postgres joins other fully open source solutions – including Apache Cassandra, Redis, Apache Kafka, and OpenSearch – as part of a complete ecosystem for customers’ data infrastructure needs.
“PostgreSQL is a natural addition to the Instaclustr platform, as we continue to expand the data infrastructure ecosystem that we operate and support on behalf of our customers,” said Ben Slater, Chief Product Officer, Instaclustr. “Postgres is the world’s most popular object-relational database, it is incredibly powerful as a fully open source technology, and it requires specialized expertise to wield optimally in enterprise environments – allowing us to provide meaningful value. From avoiding the costs and technical complexity of self-management, to ensuring security, to enabling enterprise focus on building applications and innovation, we’re excited to introduce customers to the many reasons why Instaclustr for PostgreSQL is the best strategy for operating this powerful solution.”
Outlier AI Revolutionizes Business Analysis by Bringing Hidden Insights Directly Into Existing BI Tools and Workflow
Outlier AI, a leading automated business analysis (ABA) platform, released new features that enable users to move from automated AI-generated insights to deep data analysis with a single click. Designed around the way teams already work, analyze, and share their business data, the new features streamline analysis processes across the organization. Outlier is used by leading brands to automatically analyze multiple large data sources to uncover and explain unexpected changes in their business. The new Outlier features enable users to more quickly investigate these insights further by seamlessly connecting to existing CRM systems, BI and other internal tools, and raw data sources. Customers can now export insights found by Outlier as a PDF, image, CSV, or SQL query, in addition to clicking from an Outlier story directly into a filtered view in their data visualization BI tools.
“With these new features, Outlier is moving the analytics bar to ‘Activated Insights’,” said Sean Byrnes, CEO of Outlier. “Insights, until recently, have been information provided to a user who is then responsible for taking an action. With our Activated Insights, Outlier not only tells the user what is happening, but gives them a variety of ways to take specific actions on the insight immediately.”
Clear Ventures Launches New Investment Initiative for Researchers
Clear Ventures announced a new investment initiative to fund and support AI startups emerging from academic research institutions. The new initiative leverages Clear’s innovative CLEAR EDGE approach, bringing capital, resources and network to AI startups, which are backed by years of scientific research at top institutions. CLEAR EDGE creates a new channel and structure for research-based founders wanting a streamlined way to productize and supercharge go-to-market plans for their IP. It also furthers Clear’s growth and expansion into AI + X, or AI applied to a specific industry or function, leveraging the breakthrough research now taking place at universities seen as crucibles of innovation.
“While some research is far into the future, in the areas of AI+X we see that the gap between research and what can have real-world impact, is a lot narrower and much more within reach for investors and businesses,” said Rajeev Madhavan, founder and General Partner at Clear Ventures. “As former entrepreneurs, we have this understanding deeply ingrained into our DNA. With that said, Clear looks forward to bringing CLEAR EDGE to research-based startups, marrying a practical approach with years of scientific rigor to efficiently harness the growth we are seeing at the intersection of AI, frontier technology and research.”
H2O.ai Announces H2O Document AI to Automate Document Processing
Leading AI cloud provider H2O.ai announced the general availability of H2O Document AI, a machine learning service that understands, processes, and manages the large volume and types of documents and unstructured text data that businesses and organizations handle every day. H2O Document AI streamlines processes, reduces costs, and discovers new information and insights contained in documents. H2O Document AI “learns as it goes,” continuously improving processing accuracy using H2O.ai’s latest innovations in machine learning and deep learning to achieve automation across business verticals and use cases not previously possible. To get started with H2O Document AI, visit https://www.h2o.ai/document-ai.
“Our banking, insurance, health, audit, and public sector customers each process billions of documents every year. Documents are the fastest growing source of data in the enterprise, ranging from contracts, bank statements, invoices, payroll reports, regulatory reports, and medical referrals to customer conversations in text, chat, and email,” said Sri Ambati, CEO and founder, H2O.ai. “H2O Document AI enables customers to sieve intelligence across a wide variety of document types not possible before, with unprecedented accuracy and speed. With H2O Document AI businesses can now seamlessly integrate insights from documents to their feature stores and transactional systems to delight their customers.”
Bigeye Launches Dashboard and Issues to Create a Complete Data Quality Workflow
Bigeye, a leading data observability platform, announced the release of Dashboard and Issues — a pair of integrated features that create a complete data quality workflow from a holistic understanding of the state of data quality to a smarter, faster way to resolve issues. Dashboard gives data team leaders a bird’s eye view of data health, making it simple to monitor data quality coverage, impact on SLAs, time to resolution, and other key analytics. Underpinning the insights that Dashboard provides are Issues. Issues make it easier than ever for data engineers and analysts to dig in and resolve data quality issues.
“With Bigeye, we have created a powerful data observability platform that enables data teams at companies like Clubhouse, Instacart, and Udacity to create more reliable data applications and support thousands of users with trusted data. We’re excited to build more intelligence into our data observability platform to give customers even more tools to understand the health of their data and resolve issues fast,” said Egor Gryaznov, CTO and cofounder, at Bigeye.
Ahana Cloud for Presto Delivers Deep Integration with AWS Lake Formation Through Participation in Launch Program
Ahana, the SaaS for Presto, announced Ahana Cloud for Presto’s deep integration with AWS Lake Formation, an Amazon Web Services, Inc. (AWS) service that makes it easy to set up a secure data lake, manage security, and provide self-service access to data with Amazon Simple Storage Service (Amazon S3). As an early partner in the launch program, Ahana enables data platform teams to quickly set up a secure data lake and run ad hoc analytics on it with Presto, the de facto SQL query engine for data lakes.
“We are thrilled to announce our work with AWS Lake Formation, allowing AWS Lake Formation users seamless access to Presto on their data lake,” said Dipti Borkar, Cofounder and Chief Product Officer at Ahana. “Ahana Cloud for Presto coupled with AWS Lake Formation gives customers the ability to stand up a fully secure data lake with Presto on top in a matter of hours, decreasing time to value without compromising security for today’s data platform team. We look forward to opening up even more use cases on the secure data lake with Ahana Cloud for Presto and AWS Lake Formation.”
Latest Ondat Storage Platform Two Times Faster, Delivers Higher Availability and Improved Performance for Stateful Kubernetes Applications
Ondat, a leading Kubernetes-native storage platform provider, announced version 2.5, which can synchronize data across different data centers to provide high availability, along with failover disaster recovery. Plus, its industry-leading performance is enhanced for even faster replica synchronization — more than two times faster, with even higher speeds on high-latency networks. The update applies to Ondat for on-premises data centers and to its recently launched SaaS platform, which helps customers manage stateful Kubernetes applications with persistent data volumes.
“Ondat 2.5 addresses the top requirements we’re hearing from our customers and prospects, who want the same features and performance from their Kubernetes deployments that they were used to in their data centers,” said James Brown, head of product and platforms at Ondat. “For Kubernetes deployments, this brings a new level of high availability and improved performance.”
Soda Unveils Data Health Metrics Store
Soda, the provider of Open Source data reliability tools and cloud data observability platform, has released Cloud Metrics Store, providing advanced testing-as-code capabilities to enable data teams to get ahead of data issues in a more sophisticated way than ever before. Available to all users of Soda’s Open Source (OSS) tools, Cloud Metrics Store captures historical information about the health of data to support the intelligent testing of data across every workload.
“It’s advantageous for data teams to unify around a common language that allows them to specify what good data looks like across the data value chain from ingestion to consumption, irrespective of roles, skills, or subject matter expertise. Most data teams today are organized by domain, and when creating data products, they often depend on each other to provide timely, accurate, and complete data,” explains Maarten Masschelein, CEO, Soda. “For this reason, we are delighted to release Soda Cloud Metrics Store to all users of our OSS tools, as it represents another important milestone in our mission to bring everyone closer to trusted data. Cloud Metrics Store helps data teams to be explicit about what good data looks like, enabling agreements to be made between domain teams that can be easily tracked and monitored, giving data product teams the freedom to work on the next big thing.”
DDN Raises the Bar in High-Frequency Trading
DDN®, a leader in data management solutions for artificial intelligence (AI), high performance computing (HPC) and multicloud, announced dramatically improved performance in the STAC-M3™ Benchmark utilizing its newly announced A3I® AI400X2 appliance. Financial services firms looking to accelerate analytics can create more sophisticated trading models with this new platform. These new benchmark results, audited by STAC® Research, show a single DDN appliance delivering high performance and throughput using a shared filesystem. While DDN A3I storage systems can scale without limit to increase capacity and throughput, these results demonstrate extreme performance in a cost-effective form factor. In addition, the A3I AI400X2 appliance has a smaller footprint, less complexity, and lower overall TCO than competitive storage solutions.
“Simplifying the management of data-intensive workloads and delivering faster insights to our customers is the basis of our development here at DDN,” said Kurt Kuckein, vice president of marketing at DDN. “These STAC benchmark results reflect the progress we’ve made toward streamlining data management while significantly accelerating analytics and AI workloads – particularly for our financial services customers.”
Privacy Dynamics Launches One-Click De-Identification Tool, Eliminating Data Privacy and Disclosure Risks Within Minutes
Privacy Dynamics, a startup that simplifies ethical and responsible use of data, launched a SaaS application that can anonymize thousands of records per second with the click of a button. Privacy Dynamics saves data and analytics teams valuable time while also ensuring organizations pull compliant, accurate information from a central data warehouse.
“Too often, we hear from engineers that data is unavailable due to well-intentioned security policies designed to prevent a data breach. The problem is that these policies also prevent them from doing their jobs,” said CEO and Founder Graham Thompson. “We believe that Privacy Dynamics will transform how companies manage access to data, and we’re excited to see what our users can accomplish with centralized access to high-quality, privacy-safe data.”
Lytics Unlocks the Power of the Data Warehouse for Ad Targeting with Cloud Connect
Lytics, a leading customer data platform (CDP) that improves marketing outcomes using first-party data, announced the launch of Cloud Connect, a freemium, self-service tool that connects data in cloud data warehouses directly to leading advertising platforms for more sophisticated audience targeting. With this release Lytics is also announcing the introduction of fully supported connections to data warehouses: Google BigQuery, Amazon Redshift, Microsoft Azure, and Snowflake. Cloud Connect is a simple-to-use tool for creating consumer audiences with a SQL query editor directly on the customer’s data warehouse, and it can activate audiences in 40+ destinations, such as Google Ads, Amazon Ads, and LinkedIn Ads, to enable targeted advertising.
“The CDP industry needs to think differently about how our customers can manage and activate their most valuable asset – customer data,” said James McDermott, CEO of Lytics. “With this launch, Lytics is reimagining its CDP to fit into the new customer data architecture that sits directly on top of our customer’s cloud data warehouse.”
Datadobi Launches Multi-Petabyte Starter Packs to Accelerate Enterprise-Class Digital Transformation
Datadobi, the global leader in unstructured data management, today announced it has released new Starter Packs for DobiMigrate ranging from 1PB up to 7PB. The latest offering is purpose-built to bolster unstructured data management projects’ success in data-heavy, high-volume environments. The new Starter Packs will enable channel partners and end users to accelerate digital transformation and conduct data management projects to any storage platform or cloud environment.
“In order to be successful in today’s data-driven world, organizations must have a handle on the massive amounts of unstructured data they hold, whether it be on-premises or in the cloud,” said Michael Jack, CRO and Co-Founder, Datadobi. “We created the new larger Starter Packs because migration projects just keep getting bigger. Where just a few years ago multi-petabyte migrations were considered large, today they are the norm. The new starter packs will help customers with larger environments to get their migration projects started faster. Backed by the battle-hardened components of our Datadobi engine, our customers and partners can rest assured knowing that they have complete control over their unstructured data across all environments.”
Parsec Labs Announces Enhanced Data Analytics
Parsec Labs, the fast-growing and innovative provider of enterprise data management and protection solutions, announced the release of Parsec Insight, an upgraded version of its data scanning and analysis tool that gives customers a deeper understanding of their data. With this new release, Parsec moves beyond the usual reporting of age, access time, modify time, size, etc. With Parsec Insight, customers can now also determine actual MIME type/content and compare with file name extensions to discover anomalies.
“The exciting thing about Insight is it allows our customers to do some interesting pattern detection where file data is represented to users/admins as one type, but in reality is something else,” said Chris Moore, Parsec’s Chief Executive Officer.
Open Source MLOps Tool DVC Adds Experiment Versioning
Iterative, the MLOps company dedicated to streamlining the workflow of data scientists and machine learning (ML) engineers, announced the latest release of Data Version Control (DVC), introducing industry-first experiment versioning. Experiment versioning gives developers an easy way to save, compare, and reproduce ML experiments at scale in ways that neither traditional software version control nor existing experiment tracking tools can.
“Experiment tracking tools have come a long way. Users no longer need to log experiment information in spreadsheets or notebooks,” said Dave Berenbaum, technical product manager at Iterative. “But current experiment tracking tools usually provide an API to log experiment information, a database to store it, and a dashboard to compare and visualize. DVC experiment versioning builds on modern version control principles and technology to address experiment tracking needs and give developers the most integrated way to iterate their experiments.”
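The concept behind experiment versioning — snapshotting parameters, a code revision, and metrics together under a reproducible identifier, rather than merely logging metrics to a dashboard — can be sketched in a few lines. This is a deliberately simplified illustration of the idea, not DVC’s implementation, which layers experiments on Git rather than an in-memory store.

```python
import hashlib
import json

# Toy sketch of experiment versioning: params + code revision + metrics are
# stored together under a content-derived ID, so runs are comparable and
# reproducible. Illustrative only; not DVC's actual mechanism.
store = {}

def save_experiment(params: dict, code_rev: str, metrics: dict) -> str:
    record = {"params": params, "code_rev": code_rev, "metrics": metrics}
    blob = json.dumps(record, sort_keys=True).encode("utf-8")
    exp_id = hashlib.sha256(blob).hexdigest()[:12]  # content-derived ID
    store[exp_id] = record
    return exp_id

def compare(id_a: str, id_b: str, metric: str) -> float:
    return store[id_b]["metrics"][metric] - store[id_a]["metrics"][metric]

a = save_experiment({"lr": 0.01}, "abc123", {"acc": 0.91})
b = save_experiment({"lr": 0.1}, "abc123", {"acc": 0.94})
print(round(compare(a, b, "acc"), 2))  # 0.03
```

Because the ID is derived from the full record, any experiment can be looked up, diffed against another, and in principle re-run from its stored parameters and code revision — the gap the announcement says dashboards alone don’t close.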
Striim announces general availability of version 4.0 of its streaming platform
Striim, Inc. announced the availability of Striim 4.0, the next generation of its industry-leading data integration and streaming platform. Striim 4.0 introduces new features and enhances existing capabilities for customers to run hundreds of data pipelines on autopilot and infuse real-time data into their business processes. Customers can choose from over 100 built-in adapters to read from and write to a data store, storage system, or cloud, and in the format of their choice. They can efficiently manage their pipelines using Striim’s visual, no-code user interface, SQL-based CLI (command line interface), and REST APIs (application programming interfaces). Striim 4.0 maximizes operational uptime with data observability and automatic responses to changes in data structures and formats in upstream applications.
“Striim’s unified data integration and streaming platform enables modernization and digital transformation by connecting enterprise systems in a modern cloud architecture with fully automated real-time data pipelines,” said Alok Pareek, Executive Vice President of Engineering and Products at Striim. “Fresh data is essential for enterprises to make the right business decisions at the right time, and the Striim platform enables them to integrate, analyze and transform their data in-flight. This speed gives them a clear and distinct competitive advantage.”
Mode Introduces Visual Explorer
Mode Analytics, the comprehensive platform for collaborative Business Intelligence (BI) and interactive Data Science, introduced Visual Explorer, a new flexible visualization system that helps analysts explore data faster and provide easy-to-interpret insights to business stakeholders.
“Visual Explorer is a best-in-class visualization suite powered by Helix, Mode’s award-winning responsive data engine,” said Benn Stancil, Co-founder and Chief Analytics Officer, Mode. “Building upon Mode’s unique ability to combine BI and Data Science workflows, Visual Explorer brings together the workflows of both analysts and business users in a way that has never been done before. This capability is critical to answering questions quickly, and advancing knowledge across the organization.”
Sign up for the free insideBIGDATA newsletter.