Applications of large language models (LLMs) range from text processing to predicting virus variants. As the datasets on which LLMs are trained become increasingly massive and the models themselves grow to include trillions of parameters …
HPCNow! Announces HPC Cluster Monitoring Offering
Barcelona, February 8th 2023 – HPCNow! has announced a real-time HPC cluster monitoring capability. The monitoring stack includes open-source solutions such as Grafana, Elasticsearch, and Prometheus for visualization and data storage, along with Slurm plugins and customized scripts to gather all the information needed by the system administrator. It is delivered using Docker Compose for single-node […]
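For readers curious about how such a stack can be consumed, the Python sketch below shows one way a system administrator might pull a Slurm-related metric out of a Prometheus instance through its standard HTTP query API; the endpoint address and the metric name are illustrative assumptions, not details published by HPCNow!.

```python
# Minimal sketch: query one Slurm-related metric from a Prometheus instance
# via its standard /api/v1/query HTTP endpoint.
# The endpoint address and the metric name "slurm_nodes_idle" are assumptions
# made for illustration, not details published by HPCNow!.
import requests

PROMETHEUS_URL = "http://localhost:9090/api/v1/query"  # assumed single-node deployment

def query_metric(expr: str) -> list:
    """Run an instant PromQL query and return the result vector."""
    resp = requests.get(PROMETHEUS_URL, params={"query": expr}, timeout=10)
    resp.raise_for_status()
    payload = resp.json()
    if payload.get("status") != "success":
        raise RuntimeError(f"Prometheus query failed: {payload}")
    return payload["data"]["result"]

if __name__ == "__main__":
    for sample in query_metric("slurm_nodes_idle"):
        print(sample["metric"], sample["value"])
```

Grafana dashboards would normally sit on top of the same queries; the script is only meant to show that the data behind them remains plainly accessible to an administrator's own tooling.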
Akridata Data Explorer Reduces Visual Data Analysis Time by 15x, Significantly Accelerates Model Accuracy for Production Grade AI Models
Akridata, a software company that provides an end-to-end suite of products supporting both the smart ingestion and smart exploration of visual data to reduce cost and complexity while accelerating business value, announced the official launch of Akridata Data Explorer, a platform that gives data science teams the tools to easily explore, search, analyze, and compare visual data to improve data sets and model training.
Comet Introduces Kangas, An Open Source Smart Data Exploration, Analysis and Model Debugging Tool for Machine Learning
Comet, provider of a leading MLOps platform for machine learning (ML) teams from startup to enterprise, announced a bold new product: Kangas. Open sourced to democratize large-scale visual dataset exploration and analysis for the computer vision and machine learning community, Kangas helps users understand and debug their data in a new and highly intuitive way.
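For a rough feel of how Kangas is used in practice, here is a minimal Python sketch loosely modeled on the examples in the project's repository; the column names and values are invented for illustration, and the exact API may vary between releases.

```python
# Minimal sketch of building and browsing a DataGrid with Kangas.
# Column names and values are placeholders; see the Kangas repository
# (github.com/comet-ml/kangas) for the authoritative API.
import kangas as kg

# Assemble a small DataGrid by hand; Kangas can also ingest CSVs and DataFrames.
dg = kg.DataGrid(name="demo-predictions", columns=["label", "score"])
dg.append(["cat", 0.91])
dg.append(["dog", 0.83])

dg.save()   # persist the DataGrid locally
dg.show()   # open the interactive exploration UI
```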
OmniSci Rebrands as HEAVY.AI
SAN FRANCISCO – March 1, 2022 – Advanced analytics company OmniSci announced its rebrand to HEAVY.AI with the intent of helping enterprises face a new era of challenges in leveraging their big location and time data to visualize and identify business opportunities and risks. The pace of new digital data being created continues to skyrocket. […]
ECP Brings Visualization Software to Exascale and GPU-accelerated HPC Systems
The development of VTK-m, a scientific visualization toolkit for emerging architectures, is a critical advancement in support of scientific visualization on exascale and GPU-accelerated systems for high-performance computing (HPC) users. VTK-m is needed because—counterintuitively—GPUs currently pose software challenges for large-scale scientific visualization tasks. For historical reasons, the massively multithreaded architecture and separate memory subsystems of GPUs, together with the advent of new visualization workflows such as in situ and in transit visualization that bypass data movement for big-data simulations, are currently problematic for scientific visualization.
GPU-Powered Smarter, Faster Visual Search
Visual search is seen as the next great search frontier, and Microsoft’s Bing has tapped the power of NVIDIA GPUs to make it a reality. Download the new white paper from NVIDIA to explore how Bing deployed NVIDIA technology to speed up object detection and deliver pertinent results in real time.
Data Visualization: Making Big Data Approachable
Why are some companies able to use Big Data to their advantage, while others remain mired in reams of information but gain little insight? In many cases, the companies that have found success with Big Data are using data visualization to help make sense of the information.
Field Report: Qlik Qonnections 2016
Out on the conference circuit this year, we here at insideAI News were pleased to be Qlik’s guest for the Qlik Qonnections 2016 conference in Orlando, Florida, on May 1-4. We had a blast at this annual tech extravaganza hosted by one of the industry’s most innovative leaders.