ChatGPT Plus gets "Memory" upgrade

+ Snowflake open-sources Arctic, an enterprise-focused LLM

Welcome to AI Disruptor! If you want to join our growing community of readers, subscribe to get future editions in your inbox.

TODAY’S TOP STORIES:

🧠 ChatGPT Plus gets "Memory" upgrade for enhanced learning
❄️ Snowflake open-sources Arctic, an enterprise-focused LLM

+ Quick reads
+ AI tool highlight

🧠 ChatGPT Plus gets "Memory" upgrade for enhanced learning

OpenAI has rolled out the "Memory" feature for all ChatGPT Plus users, allowing the model to retain context from previous conversations across chats. The feature is designed to make ChatGPT's responses more relevant over time and to let users customize their experience.

What to know:

  • Memory can be turned on and off in the settings, giving users control over what ChatGPT remembers.

  • To use Memory, users start a new chat and tell ChatGPT what they want it to remember.

  • Memory is currently unavailable in Europe and Korea, with no specific reason provided by OpenAI.

  • Team and Enterprise customers will also have access to Memory, and the feature will be available for GPTs in the future.

  • OpenAI has not yet published details on Memory's technical implementation or on how stored memories are handled from a privacy standpoint.

Image: OpenAI

❄️ Snowflake open-sources Arctic, an enterprise-focused LLM

Image: Snowflake

Snowflake has developed and open-sourced its own large language model, Arctic. The model is designed to be highly efficient in both training and inference, and is aimed at enterprise tasks such as SQL code generation, general programming, and following complex instructions.

What to know:

  • Snowflake claims that Arctic was trained on a compute budget of under $2 million, or about 3,000 GPU weeks, while matching or exceeding models trained with far larger budgets, such as Meta's Llama 3 70B, on "enterprise intelligence" metrics.

  • Arctic uses a hybrid "Dense-MoE" architecture: a 10-billion-parameter dense Transformer combined with a residual Mixture-of-Experts (MoE) MLP, for roughly 480 billion total parameters, of which about 17 billion are active per token (a conceptual sketch of this dense-plus-MoE pattern follows this list).

  • Snowflake has published a detailed "Cookbook" describing the model and its training process, sharing insights and best practices for training MoE models efficiently.

  • Model checkpoints for both the base and instruction-tuned versions of Arctic are available on Hugging Face under the Apache 2.0 license, with instructions for inference and fine-tuning on GitHub (a hedged loading example also follows this list).

  • Snowflake plans to work with Nvidia and the vLLM community to provide optimized implementations for fine-tuning and inference, and is working on additional models in the Arctic series.
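To make the dense-plus-MoE idea above concrete, here is a toy PyTorch sketch of a transformer block whose feed-forward output is summed with a residual Mixture-of-Experts MLP. This is an illustration of the general pattern, not Snowflake's code: the class names, sizes, and top-2 routing are assumptions chosen for readability.

```python
# Toy illustration of a dense transformer block with a residual MoE MLP.
# Dimensions and routing are illustrative, not Arctic's actual configuration.
import torch
import torch.nn as nn

class ToyMoE(nn.Module):
    def __init__(self, d_model, n_experts=8, top_k=2):
        super().__init__()
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(d_model, 4 * d_model),
                          nn.GELU(),
                          nn.Linear(4 * d_model, d_model))
            for _ in range(n_experts)
        )
        self.router = nn.Linear(d_model, n_experts)
        self.top_k = top_k

    def forward(self, x):
        # Route each token to its top-k experts and weight their outputs
        # by the softmaxed router scores.
        scores = self.router(x).softmax(dim=-1)           # (B, T, E)
        weights, idx = scores.topk(self.top_k, dim=-1)    # (B, T, k)
        out = torch.zeros_like(x)
        for k in range(self.top_k):
            for e, expert in enumerate(self.experts):
                mask = (idx[..., k] == e).unsqueeze(-1)   # tokens sent to expert e
                out = out + mask * weights[..., k:k + 1] * expert(x)
        return out

class DenseMoEHybridBlock(nn.Module):
    def __init__(self, d_model=512, n_heads=8):
        super().__init__()
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.dense_mlp = nn.Sequential(nn.Linear(d_model, 4 * d_model),
                                       nn.GELU(),
                                       nn.Linear(4 * d_model, d_model))
        self.moe = ToyMoE(d_model)
        self.norm1, self.norm2 = nn.LayerNorm(d_model), nn.LayerNorm(d_model)

    def forward(self, x):
        h = self.norm1(x)
        x = x + self.attn(h, h, h, need_weights=False)[0]
        h = self.norm2(x)
        # Dense MLP path plus an MoE residual path, both added back to the stream.
        return x + self.dense_mlp(h) + self.moe(h)

x = torch.randn(2, 16, 512)               # (batch, tokens, d_model)
print(DenseMoEHybridBlock()(x).shape)     # torch.Size([2, 16, 512])
```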
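For readers who want to try the published checkpoints, here is a minimal loading sketch using Hugging Face transformers. The repo id "Snowflake/snowflake-arctic-instruct", the trust_remote_code flag, and the generation settings are assumptions based on the Hugging Face listing; check the model card for the exact id, recommended settings, and the substantial hardware requirements.

```python
# Hedged sketch: loading the Arctic instruct checkpoint with transformers.
# Repo id and flags below are assumptions; consult the model card before use.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Snowflake/snowflake-arctic-instruct"  # assumed repo id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    device_map="auto",        # shard across available GPUs
    torch_dtype="auto",
    trust_remote_code=True,   # custom MoE modeling code, if the repo ships it
)

prompt = "Write a SQL query that returns the top 5 customers by revenue."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=200)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```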


AI tool highlight: ElevenLabs

ElevenLabs is an AI-powered text-to-speech platform that offers a range of features for generating high-quality, lifelike speech. It provides customizable voices, accent options, and voice cloning capabilities across multiple languages. The platform also includes tools for voice verification and aims to make content accessible while maintaining ethical AI practices and user privacy.
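As a rough illustration of how the platform is used programmatically, here is a hedged sketch of a text-to-speech call against the ElevenLabs REST API with Python's requests library. The endpoint shape, headers, and JSON fields reflect the public API as we understand it; the API key, voice id, and model name are placeholders, so check the current ElevenLabs docs before relying on this.

```python
# Hedged sketch of an ElevenLabs text-to-speech request.
# API key, voice id, and model name are placeholders; verify against the docs.
import requests

API_KEY = "your-elevenlabs-api-key"   # placeholder
VOICE_ID = "your-voice-id"            # placeholder, e.g. from the voice library

resp = requests.post(
    f"https://api.elevenlabs.io/v1/text-to-speech/{VOICE_ID}",
    headers={"xi-api-key": API_KEY, "Content-Type": "application/json"},
    json={
        "text": "Welcome to AI Disruptor.",
        "model_id": "eleven_multilingual_v2",  # assumed model name
        "voice_settings": {"stability": 0.5, "similarity_boost": 0.75},
    },
)
resp.raise_for_status()
with open("welcome.mp3", "wb") as f:  # response body is the audio bytes
    f.write(resp.content)
```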

Quick reads

The Financial Times has struck a deal with OpenAI to license its content and develop AI tools. ChatGPT users will see summaries, quotes, and links to FT articles, while OpenAI will work with the news organization to develop new AI products.

University of Southern Denmark researchers have introduced SynthEval, a Python package designed to facilitate the easy and consistent evaluation of synthetic tabular data. The framework incorporates a large collection of metrics and allows users to create custom benchmarks.

The rise of powerful large language models (LLMs) has led to a new paradigm in AI and big data. SQL vector databases, like MyScaleDB, are becoming increasingly important for improving retrieval accuracy and cost-efficiency in real-world AI applications.

What did you think of this edition of AI Disruptor?

Your feedback helps us create a better newsletter!
