open weights

  • Liquid AI’s LFM2.5: Compact Models for On-Device AI


    Liquid AI Releases LFM2.5: A Compact AI Model Family for Real On-Device Agents

    Liquid AI has unveiled LFM2.5, a compact AI model family designed for on-device and edge deployments, built on the LFM2 architecture. The family includes several variants: LFM2.5-1.2B-Base, LFM2.5-1.2B-Instruct, a Japanese-optimized model, and vision- and audio-language models. All are released as open weights on Hugging Face and are accessible via the LEAP platform. LFM2.5-1.2B-Instruct, the primary text model, outperforms other 1B-class models on benchmarks such as GPQA and MMLU-Pro, while the Japanese variant excels at localized tasks. The vision and audio models are optimized for real-world applications, improving over previous iterations on visual-reasoning and audio-processing tasks. This matters because it shows that capable AI models can run on devices with limited computational resources, improving accessibility and efficiency in real-world applications.
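    Since the weights are on Hugging Face, the instruct variant can presumably be run locally with the `transformers` library. A minimal sketch follows; the repo id is an assumption based on the model name in the announcement, so check the actual model card before use:

```python
MODEL_ID = "LiquidAI/LFM2.5-1.2B-Instruct"  # assumed Hugging Face repo id


def build_chat(user_msg: str) -> list:
    # Chat-message list in the format expected by apply_chat_template.
    return [{"role": "user", "content": user_msg}]


def generate(user_msg: str, max_new_tokens: int = 128) -> str:
    # Imported lazily so build_chat() above stays dependency-free.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tok = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID)
    ids = tok.apply_chat_template(
        build_chat(user_msg), add_generation_prompt=True, return_tensors="pt"
    )
    out = model.generate(ids, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, not the prompt.
    return tok.decode(out[0][ids.shape[-1]:], skip_special_tokens=True)
```

    At 1.2B parameters the model should fit comfortably in a few GB of memory, which is the point of the release: the same checkpoint can run on edge hardware without server-side inference.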

    Read Full Article: Liquid AI’s LFM2.5: Compact Models for On-Device AI

  • AI2025Dev: A New Era in AI Analytics


    Marktechpost Releases ‘AI2025Dev’: A Structured Intelligence Layer for AI Models, Benchmarks, and Ecosystem Signals

    Marktechpost has launched AI2025Dev, an analytics platform for AI developers and researchers that offers a queryable dataset of 2025 AI activity with no signup required. The platform includes release analytics and ecosystem indexes, featuring "Top 100" collections that connect models to research papers, researchers, startups, founders, and investors. Key features include insights into open-weights adoption, agentic systems, and model efficiency, alongside a detailed performance-benchmarks section for evaluating AI models. AI2025Dev aims to support model selection and ecosystem mapping through structured comparison tools and navigable indexes, suited to both quick scans and detailed analyses. This matters because it centralizes information on AI developments and trends, supporting informed decision-making in AI research and deployment.

    Read Full Article: AI2025Dev: A New Era in AI Analytics

  • Tencent’s HY-MT1.5: New Multilingual Translation Models


    Tencent Researchers Release Tencent HY-MT1.5: New Translation Models Featuring 1.8B and 7B Variants Designed for Seamless On-Device and Cloud Deployment

    Tencent's HY-MT1.5 is a new multilingual machine-translation model family designed for both mobile and cloud deployment, comprising two models: HY-MT1.5-1.8B and HY-MT1.5-7B. Supporting translation across 33 languages and 5 dialect variations, the models offer advanced capabilities such as terminology intervention, context-aware translation, and format-preserving translation. The 1.8B model is optimized for low-latency edge deployment, while the 7B model targets high-end deployments with superior quality. Both are trained through a comprehensive pipeline of general and MT-oriented pre-training, supervised fine-tuning, and reinforcement learning, ensuring high-quality translations and efficient performance. This matters because it brings real-time, high-quality translation to a wide range of devices, making advanced language processing more accessible and efficient.
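    "Terminology intervention" generally means supplying a glossary of term translations the model must honor. The announcement does not specify HY-MT1.5's prompt format, so the following is only an illustrative sketch of how such a constraint is commonly expressed; consult the model card for the real format:

```python
def build_translation_prompt(text, src_lang, tgt_lang, glossary=None):
    """Build a translation prompt with optional terminology constraints.

    The prompt wording here is hypothetical, for illustration only;
    HY-MT1.5's actual expected input format may differ.
    """
    lines = [f"Translate the following text from {src_lang} to {tgt_lang}."]
    if glossary:
        # Terminology intervention: pin specific term translations.
        lines.append("Use these term translations exactly:")
        for term, translation in glossary.items():
            lines.append(f"- {term} -> {translation}")
    lines.append(f"Text: {text}")
    return "\n".join(lines)


# Example: force a product name to stay untranslated.
prompt = build_translation_prompt(
    "AcmeCloud is now available in Japan.",
    "English", "Japanese",
    glossary={"AcmeCloud": "AcmeCloud"},
)
```

    The same mechanism covers domain glossaries (legal, medical, brand names), which is where plain neural translation most often drifts.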

    Read Full Article: Tencent’s HY-MT1.5: New Multilingual Translation Models