Neuchips and ShareGuru Launch Gen AI Autonomy Solution

October 13, 2025 -- Neuchips Corp., an AI ASIC design company founded in 2019, today announced a deep collaboration with strategic partner ShareGuru Intelligence Technologies Inc. to introduce an industry-leading on-premise Generative AI solution. This comprehensive offering is centered on Neuchips' power-efficient Viper LLM Inference Card and integrates ShareGuru's Multi-Agent AI Application Platform. The solution aims to alleviate enterprise concerns over data leakage and to significantly lower the barriers for small and medium-sized enterprises (SMEs) to adopt Generative AI, thereby promoting AI Autonomy.

Neuchips: Viper LLM Inference Card—The Core of AI Efficiency Redefined

Neuchips understands that memory and power efficiency are two critical factors in language model development. While the market is dominated by GPUs, we firmly believe that not every task requires expensive and power-hungry graphics processors. Our core product, the Viper LLM Inference Card, is purpose-built for inference applications, achieving an exceptional balance between performance and deployment cost:

  • Pioneering Chip Core: The Viper LLM Inference Card integrates our independently developed Raptor N3000 Inference Chip, delivering superior cost-effectiveness and energy efficiency.
  • Ultimate Performance-to-Power Ratio: Featuring 64GB of LPDDR5 memory, the single card supports models of up to 14 billion parameters (a rough memory estimate follows this list). With an average power consumption of just 45 watts, it overcomes the traditional dilemma in AI solutions: the trade-off between AI energy efficiency and deployment cost.
  • Flexible Deployment: Utilizing an actively cooled HHHL (half-height, half-length) PCIe form-factor design, it can be easily installed in standard small-form-factor computer hosts, making it perfectly suited for SMEs eager to embrace AI.
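
As a rough illustration of why a 64GB card can host models in this size class, the weight footprint of a 14-billion-parameter model can be estimated from the parameter count and numeric precision. The figures below are back-of-the-envelope assumptions for illustration only, not Neuchips specifications:

```python
# Back-of-the-envelope estimate of LLM weight memory vs. card capacity.
# All figures are illustrative assumptions, not Neuchips specifications.

CARD_MEMORY_GB = 64          # Viper card LPDDR5 capacity
PARAMS_BILLION = 14          # largest model size cited for a single card

BYTES_PER_PARAM = {          # common numeric precisions
    "fp16/bf16": 2.0,
    "int8": 1.0,
    "int4": 0.5,
}

for precision, bytes_per_param in BYTES_PER_PARAM.items():
    weights_gb = PARAMS_BILLION * bytes_per_param
    headroom_gb = CARD_MEMORY_GB - weights_gb   # left for KV cache, activations
    print(f"{precision:>10}: weights ~ {weights_gb:5.1f} GB, "
          f"headroom ~ {headroom_gb:5.1f} GB of {CARD_MEMORY_GB} GB")
```

At FP16, 14 billion parameters occupy roughly 28 GB of weights, leaving more than half of the 64GB card for the KV cache and runtime buffers; lower-precision quantization widens that margin further.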

Seamless Integration: MINI SERVER All-in-One Knowledge Management Host

To solve the challenges SMEs face in dedicating resources and technical expertise to AI software development, Neuchips and its AI software development partner, ShareGuru, have co-developed the MINI SERVER all-in-one host. This establishes a truly integrated hardware-software system and an automated knowledge Q&A service ecosystem:

  • Robust Hardware Configuration: The MINI SERVER is equipped with an Intel Core i5 processor, a 32GB AMD Radeon AI PRO R9700 GPU, and our 64GB Viper LLM Inference Card. This combination efficiently allocates computing resources and overcomes memory capacity limitations. Paired with the ShareGuru solution, it can simultaneously support departmental services for 5 to 10 users.
  • On-Premise LLM Management Platform: It provides a multi-user, on-premise LLM management platform for enterprises, characterized by flexible scalability, low power consumption, and minimalist deployment, effectively redefining AI inference efficiency and practicality.

ShareGuru: The ShareQA System—On-Premise Knowledge Management Innovation

ShareGuru leverages its core Multi-Agent AI Application Platform to create the ShareQA Enterprise Customer Service and Knowledge Management AI System. This platform is designed to resolve corporate pain points related to ineffective knowledge transfer and data silos:

  • Core NL2SQL Technology: Utilizing ShareGuru’s proprietary AI Agent Framework technology combined with Natural Language to SQL (NL2SQL) generation and AI corpus processing techniques, the system significantly enhances the quality and accuracy of AI answers (a generic NL2SQL sketch follows this list).
  • Data Sovereignty and Security: Enterprise data can be imported into the system for on-premise, proprietary smart Q&A without ever needing to be exposed externally.
  • Multilingual Advantage: Employing AI Multi-Agent Technology, the system effectively overcomes language barriers, automatically responding in the user's language of inquiry. This makes it ideal for multinational corporate customer service or multilingual organizations.
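
ShareGuru has not published the internals of its NL2SQL pipeline; the sketch below only illustrates the general pattern such systems follow: give a locally hosted LLM the database schema and the user's question, ask it for a SQL query, and execute that query read-only against the enterprise database. The `ask_local_llm` helper, the schema, and all names are hypothetical placeholders, not ShareQA code:

```python
import sqlite3

# Hypothetical helper: send a prompt to an on-premise LLM endpoint and
# return its text completion. The actual ShareQA pipeline is proprietary.
def ask_local_llm(prompt: str) -> str:
    raise NotImplementedError("wire this to your on-prem inference server")

SCHEMA = """
CREATE TABLE orders (id INTEGER, customer TEXT, amount REAL, order_date TEXT);
"""

def nl2sql(question: str) -> str:
    # Give the model the schema and the user's question; ask for SQL only.
    prompt = (
        "You translate questions into SQLite SQL.\n"
        f"Schema:\n{SCHEMA}\n"
        f"Question: {question}\n"
        "Answer with a single SELECT statement and nothing else."
    )
    return ask_local_llm(prompt).strip()

def answer(question: str, db_path: str = "enterprise.db") -> list[tuple]:
    sql = nl2sql(question)
    # Execute read-only so generated SQL cannot modify enterprise data.
    conn = sqlite3.connect(f"file:{db_path}?mode=ro", uri=True)
    try:
        return conn.execute(sql).fetchall()
    finally:
        conn.close()
```

The key property this pattern preserves is the one emphasized above: both the question and the underlying data stay inside the enterprise network.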

Joint Vision: Building Sovereign AI and Implementing AI Autonomy

In addition to providing a complete on-premise solution, Neuchips and ShareGuru have been selected as trial operation partners for the TAIWAN AI RAP program led by the National Center for High-Performance Computing (NCHC). By integrating a proprietary AI chip (Neuchips Viper), an innovative solution (ShareGuru ShareQA), and an advanced cloud architecture (TAIWAN AI RAP), the partners are creating services featuring Sovereign AI for enterprises, ensuring national information security and collectively driving AI autonomy.
