
Leveraging GenAI and LLMs for Enhanced Decision Support in Manufacturing

8/26/2024
NVIDIA AI Enterprise

Background

Electronics manufacturing involves complex material management and production processes, requiring multiple information systems and substantial human resources. Incorporating LLM AI into a decision support system can lead to enhanced efficiency, improved product quality, better customer satisfaction, and more strategic decision-making. It helps in areas such as data analysis and insights, process optimization, customer support, and employee training, ultimately contributing to the overall success and competitiveness of the electronics manufacturer.

Challenges

There are several obstacles to adopting LLM AI technology for decision support systems: 

Technical Complexity: Deploying AI systems requires significant technical expertise, including knowledge of LLMs, IT infrastructure, and relevant software skills. 

Edge Privacy: For stability and security reasons, customers prefer to operate on a local private network, which raises concerns about adopting cloud-based AI applications. 

Challenging Environment: Manufacturing sites often lack adequate air conditioning, server rack space, and dedicated power sources, making them unsuitable for commercial AI servers.

Solutions

To overcome these challenges, the customer chose the AIR-520 70B LLM fine-tuning solution, which includes the following components: 

AIR-520 Edge AI Server System with NVIDIA RTX 6000 Ada GPU Cards: Equipped with an AMD EPYC 7003 series processor and two NVIDIA RTX 6000 Ada GPU cards, the server-grade system provides high-performance computational power. It is certified with IEC-61000 and features an industrial cooling design, meeting the requirements of challenging environments. 

aiDAPTIV+ ai100 AI SSD: The AIR-520 is equipped with the ai100 SSD, featuring 4TB of storage and patented middleware that allows larger models to be trained efficiently with just two GPUs and standard DRAM. This ensures the platform can handle large datasets securely and effectively within a local private network (an illustrative fine-tuning sketch follows this component list). 

NVIDIA AI Enterprise: A comprehensive AI development suite that helps customers rapidly build and deploy inference applications, with direct technical support from NVIDIA AI experts.
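
For illustration, the minimal sketch below shows the kind of parameter-efficient fine-tuning (LoRA via Hugging Face Transformers and PEFT) that can run on a two-GPU edge system of this class. The model name, dataset file, and hyperparameters are placeholders rather than details of this deployment, and the aiDAPTIV+ middleware operates below this layer, so it does not appear in the training script.

```python
import torch
from datasets import load_dataset
from peft import LoraConfig, get_peft_model
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          DataCollatorForLanguageModeling, Trainer,
                          TrainingArguments)

MODEL_ID = "meta-llama/Llama-2-70b-hf"  # placeholder 70B-class model, not the customer's actual model

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
tokenizer.pad_token = tokenizer.eos_token

# device_map="auto" shards the model layers across the two GPUs
# (and system DRAM, if needed) via Accelerate.
model = AutoModelForCausalLM.from_pretrained(
    MODEL_ID,
    torch_dtype=torch.bfloat16,
    device_map="auto",
)

# LoRA adapters keep the trainable parameter count small enough to
# fine-tune on edge hardware rather than a data-center cluster.
model = get_peft_model(model, LoraConfig(
    r=16,
    lora_alpha=32,
    lora_dropout=0.05,
    target_modules=["q_proj", "v_proj"],
    task_type="CAUSAL_LM",
))

# Hypothetical local corpus of manufacturing SOPs and Q&A text; the file
# stays on the factory intranet and never leaves the edge system.
dataset = load_dataset("json", data_files="factory_sop_corpus.jsonl", split="train")
dataset = dataset.map(
    lambda ex: tokenizer(ex["text"], truncation=True, max_length=1024),
    remove_columns=dataset.column_names,
)

trainer = Trainer(
    model=model,
    args=TrainingArguments(
        output_dir="./llm-finetune",
        per_device_train_batch_size=1,
        gradient_accumulation_steps=16,
        num_train_epochs=1,
        bf16=True,
        logging_steps=10,
    ),
    train_dataset=dataset,
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()
```

Because only the small LoRA adapter weights are updated, the GPU memory footprint of training stays far below that of full fine-tuning, which is what makes a workstation-class, two-GPU setup practical for a 70B-class model.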

Benefits

  • Real-time Status & Analytics: LLM AI can process and analyze large datasets quickly, providing real-time insights that aid in timely decision-making and predicting future trends. 
  • Keep Sensitive Data at the Edge: Both model training and inferencing run on the AIR-520 system, with all data sources on the same intranet, ensuring no security concerns. 
  • Reliable 24/7 Runtime: The AIR-520 system is a tower workstation, suitable for limited space in the field. Its industrial cooling design ensures stable operation even without optimal air conditioning.
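
As a complement, here is a minimal sketch of how a shop-floor application might query the locally hosted model for decision support. It assumes an OpenAI-compatible endpoint served on the intranet (for example via an inference server deployed with NVIDIA AI Enterprise tooling); the hostname, model name, and status data below are hypothetical.

```python
from openai import OpenAI

# Hypothetical intranet endpoint on the edge server; no data leaves the local network.
client = OpenAI(
    base_url="http://air-520.factory.local:8000/v1",
    api_key="not-needed-on-private-network",
)

# Example production snapshot pulled from local MES/SCADA systems (illustrative values).
production_snapshot = """
Line 3 SMT throughput: 1,180 boards/hr (target 1,250)
Reflow oven zone 4 temperature drift: +6 C over last 2 hours
Feeder 17 pickup error rate: 2.4% (threshold 1.0%)
"""

response = client.chat.completions.create(
    model="factory-llm-70b",  # hypothetical name of the locally fine-tuned model
    messages=[
        {"role": "system",
         "content": "You are a manufacturing decision-support assistant."},
        {"role": "user",
         "content": "Summarize the production status below and recommend "
                    "the highest-priority corrective action.\n" + production_snapshot},
    ],
    temperature=0.2,
)
print(response.choices[0].message.content)
```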