Foxconn unveils FoxBrain, its first large language model
The Hindu
Taiwan’s Foxconn said on Monday it has launched its first large language model and plans to use the technology to improve manufacturing and supply chain management.
The model, named “FoxBrain,” was trained using 120 Nvidia H100 GPUs and completed in about four weeks, the world’s largest contract electronics manufacturer said in a statement.
The company, which assembles iPhones for Apple and also produces Nvidia's artificial intelligence servers, said the model is based on Meta’s Llama 3.1 architecture.
It is Taiwan’s first large language model with reasoning capabilities optimised for traditional Chinese and Taiwanese language styles, the company said.
Foxconn said that although there is a slight performance gap compared with the distillation model from China’s DeepSeek, FoxBrain’s overall performance is very close to world-class standards.
Initially designed for internal applications, FoxBrain covers data analysis, decision support, document collaboration, mathematics, reasoning and problem-solving, and code generation.
Foxconn said it plans to collaborate with technology partners to expand the model’s applications, share its open-source information, and promote AI in manufacturing, supply chain management, and intelligent decision-making.