SAN FRANCISCO—Nvidia Corp. said Wednesday (May 9) that LLVM, a popular open source compiler, now supports Nvidia GPUs.
Nvidia (Santa Clara, Calif.) said it worked with LLVM developers to contribute its CUDA compiler source code changes to the LLVM core and the parallel thread execution (PTX) backend. Programmers can now develop applications for GPU accelerators using a broader selection of programming languages, Nvidia said.
Nvidia announced late last year that it would provide the source code for its LLVM-based CUDA compiler to academic researchers and software-tool vendors, a move the chip firm said would make it easier to add GPU support to more programming languages and to run CUDA applications on alternative processor architectures.
LLVM is a widely used open source compiler infrastructure with a modular design said to make it easy to add support for new programming languages and processor architectures. The CUDA compiler provides C, C++ and Fortran support for accelerating applications using Nvidia's GPUs.
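In practice, the contributed backend consumes ordinary LLVM intermediate representation and lowers it to PTX assembly for Nvidia GPUs. The sketch below illustrates the idea; the `nvptx64-nvidia-cuda` triple, the `llvm.nvvm.read.ptx.sreg.tid.x` intrinsic and the `llc` invocation reflect the upstream NVPTX backend and are assumptions for illustration, not details given in this article:

```llvm
; kernel.ll -- minimal LLVM IR sketch for the NVPTX backend (names assumed).
target triple = "nvptx64-nvidia-cuda"

; Intrinsic returning this thread's index within its block.
declare i32 @llvm.nvvm.read.ptx.sreg.tid.x()

; Each GPU thread stores its own thread index into out[tid].
define void @write_tid(ptr %out) {
entry:
  %tid  = call i32 @llvm.nvvm.read.ptx.sreg.tid.x()
  %slot = getelementptr i32, ptr %out, i32 %tid
  store i32 %tid, ptr %slot
  ret void
}

; Metadata marking write_tid as a GPU kernel entry point.
!nvvm.annotations = !{!0}
!0 = !{ptr @write_tid, !"kernel", i32 1}
```

A front end for any language can emit IR like this, after which something along the lines of `llc -march=nvptx64 kernel.ll -o kernel.ptx` would lower it to PTX for the GPU driver to load.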
LLVM supports a range of programming languages and front ends, including C/C++, Objective-C, Fortran, Ada, Haskell, Java bytecode, Python, Ruby, ActionScript, GLSL and Rust. It is also the compiler infrastructure Nvidia uses for CUDA C/C++, and it has been widely adopted by leading companies such as Apple, AMD and Adobe, Nvidia said.
"The code we provided to LLVM is based on proven, mainstream CUDA products, giving programmers the assurance of reliability and full compatibility with the hundreds of millions of Nvidia GPUs installed in PCs and servers today," said Ian Buck, general manager of GPU computing software at Nvidia, in a statement. "This is truly a game-changing milestone for GPU computing, giving researchers and programmers an incredible amount of flexibility and choice in programming languages and hardware architectures for their next-generation applications."
The latest version of the LLVM compiler with Nvidia GPU support can be downloaded from the LLVM site.
This article originally appeared on EE Times.