It’s no longer a question of whether to use artificial intelligence (AI) in learning, teaching, and research, but how. The Division of Information Technology wants to ensure that users have access to AI tools that can be trusted.

“AI will inevitably be used across all areas of research and education. Our goal is to ensure that users have secure, reliable, and equitable access to high-quality AI tools that enable them to advance their research, enhance their workflows, and support learning,” said Alberto Cano, associate vice president for research computing, who leads the division's Advanced Research Computing group.

In October, Advanced Research Computing released a new suite of large language model (LLM) services available to all students, faculty, and staff. Building on the original AI-as-a-Service offering released last spring, these new services are designed to address a broad range of user needs and experience levels.

Secure and simple to use

Keeping data safe is paramount for responsible AI use at the university, and these LLM services are hosted entirely on-premises within Advanced Research Computing's secure infrastructure. No prompt or data submitted is sent to any party outside of the university, and user data is not used to train the models. Because of this, a wider range of data classifications, including high-risk data, is permitted for use with Advanced Research Computing's LLM services compared with other options available at the university.

Advanced Research Computing's services are designed for AI beginners and experienced users alike.

“While we do have researchers using AI for very advanced computations, our LLM services are as easy to use as commercial AI services and intended for any student, staff, or faculty member to use effectively regardless of their skill level or experience with high-performance computing,” said Cano.

Advanced Research Computing's LLMs are also free for users. 

About the services

A web interface, available at llm.arc.vt.edu, gives users access to three LLMs hosted and run by Advanced Research Computing:

  • Z.ai GLM-4.5-Air, a high-performance public model
  • QuantTrio GLM-4.5V-AWQ, which offers vision capabilities
  • OpenAI gpt-oss-120b, the company’s flagship public model

These LLMs can be used for various tasks including retrieval-augmented generation (RAG), web search, vision, and image generation. This service can be accessed through a user's Virginia Tech login without needing a separate Advanced Research Computing account.

A second option, llm-api.arc.vt.edu, provides an OpenAI-compatible API endpoint to access the LLMs mentioned above.

In simple terms, this means users can connect the Advanced Research Computing LLMs to any software or tool that already works with commercial AI services, such as OpenAI’s ChatGPT, by updating the URL and the API key, which works much like a password. For example, users who currently connect an application such as GitHub Copilot in VS Code, a Jupyter notebook, or a data analysis script to OpenAI’s public API can “plug and play” Advanced Research Computing's API simply by swapping in the Advanced Research Computing-provided URL and key.
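Because the endpoint is OpenAI-compatible, the swap amounts to changing a base URL and a key. The sketch below, using only the Python standard library, builds (but does not send) an OpenAI-style chat completion request against the endpoint; the `/v1` path, the placeholder key, and the model identifier are assumptions for illustration, and the real values come from Advanced Research Computing's documentation.

```python
import json
from urllib.request import Request

# Placeholder values: consult Advanced Research Computing's documentation
# for the actual base URL path, your personal API key, and model IDs.
BASE_URL = "https://llm-api.arc.vt.edu/v1"  # assumed OpenAI-style /v1 prefix
API_KEY = "YOUR_ARC_API_KEY"                # stands in for a real key

def build_chat_request(prompt: str, model: str = "gpt-oss-120b") -> Request:
    """Build an OpenAI-compatible chat completion request (not sent here)."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return Request(
        f"{BASE_URL}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {API_KEY}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

req = build_chat_request("Summarize retrieval-augmented generation in one sentence.")
print(req.full_url)
```

Tools built on OpenAI's official SDKs typically expose the same two settings (often named `base_url` and `api_key`), so no code changes beyond those two values should be needed.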

A third option, Advanced Research Computing's LLMs via Open OnDemand, offers a custom dedicated LLM for users who need intensive, continuous API calls. More than 40 models are available to support many types of research data. An Advanced Research Computing account is required to use this option.

“We're thrilled that our research computing team continues to expand capabilities to meet research and educational needs in this crucial moment of rapid AI adoption. These new tools are easy to use and protect the intellectual property and proprietary data of individuals and the university,” said Sharon P. Pitt, vice president for information technology and chief information officer.

Those with questions or who need assistance selecting an LLM can reach out for a consultation. Users are also reminded to consider Virginia Tech’s AI Principles when using any LLM or other AI tool.

More information about Advanced Research Computing's LLMs is available at the AI documentation page.
